IndexedDB - Bulk inserts are slow

I'm currently building a PWA that needs to persist some data. Lots of data, actually. In my application, I need to store product references (about 200k of them). An easy task for IndexedDB? Not so much...

Bulk inserting

The IndexedDB API is not great, but with a couple of helper libs you can get something that works decently. Until you try to insert 200k records. Yes, you can "bulk insert" them by using a single transaction for multiple records (batches between 5k and 10k worked best for me), but it's still really slow.

Here is an example: it stores all items that have a valid ean code:

  batchItemUpdate(db, storeName, items) {
    return new Promise((resolve, reject) => {
      // One transaction for the whole batch; "relaxed" durability trades
      // flush guarantees for a bit of speed
      const tr = db.transaction(storeName, "readwrite", { durability: "relaxed" });
      const store = tr.objectStore(storeName);
      const results = [];

      // We gain a lot of speed by resolving once, in the transaction's
      // oncomplete handler, instead of listening to every single request
      tr.onerror = reject;
      tr.oncomplete = () => resolve(results);

      for (let i = 0; i < items.length; i++) {
        const item = items[i];
        // Skip items without a valid ean code
        if (!item["ean"]) {
          continue;
        }
        // put() returns a promise here because db comes from a promise-based
        // helper lib, not the raw IndexedDB API
        store
          .put(item)
          .then((e) => {
            results.push(e);
          })
          .catch(reject);
      }

      // Signal that no more requests will be added to this transaction
      tr.commit();
    });
  }
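
For reference, here is a minimal sketch of how this could be wired up, assuming the promise-based idb wrapper and treating batchItemUpdate as a standalone function; the database name, store name, keyPath and endpoint are made up for the example:

import { openDB } from "idb";

// Open (or create) the database; names and keyPath are assumptions for this sketch
const db = await openDB("catalog", 1, {
  upgrade(database) {
    database.createObjectStore("products", { keyPath: "ean" });
  },
});

// Hypothetical endpoint returning the ~200k product references
const items = await fetch("/api/products").then((r) => r.json());

// One transaction per batch of 5k records, the size that worked best for me
const batchSize = 5000;
for (let i = 0; i < items.length; i += batchSize) {
  await batchItemUpdate(db, "products", items.slice(i, i + batchSize));
}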

But it's not really fast (unless I missed something?). It takes about 1 or 2 seconds per 5k records. It's even slower on mobile devices, which is where the PWA will be used.

If you have a specific use case, an in-memory database works better

So, I had to try something else. Basically, I get all my product references from the server. Let's store them in batches.

    const tr = db.transaction("pages", "readwrite", { durability: "relaxed" });
    const store = tr.objectStore("pages");
    store.clear();

    const bulkSize = 5000;
    for (let i = 0; i < Math.ceil(products.length / bulkSize); i++) {
      const chunk = products.slice(i * bulkSize, (i + 1) * bulkSize);
      const key = "page_" + i;
      store.put(chunk, key);
    }
    buildProductsFromDb();

In the snippet above, I save the data in big chunks of 5000 products. That goes much faster. Then I call `buildProductsFromDb()`, which basically does this:

async function buildProductsFromDb() {
  // Load the pages back and index every product by its ean code
  const productPages = await db.getAll("pages");
  productPages.forEach((page) => {
    for (let i = 0; i < page.length; i++) {
      productsDb[page[i]["ean"]] = page[i];
    }
  });
}

It loops over my chunks of 5k records and stores each product in a plain object, keyed by its ean. This lets me look up a product by ean without any worries, which is all I need to do.
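
To show what that buys me, here is a hypothetical lookup helper (findProductByEan is not from my actual code, just an illustration):

// productsDb is the in-memory index built by buildProductsFromDb()
const productsDb = {};

// Hypothetical helper: a plain property access, no transaction, no cursor, no await
function findProductByEan(ean) {
  return productsDb[ean] ?? null;
}

// e.g. after scanning a barcode
const product = findProductByEan("4006381333931");

The obvious trade-off is that all 200k references have to fit in memory, which is fine for read-only reference data like this.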

And... it works! Much, much faster than querying IndexedDB and storing all records one by one.

Lesson learned: if you have a specific use case (read-only data, looked up by a single key), you can work around IndexedDB's performance issues.