IndexedDB - Bulk inserts are slow

A Web Developer based in Belgium, specialized in PHP & JS development

I'm currently building a PWA that needs to persist some data. Lots of data, actually. In my application, I need to store product references (about 200k of them). An easy task for IndexedDB? Not so much...

Bulk inserting

The IndexedDB API is not great, but with a couple of helper libraries you can get something that works decently. Until you try to insert 200k records. Yes, you can "bulk insert" them by using a single transaction for multiple records (batches between 5k and 10k seemed to work best for me), but it's still really slow.

Here, for example, is a helper that stores all items with a valid EAN code:

  batchItemUpdate(db, storeName, items) {
    return new Promise((resolve, reject) => {
      const tr = db.transaction(storeName, "readwrite", { durability: "relaxed" });
      const store = tr.objectStore(storeName);
      const results = [];

      // Basically, we gain a lot of speed by having only one oncomplete handler
      tr.onerror = reject;
      tr.oncomplete = () => resolve(results);

      for (let i = 0; i < items.length; i++) {
        const item = items[i];
        if (!item["ean"]) {
          continue;
        }
        // with the idb wrapper, each put() returns a promise rather than a raw IDBRequest
        store
          .put(item)
          .then((e) => {
            results.push(e);
          })
          .catch(reject);
      }

      tr.commit();
    });
  }
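To feed this one transaction per batch of 5k records, as described above, you need to split the full list first. A minimal sketch of such a splitter (the driver loop in the usage note assumes a store named "products", which is not from the original code):

```javascript
// chunk(): split an array into consecutive slices of at most `size` items.
function chunk(arr, size) {
  const out = [];
  for (let i = 0; i < arr.length; i += size) {
    out.push(arr.slice(i, i + size));
  }
  return out;
}
```

With it, the batched insert could be driven like `for (const batch of chunk(products, 5000)) await batchItemUpdate(db, "products", batch);` so each batch gets its own transaction.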

But it's not really fast (unless I missed something?). It takes about 1 to 2 seconds per 5k records, and it's even slower on mobile devices, which is where the PWA will be used.

If you have a specific use case, an in-memory database works better

So, I had to try something else. Basically, I get all my product references from the server. Let's store them in batches.

    const tr = db.transaction("pages", "readwrite", { durability: "relaxed" });
    const store = tr.objectStore("pages");
    store.clear();

    const bulkSize = 5000;
    for (let i = 0; i < Math.ceil(products.length / bulkSize); i++) {
      const chunk = products.slice(i * bulkSize, (i + 1) * bulkSize);
      const key = "page_" + i;
      // put(value, key) requires the "pages" store to use out-of-line keys (no keyPath)
      store.put(chunk, key);
    }
    buildProductsFromDb();

In the snippet above, I save the data in big chunks of 5,000 records. That goes much faster. Then I call `buildProductsFromDb`, which basically does this:

async function buildProductsFromDb() {
  // load the stored pages and index every product by its EAN in memory
  const productPages = await db.getAll("pages");
  productPages.forEach((page) => {
    for (let i = 0; i < page.length; i++) {
      productsDb[page[i]["ean"]] = page[i];
    }
  });
}

It loops over my chunks of 5k records and stores each product in an object keyed by its EAN. This lets me look up products by EAN without any worries, which is all I need to do.
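The resulting lookup is then a plain property access on the in-memory index. A minimal self-contained sketch of the same idea (the pages, EANs, and product names below are made up for illustration):

```javascript
// Rebuild the in-memory index from "pages" of products, then look one up.
// The page contents and EAN values are illustrative, not real data.
const productPages = [
  [{ ean: "4006381333931", name: "Pen" }, { ean: "5410000123456", name: "Mug" }],
  [{ ean: "9002490100070", name: "Soda" }],
];

const productsDb = {};
for (const page of productPages) {
  for (const product of page) {
    productsDb[product.ean] = product;
  }
}

// O(1) lookup by EAN, no IndexedDB query involved:
const hit = productsDb["5410000123456"];
```

Since the app only ever reads by EAN, a plain object (or a `Map`) covers the whole query surface without touching IndexedDB again.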

And... it works! Much, much faster than querying IndexedDB and storing all records one by one.

Lesson learned: if you have a specific use case (read-only, lookup by a single key), you can work around IndexedDB's performance issues.

M:

Wow, this is a great idea, I've never seen anyone do it before. Honestly I'm still a bit surprised that it works 😅 But wouldn't it be better to split your product info into multiple tables? For example, you could create a table that only has the product names so you can run search queries on it, and put references in it to other tables that hold prices, brands, and other info... This could be easier to manage, and you also wouldn't have to write all of the data to IndexedDB at once: you could just write the names you need to display, and the rest of the info could be written to the DB in the background while the user browses your app. If you have other opinions or better ways to do this, please share your knowledge, as I'm currently working on an app with almost the same requirements and it's my first time working with IndexedDB 🙏 Thanks again for the great article 🙂
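As a rough sketch of the split this comment describes, one record could be divided into a lightweight part (for the store you display and search first) and a detail part (written to a second store in the background). The field names (`ean`, `name`, `price`, etc.) are illustrative assumptions, not from the article:

```javascript
// Sketch: split a full product record into a lightweight record and a
// detail record, both sharing the EAN as key so they can be re-joined.
// Field names here are hypothetical examples.
function splitProduct(product) {
  const { ean, name, ...rest } = product;
  return {
    light: { ean, name },      // would go into a "names" store, keyed by ean
    detail: { ean, ...rest },  // would go into a "details" store, same key
  };
}
```

Writing the `light` records first keeps the initial sync small, while the `detail` records can be inserted later without blocking the UI.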

T:

My use case is that I needed to synchronize a product database. Basically, I get a really big set of data that needs to work offline.

My first naive approach was inserting the products one by one, basically replicating one row per product. But that didn't work, so I ended up inserting them as "pages" and then dealing with them in memory.

I'm not saying this works for every situation, but for me it made such a big difference. The app feels really instant, and that works really well :)

M:

Thomas Portelange That's great. Thank you very much for sharing 🤍