
Executing 100K Promises In JavaScript In Small Chunks Of 50

I have a function that makes a REST call to a service and returns a promise; let's call that function Execute(). The function takes an ID and sends the ID as a GET parameter to a REST endpoint.

Solution 1:

You can use async/await to perform the asynchronous work in sequential order: schedule another call to the same function if the original array still contains elements, otherwise return the array of results.

// Simulate 2,000 IDs to process (use 100,000 in the real case).
let arr = Array.from({
  length: 2000
}, (_, i) => i);

// Copy the IDs into a queue that will be drained 50 at a time.
let requests = arr.slice(0);

let results = [];

// Resolve one chunk with Promise.all, then recurse on the next chunk.
let fn = async (chunks, results) => {
  // setTimeout stands in for the real Execute(id) REST call.
  let curr = await Promise.all(chunks.map(prop =>
    new Promise(resolve => setTimeout(resolve, 500, prop))));
  results.push(curr);
  console.log(curr);

  // Keep going while IDs remain in the queue, otherwise return all results.
  return requests.length
         ? fn(requests.splice(0, 50), results)
         : results;
}

fn(requests.splice(0, 50), results)
.then(data => console.log(data))
.catch(err => console.error(err))

Solution 2:

With plain promises, this can only be done with recursion.

If you can use a newer version of Node.js, use async/await; it will work the way you expect, and you can await Promise.all(promises) for each chunk, as in the sketch below.
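A minimal sketch of that idea, assuming the Execute(id) function from the question returns a promise:

// Process the IDs in chunks of 50, awaiting each chunk before starting the next.
async function runInChunks(ids, size = 50) {
  const results = [];
  for (let i = 0; i < ids.length; i += size) {
    const chunk = ids.slice(i, i + size);
    // Execute(id) is the promise-returning REST call from the question.
    const chunkResults = await Promise.all(chunk.map(id => Execute(id)));
    results.push(...chunkResults);
  }
  return results;
}

// runInChunks(Array.from({ length: 100000 }, (_, i) => i)).then(console.log);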

If you can't, there is a nice library (called Async) whose parallelLimit method can execute up to 50 asynchronous calls at once: https://caolan.github.io/async/v3/docs.html#parallelLimit

It is even better than chunks, because if you have one slow callback in a chunk, it blocks everything else in that chunk; with a parallel limit, the library just keeps 50 callbacks running at all times. (However, you can still pre-create chunks of 50 and use the .series method if you insist on chunking.)
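A minimal sketch of the parallelLimit approach, assuming the Execute(id) function from the question returns a promise; the callback wrappers are there because Async's tasks are callback-style:

const async = require('async');

const ids = Array.from({ length: 100000 }, (_, i) => i);

// Wrap each promise-returning call in a callback-style task for Async.
const tasks = ids.map(id => callback => {
  Execute(id)
    .then(result => callback(null, result))
    .catch(err => callback(err));
});

// Run the tasks with at most 50 in flight at any time.
async.parallelLimit(tasks, 50, (err, results) => {
  if (err) return console.error(err);
  console.log(results.length); // one result per ID
});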


Solution 3:

You can wrap each promise in a function and push it into an array.
Then split the array into chunks of 50 and process them sequentially with reduce, as sketched below.
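A minimal sketch of that approach, assuming Execute(id) from the question returns a promise:

const ids = Array.from({ length: 100000 }, (_, i) => i);

// Wrap each call in a function so it only starts when invoked.
const tasks = ids.map(id => () => Execute(id));

// Split the task array into chunks of 50.
const chunks = [];
for (let i = 0; i < tasks.length; i += 50) {
  chunks.push(tasks.slice(i, i + 50));
}

// Chain the chunks with reduce so each chunk waits for the previous one.
chunks.reduce(
  (prev, chunk) =>
    prev.then(results =>
      Promise.all(chunk.map(task => task())).then(r => results.concat(r))
    ),
  Promise.resolve([])
).then(allResults => console.log(allResults.length));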

There is also an npm package for this:
https://www.npmjs.com/package/concurrency-promise

