Execute many promises sequentially (Concept)



























(My goal is to clarify my understanding of the problem, not to get code.)

I want to execute an array of promises sequentially, but Node.js throws a strange error about too many promises running in parallel. (I tested this: limiting the array to 20 promises works, 50 works, but 9000 promises blows up.)

  • I know there are solutions such as array.reduce(), loops, etc.
  • I know about promise states (my array initially holds pending promises).

My question: I could execute 20 promises, then another 20, and so on, but... if I'm executing my promises sequentially, shouldn't Node.js be able to handle 9k promises without a problem? Is my understanding wrong? Is my code wrong?

(I'm doubting myself because Node.js waits some time before it starts resolving the promises.)

My case: I'm trying to download 9k+ images (with axios), save each one, and then wait 5 seconds, all sequentially:
[download 1 image, save that image, wait 5 seconds, then download the next image, save it, wait, etc.] Is this possible?
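The distinction the question hinges on can be sketched directly (this is my own illustration, not code from the question): a promise begins its work the moment it is created, so an array of 9000 promises means 9000 downloads already running in parallel, and chaining .then() only sequences the handling of their results. To execute work sequentially, store functions that create promises, and await each one before calling the next:

```javascript
// Factories, not promises: no work starts until a factory is called.
// (setTimeout stands in for a download here.)
const tasks = [1, 2, 3].map(n => () =>
  new Promise(resolve => setTimeout(() => resolve(n), 10)))

async function runSequentially(factories) {
  const results = []
  for (const makePromise of factories) {
    results.push(await makePromise()) // the next task starts only now
  }
  return results
}

runSequentially(tasks).then(results => console.log(results)) // → [ 1, 2, 3 ]
```

With this shape, only one promise exists at a time, so the array size (20 or 9000) no longer matters for concurrency.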






























  • So first question: why in the world would you want to chain 9000 promises? That is just poor design. Doesn't the API or tool you use offer bulk gets? – basic, Jan 3 at 14:45

  • I cannot find any reason for such an amount of promises. – messerbill, Jan 3 at 14:49

  • Theoretically speaking, did you try configuring Node to run with more memory? – Maayao, Jan 3 at 14:51

  • I think Promise.all only cares about whether all the promises resolved, and it lets them execute in parallel; I tried it. Thanks. – AndresSp, Jan 3 at 15:00

  • Take a look at this npm package: npmjs.com/package/promise-queue – Jaime, Jan 3 at 15:24
















javascript node.js asynchronous promise axios






asked Jan 3 at 14:42 by AndresSp · edited Jan 3 at 16:38 by Armel








1 Answer
I would use something like a worker pool instead of executing things in batches of 20 at a time: with batches you always end up waiting for the last one in a batch to finish before starting the next 20. Instead, set a limit on how many concurrent downloads you want, so you never have more than 20 promises in flight rather than one long chain of 9000.

The same thing can be accomplished with iterators: the same iterator can be passed to several workers, and whenever one worker takes an item, the next worker gets the item after it.

So, with zero dependencies, I would do something like this:

const fs = require('fs')
const path = require('path')
const axios = require('axios')

const sleep = n => new Promise(rs => setTimeout(rs, n))

async function sequentialDownload(iterator) {
  for (const [index, url] of iterator) {
    // figure out where to save the file
    const filePath = path.resolve(__dirname, 'images', index + '.jpg')

    // download the image as a stream
    const res = await axios.get(url, { responseType: 'stream' })

    // pipe the stream to disk
    const writer = fs.createWriteStream(filePath)
    res.data.pipe(writer)

    // wait for the download to complete
    await new Promise(resolve => writer.on('finish', resolve))

    // wait an extra 5 seconds
    await sleep(5000)
  }
}

const arr = [url1, url2, url3] // urls to be downloaded
const workers = new Array(20)  // create 20 "workers"
  .fill(arr.entries())         // fill it with the same shared iterator
  .map(sequentialDownload)     // start working

Promise.all(workers).then(() => {
  console.log('done downloading everything')
})
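The shared-iterator technique this answer relies on can be checked in isolation. A minimal sketch (my own illustration, with made-up data): several consumers pulling from one `entries()` iterator each receive distinct items, so the work distributes itself without any extra coordination:

```javascript
// Two async "workers" draining one shared array iterator. Each call to
// iterator.next() (made implicitly by for...of) hands out the next
// unclaimed item, so no item is processed twice and none is skipped.
const items = ['a', 'b', 'c', 'd'].entries() // one iterator, shared below
const seen = []

async function worker(iterator) {
  for (const [, value] of iterator) {
    seen.push(value)                          // record the claimed item
    await new Promise(r => setTimeout(r, 5))  // simulate async work
  }
}

const done = Promise.all([worker(items), worker(items)])
done.then(() => console.log(seen.sort())) // → [ 'a', 'b', 'c', 'd' ]
```

Because each worker awaits between pulls, the two loops interleave, yet every item is claimed exactly once — the same property the 20-worker download pool depends on.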





answered Jan 3 at 15:25 by Endless































