How to limit (or queue) calls to external processes in Node.JS?

Moonlightos
Scenario

I have a Node.JS service (written using ExpressJS) that accepts image uploads via DnD (example). After an image is uploaded, I do a few things to it:

[*] Pull EXIF data from it
[*] Resize it

These calls are handled via the node-imagemagick module at the moment, and my code looks something like this:

[code]
app.post('/upload', function(req, res) {
    // ... <stuff here> ...
    im.readMetadata('./upload/image.jpg', function(err, meta) {
        // handle EXIF data.
    });
    im.resize(..., function(err, stdout, stderr) {
        // handle resize.
    });
});
[/code]

Question

As some of you have already spotted, the problem is that if I get enough simultaneous uploads, every single one of those uploads spawns an 'identity' call and then a resize operation (both from ImageMagick), effectively killing the server under high load.

Just testing with [code]ab -c 100 -n 100[/code] locks up my little 512 MB Linode dev server so badly that I have to force a reboot. I understand that my test may simply be too much load for the server, but I would like a more robust approach to processing these requests so I get a graceful failure rather than total VM suicide.

In Java I solved this issue by creating a fixed-thread ExecutorService that queues up the work and executes it on at most X threads.

In Node.JS, I am not even sure where to start solving a problem like this. I don't quite have my brain wrapped around its non-threaded nature, or around how I could create an async JavaScript function that queues up the work while another... (thread?) processes the queue.

Any pointers on how to think about this, or how to approach it, would be appreciated.

Addendum

This is not the same as this question about FFMpeg, although I imagine that person will hit this exact same issue as soon as his webapp is under load, since it boils down to the same problem (firing off too many simultaneous native processes in parallel).
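For what it's worth, here is a minimal sketch of the ExecutorService-style pattern in plain JavaScript, under the assumption that each unit of work is a function taking a Node-style callback. `createQueue` is a hypothetical helper written for illustration, not an existing API; it caps the number of tasks running at once and holds the rest in an array until a slot frees up:

```javascript
// Hypothetical helper: a fixed-concurrency task queue, analogous to a
// fixed-thread ExecutorService, but using callbacks instead of threads.
function createQueue(concurrency) {
    const tasks = [];   // queued work: { task, done }
    let running = 0;    // tasks currently in flight

    function next() {
        // Start queued tasks while we have free slots.
        while (running < concurrency && tasks.length > 0) {
            const item = tasks.shift();
            running++;
            item.task(function (err, result) {
                // Task finished: free the slot, report back, refill.
                running--;
                item.done(err, result);
                next();
            });
        }
    }

    return {
        // task: function(callback), done: function(err, result)
        push: function (task, done) {
            tasks.push({ task: task, done: done });
            next();
        }
    };
}
```

In the upload handler, instead of calling `im.resize(...)` directly you would push a wrapper onto a shared queue, e.g. `queue.push(function (cb) { im.resize(..., cb); }, handleResult);`, so at most `concurrency` ImageMagick processes run at any moment and the rest wait their turn.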