I wonder if there is any way in Newman or the Collection Runner to run a collection that has, for example, 1 million requests?
With the standard approach there isn’t: the Node process accumulates data until it crashes with a “JavaScript heap out of memory” error.
I really don’t want to store any data for the report; it’s useless to me. Only global variables need to be kept. I just want to keep sending requests almost indefinitely.
Is there any way to run such a long collection in this environment?
Thanks for any kind of help.
Are you able to explain more about the actual use case here? I’m trying to understand the why behind your problem.
What feedback are you looking to get from 1 million, 500K, 100K requests running? Are you testing for a particular thing? Are you doing some form of Load or Performance Test? Are you seeding a database with a bunch of data?
I fill my database with objects, events, trends, etc.
My collection has 20 requests, but I jump between them many times. Let’s say I jump to the request that creates an event 1 million times.
I would be delighted if there were an option to not store any data for the report and to just keep sending requests indefinitely (so that the Node process’s memory usage doesn’t grow).
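For context, the jump is a test script along these lines (a minimal sketch; the request name and the eventCount global are placeholders):

```javascript
// Test script on the "Create Event" request (the name is a placeholder).
// Loop back to this request until 1 million events have been created.
let count = parseInt(pm.globals.get("eventCount") || "0", 10);
count++;
pm.globals.set("eventCount", count);

if (count < 1000000) {
    postman.setNextRequest("Create Event"); // jump back to this request
} else {
    postman.setNextRequest(null); // end the run
}
```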
By default, Node allocates only about 1.5 GB of heap memory, and Newman uses all of it, so it crashes.
You can run Newman as a Node.js library and do something like:
node --max-old-space-size=8192 newman.js  # increase the heap to 8 GB
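Here newman.js would be a minimal script along these lines (the collection path and iteration count are assumptions; adjust for your setup):

```javascript
// newman.js -- run Newman programmatically instead of via the CLI.
const newman = require('newman');

newman.run({
    collection: require('./collection.json'), // path is an assumption
    iterationCount: 1000000,
    reporters: [] // no reporters, to cut down on what gets kept in memory
}, function (err) {
    if (err) { throw err; }
    console.log('Collection run complete.');
});
```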
But don’t get your hopes up: I only managed to get ~100,000 requests to run without increasing the memory, so you may need a lot of memory to finish 1 million. I even tried disabling the CLI reporter, but to no avail.
My first impression is that there is a memory leak somewhere.
In my case, I use a collection of 5 requests with 2,500 iterations, and it fails after about 10K requests. The logs generated by this come to about 10 MB (I can see that in Jenkins), so I don’t think logging is the issue here.
Why would Newman use 1.5 GB? If it stores all the response data, it would be nice to have an option not to. We run nowhere near that many tests but still get crashes. Another issue is that it’s EXTREMELY SLOW: the requests take about two minutes combined, but the test run lasts for hours.
I tried to run 50 collections in parallel using the Newman Node.js library. Each run adds about 10 MB of memory usage (a huge amount for a single collection), and each operation adds a further 1–2 MB.
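For reference, the setup was roughly like this (a sketch; the collection path and run count are illustrative):

```javascript
// Launch 50 Newman runs in parallel and let them run concurrently.
const newman = require('newman');

const collection = require('./collection.json'); // path is illustrative

for (let i = 0; i < 50; i++) {
    newman.run({
        collection: collection,
        reporters: []
    }, function (err) {
        if (err) {
            console.error('Run ' + i + ' failed:', err);
        }
    });
}
```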