Just asking for some help if possible!
I am very new to coding, so going too deep into writing code is slightly confusing (I am learning Python and JavaScript, but this is taking time!)
So I work in the R&D department of a security company. We use digital security scanners that we can obtain data from via API calls, e.g. a GET request to each unit's log endpoint.
I have managed to work out how to connect to multiple units in Postman using their IP addresses and run a collection that pulls the logs from these systems, 50 times in a row per unit; each call returns 32 'logs' at a time until there is no data left to pull.
Using the collection runner lets me automate this every day, which is fantastic… my issue is that I need to export the response body from each of these calls to a file, automatically.
Ideally this would run every day and pull all the data logs off these systems into a file, so that at the end of each month we can analyse all the data.
I have read a lot about Newman and Node.js but am struggling to get my head around them at the moment!
I don’t quite understand why you need the raw responses.
I’ve seen it requested quite a few times but don’t quite see the point in most cases.
I don’t know how you want to analyse the data, so you may have a perfectly good reason for doing this.
I would craft relevant tests for the information I need. Then, using Newman, you can get an HTML report of the results which you can analyse. Even better, you can then schedule the Newman job using various methods, including continuous integration tools like Jenkins or Azure DevOps.
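For a lightweight version of that scheduling without a CI server, a cron entry running the Newman CLI would do; a sketch, with hypothetical paths (note the HTML report needs the separately installed `newman-reporter-html` package, since the HTML reporter is not bundled with Newman itself):

```shell
# Crontab entry: run the collection nightly at 02:00 and keep a dated JSON summary.
# Paths and filenames are placeholders; % must be escaped as \% inside crontab.
0 2 * * * newman run /home/user/scanner-logs.postman_collection.json -r cli,json --reporter-json-export /home/user/reports/run-$(date +\%F).json
```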
Another option, if you really need to analyse the data in depth and don't want to use Postman for it, is PowerBI, if you have that available. You can pull data straight into the reporting tool from the APIs, and as it's a reporting tool that excels with multiple data sets it will give you more flexibility in the analysis.
The following links/discussions may be of use.