Generate array of JSON from CSV file

Hi, I’d like to know if the following is possible with Postman, perhaps using a pre-request script.

Let’s say I have a CSV file with 100 entries, each with 3 keys like “id”, “value”, “description”, and I want to create one request that hits an endpoint with all of these entries in the body. I’d need to read all entries from the CSV, convert them into an array of JSON objects, and then send the request.

So far I’ve seen how to iterate over each entry and create a request for each of them, but I’d like to avoid calling my endpoint 100 times when I could call it once with all entries in an array.

Would something like that be achievable at all with Postman? Or am I better off just creating the array outside of Postman first?

Hey @chatillon.mathieu :wave:

What’s the format of the request body? Can you provide an example please?

There’s nothing in the app that would convert that file and allow you to run it as a single request - CSVs can be used with the Runner but, like you mentioned, that’s going to use each line as an iteration and run the same request X times until the end of the file.

Services like this could help with the CSV > JSON conversion:

https://www.convertcsv.com/csv-to-json.htm

Hi @danny-dainton!

I’d like to convert something like this:

id,value,description
1,hello,this is hello
2,hi,this is hi

into

[
  { "id": "1", "value": "hello", "description": "this is hello" },
  { "id": "2", "value": "hi", "description": "this is hi" }
]

I may have to do this for thousands and thousands of items, so I was wondering whether a Postman script had a way to read the CSV and generate request bodies from it using some JS code. That would also let me batch these calls into groups of 100 items (e.g. 10 requests of 100 items each, spaced 5 seconds apart).

But I believe it’s indeed much easier to simply use an external tool to convert it!
Thanks a lot for this quick reply :slight_smile:

Currently, there isn’t a way to read a file from your filesystem and manipulate the data. You can sort of do that with a CSV file in the Collection Runner, using pm.iterationData.* in a pre-request script, but I don’t think that’s what you need here.
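To illustrate why (a rough sketch - the column names below just mirror the example CSV above), a pre-request script during a Runner run only ever sees the current row via pm.iterationData, never the whole file:

// Sketch only: inside a pre-request script during a Collection Runner run,
// pm.iterationData holds the columns of the current CSV row.
const entry = {
    id: pm.iterationData.get("id"),
    value: pm.iterationData.get("value"),
    description: pm.iterationData.get("description")
};

console.log(entry); // one row per iteration, not the full file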

It seems the external tool will do the conversion legwork in no time and give you the JSON array for the request body.

Once you have the data, you could use a script inside Postman to split it down into smaller arrays.

Here’s a very basic example using the _.chunk() method from Lodash.
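Something along these lines (a rough sketch - Lodash is built into the Postman sandbox as _, and items here is just a placeholder for your converted array):

// Rough sketch: split a flat array into batches of a fixed size.
// "items" stands in for the converted JSON array.
let items = [1, 2, 3, 4, 5, 6, 7, 8];

let batches = _.chunk(items, 3);

console.log(batches); // [[1, 2, 3], [4, 5, 6], [7, 8]]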

_.chunk() is great indeed! Any idea how I could use it to save the output somehow?

Umm…depends on how you’re going to eventually use each of them.

This would store each one as a new global variable:

// Example data - 12 placeholder objects, each using the
// {{$randomInt}} dynamic variable as its "id" value
let bigArray = [
    { "id": "{{$randomInt}}"},
    { "id": "{{$randomInt}}"},
    { "id": "{{$randomInt}}"},
    { "id": "{{$randomInt}}"},
    { "id": "{{$randomInt}}"},
    { "id": "{{$randomInt}}"},
    { "id": "{{$randomInt}}"},
    { "id": "{{$randomInt}}"},
    { "id": "{{$randomInt}}"},
    { "id": "{{$randomInt}}"},
    { "id": "{{$randomInt}}"},
    { "id": "{{$randomInt}}"}
]

// Split the array into chunks of 3 items each
let arrList = _.chunk(bigArray, 3);

// Store each chunk as its own global variable: arr-0, arr-1, arr-2, arr-3
_.each(arrList, (arr, index) => {
    pm.globals.set(`arr-${index}`, JSON.stringify(arr))
})
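Each of those globals could then be referenced in a raw JSON request body as {{arr-0}}, {{arr-1}}, and so on (those names are just the ones set above), and something like setTimeout in the script could space out the batched calls.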

Maybe there’s something there that you could work with.