Iterate through the data object using JS in Tests?

All,

Newbie here to Postman, JS, API testing, etc., so please bear with me. I come from a test automation background and am trying to use Postman to do something that seems reasonable, but I'm not sure how to pull it off. I've been playing around and think I have a pretty good handle on environments, variables, data files, collections, etc., and will try to use the correct terminology. Please correct me where I may be wrong/off base.

I am working on a bit of a POC where I am sending a request to a web service that returns a JSON formatted document list with elements such as document type, document id, document date, etc. given an account number.

What I want to do is validate each of the elements for each document returned within a single request against the iterations in the data file without submitting the same request for each iteration of the data set. I’ve played around with data files and see references to the data object but am unclear on how to reference the array sections and enumerate on them similar to how one can in the response. I am using JSON format for my data file based on the example here: https://www.dropbox.com/s/o2cguyx4iv053j6/data-article.json?dl=0

If this is possible, I have a few more questions. If my data file has 5 array sections in it but I only run one iteration, are all sections of the data file loaded into the data object or is only the first iteration loaded? Will I have access to each section of all of the data sets in the data file in this scenario?

I’m able to access the individual elements of the data file when using the collection runner and running all iterations in the data file but haven’t figured out how to access and enumerate through the entire data file programmatically. I’ve tried using data.length to get a count of the sections but it returns null.

Any insight on how to do this would be greatly appreciated.

TIA

There is a 1:1 mapping between array items and iterations. The first item is sent to the first iteration, and so on.

One idea, though: to access all of the data at once, try sending it as a single array entry, with the full data set serialized as a JSON string inside it:

[{
    "allData": "[{\"item1\": \"value1\"}, {\"item2\": \"value2\"}]"
}]

Then inside your scripts, you can use JSON.parse to convert it back to an object:

var allDataArray = JSON.parse(pm.data.get('allData'));
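To illustrate the round trip (a self-contained sketch; in the Collection Runner the string would come from the data object rather than being inlined here):

```javascript
// Sketch: how a stringified array in a data file round-trips.
// In the runner, this string would come from the current iteration's data;
// it is inlined so the example stands on its own.
const allData = '[{"item1": "value1"}, {"item2": "value2"}]';

// Convert the stored string back into a real array of objects.
const allDataArray = JSON.parse(allData);

// Enumerate every entry, as you would inside a test script.
allDataArray.forEach(function (item, index) {
  console.log('entry ' + index + ':', JSON.stringify(item));
});
```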

Thanks for the point in the right direction! I tried a variation of what you suggested before posting the OP but the collection runner was throwing a syntax error. After some playing around however I did get it to read/preview my data file without error.

Now however when I attempt to run I am getting this:

TypeError | Cannot read property 'get' of undefined

Sample data:

[
  {
    "NCPValidation": [
      { "membernumber": "123456", "taxId": "123456789" },
      { "membernumber": "654321", "taxId": "987654321" }
    ]
  }
]

Line that is throwing the error:

var allDataArray = JSON.parse(pm.data.get('NCPValidation'));

It looks like it doesn’t like/can’t find the object name in my data array. Probably something simple but being a newbie, I’m struggling.

Help?

@JWolf, if you're using data files in the Collection Runner or in Newman, you'll have access to a data object, which is a dictionary of the data values for the current test run. Check here: https://www.getpostman.com/docs/postman/scripts/postman_sandbox

Also, your sample data needs to be updated (you have to pass the array as a string; see below).

Sample data:
[
  {
    "NCPValidation": "[{"membernumber": "123456", "taxId":"123456789"},
    {"membernumber": "654321", "taxId":"987654321"}]"
  }
]

Now in your script you can extract that array with JSON.parse(data.NCPValidation).

Hope that is what you are looking for.
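As a self-contained sketch of that extraction and the loop over it (the row object stands in for the data object you'd have in the runner, and the Postman assertion is shown as a comment):

```javascript
// Sketch: parsing the stringified "NCPValidation" array and looping over it.
// In the Collection Runner this string comes from the data object;
// it is inlined here so the example runs anywhere.
const row = {
  NCPValidation: '[{"membernumber": "123456","taxId":"123456789"},' +
                 '{"membernumber": "654321","taxId":"987654321"}]'
};

const validations = JSON.parse(row.NCPValidation);

validations.forEach(function (v) {
  // Inside a Postman test you would compare these against the response, e.g.:
  // pm.expect(responseDoc.taxId).to.eql(v.taxId);
  console.log(v.membernumber, v.taxId);
});
```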

Thanks

That is what I’m looking for, however there’s still something wrong with the JSON. Do the quotes need to be escaped?

When I copy and paste the above into a text editor, correct the quotes, and load it into the PM collection runner, select the json data file type and hit preview I get this:

Error reading data file: SyntaxError: Unexpected token in JSON at position 27.

This will of course not run with this error.

Suggestions?

@JWolf You are right, the snippet I provided was just a copy of your JSON data. The actual JSON needs the inner quotes escaped, like this:

[{
  "NCPValidation": "[{\"membernumber\": \"123456\",\"taxId\":\"123456789\"},{\"membernumber\": \"654321\",\"taxId\":\"987654321\"}]"
}]

Thanks

Ok, I had tried that exact thing before but was still getting errors. It appears that inserting a line break between data sets for readability/maintainability breaks the parsing.

Are there any tricks to achieve this? As you might expect, my real data per validation set is much larger than the sample, and it becomes unwieldy to have the entire data file content on a single line.

Suggestions?
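One workaround (a sketch; the output file name is hypothetical): keep the readable, multi-line JSON in a small Node script and let JSON.stringify produce the escaped one-liner for you, rather than hand-editing it:

```javascript
// Sketch: generate the single-line, escaped data file from readable JSON,
// so the escaped string is never maintained by hand.
const validations = [
  { membernumber: '123456', taxId: '123456789' },
  { membernumber: '654321', taxId: '987654321' }
];

// Stringifying the inner array, then nesting it as a string value,
// produces exactly the escaped one-line form the Collection Runner expects.
const dataFile = JSON.stringify([
  { NCPValidation: JSON.stringify(validations) }
]);

console.log(dataFile);
// To write it out (file name is illustrative):
// require('fs').writeFileSync('data-file.json', dataFile);
```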

Can you please share how a CSV file with the same array would look like?

I suspect it’d look like this:

membernumber,taxId
123456,123456789
654321,987654321

The idea of a CSV would work, and I found a node package (https://www.npmjs.com/package/csv-array) that will take a CSV file and turn it into a JSON array. I installed the module with npm, but when I try to use it per the usage instructions, it throws this in the Postman console:

Error | Cannot find module 'csv-array'

I've moved it around and tried pathing directly to where it's installed using proper module referencing, var csv = require('…/…/…/node_modules/csv-array'), to no avail. I think this might be a solution if I could get non-core modules to work in Postman.
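For what it's worth, a simple CSV can be parsed with nothing but string methods, which sidesteps the external-module problem entirely. A minimal sketch, assuming no quoted fields or embedded commas:

```javascript
// Sketch: a minimal CSV-to-objects parser using only string methods,
// avoiding external modules entirely.
// Assumes simple CSV: no quoted fields, no embedded commas or newlines.
function csvToArray(csvText) {
  const lines = csvText.trim().split('\n');
  const headers = lines[0].split(',');
  return lines.slice(1).map(function (line) {
    const values = line.split(',');
    const row = {};
    headers.forEach(function (header, i) {
      row[header.trim()] = (values[i] || '').trim();
    });
    return row;
  });
}

const csv = 'membernumber,taxId\n123456,123456789\n654321,987654321';
const rows = csvToArray(csv);
console.log(rows);
```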

I even tried to use the core process module (https://nodejs.org/api/process.html), attempting even the most basic operation with it (getting the current directory) to try to debug the above issue:

console.log('Current directory: ' + process.cwd());

I get an error:

ReferenceError | process is not defined

Feel like I’m fishing around in the dark so far with this tool…

I tried using the below and it worked. Example: my test data file has multiple usernames/passwords. In my CSV file I write it as:

OneValue
"xyz,xyz123
abc, abc123
ghj,ghj123"

In Postman, I have 3 requests but I don't want to iterate 3 times. I put one row in the CSV file as above and write a pre-request script to send different data each time, reading it from the CSV file.

My sample pre-request script:

let newArray = data.OneValue; // newArray will have the entire string present in OneValue
let dataArray = newArray.split('\n');
// dataArray will have the string as an array, with each element holding one set of data
let myData = dataArray[1].split(',');
// Splitting the element on commas to get each value separately
pm.environment.set("username", myData[0]);
pm.environment.set("password", myData[1]);

I use this script in each of my 3 requests, changing the dataArray[number] accordingly. This way I can send different data for different requests from a single data file, the CSV stays readable, and it can be reused for multiple data sets as well. @JWolf - I am not sure if this was your initial query, but just sharing this. Thanks!
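A small variation on that script (sketched outside the sandbox, so the pm.environment.set calls are shown as comments) avoids hardcoding dataArray[1] by looping over every row:

```javascript
// Sketch: the same splitting logic, but looping over every row instead of
// picking one hardcoded index. The credential string is inlined here;
// in the runner it would be data.OneValue.
const oneValue = 'xyz,xyz123\nabc, abc123\nghj,ghj123';

const credentials = oneValue.split('\n').map(function (line) {
  const parts = line.split(',');
  return { username: parts[0].trim(), password: parts[1].trim() };
});

credentials.forEach(function (cred) {
  // Inside a pre-request script you would set these per request, e.g.:
  // pm.environment.set('username', cred.username);
  // pm.environment.set('password', cred.password);
  console.log(cred.username, cred.password);
});
```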

Thanks for the insight. I think I'm trying to do roughly the opposite of what you're presenting here: I want to send a single request and use the data file both for input and for validation of the response. Ideally I'd like to leverage two files, one as the data file and one as the validation file, but the Postman sandbox seems limited in what it can do.

My data is more complex than the example I've provided: rather than a single set of two values, I have 8 values per validation set and a dynamic number of validation sets depending on the data sent in the request. So effectively I'll send a membernumber and a taxId and get a different result for each request, with validation data for each document returned in a response. Since Postman only supports a single data file, and the sandbox doesn't allow you to load/open/parse an arbitrary file on disk, this is where I run into limitations.

I would really prefer to work with the key/value pair motif that JSON offers.
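As a sketch of that key/value approach (field names here are purely illustrative): key the stringified validation data by membernumber, then look up the expected values for whatever the response returns:

```javascript
// Sketch: validation data keyed by membernumber, stored as a JSON string
// (as it would be in a single data-file cell) and parsed back in the test.
// All field names are illustrative, not the real schema.
const validationData = JSON.parse(
  '{"123456": {"taxId": "123456789", "documentCount": 2}}'
);

// Simulated response body; in a test script this would be pm.response.json().
const response = {
  membernumber: '123456',
  taxId: '123456789',
  documents: [{}, {}]
};

// Look up the expected values for this member and compare.
const expected = validationData[response.membernumber];
console.assert(response.taxId === expected.taxId);
console.assert(response.documents.length === expected.documentCount);
console.log('validation passed for', response.membernumber);
```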

I think I can achieve this if I can get external packages to work so that I could leverage the csv-array package.

I am going to post a thread about that issue and see if I get some resolution.