My question:
I am building a test suite with Postman, executed by the Newman runner from Octopus. We have many different scenarios, and almost all of them need to be data-driven. Our CI/CD pipeline works from a yml file with our test “definitions”, basically JSON from which the runner gets which collection to run, optionally a folder to execute tests from, and a data file to use…
Since we have so many scenarios, the tests would have to be grouped into many subfolders so that each folder can use a different test data file, which creates a lot of definitions in my yml file, and I suspect that adds too much overhead to the overall testing time… So I was thinking: if I could run all the tests with a single test-data file containing a collection of objects for each need, and then preprocess it before each group of tests to extract the desired object collection and use it for the tests inside that folder, that could help a lot…
Like this: test-data.json
[
  {
    "Clients": [
      { "ClientName": "John" },
      { "ClientName": "Peter" }
    ]
  },
  {
    "Suppliers": [
      { "SupplierName": "George" },
      { "SupplierName": "Robert" }
    ]
  }
]
And then, in the folder for the Clients tests, a pre-request script could extract the Clients collection from the data file and use that for the tests… and so on for the other folders.
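To make the idea concrete, here is a minimal sketch of the extraction step. The `extractGroup` helper and the `testData` variable are hypothetical names for illustration; how the combined file actually reaches the script (e.g. via an environment variable or `pm.iterationData`) would depend on the setup:

```javascript
// Pull the named group (e.g. "Clients") out of the combined structure.
// Each element of the combined array holds exactly one group key.
function extractGroup(testData, groupName) {
  for (const entry of testData) {
    if (Object.prototype.hasOwnProperty.call(entry, groupName)) {
      return entry[groupName];
    }
  }
  return []; // group not found
}

// Example input mirroring test-data.json:
const testData = [
  { "Clients": [{ "ClientName": "John" }, { "ClientName": "Peter" }] },
  { "Suppliers": [{ "SupplierName": "George" }, { "SupplierName": "Robert" }] }
];

const clients = extractGroup(testData, "Clients");
console.log(JSON.stringify(clients));
// In an actual Postman pre-request script one could then store the
// result for the requests in the folder, e.g.:
//   pm.variables.set("clients", JSON.stringify(clients));
```

This is only a sketch of the preprocessing; the requests in the folder would still need to read the stored variable back and iterate over it themselves.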
Could something like this be achievable?
Thanks