Hi, I’m pretty new to using Postman for my job, but I’ve been tasked with figuring out a way to run an API call with multiple iterations on a weekly/monthly cadence to update our cloud configuration.
Basically, I am being provided a formatted CSV file that I need to run through a collection, but the CSV file contains more than 50 iterations of the API call. Is there a way to allow a scheduled run to accept more than 50? When running it normally there is no problem.
This is for a retail POS environment where we upload offline tax data to our cloud POS solution based on tax changes from local authorities, so we need to do this on a monthly cadence.
Hi @sparcgroupllc Welcome to the Postman Community!
There are multiple options you can explore for running a collection on a schedule. For your use case, I’d recommend using a scheduled collection run or a Monitor (if you want to receive alerts when certain events happen, like a request failure). Both options allow you to schedule a collection run on a monthly, weekly, daily, or even hourly cadence.
When uploading data (CSV, JSON, etc.) in a collection run, you’re limited to a maximum of 5 MB, so if your data size exceeds 5 MB, you will get an error.
@gbadebo-bello I have the schedule set up and that all looks good, but the problem is the .csv. Each line in the CSV is an iteration of the runner I’m trying to run, and it only allows 50 at a time. I need to run the entire CSV in one shot, which can be 400–500 iterations depending on the changes for that month. The file size isn’t a problem; it’s the number of iterations it allows to run, that’s my problem.
Perhaps you’re modelling this the wrong way. I have a few questions.
- Can you share the structure of your setup and how the data in this CSV file is being run?
- How are you translating each line in the CSV to a single iteration of the runner?
- Can you share a screenshot showing that you’re limited to 50 iterations per scheduled run?
@gbadebo-bello Here’s a screenshot of my schedule. Apologies, I don’t have the CSV file on hand at the moment, but the schedule runs a specific POST request, shown in the second screenshot. Each line in the CSV corresponds to an iteration of the POST being run. The file can be 400 or so lines long, and while yes, it can be trimmed, we have to do one file per location across a total of 500 locations, so breaking it out would be very cumbersome.
Thank you for providing this additional detail. I have inquired internally so I can provide you with a more informed response. I’ll let you know once I hear back.
Unfortunately, there currently isn’t a way to raise or bypass this limit. However, the team internally is looking into the possibility of increasing this limit.
If this CSV data can be represented as JSON, there are manual ways you could go about running a single request multiple times with the collection runner.
If the data is presented as a JSON array, you can shift or pop objects off the array in your pre-request script, storing the current object in a collection or environment variable, then call pm.setNextRequest() to run the actual request. In the actual request, call pm.setNextRequest() again, pointing back to the request that holds the variable, so the loop continues until the array is empty.
Here is an example of what this could look like.
This might not be the exact code you’ll need, but it should give you an idea of the workflow.
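As a minimal sketch of that pre-request script: the variable names (`taxData`, `currentRow`) and the request name (`"Upload tax data"`) are placeholders, not from the original thread, and the small `pm` mock at the top exists only so the snippet runs outside Postman (delete it when pasting into a pre-request script).

```javascript
// --- tiny pm mock so the sketch runs outside Postman; delete inside Postman ---
const store = { taxData: JSON.stringify([{ id: 1 }, { id: 2 }, { id: 3 }]) };
const state = { next: undefined };
const pm = {
  environment: {
    get: (key) => store[key],
    set: (key, value) => { store[key] = value; },
  },
  setNextRequest: (name) => { state.next = name; },
};
// ------------------------------------------------------------------------------

// Pre-request script body: drain one object per iteration.
const rows = JSON.parse(pm.environment.get("taxData") || "[]");
if (rows.length > 0) {
  const current = rows.shift();                      // take the next row
  pm.environment.set("currentRow", JSON.stringify(current));
  pm.environment.set("taxData", JSON.stringify(rows));
  pm.setNextRequest("Upload tax data");              // loop back to this request
} else {
  pm.setNextRequest(null);                           // array drained: end the run
}
```

The request body would then reference `{{currentRow}}` (or fields of it), and each pass through the loop consumes one more object until `pm.setNextRequest(null)` ends the run.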