Here are my questions:
Is there a way to improve my test script so that it can capture more than 2000 values from a JSON array in an API response and reuse them in the next API call?
Alternatively, instead of sending JSON, how do I capture all the data from Metabase as CSV, save it in the Postman environment, and later run the CSV through the Collection Runner to send the data?
How do I automate the Collection Runner so that it grabs the CSV from one API and sends the results to another API at a specific time every day, like a cron job?
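For the scheduling question, one common approach (my assumption, not a built-in Postman feature) is to export the collection and environment and drive them with newman, Postman's CLI runner, from a cron job. A minimal sketch, where `collection.json`, `env.json`, and `data.csv` are placeholder names for your own exports:

```shell
# Install newman once:
#   npm install -g newman

# Run the collection with an environment and CSV iteration data.
# -e: environment file, -d: iteration data (one run per CSV row).
newman run collection.json -e env.json -d data.csv

# Crontab entry (edit with `crontab -e`) to run it every day at 06:00:
# 0 6 * * * /usr/local/bin/newman run /path/to/collection.json -e /path/to/env.json -d /path/to/data.csv >> /var/log/newman.log 2>&1
```

Postman also offers hosted Monitors for scheduled collection runs, though those come with their own data and file limitations.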
I’ve already tried:
The test script I created only works when I chain the responseBody as a JSON object, or as specific JSON object values in an array, which I save as environment variables and later send to another API. But the JSON returned by the original source API contains an array of up to 2000 JSON objects, and I can't keep adding variables one by one until I reach the last value, right? I wish there were a simpler way to achieve this.
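One simpler pattern (a sketch based on my reading of the problem, not an official Postman recipe): instead of creating one environment variable per value, stringify the whole array into a single variable and parse it back in the next request. In Postman's Tests tab you would use `pm.response.json()` and `pm.environment.set()`; here those calls are replaced with a tiny stand-in environment so the logic is runnable outside Postman. The `data` field and `metabase_rows` / `metabase_csv` variable names are hypothetical:

```javascript
// Stand-in for pm.environment so this sketch runs outside Postman.
const env = new Map();
const set = (k, v) => env.set(k, v);   // pm.environment.set(k, v)
const get = (k) => env.get(k);         // pm.environment.get(k)

// Pretend this came from pm.response.json(): an array of 2000 objects.
const response = {
  data: Array.from({ length: 2000 }, (_, i) => ({ id: i, value: `row-${i}` })),
};

// Store the WHOLE array as one stringified variable --
// no need to enumerate 2000 keys by hand.
set("metabase_rows", JSON.stringify(response.data));

// Optionally build a CSV string in the same pass (question 2):
set(
  "metabase_csv",
  ["id,value", ...response.data.map((r) => `${r.id},${r.value}`)].join("\n")
);

// In the next request's pre-request script, parse the array back:
const rows = JSON.parse(get("metabase_rows"));
console.log(rows.length);      // 2000
console.log(rows[1999].value); // row-1999
```

The next request can then reference the parsed values directly, or you can feed the CSV variable's contents to the Collection Runner as iteration data. Note that Postman environment variables are strings with a size ceiling, so very large payloads may still need to be split or written to a file via newman instead.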