I have a request that returns an array in the body. If the body contains an array of 100 records, then I have hit the request limit and must call the request again, incrementing a page param with each request.
For example, if I have 250 total records to retrieve, then I would need to make the following request calls.
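Something like this (the /records path is just a stand-in for my real endpoint, and I'm assuming the page param starts at 0):

```
GET /records?page=0   -> records 1-100   (full page, keep going)
GET /records?page=1   -> records 101-200 (full page, keep going)
GET /records?page=2   -> records 201-250 (under 100, done)
```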
I have watched the pagination tutorial (https://youtu.be/6VLquc7A56Y?si=V6nn6x-Y3KoyQHST). However, my flow only runs the initial request. I can’t get the request to trigger again with the new value for page.
I believe the issue is that your client_id, created_from, and created_to values are only being passed into your block once. You can make the values inline on the Send Request block instead (change the type from select to string on those variables), and this should work for one additional iteration. Your page variable also isn't being incremented.
@flows-daniel Your recommendation fixed the issue for calling the request iteratively. Can’t thank you enough.
Now, do you have a good recommendation on how I can maintain a running counter that I increment each time my results come back with 100 records? That is my indicator that there are more records to pull, but I have to include the page param to grab the next set of records. From my reading, it does not seem possible to create a variable and then change its value.
Use an If block to first check whether the count is equal to 100. If true, you can pass a page value of 0 into the Evaluate block, which then increments it by one and passes that value back into itself for the next loop. That value can also be passed to your Send Request block.
Currently, I believe your logic will just call the request with a page of 0 when the count is less than 100, which probably isn't what you want to do.
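In plain code terms, the blocks are implementing roughly this loop (a Python sketch, not Flows syntax; the endpoint URL is a placeholder, and the 100-record page size comes from your description):

```python
import requests

BASE_URL = "https://api.example.com/records"  # placeholder for your real endpoint
PAGE_SIZE = 100                               # the API returns at most 100 records per call

page = 0
while True:
    resp = requests.get(BASE_URL, params={"page": page})
    records = resp.json()             # body is an array of records
    # ... handle this page of records ...
    if len(records) < PAGE_SIZE:      # fewer than 100 records: last page, stop
        break
    page += 1                         # full page: increment and request the next one
```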
Now I have the request conditionally repeating based on the results of the previous response. Each request returns an array of data. I need to collect the results of all requests and then send them to a For loop. Currently, my flow only collects the results of each request, and then the next request overwrites the previous values. How can I concatenate the arrays and persist them until all requests have been completed?
You can use similar logic: add an Evaluate block before the If block to accumulate the responses, and then your "else" condition becomes the final list once you've iterated through the API.
So here you're starting with a blank list, appending the first set of results, sending it through the If block, and then passing that back as the list to append to until the final else condition is met.
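Outside of Flows, the accumulation is doing roughly this (again a Python sketch with a placeholder endpoint, not Flows syntax):

```python
import requests

BASE_URL = "https://api.example.com/records"  # placeholder endpoint again
PAGE_SIZE = 100

all_records = []                      # the "blank list" you start with
page = 0
while True:
    records = requests.get(BASE_URL, params={"page": page}).json()
    all_records += records            # append this page's results to the running list
    if len(records) < PAGE_SIZE:      # last page reached: all_records is the final list
        break
    page += 1

# all_records now holds every page's results and can be sent on to the For loop
```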
Can't thank you enough. Your solutions all worked and helped get me to my end goal. You are definitely a Postman Flows genius.
I have one final question. I would like to share this flow with a coworker, but it currently resides in my personal workspace. Is there a way to export and import this Flow? I attempted to use the Share link provided, but that created a workspace that gave them access to all my Collections, Environments, etc.
I did create a new workspace to share Collections, Environments, etc. with this coworker, but I can't find a way to copy or export the Flow from my personal workspace over to this new shared workspace.
The two ways to share the flow are to put it in a workspace that you make public (anyone can fork it), or to move/fork the flow within your own team (your co-worker has to be on the same team as you).
Flows does not currently have an export feature.
If you're on the same team for the new shared workspace, you can fork the flow into that workspace. Note that the Send Request blocks will still point to the collections in your private workspace, so you will need to update them to point to the collection in the shared workspace.
I had attempted the fork option prior to asking you for assistance. The workspace selection did not look like a dropdown, so I never clicked in the field to see that I could fork to another workspace.
Thank you so much for all your assistance. I feel like I have a much better grasp of Flows with all the help you provided.