What is Time supposed to be?

Specifically I am talking about this:


What is that time supposed to be? Before today I thought it was the time it took to get the response back, but I just timed it and that is definitely not the case.

It is the total time taken to receive and decode the response. As such, it also includes the usual compute time that any client would take after receiving the response, before any application-level work can be performed.
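To make that concrete, here is a minimal sketch of what "stop the clock only after the body is fully received and read" means, using only the Python standard library and a throwaway local server as a stand-in for any real API (the server, URL, and 1 MB payload are all illustrative assumptions, not anything Postman-specific):

```python
import threading
import time
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Minimal local server so the sketch is self-contained (a stand-in for any API).
class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"x" * 1_000_000  # illustrative 1 MB payload
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # silence per-request logging

server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

start = time.perf_counter()
with urllib.request.urlopen(f"http://127.0.0.1:{server.server_port}/") as resp:
    body = resp.read()  # the clock keeps running until the body is fully read
elapsed = time.perf_counter() - start

print(f"received {len(body)} bytes in {elapsed * 1000:.1f} ms")
server.shutdown()
```

The point is that the measured interval ends after the whole body has been read into memory, not at the first byte of the response, so client-side receive work is inside the number.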

I am finding that it actually takes much longer for me to see the response in Postman (about 60 seconds longer). Is that extra time just the processing time Postman needs to display the data?

Can you verify how long cURL or a browser takes to show the response in comparison?

Unfortunately that’s not possible anymore. The response has changed so that much less data is returned, and the times are much lower now.

I think I should reply here to keep the thread complete.

After receiving a response from the service, HTTP clients (including Postman) have to decode the response and stitch together multiple response buffers, depending on what compression / encoding was used to return the data. That can take some time. But there’s no way to tell how much of that time is justified overhead without comparing it to cURL (with response decoding turned on). Postman will be a tad slower than cURL owing to the overhead of moving data around in a GUI.
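To give a feel for the decoding step mentioned above, here is a small sketch that times gzip decompression of a large-ish payload. The payload size and content are made-up assumptions purely for illustration; the actual overhead in Postman depends on the response size, the compression used, and the machine:

```python
import gzip
import time

# Hypothetical response body: ~12 MB of repetitive text, gzip-compressed,
# standing in for a compressed HTTP response the client must decode.
raw = b"some repetitive response body " * 400_000
compressed = gzip.compress(raw)

start = time.perf_counter()
decoded = gzip.decompress(compressed)  # the decode step every client pays for
elapsed = time.perf_counter() - start

print(f"decoded {len(decoded)} bytes in {elapsed * 1000:.1f} ms")
```

Running something like this alongside a `curl --compressed` timing of the real endpoint would show how much of the gap is decoding versus Postman's own GUI overhead.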