Getting ESOCKETTIMEDOUT only when running performance tests

I am testing an API with a long-lived POST request (~3 minutes). It works fine when run in isolation (Send) and when run via the collection runner in functional test mode (I can make 20 requests in a row without errors), but in performance testing mode every request fails with ESOCKETTIMEDOUT, even with only 1 fixed virtual user.

This looks like a client-side timeout issue (the server reports no error), but I have every timeout setting I can find set to 0 (infinite), and setting them to 20 minutes instead made no difference.
Why would the socket close prematurely in performance testing mode but not in functional testing mode?

It seems to me that functional test mode and performance test mode (with only 1 fixed virtual user) should be effectively the same; the only difference is that functional mode specifies a number of iterations while performance mode specifies a duration.

Am I missing something?

This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.