Performance test ignoring some tests

I am using the Collection Runner to execute a performance test with 10 virtual users, using a fixed load profile and running for 1 minute. The JSON data file contains an array of 10 objects; each one is identical apart from slight variations in a couple of data items.
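For reference, the data file is an array of ten objects along these lines (the field names here are just placeholders, not the real ones):

[
  { "Forename": "Test1", "Surname": "User1", "Email": "test1@example.com" },
  { "Forename": "Test2", "Surname": "User2", "Email": "test2@example.com" },
  { "Forename": "Test3", "Surname": "User3", "Email": "test3@example.com" }
]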

The test runs successfully but only ever sends requests for 8 out of the 10 users. It is consistently ignoring tests 7 and 8 from the file.

If I take the data for those test cases and make a single call to the API, they work successfully.

When I was testing this last month (before I hit the 25 calls per month limit) it was successfully executing all 10 tests.

Is there anything I can check to find out why it never sends API calls for these two particular tests?

Hey @martinperrie01 :wave:

Are there scripts in the Collection that might be impacting the workflow here? Does the Collection have any unsaved changes in the request that might be getting picked up by the runner?

Have you tried creating a Monitor and manually running this to see if the same behaviour is seen there?

Hi @danny-dainton :wave:
Thanks for your reply. There are no scripts in the collection, nor any unsaved changes. I haven’t used a Monitor before, but I’ll investigate that and see what happens. Something new to learn!!

I have looked into Monitors - but realise that they run on the Postman Cloud, so the endpoint needs to be publicly available. That’s not currently the case with the API I’m testing.

I ran some more performance tests with Collection Runner and tried re-ordering the tests in the JSON data file to see if that made any difference.

In the original file tests 7 and 8 were being ignored. I moved tests 9 and 10 above 7 and 8. Now 9 and 10 are being ignored and 7 and 8 are running successfully.

No idea what that tells me :sweat_smile: Maybe I’ll re-type the whole data file and see what happens then.

I don’t know what logic you may have in the Collection that could be impacting the running order.

What scripts does the Collection have?

Do any of them use pm.execution.setNextRequest() or pm.execution.skipRequest()?

A completely blank Collection wouldn’t jump to or skip over specific requests unless you tell it to, so there must be something in the Collection that’s changing the order when it’s executed.
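For example, a pre-request script like this (using a hypothetical "skip" flag in the data file, just to illustrate) would quietly drop an iteration’s request:

// Pre-request script: skip this request when the current data row sets a flag
if (pm.iterationData.get("skip") === true) {
    pm.execution.skipRequest();
}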

The collection is really simple - just one POST request in it. No pre-request scripts. I’ve got the following post-response script, just checking the data returned from the endpoint:

pm.test("Status code is 200", () => {
  pm.expect(pm.response.code).to.eql(200);
});
pm.test("PersonCount returned", function () {
    var jsonData = pm.response.json();
    pm.expect(jsonData.PersonCount).to.exist;
});
pm.test("New user", function () {
    var jsonData = pm.response.json();
    pm.expect(jsonData.Status).to.equal('N');
});

Out of interest I tried running with 12 virtual users. This time 4 requests were missed and it only ran 8 out of the 12.

I’m on the free plan here … could that be the issue? It says that the performance test data file is a feature trial, available for a limited time with full access. I can’t see any mention of restrictions such as the number of virtual users.

@martinperrie01

The test runs successfully but only ever sends requests for 8 out of the 10 users

When you run a test with 10 VUs, we do run that many. There is no clear way to tell how many VUs are being run, at least if a data file is not being used.

In your case, you have a data file with 10 rows, and you have noticed that data rows #7 and #8 are not being used at all. Can you please provide me with some info on how you arrived at that?
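For example, one way to check which data row each iteration actually picks up would be to log it from the post-response script (say, in a normal functional Collection Runner run with the same data file) and inspect the console:

// Post-response script: log the iteration number and the data row it used
console.log("Iteration", pm.info.iteration, "used data:", JSON.stringify(pm.iterationData.toObject()));
console.log("Response code:", pm.response.code);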

Also, let me know which "VU data mapping" setting you have been using when the issue occurs - Ordered or Randomized?

This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.