What's the best way to run automation using deployed flows + Newman on AWS CodeBuild

Hey everyone! I’m building a test automation framework using Postman and wanted to share a workflow challenge I’m running into with Flows + Native Git + deployment + CI/CD.

What I’m trying to achieve:

  1. Common Requests — Shared API requests that act as reusable building blocks
  2. Flows as Common Modules — Reusable flows that chain these requests together
  3. Flows as Test Cases — Test execution flows (Smoke, Regression, E2E) that consume the common module flows
  4. Deploy & Run via CLI — Deploy these flows and trigger them via Postman CLI in CI/CD pipelines (AWS CodePipeline)
  5. Git for collaboration — Everything version-controlled so multiple teams can collaborate via branches and PRs

Problem 1 — Deploy button missing in Local View:

When a workspace is connected to a Git repo (Native Git), flows in Local View can’t be deployed — the Deploy button isn’t visible. I raised this with Postman support and they confirmed the behaviour, suggesting pushing flows to Cloud View as a workaround.

But this creates a friction point: the whole value of Native Git is developing locally with Git workflows, yet deployment only works from Cloud View. So you end up maintaining two states.

Problem 2 — Collection format mismatch for CI/CD:

With a Git-connected workspace, when I create a collection, Postman v12 saves it as YAML files (Collection v3 format) in the repo. These YAML files get pushed to the remote repo and are available in my CI/CD pipeline (AWS CodePipeline with CodeBuild).

But here’s the catch — Newman only runs .json collections, not .yaml. So the collections sitting in my Git repo in v3 YAML format can’t be directly executed by Newman in the pipeline. There’s no straightforward way to go from Git-backed YAML collections to Newman execution without some conversion step.

What I’d love to see / need help with:

  • Ability to deploy flows directly from Local View (Git-connected workspaces)
  • A clear path for running Git-backed collections (YAML v3) in CI/CD — either Newman supporting YAML or a Postman CLI alternative that does
  • Newman or Postman CLI support for running deployed flows natively, so flows can be first-class citizens in CI/CD
  • A recommended end-to-end workflow for: Git-backed requests → Flows → Deploy → Run in CI/CD

Ideally, I want to be able to deploy my test flows and trigger them from my CI/CD pipeline. Is anyone else trying to use deployed flows as the execution layer for automated testing? How are you bridging the gap between Git-connected workspaces and CI/CD execution?

Would really appreciate any suggestions or workarounds!


Hey @shubhankar-bhardwaj 👋

Just trying to break this down a little. Once I have more information about the Flows part of the workflow, I or someone from the team will come back and provide an update.

"A clear path for running Git-backed collections (YAML v3) in CI/CD — either Newman supporting YAML or a Postman CLI alternative that does"

The Postman CLI is going to be the way forward here. Newman is no longer at feature parity with the platform, and a lot of the newer features are not supported there.

The YAML Collections can be run in a pipeline (I'm using GitHub Actions as an example) like this — rather than adding a single file reference, you use the folder path:

name: Run Postman Collection

on:
  push:
    branches:
      - main
  pull_request:
    branches:
      - main
  workflow_dispatch:

jobs:
  run-collection:
    runs-on: ubuntu-latest

    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Setup Node.js
        uses: actions/setup-node@v4
        with:
          node-version: '20'

      - name: Install Postman CLI
        run: |
          curl -o- "https://dl-cli.pstmn.io/install/linux64.sh" | sh

      - name: Run collection
        run: |
          postman collection run "postman/collections/Library API" \
            --mock "postman/mocks/library-api-mock.js" \
            -e "postman/environments/Library API Environment.environment.yaml"

There are a couple of install methods for the Postman CLI: this one, or installing the npm package (npm install -g postman-cli). I've also used an environment file here, but as that's a single file, you can simply reference it directly.

In the same command, you can see that I'm using a local mock server which was created in Postman; its config file lives in the same repo so it can be used in the pipeline run.

Hi @shubhankar-bhardwaj! I’m on the Flows team at Postman, so I’ll focus on the Flows-specific parts of your post.

How Flows deployment works today

Deploying a flow requires it to exist on Postman Cloud. Deployment provisions cloud infrastructure to host and run the flow, which is why the Deploy button isn’t available in Native Git mode. To deploy, flows need to be synced to Postman Cloud first.

Your workflow with the Postman CLI

You mentioned wanting to deploy and trigger flows from CI/CD, and the Postman CLI supports deploying and triggering flows today. Once your flows are synced to Postman Cloud from the app, the rest of the workflow can be driven from the CLI:

  1. List your flows — postman flows list -w <workspace-id> gives you the flow IDs in your workspace.
  2. Deploy via CLI — postman flows deploy <flow-id> deploys a flow directly from the command line.
  3. Trigger your deployed flow via your CI/CD pipeline.
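Put together, the steps above can be sketched as a short shell snippet. This is a sketch, not a definitive script: the workspace and flow IDs are placeholders, and the login line assumes you've stored a Postman API key as a CI secret.

```shell
# Authenticate non-interactively (assumes POSTMAN_API_KEY is set as a CI secret)
postman login --with-api-key "$POSTMAN_API_KEY"

# 1. Discover the flow IDs in your workspace (workspace ID is a placeholder)
postman flows list -w "<workspace-id>"

# 2. Deploy a flow by ID (flow ID is a placeholder)
postman flows deploy "<flow-id>"
```

Both IDs would come out of your own workspace; in a pipeline you'd typically capture the flow ID once and store it as a pipeline variable rather than listing on every run.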

What’s coming

postman workspace push will add support for flows, which means syncing flows from your Git repo to Cloud will become a CLI command too. At that point, the full workflow is scriptable end-to-end from your CI/CD pipeline:

workspace push → flows list → flows deploy → flows trigger

No app interaction needed.

A couple of questions for you

With the CLI commands above, does that cover the deploy + trigger workflow you’re looking for in CI/CD?

When you mentioned “maintaining two states” with Local View vs Cloud View, is the main pain point the manual sync step, or is there something else about the experience that’s creating friction?

Thanks again for the feedback, it’s really helpful. Let us know how it goes!

Hi Kyler,

Thanks for the detailed response! I’ve been trying to put this all together and I’m hitting a wall. Let me explain my exact setup and what I’m trying to achieve, because I think there’s a fundamental workflow gap that I need help understanding.

My structure:

Collections (in Git-connected workspace, YAML v3):

  • Common-Collections/ — Reusable API requests (e.g., EPG Proxy request that takes channel, date, duration params and returns EPG data)
  • Test-Executions/ — Where I want to keep my test execution logic

Flows:

  • Common-Flow-Modules/ — Reusable flow modules that use collection requests as building blocks. For example, an EPG Module flow that takes the EPG Proxy request from Common-Collections, passes in parameters, and produces a structured output that other flows can consume.
  • Tests/DP-Tests/Smoke/ — Test flows that use the Common-Flow-Modules and add validation logic on top. For example, a smoke test flow that calls the EPG Module, gets the response, and validates the data.

What I want to achieve:

I want to deploy my test flows so they can be triggered from CI/CD (AWS CodePipeline), and I want assertions/test scripts that produce pass/fail results I can use for reporting.

Where I’m completely stuck:

  1. I design my collection requests and flows in Local View (Git-connected). But I can’t deploy from Local View.

  2. So I switch to Cloud View to deploy. But in Cloud View, I can’t see my Local View folder structure — my carefully organized Common-Flow-Modules/ and Tests/ hierarchy isn’t there unless I manually sync everything.

  3. Once I deploy a flow, it gets a URL endpoint. Now what? To run this in CI/CD with assertions, do I need to:

    • Create a new collection request that calls the deployed flow’s URL?
    • Write post-response scripts (pm.test assertions) in that new request?
    • Run that collection via Newman or Postman CLI?

    So essentially: Collection Request → hits deployed Flow URL → Flow executes (calls EPG API, processes data) → returns response → post-response script in the calling request validates the response?

  4. If that’s the pattern, where do I keep these “test runner” requests that call deployed flows? In a separate collection? Inside Test-Executions? Do they live in Git or only in Cloud View?

  5. And here’s the reporting problem — even if I get the above working, Postman CLI doesn’t support HTML/JUnit/JSON reporters for YAML v3 collections (I get Error: Reporter "html", "junit" is not supported for multi-protocol collections). So I have no way to generate test reports in CI/CD.

What I really need clarity on:

What is the recommended end-to-end workflow for someone who wants to:

  • Design requests and flows in a Git-connected workspace (Local View)
  • Deploy flows to the cloud
  • Trigger those flows from CI/CD
  • Get pass/fail assertions and generate test reports

Right now it feels like each piece exists but they don’t connect into a coherent workflow. I’d really appreciate a step-by-step of how Postman envisions this working, because I want to make this my team’s primary automation framework and I need to understand the right pattern.

Thanks again for the support — it’s clear the tooling is evolving fast, I just need help understanding the intended workflow so I’m building on the right foundation.

I tried the steps you explained and hit another blocker; for more context I'm attaching a screenshot -

I have one more question -

  1. When should I push a newly added flow to the cloud — after making changes to flows/adding a new flow in a feature branch of my Git repo and merging that into master?

Hey @shubhankar-bhardwaj, thanks for the detailed write-up. This is incredibly helpful feedback.

Let me address where you’re stuck and what’s available today:

Why Cloud View doesn’t show your local folder structure (Step 2)

The key piece here is that you need to push your local changes to Postman Cloud before they’ll be visible in Cloud View. When you’re working in Local View/Native Git mode, your Flows and any supporting Collections live locally until you explicitly push them. That’s why your Common-Flow-Modules/ and Tests/ hierarchy isn’t showing up. It’s not a sync issue; it’s that the push step is required to get those assets into Postman Cloud, where deployment happens.

So the workflow today is: make changes locally, push to cloud, then deploy from the app.

Triggering deployed flows in CI/CD (Step 3)

You don’t need to create a separate collection request to call your deployed flow’s URL. The Postman CLI now supports triggering flows directly:

postman flows trigger <flowid>

You can use postman flows list -w to discover your flow IDs. That said, creating a collection request that hits the flow’s endpoint is also a valid approach if you prefer that pattern for organizing your test assertions.
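If you do go the collection-request route, the post-response script on that request is where the assertions would live. A minimal sketch is below — note this runs inside Postman's script sandbox (where `pm` is defined), not standalone, and the `programs` field is a hypothetical placeholder for whatever shape your flow actually returns:

```javascript
// Post-response script on the request that calls the deployed flow's URL.
// Field names below (e.g. "programs") are hypothetical placeholders.
pm.test("Flow responded successfully", function () {
    pm.response.to.have.status(200);
});

pm.test("Flow returned EPG data", function () {
    const body = pm.response.json();
    pm.expect(body).to.have.property("programs");
    pm.expect(body.programs).to.be.an("array").that.is.not.empty;
});
```

Each pm.test produces a named pass/fail result, which is what the CLI reporters aggregate for reporting.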

The current end-to-end workflow:

  1. Develop flows locally with Native Git
  2. Push to Postman Cloud (from the app)
  3. Deploy the flow (from the app)
  4. Use postman flows trigger in your CI/CD pipeline
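Since you're on AWS CodePipeline, step 4 could live in a CodeBuild buildspec along these lines. This is a sketch under assumptions: POSTMAN_API_KEY and FLOW_ID are environment variables you'd configure yourself in CodeBuild, and the install script URL is the Linux one mentioned earlier in this thread.

```yaml
version: 0.2

phases:
  install:
    commands:
      # Install the Postman CLI (Linux install script)
      - curl -o- "https://dl-cli.pstmn.io/install/linux64.sh" | sh
  build:
    commands:
      # Authenticate with an API key stored as a CodeBuild environment variable
      - postman login --with-api-key "$POSTMAN_API_KEY"
      # Trigger the already-deployed flow by ID
      - postman flows trigger "$FLOW_ID"
```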

When to push flows to cloud

After you’ve made your changes and merged to your production branch (whether that’s master, main, or whatever your team uses), push to cloud and deploy. The push is what bridges your local Git workflow to the Postman Cloud for deployment.

On reporters

You’re right that the Postman CLI currently doesn’t support HTML/JUnit/JSON reporters for YAML v3 collections. One option worth exploring is using the Postman CLI’s collection runner (postman collection run), which supports HTML, JUnit, and JSON reporters for HTTP collection runs:

postman collection run <collection-path> -r cli,junit --reporter-junit-export ./results.xml

This way, you get your CI/CD report output while still leveraging Flows for the automation logic.

We’d be curious to hear if that approach works for your setup.

Looking ahead

This is an active area of development for us, and your write-up captures exactly the kind of end-to-end workflow we’re working to enable. Your feedback is invaluable.


Hi Kyler,

Thanks for the detailed response. I tried the methods you suggested to -

  1. Maintain the flows folder structure.
  2. Run the deployed flows as a collection request.

About #1 -

  • I created a folder using the mkdir command from the terminal and moved the existing flows common module into it, merged the changes to master, and pushed the changes to Postman Cloud → pulled the changes, but the folder structure wasn't maintained.

About #2 -

  • The pulled flow has the option to deploy, and to run before deploying as well.
  • I deployed the flow successfully.
  • Using the flow URL I simply ran a GET request and got an error.

What I understand from the error is that somehow the environment file is not being read. I'd request the Postman team to help by replicating the issue, and if you could share a short video showing the whole workflow — selecting the environment, deploying the flow, calling it from a collection — or direct me to an existing page/video, that would be a great help.

Hi @mission-geoscientis8 ,

I tried running tests using the postman collection run option, but it only supports the cli reporter -

postman collection run ./collections/Integration-platform-automation-collection -i "Smoke-Tests" -r html,cli --reporter-junit-export ./result/test_results.xml

Error: Reporter "html" is not supported for multi-protocol collections.

Only the "cli" reporter is supported for multi-protocol collections.

postman collection run ./collections/Integration-platform-automation-collection -i "Smoke-Tests" -r cli,junit --reporter-junit-export ./result/test_results.xml

Error: Reporter "junit" is not supported for multi-protocol collections.

Only the "cli" reporter is supported for multi-protocol collections.

Could the Postman team add support for the other formats for multi-protocol collections as well?

Hi @mission-geoscientis8 ,

Sharing 4 related issues I’ve hit while using Postman Flows for test automation:

  1. Folder structure not preserved on Cloud — I have Flows organized into subfolders (Common-Modules, Tests) on my local/master branch. When pushed to Cloud, only the flat list of flows is visible; folders are lost.

  2. Deployed flows fail when modules live in subfolders — Deploying a flow on Cloud fails (error code 1, “Action did not run successfully”). I believe this is because the referenced common module was in a subfolder that Cloud doesn’t recognize.

  3. No report generation for multi-protocol collections — After flattening folders (which fixes #2), I can’t generate JUnit/HTML/JSON reports. The CLI returns: Reporter "junit" is not supported for multi-protocol collections.

  4. Deployed flows not visible when switching back to local branch — After deploying a flow on Cloud, switching back to the local branch doesn’t show the deployed flow in the collection folder. There’s no sync or visibility of Cloud-deployed flows from the local/Git branch view.

Net result: there’s currently no path that gives us both modular flow organization AND structured test reporting. Are any of these on the roadmap?