Redsauce's software and cybersecurity blog

Postman CLI in CI/CD: Automate and Scale Your Tests

Posted by Daniel Balletbó; Isabel Arnaiz


After covering the basics of the tool in Postman: a quick guide for beginners, diving into variables, environments, and collaboration in Postman: advanced guide with environments, tests and mock servers, and exploring CI/CD integration through Newman: Postman’s CLI, this article takes one more step forward in optimizing your Postman workflows.


You’ll find practical examples, advanced configurations, and expert tips to make the most of the Postman tool suite—helping your team automate API testing, collaborate efficiently, and monitor it all with ease.

Postman CLI

Postman CLI is a secure command-line companion for Postman. It is verified and backed by Postman, and allows you to:

  • Run a collection by its ID or path

  • Send execution results to Postman by default

  • Log in and log out via the CLI

  • Validate API definitions against API security rules

Installing Postman CLI

Postman CLI has the same operating system requirements as the standard Postman desktop app.


The installation commands vary by operating system:


Windows:

powershell.exe -NoProfile -InputFormat None -ExecutionPolicy AllSigned -Command "[System.Net.ServicePointManager]::SecurityProtocol = 3072; iex ((New-Object System.Net.WebClient).DownloadString('https://dl-cli.pstmn.io/install/win64.ps1'))"

Linux:

curl -o- "https://dl-cli.pstmn.io/install/linux64.sh" | sh

Basic Postman CLI Commands

Here are three essential Postman CLI commands. You'll need your Postman API key and your collection ID. To get them, follow the steps outlined in our previous article, Automate your tests with Newman: Postman's CLI, skipping anything specific to Newman.


Login / Logout

postman login --with-api-key ABCD-1234-1234-1234-1234-1234

postman logout

Run collections

postman collection run 12345678-12345ab-1234-1ab2-1ab2-ab1234112a12
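Postman CLI accepts additional run options similar to Newman's. As a sketch (the UIDs below are placeholders, and the exact flags may vary by CLI version, so check `postman collection run --help`):

```shell
# Run the collection against a specific environment (-e) for two iterations (-i)
postman collection run 12345678-12345ab-1234-1ab2-1ab2-ab1234112a12 \
  -e "<environment_uid>" \
  -i 2
```

By default, the results of the run are also sent to your Postman workspace, where you can review them later.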

Running Postman collections in a GitLab CI pipeline

Postman CLI can be used to execute collections manually or in automated pipelines. Here's how to run them automatically with GitLab CI:

Steps:

  • Make sure you have an API key. See how to create one

  • Get the collection_uid for the collection you want to run. Instructions here (ignore anything specific to Newman).

  • Get the environment_uid for the environment linked to the collection. Instructions here

  • Open Postman, select the collection you want to run, and click "Run".

  • Inside the run options you will see a section to generate what you need for the .gitlab-ci.yml:


    • Select the appropriate collection and environment.

    • Select the CI configuration you want; in this case it will be GitLab, running on Linux.


  • Once the above steps are complete, Postman will show the snippet with the details you need to add to the .gitlab-ci.yml file.
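As a reference, the snippet Postman generates looks roughly like the following. This is a sketch, assuming a masked CI/CD variable named POSTMAN_API_KEY (a name you choose yourself) and placeholder UIDs:

```yaml
stages:
  - test

postman-collection-run:
  stage: test
  image: ubuntu:22.04            # any image supported by Postman CLI (Alpine is not)
  script:
    # Install Postman CLI inside the job
    - apt-get update && apt-get install -y curl
    - curl -o- "https://dl-cli.pstmn.io/install/linux64.sh" | sh
    # Authenticate with the masked CI/CD variable, then run the collection
    - postman login --with-api-key "$POSTMAN_API_KEY"
    - postman collection run "<collection_uid>" -e "<environment_uid>"
```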



Notes:

  • The generated file will vary depending on your needs.

  • By default Postman uses an image that includes several options to facilitate ongoing interactions, but it is a very heavy image and may contain things you don't really need. It is always a good idea to look for an image that best suits your needs and to make sure Postman CLI supports it (e.g. Alpine is not supported).

  • The installation and run commands will change depending on the image you use.

  • You will need to create an environment variable in the GitLab project configuration to hold the API key, so it is never hardcoded in the repository.

  • Requests are read differently depending on which method you use to run the collection. The method shown here uses the Postman API with the corresponding UIDs; the alternative is to use the JSON files exported by Postman. The difference becomes apparent, for example, when a request passes a :something parameter in the URL.
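Side by side, the two ways of running the same collection look like this (UIDs and file names are placeholders):

```shell
# Method 1: fetch the collection and environment from the Postman API by UID
postman collection run "<collection_uid>" -e "<environment_uid>"

# Method 2: run the JSON files previously exported from Postman
postman collection run collection.json -e environment.json
```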

Monitoring

Postman offers a monitoring feature to schedule and track your collection’s executions:



Jenkins

Scheduling executions from Jenkins is the final step for effective automation. We must install Newman either directly on Jenkins or on the agent that will execute the tests. Additionally, we’ll need the Email Extension plugin and the Log Parser plugin. The latter must be configured in Jenkins settings as shown below:




In the description, we indicate what will be parsed, for example, “Error parsing”, and in the Parsing Rules File field, we provide the name of a file that must be in Jenkins’s root directory with the rules. This file contains the following:


error /AssertionError/
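The rule simply tags any console line that matches the AssertionError pattern as an error. You can preview what it will flag with a quick grep over a sample log:

```shell
# Simulate the Log Parser rule: count lines containing "AssertionError"
cat > sample-console.log <<'EOF'
newman run completed
AssertionError: expected response code to equal 200
Done.
EOF

grep -c "AssertionError" sample-console.log   # prints 1: one line would be tagged
```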


With this configuration, we create a new job with the following information:

  • Project name: SWPostman (for example)

  • Where it will be executed: testing (for example)

  • Source code: Git, adding the repository path where the code is stored

  • Build → Add build step → Execute Shell and we write the same series of commands used to run Newman locally: newman run -e swenvironment.json swcollection.json

  • Post-build Actions → Add post-build action → Console Output (build log) parsing

    • Select Use global rule, and in Select Parsing Rules, select “Error Parsing”, which is the description we gave to the parsing rule.

  • Post-build Actions → Add post-build action → Editable Email Notification, and fill in the following:

    • Project Recipient List: Email addresses that should receive the email.

    • Advanced Settings → Add Trigger → Script – After Build

      • Trigger Script → Groovy Script:

        // It will only send an email if AssertionError is found in the logs
        build.logFile.text.readLines().any { it =~ /.*AssertionError.*/ }

Save and run with "Build now".


The console output will show practically the same thing we saw when running locally, plus additional lines indicating part of the job process.


When executing a job that intentionally contains an error, an email is also received with a brief summary of the execution and a link to access that execution in Jenkins.


From the execution summary, we can click on the “Parsed Console Output” link and click on the generated error. This allows us to jump directly to the part of the log where the error appears:



JenkinsFile

pipeline {
    agent {
        label 'testing'
    }
    stages {
        stage('Test') {
            steps {
                checkout scm
                echo "Executing tests..."
                sh "newman run -e swenvironment.json swcollection.json -r htmlextra"
                echo "Test execution completed."
            }
        }
    }
}

Newman Reports

Newman allows us to generate an HTML report with a breakdown of the requests and their results:




To generate this file, we must first install htmlextra:

npm install -g newman-reporter-htmlextra


Once installed, to generate the HTML file with the results, we run:

newman run -e swenvironment.json swcollection.json -r htmlextra


This command will create a folder named newman, where the resulting files will be stored:

Install Plugin to Add HTML Format in Jenkins

In addition to generating reports locally, we can also make Jenkins generate an HTML document with the information about the requests and their results. To do so, we must first install the HTML Publisher plugin.


First, in Jenkins, go to Manage Jenkins > Manage Plugins and search for the plugin to install it.







In Jenkins Job:


In JenkinsFile:

post {
    always {
        publishHTML(target: [
            allowMissing: false,
            alwaysLinkToLastBuild: false,
            keepAll: true,
            reportDir: 'newman/',
            reportFiles: 'SW2*.html',
            reportName: "NewmanReport"
        ])
    }
}

Viewing the file from Jenkins:

If we view the generated .html file from within Jenkins, we will see that it appears without .css. To fix this, we need to do the following:

  • Dashboard -> Manage Jenkins -> Manage Nodes -> Click the gear icon on the node (on the right) -> Script Console

  • Once in the console, enter the following command: System.setProperty("hudson.model.DirectoryBrowserSupport.CSP", "")

  • Then click Run.

Once this is done, all future executions will display the .css correctly.

Environment Variables:

In the case where we want to execute Newman by accessing the collection through the Postman API, we will need to provide an API key (explained here). If we hardcode this into Jenkins, it becomes a security risk, since anyone with access to Jenkins would be able to see our API key. To fix this, we will store the key value as an environment variable.


To do so, go to:

Dashboard -> Manage Jenkins -> Manage Credentials -> In the Store, choose Jenkins (depending on the scope you want) -> Global credentials (unrestricted) -> Add Credentials

Once there, you will see several options under Kind. If what we want is to store a KEY value, we’ll select “Secret text”.


In the “Secret” field, enter the value of the API KEY, and in the “ID” field, give the variable a name so we can reference it later. Optionally, you can add a description for the variable.


Once created, go to the job where you want to use it. There, under Build Environment, select “Use secret text(s) or file(s)”. Specify the name that the variable should have in that job, and map it to the global credential you created. In this case, the name of our variable in the job will be APIKEY, so in the execution command, we’ll access its value using ${APIKEY}.
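For example, a build step that fetches the collection through the Postman API could look like the following. The UID is a placeholder, and the URL follows the Postman API convention of passing the key as an apikey query parameter:

```shell
# The collection is fetched from the Postman API; ${APIKEY} is injected by Jenkins
newman run "https://api.getpostman.com/collections/<collection_uid>?apikey=${APIKEY}" \
  -e swenvironment.json
```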


The interesting part is that the value of the APIKEY will not be displayed at any point. In fact, if we execute the job and check the console output, we’ll see that instead of the actual value of the APIKEY, we get ****:




Now that you know all this, you'll be able to go beyond simple test execution, integrating it seamlessly into your development cycle and ensuring that both your team and your APIs reach their full potential. The combination of the previous guides with the techniques described in this article will help you consolidate your quality and test coverage strategy.


If you’ve read the full guide, thank you for making it this far. If you’re still hungry for more, you can keep reading our free ebook with many more tips on QA and test automation.


Download free QA ebook

About us

You have reached the blog of Redsauce, a team of experts in QA and software development. Here we will talk about agile testing, automation, programming, cybersecurity… Welcome!