Jenkins Pipelines
Jenkins is the most widely deployed open-source automation server in the world. It has been a cornerstone of CI/CD since long before GitHub Actions existed, and it remains the platform of choice for many enterprise teams because of its flexibility, self-hosted nature, and vast plugin ecosystem. If you work in a large organization, there is a good chance you will encounter Jenkins at some point in your career.
This lesson covers the fundamentals of Jenkins pipelines — how to define your build, test, and quality gate stages as code — along with advanced topics like parallel execution, distributed builds, and the plugins that make Jenkins a powerful quality engineering platform.
Pipeline as Code: The Jenkinsfile
A Jenkinsfile is a text file that defines your CI/CD pipeline and is checked into your repository alongside your application code. This is the "pipeline as code" approach, and it brings the same benefits as infrastructure as code: version control, code review, history, and reproducibility.
Jenkins supports two pipeline syntaxes:
- Declarative Pipeline: A structured, opinionated syntax that is easier to read and write. It enforces a specific structure with predefined sections like `pipeline`, `agent`, `stages`, and `post`. This is the recommended approach for most teams.
- Scripted Pipeline: A flexible, Groovy-based syntax that gives you full programming power. It uses `node` and `stage` blocks but allows arbitrary Groovy code. Use this when you need logic that the declarative syntax cannot express.
Declarative Pipeline Structure
Here is a complete Declarative pipeline that builds, tests, runs quality checks, and deploys:
```groovy
pipeline {
    agent any

    environment {
        NODE_ENV = 'test'
    }

    stages {
        stage('Build') {
            steps {
                sh 'npm ci'
                sh 'npm run build'
            }
        }

        stage('Test') {
            steps {
                sh 'npm test -- --coverage'
            }
            post {
                always {
                    junit 'test-results/*.xml'
                    publishHTML([
                        reportDir: 'coverage/lcov-report',
                        reportFiles: 'index.html',
                        reportName: 'Coverage Report'
                    ])
                }
            }
        }

        stage('Quality Gate') {
            parallel {
                stage('Lint') {
                    steps {
                        sh 'npm run lint'
                    }
                }
                stage('Security Scan') {
                    steps {
                        sh 'npm audit --audit-level=high'
                    }
                }
                stage('Accessibility') {
                    steps {
                        sh 'npm start &'
                        sh 'npx wait-on http://localhost:3000'
                        sh 'npx pa11y-ci'
                    }
                }
            }
        }

        stage('Deploy') {
            when {
                branch 'main'
            }
            steps {
                sh './deploy.sh'
            }
        }
    }

    post {
        failure {
            mail to: 'team@example.com',
                 subject: "Pipeline Failed: ${env.JOB_NAME}",
                 body: "Check: ${env.BUILD_URL}"
        }
    }
}
```
Let us walk through the key sections:
- `agent any`: Tells Jenkins to run the pipeline on any available agent (build machine). You can also specify a specific agent label, a Docker image, or a Kubernetes pod.
- `environment`: Defines environment variables available to all stages.
- `stages`: The ordered list of stages in your pipeline. Each stage has a name and a set of steps.
- `parallel`: Runs multiple stages simultaneously, reducing total pipeline time.
- `when`: Conditional execution. The Deploy stage only runs on the `main` branch.
- `post`: Actions that run after stages complete. The `always` block runs regardless of success or failure; the `failure` block runs only when the pipeline fails.
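The `when` directive supports more than branch matching. As a sketch (the stage name, deploy script, and commit-message convention here are illustrative, not from the pipeline above), conditions can be combined with `allOf`, negated with `not`, or computed with `expression`:

```groovy
stage('Deploy to Staging') {
    when {
        allOf {
            branch 'main'
            // skip deployment when the commit message opts out
            not { changelog '.*\\[skip deploy\\].*' }
        }
    }
    steps {
        sh './deploy.sh staging'
    }
}
```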
Scripted Pipeline
The Scripted syntax gives you full Groovy programming capabilities. Here is the same pipeline in Scripted form:
```groovy
node {
    stage('Build') {
        checkout scm
        sh 'npm ci'
        sh 'npm run build'
    }

    stage('Test') {
        try {
            sh 'npm test -- --coverage'
        } finally {
            junit 'test-results/*.xml'
        }
    }

    stage('Quality Gate') {
        parallel(
            lint: { sh 'npm run lint' },
            security: { sh 'npm audit --audit-level=high' },
            accessibility: {
                sh 'npm start &'
                sh 'npx wait-on http://localhost:3000'
                sh 'npx pa11y-ci'
            }
        )
    }

    if (env.BRANCH_NAME == 'main') {
        stage('Deploy') {
            sh './deploy.sh'
        }
    }
}
```
The Scripted syntax is more flexible — you can use `if` statements, loops, `try`/`catch`/`finally`, and any Groovy construct. However, this flexibility can make pipelines harder to read and maintain. The Declarative syntax is almost always sufficient for quality engineering workflows.
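One common case where Scripted flexibility pays off is building parallel branches dynamically from a list. A sketch (the browser list and Playwright command are illustrative):

```groovy
node {
    def browsers = ['chromium', 'firefox', 'webkit']
    def branches = [:]
    for (b in browsers) {
        def browser = b  // capture the loop variable so each closure sees its own value
        branches[browser] = {
            sh "npx playwright test --project=${browser}"
        }
    }
    parallel branches  // run one branch per browser
}
```

Declarative pipelines require each parallel stage to be written out by hand; here the branch map is computed at runtime.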
Parallel Stages for Faster Builds
Quality engineering pipelines can become slow if every check runs sequentially. A linting step, a test suite, an accessibility scan, and a security audit might each take 2-5 minutes. Running them one after another means 8-20 minutes per pipeline run.
Parallel stages solve this by running independent checks simultaneously:
```groovy
stage('Quality Checks') {
    parallel {
        stage('Unit Tests') {
            steps {
                sh 'npm test'
            }
        }
        stage('Lint') {
            steps {
                sh 'npm run lint'
            }
        }
        stage('Security Scan') {
            steps {
                sh 'npm audit --audit-level=high'
            }
        }
        stage('Accessibility') {
            agent { docker { image 'node:20' } }
            steps {
                sh 'npm ci'
                sh 'npm start &'
                sh 'npx wait-on http://localhost:3000'
                sh 'npx pa11y-ci'
            }
        }
    }
}
```
With parallelism, your total pipeline time is determined by the slowest individual check, not the sum of all checks. If your slowest check takes 5 minutes, the entire quality gate stage completes in approximately 5 minutes regardless of how many parallel checks you add.
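By default, sibling branches keep running even after one fails. If you would rather get fast feedback and free up agents, Declarative pipelines support `failFast` on the stage that contains the parallel block:

```groovy
stage('Quality Checks') {
    failFast true  // abort the remaining parallel branches as soon as one fails
    parallel {
        stage('Unit Tests') {
            steps { sh 'npm test' }
        }
        stage('Lint') {
            steps { sh 'npm run lint' }
        }
    }
}
```

Whether to fail fast is a trade-off: aborting early saves agent time, but letting all branches finish shows developers every failure in a single run.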
Jenkins Agents and Distributed Builds
Jenkins uses a controller-agent architecture. The controller (formerly called "master") manages the pipeline logic, UI, and scheduling. Agents (formerly "slaves") are the machines that actually execute build steps. This architecture lets you:
- Scale horizontally: Add more agents to handle more concurrent builds.
- Use specialized agents: Run Windows tests on Windows agents, macOS tests on macOS agents, and Linux tests on Linux agents.
- Isolate builds: Each build runs on a fresh agent (or in a fresh container), preventing contamination between builds.
- Use Docker agents: Run each stage in a Docker container, ensuring consistent environments without manual agent configuration.
```groovy
pipeline {
    agent none // No default agent
    stages {
        stage('Test on Linux') {
            agent { label 'linux' }
            steps {
                sh 'npm ci && npm test'
            }
        }
        stage('Test on Windows') {
            agent { label 'windows' }
            steps {
                bat 'npm ci && npm test'
            }
        }
        stage('Build Docker Image') {
            agent { docker { image 'docker:latest' } }
            steps {
                sh 'docker build -t myapp .'
            }
        }
    }
}
```
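Stages that run on different agents do not share a workspace. When a later stage needs files produced earlier (a compiled bundle, for example), pass them along with `stash` and `unstash`. A minimal sketch, assuming the build writes to a `dist/` directory:

```groovy
pipeline {
    agent none
    stages {
        stage('Build') {
            agent { label 'linux' }
            steps {
                sh 'npm ci && npm run build'
                stash name: 'dist', includes: 'dist/**'  // save build output on the controller
            }
        }
        stage('Test on Windows') {
            agent { label 'windows' }
            steps {
                unstash 'dist'  // restore the build output into this agent's workspace
                bat 'npm ci && npm test'
            }
        }
    }
}
```

Stashes are intended for small-to-medium file sets within a single run; for large artifacts or cross-build reuse, `archiveArtifacts` or an external artifact store is a better fit.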
The Jenkins Plugin Ecosystem
Jenkins has over 1,800 plugins, and many of them are directly relevant to quality engineering. Here are the most important ones:
- SonarQube Scanner: Integrates SonarQube static analysis into your pipeline. Reports code smells, bugs, vulnerabilities, and coverage to the SonarQube dashboard. Supports quality gate pass/fail decisions.
- OWASP Dependency-Check: Scans project dependencies against the National Vulnerability Database (NVD). Produces detailed reports of known vulnerabilities in your dependency tree.
- JUnit: Publishes test results in JUnit XML format. Provides trend charts showing test pass/fail rates over time.
- HTML Publisher: Publishes HTML reports (coverage, accessibility, Lighthouse) directly in the Jenkins UI.
- Warnings Next Generation: Aggregates warnings from dozens of tools (ESLint, StyleLint, PyLint, and more) into a unified dashboard with trend charts.
- Pipeline Utility Steps: Provides helper steps for common pipeline operations like reading files, writing files, and manipulating data.
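To make this concrete, here is a hedged sketch of the SonarQube Scanner and Warnings Next Generation plugins in a pipeline. It assumes a SonarQube server registered in Jenkins under the name `SonarQube`, a webhook from the server back to Jenkins, and an ESLint report at `eslint-report.xml` — all of these are configuration choices, not defaults:

```groovy
stage('Static Analysis') {
    steps {
        withSonarQubeEnv('SonarQube') {  // injects the server URL and auth token
            sh 'npx sonar-scanner'
        }
        // Warnings Next Generation: publish ESLint findings with trend charts
        recordIssues tools: [esLint(pattern: 'eslint-report.xml')]
    }
}
stage('Quality Gate') {
    steps {
        timeout(time: 10, unit: 'MINUTES') {
            // wait for SonarQube's webhook; fail the build if the gate fails
            waitForQualityGate abortPipeline: true
        }
    }
}
```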
Blue Ocean UI
Jenkins' classic UI is functional but dated. Blue Ocean is a modern UI that provides a visual pipeline editor, a clearer view of pipeline stages, and better log readability. It is particularly useful for quality engineering because it makes it easy to see which quality check failed, at which stage, with clear visual indicators.
Blue Ocean is installed as a plugin and runs alongside the classic UI. You can switch between them at any time. While Blue Ocean's development has slowed in recent years, it remains a significant improvement for day-to-day pipeline monitoring.
Shared Libraries
As your organization grows, you will want consistent quality checks across all repositories. Jenkins shared libraries let you extract common pipeline logic into a separate repository that all Jenkinsfiles can import:
```groovy
// In the shared library: vars/qualityGate.groovy
def call(Map config = [:]) {
    parallel(
        lint: { sh 'npm run lint' },
        test: { sh "npm test -- --coverage --threshold=${config.coverageThreshold ?: 80}" },
        security: { sh 'npm audit --audit-level=high' }
    )
}
```

```groovy
// In a project's Jenkinsfile:
@Library('quality-pipeline') _

pipeline {
    agent any
    stages {
        stage('Build') {
            steps { sh 'npm ci' }
        }
        stage('Quality Gate') {
            steps {
                qualityGate(coverageThreshold: 85)
            }
        }
    }
}
```
Shared libraries centralize your quality standards. When you want to add a new check (like accessibility scanning) to every project, you update the shared library once, and every project picks up the change on its next run.
Jenkins vs GitHub Actions
Both platforms are capable CI/CD solutions, but they have different strengths:
- Hosting: GitHub Actions is a managed service — GitHub runs and maintains the infrastructure. Jenkins is self-hosted — you manage the servers, updates, and security.
- Configuration: GitHub Actions uses YAML. Jenkins uses Groovy-based Jenkinsfiles. YAML is simpler; Groovy is more powerful.
- Integration: GitHub Actions is deeply integrated with GitHub (PRs, issues, branch protection). Jenkins integrates with any Git host but requires more configuration.
- Cost: GitHub Actions is free for public repositories and has generous free tiers for private repos. Jenkins is free software, but you pay for the infrastructure to run it.
- Plugins: Jenkins has 1,800+ plugins. GitHub Actions has a marketplace with thousands of reusable actions. Both ecosystems are rich.
- Control: Jenkins gives you full control over the build environment. GitHub Actions runners are managed by GitHub (unless you set up self-hosted runners).
For teams using GitHub as their primary platform, GitHub Actions is usually the simpler choice. For enterprise teams that need full control over their build infrastructure, or teams that are not on GitHub, Jenkins remains an excellent option.
Resources
- Jenkins Documentation — Official documentation covering installation, pipeline syntax, plugins, and administration
- Jenkins Pipeline Syntax — Complete reference for Declarative and Scripted pipeline syntax