This is the second in a series of posts about creating reusable Azure DevOps YAML pipelines across many projects. In these posts, I'll start with simple CI/CD pipelines and progress to a complex, dynamic pipeline.
- CI/CD YAML Pipelines
- Creating a Build Pipeline Template (this post)
- Creating a Deploy Pipeline Template
- Adding "feature flags" to a pipeline
- Dynamic CI/CD Pipeline
- Azure DevOps Pipeline Tips and Tricks
📝 I assume you have read the previous post and are familiar with the basic application lifecycle concepts for containerized applications.
The YAML, sample source code, and pipelines are all in this AzDO Project.
The Problem
Now that I have a nice set of YAML CI/CD pipelines, I want to reuse them across my organization. Many of my applications are similar, and they all build Docker images.
The Solution
In this post, I'll take the build YAML from the previous post and move chunks of YAML (templates) into a separate repository. Then I can use those chunks in pipelines across my organization. With one central location, I can make changes and fixes to the shared YAML, and all the pipelines that use it get updated.
📝 Brief Lesson on Templates
Templates are chunks of YAML that can be included in other YAML files, similar to `#include` in C++. In this post I'll use the term *template* to refer to *includes* templates (I'll cover *extends* templates in the next post).

Each template contains one type of AzDO object, such as `stages`, `jobs`, `steps`, or `variables`. One of those keywords will be in each file, and it can only be used under the same keyword in the calling YAML, e.g., if the template has `steps` it can only go under `steps` in the calling YAML. Templates can have parameters, so you can make them as flexible as possible.
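As a quick illustration, here is what a minimal `steps` template might look like (the file name `steps/hello.yml` and the `message` parameter are made up for this sketch):

```yaml
# steps/hello.yml - a hypothetical steps template with one parameter
parameters:
- name: message
  type: string

steps:
- script: echo "${{ parameters.message }}"
  displayName: Say hello
```

A caller would then reference it with `- template: steps/hello.yml` under its own `steps` keyword and pass `message` under `parameters`.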
I used a template in the first post to pull in the variables for the pipeline. In that case, the template was in the same repo as the caller. In this post, the templates will be in a separate repo.
The Template Repository
For the templates, I create an `azdo-templates` repository with the following structure:

```text
.
├── jobs
├── stages
├── steps
└── variables
```
As the folder names suggest, each one will contain only templates of that type. To use templates from this repository, include the repository in the `resources` section of the calling pipeline, like the following:
```yaml
resources:
  repositories:
  - repository: templates # Arbitrary name that we use to reference this repo
    type: git
    name: azdo-templates # The name of the repo
    ref: releases/v1.0 # The branch to use
```
If I have a file `jobs/build.yml` in my repo, I can use it in a pipeline like this:
```yaml
jobs:
- template: jobs/build.yml@templates # @templates is the 'repository' name from above
```
Branching Strategy
Notice that when I reference the `azdo-templates` repository, I use a branch name of `releases/v1.0`. The `main` branch of the template repo is always the latest and stays in sync with the highest-numbered `releases` branch. The process for making changes to a template is as follows:
- Make a new branch off `main`
- Make changes to the template
- Test the changes by updating a calling pipeline to use the new branch
- Merge the branch to `main`
- If the changes are not breaking, merge `main` to the highest-numbered `releases` branch, e.g. `releases/v1.0`
- If the changes are breaking, branch off `main` with a new `releases` branch, e.g. `releases/v1.1`
- Revert or update the `ref` in the calling pipeline
This allows users of the template to get fixes or new functionality automatically. But if there are breaking changes, they can choose when to opt into the new release. (I initially used tags, but that got messy with git.)
Creating the Build Templates
Looking at the build pipeline from the previous post, it had the following tasks:
- Checkout the code
- Build and test in Docker
- Publish the build output
- Publish the test results
- Publish the code coverage
- If not a dry run:
  - Publish the Docker image locally
  - Push the Docker image to the registry
That list of tasks is what any Docker build runs. Let's see how to make it a template. I could make the template a `stage`, `job`, or `steps` template. Usually, you want the template library to be a bunch of smaller Lego-like pieces that can be combined into larger templates or used by themselves. I'll make this one `steps`, since that way it can be used in any `job`, or used multiple times in one `job`. I can always wrap it in a `job` template if I need to.

The original pipeline had a dry run parameter, so I'll need that. Then, since this template will go under `steps` in the calling pipeline, I'll add `steps` to the template.
```yaml
parameters:
- name: isDryRun
  type: boolean

steps:
```
I don't have a `default` or `displayName` for the parameter, since the name makes it obvious, and it will never be shown in the UI. I'll always want the caller to pass in the parameter, so it has no default.
In the build pipeline, I set the `tags` variable like this:
```yaml
variables:
- name: tags
  ${{ if eq(variables['Build.SourceBranchName'], 'main') }}:
    value: "$(Build.BuildId)"
  ${{ else }}:
    value: "$(Build.BuildId)-prerelease"
```
A `steps` template can't contain `variables`, so we can't do the same thing here. Instead, we'll add another parameter, `tags`. You may think: why can't I set the `default` value for the parameter the way I did for the `value` of the variable? Unfortunately, template expression syntax is not allowed in `parameters`.
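The workaround is to let the caller compute the conditional value in its own `variables` section, where template expressions are allowed, and pass the result in through the `tags` parameter. A sketch of the calling side (the template reference and parameter values are illustrative):

```yaml
variables:
- name: tags
  ${{ if eq(variables['Build.SourceBranchName'], 'main') }}:
    value: "$(Build.BuildId)"
  ${{ else }}:
    value: "$(Build.BuildId)-prerelease"

jobs:
- job: build
  steps:
  - template: steps/build.yml@templates
    parameters:
      isDryRun: false
      tags: $(tags) # resolved at runtime inside the template's steps
```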
Now I'll add the steps from the build pipeline. In the YAML below, I copied the `steps` from the build pipeline's YAML. Next, I need to review this YAML to see if we need to add more parameters.
```yaml
steps:
- checkout: self
  displayName: 'Checkout source code'

- task: Docker@2
  displayName: Build and test sample-api # 👈 The name is hard-coded!
  inputs:
    repository: sample-api # 👈 name, again
    command: build
    Dockerfile: DevOps/Dockerfile # 👈 more hardcoded values that may be ok
    buildContext: ./src # 👈
    tags: $(tags) # 👈 This is now a parameter
    # 👈 We can make some of these parameters, rely on standards for others
    arguments: >-
      --build-arg BUILD_VERSION=$(Build.BuildNumber)
      --target build-test-output
      --output $(Agent.TempDirectory)/output

- task: PublishPipelineArtifact@1
  displayName: 'Publish build log'
  inputs:
    targetPath: $(Agent.TempDirectory)/output/logs
    artifact: buildLog
  condition: succeededOrFailed()

- task: PublishTestResults@2
  displayName: 'Publish Test Results'
  inputs:
    testResultsFormat: VSTest
    testResultsFiles: '**/*.trx'
    searchFolder: $(Agent.TempDirectory)/output/testResults
    publishRunAttachments: true
    failTaskOnFailedTests: true

- task: PublishCodeCoverageResults@2
  displayName: 'Publish coverage reports'
  inputs:
    codeCoverageTool: 'cobertura'
    summaryFileLocation: $(Agent.TempDirectory)/output/testResults/coverage/coverage.cobertura.xml

- ${{ if not(parameters.isDryRun) }}:
  - task: Docker@2
    displayName: Publish my-sample-api # 👈 name, again
    inputs:
      repository: my-sample-api # 👈 name, again
      command: build
      Dockerfile: $(Agent.TempDirectory)/Dockerfile
      buildContext: ./src
      tags: $(tags) # 👈 tags again
      arguments: --build-arg BUILD_VERSION=$(Build.BuildNumber)

  - task: Docker@2
    displayName: Push my-sample-api Image to the ACR
    inputs:
      repository: my-sample-api # 👈 name, again
      command: push
      tags: $(tags) # 👈 tags again
```
A word about `$(tags)`: that will work, as long as the caller has set a variable named `tags`. That is equivalent to using a global variable, which we all know is evil. It's best to use a parameter instead of macro syntax in a `steps` template. You can use macro syntax in `jobs` or `stages` templates, if you define the variable in the template.
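By contrast, a `jobs` template can own the variable itself, so macro syntax inside it doesn't depend on the caller. A minimal sketch (file and job names are hypothetical):

```yaml
# jobs/example.yml - hypothetical jobs template that defines its own 'tags' variable
jobs:
- job: build
  variables:
    tags: $(Build.BuildId)
  steps:
  - script: echo "Building with tags $(tags)"
    displayName: Show tags
```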
Adding a few parameters clears all that up. If the caller conforms to our typical project layout, the only parameters needed are `isDryRun` and `repositoryName`.
```yaml
parameters:
...
- name: repositoryName
  type: string
- name: tags
  type: string
  displayName: Comma-separated tags for the docker image # 👈 displayName tells the expected format
- name: buildNumber
  type: string
  default: '$(Build.BuildNumber)'
- name: context
  type: string
  default: './src'
- name: dockerfile
  type: string
  default: 'DevOps/Dockerfile'

steps:
- checkout: self
  displayName: 'Checkout source code'

- task: Docker@2
  displayName: Build and test ${{ parameters.repositoryName }} # ✅
  inputs:
    repository: ${{ parameters.repositoryName }} # ✅
    command: build
    Dockerfile: ${{ parameters.dockerfile }} # ✅
    buildContext: ${{ parameters.context }} # ✅
    tags: ${{ parameters.tags }} # ✅
    # ✅
    arguments: >-
      --build-arg BUILD_VERSION=${{ parameters.buildNumber }}
      --target build-test-output
      --output $(Agent.TempDirectory)/output

- task: PublishPipelineArtifact@1
  displayName: 'Publish build log'
  inputs:
    targetPath: $(Agent.TempDirectory)/output/logs
    artifact: buildLog
  condition: succeededOrFailed()

- task: PublishTestResults@2
  displayName: 'Publish Test Results'
  inputs:
    testResultsFormat: VSTest
    testResultsFiles: '**/*.trx'
    searchFolder: $(Agent.TempDirectory)/output/testResults
    publishRunAttachments: true
    failTaskOnFailedTests: true

- task: PublishCodeCoverageResults@2
  displayName: 'Publish coverage reports'
  inputs:
    codeCoverageTool: 'cobertura'
    summaryFileLocation: $(Agent.TempDirectory)/output/testResults/coverage/coverage.cobertura.xml

- task: Docker@2
  displayName: Publish ${{ parameters.repositoryName }} # ✅
  inputs:
    repository: ${{ parameters.repositoryName }} # ✅
    command: build
    Dockerfile: ${{ parameters.dockerfile }} # ✅
    buildContext: ${{ parameters.context }} # ✅
    tags: ${{ parameters.tags }} # ✅
    arguments: --build-arg BUILD_VERSION=${{ parameters.buildNumber }} # ✅

- ${{ if not(parameters.isDryRun) }}:
  - task: Docker@2
    displayName: Push ${{ parameters.repositoryName }} Image to the ACR
    inputs:
      repository: ${{ parameters.repositoryName }} # ✅
      command: push
      tags: ${{ parameters.tags }} # ✅
```
Note that `buildNumber` uses macro syntax (runtime) for its default value. If the caller does not pass in `buildNumber` at runtime, it will use the value of the predefined variable `$(Build.BuildNumber)`. You can use any variable that is defined as the default, but as mentioned above, use care with global variables. Is using macro syntax for default parameters a cool 😎 feature or an evil 😈 side effect? You decide.
What if some apps don't create unit test output or have special parameters to pass into their build? They can't use this template. But wait! Why not add more parameters?
```yaml
- name: dockerBuildArguments
  type: string
  displayName: Any additional arguments for docker build
  default: '--build-arg BUILD_VERSION=$(Build.BuildNumber)'
- name: dockerBuildOutputArguments
  type: string
  displayName: Output arguments for docker build, leave empty if no unit test output from Docker
  default: '--target output --output $(Agent.TempDirectory)/output'
```
I use `displayName` in this case to better explain the parameter.

Next, I update the Docker tasks to use these parameters. If `dockerBuildOutputArguments` is empty, the build-and-test step is skipped.
```yaml
# 👈 make the build-and-test step conditional
- ${{ if ne(replace(parameters.dockerBuildOutputArguments, ' ', ''), '') }}:
  - task: Docker@2
    displayName: Build and test ${{ parameters.repositoryName }}
    inputs:
      containerRegistry: ${{ parameters.registry }}
      repository: ${{ parameters.repositoryName }}
      command: build
      Dockerfile: ${{ parameters.dockerfile }}
      buildContext: ${{ parameters.context }}
      tags: ${{ parameters.tags }}
      arguments: ${{ parameters.dockerBuildArguments }} ${{ parameters.dockerBuildOutputArguments }} # 👈 add arguments

# 👈 this will still build if needed
- task: Docker@2
  displayName: Publish ${{ parameters.repositoryName }}
  inputs:
    repository: ${{ parameters.repositoryName }}
    command: build
    Dockerfile: ${{ parameters.dockerfile }}
    buildContext: ${{ parameters.context }}
    tags: ${{ parameters.tags }}
    arguments: ${{ parameters.dockerBuildArguments }} # 👈 add arguments
```
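As mentioned earlier, the `steps` template can always be wrapped in a `job` template when a caller wants a complete job. A minimal sketch of such a wrapper (the file name and pool are assumptions):

```yaml
# jobs/build.yml - hypothetical job-level wrapper around the steps template
parameters:
- name: isDryRun
  type: boolean
- name: repositoryName
  type: string
- name: tags
  type: string

jobs:
- job: build
  displayName: Build ${{ parameters.repositoryName }}
  pool:
    vmImage: ubuntu-latest
  steps:
  # template paths within a repo are relative to the including file
  - template: ../steps/build.yml
    parameters:
      isDryRun: ${{ parameters.isDryRun }}
      repositoryName: ${{ parameters.repositoryName }}
      tags: ${{ parameters.tags }}
```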
Calling The Build Template
Now that we have a template, let's revamp the old build pipeline to use it.
```yaml
name: '1.1.$(Rev:r)$(buildSuffix)'

parameters:
- name: isDryRun
  type: boolean
  displayName: Perform a dry run - do not push the docker image
  default: true # for the blog sample testing we always do dry run
  # default: false

trigger:
  branches:
    include:
    - refs/heads/main
    - refs/heads/develop
    - refs/heads/release/*
  paths:
    exclude:
    - 'DevOps'
    - 'doc'
    - '*.md'
    - '*.ps*1'

pr:
  branches:
    include:
    - refs/heads/main
    - refs/heads/develop
  paths:
    exclude:
    - 'DevOps'
    - 'doc'
    - '*.md'
    - '*.ps*1'

variables:
- name: buildSuffix
  # Set the build suffix to -DRYRUN if it's a dry run; it is used in the name
  ${{ if parameters.isDryRun }}:
    value: '-DRYRUN'
  ${{ else }}:
    value: ''
- name: tags
  ${{ if eq(variables['Build.SourceBranchName'], 'main') }}:
    value: "$(Build.BuildId)"
  ${{ else }}:
    value: "$(Build.BuildId)-prerelease"

# 👈 add the template repository
resources:
  repositories:
  - repository: templates # name after the @ below
    type: git
    name: azdo-templates
    ref: releases/v1.0

jobs:
- job: build
  displayName: Build
  pool:
    vmImage: ubuntu-latest
  steps:
  # 👈 ripped out all the steps and added the template
  - template: steps/build.yml@templates
    parameters:
      isDryRun: ${{ parameters.isDryRun }}
      repositoryName: sample-api
      tags: $(tags)
      # 👈 these are optional, and I like the defaults
      # dockerfile: 'DevOps/Dockerfile'
      # context: './src'
      # buildNumber: '$(Build.BuildNumber)'
```
Now we can run the build pipeline just as before. The interesting part of this exercise is that the expanded YAML for the templated and non-templated pipelines' steps is the same! We just made the YAML reusable. Here's a screenshot of the original pipeline on the left and the templated pipeline on the right.
From the completed job, if you use the kebab menu to Download logs and view the expanded YAML, the only difference will be the added `resources` section. The `template` keyword is replaced with the steps from the template.
Summary
In this post, I showed you how to take a build pipeline and create a template from it. Converting any YAML pipeline will follow the same steps. In the next post, I'll do that for the deploy pipeline.
Links
- This sample's source; the YAML is in the `DevOps-templated` folder
- This sample's Build pipeline in Azure DevOps
- The template repo
- The template repo
Azure DevOps documentation: