How to reuse pipelines via Azure Pipelines

Dion van Velde
6 min read · Jan 29, 2021

Azure DevOps is a Microsoft service for hosting code projects and making them available to others. In this technical blog I give an introduction to how I reuse pipeline templates. Azure Pipelines works with almost any programming language or project type. It combines continuous integration (CI) and continuous delivery (CD) to build and test a project continuously: a method for delivering applications to (end) customers frequently, with almost every step automated.

In Azure Pipelines it is common to define build pipelines in YAML: text files in which all the steps are written out. These YAML files are checked into the version control system. The advantage of this approach is that pipelines can differ per branch, so you do not have to roll out changes everywhere at one central moment.
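
For example, a branch can carry its own trigger configuration in the pipeline file it versions; a minimal sketch, assuming a development branch as in the pipeline later in this post:

# azure-pipelines.yml, versioned together with the code on this branch
trigger:
  branches:
    include:
    - development    # run CI for pushes to this branch

pr:
- development        # and for pull requests targeting it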

Especially in a more complex project it is useful to split the pipeline into parts that can be reused. The project below consists of a number of components, and it is not always necessary to build all of them, certainly not when nothing in a component has changed.

An important advantage of splitting the pipeline up is that you can reuse the individual templates. This prevents duplicate code, and the file names alone already give an overview of the tasks performed in each part.

In the project that I use as an example here, I use the following split (a sketch of the file layout follows after the list):

1. Server. The backend of the website, in our case the Sitecore platform.
2. Client. Front-end frameworks that are separate from the backend.
3. Services. Separate Windows services that perform background tasks.
4. Develop CI. A combination of all the pipelines above.
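
On disk this translates into one folder with the combined pipeline plus the job and step templates; the paths below are the ones used throughout this post:

/azure-pipeline/develop-ci.yml
/azure-pipeline/jobs/server_sitecore.yml
/azure-pipeline/jobs/server_config_collection.yml
/azure-pipeline/jobs/client.yml
/azure-pipeline/jobs/services.yml
/azure-pipeline/jobs/merge.yml
/azure-pipeline/steps/server_net_restore.yml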

Below is a screenshot of what the pipeline overview looks like.

Pipelines

Develop CI

I start with the Develop CI pipeline because it ties all the parts together. In Azure DevOps, click “New pipeline” in the Pipelines window and select the location of the repository; this can be in Azure DevOps as well as on GitHub. After clicking next, a list of options appears. I start from the starter pipeline and choose a location for the file; for the Develop CI that is /azure-pipeline/develop-ci.yml. When you click “Save and run”, the file is checked in and the pipeline is executed.
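
For reference, the generated starter pipeline looks roughly like this (the exact contents may vary a little over time); every line of it gets replaced by the stages below:

trigger:
- main

pool:
  vmImage: ubuntu-latest

steps:
- script: echo Hello, world!
  displayName: 'Run a one-line script'

- script: |
    echo Add other tasks to build, test, and deploy your project.
    echo See https://aka.ms/yaml
  displayName: 'Run a multi-line script'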

Review your pipeline

Running machines in parallel with stages

In the starter example only one machine is used at a time, but for a larger project it is more convenient to run things in parallel. There are costs involved: for private pipelines on the free tier you get one hosted machine per organization, with a maximum of 1,800 minutes per month. This may change, but it is the situation as of January 2021.

The stages in the YAML pipeline are, so to speak, the books, and the jobs are the chapters. Jobs are executed on separate machines. The final stage, “Merge”, waits until the Server and Client stages have completed. Below is the YAML pipeline for /azure-pipeline/develop-ci.yml.

trigger: none

pr:
- development

variables:
  buildPlatform: "Any CPU"
  buildConfiguration: "Release"
  webroot: "$(build.artifactstagingdirectory)/wwwroot"
  serverPool: "windows-latest"
  clientPool: "ubuntu-20.04"
  serverSitecoreArtifact: "server_sitecore"
  serverUnicornDataArtifact: "server_unicorn_sitecore_data"
  serverConfigCollectionArtifact: "server_config_collection"
  clientArtifact: "client"
  mergeArtifact: "merge"
  npm_config_cache: $(Pipeline.Workspace)/.npm

stages:
- stage: Server
  dependsOn: []
  jobs:
  - template: /azure-pipeline/jobs/server_sitecore.yml
    parameters:
      pool: "$(serverPool)"
      artifact: "$(serverSitecoreArtifact)"

  - template: /azure-pipeline/jobs/server_config_collection.yml
    parameters:
      pool: "$(serverPool)"
      artifact: "$(serverConfigCollectionArtifact)"
      serverUnicornDataArtifact: "$(serverUnicornDataArtifact)"

- stage: Client
  dependsOn: []
  jobs:
  - template: /azure-pipeline/jobs/client.yml
    parameters:
      pool: "$(clientPool)"
      artifact: "$(clientArtifact)"

- stage: Services
  dependsOn: []
  jobs:
  - template: /azure-pipeline/jobs/services.yml
    parameters:
      pool: "$(serverPool)"
      solutions:
      - key: service1
        value: "/src/service1.csproj"

- stage: Merge
  dependsOn:
  - Server
  - Client
  jobs:
  - template: /azure-pipeline/jobs/merge.yml
    parameters:
      pool: "$(serverPool)"
      serverSitecoreArtifact: "$(serverSitecoreArtifact)"
      clientArtifact: "$(clientArtifact)"
      mergeArtifact: "$(mergeArtifact)"

Looping over a list to build different services

Let's zoom in further on the Services stage, described below. The services YAML is called as a template from the “Develop CI” pipeline, and the job it defines produces multiple artifacts. An artifact is the result of a build task; it can be, for example, a folder with DLLs or a folder with CSS and JavaScript.

In the Develop CI YAML I pass in the parameters. The services YAML also declares parameters, but those defaults are overwritten by the caller; they are only used when a parameter is not passed in, as the sketch below shows.
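
A minimal sketch of that mechanism; the template name and job here are made up for illustration:

# /azure-pipeline/jobs/example.yml (hypothetical)
parameters:
  pool: "windows-2019"        # default, only used when the caller passes nothing

jobs:
- job: Example
  pool:
    vmImage: ${{ parameters.pool }}

# caller, for example in develop-ci.yml:
# - template: /azure-pipeline/jobs/example.yml
#   parameters:
#     pool: "windows-latest"  # overrides the default above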

Below is the YAML pipeline for /azure-pipeline/jobs/services.yml.

parameters:
  pool: "windows-2019"
  nuGetVersion: "5.2.0"
  solutionDir: '$(Build.SourcesDirectory)/src/'
  slnFile: '$(Build.SourcesDirectory)/src/Project.sln'
  job: ServicesRabbit
  solutions:
  - key: service1
    value: "/src/service1.csproj"
  BuildArguments: "/p:DeployOnBuild=True /p:DeployDefaultTarget=WebPublish /p:WebPublishMethod=FileSystem"

jobs:
- job: ${{ parameters.job }}
  displayName: "Services"
  workspace:
    clean: all
  pool:
    vmImage: ${{ parameters.pool }}
    demands:
    - msbuild
  variables:
    outputPath: '$(build.artifactstagingdirectory)\is_over_written_default'
  steps:
  - checkout: self
    clean: true
    fetchDepth: 1

  - template: /azure-pipeline/steps/server_net_restore.yml
    parameters:
      solution: ${{ parameters.slnFile }}

  - ${{ each item in parameters.solutions }}:

    - task: PowerShell@2
      enabled: true
      displayName: "[net] update output path"
      inputs:
        targetType: 'inline'
        script: |
          $updateOutputPath = "$(build.artifactstagingdirectory)/${{ item.key }}"
          Write-Output "##vso[task.setvariable variable=outputPath]$updateOutputPath"

    - task: VSBuild@1
      displayName: "[net] Build ${{ item.key }}"
      inputs:
        solution: "$(Build.SourcesDirectory)${{ item.value }}"
        vsVersion: "16.0"
        msbuildArgs: '${{ parameters.BuildArguments }} /p:SolutionDir="${{ parameters.solutionDir }}"'
        platform: "$(buildPlatform)"
        configuration: "$(buildConfiguration)"
        maximumCpuCount: true

    - task: PublishPipelineArtifact@1
      enabled: true
      displayName: "[net] Publish Project ${{ item.key }}"
      inputs:
        artifactName: ${{ item.key }}
        targetPath: $(build.artifactstagingdirectory)/${{ item.key }}
        publishLocation: "pipeline"
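
Adding another Windows service is then only a matter of extending the solutions list where the template is called in develop-ci.yml; the ${{ each }} loop generates the extra build and publish steps. A sketch, where service2 and its path are hypothetical:

- template: /azure-pipeline/jobs/services.yml
  parameters:
    pool: "$(serverPool)"
    solutions:
    - key: service1
      value: "/src/service1.csproj"
    - key: service2                  # hypothetical second service
      value: "/src/service2.csproj"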

Separate steps within the jobs

To reuse parts within the different jobs, break the steps down as well. This is useful, for example, for restoring external software libraries: that step is needed in several places, and extracting it into a template avoids duplicate code.

The example below restores the packages for the solution. This project uses the .NET Framework packages.config style. The Cache task searches for all “packages.config” files in the project and computes a hash over them, so that when one of these files changes, the old cache folder is not used; as long as the config files do not change, the key stays the same. In Azure DevOps the cache key is scoped to the pipeline, and if a cache is not used within 7 days its content is deleted.

The contents of: /azure-pipeline/steps/server_net_restore.yml

parameters:
  verbose: false
  nuGetVersion: "5.2.0"
  solution: '$(Build.SourcesDirectory)\src\Project.sln'

steps:

- task: Cache@2
  displayName: "[net] Cache NuGet packages"
  inputs:
    key: 'nuget | "$(Agent.OS)" | **/packages.config,!**/bin/**'
    restoreKeys: |
      nuget | "$(Agent.OS)"
    path: '$(Build.SourcesDirectory)/src/packages'

- task: NuGetToolInstaller@1
  displayName: "[net] Use NuGet ${{ parameters.nuGetVersion }}"
  inputs:
    versionSpec: ${{ parameters.nuGetVersion }}

- task: NuGetCommand@2
  displayName: "[net] NuGet Restore"
  inputs:
    restoreSolution: "${{ parameters.solution }}"
    feedsToUse: config
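
Any other job template can pull in the same restore steps with its own solution file. A short sketch; the job name and .sln path here are hypothetical:

jobs:
- job: OtherServer                 # hypothetical job reusing the restore steps
  pool:
    vmImage: windows-latest
  steps:
  - checkout: self
  - template: /azure-pipeline/steps/server_net_restore.yml
    parameters:
      solution: '$(Build.SourcesDirectory)\src\Other.sln'   # hypothetical solution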

Merge job

The last step in the build process is combining the artifacts before they are released to an environment. Whether you do this during the build or during the release is up to you. It is not visible in the example below, but in my case there were several front-end frameworks in different locations, so it made sense to bring everything together before the release starts. Below is the YAML for /azure-pipeline/jobs/merge.yml.

parameters:
  pool: "ubuntu-latest"
  serverSitecoreArtifact: "server_sitecore"
  clientArtifact: "client"
  mergeArtifact: "merge"

jobs:
- job: Merge
  displayName: Merge website artifacts
  pool:
    vmImage: ${{ parameters.pool }}
  steps:
  - checkout: none

  - task: DownloadPipelineArtifact@2
    displayName: "[$(serverSitecoreArtifact)] DownloadPipelineArtifact"
    inputs:
      artifact: $(serverSitecoreArtifact)
      downloadPath: '$(pipeline.workspace)/$(serverSitecoreArtifact)'

  - task: DownloadPipelineArtifact@2
    displayName: "[$(clientArtifact)] DownloadPipelineArtifact"
    inputs:
      artifact: $(clientArtifact)
      downloadPath: '$(pipeline.workspace)/$(clientArtifact)'

  - task: PowerShell@2
    enabled: true
    displayName: rename $(serverSitecoreArtifact) to $(mergeArtifact)
    inputs:
      targetType: 'inline'
      script: |
        Move-Item -Path $(pipeline.workspace)/$(serverSitecoreArtifact) -Destination $(pipeline.workspace)/$(mergeArtifact)

  - task: PowerShell@2
    enabled: true
    displayName: Merge $(clientArtifact) into $(mergeArtifact) (when /assets/client already exists)
    inputs:
      targetType: 'inline'
      script: |
        Get-ChildItem -Path $(pipeline.workspace)/$(clientArtifact) | Copy-Item -Destination $(pipeline.workspace)/$(mergeArtifact)/assets/client -Recurse -Force

  - task: PublishPipelineArtifact@1
    enabled: true
    displayName: Publish $(mergeArtifact) Artifact
    inputs:
      targetPath: "$(pipeline.workspace)/$(mergeArtifact)"
      artifact: $(mergeArtifact)
      publishLocation: "pipeline"
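
From here a release only has to pick up the single merge artifact instead of juggling the separate server and client artifacts. If you keep deployment in YAML as well, a deployment job could download it like this; a sketch in which the environment name is hypothetical:

jobs:
- deployment: DeployWeb
  environment: "test"              # hypothetical environment
  pool:
    vmImage: windows-latest
  strategy:
    runOnce:
      deploy:
        steps:
        - download: current        # download from the current pipeline run
          artifact: merge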

The conclusion: it is quite possible to split pipelines into separate files, and it makes keeping an overview much easier. This way you avoid duplicate code and automate the delivery of the application.

This blog first appeared on qdraw.nl


Dion van Velde

Blogger, Photographer, swimmer, Software Developer at We Are You (Den Bosch) https://qdraw.nl/blog