
Azure DevOps pipelines can be extended to automate and streamline build and deployment processes, making them more efficient and reliable. This allows teams to focus on higher-value tasks.
With extends templates, teams can automate tasks such as code analysis, testing, and deployment, reducing manual errors and increasing consistency. Extended pipelines can also be integrated with various tools and services, including GitHub, Jenkins, and Docker.
By extending Azure DevOps pipelines, teams can also improve collaboration and communication, as all stakeholders can see the pipeline's progress and results in real-time. This transparency and visibility help to identify and resolve issues more quickly.
Configuring the Pipeline
You can add the Required YAML Template check to an environment in Azure DevOps, just as you would add an Approval. This check ensures that any pipeline deploying to the environment extends from the specified template.
If you try to deploy to an environment without using the extends template, the run fails the required template check. The run summary shows which check failed and provides a hyperlink to the checks configured for that environment.
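As a rough illustration, here is a minimal sketch of a pipeline that extends from a shared template. The repository alias, project/repo name, template file name, and the buildSteps parameter are assumptions and would need to match whatever the shared template actually defines.

```yaml
# azure-pipelines.yml in the application repository
resources:
  repositories:
  - repository: templates                  # assumed alias for the shared template repository
    type: git
    name: PipelineTemplates/shared         # assumed project/repo holding the template

extends:
  template: secure-pipeline.yml@templates  # assumed template enforced by the Required Template check
  parameters:
    buildSteps:                            # assumed stepList parameter exposed by the template
    - script: echo "app-specific build steps go here"
```

Because the environment's Required YAML Template check compares the run against this extends reference, any pipeline that skips it is blocked before it can deploy.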
The pipeline hierarchy in Azure DevOps consists of Stages, Jobs, and Steps. A stage contains multiple jobs and jobs contain multiple steps. Make sure to keep an eye on the required indents and dashes when creating a pipeline.
A deployment stage in Azure DevOps uses a special job type called a deployment job, which allows for additional options such as deployment history and deployment strategies. This job type is different from a standard job.
Here's a breakdown of the deployment stage properties, with a YAML sketch after the list:
- Environment: This is set to the name of the environment, such as 'Staging' or 'Production'.
- Strategy: This section has lifecycle hooks that can be used in different deployment strategies. For this walkthrough, we're using the RunOnce strategy.
- Lifecycle hooks: Each hook has its own set of steps to execute. For example, the deploy hook has steps to extract files from a zip and deploy them to an Azure App Service.
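Putting those properties together, a minimal sketch of a deployment job might look like the following. The environment name 'Staging', the RunOnce strategy, and the deploy hook come from the walkthrough; the job name and the placeholder step are assumptions.

```yaml
jobs:
- deployment: DeployWeb             # deployment job type, not a standard job
  environment: Staging              # the environment this job targets
  strategy:
    runOnce:                        # the RunOnce deployment strategy
      deploy:                       # the deploy lifecycle hook
        steps:
        - script: echo "extract the zip and deploy to the App Service here"
```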
Pipeline Structure
Pipeline Structure is a fundamental concept in Azure DevOps pipelines. A pipeline is composed of Stages, Jobs, and Steps.
A stage contains multiple jobs, and jobs contain multiple steps. The required indents and dashes in the YAML syntax can be tricky to keep track of, but syntax checker add-ons in Visual Studio Code can help prevent errors.
In a multistage pipeline, stages can run sequentially or in parallel depending on how you set dependencies up. Jobs in a stage run in parallel by default, and steps within a job run sequentially.
Here's a breakdown of the pipeline hierarchy: a pipeline contains one or more stages, each stage contains one or more jobs, and each job contains one or more steps (tasks or scripts).
In the example provided, the pipeline has a stage for the build with three jobs: one to build and create the application artifact, one to build and create the functional test artifact, and one to create the infrastructure artifact. These jobs all run in parallel, reducing the overall time to complete the stage.
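A rough sketch of such a build stage is shown below. The job names and the placeholder steps are assumptions; jobs with no dependsOn between them run in parallel when enough agents are available.

```yaml
stages:
- stage: Build
  jobs:
  - job: BuildApp                   # builds and publishes the application artifact
    steps:
    - script: echo "build the app and publish the 'app' artifact"
  - job: BuildFunctionalTests       # builds and publishes the functional test artifact
    steps:
    - script: echo "build and publish the functional test artifact"
  - job: BuildInfrastructure        # publishes the infrastructure artifact
    steps:
    - script: echo "publish the infrastructure artifact"
```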
Pipeline Deployment
Pipeline Deployment is a crucial aspect of Azure DevOps, allowing you to automate the deployment of your code to various environments. You can deploy to multiple environments, such as staging and production, with different dependencies between the stages.
A pipeline is a collection of stages, and stages can run sequentially or in parallel depending on how you set dependencies up. For example, in the build stage, you can have three different jobs that run in parallel: one to build and create the application artifact, one to build and create the functional test artifact, and one to create the infrastructure artifact.
To deploy to a staging environment, you'll need to create a separate stage with a deployment job that extracts the files from the zip created in the build stage and deploys them to an Azure App Service. This stage will have a property named environment set to 'Staging' and a strategy section with lifecycle hooks that can be used in different deployment strategies.
In the production environment, you'll need to update the stage and job names to indicate they are for production, and add a dependency on both the build stage and the staging stage. Most projects also prefer to have an approval check between the staging and production stages.
Here are the different stages in a pipeline deployment, with a condition sketch after the list:
- Build: Runs as before, no changes
- DeployPR: Runs if the build stage was successful, the “Build reason” equals “PullRequest”, and the “Pull Request Id” variable is not null.
- DeployDev, DeployQA, and DeployProd: Run in series if the build stage was successful and the source branch being built is “master” – essentially, when it’s not a pull request.
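As a sketch of how those rules might be expressed, the conditions below use the predefined Build.Reason, System.PullRequest.PullRequestId, and Build.SourceBranch variables. The DeployPR and DeployDev stage names come from the list above; the job names and placeholder steps are assumptions.

```yaml
# stages that follow the Build stage
- stage: DeployPR
  dependsOn: Build
  # run only for pull request builds with a non-empty pull request id
  condition: and(succeeded(), eq(variables['Build.Reason'], 'PullRequest'), ne(variables['System.PullRequest.PullRequestId'], ''))
  jobs:
  - job: DeployPullRequestEnvironment
    steps:
    - script: echo "deploy the dynamic pull request environment here"

- stage: DeployDev
  dependsOn: Build
  # run only when the source branch is master (i.e. not a pull request)
  condition: and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/master'))
  jobs:
  - job: DeployToDev
    steps:
    - script: echo "deploy to the dev environment here"
```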
Pipeline Triggers and Variables
Specifying triggers in Azure DevOps reduces how often the pipeline is run, allowing you to direct the build to only run on specific events, such as pull requests created for the master branch or on a merge to the master branch.
Triggers also work hand in hand with branch policies, such as requiring a passing build on pull requests to the master branch. Together they ensure that the pipeline only runs when necessary, saving time and resources.
A unique name for the build can be created using the name property, allowing you to format the date and branch name as desired for your team.
Triggers, Name, and Variables
Specifying triggers is a great way to reduce how often a pipeline is run, as it allows you to control which events trigger the pipeline.
In the example, triggers are set to run the build on pull requests created for the master branch and on a merge to the master branch.
A unique name for each run can be created using the "name" keyword; by default, Azure combines the date with an incrementing revision number.
In the example, the default name format is overwritten to include the branch name, which can be modified to suit the team's preferences.
Variables are pipeline-specific properties that can be reused throughout the file, referenced using $(variableName) syntax.
The AgentImage variable is a good example of this: it stores the agent image name in one place and can be referenced wherever the pipeline declares a pool.
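A minimal sketch of these three sections is shown below. The branch name, the name format, and the AgentImage value are assumptions you would adjust for your team.

```yaml
# Run on merges to master and validate pull requests that target master
# (for Azure Repos Git, pull request validation comes from branch policies instead of the pr section)
trigger:
  branches:
    include:
    - master

pr:
  branches:
    include:
    - master

# Override the default run name to include the date, a revision counter, and the branch
name: $(Date:yyyyMMdd)$(Rev:.r)-$(SourceBranchName)

variables:
  AgentImage: ubuntu-latest        # assumed image; referenced with $(AgentImage) below

pool:
  vmImage: $(AgentImage)
```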
Parameter Data Types
Parameter data types are used to define the structure and format of the data that's being passed around in your pipeline. This is crucial for ensuring that the data is correctly interpreted and processed.
You can use various data types, including string, number, and boolean, which are all straightforward. For example, a string can be a simple text value, like "hello world".
The number data type can be restricted to a specific set of allowed values using the values list; otherwise, any number-like string is accepted. This flexibility is useful for handling different kinds of numerical input.
Boolean values are limited to true or false, making it easy to represent binary choices.
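For example, a minimal sketch of declaring these simple parameter types might look like this; the parameter names, defaults, and allowed values are assumptions.

```yaml
parameters:
- name: environmentName       # string restricted to a fixed set of values
  type: string
  default: Staging
  values:
  - Staging
  - Production
- name: instanceCount         # accepts number-like values
  type: number
  default: 1
- name: runSmokeTests         # true or false
  type: boolean
  default: true
```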
You can also use object data types to represent more complex structures, like a dictionary or a list of key-value pairs. This is particularly useful for storing and retrieving data that has multiple attributes.
Some data types, like step, stepList, job, jobList, deployment, deploymentList, stage, and stageList, use a standard YAML schema format. This format is widely used and understood, making it easy to work with.
Here's a summary of the data types you can use:
- string: a text value
- number: any number-like value, optionally restricted to a values list
- boolean: true or false
- object: a nested structure of keys and values
- step and stepList: a single step or a sequence of steps
- job and jobList: a single job or a sequence of jobs
- deployment and deploymentList: a single deployment job or a sequence of deployment jobs
- stage and stageList: a single stage or a sequence of stages
You can also directly reference an object's keys and corresponding values, making it easy to access and manipulate the data.
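A brief sketch of an object parameter and of referencing its keys directly is shown below; the parameter name, keys, and values are assumptions.

```yaml
parameters:
- name: appSettings            # assumed object parameter
  type: object
  default:
    region: eastus
    sku: S1

steps:
# reference the object's keys directly with template expression syntax
- script: echo "Deploying to ${{ parameters.appSettings.region }} on SKU ${{ parameters.appSettings.sku }}"
```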
Pipeline Security
Pipeline security is a top priority when extending Azure DevOps pipelines. You can restrict access to your pipelines with Azure Active Directory (AAD) groups, allowing you to control who can view and manage your pipelines.
Azure DevOps lets you mark pipeline variables as secret to securely store sensitive data such as API keys and passwords. This is especially useful for values that should not be committed to source control.
You can also use Azure Key Vault to securely store and manage sensitive data, such as API keys and certificates. This provides an additional layer of security and helps to centralize your sensitive data.
Secret values are defined as pipeline variables in the pipeline settings or in a variable group, and then referenced in your pipeline wherever they are needed.
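A minimal sketch of consuming a secret is shown below, assuming a variable group named app-secrets that contains a secret variable called ApiKey (a variable group can also be linked to Azure Key Vault). Secret variables are not exposed to scripts automatically, so the value is mapped into the environment explicitly.

```yaml
variables:
- group: app-secrets             # assumed variable group holding the secret

steps:
- script: ./deploy.sh            # assumed deployment script that reads API_KEY
  env:
    API_KEY: $(ApiKey)           # assumed secret variable; mapped in explicitly
```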
Azure DevOps also supports integration with Azure Security Center, which provides advanced security features such as threat detection and vulnerability assessment. This helps to identify and remediate potential security issues in your pipeline.
Dependencies
In Azure DevOps pipeline, dependencies are crucial for ensuring that stages run in the correct order. This is achieved by using the dependsOn property in the deployment stage to specify the stages that must complete before it runs.
The dependsOn property is an array of stages that this stage should verify have successfully completed before running. This ensures that the deployment stage doesn't run before or in parallel with the Build stage, which needs the artifact created in the Build stage.
To download artifacts created from previous stages, you can use the download task. This task will retrieve the artifact specified, which in this case is the one created in the Build stage (named 'app').
The downloaded artifact lands under the $(Pipeline.Workspace) directory, in a folder named after the artifact. This structure was defined in the build, and can be reviewed to refresh your memory.
The Extract Files task controls where the contents end up through its archiveFilePatterns and destinationFolder inputs. Here, you'll extract the files into a new directory and specify a files folder.
The .NET Core publish task creates a zip package named after the project and puts all of the published files inside it.
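Here is a hedged sketch of those deploy-hook steps, assuming the Build stage published an artifact named 'app' containing a single zip. The service connection name, App Service name, and destination folder are placeholders.

```yaml
# steps inside the deploy lifecycle hook of the deployment job
steps:
- download: current                        # download artifacts from this run
  artifact: app                            # artifact published by the Build stage

- task: ExtractFiles@1
  inputs:
    archiveFilePatterns: '$(Pipeline.Workspace)/app/*.zip'   # zip created by dotnet publish
    destinationFolder: '$(Pipeline.Workspace)/files'         # assumed extraction folder
    cleanDestinationFolder: true

- task: AzureWebApp@1
  inputs:
    azureSubscription: 'my-service-connection'               # assumed service connection name
    appType: 'webApp'
    appName: 'my-app-service-staging'                        # assumed App Service name
    package: '$(Pipeline.Workspace)/files'
```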
Pipeline Staging and Production
You can deploy your code to two different app services, staging and production, with the appropriate dependencies between the stages. In Azure DevOps, you can set a pre-deployment approval check before deploying to the production infrastructure.
To deploy to the staging environment, you'll create a stage with a deployment job, where you'll extract files from a zip and deploy them to an Azure App Service. The stage will have a property named environment set to 'Staging', and a strategy section with lifecycle hooks that can be used in different deployment strategies.
You'll also need to update the stage and job names to indicate they are for production when deploying to the production environment. This includes updating the dependsOn section to indicate a dependency on the build stage as well as the staging stage.
Here's a summary of the dependencies between stages:
- Staging: depends on the Build stage
- Production: depends on both the Build stage and the Staging stage
By setting these dependencies, you can ensure that the production environment is not deployed until the build and staging environments have completed successfully.
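A compact sketch of the resulting stage layout is shown below. The stage, job, and environment names are assumptions, and the build and deploy steps are placeholders.

```yaml
stages:
- stage: Build
  jobs:
  - job: BuildApp
    steps:
    - script: echo "build and publish the 'app' artifact here"

- stage: DeployStaging
  dependsOn: Build                      # needs the artifact from Build
  jobs:
  - deployment: DeployStagingWeb
    environment: Staging
    strategy:
      runOnce:
        deploy:
          steps:
          - script: echo "deploy to the staging App Service here"

- stage: DeployProduction
  dependsOn:
  - Build                               # needs the artifact from Build
  - DeployStaging                       # and a successful staging deployment
  jobs:
  - deployment: DeployProductionWeb
    environment: Production             # a pre-deployment approval check can guard this environment
    strategy:
      runOnce:
        deploy:
          steps:
          - script: echo "deploy to the production App Service here"
```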
Frequently Asked Questions
How do I add an extension to my Azure pipeline?
To add an extension to your Azure pipeline, select the Marketplace shopping bag icon in Azure DevOps, browse the marketplace, and choose the extension you want to install. Then select your organization and click Install to complete the process.
Sources
- https://josh-ops.com/posts/extends-template/
- https://learn.microsoft.com/en-us/azure/devops/pipelines/process/template-parameters
- https://learn.microsoft.com/en-us/azure/devops/pipelines/yaml-schema/
- https://samlearnsazure.blog/2020/02/27/creating-a-dynamic-pull-request-environment-with-azure-pipelines/
- https://mercuryworks.com/blog/creating-a-multi-stage-pipeline-in-azure-devops