Terraform with Azure DevOps is a game-changer for infrastructure management. It lets you manage your infrastructure as code, making it easier to version, collaborate on, and reproduce your infrastructure.
By integrating Terraform with Azure DevOps, you can automate the process of creating and managing your infrastructure. This includes creating and managing virtual machines, networks, and storage.
Terraform's infrastructure as code approach helps to reduce errors and improve consistency. It also makes it easier to track changes and audit your infrastructure.
With Terraform running in Azure DevOps pipelines, infrastructure deployment is automated from the very start of a project. This is especially useful for large-scale projects, where manual deployment is time-consuming and error-prone.
Azure DevOps Setup
To set up Azure DevOps for Terraform, start by logging in and choosing Environments under Pipelines. Create a new environment by selecting Create Environment and giving it a name such as "dev".
Next, configure approvals and checks by selecting Approvals and then creating a new approval. This allows you to specify a user or group to provide approvals before a deployment stage runs.
In your Azure DevOps project, you'll also need the Terraform extension installed (select Install from its marketplace listing) and a Terraform YAML pipeline file added to the repository as a new file. This is an important step in getting Terraform up and running with Azure DevOps.
Setup Remote State
To set up remote state in Azure DevOps, you need to create an Azure Storage Account that supports encryption at rest, transparent geo-replication, and other features.
The account should have features such as blob versioning, HTTPS only access, private access only, and a minimum TLS version of 1.2.
You can create a script using your favorite tool, such as Azure CLI or PowerShell, to set up the storage account.
Here's a list of the required features:
- Encryption at rest
- Transparent geo-replication (RA-GZRS)
- Soft delete for blobs and containers
- Blob versioning
- HTTPS only access
- Private access only
- Minimum TLS version 1.2
Additionally, you should enable features like storage account firewall, advanced threat protection, customer managed encryption keys, and Azure Monitor diagnostics and logging.
If you only enable one feature, make sure it's blob versioning, which allows you to recover older versions of the state file.
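As a concrete starting point, here is a minimal Azure CLI sketch covering several of these settings. The resource names and location are placeholders, encryption at rest is already enabled by default on new storage accounts, and the firewall, advanced threat protection, customer-managed keys, and diagnostics would still need to be configured separately:

```
#!/usr/bin/env bash
# Minimal sketch: create a hardened storage account for Terraform state.
# The names and location below are placeholders - adjust to your environment.
RG="rg-terraform-state"
SA="sttfstate001"
LOCATION="westeurope"

az group create --name "$RG" --location "$LOCATION"

# RA-GZRS redundancy, HTTPS only, TLS 1.2 minimum, no public blob access.
az storage account create \
  --name "$SA" \
  --resource-group "$RG" \
  --location "$LOCATION" \
  --sku Standard_RAGZRS \
  --kind StorageV2 \
  --https-only true \
  --min-tls-version TLS1_2 \
  --allow-blob-public-access false

# Blob versioning plus soft delete for blobs and containers.
az storage account blob-service-properties update \
  --account-name "$SA" \
  --resource-group "$RG" \
  --enable-versioning true \
  --enable-delete-retention true --delete-retention-days 7 \
  --enable-container-delete-retention true --container-delete-retention-days 7

# Container that will hold the Terraform state file.
az storage container create --name tfstate --account-name "$SA" --auth-mode login
```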
Deploy and Manage Infrastructure with Remote State and YAML
Deploying and managing infrastructure with remote state and YAML is the next step, and it builds directly on the storage account described above.
To get started, create a remote backend for your Terraform state: a storage account with a unique name in a new resource group, configured for the appropriate location and redundancy, with a blob container called "tfstate" to hold the tfstate file for your infrastructure.
A remote backend is essential for Terraform automation, as it allows multiple build agents to share state changes. As before, you can script the setup with Azure CLI or PowerShell, enabling the same hardening listed in the previous section: encryption at rest, RA-GZRS geo-replication, soft delete for blobs and containers, blob versioning, HTTPS-only and private access, and a minimum TLS version of 1.2, plus the storage account firewall, advanced threat protection, customer-managed encryption keys, and Azure Monitor diagnostics and logging.
To configure the remote backend, add a backend block inside the terraform block of your configuration file. Backend blocks can't reference Terraform variables, so the actual values are best declared outside the code base in an external file, such as azurerm.backend.tfvars, and supplied at initialization time.
Here's an example of what the backend block might look like:
```
terraform {
  backend "azurerm" {
    # resource_group_name, storage_account_name, container_name and key
    # are supplied at init time from azurerm.backend.tfvars rather than
    # being hard-coded here.
  }
}
```
You then pass that external file to Terraform when initializing, using `terraform init -backend-config=azurerm.backend.tfvars`.
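For illustration, the azurerm.backend.tfvars file itself might contain values like these; the names are placeholders for your own resources:

```
resource_group_name  = "rg-terraform-state"
storage_account_name = "sttfstate001"
container_name       = "tfstate"
key                  = "terraform.tfstate"
```

Keeping these values out of the Terraform code base makes it easy to point the same configuration at different state storage per environment.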
In your Azure DevOps pipeline, you'll need to install Terraform and configure the remote backend, which you can do by adding the corresponding tasks to the pipeline.
Here's an example of what the YAML configuration file might look like:
```
trigger:
  branches:
    include:
      - master
  paths:
    include:
      - infrastructure/variables.tf
      - infrastructure/main.tf

pool:
  vmImage: 'ubuntu-latest'

variables:
  # subscription_id, application_id, tenant_id, storage_accounts, blob_storage,
  # state_file and sa-resource_group all come from the linked Variable Group
  # described later in this article.
  - group: 'your-variable-group-name'

steps:
  - task: TerraformInstaller@0
    displayName: 'Install Terraform'
    inputs:
      terraformVersion: '1.1.7'

  - task: AzureCLI@2
    displayName: 'Login to Azure'
    inputs:
      azureSubscription: 'your-subscription-name'
      scriptType: 'bash'
      scriptLocation: 'inlineScript'
      inlineScript: |
        # The service connection signs in for us; just select the subscription.
        az account set --subscription $(subscription_id)

  - task: TerraformTask@0
    displayName: 'Terraform init'
    inputs:
      command: 'init'
      workingDirectory: 'infrastructure'

  - task: TerraformTask@0
    displayName: 'Terraform apply'
    inputs:
      command: 'apply'
      workingDirectory: 'infrastructure'
      autoApprove: true
```
This YAML configuration file sets up a pipeline that installs Terraform, logs in to Azure, initializes Terraform against the remote backend, and applies the configuration.
By following these steps, you can deploy and manage your infrastructure with remote state and YAML in Azure DevOps.
Infrastructure Deployment Stage
To create an infrastructure deployment stage in Azure DevOps, you add a new stage to your pipeline definition. It goes after the build stage, which produces a plan artifact, so the plan can be reviewed and approved before it is applied.
The deployment stage has four steps: Extract the Terraform configuration and plan, Add Service Principal credentials to the environment, Pin Terraform to a specific version, and Terraform apply.
To connect this job to the dev environment, you need to use the environment: dev property. This ensures that all checks configured for that environment must pass before any steps in this job can run.
The pipeline secrets Terraform uses during deployment are the same ones it used during the build, and the deploy pipeline reuses the pipeline template described above.
Here's a breakdown of the steps in the deployment stage:
- Extract the Terraform configuration and plan
- Add Service Principal credentials to the environment
- Pin Terraform to a specific version
- Terraform apply
Because the same pipeline secrets are used here as in the build stage, no additional secret configuration is needed for deployment.
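To make this concrete, here is a rough sketch of what such a deployment stage could look like in YAML. It is not the exact template from the source: the stage and job names, working directory, and plan file name (tfplan) are assumptions, client_secret is assumed to be one of your pipeline secrets alongside the variables described earlier, and the extract step is only indicated as a comment because it's covered under Extract Configuration below.

```
stages:
  - stage: deploy
    dependsOn: build
    jobs:
      - deployment: terraform_apply
        displayName: 'Apply Terraform plan'
        environment: dev   # approvals and checks on the dev environment gate this job
        pool:
          vmImage: 'ubuntu-latest'
        strategy:
          runOnce:
            deploy:
              steps:
                # 1. Extract the Terraform configuration and plan
                #    (ExtractFiles task - see "Extract Configuration" below).
                # 2. Pin Terraform to a specific version.
                - task: TerraformInstaller@0
                  displayName: 'Pin Terraform version'
                  inputs:
                    terraformVersion: '1.1.7'
                # 3. Add Service Principal credentials to the environment and apply.
                - script: terraform apply -auto-approve tfplan
                  displayName: 'Terraform apply'
                  workingDirectory: '$(Pipeline.Workspace)/terraform'
                  env:
                    ARM_CLIENT_ID: $(application_id)
                    ARM_CLIENT_SECRET: $(client_secret)
                    ARM_TENANT_ID: $(tenant_id)
                    ARM_SUBSCRIPTION_ID: $(subscription_id)
```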
Service Principal and Credentials
To authenticate with Azure, we can use Service Principal credentials, which are represented as a Service Connection in Azure DevOps. These credentials are stored as pipeline secrets, making them secure and easily accessible.
A Service Principal is essentially a managed identity for an application or service in Azure, offering benefits such as automation of tasks, improved security, and creation of auditable traces.
To create a Service Principal, we can use the Azure CLI task to log into Azure and set the values as pipeline secrets. This approach is more efficient than maintaining a separate variable group or repeating the script in later steps.
Alternatively, we can create a Service Principal using the Azure Portal, which involves creating a separate file called "secrets.txt" to store sensitive information. The Service Principal can be created by running a command in the VS Code terminal, which will display the details of the account in a specific format.
The Service Principal credentials consist of three values: CLIENT_ID (appId), CLIENT_SECRET (password), and TENANT_ID (tenant). These values are used to log in and authenticate with Azure.
Here are the steps to create and store Service Principal credentials:
- Create a Service Principal using the Azure CLI task or Azure Portal
- Store the Service Principal credentials in a file called "secrets.txt"
- Export the credentials as environment variables for Terraform to use
- Store the credentials securely as pipeline secrets in Azure DevOps
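As a rough illustration of those steps (the Service Principal name, role, and scope are placeholders, and the exact command used in the source may differ), creating the Service Principal and exporting its credentials for Terraform might look like this:

```
# Create a Service Principal scoped to your subscription (placeholder name/ID).
az ad sp create-for-rbac \
  --name "sp-terraform-azdo" \
  --role Contributor \
  --scopes /subscriptions/<your-subscription-id>

# The command prints appId, password and tenant. Export them as the
# environment variables the Terraform azurerm provider expects:
export ARM_CLIENT_ID="<appId>"
export ARM_CLIENT_SECRET="<password>"
export ARM_TENANT_ID="<tenant>"
export ARM_SUBSCRIPTION_ID="<your-subscription-id>"
```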
Storage Account and Key Vault
In Azure DevOps, you can store sensitive data like secrets and keys in a Key Vault, which is a secure storage system.
To link your Key Vault secrets to a Variable Group, navigate to your Team Project in the Azure DevOps portal and go to Pipelines > Library.
Select your newly created Variable Group and toggle the “Link secrets from an Azure key vault as variables.” This will automatically create variables for all of your secrets.
Next, select your subscription and your Key Vault, and click the Add button to import your secrets. You may need to press “authorize” to allow Azure DevOps access to your subscription and/or your Key Vault.
All of your linked key vault secrets will now be visible in your Variable Group, making it easy to manage and use them in your Azure DevOps pipelines.
Configuration and Variables
In Azure DevOps Terraform, configuration and variables are crucial for a smooth deployment process.
You can link your Key Vault secrets to a Variable Group in the Azure DevOps portal, which will automatically create variables for all your secrets.
The YAML configuration file defines a trigger that causes the DevOps pipeline to run whenever a commit is pushed to the master branch of your repo.
The YAML configuration file should also include paths, pool, group, and other variables such as subscription_id, application_id, tenant_id, and storage_accounts.
Here's a list of the configuration variables used in the Powershell script and Terraform files:
- tf_state_resource_group_name: The name of the resource group.
- tf_state_storage_account_name: The name of the storage account.
- tf_state_storage_account_container_name: The name of the storage account container.
- project_name: The name of the project.
- azure_region: The Azure region where all the resources will be created.
- azure_subscription_id: The Azure subscription ID.
- azure_tenant_id: The Azure tenant ID.
- azdo_org_url: The URL of the Azure DevOps organization.
- azdo_project_name: The name of the Azure DevOps project where the variable group will be created.
- azdo_pat: An Azure DevOps PAT (Personal Access Token).
These variables are used to store the configuration settings and can be changed to your liking, but the names must not be changed or the script will break.
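The exact file format depends on the bootstrap script you use, but as an illustration, the same settings could be laid out as a Terraform variable definitions file; every value below is a placeholder:

```
tf_state_resource_group_name            = "rg-terraform-state"
tf_state_storage_account_name           = "sttfstate001"
tf_state_storage_account_container_name = "tfstate"
project_name                            = "my-project"
azure_region                            = "westeurope"
azure_subscription_id                   = "00000000-0000-0000-0000-000000000000"
azure_tenant_id                         = "00000000-0000-0000-0000-000000000000"
azdo_org_url                            = "https://dev.azure.com/your-org"
azdo_project_name                       = "your-project"
azdo_pat                                = "your-personal-access-token"
```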
Extract Configuration
Extracting the Terraform configuration and plan is a crucial step in the deployment process. The pipeline template does this with the ExtractFiles task, which unpacks the stored configuration and plan from a gzipped tar archive.
This step is necessary because the deploy agent doesn't share a workspace with the build agent; unpacking the published artifact is what makes the configuration and plan available so the deployment stage can run without issues.
Here's a list of the tasks involved in extracting the configuration and plan:
- ExtractFiles task: Unpacks the stored configuration and plan from a gzipped tar archive.
- Download and verify Terraform version: Ensures that the expected Terraform version is available on the deploy agent.
By following these steps, you can successfully extract the Terraform configuration and plan, making it ready for deployment.
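In pipeline terms, the extract step could look roughly like the following. The artifact name (terraform-plan) and the destination folder are assumptions rather than values from the source template:

```
steps:
  - download: current            # fetch the artifact published by the build stage
    artifact: terraform-plan
  - task: ExtractFiles@1
    displayName: 'Extract Terraform configuration and plan'
    inputs:
      archiveFilePatterns: '$(Pipeline.Workspace)/terraform-plan/*.tar.gz'
      destinationFolder: '$(Pipeline.Workspace)/terraform'
      cleanDestinationFolder: true
```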
Connect Variable Group to Key Vault
To connect your Variable Group to Key Vault, navigate to your organization and Team Project in the Azure DevOps portal. You'll need to toggle the "Link secrets from an Azure key vault as variables" option in the Pipelines > Library section.
Select your subscription and Key Vault from the dropdown menus, and press "authorize" to allow Azure DevOps access to your subscription and/or Key Vault. This will enable the connection.
Click the Add button to import your secrets, and select all of the secrets you want to link to the Variable Group. This automatically creates a variable for each one: the variable's name is the Key Vault secret's name, and its value is the secret's value.
Yaml Configuration File
The YAML configuration file is where the magic happens in Azure DevOps Terraform. It defines the trigger that makes your pipeline run automatically when a commit is pushed to the master branch of your repo.
You can also specify which files or directories to include or exclude in the CI build. By default the pipeline runs on a commit to any file, but if your project contains other files unrelated to this Terraform project, you'll want it to run only when changes touch your two Terraform files (variables.tf and main.tf), as in the snippet below.
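A trigger section along these lines does exactly that; adjust the paths if your Terraform files live in a subfolder:

```
trigger:
  branches:
    include:
      - master
  paths:
    include:
      - variables.tf
      - main.tf
```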
Here are the key components of the YAML configuration file:
- Trigger: To make the pipeline automatically run, we need to define a trigger.
- Paths: Specify which files or directories to include or exclude in the CI build.
- Pool: The VM that is going to run your code and deploy your infrastructure.
- Group: The Variable Group containing your secrets and their corresponding values.
- subscription_id: The subscription ID you are deploying your resources to.
- application_id: The application ID of your service principal name.
- tenant_id: The ID of your Azure tenant.
- storage_accounts: The Storage Account that houses your Storage Container that contains your state file.
- blob_storage: The Storage Container that will house your state file.
- state_file: The name of your Terraform state file.
- sa-resource_group: The Resource Group that your Storage Account is in.
These variables are automatically imported from your Variable Group, and you can call them by their name. For example, sa01-azdo-accesskey is the name of the variable in your Variable Group.
CI/CD and Deployment
In Azure DevOps, you can take advantage of Continuous Integration/Continuous Deployment (CI/CD) by making changes to your configuration file, and Azure DevOps + Terraform will handle the rest.
To enable CI/CD, you'll need to set up a pipeline that can automatically run whenever changes are made to your repository. You can do this by creating a pipeline that retrieves the source from Azure Repos and performs Terraform commands such as init, validate, fmt, and plan.
With CI/CD enabled, you can make changes to your configuration file, such as changing a Network Security Rule in your main.tf file, and Azure DevOps + Terraform will automatically run the pipeline to apply the changes.
To create a pipeline, you'll need to create an empty starter pipeline and paste YAML code into it. The pipeline will automatically retrieve the source from Azure Repos and perform Terraform commands such as init, validate, fmt, and plan.
Here are the steps to create a pipeline:
- Create an empty starter pipeline
- Paste YAML code into it
- Configure the pipeline to retrieve the source from Azure Repos
- Configure the pipeline to perform Terraform commands such as init, validate, fmt, and plan
With a pipeline set up, you can make changes to your configuration file and Azure DevOps + Terraform will automatically run the pipeline to apply the changes. This makes it easy to manage your infrastructure and keep it up to date with the latest changes.
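As a minimal sketch of the validation part of such a pipeline, those commands map onto a few script steps like the ones below. Credentials and backend configuration (covered earlier) are omitted for brevity, and the steps assume the Terraform files sit at the repository root:

```
steps:
  - script: terraform init
    displayName: 'Terraform init'
  - script: terraform validate
    displayName: 'Terraform validate'
  - script: terraform fmt -check
    displayName: 'Terraform fmt'
  - script: terraform plan -out=tfplan
    displayName: 'Terraform plan'
```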
Infrastructure as Code
Infrastructure as Code is a key concept in Azure DevOps Terraform. Azure DevOps can integrate with various Infrastructure as Code (IaC) tools like Terraform, Pulumi, and Ansible, as well as vendor-specific tools such as AWS Cloud Formation, Azure Bicep, and Azure Resource Manager (ARM).
To enable IaC support, you need to install the IaC extension. This can be done by accessing the marketplace section in Azure DevOps, which can be found under "Organization Settings >> Extensions >> Browse Marketplace". The IaC extension is required to integrate Azure DevOps with Terraform and other IaC tools.
Azure DevOps integrates with Terraform through the use of pipeline templates and variables. The deploy pipeline uses the same pipeline template described above, which includes steps such as Extracting the Terraform configuration and plan, adding Service Principal credentials to the environment, and Terraform apply.
Pin to Version
Pin to Version is a crucial step in our Terraform pipeline. It ensures consistent results by specifying the exact version of Terraform to use.
Azure DevOps hosted build agents come with a default version of Terraform, but we need to pin it to a specific version to avoid any discrepancies. This is where the pipeline-template's pin-terraform.yml script comes in, which relies on two pipeline variables: Terraform version and hash value.
These variables are defined in a variable group and referenced by our pipeline definition outside this template. By specifying the exact version of Terraform, we can reproduce the same results every time we run the pipeline.
The pin-terraform.yml template itself isn't reproduced here, but conceptually it downloads the pinned Terraform release, verifies the download against the expected hash, and puts that binary on the agent's PATH.
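A minimal sketch of such a step, assuming hypothetical variable names TERRAFORM_VERSION and TERRAFORM_DOWNLOAD_SHA for the pinned version and its SHA-256 hash, might look like this:

```
steps:
  - script: |
      set -euo pipefail
      curl -sSLo terraform.zip \
        "https://releases.hashicorp.com/terraform/$(TERRAFORM_VERSION)/terraform_$(TERRAFORM_VERSION)_linux_amd64.zip"
      # Fail the step if the download does not match the pinned hash.
      echo "$(TERRAFORM_DOWNLOAD_SHA)  terraform.zip" | sha256sum -c -
      unzip -o terraform.zip -d "$(Agent.ToolsDirectory)/terraform"
      # Put the pinned binary first on the PATH for later steps.
      echo "##vso[task.prependpath]$(Agent.ToolsDirectory)/terraform"
    displayName: 'Pin Terraform to a specific version'
```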
By pinning Terraform to a specific version, we can ensure that our infrastructure as code is reproducible and reliable. This is a critical step in maintaining the integrity of our infrastructure, and it's essential to get it right.
Infrastructure as Code (IaC)
Infrastructure as Code (IaC) is a game-changer for managing and provisioning infrastructure. It allows you to define your infrastructure using code, making it easier to version, track, and reproduce your infrastructure configurations.
Azure DevOps integrates with various IaC tools, including Terraform, Pulumi, and Ansible, as well as vendor-specific tools like AWS Cloud Formation and Azure Bicep. To enable IaC support in Azure DevOps, you need to install the IaC extension, which can be done by accessing the marketplace section in Azure DevOps.
A key benefit of IaC is that it allows you to store your infrastructure state in a remote backend, rather than locally. This helps prevent data loss and ensures that your infrastructure configurations are always up-to-date.
Here are the steps to create a remote backend in Terraform:
- Create a storage account in a new Resource Group
- Configure the storage account location and redundancy
- Create a blob container called "tfstate" to hold the tfstate file
- Create the Backend.tf file, specifying the storage account details
By using a remote backend, you can ensure that your infrastructure configurations are always accurate and up-to-date, and that you can easily reproduce your infrastructure environments.
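A Backend.tf along these lines would do the job; the resource group and storage account names are placeholders for the ones created above:

```
terraform {
  backend "azurerm" {
    resource_group_name  = "rg-terraform-state"
    storage_account_name = "sttfstate001"
    container_name       = "tfstate"
    key                  = "terraform.tfstate"
  }
}
```

Hard-coding the values like this is the simplest option; the azurerm.backend.tfvars approach shown earlier keeps them out of the code base instead.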
Terraform provides a way to split plan and apply into separate steps within automation systems, using explicit execution plan files. This allows you to review and approve the plan before applying it to your infrastructure.
Here are the key steps in a Terraform pipeline:
- Shallow Clone
- Add Service Principal credentials to the environment
- Pin Terraform to a specific version
- Terraform init
- Terraform plan
- Publish the Terraform configuration and plan file
Note that the deployment stage of the pipeline involves applying the Terraform plan to your infrastructure, using the Terraform apply command.
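To make that split concrete, a build-stage fragment along the following lines would produce the plan and publish it for the deployment stage. The artifact name, plan file name, and folder layout are assumptions, and the credential variables are the ones described earlier:

```
steps:
  - script: |
      terraform init
      terraform plan -out=tfplan
    displayName: 'Terraform init and plan'
    workingDirectory: 'infrastructure'
    env:
      ARM_CLIENT_ID: $(application_id)
      ARM_CLIENT_SECRET: $(client_secret)
      ARM_TENANT_ID: $(tenant_id)
      ARM_SUBSCRIPTION_ID: $(subscription_id)
  - task: ArchiveFiles@2
    displayName: 'Archive Terraform configuration and plan'
    inputs:
      rootFolderOrFile: 'infrastructure'
      includeRootFolder: false
      archiveType: 'tar'
      tarCompression: 'gz'
      archiveFile: '$(Build.ArtifactStagingDirectory)/terraform.tar.gz'
  - publish: '$(Build.ArtifactStagingDirectory)/terraform.tar.gz'
    artifact: terraform-plan
```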
Frequently Asked Questions
What is the Azure equivalent of Terraform?
The Azure equivalent of Terraform is Bicep, a domain-specific language developed by Microsoft for Infrastructure as Code (IaC). Bicep offers a similar IaC experience to Terraform, but is specifically designed for Azure.
Is Terraform a DevOps tool?
Yes, Terraform is a DevOps tool used for provisioning and managing cloud deployments. It's a valuable resource for teams looking to streamline their cloud infrastructure management.
Is Terraform a CI/CD pipeline?
No, Terraform is not a CI/CD pipeline itself, but it can be used to manage and orchestrate resources within your CI/CD platform.
What is the difference between Terraform and Azure pipeline?
Terraform and Azure Pipeline are two separate tools used for infrastructure deployment, with Terraform focusing on infrastructure as code (IaC) and Azure Pipeline focusing on continuous integration and continuous deployment (CI/CD). Understanding the difference between these tools is crucial for efficient and scalable cloud management.
How do I run a CI/CD pipeline in Azure DevOps?
To run a CI/CD pipeline in Azure DevOps, follow these steps: access Azure DevOps, create a new pipeline, and configure pipeline settings through a series of guided steps.
Sources
- https://jamesrcounts.com/2021/07/07/terraform-pipelines-with-azure-devops.html
- https://www.thelazyadministrator.com/2020/04/28/deploy-and-manage-azure-infrastructure-using-terraform-remote-state-and-azure-devops-pipelines-yaml/
- https://www.mytechramblings.com/posts/how-to-bootstrap-terraform-and-azdo-to-start-deploying-iac-to-azure/
- https://everythingdevops.dev/automating-azure-infrastructure-with-terraform-and-azure-devops/
- https://medium.com/@DiggerHQ/implementing-azure-devops-ci-cd-pipeline-for-terraform-6287fd25d512