Azure DevOps: loop through a complex object

This is a subject I have not found much documentation on, so I wanted to share what I did. The parameters you pass into a template can get very long, and when two arrays depend on each other it can get very complex, so ironically a complex object makes this simple.

For my example I have a template where I would like to process Virtual Machines (VMs) and their associated disks, if they have any. Therefore I need the VM name and the disk names for each. This could be done in many other ways, but it suits this specific example.

You could pass in the parameters as separate lists of VM names and disk names, as in the example below, but then if a VM doesn't have a disk, or has many disks, the indexes would not line up.

- name: virtualMachineNames
  type: object
  default: ['vm1','vm2']
- name: diskNames
  type: object
  default: ['vm1Disk1','vm1Disk2','vm2Disk1']

Instead the object can be just that: an object. The catch I found is that you can't define a strict format for it, so I would suggest adding a comment to the file to demonstrate an example format. In the example below I have added a default value just for this demo.

- name: virtualMachines
  type: object
  default:
    - vm:
        name: 'vm1'
        disks: ['vm1Disk1','vm1Disk2']
    - vm:
        name: 'vm2'
        disks: ['vm2Disk1']

You can keep all of your properties on one level, which removes the 'vm' part, and still do the looping below, but for prettiness, and to keep it as close to a JSON object as possible, I like doing it like this.
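
For reference, a flattened single-level version of the same data would look something like this (just an illustration of the equivalent shape, not taken from a real template):

- name: virtualMachines
  type: object
  default:
    - name: 'vm1'
      disks: ['vm1Disk1','vm1Disk2']
    - name: 'vm2'
      disks: ['vm2Disk1']

With that flat shape the loop would access the properties directly, for example ${{ vm.name }}; with the nested format it is ${{ vm.vm.name }}, as in the loop below.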

We can then loop through these items just like we would a list, and access the properties like an object.

- ${{ each vm in parameters.virtualMachines }}:
    - task: AzureCLI@2
      displayName: Check ${{ vm.vm.name }} Disks
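
If you also need to act on each disk, a nested each can loop through the disks of each VM. Here is a minimal sketch (the script step is just a placeholder of mine, not part of the original template):

- ${{ each vm in parameters.virtualMachines }}:
    - ${{ each disk in vm.vm.disks }}:
        - script: echo "Checking disk ${{ disk }} on ${{ vm.vm.name }}"
          displayName: Check ${{ vm.vm.name }} disk ${{ disk }}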

Push Docker Image to ACR without Service Connection in Azure DevOps

If you are like me and use infrastructure as code to deploy your Azure infrastructure, then the Azure DevOps Docker task doesn't work for you. To use that task you need to know what your Azure Container Registry (ACR) is and have it configured so you can push your Docker images to the registry, but you don't know that yet. Here I show how you can still use Azure DevOps to push your images to a dynamically created ACR.

In my case I am using Terraform to create the Container Registry, and I pass in what I want it to be called, for example 'prc-acr', which generates an ACR with the full login server name 'prc-acr.azurecr.io'. This can then be used later for sending the images to the correct registry.

When using the official Microsoft Docker task, the documentation asks that you have a Service Connection to your ACR. To create that connection you need the registry login server name, username and password, which, unless you keep the registry static, you will not know. Therefore you can't create the connection to then push your images up. I did read about some potential methods to dynamically create this connection, but then we would need to manage those connections so they do not get out of control.

To push the image we need only two things: a connection to Azure and somewhere to push the image. The first we can set up already, as we know the tenant and subscription we will be deploying to; the connection can be created by following this guide to connect Azure to Azure DevOps. The other part, where to send the image, we decided earlier when we created the ACR in Terraform and called it 'prc-acr'.

With these details we can use the Azure CLI to push the image to the ACR. First you need to log in to the ACR using:

az acr login --name 'prc-acr'

This will log you in to the ACR that was created in Azure. From there you will need to tag your image with the ACR login server name, the registry name and a tag. For example:

docker tag prcImage:latest prc-acr.azurecr.io/prc-registry:latest

This tells Docker where to push the image while you are logged in to the Azure Container Registry, which means from there we simply push the image with that tag using the standard Docker command:

docker push prc-acr.azurecr.io/prc-registry:latest

Now this is very easy and simple, as we do not need a connection to the Container Registry, just a connection to the Azure environment. These details can then be used with the Azure CLI task as below, where I am passing in the following parameters.

Parameter Name               Example Value            Description
azureServiceConnection       AzureServiceConnection   Service connection name for Azure
azureContainerRegistryName   prc-acr                  Azure Container Registry name
dockerImage                  prcImage                 Docker image name
tagName                      latest                   Docker tag name
registryName                 prc-registry             Registry (repository) name within the ACR
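
As this step lives in a template, the parameters also need to be declared at the top of the template file. A minimal sketch of that declaration (the defaults are just placeholders I have assumed):

parameters:
  azureServiceConnection: ''
  azureContainerRegistryName: ''
  dockerImage: ''
  tagName: 'latest'
  registryName: ''
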
steps:
  - task: AzureCLI@2
    displayName: 'Push Docker Image to ACR'
    inputs:
      azureSubscription: ${{parameters.azureServiceConnection}}
      scriptType: 'ps'
      scriptLocation: 'inlineScript'
      inlineScript: |
        az acr login --name ${{parameters.azureContainerRegistryName}}
        docker tag ${{parameters.dockerImage}}:${{parameters.tagName}} ${{parameters.azureContainerRegistryName}}.azurecr.io/${{parameters.registryName}}:${{parameters.tagName}}
        docker push ${{parameters.azureContainerRegistryName}}.azurecr.io/${{parameters.registryName}}:${{parameters.tagName}}

Terraform remote backend for cloud and local with Azure DevOps Terraform Task

When working with Terraform you will do a lot of work and testing locally, and for that you do not want to store your state file in remote storage, just locally. However, when deploying you don't want to be converting the configuration at that point, which can get messy when working with Azure DevOps. This is a solution that works for both local development and production deployment with the Azure DevOps Terraform task.

The Terraform task for Azure DevOps from Microsoft DevLabs is https://marketplace.visualstudio.com/items?itemName=ms-devlabs.custom-terraform-tasks

When using this task you configure the cloud provider you will be using as a backend service, such as Azure, Amazon Web Services (AWS) or Google Cloud Platform (GCP). These details are used to configure the backend service that stores the state file, but the Terraform code still needs to declare that backend.

You can see all the different types here: https://www.terraform.io/docs/backends/types/index.html

For this walk through I will use the Azure Resource Manager backend, which uses an Azure Storage Account, as the example, but as mentioned this approach can be used with any provider.

https://www.terraform.io/docs/backends/types/azurerm.html

This would be the standard Terraform configuration you would need for setting up the Backend Service for Azure:

terraform {
  backend "azurerm" {
    resource_group_name  = "StorageAccount-ResourceGroup"
    storage_account_name = "abcd1234"
    container_name       = "tfstate"
    key                  = "prod.terraform.tfstate"
  }
}

When working locally you don't want any of this in your main.tf Terraform file, as it will either error with little detail or store the state in the Azure Storage Account. Therefore, locally you will not add this block.

Instead, during the deployment using Azure DevOps Pipelines, we will inject the configuration by inserting a backend.tf file using PowerShell. Within the file we only need a minimal configuration, as the remaining parameters are supplied by the Terraform task.

We will inject just:

terraform {
  backend "azurerm" {
  }
}

As this needs to be a single-line string, we will stringify it to:

"terraform { `r`n backend ""azurerm"" {`r`n} `r`n }"

Using the PowerShell task we can check whether the file already exists and, if not, create it in the same location as the main.tf file. When Terraform runs it then processes the configuration with a backend service, using the Azure details we have provided in the task.

- powershell: |
    $filename = "backend.tf"
    $path = "${{parameters.terraformPath}}"
    $pathandfile = "$path\$filename"
    # Create backend.tf only if it does not already exist
    if ((Test-Path -Path $pathandfile) -eq $false){
        New-Item -Path $path -Name $filename -ItemType "file" -Value "terraform { `r`n backend ""azurerm"" {`r`n} `r`n }"
    }
  failOnStderr: true
  displayName: 'Create Backend Azure'

- task: TerraformTaskV1@0
  inputs:
    provider: ${{parameters.provider}}
    command: 'init'
    workingDirectory: ${{parameters.terraformPath}}
    backendServiceArm: AzureServiceConnection
    backendAzureRmResourceGroupName: TerraformRg
    backendAzureRmStorageAccountName: TerraformStateAccount
    backendAzureRmContainerName: TerraformStateContainer
    backendAzureRmKey: ***
    environmentServiceNameAzureRM: AzureServiceConnection

With this solution you will be able to work locally with Terraform and also during deployment have a remote Backend Service configured.

If using this in a template, I would suggest wrapping the PowerShell step in an if expression in the pipeline YAML:

- ${{ if eq(parameters.provider, 'azurerm')  }}:
    - powershell: |
        $filename = "backend.tf"
        $path = "${{parameters.terraformPath}}"
        $pathandfile = "$path\$filename"
        if ((Test-Path -Path $pathandfile) -eq $false){
            New-Item -Path $path -Name $filename -ItemType "file" -Value "terraform { `r`n backend ""azurerm"" {`r`n} `r`n }"
        }
      failOnStderr: true
      displayName: 'Create Backend Azure'

Azure DevOps Pipeline Templates and External Repositories

Working with Azure DevOps you can use YAML to create the build and deployment pipelines. To make this easier and more repeatable you can also use templates. However, if you want to use them in multiple repositories you don't want to repeat yourself. There is a method to share them, as I will demo below.

When I lay out my folders for holding the YAML files, I like to mirror how pipelines were built in the UI editor on the Azure DevOps website: that is, with Tasks like DotNetCli, and Group Tasks, which are collections of Tasks that complete a job, like building a .NET Core application.

DevOps
  Tasks
    DotNetCli.yml
  GroupTasks
    BuildDotnetApp.yml

In this method 'BuildDotnetApp.yml' would call the 'DotNetCli.yml' template, and other Group Tasks could call it as well. This makes them more reusable and dynamic, plus easier to upgrade if you need to change a task version or add a new parameter.

This would be the .NET Core CLI task template:

parameters:
  displayName: 'DotNetCoreCLI'
  projects: ''
  arguments: ''
  command: build
  customScript: ''
  continueOnError: false

steps:
- task: DotNetCoreCLI@2
  displayName: ${{parameters.displayName}}
  continueOnError: ${{parameters.continueOnError}}
  inputs:
    publishWebProjects: false
    command: ${{parameters.command}}
    projects: ${{parameters.projects}}
    arguments: ${{parameters.arguments}}
    zipAfterPublish: false
    custom: ${{parameters.customScript}}

And it can then be called as below. Remember that the folder path is relative to the file that calls the template.

steps:
- template: ../Tasks/_DotNetCoreCLI.yml
  parameters:
    displayName: 'Restore .NetCore Projects'
    projects:  '**/MicroServices/**/*.API.csproj'
    arguments: '--packages $(Build.SourcesDirectory)\packages'
    command: restore

- template: ../Tasks/_DotNetCoreCLI.yml
  parameters:
    displayName: 'Build .NetCore Projects'
    projects:  '**/*.csproj'
    arguments: '--configuration $(BuildConfiguration) --output $(Build.SourcesDirectory)\bin\$(BuildConfiguration)'
    command: build

You can read more on using templates in the Azure DevOps Documentation.
https://docs.microsoft.com/en-us/azure/devops/pipelines/process/templates?view=azure-devops

Now we have these great reusable templates, we don't want them sitting in multiple repositories to be maintained multiple times.

The idea here would be to move these files to a single repository, for example 'deployment-files', which will contain all the files to then be referenced later.

The first thing we need to do is reference this new repository in the application's pipeline file. Below is a standard Azure pipeline file for building the .NET application. It has an array of stages, with the first stage being the CI build, a single job and the default agent pool.

stages:
  - stage: 'CIBuild'
    displayName: 'CI  Service'
    jobs:
      - job: CI_Service
        displayName: CI Service
        continueOnError: false
        pool:
          name: Default
        workspace:
          clean: all
        timeoutInMinutes: 120
        cancelTimeoutInMinutes: 2
        steps:

To add a reference to another repository you will need to add the following to the top of the file.

This reference will have an alias name, the type of repository, the repository name and a git branch reference, as below.

resources:
  repositories:
    - repository: DeploymentTemplates #alias name
      type: git #type of repository
      name: deployment-files #repository name
      ref: 'refs/heads/main' #git branch reference

This is making a reference to another Azure DevOps repository in the same organisation, which might work for some setups, but others might have the templates in different systems or with different vendors like GitHub. The other alternative to this method is to get the templates from a Pipeline Artifact after a build, which you can also do by following the instructions in this documentation. https://docs.microsoft.com/en-us/azure/devops/pipelines/process/resources?view=azure-devops&tabs=schema
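
For example, if the templates repository lived in GitHub instead, the repository resource would point at a GitHub service connection. A rough sketch (the organisation and service connection names here are assumptions of mine):

resources:
  repositories:
    - repository: DeploymentTemplates    #alias name
      type: github                       #repository hosted on GitHub
      name: my-org/deployment-files      #organisation/repository name
      endpoint: GitHubServiceConnection  #service connection to GitHub
      ref: 'refs/heads/main'             #git branch reference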

With this reference you have access to the repository, but it doesn't do a git pull as far as I could tell. This might just be for repositories in the same system like Azure DevOps, but it does make things simple, as you're not downloading more resources when running the pipeline.

Now you have access to the repository, you can call the templates in the same way as you normally would, with one slight change. You need to reference the file relative to its location in the deployment-files repository, not the current application's repository. You also need to add '@' and the alias name to the end of the path, so it knows where to get the file from. For our example it would look like this.

steps:
- template: DevOps/Tasks/_DotNetCoreCLI.yml@DeploymentTemplates
  parameters:
    displayName: 'Restore .NetCore Projects'
    projects:  '**/MicroServices/**/*.API.csproj'
    arguments: '--packages $(Build.SourcesDirectory)\packages'
    command: restore

- template: DevOps/Tasks/_DotNetCoreCLI.yml@DeploymentTemplates
  parameters:
    displayName: 'Build .NetCore Projects'
    projects:  '**/*.csproj'
    arguments: '--configuration $(BuildConfiguration) --output $(Build.SourcesDirectory)\bin\$(BuildConfiguration)'
    command: build

Notice I am not using the relative '../Tasks' path, but directly referencing the path 'DevOps/Tasks'. Also I have added '@DeploymentTemplates' to the end of the path.

Here is the full example.

Deployment Files Repository:
Location = ‘DevOps/Tasks’

parameters:
  displayName: 'DotNetCoreCLI'
  projects: ''
  arguments: ''
  command: build
  customScript: ''
  continueOnError: false

steps:
- task: DotNetCoreCLI@2
  displayName: ${{parameters.displayName}}
  continueOnError: ${{parameters.continueOnError}}
  inputs:
    publishWebProjects: false
    command: ${{parameters.command}}
    projects: ${{parameters.projects}}
    arguments: ${{parameters.arguments}}
    zipAfterPublish: false
    custom: ${{parameters.customScript}}



Application Repository:
Location = ‘azurepipeline.yml’

resources:
  repositories:
    - repository: DeploymentTemplates #alias name
      type: git #type of repository
      name: deployment-files #repository name
      ref: 'refs/heads/main' #git branch reference
stages:
  - stage: 'CIBuild'
    displayName: 'CI  Service'
    jobs:
      - job: CI_Service
        displayName: CI Service
        continueOnError: false
        pool:
          name: Default
        workspace:
          clean: all
        timeoutInMinutes: 120
        cancelTimeoutInMinutes: 2
        steps:
        - template: DevOps/Tasks/_DotNetCoreCLI.yml@DeploymentTemplates
          parameters:
            displayName: 'Restore .NetCore Projects'
            projects: '**/MicroServices/**/*.API.csproj'
            arguments: '--packages $(Build.SourcesDirectory)\packages'
            command: restore

        - template: DevOps/Tasks/_DotNetCoreCLI.yml@DeploymentTemplates
          parameters:
            displayName: 'Build .NetCore Projects'
            projects: '**/*.csproj'
            arguments: '--configuration $(BuildConfiguration) --output $(Build.SourcesDirectory)\bin\$(BuildConfiguration)'
            command: build