Automatic Change Checking on Pipeline Stages

Azure DevOps supports several styles of stage approval checks, such as human intervention, Azure Monitor and schedules. However, there is no built-in check to evaluate whether a source location has changed before triggering a Stage. This can be done at the Pipeline level, but not at the Stage level, where it can sometimes be just as helpful. I have a PowerShell solution for how this can be done.

The problem to solve came when I had a Pipeline that built multiple NuGet packages. Without a human intervention approval button on each of the stages that publish a NuGet package, they would publish on every run even when there were no changes. This caused the version numbers in the artifacts to keep increasing for no reason.

Therefore, we wanted a way to automatically check whether a particular source location has changed before allowing the package to be published. This is done with PowerShell, using Git to retrieve the changes on the requested branch. Git can list the names of the changed files, including their full paths, which we can then inspect.

The code below gets the diffed files and loops through each one, checking whether it matches the provided path. If it does, it sets a variable to record that there has been a change. I added an option to break on the first match, but you could let it keep going and log every changed file. In my case, I only wanted to know whether or not anything had changed.

# List the files changed in the last commit
$files = $(git diff HEAD HEAD~ --name-only)
$changedFiles = $files -split ' '
$changedCount = $changedFiles.Length
Write-Host("Total changed $changedCount")
$hasBeenChanged = $false
For ($i = 0; $i -lt $changedCount; $i++) {
    $changedFile = $changedFiles[$i]
    # Flag a change when the file matches the provided project path
    if ($changedFile -like "${{parameters.projectPath}}") {
        Write-Host("CHANGED: $changedFile")
        $hasBeenChanged = $true
        if ('${{parameters.breakOnChange}}' -eq 'true') {
            break
        }
    }
}

You can then set the outcome as an output variable of the task for use later, as in the Template below:

parameters:
  - name: 'projectPath'
    type: string
  - name: 'name'
    type: string
  - name: 'breakOnChange'
    type: string
    default: 'true'
steps:
  - task: PowerShell@2
    name: ${{parameters.name}}
    displayName: 'Code change check for ${{parameters.name}}'
    inputs:
      targetType: 'inline'
      script: |
        # List the files changed in the last commit
        $files = $(git diff HEAD HEAD~ --name-only)
        $changedFiles = $files -split ' '
        $changedCount = $changedFiles.Length
        Write-Host("Total changed $changedCount")
        $hasBeenChanged = $false
        For ($i = 0; $i -lt $changedCount; $i++) {
            $changedFile = $changedFiles[$i]
            if ($changedFile -like "${{parameters.projectPath}}") {
                Write-Host("CHANGED: $changedFile")
                $hasBeenChanged = $true
                if ('${{parameters.breakOnChange}}' -eq 'true') {
                    break
                }
            }
        }
        # Expose the result as an output variable for use later in the pipeline
        if ($hasBeenChanged -eq $true) {
            Write-Host "##vso[task.setvariable variable=IsContainFile;isOutput=true]True"
        }
        else {
            Write-Host "##vso[task.setvariable variable=IsContainFile;isOutput=true]False"
        }

Once you have that output variable, it can be used throughout your pipeline as a condition on Stages, Jobs, Tasks and more. Here I use it on a Stage to check before it runs:

- stage: '${{parameters.projectExtension}}Build'
  displayName: 'Nuget Stage'
  condition: eq(variables['${{parameters.name}}.IsContainFile'], 'true')

Merge Azure DevOps Pipeline Templates

As mentioned in my previous post about Azure DevOps Local Pipeline Testing, the available method of testing with the Azure DevOps API does not merge the YAML Templates. Therefore, I have set out to try and solve this issue.

You can view the full PowerShell script on GitHub: https://github.com/PureRandom/AzureDevOpsYamlMerge-ps

Please feel free to suggest more efficient coding and other use cases that need to be considered.

Below I will walk through what it currently does as of the date of this post. I have tried to consider most, if not all, of the scenarios I have come across, but I am sure there are other scenarios that still need to be handled.

To use the script you simply need to pass it the root location where your YAML is stored and the name of the main YAML file. For example:

$yamlPath = "C:\Users\pateman.workspace\CodeRepos\"
$yamlName = "azurepipeline.yml"
$outputPath = processMainPipeline -pipelineYaml $yamlName -rootPath $yamlPath
Write-Host "Parsed into $outputPath"

This will read through each line and rebuild the YAML file. As it reads, if it finds a line that contains the template syntax then the processing starts, but only if the template path does not contain the '@' symbol, as that is assumed to be a path in a remote repository.

During processing it extracts the parameters that are being passed to the template. It then reads a copy of the template YAML into a variable and starts rebuilding it. It assumes the parameters are declared at the top of the template, so it extracts those first. If a parameter has already been set by the main YAML it is left alone; otherwise the entry is created with the value taken from its default property.

Once it has all the parameters, it can find and replace them as it works through the file. Finally, it inserts this now-updated version of the template into the same slot where the template reference was in the main YAML.
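
As a rough sketch of the idea (not the actual script, which also handles parameter extraction, substitution and indentation), the core detect-and-inline step could look something like the PowerShell below. The regex, the '@' check and the output file name are illustrative assumptions only:

# Minimal sketch only; the full script on GitHub also handles parameters and indentation.
$mainLines = Get-Content (Join-Path $yamlPath $yamlName)
$outputLines = foreach ($line in $mainLines) {
    if ($line -match 'template:\s*(?<path>\S+)' -and $Matches['path'] -notmatch '@') {
        # Local template reference: replace the line with the template's own content
        Get-Content (Join-Path $yamlPath $Matches['path'])
    }
    else {
        $line
    }
}
$outputLines | Set-Content (Join-Path $yamlPath "output_$yamlName")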

This is then saved in the root location, where you can use this file in the pipeline testing API.
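
For instance, the merged output can be fed straight into the preview run covered in the next section (a hypothetical combination, assuming the VSTeam module described there is installed and connected to your organisation):

# Hypothetical end-to-end usage: merge the local templates, then validate the
# single merged file remotely.
$mergedPath = processMainPipeline -pipelineYaml $yamlName -rootPath $yamlPath
Test-VSTeamYamlPipeline -PipelineId 12 -FilePath $mergedPath -ProjectName ProjectName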

Testing Azure DevOps Pipelines Locally

I have tried before, and read a lot of other people asking, for the ability to test Azure YAML Pipelines locally. Although it is not perfect and there is room for improvement, Microsoft has created a way to test your pipeline with a mix of local and remote testing without having to check in your YAML.

The issue to solve: you make a change to an Azure DevOps Pipeline YAML, and to test it you need to check it in, only to find out something small, like a value you passed is not correct, or the linked repository isn't spelt correctly. This gets very annoying as you keep going back and forth until it finally runs.

A quick tool you can use, which is completely local, is the Visual Studio Code (VS Code) extension 'Azure Pipelines'. This will validate your YAML's formatting and, in the Output window, it can show you the hierarchy flow of the YAML, which is the first step of validation.

Another handy tool is 'Azure DevOps Snippets', an auto-complete library for the Azure DevOps tasks. This can help with getting the correct names of parameters and tasks, instead of having to rely on memory.

However, these tools only help with formatting the local YAML, and they know nothing about the parameters you have set in the Azure DevOps Library. What we need is to test the YAML remotely against the information in Azure DevOps, and for that Microsoft has provided an API.

This API can be used to run a pipeline that is stored in Azure DevOps remotely, so you can trigger builds via the API. It also has two handy parameters. The first is 'previewRun', a boolean that determines whether to run the pipeline for real or only validate the run. A preview uses all the information you have set in Azure DevOps and in the pipeline, but it does not create a new build or run any code. What it does is resolve the parameters, link to the real repositories and check that all the data will work. It is also the setting used when you edit the pipeline in Azure DevOps and select 'Validate' from the options menu in the top right.

This is really good, but it would still require you to check in the YAML you have edited, which is why there is the second parameter, 'yamlOverride'. In this parameter you can pass your newly edited YAML content, and it will be used in place of what is stored in Azure DevOps without overwriting it.

Here is an example POST Body:

POST:

https://dev.azure.com/MyOrg/7e8cdd97-0000-0000-a9ed-8eb0e5c748f5/_apis/pipelines/12/runs
{
   "resources":{
      "pipelines":{
         
      },
      "repositories":{
         "self":{
            "refName":"azure-pipelines"
         }
      },
      "builds":{
         
      },
      "containers":{
         
      },
      "packages":{
         
      }
   },
   "templateParameters":{
      
   },
   "previewRun":true,
   "yamlOverride":"MyYAML'"
}
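
If you want to call this endpoint yourself, a minimal PowerShell sketch is below. The organisation, project, pipeline ID, API version and Personal Access Token are placeholders and assumptions to swap for your own values:

# Rough sketch of calling the preview run endpoint directly; values are placeholders.
$pat = "<personal-access-token>"
$token = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$pat"))
$headers = @{ Authorization = "Basic $token" }

$body = @{
    previewRun   = $true
    yamlOverride = (Get-Content .\azure-pipelines.yml -Raw)
} | ConvertTo-Json

Invoke-RestMethod -Method Post `
    -Uri "https://dev.azure.com/MyOrg/MyProject/_apis/pipelines/12/runs?api-version=6.0-preview.1" `
    -Headers $headers `
    -ContentType "application/json" `
    -Body $body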

To make life a little easier, if you are using PowerShell there is the VSTeam module you can install to run this call from PowerShell. It takes the ID of the pipeline you want to run against, which you can get from the pipeline URL in Azure DevOps, the file path to the local YAML file, and the project name from Azure DevOps.

Test-VSTeamYamlPipeline -PipelineId 12 -FilePath .\azure-pipelines.yml -ProjectName ProjectName


These are great tools and solve a lot of the issues, but not all of them. You still need an internet connection, as the tests run on the remote DevOps instance. Also, I have a repository purely containing groups of YAML tasks for better reusability; you cannot run tests on these as they are not complete pipelines, only templates, so I would still need to go through the check-in and run process. Finally, it does not merge local YAML files. If you have a main file with locally linked templates it will not merge them; you need to create a single YAML to run this.

However, I don't think you can solve all of these, as it would require running Azure DevOps locally as an exact copy of your live instance. If you were using Azure DevOps Server you might be able to take copies and run them in a container locally, but it could be a longer process compared to the current one.

If you have any other methods to make developing Azure DevOps YAML Pipelines easier then please share.

Push Docker Image to ACR without Service Connection in Azure DevOps

If, like me, you use infrastructure as code to deploy your Azure infrastructure, then the Azure DevOps Docker task doesn't work for you. To use that task you need to know what your Azure Container Registry (ACR) is and have it configured so you can push your Docker images to it, but at this point you don't know that yet. Here I show how you can still use Azure DevOps to push your images to a dynamically created ACR.

In my case I am using Terraform to create the Container Registry, passing in what I want it to be called. For example, 'prc-acr' will generate an ACR with the full login server name 'prc-acr.azurecr.io'. This can then be used later for sending the images to the correct registry.
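
If you want to confirm the generated login server once the registry exists, the Azure CLI can return it (assuming you are already logged in with az login):

az acr show --name prc-acr --query loginServer --output tsv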

When using the official Microsoft Docker task, the documentation asks that you have a Service Connection to your ACR. To create one you need the registry login server name, username and password, which, unless you keep the registry static, you will not know in advance. Therefore, you can't create the connection to push your images up. I did read about potential methods to dynamically create this connection, but then you need to manage those connections so they do not get out of control.

To push the image we only need two things: a connection to Azure and a destination for the image. The first we can set up, as we know the tenant and subscription we will be deploying to; the connection can be created by following the guide to connecting Azure to Azure DevOps. The destination we mentioned earlier, when we created the ACR in Terraform and called it 'prc-acr'.

With these details we can use the Azure CLI to push the image to the ACR. First you need to log in to the ACR using:

az acr login --name 'prc-acr'

This connects you to the ACR that was created in Azure. From there you need to tag your image with the ACR login server name, the registry name and a tag. For example:

docker tag prcImage:latest prc-acr.azurecr.io/prc-registry:latest

This tells Docker where to push the image while you are logged in to the Azure Container Registry, which means from there we simply need to push the image with that tag in the standard Docker way:

docker push prc-acr.azurecr.io/prc-registry:latest

Now this is very easy and simple, as we do not need a connection to the Container Registry, just a connection to the Azure environment. These details can then be used with the Azure CLI task as below, where I am passing in the following parameters:

Parameter Name              Example Value           Description
azureServiceConnection      AzureServiceConnection  Service Connection name to Azure
azureContainerRegistryName  Prc-acr                 Azure Container Registry Name
dockerImage                 prcImage                Docker Image Name
tagName                     Latest                  Docker Tag Name
registryName                Prc-registry            ACR Registry Name
steps:
  - task: AzureCLI@2
    displayName: 'Push Docker Image to ACR'
    inputs:
      azureSubscription: ${{parameters.azureServiceConnection}}
      scriptType: 'ps'
      scriptLocation: 'inlineScript'
      inlineScript: |
        az acr login --name ${{parameters.azureContainerRegistryName}}
        docker tag ${{parameters.dockerImage}}:${{parameters.tagName}} ${{parameters.azureContainerRegistryName}}.azurecr.io/${{parameters.registryName}}:${{parameters.tagName}}
        docker push ${{parameters.azureContainerRegistryName}}.azurecr.io/${{parameters.registryName}}:${{parameters.tagName}}

Terraform remote backend for cloud and local with Azure DevOps Terraform Task

When working with Terraform you will do a lot of work and testing locally, so you do not want to store your state file in remote storage; you just want it stored locally. However, at deployment time you don't want to be converting the configuration, which can get messy when working with Azure DevOps. This is a solution that works for both local development and production deployment with the Azure DevOps Terraform task.

The Terraform task in Azure DevOps, published by Microsoft DevLabs, is https://marketplace.visualstudio.com/items?itemName=ms-devlabs.custom-terraform-tasks

When using this task you configure the cloud provider you will be using as a backend service, such as Azure, Amazon Web Services (AWS) or Google Cloud Platform (GCP). These details are used to configure the backend service that stores the state file, but they require the Terraform code to declare the backend.

You can see all the different types here: https://www.terraform.io/docs/backends/types/index.html

For this walk through I will use the Azure Resource Manager, which uses an Azure Storage Account, as the example, but as mentioned this can be used in any provider.

https://www.terraform.io/docs/backends/types/azurerm.html

This would be the standard Terraform configuration you would need for setting up the Backend Service for Azure:
 

terraform {
  backend "azurerm" {
    resource_group_name  = "StorageAccount-ResourceGroup"
    storage_account_name = "abcd1234"
    container_name       = "tfstate"
    key                  = "prod.terraform.tfstate"
  }
}

When working locally you don't want any of this in your main.tf Terraform file, or it will either error with no detail or add the state to the Azure Storage Account. Therefore, locally you will not add this.
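
Locally, with no backend block present, Terraform falls back to its default local backend, so the usual commands work unchanged and the state is written to a terraform.tfstate file alongside your configuration:

terraform init
terraform plan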

Instead, during deployment with Azure DevOps Pipelines, we will inject the configuration by inserting a backend.tf file using PowerShell. Within the file we only need a minimal configuration, as the remaining parameters are supplied by the Terraform task.

We will inject just:

terraform {
  backend "azurerm" {
  }
}

Which, as a single-line string, we will need to stringify to:

"terraform { `r`n backend ""azurerm"" {`r`n} `r`n }"

Using the PowerShell task we can check whether the file already exists and, if not, create it in the same location as the main.tf file. When Terraform runs it will then process the backend block using the Azure details we have provided in the task.

- powershell: |
    $filename = "backend.tf"
    $path = "${{parameters.terraformPath}}"
    $pathandfile = "$path\$filename"
    if ((Test-Path -Path $pathandfile) -eq $false){
        New-Item -Path $path -Name $filename -ItemType "file" -Value "terraform { `r`n backend ""azurerm"" {`r`n} `r`n }"
    }
  failOnStderr: true
  displayName: 'Create Backend Azure'

- task: TerraformTaskV1@0
  inputs:
    provider: ${{parameters.provider}}
    command: 'init'
    workingDirectory: ${{parameters.terraformPath}}
    backendServiceArm: AzureServiceConnection
    backendAzureRmResourceGroupName: TerraformRg
    backendAzureRmStorageAccountName: TerraformStateAccount
    backendAzureRmContainerName: TerraformStateContainer
    backendAzureRmKey: ***
    environmentServiceNameAzureRM: AzureServiceConnection

With this solution you will be able to work locally with Terraform and also during deployment have a remote Backend Service configured.

I would suggest using the Pipeline YAML to put an if expression around the PowerShell step if you are using this in a template:

- ${{ if eq(parameters.provider, 'azurerm')  }}:
    - powershell: |
        $filename = "backend.tf"
        $path = "${{parameters.terraformPath}}"
        $pathandfile = "$path\$filename"
        if ((Test-Path -Path $pathandfile) -eq $false){
            New-Item -Path $path -Name $filename -ItemType "file" -Value "terraform { `r`n backend ""azurerm"" {`r`n} `r`n }"
        }
      failOnStderr: true
      displayName: 'Create Backend Azure'