Looping Through a Complex Object in Azure DevOps

This is a subject I have found little to no documentation on, so I wanted to share what I did. The parameters you pass into a template can get very long, and when two arrays depend on each other things get complex fast, so, ironically, a complex object makes this simple.

For my example I have a template that processes Virtual Machines (VMs) and their associated disks, if they have any. Therefore I need each VM's name and its disk names. This could be done in many other ways, but it works as a concrete example.

You could pass the parameters in as separate lists of VM names and disk names, as in the example below, but if a VM has no disk, or has several, the indexes would not line up.

- name: virtualMachineNames
  type: object
  default: ['vm1','vm2']
- name: diskNames
  type: object
  default: ['vm1Disk1','vm1Disk2','vm2Disk1']

Instead, the object parameter can be just that: an object. The catch I found is that you can't define a strict schema for it, so I would suggest adding a comment to the file demonstrating an example format. In the example below I have added a default value just for this demo.

- name: virtualMachines
  type: object
  default:
    - vm:
        name: 'vm1'
        disks: ['vm1Disk1','vm1Disk2']
    - vm:
        name: 'vm2'
        disks: ['vm2Disk1']

You can keep all of your properties on one level, which removes the 'vm' part and still allows the looping below, but for readability, and to stay as close as possible to a JSON-style object, I like doing it like this.

We can then loop through these items just as we would a simple list, and access the properties like an object.

- ${{ each vm in parameters.virtualMachines }}:
    - task: AzureCLI@2
      displayName: Check ${{ vm.vm.name }} Disks
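
Because each item is itself an object, you can also nest loops over its properties. Below is a minimal sketch, assuming the nested layout above (the AzureCLI@2 inputs are omitted for brevity):

- ${{ each vm in parameters.virtualMachines }}:
    - ${{ each disk in vm.vm.disks }}:
        - task: AzureCLI@2
          displayName: Check ${{ vm.vm.name }} disk ${{ disk }}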

Automatic Change Checking on Pipeline Stages

Azure DevOps offers several styles of stage approval, such as human intervention, Azure Monitor checks and schedules. However, there is no built-in way to evaluate whether a source location has changed before triggering a stage. This can be done at the pipeline level with path filters, but not at the stage level, where it can sometimes be helpful. I have a PowerShell solution for how this can be done.

The problem came from a pipeline I had that built multiple NuGet packages. Without a human-intervention approval on each stage that publishes a package, the stages would publish every run even if there were no changes. The version numbers in the artifacts would then keep increasing for no reason.

Therefore, we wanted a way to automatically check whether a particular source location has changed before allowing the package to publish. This is done with PowerShell, using Git to get the branch's changes. Git can list the names of the changed files, including their full paths, which we can then inspect.

The code below gets the diffed files and loops through each of them, checking whether it matches the provided path. If it does, it sets a flag to say there have been changes. I added an option to break on the first match, but you could let it keep going and log every changed file. In my case, I just wanted to know whether or not there had been a change.

# List the files changed between the last two commits
$files = $(git diff HEAD HEAD~ --name-only)
$changedFiles = $files -split ' '
$changedCount = $changedFiles.Length
Write-Host("Total changed $changedCount")
$hasBeenChanged = $false
For ($i = 0; $i -lt $changedCount; $i++) {
    $changedFile = $changedFiles[$i]
    # Does the changed file match the provided project path?
    if ($changedFile -like "${{parameters.projectPath}}") {
        Write-Host("CHANGED: $changedFile")
        $hasBeenChanged = $true
        # Optionally stop at the first match
        if ('${{parameters.breakOnChange}}' -eq 'true') {
            break
        }
    }
}

You can then set the outcome as an output variable of the task for use later on, as in this template:

parameters:
  - name: 'projectPath'
    type: string
  - name: 'name'
    type: string
  - name: 'breakOnChange'
    type: string
    default: 'true'
steps:
  - task: PowerShell@2
    name: ${{parameters.name}}
    displayName: 'Code change check for ${{parameters.name}}'
    inputs:
      targetType: 'inline'
      script: |
        $files = $(git diff HEAD HEAD~ --name-only)
        $changedFiles = $files -split ' '
        $changedCount = $changedFiles.Length
        Write-Host("Total changed $changedCount")
        $hasBeenChanged = $false
        For ($i = 0; $i -lt $changedCount; $i++) {
            $changedFile = $changedFiles[$i]
            if ($changedFile -like "${{parameters.projectPath}}") {
                Write-Host("CHANGED: $changedFile")
                $hasBeenChanged = $true
                if ('${{parameters.breakOnChange}}' -eq 'true') {
                    break
                }
            }
        }
        if ($hasBeenChanged -eq $true) {
            Write-Host "##vso[task.setvariable variable=IsContainFile;isOutput=true]True"
        }
        else {
            Write-Host "##vso[task.setvariable variable=IsContainFile;isOutput=true]False"
        }

Once you have that output variable, it can be used throughout your pipeline in conditions on stages, jobs, tasks and more. Here I use it on a stage to check before it runs:

- stage: '${{parameters.projectExtension}}Build'
  displayName: 'Nuget Stage'
  condition: eq(variables['${{parameters.name}}.IsContainFile'], 'true')
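
Note that if the check task runs in an earlier stage than the one being gated, stage conditions reference its output through the dependencies syntax instead. A sketch, assuming the PowerShell task above ran in a stage named 'CheckStage', in a job named 'CheckJob', with the name parameter set to 'CodeCheck' (all made-up names):

- stage: NugetBuild
  displayName: 'Nuget Stage'
  condition: eq(dependencies.CheckStage.outputs['CheckJob.CodeCheck.IsContainFile'], 'True')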

Merge Azure DevOps Pipeline Templates

As mentioned in my previous post about Azure DevOps Local Pipeline Testing, the available method of testing with the Azure DevOps API does not merge YAML templates. Therefore, I have set out to try and solve this issue.

You can view the full PowerShell script on GitHub (https://github.com/PureRandom/AzureDevOpsYamlMerge-ps).

Please feel free to suggest more efficient coding or other use cases that need to be considered.

Below I will walk through what it currently does as of the date of this post. I have tried to cover most, if not all, of the scenarios I have come across, but I am sure there are others that still need solving.

To use the script, you simply pass it the root location where your YAML is stored and the name of the main YAML file. For example:

$yamlPath = "C:\Users\pateman.workspace\CodeRepos\"
$yamlName = "azurepipeline.yml"
$outputPath = processMainPipeline -pipelineYaml $yamlName -rootPath $yamlPath
Write-Host "Parsed into $outputPath"

The script reads through each line and rebuilds the YAML file. If it finds a line containing the template syntax, processing starts, but only if the template path does not contain the '@' symbol, as that is assumed to be a path in a remote repository.

During processing it extracts the parameters being passed to the template. It then reads a copy of the template YAML into a variable and starts rebuilding it. It assumes the parameters are declared at the top of the template, so it extracts those first. If a parameter has been set by the main YAML it does nothing; otherwise it creates the entry, taking the value from the parameter's default property.

Once it has all the parameters, it can find and replace them as it goes through the file. Finally, it inserts this updated version of the template into the same slot where the template reference sat in the main YAML.
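
As a hypothetical illustration (the file name and parameter are made up), take a main YAML that references a template like this:

steps:
  - template: templates/build.yml
    parameters:
      buildConfiguration: 'Release'

The script would replace this reference with the steps from templates/build.yml, with every ${{ parameters.buildConfiguration }} inside them substituted with 'Release', and any parameter not passed in substituted with its default.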

The result is then saved in the root location, and you can use this file with the pipeline-testing API.

Testing Azure DevOps Pipelines Locally

I have tried this many times before, and read plenty of other people asking for the ability to test Azure YAML pipelines locally. Although it is not perfect and has room for improvement, Microsoft has created a way to test your pipeline with a mix of local and remote validation, without having to check in your YAML.

The issue: you make a change to an Azure DevOps pipeline YAML, and to test it you have to check it in, only to find out something small, like a value you passed is not correct, or a linked repository name is misspelt. This gets very annoying as you keep going back and forth until the pipeline finally runs.

A quick, completely local tool is the Visual Studio Code (VS Code) extension 'Azure Pipelines'. It validates your YAML's formatting and, in the Output window, shows you the hierarchy of the YAML, which is the first validation step.

Another handy tool is 'Azure DevOps Snippets', an auto-complete library of the Azure DevOps tasks. It helps with getting the correct names of parameters and tasks, instead of relying on memory.

However, these tools only help with formatting the local YAML, without the parameters you have set in the Azure DevOps Library. What we need is to test remotely against the information in Azure DevOps, and for that Microsoft has provided an API.

This API runs a pipeline stored in Azure DevOps remotely, so you can trigger builds automatically. It also has two handy parameters. The first is 'previewRun', a boolean that determines whether the run is real or just a validation. A preview uses all the information you have set in Azure DevOps and the pipeline, but it does not create a new build or run any code. What it does is resolve the real parameters, link to the real repositories and check that all the data will work. It is also the setting used when you edit the pipeline in Azure DevOps and select 'Validate' from the options menu in the top right.

This is really good, but it would still require you to check in the YAML you have edited, which is why the API has the second parameter, 'yamlOverride'. Here you can pass your newly edited YAML content, and it will be used in place of what is stored in Azure DevOps without overwriting it.

Here is an example POST Body:

POST:

https://dev.azure.com/MyOrg/7e8cdd97-0000-0000-a9ed-8eb0e5c748f5/_apis/pipelines/12/runs
{
   "resources":{
      "pipelines":{
         
      },
      "repositories":{
         "self":{
            "refName":"azure-pipelines"
         }
      },
      "builds":{
         
      },
      "containers":{
         
      },
      "packages":{
         
      }
   },
   "templateParameters":{
      
   },
   "previewRun":true,
   "yamlOverride":"MyYAML'"
}
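
If you are calling the API directly, a minimal PowerShell sketch might look like the following. The organisation, project, pipeline ID, API version and PAT are all assumptions you would replace with your own values:

# Hypothetical sketch: preview a pipeline run with a local YAML override.
$pat = "YOUR_PAT"  # a Personal Access Token with Build (Read & execute) scope
$headers = @{
    Authorization = "Basic " + [Convert]::ToBase64String(
        [Text.Encoding]::ASCII.GetBytes(":$pat"))
}
$body = @{
    previewRun   = $true
    yamlOverride = (Get-Content -Path .\azure-pipelines.yml -Raw)
} | ConvertTo-Json

# previewRun=true validates the run without creating a build
Invoke-RestMethod -Method Post `
    -Uri "https://dev.azure.com/MyOrg/MyProject/_apis/pipelines/12/runs?api-version=6.0-preview.1" `
    -Headers $headers -ContentType "application/json" -Body $body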

To make life a little easier, if you are using PowerShell there is the VSTeam module, which can run this command for you. It takes the ID of the pipeline to run against, which you can get from the pipeline URL in Azure DevOps, plus the file path to the local YAML file and the project name from Azure DevOps.

Test-VSTeamYamlPipeline -PipelineId 12 -FilePath .\azure-pipelines.yml -ProjectName ProjectName


These are great tools and solve a lot of the issues, but not all of them. You still need an internet connection, as the checks run on the remote DevOps instance. Also, I have a repository containing only groups of YAML tasks for reusability; you cannot run tests on these because they are templates rather than complete pipelines, so for those I would still need the check-in-and-run process. Finally, it does not merge local YAML files: if you have a main file with locally linked templates, it will not merge them, so you need to produce a single YAML to run this.

However, I don't think all of these can be solved, as that would require running Azure DevOps locally as an exact copy of your live instance. If you were using Azure DevOps Server you might be able to take a copy and run it in a container locally, but that could be a longer process than the current one.

If you have any other methods to make developing Azure DevOps YAML pipelines easier, please share.

Push Docker Image to ACR without Service Connection in Azure DevOps

If, like me, you use infrastructure as code to deploy your Azure infrastructure, then the Azure DevOps Docker task doesn't work for you. To use that task you need to know what your Azure Container Registry (ACR) is and have it configured so you can push your Docker images to the registry, but with a dynamically created registry you don't know that yet. Here I show how you can still use Azure DevOps to push your images to a dynamic ACR.

In my case I am using Terraform to create the Container Registry, passing in what I want it to be called, for example 'prc-acr', which generates an ACR with the full login server name 'prc-acr.azurecr.io'. This can be used later to send the images to the correct registry.

When using the official Microsoft Docker task, the documentation asks that you have a service connection to your ACR. To create one, though, you need the registry login server name, username and password, which you will not know unless you keep the registry static. Therefore, you can't create the connection in advance to push your images. I did read about methods to create this connection dynamically, but those connections then need managing so they do not get out of control.

To push the image we need only two things: a connection to Azure, and where to push the image. The first we can set up, as we know the tenant and subscription we will be deploying to; the connection can be created by following this guide to connect Azure to Azure DevOps. The other part, where to send the image, we covered earlier when we created the ACR in Terraform and called it 'prc-acr'.

With these details we can use the Azure CLI to push the image to the ACR. First you need to log in to the ACR using:

az acr login --name 'prc-acr'

This connects you to the ACR that was created in Azure. From there you need to tag your image with the ACR login server name, repository name and tag. For example:

docker tag prcImage:latest prc-acr.azurecr.io/prc-registry:latest

This tells Docker where to push the image while you are logged in to the Azure Container Registry, which means we then simply push the image with that tag in the standard Docker way:

docker push prc-acr.azurecr.io/prc-registry:latest

Now this is very easy and simple, as we do not need a connection to the Container Registry, just a connection to the Azure environment. These details can then be used with the Azure CLI task as below, where I am passing in the following parameters (the parameter declarations are sketched after the table):

Parameter Name             | Example Value          | Description
azureServiceConnection     | AzureServiceConnection | Service connection name for Azure
azureContainerRegistryName | prc-acr                | Azure Container Registry name
dockerImage                | prcImage               | Docker image name
tagName                    | latest                 | Docker tag name
registryName               | prc-registry           | ACR repository name
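
For the template to compile, these parameters also need declaring at the top of the file. A minimal sketch (the default value is my assumption):

parameters:
  - name: 'azureServiceConnection'
    type: string
  - name: 'azureContainerRegistryName'
    type: string
  - name: 'dockerImage'
    type: string
  - name: 'tagName'
    type: string
    default: 'latest'
  - name: 'registryName'
    type: string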
steps:
  - task: AzureCLI@2
    displayName: 'Push Docker Image to ACR'
    inputs:
      azureSubscription: ${{parameters.azureServiceConnection}}
      scriptType: 'ps'
      scriptLocation: 'inlineScript'
      inlineScript: |
        az acr login --name ${{parameters.azureContainerRegistryName}}
        docker tag ${{parameters.dockerImage}}:${{parameters.tagName}} ${{parameters.azureContainerRegistryName}}.azurecr.io/${{parameters.registryName}}:${{parameters.tagName}}
        docker push ${{parameters.azureContainerRegistryName}}.azurecr.io/${{parameters.registryName}}:${{parameters.tagName}}
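
Calling the template from a pipeline then looks something like this (the template path is a made-up example):

steps:
  - template: templates/push-docker-image.yml
    parameters:
      azureServiceConnection: 'AzureServiceConnection'
      azureContainerRegistryName: 'prc-acr'
      dockerImage: 'prcImage'
      tagName: 'latest'
      registryName: 'prc-registry'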