Testing Azure DevOps Pipelines Locally

I have tried many times, and read plenty of other people asking, for the ability to test Azure YAML Pipelines locally. Although it is not perfect and there is room for improvement, Microsoft has created a way to test your pipeline with some local and some remote validation, without having to check in your YAML.

The issue to solve is this: you make a change to an Azure DevOps Pipeline's YAML, and to test it you need to check it in, only to find out something small is wrong, like a value you passed is not correct or a linked repository isn't spelt correctly. This gets very annoying, as you keep going back and forth until the pipeline is finally able to run.

A quick tool you can use, which is completely local, is the Visual Studio Code (VS Code) extension 'Azure Pipelines'. This will validate your YAML's formatting, and in the Output window to the left it can show you the hierarchy flow of the YAML, which is the first step of validation.

Another handy tool is 'Azure DevOps Snippets', an auto-complete library for the Azure DevOps tasks. This can help you get the correct names of tasks and parameters, instead of relying on a solid memory.

However, these tools only help with formatting the local YAML; they know nothing about the parameters you have set in the Azure DevOps Library. What we need is to test the pipeline remotely against the information in Azure DevOps, and for that Microsoft has provided an API.

This API can be used to run a pipeline that is stored in Azure DevOps, so you can trigger builds remotely via the API. It also has two handy parameters. One is 'previewRun', a boolean that determines whether this should run the pipeline for real or just validate the run. A preview uses all the information you have set in Azure DevOps and the pipeline, but it will not create a new build or run any code. What it will do is resolve the parameters, link to the real repositories and check that all the data will work. It is also the setting used when you edit the pipeline in Azure DevOps and select 'Validate' from the options menu in the top right.

This is really good, but it would still require you to check in the YAML you have edited, which is why the API also has the parameter 'yamlOverride'. In this parameter you can pass your newly edited YAML content, and it will be used in place of what is stored in Azure DevOps without overwriting it.

Here is an example request:

POST https://dev.azure.com/MyOrg/7e8cdd97-0000-0000-a9ed-8eb0e5c748f5/_apis/pipelines/12/runs

{
   "resources": {
      "pipelines": {},
      "repositories": {
         "self": {
            "refName": "azure-pipelines"
         }
      },
      "builds": {},
      "containers": {},
      "packages": {}
   },
   "templateParameters": {},
   "previewRun": true,
   "yamlOverride": "MyYAML"
}
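
If you want to try this without any extra tooling, here is a minimal PowerShell sketch of the call, assuming you have a Personal Access Token in $pat; the organization, project and pipeline ID below are placeholders you would swap for your own:

# Build a basic auth header from an empty username and the PAT
$headers = @{ Authorization = "Basic " + [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$pat")) }

# Send the local YAML as the override and request a preview (validation) run
$body = @{
    previewRun   = $true
    yamlOverride = (Get-Content .\azure-pipelines.yml -Raw)
} | ConvertTo-Json

Invoke-RestMethod -Method Post `
    -Uri "https://dev.azure.com/MyOrg/MyProject/_apis/pipelines/12/runs?api-version=6.0-preview.1" `
    -Headers $headers -Body $body -ContentType "application/json"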

To make life a little easier, if you are using PowerShell there is the VSTeam module you can install to run this command through PowerShell. It takes the ID of the pipeline you want to run against, which you can get from the pipeline URL in Azure DevOps, the file path to the local YAML file, and the project name from Azure DevOps.

Test-VSTeamYamlPipeline -PipelineId 12 -FilePath .\azure-pipelines.yml -ProjectName ProjectName


These are great tools and solve a lot of the issues, but not all of them. You would still need an internet connection, as the validation runs on the remote DevOps instance. Also, I have a repository purely containing groups of YAML tasks for better reusability, and you cannot run tests on these as they are not complete pipelines, only templates, so for those I would still need the check-in-and-run process. Finally, it does not merge local YAML files: if you have a main file and templates linked locally, it will not merge them; you need to create a single YAML to run this.

However, I don’t think you can solve all of these, as it would require running Azure DevOps locally as an exact copy of your live instance. If you were using Azure DevOps Server then you might be able to take a copy and run it in a container locally, but that could be a longer process compared to the current one.

If you have any other methods to make developing Azure DevOps YAML Pipelines easier then please share.

Create Identity in Google Cloud Platform and GSuite

Compared to some other cloud providers, creating an identity via code in GCP is a little fragmented if you are using GSuite as your identity store. The Google Cloud Platform holds your users' identity references and permissions, while the other system, GSuite, holds the users' authentication security. This can also make the documentation feel fragmented and not so easy to follow. Hence this post sticking them together, showing how I used C# .NET Core to create a Google Cloud Platform identity using their SDK.

This part is standard for any SDK access to GCP: you need a service account for authentication. For this you will need to create a Service Account in GCP, which needs to be associated with a project. You can create it against the project where you are deploying or, to keep things separate, as I would recommend, you can create a Service Management Project. This is just a standard project, but you can use it to keep all the SDK activity in one place while the usage activity happens on the other project.

Create a Project

  1. Go to the Manage resources page in the Cloud Console.
  2. On the Select organization drop-down list at the top of the page, select the organization in which you want to create a project. If you are a free trial user, skip this step, as this list does not appear.
  3. Click Create Project.
  4. In the New Project window that appears, enter a project name and select a billing account as applicable. A project name can contain only letters, numbers, single quotes, hyphens, spaces, or exclamation points, and must be between 4 and 30 characters.
  5. Enter the parent organization or folder in the Location box. That resource will be the hierarchical parent of the new project.
  6. When you’re finished entering new project details, click Create.

Reference: https://cloud.google.com/resource-manager/docs/creating-managing-projects#console
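
If you prefer the command line, the equivalent with the gcloud CLI is something like the below, with the project ID, name and organization ID being placeholders:

gcloud projects create prc-service-mgmt --name="PRC Service Management" --organization=123456789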

Create a Service Account

  1. In the Cloud Console, go to the Service accounts page.
  2. Select a project (your new Service Management Project).
  3. Click Create service account.
  4. Enter a service account name to display in the Cloud Console.
    The Cloud Console generates a service account ID based on this name. Edit the ID if necessary. You cannot change the ID later.
  5. Optional: Enter a description of the service account.
  6. If you do not want to set access controls now, click Done to finish creating the service account.
    To set access controls now, click Create and continue to the next step.
  7. Optional: Choose one or more IAM roles to grant to the service account on the project.
  8. When you are done adding roles, click Continue.
  9. Optional: In the Service account users role field, add members that can impersonate the service account.
  10. Optional: In the Service account admins role field, add members that can manage the service account.
  11. Click Done to finish creating the service account.

Reference: https://cloud.google.com/iam/docs/creating-managing-service-accounts#iam-service-accounts-create-console
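
Again, if you prefer the CLI, the service account can be created with something like this, with the account name and project being placeholders:

gcloud iam service-accounts create prc-svc-mgmt --display-name="PRC Service Management" --project=prc-service-mgmt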

You could then get more specific with the Identity and Access Management (IAM) permissions, but to keep it simple you can just grant the Service Account the 'Owner' and 'Project IAM Admin' roles on the new Service Management Project. This will give the Service Account access to create the identities; for more detail on the permissions, you can look them up here: https://cloud.google.com/iam/docs/permissions-reference

Next we need the Service Account to have access to create the identities in GSuite. The steps below set up the Service Account in GCP, ready for access to be granted in the GSuite Admin portal.

  1. Locate the newly-created service account in the table. Under Actions, click the three-dot More menu at the end, then Edit.
  2. In the service account details, click the down arrow to expand Show domain-wide delegation, then ensure the Enable G Suite Domain-wide Delegation checkbox is checked.
  3. If you haven’t yet configured your app’s OAuth consent screen, you must do so before you can enable domain-wide delegation. Follow the on-screen instructions to configure the OAuth consent screen, then repeat the above steps and re-check the checkbox.
  4. Click Save to update the service account and return to the table of service accounts. A new column, Domain-wide delegation, can be seen. Click View Client ID to obtain and make a note of the client ID.

Reference: https://developers.google.com/admin-sdk/directory/v1/guides/delegation#create_the_service_account_and_credentials

Now we connect these together by giving the Service Account access in the GSuite Admin portal.

  1. From your Google Workspace domain’s Admin console, go to Main menu > Security > API controls.
  2. In the Domain wide delegation pane, select Manage Domain Wide Delegation.
  3. Click Add new.
  4. In the Client ID field, enter the client ID obtained from the service account creation steps above.
  5. In the OAuth Scopes field, enter a comma-delimited list of the scopes required for your application (for a list of possible scopes, see Authorize requests).
    For example, if you require domain-wide access to Users and Groups enter: https://www.googleapis.com/auth/admin.directory.user, https://www.googleapis.com/auth/admin.directory.group
  6. Click Authorize.

Reference: https://developers.google.com/admin-sdk/directory/v1/guides/delegation#delegate_domain-wide_authority_to_your_service_account

At this point our Service Account has access to the GCP account/project and also the access needed in GSuite to create the identities. Therefore, we can start getting into the code to create these accounts.

To start with the SDK we need the Service Account's JSON key, which you can get by:

  1. In the Cloud Console, go to the Service Accounts page.
  2. Click Select a project, choose a project, and click Open.
  3. Find the row of the service account that you want to create a key for. In that row, click the More button, and then click Create key.
  4. Select a Key type and click Create.

Reference: https://cloud.google.com/iam/docs/creating-managing-service-account-keys#iam-service-account-keys-create-console
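
The CLI equivalent would be something like the below, substituting in your own service account email:

gcloud iam service-accounts keys create key.json --iam-account=prc-svc-mgmt@prc-service-mgmt.iam.gserviceaccount.com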

Once you have downloaded the JSON file, we can move on to the authentication in C#.

You will need to install the Google.Apis.Auth NuGet package in your project. There are then multiple different methods depending on how you are storing your JSON key, but in my example we are injecting the JSON straight into the method to get a GoogleCredential. The method we need to call is:

GoogleCredential.FromJson(gcpAuthenticationJson);

With gcpAuthenticationJson being the JSON string from the downloaded file. We also need to add scopes to the access request, which we can chain on like below with the scopes required:

GoogleCredential.FromJson(gcpAuthenticationJson)
    .CreateScoped(new List<string>
    {
        "https://www.googleapis.com/auth/admin.directory.user",
        "https://www.googleapis.com/auth/admin.directory.group",
        "https://www.googleapis.com/auth/admin.directory.user.security"
    });

Now although we have given the Service Account all the permissions it requires to do the job, the requests need to be executed as a GSuite admin. We of course cannot have the admin logging in every time, so we need the code to act as the admin. We can do this by adding an additional call to the method chain:

GoogleCredential.FromJson(gcpAuthenticationJson)
    .CreateScoped(new List<string>
    {
        "https://www.googleapis.com/auth/admin.directory.user",
        "https://www.googleapis.com/auth/admin.directory.group",
        "https://www.googleapis.com/auth/admin.directory.user.security"
    })
    .CreateWithUser(adminEmail);

We can of course make this a little more flexible so it can be reused for other authentications, so this is the method I would recommend:

/// <summary>
/// Get the GCP credential via the Service Account
/// https://cloud.google.com/docs/authentication/production
/// </summary>
/// <param name="authJson">Downloaded authentication JSON</param>
/// <param name="apiScopes">Custom API scopes</param>
/// <param name="adminEmail">User email address to impersonate</param>
/// <returns>GCP credentials</returns>
public GoogleCredential GetGcpCredential(string authJson, List<string> apiScopes = null, string adminEmail = "")
{
    var googleCredential = GoogleCredential.FromJson(authJson)
        .CreateScoped(apiScopes ?? new List<string>
        {
            "https://www.googleapis.com/auth/cloud-platform"
        });

    if (!string.IsNullOrEmpty(adminEmail))
        googleCredential = googleCredential.CreateWithUser(adminEmail);

    return googleCredential;
}

From this we can then create users with the SDK using this simple bit of code:

// Requires the Google.Apis.Admin.Directory NuGet package
using var directoryService = new DirectoryService(new BaseClientService.Initializer
{
    HttpClientInitializer = GetGcpCredential(authJson, apiScopes, adminEmail)
});

// 'userData' is a directory_v1.Data.User populated with the new user's details
var request = directoryService.Users.Insert(userData);
return await request.ExecuteAsync();
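
For completeness, here is a hypothetical example of the user data you would pass in, using the Directory API's User type; the names, email and password are of course placeholders:

// From Google.Apis.Admin.Directory.directory_v1.Data
var userData = new User
{
    PrimaryEmail = "jane.doe@example.com",
    Name = new UserName { GivenName = "Jane", FamilyName = "Doe" },
    Password = "An-Initial-Password-1",
    // Force the new user to set their own password on first login
    ChangePasswordAtNextLogin = true
};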

Push Docker Image to ACR without Service Connection in Azure DevOps

If you are like me and using infrastructure as code to deploy your Azure infrastructure, then the Azure DevOps Docker task doesn't work for you. To use this task you need to know what your Azure Container Registry (ACR) is and have it configured as a service connection to be able to push your Docker images to the registry, but you don't know that yet. Here I show how you can still use Azure DevOps to push your images to a dynamic ACR.

In my case I am using Terraform to create the Container Registry, passing in what I want it to be called. For example 'prc-acr', which will generate an ACR with the full login server name 'prc-acr.azurecr.io'. This can then be used later for sending the images to the correct registry.

When using the official Microsoft Docker task, the documentation asks that you have a Service Connection to your ACR. To create that connection you need the registry's login server name, username and password, which, unless you keep the registry static, you will not know. Therefore, you can't create the connection to then push your images up. I did read about some potential methods to create this connection dynamically, but then we would need to manage those connections so they do not get out of control.

To push the image we need only two things: a connection to Azure and where to push the image. The first we can set up, as we know the tenant and subscription we will be deploying to; the connection can be created by following this guide to connecting Azure to Azure DevOps. The other part, where to send the image, we covered earlier when we created the ACR in Terraform, calling it 'prc-acr'.

With these details we can use the Azure CLI to push the image to the ACR. First you need to log in to the ACR using:

az acr login --name 'prc-acr'

This will connect you to the ACR that was created in Azure. From there you will need to tag your image with the ACR login server name, followed by the repository name and tag. For example:

docker tag prcImage:latest prc-acr.azurecr.io/prc-registry:latest

This tells Docker where to push the image while you are logged in to the Azure Container Registry, which means from there we simply need to push the image with that tag in the standard Docker way:

docker push prc-acr.azurecr.io/prc-registry:latest

Now this is very easy and simple, as we do not need a connection to the Container Registry, just a connection to the Azure environment. These details can then be used with the Azure CLI task as below, where I am passing in the following parameters.

Parameter Name             | Example Value          | Description
azureServiceConnection     | AzureServiceConnection | Service connection name to Azure
azureContainerRegistryName | prc-acr                | Azure Container Registry name
dockerImage                | prcImage               | Docker image name
tagName                    | latest                 | Docker tag name
registryName               | prc-registry           | ACR repository name

steps:
  - task: AzureCLI@2
    displayName: 'Push Docker Image to ACR'
    inputs:
      azureSubscription: ${{parameters.azureServiceConnection}}
      scriptType: 'ps'
      scriptLocation: 'inlineScript'
      inlineScript: |
        az acr login --name ${{parameters.azureContainerRegistryName}}
        docker tag ${{parameters.dockerImage}}:${{parameters.tagName}} ${{parameters.azureContainerRegistryName}}.azurecr.io/${{parameters.registryName}}:${{parameters.tagName}}
        docker push ${{parameters.azureContainerRegistryName}}.azurecr.io/${{parameters.registryName}}:${{parameters.tagName}}
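
For this to work as a template, the matching parameters also need to be declared at the top of the YAML; a minimal sketch would be:

parameters:
  - name: azureServiceConnection
    type: string
  - name: azureContainerRegistryName
    type: string
  - name: dockerImage
    type: string
  - name: tagName
    type: string
    default: latest
  - name: registryName
    type: string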