Automate Security for Azure Container Registry

From March 2021 Azure is deprecating the Container Settings in Azure Web Apps, moving you to the new Deployment Center. This looks very nice, but there is a change that is going to force you to have weaker security: you must have the Admin Credentials enabled. There is, however, something you can do to stay secure.

Before this change, the best practice was to turn Admin Credentials off in your Azure Container Registry (ACR). This is because the admin account is a single user, so you can’t tell different people’s interactions apart while they use this one account, and, as the name says, it has admin rights, meaning it can do anything.

To make this secure, you would disable the Admin Credentials, and then anything trying to connect to the registry would need a Service Principal set up or a role assigned with the correct access. In this post I describe one of these approaches, where you set up the ACR credentials in the App Settings so the Azure Web App has access to pull the image. It uses Terraform, but it covers the basic idea as well:
Use Terraform to connect ACR with Azure Web App

Now when you create a new Azure Web App or go to an existing one you will be presented with an error like this:

[Screenshot: the Docker tab of the Web App creation blade, where the Azure Container Registry options fail to load with the error:]

“Cannot perform credential operations for <registry> as admin user is disabled. Kindly enable admin user as per docs: https://docs.microsoft.com/en-us/azure/container-registry/container-registry-authentication#admin-account”

This is the message you now get telling you to turn on the Admin Credentials, but what makes it confusing is that the documentation it points you to says:

“The admin account is designed for a single user to access the registry, mainly for testing purposes. “

ref: https://docs.microsoft.com/en-us/azure/container-registry/container-registry-authentication#admin-account

However, it seems we need to play by their conflicting rules, so we need to work with this and make it as secure as we can.
Turning on this setting can be insecure, but what we can do is rotate the keys. As you can tell from the UI, you don’t need to enter these credentials yourself, as the authentication is handled behind the scenes.

Therefore, we can regenerate the passwords without affecting the connection between the ACR and the resource. This is not perfect, but it does mean that if anyone gets hold of your password, or uses it, it will expire as quickly as you choose to rotate it.

To do this we can use the Azure CLI. With the CLI you can use the ACR commands to trigger a password regeneration:

az acr credential renew -n MyRegistry --password-name password
az acr credential renew -n MyRegistry --password-name password2

ref: https://docs.microsoft.com/en-us/cli/azure/acr/credential?view=azure-cli-latest#az_acr_credential_renew

We can then schedule this and tie it to the ACR by using the ACR Tasks. These can run ACR commands and be put on a repeating timer to trigger when you wish.
ref: https://docs.microsoft.com/en-us/cli/azure/acr/task?view=azure-cli-latest#az_acr_task_create

Unfortunately, the ‘acr’ task context doesn’t contain the ‘credential’ command, and if you run the full ‘az’ CLI command it says you need to log in.

You could put the commands into a Dockerfile and run them using the Azure CLI image, but this seems overkill. I would suggest using alternatives to run the commands, like setting up an automated Azure DevOps pipeline to run them in a CLI task.
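As a sketch of what such a pipeline task could run, the script below loops over both admin password names and renews each one. It assumes the task is already logged in to Azure; the registry name is a placeholder, and a DRY_RUN guard (defaulting to on here) prints the commands instead of executing them, so the logic can be checked without an Azure connection.

```shell
#!/bin/sh
# Rotate both ACR admin passwords. REGISTRY_NAME is a placeholder;
# DRY_RUN defaults to 1 so the script only prints the az commands it would run.
# Set DRY_RUN=0 in the pipeline task to perform the renewal for real.
REGISTRY_NAME="MyRegistry"
DRY_RUN="${DRY_RUN:-1}"

renew_all() {
  for password_name in password password2; do
    if [ "$DRY_RUN" = "1" ]; then
      echo "az acr credential renew -n $REGISTRY_NAME --password-name $password_name"
    else
      az acr credential renew -n "$REGISTRY_NAME" --password-name "$password_name"
    fi
  done
}

result=$(renew_all)
echo "$result"
```

Scheduling is then just a matter of putting this on a timer trigger in the pipeline.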

Merge Azure DevOps Pipeline Templates

As mentioned in my previous post about Azure DevOps Local Pipeline Testing, the available method of testing with the Azure DevOps API doesn’t have the feature to merge the YAML templates. Therefore, I have set out to try and solve this issue.

You can view the full PowerShell script on GitHub (https://github.com/PureRandom/AzureDevOpsYamlMerge-ps).

Please feel free to advise on more efficient coding and suggestions of other use cases that need to be considered.

Below I will walk through what it currently does as of the date of this post. I have tried to consider most, if not all, of the scenarios that I have come across, but I am sure there are other cases that will need solving.

To use the script you simply need to pass it the root location of where your YAML is stored and the name of the main YAML file. For Example:

$yamlPath = "C:\Users\pateman.workspace\CodeRepos\"
$yamlName = "azurepipeline.yml"
$outputPath = processMainPipeline -pipelineYaml $yamlName -rootPath $yamlPath
Write-Host "Parsed into $outputPath"

This will read through each line and rebuild the YAML file. As it reads, if it finds a line that contains the template syntax then the processing starts, but only if the template path does not contain the ‘@’ symbol, as that is assumed to be a path in a remote repository.

In the processing it will extract the parameters that are being passed to the template. Then, after getting a copy of the template YAML into a variable, it will start reading this file and rebuilding it. First, it assumes the parameters are set at the top, so it extracts them. If a parameter has been set by the main YAML then it does nothing; otherwise it creates the entry and takes the value from the default property.

Once it has all the parameters, it can find and replace them as it goes through the file. Finally, it inserts this now updated version of the template into the same slot where the template reference was in the main YAML.

This is then saved in the root location, where you can use this file in the pipeline testing API.
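The actual script is PowerShell, but the core find-and-replace step can be sketched as follows. The template file, parameter name and value here are made up for illustration; the real script discovers them from the main YAML.

```shell
#!/bin/sh
# Sketch of the parameter find-and-replace step: substitute an Azure DevOps
# template parameter token with the value passed from the main pipeline.
# The file name and the "environment" parameter are hypothetical.
cat > template.yml <<'EOF'
steps:
  - script: echo ${{ parameters.environment }}
EOF

# Replace the token "${{ parameters.environment }}" with the value "prod".
sed 's/\${{ parameters\.environment }}/prod/g' template.yml > template.resolved.yml

cat template.resolved.yml
```

The resolved content is what gets spliced back into the main YAML in place of the template reference.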

Create Identity in Google Cloud Platform and GSuite

Compared to some other cloud providers, creating an identity via code in GCP is a little fragmented if you’re using GSuite for your identity storage. Google Cloud Platform holds your users’ identity references and permissions, while the other system, GSuite, holds the users’ authentication security. This can also make the documentation feel a little fragmented and not so easy to follow. Hence this post sticks them together, showing how I used C# .NET Core to create a Google Cloud Platform identity using their SDK.

This part is standard for any SDK access to GCP, which is to have a service account for authentication. For this you will need to create a Service Account in GCP, which needs to be associated with a project. You can create it against the project where you are deploying, or, to keep things separate (as I would recommend), you can create a Service Management Project. This is just a standard project, but you can use it to keep all the SDK activity on it while the usage activity happens on the other project.

Create a Project

  1. Go to the Manage resources page in the Cloud Console.
  2. On the Select organization drop-down list at the top of the page, select the organization in which you want to create a project. If you are a free trial user, skip this step, as this list does not appear.
  3. Click Create Project.
  4. In the New Project window that appears, enter a project name and select a billing account as applicable. A project name can contain only letters, numbers, single quotes, hyphens, spaces, or exclamation points, and must be between 4 and 30 characters.
  5. Enter the parent organization or folder in the Location box. That resource will be the hierarchical parent of the new project.
  6. When you’re finished entering new project details, click Create.

Reference: https://cloud.google.com/resource-manager/docs/creating-managing-projects#console
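If you prefer the CLI, the console steps above correspond roughly to a single `gcloud projects create` command. The sketch below only builds and prints the command so it can be reviewed before running; the project ID and organization ID are hypothetical placeholders.

```shell
#!/bin/sh
# Build (but do not run) the gcloud equivalent of the console steps above.
# Project ID and organization ID are placeholders for your own values.
PROJECT_ID="prc-service-mgmt"
ORG_ID="123456789012"

CREATE_PROJECT="gcloud projects create $PROJECT_ID --name=\"Service Management\" --organization=$ORG_ID"
echo "$CREATE_PROJECT"
```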

Create a Service Account

  1. In the Cloud Console, go to the Service accounts page.
  2. Select a project (your new Service Management Project).
  3. Click Create service account.
  4. Enter a service account name to display in the Cloud Console.
    The Cloud Console generates a service account ID based on this name. Edit the ID if necessary. You cannot change the ID later.
  5. Optional: Enter a description of the service account.
  6. If you do not want to set access controls now, click Done to finish creating the service account.
    To set access controls now, click Create and continue to the next step.
  7. Optional: Choose one or more IAM roles to grant to the service account on the project.
  8. When you are done adding roles, click Continue.
  9. Optional: In the Service account users role field, add members that can impersonate the service account.
  10. Optional: In the Service account admins role field, add members that can manage the service account.
  11. Click Done to finish creating the service account.

Reference: https://cloud.google.com/iam/docs/creating-managing-service-accounts#iam-service-accounts-create-console

You could then get more specific with the Identity and Access Management (IAM) permissions, but to keep it simple you just need to grant the Service Account the ‘Owner’ and ‘Project IAM Admin’ roles on the new Service Management Project. This will give the Service Account access to create the identities; for more detail on the permissions, you can use this link to look them up: https://cloud.google.com/iam/docs/permissions-reference
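As a CLI sketch of the service account creation and role grant described above, the commands could look like the following. The project ID and account name are placeholders, and the commands are only printed here for review rather than executed.

```shell
#!/bin/sh
# gcloud sketch of creating the service account and granting it a role on the
# Service Management Project. All names are hypothetical placeholders.
PROJECT_ID="prc-service-mgmt"
SA_NAME="prc-identity-sa"

CREATE_SA="gcloud iam service-accounts create $SA_NAME --project=$PROJECT_ID --display-name=\"Identity SDK access\""
GRANT_ROLE="gcloud projects add-iam-policy-binding $PROJECT_ID --member=serviceAccount:$SA_NAME@$PROJECT_ID.iam.gserviceaccount.com --role=roles/owner"

printf '%s\n%s\n' "$CREATE_SA" "$GRANT_ROLE"
```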

Next we need the Service Account to have access to create the identities in GSuite. The steps below set up the Service Account in GCP ready to be given access in the Admin portal of GSuite.

  1. Locate the newly-created service account in the table. Under Actions, click the 3 dots at the end, then Edit.
  2. In the service account details, click the down arrow next to Show domain-wide delegation, then ensure the Enable G Suite Domain-wide Delegation checkbox is checked.
  3. If you haven’t yet configured your app’s OAuth consent screen, you must do so before you can enable domain-wide delegation. Follow the on-screen instructions to configure the OAuth consent screen, then repeat the above steps and re-check the checkbox.
  4. Click Save to update the service account, and return to the table of service accounts. A new column, Domain-wide delegation, can be seen. Click View Client ID, to obtain and make a note of the client ID.

Reference: https://developers.google.com/admin-sdk/directory/v1/guides/delegation#create_the_service_account_and_credentials

Now we connect these together by giving the Service Account access in the GSuite Admin Portal.

  1. From your Google Workspace domain’s Admin console, go to Main menu > Security > API controls.
  2. In the Domain wide delegation pane, select Manage Domain Wide Delegation.
  3. Click Add new.
  4. In the Client ID field, enter the client ID obtained from the service account creation steps above.
  5. In the OAuth Scopes field, enter a comma-delimited list of the scopes required for your application (for a list of possible scopes, see Authorize requests).
    For example, if you require domain-wide access to Users and Groups enter: https://www.googleapis.com/auth/admin.directory.user, https://www.googleapis.com/auth/admin.directory.group
  6. Click Authorize.

Reference: https://developers.google.com/admin-sdk/directory/v1/guides/delegation#delegate_domain-wide_authority_to_your_service_account

At this point our Service Account has access to the GCP Account/Project and also has the access needed for the GSuite to create the identities. Therefore, we can start getting into the code to create these accounts.

To start with the SDK we need the Service Account’s JSON Key, which you can get by:

  1. In the Cloud Console, go to the Service Accounts page.
  2. Click Select a project, choose a project, and click Open.
  3. Find the row of the service account that you want to create a key for. In that row, click the More button, and then click Create key.
  4. Select a Key type and click Create.

Reference: https://cloud.google.com/iam/docs/creating-managing-service-account-keys#iam-service-account-keys-create-console
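Equivalently, the JSON key can be created with the `gcloud` CLI. The sketch below just builds and prints the command; the service account email and output path are placeholders.

```shell
#!/bin/sh
# gcloud sketch for downloading a JSON key for the service account.
# The service account email and output path are hypothetical.
SA_EMAIL="prc-identity-sa@prc-service-mgmt.iam.gserviceaccount.com"

KEY_CMD="gcloud iam service-accounts keys create ./gcp-sdk-key.json --iam-account=$SA_EMAIL"
echo "$KEY_CMD"
```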

Once you have downloaded the JSON File we can move to the Authentication in C#.

You will need to install the Google.Apis.Auth NuGet package in your project. There are then multiple different methods depending on how you are storing your JSON Key, but in my example we inject the JSON straight into the method, for which we need the GoogleCredential. The method we need to call is:

GoogleCredential.FromJson(gcpAuthenticationJson);

With gcpAuthenticationJson being the JSON string from the downloaded file. We also need to add scopes to the access request, which we can chain on as below, with these scopes required:

GoogleCredential.FromJson(gcpAuthenticationJson)
    .CreateScoped(new List<string>
    {
        "https://www.googleapis.com/auth/admin.directory.user",
        "https://www.googleapis.com/auth/admin.directory.group",
        "https://www.googleapis.com/auth/admin.directory.user.security"
    });

Now, although we have given the Service Account all the permissions it requires to do the job, it needs to be executed by a GSuite Admin. We of course cannot have the admin logging in every time, therefore we just need the code to act as the admin. We can do this by chaining an additional call onto the method:

GoogleCredential.FromJson(gcpAuthenticationJson)
    .CreateScoped(new List<string>
    {
        "https://www.googleapis.com/auth/admin.directory.user",
        "https://www.googleapis.com/auth/admin.directory.group",
        "https://www.googleapis.com/auth/admin.directory.user.security"
    }).CreateWithUser(adminEmail);

We can of course make this a little more flexible so it can be reused for other authentications, so this is the method I would recommend:

/// <summary>
/// Get the GCP Credential via the Service Account
/// https://cloud.google.com/docs/authentication/production
/// </summary>
/// <param name="authJson">Downloaded Authentication JSON</param>
/// <param name="apiScopes">Custom API Scopes</param>
/// <param name="adminEmail">User Email Address to Impersonate</param>
/// <returns>GCP Credentials</returns>
public GoogleCredential GetGcpCredential(string authJson, List<string> apiScopes = null, string adminEmail = "")
{
    var googleCredential = GoogleCredential.FromJson(authJson)
        .CreateScoped(apiScopes ?? new List<string>
        {
            "https://www.googleapis.com/auth/cloud-platform"
        });

    if (!string.IsNullOrEmpty(adminEmail))
        googleCredential = googleCredential.CreateWithUser(adminEmail);

    return googleCredential;
}

From this we can then create users with the SDK using this simple bit of code:

var directoryService = new DirectoryService(new BaseClientService.Initializer
{
    HttpClientInitializer = GetGcpCredential(authJson, apiScopes, userEmail)
});

try
{
    var request = directoryService.Users.Insert(userData);
    return await request.ExecuteAsync();
}
finally
{
    directoryService.Dispose();
}

Push Docker Image to ACR without Service Connection in Azure DevOps

If you are like me and using infrastructure as code to deploy your Azure infrastructure, then the Azure DevOps Docker task doesn’t work for you. To use this task you need to know what your Azure Container Registry (ACR) is and have it configured to be able to push your docker images to the registry, but you don’t know that yet. Here I show how you can still use Azure DevOps to push your images to a dynamic ACR.

In my case I am using Terraform to create the Container Registry and with that I pass what I want it to be called. For example ‘prc-acr’ which will generate an ACR with the full login server name ‘prc-acr.azurecr.io’. This can then be used later for sending the images to the correct registry.

When using the official Microsoft Docker task, the documentation asks that you have a Service Connection to your ACR. To create one, though, you need the registry login server name, username and password, which, unless you keep the registry static, you will not know. Therefore, you can’t create the connection to then push your images up. I did read about some potential methods to dynamically create this connection, but then we would need to manage those connections so they do not get out of control.

To push the image we need only two things: a connection to Azure and where to push the image. The first we can set up, as we know the tenant and subscription we will be deploying to; the connection can be created by following this guide to connect Azure to Azure DevOps. The other part, where to send the image, we covered earlier when we created the ACR in Terraform, calling it ‘prc-acr’.

With these details we can use the Azure CLI to push the image to the ACR. First you need to log in to the ACR using:

az acr login --name 'prc-acr'

This will connect you to the ACR that was created in Azure. From there you will need to tag your image with the ACR login server name, registry name and tag. For example:

docker tag prcImage:latest prc-acr.azurecr.io/prc-registry:latest

This will then tell docker where to push the image to while you are logged in to the Azure Container Registry, which means from there we simply just need to push the image with that tag in the standard docker method:

docker push prc-acr.azurecr.io/prc-registry:latest

Now this is very easy and simple, as we do not need a connection to the Container Registry, just a connection to the Azure environment. These details can then be used with the Azure CLI task as below, where I am passing in the following parameters.

Parameter Name                Example Value            Description
azureServiceConnection        AzureServiceConnection   Service Connection name to Azure
azureContainerRegistryName    Prc-acr                  Azure Container Registry Name
dockerImage                   prcImage                 Docker Image Name
tagName                       Latest                   Docker Tag Name
registryName                  Prc-registry             ACR Registry Name
steps:
  - task: AzureCLI@2
    displayName: 'Push Docker Image to ACR'
    inputs:
      azureSubscription: ${{parameters.azureServiceConnection}}
      scriptType: 'ps'
      scriptLocation: 'inlineScript'
      inlineScript: |
        az acr login --name ${{parameters.azureContainerRegistryName}}
        docker tag ${{parameters.dockerImage}}:${{parameters.tagName}} ${{parameters.azureContainerRegistryName}}.azurecr.io/${{parameters.registryName}}:${{parameters.tagName}}
        docker push ${{parameters.azureContainerRegistryName}}.azurecr.io/${{parameters.registryName}}:${{parameters.tagName}}

Where to find Azure Tenant ID in Azure Portal?

Some of the documentation about Azure from Microsoft can be confusing or missing, including the answer to a question I often get asked: ‘Where is the Tenant ID?’. Below I give 3 locations (there are probably more) where to find the Tenant ID in the portal. I have also added how to get the Tenant ID with the Azure CLI.

The Tenant is basically the Azure AD instance where you can store and configure users, apps and other security permissions. This is also referred to as the Directory in some of the menu items and documentation. Within the Tenant you can only have a single Azure AD instance, but you can have many Subscriptions associated with it. You can get further information from here: https://docs.microsoft.com/en-us/microsoft-365/enterprise/subscriptions-licenses-accounts-and-tenants-for-microsoft-cloud-offerings?view=o365-worldwide

Azure Portal

Azure Active Directory

If you use the Portal menu, once signed in, then you can select the ‘Azure Active Directory’ option.

This will load the Overview page with the summary of your Directory including the Tenant ID.

You can also go to this URL when signed in: https://portal.azure.com/#blade/Microsoft_AAD_IAM/ActiveDirectoryMenuBlade/Overview


Azure AD App Registrations

When configuring external applications or internal products to talk to Azure, you can use App Registrations, also known as Service Principal accounts. When using the REST API or the Azure SDK you will need the Tenant ID for authentication, so within the registered app you also get the Tenant ID.

When in the Azure AD, select the ‘App registrations’ from the side menu. Find or add your App then select it.

From the App Overview page you can then find the Tenant ID or also known here as the Directory ID.

Switch Directory

If you have multiple Tenants then you can switch between the Tenants you have access to by switching Directory.

You can do this by selecting your Avatar/Email from the top right of the Portal, which should open a dropdown with your details. There will then be a link called ‘Switch directory’; by clicking this you can see all the directories you have access to, see what your default directory is, and switch which one you are on.

As mentioned before, Directory is another word used by Azure for Tenant, so the ID you see in this view is not just the Directory ID but also the Tenant ID.


Azure CLI

From the Azure CLI you can get almost every bit of information that is in the Portal, depending on your permissions.

If you don’t have the CLI then you can install it here: https://docs.microsoft.com/en-us/cli/azure/install-azure-cli

You can sign into the CLI by running:

az login

More information on logging in can be found here: https://docs.microsoft.com/en-us/cli/azure/authenticate-azure-cli

Once you are signed into the Azure CLI, you can use the command below to get a list of the Subscriptions you have access to, which in turn will report back the Tenant ID. Remove everything after ‘--query’ to get the full details.

(https://docs.microsoft.com/en-us/cli/azure/account?view=azure-cli-latest#az_account_list)

 az account list --query '[].{TenantId:tenantId}'

You can also get the current Tenant ID used to authenticate to Azure by running this command, and again remove everything after the ‘--query’ to get the full information.

(https://docs.microsoft.com/en-us/cli/azure/account?view=azure-cli-latest#az_account_get_access_token)

 az account get-access-token --query tenant --output tsv