SSH to Hyper-V Virtual Machine using SSH.NET without IP Address

I have used the .NET Core NuGet package SSH.NET to SSH into machines a few times; it is a simple, slick and handy tool to have. However, you cannot SSH into a Virtual Machine (VM) in Hyper-V that easily without some extra fiddling to get an exposed IP address.

With the standard SSH client you can run the simple command:

ssh User@Host

This command can take many other options, but let's keep it simple.

If your VM has an IP address assigned to its network adapter then this stays very simple: use the machine's user as the user and the IP address as the host.

However, not every VM will have an IP address, and in that situation you cannot connect to it like this.

You can, though, if you use the Hyper-V Console CLI (HVC). It is normally installed when you enable the Hyper-V feature in Windows and can be found at 'C:\Windows\System32\hvc.exe'. This tool lets you communicate with your VM over the Hyper-V bus that sits between your local machine and the VM.

To use this tool you can run the same SSH command but with the HVC prefix:

hvc ssh User@Host

However, instead of the host you pass the Hyper-V VM name, which you can get from Hyper-V Manager or from PowerShell running as Administrator:

Get-VM

This is great to use in the terminal, but it doesn't let you make a standard SSH connection, which is what the SSH.NET tool uses. I have not come across a tool that does this from .NET Core yet, so I have come up with the following solution.

What we can do to accomplish this is port forwarding, where traffic sent to a port on the local machine is routed through the Hyper-V connection to a port on the VM.

Below we forward port 2222 on the local machine to port 22, the standard SSH port, on the VM, using the correct username and VM name.

hvc.exe ssh -L 2222:Localhost:22 User@VmName

Once this is running you can use the standard SSH command, but with the port parameter and 'Localhost' as the host, the same as when you SSH into your own local machine.

ssh User@Localhost -p 2222

To get this working in C#, I would recommend using SSH keys to remove the password requirement (a password would need manual entry), and then the PowerShell NuGet package to run the HVC command like below:

# HVC lives in the Windows system directory
$SystemDirectory = [Environment]::SystemDirectory
cd $SystemDirectory
# Forward local port 2222 to the VM's port 22, authenticating with the key
hvc.exe ssh -L 2222:Localhost:22 User@VmName -i "KeyPath" -FN
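
Pulling this together, here is a minimal sketch of the whole flow in PowerShell; the key path, VM name, user and the location of the SSH.NET assembly are all assumptions for illustration:

# Start the HVC port forward in the background (same command as above)
$SystemDirectory = [Environment]::SystemDirectory
Start-Process -FilePath "$SystemDirectory\hvc.exe" -ArgumentList 'ssh','-L','2222:Localhost:22','User@VmName','-i','KeyPath','-FN' -WindowStyle Hidden

# Load the SSH.NET assembly (from the NuGet package) and connect over the tunnel
Add-Type -Path ".\Renci.SshNet.dll"
$key = New-Object Renci.SshNet.PrivateKeyFile("KeyPath")
$client = New-Object Renci.SshNet.SshClient("localhost", 2222, "User", [Renci.SshNet.PrivateKeyFile[]]@($key))
$client.Connect()
Write-Host $client.RunCommand("hostname").Result
$client.Disconnect()

In C# the SSH.NET half is the same idea: create an SshClient against 'localhost' and port 2222 with the PrivateKeyFile, then Connect and run commands.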

Automate Security for Azure Container Registry

From March 2021 Azure is deprecating the Container Settings in Azure Web Apps, moving you to the new Deployment Center. This looks very nice, but one change is going to force weaker security on you: it requires the Admin Credentials to be enabled. There is, however, something you can do to stay secure.

Before this change, the best-practice method was to turn Admin Credentials off in your Azure Container Registry (ACR). The admin user is a single account, so you cannot tell different people's interactions apart when they use it, and, as the name says, it has admin rights, meaning it can do anything.

To make this secure, you would disable the Admin Credentials, and anything trying to connect to the registry would use a Service Principal or a role assignment with the correct access. In the post below I describe some of these approaches, including setting the ACR credentials in the App Settings so the Azure Web App has access to pull the image. It uses Terraform, but the basic idea carries over.
Use Terraform to connect ACR with Azure Web App
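
As a minimal sketch of the role-based approach (assuming you already have a service principal; the registry name and app ID below are placeholders), you can grant pull-only access with the Azure CLI:

# Get the resource ID of the registry, then grant the service principal AcrPull
$acrId = az acr show --name MyRegistry --query id --output tsv
az role assignment create --assignee "00000000-0000-0000-0000-000000000000" --role AcrPull --scope $acrId

AcrPull only allows image pulls, so unlike the admin account it cannot push to or change the registry.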

Now when you create a new Azure Web App, or go to an existing one, you will be presented with an error like this:

[Screenshot: the Azure Web App 'Docker' tab with Azure Container Registry selected as the image source, showing the error:]

“Cannot perform credential operations for … as admin user is disabled. Kindly enable admin user as per docs: https://docs.microsoft.com/en-us/azure/container-registry/container-registry-authentication#admin-account”

This is the message now telling you to turn on the Admin Credentials, but what makes it confusing is that the documentation it points you to says:

“The admin account is designed for a single user to access the registry, mainly for testing purposes. “

ref: https://docs.microsoft.com/en-us/azure/container-registry/container-registry-authentication#admin-account

However, it seems we need to play by their conflicting rules, so we have to work with this and make it as secure as we can.

Turning this setting on can be insecure, but what we can do is rotate the keys. As you can tell from the UI, you never need to enter these credentials yourself, as the authentication is handled behind the scenes.

Therefore, we can regenerate the passwords without affecting the connection between the ACR and the resource. This is not perfect, but it does mean that if anyone gets hold of your password, it will expire very quickly, or as quickly as you choose.

To do this we can use the Azure CLI, whose ACR commands include one to trigger a password regeneration.

az acr credential renew -n MyRegistry --password-name password
az acr credential renew -n MyRegistry --password-name password2

ref: https://docs.microsoft.com/en-us/cli/azure/acr/credential?view=azure-cli-latest#az_acr_credential_renew

We can then schedule this and tie it to the ACR by using ACR Tasks, which can run ACR commands on a repeating timer that triggers whenever you wish.
ref: https://docs.microsoft.com/en-us/cli/azure/acr/task?view=azure-cli-latest#az_acr_task_create
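
On paper, such a task might look like the following sketch (the task name and cron schedule are placeholders):

az acr task create --registry MyRegistry --name rotate-credentials --context /dev/null --schedule "0 3 * * 0" --cmd "acr credential renew -n MyRegistry --password-name password"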

Unfortunately, the 'acr' command set available inside a task does not contain the 'credential' command, and if you run the full 'az' CLI command instead it says you need to log in.

You could put the commands into a Dockerfile and run them using the Azure CLI image, but this seems overkill. I would suggest an alternative way to run the commands, such as an automated Azure DevOps pipeline that runs them in an Azure CLI task, as sketched below.
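
Here is a sketch of that pipeline option; the service connection name and the schedule are assumptions:

schedules:
- cron: "0 3 * * 0"
  displayName: Weekly ACR credential rotation
  branches:
    include:
    - main
  always: true

pool:
  vmImage: 'ubuntu-latest'

steps:
- task: AzureCLI@2
  inputs:
    azureSubscription: 'MyServiceConnection' # assumed service connection name
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    inlineScript: |
      az acr credential renew -n MyRegistry --password-name password
      az acr credential renew -n MyRegistry --password-name password2

The 'always: true' setting makes the schedule fire even when nothing has changed in the repository, which is what you want for a rotation job.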

Merge Azure DevOps Pipeline Templates

As mentioned in my previous post about Azure DevOps Local Pipeline Testing, the available method of testing with the Azure DevOps API does not have the ability to merge YAML templates. Therefore, I have set out to try and solve this issue.

You can view the full PowerShell script on GitHub (https://github.com/PureRandom/AzureDevOpsYamlMerge-ps).

Please feel free to advise on more efficient coding and suggestions of other use cases that need to be considered.

Below I will walk through what it currently does as of the date of this post. I have tried to consider most, if not all, of the scenarios I have come across, but I am sure there are other cases that still need solving.

To use the script you simply pass it the root location where your YAML is stored and the name of the main YAML file. For example:

# Root folder containing the YAML files, and the main pipeline file name
$yamlPath = "C:\Users\pateman.workspace\CodeRepos\"
$yamlName = "azurepipeline.yml"
# Returns the path of the merged output file
$outputPath = processMainPipeline -pipelineYaml $yamlName -rootPath $yamlPath
Write-Host "Parsed into $outputPath"

The script reads through each line and rebuilds the YAML file. As it reads, if it finds a line that contains the template syntax then processing starts, but only if the template path does not contain the '@' symbol, as that is assumed to be a path in a remote repository.

During processing it extracts the parameters being passed to the template. Then, having read a copy of the template YAML into a variable, it starts reading that file and rebuilding it. It assumes the parameters are declared at the top, so it extracts those first. If a parameter has already been set by the main YAML it does nothing; otherwise it creates the entry and takes the value from the parameter's default property.

Once it has all the parameters, it can find and replace them as it goes through the file. Finally, it inserts this now-updated version of the template into the same slot where the template reference sat in the main YAML.

The merged result is then saved in the root location, and you can use this file with the pipeline-testing API.
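
To illustrate with a made-up example, given a main YAML that references a template:

# azurepipeline.yml
steps:
- template: templates/build.yml
  parameters:
    buildConfiguration: 'Release'

# templates/build.yml
parameters:
- name: buildConfiguration
  default: 'Debug'

steps:
- script: dotnet build --configuration ${{ parameters.buildConfiguration }}

the merged output would be a single file along these lines, with the template inlined and the parameter value substituted:

steps:
- script: dotnet build --configuration Release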

Testing Azure DevOps Pipelines Locally

I have tried and tried before, and have read a lot of other people asking, for the ability to test Azure YAML Pipelines locally. Although it is not perfect and there is room for improvement, Microsoft has created a way to test your pipeline with some local testing and some remote testing without having to check in your YAML.

The issue to solve is this: you make a change to an Azure DevOps Pipeline's YAML, and to test it you need to check it in, just to find out something small, like a value you passed is not correct, or a linked repository isn't spelt correctly. This gets very annoying, as you keep going back and forth until the pipeline is finally able to run.

A quick tool you can use, which is completely local, is the Visual Studio Code (VS Code) extension 'Azure Pipelines'. This will validate your YAML's formatting, and in the output window it can show you the hierarchy of the YAML, which is the first step of validation.

Another handy tool is 'Azure DevOps Snippets', an auto-complete library for the Azure DevOps tasks. This can help you get the correct names of parameters and tasks, instead of needing a solid memory.

However, these tools only help with formatting the local YAML, and they know nothing of the parameters you have set in the Azure DevOps Library. What we need is to test the pipeline remotely against the information in Azure DevOps, and for that Microsoft has provided an API.

This API can be used to run a pipeline that is stored in Azure DevOps remotely, so you can trigger builds via the API. It also has two handy parameters. One is 'previewRun', a boolean that determines whether the call runs the pipeline for real or just validates the run. A validation uses all the information set in Azure DevOps and the pipeline, but it will not create a new build or run any code. What it will do is resolve the correct parameters, link to the other real repositories and check that all the data will work. It is also the setting used when you edit the pipeline in Azure DevOps and select 'Validate' from the options menu in the top right.

This is really good, but it would still require you to check in the YAML you have edited, which is why there is the 'yamlOverride' parameter in the API. In this parameter you can pass your newly edited YAML content, and it will be used in place of what is stored in Azure DevOps without overwriting it.

Here is an example POST Body:

POST:

https://dev.azure.com/MyOrg/7e8cdd97-0000-0000-a9ed-8eb0e5c748f5/_apis/pipelines/12/runs
{
   "resources":{
      "pipelines":{},
      "repositories":{
         "self":{
            "refName":"azure-pipelines"
         }
      },
      "builds":{},
      "containers":{},
      "packages":{}
   },
   "templateParameters":{},
   "previewRun":true,
   "yamlOverride":"MyYAML"
}
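
If you want to call this endpoint directly from PowerShell, a minimal sketch looks like the following; the organisation, project, pipeline ID and the PAT (which needs build permissions) are all placeholders:

# Authenticate with a Personal Access Token (Basic auth with an empty username)
$pat = "YourPersonalAccessToken"
$headers = @{ Authorization = "Basic " + [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$pat")) }

# Send the local YAML as the override and ask for a preview run only
$body = @{
    previewRun = $true
    yamlOverride = (Get-Content -Path .\azure-pipelines.yml -Raw)
} | ConvertTo-Json
$uri = "https://dev.azure.com/MyOrg/MyProject/_apis/pipelines/12/runs?api-version=6.0-preview.1"
$result = Invoke-RestMethod -Uri $uri -Method Post -Headers $headers -ContentType "application/json" -Body $body

# A preview run should return the fully expanded YAML rather than queueing a build
Write-Host $result.finalYaml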

To make life even easier, there is the VSTeam PowerShell module you can install to run this call for you. It takes the ID of the pipeline you want to run against, which you can get from the pipeline URL in Azure DevOps, plus the file path to the local YAML file and the project name from Azure DevOps.

Test-VSTeamYamlPipeline -PipelineId 12 -FilePath .\azure-pipelines.yml -ProjectName ProjectName


These are great tools and solve a lot of the issues, but not all of them. You still need an internet connection, as everything runs against the remote DevOps instance. Also, I have a repository purely containing groups of YAML tasks for better reusability; you cannot run tests on these, as they are not complete pipelines, only templates, so I would still need to do the check-in-and-run process. Finally, it does not merge local YAML files: if you have a main file with locally linked templates it will not merge them, so you need to create a single YAML to run this.

However, I don't think you can solve all of these, as it would require running Azure DevOps locally as an exact copy of your live instance. If you were using Azure DevOps Server you might be able to take copies and run them in a container locally, but that could be a longer process compared to the current one.

If you have any other methods to make developing Azure DevOps YAML Pipelines easier then please share.

How to authenticate with Fortify Security with PowerShell

Fortify, a security scanning tool for code, has some great features but also some limiting ones. Therefore I sought to use its open REST API to expand on its functionality and enhance how we are using it within the DevOps pipeline. Step one, of course, was to find out how to authenticate with Fortify before making requests to its services.

Fortify does have a Swagger page listing the endpoints it offers, but it doesn't detail the authentication endpoint. It does have documentation on how to authenticate, but it is not detailed enough for easy use.

That is why I thought I would expand on the details and show others how to authenticate easily, using PowerShell as the chosen language.

Fortify Swagger

The API layer from Fortify provides the Swagger definitions. If you choose your assigned Data Centre from the link below, you can then simply add '/swagger' to the end of its API URL to see the definitions, for example https://api.emea.fortify.com/swagger/ui/index

Data Centre URL: https://emea.fortify.com/Docs/en/Content/Additional_Services/API/API_About.htm

Authentication

As mentioned before, Fortify does document how to authenticate with the API here: https://emea.fortify.com/Docs/en/index.htm#Additional_Services/API/API_Auth.htm%3FTocPath%3DAPI%7C_____3

The first thing is to find out what details you require for the request, as mentioned in the documentation. We require the calling Data Centre URL, the one used above for the Swagger definitions, suffixed with '/oauth/token', e.g. 'https://api.emea.fortify.com/oauth/token'.

We then need the scope of what we would like to request. The scopes are detailed in the documentation at the link above, and each Swagger definition also specifies, under 'Implementation Notes', which scope it requires. This value needs to be entered in lowercase to be accepted.

The same goes for the grant type, which is a fixed value of 'client_credentials', all in lowercase.

The final details we need are the 'client_id' and the 'client_secret', and what I found is that these are really the API Key and API Secret managed in your Fortify portal. If you sign in to your portal (for the Data Centre and product I have access to), you can navigate to 'Administration', then 'Settings' and finally 'API'. From this section you can create the API details with the required set of permissions. Note that the permissions can be changed after setting this up, so you do not need to commit yet. You should then have all the details required for these two parameters, where client_id = API Key and client_secret = API Secret.

Your details in PowerShell should look like this:

$body = @{
    scope         = "api-tenant"
    grant_type    = "client_credentials"
    client_id     = "a1aa1111-11a1-1111-aaa1-aa1a11a1aaaa"
    client_secret = "AAxAbAA1AAdrAA1AAAkyAAAwAAArA11uAzArA1A11"
}

From there we can do a simple 'Invoke-RestMethod' in PowerShell, with one key thing to note: the content type must be 'application/x-www-form-urlencoded'. Without it you will keep getting an error saying the grant type is not valid. You will also notice, as above, that the body is not JSON; the values are formatted as parameters in the body of the request.

Below is the full example of the request in PowerShell. I have also included the lines that set the default proxy, so if you are making the request from behind a proxy it should still work.

## Set Proxy
[System.Net.WebRequest]::DefaultWebProxy = [System.Net.WebRequest]::GetSystemWebProxy()
[System.Net.WebRequest]::DefaultWebProxy.Credentials = [System.Net.CredentialCache]::DefaultNetworkCredentials

## Create Details
$uri = "https://api.emea.fortify.com/oauth/token"
$body = @{
    scope         = "api-tenant"
    grant_type    = "client_credentials"
    client_id     = "a1aa1111-11a1-1111-aaa1-aa1a11a1aaaa"
    client_secret = "AAxAbAA1AAdrAA1AAAkyAAAwAAArA11uAzArA1A11"
}

## Request
$response = Invoke-RestMethod -ContentType "application/x-www-form-urlencoded" -Uri $uri -Method POST -Body $body -UseBasicParsing

## Response
Write-Host $response
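
Once authenticated, the returned token can be used as a Bearer token on the actual API calls. Here is a quick sketch; the applications endpoint is taken from the Swagger definitions for my Data Centre, so treat it as an assumption for yours:

## Use the token on a subsequent request (endpoint assumed from Swagger)
$headers = @{ Authorization = "Bearer $($response.access_token)" }
$applications = Invoke-RestMethod -Uri "https://api.emea.fortify.com/api/v3/applications" -Headers $headers -UseBasicParsing
Write-Host $applications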