Bash compare version numbers

Here is a little script to help compare version numbers in the format X.X.X using Bash/shell. This can be useful if you are trying to find out which version is higher, for example when deciding whether an upgrade is needed.

versionlte() {
    [ "$1" = "`echo -e "$1\n$2" | sort -V | head -n1`" ]
}

versionlt() {
    [ "$1" = "$2" ] && return 1 || versionlte $1 $2
}

You can then use this as an inline if expression.


versionlt $lowerVersion $higherVersion && upgradeRequired=true || upgradeRequired=false
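To sanity-check the comparison, here is a self-contained run with made-up version numbers (using printf instead of echo -e for portability; sort -V needs GNU coreutils):

```shell
# portable copies of the helpers above
versionlte() {
    [ "$1" = "$(printf '%s\n%s\n' "$1" "$2" | sort -V | head -n1)" ]
}
versionlt() {
    [ "$1" = "$2" ] && return 1 || versionlte "$1" "$2"
}

versionlt "1.2.9" "1.2.10" && echo "upgrade required" || echo "up to date"   # upgrade required
versionlt "2.0.0" "2.0.0" && echo "upgrade required" || echo "up to date"    # up to date
```

Note that sort -V correctly treats 1.2.10 as higher than 1.2.9, which a plain string sort would get wrong.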

DotNet User Secrets Feature

A little-known feature of dotnet is User Secrets. It is used to create local App Settings overrides for local usage. This can be a handy and powerful tool for keeping your local setup separate from the code you check in.

You can find more details on the Microsoft Documentation here

The goal of this feature is to override your App Settings with local values. For example, if you have a connection string within your JSON, you are not going to want your local username/password stored in there to be checked in. You also don’t want to be pulling other people’s settings down and having to keep changing them. Therefore, with this feature you set your values on your local machine in a different file and they get overridden at runtime, not overwritten on disk, so they will work for you but never be checked in.

Within your project location you can run the ‘init’ command, which will create a new node in the project file. This node is called ‘UserSecretsId’ and contains the ID for this project. When the application runs, it will use this ID to match up with where the secrets are stored.
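For reference, the user-secrets commands come from the .NET CLI and are run from the project directory; the key and value below are just placeholders:

```
dotnet user-secrets init
dotnet user-secrets set "MySecretName" "MySecretValue"
dotnet user-secrets list
```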

The secrets are stored in the folder ‘C:/Users/[UserName]/AppData/Roaming/Microsoft/UserSecrets’, then in a directory with the ID as the directory name. Within this folder there is a file called ‘secrets.json’ where you store all the secrets in JSON format. You can get more detail on how to format the names of your App Settings properties in the documentation.
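As an illustration, a secrets.json that overrides a connection string from appsettings.json might look like this (the key name here is made up; nested configuration sections are flattened with a colon):

```json
{
  "ConnectionStrings:MyDatabase": "Server=localhost;User Id=localUser;Password=localPassword;"
}
```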

When you run the ‘init’ command it doesn’t create this directory and file for you, so I whipped together a script below to generate the User Secrets ID and also create the required directory/file. Before I talk about that, I will also show how to use User Secrets with Dotnet Core Console Apps.

This could be something I have done wrong, but when I create a Web Application and use the feature it just works with no extra effort. However, when I created a Console App it did not just work out of the box. I found I needed to do a few things to get it working, which Stafford Williams talks about here.

One part he missed was, when using Dependency Injection, telling the builder where to find the User Secrets ID, as per:
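A minimal sketch of that wiring, assuming the Generic Host and the Microsoft.Extensions.Configuration.UserSecrets package (Program here stands in for any type from the assembly that carries the UserSecretsId):

```csharp
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.Hosting;

var host = Host.CreateDefaultBuilder(args)
    .ConfigureAppConfiguration((context, config) =>
    {
        // points the configuration builder at the assembly whose
        // project file holds the UserSecretsId node
        config.AddUserSecrets<Program>();
    })
    .Build();
```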


Create User Secrets

The code below accepts either a single project path or a directory containing many projects. It will find the project files and check whether they have the User Secrets ID in the project file.

If a project doesn’t, the script will go to its directory, run the ‘init’ command and then read the generated ID.

From there it can check/create the folders and files for the User Secrets.

Param (
    [string]$projectPath
)

$filesCount = 0
if ($projectPath.EndsWith('.csproj')) {
    $projects = Get-ChildItem -Path $projectPath
    $filesCount = 1
}
else {
    # ensure a trailing slash before appending the search pattern (note: -and, not -or)
    if ($projectPath.EndsWith('/') -eq $false -and $projectPath.EndsWith('\') -eq $false) {
        $projectPath += "/";
    }
    $projects = Get-ChildItem -Path "$projectPath*.csproj" -Recurse -Force
    $filesCount = $projects.Length
}

Write-Host("Files Found $filesCount")
if ($filesCount -gt 0) {
    $userSecretsPath = "$ENV:UserProfile/AppData/Roaming/Microsoft/UserSecrets"
    if (!(Test-Path $userSecretsPath)) {
        Write-Host("Create User Secrets Path")
        New-Item -ItemType directory -Path $userSecretsPath
    }

    $currentDir = [System.IO.Path]::GetDirectoryName($myInvocation.MyCommand.Definition)
    foreach ($project in $projects) {
        Write-Host(" ")
        Write-Host("Current Project $project")

        [xml]$fileContents = Get-Content -Path $project
        if ($null -eq $fileContents.Project.PropertyGroup.UserSecretsId) {
            Write-Host("User Secret ID node not found in project file")
            Set-Location $project.DirectoryName
            dotnet user-secrets init
            Set-Location $currentDir
            Write-Host("User Secret Created")
            # re-read the project file to pick up the new UserSecretsId node
            [xml]$fileContents = Get-Content -Path $project
        }

        $userSecretId = $fileContents.Project.PropertyGroup.UserSecretsId
        Write-Host("User Secret ID $userSecretId")
        if ($userSecretId -ne "") {
            $userSecretPath = "$userSecretsPath/$userSecretId"
            if (!(Test-Path $userSecretPath)) {
                New-Item -ItemType directory -Path $userSecretPath
                Write-Host("User Secret ID $userSecretId Path Created")
            }

            $secretFileName = "secrets.json"
            $secretPath = "$userSecretsPath/$userSecretId/$secretFileName"
            if (!(Test-Path $secretPath)) {
                New-Item -Path $userSecretPath -Name $secretFileName -ItemType "file" -Value "{}"
                Write-Host("User Secret ID $userSecretId secrets file Created")
            }
            Write-Host("User Secrets path $secretPath")
        }
    }
}

Microsoft Upgrade Assistant

Dotnet Core has been out for a very long time now and everyone is getting on the cutting edge of the SDK technology as soon as it is released. However, there has been some assistance missing in helping projects, especially .Net Framework projects, upgrade to the next version. Now, in 2021, they have brought the tooling out! And by ‘they’ I mean Microsoft.

The funny part of this is that although Microsoft built Dotnet Core, with the community’s help, it was AWS that came to the rescue of developers first. In late 2020, AWS developed a porting assistance tool for moving from .Net Framework to Dotnet Core.

This tool brings a nice UI to download and review the actions required to upgrade, though being more of a command-line developer I prefer the terminal. The report is well detailed and does a good job of bringing the old technology to the latest version. I think it is great that this was around to help companies get onto Dotnet Core. However, it doesn’t assist with a change in Dotnet Core SDK versioning; although that is simple, having something that will make sure your app is properly upgraded without any all-nighters is nice to have.

There is also a document on migration in Dotnet Core from Microsoft that goes through each of the steps required to action the upgrade. I did follow this guide and it was very simple, but it doesn’t cater for more complex and large solutions. It is also specifically for a Dotnet Core migration from 3.1 to 5, whereas some developers might be coming from other versions. View the documentation here.

Finally we come to the new toy on the block, the Microsoft Upgrade Assistant. The reason this is such a good tool is that it covers where the other two methods fall short. It is made to run with the Dotnet Core SDK, so it is something you are, or will become, familiar with. It can assist in upgrading .Net Framework and Dotnet Core projects or solutions. It is very well automated, doing the work for you with a simple numbering system to choose which steps to run, skip or explain.

Microsoft Dotnet Core Upgrading Assistance Tool

The guide to installation and use is very simple to follow, as it is a few dotnet tool installations and then running the command.
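As a sketch, the install-and-run steps look roughly like this (tool and command names as per the project’s GitHub readme at the time of writing; the project path is a placeholder):

```
dotnet tool install -g upgrade-assistant
upgrade-assistant upgrade ./MyProject.csproj
```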

A handy part is that it offers the ability to back up your existing project so you will not lose your work, though you should also be using source control like Git to manage your project anyway, so you will see the changes made.

If you get into any trouble with the installation or the upgrade then, like most Microsoft projects, it is open source on GitHub. I found one or two issues installing it and found the help I needed extremely fast.

The reason these tools and methods are so important now is that the upgrade cadence is speeding up. Before, you would have a set version with some patches, but you would not look to upgrade as often. Now, with Dotnet Core and the general speed of development, the changes bring a lot of benefits with little upgrade impact, that is, if you keep up with the latest. Already they have released .Net 5 as production ready and are releasing .Net 6 previews, so you can see the speed of change.

The upgrades are getting as simple as updating a Nuget package, so why would you not? You will of course still need to test and validate the upgrade, and you are restricted by the compatibility of other resources you use, like external Nuget packages.

Create Identity in Google Cloud Platform and GSuite

Compared to some other cloud providers, creating an identity via code in GCP is a little fragmented if you’re using GSuite for your identity storage. The Google Cloud Platform holds your users’ identity references and permissions, while the other system, GSuite, holds the users’ authentication security. This can also make the documentation feel a little fragmented and not so easy to follow. Hence this post to stick them together, showing how I used C# Dotnet Core to create a Google Cloud Platform identity using their SDK.

This part is standard for any SDK access to GCP: you need a service account for authentication. For this you will need to create a Service Account in GCP, which must be associated with a project. You can create it against the project where you are deploying, or, to keep things separate (as I would recommend), you can create a Service Management Project. This is just a standard project, but you can use it to keep all the SDK activity on this project while the usage activity happens on the other project.

Create a Project

  1. Go to the Manage resources page in the Cloud Console.
  2. On the Select organization drop-down list at the top of the page, select the organization in which you want to create a project. If you are a free trial user, skip this step, as this list does not appear.
  3. Click Create Project.
  4. In the New Project window that appears, enter a project name and select a billing account as applicable. A project name can contain only letters, numbers, single quotes, hyphens, spaces, or exclamation points, and must be between 4 and 30 characters.
  5. Enter the parent organization or folder in the Location box. That resource will be the hierarchical parent of the new project.
  6. When you’re finished entering new project details, click Create.


Create a Service Account

  1. In the Cloud Console, go to the Service accounts page.
  2. Select a project (your new Service Management Project).
  3. Click Create service account.
  4. Enter a service account name to display in the Cloud Console.
    The Cloud Console generates a service account ID based on this name. Edit the ID if necessary. You cannot change the ID later.
  5. Optional: Enter a description of the service account.
  6. If you do not want to set access controls now, click Done to finish creating the service account.
    To set access controls now, click Create and continue to the next step.
  7. Optional: Choose one or more IAM roles to grant to the service account on the project.
  8. When you are done adding roles, click Continue.
  9. Optional: In the Service account users role field, add members that can impersonate the service account.
  10. Optional: In the Service account admins role field, add members that can manage the service account.
  11. Click Done to finish creating the service account.


You could then get more specific for the Identity Access Management (IAM) permissions, but to keep it simple you would just need to apply the Service Account ‘Owner’ and ‘Project IAM Admin’ access on the new Service Management Project. This will give the Service Account access to create the identities, but for more detail on the permissions, you can use this link to look them up.

Next we need the Service Account to have access to create the identities in the GSuite. The below sets the Service Account in GCP ready to give access in the Admin portal of GSuite.

  1. Locate the newly-created service account in the table. Under Actions, click the More button (the 3 dots at the end), then Edit.
  2. In the service account details, click the down arrow to Show domain-wide delegation, then ensure the Enable G Suite Domain-wide Delegation checkbox is checked.
  3. If you haven’t yet configured your app’s OAuth consent screen, you must do so before you can enable domain-wide delegation. Follow the on-screen instructions to configure the OAuth consent screen, then repeat the above steps and re-check the checkbox.
  4. Click Save to update the service account, and return to the table of service accounts. A new column, Domain-wide delegation, can be seen. Click View Client ID, to obtain and make a note of the client ID.


Now we connect these together by giving the Service Account access in the GSuite Admin Portal.

  1. From your Google Workspace domain’s Admin console, go to Main menu > Security > API controls.
  2. In the Domain wide delegation pane, select Manage Domain Wide Delegation.
  3. Click Add new.
  4. In the Client ID field, enter the client ID obtained from the service account creation steps above.
  5. In the OAuth Scopes field, enter a comma-delimited list of the scopes required for your application (for a list of possible scopes, see Authorize requests).
    For example, if you require domain-wide access to Users and Groups, enter: https://www.googleapis.com/auth/admin.directory.user, https://www.googleapis.com/auth/admin.directory.group
  6. Click Authorize.


At this point our Service Account has access to the GCP Account/Project and also has the access needed for the GSuite to create the identities. Therefore, we can start getting into the code to create these accounts.

To start with the SDK we need the Service Accounts JSON Key, which you can get by:

  1. In the Cloud Console, go to the Service Accounts page.
  2. Click Select a project, choose a project, and click Open.
  3. Find the row of the service account that you want to create a key for. In that row, click the More button, and then click Create key.
  4. Select a Key type and click Create.


Once you have downloaded the JSON File we can move to the Authentication in C#.

You will need to install the Google.Apis.Auth Nuget package in your project. There are then multiple different methods to authenticate depending on how you are storing your JSON Key, but for my example we are injecting the JSON straight into the method that builds the GoogleCredential. The method we need to call is:

GoogleCredential.FromJson(gcpAuthenticationJson)
With gcpAuthenticationJson being the JSON string from the downloaded file. We also need to add scope to the request of access, which we can string together like below with these scopes required:

.CreateScoped(new List<string>
{
    "https://www.googleapis.com/auth/admin.directory.user"
})

Now although we have given the Service Account all the permissions it requires to do the job, it needs to be executed by a GSuite Admin. We of course cannot have the admin logging in every time, therefore we just need the code to act as the admin. We can do this by adding an additional call to the method:

.CreateWithUser(adminEmail)

We can of course make this a little more flexible as it can be reused for other authentications, so this is the method I would recommend:

public GoogleCredential GetGcpCredential(string authJson, List<string> apiScopes = null, string adminEmail = "")
{
    var googleCredential = GoogleCredential.FromJson(authJson)
        .CreateScoped(apiScopes ?? new List<string>
        {
            "https://www.googleapis.com/auth/admin.directory.user"
        });

    if (!string.IsNullOrEmpty(adminEmail))
    {
        googleCredential = googleCredential.CreateWithUser(adminEmail);
    }

    return googleCredential;
}

From this we can then create users with the SDK using this simple bit of code:

// DirectoryService comes from the Google.Apis.Admin.Directory.directory_v1 package
var directoryService = new DirectoryService(new BaseClientService.Initializer
{
    HttpClientInitializer = GetGcpCredential(authJson, apiScopes, userEmail)
});

var request = directoryService.Users.Insert(userData);
return await request.ExecuteAsync();

Setting Bearer tokens in PowerShell

This is a quick post to show how to set a Bearer Token using PowerShell for Invoke-RestMethod and Invoke-WebRequest, as it was something I could not find clearly explained in another post.

This is simply set in the headers of the request as below, where ‘$bearer_token’ is the variable holding the token. I have put it in the ‘$headers’ variable, which is then used in the Invoke-RestMethod.

$headers = @{Authorization = "Bearer $bearer_token"}
$response = Invoke-RestMethod -ContentType "$contentType" -Uri $url -Method $method -Headers $headers -UseBasicParsing