Grouping AppDynamics Database Custom Metric Queries

When you create Custom Database Metrics in AppDynamics, your first thought is to create a new row for each metric, but if you have a lot to report on this can become messy, and in the metric view you will end up with a very long list to scroll through. When we had a consultant down at work, we were shown how to group a collection of metrics into one query, which then shows in the metric view as a sub folder. We could not find this tactic anywhere on the internet, so I thought I would share this very handy insight for AppDynamics.

The standard method to add custom queries and metrics is to go to the configuration view below in AppDynamics and add a new query for each of the metrics you wish to report on.

AppDynamics Databases

You can then go to the metric view and see the data coming in like below.

AppDynamics Metric Browser

However, as I said above, this list can grow fast, plus by default you are limited to only 20 of these queries, which can disappear faster than you think. This method therefore gives you more bang for your buck on custom metrics, as well as better organisation of your data.

Instead of adding each query separately, we can group queries into sub folders of the ‘Custom Metric’ folder, to look like this:

Before:

  • Custom Metric
    • Queue 1
    • Queue 2
    • Queue 3

After:

  • Custom Metric
    • MessagingQueueMonitoring
      • Queue 1
      • Queue 2
      • Queue 3

As we completed this at my company in Microsoft SQL Server, I will use that as an example, but I am confident it can be translated to other SQL dialects with the same outcome and some slight changes to syntax.

Say we start with the three queries that we want to monitor, keeping them simple:

SELECT count(id) FROM MessageQueueOne
SELECT count(id) FROM MessageQueueTwo
SELECT count(id) FROM MessageQueueThree

To create the top level folder, you simply create a single query item called ‘MessagingQueueMonitoring’. In this new custom metric query we need to add the above 3 SQL statements, but as a single query instead of 3. For this we use the SQL operator ‘UNION ALL’ to join them together:

SELECT count(id) FROM MessageQueueOne
UNION ALL
SELECT count(id) FROM MessageQueueTwo
UNION ALL
SELECT count(id) FROM MessageQueueThree

This will now produce one table with 3 rows and their values, but for AppDynamics to recognise these in the metrics view we need to tell it what each row means. To tell AppDynamics what the nodes underneath the folder are called, you add a column to each query for the name; this column should be called ‘Type’. Then, for AppDynamics to know which part of the table is the value, you call that column ‘Total’.

You should end up with a query like below:

SELECT 'Message Queue One' as Type, count(id) as Total FROM MessageQueueOne
UNION ALL
SELECT 'Message Queue Two' as Type, count(id) as Total FROM MessageQueueTwo
UNION ALL
SELECT 'Message Queue Three' as Type, count(id) as Total FROM MessageQueueThree

Then this should result in a table like this:

Type                   Total
Message Queue One      4
Message Queue Two      2
Message Queue Three    56

How to set up AppDynamics for multiple .NET Core 2.0 applications

We have decided to go with AppDynamics to do monitoring on our infrastructure and code, which is great, and even better they have released support for .NET Core 2.0. However, when working with their product and consultant we found an issue with monitoring multiple .NET Core instances on one server, plus console apps, but we found a way around it.

Currently their documentation, where helpful, shows you how to set up monitoring for a .NET Core application with environment variables. The AppDynamics documentation says to set the environment variable for the profiler's path, which lives inside each of the applications, but of course we can't set the same environment variable multiple times. We therefore copied the profiler DLL to a central location and used that as the environment variable, but quickly found out that it still didn't work: for the profiler to start tracking, the variable needs to point to the application's own root folder for each application.

The consultant's investigation then led to looking at how we can set the environment variables per application, and we found they can be set in the web.config using the ‘environmentVariables’ node under the ‘aspNetCore’ node, as stated in the Microsoft Documentation. Of course, the ‘dotnet publish’ command generates this web.config, so you can't just set this in the code. Therefore, in the release of the code I wrote some PowerShell to set these parameters.

In the below PowerShell, I get the XML content of the web.config, then create each of the environment variable nodes I want to insert. Once I have these I can then insert them into the correct ‘aspNetCore’ node of the XML variable, which I then use to overwrite the contents of the existing file.

Example PowerShell:

$configFile = "D:\wwwroot\web.config";
$sourceDir = "D:\wwwroot";
$subFolderName = "ServiceOne"; ## example sub folder of the application being published

## Get XML
$doc = New-Object System.Xml.XmlDocument
$doc.Load($configFile)
$environmentVariables = $doc.CreateElement("environmentVariables")

## Set 64 bit version
$Profiler64 = $doc.CreateElement("environmentVariable")
$Profiler64.SetAttribute("name", "CORECLR_PROFILER_PATH_64")
$Profiler64.SetAttribute("value", "$sourceDir\$subFolderName\AppDynamics.Profiler_x64.dll")
$environmentVariables.AppendChild($Profiler64)

## Set 32 bit version
$Profiler32 = $doc.CreateElement("environmentVariable")
$Profiler32.SetAttribute("name", "CORECLR_PROFILER_PATH_32")
$Profiler32.SetAttribute("value", "$sourceDir\$subFolderName\AppDynamics.Profiler_x86.dll")
$environmentVariables.AppendChild($Profiler32)

$doc.SelectSingleNode("configuration/system.webServer/aspNetCore").AppendChild($environmentVariables)

$doc.Save($configFile)

Example Web.config result:

<configuration>
  <system.webServer>
    <handlers>
      <add name="aspNetCore" path="*" verb="*" modules="AspNetCoreModule" resourceType="Unspecified" />
    </handlers>
    <aspNetCore processPath="dotnet" arguments=".\SecurityService.dll" stdoutLogEnabled="false" stdoutLogFile=".\logs\stdout">
      <environmentVariables>
        <environmentVariable name="CORECLR_PROFILER_PATH_64" value="D:\IIS\ServiceOne\AppDynamics.Profiler_x64.dll" />
        <environmentVariable name="CORECLR_PROFILER_PATH_32" value="D:\IIS\ServiceOne\AppDynamics.Profiler_x86.dll" />
      </environmentVariables>
    </aspNetCore>
  </system.webServer>
</configuration>

This will work for applications that have a web.config, but something like a console app doesn't have one, so what do we do?

The recommendation and solution is to create a wrapper script. This script sets the environment variables, which will then only affect the application launched in that session. You can use more or less any scripting language for this, such as PowerShell or batch.

In this script you just need to set the environment variables and then run the Exe after.

For example in PowerShell:

Param(
    [string] $TaskLocation,
    [string] $Arguments
)

# Set Environment Variables for AppDynamics
Write-Host "Set Environment Variables in path $TaskLocation"
$env:CORECLR_PROFILER_PATH_64 = "$TaskLocation\AppDynamics.Profiler_x64.dll"
$env:CORECLR_PROFILER_PATH_32 = "$TaskLocation\AppDynamics.Profiler_x86.dll"

# Run the application; the command to run is passed in via $Arguments
Write-Host "Start Script"
cmd.exe /c $Arguments
exit
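A call to this wrapper script might then look like the following (the script name and paths here are illustrative, not from the original post):

```powershell
## Launch a console app with the AppDynamics profiler variables scoped to this session
.\Start-ConsoleApp.ps1 -TaskLocation "D:\Apps\ServiceOne" -Arguments "D:\Apps\ServiceOne\ServiceOne.exe"
```

Because the variables are set with $env:, they only live for that session's process, so each application can point at its own copy of the profiler.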

These two solutions mean you can use AppDynamics with both .NET Core web apps and console apps, with multiple applications on one box.

Quick Tips for the Sitecore Package Deployer

There is a package for Sitecore called the Sitecore Package Deployer, which updates the Sitecore Content Management System (CMS) with packages from Team Development for Sitecore (TDS). While working with this package extension I have been told and shown two tips that can help with your development and deployment.

Admin Update Installation Wizard

With this tool you can analyse as well as install new/existing update packages to the Sitecore instance. If you browse to ‘sitecore/admin/UpdateInstallationWizard.aspx’ on your Sitecore instance, you should be presented with a login page.

If you sign in with your admin credentials you should be presented with the welcome page.


Once you click ‘Select a package >’ you will go to a page to select a new package, which should be an update file. Once you have selected your package you can press the ‘Package Information >’ button.


On the next page you can then see the package details and go on to ‘analyse the package >’. This will display the analysis page, where if you select the ‘Analyse’ button it will, as it says, analyse the package to identify potential conflicts. Once you have reviewed these you can then install the package safely and securely.

Sitecore Package Deployer URL

So back to the TDS side: once you have put your packages in the ‘SitecoreDeployerPackages’ folder you want them installed, and ideally right now.

In our release process after putting the files in the location, we don’t want to wait for the timer to trigger, so a method is to request this URL:

[YourSite]/sitecore/admin/StartSitecorePackageDeployer.aspx

This will trigger the deployer to start processing the update files, but it doesn't stop there. While doing this we started to get failed processes because the deployer was already busy. This is caused by a clash between the timer and the request, and there is a way to force past it by adding the query string name/value of ‘force=1’.

This makes the URL look like this:

[YourSite]/sitecore/admin/StartSitecorePackageDeployer.aspx?force=1
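If you are triggering this from a PowerShell release step, a one-liner along these lines should do it (the site URL is illustrative, and your instance may require an authenticated session):

```powershell
## Kick off the Sitecore Package Deployer, forcing it past any timer clash
$siteUrl = "https://my-sitecore-site.local"   ## illustrative host name
Invoke-WebRequest -Uri "$siteUrl/sitecore/admin/StartSitecorePackageDeployer.aspx?force=1" -UseBasicParsing
```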


Anymore?

If you have any tips on using the Sitecore Package Deployer or other Sitecore related tips then please share.


Azure Container with PowerShell

When I was trying to use PowerShell to action some Azure functionality, I found the information very scattered and it was hard to get one answer, so here I give you the golden goose for adding, removing, emptying and copying files to an Azure Container using PowerShell.

The small print on this is of course that there are probably other methods of doing the same thing, but this is how it worked for me. This is also not a demo of all the options and parameters the PowerShell commands can take, just what we need them to do. These scripts are set up to run with parameters passed in, but I have also put comments in there so you can run them hardcoded.

How to add an Azure Container?

The parameters required for this script are the Resource Group Name and Storage Account Name for the already built account, plus the new Container's name. You can see below where we pass in the parameters; however, in the static version we also need to log in to the required account and pass in the Subscription ID for the account as well.

You can get the Subscription ID by following the steps on this post.

## Get Parameters
Param(
    [string] $ResourceGroupName,
    [string] $StorageAccountName,
    [string] $StorageContainerName
)

## Static Parameters
#Login-AzureRmAccount
#Set-AzureRmContext -SubscriptionID 11111111-1111-1111-1111-111111111111
#$ResourceGroupName = "GroupName"
#$StorageAccountName = "AccountName"
#$StorageContainerName = "ContainerName"

Now we have all the details, we can get the storage details for the account with the code below. This gets the storage key to access the account details, then creates the storage context.

$Keys = Get-AzureRmStorageAccountKey -ResourceGroupName $ResourceGroupName -Name $StorageAccountName;

$StorageContext = New-AzureStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $Keys[0].Value;

You need the Storage Context for the later calls that create the container. Before we create the new container, it is best to check whether it already exists. In the circumstance I was in, I only wanted a warning flag: if it was already there then great, I don't need to create it, but just flag that detail to the console.

The first part is an IF statement that attempts to get the Container; if it finds one, it falls into the else and writes a warning to the console. If it doesn't, we use the parameters passed in to create the new Container. Also note the ‘Permission’ argument, which I have set to ‘Container’, but this can be set to the other options instead or passed in as a new parameter.

if (!(Get-AzureStorageContainer -Context $StorageContext | Where-Object { $_.Name -eq $StorageContainerName })) {
    New-AzureStorageContainer -Context $StorageContext -Name $StorageContainerName -Permission Container;
}
else {
    Write-Warning "Container $StorageContainerName already exists."
}

This is then all you need for creating a new Azure Container, and for the full example you can go here.

How to copy files to an Azure Container?

Following the life cycle, after you create an Azure Container you will want to get files into it. So we start as before, with all the parameters that are required. The additional one here is ‘ArtifactStagingDirectory’, which is the directory where the content to upload lives.

## Get Parameters
Param(
    [string] $ResourceGroupName,
    [string] $StorageAccountName,
    [string] $StorageContainerName,
    [string] $ArtifactStagingDirectory
)

Again we get the Storage Account context for future commands and then also get the paths for the files from the passed in directory.

$storageAccount = ( Get-AzureRmStorageAccount | Where-Object{$_.StorageAccountName -eq $StorageAccountName} )

$ArtifactFilePaths = Get-ChildItem -Path "$ArtifactStagingDirectory\**" -Recurse -File | ForEach-Object -Process {$_.FullName}

With the file paths we can then loop through each file to add it to the Container. Within each loop we set up the source path and pass it in; you might notice we are using the ‘Force’ argument, as we do not want a permission dialog box popping up, especially if we are automating.

foreach ($SourcePath in $ArtifactFilePaths) {
    Set-AzureStorageBlobContent -File $SourcePath -Blob $SourcePath.Substring($ArtifactStagingDirectory.length) `
        -Container $StorageContainerName -Context $storageAccount.Context -Force
}

This will get all the found files and folders into the Azure Container you have created. If you want to see the full version of how to copy files to an Azure Container go here.

How to empty an Azure Container?

Like in most cases, if in doubt then restart, so this is a script to do just that by emptying the Container of its contents. The setup for this has one difference: the Containers are passed as a comma separated string of names instead. This is so you can empty one or many Containers at the same time, for example if you are cleaning out a whole deployment pipeline.

## Get Parameters
Param(
    [string] $ResourceGroupName,
    [string] $StorageAccountName,
    [string] $StorageContainerNames
)

As usual we get the Azure Storage Account's context for the later commands.

$Keys = Get-AzureRmStorageAccountKey -ResourceGroupName $ResourceGroupName -Name $StorageAccountName;

$StorageContext = New-AzureStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $Keys[0].Value;

For this one I am going to break it down by line instead of by statement. To get the full picture click on the link at the bottom to see the full version of this code.

We kick it off by looping each of the Container names:

$StorageContainerNames.Split(",") | ForEach {
    $currentContainer = $_

We then need to check that the Container exists, else we would try to delete content from a non-existent Container.

if ((Get-AzureStorageContainer -Context $StorageContext | Where-Object { $_.Name -eq $currentContainer })){

If there is a Container, then we also need to check if there is a Blob to delete the content from.

$blobs = Get-AzureStorageBlob -Container $currentContainer -Context $StorageContext

if ($blobs -ne $null)
{

If all of these checks pass then we get the go ahead to delete the contents; we loop through each of the Blobs in the array to clear each Blob item.

foreach ($blob in $blobs) {
    Write-Output ("Removing Blob: {0}" -f $blob.Name)
    Remove-AzureStorageBlob -Blob $blob.Name -Container $currentContainer -Context $StorageContext
}

As a result, all the contents of the named Containers will be cleared out. As said before, these are just snippets; the full version of emptying the Azure Container is here.
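Pulled together, the snippets above form a script along these lines (a sketch of the pieces shown here, not the linked full version):

```powershell
## Get Parameters
Param(
    [string] $ResourceGroupName,
    [string] $StorageAccountName,
    [string] $StorageContainerNames
)

## Get the Storage Account context
$Keys = Get-AzureRmStorageAccountKey -ResourceGroupName $ResourceGroupName -Name $StorageAccountName;
$StorageContext = New-AzureStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $Keys[0].Value;

## Loop each named Container and clear out its Blobs
$StorageContainerNames.Split(",") | ForEach {
    $currentContainer = $_
    if ((Get-AzureStorageContainer -Context $StorageContext | Where-Object { $_.Name -eq $currentContainer })) {
        $blobs = Get-AzureStorageBlob -Container $currentContainer -Context $StorageContext
        if ($blobs -ne $null) {
            foreach ($blob in $blobs) {
                Write-Output ("Removing Blob: {0}" -f $blob.Name)
                Remove-AzureStorageBlob -Blob $blob.Name -Container $currentContainer -Context $StorageContext
            }
        }
    }
}
```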

How to remove an Azure Container?

Just like the previous script, we have the same parameters, one of which contains a comma separated string of Container names. With these parameters we are looking to clear the whole thing out by deleting the Azure Containers themselves.

We start with the parameters, get the Storage Account context and loop through the Containers.

## Get Parameters
Param(
    [string] $ResourceGroupName,
    [string] $StorageAccountName,
    [string] $StorageContainerNames
)

$Keys = Get-AzureRmStorageAccountKey -ResourceGroupName $ResourceGroupName -Name $StorageAccountName;

$StorageContext = New-AzureStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $Keys[0].Value;

$StorageContainerNames.Split(",") | ForEach {
    $currentContainer = $_

For each Container, you check that it exists before deleting it. Then comes the final command to delete the Container; you will notice we again use the ‘Force’ argument, to prevent the authorisation pop up showing and get the Container deleted.

Remove-AzureStorageContainer -Context $StorageContext -Name $currentContainer -Force;

The full layout of removing an Azure Container can be seen here. 
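For completeness, here is how the removal loop looks with the existence check described above in place; the warning branch is my own addition:

```powershell
$StorageContainerNames.Split(",") | ForEach {
    $currentContainer = $_
    if ((Get-AzureStorageContainer -Context $StorageContext | Where-Object { $_.Name -eq $currentContainer })) {
        Remove-AzureStorageContainer -Context $StorageContext -Name $currentContainer -Force;
    }
    else {
        Write-Warning "Container $currentContainer does not exist."
    }
}
```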

How to get the user's IP Address in C#.NET?

When you search this on your favourite search engine (Google), you will get flooded by loads of different ways to get the end result. This is great in a sense, as you know there are thousands of people with different answers, but that is also where the problem is. There are so many single responses on how to get the user's IP Address in C#.NET, but not one that shows you everything. Therefore I present below the different methods to get the user's IP and how I have implemented it in a project.

All the methods I found to be best use the Request Server Variables. These are predetermined environment variables and request header information, which is why you can get the IP address: it is part of the header information.

The standard server variable to use is ‘REMOTE_ADDR’, which is done as the following:

HttpContext.Current.Request.ServerVariables["REMOTE_ADDR"];

However, if the user is behind a proxy server, the above will return the proxy's IP Address and not the actual user's. The server variable to get the user's IP address from behind the proxy server is ‘HTTP_X_FORWARDED_FOR’, used as below:

HttpContext.Current.Request.ServerVariables["HTTP_X_FORWARDED_FOR"];

Then you get the issue of not knowing whether the user is behind a proxy or not, as the first example will return one or the other and the second example will only return a value if there is a proxy server. Therefore we need to find out if there is a proxy server. You can do this with the server variable ‘HTTP_VIA’: if this has a value then the request must have come through a proxy server, and so you can get the correct IP as below:

if (HttpContext.Current.Request.ServerVariables["HTTP_VIA"] != null) {
    return HttpContext.Current.Request.ServerVariables["HTTP_X_FORWARDED_FOR"];
}

return HttpContext.Current.Request.ServerVariables["REMOTE_ADDR"];

Another method is to do it in reverse and check whether the forwarded-for value is empty first, like so:

string userIp = HttpContext.Current.Request.ServerVariables["HTTP_X_FORWARDED_FOR"];

if (string.IsNullOrEmpty(userIp)) {
    return HttpContext.Current.Request.ServerVariables["REMOTE_ADDR"];
}

return userIp;

This is the best method I found to get the user's IP Address in ASP.NET C#, but if you think there is a better way then please comment below with your solution.
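Putting the pieces together, a small helper along these lines gives callers a single entry point (the class and method names are my own, and the handling of a comma separated forwarded-for chain is an extra I have added, since proxies can append multiple addresses to that header):

```csharp
using System.Web;

public static class RequestHelper
{
    public static string GetUserIpAddress()
    {
        // Prefer the forwarded header, which is populated when a proxy is involved
        string userIp = HttpContext.Current.Request.ServerVariables["HTTP_X_FORWARDED_FOR"];

        if (string.IsNullOrEmpty(userIp))
        {
            return HttpContext.Current.Request.ServerVariables["REMOTE_ADDR"];
        }

        // The header can hold a chain of addresses; the first entry is the original client
        return userIp.Split(',')[0].Trim();
    }
}
```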