How to set up AppDynamics for multiple .NET Core 2.0 applications

We have decided to go with AppDynamics for monitoring our infrastructure and code, which is great, and even better, they have released support for .NET Core 2.0. However, while working with their product and consultant, we found an issue with monitoring multiple .NET Core instances on one server, as well as with console apps, but we found a way around it.

Currently their documentation, which is helpful, shows that you set up monitoring for a .NET Core application with environment variables. Following the directions in the AppDynamics documentation, you set an environment variable for the profiler's path, which lives inside each application's folder, but of course you can't set multiple values for the same machine-wide environment variable. We therefore copied the profiler DLL to a central location and pointed the environment variable there, but quickly found that this still didn't work: for the profiler to start tracking, the variable needs to point at each application's own root folder.
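To see why one machine-wide variable can't serve two applications, here is a quick sketch (POSIX shell used purely for illustration; the paths are made up):

```shell
# Two apps need different profiler paths, but one machine-wide variable
# can only hold a single value -- whichever assignment runs last wins.
export CORECLR_PROFILER_PATH_64="/apps/service-one/AppDynamics.Profiler_x64.dll"
export CORECLR_PROFILER_PATH_64="/apps/service-two/AppDynamics.Profiler_x64.dll"

# Only service-two's path survives, so service-one's profiler never loads:
echo "$CORECLR_PROFILER_PATH_64"
```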

The consultant's investigation then led to looking at how we could set the environment variables for each application, and we found they can be set in the web.config using the 'environmentVariables' node under the 'aspNetCore' node, as stated in the Microsoft documentation. Of course, the 'dotnet publish' command generates this web.config, so you can't just set this in the code. Therefore, as part of the release of the code, I wrote some PowerShell to set these values.

In the PowerShell below, I load the XML content of the web.config, then create each of the environment variable nodes I want to insert. Once I have these, I insert them into the correct 'aspNetCore' node of the XML document, which I then save back over the existing file.

Example PowerShell:

$sourceDir = "D:\wwwroot"
$subFolderName = "MyApp"
$configFile = "$sourceDir\$subFolderName\web.config"

## Get XML
$doc = New-Object System.Xml.XmlDocument
$doc.Load($configFile)
$environmentVariables = $doc.CreateElement("environmentVariables")

## Set 64 bit version
$Profiler64 = $doc.CreateElement("environmentVariable")
$Profiler64.SetAttribute("name", "CORECLR_PROFILER_PATH_64")
$Profiler64.SetAttribute("value", "$sourceDir\$subFolderName\AppDynamics.Profiler_x64.dll")
$environmentVariables.AppendChild($Profiler64) | Out-Null

## Set 32 bit version
$Profiler32 = $doc.CreateElement("environmentVariable")
$Profiler32.SetAttribute("name", "CORECLR_PROFILER_PATH_32")
$Profiler32.SetAttribute("value", "$sourceDir\$subFolderName\AppDynamics.Profiler_x86.dll")
$environmentVariables.AppendChild($Profiler32) | Out-Null

## Insert into the aspNetCore node and save over the existing file
$doc.SelectSingleNode("configuration/system.webServer/aspNetCore").AppendChild($environmentVariables) | Out-Null
$doc.Save($configFile)

Example Web.config result:

<configuration>
  <system.webServer>
    <handlers>
      <add name="aspNetCore" path="*" verb="*" modules="AspNetCoreModule" resourceType="Unspecified" />
    </handlers>
    <aspNetCore processPath="dotnet" arguments=".\SecurityService.dll" stdoutLogEnabled="false" stdoutLogFile=".\logs\stdout">
      <environmentVariables>
        <environmentVariable name="CORECLR_PROFILER_PATH_64" value="D:\IIS\ServiceOne\AppDynamics.Profiler_x64.dll" />
        <environmentVariable name="CORECLR_PROFILER_PATH_32" value="D:\IIS\ServiceOne\AppDynamics.Profiler_x86.dll" />
      </environmentVariables>
    </aspNetCore>
  </system.webServer>
</configuration>

This works for applications that have a web.config, but something like a console app doesn't have one, so what do we do?

The recommendation and solution is to create an organiser script. This script will set the environment variables, which will then only affect the application launched in that session. You can use almost any scripting language for this, such as PowerShell or batch.

In this script, you just need to set the environment variables and then run the exe afterwards.

For example in PowerShell:

Param(
    [string] $TaskLocation,
    [string] $Arguments
)

# Set Environment Variables for AppDynamics
Write-Host "Setting environment variables in path $TaskLocation"
$env:CORECLR_PROFILER_PATH_64 = "$TaskLocation\AppDynamics.Profiler_x64.dll"
$env:CORECLR_PROFILER_PATH_32 = "$TaskLocation\AppDynamics.Profiler_x86.dll"

# Run Exe (it inherits the variables set above)
Write-Host "Starting application"
cmd.exe /c $Arguments
exit $LASTEXITCODE
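Because the variables are set with `$env:` inside the script's session, they only affect the process tree started from it; nothing else on the box sees them. That scoping behaviour can be sketched in POSIX shell terms (paths here are made up for illustration):

```shell
# A variable exported in the current session is inherited by child
# processes started from it, but does not change machine-wide state.
export CORECLR_PROFILER_PATH_32="/apps/console-app/AppDynamics.Profiler_x86.dll"

# The launched child process sees the value its session was given:
sh -c 'echo "child sees: $CORECLR_PROFILER_PATH_32"'
```

This is why each console app can get its own profiler path, even with several apps running on the same server.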

These two solutions mean you can use AppDynamics with both .NET Core web apps and console apps, with multiple applications on one box.

Can Project ARA work?

Just about all developers watched and followed Google I/O 2016. You heard about Allo, Duo and the Google Home air freshener, but something they also squeezed in was a nod to Project ARA. I thought this project was dead given the lack of recent news, but they have brought it back to life, so I ask: can this idea work?

For those who don't know about Project ARA, it is an idea from Google to build a modular phone. The plan is a phone that you can create and make your own. All you have to do is replace, upgrade and buy new modules to keep it running. If you want the new camera upgrade, you don't need to buy a whole new phone; you just pop to a store that sells the modules and slot it in.

They also say on the Project ARA website that they are looking to expand to an open marketplace. This could mean they will allow third parties to build for the ARA phone. Google said at I/O that they plan for the phone to be on sale in 2017, so we don't have long to wait. The question about the open marketplace is whether they will have enough companies building for it before it comes out. If the phone launches and there is nothing other than Google's own modules built for it, then it will be a hard sell and an uphill battle to reach the general public.

 

Why Could It Succeed

Google are known for throwing ideas at the wall and seeing what sticks, so what makes this another egg at the wall rather than the next best thing of 2017? I think a lot of it has to do with cost, flexibility and personalisation. The whole idea is being able to swap out your modules whenever you want, which brings the flexibility and the personalisation. Most phone contracts for the general public are two years, so when we see a new phone released in the middle of one, all we can do is drool. If a new feature is announced that can be added with a module, you can get that upgrade there and then. This also means updates that rely on hardware can come faster. For example, if Apple's iOS Force Touch just needed newer hardware to run, with this you could be told which modules to update to get that feature, then do it.

Speaking of other companies, as said before, they want to open module development to third parties. It could be great to see what those companies build for the phone and what proves useful. Given the current hype, you could have better fitness tracking, or a blood sugar reader for people with diabetes. The possibilities are limitless, bound only by how many custom modules you can fit on the phone. It would depend on how much room the basic modules you need take up, and then how many of the custom modules you want can fit on the phone. The benefit of these being small, easy-to-fit modules, though, is that they can all be in your pocket for an easy swap. The phone would just be limited by the imagination of the companies making the modules and by their cost.

We would hope the modules don't cost much, as you would probably want to buy a few at a time and get new ones frequently, like apps. If the modules are expensive, people might wonder why not wait for the next best phone, which will have all the upgrades at once.

Buying the phone will be interesting as well, because you are going to be the creator of the phone. It would be cool if you could basically spec out your phone's modules and then put a price to it; for example, £200 for the basic phone, then 5 × £50 for each module. If you wanted a cheaper phone, you could either downgrade the modules or just get fewer of them. The flexibility of these devices and their pricing would bring smartphones to more people, while also spreading out the upgrade cycle, if they do it right. I think these factors could make Project ARA a starting point for other manufacturers to follow.

 

Why Would It Fail

Project ARA is so flexible it is amazing, but do we really want it that flexible? You and I will probably say hell yes, but the general public, which is who the mainstream manufacturers are trying to reach, might not. Think about the Microsoft app store: no one wants to build apps for it, as no one really wants their phones. They have even had to open source tools and partner with companies like Xamarin to make it easier for developers to build for them.

Does the general user know about camera or speaker quality and specifications? No, and they won't want to learn about these technical things. You may also find that the salespeople don't know either. At university I applied for a phone shop sales job and told them I knew loads about the handsets and their operating systems, but I was told you don't really need to know those things, you just need to sell the phones. So how will they sell a phone and modules that they don't even understand? I think it will be a hard sell to an average user unless it can be sold as a package. They just want a cool, fast phone with awesome gadgets.

Depending on that, it will be interesting to see whether the companies making the modules, or wanting to make modules, actually do. If they can't see the potential of the average user buying their module, why would they spend time and resources on the product?

 

Until It Begins

I think we will really have to wait and see how Google approaches this. If they can package it and sell it well to the general public, then the open market will be willing to put time and money into the project. If that all goes well, then I see no reason why this could fail, as it has what everyone wants: a flexible, personal and affordable phone. Either way, I could see myself getting this phone, depending on the modules built and the price.

 

Tell me what you think about Project ARA and how you think it will do when released.

 

Don't be a fool, use the backup tools

Backing up your data is the most vital thing any business or individual can do. It is the biggest must, but people still don't think about it. Due to recent events, I think it would be good to share my views and tips on the best solutions.

People don’t think they need back up theses days I believe because of companies like Dropbox, Apple and Google. They promote that all you data is backed up and that you can never be forgotten. The view they seem to push is that your data is eternal and can never be lost, but this is of course wrong. The worst thing is the majority of these people are the uneducationed people about technology and about the procedures you should take to safe guard your data.


There is only so much you can do, of course, as there is always a flaw in something, but you should always try your best. I have a backup taken by my host and I take a daily backup as well, but as it happens, the server I am on gave out before both of them took place. This resulted in the loss of 24 hours of data, which is not acceptable by anyone's standards. Even though I had things in place to protect myself, there were many loopholes that left us exposed.

So what should you do, you ask? Well, I have been looking to upgrade my own system, so here are my solutions for every kind of person.

The less technical

The best thing for less technical people would be an external disk. I would assume that if you are not that technical, then you won't have very sensitive data on your PC. For this reason, I would say you only really need a local copy of your backups. You can purchase a decent external drive, and all you would have to do is plug it in every day or so, then move over all the data you want. This will make sure that if your PC dies or crashes for any reason, you have a failsafe backup of all your data.

Samsung M3 1TB USB 3.0 Slimline Portable Hard Drive – Black 

More than enough storage for any person and all their data, with the latest USB version and high-speed transfer.

This covers the simplest cases, but before you get to the most technical options, there is a step up from that. You can get external drives, like the one below, that work over the network. This way you don't have to plug in the disk, and it will back up all your system data. The only catch is that it needs to be set up first, which for most people is not an easy setup (or maintenance afterwards), plus you need to make sure you have connectivity to the device, or it is useless.

WD 4TB My Cloud Personal Cloud Storage – NAS 

Bigger and better storage, plus it's from one of the top brands delivering this kind of product.


Getting technical

From there we get to paid services as well as local copies. The worst thing is to have all your data locally and then have your house robbed or burnt down. That would of course mean both your data and the backup have been destroyed or taken. So the next level is a paid service like KNOWHOW.

KNOWHOW Cloud Storage

Reliable and secure data storage, one of the best in the business.

With this kind of service, your data is backed up automatically to their storage facilities. This means your house could blow away like in The Wizard of Oz, but your data is somewhere else, safe. Though, as I said about my recent incident, I would still do a local backup just in case their storage gets blown away before yours. Although you should feel safe, as these centres are like bomb bunkers, and if you choose the right one they will always safeguard your data.

The technical way

This is when you are getting into the minor big leagues. I have never been higher than this, so I am sure there are more robust solutions than the following, which would be for the likes of Microsoft. For this approach, I would suggest you run your whole system on a VM. If you run your machine on a VM, you can take snapshot backups of it. This can be done locally by yourself, by the system that is hosting it, and one step further is to use a paid service as well. This way you have multiple backups in multiple places, and they are all fast to restore, so you lose nothing. This is of course a bit over the top, but you can be sure your data is not lost. On a smaller scale, you can also take snapshots with third-party tools; for example, Apple has Time Machine, which takes snapshots of your Mac.

If we were talking about the higher leagues, I would guess they don't lose data. They would have your information on hundreds of disks in lots of different countries, so not even a war could get rid of your data.

So don’t forget to do you back ups or you to could loose data and so memories