TechNet Radio – Foglight on demand for Azure Applications

Douglas Chrystall, Chief Architect at Dell Software, recently had a session on TechNet Radio talking about our new monitoring solution for Windows Azure cloud services.
In the video Douglas shows how Foglight on demand for Azure Applications helps customers better understand the performance of their cloud services in Windows Azure and diagnose and overcome real performance problems.

Using the Windows Azure Diagnostics Configuration File – Key Points

The Azure platform uses the Windows Azure Diagnostics mechanism to monitor an Azure cloud service application.
Windows Azure Diagnostics, or WAD for short, essentially collects monitoring data using a built-in agent installed on the role instance VM.
The WAD agent pushes the collected data to remote Azure Storage, where an application admin can analyze it offline.

There are several ways to configure WAD to start monitoring an Azure application, but in this post I would like to focus on a less common way to configure diagnostics: the Windows Azure Diagnostics configuration file (diagnostics.wadcfg), introduced in the Azure SDK 1.3.
For the WAD mechanism to work, the diagnostics.wadcfg file must be added to the Azure application package under the role directory so that the WAD agent can access it before the application starts on the role instance VM.

The reasons for using the diagnostics.wadcfg file instead of configuring diagnostics via the role code or the remote diagnostics API are the following:

  1. You are using a VM role in your cloud service (the PaaS flavor, not the persistent-VM IaaS one), which by definition does not support startup tasks and hence does not provide a straightforward way to set up diagnostics.
  2. You need to track the flow of the startup tasks on a role instance prior to the OnStart call in the role application code (for example, to read the event logs).

It is worth mentioning that the Windows Azure Diagnostics configuration file collects only event/trace logs and infrastructure data (for example, performance counters); it does not cover the CrashDumps, FailedRequestLogs, and IISLogs elements.

Here is an example of a diagnostics.wadcfg:

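The sketch below is illustrative; adjust the quota values and data sources for your own roles (the sample performance counter is an assumption, not a recommendation):

<DiagnosticMonitorConfiguration xmlns="http://schemas.microsoft.com/ServiceHosting/2010/10/DiagnosticsConfiguration"
                                configurationChangePollInterval="PT1M"
                                overallQuotaInMB="4096">
  <DiagnosticInfrastructureLogs bufferQuotaInMB="256" scheduledTransferLogLevelFilter="Error" scheduledTransferPeriod="PT5M" />
  <Logs bufferQuotaInMB="256" scheduledTransferLogLevelFilter="Verbose" scheduledTransferPeriod="PT5M" />
  <WindowsEventLog bufferQuotaInMB="256" scheduledTransferLogLevelFilter="Error" scheduledTransferPeriod="PT5M">
    <DataSource name="Application!*" />
  </WindowsEventLog>
  <PerformanceCounters bufferQuotaInMB="256" scheduledTransferPeriod="PT1M">
    <PerformanceCounterConfiguration counterSpecifier="\Processor(_Total)\% Processor Time" sampleRate="PT5S" />
  </PerformanceCounters>
</DiagnosticMonitorConfiguration>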

A key point when editing this file is setting the overallQuotaInMB and the various bufferQuotaInMB element sizes.

The overallQuotaInMB defines the local storage allocated on the VM disk for buffering diagnostics data, whereas the bufferQuotaInMB defines, for each diagnostics type (event logs, performance counters, etc.), the maximum allocated local disk size (once a buffer reaches its size, the oldest temporary data is overwritten).
If the bufferQuotaInMB values of the elements exceed the defined overallQuotaInMB, or if a bufferQuotaInMB is higher than 750 MB (even without reaching the overallQuotaInMB limit), the WAD agent will fail to collect any data, and unfortunately there is no way to track that.

Additional information on the exact usage of the diagnostics configuration file and best practices on that matter can be found in the following resources:
Using the Windows Azure Diagnostics Configuration File
Configuring WAD via the diagnostics.wadcfg Config File

Windows Azure Application Package New Format

Introduction

You have probably found yourself rebuilding your Windows Azure application just to make a small modification to the application configuration.
For example, let's say your initial design set each role instance to a Small VM size (1 CPU).
Along the way you noticed that the initial VM size was just not enough and the role instances need a bigger VM.
The common way to fix that is to open Visual Studio and change the application service definition, which is eventually reflected in the ServiceDefinition.csdef file under <WebRole name=[WebRoleName] vmsize=[VM_SIZE]>.

It would be easier if you could open the application package and make the change without needing the development environment.
In addition, it is entirely possible that you, as the DevOps engineer, just don't have access to the application source code, so application provisioning is an issue that needs to be addressed on a closed application package file.

The trouble with the old application package format is that the file content is encrypted (by the Azure SDK) and cannot be edited.
Luckily, Windows Azure provides a way to use an open-format (OPC) application package that can be opened with simple archive tools such as WinZip or WinRAR in order to modify specific files.

In this post I will demonstrate how to create an application package in the new format using a simple PowerShell script.

First, let's quickly cover the parts that assemble an Azure application.

Application package old vs new format

Every application deployment consists of 2 files:

  1. *.cspkg – the application code, which includes the important ServiceDefinition.csdef file mentioned above
  2. ServiceConfiguration.cscfg – the service configuration

Using the code

In the CSPack command line tool Microsoft added 2 new options for supporting the new package format:

  1. /useCtpPackageFormat – creates the new OPC application package format.
  2. /convertToCtpPackage – converts an application package from the old format to the new one.

The script below creates the Azure application package in OPC format.
(The CSPack tool can be found in the following path: C:\Program Files\Microsoft SDKs\Windows Azure\.NET SDK\<sdk-version>\bin.)

# Working script parms
$cspack = 'C:\Program Files\Microsoft SDKs\Windows Azure\.NET SDK\2012-10\bin\cspack.exe'
$workingDir = "C:\Users\[USER_NAME]\Documents\Visual Studio 2010\Projects\TestDepOneRole"
$serviceName = "TestDepOneRole"
$roleName = "WebRole1Update"
$roleBinDir = [string]::Format("{0}\{1}",$workingDir,"WebRole1")
$outPath = [string]::Format("{0}\{1}",$workingDir,"package")

# CSPack command parms
$_serviceDefinitionPath = [string]::Format("{0}\{1}\ServiceDefinition.csdef",$workingDir,$serviceName)
$_serviceConfigurationPath = [string]::Format("{0}\{1}\ServiceConfiguration.Cloud.cscfg",$workingDir,$serviceName)
$_role = [string]::Format("/role:{0};{1}", $roleName,$roleBinDir)
$_sites = [string]::Format("/sites:{0};Web;{1}",$roleName,$roleBinDir)
$_out = [string]::Format("/out:{0}\{1}OPC.cspkg",$outPath,$serviceName)

# Running script
# Create the output directory if it does not exist yet
if(!(test-path $outPath -pathtype container)){
	new-item -Name "package" -ItemType directory -Path $workingDir
}
# Build the package in the new OPC format and place the service configuration next to it
& $cspack $_serviceDefinitionPath $_role $_sites $_out /useCtpPackageFormat
Copy-Item $_serviceConfigurationPath $outPath

Opening the file using WinRAR reveals the following:

[Screenshot: the OPC package contents opened in WinRAR]

You can notice that the service definition can be edited and inserted back into the application package to submit the changes.
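The same inspection can be scripted; below is a minimal sketch that lists the package parts, assuming the OPC package follows the standard Open Packaging Conventions readable through the .NET WindowsBase assembly (the package path is a placeholder):

# Load OPC (Open Packaging Conventions) support from .NET
Add-Type -AssemblyName WindowsBase

# Placeholder path to the OPC-format package created by the script above
$opcPackagePath = "C:\Users\[USER_NAME]\Documents\Visual Studio 2010\Projects\TestDepOneRole\package\TestDepOneRoleOPC.cspkg"

# Open the package read-only and list its parts (the service definition should be among them)
$package = [System.IO.Packaging.Package]::Open($opcPackagePath, [System.IO.FileMode]::Open, [System.IO.FileAccess]::Read)
try {
    $package.GetParts() | ForEach-Object { $_.Uri.ToString() }
}
finally {
    $package.Close()
}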

Points of Interest

The OPC format is well suited in Azure for static content editing, for example modifying external properties files in the application package.
However, in the current implementation the OPC package does not resemble the structure you would expect from the cloud solution.
For example, the LocalContent folder contains hash files that cannot be extracted, and there isn't a proper directory structure like in a Visual Studio cloud project, so editing a simple CSS file is not yet possible.

Windows Azure SQL Database Management

Windows Azure SQL Database provides very handy management commands, exposed either by a REST API or by PowerShell cmdlets.
Looking closely we can find various management operations such as:

  1. Creating/deleting a Windows Azure SQL Database server in our subscription (the equivalent of an on-premises SQL Server instance)
  2. Defining firewall rules to allow access at the server or database level (SQL Database provides 2 firewall layers: server level and database level)
  3. Updating the server administrator password.

The management operations above can all be performed from the Azure management portal, so you probably ask yourself why those commands are exposed in the first place.
The answer is simple: automation.

Here are a few steps that probably every company goes through in order to set up a Windows Azure cloud service:

  1. First they create a server in Windows Azure SQL Database,
  2. then they create database instances on that server,
  3. after that, firewall rules need to be defined in order for the application to get access to the databases,
  4. and finally the cloud service package is uploaded to the Windows Azure cloud.

Automating that process can really speed things up when setting up a new environment.
Pulling our sleeves up, let's create a PowerShell script that accommodates the configuration process above:

# Create a new server
New-AzureSqlDatabaseServer -AdministratorLogin [user_name] -AdministratorLoginPassword [password] -Location [data_center_name]

# Create server firewall rule
New-AzureSqlDatabaseServerFirewallRule -ServerName "[server_name]" -RuleName "allowAzureServices" -StartIpAddress 0.0.0.0 -EndIpAddress 0.0.0.0

# Setup a new database
$connectionString = "Server=tcp:[server_name].database.windows.net;Database=master;User ID=[user_name]@[server_name];Password=[password];Trusted_Connection=False;Encrypt=True;" 
$connection = New-Object System.Data.SqlClient.SqlConnection
$connection.ConnectionString = $connectionString
$connection.Open()

# Verify the existence of the desired database
$command = New-Object System.Data.SQLClient.SQLCommand
$command.Connection = $connection
$command.CommandText = "select name from sys.databases where name='[database_name]'"
$reader = $command.ExecuteReader()
$databaseExists = $reader.HasRows
# Close the reader before issuing another command on the same connection
$reader.Close()

if(!$databaseExists){
# Create the database
$command.CommandText = "CREATE DATABASE [database_name]"
$command.ExecuteNonQuery()
}
$connection.Close()

# Create a cloud service
$packagePath = "[.cspkg path]" 
$configPath = "[.cscfg path]"
New-AzureService -ServiceName "[service_name]" -Label "[service_label]" -Location "[data_center_location]"

# Upload an application package to the cloud service production slot
Set-AzureSubscription "[subscription_name]" -CurrentStorageAccount "[azure_storage_account_name]"
New-AzureDeployment -ServiceName "[service_name]" -Slot "Production" -Package $packagePath -Configuration $configPath -Label "[deployment_label]"
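
The script above covers the first two operations from the list; the third, updating the server administrator password, is a one-liner (a sketch assuming the Set-AzureSqlDatabaseServer cmdlet from the same Azure PowerShell module; the names are placeholders):

# Update the administrator password on an existing server
Set-AzureSqlDatabaseServer -ServerName "[server_name]" -AdminPassword "[new_password]"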

Windows Azure Mobile Services Overview

Introduction

Mobile Services has been offered by the Windows Azure platform for the last few months as a Preview.
In this post I would like to give a quick overview of the Azure mobile service, which can serve as a first step for anyone examining mobile connectivity for their Azure cloud service application.

Motivation

The essence of using a Mobile Service is to offer mobile clients smooth access to data with little need to waste expensive mobile compute power.

Azure Mobile Services at a Glance

Mobile Services helps Windows Azure cloud services expose connectivity to various mobile apps such as iOS, Windows Phone and Windows Store (Windows 8) apps.
You should look at Mobile Services as the channel between your cloud application and the various mobile clients.
Technically speaking, the actual data is stored in, or pushed through, a Windows Azure SQL Database table via the mobile service itself.

Microsoft has made it very convenient to set up a Mobile Service from the new Azure management portal.
This is done by creating a mobile service, creating a table in SQL Database and finally downloading the sample code for the specific mobile operating system.
The great thing about the sample code is that you actually get hello-world code that can already talk to the mobile service in the cloud, so leveraging it for your own mobile app is fairly quick.
A key reference for getting started can be found on windowsazure.com.

Mobile Services Main Features

Data Store for Mobile App
The first important feature is persistence of mobile clients' data in the cloud, which has implications for basically every app.
The Mobile Services API helps you store the data collected from the various mobile clients in Windows Azure SQL Database in such a way that you don't have to deal with the actual communication difficulties between your service in the cloud and the mobile app
(please visit here for more info on that).
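
To get a feel for what happens on the wire, here is a minimal sketch of the underlying REST call (the service name, the TodoItem table and the application key are placeholder assumptions, and Invoke-RestMethod requires PowerShell 3.0):

# Insert a record into a Mobile Services table over its REST endpoint
$headers = @{ "X-ZUMO-APPLICATION" = "[application_key]" }
$body = @{ text = "hello from PowerShell"; complete = $false } | ConvertTo-Json
Invoke-RestMethod -Uri "https://[service_name].azure-mobile.net/tables/TodoItem" -Method Post -Headers $headers -Body $body -ContentType "application/json"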

Push Notification
Push notifications enable your cloud application to send ("push") notifications to mobile clients whenever you desire, thereby enhancing the functionality of your business.
The way it is done is by registering your mobile app in the Azure portal and downloading the relevant manifest settings to be integrated into your code (more on that here).
To push data to your clients you simply need to add a new item to the SQL Database table.
One disclaimer on push notifications: currently this feature is supported for Windows Store (Windows 8) and Windows Phone apps, which means no iOS or Android.

Authentication 

The mobile service also supports an authentication mechanism that enables you to enforce security for the clients that connect to your mobile service, using the various well-known identity providers such as Facebook, Twitter, Google and of course Windows accounts.

Integration With Other Cloud Services

The integration is basically done using the Mobile Services server-side code, which enables us to perform several operations whenever data is inserted into the table by the mobile app or whenever we need to push a notification to the user.
So if you need to trigger another service, you can just add some basic code to the mobile service server script that launches an external service (for example, whenever a mobile client adds an item you could send a confirmation email through an email service).

Summary

The Mobile Services offering in Windows Azure really opens developers to a new way of thinking by reducing the time and effort needed to make a cloud service truly mobile oriented.

Remote IP Filtering

In the following post I would like to demonstrate how to enforce IP address filtering for a web role using simple security settings provided by the IIS web server.
This post is just a proof of concept and should only be taken as a base for future work.

For our discussion, let's say that in production the DevOps engineer has noticed from the IIS logs that a certain client is trying to access the site in an abnormal way.
Naturally we would like the Azure application web role to be able to reject that client's IP.
In addition, the Azure application has an administration web site on a second web role that can be accessed using a custom port (for example 8080).
To increase the security of the administration web site we would like only requests originating from specific company IP address(es) to have access.

In both cases IP address restriction is needed.

IIS provides a nice feature called IP Address and Domain Restrictions.
As the feature name hints, one of its uses is to block specific IPs that try to access a web site.
In order to enable this feature on Windows Azure we can write a short startup script for our web role.
To actually add the blocked client IPs we would modify the web.config of that role.
The problem is that those settings are hard-coded in the web.config file and are loaded automatically once the web role starts.

The immediate question raised is: will I need to upgrade my Azure application in order to apply a new IP filter list every time a pesky little client tries to take my service down?
The answer is no.
We can manage the IP filtering settings "remotely" using Windows Azure Storage, which is what this post is all about.

The solution is fairly simple: we modify the web.config security settings of the Azure web role using information downloaded from a blob located in Windows Azure Storage.
This way we can add as many IPs to that blob text file as we want; pressing a button on the web site commits the changes, and that is it, the client will be banned from, or granted access to, our site.
One gotcha to consider is that modifying the IIS web.config settings at run time causes a restart of the IIS process, so we should probably consider storing the session data externally and expect a small downtime while enforcing the IP filtering rules.
One more thing to consider is that modifying the web.config settings at run time from any place other than the OnStart method of the web role, for example from an ASPX web page, requires the default AppPool identity to be granted higher privileges (I won't get into that, but it can be done from the web role code to make it automatic).

Some code ahead…

The first step is to enable the IP filtering feature.
Let's examine what needs to be done in the web role's ServiceDefinition configuration file:
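
(A sketch of the relevant fragment; the role and task names are illustrative.)

<WebRole name="WebRole1" vmsize="Small">
  <!-- Run the role entry point elevated so the role code can commit IIS configuration changes -->
  <Runtime executionContext="elevated" />
  <Startup>
    <!-- Startup task that installs and unlocks the IP filtering feature -->
    <Task commandLine="InstallIpSecurity.cmd" executionContext="elevated" taskType="simple" />
  </Startup>
  <!-- sites, endpoints and configuration settings omitted -->
</WebRole>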


From the highlighted settings, the web role will need to run in elevated mode (in order to modify the IIS settings from the role code) and a startup task needs to be defined to actually install the IP filtering feature.
Below is the startup task content that installs the IP filtering feature:

@echo off
@echo Installing and unlocking the IPv4 Address and Domain Restrictions feature
%windir%\System32\ServerManagerCmd.exe -install Web-IP-Security
%windir%\system32\inetsrv\AppCmd.exe unlock config -section:system.webServer/security/ipSecurity

The next step is to load the default IP filtering settings from a blob the first time the web role starts:

using System;
using System.IO;
using Microsoft.Web.Administration;          // ServerManager, ConfigurationElement
using Microsoft.WindowsAzure;                // CloudStorageAccount
using Microsoft.WindowsAzure.ServiceRuntime; // RoleEntryPoint, RoleEnvironment
using Microsoft.WindowsAzure.StorageClient;  // CloudBlobClient, CloudBlobContainer, CloudBlob

public class WebRole : RoleEntryPoint
{
    public override bool OnStart()
    {
        using (var server = new ServerManager())
        {
            string storageConnection = RoleEnvironment.GetConfigurationSettingValue("StorageConnectionString");
            CloudStorageAccount storageAccount = CloudStorageAccount.Parse(storageConnection);
            var blobClient = storageAccount.CreateCloudBlobClient();
            CloudBlobContainer container = blobClient.GetContainerReference("security");
            container.CreateIfNotExist();
            CloudBlob blob = container.GetBlobReference("ipfiltering.txt");
            bool isBlobExist = false;
            try
            {
                // FetchAttributes throws if the blob does not exist yet
                blob.FetchAttributes();
                isBlobExist = true;
            }
            catch (Exception e)
            {
                Console.Write(e.Message);
            }

            if (isBlobExist)
            {
                var siteNameFromServiceModel = "Web"; // update this site name for your site. 
                var siteName = string.Format("{0}_{1}", RoleEnvironment.CurrentRoleInstance.Id, siteNameFromServiceModel);
                var siteConfig = server.Sites[siteName].GetWebConfiguration();
                var ipAddressSettings = siteConfig.GetSection("system.webServer/security/ipSecurity").GetCollection();
                ipAddressSettings.Clear();

                string ipFilteringSettings = blob.DownloadText();
                using (StringReader sr = new StringReader(ipFilteringSettings))
                {
                    string line;

                    while ((line = sr.ReadLine()) != null)
                    {
                        if (line.StartsWith("#")) continue;
                        string[] settings = line.Split(' ');
                        ConfigurationElement newElement = ipAddressSettings.CreateElement("add");
                        newElement["ipAddress"] = settings[0];
                        newElement["subnetMask"] = settings[1];
                        newElement["allowed"] = settings[2];
                        ipAddressSettings.Add(newElement);
                    }
                }

                server.CommitChanges();
            }
            else
            {
                blob.UploadText("# ip filtering settings");
            }

        }

        return base.OnStart();
    }
}

The code downloads a small text file and then modifies the web.config IP security settings to apply its content.
The following section in the web.config is the one that will be modified:
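
(A sketch of the committed section; the address below is illustrative.)

<system.webServer>
  <security>
    <!-- allowUnlisted="true" means everyone is allowed except the entries marked allowed="false" -->
    <ipSecurity allowUnlisted="true">
      <add ipAddress="192.168.100.1" subnetMask="255.255.255.255" allowed="false" />
    </ipSecurity>
  </security>
</system.webServer>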

I've created a small page that displays the blocked IPs as downloaded from the blob file (by pressing the load button):

If we look closer at the code below, we can notice that when I press the save button the IPs are stored in the blob file in Azure Storage, and then the relevant IP filtering settings are stored in the web.config file (at this point a small IIS restart occurs in order to load the new web site config settings).

protected void btnSave_Click(object sender, EventArgs e)
{
    string storageConnection = RoleEnvironment.GetConfigurationSettingValue("StorageConnectionString");
    CloudStorageAccount storageAccount = CloudStorageAccount.Parse(storageConnection);
    var blobClient = storageAccount.CreateCloudBlobClient();
    CloudBlobContainer container = blobClient.GetContainerReference("security");
    container.CreateIfNotExist();
    CloudBlob blob = container.GetBlobReference("ipfiltering.txt");
    try
    {
        // FetchAttributes throws if the blob is missing; the catch below then skips the update
        blob.FetchAttributes();
        blob.UploadText(txtTest.Text);
        using (var server = new ServerManager())
        {

            var siteNameFromServiceModel = "Web"; // update this site name for your site. 
            var siteName = string.Format("{0}_{1}", RoleEnvironment.CurrentRoleInstance.Id, siteNameFromServiceModel);
            var siteConfig = server.Sites[siteName].GetWebConfiguration();
            var ipAddressSettings = siteConfig.GetSection("system.webServer/security/ipSecurity").GetCollection();
            ipAddressSettings.Clear();

            string ipFilteringSettings = txtTest.Text;
            using (StringReader sr = new StringReader(ipFilteringSettings))
            {
                string line;

                while ((line = sr.ReadLine()) != null)
                {
                    if (line.StartsWith("#")) continue;
                    string[] settings = line.Split(' ');
                    Microsoft.Web.Administration.ConfigurationElement newElement = ipAddressSettings.CreateElement("add");
                    newElement["ipAddress"] = settings[0];
                    newElement["subnetMask"] = settings[1];
                    newElement["allowed"] = settings[2];
                    ipAddressSettings.Add(newElement);
                }
            }

            server.CommitChanges();
        }
    }
    catch(Exception err)
    {
        Console.WriteLine(err.Message);
    }
}

Finally, to test that the IP filtering specified above works, I included my own IP; a 403 error message confirms it.

This posting is provided ‘AS IS’ with no warranties, and confers no rights.

New Azure Portal – Resource Linking

I came across a really cool feature in the new Windows Azure portal (manage.windowsazure.com).

If you look carefully under the cloud service section you will find that there is a tab named Linked Resources.

Typically your Azure application will use SQL Azure, Azure Storage, or both.
Well, it would be great to view your own use of those storage services under the same cloud service,
and that is the reason why the Azure team made it possible to logically link your cloud service to the storage services you are using.
At this time you are able to link your cloud service only to your SQL Azure instance; I guess it won't be long until your Azure storage account can be linked as well.

So what is it good for, you ask?

  1. Scale your application from the same Scale section.
  2. Direct link to your database section under the Storage tab.
  3. Simple monitoring of your storage (currently only database capacity management is available, from the dashboard section link to the storage tab in the portal).

To summarize, this is a pretty simple feature in the Azure portal, but yet again it seems that someone understands that even in the clouds, people are involved :)

The new Windows Azure – Web Sites

A cool new feature MS added to the Windows Azure Preview release is the option to create a web site in seconds.
I attended a great session on Web Sites in Windows Azure presented by Bill Staples at TechEd North America 2012 (the video will probably be available at Channel 9 soon), and in the demo a site was created in less than 6 seconds.
After the site is created in Azure you will need to deploy the site content.

Let's talk about the several nice options for deploying your site to Azure:

  1. FTP – simple file transfer directly to your hosting machine in Azure (the wwwroot dir); you can use the very popular FTP app FileZilla.
  2. Git – an open-source distributed source control system. Your site in Azure can expose a Git endpoint
    for remote connections, and this can be set up very easily through the portal (see the sketch after this list).
  3. TFS – Microsoft Team Foundation Server source control; again, the portal enables an online service for pushing directly to your site in Azure.
  4. Web Deploy
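
Here is a sketch of the Git option (the remote URL is a placeholder; the actual URL is shown in the portal once Git publishing is enabled for the site):

# Push the local site content to the Git endpoint exposed by the Azure web site
git init
git add .
git commit -m "initial deployment"
git remote add azure "https://[user_name]@[site_name].scm.azurewebsites.net/[site_name].git"
git push azure master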

Free stuff:

Right now you are able to create 10 sites in Azure for free.
The small downside to it is that your sites will be sharing the server resources with other sites in Azure (the second option is "reserved", meaning your site will be running on a single VM dedicated to you, and of course you can create multiple VM instances, but as you would expect this is not free).

Moving from a web site to a service:
With the new Azure Visual Studio tooling you can add to your web site project a cloud project that can have roles in it, in order to expand your web site functionality from a simple site that serves web pages to a full service that has dedicated back-end machines responsible for some sort of processing.

Web site application gallery:
The last thing I will mention is the option to create a site from the large variety of applications offered, such as WordPress (create your own blog hosted in Azure), Drupal (CMS – content management system for a personal or company site), Joomla (another very popular CMS), etc.

New Azure Release – Distributed Cache

You have probably heard that MS is releasing the new Spring (Preview) release for Windows Azure.
Here is a quick summary of what it is all about:

  1. Windows Azure Virtual Machines – persistent VMs: any modification to the VM and any data persists.
  2. Windows Azure Virtual Network – create VPNs and extend on-premises applications, control the network configuration, etc.
  3. Windows Azure Web Sites – build elastic web sites using many of the open-source applications like WordPress, Drupal, etc.
  4. New Azure portal and SDKs

More information can be found in Scott Guthrie's great post.
The topic I would like to focus on in this post is the Distributed Cache.
The Distributed Cache enables you to set up an in-memory, low-latency distributed cache used solely by your application, supported by the full AppFabric Cache Server API (supporting region containers, notifications on cache operations, high availability and local cache).
Just as a reminder, the former cache service for Windows Azure (Shared Cache) used dedicated multi-tenant servers that the user didn't manage; you simply registered the cache using the Azure portal.
Co-located:
Every role can allocate a percentage of its memory to be used by the distributed cache.
With this approach each role instance can access data stored by other roles, and the even greater thing is that it is free!
This way you can exploit your VM memory resources, which might otherwise have been unused memory.

Cache worker roles:
The second approach to the distributed cache uses worker roles for caching purposes only.
For example, you could have 1 web role instance that uses a distributed cache constructed from the memory of 2 worker role instances.
In this approach you expand the boundaries of your cache by just throwing in another instance.
Moreover, the cache itself can flexibly increase and decrease at run time.

Data Center Location – Azure Portal Update

If you look carefully you have probably noticed that the option of creating a hosted service in an "Anywhere" data center has been removed (Anywhere US, Anywhere Europe, Anywhere Asia).

[Screenshot: the data center location options in the Azure portal]

This option was pretty much redundant and was mostly targeted at fresh Azure users.

Why redundant ?

Choosing the Anywhere option leaves it to the Azure fabric to decide where to install your hosted service based on the available data centers in that region.
Behind the scenes, the chosen data center location is not registered in the RDFE server configuration (the router and traffic manager above the fabric controller) as it is when you choose an actual data center location, so this information is not exposed to external APIs.
If you think about it, should you care where your service is located?
Yes, you do!
The data center location should be as close as possible to the geographic location of the majority of your users.
You could find yourself dealing with latency problems because you did not choose the right data center.
Moreover, if you have a full-blown SaaS solution that uses one or more of the Azure services (storage, caching, service bus, etc.), it is highly important to understand the physical/geographical impact of each of the Azure PaaS Lego parts.

The decision on the data center location can be taken a bit further.
It is good practice to define affinity groups in your subscription account
(just as a short reminder, an affinity group is an alias for a data center location).
[Screenshot: defining an affinity group in the Azure portal]

Once you define an affinity group you can rest assured that your hosted service's data center location is in sync with the rest of the Azure services you are using in your application.
So instead of choosing a data center when you create a hosted service or a storage account, just choose the affinity group name.
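
A short sketch with the Azure PowerShell cmdlets (all names are placeholders):

# Create an affinity group once, then reference it instead of a raw data center location
New-AzureAffinityGroup -Name "[affinity_group_name]" -Location "[data_center_location]"

# Hosted service and storage account pinned to the same affinity group
New-AzureService -ServiceName "[service_name]" -AffinityGroup "[affinity_group_name]"
New-AzureStorageAccount -StorageAccountName "[storage_account_name]" -AffinityGroup "[affinity_group_name]"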

[Screenshot: choosing the affinity group when creating a hosted service]
