Saturday, August 27, 2011

Migrating a Web Application to Azure: Step 4

Moving the application to Windows Azure


A New Project

The first step is to add a new cloud project to the solution.


Ensure that you do not add any web or worker roles.

We will convert the existing web application within the solution to a Web Role in the Azure Project.

Right-Click on the ‘Roles’ folder in the Azure Project, select ‘Add’ and then ‘Web Role Project in solution…’.

Select the web application – in this case ‘SharPix_Step2’


After adding the project as a Web Role, it should look like this:


Right-Click on the ‘SharPix_Step2’ role and select ‘Properties’

In this dialog, under normal circumstances, you should set Instance count to at least 2 to enable fail-over and scaling, and you would pick an appropriate VM size.

Also, we will start by using the Compute and Storage Emulators, so specify ‘UseDevelopmentStorage=true’ as the Diagnostics storage account.
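These settings end up in the Azure Project’s ServiceConfiguration.cscfg. As a rough sketch (the service name and setting name below are illustrative; check the file Visual Studio generated for your project):

```xml
<?xml version="1.0" encoding="utf-8"?>
<ServiceConfiguration serviceName="SharPix"
    xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration">
  <Role name="SharPix_Step2">
    <!-- at least 2 instances in production for fail-over and scaling -->
    <Instances count="2" />
    <ConfigurationSettings>
      <!-- point diagnostics at the local Storage Emulator while developing -->
      <Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString"
               value="UseDevelopmentStorage=true" />
    </ConfigurationSettings>
  </Role>
</ServiceConfiguration>
```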


Running In Emulator…

Visual Studio comes with a tiny taste of the cloud built-in, in the form of the Storage and Compute Emulators.

Simply running our new Azure Project gives us:


Note the URL and the port number. This is no longer the Visual Studio Development Web Server, but the Compute Emulator in action. Examining the Windows Tray shows us that the emulators are running….


To the cloud!

In order to push the Azure Project to our cloud environment, we need to configure Visual Studio and set up our credentials to ensure that we carefully control access to our Azure environment.

Right-Click on the Azure Project and select ‘Publish’. We could use previously issued certificates here, or issue a new one for this project by selecting ‘Add’.


Create a New Certificate, and fill in the details required.


Visual Studio creates a new certificate and stores it in a temporary, obscure folder. Copy the path from this dialog onto your clipboard and head over to the Windows Azure Management Portal.


Click on ‘Management Certificates’, select the Azure Subscription you want to deploy to, and select ‘Add Certificate’ from the right-click context menu.


Once the certificate is uploaded, select and copy the Subscription ID from the portal and paste it into field (3) of the Windows Azure Project Management Authentication dialog shown above. Give the project a reasonable name and click ‘OK’.

After dispatching all the other dialogs, you will be left with the first ‘Publish Windows Azure Application’ dialog. Ensure you have selected a Production Environment, and specify the storage account to use.


Hurry Up and Wait

Clicking ‘Publish’ on the dialog above begins the deployment process, which can easily take 15 minutes or more. During this time, some information is relayed through to Visual Studio about the actions being taken and their various statuses.


When the deployment is complete, click on the ‘Website URL’ link, or view the deployment in the Server Explorer and select ‘View in Browser’ from the right-click context menu.

The application finally appears. Note the URL is now that of an Azure cloud application.


In the Management Portal, expanding the Azure Subscription will show the Deployment node and all the compute instances fired up in the production environment. In our case, we have just one instance, but we could add more.


And there it is: the formerly local web application, now running in the cloud and using cloud-based blob and database storage.

Next we’ll talk about federating authentication to this application.



Migrating a Web Application to Azure: Step 3

Getting rid of file access

One issue to watch out for with an application in the cloud is file access.

File access in a highly elastic environment such as Azure is generally a bad idea: files written to local disk by one server instance are not visible to the others, so successive requests cannot reliably be handled by multiple instances. To combat this, all access to the file system should generally be changed to access Azure storage, which is shared across server instances and scales with them.

In our example, we have contrived to have a picture stored in the database, but cached on the file-system on its first use, and subsequent access is to the cached file.

Local Application, Images from Server Files

We have code that looks like this in the original code:

public ServerPath EnsureImage(Picture picture)
{
    // ensure the directory exists
    var directoryPath = EnsureDirectory(RelativePathToImagesDirectory);

    // now ensure the file exists, creating it from the image bytes if not
    var filePath = EnsureFile(
        directoryPath,
        picture.OriginalFilename,
        _path =>
        {
            using (var stream = new FileStream(_path.Absolute, FileMode.Create,
                FileSystemRights.TakeOwnership | FileSystemRights.Write,
                FileShare.None, 4096, FileOptions.None))
            {
                stream.Write(picture.Image.ToArray(), 0, picture.Image.Length);
            }
        });

    return filePath;
}

internal ServerPath EnsureFile(ServerPath root,
    string filename,
    Action<ServerPath> fileCreateFunc)
{
    var path = new ServerPath(Path.Combine(root.Relative, filename), Server.MapPath);

    if (!File.Exists(path.Absolute))
        fileCreateFunc(path);

    return path;
}

which results in


Note that the web application is still running locally, and the images refer to files located at relative paths within the server.

To remove the dependency on the file-system, we can use Azure Blob Storage to store the cached files.

The Rise Of The Blob…

Blob storage in Azure is very cool. It presents a REST-based API to upload, tag, list and retrieve unstructured binary data such as images, documents and videos.

So we can switch the application to interact with the Azure Blob store, referring to the images from the storage in the cloud.

To do this, we use the Azure storage account we created earlier.


The important pieces of information to capture are the Access Key (you get this when you click on the button with the orange outline), and the Azure storage account name which is highlighted.

We then initialize access to the blob storage account and container from within the application:

public void Initialize()
{
    if (CloudStorageAccount == null)
        CloudStorageAccount = CloudStorageAccount.Parse("AccountName=cos204;AccountKey=<put_your_key_here>;DefaultEndpointsProtocol=http");

    if (CloudBlobClient == null)
        CloudBlobClient = new CloudBlobClient(CloudStorageAccount.BlobEndpoint, CloudStorageAccount.Credentials);

    if (ThumbnailsContainer == null)
    {
        ThumbnailsContainer = CloudBlobClient.GetContainerReference("thumbnails");
        ThumbnailsContainer.CreateIfNotExist();
    }

    if (ThumbnailsContainer.GetPermissions().PublicAccess != BlobContainerPublicAccessType.Container)
        ThumbnailsContainer.SetPermissions(new BlobContainerPermissions { PublicAccess = BlobContainerPublicAccessType.Container });

    if (ImagesContainer == null)
    {
        ImagesContainer = CloudBlobClient.GetContainerReference("images");
        ImagesContainer.CreateIfNotExist();
    }

    if (ImagesContainer.GetPermissions().PublicAccess != BlobContainerPublicAccessType.Container)
        ImagesContainer.SetPermissions(new BlobContainerPermissions { PublicAccess = BlobContainerPublicAccessType.Container });
}

What we are doing here is initializing the Blob Storage infrastructures and creating containers for Thumbnails and Images.

It’s important to note that the demo code shows a hardcoded Azure connection string purely to keep the snippet self-contained for reference. Normally this would be hidden away in a configuration file.
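Once the application runs as a web role, one way to keep the connection string out of the code is to define a setting on the role in ServiceConfiguration.cscfg and read it through the service runtime. A minimal sketch (the ‘DataConnectionString’ setting name is an assumption, not from the project):

```csharp
// assumes a 'DataConnectionString' setting has been defined on the role
// (the setting name here is illustrative)
var connectionString = RoleEnvironment.GetConfigurationSettingValue("DataConnectionString");
CloudStorageAccount = CloudStorageAccount.Parse(connectionString);
```

This also lets you swap between ‘UseDevelopmentStorage=true’ and the real storage account without recompiling.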

Local Application, Remote Images

We can now replace the file-access code shown above with its Azure Blob equivalent.

public ServerPath EnsureImage(Picture picture)
{
    var filePath = EnsureBlob(
        ImagesContainer,
        picture.OriginalFilename,
        _blob =>
        {
            using (var blobStream = _blob.OpenWrite())
            {
                blobStream.Write(picture.Image.ToArray(), 0, picture.Image.Length);
            }
        });

    return filePath;
}

internal ServerPath EnsureBlob(CloudBlobContainer blobContainer,
    string filename,
    Action<CloudBlob> blobCreateFunc)
{
    var blob = blobContainer.GetBlobReference(filename);

    if (blobContainer.ListBlobs().FirstOrDefault(_blob => _blob.Uri.AbsoluteUri == blob.Uri.AbsoluteUri) == null)
        blobCreateFunc(blob);

    return new ServerPath(filename, blob.Uri.AbsoluteUri);
}

With this implementation of EnsureImage(), we get:


Pretty cool.

We now have a web application running on the local server, with data in the cloud, caching files and referencing Azure Blob storage also in the cloud!

Inspecting Blob Storage

While developing and debugging this stage of the migration, it’s useful to see exactly what is being stored in the Azure Blob storage.

The Windows Azure Management Portal does not have a handy tool to inspect the structure and contents of the Azure Storage account, so we turn to some other tools to do that.

The Windows Azure Tools for Visual Studio 2010 v1.4 include a storage account browser, which can be configured with the storage account name and key as before, allowing full access to the storage account.


We can use this to ensure that the blob storage account is structured as we expect, and filled with the contents we expect.

Next, we’ll move the application itself to Azure.


Friday, August 26, 2011

Migrating a Web Application to Azure: Step 2

Migrating the database to SQL Azure

Now that we have a local web-application with a database, and a SQL Azure instance up and running, we can start the application migration by moving the database to SQL Azure.

SQL Azure is a modified version of SQL Server 2008 R2, and there are some small but significant limitations on the SQL DDL commands that can be run on Azure. See here and here for more information.

The upshot is that if you already have a local SQL Server database, you’re much better off using a specialized migration tool to move it to Azure. The tool of choice for us is the SQL Azure Migration Wizard from CodePlex.

Download and install the latest version of this tool on your development box. You’ll thank the tool-developer later!

Fire up the tool, and select “SQL Database” under “Analyze / Migrate”


Select the local database server which hosts the instance you want to migrate


Select the database(s) to migrate in turn.

In our case, we will have to migrate both the ASPNETDB and PICTURES databases.


Click through the dialogs, and migrate the database objects present in the database.


The Migration Wizard has created the script to use for the migration, and now requires credentials for the target server. This is where you fill in the details from the SQL Azure instance screen in the previous step.


Let’s let the migrator create a PICTURES database on the Azure Server


…and begin the migration work


…and we are done!


The Azure portal shows the two databases created and ready for use.


Now, let’s hook up the application we wrote to use these databases in the cloud. To do this, we point the connection strings in the Web.Config of the web application to the databases in the cloud.
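For reference, the rewritten entries take roughly this shape (the server name, user and connection string names below are placeholders, not the project’s real values):

```xml
<connectionStrings>
  <add name="ApplicationServices"
       connectionString="Server=tcp:yourserver.database.windows.net;Database=ASPNETDB;User ID=admin@yourserver;Password=...;Trusted_Connection=False;Encrypt=True;"
       providerName="System.Data.SqlClient" />
  <add name="PicturesConnectionString"
       connectionString="Server=tcp:yourserver.database.windows.net;Database=PICTURES;User ID=admin@yourserver;Password=...;Trusted_Connection=False;Encrypt=True;"
       providerName="System.Data.SqlClient" />
</connectionStrings>
```

Note the `user@server` login form and `Encrypt=True`, both of which SQL Azure expects.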



And that’s it. Here’s the application running locally but with the data in the cloud.


I’ve now created a new album called “Clouds” and uploaded three new pictures. If we connect to the Azure database server and look at the data in the PICTURES database, we see the new data there:


Next, we’ll talk about setting up the Azure workspace so we can move our application there…


Migrating a Web Application to Azure: Step 1

Setting up your Azure workspace

So we have a working web-application which we want to move to Azure.

The first thing to do is to get a Windows Live ID and line up one Windows Azure subscription and one SQL Azure subscription.

Sign in to the Windows Azure Management Portal with your Live ID


Click on the “Hosted Services, Storage Accounts and CDN” button on the bottom left panel, then on “Hosted Services” on the top left panel.

Right-Click on your Azure Subscription in the main panel and select “New Hosted Service”


Choose not to deploy at this stage!

Similarly create a Storage Account on the same subscription.


Now create a database server on the SQL Azure Subscription.


Ensure you set the firewall rules to include both your machine’s IP address and other Azure services, since your SQL Azure instance will need to be accessed from your Windows Azure instances.


Once you create the database server, you should make a note of the following pieces of information:

  • The subscription id
  • Your administrator login and password
  • The instance name, and the fully-qualified database server DNS name


Now we have Windows Azure and SQL Azure instances set up for use.

Next we’ll migrate our database to SQL Azure.



Thursday, August 25, 2011

Migrating a Web Application to Azure: Step 0

Creating a traditional, locally run web application

Let’s start with a simple, traditional web application.

This is a contrived application, built specifically for demonstrating the issues we want to address in the process of migrating to Azure.

Please do not consider this application as indicative of following best-practices for web application development!

Also, the purpose of this web application is not to wow anyone with a slick user interface! :)

Let’s imagine a site where someone logs on, creates a photo album or two, and uploads a few images to each album.


You will need the following applications and tools to follow along with the process of migrating an application to Azure:

  • Visual Studio 2010 (SP1 strongly recommended)
  • SQL Server 2008 R2; you’ll be using SQL Server Management Studio
  • Windows Azure SDK 1.4

We’ll introduce a few more tools along the way, but we can get started with this.


Let’s create this application the simplest way possible:

First, create the ASPNETDB Application Services database on the SQL Server 2008 instance. You may need to run aspnet_regsql.exe from the .NET framework directory.
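If the Application Services database doesn’t exist yet, a command along these lines creates it (the framework path and SQL Server instance name below are typical defaults; adjust them for your machine):

```shell
cd %WINDIR%\Microsoft.NET\Framework\v4.0.30319
rem -S server, -E Windows auth, -A all adds all application services features
aspnet_regsql.exe -S .\SQLEXPRESS -E -A all -d ASPNETDB
```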


Then create a basic web application in Visual Studio:


Modify the Web.Config on the web application to point the Application Services Providers to our database.


Pressing F5 to run/debug the web application should fire up the default application, and allow you to create an account to authenticate against in future. The default home page after logging in should look like this:


To create the picture album site, we create a new database with this simple schema:


Note that for this demonstration, we will store the uploaded picture in an image column on the Picture table, in the database.

We’ll use Linq 2 SQL to create a data-model for our application.

When we’re done with adding the Linq2Sql classes, the connectionStrings collection in our Web.Config looks like:
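At this stage the entries point at the local server, something like this (names are illustrative):

```xml
<connectionStrings>
  <add name="PicturesConnectionString"
       connectionString="Data Source=.\SQLEXPRESS;Initial Catalog=PICTURES;Integrated Security=True"
       providerName="System.Data.SqlClient" />
</connectionStrings>
```

These are exactly the strings we will repoint at SQL Azure in Step 2.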


Now we’ll actually develop the application, and let’s say we finally land up with:


Let’s also assume that the application contains, for the sake of this demonstration, the following code to extract the binary data from the database and cache the file on the file system, so that the images on the rendered page refer to the cached file:

var relativePathToDirectory = "~/.cache";

// ensure the directory exists
var absolutePathToDirectory = Server.MapPath(relativePathToDirectory);
if (!Directory.Exists(absolutePathToDirectory))
    Directory.CreateDirectory(absolutePathToDirectory);

// now ensure the file exists
var relativePathToFile = Path.Combine(relativePathToDirectory, picture.OriginalFilename);
var absolutePathToFile = Server.MapPath(relativePathToFile);

if (File.Exists(absolutePathToFile)) return absolutePathToFile;

using (var stream = new FileStream(absolutePathToFile, FileMode.Create,
    FileSystemRights.TakeOwnership | FileSystemRights.Write,
    FileShare.None, 4096, FileOptions.None))
{
    stream.Write(picture.Image.ToArray(), 0, picture.Image.Length);
}

return absolutePathToFile;


We have a rudimentary, but fully functional web application that runs on our local web server in the traditional manner.

We will need to consider the specific changes to make when we want to migrate this application to the cloud.


We’ll put the source code on github and stick the link in here just after the talk.


Migrating a Web Application to Azure: Introduction


This is a series of posts relating to the details of the talk Mahesh Krishnan and I will be giving at TechEd Australia 2011. You can follow along in this series of posts for more details and gotchas that we can’t cover in the 60 minute talk at TechEd!

Here are the details of the talk:

We’ll be taking a contrived, but simple, traditional data-driven web application, and outlining the issues to consider, and steps to take, to move the application to Windows Azure/Sql Azure.

We’ll show you how to set up your Azure subscription, migrate your database to Sql Azure, manage session state with Azure’s Session State Manager, move file-system based actions to Windows Azure blob storage, and secure your application with Azure’s ACS 2.0 Federated Authentication.

We’ll show you what tools you need and how to use Visual Studio 2010 SP1 to simplify the process of migrating.

There’s some code, but mostly we’re going to be discussing issues, tips and tricks.

See you there!


Sunday, July 10, 2011

Functional Object Manipulation


In the last couple of posts, we’ve looked at ways to copy scalar properties from one object to another, with the high performance of explicit assignments and the low maintenance of reflection.

The approach of creating a lambda to operate on the object instance as a whole shows a lot of promise. For example, we may want to compare two object instances property-by-property, and the code for that would look quite similar.

public static class Comparer
{
    public static bool Compare<T>(this T source, T destination) { return CompareClosure<T>.Compare(source, destination); }

    private static class CompareClosure<T>
    {
        private static Func<T, T, bool> BuildComparerLambda(Func<PropertyInfo, bool> propertyFilter = null)
        {
            propertyFilter = propertyFilter ?? (_ => true);

            var sourceParameterExpression = Expression.Parameter(typeof(T), "_left");
            var destinationParameterExpression = Expression.Parameter(typeof(T), "_right");

            var properties = typeof(T).GetProperties().Where(propertyFilter);
            var expressions = properties.Select(_pi => Expression.Equal(Expression.Property(destinationParameterExpression, _pi), Expression.Property(sourceParameterExpression, _pi)));
            var conjoinedExpression = expressions.Aggregate<Expression, Expression>(Expression.Constant(true), Expression.AndAlso);

            var lambdaExpression = Expression.Lambda<Func<T, T, bool>>(conjoinedExpression, sourceParameterExpression, destinationParameterExpression);
            return lambdaExpression.Compile();
        }

        private static readonly Func<T, T, bool> _compare = BuildComparerLambda();
        public static bool Compare(T source, T destination) { return _compare(source, destination); }
    }
}

which gives us:


What we are doing in the code is mapping each property to a boolean value indicating whether the two instances have the same value for that property, and then folding that set of boolean values down to a single boolean by aggregating it with the && operator (and a seed value of ‘true’).
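To make the map-and-fold concrete, here is a self-contained sketch of just the comparison part, using a hypothetical Person type (the type and values are purely illustrative):

```csharp
using System;
using System.Linq;
using System.Linq.Expressions;

public static class ComparerSketch
{
    // Build a compiled comparer: one Equal() node per public property,
    // folded into a single expression with AndAlso, seeded with 'true'.
    public static Func<T, T, bool> BuildComparer<T>()
    {
        var left = Expression.Parameter(typeof(T), "_left");
        var right = Expression.Parameter(typeof(T), "_right");

        var body = typeof(T).GetProperties()
            .Select(pi => (Expression)Expression.Equal(
                Expression.Property(left, pi),
                Expression.Property(right, pi)))
            .Aggregate((Expression)Expression.Constant(true), Expression.AndAlso);

        return Expression.Lambda<Func<T, T, bool>>(body, left, right).Compile();
    }

    // Hypothetical type, for illustration only.
    public class Person
    {
        public string Name { get; set; }
        public int Age { get; set; }
    }

    public static void Main()
    {
        var compare = BuildComparer<Person>();

        Console.WriteLine(compare(
            new Person { Name = "Ada", Age = 36 },
            new Person { Name = "Ada", Age = 36 })); // True

        Console.WriteLine(compare(
            new Person { Name = "Ada", Age = 36 },
            new Person { Name = "Ada", Age = 21 })); // False
    }
}
```

The compiled delegate runs at the speed of hand-written comparisons, while the property list is maintained automatically by reflection at build time.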

We can readily see a pattern emerging here:

There are two types of operations possible for a given arity. In the case of Compare and Copy where the operation arity is 2, we can create either an Action<T, T> or a Func<T, T, TResult>.

The Action<> case is a lambda that performs an operation on each property with reference to the given argument instances.

We can create a general operation builder by lifting a parameter which specifies the desired operation on a property, with reference to the lambda arguments.

public static Action<T, T> BuildOperationAction<T>(
    Func<PropertyInfo, ParameterExpression, ParameterExpression, Expression> operationExpressionFactory)
{
    var leftParameterExpression = Expression.Parameter(typeof(T), "_left");
    var rightParameterExpression = Expression.Parameter(typeof(T), "_right");

    var properties = typeof(T).GetProperties();
    var expressions = properties.Select(_ => operationExpressionFactory(_, leftParameterExpression, rightParameterExpression));

    var blockExpression = Expression.Block(expressions);
    var lambdaExpression = Expression.Lambda<Action<T, T>>(blockExpression, leftParameterExpression, rightParameterExpression);

    return lambdaExpression.Compile();
}

The Func<> case is a lambda that maps the set of properties to a set of values with reference to the argument instances, and then aggregates this set of values to a single value using an aggregator.

We can create a general operation builder by lifting:

  1. a parameter which specifies the desired operation on a property, with reference to the lambda arguments
  2. a parameter which specifies the aggregate operation to fold the set of computed values to a single value, and
  3. a parameter specifying the initial value for the aggregator.


public static Func<T, T, TResult> BuildOperationFunc<T, TResult>(
    Func<PropertyInfo, ParameterExpression, ParameterExpression, Expression> operationExpressionFactory,
    Func<Expression, Expression, Expression> conjunction, Expression seed)
{
    var leftParameterExpression = Expression.Parameter(typeof(T), "_left");
    var rightParameterExpression = Expression.Parameter(typeof(T), "_right");

    var properties = typeof(T).GetProperties();
    var expressions = properties.Select(_ => operationExpressionFactory(_, leftParameterExpression, rightParameterExpression));

    var joinedExpression = expressions.Aggregate(seed, conjunction);
    var lambdaExpression = Expression.Lambda<Func<T, T, TResult>>(joinedExpression, leftParameterExpression, rightParameterExpression);

    return lambdaExpression.Compile();
}

This allows us to implement Copy() and Compare() as follows:

public static bool CompareWith<T>(this T source, T destination) { return CompareClosure<T>.Compare(source, destination); }

private static class CompareClosure<T>
{
    private static readonly Func<T, T, bool> _compare =
        BuildOperationFunc<T, bool>(
            (_pi, _src, _dest) => Expression.Equal(Expression.Property(_src, _pi), Expression.Property(_dest, _pi)),
            Expression.AndAlso,
            Expression.Constant(true));

    public static bool Compare(T source, T destination) { return _compare(source, destination); }
}

public static void CopyTo<T>(this T source, T destination) { CopyClosure<T>.Copy(source, destination); }

private static class CopyClosure<T>
{
    private static readonly Action<T, T> _copy =
        BuildOperationAction<T>(
            (_pi, _src, _dest) => Expression.Assign(Expression.Property(_dest, _pi), Expression.Property(_src, _pi)));

    public static void Copy(T source, T destination) { _copy(source, destination); }
}

Very neat.

We could develop operations beyond Copy and Compare, building fast, automatically maintained mappers for data access (think ORMs) and UI field access, simply by specifying the equivalent expression; the development effort reduces to writing the expression itself.

For example, here is a mapper which maps properties defined on type T, from a dynamic source object to backing fields on a target object of type T:

private static readonly Action<T, T> _copyDynamicToStaticBackingFields =
    BuildOperationAction<T>(
        (_pi, _src, _dest) =>
        {
            // ThisArgument: the CSharpArgumentInfo for the receiver, defined elsewhere
            var sourceBinder = RuntimeBinder.GetMember(CSharpBinderFlags.InvokeSpecialName, _pi.Name, typeof(T), new[] { ThisArgument });
            var sourceGetExpression = Expression.Convert(Expression.Dynamic(sourceBinder, typeof(object), _src), _pi.PropertyType);

            // map property 'Name' to backing field '_name' by convention
            var fieldName = String.Format("_{0}{1}", _pi.Name.Substring(0, 1).ToLower(), _pi.Name.Substring(1));

            var fieldInfo = typeof(T).GetField(fieldName, BindingFlags.Instance | BindingFlags.NonPublic);
            if (fieldInfo == null) return Expression.Default(typeof(void));

            var destinationAccessExpression = Expression.Field(_dest, fieldInfo);
            return Expression.Assign(destinationAccessExpression, sourceGetExpression);
        });

We could further develop this approach by developing the BuildOperationXXX functions for Action<>s and Func<>s of arity 1, which would allow us to quickly develop fast, low-maintenance object printers and serializers.
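As a sketch of that arity-1 idea (illustrative only, not code from the CodePlex project, and using the same usings as the snippets above), a property printer built with the same pattern might look like:

```csharp
// Arity-1 variant: map each property to a "Name=Value " fragment,
// then fold the fragments together with string.Concat.
public static Func<T, string> BuildPrinter<T>()
{
    var instance = Expression.Parameter(typeof(T), "_instance");

    var format = typeof(string).GetMethod("Format",
        new[] { typeof(string), typeof(object), typeof(object) });
    var concat = typeof(string).GetMethod("Concat",
        new[] { typeof(string), typeof(string) });

    var fragments = typeof(T).GetProperties().Select(_pi =>
        (Expression)Expression.Call(format,
            Expression.Constant("{0}={1} "),
            Expression.Constant(_pi.Name, typeof(object)),
            Expression.Convert(Expression.Property(instance, _pi), typeof(object))));

    var body = fragments.Aggregate((Expression)Expression.Constant(""),
        (_acc, _next) => Expression.Call(concat, _acc, _next));

    return Expression.Lambda<Func<T, string>>(body, instance).Compile();
}
```

Here the per-property operation produces a string rather than a bool, and the fold is string concatenation instead of &&, but the shape of the generating function is identical.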

I believe that this set of generating functions is likely to be more generally useful, so I’ve created a CodePlex project for this. Please visit it and let me know how you used it!