Thursday, September 27, 2012

Casablanca: C++ on Azure

 

We (John Azariah and Mahesh Krishnan) gave a talk at Tech.Ed Australia this year titled Casablanca: C++ on Azure. The talk itself was slotted in at 8:15 am on the last day of Tech.Ed, after a long party the night before. The crowd was small, and although we were initially disappointed by the turnout, we took heart in the fact that this was the most viewed online video at Tech.Ed this year – lots of five-star ratings, Facebook likes and tweets. This post gives you an introduction to Casablanca and highlights the things we talked about in the Tech.Ed presentation.

So, what is Casablanca? Casablanca is an incubation effort from Microsoft with the aim of giving people an option to run C++ on Windows Azure. Until now, if you were a C++ programmer, the easiest way to get your code onto Azure was to create a library and then P/Invoke it from C# or VB.NET code. Casablanca lets you do away with workarounds like that.

If you are a C++ developer and want to move your code to Azure right away, all we can say is “Hold your horses!” It is, like we said, an incubation effort and not production ready, yet. But you can download it from the DevLabs site, play with it and provide valuable feedback to the product team.

You are also probably thinking, “Why use C++?” The answer to that question is really “Why not?” Microsoft has been giving developers the option to use various other languages and platforms, such as Java and Node.js, to write for Azure, and now it is extending the same option to C++ programmers – letting them use the language of their choice to write applications for Azure. Although there has been a bit of a resurgence in C++ over the last couple of years, we are not really trying to sell C++ to you. If we were writing a web app that talks to a database, our first choice would probably still be ASP.NET MVC with C#, and maybe Entity Framework to talk to the DB. What we are trying to say is that you should use the language and framework that work best for you, and if C# is the language you are comfortable with, then why change?

On the other hand, if you are already using C++, then you probably know why you want to continue using it. You may be using it for cross-platform compatibility, for better performance, or because you have lots of existing legacy code that you can’t be bothered porting across. Whatever the reason, Casablanca gives you a way to bring your C++ code to Azure without having to use another language to talk to its libraries.

 

The Node influence

When you first start looking at Casablanca code, you will notice that some of it bears a strong resemblance to Node.js. A simple Hello World example in Node looks like this:

var http = require('http');

http.createServer(function (request, response) {
    response.writeHead(200, {'Content-Type': 'text/plain'});
    response.end('Hello World!');
}).listen(8080, '127.0.0.1');

The equivalent Hello World in C++ looks something like this:

using namespace http;

http_listener::create("http://127.0.0.1:8080/",
    [](http_request request)
    {
        return request.reply(status_codes::OK,
            "text/plain", "Hello World!");
    }).listen();

Notice the similarity? This isn’t by accident. The Casablanca team has been influenced a fair bit by Node and the simplicity with which you can write code in it.

 

Other inclusions


The proliferation of HTML, web servers, web pages and the various languages for writing HTML-based web applications happened in the 90s. C++ had been around a lot longer than that, but surprisingly it didn’t ride the HTML wave. The web servers themselves may have been written in C++, but the applications were written in much simpler languages like PHP. Of course, we did have CGI, which you could program in C++, and scores of web applications were written that way, but somehow C++ never really became the language of choice for writing them. (It didn’t help that scores of C++ developers moved on to things like Java, C# and Ruby.) What C++ needed was a good library or SDK for accepting and processing HTTP requests.

In addition to this, RESTful applications are becoming commonplace, and REST is increasingly the preferred way to write services. So the ability to easily process GET, PUT, POST and DELETE requests in C++ was also needed.

When we talk about RESTful apps, we also need to talk about the format in which data is sent to and from the server. JSON seems to be the format of choice these days because of how easily it works with JavaScript.

The Casablanca team took these things into consideration and added classes to Casablanca for working with the HTTP protocol, easily creating RESTful apps, and working with JSON.

To handle the different HTTP verbs and write a simple REST application that does CRUD operations, the code looks something like this:

auto listener = http_listener::create(L"http://localhost:8082/books/");

listener.support(http::methods::GET, [=](http_request request)
{
    // Read records from DB and send data back
});

listener.support(http::methods::POST, [=](http_request request)
{
    // Create record from data sent in request body
});

listener.support(http::methods::PUT, [=](http_request request)
{
    // Update record based on data sent in request body
});

listener.support(http::methods::DEL, [=](http_request request)
{
    // Delete record
});

/* Prevent listen() from returning until the user hits 'Enter' */
listener.listen([]() { fgetc(stdin); }).wait();
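
Once the listener is running, any HTTP client can be used to exercise these routes. For example, using curl against the listener above:

curl http://localhost:8082/books/
curl -X DELETE http://localhost:8082/books/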


Notice how easy it is to handle the individual HTTP verbs? So, how does conversion to and from JSON objects work? To convert a C++ object to a JSON object and send it back as a response, the code looks something like this:

using namespace http::json;
...

value jsonObj = value::object();
jsonObj[L"Isbn"] = value::string(isbn);
jsonObj[L"Title"] = value::string(title);
...

request.reply(http::status_codes::OK, jsonObj);

To read JSON data from the request, the code looks something like this:

using namespace http::json;

...

value jsonValue = request.extract_json().get();

isbn = jsonValue[L"Isbn"].as_string();

You have a collection? No problem. The following code snippet shows how you can create a JSON array:

...
auto elements = http::json::value::element_vector();
for (auto i = mymap.begin(); i != mymap.end(); ++i)
{
    T t = *i;

    auto jsonOfT = ...; // Convert t to http::json::value
    elements.insert(elements.end(), jsonOfT);
}
return http::json::value::array(elements);


 


Azure Storage


If you are running your application on Windows Azure, then chances are you may also want to use Azure storage. Casablanca provides libraries for this too, and the usage is again quite simple. Creating the various clients for blobs, queues and tables looks as follows:

storage_credentials creds(storageName, storageKey);

cloud_table_client table_client(tableUrl, creds);
cloud_blob_client blob_client(blobUrl, creds);
cloud_queue_client queue_client(queueUrl, creds);


Notice the consistent way the various client objects are created. Once you have initialized them, their usage is quite simple too. The following code snippet shows you how to read data from Table storage:

cloud_table table(table_client, tableName);
query_params params;
...
auto results = table.query_entities(params).get().results();

for (auto i = results.begin(); i != results.end(); ++i)
{
    cloud_table_entity entity = *i;
    entity.match_property(L"ISBN", isbn);
    ...
}


Writing to Table storage is not difficult either, as seen in this code snippet:

cloud_table table(table_client, table_name);
cloud_table_entity entity(partitionKey, rowKey);

entity.set(L"ISBN", isbn, cloud_table_entity::String);
...

table.insert_or_replace_entity(entity);

Writing to blobs and queues follows a similar pattern of usage.

 

Async…


Another of the main inclusions in Casablanca is the ability to do things asynchronously. If you’ve looked at the way things are done in Windows Store applications, or used the Parallel Patterns Library (PPL), then you will already be familiar with the “promise” syntax. In the previous code snippets we resisted the urge to use it, as we hadn’t introduced it yet.
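
To give a flavour of it, here is a sketch of our own (using only the request and JSON APIs shown above) of the earlier JSON extraction rewritten in the asynchronous style – chaining a continuation with .then() instead of blocking on .get():

request.extract_json().then([=](http::json::value jsonValue)
{
    // Runs once the JSON body has been read, without blocking the caller
    auto isbn = jsonValue[L"Isbn"].as_string();
    request.reply(http::status_codes::OK);
});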

 

… and Client-Side Libraries


So far we have been talking mainly about the server-side use of Casablanca, but another thing to highlight is that it can also be used for client-side programming. The following code shows the client side of Casablanca and how promises can be used:

http::client::http_client client(L"http://someurl/");
client.request(methods::GET, L"/foo.html")
    .then([=](pplx::task<http_response> task)
    {
        http_response response = task.get();
        // Do something with the response
        ...
    });

If you need to find out more about PPL and promises, you should read the article Asynchronous Programming in C++, written by Artur Laksberg in MSDN Magazine.

 

Wait, there’s more… but first, let’s get started


Casablanca has also been influenced by Erlang and the concept of Actors, but let’s save that for another post. To get started with Casablanca, download it from the DevLabs site. It is available for both VS 2010 and VS 2012.

 



Tuesday, September 25, 2012

I’m back!

 

It’s been a crazy year since TechEd Australia 2011, and I’ve been strangely quiet…

But I’m back now…

Saturday, August 27, 2011

Migrating a Web Application to Azure: Step 4

Moving the application to Windows Azure

 

A New Project

The first step is to add a new cloud project to the solution.

create_new_cloud_project

Ensure that you do not add any web or worker roles.

We will convert the existing web application within the solution to a Web Role in the Azure Project.

Right-Click on the ‘Roles’ folder in the Azure Project, select ‘Add’ and then ‘Web Role Project in solution…’.

Select the web application – in this case ‘SharPix_Step2’

add_web_role

After adding the project as a Web Role, it should look like this:

web_role_added

Right-Click on the ‘SharPix_Step2’ role and select ‘Properties’

In this dialog, under normal circumstances, you would set the Instance count to at least 2 to enable fail-over and scaling, and pick an appropriate VM size.

Also, we will start by using the Compute and Storage Emulators, so specify ‘UseDevelopmentStorage=true’ as the Diagnostics storage account.

web_role_properties
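
For reference, here is a rough sketch of what those choices end up looking like in the ServiceConfiguration.cscfg file – the role name is from this walkthrough, and the setting name is the one used by the SDK’s Diagnostics plugin, so treat the exact shape as illustrative:

<Role name="SharPix_Step2">
  <Instances count="2" />
  <ConfigurationSettings>
    <Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString"
             value="UseDevelopmentStorage=true" />
  </ConfigurationSettings>
</Role>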

Running In Emulator…

Visual Studio comes with a tiny taste of the cloud built-in, in the form of the Storage and Compute Emulators.

Simply running our new Azure Project gives us:

web_role_on_compute_emulator

Note the URL and the port number. This is no longer the Visual Studio Development Web Server, but the Compute Emulator in action. Examining the Windows tray shows us that the emulators are running…

compute_emulator_is_running

To the cloud!

In order to push the Azure Project to the cloud, we need to configure Visual Studio with credentials, to ensure that we carefully control access to our Azure environment.

Right-Click on the Azure Project and select ‘Publish’. We could use previously issued certificates here, or issue a new one for this project by selecting ‘Add’.

publish_settings

Create a New Certificate, and fill in the details required.

credentials_create_new

Visual Studio creates a new certificate and stores it in a temporary, obscure folder. Copy the path from this dialog onto your clipboard and head over to the Windows Azure Management Portal.

copy_certificate_path

Click on ‘Management Certificates’, select the Azure Subscription you want to deploy to, and select ‘Add Certificate’ from the right-click context menu.

add_management_certificate

Once the certificate is uploaded, select and copy the Subscription ID from the portal and fill it into field (3) of the Windows Azure Project Management Authentication dialog shown above. Give the Project a reasonable name and click ‘OK’.

After dispatching all the other dialogs, you will be left with the first ‘Publish Windows Azure Application’ dialog. Ensure you have selected the Production environment, and specify the storage account to use.

ready_to_publish

Hurry Up and Wait

Clicking ‘Publish’ on the dialog above begins the deployment process, which can easily take 15 minutes or more. During this time, some information is relayed through to Visual Studio about the actions being taken and their various statuses.

deployment_in_process

When the deployment is complete, click on the ‘Website URL’ link, or view the deployment in the Server Explorer and select ‘View in Browser’ from the right-click context menu.

The application finally appears. Note the URL is now that of an Azure cloud application.

in_the_cloud

In the Management Portal, expanding the Azure Subscription will show the Deployment node and all the compute instances fired up in the production environment. In our case, we have just one instance, but we could add more.

in_the_cloud_mgmt

And there it is. A web application that was, until recently, fully local is now running in the cloud and using cloud-based blob and database storage.

Next we’ll talk about federating authentication to this application.

 


Migrating a Web Application to Azure: Step 3

Getting rid of file access

One issue to watch out for with an application in the cloud is file access.

File access in a highly elastic environment such as Azure is generally a bad idea, because anything written to one server’s file system is invisible to the other instances that may handle subsequent requests. To combat this, all access to the file system should generally be changed to access Azure storage, which does scale across multiple server instances.

In our example, we have contrived to have a picture stored in the database but cached on the file system on its first use, with all subsequent access going to the cached file.

Local Application, Images from Server Files

The original application has code that looks like this:

public ServerPath EnsureImage(Picture picture)
{
    // ensure the directory exists
    var directoryPath = EnsureDirectory(RelativePathToImagesDirectory);

    // now ensure the file exists
    var filePath = EnsureFile(
        directoryPath,
        picture.OriginalFilename,
        _path =>
        {
            using (var stream = new FileStream(
                _path.Absolute,
                FileMode.OpenOrCreate,
                FileSystemRights.TakeOwnership | FileSystemRights.Write,
                FileShare.None,
                1024,
                FileOptions.None))
            {
                stream.Write(picture.Image.ToArray(), 0, picture.Image.Length);
            }
        });

    return filePath;
}

internal ServerPath EnsureFile(ServerPath root,
                               string filename,
                               Action<ServerPath> fileCreateFunc)
{
    var path = new ServerPath(Path.Combine(root.Relative, filename), Server.MapPath);

    if (!File.Exists(path.Absolute))
    {
        fileCreateFunc(path);
    }

    return path;
}

which results in


images_from_filesystem


Note that the web application is still running locally, and the images refer to files located at relative paths within the server.


To remove the dependency on the file-system, we can use Azure Blob Storage to store the cached files.


The Rise Of The Blob…


Azure Blob storage is very cool. It presents a REST-based API to upload, tag, list and retrieve unstructured binary data like images, documents and videos.
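
To give a flavour of what “REST-based” means here, uploading a blob boils down to a single authenticated HTTP request along these lines (the container and file names are illustrative, and the signature is elided):

PUT http://cos204.blob.core.windows.net/images/photo1.jpg HTTP/1.1
x-ms-version: 2009-09-19
x-ms-blob-type: BlockBlob
Authorization: SharedKey cos204:<signature>
Content-Length: 348911

<binary image data>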


So we can switch the application to interact with the Azure Blob store, referring to the images from the storage in the cloud.


To do this, we use the Azure storage account we created earlier.


configure_blob_storage


The important pieces of information to capture are the Access Key (you get this when you click on the button with the orange outline), and the Azure storage account name which is highlighted.


We then initialize access to the blob storage account and container from within the application:

public void Initialize()
{
    if (CloudStorageAccount == null)
    {
        CloudStorageAccount = CloudStorageAccount.Parse("AccountName=cos204;AccountKey=<put_your_key_here>;DefaultEndpointsProtocol=http");
    }

    if (CloudBlobClient == null)
    {
        CloudBlobClient = new CloudBlobClient(CloudStorageAccount.BlobEndpoint, CloudStorageAccount.Credentials);
    }

    if (ThumbnailsContainer == null)
    {
        ThumbnailsContainer = CloudBlobClient.GetContainerReference("thumbnails");
    }

    ThumbnailsContainer.CreateIfNotExist();

    if (ThumbnailsContainer.GetPermissions().PublicAccess != BlobContainerPublicAccessType.Container)
    {
        ThumbnailsContainer.SetPermissions(new BlobContainerPermissions
            { PublicAccess = BlobContainerPublicAccessType.Container });
    }

    if (ImagesContainer == null)
    {
        ImagesContainer = CloudBlobClient.GetContainerReference("images");
    }

    ImagesContainer.CreateIfNotExist();

    if (ImagesContainer.GetPermissions().PublicAccess != BlobContainerPublicAccessType.Container)
    {
        ImagesContainer.SetPermissions(new BlobContainerPermissions
            { PublicAccess = BlobContainerPublicAccessType.Container });
    }
}

What we are doing here is initializing the Blob Storage infrastructure and creating containers for thumbnails and images.


It’s important to note that the demo code shows a hardcoded Azure connection string just to keep the snippet simple for reference. Normally this would be hidden away in a configuration file.
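
For instance, a minimal sketch of pulling it out into Web.Config – the key name here is our own invention, and the code would then read it with ConfigurationManager.AppSettings["StorageConnectionString"] rather than hardcoding it:

<appSettings>
  <add key="StorageConnectionString"
       value="AccountName=cos204;AccountKey=...;DefaultEndpointsProtocol=http" />
</appSettings>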


Local Application, Remote Images


We can now replace the file-access code shown above with its Azure Blob equivalent.

public ServerPath EnsureImage(Picture picture)
{
    var filePath = EnsureBlob(
        ImagesContainer,
        picture.OriginalFilename,
        _blob =>
        {
            var blobStream = _blob.OpenWrite();
            try
            {
                blobStream.Write(picture.Image.ToArray(), 0, picture.Image.Length);
            }
            finally
            {
                blobStream.Flush();
                blobStream.Close();
            }
        });

    return filePath;
}

internal ServerPath EnsureBlob(CloudBlobContainer blobContainer,
                               string filename,
                               Action<CloudBlob> blobCreateFunc)
{
    var blob = blobContainer.GetBlobReference(filename);

    if (blobContainer.ListBlobs().FirstOrDefault(_blob => _blob.Uri.AbsoluteUri == blob.Uri.AbsoluteUri) == null)
    {
        blobCreateFunc(blob);
    }

    return new ServerPath(filename, blob.Uri.AbsoluteUri);
}

With this implementation of EnsureImage(), we get:


images_from_blob


Pretty cool.


We now have a web application running on the local server, with data in the cloud, caching files and referencing Azure Blob storage also in the cloud!


Inspecting Blob Storage


While developing and debugging this stage of the migration, it’s useful to see exactly what is being stored in the Azure Blob storage.


The Windows Azure Management Portal does not have a handy tool to inspect the structure and contents of the Azure Storage account, so we turn to some other tools to do that.


The Windows Azure Tools for Visual Studio 2010 v1.4 package includes a storage account browser, which can be configured with the storage account name and key as before, allowing full access to the storage account.


azure-storage-browser


We can use this to ensure that the blob storage account is structured as we expect, and filled with the contents we expect.


Next, we’ll move the application itself to Azure.



Friday, August 26, 2011

Migrating a Web Application to Azure: Step 2

Migrating the database to SQL Azure

Now that we have a local web-application with a database, and a SQL Azure instance up and running, we can start the application migration by moving the database to SQL Azure.

SQL Azure is a modified version of SQL Server 2008 R2, and there are some small but significant limitations on the SQL DDL commands that can be run on Azure. See here and here for more information.

The upshot is that if you already have a local SQL Server database, you’re really much better off using a specialized migration tool to move it to Azure. The tool of choice for us is the SQL Azure Migration Wizard from CodePlex.

Download and install the latest version of this tool on your development box. You’ll thank the tool-developer later!

Fire up the tool, and select “SQL Database” under “Analyze / Migrate”

sql_azure_mw_0

Select the local database server which hosts the instance you want to migrate

sql_azure_mw_1

Select the database(s) to migrate in turn.

In our case, we will have to migrate both the ASPNETDB and PICTURES databases.

sql_azure_mw_2

Click through the dialogs, selecting the database objects to migrate.

sql_azure_mw_3

The Migration Wizard has created the script to use for the migration, and now needs the credentials for the server it will migrate to. This is where you fill in the details from the SQL Azure instance screen in the previous step.

sql_azure_mw_4

Let’s let the migrator create a PICTURES database on the Azure server

sql_azure_mw_6

…and begin the migration work

sql_azure_mw_7

…and we are done!

sql_azure_mw_8

The Azure portal shows the two databases created and ready for use.

sql_azure_portal

Now, let’s hook up the application we wrote to use these databases in the cloud. To do this, we point the connection strings in the Web.Config of the web application to the databases in the cloud.

Thusly:

web_config_2
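
For reference, the updated entry looks roughly like this – a sketch, with the connection-string name, server and credentials as placeholders for your own values:

<connectionStrings>
  <add name="PicturesConnectionString"
       connectionString="Server=tcp:yourserver.database.windows.net;Database=PICTURES;User ID=youruser@yourserver;Password=...;Encrypt=True"
       providerName="System.Data.SqlClient" />
</connectionStrings>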

And that’s it. Here’s the application running locally but with the data in the cloud.

localapp_remotedata

I’ve now created a new album called “Clouds” and uploaded three new pictures. If we look at the data in the PICTURES database on the Azure instance, we see it there:

data_in_the_cloud

Next, we’ll talk about setting up the Azure workspace so we can move our application there…


Migrating a Web Application to Azure: Step 1

Setting up your Azure workspace

So we have a working web-application which we want to move to Azure.

The first thing to do is to get a Windows Live ID and line up one Windows Azure subscription and one SQL Azure subscription.

Sign in to windows.azure.com with your Live ID

azure_homepage

Click on the “Hosted Services, Storage Accounts and CDN” button on the bottom left panel, then on “Hosted Services” on the top left panel.

Right-Click on your Azure Subscription in the main panel and select “New Hosted Service”

new_hosted_service

Choose not to deploy at this stage!

Similarly create a Storage Account on the same subscription.

new_storage_service

Now create a database server on the SQL Azure Subscription.

new_sqlazure_database

Ensure you set the firewall rules to include both your machine’s IP address and other Azure services; your SQL Azure instance will need to be accessed from the Windows Azure instance.

setting_firewall_rules_after

Once you create the database server, you should make a note of the following pieces of information:

  • The subscription id
  • Your administrator login and password
  • The instance name, and the fully-qualified database server DNS name

database_server_ready

Now we have Windows Azure and SQL Azure instances set up for use.

Next we’ll migrate our database to SQL Azure.

 


Thursday, August 25, 2011

Migrating a Web Application to Azure: Step 0

Creating a traditional, locally run web application

Let’s start with a simple, traditional web application.

This is a contrived application, built specifically for demonstrating the issues we want to address in the process of migrating to Azure.

Please do not consider this application as indicative of following best-practices for web application development!

Also, the purpose of this web application is not to wow anyone with a slick user interface! :)

Let’s imagine a site where someone logs on, creates a photo album or two, and uploads a few images to each album.

Prerequisites

You will need the following applications and tools to follow along the process of migrating an application to Azure

  • Visual Studio 2010 (SP1 strongly recommended)
  • SQL Server 2008 R2; you’ll be using SQL Server Management Studio
  • Windows Azure SDK 1.4

We’ll introduce a few more tools along the way, but we can get started with this.

Steps

Let’s create this application the simplest way possible:

First, create the ASPNETDB Application Services database on the SQL Server 2008 instance. You may need to run aspnet_regsql.exe from the .NET framework directory.
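
For example, installing all of the Application Services features into a local SQL Server instance with Windows authentication looks something like this (-S selects the server, -E uses a trusted connection, and -A all adds all the features; adjust the server name to suit):

aspnet_regsql.exe -S localhost -E -A all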

aspnet_regsql

Then create a basic web application in Visual Studio:

create_web_application

Modify the Web.Config on the web application to point the Application Services Providers to our database.

non_local_aspnetdb

Pressing F5 to run/debug the web application should fire up the default application, and allow you to create an account to authenticate against in future. The default home page after logging in should look like this:

default_home

To create the picture album site, we create a new database with this simple schema:

pictures_schema

Note that for this demonstration, we will store the uploaded picture in an image column on the Picture table, in the database.

We’ll use LINQ to SQL to create a data model for our application.

When we’re done adding the LINQ to SQL classes, the connectionStrings collection in our Web.Config looks like this:

connection_strings
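
It contains something along these lines – a sketch, with a hypothetical connection-string name (the designer generates one for you):

<connectionStrings>
  <add name="PicturesConnectionString"
       connectionString="Data Source=localhost;Initial Catalog=PICTURES;Integrated Security=True"
       providerName="System.Data.SqlClient" />
</connectionStrings>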

Now we’ll actually develop the application, and let’s say we finally end up with:

sharpix

Let’s also assume that the application contains, for the sake of this demonstration, the following code to extract the binary data from the database and cache it as a file on the file system, so that the images on the rendered page refer to the cached file:

var relativePathToDirectory = "~/.cache";

// ensure the directory exists
var absolutePathToDirectory = Server.MapPath(relativePathToDirectory);
if (!Directory.Exists(absolutePathToDirectory))
{
    Directory.CreateDirectory(absolutePathToDirectory);
}

// now ensure the file exists
var relativePathToFile = Path.Combine(relativePathToDirectory, picture.OriginalFilename);
var absolutePathToFile = Server.MapPath(relativePathToFile);

if (File.Exists(absolutePathToFile)) return absolutePathToFile;

// cache the image bytes from the database onto the file system
// (the FileSystemRights overload of FileStream needs System.Security.AccessControl)
using (var stream = new FileStream(
    absolutePathToFile,
    FileMode.OpenOrCreate,
    FileSystemRights.TakeOwnership | FileSystemRights.Write,
    FileShare.None,
    1024,
    FileOptions.None))
{
    stream.Write(picture.Image.ToArray(), 0, picture.Image.Length);
}

Conclusion


We have a rudimentary, but fully functional web application that runs on our local web server in the traditional manner.


We will need to consider the specific changes to make when we want to migrate this application to the cloud.


PS:


We’ll put the source code on GitHub and stick the link in here just after the talk.

