The Daily Parker

Politics, Weather, Photography, and the Dog

Grant me the serenity

Via Sullivan, artist Heather Dewey-Hagborg is creating 3D portraits from random hairs:

Collecting hairs she finds in random public places – bathrooms, libraries, and subway seats – she uses a battery of newly developing technologies to create physical, life-sized portraits of the owners of these hairs. You can see the portrait she’s made from her own hair in the photo below. While the actual likeness is a point of contention, these images bring about some creepy-yet-amazing comments: on genetic identity (how much of “you” really resides in your DNA?); on the possibilities of surveillance (what if your jealous partner started making portraits from hairs they found around your house?); and on the subjectivity inherent in working with “hard” data and computer systems (how much of a role do human assumptions play in this machine-made portrait?).

The artist's site is here.

All right. This came a little sooner than I expected, and from a different source. I've long recognized the necessity of adapting to, rather than raging impotently against, the fundamental changes to the security and privacy mores we've had for several thousand years. (As Bruce Schneier has pointed out, "Fifteen years ago, [CCTV cameras] weren't everywhere. Fifteen years from now, they'll be so small we won't be able to see them.") But this project, if it works as hoped, actually freaks me out a little.

I'm going to whistle past this graveyard for the time being...

Azure web sites and web roles

(Cross-posted to my company's blog.)

If you’ve looked at Microsoft’s Azure pricing model, you’ve no doubt had some difficulty figuring out what makes the most economic sense. What size instances do I need? How many roles? How much storage? What will my monthly bill actually be?

Since June 7th, Microsoft has had one price for an entry-level offering that is completely comprehensible: free. You can now run up to 10 web sites on a shared instance for free. (Well, you have to pay for data output over 165 MB per month at 12c per gigabyte, and if the site needs a SQL Database, that’s at least $5 a month, etc.)

At 10th Magnitude, we’ve switched to free Azure websites for our dev and staging instances of some internal applications and for our brochure site. And it’s saving us real money.

There are limitations, which I’ll get to, but c’mon: free. A shared-instance Azure website is perfect if you have a small, low-bandwidth, compute-light web application that only needs, maybe, a small MySQL database or some XML files. They even have a quick-start gallery that includes DotNetNuke, dasBlog, WordPress, and a few other open source packages—also free.

So here’s how those limitations hit: Free Azure web sites run on a shared virtual machine with who-knows-how-many other people, and you get an “extra-small” VM to boot (1 GHz processor, 768 MB of RAM). You can’t use Azure tables or blobs with it, and “free” only includes 5 hours of compute time and 165 MB of data going out per month. Most important, you can’t use a custom host header, so your site URL will be “something.azurewebsites.net” instead of “www.something.com”. You can get more, better, faster, and your own domain name by going to a Reserved web site instance—but that is decidedly not free.

Take a look at the pricing model. Our official brochure site runs in an extra-small Azure web role, but doesn’t use a SQL database, nor does it use much storage, compute power, or data egress. The bill comes to about $30 per month. That’s not bad at all, considering how much dedicated hosting costs generally (really, Rackspace? $150 per month is your cheapest deal?).

Let’s say we double that $30 because we’re not going to slap our chief marketing website up there without a private staging instance. So now our $30 site costs $60, and remember, we aren’t even using a database.

Or, in fact, go ahead and triple it to $90, because we need a dedicated dev instance as well. Our CMO, Jen, needs room to experiment, try new designs, and test-drive new marketing approaches, which we don’t want on our staging instance in case we accidentally promote it to production.

Why not use a virtual machine, then? Here’s where Microsoft’s pricing gets tricky. An extra-small VM is less than $10 per month during the “preview period” going on right now, but you’ll need storage to hold the VM, and you’ll still have to pay for bandwidth. That puts the real price around $30 a month.

We could, in theory, run all three environments (production, staging, preview) on the single VM. But who in his right mind would run all three environments on one VM? So we’re back to two VMs—or three—at $60 or $90 a month.

By the way, reserved instances have another limitation, which may have something to do with Microsoft’s own capacity constraints as they build out new datacenters. Extra-small reserved instances aren’t available right now, so you’re stuck getting a small instance at $60 per month. I’ll have more on reserved instances in a subsequent post, because they’re great if you have an existing, complex Web application you want to move to the Cloud but don’t want to refactor it to use Azure cloud services.

In short, we’re saving about $60 per month—67%—by using free Azure web sites instead of Web roles or VMs. And that’s just for our corporate brochure. Add what we’re saving for our internal applications, and now we’re talking about real savings (or at least more pizza and beer for the developers).

More next post on solving challenges with staging on an Azure web site and hosting the production version in a Web role.

Fourth time's a charm

I've just completed my fourth Windows Azure deployment this month, and this time, it's a non-trivial site. The Inner Drive Technology corporate website now lives up in the Cloud. Actually, it lives in two places: as an Azure Website for testing, and in Azure Cloud Services for production. All I have to do to complete the task is publish the "production" instance (I've successfully published the "staging" instance) and configure DNS.

This deployment gave me the most trouble, mainly because it has a lot of stuff in it: all my code demos, especially time zones. I also discovered a couple of things about deployments to Azure Cloud Services, in particular that the default staging deployment hits a different port than the production deployment.

It took me about 7 hours to convert the existing Inner Drive code into an ASP.NET Web application and get it working in an Azure website. I had a major hiccup trying to get the time zone data to load, because on an Azure website (but not in Cloud Services), the IANA tzinfo database files live in the file system.
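
That file-system hiccup boils down to something like this minimal sketch; the App_Data path and the TzDataReader type are my assumptions, not the actual Inner Drive code:

using System.IO;
using System.Web.Hosting;

// On an Azure web site the tz database files can be read straight off
// the disk under the application root (assumed here to be App_Data).
var tzRoot = HostingEnvironment.MapPath("~/App_Data/tzdata");
foreach (var file in Directory.GetFiles(tzRoot))
{
    TzDataReader.Load(file);   // hypothetical loader for one tz file
}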

Moving it to Cloud Services only took me about 90 minutes, though. As I've discovered, there are differences between the two, and it's a pain in the ass to alter the project and solution files every time you want to deploy it to a different environment. So, I copied the project and solution files, and voilà! Easy deployment to either environment.

I'll write more about this later. At the moment, I'm waiting for the enormous Inner Drive Extensible Architecture SDK to upload to the Cloud. This could take a while...time to walk the dog.

Update, 21:15 CDT: Inner Drive is live on Azure, including the entire SDK. It took 25 minutes to deploy, which, believe it or not, isn't much more than it usually takes. But the total time to add a Cloud Services role and deploy the site—not counting when I walked away to do something else—was just under two hours.

More reasons to love and embrace the Cloud

By Wednesday afternoon I'd migrated two Web sites from the loud and hot server rack in my home office to Microsoft Azure web sites. Then I popped off to New York for last night's game, and when I got back to my hotel room I encountered yet another reason I like the Cloud: I couldn't get to any of the sites back home.

It turned out that a brief power failure had caused the firewall to reboot—I think a UPS didn't last as long as expected—and in the process it caused the Web server's network adapter to fail.

Keep in mind, all I knew was I didn't have most of my Web sites, including the Daily Parker. I did have email, because I'd already moved that to the Cloud. But I didn't know whether I'd blown a circuit breaker, whether someone had cut my home Internet cable, or whether someone had burgled my house.

So, I'm going to continue migrating sites as quickly as I can. And by autumn, mysterious outages will, I hope, not happen again.

Continuing migrations to Azure

I've finished two complete migrations from the Inner Drive Technology Worldwide Data Center (my living room) to Microsoft Windows Azure web sites. Astute readers may remember that in one case I moved to the Web site offering and then moved it to a full-fledged Web role. Well, today, I moved it back. Even though I'm still on the free trial, it turned out that the Web role would cost $15 per month, which, for a site that gets one or two visitors per day, simply wasn't worth it.

Moving the second site, a silly thing from 2004 created to share photos and commentary about a 10-year series of Presidents Day parties a friend of mine hosted back in the day, went a lot more smoothly:

0. I archived the project after its last deployment to the Web, which was in November 2006, when the Web server went online. (The age of this particular server is one of the reasons I'm moving everything to Azure, in fact.) I use SourceGear Vault for source control. So all the source code still lived in the Obsolete repository.

1. The old source code was C# 1.1 code from 2004, so the old project and solution files were in Visual Studio 2003 format. Forget upgrading; I just created a new solution.

2. Some of the code files in the Obsolete area were shared with other, obsolete projects. I branched them.

3. In source control, I moved files from the obsolete folder to the new folder, except for obsolete configuration and code files. Vault doesn't automatically fetch the files when you do this, to prevent disasters. That was helpful.

4. Here I ran into a minor Visual Studio annoyance with its project naming schemes. I wanted the Web project to have a different, standard name for Web projects, so I had to rename it by hand in the file system and in the .sln file. No big deal.

5. Now I bound the new solution to source control. The procedure has to do with my source control setup, and may not be typical:

  • Got the latest files from Vault in their new locations.
  • Added the files to the solution and project.
  • Closed the solution.
  • In Vault, manually added the .sln and .csproj files.
  • Reopened the solution and changed the source control bindings.

6. Added references to the current version of the Inner Drive Extensible Architecture:

  • Wound up replacing global.asax.cs with the version from Boxer's Shorts. The party site's was too old to bother updating.
  • Added references to InnerDrive.Azure, the Azure mail provider, etc.

7. Reviewed the code throughout the project to ensure nothing in it would prevent a quick compile:

  • Converted the project files to a Web application. This adds designer files to all Web forms with code-behind files, so Visual Studio can compile the site down to a deployable assembly (see the sketch just after this list).
  • Went through each page to make sure the designers and the pages match
  • Built the project (without running it): OK.
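
For context, a designer file is just the other half of the page's partial class, holding a field declaration for each server control; a hypothetical example (the class and control names are made up):

// Default.aspx.designer.cs (generated; names here are hypothetical)
public partial class _Default
{
    // Visual Studio generates one protected field per server control
    // so the code-behind can reference the control at compile time.
    protected global::System.Web.UI.WebControls.Label TitleLabel;
}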

At this point I'd spent about 50 minutes on it. So far, so good.

8. Updated configuration files (web.config, email configuration, etc.) to match the new versions of Inner Drive code files.

9. All Inner Drive websites get page metadata from data tables, so I had to add the site's metadata to the SQL Database running in my Azure account. This is scripted, and the scripts are idempotent, which allows for easy automation.
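
The idempotent pattern is simple enough to sketch; the table and column names below are my guesses, not the actual schema:

using System.Data.SqlClient;

// Placeholder connection string; the real one points at the Azure SQL Database.
var connectionString = "Server=tcp:example.database.windows.net;Database=Pages;";

// Run this twice and the second pass changes nothing, which is what
// makes the metadata scripts safe to automate.
const string insertPage = @"
    IF NOT EXISTS (SELECT 1 FROM Pages WHERE Url = @url)
        INSERT INTO Pages (Url, Title) VALUES (@url, @title);";

using (var connection = new SqlConnection(connectionString))
using (var command = new SqlCommand(insertPage, connection))
{
    command.Parameters.AddWithValue("@url", "/default.aspx");
    command.Parameters.AddWithValue("@title", "Home");
    connection.Open();
    command.ExecuteNonQuery();
}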

10. It turned out that half the photos and two of the pages on the site weren't in source control after all. Short pause to fetch them from the current Web server and add them to the repository.

11. Compile and run in Debug mode: Yay! Everything works! Time elapsed: 90 minutes (including getting my salad out of the fridge and answering a colleague's questions about something unrelated).

12. In the Windows Azure portal, created a new Web site, and downloaded its publish profile to the Visual Studio solution.

13. Right-clicked the Web project and clicked "Publish..."

14. While the publication went on (there were 3.6 MB of photos to copy up to Azure), updated the production Pages tables with the site's page metadata.

15. Publication completed and...crashed immediately. Crap.

16. Uploaded a new web.config file with <customErrors mode="Off">

"The page cannot be displayed because an internal server error has occurred." That's not helpful. And it led to some thrashing before I realized the PEBCAK:

  • Removed the local-only sections of web.config
  • Stripped down the MessageConfig.xml to do absolutely nothing
  • Checked the /LogFiles folder on the virtual machine about 20 times, only to see it was empty
  • Checked the Azure portal's "Configure" tab and turned on diagnostics
  • Oops. There it is: "Configuration file is not well-formed XML"
  • Uploaded a new web.config file with <customErrors mode="Off" /> (notice the terminating slash).

17. "Could not load file or assembly 'Microsoft.WindowsAzure.ServiceRuntime, Version=1.7.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35' or one of its dependencies. The system cannot find the file specified." No problem, I've seen this before. Fixed it.

18. Reloaded the page: success! Except all of my messaging was turned off from the debugging at step 16, so I just had to turn it back on.

19. Success!

And it only took 2½ hours, including lunch and talking to colleagues about something else. Next time, I hope to do it in a lot less time.

Unfortunately, I can't share the URL, because the content is old and unflattering. Those of us who attended the parties may want to remember them, or figure out what happened in cases where we can't actually remember. But it's not for general consumption.

It is finished.

I have successfully ported my first (existing) application to the Microsoft Windows Azure platform, and have shut down the running instance on my local Web server. I hope the second one takes less than a week.

It's a funny little site called Boxer's Shorts. Dr. Bob Boxer is a local allergist who likes puns. He worked with a local illustrator, Darnell Towns, and self-published the five paperback pun compilations advertised on the site.

Local web designer Lauren Johnson (née Liss) did the look and feel, and I provided the platform. I think we completed the site in two weeks or so. I've hosted it since it went live in September 2006—just a few days after I got Parker, in fact.

And now it's in the cloud, the first Inner Drive site to be ported. From what I learned doing it, I hope to get two more of my older sites deployed to Azure this week.

Why I haven't finished deploying to Azure

I've spent much of the past week trying to get a single, small website up into the cloud on the Windows Azure platform. Much of this effort revolved around the Azure Website product, mainly because it's free. Well, I got the application up as an Azure website...and there's a big problem with it that means I'll have to redeploy it as a Web role after all.

First, let me just outline how much fun I've had today, starting from this morning when I first tried to publish the application to the cloud:

  • For the first hour or so, I dealt with a missing patch that prevented publishing entirely.
  • My next task, after I got the bits up to Azure, was to track down why the application failed on the method RoleEnvironment.IsAvailable, which you need if you want to deploy something to an Azure Web role (there's a sketch of a guard for it just after this list). First I got this:
    Could not load file or assembly 'Microsoft.WindowsAzure.ServiceRuntime, Version=1.7.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35' or one of its dependencies. The system cannot find the file specified.
  • The solution was simply to go to the Solution Explorer and mark the ServiceRuntime assembly as "copy local" instead of "do not copy." That fixed it. Until the next time I tried to run, when I got this:
    Could not load file or assembly 'msshrtmi, Version=1.7.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35' or one of its dependencies. The system cannot find the file specified.
  • Fortunately, developer Jake Ginnivan had exactly this problem yesterday. (Wow, who doesn't love Google?) Of course, that required me to compile the whole site as x86, not x64.
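
Here's the guard sketched out, as promised. RoleEnvironment.IsAvailable is the real property; the try/catch wrapper is my own defensive assumption, since the type initializer is what blows up when the runtime assemblies aren't deployed alongside the site:

using System;
using Microsoft.WindowsAzure.ServiceRuntime;

public static class AzureEnvironment
{
    public static bool RunningInAzureRole()
    {
        try
        {
            // True only when hosted inside an Azure Cloud Services role.
            return RoleEnvironment.IsAvailable;
        }
        catch (TypeInitializationException)
        {
            // The ServiceRuntime bits couldn't load; assume plain IIS.
            return false;
        }
    }
}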

At this point, the entire site seemed to work. Except now I wasn't getting any exception messages. Nor was I able to get a "contact us" email. Some digging showed that my code—yes, my code—was swallowing exceptions, and throwing the wrong exceptions.

This is because the Inner Drive Extensible Architecture message system loads configuration if you don't provide it, getting the configuration file spec from the web.config file. Only, in every single instance where the thing is in production, I control the file system, so I've never had a problem knowing where the configuration file actually lives. (I also solved the problem of how to get configuration from Azure storage, a pain I described earlier today.) The configuration setting looks like this:

<appSettings>
   <!-- An absolute path like this works only when you control the server's file system. -->
   <add key="messageConfigFile" value="c:\SomeActualPath\MessageConfig.xml"/>
</appSettings>

You can also use a relative path if you know the root. On Azure web sites, it turns out the root is D:\Windows. Seriously. On my little slice of a shared server, the actual path to the application is something like C:\DWASFiles\Sites\Site-Name\VirtualDirectory0\site\wwwroot\. The "virtual" part scares me; this path is not guaranteed. This is what we call in programming "brittle."

So I wound up using a relative path from the application root and adding this code to the application start method:

// Read the file spec from web.config, resolve it relative to the
// application root, then load the messaging configuration from disk.
var configurationFile = ConfigurationManager.AppSettings["messageConfigFile"];
var fileSpec = Server.MapPath(configurationFile);
Publisher.Instance.Load(fileSpec);
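
For completeness, the matching setting then carries the relative path; the exact value below is my assumption, not the site's real configuration:

<appSettings>
   <add key="messageConfigFile" value="~/App_Data/MessageConfig.xml"/>
</appSettings>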

I deployed it again and yay! After adding that bit of code, and spending a total of 6.9 non-billable hours on this, I finally got the application to run as a free Azure web site with its azurewebsites.net address. So all I have to do now is add a CNAME record pointing the site's real URL to this one and...

You're kidding me.

Free Azure websites running on shared instances only allow a single hostname per site.

This means that I can't use www.mysitename.com to point to mysitename.azurewebsites.net, which means the whole effort is kind of wasted, except to use the thing as a staging site. Moving to a reserved instance solves the problem, but costs 8c per hour of compute time. But for only 2c an hour, I can use a traditional Web role in an extra-small instance. I'm pretty sure this particular site won't run continuously, but still, why pay any more than required?

Almost everything I did today also applies to publishing the application to a Web role, but now I have to pay money for it. Dang.

Who may I strangle, please?

In the past week, I've been "on the bench" at work, so I've taken the time to get deeply familiar with Microsoft Windows Azure. My company, 10th Magnitude, is a 100% cloud-computing shop, and a Microsoft partner. I've been developing Azure Web applications for a year, but I haven't had to deal with migrating existing sites, pricing, or configuration on my own; that's why we're a team, right?

So, anyway, I've taken what I've learned at work, and:

  • Selected a simple website to migrate; in this case, Boxer's Shorts, a project I completed in 2006 that has five rows of data and five content pages;
  • Worked for about 16 hours this week creating Azure implementations of the Inner Drive Extensible Architecture features that won't work on Azure—but only the two features (messaging and logging) that Boxer's Shorts needs;
  • Spent 6 hours yesterday and an hour today preparing the application for Azure; and
  • Set up an Azure Web site this morning into which I was about to publish the ready-to-roll Web site.

Remember the messaging and logging services I spent lots of time migrating? Well, it looks like I made the right choice in the long run. The services use Azure tables and blobs for logging, holding the site configuration data, and in this site's case, for holding the list of books. Azure storage is really, really cheap: less than 10c per gigabyte for locally-redundant storage or 12.5c per gigabyte for geo-redundant storage. This is de rigueur for a traditional Azure Web role deployment for everything that can run without relational data. (For relational data you need SQL Server.)
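
With the 1.x storage client library from that era, the plumbing for those services looks roughly like this minimal sketch; the container and table names are placeholders:

using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

// Use the local emulator account for the sketch; a real deployment
// parses the account from a connection string in configuration.
var account = CloudStorageAccount.DevelopmentStorageAccount;

var blobClient = account.CreateCloudBlobClient();
var logContainer = blobClient.GetContainerReference("logs");
logContainer.CreateIfNotExist();            // idempotent in the 1.x client

var tableClient = account.CreateCloudTableClient();
tableClient.CreateTableIfNotExist("Books"); // likewise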

I thought—mistakenly, it turns out—that if it worked in a Web role it would work in a Web site. No, not so much. In the short run, I just discovered this:

Ah. No blob storage for Azure web sites. So now I have to strip out all of the stuff that uses blob storage on this web site, and modify the book list to use SQL Server instead of Azure tables. Two more hours.

So why not just publish the site to a Web role instead? Price. With Azure, you get 10 free Web sites with your subscription; but each Web role costs at least 2c per hour for the smallest possible footprint. There are 720 hours in a month, so even though you only pay for the time the application is actually doing something, you have to plan for about $14 per month. For a site that gets 100 page-views per week, has five content pages, and five pieces of data, that's really a lot of money. And it's infinitely more than free.
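
Spelled out, the arithmetic from that paragraph:

// Budget as if the smallest Web role is deployed for the full month:
const decimal ratePerHour = 0.02m;              // 2c per hour
const int hoursPerMonth = 720;                  // 24 x 30
var monthlyCost = ratePerHour * hoursPerMonth;  // = $14.40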

All right then. I hope that Azure websites get access to Azure storage soon. For now, I'm just going to rip the logging out of the site and fix the rest of it. But first I'm going to walk the dog.

Azure build error messages aren't helpful

When working with Microsoft Windows Azure, I sometimes feel like I'm back in the 1980s. They've rushed their development tools to market so that they can get us developers working on Azure projects, but they haven't yet added the kinds of error messages that one would hope to see.

I've spent most of today trying to get the simplest website in my server rack up into Azure. The last hour and a half has been spent trying to figure out two related error messages that occurred when trying to debug a Web application project that I converted from a Web site project:

  • Failed to debug the Windows Azure Cloud Service project. The output directory ' { path }\csx\Debug' does not exist.
  • Windows Azure Tools: Can't locate service descriptions.

The first error message seems straightforward enough: when the project got created, it never added the \csx\Debug folder. But after I created the empty \csx\Debug folder, I got the second message instead.

When an Azure project builds, it's supposed to create the \csx\Debug folder under the Cloud Service project root. It then copies the service definition (.csdef) and configuration (.cscfg) files into the folder, which the Azure compute emulator can hook into.
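
Per that description, a successful build should leave something like this behind (using the same " { path }" placeholder as the error message):

 { path }\csx\Debug\
     ServiceDefinition.csdef
     ServiceConfiguration.cscfg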

In my project, this wasn't happening. So I created a new Cloud solution to see if this was a system problem or a configuration problem. (First I uninstalled and reinstalled all the Azure tools...which wasn't as big a time-suck as it could have been because I walked Parker while that was going on.)

The Deleteme solution built fine; mine still had the problem. So then I started comparing the configuration, project, and solution files...and completely missed the significance of the solution platform setting.

It gnawed at me for a few minutes, though, until I looked at the build configuration again and noticed that my cloud project wasn't set to build at all. Why Visual Studio created a configuration and then decided not to build it I just don't know. The solution to my hours of pain is simply to change the solution platform to Any CPU (or check "build" on the .NET platform).

I am now going to fix the hole in my desk where I've been pounding my head.

When I started getting these messages, I Googled and I Googled, but the technology is so new that no one else appears to have had exactly this problem. I hope this post pays back some of the karmic debt I've taken on from all the times when someone else had the right answer.

Google Maps goes inside

I don't know how extensive this is, but Google Maps street view now goes inside buildings.

To see this for yourself, go on Google Maps to 1028 W Diversey Pkwy, Chicago, 60614. Click on the balloon over Paddy Long's Pub, and click Street View. Notice the double chevron pointing toward the sidewalk.

Click that. And then explore.

I can only weep that we didn't have this kind of data throughout history to see how people lived in the past. And I can only weep for what this will do to privacy.

Update: It looks like they mostly have bars and pubs, including Tommy Nevin's, where Parker spent much of his puppyhood.