The Daily Parker

Politics, Weather, Photography, and the Dog

The Daily Parker...in the cloud

Sometimes things just work.

Last weekend, I wrote about moving my last four web applications out of my living room (the Inner Drive Technology International Data Center) and into the cloud via a Microsoft Azure Virtual Machine.

Well, if you're reading this blog entry, then I've succeeded in moving The Daily Parker. Except for transferring files (the blog comprises 302 megabytes over 13,700 files), which happened in the background while I did other things, it only took me about 45 minutes to configure the new installation and make the necessary changes to DNS.

Despite the enormous volume of data, this was the easiest of the four. DasBlog has no dependencies on outside services or data, which means I could move it all in one huge block. The three remaining applications will take much more configuration, and will also require data and worker services.

I'm still surprised and pleased by how smoothly the transfer went. If the other three migrations go anywhere near as easily as this one (taking their complexities into consideration), I'll be an Azure evangelist for years.

Goof Off at Work Day

According to Jeff Atwood, today's the day:

When you're hired at Google, you only have to do the job you were hired for 80% of the time. The other 20% of the time, you can work on whatever you like – provided it advances Google in some way. At least, that's the theory.

Although the concept predates Google, they've done more to validate it as an actual strategy and popularize it in tech circles than anyone else. Oddly enough, I can't find any mention of the 20% time benefit listed on the current Google jobs page, but it's an integral part of Google's culture. And for good reason: notable 20 percent projects include Gmail, Google News, Google Talk, and AdSense. According to ex-employee Marissa Mayer, as many as half of Google's products originated from that 20% time.

He goes on to ask whether your company is ready, and offers some suggestions on how to implement it. I think it's easier to do when you don't have billable-hour pressure, but we at my company still manage to get some goofing-off time in.

Or, put another way, "why is this day different from all other days?"

On hiring and grammar

Via Sullivan, entrepreneur Kyle Wiens won't hire people who use poor grammar (and neither will I):

Good grammar makes good business sense — and not just when it comes to hiring writers. Writing isn't in the official job description of most people in our office. Still, we give our grammar test to everybody, including our salespeople, our operations staff, and our programmers.

Grammar signifies more than just a person's ability to remember high school English. I've found that people who make fewer mistakes on a grammar test also make fewer mistakes when they are doing something completely unrelated to writing — like stocking shelves or labeling parts.

In the same vein, programmers who pay attention to how they construct written language also tend to pay a lot more attention to how they code. You see, at its core, code is prose. Great programmers are more than just code monkeys; according to Stanford programming legend Donald Knuth, they are "essayists who work with traditional aesthetic and literary forms." The point: programming should be easily understood by real human beings — not just computers.

Yes. Clear writing shows clear thought, almost always. I might not go so far as to use a grammar test for new employees, but I do pay attention to their emails and CVs.

Deployments are fun!

In every developer's life, there comes a time when he has to take all the software he's written on his laptop and put it into a testing environment. Microsoft Azure Tools make this really, really easy—every time after the first.

Today I did one of those first-time deployments, sending a client's Version 2 up into the cloud for the first time. And I discovered, as predicted, a flurry of minor differences between my development environment (on my own computer) and the testing environment (in an Azure web site). I found five bugs, all of them minor, and almost all of them requiring me to wipe out the test database and start over.

It's kind of like when you go to your strict Aunt Bertha's house—you know, the super-religious aunt who has no sense of humor and who smacks your hands with a ruler every time you say something harsher than "oops."

End of complaint. Back to the Story of D'Oh.

Taking an Azure shortcut

I hope to finish moving my websites into the cloud by the end of the year, including a ground-up rewrite of Weather Now. Meanwhile, I've decided to move that site and three others to an Azure Virtual Machine rather than trying to fit them into Azure Cloud Services.

For those of you just tuning in, Azure Cloud Services lets you run applications in roles that scale easily as the application grows. A virtual machine is like a standalone server, except that it actually runs inside some other server. A really powerful computer can host a dozen small virtual machines, allocating space and computing power among them as necessary. You can also take a virtual machine offline, fold it up, and put it in your pocket—literally, as thumb drives are now easily big enough to hold a small VM.

This is called infrastructure as a service (IaaS); putting applications into cloud services without bothering to set up a VM is called platform as a service (PaaS).

IaaS offers few advantages over PaaS. The principal disadvantage is that VMs behave like any other computers, so you have to care for them almost as if they were pieces of hardware on your own server rack. You just don't have to worry about licensing Windows or hoping the electricity stays on. Also, VMs are expensive. Instead of paying around $15 per month for a web role, I'll wind up paying about $80 per month for the VM and its associated storage, data transfers, and backup space. And this is for a small instance, with a 1.6 GHz processor and 2 GB of RAM. VMs go up to 8-core, 16 GB behemoths that cost over $500 per month.

On the other hand, my server rack costs easily $100 per month to operate, not counting licenses, certificates, me tearing my hair out when the power fails or my DSL goes down, and having to keep my living room (the Inner Drive Technology Worldwide Data Center) below 27°C year-round.

So it's not nearly as expensive as rack space would be, but it's less economical than PaaS. Unfortunately, my four most important web applications have special needs that make them difficult, and in two cases impossible, to port to PaaS:

  • The Daily Parker, this blog, which runs on the open-source dasBlog platform. I estimate that porting this blog to PaaS will take about 12 hours of work, and I have lots of other (paid) work ahead of it. In principle, I need to change its storage model to use Azure blobs instead of the local file system, which doesn't work the same way in Azure web roles as it does on a VM.
  • Weather Now, which is overdue for a ground-up rewrite, and uses a lot of space. Porting the application will take about 12 hours, plus another 12 hours to port the GetWeather application (which keeps the site supplied with weather data) to an Azure worker process. That's time I can better spend rewriting it. Moving it to a VM shouldn't take more than an hour or two.
  • My SourceGear Vault source code control system. Since I don't own the source code, I simply can't port it. Plus, it uses a couple of worker processes on the server, which I also won't be able to port.
  • My FogBugz issue tracking system. Same problem: it's not my software, and it uses a couple of worker processes. I can either install it on a server from a commercial installation package, or I can sign up for FogBugz on demand for $25 per month. And also lose my Vault integration, which lets me track all of my issues back to actual pieces of code.

So watch this blog. In a couple of days, it's liable to migrate to an Azure VM.

Certified, again, and just as happy as the last time

Long-time readers will know how I feel about Microsoft certification exams. When it came time for 10th Magnitude to renew its Microsoft Partner designation, which meant all of us had to take these tests again, I was not happy.

So, against my will, I took exam 70-583 ("Designing and Developing Windows Azure Applications") and passed it. I am once again a Microsoft Certified Professional.

Fwee.

Houses of cards + breeze

The brokerage house Evercore doesn't believe Groupon. No one else does either:

The brokerage said Groupon Goods, the company's consumer products category, is increasingly becoming the merchant of record - the owner of goods being sold or the first-party seller.

As first-party sales assume inventory risk and drive higher revenue contribution, the composition of Groupon's first-quarter revenue beat in North America has become questionable, analyst Ken Sena wrote in a note.

"Growth in unique visitors in the U.S. to Groupon.com, which can be looked at as a proxy for subscriber growth, exhibited negative year-over-year trends this quarter," Sena said.

Essentially, no one is buying stuff from Groupon, which leaves them holding the bag on lots of it.

In a related story, people are sick of FarmVille, which is hurting Zynga:

A slew of analysts cut their ratings and price targets for Zynga after it reported lower-than-expected quarterly results on Wednesday and forecast a much smaller 2012 profit.

Zynga has been hit by user fatigue for some of its long-running games and a shift in the way Facebook Inc's social platform promotes games.

"The biggest factor impacting current performance appears to be the way Facebook is surfacing gaming content on its platform," JP Morgan's Doug Anmuth wrote in a note to clients.

Actually, Facebook users just got bored of FarmVille, and it's hard to blame them. This is what happens when companies stop innovating in favor of milking their cash cows. (Sorry.)

Dual Microsoft Azure deployment: Project synchronization

Last week I offered developers a simple way to simultaneously deploy a web application to a Microsoft Azure web site and an Azure Cloud Services web role. Today I'm going to point out a particular pain with this approach that may make you reconsider trying to deploy to both environments.

Just to recap: since Azure web sites are free, or nearly so, you can save at least $15 a month by putting a demo instance of your app there rather than having a second web role for it. You'll still use a web role for your staging and production environments, of course.

While reading my last post, though, sharp-eyed developers might have noticed that the dual approach creates some additional maintenance overhead. Specifically, you'll need to keep both solution (.sln) files and both web project (.csproj) files in sync. This becomes part of your staging deployment task list, which means you probably only have to synchronize the files once every few weeks, not such a big deal. Still, if you've never hand-edited a solution or project file before, it can be a little daunting.

The solution file probably won't require much synchronization, unless you've added new projects—or new files outside of projects—to the solution. For example, at 10th Magnitude we like to keep all of our database scripts in the solution tree, for easy access. (We also use the open-source RoundhousE tool for database deployment, which I'll talk about in a subsequent post.) If we add new database files to the web site solution, they won't automatically show up in the web role solution. Same with the web project file: adding new controllers, views, or web forms to a project is very common. So you'll have to make sure all the changes in the web site project get migrated over to the web role project.
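If you'd rather not eyeball the two project files every few weeks, a filtered diff can catch drift for you. This is just a sketch with stand-in file names (your real .csproj paths go here); it deliberately filters out lines mentioning the two assemblies that are supposed to differ between the projects:

```shell
# Demo in a scratch directory with stand-in project files;
# substitute your real website and web role .csproj paths.
cd "$(mktemp -d)"
printf 'ControllerA.cs\nServiceRuntime reference\n' > Website.csproj
printf 'ControllerA.cs\n' > WebRole.csproj

# Compare the two files, ignoring the references that are *supposed*
# to differ. Identical output means the projects are in sync.
if diff <(grep -Ev 'ServiceRuntime|msshrtmi' Website.csproj) \
        <(grep -Ev 'ServiceRuntime|msshrtmi' WebRole.csproj) > /dev/null
then
  echo "projects in sync"
else
  echo "projects have drifted"
fi
```

Anything the diff reports is a change you made to one project and forgot to carry over to the other.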

Keep in mind, though, that some things will remain different between the two pairs of files. The web role solution will have a Cloud Services project, mapping the web and worker role entry points, which the web site won't have. And the web site project file will have references to Microsoft.WindowsAzure.ServiceRuntime.dll and msshrtmi.dll that the web role project won't have. Here's a synchronized pair of solution files, using Beyond Compare to show the deltas:

And here are the two project files, also synchronized, showing the differences you need to wind up with:

One final thing to consider: if you have a paying client, they might not want to pay for development time to synchronize the two deployment environments. If you're charging $125 an hour, and you spend 30 minutes every four weeks—$62.50—to save the client $15 for an additional cloud services instance, that isn't good value. But for an internal application (like the 10th Magnitude brochure site), or for a personal project, the savings might be worth the hassle.

Dual Microsoft Azure deployment, part 1

(This is cross-posted on the 10th Magnitude blog.)

In my last post, I talked about using Azure web sites to save beaucoup bucks over Azure Cloud Services web roles on nonessential, internal, and development web applications. In this post I'll go over a couple of things that bit me in the course of deploying a bunch of applications to Azure web sites in the last two weeks.

First, let me acknowledge that engineering a .NET application to support both types of deployment is a pain. Azure web sites can't use Cloud Storage, including tables, blobs, and queues. You'll need to architect (or refactor) your application to get its data from different sources based on whether it's running inside Cloud Services or not.

Once you've done all that (simple, right?), one line of code will let the application know where it is:

if (RoleEnvironment.IsAvailable) { DoCloudServicesStuff(); }

But here's the catch. Azure web sites don't have access to the RoleEnvironment class by default. The class lives in the Microsoft.WindowsAzure.ServiceRuntime assembly, which Cloud Services applications get through their VM's GAC and which website applications do not.

So even for your application to determine whether it's running in Cloud Services or not, you'll have to add two assemblies to the website project file: Microsoft.WindowsAzure.ServiceRuntime.dll and msshrtmi.dll. If you don't, you'll get one of these runtime errors:

Could not load file or assembly 'Microsoft.WindowsAzure.ServiceRuntime, Version=1.7.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35' or one of its dependencies. The system cannot find the file specified.

Could not load file or assembly 'msshrtmi, Version=1.7.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35' or one of its dependencies. The system cannot find the file specified.

(On my computer, the ServiceRuntime assembly is in "C:\Program Files\Microsoft SDKs\Windows Azure\.NET SDK\2012-06\ref" and msshrtmi is in "C:\...\2012-06\bin\runtimes\base\x86\".)

Once you have added those two references to your web project, find them in the References list under the project root, select both, right-click, and open the Properties tab. Then set both of them to Copy Local = true, like this:

Copy msshrtmi local image

This tells the Publisher to include them in the deployment to Azure web sites. Note that you'll have to include these in your web application if any of the satellite assemblies uses the RoleEnvironment class (like our time zone handler, for example).
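For reference, here's roughly what those two references look like in the website project file once Copy Local is set. This is a sketch: the version and public key token come from the runtime errors above, the hint paths from my SDK install, and both will vary on your machine.

```xml
<ItemGroup>
  <Reference Include="Microsoft.WindowsAzure.ServiceRuntime, Version=1.7.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35">
    <HintPath>C:\Program Files\Microsoft SDKs\Windows Azure\.NET SDK\2012-06\ref\Microsoft.WindowsAzure.ServiceRuntime.dll</HintPath>
    <!-- Copy Local = true -->
    <Private>True</Private>
  </Reference>
  <Reference Include="msshrtmi">
    <HintPath>C:\Program Files\Microsoft SDKs\Windows Azure\.NET SDK\2012-06\bin\runtimes\base\x86\msshrtmi.dll</HintPath>
    <Private>True</Private>
  </Reference>
</ItemGroup>
```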

But now you have a new problem if you have dual deployments: you need two web project files, because the Cloud Services deployment will crash when the web role starts if you have those two assemblies referenced explicitly. And if you have two different web application project files, you'll need two solution files as well.

Of course, it only takes about two minutes to:

  • Make a copy of the solution file (usually "MyApplication-WebRole.sln" or "MyApplication-Website.sln" for the new copy);
  • Make a copy of the Web project file (you don't need different versions of your other assembly projects);
  • Manually edit the new solution file to point to the new project file.
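Those three steps boil down to something like the following. The "MyApplication" names are hypothetical, and the demo runs in a scratch directory with one-line stand-in files so you can see the edit happen:

```shell
# Demo in a scratch directory; "MyApplication" names are hypothetical.
cd "$(mktemp -d)"
echo 'Project = "Web", "MyApplication.Web.csproj"' > MyApplication-WebRole.sln
echo '<Project />' > MyApplication.Web.csproj

# 1. Copy the solution file for the website flavor.
cp MyApplication-WebRole.sln MyApplication-Website.sln
# 2. Copy the web project file.
cp MyApplication.Web.csproj MyApplication.Web-Website.csproj
# 3. Point the new solution at the new project file.
sed -i 's/MyApplication\.Web\.csproj/MyApplication.Web-Website.csproj/' \
    MyApplication-Website.sln

cat MyApplication-Website.sln
```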

(You could, I suppose, accomplish the same thing with build scripts and lots of conditional compile regions, but I can't call that an improvement. Like I said, this approach has more PITA than a falafel stand.)

We also find it helpful to wait until late in the development cycle to create the second solution/project pair of files, because you'll need to keep the two project files identical except for the two assembly references.

You have to do one more thing, unfortunately. You need to build your website deployment for the x86 platform target:

Visual Studio 2010 example of building as x86

The msshrtmi assembly requires explicitly building the application for x86 or x64 deployments, depending on whether you grab it from "C:\Program Files\Microsoft SDKs\Windows Azure\.NET SDK\2012-06\bin\runtimes\base\x86\" or "...\x64\". But we've run into other problems trying to use the x64 version, so we don't recommend it.
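If you'd rather not click through the Visual Studio build dialog, the same setting lives in the website project file as a PlatformTarget property. A sketch; the PropertyGroup condition depends on your build configurations:

```xml
<PropertyGroup Condition=" '$(Configuration)' == 'Release' ">
  <PlatformTarget>x86</PlatformTarget>
</PropertyGroup>
```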