The Daily Parker

Politics, Weather, Photography, and the Dog

The Daily Parker v3.1

It's finally here: the Daily Parker running on BlogEngine.NET 3.1. This is, in fact, the first native post on the new platform, visible (for the time being) only to the select few who know the temporary URL.

So why did it take me eight weeks to get the new engine up and running? A few reasons:

  • BlogEngine.NET 3.1 is still in development, with the main open source team making changes almost daily.
  • I've made some serious customizations (outlined below) on my own private fork of the source code.
  • I have a real job.
  • I wanted to time the release to a significant event in the blog's history.

My changes went pretty deep into the application's core.

Like most developers, the original coders (not the guys working on it now) made big mistakes with time zones, principally by using the horrible System.DateTime structure instead of its more correct System.DateTimeOffset replacement. (The .NET Framework has had the DateTimeOffset structure since version 2.0 back in 2005, so this really annoyed me.) As a consequence, I changed date-time storage everywhere in the application, which required a few massive commits to the code base. It also required changing the way the app handles time zones by dropping in the Inner Drive Extensible Architecture™ NuGet package.
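
To make the difference concrete, here's a minimal sketch of the kind of change involved. The class and property names are illustrative only, not BlogEngine.NET's actual members:

    using System;

    // Illustrative only: not the real BlogEngine.NET post class.
    public class Post
    {
        // Before: a bare DateTime records "2014-11-01 09:00" with no offset,
        // so the actual instant depends on whichever time zone you guess later.
        // public DateTime DateCreated { get; set; }

        // After: DateTimeOffset records the instant and its offset from UTC.
        public DateTimeOffset DateCreated { get; set; }
    }

    class Demo
    {
        static void Main()
        {
            var post = new Post
            {
                // 9:00 AM Chicago time during daylight saving time (UTC-05:00).
                DateCreated = new DateTimeOffset(2014, 11, 1, 9, 0, 0, TimeSpan.FromHours(-5))
            };

            // Always refers to the same instant, here expressed at UTC (+00:00).
            Console.WriteLine(post.DateCreated.ToUniversalTime());
        }
    }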

Next, the Daily Parker has had geocoded posts for years, so I added a Google Maps control and geographic coordinates to the application. Unfortunately for me, the other guys kept changing the Edit Post screen, which complicated merging their stuff into my private fork. At least I'm using Git, which helps immensely.
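
For illustration, a geocoded post just needs to carry a pair of coordinates alongside its other data. This sketch uses member names I made up, not the actual fields in my fork:

    using System;

    // Hypothetical shape of a geocoded post; not the actual BlogEngine.NET model.
    public class GeocodedPost
    {
        public string Title { get; set; }
        public DateTimeOffset DateCreated { get; set; }

        // Nullable, because not every post has a location.
        public double? Latitude { get; set; }
        public double? Longitude { get; set; }

        public bool HasLocation
        {
            get { return Latitude.HasValue && Longitude.HasValue; }
        }
    }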

Finally, I needed to get the thing to run as an Azure Web App, rather than as an Internet application running on a full server as DasBlog required. Again, I have a lot of experience doing this, and the Inner Drive Azure Tools simplified the task as well. It's still a pain, though it will allow me to retire an otherwise useless virtual machine in favor of a neatly scalable Web app that I can deploy in fifteen seconds.

Moving it to Azure necessitated getting file storage off the file system and into Azure blobs, as I outlined earlier.

Well, eight weeks and fifteen seconds. And there's still a bug list...

And I still have 4,998 posts to migrate...

Release Candidate 1

The (new) Daily Parker is code-complete for the first launch. There are a few steps to go, like launching the production site and migrating nearly 5,000 blog entries. But maybe, just maybe, it'll launch tomorrow.

(Note that the beta site only has the last six weeks of entries, and doesn't include any since yesterday, because I didn't re-run the migration for the last bits of testing.)

Almost ready to launch

Yesterday I successfully ran a complete import of the entire Daily Parker, all the way back to May 1998, and promptly discovered a couple of problems. First, a recent change broke the app's ability to add or edit blog posts; and second, because BlogEngine.NET reads the entire blog into memory when it starts up, it took nearly five minutes for the home page to load on my debugging machine.

That means the beta site will only have a few dozen entries up at any point, so I can actually fix the JavaScript problems on the Edit Post page.

The new engine could launch this week. I'm excited. Stay tuned.

Ignore this post

I'm continuing to test the new blog engine. This evening's tests, which I'm setting up with this post, will involve some of the trickier tasks in the migration:

  • Relative links to posts within the blog itself
  • Links to arbitrary files using absolute paths
  • Links to files with relative paths
  • Links to images (like the one below) with relative paths

If you're reading this on the new blog engine, and all the links above work and the image below shows up, then the migration tool is complete. Deploying the new blog engine to production could then happen within a couple of days. Stay tuned.

Sneak peek

I've got a development instance of the new blog engine running on Azure: http://dailyparker-dev.azurewebsites.net/. Go ahead, take a peek.

It's important to note that I'm testing the import engine right now, so the collection of entries on the development site will probably change during debugging. Also, since it's a development site, it may disappear altogether from time to time.

Plus, the master source code (from which I'm merging into my custom code base from time to time) keeps changing. And merges don't always go well. And the DasBlog instance is still in production. And so on.

At this writing, the Daily Parker has 4,992 entries going back to 1998, so it'll take a while to import them all. One of the challenges has been making sure that deep links continue to work, and images show up, so each entry has to be parsed and its internal links fixed before it can be published. And I'm doing all this outside my usual work day.
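
For the curious, the fixing-up looks roughly like this. It's only a sketch: the URL patterns below are hypothetical stand-ins, not the actual DasBlog or BlogEngine.NET formats, and the real importer does quite a bit more:

    using System;
    using System.Text.RegularExpressions;

    static class LinkFixerSketch
    {
        public static string FixEntryHtml(string html)
        {
            // Make absolute links to the blog itself relative, so deep links keep
            // working no matter what host name the new site answers to.
            html = Regex.Replace(
                html,
                @"https?://www\.thedailyparker\.com/",
                "/",
                RegexOptions.IgnoreCase);

            // Point image references at the new engine's file location
            // (both paths here are made up for illustration).
            html = html.Replace("/content/binary/", "/files/");

            return html;
        }

        static void Main()
        {
            string before = "<a href=\"http://www.thedailyparker.com/SomeOldPermalink.aspx\">link</a>";
            Console.WriteLine(FixEntryHtml(before)); // href becomes "/SomeOldPermalink.aspx"
        }
    }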

So when is go-live? No idea, but I'm shooting for November 13th.* Stay tuned.

As for the image below, it serves no purpose other than to confirm that the import tool correctly alters image tags:

* Almost nothing in that entry is still true, but it's still the anchor post of this blog.

Cobbler's children

Two things this weekend kept me from blogging. First, the amazing weather. It was warm and sunny both days, so I spent time picking apples and sitting outside with a book.

The other thing: the time I did spend at my computer went into working on the replacement for this blog engine.

Regular blogging will continue this week.

So kludgy

I noted earlier that this code base I'm working with assumes all file stores look like a disk-based file system. This has forced me to do something totally ugly.

All requests for files get prepended with a hard-coded string somewhere in the base classes—i.e., the crap I didn't write. So when I want to use the Azure storage container "myfiles", some (but not all) requests for files will use ~/App_Data/files/myfiles (or whatever is configured) as the container name. Therefore, the Azure provider has to sniff every incoming request and remove ~/App_Data/files/ or the calls fail.
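
A minimal sketch of the workaround, with my own names rather than the actual provider code:

    using System;

    static class FileRequestScrubber
    {
        // The hard-coded virtual path the base classes insist on prepending.
        private const string UnwantedPrefix = "~/App_Data/files/";

        public static string ToContainerPath(string request)
        {
            // "~/App_Data/files/myfiles/parker.jpg" becomes "myfiles/parker.jpg".
            if (request.StartsWith(UnwantedPrefix, StringComparison.OrdinalIgnoreCase))
            {
                return request.Substring(UnwantedPrefix.Length);
            }

            // Some requests arrive without the prefix; pass those through untouched.
            return request;
        }
    }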

Don't even get me started on how the code assumes HttpContext.Current will exist. That has made unit testing a whole new brand of FFS.
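
For what it's worth, the usual way around that in tests is to hide HttpContext.Current behind a small abstraction you can fake. A sketch, with names I made up:

    using System.Web;

    public interface IRequestInfo
    {
        string RawUrl { get; }
    }

    // Production implementation: delegates to the real HttpContext.
    public class HttpContextRequestInfo : IRequestInfo
    {
        public string RawUrl
        {
            get { return HttpContext.Current.Request.RawUrl; }
        }
    }

    // Test double: no web server required.
    public class FakeRequestInfo : IRequestInfo
    {
        public string RawUrl { get; set; }
    }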

Assumptions in your code may cause annoyance (wonky)

I've been playing around with BlogEngine.NET, and I've hit a snag making it work with Microsoft Azure.

BlogEngine.NET was built to store files inside the application's own file system. So if you install the engine in, say, c:\inetpub\wwwroot\blogEngine, by default the files will be in ~/App_Data/files, which maps to c:\inetpub\wwwroot\blogEngine\App_Data\files. All of the file-handling code, even the abstractions, assumes that your files will have some kind of file name that looks like that.

You must never store files locally in an Azure cloud service, because at any moment your virtual machine could blow up and be reconstituted from its image. Boom. There go your files.

You really want to use Azure storage. In its purest form, Azure storage organizes files in containers. A container can't have sub-containers. You access a container by its name only; paths are meaningless.
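
In code, that looks something like the sketch below, using the Azure storage client library. The connection string, container name, and blob name are placeholders:

    using System.IO;
    using Microsoft.WindowsAzure.Storage;
    using Microsoft.WindowsAzure.Storage.Blob;

    class AzureStorageSketch
    {
        static void Main()
        {
            // Placeholder; the real connection string comes from configuration.
            CloudStorageAccount account = CloudStorageAccount.Parse(
                "DefaultEndpointsProtocol=https;AccountName=example;AccountKey=...");
            CloudBlobClient client = account.CreateCloudBlobClient();

            // A container is addressed by name alone, and it can't contain another container.
            CloudBlobContainer container = client.GetContainerReference("myfiles");
            container.CreateIfNotExists();

            // Any slash in a blob name is just part of the name, not a real directory.
            CloudBlockBlob blob = container.GetBlockBlobReference("2014/11/parker.jpg");
            using (FileStream stream = File.OpenRead(@"C:\temp\parker.jpg"))
            {
                blob.UploadFromStream(stream);
            }
        }
    }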

But because BlogEngine.NET assumes that all file stores use path names (which even works for the database file store plug-in, for reasons I don't want to go into), creating an Azure Storage provider for this thing has been really annoying. I've even had to modify some of the core files because I discovered that it applied a default path to any file request no matter what storage provider you used.

Don't even get me started on the bit of BlogEngine.NET's architecture that pulls all files around through the UI instead of allowing them to live in a CDN...

Self-fulfilling Googling?

I just Googled a problem I'm having setting up a continuous-integration build, because I've had this problem before and wanted to review how I solved it last time. Google took me to my own blog on the second hit. (The first hit was the entry I cross-posted on my old employer's blog.)

Why even bother with my own memory?

Success!

After struggling for almost two weeks to learn AngularJS and other technologies, I've gotten BlogEngine.NET (which will replace DasBlog as the Daily Parker's platform) to do geography and time zones the way I want them. (Notice the time stamp and globe icon at the bottom of this post.) Specifically, last night I got the clickable Google Map on the Edit Post page to work.

Sometimes I like learning new technology. This was a lot less painful than some I've taken up, with only a couple of blind alleys and a reasonable learning curve. I'm excited that The Daily Parker could migrate to its new home in just a few weeks, if I continue to make progress.