The Daily Parker

Politics, Weather, Photography, and the Dog

Under the hood of Weather Now

My most recent post mentioned finishing the GetWeather component of Weather Now, my demo project that provides near-real-time aviation weather for most of the world. I thought some readers might be interested to know how it works.

The GetWeather component has three principal tasks:

  • Download raw weather data files from NOAA;
  • Parse the downloaded files; and
  • Store the parsed weather data for the rest of the application to use.

In the Inner Drive Technology world, an Azure worker process uses an arbitrary collection of objects that implement the IWorkerTask interface. The interface defines Interval and LastRun properties and an Execute method, which is all the worker process needs to know. The tasks are responsible for their own lifespans, reentry prevention, etc. (That's another discussion.)
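
Concretely, the contract and the worker process's loop look something like this. This is just a sketch: only the member names (Interval, LastRun, Execute) come from the interface described above; the types, the runner class, and the 30-second polling interval are my own guesses for illustration.

    using System;
    using System.Collections.Generic;
    using System.Threading;

    // Sketch only: the member names come from the post; the types are assumptions.
    public interface IWorkerTask
    {
        TimeSpan Interval { get; }
        DateTimeOffset LastRun { get; }
        void Execute();
    }

    // The worker process needs nothing but the interface; each task handles its
    // own lifespan, re-entry prevention, and updating LastRun inside Execute.
    public class WorkerTaskRunner
    {
        private readonly IEnumerable<IWorkerTask> _tasks;

        public WorkerTaskRunner(IEnumerable<IWorkerTask> tasks)
        {
            _tasks = tasks;
        }

        public void Run()
        {
            while (true)
            {
                foreach (var task in _tasks)
                {
                    // Run any task whose interval has elapsed since its last run.
                    if (DateTimeOffset.UtcNow - task.LastRun >= task.Interval)
                    {
                        task.Execute();
                    }
                }
                Thread.Sleep(TimeSpan.FromSeconds(30));
            }
        }
    }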

In order to decouple the data source (NOAA now, other sources in the future) from the application, I split the three tasks into two IWorkerTask classes (sketched below):

  • The NoaaFileDownloadingWorkerTask opens an FTP connection to the NOAA public weather servers, retrieves the files it hasn't already retrieved, and stores the contents in Azure Blob Storage; and
  • The NoaaFileParsingWorkerTask pulls the files out of Azure Storage, parses them, and stores the results in an Azure SQL Database and Azure table storage.
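
Here's a skeleton of how those two classes might implement the IWorkerTask sketch above. The class names and the run intervals (five minutes for the downloader, a couple of minutes for the parser) come from this post; everything inside Execute is a placeholder, not the real logic.

    using System;

    // Hypothetical skeletons; the real classes obviously do much more.
    public class NoaaFileDownloadingWorkerTask : IWorkerTask
    {
        public TimeSpan Interval { get { return TimeSpan.FromMinutes(5); } }
        public DateTimeOffset LastRun { get; private set; }

        public void Execute()
        {
            LastRun = DateTimeOffset.UtcNow;
            // 1. Open an FTP connection to the NOAA public weather servers.
            // 2. Retrieve any files not already downloaded.
            // 3. Store the file contents in Azure blob storage, plus some metadata.
        }
    }

    public class NoaaFileParsingWorkerTask : IWorkerTask
    {
        public TimeSpan Interval { get { return TimeSpan.FromMinutes(2); } }
        public DateTimeOffset LastRun { get; private set; }

        public void Execute()
        {
            LastRun = DateTimeOffset.UtcNow;
            // 1. Check the metadata for blobs that haven't been parsed yet.
            // 2. Pull those files out of Azure storage and parse them.
            // 3. Store the results in SQL Database and Azure table storage.
        }
    }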

I'm using Azure storage as an intermediary between the two sides of the process because my analysis led me to conclude that they're really independent of each other. Coupling the two tasks in the current (2002) version of GetWeather causes all kinds of problems, not least that a failure in one task can stop the whole thing. If, as happens given the nature of the Internet, the FTP side hits an unrecoverable problem, the whole application has to restart. In practice it simply kills itself and waits for the next time it runs, which can be a while because it runs as a Windows Server 2008 scheduled task every 30 minutes.

The new architecture will allow the parser to run every minute or two, see if it has anything to do by looking at some metadata, and do its job if needed. I can change a system setting to stop it from running (for example, because I need to do some database maintenance), while letting the downloader continue to work separately.

On the other side, the downloader can run every 5 minutes, snatch the one or two files it needs from NOAA, and shut down cleanly without waiting for the parser. NOAA likes this because the connection is only open for a few seconds, instead of the 27 minutes it stays open right now. And if the NOAA server isn't available, so what? It's a clean shutdown and a clean start a few minutes later.

This design also allows me to do something else: manually upload files for parsing and storage. This helps with testing, migration, service interruptions—all things that the current architecture has made nearly impossible.

I'm not entirely done with the application (and while writing this I just thought of an improvement I'll need to make to prevent infinite retries), but it's close. And I'm really pleased with the application so far. Stay tuned; I can now set a tentative public launch date of March 31st.

Resolving the oldest case

Five years ago, on 6 January 2008, I opened a FogBugz case (#528) to "Create NOAA Downloader". The NOAA downloader goes out to the National Weather Service, retrieves raw weather data files, and stores the files and some metadata in Windows Azure storage. Marking this work item "resolved" has been a long time coming.

Well, I just finished it, and therefore I have finished all of the pieces of the GetWeather application. And with that, I've finished the last significant piece of the Weather Now 4.0 rewrite. Total time to rewrite GetWeather: 42 hours. Total time for the rewrite so far: 66 hours.

Now all I have to do is...let's see...create worker role tasks to run the various pieces of the application (getting the weather, parsing the weather, storing the weather, and cleaning up the database), upgrade the Web site to a full Cloud Services application, deploy it to Azure, and deploy its gazetteer. That should be about 5 hours more work. Then, after a couple of weeks of mostly-passive testing, I can finally turn off the Inner Drive Technology Worldwide Data Center.

It was a sunny day

Why? Because it's too cold for clouds.

Actually, this is one of those correlation-causation issues: cold days like today (it's -15°C right now) are usually clear and sunny because both conditions result from a high-pressure system floating over the area. Still, it's pretty cold:

A February hasn't opened this cold here in the 17 years since 1996. The combination of bitterly cold temperatures, hovering at daybreak Friday near or below zero [Fahrenheit] in many corners of the metro area, plus the biting west winds gusting as high as 48 km/h, is producing 15- to 25-below-zero [Fahrenheit] wind chills—readings as challenging as any Chicagoans have encountered this season.

The first reported wind chill of -18°C or lower occurred Thursday at 8 a.m., and a string of 40 or more consecutive hours of sub -18°C wind chills is likely to continue through midnight or a bit later Friday night, when the predicted rising-temperature regime takes hold.

Still, it's February, which means lengthening days, warmer temperatures, and pitchers & catchers. Yay!

And now, mid-April

Chicago's normal high temperature for April 17th is 16°C, which by strange coincidence is the new record high for January 29th:

The warm front associated with the strong low pressure system passed through the Chicago area between 2 and 3AM on its way north, and at 6AM was oriented east-west along the Illinois-Wisconsin state line. South of the front, south to southwest winds of 24 to 45 km/h and temperatures in the mid-teens Celsius prevailed; Wheeling actually reported 15.6°C at 6AM. North of the front, through southern Wisconsin and farther north, winds were east to southeast and temperatures near freezing. Milwaukee at 6AM was 3°C.

Moreover:

The 18°C high projected for Chicago Tuesday easily replaces the day's previous 99-year record high of 15°C set in 1914 and is a reading just 1.1°C shy of the city's all-time January record high temp of 19°C set back on Jan 25, 1950. Only 5 of the 34 January 60s [Fahrenheit] on the books have made it to 18°C.

Temps in the 60s [Fahrenheit] in January are incredibly rare—a fact which can't be overstated! In fact, just 21 of 143 Januarys since records here began in 1871 have produced 60s.

The city's last 16°C January temperature took place 5 years ago when the mercury hit 18°C on Jan 7, 2008.

Ordinarily in the middle of winter in Chicago it would be customary at this point to say "It was last this warm in..." and throw out a date from last summer. But no, this is the new world of climate change, so I can say: "It was last this warm December 3rd."

Of course, it can't last. Here's the temperature forecast starting at noon today:

January to April to January in three easy steps...

Azure table partition schemes

I'm sitting at my remote office working on a conundrum: how to balance human usability against good software design.

The problem: how can I create an Azure table partitioning scheme that uses Azure efficiently while still letting the user (me) troubleshoot problems with the feature in question efficiently? This is a direct consequence of the issues I worked on this morning.

The feature is the component of the Weather Now parsing system that stores raw weather data from NOAA temporarily. By "temporarily" I mean until I delete it. Keeping the raw data will let me figure out why problems occur, and will let the application apply new features to old data in the future.

NOAA publishes "cycle files" about every 3-6 minutes. The cycle uses a predictable sequence of 750 file names that repeats about every 4 days: the files go from file000 to file750, then back to file000. Sometimes, however, NOAA restarts the sequence at 0, skips files, or just crashes entirely, so the feature has to treat the file names as effectively random. That said, the files have definite publication times, and they generally contain weather data gathered shortly before NOAA publishes them, a pattern Weather Now can use to optimize itself.

You can have practically unlimited Azure tables in a storage account; I would imagine the number is close to the Int32 maximum value of 2.1 billion. Each table can have billions of partition keys as well. Searching on a combination of Azure table name and partition key takes the same length of time no matter how many tables are in the storage account or how many partition keys each table has. Under the hood, Azure manages the indexing so efficiently that network latency will be the bigger problem in all but a few edge cases.

For Weather Now, my first thought was to create a new table for each month's NOAA files and partition the table by day. So, the weather parsing process would put the metadata for a file downloaded right now in the table "noaa201301" and use the partition key "20130127". That would give me about 5,700 rows in each table and about 190 rows in each partition.

I'm reconsidering. Given it's taken 11 years to change the way that Weather Now retrieves and stores weather data, using that scheme would give me 132 tables and 4,017 partitions, each of them kind of small. Azure wouldn't care, but it would over time clutter up the application's storage account. (The account will be cluttered enough as it is, with the millions of individual weather reports tabled by station and partitioned by month.)

On reflection, then, I'm going to create a new table of metadata each year, and partition by month. An Azure table with 69,000 rows (the number of NOAA files produced each year) isn't appreciably less efficient than one with 69 rows or 69 million, as it turns out. It will still partition the data as efficiently as the partition key suggests. But cutting the partitions down 30-fold could make a big difference in efficiency.
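
For the curious, here's roughly what that chosen scheme looks like in code, assuming the Azure storage client library's TableEntity base class. The per-year table name (e.g. "noaa2013", by analogy with the "noaa201301" example above) is my extrapolation; the row key and the entity's properties are placeholders I made up for illustration.

    using System;
    using Microsoft.WindowsAzure.Storage.Table;

    // Illustrative entity for the metadata of one downloaded NOAA cycle file.
    public class NoaaFileMetadata : TableEntity
    {
        public NoaaFileMetadata() { }   // parameterless constructor for the table serializer

        public NoaaFileMetadata(string fileName, DateTime retrievedUtc)
        {
            PartitionKey = retrievedUtc.ToString("yyyyMM");   // one partition per month, e.g. "201301"
            RowKey = retrievedUtc.ToString("yyyyMMddHHmmss") + "-" + fileName;   // assumed format
            FileName = fileName;
            RetrievedUtc = retrievedUtc;
        }

        public string FileName { get; set; }
        public DateTime RetrievedUtc { get; set; }

        // One table per year, e.g. "noaa2013".
        public static string TableNameFor(DateTime retrievedUtc)
        {
            return "noaa" + retrievedUtc.ToString("yyyy");
        }
    }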

I'm open to contrary evidence. In fact, I'd love to find some. But given the frequency of data reads (one every 5 minutes or so), and the thousands of tables already in the application's storage account, I think this is the best way to go.

Nerdy but possibly welcome update

Even though we've just gotten our first snowfall, and today has started giving us snow, freezing rain, sleet, and icy roads, there is good news.

January 27th is when things officially start looking brighter in Chicago every year. Tonight, for the first time in almost two months, the sun sets at 5pm. Then things start to become noticeably brighter: a 7am sunrise next Monday, a 5:30pm sunset two weeks after that, then a 6:30am sunrise less than a week later.

Yes, this is dorky, but trust me: you'll notice it now.

Why this last Azure move is taking so long

The Inner Drive Technology International Data Center continues to whir away (and use electricity), despite my best efforts to shut it down by moving everything to Microsoft Windows Azure.

Most of the delay finishing the move has nothing to do with its technology. Simply, my real job has taken a lot of time this month as we've worked toward launching a new application tomorrow. With 145 hours spent on that project this month, not counting the 38 hours spent helping with other projects, squeezing out the 22 hours I've managed to find for Weather Now has left me falling behind on the Oscar nominees.

For those just joining our story, Weather Now remains the last living application in the IDTIDC. This application shows real-time aviation weather for almost every airport in the world. I wrote the first version in 1998, moved it to its own domain in 2000, and published the last significant update in 2010.

For most of its life, the application benefited from having practically unlimited hardware and system software to run on. As a Microsoft partner, I've had access to Windows Server, SQL Server, and other goodies for my entire professional life. Moving to Azure changes the calculus radically.

Weather Now runs on Microsoft SQL Server 2012 Enterprise with essentially limitless disk space. In the past 14 years, the application has quietly gathered 50 GB of data, merrily occupying a physical partition scheme that takes up a good bit of a RAID-5 volume. Creating a similar architecture in Azure exceeds my budget a bit: a single medium VM to run the application and its GetWeather component plus a 50 GB SQL Database would cost about $250 per month.

Fortunately, I don't have to do that. Most of the data, you see, hardly ever gets used.

Weather Now usually has around 4,500 current weather observations and 165,000 observations from the last 24 hours. Since each row is small, and the index is positively tiny (only the station ID and observation times are indexed), the current table uses about 1.5 MB of space and the last-24 table uses 43 MB. That doesn't even break a sweat on Azure SQL Database.

No, it's the Archive table that grows like the Beast from Below. That one has all of the past observations since the site started. In some cases I've pruned the table, but basically, it has one row per observation per station. For an average station like O'Hare, that means about 10,500 per year. For a chatty, automated station that spits out a report every 20 minutes, it stores about 27,000 per year. For 2012 alone, that works out to about 47 million* rows—growing at 4 million per month.
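
(As a rough sanity check, that total squares with the station count above: roughly 4,500 reporting stations times an average of about 10,500 observations per year each works out to about 47 million rows.)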

What to do? Well, stop storing it was my first thought. It hardly ever gets used, partially because the UI doesn't have a way to pull out historical data.

On the other hand, I've frequently wanted to illustrate blog entries with specific weather reports that have permanent links. And this problem, such as it is, does not have a difficult solution.

So, among its other features, Weather Now 4.0 will store archival weather reports in Azure table storage. It won't have the full 50 GB of material initially, possibly ever; but even if it did, it would only cost about $5 per month to store it. And I've hit on a partitioning scheme that will, eventually, make finding archival data really quick and easy, no matter how much of it there is.
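
To show why that scheme makes permanent links cheap, here's an illustrative lookup under the layout mentioned in the partition-scheme entry above (archived reports tabled by station and partitioned by month). The entity, the row-key format, and the helper method are my own placeholders, not Weather Now's actual code.

    using System;
    using Microsoft.WindowsAzure.Storage.Table;

    // Illustrative only: an archived observation keyed by station (table),
    // month (partition key), and observation time (row key, assumed format).
    public class ArchivedReport : TableEntity
    {
        public string RawReport { get; set; }
    }

    public static class WeatherArchive
    {
        // Fetch one archived report for a permanent link, e.g. KORD at 2013-01-27 18:51 UTC.
        public static ArchivedReport Find(CloudTableClient client, string stationId, DateTime observedUtc)
        {
            var table = client.GetTableReference(stationId);      // e.g. a table named "KORD"
            var partitionKey = observedUtc.ToString("yyyyMM");    // e.g. "201301"
            var rowKey = observedUtc.ToString("yyyyMMddHHmm");    // assumed row-key format

            var result = table.Execute(TableOperation.Retrieve<ArchivedReport>(partitionKey, rowKey));
            return (ArchivedReport)result.Result;
        }
    }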

The conclusion should be obvious: If you start looking at things the Azure way, using Azure can save you tons of money. My current estimate of the monthly cost to run Weather Now, assuming current visitor levels and acceptable performance on "very small" Cloud Services instances, is $40 per month. If it eventually amasses 50 GB of archives, it will cost...$42 per month. And if I get thousands of visitors that require upgrading to a "small" instance, I'll start selling subscriptions, but I won't have to buy new equipment because it's Azure.

More on this later. Right now, I've got to get back to work.

* Actually, 47,704,735 rows for 2012, an average of 130,341 new rows per day.

335

Well, Chicago finally found out how long the longest stretch in recorded history without a 25 mm snowfall would be: 335 days. The official tally through 6 am was 28 mm, which looked like this in Lincoln Park:

It really won't last. The forecast calls for 11°C by Tuesday.

Six-layer morning

For the first time in almost two years, Chicago woke up to below -18°C temperatures. We last had a day this cold on 11 February 2011, when it got down to -19°C. And we still haven't got any snow:

Lake snowfall across Michigan, despite the relatively low westerly wind fetch generating it (the "fetch" is the distance over which winds travel across Lake Michigan's comparatively "warm" waters), had produced as much as 100-150 mm of accumulation late Monday—and more snow is to fall there Tuesday.

Despite snowfall there, all but a comparatively small swath of downstate Illinois, Indiana, and Ohio is reporting sub-par snowfall this season. Chicago, with just 33 mm of snow to its credit, leads the pack of snow-deprived Midwest sites with just 8% of its typical seasonal snow to date, an amount 394 mm below normal.

And we're still extending three snow records: the longest period ever without a 25 mm snowfall (333 days, still going); the longest period ever with less than 25 mm of snow on the ground (331 days, still going); and the latest-ever first 25 mm-or-greater snowfall (last broken on 17 January 1899—so we're now 5 days past the record).

Weirdest winter in memory, I tell you.

The records just keep breaking

We've had a more-or-less normal 24 hours in January, with temperatures between -1°C and -11°C: bog standard.

That said, we've also had the latest sub-freezing high temperature ever (January 1st), which ended the longest-ever stretch without sub-freezing high temperatures (310 days); the second-most days in a calendar year without a sub-freezing high temperature (354); and the fourth-longest stretch without 25 mm of cumulative snow (through January 5th). More records: the longest period ever without a 25 mm snowfall (325 days, still going); the longest period ever with less than 25 mm of snow on the ground (323 days, still going); and by Thursday, given the forecast, the latest-ever 25 mm-or-greater snowfall (last broken on 17 January 1899).

Meanwhile, it snowed in Jerusalem last week, an event as common as...well, snow in Los Angeles.

Now, with more extreme weather in more places, the *New!* *Improved!* Anthropogenic Climate Change! Yay!