The Daily Parker

Politics, Weather, Photography, and the Dog

Illinois electric utility adds power for the Cloud

The Cloud—known to us in the industry as "someone else's computers"—takes a lot of power to run. Which is why our local electric utility, ComEd, is beefing up its service to the O'Hare area:

Last month, it broke ground to expand its substation in northwest suburban Itasca to increase its output by about 180 megawatts by the end of 2019. Large data centers with multiple users often consume about 24 megawatts. For scale, 1 megawatt is enough to supply as many as 285 homes.

ComEd also has acquired land for a new substation to serve the proposed 1 million-square-foot Busse Farm technology park in Elk Grove Village that will include a data center component. The last time ComEd built a substation was in 2015 in Romeoville, to serve nearby warehouses. In the past year, Elk Grove Village issued permits for four data center projects totaling 600,000 square feet and $175 million in construction. If built, it's a 40 percent increase in total data center capacity in the village.

Insiders say Apple, Google, Microsoft and Oracle have taken on more capacity at data centers in metro Chicago in the past year or so.

One deal that got plenty of tongues wagging was from DuPont Fabros Technology, which started work earlier this year on a 305,000-square-foot data center in Elk Grove Village. DuPont, which recently was acquired by Digital Realty Trust, pre-leased half of it, or about 14 megawatts, to a single customer, believed to be Apple.

One of the oldest cloud data centers, Microsoft's North Central Azure DC, is about three kilometers south of the airport here. Notice the substation just across the tollway to the west.

Chicago's Internet tax

I've just spent a few minutes going through all my company's technology expenses to figure out which ones are subject to the completely daft rental tax that Chicago has extended to cover computing services. The City's theory is that the rental tax applies whenever you pay to use a piece of equipment that belongs to someone else for a period of time. That makes a lot of sense when you rent a car from Hertz, but rather less when you use Microsoft Azure.

My understanding of the tax may not line up exactly with the City's, but here are some examples of things I've flagged for my company.

Salesforce.com: This clearly falls within the tax ruling. You pay for an online service that runs on someone else's computers. This is exactly what the City was after when it extended the rental tax.

Microsoft Azure: The tax seems to cover only Azure Compute fees, and specifically exempts Storage charges. So how are database hours taxed? With Azure, you pay for database compute and storage together, yet the Storage portion is clearly exempt. So now we've got a recordkeeping burden that Microsoft can't help us with yet. Great.
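
For now, the least-bad answer I can think of is to download the usage detail report from the Azure portal and total the charges by meter category. Here's a minimal sketch of that idea in C#; the file name and column positions are my assumptions, not anything official, so match them against the real export before trusting the totals.

    // Minimal sketch: split an Azure usage-detail CSV into charges the
    // rental tax appears to cover (compute) and charges it exempts (storage).
    // "azure-usage.csv" and the column positions are assumptions; check
    // them against the actual export.
    using System;
    using System.IO;
    using System.Linq;

    class UsageSplitter
    {
        static void Main()
        {
            decimal taxable = 0m, exempt = 0m;

            // Assumed layout: column 0 = meter category, column 1 = charge in USD.
            foreach (var line in File.ReadLines("azure-usage.csv").Skip(1))
            {
                var fields = line.Split(',');
                if (fields[0].Trim() == "Storage")
                    exempt += decimal.Parse(fields[1]);
                else
                    taxable += decimal.Parse(fields[1]);
            }

            Console.WriteLine("Taxable under the ruling: {0:C}", taxable);
            Console.WriteLine("Exempt (storage):         {0:C}", exempt);
        }
    }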

LinkedIn Professional: This may be subject to the tax if you interpret the ruling very broadly. But a LinkedIn subscription isn't so much for the use of LinkedIn's computers (which is free) as for enhanced features of the product, which seem more like consulting services than compute time. I think we'll see some litigation over services like this one.

JetBrains ReSharper software license: This does not seem subject to the tax, because we're only paying for a license to run the software on our own computers.

Basically, the City is trying to raise revenue any way it can, but it doesn't have the technical wherewithal to understand why the tax as constituted makes no sense. Some people in my company feel this makes Chicago unattractive to business, but that's true only if you ignore how hard it is to get talented people to move away from everything the city has to offer. It's a frustrating new tax, though, and one the City probably wouldn't have to impose if the rest of the state paid for its share of the services Chicago provides to it.

Figuring out the Safe Harbor fallout

As I mentioned yesterday, the European Court of Justice has ruled that the US-EU Safe Harbor pact is illegal under European law:

The ruling, by the European Court of Justice, said the so-called safe harbor agreement was flawed because it allowed American government authorities to gain routine access to Europeans’ online information. The court said leaks from Edward J. Snowden, the former contractor for the National Security Agency, made it clear that American intelligence agencies had almost unfettered access to the data, infringing on Europeans’ rights to privacy.

The court said data protection regulators in each of the European Union’s 28 countries should have oversight over how companies collect and use online information of their countries’ citizens. European countries have widely varying stances toward privacy.

The Electronic Frontier Foundation examines the implications:

[I]f those reviews [of individual companies' transfers] continue to run against the fundamental incompatibility of U.S. mass surveillance with European data protection principles, the end result may well be a growing restriction of the commercial processing of European users' data to within the bounds of the European Union.

That would certainly force the companies to re-think and re-engineer how they manage the vast amount of data they collect. It will not, however, protect their customers from mass surveillance. The geographic siloing of data is of little practical help against mass surveillance if each and every country feels that ordinary customer data is a legitimate target for signals intelligence. If governments continue to permit intelligence agencies to indiscriminately scoop up data, then they will find a way to do that, wherever that data may be kept. Keep your data in Ireland, and GCHQ may well target it, and pass it onto the Americans. Keep your data in your own country, and you'll find the NSA—or other European states, or even your own government— breaking into those systems to extract it.

Harvard law student Alex Loomis highlighted the uncertainties for US-based companies:

But ultimately it is still hard to predict how national and EU authorities will try to enforce the ECJ decision in the short-run because, as one tech lobbyist put it, “[c]ompanies will be working in a legal vacuum.”  Industry insiders are already calling for more guidance on how to act lawfully. That’s hard, because the EU Commission’s decision is no longer controlling and each individual country thus can now enforce EU law on its own. Industry experts suggest that the turmoil will hurt smaller tech companies the most, as the latter lack separate data centers and accordingly are more likely to rely on transferring data back to the United States. As I pointed out last week, that might have some anticompetitive effects.

In short, data transfers between the EU and US are now a problem. A big one. Fortunately, at my company we don't keep any personal information—but we may still have a heck of a time convincing our European partners of that, especially if Germany and France go off the deep end on privacy.

Wrapping up Dreamforce 15

After last night's Killers and Foo Fighters concert-slash-corporate-party—and the free Sierra and Lagunitas Salesforce provided, more to the point—today's agenda has been a bit lighter than the rest of the week.

Today's 10:30 panel was hands-down my favorite. Authors David Brin and Ramez Naam spoke and took questions for an hour about the future. Pretty cool stuff, and now I have a bunch more books on my to-be-read list.

At the moment, I'm sitting at an uncomfortably low table in the exhibit hall along with a few other people trying to get some laptop time in. So I will leave you with today's sunrise, viewed from the back:

Traveling again

I haven't traveled nearly as much this year as I did the past few, but only a week after my last trip, I'm away from home again. For a few days I'll be in San Francisco for Dreamforce '15, where the Force is with me... er, dreams are forced upon you... er, I'll learn about Salesforce and hobnob with other nerds.

Unfortunately, I left all of my laptop power supplies in Chicago. And, having had the same basic Dell model for the last five computers, I have quite a few. Fortunately, my office is sending me one.

So, today's entry will be mercifully short. Photos, and musings about cloud-based CRM, to follow when I have power.

Slightly updated Weather Now

Because Microsoft has deprecated 2011-era database servers, my weather demo Weather Now needed a new database. And now it has one.

Migrating all 8 million records (including 7.2 million places) took about 36 hours on an Azure VM. Since I migrated entirely within the US East data center, there were no data transfer charges, but having a couple of VMs running over the weekend will probably cost me a few dollars more this month.
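
For anyone wondering what a migration like this looks like in code, one straightforward way to do it is a server-to-server bulk copy with SqlBulkCopy. This is only a sketch of the general technique, not my actual script: the connection strings and the Place table are placeholders, and the real migration obviously spanned more tables than this.

    // Sketch of a server-to-server table copy with SqlBulkCopy. The
    // connection strings and the Place table are placeholders; the real
    // migration covered 8 million records across multiple tables.
    using System.Data.SqlClient;

    class Migrator
    {
        static void Main()
        {
            const string sourceCs = "Server=old-server.database.windows.net;Database=WeatherNow;User ID=...;Password=...";
            const string destCs = "Server=new-server.database.windows.net;Database=WeatherNow;User ID=...;Password=...";

            using (var source = new SqlConnection(sourceCs))
            using (var dest = new SqlConnection(destCs))
            {
                source.Open();
                dest.Open();

                var select = new SqlCommand("SELECT * FROM Place", source);
                using (var reader = select.ExecuteReader())
                using (var bulk = new SqlBulkCopy(dest)
                {
                    DestinationTableName = "Place",
                    BatchSize = 10000,   // commit in chunks so a failure doesn't start over
                    BulkCopyTimeout = 0  // a 36-hour job should not time out
                })
                {
                    bulk.WriteToServer(reader);
                }
            }
        }
    }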

While I was at it, I upgraded the app to the latest Azure and Inner Drive packages, which mainly just fixed minor bugs.

The actual deployment of the updated code was boring, as it should be.

Changes to TDP on the way...maybe

Since development of DasBlog petered out in 2012, and since I have an entire (size A1) Azure VM dedicated solely to hosting The Daily Parker, I've been looking for a new blog engine for this blog.

The requirements are pretty broad:

  • Written in .NET
  • Open source or source code available for download
  • Can use SQL Server as a data source (instead of the local file system, like DasBlog)
  • Can deploy to an Azure Web App (to get it off the VM)
  • Still in active development
  • Modern appearance and user experience

See? Look-and-feel is in there somewhere. But mainly I want something I can play with.

I'm still evaluating engines. This list was really helpful, and pointed me toward DasBlog's successor, BlogEngine.NET. Mads Kristensen's newest blog engine, MiniBlog, has potential, but it doesn't seem ready for prime time yet.

The changes will come at some point in the next few months, assuming I have time to play with some options and modify the chosen engine to support a few features I want, like time zone awareness and location tagging. I also want to look at adding completely new features, like Google Timeline integration, or private journals and events, which require encryption and other security measures that blog engines don't usually have. Not to mention the possibility of using DocumentDB as a data source...

Stay tuned. The Daily Parker's 10th birthday is coming in November.

How's your week going?

It's just past 9am on Monday and already I'm reduced to this kind of blog post. Tomorrow I may have some more time to read these things:

  • Cranky Flier analyzes Malaysia Airlines' struggles.
  • Microsoft is building subsea fiber cables linking the U.S. with Europe and Asia.
  • TPM explains exactly what Jade Helm 15 really is.
  • Missed Microsoft Ignite this year? Here's the Channel 9 page.
  • We're starting to set up JetBrains TeamCity to handle our continuous-integration needs. Can someone explain, though, why the user manual is all video? Guys. Seriously. I haven't got time for this.
  • So now that Illinois actually has to pay the pensions we promised to pay, what now? (Hello, 9% income tax?)

A four-hour design review session is imminent. I may post later today...or I may lock myself in my office and stare at the wall.

Microsoft's impressive code drop

The Redmond giant stunned the software development world this week by opening up several core technologies, including the entire .NET platform, to the public:

We are building a .NET Core CLR for Windows, Mac and Linux and it will be both open source and it will be supported by Microsoft. It'll all happen at https://github.com/dotnet.

Much of the .NET Core Framework 4.6 and its Reference Source source is going on GitHub. It's being relicensed under the MIT license, so Mono (and you!) can use that source code in their .NET implementations.

Dr. Dobb's is impressed (as am I):

Of these platforms, Linux is clearly the most important. Today, Microsoft earns much of its (record) profits from enterprise software packages (SQL Server, SharePoint, Exchange, etc.). By running .NET on Linux, it now has the ability to run those apps on a significant majority of server platforms. Except for Solaris sites, all enterprises will be able to run the applications without having to add in the cost of Microsoft Server licenses.

But perhaps more important than the pure server benefit is the cloud aspect. VMs on the cloud, especially the public cloud, are principally Linux-based. Windows VMs are available, too, but at consistently higher pricing. With this move, .NET apps can now run anywhere on the cloud — or said another way, between servers and the cloud, the apps can run anywhere IT is operating.

The big winners of all this goodness are C# developers. In theory, .NET portability favors all .NET languages equally, but it's no secret that C# is the first among equals. (It is, in fact, the only language that Xamarin supports currently.) Microsoft has been an excellent steward of the language, evolving it intelligently and remarkably cleanly. Among developers who use it regularly, it is uniformly well liked, which distinguishes it from most of the other major development languages today, where an appreciation that borders on ambivalence is the more common experience.

The big loser is certainly Java. Java's stock in trade has been its longstanding ability to run without modification or recompilation on all major platforms. In this valuable trait, it has had no major competition. If Microsoft's port of .NET provides a multi-platform experience that is as smooth and seamless as Java, then the JVM will have some very serious competition.

Once I'm done with the deliverable that's due tomorrow, I may download the .NET Framework source and take a look. I'll also spin up an Azure VM and play around with Visual Studio 2015 before the end of the week.

Weather Now gazetteer now really, really fast

I mentioned over a month ago that, given some free time, I would fix the search feature on Weather Now. Well, I just deployed the fix, and it's kind of cool.

I used Lucene.NET as the search engine, incorporating it into the Inner Drive Gazetteer that underlies the geographic information in Weather Now. I won't go into too many details right now, except to say I was surprised at how compactly the index writer was able to crunch and store everything (in Azure blobs). The entire index takes up just 815 MB of blob space. That's so small a fraction of a cent per month I can't even calculate it right now.

The indexer took about six minutes per 500,000 rows. (The entire database has 7.25 million rows.) It helped that I ran it on an Azure virtual machine: at one point during index optimization I clocked the throughput at 200 Mbps. Yes, two hundred megabits per second. The full build finished in a little under two hours while I was doing other things. And once the index initializes in the Weather Now app, searches take only a second or so.
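
For the curious, here's roughly what the batch indexing looks like with Lucene.Net 3.0.3. The field names and the LoadPlaceNames stub are placeholders for the real gazetteer schema, and this sketch writes to a local directory instead of the blob-backed one the production app uses:

    // Sketch of batch-indexing gazetteer names with Lucene.Net 3.0.3.
    // Field names and LoadPlaceNames() are placeholders; the production
    // index lives in Azure blob storage rather than a local directory.
    using System.Collections.Generic;
    using System.IO;
    using Lucene.Net.Analysis.Standard;
    using Lucene.Net.Documents;
    using Lucene.Net.Index;
    using Lucene.Net.QueryParsers;
    using Lucene.Net.Search;
    using Lucene.Net.Store;
    using Version = Lucene.Net.Util.Version;

    class GazetteerIndexer
    {
        static void Main()
        {
            var analyzer = new StandardAnalyzer(Version.LUCENE_30);
            var directory = FSDirectory.Open(new DirectoryInfo("gazetteer-index"));

            using (var writer = new IndexWriter(directory, analyzer,
                true /* create a new index */, IndexWriter.MaxFieldLength.UNLIMITED))
            {
                // In production this loop covers all 7.25 million rows.
                foreach (var name in LoadPlaceNames())
                {
                    var doc = new Document();
                    doc.Add(new Field("name", name, Field.Store.YES, Field.Index.ANALYZED));
                    writer.AddDocument(doc);
                }
                writer.Optimize(); // the expensive step that saturated the network
            }

            // Searching is the cheap part: parse the user's text, take the top hits.
            using (var searcher = new IndexSearcher(directory, true /* read-only */))
            {
                var parser = new QueryParser(Version.LUCENE_30, "name", analyzer);
                TopDocs hits = searcher.Search(parser.Parse("Wrigley Building"), 10);
            }
        }

        // Placeholder for a query against the gazetteer database.
        static IEnumerable<string> LoadPlaceNames()
        {
            yield return "Wrigley Building";
        }
    }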

Go ahead. Try a search. Put in your ZIP code or the name of a prominent building near you.

I still have a lot I want to do with the application, including updating it to a responsive theme and MVC, but this is a pretty big leap.