The Daily Parker

Politics, Weather, Photography, and the Dog

New deal to extend 606 Trail

Sterling Bay, the company developing the Finkl site in Lincoln Park, has reached a deal with the Chicago Terminal Railroad to extend the 606 Trail across the Chicago River:

Sterling Bay, which plans a big development on the former Finkl steel plant site and neighboring parcels, has resolved its dispute with a rail company that owns train tracks that run across riverside land and on to Goose Island.

The rail company, Chicago-based Iowa Pacific Holdings, infuriated Sterling Bay and Goose Island landlords last fall when it rolled a couple dozen empty tanker cars across the Finkl property and onto Goose Island and left them there.

In October, Sterling Bay asked a federal agency to force Iowa Pacific to give up the tracks, arguing that they would derail development in the area. Other landlords complained that Iowa Pacific stored the cars on Goose Island merely to shake them down for money to remove them.

But the fight didn't last long: In January, an Iowa Pacific unit, the Chicago Terminal Railroad, gave up, agreeing not to oppose Sterling Bay's application with the federal Surface Transportation Board to force the rail company to abandon the tracks, according to a recent decision by the board.

The proposed extension to the trail would also include moving and modernizing the Metra station at Clybourn Junction.


This month will see two important Daily Parker milestones. This is the first one: the 6,000th post since The Daily Parker launched as a pure blog in November 2005. The 5,000th post was back in March 2016, and the 4,000th in March 2014, so I'm trucking along at just about 500 a year, as this chart shows:

Almost exactly four years ago I predicted the 6,000th post would go up in September. I'm glad the rate has picked up a bit. (New predictions: 7,000 in May 2020 and 10,000 in April 2026.)

Once again, thanks for reading. And keep your eyes peeled for another significant Daily Parker milestone in a little less than two weeks.

Winter to spring in 24 hours

Ah, Chicago, your weather really builds character.

Yesterday our official temperature got up to 27°C; today's forecast is 29°C. So it might surprise you that Sunday's low was -1°C, a record for April 29th.

Or maybe it won't surprise you. Especially given the other records we set in April:

The Chicago area saw a record 16 days in which temperatures were 32 degrees or lower in April, said Kevin Donofrio, a meteorologist for the National Weather Service. The previous record was set in 1874 and 1873, with 15 days of freezing or below-freezing temperatures. The average monthly temperature is about 49 degrees, according to the weather service.

In addition, this month may go down as the fourth-coldest April on record for the Chicago area in terms of temperature averages, Donofrio said. The cold start to spring postponed Cubs games and prompted the CTA to keep its “L” platform heat lamps on as commuters slogged through a chilly April.

Snow and cold in Canada was to blame for the lower-than-normal temperatures in Chicago in recent weeks, Donofrio said.

In April, the Chicago area saw six days with accumulating snow and three days with flurries, Donofrio said. The snowfall wasn’t uncommon, though the area did set a record on April 9 with 2 inches of snow accumulation.

Yes, blame Canada. But really, right now Canada—really just Nunavut and northern Quebec—is the only place in the northern hemisphere with significantly below-normal temperatures. The planet as a whole is 0.4°C above normal, and hasn't been below normal in years. (This has remained true even when the normals are adjusted at 10-year intervals.)

But hey, it's May. I'll take a few spring days before we have to turn the A/C on again.

List of 2018 A-to-Z topics

Here's the complete list of topics in the Daily Parker's 2018 Blogging A-to-Z challenge on the theme "Programming in C#":

Generally I posted all of them at noon UTC (7am Chicago time) on the proper day, except for the ones with stars. (April was a busy month.)

I hope you've enjoyed this series. I've already got topic ideas for next year. And next month the blog will hit two huge milestones, so stay tuned.

Z is for Zero

Today is the last day of the 2018 Blogging A-to-Z challenge. Today's topic: Nothing. Zero. Nada. Zilch. Null.

The concept of "zero" only made it into Western mathematics a few centuries ago, and it still has yet to make it into many developers' brains. The problems arise in particular when dealing with arrays and with unexpected nulls.

In C#, arrays are zero-based. An array's first element appears at position 0:

var things = new[] { 1, 2, 3, 4, 5 };

Console.WriteLine(things[1]);
// -> 2

This causes no end of headaches for new developers who expect that, because the array above has a length of 5, its last element is #5. But doing this:

Console.WriteLine(things[5]);

...throws an IndexOutOfRangeException.

You get a similar problem when you try to read a string, because if you recall, strings are basically just arrays of characters:

var word = "12345";

Console.WriteLine(word.Length);
// 5

Console.WriteLine(word[5]);
// IndexOutOfRangeException

The funny thing is, both the array things and the string word have a length of 5.

The other bugaboo is null. Null means nothing. It is the absence of anything. It equals nothing, not even itself (though this, alas, is not always true).

Reference types can be null, and value types cannot. That's because value types always have to have a value, while reference types can simply be a reference to nothing. That said, the Nullable<T> structure gives value types a way into the nulliverse that even comes with its own cool syntax:

int? q = null;
int r = 0;
Console.WriteLine((q ?? 0) + r);
// 0

(What I love about this "struct?" syntax is you can almost hear it in a Scooby Doo voice, can't you?)

Line 1 defines a nullable System.Int32 as null. Line 2 defines a bog-standard Int32 equal to zero. You can't just grab q.Value when q is null—that throws an InvalidOperationException—so you have to check first. Line 3 shows the null-coalescing operator, which contracts that check into a succinct little fragment:

// Long form:
int result;
if (q.HasValue)
	result = q.Value + r;
else
	result = 0 + r;

// Shorter form:
int result = (q.HasValue ? q.Value : 0) + r;

// Shortest form (parentheses needed: ?? binds more loosely than +):
int result = (q ?? 0) + r;

And so the Daily Parker concludes the 2018 Blogging A-to-Z challenge with an entire post about nothing. I hope you've enjoyed the posts this month. Later this morning, I'll post the complete list of topics as a permanent page. Let me know what you think in the comments. It's been a fun challenge.

Y is for Y2K (and other date/time problems)

I should have posted day 25 of the Blogging A-to-Z challenge yesterday, but life happened, as it has a lot this month. I'm looking forward to June when I might not have the over-scheduling I've experienced since mid-March. We'll see.

So it's appropriate that today's topic involves one of the things most programmers get wrong: dates and times. And we can start 20 years ago when the world was young...

A serious problem loomed in the software world in the late 1990s: programmers, starting as far back as the 1950s, had used 2-digit fields to represent the year portion of dates. As I mentioned Friday, it's important to remember that memory, communications, and storage cost a lot more than programmer time until the last 15 years or so. A 2-digit year field makes a lot of sense in 1960, or even 1980, because it saves lots of money, and why on earth would people still use this software 20 or 30 years from now?

You can see (or remember) what happened: the year 2000. If today is 991231 and tomorrow is 000101, what does that do to your date math?
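The arithmetic failure is easy to reproduce. Here's a minimal sketch with hypothetical account data, plus the "windowing" trick that many remediation teams actually used (the cutoff of 50 is illustrative):

```csharp
using System;

// Hypothetical: an account opened in 1999, checked in 2000,
// with years stored as two digits (99 and 00).
int openedYear = 99;   // meant 1999
int currentYear = 0;   // meant 2000

// Naive two-digit subtraction says the account is -99 years old.
int accountAge = currentYear - openedYear;
Console.WriteLine(accountAge); // -99

// A common "windowing" fix: treat years below the pivot as 20xx.
int Widen(int yy) => yy < 50 ? 2000 + yy : 1900 + yy;
Console.WriteLine(Widen(currentYear) - Widen(openedYear)); // 1
```

Windowing only postponed the problem, of course; systems that stored four-digit years (or day numbers from a fixed epoch) didn't need it at all.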

It turns out, not a lot, because programmers generally planned for it way more effectively than non-technical folks realized. On the night of 31 December 1999, I was in a data center at a brokerage in New York, not doing anything. Because we had fixed all the potential problems already.

But as I said, dates and times are hard. Start with times: 24 hours, 60 minutes, 60 seconds...that's not fun. And then there's the calendar: 12 months, 52 weeks, 365 (or 366) days...also not fun.

It becomes pretty obvious even to novice programmers who think about the problem that days are the best unit to represent time in most human-scale cases. (Scientists, however, prefer seconds.) I mentioned on day 8 that I used Julian day numbers very, very early in my programming life. Microsoft's .NET platform takes a similar approach, representing a date as a single count from a fixed epoch, and relegates the display of date information to a different set of classes.

I'm going to skip the DateTime structure because it's basically useless. It will give you no end of debugging problems with its asinine DateTime.Kind member. This past week I had to fix exactly this kind of thing at work.

Instead, use the DateTimeOffset structure. It represents an unambiguous point in time, with a DateTime value for the date and time and a TimeSpan value for the offset from UTC. As Microsoft explains:

The DateTimeOffset structure includes a DateTime value, together with an Offset property that defines the difference between the current DateTimeOffset instance's date and time and Coordinated Universal Time (UTC). Because it exactly defines a date and time relative to UTC, the DateTimeOffset structure does not include a Kind member, as the DateTime structure does. It represents dates and times with values whose UTC ranges from 12:00:00 midnight, January 1, 0001 Anno Domini (Common Era), to 11:59:59 P.M., December 31, 9999 A.D. (C.E.).

The time component of a DateTimeOffset value is measured in 100-nanosecond units called ticks, and a particular date is the number of ticks since 12:00 midnight, January 1, 0001 A.D. (C.E.) in the GregorianCalendar calendar. A DateTimeOffset value is always expressed in the context of an explicit or default calendar. Ticks that are attributable to leap seconds are not included in the total number of ticks.

Yes. This is the way to do it. Except...well, you know what? Let's skip how the calendar has changed over time. (Short answer: the year 1 was not the year 1.)

In any event, DateTimeOffset gives you methods to calculate times and dates accurately across a nearly 10,000-year range.
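A short sketch of that unambiguity (the values are illustrative): two DateTimeOffset values with different offsets compare equal when they represent the same instant, which is exactly what the Kind-afflicted DateTime can't promise.

```csharp
using System;

// The same instant, recorded with two different UTC offsets:
// 14:51 in Chicago (UTC-5 during daylight saving time) and 19:51 UTC.
var chicago = new DateTimeOffset(2018, 4, 26, 14, 51, 0, TimeSpan.FromHours(-5));
var utc     = new DateTimeOffset(2018, 4, 26, 19, 51, 0, TimeSpan.Zero);

// Equality compares the instant, not the local clock face.
Console.WriteLine(chicago == utc); // True

// Date math is likewise unambiguous across offsets.
Console.WriteLine((utc - chicago).TotalHours); // 0
```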

Which is to say nothing of time zones...

X is for XML vs. JSON

Welcome to the antepenultimate day (i.e., the 24th) of the Blogging A-to-Z challenge.

Today we'll look at how communicating between foreign systems has evolved over time, leaving us with two principal formats for information interchange: eXtensible Markup Language (XML) and JavaScript Object Notation (JSON).

Back in the day, even before I started writing software, computer systems talked to each other using specific protocols. Memory, tape (!) and other storage, and communications had significant costs per byte of data. Systems needed to squeeze out every bit in order to achieve acceptable performance and storage costs. (Just check out my computer history, and wrap your head around the 2400 bit-per-second modem that I used with my 4-megabyte 386 box, which I upgraded to 8 MB for $350 in 2018 dollars.)

So, if you wanted to talk to another system, you and the other programmers would work out a protocol that specified what each byte meant at each position. Then you'd send cryptic codes over the wire and hope the other machine understood you. Then you'd spend weeks debugging minor problems.

Fast forward to the mid-1990s, when storage and communications costs finally dropped below labor costs, and the W3C created XML (the 1.0 recommendation arrived in 1998). Now, instead of doing something like this:

METAR KORD 261951Z VRB06KT 10SM OVC250 18/M02 A2988

You could do something like this:

<?xml version="1.0" encoding="utf-8"?>
<weatherReport>
	<station name="Chicago O'Hare Field">KORD</station>
	<observationTime timeZone="America/Chicago" utc="2018-04-26T19:51+0000">2018-04-26 14:51</observationTime>
	<winds>
		<direction degrees="">Variable</direction>
		<speed units="Knots">6</speed>
	</winds>
	<visibility units="miles">10</visibility>
	<clouds>
		<layer units="feet" ceiling="true" condition="overcast">25000</layer>
	</clouds>
	<temperature units="Celsius">18</temperature>
	<dewpoint units="Celsius">-2</dewpoint>
	<altimeter units="inches Hg">29.88</altimeter>
</weatherReport>

The XML only takes up a few hundred bytes (612 uncompressed, about 300 compressed), but humans can read it, and so can computers. You can even create and share an XML Schema Definition (XSD) describing what the XML document should contain. That way, both the sending and receiving systems can agree on the format, and change it as needed without a lot of reprogramming.

To display XML, you can use eXtensible Style Language (XSL), which applies CSS styles to your XML. (My Weather Now project uses this approach.)

A few years later, Douglas Crockford defined an even simpler standard: JSON. It removes the heavy structure of XML and presents data as a set of key-value pairs. Now our weather report can look like this:

{
  "weatherReport": {
    "station": {
      "name": "Chicago O'Hare Field",
      "icao code": "KORD"
    },
    "observationTime": {
      "timeZone": "America/Chicago",
      "utc": "2018-04-26T19:51+0000",
      "local": "2018-04-26 14:51 -05:00"
    },
    "winds": {
      "direction": { "text": "Variable" },
      "speed": {
        "units": "Knots",
        "value": "6"
      }
    },
    "visibility": {
      "units": "miles",
      "value": "10"
    },
    "clouds": {
      "layer": {
        "units": "feet",
        "ceiling": "true",
        "condition": "overcast",
        "value": "25000"
      }
    },
    "temperature": {
      "units": "Celsius",
      "value": "18"
    },
    "dewpoint": {
      "units": "Celsius",
      "value": "-2"
    },
    "altimeter": {
      "units": "inches Hg",
      "value": "29.88"
    }
  }
}
JSON is easier to read, and JavaScript (and JavaScript libraries like jQuery) can parse it natively. You can add or remove key-value pairs as needed, often without the receiving system complaining. There's even a JSON Schema project that promises to give you the security of XSD.
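On the .NET side you need a parser; this sketch assumes the widely used Json.NET (Newtonsoft.Json) package, reading a trimmed-down version of the report above:

```csharp
using System;
using Newtonsoft.Json.Linq;

// A trimmed-down version of the weather report above.
var json = @"{
  ""weatherReport"": {
    ""station"": { ""name"": ""Chicago O'Hare Field"", ""icao code"": ""KORD"" },
    ""temperature"": { ""units"": ""Celsius"", ""value"": ""18"" }
  }
}";

// JObject.Parse builds a navigable tree of JTokens.
var report = JObject.Parse(json)["weatherReport"];
Console.WriteLine((string)report["station"]["icao code"]); // KORD
Console.WriteLine((string)report["temperature"]["value"]); // 18
```

Because the consumer navigates by key, adding a new key-value pair to the report doesn't break this code—which is much of JSON's appeal.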

Which format should you use? It depends on how structured you need the data to be, and how easily you need to read it as a human.

More reading:

Three on climate change

Earlier this week, the Post reported on data that one of the scariest predictions of anthropogenic climate change theory seems to be coming true:

The new research, based on ocean measurements off the coast of East Antarctica, shows that melting Antarctic glaciers are indeed freshening the ocean around them. And this, in turn, is blocking a process in which cold and salty ocean water sinks below the sea surface in winter, forming “the densest water on the Earth,” in the words of study lead author Alessandro Silvano, a researcher with the University of Tasmania in Hobart.

In other words, the melting of Antarctica’s glaciers appears to be triggering a “feedback” loop in which that melting, through its effect on the oceans, triggers still more melting. The melting water stratifies the ocean column, with cold fresh water trapped at the surface and warmer water sitting below. Then, the lower layer melts glaciers and creates still more melt water — not to mention rising seas as glaciers lose mass.

"The idea is that this mechanism of rapid melting and warming of the ocean triggered sea level rise at other times, like the last glacial maximum, when we know rapid sea level rise was five meters per century,” Silvano said. “And we think this mechanism was the cause of rapid sea-level rise.”

Meanwhile, Chicago magazine speculates about what these changes will mean to our city in the next half-century:

Can Chicago really become a better, maybe even a far better, place while much of the world suffers the intensifying storms and droughts resulting from climate change? A growing consensus suggests the answer may be a cautious yes. For one, there’s Amir Jina, an economist at the University of Chicago who studies how global warming affects regional economies. In the simulations he ran, as temperatures rise, rainfall intensifies, and seas surge, Chicago fares better than many big U.S. cities because of its relative insulation from the worst ravages of heat, hurricanes, and loss of agriculture.

Indeed, the Great Lakes could be considered our greatest insurance against climate change. They contain 95 percent of North America’s supply of freshwater—and are protected by the Great Lakes Water Compact, which prohibits cities and towns outside the Great Lakes basin from tapping them. While aquifers elsewhere run dry, Chicago should stay flush for hundreds of years to come.

“We’re going to be like the Saudi Arabia of freshwater,” says David Archer, a professor of geophysical science at the University of Chicago. “This is one of the best places in the world to live out global warming.”

There’s just one problem: Water, which should be our salvation, could also do us in.

The first drops of the impending deluge have already fallen. Every one-degree rise in temperature increases the atmosphere’s capacity to hold water vapor by almost 4 percent. As a result, rain and snow come down with more force. Historically, there’s been a 4 percent chance of a storm occurring in any given year in Chicago that drops 5.88 inches of rain in 48 hours—a so-called 25-year storm. In the last decade alone, we have had one 25-year storm, plus a 50-year storm and, in 2011, a 100-year storm. In the best-case scenario, where carbon emissions stay relatively under control, we’re looking at a 25 percent increase in the number of days with extreme rainfall by the end of the century. The worst-case scenario sees a surge of 60 percent. Precipitation overall may increase by as much as 30 percent.

And in today's Times, Justin Gillis and Hal Harvey argue that cars are ruining our cities as well as our climate:

[T]he truth is that people who drive into a crowded city are imposing costs on others. They include not just reduced mobility for everyone and degraded public space, but serious health costs. Asthma attacks are set off by the tiny, invisible soot particles that cars emit. Recent research shows that a congestion charge in Stockholm reduced pollution and sharply cut asthma attacks in children.

The bottom line is that the decision to turn our public streets so completely over to the automobile, as sensible as it might have seemed decades ago, nearly wrecked the quality of life in our cities.

We are revealing no big secrets here. Urban planners have known all these things for decades. They have known that removing lanes to add bike paths and widen sidewalks can calm traffic, make a neighborhood more congenial — and, by the way, increase sales at businesses along that more pleasant street. They have known that imposing tolls with variable pricing can result in highway lanes that are rarely jammed.

We're adapting, slowly, to climate change. Over my lifetime I've seen the air in Chicago and L.A. get so much cleaner I can scarcely remember how bad it was growing up. (Old photos help.) But we're in for some pretty big changes in the next few years. I think Chicago will ultimately do just fine, except for being part of the world that has to adapt more dramatically than any time in the last few thousand years.

W is for while (and other iterators)

We're in the home stretch. It's day 23 of the Blogging A-to-Z challenge and it's time to loop-the-loop.

C# has a number of ways to iterate over a collection of things, and a base interface that lets you know you can use an iterator.

The simplest way to iterate is to use while, which just keeps looping until a condition is met:

var n = 1;
while (n < 6)
{
	Console.WriteLine($"n = {n}");
	n++;
}

while is similar to do:

var n = 1;
do
{
	Console.WriteLine($"n = {n}");
	n++;
} while (n < 6);

The main difference is that the do loop will always execute once, but the while loop may not.

The next level up is the for loop:

for (var n = 1; n < 6; n++)
	Console.WriteLine($"n = {n}");

Similar, no?

Then there is foreach, which iterates over a set of things. This requires a bit more explanation.

The base interface IEnumerable and its generic equivalent IEnumerable<T> expose a single method, GetEnumerator, that foreach uses to go through all of the items in the class. Generally, anything in the BCL that holds a set of objects implements IEnumerable: System.Array, System.Collections.ICollection, System.Collections.Generic.List<T>...and many, many others. Each of these classes lets you manipulate the set of objects the thing contains:

var things = new[] { 1, 2, 3, 4, 5 }; // array of int, or int[]
foreach (var it in things)
	Console.WriteLine(it);
foreach will iterate over all the things in the order they were added to the array. But it also works with LINQ to give you even more power:

var things = new List<int> { 1, 2, 3, 4, 5 };
foreach (var it in things.Where(p => p % 2 == 0))
	Console.WriteLine(it);
Three guesses what that snippet does.
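And it's not just BCL types: any class can opt into foreach by implementing IEnumerable<T>, and the yield return keyword writes the enumerator state machine for you. A minimal sketch, with a hypothetical Countdown class:

```csharp
using System;
using System.Collections;
using System.Collections.Generic;

// foreach works on anything that implements IEnumerable<T>:
foreach (var n in new Countdown(3))
    Console.WriteLine(n); // 3, then 2, then 1

public class Countdown : IEnumerable<int>
{
    private readonly int _start;
    public Countdown(int start) => _start = start;

    // yield return builds the enumerator that foreach consumes.
    public IEnumerator<int> GetEnumerator()
    {
        for (var n = _start; n > 0; n--)
            yield return n;
    }

    // The non-generic interface just delegates to the generic one.
    IEnumerator IEnumerable.GetEnumerator() => GetEnumerator();
}
```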

These keywords and structures are so fundamental to C# that I recommend reading up on them.

Eddie Lampert offers to garrote his own company

Longtime readers know how much I loathe Eddie Lampert for what he did to Sears and for how perfectly he demonstrates the dangers of slavishly following a philosophy that owes a lot to the thought processes of adolescent boys.

Well, my longtime predictions seem to be coming true. Lampert has offered to buy the best bits of Sears (i.e., its real estate and Kenmore brand), which would quickly kill the company. Crain's Joe Cahill outlines some of the offal in this awful person's proposal:

It's not clear, however, just what Lampert is willing to pay. The offer letter indicates the transaction should reflect an enterprise value of $500 million for the home improvement and parts businesses, but doesn't put a price on Kenmore or the real estate, beyond confirming Lampert would assume $1.2 billion in real estate debt. The letter further proposes that the asset sale take place in conjunction with offers by Sears to convert some of its debt into equity and buy back or exchange for equity another slug of outstanding debt. Lampert indicates a willingness to "consider participating in such exchange offer and tender offer," which might increase his equity interest in Sears.

The complex and somewhat vague proposal raises questions about Lampert's many hats at Sears—he's the controlling shareholder, CEO, a major creditor, and—if this transaction goes through—a buyer of key company assets. Let's focus on his role as CEO, where his job is to generate maximum returns on company assets, either through business operations or by selling them for the highest possible price. His offer letter implicitly confirms that he's been unable to do either with Kenmore. Yet he evidently believes he could squeeze strong returns out of the brand if he owned it separately from Sears. Otherwise, buying it would make no financial sense for Lampert and any fellow investors in the proposed asset purchase.

Understandably, this disconnect fuels a growing perception that Lampert is cherry-picking company assets ahead of a potential bankruptcy filing that likely would leave Sears shareholders with little or nothing. Already, a real estate investment trust formed by Lampert has acquired many of Sears' store locations with the intention of remarketing them to higher-paying tenants. "There's a very legitimate case to say he's screwed up the company and now he's trying to take the crown jewels," says Nell Minow, a corporate governance expert with Value Edge Advisors.

No kidding. Thanks, Eddie.