The Daily Parker

Politics, Weather, Photography, and the Dog

This sort of thing has cropped up before

...and it has always been due to human error.

Today, I don't mean the HAL-9000. Amtrak:

Amtrak said “human error” is to blame for the disrupted service yesterday at Union Station.

A worker fell on a circuit board, which turned off computers and led to the service interruption, according to U.S. Sen. Dick Durbin.

The delay lasted more than 12 hours and caused significant overcrowding at Union Station.

The error affected more than 60,000 Amtrak and Metra passengers taking trains from Union to the suburbs, according to reports. Some riders resorted to taking the CTA or using ride-sharing services to get home, Chicago Tribune reported.

An analysis of the signal system failures determined they were caused by “human error in the process of deploying a server upgrade in our technology facility that supports our dispatch control system” at Union Station, Amtrak said in a statement. Amtrak apologized in the statement for failing to provide the service that’s expected of it.

Which led my co-workers to wonder, why the hell were they doing a critical server upgrade in the middle of the day?

Olé, olé olé olé!

Oh, I love these stories. On today's Daily WTF, editor Remy Porter describes the world I grew up in, where dates were dates and 30 December 1899 ruled them all:

If you wanted to set a landmark, you could pick any date, but a nice round number seems reasonable. Let's say, for example, January 1st, 1900. From there, it's easy to just add and subtract numbers of days to produce new dates. Oh, but you do have to think about leap years. Leap years are more complicated—a year is a leap year if it's divisible by four, but not if it's divisible by 100, unless it's also divisible by 400. That's a lot of math to do if you're trying to fit a thousand rows in a spreadsheet on a computer with less horsepower than your average 2019 thermostat.

So you cheat. Checking if a number is divisible by four doesn't require a modulus operation—you can check that with a bitmask, which is super fast. Unfortunately, it means your code is wrong, because you think 1900 is a leap year. Now all your dates after February 28th are off-by-one. Then again, you're the one counting. Speaking of being the one counting, while arrays might start at zero, normal humans start counting at one, so January 1st should be 1, which makes December 31st, 1899 your "zero" date.

Our macro language is off-by-one for the first few months of 1900, but that discrepancy is acceptable, and no one at Microsoft, including Bill Gates who signed off on it, cares.

The Basic-derived macro language is successful enough inside of Excel that it grows up to be Visual Basic. It is "the" Microsoft language, and when they start extending it with features like COM for handling library linking and cross-process communication, it lays the model. Which means when they're figuring out how to do dates in COM… they use the Visual Basic date model. And COM was the whole banana, as far as Windows was concerned—everything on Windows touched COM or its successors in some fashion. It wasn't until .NET that the rule of December 30th, 1899 was finally broken, but it still crops up in Office products and SQL Server from time to time.
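Porter's two tricks are easy to demonstrate. Here's a minimal Python sketch (the function names are mine) of the full Gregorian rule next to the bitmask shortcut, plus the 30 December 1899 epoch that Visual Basic and COM settled on:

```python
from datetime import datetime, timedelta

def is_leap(year):
    # The full Gregorian rule: every fourth year, except centuries,
    # except centuries divisible by 400.
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

def is_leap_fast(year):
    # The spreadsheet-era shortcut: test the low two bits instead of
    # doing a modulus. Cheap, and wrong for 1900.
    return year & 3 == 0

print(is_leap(1900), is_leap_fast(1900))  # False True -- the bug
print(is_leap(2000), is_leap_fast(2000))  # True True

# COM/OLE automation dates count days from 30 December 1899. Shifting
# the epoch back two days makes the serial numbers agree with the
# leap-bug-afflicted spreadsheet numbering for every date from
# 1 March 1900 onward.
OLE_EPOCH = datetime(1899, 12, 30)

def from_ole(serial):
    return OLE_EPOCH + timedelta(days=serial)

print(from_ole(2))   # 1900-01-01 (the spreadsheet calls this day 1)
print(from_ole(61))  # 1900-03-01 (both systems agree from here on)
```

The off-by-one phantom of February 29th, 1900 sits between those two dates, which is exactly why the two-day epoch shift makes everything line up afterward.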

.NET's DateTime and DateTimeOffset values both bottom out at 1 January 0001 on the proleptic Gregorian calendar. SQL Server's datetime type, on the other hand, starts at 1 January 1753, just after the British Empire finally adopted the Gregorian calendar. (The newer datetime2 type goes back to year 1.)

The bottom line: dates are hard.

Rally nice view

I'm happy to announce that I started a new role on the 14th at Rally Health, a software company wholly owned by UnitedHealth Group. I'll have more to say later (still figuring out the social media policies), but for now I can say, look at the view:

And here's the view from the north:

Today, by the way, is the first day since I started that we've had anything approaching full sunlight. Of course, it's frighteningly cold out, but hey: nice view.

(I'll update Facebook and LinkedIn over the weekend, for those of you who care about those things.)

Detecting Alzheimer's in a novel

Researchers used Iris Murdoch's last novel to quantify how Alzheimer's first signs show up in language:

As [neurologist Peter] Garrard explains, a patient’s vocabulary becomes restricted, and they use fewer words that are specific labels and more words that are general labels. For example, it’s not incorrect to call a golden retriever an “animal,” though it is less accurate than calling it a retriever or even a dog. Alzheimer’s patients would be far more likely to call a retriever a “dog” or an “animal” than “retriever” or “Fred.” In addition, Garrard adds, the words Alzheimer’s patients lose tend to appear less frequently in everyday English than words they keep — an abstract noun like “metamorphosis” might be replaced by “change” or “go.”

Researchers also found the use of specific words decreases and the noun-to-verb ratio changes as more “low image” verbs (be, come, do, get, give, go, have) and indefinite nouns (thing, something, anything, nothing) are used in place of their more unusual brethren. The use of the passive voice falls off markedly as well. People also use more pauses, Garrard says, as “they fish around for words.”

For his analysis of Murdoch, Garrard used a program called Concordance to count word tokens and types in samples of text from three of her novels: her first published effort, Under the Net; a mid-career highlight, The Sea, The Sea, which won the Booker prize in 1978; and her final effort, Jackson’s Dilemma. He found that Murdoch’s vocabulary was significantly reduced in her last book — “it had become very generic,” he says — as compared to the samples from her two earlier books.
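The counting Concordance does is straightforward to sketch. Here's a minimal Python version (the tokenizer and the sample sentence are my own illustration, not Garrard's data):

```python
import re

def lexical_profile(text):
    # Count word tokens (every word occurrence) and word types
    # (distinct words), the raw numbers a concordance program reports.
    tokens = re.findall(r"[a-z']+", text.lower())
    types = set(tokens)
    return {
        "tokens": len(tokens),
        "types": len(types),
        # Type-token ratio: a crude proxy for vocabulary richness.
        # Lower values mean more repetition of the same few words.
        "ttr": len(types) / len(tokens),
    }

sample = "the dog saw the thing and the dog did the thing again"
print(lexical_profile(sample))  # 12 tokens, 7 types, TTR of about 0.58
```

A falling type-token ratio across samples is the kind of signal Garrard reports, though real analyses have to compare samples of equal length, since the ratio drops naturally as a text gets longer.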

Apparently there's a movie about Iris Murdoch too.

Standard, Core, and Framework

Let me elaborate on last night's post.

Microsoft has two flavors of .NET right now: the .NET Framework, which has been in production since February 2002, and .NET Core, which came out in June 2016. .NET Core implements the .NET Standard, which defines a set of APIs that any .NET application can use.

Here's the problem: the 17-year-old Framework has a lot more in it than the 2-year-old Standard specification or the Core implementation. So while a library that targets .NET Standard works on both the Framework and Core, not all Fx code works with Core.

Where this bit me over the weekend is dealing with Microsoft Azure Tables. I store almost all the data in Weather Now in Tables, because it's a lot of data that doesn't get read a lot—Tables' primary use case. There are .NET Standard implementations of Azure Storage Blob, Azure Queues, and Azure Files...but not Azure Tables. The latest implementation of Microsoft.Azure.CosmosDB.Table only supports the .NET Framework.

And that's a problem, because the new version of the Inner Drive Framework will follow .NET Standard 2.0 (or 3.0, if it comes out soon).
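For what it's worth, targeting the Standard comes down to a single property in an SDK-style project file. A minimal sketch (the project itself is hypothetical):

```xml
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <!-- netstandard2.0 libraries load on both .NET Framework 4.6.1+
         and .NET Core 2.0+ -->
    <TargetFramework>netstandard2.0</TargetFramework>
  </PropertyGroup>
</Project>
```

The catch, of course, is that every package the library references has to target the Standard too, which is exactly where Azure Tables falls down.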

So yesterday I spent an hour going in circles before finally getting a definitive answer on this point.

Support for Azure Tables will happen soon enough, and I have a lot of documentation to write before the new Framework is ready for prime time. But I really wanted to tie a bow on it this weekend.

Waiting for an update

I'm mostly done with a major revision to the Inner Drive Framework, and I've discovered, to my horror, that one part can't be done yet. Microsoft Azure Table support doesn't work with .NET Standard yet.

This will make more sense at some point soon.

Stuff to read later

Of note:

Fun times!

My daily living hell

We've known this for 50 years: open-plan offices do nothing good for companies except reduce rent costs, but they do a whole lot of bad. They are not "fun"; they are not "collaborative"; they are not "start-uppy." They just suck:

Over the decades, a lot of really stupid management fads have come and gone, including:

  1. Six Sigma, where employees wear different colored belts (like in karate) to show they've been trained in the methodology.
  2. Stack Ranking, where employees are encouraged to rat each other out in order to secure their own advancement and budget.
  3. Consensus Management, where all decisions must pass through multiple committees before being implemented.

It need hardly be said that these fads were and are (at best) a waste of time and (at worst) a set of expensive distractions. But open plan offices are worse. Much worse. Why? Because they decrease rather than increase employee collaboration.

Previous studies of open plan offices have shown that they make people less productive, but most of those studies gave lip service to the notion that open plan offices would increase collaboration, thereby offsetting the damage.

The Harvard study, by contrast, undercuts the entire premise that justifies the fad. And that leaves companies with only one justification for moving to an open plan office: less floor space, and therefore a lower rent.

As an introvert in a field that requires concentration, minimal distractions, and time to reflect and think about what I'm doing—a field, not incidentally, comprising mostly introverts—I find the open plan even worse.

I wish I had at least a cubicle.

The Big Disruption

I started reading Jessica Powell's online novel The Big Disruption last week. It's hilarious. And it has a lot to say about the archetypes of software development.

The premise is that the monarch of a fictional country has been exiled to California, where he finds work first as a janitor at Stanford and then at a hot startup. He applies to a Google-like company and gets hired—but by accident, as a product manager.

Sample:

Arsyen washed his hands and returned to the cubicle, armed with his new vocabulary.

When Roni asked Arsyen about prioritization, Arsyen asked, “Is this on the roadmap?”

When Sven suggested adding images of attractive women to the car dashboard, Arsyen rubbed his chin.

“Does this align with our strategy?”

When all three looked to him for an opinion in how best to implement Symmetry Enhancement, Arsyen stood and put his hands on his hips.

“Does this align with the strategy on our roadmap?”

No one seemed to notice anything was amiss. If anything, it seemed like product managers just asked questions that other people had to answer.

“Good brainstorm, everyone. Let’s break for lunch,” Roni said. “Oh, and Arsyen, this is still very confidential, so let’s get this whiteboard cleaned off.”

Arsyen jumped up and began to wipe the whiteboard clean as Sven and Jonas scooted their chairs back to their desks. Arsyen was pleased that product managers seemed to have some janitorial tasks in their role. Maybe this wouldn’t be such a stretch after all.

I can't read it at work because I would have to explain why I'm laughing so hard.