The Daily Parker

Politics, Weather, Photography, and the Dog

I hate Scala

"It is no one's right to despise another. It is a hard-won privilege over long experience."—Isaac Asimov, "C-Chute"

For the past three months, I've worked with a programming language called Scala. When I started with it, I thought it would be a challenge to learn, but that the effort would ultimately pay off. Scala grew out of Java, which in turn is a C-based language. C#, my primary language of the last 18 years, is also a C-based language.

So I analogized thus: C# : Java :: Spanish : Italian; therefore C# : Scala :: Spanish : Neapolitan Italian. Just as my decent Spanish makes it relatively easy to navigate Italy, my decent C# had made it relatively easy to navigate Java, so I expected Scala to be just another dialect I could pick up.

Wow, was I wrong. After three months, I have a better analogy: C# : Scala :: English : Cockney Rhyming Slang. Or perhaps Jive. Or maybe even, C# : Scala :: German : Yiddish. That's right: I submit that Scala is a creole, probably of Java and Haskell.

Let me explain.

Most C-based languages use similar words to express similar concepts. In Java, C#, and other C-based languages, an instance of a class is an object. Classes can have instance and static members. If you have a list of cases to switch between, you switch...case over them. And on and on.

Scala has a thing called a class, a thing called an object, a thing called a case class, and a thing called a match. I'll leave discovering their meanings as an exercise for the reader. Hint: only one of the four works the same way as in the other related languages. In other words, Scala bases itself on one language, but changes so many things that the language is incomprehensible to most people without a lot of study.
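
Here's roughly what those four words actually do, in a minimal sketch (Menagerie, Dog, and describe are names I made up for illustration): object declares a singleton, not an instance; a case class is an immutable value type with structural equality; and match is pattern matching that deconstructs values, rather than merely branching the way switch...case does.

    object Menagerie {                        // a singleton, roughly a C# static class
      case class Dog(name: String, age: Int)  // an immutable record with structural equality

      def describe(animal: Any): String = animal match {
        case Dog(name, age) if age < 2 => s"$name is a puppy"  // deconstructs the Dog into its fields
        case Dog(name, _)              => s"$name is a dog"
        case other                     => s"no idea what $other is"
      }

      def main(args: Array[String]): Unit = {
        println(describe(Dog("Parker", 14)))  // Parker is a dog
        println(describe(42))                 // no idea what 42 is
      }
    }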

Further, though Scala comes from object-oriented Java, and while it can work in an object-oriented fashion with a lot of coercion, at heart it pushes you toward functional programming in the Haskell mold, which is where the other half of the creole comes in.

As in Yiddish or Gullah, while you can see the relationship from the creole back to the dominant language, no deductive process will get you from the dominant to the creole. "Answer the dog," a perfectly valid way to say "pick up the phone" in Cockney (dog <- dog and bone <- rhymes with phone), makes no sense to anyone outside the East End. Nor does the phrase "ich hab' in bodereim," which translates directly from Yiddish as "I have (it) in the bathtub," but really means "fuck it." The fact that Yiddish uses the Hebrew alphabet to express a Germanic creole only adds to its incomprehensibility to outsiders.

These are features, not bugs, of Jive, Gullah, Yiddish, and other creoles and slangs. They take words from the dominant language, mix them with a bunch of other languages, and survive in part as a way to distinguish the in-group from the out-group, where the out-group are the larger culture's dominant class.

I submit that Scala fits this profile.

Scala comes from academic computing, and its creator, German computer scientist Martin Odersky, maintains tight control over the language definition. Fast adoption was never his goal; nor, I think, was producing commercial software. He wanted to solve certain programming problems that current computer science thinking worries about. That it elevates a densely mathematical model of software design into commercial development is incidental, but it does increase the barriers to entry into the profession, so...maybe that's also a feature rather than a bug?

Scala isn't a bad language. It has some benefits over other languages. First, Scala encourages you to create immutable objects, meaning once an object has a value, that value can never change. This prevents threading issues, where one part of the program creates an object and another part changes the object such that the first part gets totally confused. Other languages, like C#, require developers to put guardrails, such as locks, around parts of the code to prevent that from happening. If all of your objects are immutable, however, you don't need those guardrails.
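
Here's what that idiom looks like in a minimal sketch (the Account class and its fields are my own invention): instead of mutating an object, you build a modified copy, so another thread can never catch the original in a half-updated state.

    // "Changing" the balance produces a brand-new Account; the original never changes,
    // so no lock is needed to protect readers on other threads.
    case class Account(owner: String, balance: BigDecimal)

    object ImmutableDemo {
      def main(args: Array[String]): Unit = {
        val before = Account("Anna", BigDecimal(100))
        val after  = before.copy(balance = before.balance + 50)  // new object; `before` is untouched
        println(before)  // Account(Anna,100)
        println(after)   // Account(Anna,150)
      }
    }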

A consequence of this, however, is that some really simple operations in C# become excruciating in Scala. Want to create a list of things that change state when they encounter other things? Oh, no, no, no! Changing state is bad! Your two-line snippet that flips a bit on some objects in a list should instead be 12 lines of code (16 if it's readable) that create two copies of every object in your list through a function that operates on the list. I mean, sure, that's one way to do it, I suppose...but it seems more like an interview question with an arbitrary design constraint than a way to solve a real problem.
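
For the curious, here's a sketch of what the copy-instead-of-mutate approach looks like (Widget and its flagging rule are invented for illustration): rather than walking the list and flipping a field in place, you map the list through a function that returns modified copies.

    // No in-place mutation: map builds a new list containing new copies of the
    // widgets whose flag needed flipping, and the original widgets otherwise.
    case class Widget(id: Int, flagged: Boolean = false)

    object FlagDemo {
      def main(args: Array[String]): Unit = {
        val widgets = List(Widget(1), Widget(2), Widget(3))

        val flagged = widgets.map { w =>
          if (w.id % 2 == 0) w.copy(flagged = true) else w
        }

        println(flagged)  // List(Widget(1,false), Widget(2,true), Widget(3,false))
      }
    }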

Second, this same characteristic makes the language very, very scalable, right out of the box. That is actually the point: "Scala" is a portmanteau of "scalable" and "language," after all. That said, C# is scalable enough to run Xbox, NBC.com, Office Online, and dozens of other sites with millions of concurrent users, so Scala doesn't exactly have a monopoly on this.

Not to mention, Scala gives frustrated Haskell programmers a way to show off their functional-programming and code-golf skills in a commercial environment. Note that this makes life a lot harder for people who didn't come from an academic background, because the code those ex-Haskellers write in Scala is just as incomprehensible to newcomers as Haskell is to everyone else.

All of this is fine, but I have developed commercial software for 25 years, and rarely have I encountered a mainstream language as ill-suited as this one to the task of producing a working product in a reasonable time frame. The problem I want to solve, as a grizzled veteran of this industry, is how to do my job effectively, and Scala makes that much more difficult than any language I have ever dealt with. (There are a number of other factors in my current role making it difficult to do my job, but that is not the point of this post.)

Maybe I'm just old. Maybe I have thought about software in terms of objects with behavior and data for so long that thinking about it in terms of functions that operate over data just doesn't penetrate. Maybe I've been doing this job long enough to see functional programming as the pendulum swinging back to the 1980s and early 1990s, when the implementationists ruled the day, and not wanting to go back to those dark ages. (That's another post, soon.)

This cri de coeur aside, I'll continue learning Scala, and continue doing my job. I just really, really wish it had fewer barriers to success.

David Graeber on Bullshit Jobs

I've just started reading anthropologist David Graeber's book Bullshit Jobs. It's hilarious and depressing at the same time. For a good summary, I would point you to Graeber's own essay "On the Phenomenon of Bullshit Jobs," which ran in Strike back in 2013:

A recent report comparing employment in the US between 1910 and 2000 gives us a clear picture (and I note, one pretty much exactly echoed in the UK). Over the course of the last century, the number of workers employed as domestic servants, in industry, and in the farm sector has collapsed dramatically. At the same time, ‘professional, managerial, clerical, sales, and service workers’ tripled, growing ‘from one-quarter to three-quarters of total employment.’ In other words, productive jobs have, just as predicted, been largely automated away (even if you count industrial workers globally, including the toiling masses in India and China, such workers are still not nearly so large a percentage of the world population as they used to be.)

But rather than allowing a massive reduction of working hours to free the world's population to pursue their own projects, pleasures, visions, and ideas, we have seen the ballooning of not even so much of the ‘service’ sector as of the administrative sector, up to and including the creation of whole new industries like financial services or telemarketing, or the unprecedented expansion of sectors like corporate law, academic and health administration, human resources, and public relations. And these numbers do not even reflect on all those people whose job is to provide administrative, technical, or security support for these industries, or for that matter the whole host of ancillary industries (dog-washers, all-night pizza delivery) that only exist because everyone else is spending so much of their time working in all the other ones.

These are what I propose to call ‘bullshit jobs’.

It's as if someone were out there making up pointless jobs just for the sake of keeping us all working. And here, precisely, lies the mystery. In capitalism, this is precisely what is not supposed to happen. Sure, in the old inefficient socialist states like the Soviet Union, where employment was considered both a right and a sacred duty, the system made up as many jobs as they had to (this is why in Soviet department stores it took three clerks to sell a piece of meat). But, of course, this is the sort of very problem market competition is supposed to fix. According to economic theory, at least, the last thing a profit-seeking firm is going to do is shell out money to workers they don't really need to employ. Still, somehow, it happens.

The book expands on the essay's themes and adds scholarship, so it's even more depressing than the original essay. But he suggests an alternative: public policies that redistribute wealth back to the people who created it and actually free up our time from these bullshit jobs.

Short distance office move

My team have moved to a new space we've leased on a different floor of Chicago's Aon Center. This morning, this was my view:

And now, one floor lower and facing the opposite direction, this is my view:

I actually prefer the south view, but only marginally. In fact, I'll probably keep taking photos of the south view. But neither view sucks.

Critics of the Web—30 years ago

Alexis Madrigal takes a look at criticisms of the World Wide Web from when it was new:

Thirty years ago this week, the British scientist Tim Berners-Lee invented the World Wide Web at CERN, the European scientific-research center. Suffice it to say, the idea took off. The web made it easy for everyday people to create and link together pages on what was then a small network. The programming language was simple, and publishing was as painless as uploading something to a server with a few tags in it.

Just a few years after the internet’s creation, a vociferous set of critics—most notably in Resisting the Virtual Life, a 1995 anthology published by City Lights Books—rose to challenge the ideas that underlay the technology, as previous groups had done with other, earlier technologies.

Maybe as a major technological movement begins to accelerate—but before its language, corporate power, and political economics begin to warp reality—a brief moment occurs when critics see the full and awful potential of whatever’s coming into the world. No, the new technology will not bring better living (at least not only that). There will be losers. Oppression will worm its way into even the most seemingly liberating spaces. The noncommercial will become hooked to a vast profit machine. People of color will be discriminated against in new ways. Women will have new labors on top of the old ones. The horror-show recombination of old systems and cultures with new technological surfaces and innards is visible, like the half-destroyed robot face of Arnold Schwarzenegger in Terminator 2.

Then, if money and people really start to pour into the technology, the resistance will be swept away, left dusty and coughing as what gets called progress rushes on.

The whole piece is worth a read.

Two on data security

First, Bruce Schneier takes a look at Facebook's privacy shift:

There is ample reason to question Zuckerberg's pronouncement: The company has made -- and broken -- many privacy promises over the years. And if you read his 3,000-word post carefully, Zuckerberg says nothing about changing Facebook's surveillance capitalism business model. All the post discusses is making private chats more central to the company, which seems to be a play for increased market dominance and to counter the Chinese company WeChat.

We don't expect Facebook to abandon its advertising business model, relent in its push for monopolistic dominance, or fundamentally alter its social networking platforms. But the company can give users important privacy protections and controls without abandoning surveillance capitalism. While some of these changes will reduce profits in the short term, we hope Facebook's leadership realizes that they are in the best long-term interest of the company.

Facebook talks about community and bringing people together. These are admirable goals, and there's plenty of value (and profit) in having a sustainable platform for connecting people. But as long as the most important measure of success is short-term profit, doing things that help strengthen communities will fall by the wayside. Surveillance, which allows individually targeted advertising, will be prioritized over user privacy. Outrage, which drives engagement, will be prioritized over feelings of belonging. And corporate secrecy, which allows Facebook to evade both regulators and its users, will be prioritized over societal oversight. If Facebook now truly believes that these latter options are critical to its long-term success as a company, we welcome the changes that are forthcoming.

And Cory Doctorow describes a critical flaw in Switzerland's e-voting system:

[E]-voting is a terrible idea and the general consensus among security experts who don't work for e-voting vendors is that it shouldn't be attempted, but if you put out an RFP for magic beans, someone will always show up to sell you magic beans, whether or not magic beans exist.

The belief that companies can be trusted with this power [to fix security defects while preventing people from disclosing them] defies all logic, but it persists. Someone found Swiss Post's embrace of the idea too odious to bear, and they leaked the source code that Swiss Post had shared under its nondisclosure terms, and then an international team of some of the world's top security experts (including some of our favorites, like Matthew Green) set about analyzing that code, and (as every security expert who doesn't work for an e-voting company has predicted since the beginning of time), they found an incredibly powerful bug that would allow a single untrusted party at Swiss Post to undetectably alter the election results.

You might be thinking, "Well, what is the big deal? If you don't trust the people administering an election, you can't trust the election's outcome, right?" Not really: we design election systems so that multiple, uncoordinated people all act as checks and balances on each other. To suborn a well-run election takes massive coordination at many polling- and counting-places, as well as independent scrutineers from different political parties, as well as outside observers, etc.

And even other insecure e-voting systems like the ones in the USA are not this bad: they're decentralized, and would-be vote-riggers would have to compromise many systems, all around the nation, in each poll that they wanted to alter. But Swiss Post's defect allows a single party to alter all the polling data, and subvert all the audit systems. As Matthew Green told Motherboard: "I don’t think this was deliberate. However, if I set out to design a backdoor that allowed someone to compromise the election, it would look exactly like this."

Switzerland is going ahead with the election anyway, because that's what people do when they're called out on stupidity.

Weekend reading list

Just a few things I'm reading that you also might want to read:

And finally, it's getting close to April and the Blogging A-to-Z Challenge. Stay tuned.

The last moments of winter

Today actually had a lot of news, not all of which I've read yet:

And now, good night to February.

Lunchtime reading

I had these lined up to read at lunchtime:

Meanwhile, for only the second time in four weeks, we can see sun outside the office windows:

Messing with the wrong guy

A telephone scam artist is going to prison after picking precisely the wrong victim:

Keniel Thomas, 29, from Jamaica, pleaded guilty in October to interstate communication with the intent to extort, federal authorities said.

He was sentenced to 71 months in prison last week by U.S. District Court Judge Beryl A. Howell in Washington, D.C., and will be deported after he has served his term, officials said.

Thomas made his first call to [William] Webster, 94, on June 9, 2014, identifying himself as David Morgan. He said that he was the head of the Mega Millions lottery and that Webster was the winner of $15.5 million and a 2014 Mercedes Benz, according to court documents.

Little did Thomas know that he was targeting the man who had served as director of the FBI and then the CIA under Presidents Jimmy Carter and Ronald Reagan.

Usually Webster just ignores these idiots, but apparently Thomas behaved particularly egregiously, even threatening Webster's wife. So basically Thomas will spend almost 6 years in prison because he's a stupid schmuck.

Still, it's nice to send one of those bastards to jail.

Olé, olé olé olé!

Oh, I love these stories. On today's Daily WTF, editor Remy Porter describes the world I grew up in, where dates were dates and 30 December 1899 ruled them all:

If you wanted to set a landmark, you could pick any date, but a nice round number seems reasonable. Let's say, for example, January 1st, 1900. From there, it's easy to just add and subtract numbers of days to produce new dates. Oh, but you do have to think about leap years. Leap years are more complicated- a year is a leap year if it's divisible by four, but not if it's divisible by 100, unless it's also divisible by 400. That's a lot of math to do if you're trying to fit a thousand rows in a spreadsheet on a computer with less horsepower than your average 2019 thermostat.

So you cheat. Checking if a number is divisible by four doesn't require a modulus operation—you can check that with a bitmask, which is super fast. Unfortunately, it means your code is wrong, because you think 1900 is a leap year. Now all your dates after February 28th are off-by-one. Then again, you're the one counting. Speaking of being the one counting, while arrays might start at zero, normal humans start counting at one, so January 1st should be 1, which makes December 31st, 1899 your "zero" date.

Our macro language is off-by-one for the first few months of 1900, but that discrepancy is acceptable, and no one at Microsoft, including Bill Gates who signed off on it, cares.

The Basic-derived macro language is successful enough inside of Excel that it grows up to be Visual Basic. It is "the" Microsoft language, and when they start extending it with features like COM for handling library linking and cross-process communication, it lays the model. Which means when they're figuring out how to do dates in COM… they use the Visual Basic date model. And COM was the whole banana, as far as Windows was concerned- everything on Windows touched COM or its successors in some fashion. It wasn't until .NET that the rule of December 30th, 1899 was finally broken, but it still crops up in Office products and SQL Server from time to time.

.NET didn't exactly clean this up, either. DateTime and DateTimeOffset values count from 1 January 0001, a date that exists only in the proleptic Gregorian calendar. OLE Automation dates, still reachable through DateTime.ToOADate, count from 30 December 1899. And SQL Server's legacy datetime type can't represent anything before 1753, the first full year after Britain switched to the Gregorian calendar.
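
To see Porter's shortcut in action, here's a minimal sketch (isLeapYear and lotusLeap are my names, written in Scala since that's what I live in these days): the first function implements the real Gregorian rule, the second the cheap divisible-by-four bitmask, which wrongly calls 1900 a leap year and pushes every spreadsheet date after 28 February 1900 off by one.

    object LeapYearBug {
      // The real Gregorian rule: divisible by 4, except centuries, unless divisible by 400.
      def isLeapYear(year: Int): Boolean =
        (year % 4 == 0 && year % 100 != 0) || year % 400 == 0

      // The cheap shortcut: test only the low two bits. Fast, but wrong for 1900, 2100, ...
      def lotusLeap(year: Int): Boolean =
        (year & 3) == 0

      def main(args: Array[String]): Unit = {
        println(isLeapYear(1900))  // false: 1900 was not a leap year
        println(lotusLeap(1900))   // true: hence the phantom 29 February 1900
      }
    }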

The bottom line: dates are hard.