Tuesday, April 21, 2020

2000 and 2020

By Christopher Roach

There was real panic in the air 20 years ago. The “Y2K bug” was a deceptively simple flaw in older mainframes and software: to save scarce storage, they coded the year portion of dates with only two digits. That shortcut would face a reckoning at the turn of the millennium, when 2000 would be rendered as “00” and become indistinguishable from 1900.
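To make the flaw concrete, here is a minimal sketch, in Python purely for illustration (the affected systems were typically COBOL on mainframes), of the kind of date arithmetic that breaks when years are stored with two digits:

```python
# Illustrative sketch of the Y2K flaw: years stored as two digits.
# Any arithmetic that crosses the century boundary goes wrong.

def age_in_years(birth_yy: int, current_yy: int) -> int:
    """Compute an age from two-digit years, as much legacy code did."""
    return current_yy - birth_yy

# In 1999 this works: someone born in (19)25 is 74.
print(age_in_years(25, 99))   # 74

# In 2000 the same code sees "00" and the age goes negative.
print(age_in_years(25, 0))    # -25 -- nonsense
```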

While it seemed at first that maybe your watch or a spreadsheet would be confused about what year it was, we were told that power plants would fail and airplanes might fall from the sky. This, in turn, would lead to panic and disorder. A new Dark Age was reportedly upon us, and the smart set took it very seriously.

The Red Cross counseled us to store extra food and get cash from the bank. Preppers were ready. A lot of people stayed away from what should have been an epic New Year’s celebration.

Some $40 billion was spent in the United States revising computer code and making other preparations, back when that was a lot of money. And then . . .

Almost nothing happened. A few clocks and computer printouts were flawed. But otherwise, the grid stayed on, the planes still flew, and the mass hysteria was soon forgotten.

Decisions Under Uncertainty

It’s hard to prepare for the unknown. The problem is akin to the search for the lost city of El Dorado. Conquistadors spent years on many expeditions looking for it. After all, it was supposedly a city of gold. Why not see what’s in the next valley? And the next one after that? There was no easy way to know when to turn around.

The same is true in preparing for large-scale disasters. Since the degree of risk is unknown and the magnitude of harm potentially infinite, ordinary metrics—such as those used by insurance companies or engineers—cannot easily say when enough is enough.

Superficially, when civilization-destroying risks are involved, it seems logical to devote every spare moment and every spare dollar to averting them. In the shadow of apocalypse, time and money spent on anything else seems frivolous. One can persuade himself that he is like Noah, building his ark while there is still time.

But we know this is not how life works. The vast majority of worst-case scenarios never materialize. After the 2008 recession, it appeared the economy would never recover. We were said to be in the midst of a “secular,” meaning long-term, decline. People sold what was left of their investments. Obama told us, “those manufacturing jobs aren’t coming back.”

But they did come back. And so did the stock market. The gold bugs missed out on the biggest bull market run of all time. It was hard to see how this would happen in 2008, but it did happen. Things will get better from here, too.

There is probably no absolutely right answer to the problem that Donald Rumsfeld once labeled the “unknown unknowns.” The exact weighing of unknown risks is impossible. Ordinarily, risk assessment multiplies the probability of a harm by its magnitude to yield an expected loss. But when the probability is completely unknown and unknowable, that arithmetic breaks down.
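As a rough illustration, with entirely invented numbers, the sketch below (again Python, illustration only) shows why the expected-loss product stops being informative when the probability is a guess and the magnitude is effectively unbounded:

```python
# Ordinary risk assessment: expected loss = probability x magnitude.
# All figures here are invented for illustration.

def expected_loss(probability: float, magnitude: float) -> float:
    return probability * magnitude

# An insurer's problem: a well-estimated probability, a bounded loss.
print(expected_loss(0.01, 250_000))   # 2500.0 -- a priceable premium

# An apocalypse: the probability is a guess, the loss is "everything."
# Any nonzero guess times a near-infinite magnitude yields the same
# unbounded answer, so the method gives no guidance on when to stop.
for p_guess in (1e-9, 1e-6, 1e-3):
    print(expected_loss(p_guess, float("inf")))   # inf, inf, inf
```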

As with the “case fatality rate” for the coronavirus, where both the numerator (deaths attributed to the disease) and the denominator (the true number of infections) are unknown, the calculation of unknown risks can lead to strange, counterproductive conclusions. This happens especially when the magnitude of harm is nearly infinite, as with pandemics, nuclear wars, and, 20 years ago, the Y2K phenomenon.
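A small sketch, once more with invented numbers, shows how slippery that rate is: the same outbreak yields wildly different fatality rates depending on how many deaths get attributed and how many infections ever get counted.

```python
# Case fatality rate = deaths / confirmed cases, but during an
# outbreak both counts are uncertain. Numbers below are invented.

def cfr(deaths: int, cases: int) -> float:
    return deaths / cases

confirmed_deaths, confirmed_cases = 1_000, 20_000

# Naive estimate from confirmed figures:
print(f"{cfr(confirmed_deaths, confirmed_cases):.2%}")        # 5.00%

# If half the deaths were missed, the numerator doubles; if 90%
# of infections were never tested, the denominator is 10x larger.
print(f"{cfr(confirmed_deaths * 2, confirmed_cases):.2%}")    # 10.00%
print(f"{cfr(confirmed_deaths, confirmed_cases * 10):.2%}")   # 0.50%
```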

Consider the Source

There are probably a few useful things to keep in mind, though, since the prophets of doom and gloom are not all equally credible.

First, watch what people do, not what they say.

Economists call this revealed preferences. We’re constantly told about the risks of global warming and rising sea levels. But then we see the Obamas buying a beachfront mansion on Martha’s Vineyard, which suggests they don’t think the problem is so serious or so imminent that a $10 million-plus house by the water is a bad bet. With coronavirus, we hear that we need to shut everything down, especially churches, even as New York City keeps the subways running and other states keep their liquor stores open as “essential industries.”

Second, as Harvey Mackay said, “Beware of the naked guy offering you his shirt.”

It’s easy to make predictions. But do the people who make them put their money where their mouths are? Consider all the investment advice out there. Are the people offering it actually investing that way? Or might they, like Goldman Sachs in 2007, be selling you one thing while betting the opposite with their own money? Or perhaps, as with “Mad Money” host Jim Cramer, might their predictions just be consistently wrong?

People may make flawed predictions for other reasons. A government bureaucrat (or someone infinitely wealthy like Bill Gates) likely does not appreciate the fragility of private-sector employment, and thus will encourage flawed decisions, like shutting down the economy, because he faces no personal, tangible risk from this advice. After all, the bureaucrat would have to murder someone to lose his job . . . and might find even that no impediment to collecting a salary. And Gates could probably buy himself a new planet if things went really sideways.

Similarly, partisanship seems to matter. Democrats mocked Trump for his early efforts on the virus, such as the shutdown of travel from China, and then pushed him to shut down our country and keep it shut down. These are not serious or even consistent positions, but just knee-jerk reactions. The same people who were mocking masks a month ago are now mocking people for not wearing them today. Do any of these people have the country’s interest at heart? It does not seem very likely.

Finally, a person can have the wrong skin in the game. In other words, he can benefit directly from you believing something.

A guy selling pandemic supplies wants you to think it’s a major risk. The guy selling an investment product cannot help but be influenced by his anticipated commission, even when the objectively better choice is an index fund. In both cases, the judgment of such people is clouded by their direct financial stake in you believing a particular thing.

In the Y2K crisis, this group included all of the software programmers who would “debug” and fortify systems against the incredible threat of the Y2K glitch. It included all of the Cassandras, like James Kunstler, warning people to sell everything and move off the grid. Today, it includes all of the companies and consultants hawking tests, vaccines, ventilators, masks, sanitizer, and anything else from which they can expect to profit. This is not to say these things don’t work or have no value.

Always ask yourself who is telling you something and how he may stand to gain.

The Long View

Studying history, even recent history, takes away a lot of life’s surprises. We’ve been warned repeatedly about the “end of the world” from nuclear war, global cooling, global warming, natural disasters, Y2K, EMP events, economic crises, the Zika virus, Ebola, and Trump’s election.

There is also an egocentric dimension to such predictions. The thought that we are living through something historic, such as the end of the world, gives life some drama. It relieves one of the ordinary duties of self-care, as well as one’s duties to others. It also blinds us to the other forms of decline—cultural, moral, and spiritual—that have proceeded through good times and bad.

Obviously, it’s a good idea to be prepared for contingencies—especially predictable and recurring ones like hurricanes and economic downturns. But, for the most part, life keeps going on.

The fact that we still have computers and civilization to lose would have come as a surprise to the more exuberant prophets of Y2K doom. But we do, and life will go on after this, too.


Read more at American Greatness.

1 comment:

  1. False equivalency. The $40B mitigated the Y2K problem and made it possible for people who know nothing about programming - and COBOL and mainframes in particular - to mock the experts who recognized and publicized the problem. I know because I wrote some of that COBOL code in the '70s and '80s; by 2000, I had moved on to other programming languages (and resisted head-hunters who wanted to recruit me for mitigation work). Dates were generally stored as three separate 2-digit numbers. Any computation that involved addition or subtraction on a year had the potential to produce invalid data or crash the program with unpredictable results. Here's a trivial example: Your credit card automatically renewed for 3 years on November 17, (19)99. Adding 3 to the old expiration year to get the new expiration date would either crash the program because you can't fit 102 into the space allocated for a 2-digit number or it would simply chop off the 1 and make your card expire on November 17, (19)02. Either way, you no longer have a valid credit card.
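    A rough Python rendering of the commenter’s example (the original systems were COBOL, so this is only an analogy) shows both failure modes he describes:

    ```python
    # Sketch of the commenter's credit-card example. Legacy systems
    # stored the year in a fixed two-digit field; this mimics that
    # constraint in Python (the real code would have been COBOL).

    def renew_expiration(year_yy: int, term: int) -> int:
        new_year = year_yy + term      # 99 + 3 = 102
        if new_year > 99:
            # Failure mode 1: crash -- 102 can't fit a 2-digit field.
            # raise OverflowError("year field overflow")
            # Failure mode 2: silently keep the last two digits.
            new_year %= 100            # 102 -> 02, i.e. "(19)02"
        return new_year

    print(renew_expiration(99, 3))     # 2 -- the card now "expired" in 1902
    ```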
