Not Imagining the Unimaginable Happening

Sometimes the unimaginable happens. Back in the sixties, when it was long hair and talking about the revolution and about not trusting anyone over thirty, the Beatles released that ditty about finally turning sixty-four – and it didn’t seem to be ironic at all. We only pretended it was – we were pretentious college students after all. But it really was a charming song about the joy of growing old together, down all the long years, and things turning out rather nicely. There were those grandchildren – Vera, Chuck and Dave. And this was puzzling. But it was one of the first songs Paul McCartney wrote, when he was sixteen – and the guys used it in the early days as something they could play when the amplifiers broke down or the electricity went off. It seems there was not a whole lot of Deep Inner Meaning to any of it – nothing to see here – move along.

But somehow the song did take off – it got a reasonable amount of play on the radio, probably programmed to calm worried parents. Those shaggy kids from Liverpool were really nice young men and not threatening at all. They had a lot of respect for adults and the adult world. They actually wanted to be happily ordinary. That was a relief. Blood pressure dropped all across America, until they heard Mick Jagger.

But that’s another story. And of course none of us at the time really imagined we’d ever be sixty-four. We were cool.

But the unimaginable happened. Damn, it’s coming up, for real, this June. It wasn’t just a silly song after all. And Paul McCartney was right. Things did turn out rather nicely, or marginally nicely, with the usual caveats. The years roll by. You make the best of them, as best you can. And Lennon and Harrison didn’t even make it this far, so there’s no point in complaining.

But now you do find yourself saying the oddest things – that Natalie Portman girl seems like a nice kid – but in June she’ll turn thirty. That used to be really old, the age at which one could no longer be trusted. But she is just a kid. It’s all in your perspective – and her real name is Natalie Hershlag by the way. Of course she does play the waif – that’s her specialty – so you can see how people think of her as just a kid. You don’t have to be sixty-four. And she was in town six weeks ago to collect her Oscar at the Kodak Theater just down the street, for her starring role in that Black Swan movie – a strange bit of business. No doubt the lesbian nude love scene was amazing and tasteful and artistic, but the world really didn’t need another movie about a driven insecure brilliant young ballerina going mad. That gets old. You remember the 1948 movie – those red shoes killed the sweet young thing. But the Portman kid seems to have done a fine job with the old trope.

Still, those two words – Black Swan – conjure up other thoughts – old people thoughts. There was that seventeenth-century philosophical thought experiment. In Europe all anyone had ever seen were white swans – in fact, the statement “all swans are white” had long served as the standard example of a scientific truth. You could start with that and build outward. And then in 1697 explorers found Cygnus atratus in Australia – the black swan. Oops. And that was the central metaphor in the 2007 book The Black Swan: The Impact of the Highly Improbable – all the rage for a few years – Nassim Nicholas Taleb pointing out that experts who make their living from economic forecasting are generally wrong, and for good reason. He’s a financial trader and wonders about the statistical concepts that underlie models of prediction – he suggests they might be useless. It’s the black swan that changes everything, and that leads to Black Swan Theory – it’s the odd thing, the outlier, what you didn’t expect, and couldn’t expect, that matters. It’s the total surprise that drives history. Everyone does linear projection from the expected. You can make a good living at that. But there’s always a black swan, a black swan you said did not exist.

Or you can put it precisely:

The disproportionate role of high-impact, hard to predict, and rare events that are beyond the realm of normal expectations in history, science, finance and technology –

The non-computability of the probability of the consequential rare events using scientific methods (owing to the very nature of small probabilities) –

The psychological biases that make people individually and collectively blind to uncertainty and unaware of the massive role of the rare event in historical affairs –

The very rare event has a massive role in most everything, in fact, the dominant role – it is the black swan that matters, not all the white ones. Taleb cites the rise of the Internet, the personal computer, World War I, and the September 11 attacks as examples of Black Swan Events. No one saw any of those coming. They couldn’t. Drawing general conclusions from specific observations failed there. And the unstructured randomness found in life does not resemble the structured randomness found in games – so game theory fails. Decision theory, based on a fixed universe or a model of possible outcomes, ignores and minimizes the effect of events that are “outside model” – so forget that. Thinking in terms of fractals, or power law or scalable distributions might help – but many events simply are without precedent, undercutting the basis of those types of reasoning too.
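Taleb’s point about thin-tailed models can be roughly illustrated in code – a sketch of my own, not anything from his book: under a normal (thin-tailed) distribution, extreme events are vanishingly rare, while under a power-law (fat-tailed) one they show up constantly.

```python
import random

# A rough sketch (mine, not Taleb's) of thin tails versus fat tails:
# count "extreme" outcomes under a normal distribution and under a
# power-law (Pareto) distribution.
random.seed(42)
N = 1_000_000

# Thin-tailed model: standard normal, mean 0, standard deviation 1.
normal_extremes = sum(1 for _ in range(N) if random.gauss(0, 1) > 6)

# Fat-tailed model: Pareto with shape alpha = 2 (minimum value 1).
alpha = 2.0
pareto_extremes = sum(1 for _ in range(N) if random.paretovariate(alpha) > 6)

print("normal draws beyond 6:", normal_extremes)   # almost always 0
print("pareto draws beyond 6:", pareto_extremes)   # tens of thousands
```

Under the thin-tailed model a six-sigma event has a probability around one in a billion; under this fat-tailed model it’s about one in thirty-six. The gap between those two counts is, in miniature, the “outside model” problem.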

Needless to say, Nassim Nicholas Taleb upset a lot of people, but comforted others. Maybe you cannot know what you really, really need to know – what just happened and what is likely to happen next – but at least you can look at all those experts on cable news saying what just happened and what will obviously happen next, and decide it might be time to walk the dog. Taleb argues for the use of counterfactual reasoning when considering risk. Who knows what that means? Maybe it means walking the dog.

And needless to say all this would make one lousy Natalie Portman movie. But the highly improbable happens – you turn around and you’re suddenly sixty-four.

And see Chris Anderson:

Four hundred years ago, Francis Bacon warned that our minds are wired to deceive us. “Beware the fallacies into which undisciplined thinkers most easily fall – they are the real distorting prisms of human nature.” Chief among them: “Assuming more order than exists in chaotic nature.” Now consider the typical stock market report: “Today investors bid shares down out of concern over Iranian oil production.” Sigh. We’re still doing it.

Our brains are wired for narrative, not statistical uncertainty. And so we tell ourselves simple stories to explain complex things we don’t – and, most importantly, can’t – know. The truth is that we have no idea why stock markets go up or down on any given day, and whatever reason we give is sure to be grossly simplified, if not flat-out wrong.

We place too much weight on the odds that past events will repeat. That’s emotionally satisfying, and keeps David Gergen and Frank Luntz gainfully employed, but that’s about all it is – most of the really big events in our world are rare and unpredictable. Black swans are the problem.

And Joseph E. Stiglitz points that out in this item in Slate – we may learn nothing from the Wall Street crash or Japan’s nuclear disaster about how to avoid future catastrophic risks:

The consequences of the Japanese earthquake – especially the ongoing crisis at the Fukushima nuclear power plant – resonate grimly for observers of the American financial crash that precipitated the Great Recession. Both events provide stark lessons about risks and about how badly markets and societies can manage them.

Of course, in one sense, there is no comparison between the tragedy of the earthquake – which has left more than 25,000 people dead or missing – and the financial crisis, to which no such acute physical suffering can be attributed. But when it comes to the nuclear meltdown at Fukushima, there is a common theme in the two events.

Experts in both the nuclear and finance industries assured us that new technology had all but eliminated the risk of catastrophe.

No, Stiglitz is not a Luddite, opposed to new technology. He just knows something failed here, as events proved these folks wrong:

Not only did the risks exist, but their consequences were so enormous that they easily erased all the supposed benefits of the systems that industry leaders promoted.

And this was pretty dramatic:

Before the Great Recession, America’s economic gurus – from the head of the Federal Reserve to the titans of finance – boasted that we had learned to master risk. “Innovative” financial instruments such as derivatives and credit-default swaps enabled the distribution of risk throughout the economy. We now know that they deluded not only the rest of society but themselves.

These wizards of finance, it turned out, didn’t understand the intricacies of risk, let alone the dangers posed by “fat-tail distributions” – a statistical term for rare events with huge consequences, sometimes called “black swans.” Events that were supposed to happen once in a century – or even once in the lifetime of the universe – seemed to happen every 10 years. Worse, not only was the frequency of these events vastly underestimated; so was the astronomical damage they would cause – something like the meltdowns that keep dogging the nuclear industry.

Think about that. You’ve bundled crappy loans – that could never be repaid – into those innovative financial instruments, and sold them to folks who split them apart and bundled them in different ways and sold the new packages, while others insured the packages, then bundled the insurance instruments in different packages and sold them too, and those six-times-removed packages were insured themselves by others, in further packages. That’s a lot of innovative financial instruments, but the risk was distributed like a fine dust around the world, and no one would really get dirty. It couldn’t all go bad, and even if it did, no one was holding actual big scary risk, just a tiny share of it, a bit of dust.
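That fine-dust logic can be sketched numerically – a toy simulation of my own, not anything from the article: spreading loans into pools really does work if defaults are independent, and fails badly once a common economic factor makes them all go bad together.

```python
import random

# A toy simulation (my illustration, not the article's) of why bundling
# loans only spreads risk when defaults are independent of each other.
random.seed(0)
TRIALS = 10_000
N_LOANS = 100
ATTACH = 0.20  # the "safe" senior piece only loses if >20% of loans default

def breach_rate(correlated):
    """Fraction of simulated years in which the senior piece takes a loss."""
    breaches = 0
    for _ in range(TRIALS):
        if correlated:
            # Common factor: one year in ten is bad, and in a bad year every
            # loan's default chance jumps to 40%. The average default rate
            # stays about 5%: 0.1 * 0.40 + 0.9 * 0.011 ~= 0.05.
            p = 0.40 if random.random() < 0.10 else 0.011
        else:
            p = 0.05  # independent 5% default chance per loan
        defaults = sum(1 for _ in range(N_LOANS) if random.random() < p)
        if defaults / N_LOANS > ATTACH:
            breaches += 1
    return breaches / TRIALS

print("independent defaults:", breach_rate(False))  # essentially never
print("correlated defaults: ", breach_rate(True))   # roughly one year in ten
```

Same average default rate in both cases – but with a common factor the “can’t all go bad at once” assumption is exactly what goes bad.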

But it didn’t work as it should have. A black swan showed up – the one we said couldn’t exist. And because we had set things up so there was no apparent risk to anyone, we made things even worse:

Research in economics and psychology helps us understand why we do such a bad job in managing these risks. We have little empirical basis for judging rare events, so it is difficult to arrive at good estimates. In such circumstances, more than wishful thinking can come into play: We might have few incentives to think hard at all. On the contrary, when others bear the costs of mistakes, the incentives favor self-delusion. A system that socializes losses and privatizes gains is doomed to mismanage risk.

Indeed, the entire financial sector was rife with agency problems and externalities. Ratings agencies had incentives to give good ratings to the high-risk securities produced by the investment banks that were paying them. Mortgage originators bore no consequences for their irresponsibility, and even those who engaged in predatory lending or created and marketed securities that were designed to lose did so in ways that insulated them from civil and criminal prosecution.

It’s one thing to honestly not see the risk in something. It’s quite another to set up a system where people lie to each other about possible risk, and to build it so that, as a last resort, if everything goes south, the taxpayers eat the losses – so there is no risk at all, really. Who wouldn’t build a house of cards? You know you won’t get hurt when it collapses.

And Stiglitz carries this forward:

Unfortunately, some of the really big risks that we face today are most probably not even rare events. The good news is that such risks can be controlled at little or no cost. The bad news is that doing so faces strong political opposition, because there are people who profit from the status quo.

And he gets specific:

We have seen two of the big risks in recent years, but have done little to bring them under control. By some accounts, the way the last crisis was managed may have increased the risk of a future financial meltdown.

Too-big-to-fail banks, and the markets in which they participate, now know that they can expect to be bailed out if they get into trouble. As a result of this “moral hazard,” these banks can borrow on favorable terms, giving them a competitive advantage based not on superior performance but on political strength. While some of the excesses in risk-taking have been curbed, predatory lending and unregulated trading in obscure over-the-counter derivatives continue. Incentive structures that encourage excess risk-taking remain virtually unchanged.

And there’s the parallel:

So, too, while Germany has shut down its older nuclear reactors, in the United States and elsewhere, even plants that have the same flawed design as Fukushima continue to operate. The nuclear industry’s very existence is dependent on hidden public subsidies – costs borne by society in the event of nuclear disaster as well as the costs of the still-unmanaged disposal of nuclear waste. So much for unfettered capitalism!

And there’s another black swan:

For the planet, there is one more risk, which, like the other two, is almost a certainty: global warming and climate change. If there were other planets to which we could move at low cost in the event of the almost certain outcome predicted by scientists, one could argue that this would be a risk worth taking. But there aren’t, so it isn’t.

The costs of reducing emissions pale in comparison to the possible risks the world faces. And that is true even if we rule out the nuclear option (the costs of which were always underestimated). To be sure, coal and oil companies would suffer, and big polluting countries – such as the United States – would obviously pay a higher price than those with a less profligate lifestyle.

So it comes down to this:

In the end, those gambling in Las Vegas lose more than they gain. As a society, we are gambling – with our big banks, with our nuclear power facilities, with our planet. As in Las Vegas, the lucky few – the bankers who put our economy at risk and the owners of energy companies who put our planet at risk – may walk off with a mint. But on average and almost certainly, we as a society, like all gamblers, will lose.

Other than that, have a nice day.

But you know the deal. Sometimes the unimaginable happens. But we don’t have to make it worse. And turning an unlikely sixty-four isn’t that bad, maybe.

About Alan

The editor is a former systems manager for a large California-based HMO, and a former senior systems manager for Northrop, Hughes-Raytheon, Computer Sciences Corporation, Perot Systems and other such organizations. One position was managing the financial and payroll systems for a large hospital chain. And somewhere in there was a two-year stint in Canada running the systems shop at a General Motors locomotive factory - in London, Ontario. That explains Canadian matters scattered through these pages. Otherwise, think large-scale HR, payroll, financial and manufacturing systems. A résumé is available if you wish. The editor has a graduate degree in Eighteenth-Century British Literature from Duke University where he was a National Woodrow Wilson Fellow, and taught English and music in upstate New York in the seventies, and then in the early eighties moved to California and left teaching. The editor currently resides in Hollywood California, a block north of the Sunset Strip.
This entry was posted in Black Swan Event Theory, Economic Meltdown, Epistemology, Political Epistemology. Bookmark the permalink.
