Live and learn. Teaching was fine, but the pay was awful and teaching wasn’t the half of it. There was careful preparation and planning, which for English teachers involves far more than daily lesson plans. Novels and poems and plays are cultural milestones, and also packed with emotional and intellectual surprises – and there was the endless task of grading papers, rewarding clarity and insight and letting those who could offer neither down easy, so they might manage to find a bit of either, one day. The job was to provide encouragement, any possible way – and the work was draining, but it was clear-cut, and then it was time to leave. Preparing adolescents for the world, but if you’re fully committed to that task, never leaving their world, guarantees your own perpetual adolescence. That’s no way to live. It was time to grow up.
There were other lessons to learn. The business world was different, a world of personal responsibility – contribute to the bottom line, even if indirectly, or you’re gone. And your contribution can be quantified, and will be. Either you did something to increase profits, or did something to make things run more smoothly so others could increase profits, or you saved the company from some disastrous blunder that would have cost a ton of money and meant no profits at all – or you’re useless. That was a change from teaching Hamlet to a roomful of bored adolescents on a snowy afternoon. Encouragement was just a minor management tool, seldom used. That was the world of blame. If something went wrong, someone screwed up – someone was to blame. Someone is always to blame. Things don’t just happen, like in Moby-Dick or whatever.
That was fine, so the years working in Human Resources, with all the evaluation and merit-pay stuff, made sense, and then the years in Information Technology made sense – developing applications to make the business run better was a valuable activity – but then there was the move to management and then senior management. Nothing was that clear-cut in managing large-scale systems. When something failed there was no one programmer to blame – careful change-control meant every line of code was checked and rechecked every step of the way. In fact, everyone had probably been doing just what they were supposed to do, and it all went south anyway. That wasn’t very satisfying. The problems were always structural, if not conceptual – no one had carefully thought out what the damned thing was really supposed to do. Perhaps the system requirements were contradictory in the first place, or vague in a vaguely hopeful way – but everyone had bought into those. Sure, heads should roll, but whose heads? There was no one to blame, damn it.
Live and learn? That’s a hard lesson to learn. There’s always someone to blame – Neville Chamberlain and Adolf Hitler for the Second World War – except there was the Treaty of Versailles, which ended the First World War and made someone like Hitler inevitable. Totally humiliating Germany might have felt good, and everyone agreed that was the thing to do – so no one’s to blame – but that was a bad idea. If it hadn’t been Hitler it would have been someone else, and there was the Great Depression and its worldwide effect, creating structural economic reasons that there would be another war, Hitler or not. And that first war was not caused by that obscure fellow in Sarajevo who shot Archduke Ferdinand dead. The western nations and the fading Austro-Hungarian Empire and the failing Ottoman Empire were poised for war. It was a structural thing. No one person was to blame, and maybe George Bush shouldn’t be blamed for our recent useless eight-year war in Iraq. That man, part goofball and part schoolyard bully, was part of an array of larger structural issues – the American people were angry, the neoconservatives had their theories about the End of History and the proper use of American military power, the world oil markets could not be upended, the alliances in the Middle East had been set, there was Israel’s security to consider, there was our history of support for odd folks over there that complicated matters, and so on and so forth. The 9/11 attacks set it all in motion. George Bush just rode the wave, and Dick Cheney waxed his surfboard. They were opportunists. Larger forces were at play – not that they should be held blameless. Opportunists can ruin the world.
No one, however, likes to hear someone say no one is to blame when something goes terribly wrong. Someone has to have screwed up, or a small and specific group of people has to have screwed up – Wall Street bankers and hedge fund managers or Alan Greenspan or Jamie Dimon or Art Laffer. When the economy collapsed in 2008, someone had to be to blame. That’s what people said. Maybe it was that CEO who screwed up Lehman Brothers so awesomely that it went belly-up on a Monday in the middle of September 2008, taking down everything – but everything was falling down anyway. Ah, it had to be ALL those greedy Wall Street bastards, or someone – or maybe not. Maybe it was Bill Clinton, championing deregulation in his presidency, or George Bush pretty much abolishing all regulation in his – or maybe it was the bankers, or their too-big-to-fail banks, each of which was bigger than any one banker and not “one person” at all. But someone was to blame.
That brings back memories of being a senior systems manager, having to explain to the top executives that no one was to blame for the snazzy new inventory system going down – at least there was no one person to blame. You see, the problems were structural – the basic idea was fatally flawed, none of us saw it, and you guys didn’t see it, and by the way, no one did anything wrong. Needless to say, they hated that, so it was necessary to fire a few programmers, for show, to keep them happy. That was their world. Someone had to be to blame for what happened. It was time to resign. Retirement is fine.
It is, however, odd to see this playing out again in the political world. Some books scare the crap out of people. Now and then one comes along that upsets important people, or people who think they’re important, and one of those is that new book – Capital in the Twenty-First Century – which Paul Krugman calls the most important economics book of the year, and maybe of the decade. In that book, the French economist Thomas Piketty argues that we’re in a world of hurt now, because, finally, after all these years, it seems that capitalism doesn’t work all that well. We’re returning to “patrimonial capitalism” – the economy is being run by and for those of vast wealth, which is now pretty much inherited wealth, so we have a country and a world in which birth matters more than effort and talent. No one is to blame, really. It’s a structural problem, and Sean McElwee sums that up:
Piketty argues that capital will accumulate in the hands of the few when growth is slower than the rate of return on capital and dis-accumulate if not (This is the now famous “r>g” formula). As growth slows, companies can replace workers with machines (written by economists as “substitution between capital and labor”), but only if there is a high elasticity of capital to labor (higher elasticity means easier replacement). This means that the share of income going to the owners of capital will rise and the distribution of that capital will become more unequal.
Piketty does not hold to a labor theory of value, he does not believe that capitalism is founded on the exploitation of the proletariat, and he does not believe the system will inevitably collapse on its own contradictions.
In short, he’s not a damned Marxist. He sees a structural problem. Capitalism, now, as it is practiced, will generate more and more inequality, and thus more misery for all but a few, from here on out. Some, the few, will think that’s a good thing. Some, the many, will not. Piketty is not taking sides. As things get worse, the many will blame the few, and the few will blame the many, but Piketty isn’t blaming anyone. This is a structural analysis.
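The “r>g” mechanism McElwee summarizes can be sketched with a few lines of arithmetic. This is a toy illustration of the compounding logic only – my own simplification, not Piketty’s actual model. It assumes, extremely, that all capital income is reinvested, and the function name and starting numbers are invented for the example:

```python
# Toy illustration of the r > g dynamic: capital compounds at rate r
# while national income grows at rate g. When r > g, the capital-to-income
# ratio (beta) rises year after year, and with a fixed rate of return,
# capital's share of income (alpha = r * beta) rises with it.

def capital_share(r, g, beta0, years):
    """Return (capital-to-income ratio, capital share) after `years`."""
    beta = beta0
    for _ in range(years):
        # Assume all capital income is reinvested - an extreme savings
        # assumption, made only to keep the illustration simple.
        beta = beta * (1 + r) / (1 + g)
    return beta, r * beta

# Suppose capital returns 5% a year while the economy grows 1.5%.
beta, alpha = capital_share(r=0.05, g=0.015, beta0=4.0, years=30)
print(f"capital/income ratio after 30 years: {beta:.2f}")
print(f"capital share of income: {alpha:.1%}")
```

Run it and the ratio climbs steadily; set r equal to g and it sits still. That is the whole structural point – no villain in the loop, just compounding.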
McElwee notes that no one wants to hear that:
Progressives are celebrating the book – and its unexpected popularity – as an important turning point in the fight against global wealth inequality. This, of course, means that conservatives have gone completely ballistic.
Rush Limbaugh, for example, has come out guns a-blazing: “Some French socialist, Marxist, communist economist has published a book, and the left in this country is having orgasms over it,” he exclaimed during a recent broadcast.
When the right drops the C-bomb, the M-bomb and S-bomb all at once, you can be certain a book is having an impact. And “Capital” may well be the “General Theory” of the first half of the 21st century, redefining the way we think about capitalism, democracy and equality. This, of course, means that the right-wing attacks have only just begun.
McElwee points out that James Pethokoukis “went full hack” on the book with this gem in the National Review:
Thanks to Piketty, the Left is now having a “Galaxy Quest” moment. All that stuff their Marxist economics professors taught them about the “inherent contradictions” of capitalism and about history’s being on the side of the planners – all the theories that the apparent victory of market capitalism in the last decades of the 20th century seemed to invalidate – well, it’s all true after all.
No, Piketty is not calling for communist central planning, he’s just showing the data, as McElwee goes on to explain in detail – not that it matters. The guy is French. Americans always blame the French.
It’s hard to blame the French for what David Atkins lays out here:
If you’ve paid attention to the economy over the last few years, you’ve doubtless seen the charts and figures showing the decline of the American middle class in concert with the explosion of wealth for the super-rich. Wages have stagnated over the last 40 years even as productivity has increased, which is another way of saying that Americans are working harder but getting paid less. Unemployment remains stubbornly high even though corporate profits and the stock market are at or near record highs. Passive assets in the form of stocks and real estate, in other words, are doing very well. Wages for working people are not. Unfortunately for the middle class, however, the top 1 percent of incomes own almost 50 percent of asset wealth, and the top 10 percent own over 85 percent of it. When assets do well but wages don’t, the middle class suffers.
This ominous trend is particularly prominent in the United States. That shouldn’t surprise us: study after study shows that American policymakers operate almost purely on behalf of wealthy interests. Recent polling also proves that the American rich want policies that encourage the growth of asset values while lowering their own tax rates, and are especially keen on outcomes that favor themselves at the expense of the poor and middle class.
Atkins wonders why the Ninety-Nine Percent doesn’t rise up and put an end to this, but he too thinks structurally. No one is to blame, perhaps, because of how this developed:
Simply put, starting in the 1980s policymaking elites in the Western world were scared to death of oil shortages, inflationary spirals and the impact of jobs being shipped to lower-wage nations or made obsolete by increasingly powerful machines and computers. Something had to be done. Even as foreign policy became explicitly focused on securing access to oil, domestic policy became focused on quashing inflation while disguising wage stagnation. Either countries needed to move sharply to the left through increased worker protections and redistribution of incomes, or to the right by substituting an asset-based economy for the old wage-based economy. Most chose to go right – an understandable move at the time given that state Communism was still a threat to capitalist economies, but also a spent and discredited ideology.
In short, dumping a wage-based economy seemed like a good idea at the time, and Atkins cites Ronald Reagan making the case for this new economic model in a speech from 1975:
Roughly 94 percent of the people in capitalist America make their living from wage or salary. Only 6 percent are true capitalists in the sense of deriving income from ownership of the means of production… We can win the argument once and for all by simply making more of our people Capitalists.
That’s simple enough, but Atkins adds that this was easier said than done:
One of the chief ways that American and British policymakers put this vision into reality was by crippling organized labor. But while that certainly placed downward pressure on wages in the U.S. and Britain, labor was not so similarly affected in most of the rest of the developed world. Organized labor remains a powerful force throughout most of Europe, yet growing wealth inequality and a declining middle class are present trends there as well. The health of organized labor abroad has helped stem the tide, but has not managed to stop it. The less noticed but potentially more consequential way that policymakers across the industrialized world set about accomplishing this goal was to push their middle classes to invest their wealth into assets, especially stocks and real estate, and then use the levers of public policy to inflate the values of those assets in order to disguise the inevitable declines in wages. There was also a concerted effort to hide wage losses by lowering the prices of non-perishable goods – even if doing so meant domestic job losses.
That’s how Atkins said the rich stole our money and made us think they were doing us a favor, in four steps that altered the structure of the economy:
Push people away from defined-benefit pensions and into stocks and 401(k)’s. Believe it or not, there used to be a time when the Dow Jones and S&P 500 indices were little-noticed figures in the business section of the newspaper. That’s because most people’s retirements weren’t tied to the stock market. The switch from pensions to market-based 401(k)’s helped change all that. Moving employees into 401(k)’s did more than just reduce the obligated burden on corporate bottom lines. It also helped goose the growth of the financial sector upon which the ultra-wealthy depend for their passive incomes. This was not an accident. Combined with the Reagan-era excesses and the explosion of the tech bubble, suddenly Wall Street was hot popular culture, and the nation watched breathlessly as the health of the Dow Jones was commonly equated with the health of the overall economy. The share of GDP taken by the financial sector grew from 2.8 percent in 1950 to 8.4 percent and rising as of 2006 – and financial sector profits account for nearly a third of all corporate profits in America. As a broader sector of Americans watched their meager stock portfolios rise, they weren’t as concerned with the slow growth of their regular wages. Only lately has the damage done to retirement security by moving from defined benefits to uncertain stock markets started to become more widely known.
There was no one to blame, as that was a structural thing, as was this effort:
Push more people into buying real estate, and increase home prices by all means possible. Rates of homeownership increased most dramatically in the 1940s to 1960s, creating the first major bump in housing prices. However, the period between 1960 and 1975 saw home prices decline slightly when adjusted for inflation. The government used the levers of public policy to encourage greater homeownership and reduce interest rates. Big business and wealthy interests pushed through Wall Street deregulation during the Reagan and Clinton eras, which not only boosted the stock market but also allowed large banks to make unprecedented money off of home loans. The end result was that wealthy landlords and asset owners got much richer while rents increased and wages declined, but most Americans didn’t feel the pinch because rising home values made them feel rich on paper until the Great Recession. After the financial crisis, policymakers have done everything in their power to boost both stock and home prices through quantitative easing, 0 percent interest rates, and increased homeowner incentive programs.
Everyone should own a home, right? Renters are losers. Everyone knows this, but the argument here is that this is a structural fact now, aided by this effort:
Democratize consumer debt, especially through credit cards. Americans born after 1975 don’t remember a world before the widespread use of credit cards. But it used to be that if a regular member of the public couldn’t pay his or her bills, debt wasn’t usually an option. But that wasn’t usually a huge problem, either: Because jobs were plentiful and wages had more buying power against the cost of living, most Americans didn’t need credit cards. Revolving credit used to be the province of capitalists, not of wage earners.
Though Diner’s Club cards originated in the 1950s, the charge cards as we know them today were truly born and popularized in the mid-1970s and early 1980s – not coincidentally the same time as Wall Street deregulation, 401(k) transitions and the birth pangs of the real estate boom. The boom in popular credit had two major effects: to enrich the same financial services companies, whose success disproportionately benefits the wealthy, and to disguise and soften the effects of stagnant wages.
Now add new free-trade policies:
The same decades that produced the previous trends also saw the implementation of free trade agreements like NAFTA. It is commonly understood today that these treaties benefit wealthy stockholders while reducing jobs in developed nations. But their less-discussed effect was also to reduce the price of many consumer goods made overseas, which in turn helped to disguise wage stagnation.
Even the poor can now buy a bag of ten dozen pairs of tube socks at Wal-Mart for next to nothing, and thus there are structural reasons we won’t have a workers’ uprising of any kind. The decline of the middle class and public discontent over stagnant wages will be masked, but at a price:
The first is that the vast preponderance of wealth will accrue to the very top incomes in an economy where assets inflate while wages deflate. The second is that a purely asset-based economy is bubble-prone, deeply unstable and given to sharp and painful boom-bust cycles. The story of the last half-decade is in part the removal of the blindfold that has been hiding wage losses over the last half-century. Housing prices have skyrocketed beyond the ability of most people under 40 to afford, even as household debt nears record highs. Nearly half of Americans have no retirement savings at all, while much of the rest of the developed world faces a pension obligation crisis.
This is a structural nightmare, recently explored in depth by that French economist, and Atkins knows why his book scares the asset-economy guys:
The tools policymakers have used to distract the public from the raw deal of low wages are no longer working. And that may more than anything else help usher in a new era of populist progressivism… if, that is, the Democratic Party can shift itself away from reinforcing the asset-based economy toward rebuilding a sustainable model that encourages wage growth and a strong labor market.
Don’t hold your breath. There are structural impediments to any reform of any kind. Those who set policy ARE the One Percent, or are themselves financed by them – and there really is no one bad guy here. Who screwed everything up? Atkins seems to imply a conspiracy of the rich, yet he never hints that one person or some small cabal was behind this. That’s the message even progressives need to understand – the basic idea was fatally flawed, none of us saw it, and you guys didn’t see it either, and by the way, no one did anything wrong, really. Damn, people hate hearing that. Teaching English was easier.