Misunderstanding Calculated Delay

Every day the tourists fill the Santa Monica Pier – every day is a summer day out here after all – and the pier is cool. It has an old merry-go-round that has popped up in more than a few movies, and a coaster and rides of all sorts, and views out to Malibu, and one of the few solar-powered Ferris wheels in the world – because we’re all bleeding-heart liberal tree-huggers out here. Yeah, everyone drives a Prius, except for the rich folks who all drive that swoopy Tesla sedan. We’re cool, and responsible citizens, and there are surfers too – but across the street from the pier, one block south and one block inland, on Main, there’s a large and low white building with tiny windows. That’s the Rand Corporation – the think tank that was created in 1946 by “Hap” Arnold, the famous Air Force general, and the Douglas Aircraft Company, to figure out how to fight and win wars with amazing new weapons – missiles and nuclear bombs and satellites and such. In 1948, Douglas Aircraft, which was founded in Santa Monica and built the airplanes that won the Second World War, pulled out – there was an obvious conflict of interest. Those who supply the hardware shouldn’t be doing operations research on war planning, which soon involved geopolitical game theory stuff, and then analysis of effective and non-effective decision-making processes of all sorts. Rand became the place to think big thoughts. The guys down by the pier sort of gave us the doctrine of nuclear deterrence by mutually assured destruction (MAD) – Robert McNamara used Rand’s work with game theory to argue that was a fine thing, even if it was pretty much what was in place anyway. Rand was good at figuring out what was working, and more importantly, precisely why.

McNamara also created the Vietnam Study Task Force on June 17, 1967, to produce an “encyclopedic history of the Vietnam War” from 1947 onward – he wanted to leave a written record for historians, to head off policy errors in the future. If you want to learn from your mistakes, you do have to know what happened, when, and why. That’s sensible. McNamara neglected to tell Lyndon Johnson and Dean Rusk about this massive study, but they were busy at the time. This effort became known as the Pentagon Papers – and those papers eventually ended up at the Rand Corporation down by the pier.

That’s where they should have been in the first place, with the experts in the analysis of effective and non-effective decision-making processes, and the Rand folks continued and deepened the original analysis of how we got into that Vietnam mess. It wasn’t pretty, and it had to remain secret. If the public ever found out about how decisions on Vietnam had been reached, or avoided, they’d be outraged. We had secretly enlarged the war early on with the bombing of Cambodia and Laos, with coastal raids on North Vietnam, and with Marine Corps attacks all over the place, and none of this had been reported in the media. Congress hadn’t known. President after president had been flailing about and none of it had worked, and they knew it, but no one else knew it. They’d better not find out. The Pentagon Papers remained under lock and key in a dark room across the street from the Santa Monica Pier.

That should have worked fine. Everyone at Rand had a top secret or better clearance, but that didn’t account for the guy who thought the American public should know what was really going on. On a fine warm evening in 1971, Daniel Ellsberg left his Rand Corporation office and walked across the street to the Santa Monica Pier, where he met New York Times reporter Neil Sheehan and handed him the Pentagon Papers, the whole big pile of them, which he had grabbed and photocopied. The rest is history. The New York Times started publishing those Pentagon Papers, and then was stopped by an injunction the Nixon administration had won, so the Washington Post published them and forced the matter up to the Supreme Court, and won, getting the injunction lifted. Daniel Ellsberg was charged with conspiracy and espionage and theft of government property, and those charges were dismissed when everyone found out that the Nixon “plumbers” had broken into the office of Ellsberg’s psychiatrist, hoping to find something to make Ellsberg look like a pervert or a madman. The government wasn’t playing fair, so Ellsberg was free to go, and the public was outraged at what was in all those pages from the pier. Our leaders don’t know what they’re doing, and they know that they don’t know what they’re doing, and all of them, one after another, have been lying to us. Everything is going fine? No one would ever believe that again.

That’s ancient history now, something that happened forty-three years ago, the year that the Sonny and Cher Comedy Hour debuted, but the Rand Corporation still does what it does, analysis of effective and non-effective decision-making processes, and the issue now is Obama. Does Obama know what he’s doing, or is he just flailing around like all the rest? Brian Michael Jenkins is senior advisor to the president of the Rand Corporation, and in the Los Angeles Times he offers the current thinking on that. Think Hannibal and elephants:

President Obama has been repeatedly accused of delay. Critics say he dragged his feet on sending more troops to Afghanistan, on addressing the dangers in Libya, on providing support to Syria’s rebels and, most recently, on initiating military action against Islamic State.

But is that necessarily such a bad thing? Calculated delay has a long history as an effective military strategy, dating back at least to the Second Punic War in the 3rd century BC.

Jenkins is serious:

At the time, Hannibal’s Carthaginian army, including his war elephants, had successfully made its way from North Africa through Spain and across the Alps to invade Italy from the north. There, Hannibal’s troops inflicted two stunning defeats on Rome’s mighty legions, throwing the country into panic.

During ordinary times, the Roman Republic was governed by a senate and two elected consuls who served together for one-year terms. But in times of national crisis, the senate had the option of appointing a dictator to streamline command. In the face of Hannibal’s advance, the senate appointed Quintus Fabius Maximus, who at that point had served two terms as consul.

Everyone expected Fabius Maximus, an admired leader and experienced general, to quickly march on Hannibal’s forces, but he did not. Instead, he avoided major battles while harassing Hannibal’s army around the edges, preventing the invaders from getting supplies, gradually wearing them down and degrading their capabilities. It was a strategy of containment.

It worked just fine, but folks just hated the whole thing:

Avoiding battle was un-Roman, an affront to the greatness of Rome. People called Fabius Maximus the Cunctator – the delayer. It was intended to be an insult.

Fabius Maximus was replaced by a Roman consul who was determined to engage Hannibal directly. Under the command of the new consul, eight Roman legions marched off to destroy Hannibal’s forces. They met at Cannae, in what turned out to be Rome’s greatest military disaster. Between 50,000 and 70,000 Roman soldiers were slain and 10,000 more were taken prisoner.

Oops, but we’ve been there. In 2003, anyone who thought that, rather than going to war in Iraq, we ought to wait until the UN weapons inspectors finished up was un-American, or even French. Delay was stupid, and dangerous. Think of Neville Chamberlain. We sent our eight Roman legions to Iraq. Almost five thousand of our troops came home in body bags, and things are even worse there than before we were bold and awesomely American. We needed a Fabius Maximus. We had George Bush.

The Romans got it:

The disaster at Cannae suddenly made Fabius Maximus look brilliant, and Romans again looked to “the delayer” to save the republic. Cunctator became a title signifying prudence, wisdom and respect.

The disaster in Iraq suddenly made Barack Obama president. He was our Fabius Maximus, but he wasn’t an oddball:

Other military commanders have since followed what became known as a Fabian strategy, among them George Washington, who, in the early years of the American war for independence, avoided head-on battles with the British.

The containment strategy of the Cold War was a Fabian approach in the sense that it made the avoidance of nuclear confrontation its primary objective. But it was also based on the message that, if attacked, the United States would retaliate with massive force. And it was understood by all that all-out nuclear war meant the end of the world.

Okay, that last bit is a plug for the game theory work the Rand Corporation did back in the sixties, but it’s still true in a general way:

Fabian strategies make sense in certain kinds of circumstances: against a stronger adversary, say, or when direct confrontation risks a catastrophic defeat or a larger conflagration. They also make sense in instances when time favors the side that opts to delay. In today’s warfare, there is little risk of a single disastrous battle, but there is still risk of military fiasco and political ruin. Fabian strategies might make sense if there’s a risk of becoming bogged down in a costly and seemingly futile military adventure – or in cases where action carries a high risk of casualties that could turn people against the effort.

That may be the situation in the Middle East now, but it still feels bad:

Even when holding back is the right choice, Fabian strategies are almost never popular at the time they’re employed. People prefer quick victories to extended low-level campaigns. Time may erode morale and sap support for the effort. And a Fabian approach can anger hawks at home and dismay allies abroad. Warfare today is in part about the manipulation of perceptions, and Fabian strategies can look alarmingly like appeasement.

Moreover, there are times when a delay in forcefully confronting enemies can lead to disastrous outcomes. Should the world not have intervened earlier, say, to stop Hitler?

Sure, but when – when he got home from the war in 1918 and started seething about how Germany had been humiliated? That is no more than idle speculation. We didn’t send in a Navy SEAL team to take care of him then. We’ll never know, and it may be a long time before we know if our Fabius Maximus is a jerk or not, but he is the Cunctator, the delayer:

Historians will debate whether Obama’s skeptical approach to entering conflicts reflects prudence or weakness. The evidence so far is ambiguous. The president has said the U.S. war on Islamic State will be “a long-term campaign” with no “quick fixes involved.” He set no deadline for its completion.

The administration has not publicly framed its approach as part of a Fabian strategy – no American administration would ever use the term. Sometimes action – or inaction – speaks louder than words.

Like it or not, we have our Delayer, which may be a good thing, even if we hate it, but Josh Green makes the counterargument, that Obama is too cool for crisis management:

By the time President Obama gave in and appointed an Ebola czar on Oct. 17, the White House response to this latest national crisis had already run a familiar course: the initial assurance that everything was under control; the subsequent realization that it wasn’t; the delay as administration officials appeared conflicted about what to do; and the growing frustration with a president who seemed a step or two behind each new development. Meanwhile, public anxiety mounted as cable news hysteria filled the vacuum and shaped the perception of the unfolding crisis.

Obama calmly insisted there was nothing to worry about when the news first broke of Thomas Eric Duncan’s infection. “It’s important for Americans to know the facts,” he said on Oct. 6. “Because of the measures we’ve put in place, as well as our world-class health system and the nature of the Ebola virus itself, which is difficult to transmit, the chance of an Ebola outbreak in the United States is extremely low.” It soon became clear the health system wasn’t prepared; the virus spread, infecting two nurses who had treated Duncan. One of them had called the Centers for Disease Control and Prevention to report having a fever, yet was still allowed to board a commercial airliner on Oct. 13. The CDC’s guidelines were declared “absolutely irresponsible and dead wrong” by Sean Kaufman, director for safety training at Emory University Hospital, where two American missionaries from West Africa were treated for Ebola in August. But Obama clung to his position for two more weeks, even after it began to look ridiculous.

Only with public confidence slipping and dozens of congressmen calling for a ban on travel from West Africa did Obama submit to the kind of grand theatrical gesture he abhors: He canceled a campaign trip to hold an emergency cabinet meeting and appointed Ron Klain, a veteran political operative, to coordinate the government’s Ebola response. Then the pageantry of White House crisis response reached its familiar end point, with anonymous aides telling the New York Times that Obama was “seething” at the botched response and the criticism that he’d mishandled the crisis.

The issue is different, but the principle is the same. Obama delays things, and this time he shouldn’t have, but that’s who he is:

The difficulty in formulating a response echoes the fitful efforts to address the Deepwater Horizon oil disaster, the chemical weapons attacks in Syria, the advance of Islamic State, the rollout of healthcare.gov, and even the shooting of Michael Brown by police in Ferguson, Mo.

Administration veterans describe Obama’s crisis-management process as akin to a high-level graduate seminar. “He responds in a very rational way, trying to gather facts, rely on the best expert advice, and mobilize the necessary resources,” says David Axelrod, a former White House senior adviser.

By all accounts, Obama treats a crisis as an intellectual inquiry and develops his response through an intensely rational process. As former CIA Director Leon Panetta said recently in a TV interview, “He approaches things like a law professor in presenting the logic of his position.”

Axelrod meant that as a compliment; Panetta didn’t, and Green is with Panetta:

Six years in, it’s clear that Obama’s presidency is largely about adhering to intellectual rigor – regardless of the public’s emotional needs. The virtues of this approach are often obscured in a crisis, because Obama disdains the performative aspects of his job. “There’s no doubt that there’s a theatrical nature to the presidency that he resists,” Axelrod says. “Sometimes he can be negligent in the symbolism.” Lately, this failing has been especially pronounced. Few things strike terror in people quite like the specter of Ebola. An Oct. 14 Washington Post-ABC News poll found that nearly two-thirds of Americans (65 percent) say they fear a widespread outbreak in the U.S. Cooler heads have noted that more Americans have been married to a Kardashian than have died from Ebola. But that fun fact misses the point: People fear what they can’t control, and when the government can’t control it either, the fear ratchets up to panic.

Axelrod is forced to admit it all comes down to what comes out of Southern California, as Green suggests:

Americans’ views of deadly viruses such as Ebola are shaped by Hollywood movies such as Outbreak and Contagion, and when the prospect of a global pandemic arises, we expect a Hollywood president to take charge. Obama’s Spock-like demeanor and hollow assurances about what experts are telling him feel incongruous.

Obama just doesn’t get it:

It’s hard not to suspect that Obama’s lack of executive experience before becoming president is one reason why he often struggles to strike the right tone. In this way, he’s the opposite of the man who preceded him. “I still remember where I was when Bush took the bullhorn at Ground Zero,” Axelrod says. He was recalling one of the great moments of presidential theater, when George W. Bush climbed atop the rubble of the World Trade Center after the Sept. 11 attacks. “I can hear you,” Bush shouted to the cheering rescue workers. “The rest of the world hears you. And the people who knocked these buildings down will hear all of us soon.” In a stroke, Bush galvanized the nation.

Obama recoils from this kind of bravado – and bravado didn’t always serve Bush so well. (A certain flight suit comes to mind.) It also deserted him at critical moments like the aftermath of Hurricane Katrina. But replacing the impulse and emotion that governed Bush with a fealty to experts has led Obama to develop blind spots of his own.

That may be so, but that seems to suggest confusion about what the problem is. If the task is to fix the actual problem – ISIS or Ebola or whatever – Obama is on the case. If the problem is to fix how people feel about the problem at hand, even if you have no idea how to fix it, Obama is a disaster. Bush was far better at that, for all the good it did – the problem was still there and getting worse. Green conflates the two problems:

It’s often said in Washington that the best politics is good policy. That hasn’t been Obama’s experience. Dragged down by Ebola and other headaches, his approval rating has dropped to 40 percent, the lowest yet in his presidency. Democrats are on the verge of losing the Senate partly as a result. This reflects the cost of botching the initial response to so many crises.

Yeah, but then the problems are solved, with quiet calm and careful corrections, and with delay, with not being too hasty. Shallow and simpleminded jingoistic bravado keeps the approval ratings up, and keeps your party in power, but it doesn’t actually solve problems. The Romans hated Quintus Fabius Maximus, then they loved him, and then they probably hated him again – but Hannibal’s elephants died in the Alps and Hannibal slunk away. The problem was solved.

The folks down by the pier in Santa Monica have studied such things, effective and non-effective decision-making processes – they even had the Pentagon Papers for years after all – and we’re all still here. The world didn’t blow up. There’s something to be said for that.


Calling It Quits

The midterms are coming and it is becoming increasingly clear that the Republicans will retake the Senate, making Obama’s last two years in office miserable for him, and for the nation. All nominees to everything will be blocked. If one of the current justices of the Supreme Court kicks the bucket – and a few of them are as old as the hills – expect a Nugent-or-No-One roar from the Republicans. They want a true conservative on the court, one who will ignore the niceties of the law and end legal abortion and make suppressing the votes of the wrong sort of people fully legal again. Ted Nugent would be their man, but they would, magnanimously, be willing to compromise on an actual lawyer, like Ann Coulter. Okay, maybe not her, but Obama would have to nominate someone really conservative, or the seat would remain vacant. Ted would be the bargaining chip, offering them a way to say they’re willing to meet Obama halfway, but he’s a radical left-wing jerk.

This is going to be unpleasant. Expect a lot of legislation to pass this new Senate, agreed to by the already-Republican House, that Obama will veto – the full repeal of Obamacare, the mandatory arming of all school children with assault rifles, the revocation of the citizenship of all gays, a requirement that within two years all automobiles and trucks and buses in America run only on coal, and of course legislation formally making our Department of State a minor branch of Israel’s foreign office. The list of such things is endless and they won’t pass anything else. They still have two years to make Obama look bad. Make him use that veto pen, and be sad and disappointed when he uses it, calling him a do-nothing president who refuses to move the country forward. Of course that only works if they pass legislation that’s quite absurd. Obama might agree with legislation that would make things better for everyone, so they have to be careful.

It’s a plan. Expect two years of gridlock, two years of nothing getting done. They have two more years to ruin the country, so they can have one of their own win the presidency in 2016, on the promise to make everything work again, even if they have no idea how to do that. They’re out of practice. Maybe they never knew how to do that. Think of George W. Bush. But everything should be deregulated. That’s a start.

That’s not much of a plan, but Republicans are upset that Obama somehow became president, and they know that everyone else is too. Obama just can’t be the president, even if he won the job rather easily in 2008 and in 2012 won again, rather easily. They want to fix that. They want to erase the guy, to make it as if he never happened to America. They’re upset. They don’t have any idea of how to govern, they may be incapable of governing, but they’re upset.

There are some problems with that. They’ve stopped talking about Obamacare – it’s working just fine and it’s really a way to help people buy insurance from private parties, so it’s hardly a government takeover of healthcare. They’re not talking about Benghazi – it’s all been said and that’s over. There’s not much to say about Ebola either. Half the population of Dallas didn’t die. One guy who flew in from Africa did. That’s it. The public panicked, and now that they’re slowly but surely becoming embarrassed that they did, anyone who screams that we’re all going to die will look like a fool.

Obama didn’t kill us all, and as for ISIS, or ISIL or whatever, dealing with them increasingly seems to be a matter of getting the actual stakeholders in the region to do something about them. ISIS may be a problem for us one day, but right now, ISIS is their problem. If the actual stakeholders over there fix that, we’re good. We’ll do what we can to help them get their act together, but we’re not going to spend another eight years in Iraq, and this time in Syria too. Those who scream that it’s time, right now, to put boots on the ground, lots of boots, also look like fools. Obama did not just hand over the Middle East to a bunch of thugs, who will blow up Cleveland next week. This will be a long slog, where careful diplomacy is necessary. Someone has to talk some sense into Turkey, and all the others. Obama is working on that. John Kerry will be busy.

This puts the Republicans in an awkward position. There really is nothing to be upset about. Obama is not a Muslim terror-loving socialist out to destroy America just for the fun of it, or because he’s an angry black man who wants to make America pay for that slavery thing so long ago, or a guy who is still upset about British colonialism in Kenya a hundred years ago. He’s careful and sensible, if a bit boring. In fact, Bruce Bartlett, in The American Conservative, says Obama Is a Republican – he’s the heir to Richard Nixon, not Saul Alinsky – that is the subhead to this item.

Bruce Bartlett should know about such things. He’s the historian and economist who got into politics in 1976 working for Ron Paul, the eccentric libertarian, and then for Jack Kemp, writing Kemp’s tax policy. He then served as a domestic policy adviser to Ronald Reagan and was a Treasury official under George H. W. Bush, the first Bush. You remember him, the one who was relatively stable and informed. Bruce Bartlett thought the second Bush was a jerk. The son screwed everything up. Bruce Bartlett has written book after book about the wonders of supply-side economics, and the younger Bush gave away the store, spending all sorts of money that distorted the righteous operation of free markets and then ruined the country. Bartlett wrote a book about that too, and in 2005, the National Center for Policy Analysis fired Bartlett for ragging on young George so much.

Bartlett, however, argued that he himself was the true Republican, not this Bush kid, who was in way over his head. Bartlett made waves, or he made trouble, and now there’s this:

In my opinion, Obama has governed as a moderate conservative – essentially as what used to be called a liberal Republican before all such people disappeared from the GOP. He has been conservative to exactly the same degree that Richard Nixon basically governed as a moderate liberal, something no conservative would deny today.

Bartlett then points out that Noam Chomsky, of all people, recently called Richard Nixon “the last liberal president” – creating the EPA and going to China and all that – so Bartlett feels he can make the counterargument for Obama, and that starts with the Middle East:

One of Obama’s first decisions after the election was to keep national-security policy essentially on automatic pilot from the Bush administration. He signaled this by announcing on November 25, 2008, that he planned to keep Robert M. Gates on as secretary of defense. Arguably, Gates had more to do with determining Republican policy on foreign and defense policy between the two Bush presidents than any other individual, serving successively as deputy national security adviser in the White House, director of Central Intelligence, and secretary of defense.

Another early indication of Obama’s hawkishness was naming his rival for the Democratic nomination, Sen. Hillary Clinton, as secretary of state. During the campaign, Clinton ran well to his right on foreign policy, so much so that she earned the grudging endorsement of prominent neoconservatives such as Bill Kristol…

And there’s this:

By 2011, Republicans were so enamored with Clinton’s support for their policies that Dick Cheney even suggested publicly that she run against Obama in 2012. The irony is that as secretary of state, Clinton was generally well to Obama’s left… This may simply reflect her assumption of state’s historical role as the dovish voice in every administration. Or it could mean that Obama is far more hawkish than conservatives have given him credit for.

Although Obama followed through on George W. Bush’s commitment to pull U.S. troops out of Iraq in 2011, in 2014 he announced a new campaign against ISIS, an Islamic militant group based in Syria and Iraq.

Only a true Republican would simply announce we’re at war again, by the way, and then there’s the economy:

With the economy collapsing, the first major issue confronting Obama in 2009 was some sort of economic stimulus. Christina Romer, chair of the Council of Economic Advisers, whose academic work at the University of California, Berkeley, frequently focused on the Great Depression, estimated that the stimulus needed to be in the range of $1.8 trillion…

The American Recovery and Reinvestment Act was enacted in February 2009 with a gross cost of $816 billion. Although this legislation was passed without a single Republican vote, it is foolish to assume that the election of McCain would have resulted in savings of $816 billion. There is no doubt that he would have put forward a stimulus plan of roughly the same order of magnitude, but tilted more toward Republican priorities.

A Republican stimulus would undoubtedly have had more tax cuts and less spending, even though every serious study has shown that tax cuts are the least effective method of economic stimulus in a recession. Even so, tax cuts made up 35 percent of the budgetary cost of the stimulus bill – $291 billion – despite an estimate from Obama’s Council of Economic Advisers that tax cuts barely raised the gross domestic product $1 for every $1 of tax cut. By contrast, $1 of government purchases raised GDP $1.55 for every $1 spent. Obama also extended the Bush tax cuts for two years in 2010.

So give the guy a break:

Republicans give no credit to Obama for the significant deficit reduction that has occurred on his watch – just as they ignore the fact that Bush inherited a projected budget surplus of $5.6 trillion over the following decade, which he turned into an actual deficit of $6.1 trillion, according to a CBO study – but the improvement is real.

Republicans would have us believe that their tight-fisted approach to spending is what brought down the deficit. But in fact, Obama has been very conservative, fiscally, since day one, to the consternation of his own party. According to reporting by the Washington Post and New York Times, Obama actually endorsed much deeper cuts in spending and the deficit than did the Republicans during the 2011 budget negotiations, but Republicans walked away.

And there are these things to consider too:

Drugs: Although it has become blindingly obvious that throwing people in jail for marijuana use is insane policy and a number of states have moved to decriminalize its use, Obama continued the harsh anti-drug policy of previous administrations, and his Department of Justice continues to treat marijuana as a dangerous drug…

National-security leaks: At least since Nixon, a hallmark of Republican administrations has been an obsession with leaks of unauthorized information, and pushing the envelope on government snooping. By all accounts, Obama’s penchant for secrecy and withholding information from the press is on a par with the worst Republican offenders. Journalist Dan Froomkin charges that Obama has essentially institutionalized George W. Bush’s policies. Nixon operative Roger Stone thinks Obama has actually gone beyond what his old boss tried to do.

Race: I think almost everyone, including me, thought the election of our first black president would lead to new efforts to improve the dismal economic condition of African-Americans. In fact, Obama has seldom touched on the issue of race, and when he has he has emphasized the conservative themes of responsibility and self-help. Even when Republicans have suppressed minority voting, in a grotesque campaign to fight nonexistent voter fraud, Obama has said and done nothing.

Gay marriage: Simply stating public support for gay marriage would seem to have been a no-brainer for Obama, but it took him two long years to speak out on the subject and only after being pressured to do so.

And then there’s the matter of what Republicans think makes America great, corporate profits:

Despite Republican harping about Obama being anti-business, corporate profits and the stock market have risen to record levels during his administration. Even those progressives who defend Obama against critics on the left concede that he has bent over backward to protect corporate profits. As Theda Skocpol and Lawrence Jacobs put it: “In practice, Obama helped Wall Street avert financial catastrophe and furthered measures to support businesses and cater to mainstream public opinion, he has always done so through specific policies that protect and further opportunities for businesses to make profits.”

That’s just a taste of Bartlett’s long and carefully documented argument, but it comes down to this:

I don’t expect any conservatives to recognize the truth of Obama’s fundamental conservatism for at least a couple of decades – perhaps only after a real progressive presidency. In any case, today they are too invested in painting him as the devil incarnate in order to frighten grassroots Republicans into voting to keep Obama from confiscating all their guns, throwing them into FEMA re-education camps, and other nonsense that is believed by many Republicans. But just as they eventually came to appreciate Bill Clinton’s core conservatism, Republicans will someday see that Obama was no less conservative.

Okay, fine – Obama is a conservative. In fact, Obama is an old-school Republican. So what else is new? Bartlett carefully documents how others on his side of things have made the same argument – he just wanted to emphasize how right they were – but he’s not the only old hand around. There’s this guy:

Douglas MacKinnon served in the White House as a writer for Presidents Ronald Reagan and George H. W. Bush and afterwards in a joint command at the Pentagon, where he had a top secret government clearance.

He has a different take on things:

Conservative columnist and former Reagan administration aide Douglas MacKinnon is out with a new book calling for Southern states to secede…again.

While speaking yesterday with Janet Mefferd about his book – The Secessionist States of America: The Blueprint for Creating a Traditional Values Country…Now – MacKinnon called for a movement of states, starting with South Carolina, Georgia and Florida, to establish a new country that will adhere to the Religious Right’s political agenda.

Texas, MacKinnon explained, was not included in his secessionist blueprint because “there have been a number of incursions into Texas and other places from some of the folks in Mexico.”

There are too many brown folks in Texas. This is about good Christian white folks, and he’s serious about quitting the country:

He added that the South had “seceded legally” and “peacefully” during the Civil War, but greedy Northerners like President Lincoln “waged an illegal war that was in fact not declared against the South after the South basically did what we’re talking about in this book now in terms of peacefully, legally and constitutionally leaving the union.”

That’s not how some remember it, and there’s this:

After lamenting that “for whatever reason the leaders that we’re picking are deciding not to stand firmly for traditional values,” MacKinnon repeated his view that a new country should be formed, and even proposed an “interim name” for the ultraconservative breakaway nation: “Reagan.”

Cool, but this doesn’t sound like Reagan:

MacKinnon strongly defended the South for its role in what he called “The War Between The States,” saying that Religious Right activists should endorse the secessionist movement as a way to “protect our faith.”

No one remembers Reagan telling America that Christianity was under attack. He hated welfare queens and big government. He hated communism, which meant everyone sharing, which was theft from the good guys who did something useful with their lives, just as taxes are no more than theft. He didn’t talk about Jesus a lot, if ever. MacKinnon may have worked near Bartlett in the same two administrations, but they had vastly different experiences.

A frequent contributor here in the comments section is Rick, the News Guy in Atlanta – which is where you end up if you and your wife were part of the team that founded CNN in 1980 and worked there for many years – and seeing this he sent a comment along in an email:

Oh, shit! We’ll have to move – but where to? We can’t AFFORD to live in any of the GOOD states! After all, everybody with a brain wants to live in those places, driving living costs way too high!

In fact, I wonder if it has even occurred to this Douglas MacKinnon guy that one person who chose to live in one of them terrible northern states was his country’s namesake, Ronald Reagan! And he chose California, of all places!

Rick grew up out here in the Pacific Palisades, in grade school with Randy Newman, and his parents knew the Reagans, who lived in the neighborhood, so he is puzzled about where he ended up in retirement:

Southerners have always tended to be sloppy when choosing their dead heroes – idolizing guys who probably wouldn’t have agreed with them, had they lived long enough. I’m not sure but I think the Great Seal of the Confederacy had a depiction of George Washington on it – who, although from Virginia, got rid of his slaves in his later years, and was decidedly NOT for states’ rights; he was such an ardent nationalist, he lent his name to the cause of convening the Constitutional Convention that created the federal government, the one that replaced our FIRST failed attempt at confederacy.

And so, while they’re at it, will this new “United States of Reagan” restore slavery? They might as well, since they’re going to need some way to cut expenses to make up for the loss of all that federal revenue that keeps so many of the states afloat.

But I think it would be hard for MacKinnon to make the case that secession is legal. Federal courts have, down through the years, ruled that, unlike the Articles of Confederation, the United States Constitution was not a compact between states that included an exit clause built into it, since its ratification was achieved by the votes of independent constitutional conventions, rather than state legislatures. We are a country founded by “We, the People” – which is one reason the preamble begins with those words, instead of “We, the States”.

So okay, the Founders were sneaky; the states got snookered – but it was all legal and above board!

Yeah, secession won’t work. The base of the Republican Party may be very upset, but that’s not the way out. They might try something like this:

Officials in the City of South Miami have passed a resolution in favor of splitting the state in half so South Florida would become the 51st state.

Vice Mayor Walter Harris proposed the resolution and it passed with a 3-2 vote at the city commission meeting on Oct. 7.

Harris told the commission that Tallahassee isn’t providing South Florida with proper representation or addressing its concerns when it comes to sea-level rising.

“We have to be able to deal directly with this environmental concern and we can’t really get it done in Tallahassee,” Harris said. “I don’t care what people think – it’s not a matter of electing the right people.”

Sometimes you have to start fresh, and this is a different form of secession. You don’t take your preexisting state and just leave. You create a whole new state, which is still part of the union, the United States of America. This isn’t really secession at all – this is secession from Florida, not the United States – but it doesn’t matter. Eve Andrews looks at the climate data – in two hundred years all of this new state of South Florida will be underwater, as sea levels keep rising, and that’s already underway. It doesn’t matter what they do. Let them have their new state. South Florida would be gone soon enough – but these folks are upset. They want to do something. They want to call it quits.

There’s a lot of that going around. Bruce Bartlett points out that people can get the reason for throwing up their hands and just quitting all wrong – they’re upset with someone who is one of them, doing the sorts of things they like. Eve Andrews points out that just quitting may get you nothing at all.

Maybe it’s best to stick it out. No one likes a quitter.


The Death of News

February 3, 1959, was the day the music died – that’s what Don McLean told us in 1971 in that odd hit song about how it was all over for us. That was the day Buddy Holly, Ritchie Valens, and “The Big Bopper” died in a plane crash in Iowa. Is this heaven? No, it’s Iowa, and you’re dead. Buddy Holly had just disbanded the Crickets and had put together a new group with Waylon Jennings and a bunch of his other West Texas buddies, but now that would never be. Ritchie Valens, from out here in Pacoima, the scruffy dusty barrio at the far empty north end of the San Fernando Valley, had had a smash hit with La Bamba – Anglo kids loved it. That was going to change everything, and then his short eight-month recording career was over. Rock would revert to white imitations of black music for the next few decades, and the Big Bopper was just fun. The fun was over. That age of rock music was over. Happy innocence was over. Drive your Chevy to the levee, but the levee is dry.

It was 1971 after all – Nixon was in the White House. That September, the White House “plumbers” unit burglarized a psychiatrist’s office to find files on Daniel Ellsberg, the guy who leaked the Pentagon Papers, to prove he was a pervert or something. The New York Times and the Washington Post had published those papers, and suddenly Washington Post reporters were no longer welcome at White House events. Nixon was going to stick it to the Post, and to its editor, Ben Bradlee, whom he despised. There’d be news, but the Post would have to report it second-hand, and late. Bradlee was on Nixon’s Enemies List – set up that August by a bunch of White House aides to “use the available federal machinery to screw our political enemies” – even if Nixon himself may not have known about it. It was a nasty time, but the next June, five burglars were arrested in the middle of the night exiting the offices of the Democratic National Committee at the Watergate complex, and one of them was James McCord, the security director for the Committee for the Re-Election of the President, appropriately known as CREEP. The Washington Post reported that, and got Attorney General John Mitchell, the head of the Nixon reelection campaign, on record denying any link to what those five guys had been up to, whatever it was. Mitchell would end up in jail. Nixon would eventually resign, the first president to ever do that.

Nixon learned that you don’t mess with Ben Bradlee. He wasn’t Perry White, the blustering befuddled editor of the Daily Planet, trying to figure out what Lois Lane and Clark Kent and Jimmy Olsen were up to, and never quite getting it. Bradlee was the real deal, and he had Bob Woodward and Carl Bernstein. He gave those two holy hell until they had the Watergate story nailed down, with all the details doubly confirmed, or better – there’d be no speculation or bullshit – and then, and only then, would he run the story. Get the news right or get out. Obama’s birth certificate might be a forgery and he might have been born in Kenya? Ebola might be an airborne disease and all the scientists are lying about it, just like they’re lying about global warming? Bradlee would have none of that nonsense. Confirm the story, from multiple sources – otherwise it’s not news and he wouldn’t print it. Nail it down, make it airtight, and he would print it. He kept his reporters honest. He kept his newspaper honest. We won’t see his like again.

Now he’s dead, and October 21, 2014, may be the day news died too. The New York Times – the newspaper that published all those Judith Miller front-page stories about Saddam Hussein’s very real and very scary weapons of mass destruction that turned out to be crap she was fed from a single dubious source – lauds Ben Bradlee in their obituary, as well they should, given that they could have used someone like Ben Bradlee back then, and includes these details:

Mr. Bradlee’s Post and Woodward and Bernstein, as the two became known, captured the popular imagination. Their exploits seemed straight out of a Hollywood movie: two young reporters boldly taking on the White House in pursuit of the truth, their spines steeled by a courageous editor.

The story, of course, became the basis of a best seller, “All the President’s Men,” by Mr. Woodward and Mr. Bernstein, and the book did become, in 1976, a Hollywood box-office hit. Jason Robards Jr. played Mr. Bradlee and won an Oscar for his performance.

Bradlee did become a bit of a folk hero. He was a man who forced others to get it right, and when they did, he let it rip. He gave America the confirmed and verified truth about what was happening, and folks wanted more of that:

After Watergate, journalism schools filled up with would-be Woodwards and Bernsteins, and the business of journalism changed, taking on an even tougher hide of skepticism than the one that formed during the Vietnam War.

“No matter how many spin doctors were provided by no matter how many sides of how many arguments,” Mr. Bradlee wrote, “from Watergate on, I started looking for the truth after hearing the official version of a truth.”

All those would-be Woodwards and Bernsteins, however, can forget that they need someone like Bradlee to keep them from taking that heroic intrepid-reporter thing too seriously. Hemingway once said that every writer needs a foolproof, shockproof crap-detector. Few have one of those. That’s why there are editors. They’re probably more important than the reporters. Someone has to keep them honest.

Judith Miller learned that:

On May 26, 2004, a week after the U.S. government apparently severed ties with Ahmed Chalabi, a New York Times editorial acknowledged that some of that newspaper’s coverage in the run-up to the war had relied too heavily on Chalabi and other Iraqi exiles bent on regime change. …

Public editor Byron Calame wrote: “Ms. Miller may still be best known for her role in a series of Times articles in 2002 and 2003 that strongly suggested Saddam Hussein already had or was acquiring an arsenal of weapons of mass destruction… Many of those articles turned out to be inaccurate… The problems facing her inside and outside the newsroom will make it difficult for her to return to the paper as a reporter.”

Two weeks later, Miller negotiated a private severance package with Times’ publisher, Arthur Ochs Sulzberger, Jr. She contested Calame’s claims and gave no ground in defense of her work, but cited difficulty in performing her job effectively after having become an integral part of the stories she was sent to cover.

She needed someone like Ben Bradlee to keep her honest, and the Times didn’t have one of those. She would go on to write for Rupert Murdoch’s Wall Street Journal, and on October 20, 2008, Fox News announced that they had hired her. They’re not all that particular over there, and while at the Times she did go to jail to protect Scooter Libby and thus Dick Cheney, after all – so she’s one of them. Their concept of news is not Ben Bradlee’s.

That sort of news died when Ben Bradlee died, or earlier, when Bradlee retired as executive editor of the Washington Post in September 1991 – he continued to serve as Vice President at Large until his death, but that was a ceremonial title. He faded away, but the loss is real. Who do we trust now? On the day of Bradlee’s death, Pew Research gave us this:

When it comes to getting news about politics and government, liberals and conservatives inhabit different worlds. There is little overlap in the news sources they turn to and trust. And whether discussing politics online or with friends, they are more likely than others to interact with like-minded individuals, according to a new Pew Research Center study.

The project – part of a year-long effort to shed light on political polarization in America – looks at the ways people get information about government and politics in three different settings: the news media, social media and the way people talk about politics with friends and family. In all three areas, the study finds that those with the most consistent ideological views on the left and right have information streams that are distinct from those of individuals with more mixed political views – and very distinct from each other.

John Avlon at the Daily Beast explains it all:

We’re two weeks from Election Day, and you can feel political debates turning bitter, more personal.

Even in a midterm election when exhaustion rather than exultation drives the conversation, there is desperation behind the poll watching. There are no happy warriors these days. Everything feels like a cycle of revenge and retrenchment.

“Politics has become more bitterly partisan and mean-spirited than I have seen in 30 years of writing a political newsletter,” attests Charlie Cook.

What’s changed? Well, the two parties in Congress are more ideologically and geographically polarized than at any time in our recent history. But we’ve had deep divisions in our politics before. And yes, the Wingnuts seem to have an outsize influence on our politics debates. But we’ve had extremists in our politics before.

What’s different is the proliferation of partisan media via cable news and the Internet. Amid unprecedented access to information, our fellow citizens are self-segregating themselves into separate political realities.

The idea here is that “the asymmetric polarization we see in Congress not coincidentally extends to media consumption” now, and the Pew poll just confirms that:

For example, 47 percent of “consistent conservatives” view Fox News as their main source of information. Their “consistently liberal” corollaries split their allegiance among CNN, MSNBC, NPR, and the New York Times. And while liberals deem 28 of the 36 news outlets surveyed as “trustworthy,” conservatives take a dimmer view, declaring 24 of the 36 untrustworthy.

That finding is a direct reflection of the original premise behind Roger Ailes pitching Fox News as “fair and balanced.” For conservatives, only explicitly right-wing news organizations can be trusted to tell the truth. Any news group that aims for the elusive ideal of objectivity is de facto liberal, in their view. It’s an extension of an idea more appropriate in wartime: If you’re not with us, you’re against us.

All this makes the pluralism of the modern world a scary, unwelcoming place. And so the reaction seems to be to corral oneself off from disagreement. Sixty-six percent of “consistent conservatives” say most of their close friends share their views on government and politics, and nearly half say they mostly see Facebook posts that match their politics.

It’s not like that for everyone else:

On the other side of the spectrum, while liberals are more likely to consume a broader diet of news sites, just over half say their close friends share their views, and 24 percent of “consistent liberals” say they stopped being friends – or stopped talking to – someone because of politics. For these self-righteous and thin-skinned folks, there are apparently limits to the liberal virtue of tolerance.

Then there are the details:

Among moderates, or those with “mixed” political affiliation, as the survey insists on calling them, CNN fares best as the most trusted cable news network and the Wall Street Journal is the only news organization to be deemed trustworthy across the political spectrum (no small feat, especially given its ideologically driven editorial page). Among the news providers underwater in the trust category are Daily Kos, Sean Hannity, Ed Schultz, Glenn Beck, Rush Limbaugh, and, oddly, BuzzFeed. Likewise, Slate is viewed at the liberal end of the spectrum.

This leads to nothing good:

A few decades ago, politicians sent talking points to talk radio hosts. Today, talk radio hosts and online echo-chamber pundits send talking points to politicians. They keep their readers and listeners addicted to anger. The durable wisdom of the late, great Sen. Daniel Patrick Moynihan – “everyone is entitled to their own opinion, but not their own facts” – gets discarded when people come to political debates armed with their own facts. And in a time when the fringe blurs with the base and competitive congressional general elections are all but extinct thanks to the rigged system of redistricting, these base-corralling fanatics have the power to strike fear into the hearts of the gutless wonders on Capitol Hill.

This is not a wake-up call as much as it is a challenge. If we don’t find a way to reverse this media trend, America is headed toward Tower of Babel territory.

It kind of makes you miss Ben Bradlee. He wasn’t out to get Nixon. He was out to get the story right.

Justin Ellis isn’t that worried:

On their face, these findings might seem to lend support to the idea that we’re becoming a country of smaller and smaller filter bubbles, personalized universes of news and people that fit our own interests. But the connection between how Americans get news and their political polarization is not black and white.

Pew found that on Facebook, the majority of people only see political posts they agree with some of the time. That’s also reflected in the real world, as Pew found people on all ends of the political spectrum tend to get a mix of dissent and agreement on politics in their everyday life. 58 percent of consistent liberals and 45 percent of consistent conservatives say they often get agreement and disagreement in their conversations on politics. For people with mixed political views – Pew’s middle ideological category – that jumps up to 76 percent.

We’re still talking to each other, aren’t we? Cool. But we may not know what we’re talking about at any given moment. We “trust” different news sources, each without a Ben Bradlee these days to keep things honest, and Christopher Ingraham notes something odd about the least-trusted news sources:

Overall, four of the top five least-trusted news outlets have a strong conservative lean: Limbaugh, Fox News, Glenn Beck and Sean Hannity. MSNBC rounds out the list. The most trusted news outlets, on the other hand, tend to be major TV networks: CNN, NBC News, ABC News and CBS News, with Fox at No. 5.

The Pew Study notes that “liberals, overall, trust a much larger mix of news outlets than others do. Of the 36 different outlets considered, 28 are more trusted than distrusted by consistent liberals.” By contrast, among conservatives “there are 24 sources that draw more distrust than trust.”

That makes sense. Liberals are, well, liberal – they like a wide range of things, they like divergent views, they’re accepting of the unusual, and people not like them fascinate them. Heck, they find foreign languages fascinating. Conservatives find that odd, or evil, or at least un-American and unpatriotic. That sort of thing leads to the worst thing of all, multiculturalism. It could even lead to moral relativism, and soon people will be marrying box-turtles, and speaking Spanish.

One of Andrew Sullivan’s readers says it’s not like that:

As a grad student who has studied polarization, the Pew study isn’t all that surprising (although it is very useful in confirming what many have long assumed.) I think it may be time, however, to challenge a long-standing assertion of polarization studies. As Bill Bishop has argued in The Big Sort, Americans seem to be increasingly segregating themselves along partisan/ideological lines. Not only are our neighbors more likely than before to share our political views, but we also are probably consuming the same kinds of political news and cultural products. This extends to Facebook as well. Some people argue this creates an “echo chamber” that merely reinforces our political beliefs. In other words, the more Fox News we listen to, the more conservative we become.

But I wonder if there isn’t an opposite effect going on as well. The proliferation of media outlets also makes it easier for us to bump into dissenting views. Unlike the 1950s-1980s, when there was one monopolistic media establishment that kept the heated rhetoric toned down, now there are many outlets, giving us all greater opportunity to encounter viewpoints that we find abhorrent and that we can’t believe others harbor. Facebook didn’t so much create an echo chamber as expose us to the private opinions of people we previously assumed were “sane” in their opinions. Consuming partisan news isn’t so much about finding the truth as it is like running for cover in a crazy world.

That might be so, but another reader adds this comment about Facebook:

I think it’s probably worth noting that liberals are more likely to defriend conservatives over politics, but the chances are good that they weren’t very close friends in the first place (although you can find many laments over the end of long-term friendships on the left, often precipitated by relatively mild pushback and a stream of abuse in response). I’m from the Deep South originally, and of course everyone back there “knows” that Obama is a Muslim socialist, because between Fox and talk radio and right-wing churches and the NRA, that’s what all self-described respectable, well-informed people hear (plus, Democrats are the party of black people, who are widely seen as lazy, violent, and ignorant). I effectively defriended almost everyone there many years ago when I left; social media allowed for at-arms-length reconnections without my having to pretend that I had any interest in the ideology or institutions it was such a relief to leave.

For what it’s worth, I just hide the crazies, and have been defriended a couple of times by conservatives (one a relative to whom I used to be close) even though I’m rarely aggressively political except in political fora or among like-minded acquaintances. The truth is that a) I don’t always want to know what people are thinking about important issues, and b) I do think less of political conservatives, because I consider it a mean, regressive, often self-serving inclination in practice. That’s why I left an area in which it is so unquestioned … and a state that un-coincidentally ranks at or near the bottom of every quality-of-life measure.

What did this person expect? The music died on February 3, 1959, and the news that all of us could trust to be the actual news officially died on Tuesday, October 21, 2014 – and them good old boys were drinking whiskey and rye, singing this’ll be the day that I die, this’ll be the day that I die. It’s like that.


Playing Fair

Americans are fair and open and generous. That’s what we tell ourselves, but when we declared our independence more than two hundred years ago we probably shouldn’t have started by declaring that all men are born equal, because that’s just not so. Those who were born to be short and squat aren’t going to be professional basketball stars, and some people just can’t carry a tune, so they’re not going to be crooning breathless romantic ballads to the nation, for big bucks, and they won’t be knocking them dead on Broadway. They’ll have to settle for wall-of-noise rock stardom. We’re all born with different talents, or a lack of any particular talent, but those who penned the Declaration of Independence had that covered. They really didn’t assert that all men are created absolutely equal, just that all men have, or should have, certain inalienable rights – to life, liberty, and the pursuit of happiness.

That’s the “given” in the axiom they presented. There are those three basic rights that all men have, which should never be taken away by any king, like that King George guy on the other side of the ocean. Of course there are dolts and hopeless losers, and geniuses and winners at everything, but the idea was that everyone should get a fair shot at making what they want of their lives, if they are white males, of property. There was a lot to work out over the years, a process of including more folks in that group who have those same rights – black folks, the former slaves, and even women, who finally got the right to vote, and one day may be guaranteed equal pay for equal work. Obama signed the Lilly Ledbetter Fair Pay Act in 2009 and many Republicans are still fuming about that. It cripples business. Stuff like that will ruin America.

We’re still working on a lot of this. The Fourteenth Amendment with its equal protection clause was added in 1868, and we’re now in the process of deciding if that applies to gays, and the current consensus is that it does. The pursuit of happiness is also the pursuit of marriage, although many married straight folks will say happiness is a bit iffy there – but what the heck, let gay folks give it a go. They may have better luck. Things tend toward playing fair. Americans play fair.

That’s why there’s a direct line from the Occupy Wall Street movement in 2011 to the current popularity of Elizabeth Warren, the eloquent populist – something seems unfair. Income inequality has never been this severe. The game seems to be rigged. The one percent, those who hold almost all the wealth, can’t be THAT much better than the rest of us, even if they say they are. Every Republican from Herman Cain to Mitt Romney has said anyone in America can be a millionaire – all they have to do is get off their fat lazy ass and just do it, so everyone should stop whining – but no one believes that. Many have tried. It didn’t work out. Hard work doesn’t get you there. Luck does, or being born in the right family – the hard work is optional. The game is rigged, or it’s all random. Either way, fairness has nothing to do with getting rich – unless you’re one of those that blames only yourself for everything that goes wrong in your life, because you’re just a miserable excuse for a human being. Republicans thrive on the votes of such people. Republicans tell you that the problem is not them, it’s you, and you know it – or it’s those black folks, or the brown ones, who are the problem. There are many ways to use Americans’ sense that things are just not fair. Americans hate unfairness. That’s why we started this country.

This sense of fairness is almost innate:

Even at 15 months, when they are just beginning to grasp language and acquaint themselves with their newfound motor skills, babies understand the concepts of sharing and fairness, suggests a new study.

The researchers also found that infants do have different sharing “personalities,” with some being shocked by unfairness and others by equal sharing.

“These norms of fairness and altruism are more rapidly acquired than we thought,” study researcher Jessica Sommerville, of the University of Washington, said in a statement. “These results also show a connection between fairness and altruism in infants, such that babies who were more sensitive to the fair distribution of food were also more likely to share their preferred toy.”

Even infants are little Democrats and little Republicans. Some are shocked by unfairness and others are shocked by equal sharing, which they see as unfair to them. Abolishing slavery was unfair to the slaveholders after all. That ruined them economically, and this whole business is complicated, as the infant-study shows:

The majority (92 percent) of babies who shared their preferred toy were also the ones who were shocked by unfairness in the videos and were named “altruistic sharers.” Of the infants who shared their least favorite toy, 86 percent were also shocked by equal sharing in the video, called “selfish sharers.”

“The altruistic sharers were really sensitive to the violation of fairness in the food task,” Sommerville said. Fairness seems as though it might even be built into our brains; research published in the journal Nature in 2010 showed that our brain centers react to unfair allocation of monetary rewards.

Though fairness may be ingrained in even the youngest of infants, our ideas of fairness seem to change as we age. Previous research found that young children seem to like all things to be equal, but older adolescents are more likely to consider merit when it comes to dividing up the wealth, a study published in the journal Science in 2010 found. It could be due to brain changes and adaptation to social experiences.

Perhaps we outgrow our sense of fairness, and Republicans are the only grown-ups in the room, but in the Washington Post, Matt O’Brien sees something else going on:

America is the land of opportunity, just for some more than others.

That’s because, in large part, inequality starts in the crib. Rich parents can afford to spend more time and money on their kids, and that gap has only grown the past few decades. Indeed, economists Greg Duncan and Richard Murnane calculate that, between 1972 and 2006, high-income parents increased their spending on “enrichment activities” for their children by 151 percent in inflation-adjusted terms, compared to 57 percent for low-income parents.

They have the money to do that, but it’s more than the money spent:

It’s also a matter of letters and words. Affluent parents talk to their kids three more hours a week on average than poor parents, which is critical during a child’s formative early years. That’s why, as Stanford professor Sean Reardon explains, “Rich students are increasingly entering kindergarten much better prepared to succeed in school than middle-class students,” and they’re staying that way.

It’s an educational arms race that’s leaving many kids far, far behind.

But wait! There’s more:

Even poor kids who do everything right don’t do much better than rich kids who do everything wrong. Advantages and disadvantages, in other words, tend to perpetuate themselves. … Specifically, rich high school dropouts remain in the top about as much as poor college grads stay stuck in the bottom – 14 versus 16 percent, respectively. Not only that, but these low-income strivers are just as likely to end up in the bottom as these wealthy ne’er-do-wells.

Hard work and education get you nowhere, unless you’re there already:

What’s going on? Well, it’s all about glass floors and glass ceilings. Rich kids who can go work for the family business – and, in Canada at least, 70 percent of the sons of the top 1 percent do just that – or inherit the family estate don’t need a high school diploma to get ahead. It’s an extreme example of what economists call “opportunity hoarding.” That includes everything from legacy college admissions to unpaid internships that let affluent parents rig the game a little more in their children’s favor.

But even if they didn’t, low-income kids would still have a hard time getting ahead. That’s, in part, because they’re targets for diploma mills that load them up with debt, but not a lot of prospects. And even if they do get a good degree, at least when it comes to black families, they’re more likely to still live in impoverished neighborhoods that keep them disconnected from opportunities.

Opportunity hoarding, then, is unfair, but there’s not much that can be done about it. Fredrik deBoer cites study after study (with nifty charts) that shows the same thing and throws up his hands:

The question of how much control the average individual has over his or her own economic outcomes is not a theoretical or ideological question. What to do about the odds – that’s philosophical and political. But the power of chance and received advantage – those things can be measured, and have to be. And what we are finding, more and more, is that the outcomes of individuals are buffeted constantly by the forces of economic inequality. Education has been proffered as a tool to counteract these forces, but that claim, too, cannot withstand scrutiny. Redistributive efforts are required to address these differences in opportunity. In the meantime, it falls on us to chip away, bit by bit, on the lie of American meritocracy.

At least someone is doing some chipping away at that lie:

Sen. Elizabeth Warren railed against the GOP during a campaign rally in Minnesota for Sen. Al Franken (D-MN) on Saturday.

“The game is rigged, and the Republicans rigged it,” she said during her speech at Carleton College, according to the Washington Post.

Warren told the crowd that she would fight against the banks that oppose her legislation that would allow students to refinance their student loans.

“We’re coming after them,” she said.

Every little bit helps, unless it doesn’t. The Republicans will sink any legislation she proposes, because she’s dangerous, as the Washington Post’s Eugene Robinson explains here:

Sen. Elizabeth Warren says she isn’t running for president. At this rate, however, she may have to.

The Massachusetts Democrat has become the brightest ideological and rhetorical light in a party whose prospects are dimmed by – to use a word Jimmy Carter never uttered – malaise. Her weekend swing through Colorado, Minnesota and Iowa to rally the faithful displayed something no other potential contender for the 2016 presidential nomination, including Hillary Clinton, seems able to present: a message.

The message is simple. Play fair:

“We can go through the list over and over, but at the end of every line is this: Republicans believe this country should work for those who are rich, those who are powerful, those who can hire armies of lobbyists and lawyers,” she said Friday in Englewood, Colo. “I will tell you we can whimper about it, we can whine about it or we can fight back. I’m here with [Sen.] Mark Udall so we can fight back.”

Warren was making her second visit to the state in two months because Udall’s reelection race against Republican Cory Gardner is what Dan Rather used to call “tight as a tick.” If Democrats are to keep their majority in the Senate, the party’s base must break with form and turn out in large numbers for a midterm election. Voters won’t do this unless somebody gives them a reason.

Warren may be that somebody. Her grand theme is economic inequality and her critique, both populist and progressive, includes a searing indictment of Wall Street. Liberals eat it up.

Of course they do:

Warren talks about comprehensive immigration reform, support for same-sex marriage, the need to raise the minimum wage, abortion rights and contraception – a list of red-button issues at which she jabs and pokes with enthusiasm. The centerpiece, though, is her progressive analysis of how bad decisions in Washington have allowed powerful interests to re-engineer the financial system so that it serves the wealthy and well-connected, not the middle class.

It’s unfair:

There once was consensus on the need for government investment in areas such as education and infrastructure that produced long-term dividends, she said. “Here’s the amazing thing: It worked. It absolutely, positively worked.”

But starting in the 1980s, she said, Republicans took the country in a different direction, beginning with the decision to “fire the cops on Wall Street.”

“They called it deregulation,” Warren said, “but what it really meant was: Have at ‘em, boys. They were saying, in effect, to the biggest financial institutions, any way you can trick or trap or fool anybody into signing anything, man, you can just rake in the profits.”

She went on to say that “Republicans, man, they ought to be wearing a T-shirt. The T-shirt should say, ‘I got mine. The rest of you are on your own.'”

Those were the “selfish sharers” in the infant-study, and Robinson senses something is changing:

She’s not running for president apparently because everyone assumes the nomination is Clinton’s. But everyone was making that same assumption eight years ago, and we know what happened. If the choice is between inspiration and inevitability, Warren may be forced to change her plans.

Americans are fair and open and generous. We have an innate sense of fairness. Everyone does, as that infant-study showed, and Warren thinks we should take our country back, even if she prefers to help America do that from the Senate, not the Oval Office. That’s what she’s saying now, and she may not change her mind. Hillary Clinton had better hope she doesn’t change her mind. If Warren does change her mind, however, that election would be clarifying. The Republican candidate would offer the same message as before, the Tea Party message – I’ve got mine, and the rest of you are on your own, and that’s only fair, because you have no right to my stuff. That’s the country they want to take back, a completely different country. Some are shocked by unfairness and others are shocked by equal sharing. Let’s see who wins.

The outcome of that election would be determined by who is allowed to vote, another matter of fairness, and Joan Walsh covers the latest twist in that fairness battle:

It’s become a cliché that Supreme Court Justice Ruth Bader Ginsburg issued a “blistering dissent” from a conservative, pro-corporate anti-democracy majority position. We need a new term for what Ginsburg did at 5 a.m. Saturday morning, in a rare public dissent from a SCOTUS decision not to take up a case – this one a challenge to Texas’ harsh and in Ginsburg’s words “discriminatory” voter identification law. …

Not only did Ginsburg demand to write a dissent – she was joined by Elena Kagan and Sonia Sotomayor – but she laid out her reasoning in stirring words that echoed a conservative judicial critic of voter identification, Richard Posner, calling it an “unconstitutional poll tax.”

Now that we know what to call it, and we have a legal framework for understanding that voter ID is a direct descendant of Jim Crow laws, will it be easier to fight? I’m not sure, but understanding is always a necessary first step to action.

This is a matter of fairness, with competing views of just what that is:

It can be hard to combat the notion that voter ID is a common-sense requirement. The vast majority of us have driver’s licenses, and we’re used to showing ID to board a plane or enter a major office building. Yet 20 million adults, or 10 percent of eligible voters, don’t have a driver’s license. Voter ID laws disproportionately hurt black and Latino voters, but also elderly people and students. With the exception of the elderly, those voters are the cornerstone of the Democratic coalition.

In Texas, a federal trial court found that Gov. Rick Perry’s voter ID law was intentionally discriminating against minority voters, disenfranchising as many as 600,000 Texans. But the 5th Circuit U.S. Court of Appeals overturned that decision last week, so the ACLU and other groups went to the Supreme Court. The court declined to consider the case, in line with earlier decisions not to change the rules for voting so close to an election. Ginsburg challenged her colleagues’ peculiar decision to prioritize orderly election administration over protecting voting rights.

“The greatest threat to public confidence in elections in this case is the prospect of enforcing a purposefully discriminatory law,” Ginsburg thundered, “one that likely imposes an unconstitutional poll tax and risks denying the right to vote to hundreds of thousands of eligible voters.”

And here, context matters:

Texas has the worst voter ID law in the country, not even allowing student IDs or Veterans Administration IDs, unlike other states. Unlike her majority colleagues, Ginsburg took seriously the costs of obtaining public ID, as well as the difficulty of traveling to get it. That’s what makes it a poll tax, comparable to the imposition of voting fees that were used to turn away poor black voters in the Jim Crow South – which were outlawed by the 24th Amendment.

Ginsburg’s reasoning echoes that of 7th Circuit Court of Appeals Judge Richard Posner, a conservative who’s had a change of heart and mind on the issue of voter ID. Amazingly, Posner wrote the decision upholding Indiana’s voter ID law, which the Supreme Court later upheld. In his remarkable dissent from his colleagues’ refusal to take up a challenge to Wisconsin’s voter ID law earlier this month – the Supreme Court actually stepped in and suspended that one – Posner specifically blasted Republicans for hyping the “essentially nonexistent” threat of voter fraud.

“There is only one motivation for imposing burdens on voting that are ostensibly designed to discourage voter-impersonation fraud,” he writes, “and that is to discourage voting by persons likely to vote against the party responsible for imposing the burdens.” He noted that such laws are “highly correlated with a state’s having a Republican governor and Republican control of the legislature and appear to be aimed at limiting voting by minorities, particularly blacks.” Posner specifically mocked right-wing groups like True-the-Vote, which claims Democrats are busing minority voters to the polls “on nonexistent buses.”

Posner was fighting a lot of nonsense:

While his colleagues claimed anyone could “scrounge up” their birth certificate, the 75-year-old jurist admitted he “has never seen his birth certificate and does not know how he would go about ‘scrounging’ it up.” … Posner attached to his dissent 12 confusing pages of documents given to an applicant whose birth certificate couldn’t be found. He noted that getting ID could cost $75 to $175, much higher than “the $1.50 poll tax outlawed by the 24th amendment in 1964.”

This fairness bug is catching, even if not much can be done now:

Between Posner and Ginsberg, we have a rare bipartisan intellectual, political and moral agreement that voter ID laws are a 21st century descendant of Jim Crow, only now playing nationwide, not just in the South. This should settle the issue, but it’s unlikely to. The Republican Party faces demographic extinction, on its current course, but it has two powerful weapons in its arsenal: stoking fear – of Ebola, ISIS, immigrants, nearly anybody who isn’t white, our first black president and uppity women – and voter suppression.

That may be so, and unfair, but isn’t it also fair to ask for a simple photo-ID to vote in Texas and all the other states where the Republicans have changed the rules? The states are supposed to administer the voting process, because it’s a precinct by precinct thing and quite local, and tedious. Why can’t the state specify which sort of ID is good and which is not? If you can’t get one, because you’re poor, that’s hardly the state’s problem. Fair is fair. Maybe you should get a good job and stop insisting on being poor, because you like government freebies.

This is fairly simple. If you now can’t vote that’s your fault, not theirs. That seems to be the counterargument, from the days when everyone had a fair shot at making what they want of their lives, if they were white males, of property – the good old days. It seems that everyone wants to take their country back. They just choose different points in time, which means they choose different countries.

The issue, however, is fairness, and even a fifteen-month-old infant knows all about that.


The Persistence of Buffoonery

A good friend – a high-powered attorney who is the leading expert on some rather arcane details of securities law, who spends his time shuttling between Wall Street, where the players are, and Washington, where the regulators are (and the politicians, who need to have things explained to them over and over) – just got back from two weeks in France – Paris and Provence and that sort of thing. Everyone needs a break now and then, and Paris in October is rather fine. The city looks good in the rain, and the summer tourists are gone. The city becomes itself, a place where people simply work and live, in their French way. The text messages followed, but not about the sights or the food or any of that. It was the feel of the place. He was impressed with the formality there, which he characterized as a refreshing absence of buffoonery. That just made sense to him, but he lives in a big house in New Jersey, a pretty enough place with Princeton nearby, but of course his governor is Chris Christie, New Jersey’s buffoon, to match the buffoon across the river in Manhattan, Donald Trump. Some would say neither is a buffoon – they’re just brash or bold guys who like to shout about what they say is right and wrong, and sneer at those who disagree with them, and that’s refreshing, because they’re not politically correct in any way. They tell it like it is, in your face. If you don’t like it, screw you. Perhaps only the French would call them buffoons, or those who spend a few weeks in Paris, where careful formality is the norm. One can be pointed and nasty without being an asshole. The French have mastered the art of deadly irony you might not get until it’s too late and subtle ridicule that sounds like praise, until you think about what was just said.

It’s an art form. The suave Dominique Marie François René Galouzeau de Villepin smiled and told us that our plan for immediate war with Iraq was ill-advised, as if he were explaining this to a petulant child he was nevertheless fond of. At the UN in early February, 2003, he almost laughed at Colin Powell when Powell asked for the UN to go to war with us, or at least tell us our little war was fine with them. Dominique de Villepin, with that bemused smile of the loving adult for the confused child who needs a little help with his tantrum, said wait, let the inspectors finish – there may be no weapons of mass destruction in Iraq, and even if by some odd chance there are, there are better ways to handle this. And of course the guy was right. It just took ten years for us to realize it. We’re not French. We expect buffoonery, which we can counter with our better buffoonery.

That seldom works. It’s too easy to see being careful and formal and precisely polite, and deadly logical, as weakness – but some things just aren’t done. If you’re invited to a private dinner at a French home, and if your French is up to it, you need to know there are some things that are not discussed after the cheese and then the cognac and coffee. What you do for a living isn’t all that important, how you choose to live your life is, and discussing how much money you make is appalling. Mention that and everyone suddenly falls silent. As the French say, an angel passes. That’s so crass, and don’t tell everyone you’ve been born again and have accepted Jesus as your personal savior. That’s your business, no one else’s. Europe has had more than a thousand years of religious wars – the last caliphate made it all the way up to Lyon, Hitler wiped out six million Jews and the French helped a bit, there are too many Muslims everywhere over there now, and the French have designated Scientology a cult that’s really a scam – so keep it to yourself. Such stuff is private. France is a Catholic nation, but privately Catholic. We discuss the separation of church and state all the time, as if Thomas Jefferson might have been wrong, or might have meant something else. They live it and it works rather well for them. Thomas Jefferson lived in Paris from August 1784 to September 1789 – that says something.

Save all that stuff for when you get home to New Jersey or wherever. Brag all you want about how rich you are, or how rich you’re going to be, or how rich you would be if there were justice in the universe – or whine about how poor you are – and get into whatever heated arguments you’d like about religion. It’s a free country, but there is the idea that such talk is gauche – the French word for what is vulgar and tasteless and a bit embarrassing, and maybe a bit dangerous. It’s also very American.

Americans do like their righteous buffoonery, and religion – and the foolishness of political correctness – is a hot topic now. At the beginning of October it came up on Bill Maher’s HBO show:

Bill Maher, who has been more than vocal (and sometimes sexist) about his views on Islam, dove back into the fray – this time with Ben Affleck as his opponent on Real Time with Bill Maher. Maher argued that “liberals need to stand up for liberal principles,” like equality for women and gays and lesbians, but said they’re reluctant to denounce Islam: “But if you say, in the Muslim world, this is what is lacking, then they get upset.” One of those liberals, Ben Affleck told Maher that conflating Islam as one entity was “gross” and “racist.” Affleck went on, “Or how about the more than a billion people who aren’t fanatical, who don’t punish women, who just want to go to school, have some sandwiches, and don’t do any of the things you say all Muslims do?”

Yeah, well, Maher said the Muslim world gave us ISIS, and gave us the practice of female genital mutilation too, and then others chimed in:

Religion scholar Reza Aslan said Maher’s argument was “not very sophisticated” because many Christian countries also practice female genital mutilation, and many Muslim countries do not. Aslan argued that he should be saying it’s a Central African problem rather than a Muslim one. …

Affleck was joined by New York Times columnist Nicholas Kristof, who pointed out that there are multiple Muslim reformers like Malala Yousafzai…

It was a lively I’m-right-and-you’re-wrong discussion, of a sort – it was mostly shouting (Affleck) and sneering (Maher) – and a bit embarrassing. The comedian and the movie star were each claiming they knew the real truth about one of the world’s major religions, because some things are just obvious. It was typical American righteous buffoonery, a lot of shouting that was pretty far from logical and informed – don’t invite these guys to your next dinner party – and it may have been a bit dangerous. There’s a reason the French avoid such topics. Such talk can start wars – but Maher’s first show like this, on ABC, was Politically Incorrect. He’s still at it. It’s what he does. That’s how he makes a living. He creates a buzz.

The buzz didn’t die down. Others, however, decided to add some light to the heat, and Andrew Sullivan offered this:

I think it’s pretty indisputable that any religion that can manifest itself in the form of something like ISIS in any period in history is in a very bad way. I know they’re outliers – even with respect to al Qaeda. But, leaving these mass murderers and sadists to one side, any religion that still cannot allow its own texts to be subject to scholarly and historical inquiry, any religion that denies in so many parts of the world any true opportunities for women, and any religion whose followers believe apostasy should be punished with death is in a terrible, terrible way. There is so much more to Islam than this – but this tendency is so widespread, and its fundamentalism so hard to budge, and the destruction wrought by its violent extremists so appalling, that I find Affleck’s and Aslan’s defenses to be missing the forest for the trees.

Yes, there are Jewish extremists on the West Bank, pursuing unforgivable religious war. There are murderous Buddhist extremists in Burma. There are violent Christian extremists in Nigeria, and in Russia. All religions have a propensity to banish doubt, to suppress humility and to victimize outsiders. But today, in too many parts of the world, no other religion comes close to the menace and violence of Islam.

We have an exception here:

Christianity has a bloody past and a deeply flawed present. Islam has a glorious past in many respects, and manifests itself in many countries today, including the US, humbly, peacefully, beautifully. But far too much of contemporary Islam – from Pakistan through Iran and Iraq to Saudi Arabia – is more than usually fucked up. Some Muslims are threatening non-believers with mass murder, subjecting free societies to shameless terrorism, engaging in foul anti-Semitism, and beheading the sinful in Saudi Arabia just as much as in the Islamic State. And if liberals – in the broadest sense – cannot stand up for freedom of speech and assembly and religion, and for toleration as a core value, then what are liberals for?

Does this make me a bigot? Of course it doesn’t. Criticizing a current manifestation of a religion is a duty – not a sin.

Sullivan is trying to remove the buffoonery from what Maher was saying, and adds this about contemporary Islam:

In history, some of these deviations from the humility of true faith have been worse in other religions. Christianity bears far more responsibility for the Holocaust, for example, than anything in Islam.

But the eighteenth and nineteenth centuries forced a reckoning between those coercive, reactionary forces in Christianity, and in the twentieth century, Catholicism finally, formally left behind its anti-Semitism, its contempt for other faiths, its discomfort with religious freedom, and its disdain for a distinction between church and state. Part of this was the work of reason, part the work of history, but altogether the work of faith beyond fundamentalism. Islam has achieved this too – in many parts of the world. But in the Middle East, history is propelling mankind to different paths – in part because of the unmediated nature of Islam, compared with the resources of other faiths, and also because that region is almost hermetically sealed from free ideas and open debate and civil society.

Let me put it this way: when the Koran can be publicly examined, its historical texts subjected to scholarly inquiry and a discussion of Muhammed become as free and as open in the Middle East as that of Jesus in the West, then we will know that Islam is not what its more unsparing critics allege. When people are able to dissent, to leave the faith, and to question it openly without fearing for their lives, then we will know that Islam is not, in fact, ridden with pathologies that are simply incompatible with modern civilization. It seems to me that until that opening happens, there will be no political progress in the Middle East. That is why we have either autocracy or theocracy in that region, why the Arab Spring turned so quickly into winter, and why the rest of the world has to fear for our lives as a result.

And this sounds very French:

Western democracy was only made possible by the taming of religion.

Western democracy made religion a private matter. That’s what we put in our Constitution. Many in America resent that, but the words are there. Christians aren’t being persecuted. There is no war on Christmas. Christians are being left alone to be whatever they want to be. This isn’t Iran with Christianity swapped in for Islam, and this was a strange way to bring up the whole issue. Ed Kilgore puts it this way:

You don’t have to watch the segment in question to understand, a priori, that five non-Muslims, none of whom are in any way experts on Islam, aren’t going to do much of anything other than damage in dissecting a big, complicated, multifaceted World Religion in a single segment of a single television show.

In the New York Times, Reza Aslan argues that religious identity is not so much about any particular faith as it is about culture and history:

As a form of identity, religion is inextricable from all the other factors that make up a person’s self-understanding, like culture, ethnicity, nationality, gender and sexual orientation. What a member of a suburban megachurch in Texas calls Christianity may be radically different from what an impoverished coffee picker in the hills of Guatemala calls Christianity. The cultural practices of a Saudi Muslim, when it comes to the role of women in society, are largely irrelevant to a Muslim in a more secular society like Turkey or Indonesia.

These guys didn’t know what they were talking about in the first place, but the damage had already been done. There was David Horowitz in the National Review with this:

The horrific images of the beheadings, the reports of mass slaughters, and the threats to the American homeland have accomplished what our small contingent of beleaguered conservatives could never have achieved by ourselves. They brought images of these Islamic fanatics and savages into the living rooms of the American public, and suddenly the acceptable language for describing the enemy began to change. “Savages” and “barbarians” began to roll off the tongues of evening-news anchors and commentators who never would have dreamed of crossing that line before, for fear of offending the politically correct.

Virtually every major Muslim organization in America is an arm of the Muslim Brotherhood, the fountainhead of Islamic terror. Huma Abedin, who was deputy chief of staff to Secretary of State Hillary Clinton (and is still Clinton’s confidante and principal aide), comes from a family of Muslim Brotherhood leaders. Yet legislators who have the power to investigate these matters are still intimidated from even raising them. Representative Michele Bachmann, who did raise them, was excoriated as a racist not only by the Left but also by John Boehner and John McCain.

David Horowitz thanks ISIS for starting to turn this around, because everyone hates Islam now, or soon will. That’s one way to look at it, but Freddie deBoer, in an email to Andrew Sullivan, offers this:

I find it disappointing that you have not once, in your series of posts on Islam, significantly reflected on 100+ years of American murder, destruction, destabilization, support for dictatorship, and stealing of resources as radicalizing factor in the Muslim world. The constant arguments of the type “well, Christianity doesn’t have a radicalism problem” completely ignores that the Christian world has not been subject to a century-long campaign of aggression and mistreatment by America. There can be no hope for moderation among a people who have been subjected to constant injustice since before either of us was born. Since World War I, there has never been a time when the United States has not been directly and destructively influencing the greater Muslim world. That has radicalized many Muslims. And it is a failure of basic moral principle to be a citizen of a country that is participating in a destabilizing, radicalizing, moderation-undermining campaign against the members of a religion and to turn around and ask why they are not more moderate.

If there is a cancer in the Muslim world, then America’s behavior is the carcinogen.

He’s angry with Sullivan:

According to the most basic moral principles – that’s Western principles, by the way, Christian principles and secular alike – your responsibility is your own country. In democracy, your job is your own country. So clean your own house before you tell a billion other people how to clean theirs.

Sean McElwee offers an analogy:

The criticism of “radical Islam” in fact bears resemblance to another dodge today. In the wake of usurpation, violence and plunder, white Americans look at blacks and worry about “cultural pathologies,” where only economic deprivation exists. At the core, the fallacy is the same – ascribing a negative culture to an oppressed and maligned group.

During the debate, Bill Maher claimed, “Islam at the moment is the motherlode of bad ideas.” A more correct assessment is that the material circumstances in the Middle East, many of them the legacy of colonial repression and exploitation, are the motherlode of bad ideas. …

Ultimately, the attack on Islam is a convenient dodge, a means to obfuscate the harm of past oppression under the guise of liberal pluralism. Religion will always exist and will reflect material circumstances; it is therefore best to support religious moderates, but also remove the despair and deprivation that allow violent ideologies to flourish.

Reza Aslan adds another twist to this:

People don’t derive their values from their religion – they bring their values to their religion, which is why religions like Judaism, Hinduism, Christianity, [and] Islam, are experienced in such profound, wide diversity. Two individuals can look at the exact same text and come away with radically different interpretations. Those interpretations have nothing to do with the text, which is, after all, just words on a page, and everything to do with the cultural, nationalistic, ethnic, political prejudices and preconceived notions that the individual brings to the text. That is the most basic, logical idea that you could possibly imagine, and yet for some reason, it seems to get lost in the incredibly simplistic rhetoric around religion and the lived experience of religion.

Think of it this way:

This is the thing – it’s not that you can interpret away problematic parts of a scripture. It’s that the scriptures are inundated with conflicting sentiments about almost every subject. In other words, the same Torah that tells Jews to love their neighbor also tells them to kill every single man, woman, and child who doesn’t worship Yahweh. The same Jesus who told his disciples to give away their cloaks to the needy also told them to sell their cloaks and buy swords. The same Quran that tells believers if you kill a single individual, it’s as though you’ve killed all of humanity, also tells them to slay every idolater wherever you find them.

So, how do you, as an individual, confront that text? It’s so basic, a child can understand: The way that you would give credence or emphasis to one verse as opposed to the other has everything to do with who you are. That’s why they have to sort of constantly go back to this notion of an almost comical lack of sophistication in the conversations that we are having about religion. And to me, there’s a shocking inability to understand what, as I say, a child would understand, which is that religions are neither peaceful nor violent, neither pluralistic nor misogynistic – people are peaceful, violent, pluralistic, or misogynistic, and you bring to your religion what you yourself already believe.

That’s why you never discuss religion after the cheese and then the cognac and coffee, should you find yourself at that dinner party in Paris. Try it and everyone will fall silent. They know better. It’s dangerous and stupid. Leave your buffoonery at home.


At the Edge of Hysteria

Halloween is coming, and everyone likes a good scare, but that means all the old movies on television that people idly watch at the end of a long day, often something they’ve watched many times before but remember fondly, will be the usual Hollywood horror movies, but probably not the classics from the thirties – Dracula (1931) with Bela Lugosi, or James Whale’s Frankenstein, with Boris Karloff, from the same year. The next year it was The Mummy with Boris Karloff, and King Kong (1933) was Adolf Hitler’s favorite movie. The Bride of Frankenstein was a hit in 1935, and four years later it was Son of Frankenstein, but those films have now lost all their power. Only Mel Brooks could bring Frankenstein back to life, as a charming and iconic comedy. His all-singing-and-dancing stage musical version of Young Frankenstein is now on stage here in Los Angeles. It’s a revival and it’s a hoot. It lives!

Those horror movies from the thirties are quaint now. The terror has been drained away, and what we’ve been offered since has settled down into different tropes. Teenagers make stupid choices. No, that empty old house on that dark hill isn’t a good place to spend that dark and stormy night. There’s probably a nice enough motel just down the road, maybe a Holiday Inn Express. Some doors should not be opened. Some things are none of your business. But the bad choices mount, and much gore and a lot of slashing follows, and the more panicked the sweet young thing and her wide-eyed boyfriend become, the worse choices they make, until everyone’s dead. One can feel deep sympathy for their fate, or decide they were both really stupid. And by the way, don’t mock the hapless ugly girl in your high school, if she’s named Carrie. Actually, mocking the meek and weak is probably a bad idea in general. What’s the point?

There are no more mad scientists or intrepid explorers unleashing horror through their overreach, even if they are smart as hell. We’ve settled on stupid kids making bad choices, then even worse choices as they panic, and we eat it up, probably because that’s closer to our experience. Most of us aren’t smart as hell, and we know all about shared hysteria, leading to worse and worse choices, and then people die. Saddam Hussein had those weapons of mass destruction and we were all going to die unless we took care of him, which would take care of them. There was no solid proof that Saddam Hussein had those nasty weapons, there was no smoking gun, but we were told the smoking gun could come in the form of a mushroom cloud. George Bush said so. We walked right into the haunted house on the hill, and then as things got really nasty, we made worse and worse choices – we tortured people, thinking that would make things better. We decided the Sunnis over there were expendable and a Shiite government would probably work out just fine, and now outraged Sunnis have formed ISIS and the whole region is falling apart. We shouldn’t have opened that door. We’re still in that horror film where everyone dies.

Hollywood was onto something. One stupid choice leads to another, and then another, and pretty soon people are dying left and right. Hollywood just packaged this as something we could watch from a safe distance, for a bit of vicarious thrill, but they knew we all know all about this all too well. That could be us. That is us. Maybe George Bush stoked mass hysteria to get the war he wanted, not that boring war in Afghanistan, or maybe we caught Dick Cheney’s all-consuming hysteria and belligerent paranoia, but we panicked. The American public knew this war was necessary, except for those Americans who kept saying we shouldn’t do something stupid, and the French. One really bad choice can lead to further even worse choices, which always leads to very bad things happening, just like in the slasher movies. Life imitates art, or the other way around. Hillary Clinton and many others mocked Obama for having a foreign policy that he admitted came down to one simple principle, don’t do stupid stuff, but there’s something to be said for that.

That’s of course what the nerd in the horror movie always says. Don’t do that, whatever it is. Think things through. Use your head, not your fears – and the worst thing to do is panic. If you panic you’ll make even worse choices, and panic is catching. Soon everyone will be making stupid choices – but no one listens to the nerd. The horror begins.

This happens a lot, and we seem to be at the same point again with Ebola, where the nerds are trying to get everyone calmed down and sensible, and this is getting worrisome:

As health officials scramble to explain how two nurses in Dallas became infected with Ebola, psychologists are increasingly concerned about another kind of contagion, whose symptoms range from heightened anxiety to avoidance of public places to full-blown hysteria.

So far, emergency rooms have not been overwhelmed with people afraid that they have caught the Ebola virus, and no one is hiding in the basement and hoarding food. But there is little doubt that the events of the past week have left the public increasingly worried, particularly the admission by Dr. Thomas R. Frieden, director of the Centers for Disease Control and Prevention, that the initial response to the first Ebola case diagnosed in the United States was inadequate.

On Wednesday, the CDC offered up the latest piece of bad news, announcing that a second infected nurse in Dallas had flown back from Cleveland a day before developing symptoms. Even before the announcement, two-thirds of the respondents to a Washington Post-ABC News poll said they were concerned about a widespread epidemic of Ebola in this country.

That’s not going to happen:

The risk of Ebola infection remains vanishingly small in this country. The virus is not airborne, not able to travel in the way that, say, measles or the SARS virus can. Close contact with a patient is required for transmission. Just one death from Ebola has occurred here, and medical care is light-years from that available in West Africa, where more than 4,400 people have died in the latest outbreak.

By contrast, in some years, the flu kills more than 30,000 people in the United States. Yet this causes little anxiety: Millions of people who could benefit from a flu shot do not get one.

One can be logical about this, but we’re too far into the horror movie for that:

Experts who study public psychology say the next few weeks will be crucial to containing mounting anxiety. “Officials will have to be very, very careful,” said Paul Slovic, president of Decision Research, a nonprofit that studies public health and perceptions of threat. “Once trust starts to erode, the next time they tell you not to worry – you worry.”

Mass hysteria follows, not that we’ll invade Iraq again – but we might. As for Ebola, we’ve been here before:

Experts said the most recent precedent of the Ebola risk, psychologically speaking, is the anthrax scare that followed the Sept. 11 attacks. In the weeks after an unknown assailant sent deadly envelopes with powdered anthrax spores to public officials, people across the country were seized by anxiety.

Some duct-taped windows and stayed away from work. In pockets of the country – Tennessee, Maryland and Washington – people reported physical symptoms like headaches, nausea and faintness. Ultimately they were determined to be the result of hysteria.

“I was in college then, and I remember they evacuated the business school building because someone saw white powder in the cafeteria,” said Andrew Noymer, a sociologist at the University of California, Irvine. The powder turned out to be artificial sweetener.

Expect that sort of thing now:

Psychologists have known for years that people judge risk based on a sophisticated balance of emotion and deduction. Often the former trumps the latter.

Instinctual reactions are quick and automatic, useful in times when the facts are not known or there is not enough time to process what little is known. Analytical reasoning is much slower and much harder; if we relied on analysis alone, decisions about risk would paralyze us.

Sure, but the risks are known here, not that they matter, as David Ignatius explains:

You could feel a shiver of panic coursing through the American body politic this week as the country struggled with a metastatic set of crises: the spread of the Ebola virus, the surge of Islamic State terrorists and the buckling global economy. Listening to the news, many Americans must have felt … that the protection layer had been breached.

President Obama tried to speak calmly to a rattled nation on Wednesday, describing how he had kissed and embraced nurses at Emory University Hospital who had treated Ebola patients safely. Don’t panic, was the unspoken message. It’s safe. Listening to the president, you couldn’t help but wonder if he was straining to keep a polarized, fearful country from losing its cool.

That’s a tall order:

Panic is a natural human response to danger, but it’s one that severely compounds the risk. Frightened people want to protect themselves, sometimes without thinking about others. Often, they get angry and want to find someone to blame for catastrophe. Inevitably, they spread information without checking whether it’s true.

That’s how we ended up in Iraq, and as then, Ignatius sees the press as an issue:

My own business, the news media, has a peculiar responsibility in times like these. We have to deliver information quickly and reliably, and also hold officials accountable for their performance — all without unnecessarily frightening people or contributing to the kind of hysteria that makes public-health measures more difficult. This role is harder in an unfiltered, Internet-driven media world, where careful reporting can look to some people like suppression of information.

There’s no winning. The hysteria is here, although an odd hero emerged:

Fox News’ Shepard Smith railed against the media’s Ebola hysteria on Wednesday.

“You should have no concerns about Ebola at all. None. I promise,” stated Smith. He went on to tell viewers, “Do not listen to the hysterical voices on the radio and the television or read the fear-provoking words online. The people who say and write hysterical things are being very irresponsible.”

He explained: “We do not have an outbreak of Ebola in the United States. Nowhere. We do have two healthcare workers who contracted the disease from a dying man. They are isolated. There is no information to suggest that the virus has spread to anyone in the general population in America. Not one person in the general population in the United States.”

This was unusual on Fox News, where viewers are told we’re all going to die if Obama is not stopped, but this was even more unusual:

“With midterm elections coming, the party in charge needs to appear to be effectively leading. The party out of power needs to show that there is a lack of leadership,” said Smith.

Smith stressed, “I report to you with certainty this afternoon that being afraid at all is the wrong thing to do.” He called media-stoked Ebola panic “counterproductive”, saying that it “lacks basis in fact or reason.” …

“Someday there may be a real panic. Someday, something may start spreading that they can’t control. And then, do you know what we’re gonna have to do? We’re gonna have to relax and listen to leaders. We’re not gonna panic when we’re supposed to and we’re certainly not gonna panic now. We have to stop it.”

Then there was this:

Conservative radio host Rush Limbaugh complained on Thursday about Fox News host Shepard Smith’s commentary calling for news outlets to cover the burgeoning concerns about Ebola in the U.S. more responsibly.

“Shep Smith was crying so much during his reporting from New Orleans and Hurricane Katrina, his mascara was running,” Limbaugh groused.

Smith did report the Bush folks had messed up in their response to Katrina, and folks died. Limbaugh seems to think the right folks died, as they should have, but Limbaugh is not alone:

According to the American Family Association’s Bryan Fischer, it is because Smith is a gay “card-carrying liberal” who is seeking to provide cover for President Obama because Obama “supports the homosexual agenda.”

“Shepard Smith is a card-carrying liberal,” Fischer explained. “He has been outed as an active homosexual, so he’s down with the entire homosexual agenda. People think he’s on Fox so he’s conservative. Anything but.”

“Why would he want to support President Obama?” Fischer asked, before playing Smith’s segment on the Ebola panic. “Because President Obama supports the homosexual agenda.”

Bryan Fischer knows sentence fragments are powerful, and he knows that Smith is gay, even if that matter is far from clear – and he knows there’s reason to panic. Jonathan Last at the Weekly Standard says there are Six Reasons to Panic:

Start with what we know, and don’t know, about the virus. Officials from the Centers for Disease Control (CDC) and other government agencies claim that contracting Ebola is relatively difficult because the virus is only transmittable by direct contact with bodily fluids from an infected person who has become symptomatic – which means that, in theory, you can’t get Ebola by riding in the elevator with someone who is carrying the virus, because Ebola is not airborne.

This sounds reassuring. Except that it might not be true.

Viruses mutate. This one will mutate. One of those mutations could make it airborne, maybe. Panic is appropriate, and the “general infection rates are terrifying too” – lots of people are getting Ebola, maybe not here, but lots of people. Think about that, and think about this:

What’s to stop a jihadist from going to Liberia, getting himself infected, and then flying to New York and riding the subway until he keels over? This is just the biological warfare version of a suicide bomb. Can you imagine the consequences if someone with Ebola vomited in a New York City subway car? A flight from Roberts International in Monrovia to JFK in New York is less than $2,000, meaning that the planning and infrastructure needed for such an attack is relatively trivial. This scenario may be highly unlikely. But so were the September 11 attacks and the Richard Reid attempted shoe bombing, both of which resulted in the creation of a permanent security apparatus around airports. We take drastic precautions all the time, if the potential losses are serious enough, so long as officials are paying attention to the threat.

We should obviously shut down travel from these places, but one can catch a flight from Liberia to Lisbon, and then one to Lima, and then one to Miami, and then one to DC – so maybe we need to shut down all air travel. That seems to be the implication here, and this goes on and on. The Africans have been useless at containing this over there, and this will surely get worse everywhere, and there’s this – “We have arrived at a moment with our elite institutions where it is impossible to distinguish incompetence from willful misdirection.”

Ah, now we know we can trust no one who says they know what they’re talking about. Obama could be lying through his teeth, and no one would trust him even if he was telling the truth. It’s the same for every doctor and scientist. We’re all on our own. That’s the final reason to panic.

One should remember that William Kristol’s Weekly Standard a dozen years ago was the place to go for all the arguments for why we had to go to war in Iraq, right now. It was Panic Central. Kristol’s staff provided all the turns of phrase Bush and Cheney and Rumsfeld and Rice would work into their press conferences and speeches. It was the neoconservative service bureau, and it doesn’t seem to have changed much.

Paul Waldman sees it this way:

I don’t know if Ebola is actually going to take Republicans to victory this fall, but it’s becoming obvious that they are super-psyched about it. Put a scary disease together with a new terrorist organization and the ever-present threat of undocumented immigrants sneaking over the border, and you’ve got yourself a putrid stew of fear-mongering, irrationality, conspiracy theories, and good old-fashioned Obama-hatred that they’re luxuriating in like it was a warm bath on a cold night.

It isn’t just coming from the nuttier corners of the right where you might expect it. … One candidate after another is incorporating the issue into their campaign. Scott Brown warns of people with Ebola walking across the border. Thom Tillis agrees: “Ladies and gentlemen, we’ve got an Ebola outbreak – we have bad actors that can come across the border. We need to seal the border and secure it.” “We have to secure the border. That is the first thing,” says Pat Roberts, “And in addition, with Ebola, ISIS, whoever comes across the border, the 167,000 illegals who are convicted felons, that shows you we have to secure the border and we cannot support amnesty.” Because really, what happens if you gave legal status to that guy shingling your roof, and the next thing you know he’s a battle-hardened terrorist from the ISIS Ebola brigade who was sent here to vomit on your family’s pizza? That’s your hope and change right there.

The Weekly Standard has done its job well, for the team:

Even if most people aren’t whipped up into quite the frenzy of terror Republicans hope, I suspect that there will be just enough who are to carry the GOP across the finish line in November. When people are afraid, they’re more likely to vote Republican, so it’s in Republicans’ interest to make them afraid. And you couldn’t come up with a better vehicle for creating that fear than a deadly disease coming from countries full of dark-skinned foreigners. So what if only two Americans, both health care workers caring for a dying man, have actually caught it? You don’t need facts to feed the fear. And they only need two and a half more weeks.

It’s a plan, but Andy Borowitz imagines how it could backfire:

There is a deep-seated fear among some Americans that an Ebola outbreak could make the country turn to science.

In interviews conducted across the nation, leading anti-science activists expressed their concern that the American people, wracked with anxiety over the possible spread of the virus, might desperately look to science to save the day.

“It’s a very human reaction,” said Harland Dorrinson, a prominent anti-science activist from Springfield, Missouri. “If you put them under enough stress, perfectly rational people will panic and start believing in science.”

Additionally, he worries about a “slippery slope” situation, “in which a belief in science leads to a belief in math, which in turn fosters a dangerous dependence on facts.”

That’s a bit fanciful. People who panic never start believing in science, and they certainly don’t suddenly develop a dependence on facts. They open the door that there’s no real reason to open, and the monster jumps out. Everyone has seen the movie. Panic and hysteria follow, which lead to more panic and hysteria. People die. It’s great fun, in the movies.


Riding the Ebola Wave

Something changed after the Greatest Generation took care of Hitler and Mussolini and Tojo, when America ended up with the only economy in the world that hadn’t been devastated by that war. America hadn’t been bombed into smoking rubble, and now we knew how to make almost anything on a massive scale, and had the capacity to do so. Incredible prosperity followed, for almost everyone. We built cars, not tanks and planes, and then televisions, and then anything we could think of. Everyone wanted a house in the suburbs, and everyone could have one. There seemed to be good jobs for everyone, and if you didn’t like any of those jobs, you could go into business for yourself. You’d do fine. People had money in their pockets. They’d buy what you were selling, and back then the government wasn’t regulating much of anything. You could sell crap that didn’t work at all and then just move on, and then sell something else that might be a little questionable. Ten years after the war ended, Detroit started selling big wallowing cars with giant tailfins, the same cars as before, but now space-age snazzy. People bought them. Kids bought hula hoops. There was a lot of money sloshing around. The economy exploded in all sorts of directions.

That explosion is what changed things. Those of us born just after the war wouldn’t live our lives like our parents, in one respectable career, rising to the top, or at least to the respectability of loyalty to the firm, and the honor that comes with that. We disappointed our parents by job-hopping and even changing careers entirely, several times. We didn’t stick to anything, but then that was impossible, given how the world kept changing. Old industries died. Whole new industries were invented. Success became a matter of riding the next wave, and then the wave after that. There were many awkward conversations in the early sixties. What are your career plans, how do you want to spend the rest of your life? There was no good answer to that. Where do you see yourself in ten years, and how do you plan to get there – what are the specific steps you must take to get there? Okay, fine – tell me what the world will be like in ten years. You can’t, can you? Baby boomers went off into the world willing to improvise. There was no other choice. Their parents looked sad.

No one settled down, but that was okay, and all of us have our tales of how things somehow worked out. Teaching high school English in the seventies was fine, but that was a dead end. Moving to California and working in the “real world” wasn’t. The aerospace companies were hiring, and Training and Organizational Development wasn’t much of a stretch, but that was a dead end too – but in the mid-eighties desktop computers suddenly arrived, and all sorts of Human Resources stuff could be automated. Suddenly there was such a thing as Human Resources Systems, another wave to ride, something no one saw coming. Cool – and that led to something else no one ever heard of before, outsourcing, dumping all the tedious systems stuff, letting a contractor do the work, and working for the contractor, not the aerospace company, was cool too. There were no dead ends there. They had other contracts, and running the systems shop at that locomotive factory halfway between Detroit and Toronto was something entirely new – the legacy COBOL-based manufacturing resource planning system, running on a competitor’s mainframe in Texas, was a hoot – but that was a dead end too. Why not quit? A chain of Catholic hospitals in Pasadena needed someone to manage the business operations shop – payroll, accounts payable, general ledger – and that was fine, until the nuns outsourced us all to another contractor, who “streamlined” everything and sent us elsewhere. Fine – it was good to learn their fancy system that handled HMO stuff – contracts and eligibility and such things, so the HMO made good money, by making sure no one was getting any sort of treatment that wasn’t authorized by the bean-counters. It was a slick system for maximizing profits, merciless, or looking at it another way, superbly efficient – but then the HMO that wanted to use it went under before any of us could get there, swallowed up by a larger HMO with a different system. Oh well – a year or two off was fine – and then working for an actual HMO was fine too. It was a way to look at the same set of problems from the inside, but their systems were in chaos and some of us were forced out in the churn.

That was okay. It was time to retire anyway, and looking back at all this, it’s clear that any career plan from 1964 or so wouldn’t have worked. As they say, who knew? Second, over all the long and strange years, one does learn what controls one’s fate. Others want to make money. They don’t give a damn about you unless you make them money – then they’re happy with you, if they remember your name. This was nothing like teaching high school English. Third, it’s obvious that sooner or later we’ll all end up working in healthcare, one way or another. The boomers are getting old, needing the medical help age makes necessary, and the days when America knew how to make almost anything on a massive scale, and had the capacity to do so, are long gone. Others, overseas, make what we buy. We have a service economy, where we service each other, coupled with that financial world out there where the rich get even richer selling imaginary assets to each other.

Our parents wouldn’t recognize this world. No one stays in one career. There’s no longer any honor in that, and in fact that doesn’t make sense, not now. That hasn’t made sense since 1947 or so, when laissez-faire capitalism was fully unleashed, when Adam Smith’s Invisible Hand became more important than any of us. It’s every man for himself now, in a world where everything is always changing. There were parents, back in the day, who wanted their son to grow up to be a doctor, someone respected, who really helped others, and made a good living too. The daughter might be a nurse – the same thing, without the good money. They wouldn’t recognize this world, where such people are just tools of others, the ones who make the real money, who run the world.

The Ebola crisis is making that painfully obvious, and one of Josh Marshall’s readers over at Talking Points Memo offers this:

I have a perspective tying together today’s big news brouhahas. My wife is an ER nurse at a major urban hospital owned by the Hospital Corporation of America, the hospital chain once run by Rick Scott. It’s the largest for-profit medical system in the world, and is of course also notable for its “creative billing” practices in the largest Medicare fraud settlement in history. Scott was booted from the CEO position following that fraud investigation, so he’s not directly responsible for current conditions in those hospitals.

But it is obvious to those who work there that the combination of lax training and toxic labor relations “leaders” like him have brought to the company is emblematic of a big problem for US hospitals if a major outbreak of Ebola or other infectious disease occurs. My wife’s ER has an “Ebola cart” with some lightweight protective gear and written instructions for putting on a PPE [Personal Protective Equipment], but the instructions are a loose bundle of papers, the pictures don’t match the gear in the cart, and there are inaccuracies that put the staff at serious risk.

It’s an object of gallows humor for the staff. That’s the totality of their training or preparedness so far. As we all now know, PPEs are not easy to put on and take off correctly. Even though nurses all have experience with standard droplet control (they see TB and HIV all the time), Ebola is a special case. They have gone months and months without a nurse education director because no one wants to deal with their management and take the position. Her coworkers are clear that they will refuse to treat an ebola patient because they have woefully inadequate training in the correct procedures and lack proper gear.

Rick Scott is currently the Republican governor down there in Florida. The largest Medicare fraud settlement in history didn’t hurt him one bit. He said his subordinates did that and didn’t let him in on what they were doing. Scott was booted from his CEO position because he was a clueless executive who hired and trusted the wrong people. He wasn’t fired for the fraud, and he told the voters of Florida he’d end government waste, cutting everything in sight. He told them he’d drug-test everyone on welfare, and they liked that. He didn’t tell them his wife owned the company that would do all the testing and the two of them would make a fortune billing the state. It’s a strange situation, but conservative voters down there wanted someone mean – merciless or, if you wish, superbly efficient – to slap the state into shape. That’s what they got. He runs the state like he ran his hospital chain, ruthlessly – and he’ll make a bundle too, with a bit of sleight of hand. He spent seventy-five million dollars of his own money to get elected. It was a good investment.

Josh Marshall’s anonymous reader knows this guy and this world, and his wife’s hospital. The hospital can’t handle Ebola. Everyone knows that:

And yet the head of infectious disease at this hospital went on the local news to proclaim the hospital was ready to receive ebola patients safely. They obviously didn’t bother to speak to a single nurse on the front lines. I’m not particularly panic-y about ebola, even though obviously the family members of ER personnel have a lot at stake in Ebola preparedness. But I think that this situation will be the weak link in any major national response.

So many of our hospitals are run by lunatics like Rick Scott who seek only the highest profit margin – they do not invest in training, they build charting mechanisms that are good for billing but not treating patients, they constantly fight with their unionized employees, they lie to the public, etc., etc. We like to imagine that competent, highly-skilled medical institutions like Emory will save us, but we have way more Dallas Presbyterians in this country than we have Emorys. You can see exactly this managerial incompetence – and toxic labor relations – woven through the statement released by the nurses at Dallas Presbyterian today. Also see the head of National Nurses United on All In With Chris Hayes for a similar perspective.

To put it bluntly: we’ve entrusted our national medical system to the managerial competence and goodwill of the Rick Scotts of the world, and that is much scarier than a podium fan.

In case you missed the podium fan thing, see this:

In one of the weirdest and most Floridian moments in debate history, Wednesday night’s gubernatorial debate was delayed because Republican Governor Rick Scott refused to take the stage with Democratic challenger Charlie Crist and his small electric fan… Rather than waiting for the governor to emerge, the debate started with just Crist onstage. “We have been told that Governor Scott will not be participating in this debate,” said the moderator. The crowd booed as he explained the fan situation, and the camera cut to a shot of the offending cooling device.

“That’s the ultimate pleading the fifth I have ever heard in my life,” quipped Crist, annoying the moderators, who seemed intent on debating fan rules and regulations. After a few more awkward minutes, Scott emerged, and the debate proceeded, with only one more electronics dispute. When asked why he brought the fan, Crist answered, “Why not? Is there anything wrong with being comfortable? I don’t think there is.”

Rick Scott may not be governor down there much longer. He may be appropriately mean and merciless, but he threw a tantrum like a prissy sixth-grade little girl, thinking it would impress every voter in Florida. That might have been a miscalculation. The whole nation is making fun of him now. The little things matter. Perhaps he made some good points in that debate. No one will remember them now.

Rick Scott, however, may pull this off. He’s a Republican in this unforgiving world, the kind of guy who does what makes economic sense, no matter who gets hurt, and David Stirling explains how that relates to Ebola:

Ebola has been killing people since 1976, so why do we still have no vaccine?

There is no profit for pharmaceutical giants in developing expensive drugs for rare diseases in countries with no money to buy them.

They are interested only in mass-market medicines for First World conditions such as cancer and heart disease, or lifestyle drugs such as Viagra.

Even health experts have ignored Ebola. Two years ago, the World Health Organisation listed 17 neglected tropical diseases (NTDs) that afflict more than one billion people. Ebola didn’t rate a mention – until the current outbreak killed more than 4000 people so far and put the world on alert.

Of course, drug companies are not charities, they answer to shareholders, and for that matter, where’s the financial incentive for governments when the disease afflicts only Africa?

All that is rather obvious – the Invisible Hand has spoken. No, wait – market forces have spoken. Hands don’t speak. They slap people around, which may be the same thing, but someone will make money here. The Washington Post’s Abby Ohlheiser explains how that works:

The Centers for Disease Control and Prevention announced Tuesday that a person has been diagnosed with Ebola in the United States. The market reacted accordingly. The most striking monetary effect of the CDC’s announcement was encapsulated in this headline from USA Today: “Ebola stocks soar after infection hits U.S.”

Yes, the makers of experimental drugs that have a shot at becoming the first confirmed Ebola treatment fared well in the markets after the Ebola-in-the-U.S. news broke.

“The first confirmed Ebola case in the U.S. is fanning fears around the country, but it’s also driving greed in some corners of the stock market,” CNNMoney said.

It was just the latest in a series of boons for those companies.

There wasn’t money in this before. There is now. Just add panic, and the “natural remedy” folks are seeing green too:

One of the more reliable byproducts of something like the Ebola outbreak in Africa (and its arrival in the U.S.) is the marketing of products that aren’t actually drugs as potential cures or treatments for the illness. This is something the FDA anticipated would happen this year, as Ebola began to spread across West Africa. “Oftentimes with public health incidences, like Ebola or even during H1N1, we see products that are marketed, often online, that claim to treat or cure the disease … without FDA approval,” FDA spokesperson Stephanie Yao said in an earlier interview with The Post.

Last week, the agency sent letters to three companies, alerting them that some of their paid consultants were marketing their products – which included essential oils and organic dark chocolate bars – as Ebola cures and treatments against FDA regulations.

That is rich:

Although two of the companies in question made it very clear in statements to The Post that they don’t condone the marketing of their products in this way, one company was promoting the idea itself.

Natural Solutions Foundation claimed in its online marketing materials that its Nano Silver product could cure Ebola, Hepatitis B and C, and H1N1, among other diseases. “WHO, FDA, the New York Times, etc., have gone on a rampage of disinformation to keep you in the dark about natural ways to dispose of dangerous microbes without damaging your beneficial bacteria,” the company added.

The ads will be on your television screen soon, and then there are the hedge fund managers:

It turns out that the spread of Ebola through West Africa prompted some hedge funds to bet on it affecting cocoa prices. The countries hardest hit by the outbreak border the Ivory Coast, one of the world’s largest cocoa producers. According to Bloomberg, the possibility that Ebola will spread there is one of many factors leading experts to speculate that cocoa prices will continue to rise.

A September 24 Moody’s report cited by Bloomberg notes that Ebola control measures might produce labor shortages during the beginning of cocoa’s harvest season in October.

It’s time to play those cocoa futures. There’s money to be made, and we do live in a world where laissez-faire capitalism was fully unleashed long ago. It’s every man for himself, alone, and we are actually all libertarians now. At Newsweek, Victoria Bekiempis discussed how that’s working out in Texas:

The Centers for Disease Control and Prevention confirmed the first case of Ebola diagnosed in the U.S. on Tuesday, in Dallas, Texas. This presents both epidemiological and political questions. Libertarianism is a major political force in Texas, and Libertarianism generally advocates against government involvement in healthcare – so if the 135 Libertarians running for office in the Lone Star State this November were elected, would they want the government to fight the disease?

The answer is more nuanced than one might expect: Most Libertarians interviewed by Newsweek agreed government should intervene to protect public health in exceptional circumstances, but said intervention would have to be very careful and limited – and, perhaps, that it is better executed by the private sector.

That goes like this:

Carla Howell, National Libertarian Party Political Director, says “governmental bureaucracies” involved with epidemic control are ineffective compared to private and voluntary efforts, in addition to costing too much money and violating individual rights.

“The sole purpose of government is to protect our life, liberty and property from harm caused by others in those few instances where the private sector cannot do a better job,” Howell writes in an email to Newsweek. “Containing Ebola in Africa is best left to private charities such as Doctors without Borders rather than the NIH [National Institutes of Health] or the CDC.

“Screening is better handled by airlines and private hospitals that are both liable for damages and fully free of government red tape. (Sadly no such hospitals exist today in the United States).”

Digby (Heather Parton) summarizes the rest:

To be fair, some other libertarians who are running for office in Texas reluctantly agreed that as much as they loathe “government bureaucracies” like the CDC, they have “bigger fish to fry.” Others recognized that quarantines enforced by the proverbial men with guns might be necessary. Overall, they seemed to be more uncomfortable with implications of their belief system in this instance than we usually see. In fact, they remind [me] of the anti-abortion zealots when confronted with the inconvenient fact that if they consider abortion murder they are morally required to arrest the women who have them.

The spokesperson for the national Libertarian party is the only one who is unashamedly willing to spell out the solutions their philosophy truly requires.

Why not be honest about this? Everyone understands now. This is not the world of the Greatest Generation. There’s no respectability in loyalty to something larger. You’re alone. Ride the wave, and then the next one, no matter who drowns around you. We’ve lived our lives like that for many decades now. Now we know what that means.
