Posts tagged politics

Stag Party

By Frank Rich

It’s not news that the GOP is the anti-abortion party, that it panders to the religious right, and that it’s particularly dependent on white men with less education and less income—a displaced demographic that has been as threatened by the rise of the empowered modern woman as it has been by the cosmopolitan multiracial male elites symbolized by Barack Obama. That aggrieved class is, indeed, Santorum’s constituency. But, as Stephanopoulos was trying to get at when he challenged Romney, this new rush of anti-woman activity on the right isn’t coming exclusively from the Santorum crowd. It’s a phenomenon extending across the GOP. On March 1, every Republican in the Senate except the about-to-flee Olympia Snowe—that would be 45 in total—voted for the so-called Blunt Amendment, which would allow any employer with any undefined “moral” objection to veto any provision in health-care coverage, from birth control to mammograms to diabetes screening for women (or, for that matter, men) judged immorally overweight.

After the Blunt Amendment lost (albeit by only three votes), public attention to the strange 2012 Republican fixation on women might have dissipated had it not been for Rush Limbaugh. His verbal assault on a female Georgetown University law student transformed what half-attentive onlookers might have tracked as a hodgepodge of discrete and possibly fleeting primary-season skirmishes into a big-boned narrative—a full-fledged Republican war on women. And in part because Limbaugh pumped up his hysteria for three straight days, he gave that war a unifying theme: pure unadulterated misogyny.

The GOP Establishment didn’t know what to do about Rush. Conservatives had tried to make the case that the only issue at stake in the contraception debate was religious liberty—Obama’s health-care czars forcing religiously affiliated institutions (or more specifically Catholic institutions) to pay for birth-control coverage (which 98 percent of sexually active American Catholic women use at some point, according to the Guttmacher Institute). But the Obama administration had walked back that rule in a compromise acceptable to mainstream Catholics, including the Catholic Health Association. So what was Rush yelling about now except his own fantasies (videos included) about this young woman’s sex life?

Health Insurance Is for Everyone

By FAREED ZAKARIA

The centerpiece of the case against Obamacare is the requirement that everyone buy some kind of health insurance or face stiff penalties—the so-called individual mandate. It is a way of moving toward universal coverage without a government-run or single-payer system. It might surprise Americans to learn that another advanced industrial country, one with a totally private health care system, made precisely the same choice nearly 20 years ago: Switzerland. The lessons from Switzerland and other countries can’t resolve the constitutional issues, but they suggest the inevitability of some version of Obamacare.

Switzerland is not your typical European welfare-state society. It is extremely business-friendly and has always gone its own way, shunning the euro and charting its own course on health care. The country ranks higher than the U.S. on the Heritage Foundation’s Index of Economic Freedom.

Twenty years ago, Switzerland had a system very similar to America’s—private insurers, private providers—with very similar problems. People didn’t buy insurance but ended up in emergency rooms, insurers screened out people with pre-existing conditions, and costs were rising fast. The country came to the conclusion that to make health care work, everyone had to buy insurance. So the Swiss passed an individual mandate and reformed their system along lines very similar to Obamacare. The reform law passed by referendum, narrowly. The result two decades later: quality of care remains very high, everyone has access, and costs have moderated. Switzerland spends 11% of its GDP on health care, compared with 17% in the U.S. Its 8 million people have health care that is not tied to their employers, they can choose among many plans, and they can switch plans every year. Overall satisfaction with the system is high.

The most striking aspect of America’s medical system remains how much of an outlier it is in the advanced industrial world. No other nation spends more than 12% of its total economy on health care. We do worse than most other countries on almost every measure of health outcomes: healthy-life expectancy, infant mortality and—crucially—patient satisfaction. Put simply, we have the most expensive, least efficient system of any rich country on the planet. Costs remain high on every level. Recently, the International Federation of Health Plans released a report comparing the prices in various countries of 23 medical services, from a routine checkup to an MRI to a dose of Lipitor. The U.S. had the highest costs in 22 of the 23 cases. An MRI costs $1,080 here; it costs $281 in France.

In 1963, Nobel Prize-winning economist Kenneth Arrow wrote an academic paper explaining why markets don’t work well in health care. He argued that unlike with most goods and services, people don’t know when they will need health care. And when they do need it—say, in the case of heart failure—the cost is often prohibitive. That means you need some kind of insurance or government-run system.

Now, we could decide as a society that it is O.K. for people who suddenly need health care to get it only if they can pay for it. The market would work just as it works for BMWs: anyone who can afford one can buy one. That would mean that the vast majority of Americans wouldn’t be able to pay for a triple bypass or a hip replacement when they needed it. But every rich country in the world—and many not-so-rich ones—has decided that its people should have access to basic health care. Given that value, a pure free-market model simply cannot work.

The Swiss and Taiwanese found that if you’re going to have an insurance model, you need a general one in which everyone is covered. Otherwise, healthy people don’t buy insurance and sick ones get gamed out of it. Catastrophic insurance—covering trauma and serious illnesses—isn’t a solution, because it’s chronically ill patients, just 5% of the total, who account for 50% of American health care costs. That’s why the Heritage Foundation, a conservative think tank, came up with the idea of an individual mandate in the 1980s, proposing that people buy health insurance in exactly the same way that people are required to buy car insurance. That’s why Mitt Romney chose this model as a market-friendly system for Massachusetts when he was governor. And that’s why Newt Gingrich praised the Massachusetts model as the most important step forward in health care in years. They have all changed their minds, but that is about politics, not economics.


Trickle-down consumption

Chrystia Freeland

We know now that trickle-down economics doesn’t really work – the past decade in the United States has seen incomes at the very top soar, while the earnings of the middle class stagnated or declined. But a growing body of academic research is suggesting that this benign force’s wicked stepsister, a phenomenon two economists have dubbed ‘‘trickle-down consumption,’’ is having a powerful impact on the economy and politics of the United States.

The idea is that income inequality has a significant impact on the 99 percent: It drives the rest of us to consume more, whether we can afford to or not.

Robert H. Frank, an economist at Cornell University, is a pioneering student of this behavior who has been writing about the subject for nearly two decades, long before it became fashionable. Frank, who is the co-author of two economics textbooks with the Federal Reserve chairman, Ben Bernanke, believes that rising income inequality affects the rest of us through what he calls ‘‘expenditure cascades.’’

Rising income inequality, he notes, isn’t just about the gap between the 99 percent and the 1 percent; it is also about growing differences across the income distribution, including at the very top. The result is that all of us see people we think of as our peers earning – and spending – a lot more. As a consequence, we find ourselves spending more, too.

‘‘The main idea is that frames of reference are very local,’’ Frank said. ‘‘Bertrand Russell said beggars don’t envy millionaires, they envy other beggars who have a few more coins than they do. Expenditure cascades aren’t because the poor want to emulate the rich.’’

Instead, he argues, each of us imitates those near us – and the result is a cascade of unaffordable consumption.

‘‘There has been extraordinary growth in the 1 percent,’’ Frank said. ‘‘Ordinary people don’t want to emulate them, but what happens is that the people who are next to them want to emulate them, and so on. That social cascade ultimately explains why the middle-class home got 50 percent bigger in the past three decades.’’ (Frank says that the average U.S. house went from 1,570 square feet, or 146 square meters, in 1970 to 2,300 square feet in 2007.)

This cascading increase in consumption can have what some of us might consider to be benign effects – everyone working harder and more women entering the workforce. But it can also have malign ones. In ‘‘Expenditure Cascades,’’ a paper Frank wrote with Adam Seth Levine and Oege Dijk, the three show that more bankruptcies, a higher divorce rate and longer commutes all correlate with increased income inequality.

A draft study by two University of Chicago economists that is attracting a lot of attention in the academy supports this view. Marianne Bertrand and Adair Morse coined the term ‘‘trickle-down consumption,’’ and in their paper of the same name they find that higher spending, bankruptcy and self-reported financial distress all increase if you live in a community with higher income inequality, compared with one with lower income inequality.

The concepts of ‘‘trickle-down consumption’’ and ‘‘expenditure cascades’’ help to explain one of the great mysteries of the past decade in U.S. politics and society. Income inequality has been on the rise since the late 1970s, but it is only since the financial crisis that it has gained any real traction in public life. That may be because increased consumption masked growing inequality. Retail therapy meant the 99 percent didn’t notice that the 1 percent was pulling away.

Bertrand and Morse offer empirical evidence of an important explanation for why that was possible. In areas with higher income inequality, politicians were more likely to support measures to make consumer credit cheaper and more available. People weren’t talking about inequality much before 2009, but they felt it. And, America being a democracy, the political system worked to soften it. Interestingly, because inequality grew at a time when overt redistribution was falling out of favor, politicians made it easier to borrow.

Bill Moyers counsels President Obama not to look at America through the rose-colored glasses of people — like Robert Kagan — guided by political opportunity and wishful thinking, but through the eyes of those — like Andrew Bacevich — who see the world as it truly is, and are best poised to make it better.

Romney’s defense budget target is lofty

It is one of Mitt Romney’s most striking anecdotes. The US Navy, he says, has fewer ships today than in 1917, and the US Air Force is smaller than it was in 1947. Notwithstanding that today’s fleets are far more capable than those of yesteryear, Romney says the smaller numbers are evidence that America’s military dominance is at risk.

Romney’s solution is one of the most far-ranging, expensive, and perhaps least understood proposals of his campaign. He has vowed to commit at least 4 percent of the nation’s gross domestic product - $4 out of every $100 in the nation’s economy - to “core’’ defense spending, not including many war expenses.

The cost appears to be far greater than when Romney first broached the idea several years ago, when the nation was spending closer to 4 percent of GDP on defense. Under next year’s budget, defense spending is projected to be about 3.2 percent of GDP - yet Romney has stuck by his 4 percent vow. Put another way, that means Romney proposes spending 61 percent more than Obama at the end of a decade-long cycle, according to the libertarian Cato Institute.
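To make the scale of that pledge concrete, here is a back-of-the-envelope sketch in Python. The GDP level and growth rate are illustrative assumptions of mine, not figures from the Globe or Cato; the point is only how quickly a 0.8-point gap in GDP share compounds over a decade.

```python
# Rough illustration of the gap between a 4 percent-of-GDP pledge and a
# 3.2 percent-of-GDP baseline. The GDP level and growth rate below are
# illustrative assumptions, not numbers from the article or from Cato.

GDP_2013 = 16.0e12         # assumed U.S. GDP in dollars
GROWTH = 0.04              # assumed nominal GDP growth per year
PLEDGE, BASELINE = 0.040, 0.032

gap_total = 0.0
gdp = GDP_2013
for year in range(2013, 2023):               # a decade-long cycle
    gap_total += (PLEDGE - BASELINE) * gdp   # extra spending that year
    gdp *= 1 + GROWTH

print(f"One-year gap at assumed 2013 GDP: ${(PLEDGE - BASELINE) * GDP_2013 / 1e9:.0f} billion")
print(f"Cumulative ten-year gap: ${gap_total / 1e12:.2f} trillion")
```

Even under these modest assumptions, the annual gap starts around $128 billion and compounds to roughly $1.5 trillion over ten years, consistent with the trillions-of-dollars scale described below.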

Enacting such an increase at the same time that Romney wants to slash taxes and balance the budget could cost trillions of dollars and require huge cuts in domestic programs. As Romney’s website puts it matter-of-factly, “This will not be a cost-free process.’’

'Big Government' Isn't the Problem, Big Money Is

Robert Reich

Conservatives love to rail against “big government.” But the surge of cynicism engulfing the nation isn’t about government’s size. It flows from a growing perception that government doesn’t work for average people but for big business, Wall Street and the very rich—who, in effect, have bought it. In a recent Pew poll, 77 percent of respondents said too much power is in the hands of a few rich people and corporations.

That view is understandable. Wall Street got bailed out by taxpayers, but one out of every three homeowners with a mortgage is underwater, caught in the tsunami caused by the Street’s excesses. The bailout wasn’t conditioned on the banks helping these homeowners, and subsequent help has been meager. The recent settlement of claims against the banks is tiny compared with how much homeowners have lost. Millions of people are losing their homes or simply walking away from mortgage payments they can no longer afford.

Homeowners can’t use bankruptcy to reorganize their mortgage loans because the banks have engineered laws to prohibit this. Banks have also made it extremely difficult for young people to use bankruptcy to reorganize their student loans. Yet corporations routinely use bankruptcy to renege on contracts. American Airlines, which is in bankruptcy, plans to fire 13,000 people—16 percent of its workforce—while cutting back health benefits for current employees. It also intended to terminate its underfunded pension plans, until the government agency charged with picking up the tab screamed so loudly that American backed off and proposed to freeze the plans.

Not a day goes by without Republicans decrying the budget deficit. But its biggest driver is Big Money’s corruption of Washington. One of the federal budget’s largest and fastest-growing programs is Medicare, whose costs would be far lower if drug companies reduced their prices. It hasn’t happened because Big Pharma won’t allow it. Medicare’s administrative costs are only 3 percent, far below the 10 percent average of private insurers. So it would be logical to tame rising healthcare costs by allowing any family to opt in. That was the idea behind the “public option.” But health insurers stopped it in its tracks.

The other big budget expense is defense. The US spends more on its military than China, Russia, Britain, France, Japan and Germany combined. The “basic” military budget (the annual cost of paying troops and buying planes, ships and tanks—not including the costs of actually fighting wars) keeps growing. With the withdrawal of troops from Afghanistan, the cost of fighting wars is projected to drop—but the base budget is scheduled to rise. It’s already about 25 percent higher than it was a decade ago, adjusted for inflation. One big reason is that it’s almost impossible to terminate large military contracts. Defense contractors have cultivated sponsors on Capitol Hill and located their facilities in politically important districts. Lockheed, Raytheon and others have made national defense America’s biggest jobs program.

“Big government” isn’t the problem. The problem is the Big Money that’s taking over government. Government is doing fewer of the things most of us want it to do—providing good public schools and affordable access to college, improving infrastructure, maintaining safety nets and protecting the public from dangers—and more of the things big corporations, Wall Street and wealthy plutocrats want it to do.

Some conservatives argue that we wouldn’t have to worry about this if we had a smaller government to begin with, because big government attracts Big Money. On ABC’s This Week a few months ago, Congressman Paul Ryan told me that “if the power and money are going to be here in Washington…that’s where the powerful are going to go to influence it.” Ryan has it upside down. A smaller government that’s still dominated by money would continue to do the bidding of Wall Street, the pharmaceutical industry, oil companies, agribusiness, big insurance, military contractors and rich individuals. It just wouldn’t do anything else.

Millionaires and billionaires aren’t donating to politicians out of generosity. They consider these expenditures to be investments, and they expect a good return on them. Experts say the 2012 elections are likely to be the priciest ever, costing an estimated $6 billion. “It is far worse than it has ever been,” says Senator John McCain. And all restraints on spending are off now that the Supreme Court has determined that money is “speech” and corporations are “people.”

Unless one is seized by avarice or psychotic obsession, all a human being wants is a pleasant and possibly long life, consuming what is necessary to keep fit and to make love. “Civilization” is the pompous name given to all the political or moral values that make the pursuit of this lifestyle possible. Meanwhile, the financial dogma states that if we want to be part of the game played in banks and markets, we must give up a pleasant, quiet life. We must give up civilization.

But why should we accept this exchange? Europe’s wealth does not come from the stability of the Euro or international markets, or the managers’ ability to monitor their profits. Europe is wealthy because it has millions of intellectuals, scientists, technicians, doctors, and poets. It has millions of workers who have augmented their technical knowledge for centuries. Europe is wealthy because it has historically managed to valorize competence, and not just competition, to welcome and integrate other cultures. And, it must be said, it is also wealthy because for four centuries it has ferociously exploited the physical and human resources of other continents.

We must give something up, but what exactly? Certainly we must give up the hyper-consumption imposed on us by large corporations, but not the tradition of humanism, enlightenment, and socialism—not freedom, rights, and welfare. And this is not because we are attached to old principles of the past, but because it is these principles that make it possible to live decently.

The prospect of a revolution is not open to us. The concept of revolution no longer corresponds to anything, because it entails an exaggerated notion of the power of political will over the complexity of contemporary society. Our main prospect is to shift to a new paradigm not centered on product growth, profit, and accumulation, but on the full unfolding of the power of collective intelligence.

Franco Berardi Bifo

HOLDING COURT

The legal challenges to ACA, which the Supreme Court will hear next week, center on its key provision, the individual mandate. The mandate essentially requires all adults to obtain health insurance, either through their employers or by buying it themselves. (There will be subsidies for those who cannot afford it.) The idea of a health-insurance mandate first came to wide public notice in 1989, in the form of a proposal from the Heritage Foundation, one of Washington’s venerable right-wing think tanks. In subsequent years, the idea was embraced by a variety of politicians, mostly conservatives, including Newt Gingrich and Orrin Hatch. In Massachusetts, of course, Governor Mitt Romney made it the centerpiece of his health-care-reform plan. For decades, no one suggested that an individual mandate was unconstitutional.

This was understandable. The principal constitutional justification for the individual mandate is the Commerce Clause, which gives Congress the right to regulate commerce among the states. The meaning of this provision has been settled since shortly after the New Deal. The Court has said repeatedly that Congress has the right to regulate economic activity even if it takes place only within a state. 

The main argument that opponents of the health-care law have come up with is that the mandate regulates economic inactivity—i.e., not buying insurance—and the Commerce Clause allows only the regulation of economic activity. In the first appellate review of the law, last summer, the Sixth Circuit demolished that argument. The court pointed out that there are two unique characteristics of the market for health care: “(1) virtually everyone requires health care services at some unpredictable point; and (2) individuals receive health care services regardless of ability to pay.” Thus, there was no such thing as “inactivity” in the health-care market; everyone participates, even if he or she chooses not to buy insurance. Indeed, the choice to forgo insurance imposes a direct cost on the taxpayers, who wind up footing the bill. Those choices by consumers, especially in the aggregate, represent an economic matter that Congress may decide to regulate.

The precedents supporting the constitutionality of ACA haven’t changed, but the federal judiciary, including the Supreme Court, has. As in the Senate, moderate Republicans held sway for years at the Supreme Court, but that species has vanished on both sides of First Street. The likes of Lewis Powell and Sandra Day O’Connor have been replaced by the likes of John Roberts and Samuel Alito. In order to strike down health-care reform, the new Republican Justices would have to change the underlying constitutional law, which they have proved themselves more than capable of doing. They have already cut a swath through the Court’s precedents on such issues as race, abortion, and campaign finance, and it’s possible that they will assemble the votes to do the same on the scope of the Commerce Clause. The high-stakes health-care case is a useful reminder of the even higher stakes in the Presidential election.


George and Mitt Romney & the Death of Moderate GOP

Mitt’s father, George, was once the great hope of moderate Republicans. (Photo: Art Shay / Polaris)

The moderate Republicans of the 1960s were supporters of the free-enterprise system. They distrusted the then-overwhelming power of trade unions. They disliked the bureaucracy of the New Deal spending programs. Yet they did not altogether oppose social insurance. They favored voucher-style programs that delivered benefits without bureaucracy. Many were drawn to Milton Friedman’s concept of a negative income tax: the government would set a number that every American was entitled to. If an American earned less, he or she would receive back from the IRS a check necessary to bring him or her up to the guaranteed minimum. That idea went nowhere, but many other ideas appealing to moderate Republicans were enacted in the 1960s, forming the basis of much of our modern welfare system. We don’t build public housing anymore. We have Section 8 benefits that enable poor people to rent homes in private apartments. We don’t have a Federal Food Administration. We have food stamps.
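As a concrete illustration of the negative-income-tax mechanism described above, here is a minimal sketch in Python. The $12,000 guarantee is a made-up number, and the dollar-for-dollar top-up follows the simple description in the text; Friedman’s actual proposal phased the subsidy out at a fractional rate to preserve work incentives.

```python
def negative_income_tax_check(earnings: float, guarantee: float = 12_000.0) -> float:
    """Return the IRS top-up check under the simple scheme described above.

    If earnings fall below the guaranteed minimum, the check makes up the
    difference; otherwise no check is paid. (The $12,000 guarantee is a
    hypothetical figure, and Friedman's own proposal used a fractional
    phase-out rate rather than this dollar-for-dollar top-up.)
    """
    return max(0.0, guarantee - earnings)

# Someone earning $9,000 would receive a $3,000 check; someone at or
# above the guarantee would receive nothing.
for income in (0, 9_000, 12_000, 20_000):
    print(f"earnings ${income:>6,} -> check ${negative_income_tax_check(income):>6,.0f}")
```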

What happened to George Romney’s moderate supporters inside the GOP? To a great degree, they fell victim to cultural and demographic shifts in the U.S. As evangelical Southern conservatives and white working-class ethnics migrated from the Democratic Party into the GOP, secular Northern professionals and managers migrated out.

George Romney and those who supported him were economic winners: the more affluent, the better-educated. But they were economic winners at a time when there seemed plenty to go around for everybody. Leaving some of the gains on the table to share with the have-nots looked—in the 1950s and ’60s—like a good investment in social cohesion, especially to people who remembered the pain and turmoil of the Great Depression. Today’s economy looks much more pinched, and today’s winners don’t feel they can afford to share. Today’s winners are looking for leaders who will protect their winnings against what looks to them like a voraciously hostile environment. The GOP moderates were the ultimate “good sports” of American politics. There isn’t much room for such people in an era of “winner take all.”

Fanatics and Fanciers

By Stephen L. Carter

There is a kind of fan who is indeed a fanatic, for whom every call against his team represents an occasion to doubt the competence or impartiality of the officials. Then there is the kind of fan who is a fancier, who may root for a team but whose real passion is for the sport itself. The fanatic is the one who screams at the referee that the receiver was pushed out of bounds by the defender, and so the catch should count. The fancier is the one who calmly points out that the rule was changed a few years ago, so if the receiver is forced out, there is no catch.

The continuing political debate between those who demand an even-handed approach and those who think you should reserve your attacks for your enemies tracks precisely this distinction between fanatics and fanciers. Probably we should be unsurprised. Evolutionary psychologists insist that the dividing of the world into “us” and “them” is natural to us, a genetic holdover from our hunter-gatherer days, when we shared with our fellows and fought off attacks from strangers. We are psychologically comfortable, the theory runs, only when we know who is on our side and who isn’t, and can draw clean dividing lines between the two. So strong is this habit, researchers say, that we tend to discount suffering among members of the “out” group while highlighting it among members of the “in” group.

Instinct, then, makes us fanatics rather than fanciers. Fair enough. But the force of civilization is supposed to pull us away from instinct, toward reason. There was a time in living memory when both parties included respected senior members whose esteem for institutions and processes led them to become voices of moderation.

I remember an occasion during the Reagan administration when a relatively minor breach of Senate tradition (not even a written rule) would have allowed the Democrats to defeat the nomination of Daniel Manion, whom they bitterly opposed, for a federal appellate judgeship. A junior Democratic member tried to go against the tradition, but was immediately restrained by his more senior colleagues, for whom the prerogatives of the institution were more important than prevailing in the battle. It is difficult to imagine such a thing happening today.

Stanley Fish asserts that when great moral issues are at stake, the correct governing principle is not the Golden Rule but “be sure to do it to them first and more effectively.” I think he may be right — as long as we are thinking only of the truly great moral divides, such as the fight over slavery in the 19th century. The trouble with contemporary politics is that partisan fanatics make every issue the occasion for a double standard; and we have all too few fanciers to correct them.

Timing is Everything: On The Limits of Presidential Power

The reputations of our Presidents often turn on economic factors beyond their control. The Great Depression was a global event. The contractions in the early nineteen-eighties and in the early nineteen-nineties were driven by the Federal Reserve, as Paul Volcker sought to bring down inflation and Alan Greenspan sought to head it off before it got established. Sometimes changes in the financial markets—changes that aren’t under anybody’s direction—can prove decisive.

President Obama (in contrast to the Presidential candidate Obama) was a victim of unfortunate timing. When he entered the White House, in January, 2009, the gross domestic product and employment were both declining alarmingly, and his term in office has been largely defined by efforts to right the economy.

Setting aside a collapse in spending and an alarming rise in unemployment, the country faced at least five major economic problems when he took over: a bombed-out real-estate market; an oversized, risk-riddled financial sector; a voracious demand for fossil fuels that had to be met by imports; stagnant wages and rising inequality; and a looming entitlements crisis that threatened to swallow the budget and bankrupt the country. All these problems had been long in the making, and none of them offered up ready solutions.

As a Presidential candidate, though, Obama was not averse to raising great expectations. Talking to the Times in the summer of 2008, he noted that Ronald Reagan “ushered in an era that reasserted the marketplace and freedom.” The next President would need to bring about a similar shift, Obama said, one that reasserted the role of an activist government, “laying the groundwork, the framework, the foundation for the market to operate effectively.”

Inevitably, many progressives—including such critics as Robert Kuttner and Joseph Stiglitz—were bitterly disappointed when Obama, in his first two years in office, backed away from positions they favored in health care, financial regulation, climate change, energy policy, and taxation. But they missed the fact that Obama was never really one of them to begin with. Despite his references to establishing a new paradigm, he wasn’t intent on facing down the malefactors of wealth, creating a Canadian-style welfare state, or forging a German-style social compact between labor and capital. In truth, Obama was a moderate young technocrat, whose first instinct was to seek the middle ground. The moment power beckoned, he tilted instinctively toward the establishment, and, in the Democratic Party that Obama had grown up in, the establishment was pro-Wall Street.

Fun with Trends

by Richard Heinberg

Because projecting future magnitudes according to current trends requires relatively simple math, and because doing so sometimes enables analysts to make accurate short-term forecasts of things like population, sales volumes, and commodity prices, trend watching can confer a sense of mystical power. We can predict the future—and maybe even profit by doing so!

But, as Benjamin Disraeli famously said, “There are three kinds of lies: lies, damned lies, and statistics.” Indeed, projecting existing statistical trends out over long time periods often leads to absurd conclusions. Hence the necessity, when observing trends, of knowing (or at least being able to guess successfully) which ones are important or trivial, and which ones are likely to persist longer than others. A skilful trend watcher will note not only the rate of change in a given magnitude, but the rate of change to the rate of change: for example, while world population is growing at a rate of about 1.1 percent annually, its pace of growth has diminished in recent years, and that’s an important factor in forecasting world population for, say, 2050 or 2100.
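The distinction between a trend and the trend in the trend is easy to see in a toy projection, sketched below in Python. The starting population, the 1.1 percent growth rate, and the assumed pace at which that rate declines are illustrative only, not demographic forecasts.

```python
# Toy projection contrasting naive trend extrapolation with a projection
# in which the growth rate itself declines. All parameters are
# illustrative assumptions, not demographic data.

START_YEAR = 2012
POPULATION = 7.0e9     # assumed world population at START_YEAR
RATE = 0.011           # ~1.1 percent annual growth, per the text
DECAY = 0.0002         # assumed yearly decline in the growth rate

def project(target_year: int, decay: float = 0.0) -> float:
    """Project population to target_year; decay=0 is the naive case."""
    pop, rate = POPULATION, RATE
    for _ in range(target_year - START_YEAR):
        pop *= 1 + rate
        rate = max(0.0, rate - decay)   # second-order trend: the rate itself falls
    return pop

for target in (2050, 2100):
    print(f"{target}: naive {project(target) / 1e9:4.1f}B, "
          f"slowing growth {project(target, DECAY) / 1e9:4.1f}B")
```

Run it and the naive extrapolation hands you an implausible 18-billion-plus world by 2100, while letting the growth rate keep falling brings the figure back under 10 billion, which is the forecaster’s whole point.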

In addition to generalized critical thinking skills, citizens need the ability to assess statistics-based claims in order to arm themselves against people with a political axe to be honed, or a buck to be made, by hawking a particular trend. Unfortunately, policy makers don’t always have (or use) these skills, and so it’s common to encounter unrealistic assumptions about our future based on trends unlikely to continue for long. Often this is not just a matter of sloppy thinking: policy makers want certain trends to go on—because if they didn’t, there’d be hell to pay.

The obvious example is economic growth. Government officials assume (based on trends over the past few decades) that national economies will continue to expand forever, just as transportation planners assume that automobile traffic will always proliferate. Therefore more highways and suburbs are justified, and more public debt—because of course tax revenues will increase. Never mind that, in the US, there’s little basis for further real economic growth in light of debt levels and demography; or that vehicle miles traveled have actually declined significantly in recent years. Those are inconvenient trends that politicians and road-builders hope will simply go away. But will they? Certainly not if global oil supplies continue to tighten (another inconvenient trend): scarce oil will discourage both economic growth and driving.

Murder Is Not an Anomaly in War

By Chris Hedges

The war in Afghanistan—where the enemy is elusive and rarely seen, where the cultural and linguistic disconnect makes every trip outside the wire a visit to hostile territory, where it is clear that you are losing despite the vast industrial killing machine at your disposal—feeds the culture of atrocity. The fear and stress, the anger and hatred, reduce all Afghans to the enemy, and this includes women, children and the elderly. Civilians and combatants merge into one detested nameless, faceless mass. The psychological leap to murder is short. And murder happens every day in Afghanistan. It happens in drone strikes, artillery bombardments, airstrikes, missile attacks and the withering suppressing fire unleashed in villages from belt-fed machine guns.

Military attacks like these in civilian areas make discussions of human rights an absurdity. Robert Bales, a U.S. Army staff sergeant who allegedly killed 16 civilians in two Afghan villages, including nine children, is not an anomaly. To decry the butchery of this case and to defend the wars of occupation we wage is to know nothing about combat. We kill children nearly every day in Afghanistan. We do not usually kill them outside the structure of a military unit. If an American soldier had killed or wounded scores of civilians after the detonation of an improvised explosive device against his convoy, it would not have made the news. Units do not stick around to count their “collateral damage.” But the Afghans know. They hate us for the murderous rampages. They hate us for our hypocrisy.

The scale of our state-sponsored murder is masked from public view. Reporters who travel with military units and become psychologically part of the team spin out what the public and their military handlers want, mythic tales of heroism and valor. War is seen only through the lens of the occupiers. It is defended as a national virtue. This myth allows us to make sense of mayhem and death. It justifies what is usually nothing more than gross human cruelty, brutality and stupidity. It allows us to believe we have achieved our place in human society because of a long chain of heroic endeavors, rather than accept the sad reality that we stumble along a dimly lit corridor of disasters. It disguises our powerlessness. It hides from view the impotence and ordinariness of our leaders. But in turning history into myth we transform random events into a sequence of events directed by a will greater than our own, one that is determined and preordained. We are elevated above the multitude. We march to nobility. But it is a lie. And it is a lie that combat veterans carry within them. It is why so many commit suicide.               

Right Minds

By Samuel Goldman

The French Revolution was not the first revolution in human or even European history. Mobs had ruled the streets before; princes had often enough been deposed. Yet Burke insisted that the Revolution was “the most astonishing thing that has hitherto happened in the world.” What was so astonishing about it?

Burke’s answer was that the French Revolution was the consequence of an extraordinary new theory of society. According to this theory, which Burke attributed to the philosophers of the Enlightenment, human beings are naturally free and self-sufficient. Because each man is potentially a Crusoe, any relations between individuals are essentially voluntary.

The question, then, is whether the “chains” that bind one person to another reflect the will of every individual involved. If so, they are legitimate—a term that Jean-Jacques Rousseau was the first to transform from a principle of dynastic succession into the moral justification of rule as such. If not, they lack moral authority and may be rejected, potentially with violence. So, in Burke’s view, went the philosophical argument behind the revolution.

This reasoning was mistaken, Burke argued, not so much in its logical structure as in its first principle. In fact, human beings are born into networks of sympathy, obligation, and authority. These networks make us what we are, transforming unformed potential and dispositions into concrete identities. On this view, there is no Archimedean point from which the legitimacy of existing social relations can be assessed. As Maistre put it in a brilliant formulation, “In the course of my life, I have seen Frenchmen, Italians, Russians… But, as for Man, I declare that I have never met him in my life. If he exists, I certainly have no knowledge of him.”

If the social arrangements that characterize national communities are background conditions of humanity, they are not legitimatized by the consent of those who participate in them at any given time. Instead, they derive their authority from the way that they bind together past, present, and future in an enduring partnership. It follows that men and women of today have no right to dissolve the partnership in which they are involved merely because it seems inconvenient to them. Society, which always means a particular society, is an “entailed inheritance,” like a landed estate whose owner is legally prohibited from selling.

The central principle of conservatism was authority. Specifically, conservatism was an attempt to justify theoretically the political and social hierarchies that the French Revolution challenged. This means that classical conservatism is inextricable from anti-egalitarianism. Its major premise is that men are not created equal—and that they actually become less and less so as they develop their faculties through the enactment of various social roles.

What does this backward-looking, theologically inflected ideology of hierarchy have to do with the contemporary American conservative movement? The answer is: not much. In addition to the historical distance, the concept of individual rights imposes an unbridgeable theoretical gap between the two positions. Classical conservatism is essentially communitarian, and locates individuals in structures of obligation that are not derived from their choice or consent. The American conservative movement, on the other hand, appeals to many of the same beliefs about natural freedom and equality that inspired the French Revolution.

Bank of America: Too Crooked to Fail

By Matt Taibbi

It’s been four years since the government, in the name of preventing a depression, saved this megabank from ruin by pumping $45 billion of taxpayer money into its arm. Since then, the Obama administration has looked the other way as the bank committed an astonishing variety of crimes – some elaborate and brilliant in their conception, some so crude that they’d be beneath your average street thug. Bank of America has systematically ripped off almost everyone with whom it has a significant business relationship, cheating investors, insurers, depositors, homeowners, shareholders, pensioners and taxpayers. It brought tens of thousands of Americans to foreclosure court using bogus, “robo-signed” evidence – a type of mass perjury that it helped pioneer. It hawked worthless mortgages to dozens of unions and state pension funds, draining them of hundreds of millions in value. And when it wasn’t ripping off workers and pensioners, it was helping to push insurance giants like AMBAC into bankruptcy by fraudulently inducing them to spend hundreds of millions insuring those same worthless mortgages.

But despite being the very definition of an unaccountable corporate villain, Bank of America is now bigger and more dangerous than ever. It controls more than 12 percent of America’s bank deposits (skirting a federal law designed to prohibit any firm from controlling more than 10 percent), as well as 17 percent of all American home mortgages. By looking the other way and rewarding the bank’s bad behavior with a massive government bailout, we actually allowed a huge financial company to not just grow so big that its collapse would imperil the whole economy, but to get away with any and all crimes it might commit. Bank of America is not just Too Big to Fail; it is far too crooked to survive.