Writing about history, politics, and climate

Originally appeared on Medium

Was the 1889 “Russian Flu” a COVID Doppelganger?

And could a long-ago pandemic help us predict COVID’s course?

Prince “Eddy” the year before his death (Wikimedia Commons)

Prince Albert Victor — Prince Eddy to his family — was the dashing young grandson of Queen Victoria. He was only 28 years old in 1892, and he was second in line to the British throne. He’d just become engaged, and celebrated the new year with a shooting party in the countryside. He was the picture of youthful, noble vigor. Fourteen days after the shooting party, he was gasping for his last breaths as he died of pneumonia.

Eddy’s death was just one of many caused by the “Russian Flu,” which first appeared in 1889 and then swept across the planet, coming back in seasonal waves of infection over the next few years. The disease caused high fevers and terrible lung inflammation. Some of the infected complained that they could no longer taste or smell; in some cases, the disease affected many of their body’s systems, not just the lungs. Many patients suffered long-term symptoms, with the disease making it difficult for them to resume normal life for months, or even years. Weirdly, the disease killed the old much more than the young — children, usually the demographic most vulnerable to disease, seemed unaffected by it.

Sound familiar?

There might be a reason for that. The “Russian Flu” may not have been a flu at all. Some researchers are guessing that it may have been a coronavirus similar to the one that has caused so much suffering over the last two years, and that the fate of the 1889 pandemic might offer some clues to our future.

To be clear, the theory that the Russian Flu was a coronavirus is unproven, and we will likely never really know what caused the disease. The 1889 outbreak occupied a curious place in the history of the human relationship with disease. Medical science and communications technology were advanced enough that this pandemic was the first to be tracked in something like real time around the world.

We know when and where the pandemic started (Siberia, May 1889) and its path from there (across eastern Europe, into North America by January 1890, hitting remote New Zealand by May 1890). Doctors also understood the basics of disease transmission by this point; most no longer thought that illness came from “foul air.” We understand that the pandemic killed as many as a million people on a planet of 1.5 billion.

But we will never be able to definitively identify or analyze the pathogen that caused the pandemic. This is because the people who lived through it didn’t know what would help future scientists. After all, researchers only realized that there must be a category of life called “viruses” in 1892. Nobody saw a virus until the 1930s, when the electron microscope was invented. We don’t have any samples of the pathogen; nobody thought to keep them.

But it’s hard to ignore the parallels between our coronavirus outbreak and the events of 1889. First, the 1889 virus seems — just like our current foe — to have been deadly, but not so deadly that society united in order to defeat it. Many of the people who died had other medical problems that made them vulnerable to pneumonia, so it was hard to say what had killed them. Official death tolls varied. Many dismissed the disease and went on with their business. The New York Evening World concluded, “It is not deadly, not even necessarily dangerous, but it will afford a grand opportunity for the dealers to work off their surplus of bandanas.”

In other accounts, the disease sounds pretty terrible. Doctors, who saw the worst of the disease firsthand, understandably took it more seriously than some journalists. One British doctor described his patients’ symptoms, and it sounds rough:

The invasion is sudden; …with acute pains in the back … often accompanied by vertigo and nausea, and sometimes actual vomiting of bilious matter. There are pains in the limbs and general sense of aching all over; frontal headache of special severity; pains in the eyeballs, increased by the slightest movement of the eyes; shivering; general feeling of misery and weakness, and great depression of spirits, … weeping; nervous restlessness; inability to sleep, and occasionally delirium. In some cases catarrhal symptoms are observed… eyes are injected; sneezing and sore throat; and epistaxis, swelling of the parotid and submaxillary glands, tonsilitis, and spitting of bright blood from the pharynx may occur. There is a hard, dry cough of a paroxysmal kind, worst at night. …There is often tenderness of the spleen. The temperature is high at the onset (100° F. in mild cases to 105° F in severe cases).

There’s more than just similarity in symptoms to suggest that the 1889 pandemic was a coronavirus. Well before COVID-19 came on the scene, Belgian researchers used genetic analysis to suggest that coronavirus OC43, which now causes some percentage of our routine common colds, diverged from bovine coronaviruses around 1890.

If it’s true that the 1889 pandemic was caused by a zoonotic coronavirus, its path may offer some insight into where we’re headed. The Russian Flu wreaked havoc around the world for years and caused a lot of deaths. But eventually, it mutated, or perhaps human immune systems adapted to it. Now, it’s possible that a version of the virus that killed a million people in the late nineteenth century just causes runny noses and sneezes.

Most experts expect COVID to become endemic, and many expect that it will cause less severe disease as more of us get immunity and the virus continues to evolve. It’s possible — though by no means certain — that decades from now, the virus that causes COVID is responsible for the sniffles and little else, and that the devastation it caused is a distant memory, just like the 1889 pandemic.

The Short, Strange Saga of American Colonies in Cuba

Or: what happens when a bunch of Minnesota Swedes move to the tropics?

An American “pineapple patch” in La Gloria, Cuba, circa 1900 (public domain)

When Americans think about Cuban immigration, they generally imagine people coming from Cuba to settle in the United States. But there have been several periods where the flow of people has gone in a different direction. In one now-forgotten episode, thousands of Americans bought up land and moved to Cuba, often recreating small midwestern towns in the Caribbean.

The United States has, of course, had a long history of meddling in Cuban affairs. At various points, Americans flirted with annexing the island as a slave state, invaded it in 1898, retained the Guantanamo military base as American soil, manipulated Cuban politics, launched an invasion at the Bay of Pigs, almost started World War III over Soviet missiles, repeatedly tried to assassinate Fidel Castro, and imposed a crippling economic embargo on the island that has outlasted the actual Cold War by three decades.

In the middle of all this, just after the Spanish-American War, thousands of Americans settled in Cuba. They formed over 80 colonies on the island, turning parts of the tropical Caribbean into little outposts of Nebraska or New Hampshire.

Americans got the opportunity to snap up land in Cuba in the early twentieth century largely because of the U.S. invasion of the island in 1898. Though the U.S. had ostensibly fought the Spanish-American War to help Cuba become independent from Spain, the war created conditions that made Cuba ripe for American involvement after the fighting was over.

The American invasion had disrupted the Cuban economy, and land records were a mess when the Americans took charge on the island (the U.S. occupied Cuba between 1898 and 1902, and then again between 1906 and 1909). Many landowners had sunk so deep into debt that they were willing to sell their land at any price. Americans surveyed the island and threw out any claims to land that they felt were spurious. American businessmen constructed a new railroad that stretched from east to west on the island, which made newly-available land seem like an excellent investment to Americans, who envisioned growing produce and shipping it back to the States. Many of the Americans who moved there expected that the United States would soon annex Cuba.

The Cuba Railroad Company and other large landowners on the island advertised the idea of living in paradise to Americans. A steamship company that ferried passengers from New York to eastern Cuba promised,

Cuba is an ‘all-the-year-round’ country. There is no unproductive season. There is no snow, no frost, no time when vegetation refuses to grow or to bear fruit, no month when livestock must be housed and cared for.

It sounded like paradise, and pitches like this attracted tens of thousands of Americans.

The Isle of Pines

Americans especially flooded to the Isle of Pines (now known as Isla de la Juventud), a large island to the south of Havana. The initial treaties at the end of the Spanish-American War left it unclear whether the island would become part of Cuba or end up under American jurisdiction. Opportunistic Americans rushed to occupy the island in order to create facts on the ground.

Within a few years of the war, Americans owned at least 95% of the island. The Americans later lobbied the American government to formally acquire the island, publishing a 1916 pamphlet called “Isle of Pines: American or What?” The pamphlet extolled the island’s

magnificent climate, its modern businesses and financial institutions, schools, churches, &c; its thousands of acres of citrous [sic] fruit, pineapples, and vegetables… And still we must plead for recognition by the United States, our parent Government.

The Isle of Pines eventually housed at least 2,000 American residents, a number at least equal to the number of pinieros, the island’s Cuban inhabitants. These settlers established English-speaking schools, towns, and newspapers.

These settlers were disappointed when the United States finally ratified the Hay-Quesada Treaty in 1925, which gave the island back to Cuba. Though the treaty had been negotiated in 1903, pressure from the Isle of Pines’ American residents slowed the Senate’s ratification processes, as settlers made pleas like this 1924 letter:

American settlers have been subject to an unlawful, most humiliating and unbearable de facto Cuban Government for over 20 years. Many have lost faith and left; some died in despair, and a great majority are holding on to their property in the firm belief and faith that our Government will live up to its representations that the ‘Isle of Pines is United States Territory.’

After the treaty’s ratification, the number of American settlers on the Isle of Pines declined to a few hundred. Some settlers left because they weren’t as economically successful as they had expected; large hurricanes in 1917 and 1926 hastened their economic problems. Others left because they didn’t want to live under a Cuban government. The Isle of Pines subsequently became best known for housing Cuba’s most notorious prison.

The Midwest in Cuba

Americans didn’t just flock to the disputed territory of the Isle of Pines. As many as 44,000 Americans emigrated to Cuba between 1903 and 1919. Many came and went quickly, but between 10,000 and 20,000 Americans seem to have put down roots on the island. Many of these Americans formed colonies where they could speak English and maintain a version of American culture.

The biggest of the American settlements, La Gloria, a town of 1,000 residents, was built on a lie. Just after the Spanish-American War, the Cuba Land and Steamship Company advertised and sold plots in a neatly laid-out town called La Gloria. When settlers got to the island, they found nothing of the sort; their “town” was a mangrove swamp. Many of them simply turned around and went home, but enough stayed to build a real town, surrounded by sugarcane and citrus farms. La Gloria eventually became a thriving small town.

A group of Swedish-Americans from Minnesota fled the cold climate of the upper midwest to establish the town of Bayate in southeastern Cuba. The Swedish Land and Colonization Company advertised cheap trips ($45!) to Cuba, hoping to convince Minnesotans to establish permanent residence on the island. Many of the Swedish-American settlers were people who had become dissatisfied with life in the upper midwest, which they had discovered to their horror was even more unpleasant than the frigid nation they had emigrated from.

Another of the colonies, Omaja (pronounced like the city in Nebraska, get it?), involved similarly big dreams that didn’t come true. Chicago’s Cuba Land, Loan, and Title Guarantee Company bought and deforested 12,000 acres, starting in 1903. They hoped to get 1,400 families to live in the town — in homes modeled on those in the midwest — with many others occupying small farms and large citrus plantations around it.

Omaja never quite reached the heights that its founders dreamed of, but it did house over 250 Americans by 1915 or so. It eventually had a number of churches, stores, a barbershop, a school, and social clubs for the American men and women. The surrounding plantations were more successful, becoming major producers of grapefruits and oranges.

The decline and fall of American Cuba

The American colonies in Cuba were briefly successful, but almost all of them were in steep decline by the end of the 1920s. Several factors contributed to their downfall in the years after World War I.

As we saw on the Isle of Pines, a number of hurricanes devastated American farms. La Gloria, for example, was home to a thousand Americans until a 1932 hurricane destroyed much of the area, causing many of the Americans to return to the States.

American tariffs, along with periodic bans of Cuban fruit because of invasive fruit flies, squeezed the American farmers in Cuba. Many of those who had dreamed of easy riches farming tropical plants in paradise had to return to America disappointed and broke.

The American colonies also got caught up in the political turmoil of early twentieth-century Cuba. When a political faction called the Liberals rose up to oppose an election they had deemed fraudulent in 1917, they clashed with government troops. One of their major battles happened in Omaja. Several plantations were burned, and the railroad that serviced the area was damaged, meaning that farmers couldn’t export their fruit for a year. Liberal soldiers also attacked La Gloria as part of the same conflict. An earlier episode of civil unrest, this one along racial lines, led to the death of one of the Swedish-American settlers in 1912. Investment from the U.S. dried up because the area was deemed “unsafe,” and many of the Minnesotans went back home.

The American colonies in Cuba became a strange footnote in American history, forgotten by most Americans. But in Cuba, people remembered. The fact that Americans had snapped up large parts of the island and tried to turn them into little outposts of Nebraska or Minnesota fit into a long narrative of Americans exploiting Cuba with little regard for the Cuban people’s rights. Cuban anger toward American meddling built for decades until Fidel Castro’s revolution severed the ties once and for all in 1959.

Now, of course, few Americans can visit Cuba. The two nations, so close geographically, have been separated by history. I hope there’s a time soon when Americans can again visit Cuba — without exploiting it.

How the Filibuster Emptied Out American Politics

If Congress can’t legislate, what exactly do congresspeople do?

The 111th U.S. Senate (public domain)

When did America’s problem with the filibuster begin?

Was it in 1805, when Vice President Aaron Burr — recently charged with the killing of Alexander Hamilton — suggested that the Senate clarify its rules? The Senate then got rid of the “previous question” motion, which allowed them to end debate and move to a vote; this change theoretically allowed senators to extend debate indefinitely, though no one did for decades.

Or was it in 1837, when Whig senators performed the first filibuster, holding the floor for hours to prevent Democrats from undoing the Senate’s censure of Andrew Jackson? This was the moment that the filibuster became an accepted practice in the Senate.

How about the 1880s, when the tactic moved from being a rare event to something that shut down the Senate’s business every couple of years? This pattern led to the Senate passing a rule in 1917 that allowed a vote of two-thirds of the members to end a filibuster (they changed it to 60 votes in 1975).

Was it the 1970s, when, after repeated filibusters over civil rights legislation, the Senate acted to prevent filibusters from grinding all legislation to a halt? Allowing the Senate’s other business to continue while one bill was being filibustered had the unintended consequence of making the tactic more common, since it was no longer seen as a “nuclear option.”

Or has it been in the last 20 years, when Senators in the minority have routinely employed the rule to block pretty much everything the majority party might want to do? The Senate went from 60–80 votes per Congress on cloture — which ends debate and leads to a vote, essentially ending a potential filibuster — to well over 100 on average after 2007, peaking at over 300 invocations of cloture in 2019–2020.

Whatever set of historical events is most to blame, the effects of the filibuster are clear: nothing much happens without 60 votes in the Senate, and since it’s almost impossible for either party to win 60 seats, the Senate passes almost nothing. Our country’s biggest problems have been allowed to metastasize for two decades because of a strange procedural rule that was created by mistake.

The filibuster saps the public’s confidence in Congress

It’s more than that, though. Over the last few decades, because of the filibuster, Americans have gotten used to the idea that their government isn’t really going to do anything about most of our serious issues.

Only a few major bills that weren’t an emergency response to a national crisis have made it through the Senate in the last two decades, and many of them, like Obamacare, were so battered by the process that they didn’t accomplish nearly as much as they could have.

Simultaneously, politics has become more nationalized with the decline of local newspapers and the rise of social media. Our political debates have become more intense, but they’re about fewer things that matter. American politics has emptied out.

You can see the effect of this in Americans’ feelings about Congress. Confidence in Congress has never been great — it fell from 30–40% in the 1970s and 1980s to the 20–30% range in the 1990s and early 2000s. But the number hasn’t broken 13% since 2010, which pretty much corresponds to a period of more aggressive use of the filibuster in the Senate. In 2014, after Republicans had blocked pretty much everything the Obama Administration had tried to do for years, only 7% of Americans had confidence in their legislature.

The failure of Congress to do anything about our most serious unsolved issues — guns, immigration, inequality, climate change, and a million others — has only fed American disillusionment about the ability of government to do anything. It’s no coincidence that the president who took office after confidence in Congress hit single digits was Donald Trump — a nihilist who openly speculates that no politicians are on the level and that Washington is a “swamp.”

What do politicians do all day if they can’t pass laws?

The problem is that American legislators still have to run for office in an environment in which it’s quite unlikely that they’ll get to do any, well, legislating. So many of them have decided to make politics about something other than policy.

Some politicians have reacted to this by realizing that they can promise anything to voters and then blame the dysfunction in Washington for their failure. Trump was the master of this. He pledged to replace Obamacare with a “phenomenal” health care program, but no one in the White House ever bothered to draft anything. The Republicans made a few stabs at repealing Obamacare without replacing it, and then Trump spent the rest of his presidency blaming Democrats for his failure. Trump, weirdly, seemed to prefer to fail and blame his opposition for that failure rather than put the effort into actually succeeding. The filibuster is a perfect excuse for his politics, based on resentment and whining.

It’s not just Trump — many politicians don’t even bother with policy anymore. You can see this in the new crop of celebrity Republicans — think Lauren Boebert and Marjorie Taylor Greene — who seem far more interested in culture-war shenanigans than actual legislation. Greene was stripped of her committee assignments after she encouraged violence against Democrats, but she didn’t seem to mind. Now that she’s free from any actual job duties, she has more time to spread misinformation and slander her enemies. Figures like Ted Cruz, once a respected legal mind, now seem to mainly be internet trolls.

Republican representative Madison Cawthorn even said it out loud — he emailed other GOP congresspeople soon after he took office, telling them that “I have built my staff around comms rather than legislation.” Figures like Cawthorn, Greene, Boebert, and Trump tend to fixate on “issues” that can’t really be legislated. They spend much of their time attacking “political correctness,” “wokeism,” and other culture-war boogeymen rather than talking about problems that Congress can actually address.

Perhaps at least some of the blame for our current 24/7, all-encompassing, empty-calorie political hatefest can be assigned to the fact that neither Americans nor their Congresspeople actually expect Congress to pass laws.

You’ll note that I use mostly Republican examples in the paragraphs above. As many have noted, the GOP does not really seem interested in governing anymore. They wager that, if they don’t even try to take responsibility for solving America’s problems — while at least some Democrats do — the media and the public will blame the Democrats for trying and failing while the Republicans hide behind procedural maneuvers like the filibuster. And you know what? It’s working.

What might it look like if the filibuster disappeared? Well, the onus would actually be on winning parties to fulfill their promises to voters. Trump and the Republicans would have had to put up or shut up on immigration and health care. Democrats would have to do the same on climate, inequality, healthcare, and guns. They might actually pass their preferred policies, and then voters could assess those policies and vote in future elections based on whether Congress actually improved their lives.

Some, like Joe Manchin and Kyrsten Sinema, seem to believe that the filibuster encourages bipartisanship, but a quick look around America’s political landscape over the last two decades disproves that contention. Others might argue that we’d have too much legislative whiplash as new parties reversed each other’s policies every two years. I’d actually prefer this to the stasis we find ourselves in now — at least we’d be able to evaluate each party’s policies and vote accordingly.

The filibuster has emptied out American politics. It’s degraded any hope many Americans had that their government might actually work to make their lives better. It’s led to a content-free, conflict-laden version of politics that has raised our collective blood pressure and brought our democracy to the brink. All this because of a weird rule that the Senate backed its way into. Time for the filibuster to go.

Why Do So Many Americans Resent Teachers?

Maybe it’s because they offend the spirit of late capitalism

Photo by Adam Winger on Unsplash

Whatever educational debate we’re having in this country — school COVID measures, “critical race theory,” or charter schools — there’s one constant. A large portion of the American public doesn’t respect teachers. In fact, some of the rhetoric indicates a level of hostility and resentment that goes well beyond simple disrespect.

The stereotype goes like this: teachers are lazy and pampered. They finish work at 3 pm and have summers off. They have generous pensions. Their unions — some of the last powerful unions in the country — keep them cosseted in luxury. They’re indoctrinating your children and living soft, comfortable lives.

Now, anybody who actually knows a teacher knows that this image isn’t particularly true. First, the hours — studies have found that teachers work pretty much the same number of hours as other college graduates. The school day may end around 3, but it begins for many teachers at 7 am, and they often take work home in the evenings. Even in the summer, the average teacher works about 20 hours a week on school-related work, even though they aren’t being paid for it.

Many teachers don’t make lavish salaries. In Florida, for example, the average teacher makes less than $50,000; adjusted for inflation, Florida teachers’ salaries have declined by 12% over the last two decades. On average, teachers make about 20% less than similarly-educated workers in other industries. Many states’ teacher retirement plans have become less secure, and retirement ages have risen in many places.

So the story you often hear in the discourse about teachers isn’t really true.

But what if it were? What if you had a job that allowed you to do what you love? One that promised you a comfortable retirement, a long summer vacation, a short workday, and a union that protected you from exploitation? Wouldn’t that be… good?

I think much of the resentment of teachers comes from the fact that they do a job that doesn’t fit into the way we’ve come to expect the world to work in the bleak atmosphere of late American capitalism. Many American workers just take it for granted that corporations are powerful, and workers aren’t. We’re at the mercy of the rich and powerful, and there’s little we can do about it. We’re going to have to spend most of our waking hours working on things that we’re not particularly interested in. In exchange, we get to be employed, but our job security and income are precarious. Twenty-first century American workers are required to hustle endlessly, shoveling their time and effort into the maw of corporations so that stockholders can realize slightly higher capital gains.

That’s just the way it is. In fact, many of us have convinced ourselves that this is how it’s supposed to be.

Teaching — at least the fantasy of teaching that many commentators attack — represents an assault on many of the values of twenty-first century American capitalism.

These are people who have chosen to make less money in order to make a difference in the world. They work in organizations that are less hierarchical than the business world, doing jobs with little chance of promotion (the ladder of promotions available to teachers is quite short and unattractive to most of them).

Teachers (at least according to the stereotype) have chosen jobs with humane working hours, with a long summer break in which they can pursue other interests and recharge. They have a more secure retirement than most other Americans. They got these things because they are unionized and have fiercely resisted attempts to crush their worker power.

None of this computes in a capitalist system where you’re supposed to sell your soul to the highest bidder and be grateful for the scraps thrown your way.

Psychologists know that resentment and envy are close relatives. Maybe Americans who have given up on working a meaningful job that leaves time for a full life outside of work can’t stand to see anybody else who might be doing that. Americans who have become used to taking whatever their corporate overlords are willing to toss their way can’t stand to see a powerful union that actually takes collective action to get its workers what they want.

We could be working to make a world where everybody has a job like teachers supposedly have. Instead, many Americans just want to make sure that teachers’ jobs are as miserable as everybody else’s.

Authoritarians Always Say They’re Saving Democracy

Don’t be fooled by Trump

Donald Trump, photographed by Gage Skidmore

Well, Donald Trump is offended now. After Joe Biden said on January 6 that Trump’s attempts at subverting elections are like “a dagger at the throat of democracy,” Trump responded. He helpfully cleared things up, releasing a statement that claimed, “Remember, I am not the one trying to undermine American Democracy. I am the one trying to SAVE American Democracy.”

It would be nice if Trump was really interested in saving democracy, but he’s simply following the authoritarian playbook. He’s saying what most aspiring dictators say. When’s the last time you saw a politician announce that he hates democracy and wants to dismantle it? Instead, history more often gives us figures who claim to be embracing democracy while they’re really smothering it to death.

The grand tradition of claiming to save democracy while you’re killing it goes back at least two millennia. Augustus, the first emperor of Rome, ruthlessly took power, killing his enemies in gruesome ways. He stripped the Senate and elected officials of their power and made sure that he had full control of the sources of power that really mattered in Rome. The Roman Republic, a form of government that had allowed Roman citizens to have a say in their government for five centuries, died with him.

The one thing Augustus didn’t do was announce his actual goals — ending the Roman Republic. Instead, he portrayed himself as a traditionalist who would take Rome back to its venerated traditions. When he hunted down the senators who had assassinated Julius Caesar, he framed it as saving the republic — he later said that the tyrant-killers had “waged war on the state… [I] set free the state, which was oppressed by the domination of a faction.”

Augustus made a number of gestures to demonstrate his love for the republic. He ostentatiously declined the most visible forms of power so that he could take control of what really mattered — the military and the economy. He asked to be called “first citizen” rather than “emperor.” But by the time he died, after four decades in power, the republic had long since ceased to function.

It’s not just ancient authoritarians who created undemocratic systems while claiming to be saviors of democracy. Benito Mussolini said that the fascist governments in Italy and Germany were “the greatest and soundest democracies which exist in the world today.” Joseph Stalin called his 1936 constitution and the subsequent elections “thoroughly democratic.” Meanwhile he was conducting the Great Purge, in which many of the drafters of that constitution were imprisoned and killed. Hitler himself called Nazi Germany a “beautiful democracy.” I doubt that any objective observer would call any of these systems actually democratic.

Modern dictators also pay lip service to democracy while chipping away at its core. Vladimir Putin, after all, runs for election under Russia’s constitution — it’s just that anyone who could conceivably beat him is harassed or imprisoned, and the vote tallies are likely rigged. In countries like Russia, Hungary, and Turkey — all places ruled by autocrats whom Trump has praised — democracy has crumbled slowly. In many “elected autocracies,” the rulers end up in the same position Augustus created for himself 2,000 years ago. The veneer of democracy remains and the leader claims that his rule is the will of the people, but the people no longer have a meaningful say in the course of the government.

As Steven Levitsky and Daniel Ziblatt, scholars of “democratic backsliding,” write:

This is how elected autocrats subvert democracy — packing and “weaponizing” the courts and other neutral agencies, buying off the media and the private sector (or bullying them into silence) and rewriting the rules of politics to tilt the playing field against opponents. The tragic paradox of the electoral route to authoritarianism is that democracy’s assassins use the very institutions of democracy — gradually, subtly, and even legally — to kill it.

Authoritarians like Trump will always say they’re trying to protect democracy. But democracy doesn’t mean the same thing to them that it does to most of us. In a real democracy, people have meaningful rights and real opportunities to influence government.

To Trump, democracy may mean that government expresses the will of the people. But, as he’s shown time and time again, the only “people” who count in his eyes are those who support him.

So don’t be fooled — Donald Trump isn’t here to save democracy. He’s here to destroy it.

The Insulated Society

We can’t fix our problems if we don’t face their consequences

Photo by Egor Ivlev on Unsplash

In April, 1942, Franklin Roosevelt spoke to the American people via radio broadcast. He acknowledged that Americans had been through terrible times already — a decade of economic depression, followed by five months of a war that threatened to stretch into the foreseeable future.

Notably, Roosevelt didn’t sugarcoat things — he told people their lives were going to get harder before they got easier. After laying out his plan for the next phase of the war, he warned:

The blunt fact is that every single person in the United States is going to be affected by this program…

Are you a business man, or do you own stock in a business corporation? Well, your profits are going to be cut down to a reasonably low level by taxation…. Do you work for wages? You will have to forego higher wages for your particular job for the duration of the war.

All of us are used to spending money for things that we want, things, however, which are not absolutely essential. We will all have to forego that kind of spending…

As I told the Congress yesterday, “sacrifice” is not exactly the proper word with which to describe this program of self-denial. When, at the end of this great struggle we shall have saved our free way of life, we shall have made no “sacrifice.”

Roosevelt’s rhetoric was optimistic, but clear-eyed; he made sure that the American people understood exactly what the war would require of them. Americans made an informed decision to enter the war despite its costs because they were willing to face the consequences of their choices.

Contrast this to our society in the year of our Lord 2022. We’re sleepwalking into disaster on a number of fronts. Most significantly, our democracy is crumbling and the natural world that keeps us alive is dying. Our problems are not easy ones, but they’re solvable. The problem is that we’re not doing much to solve them. How is it that we can know that these problems exist and decide to do nothing? Part of the explanation is that we’ve become an insulated society.

What do I mean by this? Our society has found a way to keep a large number of Americans — most importantly, the people with the social, economic, and cultural capital to influence the direction of the country — packed away in soft comfort, far away from the consequences of their actions.

We’ve restructured our political and economic systems to reinforce the idea that we can do whatever we want with no ill effects.

Americans have found a way to insulate themselves from the consequences of almost every aspect of modern life, but we can use meat, war, and climate change as examples to illustrate the larger phenomenon.

Most Americans eat a lot of meat; the average American consumed 264 pounds of it in 2020. Almost all of this meat is produced in a system that is environmentally unsustainable, cruel to workers, and unspeakably brutal to the animals. But it makes chicken sandwiches cheap, so we’ve found ways to avoid seeing the consequences of the way we produce and consume meat.

Rather than face the awful system that produces our meat, we’ve insulated ourselves. The degrading, physically and psychologically traumatic work of slaughtering animals is largely done by immigrants who have little political clout. The unions that once made the job safer and better-paying (and could bring attention to the conditions in slaughterhouses) have long since been crushed.

In many states, there are actually laws that prevent activists from taking video footage inside slaughterhouses, lest consumers see what is going on inside their food-processing facilities. These “Ag-Gag” laws serve no purpose other than to cover up what is happening in slaughterhouses, in order to keep Americans in blissful ignorance about what has to happen to get bacon onto their breakfast plates.

These conditions insulate Americans from the moral and environmental consequences of meat-eating — you’re not eating an animal, just a shrink-wrapped burger that magically appeared in the grocery store! — and, most importantly, they insulate the wealthy people who profit from this awful system.

We’ve also insulated ourselves from the wars we’ve fought over the last two decades. Unlike in the Second World War and the Korean and Vietnam Wars, we didn’t hold a draft in our recent wars in Afghanistan and Iraq. The fighting was done far away, largely by people without much power in our society. The modern military is disproportionately made up of members of the lower-middle class, and people of color are overrepresented in several service branches.

The fact that few Americans serve in the military — and that many of them come from demographic groups who are not often listened to by the powerful in our society — has allowed our political class, and the voters that elected them, to talk tough about war without ever having to directly face the effects of these wars. George W. Bush famously told people that we wouldn’t have to sacrifice at all for the wars he started; instead, we got a tax cut and were advised to keep shopping. It’s not a coincidence that Bush himself pulled strings as a privileged young man to avoid the draft.

Sure, we clap for the veterans at baseball games and awkwardly thank them for their service, but otherwise we’d prefer not to think too much about them. The thousands of soldiers that died, the thousands more that came home with life-altering injuries, and — especially — the foreigners whose lives were destroyed by American wars have been tucked neatly away. We’ve insulated ourselves from the wars that have devastated the lives of our veterans and the people in the places where the fighting happened.

We’ve insulated ourselves from the climate crisis, as well. Climate change is here, and it’s getting worse. The last few years have seen a number of disasters — wildfires, hurricanes, flooding, heat waves — that are at least partially attributable to climate change. The planetary crisis will cost us money whether or not we want it to — we can either invest up front in solar farms and mass transit, or we can pay later to clean up after hurricanes and resettle climate refugees.

It’s well past time to take action, but we don’t seem to be working very hard to make that happen. Again, it’s because most Americans don’t have to face the effects of climate change very directly. Most of the pain from climate change will happen outside the borders of this country. The Americans who are contributing most to climate change (it will not surprise you to discover that wealthy Americans have vastly larger carbon footprints than the rest of the country) will likely be just fine. They’re not the people who will die in a heat wave — as Sacoby Wilson, an environmental health professor, says, “heat waves are for the poor.” Theirs aren’t the homes in the low-lying areas that will flood more and more often. These are things that will happen to other people, in other places. And if it gets hot, they can just crank up the air conditioning or go on vacation someplace cooler.

We’ve insulated ourselves from the consequences of lots of other decisions, as well. One of the reasons that folks can rant and rave about the ways in which their freedoms are threatened by public-health mandates for COVID is that they don’t have to watch people struggle for their last breaths in intensive-care units. Suburbanites don’t get too wound up by voting restrictions because most of the long lines and inconveniences will take place elsewhere — for wealthy white Americans, voting will still be easy.

Most Americans have found ways to insulate themselves from the consequences of their actions, but it’s especially bad among the wealthiest among us. It’s important to understand that, to politicians, all voters are not created equal. In fact, politicians don’t really listen to average voters. Widespread support for a policy among average voters has no significant effect on whether Congress passes that policy. But when the wealthy support a policy — you guessed it! — Congress jumps into action. And, of course, politicians have found that it’s easier to sell a have-your-cake-and-eat-it-too set of promises to voters rather than confront Americans with the hard truths that we face.

Americans — especially the wealthy — have swaddled themselves in thick, comfortable blankets of privilege. They don’t have to face the effects of their selfishness. They don’t have to understand that there are real consequences for real people at the other end of their decisions. Until Americans — and the politicians who lead us — can face reality more directly, we’re in real trouble.

I Get Why People Tried to Forget the 1918 Pandemic

Will we want to remember the last two years once it’s all over?

Seattle police in 1918 (Wikimedia Commons)

One of the most striking things about John Barry’s book on the 1918 “Spanish” Flu pandemic, The Great Influenza, is what happened, or actually didn’t happen, after the pandemic had ended.

The 1918 pandemic was clearly a defining event in millions of people’s lives. Maybe a third of the human race caught the disease. Something like 100 million people died worldwide — likely more than died during the Black Death in the 1300s. Over 675,000 Americans died in a country with less than one-third the population of the U.S. today. And the disease tragically targeted children and young adults.

Yet, despite all of this, after it was over, the 1918 pandemic just… disappeared from the popular imagination. No great novels or memoirs were written about it, no movies or plays produced. Popular culture basically ignored the flu, and so did most historians for the next few decades. People didn’t seem to want to talk about it once it was over. In the words of historian Alfred Crosby, it was a “forgotten pandemic.”

I’m starting to understand why.

What do we remember?

I’m willing to bet that we’ll memory-hole our current pandemic just like the survivors of the 1918 flu did. These two years (and please, God, let it be just two years) won’t show up in a lot of books, movies, and TV.

Do you think, in a few years’ time, you’ll have an appetite for watching a TV show set in 2020, where the main characters learn to use Zoom, homeschool their kids, and talk about who is fully vaccinated? Will you ever, ever, want to have a conversation that includes the words “herd immunity” or “spike protein” again? I feel like the first time I see a TV character in a mask, my PTSD will kick in and I’ll lunge for the remote.

We choose to remember some events and not others, after all, both in our personal lives and as a society. If you asked an average American to list the most important events of the last century, they would probably tell you about a collection of wars and political events, with a few other things thrown in. A 2016 Pew poll that asked Americans to name the most important events of their lifetimes was topped by September 11, Obama’s election, the “tech revolution,” John F. Kennedy’s assassination, and the Vietnam War.

What do most of these things have in common? They make a good narrative. There are turning points and key moments; there are heroes and villains. At the core of each of them is human agency. Think about 9/11 — Osama bin Laden masterminded terror attacks, and George W. Bush’s United States responded by transforming our own society and invading two countries.

Our cultural memory of the events around 9/11 has heroes (remember how we lionized cops and firefighters?), people making life-and-death decisions (Flight 93), dramatic set pieces (George Bush with his bullhorn on the rubble), and dramatic consequences for the United States and the world (two wars, thousands dead, trillions spent, a new understanding of the balance between security and liberty, etc.).

But to live through a pandemic, as we’ve found, is simply a long and uncertain slog. There’s very little narrative shape to it — for most of us, just day after day in sweatpants. It consists primarily of waiting; if you’ve been lucky, the defining experience of the pandemic has been deferred plans.

A giant event that made life smaller

Even though it’s been a huge, tragic global event, most people’s experiences of the pandemic have been small, and who wants to remember when life became smaller? American pandemic life has become smaller in a number of ways.

Outside of hospitals, there’s been very little heroism. Even the heroics we’ve seen from medical professionals have been a form of grim endurance, rather than amazing feats. It’s incredibly admirable, but not the stuff of a grand narrative — there doesn’t seem to be a lot of drama, just sadness, exhaustion, and burnout.

I doubt there will be many hospital dramas set in the age of COVID, in which exhausted doctors try in vain to save COVID patients who could have saved their own lives with a free vaccine. We’ve tried to manufacture other forms of pandemic heroism — remember “essential workers?” — but they’ve been pretty transparent efforts to justify the risks we forced others to take in order to make our own lives more convenient.

There hasn’t been a whole lot of leadership from our elected officials. Trump, of course, was cartoonishly stupid and short-sighted during his time as our pandemic president. But the Biden administration, though thankfully in touch with reality, has also struggled to deal with the pandemic. It’s not just America.

Governments around the world have found that there’s no clear formula to dealing with the virus; most have performed disappointingly. At this late stage, most of our leadership seems to be slouching toward letting the disease wash over the population; new attempts to seriously slow the spread of the disease seem to be nonstarters.

There’s been plenty of villainy during the pandemic, but it’s been of a small, uninteresting variety. There’s no imposing human enemy motivated by a persuasive ideology. It’s just that millions of Americans, including a depressing number of our political leaders, have demonstrated that they are willfully ignorant, proudly selfish people with poor critical thinking skills and limited moral imaginations.

Our society has proved itself to be small-minded and not up to making collective sacrifice for the common good. It’s grubby, disappointing, and sad, but it’s not very interesting.

The ultimate enemy, the virus, doesn’t make for a great narrative, either. It has no goals; it’s just a self-replicating bundle of genetic material. The best way to avoid or defeat it is to do very little. Who gets COVID and who doesn’t — and, before vaccination at least, who got a life-threatening case of COVID and who didn’t — feels pretty random and meaningless.

Even at this late stage of the pandemic, there’s still a lot of suffering, but it’s mostly self-inflicted; people who couldn’t tell the difference between conspiracy theories and reality are dying. It’s tragic, but in a small way.

There doesn’t seem to be the prospect for “victory” against the pandemic — just the mitigation of risk, the marginal reduction of suffering. That’s not something you’d make a movie about.

The pandemic has made most people’s individual lives smaller. Less travel, fewer moments with friends and family, fewer of the things that we prioritize when we remember our lives. We’ve devoted more of our mental energy to the boring, exhausting process of assessing risk — is it worth going to that restaurant? Do I wear a mask at the grocery store? Should I go to that party? — leaving less time for, well, interesting thoughts. For many people, the pandemic has been a bit of a fog — killing time, entertaining ourselves, trying not to despair or burn out, hoping it’ll be better when this is all over.

We’ve become emotionally smaller, too. A huge percentage of Americans have spent big parts of the last two years stewing in some sort of futile outrage. Whether they’ve decided that their local public-health officials are totalitarians, or had murderous thoughts every time they see somebody with their mask under their nose, Americans have spent a lot of time feeling angry and anxious, with no clear outlets for those frustrations. It’s been a dark place to be; I’d imagine we’ll be glad to be rid of it.

Maybe someday my grandchildren will ask me what it was like to live during the pandemic. I’ll probably answer that it was scary, uncertain, and boring. I’ll tell them that it was a disappointing time to be an American. I’ll say that, through a combination of luck and caution, my family made it to vaccination and managed to avoid serious illness.

I’ll struggle to articulate to them the combination of stress, exhaustion, loneliness, and frustration I felt. By this point, my grandkids’ eyes will glaze over with boredom.

Then they’ll ask me what 9/11 was like, and I’ll have a much better story for them.

Just Stay Home!

Let’s learn one thing from COVID — stop making other people sick

Photo by Kinga Cichewicz on Unsplash

We’ve all been there — it’s the middle of cold and flu season. A co-worker comes into the office, hacking and sniffling. Everybody around them says that they should just go home and take care of themselves, but the sick colleague won’t. There’s an important meeting, or too much to get done. They took some medicine, and now, they claim, “I don’t feel so bad.” They take a quiet pride in their toughness, hoping that the boss notices their “dedication” to the job.

They’ve sent their sick kids to school, too — the kids have to go somewhere, since their mom and dad went to work. Plus, they don’t want the kids to get soft and think they can take the day off of school every time they have the sniffles. The kids spend the day coughing all over their classmates and teachers.

The result of these acts of heroism in the face of the common cold? The sick people who went to work and school probably didn’t get much done, and what they did accomplish, they didn’t do all that well. And they likely infected lots of other people around them. One case turns into dozens; many people’s weeks are ruined because one family wouldn’t just stay home and watch TV for a couple of days.

There’s a long list of things I wish our country would learn from COVID — health care should be considered a human right, we should make more sacrifices for the common good, we should do more to protect the most vulnerable in society — but it seems like we’re learning very little in these areas. So I’ll set my sights a little lower. Can we all agree that it’s kind of ridiculous for people to come into work and school when they’re contagious with a disease?

Now, I should stipulate up front that I’m mostly talking about white-collar, salaried office jobs here. There are many hourly jobs in this country where taking off for sickness would result in lost wages or even termination. This is ridiculous, and in a compassionate country, we would have policies that prevent people from having to choose whether they’re going to stay home with the flu or pay the heating bill. Everyone should have the choice to stay home when they don’t feel well, and we need policies to ensure this.

But for those of us with sick days we can use and the autonomy to use them, we need a new approach.

First, let’s acknowledge something important. People who come into the office when sick usually think they have to. I’ve done it myself, thinking — “I can do it; I’m not that sick and I don’t want to burden anyone by staying home. I can sacrifice for my job!” But in reality, coming in sick is usually an act of arrogance or selfishness. What you’re really saying is that you’re indispensable. That if you didn’t come into work for a day or two, everything would fall apart.

Guess what — you’re not that important!

Perhaps you’ve had this experience — I know I have. You wake up sick, and you spend the hours between 6 and 7 am lying in bed, wrestling with whether or not to go into work. You catalog the things you “need” to do today, all the reasons that your workplace requires your presence. In the end, you take your temperature, see a fever, and stay home.

And, after you stay home, the world doesn’t fall apart! Everything gets done, or it’s pushed back. When you return to the office, half of your co-workers don’t even realize you were gone. Life and work went on; it’s a little humbling but also a relief.

While we’re on the topic of selfishness, let’s talk for a minute about sending your sick kids into school to infect their classmates and teachers because you “can’t” take off of work to care for them. You’re foisting your problems — and germs — onto other people. Some poor third-grade teacher is going to have to spend the weekend in bed with a fever because you didn’t want to call into the sales meeting. Not great!

Second, COVID has given us a golden opportunity to kill presenteeism — the idea that physically sitting in an office for a certain number of hours is equivalent to working, and that sitting in the office for more hours means you’re working harder.

Millions of jobs went remote during the pandemic and things were mostly fine! Sure, there was a little less office camaraderie, and it was hard for bosses to justify their paychecks without cubicles to drop into, but things were pretty OK.

We all learned the skills necessary for remote work. Even if that meeting today is absolutely crucial for the future of the company, the planet, and the universe — which it isn’t, by the way — you could just attend via Zoom. Everybody is used to it, it’s easy, and it’s no big deal. There’s really no reason for you to come to work and sneeze all over everybody in person. Just stay in bed and send your emails and make your calls and keep your flu (or cold, or strep, or COVID) to yourself.

In the end, staying home is a win-win. It’s good for you! You don’t have to go to work when you feel bad! You can take a day to rest up — after all, running yourself ragged might be why you got sick in the first place. It’s good for the people around you! Not going to work and infecting everybody shows that you are capable of being considerate. And maybe your example will lead to a co-worker staying home next time they get sick, and not infecting you.

It’s actually good for the company and the economy, too. Despite your fantasies that everything will fall apart without your snotty presence, presenteeism actually costs companies money. The assumptions behind presenteeism have been known to be false for over a decade. When people feel like they have to show up in the office even when they’re not feeling up to it, productivity suffers badly. An Australian study found that workers who drag themselves into the office and spread disease cost the country’s economy $34 billion a year.

So let’s learn at least one thing from COVID. When you’re sick, just stay home. It’s the best thing to do for yourself and the considerate thing to do for others.

If Biden Doesn’t Run in 2024, He’ll Join a Very Short List

Few presidents have chosen not to run for a second term

Biden in August of 2020 (Wikimedia Commons)

Joe Biden was old — 77 years old — when he won the presidency in 2020. He’ll be 81 in 2024 — almost nine years older than the previous oldest two-term president, Ronald Reagan, was when he started his second term. Whether you’re a Joe Biden supporter or not, there’s no denying that he may be too old to run for president in 2024.

Though he seems more spry than most people I know who are in their late 70s, it’s hard to imagine Joe Biden as president when he’s 86 years old in 2028. It’s a mentally and physically demanding job, and let’s face it, he already seems a little frail.

Even before his first year in office is up, rumors are starting to swirl about whether Biden will run again. He's claimed that he will, assuming he stays healthy, but that's a big if. Experts like Ed Kilgore speculate that Biden won't run again, but that he has to pretend he will for short-term political reasons.

It's not uncommon for presidents to serve only four years; if Biden did, he would be in the company of Jimmy Carter, George H.W. Bush, and Donald Trump. But most one-term presidents tried for a second term; given the ambitious nature of the people who run for president, it's unsurprising that most of them want to hold onto power.

It’s quite unusual for presidents to voluntarily decline to seek a second term. In fact, the last president who would be a direct analogue for Biden stepped down from the presidency over 140 years ago.

The twentieth-century one-termers

There are two categories of presidents who stopped after one term — the true one-termers, and those who served a term and a bit more.

First, those who served a bit more than one term. These three men — Calvin Coolidge (R-Massachusetts), Harry S. Truman (D-Missouri), and Lyndon B. Johnson (D-Texas) — started as vice presidents and took over the top job when the sitting president died. Each served out the remainder of that term, was elected president in his own right, and then chose not to run for a second full term.

Coolidge’s decision was the strangest of the three. True to his reputation as “Silent Cal,” he didn’t really explain why he chose not to run in 1928. He simply gathered reporters while on vacation in South Dakota in the summer of 1927, passed out little handwritten slips of paper that said, “I do not choose to run for president in 1928,” and made no further comment.

It was a weird moment, and the notes were worded in a confusing way that left Coolidge's intentions unclear, but he stayed true to his word and declined to run. He watched, somewhat frustrated, as the Republican Party nominated Herbert Hoover (R-California) — a man Coolidge couldn't stand — to replace him. Hoover won, and had the privilege of presiding over the beginning of the Great Depression.

Harry Truman served almost two full terms as president after Franklin Roosevelt died in 1945. While he was in office, the Twenty-Second Amendment was ratified, barring future presidents from serving more than two terms, but it did not apply to the person holding the office at the time, so Truman was free to run again.

He apparently seriously considered running again in 1952, but, by the standards of the time, he was pretty old: 68, a full decade younger than Biden is right now. The uncertainty over whether he would run hurt Truman — he allowed his name to appear on the New Hampshire primary ballot, lost badly, and then announced that he would not run again. Dwight D. Eisenhower (R-Kansas), whom Truman had tried to recruit to the Democratic Party, ended up winning in 1952.

Lyndon Johnson's decision not to run again is perhaps the most famous of the three. He became president after the assassination of John F. Kennedy (D-Massachusetts) in 1963 and was elected in his own right in 1964. He presided over remarkable domestic achievements: during his time in office, Democrats passed landmark civil rights legislation and established Medicare, Head Start, food stamps, and work-study financial aid.

But his time in office was overshadowed by his fateful decision to escalate American involvement in the Vietnam War. As the war became more of a quagmire and his own health worsened, Johnson decided to step down. In March 1968, he announced: "I shall not seek, and I will not accept, the nomination of my party for another term as your President."

Richard M. Nixon (R-California) won the presidency in 1968. Johnson was right about his health, at least — he developed debilitating heart problems soon after leaving office, and died in 1973.

The real one-termers

Though Coolidge, Truman, and Johnson all chose not to run for election a second time, they all spent more than one term in office. To find a president who voluntarily chose to limit himself to four years in office, you have to go all the way back to the nineteenth century.

There was a brief trend in the middle of the century of candidates pledging to serve no more than one term. James K. Polk (D-Tennessee), who claimed he had never wanted to be president after the Democrats drafted him for the nomination, promised to serve a single term, and he kept his word by declining to run in 1848.

James Buchanan (D-Pennsylvania), who took the Democratic nomination from the incumbent Franklin Pierce (D-New Hampshire) in 1856 (limiting Pierce to one term), pledged in his inaugural address not to seek reelection in 1860. This was probably for the best: he is generally regarded as one of the country's worst presidents, and his inconsistent and counterproductive policies helped bring about the Civil War.

The most recent analogue for Biden, if he chooses to step down, would be Rutherford B. Hayes (R-Ohio), who pledged during the 1876 campaign to serve only one term and left office in 1881, over 140 years ago. Hayes' story has more parallels with our current reality than one might imagine. In some ways, it is the real version of what Donald J. Trump (R-Florida) and the Republican Party tried to manufacture in 2020.

After a fraud-ridden election in 1876, Hayes was made president by the narrowest of margins (there was actual, widespread fraud in this election, not false allegations of fraud like we have today). Since the popular vote in three states — Florida, South Carolina, and Louisiana — was rendered unknowable by the sheer volume of malfeasance by both parties, neither Hayes nor his opponent, Samuel Tilden (D-New York), could claim a majority in the Electoral College.

Hayes had earned fewer undisputed Electoral College votes than Tilden (165 to 184; Tilden needed 185 to clinch the presidency). Hayes' party, the Republicans, managed to get a majority on the commission that would decide the fate of the 20 disputed electoral votes (19 from those three states, plus a single contested elector from Oregon): after the panel's lone independent justice stepped aside, a Republican was appointed in his place.

The newly appointed Republican promised to rule without bias, but he quickly voted with the other Republicans. In a straight party-line vote, the commission gave Hayes the presidency. In exchange for Democratic acquiescence, the GOP agreed to end Reconstruction in the South, dooming black southerners to decades of Jim Crow discrimination.

Much of the country never accepted Hayes as a legitimate president. He had pledged, when running for the office, to limit his stay in the White House to four years; the idea was that he could pursue aggressive reform without having to worry about his own popularity.

He did make some reforms, but his presidency was best known for the events that overtook it, most notably the 1877 railroad strike; Hayes summoned federal troops to violently put down the work stoppage. No president has since done what Hayes did — limit his stay in office to four years.

It’s early in Joe Biden’s presidency; he has a lot of time to decide what to do about the 2024 election. But if he decides not to run again, he will be the first president since the invention of the automobile to voluntarily limit himself to four years in the White House.

Every Wasted Day Adds Decades to Our Climate Problem

Why even a short delay of Build Back Better will have a long-term impact

Photo by Wilhelm Gunkel on Unsplash

Somebody made Joe Manchin sad (or maybe the coal millionaire from a coal state whose campaigns are funded by other coal millionaires never had any intention of doing much on climate). This means that the Build Back Better bill is in trouble. At best, it will be postponed until after the holidays. At worst, it’s dead.

Let's be optimistic and imagine that BBB passes in the spring or the summer. You might guess that a delay of another three or six months doesn't matter very much, but you'd be wrong. Of course, late is better than never, but every day that we delay putting climate solutions into action locks in years of extra carbon emissions.

It’s no secret that many of the climate provisions of BBB have been watered down (many of them at the behest of the man who eventually rejected the whole thing anyway). Nevertheless, even a watered-down version would still be the most ambitious attempt to remedy climate change in American history.

One of the most exciting parts of what’s left is a series of subsidies designed to reflect the environmental benefits of green technology. You see, the way our marketplace is set up right now, green technologies (solar panels, electric cars, more efficient electric air conditioners and furnaces, etc.) are usually more expensive than their fossil-burning counterparts. This is because we don’t price the cost of climate change (through a carbon tax or some other mechanism) into the fossil fuels and the technologies that burn them.

Experts are pretty much unanimous in saying that we need to electrify everything in our lives if we’re going to have a shot at tackling climate change. If we get all of the combustion out of our households, and switch our power generation over to carbon-free sources, we will be able to have our creature comforts without adding to global warming. But we’ll never convince most Americans to electrify their home heating and transportation unless it’s economically attractive. The Build Back Better bill would use tax rebates and other incentives to make green technologies price-competitive, making the right environmental choice the right economic choice, as well.

Most of the purchases that can make a big dent in your household’s climate impact are big, expensive purchases that you make very rarely. Transportation and utilities alone make up about half of American households’ carbon output, and most of those emissions come from large, costly, rarely replaced machines like your furnace, water heater, oven, air conditioner, and car.

These are machines that make modern life possible; most of us aren’t going to live without them. They’re all very expensive, costing thousands of dollars. Most of these machines are not things we replace often, or on a whim — have you ever met anybody who dropped thousands of dollars to replace their furnace just because they wanted a new one? Importantly, many of these purchases aren’t planned. When the family car breaks down or the furnace stops working in the winter, people are going to make a quick choice under duress. Nobody’s going to go a couple of years without a water heater to wait for new technology to get cheaper. If they haven’t planned ahead for this unexpected purchase, most people understandably choose the most affordable option.

Most of these big, expensive machines that make modern life so convenient last a really long time. An average gas water heater can last a decade or more. The average car on the road today was made in 2009; they routinely last 15–20 years before they end up in the junkyard. Furnaces installed today can be expected to work until 2041.

All of this means that, for every day we wait to make doing the right thing cheaper, people are going to be buying fossil-burning machines that will put carbon into the atmosphere for decades to come. Let’s take a hopeful view and assume that the Build Back Better subsidies will pass after a six-month delay. In those six months, Americans will have installed over 1.5 million gas furnaces, over 2 million gas water heaters, and more than 8 million new cars. Most of these items will be in use for at least ten years; many of them will last twice as long.
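
To get a feel for the scale, here is a rough back-of-the-envelope sketch, not a precise accounting: it multiplies the purchase counts above by assumed per-machine emission rates and lifetimes. The counts come from the paragraph above; the per-unit rates and lifetimes are illustrative guesses of mine, and the total measures the lifetime output of the machines themselves rather than the (smaller) difference between them and electric replacements.

```python
# Back-of-the-envelope sketch of the carbon "locked in" by a six-month delay.
# Purchase counts are from the article; the per-unit emission rates and
# lifetimes below are rough, illustrative assumptions, not official figures.

purchases = {
    # name: (units bought in six months, assumed tons CO2 per year, assumed years of use)
    "gas furnaces":      (1_500_000, 3.0, 15),
    "gas water heaters": (2_000_000, 1.0, 12),
    "new cars":          (8_000_000, 4.6, 15),
}

total_tons = 0.0
for name, (units, tons_per_year, years) in purchases.items():
    lifetime_tons = units * tons_per_year * years
    total_tons += lifetime_tons
    print(f"{name}: ~{lifetime_tons / 1e6:.0f} million tons of CO2 over their lifetimes")

print(f"Total: roughly {total_tons / 1e6:.0f} million tons of CO2 from six months of purchases")
```

Under these assumptions, a single six-month delay quietly commits the country to hundreds of millions of tons of future emissions; even if the per-unit numbers are off by half, the order of magnitude barely changes.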

In order to combat climate change, we need to look at the long-term impact of consumer decisions. The climate impact of a car or furnace comes not when it is purchased, but over its lifetime. We need these subsidies in place as soon as possible. Every day that our politicians dither about them, they are committing to years and years of unnecessary carbon emissions.
