The Slow Breaking of Shared Prosperity
An account of how policy, finance and power reshaped who gains and who loses in the American economy.
John Sante, White Houses, ca. 1939, Metropolitan Museum of Art. Public Domain.
In October 2008, deputies in Akron, Ohio had been to Addie Polk's house five or six times to deliver notices of foreclosure.
"The bankers could give her more time," one deputy recalled. "Her husband passed away. She was a widow trying to keep her home. Imagine just wanting to live your last days in your home."
Addie was 97 years old. She'd always had a home and her husband had provided for her throughout her life. And now, at nearly a century old, she faced being on the street.
"I think it was just too much," those close to her said.
On October 1, 2008, rather than face eviction, Addie Polk shot herself. She survived, but her story became a national symbol of the foreclosure epidemic sweeping America.
Later investigation would reveal something even more disturbing: Addie's mortgage was likely forged. An investigator looked through her old church contribution envelopes and noticed she always used her middle initial.
But the mortgage paperwork—a $45,620 loan from Countrywide Financial taken out in 2004 when she was 86 years old, with a term extending to 2034 when she would be 116—had no middle initial. The signatures didn't match.
Even more telling, Addie had called the sheriff's department asking why deputies kept coming to her door. Her house was paid for, she insisted. Her husband had paid for it before his death.
Why would someone call asking that question unless they believed their house was already theirs?
"She believed, and I believe today," the investigator concluded, "that that mortgage that was on her house was not originated by her."
A 97-year-old widow shot herself over a mortgage she never took out, on a house that was already paid for. Sadly, Addie’s story was not an isolated incident.
Patrick Lovell, a filmmaker in Utah, lost his own home to foreclosure that same year.
"I am neither an economist nor a scholar," he would later say. "I'm just an average American who lost my home and very nearly my family to foreclosure when the market imploded, and I've spent almost every day since trying to find out why."
What he discovered over nearly a decade of investigation wasn't market forces or bad luck.
It wasn't millions of reckless borrowers buying homes they couldn't afford. It was, in his words, "a serial systematic enterprise" using all the prestige of the Wall Street machine to do it.
"Once the dust settled," Lovell explained, "it quickly became clear that my story was no different than millions of other Americans. We all thought that we were alone. We all thought that we'd failed, but none of us really knew why."
The question is: How did this happen?
What Was Shared Prosperity? (1945-1980)
Before we can understand what was dismantled, we need to understand what existed.
The Post-WWII Social Contract
From 1945 to 1980, the United States operated under a fundamentally different economic arrangement than exists today.
It wasn't perfect—it excluded many, particularly people of color and women.
But for those it included, it created something unprecedented in human history: broad-based prosperity that grew across an entire generation.
Union membership peaked at 35% of the workforce in the 1950s. Unions negotiated not just wages, but benefits, working conditions, and pensions. They had real power.
Defined benefit pensions were the norm. Your employer guaranteed you a specific retirement income for life based on your years of service and salary. The risk was on the company, not you.
Homeownership was achievable. The rate hit 64% by 1980.
More importantly, the median home cost 2.8 times the median household income. A single working-class income could buy a home and support a family.
Wages and productivity moved together. From 1950 to 1980, when worker productivity increased 100%, real wages increased 100%. When workers produced more, they earned more. The gains were shared.
Upward mobility was real. Each generation did better than the last. Children expected to be more prosperous than their parents, and usually were.
This was the promise: Work hard, play by the rules and you'll retire secure. Your kids will have it better than you did.
The Two Pillars of Working-Class Wealth
Two institutions made this possible: Pensions and Homeownership.
Pensions: These were deferred wages. You worked, the company contributed to a pension fund on your behalf, and that money was invested conservatively—in AAA-rated securities for safety. When you retired, you received guaranteed income for life.
This was collectively bargained power converted into long-term security.
Critically, pension assets were managed by institutional investors—pension funds, life insurance companies—that were restricted by law to invest only in the safest assets: AAA-rated securities.
This restriction existed to protect retirees from risky bets.
Homeownership: This was equity accumulation and generational wealth transfer. You bought a home for 2.8 times your annual income, paid it off over 30 years, and passed that wealth to your children.
Homeownership was the primary vehicle for working-class wealth creation in America.
The Data

Productivity and wages
Worker productivity doubled. Real wages doubled. Economic growth and worker pay rose together, meaning the gains of a growing economy were broadly shared.

Median home price
2.8x median household income

Homeownership rate
64% of households owned homes

Union membership
35% of workers belonged to unions

Pensions
Defined benefit pensions were standard; retirement security was built into employment structures.

This was shared prosperity. It wasn't perfect, but it was real. And then it was deliberately dismantled.
EIGHT DAYS IN AUGUST 1971: THE ORIGIN
Everything that follows traces back to eight days in August 1971.
August 15, 1971: Nixon Closes the Gold Window
On August 15, 1971, President Richard Nixon suspended the dollar's convertibility to gold, effectively ending the Bretton Woods system that had governed international monetary policy since World War II.
The U.S. dollar became a fiat currency—its value based on government decree rather than gold backing.
Why does this matter? The transition to fiat currency essentially untethered the dollar from physical constraints. The Federal Reserve could now expand the money supply without maintaining gold reserves. In other words, the government could print money without limitation, which later opened the door to:
Unlimited money creation (government and banks)
Financialization (money making money without producing goods)
Debt expansion at unprecedented scale
Inflation management as political tool
But here's the critical part: Modern Monetary Theory (MMT), as explained by economist Stephanie Kelton in "The Deficit Myth," demonstrates that a government issuing its own currency can fund its priorities without traditional deficit constraints.
The real constraint isn't money, it's inflation, which only becomes a problem when you exceed productive capacity.
MMT's promise: Fund infrastructure, healthcare, education, and public investment without "running out of money."
The reality: Fiat currency unlocked MMT-style money creation, but that power was directed exclusively to extreme wealth.
The Pattern: Selective Money Printing (1971-2024)
For more than 50 years, we've watched a clear pattern:
Money printed instantly for extreme wealth: Tax cuts for the wealthy: Reagan (1981), Bush (2001, 2003), Trump (2017)—trillions in revenue lost, deficits explode. No "deficit problem" raised.
Wall Street bailouts (2008): $700 billion TARP passed in 3 weeks. $182 billion for AIG. No debate about "how do we pay for it."
Quantitative Easing (2008-2014): Federal Reserve creates $4.5 trillion out of thin air. No congressional vote. No inflation crisis. Money inflates asset prices owned by the wealthy.
Pentagon budget $886 billion annually (2024): No one asks "how do we pay for it?"
Corporate subsidies and tax breaks: Trillions over decades. No deficit concerns.
Meanwhile, for working-class priorities:
Medicare for All: Would cost ~$3 trillion per year while saving $450 billion annually. Answer: "How do we pay for it?" Dead on arrival.
Student debt relief: $1.7 trillion total, one-time cancellation. Answer: "Deficit crisis!" Blocked by courts.
Infrastructure for public good: Build Back Better originally $3.5 trillion over 10 years. Cut to $1.2 trillion. "We can't afford it."
Climate investment: Labeled "too expensive" despite returns.
The same government that created $4.5 trillion for Quantitative Easing without a vote—money that inflated asset prices for wealth holders—claims Medicare for All is "unaffordable."
The Discovery: David Rogers Webb (1990s)
In the 1990s, researcher David Rogers Webb discovered what he called a "money creation anomaly." The financial system was creating money in ways that didn't align with traditional banking theory.
Here's what most people don't understand: Banks create money when they make loans. It works like this:
When you borrow $100,000 for a mortgage:
The bank creates a deposit in your account: +$100,000 new money
The bank records a loan on its books: $100,000 asset
No existing money was moved. New money was created from the loan.
This is fractional reserve banking. Banks only need to hold a fraction (say, 10%) of deposits as reserves. They can lend, and thereby create, the rest. But Webb discovered something bigger.
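The mortgage example above can be sketched as simple balance-sheet bookkeeping. This is an illustrative toy model, not any real bank's accounting: a hypothetical bank with a 10% reserve ratio makes a loan, and the deposit it creates is new money.

```python
# Minimal sketch of the balance-sheet mechanics described above.
# Hypothetical bank with a 10% reserve requirement: making a loan
# simultaneously creates a deposit (new money) and a loan asset.

class Bank:
    def __init__(self, reserves, reserve_ratio=0.10):
        self.reserves = reserves      # cash on hand (simplified)
        self.deposits = 0.0           # customer deposits (liabilities)
        self.loans = 0.0              # loans outstanding (assets)
        self.reserve_ratio = reserve_ratio

    def make_loan(self, amount):
        # Required reserves are a fraction of deposits, so check that
        # the new deposit would still be covered.
        if self.reserves < self.reserve_ratio * (self.deposits + amount):
            raise ValueError("insufficient reserves")
        self.loans += amount          # asset: the borrower's IOU
        self.deposits += amount       # liability: the newly created deposit
        return amount

bank = Bank(reserves=50_000)
bank.make_loan(100_000)               # the $100,000 mortgage example
print(bank.deposits)                  # 100000.0 -- new money; no existing money moved
print(bank.loans)                     # 100000.0 -- matching asset on the books
```

Note that `bank.reserves` never changes: no existing money was transferred, yet the borrower now has $100,000 to spend.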
The Derivatives Explosion: Money Creation on Steroids
Derivatives are financial contracts whose value derives from an underlying asset, such as mortgages, bonds, commodities and currencies.
They include:
Credit default swaps (CDS)
Interest rate swaps
Collateralized debt obligations (CDOs)
Mortgage-backed securities (MBS)
Synthetic derivatives (bets on derivatives)
When banks create derivatives, they create debt. That debt creates money.
Financial Expansion
The derivatives market has multiplied over the last two-and-a-half decades. The scale is staggering.
The global derivatives market today is approximately $600-700 trillion, while global GDP is roughly $105 trillion.
The derivatives market is 6-7 times the size of the entire global economy.
From 2000 to 2008—just eight years—the derivatives market exploded from $100 trillion to $600 trillion. That’s $500 trillion in new "money" created in less than a decade.
Where did it come from? Banks created it through lending and derivatives.
Where did it go? To financial institutions and wealth holders who could access derivatives markets.
What Webb discovered: Legal infrastructure changes (UCC revisions 1994-2004, derivatives deregulation 2000) enabled this explosion.
Someone was building a $600 trillion money creation engine in real-time.
The money flowed to financial institutions and asset holders. Working-class access to this money creation was restricted, essentially turning working-class borrowers into debt slaves while wealth holders became money creators.
August 23, 1971: The Powell Memo
Eight days after Nixon closed the gold window, Lewis Powell—a corporate lawyer who would soon be appointed to the Supreme Court—wrote a confidential memorandum to the U.S. Chamber of Commerce.
Its title: "Attack on American Free Enterprise System." Powell warned that American capitalism was under assault from unions, consumer advocates like Ralph Nader and environmentalists.
He called for Corporate America to stop apologizing and go on the offensive.
His recommendations were specific:
Universities: Fund business-friendly academics. Shape curriculum. Counter "liberal" professors.
Media: Monitor television and newspapers. Push back against criticism. Fund corporate media operations.
Government: "Business must learn the lesson... that political power is necessary; that such power must be assiduously cultivated; and that when necessary, it must be used aggressively and with determination—without embarrassment."
Courts: Fund business-friendly legal scholars. Pursue test cases to reshape law.
Think tanks: Create joint funding for long-term institutional infrastructure.
The funding: "Far more generous financial support from American corporations than the Chamber has ever received in the past."
Powell suggested diverting just 10% of corporate advertising budgets—roughly $2 billion per year in 1971 dollars, equivalent to $15 billion today.
Powell wrote: "Strength lies in organization, in careful long-range planning and implementation, in consistency of action over an indefinite period of years, in the scale of financing available only through joint effort."
Powell closed with urgency: "The ultimate issue may be survival," he warned. "The hour is late."
Two months later, President Nixon nominated Lewis Powell to the Supreme Court.
The Powell Memo: Executed with Precision
The Powell Memo was implemented methodically over the following decade:
Business Roundtable: 1972
Heritage Foundation: 1973
Cato Institute: 1977
Manhattan Institute: 1978
Federalist Society: 1982
Powell himself, confirmed to the Supreme Court in 1972, authored First National Bank of Boston v. Bellotti (1978)—establishing corporate free speech rights.
As a result, we saw:
Corporate lobbyists in Washington: 175 (1971) → 2,500 (1982)
Corporate PACs: fewer than 300 (1976) → 1,200+ (1980)
DC public affairs offices: 100 (1968) → 500+ (1978)
Within eight days, the new fiat monetary system had been joined by a new political blueprint, and everything that followed—union busting, deregulation, financialization, the mortgage-backed securities fraud and the 2008 collapse—can be traced back to August 1971.
Fiat currency provided the tool, the Powell Memo provided the plan, and for 53 years, that plan has been executed with precision.
BREAKING ORGANIZED LABOR (1981-1990s)
August 1981: PATCO Strike—The Signal
On August 3, 1981, 13,000 air traffic controllers went on strike for better wages and working conditions. President Ronald Reagan gave them 48 hours to return to work. They refused.
On August 5, Reagan fired all 11,345 striking controllers and banned them from federal employment for life.
The message to Corporate America was clear: The government will not protect labor. Union busting is now policy.
Corporate America got the message and union busting accelerated across the private sector.
1980s–1990s
The Dismantling
35% → 20% → 14.5% → 10.3%
From the 1950s to 2023, union membership collapsed, weakening one of the strongest structures workers had for bargaining power.
Right-to-work laws spread
States passed laws that weakened union power and made collective bargaining harder to sustain.
The NLRB was gutted
The National Labor Relations Board’s staff and enforcement capacity were slashed.
Pensions shifted
Defined benefit pensions gave way to 401(k)s, transferring market risk from employers to workers.
70% → 28%
The top marginal tax rate dropped dramatically, concentrating more gains at the top.
Productivity grew 75%. Real wages grew 12%.
From 1980 to 2020, the economy kept producing more value, but workers received far less of it. The gains were no longer shared.
But breaking unions and cutting taxes wasn't enough. The real wealth—the trillions accumulated over 40 years in pension funds and home equity—required something more sophisticated: financial engineering.
CLINTON: ACCELERATING THE DISMANTLING (1993-2001)
Bill Clinton campaigned as a champion of working people, but what he delivered was systematic acceleration of the Powell blueprint.
NAFTA (1994): Gutting Manufacturing
In December 1993, Clinton signed the North American Free Trade Agreement, which took effect the following January and enabled American companies to move their factories to Mexico with no penalty.
NAFTA displaced approximately 1 million manufacturing jobs directly. Combined with China's WTO entry (2000) and automation, NAFTA contributed to 5.3 million manufacturing job losses from 1994 to 2010.
The Rust Belt, including Michigan, Ohio, Pennsylvania and Wisconsin, was hollowed out and union manufacturing strongholds evaporated. The impact of this decision destroyed communities and continues to affect Americans to this day.
A 2024 New York Times analysis found that NAFTA-affected communities experienced sharp increases in deaths of despair, such as suicides, drug overdoses and alcohol-related deaths.
Wage pressure was immediate: "Accept lower wages or we move to Mexico."
NAFTA's promise was that displaced workers would transition to higher-skilled jobs. In reality, displaced factory workers ended up in jobs without benefits, pensions or security.
Welfare Reform (1996): Shredding the Safety Net
In August 1996, Clinton signed the Personal Responsibility and Work Opportunity Reconciliation Act.
He promised to "end welfare as we know it." And he did: Aid to Families with Dependent Children (AFDC)—the permanent safety net established during the New Deal—was replaced with Temporary Assistance for Needy Families (TANF).
TANF put a five-year lifetime limit on benefits: five years total, across your entire life. After that, you're on your own.
Welfare rolls dropped 60%, from 12.3 million recipients in 1996 to 4.8 million in 2000. Politicians celebrated it as a success, as if poverty had fallen. It hadn't; people had simply lost benefits.
Think about the timing: NAFTA destroyed manufacturing jobs in 1994, followed by welfare reform two years later, ultimately leaving the working class with no jobs and no safety net.
That was no accident.
Telecommunications Act (1996): Consolidating Media Control
In February 1996, President Bill Clinton signed the Telecommunications Act of 1996, lifting long-standing limits on media ownership. Companies were suddenly free to acquire stations at a scale that had previously been restricted.
The consolidation came quickly. In 1983, roughly 50 companies controlled most of the American media landscape. By 2011, that number had narrowed to six.
Few examples were more striking than Clear Channel Communications, which grew from about 40 radio stations in 1996 to more than 1,200 within seven years.
By the early 2000s, the pattern was becoming clear even to mainstream audiences. Jon Stewart’s The Daily Show, for example, ran a recurring segment of montages showing local and national anchors delivering nearly identical lines, often word for word.
The same dynamic surfaced in political coverage. In October 2006, President George W. Bush told George Stephanopoulos that the administration had never used the phrase “stay the course.”
The following night, Keith Olbermann aired a montage of 29 instances in which Bush had done exactly that.
These moments did not reveal coordination so much as structure. By the mid-2000s, a small group of corporations determined what most Americans saw and heard.
A 2008 study from the Pew Research Center found that The Daily Show devoted a notable share of its airtime to examining the press itself, often more than traditional outlets.
More than a decade later, the pattern had not disappeared. In 2018, Sinclair Broadcast Group required anchors across its local stations to read the same script warning viewers about “fake news” and media bias.
The footage circulated widely: different cities, different faces, identical language, ending on the same line—“This is extremely dangerous to our democracy.”
The conditions that made that moment possible had been set years earlier.
Glass-Steagall Repeal (1999): Creating "Too Big to Fail"
In November 1999, Clinton signed the Gramm-Leach-Bliley Act, repealing the Glass-Steagall Act of 1933.
For more than six decades, Glass-Steagall had kept commercial banking and investment banking separate.
Its repeal changed the structure of American finance. Commercial banks, investment banks, insurance companies and securities firms could now consolidate under the same corporate roof. What followed was the rise of financial institutions so large, so interconnected and so embedded in the economy that their failure threatened the entire system.
Citigroup. JPMorgan Chase. Bank of America. Wells Fargo.
The wall between everyday banking and speculative finance had been weakened. Banks with access to deposits and public support could take on greater risk through complex financial products, including derivatives. When those bets paid off, executives and shareholders captured the upside. When the system buckled, the public absorbed the damage.
Byron Dorgan, a Democratic senator from North Dakota, warned about the consequences on the Senate floor in November 1999.
“I think we will look back in 10 years’ time and say we should not have done this.”
He was almost exactly right. Nine years later, the financial system collapsed.
Derivatives Deregulation (2000): The Final Enabler
In December 2000, Clinton signed the Commodity Futures Modernization Act.
The legislation reshaped how a rapidly growing corner of finance would be governed. It placed most over-the-counter derivatives outside the reach of federal regulators, limited the authority of the Commodity Futures Trading Commission and clarified that instruments like credit default swaps would not be treated as gambling under state law.
In effect, a vast and increasingly complex market was allowed to expand with minimal oversight.
Brooksley Born, then chair of the CFTC, had warned that the opacity and scale of these instruments posed a systemic risk. She argued that without transparency and basic regulatory guardrails, exposures could accumulate in ways that would be difficult to track and harder to contain.
She was silenced by Federal Reserve Chair Alan Greenspan, Treasury Secretary Robert Rubin, and Deputy Secretary Larry Summers. All three men had ties to Wall Street and all three pushed deregulation.
The derivatives market proceeded to explode from $100 trillion (2000) to $600 trillion (2008).
BUILDING THE LEGAL INFRASTRUCTURE (2001-2005)
To loot pensions and homes at scale, Wall Street needed legal protection built in advance. Here’s how it worked:
Phase 1: Enable the Fraud (2001-2004)
Uniform Commercial Code (UCC) Revisions (1994-2004): The UCC governs commercial transactions in the United States. Between 1994 and 2004, Articles 8 and 9 were systematically revised to:
Facilitate securities-based lending
Create legal framework for derivatives
Enable complex securitization structures
Make warehouse lines (credit lines for mortgage originators) legally enforceable
These were technical changes, adopted state-by-state. Almost no one noticed. By 2001, the new framework was nearly universal.
Phase 2: Protect the Perpetrators (2005)
Bankruptcy Abuse Prevention and Consumer Protection Act (April 2005): Four months before Hurricane Katrina, Congress passed and President George W. Bush signed bankruptcy "reform." The law created a new class: "Creditor Safe Harbor."
This provision:
Exempted derivatives, repurchase agreements, and securities lending from the automatic bankruptcy stay
Insulated Wall Street from consequences if their schemes collapsed
Made it harder for working-class debtors to file bankruptcy (means testing, filing restrictions imposed)
System Build-Up
The infrastructure was built long before the collapse.
Legal infrastructure enabling derivatives
A decade of regulatory and legal shifts quietly expanded the foundations of the derivatives market.
Derivatives deregulation
Key oversight was removed, allowing complex financial instruments to grow largely unchecked.
Protection for Wall Street
Regulatory decisions further insulated major financial institutions from constraint and accountability.
Derivatives explosion
$100 trillion → $600 trillion
System collapse
The accumulated risk surfaced all at once.
THE EXECUTION: MORTGAGE-BACKED SECURITIES FRAUD (2000-2008)
With the legal infrastructure in place, Wall Street was ready to execute. The targets: the two pillars of working-class wealth—pensions and homeownership.
Step 1: Fund the Originators: Wall Street investment banks—Goldman Sachs, Lehman Brothers, Bear Stearns, Merrill Lynch—provided credit lines ("warehouse lines") to subprime mortgage lenders: Countrywide Financial, Golden West Mortgage, New Century Financial and Ameriquest.
Wall Street controlled the entire pipeline from origination to securitization.
Step 2: Originate Predatory Mortgages: Here's where the ground-level fraud happened. In Akron, Ohio—a quintessential Rust Belt city hit early and hard—local law enforcement stumbled onto something.
The Summit County Task Force was investigating foreclosures. What they found was systematic fraud at Carnation Bank and a sales organization called Evergreen Homes, run by a man named David Willan.
They discovered that Carnation Bank was training its employees to forge documents.
One investigator recalled: "We'd come over here all the time. You know, you recognize the one car that's here all the time. We knew the names. And what they were doing is packaging everything up. They were training their people to do the documents. They had to change the numbers so they could get a mortgage."
"Carnation Bank was actually training its employees how to forge documents, inflate income, and get the appraisal numbers that they needed. Even the forging of signatures was not uncommon."
When investigators interviewed one Carnation Bank employee, they asked how many times he'd falsified loan application documents to make them fit approval criteria.
The answer shocked them: "I expected the answer to be two or three times," the investigator said. "And the guy looked right at us and said, 'I don't know, maybe 200 times maybe.' That, I mean, I was shocked. I was expecting an answer of, 'I either didn't do that, or two or three times.' He said, 'I don't know, maybe no more than approximately 200 times.'"
"So this guy falsified 200 different loan applications. And that's one person that worked there."
The fraud pattern was clear:
No-doc loans ("liar loans"—no documentation of income required)
Teaser rates (low initial payments that balloon later)
Negative amortization (loan balance grows, not shrinks)
Prepayment penalties (traps borrowers)
Forged documents (signatures, income, appraisals)
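The mechanics of "negative amortization" in the list above are worth seeing in numbers. A rough sketch with hypothetical loan terms (not drawn from any case in this story): when the teaser payment is smaller than the interest accruing each month, the unpaid interest is added back to the balance, so the borrower pays on time and still owes more.

```python
# Illustrative sketch of negative amortization (hypothetical terms).
# If the monthly payment is less than the interest accruing, the
# shortfall is added to the balance and the debt grows.

def amortize(balance, annual_rate, monthly_payment, months):
    """Track a loan balance month by month; unpaid interest compounds."""
    for _ in range(months):
        interest = balance * annual_rate / 12
        balance += interest - monthly_payment  # payment < interest => balance grows
    return balance

# Hypothetical numbers: $200,000 at 8% accrues ~$1,333/month in interest.
# A $1,000 teaser payment leaves ~$333/month unpaid, compounding.
balance = amortize(200_000, 0.08, 1_000, 24)
print(round(balance))  # balance after two years of on-time payments: over $208,000
```

Two years of faithful payments, and the borrower is several thousand dollars deeper in debt, right as the teaser rate expires and the real payment balloons.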
And here's the critical part: "This was money chasing people. The person who sold you a loan made more money if they sold you a higher rate loan."
Lenders weren't responding to demand. They were hunting borrowers. Higher-rate (worse) loans meant higher commissions. The incentive structure was designed to defraud.
At Ameriquest, employees bragged "you could come in at nine o'clock and have a loan by five o'clock because they were going back and forging the documents."
Step 3: Package into Mortgage-Backed Securities (MBS): Wall Street took these toxic, fraudulent mortgages and packaged them into securities, slicing them into tranches by supposed credit quality:
AAA tranche (top): Supposedly highest quality
AA, A, BBB tranches (middle): Lower quality
BB, B, Equity tranches (bottom): First to absorb losses
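The tranche structure above works as a loss waterfall, which can be sketched in a few lines (the tranche sizes here are hypothetical, chosen only for illustration): losses on the underlying mortgage pool are absorbed from the bottom tranche up, which is the argument that let the top slice be sold as "AAA."

```python
# Schematic sketch of the tranche loss waterfall described above.
# Losses hit the bottom (equity) tranche first and work upward.

def apply_losses(tranches, loss):
    """tranches: (name, size) pairs ordered from first-loss (bottom) to AAA (top).
    Returns each tranche's remaining principal after absorbing losses in order."""
    remaining = {}
    for name, size in tranches:
        absorbed = min(size, loss)   # this tranche eats losses up to its size
        loss -= absorbed
        remaining[name] = size - absorbed
    return remaining

# Hypothetical $100M pool split into four tranches ($ millions).
pool = [("Equity", 5), ("BBB", 10), ("AA", 15), ("AAA", 70)]
print(apply_losses(pool, 12))   # 12% losses wipe out Equity and most of BBB; AAA untouched
print(apply_losses(pool, 40))   # 40% losses reach into the "safe" AAA tranche
```

The structure only protects the AAA slice if pool losses stay small. When fraudulent underwriting made defaults far larger than the models assumed, losses burned straight through the junior tranches into the "institutional grade" paper that pension funds held.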
Step 4: The Insurance Fraud: Lower-rated tranches were insured by "default insurers" to justify higher ratings. Here's the RICO element: Wall Street banks owned the default insurers.
And those insurers were not capitalized to cover predictable defaults. No one bothered to check whether these insurers had cash reserves. Or rather, the people who should have checked were paid not to look.
Step 5: Buy AAA Ratings: Credit rating agencies—S&P, Moody's, Fitch—gave these mortgage-backed securities AAA ratings based on the insurance.
Conflict of interest: Wall Street paid the rating agencies. The rating agencies either didn't check insurer capitalization, or knew and didn't care.
AAA rating = "institutional grade" = legal for pension funds to purchase.
Step 6: Target Institutional Investors: This is where the two pillars connect. Pension funds and life insurance companies—holding working-class retirement savings—are restricted by law to AAA-rated investments. This restriction exists to protect retirees from risky bets.
Wall Street took advantage of that restriction. Teachers. Nurses. Firefighters. Factory workers. Their pensions, managed by institutional investors, poured trillions into toxic mortgage-backed securities fraudulently rated AAA.
These were the real targets. Homeowners were collateral damage.
Step 7: Wall Street Hedges Its Own Fraud: Here's the smoking gun. Wall Street investment banks—knowing these securities were toxic—bought Credit Default Swaps (CDS) from AIG as insurance against their own securities.
Think about that. They were selling securities to pension funds as AAA-safe while simultaneously buying insurance betting those same securities would fail.
Buying CDS on "AAA" securities proves foreknowledge of fraud. If the securities were truly AAA, insurance would be unnecessary.
The FBI Warnings (2004-2006)
Between 2004 and 2006, the FBI repeatedly warned of an epidemic of mortgage fraud.
"It has the potential to be an epidemic," said Chris Swecker, who formerly led the Criminal Division at FBI headquarters in Washington. "We think we can prevent a problem that could have as much impact as the S&L crisis," he said.
Regulators didn't act. And when the crisis hit in 2008, they claimed no one could have seen it coming.
2007–2009
The Collapse and Bailout
The Defaults Begin
Teaser rates expired. Borrowers couldn’t afford balloon payments. Foreclosures spiked. In Akron, Ohio, by 2008, 47% of homes were underwater, meaning the mortgage exceeded the home’s value.
In 2007, Summit County served an average of 746 foreclosure notices per month. By 2008, that number exploded.
Across America, the pattern was the same: foreclosures cascading, home values plummeting and neighborhoods hollowed out.
The Crisis Explodes
Bear Stearns collapses
Acquired by JPMorgan in a Fed-backed rescue.
Lehman Brothers files for bankruptcy
The largest bankruptcy in U.S. history.
AIG bailout
$182 billion in taxpayer-funded support.
TARP bailout
$700 billion authorized and passed in three weeks.
The same week TARP passed, Addie Polk shot herself rather than face eviction over a mortgage she likely never took out.
Obama's Choice: The AIG Scandal
Beginning in September 2008, the government bailed out AIG with what ultimately totaled $182 billion in taxpayer money. AIG had sold credit default swaps—insurance on toxic mortgage-backed securities—that it could not cover.
Under ordinary circumstances, a firm in that position would enter bankruptcy, and its creditors would take losses. In late 2008, AIG had begun negotiating with its counterparties to accept discounted payouts—so-called “haircuts”—reportedly in the range of 60 to 70 cents on the dollar.
That outcome did not materialize.
Officials at the Federal Reserve Bank of New York, led at the time by Timothy Geithner, chose instead to use federal support to pay AIG’s counterparties in full.
The beneficiaries included major global financial institutions. Goldman Sachs received approximately $12.9 billion, while Société Générale received about $11.9 billion and Deutsche Bank about $11.8 billion. All paid in full using taxpayer money.
When investigators tried to expose it, Geithner's New York Fed pressured AIG to hide the payments from SEC filings.
The SIGTARP report—the Special Inspector General for the Troubled Asset Relief Program—documented the cover-up. Congressional testimony confirmed it.
In a bankruptcy, creditors absorb losses. That's how bankruptcy works. AIG was bankrupt. The negotiated haircuts—60 to 70 cents on the dollar—would have saved taxpayers $50-70 billion. Geithner chose to make Wall Street whole instead.
Goldman's Pay Days and Bonuses
In 2009, as the U.S. economy struggled to recover from the financial crisis, Goldman Sachs set aside $16.2 billion for employee compensation and benefits, even as the firm reported $13.4 billion in net earnings for the year.
The figures drew scrutiny because they underscored how quickly the largest financial institutions had stabilized—and how uneven that recovery appeared.
On average, Goldman’s roughly 32,500 employees received close to $500,000 in compensation.
Chief executive Lloyd Blankfein received a $9 million bonus, which the firm described at the time as a measure of restraint.
That November, Blankfein told The Times of London that Goldman was “doing God’s work,” a remark that quickly became shorthand for the industry’s posture in the aftermath of the crisis.
Outside Wall Street, the economic picture looked very different.
The numbers told the story:
- Nearly 2.8 million U.S. homes received foreclosure filings in 2009.
- About 8.7 million jobs were lost over the course of the downturn.
- The unemployment rate peaked at 10.1 percent in October 2009.
- Median household net worth fell roughly 39 percent, from $126,400 in 2007 to $77,300 in 2010.
- Millions of Americans remained unemployed for six months or longer.
- Public pension funds collectively lost hundreds of billions of dollars in the market collapse.
The contrast became a defining feature of the post-crisis narrative. Financial institutions that had required extraordinary public support returned to profitability within a year. Households, by comparison, absorbed lasting losses in income, savings and housing wealth.
In December 2009, Goldman Sachs announced a $500 million program aimed at small-business lending and community investment. The firm framed the initiative as part of its response to the crisis.
The scale of that commitment—while substantial in absolute terms—represented a small fraction of the firm’s annual compensation pool.
The broader question lingered: who bore the cost of the crisis, and who recovered first.
The Human Cost
The Victims
10 million foreclosures
From 2007 to 2016, millions lost homes, communities were hollowed out and generational wealth was wiped out.
$9.1 trillion in household wealth lost
Retirement security evaporated
CalPERS lost $67 billion. NYC pensions lost $26 billion. Teachers, nurses, firefighters and factory workers saw retirement savings vanish.
401(k)s were cut in half
Retirement was delayed, reduced or abandoned entirely.
“We all thought we were alone. We all thought that we’d failed.”
$882 billion in bailouts
$700 billion for TARP. $182 billion for AIG. Public money funded the rescue of the institutions that helped drive the crisis.
Who Benefited
Bailed out and preserved
Institutions deemed “too big to fail” survived the crisis and its aftermath.
- Goldman Sachs
- JPMorgan Chase
- Citigroup
- Bank of America
Creditor safe harbor provisions shielded derivatives positions from standard bankruptcy losses.
Compensation remained intact
Across the industry, roughly $140 billion in bonuses were paid between 2006 and 2008.
No senior executives faced criminal charges tied to the crisis, and personal financial consequences were limited.
Acquired distressed assets at scale
Large firms including BlackRock and Blackstone moved into the single-family housing market, purchasing foreclosed homes at steep discounts.
Over time, many of these properties were converted into rental portfolios, shifting ownership patterns and expanding institutional control of housing.
The Prosecutions
In the years following the 2008 financial crisis, federal authorities brought numerous cases tied to mortgage fraud and related misconduct.
Yet those prosecutions were largely concentrated at the margins of the system. No senior Wall Street executive at a major financial institution was criminally charged in connection with the practices that contributed to the collapse.
The government did secure large civil settlements with several banks. Goldman Sachs agreed to pay roughly $5 billion in 2016 to resolve claims related to mortgage-backed securities, while JPMorgan Chase reached a $13 billion settlement in 2013, one of the largest in U.S. history. Those penalties were borne by the institutions and their shareholders, rather than individual executives.
At the same time, enforcement efforts targeted smaller-scale fraud. The Federal Bureau of Investigation pursued cases against individuals involved in mortgage scams, including borrowers, brokers and local operators, reflecting a prosecutorial approach that emphasized discrete acts of fraud rather than systemic decision-making at the highest levels of finance.
The outcome drew sustained criticism from legal scholars and policymakers who argued that accountability had been unevenly applied. Among them was Elizabeth Warren, then a law professor, who pointed to the absence of charges against senior executives as evidence of how legal and financial power intersect.
"Not one of the executives on Wall Street has been charged with anything,” she said. “That is what power is about. That is what corruption is about, and that is what has to change in the United States of America."
It didn't change. And twelve years later, the same playbook was executed again—this time using a pandemic as cover.
The CARES Act Heist (March 2020)
In March 2020, as the rapid spread of COVID-19 forced businesses to close and unemployment surged, Congress passed the CARES Act, a $2.2 trillion emergency package designed to stabilize the economy.
A central component of that response involved the Federal Reserve, which was authorized to use $454 billion in Treasury-backed funds to support a series of emergency lending facilities.
Those funds were structured as a loss-absorbing backstop, allowing the central bank to extend credit at a much larger scale—up to several trillion dollars—primarily through programs aimed at maintaining liquidity in corporate debt markets.
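The leverage arithmetic behind that backstop can be sketched simply: if the Treasury funds absorb first losses up to some fraction of the portfolio, each dollar of backstop can support several dollars of lending. The 10 percent first-loss ratio below is an assumed illustration, not a figure from the legislation; it is chosen because it reproduces the multi-trillion-dollar capacity discussed at the time.

```python
# Illustrative sketch: Treasury equity as a first-loss backstop for Fed lending.
# The 10% loss-absorption ratio is a hypothetical assumption for illustration.
treasury_backstop = 454e9        # CARES Act funds allocated to the facilities
assumed_first_loss = 0.10        # assumed share of losses the backstop covers

# Maximum lending the backstop can support at that loss ratio
max_lending_capacity = treasury_backstop / assumed_first_loss
print(f"${max_lending_capacity / 1e12:.2f} trillion in potential lending")
```

Under that assumption, $454 billion supports roughly $4.5 trillion in credit, which is why press coverage of the facilities cited figures an order of magnitude larger than the appropriation itself.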
Through these facilities, large corporations gained access to borrowing at historically low interest rates, closely aligned with the federal funds rate, which had been reduced to near zero. The goal was to prevent a cascade of corporate defaults and preserve the functioning of financial markets during a period of acute uncertainty.
The experience for small businesses unfolded differently. While programs such as the Paycheck Protection Program were designed to provide relief, distribution was uneven and, in many cases, delayed.
By some estimates, more than 30 percent of small businesses closed permanently during the pandemic, a loss that reshaped local economies across the country.
The divergence reflected both the structure and the speed of the response. Large firms, with established relationships in capital markets, were able to access support quickly through credit facilities designed to stabilize financial systems. Smaller businesses, which relied more heavily on direct aid and banking intermediaries, often faced greater barriers in securing timely assistance.
The result was an uneven recovery in which large corporations emerged with strengthened balance sheets, while many small businesses—particularly in service sectors and local communities—did not return.
The Housing Lockout (2021-2023)
The dislocation in the American housing market that unfolded between 2021 and 2023 did not arrive as a single shock. It developed in phases, each reinforcing the next, and together reshaping who could realistically enter the market.
The first shift came in 2021, as large institutional investors moved aggressively into single-family housing.
Firms including Blackstone and BlackRock expanded their acquisitions, aided by abundant liquidity following pandemic-era interventions by the Federal Reserve.
By some estimates, investors accounted for roughly 18 percent of all home purchases that year, accelerating a longer-running trend in which single-family homes were absorbed into rental portfolios at scale.
Prices rose in tandem. The Case-Shiller Home Price Index recorded annual gains of nearly 19 percent in 2021, one of the sharpest increases on record. Median home prices climbed from $288,000 in 2020 to $355,000 in 2021, reaching approximately $440,000 by 2023.
The ratio of home prices to median household income, which stood at 2.8 in 1980, approached five by 2021, placing ownership further out of reach for many first-time buyers.
Rising home values produced a secondary effect. As equity increased, existing homeowners experienced a surge in perceived wealth, a dynamic economists often describe as the “wealth effect.”
Higher home equity supported greater consumer spending, which in turn added to broader price pressures across the economy. By June 2022, inflation had reached 9.1 percent, its highest level in decades.
The burden of those increases was uneven. Households that owned property benefited from rising asset values and greater financial flexibility, while renters and prospective buyers faced higher costs across both housing and everyday goods, with fewer avenues to build comparable wealth.
The Federal Reserve responded by tightening monetary policy, raising interest rates at the fastest pace in four decades and shrinking its balance sheet by letting maturing Treasury securities roll off rather than reinvesting the proceeds.
Those moves pushed borrowing costs higher across the economy, including in the housing market. The average rate on a 30-year mortgage, which had hovered near 3 percent in 2021, rose above 7 percent by 2023.
That shift altered incentives on both sides of the market. Prospective buyers encountered sharply higher monthly payments, even as home prices remained elevated.
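The effect of that rate move on monthly payments can be checked with the standard fixed-rate amortization formula. The figures below are illustrative, using the article's approximate 2023 median price and an assumed 20 percent down payment:

```python
def monthly_payment(principal, annual_rate, years=30):
    """Standard fixed-rate mortgage amortization formula."""
    r = annual_rate / 12                # monthly interest rate
    n = years * 12                      # number of monthly payments
    return principal * r * (1 + r) ** n / ((1 + r) ** n - 1)

# Assumptions for illustration: ~$440,000 price, 20% down payment.
principal = 0.8 * 440_000

low = monthly_payment(principal, 0.03)   # ~2021 average 30-year rate
high = monthly_payment(principal, 0.07)  # ~2023 average 30-year rate

# Roughly $1,484/mo at 3% versus $2,342/mo at 7%: about a 58% jump
# in the monthly payment with no change in the price of the house.
print(f"At 3%: ${low:,.0f}/mo; at 7%: ${high:,.0f}/mo "
      f"({high / low - 1:.0%} increase)")
```

The same house, at the same price, costs more than half again as much per month, which is why affordability collapsed even where prices merely held steady.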
Existing homeowners, many of whom had secured historically low mortgage rates, became reluctant to sell, effectively locking in place and limiting the supply of available homes. The result was a market defined less by transactions than by stasis.
By 2024, the affordability gap had widened further. Nationally, home prices were roughly five times median household income, according to data from the Harvard Joint Center for Housing Studies, with major metropolitan areas such as San Francisco, Los Angeles and New York City exceeding seven times income.
Over the course of three years, the combined effects of investor activity, rising asset values and higher borrowing costs narrowed the path to homeownership.
For many households, particularly younger and lower-income buyers, the barrier was no longer a temporary hurdle but a structural condition of the market.
The Pattern: Same Con, Different Crisis
Across three major disruptions—the 2008 financial collapse, the pandemic-era shock of 2020 and the housing market shifts that followed—government intervention has tended to stabilize large institutions more quickly than the households and smaller businesses most directly exposed to the fallout.
- Big banks were bailed out while homeowners were foreclosed on
- Big corporations got access to trillions at near-zero rates while small businesses died
- Institutional investors locked out homeownership at scale
Taken together, these policy choices form a pattern that stretches back more than half a century, steadily reshaping the postwar model of shared prosperity and narrowing the primary mechanisms through which working-class wealth is built.
Questions Worth Asking
Why were local prosecutors, including those in Summit County, able to pursue racketeering cases tied to mortgage activity, while federal authorities did not bring similar charges under the same statutes?
What explains the gap between early warnings—issued by the Federal Bureau of Investigation between 2004 and 2006—and the pace or scope of regulatory intervention that followed?
How did a crisis that resulted in roughly 10 million foreclosures conclude without criminal charges against senior executives at major financial institutions?
If banks can create $600 trillion through derivatives, why is Medicare for All “unaffordable”?
If the Fed can create $4.5 trillion for QE without a vote, why can’t it create $3 trillion/year for healthcare that saves money?
If Geithner could pay Wall Street counterparties 100 cents on the dollar when negotiated haircuts would have left them with 60–70 cents, saving taxpayers an estimated $50–70 billion, why did he choose to make Wall Street whole?