How the US Set Sail on a Sea of Red Ink
This article is part two of a three-part series on "Our United States of Indebtedness." Read the first piece in the series here.
A majority of Americans struggle daily to stay afloat on a sea of red ink, perpetually threatened by wave after wave of debt. This hasn't always been the case. The phenomenon can be traced back to 1978, when the US economy was sailing into dire straits.
The cumulative impact of the Vietnam War (which coincided with a large tax cut), the persistent misallocation of capital and the Arab oil embargo of 1973 created a lasting economic malaise that, by the middle of President Jimmy Carter's tenure, generated severe stagflation.
"Stagflation" is the pernicious intersection of stagnant economic growth with high unemployment and high inflation. It hits harder than a simple slowdown because it's accompanied by a sharp rise in the cost of living. It's a double whammy: The economy generates less wealth, and the wealth it does generate is worth less as prices spike. And it's a tough problem to solve.
On October 24, 1978, Carter gave a nationally televised speech that kicked off a voluntary anti-inflationary program recommending caps on wages and prices. Inflation had nearly doubled in just two years from 4.9 percent to 8.9 percent, and it spiked to over 13 percent by the end of 1979. Unlike today, most Americans had to cope with the rapidly rising cost of living without the help of a credit card; pulling out the plastic simply wasn't an option for most middle-class families. If you couldn't afford it, you couldn't afford it. It was up to President Carter to relieve the inflationary pressure crushing consumers.
But that would start to change just one week later.
Brother, Can You Spare a Paradigm?
On October 31, 1978, the Supreme Court heard arguments in Marquette National Bank of Minneapolis v. First of Omaha Service Corp. At stake were "usury laws" imposed by states on financial institutions that issued loans and credit cards. Until the unanimous Marquette decision was handed down on December 18, 1978, banks issuing credit cards nationally had to comply with each individual state's unique usury laws. These laws capped the interest rates and fees lenders could charge within state lines. If a bank in loosely regulated Delaware wanted to float credit in a tightly regulated state, it had to comply with the laws of the tightly regulated state.
In the Marquette case, a bank based in Nebraska wanted to operate under its home state's looser restrictions when doing business in tightly regulated Minnesota. But banks in Minnesota had to comply with Minnesota's stricter usury law, which put them at a disadvantage. So, they argued that Nebraska's banks should be forced to comply with Minnesota's laws when operating in Minnesota.
First National Bank of Omaha argued that, as a nationally chartered bank, it was subject only to federal restrictions on its interstate financial transactions. The Supreme Court agreed and immediately opened the floodgates. Banks scurried to states like Delaware and South Dakota that had loose usury laws. Freed from decades of constraint, they quickly spread their newly liberated debt businesses to all 50 states, pouring red ink into usury-averse states from their new islands of sparse regulation.
This one decision effectively gutted state-level usury laws: A nationally chartered bank could now "export" its home state's interest rates to borrowers anywhere in the country.
It is notable, yet often forgotten, that the United States actually had "usury laws." And it is also notable that they were called "usury laws" - a term loaded with biblical overtones of moral judgment - rather than something blander like "lending rules" or "financial regulations." Usury laws were essentially ethical restrictions on lenders seeking to perniciously extract both income and wealth from borrowers. Restrictions on "usury" also reflected a longstanding view of personal debt as something to be avoided - and of the personal debt business (also known as loan sharking) as something shady and morally suspect.
It was one of the reasons credit cards were, by and large, associated with rich people.
For decades, Visa, MasterCard, Diners Club and American Express were not a way to "get by" during lean times or a handy financial safety net for emergencies. Credit cards were status symbols used primarily for their convenience - not an ad hoc infusion of cash for middle- or working-class households. They were the wallet-stuffers of the wealthy, used by people who could afford to pull out the plastic when they didn't have cash on hand at a fancy restaurant, or when they simply wanted to wow someone with their elite credit rating.
At the beginning of the 1970s, "bank-type" credit cards like BankAmericard and MasterCharge were held by just one-sixth of US families, according to a Federal Reserve study. That's a long way from the two-thirds of Americans in 1998 and the nearly three-quarters of Americans in 2012. In 2014, Gallup found that the average American held 2.6 credit cards - 3.7 if you count only cardholders. And those rates are way down from pre-crash highs.
During the 1970s, most Americans, if they had a card at all, held store-issued cards from retailers like Sears, JCPenney or Montgomery Ward. These functioned less like loans to help make ends meet and more like layaway plans with interest, allowing financially limited middle- and working-class families to afford "big-ticket" items like washing machines and refrigerators.
And those store-issued cards, along with gasoline cards, allowed them to "establish a credit rating" in anticipation of a future car loan or mortgage. But the Marquette decision changed the nature and availability of credit. It opened up the roulette wheel of revolving credit on the eve of the debt-soaked 1980s.
Then, Democrats and the United States' first "deregulator-in-chief" upped the ante. But that chief deregulator wasn't Ronald Reagan. It was Jimmy Carter.
From the outset of his presidency, Carter worked to loosen rules on key industries like the airlines, the oil industry, railroads and the trucking business. But it didn't grease the wheels of economic growth. So, after stagflation and the Iran-related oil shock of 1979 threw more sand into the gears, he and a Democratic-controlled Congress went a step further with the Depository Institutions Deregulation and Monetary Control Act of 1980. The often-overlooked law removed Depression-era regulations on banking, including caps on interest rates. It also increased federal deposit insurance from $40,000 to $100,000 - a crucial security blanket for consumers, but also a lot more wiggle room for financial institutions. When it comes to taking risks with people's money, federally backed deposit insurance cuts both ways - as the next decade would show.
Along with the Marquette decision, it catalyzed an unlikely transformation away from the New Deal and its tight regulation of the financial games that led to the crash of 1929. Instead of "Brother, can you spare a dime?" the paradigm was shifting to "Honey, just pull out the plastic."
Charge One for the Gipper
If Reaganomics is known for one thing, it's the "trickle-down" theory.
An unfortunate intersection of "supply-side" economics and the ironically named Laffer curve, the theory goes something like this: radically cut taxes for the people at the very top, and their newfound windfall will trickle down through the economy until it finally floats everyone's boat. The Laffer curve - hastily (but appropriately) scrawled on a cocktail napkin - predicted something similar. The haphazard sketch by economist Arthur Laffer seemed to indicate that cutting taxes would unleash the forces of capital and create so much growth in the tax base that lower rates would actually translate into more tax revenue.
Basically, if you want to collect more taxes, you should cut the tax rate.
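In stylized form (a sketch of the logic, not Laffer's actual napkin math): write tax revenue as R(t) = t × B(t), where t is the tax rate and B(t) is the taxable base, which shrinks as rates climb. Revenue is zero at a rate of 0 percent and, in theory, zero again at 100 percent (why earn what will be entirely taxed away?), so somewhere in between sits a revenue-maximizing rate. The catch the napkin glossed over is that a tax cut raises revenue only if rates start out on the far side of that peak - and the curve itself can't tell you which side you're on.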
In the short term, trickle down worked. In concert with then-Federal Reserve chairman Paul Volcker's radical inflation-cutting measures and a sharp spike in "military Keynesianism" (pumping taxpayer money into the economy through government spending on the military-industrial complex), Reaganomics did generate "Morning in America" - the famous ad campaign devised by strategist Ed Rollins to sell four more years of the Gipper. But Reagan's opponent in the 1984 presidential election, former Vice President Walter Mondale, wasn't buying it.
Mondale's retort to "Morning in America" was that he, too, could buy the illusion of prosperity if he just went out and spent like a drunken sailor armed with a taxpayer-funded credit card. Mondale's message focused not only on the nakedness of the emperor, but also on the fact that the emperor just went down to the mall and bought an expensive designer suit he couldn't afford with a Visa card. Of course, Mondale lost in an epic drubbing. Suddenly flush Americans willingly gave Reagan all the credit - both literally and figuratively - for their rising fortunes.
And so it began.
The paper-thin economics of Reagan's "miracle" was built on plastic at both the federal and the household level. Credit (also known as "debt") drove the economy to new highs. It was a miraculous amount of unpaid spending that came due when the market crashed on October 19, 1987. "Black Monday" was fueled in part by newly unleashed "exotic" financial instruments that used borrowed money to place risky, highly speculative bets on the market. The Dow Jones lost more than 20 percent of its value in a single day of intense market panic. The crash halted a mad dash of financialized deal-making, and it ended the meteoric rise of the risky, high-yielding junk bonds that armed corporate raiders with the leverage they needed to artificially stoke stock prices before eventually dismantling their quarry.
In a portent of things to come, the financialized bubble finally burst.
The crash triggered a full-bore recession that took hold after George H.W. Bush replaced Reagan in the Oval Office. But there was another key factor driving the recession. It was an outgrowth of Reagan's effort to further advance Carter's deregulatory push by pulling down the barrier between banking and finance. Another brick was taken out of the regulatory wall on October 15, 1982.
During the signing ceremony for the Garn-St. Germain Depository Institutions Act of 1982, Reagan quipped quite inauspiciously, "All in all, I think we've hit the jackpot."
And did "they" ever.
Garn-St. Germain unleashed the mortgage-focused savings and loan industry in a bid to "revitalize the housing industry by strengthening the financial stability of home mortgage-lending institutions and ensuring the availability of home mortgage loans." Sounds vaguely familiar, right? Maybe a little like George W. Bush's "Ownership Society" or Bill Clinton's less-well-known "National Homeownership Strategy," perhaps?
The funny thing about the hyper-financialized economy is that old dogs don't need to learn completely new tricks. After deregulation, financial instruments simply become more "exotic" (i.e., newfangled and risky, like credit default swaps). Profit-seeking instrumentalists take an old, highly speculative trick and repackage it into something a little bit trickier. The only way to let old dogs practice their trickier tricks is to loosen their leashes. That's exactly what happened when Reagan signed the bill.
After Garn-St. Germain loosened up the mortgage business, the then-struggling savings and loan industry began a frantic competition for depositors' money. Thrifts even employed "deposit brokers" as sales agents who found high-yielding certificates of deposit (CDs) for customers - and earned a commission on each deal. Just two years later, the average one-year CD paid a shocking 9.59 percent as the newly freed savings and loan industry outcompeted banks for so-called "hot money."
But that "hot money" wasn't sleeping.
Rather quickly, the savings and loan industry went from defaulting and declining before deregulation to suddenly hot, fast and oh-so-loose with other people's money. Once Reagan let the dogs out, it was like a starting gun had gone off. The irony of Garn-St. Germain, which was supposed to stoke homeownership, was that the savings and loan industry actually dropped run-of-the-mill personal mortgages by selling them off (also known as financialization). According to an FDIC history of the debacle, thrifts used the "hot money" to invest in commercial "real estate, equity securities, service corporations and operating subsidiaries" - all with "virtually no limitations." Like bubbles to come, the unleashed industry turned to an unsustainable but self-reinforcing system of self-inflating wheeling and dealing.
At the heart of the bubble was a construction and investment boom across the West and Southwest. It started collapsing like a poorly made soufflé in 1986 and wasn't fully resolved - at a total cost of half a trillion dollars - until 1995. Interestingly enough, one of the triggers was the failure of Neil Bush's Silverado Banking, Savings and Loan Association. Luckily, the raising of the FDIC guarantee in 1980 from $40,000 to $100,000 softened the impact - luckily for both the depositors and the bankers. Had that guarantee not been raised, had more people lost more money and had their pain been deeper, perhaps a structural change would've been in the offing.
But that didn't happen.
After the Resolution Trust Corporation (RTC) was founded in 1989, it began sorting through the financialized fallout. In an effort to pay back insured depositors the $80 billion they lost, and to offset a $130 billion taxpayer-funded bailout, the RTC took all those hard assets - apartment complexes, office parks, high-rises and even office equipment - and liquidated them to recoup pennies on the dollar for Uncle Sam. Not coincidentally, some of the players who blew the bubble eventually circled back after it burst to gobble up the assets Uncle Sam was selling at bargain-basement prices.
Ironically, it was this specific bubble - and the overall failure of trickle down - that set the wealth-reallocating boom-and-bust cycle in motion. A frenzy of financialized bubble-blowing inevitably leads to a burst bubble. Then the suddenly cheap assets built up during the frenzy get hoarded for pennies on the dollar during a period of government bailouts. The economy wasn't trickling down; it was percolating upward.
After the crash of 2008, all that wealth the working and middle classes poured into the housing bubble "percolated up" to the lords of liquidity like private equity firm Blackstone. They dove into the flood of sharply discounted assets and bulk-purchased foreclosures. That surge of asset hoarding quickly turned a titanic private equity firm into a mega-landlord. And the biggest banks used government-supplied liquidity to gobble up the assets of their weakened competitors. Unlike the crash of 2008, the cost of the savings and loan bailout was "shared" with taxpayers through the FDIC's recently raised guarantee of $100,000 per depositor. Taxpayers had a dog directly in the fight. Also unlike the crash of 2008, the government actually prosecuted financial perpetrators, securing over 1,000 felony convictions and actual jail time for high-profile offenders.
But those checks on the system wouldn't last for long. The process of financialization was just getting started, and "too big to fail" wasn't yet "a thing."
Running Up the Bill
Bill Clinton's victory was famously all about the economy, stupid.
Like Carter, President Clinton was a Democrat and a former governor of a Southern state, and he faced the daunting task of lifting a sullen economy.
Unlike Carter, President Clinton took charge of an increasingly financialized economy with an expanding, post-usury law credit market and a populace growing ever more comfortable with household debt, even though they grew weary of Uncle Sam's unpaid spending.
During the 1992 campaign, Ross Perot's folksy admonitions and straightforward charts transformed government debt into the fiscal issue of the day. Taking a cue from Perot, President Clinton actually balanced Uncle Sam's books and, as he likes to remind people over and over again, he ushered in the "longest period of peacetime expansion in American history."
But what was that expansion made of, anyway?
A study by Demos published in 2003 looked back at the salad days of the Clinton economy and found that during his fondly remembered tenure the "average American family experienced a 53 percent increase in credit card debt, from $2,697 to $4,126," that low-income families experienced a "184 percent rise in their debt" and that even "high-income families had 28 percent more credit card debt in 2001 than they did in 1989."
The Demos study also tracked the credit industry's "aggressive" marketing push, which included a sharp rise in direct mail solicitations from 1.52 billion in 1993 to a forest-felling 5 billion in 2001. And the industry lowered monthly minimums from 5 percent to 2 percent, thus making it easier to carry debt for longer periods (which often means more profits). Set all that against the backdrop of Clinton era "free trade," high consumer spending and the loss of high-wage jobs to outsourcing, and it sure makes sense that the credit industry "tripled the amount of credit it offered customers from $777 billion to almost $3 trillion" by the time Clinton left office.
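To see what that change in minimums meant in practice, consider a rough simulation (hypothetical terms, not figures from the Demos study: a $2,000 balance at 18 percent APR with a $15 payment floor, and a cardholder who only ever pays the monthly minimum):

```python
# Rough sketch: how long a credit card balance lasts, and what it costs,
# when the cardholder pays only the monthly minimum.
# Hypothetical terms: $2,000 balance, 18% APR, $15 minimum-payment floor.

def payoff(balance, apr=0.18, minimum_rate=0.02, floor=15.0):
    months, interest_paid = 0, 0.0
    while balance > 0:
        interest = balance * apr / 12                 # one month's interest
        payment = max(balance * minimum_rate, floor)  # the "monthly minimum"
        payment = min(payment, balance + interest)    # final month: pay it off
        balance = balance + interest - payment
        interest_paid += interest
        months += 1
    return months, interest_paid

for rate in (0.05, 0.02):  # the old 5% minimum vs. the new 2% minimum
    months, interest = payoff(2000.0, minimum_rate=rate)
    print(f"{rate:.0%} minimum: {months / 12:.1f} years, "
          f"${interest:,.0f} in interest")
```

Under these assumed terms, dropping the minimum from 5 percent to 2 percent stretches repayment from roughly six years to more than two decades and multiplies the interest collected several times over - the "longer periods" and "more profits" in action.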
During this flood of credit (also known as debt), President Clinton busily cut deals with Republicans to reform welfare and to continue the deregulatory process of his predecessors. Not coincidentally, the income gap opened wider and wider. Many point to the Reagan years as the beginning of the end of the middle class. And it's true that the gap started to open during Reaganomics. But Reaganomics crashed, the stock market tanked in 1987 and the savings and loan bubble burst.
"Trickle down" was down and out.
So, was Reagan's bubble really the primary source of the troubling gap between "the haves" and "have less-and-lessers"?
As a graph featured in a report from the Center for American Progress shows, income inequality did spike under Reagan. But then it retreated in response to the crash of 1987 - until, that is, Bill Clinton came into office.
Additionally, the Economic Policy Institute tracked CEO compensation, which also began its much-discussed rise under Bill Clinton.
In fact, an overlooked part of Ross Perot's colorful critique of the US economy centered on disproportionately high, "rock-star-like" compensation for CEOs in the US versus CEOs in Japan and Europe. As he said during one of his famous 1992 political infomercials, "Trickle down economics didn't trickle down at all, folks!"
But that trend was only just beginning.
Ironically, and perhaps not coincidentally, the biggest climb in both income inequality and CEO compensation began right about the time Clinton famously declared that "the era of big government is over" in his 1996 State of the Union address. Also not coincidentally, the preceding years had seen the financialization of the credit card industry, which conspicuously coincided with the coming spike in income disparities. As Cornell professor Louis Hyman succinctly told Salon in 2012:
It also became cheaper to borrow so that during the early '90s from 1991 to 1996, during a five year period, nearly all the expansion in credit card debt was through securitization, when banks resell debt as credit card backed securities, just the way mortgages can be sold again as securities. Because it became cheaper to lend, banks could lend to poorer and poorer people with more vulnerable employment statuses because if these people defaulted, it wasn't as big of a deal.
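Here's a toy numerical illustration of the mechanism Hyman describes (all figures invented; real asset-backed deals involve trusts, tranches and servicing fees): the issuing bank pools card balances, sells claims on the pool to investors for cash up front, and walks away from the default risk.

```python
# Toy model of credit card securitization (illustrative figures only).
# The bank pools receivables, sells them to investors at a small discount,
# and frees up cash to lend again; investors now bear the default risk.

card_balances = [1200.0, 800.0, 3500.0, 2100.0, 650.0]  # hypothetical accounts
pool_face_value = sum(card_balances)                     # $8,250 of receivables

sale_discount = 0.03                      # investors pay 97 cents on the dollar
proceeds = pool_face_value * (1 - sale_discount)
print(f"Bank: receives ${proceeds:,.2f} today, keeps origination fees, "
      f"and can lend the cash out again.")

# If some cardholders default, the loss lands on the investors, not the bank -
# which is why riskier borrowers stopped being "as big of a deal" to lenders.
default_rate = 0.08                       # assume 8% of balances never repaid
collected = pool_face_value * (1 - default_rate)
print(f"Investors: paid ${proceeds:,.2f}, collect ${collected:,.2f} of "
      f"principal (plus interest along the way) and absorb the defaults.")
```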
Riskier bets on poorer people may not have been a "big deal" to the financial class. But the "big deals" being cut overseas were really, really big, both for US corporations and for the financial industry that helped credit-empowered consumers soak up the surging output from all the manufacturing being relocated to foreign shores. Even as Clinton clamped down on welfare, corporate captains eager to drive up their margins by driving down labor costs found a willing ally in the White House.
The Shipping State
The "free trade" agenda dominated President Clinton's tenure.
Thanks to Perot's debate with Vice President Al Gore, NAFTA is the most famous, or infamous, of the deals. But, as detailed in a Brookings Institution retrospective, Clinton's team also negotiated "the Uruguay Round, concluded WTO agreements on telecommunications, financial services, and information technology, launched negotiations toward the FTAA and a free trade agreement with Chile, negotiated a free trade agreement with Jordan, secured legislative approval of significantly expanded Caribbean trade preferences and a generous new trade preference program with Africa, and negotiated and won legislative approval for China's entry into the WTO."
That's a heady pace of deal-making.
The Clinton administration's wheeling-and-dealing economic policy kicked open the exit doors for US corporations seeking any and every opportunity to quickly stoke their stock prices with the easy cuts in overhead that come with outsourcing production and avoiding regulation.
The turning point came when Clinton decided to grant China "most favored nation" trade status in 1994. By contradicting a campaign promise and angering members of Congress unwilling to forgive China for Tiananmen Square, Clinton fueled the rise of soon-to-be ubiquitous big box stores as they filled their shelves with Chinese-made tchotchkes. And no box got bigger than Arkansas' own retailing titan - Walmart.
Coincidentally, Hillary Clinton served on Walmart's board of directors from 1986 to 1992.
At the end of his presidency, Clinton granted China permanent most favored nation status. He also paved the way for China to join the WTO shortly thereafter. By the end of the 1990s, "Made in Taiwan" and "Made in Japan" and, most importantly, "Made in the USA" were quickly replaced by "Made in China." Americans used ubiquitous credit cards to buy a flood of Chinese-made stuff. Middle-class wealth was exchanged for widgets made in Asia and drained off by the interest paid on those purchases to credit card companies.
And, as manufacturing fled at an even greater pace, the middle class poured even more of their wealth into Wall Street's mutual funds. Income inequality steadily grew, Wall Street blew another bubble and Clinton's "longest peacetime expansion" in US history looked more and more like the hollow Reagan revival.
But there was one key difference.
Ronald Reagan's bubble was blown by cutting taxes and by massive public expenditure on defense. Thus, the debt load was placed squarely on the back of Uncle Sam. And, like it or not, those Keynesian defense dollars did create good-paying jobs at home. And the debt wasn't being funneled through Chinese factories.
During the Clinton years, government debt declined, but the fuel - the debt - didn't really disappear. It just got shifted onto the backs of US consumers, who shouldered the burden of spending money they didn't really have in an economy lacking solid fundamentals in sustainable wage growth, investment in manufacturing and long-term (rather than financialized or speculative) profitability.
The growing gap between consumer aspirations and financial reality - detailed by the Demos study and predicted by Perot - was filled by a tidal wave of credit card offers that eventually saw millions of people holding six credit cards - at the same time! US household debt slowly replaced Uncle Sam's military-industrial deficits as the debt driver of the economy.
It was another epic paradigm shift - this time away from middle-class Americans' debt-averse history.
At the same time, US productivity was increasing, meaning US workers were doing more for the same or less compensation. And, for the first time, more and more "average" Americans put what would've otherwise been their rainy day funds and retirement savings into the financialized hands of Wall Street. That influx of middle-class cash to Wall Street helped blow the dot-com bubble.
Mutual funds, day traders and baby boomers obsessed with quarterly 401(k) statements chose short-term gains over fiscal restraint. This was also a period of perpetual fear about Social Security insolvency. The drumbeat from politicians and the media about the shaky future of entitlements - and the rise of whiz-bang "business news" cable channels - tempted people to join the investment party in an endless reach for yield.
And they did.
That's when Federal Reserve Chairman Alan Greenspan showed his mastery at managing the thin line between reality and fantasy. By today's standards, Greenspan's "low" interest rate policies don't look low: From 1994 to 2000, the Fed's benchmark rate oscillated between 5 percent and 6 percent. The initial jump from 3 percent to 6 percent was an effort to contain a superheating economy fueled by the beginnings of the first tech bubble, rising productivity and consumer credit. But his still historically low benchmark rates also made the stock market's rising fortunes look far more attractive than a savings account.
In retrospect, Greenspan's tenure was a high-wire act. He managed an unsustainable economy built on deregulation and financialization. By the time he left in 2006, Greenspan had seen US household debt hit a record $11.4 trillion, had watched US households spend a record 13.75 percent of after-tax income on servicing their debts and, according to a Washington Post story on his tenure, had lorded over a series of record spikes in the trade deficit that increased "Americans' indebtedness to foreigners."
Not quite the rosy scenario he's been peddling since 1988.
In spite of those trends, Americans generally have a rosy view of the 1990s. Unemployment dropped to a historically low 4.2 percent by 1999. And people went out and bought a lot of stuff. Yet the '90s were the launching pad for today's income inequality and the beginning of a credit card bonanza that saw the United States' credit card debt nearly triple from $238 billion to $692 billion and the number of bankruptcies spike by 125 percent.
Also contradicting the halcyon memories of the 1990s, the post-crash spike in CEO compensation still hasn't exceeded the lofty heights of enrichment enjoyed by corporate captains at the end of Bill Clinton's presidency. All told, the Clinton team's legacy of free trade, of deregulation (see also the Telecommunications Act of 1996) and of welfare "reform" moved Americans even further away from the New Deal's brakes on financialization.
At the end of his presidency, Clinton's team of Wall Street walkers - led by then-Treasury Secretary and future Citigroup millionaire Robert Rubin - closed the book on the New Deal when they spearheaded the effort to remove the last vestige of Great Depression prudence - the Glass-Steagall Act.
Actually, it was a one-two punch: The Gramm-Leach-Bliley Act of 1999 removed the barriers keeping banking, securities and insurance separate, and the Commodity Futures Modernization Act of 2000 removed regulatory checks on "over-the-counter" derivatives like credit default swaps. Before riding off into his sepia-toned sunset, Bill Clinton signed both into law.
Even as the dot-com bubble was beginning to blow up baby boomers' retirement dreams, Clinton's final deregulatory pen strokes chummed the waters for predatory loan sharks ready to swim into the troubled seas of the 21st century. Once 9/11 cemented the Bush presidency and security-minded Americans got locked into a "cycle of fear and consumption," it was just a matter of time until the next bubble popped up.
Three decades of financialization, a decade of intense marketing by unfettered credit providers and the perpetual reach for yield in an economy of high productivity but also stagnant wages took a brutal toll on the US middle and working classes. And for an emerging class of "too big to fail" financial titans, there was no place like home.
Copyright, Truthout. Reprinted with permission. May not be reprinted without permission.
JP Sottile is a freelance journalist, published historian, radio co-host and documentary filmmaker (The Warning, 2008). His credits include a stint on the NewsHour news desk, work at C-SPAN, and a turn as newsmagazine producer for ABC affiliate WJLA in Washington. His weekly show, Inside the Headlines w/ The Newsvandal, co-hosted by James Moore, airs every Friday on KRUU-FM in Fairfield, Iowa. He blogs under the pseudonym "the Newsvandal."