Viewpoints

As the second quarter of 2025 draws to a close, we’re taking a different approach in this edition of Viewpoints. Much of what needed to be said about the current macroeconomic backdrop—and our positioning in response—has already been covered in prior letters. This time, we turn to the past.

Financial markets don’t exist in a vacuum. The distortions and dislocations we see today echo earlier periods in American financial history. Now feels like a fitting moment to revisit those episodes, not as academic exercises, but to better understand the forces shaping today’s world. Indeed, the history of the United States is, in no small measure, the history of its money.

From the First Bank of the United States to the inflationary 1970s to the 2008 crisis, history offers essential context. In the pages that follow, we aim to draw meaningful parallels—and lessons—from each.

The Foundations of Finance: Hamilton, Wildcat Banks, and the Gilded Age

American financial history begins with Hamilton’s bold post-Revolution proposals to restore national credit and modernize the economy. His ideas were controversial, but Congress adopted them. By 1791, U.S. credit was restored, revenue streams had stabilized, and a central bank was chartered.

Hamilton’s legacy is still relevant. First, he overcame deep skepticism toward central banking, a skepticism that persists today. Second, barely a year after tradeable securities first emerged, the young country’s first financial bubble burst in the Panic of 1792. Even then, markets reflected not just fundamentals, but the fallibility of human behavior.

In the decades that followed Hamilton’s framework, the country’s financial system evolved—but not always in a straight line. After the charter of the First Bank expired in 1811, and later the Second Bank in 1836, the United States entered a volatile period known as the Wildcat Banking Era. With no central authority to regulate credit or currency, state-chartered banks proliferated—often irresponsibly—fueling cycles of speculation, boom, and bust.

This era of locally issued, loosely backed currency bears an uncanny resemblance to today’s cryptocurrency landscape. Just as thousands of fly-by-night banks once printed dubious paper claims, we now find ourselves with tens of thousands of cryptocurrencies—many of them minted by questionable or entirely unregulated sources.

As the 19th century progressed, the excesses of the Wildcat era gave way to a new chapter: the Gilded Age. Marked by rapid industrialization, massive capital formation, and extraordinary technological innovation, this period also saw staggering inequality, rampant speculation, and widespread political corruption.

Railroads, steel, and oil created fortunes—and financial empires. But alongside the rise of tycoons like Vanderbilt, Carnegie, and Rockefeller came practices that would be familiar to any modern observer: monopolistic behavior, regulatory capture, and markets manipulated to benefit insiders. Meanwhile, retail investors often found themselves chasing trends, buying into bubbles, and left holding the bag.

The parallels to today are striking. A handful of technology firms now command immense economic and cultural power, dominating indexes and capital flows. Financial markets remain flush with capital, yet fragile under the surface. And as in the late 1800s, we’re witnessing rising populist backlash against institutions perceived as serving the few at the expense of the many.

The Crash, the Comeback, and the Crown  

As America entered the 1930s, the speculative excesses of the Roaring Twenties gave way to the crushing reality of the Great Depression. Equity markets collapsed, unemployment soared, and trust in both Wall Street and Washington evaporated. For the first time in modern history, the very survival of capitalism came into question.

This was not just a financial crisis; it was a systemic reckoning. Bank failures swept across the country, and the gold standard buckled under pressure. In response, the federal government launched an unprecedented expansion of its role in economic life. From the creation of the SEC to deposit insurance and Social Security, many of the institutions we now take for granted were born in the crucible of that crisis.

But the 1930s also brought a surge in political polarization. With confidence in institutions shattered, populist figures emerged across the political spectrum—some calling for sweeping government overhauls, others railing against the system entirely. FDR’s New Deal drew fierce support and equally fierce opposition. Elsewhere in the world, economic despair paved the way for authoritarian ideologies to take root.

Then, as now, the social fabric strained under the weight of economic dislocation, rising inequality, and cultural division. Many of the debates from that era—over the proper role of government, the concentration of economic power, and the fragility of democratic norms—resonate with renewed urgency today.

Out of the chaos of the 1930s emerged not only a new domestic order, but eventually a new global one. As World War II drew to a close, Allied leaders sought to prevent the kind of competitive devaluations, trade breakdowns, and monetary instability that had deepened the Depression.

In 1944, the Bretton Woods Agreement created a new international monetary system, anchored by the U.S. dollar. Under the agreement, global currencies were pegged to the dollar, which in turn was convertible into gold at a fixed $35 per ounce. This marked the beginning of America’s reign as the world’s financial hegemon.

The dollar’s supremacy wasn’t just a reflection of military might; it was a symbol of economic resilience and institutional credibility. In an unstable world, the U.S. offered a stable anchor. Capital flowed into American markets, and Treasury debt became the ultimate safe haven. A new era of global finance was born, one that we still live in today, though not without cracks showing in the foundation.

But from the outset, a structural tension lay beneath the surface: the U.S. was tasked with providing the world with dollars to support trade and liquidity, while also maintaining the dollar’s convertibility to gold. This contradiction—known as the Triffin Dilemma—meant that the very act of fulfilling global demand for dollars would eventually undermine the system’s credibility. It was a flaw embedded in the blueprint, and it would come to a head by the early 1970s. Even today, long after the gold link was severed, the core tension remains: the global system depends on a steady supply of dollars, yet questions persist about the sustainability of the debt and deficits required by the U.S. to provide them.
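The arithmetic of the dilemma is worth sketching, if only in stylized form. Suppose the U.S. gold stock is fixed at roughly $25 billion (its approximate postwar level at the official $35 per ounce), while dollar claims held abroad grow every year alongside world trade. The ratio of claims to gold rises relentlessly, and once foreign claims exceed the gold stock, as they did by the mid-1960s, full convertibility becomes a promise that cannot survive a genuine test. The figures here are rounded for illustration, but the direction of travel was built into the system itself.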

The Lost Decade: Inflation, Stagnation, and Uncertainty 

The fragile balance underpinning the Bretton Woods system finally unraveled in 1971, when President Nixon suspended the dollar’s convertibility into gold. What followed was a decade of economic disorientation. The U.S. had severed the final tether between currency and a hard asset, and the world was now operating on a fully fiat system for the first time in modern history.

The 1970s brought a toxic mix of inflation, stagnant growth, and repeated energy shocks. Traditional economic models broke down. The once-reliable tradeoff between inflation and unemployment collapsed. Oil embargoes and geopolitical instability exacerbated supply constraints, while loose fiscal and monetary policy fanned the flames of rising prices.

Policymakers struggled to adapt. The Federal Reserve found itself behind the curve, unwilling or unable to raise rates decisively until the end of the decade. Inflation became a psychological force, embedded in expectations, wages, and behavior.

Investors, too, were left unmoored. Real returns evaporated, and portfolios built on the assumptions of the postwar boom faltered. Equity markets spent much of the decade chopping sideways in nominal terms, and falling in real terms, offering little refuge for capital. Valuations contracted meaningfully: the average price-to-earnings (P/E) ratio of the S&P 500, which had climbed above 20 during the 1960s, fell into the single digits by the early 1980s. As inflation ate into margins and sentiment deteriorated, investors simply weren’t willing to pay up for uncertain earnings. This was an era not only of economic stagnation, but of valuation compression, a pattern we’ve seen repeat in the aftermath of other speculative peaks.
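The arithmetic behind those evaporating real returns is worth making concrete. Using stylized numbers: a portfolio earning 7% a year in nominal terms against 9% inflation delivers a real return of (1.07 / 1.09) − 1, or roughly −1.8% per year. Compounded over a decade, that seemingly modest gap erodes about 17% of purchasing power, and that is before taxes on the nominal gains. Statements showed growth; spending power told a different story.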

It was a period defined by uncertainty, inflation, and a loss of public faith in institutions—economic and political alike. Sound familiar?

It ultimately took extraordinary action to break the back of inflation. In the early 1980s, Federal Reserve Chairman Paul Volcker pushed short-term interest rates into the double digits, with the federal funds rate peaking near 20%, an act of monetary shock therapy that triggered a deep recession but succeeded in restoring price stability and central bank credibility.

From that moment forward, a new era began: one defined by disinflation, globalization, and steadily declining interest rates. Over the next four decades, yields fell almost without interruption, from north of 15% in 1981 to near zero by the time of the COVID pandemic. For both public and private sectors, borrowing became cheaper, leverage more attractive, and risk-taking more easily justified.
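The mechanics are simple but powerful. As a stylized illustration, consider a perpetual stream of $1 of annual income. Discounted at 15%, it is worth about $6.67 ($1 / 0.15); discounted at 2%, it is worth $50 ($1 / 0.02). Nothing about the income changed, only the rate used to value it, yet its price rose more than sevenfold. Four decades of falling rates applied that same logic, in varying degrees, to stocks, bonds, real estate, and private assets alike.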

But that long glide path downward—while fueling asset prices and financial innovation—also laid the groundwork for future excess. Nowhere was this more evident than in the 2008 Global Financial Crisis.

After the Fall: 2008 and the World That Followed

By the early 2000s, the long arc of falling interest rates had nurtured a false sense of stability. Risk appeared manageable, volatility subdued, and financial engineering promised to smooth out the rough edges of capitalism. But beneath the surface, excess was building, quietly and systematically. P/E ratios again reached elevated levels in the years leading up to the crisis, with valuation expansion driven more by falling interest rates and leverage than by durable earnings growth. In hindsight, it was a familiar setup: high asset prices resting on unstable foundations, and a market that asked too much precision of its financial models and too little humility of its participants.

The 2008 Global Financial Crisis was, at its core, a failure of risk perception. Complex derivatives, structured credit, and layers of financial innovation were stacked atop a housing market that was fundamentally unsound. Models told investors these products were safe, and rating agencies blessed them. But when housing prices reversed and liquidity vanished, the entire edifice collapsed under its own complexity.
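A simplified example shows how the models failed. Assume, purely for illustration, a security backed by 100 mortgages, each with a 5% chance of default, structured so that a senior tranche takes losses only after 20 loans fail. If defaults are independent, the odds of 20 or more failures are vanishingly small, and the tranche looks nearly riskless. But if defaults are driven by a common factor, such as a nationwide decline in home prices, the tranche can fail whenever that shared event occurs, a probability closer to 5% than to zero. The securities were rated as if the first world applied; 2008 revealed the second.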

What followed was a decade and a half of truly unprecedented monetary intervention: zero interest rates, quantitative easing, and trillions in central bank balance sheet expansion. While these policies were designed to stabilize the system—and in many ways succeeded—they also created new distortions. In a world starved for yield, capital chased alternatives like private equity, speculative tech names with no earnings, bitcoin, and other high-risk assets.

Now, just as in 2008, faith in complex structures and supposedly savvy managers has often supplanted fundamental scrutiny. The move to zero interest rates didn’t merely rescue the economy—it fundamentally reshaped capital markets, fostering behavior built on the assumption that liquidity would always be abundant and free.

Today, P/E ratios remain historically elevated, even as margins begin to face pressure from rising input costs, labor demands, and tariff uncertainty. As in past cycles, valuation alone is not predictive, but it is instructive. Periods of elevated multiples rarely persist indefinitely, especially when the forces that inflated them (cheap capital, global disinflation, financial engineering) begin to unwind. We now find ourselves in the aftermath of that era. Interest rates have risen, liquidity is tightening, and the tide is slowly rolling back. Under such conditions, history’s pattern is consistent: long sideways markets, like those of the 1970s after the high multiples of the 1960s, have followed each of the past century’s major secular bull markets. As Warren Buffett famously put it, “only when the tide goes out do you discover who’s been swimming naked.”
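How much can a receding tide matter? In stylized terms: a market that trades at 25 times earnings and drifts over ten years toward a long-run average closer to 16 times gives up roughly 4.4% per year to that re-rating alone, since (16 / 25)^(1/10) − 1 is about −4.4%, even if earnings keep growing. Starting valuations say little about next year’s return, but they weigh heavily on the next decade’s.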

In Conclusion

None of these periods were identical. The circumstances, players, and outcomes varied. But they rhyme, in revealing and at times unsettling ways. Today, we are once again grappling with many of the same forces that shaped those earlier chapters: excessive debt burdens, rising wealth inequality, declining trust in institutions, and the rise of populist leaders on both ends of the spectrum. Financial markets, meanwhile, are contending with the aftershocks of a zero-interest-rate world, just as earlier generations struggled to adjust to the unraveling of their own economic regimes.

History may not offer a precise roadmap, but it does offer perspective. And in markets—where emotion so often outruns logic—perspective can be one of the most valuable assets we hold. By understanding the past, we not only gain clarity about the present—we also cultivate confidence in the future.

For all its challenges, America has endured. Time and again, it has restructured, recalibrated, and emerged stronger. That legacy of resilience remains one of our greatest long-term advantages.

The American system, it has often been said, is a perpetual miracle: it sometimes stumbles, but it never falls.

Michael P. Moeller, CIMA®

Portfolio Manager & Director of Research