The Hoover Institution’s DC office held a short event this morning on “Rules for International Monetary Stability,” which highlighted papers from last year’s Hoover Institution Monetary Policy Conference. While much of the discussion devolved into the minutiae of the particular monetary rules that “should” be implemented, one thing stuck out to me that neither the audience nor the presenters seemed to pick up on. Professor Michael Bordo, in his presentation on monetary policy rules, mentioned on several occasions how well the international gold standard worked as a monetary policy rule from 1880 to 1914. However, he also stated that it “stopped working” after World War I.
But as George Selgin and others have pointed out, not only did it work well, it didn’t just “stop working” – it was done away with by governments the world over. The gold standard was a hindrance to government spending, so governments around the world decided to jettison it. That was not a fault of the gold standard; it was a feature, keeping governments from being able to print money ad infinitum. Once governments got off gold, all sorts of mischief ensued – bank holidays, successive devaluations, hyperinflation, etc. I was tempted to ask the presenters: “If the gold standard worked so well, why not use that as the monetary policy rule going forward?” You can hear the scoffing now, and the protestations that the gold standard is impractical and that’s why it was abandoned. But in reality, the gold standard is no different from the Taylor Rule or any other monetary policy rule – once it begins to handcuff the government’s ability to inflate its way out of a recession, it will be discarded. Fiscal dominance will always win out.
At the end of the day, discussions about central bank independence are moot. The success of any monetary policy rule, or indeed any monetary policy, depends on the willingness of both the government and the central bank to voluntarily set very limited boundaries for their own actions and to adhere to those boundaries. Once those boundaries have been crossed, the credibility of the government or the central bank to withdraw and retrench within those boundaries is gone. That’s what we face today. Central banks that have engaged in relentless quantitative easing, credit accommodation, and experimental negative interest rate policies cannot be trusted to return even to a pre-crisis monetary policy stance, let alone anything resembling a stable monetary policy rule.
With the announcement earlier this week that Federal Reserve Board of Governors member Daniel Tarullo will resign effective April 5, 2017, the Federal Open Market Committee (FOMC) will likely find itself in a highly unusual situation come April, one in which the regional Federal Reserve Bank Presidents on the FOMC outnumber the members of the Board of Governors. The Board of Governors of the Federal Reserve System has been operating with two vacancies for several years, following the resignations of Jeremy Stein and Sarah Bloom Raskin in 2014, and Tarullo’s resignation will bring that to three open positions.
Let’s recall the structure of the Fed’s Board of Governors. Each of the seven governors is appointed to a 14-year term. Terms are staggered so that one begins on February 1 of every even-numbered year and expires 14 years later, on January 31. So a new term began on February 1, 2016, another will begin on February 1, 2018, another on February 1, 2020, etc. The two current open terms are the one that began in 2016 and the one that will begin in 2018. Tarullo’s term expires January 31, 2022. A governor appointed to a full term may not be reappointed, but a governor appointed to fill the remainder of an unexpired term may be reappointed for another full term.
The two current openings mean that President Trump could appoint someone to the current unexpired term that expires January 31, 2018, then reappoint that person to a full term that expires January 31, 2032. He could also appoint someone to the unexpired term that began February 1, 2016 and expires January 31, 2030, and that person could then be reappointed in 2030 to serve until 2044. With Tarullo’s resignation, he could appoint someone to fill that unexpired term and, if he wins re-election in 2020, reappoint that person to serve until 2036. Finally, Vice Chairman Stanley Fischer’s term expires January 31, 2020, giving President Trump a fourth appointment opportunity until 2034. And, since Chairman Janet Yellen’s term as chairman expires in 2018 (her Board position expires in 2024), President Trump will also get to pick a new chairman next year.
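The term arithmetic above can be checked mechanically. Here is a minimal sketch in Python; the statutory dates follow the staggering rule described above, while the function names and the inferred term start years are my own illustration:

```python
from datetime import date

TERM_YEARS = 14  # Board of Governors terms run 14 years

def full_term(start_year: int) -> tuple[date, date]:
    """Return (start, expiry) for the full Board term beginning
    February 1 of the given even-numbered year."""
    assert start_year % 2 == 0, "full terms begin in even-numbered years"
    return date(start_year, 2, 1), date(start_year + TERM_YEARS, 1, 31)

def reappointment_expiry(unexpired_start_year: int) -> date:
    """Expiry if a governor finishes an unexpired term and is then
    reappointed to the full term that immediately follows it."""
    _, expiry = full_term(unexpired_start_year)
    # The next full term begins February 1 of the year the old one expires.
    _, next_expiry = full_term(expiry.year)
    return next_expiry

# Scenarios from the text (start years inferred from the quoted expiries):
print(full_term(2004)[1])          # seat expiring Jan 31, 2018
print(reappointment_expiry(2004))  # fill it, then a full term to Jan 31, 2032
print(reappointment_expiry(2016))  # the 2016 seat: serve until Jan 31, 2044
print(reappointment_expiry(2008))  # Tarullo's seat: until Jan 31, 2036
print(reappointment_expiry(2006))  # Fischer's seat: until Jan 31, 2034
```

Each computed expiry matches the dates quoted in the paragraph above.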
I am here today to talk about one of the most important, but also most overlooked, issues of our day: the relationship between central banking and total war. When you focus on central banking and the problems that result from it, it’s very easy to see how central banks enable larger, more centralized, and more pervasive governments. But it isn’t always easy for those who oppose war to see how central banks enable war. So I’ll go ahead and give you kind of the 10,000 foot view of the symbiotic relationship between central banking and war.
One of the primary activities that states engage in is fighting wars. But wars cost money. Armies march on their stomachs, and someone has to buy the necessary food and transport it. Weapons and armament cost money too, all of which has to be paid for. So where have kings and governments historically gotten that money from, particularly when their own treasuries ran out? As Willie Sutton could have told them – banks.
Banks developed initially as a means for merchants to store their funds safely and securely. But eventually those banks took in so much money that they got the idea to loan out some of those funds, hoping that they could juggle loans and receive enough payment on outstanding loans to satisfy demands for redemption by depositors. Thus was born fractional reserve banking, and with it the recourse to banks as lenders of money. Sure, kings could expropriate money from banks, but that only went so far. If a king continued to rob banks outright, they would eventually either hide their money or flee the kingdom, and the king would be left with no money to fight his wars. What developed instead was a relationship between banks and governments that has grown ever closer and more symbiotic over time.
On April 5, 1933, President Franklin D. Roosevelt issued Executive Order 6102, requiring all gold coin, gold bullion, and gold certificates to be surrendered to the Federal Reserve Banks or to banks that were members of the Federal Reserve System. With very limited exceptions, it was now illegal for Americans to own gold. This state of affairs lasted until 1975. It was the movement to legalize gold ownership in the United States that influenced a young doctor in Texas to make his first forays into politics. That young doctor is known and beloved by millions today: Dr. Ron Paul. Even out of something as evil as outright gold confiscation, something good came about.
Mount Washington Hotel, site of the Bretton Woods Conference. Image: Richard Hicks
In the aftermath of World War II, the United States cemented its position as the world’s largest and most powerful economy. The new international monetary order created at Bretton Woods, New Hampshire in 1944 was based in part on the gold-exchange standard of the 1920s, only with the dollar as the sole international reserve currency, since it was as good as gold. All countries tied their currencies to the dollar at fixed exchange rates, with the dollar being defined as FDR had left it, at 1/35 ounce of gold (i.e. $35 per ounce of gold). While individuals in the United States were still unable to own gold or to redeem their dollars for gold, foreign governments were able to cash in their dollars to the U.S. government and receive gold in return, a process that became known as the “gold window.” While the United States would pyramid its dollar issue on top of its gold reserves, other countries were supposed to hold dollars, and not gold, as their primary foreign exchange holdings.
The Federal Reserve’s monetary inflation throughout the mid- to late-1920s resulted, not surprisingly, in the Great Depression. As with any credit-induced economic boom, the newly created credit caused a distortion in the allocation of resources. Instead of economic growth resulting from increased real savings and investment, the boom of the 1920s was caused by an artificial increase of credit in the banking system by the Federal Reserve.
Whereas savings-induced growth aligns consumers’ present and future preferences, credit-induced growth does not. An artificial increase in credit allows banks to make more loans to businesses, and these increased loans signal to businesses that consumers are saving more in the present in order to consume more in the future. Businesses begin to undertake longer-term, more capital-intensive projects which, once they are finished, they find to be unsustainable because consumers either do not actually want them or cannot afford them because they have not saved enough money to purchase the goods. These resources have been malinvested, or invested badly, into sectors of the economy that do not actually serve the needs and wants of consumers. And it is not just one or two businesses which find themselves in such straits, but a whole slew of businesses, often across many different sectors of the economy.
The way out of a crisis had traditionally been to allow these malinvested resources to liquidate. Bad debts had to be liquidated so that prices could fall in order for markets to clear. In doing so, resources that were malinvested would be shifted to be used productively in other sectors. This was what was done during the Depression of 1920-21, in which President Harding refused to allow any sort of intervention by the federal government to alleviate the crisis. As we have seen, that crisis, although quite sharp, came to a quick end as the economy rebounded and returned to normal.
Today’s daily joke comes courtesy of Richmond Fed President Jeffrey Lacker, who according to MarketWatch stated at yesterday’s Cato Monetary Policy Conference that “History has demonstrated the gold standard is unworkable.” What he failed to mention, or perhaps what he fails to understand, is that it is not the gold standard that is unworkable, but the expectation that government will adhere to the gold standard. Remember that the gold standard did not fall away because it was inefficient or counterproductive; it was actively destroyed by governments that did not want to continue to be bound by its strictures. The gold standard provides a restraint on the growth of the size and scope of government, which is why rapacious governments sought to do away with it.
Today is Armistice Day, the anniversary of the final day of World War I. While we remember the lives of the many men lost in battle, we should also not forget the many other negative and long-lasting effects of the war. In just a few short years, European culture changed dramatically. At Christmas of 1914, troops from both sides declared ceasefires and met in no-man’s land to play football, exchange presents, and sing Christmas carols. By 1918, all that had changed, as the belligerents shelled each other up until the very last minute before the armistice. In 1914 the belligerents saw each other as fellow human beings and hoped for a quick end to the conflict. By 1918 they viewed each other as mortal enemies and had no qualms about engaging in pointless killing just for the sake of killing. Those cultural shifts were to reverberate throughout Western society and impact all aspects of life. Not surprisingly, those changes impacted the monetary system and resulted in something the effects of which continue to impact people around the world today: the collapse of the international gold standard.