Every few years, the idea of the gold standard becomes a hot topic. And why not? Gold is shiny and valuable, and people like it.
A gold standard means the value of a country’s currency is fixed to a specified amount of gold. Under a gold standard, the government must stand ready to buy and sell gold to anyone at that set price.
The gold standard has roots in ancient history: Gold was used to fund trade and finance wars. What would people accept in exchange for their labor or goods? They wanted something tangible and of lasting value. Gold was a good fit because of its limited supply and, frankly, because it was pretty. So, newly forming countries relied on the shiny stuff.
The U.S. was no different. Commercial banks and Federal Reserve banks had a gold reserve requirement. They had to keep reserves of gold in their vaults equal to a fraction of the money they issued.
“For every Federal Reserve dollar that was issued, the Reserve Bank had to have 40 cents worth of gold in its vault downstairs in the basement,” explained David Wheelock, vice president and deputy director of research.
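The 40 percent requirement Wheelock describes is simple arithmetic. As a minimal illustrative sketch (the function names and the exact ratio as a constant are ours, not an official Fed formula), the relationship between notes issued and gold held looks like this:

```python
RESERVE_RATIO = 0.40  # 40 cents of gold for every dollar of notes issued

def required_gold(notes_issued: float) -> float:
    """Gold a Reserve Bank had to hold against its outstanding notes."""
    return RESERVE_RATIO * notes_issued

def max_notes(gold_on_hand: float) -> float:
    """Maximum note issue supportable by a given stock of gold."""
    return gold_on_hand / RESERVE_RATIO

# $100 of notes requires $40 of gold; $40 of gold caps issuance at $100.
print(required_gold(100.0))  # 40.0
print(max_notes(40.0))       # 100.0
```

The second function shows the constraint in reverse: when gold drained out of bank vaults, the ceiling on how much money could circulate fell with it.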
And then the Great Depression hit. People hoarded gold instead of depositing it in banks, which created an international gold shortage. Countries around the world effectively ran out of gold and were forced off the gold standard. The U.S. abandoned the gold standard for domestic transactions in 1933 and ended international convertibility of the dollar to gold in 1971.
There are significant problems with tying a currency to the gold supply, as Wheelock noted:
“The U.S. mines a lot of gold, but we’re not the biggest producer,” Wheelock said. “The bigger suppliers of gold would have more control over our monetary policy, and there’s no reason to have it because we can get the advantages of the gold standard and avoid the disadvantages without being on a gold standard.”