
Is it time for the US to disengage the world from the dollar?

On Thursday of the week before last, the Financial Times published an op-ed piece I wrote arguing that Washington should take the lead in getting the world to abandon the dollar as the dominant reserve currency.  My basic argument is that every twenty to thirty years (whenever, it seems, American current account deficits surge) we hear dire warnings in the US and abroad about the end of the dollar’s dominance as the world’s reserve currency.  Needless to say, in the last few years these warnings have intensified to an almost feverish pitch.  In fact I discussed one such warning, by Barry Eichengreen, in an entry two months ago.

But these predictions are likely to be as wrong now as they have been in the past.  Reserve currency status is a global public good that comes with a cost, and people often forget that cost.

Just as importantly, as a public good it requires a number of characteristics.  At a minimum these include ample liquidity, central bank credibility, flexible domestic financial markets, minimal government or political intervention, and very deep and open domestic bond markets.  With the possible exception of the euro, which may or may not emerge over the next decade on a more rational footing than it currently has (albeit perhaps with more than one defection), no other currency has the characteristics that would plausibly allow it to serve the needs of the global economy.

And no other country, not even Europe, will be willing to pay the cost.  If there is any chance that the dollar’s status declines in the future, it will require that Washington itself take the lead in forcing the world gradually to disengage from the dollar.

Ironically, this is exactly what Washington should be doing.  Conspiracy theories notwithstanding, claims that the reserve status of the dollar unfairly benefits the US are no longer true, if they ever were.  On the contrary, the global use of the dollar has become bad for the US economy and, because of the global imbalances it permits, bad for the world.

During the first few decades of the postwar period, the cost of maintaining the dollar’s status could be justified by the incremental benefit to the US of a stable and growing world economy within Cold War constraints.  The trading benefits of a widely available global currency accrued to US allies, and the relative size of the US economy ensured that the costs were limited.  But beginning in the 1980s, trade policies in a number of countries abroad sharply raised the cost to the US, while the end of the Cold War limited the benefits.

This cost comes as a choice between rising unemployment and rising debt.  The mechanism is fairly straightforward.  Countries that seek to supercharge domestic growth by acquiring a larger share of global demand can do so by gaming the global system and actively stockpiling foreign currency, mainly in the form of, but not limited to, central bank reserves.   This allows them to forcibly accumulate domestic savings while relying on foreign demand to compensate for their own limited domestic demand.

Always dollars

In practice, dollar liquidity, limited Washington intervention, and the size and flexibility of US financial markets ensure that these countries always stockpile dollars.   There is no real alternative to the dollar, and most other governments would in any case actively discourage massive purchases of their own currencies because of the adverse trade impact.  If foreigners accumulated euros or yen at anywhere near the rate they accumulate dollars, they would force Europe and Japan into massive current account deficits, and neither Europe nor Japan has any interest in seeing this happen.

So foreign acquisition of dollars automatically forces the US into running a corresponding current account deficit, as foreign policies that constrain consumption at home require higher consumption abroad.   Active trade intervention in countries that engineer large trade surpluses, in other words, has to be accommodated by rising trade deficits in the US.
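To make the accounting explicit (this is the standard balance-of-payments identity, not anything specific to my argument), the current account and the capital account must offset each other:

current account balance + net capital inflows = 0

So if foreign central banks buy, say, $500 billion of US assets in a year, then, other things equal, net capital inflows into the US rise by $500 billion and the US current account deficit must widen by roughly the same amount, however that adjustment is distributed across exchange rates, trade volumes, and growth.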

This importing of US demand by other countries forces the US economy to respond in one of two ways.  Either American unemployment must rise as demand is diverted abroad and the tradable goods sector in the US shrinks, or Americans must counteract the employment impact by increasing domestic consumption or investment.

Without government intervention, there is no reason for domestic investment to rise in response to policies abroad.  On the contrary, with the diversion of domestic demand private investment may even decline.

So in order to limit the employment impact, capital flows into the US have to finance additional US consumption.  Americans, in other words, are forced to choose between higher unemployment and higher debt, and in the past the Federal Reserve has chosen to encourage higher debt.
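The same trade-off can be read off the standard national-income identity (my shorthand here, not anything from the FT piece):

GDP = consumption + investment + government spending + net exports

When dollar accumulation abroad pushes US net exports down, either GDP falls, which shows up as higher unemployment, or consumption, investment, or government spending must rise to fill the gap.  And because that offsetting spending is financed by the very capital inflows that created the gap, it shows up as higher household or government debt.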

But what about the benefits to the US of reserve currency status?  A lot of analysts argue that the predominance of the dollar gives the US two important advantages.  It reduces the cost of imports to American consumers and it lowers US government borrowing costs.

But both arguments are seriously flawed, I think.  Americans already over-consume, so it is hard to argue that they benefit in the aggregate from lower consumption costs, especially when these come at the expense of employment.  And anyway, if cheaper consumption is such a gift, it is hard to explain why attempts by the US to return the gift to countries whose consumption costs are artificially high (demanding, for example, that these countries revalue their currencies and so reduce costs for their own consumers) are always so indignantly rejected.

As for borrowing cheaply, what matters to a government’s borrowing cost, as countries like Switzerland clearly demonstrate, is not major reserve currency status but rather creditworthiness.  Because reserve currency status actually increases US borrowing, it is more likely to undermine the ability of the US Treasury to finance itself cheaply than the loss of reserve status would.

The supposed advantages of reserve currency status are simply the obverse of the cost.  As countries accumulate dollars, they force trade deficits onto the US, which the US can only manage by increasing borrowing.  This borrowing is financed by the foreign accumulation of dollars.

So will it happen?

The massive imbalances that this system has permitted are destabilizing for the world because they permit large and unstable debt buildups both in countries that over-produce, like China and Japan, and those that over-consume, like the US.   If the world were forced to give up the dollar, there is no doubt that there would be a cost – it would reduce global trade somewhat and it would probably spell the end of the Asian growth model – but it would also lower long-term economic costs for the US and reduce dangerous global imbalances.

The US, I would argue, should take the lead in shifting the world to multi-currency reserves in which the dollar is simply first among equals.  The cost of maintaining sole reserve currency status has simply grown too high over the past three decades and is leading inexorably to rising American debt and worrying global imbalances.

My Financial Times article got a lot more response than I expected.  Some of it came in other news sources, as in this CNN report, but there were also a lot of private responses, suggesting that more people than I had realized were thinking along the same lines. The most interesting of the responses was an email from Kenneth Austin accompanied by an article he had recently published in World Economics.  Austin is an international economist with the US Treasury Department and a professor at the University of Maryland.

His article was a fascinating re-reading of John Hobson’s theories on underconsumption (which I remembered from my graduate school days primarily because he had so profoundly influenced Lenin’s ideas on imperialism).  Hobson was a leading British economist of the late 19th and early 20th centuries and one of the major figures in the “underconsumptionist” school.  Here is Austin on the topic:

The basic idea is that oversaving causes insufficient demand for economic output. In turn, that leads to recession and resource misallocation, including excessive investment in marketing and distribution. This was a direct challenge to a core thesis of the classical economists: ‘Savings are always beneficial because they allow greater accumulation of capital.’

Keynes, almost 50 years after The Physiology of Industry, found there the essential ideas of the General Theory (first published 1936): the determinants of aggregate demand and the significance of savings–investment imbalances. In Chapter 23 of the General Theory, ‘Notes on Mercantilism etc.’, he lauded Hobson and Mummery for bringing the issue of excess saving to the fore. But Keynes disagreed with Hobson’s theory that excess saving leads to unnecessary investment. Instead, Keynes believed that ‘a relatively weak propensity to consume helps to cause unemployment by requiring and not receiving the accompaniment of a compensating volume of new investment’ (Keynes 1964, p. 370). Keynes attributed Hobson’s error to the lack of an independent theory of the rate of interest. Hobson (Mummery had died many years earlier) considered Keynes’ work the completion and vindication of his efforts.

Hobson took his excess savings theory in another direction in Imperialism: A Study, first published in 1902. In a closed economy, excess savings cause recessions, but an open economy has another alternative: domestic savers can invest abroad. Hobson attributed the renewed enthusiasm for colonial conquest among the industrial powers of the day to a need to find new foreign markets and investment opportunities. He called this need to vent the excess savings abroad ‘The Economic Taproot of Imperialism’.

However, increasing foreign investment requires earning the necessary foreign exchange to invest abroad. This requires an increase in net exports. So foreign investment solves two problems at once. It reduces the excess supply of goods and drains the pool of excess saving. The two objectives are simultaneously fulfilled because they are, in fact and theory, logically equivalent.*

Since industrialised countries tended to develop the same problem of excesses of savings over time, they could not solve the problem cooperatively among themselves. They needed to capture less developed economies to absorb the surplus savings and goods.

Sorry for quoting such a large chunk of Austin’s piece, but I found it a fascinating read and very relevant to understanding China.  We may seem to be straying from the topic of the role of the dollar, but basically Austin argues that under-consuming countries today are able to use the dollar for the same reason that European countries used colonialism in the past: as a way of exporting capital and importing foreign demand.
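The “logical equivalence” Austin refers to is, at bottom, just the savings–investment identity, a standard piece of national accounting rather than anything unique to Hobson or Austin:

savings – investment = the current account surplus (roughly, exports minus imports)

A country that vents its excess savings as net foreign investment must, by definition, run a matching trade surplus: draining the pool of excess savings and disposing of the excess supply of goods are two descriptions of the same flow.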

Postscript on black swans

On a totally different subject (and sorry for sounding grumpy), I wonder if I could propose a moratorium on the phrase “black swan.”  Although it once had a very limited but useful meaning, it has gradually become one of the most popular and meaningless phrases in financial markets.  Last night I watched a Niall Ferguson documentary where he kept warning about black-swan financial crises in places like Latin America.  Today I got a report that begins:

The financial markets have endured a flock of geopolitical “black swans” including the devastating earthquake and nuclear crisis in Japan, widespread revolution and violence in the Middle East and North Africa, and escalation of the European sovereign debt crisis. Incredibly domestic markets shrugged aside fear from each transformational event as stocks registered their best first quarter in over a decade led by a recovery in corporate earnings and job growth.

Black swans rarely fly in flocks, and not a single one of these is a black swan.  They are just shocks.  A black swan is an event that disproves a widely held hypothesis: in the original case, the widely believed hypothesis was that all swans are white.  The discovery of black swans in Australia disproved that hypothesis once and for all.

The Japanese earthquake, then, was not a black swan.  To have been a black swan would have required that, until last month, all of us believed that Japan was, for whatever reason, wholly immune to earthquakes.  In that case the devastating earthquake last month would have resoundingly overturned all of our deepest geological convictions and so it would have qualified as a black swan.  Since we know Japan is on a major fault, and everyone has been waiting for decades for the “big one” to hit Japan, the terrible earthquake that hit last month cannot possibly have overturned any hypotheses about Japan and earthquakes and so cannot have been a black swan.  The nuclear disaster it may have unleashed is also not a black swan, unless most of us truly believed that nuclear disasters are impossible in principle and in practice.

Financial crises are not black swans.  Revolutions in Egypt are not black swans.  If Portugal defaults, that will not be a black swan, and the fact that its government-bond spreads are widening is not even close to being a black swan.  If Portugal suddenly negotiated an agreement with Australia to become an Australian state, however, that might qualify as a black swan, because it would create a reality that none of us had previously thought possible and would dramatically change most of our ideas about possible resolutions to the debt crisis.   Widening credit spreads do not qualify.

Black swans almost never occur while shocks occur all the time, and there is no point confusing the two.  And by the way, while we are on the subject, we should probably also call for a moratorium on the phrase “tipping point”.  I can’t say how many times I have been asked to predict tipping points.  The whole point of the “tipping point” concept is that it is totally unpredictable.  It means that small incremental inputs will have no discernible effect on output until, suddenly and unexpectedly, a single additional tiny input causes a massive change in output.  The straw that broke the camel’s back is an example of a tipping point, and of course it is impossible to predict which straw will do that.

BY MICHAEL PETTIS
