Category Archive for: Macroeconomics

Sunday, July 26, 2015

'The F Story about the Great Inflation'

Simon Wren-Lewis:

The F story about the Great Inflation: Here F could stand for folk. The story that is often told by economists to their students goes as follows. After Phillips discovered his curve, which relates inflation to unemployment, Samuelson and Solow in 1960 suggested this implied a trade-off that policymakers could use. They could permanently have a bit less unemployment at the cost of a bit more inflation. Policymakers took up that option, but then could not understand why inflation didn’t just go up a bit, but kept on going up and up. Along came Milton Friedman to the rescue, who in a 1968 presidential address argued that inflation also depended on inflation expectations, which meant the long run Phillips curve was vertical and there was no permanent inflation unemployment trade-off. Policymakers then saw the light, and the steady rise in inflation seen in the 1960s and 1970s came to an end.
This is a neat little story, particularly if you like the idea that all great macroeconomic disasters stem from errors in mainstream macroeconomics. However even a half-awake student should spot one small difficulty with this tale. Why did it take over 10 years for Friedman’s wisdom to be adopted by policymakers, while Samuelson and Solow’s alleged mistake seems to have been adopted quickly? Even if you think that the inflation problem only really started in the 1970s, that imparts a 10-year lag into the knowledge transmission mechanism, which is a little strange.
However, none of that matters, because this folk story is simply untrue. There has been some discussion of this in blogs (by Robert Waldmann in particular - see Mark Thoma here), and the best source on this is another F: James Forder. There are papers (e.g. here), but the most comprehensive source is now his book, which presents an exhaustive study of this folk story. It is, he argues, untrue in every respect. Not only did Samuelson and Solow not argue that there was a permanent inflation-unemployment trade-off that policymakers could exploit, but policymakers never believed there was such a trade-off. So how did this folk story arise? Quite simply from another F: Friedman himself, in his Nobel Prize lecture in 1977.
Forder discusses much else in his book, including the extent to which Friedman’s 1968 emphasis on the importance of expectations was particularly original (it wasn’t). He also describes how and why he thinks Friedman’s story became so embedded that it became folklore....
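The accelerationist mechanism at the heart of the story (inflation depends on expected inflation, so the long-run Phillips curve is vertical) can be sketched in a few lines. This is a toy illustration of ours, with made-up parameter values, not anything taken from the post:

```python
# Expectations-augmented Phillips curve with adaptive expectations:
#   pi_t = pi_expected_t - beta * (u_t - u_star),  pi_expected_t = pi_{t-1}.
# Holding unemployment below the natural rate u_star makes inflation
# ratchet up every period: there is no permanent trade-off.

def simulate_inflation(u_target, u_star=5.0, beta=0.5, pi0=2.0, periods=10):
    """Inflation path when policy pins unemployment at u_target."""
    path = [pi0]
    for _ in range(periods):
        pi_expected = path[-1]  # adaptive expectations: last period's inflation
        path.append(pi_expected - beta * (u_target - u_star))
    return path

below = simulate_inflation(u_target=4.0)    # one point below the natural rate
at_star = simulate_inflation(u_target=5.0)  # at the natural rate

print(below[-1])    # inflation has ratcheted up from 2% to 7%
print(at_star[-1])  # inflation stays at 2%
```

Holding the unemployment gap open buys nothing permanent: inflation rises by beta percentage points per period for as long as the gap is maintained, which is the vertical long-run Phillips curve of Friedman's 1968 address.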

Friday, July 24, 2015

Paul Krugman: The M.I.T. Gang

The MIT school of economics:

The M.I.T. Gang, by Paul Krugman, Commentary, NY Times: Goodbye, Chicago boys. Hello, M.I.T. gang.

If you don’t know what I’m talking about, the term “Chicago boys” was originally used to refer to Latin American economists, trained at the University of Chicago, who took radical free-market ideology back to their home countries. The influence of these economists was part of a broader phenomenon: The 1970s and 1980s were an era of ascendancy for laissez-faire economic ideas and the Chicago school...

But that was a long time ago. Now a different school is in the ascendant, and deservedly so.

It’s actually surprising how little media attention has been given to the dominance of M.I.T.-trained economists in policy positions and policy discourse. But it’s quite remarkable. Ben Bernanke has an M.I.T. Ph.D.; so do Mario Draghi, the president of the European Central Bank, and Olivier Blanchard, the enormously influential chief economist of the International Monetary Fund. Mr. Blanchard is retiring, but his replacement, Maurice Obstfeld, is another M.I.T. guy — and another student of Stanley Fischer, who taught at M.I.T. for many years and is now the Fed’s vice chairman. ...

M.I.T.-trained economists, especially Ph.D.s from the 1970s, play an outsized role ... in policy discussion across the Western world. And yes, I’m part of the same gang.

So what distinguishes M.I.T. economics, and why does it matter? ...

At M.I.T..., Keynes never went away. To be sure, stagflation showed that there were limits to what policy can do. But students continued to learn about the imperfections of markets and the role that monetary and fiscal policy can play in boosting a depressed economy. ...

This open-minded, pragmatic approach was overwhelmingly vindicated after crisis struck in 2008. Chicago-school types warned incessantly that responding to the crisis by printing money and running deficits would lead to 70s-type stagflation, with soaring inflation and interest rates. But M.I.T. types predicted, correctly, that inflation and interest rates would stay low in a depressed economy, and that attempts to slash deficits too soon would deepen the slump. ...

Meanwhile, in the United States, Republicans have responded to the utter failure of free-market orthodoxy and the remarkably successful predictions of much-hated Keynesians by digging in even deeper, determined to learn nothing from experience.

In other words, being right isn’t necessarily enough to change the world. But it’s still better to be right than to be wrong, and M.I.T.-style economics, with its pragmatic openness to evidence, has been very right indeed.

Sunday, July 19, 2015

The Rivals (Samuelson and Friedman)

This is by David Warsh:

The Rivals, Economic Principals: When Keynes died, in April 1946, The Times of London gave him the best farewell since Nelson after Trafalgar: “To find an economist of comparable influence one would have to go back to Adam Smith.” A few years later, Alvin Hansen, of Harvard University, Keynes’ leading disciple in the United States, wrote, “It may be a little too early to claim that, along with Darwin’s Origin of Species and Marx’s Capital, The General Theory is one of the most significant books which have appeared in the last hundred years. … But… it continues to gain in importance.”
In fact, the influence of Keynes’ book, as opposed to the vision of “macroeconomics” at the heart of it, and the penumbra of fame surrounding it, already had begun its downward arc. Civilians continued to read the book, more for its often sparkling prose than for the clarity of its argument. Among economists, intermediaries and translators had emerged in various communities to explain the insights the great man had sought to convey. Speaking of the group in Cambridge, Massachusetts, Robert Solow put it this way, many years later: “We learned not as much from it – it was…almost unreadable – as from a number of explanatory articles that appeared on all our graduate school reading lists.”
Instead it was another book that ushered in an era of economics very different from the age before. Foundations of Economic Analysis, by Paul A. Samuelson, important parts of it written as much as ten years before, appeared in 1947. “Mathematics is a Language,” proclaimed its frontispiece; equations dominated nearly every page. “It might be still too early to tell how the discoveries of the 1930s would pan out,” Samuelson wrote delicately in the introduction, but their value could be ascertained only by expressing them in mathematical models whose properties could be thoroughly explored and tested. “The laborious literary working-over of essentially simple mathematical concepts such as is characteristic of much of modern economic theory is not only unrewarding from the standpoint of advancing the science, but involves as well mental gymnastics of a particularly depraved type.”
Foundations had won a prize as a dissertation, so Harvard University was required to publish it as a book. In Samuelson’s telling, the department chairman had to be forced to agree to printing a thousand copies, dragged his feet, and then permitted its laboriously hand-set plates to be melted down for other uses after 887 copies were run off. Thus Foundations couldn’t be revised in subsequent printings, until a humbled Harvard University Press republished an “enlarged edition” with a new introduction and a mathematical appendix in 1983. When Samuelson biographer Roger Backhouse went through the various archival records, he concluded that the delay could be explained by production difficulties and the recycling of the lead type occasioned by postwar exigencies at the Press.
It didn’t matter. With the profession, Samuelson soon would win the day.
The “new” economics that he represented – the earliest developments had commenced in the years after World War I – conquered the profession, high and low. The next year Samuelson published an introductory textbook, Economics, to inculcate the young. Macroeconomic theory was to be put to work to damp the business cycle and, especially, avoid the tragedy of another Great Depression. The new approach swiftly attracted a community away from alternative modes of inquiry, in the expectation that it would yield new solutions to the pressing problem of depression-prevention. Alfred Marshall’s Principles of Economics eventually would be swept completely off the table. Foundations was a paradigm in the Kuhnian sense.
At the very zenith of Samuelson’s success, another sort of book appeared, in 1962, A Monetary History of the United States, 1869-1960, by Milton Friedman and Anna Schwartz, published by the National Bureau of Economic Research. At first glance, the two books had nothing to do with one another. A Monetary History harkened back to approaches that had been displaced by Samuelsonian methods – “hypotheses” instead of theorems; charts instead of models; narrative, not econometric analytics. The volume did little to change the language that Samuelson had established. Indeed, economists at the University of Chicago, Friedman’s stronghold, were on the verge of adapting a new, still-higher mathematical style to the general equilibrium approach that Samuelson had pioneered.
Yet one interpretation of the relationship between the price system and the Daedalean wings that A Monetary History contained was sufficiently striking as to reopen a question thought to have been settled. A chapter of their book, “The Great Contraction,” contained an interpretation of the origins of the Great Depression that gradually came to overshadow the rest. As J. Daniel Hammond has written,
The “Great Contraction” marked a watershed in thinking about the greatest economic calamity in modern times. Until Friedman and Schwartz provoked the interest of economists by rehabilitating monetary history and theory, neither economic theorists nor economic historians devoted as much attention to the Depression as historians.
So you could say that some part of the basic agenda of the next fifty years was ordained by the rivalry that began in the hour that Samuelson and Friedman became aware of each other, perhaps in the autumn of 1932, when both turned up at the recently completed Social Science Research Building of the University of Chicago, at the bottom of the Great Depression. Excellent historians, with access to extensive archives, have been working on both men’s lives and work: Hammond, of Wake Forest University, has largely completed his project on Friedman; Backhouse, of the University of Birmingham, is finishing a volume on Samuelson’s early years. Neither author has yet come across a frank recollection by either man of those first few meetings. Let’s hope one or more second-hand accounts turn up in the papers of the many men and women who knew them then. When I asked Friedman about their relationship in 2005, he deferred to his wife, who, somewhat uncomfortably, mentioned a differential in privilege. I lacked the temerity to ask Samuelson directly the last couple of times we talked; he clearly didn’t enjoy discussing it.
Biography is no substitute for history, much less for theory and history of thought, and journalism is, at best, only a provisional substitute for biography. But one way of understanding what happened in economics in the twentieth century is to view it as an argument between Samuelson and Friedman that lasted nearly eighty years, until one aspect of it, at least, was resolved by the financial crisis of 2008. The departments of economics they founded in Cambridge and Chicago, headquarters in the long wars between the Keynesians and the monetarists, came to be the Athens and Sparta of their day. ...[continue reading]...

[There is much, much more in the full post.]

Saturday, July 04, 2015

'Stability of a Market Economy'

"The macroeconomy is inherently unstable and ... booms and busts arise endogenously as the results of market incentives":

Stability of a market economy, by Paul Beaudry, Dana Galizia, and Franck Portier, Vox EU: There are two polar views about the functioning of a market economy.
  • On the one hand, there is the view that such a system is inherently stable, with market forces tending to direct the economy to a smooth growth path.

According to such a belief, most of the fluctuations in the macroeconomy result from either individually optimal adjustments to changes in the environment or from improper government interventions. In such a case, the role of macroeconomic policy should be to do no harm; if policymakers hold back from actively influencing the economy, market forces would take care of the rest and foster desirable outcomes.

  • On the other hand, there is the view that the market economy is inherently unstable, and that left to itself it will repeatedly go through periods of socially costly booms and busts, with recurrent periods of sustained high levels of unemployment.

According to this view, macroeconomic policy is needed to help stabilize an unruly system.    

Most modern macroeconomic models, such as those used by large central banks and governments, are somewhere in between these two extremes. However, they are by design much closer to the first view than the second, and this is generally not fully appreciated. In fact, most commonly used macroeconomic models have the feature that, in the absence of outside disturbances, the economy is expected to converge to a stable path. In this sense, these models are based on the premise that a decentralized economy is a stable system and that market forces, in and of themselves, do not tend to produce booms and busts. The only reason we see economic cycles in mainstream macroeconomic models is that outside forces perturb an otherwise stable system. We can call such a framework the stable-with-shocks view of the macroeconomy.
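A minimal sketch of what this premise means in practice (our illustration, with arbitrary parameter values, not any particular central-bank model): a stable linear process that returns to its steady state on its own, and fluctuates only while shocks keep arriving.

```python
import random

# Stable-with-shocks caricature: deviations from trend follow
#   x_{t+1} = rho * x_t + shock,  with |rho| < 1.
# Absent shocks, any initial deviation dies out; cycles exist only
# because outside disturbances keep perturbing the system.

def simulate(rho=0.8, x0=10.0, periods=50, shock_sd=0.0, seed=0):
    rng = random.Random(seed)
    x, path = x0, [x0]
    for _ in range(periods):
        x = rho * x + rng.gauss(0.0, shock_sd)
        path.append(x)
    return path

no_shocks = simulate(shock_sd=0.0)
print(round(no_shocks[-1], 3))  # essentially zero: back at the steady state

with_shocks = simulate(shock_sd=1.0)  # shocks sustain the fluctuations
```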

Stable-with-shocks view of the macroeconomy

There are many reasons why the economic profession has mostly adopted the stable-with-shocks view of macroeconomic fluctuations.

  • First, if we take a step back, and look at aggregate economic outcomes over long periods of time (say 100 years), the most striking feature is the stable growth path (see Figure 1).

Disregarding the two world wars, the economy fluctuated, but these fluctuations were small in comparison to the growth path. In particular, when looking over such long periods, it becomes clear that the economy looks more like a globally stable system than an unstable one.

  • Secondly, a huge fraction of economic theory suggests that market forces will favor stable outcomes. 
  • Thirdly, the stable-with-shocks framework is very tractable and flexible, allowing one to analyze economic outcomes using linear techniques.

Figure 1. Long-run evolution of GDP per capita


Source: Bolt and van Zanden (2014).

Figure 2. Unemployment rates


Source: FRED, Federal Reserve Bank of St. Louis.

Notwithstanding these attractive features of the stable-with-shocks view of the macroeconomy, the ubiquitous and recurrent nature of cycles in most market economies, as illustrated by the fluctuations of unemployment rates (see Figure 2), strongly suggests that a market economy, by its very nature, may create recurrent booms and busts independently of outside disturbances. This idea is well captured by the statement that “a bust sows the seed of the next boom”. Although such an idea has a long tradition in the economics literature (Kalecki 1937, Kaldor 1940, Hicks 1950, Goodwin 1951), it is not present in most modern macro-models.

Capturing economic fluctuations: New framework

In a companion paper (Beaudry et al. 2015), we have developed and explored an empirical framework that allows one to examine whether economic fluctuations may best be captured by the stable-with-shocks type framework or whether they may be better characterized as reflecting some sort of instability. To examine such an issue, one needs to depart from the preponderant convention in macroeconomics of focusing on linear models to analyze outcomes. A frequent criticism of macro-modelling, mostly from non-mainstream macroeconomists, is that the profession’s focus on linear models may have substantially biased our understanding of how the economy actually functions. As Blanchard (2014) writes, “We in the field did think of the economy as roughly linear, constantly subject to different shocks, constantly fluctuating, but naturally returning to its steady state over time.”

Within a linear set-up, a dynamic system is either stable or unstable. In contrast, in a non-linear setup, a system can be globally stable while simultaneously being locally unstable. It is this latter characteristic that has the potential to be relevant in macroeconomics given that in the long run the economy appears rather stable, while in the short run it exhibits substantial volatility. By looking at the economy through a lens that allows for the possibility of non-linear dynamics, one is de facto permitting an interpretation of the economic fluctuations where endogenous cyclical behavior or even chaos may emerge; both features that are well known to arise in many dynamic environments. In other words, by looking at the economy using non-linear techniques we can ask whether market forces are tending to favor recurrent booms and busts, or whether they favor stability.
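A standard way to see how a system can be locally unstable yet globally stable is a limit cycle. The toy dynamics below (ours, not the authors' estimated model) have a steady state that repels nearby paths while every path is drawn onto a cycle of radius one, so fluctuations are endogenous and recurrent:

```python
# In polar coordinates, consider r' = r * (1 - r^2) (with the angle
# rotating at a constant rate). The steady state r = 0 is locally
# unstable; the limit cycle r = 1 attracts every other path, so the
# system is globally stable yet never settles down.

def radius_path(r0, dt=0.01, steps=2000):
    """Euler-integrate r' = r(1 - r^2) and return the final radius."""
    r = r0
    for _ in range(steps):
        r += dt * r * (1.0 - r * r)
    return r

print(round(radius_path(0.1), 3))  # a small deviation grows out toward 1
print(round(radius_path(2.0), 3))  # a large deviation shrinks back toward 1
```

No linearization of this system can capture both facts at once: around the steady state it looks explosive, while globally it is bounded and cyclical.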

Our main finding is that, instead of favoring the conventional stable-with-shocks view for aggregate dynamics, our results suggest that the macroeconomy is inherently unstable and that booms and busts arise endogenously as the results of market incentives.

In fact, we found that for the US economy, market forces tend in and of themselves to generate a cycle that lasts about eight years. However, these cycles are not regular or identical over time. Instead, outside forces play an important role in accelerating, amplifying, and postponing the forces that create cycles.

What causes business cycles?

So what causes the economy to be unstable and exhibit business cycles? According to our analysis, this results from simple incentives that favour the coordination of behavior across households. In particular, in a market economy where individuals face unemployment risk, households have an incentive to buy housing and durable goods at similar times. The reason for these coordinated purchases is that when others are making large purchases, this reduces unemployment; then when unemployment is low, it is a less risky time to make large purchases since taking on debt is easier. However, let us emphasize that we are not finding that business cycles are driven primarily by animal spirits.

  • Instead, we are arguing that business cycles are driven by individually rational, but socially costly, mass behavior based on fundamentals.
  • In our view, the recovery phase of a cycle starts when the stock of housing and durables has been depleted enough to lead some people to go out and make new purchases even if unemployment is still high.

This incites others to do the same, which eventually sustains the recovery and leads to a boom. Interestingly, the boom does not stop when people have the ‘right’ stock of goods; households instead over-shoot, because the boom period is a good time to buy even knowing that a recession will eventually come. Once households have sufficiently over-accumulated, they will stop purchasing en masse, knowing that others are also stopping and knowing that they can wait out a recession, benefiting for some time from the services of housing and durables bought during the expansion. The expansion therefore ends and a recession begins. Once this stock of goods is again sufficiently depleted, the cycle will restart. Stated this way, business cycles appear very deterministic. However, there are always other developments in the economy that interact with this consumer cycle to create unique features. For example, the consumer cycle generally competes with forces affecting business investment, thereby causing the length and duration of a cycle to be affected by technological developments driving firm investment.
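The purchase-bunching mechanism described above can be caricatured with a simple on/off buying rule. This is a hypothetical toy model of ours, with made-up thresholds and rates, not the paper's specification:

```python
# Households let the durable stock depreciate until it is depleted enough
# to trigger a buying boom, overshoot past their target, then stop buying
# until the stock is run down again. The hysteresis band [s_low, s_high]
# generates a deterministic boom-bust cycle with no outside shocks.

def simulate_durables(periods=200, delta=0.05, s_low=0.6, s_high=1.2):
    stock, buying, booms = 1.0, False, 0
    for _ in range(periods):
        if not buying and stock < s_low:
            buying, booms = True, booms + 1   # depletion triggers the recovery
        elif buying and stock > s_high:
            buying = False                    # over-accumulation ends the boom
        purchases = 0.15 if buying else 0.0
        stock = (1 - delta) * stock + purchases
    return booms

print(simulate_durables())  # several recurrent booms, not a one-off event
```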

Concluding remarks

But why should we care whether the macroeconomy is locally unstable or locally stable? Society’s understanding of how the economy functions, especially of what creates business cycles, greatly affects how we design stabilization policy.

In the current dominant paradigm, there is a tendency to see monetary policy as the central tool for mitigating the business cycle. This view makes sense if excessive macroeconomic fluctuations reflect mainly the slow adjustment of wages and prices to outside disturbances within an otherwise stable system. 

However, if the system is inherently unstable and exhibits forces that favor recurrent booms and busts at intervals of about seven to ten years, then it is much less likely that monetary policy is the right tool for addressing macroeconomic fluctuations. Instead, in such a case we are likely to need policies aimed at changing the incentives that lead households to bunch their purchasing behavior in the first place.

References

Beaudry, P, D Galizia, and F Portier (2015), “Reviving the Limit Cycle View of Macroeconomic Fluctuations”, CEPR Discussion Paper 10645 and NBER Working Paper 21241.

Blanchard, O J (2014), “Where Danger Lurks”, Finance & Development, 51(3), 28-31.

Bolt, J and J L van Zanden (2014), “The Maddison Project: collaborative research on historical national accounts”, The Economic History Review, 67 (3): 627–651.

Goodwin, R (1951), “The Nonlinear Accelerator and the Persistence of Business Cycles”, Econometrica, 19(1), 1–17.

Hicks, J (1950), A Contribution to the Theory of the Trade Cycle, Clarendon Press, Oxford.

Kaldor, N (1940), “A Model of the Trade Cycle”, The Economic Journal, 50(197), 78–92.

Kalecki, M (1937), “A Theory of the Business Cycle”, The Review of Economic Studies, 4(2), 77–97.

Sunday, June 14, 2015

'What Assumptions Matter for Growth Theory?'

Dietz Vollrath explains the "mathiness" debate (and also Euler's theorem in a part of the post I left out). Glad he's interpreting Romer -- it's very helpful:

What Assumptions Matter for Growth Theory?: The whole “mathiness” debate that Paul Romer started tumbled onwards this week... I was able to get a little clarity in this whole “price-taking” versus “market power” part of the debate. I’ll circle back to the actual “mathiness” issue at the end of the post.
There are really two questions we are dealing with here. First, do inputs to production earn their marginal product? Second, do the owners of non-rival ideas have market power or not? We can answer the first without having to answer the second.
Just to refresh, a production function tells us that output is determined by some combination of non-rival inputs and rival inputs. Non-rival inputs are things like ideas that can be used by many firms or people at once without limiting the use by others. Think of blueprints. Rival inputs are things that can only be used by one person or firm at a time. Think of nails. The income earned by both rival and non-rival inputs has to add up to total output.
Okay, given all that setup, here are three statements that could be true.
  1. Output is constant returns to scale in rival inputs
  2. Non-rival inputs receive some portion of output
  3. Rival inputs receive output equal to their marginal product
Pick two.
Romer’s argument is that (1) and (2) are true. (1) he asserts through replication arguments, like my example of replicating Earth. (2) he takes as an empirical fact. Therefore, (3) cannot be true. If the owners of non-rival inputs are compensated in any way, then it is necessarily true that rival inputs earn less than their marginal product. Notice that I don’t need to say anything about how the non-rival inputs are compensated here. But if they earn anything, then from Romer’s assumptions the rival inputs cannot be earning their marginal product.
Different authors have made different choices than Romer. McGrattan and Prescott abandoned (1) in favor of (2) and (3). Boldrin and Levine dropped (2) and accepted (1) and (3). Romer’s issue with these papers is that (1) and (2) are clearly true, so writing down a model that abandons one of these assumptions gives you a model that makes no sense in describing growth. ...
The “mathiness” comes from authors trying to elide the fact that they are abandoning (1) or (2). ...
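The "pick two" logic can be checked numerically. Under constant returns to scale in the rival inputs, Euler's theorem says that paying each rival input its marginal product exhausts output exactly, so nothing is left for the non-rival input. A worked example of ours, with arbitrary numbers:

```python
# Cobb-Douglas technology Y = A * K^alpha * L^(1-alpha): constant returns
# to scale in the rival inputs K and L (statement 1). Pay K and L their
# marginal products (statement 3) and the whole of output is used up,
# so the non-rival input A can receive nothing (statement 2 fails).

def output(A, K, L, alpha=0.3):
    return A * K**alpha * L**(1 - alpha)

A, K, L, alpha = 2.0, 100.0, 50.0, 0.3
Y = output(A, K, L, alpha)

eps = 1e-6  # numerical marginal products via forward differences
mpk = (output(A, K + eps, L, alpha) - Y) / eps
mpl = (output(A, K, L + eps, alpha) - Y) / eps

payments_to_rival_inputs = mpk * K + mpl * L
print(abs(payments_to_rival_inputs - Y) < 1e-3)  # True: A gets zero
```

Doubling K and L doubles Y while A is unchanged, which is Romer's replication argument for statement (1); the computation above then shows why (2) and (3) cannot both follow.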

[There's a lot more in the full post. Also, Romer comments on Vollrath here.]

Tuesday, June 09, 2015

'What is it about German Economics?'

Can you help Simon Wren-Lewis figure this out?:

What is it about German economics?: ...Keynesian ideas are pretty mainstream elsewhere...: why does macroeconomics in Germany seem to be an outlier? Given the damage done by austerity in the Eurozone, and the central role that the views of German policy makers have played in that, this is a question I have asked for many years. The textbooks used to teach macroeconomics in Germany seem to be as Keynesian as elsewhere, yet Peter Bofinger is the only Keynesian on their Council of Economic Experts, and he confirmed to me how much this minority status is typical. [1]
There are two explanations that are popular outside Germany that I now think on their own are inadequate. The first is that Germany is preoccupied by inflation as a result of the hyperinflation of the Weimar Republic, and that this spills over into their attitude to government debt. (The recession of the 1930s helped create a more serious disaster, and here is a provocative account of why the memory of hyperinflation dominates.) A second idea is that Germans are culturally debt-averse, and people normally note that the German for debt is also their word for guilt. The trouble with both stories is that they imply that German government debt should be much lower than in other countries, but it is not. (In 2000, the German government’s net financial liabilities as a percentage of GDP were at the same level as France, and slightly above the UK and US.) ...
It is as if in some respects economic thinking in Germany has not moved on since the 1970s: Keynesian ideas are still viewed as anti-market rather than correcting market failure...
One of the distinctive characteristics of the German economy appears to be very far from neoliberalism, and that is co-determination: the importance of workers organisations in management, and more generally the recognition that unions play an important role in the economy. Yet I wonder whether this may have had an unintended consequence: the polarisation and politicisation of economic policy advice. ... If conflict over wages is institutionalised at the national level, perhaps the influence of ideology on economic policy - in so far as it influences that conflict (see footnote [1]) - is bound to be greater. 
As you can see, I remain some way from answering the question posed in the title of this post, but I think I’m a bit further forward than I was.

Saturday, June 06, 2015

'A Crisis at the Edge of Physics'

Seems like much the same can be said about modern macroeconomics (except perhaps the "given the field its credibility" part):

A Crisis at the Edge of Physics, by Adam Frank and Marcelo Gleiser, NY Times: Do physicists need empirical evidence to confirm their theories?
You may think that the answer is an obvious yes, experimental confirmation being the very heart of science. But a growing controversy at the frontiers of physics and cosmology suggests that the situation is not so simple.
A few months ago in the journal Nature, two leading researchers, George Ellis and Joseph Silk, published a controversial piece called “Scientific Method: Defend the Integrity of Physics.” They criticized a newfound willingness among some scientists to explicitly set aside the need for experimental confirmation of today’s most ambitious cosmic theories — so long as those theories are “sufficiently elegant and explanatory.” Despite working at the cutting edge of knowledge, such scientists are, for Professors Ellis and Silk, “breaking with centuries of philosophical tradition of defining scientific knowledge as empirical.”
Whether or not you agree with them, the professors have identified a mounting concern in fundamental physics: Today, our most ambitious science can seem at odds with the empirical methodology that has historically given the field its credibility. ...

'Views Differ on Shape of Macroeconomics'

Paul Krugman:

Views Differ on Shape of Macroeconomics: The doctrine of expansionary austerity ... was immensely popular among policymakers in 2010, as the great turn toward austerity began. But the statistical underpinnings of the doctrine fell apart under scrutiny... So at this point research economists overwhelmingly believe that austerity is contractionary (and that stimulus is expansionary). ...

Nonetheless, Simon Wren-Lewis points us to Robert Peston of the BBC declaring

I am simply pointing out that there is a debate here (though Krugman, Wren-Lewis and Portes are utterly persuaded they’ve won this match – and take the somewhat patronising view that voters who think differently are ignorant sheep led astray by a malign or blinkered media).

Wow. Yes, I suppose that “there is a debate” — there are debates about lots of things, from climate change to evolution to alien spaceships hidden in Area 51. But to suggest that this debate is at all symmetric is just wrong — and deeply misleading to one’s audience.

As for the claim that it’s somehow patronizing to suggest that voters are ill-informed when (a) macroeconomics is a technical subject, and (b) the media have indeed misreported the state of the professional debate — well, this is sort of an economic version of the line that one must not suggest that the Iraq war was launched on false pretenses, because this would be disrespectful to the troops. If you’re being accused of misleading reporting, it’s hardly a defense to say that the public believed your misinformation — more like a self-indictment. ...

Wednesday, June 03, 2015

'Coordination Equilibrium and Price Stickiness'

This is the introduction to a relatively new working paper by Cigdem Gizem Korpeoglu and Stephen Spear (sent in response to my comment that I've been disappointed with the development of new alternatives to the standard NK-DSGE models):

Coordination Equilibrium and Price Stickiness, by Cigdem Gizem Korpeoglu (University College London) and Stephen E. Spear (Carnegie Mellon): 1 Introduction Contemporary macroeconomic theory rests on the three pillars of imperfect competition, nominal price rigidity, and strategic complementarity. Of these three, nominal price rigidity (aka price stickiness) has been the most important. The stickiness of prices is a well-established empirical fact, with early observations about the phenomenon going back to Alfred Marshall. Because the friction of price stickiness cannot occur in markets with perfect competition, modern micro-founded models (New Keynesian or NK models, for short) have been forced to abandon the standard Arrow-Debreu paradigm of perfect competition in favor of models where agents have market power and set market prices for their own goods. Strategic complementarity enters the picture as a mechanism for explaining the kinds of coordination failures that lead to sustained slumps like the Great Depression or the aftermath of the 2008 financial crisis. Early work by Cooper and John laid out the importance of these three features for macroeconomics, and follow-on work by Ball and Romer showed that failure to coordinate on price adjustments could itself generate strategic complementarity, effectively unifying two of the three pillars.
Not surprisingly, the Ball and Romer work was based on earlier work by a number of authors (see Mankiw and Romer's New Keynesian Economics) which used the model of Dixit and Stiglitz of monopolistic competition as the basis for price-setting behavior in a general equilibrium setting, combined with the idea of menu costs -- literally the cost of posting and communicating price changes -- and exogenously-specified adjustment time staggering to provide the friction(s) leading to nominal rigidity. While these models perform well in explaining aspects of the business cycle, they have only recently been subjected to what one would characterize as thorough empirical testing, because of the scarcity of good data on how prices actually change. This has changed in the past decade as new sources of data on price dynamics have become available, and as computational power capable of teasing out what might be called the "fine structure" of these dynamics has emerged. On a different dimension, the overall suitability of monopolistic competition as the appropriate form of market imperfection to use as the foundation of the new macro models has been largely unquestioned, though we believe this is largely due to the tractability of the Dixit-Stiglitz model relative to other models of imperfect competition generated by large fixed costs or increasing returns to scale not due to specialization.
In this paper, we examine both of these underlying assumptions in light of what the new empirics on pricing dynamics has found, and propose a different, and we believe, better microfoundation for New Keynesian macroeconomics based on the Shapley-Shubik market game.

Krugman vs. DeLong

Krugman vs. DeLong has an outcome that follows DeLong's rule:

Krugman: The Inflationista Puzzle: Martin Feldstein has a new column on what he calls the “inflation puzzle” — the failure of inflation to soar despite the Fed’s large asset purchases, which led to a very large rise in the monetary base. As Tony Yates points out, however, there’s nothing puzzling at all about what happened; it’s exactly what you should expect when interest rates are near zero.
And this isn’t an ex-post rationale, it’s what many of us were saying from the beginning. Traditional IS-LM analysis said that the Fed’s policies would have little effect on inflation; so did the translation of that analysis into a stripped-down New Keynesian framework that I did back in 1998, starting the modern liquidity-trap literature. ...
DeLong: New Economic Thinking, Hicks-Hansen-Wicksell Macro, and Blocking the Back Propagation Induction-Unraveling from the Long Run Omega Point: ... Whatever may be going on in the short run must thus be transitory in duration, moderate in its effects, and limited in the distance it can push the economy away from its proper long run equilibrium. And it certainly cannot keep it there. Not for long.
This is the real critique of Paul Krugman’s “depression economics”. Paul can draw his Hicksian IS-LM diagrams of an economy stuck in a liquidity trap...
He can draw his Wicksellian I=S diagrams of how the zero lower bound forces the market interest rate above the natural interest rate at which planned investment balances savings that would be expected were the economy at full employment...
Paul can show, graphically, that conventional monetary policy is then completely ineffective–swapping two assets that are perfect substitutes for each other. Paul can show, graphically, that expansionary fiscal policy is then immensely powerful and has no downside: it does not generate higher interest rates; it does not crowd out productive private investment; and, because interest rates are zero, it entails no financing burden and thus no required increase in future tax wedges. But all this is constrained and limited by the inescapable and powerful logic of the induction-unraveling propagating itself back through the game tree from the Omega Point that is the long run equilibrium. In the IS-LM diagram, the fact that the long run is out there means that even the contemplation of permanent expansion of the monetary base is rapidly moving the IS curve up and to the right, and thus leading the economy to quickly exit the liquidity trap. In the Wicksellian I=S diagram, the fact that the long run is out there means that even the contemplation of permanent expansion of the monetary base is rapidly moving the I=S curve up so that the zero lower bound will soon no longer constrain the economy away from its full-employment equilibrium.
The “depression economics” equilibrium Paul plots on his graph is a matter for today–a month or two, or a quarter or two, or at most a year or two. ...
Krugman: Backward Induction and Brad DeLong (Wonkish): Brad DeLong is, unusually, unhappy with my analysis in a discussion of the inflationista puzzle — the mystery of why so many economists failed to grasp the implications of a liquidity trap, and still fail to grasp those implications despite 6 years of being wrong. Brad sorta-kinda defends the inflationistas on the basis of backward induction; I find myself somewhat baffled by that defense.

Actually, I find myself baffled both theoretically and empirically. ...

In the end, while the post-2008 slump has gone on much longer than even I expected (thanks in part to terrible fiscal policy), and the downward stickiness of wages and prices has been more marked than I imagined, overall the model those of us who paid attention to Japan deployed has done pretty well — and it’s kind of shocking how few of those who got everything wrong are willing to learn from their failure and our success.
DeLong: Paul Krugman Was Right. I, Ken Rogoff, Marty Feldstein, and Many, Many Others Were Wrong: The question is: Why were we wrong? We had, after all, read, learned, and taught the same Hicks-Hansen-Wicksell-Metzler-Tobin macro that was Paul Krugman’s foundation. ...

I want to highlight one of Brad's points. Theoretical models often act as if there is only one type of demand shock, and as if the short run depends upon a single variable, e.g. the time period when inflation expectations are wrong. But the short run depends upon the type of recession we experience, and the variable that signals the length of the recovery will not be the same in every case. A recession induced by monetary policy will have a much shorter short run than a balance sheet recession induced by a financial collapse, and a recession caused by an oil price shock will recover differently from both. Early in the Great Recession, policymakers, analysts, and most economists did not fully recognize that this recession truly was different, and hence required a different policy approach from the recessions in recent memory. Krugman, due to his work on Japan, did see this early on, but it took time for the notion of a balance sheet recession to take hold, and we never fully adapted fiscal policy to deal with this fact (e.g. sufficient help with rebuilding household balance sheets). To me this is one of the big lessons of the Great Recession -- we must figure out the type of recession we are experiencing, realize that the "short run" will depend critically on the type of shock causing the recession, and adapt our policies accordingly. If we can do that, then maybe the short run won't be a decade long the next time we have a balance sheet recession. And there will be a next time.

Monday, June 01, 2015

'The Case of the Missing Minsky'

Paul Krugman says I'm not upbeat enough about the state of macroeconomics:

The Case of the Missing Minsky: Gavyn Davies has a good summary of the recent IMF conference on rethinking macro; Mark Thoma has further thoughts. Thoma in particular is disappointed that there hasn’t been more of a change, decrying

the arrogance that asserts that we have little to learn about theory or policy from the economists who wrote during and after the Great Depression.

Maybe surprisingly, I’m a bit more upbeat than either. Of course there are economists, and whole departments, that have learned nothing, and remain wholly dominated by mathiness. But it seems to me that economists have done OK on two of the big three questions raised by the economic crisis. What are these three questions? I’m glad you asked. ...[continue]...

Sunday, May 31, 2015

'Has the Rethinking of Macroeconomic Policy Been Successful?'

The beginning of a long discussion from Gavyn Davies:

Has the rethinking of macroeconomic policy been successful?: The great financial crash of 2008 was expected to lead to a fundamental re-thinking of macro-economics, perhaps leading to a profound shift in the mainstream approach to fiscal, monetary and international policy. That is what happened after the 1929 crash and the Great Depression, though it was not until 1936 that the outline of the new orthodoxy appeared in the shape of Keynes’ General Theory. It was another decade or more before a simplified version of Keynes was routinely taught in American university economics classes. The wheels of intellectual change, though profound in retrospect, can grind fairly slowly.
Seven years after the 2008 crash, there is relatively little sign of a major transformation in the mainstream macro-economic theory that is used, for example, by most central banks. The “DSGE” (mainly New Keynesian) framework remains the basic workhorse, even though it singularly failed to predict the crash. Economists have been busy adding a more realistic financial sector to the structure of the model [1], but labour and product markets, the heart of the productive economy, remain largely untouched.
What about macro-economic policy? Here major changes have already been implemented, notably in banking regulation, macro-prudential policy and most importantly the use of the central bank balance sheet as an independent instrument of monetary policy. In these areas, policy-makers have acted well in advance of macro-economic researchers, who have been struggling to catch up. ...

There has been more progress on the theoretical front than I expected, particularly in adding financial sector frictions to the NK-DSGE framework and in overcoming the restrictions imposed by the representative agent model. At the same time, there has been less progress than I expected in developing alternatives to the standard models. As far as I can tell, a serious challenge to the standard model has not yet appeared. My biggest disappointment is how much resistance there has been to the idea that we need to even try to find alternative modeling structures that might do better than those in use now, and the arrogance that asserts that we have little to learn about theory or policy from the economists who wrote during and after the Great Depression.

Sunday, May 17, 2015

'Blaming Keynes'

Simon Wren-Lewis:

Blaming Keynes: A few people have asked me to respond to this FT piece from Niall Ferguson. I was reluctant to, because it is really just a bit of triumphalist Tory tosh. That such things get published in the Financial Times is unfortunate but I’m afraid not surprising in this case. However I want to write later about something else that made reference to it, so saying a few things here first might be useful.
The most important point concerns style. This is not the kind of thing an academic should want to write. It makes no attempt to be true to evidence, and just cherry picks numbers to support its argument. I know a small number of academics think they can drop their normal standards when it comes to writing political propaganda, but I think they are wrong to do so. ...

'Ed Prescott is No Robert Solow, No Gary Becker'

Paul Romer continues his assault on "mathiness":

Ed Prescott is No Robert Solow, No Gary Becker: In his comment on my Mathiness paper, Noah Smith asks for more evidence that the theory in the McGrattan-Prescott paper that I cite is any worse than the theory I compare it to by Robert Solow and Gary Becker. I agree with Brad DeLong’s defense of the Solow model. I’ll elaborate, by using the familiar analogy that theory is to the world as a map is to terrain.

There is no such thing as the perfect map. This does not mean that the incoherent scribblings of McGrattan and Prescott are on a par with the coherent, low-resolution Solow map that is so simple that all economists have memorized it. Nor with the Becker map that has become part of the everyday mental model of people inside and outside of economics.

Noah also notes that I go into more detail about the problems in the Lucas and Moll (2014) paper. Just to be clear, this is not because it is worse than the papers by McGrattan and Prescott or Boldrin and Levine. Honestly, I’d be hard pressed to say which is the worst. They all display the sloppy mixture of words and symbols that I’m calling mathiness. Each is awful in its own special way.

What should worry economists is the pattern, not any one of these papers. And our response. Why do we seem resigned to tolerating papers like this? What cumulative harm are they doing?

The resignation is why I conjectured that we are stuck in a lemons equilibrium in the market for mathematical theory. Noah’s jaded question–Is the theory of McGrattan-Prescott really any worse than the theory of Solow and Becker?–may be indicative of what many economists feel after years of being bullied by bad theory. And as I note in the paper, this resignation may be why empirically minded economists like Piketty and Zucman stay as far away from theory as possible. ...

[He goes on to give more details using examples from the papers.]

Friday, May 15, 2015

'Mathiness in the Theory of Economic Growth'

Paul Romer:

My Paper “Mathiness in the Theory of Economic Growth”: I have a new paper in the Papers and Proceedings Volume of the AER that is out in print and on the AER website. A short version of the supporting appendix is available here. It should eventually be available on the AER website but has not been posted yet. A longer version with more details behind the calculations is available here.

The point of the paper is that if we want economics to be a science, we have to recognize that it is not ok for macroeconomists to hole up in separate camps, one that supports its version of the geocentric model of the solar system and another that supports the heliocentric model. As scientists, we have to hold ourselves to a standard that requires us to reach a consensus about which model is right, and then to move on to other questions.

The alternative to science is academic politics, where persistent disagreement is encouraged as a way to create distinctive sub-group identities.

The usual way to protect a scientific discussion from the factionalism of academic politics is to exclude people who opt out of the norms of science. The challenge lies in knowing how to identify them.

From my paper:

The style that I am calling mathiness lets academic politics masquerade as science. Like mathematical theory, mathiness uses a mixture of words and symbols, but instead of making tight links, it leaves ample room for slippage between statements in natural versus formal language and between statements with theoretical as opposed to empirical content.

Persistent disagreement is a sign that some of the participants in a discussion are not committed to the norms of science. Mathiness is a symptom of this deeper problem, but one that is particularly damaging because it can generate a broad backlash against the genuine mathematical theory that it mimics. If the participants in a discussion are committed to science, mathematical theory can encourage a unique clarity and precision in both reasoning and communication. It would be a serious setback for our discipline if economists lose their commitment to careful mathematical reasoning.

I focus on mathiness in growth models because growth is the field I know best, one that gave me a chance to observe closely the behavior I describe. ...

The goal in starting this discussion is to ensure that economics is a science that makes progress toward truth. ... Science is the most important human accomplishment. An investment in science can offer a higher social rate of return than any other that a person can make. It would be tragic if economists did not stay current on the periodic maintenance needed to protect our shared norms of science from infection by the norms of politics.

[I cut quite a bit -- see the full post for more.]

Saturday, May 02, 2015

'Needed: New Economic Frameworks for a Disappointing New Normal'

Brad DeLong ends a post on the need for "New Economic Frameworks for a Disappointing New Normal" with:

... Our government, here in the U.S. at least, has been starved of proper funding for infrastructure of all kinds since the election of Ronald Reagan. Our confidence in our institutions’ ability to manage aggregate demand properly is in shreds–and for the good reason of demonstrated incompetence and large-scale failure. Our political system now has a bias toward austerity and idle potential workers rather than toward expansion and inflation. Our political system now has a bias away from desirable borrow-and-invest. And the equity return premium is back to immediate post-Great Depression levels–and we also have an enormous and costly hypertrophy of the financial sector that is, as best as we can tell, delivering no social value in exchange for its extra size.
We badly need a new framework for thinking about policy-relevant macroeconomics given that our new normal is as different from the late-1970s as that era’s normal was different from the 1920s, and as that era’s normal was different from the 1870s.
But I do not have one to offer.

Friday, April 24, 2015

'Unit Roots, Redux'

John Cochrane weighs in on the discussion of unit roots:

Unit roots, redux: Arnold Kling's askblog and Roger Farmer have a little exchange on GDP and unit roots. My two cents here.
I did a lot of work on this topic a long time ago, in "How Big is the Random Walk in GNP?" (the first one), "Permanent and Transitory Components of GNP and Stock Prices" (the last, and I think best, one), "Multivariate estimates" with Argia Sbordone, and "A critique of the application of unit root tests", particularly appropriate to Roger's battery of tests.
The conclusions, which I still think hold up today:
Log GDP has both random walk and stationary components. Consumption is a pretty good indicator of the random walk component. This is also what the standard stochastic growth model predicts: a random walk technology shock induces a random walk component in output but there are transitory dynamics around that value.
A linear trend in GDP is only visible ex-post, like a "bull" or "bear" market.  It's not "wrong" to detrend GDP, but it is wrong to forecast that GDP will return to the linear trend or to take too seriously correlations of linearly detrended series, as Arnold mentions. Treating macro series as cointegrated with one common trend is a better idea.
Log stock prices have random walk and stationary components. Dividends are a pretty good indicator of the random walk component. (Most recently, here.) ...
Both Arnold and Roger claim that unemployment has a unit root. Guys, you must be kidding. ...

He goes on to explain.
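Cochrane's point that log GDP combines a random-walk component with stationary dynamics around it is easy to illustrate with a quick simulation. This is a minimal sketch with arbitrary parameter values, not estimates from his papers: a random-walk-with-drift permanent component plus an AR(1) transitory component, with a linear trend fit ex post.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 400

# Permanent component: a random walk with drift (e.g. technology shocks)
perm = np.cumsum(0.005 + 0.01 * rng.standard_normal(T))

# Transitory component: stationary AR(1) dynamics around the walk
trans = np.zeros(T)
eps = 0.01 * rng.standard_normal(T)
for t in range(1, T):
    trans[t] = 0.8 * trans[t - 1] + eps[t]

log_gdp = perm + trans

# A linear trend fits well ex post...
t_idx = np.arange(T)
slope, intercept = np.polyfit(t_idx, log_gdp, 1)
dev = log_gdp - (intercept + slope * t_idx)
# ...but because of the random-walk component the series is not
# forecast to revert to the fitted trend line.
```

The in-sample trend is only "visible ex post" in exactly Cochrane's sense: the OLS residuals average to zero by construction, but the best forecast of the series follows the random walk, not the trend.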

Tuesday, April 21, 2015

'Rethinking Macroeconomic Policy'

Olivier Blanchard at Vox EU:

Rethinking macroeconomic policy: Introduction, by Olivier Blanchard: On 15 and 16 April 2015, the IMF hosted the third conference on “Rethinking Macroeconomic Policy”. I had initially chosen as the title and subtitle “Rethinking Macroeconomic Policy III. Down in the trenches”. I thought of the first conference in 2011 as having identified the main failings of previous policies, the second conference in 2013 as having identified general directions, and this conference as a progress report.
My subtitle was rejected by one of the co-organisers, namely Larry Summers. He argued that I was far too optimistic, that we were nowhere close to knowing where we were going. Arguing with Larry is tough, so I chose an agnostic title, and shifted to “Rethinking Macro Policy III. Progress or confusion?”
Where do I think we are today? I think both Larry and I are right. I do not say this for diplomatic reasons. We are indeed proceeding in the trenches. But where the trenches are eventually going remains unclear. This is the theme I shall develop in my remarks, focusing on macroprudential tools, monetary policy, and fiscal policy.

Continue reading "'Rethinking Macroeconomic Policy'" »

Saturday, April 18, 2015

NBER Annual Conference on Macroeconomics: Abstracts for Day Two

First paper:

Declining Desire to Work and Downward Trends in Unemployment and Participation, by Regis Barnichon and Andrew Figura: Abstract The US labor market has witnessed two apparently unrelated trends in the last 30 years: a decline in unemployment between the early 1980s and the early 2000s, and a decline in labor force participation since the early 2000s. We show that a substantial factor behind both trends is a decline in desire to work among individuals outside the labor force, with a particularly strong decline during the second half of the 90s. A decline in desire to work lowers both the unemployment rate and the participation rate, because a nonparticipant who wants to work has a high probability of joining the unemployment pool in the future, while a nonparticipant who does not want to work has a low probability of ever entering the labor force. We use cross-sectional variation to estimate a model of non-participants' propensity to want a job, and we find that changes in the provision of welfare and social insurance, possibly linked to the mid-90s welfare reforms, explain about 50 percent of the decline in desire to work.
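The mechanism in the abstract can be sketched with a four-state Markov chain over labor market states (employed, unemployed, nonparticipant wanting a job, nonparticipant not wanting one). The transition probabilities below are made-up illustrative numbers, not the paper's estimates; the sketch only shows the qualitative point that shifting nonparticipants from wanting to not wanting a job lowers both the steady-state unemployment rate and the participation rate.

```python
import numpy as np

STATES = ["E", "U", "Nw", "Nn"]  # employed, unemployed, nonparticipant
                                 # wanting / not wanting a job

def stationary(P, iters=2000):
    """Stationary distribution of a Markov transition matrix P
    (rows sum to one), computed by power iteration."""
    pi = np.full(P.shape[0], 1.0 / P.shape[0])
    for _ in range(iters):
        pi = pi @ P
    return pi

def rates(pi):
    """Unemployment rate and participation rate implied by pi."""
    e, u, nw, nn = pi
    return u / (e + u), e + u

# Baseline transition probabilities (hypothetical, for illustration only)
P_base = np.array([
    [0.96,  0.02,  0.01, 0.01],
    [0.25,  0.60,  0.10, 0.05],
    [0.05,  0.20,  0.70, 0.05],
    [0.005, 0.005, 0.02, 0.97],
])

# "Lower desire to work": larger flows from U and Nw into Nn
P_low = P_base.copy()
P_low[1] = [0.25, 0.60, 0.05, 0.10]
P_low[2] = [0.05, 0.20, 0.55, 0.20]

u_base, part_base = rates(stationary(P_base))
u_low, part_low = rates(stationary(P_low))
# Both the unemployment rate and the participation rate are lower when
# fewer nonparticipants want to work, matching the abstract's argument.
```

The driving force is exactly the asymmetry the authors cite: a nonparticipant who wants a job flows into unemployment at a much higher rate than one who does not, so shrinking that group drains both the unemployment pool and the labor force.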

Second paper:

External and Public Debt Crises, by Cristina Arellano, Andrew Atkeson, and Mark Wright: Abstract In recent years, the members of two advanced monetary and economic unions -- the nations of the Eurozone and the states of the United States of America -- experienced debt crises with spreads on government borrowing rising dramatically. Despite the similar behavior of spreads on public debt, these crises were fundamentally different in nature. In Europe, the crisis occurred after a period of significant increases in government indebtedness from levels that were already substantial, whereas in the USA state government borrowing was limited and remained roughly unchanged. Moreover, whereas the most troubled nations of Europe experienced a sudden stop in private capital flows and private sector borrowers also faced large rises in spreads, there is little evidence that private borrowing in US states was differentially affected by the creditworthiness of state governments. In this sense, we can say that the US States experienced a public debt crisis, whereas the nations of Europe experienced an external debt crisis affecting both public and private borrowers. Why did Europe experience an external debt crisis and the US States only a public debt crisis? And, why did the members of other economic unions, such as the provinces of Canada, not experience a debt crisis at all despite high and rising provincial public debt levels? In this paper, we construct a model of default on domestic and external public debt and interference in private external debt contracts and use it to argue that these different debt experiences result from the interplay of differences in the ability of governments to interfere in the private external debt contracts of their citizens, with differences in the flexibility of state fiscal institutions.
We also assemble a range of empirical evidence that suggests that the US States are less fiscally flexible but more constrained in their ability to interfere in private contracts than the members of other economic unions, which simultaneously exposes the states to public debt crises while insulating them from an external debt crisis affecting private sector borrowers within the state. In contrast, Eurozone nations are more fiscally flexible but have a greater ability to interfere with the contracts, which together allow for more public borrowing at the cost of a joint public and private external debt crisis. Lastly, Canadian provincial governments are both fiscally flexible and limited in their ability to interfere, which allows both for more public borrowing and limits the likelihood of either a public or external debt crisis occurring. We draw lessons from these findings for the future design of Eurozone economic and legal institutions.

Friday, April 17, 2015

NBER Annual Conference on Macroeconomics: Abstracts for Day One

First paper at the NBER Annual Conference on Macroeconomics

Expectations and Investment, by Nicola Gennaioli, Yueran Ma, and Andrei Shleifer: Abstract Using micro data from the Duke University quarterly survey of Chief Financial Officers, we show that corporate investment plans as well as actual investment are well explained by CFOs’ expectations of earnings growth. The information in expectations data is not subsumed by traditional variables, such as Tobin’s Q or discount rates. We also show that errors in CFO expectations of earnings growth are predictable from past earnings and other data, pointing to an extrapolative structure of expectations and suggesting that expectations may not be rational. This evidence, like earlier findings in finance, points to the usefulness of data on actual expectations for understanding economic behavior.

Second paper:

Trends and Cycles in China's Macroeconomy, by Chun Chang, Kaiji Chen, Daniel Waggoner, and Tao Zha: Abstract We make three contributions in this paper. First, we provide a core of macroeconomic time series usable for systematic research on China. Second, we document, through various empirical methods, the robust findings about striking patterns of trend and cycle. Third, we build a theoretical model that accounts for these facts. The model's mechanism and assumptions are corroborated by institutional details, disaggregated data, and banking time series, all of which are distinctive of Chinese characteristics. The departure of our theoretical model from standard ones offers a constructive framework for studying China's macroeconomy.

Third paper:

Demystifying the Chinese Housing Boom, by Hanming Fang, Quanlin Gu, Wei Xiong, and Li-An Zhou: Abstract We construct housing price indices for 120 major cities in China in 2003-2013 based on sequential sales of new homes within the same housing developments. By using these indices and detailed information on mortgage borrowers across these cities, we find enormous housing price appreciation during the decade, which was accompanied by equally impressive growth in household income, except in a few first-tier cities. Housing market participation by households from the low-income fraction of the urban population remained steady. Nevertheless, bottom-income mortgage borrowers endured severe financial burdens by using price-to-income ratios over eight to buy homes, which reflected their expectations of persistently high income growth into the future. Such future income expectations could contract substantially in the event of a sudden stop in the Chinese economy and present an important source of risk to the housing market.

Fourth paper:

Networks and the Macroeconomy: An Empirical Exploration, by Daron Acemoglu, Ufuk Akcigit, and William Kerr: Abstract The propagation of macroeconomic shocks through input-output and geographic networks can be a powerful driver of macroeconomic fluctuations. We first exposit that in the presence of Cobb-Douglas production functions and consumer preferences, there is a specific pattern of economic transmission whereby demand-side shocks propagate upstream (to input supplying industries) and supply-side shocks propagate downstream (to customer industries) and that there is a tight relationship between the direct impact of a shock and the magnitudes of the downstream and the upstream indirect effects. We then investigate the short-run propagation of four different types of industry-level shocks: two demand-side ones (the exogenous component of the variation in industry imports from China and changes in federal spending) and two supply-side ones (TFP shocks and variation in knowledge/ideas coming from foreign patenting). In each case, we find substantial propagation of these shocks through the input-output network, with a pattern broadly consistent with theory. Quantitatively, the network-based propagation is larger than the direct effects of the shocks, sometimes by several fold. We also show quantitatively large effects from the geographic network, capturing the fact that the local propagation of a shock to an industry will fall more heavily on other industries that tend to collocate with it across local markets. Our results suggest that the transmission of various different types of shocks through economic networks and industry inter-linkages could have first-order implications for the macroeconomy.
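The upstream propagation of demand-side shocks described in the abstract can be sketched with a textbook Leontief calculation. The input-output shares below are hypothetical numbers for illustration, not the paper's empirical model: the total output response to a final-demand shock is the Leontief inverse applied to the shock, and the indirect network effect can easily exceed the direct effect, in line with the paper's finding that network-based propagation is often larger than the direct impact.

```python
import numpy as np

# Input-output matrix: A[i, j] = share of industry i's output used as
# an input by industry j (hypothetical shares, three industries)
A = np.array([
    [0.1, 0.3, 0.2],
    [0.2, 0.1, 0.3],
    [0.3, 0.2, 0.1],
])

# Final-demand shock hitting industry 0 only
d = np.array([1.0, 0.0, 0.0])

# Total output response solves x = d + A x, i.e. each industry responds
# to its own demand plus the demand of the industries it supplies
# (upstream propagation), giving x = (I - A)^{-1} d.
L = np.linalg.inv(np.eye(3) - A)
total = L @ d          # direct + indirect output response
indirect = total - d   # network-based propagation beyond the shock itself
```

With these shares the indirect response summed across industries exceeds the direct shock, illustrating how network effects can dominate the direct impact "sometimes by several fold."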

Thursday, April 16, 2015

Video: Rethinking Macro Policy

Rethinking Macro Policy III: Session 3. Monetary Policy in the Future
Chair: José Viñals. Speakers: Ben Bernanke, Gill Marcus, John Taylor


Rethinking Macro Policy: Session 4. Fiscal Policy in the Future
Chair: Vitor Gaspar. Speakers: Marco Buti, Martin Feldstein, Brad DeLong

Monday, April 13, 2015

In Defense of Modern Macroeconomic Theory

A small part of a much longer post from David Andolfatto (followed by some comments of my own):

In defense of modern macro theory: The 2008 financial crisis was a traumatic event. Like all social trauma, it invoked a variety of emotional responses, including the natural (if unbecoming) human desire to find someone or something to blame. Some of the blame has been directed at segments of the economic profession. It is the nature of some of these criticisms that I'd like to talk about today. ...
The dynamic general equilibrium (DGE) approach is the dominant methodology in macro today. I think this is so because of its power to organize thinking in a logically consistent manner, its ability to generate reasonable conditional forecasts, as well as its great flexibility--a property that permits economists of all political persuasions to make use of the apparatus. ...

The point I want to make here is not that the DGE approach is the only way to go. I am not saying this at all. In fact, I personally believe in the coexistence of many different methodologies. The science of economics is not settled, after all. The point I am trying to make is that the DGE approach is not insensible (despite the claims of many critics who, I think, are sometimes driven by non-scientific concerns). ...

Once again (lest I be misunderstood, which I'm afraid seems unavoidable these days) I am not claiming that DGE is the be-all and end-all of macroeconomic theory. There is still a lot we do not know and I think it would be a good thing to draw on the insights offered by alternative approaches. I do not, however, buy into the accusation that there "too much math" in modern theory. Math is just a language. Most people do not understand this language and so they have a natural distrust of arguments written in it. .... Before criticizing, either learn the language or appeal to reliable translations...

As for the teaching of macroeconomics, if the crisis has led more professors to pay more attention to financial market frictions, then this is a welcome development. I also fall in the camp that stresses the desirability of teaching more economic history and placing greater emphasis on matching theory with data. ... Thus, one could reasonably expect a curriculum to be modified to include more history, history of thought, heterodox approaches, etc. But this is a far cry from calling for the abandonment of DGE theory. Do not blame the tools for how they were (or were not) used.

I've said a lot of what David says about modern macroeconomic models at one time or another in the past; for example, it's not the tools of macroeconomics, it's how they are used. But I do think he leaves out one important factor: the need to ask the right question (and why we didn't prior to the crisis). This is from August 2009:

In The Economist, Robert Lucas responds to recent criticism of macroeconomics ("In Defense of the Dismal Science"). Here's my entry at Free Exchange in response to his essay:

Lucas roundtable: Ask the right questions, by Mark Thoma: In his essay, Robert Lucas defends macroeconomics against the charge that it is "valueless, even harmful", and that the tools economists use are "spectacularly useless".

I agree that the analytical tools economists use are not the problem. We cannot fully understand how the economy works without employing models of some sort, and we cannot build coherent models without using analytic tools such as mathematics. Some of these tools are very complex, but there is nothing wrong with sophistication so long as sophistication itself does not become the main goal, and sophistication is not used as a barrier to entry into the theorist's club rather than an analytical device to understand the world.

But all the tools in the world are useless if we lack the imagination needed to build the right models. We ... have to ask the right questions before we can build the right models.

The problem wasn't the tools that macroeconomists use, it was the questions that we asked. The major debates in macroeconomics had nothing to do with the possibility of bubbles causing a financial system meltdown. That's not to say that there weren't models here and there that touched upon these questions, but the main focus of macroeconomic research was elsewhere. ...

The interesting question to me, then, is why we failed to ask the right questions. ...

Why did we, for the most part, fail to ask the right questions? Was it lack of imagination, was it the sociology within the profession, the concentration of power over what research gets highlighted, the inadequacy of the tools we brought to the problem, the fact that nobody will ever be able to predict these types of events, or something else?

It wasn't the tools, and it wasn't lack of imagination. As Brad DeLong points out, the voices were there—he points to Michael Mussa for one—but those voices were not heard. Nobody listened even though some people did see it coming. So I am more inclined to cite the sociology within the profession or the concentration of power as the main factors that caused us to dismiss these voices. ...

I don't know for sure the extent to which the ability of a small number of people in the field to control the academic discourse led to a concentration of power that stood in the way of alternative lines of investigation, or the extent to which the ideology that market prices always tend to move toward their long-run equilibrium values caused us to ignore voices that foresaw the developing bubble and coming crisis. But something caused most of us to ask the wrong questions, and to dismiss the people who got it right, and I think one of our first orders of business is to understand how and why that happened.

Here's an interesting quote from Thomas Sargent along the same lines:

The criticism of real business cycle models and their close cousins, the so-called New Keynesian models, is misdirected and reflects a misunderstanding of the purpose for which those models were devised. These models were designed to describe aggregate economic fluctuations during normal times when markets can bring borrowers and lenders together in orderly ways, not during financial crises and market breakdowns.

Which to me is another way of saying we didn't foresee the need to ask questions (and build models) that would be useful in a financial crisis -- we were focused on models that would explain "normal times." (That focus is connected to our belief that the Great Moderation would continue: arrogance on the part of economists led to the belief that modern policy tools, particularly from the Fed, would prevent major meltdowns, financial or otherwise.) That is happening now, so we'll be much better prepared if history repeats itself, but I have to wonder what other questions we should be asking, but aren't.

Let me add one more thing (a few excerpts from a post in 2010) about the sociology within economics:

I want to follow up on the post highlighting attempts to attack the messengers -- attempts to discredit Brad DeLong and Paul Krugman on macroeconomic policy in particular -- rather than engage academically with the message they are delivering (Krugman's response). ...
One of the objections often raised is that Krugman and DeLong are not, strictly speaking, macroeconomists. But if Krugman, DeLong, and others are expressing the theoretical and empirical results concerning macroeconomic policy accurately, does it really matter if we can strictly classify them as macroeconomists? Why is that important except as an attempt to discredit the message they are delivering? ... Attacking people rather than discussing ideas avoids even engaging on the issues. And when it comes to the ideas -- here I am talking most about fiscal policy -- as I've already noted in the previous post, the types of policies Krugman, DeLong, and others have advocated (and I should include myself as well) can be fully supported using modern macroeconomic models. ...
So, in answer to those who objected to my defending modern macro, you are partly right. I do think the tools and techniques macroeconomists use have value, and that the standard macro model in use today represents progress. But I also think the standard macro model used for policy analysis, the New Keynesian model, is unsatisfactory in many ways and I'm not sure it can be fixed. Maybe it can, but that's not at all clear to me. In any case, in my opinion the people who have strong, knee-jerk reactions whenever someone challenges the standard model in use today are the ones standing in the way of progress. It's fine to respond academically, a contest between the old and the new is exactly what we need to have, but the debate needs to be over ideas rather than an attack on the people issuing the challenges.

Tuesday, April 07, 2015

In Search of Better Macroeconomic Models

I have a new column:

In Search of Better Macroeconomic Models: Modern macroeconomic models did not perform well during the Great Recession. What needs to be done to fix them? Can the existing models be patched up, or are brand new models needed? ...

It's mostly about the recent debate on whether we need microfoundations in macroeconomics.

Saturday, April 04, 2015

'Do not Underestimate the Power of Microfoundations'

Simon Wren-Lewis takes a shot at answering Brad DeLong's question about microfoundations:

Do not underestimate the power of microfoundations: Brad DeLong asks why the New Keynesian (NK) model, which was originally put forth as simply a means of demonstrating how sticky prices within an RBC framework could produce Keynesian effects, has managed to become the workhorse of modern macro, despite its many empirical deficiencies. ... Brad says his question is closely related to the “question of why models that are microfounded in ways we know to be wrong are preferable in the discourse to models that try to get the aggregate emergent properties right.”...
Why are microfounded models so dominant? From my perspective this is a methodological question, about the relative importance of ‘internal’ (theoretical) versus ‘external’ (empirical) consistency. ...
 I would argue that the New Classical (counter) revolution was essentially a methodological revolution. However..., it will be a struggle to get macroeconomists below a certain age to admit this is a methodological issue. Instead they view microfoundations as just putting right inadequacies with what went before.
So, for example, you will be told that internal consistency is clearly an essential feature of any model, even if it is achieved by abandoning external consistency. ... In essence, many macroeconomists today are blind to the fact that adopting microfoundations is a methodological choice, rather than simply a means of correcting the errors of the past.
I think this has two implications for those who want to question the microfoundations hegemony. The first is that the discussion needs to be about methodology, rather than individual models. Deficiencies with particular microfounded models, like the NK model, are generally well understood, and from a microfoundations point of view simply provide an agenda for more research. Second, lack of familiarity with methodology means that this discussion cannot presume knowledge that is not there. ... That makes discussion difficult, but I’m not sure it makes it impossible.

Saturday, March 28, 2015

'Unreal Keynesians'

Paul Krugman:

Unreal Keynesians: Brad DeLong points me to Lars Syll declaring that I am not a “real Keynesian”, because I use equilibrium models and don’t emphasize the instability of expectations. ...
I don’t care whether Hicksian IS-LM is Keynesian in the sense that Keynes himself would have approved of it, and neither should you. What you should ask is whether that approach has proved useful — and whether the critics have something better to offer.
And as I have often argued, these past 6 or 7 years have in fact been a triumph for IS-LM. Those of us using IS-LM made predictions about the quiescence of interest rates and inflation that were ridiculed by many on the right, but have been completely borne out in practice. We also predicted much bigger adverse effects from austerity than usual because of the zero lower bound, and that has also come true. ...
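The mechanism behind those IS-LM predictions can be written compactly. This is the standard textbook system in my own notation (not Krugman's specific rendering):

```latex
\begin{aligned}
\text{IS:}\quad & Y = C(Y - T) + I(i) + G \\
\text{LM:}\quad & \frac{M}{P} = L(i, Y), \qquad i \ge 0
\end{aligned}
```

At the zero lower bound ($i = 0$) the LM curve is flat: increases in $M$ simply add to idle money balances, leaving $i$, $Y$, and hence inflation roughly unchanged, while a cut in $G$ reduces $Y$ through the full multiplier because there is no offsetting fall in interest rates. Those are exactly the two predictions in question: quiescent rates and inflation despite monetary expansion, and unusually costly austerity.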

Wednesday, March 25, 2015

'Anti-Keynesian Delusions'

Paul Krugman continues the discussion on the use of the Keynesian model:

Anti-Keynesian Delusions: I forgot to congratulate Mark Thoma on his tenth blogoversary, so let me do that now. ...
Today Mark includes a link to one of his own columns, a characteristically polite and cool-headed response to the latest salvo from David K. Levine. Brad DeLong has also weighed in, less politely.
I’d like to weigh in with a more general piece of impoliteness, and note a strong empirical regularity in this whole area. Namely, whenever someone steps up to declare that Keynesian economics is logically and empirically flawed, has been proved wrong and refuted, you know what comes next: a series of logical and empirical howlers — crude errors of reasoning, assertions of fact that can be checked and rejected in a minute or two.
Levine doesn’t disappoint. ...

He goes on to explain in detail.

Update: Brad DeLong also comments.

Tuesday, March 24, 2015

'Macro Wars: The Attack of the Anti-Keynesians'

I have a new column:

Macro Wars: The Attack of the Anti-Keynesians, by Mark Thoma: The ongoing war between the Keynesians and the anti-Keynesians appears to be heating up again. The catalyst for this round of fighting is The Keynesian Illusion by David K. Levine, which elicited responses such as this and this from Brad DeLong and Nick Rowe.
The debate is about the source of economic fluctuations and the government’s ability to counteract them with monetary and fiscal policy. One of the issues is the use of “old fashioned” Keynesian models – models that have supposedly been rejected by macroeconomists in favor of modern macroeconomic models – to explain and understand the Great Recession and to make monetary and fiscal policy recommendations. As Levine says, “Robert Lucas, Edward Prescott, and Thomas Sargent … rejected Keynesianism because it doesn't work… As it happens we have developed much better theories…”
I believe the use of “old-fashioned” Keynesian models to analyze the Great Recession can be defended. ...

Monday, March 23, 2015

Paul Krugman: This Snookered Isle

Mediamacro:

This Snookered Isle, by Paul Krugman, Commentary, NY Times: The 2016 election is still 19 mind-numbing, soul-killing months away. There is, however, another important election in just six weeks, as Britain goes to the polls. And many of the same issues are on the table.
Unfortunately, economic discourse in Britain is dominated by a misleading fixation on budget deficits. Worse, this bogus narrative has infected supposedly objective reporting; media organizations routinely present as fact propositions that are contentious if not just plain wrong.
Needless to say, Britain isn’t the only place where things like this happen. A few years ago, at the height of our own deficit fetishism, the American news media showed some of the same vices. ... Reporters would drop all pretense of neutrality and cheer on proposals for entitlement cuts.
In the United States, however, we seem to have gotten past that. Britain hasn’t.
The narrative I’m talking about goes like this: In the years before the financial crisis, the British government borrowed irresponsibly... As a result, by 2010 Britain was at imminent risk of a Greek-style crisis; austerity policies, slashing spending in particular, were essential. And this turn to austerity is vindicated by Britain’s low borrowing costs, coupled with the fact that the economy, after several rough years, is now growing quite quickly.
Simon Wren-Lewis of Oxford University has dubbed this narrative “mediamacro.” As his coinage suggests, this is what you hear all the time on TV and read in British newspapers, presented not as the view of one side of the political debate but as simple fact.
Yet none of it is true. ...
Given all this, you might wonder how mediamacro gained such a hold on British discourse. Don’t blame economists. ... This media orthodoxy has become entrenched despite, not because of, what serious economists had to say.
Still, you can say the same of Bowles-Simpsonism in the United States... It was all about posturing, about influential people believing that pontificating about the need to make sacrifices — or, actually, for other people to make sacrifices — is how you sound wise and serious. ...
As I said, in the United States we have mainly gotten past that, for a variety of reasons — among them, I suspect, the rise of analytical journalism, in places like The Times’s The Upshot. But Britain hasn’t; an election that should be about real problems will, all too likely, be dominated by mediamacro fantasies.

Wednesday, March 18, 2015

'Is the Walrasian Auctioneer Microfounded?'

Simon Wren-Lewis (he says this is "For macroeconomists"):

Is the Walrasian Auctioneer microfounded?: I found this broadside against Keynesian economics by David K. Levine interesting. It is clear at the end that he is a child of the New Classical revolution. Before this revolution he was far from ignorant of Keynesian ideas. He adds: “Knowledge of Keynesianism and Keynesian models is even deeper for the great Nobel Prize winners who pioneered modern macroeconomics - a macroeconomics with people who buy and sell things, who save and invest - Robert Lucas, Edward Prescott, and Thomas Sargent among others. They also grew up with Keynesian theory as orthodoxy - more so than I. And we rejected Keynesianism because it doesn't work not because of some aesthetic sense that the theory is insufficiently elegant.”
The idea is familiar: New Classical economists do things properly, by founding their analysis in the microeconomics of individual production, savings and investment decisions. It is no surprise therefore that many of today’s exponents of this tradition view their endeavour as a natural extension of the Walrasian General Equilibrium approach associated with Arrow, Debreu and McKenzie. But there is one agent in that tradition that is as far from microfoundations as you can get: the Walrasian auctioneer. It is this auctioneer, and not people, who typically sets prices. ...
Now your basic New Keynesian model contains a huge number of things that remain unrealistic or are just absent. However I have always found it extraordinary that some New Classical economists declare such models as lacking firm microfoundations, when these models at least try to make up for one area where RBC models lack any microfoundations at all, which is price setting. A clear case of the pot calling the kettle black! I have never understood why New Keynesians can be so defensive about their modelling of price setting. Their response every time should be ‘well at least it’s better than assuming an intertemporal auctioneer’. ...
As to the last sentence in the quote from Levine above, I have talked before about the assertion that Keynesian economics did not work, and the implication that RBC models work better. He does not talk about central banks, or monetary policy. If he had, he would have to explain why most of the people working for them seem to believe that New Keynesian type models are helpful in their job of managing the economy. Perhaps these things are not mentioned because it is so much easier to stay living in the 1980s, in those glorious days (for some) when it appeared as if Keynesian economics had been defeated for good.

'Arezki, Ramey, and Sheng on News Shocks'

I was at this conference as well. This paper was very well received (it has been difficult to find evidence that news generates business cycles, in part because it's been difficult to find a "clean" shock):

Arezki, Ramey, and Sheng on news shocks: I attended the NBER EFG (economic fluctuations and growth) meeting a few weeks ago, and saw a very nice paper by Rabah Arezki, Valerie Ramey, and Liugang Sheng, "News Shocks in Open Economies: Evidence from Giant Oil Discoveries" (There were a lot of nice papers, but this one is more bloggable.)

They look at what happens to economies that discover they have a lot of oil. ... An oil discovery is a well identified "news shock."

Standard productivity shocks are a bit nebulous, and they alter two things at once: productivity today (and hence the incentive to work) and news about more income in the future.

An oil discovery is well publicized. It incentivizes a small investment in oil drilling, but mostly is pure news of an income flow in the future. It does not affect overall labor productivity or other changes to preferences or technology.
Rabah, Valerie, and Liugang then construct a straightforward macro model of such an event. ...[describes model and results]...

Valerie, presenting the paper, was a bit discouraged. This "news shock" doesn't generate a pattern that looks like standard recessions, because GDP and employment go in the opposite direction.

I am much more encouraged. Here are macroeconomies behaving exactly as they should, in response to a shock where for once we really know what the shock is. And in response to a shock with a nice dynamic pattern, which we also really understand.

My comment was something to the effect of "this paper is much more important than you think. You match the dynamic response of economies to this large and very well identified shock with a standard, transparent and intuitive neoclassical model. Here's a list of some of the ingredients you didn't need: Sticky prices, sticky wages, money, monetary policy, (i.e. interest rates that respond via a policy rule to output and inflation or zero bounds that stop them from doing so), home bias, segmented financial markets, credit constraints, liquidity constraints, hand-to-mouth consumers, financial intermediation, liquidity spirals, fire sales, leverage, sudden stops, hot money, collateral constraints, incomplete markets, idiosyncratic risks, strange preferences including habits, nonexpected utility, ambiguity aversion, and so forth, behavioral biases, or rare disasters. If those ingredients are really there, they ought to matter for explaining the response to your shocks too. After all, there is only one economic structure, which is hit by many shocks. So your paper calls into question just how many of those ingredients are really there at all."

Thomas Philippon, whose previous paper had a pretty masterful collection of a lot of those ingredients, quickly pointed out my overstatement. One does not need every ingredient to understand every shock. Constraint variables are inequalities. A positive news shock may not cause credit constraints etc. to bind, while a negative shock may reveal them.

Good point. And really, the proof is in the pudding. If those ingredients are not necessary, then I should produce a model without them that produces events like 2008. But we've been debating the ingredients and shock necessary to explain 1932 for 82 years, so that approach, though correct, might take a while.

In the meantime, we can still cheer successful simple models and well identified shocks on the few occasions that they appear and fit data so nicely. Note to graduate students: this paper is a really nice example to follow for its integration of clear theory and excellent empirical work.
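The wealth-effect mechanism behind the "opposite direction" result above can be sketched in a few lines. This is a toy static model of my own construction (log utility over consumption and leisure, a lump-sum "news" income), not the Arezki-Ramey-Sheng model, and the wage and discovery values are made-up numbers:

```python
def household_choice(w, a, b=1.0):
    """Solve max log(c) + b*log(1-n) subject to c = w*n + a.

    w: wage (marginal product of labor)
    a: annuitized value of the news (future oil income per period)
    b: taste for leisure
    Returns (n, c, y) = labor supplied, consumption, non-oil output.
    """
    # The first-order condition is w*(1-n) = b*c, which with the budget
    # constraint gives a closed form for labor supply:
    n = max((w - b * a) / (w * (1.0 + b)), 0.0)
    c = w * n + a
    y = w * n  # non-oil GDP
    return n, c, y

no_news = household_choice(w=1.0, a=0.0)   # before the discovery
news = household_choice(w=1.0, a=0.2)      # a discovery worth 0.2 per period

# News raises consumption but lowers labor and non-oil output: richer
# households buy more leisure, so GDP and employment move opposite to
# consumption even though nothing has gone wrong in the economy.
```

The point of the sketch is Cochrane's point: the comovement pattern that looked "discouraging" is exactly what a frictionless neoclassical household should do with pure news about future income.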

Saturday, March 14, 2015

'John and Maynard’s Excellent Adventure'

Paul Krugman defends IS-LM analysis (I'd make one qualification. Models are built to answer specific questions; we do not have one grand unifying model to use for all questions. IS-LM models were built to answer exactly the kinds of questions we encountered during the Great Recession, and the IS-LM model provided good answers (especially if one remembers where the model encounters difficulties). DSGE models were built to address other issues, and it's not surprising they didn't do very well when they were pushed to address questions they weren't designed to answer. The best model to use depends upon the question one is asking):

John and Maynard’s Excellent Adventure: When I tell people that macroeconomic analysis has been triumphantly successful in recent years, I tend to get strange looks. After all, wasn’t everyone predicting lots of inflation? Didn’t policymakers get it all wrong? Haven’t the academic economists been squabbling nonstop?
Well, as a card-carrying economist I disavow any responsibility for Rick Santelli and Larry Kudlow; I similarly declare that Paul Ryan and Olli Rehn aren’t my fault. As for the economists’ disputes, well, let me get to that in a bit.
I stand by my claim, however. The basic macroeconomic framework that we all should have turned to, the framework that is still there in most textbooks, performed spectacularly well: it made strong predictions that people who didn’t know that framework found completely implausible, and those predictions were vindicated. And the framework in question – basically John Hicks’s interpretation of John Maynard Keynes – was very much the natural way to think about the issues facing advanced countries after 2008. ...
I call this a huge success story – one of the best examples in the history of economics of getting things right in an unprecedented environment.
The sad thing, of course, is that this incredibly successful analysis didn’t have much favorable impact on actual policy. Mainly that’s because the Very Serious People are too serious to play around with little models; they prefer to rely on their sense of what markets demand, which they continue to consider infallible despite having been wrong about everything. But it also didn’t help that so many economists also rejected what should have been obvious.
Why? Many never learned simple macro models – if it doesn’t involve microfoundations and rational expectations, preferably with difficult math, it must be nonsense. (Curiously, economists in that camp have also proved extremely prone to basic errors of logic, probably because they have never learned to work through simple stories.) Others, for what looks like political reasons, seemed determined to come up with some reason, any reason, to be against expansionary monetary and fiscal policy.
But that’s their problem. From where I sit, the past six years have been hugely reassuring from an intellectual point of view. The basic model works; we really do know what we’re talking about.

[The original is quite a bit longer.]

Thursday, March 05, 2015

'Economists' Biggest Failure'

Noah Smith:

Economists' Biggest Failure: One of the biggest things that economists get grief about is their failure to predict big events like recessions. ... 
Pointing this out usually leads to the eternal (and eternally fun) debate over whether economics is a real science. The profession's detractors say that if you don’t make successful predictions, you aren’t a science. Economists will respond that seismologists can’t forecast earthquakes, and meteorologists can’t forecast hurricanes, and who cares what’s really a “science” anyway. 
The debate, however, misses the point. Forecasts aren’t the only kind of predictions a science can make. In fact, they’re not even the most important kind. 
Take physics for example. Sometimes physicists do make forecasts -- for example, eclipses. But those are the exception. Usually, when you make a new physics theory, you use it to predict some new phenomenon... For example, quantum mechanics has gained a lot of support from predicting strange new phenomena like quantum tunneling and quantum teleportation.
Other times, a theory will predict things we have seen before, but will describe them in terms of other things that we thought were totally separate, unrelated phenomena. This is called unification, and it’s a key part of what philosophers think science does. For example, the theory of electromagnetism says that light, electric current, magnetism, and radio waves are all really the same phenomenon. Pretty neat! ...
So that’s physics. What about economics? Actually, econ has a number of these successes too. When Dan McFadden used his Random Utility Model to predict how many people would ride San Francisco's Bay Area Rapid Transit system,... he got it right. And he got many other things right with the same theory -- it wasn’t developed to explain only train ridership. 
Unfortunately, though, this kind of success isn't very highly regarded in the economics world... Maybe now, with the ascendance of empirical economics and a decline in theory, we’ll see a focus on producing fewer but better theories, more unification, and more attempts to make novel predictions. Someday, maybe macroeconomists will even be able to make forecasts! But let’s not get our hopes up.
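For readers unfamiliar with McFadden's framework: under extreme-value taste shocks, the random utility model delivers multinomial logit choice probabilities. Here is a minimal sketch of how it turns utilities into predicted mode shares; the function name and the utility numbers are my own illustrative choices, not McFadden's estimated BART values:

```python
import math

def logit_shares(utilities):
    """Multinomial logit choice probabilities from a random utility model:
    P(j) = exp(V_j) / sum_k exp(V_k), where V_j is the systematic utility
    of alternative j."""
    m = max(utilities.values())  # subtract the max for numerical stability
    expv = {k: math.exp(v - m) for k, v in utilities.items()}
    total = sum(expv.values())
    return {k: e / total for k, e in expv.items()}

# Hypothetical systematic utilities for three commute modes.
shares = logit_shares({"car": 1.0, "bus": 0.2, "BART": 0.5})
# The shares sum to one and rank the modes by utility; summed over the
# population, they become predicted ridership -- the quantity McFadden
# got right for BART before the system opened.
```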

I've addressed this question many times, e.g. in 2009, and to me the distinction is between forecasting the future, and understanding why certain phenomena occur (re-reading, it's a bit repetitive):

Are Macroeconomic Models Useful?: There has been no shortage of effort devoted to predicting earthquakes, yet we still can't see them coming far enough in advance to move people to safety. When a big earthquake hits, it is a surprise. We may be able to look at the data after the fact and see that certain stresses were building, so it looks like we should have known an earthquake was going to occur at any moment, but these sorts of retrospective analyses have not allowed us to predict the next one. The exact timing and location is always a surprise.
Does that mean that science has failed? Should we criticize the models as useless?
No. There are two uses of models. One is to understand how the world works, another is to make predictions about the future. We may never be able to predict earthquakes far enough in advance and with enough specificity to allow us time to move to safety before they occur, but that doesn't prevent us from understanding the science underlying earthquakes. Perhaps as our understanding increases prediction will be possible, and for that reason scientists shouldn't give up trying to improve their models, but for now we simply cannot predict the arrival of earthquakes.
However, even though earthquakes cannot be predicted, at least not yet, it would be wrong to conclude that science has nothing to offer. First, understanding how earthquakes occur can help us design buildings and make other changes to limit the damage even if we don't know exactly when an earthquake will occur. Second, if an earthquake happens and, despite our best efforts to insulate against it there are still substantial consequences, science can help us to offset and limit the damage. To name just one example, the science surrounding disease transmission helps us to avoid contaminated water supplies after a disaster, something that often compounds tragedy when this science is not available. But there are lots of other things we can do as well, including using the models to determine where help is most needed.
So even if we cannot predict earthquakes, and we can't, the models are still useful for understanding how earthquakes happen. This understanding is valuable because it helps us to prepare for disasters in advance, and to determine policies that will minimize their impact after they happen.
All of this can be applied to macroeconomics. Whether or not we should have predicted the financial earthquake is a question that has been debated extensively, so I am going to set that aside. One side says financial market price changes, like earthquakes, are inherently unpredictable -- we will never predict them no matter how good our models get (the efficient markets types). The other side says the stresses that were building were obvious. Like the stresses that build when tectonic plates moving in opposite directions rub against each other, it was only a question of when, not if. (But even when increasing stress between two plates is observable, scientists cannot tell you for sure whether a series of small earthquakes will relieve the stress and do little harm, or whether there will be one big adjustment that relieves the stress all at once.) With respect to the financial crisis, economists expected lots of small, stress-relieving adjustments; instead we got the "big one," and the "buildings and other structures" we thought could withstand the shock all came crumbling down. On prediction in economics, perhaps someday improved models will allow us to do better than we have so far at predicting the exact timing of crises, and I think earthquakes provide some guidance here. You have to ask first whether stress is building in a particular sector, and then whether action needs to be taken because the stress has reached dangerous levels, levels that might result in a big crash rather than a series of small stress-relieving adjustments. I don't think our models are very good at detecting accumulating stress...
Whether the financial crisis should have been predicted or not, the fact that it wasn't predicted does not mean that macroeconomic models are useless any more than the failure to predict earthquakes implies that earthquake science is useless. As with earthquakes, even when prediction is not possible (or missed), the models can still help us to understand how these shocks occur. That understanding is useful for getting ready for the next shock, or even preventing it, and for minimizing the consequences of shocks that do occur. 
But we have done much better at dealing with the consequences of unexpected shocks ex-post than we have at getting ready for these a priori. Our equivalent of getting buildings ready for an earthquake before it happens is to use changes in institutions and regulations to insulate the financial sector and the larger economy from the negative consequences of financial and other shocks. Here I think economists made mistakes - our "buildings" were not strong enough to withstand the earthquake that hit. We could argue that the shock was so big that no amount of reasonable advance preparation would have stopped the "building" from collapsing, but I think it's more the case that enough time has passed since the last big financial earthquake that we forgot what we needed to do. We allowed new buildings to be constructed without the proper safeguards.
However, that doesn't mean the models themselves were useless. The models were there and could have provided guidance, but the implied "building codes" were ignored. Greenspan and others assumed no private builder would ever construct a building that couldn't withstand an earthquake; the market would force them to take this into consideration. But they were wrong about that, and even Greenspan now admits that government building codes are necessary. It wasn't the models, it was how they were used (or rather not used) that prevented us from putting safeguards into place.
We haven't failed at this entirely, though. For example, we have had some success at putting safeguards into place before shocks occur: automatic stabilizers have done a lot to insulate against the negative consequences of the recession (though they could have been larger to stop the building from swaying as much as it has). So it's not proper to say that our models have not helped us to prepare in advance at all; the insulation social insurance programs provide is extremely important to recognize. But it is the case that we could have and should have done better at preparing before the shock hit.
I'd argue that our most successful use of models has been in cleaning up after shocks rather than predicting, preventing, or insulating against them through pre-crisis preparation. When, despite our best efforts to prevent a recession or minimize its impact ex ante, we get one anyway, we can use our models as a guide to monetary, fiscal, and other policies that help to reduce the consequences of the shock (this is the equivalent of, after a disaster hits, making sure that the water is safe to drink, people have food to eat, there is a plan for rebuilding quickly and efficiently, etc.). As noted above, we haven't done a very good job at predicting big crises, and we could have done a much better job at implementing regulatory and institutional changes that prevent or limit the impact of shocks. But we do a pretty good job of stepping in with policy actions that minimize the impact of shocks after they occur. This recession was bad, but it wasn't another Great Depression, as it might have been without policy intervention.
Whether or not we will ever be able to predict recessions reliably, it's important to recognize that our models still provide considerable guidance for actions we can take before and after large shocks that minimize their impact and maybe even prevent them altogether (though we will have to do a better job of listening to what the models have to say). Prediction is important, but it's not the only use of models.

Monday, January 26, 2015

'Does Monopoly Power Cause Inflation? (1968 and all that)'

Nick Rowe:

Does monopoly power cause inflation? (1968 and all that): Here's a question for you: Suppose there is a permanent increase in monopoly power across the economy (either firms having more monopoly power in output markets, or unions having more monopoly power in labour markets). Would that permanent increase in monopoly power cause a permanent increase in the inflation rate?
Most economists today would answer "no" to that question. It might maybe cause a temporary once-and-for-all rise in the price level, but it would not cause a permanent increase in the inflation rate. The question just sounds strange to modern economists' ears. They would much prefer to discuss whether a permanent increase in monopoly power caused a permanent reduction in real output and employment. What has monopoly power got to do with inflation?
To economists 40 or 50 years ago, the question would not have sounded strange at all. Many (maybe most?) economists would have answered "yes" to that question. ...
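The modern "no" can be seen with a little arithmetic. Here is a toy sketch (mine, not Rowe's; the markup numbers and cost-growth rate are made up for illustration): let the price be a markup over marginal cost, with marginal cost growing at a steady rate g. A permanent jump in the markup then shows up as a one-time spike in measured inflation, after which inflation settles back to g:

```python
# Toy illustration (not from Rowe's post): a one-time permanent increase
# in monopoly markup raises the price level once, not the inflation rate.
# Price each period: P_t = markup_t * MC_t, with marginal cost growing
# at a steady rate g, so "background" inflation is g in steady state.

def simulate(markups, mc_growth=0.02, mc0=1.0):
    """Return the inflation rate each period given a path of markups."""
    prices = []
    mc = mc0
    for markup in markups:
        prices.append(markup * mc)
        mc *= 1 + mc_growth  # marginal cost grows at rate g every period
    return [prices[t] / prices[t - 1] - 1 for t in range(1, len(prices))]

# Markup jumps permanently from 1.1 to 1.3 in period 5.
markups = [1.1] * 5 + [1.3] * 5
inflation = simulate(markups)

# Inflation spikes once when the markup rises, then returns to g = 2%.
print([round(x, 3) for x in inflation])
```

Under these assumptions the markup level pins down the price level, while only the growth rate of costs pins down the inflation rate - which is why the question sounds strange to modern ears.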


Saturday, January 10, 2015

'Orthodoxy, Heterodoxy, and Ideology'

Paul Krugman:

Orthodoxy, Heterodoxy, and Ideology: Many economists responded badly to the economic crisis. And there’s a lot wrong with mainstream economic analysis. But how closely are these two assertions related? Not as much as you might think. So I’m very much in accord with Simon Wren-Lewis on the remarkable unhelpfulness of recent heterodox assaults on the field. Not that there’s anything wrong with being heterodox in general; but a lot of what we’ve been seeing misidentifies the problem, and if anything gives aid and comfort to the wrong people.
The point is that standard macroeconomics does NOT justify the attacks on fiscal stimulus and the embrace of austerity. On these issues, people like Simon and myself have been following well-established models and analyses, while the austerians have been making up new stuff and/or rediscovering old fallacies to justify the policies they want. Formal modeling and quantitative analysis doesn’t justify the austerian position; on the contrary, austerians had to throw out the models and abandon statistical principles to justify their claims.
Let’s look at several examples. ...

See also Chris Dillow: Heterodox economics & the left.

It's remarkable how many people rejected the conclusions of *modern* macroeconomic models (or invented nonsense) in order to oppose fiscal policy. It seemed to have more to do with ideology (the government can't possibly help no matter what the model says...) and identification (I'm a serious macroeconomist, don't lump me in with all those old-fashioned Keynesian hippie types) than with standard macroeconomic analysis.

On this point, see Simon Wren-Lewis: Faith based macroeconomics.

Thursday, December 18, 2014

'What’s the Matter with Economics?': An Exchange

Arnold Packer and Jeff Madrick respond to Alan Blinder in the NYRB, and he replies:

‘What’s the Matter with Economics?’: An Exchange: In response to: What’s the Matter with Economics? from the December 18, 2014 issue ...
To the Editors:
Alan Blinder is one of the finest mainstream economists around. But to read his review of my book, you’d think that nothing was wrong with economics in recent decades except as it is practiced by a few right-wingers.
This is of course not the case. ...
Jeff Madrick
New York City
Alan S. Blinder replies:
According to both Jeff Madrick and Arnie Packer, I claim “that except for some right-wingers outside the ‘mainstream’…little is the matter” with economics. (These are Packer’s words; Madrick’s are similar.) But it’s not true. I think there is lots wrong with mainstream economics.
For starters, my review explicitly agreed with Madrick that (a) ideological predispositions infect economists’ conclusions far too much; (b) economics has drifted to the right (along with the American body politic); and (c) some economists got carried away by the allure of the efficient markets hypothesis. I also added a few indictments of my own: that we economists have failed to convey even the most basic economic principles to the public; and that some of our students turned Adam Smith’s invisible hand into Gordon Gekko’s “greed is good.” ...
Yet Madrick still insists that “economists rely on a fairly pure version of the invisible hand most of the time.” Not us mainstreamers. I’m a member of the tribe, I live among these people every day, and—trust me—we really don’t apply the “pure version” to the real world. For example, many of us see reasons for a minimum wage, mandatory Social Security, progressive taxation, carbon taxes, and a whole variety of financial regulations—to name just a few. ...

[Hard to summarize this one with a few excerpts -- I left a lot out...]

Sunday, December 14, 2014

Real Business Cycle Theory

Roger Farmer:

Real business cycle theory and the high school Olympics: I have lost count of the number of times I have heard students and faculty repeat the idea in seminars, that “all models are wrong”. This aphorism, attributed to George Box, is the battle cry of the Minnesota calibrator, a breed of macroeconomist, inspired by Ed Prescott, one of the most important and influential economists of the last century.
Of course all models are wrong. That is trivially true: it is the definition of a model. But the cry has been used for three decades to poke fun at attempts to use serious econometric methods to analyze time series data. Time series methods were inconvenient to the nascent Real Business Cycle Program that Ed pioneered because the models that he favored were, and still are, overwhelmingly rejected by the facts. That is inconvenient. Ed’s response was pure genius. If the model and the data are in conflict, the data must be wrong. ...

After explaining, he concludes:

We don't have to play by Ed's rules. We can use the methods developed by Rob Engle and Clive Granger as I have done here. Once we allow aggregate demand to influence permanently the unemployment rate, the data do not look kindly on either real business cycle models or on the new-Keynesian approach. It's time to get serious about macroeconomic science...

Thursday, November 27, 2014

MarkSpeaks

Simon Wren-Lewis:

As Mark Thoma often says, the problem is with macroeconomists rather than macroeconomics.

Much, much more here.

Saturday, November 15, 2014

'The Unwisdom of Crowding Out'

Here's Paul Krugman's response to the Vox EU piece by Peter Temin and David Vines that I posted yesterday:

The Unwisdom of Crowding Out (Wonkish): I am, to my own surprise, not too happy with the defense of Keynes by Peter Temin and David Vines in VoxEU. Peter and David are of course right that Keynes has a lot to teach us, and are also right that the anti-Keynesians aren’t just making really bad arguments; they’re making the very same really bad arguments Keynes refuted 80 years ago.
But the Temin-Vines piece seems to conflate several different bad arguments under the heading of “Ricardian equivalence”, and in so doing understates the badness.
The anti-Keynesian proposition is that government spending to boost a depressed economy will fail, because it will lead to an equal or greater fall in private spending — it will crowd out investment and maybe consumption, and therefore accomplish nothing except a shift in who spends. But why do the AKs claim this will happen? I actually see five arguments out there — two (including the actual Ricardian equivalence argument) completely and embarrassingly wrong on logical grounds, three more that aren’t logical nonsense but fly in the face of the evidence.
Here they are...[explains all five]...

He ends with:

My point is that you do a disservice to the debate by calling all of these things Ricardian equivalence; and the nature of that disservice is that you end up making the really, really bad arguments sound more respectable than they are. We do not want to lose sight of the fact that many influential people, including economists with impressive CVs, responded to macroeconomic crisis with crude logical fallacies that reflected not just sloppy thinking but ignorance of history.

Tuesday, October 28, 2014

Are Economists Ready for Income Redistribution?

I have a new column:

Are Economists Ready for Income Redistribution?: When the Great Recession hit and it became clear that monetary policy alone would not be enough to prevent a severe, prolonged downturn, fiscal policy measures – a combination of tax cuts and new spending – were used to try to limit the damage to the economy. Unfortunately, research on fiscal policy was all but absent from the macroeconomics literature and, for the most part, policymakers were operating in the dark, basing decisions on what they believed to be true rather than on solid theoretical and empirical evidence.
Fiscal policy will be needed again in the future, either in a severe downturn or perhaps to address the problem of growing inequality, and macroeconomists must do a better job of providing the advice that policymakers need to make informed fiscal policy decisions. ...

The question of redistribution is coming, and we need to be ready when it does.

Tuesday, October 14, 2014

'The Mythical Phillips Curve?'

An entry in the ongoing debate over the Phillips curve:

The mythical Phillips curve?, by Simon Wren-Lewis, mainly macro: Suppose you had just an hour to teach the basics of macroeconomics, what relationship would you be sure to include? My answer would be the Phillips curve. With the Phillips curve you can go a long way to understanding what monetary policy is all about.
My faith in the Phillips curve comes from simple but highly plausible ideas. In a boom, demand is strong relative to the economy’s capacity to produce, so prices and wages tend to rise faster than in an economic downturn. However workers do not normally suffer from money illusion: in a boom they want higher real wages to go with increasing labour supply. Equally firms are interested in profit margins, so if costs rise, so will prices. As firms do not change prices every day, they will think about future as well as current costs. That means that inflation depends on expected inflation as well as some indicator of excess demand, like unemployment.
Microfoundations confirm this logic, but add a crucial point that is not immediately obvious. Inflation today will depend on expectations about inflation in the future, not expectations about current inflation. That is the major contribution of New Keynesian theory to macroeconomics. ...[turns to evidence]...

Is it this data which makes me believe in the Phillips curve? To be honest, no. Instead it is the basic theory that I discussed at the beginning of this post. It may also be because I’m old enough to remember the 1970s when there were still economists around who denied that lower unemployment would lead to higher inflation, or who thought that the influence of expectations on inflation was weak, or who thought any relationship could be negated by direct controls on wages and prices, with disastrous results. But given how ‘noisy’ macro data normally is, I find the data I have shown here pretty consistent with my beliefs.
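The relationship Wren-Lewis describes - inflation depends on expected inflation plus some indicator of excess demand - can be put in a few lines. A toy sketch (not his model; the coefficient `beta`, the natural rate `u_star`, and the adaptive-expectations assumption are all mine for illustration):

```python
# A minimal expectations-augmented Phillips curve (a textbook sketch,
# not Wren-Lewis's model):
#     pi_t = expected_pi + beta * (u_star - u_t)
# where u_star is the "natural" unemployment rate. With adaptive
# expectations (expected_pi = last period's inflation), holding u below
# u_star makes inflation ratchet upward period after period.

def phillips(u_path, u_star=0.05, beta=0.5, pi0=0.02):
    """Inflation path under adaptive expectations."""
    pi = pi0
    path = []
    for u in u_path:
        pi = pi + beta * (u_star - u)  # expected inflation = last pi
        path.append(pi)
    return path

# Unemployment held one point below the natural rate for five periods:
print([round(p, 3) for p in phillips([0.04] * 5)])
# → [0.025, 0.03, 0.035, 0.04, 0.045]
```

Under these assumptions there is no permanent trade-off: keeping unemployment below the natural rate buys ever-rising inflation, not a one-time increase, while at u = u_star inflation simply stays where expectations put it.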

Monday, October 06, 2014

'Is Keynesian Economics Left Wing?'

A small part of a much longer argument/post by Simon Wren-Lewis:

More asymmetries: Is Keynesian economics left wing?: ...So why is there this desire to deny the importance of Keynesian theory coming from the political right? Perhaps it is precisely because monetary policy is necessary to ensure aggregate demand is neither excessive nor deficient. Monetary policy is state intervention: by setting a market price, an arm of the state ensures the macroeconomy works. When this particular procedure fails to work, in a liquidity trap for example, state intervention of another kind is required (fiscal policy). While these statements are self-evident to many mainstream economists, to someone of a neoliberal or ordoliberal persuasion they are discomforting. At the macroeconomic level, things only work well because of state intervention. This was so discomforting that New Classical economists attempted to create an alternative theory of business cycles where booms and recessions were nothing to be concerned about, but just the optimal response of agents to exogenous shocks.
So my argument is that Keynesian theory is not left wing, because it is not about market failure - it is just about how the macroeconomy works. On the other hand anti-Keynesian views are often politically motivated, because the pivotal role the state plays in managing the macroeconomy does not fit the ideology. ...

Friday, September 26, 2014

'The New Classical Clique'

Paul Krugman continues the conversation on New Classical economics:

The New Classical Clique: Simon Wren-Lewis thinks some more about macroeconomics gone astray; Robert J. Waldmann weighs in. For those new to this conversation, the question is why starting in the 1970s much of academic macroeconomics was taken over by a school of thought that began by denying any useful role for policies to raise demand in a slump, and eventually coalesced around denial that the demand side of the economy has any role in causing slumps.
I was a grad student and then an assistant professor as this was happening, albeit doing international economics – and international macro went in a different direction, for reasons I’ll get to in a bit. So I have some sense of what was really going on. And while both Wren-Lewis and Waldmann hit on most of the main points, neither I think gets at the important role of personal self-interest. New classical macro was and still is many things – an ideological bludgeon against liberals, a showcase for fancy math, a haven for people who want some kind of intellectual purity in a messy world. But it’s also a self-promoting clique. ...

Wednesday, September 24, 2014

Where and When Macroeconomics Went Wrong

Simon Wren-Lewis:

Where macroeconomics went wrong: In my view, the answer is in the 1970/80s with the New Classical revolution (NCR). However I also think the new ideas that came with that revolution were progressive. I have defended rational expectations, I think intertemporal theory is the right place to start in thinking about consumption, and exploring the implications of time inconsistency is very important to macro policy, as well as many other areas of economics. I also think, along with nearly all macroeconomists, that the microfoundations approach to macro (DSGE models) is a progressive research strategy.
That is why discussion about these issues can become so confused. New Classical economics made academic macroeconomics take a number of big steps forward, but a couple of big steps backward at the same time. The clue to the backward steps comes from the name NCR. The research program was anti-Keynesian (hence New Classical), and it did not want microfounded macro to be an alternative to the then dominant existing methodology, it wanted to replace it (hence revolution). Because the revolution succeeded (although the victory over Keynesian ideas was temporary), generations of students were taught that Keynesian economics was out of date. They were not taught about the pros and cons of the old and new methodologies, but were taught that the old methodology was simply wrong. And that teaching was/is a problem because it itself is wrong. ...

Tuesday, September 16, 2014

Rethinking New Economic Thinking

I have a new column:

Rethinking New Economic Thinking: Efforts such as Rethinking Economics and The Institute for New Economic Thinking are noteworthy attempts to, as INET says, “broaden and accelerate the development of new economic thinking that can lead to solutions for the great challenges of the 21st century. The havoc wrought by our recent global financial crisis has vividly demonstrated the deficiencies in our outdated current economic theories, and shown the need for new economic thinking – right now.”
It is certainly true that mainstream, modern macroeconomic models failed us prior to and during the Great Recession. The models failed to give any warning at all about the crisis that was about to hit – if anything those using modern macro models resisted the idea that a bubble was inflating in housing markets – and the models failed to give us the guidance we needed to implement effective monetary and fiscal policy responses to our economic problems. 
But amid the calls for change in macroeconomics there is far too much attention on the tools and techniques that macroeconomists use to answer questions, and far too little attention on what really matters... ...[continue reading]...

'Making the Case for Keynes'

Peter Temin and David Vines have a new book:

Making the case for Keynes, by Peter Dizikes, MIT News Office: In 1919, when the victors of World War I were concluding their settlement against Germany — in the form of the Treaty of Versailles — one of the leading British representatives at the negotiations angrily resigned his position, believing the debt imposed on the losers would be too harsh. The official, John Maynard Keynes, argued that because Britain had benefitted from export-driven growth, forcing the Germans to spend their money paying back debt rather than buying British products would be counterproductive for everyone, and slow global growth.
Keynes’ argument, outlined in his popular 1919 book, “The Economic Consequences of the Peace,” proved prescient. But Keynes is not primarily regarded as a theorist of international economics: His most influential work, “The General Theory of Employment, Interest, and Money,” published in 1936, uses the framework of a single country with a closed economy. From that model, Keynes arrived at his famous conclusion that government spending can reduce unemployment by boosting aggregate demand.
But in reality, says Peter Temin, an MIT economic historian, Keynes’ conclusions about demand and employment were long intertwined with his examination of international trade; Keynes was thinking globally, even when modeling locally.
“Keynes was interested in the world economy, not just in a single national economy,” Temin says. Now he is co-author of a new book on the subject, “Keynes: Useful Economics for the World Economy,” written with David Vines, a professor of economics at Oxford University, published this month by MIT Press.
In their book, Temin and Vines make the case that Keynesian deficit spending by governments is necessary to reignite the levels of growth that Europe and the world had come to expect prior to the economic downturn of 2008. But in a historical reversal, they believe that today’s Germany is being unduly harsh toward the debtor states of Europe, forcing other countries to pay off debts made worse by the 2008 crash — and, in turn, preventing them from spending productively, slowing growth and inhibiting a larger continental recovery.
“If you have secular [long-term] stagnation, what you need is expansionary fiscal policy,” says Temin, who is the Elisha Gray II Professor Emeritus of Economics at MIT.
Additional government spending is distinctly not the approach that Europe (and, to a lesser extent, the U.S.) has pursued over the last six years, as political leaders have imposed a wide range of spending cuts — the pursuit of “austerity” as a response to hard times. But Temin thinks it is time for the terms of the spending debate to shift.  
“The hope David and I have is that our simple little book might change people’s minds,” Temin says.
“Sticky” wages were the sticking point
In an effort to do so, the authors outline an intellectual trajectory for Keynes in which he was highly concerned with international, trade-based growth from the early stages of his career until his death in 1946, and in which the single-country policy framework of his “General Theory” was a necessary simplification that actually fits neatly with this global vision.
As Temin and Vines see it, Keynes, from early in his career, and certainly by 1919, had developed an explanation of growth in which technical progress leads to greater productive capacity. This leads businesses in advanced countries to search for international markets in which to sell products; encourages foreign lending of capital; and, eventually, produces greater growth by other countries as well.
“Clearly, Keynes knew that domestic prosperity was critically determined by external conditions,” Temin and Vines write.
Yet as they see it, Keynes had to overcome a crucial sticking point in his thought: As late as 1930, when Keynes served on a major British commission investigating the economy, he was still using an older, neoclassical idea in which all markets reached a sort of equilibrium. 
This notion implies that when jobs were relatively scarce, wages would decline to the point where more people would be employed. Yet this doesn’t quite seem to happen: As economists now recognize, and as Keynes came to realize, wages could be “sticky,” and remain at set levels, for various psychological or political reasons. In order to arrive at the conclusions of the “General Theory,” then, Keynes had to drop the assumption that wages would fluctuate greatly.
“The issue for Keynes was that he knew that if prices were flexible, then if all prices [including wages] could change, then you eventually get back to full employment,” Temin says. “So in order to avoid that, he assumed away all price changes.”
But if wages will not drop, how can we increase employment? For Keynes, the answer was that the whole economy had to grow: There needed to be an increase in aggregate demand, one of the famous conclusions of the “General Theory.” And if private employers cannot or will not spend more money on workers, Keynes thought, then the government should step in and spend.
“Keynes is very common-sense,” Temin says, in “that if you put people to work building roads and bridges, then those people spend money, and that promotes aggregate demand.”
Today, opponents of Keynes argue that such public spending will offset private-sector spending without changing overall demand. But Temin contends that private-sector spending “won’t be offset if those people were going to be unemployed, and would not be spending anything.” Given jobs, he notes, “They would spend money, because now they would have money.”
Keynes’ interest in international trade and international economics never vanished, as Temin and Vines see it. Indeed, in the late stages of World War II, Keynes was busy working out proposals that could spur postwar growth within this same intellectual framework — and the International Monetary Fund is one outgrowth of this effort.
History repeating?
“Keynes: Useful Economics for the World Economy” has received advance praise from some prominent scholars. ... Nonetheless, Temin is guarded about the prospect of changing the contemporary austerity paradigm.
“I can’t predict what policy is going to do in the next couple of years,” Temin says. And in the meantime, he thinks, history may be repeating itself, as debtor countries are unable to make capital investments while paying off debt.
Germany has “decided that they are not willing to take any of the capital loss that happened during the crisis,” Temin adds. “The [other] European countries don’t have the resources to pay off these bonds. They’ve had to cut their spending to get the resources to pay off the bonds. If you read the press, you know this hasn’t been working very well.”

Thursday, September 11, 2014

'Trapped in the “Dark Corners”?'

A small part of Brad DeLong's response to Olivier Blanchard. I posted a shortened version of Blanchard's argument a week or two ago:

Where Danger Lurks: Until the 2008 global financial crisis, mainstream U.S. macroeconomics had taken an increasingly benign view of economic fluctuations in output and employment. The crisis has made it clear that this view was wrong and that there is a need for a deep reassessment. ...
That small shocks could sometimes have large effects and, as a result, that things could turn really bad, was not completely ignored by economists. But such an outcome was thought to be a thing of the past that would not happen again, or at least not in advanced economies thanks to their sound economic policies. ... We all knew that there were “dark corners”—situations in which the economy could badly malfunction. But we thought we were far away from those corners, and could for the most part ignore them. ...
The main lesson of the crisis is that we were much closer to those dark corners than we thought—and the corners were even darker than we had thought too. ...
How should we modify our benchmark models—the so-called dynamic stochastic general equilibrium (DSGE) models...? The easy and uncontroversial part of the answer is that the DSGE models should be expanded to better recognize the role of the financial system—and this is happening. But should these models be able to describe how the economy behaves in the dark corners?
Let me offer a pragmatic answer. If macroeconomic policy and financial regulation are set in such a way as to maintain a healthy distance from dark corners, then our models that portray normal times may still be largely appropriate. Another class of economic models, aimed at measuring systemic risk, can be used to give warning signals that we are getting too close to dark corners, and that steps must be taken to reduce risk and increase distance. Trying to create a model that integrates normal times and systemic risks may be beyond the profession’s conceptual and technical reach at this stage.
The crisis has been immensely painful. But one of its silver linings has been to jolt macroeconomics and macroeconomic policy. The main policy lesson is a simple one: Stay away from dark corners.

And I responded:

That may be the best we can do for now (have separate models for normal times and "dark corners"), but an integrated model would be preferable. An integrated model would, for example, be better for conducting "policy and financial regulation ... to maintain a healthy distance from dark corners," and our aspirations ought to include models that can explain both normal and abnormal times. That may mean moving beyond the DSGE class of models, or perhaps the technical reach of DSGE models can be extended to incorporate the kinds of problems that can lead to Great Recessions, but we shouldn't be satisfied with models of normal times that cannot explain and anticipate major economic problems.

Here's part of Brad's response:

But… but… but… Macroeconomic policy and financial regulation are not set in such a way as to maintain a healthy distance from dark corners. We are still in a dark corner now. There is no sign of the 4% per year inflation target, the commitments to do what it takes via quantitative easing and rate guidance to attain it, or a fiscal policy that recognizes how the rules of the game are different for reserve currency printing sovereigns when r < n+g. Thus not only are we still in a dark corner, but there is every reason to believe that, should we get out, the sub-2% per year effective inflation targets of North Atlantic central banks and the inappropriate rhetoric and groupthink surrounding fiscal policy make it highly likely that we will soon get back into yet another dark corner. Blanchard’s pragmatic answer is thus the most unpragmatic thing imaginable: the “if” test fails, and so the “then” part of the argument seems to me to be simply inoperative. Perhaps on another planet in which North Atlantic central banks and governments aggressively pursued 6% per year nominal GDP growth targets Blanchard’s answer would be “pragmatic”. But we are not on that planet, are we?

Moreover, even were we on Planet Pragmatic, it still seems to be wrong. Using current or any visible future DSGE models for forecasting and mainstream scenario planning makes no sense: the DSGE framework imposes restrictions on the allowable emergent properties of the aggregate time series that are routinely rejected at whatever level of frequentist statistical confidence that one cares to specify. The right road is that of Christopher Sims: that of forecasting and scenario planning using relatively unstructured time-series methods that use rather than ignore the correlations in the recent historical data. And for policy evaluation? One should take the historical correlations and argue why reverse-causation and errors-in-variables lead them to underestimate or overestimate policy effects, and possibly get it right. One should not impose a structural DSGE model that identifies the effects of policies but certainly gets it wrong. Sims won that argument. Why do so few people recognize his victory?
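For readers who have not seen them, the "relatively unstructured time-series methods" Brad has in mind are VAR-style regressions that let the data's own correlations drive the forecast. A miniature single-variable sketch (mine, not Sims's actual methodology): fit y_t = a + b*y_{t-1} by least squares and iterate it forward.

```python
# Miniature illustration of "unstructured" time-series forecasting
# (a one-variable stand-in for a VAR, not Sims's actual methodology):
# fit an AR(1), y_t = a + b * y_{t-1} + e_t, by ordinary least squares,
# letting the data's own correlations do the work, then forecast.

def fit_ar1(series):
    """OLS estimates (a, b) for y_t = a + b * y_{t-1}."""
    x, y = series[:-1], series[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    den = sum((xi - mx) ** 2 for xi in x)
    b = num / den
    a = my - b * mx
    return a, b

def forecast(series, steps, a, b):
    """Iterate the fitted equation forward from the last observation."""
    path, y = [], series[-1]
    for _ in range(steps):
        y = a + b * y
        path.append(y)
    return path

# Data generated exactly by y_t = 1 + 0.5 * y_{t-1}, starting at 0.0:
data = [0.0]
for _ in range(20):
    data.append(1 + 0.5 * data[-1])

a, b = fit_ar1(data)
print(round(a, 3), round(b, 3))  # recovers roughly a = 1, b = 0.5
```

The point of the toy: nothing structural is imposed; the fitted coefficients come entirely from the correlations in the sample, which is what makes this approach robust where a misspecified structural model is not.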

Blanchard continues:

Another class of economic models, aimed at measuring systemic risk, can be used to give warning signals that we are getting too close to dark corners, and that steps must be taken to reduce risk and increase distance. Trying to create a model that integrates normal times and systemic risks may be beyond the profession’s conceptual and technical reach at this stage…

For the second task, the question is: whose models of tail risk based on what traditions get to count in the tail risks discussion?

And missing is the third task: understanding what Paul Krugman calls the “Dark Age of macroeconomics”, that jahiliyyah that descended on so much of the economic research, economic policy analysis, and economic policymaking communities starting in the fall of 2007, and in which the center of gravity of our economic policymakers still dwell.

Sunday, August 31, 2014

'Where Danger Lurks'

Olivier Blanchard (a much shortened version of his arguments, the entire piece is worth reading):

Where Danger Lurks: Until the 2008 global financial crisis, mainstream U.S. macroeconomics had taken an increasingly benign view of economic fluctuations in output and employment. The crisis has made it clear that this view was wrong and that there is a need for a deep reassessment. ...
That small shocks could sometimes have large effects and, as a result, that things could turn really bad, was not completely ignored by economists. But such an outcome was thought to be a thing of the past that would not happen again, or at least not in advanced economies thanks to their sound economic policies. ... We all knew that there were “dark corners”—situations in which the economy could badly malfunction. But we thought we were far away from those corners, and could for the most part ignore them. ...
The main lesson of the crisis is that we were much closer to those dark corners than we thought—and the corners were even darker than we had thought. ...
How should we modify our benchmark models—the so-called dynamic stochastic general equilibrium (DSGE) models...? The easy and uncontroversial part of the answer is that the DSGE models should be expanded to better recognize the role of the financial system—and this is happening. But should these models be able to describe how the economy behaves in the dark corners?
Let me offer a pragmatic answer. If macroeconomic policy and financial regulation are set in such a way as to maintain a healthy distance from dark corners, then our models that portray normal times may still be largely appropriate. Another class of economic models, aimed at measuring systemic risk, can be used to give warning signals that we are getting too close to dark corners, and that steps must be taken to reduce risk and increase distance. Trying to create a model that integrates normal times and systemic risks may be beyond the profession’s conceptual and technical reach at this stage.
The crisis has been immensely painful. But one of its silver linings has been to jolt macroeconomics and macroeconomic policy. The main policy lesson is a simple one: Stay away from dark corners.

That may be the best we can do for now (have separate models for normal times and "dark corners"), but an integrated model would be preferable. An integrated model would, for example, be better for conducting "policy and financial regulation ... to maintain a healthy distance from dark corners," and our aspirations ought to include models that can explain both normal and abnormal times. That may mean moving beyond the DSGE class of models, or perhaps the technical reach of DSGE models can be extended to incorporate the kinds of problems that can lead to Great Recessions, but we shouldn't be satisfied with models of normal times that cannot explain and anticipate major economic problems.

Tuesday, August 19, 2014

The Agent-Based Method

Rajiv Sethi:

The Agent-Based Method: It's nice to see some attention being paid to agent-based computational models on economics blogs, but Chris House has managed to misrepresent the methodology so completely that his post is likely to do more harm than good. 

In comparing the agent-based method to the more standard dynamic stochastic general equilibrium (DSGE) approach, House begins as follows:

Probably the most important distinguishing feature is that, in an ABM, the interactions are governed by rules of behavior that the modeler simply encodes directly into the individuals who populate the environment.

So far so good, although I would not have used the qualifier "simply", since encoded rules can be highly complex. For instance, an ABM that seeks to describe the trading process in an asset market may have multiple participant types (liquidity, information, and high-frequency traders for instance) and some of these may be using extremely sophisticated strategies.

How does this approach compare with DSGE models? House argues that the key difference lies in assumptions about rationality and self-interest:

People who write down DSGE models don’t do that. Instead, they make assumptions on what people want. They also place assumptions on the constraints people face. Based on the combination of goals and constraints, the behavior is derived. The reason that economists set up their theories this way – by making assumptions about goals and then drawing conclusions about behavior – is that they are following in the central tradition of all of economics, namely that allocations and decisions and choices are guided by self-interest. This goes all the way back to Adam Smith and it’s the organizing philosophy of all economics. Decisions and actions in such an environment are all made with an eye towards achieving some goal or some objective. For consumers this is typically utility maximization – a purely subjective assessment of well-being.  For firms, the objective is typically profit maximization. This is exactly where rationality enters into economics. Rationality means that the “agents” that inhabit an economic system make choices based on their own preferences.

This, to say the least, is grossly misleading. The rules encoded in an ABM could easily specify what individuals want and then proceed from there. For instance, we could start from the premise that our high-frequency traders want to maximize profits. They can only do this by submitting orders of various types, the consequences of which will depend on the orders placed by others. Each agent can have a highly sophisticated strategy that maps historical data, including the current order book, into new orders. The strategy can be sensitive to beliefs about the stream of income that will be derived from ownership of the asset over a given horizon, and may also be sensitive to beliefs about the strategies in use by others. Agents can be as sophisticated and forward-looking in their pursuit of self-interest in an ABM as you care to make them; they can even be set up to make choices based on solutions to dynamic programming problems, provided that these are based on private beliefs about the future that change endogenously over time. 

What you cannot have in an ABM is the assumption that, from the outset, individual plans are mutually consistent. That is, you cannot simply assume that the economy is tracing out an equilibrium path. The agent-based approach is at heart a model of disequilibrium dynamics, in which the mutual consistency of plans, if it arises at all, has to do so endogenously through a clearly specified adjustment process. This is the key difference between the ABM and DSGE approaches, and it's right there in the acronym of the latter.

A typical (though not universal) feature of agent-based models is an evolutionary process that allows successful strategies to proliferate over time at the expense of less successful ones. Since success itself is frequency dependent---the payoffs to a strategy depend on the prevailing distribution of strategies in the population---we have strong feedback between behavior and environment. Returning to the example of trading, an arbitrage-based strategy may be highly profitable when rare but much less so when prevalent. This rich feedback between environment and behavior, with the distribution of strategies determining the environment faced by each, and the payoffs to each strategy determining changes in their composition, is a fundamental feature of agent-based models. In failing to understand this, House makes claims that are close to being the opposite of the truth: 
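This feedback loop is easy to exhibit in a few lines. The toy sketch below is mine, not Sethi's, and its payoff functions are invented: an "arbitrage" strategy whose payoff falls as more agents adopt it, with population shares updated by replicator-style dynamics so that above-average strategies proliferate.

```python
# Toy frequency-dependent strategy evolution: payoffs depend on the mix of
# strategies in the population, and the mix evolves toward higher payoffs.
# All payoff numbers are invented for illustration.

def arbitrage_payoff(share_arb):
    """Arbitrage is profitable when rare, competed away when common."""
    return 2.0 * (1.0 - share_arb)

def passive_payoff(share_arb):
    """A flat baseline payoff for everyone else."""
    return 0.8

def evolve(share_arb, steps=200, rate=0.1):
    """Discrete replicator dynamics: above-average strategies gain share."""
    for _ in range(steps):
        pa = arbitrage_payoff(share_arb)
        avg = share_arb * pa + (1.0 - share_arb) * passive_payoff(share_arb)
        share_arb += rate * share_arb * (pa - avg)
        share_arb = min(max(share_arb, 0.0), 1.0)
    return share_arb

final_share = evolve(0.05)  # arbitrage spreads from a 5% share
```

Starting from a small share, arbitrageurs spread until their edge is competed away; the population settles where the two payoffs are equal (here at a 60% share), a resting point that emerges from the dynamics rather than being imposed at the outset.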

Ironically, eliminating rational behavior also eliminates an important source of feedback – namely the feedback from the environment to behavior.  This type of two-way feedback is prevalent in economics and it’s why equilibria of economic models are often the solutions to fixed-point mappings. Agents make choices based on the features of the economy.  The features of the economy in turn depend on the choices of the agents. This gives us a circularity which needs to be resolved in standard models. This circularity is cut in the ABMs however since the choice functions do not depend on the environment. This is somewhat ironic since many of the critics of economics stress such feedback loops as important mechanisms.

It is absolutely true that dynamics in agent-based models do not require the computation of fixed points, but this is a strength rather than a weakness, and has nothing to do with the absence of feedback effects. These effects arise dynamically in calendar time, not through some mystical process by which coordination is instantaneously achieved and continuously maintained. 

It's worth thinking about how the learning literature in macroeconomics, dating back to Marcet and Sargent and substantially advanced by Evans and Honkapohja, fits into this schema. Such learning models drop the assumption that beliefs continuously satisfy mutual consistency, and therefore take a small step towards the ABM approach. But it really is a small step, since a great deal of coordination continues to be assumed. For instance, in the canonical learning model, there is a parameter about which learning occurs, and the system is self-referential in that beliefs about the parameter determine its realized value. This allows for the possibility that individuals may hold incorrect beliefs, but limits quite severely---and more importantly, exogenously---the structure of such errors. This is done for understandable reasons of tractability, and allows for analytical solutions and convergence results to be obtained. But far too much coordination in beliefs across individuals is assumed for this to be considered part of the ABM family.
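A stylized version of the canonical self-referential setup (my own toy code, with an invented mapping, not drawn from the learning literature itself): every agent shares one belief about a parameter, the realized value of the parameter depends on that belief, and the belief is updated toward realizations with a decreasing gain, as in recursive least squares.

```python
# Toy self-referential learning in the Marcet-Sargent / Evans-Honkapohja spirit.
# Beliefs determine outcomes; outcomes feed back into beliefs.
# The mapping T and its fixed point are invented for illustration.

def T(belief):
    """Actual law of motion: the realized parameter value given shared beliefs."""
    return 0.5 * belief + 1.0  # rational-expectations fixed point at belief = 2

def learn(belief=0.0, periods=2000):
    """Decreasing-gain (1/t) updating, as in recursive least squares."""
    for t in range(1, periods + 1):
        belief += (T(belief) - belief) / t
    return belief

b_limit = learn()  # converges toward the fixed point T(b) = b, i.e. 2.0
```

Note how much coordination is built in even here: a single shared belief, a single parameter, and a known functional form, which is exactly why this counts as only a small step toward the ABM approach.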

The title of House's post asks (in response to an earlier piece by Mark Buchanan) whether agent-based models really are the future of the discipline. I have argued previously that they are enormously promising, but face one major methodological obstacle that needs to be overcome. This is the problem of quality control: unlike papers in empirical fields (where causal identification is paramount) or in theory (where robustness is key) there is no set of criteria, widely agreed upon, that can allow a referee to determine whether a given set of simulation results provides a deep and generalizable insight into the workings of the economy. One of the most celebrated agent-based models in economics---the Schelling segregation model---is also among the very earliest. Effective and acclaimed recent exemplars are in short supply, though there is certainly research effort at the highest levels pointed in this direction. The claim that such models can displace the equilibrium approach entirely is much too grandiose, but they should be able to find ample space alongside more orthodox approaches in time. 

---

The example of interacting trading strategies in this post wasn't pulled out of thin air; market ecology has been a recurrent theme on this blog. In ongoing work with Yeon-Koo Che and Jinwoo Kim, I am exploring the interaction of trading strategies in asset markets, with the goal of addressing some questions about the impact on volatility and welfare of high-frequency trading. We have found the agent-based approach very useful in thinking about these questions, and I'll present some preliminary results at a session on the methodology at the Rethinking Economics conference in New York next month. The event is free and open to the public, but seating is limited and registration is required. 

Wednesday, August 13, 2014

'Unemployment Fluctuations are Mainly Driven by Aggregate Demand Shocks'

Do the facts have a Keynesian bias?:

Using product- and labour-market tightness to understand unemployment, by Pascal Michaillat and Emmanuel Saez, Vox EU: For the five years from December 2008 to November 2013, the US unemployment rate remained above 7%, peaking at 10% in October 2009. This period of high unemployment is not well understood. Macroeconomists have proposed a number of explanations for the extent and persistence of unemployment during the period, including:

  • High mismatch caused by major shocks to the financial and housing sectors,
  • Low search effort from unemployed workers triggered by long extensions of unemployment insurance benefits, and
  • Low aggregate demand caused by a sudden need to repay debts or pessimism.

But no consensus has been reached.

In our opinion this lack of consensus is due to a gap in macroeconomic theory: we do not have a model that is rich enough to account for the many factors driving unemployment – including aggregate demand – and simple enough to lend itself to pencil-and-paper analysis. ...

In Michaillat and Saez (2014), we develop a new model to inspect the mechanisms behind unemployment fluctuations. The model can be seen as an equilibrium version of the Barro-Grossman model. It retains the architecture of the Barro-Grossman model but replaces the disequilibrium framework on the product and labour markets with an equilibrium matching framework. ...

Through the lens of our simple model, the empirical evidence suggests that prices and real wages are somewhat rigid, and that unemployment fluctuations are mainly driven by aggregate demand shocks.

Tuesday, August 12, 2014

Why Do Macroeconomists Disagree?

I have a new column:

Why Do Macroeconomists Disagree?, by Mark Thoma, The Fiscal Times: On August 9, 2007, the French bank BNP Paribas halted redemptions from three investment funds active in US mortgage markets due to severe liquidity problems, an event that many mark as the beginning of the financial crisis. Now, just over seven years later, economists still can’t agree on what caused the crisis, why it was so severe, and why the recovery has been so slow. We can’t even agree on the extent to which modern macroeconomic models failed, or if they failed at all.
The lack of a consensus within the profession on the economics of the Great Recession, one of the most significant economic events in recent memory, provides a window into the state of macroeconomics as a science. ...