Category Archive for: Macroeconomics

Friday, April 24, 2015

'Unit Roots, Redux'

John Cochrane weighs in on the discussion of unit roots:

Unit roots, redux: Arnold Kling's askblog and Roger Farmer have a little exchange on GDP and unit roots. My two cents here.
I did a lot of work on this topic a long time ago, in "How Big is the Random Walk in GNP?" (the first one), "Permanent and Transitory Components of GNP and Stock Prices" (the last, and I think the best, one), "Multivariate Estimates" with Argia Sbordone, and "A Critique of the Application of Unit Root Tests," particularly appropriate to Roger's battery of tests.
The conclusions, which I still think hold up today:
Log GDP has both random walk and stationary components. Consumption is a pretty good indicator of the random walk component. This is also what the standard stochastic growth model predicts: a random walk technology shock induces a random walk component in output but there are transitory dynamics around that value.
A linear trend in GDP is only visible ex-post, like a "bull" or "bear" market. It's not "wrong" to detrend GDP, but it is wrong to forecast that GDP will return to the linear trend or to take too seriously correlations of linearly detrended series, as Arnold mentions. Treating macro series as cointegrated with one common trend is a better idea.
Log stock prices have random walk and stationary components. Dividends are a pretty good indicator of the random walk component. (Most recently, here.) ...
Both Arnold and Roger claim that unemployment has a unit root. Guys, you must be kidding. ...

He goes on to explain.
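
To make Cochrane's first conclusion concrete, here is a minimal simulation sketch (my illustration, not Cochrane's code; the persistence and volatility parameters are made up). It builds a "GDP-like" series as the sum of a random walk and a stationary AR(1) component, plus an "unemployment-like" series that is purely stationary, and runs augmented Dickey-Fuller tests on each. Keep Cochrane's own critique in mind: in samples of realistic size these tests have little power to distinguish a unit root from a highly persistent stationary process, so the p-values should be read with caution.

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(0)
T = 300  # roughly 75 years of quarterly observations

# "GDP-like" series: random walk (permanent) plus AR(1) (transitory) component.
permanent = np.cumsum(rng.normal(0.0, 0.005, T))
transitory = np.zeros(T)
for t in range(1, T):
    transitory[t] = 0.8 * transitory[t - 1] + rng.normal(0.0, 0.01)
log_gdp = permanent + transitory

# "Unemployment-like" series: stationary AR(1) around a 5.5 percent mean.
deviation = np.zeros(T)
for t in range(1, T):
    deviation[t] = 0.9 * deviation[t - 1] + rng.normal(0.0, 0.3)
unemployment = 5.5 + deviation

# ADF null hypothesis: the series has a unit root.
for name, series in [("log GDP", log_gdp), ("unemployment", unemployment)]:
    stat, pvalue, *_ = adfuller(series)
    print(f"{name}: ADF statistic = {stat:.2f}, p-value = {pvalue:.3f}")
```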

Tuesday, April 21, 2015

'Rethinking Macroeconomic Policy'

Olivier Blanchard at Vox EU:

Rethinking macroeconomic policy: Introduction, by Olivier Blanchard: On 15 and 16 April 2015, the IMF hosted the third conference on “Rethinking Macroeconomic Policy”. I had initially chosen as the title and subtitle “Rethinking Macroeconomic Policy III. Down in the trenches”. I thought of the first conference in 2011 as having identified the main failings of previous policies, the second conference in 2013 as having identified general directions, and this conference as a progress report.
My subtitle was rejected by one of the co-organisers, namely Larry Summers. He argued that I was far too optimistic, that we were nowhere close to knowing where we were going. Arguing with Larry is tough, so I chose an agnostic title, and shifted to “Rethinking Macro Policy III. Progress or confusion?”
Where do I think we are today? I think both Larry and I are right. I do not say this for diplomatic reasons. We are indeed proceeding in the trenches. But where the trenches are eventually going remains unclear. This is the theme I shall develop in my remarks, focusing on macroprudential tools, monetary policy, and fiscal policy.


Saturday, April 18, 2015

NBER Annual Conference on Macroeconomics: Abstracts for Day Two

First paper:

Declining Desire to Work and Downward Trends in Unemployment and Participation, by Regis Barnichon and Andrew Figura: Abstract: The US labor market has witnessed two apparently unrelated trends in the last 30 years: a decline in unemployment between the early 1980s and the early 2000s, and a decline in labor force participation since the early 2000s. We show that a substantial factor behind both trends is a decline in desire to work among individuals outside the labor force, with a particularly strong decline during the second half of the 90s. A decline in desire to work lowers both the unemployment rate and the participation rate, because a nonparticipant who wants to work has a high probability of joining the unemployment pool in the future, while a nonparticipant who does not want to work has a low probability of ever entering the labor force. We use cross-sectional variation to estimate a model of non-participants' propensity to want a job, and we find that changes in the provision of welfare and social insurance, possibly linked to the mid-90s welfare reforms, explain about 50 percent of the decline in desire to work.

Second paper:

External and Public Debt Crises, by Cristina Arellano, Andrew Atkeson, and Mark Wright: Abstract: In recent years, the members of two advanced monetary and economic unions -- the nations of the Eurozone and the states of the United States of America -- experienced debt crises with spreads on government borrowing rising dramatically. Despite the similar behavior of spreads on public debt, these crises were fundamentally different in nature. In Europe, the crisis occurred after a period of significant increases in government indebtedness from levels that were already substantial, whereas in the USA, state government borrowing was limited and remained roughly unchanged. Moreover, whereas the most troubled nations of Europe experienced a sudden stop in private capital flows and private sector borrowers also faced large rises in spreads, there is little evidence that private borrowing in US states was differentially affected by the creditworthiness of state governments. In this sense, we can say that the US states experienced a public debt crisis, whereas the nations of Europe experienced an external debt crisis affecting both public and private borrowers. Why did Europe experience an external debt crisis and the US states only a public debt crisis? And why did the members of other economic unions, such as the provinces of Canada, not experience a debt crisis at all despite high and rising provincial public debt levels? In this paper, we construct a model of default on domestic and external public debt and interference in private external debt contracts and use it to argue that these different debt experiences result from the interplay of differences in the ability of governments to interfere in the private external debt contracts of their citizens with differences in the flexibility of state fiscal institutions. We also assemble a range of empirical evidence that suggests that the US states are less fiscally flexible but more constrained in their ability to interfere in private contracts than the members of other economic unions, which simultaneously exposes the states to public debt crises while insulating them from an external debt crisis affecting private sector borrowers within the state. In contrast, Eurozone nations are more fiscally flexible but have a greater ability to interfere with such contracts, which together allow for more public borrowing at the cost of a joint public and private external debt crisis. Lastly, Canadian provincial governments are both fiscally flexible and limited in their ability to interfere, which allows for more public borrowing and limits the likelihood of either a public or external debt crisis occurring. We draw lessons from these findings for the future design of Eurozone economic and legal institutions.

Friday, April 17, 2015

NBER Annual Conference on Macroeconomics: Abstracts for Day One

First paper at the NBER Annual Conference on Macroeconomics:

Expectations and Investment, by Nicola Gennaioli, Yueran Ma, and Andrei Shleifer: Abstract: Using micro data from Duke University's quarterly survey of Chief Financial Officers, we show that corporate investment plans as well as actual investment are well explained by CFOs’ expectations of earnings growth. The information in expectations data is not subsumed by traditional variables, such as Tobin’s Q or discount rates. We also show that errors in CFO expectations of earnings growth are predictable from past earnings and other data, pointing to an extrapolative structure of expectations and suggesting that expectations may not be rational. This evidence, like earlier findings in finance, points to the usefulness of data on actual expectations for understanding economic behavior.
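
The rationality test sketched in the abstract has a simple logic: under rational expectations, forecast errors should be uncorrelated with anything the forecaster knew at the time. Here is a toy simulation of that logic (my construction with made-up parameters, not the authors' data or code): extrapolative forecasters overweight recent growth, so their errors are predictable from it.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
T = 200

# True earnings growth: a mildly persistent AR(1) process.
rho = 0.3
growth = np.zeros(T)
for t in range(1, T):
    growth[t] = rho * growth[t - 1] + rng.normal()

# Extrapolative forecasts overweight recent growth (chi > rho).
chi = 0.8
forecast = chi * growth[:-1]   # forecast of growth[t] made at t-1
error = growth[1:] - forecast  # realized minus forecast

# Under rational expectations the slope below would be zero;
# under extrapolation it is roughly (rho - chi) < 0.
result = sm.OLS(error, sm.add_constant(growth[:-1])).fit()
print(result.params)
```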

Second paper:

Trends and Cycles in China's Macroeconomy, by Chun Chang, Kaiji Chen, Daniel Waggoner, and Tao Zha: Abstract: We make three contributions in this paper. First, we provide a core of macroeconomic time series usable for systematic research on China. Second, we document, through various empirical methods, the robust findings about striking patterns of trend and cycle. Third, we build a theoretical model that accounts for these facts. The model's mechanism and assumptions are corroborated by institutional details, disaggregated data, and banking time series, all of which are distinctive Chinese characteristics. The departure of our theoretical model from standard ones offers a constructive framework for studying China's macroeconomy.

Third paper:

Demystifying the Chinese Housing Boom, by Hanming Fang, Quanlin Gu, Wei Xiong, and Li-An Zhou: Abstract: We construct housing price indices for 120 major cities in China from 2003 to 2013 based on sequential sales of new homes within the same housing developments. By using these indices and detailed information on mortgage borrowers across these cities, we find enormous housing price appreciation during the decade, which was accompanied by equally impressive growth in household income, except in a few first-tier cities. Housing market participation by households from the low-income fraction of the urban population remained steady. Nevertheless, bottom-income mortgage borrowers endured severe financial burdens by using price-to-income ratios over eight to buy homes, which reflected their expectations of persistently high income growth into the future. Such future income expectations could contract substantially in the event of a sudden stop in the Chinese economy and present an important source of risk to the housing market.

Fourth paper:

Networks and the Macroeconomy: An Empirical Exploration, by Daron Acemoglu, Ufuk Akcigit, and William Kerr: Abstract: The propagation of macroeconomic shocks through input-output and geographic networks can be a powerful driver of macroeconomic fluctuations. We first exposit that in the presence of Cobb-Douglas production functions and consumer preferences, there is a specific pattern of economic transmission whereby demand-side shocks propagate upstream (to input-supplying industries) and supply-side shocks propagate downstream (to customer industries), and that there is a tight relationship between the direct impact of a shock and the magnitudes of the downstream and the upstream indirect effects. We then investigate the short-run propagation of four different types of industry-level shocks: two demand-side ones (the exogenous component of the variation in industry imports from China and changes in federal spending) and two supply-side ones (TFP shocks and variation in knowledge/ideas coming from foreign patenting). In each case, we find substantial propagation of these shocks through the input-output network, with a pattern broadly consistent with theory. Quantitatively, the network-based propagation is larger than the direct effects of the shocks, sometimes severalfold. We also show quantitatively large effects from the geographic network, capturing the fact that the local propagation of a shock to an industry will fall more heavily on other industries that tend to collocate with it across local markets. Our results suggest that the transmission of various different types of shocks through economic networks and industry inter-linkages could have first-order implications for the macroeconomy.
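
The upstream/downstream algebra the abstract refers to has a compact form under Cobb-Douglas: sectoral responses work through the Leontief inverse of the input-output matrix, so indirect network effects pile on top of the direct effect. A toy numpy illustration (a hypothetical three-industry matrix of my own, not the authors' model or data):

```python
import numpy as np

# Hypothetical three-industry input-output share matrix:
# A[i, j] = share of industry i's costs spent on inputs from industry j.
A = np.array([
    [0.0, 0.3, 0.1],
    [0.2, 0.0, 0.2],
    [0.1, 0.1, 0.0],
])

# A 1% TFP improvement in industry 0 (a supply-side shock).
dz = np.array([0.01, 0.0, 0.0])

# Leontief inverse (I - A)^(-1) = I + A + A^2 + ...: the shock reaches
# direct customers (via A), customers of customers (via A^2), and so on.
L = np.linalg.inv(np.eye(3) - A)
dy = L @ dz

print("direct effect:", dz)
print("total effect: ", dy)  # network effects on top of the direct effect
```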

Thursday, April 16, 2015

Video: Rethinking Macro Policy

Rethinking Macro Policy III: Session 3. Monetary Policy in the Future
Chair: José Viñals. Panelists: Ben Bernanke, Gill Marcus, John Taylor

Rethinking Macro Policy III: Session 4. Fiscal Policy in the Future
Chair: Vitor Gaspar. Panelists: Marco Buti, Martin Feldstein, Brad DeLong

Monday, April 13, 2015

In Defense of Modern Macroeconomic Theory

A small part of a much longer post from David Andolfatto (followed by some comments of my own):

In defense of modern macro theory: The 2008 financial crisis was a traumatic event. Like all social trauma, it invoked a variety of emotional responses, including the natural (if unbecoming) human desire to find someone or something to blame. Some of the blame has been directed at segments of the economic profession. It is the nature of some of these criticisms that I'd like to talk about today. ...
The dynamic general equilibrium (DGE) approach is the dominant methodology in macro today. I think this is so because of its power to organize thinking in a logically consistent manner, its ability to generate reasonable conditional forecasts, as well as its great flexibility--a property that permits economists of all political persuasions to make use of the apparatus. ...

The point I want to make here is not that the DGE approach is the only way to go. I am not saying this at all. In fact, I personally believe in the coexistence of many different methodologies. The science of economics is not settled, after all. The point I am trying to make is that the DGE approach is not insensible (despite the claims of many critics who, I think, are sometimes driven by non-scientific concerns). ...

Once again (lest I be misunderstood, which I'm afraid seems unavoidable these days) I am not claiming that DGE is the be-all and end-all of macroeconomic theory. There is still a lot we do not know and I think it would be a good thing to draw on the insights offered by alternative approaches. I do not, however, buy into the accusation that there is "too much math" in modern theory. Math is just a language. Most people do not understand this language and so they have a natural distrust of arguments written in it. ... Before criticizing, either learn the language or appeal to reliable translations...

As for the teaching of macroeconomics, if the crisis has led more professors to pay more attention to financial market frictions, then this is a welcome development. I also fall in the camp that stresses the desirability of teaching more economic history and placing greater emphasis on matching theory with data. ... Thus, one could reasonably expect a curriculum to be modified to include more history, history of thought, heterodox approaches, etc. But this is a far cry from calling for the abandonment of DGE theory. Do not blame the tools for how they were (or were not) used.

I've said a lot of what David says about modern macroeconomic models at one time or another in the past -- for example, that it's not the tools of macroeconomics, it's how they are used. But I do think he leaves out one important factor: the need to ask the right question (and why we didn't prior to the crisis). This is from August 2009:

In The Economist, Robert Lucas responds to recent criticism of macroeconomics ("In Defense of the Dismal Science"). Here's my entry at Free Exchange in response to his essay:

Lucas roundtable: Ask the right questions, by Mark Thoma: In his essay, Robert Lucas defends macroeconomics against the charge that it is "valueless, even harmful", and that the tools economists use are "spectacularly useless".

I agree that the analytical tools economists use are not the problem. We cannot fully understand how the economy works without employing models of some sort, and we cannot build coherent models without using analytic tools such as mathematics. Some of these tools are very complex, but there is nothing wrong with sophistication so long as sophistication itself does not become the main goal, and sophistication is not used as a barrier to entry into the theorist's club rather than an analytical device to understand the world.

But all the tools in the world are useless if we lack the imagination needed to build the right models. We ... have to ask the right questions before we can build the right models.

The problem wasn't the tools that macroeconomists use, it was the questions that we asked. The major debates in macroeconomics had nothing to do with the possibility of bubbles causing a financial system meltdown. That's not to say that there weren't models here and there that touched upon these questions, but the main focus of macroeconomic research was elsewhere. ...

The interesting question to me, then, is why we failed to ask the right questions. ...

Why did we, for the most part, fail to ask the right questions? Was it lack of imagination, was it the sociology within the profession, the concentration of power over what research gets highlighted, the inadequacy of the tools we brought to the problem, the fact that nobody will ever be able to predict these types of events, or something else?

It wasn't the tools, and it wasn't lack of imagination. As Brad DeLong points out, the voices were there—he points to Michael Mussa for one—but those voices were not heard. Nobody listened even though some people did see it coming. So I am more inclined to cite the sociology within the profession or the concentration of power as the main factors that caused us to dismiss these voices. ...

I don't know for sure the extent to which the ability of a small number of people in the field to control the academic discourse led to a concentration of power that stood in the way of alternative lines of investigation, or the extent to which the ideology that market prices always tend to move toward their long-run equilibrium values caused us to ignore voices that foresaw the developing bubble and coming crisis. But something caused most of us to ask the wrong questions, and to dismiss the people who got it right, and I think one of our first orders of business is to understand how and why that happened.

Here's an interesting quote from Thomas Sargent along the same lines:

The criticism of real business cycle models and their close cousins, the so-called New Keynesian models, is misdirected and reflects a misunderstanding of the purpose for which those models were devised. These models were designed to describe aggregate economic fluctuations during normal times when markets can bring borrowers and lenders together in orderly ways, not during financial crises and market breakdowns.

Which to me is another way of saying that we didn't foresee the need to ask questions (and build models) that would be useful in a financial crisis -- we were focused on models that would explain "normal times." That focus was connected to the belief that the Great Moderation would continue -- an arrogance on the part of economists, the conviction that modern policy tools, particularly from the Fed, would prevent major meltdowns, financial or otherwise. That is happening now, so we'll be much more prepared if history repeats itself, but I have to wonder what other questions we should be asking, but aren't.

Let me add one more thing (a few excerpts from a post in 2010) about the sociology within economics:

I want to follow up on the post highlighting attempts to attack the messengers -- attempts to discredit Brad DeLong and Paul Krugman on macroeconomic policy in particular -- rather than engage academically with the message they are delivering (Krugman's response). ...
One of the objections often raised is that Krugman and DeLong are not, strictly speaking, macroeconomists. But if Krugman, DeLong, and others are expressing the theoretical and empirical results concerning macroeconomic policy accurately, does it really matter if we can strictly classify them as macroeconomists? Why is that important except as an attempt to discredit the message they are delivering? ... Attacking people rather than discussing ideas avoids even engaging on the issues. And when it comes to the ideas -- here I am talking most about fiscal policy -- as I've already noted in the previous post, the types of policies Krugman, DeLong, and others have advocated (and I should include myself as well) can be fully supported using modern macroeconomic models. ...
So, in answer to those who objected to my defending modern macro, you are partly right. I do think the tools and techniques macroeconomists use have value, and that the standard macro model in use today represents progress. But I also think the standard macro model used for policy analysis, the New Keynesian model, is unsatisfactory in many ways, and I'm not sure it can be fixed. Maybe it can, but that's not at all clear to me. In any case, in my opinion the people who have strong, knee-jerk reactions whenever someone challenges the standard model in use today are the ones standing in the way of progress. It's fine to respond academically -- a contest between the old and the new is exactly what we need to have -- but the debate needs to be over ideas rather than an attack on the people issuing the challenges.

Tuesday, April 07, 2015

In Search of Better Macroeconomic Models

I have a new column:

In Search of Better Macroeconomic Models: Modern macroeconomic models did not perform well during the Great Recession. What needs to be done to fix them? Can the existing models be patched up, or are brand new models needed? ...

It's mostly about the recent debate on whether we need microfoundations in macroeconomics.

Saturday, April 04, 2015

'Do not Underestimate the Power of Microfoundations'

Simon Wren-Lewis takes a shot at answering Brad DeLong's question about microfoundations:

Do not underestimate the power of microfoundations: Brad DeLong asks why the New Keynesian (NK) model, which was originally put forth as simply a means of demonstrating how sticky prices within an RBC framework could produce Keynesian effects, has managed to become the workhorse of modern macro, despite its many empirical deficiencies. ... Brad says his question is closely related to the “question of why models that are microfounded in ways we know to be wrong are preferable in the discourse to models that try to get the aggregate emergent properties right.”...
Why are microfounded models so dominant? From my perspective this is a methodological question, about the relative importance of ‘internal’ (theoretical) versus ‘external’ (empirical) consistency. ...
 I would argue that the New Classical (counter) revolution was essentially a methodological revolution. However..., it will be a struggle to get macroeconomists below a certain age to admit this is a methodological issue. Instead they view microfoundations as just putting right inadequacies with what went before.
So, for example, you will be told that internal consistency is clearly an essential feature of any model, even if it is achieved by abandoning external consistency. ... In essence, many macroeconomists today are blind to the fact that adopting microfoundations is a methodological choice, rather than simply a means of correcting the errors of the past.
I think this has two implications for those who want to question the microfoundations hegemony. The first is that the discussion needs to be about methodology, rather than individual models. Deficiencies with particular microfounded models, like the NK model, are generally well understood, and from a microfoundations point of view simply provide an agenda for more research. Second, lack of familiarity with methodology means that this discussion cannot presume knowledge that is not there. ... That makes discussion difficult, but I’m not sure it makes it impossible.

Saturday, March 28, 2015

'Unreal Keynesians'

Paul Krugman:

Unreal Keynesians: Brad DeLong points me to Lars Syll declaring that I am not a “real Keynesian”, because I use equilibrium models and don’t emphasize the instability of expectations. ...
I don’t care whether Hicksian IS-LM is Keynesian in the sense that Keynes himself would have approved of it, and neither should you. What you should ask is whether that approach has proved useful — and whether the critics have something better to offer.
And as I have often argued, these past 6 or 7 years have in fact been a triumph for IS-LM. Those of us using IS-LM made predictions about the quiescence of interest rates and inflation that were ridiculed by many on the right, but have been completely borne out in practice. We also predicted much bigger adverse effects from austerity than usual because of the zero lower bound, and that has also come true. ...

Wednesday, March 25, 2015

'Anti-Keynesian Delusions'

Paul Krugman continues the discussion on the use of the Keynesian model:

Anti-Keynesian Delusions: I forgot to congratulate Mark Thoma on his tenth blogoversary, so let me do that now. ...
Today Mark includes a link to one of his own columns, a characteristically polite and cool-headed response to the latest salvo from David K. Levine. Brad DeLong has also weighed in, less politely.
I’d like to weigh in with a more general piece of impoliteness, and note a strong empirical regularity in this whole area. Namely, whenever someone steps up to declare that Keynesian economics is logically and empirically flawed, has been proved wrong and refuted, you know what comes next: a series of logical and empirical howlers — crude errors of reasoning, assertions of fact that can be checked and rejected in a minute or two.
Levine doesn’t disappoint. ...

He goes on to explain in detail.

Update: Brad DeLong also comments.

Tuesday, March 24, 2015

'Macro Wars: The Attack of the Anti-Keynesians'

I have a new column:

Macro Wars: The Attack of the Anti-Keynesians, by Mark Thoma: The ongoing war between the Keynesians and the anti-Keynesians appears to be heating up again. The catalyst for this round of fighting is The Keynesian Illusion by David K. Levine, which elicited responses such as this and this from Brad DeLong and Nick Rowe.
The debate is about the source of economic fluctuations and the government’s ability to counteract them with monetary and fiscal policy. One of the issues is the use of “old fashioned” Keynesian models – models that have supposedly been rejected by macroeconomists in favor of modern macroeconomic models – to explain and understand the Great Recession and to make monetary and fiscal policy recommendations. As Levine says, “Robert Lucas, Edward Prescott, and Thomas Sargent … rejected Keynesianism because it doesn't work… As it happens we have developed much better theories…”
I believe the use of “old-fashioned” Keynesian models to analyze the Great Recession can be defended. ...

Monday, March 23, 2015

Paul Krugman: This Snookered Isle

Mediamacro:

This Snookered Isle, by Paul Krugman, Commentary, NY Times: The 2016 election is still 19 mind-numbing, soul-killing months away. There is, however, another important election in just six weeks, as Britain goes to the polls. And many of the same issues are on the table.
Unfortunately, economic discourse in Britain is dominated by a misleading fixation on budget deficits. Worse, this bogus narrative has infected supposedly objective reporting; media organizations routinely present as fact propositions that are contentious if not just plain wrong.
Needless to say, Britain isn’t the only place where things like this happen. A few years ago, at the height of our own deficit fetishism, the American news media showed some of the same vices. ... Reporters would drop all pretense of neutrality and cheer on proposals for entitlement cuts.
In the United States, however, we seem to have gotten past that. Britain hasn’t.
The narrative I’m talking about goes like this: In the years before the financial crisis, the British government borrowed irresponsibly... As a result, by 2010 Britain was at imminent risk of a Greek-style crisis; austerity policies, slashing spending in particular, were essential. And this turn to austerity is vindicated by Britain’s low borrowing costs, coupled with the fact that the economy, after several rough years, is now growing quite quickly.
Simon Wren-Lewis of Oxford University has dubbed this narrative “mediamacro.” As his coinage suggests, this is what you hear all the time on TV and read in British newspapers, presented not as the view of one side of the political debate but as simple fact.
Yet none of it is true. ...
Given all this, you might wonder how mediamacro gained such a hold on British discourse. Don’t blame economists. ... This media orthodoxy has become entrenched despite, not because of, what serious economists had to say.
Still, you can say the same of Bowles-Simpsonism in the United States... It was all about posturing, about influential people believing that pontificating about the need to make sacrifices — or, actually, for other people to make sacrifices — is how you sound wise and serious. ...
As I said, in the United States we have mainly gotten past that, for a variety of reasons — among them, I suspect, the rise of analytical journalism, in places like The Times’s The Upshot. But Britain hasn’t; an election that should be about real problems will, all too likely, be dominated by mediamacro fantasies.

Wednesday, March 18, 2015

'Is the Walrasian Auctioneer Microfounded?'

Simon Wren-Lewis (he says this is "For macroeconomists"):

Is the Walrasian Auctioneer microfounded?: I found this broadside against Keynesian economics by David K. Levine interesting. It is clear at the end that he is a child of the New Classical revolution. Before this revolution he was far from ignorant of Keynesian ideas. He adds: “Knowledge of Keynesianism and Keynesian models is even deeper for the great Nobel Prize winners who pioneered modern macroeconomics - a macroeconomics with people who buy and sell things, who save and invest - Robert Lucas, Edward Prescott, and Thomas Sargent among others. They also grew up with Keynesian theory as orthodoxy - more so than I. And we rejected Keynesianism because it doesn't work not because of some aesthetic sense that the theory is insufficiently elegant.”
The idea is familiar: New Classical economists do things properly, by founding their analysis in the microeconomics of individual production, savings and investment decisions. It is no surprise therefore that many of today’s exponents of this tradition view their endeavour as a natural extension of the Walrasian General Equilibrium approach associated with Arrow, Debreu and McKenzie. But there is one agent in that tradition that is as far from microfoundations as you can get: the Walrasian auctioneer. It is this auctioneer, and not people, who typically sets prices. ...
Now your basic New Keynesian model contains a huge number of things that remain unrealistic or are just absent. However I have always found it extraordinary that some New Classical economists declare such models as lacking firm microfoundations, when these models at least try to make up for one area where RBC models lack any microfoundations at all, which is price setting. A clear case of the pot calling the kettle black! I have never understood why New Keynesians can be so defensive about their modelling of price setting. Their response every time should be ‘well at least it’s better than assuming an intertemporal auctioneer’. ...
As to the last sentence in the quote from Levine above, I have talked before about the assertion that Keynesian economics did not work, and the implication that RBC models work better. He does not talk about central banks, or monetary policy. If he had, he would have to explain why most of the people working for them seem to believe that New Keynesian type models are helpful in their job of managing the economy. Perhaps these things are not mentioned because it is so much easier to stay living in the 1980s, in those glorious days (for some) when it appeared as if Keynesian economics had been defeated for good.

'Arezki, Ramey, and Sheng on News Shocks'

I was at this conference as well. This paper was very well received (it has been difficult to find evidence that news generates business cycles, in part because it's been difficult to find a "clean" shock):

Arezki, Ramey, and Sheng on news shocks: I attended the NBER EFG (economic fluctuations and growth) meeting a few weeks ago, and saw a very nice paper by Rabah Arezki, Valerie Ramey, and Liugang Sheng, "News Shocks in Open Economies: Evidence from Giant Oil Discoveries" (There were a lot of nice papers, but this one is more bloggable.)

They look at what happens to economies that discover they have a lot of oil. ... An oil discovery is a well identified "news shock."

Standard productivity shocks are a bit nebulous, and they alter two things at once: they raise productivity today, and hence the incentive to work today, and they also carry news about more income in the future.

An oil discovery is well publicized. It incentivizes a small investment in oil drilling, but mostly is pure news of an income flow in the future. It does not affect overall labor productivity or other changes to preferences or technology.
Rabah, Valerie, and Liugang then construct a straightforward macro model of such an event. ...[describes model and results]...

Valerie, presenting the paper, was a bit discouraged. This "news shock" doesn't generate a pattern that looks like standard recessions, because GDP and employment go in the opposite direction.

I am much more encouraged. Here are macroeconomies behaving exactly as they should, in response to a shock where for once we really know what the shock is. And in response to a shock with a nice dynamic pattern, which we also really understand.

My comment was something to the effect of "this paper is much more important than you think. You match the dynamic response of economies to this large and very well identified shock with a standard, transparent and intuitive neoclassical model. Here's a list of some of the ingredients you didn't need: sticky prices, sticky wages, money, monetary policy (i.e., interest rates that respond via a policy rule to output and inflation, or zero bounds that stop them from doing so), home bias, segmented financial markets, credit constraints, liquidity constraints, hand-to-mouth consumers, financial intermediation, liquidity spirals, fire sales, leverage, sudden stops, hot money, collateral constraints, incomplete markets, idiosyncratic risks, strange preferences including habits, nonexpected utility, and ambiguity aversion, behavioral biases, or rare disasters. If those ingredients are really there, they ought to matter for explaining the response to your shocks too. After all, there is only one economic structure, which is hit by many shocks. So your paper calls into question just how many of those ingredients are really there at all."

Thomas Philippon, whose previous paper had a pretty masterful collection of a lot of those ingredients, quickly pointed out my overstatement. One does not need every ingredient to understand every shock. Constraint variables are inequalities. A positive news shock may not cause credit constraints etc. to bind, while a negative shock may reveal them.

Good point. And really, the proof is in the pudding. If those ingredients are not necessary, then I should produce a model without them that produces events like 2008. But we've been debating the ingredients and shocks necessary to explain 1932 for 82 years, so that approach, though correct, might take a while.

In the meantime, we can still cheer successful simple models and well identified shocks on the few occasions that they appear and fit data so nicely. Note to graduate students: this paper is a really nice example to follow for its integration of clear theory and excellent empirical work.
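
For readers wondering why GDP and employment move in opposite directions after good news, the neoclassical wealth-effect logic can be stated in two lines (a stylized textbook sketch, not the authors' actual model):

```latex
\max_{\{c_t, n_t\}} \sum_{t=0}^{\infty} \beta^{t}\,[\,u(c_t) - v(n_t)\,]
\qquad \text{s.t.} \qquad
\sum_{t=0}^{\infty} q_t\, c_t = \sum_{t=0}^{\infty} q_t\,(w_t n_t + d_t)
```

where d_t is oil income. News that future d_t will be higher raises lifetime wealth, so consumption rises today; the intratemporal condition v'(n_t) = u'(c_t) w_t then implies hours fall at a given wage, because u'(c_t) has fallen. Consumption up, employment and non-oil output down: exactly the pattern that fails to look like a standard recession.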

Saturday, March 14, 2015

'John and Maynard’s Excellent Adventure'

Paul Krugman defends IS-LM analysis (I'd make one qualification: models are built to answer specific questions; we do not have one grand unifying model to use for all questions. IS-LM models were built to answer exactly the kinds of questions we encountered during the Great Recession, and the IS-LM model provided good answers, especially if one remembers where the model encounters difficulties. DSGE models were built to address other issues, and it's not surprising they didn't do very well when they were pushed to address questions they weren't designed to answer. The best model to use depends upon the question one is asking):

John and Maynard’s Excellent Adventure: When I tell people that macroeconomic analysis has been triumphantly successful in recent years, I tend to get strange looks. After all, wasn’t everyone predicting lots of inflation? Didn’t policymakers get it all wrong? Haven’t the academic economists been squabbling nonstop?
Well, as a card-carrying economist I disavow any responsibility for Rick Santelli and Larry Kudlow; I similarly declare that Paul Ryan and Olli Rehn aren’t my fault. As for the economists’ disputes, well, let me get to that in a bit.
I stand by my claim, however. The basic macroeconomic framework that we all should have turned to, the framework that is still there in most textbooks, performed spectacularly well: it made strong predictions that people who didn’t know that framework found completely implausible, and those predictions were vindicated. And the framework in question – basically John Hicks’s interpretation of John Maynard Keynes – was very much the natural way to think about the issues facing advanced countries after 2008. ...
I call this a huge success story – one of the best examples in the history of economics of getting things right in an unprecedented environment.
The sad thing, of course, is that this incredibly successful analysis didn’t have much favorable impact on actual policy. Mainly that’s because the Very Serious People are too serious to play around with little models; they prefer to rely on their sense of what markets demand, which they continue to consider infallible despite having been wrong about everything. But it also didn’t help that so many economists also rejected what should have been obvious.
Why? Many never learned simple macro models – if it doesn’t involve microfoundations and rational expectations, preferably with difficult math, it must be nonsense. (Curiously, economists in that camp have also proved extremely prone to basic errors of logic, probably because they have never learned to work through simple stories.) Others, for what looks like political reasons, seemed determined to come up with some reason, any reason, to be against expansionary monetary and fiscal policy.
But that’s their problem. From where I sit, the past six years have been hugely reassuring from an intellectual point of view. The basic model works; we really do know what we’re talking about.

[The original is quite a bit longer.]

Thursday, March 05, 2015

'Economists' Biggest Failure'

Noah Smith:

Economists' Biggest Failure: One of the biggest things that economists get grief about is their failure to predict big events like recessions. ... 
Pointing this out usually leads to the eternal (and eternally fun) debate over whether economics is a real science. The profession's detractors say that if you don’t make successful predictions, you aren’t a science. Economists will respond that seismologists can’t forecast earthquakes, and meteorologists can’t forecast hurricanes, and who cares what’s really a “science” anyway. 
The debate, however, misses the point. Forecasts aren’t the only kind of predictions a science can make. In fact, they’re not even the most important kind. 
Take physics for example. Sometimes physicists do make forecasts -- for example, eclipses. But those are the exception. Usually, when you make a new physics theory, you use it to predict some new phenomenon... For example, quantum mechanics has gained a lot of support from predicting strange new things like quantum tunneling or quantum teleportation.
Other times, a theory will predict things we have seen before, but will describe them in terms of other things that we thought were totally separate, unrelated phenomena. This is called unification, and it’s a key part of what philosophers think science does. For example, the theory of electromagnetism says that light, electric current, magnetism, and radio waves are all really the same phenomenon. Pretty neat! ...
So that’s physics. What about economics? Actually, econ has a number of these successes too. When Dan McFadden used his Random Utility Model to predict how many people would ride San Francisco's Bay Area Rapid Transit system,... he got it right. And he got many other things right with the same theory -- it wasn’t developed to explain only train ridership. 
Unfortunately, though, this kind of success isn't very highly regarded in the economics world... Maybe now, with the ascendance of empirical economics and a decline in theory, we’ll see a focus on producing fewer but better theories, more unification, and more attempts to make novel predictions. Someday, maybe macroeconomists will even be able to make forecasts! But let’s not get our hopes up.

I've addressed this question many times, e.g. in 2009, and to me the distinction is between forecasting the future, and understanding why certain phenomena occur (re-reading, it's a bit repetitive):

Are Macroeconomic Models Useful?: There has been no shortage of effort devoted to predicting earthquakes, yet we still can't see them coming far enough in advance to move people to safety. When a big earthquake hits, it is a surprise. We may be able to look at the data after the fact and see that certain stresses were building, so it looks like we should have known an earthquake was going to occur at any moment, but these sorts of retrospective analyses have not allowed us to predict the next one. The exact timing and location is always a surprise.
Does that mean that science has failed? Should we criticize the models as useless?
No. There are two uses of models. One is to understand how the world works, another is to make predictions about the future. We may never be able to predict earthquakes far enough in advance and with enough specificity to allow us time to move to safety before they occur, but that doesn't prevent us from understanding the science underlying earthquakes. Perhaps as our understanding increases prediction will be possible, and for that reason scientists shouldn't give up trying to improve their models, but for now we simply cannot predict the arrival of earthquakes.
However, even though earthquakes cannot be predicted, at least not yet, it would be wrong to conclude that science has nothing to offer. First, understanding how earthquakes occur can help us design buildings and make other changes to limit the damage even if we don't know exactly when an earthquake will occur. Second, if an earthquake happens and, despite our best efforts to insulate against it, there are still substantial consequences, science can help us to offset and limit the damage. To name just one example, the science surrounding disease transmission helps us to avoid contaminated water supplies after a disaster, something that often compounds tragedy when this science is not available. But there are lots of other things we can do as well, including using the models to determine where help is most needed.
So even if we cannot predict earthquakes, and we can't, the models are still useful for understanding how earthquakes happen. This understanding is valuable because it helps us to prepare for disasters in advance, and to determine policies that will minimize their impact after they happen.
All of this can be applied to macroeconomics. Whether or not we should have predicted the financial earthquake is a question that has been debated extensively, so I am going to set that aside. One side says financial market price changes, like earthquakes, are inherently unpredictable -- we will never predict them no matter how good our models get (the efficient markets types). The other side says the stresses that were building were obvious. Like the stresses that build when tectonic plates moving in opposite directions rub against each other, it was only a question of when, not if. (But even when increasing stress between two plates is observable, scientists cannot tell you for sure if a series of small earthquakes will relieve the stress and do little harm, or if there will be one big adjustment that relieves the stress all at once.) With respect to the financial crisis, economists expected lots of small, stress-relieving adjustments; instead we got the "big one," and the "buildings and other structures" we thought could withstand the shock all came crumbling down. On prediction in economics, perhaps someday improved models will allow us to do better than we have so far at predicting the exact timing of crises, and I think that earthquakes provide some guidance here. You have to ask first if stress is building in a particular sector, and then ask if action needs to be taken because the stress has reached dangerous levels, levels that might result in a big crash rather than a series of small stress relieving adjustments. I don't think our models are very good at detecting accumulating stress...
Whether the financial crisis should have been predicted or not, the fact that it wasn't predicted does not mean that macroeconomic models are useless any more than the failure to predict earthquakes implies that earthquake science is useless. As with earthquakes, even when prediction is not possible (or missed), the models can still help us to understand how these shocks occur. That understanding is useful for getting ready for the next shock, or even preventing it, and for minimizing the consequences of shocks that do occur. 
But we have done much better at dealing with the consequences of unexpected shocks ex post than we have at getting ready for them ex ante. Our equivalent of getting buildings ready for an earthquake before it happens is to use changes in institutions and regulations to insulate the financial sector and the larger economy from the negative consequences of financial and other shocks. Here I think economists made mistakes -- our "buildings" were not strong enough to withstand the earthquake that hit. We could argue that the shock was so big that no amount of reasonable advance preparation would have stopped the "building" from collapsing, but I think it's more the case that enough time has passed since the last big financial earthquake that we forgot what we needed to do. We allowed new buildings to be constructed without the proper safeguards.
However, that doesn't mean the models themselves were useless. The models were there and could have provided guidance, but the implied "building codes" were ignored. Greenspan and others assumed no private builder would ever construct a building that couldn't withstand an earthquake; the market would force them to take this into consideration. But they were wrong about that, and even Greenspan now admits that government building codes are necessary. It wasn't the models, it was how they were used (or rather not used) that prevented us from putting safeguards into place.
We haven't failed at this entirely, though. For example, we have had some success at putting safeguards into place before shocks occur: automatic stabilizers have done a lot to insulate against the negative consequences of the recession (though they could have been larger, to stop the building from swaying as much as it has). So it's not proper to say that our models have not helped us to prepare in advance at all; the insulation social insurance programs provide is extremely important to recognize. But it is the case that we could have and should have done better at preparing before the shock hit.
I'd argue that our most successful use of models has been in cleaning up after shocks rather than predicting, preventing, or insulating against them through pre-crisis preparation. When despite our best effort to prevent it or to minimize its impact a priori, we get a recession anyway, we can use our models as a guide to monetary, fiscal, and other policies that help to reduce the consequences of the shock (this is the equivalent of, after a disaster hits, making sure that the water is safe to drink, people have food to eat, there is a plan for rebuilding quickly and efficiently, etc.). As noted above, we haven't done a very good job at predicting big crises, and we could have done a much better job at implementing regulatory and institutional changes that prevent or limit the impact of shocks. But we do a pretty good job of stepping in with policy actions that minimize the impact of shocks after they occur. This recession was bad, but it wasn't another Great Depression like it might have been without policy intervention.
Whether or not we will ever be able to predict recessions reliably, it's important to recognize that our models still provide considerable guidance for actions we can take before and after large shocks that minimize their impact and maybe even prevent them altogether (though we will have to do a better job of listening to what the models have to say). Prediction is important, but it's not the only use of models.

Monday, January 26, 2015

'Does Monopoly Power Cause Inflation? (1968 and all that)'

Nick Rowe:

Does monopoly power cause inflation? (1968 and all that): Here's a question for you: Suppose there is a permanent increase in monopoly power across the economy (either firms having more monopoly power in output markets, or unions having more monopoly power in labour markets). Would that permanent increase in monopoly power cause a permanent increase in the inflation rate?
Most economists today would answer "no" to that question. It might cause a once-and-for-all rise in the price level, but it would not cause a permanent increase in the inflation rate. The question just sounds strange to modern economists' ears. They would much prefer to discuss whether a permanent increase in monopoly power caused a permanent reduction in real output and employment. What has monopoly power got to do with inflation?
To economists 40 or 50 years ago, the question would not have sounded strange at all. Many (maybe most?) economists would have answered "yes" to that question. ...
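
One way to see the modern answer (my notation, not Rowe's): with markup pricing over marginal cost,

```latex
P_t = (1+\mu_t)\,MC_t
\quad\Longrightarrow\quad
\pi_t \equiv \Delta \ln P_t = \Delta \ln(1+\mu_t) + \Delta \ln MC_t
```

a permanent rise in the markup \mu makes the first term positive only in the period it changes, so inflation blips once and then returns to the growth rate of marginal cost, which is ultimately pinned down by monetary policy. Monopoly power moves the price level, not the steady-state inflation rate.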

 

Saturday, January 10, 2015

'Orthodoxy, Heterodoxy, and Ideology'

Paul Krugman:

Orthodoxy, Heterodoxy, and Ideology: Many economists responded badly to the economic crisis. And there’s a lot wrong with mainstream economic analysis. But how closely are these two assertions related? Not as much as you might think. So I’m very much in accord with Simon Wren-Lewis on the remarkable unhelpfulness of recent heterodox assaults on the field. Not that there’s anything wrong with being heterodox in general; but a lot of what we’ve been seeing misidentifies the problem, and if anything gives aid and comfort to the wrong people.
The point is that standard macroeconomics does NOT justify the attacks on fiscal stimulus and the embrace of austerity. On these issues, people like Simon and myself have been following well-established models and analyses, while the austerians have been making up new stuff and/or rediscovering old fallacies to justify the policies they want. Formal modeling and quantitative analysis don't justify the austerian position; on the contrary, austerians had to throw out the models and abandon statistical principles to justify their claims.
Let’s look at several examples. ...

See also Chris Dillow: Heterodox economics & the left.

It's remarkable how many people rejected the conclusions of *modern* macroeconomic models (or invented nonsense) in order to oppose fiscal policy. It seemed to have more to do with ideology (the government can't possibly help no matter what the model says...) and identification (I'm a serious macroeconomist, don't lump me in with all those old fashioned Keynesian hippie types) than with standard macroeconomic analysis.

On this point, see Simon Wren-Lewis: Faith based macroeconomics.

Thursday, December 18, 2014

'What’s the Matter with Economics?': An Exchange

Arnold Packer and Jeff Madrick respond to Alan Blinder in the NYRB, and he replies:

‘What’s the Matter with Economics?’: An Exchange: In response to: What’s the Matter with Economics? from the December 18, 2014 issue ...
To the Editors:
Alan Blinder is one of the finest mainstream economists around. But to read his review of my book, you’d think that nothing was wrong with economics in recent decades except as it is practiced by a few right-wingers.
This is of course not the case. ...
Jeff Madrick
New York City
Alan S. Blinder replies:
According to both Jeff Madrick and Arnie Packer, I claim “that except for some right-wingers outside the ‘mainstream’…little is the matter” with economics. (These are Packer’s words; Madrick’s are similar.) But it’s not true. I think there is lots wrong with mainstream economics.
For starters, my review explicitly agreed with Madrick that (a) ideological predispositions infect economists’ conclusions far too much; (b) economics has drifted to the right (along with the American body politic); and (c) some economists got carried away by the allure of the efficient markets hypothesis. I also added a few indictments of my own: that we economists have failed to convey even the most basic economic principles to the public; and that some of our students turned Adam Smith’s invisible hand into Gordon Gekko’s “greed is good.” ...
Yet Madrick still insists that “economists rely on a fairly pure version of the invisible hand most of the time.” Not us mainstreamers. I’m a member of the tribe, I live among these people every day, and—trust me—we really don’t apply the “pure version” to the real world. For example, many of us see reasons for a minimum wage, mandatory Social Security, progressive taxation, carbon taxes, and a whole variety of financial regulations—to name just a few. ...

[Hard to summarize this one with a few excerpts -- I left a lot out...]

Sunday, December 14, 2014

Real Business Cycle Theory

Roger Farmer:

Real business cycle theory and the high school Olympics: I have lost count of the number of times I have heard students and faculty repeat the idea in seminars that "all models are wrong". This aphorism, attributed to George Box, is the battle cry of the Minnesota calibrator, a breed of macroeconomist, inspired by Ed Prescott, one of the most important and influential economists of the last century.
Of course all models are wrong. That is trivially true: it is the definition of a model. But the cry has been used for three decades to poke fun at attempts to use serious econometric methods to analyze time series data. Time series methods were inconvenient to the nascent Real Business Cycle Program that Ed pioneered because the models that he favored were, and still are, overwhelmingly rejected by the facts. That is inconvenient. Ed's response was pure genius. If the model and the data are in conflict, the data must be wrong. ...

After explaining, he concludes:

We don't have to play by Ed's rules. We can use the methods developed by Rob Engle and Clive Granger as I have done here. Once we allow aggregate demand to influence permanently the unemployment rate, the data do not look kindly on either real business cycle models or on the new-Keynesian approach. It's time to get serious about macroeconomic science...
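
The Engle-Granger machinery Farmer invokes is easy to demo. A minimal sketch with simulated data (mine, not Farmer's): two unit-root series that share a common stochastic trend are cointegrated, and the test rejects the null of no cointegration.

```python
import numpy as np
from statsmodels.tsa.stattools import coint

rng = np.random.default_rng(0)
T = 400

# A common stochastic trend (the shared random-walk component).
trend = np.cumsum(rng.normal(size=T))

# Two unit-root series sharing the trend, plus stationary noise.
x = trend + rng.normal(size=T)
y = 0.5 * trend + rng.normal(size=T)

# Engle-Granger two-step test; the null hypothesis is "no cointegration".
t_stat, p_value, _ = coint(y, x)
print(f"t-statistic = {t_stat:.2f}, p-value = {p_value:.3f}")
```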

Thursday, November 27, 2014

Mark Speaks

Simon Wren-Lewis:

As Mark Thoma often says, the problem is with macroeconomists rather than macroeconomics.

Much, much more here.

Saturday, November 15, 2014

'The Unwisdom of Crowding Out'

Here's Paul Krugman's response to the Vox EU piece by Peter Temin and David Vines that I posted yesterday:

The Unwisdom of Crowding Out (Wonkish): I am, to my own surprise, not too happy with the defense of Keynes by Peter Temin and David Vines in VoxEU. Peter and David are of course right that Keynes has a lot to teach us, and are also right that the anti-Keynesians aren’t just making really bad arguments; they’re making the very same really bad arguments Keynes refuted 80 years ago.
But the Temin-Vines piece seems to conflate several different bad arguments under the heading of “Ricardian equivalence”, and in so doing understates the badness.
The anti-Keynesian proposition is that government spending to boost a depressed economy will fail, because it will lead to an equal or greater fall in private spending — it will crowd out investment and maybe consumption, and therefore accomplish nothing except a shift in who spends. But why do the AKs claim this will happen? I actually see five arguments out there — two (including the actual Ricardian equivalence argument) completely and embarrassingly wrong on logical grounds, three more that aren’t logical nonsense but fly in the face of the evidence.
Here they are...[explains all five]...

He ends with:

My point is that you do a disservice to the debate by calling all of these things Ricardian equivalence; and the nature of that disservice is that you end up making the really, really bad arguments sound more respectable than they are. We do not want to lose sight of the fact that many influential people, including economists with impressive CVs, responded to macroeconomic crisis with crude logical fallacies that reflected not just sloppy thinking but ignorance of history.

Tuesday, October 28, 2014

Are Economists Ready for Income Redistribution?

I have a new column:

Are Economists Ready for Income Redistribution?: When the Great Recession hit and it became clear that monetary policy alone would not be enough to prevent a severe, prolonged downturn, fiscal policy measures – a combination of tax cuts and new spending – were used to try to limit the damage to the economy. Unfortunately, macroeconomic research on fiscal policy was all but absent from the macroeconomics literature and, for the most part, policymakers were operating in the dark, basing decisions on what they believed to be true rather than on solid theoretical and empirical evidence.
Fiscal policy will be needed again in the future, either in a severe downturn or perhaps to address the problem of growing inequality, and macroeconomists must do a better job of providing the advice that policymakers need to make informed fiscal policy decisions. ...

The question of redistribution is coming, and we need to be ready when it does.

Tuesday, October 14, 2014

'The Mythical Phillips Curve?'

An entry in the ongoing debate over the Phillips curve:

The mythical Phillips curve?, by Simon Wren-Lewis, mainly macro: Suppose you had just an hour to teach the basics of macroeconomics, what relationship would you be sure to include? My answer would be the Phillips curve. With the Phillips curve you can go a long way to understanding what monetary policy is all about.
My faith in the Phillips curve comes from simple but highly plausible ideas. In a boom, demand is strong relative to the economy’s capacity to produce, so prices and wages tend to rise faster than in an economic downturn. However workers do not normally suffer from money illusion: in a boom they want higher real wages to go with increasing labour supply. Equally firms are interested in profit margins, so if costs rise, so will prices. As firms do not change prices every day, they will think about future as well as current costs. That means that inflation depends on expected inflation as well as some indicator of excess demand, like unemployment.
Microfoundations confirm this logic, but add a crucial point that is not immediately obvious. Inflation today will depend on expectations about inflation in the future, not expectations about current inflation. That is the major contribution of New Keynesian theory to macroeconomics. ...[turns to evidence]...

Is it this data which makes me believe in the Phillips curve? To be honest, no. Instead it is the basic theory that I discussed at the beginning of this post. It may also be because I’m old enough to remember the 1970s when there were still economists around who denied that lower unemployment would lead to higher inflation, or who thought that the influence of expectations on inflation was weak, or who thought any relationship could be negated by direct controls on wages and prices, with disastrous results. But given how ‘noisy’ macro data normally is, I find the data I have shown here pretty consistent with my beliefs.

Monday, October 06, 2014

'Is Keynesian Economics Left Wing?'

A small part of a much longer argument/post by Simon Wren-Lewis:

More asymmetries: Is Keynesian economics left wing?: ...So why is there this desire to deny the importance of Keynesian theory coming from the political right? Perhaps it is precisely because monetary policy is necessary to ensure aggregate demand is neither excessive nor deficient. Monetary policy is state intervention: by setting a market price, an arm of the state ensures the macroeconomy works. When this particular procedure fails to work, in a liquidity trap for example, state intervention of another kind is required (fiscal policy). While these statements are self-evident to many mainstream economists, to someone of a neoliberal or ordoliberal persuasion they are discomforting. At the macroeconomic level, things only work well because of state intervention. This was so discomforting that New Classical economists attempted to create an alternative theory of business cycles where booms and recessions were nothing to be concerned about, but just the optimal response of agents to exogenous shocks.
So my argument is that Keynesian theory is not left wing, because it is not about market failure - it is just about how the macroeconomy works. On the other hand anti-Keynesian views are often politically motivated, because the pivotal role the state plays in managing the macroeconomy does not fit the ideology. ...

Friday, September 26, 2014

'The New Classical Clique'

Paul Krugman continues the conversation on New Classical economics:

The New Classical Clique: Simon Wren-Lewis thinks some more about macroeconomics gone astray; Robert J. Waldmann weighs in. For those new to this conversation, the question is why starting in the 1970s much of academic macroeconomics was taken over by a school of thought that began by denying any useful role for policies to raise demand in a slump, and eventually coalesced around denial that the demand side of the economy has any role in causing slumps.
I was a grad student and then an assistant professor as this was happening, albeit doing international economics – and international macro went in a different direction, for reasons I’ll get to in a bit. So I have some sense of what was really going on. And while both Wren-Lewis and Waldmann hit on most of the main points, neither I think gets at the important role of personal self-interest. New classical macro was and still is many things – an ideological bludgeon against liberals, a showcase for fancy math, a haven for people who want some kind of intellectual purity in a messy world. But it’s also a self-promoting clique. ...

Wednesday, September 24, 2014

Where and When Macroeconomics Went Wrong

Simon Wren-Lewis:

Where macroeconomics went wrong: In my view, the answer is in the 1970/80s with the New Classical revolution (NCR). However I also think the new ideas that came with that revolution were progressive. I have defended rational expectations, I think intertemporal theory is the right place to start in thinking about consumption, and exploring the implications of time inconsistency is very important to macro policy, as well as many other areas of economics. I also think, along with nearly all macroeconomists, that the microfoundations approach to macro (DSGE models) is a progressive research strategy.
That is why discussion about these issues can become so confused. New Classical economics made academic macroeconomics take a number of big steps forward, but a couple of big steps backward at the same time. The clue to the backward steps comes from the name NCR. The research program was anti-Keynesian (hence New Classical), and it did not want microfounded macro to be an alternative to the then dominant existing methodology, it wanted to replace it (hence revolution). Because the revolution succeeded (although the victory over Keynesian ideas was temporary), generations of students were taught that Keynesian economics was out of date. They were not taught about the pros and cons of the old and new methodologies, but were taught that the old methodology was simply wrong. And that teaching was/is a problem because it itself is wrong. ...

Tuesday, September 16, 2014

Rethinking New Economic Thinking

I have a new column:

Rethinking New Economic Thinking: Efforts such as Rethinking Economics and The Institute for New Economic Thinking are noteworthy attempts to, as INET says, “broaden and accelerate the development of new economic thinking that can lead to solutions for the great challenges of the 21st century. The havoc wrought by our recent global financial crisis has vividly demonstrated the deficiencies in our outdated current economic theories, and shown the need for new economic thinking – right now.”
It is certainly true that mainstream, modern macroeconomic models failed us prior to and during the Great Recession. The models failed to give any warning at all about the crisis that was about to hit – if anything those using modern macro models resisted the idea that a bubble was inflating in housing markets – and the models failed to give us the guidance we needed to implement effective monetary and fiscal policy responses to our economic problems. 
But amid the calls for change in macroeconomics there is far too much attention on the tools and techniques that macroeconomists use to answer questions, and far too little attention on what really matters... ...[continue reading]...

'Making the Case for Keynes'

Peter Temin and David Vines have a new book:

Making the case for Keynes, by Peter Dizikes, MIT News Office: In 1919, when the victors of World War I were concluding their settlement against Germany — in the form of the Treaty of Versailles — one of the leading British representatives at the negotiations angrily resigned his position, believing the debt imposed on the losers would be too harsh. The official, John Maynard Keynes, argued that because Britain had benefitted from export-driven growth, forcing the Germans to spend their money paying back debt rather than buying British products would be counterproductive for everyone, and slow global growth.
Keynes’ argument, outlined in his popular 1919 book, “The Economic Consequences of the Peace,” proved prescient. But Keynes is not primarily regarded as a theorist of international economics: His most influential work, “The General Theory of Employment, Interest, and Money,” published in 1936, uses the framework of a single country with a closed economy. From that model, Keynes arrived at his famous conclusion that government spending can reduce unemployment by boosting aggregate demand.
But in reality, says Peter Temin, an MIT economic historian, Keynes’ conclusions about demand and employment were long intertwined with his examination of international trade; Keynes was thinking globally, even when modeling locally.
“Keynes was interested in the world economy, not just in a single national economy,” Temin says. Now he is co-author of a new book on the subject, “Keynes: Useful Economics for the World Economy,” written with David Vines, a professor of economics at Oxford University, published this month by MIT Press.
In their book, Temin and Vines make the case that Keynesian deficit spending by governments is necessary to reignite the levels of growth that Europe and the world had come to expect prior to the economic downturn of 2008. But in a historical reversal, they believe that today’s Germany is being unduly harsh toward the debtor states of Europe, forcing other countries to pay off debts made worse by the 2008 crash — and, in turn, preventing them from spending productively, slowing growth and inhibiting a larger continental recovery.
“If you have secular [long-term] stagnation, what you need is expansionary fiscal policy,” says Temin, who is the Elisha Gray II Professor Emeritus of Economics at MIT.
Additional government spending is distinctly not the approach that Europe (and, to a lesser extent, the U.S.) has pursued over the last six years, as political leaders have imposed a wide range of spending cuts — the pursuit of “austerity” as a response to hard times. But Temin thinks it is time for the terms of the spending debate to shift.  
“The hope David and I have is that our simple little book might change people’s minds,” Temin says.
“Sticky” wages were the sticking point
In an effort to do so, the authors outline an intellectual trajectory for Keynes in which he was highly concerned with international, trade-based growth from the early stages of his career until his death in 1946, and in which the single-country policy framework of his “General Theory” was a necessary simplification that actually fits neatly with this global vision.
As Temin and Vines see it, Keynes, from early in his career, and certainly by 1919, had developed an explanation of growth in which technical progress leads to greater productive capacity. This leads businesses in advanced countries to search for international markets in which to sell products; encourages foreign lending of capital; and, eventually, produces greater growth by other countries as well.
“Clearly, Keynes knew that domestic prosperity was critically determined by external conditions,” Temin and Vines write.
Yet as they see it, Keynes had to overcome a crucial sticking point in his thought: As late as 1930, when Keynes served on a major British commission investigating the economy, he was still using an older, neoclassical idea in which all markets reached a sort of equilibrium. 
This notion implies that when jobs were relatively scarce, wages would decline to the point where more people would be employed. Yet this doesn’t quite seem to happen: As economists now recognize, and as Keynes came to realize, wages could be “sticky,” and remain at set levels, for various psychological or political reasons. In order to arrive at the conclusions of the “General Theory,” then, Keynes had to drop the assumption that wages would fluctuate greatly.
“The issue for Keynes was that he knew that if prices were flexible, then if all prices [including wages] could change, then you eventually get back to full employment,” Temin says. “So in order to avoid that, he assumed away all price changes.”
But if wages will not drop, how can we increase employment? For Keynes, the answer was that the whole economy had to grow: There needed to be an increase in aggregate demand, one of the famous conclusions of the “General Theory.” And if private employers cannot or will not spend more money on workers, Keynes thought, then the government should step in and spend.
“Keynes is very common-sense,” Temin says, in “that if you put people to work building roads and bridges, then those people spend money, and that promotes aggregate demand.”
Today, opponents of Keynes argue that such public spending will offset private-sector spending without changing overall demand. But Temin contends that private-sector spending “won’t be offset if those people were going to be unemployed, and would not be spending anything.” Given jobs, he notes, “They would spend money, because now they would have money.”
Keynes’ interest in international trade and international economics never vanished, as Temin and Vines see it. Indeed, in the late stages of World War II, Keynes was busy working out proposals that could spur postwar growth within this same intellectual framework — and the International Monetary Fund is one outgrowth of this effort.
History repeating?
“Keynes: Useful Economics for the World Economy” has received advance praise from some prominent scholars. ... Nonetheless, Temin is guarded about the prospect of changing the contemporary austerity paradigm.
“I can’t predict what policy is going to do in the next couple of years,” Temin says. And in the meantime, he thinks, history may be repeating itself, as debtor countries are unable to make capital investments while paying off debt.
Germany has “decided that they are not willing to take any of the capital loss that happened during the crisis,” Temin adds. “The [other] European countries don’t have the resources to pay off these bonds. They’ve had to cut their spending to get the resources to pay off the bonds. If you read the press, you know this hasn’t been working very well.”

Thursday, September 11, 2014

'Trapped in the ''Dark Corners'''?

A small part of Brad DeLong's response to Olivier Blanchard. I posted a shortened version of Blanchard's argument a week or two ago:

Where Danger Lurks: Until the 2008 global financial crisis, mainstream U.S. macroeconomics had taken an increasingly benign view of economic fluctuations in output and employment. The crisis has made it clear that this view was wrong and that there is a need for a deep reassessment. ...
That small shocks could sometimes have large effects and, as a result, that things could turn really bad, was not completely ignored by economists. But such an outcome was thought to be a thing of the past that would not happen again, or at least not in advanced economies thanks to their sound economic policies. ... We all knew that there were “dark corners”—situations in which the economy could badly malfunction. But we thought we were far away from those corners, and could for the most part ignore them. ...
The main lesson of the crisis is that we were much closer to those dark corners than we thought—and the corners were even darker than we had thought too. ...
How should we modify our benchmark models—the so-called dynamic stochastic general equilibrium (DSGE) models...? The easy and uncontroversial part of the answer is that the DSGE models should be expanded to better recognize the role of the financial system—and this is happening. But should these models be able to describe how the economy behaves in the dark corners?
Let me offer a pragmatic answer. If macroeconomic policy and financial regulation are set in such a way as to maintain a healthy distance from dark corners, then our models that portray normal times may still be largely appropriate. Another class of economic models, aimed at measuring systemic risk, can be used to give warning signals that we are getting too close to dark corners, and that steps must be taken to reduce risk and increase distance. Trying to create a model that integrates normal times and systemic risks may be beyond the profession’s conceptual and technical reach at this stage.
The crisis has been immensely painful. But one of its silver linings has been to jolt macroeconomics and macroeconomic policy. The main policy lesson is a simple one: Stay away from dark corners.

And I responded:

That may be the best we can do for now (have separate models for normal times and "dark corners"), but an integrated model would be preferable. An integrated model would, for example, be better for conducting "policy and financial regulation ... to maintain a healthy distance from dark corners," and our aspirations ought to include models that can explain both normal and abnormal times. That may mean moving beyond the DSGE class of models, or perhaps the technical reach of DSGE models can be extended to incorporate the kinds of problems that can lead to Great Recessions, but we shouldn't be satisfied with models of normal times that cannot explain and anticipate major economic problems.

Here's part of Brad's response:

But… but… but… Macroeconomic policy and financial regulation are not set in such a way as to maintain a healthy distance from dark corners. We are still in a dark corner now. There is no sign of the 4% per year inflation target, the commitments to do what it takes via quantitative easing and rate guidance to attain it, or a fiscal policy that recognizes how the rules of the game are different for reserve currency printing sovereigns when r < n+g. Thus not only are we still in a dark corner, but there is every reason to believe that, should we get out, the sub-2% per year effective inflation targets of North Atlantic central banks and the inappropriate rhetoric and groupthink surrounding fiscal policy make it highly likely that we will soon get back into yet another dark corner. Blanchard’s pragmatic answer is thus the most unpragmatic thing imaginable: the “if” test fails, and so the “then” part of the argument seems to me to be simply inoperative. Perhaps on another planet in which North Atlantic central banks and governments aggressively pursued 6% per year nominal GDP growth targets Blanchard’s answer would be “pragmatic”. But we are not on that planet, are we?

Moreover, even were we on Planet Pragmatic, it still seems to be wrong. Using current or any visible future DSGE models for forecasting and mainstream scenario planning makes no sense: the DSGE framework imposes restrictions on the allowable emergent properties of the aggregate time series that are routinely rejected at whatever level of frequentist statistical confidence that one cares to specify. The right road is that of Christopher Sims: that of forecasting and scenario planning using relatively unstructured time-series methods that use rather than ignore the correlations in the recent historical data. And for policy evaluation? One should take the historical correlations and argue why reverse-causation and errors-in-variables lead them to underestimate or overestimate policy effects, and possibly get it right. One should not impose a structural DSGE model that identifies the effects of policies but certainly gets it wrong. Sims won that argument. Why do so few people recognize his victory?

Blanchard continues:

Another class of economic models, aimed at measuring systemic risk, can be used to give warning signals that we are getting too close to dark corners, and that steps must be taken to reduce risk and increase distance. Trying to create a model that integrates normal times and systemic risks may be beyond the profession’s conceptual and technical reach at this stage…

For the second task, the question is: whose models of tail risk based on what traditions get to count in the tail risks discussion?

And missing is the third task: understanding what Paul Krugman calls the “Dark Age of macroeconomics”, that jahiliyyah that descended on so much of the economic research, economic policy analysis, and economic policymaking communities starting in the fall of 2007, and in which the center of gravity of our economic policymakers still dwells.
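DeLong's plea for "relatively unstructured time-series methods that use rather than ignore the correlations in the recent historical data" is concrete enough to sketch. Here is a minimal Sims-style VAR forecast in Python's statsmodels; the variable names and placeholder data are illustrative choices of mine, not DeLong's:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

# Placeholder data: in practice these columns would hold stationary transforms
# of actual quarterly series (e.g., GDP growth, inflation, a short rate).
rng = np.random.default_rng(1)
df = pd.DataFrame(rng.normal(size=(200, 3)),
                  columns=["gdp_growth", "inflation", "short_rate"])

model = VAR(df)
results = model.fit(maxlags=8, ic="aic")  # let an information criterion pick lags

# Forecast the next eight quarters from the most recent observed lags:
steps = 8
forecast = results.forecast(df.values[-results.k_ar:], steps=steps)
print(pd.DataFrame(forecast, columns=df.columns))
```

No cross-equation restrictions are imposed beyond lag length; the forecast is driven entirely by the estimated correlations in the data, which is the point of the approach.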

Sunday, August 31, 2014

'Where Danger Lurks'

Olivier Blanchard (a much shortened version of his arguments, the entire piece is worth reading):

Where Danger Lurks: Until the 2008 global financial crisis, mainstream U.S. macroeconomics had taken an increasingly benign view of economic fluctuations in output and employment. The crisis has made it clear that this view was wrong and that there is a need for a deep reassessment. ...
That small shocks could sometimes have large effects and, as a result, that things could turn really bad, was not completely ignored by economists. But such an outcome was thought to be a thing of the past that would not happen again, or at least not in advanced economies thanks to their sound economic policies. ... We all knew that there were “dark corners”—situations in which the economy could badly malfunction. But we thought we were far away from those corners, and could for the most part ignore them. ...
The main lesson of the crisis is that we were much closer to those dark corners than we thought—and the corners were even darker than we had thought too. ...
How should we modify our benchmark models—the so-called dynamic stochastic general equilibrium (DSGE) models...? The easy and uncontroversial part of the answer is that the DSGE models should be expanded to better recognize the role of the financial system—and this is happening. But should these models be able to describe how the economy behaves in the dark corners?
Let me offer a pragmatic answer. If macroeconomic policy and financial regulation are set in such a way as to maintain a healthy distance from dark corners, then our models that portray normal times may still be largely appropriate. Another class of economic models, aimed at measuring systemic risk, can be used to give warning signals that we are getting too close to dark corners, and that steps must be taken to reduce risk and increase distance. Trying to create a model that integrates normal times and systemic risks may be beyond the profession’s conceptual and technical reach at this stage.
The crisis has been immensely painful. But one of its silver linings has been to jolt macroeconomics and macroeconomic policy. The main policy lesson is a simple one: Stay away from dark corners.

That may be the best we can do for now (have separate models for normal times and "dark corners"), but an integrated model would be preferable. An integrated model would, for example, be better for conducting "policy and financial regulation ... to maintain a healthy distance from dark corners," and our aspirations ought to include models that can explain both normal and abnormal times. That may mean moving beyond the DSGE class of models, or perhaps the technical reach of DSGE models can be extended to incorporate the kinds of problems that can lead to Great Recessions, but we shouldn't be satisfied with models of normal times that cannot explain and anticipate major economic problems.

Tuesday, August 19, 2014

The Agent-Based Method

Rajiv Sethi:

The Agent-Based Method: It's nice to see some attention being paid to agent-based computational models on economics blogs, but Chris House has managed to misrepresent the methodology so completely that his post is likely to do more harm than good. 

In comparing the agent-based method to the more standard dynamic stochastic general equilibrium (DSGE) approach, House begins as follows:

Probably the most important distinguishing feature is that, in an ABM, the interactions are governed by rules of behavior that the modeler simply encodes directly into the system individuals who populate the environment.

So far so good, although I would not have used the qualifier "simply", since encoded rules can be highly complex. For instance, an ABM that seeks to describe the trading process in an asset market may have multiple participant types (liquidity, information, and high-frequency traders for instance) and some of these may be using extremely sophisticated strategies.

How does this approach compare with DSGE models? House argues that the key difference lies in assumptions about rationality and self-interest:

People who write down DSGE models don’t do that. Instead, they make assumptions on what people want. They also place assumptions on the constraints people face. Based on the combination of goals and constraints, the behavior is derived. The reason that economists set up their theories this way – by making assumptions about goals and then drawing conclusions about behavior – is that they are following in the central tradition of all of economics, namely that allocations and decisions and choices are guided by self-interest. This goes all the way back to Adam Smith and it’s the organizing philosophy of all economics. Decisions and actions in such an environment are all made with an eye towards achieving some goal or some objective. For consumers this is typically utility maximization – a purely subjective assessment of well-being.  For firms, the objective is typically profit maximization. This is exactly where rationality enters into economics. Rationality means that the “agents” that inhabit an economic system make choices based on their own preferences.

This, to say the least, is grossly misleading. The rules encoded in an ABM could easily specify what individuals want and then proceed from there. For instance, we could start from the premise that our high-frequency traders want to maximize profits. They can only do this by submitting orders of various types, the consequences of which will depend on the orders placed by others. Each agent can have a highly sophisticated strategy that maps historical data, including the current order book, into new orders. The strategy can be sensitive to beliefs about the stream of income that will be derived from ownership of the asset over a given horizon, and may also be sensitive to beliefs about the strategies in use by others. Agents can be as sophisticated and forward-looking in their pursuit of self-interest in an ABM as you care to make them; they can even be set up to make choices based on solutions to dynamic programming problems, provided that these are based on private beliefs about the future that change endogenously over time. 

What you cannot have in an ABM is the assumption that, from the outset, individual plans are mutually consistent. That is, you cannot simply assume that the economy is tracing out an equilibrium path. The agent-based approach is at heart a model of disequilibrium dynamics, in which the mutual consistency of plans, if it arises at all, has to do so endogenously through a clearly specified adjustment process. This is the key difference between the ABM and DSGE approaches, and it's right there in the acronym of the latter.

A typical (though not universal) feature of agent-based models is an evolutionary process that allows successful strategies to proliferate over time at the expense of less successful ones. Since success itself is frequency-dependent---the payoffs to a strategy depend on the prevailing distribution of strategies in the population---we have strong feedback between behavior and environment. Returning to the example of trading, an arbitrage-based strategy may be highly profitable when rare but much less so when prevalent. This rich feedback between environment and behavior, with the distribution of strategies determining the environment faced by each, and the payoffs to each strategy determining changes in their composition, is a fundamental feature of agent-based models. In failing to understand this, House makes claims that are close to being the opposite of the truth:

Ironically, eliminating rational behavior also eliminates an important source of feedback – namely the feedback from the environment to behavior.  This type of two-way feedback is prevalent in economics and it’s why equilibria of economic models are often the solutions to fixed-point mappings. Agents make choices based on the features of the economy.  The features of the economy in turn depend on the choices of the agents. This gives us a circularity which needs to be resolved in standard models. This circularity is cut in the ABMs however since the choice functions do not depend on the environment. This is somewhat ironic since many of the critics of economics stress such feedback loops as important mechanisms.

It is absolutely true that dynamics in agent-based models do not require the computation of fixed points, but this is a strength rather than a weakness, and has nothing to do with the absence of feedback effects. These effects arise dynamically in calendar time, not through some mystical process by which coordination is instantaneously achieved and continuously maintained. 
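[To make the frequency-dependent feedback Sethi describes concrete, here is a toy replicator-style sketch of my own; the linear payoff for the arbitrage strategy is an assumption for illustration, not taken from his work:]

```python
def payoffs(share_arb):
    """Per-period payoffs as a function of the arbitrageurs' population share."""
    arb = 2.0 * (1.0 - share_arb)  # arbitrage profits competed away as share rises
    fundamental = 1.0              # baseline strategy: frequency-independent payoff
    return arb, fundamental

share = 0.05  # the arbitrage strategy starts out rare (and highly profitable)
for t in range(50):
    pi_arb, pi_fund = payoffs(share)
    average = share * pi_arb + (1 - share) * pi_fund
    share *= pi_arb / average  # replicator rule: share grows with relative payoff

print(f"long-run arbitrageur share: {share:.3f}")  # settles where pi_arb = pi_fund
```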

It's worth thinking about how the learning literature in macroeconomics, dating back to Marcet and Sargent and substantially advanced by Evans and Honkapohja, fits into this schema. Such learning models drop the assumption that beliefs continuously satisfy mutual consistency, and therefore take a small step towards the ABM approach. But it really is a small step, since a great deal of coordination continues to be assumed. For instance, in the canonical learning model, there is a parameter about which learning occurs, and the system is self-referential in that beliefs about the parameter determine its realized value. This allows for the possibility that individuals may hold incorrect beliefs, but limits quite severely---and more importantly, exogenously---the structure of such errors. This is done for understandable reasons of tractability, and allows for analytical solutions and convergence results to be obtained. But there is way too much coordination in beliefs across individuals assumed for this to be considered part of the ABM family.
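[A minimal numerical version of the canonical self-referential learning setup just described, with parameter values that are mine for illustration: agents estimate one parameter, the mean of x, and their belief feeds back into the value x actually takes.]

```python
import numpy as np

rng = np.random.default_rng(42)
alpha, beta = 2.0, 0.5  # actual law of motion: x_t = alpha + beta * belief + noise
belief = 0.0            # agents' initial estimate of the mean of x

for t in range(1, 5001):
    x = alpha + beta * belief + rng.normal(scale=0.1)
    belief += (x - belief) / t  # recursive least squares with decreasing gain

# With beta < 1 the dynamics are E-stable: beliefs converge to the
# rational-expectations fixed point alpha / (1 - beta) = 4.0.
print(f"learned mean: {belief:.3f}")
```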

The title of House's post asks (in response to an earlier piece by Mark Buchanan) whether agent-based models really are the future of the discipline. I have argued previously that they are enormously promising, but face one major methodological obstacle that needs to be overcome. This is the problem of quality control: unlike papers in empirical fields (where causal identification is paramount) or in theory (where robustness is key) there is no set of criteria, widely agreed upon, that can allow a referee to determine whether a given set of simulation results provides a deep and generalizable insight into the workings of the economy. One of the most celebrated agent-based models in economics---the Schelling segregation model---is also among the very earliest. Effective and acclaimed recent exemplars are in short supply, though there is certainly research effort at the highest levels pointed in this direction. The claim that such models can displace the equilibrium approach entirely is much too grandiose, but they should be able to find ample space alongside more orthodox approaches in time. 

---

The example of interacting trading strategies in this post wasn't pulled out of thin air; market ecology has been a recurrent theme on this blog. In ongoing work with Yeon-Koo Che and Jinwoo Kim, I am exploring the interaction of trading strategies in asset markets, with the goal of addressing some questions about the impact on volatility and welfare of high-frequency trading. We have found the agent-based approach very useful in thinking about these questions, and I'll present some preliminary results at a session on the methodology at the Rethinking Economics conference in New York next month. The event is free and open to the public but seating is limited and registration required. 

Wednesday, August 13, 2014

'Unemployment Fluctuations are Mainly Driven by Aggregate Demand Shocks'

Do the facts have a Keynesian bias?:

Using product- and labour-market tightness to understand unemployment, by Pascal Michaillat and Emmanuel Saez, Vox EU: For the five years from December 2008 to November 2013, the US unemployment rate remained above 7%, peaking at 10% in October 2009. This period of high unemployment is not well understood. Macroeconomists have proposed a number of explanations for the extent and persistence of unemployment during the period, including:

  • High mismatch caused by major shocks to the financial and housing sectors,
  • Low search effort from unemployed workers triggered by long extensions of unemployment insurance benefits, and
  • Low aggregate demand caused by a sudden need to repay debts or pessimism.

But no consensus has been reached.

In our opinion this lack of consensus is due to a gap in macroeconomic theory: we do not have a model that is rich enough to account for the many factors driving unemployment – including aggregate demand – and simple enough to lend itself to pencil-and-paper analysis. ...

In Michaillat and Saez (2014), we develop a new model to inspect the mechanisms behind unemployment fluctuations. The model can be seen as an equilibrium version of the Barro-Grossman model. It retains the architecture of the Barro-Grossman model but replaces the disequilibrium framework on the product and labour markets with an equilibrium matching framework. ...

Through the lens of our simple model, the empirical evidence suggests that price and real wage are somewhat rigid, and that unemployment fluctuations are mainly driven by aggregate demand shocks.

Tuesday, August 12, 2014

Why Do Macroeconomists Disagree?

I have a new column:

Why Do Macroeconomists Disagree?, by Mark Thoma, The Fiscal Times: On August 9, 2007, the French bank BNP Paribas halted redemptions from three investment funds active in US mortgage markets due to severe liquidity problems, an event that many mark as the beginning of the financial crisis. Now, just over seven years later, economists still can’t agree on what caused the crisis, why it was so severe, and why the recovery has been so slow. We can’t even agree on the extent to which modern macroeconomic models failed, or if they failed at all.
The lack of a consensus within the profession on the economics of the Great Recession, one of the most significant economic events in recent memory, provides a window into the state of macroeconomics as a science. ...

Monday, August 11, 2014

'On Macroeconomic Forecasting'

Simon Wren-Lewis:

...The rather boring truth is that it is entirely predictable that forecasters will miss major recessions, just as it is equally predictable that each time this happens we get hundreds of articles written asking what has gone wrong with macro forecasting. The answer is always the same - nothing. Macroeconomic model based forecasts are always bad, but probably no worse than intelligent guesses.

More here.

'Inflation in the Great Recession and New Keynesian Models'

From the NY Fed's Liberty Street Economics:

Inflation in the Great Recession and New Keynesian Models, by Marco Del Negro, Marc Giannoni, Raiden Hasegawa, and Frank Schorfheide: Since the financial crisis of 2007-08 and the Great Recession, many commentators have been baffled by the “missing deflation” in the face of a large and persistent amount of slack in the economy. Some prominent academics have argued that existing models cannot properly account for the evolution of inflation during and following the crisis. For example, in his American Economic Association presidential address, Robert E. Hall called for a fundamental reconsideration of Phillips curve models and their modern incarnation—so-called dynamic stochastic general equilibrium (DSGE) models—in which inflation depends on a measure of slack in economic activity. The argument is that such theories should have predicted more and more disinflation as long as the unemployment rate remained above a natural rate of, say, 6 percent. Since inflation declined somewhat in 2009, and then remained positive, Hall concludes that such theories based on a concept of slack must be wrong.        
In an NBER working paper and a New York Fed staff report (forthcoming in the American Economic Journal: Macroeconomics), we use a standard New Keynesian DSGE model with financial frictions to explain the behavior of output and inflation since the crisis. This model was estimated using data up to 2008. We find that following the increase in financial stress in 2008, the model successfully predicts not only the sharp contraction in economic activity, but also only a modest decline in inflation. ...

Thursday, July 31, 2014

'What Are Academics Good For?'

Simon Wren-Lewis:

What are academics good for?: A survey of US academic economists, which found that 36 thought the Obama fiscal stimulus reduced unemployment and only one thought otherwise, led to this cri de coeur from Paul Krugman. What is the point in having academic research if it is ignored, he asked? At the same time I was involved in a conversation on twitter, where the person I was tweeting with asked ... why should we take any more notice of what academic economists say about economics than, well, City economists or economic journalists?
Here is a very good example of why. ...

Sunday, July 27, 2014

'Monetarist, Keynesian, and Minskyite Depressions Once Again'

Brad DeLong:

I have said this before. But I seem to need to say it again…
The very intelligent and thoughtful David Beckworth, Simon Wren-Lewis, and Nick Rowe are agreeing on New Keynesian-Market Monetarist monetary-fiscal convergence. Underpinning all of their analyses there seems to me to be the assumption that all aggregate demand shortfalls spring from the same deep market failures. And I think that that is wrong. ...[continue]...

Wednesday, July 23, 2014

'Wall Street Skips Economics Class'

The discussion continues:

Wall Street Skips Economics Class, by Noah Smith: If you care at all about what academic macroeconomists are cooking up (or if you do any macro investing), you might want to check out the latest economics blog discussion about the big change that happened in the late '70s and early '80s. Here’s a post by the University of Chicago economist John Cochrane, and here’s one by Oxford’s Simon Wren-Lewis that includes links to most of the other contributions.
In case you don’t know the background, here’s the short version...

Friday, July 18, 2014

'Further Thoughts on Phillips Curves'

Simon Wren-Lewis:

Further thoughts on Phillips curves: In a post from a few days ago I looked at some recent evidence on Phillips curves, treating the Great Recession as a test case. I cast the discussion as a debate between rational and adaptive expectations. Neither is likely to be 100% right of course, but I suggested the evidence implied rational expectations were more right than adaptive. In this post I want to relate this to some other people’s work and discussion. (See also this post from Mark Thoma.) ...
The first issue is why look at just half a dozen years, in only a few countries. As I noted in the original post, when looking at CPI inflation there are many short-term factors that may mislead. Another reason for excluding European countries, which I did not mention, is the impact of austerity-driven higher VAT rates (and other similar taxes or administered prices), nicely documented by Klitgaard and Peck. Surely all this ‘noise’ is an excellent reason to look over a much longer time horizon?
One answer is given in this recent JEL paper by Mavroeidis, Plagborg-Møller and Stock. As Plagborg-Møller notes in an email to Mark Thoma: “Our meta-analysis finds that essentially any desired parameter estimates can be generated by some reasonable-sounding specification. That is, estimation of the NKPC is subject to enormous specification uncertainty. This is consistent with the range of estimates reported in the literature… traditional aggregate time series analysis is just not very informative about the nature of inflation dynamics.” This had been my reading based on work I’d seen.
This is often going to be the case with time series econometrics, particularly when key variables appear in the form of expectations. Faced with this, what economists often look for is some decisive and hopefully large event, where all the issues involving specification uncertainty can be sidelined or become second order. The Great Recession, for countries that did not suffer a second recession, might be just such an event. In earlier, milder recessions it was also much less clear what the monetary authority’s inflation target was (if it had one at all), and how credible it was. ...

I certainly agree with the claim that a "decisive and hopefully large event" is needed to empirically test econometric models, since I've made the same point many times in the past. For example, "...the ability to choose one model over the other is not quite as hopeless as I’ve implied. New data and recent events like the Great Recession push these models into uncharted territory and provide a way to assess which model provides better predictions. However, because of our reliance on historical data this is a slow process – we have to wait for data to accumulate – and there’s no guarantee that once we are finally able to pit one model against the other we will be able to crown a winner. Both models could fail..."

Anyway...he goes on to discuss "How does what I did relate to recent discussions by Paul Krugman?," and concludes with:

My interpretation suggests that the New Keynesian Phillips curve is a more sensible place to start from than the adaptive expectations Friedman/Phelps version. As this is the view implicitly taken by most mainstream academic macroeconomics, but using a methodology that does not ensure congruence with the data, I think it is useful to point out when the mainstream does have empirical support. ...

Monday, July 14, 2014

Is There a Phillips Curve? If So, Which One?

One place that Paul Krugman and Chris House disagree is on the Phillips curve. Krugman (responding to a post by House) says:

New Keynesians do stuff like one-period-ahead price setting or Calvo pricing, in which prices are revised randomly. Practicing Keynesians have tended to rely on “accelerationist” Phillips curves in which unemployment determined the rate of change rather than the level of inflation.
So what has happened since 2008 is that both of these approaches have been found wanting: inflation has dropped, but stayed positive despite high unemployment. What the data actually look like is an old-fashioned non-expectations Phillips curve. And there are a couple of popular stories about why: downward wage rigidity even in the long run, anchored expectations.

House responds:

What the data actually look like is an old-fashioned non-expectations Phillips curve. 
OK, here is where we disagree. Certainly this is not true for the data overall. It seems like Paul is thinking that the system governing the relationship between inflation and output changes between something with essentially a vertical slope (a “Classical Phillips curve”) and a nearly flat slope (a “Keynesian Phillips Curve”). I doubt that this will fit the data particularly well and it would still seem to open the door to a large role for “supply shocks” – shocks that neither Paul nor I think play a big role in business cycles.

Simon Wren-Lewis also has something to say about this in his post from earlier today, Has the Great Recession killed the traditional Phillips Curve?:

Before the New Classical revolution there was the Friedman/Phelps Phillips Curve (FPPC), which said that current inflation depended on some measure of the output/unemployment gap and the expected value of current inflation (with a unit coefficient). Expectations of inflation were modelled as some function of past inflation (e.g. adaptive expectations) - at its simplest just one lag in inflation. Therefore in practice inflation depended on lagged inflation and the output gap.
After the New Classical revolution came the New Keynesian Phillips Curve (NKPC), which had current inflation depending on some measure of the output/unemployment gap and the expected value of inflation in the next period. If this was combined with adaptive expectations, it would amount to much the same thing as the FPPC, but instead it was normally combined with rational expectations, where agents made their best guess at what inflation would be next period using all relevant information. This would include past inflation, but it would include other things as well, like prospects for output and any official inflation target.
Which better describes the data? ...
[W]e can see why some ... studies (like this for the US) can claim that recent inflation experience is consistent with the NKPC. It seems much more difficult to square this experience with the traditional adaptive expectations Phillips curve. As I suggested at the beginning, this is really a test of whether rational expectations is a better description of reality than adaptive expectations. But I know the conclusion I draw from the data will upset some people, so I look forward to a more sophisticated empirical analysis showing why I’m wrong.
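Schematically (my notation, with the output gap standing in for the excess-demand indicator), the two curves Wren-Lewis contrasts are:

```latex
% FPPC: adaptive expectations, at their simplest one lag of inflation
\pi_t = \pi_{t-1} + \alpha\,(y_t - y_t^*) + \varepsilon_t
% NKPC: rational expectations of next period's inflation
\pi_t = \beta\,\mathbb{E}_t\,\pi_{t+1} + \kappa\,(y_t - y_t^*) + \varepsilon_t
```

The first builds persistence in directly through lagged inflation; in the second, persistence has to come from expected future inflation and the gap term, which is why an episode like the Great Recession can help distinguish the two.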

I don't have much to add, except to say that this is an empirical question that will be difficult to resolve (because there are so many different ways to estimate a Phillips curve, and different specifications give different answers: which measure of prices to use, which measure of aggregate activity to use, what time period to use and how to handle structural and policy breaks within it, how to extract natural rates from the data, how to handle non-stationarities, whether to exclude the long-term unemployed as recent research suggests if aggregate activity is measured by the unemployment rate, how many lags to include, and so on).

Sunday, July 13, 2014

New Classical Economics as Modeling Strategy

Judy Klein emails a response to a recent post of mine based upon Simon Wren-Lewis's post “Rereading Lucas and Sargent 1979”:

Lucas and Sargent’s “After Keynesian Macroeconomics” was presented at the 1978 Boston Federal Reserve Conference on “After the Phillips Curve: Persistence of High Inflation and High Unemployment.” Although the title of the conference dealt with stagflation, the rational expectations theorists saw themselves countering one technical revolution with another.

The Keynesian Revolution was, in the form in which it succeeded in the United States, a revolution in method. This was not Keynes’s intent, nor is it the view of all of his most eminent followers. Yet if one does not view the revolution in this way, it is impossible to account for some of its most important features: the evolution of macroeconomics into a quantitative, scientific discipline, the development of explicit statistical descriptions of economic behavior, the increasing reliance of government officials on technical economic expertise, and the introduction of the use of mathematical control theory to manage an economy. [Lucas and Sargent, 1979, pg. 50]

The Lucas papers at the Economists' Papers Project at Duke University reveal the preliminary planning for the 1978 presentation. Lucas and Sargent decided that it would be a “rhetorical piece… to convince others that the old-fashioned macro game is up…in a way which makes it clear that the difficulties are fatal”; its theme would be the “death of macroeconomics” and the desirability of replacing it with an “Aggregative Economics” whose foundation was “equilibrium theory.” (Lucas letter to Sargent, February 9, 1978). Their 1978 presentation was replete, as their discussant Bob Solow pointed out, with the planned rhetorical barbs against Keynesian economics of “wildly incorrect,” “fundamentally flawed,” “wreckage,” “failure,” “fatal,” “of no value,” “dire implications,” “failure on a grand scale,” “spectacular recent failure,” “no hope.” The empirical backdrop to Lucas and Sargent’s death decree on Keynesian economics was evident in the subtitle of the conference: “Persistence of High Inflation and High Unemployment.”

Although they seized the opportunity to comment on policy failure and the high misery-index economy, Lucas and Sargent shifted the macroeconomic court of judgment from the economy to microeconomics. They fought a technical battle over the types of restrictions used by modelers to identify their structural models. Identification-rendering restrictions were essential to making both the Keynesian and rational expectations models “work” in policy applications, but Lucas and Sargent defined the ultimate terms of success not with regard to a model’s capacity for empirical explanation or achievement of a desirable policy outcome, but rather with regard to the model’s capacity to incorporate optimization and equilibrium – to aggregate consistently rational individuals and cleared markets.

In the macroeconomic history written by the victors, the Keynesian revolution and the rational expectations revolution were both technical revolutions, and one could delineate the sides of the battle line in the second revolution by the nature of the restricting assumptions that enabled the model identification that licensed policy prescription. The rational expectations revolution, however, was also a revolution in the prime referential framework for judging macroeconomic model fitness for going forth and multiplying; the consistency of the assumptions – the equation restrictions – with optimizing microeconomics and mathematical statistical theory, rather than end uses of explaining the economy and empirical statistics, constituted the new paramount selection criteria.

Some of the new classical macroeconomists have been explicit about the narrowness of their revolution. For example, Sargent noted in 2008, “While rational expectations is often thought of as a school of economic thought, it is better regarded as a ubiquitous modeling technique used widely throughout economics.” In an interview with Arjo Klamer in 1983, Robert Townsend asserted that “New classical economics means a modeling strategy.”

It is no coincidence, however, that in this modeling narrative of economic equilibrium crafted in the Cold War era, Adam Smith’s invisible hand morphs into a welfare-maximizing “hypothetical ‘benevolent social planner’” (Lucas, Prescott, Stokey 1989) enforcing a “communism of models” (Sargent 2007) and decreeing to individual agents the mutually consistent rules of action that become the equilibrating driving force. Indeed, a long-term Office of Naval Research grant for “Planning & Control of Industrial Operations” awarded to the Carnegie Institute of Technology’s Graduate School of Industrial Administration had funded Herbert Simon’s articulation of his certainty equivalence theorem and John Muth’s study of rational expectations. It is ironic that a decade-long government planning contract employing Carnegie professors and graduate students underwrote the two key modeling strategies for the Nobel-prize-winning demonstration that the rationality of consumers renders government intervention to increase employment unnecessary and harmful.

Friday, July 11, 2014

'Rereading Lucas and Sargent 1979'

Simon Wren-Lewis with a nice follow-up to an earlier discussion:

Rereading Lucas and Sargent 1979: Mainly for macroeconomists and those interested in macroeconomic thought
Following this little interchange (me, Mark Thoma, Paul Krugman, Noah Smith, Robert Waldmann, Arnold Kling), I reread what could be regarded as the New Classical manifesto: Lucas and Sargent’s ‘After Keynesian Macroeconomics’ (hereafter LS). It deserves to be cited as a classic, both for the quality of ideas and the persuasiveness of the writing. It does not seem like something written 35 years ago, which is perhaps an indication of how influential its ideas still are.
What I want to explore is whether this manifesto for the New Classical counter revolution was mainly about stagflation, or whether it was mainly about methodology. LS kick off their article with references to stagflation and the failure of Keynesian theory. A fundamental rethink is required. What follows next is I think crucial. If the counter revolution is all about stagflation, we might expect an account of why conventional theory failed to predict stagflation - the equivalent, perhaps, to the discussion of classical theory in the General Theory. Instead we get something much more general - a discussion of why identification restrictions typically imposed in the structural econometric models (SEMs) of the time are incredible from a theoretical point of view, and an outline of the Lucas critique.
In other words, the essential criticism in LS is methodological: the way empirical macroeconomics has been done since Keynes is flawed. SEMs cannot be trusted as a guide for policy. In only one paragraph do LS try to link this general critique to stagflation...[continue]...

Sunday, July 06, 2014

'Slump Stories and the Inflation Test'

Does evidence matter?:

Slump Stories and the Inflation Test: Noah Smith has another post on John Cochrane’s anti-Keynesian screed... All the anti-Keynesian stories (except “uncertainty”, which as Nick Rowe points out is actually a Keynesian story but doesn’t know it) are supply-side stories; Cochrane even puts scare quotes around the word “demand”. Basically, they’re claiming that unemployment benefits, or Obamacare, or regulations, or something, are reducing the willingness of workers and firms to produce stuff.
How would you test this? In a supply-constrained economy, the kind of monetary policy we’ve had, with the Fed quintupling the size of its balance sheet over a short period of time, would be highly inflationary. Indeed, just about everyone on the right has been predicting runaway inflation year after year.
Meanwhile, if you had a demand-side view, and considered the implications of the zero lower bound, you declared that nothing of the sort would happen...
It seems to me that the failure of the inflation predicted by anti-Keynesians to appear — and the fact that this failure was predicted by Keynesian models — is a further big reason not to take what these people are saying seriously.

In a "supply-constrained economy" the price of inputs like labor should also rise, but that hasn't happened either.

Friday, July 04, 2014

Responses to John Cochrane's Attack on New Keynesian Models

The opening quote from chapter 2 of Mankiw's intermediate macro textbook:

It is a capital mistake to theorize before one has data. Insensibly one begins to twist facts to suit theories, instead of theories to suit facts. — Sherlock Holmes

Or, instead of "before one has data," change it to "It is a capital mistake to theorize without knowledge of the data" and it's a pretty good summary of Paul Krugman's response to John Cochrane:

Macro Debates and the Relevance of Intellectual History: One of the interesting things about the ongoing economic crisis is the way it has demonstrated the importance of historical knowledge. ... But it’s not just economic history that turns out to be extremely relevant; intellectual history — the history of economic thought — turns out to be relevant too.
Consider, in particular, the recent to-and-fro about stagflation and the rise of new classical macroeconomics. You might think that this is just economist navel-gazing; but you’d be wrong.
To see why, consider John Cochrane’s latest. ... Cochrane’s current argument ... effectively depends on the notion that there must have been very good reasons for the rejection of Keynesianism, and that harkening back to old ideas must involve some kind of intellectual regression. And that’s where it’s important — as David Glasner notes — to understand what really happened in the 70s.
The point is that the new classical revolution in macroeconomics was not a classic scientific revolution, in which an old theory failed crucial empirical tests and was supplanted by a new theory that did better. The rejection of Keynes was driven by a quest for purity, not an inability to explain the data — and when the new models clearly failed the test of experience, the new classicals just dug in deeper. They didn’t back down even when people like Chris Sims (pdf), using the very kinds of time-series methods they introduced, found that they strongly pointed to a demand-side model of economic fluctuations.
And critiques like Cochrane’s continue to show a curious lack of interest in evidence. ... In short, you have a much better sense of what’s really going on here, and which ideas remain relevant, if you know about the unhappy history of macroeconomic thought.

Nick Rowe:

Insufficient Demand vs?? Uncertainty: ...John Cochrane says: "John Taylor, Stanford's Nick Bloom and Chicago Booth's Steve Davis see the uncertainty induced by seat-of-the-pants policy at fault. Who wants to hire, lend or invest when the next stroke of the presidential pen or Justice Department witch hunt can undo all the hard work? Ed Prescott emphasizes large distorting taxes and intrusive regulations. The University of Chicago's Casey Mulligan deconstructs the unintended disincentives of social programs. And so forth. These problems did not cause the recession. But they are worse now, and they can impede recovery and retard growth." ...
Increased political uncertainty would reduce aggregate demand. Plus, positive feedback processes could amplify that initial reduction in aggregate demand. Even those who were not directly affected by that increased political uncertainty would reduce their own willingness to hire, lend, or invest because of that initial reduction in aggregate demand, plus their own uncertainty about aggregate demand. So the average person or firm might respond to a survey by saying that insufficient demand was the problem in their particular case, and not the political uncertainty which caused it.
But the demand-side problem could still be prevented by an appropriate monetary policy response. Sure, there would be supply-side effects too. And it would be very hard empirically to estimate the relative magnitudes of those demand-side vs supply-side effects. ...
So it's not just an either/or thing. Nor is it even a bit-of-one-plus-bit-of-the-other thing. Increased political uncertainty can cause a recession via its effect on demand. Unless monetary policy responds appropriately. (And that, of course, would mean targeting NGDP, because inflation targeting doesn't work when supply-side shocks cause adverse shifts in the Short Run Phillips Curve.)

On whether supply or demand shocks are the source of aggregate fluctuations, Blanchard and Quah (1989), Shapiro and Watson (1988), and others had it right (though the identifying restriction that aggregate demand shocks do not have permanent effects seems to be undermined by the Great Recession). It's not an either/or question, it's a matter of figuring out how much of the variation in GDP/employment is due to supply shocks, and how much is due to demand shocks. And as Nick Rowe points out with his example, sorting between these two causes can be very difficult: identifying which type of shock is driving changes in aggregate variables depends upon particular identifying assumptions. Nevertheless, my reading of the empirical evidence is much like Krugman's. Overall, across all these papers, it is demand shocks that play the most prominent role. Supply shocks do matter, but not nearly so much as demand shocks when it comes to explaining aggregate fluctuations.
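
To make the Blanchard-Quah identifying restriction concrete, here is a minimal sketch of how it is typically implemented: estimate a reduced-form VAR in output growth and unemployment, then pick the unique structural rotation in which the second (demand) shock has no permanent effect on the level of output. The data below are random placeholders, so the printed numbers mean nothing; a real application would use log-differenced GDP and the unemployment rate.

import numpy as np
from statsmodels.tsa.api import VAR

# Placeholder series standing in for log-differenced real GDP and the
# unemployment rate; a real application would pull these from the data.
rng = np.random.default_rng(0)
dgdp = rng.normal(0.005, 0.01, 200)
u = 5 + rng.normal(0, 0.3, 200)
data = np.column_stack([dgdp, u])    # ordering: [output growth, u]

res = VAR(data).fit(2)               # reduced-form VAR(2)
sigma = res.sigma_u                  # residual covariance matrix
c1 = np.linalg.inv(np.eye(2) - sum(res.coefs))   # long-run multiplier C(1)

# The lower-triangular Cholesky factor of C(1) Sigma C(1)' imposes the
# restriction: the second (demand) shock has no long-run effect on output.
f = np.linalg.cholesky(c1 @ sigma @ c1.T)
b = np.linalg.solve(c1, f)           # structural impact matrix, B B' = Sigma
print("structural impact matrix B:\n", b)

Historical demand and supply shocks can then be recovered by applying the inverse of B to the reduced-form residuals, which is what makes the "how much is demand, how much is supply" variance-decomposition question answerable at all.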

Saturday, June 28, 2014

The Rise and Fall of the New Classical Model

Simon Wren-Lewis (my comments are at the end):

Understanding the New Classical revolution: In the account of the history of macroeconomic thought I gave here, the New Classical counter revolution was both methodological and ideological in nature. It was successful, I suggested, because too many economists were unhappy with the gulf between the methodology used in much of microeconomics, and the methodology of macroeconomics at the time.
There is a much simpler reading. Just as the original Keynesian revolution was caused by massive empirical failure (the Great Depression), the New Classical revolution was caused by the Keynesian failure of the 1970s: stagflation. An example of this reading is in this piece by the philosopher Alex Rosenberg (HT Diane Coyle). He writes: “Back then it was the New Classical macrotheory that gave the right answers and explained what the matter with the Keynesian models was.”
I just do not think that is right. Stagflation is very easily explained: you just need an ‘accelerationist’ Phillips curve (i.e. where the coefficient on expected inflation is one), plus a period in which monetary policymakers systematically underestimate the natural rate of unemployment. You do not need rational expectations, or any of the other innovations introduced by New Classical economists.
No doubt the inflation of the 1970s made the macroeconomic status quo unattractive. But I do not think the basic appeal of New Classical ideas lay in their better predictive ability. The attraction of rational expectations was not that it explained actual expectations data better than some form of adaptive scheme. Instead it just seemed more consistent with the general idea of rationality that economists used all the time. Ricardian Equivalence was not successful because the data revealed that tax cuts had no impact on consumption - in fact study after study has shown that tax cuts do have a significant impact on consumption.
Stagflation did not kill IS-LM. In fact, because empirical validity was so central to the methodology of macroeconomics at the time, it adapted to stagflation very quickly. This gave a boost to the policy of monetarism, but this used the same IS-LM framework. If you want to find the decisive event that led to New Classical economists winning their counterrevolution, it was the theoretical realisation that if expectations were rational, but inflation was described by an accelerationist Phillips curve with expectations about current inflation on the right hand side, then deviations from the natural rate had to be random. The fatal flaw in the Keynesian/Monetarist theory of the 1970s was theoretical rather than empirical.
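
Both of Wren-Lewis's analytical claims can be written down in a couple of lines. A minimal sketch, in my notation rather than his: the accelerationist Phillips curve with adaptive expectations is

    \pi_t = \pi_{t-1} + \alpha (u^* - u_t), \qquad \alpha > 0

If policymakers keep targeting an outdated estimate \hat{u} < u^* of the natural rate, inflation rises by \alpha (u^* - \hat{u}) every period with no lasting employment gain; combine that with a natural rate that had itself risen, and you get high inflation alongside high unemployment. No New Classical machinery is required. Now replace the adaptive term \pi_{t-1} with a rational expectation:

    \pi_t = E_{t-1}\pi_t + \alpha (u^* - u_t) + \varepsilon_t, \qquad E_{t-1}\varepsilon_t = 0

Taking expectations at t-1 of both sides forces E_{t-1} u_t = u^*: deviations of unemployment from the natural rate are unforecastable, so systematic policy cannot generate them. That is the theoretical realisation Wren-Lewis identifies as decisive.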

I agree with this, so let me add to it by talking about what led to the end of the New Classical revolution (see here for a discussion of the properties of New Classical, New Keynesian, and Real Business Cycle Models). The biggest factor was empirical validity. Although some versions of the New Classical model allowed monetary non-neutrality (e.g. King 1982, JPE), when three features are combined - continuous market clearing, rational expectations, and the natural rate hypothesis - these models generally imply monetary neutrality. Initially, work from people like Barro found strong support for the prediction of these models that only unanticipated changes in monetary policy can affect real variables like output, but subsequent work, and eventually the weight of the evidence, pointed in the other direction. Both expected and unexpected changes in the money supply appeared to matter, in contrast to a key prediction of the New Classical framework.
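
Barro's test is worth sketching, since the logic is simple: model money growth, split it into an anticipated part (the fitted values) and an unanticipated part (the residuals), and ask which one moves output. Everything below is a stylized illustration with random placeholder data and hypothetical variable names, so the regression output is not meaningful; the point is the two-step procedure.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
T = 160
lag_m = rng.normal(0.01, 0.005, T)                # placeholder predictor of money growth
m_growth = 0.5 * lag_m + rng.normal(0, 0.004, T)  # placeholder money growth
y_growth = rng.normal(0.005, 0.01, T)             # placeholder output growth

# Step 1: money-growth equation; fitted values = anticipated money,
# residuals = unanticipated money.
step1 = sm.OLS(m_growth, sm.add_constant(lag_m)).fit()
anticipated, unanticipated = step1.fittedvalues, step1.resid

# Step 2: regress output growth on both components. The New Classical
# prediction is a zero coefficient on the anticipated component.
X = sm.add_constant(np.column_stack([anticipated, unanticipated]))
print(sm.OLS(y_growth, X).fit().summary())

Later work running versions of this regression with longer lag structures (Mishkin's 1982 JPE paper is the classic example) found a significant coefficient on the anticipated component as well, which is the reversal of the evidence described above.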

A second factor that worked against New Classical models was that they had difficulty explaining both the duration and the magnitude of actual business cycles. If the reaction to an unexpected policy shock was concentrated in a single period, the magnitude could be matched, but not the duration. If the shock was spread over 3-5 years to match the duration, the magnitude of cycles could not be matched. Movements in macroeconomic variables arising from informational errors (unexpected policy shocks) did not have enough "power" to capture both aspects of actual business cycles.

The other factor that worked against these models was that, within them, information problems were the key source of swings in GDP and employment, and these fluctuations were costly in the aggregate. Yet no markets for information emerged to resolve the problem. For those who believe in the power of markets, and many proponents of New Classical models were also market fundamentalists, the absence of markets for information was a problem.

The New Classical model had displaced the Keynesian model for the reasons highlighted above, but the failure of the New Classical model left the door open for the New Keynesian model to emerge (it appeared to be more consistent with the empirical evidence on the effects of changes in the money supply, and in other areas as well, e.g. the correlation between productivity and economic activity).

But while the New Classical revolution was relatively short-lived as macro models go, it left two important legacies: rational expectations and microfoundations (along with better knowledge about how non-neutralities might arise - in essence, the New Keynesian model drops continuous market clearing through the assumption of short-run price rigidities - and about how to model information sets). Rightly or wrongly, all subsequent models had to have these two elements present within them, or they would be dismissed.

Thursday, June 26, 2014

Why DSGEs Crash During Crises

David Hendry and Grayham Mizon with an important point about DSGE models:

Why DSGEs crash during crises, by David F. Hendry and Grayham E. Mizon: Many central banks rely on dynamic stochastic general equilibrium models – known as DSGEs to cognoscenti. This column – which is more technical than most Vox columns – argues that the models’ mathematical basis fails when crises shift the underlying distributions of shocks. Specifically, the linchpin ‘law of iterated expectations’ fails, so economic analyses involving conditional expectations and inter-temporal derivations also fail. Like a fire station that automatically burns down whenever a big fire starts, DSGEs become unreliable when they are most needed.

Here's the introduction:

In most aspects of their lives humans must plan forwards. They take decisions today that affect their future in complex interactions with the decisions of others. When taking such decisions, the available information is only ever a subset of the universe of past and present information, as no individual or group of individuals can be aware of all the relevant information. Hence, views or expectations about the future, relevant for their decisions, use a partial information set, formally expressed as a conditional expectation given the available information.
Moreover, all such views are predicated on there being no unanticipated future changes in the environment pertinent to the decision. This is formally captured in the concept of ‘stationarity’. Without stationarity, good outcomes based on conditional expectations could not be achieved consistently. Fortunately, there are periods of stability when insights into the way that past events unfolded can assist in planning for the future.
The world, however, is far from completely stationary. Unanticipated events occur, and they cannot be dealt with using standard data-transformation techniques such as differencing, or by taking linear combinations, or ratios. In particular, ‘extrinsic unpredictability’ – unpredicted shifts of the distributions of economic variables at unanticipated times – is common. As we shall illustrate, extrinsic unpredictability has dramatic consequences for the standard macroeconomic forecasting models used by governments around the world – models known as ‘dynamic stochastic general equilibrium’ models – or DSGE models. ...[continue]...
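
Hendry and Mizon's point is easy to demonstrate in a few lines of simulation: form conditional-expectation forecasts that are optimal under one distribution, shift the distribution at an unanticipated date, and the forecast errors stop averaging to zero. The sketch below uses a simple AR(1) with a mean shift; all parameter values are illustrative.

import numpy as np

rng = np.random.default_rng(1)

# AR(1) whose unconditional mean shifts at an unanticipated break date.
T, T_break = 400, 200
rho, mu_pre, mu_post = 0.8, 0.0, 2.0

y = np.zeros(T)
for t in range(1, T):
    mu = mu_pre if t < T_break else mu_post
    y[t] = mu * (1 - rho) + rho * y[t - 1] + rng.normal(0, 0.5)

# A forecaster using the pre-break law computes E[y_t | y_{t-1}]
# = mu_pre*(1 - rho) + rho*y_{t-1}. Under stationarity, iterated
# expectations guarantee these errors average to zero.
forecast = mu_pre * (1 - rho) + rho * y[:-1]
errors = y[1:] - forecast

print("mean forecast error, pre-break :", errors[:T_break - 1].mean())  # close to zero
print("mean forecast error, post-break:", errors[T_break - 1:].mean()) # ~ (1-rho)*(mu_post-mu_pre)

After the break, expectations conditioned on the old distribution are biased every period, which is the sense in which analyses built on conditional expectations and inter-temporal derivations fail until agents (and models) learn the new distribution.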

Update: [nerdy] Reply to Hendry and Mizon: we have DSGE models with time-varying parameters and variances.

Tuesday, June 24, 2014

'Was the Neoclassical Synthesis Unstable?'

The last paragraph from a much longer argument by Simon Wren-Lewis:

Was the neoclassical synthesis unstable?: ... Of course we have moved on from the 1980s. Yet in some respects we have not moved very far. With the counter revolution we swung from one methodological extreme to the other, and we have not moved much since. The admissibility of models still depends on their theoretical consistency rather than consistency with evidence. It is still seen as more important when building models of the business cycle to allow for the endogeneity of labour supply than to allow for involuntary unemployment. What this means is that many macroeconomists who think they are just ‘taking theory seriously’ are in fact applying a particular theoretical view which happens to suit the ideology of the counter revolutionaries. The key to changing that is to first accept it.

Friday, May 09, 2014

Economists and Methodology

Simon Wren-Lewis:

Economists and methodology: ...very few economists write much about methodology. This would be understandable if economics was just like some other discipline where methodological discussion was routine. This is not the case. Economics is not like the physical sciences for well known reasons. Yet economics is not like most other social sciences either: it is highly deductive, highly abstractive (in the non-philosophical sense) and rarely holistic. ...
This is a long-winded way of saying that the methodology used by economics is interesting because it is unusual. Yet, as I say, you will generally not find economists writing about methodology. One reason for this is ... a feeling that the methodology being used is unproblematic, and therefore requires little discussion.
I cannot help giving the example of macroeconomics to show that this view is quite wrong. The methodology of macroeconomics in the 1960s was heavily evidence based. Microeconomics was used to suggest aggregate relationships, but not to determine them. Consistency with the data (using some chosen set of econometric criteria) often governed what was or was not allowed in a parameterised (numerical) model, or even a theoretical model. It was a methodology that some interpreted as Popperian. The methodology of macroeconomics now is very different. Consistency with microeconomic theory governs what is in a DSGE model, and evidence plays a much more indirect role. Now I have only a limited knowledge of the philosophy of science..., but I know enough to recognise this as an important methodological change. Yet I find many macroeconomists just assume that their methodology is unproblematic, because it is what everyone mainstream currently does. ...
... The classic example of an economist writing about methodology is Friedman’s Essays in Positive Economics. This puts forward an instrumentalist view: the idea that the realism of assumptions does not matter, it is results that count.
Yet does instrumentalism describe Friedman’s major contributions to macroeconomics? Well, one of those was the expectations augmented Phillips curve. ... Friedman argued that the coefficient on expected inflation should be one. His main reason for doing so was not that such an adaptation predicted better, but because it was based on better assumptions about what workers were interested in: real rather than nominal wages. In other words, it was based on more realistic assumptions. ...
Economists do not think enough about their own methodology. This means economists are often not familiar with methodological discussion, which implies that using what they write on the subject as evidence about what they do can be misleading. Yet most methodological discussion of economics is (and should be) about what economists do, rather than what they think they do. That is why I find that the more interesting and accurate methodological writing on economics looks at the models and methods economists actually use...

Monday, May 05, 2014

'Refocusing Economics Education'

Antonio Fatás (each of the four points below is explained in detail in the original post):

Refocusing economics education: Via Mark Thoma I read an interesting article about how the mainstream economics curriculum needs to be revamped (Wren-Lewis also has some nice thoughts on this issue).

I am sympathetic to some of the arguments made in those posts and the need for some serious rethinking of the way economics is taught, but I would put the emphasis on slightly different arguments. First, I am not sure the recent global crisis should be the main reason to change the economics curriculum. Yes, economists failed to predict many aspects of the crisis, but my view is that it was not because of a lack of tools or understanding. We have enough models in economics to explain most of the phenomena that caused and propagated the global financial crisis. There are plenty of models where individuals are not rational, where financial markets are driven by bubbles, with multiple equilibria,... that one can use to understand the last decade. We do have all these tools, but as economics teachers (and researchers) we need to choose which ones to focus on. And here is where we failed - not just before and during the crisis, but earlier as well. Why aren't we focusing on the right models or methodology? Here is my list of the mistakes we make in our teaching, which might also reflect on our research:

#1 Too much theory, not enough emphasis on explaining empirical phenomena. ...

#2 Too many counterintuitive results. Economists like to teach things that are surprising. ...

#3 The need for a unified theory. ...

#4 We teach what our audience wants to hear. ...

I also believe the sociology within the profession needs to change.