Category Archive for: Academic Papers

Tuesday, April 08, 2014

A Model of Secular Stagnation

Gauti Eggertsson and Neil Mehrotra have an interesting new paper:

A Model of Secular Stagnation, by Gauti Eggertsson and Neil Mehrotra: 1 Introduction During the closing phase of the Great Depression in 1938, the President of the American Economic Association, Alvin Hansen, delivered a disturbing message in his Presidential Address to the Association (see Hansen (1939)). He suggested that the Great Depression might just be the start of a new era of ongoing unemployment and economic stagnation without any natural force towards full employment. This idea was termed the “secular stagnation” hypothesis. One of the main driving forces of secular stagnation, according to Hansen, was a decline in the population birth rate and an oversupply of savings that was suppressing aggregate demand. Soon after Hansen’s address, the Second World War led to a massive increase in government spending, effectively ending any concern of insufficient demand. Moreover, the baby boom following WWII drastically changed the population dynamics in the US, thus effectively erasing the problem of excess savings of an aging population that was of principal importance in his secular stagnation hypothesis.
Recently Hansen’s secular stagnation hypothesis has gained increased attention. One obvious motivation is the Japanese malaise that has by now lasted two decades and has many of the same symptoms as the U.S. Great Depression - namely dwindling population growth, a nominal interest rate at zero, and subpar GDP growth. Another reason for renewed interest is that even if the financial panic of 2008 was contained, growth remains weak in the United States and unemployment high. Most prominently, Lawrence Summers raised the prospect that the crisis of 2008 may have ushered in the beginning of secular stagnation in the United States in much the same way as suggested by Alvin Hansen in 1938. Summers suggests that this episode of low demand may even have started well before 2008 but was masked by the housing bubble before the onset of the crisis of 2008. In Summers’ words, we may have found ourselves in a situation in which the natural rate of interest - the short-term real interest rate consistent with full employment - is permanently negative (see Summers (2013)). And this, according to Summers, has profound implications for the conduct of monetary, fiscal and financial stability policy today.
Despite the prominence of Summers’ discussion of the secular stagnation hypothesis and a flurry of commentary that followed it (see e.g. Krugman (2013), Taylor (2014), DeLong (2014) for a few examples), there has not, to the best of our knowledge, been any attempt to formally model this idea, i.e., to write down an explicit model in which unemployment is high for an indefinite amount of time due to a permanent drop in the natural rate of interest. The goal of this paper is to fill this gap. ...[read more]...

In the abstract, they note the policy prescriptions for secular stagnation:

In contrast to earlier work on deleveraging, our model does not feature a strong self-correcting force back to full employment in the long run, absent policy actions. Successful policy actions include, among others, a permanent increase in inflation and a permanent increase in government spending. We also establish conditions under which an income redistribution can increase demand. Policies such as committing to keep nominal interest rates low or temporary government spending, however, are less powerful than in models with temporary slumps. Our model sheds light on the long persistence of the Japanese crisis, the Great Depression, and the slow recovery out of the Great Recession.

Tuesday, March 04, 2014

'Will MOOCs Lead to the Democratisation of Education?'

Some theoretical results on MOOCs:

Will MOOCs lead to the democratisation of education?, by Joshua Gans: With all the recent discussion of how hard it is for journalists to read academic articles, I thought I’d provide a little service here and ‘translate’ the recent NBER working paper by Daron Acemoglu, David Laibson and John List, “Equalizing Superstars” for a general audience. The paper contains a ‘light’ general equilibrium model that may be difficult for some to parse.
The paper is interested in what the effect of MOOCs or, in general, web-based teaching options would be on educational outcomes around the world, the distribution of those outcomes and the wages of teachers. ...

Thursday, February 13, 2014

Debt and Growth: There is No Magic Threshold

New paper from the IMF:

Debt and Growth: Is There a Magic Threshold?, by Andrea Pescatori, Damiano Sandri, and John Simon [Free Full text]: Summary: Using a novel empirical approach and an extensive dataset developed by the Fiscal Affairs Department of the IMF, we find no evidence of any particular debt threshold above which medium-term growth prospects are dramatically compromised. Furthermore, we find the debt trajectory can be as important as the debt level in understanding future growth prospects, since countries with high but declining debt appear to grow equally as fast as countries with lower debt. Notwithstanding this, we find some evidence that higher debt is associated with a higher degree of output volatility.

[Via Bruce Bartlett on Twitter.]

Wednesday, February 12, 2014

'Is Increased Price Flexibility Stabilizing? Redux'

I need to read this:

Is Increased Price Flexibility Stabilizing? Redux, by Saroj Bhattarai, Gauti Eggertsson, and Raphael Schoenle, NBER Working Paper No. 19886, February 2014 [Open Link]: Abstract We study the implications of increased price flexibility on output volatility. In a simple DSGE model, we show analytically that more flexible prices always amplify output volatility for supply shocks and also amplify output volatility for demand shocks if monetary policy does not respond strongly to inflation. More flexible prices often reduce welfare, even under optimal monetary policy if full efficiency cannot be attained. We estimate a medium-scale DSGE model using post-WWII U.S. data. In a counterfactual experiment we find that if prices and wages are fully flexible, the standard deviation of annualized output growth more than doubles.

Friday, February 07, 2014

Latest from the Journal of Economic Perspectives

A few of the articles from the latest Journal of Economic Perspectives:

When Ideas Trump Interests: Preferences, Worldviews, and Policy Innovations, by Dani Rodrik: Ideas are strangely absent from modern models of political economy. In most prevailing theories of policy choice, the dominant role is instead played by "vested interests"—elites, lobbies, and rent-seeking groups which get their way at the expense of the general public. Any model of political economy in which organized interests do not figure prominently is likely to remain vacuous and incomplete. But it does not follow from this that interests are the ultimate determinant of political outcomes. Here I will challenge the notion that there is a well-defined mapping from "interests" to outcomes. This mapping depends on many unstated assumptions about the ideas that political agents have about: 1) what they are maximizing, 2) how the world works, and 3) the set of tools they have at their disposal to further their interests. Importantly, these ideas are subject to both manipulation and innovation, making them part of the political game. There is, in fact, a direct parallel, as I will show, between inventive activity in technology, which economists now routinely make endogenous in their models, and investment in persuasion and policy innovation in the political arena. I focus specifically on models professing to explain economic inefficiency and argue that outcomes in such models are determined as much by the ideas that elites are presumed to have on feasible strategies as by vested interests themselves. A corollary is that new ideas about policy—or policy entrepreneurship—can exert an independent effect on equilibrium outcomes even in the absence of changes in the configuration of political power. I conclude by discussing the sources of new ideas. Full-Text Access | Supplementary Materials

An Economist's Guide to Visualizing Data, by Jonathan A. Schwabish: Once upon a time, a picture was worth a thousand words. But with online news, blogs, and social media, a good picture can now be worth so much more. Economists who want to disseminate their research, both inside and outside the seminar room, should invest some time in thinking about how to construct compelling and effective graphics. Full-Text Access | Supplementary Materials

Wednesday, January 01, 2014

'Minimum Wages and the Distribution of Family Incomes'

Arin Dube has a new working paper entitled "Minimum Wages and the Distribution of Family Incomes."

Here is his short summary:

The paper tries to make sense of the existing literature, while providing new (and I would argue better) answers to old questions, such as the effect on the poverty rate, and also conducting a more full-fledged distributional analysis of minimum wages and family incomes using newer tools.

Here is the abstract:

I use data from the March Current Population Survey between 1990 and 2012 to evaluate the effect of minimum wages on the distribution of family incomes for non-elderly individuals. I find robust evidence that higher minimum wages moderately reduce the share of individuals with incomes below 50, 75 and 100 percent of the federal poverty line. The elasticity of the poverty rate with respect to the minimum wage ranges between -0.12 and -0.37 across specifications with alternative forms of time-varying controls and lagged effects; most of these estimates are statistically significant at conventional levels. For my preferred (most saturated) specification, the poverty rate elasticity is -0.24, and rises in magnitude to -0.36 when accounting for lags. I also use recentered influence function regressions to estimate unconditional quantile partial effects of minimum wages on family incomes. The estimated minimum wage elasticities are sizable for the bottom quantiles of the equivalized family income distribution. The clearest effects are found at the 10th and 15th quantiles, where estimates from most specifications are statistically significant; minimum wage elasticities for these two family income quantiles range between 0.10 and 0.43 depending on control sets and lags. I also show that the canonical two-way fixed effects model -- used most often in the literature -- insufficiently accounts for the spatial heterogeneity in minimum wage policies, and fails a number of key falsification tests. Accounting for time-varying regional effects, and state-specific recession effects both suggest a greater impact of the policy on family incomes and poverty, while the addition of state-specific trends does not appear to substantially alter the estimates. I also provide a quantitative summary of the literature, bringing together nearly all existing elasticities of the poverty rate with respect to minimum wages from 12 different papers. The range of the estimates in this paper is broadly consistent with most existing evidence, including for some key subgroups, but previous studies often suffer from limitations including insufficiently long sample periods and inadequate controls for state-level heterogeneity, which tend to produce imprecise and erratic results.
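To put the headline elasticities in concrete terms, here is a quick back-of-envelope illustration; the elasticities come from the abstract, but the 15 percent baseline poverty rate and the 10 percent hike are hypothetical round numbers I chose for illustration:

```python
# Back-of-envelope reading of the poverty-rate elasticities above.
# The elasticities come from the abstract; the 15% baseline poverty
# rate and the 10% hike are hypothetical round numbers.

def poverty_rate_after_hike(baseline_rate, elasticity, pct_increase):
    """New poverty rate under a constant-elasticity approximation."""
    return baseline_rate * (1 + elasticity * pct_increase)

baseline = 15.0   # hypothetical poverty rate, in percent
hike = 0.10       # a 10% minimum wage increase

for label, eps in [("preferred specification", -0.24), ("with lags", -0.36)]:
    new_rate = poverty_rate_after_hike(baseline, eps, hike)
    print(f"{label}: elasticity {eps} -> poverty rate falls "
          f"from {baseline:.1f}% to {new_rate:.2f}%")
# preferred specification: 15.0% -> 14.64%, a 2.4% proportional decline
```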

Update: Here is the key graph from the paper (click for larger version):

[Graph: minimum wage elasticities]

Sunday, December 01, 2013

God Didn’t Make Little Green Arrows

Paul Krugman notes work by my colleague George Evans relating to the recent debate over the stability of GE models:

God Didn’t Make Little Green Arrows: Actually, they’re little blue arrows here. In any case George Evans reminds me of a paper (pdf) he and his co-authors published in 2008 about stability and the liquidity trap, which he later used to explain what was wrong with the Kocherlakota notion (now discarded, but still apparently defended by Williamson) that low rates cause deflation.

The issue is the stability of the deflation steady state ("on the importance of little arrows"). This is precisely the issue George studied in his 2008 European Economic Review paper with E. Guse and S. Honkapohja. The following figure from that paper has the relevant little arrows:

[Figure: phase diagram from Evans, Guse, and Honkapohja (2008)]

This is the 2-dimensional figure (click on it for a larger version) showing the phase diagram for inflation and consumption expectations under adaptive learning (in the New Keynesian model, both consumption (or output) expectations and inflation expectations are central). The intended steady state (marked by a star) is locally stable under learning, but the deflation steady state (given by the other intersection of the black curves) is not locally stable, and there are nearby divergent paths with falling inflation and falling output. There is also a two-page summary in George's 2009 Annual Review of Economics paper.
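Here is a minimal sketch, in code, of the stability logic the figure illustrates. To be clear, this is a one-dimensional caricature I wrote for intuition, not the two-dimensional Evans-Guse-Honkapohja model: a Taylor rule truncated at the zero lower bound, a made-up demand channel running from the real interest rate to inflation, and simple adaptive updating of inflation expectations, with illustrative parameter values throughout.

```python
# One-dimensional caricature of the stability logic -- NOT the
# two-dimensional Evans-Guse-Honkapohja model. A Taylor rule truncated
# at the zero lower bound plus adaptive updating of inflation
# expectations; all parameter values are illustrative.

r_star, pi_star = 0.02, 0.02       # natural real rate, inflation target
phi, alpha, gain = 1.5, 0.5, 0.1   # Taylor coefficient, demand slope, learning gain

def taylor_rule(pi_e):
    """Nominal rate: active Taylor rule, truncated at zero."""
    return max(0.0, r_star + pi_star + phi * (pi_e - pi_star))

def step(pi_e):
    """One adaptive-learning update of inflation expectations."""
    real_rate = taylor_rule(pi_e) - pi_e
    pi = pi_e - alpha * (real_rate - r_star)   # high real rate -> low inflation
    return pi_e + gain * (pi - pi_e)

for pi0, label in [(0.015, "start near target"),
                   (-0.021, "start just below deflation steady state")]:
    pi_e = pi0
    for _ in range(300):
        pi_e = step(pi_e)
    print(f"{label}: expectations end at {pi_e:.4f}")
# near target: pulled back to 0.02 (locally stable)
# below -r_star = -0.02: expectations fall without bound (deflationary spiral)
```

Near the target the Taylor principle (phi > 1) pushes expectations back; at the zero bound the feedback flips sign, which is exactly the point of the little arrows.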

The relevant policy issue came up in 2010 in connection with Kocherlakota's comments about interest rates, and I got George to make a video in Sept. 2010 that makes the implied monetary policy point.

I think it would be a step forward if the EER paper helped Williamson and others who have not understood the disequilibrium stability point. The full EER reference is Evans, George W., Eran Guse, and Seppo Honkapohja, "Liquidity Traps, Learning and Stagnation," European Economic Review, Vol. 52, 2008, pp. 1438–1463.

Friday, November 15, 2013

'Infant Mortality and the President’s Party'

Chris Blattman:

Do Republican Presidents kill babies?:

Across all nine presidential administrations, infant mortality rates were below trend when the President was a Democrat and above trend when the President was a Republican.

This was true for overall, neonatal, and postneonatal mortality, with effects larger for postneonatal compared to neonatal mortality rates.

Regression estimates show that, relative to trend, Republican administrations were characterized by infant mortality rates that were, on average, three percent higher than Democratic administrations.

In proportional terms, the effect size is similar for US whites and blacks. US black rates are more than twice as high as white rates, implying substantially larger absolute effects for blacks.

A new paper titled, “US Infant Mortality and the President’s Party”. I like my title better.

The abstract also says:

Conclusions: We found a robust, quantitatively important association between net of trend US infant mortality rates and the party affiliation of the president. There may be overlooked ways by which macro-dynamics of policy impact micro-dynamics of physiology, suggesting the political system is a component of the underlying mechanism generating health inequality in the United States.

Monday, November 11, 2013

'Why ask Why? Forward Causal Inference and Reverse Causal Questions'

Andrew Gelman and Guido Imbens posted this at the NBER to try to get the attention of economists:

Why ask Why? Forward Causal Inference and Reverse Causal Questions, by Andrew Gelman and Guido Imbens, NBER Working Paper No. 19614, November 2013: The statistical and econometrics literature on causality is more focused on "effects of causes" than on "causes of effects." That is, in the standard approach it is natural to study the effect of a treatment, but it is not in general possible to define the causes of any particular outcome. This has led some researchers to dismiss the search for causes as "cocktail party chatter" that is outside the realm of science. We argue here that the search for causes can be understood within traditional statistical frameworks as a part of model checking and hypothesis generation. We argue that it can make sense to ask questions about the causes of effects, but the answers to these questions will be in terms of effects of causes.

Thursday, November 07, 2013

New Research in Economics: The Effect of Household Debt Deleveraging on Unemployment – Evidence from Spanish Provinces

Sebastian Jauch and Sebastian Watzka "find that around 1/3 of the increase in Spanish unemployment following the housing boom is due to household mortgage debt deleveraging. This is strongly at odds with the EC estimates of the NAIRU."

Download Jauch-Watzka:

The Effect of Household Debt Deleveraging on Unemployment – Evidence from Spanish Provinces, by Sebastian Jauch and Sebastian Watzka: Introduction Spanish unemployment has risen from a low of 7% in 2007 to a high of 26% in 2012. Around 6.1 million people are currently unemployed in Spain. Unemployment rates are particularly high for young people, with every second young Spaniard looking for a job. Given the enormous economic, psychological and social problems associated with high and long-lasting unemployment, it is of the utmost importance to study the causes of the large increase in Spanish unemployment.
In this paper we therefore take a close look at one of these causes and study the extent to which the increase in Spanish unemployment is due to the effects of Spanish household debt deleveraging. Using household mortgage debt data for 50 Spanish provinces together with detailed sectoral unemployment data at the provincial level, we estimate that over the period 2007-10 around 1/3 of the newly unemployed, or a total of approximately 860,000 people, have become unemployed due to mortgage debt-related aggregate demand reasons.
The underlying transmission mechanism investigated in this study begins with a deleveraging shock to household balance sheets. The more debt a household has accumulated relative to its income before the shock, the more it must deleverage afterwards by increasing savings and reducing spending in order to restructure its balance sheet. Given the elasticity of employment with respect to demand, these deleveraging needs will increase unemployment. ...

Tuesday, November 05, 2013

Aggregate Supply: Recent Developments and Implications for the Conduct of Monetary Policy

A new paper from Dave Reifschneider, William Wascher, and David Wilcox of the Federal Reserve Board on how the recession has damaged the economy:

Aggregate Supply in the United States: Recent Developments and Implications for the Conduct of Monetary Policy, by Dave Reifschneider, William Wascher, and David Wilcox: Abstract: The recent financial crisis and ensuing recession appear to have put the productive capacity of the economy on a lower and shallower trajectory than the one that seemed to be in place prior to 2007. Using a version of an unobserved components model introduced by Fleischman and Roberts (2011), we estimate that potential GDP is currently about 7 percent below the trajectory it appeared to be on prior to 2007. We also examine the recent performance of the labor market. While the available indicators are still inconclusive, some suggest that hysteresis should be a more present concern now than it has been during previous periods of economic recovery in the United States. We go on to argue that a significant portion of the recent damage to the supply side of the economy plausibly was endogenous to the weakness in aggregate demand -- contrary to the conventional view that policymakers must simply accommodate themselves to aggregate supply conditions. Endogeneity of supply with respect to demand provides a strong motivation for a vigorous policy response to a weakening in aggregate demand, and we present optimal-control simulations showing how monetary policy might respond to such endogeneity in the absence of other considerations. We then discuss how other considerations -- such as increased risks of financial instability or inflation instability -- could cause policymakers to exercise restraint in their response to cyclical weakness.

See here too.

Saturday, November 02, 2013

'Improving GDP Measurement: A Measurement-Error Perspective'

Interesting work on obtaining blended estimates of GDP:

Improving GDP Measurement: A Measurement-Error Perspective, by Boragan Aruoba, Francis X. Diebold, Jeremy Nalewaik, Frank Schorfheide, and Dongho Song, First Draft: January 2013; This Draft: May 2, 2013: Abstract: We provide a new and superior measure of U.S. GDP, obtained by applying optimal signal-extraction techniques to the (noisy) expenditure-side and income-side estimates. Its properties -- particularly as regards serial correlation -- differ markedly from those of the standard expenditure-side measure and lead to substantially revised views regarding the properties of GDP.
1 Introduction Aggregate real output is surely the most fundamental and important concept in macroeconomic theory. Surprisingly, however, significant uncertainty still surrounds its measurement. In the U.S., in particular, two often-divergent GDP estimates exist, a widely-used expenditure-side version, GDPE, and a much less widely-used income-side version, GDPI. Nalewaik (2010) and Fixler and Nalewaik (2009) make clear that, at the very least, GDPI deserves serious attention and may even have properties in certain respects superior to those of GDPE. That is, if forced to choose between GDPE and GDPI, a surprisingly strong case exists for GDPI. But of course one is not forced to choose between GDPE and GDPI, and a GDP estimate based on both GDPE and GDPI may be superior to either one alone. In this paper we propose and implement a framework for obtaining such a blended estimate. ...

The main result on the serial correlation properties is that the blended measure "GDPM is highly serially correlated across all specifications (ρ≈.6), much more so than the current 'consensus' based on GDPE (ρ≈.3)."
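To see the flavor of the signal-extraction idea in miniature, consider a static sketch: two noisy measures of the same latent growth rate, combined with precision weights. The paper's actual estimator is a dynamic state-space model; the noise variances below are invented purely for illustration.

```python
import numpy as np

# Static caricature of the blending idea: GDP_E and GDP_I are noisy
# measurements of the same true growth rate, and the minimum-variance
# blend weights each one inversely to its noise variance. The paper's
# estimator is a dynamic state-space model; these variances are made up.

rng = np.random.default_rng(0)
T = 400
true_gdp = rng.normal(0.5, 0.6, T)         # "true" quarterly growth, percent
gdp_e = true_gdp + rng.normal(0, 0.4, T)   # expenditure-side measurement
gdp_i = true_gdp + rng.normal(0, 0.3, T)   # income-side measurement

var_e, var_i = 0.4**2, 0.3**2
w_e = (1 / var_e) / (1 / var_e + 1 / var_i)   # precision weight on GDP_E
gdp_m = w_e * gdp_e + (1 - w_e) * gdp_i

def rmse(x):
    return np.sqrt(np.mean((x - true_gdp) ** 2))

print(f"weight on GDP_E: {w_e:.2f}")
print(f"RMSE -- GDP_E: {rmse(gdp_e):.3f}, GDP_I: {rmse(gdp_i):.3f}, "
      f"blend: {rmse(gdp_m):.3f}")
# the blend's measurement error is smaller than either measure's alone
```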

Friday, October 25, 2013

Are Sticky Prices Costly? Evidence From The Stock Market

Another interesting paper that supports the New Keynesian sticky price assumption:

Are Sticky Prices Costly? Evidence From The Stock Market, by Yuriy Gorodnichenko and Michael Weber, NBER: Abstract We show that after monetary policy announcements, the conditional volatility of stock market returns rises more for firms with stickier prices than for firms with more flexible prices. This differential reaction is economically large as well as strikingly robust to a broad array of checks. These results suggest that menu costs -- broadly defined to include physical costs of price adjustment, informational frictions, etc. -- are an important factor for nominal price rigidity. We also show that our empirical results are qualitatively and, under plausible calibrations, quantitatively consistent with New Keynesian macroeconomic models where firms have heterogeneous price stickiness. Since our framework is valid for a wide variety of theoretical models and frictions preventing firms from price adjustment, we provide "model-free" evidence that sticky prices are indeed costly.

Fiscal Multipliers: Liquidity Traps and Currency Unions

The first paper at the conference is interesting:

Fiscal Multipliers: Liquidity Traps and Currency Unions, by Emmanuel Farhi and Iván Werning, NBER: We provide explicit solutions for government spending multipliers during a liquidity trap and within a fixed exchange rate regime using standard closed and open-economy models. We confirm the potential for large multipliers during liquidity traps. For a currency union, we show that self-financed multipliers are small, always below unity. However, outside transfers or windfalls can generate larger responses in output, whether or not they are spent by the government. Our solutions are relevant for local and national multipliers, providing insight into the economic mechanisms at work as well as the testable implications of these models.

Discussant: The "Keynesian demand effect can potentially be very large." Here is a bit of the introduction that explains further:

1 Introduction Economists generally agree that macroeconomic stabilization should be handled first and foremost by monetary policy. Yet monetary policy can run into constraints that impair its effectiveness. For example, the economy may find itself in a liquidity trap, where interest rates hit zero, preventing further reductions in the interest rate. Similarly, countries that belong to currency unions, or states within a country, do not have the option of an independent monetary policy. Some economists advocate for fiscal policy to fill this void, increasing government spending to stimulate the economy. Others disagree, and the issue remains deeply controversial, as evidenced by vigorous debates on the magnitude of fiscal multipliers. No doubt, this situation stems partly from the lack of definitive empirical evidence, but, in our view, the absence of clear theoretical benchmarks also plays an important role. Although various recent contributions have substantially furthered our understanding, to date, the implications of standard macroeconomic models have not been fully worked out. This is the goal of our paper.
We solve for the response of the economy to changes in the path for government spending during liquidity traps or within currency unions using standard closed and open-economy monetary models. ...
Our results confirm that fiscal policy can be especially potent during a liquidity trap. The multiplier for output is greater than one. The mechanism for this result is that government spending promotes inflation. With fixed nominal interest rates, this reduces real interest rates which increases current spending. The increase in consumption in turn leads to more inflation, creating a feedback loop. The fiscal multiplier is increasing in the degree of price flexibility, which is intuitive given that the mechanism relies on the response of inflation. We show that backloading spending leads to larger effects; the rationale is that inflation then has more time to affect spending decisions.
In a currency union, by contrast, government spending is less effective at increasing output. We show that consumption is depressed, so that the multiplier is less than one. Moreover, price flexibility diminishes the effectiveness of spending, instead of increasing it. We explain this result using a simple argument that illustrates its robustness. Government spending leads to inflation in domestically produced goods and this loss in competitiveness depresses private spending. Applied to current debates in Europe, this highlights a possible tradeoff: putting off fiscal consolidation may postpone internal devaluations that actually help reactivate private spending.
It may seem surprising that fiscal multipliers are necessarily less than one whenever the exchange rate is fixed, because this contrasts sharply with the effects during liquidity traps. Our analytical approach allows us to uncover the crucial difference in monetary policy: although a fixed exchange rate implies a fixed nominal interest rate, the converse is not true. Indeed, we prove that the liquidity trap analysis implicitly combines a shock to government spending with a one-off devaluation. The positive response of consumption relies entirely on this devaluation. A currency union rules out such devaluations, explaining the negative response of consumption.
In the context of a currency union, our results uncover the importance of transfers from the outside, from other countries or regions. In the short run, when prices haven’t fully adjusted, positive transfers from the rest of the world increase the demand for home goods, stimulating output. We compute “transfer multipliers” that capture the response of the economy to transfers from the outside. We show that these multipliers may be large and depend crucially on the degree of openness of the domestic economy.
Outside transfers are often tied to government spending. In the United States, federal military spending allocated to a particular state is financed by the country as a whole. The same is true for exogenous differences in stimulus payments, due to idiosyncratic provisions in the law. Likewise, idiosyncratic portfolio returns accruing to a particular state’s coffers represent a windfall for this state against the rest. When changes in spending are financed by such outside transfers, the associated multipliers are a combination of self-financed multipliers and transfer multipliers. As a result, multipliers may be substantially larger than one.
Finally, we explore non-Ricardian effects of fiscal policy by introducing hand-to-mouth consumers. We think of this as a tractable way of modeling liquidity constraints. In both a liquidity trap and a currency union, government spending now has an additional stimulative effect: it increases the income and consumption of hand-to-mouth agents. This effect is largest when spending is deficit financed; indeed, the effects may in some cases depend entirely on deficits, not spending per se. Overall, although hand-to-mouth consumers introduce an additional effect, most of our conclusions, such as the comparison of fiscal multipliers in a liquidity trap and a currency union, are unaffected. ...
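The liquidity-trap feedback loop described above (spending raises inflation, inflation lowers the real rate when the nominal rate is stuck, and lower real rates raise private spending) can be captured in a deliberately static caricature. The functional forms and parameter values below are mine, not Farhi and Werning's; the sketch only reproduces the qualitative claims that the multiplier exceeds one at the zero bound and rises with price flexibility.

```python
# Static caricature of the liquidity-trap feedback loop -- a sketch under
# made-up functional forms, not Farhi and Werning's model.
#   IS curve:       y = -sigma * (i - pi - r_star) + g
#   Phillips curve: pi = kappa * y
# At the zero lower bound i = 0, so solving the two equations gives
#   dy/dg = 1 / (1 - sigma * kappa)   (requires sigma * kappa < 1)

sigma, r_star = 1.0, 0.0

for kappa in [0.1, 0.3, 0.5]:   # kappa indexes the degree of price flexibility
    multiplier = 1.0 / (1.0 - sigma * kappa)
    print(f"kappa = {kappa}: fiscal multiplier at the ZLB = {multiplier:.2f}")
# the multiplier exceeds one and rises with price flexibility, as in the paper
```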

Thursday, October 10, 2013

Have Blog, Will Travel

I am here for the next two days:

38th Annual Federal Reserve Bank of St. Louis Fall Conference

Thursday, October 10, 2013

8:45 – 9:00 am Opening Remarks
James Bullard, President, Federal Reserve Bank of St. Louis

Session I - Financial Markets 1

9:00 – 10:15 am "Trade Dynamics in the Market for Federal Funds"
Presenter:  Ricardo Lagos, New York University
Coauthor:  Gara Afonso, Federal Reserve Bank of New York
Discussant:  Huberto Ennis, Federal Reserve Bank of Richmond

10:45 am – 12:00 pm "Banks' Risk Exposures"
Presenter:  Martin Schneider, Stanford University
Coauthors:  Juliane Begenau, Stanford University and Monika Piazzesi, Stanford University
Discussant:  Hanno Lustig, University of California-Los Angeles

Session II: Monetary Policy and Macro Dynamics

1:00 – 2:15 pm "Unemployment and Business Cycles"
Presenter:  Martin S. Eichenbaum, Northwestern University
Coauthors:  Lawrence J. Christiano, Northwestern University and Mathias Trabandt, Board of Governors of the Federal Reserve System
Discussant:  Jaroslav Borovicka, New York University

2:45 – 4:00 pm "Conventional and Unconventional Monetary Policy in a Model with Endogenous Collateral Constraints"
Presenter:  Michael Woodford, Columbia University
Coauthors:  Aloísio Araújo, Getulio Vargas Foundation and Susan Schommer, Instituto Nacional de Matemática Pura e Aplicada
Discussant:  Stephen Williamson, Washington University

4:00 – 5:15 pm "Leverage Restrictions in a Business Cycle Model"
Presenter:  Lawrence J. Christiano, Northwestern University
Coauthor:  Daisuke Ikeda, Bank of Japan
Discussant:  Benjamin Moll, Princeton University

Friday, October 11, 2013

Session III: Financial Markets 2

9:00 – 10:15 am "Measuring the Financial Soundness of U.S. Firms, 1926—2012"
Presenter:  Andrew G. Atkeson, University of California-Los Angeles
Coauthors:  Andrea L. Eisfeldt, University of California-Los Angeles and Pierre-Olivier Weill, University of California-Los Angeles
Discussant:  Gian Luca Clementi, New York University

10:45 am – 12:00 pm "The I Theory of Money"
Presenter:  Markus K. Brunnermeier, Princeton University
Coauthor:  Yuliy Sannikov, Princeton University
Discussant:  Ed Nosal, Federal Reserve Bank of Chicago

Session IV: Households Lifecycle Behavior

1:00 – 2:15 pm "Is There 'Too Much' Inequality in Health Spending Across Income Groups?"
Presenter:  Larry E. Jones, University of Minnesota
Coauthors:  Laurence Ales, Carnegie Mellon University and Roozbeh Hosseini, Arizona State University
Discussant:  Selahattin İmrohoroğlu, University of Southern California

2:15 – 3:30 pm "Retirement, Home Production and Labor Supply Elasticities"
Presenter:  Richard Rogerson, Princeton University
Coauthor:  Johanna Wallenius, Stockholm School of Economics
Discussant:  Nancy Stokey, University of Chicago

Monday, October 07, 2013

'Uncertainty Shocks are Aggregate Demand Shocks'

More new research:

Uncertainty Shocks are Aggregate Demand Shocks, by Sylvain Leduc and Zheng Liu: Abstract We present empirical evidence and a theoretical argument that uncertainty shocks act like a negative aggregate demand shock, which raises unemployment and lowers inflation. We measure uncertainty using survey data from the United States and the United Kingdom. We estimate the macroeconomic effects of uncertainty shocks in a vector autoregression (VAR) model, exploiting the relative timing of the surveys and macroeconomic data releases for identification. Our estimation reveals that uncertainty shocks accounted for at least one percentage point of the increase in unemployment during the Great Recession and recovery, but did not contribute much to the 1981-82 recession. We present a DSGE model to show that, to understand the observed macroeconomic effects of uncertainty shocks, it is essential to have both labor search frictions and nominal rigidities.

New Research in Economics: The Return and Risk of Pursuing a BA

This is from Frank Levy at MIT:

I am attaching a paper co-authored with two former students that uses California higher ed data to make stylized calculations of the return and risk of pursuing a BA. The paper makes two main points.
Most studies of the rate of return to college use a best-case scenario in which students earn a degree with certainty in four years. More realistic calculations that account for students who take more than four years and students who drop out without a degree, etc. result in an average rate of return that is lower than it was in 2000 but still exceeds the interest rate on unsubsidized Stafford student loans – i.e. college remains a good investment by the normal criteria.
Most studies present an average rate of return without considering the investment’s risk. Over the last decade, rising tuition and deteriorating earnings for new college graduates (particularly at the bottom of the distribution) have increased the risk of pursuing a BA – e.g. the risk that a graduate at age 30 will have student loan payments that exceed 15% of their income. This growing risk is one explanation for increased skepticism about the value of a college degree despite the apparently high rate of return. It also underlines the importance of students becoming aware of the government’s income contingent loan repayment plans.
The paper is posted on SSRN.
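As a back-of-envelope illustration of the first point, the sketch below compares the internal rate of return in the best-case scenario with the expected return once longer completion times and dropout risk are folded in. Every number here (tuition, earnings premium, completion probabilities) is a hypothetical round figure of mine, not an estimate from the paper.

```python
# Hypothetical back-of-envelope: dropout risk and extra years in school
# lower the expected return to pursuing a BA. All numbers are invented
# round figures, not the paper's estimates.

def npv(cashflows, rate):
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows, lo=0.0, hi=1.0, tol=1e-6):
    """Bisection IRR for a conventional stream (costs first, then payoffs)."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(cashflows, mid) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

cost, premium = -20_000, 15_000           # annual tuition; annual BA earnings premium

best_case = [cost] * 4 + [premium] * 40   # degree in exactly four years
slow_case = [cost] * 6 + [premium] * 38   # two extra years of tuition
dropout   = [cost] * 2 + [0] * 42         # pays costs, never earns the premium

p4, p6, pd = 0.5, 0.3, 0.2                # hypothetical completion probabilities
expected = [p4 * a + p6 * b + pd * c
            for a, b, c in zip(best_case, slow_case, dropout)]

print(f"best-case IRR: {irr(best_case):.1%}")
print(f"expected IRR:  {irr(expected):.1%}")   # lower once risk is accounted for
```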

Friday, October 04, 2013

Minimum Wages and Job Growth: A Statistical Artifact

Arin Dube:

Minimum Wages and Job Growth: A Statistical Artifact, by Arin Dube: In a recent paper, Jonathan Meer and Jeremy West argue that it takes time for employment to adjust in response to a minimum wage hike, making it more difficult to detect an impact by looking at employment levels. In contrast, they argue, the impact is easier to discern when considering employment growth. They find that a 10 percent increase in the minimum wage is associated with as much as 0.5 percentage point lower aggregate employment growth. These estimates are very large, as John Schmitt explains in a recent post, and far outside the range in the existing literature. But are they right?
As I show in a new paper, the short answer is: no. The negative association between job growth and minimum wages is in the wrong place: it shows up in a sector like manufacturing that has few minimum wage workers, but is absent in low-wage sectors like food services and retail. In other words, it is likely a statistical artifact, and not a causal relationship...

Friday, September 27, 2013

Have Blog, Will Travel

I am here today:

Finance and the Wealth of Nations Workshop
Federal Reserve Bank of San Francisco
& The Institute of New Economic Thinking

9:00AM - 9:45AM: David Scharfstein and Robin Greenwood (Harvard Business School), “The Growth of Finance”, Discussant: Bradford DeLong (UC Berkeley)

9:45AM-10:30AM: Ariell Reshef (Virginia) and Thomas Philippon (NYU-Stern), “An International Look at the Growth of Modern Finance”, Discussant: Charles Jones (Stanford GSB)

10:45AM-11:30AM: Andrea Eisfeldt (UCLA-Anderson), Andrew Atkeson (UCLA) and Pierre-Olivier Weill (UCLA), “The Financial Soundness of U.S. Firms 1926–2011: Financial Frictions and the Business Cycle”, Discussant: Jonathan Rose (Federal Reserve Board)

11:30AM-12:15PM: Ross Levine (UC Berkeley-Haas), Yona Rubenstein (LSE), “Liberty for More: Finance and Educational Opportunities”, Discussant: Gregory Clark (UC Davis)

1:30PM-2:15PM: Atif Mian (Princeton), Amir Sufi (U. Chicago-Booth), “The Effect Of Interest Rate And Collateral Value Shocks On Household Spending: Evidence from Mortgage Refinancing”, Discussant: Reuven Glick (SF Fed)

2:15PM-3:00PM: Maurice Obstfeld (UC Berkeley), “Finance at Center Stage: Some Lessons of the Euro Crisis”, Discussant: Giovanni dell'Ariccia (IMF)

3:00PM-3:45PM: Stephen G. Cecchetti and Enisse Kharroubi (BIS), “Why Does Financial Sector Growth Crowd Out Real Economic Growth?”, Discussant: Barry Eichengreen (UC Berkeley) 

4:00PM-4:45PM: Thorsten Beck (Tilburg), “Financial Innovation: The Bright and the Dark Sides”, Discussant: Sylvain Leduc (SF Fed)

4:45PM-5:30PM: Alan M. Taylor (UC Davis), Òscar Jordà (SF Fed/UC Davis), Moritz Schularick (Bonn), “Sovereigns versus Banks: Crises, Causes and Consequences”, Discussant: Aaron Tornell (UCLA)

6:15PM: Keynote Speaker, Introduction: John Williams (SF Fed, President), Lord Adair Turner (INET, Senior Fellow; former Chairman of the UK Financial Services Authority), "Credit, Money and Leverage"

Thursday, September 12, 2013

New Research in Economics: Rational Bubbles

New research on rational bubbles from George Waters:

Dear Mark,

I’d like to take you up on your offer to publicize research. I’ve spent a good chunk of my time (along with Bill Parke) over the last decade developing an asset price model with heterogeneous expectations, where agents are allowed to adopt a forecast based on a rational bubble.

The idea of a rational bubble has been around for quite a while, but there has been little effort to explain how investors would coordinate on such a forecast when there is a perfectly good alternative forecast based on fundamentals. In our model agents are not assumed to use either forecast but are allowed to switch between forecasting strategies based on past performance, according to an evolutionary game theory dynamic.

The primary theoretical point is to provide conditions where agents coordinate on the fundamental forecast in accordance with the strong version of the efficient markets hypothesis. However, it is quite possible that agents do not always coordinate on the fundamental forecast, and there are periods of time when a significant fraction of agents adopt a bubble forecast. There are obvious implications for models that assume a unique rational expectations equilibrium.

A more practical goal is to model the endogenous formation and collapse of bubbles. Bubbles form when there is a fortuitous correlation between some extraneous information and the fundamentals, and agents are sufficiently aggressive about switching to better performing strategies. Bubbles always collapse due to the presence of a small fraction of agents who do not abandon fundamentals, and the presence of a reflective forecast, a weighted average of the other two forecasts, which is the rational forecast in the presence of heterogeneity.

There are strong empirical implications. The asset price is not forecastable, so the weak version of the efficient markets hypothesis is satisfied. Simulated data from the model shows excess persistence and variance in the asset price and ARCH effects and long memory in the returns.

There is much more work to be done to connect the approach to the literature on the empirical detection of bubbles, and to develop models with dynamic switching between heterogeneous strategies in more sophisticated macro models.

A theoretical examination of the model is forthcoming in Macroeconomic Dynamics.

A more user-friendly exposition of the model and the empirical implications is here.

An older published paper (Journal of Economic Dynamics and Control 31(7)) focuses on ARCH effects and long memory.

Dr. George Waters
Associate Professor of Economics
Illinois State University
http://www.econ.ilstu.edu/gawater/
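The switching mechanism Waters describes can be illustrated with a toy model in the Brock-Hommes discrete-choice tradition; to be clear, this is my sketch, not the Parke-Waters model. Agents move between a fundamental forecast and an extrapolative "bubble" forecast based on recent squared forecast errors, and the price reflects the prevailing mix:

```python
import numpy as np

# Toy discrete-choice switching between a fundamental forecast and an
# extrapolative "bubble" forecast (Brock-Hommes flavor, NOT the
# Parke-Waters model). All parameters and functional forms are made up.

rng = np.random.default_rng(1)
T, beta = 300, 2.0              # beta: how aggressively agents switch strategies
fundamental = 100.0
price = np.full(T, fundamental)
share_bubble = np.zeros(T)
perf_f = perf_b = 0.0           # recent squared forecast errors (lower = better)

for t in range(2, T):
    forecast_f = fundamental
    forecast_b = price[t-1] + (price[t-1] - price[t-2])   # trend extrapolation
    # numerically stable logit choice between the two forecasts
    m = max(-beta * perf_b, -beta * perf_f)
    w_b, w_f = np.exp(-beta * perf_b - m), np.exp(-beta * perf_f - m)
    n_b = w_b / (w_b + w_f)
    share_bubble[t] = n_b
    price[t] = (1 - n_b) * forecast_f + n_b * forecast_b + rng.normal(0, 1)
    # short-memory performance updating drives the endogenous switching
    perf_f = 0.9 * perf_f + 0.1 * (price[t] - forecast_f) ** 2
    perf_b = 0.9 * perf_b + 0.1 * (price[t] - forecast_b) ** 2

print(f"peak share on the bubble forecast: {share_bubble.max():.2f}")
print(f"price range around a fundamental of 100: "
      f"{price.min():.1f} to {price.max():.1f}")
```

Runs of good luck for the extrapolative rule pull agents onto it and push the price away from fundamentals; a bad stretch of forecast errors then sends them back, collapsing the bubble.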

Sunday, September 01, 2013

'Limited Time Offer! Temporary Sales and Price Rigidities'

Are prices rigid? (For background and a discussion of previous evidence on price rigidity at both aggregated and disaggregated levels, see this post.) This is via Carola Binder:

Limited Time Offer! Temporary Sales and Price Rigidities: Even though prices change frequently, this does not necessarily mean that prices are very flexible, according to a new paper by Eric Anderson, Emi Nakamura, Duncan Simester, and Jón Steinsson. In "Informational Rigidities and the Stickiness of Temporary Sales," these authors note that it is important to distinguish temporary sales from regular price changes when analyzing the frequency of price adjustment and the response of prices to macroeconomic shocks.
"The literature on price rigidity can be divided into a literature on "sticky prices" and a literature on "sticky information" (which gives rise to sticky plans). A key question in interpreting the extremely high frequencies of price change observed in retail price data is whether these frequent price changes facilitate rapid responses to changing economic conditions, or whether some of these price changes are part of “sticky plans” that are determined substantially in advance and therefore not responsive to changing conditions. ...
They describe some interesting institutional features of temporary sales and promotions...
They conclude that regular (non-sale) prices exhibit stickiness, while temporary sale prices follow "sticky plans" that are relatively unresponsive in the short run to macroeconomic shocks:
"Our analysis suggests that regular prices are sticky prices that change infrequently but are responsive to macroeconomic shocks, such as the rapid run-up and decline of oil prices. In contrast, temporary sales follow sticky plans. These plans include price discounts of varying depth and frequency across products. But, the plans themselves are relatively unresponsive in the near term to macroeconomic shocks. We believe that this characterization of regular and sale prices as sticky prices versus sticky plans substantially advances an ongoing debate about the extent of retail price fluctuations and offers deeper insight into how retail prices adjust in response to macroeconomic shocks."

Monday, August 19, 2013

'Making Do With Less: Working Harder During Recessions'

New paper:

Making Do With Less: Working Harder During Recessions, by Edward P. Lazear, Kathryn L. Shaw, Christopher Stanton, NBER Working Paper No. 19328 Issued in August 2013: There are two obvious possibilities that can account for the rise in productivity during recent recessions. The first is that the decline in the workforce was not random, and that the average worker was of higher quality during the recession than in the preceding period. The second is that each worker produced more while holding worker quality constant. We call the second effect, “making do with less,” that is, getting more effort from fewer workers. Using data spanning June 2006 to May 2010 on individual worker productivity from a large firm, it is possible to measure the increase in productivity due to effort and sorting. For this firm, the second effect—that workers’ effort increases—dominates the first effect—that the composition of the workforce differs over the business cycle.

Friday, July 26, 2013

How Anti-Poverty Programs Go Viral

This is a summary of research by Esther Duflo, Abhijit Banerjee, Arun Chandrasekhar, and Matthew Jackson on the spread of information about government programs through social networks:

How anti-poverty programs go viral, by Peter Dizikes, MIT News Office: Anti-poverty researchers and policymakers often wrestle with a basic problem: How can they get people to participate in beneficial programs? Now a new empirical study co-authored by two MIT development economists shows how much more popular such programs can be when socially well-connected citizens are the first to know about them.
The economists developed a new measure of social influence that they call “diffusion centrality.” Examining the spread of microfinance programs in rural India, the researchers found that participation in the programs increases by about 11 percentage points when well-connected local residents are the first to gain access to them.
“According to our model, when someone with high diffusion centrality receives a piece of information, it will spread faster through the social network,” says Esther Duflo, the Abdul Latif Jameel Professor of Poverty Alleviation at MIT. “It could thus be a guide for an organization that tries to [place] a piece of information in a network.”
The researchers specifically wanted to study how knowledge about a program spreads by word of mouth, MIT professor Abhijit Banerjee says, because “while there was a body of elegant theory on the relation between what the network looks like and the speed of transmission of information, there was little empirical work on the subject.”
The paper, titled “The Diffusion of Microfinance,” is published today in the journal Science. ...
Microfinance is the term for small-scale lending, popularized in the 1990s, that can help relatively poor people in developing countries gain access to credit they would not otherwise have. The concept has been the subject of extensive political debate; academic researchers are still exploring its effects across a range of economic and geographic settings.
“Microfinance is the type of product which is very interesting to study,” Duflo says, “because in many cases it won’t be well known, and hence there is a role for information diffusion.” Moreover, she notes, “It is also the kind of product on which people could have strongly held … opinions.” So, she says, understanding the relationship between social structure and adoption could be particularly important.
Other scholars believe the findings are valuable. Lori Beaman, an economist at Northwestern University, says the paper “significantly moves forward our understanding of how social networks influence people’s decision-making,” and suggests that the work could spur other on-the-ground research projects that study community networks in action.
“I think this work will lead to more innovative research on how social networks can be used more effectively in promoting poverty alleviation programs in poor countries,” adds Beaman... “Other areas would include agricultural technology adoption … vaccinations for children, [and] the use of bed nets [to prevent malaria], to name just a few.”  ...
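For the network-minded: as I read the paper, diffusion centrality has a compact form. With adjacency matrix A, passing probability q, and T communication rounds, a node's score is the corresponding entry of (sum over t = 1..T of (qA)^t) times the all-ones vector -- roughly, the expected number of times others hear news originating from that node. A sketch on a small invented network:

```python
import numpy as np

# Diffusion centrality, as I read the paper's definition: with adjacency
# matrix A, passing probability q, and T rounds, a node's score is the
# corresponding entry of (sum_{t=1..T} (qA)^t) @ ones. The five-node
# network and parameter values below are invented for illustration.

def diffusion_centrality(A, q, T):
    n = A.shape[0]
    total = np.zeros((n, n))
    P = np.eye(n)
    for _ in range(T):
        P = P @ (q * A)   # accumulates (qA)^t
        total += P
    return total @ np.ones(n)

# a small hub-and-spoke network: node 0 is the well-connected "injection point"
A = np.array([[0, 1, 1, 1, 0],
              [1, 0, 0, 0, 0],
              [1, 0, 0, 0, 0],
              [1, 0, 0, 0, 1],
              [0, 0, 0, 1, 0]], dtype=float)

dc = diffusion_centrality(A, q=0.5, T=3)
print({node: round(score, 2) for node, score in enumerate(dc)})
# node 0, the hub, scores highest -- seeding it first spreads news fastest
```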

Thursday, June 06, 2013

Orphanides and Wieland: Complexity and Monetary Policy

A paper I need to read:

Complexity and Monetary Policy, by Athanasios Orphanides and Volker Wieland, CFS Working Paper: Abstract The complexity resulting from intertwined uncertainties regarding model misspecification and mismeasurement of the state of the economy defines the monetary policy landscape. Using the euro area as a laboratory, this paper explores the design of robust policy guides aiming to maintain stability in the economy while recognizing this complexity. We document substantial output gap mismeasurement and make use of a new model database to capture the evolution of model specification. A simple interest rate rule is employed to interpret ECB policy since 1999. An evaluation of alternative policy rules across 11 models of the euro area confirms the fragility of policy analysis optimized for any specific model and shows the merits of model averaging in policy design. Interestingly, a simple difference rule with the same coefficients on inflation and output growth as the one used to interpret ECB policy is quite robust as long as it responds to current outcomes of these variables.

Wednesday, May 29, 2013

'Inflation in the Great Recession and New Keynesian Models'

DSGE models are "surprisingly accurate":

Inflation in the Great Recession and New Keynesian Models, by Marco Del Negro, Marc P. Giannoni, and Frank Schorfheide: It has been argued that existing DSGE models cannot properly account for the evolution of key macroeconomic variables during and following the recent Great Recession, and that models in which inflation depends on economic slack cannot explain the recent muted behavior of inflation, given the sharp drop in output that occurred in 2008-09. In this paper, we use a standard DSGE model available prior to the recent crisis and estimated with data up to the third quarter of 2008 to explain the behavior of key macroeconomic variables since the crisis. We show that as soon as the financial stress jumped in the fourth quarter of 2008, the model successfully predicts a sharp contraction in economic activity along with a modest and more protracted decline in inflation. The model does so even though inflation remains very dependent on the evolution of both economic activity and monetary policy. We conclude that while the model considered does not capture all short-term fluctuations in key macroeconomic variables, it has proven surprisingly accurate during the recent crisis and the subsequent recovery. [pdf]

Saturday, May 18, 2013

New Research in Economics: Self-interest vs. Greed and the Limitations of the Invisible Hand

This is from Matt Clements, Associate Professor and Chair of the Economics Department at St. Edward’s University:

Dear Professor Thoma,
Allow me to add to the flood of responses you have no doubt received to your offer to help publicize your readers’ research. The paper is called "Self-interest vs. Greed and the Limitations of the Invisible Hand," forthcoming in the American Journal of Economics and Sociology (pdf of the final version). The point of the paper is that greed, as opposed to enlightened self-interest, can be destructive. Markets always operate within some framework of laws and enforcement, and the claim that greed is good implicitly assumes that the legal framework is essentially perfect. To the extent that laws are suboptimal and enforcement is imperfect, greed can easily enrich some market participants at the expense of total surplus. All of this seemed sufficiently obvious to me that at first I wondered if the paper was even worth writing, but the referees were surprisingly difficult to convince.

Thursday, May 16, 2013

New Research in Economics: Terrorism and the Macroeconomy: Evidence from Pakistan

This is from Sultan Mehmood. The article appears in the May edition of Defence and Peace Economics, which the author describes as "a highly specialized journal on conflict":

Terrorism and the Macroeconomy: Evidence from Pakistan, by Sultan Mehmood, Defence and Peace Economics, May 2013: Summary: The study evaluates the macroeconomic impact of terrorism in Pakistan by utilizing terrorism data for around 40 years. Standard time-series methodology allows us to distinguish between short- and long-run effects, and it also avoids the aggregation problems of cross-country studies. The study is also one of the few that focuses on evaluating the impact of terrorism on a developing country. The results show that cumulatively terrorism has cost Pakistan around 33.02% of its real national income over the sample period.
Motivation: Studies on the impact of terrorism on the economy have exclusively focused on developed countries (see e.g. Eckstein and Tsiddon, 2004). This is surprising because developing countries are not only hardest hit by terrorism, but are also more responsive to external shocks. Terrorism in Pakistan, with a magnitude greater than that of Israel, Greece, Turkey, Spain and the USA combined in terms of incidents and death count, has consistently hit news headlines across the world. Yet, terrorism in Pakistan has received relatively little academic attention.
The case of Pakistan is unique for studying the impact of terrorism on the economy for a number of reasons. Firstly, Pakistan has a long and intense history of terrorism, which allows one to capture the effect on the economy in the long run. Secondly, growth-retarding effects of terrorism are hypothesized to be more pronounced in developing rather than developed countries (Frey et al., 2007). Thirdly, the Pakistani economy is exceptionally vulnerable to external shocks, with 12 IMF programmes during 1990-2007 (IMF, 2010, 2011). Lastly, a case study of terrorism for a developing or least developed country is yet to be done. Scholars of the Copenhagen Consensus studying terrorism note the ‘need for additional case studies, especially of developing countries’ (Enders and Sandler, 2006, p. 31). This research attempts to fill this void.
Main Results: The results of the econometric investigation suggest that terrorism has cost Pakistan around 33.02% of its real national income over the sample time period of 1973–2008, with the adverse impact mainly stemming from a fall in domestic investment and lost workers’ remittances from abroad. This averages to a per annum loss of around 1% of real GDP per capita growth. Moreover, estimates from a Vector Error Correction Model (VECM) show that terrorism impacts the economy primarily through medium- and long-run channels. The article also finds that the negative effect of terrorism lasts for at least two years for most of the macroeconomic variables studied, with the adverse effect on worker remittances, a hitherto ignored factor, lasting for five years. The results are robust to different lag length structures, policy variables, structural breaks and stability tests. Furthermore, it is shown that they are unlikely to be driven by omitted variables, or [Granger type] reverse causality.
Hence, the article finds evidence that terrorism, particularly in emerging economies, might pose significant macroeconomic costs.
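A quick arithmetic check on the headline numbers (simple averaging, not the paper's VECM):

```python
# Back-of-envelope check: a cumulative loss of 33.02% of real national
# income over the 1973-2008 sample averages to roughly 1% per year.
# (Simple averaging only -- this is not the paper's VECM calculation.)

cumulative_loss = 0.3302
years = 2008 - 1973 + 1   # 36 years in the sample

simple_avg = cumulative_loss / years
compound_avg = 1 - (1 - cumulative_loss) ** (1 / years)

print(f"simple average:   {simple_avg:.2%} per year")    # about 0.92%
print(f"compound average: {compound_avg:.2%} per year")  # about 1.11%
```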

New Research in Economics: Robust Stability of Monetary Policy Rules under Adaptive Learning

I have had several responses to my offer to post write-ups of new research (thanks!), and I'll be posting them over the next few days, but I thought I'd start with a forthcoming paper from a former graduate student here at the University of Oregon, Eric Gaus:

Robust Stability of Monetary Policy Rules under Adaptive Learning, by Eric Gaus, forthcoming, Southern Economic Journal: Adaptive learning has been used to assess the viability of a variety of monetary policy rules. If agents using simple econometric forecasts "learn" the rational expectations solution of a theoretical model, then researchers conclude the monetary policy rule is a viable alternative. For example, Duffy and Xiao (2007) find that if monetary policy makers minimize a loss function of inflation, interest rates, and the output gap, then agents in a simple three-equation model of the macroeconomy learn the rational expectations solution. On the other hand, Evans and Honkapohja (2009) demonstrate that this may not always be the case. The key difference between the two papers is an assumption about what information the agents of the model have access to. Duffy and Xiao (2007) assume that monetary policy makers have access to contemporaneous variables, that is, they adjust interest rates to current inflation and output. Evans and Honkapohja (2009) instead assume that agents can only form expectations of contemporaneous variables. Another difference between these two papers is that in Duffy and Xiao (2007) agents use all the past data they have access to, whereas in Evans and Honkapohja (2009) agents use a fixed window of data.
This paper examines several different monetary policy rules under a learning mechanism that changes how much data agents are using. It turns out that as long as the monetary policy makers are able to see contemporaneous endogenous variables (output and inflation) then the Duffy and Xiao (2007) results hold. However, if agents and policy makers use expectations of current variables then many of the policy rules are not "robustly stable" in the terminology of Evans and Honkapohja (2009).
A final result in the paper is that the switching learning mechanism can create unpredictable temporary deviations from rational expectations. This is a rather startling result, since the source of the deviations is completely endogenous. The deviations appear in a model with no structural breaks, no multiple equilibria, and no intention of generating such deviations. This result suggests that policymakers should be concerned with the potential that expectations, and expectations alone, can create exotic behavior that temporarily strays from the rational expectations equilibrium (REE).
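The decreasing-gain versus fixed-window (constant-gain) distinction is easy to see in the simplest possible setting: estimating a constant mean. The sketch below is mine, not Gaus's model; it only illustrates why a constant gain keeps expectations perpetually responsive to new data, which is the raw material for the endogenous deviations described above.

```python
import numpy as np

# Decreasing-gain learning (use all past data) versus constant-gain
# learning (effectively a fixed window), in the simplest setting:
# estimating a constant mean. Illustrative only -- not Gaus's model.

rng = np.random.default_rng(2)
T, true_mean, gain = 500, 2.0, 0.05
data = true_mean + rng.normal(0, 1, T)

dec = con = 0.0
con_path = []
for t, x in enumerate(data, start=1):
    dec += (1 / t) * (x - dec)   # decreasing gain 1/t: the running sample mean
    con += gain * (x - con)      # constant gain: fixed weight on new data
    con_path.append(con)

print(f"decreasing gain, final estimate: {dec:.3f}")   # settles near 2.0
print(f"constant gain, std of last 250 estimates: {np.std(con_path[250:]):.3f}")
# the constant-gain estimate never settles down -- it stays responsive forever
```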

Wednesday, May 15, 2013

Help Me Publicize Your Research

The previous post reminds me of an offer I've been meaning to make to try to help to publicize academic research:

If you have a paper that is about to be published in an economics journal (or was recently published), send me a summary of the research explaining the findings, the significance of the work, etc., and I'd be happy to post the write-up here. It can be micro, macro, econometrics, any topic at all, but I'm hoping for something that goes beyond a mere echo of the abstract, and I want to avoid research not yet accepted for publication (so I don't have to make a judgment on the quality of the research -- I don't always have the time to read papers carefully, and they may not be in my area of expertise).

Homeowners Do Not Increase Consumption When Their Housing Prices Increase?

New and contrary results on the wealth effect for housing:

Homeowners do not increase consumption despite their property rising in value, EurekAlert: Although the value of our property might rise, we do not increase our consumption. This is the conclusion of economists from the University of Copenhagen and the University of Oxford in new research that runs contrary to the widely held assumption among economists that a rise in house prices naturally leads to a rise in consumption. The results of the study are published in The Economic Journal.
"We argue that leading economists should not wholly be focused on monitoring the housing market. Economists are closely watching the developments on the housing market with the expectation that house prices and household consumption tend to move in tandem, but this is not necessarily the case," says Professor of Economics at University of Copenhagen, Søren Leth-Petersen.
Søren Leth-Petersen has, alongside Professor Martin Browning of the University of Oxford and Associate Professor Mette Gørtz of the University of Copenhagen, tested this widespread 'wealth effect' assumption and concluded that the effect is insignificant.
Søren Leth-Petersen explains that when economists invoke the 'wealth effect,' the presumption is that older homeowners will adjust their consumption the most when house prices change, whilst younger homeowners will adjust their consumption the least. However, according to this research, most homeowners do not feel richer as their housing wealth rises.
"Our research shows that homeowners aged 45 and over, do not increase their consumption significantly when the value of their property goes up, and this goes against the theory of 'wealth effect'. Thus, we are able to reject the theory as the connecting link between rising house prices and increased consumption," explains Søren Leth-Petersen. ...
The research shows that homeowners aged 45 and over did not react significantly to the rise in house prices. However, the younger homeowners, who are typically short of finances, took the opportunity to take out additional consumption loans when given the chance. ...

Tuesday, April 16, 2013

'How Much Unemployment Was Caused by Reinhart and Rogoff's Arithmetic Mistake?'

The work of Reinhart and Rogoff was a major reason for the push for austerity at a time when expansionary policy was called for, i.e. their work supported the bad idea that austerity during a recession can actually be stimulative. It isn't, as events in Europe have conclusively shown.

To be fair, as I discussed here (in "Austerity Can Wait for Sunnier Days") after watching Reinhart give a talk on this topic at an INET conference, she didn't assert that contractionary policy was somehow expansionary (i.e. she did not claim the confidence fairy would more than offset the negative short-run effects of austerity). What she asserted is that pain now -- austerity -- can avoid even more pain down the road in the form of lower economic growth.

Here's the problem. She is right that austerity causes pain in the short-run. But according to the review of her work with Rogoff discussed below, the lower growth from debt levels above 90 percent that austerity is supposed to avoid appears to be largely the result of errors in the research. In fact, there is no substantial growth penalty from high debt levels, and hence not much gain from short-run austerity.

Here's Dean Baker with a rundown on the new work (see also Mike Konczal who helped to shed light on this research):

How Much Unemployment Was Caused by Reinhart and Rogoff's Arithmetic Mistake?, by Dean Baker: That's the question millions will be asking when they see the new paper by my friends at the University of Massachusetts, Thomas Herndon, Michael Ash, and Robert Pollin. Herndon, Ash, and Pollin (HAP) corrected the spreadsheets of Carmen Reinhart and Ken Rogoff. They show the correct numbers tell a very different story about the relationship between debt and GDP growth than the one that Reinhart and Rogoff have been hawking.
Just to remind folks, Reinhart and Rogoff (R&R) are the authors of the widely acclaimed book on the history of financial crises, This Time is Different. They have also done several papers derived from this research, the main conclusion of which is that high ratios of debt to GDP lead to long periods of slow growth. Their story line is that 90 percent is a cutoff line, with countries with debt-to-GDP ratios above this level seeing markedly slower growth than countries that have debt-to-GDP ratios below this level. The moral is to make sure the debt-to-GDP ratio does not get above 90 percent.
There are all sorts of good reasons for questioning this logic. First, there is good reason for believing causation goes the other way. Countries are likely to have high debt-to-GDP ratios because they are having serious economic problems.
Second, as Josh Bivens and John Irons have pointed out, the story of the bad growth in high debt years in the United States is driven by the demobilization after World War II. In other words, these were not bad economic times; the years of high debt in the United States had slow growth because millions of women opted to leave the paid labor force.
Third, the whole notion of public debt turns out to be ill-defined. ...
But HAP tells us that we need not concern ourselves with any arguments this complicated. The basic R&R story was simply the result of them getting their own numbers wrong.
After being unable to reproduce R&R's results with publicly available data, HAP were able to get the spreadsheets that R&R had used for their calculations. It turns out that the initial results were driven by simple computational and transcription errors. The most important of these errors was excluding four years of growth data from New Zealand in which it was above the 90 percent debt-to-GDP threshold. Correcting this one mistake alone adds 1.5 percentage points to the average growth rate for the high-debt countries. This eliminates most of the falloff in growth that R&R find from high debt levels. (HAP find several other important errors in the R&R paper; however, the missing New Zealand years are the biggest part of the story.)
This is a big deal because politicians around the world have used this finding from R&R to justify austerity measures that have slowed growth and raised unemployment. In the United States many politicians have pointed to R&R's work as justification for deficit reduction even though the economy is far below full employment by any reasonable measure. In Europe, R&R's work and its derivatives have been used to justify austerity policies that have pushed the unemployment rate over 10 percent for the euro zone as a whole and above 20 percent in Greece and Spain. In other words, this is a mistake that has had enormous consequences.
In fairness, there has been other research that makes similar claims, including more recent work by Reinhart and Rogoff. But it was the initial R&R papers that created the framework for most of the subsequent policy debate. And HAP have shown that the key finding that debt slows growth was driven overwhelmingly by the exclusion of four years of data from New Zealand.
If facts mattered in economic policy debates, this should be the cause for a major reassessment of the deficit reduction policies being pursued in the United States and elsewhere. It should also cause reporters to be a bit slower to accept such sweeping claims at face value.
(Those interested in playing with the data itself can find it at the website for the Political Economic Research Institute.)
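The arithmetic at issue is nothing more than a bin average, so the correction is easy to illustrate. In the toy calculation below every number is invented to make the point -- the actual spreadsheets are at the PERI link above:

```python
# Toy version of the HAP correction: the average growth rate in the
# high-debt (>90% of GDP) bin, with and without some excluded
# country-years. All figures are invented for illustration.
import pandas as pd

obs = pd.DataFrame({
    "country":  ["NZ", "NZ", "NZ", "NZ", "UK", "US", "Belgium"],
    "debt_gdp": [120, 115, 110, 105, 95, 121, 99],      # all above 90%
    "growth":   [7.7, 2.5, 3.9, 4.4, 2.4, -2.0, 3.0],   # made-up rates
    "included": [False, False, False, False, True, True, True],
})

dropped = obs.loc[obs["included"], "growth"].mean()
corrected = obs["growth"].mean()
print(f"average growth, NZ years excluded: {dropped:.2f}%")
print(f"average growth, all years:         {corrected:.2f}%")
```

Small spreadsheet choices about which rows enter the bin can move the bin average by a percentage point or more, which is the whole dispute in miniature.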

Update: Reinhart-Rogoff Response to Critique - WSJ.

Monday, March 18, 2013

Trickle-Down Consumption

Robert Frank, who has long argued that such effects exist, will like the results in this paper from the NBER:

Trickle-Down Consumption, by Marianne Bertrand and Adair Morse, NBER Working Paper No. 18883, Issued in March 2013 [open link]: Have rising income and consumption at the top of the income distribution since the early 1980s induced households in the lower tiers of the distribution to consume a larger share of their income? Using state-year variation in income level and consumption in the top first quintile or decile of the income distribution, we find evidence for such “trickle-down consumption.” The magnitude of the effect suggests that middle income households would have saved between 2.6 and 3.2 percent more by the mid-2000s had incomes at the top grown at the same rate as median income. Additional tests argue against permanent income, upwardly-biased expectations of future income, home equity effects and upward price pressures as the sole explanations for this finding. Instead, we show that middle income households’ consumption of more income elastic and more visible goods and services appears particularly responsive to top income levels, consistent with supply-driven demand and status-driven explanations for our primary finding. Non-rich households exposed to higher top income levels self-report more financial duress; moreover, higher top income levels are predictive of more personal bankruptcy filings. Finally, focusing on housing credit legislation, we suggest that the political process may have internalized and facilitated such trickle-down consumption.

Here's a nice discussion of the work from Chrystia Freeland (and why it will make Robert Frank happy): Trickle-down consumption.

Friday, March 15, 2013

Journal News (BE Journal of Theoretical Economics)

Resignations at the BE Journal of Theoretical Economics

Friday, March 08, 2013

Measuring the Effect of the Zero Lower Bound on Medium- and Longer-Term Interest Rates

Watching John Williams give this paper:

Measuring the Effect of the Zero Lower Bound on Medium- and Longer-Term Interest Rates, by Eric T. Swanson and John C. Williams, Federal Reserve Bank of San Francisco, January 2013: Abstract The federal funds rate has been at the zero lower bound for over four years, since December 2008. According to many macroeconomic models, this should have greatly reduced the effectiveness of monetary policy and increased the efficacy of fiscal policy. However, standard macroeconomic theory also implies that private-sector decisions depend on the entire path of expected future short-term interest rates, not just the current level of the overnight rate. Thus, interest rates with a year or more to maturity are arguably more relevant for the economy, and it is unclear to what extent those yields have been constrained. In this paper, we measure the effects of the zero lower bound on interest rates of any maturity by estimating the time-varying high-frequency sensitivity of those interest rates to macroeconomic announcements relative to a benchmark period in which the zero bound was not a concern. We find that yields on Treasury securities with a year or more to maturity were surprisingly responsive to news throughout 2008–10, suggesting that monetary and fiscal policy were likely to have been about as effective as usual during this period. Only beginning in late 2011 does the sensitivity of these yields to news fall closer to zero. We offer two explanations for our findings: First, until late 2011, market participants expected the funds rate to lift off from zero within about four quarters, minimizing the effects of the zero bound on medium- and longer-term yields. Second, the Fed’s unconventional policy actions seem to have helped offset the effects of the zero bound on medium- and longer-term rates.
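The measurement idea is a rolling event-study regression: regress daily changes in a given yield on standardized announcement surprises, and track the slope over time. A schematic version on simulated data (the break date and magnitudes below are invented, not estimates):

```python
# Schematic version of a time-varying sensitivity measure: rolling OLS of
# daily yield changes on announcement surprises. Simulated data only.
import numpy as np

rng = np.random.default_rng(2)
T, window = 1000, 100
surprise = rng.normal(size=T)                        # announcement surprises
beta_true = np.where(np.arange(T) < 700, 1.0, 0.1)   # sensitivity falls late
dy = beta_true * surprise + rng.normal(scale=0.5, size=T)  # yield changes, bp

betas = [np.polyfit(surprise[t - window:t], dy[t - window:t], 1)[0]
         for t in range(window, T)]                  # rolling OLS slopes
print("early-sample sensitivity:", round(float(np.mean(betas[:200])), 2))
print("late-sample sensitivity: ", round(float(np.mean(betas[-100:])), 2))
```

A slope near its benchmark value says the zero bound is not yet binding at that maturity; a slope near zero says it is -- which is how the paper dates the change to late 2011.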

Tuesday, March 05, 2013

'Are Sticky Prices Costly? Evidence From The Stock Market'

There has been a debate in macroeconomics over whether sticky prices -- the key feature of New Keynesian models -- are actually as sticky as assumed, and how large the costs associated with price stickiness actually are. This paper finds "evidence that sticky prices are indeed costly":

Are Sticky Prices Costly? Evidence From The Stock Market, by Yuriy Gorodnichenko and Michael Weber, NBER Working Paper No. 18860, February 2013 [open link]: We propose a simple framework to assess the costs of nominal price adjustment using stock market returns. We document that, after monetary policy announcements, the conditional volatility rises more for firms with stickier prices than for firms with more flexible prices. This differential reaction is economically large as well as strikingly robust to a broad array of checks. These results suggest that menu costs---broadly defined to include physical costs of price adjustment, informational frictions, etc.---are an important factor for nominal price rigidity. We also show that our empirical results are qualitatively and, under plausible calibrations, quantitatively consistent with New Keynesian macroeconomic models where firms have heterogeneous price stickiness. Since our approach is valid for a wide variety of theoretical models and frictions preventing firms from price adjustment, we provide "model-free" evidence that sticky prices are indeed costly.
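The empirical design is essentially a sort-and-compare exercise: classify firms by how often they adjust prices, then compare conditional return volatility on announcement days across the groups. A stylized sketch with simulated returns (the groups, calendar, and magnitudes are all invented):

```python
# Sort-and-compare sketch: return volatility on policy announcement days
# for sticky- versus flexible-price firms. Simulated panel; illustrative.
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)
n_firms, n_days = 200, 500
announcement = np.zeros(n_days, dtype=bool)
announcement[::45] = True                      # stylized announcement calendar

sticky = rng.random(n_firms) < 0.5             # which firms have rigid prices
vol = np.where(announcement[None, :],
               np.where(sticky[:, None], 2.0, 1.2),  # sticky firms react more
               1.0)
returns = rng.normal(scale=vol)                # firm-by-day returns

df = pd.DataFrame({
    "sticky": np.repeat(sticky, n_days),
    "announcement": np.tile(announcement, n_firms),
    "ret": returns.ravel(),
})
print(df.groupby(["sticky", "announcement"])["ret"].std())
```

The paper's claim is that the gap in announcement-day volatility between the two groups is what a menu-cost friction predicts, and that the gap survives a long list of robustness checks.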

Saturday, March 02, 2013

Booms and Systemic Banking Crises

Everyone at the conference seemed to like this model of endogenous banking crises (me included; this is the non-technical summary -- the paper itself is fairly technical):

Booms and Systemic Banking Crises, by Frederic Boissay, Fabrice Collard, and Frank Smets: ... Non-Technical Summary Recent empirical research on systemic banking crises (henceforth, SBCs) has highlighted the existence of similar patterns across diverse episodes. SBCs are rare events. Recessions that follow SBC episodes are deeper and longer lasting than other recessions. And, more importantly for the purpose of this paper, SBCs follow credit intensive booms; "banking crises are credit booms gone wrong" (Schularick and Taylor, 2012, p. 1032). Rare, large, adverse financial shocks could possibly account for the first two properties. But they do not seem in line with the fact that the occurrence of an SBC is not random but rather closely linked to credit conditions. So, while most of the existing macro-economic literature on financial crises has focused on understanding and modeling the propagation and the amplification of adverse random shocks, the presence of the third stylized fact mentioned above calls for an alternative approach.
In this paper we develop a simple macroeconomic model that accounts for the above three stylized facts. The primary cause of systemic banking crises in the model is the accumulation of assets by households in anticipation of future adverse shocks. The typical run of events leading to a financial crisis is as follows. A sequence of favorable, non-permanent supply shocks hits the economy. The resulting increase in the productivity of capital leads to a demand-driven expansion of credit that pushes the corporate loan rate above steady state. As productivity goes back to trend, firms reduce their demand for credit, whereas households continue to accumulate assets, thus feeding the supply of credit by banks. The credit boom then turns supply-driven and the corporate loan rate goes down, falling below steady state. By giving banks incentives to take more risks or misbehave, too low a corporate loan rate contributes to eroding trust within the banking sector precisely at a time when banks increase in size. Ultimately, the credit boom lowers the resilience of the banking sector to shocks, making systemic crises more likely.
We calibrate the model on the business cycles in the US (post WWII) and the financial cycles in fourteen OECD countries (1870-2008), and assess its quantitative properties. The model reproduces the stylized facts associated with SBCs remarkably well. Most of the time the model behaves like a standard financial accelerator model, but once in a while -- on average every forty years -- there is a banking crisis. The larger the credit boom, (i) the higher the probability of an SBC, (ii) the sooner the SBC, and (iii) -- once the SBC breaks out -- the deeper and the longer the recession. In our simulations, the recessions associated with SBCs are significantly deeper (with a 45% larger output loss) than average recessions. Overall, our results validate the role of supply-driven credit booms leading to credit busts. This result is of particular importance from a policy making perspective as it implies that systemic banking crises are predictable. We indeed use the model to compute the k-step ahead probability of an SBC at any point in time. Fed with actual US data over the period 1960-2011, the model yields remarkably realistic results. For example, the one-year ahead probability of a crisis is essentially zero in the 1960s and 1970s. It jumps up twice during the sample period: in 1982-83, just before the Savings & Loan crisis, and in 2007-09. Although very stylized, our model thus also provides a simple tool to detect financial imbalances and predict future crises.

'Monetary Policy Alternatives at the Zero Bound: Lessons from the 1930s'

This paper from the SF Fed conference might be of interest (it's a bit technical in some sections):

Monetary Policy Alternatives at the Zero Bound: Lessons from the 1930s U.S. February, 2013 Christopher Hanes: Abstract: In recent years economists have debated two unconventional policy options for situations when overnight rates are at the zero bound: boosting expected inflation through announced changes in policy objectives such as adoption of price-level or nominal GDP targets; and large-scale asset purchases to lower long-term rates by pushing down term or risk premiums - “portfolio-balance” effects. American policies in the 1930s, when American overnight rates were at the zero bound, created experiments that tested the effectiveness of the expected-inflation option, and the existence of portfolio-balance effects. In data from the 1930s, I find strong evidence of portfolio-balance effects but no clear evidence of the expected-inflation channel.

(The discussants seemed to like the paper, but the results for the expectations channel drew more questions than the results for the portfolio-balance effects.)

Friday, March 01, 2013

FRBSF Conference: The Past and Future of Monetary Policy

I am here today:

In 1913, President Woodrow Wilson signed the Federal Reserve Act into law, and the Federal Reserve System was created. In recognition of the centennial of the Fed's founding, the Economic Research Department of the Federal Reserve Bank of San Francisco is sponsoring a research conference on the theme “The Past and Future of Monetary Policy.”

Agenda

Morning Session Chair: John Fernald, Federal Reserve Bank of San Francisco

8:15 AM Continental Breakfast

8:50 AM Welcoming Remarks: John Williams, President, Federal Reserve Bank of San Francisco

9:00 AM  Robert Hall, Stanford University, Ricardo Reis, Columbia University, Controlling Inflation and Maintaining Central Bank Solvency under New-Style Central Banking  Discussants: John Leahy, New York University, Carl Walsh, University of California, Santa Cruz

10:15 AM Break

10:35 AM Christopher Gust, Federal Reserve Board, David Lopez-Salido, Federal Reserve Board, Matthew Smith, Federal Reserve Board, The Empirical Implications of the Interest-Rate Lower Bound, Discussants: Martin Eichenbaum, Northwestern University, Christopher Sims, Princeton University

11:50 AM Break

12:00 PM Lunch – Market Street Dining Room, Fourth Floor, Introduction: Glenn Rudebusch, Director of Research, Federal Reserve Bank of San Francisco, Speaker: Lars Svensson, Deputy Governor, Riksbank

Afternoon Session Chair: Eric Swanson, Federal Reserve Bank of San Francisco

1:15 PM Anna Cieslak, Kellogg School of Management, Northwestern University, Pavol Povala, Stern School of Business, New York University, Expecting the Fed, Discussants: Kenneth Singleton, Stanford Graduate School of Business, Mark Watson, Princeton University

2:30 PM Break

2:45 PM Frederic Boissay, European Central Bank, Fabrice Collard, University of Bern, Frank Smets, European Central Bank, Booms and Systemic Banking Crises, Discussants: Lawrence Christiano, Northwestern University, Mark Gertler, New York University

4:00 PM Break

4:15 PM Christopher Hanes, SUNY Binghamton, Monetary Policy Alternatives at the Zero Bound: Lessons from the 1930s U.S., Discussants: Gary Richardson, University of California, Irvine, James Hamilton, University of California, San Diego

5:30 PM Reception – West Market Street Lounge, Fourth Floor

6:15 PM Dinner – Market Street Dining Room, Fourth Floor, Introduction: John Williams, President, Federal Reserve Bank of San Francisco, Speaker: Ben Bernanke, Chairman, Federal Reserve Board of Governors

Wednesday, February 27, 2013

2013 West Coast Trade Workshop

If any academics happen to be in Eugene this weekend:

2013 West Coast Trade Workshop (link)

All sessions will be held in the Walnut room in the Inn at the 5th.

Saturday March 2nd

8:15 am-10:15 am Session 1 – Innovation and Growth (Chair – Nicholas Sly)

10:15am - 10:30am Coffee Break

10:30am- 12:30pm Session 2 – International Trade and Worker Skills (Chair – Bruce Blonigen)

12:30pm-2pm Lunch Break

2pm-4pm Session 3 – Foreign Direct Investment (Chair – Jennifer Poole)

 Evening Hosted Group Dinner

Sunday, March 3rd

8:30am-10:30am Session 4 – Consequences of Trade Liberalization (Chair – Alan Spearot)

10:30am – 10:45am Coffee Break

10:45am-12:45pm Session 5 – Offshoring (Chair – Anca Cristea)

Adjourn

Monday, February 18, 2013

Jordi Galí: Monetary Policy and Rational Asset Price Bubbles

Another paper to read:

Monetary Policy and Rational Asset Price Bubbles, by Jordi Galí, NBER Working Paper No. 18806, February 2013 [open link]: Abstract I examine the impact of alternative monetary policy rules on a rational asset price bubble, through the lens of an overlapping generations model with nominal rigidities. A systematic increase in interest rates in response to a growing bubble is shown to enhance the fluctuations in the latter, through its positive effect on bubble growth. The optimal monetary policy seeks to strike a balance between stabilization of the bubble and stabilization of aggregate demand. The paper's main findings call into question the theoretical foundations of the case for "leaning against the wind" monetary policies.

What's the key mechanism working against the traditional "lean against the wind" policy? Rational bubbles grow at the rate of interest, so raising (real) interest rates makes the bubble grow faster. From the introduction:

...The role that monetary policy should play in containing ... bubbles has been the subject of a heated debate, well before the start of the recent crisis. The consensus view among most policy makers in the pre-crisis years was that central banks should focus on controlling inflation and stabilizing the output gap, and thus ignore asset price developments, unless the latter are seen as a threat to price or output stability. Asset price bubbles, it was argued, are difficult if not outright impossible to identify or measure; and even if they could be observed, the interest rate would be too blunt an instrument to deal with them, for any significant adjustment in the latter aimed at containing the bubble may cause serious "collateral damage" in the form of lower prices for assets not affected by the bubble, and a greater risk of an economic downturn.

But that consensus view has not gone unchallenged, with many authors and policy makers arguing that the achievement of low and stable inflation is not a guarantee of financial stability and calling for central banks to pay special attention to developments in asset markets. Since episodes of rapid asset price inflation often lead to a financial and economic crisis, it is argued, central banks should act preemptively ... by raising interest rates sufficiently to dampen or bring to an end any episodes of speculative frenzy -- a policy often referred to as "leaning against the wind." ...

Independently of one's position in the previous debate, it is generally taken for granted (a) that monetary policy can have an impact on asset price bubbles and (b) that a tighter monetary policy, in the form of higher short-term nominal interest rates, may help disinflate such bubbles. In the present paper I argue that such an assumption is not supported by economic theory and may thus lead to misguided policy advice, at least in the case of bubbles of the rational type considered here. The reason for this can be summarized as follows: in contrast with the fundamental component of an asset price, which is given by a discounted stream of payoffs, the bubble component has no payoffs to discount. The only equilibrium requirement on its size is that the latter grow at the rate of interest, at least in expectation. As a result, any increase in the (real) rate engineered by the central bank will tend to increase the size of the bubble, even though the objective of such an intervention may have been exactly the opposite. Of course, any decline observed in the asset price in response to such a tightening of policy is perfectly consistent with the previous result, since the fundamental component will generally drop in that scenario, possibly more than offsetting the expected rise in the bubble component.

Below I formalize that basic idea... The paper's main results can be summarized as follows:

  • Monetary policy cannot affect the conditions for existence (or nonexistence) of a bubble, but it can influence its short-run behavior, including the size of its fluctuations.
  • Contrary to the conventional wisdom, a stronger interest rate response to bubble fluctuations (i.e. a "leaning against the wind" policy) may raise the volatility of asset prices and of their bubble component.
  • The optimal policy must strike a balance between stabilization of current aggregate demand -- which calls for a positive interest rate response to the bubble -- and stabilization of the bubble itself (and hence of future aggregate demand) which would warrant a negative interest rate response to the bubble. If the average size of the bubble is sufficiently large the latter motive will be dominant, making it optimal for the central bank to lower interest rates in the face of a growing bubble.

...
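The arithmetic behind this result fits in a few lines. With a constant real rate r, the fundamental component is the discounted dividend stream while the bubble component must grow at rate r in expectation, so a higher rate lowers the former and speeds up the latter. A stylized constant-rate illustration (not Galí's OLG model):

```python
# The mechanism in miniature: the bubble must grow at the real rate r,
# so tightening raises its growth even as the fundamental (discounted
# dividend) component falls. Stylized constant-rate example.
d = 1.0                        # constant dividend flow

def fundamental(r):
    return d / r               # present value of dividends at rate r

def bubble_path(r, b0=1.0, periods=10):
    path = [b0]
    for _ in range(periods):
        path.append(path[-1] * (1 + r))   # E[B(t+1)] = (1+r) * B(t)
    return path

for r in (0.02, 0.05):
    print(f"r={r:.2f}  fundamental={fundamental(r):6.1f}  "
          f"bubble after 10 periods={bubble_path(r)[-1]:.2f}")
```

Note that the observed price can still fall when rates rise, exactly as Galí says: the drop in the fundamental component can more than offset the faster-growing bubble component.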

But before we lower interest rates in response to signs of an inflating bubble, it would be good to heed this warning from the conclusion:

Needless to say the conclusions should not be taken at face value when it comes to designing actual policies. This is so because the model may not provide an accurate representation of the challenges facing actual policy makers. In particular, it may very well be the case that actual bubbles are not of the rational type and, hence, respond to monetary policy changes in ways not captured by the theory above. In addition, the model above abstracts from many aspects of actual economies that may be highly relevant when designing monetary policy in bubbly economies, including the presence of frictions and imperfect information in financial markets. Those caveats notwithstanding, the analysis above may be useful by pointing out a potentially important missing link in the case for "leaning against the wind" policies.

Wednesday, February 13, 2013

'Asset Quality Misrepresentation by Financial Intermediaries: Evidence from RMBS Market'

I need to read this paper:

Asset Quality Misrepresentation by Financial Intermediaries: Evidence from RMBS Market, by Tomasz Piskorski, Amit Seru, and James Witkin: Abstract: We contend that buyers received false information about the true quality of assets in contractual disclosures by intermediaries during the sale of mortgages in the $2 trillion non-agency market. We construct two measures of misrepresentation of asset quality -- misreported occupancy status of borrower and misreported second liens -- by comparing the characteristics of mortgages disclosed to the investors at the time of sale with actual characteristics of these loans at that time that are available in a dataset matched by a credit bureau. About one out of every ten loans has one of these misrepresentations. These misrepresentations are not likely to be an artifact of matching error between datasets that contain actual characteristics and those that are reported to investors. At least part of this misrepresentation likely occurs within the boundaries of the financial industry (i.e., not by borrowers). The propensity of intermediaries to sell misrepresented loans increased as the housing market boomed, peaking in 2006. These misrepresentations are costly for investors, as ex post delinquencies of such loans are more than 60% higher when compared with otherwise similar loans. Lenders seem to be partly aware of this risk, charging a higher interest rate on misrepresented loans relative to otherwise similar loans, but the interest rate markup on misrepresented loans does not fully reflect their higher default risk. Using measures of pricing used in the literature, we find no evidence that these misrepresentations were priced in the securities at their issuance. A significant degree of misrepresentation exists across all reputable intermediaries involved in sale of mortgages. The propensity to misrepresent seems to be unrelated to measures of incentives for top management, to quality of risk management inside these firms or to regulatory environment in a region. Misrepresentations on just two relatively easy-to-quantify dimensions of asset quality could result in forced repurchases of mortgages by intermediaries in upwards of $160 billion.

Friday, February 08, 2013

Have Blog, Will Travel: NBER Research Meeting

I am here today:

National Bureau of Economic Research Research Meeting
Matthias Doepke and Emmanuel Farhi, Organizers
Federal Reserve Bank of San Francisco
101 Market Street San Francisco, CA

PROGRAM

8:30 am Continental Breakfast

9:00 am Zhen Huo, University of Minnesota, and Jose-Victor Rios-Rull, University of Minnesota and NBER, Engineering a Paradox of Thrift Recession, Discussant: Mark Aguiar, Princeton University and NBER

10:00 am Coffee Break

10:30 am Simeon Alder, University of Notre Dame, David Lagakos, Arizona State University, and Lee Ohanian, University of California at Los Angeles and NBER, The Decline of the U.S. Rust Belt: A Macroeconomic Analysis, Discussant: Leena Rudanko, Boston University and NBER

11:30 am Raj Chetty, Harvard University and NBER, John Friedman, Harvard University and NBER, Soren Leth-Petersen, University of Copenhagen, Torben Nielsen, The Danish National Centre for Social Research, and Tore Olsen, Harvard University, Active vs. Passive Decisions and Crowd-Out in Retirement Savings Accounts: Evidence from Denmark, Discussant: Christopher Carroll, Johns Hopkins University

12:30 pm Lunch

1:30 pm Lawrence Christiano, Northwestern University and NBER, Martin Eichenbaum, Northwestern University and NBER, and Mathias Trabandt, Federal Reserve Board, Unemployment and Business Cycles, Discussant: Robert Shimer, University of Chicago and NBER

2:30 pm Coffee Break

3:00 pm Andrew Atkeson, University of California at Los Angeles and NBER, Andrea Eisfeldt, University of California at Los Angeles, and Pierre-Olivier Weill, University of California at Los Angeles and NBER, The Market for OTC Derivatives, Discussant: Gustavo Manso, University of California at Berkeley

4:00 pm Greg Kaplan, Princeton University and NBER, and Guido Menzio, University of Pennsylvania and NBER, Shopping Externalities and Self-Fulfilling Unemployment Fluctuations, Discussant: Martin Schneider, Stanford University and NBER

5:00 pm Adjourn

5:15 pm Reception and Dinner

Monday, January 28, 2013

Gorton and Ordonez: The Supply and Demand for Safe Assets

I need to read this:

The Supply and Demand for Safe Assets, by Gary Gorton and Guillermo Ordonez, January 2013, NBER [open link]: Abstract There is a demand for safe assets, either government bonds or private substitutes, for use as collateral. Government bonds are safe assets, given the government's power to tax, but their supply is driven by fiscal considerations, and does not necessarily meet the private demand for safe assets. Unlike the government, the private sector cannot produce riskless collateral. When the private sector reaches its limit (the quality of private collateral), government bonds are net wealth, up to the government's own limits (taxation capacity). The economy is fragile to the extent that privately-produced safe assets are relied upon. In a crisis, government bonds can replace private assets that do not sustain borrowing anymore, raising welfare.

Tuesday, January 22, 2013

'Wealth Effects Revisited: 1975-2012'

Housing cycles matter:

Wealth Effects Revisited: 1975-2012, by Karl E. Case, John M. Quigley, Robert J. Shiller, NBER Working Paper No. 18667, January 2013 [open link, previous version]: We re-examine the links between changes in housing wealth, financial wealth, and consumer spending. We extend a panel of U.S. states observed quarterly during the seventeen-year period, 1982 through 1999, to the thirty-seven year period, 1975 through 2012Q2. Using techniques reported previously, we impute the aggregate value of owner-occupied housing, the value of financial assets, and measures of aggregate consumption for each of the geographic units over time. We estimate regression models in levels, first differences and in error-correction form, relating per capita consumption to per capita income and wealth. We find a statistically significant and rather large effect of housing wealth upon household consumption. This effect is consistently larger than the effect of stock market wealth upon consumption.
In our earlier version of this paper we found that households increase their spending when house prices rise, but we found no significant decrease in consumption when house prices fall. The results presented here with the extended data now show that declines in house prices stimulate large and significant decreases in household spending.
The elasticities implied by this work are large. An increase in real housing wealth comparable to the rise between 2001 and 2005 would, over the four years, push up household spending by a total of about 4.3%. A decrease in real housing wealth comparable to the crash which took place between 2005 and 2009 would lead to a drop of about 3.5%.
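For those who want a feel for the "error-correction form" mentioned in the abstract, here is a minimal two-step (Engle-Granger style) sketch on invented data; the paper's state-level panel machinery is of course much richer:

```python
# Error-correction consumption regression in two steps: long-run relation
# in levels, then short-run dynamics with the lagged disequilibrium.
# Synthetic (log) per-capita series; illustrative only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
T = 200
income = np.cumsum(rng.normal(0.5, 1, T))      # trending income
housing = np.cumsum(rng.normal(0.3, 1, T))     # trending housing wealth
cons = 0.6 * income + 0.1 * housing + rng.normal(0, 0.5, T)

# Step 1: long-run (cointegrating) relation in levels.
X = sm.add_constant(np.column_stack([income, housing]))
long_run = sm.OLS(cons, X).fit()
ec = long_run.resid                             # error-correction term

# Step 2: short-run dynamics with the lagged disequilibrium.
dX = sm.add_constant(np.column_stack(
    [np.diff(income), np.diff(housing), ec[:-1]]))
short_run = sm.OLS(np.diff(cons), dX).fit()
print(short_run.params)  # const, d_income, d_housing, adjustment speed
```

The wealth-effect elasticities quoted above come from the coefficients on the wealth terms in regressions of this general shape, estimated across states and over time.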

Thursday, January 03, 2013

Bad Advice from Experts, Herding, and Bubbles

Here's the introduction to a paper I'm giving at the AEA meetings (journal version of paper). The model in the paper, which is a variation of the Brock and Hommes (1998) generalization of the Lucas (1978) asset pricing model, shows that bad advice from experts can increase the likelihood of harmful financial bubbles:

Bad Advice from Experts, Herding, and Bubbles: The belief that housing prices would continue to rise into the foreseeable future was an important factor in creating the housing price bubble. But why did people believe this? Why did they become convinced, as they always do prior to a bubble, that this time was different? One reason is bad advice from academic and industry experts. Many people turned to these experts when housing prices were inflating and asked if we were in a bubble. The answer in far too many cases – almost all when they had an opinion at all – was that no, this wasn’t a bubble. Potential homebuyers were told there were real factors such as increased immigration, zoning laws, resource constraints in an increasingly globalized economy, and so on that would continue to drive up housing prices.

When the few economists who did understand that housing prices were far above their historical trends pointed out that a typical bubble pattern had emerged – both Robert Shiller and Dean Baker come to mind – they were mostly ignored. Thus, both academic and industry economists helped to convince people that the increase in prices was permanent, and that they ought to get in on the housing boom as soon as possible.

But why did so few economists warn about the bubble? And more importantly for the model presented in this paper, why did so many economists validate what turned out to be destructive trend-chasing behavior among investors?

One reason is that economists have become far too disconnected from the lessons of history. As courses in economic history have faded from graduate programs in recent decades, economists have become much less aware of the long history of bubbles. This has caused a diminished ability to recognize the housing bubble as it was inflating. And worse, the small amount of recent experience we have with bubbles has led to complacency. We were able to escape, for example, the stock bubble crash of 2001 without too much trouble. And other problems such as the Asian financial crisis did not cause anything close to the troubles we had after the housing bubble collapsed, or the troubles other bubbles have caused throughout history.

Economists did not have the historical perspective they needed, and there was confidence that even if a bubble did appear policymakers would be able to clean it up without too much damage. As Robert Lucas said in his 2003 presidential address to the American Economic Association, the “central problem of depression-prevention has been solved.” We no longer needed to worry about big financial meltdowns of the type that caused so many problems in the 1800s and early 1900s. But in reality economists hardly knew what to look for, did not fully understand the dangers, and were hence unconcerned even if they did suspect that housing prices were out of line with the underlying fundamentals.

A second factor is the lack of deep institutional knowledge of the markets academic economists study. Theoretical models are idealized, pared down versions of reality intended to capture the fundamental issues relative to the question at hand. Because of their mathematical complexity, macro models in particular are highly idealized and only capture a few real world features such as sticky prices and wages. Economists who were intimately familiar with these highly stylized models assumed they were just as familiar with the markets the models were intended to represent. But the models were not up to the task at hand,[1] and when the models failed to signal that a bubble was coming there was no deep institutional knowledge to rely upon. There was nothing to give the people using these models a hint that they were not capturing important features of real world markets.

These two disconnects – from history and from the finer details of markets – made it much more likely that economists would certify that this time was different, that fundamentals such as population growth, immigration, and financial innovation could explain the run-up in housing prices.

The model in this paper examines the implications of these two disconnects and shows that when experts endorse the idea that this time is different and cause herding toward incorrect beliefs about the future, it increases the likelihood that a large, devastating bubble will occur.

_____________________

[1] See Wieland and Wolters (2011) for an overview of the forecasting performance of macroeconomic models before, during, and after the crisis.
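To give a flavor of the mechanism, here is a toy predictor-switching loop in the spirit of Brock and Hommes -- not the model in the paper -- where an "expert endorsement" term tilts perceived fitness toward the trend-chasing forecast:

```python
# Toy Brock-Hommes-style switching between a fundamentalist and a
# trend-chasing predictor. The "endorsement" term is a stand-in for bad
# expert advice tilting perceived fitness toward trend-chasing.
import numpy as np

def simulate(endorsement, T=300, R=1.05, beta=2.0, seed=5):
    rng = np.random.default_rng(seed)
    pf = 100.0                        # fundamental price
    y = pf * (R - 1)                  # dividend consistent with pf
    p = np.full(T, pf)
    n_chase = 0.5                     # share using the trend-chasing rule
    for t in range(2, T):
        e_fund = pf                                       # fundamentalist
        e_chase = p[t - 1] + 0.9 * (p[t - 1] - p[t - 2])  # trend-chaser
        # Price: discounted average forecast plus dividend, plus noise.
        p[t] = ((1 - n_chase) * e_fund + n_chase * e_chase + y) / R \
               + rng.normal(0, 1)
        # Fitness: negative squared forecast error, plus the expert tilt.
        fit_f = -(e_fund - p[t]) ** 2
        fit_c = -(e_chase - p[t]) ** 2 + endorsement
        z = np.clip(-beta * (fit_c - fit_f), -50, 50)
        n_chase = 1.0 / (1.0 + np.exp(z))    # logit predictor choice
    return float(np.abs(p - pf).mean())

print("mean |price - fundamental|, no endorsement:", round(simulate(0.0), 2))
print("mean |price - fundamental|, endorsed:      ", round(simulate(5.0), 2))
```

Raising the endorsement term raises the average share of trend-chasers and, with it, the size and persistence of deviations from the fundamental price -- the herding-toward-incorrect-beliefs channel described above.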

Monday, December 24, 2012

Smart Machines, A New Guide to Keynes, and The Inefficient Markets Hypothesis

Haven't read these papers yet, but looks like I should (I added open links when I could find them):

First, Sachs and Kotlikoff (anything to keep these two from writing about the deficit and the accumulated debt is, in my view, a plus):

Smart Machines and Long-Term Misery, by Jeffrey D. Sachs, Laurence J. Kotlikoff, NBER Working Paper No. 18629, Issued in December 2012: Are smarter machines our children’s friends? Or can they bring about a transfer from our relatively unskilled children to ourselves that leaves our children, and indeed all our descendants, worse off?
This, indeed, is the dire message of the model presented here, in which smart machines substitute directly for young unskilled labor but complement older skilled labor. The depression in the wages of the young then limits their ability to save and invest in their own skill acquisition and physical capital. This, in turn, means the next generation of young, initially unskilled workers encounters an economy with less human and physical capital, which further drives down their wages. This process stabilizes through time, but potentially entails each newborn generation being worse off than its predecessor.
We illustrate the potential for smart machines to engender long-term misery in a highly stylized two-period model. We also show that appropriate generational policy can be used to transform win-lose into win-win for all generations.

Next, Jordi Gali revisits Keynes:

Notes for a New Guide to Keynes (I): Wages, Aggregate Demand, and Employment, by Jordi Galí, NBER Working Paper No. 18651, Issued in December 2012 [open link]: I revisit the General Theory's discussion of the role of wages in employment determination through the lens of the New Keynesian model. The analysis points to the key role played by the monetary policy rule in shaping the link between wages and employment, and in determining the welfare impact of enhanced wage flexibility. I show that the latter is not always welfare improving.

Finally, Roger Farmer, Carine Nourry, and Alain Venditti on whether "competitive financial markets efficiently allocate risk" (according to this, they don't):

The Inefficient Markets Hypothesis: Why Financial Markets Do Not Work Well in the Real World, Roger E.A. Farmer, Carine Nourry, Alain Venditti, NBER Working Paper No. 18647, Issued in December 2012 [open link]: Existing literature continues to be unable to offer a convincing explanation for the volatility of the stochastic discount factor in real world data. Our work provides such an explanation. We do not rely on frictions, market incompleteness or transactions costs of any kind. Instead, we modify a simple stochastic representative agent model by allowing for birth and death and by allowing for heterogeneity in agents' discount factors. We show that these two minor and realistic changes to the timeless Arrow-Debreu paradigm are sufficient to invalidate the implication that competitive financial markets efficiently allocate risk. Our work demonstrates that financial markets, by their very nature, cannot be Pareto efficient, except by chance. Although individuals in our model are rational, markets are not.

Friday, December 21, 2012

'A Pitfall with DSGE-Based, Estimated, Government Spending Multipliers'

This paper, which I obviously think is worth noting, is forthcoming in AEJ Macroeconomics:

A Pitfall with DSGE-Based, Estimated, Government Spending Multipliers, by Patrick Fève,  Julien Matheron, Jean-Guillaume Sahuc, December 5, 2012: 1 Introduction Standard practice in estimation of dynamic stochastic general equilibrium (DSGE) models, e.g. the well-known work by Smets and Wouters (2007), is to assume that government consumption expenditures are described by an exogenous stochastic process and are separable in the households’ period utility function. This standard practice has been adopted in the most recent analyses of fiscal policy (e.g. Christiano, Eichenbaum and Rebelo, 2011, Coenen et al., 2012, Cogan et al., 2010, Drautzburg and Uhlig, 2011, Eggertsson, 2011, Erceg and Lindé, 2010, Fernández-Villaverde, 2010, Uhlig, 2010).
In this paper, we argue that both short-run and long-run government spending multipliers (GSM) obtained in this literature may be downward biased. This is so because the standard approach does not typically allow for the possibility that private consumption and government spending are Edgeworth complements in the utility function[1] and that government spending has an endogenous countercyclical component (automatic stabilizer)... Since, as we show, the GSM increases with the degree of Edgeworth complementarity,... the standard empirical approach may ... result in a downward-biased estimate of the GSM.
In our benchmark empirical specification with Edgeworth complementarity and a countercyclical component of policy, the estimated long-run multiplier amounts to 1.31. Using the same model..., when both Edgeworth complementarity and the countercyclical component of policy are omitted,... the estimated multiplier is approximately equal to 0.5. Such a difference is clearly not neutral if the model is used to assess recovery plans of the same size as those recently enacted in the US. To illustrate this more concretely, we feed the American Recovery and Reinvestment Act (ARRA) fiscal stimulus package into our model. We obtain that omitting the endogenous policy rule at the estimation stage would lead an analyst to underestimate the short-run GSM by slightly more than 0.25 points. Clearly, these are not negligible figures. ...
_____
1 We say that private consumption and government spending are Edgeworth complements/substitutes when an increase in government spending raises/diminishes the marginal utility of private consumption. Such a specification has now become standard, following the seminal work by Aschauer (1985), Bailey (1971), Barro (1981), Braun (1994), Christiano and Eichenbaum (1992), Finn (1998), McGrattan (1994).

Let me also add these qualifications from the conclusion:

In our framework, we have deliberately abstracted from relevant details... However, the recent literature insists on other modeling issues that might potentially affect our results. We mention two of them. First, as put forth by Leeper, Plante and Traum (2010), a more general specification of the government spending rule, lump-sum transfers, and distortionary taxation is needed to properly fit US data. This richer specification includes, in addition to the automatic stabilizer component, a response to government debt and co-movement between tax rates. An important quantitative issue may be to assess which type of stabilization (automatic stabilization and/or debt stabilization) interacts with the estimated degree of Edgeworth complementarity. Second, Fiorito and Kollintzas (2004) have suggested that the degree of complementarity/substitutability between government and private consumption is not homogeneous across types of public expenditures. This suggests disaggregating government spending and inspecting how feedback rules affect the estimated degree of Edgeworth complementarity in this more general setup. These issues will be the object of future research.

Tuesday, December 11, 2012

'What Does the New CRA Paper Tell Us?'

Mike Konczal:

What Does the New Community Reinvestment Act (CRA) Paper Tell Us?, by Mike Konczal: There are two major, critical questions that show up in the literature surrounding the 1977 Community Reinvestment Act (CRA).
The first question is how much compliance with the CRA changes the portfolio of lending institutions. Do they lend more often and to riskier people, or do they lend the same but put more effort into finding candidates? The second question is how much the CRA led to the expansion of subprime lending during the housing bubble. Did the CRA have a significant role in the financial crisis?
There's a new paper on the CRA, Did the Community Reinvestment Act (CRA) Lead to Risky Lending?, by Agarwal, Benmelech, Bergman and Seru, h/t Tyler Cowen, with smart commentary already from Noah Smith. (This blog post will use the ungated October 2012 paper for quotes and analysis.) This is already being used as the basis for an "I told you so!" by the conservative press, which has tried to argue that the second question is the most relevant one. However, it is important to understand that this paper answers the first question, while, if anything, providing evidence against the conservative case for the second. ...
"the very small share of all higher-priced loan originations that can reasonably be attributed to the CRA makes it hard to imagine how this law could have contributed in any meaningful way to the current subprime crisis." ...

Monday, November 05, 2012

'Managing a Liquidity Trap: Monetary and Fiscal Policy'

I like Stephen Williamson a lot better when he puts on his academic cap. I learned something from this:

Managing a Liquidity Trap: Monetary and Fiscal Policy

I disagree with him about the value of forward guidance (though I wouldn't bet the recovery on this one mechanism), but it's a nice discussion of the underlying issues.

I was surprised to see this reference to fiscal policy:

I've come to think of the standard New Keynesian framework as a model of fiscal policy. The basic sticky price (or sticky wage) inefficiency comes from relative price distortions. Particularly given the zero lower bound on the nominal interest rate, monetary policy is the wrong vehicle for addressing the problem. Indeed, in Werning's model we can always get an efficient allocation with appropriately-set consumption taxes (see Correia et al., for example). I don't think the New Keynesians have captured what monetary policy is about.

For some reason, I thought he was adamantly opposed to fiscal policy interventions. But I think I'm missing something here -- perhaps he is discussing what this particular model says, or what NK models say more generally, rather than what he believes and endorses. After all, he's not a fan of the NK framework. In any case, in addition to whatever help monetary policy can provide, as just noted in the previous post I agree that fiscal policy has an important role to play in helping the economy recover.

Maurizio Bovi: Are You a Good Econometrician? No, I am British (With a Response from George Evans)

Via email, Maurizio Bovi describes a paper of his on adaptive learning (M. Bovi (2012). "Are the Representative Agent’s Beliefs Based on Efficient Econometric Models?" Journal of Economic Dynamics and Control). A colleague of mine, George Evans -- a leader in this area -- responds:

Are you a good econometrician? No, I am British!, by Maurizio Bovi*: A typical assumption of mainstream strands of research is that agents’ expectations are grounded in efficient econometric models. Muthian agents are all equally rational and know the true model. The adaptive learning literature assumes that agents are boundedly rational in the sense that they are as smart as econometricians and that they are able to learn the correct model. The predictor choice approach argues that individuals are boundedly rational in the sense that agents switch to the forecasting rule that has the highest fitness. Preferences could generate enduring inertia in the dynamic switching process and a stationary environment for a sufficiently long period is necessary to learn the correct model. Having said this, all the cited approaches typically argue that there is a general tendency to forecast via optimal forecasting models because of the costs stemming from inefficient predictions.
To the extent that the representative agent’s beliefs i) are based on efficient (in terms of minimum mean squared forecasting error, MSE) econometric models, and ii) can be captured by ad hoc surveys, two basic facts emerge, stimulating my curiosity. First, in economic systems where the same simple model turns out to be the best predictor for a sufficient span of time, survey expectations should tend to converge: more and more individuals should learn or select it. Second, the forecasting fitness of this enduring minimum-MSE econometric model should not be further enhanced by the use of information provided by survey expectations. If agents act as if they were statisticians in the sense that they use efficient forecasting rules, then survey-based beliefs must reflect this and cannot contain any statistically significant information that helps reduce the MSE relative to the best econometric predictor. In sum, there could be some value in analyzing hard data and survey beliefs to understand i) whether the latter derive from optimal econometric models and ii) the time connections between survey-declared and efficient model-grounded expectations. By examining real-time GDP dynamics in the UK, I have found that, over a time span of two decades, the adaptive expectations (AE) model systematically outperforms other standard predictors which, as argued by the literature recalled above, should be in the tool-box of representative econometricians (Random Walk, ARIMA, VAR). As mentioned, this peculiar environment should eventually lead to increased homogeneity in best-model-based expectations. However, data collected in the surveys managed by the Business Surveys Unit of the European Commission (European Commission, 2007) highlight that great variety in expectations persists. Figure 1 shows that in the UK the numbers of optimists and pessimists have tended to be rather similar at least since the inception of data availability in 1985.[1]

[Figure 1: shares of optimistic and pessimistic survey respondents about UK economic prospects, 1985 onward]

In addition, evidence points to one-way information flows going from survey data to econometric models. In particular, Granger-causality, variance decomposition and Geweke’s instantaneous feedback tests suggest that the accuracy of the AE forecasting model can be further enhanced by the use of the information provided by the level of disagreement across survey beliefs. That is, as per GDP dynamics in the UK, the expectation feedback system looks like an open loop where possibly non-econometrically based beliefs play a key role with respect to realizations. All this affects the general validity of the widespread assumption that representative agents’ beliefs derive from optimal econometric models.
Results are robust to several methods of quantifications of qualitative survey observations as well as to standard forecasting rules estimated both recursively and via optimal-size rolling windows. They are also in line both with the literature supporting the non-econometrically-based content of the information captured by surveys carried out on laypeople and, interpreting MSE as a measure of volatility, with the stylized fact on the positive correlation between dispersion in beliefs and macroeconomic uncertainty.
All in all, our evidence raises some intriguing questions: Why do representative UK citizens seem to be systematically more boundedly rational than is usually hypothesized in the adaptive learning literature and the predictor choice approach? What persistently hampers them from using the most accurate statistical model? Are there econometric (objective) or psychological (subjective) impediments?
____________________
*Italian National Institute of Statistics (ISTAT), Department of Forecasting and Economic Analysis. The opinions expressed herein are those of the author (E-mail mbovi@istat.it) and do not necessarily reflect the views of ISTAT.
[1] The question is “How do you expect the general economic situation in the country to develop over the next 12 months?” Respondents may reply “it will…: i) get a lot better, ii) get a little better, iii) stay the same, iv) get a little worse, v) get a lot worse, vi) I do not know. See European Commission (1997).
References
European Commission (2007). The Joint Harmonised EU Programme of Business and Consumer Surveys, User Guide, European Commission, Directorate-General for Economic and Financial Affairs, July.
M. Bovi (2012). “Are the Representative Agent’s Beliefs Based on Efficient Econometric Models?” Journal of Economic Dynamics and Control DOI: 10.1016/j.jedc.2012.10.005.
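Before turning to George Evans's response, note that the horse race behind Bovi's first claim is easy to set up: compare pseudo out-of-sample mean squared errors across simple predictors. Here is a sketch on an invented local-level series, for which AE-style exponential smoothing happens to be close to optimal (this is not the UK real-time data):

```python
# Pseudo out-of-sample MSE horse race: adaptive expectations (AE), random
# walk (RW), and a recursively estimated AR(1). Invented data only.
import numpy as np

rng = np.random.default_rng(7)
T, burn = 160, 40                       # roughly two decades of quarters
level = np.cumsum(rng.normal(scale=0.2, size=T))   # slowly drifting mean
g = level + rng.normal(scale=0.8, size=T)          # observed "growth" series

gamma = 0.3                             # AE (exponential smoothing) gain
ae = g[:burn].mean()
err_ae, err_rw, err_ar = [], [], []
for t in range(burn, T - 1):
    ae += gamma * (g[t] - ae)           # AE forecast of g[t+1] given data to t
    slope, intercept = np.polyfit(g[:t], g[1:t + 1], 1)  # recursive AR(1)
    err_ae.append(g[t + 1] - ae)
    err_rw.append(g[t + 1] - g[t])      # random walk: tomorrow = today
    err_ar.append(g[t + 1] - (slope * g[t] + intercept))

for name, e in (("AE   ", err_ae), ("RW   ", err_rw), ("AR(1)", err_ar)):
    print(name, "MSE:", round(float(np.mean(np.square(e))), 3))
```

Bovi's point is that when one predictor wins a race like this for two decades running, the surprise is that survey respondents do not converge on it.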

Here's the response from George Evans:

Comments on Maurizio Bovi, “Are the Representative Agent’s Beliefs Based on Efficient Econometric Models?”, by George Evans, University of Oregon: This is an interesting paper that has a lot of common ground with the adaptive learning literature. The techniques and a number of the arguments will be familiar to those of us who work in adaptive learning. The tenets of the adaptive learning approach can be summarized as follows: (1) Fully “rational expectations” (RE) is an implausibly strong assumption and implicitly ignores a coordination issue that arises because economic outcomes are affected by the expectations of firms and households (economic “agents”). (2) A more plausible view is that agents have bounded rationality with a degree of rationality comparable to economists themselves (the “cognitive consistency principle”). For example agents’ expectations might be based on statistical models that are revised and updated over time. On this approach we avoid assuming that agents are smarter than economists, but we also recognize that agents will not go on forever making systematic errors. (3) We should recognize that economic agents, like economists, do not agree on a single forecasting model. The economy is complex. Therefore, agents are likely to use misspecified models and to have heterogeneous expectations.
The focus of the adaptive learning literature has changed over time. The early focus was on whether agents using statistical learning rules would or would not eventually converge to RE, while the main emphasis now is on the ways in which adaptive learning can generate new dynamics, e.g. through discounting of older data and/or switching between forecasting models over time. I use the term “adaptive learning” broadly, to include, for example, the dynamic predictor selection literature.
Bovi’s paper “Are the Representative Agent’s Beliefs Based on Efficient Econometric Models” argues that, with respect to GDP growth in the UK, the answer to his question is no, because 1) there is a single efficient econometric model, which is a version of AE (adaptive expectations), and 2) agents might therefore be expected to have learned to adopt this optimal forecasting model over time. However, the degree of heterogeneity of expectations has not fallen over time, and thus agents are failing to learn to use the best forecasting model.
From the adaptive learning perspective, Bovi’s first result is intriguing, and merits further investigation, but his approach will look very familiar to those of us who work in adaptive learning. And the second point will surprise few of us: the extent of heterogeneous expectations is well-known, as is the fact that expectations remain persistently heterogeneous, and there is considerable work within adaptive learning that models this heterogeneity.
More specifically:
1) Bovi’s “efficient” model uses AE with the adaptive expectations parameter gamma updated over time in a way that aims to minimize the squared forecast error. This is in fact a simple adaptive learning model, which was proposed and studied in Evans and Ramey, “Adaptive expectations, underparameterization and the Lucas critique”, Journal of Monetary Economics (2006). We there suggested that agents might want to use AE as an optimal choice for a parsimonious (underparameterized) forecasting rule, showed what would determine the optimal choice of gamma, and provided an adaptive learning algorithm that would allow agents to update their choice of gamma over time in order to track unknown structural change. (Our adaptive learning rule exploits the fact that AE can be viewed as the forecast that arises from an IMA(1,1) time-series model, and in our rule the MA parameter is estimated and updated recursively using a constant gain rule.)
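To make the mechanics concrete, here is a minimal sketch of an AE rule whose gain gamma is re-tuned as data arrive. It exploits the AE/IMA(1,1) equivalence mentioned above (gamma = 1 - theta, with theta the MA parameter), but the updating step below is a simple constant-gain gradient approximation, not the exact Evans-Ramey recursion; all numbers are illustrative.

```python
import numpy as np

def ae_with_learned_gain(x, gain=0.02, gamma0=0.3):
    """One-step adaptive-expectations forecasts with a learned gain gamma."""
    gamma = gamma0
    forecast = x[0]                      # initial belief
    forecasts = np.empty(len(x))
    prev_err = 0.0
    for t in range(len(x)):
        forecasts[t] = forecast
        err = x[t] - forecast
        # Constant-gain gradient step on the squared forecast error,
        # using d(forecast_t)/d(gamma) ~ err_{t-1} as a crude derivative.
        if t > 0:
            gamma = np.clip(gamma + gain * 2.0 * err * prev_err, 0.01, 0.99)
        prev_err = err
        forecast = forecast + gamma * err    # AE recursion
    return forecasts

# Illustrative data: an IMA(1,1) with theta = 0.6, so the optimal gain is ~0.4.
rng = np.random.default_rng(1)
eps = rng.normal(size=500)
x = np.cumsum(eps[1:] - 0.6 * eps[:-1])
print(np.mean((x - ae_with_learned_gain(x)) ** 2))
```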
2) At the same time, I am skeptical that economists will agree that there is a single best way to forecast GDP growth. For the US there is a lot of work by numerous researchers strongly indicating that (i) the choice between univariate time-series models is controversial, i.e. there appears to be no single clearly best univariate forecasting model, and (ii) forecasting models for GDP growth should be multivariate and should include both current and lagged unemployment rates and the consumption-to-GDP ratio. Other forecasters have found a role for nonlinear (Markov-switching) dynamics. Thus I doubt that there will be agreement among economists on a single best forecasting model for GDP growth or other key macro variables. Hence we should expect households and firms also to entertain multiple forecasting models, and different agents to use different models.
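A hedged sketch of the kind of multivariate forecasting regression in point (ii): next-quarter GDP growth regressed on current growth, current and lagged unemployment, and the consumption-to-GDP ratio. The data below are synthetic placeholders; with real data one would substitute the actual quarterly series.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Synthetic quarterly series standing in for the real data.
rng = np.random.default_rng(2)
T = 160
df = pd.DataFrame({
    "growth": rng.normal(0.6, 0.5, T),              # GDP growth, percent
    "unemp": 5 + np.cumsum(rng.normal(0, 0.1, T)),  # unemployment rate
    "c_to_gdp": 0.65 + rng.normal(0, 0.01, T),      # consumption/GDP ratio
})

X = pd.DataFrame({
    "growth": df["growth"],
    "unemp": df["unemp"],
    "unemp_lag": df["unemp"].shift(1),
    "c_to_gdp": df["c_to_gdp"],
})
y = df["growth"].shift(-1).rename("next_growth")    # one-quarter-ahead target

data = pd.concat([y, X], axis=1).dropna()
fit = sm.OLS(data["next_growth"],
             sm.add_constant(data.drop(columns="next_growth"))).fit()
print(fit.params)
```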
3) Even if there were a single forecasting model that clearly dominated, one would not expect homogeneity of expectations across agents, or heterogeneity to disappear over time. In Evans and Honkapohja, “Learning as a Rational Foundation for Macroeconomics and Finance”, forthcoming 2013 in R. Frydman and E. Phelps (eds.), Rethinking Expectations: The Way Forward for Macroeconomics, we point out that variations across agents in the extent of discounting and in the frequency with which agents update parameter estimates, as well as the inclusion of idiosyncratic exogenous expectation shocks, will give rise to persistent heterogeneity. There are costs to forecasting, and some agents will have larger benefits from more accurate forecasts than other agents. For example, for some agents the forecast method advocated by Bovi will be too costly and an even simpler forecast will be adequate (e.g. a random-walk (RW) forecast that the coming year will be like last year, or a forecast based on mean growth over, say, the last five years).
4) When there are multiple models potentially in play, as there always are, the dynamic predictor selection approach initiated by Brock and Hommes implies that, because forecast methods differ in cost and those costs differ across agents, not all agents will want to use what appears to be the best-performing model. We therefore expect heterogeneous expectations at any moment in time, as the sketch below illustrates. I do not regard this as a violation of the cognitive consistency principle: even economists will find that in some circumstances in their personal decision-making they use more boundedly rational forecast methods than in other situations in which the stakes are high.
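A minimal sketch of Brock-Hommes style predictor selection, with purely illustrative numbers: agents choose predictors via a logit rule on performance net of cost, so a cheap crude rule retains a substantial share even when a costlier rule forecasts better.

```python
import numpy as np

def predictor_shares(mse, cost, beta=2.0):
    """Logit shares over predictors; fitness = -(MSE + cost)."""
    fitness = -(np.asarray(mse) + np.asarray(cost))
    w = np.exp(beta * (fitness - fitness.max()))   # subtract max for stability
    return w / w.sum()

# Predictor 0: accurate but costly (say, the optimal AE rule);
# predictor 1: crude but free (say, a random-walk forecast).
print(predictor_shares(mse=[0.20, 0.35], cost=[0.30, 0.0]))
# -> roughly [0.43, 0.57]: the cheap predictor keeps the larger share.
```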
In conclusion, here is my two-sentence summary for Maurizio Bovi: Your paper will find an interested audience among those of us who work in this area. Welcome to the adaptive learning approach.
George Evans