Category Archive for: Academic Papers

Friday, February 24, 2017

Have Blog, Will Travel

I am here today:

EF&G Research Meeting
Laura Veldkamp and Jon Steinsson, Organizers
February 24, 2017
Federal Reserve Bank of New York 33 Liberty Street New York, NY


Friday, February 24:

9:00 am
Matthias Kehrig, Duke University
Nicolas Vincent, HEC Montreal
Do Firms Mitigate or Magnify Capital Misallocation? Evidence from Plant-Level Data
Discussant: Virgiliu Midrigan, New York University and NBER

10:15 am
Daniel Garcia-Macia, International Monetary Fund
Chang-Tai Hsieh, University of Chicago and NBER
Peter Klenow, Stanford University and NBER
How Destructive is Innovation?
Discussant: Andrew Atkeson, University of California at Los Angeles and NBER

11:30 am
George-Marios Angeletos, Massachusetts Institute of Technology and NBER
Chen Lian, Massachusetts Institute of Technology
Forward Guidance without Common Knowledge
Discussant: Kristoffer Nimark, Cornell University

1:30 pm
Barney Hartman-Glaser, University of California at Los Angeles
Hanno Lustig, Stanford University and NBER
Mindy Zhang, University of Texas at Austin
Capital Share Dynamics When Firms Insure Managers
Discussant: Brent Neiman, University of Chicago and NBER

2:45 pm
Sang Yoon Lee, University of Mannheim
Yongseok Shin, Washington University in St. Louis and NBER
Horizontal and Vertical Polarization: Task-Specific Technological Change in a Multi-Sector Economy
Discussant: Nancy Stokey, University of Chicago and NBER

4:00 pm
Michael Gelman, University of Michigan
Yuriy Gorodnichenko, University of California at Berkeley and NBER
Shachar Kariv, University of California at Berkeley
Dmitri Koustas, University of California at Berkeley
Matthew Shapiro, University of Michigan and NBER
Dan Silverman, Arizona State University and NBER
Steven Tadelis, University of California at Berkeley and NBER
The Response of Consumer Spending to Changes in Gasoline Prices
Discussant: Arlene Wong, Federal Reserve Bank of Minneapolis

5:00 pm Adjourn

Monday, January 30, 2017

How to Write an Effective Referee Report and Improve the Scientific Review Process

From the Journal of Economic Perspectives:

"How to Write an Effective Referee Report and Improve the Scientific Review Process," by Jonathan B. Berk, Campbell R. Harvey and David Hirshleifer [Full-Text Access | Supplementary Materials]: The review process for academic journals in economics has grown vastly more extensive over time. Journals demand more revisions, and papers have become bloated with numerous robustness checks and extensions. Even if the extra resulting revisions do on average lead to improved papers--a claim that is debatable--the cost is enormous. We argue that much of the time involved in these revisions is a waste of research effort. Another cause for concern is the level of disagreement amongst referees, a pattern that suggests a high level of arbitrariness in the review process. To identify and highlight what is going right and what is going wrong in the reviewing process, we wrote to a sample of former editors of the American Economic Review, the Journal of Political Economy, the Quarterly Journal of Economics, Econometrica, the Review of Economic Studies, and the Journal of Financial Economics, and asked them for their thoughts about what might improve the process. We found a rough consensus that referees for top journals in economics tend to make similar, correctable mistakes. The italicized quotations throughout this paper are drawn from our correspondence with these editors and our own experience. Their insights are consistent with our own experiences as editors at the Journal of Finance and the Review of Financial Studies. Our objective is to highlight these mistakes and provide a roadmap for how to avoid them.

Tuesday, December 20, 2016

Hysteresis and Fiscal Policy

The implications of hysteresis for fiscal policy:

Hysteresis and Fiscal Policy, by Philipp Engler and Juha Tervala, December 19, 2016: Abstract Empirical studies support the hysteresis hypothesis that recessions have a permanent effect on the level of output. We analyze the implications of hysteresis for fiscal policy in a DSGE model. We assume a simple learning-by-doing mechanism where demand-driven changes in employment can affect the level of productivity permanently, leading to hysteresis in output. We show that the fiscal output multiplier is much larger in the presence of hysteresis and that the welfare multiplier of fiscal policy -- the consumption equivalent change in welfare for a one dollar change in public spending -- is positive (negative) in the presence (absence) of hysteresis. The main benefit of accommodative fiscal policy in the presence of hysteresis is to diminish the damage of a recession to the long-term level of productivity and, thus, output.

Monday, November 28, 2016

Immigrants and Firms' Outcomes: Evidence from France

From the NBER:

Immigrants and Firms' Outcomes: Evidence from France, by Cristina Mitaritonna, Gianluca Orefice, and Giovanni Peri, NBER Working Paper No. 22852, Issued in November 2016: In this paper we analyze the impact of an increase in the local supply of immigrants on firms’ outcomes, allowing for heterogeneous effects across firms according to their initial productivity. Using micro-level data on French manufacturing firms spanning the period 1995-2005, we show that a supply-driven increase in the share of foreign-born workers in a French department (a small geographic area) increased the total factor productivity of firms in that department. Immigrants were prevalently highly educated, and this effect is consistent with positive complementarity and spillover effects from their skills. We also find this effect to be significantly stronger for firms with low initial productivity and small size. The positive productivity effect of immigrants was also associated with faster growth of capital, larger exports and higher wages for natives. Highly skilled natives were pushed towards firms that did not hire too many immigrants, spreading positive productivity effects to those firms too. Because of stronger effects on smaller and initially less productive firms, the aggregate effects of immigrants at the department level on average productivity and employment were small.

Sunday, November 20, 2016

Game Theory in Economics and Beyond

This is from the Journal of Economic Perspectives (the link is open):

Game Theory in Economics and Beyond, by Larry Samuelson, Journal of Economic Perspectives vol. 30, no. 4, Fall 2016 (pp. 107-30): Abstract Within economics, game theory occupied a rather isolated niche in the 1960s and 1970s. It was pursued by people who were known specifically as game theorists and who did almost nothing but game theory, while other economists had little idea what game theory was. Game theory is now a standard tool in economics. Contributions to game theory are made by economists across the spectrum of fields and interests, and economists regularly combine work in game theory with work in other areas. Students learn the basic techniques of game theory in the first-year graduate theory core. Excitement over game theory in economics has given way to an easy familiarity. This essay first examines this transition, arguing that the initial excitement surrounding game theory has dissipated not because game theory has retreated from its initial bridgehead, but because it has extended its reach throughout economics. Next, it discusses some key challenges for game theory, including the continuing problem of dealing with multiple equilibria, the need to make game theory useful in applications, and the need to better integrate noncooperative and cooperative game theory. Finally it considers the current status and future prospects of game theory.

Thursday, October 06, 2016

Did a Legal Ivory Sale Increase Smuggling and Poaching?

From the NBER Digest:

Did a Legal Ivory Sale Increase Smuggling and Poaching?: After the experimental 2008 sale, there was a discontinuous jump in the proportion of wild elephants poached and in seizures of contraband ivory leaving Africa.
Advocates of legalizing the purchase of goods sold in black markets argue that allowing legal trade will displace illegal buying and selling, reduce criminal activity, and permit greater control of the previously illegal goods. New research indicates that this is not always the case.
In Does Legalization Reduce Black Market Activity? Evidence from a Global Ivory Experiment and Elephant Poaching Data (NBER Working Paper No. 22314), Solomon Hsiang and Nitin Sekar show that the production of black market elephant ivory expanded by an estimated 66 percent following a one-time legal sale in 2008. Seizures of contraband ivory leaving African countries also increased, from 4.8 to 8.4 seizures per country per year. The weight of ivory in the seizures increased by an average of 335 kilograms per year.


In 1989, the Convention on the International Trade of Endangered Species (CITES) banned international trade in ivory in order to protect the wild African elephant. Individual countries continued to regulate their domestic ivory trade. Poaching slowed, and elephant populations began to recover. African governments kept stockpiles of ivory harvested from animals that died naturally.
Poaching began increasing again in the mid-1990s. Following a single legal sale from stockpiles to Japan in 1999, China and Japan requested the right to make an additional purchase. After years of debate, the governments of those countries were able to purchase 62 and 45 tons of legal ivory, respectively, at auction in 2008. The governments continue to resell that ivory in their domestic markets.
After the legal sale in 1999, CITES established the Monitoring the Illegal Killing of Elephants (MIKE) program at 79 sites in 40 countries in Africa and Asia. Preliminary data collection began in mid-2002. The Proportion of Illegally Killed Elephants (PIKE) Index is the fraction of "detected elephant carcasses that were illegally killed," a measure designed to correct for fluctuating elephant populations and field worker effort.
The researchers examine how poachers responded to the 2008 sale by studying annual PIKE data from 2003 to 2013. They find a clear discontinuous increase in the index after the 2008 sale. They cannot explain this increase with changes in natural elephant mortality rates, or with economic variables such as China's or Japan's per capita GDP, Chinese or Japanese trade with elephant range countries, measures of China's physical presence in range countries, or per capita GDP in PIKE-reporting countries.
The researchers conclude that the legal sale of ivory "triggered an increase in black market ivory production by increasing consumer demand and/or reducing the cost of supplying black market ivory." Supplier costs may be reduced if legalization of a product makes it more difficult to detect and monitor illegal provision of that product. Consumer demand may rise because legalization may reduce the stigma around a previously banned product.
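The PIKE measure and the before/after comparison behind the finding can be sketched in a few lines. This is a minimal illustration with invented carcass counts, not the MIKE data, and the paper's actual test uses site-level panel regressions rather than a simple difference in means:

```python
# PIKE = illegally killed carcasses / all detected carcasses.
# The paper looks for a level shift in this index after the 2008 sale.

def pike(illegal, detected):
    """Proportion of Illegally Killed Elephants for one year."""
    return illegal / detected

# Hypothetical annual (illegal, detected) counts, 2003-2013 -- NOT real data.
counts = {
    2003: (30, 100), 2004: (28, 95),  2005: (33, 105),
    2006: (31, 98),  2007: (29, 102), 2008: (48, 100),
    2009: (52, 104), 2010: (55, 100), 2011: (60, 108),
    2012: (58, 101), 2013: (62, 103),
}
series = {yr: pike(i, d) for yr, (i, d) in counts.items()}

# Crude discontinuity check: compare mean PIKE before and after the sale.
pre = [v for yr, v in series.items() if yr < 2008]
post = [v for yr, v in series.items() if yr >= 2008]
jump = sum(post) / len(post) - sum(pre) / len(pre)
print(f"mean PIKE pre-2008: {sum(pre)/len(pre):.2f}, "
      f"post-2008: {sum(post)/len(post):.2f}, jump: {jump:+.2f}")
```

Normalizing by detected carcasses is what lets the index "correct for fluctuating elephant populations and field worker effort": if more carcasses are found simply because populations or patrol effort grew, the denominator grows too.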

Tuesday, August 02, 2016

Gold Has Never Been a Great Hedge against Bad Economic Times

Summary of ‘Gold Returns’ by Robert Barro and Sanjay Misra, published in the August 2016 issue of the Economic Journal:

Gold Has Never Been a Great Hedge against Bad Economic Times: Evidence from decades of US and global data: Gold has not served very well as a hedge against bad macroeconomic and stock market outcomes. That is the central conclusion of research by Professors Robert Barro and Sanjay Misra, published in the August 2016 issue of the Economic Journal. Their study draws on evidence from long-term US data on gold returns, as well as gold returns during some of the worst macroeconomic disasters experienced across the world.
Gold has historically played a prominent role in transactions among financial institutions even in modern systems that rely on paper money. What’s more, many observers think that gold provides a hedge against major macroeconomic declines. But after assessing long-term US data on gold returns, the new research finds that gold has not served consistently as a hedge against large declines in real GDP or real stock prices. ... [more] ...

[Paper (October 2013 version)]

Tuesday, June 07, 2016

A Contagious Malady? Open Economy Dimensions of Secular Stagnation

This paper by Gauti Eggertsson, Neil Mehrotra, Sanjay Singh, and Larry Summers was released yesterday as an NBER Working Paper. The paper looks at secular stagnation in an open economy and examines how it is transmitted across countries (in an OLG framework with many countries and imperfect capital integration). One interesting implication is that if the Fed pursues an interest rate hike, and the rest of the world does not follow, we should expect strong capital flows into the US, possibly generating a mismatch between desired savings and investment. This, in turn, leads to a drop in the US natural rate of interest, forcing the Fed to cut rates again to avoid a recession:

A Contagious Malady? Open Economy Dimensions of Secular Stagnation, by Gauti B. Eggertsson, Neil R. Mehrotra, Sanjay R. Singh, Lawrence H. Summers, NBER Working Paper No. 22299: Conditions of secular stagnation - low interest rates, below target inflation, and sluggish output growth - characterize much of the global economy. We consider an overlapping generations, open economy model of secular stagnation, and examine the effect of capital flows on the transmission of stagnation. In a world with a low natural rate of interest, greater capital integration transmits recessions across countries as opposed to lower interest rates. In a global secular stagnation, expansionary fiscal policy carries positive spillovers implying gains from coordination, and fiscal policy is self-financing. Expansionary monetary policy, by contrast, is beggar-thy-neighbor with output gains in one country coming at the expense of the other. Similarly, we find that competitiveness policies including structural labor market reforms or neomercantilist trade policies are also beggar-thy-neighbor in a global secular stagnation.

A related variation that strips down the argument in the paper (which uses an elaborate DSGE model) into a simple textbook IS-MP framework is in this year's AER Papers and Proceedings volume. The results are much the same. See here. The more elaborate model should give people comfort in knowing that the key insights hold once you add all the bells and whistles of a modern DSGE model (or perhaps it's the other way around).

Wednesday, May 04, 2016

Neo-Fisherian Policies Impart Unavoidable Instability

My colleagues have a new paper on interest rate pegs in New Keynesian models:

Interest Rate Pegs in New Keynesian Models, by George W. Evans and Bruce McGough: Abstract: John Cochrane asks: "Do higher interest rates raise or lower inflation?" We find that pegging the interest rate at a higher level will induce instability and most likely lead to falling inflation and output over time. Eventually, this will precipitate a change of policy. ...
Conclusions: Following the Great Recession, many countries have experienced repeated periods with realized and expected inflation below target levels set by policymakers. Should policy respond to this by keeping interest rates near zero for a longer period or, in line with neo-Fisherian reasoning, by increasing the interest rate to the steady-state level corresponding to the target inflation rate? We have shown that neo-Fisherian policies, in which interest rates are set according to a peg, impart unavoidable instability. In contrast, a temporary peg at low interest rates, followed by later imposition of the Taylor rule around the target inflation rate, provides a natural return to normalcy, restoring inflation to its target and the economy to its steady state.

Monday, May 02, 2016

Growth of Income and Welfare in the U.S., 1979-2011

From the NBER:

Growth of Income and Welfare in the U.S., 1979-2011, by John Komlos, NBER Working Paper No. 22211, Issued in April 2016: We estimate growth rates of real incomes in the U.S. by quintiles using the Congressional Budget Office’s (CBO) post-tax, post-transfer data as a basis for the period 1979-2011. We improve upon them by including only the present value of earnings that will accrue in retirement and excluding items included in the CBO income estimates such as “corporate taxes borne by labor” that do not increase either current purchasing power or utility. We estimate a high and a low growth rate using two price indexes, the CPI and the Personal Consumption Expenditure index. The major consistent findings include what in the colloquial is referred to as the “hollowing out” of the middle class. According to these estimates, the income of the middle class 2nd and 3rd quintiles increased at a rate of between 0.1% and 0.7% per annum, i.e., barely distinguishable from zero. Even that meager rate was achieved only through substantial transfer payments. In contrast, the income of the top 1% grew at an astronomical rate of between 3.4% and 3.9% per annum during the 32-year period, reaching an average annual value of $918,000, up from $281,000 in 1979 (in 2011 dollars). Hence, the post-tax, post-transfer income of the 1% relative to the 1st quintile increased from a factor of 21 in 1979 to a factor of 51 in 2011. However, income of no other group increased substantially relative to that of the lowest quintile. Oddly, the income of even those in the 96-99 percentiles increased only from a multiple of 8.1 to a multiple of 11.3. We next estimate growth in welfare assuming diminishing marginal utility of income. A logarithmic utility function yields a growth in welfare for the middle class of roughly 0.01% to 0.07% per annum, which is indistinguishable from zero.
With interdependent utility functions, only the welfare of the 5th quintile experienced meaningful growth, while that of the first four quintiles tended to be either negligible or even negative.

[Open link to earlier version.]
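Two of the abstract's headline numbers can be checked with back-of-the-envelope arithmetic: the growth rate implied by the top-1% income levels, and the translation of middle-class income growth into log-utility welfare growth. The middle-class income level below is an assumed round number for illustration only, not a figure from the paper:

```python
import math

# Top 1%: $281,000 (1979) to $918,000 (2011), both in 2011 dollars, 32 years.
cagr = (918_000 / 281_000) ** (1 / 32) - 1
print(f"implied top-1% growth: {cagr:.2%} per year")  # falls inside the stated 3.4%-3.9% range

# With u(y) = ln(y) and income growing at rate g, utility grows by g per year
# in levels, so its proportional growth is roughly g / ln(y).
y_mid = 50_000  # assumed middle-class income level (not from the paper)
for g in (0.001, 0.007):
    print(f"income growth {g:.1%} -> log-utility welfare growth {g / math.log(y_mid):.3%}")
```

For the 0.1%-0.7% income growth range, this reproduces welfare growth on the order of 0.01%-0.07% per annum, matching the abstract's statement that log utility shrinks already-meager middle-class gains to something indistinguishable from zero.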

Friday, April 15, 2016

NBER 31st Annual Conference on Macroeconomics

I am here today and tomorrow:

National Bureau of Economic Research, Inc.
31st Annual Conference on Macroeconomics
Martin Eichenbaum and Jonathan Parker, Organizers
Royal Sonesta Hotel
Cambridge, MA  
Friday, April 15: 
9:00 am
Jeffrey Campbell, Federal Reserve Bank of Chicago
Jonas Fisher, Federal Reserve Bank of Chicago
Alejandro Justiniano, Federal Reserve Bank of Chicago
Leonardo Melosi, Federal Reserve Bank of Chicago
Forward Guidance and Macroeconomic Outcomes Since the Financial Crisis
Narayana Kocherlakota, University of Rochester and NBER
Gauti B. Eggertsson, Brown University and NBER
11:00 am
Fernando Alvarez, University of Chicago and NBER
Francesco Lippi, Einaudi Institute for Economics and Finance
Juan Passadore, Einaudi Institute for Economics and Finance
Are State and Time Dependent Models Really Different?
John Leahy, University of Michigan and NBER
Greg Kaplan, Princeton University and NBER
12:30 pm Lunch Panel on Global Commodity Prices
James D. Hamilton, University of California at San Diego and NBER
Steven B. Kamin, Federal Reserve Board
Steven Strongin, Goldman Sachs
 2:30 pm
Paul Beaudry, University of British Columbia and NBER
Dana Galizia, Carleton University
Franck Portier, Toulouse School of Economics
Is the Macroeconomy Locally Unstable and Why Should We Care?
Laura Veldkamp, New York University and NBER
Ivan Werning, Massachusetts Institute of Technology and NBER
4:30 pm
Òscar Jordà, Federal Reserve Bank of San Francisco
Moritz Schularick, University of Bonn
Alan M. Taylor, University of California at Davis and NBER
Macrofinancial History and the New Business Cycle Facts
Mark Gertler, New York University and NBER
Atif Mian, Princeton University and NBER
6:30 pm Dinner Speaker: 
Lawrence Summers, Harvard University and NBER
Saturday, April 16:
9:00 am
Pierre-Olivier Gourinchas, University of California at Berkeley and NBER
Thomas Philippon, New York University and NBER
Dimitri Vayanos, London School of Economics and NBER
The Analytics of the Greek Crisis
Olivier Blanchard, Peterson Institute for International Economics and NBER
Markus Brunnermeier, Princeton University and NBER
11:00 am
Olivier Blanchard, Peterson Institute for International Economics and NBER
Christopher Erceg, Federal Reserve Board
Jesper Lindé, Sveriges Riksbank
Jump-Starting the Euro Area Recovery: Would a Rise in Core Fiscal Spending Help the Periphery?
Harald Uhlig, University of Chicago and NBER
Ricardo Reis, Columbia University and NBER


Forward Guidance and Macroeconomic Outcomes Since the Financial Crisis, by Jeffrey R. Campbell, Jonas D. M. Fisher, Alejandro Justiniano, and Leonardo Melosi: April 13, 2016 Abstract This paper studies the effects of FOMC forward guidance. We begin by using high frequency identification and direct measures of FOMC private information to show that puzzling responses of private sector forecasts to movements in federal funds futures rates on FOMC announcement days can be attributed almost entirely to Delphic forward guidance. However, a large fraction of futures rates’ variability on announcement days remains unexplained, leaving open the possibility that the FOMC has successfully communicated Odyssean guidance. We then examine whether the FOMC used Odyssean guidance to improve macroeconomic outcomes since the financial crisis. To this end we use an estimated medium-scale New Keynesian model to perform a counterfactual experiment for the period 2009q1–2014q4 in which we assume the FOMC did not employ any Odyssean guidance and instead followed its reaction function inherited from before the crisis as closely as possible while respecting the effective lower bound. We find that a purely rule-based policy would have delivered better outcomes in the years immediately following the crisis – forward guidance was counterproductive. However, starting toward the end of 2011, after the Fed’s introduction of “calendar-based” communications, Odyssean guidance appears to have boosted real activity and moved inflation closer to target. We show that our results do not reflect Del Negro, Giannoni, and Patterson (2015)’s forward guidance puzzle.

Are State and Time Dependent Models Really Different?, by Fernando Alvarez, Francesco Lippi, and Juan Passadore: April 13, 2016, FIRST DRAFT Abstract Yes, but only for large monetary shocks. In particular, we show that for a large class of models where shocks have continuous paths, the propagation of a monetary impulse is independent of the nature of the sticky price friction when shocks are small. The propagation of large shocks instead depends on the nature of the friction: the impulse response of inflation to monetary shocks is non-linear in state-dependent models, while it is independent of the shock size in time-dependent models. We use data on exchange rate devaluations and inflation for a panel of countries over 1974-2014 to test for the presence of state dependent decision rules. We find evidence of a non-linear effect of exchange rate changes on prices in the sample of flexible-exchange rate countries with low inflation. In particular, we find that large exchange rate changes have larger short term pass through, as implied by state dependent models.

Is the Macroeconomy Locally Unstable and Why Should We Care?, by Paul Beaudry, Dana Galizia, and Franck Portier: March 2016 Abstract In most modern macroeconomic models, the steady state (or balanced growth path) of the system is a local attractor, in the sense that, in the absence of shocks, the economy would converge to the steady state. In this paper, we examine whether the time series behavior of macroeconomic aggregates (especially labor market aggregates) is in fact supportive of this local-stability view of macroeconomic dynamics, or if it instead favors an alternative interpretation in which the macroeconomy may be better characterized as being locally unstable, with nonlinear deterministic forces capable of producing endogenous cyclical behavior. To do this, we extend a standard AR representation of the data to allow for smooth nonlinearities. Our main finding is that, even using a procedure that may have low power to detect local instability, the data provide intriguing support for the view that the macroeconomy may be locally unstable and involve limit-cycle forces. An interesting finding is that the degree of nonlinearity we detect in the data is small, but nevertheless enough to alter the description of macroeconomic behavior. We complete the paper with a discussion of the extent to which these two different views about the inherent dynamics of the macroeconomy may matter for policy.

Macrofinancial History and the New Business Cycle Facts, by Òscar Jordà, Moritz Schularick, and Alan M. Taylor: Abstract In the era of modern finance, a century-long near-stable ratio of credit to GDP gave way to increasing financialization and surging leverage in advanced economies in the last forty years. This “financial hockey stick” coincides with shifts in foundational macroeconomic relationships beyond the widely-noted return of macroeconomic fragility and crisis risk. Leverage is correlated with central business cycle moments. We document an extensive set of such moments based on a decade-long international and historical data collection effort. More financialized economies exhibit somewhat less real volatility but lower growth, more tail risk, and tighter real-real and real-financial correlations. International real and financial cycles also cohere more strongly. The new stylized facts we document should prove fertile ground for the development of a newer generation of macroeconomic models with a prominent role for financial factors.

The Analytics of the Greek Crisis, by Pierre-Olivier Gourinchas, Thomas Philippon, and Dimitri Vayanos: April 13, 2016 Abstract This paper presents an interim and analytical report on the Greek Crisis of 2010. The Greek crisis presents a number of important features that set it apart from the typical sudden stop, sovereign default, or lending boom/bust episodes of the last quarter century. We provide an analytical account of the Greek crisis using a rich model designed to capture the main financial and macro linkages of a small open economy. Using the model to parse through the wreckage, we uncover the following main findings: (a) Greece experienced a more prolonged and severe decline in output per capita than almost any crisis on record since 1980; (b) the crisis was significantly backloaded, thanks to important financial assistance mechanisms; (c) a sizable share of the crisis was the consequence of the sudden stop that started in late 2009; (d) the severity of the crisis was compounded by elevated initial levels of exposure (external debt, public debt, domestic credit), vastly in excess of levels observed in typical emerging economies. In summary: Greece experienced a typical Emerging Market Sudden Stop crisis, with the initial exposure levels of an Advanced Economy.

Jump-Starting the Euro Area Recovery: Would a Rise in Core Fiscal Spending Help the Periphery?, by Olivier Blanchard, Christopher J. Erceg, and Jesper Lindé: March 24, 2016 Abstract We show that a fiscal expansion by the core economies of the euro area would have a large and positive impact on periphery GDP assuming that policy rates remain low for a prolonged period. Under our preferred model specification, an expansion of core government spending equal to one percent of euro area GDP would boost periphery GDP around 1 percent in a liquidity trap lasting three years, nearly half as large as the effect on core GDP. Accordingly, under a standard ad hoc loss function involving output and inflation gaps, increasing core spending would generate substantial welfare improvements, especially in the periphery. The benefits are considerably smaller under a utility-based welfare measure, reflecting in part that higher net exports play a material role in raising periphery GDP.

Thursday, February 11, 2016

'Does Inequality Cause Financial Distress?'

This is from a Federal Reserve Bank of Philadelphia Working Paper:

Does Inequality Cause Financial Distress? Evidence from Lottery Winners and Neighboring Bankruptcies, by Sumit Agarwal, Vyacheslav Mikhed, and Barry Scholnick: Abstract We test the hypothesis that income inequality causes financial distress. To identify the effect of income inequality, we examine lottery prizes of random dollar magnitudes in the context of very small neighborhoods (13 households on average). We find that a C$1,000 increase in the lottery prize causes a 2.4% rise in subsequent bankruptcies among the winners’ close neighbors. We also provide evidence of conspicuous consumption as a mechanism for this causal relationship. The size of lottery prizes increases the value of visible assets (houses, cars, motorcycles), but not invisible assets (cash and pensions), appearing on the balance sheets of neighboring bankruptcy filers.
Download Full text.

Wednesday, February 10, 2016

'The Cap-and-Trade Sulfur Dioxide Allowances Market Experiment'

From the NBER Digest:

The Cap-and-Trade Sulfur Dioxide Allowances Market Experiment: The Acid Rain Program led to higher levels of premature mortality than would have occurred under a hypothetical no-trade counterfactual with the same overall sulfur dioxide emissions.

Since the passage of the Clean Air Act Amendments of 1990, the federal government has pursued a variety of policies designed to reduce the level of sulfur dioxide emissions from coal-fired power plants and the associated acid rain. In The Market for Sulfur Dioxide Allowances: What Have We Learned from the Grand Policy Experiment? (NBER Working Paper No. 21383), H. Ron Chan, B. Andrew Chupp, Maureen L. Cropper, and Nicholas Z. Muller evaluate the cost savings and the health consequences of relying on a cap-and-trade sulfur dioxide allowance market to implement emissions reductions.
The key argument advanced by proponents of cap-and-trade programs for pollution reduction is that they are less costly than regulatory programs that impose the same abatement requirements on all polluters. By allowing emission sources with high abatement costs to offset higher on-site emissions by purchasing additional reductions from other, lower-cost polluters, they argue, trade in pollution allowances reduces the total cost of achieving a given reduction in aggregate emissions.
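The least-cost logic can be illustrated with two hypothetical plants facing quadratic abatement costs: trading shifts abatement toward the cheap plant until marginal costs equalize. The cost parameters below are invented for illustration and are unrelated to the paper's estimates:

```python
# Two hypothetical plants with abatement cost C_i(a) = c_i * a**2 / 2,
# so marginal cost is c_i * a. Society needs A tons of total abatement.
c1, c2 = 1.0, 4.0   # assumed: plant 2 abates at 4x plant 1's marginal cost
A = 100.0

def total_cost(a1, a2):
    return c1 * a1**2 / 2 + c2 * a2**2 / 2

# Uniform standard: each plant must abate the same amount.
uniform = total_cost(A / 2, A / 2)

# Trading: allowances change hands until marginal costs equalize,
# c1 * a1 == c2 * a2, subject to a1 + a2 == A.
a1 = A * c2 / (c1 + c2)
trading = total_cost(a1, A - a1)

print(f"uniform: {uniform:.0f}, trading: {trading:.0f}, "
      f"savings: {1 - trading / uniform:.0%}")
```

The same aggregate abatement A is delivered in both cases; trading is cheaper only because it reallocates who abates, which is also why it can concentrate residual emissions (and local health damages) near the high-cost plants, the drawback discussed below.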
To study the cost savings associated with the Acid Rain Program, which allowed such trade, the authors model the cost of abatement for individual coal-fired power plants. They estimate how firms choose between the two leading technologies for sulfur dioxide abatement, burning low-sulfur coal and installing flue-gas desulfurization units. They use these estimates to compare abatement decisions corresponding to the Acid Rain Program and standards that achieve the same aggregate reduction in emissions by making uniform requirements on coal-fired plants, with no trading allowed. They find cost savings in 2002, with the Acid Rain Program in full swing, of approximately $250 million from trade in emission allowances. This is less than half of the previously estimated saving from tradable permits. The data suggest that many generating units were not complying with the Clean Air Act in the most economical manner.
One potential drawback of a cap-and-trade system is that in some areas the level of local pollutants — those which pose the greatest health threat near their place of emission — can be higher than under uniform emission standards. This could occur if, for example, utilities in the densely populated eastern United States, where emission reduction can be comparatively costly, pay utilities in less-populous western regions, where abatement is cheaper, to cut emissions there. The aggregate national reduction may still be achieved, but many more people in the densely populated east could be exposed to pollutants.
The researchers find a greater level of particulate air pollution and associated premature mortality under the Acid Rain Program than under a hypothetical no-trade scenario in which units emitted SO2 at a rate equal to 2002 allowance allocations plus observed drawdowns of their allowance banks. They estimate the cost of health damages associated with observed SO2 emissions in 2002 under the Acid Rain Program to be $2.4 billion higher than would have been the case under the no-trade scenario. They conclude that the health impact of a cap-and-trade program depends on how the program is structured and on the correlation between marginal abatement costs and marginal damages across pollution sources.

Wednesday, February 03, 2016

'How Successful Was the New Deal?'

From the NBER:

How Successful Was the New Deal? The Microeconomic Impact of New Deal Spending and Lending Policies in the 1930s, by Price V. Fishback, NBER Working Paper No. 21925 Issued in January 2016: Abstract The New Deal during the 1930s was arguably the largest peace-time expansion in federal government activity in American history. Until recently there had been very little quantitative testing of the microeconomic impact of the wide variety of New Deal programs. Over the past decade scholars have developed new panel databases for counties, cities, and states and then used panel data methods on them to examine the impact of New Deal spending and lending policies for the major New Deal programs. In most cases the identification of the effect comes from changes across time within the same geographic location after controlling for national shocks to the economy. Many of the studies also use instrumental variable methods to control for endogeneity. The studies find that public works and relief spending had state income multipliers of around one, increased consumption activity, attracted internal migration, reduced crime rates, and lowered several types of mortality. The farm programs typically aided large farm owners but eliminated opportunities for share croppers, tenants, and farm workers. The Home Owners’ Loan Corporation’s purchases and refinancing of troubled mortgages staved off drops in housing prices and home ownership rates at relatively low ex post cost to taxpayers. The Reconstruction Finance Corporation’s loans to banks and railroads appear to have had little positive impact, although the banks were aided when the RFC took ownership stakes.

(I couldn't find an open link.)

Monday, November 16, 2015

'Inflation and Activity – Two Explorations and their Monetary Policy Implications'

Olivier Blanchard, Eugenio Cerutti, and Lawrence Summers (the results are preliminary):

Inflation and Activity – Two Explorations and their Monetary Policy Implications Olivier Blanchard, Eugenio Cerutti, and Lawrence Summers NBER Working Paper No. 21726 November 2015: Introduction: We explore two empirical issues triggered by the Great Financial Crisis. First, in most advanced countries, output remains far below the pre-recession trend, leading researchers to revisit the issue of hysteresis... Second, while inflation has decreased, it has decreased less than was anticipated (an outcome referred to as the “missing disinflation’’), leading researchers to revisit the relation between inflation and activity.
Clearly, if confirmed, either the presence of hysteresis or the deterioration of the relation between inflation and activity would have major implications for monetary policy and for stabilization policy more generally. ...
First, we revisit the hysteresis hypothesis, defined as the hypothesis that recessions may have permanent effects on the level of output relative to trend. ... We find that a high proportion of recessions, about two-thirds, are followed by lower output relative to the pre-recession trend even after the economy has recovered. Perhaps more surprisingly, in about one-half of those cases, the recession is followed not just by lower output, but by lower output growth relative to the pre-recession output trend. That is, as time passes following recessions, the gap between output and projected output on the basis of the pre-recession trend increases. ...
Turning to the Phillips curve relation, we ... find clear evidence that the effect of the unemployment gap on inflation has substantially decreased since the 1970s. Most of the decrease, however, took place before the early 1990s. Since then, the coefficient appears to have been stable, and, in most cases, significant...
Finally, in the last section, we explore the implications of our findings for monetary policy. The findings of the second section have opposite implications for monetary policy... To the extent that recessions are due to the perception or anticipation of lower underlying growth, this implies that estimates of potential output, based on the assumption of an unchanged underlying trend, may be too optimistic, and lead to too strong a policy response to movements in output. However, to the extent that recessions have hysteresis or super-hysteresis effects, the increased cost of allowing downward movements in output in response to shifts in demand implies that a stronger response to output gaps is desirable.
The findings of the third section yield less dramatic conclusions. To the extent that the coefficient on the unemployment gap, while small, remains significant, the implication is that, within an inflation targeting framework, the interest rate rule should put more weight on the output gap relative to inflation. ...

Wednesday, November 11, 2015

'Even Famous Female Economists Get No Respect'

Bit behind today. This is by Justin Wolfers:

Even Famous Female Economists Get No Respect: Men’s voices tend to dominate economic debate, although perhaps this is shaped by how we talk about the contributions of female economists. This is easiest to see in how we discuss the work of economist power couples.
Remembering the journalistic cliché that one is an example, two is a coincidence and three is a trend, I figured it worth exploring how female economists are treated. ...

Monday, October 26, 2015

'Economic Cycles in Ancient China'

From the NBER:

Economic Cycles in Ancient China, by Yaguang Zhang, Guo Fan, and John Whalley, NBER Working Paper No. 21672 Issued in October 2015: We discuss business cycles in ancient China. Data on ancient China's business cycles are sparse and incomplete, so our discussion is qualitative rather than quantitative. Essentially, ancient debates focused on two types of cycles: long-run political or dynastic cycles of many decades, and short-run nature-induced cycles. Discussion of the latter shows strong parallels to Jevons’ conception of sun spot cycles. The former have no clear contemporary analogue, were often deep in impact, and of long duration. The discussion of both focused on agricultural economies. Ancient discussion of intervention focused on counter-cyclical measures, including stockpiling, and predated Keynes and the discussion of the 1930s by centuries. Also, a strongly held belief emerged that cycles create their own cycles to follow, and that cycles are part of the inevitable economic order, a view consistent with Mitchell’s view of the business cycle in the 1940s. Current debates on how best to respond to the ongoing global financial crisis draw in part on historical precedents, but these are largely limited to the last 150 years for OECD countries, with a major focus on the 1990s. Here we also probe material on ancient China to see what is relevant.

Monday, October 19, 2015

'How Does Declining Unionism Affect the American Middle Class and Intergenerational Mobility?'

Via the NBER:

How Does Declining Unionism Affect the American Middle Class and Intergenerational Mobility?, by Richard Freeman, Eunice Han, David Madland, Brendan V. Duke, NBER Working Paper No. 21638 [Open Link to Earlier Version]: This paper examines unionism’s relationship to the size of the middle class and its relationship to intergenerational mobility. We use the PSID 1985 and 2011 files to examine the change in the share of workers in a middle-income group (defined by persons having incomes within 50% of the median) and use a shift-share decomposition to explore how the decline of unionism contributes to the shrinking middle class. We also use the files to investigate the correlation between parents’ union status and the incomes of their children. Additionally, we use federal income tax data to examine the geographical correlation between union density and intergenerational mobility. We find: 1) union workers are disproportionately in the middle-income group or above, and some reach middle-income status due to the union wage premium; 2) the offspring of union parents have higher incomes than the offspring of otherwise comparable non-union parents, especially when the parents are low-skilled; 3) offspring from communities with higher union density have higher average incomes relative to their parents compared to offspring from communities with lower union density. These findings show a strong, though not necessarily causal, link between unions, the middle class, and intergenerational mobility.

Friday, October 16, 2015

'Economics and the Modern Economic Historian'

This surprised me. I was under the impression that things are moving in the opposite direction:

Economics and the Modern Economic Historian, by Ran Abramitzky, NBER Working Paper No. 21636, October 2015: Abstract I reflect on the role of modern economic history in economics. I document a substantial increase in the percentage of papers devoted to economic history in the top-5 economic journals over the last few decades. I discuss how the study of the past has contributed to economics by providing ground to test economic theory, improve economic policy, understand economic mechanisms, and answer big economic questions. Recent graduates in economic history appear to have roughly similar prospects to those of other economists in the economics job market. I speculate how the increase in availability of high quality micro level historical data, the decline in costs of digitizing data, and the use of computationally intensive methods to convert large-scale qualitative information into quantitative data might transform economic history in the future.

From the introduction to the paper:

... This sense that economists “believe history to be of small and diminishing interest” was made clear ... in 1976, when McCloskey wrote in defense of economic history a paper entitled “Does the past have useful economics?”. McCloskey concluded that the average American economist answers “no”. McCloskey showed a sharp decline in the publication of economic history papers in the top economic journals (AER, QJE, JPE). It was clear that “…this older generation of American economists did not persuade many of the younger that history is essential to economics.” ...

Today, thirty years later, economic history is far from being marginalized and overlooked by economists. To be sure, economic history remains a small field within economics, but the average economist today would answer a “yes” to the question of whether the past has useful economics. Economists increasingly recognize that historical events shape current economic development, and that current modern economies were once upon a time developing and their experience might be relevant for current developing countries. Recent debates in the US and Europe about immigration policies renewed interest in historical migration episodes. Most notably, the Great Recession of 2007-08 reminded economists of the Great Depression and other historic financial crises. Macroeconomic historian Christina Romer, a Great Depression expert, became the chief advisor of President Obama. Indeed, Barry Eichengreen, himself an expert on financial crises in history, started his 2011 presidential address by saying that “this has been a good crisis for economic history.”

 That economic history today is more respected and appreciated by the average economist is also reflected by an increase in economic history publications in the top-5 economic journals. The decline in economic history in the top-3 journals that McCloskey documented has been reversed...

Friday, October 09, 2015

'Resurrecting the Role of the Product Market Wedge in Recessions'

Some of you might find this interesting:

Resurrecting the Role of the Product Market Wedge in Recessions Mark Bils, Peter J. Klenow, and Benjamin A. Malin: Abstract Employment and hours appear far more cyclical than dictated by the behavior of productivity and consumption. This puzzle has been called “the labor wedge” — a cyclical intratemporal wedge between the marginal product of labor and the marginal rate of substitution of consumption for leisure. The intratemporal wedge can be broken into a product market wedge (price markup) and a labor market wedge (wage markup). Based on the wages of employees, the literature has attributed the intratemporal wedge almost entirely to labor market distortions. Because employee wages may be smoothed versions of the true cyclical price of labor, we instead examine the self-employed and intermediate inputs, respectively. Looking at the past quarter century in the United States, we find that price markup movements are at least as important as wage markup movements — including during the Great Recession and its aftermath. Thus, sticky prices and other forms of countercyclical markups deserve a central place in business cycle research, alongside sticky wages and matching frictions.

Download Full text.

Monday, September 28, 2015

'Cheap Talk, Round Numbers, and Signaling Behavior'

Advice about selling goods on eBay from the NBER Digest:

Cheap Talk, Round Numbers, and Signaling Behavior: In the marketplace for ordinary goods, buyers and sellers have many characteristics that are hidden from each other. From the seller's perspective, it may be beneficial to reveal some of these characteristics. For example, a patient seller may want to signal unending willingness to wait in order to secure a good deal. At the same time, an impatient seller may want to signal a desire to sell a good quickly, albeit at a lower price.

This insight is at the heart of Cheap Talk, Round Numbers, and the Economics of Negotiation (NBER Working Paper No. 21285) by Matthew Backus, Thomas Blake, and Steven Tadelis. The authors show that sellers on eBay behave in a fashion that is consistent with using round numbers as signals of impatience.
The authors analyze data from eBay's bargaining platform using its collectibles category—coins, antiques, toys, memorabilia, and the like. The process is one of sequential offers not unlike haggling in an open-air market. A seller lists an initial price, to which buyers may make counteroffers, to which sellers may make counteroffers, and so on. If a price is agreed upon, the good sells. The authors analyze 10.5 million listed items, out of which 2.8 million received offers and 2.1 million ultimately sold. Their key finding is that items listed at multiples of $100 receive lower offers on average than items listed at nearby prices, ultimately selling for 5 to 8 percent less.
It is tempting to label such behavior a mistake. However, items listed at these round numbers receive offers 6 to 11 days sooner and are 3 to 5 percent more likely to sell than items listed at "precise" numbers. Furthermore, even experienced sellers frequently list items at round numbers, suggesting it is an equilibrium behavior best modeled by rationality rather than seller error. It appears that impatient sellers are able to signal their impatience and are happy to do it, even though it nets them a lower price.
One concern with the analysis is that round-number pricing might provide a signal about the good being sold, rather than about the person or firm selling it. To address this issue, the authors use data on goods originally posted with prices in British pounds. These prices are automatically translated to U.S. dollars for the American market. Hence, the authors can test what happens when goods intended to be sold at round numbers are, in fact, listed at non-round numbers. This removes the round-number signal while holding the good's features constant. In this setting, they find that goods priced at non-round dollar amounts systematically sell for higher prices, though the effect is not as strong as in their primary sample. This evidence indicates that the round numbers themselves have a significant effect on bargaining outcomes.
The authors find additional evidence on the round-number phenomenon in the real estate market in Illinois from 1992 to 2002. This is a wholly different market than that for eBay collectibles, with much higher prices and with sellers typically receiving advice from professional listing agents. But here, too, there is evidence that round-number listings lead to lower sales prices. On average, homes listed at multiples of $50,000 sold for $600 less.

'The Wage Impact of the Marielitos: A Reappraisal'

No sense hiding from evidence that works against my support of immigration. This is from George Borjas (if you are unfamiliar with the Mariel boatlift, see here):

The Wage Impact of the Marielitos: A Reappraisal, by George J. Borjas, NBER Working Paper No. 21588 [open link]: This paper brings a new perspective to the analysis of the Mariel supply shock, revisiting the question and the data armed with the accumulated insights from the vast literature on the economic impact of immigration. A crucial lesson from this literature is that any credible attempt to measure the wage impact of immigration must carefully match the skills of the immigrants with those of the pre-existing workforce. The Marielitos were disproportionately low-skill; at least 60 percent were high school dropouts. A reappraisal of the Mariel evidence, specifically examining the evolution of wages in the low-skill group most likely to be affected, quickly overturns the finding that Mariel did not affect Miami’s wage structure. The absolute wage of high school dropouts in Miami dropped dramatically, as did the wage of high school dropouts relative to that of either high school graduates or college graduates. The drop in the relative wage of the least educated Miamians was substantial (10 to 30 percent), implying an elasticity of wages with respect to the number of workers between -0.5 and -1.5. In fact, comparing the magnitude of the steep post-Mariel drop in the low-skill wage in Miami with that observed in all other metropolitan areas over an equivalent time span between 1977 and 2001 reveals that the change in the Miami wage structure was a very unusual event. The analysis also documents the sensitivity of the estimated wage impact to the choice of a placebo. The measured impact is much smaller when the placebo consists of cities where pre-Mariel employment growth was weak relative to Miami.
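The implied-elasticity arithmetic in the abstract can be checked directly. The sketch below assumes the Mariel influx raised Miami's high-school-dropout workforce by roughly 20 percent, a figure commonly cited for the episode but not taken from the paper's tables:

```python
# Implied wage elasticity: (% change in wages) / (% change in labor supply).
# Assumption (mine, for illustration): the Mariel influx raised Miami's
# high-school-dropout workforce by roughly 20 percent.
supply_shock = 0.20

# The abstract quotes a 10 to 30 percent relative wage decline.
implied = {wage_drop: -wage_drop / supply_shock for wage_drop in (0.10, 0.30)}
for wage_drop, elasticity in implied.items():
    print(f"wage drop {wage_drop:.0%} -> implied elasticity {elasticity:+.1f}")
```

Under that assumed supply shock, the 10 to 30 percent wage decline maps to the elasticity range of -0.5 to -1.5 reported in the abstract.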

Monday, September 21, 2015

The Lifecycle of Scholarly Articles across Fields of Economic Research

This might interest some of you:

The lifecycle of scholarly articles across fields of economic research, by Sebastian Galiani, Ramiro Gálvez, Maria Victoria Anauati, Vox EU: Citation counts stand as the de facto methodology for measuring the influence of scholarly articles in today’s economics profession. Nevertheless, a great deal of criticism has been made of the practice of naively using citation analysis to compare the impact of scholarly articles without taking into account other factors which may affect citation patterns (see Bornmann and Daniel 2008).
One recurrent criticism focuses on ‘field-dependent factors’... In a recent paper (Anauati et al. 2015), we analyze whether the ‘field-dependent factors’ critique is also valid for fields of research inside economics. Our approach began by assigning every paper published in the top five economics journals – The American Economic Review, Econometrica, the Journal of Political Economy, The Quarterly Journal of Economics, and The Review of Economic Studies – to one of four fields of economic research (applied, applied theory, econometric methods, and theory).
The sample consisted of 9,672 articles published in the top five journals between 1970 and 2000. It did not include notes, comments, announcements or American Economic Review Papers and Proceedings issues. ...

What did they find?:

Conclusions Even though citation counts are an extremely valuable tool for measuring the importance of academic articles, the patterns observed for the lifecycles of papers across fields of economic research support the ‘field-dependent factors’ critique inside this discipline. The evidence seems to provide a basis for a caveat regarding the use of citation counts as a ‘one-size-fits-all’ yardstick to measure research outcomes in economics across fields of research, as the incentives generated by their use can be detrimental to fields of research which effectively generate valuable (but perhaps more specialized) knowledge, not only in economics but in other disciplines as well.
According to our findings, pure theoretical economic research is the clear loser in terms of citation counts. Therefore, if specialized journals' impact factors are calculated solely on the basis of citations during the first years after an article’s publication, then theoretical research will clearly not be attractive to departments, universities or journals that are trying to improve their rankings or to researchers who use their citation records when applying for better university positions or for grants. The opposite is true for applied papers and applied theory papers – these fields of research are the outright winners when citation counts are used as a measurement of articles' importance, and their citation patterns over time are highly attractive for all concerned. Econometric method papers are a special case; their citation patterns vary a great deal across different levels of success.

Saturday, September 19, 2015

Unemployment Insurance and Progressive Taxation as Automatic Stabilizers

Some preliminary results from a working paper by Alisdair McKay and Ricardo Reis:

Optimal Automatic Stabilizers, by Alisdair McKay and Ricardo Reis: 1 Introduction How generous should the unemployment insurance system be? How progressive should the tax system be? These questions have been studied extensively and there are well-known trade-offs between social insurance and incentives. Typically these issues are explored in the context of a stationary economy. These policies, however, also serve as automatic stabilizers that alter the dynamics of the business cycle. The purpose of this paper is to ask how and when aggregate stabilization objectives call for, say, more generous unemployment benefits or a more progressive tax system than would be desirable in a stationary economy. ...
We consider two classic automatic stabilizers: unemployment benefits and progressive taxation. Both of these policies have roles in redistributing income and in providing social insurance. Redistribution affects aggregate demand in our model because households differ in their marginal propensities to consume. Social insurance affects aggregate demand through precautionary savings decisions because markets are incomplete. In addition to unemployment insurance and progressive taxation, we also consider a fiscal rule that makes government spending respond automatically to the state of the economy.
Our focus is on the manner in which the optimal fiscal structure of the economy is altered by aggregate stabilization concerns. Increasing the scope of the automatic stabilizers can lead to welfare gains if they raise equilibrium output when it would otherwise be inefficiently low and vice versa. Therefore, it is not stabilization per se that is the objective but rather eliminating inefficient fluctuations. An important aspect of the model specification is therefore the extent of inefficient business cycle fluctuations. Our model generates inefficient fluctuations because prices are sticky and monetary policy cannot fully eliminate the distortions. We show that in a reasonable calibration, more generous unemployment benefits and more progressive taxation are helpful in reducing these inefficiencies. Simply put, if unemployment is high when there is a negative output gap, a larger unemployment benefit will stimulate aggregate demand when it is inefficiently low thereby raising welfare. Similarly, if idiosyncratic risk is high when there is a negative output gap, providing social insurance through more progressive taxation will also increase welfare....

Tuesday, September 15, 2015

'Market Power in Healthcare'

Via Austin Frakt at The Incidental Economist (I shortened the summaries):

Market Power: Recent NBER publications by Laurence Baker, M. Kate Bundorf, and Daniel Kessler:

1) “The Effect of Hospital/Physician Integration on Hospital Choice“:

We find that a hospital’s ownership of an admitting physician dramatically increases the probability that the physician’s patients will choose the owning hospital. We also find that ownership of an admitting physician has large effects on how the hospital’s cost and quality affect patients’ hospital choice. Patients whose admitting physician is not owned by a hospital are more likely to choose facilities that are low cost and high quality. ... We conclude that hospital/physician integration affects patients’ hospital choices in a way that is inconsistent with their best interests.

2) “Does Health Plan Generosity Enhance Hospital Market Power?” :

To what extent does the generosity of health insurance coverage facilitate the exercise of market power by producers of health services?  […]

We find a statistically significant and economically important effect of plan generosity on hospital prices in uncompetitive markets. ...

Monday, September 07, 2015

'Support for Redistribution in an Age of Rising Inequality: New Stylized Facts and Some Tentative Explanations'

From the NBER:

Support for Redistribution in an Age of Rising Inequality: New Stylized Facts and Some Tentative Explanations, by Vivekinan Ashok, Ilyana Kuziemko, and Ebonya Washington, NBER Working Paper No. 21529 Issued in September 2015 [open link to earlier version]: Despite the large increases in economic inequality since 1970, American survey respondents exhibit no increase in support for redistribution, in contrast to the predictions from standard theories of redistributive preferences. We replicate these results but further demonstrate substantial heterogeneity by demographic groups. In particular, the two groups who have most moved against income redistribution are the elderly and African-Americans. We find little evidence that these subgroup trends are explained by relative economic gains or growing cultural conservatism, two common explanations. We further show that the elderly trend is uniquely American, at least relative to other developed countries with comparable survey data. While we are unable to provide definitive evidence on the cause of these two groups' declining redistributive support, we offer additional correlations which may offer fruitful directions for future research on the topic. One story consistent with the data on elderly trends is that older Americans worry that redistribution will come at their expense, in particular via cuts to Medicare. We find that the elderly have grown increasingly opposed to government provision of health insurance and that controlling for this tendency explains about 40% of their declining support for redistribution. For blacks, controlling for their declining support of race-targeted aid explains nearly 45% of their differential decline in redistributive preferences (raising the question of why support for race-targeted aid has fallen during a period when black economic catch-up to whites has stalled).

Monday, August 31, 2015

'The Effect of Payday Lending Restrictions on Liquor Sales'

This is a summary of new research from two of our former graduate students here at the University of Oregon, Harold Cuffe and Chris Gibbs (link to full paper):

The effect of payday lending restrictions on liquor sales – Synopsis, by Harold Cuffe and Chris Gibbs: The practice of short-term consumer financing known as payday lending remains controversial because the theoretical gains in welfare from greater credit access stand in opposition to anecdotal evidence that many borrowers are made worse off. Advocates for the industry assert that the loans fill a gap in credit access for underserved individuals facing temporary financial hardship. Opponents, who include many state legislatures and the Obama administration, argue that lenders target financially vulnerable individuals with little ability to pay down their principal, who may end up paying many times the borrowed amount in interest and fees.
Regulations restricting both payday loan and liquor access seek to minimize the potential for overuse. To justify intervention in the two markets, policy makers note a host of negative externalities associated with each product, and cite behavioral motivations underlying individuals' consumption decisions. In particular, researchers have shown that the same models of impulsivity and dynamically inconsistent decision making - hyperbolic preferences and the cue theory of consumption - used to describe the demand for alcohol, also describe patterns of payday loan usage. In these models, individuals can objectively benefit from a restricted choice set that limits their access to loans and liquor. The overlap in the behavioral characteristics of over-users of both products suggests that liquor sales are a reasonable and interesting setting in which to test the effectiveness of payday lending regulations.
To identify the causal effect of lending restrictions on liquor sales, we exploit a change in payday lending laws in the State of Washington. Leveraging lender- and liquor store-level data, we estimate a difference-in-differences model comparing Washington to the neighboring State of Oregon, which did not experience a change in payday lending laws during this time. We find that the law change leads to a significant reduction in liquor sales, with the largest decreases occurring at liquor stores located very near to payday lenders at the time the law took effect. Our results provide compelling evidence on how credit constraints affect consumer spending, suggest a behavioral mechanism that may underlie some payday loan usage, and provide evidence that Washington's payday lending regulations reduced one form of loan misuse.
Washington State enacted HB 1709 on January 1, 2010, which introduced three major new restrictions on the payday loan industry. First, the law limited the size of a payday loan to 30% of a person's monthly income or $700, whichever is less. Second, the law created a state-wide database to track the issuance of payday loans, set a hard cap of eight on the number of loans an individual could obtain in a twelve-month period, and eliminated multiple concurrent loans, effectively prohibiting the repayment of an existing loan with a new one. In the year prior to the law, the State of Washington estimated that roughly one third of all payday loan borrowers took out more than eight loans. Finally, the law mandated that borrowers were entitled to a 90-day installment plan to pay back loans of $400 or less, or 180 days for loans over $400.
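The loan-size and installment rules reduce to a couple of one-liners; a minimal sketch (function names are mine, not from the statute):

```python
def max_payday_loan(monthly_income):
    """Loan-size cap under the 2010 rules: the lesser of 30% of
    monthly income or $700."""
    return min(0.30 * monthly_income, 700.0)

def installment_days(loan_amount):
    """Mandated installment-plan length: 90 days for loans of $400
    or less, 180 days for larger loans."""
    return 90 if loan_amount <= 400 else 180

print(max_payday_loan(2000))   # 30% of income binds: 600.0
print(max_payday_loan(3000))   # the $700 ceiling binds: 700.0
print(installment_days(500))   # 180
```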
The effect of the law on the industry was severe. There were 603 payday loan locations active in Washington in 2009, responsible for 3.24 million loans worth $1.366 billion according to the Washington Division of Financial Institutions. In the year following the law change, the number of payday lenders dropped to 424, and loan volume fell to 1.09 million loans worth only $434 million. The following year, the number of locations fell again, to 256, with a volume of roughly 900,000 loans worth $330 million. Today there are fewer than 200 lenders in Washington, and total loan volume and value have stabilized close to their 2011 values.
A crucial feature of our estimation strategy is accounting for potentially endogenous supply-side factors, which make it difficult to separate changes in demand from stores' responses to the law. To do so, we focus on liquor control states, in which the state determines the number and location of liquor stores and the products offered, and harmonizes prices across stores to regulate and restrict liquor access. Oregon and Washington were both liquor control states until Washington privatized liquor sales in June 2012.
Main Results
For this study, we use monthly store-level sales data provided by Oregon's and Washington's respective liquor control agencies from July 2008 through March 2012. Figure 4 plots estimated residuals from a regression of log liquor store sales on a set of store-by-month fixed effects, averaged over state and quarter. The graph has three notable features. First, prior to Washington's lending restrictions (indicated by the vertical dashed line), the states' log sales trend in parallel, confirming the plausibility of the ``common trends'' assumption of the DD model. Second, a persistent gap in the states' sales appears in the same quarter as the law change. This gap results from a relatively large downward movement in Washington's sales compared to Oregon's, consistent with a negative effect of the law on sales. Finally, the effect appears to be primarily a level shift, as sales in both states maintain a common upward trend.
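In its simplest 2x2 form, the difference-in-differences estimate behind a graph like Figure 4 is a single double difference of mean log sales. A sketch with made-up numbers (these are illustrative values, not the paper's data):

```python
# Mean log liquor sales by state and period; values are invented for
# illustration.
mean_log_sales = {
    ("WA", "pre"): 11.90, ("WA", "post"): 11.86,
    ("OR", "pre"): 11.50, ("OR", "post"): 11.50,
}

# DD estimate: Washington's pre-to-post change minus Oregon's change,
# which nets out the common trend the two states share.
dd = ((mean_log_sales[("WA", "post")] - mean_log_sales[("WA", "pre")])
      - (mean_log_sales[("OR", "post")] - mean_log_sales[("OR", "pre")]))
# dd is about -0.04 here: roughly a 4% fall in WA sales relative to OR
```

The full specification in the paper adds store-by-month fixed effects and controls, but the logic of netting out the common trend is the same.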


Our regression estimates indicate that the introduction of payday lending restrictions reduced liquor store sales by approximately 3.6% (statistically significant at the 1% level). As average Washington liquor sales were approximately $163,000 per store in the months prior to the law change, this represents a $5,900 decline per store each month. At the state level, the point estimate implies a $23.5 million annual decrease in liquor sales. As Washington State reported that the law decreased payday loans by $932 million from 2009 to 2010, this decline represents approximately 2.5% of the change in the total value of loans issued.
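The dollar figures follow mechanically from the point estimate; a quick back-of-the-envelope check using only the numbers quoted above:

```python
effect = 0.036                # estimated proportional decline in sales
avg_monthly_sales = 163_000   # average per-store monthly sales, pre-law
per_store_monthly = effect * avg_monthly_sales   # about $5,900 per store

statewide_annual = 23.5e6     # implied statewide annual decline in sales
loan_decline = 932e6          # reported fall in payday lending, 2009-2010
share_of_loans = statewide_annual / loan_decline  # about 2.5%
```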
We see two primary explanations (not mutually exclusive) for the decline in Washington liquor sales in response to the law change. First, the effect may represent a wider permanent reduction in consumption as households lose their ability to cope with unforeseen negative income shocks. Alternatively, the drop in spending may indicate a more direct financing of liquor purchases by individuals with present-biased preferences. The first explanation implies that restrictions on payday lending negatively affect consumer welfare, while the second allows for a positive impact, since individuals with present-biased preferences may be made objectively better off with a restricted choice set.
Zinman (2013) highlights Laibson's (2001) theory of Pavlovian cues as a particularly intriguing explanation for payday loan usage. In these models, consumer ``impulsivity'' makes instant gratification a special case during dynamic utility maximization, where exposure to a cue can explain dynamically inconsistent behavior. Indeed, Laibson uses liquor as a prime example of a consumption good thought to be influenced by cues, and subsequent experimental research on liquor finds evidence consistent with this hypothesis (MacKillop et al. (2010)). Where payday lenders locate very near to liquor stores, individuals may be exposed to a cue for alcohol and then see the lender as a means to satisfy the urge to make an immediate purchase. A lender and liquor store separated by even a brief walk may be far enough apart to allow an individual to resist the urge to obtain both the loan and the liquor. Of course, the cue-theory of consumption makes lender-liquor store distance relevant even when individuals experience a cue only after borrowing: lenders locating near liquor stores increase the likelihood that an individual exposed to a cue is financially liquid and able to act on an impulse.
To investigate liquor store and lender proximity, we geocode the stores' and lenders' street addresses and calculate walking distances for all liquor store-lender pairs within two kilometers of one another. We then repeatedly estimate our preferred specification, with a full set of controls, on an ever-expanding window of liquor stores: first those located within a ten-meter walking distance of a lender in the month prior to the law change, then within 100 meters, within 200 meters, and so on, out to two kilometers. These estimates are presented in Figure 5. The graph shows a negative effect of 9.2% on those liquor stores that had a payday lender within ten meters in the month before the law change (significant at the 1% level), almost three times as large as the overall effect. This larger effect declines rapidly with distance, suggesting that even a small degree of separation matters. The degree of nonlinearity in the relationship between distance and liquor sales supports the behavioral explanation of demand.
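The expanding-window exercise can be sketched as a loop over distance cutoffs, re-estimating on each subsample. Here `estimate_dd`, the store records, and the per-store effects are all stand-ins for the paper's full specification and data:

```python
def estimate_dd(stores):
    # Placeholder for the paper's DD regression with full controls:
    # here, just the mean effect across the stores in the window.
    return sum(s["effect"] for s in stores) / len(stores)

# Hypothetical stores with walking distance (meters) to the nearest
# payday lender in the month before the law change.
stores = [
    {"dist_m": 8,    "effect": -0.092},
    {"dist_m": 150,  "effect": -0.060},
    {"dist_m": 900,  "effect": -0.040},
    {"dist_m": 1800, "effect": -0.036},
]

# Cutoffs of 10m, 100m, 200m, ..., out to two kilometers.
cutoffs = [10, 100] + list(range(200, 2001, 100))
estimates = {c: estimate_dd([s for s in stores if s["dist_m"] <= c])
             for c in cutoffs
             if any(s["dist_m"] <= c for s in stores)}
# Plotting `estimates` against cutoff distance would reproduce the shape
# of Figure 5: a large effect near zero distance that fades quickly.
```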


Our analysis provides the first empirical evidence of the connection between payday lending and spending on liquor. We uncover a clear reduction in liquor sales resulting from payday lending restrictions. In addition, we find that those liquor stores located very near to lenders at the time of the law change experience declines in sales almost three times as large as the overall average.
This finding is significant because it highlights that a segment of borrowers may be willing to assume significant financial risk by borrowing in order to engage in alcohol consumption, an activity that carries significant personal risk of its own. The connection between payday lending restrictions and reduced liquor purchases therefore suggests that the benefits of payday lending restrictions extend beyond personal finance and may be large.
Effective payday loan regulation should recognize the potential for greater credit access to help or harm consumers. As Carrell and Zinman (2014) highlight, heterogeneity likely exists within the pool of payday loan users, and external factors will influence the ratio of ``productive and counter-productive borrowers.'' Lending restrictions can seek to reduce the proportion of counterproductive borrowers through the prohibition of practices known to harm consumers, including those that rely upon leveraging behavioral responses such as addiction and impulsivity. The behavioral overlap identified in the literature between counterproductive payday loan borrowers and heavy alcohol users suggests that there exists a link between the two markets. The decline in liquor sales documented here provides evidence that these regulations may be effective in promoting productive borrowing.
1. Carrell, Scott and Jonathan Zinman, “In harm's way? Payday loan access and military personnel performance," Review of Financial Studies, 2014, 27(9), 2805-2840.
2. Laibson, David, “A cue-theory of consumption," Quarterly Journal of Economics, 2001, pp. 81-119.
3. MacKillop, James, Sean O'Hagen, Stephen A Lisman, James G Murphy, Lara A Ray, Jennifer W Tidey, John E McGeary, and Peter M Monti, “Behavioral economic analysis of cue-elicited craving for alcohol," Addiction, 2010, 105 (9), 1599-1607.
4. Zinman, Jonathan, “Consumer Credit: Too Much or Too Little (or Just Right)?," Working Paper 19682, National Bureau of Economic Research November 2013.

Tuesday, August 25, 2015

'Great Recession Job Losses Severe, Enduring'

Nothing particularly surprising here -- the Great Recession was unusually severe and unusually long, and hence had unusual impacts, but it's good to have numbers characterizing what happened:

Great Recession Job Losses Severe, Enduring: Of those who lost full-time jobs between 2007 and 2009, only about 50 percent were employed in January 2010 and only about 75 percent of those were re-employed in full-time jobs.
The economic downturn that began in December 2007 was associated with a rapid rise in unemployment and with an especially pronounced increase in the number of long-term unemployed. In "Job Loss in the Great Recession and its Aftermath: U.S. Evidence from the Displaced Workers Survey" (NBER Working Paper No. 21216), Henry S. Farber uses data from the Displaced Workers Survey (DWS) from 1984-2014 to study labor market dynamics. From these data he calculates both the short-term and medium-term effects of the Great Recession's sharply elevated rate of job losses. He concludes that these effects have been particularly severe.

Of the workers who lost full-time jobs between 2007 and 2009, Farber reports, only about 50 percent were employed in January 2010 and only about 75 percent of those were re-employed in full-time jobs. This means only about 35 to 40 percent of those in the DWS who reported losing a job in 2007-09 were employed full-time in January 2010. This was by far the worst post-displacement employment experience of the 1981-2014 period.
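The 35 to 40 percent figure is simply the product of the two conditional rates reported above:

```python
reemployed = 0.50       # share of 2007-09 job losers employed in Jan 2010
full_time_share = 0.75  # share of those re-employed who held full-time jobs
full_time_overall = reemployed * full_time_share  # 0.375, about 37.5%
```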
The adverse employment experience of job losers has also been persistent. While both overall employment rates and full-time employment rates began to improve in 2009, even those who lost jobs between 2011 and 2013 had very low re-employment rates and, by historical standards, very low full-time employment rates.
In addition, the data show substantial weekly earnings declines even for those who did find work, although these earnings losses were not especially large by historical standards. Farber suggests that the earnings decline measure from the DWS is appropriate for understanding how job loss affects the earnings that a full-time-employed former job-loser is able to command.
The author notes that the measures on which he focuses may understate the true economic cost of job loss, since they do not consider the value of time spent unemployed or the value of lost health insurance and pension benefits.
Farber concludes that the costs of job losses in the Great Recession were unusually severe and remain substantial years later. Most importantly, workers laid off in the Great Recession and its aftermath have been much less successful at finding new jobs, particularly full-time jobs, than those laid off in earlier periods. The findings suggest that job loss since the Great Recession has had severe adverse consequences for employment and earnings.

'Thinking, Fast and Slow: Efforts to Reduce Youthful Crime in Chicago'

From the NBER Digest:

Thinking, Fast and Slow: Efforts to Reduce Youthful Crime in Chicago: Interventions that get youths to slow down and behave less automatically in high-stakes situations show positive results in three experiments.

Disparities in youth outcomes in the United States are striking. For example, among 15-to-24-year-olds, the male homicide rate in 2013 was 18 times higher for blacks than for whites. Black males lose more years of potential life before age 65 to homicide than to heart disease, America's leading overall killer. A large body of research emphasizes that, beyond institutional factors, choices and behavior contribute to these outcomes. Those choices include decisions around dropping out of high school, involvement with drugs or gangs, and how to respond to confrontations that could escalate to serious violence.
In "Thinking, Fast and Slow? Some Field Experiments to Reduce Crime and Dropout in Chicago" (NBER Working Paper No. 21178), authors Sara B. Heller, Anuj K. Shah, Jonathan Guryan, Jens Ludwig, Sendhil Mullainathan, and Harold A. Pollack explain these behavioral differences using the psychology of automaticity. Because it is mentally costly to think through every situation in detail, all of us have automatic responses to some of the situations we encounter. These responses—automaticity—are tuned to situations we commonly face.
The authors present results from three large-scale, randomized experimental studies carried out in Chicago with economically disadvantaged male youth. All three experiments show sizable behavioral responses to fairly short-duration, automaticity-reducing interventions that get youths to slow down and behave less automatically in high-stakes situations.
The first intervention (called Becoming a Man, or BAM, developed by Chicago-area nonprofit Youth Guidance) involved 2,740 males in the 7th through 10th grades in 18 public schools on the south and west sides of the city. Some youths were offered an automaticity-reducing program once a week during school or an after-school sports intervention developed by Chicago nonprofit World Sport Chicago. The authors find that participation in the programming reduced arrests over the program year for violent crimes by 44 percent, and non-violent, non-property, non-drug crimes by 36 percent. Participation also increased engagement with school, which the authors estimate could translate into gains in graduation rates of between 7 and 22 percent.
A second study of BAM randomly assigned 2,064 male 9th and 10th graders within nine Chicago public high schools to the treatment or to a control condition. The authors found that arrests of youth in the treatment group were 31 percent lower than arrests in the control group.
The third intervention was delivered by trained detention staff to high-risk juveniles housed in the Cook County Juvenile Temporary Detention Center. The curriculum in this program, while different from the first two interventions, also focused on reducing automaticity. Some 5,728 males were randomly assigned to units inside the facility that did or did not implement the program. The authors found that those who received programming were about 16 percent less likely to be returned to the detention center than those who did not.
The sizable impacts the authors observe from all three interventions stand in stark contrast to the poor record of many efforts to improve the long-term life outcomes of disadvantaged youths. As with all randomized experiments, there is the question of whether these impacts generalize to other samples and settings. The interventions considered in this study would not be costly to expand. The authors estimate that the cost of the intervention for each participant in the first two studies was between $1,178 and $2,000. In the third case, the per-participant cost was about $60 per juvenile detainee. The results suggest that expanding these programs may be more cost-effective than other crime-prevention strategies that target younger individuals.
The authors also present results from various survey measures suggesting that the results are not due to changes in mechanisms like emotional intelligence or self-control. On the other hand, results from some decision-making exercises the authors carried out seem to support reduced automaticity as a key mechanism. The results overall suggest that automaticity can be an important explanation for disparities in outcomes.

Monday, August 17, 2015

Stiglitz: Towards a General Theory of Deep Downturns

This is the abstract, introduction, and final section of a recent paper by Joe Stiglitz on theoretical models of deep depressions (as he notes, it's "an extension of the Presidential Address to the International Economic Association"):

Towards a General Theory of Deep Downturns, by Joseph E. Stiglitz, NBER Working Paper No. 21444, August 2015: Abstract This paper, an extension of the Presidential Address to the International Economic Association, evaluates alternative strands of macro-economics in terms of the three basic questions posed by deep downturns: What is the source of large perturbations? How can we explain the magnitude of volatility? How do we explain persistence? The paper argues that while real business cycles and New Keynesian theories with nominal rigidities may help explain certain historical episodes, alternative strands of New Keynesian economics focusing on financial market imperfections, credit, and real rigidities provide a more convincing interpretation of deep downturns, such as the Great Depression and the Great Recession, giving a more plausible explanation of the origins of downturns, their depth and duration. Since excessive credit expansions have preceded many deep downturns, particularly important is an understanding of finance, the credit creation process and banking, which in a modern economy are markedly different from the way envisioned in more traditional models.
Introduction The world has been plagued by episodic deep downturns. The crisis that began in 2008 in the United States was the most recent, the deepest and longest in three quarters of a century. It came in spite of alleged “better” knowledge of how our economic system works, and belief among many that we had put economic fluctuations behind us. Our economic leaders touted the achievement of the Great Moderation.[2] As it turned out, belief in those models actually contributed to the crisis. It was the assumption that markets were efficient and self-regulating and that economic actors had the ability and incentives to manage their own risks that had led to the belief that self-regulation was all that was required to ensure that the financial system worked well, and that there was no need to worry about a bubble. The idea that the economy could, through diversification, effectively eliminate risk contributed to complacency — even after it was evident that there had been a bubble. Indeed, even after the bubble broke, Bernanke could boast that the risks were contained.[3] These beliefs were supported by (pre-crisis) DSGE models — models which may have done well in more normal times, but had little to say about crises. Of course, almost any “decent” model would do reasonably well in normal times. And it mattered little if, in normal times, one model did a slightly better job in predicting next quarter’s growth. What matters is predicting — and preventing — crises, episodes in which there is an enormous loss in well-being. These models did not see the crisis coming, and they had given confidence to our policy makers that, so long as inflation was contained — and monetary authorities boasted that they had done this — the economy would perform well. At best, they can be thought of as (borrowing the term from Guzman (2014)) “models of the Great Moderation,” predicting “well” so long as nothing unusual happens.
More generally, the DSGE models have done a poor job explaining the actual frequency of crises.[4]
Of course, deep downturns have marked capitalist economies since the beginning. It took enormous hubris to believe that the economic forces which had given rise to crises in the past were either not present, or had been tamed, through sound monetary and fiscal policy.[5] It took even greater hubris given that in many countries conservatives had succeeded in dismantling the regulatory regimes and automatic stabilizers that had helped prevent crises since the Great Depression. It is noteworthy that my teacher, Charles Kindleberger, in his great study of the booms and panics that afflicted market economies over the past several hundred years had noted similar hubris exhibited in earlier crises. (Kindleberger, 1978)
Those who attempted to defend the failed economic models and the policies which were derived from them suggested that no model could (or should) predict well a “once in a hundred year flood.” But it was not just a hundred year flood — crises have become common . It was not just something that had happened to the economy. The crisis was man-made — created by the economic system. Clearly, something is wrong with the models.
Studying crises is important, not just to prevent these calamities and to understand how to respond to them — though I do believe that the same inadequate models that failed to predict the crisis also failed in providing adequate responses. (Although those in the US Administration boast about having prevented another Great Depression, I believe the downturn was certainly far longer, and probably far deeper, than it needed to have been.) I also believe understanding the dynamics of crises can provide us insight into the behavior of our economic system in less extreme times.
This lecture consists of three parts. In the first, I will outline the three basic questions posed by deep downturns. In the second, I will sketch the three alternative approaches that have competed with each other over the past three decades, suggesting that one is a far better basis for future research than the other two. The final section will center on one aspect of that third approach that I believe is crucial — credit. I focus on the capitalist economy as a credit economy, and how viewing it in this way changes our understanding of the financial system and monetary policy. ...

He concludes with:

IV. The crisis in economics The 2008 crisis was not only a crisis in the economy, but it was also a crisis for economics — or at least that should have been the case. As we have noted, the standard models didn’t do very well. The criticism is not just that the models did not anticipate or predict the crisis (even shortly before it occurred); they did not contemplate the possibility of a crisis, or at least a crisis of this sort. Because markets were supposed to be efficient, there weren’t supposed to be bubbles. The shocks to the economy were supposed to be exogenous: this one was created by the market itself. Thus, the standard model said the crisis couldn’t or wouldn’t happen; and the standard model had no insights into what generated it.
Not surprisingly, as we again have noted, the standard models provided inadequate guidance on how to respond. Even after the bubble broke, it was argued that diversification of risk meant that the macroeconomic consequences would be limited. The standard theory also has had little to say about why the downturn has been so prolonged: Years after the onset of the crisis, large parts of the world are operating well below their potential. In some countries and in some dimensions, the downturn is as bad as or worse than the Great Depression. Moreover, there is a risk of significant hysteresis effects from protracted unemployment, especially of youth.
The Real Business Cycle and New Keynesian Theories got off to a bad start. They originated out of work undertaken in the 1970s attempting to reconcile the two seemingly distant branches of economics, macro-economics, centering on explaining the major market failure of unemployment, and microeconomics, the centerpiece of which was the Fundamental Theorems of Welfare Economics, demonstrating the efficiency of markets.[66] Real Business Cycle Theory (and its predecessor, New Classical Economics) took one route: using the assumptions of standard micro-economics to construct an analysis of the aggregative behavior of the economy. In doing so, they left Hamlet out of the play: almost by assumption unemployment and other market failures didn’t exist. The timing of their work couldn’t have been worse: for it was just around the same time that economists developed alternative micro-theories, based on asymmetric information, game theory, and behavioral economics, which provided better explanations of a wide range of micro-behavior than did the traditional theory on which the “new macro-economics” was being constructed. At the same time, Sonnenschein (1972) and Mantel (1974) showed that the standard theory provided essentially no structure for macro-economics — essentially any demand or supply function could have been generated by a set of diverse rational consumers. It was the unrealistic assumption of the representative agent that gave theoretical structure to the macro-economic models that were being developed. (As we noted, New Keynesian DSGE models were but a simple variant of these Real Business Cycles, assuming nominal wage and price rigidities — with explanations, we have suggested, that were hardly persuasive.)
There are alternative models to both Real Business Cycles and the New Keynesian DSGE models that provide better insights into the functioning of the macroeconomy, and are more consistent with micro-behavior, with new developments of micro-economics, and with what has happened in this and other deep downturns. While these new models differ from the older ones in a multitude of ways, at the center of these models is a wide variety of financial market imperfections and a deep analysis of the process of credit creation. These models provide alternative (and I believe better) insights into what kinds of macroeconomic policies would restore the economy to prosperity and maintain macro-stability.
This lecture has attempted to sketch some elements of these alternative approaches. There is a rich research agenda ahead.

Sunday, August 16, 2015

'The U.S. Foreclosure Crisis Was Not Just a Subprime Event'

From the NBER Digest:

The U.S. Foreclosure Crisis Was Not Just a Subprime Event, by Les Picker, NBER: Many studies of the housing market collapse of the last decade, and the associated sharp rise in defaults and foreclosures, focus on the role of the subprime mortgage sector. Yet subprime loans comprise a relatively small share of the U.S. housing market, usually about 15 percent and never more than 21 percent. Many studies also focus on the period leading up to 2008, even though most foreclosures occurred subsequently. In "A New Look at the U.S. Foreclosure Crisis: Panel Data Evidence of Prime and Subprime Borrowers from 1997 to 2012" (NBER Working Paper No. 21261), Fernando Ferreira and Joseph Gyourko provide new facts about the foreclosure crisis and investigate various explanations of why homeowners lost their homes during the housing bust. They employ microdata that track outcomes well past the beginning of the crisis and cover all types of house purchase financing—prime and subprime mortgages, Federal Housing Administration (FHA)/Veterans Administration (VA)-insured loans, loans from small or infrequent lenders, and all-cash buyers. Their data contain information on over 33 million unique ownership sequences in just over 19 million distinct owner-occupied housing units from 1997-2012.


The researchers find that the crisis was not solely, or even primarily, a subprime sector event. It began that way, but quickly expanded into a much broader phenomenon dominated by prime borrowers' loss of homes. There were only seven quarters, all concentrated at the beginning of the housing market bust, when more homes were lost by subprime than by prime borrowers. In this period 39,094 more subprime than prime borrowers lost their homes. This small difference was reversed by the beginning of 2009. Between 2009 and 2012, 656,003 more prime than subprime borrowers lost their homes. Twice as many prime borrowers as subprime borrowers lost their homes over the full sample period.
The authors suggest that one reason for this pattern is that the number of prime borrowers dwarfs that of subprime borrowers and the other borrower/owner categories they consider. The prime borrower share averages around 60 percent and did not decline during the housing boom. Although the subprime borrower share nearly doubled during the boom, it peaked at just over 20 percent of the market. Subprime's increasing share came at the expense of the FHA/VA-insured sector, not the prime sector.
The authors' key empirical finding is that negative equity conditions can explain virtually all of the difference in foreclosure and short sale outcomes of prime borrowers compared to all cash owners. Negative equity also accounts for approximately two-thirds of the variation in subprime borrower distress. Both are true on average, over time, and across metropolitan areas.
None of the other 'usual suspects' raised by previous research or public commentators—housing quality, race and gender demographics, buyer income, and speculator status—were found to have had a major impact. Certain loan-related attributes such as initial loan-to-value (LTV), whether a refinancing occurred or a second mortgage was taken on, and loan cohort origination quarter did have some independent influence, but much weaker than that of current LTV.
The authors' findings imply that large numbers of prime borrowers who did not start out with extremely high LTVs still lost their homes to foreclosure. They conclude that the economic cycle was more important than initial buyer, housing and mortgage conditions in explaining the foreclosure crisis. These findings suggest that effective regulation is not just a matter of restricting certain exotic subprime contracts associated with extremely high default rates.

Monday, August 10, 2015

Job Training and Government Multipliers

Two new papers from the NBER:

What Works? A Meta Analysis of Recent Active Labor Market Program Evaluations, by David Card, Jochen Kluve, and Andrea Weber, NBER Working Paper No. 21431 Issued in July 2015: We present a meta-analysis of impact estimates from over 200 recent econometric evaluations of active labor market programs from around the world. We classify estimates by program type and participant group, and distinguish between three different post-program time horizons. Using meta-analytic models for the effect size of a given estimate (for studies that model the probability of employment) and for the sign and significance of the estimate (for all the studies in our sample) we conclude that: (1) average impacts are close to zero in the short run, but become more positive 2-3 years after completion of the program; (2) the time profile of impacts varies by type of program, with larger gains for programs that emphasize human capital accumulation; (3) there is systematic heterogeneity across participant groups, with larger impacts for females and participants who enter from long term unemployment; (4) active labor market programs are more likely to show positive impacts in a recession. [open link]


Clearing Up the Fiscal Multiplier Morass: Prior and Posterior Analysis, by Eric M. Leeper, Nora Traum, and Todd B. Walker, NBER Working Paper No. 21433 Issued in July 2015: We use Bayesian prior and posterior analysis of a monetary DSGE model, extended to include fiscal details and two distinct monetary-fiscal policy regimes, to quantify government spending multipliers in U.S. data. The combination of model specification, observable data, and relatively diffuse priors for some parameters lands posterior estimates in regions of the parameter space that yield fresh perspectives on the transmission mechanisms that underlie government spending multipliers. Posterior mean estimates of short-run output multipliers are comparable across regimes—about 1.4 on impact—but much larger after 10 years under passive money/active fiscal than under active money/passive fiscal—means of 1.9 versus 0.7 in present value. [open link]

Thursday, August 06, 2015

'Buying Locally'

Via the blog A Fine Theorem:

“Buying Locally,” G. J. Mailath, A. Postlewaite & L. Samuelson (2015): Arrangements where agents commit to buy only from selected vendors, even when there are more preferred products at better prices from other vendors, are common. Consider local currencies like “Ithaca Hours”, which can only be used at other participating stores and which are not generally convertible, or trading circles among co-ethnics even when trust or unobserved product quality is not important. The intuition people have for “buying locally” is to, in some sense, “keep the profits in the community”; that is, even if you don’t care at all about friendly local service or some other utility-enhancing aspect of the local store, you should still patronize it. The fruit vendor should buy from the local bookstore even when its selection is subpar, and the book vendor should in turn patronize you even when fruits are cheaper at the supermarket.
At first blush, this seems odd to an economist. Why would people voluntarily buy something they don’t prefer? What Mailath and his coauthors show is that, actually, the noneconomist intuition is at least partially correct when individuals are both sellers and buyers. Here’s the idea. ....
One thing that isn’t explicit in the paper, perhaps because it is too trivial despite its importance, is how buy local arrangements affect welfare..., an intriguing possibility is that “buy local” arrangements may not harm social welfare at all, even if they are beneficial to in-group members. ...
[May 2015 working paper (RePEc IDEAS version)]

Tuesday, August 04, 2015

'The US Financial Sector in the Long-Run: Where are the Economies of Scale?'

And one more before heading out the door. From Tim Taylor:

The US Financial Sector in the Long-Run: Where are the Economies of Scale?: A larger financial sector is clearly correlated with economic development, in the sense that high-income countries around the world have on average larger markets for banks, credit cards, stock and bond markets, and so on compared with lower-income countries. But there are also concerns that the financial sector in high-income countries can grow in ways that end up creating economic instability (as I've discussed here, here, and here). Thomas Philippon provides some basic evidence on the growth of the US financial sector over the past 130 years in "Has the US Finance Industry Become Less Efficient? On the Theory and Measurement of Financial Intermediation," published in the April 2015 issue of the American Economic Review (105:4, pp. 1408–1438). The AER is not freely available online, but many readers can obtain access through a library subscription.

There are a couple of ways to think about the size of a country's financial sector relative to its economy. One can add up the size of certain financial markets--the market value of bank loans, stocks, bonds, and the like--and divide by GDP. Or one can add up the economic value added by the financial sector. For example, instead of adding up the bank loans, you add up the value of banking services provided. Similarly, instead of adding up the value of the stock market, you add up the value of the services provided by stockbrokers and investment managers.

Here's a figure from Philippon showing both measures of finance as a share of the US economy over the long run since 1886.

The orange line measured on the right axis is "intermediated assets," which measures the size of the financial sector as the sum of all debt and equity issued by nonfinancial firms, together with the sum of all household debt, and some other smaller categories. Back in the late 19th century, the US financial sector was roughly equal in size to GDP. By just before the Great Depression, it had risen to almost three times GDP, before sinking back to about 1.5 times GDP. More recently, you can see the financial sector spiking with the boom in real estate markets and stock markets in the mid-2000s at more than 4 times GDP, before dropping slightly. The overall trend is clearly up, but it's also clearly a bumpy ride.

The green line shows "finance income," which can be understood as a measure of the value added by firms in the financial sector. For the uninitiated, "value added" has a specific meaning to economists. Basically, it is calculated by taking the total revenue of a firm and subtracting the cost of all goods and services purchased from other firms--for example, subtracting the costs of supplies or machinery purchased. In the figure, the "value added" that constitutes "finance income" consists mostly of the wages and salaries paid by financial firms, along with any profits earned.
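The value-added calculation described above can be made concrete with a toy example (all figures invented for illustration):

```python
# Value added = total revenue minus goods and services purchased from other
# firms. The bank figures below are invented purely for illustration.
def value_added(revenue, purchased_inputs):
    return revenue - purchased_inputs

bank_revenue = 500.0            # interest and fee income, $ millions
bank_purchased_inputs = 180.0   # IT services, rent, supplies, $ millions
va = value_added(bank_revenue, bank_purchased_inputs)
# What remains is paid out as wages, salaries, and profits:
# the "finance income" plotted in the figure.
print(va)
```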

An intriguing pattern emerges here: finance income tracks intermediated assets fairly closely. In other words, the amount paid to the financial sector is more-or-less a fixed proportion of total financial assets. It's not obvious why this should be so. For example, imagine that because of a rise in housing prices, the total mortgage debt of households rises substantially over time, or because of rising stock prices over several decades, the total value of the stock market is up. Especially in an economy where information technology is making rapid strides, it's not clear why incomes in the financial sector should be rising at the same pace. Does a bank need to incur twice the costs if it issues a mortgage for $500,000 as compared to when it issues a mortgage for $250,000? Does an investment adviser need to incur twice the costs when giving advice on a retirement account of $1 million as when giving advice on a retirement account of $500,000? Shouldn't there be some economies of scale in financial services?
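The puzzle in this paragraph can be restated as a ratio: if finance income is a roughly constant share of intermediated assets, the unit cost of intermediation is flat, i.e., there are no visible economies of scale, because doubling assets doubles what the sector is paid. A hypothetical illustration (the numbers are invented, not Philippon's series):

```python
# Unit cost of intermediation = finance income / intermediated assets.
# If this ratio stays flat while assets quadruple, the sector shows no
# economies of scale. All numbers below are hypothetical.
def unit_cost(finance_income, intermediated_assets):
    return finance_income / intermediated_assets

# Hypothetical decades: assets (relative to GDP) nearly triple...
assets = [1.5, 2.0, 3.0, 4.2]
# ...and income grows in step, so the ratio barely moves.
income = [0.030, 0.041, 0.059, 0.086]
ratios = [unit_cost(i, a) for i, a in zip(income, assets)]
print(ratios)
```

Economies of scale would show up as a falling ratio over time; the striking pattern in the data is that it does not fall.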

Philippon isn't the first to raise this question: for example, Burton Malkiel has asked why there aren't economies of scale in asset management fees here. But Philippon provides evidence that, for whatever reason, a lack of economies of scale has been widespread and long-lasting in the US financial sector.

Full disclosure: The AER is published by the American Economic Association, which is also the publisher of the Journal of Economic Perspectives, where I have worked as Managing Editor since 1986.

Tuesday, July 28, 2015

Is Content Aggregation Harmful?

This is from the NBER (Project Syndicate, are you listening?):

Content Aggregation by Platforms: The Case of the News Media, by Lesley Chiou and Catherine Tucker, NBER Working Paper No. 21404, July 2015: ... In recent years, the digitization of content has led to the prominence of platforms as aggregators of content in many economically important industries, including media and Internet-based industries (Evans and Schmalensee, 2012).
These new platforms consolidate content from multiple sources into one place, thereby lowering the transactions costs of obtaining content and introducing new information to consumers. ... For these reasons, platforms have attracted considerable legal and policy attention. ...
Our results indicate that ... the traffic effect is large, as aggregators may guide users to new content. We do not find evidence of a scanning effect...
Our empirical distinction between a scanning effect, where the aggregator substitutes for original content, and a traffic effect, where the aggregator is complementary, is useful for analyzing the potential policy implications of such business models. The fact that we find evidence of a "traffic effect" even with a relatively large amount of content on an aggregator is perhaps evidence that the "fair use" exemptions often relied on by such sites are less damaging to the original copyright holder than often thought.

On the finding that the traffic effect holds "even with a relatively large amount of content on an aggregator": when I post an entire article, as I did yesterday with this Vox EU piece, a surprisingly high percentage of you still click through to the original.

With video, at least in most cases, there is code available to put the video on your site. You play it and it has ads, branding, etc. I've always thought (or maybe hoped) content providers should do the same thing. Provide an embed button that allows me to duplicate an article -- it would come with ads, links to other content on their site, etc. -- on my site. Reads of the article would go way up (not from just my site, I mean if they allowed everyone to do this), and it would increase the number of people who see ads associated with their content (so they could charge more).

Monday, July 27, 2015

'Poor Little Rich Kids? The Determinants of the Intergenerational Transmission of Wealth'

Genes are not as important as people think:

Poor Little Rich Kids? The Determinants of the Intergenerational Transmission of Wealth, by Sandra E. Black, Paul J. Devereux, Petter Lundborg, and Kaveh Majlesi, NBER Working Paper No. 21409 Issued in July 2015: Wealth is highly correlated between parents and their children; however, little is known about the extent to which these relationships are genetic or determined by environmental factors. We use administrative data on the net wealth of a large sample of Swedish adoptees merged with similar information for their biological and adoptive parents. Comparing the relationship between the wealth of adopted and biological parents and that of the adopted child, we find that, even prior to any inheritance, there is a substantial role for environment and a much smaller role for genetics. We also examine the role played by bequests and find that, when they are taken into account, the role of adoptive parental wealth becomes much stronger. Our findings suggest that wealth transmission is not primarily because children from wealthier families are inherently more talented or more able but that, even in relatively egalitarian Sweden, wealth begets wealth.
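The comparison described in the abstract amounts to regressing child wealth on both biological- and adoptive-parent wealth and comparing the two coefficients. A synthetic sketch of that design (all coefficients and data are invented; the paper's actual specification includes controls this omits):

```python
import numpy as np

# Synthetic illustration of the adoptee design (all numbers invented):
# pre-bequest child wealth is constructed to load more heavily on
# adoptive-parent wealth ("environment") than biological ("genes").
rng = np.random.default_rng(0)
n = 50_000
bio = rng.normal(size=n)       # biological parents' wealth (standardized)
adopt = rng.normal(size=n)     # adoptive parents' wealth (standardized)
child = 0.10 * bio + 0.25 * adopt + rng.normal(scale=0.5, size=n)

# OLS of child wealth on a constant and both parental wealth measures
X = np.column_stack([np.ones(n), bio, adopt])
beta, *_ = np.linalg.lstsq(X, child, rcond=None)
print(beta.round(2))   # roughly [0.0, 0.10, 0.25] by construction
```

A larger adoptive-parent coefficient is what the abstract summarizes as "a substantial role for environment and a much smaller role for genetics."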

[Open link]

Tuesday, July 21, 2015

'Farmers Markets and Food-Borne Illness'

Marc Bellemare:

Farmers Markets and Food-Borne Illness: ... After working on it for almost two years, I am happy to finally be able to circulate my new paper titled “Farmers Markets and Food-Borne Illness,” coauthored with my colleague Rob King and my student Jenny Nguyen, in which we ask whether farmers markets are associated with food-borne illness in a systematic way. ...

In sum, what we find is:

  1. A positive relationship between the number of farmers markets and the number of reported outbreaks of food-borne illness in the average state-year.
  2. A positive relationship between the number of farmers markets and the number of reported cases of food-borne illness in the average state-year.
  3. A positive relationship between the number of farmers markets and the number of reported outbreaks of Campylobacter jejuni in the average state-year.
  4. A positive relationship between the number of farmers markets and the number of reported cases of Campylobacter jejuni in the average state-year.
  5. Six dogs that didn’t bark, i.e., no systematic relationship between the number of farmers markets and the number of outbreaks or cases of norovirus, Salmonella enterica, Clostridium perfringens, E. coli, Staphylococcus (i.e., staph), or scombroid food poisoning.
  6. When controlling for the number of farmers markets, there is a negative relationship between the number of farmers markets that accept SNAP and food-borne illness in the average state-year.
  7. A doubling of the number of farmers markets in the average state-year would be associated with a relatively modest economic cost of about $900,000 in terms of additional cases of food-borne illness.

Of course, correlation is not causation, which is why we spend a great deal of time in the paper discussing the potential threats to causal identification in this context, investigating them, and trying to triangulate our findings with a number of different specifications and estimators. At the end of the day, we are pretty confident in the robustness of our core finding, viz. that there is a positive association between the number of farmers markets and the number of reported outbreaks and cases of food-borne illness. ...

Sunday, June 07, 2015

'Cyclical Variation in Real Wages'

More than 75 years ago, the EJ, under Keynes's editorship, published a series of papers on the behavior of real wages that have had a lasting impact on the discipline. This special anniversary session discusses debates then and now about real wage dynamics, unemployment fluctuations, and wage flexibility.


  • Keynesian Controversies on Compensation; Presented by John Pencavel (Stanford University)
  • Unemployment and Business Cycles; Presented by Lawrence Christiano (Northwestern University)
  • Unemployment Fluctuations, Match Quality and Wage Cyclicality of New Hires; Presented by Christopher Huckfeldt (Cornell University) and Antonella Trigari (Bocconi University)
  • Does the New Keynesian Model have a Uniqueness Problem? Presented by Benjamin Johannsen (Federal Reserve Board)

I really enjoyed this session, particularly the history of "Keynesian controversies" over wages by John Pencavel at the beginning of the session.

Wednesday, June 03, 2015

'Coordination Equilibrium and Price Stickiness'

This is the introduction to a relatively new working paper by Cigdem Gizem Korpeoglu and Stephen Spear (sent in response to my comment that I've been disappointed with the development of new alternatives to the standard NK-DSGE models):

Coordination Equilibrium and Price Stickiness, by Cigdem Gizem Korpeoglu (University College London) and Stephen E. Spear (Carnegie Mellon): 1 Introduction: Contemporary macroeconomic theory rests on the three pillars of imperfect competition, nominal price rigidity, and strategic complementarity. Of these three, nominal price rigidity (aka price stickiness) has been the most important. The stickiness of prices is a well-established empirical fact, with early observations about the phenomenon going back to Alfred Marshall. Because the friction of price stickiness cannot occur in markets with perfect competition, modern micro-founded models (New Keynesian or NK models, for short) have been forced to abandon the standard Arrow-Debreu paradigm of perfect competition in favor of models where agents have market power and set market prices for their own goods. Strategic complementarity enters the picture as a mechanism for explaining the kinds of coordination failures that lead to sustained slumps like the Great Depression or the aftermath of the 2008 financial crisis. Early work by Cooper and John laid out the importance of these three features for macroeconomics, and follow-on work by Ball and Romer showed that failure to coordinate on price adjustments could itself generate strategic complementarity, effectively unifying two of the three pillars.
Not surprisingly, the Ball and Romer work was based on earlier work by a number of authors (see Mankiw and Romer's New Keynesian Economics) which used the model of Dixit and Stiglitz of monopolistic competition as the basis for price-setting behavior in a general equilibrium setting, combined with the idea of menu costs -- literally the cost of posting and communicating price changes -- and exogenously-specified adjustment time staggering to provide the friction(s) leading to nominal rigidity. While these models perform well in explaining aspects of the business cycle, they have only recently been subjected to what one would characterize as thorough empirical testing, because of the scarcity of good data on how prices actually change. This has changed in the past decade as new sources of data on price dynamics have become available, and as computational power capable of teasing out what might be called the "fine structure" of these dynamics has emerged. On a different dimension, the overall suitability of monopolistic competition as the appropriate form of market imperfection to use as the foundation of the new macro models has been largely unquestioned, though we believe this is largely due to the tractability of the Dixit-Stiglitz model relative to other models of imperfect competition generated by large fixed costs or increasing returns to scale not due to specialization.
In this paper, we examine both of these underlying assumptions in light of what the new empirics on pricing dynamics has found, and propose a different, and we believe, better microfoundation for New Keynesian macroeconomics based on the Shapley-Shubik market game.

Tuesday, June 02, 2015

'Stabilizing Wage Policy'

I have argued many, many times that we did not do nearly enough to help households repair their balance sheets (especially when compared to the attention that bank balance sheets received), so I like this idea from Stanford's Mordecai Kurz:

Stabilizing Wage Policy by Mordecai Kurz, Department of Economics, Stanford University, Stanford, CA. (This version: May 27, 2015): Summary: A rapid recovery from deflationary shocks that result in a transition to the Zero Lower Bound (ZLB) requires that policy generate an inflationary counter-force. Monetary policy cannot achieve this, and the lesson of the 2007-2015 Great Recession is that growing debt gives rise to political gridlock that prevents restoring full employment through deficit-financed public spending. Even optimal investments in needed public projects cannot be undertaken at a zero interest rate. Hence, the failure of policy to arrest the massive damage of the eight-year Great Recession shows the need for new policy tools. I propose such a policy for the ZLB, called a “Stabilizing Wage Policy,” which requires public intervention in markets instead of deficit-financed expenditures. Section 1 develops a New Keynesian model with diverse beliefs and inflexible wages. Section 2 presents the policy and studies its efficacy.
The integrated New Keynesian (NK) model economy consists of a lower sub-economy under a ZLB and an upper sub-economy with a positive rate, linked by random transitions between them. Household-firm-managers hold heterogeneous beliefs, and the inflexible wage is based on a four-quarter staggered wage structure, so that the mean wage is a relatively inflexible function of inflation, of unemployment, and of a distributed lag of productivity. The equilibrium maps of the two sub-economies exhibit significant differences, which emerge from the relative rates at which the nominal rate, prices, and the wage rate adjust to shocks. Two key results: first, decline to the ZLB lower sub-economy causes a powerful debt-deflation spiral. Second, output, inflation, and real wages rise in the lower sub-economy if all base wages are unexpectedly raised, and unemployment falls. This result is explored and explained, since it is the key analytic result that motivates the policy.
A Stabilizing Wage Policy aims to repair households’ balance sheets, expedite recovery, and hasten exit from the ZLB. It raises base wages for the policy's duration, with a quarterly cost-of-living adjustment and a prohibition on altering base wages in order to nullify the policy. I use demand shocks to cause a recession under a ZLB and a deleveraging rule to measure recovery. The rule is calibrated to repair the damaged balance sheets of US households in 2007-2015. Sufficient deleveraging and a positive rate in the upper sub-economy without a wage policy are required for exit; hence, at exit time, inflation and output in the lower sub-economy are irrelevant to the exit decision. Simulations show that an effective policy selects high policy intensity at the outset: given the 2007-2015 experience, a constant 10% increase in base wages raises the equilibrium mean wage by about 5.5%, generates a controlled inflation of 5%-6% at exit time, and attains recovery in a fraction of the time it takes without the policy. Under a successful policy, inflation exceeds the target at exit time, and when the policy terminates, inflation abates rapidly if the inflation target is intact. I suggest that a stabilizing wage policy with a constant 10% increase in base wages could have been initiated in September 2008. If controlled inflation of 5% for 2.25 years had been politically tolerated, the US would have recovered and exited the ZLB in 9 quarters, with full employment restored by 2012. Lower policy intensity would have resulted in a smaller increase in the mean wage and lower inflation, but a longer recession. The policy would not have required any federal budget expenditures; it would have reduced public deficits after 2010, and the US would have reached 2015 with a lower national debt.
The policy negates the effect of the demand shocks which cause the recession and the binding ZLB. It attains its goal with strong temporary intervention in the market instead of generating demand with public expenditures. It does not solve other long-term structural problems that persist after exit from the ZLB and require other solutions.

Monday, June 01, 2015

Forecasting in Economic Crises

Maurizio Bovi summarizes a newly published paper he wrote with Roy Cerqueti. The paper examines lay agents' forecasts amid great recessions (with a special focus on Greece). The paper is Bovi, M. and R. Cerqueti (2015) "Forecasting macroeconomic fundamentals in economic crises" Annals of Operations Research DOI 10.1007/s10479-015-1879-4:

Forecasting in Economic Crises: Expectations are a key factor in economics, and heterogeneous forecasts are a fact of life. For example, there are significant and well-known incentives to become a sports champion or to win a Nobel Prize, yet very few people succeed. The brutal truth is that the majority lags behind or gives up: heterogeneity is the rule, not the exception. By the same token, lay forecasters may learn, but it is unrealistic to think that all of them, even in the long run, will achieve Muth-optimal and, hence, homogeneous forecasts. The situation is made even more complex, and more interesting to study, when the fundamental to be predicted, the real GDP growth rate, is well below zero and highly volatile.
In recent work (Bovi and Cerqueti, 2015) we address the topic of heterogeneous forecasting performance amid deep recessions. Lay agents are assumed to have different predictive ability in that they have equal loss functions, but different asymmetry parameters that are used as controls to minimize their forecasting errors. Simulating poorly performing economies populated by three groups of forecasters, we obtain the following results.
The less sophisticated forecasters in our setting, the “medians” (using passive rules of thumb), never match the best predictors, the “muthians,” whereas “second best” (SB) agents (acting as attentive econometricians) do so only occasionally. This holds regardless of the size of the crisis. Thus, as in the real world, in our artificial economy heterogeneity is a structural trait. More intriguingly, simulations also show that the medians’ behavior tends to be smoother than that of SB agents, and that the difference between them widens in very serious crises. In particular, great recessions make SB agents’ predictions relatively more biased. An explanation is that dramatic crises expand the available information set (e.g., through greater mass-media coverage), and this extra information sways SB agents, who are more prompt than medians to revise their forecasts.
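One standard way to formalize agents who share a loss function but differ in an asymmetry parameter is the "lin-lin" (pinball) loss, under which the optimal forecast is a quantile of the predictive distribution. The sketch below uses that formalization as an assumption; it is not necessarily the paper's exact specification:

```python
import numpy as np

def avg_linlin(forecast, actuals, tau):
    """Average lin-lin loss with asymmetry tau in (0, 1):
    tau * e for under-prediction (e = actual - forecast >= 0),
    (tau - 1) * e for over-prediction."""
    e = actuals - forecast
    return float(np.where(e >= 0, tau * e, (tau - 1) * e).mean())

# Draws meant to mimic a deep, volatile recession (parameters invented)
rng = np.random.default_rng(1)
growth = rng.normal(-4.0, 3.0, size=100_000)

# The loss-minimizing forecast under lin-lin loss is the tau-quantile,
# so agents with different tau issue different forecasts from the same data.
for tau in (0.3, 0.5, 0.7):
    f = np.quantile(growth, tau)
    print(tau, round(float(f), 1))
```

Heterogeneous asymmetry parameters thus generate persistently heterogeneous forecasts even when all agents observe the same information, which is the structural heterogeneity the paper emphasizes.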
Our results are somewhat in line with Simon’s famous statement that more information does not necessarily mean better forecasting performance. Furthermore, our outcomes shed some light on the anomalous macroeconomic expectations observed in Greece in recent years. The current crisis, in fact, may be thought of as a sort of natural experiment for understanding how lay decision makers react to very dramatic years. In particular, due to its terrible recent downturn, Greece is one of the most suitable cases, raising the following question: How do Greeks perceive their own personal financial situation relative to that of their country? Clearly, the representative citizen's situation cannot, by definition, systematically drift apart from that of the country where she lives, given that the nation-wide economic situation is the (weighted) sum of the individual ones. Yet it may be hard to remain objective in the course of very deep and prolonged economic crises. The evidence depicted in the following graph is rather suggestive of the effects of deep recessions on the rationality of people’s expectations, something that conforms with our findings.


Monday, May 11, 2015

'A Note on Nominal GDP Targeting and the Zero Lower Bound'

Roberto M. Billi, senior researcher at the Sveriges Riksbank, has a new paper on nominal GDP targeting:

“A Note on Nominal GDP Targeting and the Zero Lower Bound,” Sveriges Riksbank Working Paper Series No. 270, Revised May 2015: Abstract: I compare nominal GDP level targeting to strict price level targeting in a small New Keynesian model, with the central bank operating under optimal discretion and facing a zero lower bound on nominal interest rates. I show that, if the economy is only buffeted by purely temporary shocks to inflation, nominal GDP level targeting may be preferable because it requires the burden of the shocks to be shared by prices and output. But in the presence of persistent supply and demand shocks, strict price level targeting may be superior because it induces greater policy inertia and improves the tradeoffs faced by the central bank. During lower bound episodes, somewhat paradoxically, nominal GDP level targeting leads to larger falls in nominal GDP.

Friday, May 08, 2015

'Childhood Medicaid Coverage Improves Adult Earning and Health'

I highlighted the second article below, and many others reaching similar conclusions, in my last column:

Childhood Medicaid Coverage Improves Adult Earning and Health, NBER Digest: Medicaid today covers more Americans than any other public health insurance program. Introduced in 1965, its coverage was expanded substantially, particularly to low-income children, during the 1980s and the early 1990s.
Throughout Medicaid's history, there has been debate over whether the program improves health outcomes. Two new NBER studies exploit variation in children's eligibility for Medicaid, across birth cohorts and across states with different Medicaid programs, along with rich longitudinal data on health care utilization and earnings, to estimate the long-run effects of Medicaid eligibility on health, earnings, and transfer program participation.
In Childhood Medicaid Coverage and Later Life Health Care Utilization (NBER Working Paper No. 20929), Laura R. Wherry, Sarah Miller, Robert Kaestner, and Bruce D. Meyer find that among individuals who grew up in low-income families, rates of hospitalizations and emergency department visits in adulthood are negatively related to the number of years of Medicaid eligibility in childhood. The authors exploit the fact that one of the substantial expansions of Medicaid eligibility applied only to children who were born after September 30, 1983. This resulted in a large discontinuity in the lifetime years of Medicaid eligibility for children born before and after this birthdate cutoff. Children in families with incomes between 75 and 100 percent of the poverty line experienced about 4.5 more years of Medicaid eligibility if they were born just after the September 1983 cutoff than if they were born just before, with the gain occurring between the ages of 8 and 14. The authors compare children who they estimate were in low-income families, and otherwise similar circumstances, who were born just before or just after this date, to determine how the number of years of childhood Medicaid eligibility is related to health in early adulthood. Their finding of reduced health care utilization among adults who had more years of childhood Medicaid eligibility is concentrated among African Americans, those with chronic illness conditions, and those living in low-income zip codes. The authors calculate that reduced health care utilization during one year in adulthood offsets between 3 and 5 percent of the costs of extending Medicaid coverage to a child.
In Medicaid as an Investment in Children: What is the Long-Term Impact on Tax Receipts? (NBER Working Paper No. 20835), David W. Brown, Amanda E. Kowalski, and Ithai Z. Lurie conclude that each additional year of childhood Medicaid eligibility increases cumulative federal tax payments by age 28 by $247 for women, and $127 for men. Their empirical strategy for evaluating the impact of Medicaid relies on variation in program eligibility during childhood that is associated with both birth cohort and state of residence. The authors study longitudinal data on actual tax payments until individuals are in their late 20s, and they extrapolate this information to make projections for these individuals at older ages. When they compare the incremental discounted value of lifetime tax payments with the cost of additional Medicaid coverage, they conclude that "the government will recoup 56 cents of each dollar spent on childhood Medicaid by the time these children reach age 60." This calculation is based on federal tax receipts alone, and does not consider state tax receipts or potential reductions in the use of transfer payments in adulthood.
Both studies use large databases of administrative records to analyze the long-term effects of Medicaid. The first study measures health utilization using the Healthcare Cost and Utilization Project (HCUP) State Inpatient Databases for Arizona, Iowa, New York, Oregon, and Wisconsin in 1999, and those states plus Maryland and New Jersey in 2009. State hospital discharge data were also available from Texas and California. Data on all outpatient emergency department visits were available for six states in 2009. The second study examines data on federal tax payments and constructs longitudinal earnings histories for individuals who were born between 1981 and 1984. It also analyzes administrative records on Medicaid eligibility of children in this cohort.

Saturday, May 02, 2015

'Assessing the Effects of Monetary and Fiscal Policy'

From the NBER Reporter:

Assessing the Effects of Monetary and Fiscal Policy, by Emi Nakamura and Jón Steinsson, NBER Reporter 2015 Number 1: Research Summary: Monetary and fiscal policies are central tools of macroeconomic management. This has been particularly evident since the onset of the Great Recession in 2008. In response to the global financial crisis, U.S. short-term interest rates were lowered to zero, a large fiscal stimulus package was implemented, and the Federal Reserve engaged in a broad array of unconventional policies.
Despite their centrality, the question of how effective these policies are and therefore how the government should employ them is in dispute. Many economists have been highly critical of the government's aggressive use of monetary and fiscal policy during this period, in some cases arguing that the policies employed were ineffective and in other cases warning of serious negative consequences. On the other hand, others have argued that the aggressive employment of these policies has "walk[ed] the American economy back from the edge of a second Great Depression."
In our view, the reason for this controversy is the absence of conclusive empirical evidence about the effectiveness of these policies. Scientific questions about how the world works are settled by conclusive empirical evidence. In the case of monetary and fiscal policy, unfortunately, it is very difficult to establish such evidence. The difficulty is a familiar one in economics, namely endogeneity. ...

After explaining the endogeneity problem, empirical evidence on price rigidity and its importance for assessing policy, structural modeling, natural experiments, and so on, they turn to their evidence:

Our identification approach is to study how real interest rates respond to monetary shocks in the 30-minute intervals around Federal Open Market Committee announcements. We find that in these short intervals, nominal and real interest rates for maturities as long as several years move roughly one-for-one with each other. Changes in nominal interest rates at the time of monetary announcements therefore translate almost entirely into changes in real interest rates, while expected inflation moves very little except at very long horizons.
We use this evidence to estimate the parameters of a conventional monetary business cycle model. ... This approach suggests that monetary non-neutrality is large. Intuitively, our evidence indicates that a monetary shock that yields a substantial response for real interest rates also yields a very small response for inflation. This suggests that prices respond quite sluggishly to changes in aggregate economic conditions and that monetary policy can have large effects on the economy.
Another area in which there has been rapid progress in using innovative identification schemes to estimate the impact of macroeconomic policy is that of fiscal stimulus. ... Much of the literature on fiscal stimulus that makes use of natural experiments focuses on the effects of war-time spending, since it is assumed that in some cases such spending is unrelated to the state of the economy. Fortunately - though unfortunately for empirical researchers - there are only so many large wars, so the number of data points available from this approach is limited.
In our work, we use cross-state variation in military spending to shed light on the fiscal multiplier. The basic idea is that when the U.S. experiences a military build-up, military spending will increase in states such as California - a major producer of military goods - relative to states, such as Illinois, where there is little military production. This approach uses a lot more data than the earlier literature on military spending but makes weaker assumptions, since we require only that the U.S. did not undertake a military build-up in response to the relative weakness of the economy in California vs. Illinois. We show that a $1 increase in military spending in California relative to Illinois yields a relative increase in output of $1.50. In other words, the "relative" multiplier is quite substantial.
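The headline number is just relative arithmetic across the two states. Here is a minimal sketch with hypothetical spending and output changes (chosen to reproduce the $1.50 estimate, not taken from the paper):

```python
# Relative (cross-state) fiscal multiplier with hypothetical numbers:
# how much more output rises in the high-spending state per extra dollar
# of military spending there, relative to the low-spending state.
dG_CA, dG_IL = 10.0, 2.0   # change in military spending ($bn, hypothetical)
dY_CA, dY_IL = 18.0, 6.0   # change in state output ($bn, hypothetical)

relative_multiplier = (dY_CA - dY_IL) / (dG_CA - dG_IL)
print(relative_multiplier)  # 1.5
```

The identifying assumption is only that the build-up was not targeted at the relative condition of the two states, which is weaker than assuming it was unrelated to the aggregate economy.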
There is an important issue of interpretation here. We find evidence of a large "relative multiplier," but does this imply that the aggregate multiplier also will be large? The challenge that arises in interpreting these kinds of relative estimates is that there are general equilibrium effects that are expected to operate at an aggregate but not at a local level. In particular, if government spending is increased at the aggregate level, this will induce the Federal Reserve to tighten monetary policy, which will then counteract some of the stimulative effect of the increased government spending. This type of general equilibrium effect does not arise at the local level, since the Fed can't raise interest rates in California vs. Illinois in response to increased military spending in California relative to Illinois.
We show in our paper, however, that the relative multiplier does have a very interesting counterpart at the level of the aggregate economy. Even in the aggregate setting, the general equilibrium response of monetary policy to fiscal policy will be constrained when the risk-free nominal interest rate is constrained by its lower bound of zero. Our relative multiplier corresponds more closely to the aggregate multiplier in this case. Our estimates are, therefore, very useful in distinguishing between new Keynesian models, which generate large multipliers in these scenarios, and plain vanilla real business cycle models, which always generate small multipliers.
The evidence from our research on both fiscal and monetary policy suggests that demand shocks can have large effects on output. Models with price-adjustment frictions can explain such output effects, as well as (by design) the microeconomic evidence on price rigidity. Perhaps this evidence is still not conclusive, but it helps to narrow the field of plausible models. This new evidence will, we hope, help limit the scope of policy predictions of macroeconomic models that policymakers need to consider the next time they face a great challenge. ...

Trends and Cycles in China's Macroeconomy

A presentation at the 30th Annual Conference on Macroeconomics.

Also, a brief interview with Tao Zha.


Monday, April 27, 2015

'Are Immigrants a Shot in the Arm for the Local Economy?'

From the NBER (open link to earlier version):

Are Immigrants a Shot in the Arm for the Local Economy?, by Gihoon Hong and John McLaren, NBER Working Paper No. 21123: Most research on the effects of immigration treats immigrants as adding to the supply of labor. By contrast, this paper studies the effects of immigrants on local labor demand, due to the increase in consumer demand for local services created by immigrants. This effect can attenuate downward pressure from immigrants on non-immigrants' wages, and also benefit non-immigrants by increasing the variety of local services available. For this reason, immigrants can raise native workers' real wages, and each immigrant could create more than one job. Using US Census data from 1980 to 2000, we find considerable evidence for these effects: Each immigrant creates 1.2 local jobs for local workers, most of them going to native workers, and 62% of these jobs are in non-traded services. Immigrants appear to raise local non-tradables sector wages and to attract native-born workers from elsewhere in the country. Overall, it appears that local workers benefit from the arrival of more immigrants.

Friday, April 24, 2015

'No Price Like Home: Global House Prices, 1870-2012'

Interesting paper:

No Price Like Home: Global House Prices, 1870-2012, by Katharina Knoll, Moritz Schularick, and Thomas Steger: Abstract: How have house prices evolved over the long run? This paper presents annual house prices for 14 advanced economies since 1870. Based on extensive data collection, we show that real house prices stayed constant from the 19th to the mid-20th century, but rose strongly during the second half of the 20th century. Land prices, not replacement costs, are the key to understanding the trajectory of house prices. Rising land prices explain about 80 percent of the global house price boom that has taken place since World War II. Higher land values have pushed up wealth-to-income ratios in recent decades.

Monday, April 20, 2015

'Labor Market Slack and Monetary Policy'

Let's hope the Fed is listening:

Labor Market Slack and Monetary Policy, by David G. Blanchflower and Andrew T. Levin, NBER Working Paper No. 21094: In the wake of a severe recession and a sluggish recovery, labor market slack cannot be gauged solely in terms of the conventional measure of the unemployment rate (that is, the number of individuals who are not working at all and actively searching for a job). Rather, assessments of the employment gap should reflect the incidence of underemployment (that is, people working part time who want a full-time job) and the extent of hidden unemployment (that is, people who are not actively searching but who would rejoin the workforce if the job market were stronger). In this paper, we examine the evolution of U.S. labor market slack and show that underemployment and hidden unemployment currently account for the bulk of the U.S. employment gap. Next, using state-level data, we find strong statistical evidence that each of these forms of labor market slack exerts significant downward pressure on nominal wages. Finally, we consider the monetary policy implications of the employment gap in light of prescriptions from Taylor-style benchmark rules.
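To make the broader slack notion concrete, here is a back-of-the-envelope sketch with hypothetical magnitudes (not the authors' series). One simple convention is to add the underemployed to the numerator and to add the hidden unemployed to both the numerator and the labor force, since they are currently outside it:

```python
# Broad labor market slack vs. the headline unemployment rate,
# using hypothetical magnitudes (millions of workers).
labor_force = 155.0    # conventional labor force
unemployed = 8.0       # not working, actively searching
underemployed = 6.5    # part-time workers who want full-time jobs
hidden = 3.0           # not searching, but would rejoin a stronger market

headline_rate = unemployed / labor_force
broad_slack = (unemployed + underemployed + hidden) / (labor_force + hidden)
print(f"headline {headline_rate:.1%}, broad slack {broad_slack:.1%}")
```

With numbers like these the broad measure is roughly double the headline rate, which is the sense in which underemployment and hidden unemployment can account for the bulk of the employment gap.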


Tuesday, April 14, 2015

Secular Stagnation: The Long View

From the NBER Digest:

Secular Stagnation: The Long View, by Matt Nesvisky: Growth economists are divided on whether the U.S. is facing a period of "secular stagnation" - an extended period of slow economic growth in the coming decades. In "Secular Stagnation: The Long View" (NBER Working Paper No. 20836), Barry Eichengreen considers four factors that could contribute to a persistent period of below-potential output and slow growth: a rise in saving due to the global integration of emerging markets, a decline in the rate of population growth, an absence of attractive investment opportunities, and a drop in the relative price of investment goods. He concludes that a decline in the relative price of investment goods is the most likely contributor to an excess of saving over investment.

With regard to long-term future growth rates, a key point of debate is how to interpret, and project forward, the "Third Industrial Revolution": the computer age and the new economy it has created. Some argue that the economic impact of digital technology has largely run its course, while others maintain that we have yet to experience the full effect of computerization. In this context, Eichengreen looks at the economic consequences of the age of steam and of the age of electrification. His analysis identifies two dimensions of the economic impact: "range of applicability" and "range of adaptation."
Range of applicability refers to the number of sectors or activities to which the key innovations can be applied. Use of the steam engine of the first industrial revolution for many years was limited to the textile industry and railways, which accounted for only a relatively small fraction of economic activity. Electrification in the second industrial revolution, says Eichengreen, had a larger impact on output and productivity growth because it affected a host of manufacturing industries, many individual households, and a wide range of activities within decades of its development.
The "computer revolution" of the second half of the 20th century had a relatively limited impact on overall economic growth, Eichengreen writes, because computerization had deeply transformative effects on only a limited set of industries, including finance, wholesale and retail trade, and the production of computers themselves. This perspective suggests that the implications for output and productivity of the next wave of innovations will depend greatly on their range of applicability. Innovations such as new tools (quantum computers), materials (graphene), processes (genetic modification), robotics, and enhanced interactivity of digital devices all promise a broad range of applications.
Range of adaptation refers to how comprehensively economic activity must be reorganized before positive impacts on output and productivity occur. Eichengreen reasons that the greater the required range of adaptation, the higher the likelihood that growth may slow in the short run, as costly investments in adaptation must be made and existing technology must be disrupted.
Yet the slow productivity growth in the United States in recent years may have positive implications for the future, he writes. Many connected activities and sectors - health care, education, industrial research, and finance - are being disrupted by the latest technologies. But once a broad range of adaptations is complete, productivity growth should accelerate, he reasons. "This is not a prediction," Eichengreen concludes, "but a suggestion to look to the range of adaptation required in response to the current wave of innovations when seeking to interpret our slow rate of productivity growth and when pondering our future."

Wednesday, March 18, 2015

'Arezki, Ramey, and Sheng on News Shocks'

I was at this conference as well. This paper was very well received (it has been difficult to find evidence that news generates business cycles, in part because it's been difficult to find a "clean" shock):

Arezki, Ramey, and Sheng on news shocks: I attended the NBER EFG (economic fluctuations and growth) meeting a few weeks ago, and saw a very nice paper by Rabah Arezki, Valerie Ramey, and Liugang Sheng, "News Shocks in Open Economies: Evidence from Giant Oil Discoveries." (There were a lot of nice papers, but this one is more bloggable.)

They look at what happens to economies that discover they have a lot of oil. ... An oil discovery is a well identified "news shock."

Standard productivity shocks are a bit nebulous, and they alter two things at once: they raise productivity today, and hence the incentive to work, and they also carry news about more income in the future.

An oil discovery is well publicized. It incentivizes a small investment in oil drilling, but mostly it is pure news of an income flow in the future. It does not change overall labor productivity, preferences, or technology.
Rabah, Valerie, and Liugang then construct a straightforward macro model of such an event. ...[describes model and results]...
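The flavor of the news-shock mechanism can be seen in a toy permanent-income calculation (my own illustrative assumptions, not the paper's model): consumption jumps as soon as the discovery is announced, by roughly the annuity value of the future oil revenue, even though production does not start for years.

```python
# Toy permanent-income response to news of a future income flow
# (all parameter values are hypothetical).
r = 0.04          # interest rate
delay = 5         # years until oil production starts
duration = 20     # years of oil revenue
flow = 10.0       # annual oil revenue ($bn)

# Present value today of the delayed, finite income stream
pv = sum(flow / (1 + r) ** t for t in range(delay, delay + duration))

# Annuity-value consumption jump on the announcement date
dc = r / (1 + r) * pv
print(f"PV of discovery: {pv:.1f}, consumption jump: {dc:.2f} per year")
```

Consumption rises immediately while output is unchanged, so saving falls and the current account deteriorates, which is why the responses do not look like a standard recession pattern.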

Valerie, presenting the paper, was a bit discouraged. This "news shock" doesn't generate a pattern that looks like standard recessions, because GDP and employment move in opposite directions.

I am much more encouraged. Here are macroeconomies behaving exactly as they should, in response to a shock where for once we really know what the shock is. And in response to a shock with a nice dynamic pattern, which we also really understand.

My comment was something to the effect of "this paper is much more important than you think. You match the dynamic response of economies to this large and very well identified shock with a standard, transparent and intuitive neoclassical model. Here's a list of some of the ingredients you didn't need: Sticky prices, sticky wages, money, monetary policy, (i.e. interest rates that respond via a policy rule to output and inflation or zero bounds that stop them from doing so), home bias, segmented financial markets, credit constraints, liquidity constraints, hand-to-mouth consumers, financial intermediation, liquidity spirals, fire sales, leverage, sudden stops, hot money, collateral constraints, incomplete markets, idiosyncratic risks, strange preferences including habits, nonexpected utility, ambiguity aversion, and so forth, behavioral biases, or rare disasters. If those ingredients are really there, they ought to matter for explaining the response to your shocks too. After all, there is only one economic structure, which is hit by many shocks. So your paper calls into question just how many of those ingredients are really there at all."

Thomas Philippon, whose previous paper had a pretty masterful collection of a lot of those ingredients, quickly pointed out my overstatement. One need not use every ingredient to understand every shock. Constraint variables are inequalities. A positive news shock may not cause credit constraints etc. to bind, while a negative shock may reveal them.

Good point. And really, the proof is in the pudding. If those ingredients are not necessary, then I should produce a model without them that produces events like 2008. But we've been debating the ingredients and shock necessary to explain 1932 for 82 years, so that approach, though correct, might take a while.

In the meantime, we can still cheer successful simple models and well identified shocks on the few occasions that they appear and fit data so nicely. Note to graduate students: this paper is a really nice example to follow for its integration of clear theory and excellent empirical work.