Category Archive: Academic Papers

Monday, September 28, 2015

'Cheap Talk, Round Numbers, and Signaling Behavior'

Advice about selling goods on eBay from the NBER Digest:

Cheap Talk, Round Numbers, and Signaling Behavior: In the marketplace for ordinary goods, buyers and sellers have many characteristics that are hidden from each other. From the seller's perspective, it may be beneficial to reveal some of these characteristics. For example, a patient seller may want to signal unending willingness to wait in order to secure a good deal. At the same time, an impatient seller may want to signal a desire to sell a good quickly, albeit at a lower price.

This insight is at the heart of Cheap Talk, Round Numbers, and the Economics of Negotiation (NBER Working Paper No. 21285) by Matthew Backus, Thomas Blake, and Steven Tadelis. The authors show that sellers on eBay behave in a fashion that is consistent with using round numbers as signals of impatience.
The authors analyze data from eBay's bargaining platform using its collectibles category—coins, antiques, toys, memorabilia, and the like. The process is one of sequential offers not unlike haggling in an open-air market. A seller lists an initial price, to which buyers may respond with offers, to which sellers may make counteroffers, and so on. If a price is agreed upon, the good sells. The authors analyze 10.5 million listed items, out of which 2.8 million received offers and 2.1 million ultimately sold. Their key finding is that items listed at multiples of $100 receive lower offers on average than items listed at nearby prices, ultimately selling for 5 to 8 percent less.
It is tempting to label such behavior a mistake. However, items listed at these round numbers receive offers 6 to 11 days sooner and are 3 to 5 percent more likely to sell than items listed at "precise" numbers. Furthermore, even experienced sellers frequently list items at round numbers, suggesting it is an equilibrium behavior best modeled by rationality rather than seller error. It appears that impatient sellers are able to signal their impatience and are happy to do it, even though it nets them a lower price.
One concern with the analysis is that round-number pricing might provide a signal about the good being sold, rather than the person or firm selling it. To address this issue, the authors use data on goods originally posted with prices in British pounds. These prices are automatically translated to U.S. dollars for the American market. Hence, the authors can test what happens when goods intended to be sold at round numbers are, in fact, sold at non-round numbers. This removes the round-number signal while holding the good's features constant. In this setting, they find that sellers of goods priced in non-round dollar amounts systematically realize higher prices, though the effect is not as strong as that in their primary sample. This evidence indicates the round numbers themselves have a significant effect on bargaining outcomes.
The authors find additional evidence on the round-number phenomenon in the real estate market in Illinois from 1992 to 2002. This is a wholly different market than that for eBay collectibles, with much higher prices and with sellers typically receiving advice from professional listing agents. But here, too, there is evidence that round-number listings lead to lower sales prices. On average, homes listed at multiples of $50,000 sold for $600 less.

'The Wage Impact of the Marielitos: A Reappraisal'

No sense hiding from evidence that works against my support of immigration. This is from George Borjas (if you are unfamiliar with the Mariel boatlift, see here):

The Wage Impact of the Marielitos: A Reappraisal, by George J. Borjas, NBER Working Paper No. 21588 [open link]: This paper brings a new perspective to the analysis of the Mariel supply shock, revisiting the question and the data armed with the accumulated insights from the vast literature on the economic impact of immigration. A crucial lesson from this literature is that any credible attempt to measure the wage impact of immigration must carefully match the skills of the immigrants with those of the pre-existing workforce. The Marielitos were disproportionately low-skill; at least 60 percent were high school dropouts. A reappraisal of the Mariel evidence, specifically examining the evolution of wages in the low-skill group most likely to be affected, quickly overturns the finding that Mariel did not affect Miami’s wage structure. The absolute wage of high school dropouts in Miami dropped dramatically, as did the wage of high school dropouts relative to that of either high school graduates or college graduates. The drop in the relative wage of the least educated Miamians was substantial (10 to 30 percent), implying an elasticity of wages with respect to the number of workers between -0.5 and -1.5. In fact, comparing the magnitude of the steep post-Mariel drop in the low-skill wage in Miami with that observed in all other metropolitan areas over an equivalent time span between 1977 and 2001 reveals that the change in the Miami wage structure was a very unusual event. The analysis also documents the sensitivity of the estimated wage impact to the choice of a placebo. The measured impact is much smaller when the placebo consists of cities where pre-Mariel employment growth was weak relative to Miami.
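The elasticity arithmetic in the abstract is easy to reproduce. Here is a minimal sketch; the roughly 20 percent supply shock to Miami's dropout workforce is my illustrative assumption, not a number quoted above:

```python
# Back-of-the-envelope elasticity of wages with respect to labor supply:
#   elasticity = (% change in wage) / (% change in number of workers)
# The ~20% supply shock is an illustrative assumption, not a figure
# quoted in the excerpt above.

supply_shock = 0.20  # assumed growth in Miami's dropout workforce from Mariel

for wage_drop in (0.10, 0.30):  # the 10-30% relative wage decline cited
    elasticity = -wage_drop / supply_shock
    print(f"wage drop {wage_drop:.0%} -> implied elasticity {elasticity:.1f}")

# Output: wage drop 10% -> implied elasticity -0.5
#         wage drop 30% -> implied elasticity -1.5
# matching the -0.5 to -1.5 range reported in the abstract.
```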

Monday, September 21, 2015

The Lifecycle of Scholarly Articles across Fields of Economic Research

This might interest some of you:

The lifecycle of scholarly articles across fields of economic research, by Sebastian Galiani, Ramiro Gálvez, Maria Victoria Anauati, Vox EU: Citation counts stand as the de facto methodology for measuring the influence of scholarly articles in today’s economics profession. Nevertheless, a great deal of criticism has been made of the practice of naively using citation analysis to compare the impact of scholarly articles without taking into account other factors which may affect citation patterns (see Bornmann and Daniel 2008).
One recurrent criticism focuses on ‘field-dependent factors’... In a recent paper (Anauati et al. 2015), we analyze whether the ‘field-dependent factors’ critique is also valid for fields of research within economics. Our approach began by assigning every paper published in the top five economics journals – The American Economic Review, Econometrica, the Journal of Political Economy, The Quarterly Journal of Economics, and The Review of Economic Studies – to one of four fields of economic research (applied, applied theory, econometric methods, and theory).
The sample consisted of 9,672 articles published in the top five journals between 1970 and 2000. It did not include notes, comments, announcements or American Economic Review Papers and Proceedings issues. ...

What did they find?:

Conclusions Even though citation counts are an extremely valuable tool for measuring the importance of academic articles, the patterns observed for the lifecycles of papers across fields of economic research support the ‘field-dependent factors’ critique inside this discipline. Evidence seems to provide a basis for a caveat regarding the use of citation counts as a ‘one-size-fits-all’ yardstick to measure research outcomes in economics across fields of research, as the incentives generated by their use can be detrimental to fields of research which effectively generate valuable (but perhaps more specialized) knowledge, not only in economics but in other disciplines as well.
According to our findings, pure theoretical economic research is the clear loser in terms of citation counts. Therefore, if specialized journals' impact factors are calculated solely on the basis of citations during the first years after an article’s publication, then theoretical research will clearly not be attractive to departments, universities or journals that are trying to improve their rankings or to researchers who use their citation records when applying for better university positions or for grants. The opposite is true for applied papers and applied theory papers – these fields of research are the outright winners when citation counts are used as a measurement of articles' importance, and their citation patterns over time are highly attractive for all concerned. Econometric method papers are a special case; their citation patterns vary a great deal across different levels of success.

Saturday, September 19, 2015

Unemployment Insurance and Progressive Taxation as Automatic Stabilizers

Some preliminary results from a working paper by Alisdair McKay and Ricardo Reis:

Optimal Automatic Stabilizers, by Alisdair McKay and Ricardo Reis: 1 Introduction How generous should the unemployment insurance system be? How progressive should the tax system be? These questions have been studied extensively and there are well-known trade-offs between social insurance and incentives. Typically these issues are explored in the context of a stationary economy. These policies, however, also serve as automatic stabilizers that alter the dynamics of the business cycle. The purpose of this paper is to ask how and when aggregate stabilization objectives call for, say, more generous unemployment benefits or a more progressive tax system than would be desirable in a stationary economy. ...
We consider two classic automatic stabilizers: unemployment benefits and progressive taxation. Both of these policies have roles in redistributing income and in providing social insurance. Redistribution affects aggregate demand in our model because households differ in their marginal propensities to consume. Social insurance affects aggregate demand through precautionary savings decisions because markets are incomplete. In addition to unemployment insurance and progressive taxation, we also consider a fiscal rule that makes government spending respond automatically to the state of the economy.
Our focus is on the manner in which the optimal fiscal structure of the economy is altered by aggregate stabilization concerns. Increasing the scope of the automatic stabilizers can lead to welfare gains if they raise equilibrium output when it would otherwise be inefficiently low and vice versa. Therefore, it is not stabilization per se that is the objective but rather eliminating inefficient fluctuations. An important aspect of the model specification is therefore the extent of inefficient business cycle fluctuations. Our model generates inefficient fluctuations because prices are sticky and monetary policy cannot fully eliminate the distortions. We show that in a reasonable calibration, more generous unemployment benefits and more progressive taxation are helpful in reducing these inefficiencies. Simply put, if unemployment is high when there is a negative output gap, a larger unemployment benefit will stimulate aggregate demand when it is inefficiently low, thereby raising welfare. Similarly, if idiosyncratic risk is high when there is a negative output gap, providing social insurance through more progressive taxation will also increase welfare....
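To see the redistribution channel in the simplest possible terms, here is a toy illustration of my own (not the authors' model): with heterogeneous marginal propensities to consume, moving a dollar toward high-MPC households raises aggregate consumption by the MPC gap.

```python
# Toy illustration of the redistribution channel: moving $1 of income
# from a low-MPC household (e.g., employed, wealthy) to a high-MPC
# household (e.g., unemployed, liquidity-constrained) changes aggregate
# consumption by (mpc_high - mpc_low) per dollar transferred.
# The MPC values are illustrative assumptions.

mpc_low, mpc_high = 0.3, 0.9
transfer = 1.0  # dollars moved, e.g., via unemployment benefits

delta_consumption = (mpc_high - mpc_low) * transfer
print(f"Each transferred dollar raises consumption by ${delta_consumption:.2f}")
```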

Tuesday, September 15, 2015

'Market Power in Healthcare'

Via Austin Frakt at The Incidental Economist (I shortened the summaries):

Market Power: Recent NBER publications by Laurence Baker, M. Kate Bundorf, and Daniel Kessler:

1) “The Effect of Hospital/Physician Integration on Hospital Choice”:

We find that a hospital’s ownership of an admitting physician dramatically increases the probability that the physician’s patients will choose the owning hospital. We also find that ownership of an admitting physician has large effects on how the hospital’s cost and quality affect patients’ hospital choice. Patients whose admitting physician is not owned by a hospital are more likely to choose facilities that are low cost and high quality. ... We conclude that hospital/physician integration affects patients’ hospital choices in a way that is inconsistent with their best interests.

2) “Does Health Plan Generosity Enhance Hospital Market Power?”:

To what extent does the generosity of health insurance coverage facilitate the exercise of market power by producers of health services?  […]

We find a statistically significant and economically important effect of plan generosity on hospital prices in uncompetitive markets. ...

Monday, September 07, 2015

'Support for Redistribution in an Age of Rising Inequality: New Stylized Facts and Some Tentative Explanations'

From the NBER:

Support for Redistribution in an Age of Rising Inequality: New Stylized Facts and Some Tentative Explanations, by Vivekinan Ashok, Ilyana Kuziemko, and Ebonya Washington, NBER Working Paper No. 21529 Issued in September 2015 [open link to earlier version]: Despite the large increases in economic inequality since 1970, American survey respondents exhibit no increase in support for redistribution, in contrast to the predictions from standard theories of redistributive preferences. We replicate these results but further demonstrate substantial heterogeneity by demographic groups. In particular, the two groups who have most moved against income redistribution are the elderly and African-Americans. We find little evidence that these subgroup trends are explained by relative economic gains or growing cultural conservatism, two common explanations. We further show that the elderly trend is uniquely American, at least relative to other developed countries with comparable survey data. While we are unable to provide definitive evidence on the cause of these two groups' declining redistributive support, we offer additional correlations which may offer fruitful directions for future research on the topic. One story consistent with the data on elderly trends is that older Americans worry that redistribution will come at their expense, in particular via cuts to Medicare. We find that the elderly have grown increasingly opposed to government provision of health insurance and that controlling for this tendency explains about 40% of their declining support for redistribution. For blacks, controlling for their declining support of race-targeted aid explains nearly 45% of their differential decline in redistributive preferences (raising the question of why support for race-targeted aid has fallen during a period when black economic catch-up to whites has stalled).

Monday, August 31, 2015

'The Effect of Payday Lending Restrictions on Liquor Sales'

This is a summary of new research from two of our former graduate students here at the University of Oregon, Harold Cuffe and Chris Gibbs (link to full paper):

The effect of payday lending restrictions on liquor sales – Synopsis, by Harold Cuffe and Chris Gibbs: The practice of short-term consumer financing known as payday lending remains controversial because the theoretical gains in welfare from greater credit access stand in opposition to anecdotal evidence that many borrowers are made worse off. Advocates for the industry assert that the loans fill a gap in credit access for underserved individuals facing temporary financial hardship. Opponents, who include many state legislatures and the Obama administration, argue that lenders target financially vulnerable individuals with little ability to pay down their principal, who may end up paying many times the borrowed amount in interest and fees.
Regulations restricting both payday loan and liquor access seek to minimize the potential for overuse. To justify intervention in the two markets, policy makers note a host of negative externalities associated with each product, and cite behavioral motivations underlying individuals' consumption decisions. In particular, researchers have shown that the same models of impulsivity and dynamically inconsistent decision making - hyperbolic preferences and the cue theory of consumption - used to describe the demand for alcohol also describe patterns of payday loan usage. In these models, individuals can objectively benefit from a restricted choice set that limits their access to loans and liquor. The overlap in behavioral characteristics of over-users of both products suggests that liquor sales are a reasonable and interesting setting in which to test the effectiveness of payday lending regulations.
To identify the causal effect of lending restrictions on liquor sales, we exploit a change in payday lending laws in the State of Washington. Leveraging lender- and liquor store-level data, we estimate a difference-in-differences model comparing Washington to the neighboring State of Oregon, which did not experience a change in payday lending laws during this time. We find that the law change leads to a significant reduction in liquor sales, with the largest decreases occurring at liquor stores located very near to payday lenders at the time the law took effect. Our results provide compelling evidence on how credit constraints affect consumer spending, suggest a behavioral mechanism that may underlie some payday loan usage, and provide evidence that Washington's payday lending regulations reduced one form of loan misuse.
Washington State enacted HB 1709 on January 1, 2010, which introduced three major new restrictions on the payday loan industry. First, the law limited the size of a payday loan to 30% of a person's monthly income or $700, whichever is less. Second, the law created a state-wide database to track the issuance of payday loans, capping at eight the number of loans an individual could obtain in a twelve-month period and eliminating multiple concurrent loans. This effectively prohibited the repayment of an existing loan with a new one. In the year prior to the law, the State of Washington estimated that roughly one third of all payday loan borrowers took out more than eight loans. Finally, the law mandated that borrowers were entitled to a 90-day instalment plan to pay back loans of $400 or less, or 180 days for loans over $400.
The effect of the law on the industry was severe. There were 603 payday loan locations active in Washington in 2009, responsible for 3.24 million loans worth $1.366 billion, according to the Washington Division of Financial Institutions. In the year following the law change, the number of payday lenders dropped to 424, and loan volume fell to 1.09 million loans worth only $434 million. The following year the number of locations fell again, to 256, with a loan volume of roughly 900,000 loans worth $330 million. Today there are fewer than 200 lenders in Washington, and total loan volume and value have stabilized close to the 2011 values.
A crucial feature of our estimation strategy involves accounting for potentially endogenous supply-side factors that challenge efforts to separately identify changes in demand from the store response to the change. To do so, we focus on liquor control states, in which the state determines the number and location of liquor stores and the products offered, and harmonizes prices across stores to regulate and restrict liquor access. Oregon and Washington were both liquor control states throughout our sample period; Washington privatized liquor sales in June 2012.
Main Results
For this study, we use monthly store-level sales data provided by Oregon's and Washington's respective liquor control agencies from July 2008 through March 2012. Figure 4 plots estimated residuals from a regression of log liquor store sales on a set of store-by-month fixed effects, averaged over state and quarter. The graph possesses three notable features. First, prior to Washington's lending restrictions (indicated by the vertical dashed line), the states' log sales trend in parallel, confirming the plausibility of the “common trends” assumption of the DD model. Second, a persistent gap in the states' sales appears in the same quarter as the law change. This gap is the result of a relatively large downward movement in Washington's sales compared to Oregon's, consistent with a negative effect of the law on sales. Finally, the effect appears to be primarily a level shift, as sales in both states maintain a common upward trend.
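As an illustration, here is a minimal sketch of the difference-in-differences specification described above, on hypothetical store-month panel data; the column names, file name, and two-way fixed-effects layout are assumptions, not the authors' exact code.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical panel: one row per liquor store per month, with log sales.
# Assumed columns: store_id, month, log_sales, washington (0/1).
df = pd.read_csv("liquor_panel.csv")  # placeholder file name

# Treatment turns on for Washington stores after the law change.
df["post"] = (pd.to_datetime(df["month"]) >= "2010-01-01").astype(int)
df["treated_post"] = df["washington"] * df["post"]

# Difference-in-differences with store and month fixed effects; the
# coefficient on treated_post is the DD estimate of the law's effect on
# log liquor sales. Standard errors clustered by store.
model = smf.ols(
    "log_sales ~ treated_post + C(store_id) + C(month)", data=df
).fit(cov_type="cluster", cov_kwds={"groups": df["store_id"]})

print(model.params["treated_post"])
```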


Our regression estimates indicate that the introduction of payday lending restrictions reduced liquor store sales by approximately 3.6% (statistically significant at the 1% level). As average Washington liquor store sales were approximately $163,000 in the months prior to the law change, this represents a $5,900 decline per store each month. At the state level, the point estimate implies a $23.5 million annual decrease in liquor sales. As Washington State reported that the law decreased payday loans by $932 million from 2009 to 2010, this decline represents approximately 2.5% of the change in total value of loans issued.
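These magnitudes are internally consistent, as a quick back-of-the-envelope check shows (the store count below is backed out from the reported figures, not reported in the synopsis itself):

```python
# Reproducing the back-of-the-envelope numbers in the paragraph above.
avg_monthly_sales = 163_000      # average WA store sales before the law
effect = 0.036                   # estimated 3.6% reduction

per_store_monthly_drop = avg_monthly_sales * effect
print(f"per store, per month: ${per_store_monthly_drop:,.0f}")    # ~ $5,900

state_annual_drop = 23.5e6       # reported statewide annual decline
implied_stores = state_annual_drop / (per_store_monthly_drop * 12)
print(f"implied number of stores: {implied_stores:.0f}")          # ~ 330

loan_value_drop = 932e6          # reported fall in payday loan value
print(f"share of loan decline: {state_annual_drop / loan_value_drop:.1%}")  # ~ 2.5%
```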
We see two primary explanations (not mutually exclusive) for the decline in Washington liquor sales in response to the law change. First, the effect may represent a wider permanent reduction in consumption as households lose their ability to cope with unforeseen negative income shocks. Alternatively, the drop in spending may indicate a more direct financing of liquor purchases by individuals with present-biased preferences. The first explanation implies that restrictions on payday lending negatively affect consumer welfare, while the second allows for a positive impact, since individuals with present-biased preferences may be made objectively better off with a restricted choice set.
Zinman (2013) highlights Laibson's (2001) theory of Pavlovian cues as a particularly intriguing explanation for payday loan usage. In these models, consumer “impulsivity” makes instant gratification a special case during dynamic utility maximization, where exposure to a cue can explain dynamically inconsistent behavior. Indeed, Laibson uses liquor as a prime example of a consumption good thought to be influenced by cues, and subsequent experimental research on liquor uncovers evidence consistent with this hypothesis (MacKillop et al. (2010)). In situations where payday lenders locate very near to liquor stores, individuals may be exposed to a cue for alcohol, and then see the lender as a means to satisfy the urge to make an immediate purchase. A lender and liquor store separated by even a brief walk may be far enough apart to allow an individual to resist the urge to obtain both the loan and liquor. Of course, the cue theory of consumption makes lender-liquor store distance relevant even in circumstances where individuals experience a cue only after borrowing. Lenders locating near liquor stores increase the likelihood that an individual exposed to a cue is financially liquid, and able to act on an impulse.
To investigate liquor store and lender proximity, we geocode the stores' and lenders' street addresses, and calculate walking distances for all liquor store-lender pairs within two kilometers of one another. We then repeatedly estimate our preferred specification with a full set of controls on an ever-expanding window of liquor stores, beginning with the stores that were located within a ten-meter walking distance of a lender in the month prior to the law change, then within 100 meters, within 200 meters, and so on, out to two kilometers. These estimates are presented in Figure 5. The graph demonstrates a negative effect of 9.2% on those liquor stores that had a payday lender located within ten meters in the month before the law change (significant at the 1% level), an effect almost three times as large as the overall effect. This larger effect declines rapidly with distance, suggesting that even a small degree of separation is significant. The degree of nonlinearity in the relationship between distance and liquor sales supports the behavioral explanation of demand.
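Here is a minimal sketch of the distance computation and the expanding-window design, using straight-line (haversine) distance as a stand-in for the walking distances the authors compute, and made-up coordinates:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters; a straight-line stand-in for the
    walking distances the authors compute from street addresses."""
    r = 6_371_000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Illustrative coordinates (made up); real inputs come from geocoding.
stores = [("store_a", 47.6097, -122.3331), ("store_b", 47.6205, -122.3493)]
lenders = [("lender_1", 47.6099, -122.3330), ("lender_2", 47.6300, -122.3200)]

def nearest_lender_m(store):
    _, slat, slon = store
    return min(haversine_m(slat, slon, llat, llon) for _, llat, llon in lenders)

# Expanding-window design: keep stores whose nearest lender (pre-law) is
# within 10m, then 100m, 200m, ..., 2km; re-estimate the DD model on each
# sample and trace out the estimates as in the paper's Figure 5.
for window in [10] + list(range(100, 2001, 100)):
    sample = [s for s in stores if nearest_lender_m(s) <= window]
    print(window, len(sample))  # here: run the preferred DD spec on `sample`
```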


Our analysis provides the first empirical evidence of the connection between payday lending and spending on liquor. We uncover a clear reduction in liquor sales resulting from payday lending restrictions. In addition, we find that those liquor stores located very near to lenders at the time of the law change experience declines in sales almost three times as large as the overall average.
This finding is significant because it highlights that a segment of borrowers may be willing to assume significant risk by borrowing in order to engage in alcohol consumption - an activity which carries significant personal risk of its own. The connection between payday lending restrictions and reduced liquor purchases, therefore, suggests that the benefits to payday lending restrictions extend beyond personal finance and may be large.
Effective payday loan regulation should recognize the potential for greater credit access to help or harm consumers. As Carrell and Zinman (2014) highlight, heterogeneity likely exists within the pool of payday loan users, and external factors will influence the ratio of “productive and counter-productive borrowers.” Lending restrictions can seek to reduce the proportion of counterproductive borrowers through the prohibition of practices known to harm consumers, including those that rely upon leveraging behavioral responses such as addiction and impulsivity. The behavioral overlap identified in the literature between counterproductive payday loan borrowers and heavy alcohol users suggests that there exists a link between the two markets. The decline in liquor sales documented here provides evidence that these regulations may be effective in promoting productive borrowing.
1. Carrell, Scott and Jonathan Zinman, “In harm's way? Payday loan access and military personnel performance,” Review of Financial Studies, 2014, 27(9), 2805-2840.
2. Laibson, David, “A cue-theory of consumption,” Quarterly Journal of Economics, 2001, 116(1), 81-119.
3. MacKillop, James, Sean O'Hagen, Stephen A. Lisman, James G. Murphy, Lara A. Ray, Jennifer W. Tidey, John E. McGeary, and Peter M. Monti, “Behavioral economic analysis of cue-elicited craving for alcohol,” Addiction, 2010, 105(9), 1599-1607.
4. Zinman, Jonathan, “Consumer Credit: Too Much or Too Little (or Just Right)?,” NBER Working Paper No. 19682, November 2013.

Tuesday, August 25, 2015

'Great Recession Job Losses Severe, Enduring'

Nothing particularly surprising here -- the Great Recession was unusually severe and unusually long, and hence had unusual impacts, but it's good to have numbers characterizing what happened:

Great Recession Job Losses Severe, Enduring: Of those who lost full-time jobs between 2007 and 2009, only about 50 percent were employed in January 2010 and only about 75 percent of those were re-employed in full-time jobs.
The economic downturn that began in December 2007 was associated with a rapid rise in unemployment and with an especially pronounced increase in the number of long-term unemployed. In "Job Loss in the Great Recession and its Aftermath: U.S. Evidence from the Displaced Workers Survey" (NBER Working Paper No. 21216), Henry S. Farber uses data from the Displaced Workers Survey (DWS) from 1984-2014 to study labor market dynamics. From these data he calculates both the short-term and medium-term effects of the Great Recession's sharply elevated rate of job losses. He concludes that these effects have been particularly severe.

Of the workers who lost full-time jobs between 2007 and 2009, Farber reports, only about 50 percent were employed in January 2010 and only about 75 percent of those were re-employed in full-time jobs. This means only about 35 to 40 percent of those in the DWS who reported losing a job in 2007-09 were employed full-time in January 2010. This was by far the worst post-displacement employment experience of the 1981-2014 period.
The adverse employment experience of job losers has also been persistent. While both overall employment rates and full-time employment rates began to improve in 2009, even those who lost jobs between 2011 and 2013 had very low re-employment rates and, by historical standards, very low full-time employment rates.
In addition, the data show substantial weekly earnings declines even for those who did find work, although these earnings losses were not especially large by historical standards. Farber suggests that the earnings decline measure from the DWS is appropriate for understanding how job loss affects the earnings that a full-time-employed former job-loser is able to command.
The author notes that the measures on which he focuses may understate the true economic cost of job loss, since they do not consider the value of time spent unemployed or the value of lost health insurance and pension benefits.
Farber concludes that the costs of job losses in the Great Recession were unusually severe and remain substantial years later. Most importantly, workers laid off in the Great Recession and its aftermath have been much less successful at finding new jobs, particularly full-time jobs, than those laid off in earlier periods. The findings suggest that job loss since the Great Recession has had severe adverse consequences for employment and earnings.

'Thinking, Fast and Slow: Efforts to Reduce Youthful Crime in Chicago'

From the NBER Digest:

Thinking, Fast and Slow: Efforts to Reduce Youthful Crime in Chicago: Interventions that get youths to slow down and behave less automatically in high-stakes situations show positive results in three experiments.

Disparities in youth outcomes in the United States are striking. For example, among 15-to-24 year olds, the male homicide rate in 2013 was 18 times higher for blacks than for whites. Black males lose more years of potential life before age 65 to homicide than to heart disease, America's leading overall killer. A large body of research emphasizes that, beyond institutional factors, choices and behavior contribute to these outcomes. Those choices include decisions around dropping out of high school, involvement with drugs or gangs, and how to respond to confrontations that could escalate to serious violence.
In "Thinking, Fast and Slow? Some Field Experiments to Reduce Crime and Dropout in Chicago" (NBER Working Paper No. 21178), authors Sara B. Heller, Anuj K. Shah, Jonathan Guryan, Jens Ludwig, Sendhil Mullainathan, and Harold A. Pollack explain these behavioral differences using the psychology of automaticity. Because it is mentally costly to think through every situation in detail, all of us have automatic responses to some of the situations we encounter. These responses—automaticity—are tuned to situations we commonly face.
The authors present results from three large-scale, randomized experimental studies carried out in Chicago with economically disadvantaged male youth. All three experiments show sizable behavioral responses to fairly short-duration, automaticity-reducing interventions that get youths to slow down and behave less automatically in high-stakes situations.
The first intervention (called Becoming a Man, or BAM, developed by Chicago-area nonprofit Youth Guidance) involved 2,740 males in the 7th through 10th grades in 18 public schools on the south and west sides of the city. Some youths were offered an automaticity-reducing program once a week during school or an after-school sports intervention developed by Chicago nonprofit World Sport Chicago. The authors find that participation in the programming reduced arrests over the program year for violent crimes by 44 percent and for non-violent, non-property, non-drug crimes by 36 percent. Participation also increased engagement with school, which the authors estimate could translate into gains in graduation rates of between 7 and 22 percent.
A second study of BAM randomly assigned 2,064 male 9th and 10th graders within nine Chicago public high schools to the treatment or to a control condition. The authors found that arrests of youth in the treatment group were 31 percent lower than arrests in the control group.
The third intervention was delivered by trained detention staff to high-risk juveniles housed in the Cook County Juvenile Temporary Detention Center. The curriculum in this program, while different from the first two interventions, also focused on reducing automaticity. Some 5,728 males were randomly assigned to units inside the facility that did or did not implement the program. The authors found that those who received programming were about 16 percent less likely to be returned to the detention center than those who did not.
The sizable impacts the authors observe from all three interventions stand in stark contrast to the poor record of many efforts to improve the long-term life outcomes of disadvantaged youths. As with all randomized experiments, there is the question of whether these impacts generalize to other samples and settings. The interventions considered in this study would not be costly to expand. The authors estimate that the cost of the intervention for each participant in the first two studies was between $1,178 and $2,000. In the third case, the per-participant cost was about $60 per juvenile detainee. The results suggest that expanding these programs may be more cost-effective than other crime-prevention strategies that target younger individuals.
The authors also present results from various survey measures suggesting that the effects are not due to changes in mechanisms like emotional intelligence or self-control. On the other hand, results from some decision-making exercises the authors carried out seem to support reduced automaticity as a key mechanism. The results overall suggest that automaticity can be an important explanation for disparities in outcomes.

Monday, August 17, 2015

Stiglitz: Towards a General Theory of Deep Downturns

This is the abstract, introduction, and final section of a recent paper by Joe Stiglitz on theoretical models of deep depressions (as he notes, it's "an extension of the Presidential Address to the International Economic Association"):

Towards a General Theory of Deep Downturns, by Joseph E. Stiglitz, NBER Working Paper No. 21444, August 2015: Abstract This paper, an extension of the Presidential Address to the International Economic Association, evaluates alternative strands of macro-economics in terms of the three basic questions posed by deep downturns: What is the source of large perturbations? How can we explain the magnitude of volatility? How do we explain persistence? The paper argues that while real business cycles and New Keynesian theories with nominal rigidities may help explain certain historical episodes, alternative strands of New Keynesian economics focusing on financial market imperfections, credit, and real rigidities provide a more convincing interpretation of deep downturns, such as the Great Depression and the Great Recession, giving a more plausible explanation of the origins of downturns, their depth and duration. Since excessive credit expansions have preceded many deep downturns, particularly important is an understanding of finance, the credit creation process and banking, which in a modern economy are markedly different from the way envisioned in more traditional models.
Introduction The world has been plagued by episodic deep downturns. The crisis that began in 2008 in the United States was the most recent, the deepest and longest in three quarters of a century. It came in spite of alleged “better” knowledge of how our economic system works, and belief among many that we had put economic fluctuations behind us. Our economic leaders touted the achievement of the Great Moderation.[2] As it turned out, belief in those models actually contributed to the crisis. It was the assumption that markets were efficient and self-regulating and that economic actors had the ability and incentives to manage their own risks that had led to the belief that self-regulation was all that was required to ensure that the financial system worked well, and that there was no need to worry about a bubble. The idea that the economy could, through diversification, effectively eliminate risk contributed to complacency — even after it was evident that there had been a bubble. Indeed, even after the bubble broke, Bernanke could boast that the risks were contained.[3] These beliefs were supported by (pre-crisis) DSGE models — models which may have done well in more normal times, but had little to say about crises. Of course, almost any “decent” model would do reasonably well in normal times. And it mattered little if, in normal times, one model did a slightly better job in predicting next quarter’s growth. What matters is predicting — and preventing — crises, episodes in which there is an enormous loss in well-being. These models did not see the crisis coming, and they had given confidence to our policy makers that, so long as inflation was contained — and monetary authorities boasted that they had done this — the economy would perform well. At best, they can be thought of as (borrowing the term from Guzman (2014)) “models of the Great Moderation,” predicting “well” so long as nothing unusual happens. More generally, the DSGE models have done a poor job explaining the actual frequency of crises.[4]
Of course, deep downturns have marked capitalist economies since the beginning. It took enormous hubris to believe that the economic forces which had given rise to crises in the past were either not present, or had been tamed, through sound monetary and fiscal policy.[5] It took even greater hubris given that in many countries conservatives had succeeded in dismantling the regulatory regimes and automatic stabilizers that had helped prevent crises since the Great Depression. It is noteworthy that my teacher, Charles Kindleberger, in his great study of the booms and panics that afflicted market economies over the past several hundred years, had noted similar hubris exhibited in earlier crises (Kindleberger, 1978).
Those who attempted to defend the failed economic models and the policies which were derived from them suggested that no model could (or should) predict well a “once in a hundred year flood.” But it was not just a hundred year flood — crises have become common. It was not just something that had happened to the economy. The crisis was man-made — created by the economic system. Clearly, something is wrong with the models.
Studying crises is important, not just to prevent these calamities and to understand how to respond to them — though I do believe that the same inadequate models that failed to predict the crisis also failed in providing adequate responses. (Although those in the US Administration boast about having prevented another Great Depression, I believe the downturn was certainly far longer, and probably far deeper, than it needed to have been.) I also believe understanding the dynamics of crises can provide us insight into the behavior of our economic system in less extreme times.
This lecture consists of three parts. In the first, I will outline the three basic questions posed by deep downturns. In the second, I will sketch the three alternative approaches that have competed with each other over the past three decades, suggesting that one is a far better basis for future research than the other two. The final section will center on one aspect of that third approach that I believe is crucial — credit. I focus on the capitalist economy as a credit economy, and how viewing it in this way changes our understanding of the financial system and monetary policy. ...

He concludes with:

IV. The crisis in economics The 2008 crisis was not only a crisis in the economy, but it was also a crisis for economics — or at least that should have been the case. As we have noted, the standard models didn’t do very well. The criticism is not just that the models did not anticipate or predict the crisis (even shortly before it occurred); they did not contemplate the possibility of a crisis, or at least a crisis of this sort. Because markets were supposed to be efficient, there weren’t supposed to be bubbles. The shocks to the economy were supposed to be exogenous: this one was created by the market itself. Thus, the standard model said the crisis couldn’t or wouldn’t happen; and the standard model had no insights into what generated it.
Not surprisingly, as we again have noted, the standard models provided inadequate guidance on how to respond. Even after the bubble broke, it was argued that diversification of risk meant that the macroeconomic consequences would be limited. The standard theory also has had little to say about why the downturn has been so prolonged: Years after the onset of the crisis, large parts of the world are operating well below their potential. In some countries and in some dimension, the downturn is as bad or worse than the Great Depression. Moreover, there is a risk of significant hysteresis effects from protracted unemployment, especially of youth.
The Real Business Cycle and New Keynesian Theories got off to a bad start. They originated out of work undertaken in the 1970s attempting to reconcile the two seemingly distant branches of economics, macro-economics, centering on explaining the major market failure of unemployment, and microeconomics, the centerpiece of which was the Fundamental Theorems of Welfare Economics, demonstrating the efficiency of markets.[66] Real Business Cycle Theory (and its predecessor, New Classical Economics) took one route: using the assumptions of standard micro-economics to construct an analysis of the aggregative behavior of the economy. In doing so, they left Hamlet out of the play: almost by assumption unemployment and other market failures didn’t exist. The timing of their work couldn’t have been worse: for it was just around the same time that economists developed alternative micro-theories, based on asymmetric information, game theory, and behavioral economics, which provided better explanations of a wide range of micro-behavior than did the traditional theory on which the “new macro-economics” was being constructed. At the same time, Sonnenschein (1972) and Mantel (1974) showed that the standard theory provided essentially no structure for macro-economics — essentially any demand or supply function could have been generated by a set of diverse rational consumers. It was the unrealistic assumption of the representative agent that gave theoretical structure to the macro-economic models that were being developed. (As we noted, New Keynesian DSGE models were but a simple variant of these Real Business Cycles, assuming nominal wage and price rigidities — with explanations, we have suggested, that were hardly persuasive.)
There are alternative models to both Real Business Cycles and the New Keynesian DSGE models that provide better insights into the functioning of the macroeconomy, and are more consistent with micro-behavior, with new developments of micro-economics, and with what has happened in this and other deep downturns. While these new models differ from the older ones in a multitude of ways, at the center of these models is a wide variety of financial market imperfections and a deep analysis of the process of credit creation. These models provide alternative (and I believe better) insights into what kinds of macroeconomic policies would restore the economy to prosperity and maintain macro-stability.
This lecture has attempted to sketch some elements of these alternative approaches. There is a rich research agenda ahead.

Sunday, August 16, 2015

'The U.S. Foreclosure Crisis Was Not Just a Subprime Event'

From the NBER Digest:

The U.S. Foreclosure Crisis Was Not Just a Subprime Event, by Les Picker, NBER: Many studies of the housing market collapse of the last decade, and the associated sharp rise in defaults and foreclosures, focus on the role of the subprime mortgage sector. Yet subprime loans comprise a relatively small share of the U.S. housing market, usually about 15 percent and never more than 21 percent. Many studies also focus on the period leading up to 2008, even though most foreclosures occurred subsequently. In "A New Look at the U.S. Foreclosure Crisis: Panel Data Evidence of Prime and Subprime Borrowers from 1997 to 2012" (NBER Working Paper No. 21261), Fernando Ferreira and Joseph Gyourko provide new facts about the foreclosure crisis and investigate various explanations of why homeowners lost their homes during the housing bust. They employ microdata that track outcomes well past the beginning of the crisis and cover all types of house purchase financing—prime and subprime mortgages, Federal Housing Administration (FHA)/Veterans Administration (VA)-insured loans, loans from small or infrequent lenders, and all-cash buyers. Their data contain information on over 33 million unique ownership sequences in just over 19 million distinct owner-occupied housing units from 1997-2012.


The researchers find that the crisis was not solely, or even primarily, a subprime sector event. It began that way, but quickly expanded into a much broader phenomenon dominated by prime borrowers' loss of homes. There were only seven quarters, all concentrated at the beginning of the housing market bust, when more homes were lost by subprime than by prime borrowers. In this period 39,094 more subprime than prime borrowers lost their homes. This small difference was reversed by the beginning of 2009. Between 2009 and 2012, 656,003 more prime than subprime borrowers lost their homes. Twice as many prime borrowers as subprime borrowers lost their homes over the full sample period.
The authors suggest that one reason for this pattern is that the number of prime borrowers dwarfs that of subprime borrowers and the other borrower/owner categories they consider. The prime borrower share averages around 60 percent and did not decline during the housing boom. Although the subprime borrower share nearly doubled during the boom, it peaked at just over 20 percent of the market. Subprime's increasing share came at the expense of the FHA/VA-insured sector, not the prime sector.
The authors' key empirical finding is that negative equity conditions can explain virtually all of the difference in foreclosure and short sale outcomes of prime borrowers compared to all cash owners. Negative equity also accounts for approximately two-thirds of the variation in subprime borrower distress. Both are true on average, over time, and across metropolitan areas.
None of the other 'usual suspects' raised by previous research or public commentators—housing quality, race and gender demographics, buyer income, and speculator status—were found to have had a major impact. Certain loan-related attributes such as initial loan-to-value (LTV), whether a refinancing occurred or a second mortgage was taken on, and loan cohort origination quarter did have some independent influence, but much weaker than that of current LTV.
The authors' findings imply that large numbers of prime borrowers who did not start out with extremely high LTVs still lost their homes to foreclosure. They conclude that the economic cycle was more important than initial buyer, housing and mortgage conditions in explaining the foreclosure crisis. These findings suggest that effective regulation is not just a matter of restricting certain exotic subprime contracts associated with extremely high default rates.

Monday, August 10, 2015

Job Training and Government Multipliers

Two new papers from the NBER:

What Works? A Meta Analysis of Recent Active Labor Market Program Evaluations, by David Card, Jochen Kluve, and Andrea Weber, NBER Working Paper No. 21431 Issued in July 2015: We present a meta-analysis of impact estimates from over 200 recent econometric evaluations of active labor market programs from around the world. We classify estimates by program type and participant group, and distinguish between three different post-program time horizons. Using meta-analytic models for the effect size of a given estimate (for studies that model the probability of employment) and for the sign and significance of the estimate (for all the studies in our sample) we conclude that: (1) average impacts are close to zero in the short run, but become more positive 2-3 years after completion of the program; (2) the time profile of impacts varies by type of program, with larger gains for programs that emphasize human capital accumulation; (3) there is systematic heterogeneity across participant groups, with larger impacts for females and participants who enter from long term unemployment; (4) active labor market programs are more likely to show positive impacts in a recession. [open link]
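For readers curious what the pooling step in a meta-analysis like this looks like, here is a generic random-effects (DerSimonian-Laird) sketch with made-up effect sizes; the authors' actual meta-analytic models are richer than this.

```python
import numpy as np

# Minimal random-effects meta-analysis (DerSimonian-Laird), the generic
# version of the kind of pooling used for program effect sizes.
# Effect sizes and standard errors below are made-up illustrations.

effects = np.array([0.02, -0.01, 0.08, 0.12, 0.05])  # e.g., employment effects
se = np.array([0.03, 0.02, 0.04, 0.05, 0.03])

w = 1 / se**2                            # fixed-effect (inverse-variance) weights
fe = np.sum(w * effects) / np.sum(w)     # fixed-effect pooled estimate

# Between-study heterogeneity (tau^2) via DerSimonian-Laird:
q = np.sum(w * (effects - fe) ** 2)
dof = len(effects) - 1
c = np.sum(w) - np.sum(w**2) / np.sum(w)
tau2 = max(0.0, (q - dof) / c)

w_re = 1 / (se**2 + tau2)                # random-effects weights
re = np.sum(w_re * effects) / np.sum(w_re)
re_se = np.sqrt(1 / np.sum(w_re))
print(f"pooled effect: {re:.3f} (se {re_se:.3f})")
```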


Clearing Up the Fiscal Multiplier Morass: Prior and Posterior Analysis, by Eric M. Leeper, Nora Traum, and Todd B. Walker, NBER Working Paper No. 21433 Issued in July 2015: We use Bayesian prior and posterior analysis of a monetary DSGE model, extended to include fiscal details and two distinct monetary-fiscal policy regimes, to quantify government spending multipliers in U.S. data. The combination of model specification, observable data, and relatively diffuse priors for some parameters lands posterior estimates in regions of the parameter space that yield fresh perspectives on the transmission mechanisms that underlie government spending multipliers. Posterior mean estimates of short-run output multipliers are comparable across regimes—about 1.4 on impact—but much larger after 10 years under passive money/active fiscal than under active money/passive fiscal—means of 1.9 versus 0.7 in present value. [open link]
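The present-value multiplier in the abstract is the standard one from the fiscal-multiplier literature: the discounted cumulative output response divided by the discounted cumulative spending response. A minimal sketch with made-up impulse responses (the decay rates and discount factor are illustrative assumptions):

```python
# Present-value government spending multiplier at horizon k:
#   PV(k) = sum_{t=0..k} beta^t * dY_t  /  sum_{t=0..k} beta^t * dG_t
# The impulse responses below are made-up illustrations.

def pv_multiplier(dY, dG, beta=0.99, k=None):
    k = len(dY) - 1 if k is None else k
    disc = [beta**t for t in range(k + 1)]
    return sum(d * y for d, y in zip(disc, dY)) / sum(d * g for d, g in zip(disc, dG))

# Illustrative quarterly responses to a spending shock:
dG = [1.0 * 0.9**t for t in range(41)]    # spending decays slowly
dY = [1.4 * 0.85**t for t in range(41)]   # output response, 1.4 on impact

print(pv_multiplier(dY, dG, k=0))    # impact multiplier = 1.4
print(pv_multiplier(dY, dG, k=40))   # 10-year present-value multiplier
```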

Thursday, August 06, 2015

'Buying Locally'

Via the blog A Fine Theorem:

“Buying Locally,” G. J. Mailath, A. Postlewaite & L. Samuelson (2015): Arrangements where agents commit to buy only from selected vendors, even when there are more preferred products at better prices from other vendors, are common. Consider local currencies like “Ithaca Hours”, which can only be used at other participating stores and which are not generally convertible, or trading circles among co-ethnics even when trust or unobserved product quality is not important. The intuition people have for “buying locally” is to, in some sense, “keep the profits in the community”; that is, even if you don’t care at all about friendly local service or some other utility-enhancing aspect of the local store, you should still patronize it. The fruit vendor should buy from the local bookstore even when her selection is subpar, and the book vendor should in turn patronize you even when fruits are cheaper at the supermarket.
At first blush, this seems odd to an economist. Why would people voluntarily buy something they don’t prefer? What Mailath and his coauthors show is that, actually, the noneconomist intuition is at least partially correct when individuals are both sellers and buyers. Here’s the idea. ....
One thing that isn’t explicit in the paper, perhaps because it is too trivial despite its importance, is how buy local arrangements affect welfare..., an intriguing possibility is that “buy local” arrangements may not harm social welfare at all, even if they are beneficial to in-group members. ...
[May 2015 working paper (RePEc IDEAS version)]

Tuesday, August 04, 2015

'The US Financial Sector in the Long-Run: Where are the Economies of Scale?'

And one more before heading out the door. From Tim Taylor:

The US Financial Sector in the Long-Run: Where are the Economies of Scale?: A larger financial sector is clearly correlated with economic development, in the sense that high-income countries around the world have on average larger markets for banks, credit cards, stock and bond markets, and so on compared with lower-income countries. But there are also concerns that the financial sector in high-income countries can grow in ways that end up creating economic instability (as I've discussed here, here, and here). Thomas Philippon provides some basic evidence on the growth of the US financial sector over the past 130 years in "Has the US Finance Industry Become Less Efficient? On the Theory and Measurement of Financial Intermediation," published in the April 2015 issue of the American Economic Review (105:4, pp. 1408–1438). The AER is not freely available online, but many readers can obtain access through a library subscription.

There are a couple of ways to think about the size of a country's financial sector relative to its economy. One can add up the size of certain financial markets--the market value of bank loans, stocks, bonds, and the like--and divide by GDP. Or one can add up the economic value added by the financial sector. For example, instead of adding up the bank loans, you add up the value of banking services provided. Similarly, instead of adding up the value of the stock market, you add up the value of the services provided by stockbrokers and investment managers.

Here's a figure from Philippon showing both measures of finance as a share of the US economy over the long run since 1886.

The orange line measured on the right axis is "intermediated assets," which measures the size of the financial sector as the sum of all debt and equity issued by nonfinancial firms, together with the sum of all household debt, and some other smaller categories. Back in the late 19th century, the US financial sector was roughly equal in size to GDP. By just before the Great Depression, it had risen to almost three times GDP, before sinking back to about 1.5 times GDP. More recently, you can see the financial sector spiking with the boom in real estate markets and stock markets in the mid-2000s at more than 4 times GDP, before dropping slightly. The overall trend is clearly up, but it's also clearly a bumpy ride.

The green line shows "finance income," which can be understood as a measure of the value added by firms in the financial sector. For the uninitiated, "value added" has a specific meaning to economists. Basically, it is calculated by taking the total revenue of a firm and subtracting the cost of all goods and services purchased from other firms--for example, subtracting the costs of supplies purchased or machinery. In the figure, the "value added" that makes up "finance income" consists mostly of the wages and salaries paid by financial firms, along with any profits earned.

An intriguing pattern emerges here: finance income tracks intermediated assets fairly closely. In other words, the amount paid to the financial sector is more-or-less a fixed proportion of total financial assets. It's not obvious why this should be so. For example, imagine that because of a rise in housing prices, the total mortgage debt of households rises substantially over time, or because of rising stock prices over several decades, the total value of the stock market is up. Especially in an economy where information technology is making rapid strides, it's not clear why incomes in the financial sector should be rising at the same pace. Does a bank need to incur twice the costs if it issues a mortgage for $500,000 as compared to when it issues a mortgage for $250,000? Does an investment adviser need to incur twice the costs when giving advice on a retirement account of $1 million as when giving advice on a retirement account of $500,000? Shouldn't there be some economies of scale in financial services?
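One way to frame the pattern is as a roughly constant "unit cost of intermediation": finance income divided by intermediated assets. Here is a sketch with rough illustrative numbers of the kind one might read off such a figure, not the paper's actual series:

```python
# Unit cost of intermediation = finance income / intermediated assets,
# both expressed relative to GDP. The pattern described above is that
# this ratio stays roughly constant over time. The numbers below are
# rough illustrations, not the paper's data.

years = [1900, 1930, 1960, 2006]
intermediated_assets_to_gdp = [1.0, 3.0, 1.5, 4.0]
finance_income_to_gdp = [0.02, 0.06, 0.03, 0.08]

for yr, a, inc in zip(years, intermediated_assets_to_gdp, finance_income_to_gdp):
    print(f"{yr}: unit cost = {inc / a:.1%}")  # ~2% in every period here
```

If there were economies of scale, this ratio should fall as intermediated assets grow; the puzzle is that it does not.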

Philippon isn't the first to raise this question: for example, Burton Malkiel has asked why there aren't economies of scale in asset management fees here. But Philippon provides evidence that, for whatever reason, a lack of economies of scale has been widespread and long-lasting in the US financial sector.

Full disclosure: The AER is published by the American Economic Association, which is also the publisher of the Journal of Economic Perspectives, where I have worked as Managing Editor since 1986.

Tuesday, July 28, 2015

Is Content Aggregation Harmful?

This is from the NBER (Project Syndicate, are you listening?):

Content Aggregation by Platforms: The Case of the News Media, by Lesley Chiou and Catherine Tucker, NBER Working Paper No. 21404, July 2015: ... In recent years, the digitization of content has led to the prominence of platforms as aggregators of content in many economically important industries, including media and Internet-based industries (Evans and Schmalensee, 2012).
These new platforms consolidate content from multiple sources into one place, thereby lowering the transactions costs of obtaining content and introducing new information to consumers. ... For these reasons, platforms have attracted considerable legal and policy attention. ...
Our results indicate that ... the traffic effect is large, as aggregators may guide users to new content. We do not find evidence of a scanning effect...
Our empirical distinction between a scanning effect, where the aggregator substitutes for original content, and a traffic effect, where the aggregator is complementary, is useful for analyzing the potential policy implications of such business models. The fact that we find evidence of a "traffic effect" even with a relatively large amount of content on an aggregator is perhaps evidence that the "fair use" exemptions often relied on by such sites are less potentially damaging to the original copyright holder than often thought.

On the comment that the benefits outweigh the harm "even with a relatively large amount of content on an aggregator": when I post an entire article, as I did yesterday with this Vox EU piece, a surprisingly high percentage of you still click through to the original.

With video, at least in most cases, there is embed code available to put the video on your site. You play it and it has ads, branding, etc. I've always thought (or maybe hoped) that content providers should do the same thing: provide an embed button that allows me to duplicate an article--complete with its ads, links to other content on their site, etc.--on my site. Reads of the article would go way up (not just from my site, I mean if they allowed everyone to do this), and it would increase the number of people who see ads associated with their content (so they could charge more).

Monday, July 27, 2015

'Poor Little Rich Kids? The Determinants of the Intergenerational Transmission of Wealth'

Genes are not as important as people think:

Poor Little Rich Kids? The Determinants of the Intergenerational Transmission of Wealth, by Sandra E. Black, Paul J. Devereux, Petter Lundborg, and Kaveh Majlesi, NBER Working Paper No. 21409 Issued in July 2015: Wealth is highly correlated between parents and their children; however, little is known about the extent to which these relationships are genetic or determined by environmental factors. We use administrative data on the net wealth of a large sample of Swedish adoptees merged with similar information for their biological and adoptive parents. Comparing the relationship between the wealth of adopted and biological parents and that of the adopted child, we find that, even prior to any inheritance, there is a substantial role for environment and a much smaller role for genetics. We also examine the role played by bequests and find that, when they are taken into account, the role of adoptive parental wealth becomes much stronger. Our findings suggest that wealth transmission is not primarily because children from wealthier families are inherently more talented or more able but that, even in relatively egalitarian Sweden, wealth begets wealth.

[Open link]

Tuesday, July 21, 2015

'Farmers Markets and Food-Borne Illness'

Marc Bellemare:

Farmers Markets and Food-Borne Illness: ... After working on it for almost two years, I am happy to finally be able to circulate my new paper titled “Farmers Markets and Food-Borne Illness,” coauthored with my colleague Rob King and my student Jenny Nguyen, in which we ask whether farmers markets are associated with food-borne illness in a systematic way. ...

In sum, what we find is:

  1. A positive relationship between the number of farmers markets and the number of reported outbreaks of food-borne illness in the average state-year.
  2. A positive relationship between the number of farmers markets and the number of reported cases of food-borne illness in the average state-year.
  3. A positive relationship between the number of farmers markets and the number of reported outbreaks of Campylobacter jejuni in the average state-year.
  4. A positive relationship between the number of farmers markets and the number of reported cases of Campylobacter jejuni in the average state-year.
  5. Six dogs that didn’t bark, i.e., no systematic relationship between the number of farmers markets and the number of outbreaks or cases of norovirus, Salmonella enterica, Clostridium perfringens, E. coli, Staphylococcus (i.e., staph), or scombroid food poisoning.
  6. When controlling for the number of farmers markets, there is a negative relationship between the number of farmers markets that accept SNAP and food-borne illness in the average state-year.
  7. A doubling of the number of farmers markets in the average state-year would be associated with a relatively modest economic cost of about $900,000 in terms of additional cases of food-borne illness.

Of course, correlation is not causation, which is why we spend a great deal of time in the paper discussing the potential threats to causal identification in this context, investigating them, and trying to triangulate our findings with a number of different specifications and estimators. At the end of the day, we are pretty confident in the robustness of our core finding, viz. that there is a positive association between the number of farmers markets and the number of reported outbreaks and cases of food-borne illness. ...

Sunday, June 07, 2015

'Cyclical Variation in Real Wages'

More than 75 years ago, the EJ – under Keynes's editorship – published a series of papers on the behavior of real wages that have had a lasting impact on the discipline. This special anniversary session discusses debates, then and now, about real wage dynamics, unemployment fluctuations, and wage flexibility.


  • Keynesian Controversies on Compensation; Presented by John Pencavel (Stanford University)
  • Unemployment and Business Cycles; Presented by Lawrence Christiano (Northwestern University)
  • Unemployment Fluctuations, Match Quality and Wage Cyclicality of New Hires; Presented by Christopher Huckfeldt (Cornell University) and Antonella Trigari (Bocconi University)
  • Does the New Keynesian Model have a Uniqueness Problem? Presented by Benjamin Johannsen (Federal Reserve Board)

I really enjoyed this session, particularly the history of "Keynesian controversies" over wages by John Pencavel at the beginning of the session.

Wednesday, June 03, 2015

'Coordination Equilibrium and Price Stickiness'

This is the introduction to a relatively new working paper by Cigdem Gizem Korpeoglu and Stephen Spear (sent in response to my comment that I've been disappointed with the development of new alternatives to the standard NK-DSGE models):

Coordination Equilibrium and Price Stickiness, by Cigdem Gizem Korpeoglu (University College London) and Stephen E. Spear (Carnegie Mellon): 1 Introduction Contemporary macroeconomic theory rests on the three pillars of imperfect competition, nominal price rigidity, and strategic complementarity. Of these three, nominal price rigidity (aka price stickiness) has been the most important. The stickiness of prices is a well-established empirical fact, with early observations about the phenomenon going back to Alfred Marshall. Because the friction of price stickiness cannot occur in markets with perfect competition, modern micro-founded models (New Keynesian or NK models, for short) have been forced to abandon the standard Arrow-Debreu paradigm of perfect competition in favor of models where agents have market power and set market prices for their own goods. Strategic complementarity enters the picture as a mechanism for explaining the kinds of coordination failures that lead to sustained slumps like the Great Depression or the aftermath of the 2008 financial crisis. Early work by Cooper and John laid out the importance of these three features for macroeconomics, and follow-on work by Ball and Romer showed that failure to coordinate on price adjustments could itself generate strategic complementarity, effectively unifying two of the three pillars.
Not surprisingly, the Ball and Romer work was based on earlier work by a number of authors (see Mankiw and Romer's New Keynesian Economics) which used the Dixit-Stiglitz model of monopolistic competition as the basis for price-setting behavior in a general equilibrium setting, combined with the idea of menu costs -- literally the cost of posting and communicating price changes -- and exogenously-specified adjustment time staggering to provide the friction(s) leading to nominal rigidity. While these models perform well in explaining aspects of the business cycle, they have only recently been subjected to what one would characterize as thorough empirical testing, because of the scarcity of good data on how prices actually change. This has changed in the past decade as new sources of data on price dynamics have become available, and as computational power capable of teasing out what might be called the "fine structure" of these dynamics has emerged. On a different dimension, the overall suitability of monopolistic competition as the appropriate form of market imperfection to use as the foundation of the new macro models has been largely unquestioned, though we believe this is largely due to the tractability of the Dixit-Stiglitz model relative to other models of imperfect competition generated by large fixed costs or increasing returns to scale not due to specialization.
In this paper, we examine both of these underlying assumptions in light of what the new empirics on pricing dynamics has found, and propose a different, and we believe, better microfoundation for New Keynesian macroeconomics based on the Shapley-Shubik market game.

Tuesday, June 02, 2015

'Stabilizing Wage Policy'

I have argued many, many times that we did not do nearly enough to help households repair their balance sheets (especially when compared to the attention that bank balance sheets received), so I like this idea from Stanford's Mordecai Kurz:

Stabilizing Wage Policy, by Mordecai Kurz, Department of Economics, Stanford University, Stanford, CA (this version: May 27, 2015): Summary: A rapid recovery from deflationary shocks that result in a transition to the Zero Lower Bound (ZLB) requires that policy generate an inflationary counter-force. Monetary policy cannot achieve it, and the lesson of the 2007-2015 Great Recession is that growing debt gives rise to a political gridlock which prevents restoring full employment through deficit-financed public spending. Even optimal investments in needed public projects cannot be undertaken at a zero interest rate. Hence, the failure of policy to arrest the massive damage of the eight-year Great Recession shows the need for new policy tools. I propose such a policy for the ZLB, called "Stabilizing Wage Policy," which requires public intervention in markets instead of deficit-financed expenditures. Section 1 develops a New Keynesian model with diverse beliefs and inflexible wages. Section 2 presents the policy and studies its efficacy.
The integrated New Keynesian (NK) model economy consists of a lower sub-economy under a ZLB and an upper sub-economy with a positive rate, linked by random transition between them. Household-firm-managers hold heterogeneous beliefs, and the inflexible wage is based on a four-quarter staggered wage structure, so that the mean wage is a relatively inflexible function of inflation, of unemployment, and of a distributed lag of productivity. Equilibrium maps of the two sub-economies exhibit significant differences, which emerge from the relative rates at which the nominal rate, prices, and the wage rate adjust to shocks. Two key results: first, decline to the ZLB lower sub-economy causes a powerful debt-deflation spiral. Second, output, inflation, and real wages rise in the lower sub-economy if all base wages are unexpectedly raised. Unemployment falls. This result is explored and explained since it is the key analytic result that motivates the policy.
A Stabilizing Wage Policy aims to repair households' balance sheets, expedite recovery, and hasten exit from the ZLB. It raises base wages for the policy's duration, with quarterly cost-of-living adjustment and a prohibition on altering base wages in order to nullify the policy. I use demand shocks to cause a recession under a ZLB and a deleveraging rule to measure recovery. The rule is calibrated to repair the damaged balance sheets of US households in 2007-2015. Sufficient deleveraging and a positive rate in the upper sub-economy without a wage policy are required for exit; hence, at exit time, inflation and output in the lower sub-economy are irrelevant to the exit decision. Simulations show that an effective policy selects high intensity at the outset: given the 2007-2015 experience, a constant 10% increase in base wages raises the equilibrium mean wage by about 5.5%, generates a controlled inflation of 5%-6% at exit time, and attains recovery in a fraction of the time it takes without the policy. Under a successful policy, inflation exceeds the target at exit time, and when the policy terminates, inflation abates rapidly if the inflation target is intact. I suggest that a stabilizing wage policy with a constant 10% increase in base wages could have been initiated in September 2008. If controlled inflation of 5% for 2.25 years would have been politically tolerated, the US would have recovered and exited the ZLB in 9 quarters, with full employment restored by 2012. Lower policy intensity would have resulted in a smaller increase in the mean wage and lower inflation, but a longer recession. The policy would not have required any federal budget expenditures, it would have reduced public deficits after 2010, and the US would have reached 2015 with a lower national debt.
The policy negates the effect of the demand shocks which cause the recession and the binding ZLB. It attains its goal with strong temporary intervention in the market instead of generating demand with public expenditures. It does not solve other long-term structural problems that persist after exit from the ZLB and require other solutions.

Monday, June 01, 2015

Forecasting in Economic Crises

Maurizio Bovi summarizes a newly published paper he wrote with Roy Cerqueti. The paper examines lay agents' forecasts amid great recessions (with a special focus on Greece). The paper is Bovi, M. and R. Cerqueti (2015), "Forecasting macroeconomic fundamentals in economic crises," Annals of Operations Research, DOI 10.1007/s10479-015-1879-4:

Forecasting in Economic Crises: Expectations are a key factor in economics, and heterogeneous forecasts are a fact of life. To take one example, there are significant and well-known incentives to become a sports champion or to win a Nobel Prize, yet very few people succeed in the endeavor. The brutal truth is that the majority lags behind or gives up - heterogeneity is the rule, not the exception. By the same token, lay forecasters may learn, but it is unrealistic to think that all of them—even in the long run—will achieve Muth-optimal and, hence, homogeneous forecasts. The situation is made even more complex, and more interesting to study, when the fundamental to be predicted, the real GDP growth rate, is well below zero and highly volatile.
In recent work (Bovi and Cerqueti, 2015) we address the topic of heterogeneous forecasting performances amid deep recessions. Lay agents are assumed to have different predictive ability in that they have equal loss functions, but different asymmetry parameters that are used as control to minimize their forecasting errors. Simulating poor performing economies populated by three groups of forecasters, we have obtained the following results.
The less sophisticated forecasters in our setting - the "medians" (who use a passive rule of thumb) - never perform as well as the best predictors - the "muthians" - whereas "second best" (SB) agents (who act as attentive econometricians) do so only occasionally. This holds regardless of the size of the crisis. Thus, as in the real world, heterogeneity is a structural trait of our artificial economy. More intriguingly, the simulations also show that the medians' behavior tends to be smoother than that of SB agents, and that the difference between them widens in the case of very serious crises. In particular, great recessions make SB agents' predictions relatively more biased. An explanation is that dramatic crises expand the available information set (e.g., due to greater mass media coverage), and this extra information sways SB agents, who are more prone than medians to revise their forecasts.
Our results are somewhat in line with Simon's famous observation that more information does not necessarily mean better forecasting performance. Furthermore, our findings shed some light on the strange macroeconomic expectations observed in Greece in recent years. The current crisis, in fact, may be thought of as a sort of natural experiment for understanding how lay decision makers react to very dramatic times. In particular, owing to its terrible recent downturn, Greece is one of the most suitable cases, raising the following question: how do Greeks perceive their own personal financial situation relative to that of their country? Clearly, the representative citizen's situation cannot, by definition, systematically drift apart from that of the country where she lives, given that the nation-wide economic situation is the (weighted) sum of the individual ones. Yet it may be hard to remain objective in the course of a very deep and prolonged economic crisis. The evidence depicted in the following graph is rather suggestive of the effects of deep recessions on the rationality of people's expectations, and it conforms with our findings.

[Click on figure to enlarge]
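A note on the mechanics for interested readers: the "equal loss functions, different asymmetry parameters" setup can be illustrated with a standard lin-lin (pinball) loss, under which the optimal forecast is a quantile of the predictive distribution rather than its mean. The sketch below is my own toy illustration; the group labels and parameter values are placeholders, not the authors' code:

```python
import numpy as np

rng = np.random.default_rng(0)

def linlin_loss(error, tau):
    """Asymmetric lin-lin loss on the forecast error y - f:
    weight tau on under-prediction, (1 - tau) on over-prediction."""
    return np.where(error > 0, tau * error, (1 - tau) * (-error))

# Stylized 'deep recession' growth process: well below zero, volatile.
growth = rng.normal(loc=-4.0, scale=3.0, size=10_000)

# Three forecaster types sharing the same loss family but differing in
# the asymmetry parameter (labels and values are my placeholders).
taus = {"muthian": 0.50, "second-best": 0.70, "median-rule": 0.30}

for name, tau in taus.items():
    # Under lin-lin loss the optimal forecast is the tau-quantile of
    # the predictive distribution, not its mean.
    forecast = np.quantile(growth, tau)
    avg_loss = linlin_loss(growth - forecast, tau).mean()
    print(f"{name:12s} tau={tau:.2f}  forecast={forecast:6.2f}  loss={avg_loss:.3f}")
```

Agents with different asymmetry parameters issue systematically different forecasts from the same information even when everyone optimizes, which is one way to see why heterogeneity can be a structural trait rather than an error.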

Monday, May 11, 2015

'A Note on Nominal GDP Targeting and the Zero Lower Bound'

Roberto M. Billi, senior researcher at the Sveriges Riksbank, has a new paper on nominal GDP targeting:

”A Note on Nominal GDP Targeting and the Zero Lower Bound,” Sveriges Riksbank Working Paper Series No. 270, Revised May 2015: Abstract: I compare nominal GDP level targeting to strict price level targeting in a small New Keynesian model, with the central bank operating under optimal discretion and facing a zero lower bound on nominal interest rates. I show that, if the economy is only buffeted by purely temporary shocks to inflation, nominal GDP level targeting may be preferable because it requires the burden of the shocks to be shared by prices and output. But in the presence of persistent supply and demand shocks, strict price level targeting may be superior because it induces greater policy inertia and improves the tradeoffs faced by the central bank. During lower bound episodes, somewhat paradoxically, nominal GDP level targeting leads to larger falls in nominal GDP.
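The burden-sharing intuition can be seen with simple arithmetic. The sketch below is my own stylized illustration, not Billi's model: after a temporary shock pushes the price level 2 percent above its path, a strict price-level targeter must drive prices all the way back, while a nominal-GDP-level targeter only needs the sum of prices and real output back on path:

```python
# Stylized arithmetic, not Billi's model. A +2% temporary shock to the
# price level; each regime must restore its own target level.
p_shock = 2.0  # percent deviation of the price level from its target path

# Strict price-level targeting: the whole adjustment falls on prices.
plt_dp, plt_dy = -p_shock, 0.0

# Nominal GDP level targeting: only dp + dy = -p_shock is required, so
# prices and output can share the burden (an even split is one example).
ngdp_dp, ngdp_dy = -p_shock / 2, -p_shock / 2

print(f"price level target: dp = {plt_dp:+.1f}%, dy = {plt_dy:+.1f}%")
print(f"NGDP level target:  dp = {ngdp_dp:+.1f}%, dy = {ngdp_dy:+.1f}%")
```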

Friday, May 08, 2015

'Childhood Medicaid Coverage Improves Adult Earning and Health'

I highlighted the second article below, and many others reaching similar conclusions, in my last column:

Childhood Medicaid Coverage Improves Adult Earning and Health, NBER Digest: Medicaid today covers more Americans than any other public health insurance program. The program was introduced in 1965, and its coverage was expanded substantially, particularly for low-income children, during the 1980s and the early 1990s.
Throughout Medicaid's history, there has been debate over whether the program improves health outcomes. Two new NBER studies exploit variation in children's eligibility for Medicaid, across birth cohorts and across states with different Medicaid programs, along with rich longitudinal data on health care utilization and earnings, to estimate the long-run effects of Medicaid eligibility on health, earnings, and transfer program participation.
In Childhood Medicaid Coverage and Later Life Health Care Utilization (NBER Working Paper No. 20929), Laura R. Wherry, Sarah Miller, Robert Kaestner, and Bruce D. Meyer find that among individuals who grew up in low-income families, rates of hospitalizations and emergency department visits in adulthood are negatively related to the number of years of Medicaid eligibility in childhood. The authors exploit the fact that one of the substantial expansions of Medicaid eligibility applied only to children who were born after September 30, 1983. This created a large discontinuity in the lifetime years of Medicaid eligibility for children born before and after this birthdate cutoff. Children in families with incomes between 75 and 100 percent of the poverty line experienced about 4.5 more years of Medicaid eligibility if they were born just after the September 1983 cutoff than if they were born just before, with the gain occurring between the ages of 8 and 14. The authors compare children who they estimate were in low-income families and otherwise similar circumstances, born just before or just after this date, to determine how the number of years of childhood Medicaid eligibility is related to health in early adulthood. Their finding of reduced health care utilization among adults who had more years of childhood Medicaid eligibility is concentrated among African Americans, those with chronic health conditions, and those living in low-income zip codes. The authors calculate that reduced health care utilization during one year in adulthood offsets between 3 and 5 percent of the costs of extending Medicaid coverage to a child.
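The birthdate cutoff makes this a regression discontinuity design. Here is a minimal simulated sketch of the idea, comparing outcomes on either side of the cutoff with local linear fits; the variable names and effect sizes are invented, not the authors' data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Running variable: birthdate in days relative to the Sept 30, 1983 cutoff.
days = rng.uniform(-365, 365, size=20_000)
eligible_years = np.where(days > 0, 4.5, 0.0)  # extra Medicaid eligibility

# Outcome: adult hospitalizations, lower with more eligibility
# (the effect size here is invented for illustration).
hosp = 1.0 - 0.02 * eligible_years + 0.0003 * days + rng.normal(0, 0.5, days.size)

# Local linear RD: fit a line on each side within a bandwidth and
# compare the intercepts at the cutoff.
bw = 120.0
left = (days < 0) & (days > -bw)
right = (days >= 0) & (days < bw)

coef_left = np.polynomial.polynomial.polyfit(days[left], hosp[left], 1)
coef_right = np.polynomial.polynomial.polyfit(days[right], hosp[right], 1)
rd_estimate = coef_right[0] - coef_left[0]  # jump in intercept at days = 0
print(f"estimated discontinuity at the cutoff: {rd_estimate:+.3f}")
```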
In Medicaid as an Investment in Children: What is the Long-Term Impact on Tax Receipts? (NBER Working Paper No. 20835), David W. Brown, Amanda E. Kowalski, and Ithai Z. Lurie conclude that each additional year of childhood Medicaid eligibility increases cumulative federal tax payments by age 28 by $247 for women, and $127 for men. Their empirical strategy for evaluating the impact of Medicaid relies on variation in program eligibility during childhood that is associated with both birth cohort and state of residence. The authors study longitudinal data on actual tax payments until individuals are in their late 20s, and they extrapolate this information to make projections for these individuals at older ages. When they compare the incremental discounted value of lifetime tax payments with the cost of additional Medicaid coverage, they conclude that "the government will recoup 56 cents of each dollar spent on childhood Medicaid by the time these children reach age 60." This calculation is based on federal tax receipts alone, and does not consider state tax receipts or potential reductions in the use of transfer payments in adulthood.
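The "56 cents per dollar" figure is a discounted cash flow calculation. A toy version of the mechanics, with invented magnitudes rather than the paper's estimates:

```python
# Toy present-value calculation in the spirit of the "56 cents per
# dollar" figure. All numbers are invented to show the mechanics.
discount_rate = 0.03
cost_of_one_year_eligibility = 2000.0   # hypothetical program cost

# Hypothetical incremental federal tax payments, ages 19 through 60,
# attributable to one extra year of childhood eligibility.
extra_tax_by_age = {age: 180.0 for age in range(19, 61)}

pv_taxes = sum(tax / (1 + discount_rate) ** (age - 18)
               for age, tax in extra_tax_by_age.items())
print(f"PV of extra taxes:   ${pv_taxes:,.0f}")
print(f"recouped per dollar: {pv_taxes / cost_of_one_year_eligibility:.2f}")
```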
Both studies use large databases of administrative records to analyze the long-term effects of Medicaid. The first study measures health utilization using the Healthcare Cost and Utilization Project (HCUP) State Inpatient Databases for Arizona, Iowa, New York, Oregon, and Wisconsin in 1999, and those states plus Maryland and New Jersey in 2009. State hospital discharge data were also available from Texas and California. Data on all outpatient emergency department visits were available for six states in 2009. The second study examines data on federal tax payments and constructs longitudinal earnings histories for individuals who were born between 1981 and 1984. It also analyzes administrative records on Medicaid eligibility of children in this cohort.

Saturday, May 02, 2015

'Assessing the Effects of Monetary and Fiscal Policy'

From the NBER Reporter:

Assessing the Effects of Monetary and Fiscal Policy, by Emi Nakamura and Jón Steinsson, NBER Reporter 2015 Number 1: Research Summary: Monetary and fiscal policies are central tools of macroeconomic management. This has been particularly evident since the onset of the Great Recession in 2008. In response to the global financial crisis, U.S. short-term interest rates were lowered to zero, a large fiscal stimulus package was implemented, and the Federal Reserve engaged in a broad array of unconventional policies.
Despite their centrality, the question of how effective these policies are and therefore how the government should employ them is in dispute. Many economists have been highly critical of the government's aggressive use of monetary and fiscal policy during this period, in some cases arguing that the policies employed were ineffective and in other cases warning of serious negative consequences. On the other hand, others have argued that the aggressive employment of these policies has "walk[ed] the American economy back from the edge of a second Great Depression."
In our view, the reason for this controversy is the absence of conclusive empirical evidence about the effectiveness of these policies. Scientific questions about how the world works are settled by conclusive empirical evidence. In the case of monetary and fiscal policy, unfortunately, it is very difficult to establish such evidence. The difficulty is a familiar one in economics, namely endogeneity. ...

After explaining the endogeneity problem, empirical evidence on price rigidity and its importance for assessing policy, structural modeling, natural experiments, and so on, they turn to their evidence:

Our identification approach is to study how real interest rates respond to monetary shocks in the 30-minute intervals around Federal Open Market Committee announcements. We find that in these short intervals, nominal and real interest rates for maturities as long as several years move roughly one-for-one with each other. Changes in nominal interest rates at the time of monetary announcements therefore translate almost entirely into changes in real interest rates, while expected inflation moves very little except at very long horizons.
We use this evidence to estimate the parameters of a conventional monetary business cycle model. ... This approach suggests that monetary non-neutrality is large. Intuitively, our evidence indicates that a monetary shock that yields a substantial response for real interest rates also yields a very small response for inflation. This suggests that prices respond quite sluggishly to changes in aggregate economic conditions and that monetary policy can have large effects on the economy.
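The identification step can be sketched in a few lines: collect changes in nominal and real yields in the narrow announcement windows, compute breakeven inflation as their difference, and check whether real rates move one-for-one with nominal rates. The data below are simulated placeholders, not the authors' dataset:

```python
import numpy as np

# Simulated placeholder data: 30-minute changes in 5-year nominal and
# real (TIPS) yields around FOMC announcements, in percentage points.
rng = np.random.default_rng(2)
n = 120
shock = rng.normal(0, 0.05, n)               # monetary policy surprise
d_nominal = shock + rng.normal(0, 0.005, n)  # change in nominal yield
d_real = shock + rng.normal(0, 0.005, n)     # change in real yield

# Breakeven inflation = nominal - real. If real yields move one-for-one
# with nominal yields, expected inflation barely responds.
d_breakeven = d_nominal - d_real

slope = np.cov(d_real, d_nominal)[0, 1] / np.var(d_nominal, ddof=1)
print(f"real-on-nominal slope in event windows: {slope:.2f}")   # ~1
print(f"std. dev. of breakeven changes: {d_breakeven.std():.4f}")  # ~0
```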
Another area in which there has been rapid progress in using innovative identification schemes to estimate the impact of macroeconomic policy is that of fiscal stimulus. ... Much of the literature on fiscal stimulus that makes use of natural experiments focuses on the effects of war-time spending, since it is assumed that in some cases such spending is unrelated to the state of the economy. Fortunately - though unfortunately for empirical researchers - there are only so many large wars, so the number of data points available from this approach is limited.
In our work, we use cross-state variation in military spending to shed light on the fiscal multiplier. The basic idea is that when the U.S. experiences a military build-up, military spending will increase in states such as California - a major producer of military goods - relative to states, such as Illinois, where there is little military production. This approach uses a lot more data than the earlier literature on military spending but makes weaker assumptions, since we require only that the U.S. did not undertake a military build-up in response to the relative weakness of the economy in California vs. Illinois. We show that a $1 increase in military spending in California relative to Illinois yields a relative increase in output of $1.50. In other words, the "relative" multiplier is quite substantial.
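A stylized version of that cross-state regression looks like the following. The data are simulated, with the multiplier built in by construction rather than taken from the paper; demeaning within each year stands in for the time fixed effects that confine identification to relative, cross-state variation:

```python
import numpy as np

rng = np.random.default_rng(3)
n_states, n_years = 50, 40

# Simulated panel: national military build-ups hit states in proportion
# to a fixed exposure share (a California-vs-Illinois style contrast).
exposure = rng.uniform(0.2, 1.8, n_states)       # state sensitivity
national_buildup = rng.normal(0, 1.0, n_years)   # common build-up shocks
spending = np.outer(exposure, national_buildup)  # % of state GDP
output = 1.5 * spending + rng.normal(0, 1.0, (n_states, n_years))

# Demean within each year (equivalent to year fixed effects), so only
# relative, cross-state variation identifies the coefficient.
sp = spending - spending.mean(axis=0)
out = output - output.mean(axis=0)

beta = (sp * out).sum() / (sp ** 2).sum()
print(f"relative multiplier estimate: {beta:.2f}")  # ~1.5 by construction
```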
There is an important issue of interpretation here. We find evidence of a large "relative multiplier," but does this imply that the aggregate multiplier also will be large? The challenge that arises in interpreting these kinds of relative estimates is that there are general equilibrium effects that are expected to operate at an aggregate but not at a local level. In particular, if government spending is increased at the aggregate level, this will induce the Federal Reserve to tighten monetary policy, which will then counteract some of the stimulative effect of the increased government spending. This type of general equilibrium effect does not arise at the local level, since the Fed can't raise interest rates in California vs. Illinois in response to increased military spending in California relative to Illinois.
We show in our paper, however, that the relative multiplier does have a very interesting counterpart at the level of the aggregate economy. Even in the aggregate setting, the general equilibrium response of monetary policy to fiscal policy will be constrained when the risk-free nominal interest rate is constrained by its lower bound of zero. Our relative multiplier corresponds more closely to the aggregate multiplier in this case. Our estimates are, therefore, very useful in distinguishing between new Keynesian models, which generate large multipliers in these scenarios, and plain vanilla real business cycle models, which always generate small multipliers.
The evidence from our research on both fiscal and monetary policy suggests that demand shocks can have large effects on output. Models with price-adjustment frictions can explain such output effects, as well as (by design) the microeconomic evidence on price rigidity. Perhaps this evidence is still not conclusive, but it helps to narrow the field of plausible models. This new evidence will, we hope, help limit the scope of policy predictions of macroeconomic models that policymakers need to consider the next time they face a great challenge. ...

Trends and Cycles in China's Macroeconomy

A presentation at the 30th Annual Conference on Macroeconomics:

Also, a brief interview with Tao Zha:


Monday, April 27, 2015

'Are Immigrants a Shot in the Arm for the Local Economy?'

From the NBER (open link to earlier version):

Are Immigrants a Shot in the Arm for the Local Economy?, by Gihoon Hong and John McLaren, NBER Working Paper No. 21123: Most research on the effects of immigration focuses on the effects of immigrants as adding to the supply of labor. By contrast, this paper studies the effects of immigrants on local labor demand, due to the increase in consumer demand for local services created by immigrants. This effect can attenuate downward pressure from immigrants on non-immigrants' wages, and also benefit non-immigrants by increasing the variety of local services available. For this reason, immigrants can raise native workers' real wages, and each immigrant could create more than one job. Using US Census data from 1980 to 2000, we find considerable evidence for these effects: Each immigrant creates 1.2 local jobs for local workers, most of them going to native workers, and 62% of these jobs are in non-traded services. Immigrants appear to raise local non-tradables sector wages and to attract native-born workers from elsewhere in the country. Overall, it appears that local workers benefit from the arrival of more immigrants.

Friday, April 24, 2015

'No Price Like Home: Global House Prices, 1870-2012'

Interesting paper:

No Price Like Home: Global House Prices, 1870-2012, by Katharina Knoll, Moritz Schularick, and Thomas Steger: Abstract: How have house prices evolved over the long‐run? This paper presents annual house prices for 14 advanced economies since 1870. Based on extensive data collection, we show that real house prices stayed constant from the 19th to the mid‐20th century, but rose strongly during the second half of the 20th century. Land prices, not replacement costs, are the key to understanding the trajectory of house prices. Rising land prices explain about 80 percent of the global house price boom that has taken place since World War II. Higher land values have pushed up wealth‐to‐income ratios in recent decades.

Monday, April 20, 2015

'Labor Market Slack and Monetary Policy'

Let's hope the Fed is listening:

Labor Market Slack and Monetary Policy, by David G. Blanchflower and Andrew T. Levin, NBER Working Paper No. 21094: In the wake of a severe recession and a sluggish recovery, labor market slack cannot be gauged solely in terms of the conventional measure of the unemployment rate (that is, the number of individuals who are not working at all and actively searching for a job). Rather, assessments of the employment gap should reflect the incidence of underemployment (that is, people working part time who want a full-time job) and the extent of hidden unemployment (that is, people who are not actively searching but who would rejoin the workforce if the job market were stronger). In this paper, we examine the evolution of U.S. labor market slack and show that underemployment and hidden unemployment currently account for the bulk of the U.S. employment gap. Next, using state-level data, we find strong statistical evidence that each of these forms of labor market slack exerts significant downward pressure on nominal wages. Finally, we consider the monetary policy implications of the employment gap in light of prescriptions from Taylor-style benchmark rules.

[Open link]
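The paper's broader slack concept is, in accounting terms, a sum of three gaps. A toy computation with invented magnitudes, in the spirit of (but not taken from) the authors' employment gap:

```python
# Toy employment gap (all numbers invented): conventional unemployment
# gap plus an underemployment gap plus a participation ("hidden
# unemployment") gap, in percentage points.
unemployment_rate, natural_rate = 5.5, 5.0
underemployment_gap = 1.2  # part-timers wanting full-time work, FTE terms
participation_gap = 0.8    # would-be workers outside the labor force

employment_gap = ((unemployment_rate - natural_rate)
                  + underemployment_gap + participation_gap)
print(f"unemployment gap alone: {unemployment_rate - natural_rate:.1f} pp")
print(f"total employment gap:   {employment_gap:.1f} pp")
# In the paper's data, the last two components currently account for
# the bulk of the total gap.
```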

Tuesday, April 14, 2015

Secular Stagnation: The Long View

From the NBER Digest:

Secular Stagnation: The Long View, by Matt Nesvisky: Growth economists are divided on whether the U.S. is facing a period of "secular stagnation" - an extended period of slow economic growth in the coming decades. In "Secular Stagnation: The Long View" (NBER Working Paper No. 20836), Barry Eichengreen considers four factors that could contribute to a persistent period of below-potential output and slow growth: a rise in saving due to the global integration of emerging markets, a decline in the rate of population growth, an absence of attractive investment opportunities, and a drop in the relative price of investment goods. He concludes that a decline in the relative price of investment goods is the most likely contributor to an excess of saving over investment.

With regard to long-term future growth rates, a key point of debate is how to interpret, and project forward, the "Third Industrial Revolution": the computer age and the new economy it has created. Some argue that the economic impact of digital technology has largely run its course, while others maintain that we have yet to experience the full effect of computerization. In this context, Eichengreen looks at the economic consequences of the age of steam and of the age of electrification. His analysis identifies two dimensions of the economic impact: "range of applicability" and "range of adaptation."
Range of applicability refers to the number of sectors or activities to which the key innovations can be applied. Use of the steam engine of the first industrial revolution for many years was limited to the textile industry and railways, which accounted for only a relatively small fraction of economic activity. Electrification in the second industrial revolution, says Eichengreen, had a larger impact on output and productivity growth because it affected a host of manufacturing industries, many individual households, and a wide range of activities within decades of its development.
The "computer revolution" of the second half of the 20th century had a relatively limited impact on overall economic growth, Eichengreen writes, because computerization had deeply transformative effects on only a limited set of industries, including finance, wholesale and retail trade, and the production of computers themselves. This perspective suggests that the implications for output and productivity of the next wave of innovations will depend greatly on their range of applicability. Innovations such as new tools (quantum computers), materials (graphene), processes (genetic modification), robotics, and enhanced interactivity of digital devices all promise a broad range of applications.
Range of adaptation refers to how comprehensively economic activity must be reorganized before positive impacts on output and productivity occur. Eichengreen reasons that the greater the required range of adaptation, the higher the likelihood that growth may slow in the short run, as costly investments in adaptation must be made and existing technology must be disrupted.
Yet the slow productivity growth in the United States in recent years may have positive implications for the future, he writes. Many connected activities and sectors - health care, education, industrial research, and finance - are being disrupted by the latest technologies. But once a broad range of adaptations is complete, productivity growth should accelerate, he reasons. "This is not a prediction," Eichengreen concludes, "but a suggestion to look to the range of adaptation required in response to the current wave of innovations when seeking to interpret our slow rate of productivity growth and when pondering our future."

Wednesday, March 18, 2015

'Arezki, Ramey, and Sheng on News Shocks'

I was at this conference as well. This paper was very well received (it has been difficult to find evidence that news generates business cycles, in part because it's been difficult to find a "clean" shock):

Arezki, Ramey, and Sheng on news shocks: I attended the NBER EFG (economic fluctuations and growth) meeting a few weeks ago, and saw a very nice paper by Rabah Arezki, Valerie Ramey, and Liugang Sheng, "News Shocks in Open Economies: Evidence from Giant Oil Discoveries" (There were a lot of nice papers, but this one is more bloggable.)

They look at what happens to economies that discover they have a lot of oil. ... An oil discovery is a well identified "news shock."

Standard productivity shocks are a bit nebulous, and alter two things at once: they raise current productivity, and hence the incentive to work today, and they also carry news about more income in the future.

An oil discovery is well publicized. It incentivizes a small investment in oil drilling, but mostly is pure news of an income flow in the future. It does not affect overall labor productivity or other changes to preferences or technology.
Rabah, Valerie, and Liugang then construct a straightforward macro model of such an event. ...[describes model and results]...

Valerie, presenting the paper, was a bit discouraged. This "news shock" doesn't generate a pattern that looks like standard recessions, because GDP and employment go in the opposite direction.

I am much more encouraged. Here are macroeconomies behaving exactly as they should, in response to a shock where for once we really know what the shock is. And in response to a shock with a nice dynamic pattern, which we also really understand.

My comment was something to the effect of "this paper is much more important than you think. You match the dynamic response of economies to this large and very well identified shock with a standard, transparent and intuitive neoclassical model. Here's a list of some of the ingredients you didn't need: Sticky prices, sticky wages, money, monetary policy, (i.e. interest rates that respond via a policy rule to output and inflation or zero bounds that stop them from doing so), home bias, segmented financial markets, credit constraints, liquidity constraints, hand-to-mouth consumers, financial intermediation, liquidity spirals, fire sales, leverage, sudden stops, hot money, collateral constraints, incomplete markets, idiosyncratic risks, strange preferences including habits, nonexpected utility, ambiguity aversion, and so forth, behavioral biases, or rare disasters. If those ingredients are really there, they ought to matter for explaining the response to your shocks too. After all, there is only one economic structure, which is hit by many shocks. So your paper calls into question just how many of those ingredients are really there at all."

Thomas Philippon, whose previous paper had a pretty masterful collection of a lot of those ingredients, quickly pointed out my overstatement. One need not include every ingredient to understand every shock. Constraint variables are inequalities. A positive news shock may not cause credit constraints etc. to bind, while a negative shock may reveal them.

Good point. And really, the proof is in the pudding. If those ingredients are not necessary, then I should produce a model without them that produces events like 2008. But we've been debating the ingredients and shock necessary to explain 1932 for 82 years, so that approach, though correct, might take a while.

In the meantime, we can still cheer successful simple models and well identified shocks on the few occasions that they appear and fit data so nicely. Note to graduate students: this paper is a really nice example to follow for its integration of clear theory and excellent empirical work.

Tuesday, February 17, 2015

'Applying Keynes's Insights about Liquidity Preference to the Yield Curve'

Via email, a new paper from Josh R. Stillwagon, an Assistant Professor of Economics at Trinity College, appearing in the Journal of International Financial Markets, Institutions & Money. The paper "applies some of Keynes's insights about liquidity preference to understanding term structure premia." The following is an excerpt paraphrased from the conclusion:

"This work uses survey data on traders' interest rate forecasts to test the expectations hypothesis of the term structure and finds clear evidence of a time-varying risk premium in four markets... Further, it identifies two significant factors which impact the magnitude of the risk premium. The first is overall consumer sentiment analogous to Keynes's "animal spirits"... The second factor is the level of and/or changes in the interest rate, consistent with the imperfect knowledge economics gap model [applied now to term premia]; the intuition being that the increasing skew to potential bond price movements from a fall in the interest rate [leaving more to fear than to hope as Keynes put it] causes investors to demand a greater premium. This was primarily observed in the medium-run relations of the I(2) CVAR, indicating that these effects are transitory suggesting, as Keynes argued, that what matters is not merely how far the interest rate is from zero but rather how far it is from recent levels."
This link is free for 50 days:
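For readers who want the flavor of the test: with survey forecasts in hand, an ex-ante term premium can be measured as the forward rate minus the surveyed expectation of the future short rate; under the pure expectations hypothesis that premium is constant, so regressing it on candidate factors tests for time variation. The sketch below uses simulated placeholder series, not the paper's data or its I(2) CVAR machinery:

```python
import numpy as np

rng = np.random.default_rng(4)
T = 200

# Simulated placeholder series, not the paper's data.
sentiment = rng.normal(0, 1, T)                 # consumer sentiment factor
rate_level = np.cumsum(rng.normal(0, 0.1, T))   # persistent interest rate

# Ex-ante term premium: forward rate minus survey-expected future short
# rate. Built here to load on both factors, plus noise.
premium = 0.5 * sentiment - 0.3 * rate_level + rng.normal(0, 0.5, T)

# OLS of the premium on the two candidate factors; nonzero slopes mean
# a time-varying premium, rejecting the pure expectations hypothesis.
X = np.column_stack([np.ones(T), sentiment, rate_level])
coef, *_ = np.linalg.lstsq(X, premium, rcond=None)
print("const, sentiment, rate level:", np.round(coef, 2))
```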

Wednesday, February 11, 2015

'The Long-Term Impact of Inequality on Entrepreneurship and Job Creation'

Via Chris Dillow, a new paper on inequality and economic growth:

The Long-Term Impact of Inequality on Entrepreneurship and Job Creation, by Roxana Gutiérrez-Romero and Luciana Méndez-Errico: Abstract: We assess the extent to which historical levels of inequality affect the likelihood of businesses being created, surviving, and creating jobs over time. To this end, we build a pseudo-panel of entrepreneurs across 48 countries using the Global Entrepreneurship Monitor Survey over 2001-2009. We complement this pseudo-panel with historical data on income distribution and indicators of current business regulation. We find that in countries with higher levels of inequality in the 1700s and 1800s, businesses today are more likely to die young and create fewer jobs. Our evidence supports economic theories that argue that initial wealth distribution influences countries' development path, and it therefore has important policy implications for wealth redistribution.

Chris argues through a series of examples that such long-term effects are reasonable (things in the 1700s and 1800s mattering today), and then concludes with:

... All this suggests that, contrary to simple-minded neoclassical economics and Randian libertarianism, individuals are not and cannot be self-made men. We are instead creations of history. History is not simply a list of the misdeeds of irrelevant has-beens; it is a story of how we were made. Burke was right: society is "a partnership not only between those who are living, but between those who are living, those who are dead, and those who are to be born."
One radical implication of all this is Herbert Simon's:
When we compare the poorest with the richest nations, it is hard to conclude that social capital can produce less than about 90 percent of income in wealthy societies like those of the United States or Northwestern Europe. On moral grounds, then, we could argue for a flat income tax of 90 percent to return that wealth to its real owners.

I find myself skeptical of such long-term effects, but maybe...

Thursday, January 08, 2015

'The Link between High Employment and Labor Market Fluidity'

Laurent Belsie in the NBER Digest:

The Link between High Employment and Labor Market Fluidity: U.S. labor markets lost much of their fluidity well before the onset of the Great Recession, according to Labor Market Fluidity and Economic Performance (NBER Working Paper No. 20479). The economy's ability to move jobs quickly from shrinking firms to young, growing enterprises slowed after 1990. Job reallocation rates fell by more than a quarter. After 2000, the volume of hiring and firing - known as the worker reallocation rate - also dropped. The decline was broad-based, affecting multiple industries, states, and demographic groups. The groups that suffered the most were the less-educated and the young, particularly young men.
"The loss of labor market fluidity suggests the U.S. economy became less dynamic and responsive in recent decades," authors Steven J. Davis and John Haltiwanger conclude. "Direct evidence confirms that U.S. employers became less responsive to shocks in recent decades, not that employer-level shocks became less variable."

Many factors contributed to the decline in job and worker reallocation rates, among them a shift to older companies, an aging workforce, changing business models and supply chains, the effects of the information revolution on hiring, and government policies.
About a quarter of the decline in job reallocation can be explained by the decline in the formation of young firms in the U.S. From the early 1980s until about 2000, retail and services accounted for most of the decline in job reallocation. This occurred even though jobs shifted away from manufacturing and toward retail, where job creation is normally more dynamic and worker turnover more pronounced. One reason for the slowdown in turnover was the growing importance of big box chains in the retail sector. The authors note that other studies find that jobs are more durable in larger retail firms, and their workers are more productive than workers at the smaller stores these retailers replaced.
Fewer layoffs and more employment stability are generally considered positive trends and natural outgrowths of an aging workforce. The flip side of this equation, however, is that slower job and worker reallocation mean slower creation of new jobs, putting the jobless, including young people, at a heightened risk of long-term unemployment. These developments also slow job advancement and career changes, which are associated with boosts in wages.
This is of particular significance since 2000, when the concentration of declines in job reallocation rates and the employment share of young firms shifted from the retail sector to high-tech industries.
"These developments raise concerns about productivity growth, which has close links to creative destruction and factor reallocation in prominent theories of innovation and growth and in many empirical studies," the authors write.
Government regulation also played a role in slowing job and worker reallocation rates. In 1950, under five percent of workers required a government license to hold their job; by 2008, the percentage had risen to 29 percent. Add in government certification and the share rises to 38 percent. Wrongful discharge laws make it harder to fire employees. Federal and state laws protect classes of workers based on race, religion, gender, and other attributes. Minimum-wage laws and the heightened importance of employer-provided health insurance also make job changes less frequent.
The authors study the effects of the decline in job and worker reallocation rates on employment rates by gender, education, and age, using state-level data. They find that states with especially large declines in labor market fluidity also experienced the largest declines in employment rates, with young and less-educated persons the most adversely affected.
"...if our assessment is correct," the authors conclude, "the United States is unlikely to return to sustained high employment rates without restoring labor market fluidity."

Monday, November 10, 2014

'Honest Abe Was a Co-op Dude: How the G.O.P. Can Save Us from Despotism'

Stephen Ziliak emails:

Dear Mark:
I thought you and readers of Economist's View would like to know about an essay, "Honest Abe Was a Co-op Dude: How the G.O.P. Can Save Us from Despotism", hot off the press. Here is the abstract:
Abstract: Abraham Lincoln was a co-op dude. He had a hip neck beard, sure. Everyone knows that. But few have bothered to notice that the first Republican President of the United States was an economic democrat who put labor above capital. Labor is prior to and independent of capital, Lincoln believed, and “deserves much the higher consideration”, he told Congress in his first annual address of 1861. Capital despotism is on the rise again, threatening the stability of the economy and union. The biggest problem of democracy now is not the failure to fully extend political rights, however important. The bigger problem is economic in nature. The threat today is from a lack of economic democracy—a lack of ownership, of self-reliance, of autonomy, and of justice in the distribution of rewards and punishments at work. From the appropriation of company revenue to lack of protection against pension raids, capital despotism is rife. “The road to serfdom” has many paths to choose from, Hayek warned in his important book of 1944. But too many Americans—including economists and policymakers—are neglecting the economic path, the road to serfdom caused by a lack of economic democracy. Cooperative banks and firms can help.
And here are a few excerpts:
“Labor is prior to and independent of capital. Capital is only the fruit of labor, and could never have existed if labor had not first existed. Labor is the superior of capital, and deserves much the higher consideration.”
—Abraham Lincoln
"Economists in the know have acknowledged that the worker owned cooperative firm is the most perfect model of economic democracy and rational business organization dreamed up so far. That is true around the world, from Springfield all the way back to Shelbyville, economists who’ve examined such co-ops agree. Co-ops are more productive. And every worker is an owner."
"From the Dutch blossoming of commerce in the 1600s to the Asian Spring of the 2000s, socialists and capitalists alike have not produced, it seems, a better, more efficient and democratic form of economic production and distribution. Co-ops win. Not everyone is convinced."
"If co-ops are so great, why don’t they dominate the economy? Negligence and ignorance, more than any other possible cause, it would seem.
 For example, the infamous “socialist calculation debate” in economics dragged on for two decades before a single word was said by either side, from Lange and Lerner to von Mises and Hayek, about the nature of the firm. Nary a peep from economists about how or even why firms choose to organize into production units of a certain scale, large or small. Ronald Coase’s article on “The Nature of the Firm” (1937) was good enough to fetch him a Nobel Prize. But Coase did not bring as much clarity to the debate as most economists believe.
 Coase was vague and conventional to the point of embarrassment. He made straw man assumptions about the firm being a hierarchical-capitalistic entity. Coase's firm, though more "tractable" and "realistic" than previous notions, is assumed to be run by a "master" or masters, by capitalists who seek to maximize profit by bossing around "servants"—that is, wage earners possessing little autonomy, little or no ownership, and no voting rights on capital, their sole purpose being assumed to serve the "masters" of capital.
 Said Coase, “If a workman moves from department Y to department X, he does not go because of a change in relative prices, but because he is ordered to do so.” But if Coase (himself a lovely man in person) would have taken a closer look at the real world, he could have found cooperative firms succeeding in stark contrast to the anti-democratic firms of his imagination."
Stephen T. Ziliak

Thursday, October 23, 2014

'The Effects of a Money-Financed Fiscal Stimulus'

Jordi Galí:

The Effects of a Money-Financed Fiscal Stimulus, by Jordi Galí, September 2014: Abstract I analyze the effects of an increase in government purchases financed entirely through seignorage, in both a classical and a New Keynesian framework, and compare them with those resulting from a more conventional debt-financed stimulus. My findings point to the importance of nominal rigidities in shaping those effects. Under a realistic calibration of such rigidities, a money-financed fiscal stimulus is shown to have very strong effects on economic activity, with relatively mild inflationary consequences. If the steady state is sufficiently inefficient, an increase in government purchases may increase welfare even if such spending is wasteful.

Thursday, October 02, 2014

Is Blogging or Tweeting about Research Papers Worth It?

Via the Lindau blog:

The verdict: is blogging or tweeting about research papers worth it?, by Melissa Terras: Eager to find out what impact blogging and social media could have on the dissemination of her work, Melissa Terras took all of her academic research, including papers that have been available online for years, to the web and found that her audience responded with a huge leap in interest...

Just one quick note. This is what happened when one person started promoting her research through social media. If everyone does it, and there is much more competition for eyeballs, the results might differ.

Monday, September 29, 2014

'Reconstructing Macroeconomic Theory to Manage Economic Policy'

New paper from Joseph Stiglitz:

Reconstructing Macroeconomic Theory to Manage Economic Policy, by Joseph E. Stiglitz, NBER Working Paper No. 20517, September 2014 NBER: Macroeconomics has not done well in recent years: The standard models didn't predict the Great Recession; and even said it couldn't happen. After the bubble burst, the models did not predict the full consequences.
The paper traces the failures to the attempts, beginning in the 1970s, to reconcile macro and microeconomics, by making the former adopt the standard competitive micro-models that were under attack even then, from theories of imperfect and asymmetric information, game theory, and behavioral economics.
The paper argues that any theory of deep downturns has to answer these questions: What is the source of the disturbances? Why do seemingly small shocks have such large effects? Why do deep downturns last so long? Why is there such persistence, when we have the same human, physical, and natural resources today as we had before the crisis?
The paper presents a variety of hypotheses which provide answers to these questions, and argues that models based on these alternative assumptions have markedly different policy implications, including large multipliers. It explains why the apparent liquidity trap today is markedly different from that envisioned by Keynes in the Great Depression, and why the Zero Lower Bound is not the central impediment to the effectiveness of monetary policy in restoring the economy to full employment.

[I couldn't find an open link.]

Monday, July 28, 2014

'Presidents and the U.S. Economy: An Econometric Exploration'

Alan Blinder and Mark Watson:

Presidents and the U.S. Economy: An Econometric Exploration, by Alan S. Blinder and Mark W. Watson, NBER Working Paper No. 20324 [open link]: The U.S. economy has grown faster—and scored higher on many other macroeconomic metrics—when the President of the United States is a Democrat rather than a Republican. For many measures, including real GDP growth (on which we concentrate), the performance gap is both large and statistically significant, despite the fact that postwar history includes only 16 complete presidential terms. This paper asks why. The answer is not found in technical time series matters (such as differential trends or mean reversion), nor in systematically more expansionary monetary or fiscal policy under Democrats. Rather, it appears that the Democratic edge stems mainly from more benign oil shocks, superior TFP performance, a more favorable international environment, and perhaps more optimistic consumer expectations about the near-term future. Many other potential explanations are examined but fail to explain the partisan growth gap.

Monday, July 14, 2014

'Empirical Evidence on Inflation Expectations in the New Keynesian Phillips Curve'

Via email, a comment on my comments about the difficulty of settling questions about the Phillips curve empirically:

Dear Professor Thoma,
I saw your recent post on the difficulty of empirically testing the Phillips Curve, and I just wanted to alert you to a survey paper on this topic that I wrote with Sophocles Mavroeidis and Jim Stock: "Empirical Evidence on Inflation Expectations in the New Keynesian Phillips Curve". It was published in the Journal of Economic Literature earlier this year (ungated working paper).
In the paper we estimate a vast number of specifications of the New Keynesian Phillips Curve (NKPC) on a common U.S. data set. The specification choices include the data series, inflation lag length, sample period, estimator, and so on. A subset of the specifications amount to traditional backward-looking (adaptive expectation) Phillips Curves. We are particularly interested in two key parameters: the extent to which price expectations are forward-looking, and the slope of the curve (how responsive inflation is to real economic activity).
Our meta-analysis finds that essentially any desired parameter estimates can be generated by some reasonable-sounding specification. That is, estimation of the NKPC is subject to enormous specification uncertainty. This is consistent with the range of estimates reported in the literature. Even if one were to somehow decide on a given specification, the uncertainty surrounding the parameter estimates is typically large. We give theoretical explanations for these empirical findings in the paper. To be clear: Our results do not reject the validity of the NKPC (or more generally, the presence of a short-run inflation/output trade-off), but traditional aggregate time series analysis is just not very informative about the nature of inflation dynamics.
Kind regards,
Mikkel Plagborg-Moller
PhD candidate in economics, Harvard University
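
For reference, the object at the center of that survey is the hybrid New Keynesian Phillips Curve. A stylized version follows (my notation; the paper's many specifications vary the inflation series, the activity measure, the lag structure, and the estimator):

```latex
\pi_t = \gamma_b \, \pi_{t-1} + \gamma_f \, E_t \pi_{t+1} + \lambda \, x_t + u_t
```

Here \pi_t is inflation, x_t is a measure of real activity such as the output gap or marginal cost, \gamma_f captures how forward-looking price expectations are, and \lambda is the slope; setting \gamma_f = 0 recovers a traditional backward-looking (adaptive expectations) Phillips curve. These are exactly the two key parameters the letter highlights.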

Monday, June 23, 2014

Bank Failure, Relationship Lending, and Local Economic Performance

John Kandrac (a former Ph.D. student):

Bank Failure, Relationship Lending, and Local Economic Performance, by John Kandrac, Board of Governors of the Federal Reserve System, Finance and Economics Discussion Series: Abstract Whether bank failures have adverse effects on local economies is an important question for which there is conflicting and relatively scarce evidence. In this study, I use county-level data to examine the effect of bank failures and resolutions on local economies. Using quasi-experimental techniques as well as cross-sectional variation in bank failures, I show that recent bank failures lead to lower income and compensation growth, higher poverty rates, and lower employment. Additionally, I find that the structure of bank resolution appears to be important. Resolutions that include loss-sharing agreements tend to be less deleterious to local economies, supporting the notion that the importance of bank failure to local economies stems from banking and credit relationships. Finally, I show that markets with more inter-bank competition are more strongly affected by bank failure. [Download Full text]

Wednesday, May 28, 2014

'Unemployment Insurance and Disability Insurance in the Great Recession'

From the NBER Digest:

Unemployment Insurance and Disability Insurance in the Great Recession: At the end of 2012, 8.8 million American adults were receiving Social Security Disability Insurance (SSDI) benefits. The share of the American public receiving SSDI has more than doubled since 1990. This rapid growth has prompted concerns about SSDI's sustainability: recent projections suggest that the SSDI trust fund will be exhausted in 2016.
SSDI recipients tend to remain in the program, and out of the labor market, from the time they are approved for benefits until they reach retirement age. This means that if unemployed individuals turn to disability insurance as a source of benefits when they exhaust their unemployment insurance (UI), the long-term program costs can be substantial. Some have suggested that the savings from avoided SSDI cases could help to finance the cost of extending UI benefits, but little is known about the interaction between SSDI and UI.
In Unemployment Insurance and Disability Insurance in the Great Recession (NBER Working Paper No. 19672), Andreas Mueller, Jesse Rothstein, and Till von Wachter use data from the last decade to investigate the relationship between UI exhaustion and SSDI applications. They take advantage of the variability of UI benefit durations during the recent economic downturn. The duration of these benefits was as long as 99 weeks in 2009, remained extended for several years, and then was shortened substantially in 2012. The authors focus on the uneven extension of UI benefits during and after the Great Recession to isolate variation in the duration of these benefits that is not confounded by variation in economic conditions more broadly.
The authors find very little interaction between UI benefit eligibility and SSDI applications: SSDI applications do not appear to respond to UI exhaustion. While the authors cannot rule out small effects, they conclude that applications do not respond strongly enough to contribute meaningfully to a cost-benefit analysis of UI extensions or to account for the cyclical behavior of SSDI applications.
The authors suggest that the tendency for the number of SSDI applications to grow when the economy is weak may reflect variation in the potential reemployment wages of displaced workers, or changes in the employment opportunities of the marginally disabled that influence the evaluation of an SSDI applicant's employability. These channels are not linked to the generosity or duration of UI benefits, and they imply that more stringent functional capacity reviews of SSDI applicants may not reduce recession-induced SSDI claims if these claims reflect examiners' judgments that the applicants are truly not employable in the existing labor market.

Monday, April 28, 2014

New Research in Economics: Central Banking For All: A Modest Case for Radical Reform

Via Nicholas Gruen:

Central Banking For All: A Modest Case for Radical Reform (Download): This paper offers a radical option for banking reform: government should offer central banking services not just to commercial banks, but directly to citizens.
Key Findings
Nicholas Gruen argues that the UK and other countries need radical banking reform. This can be achieved by a simple change: giving ordinary people the same right to use central banks’ services as big commercial banks have. Though they enjoy high margins and/or fees, banks add little value to ‘commodity services’ such as customer accounts and highly-collateralised mortgages (for example, older mortgages that are largely paid off), which are basically riskless.
There’s widespread agreement that the UK needs better banks and a better deal for bank customers. This report by Nicholas Gruen, economist and founding chairman of Kaggle and The Australian Centre for Social Innovation, proposes a simple but radical solution.
Gruen argues that in the age of the internet, the Bank of England can now extend the services it currently offers only to banks to everyone in the UK. In particular, it should offer (for instance through National Savings & Investments) simple, cheap deposit and savings accounts to all, paying interest at Bank Rate. Second, it should offer to guarantee any well-collateralised mortgage (for instance a residential mortgage for less than 60 per cent of the value of the collateral).
At the moment, commercial banks provide these services at a cost, both in terms of worse rates and higher fees, with their margins inflated by their funders’ knowledge that the banks are implicitly government guaranteed.
By cutting out the middleman in the form of the banks, Gruen argues, customers would get a better deal, and private competitors could focus on the provision of finance where the efficient pricing of risk is essential – most particularly residential finance above 60 per cent of the value of the collateral.
Policy Recommendations
The government should allow the Bank of England to provide central banking services directly to anyone who wants them, not just banks. The Bank should offer to fund or guarantee any well-collateralised mortgage (e.g. with less than 60 per cent of the property value outstanding). The Bank should, through National Savings and Investments, offer simple deposit and savings accounts to anyone who wants them, with no upper limits, paying Bank Rate of interest.
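
To make the 60 per cent collateral rule concrete, here is a minimal sketch of the eligibility test the recommendations describe; the function name and example figures are hypothetical, not from the report.

```python
def qualifies_for_guarantee(outstanding: float, property_value: float,
                            max_ltv: float = 0.60) -> bool:
    """Well-collateralised test: less than 60 per cent of the
    property's value outstanding on the mortgage."""
    return outstanding < max_ltv * property_value

# Hypothetical examples: GBP 150,000 outstanding on a GBP 300,000 home is
# 50 per cent loan-to-value, so it would qualify for the guarantee;
# GBP 200,000 outstanding (67 per cent) would not.
print(qualifies_for_guarantee(150_000, 300_000))  # True
print(qualifies_for_guarantee(200_000, 300_000))  # False
```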

Tuesday, April 08, 2014

A Model of Secular Stagnation

Gauti Eggertsson and Neil Mehrotra have an interesting new paper:

A Model of Secular Stagnation, by Gauti Eggertsson and Neil Mehrotra: 1 Introduction During the closing phase of the Great Depression in 1938, the President of the American Economic Association, Alvin Hansen, delivered a disturbing message in his Presidential Address to the Association (see Hansen (1939)). He suggested that the Great Depression might just be the start of a new era of ongoing unemployment and economic stagnation without any natural force towards full employment. This idea was termed the “secular stagnation” hypothesis. One of the main driving forces of secular stagnation, according to Hansen, was a decline in the population birth rate and an oversupply of savings that was suppressing aggregate demand. Soon after Hansen’s address, the Second World War led to a massive increase in government spending, effectively ending any concern of insufficient demand. Moreover, the baby boom following WWII drastically changed the population dynamics in the US, thus effectively erasing the problem of excess savings of an aging population that was of principal importance in his secular stagnation hypothesis.
Recently Hansen’s secular stagnation hypothesis has gained increased attention. One obvious motivation is the Japanese malaise that has by now lasted two decades and has many of the same symptoms as the U.S. Great Depression - namely dwindling population growth, a nominal interest rate at zero, and subpar GDP growth. Another reason for renewed interest is that even though the financial panic of 2008 was contained, growth remains weak in the United States and unemployment high. Most prominently, Lawrence Summers raised the prospect that the crisis of 2008 may have ushered in the beginning of secular stagnation in the United States in much the same way as suggested by Alvin Hansen in 1938. Summers suggests that this episode of low demand may even have started well before 2008 but was masked by the housing bubble before the onset of the crisis. In Summers’ words, we may have found ourselves in a situation in which the natural rate of interest - the short-term real interest rate consistent with full employment - is permanently negative (see Summers (2013)). And this, according to Summers, has profound implications for the conduct of monetary, fiscal and financial stability policy today.
Despite the prominence of Summers’ discussion of the secular stagnation hypothesis and a flurry of commentary that followed it (see e.g. Krugman (2013), Taylor (2014), DeLong (2014) for a few examples), there has not, to the best of our knowledge, been any attempt to formally model this idea, i.e., to write down an explicit model in which unemployment is high for an indefinite amount of time due to a permanent drop in the natural rate of interest. The goal of this paper is to fill this gap. ...[read more]...

In the abstract, they note the policy prescriptions for secular stagnation:

In contrast to earlier work on deleveraging, our model does not feature a strong self-correcting force back to full employment in the long-run, absent policy actions. Successful policy actions include, among others, a permanent increase in inflation and a permanent increase in government spending. We also establish conditions under which an income redistribution can increase demand. Policies such as committing to keep nominal interest rates low or temporary government spending, however, are less powerful than in models with temporary slumps. Our model sheds light on the long persistence of the Japanese crisis, the Great Depression, and the slow recovery out of the Great Recession.
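
A rough way to see why a permanent increase in inflation makes the list of remedies (this is generic zero-lower-bound arithmetic, not the paper's overlapping-generations model, and it ignores the distinction between actual and expected inflation):

```latex
i_t \ge 0 \quad\text{and}\quad r_t = i_t - \pi_t
\;\Longrightarrow\; r_t \ge -\pi_t .
```

If the natural rate r^* is permanently negative, the real rate can reach it only when \pi \ge -r^* > 0: a natural rate of, say, -2 percent requires inflation of at least 2 percent indefinitely. The other remedies in the abstract work on the other side of the inequality; in their model, permanently higher government spending or redistribution raises the natural rate itself.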

Tuesday, March 04, 2014

'Will MOOCs Lead to the Democratisation of Education?'

Some theoretical results on MOOCs:

Will MOOCs lead to the democratisation of education?, by Joshua Gans: With all the recent discussion of how hard it is for journalists to read academic articles, I thought I’d provide a little service here and ‘translate’ the recent NBER working paper by Daron Acemoglu, David Laibson and John List, “Equalizing Superstars” for a general audience. The paper contains a ‘light’ general equilibrium model that may be difficult for some to parse.
The paper is interested in what the effect of MOOCs or, in general, web-based teaching options would be on educational outcomes around the world, the distribution of those outcomes and the wages of teachers. ...

Thursday, February 13, 2014

Debt and Growth: There is No Magic Threshold

New paper from the IMF:

Debt and Growth: Is There a Magic Threshold?, by Andrea Pescatori, Damiano Sandri, and John Simon [Free Full text]: Summary: Using a novel empirical approach and an extensive dataset developed by the Fiscal Affairs Department of the IMF, we find no evidence of any particular debt threshold above which medium-term growth prospects are dramatically compromised. Furthermore, we find the debt trajectory can be as important as the debt level in understanding future growth prospects, since countries with high but declining debt appear to grow just as fast as countries with lower debt. Notwithstanding this, we find some evidence that higher debt is associated with a higher degree of output volatility.

[Via Bruce Bartlett on Twitter.]
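
The "no magic threshold" claim lends itself to a simple illustration. The sketch below is only a generic threshold scan on synthetic data, not the authors' novel empirical approach: split observations at candidate debt thresholds and see whether mean growth changes abruptly at any of them.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic country-year observations with a smooth debt-growth relation
# and, by construction, no threshold at all.
debt = rng.uniform(10, 150, size=2000)                 # debt-to-GDP, percent
growth = 3.0 - 0.005 * debt + rng.normal(0, 2, 2000)   # real growth, percent

def growth_gap(threshold):
    """Mean growth below minus mean growth above a candidate threshold."""
    return growth[debt < threshold].mean() - growth[debt >= threshold].mean()

for t in (60, 90, 120):
    print(f"threshold {t}%: growth gap = {growth_gap(t):.2f} pp")
# The gap drifts smoothly as the candidate threshold moves; nothing special
# happens at 90 percent, which is the flavor of the paper's finding.
```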

Wednesday, February 12, 2014

'Is Increased Price Flexibility Stabilizing? Redux'

I need to read this:

Is Increased Price Flexibility Stabilizing? Redux, by Saroj Bhattarai, Gauti Eggertsson, and Raphael Schoenle, NBER Working Paper No. 19886, February 2014 [Open Link]: Abstract: We study the implications of increased price flexibility on output volatility. In a simple DSGE model, we show analytically that more flexible prices always amplify output volatility for supply shocks and also amplify output volatility for demand shocks if monetary policy does not respond strongly to inflation. More flexible prices often reduce welfare, even under optimal monetary policy if full efficiency cannot be attained. We estimate a medium-scale DSGE model using post-WWII U.S. data. In a counterfactual experiment we find that if prices and wages are fully flexible, the standard deviation of annualized output growth more than doubles.

Friday, February 07, 2014

Latest from the Journal of Economic Perspectives

A few of the articles from the latest Journal of Economic Perspectives:

When Ideas Trump Interests: Preferences, Worldviews, and Policy Innovations, by Dani Rodrik: Ideas are strangely absent from modern models of political economy. In most prevailing theories of policy choice, the dominant role is instead played by "vested interests"—elites, lobbies, and rent-seeking groups which get their way at the expense of the general public. Any model of political economy in which organized interests do not figure prominently is likely to remain vacuous and incomplete. But it does not follow from this that interests are the ultimate determinant of political outcomes. Here I will challenge the notion that there is a well-defined mapping from "interests" to outcomes. This mapping depends on many unstated assumptions about the ideas that political agents have about: 1) what they are maximizing, 2) how the world works, and 3) the set of tools they have at their disposal to further their interests. Importantly, these ideas are subject to both manipulation and innovation, making them part of the political game. There is, in fact, a direct parallel, as I will show, between inventive activity in technology, which economists now routinely make endogenous in their models, and investment in persuasion and policy innovation in the political arena. I focus specifically on models professing to explain economic inefficiency and argue that outcomes in such models are determined as much by the ideas that elites are presumed to have on feasible strategies as by vested interests themselves. A corollary is that new ideas about policy—or policy entrepreneurship—can exert an independent effect on equilibrium outcomes even in the absence of changes in the configuration of political power. I conclude by discussing the sources of new ideas. Full-Text Access | Supplementary Materials

An Economist's Guide to Visualizing Data, by Jonathan A. Schwabish: Once upon a time, a picture was worth a thousand words. But with online news, blogs, and social media, a good picture can now be worth so much more. Economists who want to disseminate their research, both inside and outside the seminar room, should invest some time in thinking about how to construct compelling and effective graphics. Full-Text Access | Supplementary Materials

Wednesday, January 01, 2014

'Minimum Wages and the Distribution of Family Incomes'

Arin Dube has a new working paper entitled "Minimum Wages and the Distribution of Family Incomes."

Here is his short summary:

The paper tries to make sense of the existing literature, while providing new (and I would argue better) answers to old questions such as the effect on the poverty rate, and it also conducts a more full-fledged distributional analysis of minimum wages and family incomes using newer tools.

Here is the abstract:

I use data from the March Current Population Survey between 1990 and 2012 to evaluate the effect of minimum wages on the distribution of family incomes for non-elderly individuals. I find robust evidence that higher minimum wages moderately reduce the share of individuals with incomes below 50, 75 and 100 percent of the federal poverty line. The elasticity of the poverty rate with respect to the minimum wage ranges between -0.12 and -0.37 across specifications with alternative forms of time-varying controls and lagged effects; most of these estimates are statistically significant at conventional levels. For my preferred (most saturated) specification, the poverty rate elasticity is -0.24, and rises in magnitude to -0.36 when accounting for lags. I also use recentered influence function regressions to estimate unconditional quantile partial effects of minimum wages on family incomes. The estimated minimum wage elasticities are sizable for the bottom quantiles of the equivalized family income distribution. The clearest effects are found at the 10th and 15th quantiles, where estimates from most specifications are statistically significant; minimum wage elasticities for these two family income quantiles range between 0.10 and 0.43 depending on control sets and lags. I also show that the canonical two-way fixed effects model---used most often in the literature---insufficiently accounts for the spatial heterogeneity in minimum wage policies, and fails a number of key falsification tests. Accounting for time-varying regional effects, and state-specific recession effects both suggest a greater impact of the policy on family incomes and poverty, while the addition of state-specific trends does not appear to substantially alter the estimates. I also provide a quantitative summary of the literature, bringing together nearly all existing elasticities of the poverty rate with respect to minimum wages from 12 different papers. The range of the estimates in this paper is broadly consistent with most existing evidence, including for some key subgroups, but previous studies often suffer from limitations including insufficiently long sample periods and inadequate controls for state-level heterogeneity, which tend to produce imprecise and erratic results.
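
For scale, here is the arithmetic those elasticities imply (my computation from the numbers quoted above):

```latex
\%\Delta\,\text{poverty rate}
  \;=\; \varepsilon \times \%\Delta\,\text{minimum wage},
\qquad
-0.24 \times 10\% \;=\; -2.4\% .
```

Under the preferred specification, a 10 percent increase in the minimum wage is associated with roughly a 2.4 percent reduction in the poverty rate (a percent change, not a percentage-point change), rising to about 3.6 percent once lagged effects are included.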

Update: Here is the key graph from the paper. [Figure not reproduced.]

Sunday, December 01, 2013

God Didn’t Make Little Green Arrows

Paul Krugman notes work by my colleague George Evans relating to the recent debate over the stability of GE models:

God Didn’t Make Little Green Arrows: Actually, they’re little blue arrows here. In any case, George Evans reminds me of a paper (pdf) he and co-authors published in 2008 about stability and the liquidity trap, which he later used to explain what was wrong with the Kocherlakota notion (now discarded, but still apparently defended by Williamson) that low rates cause deflation.

The issue is the stability of the deflation steady state ("on the importance of little arrows"). This is precisely the issue George studied in his 2008 European Economic Review paper with E. Guse and S. Honkapohja. The following figure from that paper has the relevant little arrows:


This is the two-dimensional figure from the paper (not reproduced here) showing the phase diagram for inflation and consumption expectations under adaptive learning (in the New Keynesian model, both consumption or output expectations and inflation expectations are central). The intended steady state (marked by a star) is locally stable under learning, but the deflation steady state (given by the other intersection of the black curves) is not locally stable, and there are nearby divergent paths with falling inflation and falling output. There is also a two-page summary in George's 2009 Annual Review of Economics paper.
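
Since the figure itself is not reproduced, here is a one-dimensional toy version of the stability logic. It is my construction, not the Evans-Guse-Honkapohja model: I posit a mapping from expected to realized inflation with two fixed points, with slope below one at the intended steady state and above one at the deflation steady state, which is what the phase diagram's arrows encode.

```python
# Toy mapping from expected inflation to realized inflation with two steady
# states: an intended one at 2 percent (slope 0.55 < 1, stable under
# learning) and a deflationary one at -1 percent (slope 1.45 > 1, unstable).
# The functional form and numbers are illustrative only.
def realized_inflation(expected):
    return expected - 0.15 * (expected + 1.0) * (expected - 2.0)

def learn(initial_expectation, gain=0.5, periods=40):
    """Adaptive learning: each period, revise expected inflation part of
    the way toward what inflation actually turned out to be."""
    e = initial_expectation
    path = [e]
    for _ in range(periods):
        e += gain * (realized_inflation(e) - e)
        path.append(e)
    return path

print(round(learn(3.0)[-1], 3))   # ~2.0: converges to the intended steady state
print(round(learn(0.0)[-1], 3))   # ~2.0: stable from below as well
print([round(x, 2) for x in learn(-1.2, periods=12)])
# Starting just below the deflation steady state, expectations drift ever
# lower -- the divergent deflationary path in the figure.
```

The slope condition is the whole story: where the map crosses the 45-degree line with slope below one, expectational errors self-correct; where it crosses with slope above one, a little pessimism compounds into the deflationary spiral.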

The relevant policy issue came up in 2010 in connection with Kocherlakota's comments about interest rates, and I got George to make a video in Sept. 2010 that makes the implied monetary policy point.

I think it would be a step forward if the EER paper helped Williamson and others who have not understood the disequilibrium stability point. The full EER reference is Evans, George; Guse, Eran; and Honkapohja, Seppo, "Liquidity Traps, Learning and Stagnation," European Economic Review, Vol. 52, 2008, pp. 1438-1463.

Friday, November 15, 2013

'Infant Mortality and the President’s Party'

Chris Blattman:

Do Republican Presidents kill babies?:

Across all nine presidential administrations, infant mortality rates were below trend when the President was a Democrat and above trend when the President was a Republican.

This was true for overall, neonatal, and postneonatal mortality, with effects larger for postneonatal compared to neonatal mortality rates.

Regression estimates show that, relative to trend, Republican administrations were characterized by infant mortality rates that were, on average, three percent higher than under Democratic administrations.

In proportional terms, the effect size is similar for US whites and blacks. But US black rates are more than twice as high as white rates, implying substantially larger absolute effects for blacks.

A new paper titled "US Infant Mortality and the President’s Party". I like my title better.

The abstract also says:

Conclusions: We found a robust, quantitatively important association between net-of-trend US infant mortality rates and the party affiliation of the president. There may be overlooked ways by which the macro-dynamics of policy impact the micro-dynamics of physiology, suggesting the political system is a component of the underlying mechanism generating health inequality in the United States.