Category Archive for: Economics [Return to Main]

Monday, August 06, 2018

Links (8/6/18)

Friday, August 03, 2018

Links (8/3/18)

Economy Adds 157,000 Jobs in July, Little Evidence of Pick-up in Wage Growth

Dean Baker:

Economy Adds 157,000 Jobs in July, Little Evidence of Pick-up in Wage Growth: Unemployment rates for workers without a high school degree hit a record low as less-educated workers continue to be the biggest job gainers in the recovery.
The Bureau of Labor Statistics (BLS) reported the economy added 157,000 jobs in July. With upward revisions to the data from the prior two months, the average gain over the last three months was 224,000. The unemployment rate edged down to 3.9 percent as most of the rise in unemployment in June, which was due to increased labor force participation, was reversed. The employment-to-population (EPOP) ratio rose to 60.5 percent, a new high for the recovery.
In spite of the healthy pace of job growth and the low unemployment rate, there continues to be little evidence of accelerating wage growth. Over the last year, the average hourly wage has risen by 2.7 percent. There is a very small uptick to 2.87 percent if we annualize the rate of wage growth for the last three months (May, June, and July) compared with the prior three months (February, March, and April).
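Baker's 2.87 percent figure comes from compounding a three-month growth rate up to an annual rate. A minimal sketch of that annualization arithmetic, using hypothetical average hourly wage levels (not the actual BLS series):

```python
def annualized_growth(recent_avg, prior_avg, months_apart=3):
    """Annualize the growth between two three-month average wage levels.

    The two averages are centered `months_apart` months apart, so the
    period growth compounds 12 / months_apart times per year.
    """
    periods_per_year = 12 / months_apart
    return (recent_avg / prior_avg) ** periods_per_year - 1

# Hypothetical wages: a May-July average of $27.15 vs. a
# February-April average of $26.96 annualizes to roughly 2.85 percent.
rate = annualized_growth(27.15, 26.96)
print(f"{rate:.2%}")
```

With the actual BLS averages, the same calculation yields the 2.87 percent Baker reports.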
Interestingly, there was a modest fall in hours in July, which led to a decline in the index of aggregate weekly hours from 110.0 to 109.8. As a result, the average weekly wage actually declined slightly in July. ...
In spite of the complaints about labor shortages in sectors such as manufacturing and trucking, we continue to see little evidence of it in wage growth. The average hourly wage for production workers in manufacturing has risen by just 2.7 percent over the last year, while in the larger trucking and warehousing category it has risen less than 2.5 percent.
The story on the household side was overwhelmingly positive. In addition to the rise in EPOPs, the number of involuntary part-time workers fell by 176,000 to a new low for the recovery. The percentage of unemployment due to voluntary quits rose to 13.5 percent, largely reversing a drop in June. The unemployment rate for Hispanic workers fell to 4.5 percent, a new record low.
Looking at 10-year age spans for prime-age workers, EPOPs have been rising for both men and women, although only women between the ages of 25 and 34 have recovered to their prerecession peak EPOP. Even this group is still slightly below its 2000 peak. The trends in EPOPs suggest there is further room for employment to expand.
The unemployment rate for workers without a high school degree fell to 5.1 percent in July, the lowest rate since the BLS adjusted its education measures in 1992. This is 1.9 percentage points below its year-ago rate.
Less-educated workers have been the big gainers in terms of employment in the last few years of the recovery. While the unemployment rate for workers with less than a high school degree is well below the prerecession level and even its 2000 low, the unemployment rate for workers with a college degree, at 2.2 percent, is still above its prerecession low of 1.8 percent and well above its 2000 low of 1.5 percent.
Workers with just a high school degree also seem to be doing relatively better, with a 4.0 percent unemployment rate matching the prerecession low (it had been 3.9 percent in May), although still above the 3.2 percent low hit in 1999. The idea that the labor market is becoming increasingly tilted to favor more educated workers does not appear to be supported by the employment data.  
This is, again, a solid jobs report in terms of job creation and lower unemployment. However, wage growth continues to be a problem, with wages barely outpacing inflation.

Thursday, August 02, 2018

Macroeconomic Research, Present and Past

This is "a work in progress, with follow-ups on the way":

Macroeconomic Research, Present and Past, by P.J. Glandon, Ken Kuttner, Sandeep Mazumder, and Caleb Stroup, August 1, 2018: Abstract: We document eight facts about published macroeconomics research by hand collecting information about the epistemological approaches, methods, and data appearing in over a thousand published papers. Macroeconomics journals have published an increasing share of theory papers over the past 38 years, with theory-based papers now comprising the majority of published macroeconomics research. The increase in quantitative models (e.g., DSGE methods) masks a decline in publication of pure theory research. Financial intermediation played an important role in about a third of macroeconomic theory papers in the 1980s and 1990s, but became less frequent until the financial crisis, at which point it once again became an important area of focus. Only a quarter of macroeconomics publications conduct falsification exercises. This finding contrasts with the year 1980, when these empirical approaches dominated macroeconomics publishing. Yet the fraction of empirical papers that rely on microdata or proprietary data has increased dramatically over the past decade, with these features now appearing in the majority of published empirical papers. All of these findings vary dramatically across individual macroeconomics field journals.

Summer 2018 Journal of Economic Perspectives

Tim Taylor:

Summer 2018 Journal of Economic Perspectives Available On-line: I was hired back in 1986 to be the Managing Editor for a new academic economics journal, at the time unnamed, but which soon launched as the Journal of Economic Perspectives. The JEP is published by the American Economic Association, which back in 2011 decided--to my delight--that it would be freely available on-line, from the current issue back to the first issue. Here, I'll start with Table of Contents for the just-released Summer 2018 issue, which in the Taylor household is known as issue #125. Below that are abstracts and direct links for all of the papers. I may blog more specifically about some of the papers in the next week or two, as well.

Symposium: Macroeconomics a Decade after the Great Recession

"What Happened: Financial Factors in the Great Recession," by Mark Gertler and Simon Gilchrist At the onset of the recent global financial crisis, the workhorse macroeconomic models assumed frictionless financial markets. These frameworks were thus not able to anticipate the crisis, nor to analyze how the disruption of credit markets turned what initially appeared to be a mild downturn into the Great Recession. Since that time, an explosion of both theoretical and empirical research has investigated how the financial crisis emerged and how it was transmitted to the real sector. The goal of this paper is to describe what we have learned from this new research and how it can be used to understand what happened during the Great Recession. In the process, we also present some new empirical work. We argue that a complete description of the Great Recession must take account of the financial distress facing both households and banks and, as the crisis unfolded, nonfinancial firms as well. Exploiting both panel data and time series methods, we analyze the contribution of the house price decline, versus the banking distress indicator, to the overall decline in employment during the Great Recession. We confirm a common finding in the literature that the household balance sheet channel is important for regional variation in employment. However, we also find that the disruption in banking was central to the overall employment contraction. Full-Text Access | Supplementary Materials

"Finance and Business Cycles: The Credit-Driven Household Demand Channel," by Atif Mian and Amir Sufi What is the role of the financial sector in explaining business cycles? This question is as old as the field of macroeconomics, and an extensive body of research conducted since the Global Financial Crisis of 2008 has offered new answers. The specific idea put forward in this article is that expansions in credit supply, operating primarily through household demand, have been an important driver of business cycles. We call this the credit-driven household demand channel. While this channel helps explain the recent global recession, it also describes economic cycles in many countries over the past 40 years. Full-Text Access | Supplementary Materials

"Identification in Macroeconomics," by Emi Nakamura and Jón Steinsson This paper discusses empirical approaches macroeconomists use to answer questions like: What does monetary policy do? How large are the effects of fiscal stimulus? What caused the Great Recession? Why do some countries grow faster than others? Identification of causal effects plays two roles in this process. In certain cases, progress can be made using the direct approach of identifying plausibly exogenous variation in a policy and using this variation to assess the effect of the policy. However, external validity concerns limit what can be learned in this way. Carefully identified causal effects estimates can also be used as moments in a structural moment matching exercise. We use the term "identified moments" as a short-hand for "estimates of responses to identified structural shocks," or what applied microeconomists would call "causal effects." We argue that such identified moments are often powerful diagnostic tools for distinguishing between important classes of models (and thereby learning about the effects of policy). To illustrate these notions we discuss the growing use of cross-sectional evidence in macroeconomics and consider what the best existing evidence is on the effects of monetary policy. Full-Text Access | Supplementary Materials

"The State of New Keynesian Economics: A Partial Assessment," by Jordi Galí In August 2007, when the first signs emerged of what would come to be the most damaging global financial crisis since the Great Depression, the New Keynesian paradigm was dominant in macroeconomics. Ten years later, tons of ammunition have been fired against modern macroeconomics in general, and against dynamic stochastic general equilibrium models that build on the New Keynesian framework in particular. Those criticisms notwithstanding, the New Keynesian model arguably remains the dominant framework in the classroom, in academic research, and in policy modeling. In fact, one can argue that over the past ten years the scope of New Keynesian economics has kept widening, by encompassing a growing number of phenomena that are analyzed using its basic framework, as well as by addressing some of the criticisms raised against it. The present paper takes stock of the state of New Keynesian economics by reviewing some of its main insights and by providing an overview of some recent developments. In particular, I discuss some recent work on two very active research programs: the implications of the zero lower bound on nominal interest rates and the interaction of monetary policy and household heterogeneity. Finally, I discuss what I view as some of the main shortcomings of the New Keynesian model and possible areas for future research. Full-Text Access | Supplementary Materials

"On DSGE Models," by Lawrence J. Christiano, Martin S. Eichenbaum and Mathias Trabandt The outcome of any important macroeconomic policy change is the net effect of forces operating on different parts of the economy. A central challenge facing policymakers is how to assess the relative strength of those forces. Economists have a range of tools that can be used to make such assessments. Dynamic stochastic general equilibrium (DSGE) models are the leading tool for making such assessments in an open and transparent manner. We review the state of mainstream DSGE models before the financial crisis and the Great Recession. We then describe how DSGE models are estimated and evaluated. We address the question of why DSGE modelers—like most other economists and policymakers—failed to predict the financial crisis and the Great Recession, and how DSGE modelers responded to the financial crisis and its aftermath. We discuss how current DSGE models are actually used by policymakers. We then provide a brief response to some criticisms of DSGE models, with special emphasis on criticism by Joseph Stiglitz, and offer some concluding remarks. Full-Text Access | Supplementary Materials

"Evolution of Modern Business Cycle Models: Accounting for the Great Recession," Patrick J. Kehoe, Virgiliu Midrigan and Elena Pastorino Modern business cycle theory focuses on the study of dynamic stochastic general equilibrium (DSGE) models that generate aggregate fluctuations similar to those experienced by actual economies. We discuss how these modern business cycle models have evolved across three generations, from their roots in the early real business cycle models of the late 1970s through the turmoil of the Great Recession four decades later. The first generation models were real (that is, without a monetary sector) business cycle models that primarily explored whether a small number of shocks, often one or two, could generate fluctuations similar to those observed in aggregate variables such as output, consumption, investment, and hours. These basic models disciplined their key parameters with micro evidence and were remarkably successful in matching these aggregate variables. A second generation of these models incorporated frictions such as sticky prices and wages; these models were primarily developed to be used in central banks for short-term forecasting purposes and for performing counterfactual policy experiments. A third generation of business cycle models incorporates the rich heterogeneity of patterns from the micro data. A defining characteristic of these models is not the heterogeneity among model agents they accommodate nor the micro-level evidence they rely on (although both are common), but rather the insistence that any new parameters or feature included be explicitly disciplined by direct evidence. We show how two versions of this latest generation of modern business cycle models, which are real business cycle models with frictions in labor and financial markets, can account, respectively, for the aggregate and the cross-regional fluctuations observed in the United States during the Great Recession. Full-Text Access | Supplementary Materials

"Microeconomic Heterogeneity and Macroeconomic Shocks," by Greg Kaplan and Giovanni L. Violante In this essay, we discuss the emerging literature in macroeconomics that combines heterogeneous agent models, nominal rigidities, and aggregate shocks. This literature opens the door to the analysis of distributional issues, economic fluctuations, and stabilization policies—all within the same framework. In response to the limitations of the representative agent approach to economic fluctuations, a new framework has emerged that combines key features of heterogeneous agents (HA) and New Keynesian (NK) economies. These HANK models offer a much more accurate representation of household consumption behavior and can generate realistic distributions of income, wealth, and, albeit to a lesser degree, household balance sheets. At the same time, they can accommodate many sources of macroeconomic fluctuations, including those driven by aggregate demand. In sum, they provide a rich theoretical framework for quantitative analysis of the interaction between cross-sectional distributions and aggregate dynamics. In this article, we outline a state-of-the-art version of HANK together with its representative agent counterpart, and convey two broad messages about the role of household heterogeneity for the response of the macroeconomy to aggregate shocks: 1) the similarity between the Representative Agent New Keynesian (RANK) and HANK frameworks depends crucially on the shock being analyzed; and 2) certain important macroeconomic questions concerning economic fluctuations can only be addressed within heterogeneous agent models. Full-Text Access | Supplementary Materials

Symposium: Incentives in the Workplace

"Compensation and Incentives in the Workplace," by Edward P. Lazear Labor is supplied because most of us must work to live. Indeed, it is called "work" in part because without compensation, the overwhelming majority of workers would not otherwise perform the tasks. The theme of this essay is that incentives affect behavior and that economics as a science has made good progress in specifying how compensation and its form influences worker effort. This is a broad topic, and the purpose here is not a comprehensive literature review on each of many topics. Instead, a sample of some of the most applicable papers is discussed with the goal of demonstrating that compensation, incentives, and productivity are inseparably linked. Full-Text Access | Supplementary Materials

"Nonmonetary Incentives and the Implications of Work as a Source of Meaning," by Lea Cassar and Stephan Meier Empirical research in economics has begun to explore the idea that workers care about nonmonetary aspects of work. An increasing number of economic studies using survey and experimental methods have shown that nonmonetary incentives and nonpecuniary aspects of one's job have substantial impacts on job satisfaction, productivity, and labor supply. By drawing on this evidence and relating it to the literature in psychology, this paper argues that work represents much more than simply earning an income: for many people, work is a source of meaning. In the next section, we give an economic interpretation of meaningful work and emphasize how it is affected by the mission of the organization and the extent to which job design fulfills the three psychological needs at the basis of self-determination theory: autonomy, competence, and relatedness. We point to the evidence that not everyone cares about having a meaningful job and discuss potential sources of this heterogeneity. We sketch a theoretical framework to start to formalize work as a source of meaning and think about how to incorporate this idea into agency theory and labor supply models. We discuss how workers' search for meaning may affect the design of monetary and nonmonetary incentives. We conclude by suggesting some insights and open questions for future research. Full-Text Access | Supplementary Materials

"The Changing (Dis-)utility of Work," by Greg Kaplan and Sam Schulhofer-Wohl We study how changes in the distribution of occupations have affected the aggregate non-pecuniary costs and benefits of working. The physical toll of work is less now than in 1950, with workers shifting away from occupations in which people report experiencing tiredness and pain. The emotional consequences of the changing occupation distribution vary substantially across demographic groups. Work has become happier and more meaningful for women, but more stressful and less meaningful for men. These changes appear to be concentrated at lower education levels. Full-Text Access | Supplementary Materials

Individual Articles

"Social Connectedness: Measurement, Determinants, and Effects," by Michael Bailey, Rachel Cao, Theresa Kuchler, Johannes Stroebel and Arlene Wong Social networks can shape many aspects of social and economic activity: migration and trade, job-seeking, innovation, consumer preferences and sentiment, public health, social mobility, and more. In turn, social networks themselves are associated with geographic proximity, historical ties, political boundaries, and other factors. Traditionally, the unavailability of large-scale and representative data on social connectedness between individuals or geographic regions has posed a challenge for empirical research on social networks. More recently, a body of such research has begun to emerge using data on social connectedness from online social networking services such as Facebook, LinkedIn, and Twitter. To date, most of these research projects have been built on anonymized administrative microdata from Facebook, typically by working with coauthor teams that include Facebook employees. However, there is an inherent limit to the number of researchers that will be able to work with social network data through such collaborations. In this paper, we therefore introduce a new measure of social connectedness at the US county level. Our Social Connectedness Index is based on friendship links on Facebook, the global online social networking service. Specifically, the Social Connectedness Index corresponds to the relative frequency of Facebook friendship links between every county-pair in the United States, and between every US county and every foreign country. Given Facebook's scale as well as the relative representativeness of Facebook's user body, these data provide the first comprehensive measure of friendship networks at a national level. Full-Text Access | Supplementary Materials

"Recommendations for Further Reading," by Timothy Taylor
Full-Text Access | Supplementary Materials

Rulers of the World: Read Karl Marx!

From The Economist:

Rulers of the world: read Karl Marx!: ...The chief reason for the continuing interest in Marx, however, is that his ideas are more relevant than they have been for decades. The post-war consensus that shifted power from capital to labour and produced a “great compression” in living standards is fading. Globalisation and the rise of a virtual economy are producing a version of capitalism that once more seems to be out of control. The backwards flow of power from labour to capital is finally beginning to produce a popular—and often populist—reaction. No wonder the most successful economics book of recent years, Thomas Piketty’s “Capital in the Twenty-First Century”, echoes the title of Marx’s most important work and his preoccupation with inequality. ...

The 2018 Fields Medal and its Surprising Connection to Economics!

From Kevin Bryan at Updated Priors:

The 2018 Fields Medal and its Surprising Connection to Economics!: The Fields Medal and Nevanlinna Prizes were given out today. They represent the highest honor possible for young mathematicians and theoretical computer scientists, and are granted only once every four years. The mathematics involved is often very challenging for outsiders. Indeed, the most prominent of this year’s winners, the German Peter Scholze, is best known for his work on “perfectoid spaces”, and I honestly have no idea how to begin explaining them aside from saying that they are useful in a number of problems in algebraic geometry (the lovely field mapping results in algebra – what numbers solve y=2x – and geometry – noting that those solutions to y=2x form a line). Two of this year’s prizes, however, the Fields given to Alessio Figalli and the Nevanlinna to Constantinos Daskalakis, have a very tight connection to an utterly core question in economics. Indeed, both of those men have published work in economics journals!
The problem of interest concerns how best to sell an object. ...

Wednesday, August 01, 2018

Links (8/1/18)

Race, and the Race Between Stocks and Homes

"Black-white economic inequality remains large and persistent, and recent wealth inequality trends for all Americans are explained by assets, not income":

Race, and the race between stocks and homes, by Douglas Clement, The Region: Most research on long-term U.S. inequality focuses on income; relatively little examines wealth, largely due to lack of good asset data. But a June 2018 working paper from the Opportunity & Inclusive Growth Institute addresses that imbalance with a new data set developed from historical surveys, and it shows that wealth—specifically, ownership of stocks and homes—has been a central force behind U.S. inequality trends for 70 years.

...Their analysis begins by confirming the findings of other scholars: increased income polarization since the 1970s, with particular damage to the relative position of the middle class. It also sheds new light on economic inequality between blacks and whites by quantifying vast differences in wealth as well as income, and no progress in diminishing those gaps.

Perhaps the study’s most novel contribution, however, is in revealing the singular role of household portfolio composition—ownership of different asset types—in determining inequality trends. Because the primary source of middle-class American wealth is homeownership, and the main asset holding of the top 10 percent is equity, the relative prices of the two assets have set the path for wealth distribution and driven a wedge between the evolution of income and wealth.

In brief, as home prices climbed from 1950 until the mid-2000s, middle-class wealth held its own relative to upper-class wealth even as middle-class incomes stagnated. But after the financial crisis, the stock market's quick recovery and the slow rebound of housing prices produced soaring wealth inequality that exceeded even the last decade's climb in income inequality. ...

Racial inequality: “The overall summary is bleak”

The demographic detail and 70-year span of the new database also permitted close analysis of racial inequality, pre- and post-civil rights eras. The picture is discouraging. Income disparities are as large now as in 1950, with black household income still just half that of white households.

The racial gap in wealth is even wider, and similarly stagnant. The median black household has less than 11 percent of the wealth of the median white household (about $15,000 versus $140,000 in 2016 prices). The economists also find that the financial crisis hit black households particularly hard.
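The "less than 11 percent" figure follows directly from the two reported medians; a quick check:

```python
# Median household wealth in 2016 prices, as reported in the article.
median_black = 15_000
median_white = 140_000

# The median black household holds roughly a tenth of the wealth
# of the median white household.
ratio = median_black / median_white
print(f"{ratio:.1%}")  # 10.7%, i.e. less than 11 percent
```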

“The overall summary is bleak,” they write. “Over seven decades, next to no progress has been made in closing the black-white income gap. The racial wealth gap is equally persistent. … The typical black household remains poorer than 80% of white households.”

The race between stocks and homes

To explain the divergent trends in income and wealth inequality before the crisis, the economists draw on a key strength of the database: It includes both income and wealth information, household-by-household, and 70 years of balance sheets with detailed portfolio composition.

With this, they find that the bottom 50 percent now holds little or negative wealth (that is, debt), and its share dropped from 3 percent of total wealth in 1950 to 1.2 percent in 2016.

For the upper half, portfolio diversification determines wealth trends. The data show that homes are the primary asset for households between the 50th and the 90th percentile, while the upper 10th also owns a large share of equities. Therefore, middle-class household wealth is strongly exposed to house price fluctuation, and the top 10 percent is more sensitive to stock market variations.

This difference in asset holdings explains how, prior to the crisis, middle-class households experienced rising wealth in parallel with the top 10th, even though their real incomes stagnated and savings were negligible. But the picture changed dramatically post-crisis. In “a race between the stock market and the housing market,” the economists write, the richest 10 percent, by virtue of a climbing stock market, enjoyed soaring post-crisis wealth, while average household wealth largely stagnated. (See figure.)


“When house prices collapsed in the 2008 crisis,” the economists conclude, the “leveraged portfolio position of the middle class brought about substantial wealth losses, while the quick rebound in stock markets boosted wealth at the top. Relative price changes between houses and equities after 2007 have produced the largest spike in wealth inequality in postwar American history.”

How BBC Balance and Bad Think Tanks Discourage Evidence Based Policy

Simon Wren-Lewis:

How BBC balance and bad think tanks discourage evidence based policy:
The Knowledge Transmission Mechanism (KTM) is how knowledge produced by academics and other researchers is translated into public policy. Evidence based policy is the result of this mechanism working. The media is, in theory, an important conduit for the KTM...
The rigid application of political balance in the broadcast media is in danger of negating the KTM, and therefore evidence based policy. The moment an issue (call it issue X) is deemed ‘political’ by the media, balance dictates that any view expressed on issue X is an opinion rather than knowledge. As a result, when the media want to talk to non-politicians (‘experts’) about issue X, the imperative of balance remains.
Now suppose that in the knowledge world there is in fact a consensus on issue X. That would be a problem for balance broadcasting, because it would be difficult to get an expert to argue against the consensus. The BBC overcame this problem valiantly during Brexit, using Patrick Minford (who is not known as a trade economist) time and again to balance the IMF, the OECD, more than 90% of academic opinion etc. But another way of solving this problem is to use certain think tanks.
There are two types of think tank. The good kind can be a vital part of the KTM. There is often a genuine need for think tanks to help translate academic research into policy. ... These think tanks are an important part of the KTM, because they can establish what the academic consensus is, translate academic ideas into practical policy, and match policy problems to evidence based solutions. ...
The bad kind are rather different. These produce ‘research’ that conforms to a particular line or ideology, rather than conforming to evidence or existing academic knowledge. Sometimes these think tanks can even become policy entrepreneurs, selling policies to politicians. This is often called policy-based evidence making. It would be nice to be able to distinguish between good and bad think tanks in an easy way. The good type seeks to foster the KTM and ensure policy is evidence based, while the bad type seeks to negate the KTM by producing evidence or policies that fit preconceived ideas or the policymaker’s ideology.
I would argue that transparency about funding sources provides a strong indicator of which type a think tank is. ...
Another good indicator of a bad think tank is their relationship to academia. ...
In the case of global warming the BBC has been forced ... to treat man made climate change as a fact rather than an opinion that always has to be balanced. That is not going to happen for some time over any economic issue, however strong the academic consensus (like Brexit). This is partly because the pressure from academia is much less, and partly because there is still a prejudice against social science (as if evidence based policy making cannot occur for economic or social policy!). But the BBC does need to explain their attitude to the use of think tanks. ...

Tuesday, July 31, 2018

High-Speed Rail Expansion and German Worker Mobility

Morgan Foy at the NBER Digest:

High-Speed Rail Expansion and German Worker Mobility: Starting in the late 1990s, Germany expanded its high-speed rail network (HSR), connecting outlying locales to large urban areas. In The Effect of Infrastructure on Worker Mobility: Evidence from High-Speed Rail Expansion in Germany (NBER Working Paper No. 24507), Daniel F. Heuermann and Johannes F. Schmieder study how this large-scale infrastructure investment affected commuter behavior. They find that the expansion reduced travel times and increased commuting, as workers moved to jobs in smaller cities while keeping their places of residence in larger urban areas.

Until the late 1990s, the HSR system connected the largest cities of Germany. The connected cities were located in just three of the 16 German states. Areas between the large cities, through which the tracks ran, campaigned for stations, and in a second wave of expansion, the government added stops in many of these cities.

The researchers analyze the effects of this infrastructure improvement by comparing cities granted stops in the second wave of expansion with other small German cities that were not added to the rail network. They note that new rail stops were placed not because of economic conditions, such as connectedness to urban centers, but because of political factors. Moreover, unlike infrastructure investments in roads and highways, the HSR system exclusively carries passengers. It does not transport goods, so it affects labor but not product markets.

The researchers create a dataset that includes travel times, train schedules, and administrative employment data, which contain the region of work and residence for each traveler.

For the average pair of cities in their study, the high-speed rail expansion reduced travel time by 13 minutes, or about 10 percent of the pre-expansion time. The number of commuters rose by 0.25 percent for each 1 percent decrease in travel time. The effect of travel-time reductions on ridership followed an inverted U-shaped pattern with respect to route length, with the largest impacts on routes of 200 to 500 kilometers.

The researchers estimate that 840,000 people started to use rail transportation for commuting during their 1994-2010 sample period. Twelve percent of the increase in ridership was attributable to the 10 percent reduction in commuting time as a result of HSR expansion.
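The elasticity arithmetic in the summary can be sketched as follows; the helper function is illustrative, and the route figures are the averages reported above, not data from the paper itself:

```python
# Back-of-the-envelope using the elasticity reported in the NBER Digest
# summary: commuters rise 0.25% for each 1% cut in travel time.

def predicted_commuter_change(elasticity: float, pct_time_reduction: float) -> float:
    """Percent change in commuters implied by a percent reduction in travel time."""
    return elasticity * pct_time_reduction

# Average city pair: travel time fell about 10 percent (13 minutes).
elasticity = 0.25
pct_time_cut = 10.0
rise = predicted_commuter_change(elasticity, pct_time_cut)
print(f"Predicted rise in commuters: {rise:.1f}%")  # 2.5%
```

So the average route's 10 percent time saving implies roughly a 2.5 percent increase in commuters, which is the sense in which the 840,000 figure decomposes into a travel-time component.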

The researchers find that the number of commuters from small cities to large cities is 40 percent larger than the number commuting in the opposite direction. Commuting from large cities to small ones, however, was much more sensitive to the reduction in travel time from HSR expansion. This supports the view that workers value living in large urban areas and are not there solely for employment.

The study concludes that the gains from the investment in infrastructure accrued mainly to smaller cities. Commuters are twice as likely as non-commuters to be college graduates, which suggests that building HSR networks may be one way to engage relatively high-skilled workers in the economies of peripheral regions.

Links (7/31/18)

Monday, July 30, 2018

Links (7/30/18)

Wednesday, July 25, 2018

Links (7/25/18)



Monday, July 23, 2018

Links (7/23/18)

Friday, July 20, 2018

Links (7/20/18)

Wednesday, July 18, 2018

Links (7/18/18)

Monday, July 16, 2018

Links (7/16/18)

Friday, July 13, 2018

Links (7/13/18)

Wednesday, July 11, 2018

Links (7/11/18)

Monday, July 09, 2018

Links (7/9/18)

Friday, July 06, 2018

Links (7/6/18)


Wednesday, July 04, 2018

Links (7/4/18)

Monday, July 02, 2018

Links (7/2/18)

Thursday, June 28, 2018

Links (6/28/18)

Monday, June 25, 2018

Links (6/25/18)

Saturday, June 23, 2018


Wednesday, June 20, 2018


Thursday, June 14, 2018


Tuesday, June 12, 2018


Monday, June 11, 2018


Friday, June 08, 2018


Thursday, May 31, 2018

Oh, What a Stupid Trade War

Paul Krugman:

Oh, What a Stupid Trade War (Very Slightly Wonkish), by Paul Krugman, NY Times: So, the trade war is on. And what a stupid trade war it is. …

The official – and legal – justification for the steel and aluminum tariffs is national security. That’s an obviously fraudulent rationale... But Trump and co. presumably don’t care about telling lies with regard to economic policy... They would see it as all fair game if the policy delivered job gains Trump could trumpet. Will it?

OK, here’s the point where being a card-carrying economist gets me into a bit of trouble. The proper answer about the job-creation or -destruction effect of a trade policy – any trade policy, no matter how well or badly conceived – is basically zero. ... Why? The Fed... Even if tariffs were expansionary, that would just make the Fed raise rates faster, which would in turn crowd out jobs in other industries...

But I think this is a case where macroeconomics, even though I believe it’s right, gets in the way of useful discussion. We do want to know whether the Trump trade war ... would add or subtract jobs holding monetary policy constant, even though we know monetary policy won’t be constant.

And the answer, almost surely, is that this trade war will actually be a job-killer, not a job-creator, for two reasons.

First, Trump is putting tariffs on intermediate goods…, some of which themselves have to compete on world markets. Most obviously, cars and other durable manufactured goods will become more expensive to produce, which means that we’ll sell less of them; and whatever gains there are in primary metals employment will be offset by job losses in downstream industries.

Playing with the numbers, it seems highly likely that even this direct effect is a net negative for employment.

Second, other countries will retaliate against U.S. exports, costing jobs in everything from motorcycles to sausages. …

Finally – and I think this is really important – we’re dealing with real countries here, mainly democracies. Real countries have real politics; they have pride; and their electorates really, really don’t like Trump. This means that even if their leaders might want to make concessions, their voters probably won’t allow it. ...

So this is a remarkably stupid economic conflict to get into. And the situation in this trade war is likely to develop not necessarily to Trump’s advantage.


Tuesday, May 29, 2018

Is GDP Overstating Economic Activity?

From an Economic Letter at the FRBSF:

Is GDP Overstating Economic Activity?, by Zheng Liu, Mark M. Spiegel, and Eric B. Tallman: Two common measures of overall economic output are gross domestic product (GDP) and gross domestic income (GDI). GDP is based on aggregate expenditures, while GDI is based on aggregate income. In principle, the two measures should be identical. However, in practice, they are not. The differences between these two series can arise from differences in source data, errors in measuring their components, and the seasonal adjustment process.
In this Economic Letter, we evaluate the reliability of GDP relative to two alternatives, GDI and a combination of the two known as GDPplus, for measuring economic output. We test the ability of each to forecast a benchmark measure of economic activity over the past two years. We find that GDP consistently outperforms the other two as a more accurate predictor of aggregate economic activity over this period. This suggests that the relative weakness of GDI growth in recent years does not necessarily indicate weakness in overall economic growth.
Discrepancies between GDP and GDI
What drives the discrepancies between GDP and GDI is not well understood. The source data for the components that go into GDP and GDI are measured with errors, which may lead to discrepancies between the two. Further discrepancies can arise because those different components are adjusted for seasonality at different points in time (see, for example, Grimm 2007).
The differences between these two series can be large. For example, in the last two quarters of 2007, inflation-adjusted or “real” GDI was declining whereas real GDP was still growing. The year-over-year growth rate of GDP exceeded that of GDI by almost 2.6 percentage points. Over long periods, however, final measures of growth in GDP and GDI tend to yield roughly equivalent assessments of economic activity. Since 1985, real GDP grew at an average annual rate of about 3.98%, while real GDI grew at a similar average rate of 4.02%.
Since late 2015, the two series have diverged, with real GDP growth consistently exceeding real GDI growth (Figure 1). The differences in growth are significant in this period. For example, if we used GDI growth to assess overall economic activity since July 2015, then the size of real aggregate output by the end of 2017 would be $230 billion smaller than if GDP growth were used. This divergence between the two sends mixed signals regarding the strength of recent economic activity.
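The mechanics of how a persistent growth-rate gap compounds into a level gap like the $230 billion figure can be sketched as follows; the starting level and quarterly rates below are hypothetical placeholders chosen for illustration, not actual BEA figures:

```python
# How small, persistent quarterly growth differences between GDP and GDI
# accumulate into a sizable level gap over a few years.

def compound_level(start_level, quarterly_growth_pct):
    """Compound a starting level through a sequence of quarterly growth rates (in percent)."""
    level = start_level
    for g in quarterly_growth_pct:
        level *= 1.0 + g / 100.0
    return level

start = 18_000.0   # hypothetical real output in mid-2015, billions of dollars
quarters = 10      # third quarter of 2015 through the end of 2017
gdp_level = compound_level(start, [0.55] * quarters)  # hypothetical GDP growth path
gdi_level = compound_level(start, [0.42] * quarters)  # hypothetical, slower GDI path
print(f"Implied level gap after {quarters} quarters: ${gdp_level - gdi_level:,.0f} billion")
```

A growth gap of barely more than a tenth of a percentage point per quarter, sustained for two and a half years, is enough to open a gap of a couple hundred billion dollars in the implied level of output.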

Figure 1
Mixed signals from GDP and GDI growth


Source: Bureau of Economic Analysis.

Evaluating GDP, GDI, combination
Researchers often debate which of these series measures economic activity more accurately. Nalewaik (2012) argues that GDI outperforms GDP in forecasting recessions. GDI does appear to exhibit more cyclical volatility than GDP. One reason may be that GDI is more highly correlated with a number of business cycle indicators, including movements in both employment and unemployment (Nalewaik 2010). On the other hand, the Bureau of Economic Analysis has resisted this conclusion, arguing that GDP is in general based on more reliable source data than GDI is (Landefeld 2010).
To evaluate the relative reliability of GDP versus GDI for measuring economic output, we compare their abilities to forecast a benchmark measure of economic activity. We focus on the Chicago Fed National Activity Index (CFNAI) as the benchmark, since it is publicly available. The CFNAI is a monthly index of national economic activity, generated as the common component of 85 monthly series in the U.S. economy. These underlying series include a wide variety of data covering production and income, employment and unemployment, personal consumption and housing, and sales and orders. The CFNAI has been shown to help forecast real GDP (Lang and Lansing 2010). We use the CFNAI as a benchmark activity indicator to evaluate the relative forecasting performances of GDP and GDI and their combinations. Since the discrepancy between these two series has persisted for several years, we focus on the final releases of the GDP and GDI series.
Some have argued that, because the GDP and GDI series contain independent information, it may be preferable to combine the two series into a single more informative activity indicator. One series that uses such a combination is the Philadelphia Fed’s GDPplus series, which is a weighted average of GDP and GDI, with the weights based on the approach described by Aruoba et al. (2016). As a weighted average, GDPplus indicates activity levels between the two individual series. We therefore also consider the forecasting performance of the GDPplus series over this period of extended discrepancy between reported GDP and GDI growth.
To confirm the accuracy of our approach, we repeated our investigation with two alternative series constructed using methodologies similar to the CFNAI. The first alternative is an aggregate economic activity index (EAI) we constructed by extracting the common components of 90 underlying monthly time series. The EAI covers a broader set of monthly indicators than the CFNAI, since we also include information from goods prices and asset prices.
The second alternative indicator we considered is an activity index constructed by Barigozzi and Luciani (2018), which we call the BL index. Like our index, the BL index includes price indexes and other measures of labor costs. The authors base their estimates on the portions of GDP and GDI that are driven by common macroeconomic shocks under the assumption that they have equivalent effects on GDP and GDI. This restriction implies that deviations between GDP and GDI are transitory, and that the two series follow each other over time.
The EAI and the BL index are both highly correlated with the CFNAI and thus yielded similar conclusions. We describe the source data and our methodology for constructing the EAI as well as the analysis using both it and the BL index in an online appendix.
Empirical results
To examine the relative performances of GDP, GDI, and GDPplus for forecasting the CFNAI, we first estimate an empirical model in which the CFNAI is related to four lagged values of one of these measures of aggregate output. Ideally, we would have used the full sample of postwar data in our model, but structural breaks in the data, related to factors such as the change in the monetary policy regime and the onset of the Great Moderation in the mid-1980s, make this challenging. We therefore choose to focus on the sample starting from the first quarter of 1985 in this discussion; our results using the full sample are similar, as we report in the online appendix.
To examine how well each of the measures of aggregate output are able to forecast the CFNAI, we estimate the model using the sample observations up to the end of 2015, the period before GDP and GDI diverged. Once we determine the estimated coefficients that describe each relationship, we use those values to estimate forecasts for the period when discrepancies developed, from the first quarter of 2016 to the end of 2017. We then calculate the prediction errors, measured by the root mean-squared errors, for each measure of aggregate output. The smaller the prediction error, the better the forecasting performance.
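A minimal sketch of this evaluation design, assuming ordinary least squares on a constant plus four lags, with synthetic series standing in for the actual CFNAI and BEA output measures:

```python
# Fit an activity index on four lags of an output-growth measure over a
# pre-2016 estimation sample, then score out-of-sample root mean-squared
# error on a held-out window -- the design described in the Letter.
import numpy as np

def lagged_design(y, x, n_lags=4):
    """Return (target, regressors): regressors are a constant plus lags 1..n_lags of x."""
    T = len(y)
    lags = np.column_stack([x[n_lags - k : T - k] for k in range(1, n_lags + 1)])
    return y[n_lags:], np.column_stack([np.ones(T - n_lags), lags])

rng = np.random.default_rng(0)
T = 120                                    # quarterly observations (synthetic)
growth = rng.normal(0.6, 0.5, T)           # synthetic output-growth series
activity = 0.5 * np.roll(growth, 1) + rng.normal(0.0, 0.2, T)
activity[0] = 0.0                          # discard the wrapped first value

y, X = lagged_design(activity, growth)
split = len(y) - 8                         # hold out the last 8 quarters ("2016-17")
beta, *_ = np.linalg.lstsq(X[:split], y[:split], rcond=None)
forecast = X[split:] @ beta
rmse = np.sqrt(np.mean((y[split:] - forecast) ** 2))
print(f"Out-of-sample RMSE: {rmse:.3f}")
```

Running the same procedure with GDP, GDI, and GDPplus as the growth series and comparing the resulting RMSEs is, in essence, the horse race the authors report: the smaller the prediction error, the better the forecasting performance.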
In addition to examining the forecasting performance of GDP, GDI, and GDPplus for predicting the CFNAI economic activity indicator, we also examined their forecasting performance for the unemployment rate as reported by the Bureau of Labor Statistics.
Figure 2 displays the prediction errors from 2016 to 2017 for each of the alternative output measures—GDP, GDI, and GDPplus—estimated from our model for CFNAI and unemployment. For ease of comparison, we normalize the prediction errors from the model with GDP to one. The figure shows that the prediction errors over this period based on the GDP series are substantively lower than those based on GDI or GDPplus. This finding holds true not just for these proxies for economic activity but also for our EAI and the BL index (see the online appendix). Moreover, formal statistical tests of forecasting performance indicate that the forecasts based on GDP are significantly better than those based on GDI or GDPplus at the 95% confidence level. This result suggests that, in recent periods, GDP has been a more reliable independent indicator of economic activity than either GDI or GDPplus.

Figure 2
GDP outperforms GDI, GDPplus in predicting activity


Note: Figure shows prediction errors with GDP indexed to 1.

While GDP and GDI are theoretically identical measures of economic output, they can differ significantly in practice over some periods. The differences between the two series have been particularly pronounced in the past two years, when GDP growth has been consistently stronger than GDI growth. Based on this observation, some analysts have claimed that GDP might be overstating the pace of growth and that GDI, or some combination of GDP and GDI, should be used to evaluate the levels and growth rate of economic activity.
To evaluate the validity of this claim, we compared the relative performances of GDP, GDI, and a combined measure, GDPplus, for forecasting the CFNAI, which we use as a benchmark measure of economic activity over the past two years. We find that GDP consistently outperforms both GDI and combinations of the two, such as GDPplus, in forecasting aggregate economic activity during the past two years. In this sense, GDP is a more accurate predictor of aggregate economic activity than GDI over this period. Therefore, the relative weakness of GDI growth observed in recent years does not necessarily indicate weakness in overall economic growth.
Zheng Liu is a senior research advisor in the Economic Research Department of the Federal Reserve Bank of San Francisco.
Mark M. Spiegel is a vice president in the Economic Research Department of the Federal Reserve Bank of San Francisco.
Eric B. Tallman is a research associate in the Economic Research Department of the Federal Reserve Bank of San Francisco.
Aruoba, S. Boragan, Francis X. Diebold, Jeremy Nalewaik, Frank Schorfheide, and Dongho Song. 2016. “Improving GDP Measurement: A Measurement-Error Perspective.” Journal of Econometrics 191(2), pp. 384–397.
Barigozzi, Matteo, and Matteo Luciani. 2018. “Do National Account Statistics Underestimate U.S. Real Output Growth?” Board of Governors FEDS Notes, January 9.
Grimm, Bruce T. 2007. “The Statistical Discrepancy.” Bureau of Economic Analysis Working Paper 2007-01, March 2. 
Landefeld, J. Steven. 2010. “Comments and Discussion: The Income- and Expenditure-Side Estimates of U.S. Output Growth.” Brookings Papers on Economic Activity, Spring, pp. 112–123.
Lang, David, and Kevin J. Lansing. 2010. “Forecasting Growth Over the Next Year with a Business Cycle Index.” FRBSF Economic Letter 2010-29 (September 27).
Nalewaik, Jeremy J. 2010. “The Income- and Expenditure-Side Estimates of U.S. Output Growth.” Brookings Papers on Economic Activity, Spring, pp. 71–106.
Nalewaik, Jeremy J. 2012. “Estimating Probabilities of Recession in Real Time Using GDP and GDI.” Journal of Money, Credit, and Banking 44, pp. 235–253.

Opinions expressed in FRBSF Economic Letter do not necessarily reflect the views of the management of the Federal Reserve Bank of San Francisco or of the Board of Governors of the Federal Reserve System.


Sunday, May 27, 2018

Does Economics Matter?

Chris Dillow:

Does economics matter?: Does economics matter? I ask because I suspect I would understand political debate better if I realized that it doesn’t.
Everybody tends to over-rate the importance of their profession: it’s part of déformation professionnelle. Lawyers over-rate the importance of the law, artists of the arts, and so on. Maybe economists do the same. Perhaps we should realize that most people who are interested in politics just aren’t interested in economics.
If we adopt this perspective, a lot falls into place. ...

Saturday, May 26, 2018

Neo- and Other Liberalisms

David Glasner at Uneasy Money:

Neo- and Other Liberalisms: Everybody seems to be worked up about “neoliberalism” these days. A review of Quinn Slobodian’s new book on the Austrian (or perhaps the Austro-Hungarian) roots of neoliberalism in the New Republic by Patrick Iber reminded me that the term “neoliberalism,” which, in my own faulty recollection, came into somewhat popular usage only in the early 1980s, had actually been coined in the late 1930s at the now almost legendary Colloque Walter Lippmann and had actually been used by Hayek in at least one of his political essays in the 1940s. In that usage the point of neoliberalism was to revise and update the classical nineteenth-century liberalism that seemed to have run aground in the Great Depression, when the attempt to resurrect and restore what had been widely – and in my view mistakenly – regarded as an essential pillar of the nineteenth-century liberal order – the international gold standard – collapsed in an epic international catastrophe. The new liberalism was supposed to be a kinder and gentler – less relentlessly laissez-faire – version of the old liberalism, more amenable to interventions to aid the less well-off and to social-insurance programs providing a safety net to cushion individuals against the economic risks of modern capitalism, while preserving the social benefits and efficiencies of a market economy based on private property and voluntary exchange. ...

Effects of Copyrights on Science

Barbara Biasi and Petra Moser at VoxEU:

Effects of copyrights on science: Copyrights grant publishers exclusive rights to content for almost a century. In science, this can involve substantial social costs by limiting who can access existing research. This column uses a unique WWII-era programme in the US, which allowed US publishers to reprint exact copies of German-owned science books, to explore how copyrights affect follow-on science. This artificial removal of copyright barriers led to a 25% decline in prices, and a 67% increase in citations. These results suggest that restrictive copyright policies slow down the progress of science considerably.

Friday, May 25, 2018


Wednesday, May 23, 2018


Monday, May 21, 2018


Thursday, May 17, 2018


Wednesday, May 16, 2018


Monday, May 14, 2018


Friday, May 11, 2018


Wednesday, May 09, 2018


Monday, May 07, 2018


Friday, May 04, 2018

Employer Concentration and Stagnant Wages

From the NBER Digest. "Two studies suggest that an increase in employers' monopsony power is associated with lower wages.":

Employer Concentration and Stagnant Wages: Stagnant wages and a declining share of labor income in GDP in recent decades have spawned a number of explanations. These include outsourcing, foreign competition, automation, and the decline of unions. Two new studies focus on another factor that may have affected the relative bargaining position of workers and firms: employer domination of local job markets. One shows that wage growth slowed as industrial consolidation increased over the past 40 years; the other shows that in many job markets across the country there is little competition for workers in specific job categories.


In Strong Employers and Weak Employees: How Does Employer Concentration Affect Wages? (NBER Working Paper No. 24307), Efraim Benmelech, Nittai Bergman, and Hyunseob Kim analyzed county-level census data for industrial firms for the period 1977 to 2009 to study the impact of employer concentration on wages in local labor markets. By focusing on manufacturing, they were able to control directly for worker productivity.

The researchers found that, although there was substantial cross-sectional and time-series variation in concentration, average local-level employer concentration increased between 1977-81 and 2002-09, based on the Standard Industrial Classification four-digit code for industry groups. Their measure of concentration is the Herfindahl-Hirschman Index (HHI), which is defined as the sum of the squares of the employment shares for all of the firms in a given industry. The employment-weighted mean value of this index rose from 0.698 to 0.756 during the study period, an increase of 0.058, or about 8 percent. Forty percent of the plant-year observations were associated with manufacturing facilities in counties dominated by just a few firms.

The researchers found a negative relationship between employer concentration and wages, one that was twice as strong in the second half of their data sample as in the first half; a one standard deviation increase in the HHI was associated with a wage reduction of between 1 and 2 percent. They estimate that a firm operating in a labor market in which it was the only employer would pay wages 3.1 percent lower than those of a firm that operated in a less concentrated market. Most of the decline in wages appeared to occur as labor markets approached the pure monopsony case, namely the situation in which only one firm is hiring workers. In addition to finding lower wages in monopsony markets, the researchers also found that, over time, firms that dominate their labor markets were less likely to share productivity gains with employees.
A one standard deviation decline in the HHI mapped to an increase in the elasticity of wages with respect to productivity of about 25 percent, from 0.38 to 0.47.

Over the course of the study period, U.S. imports from China increased. The researchers found that import competition from China, which was associated with the closure or relocation of plants in a number of industries, accelerated the trend toward greater employer concentration in some local labor markets. This finding suggests that import competition not only reduced the demand for workers who previously produced the now-imported products, but may also have depressed wages for workers in other industries in affected labor markets as a result of increased labor market concentration.

The only employees who did not experience wage stagnation in markets with high plant concentration were those who belonged to unions. About one quarter of the plants studied were unionized; the fraction was lower in the later years than in the earlier ones. Because this study focuses on workers employed by industrial firms, the fraction of workers who are union members is higher than for the U.S. labor market more broadly.

To assess the robustness of their results, the researchers compared plants in the same industry owned by the same company but operating in different locations; they found that "those located in a more concentrated local labor market pay significantly lower wages."
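The HHI used in the paper is simple to state in code; the employment counts below are hypothetical, chosen only to illustrate the measure:

```python
# Herfindahl-Hirschman Index as described in the summary: the sum of the
# squared employment shares of all firms in a local labor market.

def hhi(employment):
    """HHI on a 0-1 scale: sum of squared shares of each firm's employment."""
    total = sum(employment)
    return sum((e / total) ** 2 for e in employment)

# A county with one dominant employer vs. a more competitive county.
concentrated = hhi([900, 50, 50])          # ~0.815
competitive = hhi([250, 250, 250, 250])    # 0.25
monopsony = hhi([1000])                    # 1.0 -- the pure monopsony case
```

A market approaching the pure monopsony case (one firm hiring all workers) has an HHI of 1 on this scale, which is where the paper finds most of the wage decline occurs.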


In Concentration in U.S. Labor Markets: Evidence from Online Vacancy Data (NBER Working Paper No. 24395), José A. Azar, Ioana Marinescu, Marshall I. Steinbaum, and Bledi Taska found that in most locations employers have substantial monopsony power. The researchers studied job vacancies in the 709 federally delineated commuting zones, which depict the bounds of local economies. Drawing on a database compiled by Burning Glass Technologies from 40,000 employment websites, they calculated the level of labor market concentration by commuting zone, occupation, and quarter for the year 2016. They selected the top 200 occupations as classified by the Bureau of Labor Statistics' six-digit code, capturing 90 percent of the job postings in the database. As a yardstick for labor market concentration, the study calculated the Herfindahl-Hirschman Index measure, similar to the application in Working Paper 24307.

The results suggested that the higher the market concentration, the stronger an employer's bargaining position. The average market had the equivalent of 2.5 recruiting employers. Under the standards that federal antitrust officials use when determining whether product markets exhibit excessive levels of concentration, 54 percent of the markets were highly concentrated, meaning they had the equivalent of fewer than four firms recruiting employees. Eleven percent of markets were moderately concentrated, and only 35 percent had low concentration.

Nationwide, among the 30 largest occupations, marketing managers, web developers, and financial analysts faced the least favorable job markets; markets were most favorable for registered nurses, tractor-trailer drivers, and customer service representatives. The actual picture for job seekers, however, was brighter than these figures would indicate because commuting zones vary widely in employment levels.
Commuting zones encompassing large cities had lower levels of labor market concentration than those around small cities or in rural areas. Accounting for the unequal distribution of employment, the researchers found that 23 percent of the national workforce is in highly or moderately concentrated labor markets. They argue that traditional market concentration thresholds underestimate workers' loss of bargaining power over time. They point out that those thresholds are geared to gauging the impact of mergers on the consumer marketplace, and that while consumers can buy products without the producers' explicit agreement, workers must find employers who agree to hire them.
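The "equivalent number of recruiting employers" is just the inverse of the HHI, and the concentration categories can be sketched as follows. The numeric cutoffs (1,500 and 2,500 on the conventional 0-10,000 scale) are the standard federal merger-guideline values; the summary describes them only qualitatively, so their use here is this sketch's assumption:

```python
# Equivalent-firms measure and concentration categories for labor markets,
# following the summary's description of the federal antitrust standards.

def equivalent_firms(hhi: float) -> float:
    """Number of equal-sized firms that would produce this HHI (0-1 scale)."""
    return 1.0 / hhi

def classify(hhi_10k: float) -> str:
    """Classify a market by its HHI on the 0-10,000 scale."""
    if hhi_10k > 2500:
        return "highly concentrated"
    if hhi_10k > 1500:
        return "moderately concentrated"
    return "low concentration"

# The average market's 2.5 "recruiting employers" implies HHI = 1/2.5 = 0.4,
# i.e. 4,000 on the 10,000 scale.
avg_hhi = 1 / 2.5
print(classify(avg_hhi * 10_000))  # highly concentrated
```

Note that "fewer than four equivalent firms" corresponds exactly to an HHI above 2,500 (since 1/4 of the market squared, four times over, gives 0.25), which is why the 54 percent figure lines up with the "highly concentrated" threshold.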