- Partisanship, Parasites, and Polarization - Paul Krugman
- Trade and growth in the Iron Age - VoxEU
- Socialism in one country - Stumbling and Mumbling
- On the Economics of the Google Android Case - ProMarket
- Minutes of the Federal Open Market Committee - The Fed
- The myth of intertemporal labour supply substitution - VoxEU
- The price impact of removing the penny - Bank Underground
- Duy: "Bostic Throws Down the Gauntlet" - Calculated Risk
- A Bee Industry Update - Tim Taylor
- Parrots' economics - EurekAlert!
Thursday, August 23, 2018
Tuesday, August 21, 2018
- The G.O.P.’s Climate of Paranoia - Paul Krugman
- The biggest economic policy mistake of the last decade - mainly macro
- The Slippery Slope of Complicity - Paul Krugman
- The IT revolution and the globalisation of R&D - VoxEU
- Sources of Finance: Internal versus External - Cecchetti & Schoenholtz
- Framing Turkey’s Financial Vulnerabilities - Brad Setser
- The making of the Federal Open Market Committee - VoxEU
- Central bank digital currency: Why it matters and why not - VoxEU
Sunday, August 19, 2018
Atif Mian and Amir Sufi at VoxEU:
Credit supply and housing speculation, by Atif Mian and Amir Sufi, VoxEU: Charles P. Kindleberger, who was the world’s leading expert on financial crises, wrote that “asset price bubbles depend on the growth in credit” (Kindleberger and Aliber 2005). Nobel prize winner Vernon Smith described evidence from experimental settings showing that the size of a bubble increased when individuals were allowed to borrow (Porter and Smith 1994). Economic theorists have taken this lesson to heart, writing down models in which easier credit helps fuel asset prices through an increase in speculative buying (Allen and Gorton 1993, Allen and Gale 2000).
A core idea in the theory of credit and bubbles is that easier credit allows optimists with high asset valuations to aggressively buy assets, and therefore boost the price (Geanakoplos 2010, Simsek 2013). Even if optimists form a small part of the overall population, easier credit can allow this small group to have a large effect on the market. Further, if the optimists suddenly lose access to credit, the price of the asset will collapse before more pessimistic individuals can be induced to buy the asset. As a result, fluctuations in credit availability increase the amplitude of fluctuations in asset prices.
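The mechanism can be sketched with a toy marginal-buyer model (my illustration, not a model from the paper): the price is set by the most optimistic buyer who can actually finance a purchase, so relaxing a loan-to-value constraint lets a single optimist move the whole market.

```python
# Toy illustration of the optimists-and-credit mechanism (not from the paper).
# Each buyer can spend at most wealth / (1 - max_ltv); the market price is the
# highest valuation among buyers, capped by what each buyer can finance.

def market_price(valuations, wealth, max_ltv):
    affordable = [min(v, w / (1 - max_ltv)) for v, w in zip(valuations, wealth)]
    return max(affordable)

valuations = [100, 100, 100, 150]  # three pessimists and one optimist
wealth = [20, 20, 20, 20]

print(market_price(valuations, wealth, max_ltv=0.5))  # tight credit: price 40
print(market_price(valuations, wealth, max_ltv=0.9))  # easy credit: price 150
```

With tight credit the financing cap binds for everyone; with easy credit the lone optimist sets the price, and withdrawing credit collapses the price before pessimists step in, which is the amplification the paragraph describes.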
Our recent study tests this idea, focusing on the boom and bust in house prices from 2000 to 2010 in the US (Mian and Sufi 2018). The study focuses on a natural experiment: the sudden acceleration of the private label mortgage securitisation (PLS) market in the late summer of 2003. The sudden rise in the PLS market, which was part of the broader global rise in shadow banking during this period, disproportionately reduced the cost of financing for lenders that did not traditionally rely on deposit financing for mortgage lending. The study shows that lenders who traditionally relied on non-deposit financing, such as Countrywide and Ameriquest Mortgage Company, suddenly boosted mortgage lending in the late summer of 2003, just as the PLS market accelerated.
To test the effect of this sudden increase in credit availability on the housing market, we exploit variation across geographic areas in the US in the location of these lenders as of 2002. Zip codes where lenders traditionally relied on non-deposit financing witnessed a sudden and large relative increase in mortgage lending just as the PLS market accelerated in 2003. Our study shows several results that suggest this is a clean experiment – the sudden and large expansion of mortgage lending in these zip codes was due to the acceleration of the PLS market, as opposed to some other factor such as a change in income prospects or beliefs about house prices among those living in these zip codes.
Consistent with models in which credit availability affects asset prices, the sharp rise in mortgage lending in these zip codes generated a boom and bust in house prices. In fact, exposure of a zip code to non-traditional lenders in 2002 predicted the severity of the collapse in house prices from 2006 to 2010.
Furthermore, US cities that had greater exposure to these lenders were more likely to experience a simultaneous increase in both house prices and construction activity during the boom. The presence of such bubble cities, such as Las Vegas and Phoenix, has puzzled economists because in most standard models the ability to easily construct more housing units should put a lid on house price growth. The results of our study suggest that easier credit was a crucial ingredient in explaining bubble cities that had both house price and construction booms. We further show that these cities witnessed a particularly painful bust from 2006 to 2010.
A unique advantage of our study is the ability to track the marginal buyers of homes that were brought into the market by easier credit. Zip codes more exposed to the acceleration of the PLS market witnessed a substantial increase in transaction volume from 2003 to 2006, and this increase in volume was almost completely driven by flippers (i.e. individuals who buy and sell multiple homes in a short period of time). Such flippers were a small fraction of the overall population – by our estimate, flippers made up less than 1% of the overall adult population in 2005 and 2006. Despite being a small part of the overall population, flippers had a disproportionate effect on the housing market because they were able to easily obtain credit.
The results support models in which easier credit can boost asset prices by giving a small group of aggressive buyers the ability to affect the overall market. In the presence of easy credit, it is not necessary for there to be a widespread increase in optimism about the housing market to generate a large increase in house prices.
Evidence from the Michigan Survey of Consumers supports this conclusion. As has been shown in previous research (Piazzesi and Schneider 2009), the fraction of the overall population who said “it is a good time to buy a home” actually declined substantially from 2003 to 2006 during the heart of the housing boom. We add to this evidence by showing that the share of individuals saying “now is a good time to buy a home” declined most in cities that experienced a large rise in house prices fuelled by the PLS market. On average, individuals became increasingly pessimistic about the housing market in cities where the PLS market fuelled a trading frenzy by flippers. Easy credit allowed a small group of individuals to boost house prices in some cities even though the average individual in these cities soured on the housing market.
Flipping fuelled by the PLS market was a crucial factor that instigated the mortgage default crisis. As early as 2007, flippers in zip codes most exposed to the PLS market had default rates above 20%. The share of all mortgage defaults from zip codes most exposed to the PLS market increased in 2007. By 2008 and 2009, defaults were rising throughout the country, but the evidence suggests that the mortgage default crisis was triggered by defaults emanating from the PLS market.
The bust also provides important lessons for the interaction of credit and asset prices. While almost all buyers in zip codes most exposed to the PLS market used a mortgage to buy a home from 2003 to 2006, the share of cash-buyers increased sharply in 2007 and afterward. This pattern is consistent with the idea that prices collapsed in part because tighter credit prevented optimists from buying homes during the sell-off, which meant more pessimistic cash-buyers became the marginal price setters. Loose credit boosted prices during the boom, and tight credit exacerbated the bust. Credit fluctuations and asset price fluctuations are closely connected.
Allen, F and D Gale (2000), “Bubbles and crises”, The Economic Journal 110(460): 236-255.
Allen, F and G Gorton (1993), “Churning bubbles”, The Review of Economic Studies 60(4): 813-836.
Geanakoplos, J (2010), “The leverage cycle”, NBER Macroeconomics Annual 24(1): 1-66.
Kindleberger, C P and R Z Aliber (2005), Manias, panics and crashes: A history of financial crises, Palgrave Macmillan.
Mian, A and A Sufi (2018), “Credit Supply and Housing Speculation”, NBER Working Paper 24823.
Piazzesi, M and M Schneider (2009), “Momentum traders in the housing market: survey evidence and a search model”, American Economic Review 99(2): 406-411.
Simsek, A (2013), “Belief disagreements and collateral constraints”, Econometrica 81(1): 1-53.
Saturday, August 18, 2018
- Something Not Rotten in Denmark - Paul Krugman
- Measuring global economic activity - Jim Hamilton
- For automatic stabilizers - Stumbling and Mumbling
- Parental Assistance after Job Loss - FRB Cleveland
- Inflation targeting and large shocks - VoxEU
- Reskilling over a Lifetime - Tim Taylor
- Nominal price stickiness and real exchange rate fluctuations - VoxEU
Wednesday, August 15, 2018
- Who’s Afraid of Nancy Pelosi? - Paul Krugman
- Google Android case: Milestone or millstone? - Hal Varian
- Interest rate vs fiscal policy stabilisation - mainly macro
- Does Loyalty Pay Off? - macroblog
- Valuing ‘exclusive eyeballs’ - VoxEU
- Migrant networks boost trade - VoxEU
- Will Demographic Headwinds Hobble China’s Economy? - Liberty Street
- Can Horizontal Mergers Actually Boost Competition? - ProMarket
- Economics of Climate Change: Three Recent Takes - Tim Taylor
- Demand for automated driving technology - VoxEU
Monday, August 13, 2018
From the FRBSF:
FedViews: Kevin J. Lansing, research advisor at the Federal Reserve Bank of San Francisco, stated his views on the current economy and the outlook as of August 9, 2018.
The initial estimate of real GDP growth in the second quarter of 2018 came in at 4.1% at an annual rate, resulting in a growth rate of 2.8% over the past four quarters. For 2018 as a whole, we expect growth to come in just under 3%, well above our estimate of the economy’s long-run sustainable growth rate. Given the diminishing effects of federal fiscal stimulus over the next few years and the expected tightening of financial conditions, we project that growth will slow to just under 2% by 2020.
The Bureau of Labor Statistics reported that payroll employment increased by 157,000 jobs in July. Data for the previous two months were revised upward, resulting in an average job gain over the past three months of 224,000. Over the past year, the average monthly gain was 203,000 jobs. The unemployment rate edged down to 3.9% from 4% in June. We expect monthly job gains to remain above the breakeven level needed to keep pace with the growth rate of the labor force. Consequently, we expect the unemployment rate to decline further below our 4.6% estimate of the natural rate of unemployment.
Inflation over the past year is close to the Federal Open Market Committee’s (FOMC’s) 2% target. With unemployment below the natural rate and real GDP growth above its long-run sustainable pace, we expect some upward pressure on inflation over the medium term, causing the four-quarter inflation rate to slightly overshoot the 2% target in 2020.
Following the conclusion of its latest meeting on August 1, the FOMC announced its decision to maintain the target range for the federal funds rate at 1¾ to 2%. The Committee noted that recent economic activity has been strong and that risks to the economic outlook appear roughly balanced. The Committee expects further gradual increases in the target range for the federal funds rate.
Interest rates have continued to increase with the ongoing monetary policy normalization. Nevertheless, the current level of the federal funds rate remains accommodative as it stands about 50 basis points below our estimate of the “neutral” federal funds rate.
Few, if any, past recessions have been successfully predicted either by the Federal Reserve or professional forecasters. Forecasting recessions is difficult because each one tends to differ in important ways from previous episodes. Past recessions have been triggered by upward spiking oil prices, increases in policy interest rates designed to bring down high inflation, and bursting asset price bubbles.
Despite the varied triggers, recessions are typically preceded by some characteristic interest rate configurations. These include an inverted Treasury yield curve, an elevated real short-term interest rate, and a compressed credit spread (as measured by the yield difference between Baa corporate bonds and 10-year Treasury bonds).
An inverted yield curve is often observed after a sustained series of monetary policy tightening actions that serve to raise real short-term Treasury yields. Long-term Treasury yields, which reflect expectations of future economic conditions, tend to move up with short-term yields during the early phases of an economic expansion, but may stop doing so (resulting in a flat or inverted yield curve) if investors’ economic outlook becomes more pessimistic.
Corporate bond yields are typically higher than Treasury bond yields because corporate yields must compensate investors for the risk of default. During an economic expansion, default risk declines which causes the credit spread to compress. But a sustained expansion may cause investors to underestimate the risk of default, contributing to weak lending standards, excessive borrowing, and a credit spread that is too low. The onset of an economic slowdown or a recession would trigger the unwinding of such conditions. Research shows that optimistic credit market sentiment, as measured in part by a compressed credit spread, tends to predict slower economic activity at a two-year horizon.
The current interest rate configuration can be described as follows: (1) The Treasury yield curve is relatively flat but not inverted, (2) the real short-term interest rate has been increasing but is still low by historical standards, and (3) the credit spread is compressed. This configuration can be described as providing mixed signals about the future. While the first two observations suggest that the risk of a recession remains relatively low, factoring in the third observation would suggest a somewhat higher risk of a recession than otherwise.
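The three indicators in that configuration can be made concrete with a small sketch. The thresholds below are invented for illustration (FedViews does not publish cutoffs), and the input values are only roughly representative of mid-2018 conditions.

```python
# Toy classifier for the three pre-recession interest-rate indicators the
# article describes. Thresholds are illustrative, not from FedViews.

def rate_signals(yield_10y, yield_3m, real_short_rate, baa_yield):
    term_spread = yield_10y - yield_3m     # inverted curve if negative
    credit_spread = baa_yield - yield_10y  # "compressed" if unusually low
    return {
        "inverted_curve": term_spread < 0,
        "high_real_rate": real_short_rate > 2.0,          # illustrative cutoff
        "compressed_credit_spread": credit_spread < 2.0,  # illustrative cutoff
    }

# Roughly the mid-2018 configuration: flat-but-positive curve, low real
# short rate, compressed Baa-minus-Treasury credit spread (percent).
print(rate_signals(yield_10y=2.9, yield_3m=2.1, real_short_rate=0.0, baa_yield=4.8))
```

On these inputs only the credit-spread signal fires, matching the article's "mixed signals" reading: the first two indicators point to low recession risk, the third to somewhat higher risk.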
The views expressed are those of the author, with input from the forecasting staff of the Federal Reserve Bank of San Francisco. They are not intended to represent the views of others within the Bank or within the Federal Reserve System.
- Partying Like It’s 1998 - Paul Krugman
- The Economy Grew Faster in Truman’s Presidency. So What? - Robert Shiller
- Why the Fed needs a new monetary policy framework - Larry Summers
- Do Import Tariffs Help Reduce Trade Deficits? - Liberty Street
- Aging, Output Per Capita and Secular Stagnation - NBER
- Inequality in the Middle East - VoxEU
- It’s Not Just Carbon Dioxide - Economic Principals
- Human capital and the supply of agricultural workers - VoxEU
- A well-deserved tribute to Nick Rowe - Worthwhile Canadian Initiative
- Protectionism and the business cycle - VoxEU
- Opening the Toolbox: The Nowcasting Code on GitHub - Liberty Street
Friday, August 10, 2018
Supply Chains and Trade War (Very Wonkish): ...last month the IGM Forum weighed in on the issue of supply chains and trade war — the issue that the current trade war, unlike previous trade conflicts, is taking place in a world where much trade consists, not of shipments of consumer goods, but of shipments of inputs used in production. The panelists more or less unanimously agreed that the prevalence of global supply chains increases the cost of trade war. But is the consensus right?
Well, although I yield to nobody in condemning the stupidity and corruption behind Trump trade policy, I’m a bit skeptical about the supply chain concern. Or maybe the best way to say this is that there are three possible stories about how supply chains might increase the costs of trade war, and while two of them are right, I suspect that many economists are buying into the third, which isn’t.
So, what difference does a supply-chain trading system make?
One thing it does is create the possibility that protectionism will be bad mercantilism — that even in its first-round effects it will actually destroy more domestic jobs than it creates, because it creates a competitive disadvantage for domestic downstream producers. ...
Another thing the rise of global supply chains has done is to increase both total trade and the gains from trade. As a result, there is more to lose from a trade war than there was a generation ago.
But what I think many economists have in mind is something more than that.
Standard trade theory tells us that the costs of a tariff — the reduction in real income — may be calculated, approximately, as
Real income loss (as a share of GDP) = 0.5 × tariff rate × reduction in imports (as a share of GDP)
This formula suggests only moderate costs even from a major trade war. Suppose that worldwide tariffs were to rise to 40 percent, and world trade were to fall by 15 percent of world GDP, a 50% reduction from an initial trade share of 30 percent. Even so, world real income would fall by only 3 percent.
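The back-of-envelope calculation can be reproduced directly; the figures (a 40 percent tariff, imports falling by 15 percent of GDP) are the ones used above.

```python
# Standard approximation for the welfare cost of a tariff (the "Harberger
# triangle"): loss (share of GDP) = 0.5 * tariff rate * import reduction.

def real_income_loss(tariff_rate, import_reduction_share_of_gdp):
    """Deadweight loss as a share of GDP."""
    return 0.5 * tariff_rate * import_reduction_share_of_gdp

# Scenario from the text: 40% worldwide tariff, trade falling by 15% of GDP.
loss = real_income_loss(0.40, 0.15)
print(f"World real income loss: {loss:.1%}")  # 3.0%
```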
Now, what I think many economists are suggesting is that this kind of analysis understates the losses when much of that trade is in intermediate goods. But I’m pretty sure they’re wrong, at least in the medium to long run.
Let me sketch out a model here....
That said, a trade war in a supply-chain world would cause a lot of disruption, because it would lead over time to a major restructuring of industry. This would create a lot of losers, as well as some winners, perhaps more than a trade war would have in the past. But I don’t think the notion that the total loss in real income would be bigger than conventional analysis suggests holds up. Trump’s policy moves are destructive, based on ignorance, but we shouldn’t overstate their cost.
- Mass Politics and "Populism" - Brad DeLong
- Asset prices and wealth inequality - VoxEU
- A Primer on the Jones Act and American Shipping - Tim Taylor
- Trends in mortality inequality in the US and France - VoxEU
- The economic costs of the US-China trade war - VoxEU
- Empires, Past and Present - Capital Ebbs and Flows
- What Should an Economics Research Article Look Like? - Tim Taylor
- Economics of density: evidence from the Berlin Wall - Microeconomic Insights
- The devastating cost of central banks’ caution - Financial Times
- Equality for All? - Roger E.A. Farmer
Wednesday, August 08, 2018
- Immigrants use little health care, subsidize non-immigrants - EurekAlert!
- The Ahistorical Federal Reserve - J. Bradford DeLong
- No to Academic Normalization of Trump - Dani Rodrik
- Why You Should Care About Unions - The New York Times
- Labour's Bank of England problem - Stumbling and Mumbling
- How Do the Fed’s MBS Holdings Affect the Economy? - Liberty Street
- Macroeconomic Research, Present and Past - Carola Binder
- Is capitalism rigged in favour of elites? - The Economist
- Some Facts on Global Current Account Balances - Tim Taylor
- Factor Model w Time-Varying Loadings - No Hesitations
This is from Riccardo Colacito, Bridget Hoffmann, Toan Phan and Tim Sablik:
The Impact of Higher Temperatures on Economic Growth, by Riccardo Colacito, Bridget Hoffmann, Toan Phan and Tim Sablik, Economic Brief, Richmond Fed: June 2018 was the third-warmest June on average across the contiguous forty-eight states since record keeping began in 1895, according to the National Oceanic and Atmospheric Administration (NOAA). Only 1933 and 2016 saw hotter starts to the summer.
Climate scientists project that average global temperatures will rise over the coming decades, which could have a variety of environmental impacts. But what impact would higher temperatures have on the economy? To date, studies of this question have largely focused on developing countries, under the assumption that those countries are more exposed to the effects of higher temperatures. The economy in developing countries is often more reliant on agriculture or other outdoor activities, and those countries have fewer resources to devote to mitigating the effects of heat through technologies such as air conditioning. Indeed, researchers have found that higher temperatures have significant negative effects on the economic growth of developing nations.1
In the case of developed countries, such as the United States, researchers have focused largely on measuring the impact of warming on outdoor economic activities, such as agriculture.2 Since these sectors make up a relatively small share of the U.S. economy, it has generally been assumed that the economic effects of global warming for the United States would be relatively small. As Nobel prize-winning economist Thomas Schelling observed in a 1992 article, "Today very little of our gross domestic product is produced outdoors, susceptible to climate."3
However, research by three authors of this Economic Brief (Colacito, Hoffmann, and Phan) finds that the consequences of higher temperatures for the U.S. economy may be more widespread than previously thought. By examining changes in temperature by season and across states, they find evidence that rising temperatures could reduce overall growth of U.S. economic output by as much as one-third by 2100.4
Warming across Seasons and across States
Attempting to measure the relationship between temperature and growth by looking at the whole United States can hide important variations. Some parts of the country have higher average temperatures. Further increasing temperatures in those areas may be more harmful than rising temperatures in parts of the country that are generally cooler. In fact, higher temperatures in colder regions or during colder seasons actually may have positive effects on economic activity because extreme cold can be as much an impediment to certain activities as extreme heat.
Highlighting the importance of these seasonal and regional variations, Colacito, Hoffmann, and Phan find no statistically significant relationship between temperature and economic growth when looking across the whole United States. But measuring the impact of temperature in different seasons and across individual states yields different results. The authors take the average of daily weather observations from NOAA for each season for 1957–2012. They define each season as a quarter of the calendar year: January through March is winter, April through June is spring, July through September is summer, and October through December is fall. This definition aligns the temperature data with the quarterly periods used for economic data.
Colacito, Hoffmann, and Phan find that temperature increases in the summer are associated with a decline in gross state product (GSP), which is the value added in production by the labor and capital of all industries in a given state. On average, each 1˚F increase in the mean summer temperature reduces the annual GSP growth rate by 0.154 percentage points. A reduction in the growth rate, as opposed to the level of economic output, has important implications for the impact of temperature changes in the long run. Changes to the growth rate compound over time and, as a result, are more lasting.
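The distinction between a level effect and a growth-rate effect is easy to see numerically. In this sketch the 0.154 percentage point figure is the paper's summer estimate, but the 3 percent baseline growth rate and 50-year horizon are my own illustrative assumptions.

```python
# Illustrative comparison: a one-time hit to the LEVEL of output vs. a
# permanent 0.154 pp reduction in the GROWTH RATE, compounded over time.
baseline_growth = 0.03   # assumed 3% annual growth (illustrative)
growth_drag = 0.00154    # 0.154 pp growth-rate reduction (paper's estimate)
years = 50

baseline_path = (1 + baseline_growth) ** years
level_path = baseline_path * (1 - growth_drag)             # one-time level hit
slowed_path = (1 + baseline_growth - growth_drag) ** years  # growth-rate hit

print(f"Level effect after {years} years:  {1 - level_path / baseline_path:.2%} below baseline")
print(f"Growth effect after {years} years: {1 - slowed_path / baseline_path:.2%} below baseline")
```

The level effect stays at 0.154 percent forever, while the growth-rate effect compounds to a gap of roughly 7 percent after 50 years under these assumptions, which is why the authors stress that changes to the growth rate are more lasting.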
As theory would suggest, Colacito, Hoffmann, and Phan also find that higher temperatures during the colder fall months have a positive effect on growth. On average, each 1˚F increase in the mean fall temperature increases the annual GSP growth rate by 0.102 percentage points. This finding is smaller and less statistically robust than their finding for the summer effect, but it may help explain why temperature changes do not appear to have a significant effect on growth when averaged across the whole year and across the whole country: the effects in the summer and fall partly offset. The authors do not find any significant effects for temperature increases in the spring or winter.
Measuring the impact of temperature changes on states as opposed to the country as a whole also reveals significant variations. Colacito, Hoffmann, and Phan divide the country into four regions — Northeast, South, Midwest, and West — using classifications from the U.S. Census Bureau. Average temperatures are highest in the South, and the authors find that the economies of southern states are the most sensitive to changes in summer and fall temperatures. Further investigation shows that this effect is not driven by a larger role of agriculture in southern states. In fact, the authors find that the economic effects of temperature are widespread across a variety of industries.
Rising Temperatures Hurt Many Industries
One might easily presume that higher temperatures would only affect agriculture. But in fact, studies have documented the effects of extreme temperatures on other industries. For example, temperatures above 90˚F have been found to reduce production at automobile manufacturing plants in the United States.5 Another study published by the Chicago Fed found that severe winter weather has a significant, albeit short-lived and generally small, negative effect on a variety of industries.6 In line with these findings, Colacito, Hoffmann, and Phan find that higher temperatures in the summer have a negative effect on labor productivity generally, while higher fall temperatures have a positive impact.
Losses in labor productivity have the potential to impact a wide range of industries, which is exactly what Colacito, Hoffmann, and Phan find. (Figure 1 shows results for 1998–2012.) The two largest sectors of the U.S. economy — services and FIRE (finance, insurance, and real estate) — make up half of national GDP and are both hurt by higher summer temperatures. More housing transactions take place in the spring and summer, perhaps because house shopping involves travel and outdoor activity. As temperatures rise, potential homebuyers may tend to stay inside, which could help explain the finding that higher summer temperatures negatively impact the real estate sector.7
Studies also have documented that high temperatures negatively affect health, resulting in increased hospitalizations.8 Colacito, Hoffmann, and Phan hypothesize that this connection may explain the finding that higher summer temperatures have a substantial impact on the insurance sector. As health outcomes worsen, insurers would face increased claims. Overall, the authors find that a 1˚F increase in temperature is associated with a 1.30 percentage point decline in output growth for the insurance sector.
As expected, the authors also find that higher summer temperatures have a large negative effect on agriculture, forestry, and fishing. Although this sector accounts for only about 1 percent of national GDP, losses in this area may spill over to other sectors of the economy, such as retail food services. Higher summer temperatures do have a positive effect on some industries, including utilities and mining, benefits that may stem from increased energy consumption during hotter days.
Although the effects estimated by Colacito, Hoffmann, and Phan are robust, they are also small in the short term. Over a longer horizon, however, the impact on GDP growth rates may be substantial. The authors study the effects of rising temperatures in the future using projections for average temperatures in the United States over the years 2070–99.9 These estimates use three different scenarios of future greenhouse gas emissions (high, medium, and low) by the Intergovernmental Panel on Climate Change. The authors apply these estimates to their analysis, assuming that states do not make any changes to adapt to or mitigate the effects of higher temperatures and that the effects of temperature on economic growth that they found in their state-by-state analysis do not change.
Under the low-emissions scenario, the authors estimate that rising temperatures would reduce the growth rate of GDP by 0.2 to 0.4 percentage points from 2070 through 2099, or as much as 10 percent of the historical average annual growth rate of 4 percent. Under the high-emissions scenario, rising temperatures could reduce the growth rate by up to 1.2 percentage points, or roughly one-third of the historical average annual GDP growth rate. (See Figure 2.) The authors note that these estimates should be "interpreted with caution," since future adaptations to changing temperatures may mute the long-run effects they calculate.
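The "10 percent" and "one-third" figures are simple ratios of the projected growth drag to the 4 percent historical average growth rate, as this quick check shows (the scenario numbers are taken from the text above).

```python
# Reproduce the scenario arithmetic: growth drag as a share of the
# historical average annual GDP growth rate of 4 percent.
historical_growth = 4.0  # percent per year

for scenario, drag_pp in [("low emissions", 0.4), ("high emissions", 1.2)]:
    share = drag_pp / historical_growth
    print(f"{scenario}: {drag_pp} pp drag = {share:.0%} of historical average growth")
```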
While the impact of future climate adaptations is unknown, Colacito, Hoffmann, and Phan do examine whether more widespread climate adaptation within their sample period may have reduced the impact of temperature on growth. In fact, they find that the negative impact of higher summer temperatures is larger and still statistically significant after 1990, while the positive fall effect becomes smaller and statistically indistinguishable from zero. Thus, if anything, they find that the negative impact of temperature increases on GDP growth has become more pronounced in recent decades despite advances in adaptive measures.
Overall, these findings suggest that rising temperatures in the future could hamper economic growth in a variety of industries even in developed nations such as the United States.
Riccardo Colacito is an associate professor of finance and economics at the University of North Carolina, Chapel Hill, and Bridget Hoffmann is an economist in the Research Department at the Inter-American Development Bank. Toan Phan is an economist and Tim Sablik is an economics writer in the Research Department at the Federal Reserve Bank of Richmond.
1. John Luke Gallup, Jeffrey D. Sachs, and Andrew D. Mellinger, "Geography and Economic Development," International Regional Science Review, August 1999, vol. 22, no. 2, pp. 179–232; William D. Nordhaus, "Geography and Macroeconomics: New Data and New Findings," Proceedings of the National Academy of Sciences of the United States of America, March 2006, vol. 103, no. 10, pp. 3510–3517; Melissa Dell, Benjamin F. Jones, and Benjamin A. Olken, "Temperature Shocks and Economic Growth: Evidence from the Last Half Century," American Economic Journal: Macroeconomics, July 2012, vol. 4, no. 3, pp. 66–95.
2. See, for example, Marshall Burke and Kyle Emerick, "Adaptation to Climate Change: Evidence from U.S. Agriculture," American Economic Journal: Economic Policy, August 2016, vol. 8, no. 3, pp. 106–140.
3. Thomas C. Schelling, "Some Economics of Global Warming," American Economic Review, March 1992, vol. 82, no. 1, pp. 1–14.
4. Riccardo Colacito, Bridget Hoffmann, and Toan Phan, "Temperature and Growth: A Panel Analysis of the United States," Federal Reserve Bank of Richmond Working Paper No. 18-09, March 2018.
5. Gerard P. Cachon, Santiago Gallino, and Marcelo Olivares, "Severe Weather and Automobile Assembly Productivity," Columbia Business School Research Paper No. 12/37, December 2012.
6. Justin Bloesch and François Gourio, "The Effect of Winter Weather on U.S. Economic Activity," Federal Reserve Bank of Chicago Economic Perspectives, First Quarter 2015, vol. 39, no. 1, pp. 1–20.
7. L. Rachel Ngai and Silvana Tenreyro, "Hot and Cold Seasons in the Housing Market," American Economic Review, December 2014, vol. 104, no. 12, pp. 3991–4026.
8. See, for example, Ekta Choudhary and Ambarish Vaidyanathan, "Heat Stress Illness Hospitalizations — Environmental Public Health Tracking Program, 20 States, 2001–2010," Morbidity and Mortality Weekly Report, Surveillance Summaries, December 12, 2014, vol. 63, no. 13.
9. Temperature estimates come from Evan H. Girvetz, Chris Zganjar, George T. Raber, Edwin P. Maurer, Peter Kareiva, and Joshua J. Lawler, "Applied Climate-Change Analysis: The Climate Wizard Tool," PLoS One, December 2009, vol. 4, no. 12, e8320.
Views expressed in this article are those of the authors and not necessarily those of the Federal Reserve Bank of Richmond or the Federal Reserve System.
Ajay Agrawal, Joshua Gans, and Avi Goldfarb at VoxEU:
Economic policy for artificial intelligence: Artificial intelligence (AI) technologies have advanced rapidly over the past several years, and governments around the world have responded by developing AI strategies. France released its national AI strategy in March 2018, emphasising research funds, ethical issues, and inequality. China has stated a goal of being the top AI country by 2030. The EU, Canada, Japan, the Obama administration, the Trump administration, and many others have put forth their own plans (Sutton 2018).
Pessimistic views of the impact of AI on society are widespread. Elon Musk, Stephen Hawking, Bill Gates, and others warn that rapid advances in AI could transform society for the worse. More optimistically, AI could enhance productivity so dramatically that people have plenty of income and little unpleasant work to do (Stevenson 2018). Regardless of whether one adopts a pessimistic or optimistic view, policy will shape how AI affects society.
What is AI?
While the Oxford English Dictionary defines artificial intelligence as “the theory and development of computer systems able to perform tasks normally requiring human intelligence”, the recent excitement is driven by advances in machine learning, a field of computer science focused on prediction. As machine learning pioneer Geoffrey Hinton put it: “Take any old problem where you have to predict something and you have a lot of data, and deep learning is probably going to make it work better than existing techniques”.1 Recent advances in AI can therefore be seen as a drop in the cost of prediction. Because prediction is an important input into decision-making, in recent work we discuss how AI is likely to have widespread consequences as a general purpose technology (Agrawal et al. 2018a, 2018b).
There are two aspects of AI policy.
- First, regulatory policy has an impact on the speed of diffusion of the technology and the form that the technology takes.
- Second, a number of policies focus on mitigating potential negative consequences of AI with respect to labour markets and antitrust concerns.
Policies that will influence the diffusion of AI
Liability rules will impact the diffusion of AI (Galasso and Luo 2018). Firms will be less likely to invest in the development of AI products in the absence of clear liability rules. Autonomous vehicles provide a useful example. A number of different companies will participate in the development of a self-driving car. If a car gets into an accident, would the sensor manufacturer be liable? The telecommunications provider? The vehicle manufacturer? Or perhaps an AI software firm? Without clear rules on who is liable, all may hesitate to invest. If autonomous vehicles would save lives, should manufacturers of non-autonomous vehicles be held to higher standards than current law requires? This would accelerate diffusion of the safer technology. In contrast, if increased liability falls primarily on the newer technology, then diffusion will slow.
In addition, similar to other technologies, advances will be faster with more research support, well-balanced intellectual property law, and the ability to experiment in a safe way.
Policies that address the consequences of AI
A common worry about AI concerns the potential impact on jobs. If machines can do tasks normally requiring human intelligence, will there be jobs left for humans? In our view, this is the wrong question. There are plenty of horrible jobs. Furthermore, more leisure is generally considered to be a positive development, although some have raised concerns about the need to find alternate sources of meaning (Stevenson 2018). The most significant long-run policy issues relate to the potential changes to the distribution of the wealth generated by the widespread use of AI. In other words, AI may increase inequality.
If AI is like other types of information technology, it is likely to be skill-biased. The people who benefit most from AI will be educated people who already are doing relatively well. These people are also more likely to own the machines. Policies to address the consequences of AI for inequality relate to the social safety net. While some have floated relatively radical ideas to deal with the potential increase in inequality – such as a tax on robots and a universal basic income – the AI context is not unique in weighing the costs and benefits of social programmes from progressive taxation to universal healthcare.
In the shorter run, if AI diffuses widely, the transition could mean temporary displacement for many workers. Acemoglu and Restrepo (2018) emphasise a short- and medium-term mismatch between skills and technology. This means that policy preparation in advance of the diffusion of AI should consider both business cycles and education policy. Technology-driven layoffs concentrated in location and time are not unique to AI. They were a feature of factory automation and the mechanization of farming. For education policy, there are many open questions. Should we emphasise social skills and the humanities if machines increasingly are able to do technology-related prediction tasks? Should the education system evolve to focus more on adults? How do the skills needed as AI diffuses differ from the skills currently provided through the education system?
Another policy question around the diffusion of AI relates to whether it will lead to monopolisation of industry. The leading companies in AI are large in terms of revenue, profits, and especially market capitalisation (high multiples on earnings). This has led to an increase in antitrust scrutiny of the leading technology firms from governments (particularly the European Commission) and in the press (see, for example, The Economist’s 20 January 2018 cover story, “The new titans, and how to tame them”, and their subsequent story, “The market for driverless cars will head towards monopoly”, on 7 June 2018). Much of this antitrust scrutiny focuses on the role of these firms as platforms, not on their use of AI per se. The feature that makes AI different is the importance of data. Firms with more data can build better AI. Whether this leads to economies of scale and the potential for monopolisation depends on whether a small lead early in the development cycle creates a positive feedback loop and a long-run advantage.
Much of economic policy for AI is simply economic policy. For the diffusion of AI, it resembles innovation policy. For the consequences of AI, it resembles public policy (the social safety net) and competition policy (antitrust). We summarise aspects of economic policy for AI in Table 1.
Table 1 Aspects of economic policy for artificial intelligence
Although AI is like other technologies in many respects, it is unusual in a few important dimensions. Specifically, AI is both a general purpose technology (GPT) – i.e. it has a wide domain of applications – and an ‘invention of a method of invention’ (IMI) (Cockburn et al. 2018, Agrawal et al. 2018). Cockburn et al. assert that “… the arrival of a general purpose IMI is a sufficiently uncommon occurrence that its impact could be profound for economic growth and its broader impact on society.” They assemble and analyse the corpus of scientific papers and patenting activity in AI, and provide evidence consistent with the characterisation of machine learning as both a GPT and an IMI.
The implication concerns the returns to investments in AI policy design. Due to the breadth of applications, the cost of suboptimal policy design will likely be significantly higher than with other technologies – or the benefits of optimal policy greater. Furthermore, the returns to investments in policy design are not only a function of the direct effects, where AI “directly influences both the production and the characteristics of a wide range of products and services”, but also the indirect effects, because “AI also has the potential to change the innovation process itself, with consequences that may be equally profound, and which may, over time, come to dominate the direct effect” (Cockburn et al. 2018).
Authors’ note: The points we raise in this column are based on Agrawal et al. (2018a), which in turn builds on discussions at the 2017 NBER Conference on the Economics of AI in Toronto and the associated conference volume (Agrawal et al. 2018c).
Acemoglu, D, and P Restrepo (2018), “Artificial Intelligence, Automation and Work”, in A Agrawal, J Gans, and A Goldfarb (eds), The Economics of Artificial Intelligence: An Agenda, University of Chicago Press.
Agrawal, A, J Gans, and A Goldfarb (2018a), “Economic Policy for Artificial Intelligence”, NBER Working Paper 24690.
Agrawal, A, J Gans, and A Goldfarb (2018b), Prediction Machines: The Simple Economics of Artificial Intelligence, Harvard Business School Press.
Agrawal, A, J Gans, and A Goldfarb (eds) (2018c), The Economics of Artificial Intelligence: An Agenda, University of Chicago Press.
Agrawal, A, J McHale, and A Oettl (2018), “Finding Needles in Haystacks: Artificial Intelligence and Recombinant Growth”, in A Agrawal, J Gans, and A Goldfarb (eds), The Economics of Artificial Intelligence: An Agenda, University of Chicago Press.
Cockburn, I, R Henderson, and S Stern (2018), “The Impact of Artificial Intelligence on Innovation”, in A Agrawal, J Gans, and A Goldfarb (eds), The Economics of Artificial Intelligence: An Agenda, University of Chicago Press.
Galasso, A, and H Luo (2018), “Punishing Robots: Issues in the Economics of Tort Liability and Innovation in Artificial Intelligence”, in A Agrawal, J Gans, and A Goldfarb (eds), The Economics of Artificial Intelligence: An Agenda, University of Chicago Press.
Goldfarb, A, and C Tucker (2012), “Privacy and Innovation”, in J Lerner and S Stern (eds), Innovation Policy and the Economy, Volume 12, NBER, University of Chicago Press: 65-89.
Sutton, T (2018), “An Overview of AI Strategies”, Medium, 28 June.
Stevenson, B (2018), “AI, Income, Employment, and Meaning”, in A Agrawal, J Gans, and A Goldfarb (eds), The Economics of Artificial Intelligence: An Agenda, University of Chicago Press.
1. https://www.youtube.com/watch?v=2HMPRXstSvQ (accessed 22 May 2018).
Tuesday, August 07, 2018
Trump hasn’t prepared us for the inevitable economic slowdown, Washington Post: President Trump regularly and proudly takes credit for the U.S. economy’s strong performance. And with rapid growth during the second quarter, the stock market strong, the unemployment rate back below 4 percent and the midterm elections looming, Trump’s rhetoric and that of his supporters will probably escalate in coming months.
In fact, however, the president receives more of a boost from the strong economy than the other way around. This conclusion will only be reinforced if Trump’s current steps toward a trade war retard U.S. economic performance, as is increasingly feared. ...
Fiscal stimulus is like a drug with tolerance effects; to keep growth constant, deficits have to keep getting larger. Some combination of gathering foreign storm clouds, the end of growing fiscal stimulus and the delayed effect of tightening monetary policies may converge to slow or end the expansion.
The choices this administration is making invite foreign retaliation against U.S. exporters and use up fiscal capacity — even as the economy is growing rapidly. Because of this, and because there is limited room for monetary policy, the country will not be in a position to respond strongly if a downturn comes. All the more reason, therefore, to avoid pulling demand forward.
This is all quite dangerous. The president has taken credit for far more economic success than he deserves. He will disproportionately be blamed when the downturn comes. What follows will be a test of our democracy.
David Altig, Nick Bloom, Steven J. Davis, Brent Meyer, and Nick Parker at the Atlanta Fed's macroblog:
Are Tariff Worries Cutting into Business Investment?: "Nobody's model does a very good job of how uncertainty and hits to confidence affect behavior," says Deutsche Bank's Peter Hooper in a recent Wall Street Journal article. Count us as sympathetic to his viewpoint.
That's one reason why a few of us at the Atlanta Fed created a national survey of firms in collaboration with Nick Bloom of Stanford University and Steven Davis of the University of Chicago Booth School of Business. Our Survey of Business Uncertainty (SBU) elicits information about each firm's expectations and uncertainty regarding its own future capital expenditures, sales growth, employment, and costs.
A pressing issue at the moment is whether, and how, firms are reassessing their capital investment plans in light of recent tariff hikes and fears of more to come. By raising input costs, domestic tariff hikes undercut the business case for some investments. At the same time, they can raise domestic investment in newly protected industries. Retaliatory tariff hikes by trading partners can also affect domestic investment by curtailing the demand for U.S. exports. An uncertain outlook for trade policy can cause firms in all industries to delay investments while they wait to see how trade policy disputes unfold.
Last month's SBU (previously known as our Survey of Business Executives) sheds some light on these matters. We first posed a simple question: "Have the recently announced tariff hikes or concerns about retaliation caused your firm to reassess its capital expenditure plans?" Yes, said about one-fifth of our respondents.
As exhibit 1 shows, the share of firms reassessing their capital plans because of tariff worries is higher for goods-producing firms than service-providers. It's 30 percent for manufacturers and 28 percent in retail & wholesale trade, transportation and warehousing. In contrast, it's only 14 percent among all service providers in our sample. These sectoral patterns make sense, given that manufacturing firms, for example, are more engaged in international commerce than most service providers.
We also asked firms how they are reassessing their capital expenditure plans in light of tariff worries. Exhibit 2 provides information on this issue. Among firms reassessing, 67 percent have placed some of their previously planned capital expenditures for 2018–19 "under review," 31 percent have "postponed" or "dropped" previously planned expenditures, 14 percent have "accelerated" their plans, and 2 percent (one firm) added new capital expenditures for 2018–19.
Finally, we asked firms how much tariff worries affect their previously planned capital expenditures. Among firms re-assessing, an average 60 percent of their capital expenditure plans are affected. The predominant form of reassessment is placing previously planned capital expenditures "under review."
Let's sum up: About one-fifth of firms in the July 2018 SBU say they are reassessing capital expenditure plans in light of tariff worries. Among this one-fifth, firms have reassessed an average 60 percent of capital expenditures previously planned for 2018–19. The main form of reassessment thus far is to place previously planned capital expenditures under review. Only 6 percent of the firms in our full sample report cutting or deferring previously planned capital expenditures in reaction to tariff worries. These findings suggest that tariff worries have had only a small negative effect on U.S. business investment to date.
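The full-sample figures in this summary follow from compounding the subgroup shares reported above. A minimal back-of-the-envelope sketch (the share values are rounded figures taken from the text, used purely for illustration):

```python
# Back-of-the-envelope check of the SBU shares quoted above.
# All shares are rounded figures from the text, for illustration only.

share_reassessing = 0.20           # ~one-fifth of all firms reassessing plans
share_postponed_or_dropped = 0.31  # of reassessing firms: postponed or dropped
share_under_review = 0.67          # of reassessing firms: plans placed under review

# Share of the FULL sample cutting or deferring investment:
full_sample_cut = share_reassessing * share_postponed_or_dropped
print(round(full_sample_cut, 2))   # ~0.06, the "only 6 percent" in the text

# Share of the full sample with plans under review:
full_sample_review = share_reassessing * share_under_review
print(round(full_sample_review, 2))  # ~0.13, close to the 12 percent reported
```

The small gap between the computed 13 percent and the reported 12 percent presumably reflects rounding in the published shares.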
Still, there are sound reasons for concern. First, 30 percent of manufacturing firms report reassessing capital expenditure plans because of tariff worries, and manufacturing is highly capital intensive. So the investment effects of trade policy frictions are concentrated in a sector that accounts for much of business investment. Second, 12 percent of the firms in our full sample report that they have placed previously planned capital expenditures under review. Third, trade policy tensions between the United States and China have only escalated since our survey went to field. The negative effects of tariff worries on U.S. business investment could easily grow.
Monday, August 06, 2018
- Notes on a Butter Republic - Paul Krugman
- An Interview with Avinash Dixit - The Politic
- Trump v. Fed - Cecchetti & Schoenholtz
- The robot paradox - Stumbling and Mumbling
- The ethnic segregation of immigrants in the US from 1850 to 1940 - VoxEU
- The Emergence and Erosion of the Retail Sales Tax - Tim Taylor
- Are Trump’s Policies Hurting Long-Term US Growth? - Kenneth Rogoff
- How Do the Fed's MBS Purchases Affect Credit Allocation? - Liberty Street
- They Want What We’ve Got - Economic Principals
Friday, August 03, 2018
- Stop Calling Trump a Populist - Paul Krugman
- Productivity Measurement Initiative - Brookings
- Why Every Good Economist Should Be Feminist - ProMarket
- Work Requirements Hurt Poor Families—and Won’t Work - Jason Furman
- The Hidden Danger for Trump in the Economy’s Growth Spurt - John Cassidy
- How China beat the Global Financial Crisis - mainly macro
Economy Adds 157,000 Jobs in July, Little Evidence of Pick-up in Wage Growth: Unemployment rates for workers without a high school degree hit a record low as less-educated workers continue to be the biggest job gainers in the recovery.
The Bureau of Labor Statistics (BLS) reported the economy added 157,000 jobs in July. With upward revisions to the data from the prior two months, the average gain over the last three months was 224,000. The unemployment rate edged down to 3.9 percent as most of the rise in unemployment in June, which was due to increased labor force participation, was reversed. The employment-to-population (EPOP) ratio rose to 60.5 percent, a new high for the recovery.
In spite of the healthy pace of job growth and the low unemployment rate, there continues to be little evidence of accelerating wage growth. Over the last year, the average hourly wage has risen by 2.7 percent. There is a very small uptick to 2.87 percent if we annualize the rate of wage growth for the last three months (May, June, and July) compared with the prior three months (February, March, and April).
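The annualization in the previous paragraph compounds a three-month (quarter-over-quarter) growth rate over four quarters. A minimal sketch; the quarterly rate below is an assumed value chosen to reproduce the 2.87 percent figure, not taken from BLS data:

```python
# Annualizing a three-month wage growth rate, as described above.
# g_q is an illustrative assumption (avg hourly wage, May-Jul vs. Feb-Apr),
# chosen so the result matches the 2.87 percent figure in the text.

def annualize_quarterly(g_quarterly: float) -> float:
    """Compound a quarter-over-quarter growth rate over four quarters."""
    return (1 + g_quarterly) ** 4 - 1

g_q = 0.00710  # assumed three-month growth rate (~0.71 percent)
print(f"{annualize_quarterly(g_q):.2%}")  # → 2.87%
```

Compounding, rather than simply multiplying by four, is what makes the annualized figure slightly exceed four times the quarterly rate.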
Interestingly, there was a modest fall in hours in July, which led to a decline in the index of aggregate weekly hours from 110.0 to 109.8. As a result, the average weekly wage actually declined slightly in July. ...
In spite of the complaints about labor shortages in sectors such as manufacturing and trucking, we continue to see little evidence of them in wage growth. The average hourly wage for production workers in manufacturing has risen by just 2.7 percent over the last year, while in the larger trucking and warehousing category it has risen less than 2.5 percent.
The story on the household side was overwhelmingly positive. In addition to the rise in EPOPs, the number of involuntary part-time workers fell by 176,000 to a new low for the recovery. The percentage of unemployment due to voluntary quits rose to 13.5 percent, largely reversing a drop in June. The unemployment rate for Hispanic workers fell to 4.5 percent, a new record low.
Looking at 10-year age spans for prime-age workers, EPOPs have been rising for both men and women, although only women between the ages of 25 and 34 have recovered to their prerecession peak EPOP. Even this group is still slightly below its 2000 peak. The trends in EPOPs suggest there is further room for employment to expand.
The unemployment rate for workers without a high school degree fell to 5.1 percent in July, the lowest rate since the BLS adjusted its education measures in 1992. This is 1.9 percentage points below its year-ago rate.
Less-educated workers have been the big gainers in terms of employment in the last few years of the recovery. While the unemployment rate for workers with less than a high school degree is well below the prerecession level and even its 2000 low, the unemployment rate for workers with a college degree, at 2.2 percent, is still above its prerecession low of 1.8 percent and well above its 2000 low of 1.5 percent.
Workers with just a high school degree also seem to be doing relatively better, with a 4.0 percent unemployment rate matching the prerecession low (it had been 3.9 percent in May), although still above the 3.2 percent low hit in 1999. The idea that the labor market is becoming increasingly tilted to favor more educated workers does not appear to be supported by the employment data.
This is, again, a solid jobs report in terms of job creation and lower unemployment. However, wage growth continues to be a problem, with wages barely outpacing inflation.
Thursday, August 02, 2018
This is "a work in progress, with follow-ups on the way":
Macroeconomic Research, Present and Past, by P.J. Glandon, Ken Kuttner, Sandeep Mazumder, and Caleb Stroup, August 1, 2018: Abstract We document eight facts about published macroeconomics research by hand collecting information about the epistemological approaches, methods, and data appearing in over a thousand published papers. Macroeconomics journals have published an increasing share of theory papers over the past 38 years, with theory-based papers now comprising the majority of published macroeconomics research. The increase in quantitative models (e.g., DSGE methods) masks a decline in publication of pure theory research. Financial intermediation played an important role in about a third of macroeconomic theory papers in the 1980s and 1990s, but became less frequent until the financial crisis, at which point it once again became an important area of focus. Only a quarter of macroeconomics publications conduct falsification exercises. This finding contrasts with the year 1980, when these empirical approaches dominated macroeconomics publishing. Yet the fraction of empirical papers that rely on microdata or proprietary data has increased dramatically over the past decade, with these features now appearing in the majority of published empirical papers. All of these findings vary dramatically across individual macroeconomics field journals.
Summer 2018 Journal of Economic Perspectives Available On-line: I was hired back in 1986 to be the Managing Editor for a new academic economics journal, at the time unnamed, but which soon launched as the Journal of Economic Perspectives. The JEP is published by the American Economic Association, which back in 2011 decided--to my delight--that it would be freely available on-line, from the current issue back to the first issue. Here, I'll start with Table of Contents for the just-released Summer 2018 issue, which in the Taylor household is known as issue #125. Below that are abstracts and direct links for all of the papers. I may blog more specifically about some of the papers in the next week or two, as well.
Symposium: Macroeconomics a Decade after the Great Recession
"What Happened: Financial Factors in the Great Recession," by Mark Gertler and Simon Gilchrist At the onset of the recent global financial crisis, the workhorse macroeconomic models assumed frictionless financial markets. These frameworks were thus not able to anticipate the crisis, nor to analyze how the disruption of credit markets changed what initially appeared like a mild downturn into the Great Recession. Since that time, an explosion of both theoretical and empirical research has investigated how the financial crisis emerged and how it was transmitted to the real sector. The goal of this paper is to describe what we have learned from this new research and how it can be used to understand what happened during the Great Recession. In the process, we also present some new empirical work. We argue that a complete description of the Great Recession must take account of the financial distress facing both households and banks and, as the crisis unfolded, nonfinancial firms as well. Exploiting both panel data and time series methods, we analyze the contribution of the house price decline, versus the banking distress indicator, to the overall decline in employment during the Great Recession. We confirm a common finding in the literature that the household balance sheet channel is important for regional variation in employment. However, we also find that the disruption in banking was central to the overall employment contraction. Full-Text Access | Supplementary Materials
"Finance and Business Cycles: The Credit-Driven Household Demand Channel," by Atif Mian and Amir Sufi What is the role of the financial sector in explaining business cycles? This question is as old as the field of macroeconomics, and an extensive body of research conducted since the Global Financial Crisis of 2008 has offered new answers. The specific idea put forward in this article is that expansions in credit supply, operating primarily through household demand, have been an important driver of business cycles. We call this the credit-driven household demand channel. While this channel helps explain the recent global recession, it also describes economic cycles in many countries over the past 40 years. Full-Text Access | Supplementary
"Identification in Macroeconomics," by Emi Nakamura and Jón Steinsson This paper discusses empirical approaches macroeconomists use to answer questions like: What does monetary policy do? How large are the effects of fiscal stimulus? What caused the Great Recession? Why do some countries grow faster than others? Identification of causal effects plays two roles in this process. In certain cases, progress can be made using the direct approach of identifying plausibly exogenous variation in a policy and using this variation to assess the effect of the policy. However, external validity concerns limit what can be learned in this way. Carefully identified causal effects estimates can also be used as moments in a structural moment matching exercise. We use the term "identified moments" as a short-hand for "estimates of responses to identified structural shocks," or what applied microeconomists would call "causal effects." We argue that such identified moments are often powerful diagnostic tools for distinguishing between important classes of models (and thereby learning about the effects of policy). To illustrate these notions we discuss the growing use of cross-sectional evidence in macroeconomics and consider what the best existing evidence is on the effects of monetary policy. Full-Text Access | Supplementary Materials
"The State of New Keynesian Economics: A Partial Assessment," by Jordi Galí In August 2007, when the first signs emerged of what would come to be the most damaging global financial crisis since the Great Depression, the New Keynesian paradigm was dominant in macroeconomics. Ten years later, tons of ammunition has been fired against modern macroeconomics in general, and against dynamic stochastic general equilibrium models that build on the New Keynesian framework in particular. Those criticisms notwithstanding, the New Keynesian model arguably remains the dominant framework in the classroom, in academic research, and in policy modeling. In fact, one can argue that over the past ten years the scope of New Keynesian economics has kept widening, by encompassing a growing number of phenomena that are analyzed using its basic framework, as well as by addressing some of the criticisms raised against it. The present paper takes stock of the state of New Keynesian economics by reviewing some of its main insights and by providing an overview of some recent developments. In particular, I discuss some recent work on two very active research programs: the implications of the zero lower bound on nominal interest rates and the interaction of monetary policy and household heterogeneity. Finally, I discuss what I view as some of the main shortcomings of the New Keynesian model and possible areas for future research. Full-Text Access | Supplementary Materials
"On DSGE Models," by Lawrence J. Christiano, Martin S. Eichenbaum and Mathias Trabandt The outcome of any important macroeconomic policy change is the net effect of forces operating on different parts of the economy. A central challenge facing policymakers is how to assess the relative strength of those forces. Economists have a range of tools that can be used to make such assessments. Dynamic stochastic general equilibrium (DSGE) models are the leading tool for making such assessments in an open and transparent manner. We review the state of mainstream DSGE models before the financial crisis and the Great Recession. We then describe how DSGE models are estimated and evaluated. We address the question of why DSGE modelers—like most other economists and policymakers—failed to predict the financial crisis and the Great Recession, and how DSGE modelers responded to the financial crisis and its aftermath. We discuss how current DSGE models are actually used by policymakers. We then provide a brief response to some criticisms of DSGE models, with special emphasis on criticism by Joseph Stiglitz, and offer some concluding remarks. Full-Text Access | Supplementary Materials
"Evolution of Modern Business Cycle Models: Accounting for the Great Recession," Patrick J. Kehoe, Virgiliu Midrigan and Elena Pastorino Modern business cycle theory focuses on the study of dynamic stochastic general equilibrium (DSGE) models that generate aggregate fluctuations similar to those experienced by actual economies. We discuss how these modern business cycle models have evolved across three generations, from their roots in the early real business cycle models of the late 1970s through the turmoil of the Great Recession four decades later. The first generation models were real (that is, without a monetary sector) business cycle models that primarily explored whether a small number of shocks, often one or two, could generate fluctuations similar to those observed in aggregate variables such as output, consumption, investment, and hours. These basic models disciplined their key parameters with micro evidence and were remarkably successful in matching these aggregate variables. A second generation of these models incorporated frictions such as sticky prices and wages; these models were primarily developed to be used in central banks for short-term forecasting purposes and for performing counterfactual policy experiments. A third generation of business cycle models incorporate the rich heterogeneity of patterns from the micro data. A defining characteristic of these models is not the heterogeneity among model agents they accommodate nor the micro-level evidence they rely on (although both are common), but rather the insistence that any new parameters or feature included be explicitly disciplined by direct evidence. We show how two versions of this latest generation of modern business cycle models, which are real business cycle models with frictions in labor and financial markets, can account, respectively, for the aggregate and the cross-regional fluctuations observed in the United States during the Great Recession. Full-Text Access | Supplementary Materials
"Microeconomic Heterogeneity and Macroeconomic Shocks," by Greg Kaplan and Giovanni L. Violante In this essay, we discuss the emerging literature in macroeconomics that combines heterogeneous agent models, nominal rigidities, and aggregate shocks. This literature opens the door to the analysis of distributional issues, economic fluctuations, and stabilization policies—all within the same framework. In response to the limitations of the representative agent approach to economic fluctuations, a new framework has emerged that combines key features of heterogeneous agents (HA) and New Keynesian (NK) economies. These HANK models offer a much more accurate representation of household consumption behavior and can generate realistic distributions of income, wealth, and, albeit to a lesser degree, household balance sheets. At the same time, they can accommodate many sources of macroeconomic fluctuations, including those driven by aggregate demand. In sum, they provide a rich theoretical framework for quantitative analysis of the interaction between cross-sectional distributions and aggregate dynamics. In this article, we outline a state-of-the-art version of HANK together with its representative agent counterpart, and convey two broad messages about the role of household heterogeneity for the response of the macroeconomy to aggregate shocks: 1) the similarity between the Representative Agent New Keynesian (RANK) and HANK frameworks depends crucially on the shock being analyzed; and 2) certain important macroeconomic questions concerning economic fluctuations can only be addressed within heterogeneous agent models. Full-Text Access | Supplementary Materials
Symposium: Incentives in the Workplace
"Compensation and Incentives in the Workplace," by Edward P. Lazear Labor is supplied because most of us must work to live. Indeed, it is called "work" in part because without compensation, the overwhelming majority of workers would not otherwise perform the tasks. The theme of this essay is that incentives affect behavior and that economics as a science has made good progress in specifying how compensation and its form influence worker effort. This is a broad topic, and the purpose here is not a comprehensive literature review on each of many topics. Instead, a sample of some of the most applicable papers is discussed with the goal of demonstrating that compensation, incentives, and productivity are inseparably linked. Full-Text Access | Supplementary Materials
"Nonmonetary Incentives and the Implications of Work as a Source of Meaning," by Lea Cassar and Stephan Meier Empirical research in economics has begun to explore the idea that workers care about nonmonetary aspects of work. An increasing number of economic studies using survey and experimental methods have shown that nonmonetary incentives and nonpecuniary aspects of one's job have substantial impacts on job satisfaction, productivity, and labor supply. By drawing on this evidence and relating it to the literature in psychology, this paper argues that work represents much more than simply earning an income: for many people, work is a source of meaning. In the next section, we give an economic interpretation of meaningful work and emphasize how it is affected by the mission of the organization and the extent to which job design fulfills the three psychological needs at the basis of self-determination theory: autonomy, competence, and relatedness. We point to the evidence that not everyone cares about having a meaningful job and discuss potential sources of this heterogeneity. We sketch a theoretical framework to start to formalize work as a source of meaning and think about how to incorporate this idea into agency theory and labor supply models. We discuss how workers' search for meaning may affect the design of monetary and nonmonetary incentives. We conclude by suggesting some insights and open questions for future research. Full-Text Access | Supplementary Materials
"The Changing (Dis-)utility of Work," by Greg Kaplan and Sam Schulhofer-Wohl We study how changes in the distribution of occupations have affected the aggregate non-pecuniary costs and benefits of working. The physical toll of work is less now than in 1950, with workers shifting away from occupations in which people report experiencing tiredness and pain. The emotional consequences of the changing occupation distribution vary substantially across demographic groups. Work has become happier and more meaningful for women, but more stressful and less meaningful for men. These changes appear to be concentrated at lower education levels. Full-Text Access | Supplementary Materials
"Social Connectedness: Measurement, Determinants, and Effects," by Michael Bailey, Rachel Cao, Theresa Kuchler, Johannes Stroebel and Arlene Wong Social networks can shape many aspects of social and economic activity: migration and trade, job-seeking, innovation, consumer preferences and sentiment, public health, social mobility, and more. In turn, social networks themselves are associated with geographic proximity, historical ties, political boundaries, and other factors. Traditionally, the unavailability of large-scale and representative data on social connectedness between individuals or geographic regions has posed a challenge for empirical research on social networks. More recently, a body of such research has begun to emerge using data on social connectedness from online social networking services such as Facebook, LinkedIn, and Twitter. To date, most of these research projects have been built on anonymized administrative microdata from Facebook, typically by working with coauthor teams that include Facebook employees. However, there is an inherent limit to the number of researchers that will be able to work with social network data through such collaborations. In this paper, we therefore introduce a new measure of social connectedness at the US county level. Our Social Connectedness Index is based on friendship links on Facebook, the global online social networking service. Specifically, the Social Connectedness Index corresponds to the relative frequency of Facebook friendship links between every county-pair in the United States, and between every US county and every foreign country. Given Facebook's scale as well as the relative representativeness of Facebook's user body, these data provide the first comprehensive measure of friendship networks at a national level. Full-Text Access | Supplementary Materials
From The Economist:
Rulers of the world: read Karl Marx!: ...The chief reason for the continuing interest in Marx, however, is that his ideas are more relevant than they have been for decades. The post-war consensus that shifted power from capital to labour and produced a “great compression” in living standards is fading. Globalisation and the rise of a virtual economy are producing a version of capitalism that once more seems to be out of control. The backwards flow of power from labour to capital is finally beginning to produce a popular—and often populist—reaction. No wonder the most successful economics book of recent years, Thomas Piketty’s “Capital in the Twenty-First Century”, echoes the title of Marx’s most important work and his preoccupation with inequality. ...
From Kevin Bryan at Updated Priors:
The 2018 Fields Medal and its Surprising Connection to Economics!: The Fields Medal and Nevanlinna Prizes were given out today. They represent the highest honor possible for young mathematicians and theoretical computer scientists, and are granted only once every four years. The mathematics involved is often very challenging for outsiders. Indeed, the most prominent of this year’s winners, the German Peter Scholze, is best known for his work on “perfectoid spaces”, and I honestly have no idea how to begin explaining them aside from saying that they are useful in a number of problems in algebraic geometry (the lovely field mapping results in algebra – what numbers solve y=2x – and geometry – noting that those solutions to y=2x form a line). Two of this year’s prizes, however, the Fields given to Alessio Figalli and the Nevanlinna to Constantinos Daskalakis, have a very tight connection to an utterly core question in economics. Indeed, both of those men have published work in economics journals!
The problem of interest concerns how best to sell an object. ...
Wednesday, August 01, 2018
- Transaction Costs and Tethers: Why I’m a Crypto Skeptic - Paul Krugman
- How to Even Out the Pains and Gains from International Trade - ProMarket
- Price Level Targeting with Evolving Credibility - Brad DeLong
- The Marxist Bank of England - Stumbling and Mumbling
- Monetary Policy as a Jobs Guarantee - The Everyday Economist
"Black-white economic inequality remains large and persistent, and recent wealth inequality trends for all Americans are explained by assets, not income":
Race, and the race between stocks and homes, by Douglas Clement: Most research on long-term U.S. inequality focuses on income; relatively little examines wealth, largely due to lack of good asset data. But a June 2018 working paper from the Opportunity & Inclusive Growth Institute addresses that imbalance with a new data set developed from historical surveys, and it shows that wealth—specifically, ownership of stocks and homes—has been a central force behind U.S. inequality trends for 70 years.
...Their analysis begins by confirming the findings of other scholars: increased income polarization since the 1970s, with particular damage to the relative position of the middle class. It also sheds new light on economic inequality between blacks and whites by quantifying vast differences in wealth as well as income, and by documenting no progress in diminishing those gaps.
Perhaps the study’s most novel contribution, however, is in revealing the singular role of household portfolio composition—ownership of different asset types—in determining inequality trends. Because the primary source of middle-class American wealth is homeownership, and the main asset holding of the top 10 percent is equity, the relative prices of the two assets have set the path for wealth distribution and driven a wedge between the evolution of income and wealth.
In brief, as home prices climbed from 1950 until the mid-2000s, middle-class wealth held its own relative to upper-class wealth even as middle-class incomes stagnated. But after the financial crisis, the stock market’s quick recovery and the housing market’s slow turnaround meant soaring wealth inequality that even exceeded the last decade’s climb in income inequality. ...
Racial inequality: “The overall summary is bleak”
The demographic detail and 70-year span of the new database also permitted close analysis of racial inequality, pre- and post-civil rights eras. The picture is discouraging. Income disparities are as large now as in 1950, with black household income still just half that of white households.
The racial gap in wealth is even wider, and similarly stagnant. The median black household has less than 11 percent of the wealth of the median white household (about $15,000 versus $140,000 in 2016 prices). The economists also find that the financial crisis hit black households particularly hard.
“The overall summary is bleak,” they write. “Over seven decades, next to no progress has been made in closing the black-white income gap. The racial wealth gap is equally persistent. … The typical black household remains poorer than 80% of white households.”
The race between stocks and homes
To explain the divergent trends in income and wealth inequality before the crisis, the economists draw on a key strength of the database: It includes both income and wealth information, household-by-household, and 70 years of balance sheets with detailed portfolio composition.
With this, they find that the bottom 50 percent now holds little or negative wealth (that is, debt), and its share dropped from 3 percent of total wealth in 1950 to 1.2 percent in 2016.
For the upper half, portfolio diversification determines wealth trends. The data show that homes are the primary asset for households between the 50th and the 90th percentile, while the upper 10th also owns a large share of equities. Therefore, middle-class household wealth is strongly exposed to house price fluctuation, and the top 10 percent is more sensitive to stock market variations.
This difference in asset holdings explains how, prior to the crisis, middle-class households experienced rising wealth in parallel with the top 10th, even though their real incomes stagnated and savings were negligible. But the picture changed dramatically post-crisis. In “a race between the stock market and the housing market,” the economists write, the richest 10 percent, by virtue of a climbing stock market, enjoyed soaring post-crisis wealth, while average household wealth largely stagnated. (See figure.)
“When house prices collapsed in the 2008 crisis,” the economists conclude, the “leveraged portfolio position of the middle class brought about substantial wealth losses, while the quick rebound in stock markets boosted wealth at the top. Relative price changes between houses and equities after 2007 have produced the largest spike in wealth inequality in postwar American history.”
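The leverage mechanism the economists describe can be illustrated with a toy calculation. The numbers below are hypothetical, chosen only for the sketch, not taken from the paper: the point is that a mortgaged homeowner's net wealth moves far more than one-for-one with house prices, while an unleveraged stockholder's wealth moves only in proportion to stock prices.

```python
# Toy illustration (hypothetical numbers, not from the paper) of why leverage
# makes middle-class housing wealth so sensitive to house-price declines.

def net_wealth(assets: float, debt: float) -> float:
    """Net wealth is assets minus liabilities."""
    return assets - debt

# A stylized middle-class household: a $250k home with a $200k mortgage,
# so $50k of home equity (an 80% leveraged position).
home_value, mortgage = 250_000.0, 200_000.0
equity_before = net_wealth(home_value, mortgage)

# A 20% fall in house prices wipes out all of that equity:
equity_after = net_wealth(home_value * 0.8, mortgage)
pct_change_middle = (equity_after - equity_before) / equity_before

# A stylized top-decile household holding $250k of stocks with no debt
# suffers only the 20% asset-price loss:
stocks_before = 250_000.0
stocks_after = stocks_before * 0.8
pct_change_top = (stocks_after - stocks_before) / stocks_before

print(f"Leveraged homeowner's wealth change:   {pct_change_middle:.0%}")  # -100%
print(f"Unleveraged stockholder's wealth change: {pct_change_top:.0%}")   # -20%
```

The same 20 percent price decline erases the leveraged household's entire net worth while trimming the unleveraged portfolio by only a fifth, which is the asymmetry behind the post-2007 spike in wealth inequality when stocks rebounded and house prices did not.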
How BBC balance and bad think tanks discourage evidence based policy:
The Knowledge Transmission Mechanism (KTM) is how knowledge produced by academics and other researchers is translated into public policy. Evidence based policy is the result of this mechanism working. The media is, in theory, an important conduit for the KTM...
The rigid application of political balance in the broadcast media is in danger of negating the KTM, and therefore evidence based policy. The moment an issue (call it issue X) is deemed ‘political’ by the media, balance dictates that any view expressed on issue X is an opinion rather than knowledge. As a result, when the media want to talk to non-politicians (‘experts’) about issue X, the imperative of balance remains.
Now suppose that in the knowledge world there is in fact a consensus on issue X. That would be a problem for balance broadcasting, because it would be difficult to get an expert to argue against the consensus. The BBC overcame this problem valiantly during Brexit, using Patrick Minford (who is not known as a trade economist) time and again to balance the IMF, the OECD, more than 90% of academic opinion etc. But another way of solving this problem is to use certain think tanks.
There are two types of think tank. The good kind can be a vital part of the KTM. There is often a genuine need for think tanks to help translate academic research into policy. ... These think tanks are an important part of the KTM, because they can establish what the academic consensus is, translate academic ideas into practical policy, and match policy problems to evidence based solutions. ...
The bad kind are rather different. These produce ‘research’ that conforms to a particular line or ideology, rather than conforming to evidence or existing academic knowledge. Sometimes these think tanks can even become policy entrepreneurs, selling policies to politicians. This is often called policy based evidence making. It would be nice to be able to distinguish between good and bad think tanks in an easy way. The good type seeks to foster the KTM, and ensure policy is evidence based, and the bad type seeks to negate the KTM by producing evidence or policies that fit preconceived ideas or the policymaker’s ideology.
I would argue that transparency about funding sources provides a strong indicator of which type a think tank is. ...
Another good indicator of a bad think tank is their relationship to academia. ...
In the case of global warming the BBC has been forced ... to treat man-made climate change as a fact rather than an opinion that always has to be balanced. That is not going to happen for some time over any economic issue, however strong the academic consensus (like Brexit). This is partly because the pressure from academia is much less, and partly because there is still a prejudice against social science (as if evidence based policy making cannot occur for economic or social policy!). But the BBC does need to explain their attitude to the use of think tanks. ...