I don't think this issue can be addressed without also considering the political power of large banks -- their ability to shape legislation in their favor in a way that increases the risk of financial meltdown:
Are Banks Too Large? Maybe, Maybe Not, by Luc Laeven, Lev Ratnovski, and Hui Tong, iMFdirect: Large banks were at the center of the recent financial crisis. The public dismay at costly but necessary bailouts of “too-big-to-fail” banks has triggered an active debate on the optimal size and range of activities of banks.
But this debate remains inconclusive, in part because the economics of an “optimal” bank size is far from clear. Our recent study tries to fill this gap by summarizing what we know about large banks using data for a large cross-section of banking firms in 52 countries.
We find that while large banks are riskier, and create most of the systemic risk in the financial system, it is difficult to determine an “optimal” bank size. In this setting, we find that the best policy option may not be outright restrictions on bank size, but rather capital requirements—requiring large banks to hold more capital—and better bank resolution and governance.
Large banks increase systemic, not individual bank risk
Large banks have significantly grown in size, and become more involved in market-based activities since the late 1990s..., the balance sheet size of the world’s largest banks increased two to four-fold in the 10 years prior to the crisis. ...
Also, large banks appear to have a distinct, seemingly risky business model. They tend to simultaneously have lower capital..., less stable funding..., more market-based activities..., and be more organizationally complex..., than smaller banks. ...
In addition, our study confirms that large banks create most of the systemic risk in today’s financial system. ... Large banks create especially high systemic risk when they have insufficient capital or unstable funding. ...
Too-big-to-fail and empire building
What drives the size and the business model of large banks? Our study suggests the following:
• Implicit too-big-to-fail subsidies ... This predisposes large banks to use leverage and unstable funding, and to engage in risky market-based activities.
• Possible empire building. ...
• Economies of scale. While these are a plausible explanation for the size of large banks, recent studies suggest that they are modest. ...
Optimal bank size inconclusive
The evidence that large banks respond to too-big-to-fail and empire building incentives, and in the process create systemic risk, suggests that banks might become “too large” from a social welfare perspective. But there is an important caveat. We know too little about the value that large banks bring to their customers (e.g., large global corporations). The potential for economies of scale in large banks cannot be dismissed. As a result, we cannot draw conclusions as to the socially optimal bank size. And it also implies that outright restrictions on bank size or activities may be imprecise and hence costly. ...
Posted by Mark Thoma on Thursday, May 15, 2014 at 10:14 AM in Economics, Financial System, Market Failure |
This was in the daily links, but thought it deserved additional highlighting. It's from Miles Corak:
Joseph Fishkin’s book, “Bottlenecks,” explains why inequality lowers social mobility, by Miles Corak: [The Brookings Institution has been having an online discussion of Bottlenecks: A New Theory of Equal Opportunity, a book by Joseph Fishkin. This post is a re-blog of my contribution, "Money: a Bottleneck with Bite."]
... So far, it has been politically convenient to focus on upward mobility of children from the bottom of the income distribution, measured in some absolute sense, because it puts broader issues of the influence of inequality to one side.
The mobility-only approach puts the onus of the problem on the poor—their incomes, their work ethic, their schooling, their fertility choices, their parenting strategies—and abstracts from the broader context within which they must engage, define themselves, and raise their children. The rich are not part of this story.
But Professor Fishkin is right: “anyone concerned with equal opportunity ought also to be concerned with limiting inequality of income and wealth.” ...
The factors affecting child development certainly include non-financial ones, including what social scientists clinically label “unobserved parental characteristics,” which are correlated with income.
But inequality alters the rules of the game. It narrows the goals we pursue as individuals, shapes values, and more importantly it turns our pursuit of the good life into an arms race over positional goods, and changes both incentives and opportunities. That is what makes money a bottleneck that bites.
Bottlenecks outlines a theory of opportunity that gives us good reason to worry about outcomes, because to some important degree unequal outcomes lead to unequal opportunities. ...
Money is a significant bottleneck. Facing that fact can only be healthy, if somewhat more challenging, for the way we think about public policy. ...
Posted by Mark Thoma on Thursday, May 15, 2014 at 09:45 AM in Economics, Income Distribution, Policy |
Posted by Mark Thoma on Thursday, May 15, 2014 at 12:06 AM in Economics, Links |
And now for something completely different:
Dear Mark (you have my permission to publish this if you like),
I thought that you and readers at Economist's View would like to know about this novel paper and competition on "renganomics", an experiment on the spontaneous order of words.
"The Spontaneous Order of Words: Economics Experiments in Haiku and Renga" by Stephen T. Ziliak et al.
The paper is being published in the next issue of the International Journal of Pluralism and Economics Education, 5(3), 2014:
Abstract: The search is on for low cost collaborative learning models that foster creative cooperation and growth through spontaneous competition. I propose that a traditional renga competition for stakes can fulfill several of those goals at once. “Capitalistic Crisis,” composed by five undergraduate students, is an example of what might be called renganomics— a spontaneous, collaboratively written linked haiku about economics, inspired by haiku economics (Ziliak 2011, 2009a) and classical Japanese renga. A renga is in general a spontaneous, collaboratively written linked haiku poem with stanzas and links conventionally arranged in 5-7-5-7-7 syllabic order. In medieval Japan renga gatherings were social, political, and economic exchanges – from small to elaborate parties – with a literary end: a collectively written poem to provoke and entertain the assembled audience about a theme, mood, and season —economic seasons included. Since their ancient and royal beginnings among 8th century Japanese courtiers, renga have been written competitively and by all social classes for stakes. The current group of five student authors competed in a Spring 2014 economics class with forty other students grouped into teams of 3 or 5 at Roosevelt University. The renga competition, judged by Stephen T. Ziliak, lasted forty five minutes for a predetermined cash prize of fifty U.S. dollars. So far as we know this is the first spontaneous renga in English, or any language, to focus on economics. After a brief discussion of renga rules and the renga-haiku relationship, there follows the prize winning “Capitalistic Crisis” by Cathleen Vasquez, Joseph Molina and others, together with “Fashions of Economics: Haiku,” by Samuel Barbour, who was master-in-training at the renga.
Here is a bit more about haiku economics, which you've kindly mentioned in the past.
Stephen T. Ziliak
Posted by Mark Thoma on Wednesday, May 14, 2014 at 04:31 PM in Economics |
Summers on Piketty:
The Inequality Puzzle: Once in a great while, a heavy academic tome dominates for a time the policy debate and, despite bristling with footnotes, shows up on the best-seller list. Thomas Piketty’s Capital in the Twenty-First Century is such a volume. As with Paul Kennedy’s The Rise and Fall of the Great Powers, which came out at the end of the Reagan Administration and hit a nerve by arguing the case against imperial overreach through an extensive examination of European history, Piketty’s treatment of inequality is perfectly matched to its moment.
Like Kennedy a generation ago, Piketty has emerged as a rock star of the policy-intellectual world. His book was for a time Amazon’s bestseller. Every pundit has expressed a view on his argument, almost always wildly favorable if the pundit is progressive and harshly critical if the pundit is conservative. Piketty’s tome seems to be drawn on a dozen times for every time it is read.
This should not be surprising. At a moment when our politics seem to be defined by a surly middle class and the President has made inequality his central economic issue, how could a book documenting the pervasive and increasing concentration of wealth and income among the top 1, .1, and .01 percent of households not attract great attention? Especially when it exudes erudition from each of its nearly 700 pages, drips with literary references, and goes on to propose easily understood laws of capitalism that suggest that the trend toward greater concentration is inherent in the market system and will persist absent the adoption of radical new tax policies. ...[continue]...
Update: Piketty responds to criticism.
Posted by Mark Thoma on Wednesday, May 14, 2014 at 09:42 AM in Economics, Income Distribution |
Posted by Mark Thoma on Wednesday, May 14, 2014 at 12:06 AM
What Is Social Insurance? Take Two: More than a year ago I wrote a post titled “What Is Social Insurance?”... In that post, I more or less took the mainstream progressive view: programs like Social Security are risk-spreading programs that provide insurance against common risks like disability, living too long, poor health in old age, and so on....
I still think that social insurance programs ... provide risk-spreading insurance when viewed over a long time horizon. So from a lifetime perspective, the insurance function means that most people are made better off, even though a program as a whole may be a zero-sum game in dollar terms. But ... a crucial feature of social insurance is that it is redistributive in the short term (in an ex ante sense, not the trivial ex post sense that is true of all insurance) but risk-spreading in the long term. I happen to think that the world would be a better place if we considered the long term and, therefore, decided to maintain these programs. But I don’t think it’s obviously true that a lifetime perspective is correct and a one-year perspective is incorrect.
In particular, if you think that Social Security won’t be around when you retire, then you would logically take a short-term perspective in which you pay taxes but never receive benefits (unless you go on disability, or you die while Social Security still exists). Then you should rationally want to eliminate Social Security as soon as possible. Conversely, if you believe that Social Security will be around when you retire, then you will evaluate the whole thing, including its insurance value, which will make you more likely to vote for it. So it’s not surprising that a major component of the anti-Social Security campaign consists of trying to convince young people (who ordinarily gain the most from insurance, since they face the most uncertainty) that Social Security cannot exist when they retire.
If you want to read more, the draft chapter is up on SSRN. Enjoy.
Just one comment. I wish he'd made it clear that the worries about Social Security not being there for the young of today are unfounded.
Posted by Mark Thoma on Tuesday, May 13, 2014 at 01:05 PM in Economics, Social Insurance |
Labor Market Seems Dented, Not Broken, by Justin Wolfers: There are two schools of thought about the longer-term prospects for the labor market. The darker view is that the Great Recession wrought permanent damage: The jobs that disappeared aren’t easily replaced, and the skills of the jobless are a poor match for the jobs that remain. ...
The sunnier view is that this is not a permanent shift, but rather the natural course of a recession... It’s a sunnier view because it suggests that a continuing recovery will largely solve our unemployment problem..., leaving no lasting mark.
The past two years have been kind to this more optimistic interpretation. ... It is surely too early to draw strong conclusions, but continued movements in this direction would suggest that the Great Recession hasn’t done lasting damage, and that it’s possible for the unemployment rate to head back toward 5 percent without the emergence of hiring bottlenecks.
It seems clear to me that there has been permanent damage, but we shall see...
Posted by Mark Thoma on Tuesday, May 13, 2014 at 09:45 AM in Economics, Unemployment |
Asymmetric Misinformation: A follow-up to my post about Jaime Caruana at the BIS. One other thing that struck me was his claim that
policymakers respond asymmetrically over successive business and financial cycles, hardly tightening or even easing during booms and easing aggressively and persistently during busts
Is this true? Anyway, is symmetry in policy responses inherently desirable?
The claim that policymakers have an easy-money bias is one of those things usually said with an air of worldly wisdom; of course people don’t want to take away the punchbowl when everyone is having fun. But the reality doesn’t look at all like that. After all, if policy were consistently doing too much to fight slumps and not enough to curb booms, what you would expect is a steady ratcheting up of inflation — which isn’t at all what has happened over the past 35 years. This supposed piece of wisdom is actually a cliche from the 1970s, which hasn’t been remotely true for a generation. ...
Incidentally, the fake wisdom on monetary policy resembles a corresponding piece of fake wisdom on fiscal policy — the claim that fiscal stimulus inevitably turns into a permanent rise in government spending, because the programs never go away. That didn’t happen this time... And in fact it has never happened in the United States, as far as I can tell...
Beyond that, there are in fact good reasons for asymmetry in the response to booms and slumps...
He goes on to explain why. I'd add another reason why "symmetry is not a virtue": the difference in costs between inflation and unemployment. I believe that the costs of unemployment are much higher than the costs of inflation running a point or two (or three or four) above target, so if there is a mistake to be made, it's best to err on the side of doing too much in a recession.
Posted by Mark Thoma on Tuesday, May 13, 2014 at 09:44 AM in Economics, Fiscal Policy, Monetary Policy |
Posted by Mark Thoma on Tuesday, May 13, 2014 at 12:06 AM in Economics, Links |
How to Shrink Inequality: Some inequality of income and wealth is inevitable, if not necessary. If an economy is to function well, people need incentives to work hard and innovate. The pertinent question is ... at what point do these inequalities become so great as to pose a serious threat to our economy, our ideal of equal opportunity and our democracy. We are near or have already reached that tipping point. ...But a return to the Gilded Age is not inevitable. ... There is no single solution for reversing widening inequality. ... Here are ten initiatives that could reverse the trends..:
1) Make work pay. ... [Min wage, EITC, etc.]
2) Unionize low-wage workers. ...
3) Invest in education. ...
4) Invest in infrastructure. ...
5) Pay for these investments with higher taxes on the wealthy. ...
6) Make the payroll tax progressive. ...
7) Raise the estate tax and eliminate the “stepped-up basis” for determining capital gains at death. ...
8) Constrain Wall Street. ...
9) Give all Americans a share in future economic gains. ... [Diversified index of stocks and bonds given to all at birth]
10) Get big money out of politics. ...
[The essay, which is much, much longer, also talks about how inequality has happened, how it threatens the foundations of our society, and why it has happened.]
Posted by Mark Thoma on Monday, May 12, 2014 at 11:50 AM in Economics, Income Distribution, Policy |
Taking advantage of Tim Taylor's kind permission to repost things of his occasionally:
NAFTA Turns 20: The North American Free Trade Agreement turns 20 this year. What has it done? Gary Clyde Hufbauer, Cathleen Cimino, and Tyler Moran discuss the evidence in "NAFTA at 20: Misleading Charges and Positive Achievements," written for the Peterson Institute for International Economics (May 2014, Number PB14-13).
For those who don't remember, NAFTA was a burning political issue of its time, perhaps the single largest issue launching Ross Perot as a third-party candidate for president in 1992. In June 1992, before the November election, Perot was for a time polling ahead of both George Bush and Bill Clinton. NAFTA was the first time the U.S. had signed a major trade agreement with a country that wasn't another high-income country, and Perot and others made dire predictions about the result. Hufbauer, Cimino, and Moran summarize the political debate:
"In truth the claims on both sides of the NAFTA issue 20 years ago were overblown. Since the Mexican economy is less than one-tenth the size of the US economy, it is not plausible that trade integration could dramatically shape the giant US economy, even though integration could exert a substantial impact on the relatively small Mexican economy. But exaggeration and sound bites are the weapons of political battle, and trade agreements have been on the front line for two decades. President Bill Clinton, for example, declared that NAFTA would “create” 200,000 American jobs in its first two years and a million jobs in its first five years. Not to be outdone, NAFTA opponents Ross Perot and Pat Choate projected job losses of 5.9 million, driven by what Perot derided as a “giant sucking sound” emanating from Mexico that would swallow American jobs. Both of these claims turned out to be overblown, especially the one advanced by Perot and Choate."
Of course, most economists see trade agreements in a fundamentally different way than these kinds of politically driven predictions, not about adding to the total number of jobs nor subtracting from the total number of jobs in any significant way, but instead about adding competitive pressures that restructure the labor market--adding jobs in some areas and reducing them in others. The restructuring of an economy from lower-productivity to higher-productivity jobs is a fundamental part of what raises the standard of living over time. Public policy has a legitimate role to play in smoothing this transition.
Hufbauer, Cimino, and Moran point out that every year, even when the economy is growing well, about four million Americans lose jobs involuntarily through layoffs or shutdowns. They estimate that about 5% of this job churn, perhaps 200,000 jobs per year, can be attributed to expanded trade with Mexico. Despite the dire predictions of NAFTA opponents, the U.S. economy boomed through the rest of the 1990s, and after the 2001 recession, the unemployment rate was 5% or less from June 2005 through February 2008, before the Great Recession hit with full force. NAFTA had essentially no effect on the total number of US jobs.
As the Pew surveys on public perception of FTA [free trade agreement] effects on jobs seem to confirm, American workers who owe their jobs to rising exports are usually oblivious to their dependence on foreign sales (in sharp contrast to workers who lose their jobs to rising imports). Based on the increase in US exports to Mexico, averaging $25 billion annually between 2009 and 2013, about 188,000 new US jobs were supported each year by additional sales to Mexico. The figure is almost as large as the jobs lost, but the jobs gained in other sectors pay better. On average, the export-related jobs pay 7 to 15 percent more than the lost import-competing jobs. The wage differential, while positive, is only part of overall US gains from trade with Mexico. ...
Amidst the arithmetic of jobs lost and gained, it should not be forgotten that a large portion of two-way trade among the NAFTA economies represents imported intermediates that raise the competitiveness of US firms, enabling them to improve their export profile in world markets. In other words, imports benefit not just US consumers but also US firms that can acquire just the right intermediate components at the right price. . . . [G]ains to the US economy average several hundred thousand dollars per net job lost.
What about effects on wages? Interestingly, it turns out that while various studies find that U.S. trade with China has had a negative effect on US manufacturing wages, similar studies do not find that trade with Mexico has had a similar effect. They explain:
Possibly the main reason the wage impact between Chinese and Mexican imports differs is that US trade with Mexico is roughly balanced and has a large intraindustry component (e.g., autos and parts shipped in both directions), whereas US trade with China is highly unbalanced and entails very large US imports of consumer goods in exchange for much smaller US exports of capital goods. Because of these features, US imports from Mexico compel considerably less job churn between industrial sectors than US imports from China, and this could account for the difference in estimated wage impact.
Overall, here's a picture showing how bilateral merchandise trade between the U.S., Canada, and Mexico has risen since the passage of NAFTA. The blue bars show bilateral imports and exports before the free trade agreement; the dark green bars show how much trade would have increased if it had just mirrored overall economic growth; and the light green bars show the increase in trade above that level.
Hufbauer, Cimino, and Moran estimate the gains in this way:
"Ample econometric evidence documents the substantial payoff from expanded two-way trade in goods and services. Through multiple channels, benefits flow both from larger exports and larger imports. ... The channels include more efficient use of resources through the workings of comparative advantage, higher average productivity of surviving firms through “sifting and sorting,” and greater variety of industrial inputs and household goods. ... As a rough rule of thumb, for advanced nations, like Canada and the United States, an agreement that promotes an additional $1 billion of two-way trade increases GDP by $200 million. For an emerging country, like Mexico, the payoff ratio is higher: An additional $1 billion of two-way trade probably increases GDP by $500 million. Based on these rules of thumb, the United States is $127 billion richer each year thanks to “extra” trade growth, Canada is $50 billion richer, and Mexico is $170 billion richer. For the United States, with a population of 320 million, the pure economic payoff is almost $400 per person."
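The rule-of-thumb arithmetic in the quoted passage is easy to check; here is a minimal sketch (the dollar figures and ratios are the authors', and the code is only my illustration of their back-of-the-envelope calculation):

```python
# Rules of thumb from Hufbauer, Cimino, and Moran: each $1 billion of extra
# two-way trade raises GDP by $200 million in an advanced economy and by
# $500 million in an emerging one.
def gdp_gain(extra_trade_usd, ratio):
    """GDP gain implied by the rule-of-thumb payoff ratio."""
    return extra_trade_usd * ratio

assert gdp_gain(1e9, 0.2) == 200e6  # advanced economies (US, Canada)
assert gdp_gain(1e9, 0.5) == 500e6  # emerging economies (Mexico)

# Per-person payoff for the US: a $127 billion annual gain spread over
# a population of 320 million.
per_person = 127e9 / 320e6
print(round(per_person))  # 397 -- "almost $400 per person"
```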
As we look around the globalizing economy today, one emerging pattern is regional mixtures of the technology and design skills of high-income countries with the manufacturing skills of lower income economies. We see Factory Asia, with Japan and Korea partnering with China, Thailand, Indonesia, and others. We see Factory Europe, with Germany in particular partnering with countries of eastern Europe. And we see Factory North America, with the U.S. and Canada partnering with Mexico. In a sense, the NAFTA agreement back in 1994 was a test of the willingness of the U.S. economy to embrace--at least hesitantly!--the future of a globalizing economy. As the bulk of the world economy continues its shift toward middle income economies, the U.S. economy will face a continual series of adjustments to the globalizing economy in the decades ahead.
Posted by Mark Thoma on Monday, May 12, 2014 at 09:48 AM
Can you guess how conservatives will react if the EPA announces rules to combat climate change?:
Crazy Climate Economics, by Paul Krugman, Commentary, NY Times: Everywhere you look these days, you see Marxism on the rise. Well, O.K., maybe you don’t — but conservatives do. If you so much as mention income inequality, you’ll be denounced as the second coming of Joseph Stalin; Rick Santorum has declared that any use of the word “class” is “Marxism talk.” ...George Will says the only reason progressives favor trains is their goal of “diminishing Americans’ individualism in order to make them more amenable to collectivism.”
So it goes without saying that Obamacare, based on ideas originally developed at the Heritage Foundation, is a Marxist scheme... And just wait until the Environmental Protection Agency announces rules intended to slow the pace of climate change. ...
You can already get a taste of what’s coming in the ... recent Supreme Court ruling on power-plant pollution. ... Justice Scalia didn’t just dissent; he suggested that the E.P.A.’s proposed rule ... reflected the Marxist concept of “from each according to his ability.” ...
And you can just imagine what will happen when the E.P.A ... moves on to regulation of greenhouse gas emissions. ...
First, we’ll see any effort to limit pollution denounced as a tyrannical act. Pollution wasn’t always a deeply partisan issue... John McCain made ... cap-and-trade limits on greenhouse gases part of his presidential campaign. But when House Democrats actually passed a cap-and-trade bill in 2009, it was attacked as, you guessed it, Marxist. ...
Second, we’ll see claims that any effort to limit emissions will have ... “a devastating impact on our economy.” ... Now, the rules the E.P.A. is likely to impose won’t give the private sector as much flexibility as it would have had in dealing with an economywide carbon cap or emissions tax. But Republicans have only themselves to blame: Their scorched-earth opposition to any kind of climate policy has left executive action by the White House as the only route forward. ...
What about the argument that unilateral U.S. action won’t work...? ... U.S. action on climate is a necessary first step toward a broader international agreement, which will surely include sanctions on countries that don’t participate.
So the coming firestorm over new power-plant regulations won’t be a genuine debate... Instead, the airwaves will be filled with conspiracy theories and wild claims about costs, all of which should be ignored. Climate policy may finally be getting somewhere; let’s not let crazy climate economics get in the way.
Posted by Mark Thoma on Monday, May 12, 2014 at 12:24 AM in Economics, Environment, Politics |
[Accidentally set auto post for PM rather than AM, so the links are late today]:
Posted by Mark Thoma on Monday, May 12, 2014 at 12:06 AM in Economics, Links |
Price level targeting appears to be better than inflation targeting, particularly at the zero bound, but the "beneficial effects hang importantly on the structure of New Keynesian models and rational expectations":
Inflation targeting vs price-level targeting: A new survey of theory and empirics, by Michael Hatcher, Patrick Minford, Vox EU: Price stability is the key goal of almost every central bank in the world. But does that mean prices levels or inflation rates? The main difference between inflation targeting and price-level targeting is the consequence of missing the target.
- Unanticipated shocks to inflation lead to corrective action when the price level is the target.
- Under inflation targeting, past mistakes and shocks are treated as ‘bygones’.
If, for example, inflation is unexpectedly high today, this would be followed in the future by below average inflation under a price-level targeting regime. By contrast, inflation targeting aims for average (i.e. on-target) inflation in future years regardless of the level of current inflation (see Figure 1).
Figure 1 Inflation and price-level targeting compared
Figure 1 makes clear that expectations depend crucially on the regime in place. For example, suppose the central bank announces an inflation target of 2%. When inflation unexpectedly rises to 3% in period 3, rational households and firms will anticipate future inflation of 2% in periods 4 and 5. By contrast, expected inflation in period 5 would be only 1% with a price-level target, because price targeting calls for below-average inflation in this period. Because the central bank is obliged to offset past inflationary shocks in this way, targeting prices is ‘history dependent’ (Woodford 2003). This mechanism is important for understanding why price-level targeting gives different outcomes to inflation targeting in New Keynesian models.
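The numerical example above can be sketched in a few lines. This is a stylized illustration in integer percentage points (so that inflation rates add up exactly); the 2%, 3%, and 1% figures are from the text:

```python
# Stylized comparison of inflation targeting (IT) and price-level targeting (PLT).
# Inflation rates in percentage points per period; a 3% shock hits in period 3.
target_rate = 2
periods = 5

it_inflation  = [2, 2, 3, 2, 2]  # IT: past overshoots are treated as bygones
plt_inflation = [2, 2, 3, 2, 1]  # PLT: 1% in period 5 undoes the overshoot

target_price_growth = target_rate * periods  # cumulative growth along the target path

# Cumulative price growth minus the target path, at the end of period 5:
it_gap = sum(it_inflation) - target_price_growth    # 1 -- permanently 1% above path
plt_gap = sum(plt_inflation) - target_price_growth  # 0 -- back on the target path

# Expected inflation in period 5 is 2% under IT but only 1% under PLT,
# because the price-level target is "history dependent".
print(it_gap, plt_gap, it_inflation[4], plt_inflation[4])  # 1 0 2 1
```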
A survey of new evidence and thinking
This question of targets – inflation or the aggregate price level – has excited economists for decades. Knut Wicksell first presented the view, in 1898, that Swedish monetary policy should stabilize the price level. A little over three decades later, Sweden experimented with price-level targeting for the first time (see Berg and Jonung 1999). But price-level targeting did not take off; it has not been adopted by a major central bank since.
In recent years, however, economists have re-assessed the merits of price-level targeting in the light of new research and better models.1 We recently wrote a survey of this new research (Hatcher and Minford 2014), designed to bring an earlier survey by Ambler (2009) up to date. A key new development is the potential role of price-level targeting in helping monetary policy deal with the ‘zero bound’ on nominal interest rates.
Inflation targeting and the zero bound on interest rates
Consider, for instance, a situation where the economy has been hit by a large negative shock to aggregate demand, and nominal interest rates have been cut to zero in an attempt to stimulate the economy back to full capacity. Because inflation expectations remain anchored at 2% under inflation targeting, the only route by which monetary policy could stimulate the economy is further cuts in nominal interest rates – an option which has been exhausted at this point.
If households and firms understand the impotence of monetary policy in this situation, they might even expect lower future inflation. This would raise real interest rates, thus pushing down demand even further. With real interest rates either constant or rising, a lengthy recession is likely to ensue.
Targeting the price level leads to a different dynamic for inflation expectations. After the demand shock has hit and inflation falls below 2%, a credible price-level target would create the expectation of future inflation of more than 2%. In turn, this expectation will lower real interest rates today and provide necessary stimulus to aggregate demand and upward pressure on prices. This expectations mechanism has additional bite in New Keynesian models because an increase in expected inflation raises current inflation, and higher output expectations raise aggregate demand.
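The real-rate mechanism here can be made concrete with the Fisher equation, real rate ≈ nominal rate − expected inflation. This is a stylized sketch; the 3% expected-inflation figure under the price-level target is my illustrative assumption, not a number from the article:

```python
# Fisher equation: real_rate = nominal_rate - expected_inflation.
def real_rate(nominal, expected_inflation):
    return nominal - expected_inflation

nominal = 0.0  # stuck at the zero lower bound

# Inflation targeting: expectations stay anchored at the 2% target.
r_it = real_rate(nominal, 0.02)   # -2%

# Price-level targeting: the credible promise to return to the price path
# raises expected inflation above target (illustratively, 3%).
r_plt = real_rate(nominal, 0.03)  # -3% -- a lower real rate, i.e. more stimulus

assert r_plt < r_it  # PLT delivers extra stimulus with the same zero nominal rate
```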
Both Eggertsson and Woodford (2003) and Nakov (2008) confirm this intuition. Welfare losses conditional on reaching the lower bound are much larger under inflation targeting than price targeting in New Keynesian models. More recently, Coibion et al. (2012) consider an extended model with the feature that the optimal rate of inflation can be computed. Because targeting the price level reduces the frequency and severity of zero bound episodes through its effect on expectations, the optimal inflation rate is somewhat lower than under inflation targeting. Since there are additional welfare gains associated with a lower trend rate of inflation, the potential welfare gains from price-level targeting are much larger and amount to 0.4% of GDP per year.2
Covas and Zhang (2010) and Bailliu et al. (2012) show that including in the New Keynesian model some basic financial frictions underlined by the recent financial crisis does not overturn the beneficial effects of price targeting – essentially because the main mechanism via expectations remains powerful in these models. It is important to note, however, that this mechanism rests crucially on the assumption that the price-level target is credible. Also, we cannot yet say much about the relative merits of price-level targeting in models with more sophisticated financial frictions, though we expect to see additional research soon.
The importance of rational expectations
Because the expectations mechanism under price-level targeting is central to its performance, the crucial issue for policymakers is whether expectations are rational and the economy is New Keynesian. One way to assess whether expectations are rational is through surveys and experiments. Like many economists, however, we remain skeptical about the usefulness of these approaches and think applied macro evidence is preferable when it can be established on strong statistical grounds.3
We, therefore, turn to this literature. Early attempts to test rational expectations in macro models were made by Fair (1993) and several others. When we look at modern New Keynesian models with rational expectations imposed, we find a steady improvement over time in their empirical performance. For instance, Christiano et al. (2005) and Smets and Wouters (2007) show that New Keynesian models can match key dynamic features of US data and perform impressively in out-of-sample forecast tests. Nowadays, most major central banks consider New Keynesian models useful tools for policy analysis.4
The next logical step is to test these models directly against the data using statistical tests that accept or reject the basic model and variants of it. This challenge has been taken on by a recent strand of the applied macro literature that exploits vector autoregressions (VARs) as a description of macro data. A statistical testing procedure known as indirect inference can be built on this (see Smith 1993). The basic idea is to simulate the model to create a large number of counterfactual histories, together with the VAR relationships implied by them, and then to ask whether the actual history, and the VAR estimated on the actual data, could be rejected as coming from this distribution at some level of statistical confidence. It turns out that this test has substantial power against mis-specified models (Le et al. 2012), quite a lot more so than tests based on likelihood, which can struggle to distinguish between alternative models (see Canova and Sala 2009). Bayesian ranking is based on likelihood and can also suffer from a lack of power. Though this could, in theory, be remedied by the use of strong priors, in practice it is difficult to come up with a set of priors that is at once strong and uncontroversial.
The indirect inference test can be applied to any model and its proposed parameters. Furthermore, the possibility that the original set of parameters could simply be wrongly calibrated can be explored by searching over the full range of parameter values permitted by theory.5 In recent years, a number of studies have carried out this test on New Keynesian models with rational expectations, largely on US data. For example, Le et al. (2011) reworked the Smets and Wouters (2007) model by adding a competitive sector to both the labor and the product markets and re-estimating it as above. They found that for the post-1984 Great Moderation period, the model passed the indirect inference test comfortably (p-value = 0.16), and that the ‘best’ model over this period was strongly New Keynesian.
Liu and Minford (2012) considered, again on US data, a smaller New Keynesian model. Usefully for our focus here, they tested the model under both rational expectations and behavioral expectations as in De Grauwe (2010). The behavioral expectations are a weighted average of a ‘fundamentalist’ forecasting rule, in which the output gap and inflation are forecast at their steady-state values, and a rule extrapolating the most recent value. Many policymakers have considered such behavioral rules plausible and have had doubts about the ‘strong’ rational expectations assumption, so the comparison is pertinent for them. Perhaps surprisingly, the rational expectations model does far better than the behavioral version. Indeed, the latter is strongly rejected even after full re-estimation, whereas the rational expectations version passes after re-estimation by a fair margin (p-value = 0.20).
Price-level targeting is found in modern macro models to be a good mechanism for helping the economy recover from deflationary shocks that drive monetary policy to the zero bound. It does this because, when such shocks occur, price-level targeting implies that future inflation will be boosted and so real interest rates are lowered. Moreover, this mechanism would make it feasible to lower trend inflation, which would bring additional benefits. These beneficial effects hinge importantly on the structure of New Keynesian models and on rational expectations. The empirical literature we have surveyed does not reject these assumptions and favors rational expectations over behavioral ones. We, therefore, conclude that policymakers should continue to pay attention to price-level targeting in the future.
Ambler, S. (2009), “Price-level targeting and stabilisation policy: a survey”, Journal of Economic Surveys 23(5), 974–997.
Ambler, S. (2007), “The costs of inflation in New Keynesian models”, Bank of Canada Review (Winter), 5–14.
Andolfatto, D., Hendry, S., Moran, K. (2008), “Are inflation expectations rational?”, Journal of Monetary Economics 55(2), 406–422.
Bailey, M.J. (1956), “The welfare cost of inflationary finance”, Journal of Political Economy 64(2), 93–110.
Bailliu, J., Meh, C. and Zhang, Y. (2012), “Macroprudential rules and monetary policy when financial frictions matter”, Bank of Canada Working Paper 2012-6.
Bank of Canada (2011), Renewal of the inflation-control target.
Berg, C., Jonung, L. (1999), “Pioneering price-level targeting: the Swedish experience 1931-1937”, Journal of Monetary Economics 43(3), 525–551.
Canova, F., Sala, L. (2009), “Back to square one: Identification issues in DSGE models”, Journal of Monetary Economics 56, 431–449.
Christiano, L.J., Eichenbaum, M.S., Evans, C.L. (2005), “Nominal rigidities and the dynamic effects of a shock to monetary policy”, Journal of Political Economy 113(1), 1–45.
Coibion, O., Gorodnichenko, Y., Wieland, J. (2012), “The optimal inflation rate in New Keynesian models: should central banks raise their inflation targets in light of the zero lower bound?”, Review of Economic Studies 79, 1371–1406.
Covas, F. and Zhang, Y. (2010), “Price-level versus inflation targeting with financial market imperfections”, Canadian Journal of Economics 43(4), 1302–1332.
De Grauwe, P. (2010), “Top-down versus bottom-up macroeconomics”, CESifo Economic Studies 56(4), 465–497.
Eggertsson, G.B. and Woodford, M. (2003), “The zero bound on interest rates and optimal monetary policy”, Brookings Papers on Economic Activity 1:2003, 139–211.
Fair, R.C. (1993), “Testing the rational expectations hypothesis in macroeconometric models”, Oxford Economic Papers 45(2), 169–190.
Hatcher, M., Minford, P. (2014), “Stabilization policy, rational expectations and price-level versus inflation targeting: a survey”, CEPR Discussion Paper No. 9820.
Le, V.P.M., Meenagh, D., Minford, P. and Wickens, M.R. (2011), “How much nominal rigidity is there in the US economy? Testing a New Keynesian DSGE model using indirect inference”, Journal of Economic Dynamics and Control 35(12), 2078–2104.
Le, V.P.M., Meenagh, D., Minford, P., and Wickens, M. (2012), “Testing DSGE models by Indirect inference and other methods: some Monte Carlo experiments”, Cardiff University Economics working paper E2012_15.
Liu, C., Minford, P. (2012), “Comparing behavioural and rational expectations for the US post-war economy”, CEPR Discussion Paper 9132.
Nakov, A. (2008), “Optimal and simple monetary policy rules with zero floor on the nominal interest rate”, International Journal of Central Banking 4(2), 73–127.
Smets, F. and Wouters, R. (2007), “Shocks and frictions in US business cycles: a Bayesian DSGE approach”, American Economic Review 97(3), 586–606.
Smith, A.A. Jr. (1993), “Estimating nonlinear time-series models using simulated vector autoregressions”, Journal of Applied Econometrics 8, 63–84.
Woodford, M. (2003), Interest and prices: Foundations of a theory of monetary policy, NJ: Princeton University Press.
1 This re-assessment has not been confined to academia: the Bank of Canada recently investigated in detail whether it should switch to a price-level targeting mandate (Bank of Canada 2011).
2 Positive trend inflation has three distinct costs in New Keynesian models (see Ambler 2007). The traditional welfare cost due to inflation acting as a tax on money holdings (see Bailey 1956) is not one of them. It is therefore conceivable that the welfare gains attainable from lowering trend inflation under price-level targeting could be larger than estimated by Coibion et al. (2012). The figure of 0.4% of GDP was arrived at by multiplying the gain of 0.5% of aggregate consumption reported in Coibion et al. by the model ratio of consumption to GDP of 0.8.
3 We provide a brief discussion of the pros and cons of survey and experimental evidence in Hatcher and Minford (2014). One important caveat highlighted by recent research is that statistical results in which survey inflation expectations differ persistently from actual inflation cannot be construed as conclusive evidence against rational expectations because this behaviour could be the result of occasional changes in monetary policy regime combined with a period of (rational) learning by the private sector (see Andolfatto et al. 2008).
4 In practice, most central banks rely on a variety of models, some of which are heavily reliant on economic theory and others which are closer to econometric models.
5 In effect this search amounts to re-estimation via indirect inference (which is asymptotically equivalent to full-information maximum likelihood but in small samples finds the model that gets closest to passing the test).
The weights on each type of forecasting rule vary over time according to the relative past success of each rule.
Posted by Mark Thoma on Sunday, May 11, 2014 at 09:54 AM in Economics, Monetary Policy |
Posted by Mark Thoma on Sunday, May 11, 2014 at 12:06 AM in Economics, Links |
Thomas Kneib sent me the details of this project in early April after a discussion about it with one of his Ph.D. students (Jan Höffler) at the INET conference, and I've been meaning to post something on it but have been negligent. So I'm glad that Dave Giles picked up the slack:
Replication in Economics: I was pleased to receive an email today, alerting me to the "Replication in Economics" wiki at the University of Göttingen:
"My name is Jan H. Höffler, I have been working on a replication project funded by the Institute for New Economic Thinking during the last two years and found your blog that I find very interesting. I like very much that you link to data and code related to what you write about. I thought you might be interested in the following:
We developed a wiki website that serves as a database of empirical studies, the availability of replication material for them and of replication studies: http://replication.uni-goettingen.de
It can help for research as well as for teaching replication to students. We taught seminars at several faculties internationally - also in Canada, at UofT - for which the information of this database was used. In the starting phase the focus was on some leading journals in economics, and we now cover more than 1800 empirical studies and 142 replications. Replication results can be published as replication working papers of the University of Göttingen's Center for Statistics.
Teaching and providing access to information will raise awareness for the need for replications, provide a basis for research about the reasons why replications so often fail and how this can be changed, and educate future generations of economists about how to make research replicable.
I would be very grateful if you could take a look at our website, give us feedback, register and vote which studies should be replicated – votes are anonymous. If you could also help us to spread the message about this project, this would be most appreciated."
I'm more than happy to spread the word, Jan. I've requested an account, and I'll definitely be getting involved with your project. This looks like a great venture!
Posted by Mark Thoma on Saturday, May 10, 2014 at 08:37 AM in Economics, Methodology |
Gloomy European Economist Francesco Saraceno:
Wrong Debates: Paul Krugman has a short post on the Eurozone today (I’d like him to write more about us; he has been too America-centered lately), pointing out that the myth of fiscal profligacy is, well, just a myth. In fact, he argues, the only fiscally irresponsible country in the 2000s was Greece. It is maybe worth reposting here a figure from an old piece on this blog that has since made it into all my classes on the Euro crisis:
The figure shows the situation of public finances in 2007, against the Maastricht benchmark (3% deficit and 60% debt) before the crisis hit. As Krugman says, only one country of the so-called PIIGS (the red dots) is clearly out of line, Greece. Portugal is virtually like France, and Spain and Ireland way better than most countries, including Germany. Italy has a stock of old debt, but its deficit in 2007 is under control.
So Krugman is right to remind us that fiscal policy per se was not a problem before the crisis. And yet what he calls fiscal myths have shaped policies in the EMU, with a disproportionate emphasis on austerity. And even today,... continued fiscal consolidation is taken for granted. I will write more on this in the next days, but it is striking how we aim at the wrong target.
Posted by Mark Thoma on Saturday, May 10, 2014 at 08:33 AM in Economics, Fiscal Policy, Politics |
Dave Altig and Ellyn Terry at macroblog:
How Has Disability Affected Labor Force Participation?: You might be unaware that May is Disability Insurance Awareness Month. We weren’t aware of it until recently, but the issue of disability—as a reason for nonparticipation in the labor market—has been very much on our minds as of late. As we noted in a previous macroblog post, from the fourth quarter of 2007 through the end of 2013, the number of people claiming to be out of the labor force for reasons of illness or disability increased almost 3 million (or 23 percent). The previous post also noted that the incidence of reported nonparticipation as a result of disability/illness is concentrated (unsurprisingly) in the age group from about 51 to 60.
In the past, we have examined the effects of the aging U.S. population on the labor force participation rate (LFPR). However, we have not yet specifically considered how much the aging of the population alone is responsible for the aforementioned increase in disability as a reason for dropping out of the labor force.
The following chart depicts over time the percent (by age group) reporting disability or illness as a reason for not participating in the labor force. Each line represents a different year, with the darkest line being 2013. The chart reveals a long-term trend of rising disability or illness as a reason for labor force nonparticipation for almost every age group.
The chart also shows that disability or illness is cited most often among people 51 to 65 years old—the current age of a large segment of the baby boomer cohort. In fact, the proportion of this age group citing disability or illness increased from 20 percent in 2003 to 25 percent in 2013.
How much can the change in demographics during the past decade explain the rise in disability or illness as a reason for not participating in the labor market? The answer seems to be: Not a lot.
Following an approach you may have seen in this post, we break down into three components the change in the portion of people not participating in the labor force due to disability or illness. One component measures the change resulting from shifts within age groups (the within effect). Another component measures changes due to population shifts across age groups (the between effect). A third component allows for correlation across the two effects (a covariance term). Here’s what you get:
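The three-term decomposition described above can be sketched with hypothetical numbers. The age groups, population shares, and rates below are made up for illustration; they are not the macroblog data:

```python
# Shift-share decomposition of the change in an aggregate rate into
# within-group, between-group (pure demographics), and covariance terms.
# All shares and rates below are hypothetical.

share0 = [0.60, 0.25, 0.15]   # population shares by age group, start year
share1 = [0.55, 0.30, 0.15]   # population shares by age group, end year
rate0  = [0.02, 0.20, 0.05]   # disability/illness nonparticipation rate, start
rate1  = [0.03, 0.25, 0.05]   # disability/illness nonparticipation rate, end

# Within effect: rate changes inside each group, holding shares fixed.
within = sum(s0 * (r1 - r0) for s0, r0, r1 in zip(share0, rate0, rate1))
# Between effect: population shifting across groups, holding rates fixed.
between = sum(r0 * (s1 - s0) for s0, s1, r0 in zip(share0, share1, rate0))
# Covariance term: interaction of the two kinds of change.
covariance = sum((s1 - s0) * (r1 - r0)
                 for s0, s1, r0, r1 in zip(share0, share1, rate0, rate1))

total = (sum(s * r for s, r in zip(share1, rate1))
         - sum(s * r for s, r in zip(share0, rate0)))
# The three components add up exactly to the total change in the rate.
assert abs(total - (within + between + covariance)) < 1e-12
```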
To recap, only about one fifth of the decline in labor force participation as a result of reported illness or disability can be attributed to the population aging per se. A full three quarters appears to be associated with some sort of behavioral change.
What is the source of this behavioral change? Our experiment can’t say. But given that those who drop out of the labor force for reasons of disability/illness tend not to return, it would be worth finding out. Here is one perspective on the issue.
You can find even more on this topic via the Human Capital Compendium.
Posted by Mark Thoma on Saturday, May 10, 2014 at 12:24 AM in Economics, Unemployment |
Posted by Mark Thoma on Saturday, May 10, 2014 at 12:06 AM in Economics, Links |
Why Do Economists Still Disagree over Government Spending Multipliers?, by Daniel Carroll, Commentary, Cleveland Fed: Public debate about the effects of government spending heated up after record-large stimulus packages were enacted to address the fallout of the financial crisis. Almost as noticeable as the discord was the absence of consensus among prominent economists on the issue. While it seems a simple problem to estimate the effect of government spending on output—the size of the government multiplier—it is anything but.
Posted by Mark Thoma on Friday, May 9, 2014 at 09:24 AM in Economics, Fiscal Policy |
Economists and methodology: ...very few economists write much about methodology. This would be understandable if economics was just like some other discipline where methodological discussion was routine. This is not the case. Economics is not like the physical sciences for well known reasons. Yet economics is not like most other social sciences either: it is highly deductive, highly abstractive (in the non-philosophical sense) and rarely holistic. ...
This is a long winded way of saying that the methodology used by economics is interesting because it is unusual. Yet, as I say, you will generally not find economists writing about methodology. One reason for this is ... a feeling that the methodology being used is unproblematic, and therefore requires little discussion.
I cannot help giving the example of macroeconomics to show that this view is quite wrong. The methodology of macroeconomics in the 1960s was heavily evidence based. Microeconomics was used to suggest aggregate relationships, but not to determine them. Consistency with the data (using some chosen set of econometric criteria) often governed what was or was not allowed in a parameterised (numerical) model, or even a theoretical model. It was a methodology that some interpreted as Popperian. The methodology of macroeconomics now is very different. Consistency with microeconomic theory governs what is in a DSGE model, and evidence plays a much more indirect role. Now I have only a limited knowledge of the philosophy of science..., but I know enough to recognise this as an important methodological change. Yet I find many macroeconomists just assume that their methodology is unproblematic, because it is what everyone mainstream currently does. ...
... The classic example of an economist writing about methodology is Friedman’s Essays in Positive Economics. This puts forward an instrumentalist view: the idea that the realism of assumptions does not matter; it is results that count.
Yet does instrumentalism describe Friedman’s major contributions to macroeconomics? Well, one of those was the expectations-augmented Phillips curve. ... Friedman argued that the coefficient on expected inflation should be one. His main reason for doing so was not that such an adaptation predicted better, but because it was based on better assumptions about what workers are interested in: real rather than nominal wages. ...
Economists do not think enough about their own methodology. This means economists are often not familiar with methodological discussion, which implies that using what they write on the subject as evidence about what they do can be misleading. Yet most methodological discussion of economics is (and should be) about what economists do, rather than what they think they do. That is why I find that the more interesting and accurate methodological writing on economics looks at the models and methods economists actually use...
Posted by Mark Thoma on Friday, May 9, 2014 at 09:18 AM in Economics, Macroeconomics, Methodology |
"Myths about who the rich really are and how they make their money":
Now That’s Rich, by Paul Krugman, Commentary, NY Times: Institutional Investor’s latest “rich list”..., its survey of the 25 highest-paid hedge fund managers, is out..., so let’s think about how their good fortune refutes several popular myths about income inequality...
First, modern inequality isn’t about graduates. It’s about oligarchs. Apologists for soaring inequality almost always ... talk about the rising incomes of college graduates, or maybe the top 5 percent. The goal of this misdirection is to soften the picture, to make it seem as if we’re talking about ordinary white-collar professionals who get ahead through education and hard work.
But many Americans are well-educated and work hard. ... Yet they don’t get the big bucks. ...
Second, ignore the rhetoric about “job creators”... Conservatives want you to believe that the big rewards in modern America go to innovators and entrepreneurs, people who build businesses and push technology forward. But that’s not what those hedge fund managers do for a living; they’re in the business of financial speculation...
Once upon a time, you might have been able to argue ... that all this wheeling and dealing was productive.... But, at this point, the evidence suggests that hedge funds are a bad deal for everyone except their managers... More broadly, we’re still living in the shadow of a crisis brought on by a runaway financial industry. ...
Finally, a close look at the rich list supports the thesis made famous by Thomas Piketty... — namely, that we’re on our way toward a society dominated by wealth, much of it inherited, rather than work. ...
But why does all of this matter? Basically, it’s about taxes.
America has a long tradition of imposing high taxes on big incomes and large fortunes, designed to limit the concentration of economic power as well as to raise revenue. These days, however, suggestions that we revive that tradition face angry claims that taxing the rich is destructive and immoral — destructive because it discourages job creators from doing their thing, immoral because people have a right to keep what they earn.
But such claims rest crucially on myths about who the rich really are and how they make their money. Next time you hear someone declaiming about how cruel it is to persecute the rich, think about the hedge fund guys, and ask yourself if it would really be a terrible thing if they paid more in taxes.
Posted by Mark Thoma on Friday, May 9, 2014 at 12:48 AM in Economics, Income Distribution |
Posted by Mark Thoma on Friday, May 9, 2014 at 12:06 AM in Economics, Links |
Predictions and Prejudice: The 2008 crisis and its aftermath have been a testing time for economists — and the tests have been moral as well as intellectual. After all, economists made very different predictions about the effects of the various policy responses to the crisis; inevitably, some of those predictions would prove deeply wrong. So how would those who were wrong react?
The results have not been encouraging.
Brad DeLong reads Allan Meltzer in the Wall Street Journal, issuing dire warnings about the inflation to come. Newcomers to this debate may not be fully aware of the history here, so let’s recap. Meltzer began banging the inflation drum five full years ago, predicting that the Fed’s expansion of its balance sheet would cause runaway price increases; meanwhile, some of us pointed both to the theory of the liquidity trap and Japan’s experience to say that this was not going to happen. ...
Tests in economics don’t get more decisive; this is where you’re supposed to say, “OK, I was wrong, and here’s why”.
Not a chance. And the thing is, Meltzer isn’t alone. Can you think of any prominent figure on that side of the debate who has been willing to modify his beliefs in the face of overwhelming evidence? ...
Were the freshwater guys always just pretending to do something like science, when it was always politics? Is there simply too much money and too much vested interest behind their point of view?
Even if we do get a bit of inflation at some point, the people who have been warning about it repeatedly for the last half decade won't be able to say they predicted it in any real sense. Warning that there will be, say, a tornado every day for five years until one finally comes is not much of a track record, or helpful in any way. And if it never comes...
Posted by Mark Thoma on Thursday, May 8, 2014 at 02:02 PM in Economics, Inflation, Politics |
As a follow-up to the previous post on black-white differences in economic mobility:
Higher Ed Cuts, Tuition Hikes Worsen Low-Income Students’ Struggles: State cuts to higher education have led colleges and universities to make deep cuts to educational or other services, hike tuition sharply, or both, as we explain in our recently released paper. These tuition increases are hitting low-income students particularly hard, lessening their choices of schools, adding to their debt burdens — and likely deterring some from enrolling in school altogether. ...
Posted by Mark Thoma on Thursday, May 8, 2014 at 08:50 AM in Economics, Income Distribution, Universities |
Bhashkar Mazumder of the Federal Reserve Bank of Chicago:
Black–white differences in intergenerational economic mobility in the United States: The large and persistent gap in economic status between blacks and whites in the United States has been a topic of considerable interest among social scientists and policymakers for many decades. The historical legacy of slavery and segregation raises the question of how long black Americans are likely to remain a disadvantaged minority. Despite the enormous literature on black–white inequality and its historical trends, few studies have directly measured black–white differences in rates of intergenerational mobility, that is, the ability of families to improve their position in the income distribution from one generation to the next. Estimates of rates of intergenerational mobility by race can provide insight on whether racial differences in the United States are likely to be eliminated and, if so, how long it might take. Furthermore, they might also help inform policymakers as to whether there are lingering racial differences in equality of opportunity and, if so, what the underlying sources for these differences are.
More generally, the relatively low rate of intergenerational mobility in the United States compared with other industrialized countries has been a growing concern to policymakers across the political spectrum.1 Understanding the sources of racial differences in intergenerational mobility might also shed light on the mechanisms behind the relatively high degree of intergenerational persistence of inequality in the United States. ...
A key finding is that in recent decades, blacks have experienced substantially less upward intergenerational mobility and substantially more downward intergenerational mobility than whites. These results are shown to be highly robust to a variety of measurement issues, such as the concept of income used, the age of the sample members, and the length of the time average used. The results are found in two different data sets that cover different birth cohorts and differ in their gender composition. Moreover, these results utilize relatively large samples of black families, so that racial differences can be shown to be statistically significant. An important implication of the results that has not been shown explicitly before is that if these patterns of mobility were to persist into the future, the implications for racial differences in the “steady-state” distribution of income would be alarming. Instead of eventually “regressing to the mean,” as some traditional measures of intergenerational mobility (when applied to the whole population) would suggest, these results imply that black Americans would make no further relative progress. Of course, it is a strong hypothetical to assume that current rates of mobility will hold in future generations. Indeed, over the past 150 years, there have been clear periods in which the racial gap in economic status has narrowed and it is certainly possible that black–white gaps could converge.4
This study also tries to shed light on which factors are associated with the racial gaps in upward and downward mobility. To be clear, while the analysis is descriptive and not causal, it nonetheless provides some highly suggestive “first-order” clues for the underlying mechanisms leading to black–white differences in intergenerational mobility. It appears that cognitive skills during adolescence, as measured by scores on the Armed Forces Qualification Test (AFQT), are strongly associated with these gaps. ... I do not interpret these scores as measuring innate endowments but rather as reflecting the accumulated differences in family background and other influences that are manifested in test scores.6 If these results are given a causal interpretation, they suggest that actions that reduce the racial gap in test scores could also reduce the racial gap in intergenerational mobility.7
A commonly proposed explanation for racial gaps in achievement has been the relatively high rates of black children growing up with single mothers. I find evidence that for blacks, the lack of two parents in the household throughout childhood does indeed hamper upward mobility. However, patterns in downward mobility are unaffected by family structure for either blacks or whites. Importantly, the negative effects of single motherhood on blacks are only identified in the SIPP, where the entire marital history during the child’s life is available. This highlights the importance of access to data on family structure over long periods rather than a single snapshot at one point in time. I also find that black–white gaps in both upward and downward mobility are significantly smaller for those who have completed 16 years of schooling.8 ...
Finally, I should also note that the focus of this article is on relative mobility across generations and that the measures are relevant for answering questions concerning the progress of blacks relative to whites. It may also be interesting to consider measures of absolute mobility, but that is not the focus of this article. ...[read more]...
Posted by Mark Thoma on Thursday, May 8, 2014 at 08:24 AM in Economics, Income Distribution |
Posted by Mark Thoma on Thursday, May 8, 2014 at 12:06 AM in Economics, Links |
Tim Haab at Environmental Economics:
Tax Miles? I still think I have a better idea...:
The California Legislature is looking at a voluntary program that would tax motorists for every mile they drive.
KCAL9’s Bobby Kaple reports that Sen. Mark DeSaulnier, D-Concord, introduced a bill to test out the vehicle miles traveled (VMT) tax because the state’s gas tax was no longer bringing in the revenue it used to due to people driving more fuel efficient vehicles. [via losangeles.cbslocal.com]
It's been almost exactly 7 years now (May 8, 2007) and people still aren't listening to me.
Taxing miles creates perverse incentives for fuel efficiency. ... In other words, a mileage tax increases the implicit tax per gallon the more fuel efficient the car. Now granted, with higher mpg you use fewer gallons to drive an equivalent number of miles, and in the end, everyone driving 100 miles will pay the same tax. And from a revenue perspective, that might be OK. But there might be a way to kill more birds with one stone.
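The arithmetic behind that perverse incentive is easy to check (the $0.02-per-mile rate below is a hypothetical number, not California's actual proposal):

```python
# A flat per-mile tax implies a per-gallon burden of (tax per mile) x mpg,
# so the implied tax per gallon of fuel burned rises with fuel efficiency.
# The tax rate is hypothetical.

tax_per_mile = 0.02  # dollars per mile driven

implied = {mpg: tax_per_mile * mpg for mpg in (20, 30, 50)}
# Every car pays the same tax_per_mile * 100 = $2.00 per 100 miles, but the
# 50 mpg car's burden works out to far more per gallon burned than the
# 20 mpg car's, blunting the incentive to economize on fuel.
for mpg, per_gallon in implied.items():
    print(f"{mpg} mpg: ${per_gallon:.2f} implied tax per gallon")
```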
As I have written a number of times, a more straightforward proposal is to simply raise the gas tax. Raising the gas tax accomplishes a number of things: 1) it raises revenue, 2) it discourages miles driven, and 3) it increases the incentive for higher fuel efficiency. ...
It's really simple. Why worry about complicated mileage programs? The gas tax infrastructure is in place. Raise the gas tax and meet multiple public policy and economic goals simultaneously.
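The arithmetic behind Haab's point can be sketched in a few lines. The tax rates and trip length below are illustrative assumptions, not figures from the California proposal:

```python
# Hypothetical rates for illustration only -- not from the actual bill.
VMT_TAX_PER_MILE = 0.05    # assumed $0.05 per mile driven
GAS_TAX_PER_GALLON = 0.50  # assumed $0.50 per gallon
MILES = 100                # common trip length for comparison

for mpg in (20, 40):
    gallons = MILES / mpg
    vmt_total = VMT_TAX_PER_MILE * MILES      # identical for every car
    vmt_per_gallon = vmt_total / gallons      # rises with fuel efficiency
    gas_total = GAS_TAX_PER_GALLON * gallons  # falls with fuel efficiency
    print(f"{mpg} mpg: VMT tax ${vmt_total:.2f} "
          f"(${vmt_per_gallon:.2f}/gal); gas tax ${gas_total:.2f}")
```

Both cars pay the same $5.00 under the per-mile tax, but for the 40 mpg car that works out to twice the implied tax per gallon, which is exactly the perverse incentive against fuel efficiency the post describes; the gas tax, by contrast, rewards the efficient car.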
Posted by Mark Thoma on Wednesday, May 7, 2014 at 11:42 AM in Economics, Environment, Fiscal Policy |
This is from Janet Yellen's prepared testimony before the Joint Economic Committee (the actual speech is much longer than this extract):
The Economic Outlook, by Janet Yellen, FRB, May 7, 2014: Chairman Brady, Vice Chair Klobuchar, and other members of the Committee, I appreciate this opportunity to discuss the current economic situation and outlook along with monetary policy before turning to some issues regarding financial stability.
Current Economic Situation and Outlook The economy has continued to recover from the steep recession of 2008 and 2009. Real gross domestic product (GDP) growth stepped up to an average annual rate of about 3-1/4 percent over the second half of last year, a faster pace than in the first half and during the preceding two years. Although real GDP growth is currently estimated to have paused in the first quarter of this year, I see that pause as mostly reflecting transitory factors, including the effects of the unusually cold and snowy winter weather. With the harsh winter behind us, many recent indicators suggest that a rebound in spending and production is already under way, putting the overall economy on track for solid growth in the current quarter. One cautionary note, though, is that readings on housing activity--a sector that has been recovering since 2011--have remained disappointing so far this year and will bear watching.
Conditions in the labor market have continued to improve. ...
While conditions in the labor market have improved appreciably, they are still far from satisfactory. ...
Inflation has been quite low even as the economy has continued to expand. Some of the factors contributing to the softness in inflation over the past year, such as the declines seen in non-oil import prices, will probably be transitory. Importantly, measures of longer-run inflation expectations have remained stable. That said, the Federal Open Market Committee (FOMC) recognizes that inflation persistently below 2 percent--the rate that the Committee judges to be most consistent with its dual mandate--could pose risks to economic performance, and we are monitoring inflation developments closely.
Looking ahead, I expect that economic activity will expand at a somewhat faster pace this year than it did last year, that the unemployment rate will continue to decline gradually, and that inflation will begin to move up toward 2 percent. A faster rate of economic growth this year should be supported by reduced restraint from changes in fiscal policy, gains in household net worth from increases in home prices and equity values, a firming in foreign economic growth, and further improvements in household and business confidence as the economy continues to strengthen. Moreover, U.S. financial conditions remain supportive of growth in economic activity and employment.
As always, considerable uncertainty surrounds this baseline economic outlook. At present, one prominent risk is that adverse developments abroad, such as heightened geopolitical tensions or an intensification of financial stresses in emerging market economies, could undermine confidence in the global economic recovery. Another risk--domestic in origin--is that the recent flattening out in housing activity could prove more protracted than currently expected rather than resuming its earlier pace of recovery. Both of these elements of uncertainty will bear close observation. ...
While we have seen substantial improvements in labor market conditions and the overall economy since the financial crisis and severe recession, we recognize that more must be accomplished. Many Americans who want a job are still unemployed, inflation continues to run below the FOMC's longer-run objective, and work remains to further strengthen our financial system. I will continue to work closely with my colleagues and others to carry out the important mission that the Congress has given the Federal Reserve. ...
Posted by Mark Thoma on Wednesday, May 7, 2014 at 09:51 AM in Economics, Fed Speeches, Monetary Policy |
Posted by Mark Thoma on Wednesday, May 7, 2014 at 12:06 AM in Economics, Links |
Why Economists Are Finally Taking Inequality Seriously, by Mark Thoma: In economics, the examination of questions surrounding the distribution of income has been ignored or pushed into the background in the academic literature. However, rising inequality over the last several decades coupled with recent work by Thomas Piketty – which has had a surprisingly large and positive reception by the public – has propelled these questions to the forefront of economic analysis. ...
Posted by Mark Thoma on Tuesday, May 6, 2014 at 08:47 AM in Economics, Income Distribution |
From an interview with Mark Gertler:
EF : Is there anything you’ve learned from the Great Recession about the role of finance that you weren’t aware of before?
Gertler: I liken the crisis to 9/11; that is, there was an inkling that something bad could happen. I think there was some sense it was going to be associated with all the financial innovation, but just like with 9/11, we couldn’t see it coming. When we look back, we can piece everything together and make sense of things, but what we didn’t really understand was the fragility in the shadow banking system, how it made the economy very vulnerable. I always think of the Warren Buffett line, “You don’t know who’s naked until you drain the swimming pool.” That’s sort of what happened here.
I think when we look back on the crisis, we can explain most of what happened given existing theory. It’s just we couldn’t see it at the time.
EF : What do you think is the best explanation for the policies that were pursued?
Gertler: At the time, I think it was partly unbridled belief in the market — that financial markets are competitive markets, and they ought to function well, not taking into account that any individual is just concerned about his or her welfare, not about the market as a whole or the exposure of the market as a whole. And so you had this whole system grow up without any outside monitoring by the government. It just had individuals making these trades and making these bets; nobody was adding everything up and understanding the risk exposure. And there was this attitude that we ought to be inclusive about homeownership — that was going on as well. Plus, complacency set in. We had the Great Moderation of the 1980s and 1990s, and we all thought we’d solved the major problems in macroeconomics. There were some prominent macroeconomists saying, “Look, we shouldn’t be wasting our time on these conventional issues; we’ve already solved them.” That led to most people just being asleep at the wheel.
Much more here.
Posted by Mark Thoma on Tuesday, May 6, 2014 at 08:46 AM in Economics, Monetary Policy |
Posted by Mark Thoma on Tuesday, May 6, 2014 at 12:06 AM in Economics, Links |
Antonio Fatás (each of the four points below is explained in detail in the original post):
Refocusing economics education: Via Mark Thoma I read an interesting article about how the mainstream economics curriculum needs to be revamped (Wren-Lewis also has some nice thoughts on this issue).
I am sympathetic to some of the arguments made in those posts and the need for some serious rethinking of the way economics is taught, but I would put the emphasis on slightly different arguments. First, I am not sure the recent global crisis should be the main reason to change the economics curriculum. Yes, economists failed to predict many aspects of the crisis, but my view is that it was not because of a lack of tools or understanding. We have enough models in economics that explain most of the phenomena that caused and propagated the global financial crisis. There are plenty of models where individuals are not rational, where financial markets are driven by bubbles, with multiple equilibria,... that one can use to understand the last decade. We do have all these tools, but as economics teachers (and researchers) we need to choose which ones to focus on. And here is where we failed, not only before and during the crisis but also earlier. Why aren't we focusing on the right models or methodology? Here is my list of mistakes we make in our teaching, which might also reflect on our research:
#1 Too much theory, not enough emphasis on explaining empirical phenomena. ...
#2 Too many counterintuitive results. Economists like to teach things that are surprising. ...
#3 The need for a unified theory. ...
#4 We teach what our audience wants to hear. ...
I also believe the sociology within the profession needs to change.
Posted by Mark Thoma on Monday, May 5, 2014 at 12:39 PM in Economics, Macroeconomics, Methodology, Universities |
Surprise! Republicans are opposed to Obamacare. But the lengths they'll go to to validate their beliefs in the face of evidence to the contrary is startling:
Inventing a Failure, by Paul Krugman, Commentary, NY Times: On Thursday, House Republicans released a deliberately misleading report on the status of health reform, crudely rigging the numbers to sustain the illusion of failure in the face of unexpected success. Are you shocked?
You aren’t, but ... the fact that this has become standard operating procedure for a major party bodes ill for America’s future.
About that report: The really big policy news of 2014, at least so far, is the spectacular recovery of the Affordable Care Act from its stumbling start... This is a problem for Republicans, who have bet the ranch on the proposition that health reform is an unfixable failure. ... How can they respond to good news?
Well,... they have ... continued to do what they’ve been doing ever since the news on Obamacare started turning positive: sling as much mud as possible at health reform, in the hope that some of it sticks. Premiums were soaring, they declared, when they have actually come in below projections. Millions of people were losing coverage, they insisted, when the great bulk of those whose policies were canceled simply replaced them with new policies. The Obama administration was cooking the books, they cried (projection, anyone?). And, of course, they keep peddling horror stories about people suffering terribly from Obamacare, not one of which has actually withstood scrutiny.
Now comes the latest claim — that many of the people who signed up for insurance aren’t actually paying their premiums. ... Previous attacks on Obamacare were pretty much fact-free; this time the claim was backed by an actual survey purporting to show that a third of enrollees hadn’t paid their first premium.
But the survey was rigged. (Are you surprised?) ...
So why are Republicans doing this? Sad to say, there’s method in their fraudulence.
First of all, it fires up the base. ... Beyond that, the constant harping on alleged failure works as innuendo even if each individual claim collapses in the face of evidence. ...
So Republicans are spreading disinformation about health reform because it works, and because they can — there is no sign that they pay any political price when their accusations are proved false.
And that observation should scare you. What happens to the Congressional Budget Office if a party that has learned that lying about numbers works takes full control of Congress? What happens if it regains the White House, too? Nothing good, that’s for sure.
Posted by Mark Thoma on Monday, May 5, 2014 at 12:42 AM in Economics, Health Care, Politics |
Britain’s economic growth is not a sign that austerity works, by Lawrence Summers, Commentary, Washington Post: The British economy has experienced the most rapid growth in the Group of 7 over the past several months. ... Naturally enough, many have seized on Britain’s strong performance as evidence vindicating the austerity strategy that the country has followed since 2010 and rejecting the secular-stagnation idea that lack of demand constrains industrial growth over the medium term. ... Unfortunately, properly interpreted, the British experience refutes austerity advocates and confirms Keynes’s warning about the dangers of indiscriminate budget-cutting in the midst of a downturn. ...
Britain’s growth reflects a combination of the depth of the hole it found itself in, the moderation in the trend toward deeper and deeper austerity and the effects of possibly bubble-creating government loans. It may be better for the citizens of Britain than any alternative. But it certainly should not be seen as any kind of inspiration for other countries. ...
Posted by Mark Thoma on Monday, May 5, 2014 at 12:33 AM in Economics, Fiscal Policy, Politics |
Difficult Labor Report, by Tim Duy: The headline numbers from the April employment report are at first blush a challenge to the Fed's low rate commitment. One doesn't have to dig much deeper into the data, however, to see that the near term implications are minimal as the Fed maintains its strong focus on measures of labor market slack. Still, the rapid drop in unemployment - if it continues - will leave policymakers increasingly anxious that their one-way bet on labor market slack will quickly turn sour.
Nonfarm payrolls grew by 288k, well above expectations of 215k. While this number pushes the three-month moving average higher, the longer-term trend remains the same:
Maybe this is the month the acceleration begins. Maybe not. Either way, the report supports the dismissal of the weak first quarter growth numbers (now tracking in negative territory) as transitory. Just as has been the case for the last three years, there is nothing here to suggest a dramatic change in the pace of underlying economic activity.
The unemployment rate decline was a bit more interesting as it collapsed to 6.3% on the back of falling labor force participation:
The downward trend accelerated in the second half of 2013, pushing us ever closer to levels traditionally associated with greater inflationary pressures and, with those pressures, tighter monetary policy. Policymakers, however, appear to remain content to dismiss the unemployment rate in favor of a wider range of labor market indicators that suggest plenty of slack left in the economy. Federal Reserve Chair Janet Yellen's current four favorites:
The wage story is, in my opinion, the key. It is hard to argue against the labor slack story when employees can't push wages significantly higher. That alone should be enough to stay the Fed's hand. And if it isn't enough, they can always draw additional comfort from the inflation figures:
Inflation is, at best, only in the process of bottoming.
All that said, policymakers will be a little anxious that they are too quickly dismissing traditional metrics that would indicate they should be adjusting their inflation forecasts higher in light of the unemployment decline. As I am relatively confident will be much discussed this week, variants of the Taylor rule suggest that policymakers should already be raising rates:
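That Taylor-rule arithmetic can be sketched as follows. The coefficients, natural rate, and inflation reading below are my own illustrative assumptions, not the specific rule variants Duy has in mind:

```python
# A minimal Taylor (1993)-style rule, using an unemployment gap scaled by an
# Okun's-law coefficient. All parameter values are illustrative assumptions.
def taylor_rate(inflation, unemployment, target_inflation=2.0,
                neutral_real_rate=2.0, natural_unemployment=5.5,
                okun_coeff=2.0):
    """i = r* + pi + 0.5*(pi - pi*) + 0.5*okun*(u* - u)."""
    return (neutral_real_rate + inflation
            + 0.5 * (inflation - target_inflation)
            + 0.5 * okun_coeff * (natural_unemployment - unemployment))

# Assumed inflation of ~1.5% combined with the 6.3% unemployment rate
# cited above still yields a prescribed rate well above zero.
prescribed = taylor_rate(inflation=1.5, unemployment=6.3)
print(f"Prescribed funds rate: {prescribed:.2f}%")
```

Under these assumed inputs the rule prescribes roughly 2.45 percent while the actual funds rate sat near zero, which is the sense in which such rules can say policymakers "should already be raising rates" even with inflation below target.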
In this environment, policymakers will increasingly worry about the policy lags. They will want to hold rates low, but the further unemployment drops, the more they will fear that they risk falling behind the curve - that by the time wage growth accelerates, inflationary pressure will already be well established. This is especially the case if they view the 2% target as a ceiling. Hence I remain concerned that the risk is that policy turns sharply tighter relative to current expectations.
I am also challenged to see why I should not expect the now-infamous dots in the summary of economic projections to be pulled forward on the basis of the falling unemployment rate. I am looking forward to the next FOMC meeting for that alone.
I emphasize, however, that any substantially tighter policy remains only a "risk," not a baseline. I anticipate that in her Congressional testimony this week, Yellen will emphasize the alternative measures of labor market slack and the Fed's expectation that policy rates will remain well below "normal" rates for a protracted period. As a general rule one report doesn't change policy.
Bottom Line: Overall, the general contours of the employment report suggest reason to (very) modestly bring forward expected rates hikes, but little to suggest any dramatic change to the Fed's reaction function overall. Policymakers, however, will worry that the current reaction function is overly dependent on dismissing the unemployment rate as an indicator of inflationary pressure. And there is a risk that they will move quicker than expected if that bet starts to sour. Risk, not baseline.
Posted by Mark Thoma on Monday, May 5, 2014 at 12:15 AM in Economics, Fed Watch, Monetary Policy, Unemployment |
Posted by Mark Thoma on Monday, May 5, 2014 at 12:03 AM in Economics, Links |
Diane Coyle (this is also in today's links):
The mainstream economics curriculum needs an overhaul, Vox EU: One of the delayed consequences of the financial crisis is a widespread and apparently growing desire to change how economics is taught. Students in a number of countries, including vocal groups in Chile and the UK, have recently intensified the demand for reform. One recent example is a report from the Post-Crash Economics Society at the University of Manchester (Post-Crash Economics Society 2014).
Professor Wendy Carlin of University College London is leading an international group of academics in developing a new open-source course for introductory economics (funded by the Institute for New Economic Thinking). It contrasts with conventional courses in:
- Emphasising dynamics, instability, institutions, and environmental questions; and
- Integrating new results and empirical evidence.
The question of curriculum reform was also the subject of a special session at the recent Royal Economic Society conference.
Some of the issues were first raised by contributors to a VoxEU debate in 2012 entitled “What’s the use of economics?” (see also Coyle 2012). The debate noted a widespread belief that the profession’s credibility was at stake. If the core economics courses did not respond to the challenges that the Global Crisis posed, economics as a whole would suffer a significant loss of credibility.
Common reform themes
Common themes in the debate at that earlier stage were the need for students to have:
- More exposure to economic history and the history of thought;
- More practical hands-on experience with data;
- Better teaching of communication skills; and
- Some exposure to new developments in economic research.
Overall the thrust was for a less narrow and reductive approach to economics than has become the norm in undergraduate courses. Both academics and employers of graduate economists agreed on these needs.
In the UK, a working group set out its recommendations in a statement of principles (Coyle 2013). It concluded that undergraduate courses should become more pluralistic and should include:
- Some economic history, which could be integrated into existing courses, especially macroeconomics;
- An introduction to other disciplinary approaches;
- Possibly ‘tasters’ of the frontiers of academic economic research with potential policy application, such as behavioural economics, institutional economics, and post-crisis developments in financial economics;
- Awareness of some of the methodological debates in economics;
- Confronting all theoretical frameworks with evidence and encouraging a healthy scepticism towards all assertion from whatever source.
What do these general principles mean in practice?
Even a relatively minimal interpretation implies a substantial amount of change in many undergraduate economics programmes. In many universities, the core curriculum settled into a predictable rut. This interacted with two factors: (i) incentives for academic research to focus on technical increments to knowledge – contributions aimed solely at professional peers, and (ii) rising teaching loads and student numbers stemming from pressures on university finances.
Despite the great interest in reform among economists teaching undergraduate courses, change will take some time as these various barriers are overcome.
There is probably the widest agreement about changes such as:
- Re-introducing elements of economic history into core modules;
- Incorporating some issues on the frontiers of research into undergraduate teaching;
- Encouraging inter-disciplinary interest; and
- Ensuring students are taught key skills such as data handling and good communication.
A number of economists have commented on the desirability of updating the curriculum to reflect interesting areas of research and ‘real world’ examples (see e.g. Seabright 2013).
More disagreement exists when it comes to the question of the character of economics itself, and the extent to which the experience of the past six years calls the mainstream of the subject into question. Andrew Haldane, newly appointed as chief economist of the Bank of England, argues: “It is time to rethink some of the basic building blocks of economics.” (Post-Crash Economics Society 2014: 4.) Student groups campaigning for reform would clearly agree on the need for a radical reinterpretation of what should be in the core courses or modules.
There is some overlap between their views and those of the mainstream. For example, the UK working group cited above also recommended greater pluralism in economics: “Hostility towards other approaches is the antithesis of a dynamic self-critical discipline that is genuinely seeking to discover new and better ways of understanding the world.”
But it added: “That said, students should not be left unnecessarily confused or with the impression that all schools of thought have an equal standing, or that ‘anything goes’. There should be a balance between a) providing a coherent ‘workhorse’ framework for intellectual development and building analytical skills, and b) the candid highlighting of uncertainty, the limits of economic knowledge and the existence of serious alternative views and approaches.”
In a recent blog post, Roger Farmer of UCLA made a similar point: “My advice to students is this. […] [T]ake the time to absorb those ideas that are in the mainstream. The very best mainstream economists were the radical students who questioned authority when they were undergraduates. It is those economists who you must engage if you are to make meaningful changes that will advance our understanding.” (Farmer 2014.)
Shortcomings of the critiques
However, it is clear that the campaigning students have an incorrectly broad interpretation of the ‘neoclassical mainstream’ and a narrow interpretation of ‘pluralism’. For example, the recent report from the Post-Crash Economics Society fails to recognise the breadth of the courses available in its economics department (which it describes unfairly as a ‘monoculture’); and is itself narrow-minded in dismissing the value for economics students of courses that happen to be taught in the other social sciences and the business school.
The report also mistakenly equates pluralism with the specific views of heterodox economics, rather than the open-minded willingness to analyse economic issues from a range of alternative perspectives (including heterodox ones). There is an obvious logical fallacy in the labelling – and dismissing – of any non-heterodox views as the ‘neoclassical mainstream’.
Simon Wren-Lewis of Oxford University has expressed sympathy with the instincts of the student group but describes their conclusions as “fundamentally misguided”:
“I think it is true that economics as a discipline has tried too hard to emphasise that it is an objective, politically neutral discipline, thereby underplaying value judgements when it makes them. Worse still, sometimes heavily value laden ideas like the importance of Pareto optimality are portrayed as being value neutral, which is clearly nonsense. […] Yet the idea that it should be possible to build a science of human behaviour which is independent of ideology or politics is a noble ideal, and one which has been partly achieved. We may need (and are getting) more political economy, in the sense of recognising that economics works alongside and interacts with social and political forces, but I do not think we need more partisan economics.” (Wren-Lewis 2014.)
Indeed, the ‘science of human behaviour’ in the economic domain has made huge strides during the past 20 years or so, in the advances in applied microeconomics. More data, advances in econometric techniques, new methodologies such as randomised control trials and field experiments, interdisciplinary work with psychology in particular, the revival of economic history, and urban economics, have all contributed to scientific advance (for a survey, see Coyle 2007). This progress is certainly ‘mainstream’; it should be celebrated, and students should be agitating to be taught more about it.
Finally, it should be added that – not least because of such advances in much recent research in economics – there is by no means universal agreement among academic economists that substantial curriculum reform is needed. This reflects their view – in contrast to Andrew Haldane’s – that the basic building blocks of the subject remain solid.
This is therefore a debate with some distance to go, not least because of the international character of the academic discipline. It seems highly unlikely, though, that undergraduate economics courses will not have changed considerably in character five or ten years from now.
Coyle, Diane (2007), The Soulful Science, Princeton University Press.
Coyle, Diane (ed.) (2012), What’s the Use of Economics? Teaching the Dismal Science after the Crisis, London Publishing Partnership, September.
Coyle, Diane (2013), “Teaching Economics After the Crisis: Report from the Steering Group”, Royal Economic Society Newsletter, 161, April.
Farmer, Roger (2014), “Teaching Economics”, My Economic Window, 23 April.
Post-Crash Economics Society (2014), “Economics, Education and Unlearning: Economics Education at the University of Manchester”, April.
Seabright, Paul (2013), “Microeconomics for All”, Project Syndicate, 5 December.
Wren-Lewis, Simon (2014), “When economics students rebel”, Mainly Macro, 24 April.
Posted by Mark Thoma on Sunday, May 4, 2014 at 07:16 AM in Economics |
Have there been any "genuine innovations in thought in the past fifty years, especially in the social and historical sciences?" This is from Dan Little:
The big ideas: The deluge of changes that shook Europe around 1800 -- the making of the modern world -- brought with them an explosion of big new ideas, new ways of framing the social, historical, and natural world which we inhabit. Darwin, Freud, Marx, Walras, Carnot, Poincaré, Einstein -- each brought forward one or two foundational and iconoclastic ideas in terms of which to understand some very profound but mysterious features of the world. We think about the world differently because of their originality. And their categories, once shockingly strange, now seem like pure common sense. And the stock of ideas and theories we now have for understanding the social and natural world is vastly richer than it was in earlier epochs.
One theme that runs through these thinkers is the binary of order and disorder, rule and randomness. Classical physics offered a view of the world that stressed the fundamental orderliness of nature -- the idea that natural phenomena were mathematical and law-governed. Much of the intellectual ferment in nineteenth century and early twentieth century physics stemmed from the insight that randomness and the statistical properties of ensembles were even more basic (Carnot, thermodynamics; Brownian motion) and that the laws of physics were stranger than we thought (Poincaré on non-Euclidean space, Einstein on special relativity, Schrödinger on quantum physics). (Peter Galison's fascinating book, Einstein's Clocks and Poincare's Maps: Empires of Time, sheds light on the conceptual revolutions of Poincaré and Einstein.)
In a similar vein, Darwin brought the workings of random variation within a population into center stage in biology. Instead of species with fixed properties, Darwin described a process of evolution through which the properties of a species change over very long stretches of time. Rather than assuming that the features of an organism have been fine-tuned by an intelligent designer, Darwin conceived of a myopic process of adaptation and selection through which functionality could emerge without a designer. (Jonathan Howard's Darwin: A Very Short Introduction is a good account of Darwin's innovation.)
Freud and Marx brought a different sort of paradigm shift to the human sciences. Human behavior is not a transparent consequence of instrumental rationality. Rationality and full self-knowledge are not the primary keys to human action. Instead, people have hidden and only partially available thoughts, aversions, and wants, and it is the work of psychoanalysis to uncover these hidden ideas and thoughts. The self is contradictory and hidden to the actor. (Here is a nice essay by Donald Carveth on the relevance of psychoanalysis for social theory.)
And Marx disputed the assumptions of collective rationality and optimality that were embodied in the political economy inspired by Smith and Ricardo -- the idea that a person-independent market served to solve the most basic social problems behind the backs of the actors. In place of this assumption of neutral impersonality Marx argued that modern society reflects a fundamental opposition between groups of people based on their property and economic interests. And he argued that the behavior of the state and its primary institutions could best be understood as the expression of the interests of the dominant class. (Two very different accounts of Marx's intellectual system include Jon Elster, Making Sense of Marx and David Harvey's A Companion to Marx's Capital.)
These are indeed big and important ideas. We understand the real workings of the natural and social worlds better because of these revolutions in thought. We can ask two important questions. First, why did the modern world -- industrial revolution, democratic movements, revolutions in physics and biology -- stimulate such a rich flourishing of innovation and insight, as contrasted to the ancient world or the medieval world?
The beginnings of an answer to this question can be found in the velocity and manifest importance of the changes that western Europe began to experience from 1750 forward -- the French Revolution, the advent of the factory system in England, the spread of revolutionary ideas from anarchists and socialists into popular movements, and other profound changes. Thomas Carlyle was one of the creative thinkers who was forced to find a new vocabulary to describe industrial cities and factories in the early nineteenth century (Past and Present); these social complexes barely existed in the previous century.
The second important question is even more interesting -- where are those new ideas today? Has the twenty-first century yet created any genuinely new thinking to compare with the nineteenth century period of intellectual ferment?
I am tempted to answer this second question in the negative. We have witnessed enormous technological advancement in the past fifty years. Who, in 1964, could have imagined the connectivity created by the Internet and Google, or the extension of human cognition enabled by a connected iPad? But have we encountered genuinely innovative and insightful new ways of organizing our world in thought, about either nature or society? Perhaps not. The social sciences have certainly advanced in the half century of research that has transpired since 1964; but I'm not sure that I would say that there are fundamentally new conceptions of social reality in play. (Perhaps the ontology of assemblage might compete for a spot on the stage.) It seems rather that our frameworks of thought have remained somewhat static for the past fifty years.
It is possible that we should not expect big ideas at this point in our history, on the grounds that we now have a reasonably good understanding of both the natural and the social world. If we made that assumption, then we should expect long periods of incremental growth, expansion of knowledge of detail, along with combination and recombination of existing theories and concepts, but no major new breakthroughs.
There will be objections to this line of thought. One is that earlier epochs may have been more innovative and paradigm-breaking than I suggest. Stephen Greenblatt's recent book on Epicurus and Lucretius certainly makes a case for profound intellectual innovation in the ancient world (The Swerve: How the World Became Modern; link). So it is possible that the impression of radical intellectual change starting in the nineteenth century is just a consequence of the foreshortening of history; perhaps the ancients were as resourceful and creative in their thinking as the moderns were.
Another objection is that each of the thinkers I have mentioned had predecessors, including Darwin's contemporary Alfred Russel Wallace and the many versions of socialism that competed with Marx. Was Marx so innovative after all, when compared to Proudhon or Blanqui? So one should not over-emphasize the point of absolute originality. But taking all of this into account, it seems inescapable that human conceptual and imagination space took a major step forward between 1850 and 1925. And, by contrast, the past fifty or seventy-five years seem rather tame when it comes to big ideas.
What thoughts do readers have about genuine innovations in thought in the past fifty years, especially in the social and historical sciences? Are there intellectual or conceptual discoveries that strike you as being genuinely transformative to the way we understand the contemporary world?
(T. J. Clark makes a number of interesting points about the influence of modern social change on the representational arts in Farewell to an Idea: Episodes from a History of Modernism. In an earlier post I discussed the connections that seem to exist between social conditions of modernity and the forms that modern art took during those decades; link.)
Posted by Mark Thoma on Sunday, May 4, 2014 at 07:16 AM in Economics |
Posted by Mark Thoma on Sunday, May 4, 2014 at 12:03 AM in Economics, Links |
Pareto, Inequality and Government Debt: Or is economics inherently right wing?
I noted in passing in an earlier post that Pareto efficiency was obviously not a value-free criterion. So those who argue that economists should only look for Pareto improvements – changes where no one is made worse off – are making a value judgement. One, and only one, of its implicit normative assumptions is that inequality does not matter. For others see, for example, Elizabeth Anderson (pdf, HT Anon). ...
The only possibly original point I wanted to make here is that the absurdity of restricting policies to Pareto improvements becomes immediately apparent if we think about government debt. Measures to reduce currently high levels of debt will almost certainly make current generations worse off, because they will have to pay the taxes (or whatever) to get debt down. Yet I do not often hear people arguing that we have to let debt stay high because the government can only implement Pareto improvements. ...
Why is there this emphasis on only looking at Pareto improvements? I think you would have to work quite hard to argue that it was intrinsic to economic theory - it would be, and is, quite possible to do economics without it. (Many economists use social welfare functions.) But one thing that is intrinsic to economic theory is the diminishing marginal utility of consumption. Couple that with the idea of representative agents that macro uses all the time (who share the same preferences), and you have a natural bias towards equality. Focusing just on Pareto improvements neutralises that possibility. Now I mention this not to imply that the emphasis put on Pareto improvements in textbooks and elsewhere is a right wing plot - I do not know enough to argue that. But it should make those (mainstream or heterodox) who believe that economics is inherently conservative pause for thought.
[There is quite a bit more detail in the original.]
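Wren-Lewis's point about diminishing marginal utility can be made concrete with a toy calculation. The sketch below is illustrative, not from his post: it assumes log utility (one standard diminishing-marginal-utility form) and a simple utilitarian social welfare function, and shows that a transfer from rich to poor raises total welfare while failing the Pareto test, because the rich agent is made worse off.

```python
# Toy illustration: with diminishing marginal utility (log utility here,
# an assumed functional form), a rich-to-poor transfer raises a simple
# utilitarian social welfare function but is not a Pareto improvement.
import math

def welfare(incomes):
    """Utilitarian SWF: sum of log (diminishing marginal) utilities."""
    return sum(math.log(y) for y in incomes)

before = [100.0, 20.0]  # incomes of a rich and a poor agent
after = [90.0, 30.0]    # transfer 10 from rich to poor

print(welfare(after) > welfare(before))            # total welfare rises
print(all(a >= b for a, b in zip(after, before)))  # but not Pareto-improving
```

Restricting attention to Pareto improvements rules the transfer out by construction, which is exactly the "natural bias towards equality" being neutralized.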
Posted by Mark Thoma on Saturday, May 3, 2014 at 09:31 AM in Economics, Equity, Fiscal Policy |
The EPI's Heidi Shierholz:
Number of Missing Workers Jumps to All-Time High: ... The biggest drop in LFPR in April was among men under the age of 20. To my knowledge, data on unemployment insurance exhaustions by age don’t exist, but it is unlikely that young workers are a big proportion of exhaustions. This means that the April drop in labor force participation is likely not being driven by the expiration of federal unemployment insurance benefits last December as some have suggested, but simply by the weak labor market.
There is currently an all-time high of 6.2 million missing workers (potential workers who are neither working nor actively seeking work due to the weak labor market). Almost a quarter of them (1.4 million) are under age 25. The ... unemployment rate for young workers would be 18.4 percent instead of 12.8 percent if the missing young workers were in the labor force looking for work and thus counted as unemployed.
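The counterfactual rate in that last sentence is simple arithmetic: add the missing workers to both the unemployed count and the labor force. A minimal sketch, in which the size of the under-25 labor force is an assumed figure chosen to be consistent with the excerpt's 12.8 and 18.4 percent rates (it is not a number from the EPI report):

```python
# Counterfactual unemployment rate if "missing workers" were counted as
# unemployed: u' = (U + M) / (LF + M).
def counterfactual_rate(unemployed, labor_force, missing):
    """Rate if the missing workers were in the labor force, unemployed."""
    return (unemployed + missing) / (labor_force + missing)

labor_force = 20.4e6               # assumed under-25 labor force (illustrative)
unemployed = 0.128 * labor_force   # consistent with the 12.8% official rate
missing = 1.4e6                    # missing young workers, from the excerpt

print(round(100 * counterfactual_rate(unemployed, labor_force, missing), 1))
# → 18.4
```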
For a complete picture of the labor market prospects facing the new cohort of young adults graduating from high school and college this spring, see the Class of 2014 report released yesterday. It includes, for example, a detailed discussion of the finding that there is little evidence that today’s missing young workers are “sheltering in school”.
Posted by Mark Thoma on Saturday, May 3, 2014 at 02:28 AM in Economics, Unemployment |
Posted by Mark Thoma on Saturday, May 3, 2014 at 12:03 AM in Economics, Links |
From the Journal of Economic Perspectives' Symposium on Big Data:
"Big Data: New Tricks for Econometrics," by Hal R. Varian: Computers are now involved in many economic transactions and can capture data associated with these transactions, which can then be manipulated and analyzed. Conventional statistical and econometric techniques such as regression often work well, but there are issues unique to big datasets that may require different tools. First, the sheer size of the data involved may require more powerful data manipulation tools. Second, we may have more potential predictors than appropriate for estimation, so we need to do some kind of variable selection. Third, large datasets may allow for more flexible relationships than simple linear models. Machine learning techniques such as decision trees, support vector machines, neural nets, deep learning, and so on may allow for more effective ways to model complex relationships. In this essay, I will describe a few of these tools for manipulating and analyzing big data. I believe that these methods have a lot to offer and should be more widely known and used by economists. Full-Text Access | Supplementary Materials
Posted by Mark Thoma on Friday, May 2, 2014 at 10:05 AM in Econometrics, Economics |
Dean Baker on the Jobs Report:
Economy Adds 288,000 Jobs in April, Sharp Drop in Labor Force Leads to Plunge in Unemployment: The economy added 288,000 jobs in April. With upward revisions to the prior two months’ data, this brings the three-month average to 234,000. This is the highest three-month total since the economy added 829,000 jobs in the first three months of 2012. The household survey showed the unemployment rate falling from 6.7 percent in March to 6.3 percent in April, but the drop was entirely the result of 806,000 people leaving the labor force. Employment, as measured in the household survey, actually fell by 73,000. The employment-to-population ratio (EPOP) remained unchanged at 58.9 percent. ...
On the whole, this is a very positive report. While the April job growth was likely inflated as a result of bad weather in prior months, the three month average is still near the peaks for the recovery.
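Baker's household-survey point is worth working through: the unemployment rate can fall even when measured employment drops, so long as the labor force shrinks faster. A minimal sketch, where the March labor-force level is an assumed round figure (not from the report) and the month-to-month changes are the ones quoted above:

```python
# Unemployment rate: u = (labor force - employed) / labor force.
# The March labor-force level is an illustrative assumption; the
# 806,000 and 73,000 changes are from the excerpt.
def unemployment_rate(labor_force, employed):
    return (labor_force - employed) / labor_force

lf_march = 156.2e6                  # assumed March labor force
emp_march = lf_march * (1 - 0.067)  # consistent with 6.7% in March

lf_april = lf_march - 806_000       # 806,000 left the labor force
emp_april = emp_march - 73_000      # household-survey employment fell 73,000

print(round(100 * unemployment_rate(lf_april, emp_april), 1))
# → 6.3, matching the reported April rate
```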
Update: Jared Bernstein:
the decline in unemployment is entirely due not to job creation, but to labor force decline (employment actually fell slightly in the household survey). This important and closely watched indicator—the labor force participation rate—also fell 0.4 percentage points, reversing recent gains and returning the LFPR to the low where it stood at the end of last year, a level we haven’t seen since the late 1970s. Though part of the recent decline in the participation rate reflects our aging demographics, more than half in my judgment is due to weak demand.
The BLS noted that the large decline in the labor force—about -800,000—was likely due to fewer entrants as opposed to more leavers. And this is a volatile number, as I stress below. But neither can it be dismissed out of hand: it has been essentially stuck at historically low levels for a while now.
On the other hand, the payroll report shows pretty decent labor demand/job creation. As noted, the 288,000 jobs beat expectations, and gains for the prior two months were revised up by 36,000. Job gains occurred across most industries, with 67% of private industries expanding employment, the largest share in over two years (government employment was also up 15,000, almost all due to local government; federal employment was down slightly).
Posted by Mark Thoma on Friday, May 2, 2014 at 09:20 AM in Economics, Unemployment |
Why didn't fiscal policy makers listen to economists?:
Why Economics Failed, by Paul Krugman, Commentary, NY Times: On Wednesday, I wrapped up the class I’ve been teaching..: “The Great Recession: Causes and Consequences.” ...I found myself turning at the end to an agonizing question: Why, at the moment it was most needed and could have done the most good, did economics fail?
I don’t mean that economics was useless to policy makers. ... While ... few economists saw the crisis coming..., since the fall of Lehman Brothers, basic textbook macroeconomics has performed very well. ...
In what sense did economics work well? Economists who took their own textbooks seriously quickly diagnosed the nature of our economic malaise: We were suffering from inadequate demand ... and a depressed economy. ...
And the diagnosis ... had clear policy implications: ...this was no time to worry about budget deficits and cut spending... We needed more government spending, not less, to fill the hole left by inadequate private demand. But... Since 2010, we’ve seen a sharp decline in discretionary spending and an unprecedented decline in budget deficits, and the result has been anemic growth and long-term unemployment on a scale not seen since the 1930s.
So why didn’t we use the economic knowledge we had?
One answer is that most people find the logic of policy in a depressed economy counterintuitive. ... And even supposedly well-informed people balk at the notion that simple lack of demand can wreak so much havoc. Surely, they insist, we must have deep structural problems, like a work force that lacks the right skills; that sounds serious and wise, even though all the evidence says that it’s completely untrue.
Meanwhile, powerful political factions ... whose real goal is dismantling the social safety net have found promoting deficit panic an effective way to push their agenda. And such people have been aided and abetted by what I’ve come to think of as the trahison des nerds — the willingness of some economists to come up with analyses that tell powerful people what they want to hear, whether it’s that slashing government spending is actually expansionary, because of confidence, or that government debt somehow has dire effects on economic growth even if interest rates stay low.
Whatever the reasons basic economics got tossed aside, the result has been tragic. ... We have, all along, had the knowledge and the tools to restore full employment. But policy makers just keep finding reasons not to do the right thing.
Posted by Mark Thoma on Friday, May 2, 2014 at 12:24 AM in Economics, Fiscal Policy, Macroeconomics, Politics |
Posted by Mark Thoma on Friday, May 2, 2014 at 12:03 AM in Economics, Links |
Busy day ... quick one from Tim Noah:
Another Perspective on U.S. Economic Mobility, by Timothy Noah, Washington Wire: My fellow Think Tank-er Benjamin Domenech, in formulating a conservative response to Thomas Piketty, wrote that “Several recent studies have shown that U.S. economic mobility is very good.”
But these studies aren’t terribly useful. They define “mobility” as what happens to an individual over many years..., a predictable earnings lifecycle. Most everybody enters the workforce making nothing or next to nothing; acquires experience and makes more; tops out at some point; then retires and makes less. ...
A much more useful measure of mobility is intergenerational... How “heritable” is relative income? In the U.S., it’s about 40 percent.... That isn’t very good by international standards. It’s been in that range since ... the 1970s. I write about all this at greater length here.
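The "heritability" figure Noah cites is typically the intergenerational income elasticity: the OLS slope from regressing log child income on log parent income. A minimal sketch with synthetic data (the income distributions are assumptions, generated with a true elasticity of 0.4) showing how the slope recovers that number:

```python
# Intergenerational income elasticity as an OLS slope on synthetic data.
# The distributions below are illustrative assumptions, not real incomes.
import random

random.seed(0)
beta_true = 0.4
parents = [random.gauss(10.5, 0.6) for _ in range(20000)]          # log incomes
children = [beta_true * p + random.gauss(6.3, 0.5) for p in parents]

mean_p = sum(parents) / len(parents)
mean_c = sum(children) / len(children)
cov = sum((p - mean_p) * (c - mean_c) for p, c in zip(parents, children))
var = sum((p - mean_p) ** 2 for p in parents)
print(round(cov / var, 2))  # estimated elasticity, close to 0.4
```

An elasticity near 0.4 means a 10 percent income advantage for parents predicts roughly a 4 percent advantage for their children, which is high by international standards, as Noah notes.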
Posted by Mark Thoma on Thursday, May 1, 2014 at 01:30 PM in Economics, Income Distribution |