"There are no moderates in the Republican primary":
Doubling Down on W, by Paul Krugman, commentary, NY Times: 2015 was, of course, the year of Donald Trump, whose rise has inspired horror among establishment Republicans and, let’s face it, glee — call it Trumpenfreude — among many Democrats. But Trumpism has in one way worked to the G.O.P. establishment’s advantage: it has distracted pundits and the press from the hard right turn even conventional Republican candidates have taken, a turn whose radicalism would have seemed implausible not long ago.
After all, you might have expected the debacle of George W. Bush’s presidency ... to inspire some reconsideration of W-type policies. What we’ve seen instead is a doubling down, a determination to take whatever didn’t work from 2001 to 2008 and do it again, in a more extreme form.
Start with the example that’s easiest to quantify: tax cuts. ... It’s harder than ever to claim that tax cuts are the key to prosperity. ... In fact, however, establishment candidates like Marco Rubio and Jeb Bush are proposing much bigger tax cuts than W ever did. ...
What about other economic policies? The Bush administration’s determination to dismantle any restraints on banks ... looks remarkably bad in retrospect. But conservatives ... have declared their determination to repeal Dodd-Frank...
The only real move away from W-era economic ideology has been on monetary policy, and it has been a move toward right-wing fantasyland. ...
Last but not least, there’s foreign policy. You might have imagined that the story of the Iraq war ... would inspire some caution about military force as the policy of first resort. Yet swagger-and-bomb posturing is more or less universal among the leading candidates. ...
The point is that ... the mainstream contenders ... are frighteningly radical, and that none of them seem to have learned anything from past disasters.
Why does this matter? Right now conventional wisdom ... suggests even or better-than-even odds that Mr. Trump or Mr. Cruz will be the nominee, in which case everyone will be aware of the candidate’s extremism. But there’s still a substantial chance that the outsiders will falter and someone less obviously out there — probably Mr. Rubio — will end up on top.
And if this happens, it will be important to realize that not being Donald Trump doesn’t make someone a moderate, or even halfway reasonable. The truth is that there are no moderates in the Republican primary, and being reasonable appears to be a disqualifying characteristic for anyone seeking the party’s nod.
Posted by Mark Thoma on Monday, December 28, 2015 at 09:30 AM in Economics, Politics |
A Powerful Intellectual Stumbling Block: The Belief that the Market Can Only Be Failed: Of all the strange and novel economic doctrines propounded since 2007, Stanford's John Taylor has a good claim to [propounding the strangest]: In his view, the low interest-rate, quantitative-easing, and forward-guidance policies of North Atlantic and Japanese central banks are like:
imposing an interest-rate ceiling on the longer-term market... much like the effect of a price ceiling in a [housing] rental market.... [This] decline in credit availability reduces aggregate demand, which tends to increase unemployment, a classic unintended consequence...
When you think about it, this analogy makes no sense at all. ...
Posted by Mark Thoma on Sunday, December 27, 2015 at 10:05 AM in Economics |
Doubling Down On W: As I’ve said, it’s hard these days for liberals to look at the state of the Republican primary without feeling a lot of Trumpenfreude. But one downside of The Donald’s turn in the spotlight is that the policy positions of the tonsorially conventional candidates are going largely unscrutinized, which is bad. For the fact is that the whole field has taken a hard right turn into fantasyland.
Take, for example, tax policy. Big tax cuts tilted toward the 1 percent were George W. Bush’s signature domestic achievement. But they failed to deliver the promised prosperity — and given the changes in the environment since then, any repeat of his push should seem unattractive. After all, W pretended simply to be handing back a budget surplus; today’s GOP has spent years hyperventilating about deficits. Inequality is a big issue, too, in a way it wasn’t at the end of the Clinton boom. So you might think that the current crop of candidates would be proposing something different.
Instead, what we’ve gotten is a doubling down. ...
It’s pretty amazing. The next thing you know, they’ll be bringing back the architects of the Iraq disaster to do it all over again. Oh, wait.
Posted by Mark Thoma on Saturday, December 26, 2015 at 05:36 PM
Things to Celebrate, Like Dreams of Flying Cars, By Paul Krugman, Commentary, NY Times: ...We’re still a very long way from space colonies and zero-gravity hotels, let alone galactic empires. But space technology is moving forward after decades of stagnation.
And to my amateur eye, this seems to be part of a broader trend, which is making me more hopeful for the future than I’ve been in a while.
You see, I got my Ph.D. in 1977, the year of the first Star Wars movie, which means that I have basically spent my whole professional life in an era of technological disappointment.
Until the 1970s, almost everyone believed that advancing technology would do in the future what it had done in the past: produce rapid, unmistakable improvement in just about every aspect of life. But it didn’t. ...
Now, there has been striking progress in our ability to process and transmit information. But while I like cat and concert videos as much as anyone, we’re still talking about a limited slice of life: ...
Over the past five or six years, however — or at least this is how it seems to me — technology has been getting physical again; once again, we’re making progress in the world of things, not just information. And that’s important.
Progress in rocketry is fun to watch, but the really big news is on energy, a field of truly immense disappointment until recently. ... The biggest effects so far have come from fracking, which has ended fears about peak oil and could, if properly regulated, be some help on climate change: Fracked gas is still fossil fuel, but burning it generates far lower greenhouse gas emissions than burning coal. The bigger revolution looking forward, however, is in renewable energy, where costs of wind and especially solar have dropped incredibly fast.
Why does this matter? ... Well, you still hear claims, mostly from the right but also from a few people on the left, that we can’t take effective action on climate without bringing an end to economic growth. ...
But now we can see the shape of a sustainable, low-emission future quite clearly... Of course, it doesn’t have to happen. But if it doesn’t, the problem will be politics, not technology.
True, I’m still waiting for flying cars, not to mention hyperdrive. But we have made enough progress in the technology of things that saving the world has suddenly become much more plausible. And that’s reason to celebrate.
Posted by Mark Thoma on Friday, December 25, 2015 at 12:42 AM in Economics |
Joshua Gans at Digitopoly:
How to generate a Golden Age: TV Edition: When we think of Golden Ages, it is usually in looking back that we realise things were better during some period of time; we just never knew it at the time. But we are currently living in a Golden Age of Television. It is better than at any point in its history. And what is more, we know it. That is simply a remarkable state of affairs.
When did the Golden Age begin? Many will mark 2008 as a turning point with Breaking Bad (or maybe a year earlier with Mad Men). Others will go back to 2002 and 2003 with The Wire, Battlestar Galactica and Lost. In reality, the Golden Age as we know it is a phenomenon of the last five or six years, as knowledge of great alternative programming became widespread. And there is no end in sight.
What is remarkable is that at the beginning of the Golden Age, industry insiders were proclaiming doom for the industry. The Internet, and YouTube in particular, not to mention piracy, were supposedly destroying all and sundry, and with them any incentives to create good content. ...
To get into this, it is useful to remind ourselves of the basic economics of product design when the designer has a certain degree of market power. As Michael Spence showed, such suppliers will tend to design products to target marginal customers rather than average customers. ...
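Gans's point about marginal versus average customers traces to Spence's quality-choice result, and it can be sketched numerically. The toy model below uses my own illustrative assumptions, not anything from the post: consumer tastes theta uniform on [0, 1], gross utility theta times quality q, and per-unit production cost q squared over 2. At a fixed quantity served, the profit maximizer picks the quality valued by the marginal buyer, while a welfare maximizer would pick the quality valued by the average buyer served.

```python
import numpy as np

# Toy Spence quality-choice model (illustrative assumptions): tastes theta
# uniform on [0, 1]; a consumer of taste theta gets gross utility theta * q
# from quality q; per-unit cost is q**2 / 2. Serving a fraction n of the
# market means serving tastes in [1 - n, 1].

def monopoly_quality(n, qs):
    """Profit-maximizing quality at fixed quantity n. The price is pinned
    down by the *marginal* buyer: p = (1 - n) * q, so
    profit = ((1 - n) * q - q**2 / 2) * n."""
    profit = ((1 - n) * qs - qs**2 / 2) * n
    return qs[np.argmax(profit)]

def planner_quality(n, qs):
    """Welfare-maximizing quality at the same quantity n:
    welfare = q * (integral of theta over served buyers) - n * q**2 / 2
            = q * (n - n**2 / 2) - n * q**2 / 2."""
    welfare = qs * (n - n**2 / 2) - n * qs**2 / 2
    return qs[np.argmax(welfare)]

qs = np.linspace(0.0, 1.0, 100001)  # quality grid
n = 0.5                             # serve the top half of the taste distribution

q_m = monopoly_quality(n, qs)  # tracks the marginal buyer's taste, 1 - n
q_w = planner_quality(n, qs)   # tracks the average served taste, 1 - n / 2
print(q_m, q_w)                # roughly 0.5 and 0.75
```

With half the market served, the chosen quality tracks the marginal taste (about 0.5) rather than the average served taste (about 0.75). On this logic, the pre-2002 network designed programming for its marginal, routine viewer.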
Let’s start with the marginal customer. In the old, pre-2002 days, the marginal customer was the one who ... would plonk themselves down in front of the TV for a few hours every night. Networks focused their product design on marginal customers whom they could attract in a routine way. ... TV seasons were year-long, and the prime-time offerings grabbed attention so that the programming around them could be even more routine and comforting.
The current Golden Age content does not fit this mould. It is rarely year-long. It is irregular. It is sometimes intense. And it often requires investment by the consumer: miss some episodes and you are lost. Moreover, consumers have to decide what they ‘feel like’ watching. ...
But how does the Internet (broadly) change all of that? The Internet has taken away the routine for many people but, importantly, allows people to still fill their attention with television content. So they can still plonk themselves down for a night of entertainment. What they do now is choose what that will be. ...
Option demand is very different from immediate consumption demand. What drives the incentive to be a regular subscriber is not access per se but whether consumers believe they will want that access on a random day. ... Critically, that does not mean you need to serve up options at 9pm that will appeal to most people on that night. Instead, you can have content that a consumer will demand at some point, content they feel they might want to watch. ...
How do you do that? You need to produce content that people decide they will want to watch at some point. And as I talked about in Information Wants to be Shared, the content that best does that is content that other people (your friends, etc.) refer you to. In other words, traditional marketing of television is replaced by social marketing. It is a completely different ball game: you must please the average consumer in order to attract more consumers at the margin. ...
As a final note: I have made lots of speculative assumptions here, but I am confident that looking at the changed incentives to attract marginal consumers, and at who those consumers are, is the place to look to understand how the television industry has evolved.
Posted by Mark Thoma on Thursday, December 24, 2015 at 10:19 AM in Economics |
The Six Major Adverse Shocks that Have Hit the U.S. Macroeconomy since 2005: Talk to people at the Federal Reserve these days about how they feel about the institution’s performance during the seven very lean years from late 2008 to late 2015, and they tend to be relatively proud of how the institution performed. Almost smug.
Why? Well, let me pull out my old workhorse graph of the four salient components of U.S. aggregate demand since 1999...
And let me run through the six major adverse shocks to the U.S. macroeconomy since 2005...
Posted by Mark Thoma on Wednesday, December 23, 2015 at 07:53 AM in Economics |
An Aging Society Is No Problem When Wages Rise: Eduardo Porter discusses the question of whether retirees will have sufficient income in twenty or thirty years. He points out that if no additional revenue is raised, Social Security will not be able to pay full scheduled benefits after 2034.
While this is true, it is important to note that the same would have been true in the 1940s, 1950s, 1960s, and 1970s. If projections for Social Security had assumed no future increases in the payroll tax, they would have shown a severe shortfall in the trust fund, leaving it unable to pay full scheduled benefits.
We have now gone 25 years with no increase in the payroll tax, by far the longest such period since the program was created. With life expectancy continually increasing, it is inevitable that a fixed tax rate will eventually prove inadequate if the retirement age is not raised. (The age for full benefits has already been raised from 65 to 66 and will rise further to 67 by 2022, but no further increases are scheduled.)
The past increases in the Social Security tax have generally not imposed a large burden on workers because real wages rose. The Social Security trustees project average wages to rise by more than 50 percent over the next three decades. If most workers share in this wage growth, then the two or three percentage point tax increase that might be needed to keep the program fully funded would be a small fraction of the wage growth workers see over this period. Of course, if income gains continue to be redistributed upward, then any increase in the Social Security tax will be a large burden.
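A back-of-the-envelope check of the claim above. The 50 percent wage-growth projection and the 6.2 percent employee-side tax rate are from the text and current law; the $50,000 wage and the exact 3-point increase are hypothetical illustrations.

```python
# Does a 3-point payroll tax increase swamp 50% wage growth? A hedged sketch:
# the $50,000 wage is hypothetical; the growth projection and 2-3 point
# range come from the discussion above.
wage_today = 50_000.0
wage_growth = 0.50            # trustees' projected wage growth over ~30 years
tax_today = 0.062             # current employee-side Social Security rate
tax_future = 0.062 + 0.030    # after a hypothetical 3-point increase

net_today = wage_today * (1 - tax_today)
net_future = wage_today * (1 + wage_growth) * (1 - tax_future)
gain = net_future / net_today - 1

print(f"after-tax wage still rises {gain:.1%}")  # roughly 45%
```

Even with the full tax increase, the after-tax wage rises about 45 percent, so the hike consumes only a small slice of the projected growth, which is exactly the point about wage inequality: the calculation falls apart only if workers' wages fail to grow.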
For this reason, Social Security should be seen first and foremost as part of the story of wage inequality. If workers get their share of the benefits of productivity growth then supporting a larger population of retirees will not be a problem. On the other hand, if the wealthy manage to prevent workers from benefiting from growth during their working lives, they will also likely prevent them from having a secure retirement.
Posted by Mark Thoma on Wednesday, December 23, 2015 at 06:48 AM in Economics, Income Distribution, Social Insurance, Social Security, Taxes |
My views and the Fed’s views on secular stagnation: It has been two years since I resurrected Alvin Hansen’s secular stagnation idea and suggested its relevance to current conditions in the industrial world. Unfortunately experience since that time has tended to confirm the secular stagnation hypothesis. Secular stagnation is a possibility. It is not an inevitability and it can be avoided with strong policy. Unfortunately, the Fed and other policy setters remain committed to traditional paradigms and so are acting in ways that make secular stagnation more likely. ... Indeed I would judge that there is at least a two-thirds chance that we will experience zero or negative rates again in the next five years. ...
I believe its decision to raise rates last week reflected four consequential misjudgments.
First, the Fed assigns a much greater chance that we will reach 2 percent core inflation than is suggested by most available data. ...
Second, the Fed seems to mistakenly regard 2 percent inflation as a ceiling, not a target. ...
Third... It is suggested that by raising rates the Fed gives itself room to lower them. ... I would say the argument that the Fed should raise rates so as to have room to lower them is in the category with the argument that I should starve myself in order to have the pleasure of relieving my hunger pangs.
Fourth, the Fed is likely underestimating secular stagnation. It is ... overestimating the neutral rate. ...
Why is the Fed making these mistakes if indeed they are mistakes? It is not because its leaders are not thoughtful or open minded or concerned with growth and employment. Rather I suspect it is because of an excessive commitment to existing models and modes of thought. Usually it takes disaster to shatter orthodoxy. We can all hope that either my worries prove misplaced or the Fed shows itself to be less in the thrall of orthodoxy than it has been of late.
The Fed's job would have been, and will be, a lot easier if fiscal policy makers would help. I disagree with Charles Plosser's view on monetary policy, but I have some sympathy for the view that many people have come to expect too much from monetary policy:
... On the monetary policy side central banks have clearly pushed the envelope in an effort to stabilize and then promote real economic growth. The pressure to do so has come from inside and outside the central banks. These actions have raised expectations of what the central bank can do. For the last three or four decades, it has been widely accepted among academics and central bankers that monetary policy is primarily responsible for anchoring inflation and inflation expectations at some low level. In the United States, where the Fed operates under the so-called dual mandate to promote both price stability and maximum employment, monetary policy has also attempted to stabilize economic growth and employment. Yet it has also been widely accepted that monetary policy’s impact on real variables was limited and temporary, thus in the long-run changes in money were neutral for real variables.
The behavior of central banks during the crisis and subsequent recession has turned much of this conventional wisdom on its head. It is not clear that this is wise or prudent. Many have come to fear that without substantial support from monetary policy our economies will slump into stagnation. This would seem to fly in the face of nearly two centuries of economic thinking. ...
If secular stagnation is real, the Fed cannot overcome it by itself. Fiscal policy will have to be part of the solution. (I do think one statement above is wrong, and it gets at the heart of Summers's recent work reviving hysteresis and his statement above about commitment to orthodoxy. When Plosser says "monetary policy’s impact on real variables was limited and temporary, thus in the long-run changes in money were neutral for real variables," he is ignoring recent work by Summers, Blanchard, and Fatas showing that recessions can permanently lower our productive capacity, and that the damage is worse when the recession lasts longer. This means that monetary policy -- and fiscal policy too -- can have a permanent impact on the natural rate of output by helping the economy to recover faster. The faster the recovery, the less the natural rate is lowered. So I agree with Summers that monetary policy needs to take the possibility of secular stagnation into account; I just wish he'd put more emphasis on the essential role of fiscal policy -- something he has certainly done in the past, e.g., "I believe that it is appropriate that we go back to an earlier tradition that has largely passed out of macroeconomics of thinking about fiscal policy as having a major role in economic stabilization.")
Posted by Mark Thoma on Tuesday, December 22, 2015 at 10:08 AM in Economics, Fiscal Policy, Monetary Policy |
I believe the evidence, overall, suggests that moderate increases in the minimum wage have negligible effects on employment. But, to be fair, David Neumark is a credible researcher, and I will let him make the case that an increase in the minimum wage may not be benign:
The Effects of Minimum Wages on Employment, by David Neumark: It is easy to be confused about what effects minimum wages have on jobs for low-skilled workers. Researchers offer conflicting evidence on whether or not raising the minimum wage means fewer jobs for these workers. Some recent studies even suggest overall employment could be harmed. This Letter sheds light on the range of estimates and the different approaches in the research that might explain some of the conflicting results. It also presents some midrange estimates of the aggregate employment effects from recent minimum wage increases based on the research literature.
The controversy begins with the theory
The standard model of competitive labor markets predicts that a higher minimum wage will lead to job loss among low-skilled workers. The simplest scenario considers a competitive labor market for a single type of labor. A “binding” minimum wage that is set higher than the competitive equilibrium wage reduces employment for two reasons. First, employers will substitute away from the low-skilled labor that is now more expensive towards other inputs, such as equipment or other capital. Second, the higher wage and new input mix implies higher prices, in turn reducing product and labor demand.
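The textbook prediction in the paragraph above can be illustrated with a minimal linear supply-and-demand sketch. All parameters are made up for illustration, not calibrated to any labor market.

```python
# Linear competitive labor market with a wage floor (illustrative parameters).
def employment(wage_floor, a=10.0, b=1.0, c=2.0):
    """Demand: L_d = a - b * w.  Supply: L_s = c * w.
    The market clears at w* = a / (b + c); a floor above w* binds, and
    employment becomes demand-determined at the floor."""
    w_star = a / (b + c)
    w = max(wage_floor, w_star)
    return a - b * w

print(employment(0.0))  # unconstrained equilibrium employment, about 6.67
print(employment(4.0))  # floor above w* (about 3.33): employment falls to 6.0
```

The mechanism is exactly the one the letter describes: once the floor exceeds the market-clearing wage, employers move up the demand curve and hire less.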
Of course, the labor market is more complicated. Most important, workers have varying skill levels, and a higher minimum wage will lead employers to hire fewer low-skilled workers and more high-skilled workers. This “labor-labor” substitution may not show up as job losses unless researchers focus on the least-skilled workers whose wages are directly pushed up by the minimum wage. Moreover, job losses among the least-skilled matter most from a policy perspective, since they are the workers the minimum wage is intended to help.
In some alternative labor market models, worker mobility is limited and individual employers therefore have some discretion in setting wages. In such “monopsony” models, the effect of increasing the minimum wage becomes ambiguous. However, such models may be less applicable to labor markets for unskilled workers most affected by the minimum wage; these markets typically have many similar employers in close proximity to each other (think of a shopping mall) and high worker turnover. Nonetheless, the ultimate test is not theoretical conjecture, but evidence.
Recent research on employment effects of minimum wages
The earliest studies of the employment effects of minimum wages used only national variation in the U.S. minimum wage. They found elasticities between −0.1 and −0.3 for teens ages 16–19, and between −0.1 and −0.2 for young adults ages 16–24. An elasticity of −0.1 for teens, for example, means that a 10% increase in the wage floor reduces teen employment by 1%. Newer research used data from an increasing number of states raising their minimum wages above the federal minimum. The across-state variation allowed comparisons of changes in youth employment between states that did and did not raise their minimum wage. This made it easier to distinguish the effects of minimum wages from those of business cycle and other influences on aggregate low-skill employment. An extensive survey by Neumark and Wascher (2007) concluded that nearly two-thirds of the more than 100 newer minimum wage studies, and 85% of the most convincing ones, found consistent evidence of job loss effects on low-skilled workers.
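The elasticity arithmetic used throughout the letter is a simple proportional rule, made explicit below. The one-million-job employment base is a hypothetical number for scale, not a figure from the letter.

```python
# The letter's elasticity rule: an elasticity of -0.1 means a 10% minimum
# wage increase lowers employment of the affected group by 1%.
# The one-million-job base is hypothetical, purely for scale.
def employment_change(elasticity, pct_wage_increase, base_employment):
    return elasticity * pct_wage_increase * base_employment

jobs = employment_change(-0.1, 0.10, 1_000_000)
print(round(jobs))  # -10000: a 1% decline on a hypothetical 1M-job base
```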
Research since 2007, however, has reported conflicting findings. Some studies use “meta-analysis,” averaging across a set of studies to draw conclusions. For example, Doucouliagos and Stanley (2009) report an average elasticity across studies of −0.19, consistent with earlier conclusions, but argue that the true effect is closer to zero; they suggest that the biases of authors and journal editors make it more likely that studies with negative estimates will be published. However, without strong assumptions it is impossible to rule out an alternative interpretation—that peer review and publication lead to more evidence of negative estimates because the true effect is negative. In addition, meta-analyses do not assign more weight to the most compelling evidence. Indeed, they often downweight less precise estimates, even though the lower precision may be attributable to more compelling research strategies that ask more of the data. In short, meta-analysis is no substitute for critical evaluation of alternative studies.
A second strand of recent research that conflicts with earlier conclusions argues that geography matters. In other words, the only valid conclusions come from studies that compare changes among close or contiguous states or subareas of states (for example, Dube, Lester, and Reich 2010). A number of studies using narrow geographic comparisons find employment effects that are closer to zero and not statistically significant for both teenagers and restaurant workers. The studies argue that their results differ because comparisons between distant states confound actual minimum wage effects with other associated negative shocks to low-skill labor markets.
Some follow-up studies, however, suggest that limiting comparisons to geographically proximate areas generates misleading evidence of no job loss effects from minimum wages. Pointing to evidence that minimum wages tend to be raised when labor markets are tight, this research suggests that, among nearby states that are similar in other respects, minimum wage increases are more likely to be associated with positive shocks, obscuring the actual negative effects of minimum wages. Using better methods to pick appropriate comparison states, this research finds negative elasticities in the range of −0.1 to −0.2 for teenagers, and smaller elasticities for restaurant workers (see Neumark, Salas, and Wascher 2014a,b, and Allegretto et al. 2015 for a rebuttal). Other analyses that try to choose valid geographic comparisons estimate employment responses from as low as zero to as high as −0.50 (Baskaya and Rubinstein 2012; Liu, Hyclak, and Regmi 2015; Powell 2015; Totty 2015).
Some new strategies in recent studies have also found generally stronger evidence of job loss for low-skilled workers. For example, Clemens and Wither (2014) compare job changes within states between workers who received federal minimum wage increases because of lower state minimums and others whose wages were low but not low enough to be directly affected. Meer and West (2015) found longer-term dynamic effects of minimum wages on job growth; they suggest these longer-term effects arise because new firms are more able to choose labor-saving technology after a minimum wage increase than existing firms whose capital was “baked in.”
How do we summarize this evidence? Many studies over the years find that higher minimum wages reduce employment of teens and low-skilled workers more generally. Recent exceptions that find no employment effects typically use a particular version of estimation methods with close geographic controls that may obscure job losses. Recent research using a wider variety of methods to address the problem of comparison states tends to confirm earlier findings of job loss. Coupled with critiques of the methods that generate little evidence of job loss, the overall body of recent evidence suggests that the most credible conclusion is that a higher minimum wage results in some job loss for the least-skilled workers, with possibly larger adverse effects than earlier research suggested.
Recent minimum wage increases and implications
Despite the evidence of job loss, policymakers and the voting public have raised minimum wages frequently and sometimes substantially in recent years. Since the last federal increase in 2009, 23 states have raised their minimum wage. In these states, minimum wages in 2014 averaged 11.5% higher than the federal minimum (Figure 1). If these higher minimum wages have in fact lowered employment opportunities, this could have implications for changes in aggregate employment over this period.
Figure 1: Percent difference between state and federal minimum wages, June 2014
Note that more states (31) had minimums above the federal level just before the Great Recession than do now (Figure 2). The average relative to the federal minimum was nearly three times as high, at 32.3%. However, this is in part because the federal minimum wage has increased 41% since the beginning of 2007. To compare the average change across states between 2007 and 2014, I account for the smaller number of states with higher minimums in 2014 and their lower levels, and weight the states by their working-age population. I find that minimum wages were roughly 20.6% higher in 2014 than in 2007, compared with a 16.5% increase in average hourly earnings over the same period. Thus, between the federal increases in 2007–09 and recent state increases, the minimum wage has grown only slightly faster than average wages in the economy, by around 4.1 percentage points over the entire seven-year period.
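As a check on the comparison above (both growth figures are from the letter): the "around 4.1%" appears to be the simple difference between the two growth rates; measured as a ratio the gap is slightly smaller, which does not change the qualitative conclusion.

```python
# Both growth figures are from the letter: minimum wages up 20.6% from 2007
# to 2014, average hourly earnings up 16.5% over the same period.
mw_growth = 0.206
ahe_growth = 0.165

point_gap = mw_growth - ahe_growth                     # simple difference
relative_gap = (1 + mw_growth) / (1 + ahe_growth) - 1  # ratio-based gap

print(f"{point_gap:.1%} simple difference; {relative_gap:.1%} as a ratio")
```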
Figure 2: Percent difference between state and federal minimum wages, June 2007
From the research findings cited earlier, one can roughly translate these minimum wage increases into the overall job count. Among the studies that find job loss effects, estimated employment elasticities of −0.1 to −0.2 are at the lower range but are more defensible than the estimates of no employment effects. Some of the larger estimates are from studies that are likely to receive more scrutiny in the future.
Using a −0.1 elasticity and applying it only to teenagers implies that higher minimum wages have reduced employment opportunities by about 18,600 jobs. An elasticity of −0.2 doubles this number to around 37,300. If we instead use the larger 16–24 age group and apply the smaller elasticity to reflect that a smaller share of this group is affected, the crude estimate of missing jobs rises to about 75,600. Moreover, if some very low-skilled older adults also are affected (as suggested by Clemens and Wither 2014), the number could easily be twice as high, although there is much less evidence on older workers.
Thus, allowing for the possibility of larger job loss effects, based on other studies, and possible job losses among older low-skilled adults, a reasonable estimate based on the evidence is that current minimum wages have directly reduced the number of jobs nationally by about 100,000 to 200,000, relative to the period just before the Great Recession. This is a small drop in aggregate employment that should be weighed against increased earnings for still-employed workers because of higher minimum wages. Moreover, weighing employment losses against wage gains raises the broader question of how the minimum wage affects income inequality and poverty. This issue will be addressed in the next Economic Letter.
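The job counts in the two preceding paragraphs can be roughly reconstructed. The elasticities, the 11.5% average state increase, and the 18,600-job estimate are from the letter; the teen employment base below is backed out from those numbers rather than taken from Neumark's data, so treat it as implied, not actual.

```python
# Reconstruction of the letter's job-count arithmetic. The -0.1 elasticity,
# 11.5% average increase, and 18,600-job figure are from the letter; the
# teen employment base is *implied* by them, not drawn from the data.
elasticity = -0.1
avg_increase = 0.115
teen_jobs_lost = 18_600

implied_base = teen_jobs_lost / (-elasticity * avg_increase)
print(f"implied teen employment base: {implied_base:,.0f}")  # ~1.6 million

# Doubling the elasticity doubles the loss; the letter reports 37,300,
# with the small gap reflecting rounding in its published figures.
loss_at_02 = 0.2 * avg_increase * implied_base
print(f"{loss_at_02:,.0f}")  # 37,200
```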
David Neumark is Chancellor’s Professor of Economics and Director of the Center for Economics & Public Policy at the University of California, Irvine, and a visiting scholar at the Federal Reserve Bank of San Francisco.
Allegretto, Sylvia, Arindrajit Dube, Michael Reich, and Ben Zipperer. 2015. “Credible Research Designs for Minimum Wage Studies: A Response to Neumark, Salas, and Wascher.” Unpublished manuscript.
Baskaya, Yusuf Soner, and Yona Rubinstein. 2012. “Using Federal Minimum Wage Effects to Identify the Impact of Minimum Wages on Employment and Earnings across U.S. States.” Unpublished paper, Central Bank of Turkey.
Clemens, Jeffrey, and Michael Wither. 2014. “The Minimum Wage and the Great Recession: Evidence of Effects on the Employment and Income Trajectories of Low-Skilled Workers.” NBER Working Paper 20724.
Doucouliagos, Hristos, and T.D. Stanley. 2009. “Publication Selection Bias in Minimum-Wage Research? A Meta-Regression Analysis.” British Journal of Industrial Relations 47(2), pp. 406–428.
Dube, Arindrajit, T. William Lester, and Michael Reich. 2010. “Minimum Wage Effects Across State Borders: Estimates Using Contiguous Counties.” Review of Economics and Statistics 92(4), pp. 945–964.
Liu, Shanshan, Thomas J. Hyclak, and Krishna Regmi. 2015. “Impact of the Minimum Wage on Youth Labor Markets.” Labour, early publication online.
Meer, Jonathan, and Jeremy West. 2015. “Effects of the Minimum Wage on Employment Dynamics.” Journal of Human Resources, early publication online.
Neumark, David, J.M. Ian Salas, and William Wascher. 2014a. “More on Recent Evidence on the Effects of Minimum Wages in the United States.” IZA Journal of Labor Policy 3(1).
Neumark, David, J.M. Ian Salas, and William Wascher. 2014b. “Revisiting the Minimum Wage-Employment Debate: Throwing Out the Baby with the Bathwater?” Industrial and Labor Relations Review 67(Supplement), pp. 608–648.
Neumark, David, and William Wascher. 2007. “Minimum Wages and Employment.” Foundations and Trends in Microeconomics 3(1–2), pp. 1–182.
Powell, David. 2015. “Synthetic Control Estimation beyond Case Studies: Does the Minimum Wage Decrease Teen Employment?” Unpublished manuscript.
Totty, Evan. 2015. “The Effect of Minimum Wages on Employment: A Factor Model Approach.” Unpublished manuscript.
Posted by Mark Thoma on Tuesday, December 22, 2015 at 12:15 AM in Economics, Unemployment |
Posted by Mark Thoma on Tuesday, December 22, 2015 at 12:06 AM in Economics, Links |
What are the factors behind high economic rents?: In research and debates about economic inequality in the United States, there’s been a resurgence this year in explanations for rising inequality that emphasize market power and market imperfections. In a recent piece for the New York Review of Books, Paul Krugman details how increasing market power seems to be a more attractive story of how inequality became so large in the United States. A paper that Krugman cites—by Jason Furman, Chairman of the Council of Economic Advisers, and Peter Orszag of Citigroup—emphasizes the role of “rents” in contributing to inequality and suggests increasing market power could be the reason for this trend. But an interesting new paper from Dean Baker, economist and co-director of the Center for Economic and Policy Research, proposes a very different reason for why those rents came about. ...
First, let’s define economic rents. ...
Baker points to four areas where rents are pervasive in the U.S. economy: patents and copyrights, the financial sector, high pay for CEOs and executives, and high pay for professionals more broadly. What brings these areas together is that the institutional arrangements in these portions of the economy have not only created rents but have also channeled those rents disproportionately to those at the top of the income ladder. Getting rid of these rents would stop the “upward redistribution of income.” ...
Altogether, these factors could have a big impact on the level of inequality. According to Baker’s calculations, the combined amount of rents from these four areas would be equal to between 6 percent and 8.5 percent of U.S. gross domestic product in 2014. That’s about the same size as the increase in the share of income going to the top 1 percent of income earners in the United States from 1979 to 2012.
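As a rough translation of those shares into dollars; the 2014 nominal GDP figure of about $17.4 trillion is my assumption (the approximate BEA figure), not a number from Baker's paper, so the arithmetic is purely illustrative:

```python
# Back-of-envelope translation of Baker's rent shares into dollars.
# Assumes 2014 U.S. nominal GDP of about $17.4 trillion (approximate
# BEA figure; not taken from Baker's paper).
gdp_2014 = 17.4e12  # dollars

low_share, high_share = 0.06, 0.085  # Baker's 6% to 8.5% of GDP

low_rents = gdp_2014 * low_share
high_rents = gdp_2014 * high_share

print(f"Estimated rents: ${low_rents/1e12:.2f} trillion "
      f"to ${high_rents/1e12:.2f} trillion per year")
```

On those assumptions the four areas together generate on the order of $1.0 to $1.5 trillion per year, which conveys why rents of this size could plausibly account for the entire rise in the top 1 percent's income share.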
What differentiates Baker’s argument from similar ones is that he is less persuaded that decreased competition among firms and firm consolidation are major factors. Furman and Orszag mention patents as a potential cause of excess returns, but the emphasis seems to be more on market power...
In short, Baker sees the story of rents less as capital versus labor and more about a fight within labor. ...[W]hile rents might have broken through as a compelling story for why inequality has risen in the United States, the reason for high rents is still very much up for debate.
Posted by Mark Thoma on Monday, December 21, 2015 at 09:43 AM
What Is The Fed's Expectation For Financial Markets?, by Tim Duy: David Keohane at FT Alphaville points us toward a JPM research note raising the prospect of a reappearance of former Federal Reserve Chair Alan Greenspan’s “conundrum.” From the note:
If long-term interest rates matter more than short-term interest rates, will the Fed’s current and prospective rate hikes matter much? The answer is yes if long-term interest rates respond to these short-term rate hikes. But this transmission is far from given, especially given the Fed’s decision that reinvestments would not be halted until the normalization of the funds rate is “well under way.”
The previous hiking cycle of 2004-2006 is a reminder of how problematic the transmission from short rates to long-term interest rates can be. At the time, the 10y real UST yield rose by only 25bp between June 2004 and June 2006 despite the Fed lifting its target rate by 425bp (Figure 1). We depict the real rather than nominal UST yield in the chart to capture the potential impact of monetary policy actions on inflation expectations. This lack of transmission or “bond conundrum” at the time was attributed to global saving forces emanating from DM corporates and EM economies. Could these saving forces prevent once again rate hikes from transmitting to longer-term interest rates?
Keohane links to fellow Alphaville writer Matthew Klein, who describes the “conundrum” as bogus. Klein draws attention to the shape of the yield curve:
In addition to forgetting his own experience at the Fed, Greenspan’s confusion can also be blamed on an unusual belief in the “normal” behaviour of forward short rates.
Short rates tend to go up and down with the business cycle, which typically lasts a lot less than ten years…
When the economy is weak and the Fed is stepping on the gas, short rates should be lower than your reasonable expectation of the average for the next ten years. (Like now.) Other times, of course, short rates are higher than your reasonable expectation of the average for the next ten years because the economy is running hot and the Fed is stepping on the brakes. Longer-term yields therefore shouldn’t always move with short-term rates.
This is why people think the slope of the yield curve is a decent signal of where the economy is going.
When the economy is peaking and poised to go into recession, short rates end up higher than long rates because traders are betting that short rates will fall significantly. To use the jargon, the curve is inverted. After the economy has hit bottom and is ready to grow, the yield curve gets nice and steep, reflecting the expectation of future increases in the short rate to match the expected acceleration in nominal spending.
What happens to the yield curve, and how the Federal Reserve responds, is one of my big questions for 2016. Almost always, the yield curve flattens after the Fed begins a tightening cycle. Within a year, the spread between the 10- and 2-year Treasuries is a mere 50bp or so.
An analogous situation today would be if the Fed raises the fed funds target range over the next year but longer-term yields don’t budge. How might the Fed respond? New York Federal Reserve President William Dudley often comments on this prospect. From November 2015:
Several examples will help me make these points. During 2004 to 2007, the FOMC raised the federal funds rate target 17 meetings in a row, lifting the federal funds to 5.25 percent from 1.0 percent. Yet, during this period, financial conditions eased, as evidenced by the fact that the stock market rose, bond yields fell and credit availability—especially to housing—eased substantially. In hindsight, perhaps monetary policy should have been tightened more aggressively…
…In contrast, if financial conditions did not respond at all, or eased, then I suspect we would go more quickly, all else equal.
This raises some red flags for me. While much attention is placed on the Fed’s failure to respond more aggressively to slowing activity and deteriorating financial conditions in 2008, I lean toward thinking the more grievous policy error was in the first half of 2006, when the Federal Reserve kept raising short rates after the yield curve first inverted in February of that year, and despite clear evidence of slowing economic activity and increasing financial stress.
So how will the Fed respond if long rates do not respond in concert with short rates? How will the Fed interpret a flattening yield curve? Do they accelerate the pace of rate increases? Do they initiate asset sales? The truth is I don’t know (or the answer is “it depends”), but I find this exchange between Federal Reserve Chair Janet Yellen and New York Times reporter Binyamin Appelbaum a bit disconcerting:
BINYAMIN APPELBAUM. Binyamin Appelbaum, the New York Times. Bill Dudley has talked about the need for the Fed to adjust policy based on the responsiveness of financial markets as you begin to increase rates. You didn't talk about that today. Is it a point that you agree with? And if so, what is it that you're looking for? How will you judge whether financial markets are accepting and transmitting these changes?
CHAIR YELLEN. Well, there are a number of different channels through which monetary policy is transmitted to spending decisions, the behavior of longer term, longer term interest rates, short term interest rates matter. The value of asset prices and the exchange rate, also, these are transmission channels. We wouldn't be focused on short-term financial volatility, but were there unanticipated changes in financial conditions that were persistent and we judged to affect the outlook, we would of course have to take those into account. So, we will watch financial developments, but what we're looking at here is the longer term economic outlook, are we seeing persistent changes in financial market conditions that would have a bearing, a significant bearing, on the outlook that we would need to take account of in formulating appropriate policy. Yes we would, but it's not short-term volatility in markets.
BINYAMIN APPELBAUM. The part [inaudible], you didn't see changes, you would be concerned and have to move more quickly. Are you concerned that if markets don't tighten sufficiently you may need to do more?
CHAIR YELLEN. Well, look. You know, we-- this is not an unanticipated policy move. And we have been trying to explain what our policy strategy is. So it's not as though I'm expecting to see marked immediate reaction in financial markets, expectations about Fed policy have been built into the structure of financial market prices. But we obviously will track carefully the behavior of both short and longer term interest rates, the dollar, and asset prices, and if they move in persistent and significant ways that are out of line with the expectations that we have, then of course we will take those into account.
I don’t know that Yellen understood the question. But she should have. Dudley has been telling this story for a long, long time. Do she and the Committee share his expectations? Why or why not? In my opinion, this is an important question, and it looks to me like Yellen fumbled it.
Bottom Line: We have a fairly good idea of the Fed’s reaction function with respect to inflation and unemployment. Not so much with respect to financial market conditions. Who shares Dudley’s views? That is a space I am watching this year.
Posted by Mark Thoma on Monday, December 21, 2015 at 08:45 AM in Economics, Fed Watch, Monetary Policy |
Nothing comes from nowhere:
The Donald and the Decider, by Paul Krugman, Commentary, NY Times: Almost six months have passed since Donald Trump overtook Jeb Bush in polls of Republican voters. At the time, most pundits dismissed the Trump phenomenon as a blip... Instead, however, his lead just kept widening. Even more striking, the triumvirate of trash-talk — Mr. Trump, Ben Carson, and Ted Cruz — now commands the support of roughly 60 percent of the primary electorate.
But how can this be happening? After all, the antiestablishment candidates now dominating the field, aside from being deeply ignorant about policy, have a habit of making false claims, then refusing to acknowledge error. Why don’t Republican voters seem to care?
Well, part of the answer has to be that the party taught them not to care. Bluster and belligerence as substitutes for analysis, disdain for any kind of measured response, dismissal of inconvenient facts reported by the “liberal media” didn’t suddenly arrive on the Republican scene last summer. On the contrary, they have long been key elements of the party brand. So how are voters supposed to know where to draw the line? ...
Donald Trump as a political phenomenon is very much in a line of succession that runs from W. through Mrs. Palin, and in many ways he’s entirely representative of the Republican mainstream. For example, were you shocked when Mr. Trump revealed his admiration for Vladimir Putin? He was only articulating a feeling that was already widespread in his party.
Meanwhile, what do the establishment candidates have to offer as an alternative? On policy substance, not much. Remember, back when he was the presumed front-runner, Jeb Bush assembled a team of foreign-policy “experts,” ... dominated by neoconservative hard-liners, people committed, despite past failures, to the belief that shock and awe solve all problems.
In other words, Mr. Bush wasn’t articulating a notably different policy than what we’re now hearing from Trump et al...
In case you’re wondering, nothing like this process has happened on the Democratic side. When Hillary Clinton and Bernie Sanders debate..., it’s a real discussion... American political discourse as a whole hasn’t been dumbed down, just its conservative wing.
Going back to Republicans, does this mean that Mr. Trump will actually be the nominee? I have no idea. But it’s important to realize that he isn’t someone who suddenly intruded into Republican politics from an alternative universe. He, or someone like him, is where the party has been headed for a long time.
Posted by Mark Thoma on Monday, December 21, 2015 at 01:08 AM in Economics, Politics |
Posted by Mark Thoma on Monday, December 21, 2015 at 12:06 AM in Economics, Links |
I've never paid much attention to the fiscal theory of the price level:
The FTPL version of the Neo-Fisherian proposition: The Neo-Fisherian doctrine is the idea that a permanent increase in a flat nominal interest rate path will (eventually) raise the inflation rate. It is then suggested that current below-target inflation is a consequence of fixing rates at their lower bound, and rates should be raised to increase inflation. David Andolfatto says there are two versions of this doctrine. The first he associates with the work of Stephanie Schmitt-Grohe and Martin Uribe, which I discussed here. He, like me, is not sold on this interpretation, for I think much the same reason. ... But he favours a different interpretation, based on the Fiscal Theory of the Price Level (FTPL).
Let me first briefly outline my own interpretation of the FTPL. This looks at the possibility of a fiscal regime where there is no attempt to stabilize debt. Government spending and taxes are set independently of the level or sustainability of government debt. The conventional and quite natural response to the possibility of that regime is to say it is unstable. But there is another possibility, which is that monetary policy stabilizes debt. Again a natural response would be to say that such a monetary policy regime is bound to be inconsistent with hitting an inflation target in the long run, but that is incorrect. ...
A constant nominal interest rate policy is normally thought to be indeterminate because the price level is not pinned down, even though the expected level of inflation is. In the FTPL, the price level is pinned down by the need for the government budget to balance at arbitrary and constant levels for taxes and spending. ...
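The mechanics can be stated compactly; this is a standard textbook rendering of the FTPL, not taken from the post itself. The real value of nominal government debt must equal the present discounted value of real primary surpluses, so with nominal debt B, a constant real surplus s, and real interest rate r:

```latex
\frac{B_t}{P_t} \;=\; \sum_{j=0}^{\infty} \frac{s}{(1+r)^{j}}
\;=\; \frac{s\,(1+r)}{r}
\quad\Longrightarrow\quad
P_t \;=\; \frac{r\,B_t}{s\,(1+r)} .
```

With B_t predetermined and s fixed by assumption, the price level P_t is the only variable left to satisfy the constraint, which is precisely the sense in which fiscal policy "pins down" the price level under this regime.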
I have a ... serious problem with this FTPL interpretation in the current environment. The belief that people would need to have for the FTPL to be relevant - that the government would not react to higher deficits by reducing government spending or raising taxes - does not seem to be credible, given that austerity is all about them doing exactly this despite being in a recession. As a result, I still find the Neo-Fisherian proposition, with either interpretation, somewhat unrealistic.
Posted by Mark Thoma on Sunday, December 20, 2015 at 07:54 AM in Economics, Fiscal Policy, Inflation, Macroeconomics, Monetary Policy |
Posted by Mark Thoma on Sunday, December 20, 2015 at 12:06 AM in Economics, Links |
The ambivalent role of China in global income distribution: It is well known that China’s role in the reduction of global poverty and global inequality has been crucial. For example, according to Chen and Ravallion, between 1981 and 2005, 98 percent (yes, ninety-eight percent) of the reduction in global poverty, calculated using the poverty line of $1 per person per day, was due to China. China’s role was similarly impressive when it comes to the reduction of global inequality (income inequality between all individuals in the world). ...
But the question can be asked next, what happens if China continues growing fast? Will its inequality reducing effects wane, and eventually reverse? ...
Posted by Mark Thoma on Sunday, December 20, 2015 at 12:03 AM in China, Economics, Income Distribution |
Posted by Mark Thoma on Saturday, December 19, 2015 at 12:06 AM in Economics, Links |
Working Paper: The Upward Redistribution of Income: Are Rents the Story?: In the years since 1980, there has been a well-documented upward redistribution of income. While there are some differences by methodology and the precise years chosen, the top one percent of households have seen their income share roughly double from 10 percent in 1980 to 20 percent in the second decade of the 21st century. As a result of this upward redistribution, most workers have seen little improvement in living standards from the productivity gains over this period.
This paper argues that the bulk of this upward redistribution comes from the growth of rents in the economy in four major areas: patent and copyright protection, the financial sector, the pay of CEOs and other top executives, and protectionist measures that have boosted the pay of doctors and other highly educated professionals. The argument on rents is important because, if correct, it means that there is nothing intrinsic to capitalism that led to this rapid rise in inequality, as for example argued by Thomas Piketty.
Posted by Mark Thoma on Friday, December 18, 2015 at 09:43 AM in Economics, Income Distribution, Market Failure |
Why are Murdoch-controlled newspapers attacking "The Big Short"?
‘The Big Short,’ Housing Bubbles and Retold Lies, by Paul Krugman, Commentary, NY Times: In May 2009 Congress created a special commission to examine the causes of the financial crisis. The idea was to emulate the celebrated Pecora Commission of the 1930s, which used careful historical analysis to help craft regulations that gave America two generations of financial stability.
But some members of the new commission had a different goal. ... Peter Wallison of the American Enterprise Institute, wrote to a fellow Republican on the commission ... it was important that what they said “not undermine the ability of the new House G.O.P. to modify or repeal Dodd-Frank”...; the party line, literally, required telling stories that would help Wall Street do it all over again.
Which brings me to a new movie the enemies of financial regulation really, really don’t want you to see.
“The Big Short” ... does a terrific job of making Wall Street skulduggery entertaining, of exploiting the inherent black humor of how it went down. ... But you don’t want me to play film critic; you want to know whether the movie got the underlying ... story right. And the answer is yes, in all the ways that matter. ...
The ...housing ... bubble ... was inflated largely via opaque financial schemes that in many cases amounted to outright fraud — and it is an outrage that basically nobody ended up being punished ... aside from innocent bystanders, namely the millions of workers who lost their jobs and the millions of families that lost their homes.
While the movie gets the essentials of the financial crisis right, the true story ... is deeply inconvenient to some very rich and powerful people. They and their intellectual hired guns have therefore spent years disseminating an alternative view ... that places all the blame ... on ... too much government, especially government-sponsored agencies supposedly pushing too many loans on the poor.
Never mind that the supposed evidence for this view has been thoroughly debunked..., constant repetition, especially in captive media, keeps this imaginary history in circulation no matter how often it is shown to be false.
Sure enough, “The Big Short” has already been the subject of vitriolic attacks in Murdoch-controlled newspapers...
The ... people who made “The Big Short” should consider the attacks a kind of compliment: The attackers obviously worry that the film is entertaining enough that it will expose a large audience to the truth. Let’s hope that their fears are justified.
Posted by Mark Thoma on Friday, December 18, 2015 at 12:51 AM in Economics, Housing, Politics, Regulation |
Posted by Mark Thoma on Friday, December 18, 2015 at 12:06 AM in Economics, Links |
Are prices sticky?:
“Sticky” sales, by Phil Davies, The Region, FRB Minneapolis: Sales are ubiquitous in the U.S. economy. Black Friday, President’s Day, Mother’s Day, the Fourth of July; almost any occasion is cause for price cutting, accompanied by prominent signage, balloons and ads in traditional and social media to make the savings known far and wide. Retailers also put on sales ostensibly to clear out inventory, celebrate being on the sidewalk and go out of business.
Economists are interested in sales, not because they want cheap stuff (well, maybe they’re as partial to a deal as anyone), but because the role of sales has a bearing on a question central to macroeconomics: How flexible are prices? Price flexibility—how quickly prices adjust to changes in costs or demand—is crucial to understanding how shocks of any kind, including fiscal and monetary policy, affect economic performance.
Retail prices rise and fall frequently as merchants put items on sale and then restore the regular, or shelf, price. Indeed, the bulk of weekly and monthly variance in individual prices is due to sales promotions, not changes in regular prices. But there’s a lively debate in economics about the true flexibility of sale prices, from a macro perspective; for all their seeming fluidity, how readily do sales respond to changes in underlying costs and unexpected events that alter economic conditions?
How sale prices respond to wholesale cost shocks and broader macroeconomic shocks such as an increase in government spending or monetary policy stimulus, or a decrease in global aggregate demand, affects the flexibility of aggregate retail prices, with profound implications for monetary policy and the accuracy of macroeconomic models that guide policymaking.
Monetary policy as a tool for influencing the economy depends on sticky prices—the idea that prices don’t adjust instantly to shifts in demand caused by changes in money supply. If they did, an increase in demand for goods and services due to monetary easing would trigger an immediate price rise, suppressing demand and leaving economic output and employment unchanged. Thus, the stickier are prices, the more effective is monetary policy in modulating economic growth in the short and medium run. (Economists generally agree that money is neutral in the long run; that is, over a long enough period of time, prices are actually quite flexible, so monetary policy has no long-run effect on the real economy.)
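The logic of that paragraph can be sketched with the quantity equation MV = PY; the numbers below are entirely illustrative, and the two polar cases bracket reality, where prices are partially sticky:

```python
# Toy quantity-theory illustration of why stickiness matters (MV = PY).
# Velocity V is held fixed; all numbers are made up for illustration.
V = 5.0
M0, M1 = 1000.0, 1100.0   # a 10% increase in the money supply
P0, Y0 = 1.0, 5000.0      # initial price level and real output
assert M0 * V == P0 * Y0  # the initial values satisfy MV = PY

# Fully flexible prices: P jumps so that real output is unchanged,
# and money is neutral even in the short run.
P_flex = M1 * V / Y0

# Fully sticky prices: P stays put, so the extra nominal demand
# shows up entirely as higher real output.
Y_sticky = M1 * V / P0

print("flexible-price P:", P_flex)   # prices absorb the whole 10%
print("sticky-price Y:", Y_sticky)   # output absorbs the whole 10%
```

With flexible prices the 10 percent money increase raises the price level 10 percent and leaves output at 5,000; with fully sticky prices the same increase raises output 10 percent instead, which is why the degree of stickiness governs how much traction monetary policy has.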
Recent work by Ben Malin, a senior research economist at the Minneapolis Fed, provides insight into the import of temporary sales for price stickiness and thus monetary policy. In “Informational Rigidities and the Stickiness of Temporary Sales” (Minneapolis Fed Staff Report 513), Malin uses a rich data set of prices from a U.S. retail chain to investigate how retail prices adjust in response to wholesale price increases and other economic shocks. Joining Malin in the research are economists Emi Nakamura and Jón Steinsson of Columbia University, and marketing professors Eric Anderson and Duncan Simester of Northwestern University and MIT, respectively.
Surprisingly, the authors find no change in the frequency and depth of price cuts in response to shocks. Their analysis, which also taps micro price data underlying the consumer price index to look at how sales at a representative sample of U.S. retailers respond to booms and downturns, shows that merchants rely exclusively on regular prices to adapt to cost changes and evolving economic conditions. The research “supports the view that the behavior of regular prices is what matters for aggregate price flexibility,” Malin said in an interview. ...
Posted by Mark Thoma on Thursday, December 17, 2015 at 10:40 AM in Economics, Macroeconomics |
Are MOOCs the answer to educational inequality?:
How Socioeconomic Status Impacts Online Learning: The driving force behind the increasing popularity of massive open online courses (MOOCs) is that they provide — as the term defines it — open access to a massive online audience. Anyone with an Internet connection who wants to learn, can. Whether you’re rich or poor, living in a New York City high-rise or a remote Nepalese village, MOOCs promise to level the higher education playing field. The question is: Does reality reflect this ideal?
A new research study by MIT education researcher Justin Reich and Harvard University’s John Hansen seeks the answer. “Democratizing Education? Examining Access and Usage Patterns in Massive Open Online Courses” takes a close look at how socioeconomic resources influence MOOC enrollment and course completion — and whether online learning is truly opening as many doors as anticipated.
“One way we might democratize education would be to provide more widespread access to academic experiences previously reserved for the elite,” explains Reich, who is the executive director of MIT's PK-12 Initiative. “But historically, emerging learning technologies — even free ones — have often benefited people with the social, technical, and financial capital to take advantage of new innovations. As we try to bridge the digital divide, we need to carefully examine how new tools are used by learners from different walks of life.” ...
Reich’s study uses three indicators: parental educational attainment, neighborhood average educational attainment, and neighborhood median income.
The research finds that these indicators are correlated with student enrollment and success in MOOCs, especially among younger students. Young students enrolling in HarvardX and MITx on edX live in neighborhoods where the median income is 38 percent higher than typical American neighborhoods. Among teenagers who register for a HarvardX course, those with a college-educated parent have nearly twice the odds of finishing the course compared to students whose parents did not complete college. At exactly the ages where online learning could offer a new pathway into higher education, already affluent students are more likely to enroll in a course and succeed.
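"Nearly twice the odds" is a statement about odds ratios, which are not the same thing as a doubled completion rate; a minimal sketch of the distinction, with invented completion probabilities rather than figures from the Reich-Hansen study:

```python
# Odds ratio vs. probability (risk) ratio for course completion.
# The completion probabilities below are hypothetical, chosen only
# to show how an odds ratio near 2 can coexist with a smaller
# ratio of completion rates.
p_college_parent = 0.10      # teens with a college-educated parent
p_no_college_parent = 0.054  # teens without one

def odds(p):
    """Convert a probability into odds in favor (p against 1 - p)."""
    return p / (1 - p)

odds_ratio = odds(p_college_parent) / odds(p_no_college_parent)
risk_ratio = p_college_parent / p_no_college_parent
print(round(odds_ratio, 2), round(risk_ratio, 2))
```

With these invented numbers the odds ratio is about 1.95 while the ratio of completion rates is about 1.85; the gap between the two measures widens as completion rates rise, so "twice the odds" should not be read as "twice as likely to finish."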
The takeaway is that MOOCs have not yet solved SES-related disparities in educational outcomes, and Reich believes it’s critical to turn these learnings into actions in order to narrow the gaps between MOOC perception and reality.
“MOOCs and other forms of online learning don’t yet live up to their promise to democratize education,” he says. “Closing this digital divide is exactly the kind of grand challenge that the world’s greatest universities should be tackling head on.”
Posted by Mark Thoma on Thursday, December 17, 2015 at 09:30 AM in Economics, Income Distribution, Technology, Universities |
Posted by Mark Thoma on Thursday, December 17, 2015 at 12:06 AM in Economics, Links |
As Expected, by Tim Duy: Today, the FOMC voted to raise the target range on the federal funds rate by 25bp. The accompanying statement and the Summary of Economic Projections offered no surprises. That very lack of surprise should be counted as a "win" for the Fed's communication strategy. A little bit of extra direction since September went a long way.
The statement again described economic growth as "moderate." Although there is some external weakness, the domestic economy is solid, hence "the Committee sees the risks to the outlook for both economic activity and the labor market as balanced." The Fed continues to expect that inflation will return to target. On the basis of that forecast and lags in the policy process:
Given the economic outlook, and recognizing the time it takes for policy actions to affect future economic outcomes, the Committee decided to raise the target range for the federal funds rate to 1/4 to 1/2 percent.
Importantly, the Fed does not believe policy is tight:
The stance of monetary policy remains accommodative after this increase, thereby supporting further improvement in labor market conditions and a return to 2 percent inflation.
The Fed currently expects future hikes to occur only gradually:
The Committee expects that economic conditions will evolve in a manner that will warrant only gradual increases in the federal funds rate; the federal funds rate is likely to remain, for some time, below levels that are expected to prevail in the longer run.
But this is a forecast, not a promise:
However, the actual path of the federal funds rate will depend on the economic outlook as informed by incoming data.
Note that the Fed highlights the importance of actual inflation outcomes with respect to future hikes:
In light of the current shortfall of inflation from 2 percent, the Committee will carefully monitor actual and expected progress toward its inflation goal.
The Fed will proceed cautiously if evidence suggests inflation is not behaving as expected. This doesn't mean they need to see more inflation to hike rates further. But it would be nice.
No dissents; none of the possible dissenters thought their objections were sufficient to deny Federal Reserve Chair Janet Yellen a unanimous decision on this first hike.
The median forecasts for growth, employment, and inflation were virtually unchanged. Note that the central tendency range for longer run unemployment shifted down; participants continue to shave down their estimates of the natural rate of unemployment. The median rate projection for 2017 and 2018 edged down. This understates somewhat the decline in the range of the central tendency.
As I am running short of time today, I will leave any analysis of the press conference for a later time. Gradual, data dependent, not mechanical (not equally spaced or sized hikes), etc.
Bottom Line: Almost exactly as should have been expected.
Posted by Mark Thoma on Wednesday, December 16, 2015 at 12:46 PM in Economics, Fed Watch, Monetary Policy |
Travel day. Will post as I can.
Posted by Mark Thoma on Wednesday, December 16, 2015 at 08:18 AM in Economics, Travel |
Must-Read: Kevin Hoover: The Methodology of Empirical Macroeconomics: The combination of representative-agent modeling and utility-based “microfoundations” was always a game of intellectual Three-Card Monte. Why do you ask? Why don’t we fund sociologists to investigate for what reasons–other than being almost guaranteed to produce conclusions ideologically-pleasing to some–it has flourished for a generation in spite of having no empirical support and no theoretical coherence?
Kevin Hoover: The Methodology of Empirical Macroeconomics: “Given what we know about representative-agent models…
…there is not the slightest reason for us to think that the conditions under which they should work are fulfilled. The claim that representative-agent models provide microfoundations succeeds only when we steadfastly avoid the fact that representative-agent models are just as aggregative as old-fashioned Keynesian macroeconometric models. They do not solve the problem of aggregation; rather they assume that it can be ignored. ...
Posted by Mark Thoma on Wednesday, December 16, 2015 at 12:15 AM in Economics, Macroeconomics, Methodology |
Private Profit with Public Guarantee: The Real Issue with Fannie and Freddie: The NYT had a column by Jim Parrott and Mark Zandi on reforming Fannie Mae and Freddie Mac. ... The article argues that the problem with Fannie Mae and Freddie Mac was that they were considered too big to fail. It therefore puts forward the case for ending their monopoly on issuing government-guaranteed mortgage-backed securities (MBS).
This argument seriously misrepresents the issues with Fannie Mae and Freddie Mac. The real problem was that they issued trillions of dollars in MBS that were implicitly backed up by the government. At the time they failed in the summer of 2008, the generally held view in financial circles was that the government would be obligated to honor their MBS regardless of whether or not it kept Fannie Mae and Freddie Mac in business. ...
This was a direct result of the perverse incentives created by a system where private shareholders and top executives stood to profit by passing risk off to the government. This incentive does not exist today. ... As long as Fannie and Freddie are essentially public companies that do not offer high returns to shareholders and do not pay outlandish salaries to CEOs, no one has an incentive to take excessive risks.
This changes if we allow private banks to issue mortgage-backed securities with the guarantee of the government. This would mean that Goldman Sachs, Citigroup, and the rest would be able to issue the same sort of subprime MBS they did in the bubble years with assurance that even in a worst-case scenario the government would reimburse investors for almost the full value of their investment. This is a great recipe for pumping up financial sector profits and another housing bubble. It does not make sense as public policy.
Posted by Mark Thoma on Wednesday, December 16, 2015 at 12:09 AM in Economics, Financial System, Housing, Market Failure |
Posted by Mark Thoma on Wednesday, December 16, 2015 at 12:06 AM in Economics, Links |
My latest column:
Donald Trump’s Divisiveness Is Bad for the Economy: White House spokesperson Josh Earnest described Donald Trump as “offensive and toxic,” though that only begins to describe the corrosive effect his bigotry, divisiveness, and xenophobia have on our society. It is at odds with our values as a nation.
It’s also bad for the economy. ...
Posted by Mark Thoma on Tuesday, December 15, 2015 at 09:09 AM in Economics, Politics |
The economic hurdles for beating global warming, by Mark Thoma: The Paris agreement on climate change is an important step forward in the battle to reduce greenhouse gas emissions. As the deal's negotiators acknowledged, the agreement won't stop climate change by itself, but it provides an important framework for moving forward toward that goal. ...
Posted by Mark Thoma on Tuesday, December 15, 2015 at 09:07 AM in Economics, Environment |
The Federal Reserve is set to raise interest rates this week for the first time since 2006.
The final days of the zero interest-rate policy known as ZIRP are upon us; the end is here.
But the end of ZIRP is the beginning of a new chapter of monetary policy. This chapter will tell the story of the Federal Reserve’s efforts to normalize policy, and that particular tale has yet to be written. You can, however, expect Fed Chair Janet Yellen to emphasize “gradually” and “data dependent” as she pens the first few lines of the narrative at this week’s press conference....
Continue reading on Bloomberg...
Posted by Mark Thoma on Tuesday, December 15, 2015 at 09:06 AM in Economics, Fed Watch, Monetary Policy |
Posted by Mark Thoma on Tuesday, December 15, 2015 at 12:06 AM in Economics, Links |
"Paris gives us real reason to hope":
Hope From Paris, by Paul Krugman, Commentary, NY Times: Did the Paris climate accord save civilization? Maybe. That may not sound like a ringing endorsement, but it’s actually the best climate news we’ve had in a very long time. ...
Until very recently there were two huge roadblocks in the way of any kind of global deal on climate: China’s soaring consumption of coal, and the implacable opposition of America’s Republican Party. ... But there have been important changes on both fronts.
On one side, there is a visible shift in Chinese attitudes... China faces a huge air quality crisis, brought on largely by coal-burning, which makes it far more willing to wean itself from the worst form of fossil fuel consumption. And China’s ... rapidly growing middle class ... demands a higher quality of life, including air that’s relatively safe to breathe. ...
Which brings us to the U.S. Republican attitudes...: the G.O.P. is spiraling ever deeper into a black hole of denial and anti-science conspiracy theorizing. The game-changing news is that this may not matter as much as we thought..., new technology has fundamentally changed the rules.
Many people still seem to believe that renewable energy is hippie-dippy stuff, not a serious part of our future. ... The reality, however, is that costs of solar and wind power have fallen dramatically, to the point where they are close to competitive with fossil fuels even without special incentives — and progress on energy storage has made their prospects even better. Renewable energy has also become a big employer...
This energy revolution has two big implications. The first is that the cost of sharp emission reductions will be much less than even optimists used to assume... The second is that given a moderate boost — the kind that the Paris accord could provide — renewable energy could quickly give rise to new interest groups with a positive stake in saving the planet, offering an offset to the Kochs and suchlike.
Of course, it could easily go all wrong. President Cruz or President Rubio might scuttle the whole deal, and by the time we get another chance to do something about climate it could be too late.
But it doesn’t have to happen. I don’t think it’s naïve to suggest that what came out of Paris gives us real reason to hope in an area where hope has been all too scarce. Maybe we’re not doomed after all.
Posted by Mark Thoma on Monday, December 14, 2015 at 12:33 AM in China, Economics, Environment, Politics |
Makes You Wonder What The Fed Is Thinking, by Tim Duy: The Fed is poised to raise the target range on the federal funds rate this week. More on that decision tomorrow. My interest tonight is a pair of Wall Street Journal articles that together call into question the wisdom of the Fed's expected decision. The first is on inflation, or lack thereof, by Josh Zumbrun:
Central bank officials predict inflation will approach their target in 2016. The trouble is they have made the same prediction for the past four years. If the Fed is again fooled, it may find it raised rates too soon, risking recession.
A key reason for the Federal Reserve to raise interest rates is to be ahead of the curve on inflation. But given their poor inflation forecasting record, not to mention that of other central banks,
why are they so sure that they must act now to head off inflationary pressures? One would expect waning confidence in their inflation forecasts to pull the center more toward the views of Chicago Federal Reserve President Charles Evans and Board Governors Lael Brainard and Daniel Tarullo and thus defer tighter policy until next year.
Now combine the inflation forecast uncertainty with the growing consensus among economists that the Fed faces the zero bound again in less than five years. This one's from Jon Hilsenrath:
Among 65 economists surveyed by The Wall Street Journal this month, not all of whom responded, more than half said it was somewhat or very likely the Fed’s benchmark federal-funds rate would be back near zero within the next five years. Ten said the Fed might even push rates into negative territory, as the European Central Bank and others in Europe have done—meaning financial institutions have to pay to park their money with the central banks...
Not a surprising conclusion given that Fed officials expect the terminal fed funds rate in the 3.3-3.8 percent range (central tendency) while the 2001-03 easing was 5.5 percentage points and the 1990-92 easing was 5.0 percentage points. You see of course how the math works. Supposedly this is of great concern at the Fed. Hilsenrath cites the October minutes:
Fed officials worry a great deal about the risk. The small gap between zero and where officials see rates going “might increase the frequency of episodes in which policy makers would not be able to reduce the federal-funds rate enough to promote a strong economic recovery…in the aftermath of negative shocks,” they concluded at their October policy meeting, according to minutes of the meeting.
The policy risks are asymmetric. They can always raise rates, but the room to lower is limited by the zero bound. But that understates the asymmetry. You should also include the asymmetry of risks around the inflation forecast. The Fed has repeatedly under-forecasted inflation. It seems like they should also see an asymmetry in the inflation forecast that compounds the policy response asymmetry. Asymmetries squared.
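The zero-bound arithmetic Duy alludes to above ("You see of course how the math works") can be made explicit with a quick back-of-the-envelope check, using only the numbers quoted in the post: a terminal fed funds rate of 3.3 to 3.8 percent against past easing cycles of 5.5 and 5.0 percentage points.

```python
# Back-of-the-envelope check of the zero-bound arithmetic in the post:
# if the fed funds rate peaks in the 3.3-3.8 percent range and a typical
# recession response requires a 5.0-5.5 percentage-point cut, the implied
# "floor" is negative -- i.e., the Fed runs out of room.

terminal_rates = (3.3, 3.8)  # percent, central tendency from the post
past_easings = {"2001-03": 5.5, "1990-92": 5.0}  # percentage points

for label, easing in past_easings.items():
    for terminal in terminal_rates:
        implied_floor = terminal - easing
        print(f"{label}-sized easing from {terminal}%: "
              f"implied floor {implied_floor:+.1f}%")
```

Every combination lands below zero, which is exactly why more than half of the surveyed economists expect the Fed to be back at the zero bound within five years.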
Given all of these asymmetries, I would think the Fed should continue to stand pat until they better understood inflation dynamics. The Fed thinks otherwise. Why would Federal Reserve Chair Janet Yellen allow the Fed to be pulled in such a direction? Partly to appease the Fed hawks. And then there is this from her December speech:
Were the FOMC to delay the start of policy normalization for too long, we would likely end up having to tighten policy relatively abruptly to keep the economy from significantly overshooting both of our goals. Such an abrupt tightening would risk disrupting financial markets and perhaps even inadvertently push the economy into recession.
Yellen is wedded to the theory that the sooner the Fed begins normalizing policy, the more likely the Fed can avoid a recession-inducing sharp rise in rates. She follows up this concern with:
Moreover, holding the federal funds rate at its current level for too long could also encourage excessive risk-taking and thus undermine financial stability.
This is what Mark Dow calls "avalanche patrol":
What the Fed has begun to worry about is financial stability—even if not as an imminent threat. Its concerns are one part risk management, one part the ghost of crises past. FOMC members understand that financial excesses are a positive function of time. Stability sooner or later breeds instability. And the longer rates stay very low, the greater the risk they become built into the current financial architecture and baked into our extrapolations. Once you get to such a point, an eventual normalization becomes a lot riskier, in terms of both financial dislocations and economic activity.
This then becomes a story of a Fed caught between a world in which the policy necessary to meet their inflation target is inconsistent with financial stability. That is what they call caught between a rock and a hard place. And my sense is that Yellen feels the best way to slip through those cracks is early and gentle tightening.
Bottom Line: Given that the Fed likely only gets one chance to lift-off from the zero bound on a sustained basis, it is reasonable to think they would wait until they were absolutely sure inflation was coming. Even more so given the poor performance of their inflation forecasts. But the Fed thinks there is now more danger in waiting than moving. And so into the darkness we go.
Posted by Mark Thoma on Monday, December 14, 2015 at 12:24 AM in Economics, Fed Watch, Monetary Policy |
From Vox EU:
Offshoring and unskilled labour demand: Evidence that trade matters, by Juan Carluccio, Alejandro Cuñat, Harald Fadinger, and Christian Fons-Rosen: ... Conclusions The empirical evidence suggesting that globalisation is indeed affecting the income distribution in industrialised countries is much stronger than initially thought, at least as far as the manufacturing industry is concerned. In fact, the productivity gains from having access to cheaper inputs through offshoring are not being distributed equally between different economic actors in our rich societies. Consequently, political resistance to free trade by certain groups has a clear rationale as long as the effects discussed above are not addressed through more effective redistribution policies.
Posted by Mark Thoma on Monday, December 14, 2015 at 12:15 AM in Economics, Income Distribution, International Trade |
Posted by Mark Thoma on Monday, December 14, 2015 at 12:06 AM in Economics, Links |
John von Neumann and stochastic simulations, Understanding Society: John von Neumann was one of the genuine mathematical geniuses of the twentieth century. A particularly interesting window onto von Neumann's scientific work is provided by George Dyson in his book, Turing's Cathedral: The Origins of the Digital Universe. The book is as much an intellectual history of the mathematics and physics expertise of the Princeton Institute for Advanced Study as it is a study of any one individual, but von Neumann plays a key role in the story. His contribution to the creation of the general-purpose digital computer helped to lay the foundations for the digital world in which we now all live.
There are many interesting threads in von Neumann's intellectual life, but one aspect that is particularly interesting to me is the early application of the new digital computing technology to the problem of simulating large complex physical systems. Modeling weather and climate were topics for which researchers sought solutions using the computational power of first-generation digital computers, and the research needed to understand and design thermonuclear devices had an urgent priority during the war and post-war years. Here is a description of von Neumann's role in the field of weather modeling in designing the early applications of ENIAC (P. Lynch, "From Richardson to early numerical weather prediction"; link):
John von Neumann recognized weather forecasting, a problem of both great practical significance and intrinsic scientific interest, as ideal for an automatic computer. He was in close contact with Rossby, who was the person best placed to understand the challenges that would have to be addressed to achieve success in this venture. Von Neumann established a Meteorology Project at the Institute for Advanced Study in Princeton and recruited Jule Charney to lead it. Arrangements were made to compute a solution of a simple equation, the barotropic vorticity equation (BVE), on the only computer available, the ENIAC. Barotropic models treat the atmosphere as a single layer, averaging out variations in the vertical. The resulting numerical predictions were truly ground-breaking. Four 24-hour forecasts were made, and the results clearly indicated that the large-scale features of the mid-tropospheric flow could be forecast numerically with a reasonable resemblance to reality. (Lynch, 9)
A key innovation in the 1950s in the field of advanced computing was the invention of Monte Carlo simulation techniques to assist in the invention and development of the hydrogen bomb. Thomas Haigh, Mark Priestley, and Crispin Rope describe the development of the software supporting Monte Carlo simulations in the ENIAC machine in a contribution to the IEEE Annals of the History of Computing (link). Peter Galison offers a detailed treatment of the research communities that grew up around these new computational techniques (link). Developed first as a way of modeling nuclear fission and nuclear explosives, these techniques proved to be remarkably powerful for allowing researchers to simulate and calculate highly complex causal processes. Here is how Galison summarizes the approach:
Christened "Monte Carlo" after the gambling mecca, the method amounted to the use of random numbers (a la roulette) to simulate the stochastic processes too complex to calculate in full analytic glory. But physicists and engineers soon elevated the Monte Carlo above the lowly status of a mere numerical calculation scheme; it came to constitute an alternative reality--in some cases a preferred one--on which "experimentation" could be conducted. (119)
At Los Alamos during the war, physicists soon recognized that the central problem was to understand the process by which neutrons fission, scatter, and join uranium nuclei deep in the fissile core of a nuclear weapon. Experiment could not probe the critical mass with sufficient detail; theory led rapidly to unsolvable integro-differential equations. With such problems, the artificial reality of the Monte Carlo was the only solution--the sampling method could "recreate" such processes by modeling a sequence of random scatterings on a computer. (120)
The approach that Ulam, Metropolis, and von Neumann proposed to take for the problem of nuclear fusion involved fundamental physical calculations and statistical estimates of interactions between neutrons and surrounding matter. They proposed to calculate the evolution of the states of a manageable number of neutrons as they traveled from a central plutonium source through spherical layers of other materials. The initial characteristics and subsequent interactions of the sampled neutrons were assigned using pseudo-random numbers. A manageable number of sampled spaces within the unit cube would be "observed" for the transit of a neutron (127) (10^4 observations). If the percentage of fission calculated in the sampled spaces exceeded a certain value, then the reaction would be self-sustaining and explosive. Here is how the simulation would proceed:
Von Neumann went on to specify the way the simulation would run. First, a hundred neutrons would proceed through a short time interval, and the energy and momentum they transferred to ambient matter would be calculated. With this "kick" from the neutrons, the matter would be displaced. Assuming that the matter was in the middle position between the displaced position and the original position, one would then recalculate the history of the hundred original neutrons. This iteration would then repeat until a "self-consistent system" of neutron histories and matter displacement was obtained. The computer would then use this endstate as the basis for the next interval of time, delta t. Photons could be treated in the same way, or if the simplification were not plausible because of photon-matter interactions, light could be handled through standard diffusion methods designed for isotropic, black-body radiation. (129)
Galison argues that there were two fairly different views in play of the significance of Monte Carlo methods in the 1950s and 1960s. According to the first view, they were simply a calculating device permitting the "computational physicist" to calculate values for outcomes that could not be observed or theoretically inferred. According to the second view, Monte Carlo methods were interpreted realistically. Their statistical underpinnings were thought to correspond exactly to the probabilistic characteristics of nature; they represented a stochastic view of physics.
King's view--that the Monte Carlo method corresponded to nature (got "back of the physics of the problem") as no deterministic differential equation ever could--I will call stochasticism. It appears in myriad early uses of the Monte Carlo, and clearly contributed to its creation. In 1949, the physicist Robert Wilson took cosmic-ray physics as a perfect instantiation of the method: "The present application has exhibited how easy it is to apply the Monte Carlo method to a stochastic problem and to achieve without excessive labor an accuracy of about ten percent." (146)
This is a very bold interpretation of a simulation technique. Rather than looking at the model as an abstraction from reality, this interpretation looks at the model as a digital reproduction of that reality. "Thus for the stochasticist, the simulation was, in a sense, of a piece with the natural phenomenon" (147).
One thing that is striking in these descriptions of the software developed in the 1950s to implement Monte Carlo methods is the very limited size and computing power of the first-generation general-purpose computing devices. Punch cards represented "the state of a single neutron at a single moment in time" (Haigh et al link 45), and the algorithm used pseudo-random numbers and basic physics to compute the next state of this neutron. The basic computations used third-order polynomial approximations (Haigh et al link 46) to compute future states of the neutron. The simulation described here resulted in the production of one million punched cards. It would seem that today one could use a spreadsheet to reproduce the von Neumann Monte Carlo simulation of fission, with each line being the computed result from the previous line after application of the specified mathematical functions to the data represented in the prior line. So a natural question to ask is -- what could von Neumann have accomplished if he had Excel in his toolkit? Experts -- is this possible?
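The spreadsheet question at the end of the post can be answered in the affirmative in spirit: the core of the method — sample a neutron's fate from pseudo-random numbers, tally the outcomes, estimate a population-level quantity — takes only a few lines today. The sketch below is emphatically not von Neumann's actual scheme (which tracked position, energy, and material layers); all parameters here are invented for illustration. Each simulated neutron either causes a fission, releasing new neutrons, or is lost (escape or absorption), and we estimate the mean number of next-generation neutrons per starting neutron — a value above 1 suggests a self-sustaining chain.

```python
import random

# Minimal Monte Carlo sketch in the spirit of the neutron calculations
# described above (NOT von Neumann's actual algorithm; parameters are
# hypothetical). Each trial neutron fissions with probability p_fission,
# yielding on average neutrons_per_fission new neutrons; otherwise it is
# lost. We estimate the multiplication factor k by sampling.

def next_generation(p_fission=0.4, neutrons_per_fission=2.5,
                    trials=100_000, seed=42):
    rng = random.Random(seed)  # seeded for reproducibility
    total = 0
    for _ in range(trials):
        if rng.random() < p_fission:  # fission event
            # integer yield of 2 or 3, averaging neutrons_per_fission
            total += int(neutrons_per_fission)
            if rng.random() < neutrons_per_fission % 1:
                total += 1
        # otherwise the neutron escapes or is absorbed: yield 0
    return total / trials

k = next_generation()
print(f"estimated multiplication factor k ~ {k:.3f}")
```

With these made-up parameters the true expectation is 0.4 × 2.5 = 1.0, and 100,000 sampled "punch cards" recover it to within a fraction of a percent — which is essentially the spreadsheet-row-per-neutron scheme the post imagines, minus Excel.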
Posted by Mark Thoma on Sunday, December 13, 2015 at 03:40 PM in Economics |
Posted by Mark Thoma on Sunday, December 13, 2015 at 12:06 AM in Economics, Links |
Interesting theory from Paul Krugman:
Debt and Demographic Debt Spirals: I’m in Portugal... I have been doing some homework about the terrible times Portugal has recently suffered. ...
We used to think that high labor mobility was a good thing for currency unions, because it would allow the union’s economy to adjust to asymmetric shocks — booms in some places, busts in others — by moving workers rather than having to cut wages in the lagging regions. But what about the tax base? If bad times cause one country’s workers to leave in large numbers, who will service its debt and care for its retirees?
Indeed, it’s easy conceptually to see how a country could enter a demographic death spiral. Start with a high level of debt, explicit and implicit. If the work force falls through emigration, servicing this debt will require higher taxes on those who remain, which could lead to more emigration, and so on. ...
Portugal, with its long tradition of outmigration, may be more vulnerable than most, but I have no idea whether it’s really in that zone. ...
Now, it’s true that emigration in an economy with mass unemployment doesn’t immediately reduce the tax base, since the marginal worker wouldn’t have been employed anyway. But it sets things up for longer-run deterioration. ...
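The feedback loop Krugman describes — fixed debt service spread over a shrinking workforce raises per-worker taxes, which induces further emigration — is easy to see in a toy simulation. All parameters below are invented for illustration; nothing here is calibrated to Portugal.

```python
# Toy illustration (all parameters hypothetical) of the demographic debt
# spiral described above: a fixed debt-service burden divided among fewer
# workers raises the tax per worker, which drives further emigration.

def simulate(workers=100.0, debt_service=20.0,
             emigration_sensitivity=2.0, years=10):
    """Return the per-worker tax burden for each year.

    Each year the tax per worker is debt_service / workers; the
    workforce then shrinks in proportion to that burden, closing
    the feedback loop.
    """
    history = []
    for _ in range(years):
        tax_per_worker = debt_service / workers
        history.append(tax_per_worker)
        # higher burden -> more emigration -> smaller workforce
        shrink = emigration_sensitivity * tax_per_worker / 100.0
        workers *= max(0.0, 1.0 - shrink)
    return history

burden = simulate()
print([round(b, 4) for b in burden])  # burden rises year after year
```

The per-worker burden rises monotonically: the mechanism is self-reinforcing even with modest parameters, which is the conceptual point of the "death spiral" — though whether any real country is in that zone is, as Krugman says, an empirical question.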
Krugman on Portugal and the Migration Death Spiral: Paul Krugman comments that Portugal may be in a situation where its aging population, combined with a large outflow of younger people due to high unemployment, leads to an ever-worsening financial situation in which fewer workers are left to support a larger debt and non-working population. (Note: the key factor here is the migration, not the aging.)
That pretty well describes the picture with Puerto Rico, with a large segment of its working age population moving to the mainland United States, leaving the island with relatively few working people to provide taxable income. The upside for Puerto Rico is that at least it has benefits like Social Security and Medicare covered by the national government, compared with Portugal, which must pay for its equivalents from its own tax revenue.
Posted by Mark Thoma on Saturday, December 12, 2015 at 11:52 AM in Economics |
Posted by Mark Thoma on Saturday, December 12, 2015 at 12:06 AM in Economics, Links |
I think I disagree with Brad DeLong:
Why it's tricky for Fed officials to talk politically: Should speeches by Federal Reserve officials be limited to topics concerning monetary policy and financial stability, or should they be free to speak on any topic, no matter how politically charged it might be? It's an important question as the Fed prepares to announce next week what's looking like a significant change in its eight-year policy of zero-percent interest rates.
Fed Chair Janet Yellen, for example, was sharply criticized for a speech last year highlighting what economists know about rising inequality and what might be done to overcome it.
This speech, which Yellen gave in October 2014, is still creating controversy. This week, it erupted again when UC Berkeley economist Brad DeLong defended Yellen against the charge that she's a "partisan hack," a description in the headline of a Washington Post story by Michael Strain after Yellen's speech. ...
Posted by Mark Thoma on Friday, December 11, 2015 at 09:06 AM in Economics, Monetary Policy, Politics |
Reaping what they've sown:
Empowering the Ugliness, by Paul Krugman, Commentary, NY Times: We live in an era of political news that is, all too often, shocking but not surprising. The rise of Donald Trump definitely falls into that category. And so does the electoral earthquake that struck France in Sunday’s regional elections, with the right-wing National Front winning more votes than either of the major mainstream parties. ...
Let me start with ... Europe..., from an American perspective it looks as if Europe’s establishment has tried to freeze the xenophobic right, not just out of political power, but out of any role in acceptable discourse. ...
What the European establishment may not have realized, however, is that its ability to define the limits of discourse rests on the perception that it knows what it is doing. ...The European project ... has never had deep popular support...
And there’s nothing quite like sustained poor economic performance ... brought on by Europe’s austerity and hard-money obsessions ... to undermine the elite’s reputation for competence. That’s probably why one recent study found a consistent historical relationship between financial crises and the rise of right-wing extremism. And history is repeating itself.
The story is quite different in America, because the Republican Party hasn’t tried to freeze out the kind of people who vote National Front in France. Instead, it has tried to exploit them, mobilizing their resentment via dog whistles to win elections. ...
But there is a strong element of bait-and-switch to this strategy. Whatever dog whistles get sent during the campaign, once in power the G.O.P. has made serving the interests of a small, wealthy economic elite, especially through big tax cuts, its main priority...
Sooner or later the angry whites who make up a large fraction, maybe even a majority, of the G.O.P. base were bound to rebel...
So along comes Donald Trump, saying bluntly the things establishment candidates try to convey in coded, deniable hints, and sounding as if he really means them. And he shoots to the top of the polls. Shocking..., but hardly surprising. ...
What I am saying ... is that this ugliness has been empowered by the very establishments that now act so horrified... In Europe the problem is the arrogance and rigidity of elite figures who refuse to learn from economic failure; in the U.S. it’s the cynicism of Republicans who summoned up prejudice to support their electoral prospects. And now both are facing the monsters they helped create.
Posted by Mark Thoma on Friday, December 11, 2015 at 12:24 AM in Economics, Politics |