
Monday, June 09, 2008

Real-Time Assessment of the Economy

Ben Bernanke gave a speech today at a Boston Fed conference on inflation and the Phillips curve. Part of the speech discusses the difficulties of real-time policymaking, and those remarks are repeated below. To complement the discussion, I've also included an academic paper by S. Boragan Aruoba, Francis X. Diebold, and Chiara Scotti that develops "a framework for high-frequency business conditions assessment" as an attempt to provide a solution to this problem. The paper is, essentially, a call to action: it attempts to lead the way by providing a methodology for obtaining real-time assessments of economic conditions and an illustrative example (see the graph below). This is probably geekier than I realize:

Outstanding Issues in the Analysis of Inflation, by Ben S. Bernanke, FRB: ...Forecasting and controlling inflation are, of course, central to the process of making monetary policy. In this respect, policymakers are fortunate to be able to build on an intellectual foundation provided by extensive research and practical experience. Nonetheless, much remains to be learned about both inflation forecasting and inflation control. In the spirit of this conference, my remarks this evening will highlight some key areas where additional research could help to provide a still-firmer foundation for monetary policymaking.

...I will briefly touch on four topics of particular interest for policymakers:  commodity prices and inflation, the role of labor costs in the price-setting process, issues arising from the necessity of making policy in real time, and the determinants and effects of changes in inflation expectations. ...

Real-Time Policymaking
The measurement issues I just raised point to another important concern of policymakers, namely, the necessity of making decisions in "real time," under conditions of great uncertainty--including uncertainty about the underlying state of the economy--and without the benefit of hindsight.

In the context of Phillips curve analysis, a number of researchers have highlighted the difficulty of assessing the output gap--the difference between actual and potential output--in real time.[6] An inability to measure the output gap in real time obviously limits the usefulness of the concept in practical policymaking. On the other hand, to argue that output gaps are very difficult to measure in real time is not the same as arguing that economic slack does not influence inflation; indeed, the bulk of the evidence suggests that there is a relationship, albeit one that may be less pronounced than in the past.[7] These observations suggest two useful directions for research: First, more obviously, there is scope to continue the search for measures or indicators of output gaps that provide useful information in real time. Second, we need to continue to think through the decision procedures that policymakers should use under conditions of substantial uncertainty about the state of the economy and underlying economic relationships. For example, even if the output gap is poorly measured, by taking appropriate account of measurement uncertainties and combining information about the output gap with information from other sources, we may be able to achieve better policy outcomes than would be possible if we simply ignored noisy output gap measures. Of course, similar considerations apply to other types of real-time economic information.
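To make that last point concrete: the standard way to "take appropriate account of measurement uncertainties" is precision-weighted signal extraction, i.e., a one-step Kalman update. Here is a small sketch (mine, not Bernanke's; the prior and measurement variances are hypothetical) of how a noisy real-time output-gap estimate gets discounted toward prior information rather than simply ignored:

```python
# A minimal sketch of precision-weighted signal extraction (a one-step
# Kalman update). Illustration only -- not from the speech; the prior and
# measurement variances below are hypothetical.

def combine_estimates(prior_mean, prior_var, measured_gap, meas_var):
    """Combine a prior view of the output gap with a noisy measurement."""
    k = prior_var / (prior_var + meas_var)  # weight on the noisy measure
    post_mean = prior_mean + k * (measured_gap - prior_mean)
    post_var = (1.0 - k) * prior_var
    return post_mean, post_var

# Prior: gap of 0% with variance 1.0. Real-time estimate: -2%, but measured
# with a lot of noise (variance 3.0), so it is discounted, not discarded.
mean, var = combine_estimates(0.0, 1.0, -2.0, 3.0)
print(mean, var)  # -0.5 0.75
```

The noisier the real-time measure, the smaller the weight k it receives, which is exactly the sense in which a poorly measured output gap can still carry useful information.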

Inflation itself can pose real-time measurement challenges. We have multiple measures of inflation, each of which reflects different coverage, methods of construction, and seasonality, and each of which is subject to statistical noise arising from sampling, imputation of certain prices, and temporary or special factors affecting certain markets. From these measures and other information, policymakers attempt to infer the "true" underlying rate of inflation. In other words, policymakers must read the incoming data in real time to judge which changes in inflation are likely to be transitory and which may prove more persistent.  Getting this distinction right has first-order implications for monetary policy: Because monetary policy works with a lag, policy should be calibrated based on forecasts of medium-term inflation, which may differ from the current inflation rate. The need to distinguish changes in the inflation trend from temporary movements around that trend has motivated attention to various measures of "core," or underlying, inflation, including measures that exclude certain prices (such as those of food and energy), "trimmed mean" measures, and others, but other approaches are certainly worth consideration.[8] Further work on the problem of filtering the incoming data so as to obtain better measures of the underlying inflation trend could be of great value to policymakers.
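For the curious, here is what a trimmed-mean calculation looks like (my sketch, with made-up components and an illustrative 8%-per-tail trim; actual measures such as the Dallas Fed's trimmed-mean PCE use specific trim points and full component detail):

```python
# A minimal sketch of a weighted trimmed-mean inflation measure.
# Components, weights, and the 8%-per-tail trim are illustrative.
import numpy as np

def trimmed_mean(rates, weights, trim=0.08):
    """Sort component price changes, drop `trim` of the weight in each
    tail, and return the weighted mean of the surviving components."""
    order = np.argsort(rates)
    r = np.asarray(rates, dtype=float)[order]
    w = np.asarray(weights, dtype=float)[order]
    w /= w.sum()
    hi = np.cumsum(w)   # cumulative weight through component i
    lo = hi - w         # cumulative weight just below component i
    keep = (hi > trim) & (lo < 1.0 - trim)  # components in the central band
    return np.average(r[keep], weights=w[keep])

# Hypothetical y/y price changes and expenditure weights; the extreme
# food/energy-like tails (12.0 and -6.0) get trimmed away.
rates   = [12.0, 3.1, 2.4, 2.2, 1.8, -0.5, -6.0]
weights = [0.05, 0.10, 0.30, 0.25, 0.15, 0.10, 0.05]
print(trimmed_mean(rates, weights))  # 2.0
```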

The necessity of making policy in real time highlights the importance of maintaining and improving the economic data infrastructure and, in particular, working to make economic data timelier and more accurate. I noted earlier the problems in interpreting existing measures of labor compensation. Significant scope exists to improve the quality of price data as well--for example, by using the wealth of information available from checkout scanners or finding better ways to adjust for quality change. I encourage researchers to become more familiar with the strengths and shortcomings of the data that they routinely use. ...

Here's some work on this problem (also, see this link to graphs of the real activity index on Francis Diebold's website; the site says the index is updated weekly, but the last update appears to have been in April). The key graph is reproduced here:

[Graph: the real activity index, reproduced from Francis Diebold's website]

Real-Time Measurement of Business Conditions, by S. Boragan Aruoba, Francis X. Diebold, and Chiara Scotti:

1 Introduction

Aggregate business conditions are of central importance in the business, finance, and policy communities, worldwide, and huge resources are devoted to assessment of the continuously evolving state of the real economy. Literally thousands of newspapers, newsletters, television shows, and blogs, not to mention armies of employees in manufacturing and service industries, including the financial services industries, central banks, and government and non-government organizations, grapple constantly with the measurement and forecasting of evolving business conditions. Of central importance is the constant grappling. Real economic agents, making real decisions, in real time, want accurate and timely estimates of the state of real activity. Business cycle chronologies such as the NBER’s, which announce expansions and contractions very long after the fact, are not useful for guiding such decisions.[1] Against this background, we propose and illustrate a framework for high-frequency business conditions assessment in a systematic, replicable, and statistically optimal manner. Our framework has four key ingredients.

Ingredient 1. We work with a dynamic factor model, treating business conditions as an unobserved variable, related to observed indicators. Latency of business conditions is consistent with economic theory (e.g. Lucas, 1977), which emphasizes that the business cycle is not about any single variable, whether GDP, industrial production, sales, employment, or anything else. Rather, the business cycle is about the dynamics and interactions (“comovements”) of many variables.
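In its simplest one-factor form (notation mine for illustration, not taken from the paper), the idea is that a single latent state x_t drives the comovement of all the observed indicators y_it:

\[
x_t = \rho\, x_{t-1} + \eta_t, \qquad \eta_t \sim \text{iid } N(0, \sigma_\eta^2),
\]
\[
y_{it} = c_i + \beta_i x_t + u_{it}, \qquad u_{it} \sim \text{iid } N(0, \sigma_i^2), \quad i = 1, \dots, N,
\]

where x_t is unobserved "business conditions," each indicator loads on it through its own β_i, and the idiosyncratic terms u_it pick up variation specific to each series.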

Treating business conditions as latent is also a venerable tradition in empirical business cycle analysis, ranging from the earliest work to the most recent, and from the statistically informal to the statistically formal. On the informal side, latency of business conditions is central to many approaches, from the classic early work of Burns and Mitchell (1946) to the recent workings of the NBER business cycle dating committee, as described for example by Hall et al. (2003). On the formal side, latency of business conditions is central to the popular dynamic factor framework, whether from the “small data” perspective of Geweke (1977), Sargent and Sims (1977), Stock and Watson (1989, 1991) and Diebold and Rudebusch (1996), or the more recent “large data” perspective of Stock and Watson (2002) and Forni, Hallin, Lippi and Reichlin (2000).[2]

Ingredient 2. We explicitly incorporate business conditions indicators measured at different frequencies. Important business conditions indicators do in fact arrive at a variety of frequencies, including quarterly (e.g., GDP), monthly (e.g., industrial production), weekly (e.g., initial jobless claims), and continuously (e.g., asset prices), and we want to be able to incorporate all of them, to provide continuously-updated measurements.

Ingredient 3. We explicitly incorporate indicators measured at high frequencies. Given that our goal is to track the high-frequency evolution of real activity, it is important to incorporate (or at least not exclude from the outset) the high-frequency information flow associated with high-frequency indicators.
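Operationally, ingredients 2 and 3 amount to putting every indicator on the highest frequency and treating days without a release as missing. A small sketch of that data layout (mine, not the paper's code; dates, release timing, and values are all stylized):

```python
# A minimal sketch of a mixed-frequency data panel: everything lives at
# the daily frequency, with NaN on days a series is not observed.
# Series names, dates, and values are stylized, not real releases.
import pandas as pd

days = pd.date_range("2008-01-01", "2008-03-31", freq="D")
panel = pd.DataFrame(index=days, dtype=float,
                     columns=["gdp", "employment", "claims", "term_premium"])

panel.loc["2008-03-31", "gdp"] = 0.6                  # quarterly: one obs
panel.loc[days[days.day == 1], "employment"] = 0.1    # monthly (stylized)
panel.loc[days[days.weekday == 3], "claims"] = 350.0  # weekly (Thursdays)
panel["term_premium"] = 0.5                           # daily: every day

# A filter can then treat every NaN as a missing observation, so no
# interpolation or pre-aggregation of the raw data is ever needed.
```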

Ingredient 4. We extract and forecast latent business conditions using linear yet statistically optimal procedures, which involve no approximations. The appeal of exact as opposed to approximate procedures is obvious, but achieving exact optimality is not trivial, due to complications arising from temporal aggregation of stocks vs. flows in systems with mixed-frequency data.
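The stock-versus-flow complication can be made concrete (this gloss is mine, following the standard cumulator-variable device for temporal aggregation in state-space models, not a quotation from the paper). For a flow variable like GDP, the quarterly observation is the within-quarter sum of the latent high-frequency flow, which can be built up recursively:

\[
\tilde{y}_t = \xi_t\, \tilde{y}_{t-1} + y_t, \qquad
\xi_t =
\begin{cases}
0 & \text{if } t \text{ begins a new quarter,} \\
1 & \text{otherwise,}
\end{cases}
\]

so the cumulator \(\tilde{y}_t\) enters the state vector linearly and the system remains in exact linear state-space form. A stock variable, by contrast, is simply the value of the state on the day it is observed, so no cumulation is needed.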

Related to our concerns and framework is a small but nevertheless important literature, including Stock and Watson (1989, 1991), Mariano and Murasawa (2003), Evans (2005) and Proietti and Moauro (2006), as well as Shen (1996), Abeysinghe (2000), Altissimo et al. (2002), Liu and Hall (2001), McGuckin, Ozyildirim and Zarnowitz (2003), Ghysels, Santa Clara and Valkanov (2004), and Giannone, Reichlin and Small (2008). Our contribution is different in certain respects, and similar in others, and both the differences and similarities are intentional. Let us begin by highlighting some of the differences. First, some authors like Stock and Watson (1989, 1991) work in a dynamic factor framework with exact linear filtering, but they don’t consider data at mixed frequencies or at high frequencies.

Second, other authors like Evans (2005) do not use a dynamic factor framework and do not use high-frequency data, instead focusing on estimating high-frequency GDP. Evans (2005), for example, equates business conditions with GDP growth and uses state space methods to estimate daily GDP growth using data on preliminary, advance, and final releases of GDP and other macroeconomic variables.

Third, authors like Mariano and Murasawa (2003) work in a dynamic factor framework and consider data at mixed frequencies, but not high frequencies, and their filtering algorithm is only approximate. Proietti and Moauro (2006) avoid the Mariano-Murasawa approximation at the cost of moving to a non-linear model with a corresponding rather tedious non-linear filtering algorithm.

Ultimately, however, the similarities between our work and others’ are more important than the differences, as we stand on the shoulders of many earlier authors. Effectively we (1) take a small-data dynamic factor approach to business conditions analysis, (2) recognize the potential practical value of extending the approach to mixed-frequency data settings involving some high-frequency data, (3) recognize that doing so amounts to a filtering problem with a large amount of missing data, which the Kalman filter is optimally designed to handle, and (4) provide a prototype example of the framework in use. Hence the paper is really a “call to action,” a call to move the state-space dynamic-factor framework closer to its high-frequency limit, and hence to move statistically-rigorous business conditions analysis closer to its high-frequency limit.
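To see why point (3) is "just" a missing-data filtering problem, here is a stripped-down sketch of a Kalman filter for the one-factor model above that updates only on whichever indicators report on a given day (my illustration with a scalar state; the paper's actual system, with cumulators and richer dynamics, is larger but works the same way):

```python
# A minimal sketch of Kalman filtering with missing data for the
# one-factor model x_t = rho*x_{t-1} + eta_t, y_it = beta_i*x_t + u_it.
# Illustration only; parameter values and data below are made up.
import numpy as np

def filter_factor(Y, rho, betas, sig_eta2, sig_u2):
    """Y: (T, N) array with np.nan where a series is unobserved.
    Returns the filtered estimate of the latent factor x_t."""
    T, N = Y.shape
    x, P = 0.0, sig_eta2 / (1.0 - rho**2)  # start at unconditional moments
    xhat = np.empty(T)
    for t in range(T):
        x, P = rho * x, rho**2 * P + sig_eta2   # predict
        obs = ~np.isnan(Y[t])                   # which series report today
        if obs.any():                           # update on observed rows only
            b = betas[obs]
            S = np.outer(b, b) * P + np.diag(sig_u2[obs])
            K = P * b @ np.linalg.inv(S)        # Kalman gain
            x = x + K @ (Y[t, obs] - b * x)
            P = P - (K @ b) * P
        xhat[t] = x
    return xhat

# Hypothetical use: 100 days, 4 indicators, the first mostly unobserved.
rng = np.random.default_rng(0)
Y = rng.normal(size=(100, 4))
Y[np.arange(100) % 30 != 0, 0] = np.nan   # series 0 observed "monthly"
x_est = filter_factor(Y, rho=0.95, betas=np.array([1.0, 0.8, -0.5, 0.3]),
                      sig_eta2=0.1, sig_u2=np.full(4, 0.5))
```

On days when nothing is released, the filter simply propagates its prediction forward; when any subset of indicators arrives, only those rows of the observation equation are used, which is how arbitrarily ragged data edges are handled without approximation.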

We proceed as follows. In section 2 we provide a detailed statement of our dynamic-factor modeling framework, and in section 3 we represent it as a state space filtering problem with a large amount of missing data. In section 4 we report the results of a four-indicator prototype empirical analysis, using quarterly GDP, monthly employment, weekly initial jobless claims, and the daily yield curve term premium. In section 5 we report the results of a simulation exercise, calibrated to our empirical estimates, which lets us illustrate our methods and assess their efficacy in a controlled environment. In section 6 we conclude and offer directions for future research.

Footnotes

[1] We do not wish to imply, however, that the NBER chronology is not useful at all. Indeed, it is exceptionally useful for what it is: a retrospective historical chronology of business cycles.

[2] For discussion of small-data vs. large-data dynamic factor modeling, see Diebold (2003).
