Economic Fluctuations and Growth Research Meeting
Japan Project Meets
Fall Research Meeting on Economic Fluctuations and Growth
Political Economy
Market Microstructure

Economic Fluctuations and Growth Research Meeting

The NBER's Economic Fluctuations and Growth Program held its annual Research Meeting on July 15 in Cambridge. Daron Acemoglu, NBER and MIT, and Anil K Kashyap, NBER and University of Chicago, organized the meeting. The following papers were discussed:

Ricardo J. Caballero, MIT and NBER; Emmanuel Farhi, MIT; and Pierre-Olivier Gourinchas, University of California, Berkeley and NBER, "An Equilibrium Model of Global Imbalances and Low Interest Rates"
Discussant: Lars E. O. Svensson, Princeton University and NBER

Jonathan Heathcote, Georgetown University, and Fabrizio Perri, New York University and NBER, "The International Diversification Puzzle is Not as Bad as You Think"
Discussant: Nobuhiro Kiyotaki, London School of Economics and NBER

Raj Chetty, University of California, Berkeley and NBER, and Adam Szeidl, University of California, Berkeley, "Consumption Commitments and Risk Preferences"
Discussant: John C. Heaton, University of Chicago and NBER

Nick Bloom, Stanford University and NBER, "The Impact of Uncertainty Shocks: Firm Level Estimation and a 9/11 Simulation"
Discussant: Valerie A. Ramey, University of California, San Diego and NBER

Fatih Guvenen and Burhanettin Kuruscu, University of Texas at Austin, "Understanding Wage Inequality: Ben-Porath Meets Skill-Biased Technical Change"
Discussant: Steven J. Davis, University of Chicago and NBER

Francisco J. Buera, Northwestern University, and Joseph P. Kaboski, Ohio State University, "The Rise of the Service Economy"
Discussant: Robert E. Hall, Stanford University and NBER

Three of the most important recent facts in global macroeconomics appear anomalous from the perspective of conventional wisdom and models: the sustained rise in the U.S. current account deficit, the stubborn decline in long-run real rates, and the rise in the share of U.S. assets in global portfolios. Caballero and his co-authors provide a model that rationalizes these facts as an equilibrium outcome of two observed forces: 1) potential growth differentials among different regions of the world; and 2) heterogeneity in these regions' capacity to generate financial assets from real investments. In extensions of the basic model, they also generate exchange rate and FDI excess returns that are broadly consistent with the recent trends in these variables. More generally, the framework is flexible enough to shed light on a range of scenarios in a global equilibrium environment.

Heathcote and Perri show that a simple extension of one-good models can help to reconcile theory and data. In particular, they analytically solve for the equilibrium country portfolios in a two-country, two-good model with non-diversifiable labor income and investment. In this set-up, consistent with the data, country portfolios contain a relatively small, but positive, share of foreign assets. International diversification is low because terms-of-trade movements provide considerable insurance against country-specific shocks and labor income risk (Cole and Obstfeld, 1991; Acemoglu and Ventura, 2002; Pavlova and Rigobon, 2003). International diversification is positive because foreign assets are crucial in sharing the financing of investment across countries. Finally, in the model a country's portfolio share of foreign assets should depend on its trade/GDP ratio and on its capital income/GDP ratio. The authors show how this relation is qualitatively and quantitatively consistent with country portfolios in the cross section of OECD countries in the 1990s.

Chetty and Szeidl characterize risk preferences in an expected utility model with commitments. They show that commitments affect risk preferences in two ways: 1) they amplify risk aversion with respect to moderate-stake shocks; and 2) they create a motive to take large-payoff gambles. The model thus helps to resolve two basic puzzles in expected utility theory: the discrepancy between moderate-stake and large-stake risk aversion and lottery playing by insurance buyers. The authors discuss applications of the model, such as the optimal design of social insurance and tax policies, added worker effects in labor supply, and portfolio choice. Using event studies of unemployment shocks, they document evidence consistent with the consumption adjustment patterns implied by the model.

Uncertainty appears to vary strongly over time, temporarily rising by up to 200 percent around major shocks like the Cuban Missile Crisis, the assassination of JFK, and 9/11. Bloom offers the first structural framework for analyzing uncertainty shocks. He builds a model with a time-varying second moment, which he numerically solves and estimates using firm-level data. The parameterized model is then used to simulate a macro uncertainty shock, which produces a rapid drop and rebound in employment, investment, and productivity growth, and a moderate loss in GDP. This temporary impact of a second-moment shock is different from the typically persistent impact of a first-moment shock, highlighting the importance for policymakers of identifying their relative magnitudes in major shocks. The simulation of an uncertainty shock is then compared to actual 9/11 data, displaying a surprisingly good match.

Guvenen and Kuruscu present a tractable general equilibrium overlapping-generations model of human capital accumulation that is consistent with several features of the evolution of the U.S. wage distribution from 1970 to 2000. The key feature of the model, and the only source of heterogeneity, is that individuals differ in their ability to accumulate human capital. To highlight the workings of the model, the authors abstract from all kinds of idiosyncratic uncertainty, and thus wage inequality results only from differences in human capital accumulation. They examine the response of this model to skill-biased technical change (SBTC) both theoretically and quantitatively. First, they show theoretically that in response to SBTC, the model generates behavior consistent with the U.S. data, including: a rise in total wage inequality; an initial fall in the education (skill) premium followed by a strong recovery, leading to a higher premium in the long run; the fact that most of this fall and rise takes place among younger workers; a rise in within-group inequality; an increase in educational attainment; stagnation in median wage growth (and a slowdown in aggregate labor productivity); and a rise in consumption inequality that is much smaller than the rise in wage inequality. They then calibrate the model to the U.S. data before 1970 and find that the evolution of these variables is quantitatively consistent with their empirical counterparts during SBTC (from 1970 on). These results suggest that heterogeneity in the ability to accumulate human capital is an important feature for understanding the effects of SBTC and for interpreting the transformation that the U.S. economy has gone through since the 1970s.

Buera and Kaboski present four facts and a model explaining the rise of the service economy. First, the rising share of services in output is a recent phenomenon, starting around the mid-twentieth century. Second, it reflects increases in both the relative price and relative quantity of services to commodities. Third, this rising share is entirely explained by the surge of skill-intensive services, and is contemporaneous with the increases in the relative quantity of skilled labor and the skill premium. Finally, individual services follow a distinct product cycle as an economy grows. They start being provided as market services, but are later produced at home with the purchase of manufactured intermediate inputs and durable goods. In this model, agents make decisions between the market and home provision over a continuum of wants that are satiated sequentially. The disutility of public consumption and economies of scale (in the use of specialized capital and skills) are the key elements explaining the rich dynamics of the service economy. If skilled labor has a comparative advantage in the production of newer services, the theory explains the late rise in the service economy characterized by rising relative prices and quantities of services, and growth in the relative quantity of skilled labor and the skill premium.


Japan Project Meets

The NBER, together with the Center on the Japanese Economy and Business, the Center for Advanced Research in Finance, the European Institute of Japanese Studies, and the Australia-Japan Research Centre, held a project meeting on the Japanese economy in Tokyo on September 15-16. The co-chairs of the meeting were: Magnus Blomstrom, NBER and Stockholm School of Economics; Jennifer Corbett, Australia-Japan Research Centre; Fumio Hayashi, NBER and the University of Tokyo; Charles Horioka, NBER and Osaka University; Anil K Kashyap, NBER and the Graduate School of Business, University of Chicago; and David Weinstein, NBER and Columbia University. The following papers were discussed:

Mitsuru Iwamura, Waseda University; Shigenori Shiratsuka, Bank of Japan; and Tsutomu Watanabe, Hitotsubashi University, "Massive Money Injection in an Economy with Broad Liquidity Services: The Japanese Experience 2001-6"
Discussant: John B. Taylor, Stanford University and NBER

Zekeriya Eser and Joe Peek, University of Kentucky, "Reciprocity and Network Coordination: Evidence from Japanese Banks"
Discussant: Timo Henckel, Australian National University

Shigeo Hirano, Columbia University, "Do Individual Representatives Influence Government Transfers? Evidence from Japan"
Discussant: Henry S. Farber, Princeton University and NBER

Koji Sakai, Hitotsubashi University; Iichiro Uesugi, RIETI; and Guy Yamashiro, California State University, "Effectiveness of Credit Guarantees in the Japanese Loan Market"
Discussant: Douglas W. Diamond, University of Chicago and NBER

Hiroshi Fujiki, Bank of Japan, and Etsuro Shioji, Hitotsubashi University, "Bank Health Concerns, Low Interest Rates, and Money Demand: Evidence from the Public Opinion Survey on Household Financial Assets and Liabilities"
Discussant: Kazuo Ogawa, Osaka University

Gauti Eggertsson, Federal Reserve Bank of New York, "A Tale of Two Countries: Fiscal Multipliers and Policy Coordination"
Discussant: Kenneth D. West, University of Wisconsin and NBER

Arata Ito and Tsutomu Watanabe, Hitotsubashi University, and Tomoyoshi Yabu, Bank of Japan, "Fiscal Policy Switching: Evidence from Japan, US, and UK"
Discussant: Matthew D. Shapiro, University of Michigan and NBER

Keiichiro Kobayashi, RIETI, and Masaru Inaba, University of Tokyo, "Business Cycle Accounting for the Japanese Economy"
Discussant: Julen Esteban-Pretel, University of Tokyo

Iwamura, Shiratsuka, and Watanabe present a model with broad liquidity services to discuss the consequences of massive money injection in an economy with a zero interest rate bound. They incorporate Goodfriend's (2000) idea of broad liquidity services into the model by allowing the amounts of bonds with various maturities held by a household to enter its utility function. They show that the satiation of money (or the zero marginal utility of money) is not a necessary condition for the one-period interest rate to reach the zero lower bound; instead, they present a weaker necessary condition: that the marginal liquidity service provided by money coincides with that provided by one-period bonds, neither of which is necessarily equal to zero. This implies that massive money injection would have some influence on an equilibrium of the economy even if it does not alter the private sector's expectations about future monetary policy. The results indicate that forward interest rates began to decline relative to the corresponding futures rates just after March 2001, when the Bank of Japan started its quantitative monetary easing policy, and that the forward-futures spread never closed until the policy ended in March 2006. The authors argue that these findings are not easy to explain in a model without broad liquidity services.

Eser and Peek provide the first detailed empirical evidence on the cooperative behavior of individual members of a functioning, real-world network. In contrast to experimental evidence from limited settings, this study uses detailed annual data, spanning nearly twenty years, on the volume of loans made to individual firms by each individual bank that lends to them. Using these data, the authors are able to exploit substantial cross-sectional variation in the degree of reliance of the banks on the network as a whole and on other individual banks within the network. In addition, they are able to investigate the impact of economic stress on the cooperative behavior of individual network members by comparing the 1980s with the more turbulent 1990s. They find strong evidence that the strength of system-wide reliance on, and thus commitment to, the network, as well as pairwise reliance on other network members, plays an important role in explaining the observed cooperative behavior by Japanese banks.

Although the conventional wisdom is that representatives to the Japanese Diet are "pipelines" between the national treasury and local constituents, with great influence over the distribution of central government transfers to and within their districts, the systematic empirical evidence that this influence exists is relatively weak. Hirano uses two identification strategies to estimate how much individual Lower House Liberal Democratic Party (LDP) incumbents influence the distribution of government transfers during the period 1977 to 1992: the exogenous change in representation following the mid-term deaths of Japanese representatives; and the discontinuity surrounding elections where LDP candidates win or lose by very narrow margins. Overall, the influence of politicians on central-to-locality transfers is relatively small. However, the presence of a marginal LDP incumbent leads to about a 10 percent to 30 percent increase in per capita central government transfers to the municipalities where the incumbent has substantial electoral support.

From 1998 to 2001, the Japanese government implemented a massive credit guarantee program that was unprecedented in both scale and scope. Using a new panel data set of Japanese firms, Sakai, Uesugi, and Yamashiro empirically test whether government credit programs do more to stimulate small business investment or to worsen adverse selection problems in credit markets. They find evidence consistent with the former. Specifically, program participants: 1) significantly increase their leverage, particularly their use of long-term loans; and 2) with the exception of high-risk firms, become more efficient. Overall, these findings suggest that government interventions in credit markets can be beneficial.

Fujiki and Shioji use household survey data covering the period from 2001 through 2003 to study the cash and deposit demand of households. These data enable them to obtain empirical findings that could not previously be derived from analyses using conventional macroeconomic time-series data. First, for asset demand, they find that fluctuations in the extensive margin (the decision whether or not to hold a financial product) are sometimes more important than fluctuations in the intensive margin (the decision on the amount of the financial product held). Second, they conduct detailed analyses of the causes of fluctuations in the cash demand of individual households. Third, thanks to qualitative questions in the dataset, they manage to distinguish between fluctuations in asset demand attributable to low interest rates and those in response to various measures aimed at enhancing the safety of household savings. Fourth, they quantify the economic effects of personal financial education.

Eggertsson offers an explanation of why recovery measures — such as fiscal spending, exchange interventions, and large increases in the money supply — had a smaller effect on nominal demand in Japan in the Great Recession (1992-2005) than in the United States during the Great Depression (1930s). In both episodes the short-term nominal interest rate was close to zero. He studies these episodes in a dynamic general equilibrium model with rational expectations and suggests that the difference is attributable to the Bank of Japan's independence. In the United States, by contrast, the Federal Reserve's independence was eliminated in 1933, and monetary and fiscal policy were coordinated in conjunction with the recovery measures. This paper makes some preliminary suggestions for an institutional mechanism that takes advantage of policy coordination in the face of deflationary pressures, while preserving the well known advantages of central bank independence under normal circumstances.

Ito, Watanabe, and Yabu estimate fiscal policy feedback rules in Japan, the United States, and the United Kingdom, allowing for stochastic regime changes. Using Markov-switching regression methods, they find that the Japanese data clearly reject the view that the fiscal policy regime is fixed; that is, the view that the Japanese government has adopted either Ricardian or non-Ricardian policy at all times. Instead, these results indicate that fiscal policy regimes evolve over time in a stochastic manner. This is in sharp contrast with the U.S. and U.K. results, in which the government's fiscal behavior is consistently characterized by Ricardian policy.

Kobayashi and Inaba conduct business cycle accounting (BCA) using the method developed by Chari, Kehoe, and McGrattan (2002a) on data from the 1980s-1990s in Japan and from the interwar period in Japan and the United States. They find that labor wedges may have been a major contributor to the decade-long recession in the 1990s in Japan. Assuming exogenous variations in the share of labor, they find that the deterioration in the labor wedge started around 1990, which coincides with the onset of the recession. Then they perform an alternative BCA exercise using the capital wedge instead of the investment wedge to check for the robustness of BCA implications for financial frictions. The accounting results with the capital wedge imply that financial frictions may have had a large depressive effect during the 1930s in the United States. This implication is the opposite of that of the original BCA findings.


Fall Research Meeting on Economic Fluctuations and Growth

The NBER's Program on Economic Fluctuations and Growth met at the Federal Reserve Bank of New York on September 29. NBER Research Associates Thomas J. Sargent of New York University and Christopher A. Sims of Princeton University organized the meeting. The following papers were discussed:

Robert Kollmann, University of Paris XII, "International Portfolio Equilibrium and the Current Account"
Discussant: Fabrizio Perri, Federal Reserve Bank of Minneapolis and NBER

John H. Cochrane, University of Chicago and NBER, "Identification and Price Determination with Taylor Rules: A Critical Review"
Discussant: Eric M. Leeper, Indiana University and NBER

Stephanie Schmitt-Grohe and Martin Uribe, Duke University and NBER, "Optimal Inflation Stabilization in a Medium-Scale Macroeconomic Model" (NBER Working Paper No. 11854)
Discussant: Christopher Erceg, Federal Reserve Board of Governors

Benjamin Eden, Vanderbilt University, "International Seigniorage Payments"
Discussant: Chris Edmond, New York University

Florin O. Bilbiie, University of Oxford; Fabio Ghironi, Boston College and NBER; and Marc J. Melitz, Harvard University and NBER, "Endogenous Entry, Product Variety, and Business Cycles"
Discussant: Michael Woodford, Columbia University and NBER

A. Craig Burnside, Duke University and NBER; Martin S. Eichenbaum and Sergio Rebelo, Northwestern University and NBER; and Isaac Kleshchelski, Northwestern University, "The Returns to Currency Speculation" (NBER Working Paper No. 12489)
Discussant: Pierpaolo Benigno, New York University and NBER

Kollmann analyzes the determinants of international asset portfolios, using a neoclassical dynamic general equilibrium model with home bias in consumption. For plausible parameter values, his model explains the fact that typical investors hold most of their wealth in domestic assets (portfolio home bias). In the model, the current account balance (change in net foreign assets) is driven mainly by fluctuations in equity prices. The model predicts that the current account will be highly volatile and exhibit low serial correlation and that changes in a country's foreign equity assets and liabilities will be highly positively correlated. Kollmann then constructs current account series that include external capital gains and losses for 17 OECD economies. The behavior of the empirical series confirms his theoretical predictions.

Cochrane notes that the parameters of the Taylor rule relating interest rates to inflation and other variables are not identified in new-Keynesian models. Thus, Taylor rule regressions cannot be used to argue that the Fed conquered inflation by moving from a "passive" to an "active" policy in the early 1980s. Since there is nothing in economics to rule out explosive hyperinflations, price level determinacy requires ingredients beyond the Taylor principle, such as a non-Ricardian fiscal regime.

Schmitt-Grohe and Uribe characterize Ramsey-optimal monetary policy in a medium-scale macroeconomic model estimated to fit postwar U.S. business cycles well. The authors find that mild deflation is Ramsey-optimal in the long run. However, the optimal inflation rate appears to be highly sensitive to the assumed degree of price stickiness. Within the window of available estimates of price stickiness (between 2 and 5 quarters), the optimal rate of inflation ranges from -4.2 percent per year (close to the Friedman rule) to -0.4 percent per year (close to price stability). This sensitivity disappears when one assumes that lump-sum taxes are unavailable and that fiscal instruments take the form of distortionary income taxes. In that case, mild deflation emerges as a robust Ramsey prediction. Given the finding that the Ramsey-optimal inflation rate is negative, it is puzzling that most inflation-targeting countries pursue positive inflation goals. The authors show that the zero bound on the nominal interest rate, which is often cited as a rationale for setting positive inflation targets, is of no quantitative relevance in the present model. Finally, they characterize operational interest-rate feedback rules that best implement Ramsey-optimal stabilization policy. They find that the optimal interest-rate rule is active in price and wage inflation, mute in output growth, and moderately inertial. This rule achieves virtually the same level of welfare as the Ramsey-optimal policy.

What "liquidity services" do "over-priced" assets provide? What determines the choice of the international currency? How do international seigniorage payments affect the choice of monetary policies? What are the optimal inflation rates in the global economy? And, does a country gain when others use its currency? Eden analyzes these questions in a model in which demand uncertainty (taste shocks) and sequential trade are key. He applies the analysis to the recent policy discussion concerning the accumulation of foreign debt by the United States. He argues that the recent experience of stable demand in the United States may explain why: 1) there are sizeable excess returns of gross U.S. assets over gross U.S. liabilities; 2) the United States is "cheap" relative to the prediction of income-price regressions; 3) most U.S. liabilities are in dollar terms; and 4) a common currency increases trade. In the steady state, the stable-demand country (the United States) receives seigniorage payments from foreigners with less stable demand. But this does not mean that the United States gains from having an international currency.

Bilbiie, Ghironi, and Melitz build a framework for analyzing macroeconomic business cycles that incorporates endogenous determination of the number of producers over the business cycle. Economic expansions induce higher entry rates by prospective entrants who are subject to irreversible investment costs. The sluggish response of the number of producers (because of the sunk entry costs) generates a new and potentially important endogenous propagation mechanism for real business cycle models (which typically rely on the accumulation of physical capital by a fixed number of producers). The model performs at least as well as the traditional setup with respect to the implied second-moment properties of key macroeconomic aggregates. In addition, consistent with the data, this framework predicts a procyclical number of producers, and procyclical profits, even for preference specifications that imply countercyclical markups.

Currencies that are at a forward premium tend to depreciate. This "forward-premium puzzle" represents an egregious deviation from uncovered interest parity. Burnside, Eichenbaum, Kleshchelski, and Rebelo document the properties of returns to currency speculation strategies that exploit this anomaly. The first strategy, known as the carry trade, is widely used by practitioners. This strategy involves selling currencies forward that are at a forward premium and buying currencies forward that are at a forward discount. The second strategy relies on a particular regression to forecast the payoff to selling currencies forward. The authors show that these strategies yield high Sharpe ratios that are not a compensation for risk. However, these Sharpe ratios do not represent unexploited profit opportunities. In the presence of microstructure frictions, spot and forward exchange rates move against traders as they increase their positions. The resulting "price pressure" drives a wedge between average and marginal Sharpe ratios. The authors argue that marginal Sharpe ratios are zero even though average Sharpe ratios are positive.
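The mechanics of the carry-trade rule described above can be sketched in a few lines of Python. This is an illustrative sketch only: the sign rule, the payoff normalization, and the spot/forward series below are simplifying assumptions for exposition, not the authors' exact construction or data.

```python
import statistics

def carry_trade_payoffs(spot, forward):
    """Payoff per period from a simple carry-trade rule: sell a currency
    forward when it trades at a forward premium (forward >= spot), buy it
    forward when at a discount. Rates are quoted in dollars per unit of
    foreign currency; each forward contract settles against the next
    period's spot rate. Illustrative normalization (per dollar of spot)."""
    payoffs = []
    for t in range(len(spot) - 1):
        # +1 means sell the foreign currency forward, -1 means buy it forward
        position = 1.0 if forward[t] >= spot[t] else -1.0
        payoffs.append(position * (forward[t] - spot[t + 1]) / spot[t])
    return payoffs

def sharpe_ratio(payoffs):
    """Mean payoff per unit of payoff volatility (unannualized)."""
    return statistics.mean(payoffs) / statistics.stdev(payoffs)

# Hypothetical spot and forward series (dollars per foreign unit)
spot = [1.00, 0.99, 1.01, 0.98, 1.00, 0.97]
forward = [1.02, 1.00, 1.03, 0.99, 1.02, 0.98]
payoffs = carry_trade_payoffs(spot, forward)
print(sharpe_ratio(payoffs))
```

The point of the paper's argument survives even in this toy form: a strategy can show a positive average Sharpe ratio in historical data while the marginal Sharpe ratio, once price pressure from scaling up positions is accounted for, is zero.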


Political Economy

The NBER's Working Group on Political Economy, directed by Alberto Alesina of NBER and Harvard University, met in Cambridge on October 6. These papers were discussed:

Torsten Persson, Stockholm University and NBER, and Guido Tabellini, Bocconi University, "Democratic Capital: The Nexus of Political and Economic Change" (NBER Working Paper No. 12175)
Discussant: Rafael Di Tella, Harvard University

Filipe R. Campante, Harvard University, "Redistribution in a Model of Voting and Campaign Contributions"
Discussant: Alessandro S. Lizzeri, New York University

Ernesto Dal Bó, University of California, Berkeley; Pedro Dal Bó, Brown University; and Jason Snyder, Northwestern University, "Political Dynasties"
Discussant: Benjamin Olken, NBER

Erik Snowberg and Eric Zitzewitz, Stanford University, and Justin Wolfers, University of Pennsylvania and NBER, "Partisan Impacts on the Economy: Evidence from Prediction Markets and Close Elections" (NBER Working Paper No. 12073)
Discussant: Roberto Perotti, Bocconi University and NBER

Francesco Caselli, London School of Economics and NBER, and Nicola Gennaioli, Stockholm University, "Economics and Politics of Alternative Institutional Reform"
Discussant: Nicola Persico, University of Pennsylvania

Christina M. Fong, Carnegie Mellon University, and Erzo F. P. Luttmer, Harvard University and NBER, "Race and Giving to Hurricane Katrina Victims: Experimental Evidence"
Discussant: Eliana La Ferrara, Bocconi University

Persson and Tabellini study the joint dynamics of economic and political change. The predictions of the simple model that they formulate are strongly supported in a panel of data on political regimes and GDP per capita for about 150 countries over a period of 150 years. Democratic capital, measured by a nation's historical experience with democracy and by the incidence of democracy in its neighborhood, reduces the exit rate from democracy and raises the exit rate from autocracy. In democracies, a higher stock of democratic capital stimulates growth in an indirect way by decreasing the probability of a successful coup. The results suggest a virtuous circle in which physical and democratic capital reinforce each other, promoting economic development jointly with the consolidation of democracy.

Campante reassesses the relationship between inequality and redistribution in the context of a model in which individual political participation is endogenous and can take two distinct forms: voting and contributing to campaigns. This model, which embeds as a special case the standard median-voter-based prediction that higher inequality leads to more redistribution, shows that the interaction between contributions and voting can explain why this prediction fails to hold: higher inequality leads to an increase in the contributions of wealthier individuals relative to those of poor individuals, and this shifts the political system in favor of the former. In equilibrium, there is a non-monotonic relationship in which redistribution is initially increasing and eventually decreasing in inequality. The model also predicts how inequality will affect political participation. Campante presents empirical evidence supporting those predictions, and hence the mechanism proposed, using data on campaign contributions and voting from U.S. presidential elections.

Dal Bó, Dal Bó, and Snyder study political dynasties in the U.S. Congress since its inception in 1789. They document patterns in the evolution and profile of political dynasties, study the self-perpetuation of political elites, and analyze the connection between political dynasties and political competition. They find that the percentage of dynastic legislators is decreasing over time and that dynastic legislators have been significantly more prevalent in the South, the Senate, and the Democratic party. While regional and party differences have largely disappeared over time, the difference across chambers has not. The authors document differences and similarities in the profile and political careers of dynastic politicians relative to other legislators. They also find that legislators who enjoy longer tenures are significantly more likely to have relatives entering Congress later. Using instrumental variables methods, they establish that this relationship is causal: a longer period in power increases the chance that a person may start (or continue) a political dynasty. Therefore, dynastic political power is self-perpetuating, in that a positive exogenous shock to a person's political power has persistent effects through subsequent dynastic attainment. Finally, they find that increases in political competition are associated with fewer dynastic legislators, suggesting that dynastic politicians may be less valued by voters.

Political economists interested in discerning the effects of election outcomes on the economy have been hampered by the problem that economic outcomes also influence elections. Snowberg, Wolfers, and Zitzewitz sidestep this problem by analyzing movements in economic indicators caused by clearly exogenous changes in expectations about the likely winner on election day. Analyzing high-frequency financial fluctuations on November 2 and 3, 2004, they find that markets anticipated higher equity prices, interest rates, and oil prices, and a stronger dollar, under a Bush presidency than under Kerry. A similar Republican-Democrat differential was also observed for the 2000 Bush-Gore contest. Prediction-market-based analyses of all Presidential elections since 1880 also reveal a similar pattern of partisan impacts, suggesting that electing a Republican President raises equity valuations by 2 to 3 percent, and that since Reagan, Republican Presidents have tended to raise bond yields.

Caselli and Gennaioli compare the economic consequences and political feasibility of reforms aimed at reducing barriers to entry (deregulation) and improving contractual enforcement (legal reform). Deregulation fosters entry, thereby increasing the number of firms (entrepreneurship) and the average quality of management (meritocracy). Legal reform also reduces financial constraints on entry, but it facilitates transfers of control of incumbent firms, from untalented to talented managers. When incumbent firms are better run, entry by new firms is less profitable, so in general equilibrium a legal reform may improve meritocracy at the expense of entrepreneurship. As a result, legal reform encounters less political opposition than deregulation, as it preserves incumbents' rents, while at the same time allowing the less efficient among them to transfer control and capture (part of) the resulting efficiency gains. Using this insight, the authors show that there may be dynamic complementarities in the reform path, whereby reformers can skillfully use legal reform in the short run to create a constituency supporting future deregulations. Generally speaking, the model here suggests that "Coasian" reforms improving the scope of private contracting are likely to mobilize greater political support because – rather than undermining the rents of incumbents – they allow for an endogenous compensation of losers. Some preliminary empirical evidence supports the view that the market for control of incumbent firms plays an important role in an industry's response to legal reform.

Fong and Luttmer investigate individual motives for giving to the needy using a large randomized experiment. In the experiment, respondents from the general population had an opportunity to give to victims of a natural disaster, namely Hurricane Katrina. Respondents first saw a short presentation about Katrina victims in a small city. By showing pictures with either predominantly black or predominantly white victims, the researchers manipulated respondents' perceptions of the race of the victims in that city. They then used accompanying audio information to manipulate perceptions of the income and worthiness of the victims. Respondents then decided how to split $100 between themselves and the Katrina victims. The income of the victims had a highly significant effect on giving; respondents gave more when they believed the victims to be poorer. Surprisingly, race had virtually no effect on giving, even though it had a highly significant effect on beliefs about the racial composition of the victims. Similarly, information about the worthiness of the victims affected beliefs but not giving.


Market Microstructure

The NBER's Working Group on Market Microstructure, directed by Research Associate Bruce Lehmann of the University of California, San Diego, met on October 6 in Cambridge. The meeting was organized by Lehmann; Duane Seppi of Carnegie Mellon University; and Avanidhar Subrahmanyam of the University of California, Los Angeles. The following papers were discussed:

Avraham Kamara, Xiaoxia Lou, and Ronnie Sadka, University of Washington, "The Polarization of Systematic Liquidity in the Cross-Section of Stocks"
Discussant: Jay Coughenour, University of Delaware

Darwin Choi and Heather Tookes, Yale University, and Mila Getmansky, University of Massachusetts, Amherst, "Convertible Bond Arbitrage, Liquidity Externalities and Stock Prices"
Discussant: Nicolas Bollen, Vanderbilt University

Ellyn Boukus, Yale University, and Joshua V. Rosenberg, Federal Reserve Bank of New York, "The Information Content of FOMC Minutes"
Discussant: Michael Fleming, Federal Reserve Bank of New York

Zhi Da, University of Notre Dame, and Pengjie Gao, Northwestern University, "Clientele Change, Liquidity Shock, and the Return on Financially Distressed Stocks"
Discussant: Gergana Jostova, George Washington University

Hendrik Bessembinder and Ivalina Kalcheva, University of Utah, "Liquidity Biases in Asset Pricing Tests"
Discussant: Gideon Saar, Cornell University

Paolo Pasquariello, University of Michigan, and Clara Vega, Federal Reserve Board of Governors, "Strategic Order Flow in the On-The-Run and Off-The-Run Bond Markets"
Discussant: Arvind Krishnamurthy, Northwestern University

Kamara, Lou, and Sadka demonstrate that the cross-sectional variation of liquidity commonality has increased over the period 1963-2005. In particular, the sensitivity of large-cap firms' liquidity to market liquidity has increased, while that of small-cap firms has declined. This increased polarization of systematic liquidity can be explained by patterns in institutional ownership over the sample period. The analysis also indicates that the ability to diversify aggregate liquidity shocks by holding large-cap stocks has declined. The evidence, therefore, suggests that the fragility of the U.S. equity market to unanticipated liquidity events has increased over the past few decades.

Choi, Tookes, and Getmansky use convertible bond issuance and equity short interest data to identify convertible bond arbitrage activity and examine its impact on stock market liquidity and prices for the period 1991 to 2005. They find considerable evidence that arbitrage-induced short selling is related to liquidity improvements in the stock. They then link total issuance and their proxy for arbitrage activity to convertible bond arbitrage hedge fund flows and returns. They find that issuance is sensitive both to the supply of capital from arbitrageurs and to their measure of convertible bond arbitrage activity. The latter finding suggests an important role for arbitrageurs' use of the funds that they raise.

Boukus and Rosenberg analyze the information content of Federal Open Market Committee minutes from 1987 to 2005. They apply an objective, statistical methodology known as Latent Semantic Analysis to decompose each minutes release into its characteristic themes. They show that these themes are correlated with current and future economic conditions. Their evidence suggests that market participants can extract a complex, multifaceted signal from the minutes. In particular, Treasury yield changes around the time of the minutes release depend on the specific themes expressed, the level of monetary policy uncertainty, and the economic outlook.

Da and Gao provide empirical evidence supporting the view that a sharp rise in a firm's default likelihood causes a change in its shareholder clientele. As institutions decrease their holdings of the firm's shares, trading volume and cost increase; the order imbalance measure indicates large selling pressure. The resulting liquidity shock leads to a further concession in the stock price, which is reversed in the subsequent month. Such price recovery explains the first-month abnormally high return earned by stocks with high default likelihood that is documented in Vassalou and Xing (2004). The abnormally high return is therefore mostly a reward for providing liquidity when it is most needed, rather than compensation for bearing systematic default risk.

Bessembinder and Kalcheva examine how microstructure biases arising from "bid-ask bounce" affect empirical asset pricing tests. They mainly focus on tests of whether liquidity is priced, but their analysis also provides new insights regarding tests of whether systematic risk is priced. They present theory and simulation-based evidence indicating that bid-ask spreads and endogenous trade or no-trade decisions lead to biases in observable risk and return measures that affect the reliability of asset pricing tests. Their most robust finding is that these frictions can lead to upward bias in estimates of the return premium for illiquidity. They exploit the fact that CRSP has reported closing quotes for Nasdaq National Market System stocks since 1983 to verify empirically that the estimated return premium related to the bid-ask spread is significantly larger when returns are computed from closing prices rather than quote midpoints. They also document that, depending on research design, microstructure considerations potentially obscure the relation between average returns and betas. They discuss possible methodological corrections for these microstructure biases, and conditions under which they may be effective.

Pasquariello and Vega study the determinants of liquidity and price differentials between on-the-run and off-the-run U.S. Treasury bond markets. To guide their analysis, they develop a parsimonious model of multi-asset speculative trading in which endowment shocks separate the on-the-run security from an otherwise identical off-the-run security. They then explore the equilibrium implications of these shocks for both off/on-the-run price and liquidity differentials in the presence of two realistic market frictions – information heterogeneity and imperfect competition among informed traders – and a public signal. They test these implications by analyzing daily differences in market liquidity and yields for on-the-run and off-the-run three-month, six-month, and one-year U.S. Treasury bills and two-year, five-year, and ten-year U.S. Treasury notes. The evidence suggests that 1) off/on-the-run bid-ask spread differentials are economically and statistically significant, even after controlling for differences in several of the bonds' intrinsic characteristics (such as duration, convexity, or repo rates); 2) the corresponding yield differentials are neither economically nor statistically significant, inconsistent with the illiquidity premium hypothesis; and 3) off/on-the-run liquidity differentials are larger for bonds of shorter maturity, immediately following bond auction dates, when the uncertainty surrounding the ensuing auction allocations is high, when the dispersion of beliefs across informed traders is high, and when macroeconomic announcements are noisy, consistent with their stylized model.


National Bureau of Economic Research
1050 Massachusetts Ave.
Cambridge, MA 02138
