WWW: https://economics.mit.edu/faculty/wnewey
NBER Program Affiliations: LS
NBER Affiliation: Research Associate
Institutional Affiliation: Massachusetts Institute of Technology
Demand Analysis with Many Prices with Victor Chernozhukov, Jerry A. Hausman: w26424 From its inception, demand estimation has faced the problem of "many prices." This paper provides estimators of average demand and associated bounds on exact consumer surplus when there are many prices in cross-section or panel data. For cross-section data we provide a debiased machine learner of consumer surplus bounds that allows for general heterogeneity and solves the "zeros problem" of demand. For panel data we provide bias-corrected, ridge-regularized estimators of average coefficients and consumer surplus bounds. In scanner data we find smaller panel elasticities than cross-section and that soda price increases are regressive.
On Bunching and Identification of the Taxable Income Elasticity with Sören Blomquist, Anil Kumar, Che-Yuan Liang: w24136 The taxable income elasticity is a key parameter for predicting the effect of tax reform or designing an income tax. Bunching at kinks and notches in a single budget set has been used to estimate the taxable income elasticity. We show that when the heterogeneity distribution is unrestricted, the amount of bunching at a kink or a notch is not informative about the size of the taxable income elasticity, and neither is the entire distribution of taxable income for a convex budget set. Kinks do provide information about the size of the elasticity when a priori restrictions are placed on the heterogeneity distribution. They can identify the elasticity when the heterogeneity distribution is specified across the kink and provide bounds under restrictions on the heterogeneity distribution. We also...
Double/Debiased Machine Learning for Treatment and Structural Parameters with Victor Chernozhukov, Denis Chetverikov, Mert Demirer, Esther Duflo, Christian Hansen, James Robins: w23564 We revisit the classic semiparametric problem of inference on a low-dimensional parameter θ_0 in the presence of high-dimensional nuisance parameters η_0. We depart from the classical setting by allowing for η_0 to be so high-dimensional that the traditional assumptions, such as Donsker properties, that limit complexity of the parameter space for this object break down. To estimate η_0, we consider the use of statistical or machine learning (ML) methods which are particularly well-suited to estimation in modern, very high-dimensional cases. ML methods perform well by employing regularization to reduce variance and trading off regularization bias with overfitting in practice. However, both regularization bias and overfitting in estimating η_0 cause a heavy bias in estimators of θ_0 that are... Published: Victor Chernozhukov & Denis Chetverikov & Mert Demirer & Esther Duflo & Christian Hansen & Whitney Newey & James Robins, 2018. "Double/debiased machine learning for treatment and structural parameters," The Econometrics Journal, vol 21(1), pages C1-C68.
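The cross-fitting idea behind this paper can be illustrated with a minimal sketch for the partially linear model y = θd + g(x) + ε: nuisance functions are fit on one fold and used to residualize the other fold, and θ is recovered from the residualized data. This is an illustrative toy in NumPy with a simple polynomial-ridge learner standing in for a generic ML method; the function names (`dml_plm`, `ridge_fit`, `features`) and the data-generating process are invented for the example, not from the paper.

```python
import numpy as np

def ridge_fit(X, y, lam=1e-3):
    # closed-form ridge regression: (X'X + lam*I)^{-1} X'y
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

def features(x):
    # a simple polynomial basis standing in for a flexible ML learner
    return np.column_stack([np.ones_like(x), x, x**2, x**3])

def dml_plm(y, d, x, n_folds=2, seed=0):
    """Cross-fitted estimate of theta in y = theta*d + g(x) + eps."""
    n = len(y)
    rng = np.random.default_rng(seed)
    idx = rng.permutation(n)
    folds = np.array_split(idx, n_folds)
    v = np.empty(n)  # residualized treatment: d - E[d|x]
    u = np.empty(n)  # residualized outcome:   y - E[y|x]
    for k in range(n_folds):
        test = folds[k]
        train = np.concatenate([folds[j] for j in range(n_folds) if j != k])
        F_tr, F_te = features(x[train]), features(x[test])
        # nuisances fit on the training folds, applied to the held-out fold
        v[test] = d[test] - F_te @ ridge_fit(F_tr, d[train])
        u[test] = y[test] - F_te @ ridge_fit(F_tr, y[train])
    # partialling-out (Neyman-orthogonal) estimate of theta
    return float(v @ u / (v @ v))

# toy data with known theta = 2 and nonlinear nuisance functions
rng = np.random.default_rng(1)
n = 5000
x = rng.normal(size=n)
d = np.sin(x) + rng.normal(scale=0.5, size=n)
y = 2.0 * d + x**2 + rng.normal(size=n)
theta_hat = dml_plm(y, d, x)
```

Because each observation's residuals come from nuisance fits that never saw that observation, overfitting in the nuisance step does not contaminate the score for θ, which is the mechanism the abstract's "heavy bias" warning motivates.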
Automatic Lag Selection in Covariance Matrix Estimation with Kenneth D. West: t0144 We propose a nonparametric method for automatically selecting the number of autocovariances to use in computing a heteroskedasticity and autocorrelation consistent covariance matrix. For a given kernel for weighting the autocovariances, we prove that our procedure is asymptotically equivalent to one that is optimal under a mean squared error loss function. Monte Carlo simulations suggest that our procedure performs tolerably well, although it does result in size distortions.
