Robust Estimation with Exponentially Tilted Hellinger Distance
This paper is concerned with estimation of parameters defined by moment equalities. In this context, Kitamura, Otsu and Evdokimov (2013a) have introduced the minimum Hellinger distance (HD) estimator, which is asymptotically semiparametrically efficient when the model is correctly specified and achieves optimal minimax robustness under small deviations from the model (local misspecification). This paper evaluates the performance of inference procedures under two complementary types of misspecification, local and global. After showing that HD is not robust to global misspecification, we introduce, in the spirit of Schennach (2007), the exponentially tilted Hellinger distance (ETHD) estimator, which combines the Hellinger distance with the Kullback-Leibler information criterion. Our estimator shares the same desirable asymptotic properties as HD under correct specification and local misspecification, and remains well behaved under global misspecification. ETHD therefore appears to be the first estimator that is efficient under correct specification and robust to both local and global misspecification.
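The two discrepancy measures that the abstract refers to can be made concrete. The sketch below, which is only an illustration and not the paper's estimator, computes the squared Hellinger distance and the Kullback-Leibler divergence between a hypothetical vector of tilted probability weights and the uniform empirical weights; the example weights are made up for demonstration.

```python
import numpy as np

def squared_hellinger(p, q):
    # Squared Hellinger distance between discrete distributions:
    # H^2(p, q) = 1 - sum_i sqrt(p_i * q_i)
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return 1.0 - np.sum(np.sqrt(p * q))

def kl_divergence(p, q):
    # Kullback-Leibler information criterion:
    # KL(p || q) = sum_i p_i * log(p_i / q_i), with 0 * log 0 = 0
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

# Toy example: a reweighted empirical distribution versus uniform weights.
n = 5
uniform = np.full(n, 1.0 / n)
tilted = np.array([0.10, 0.15, 0.20, 0.25, 0.30])  # hypothetical weights

print(squared_hellinger(tilted, uniform))
print(kl_divergence(tilted, uniform))
```

Both quantities are zero when the two distributions coincide and strictly positive otherwise, which is what makes either one usable as a criterion to minimize when choosing implied probabilities subject to the moment equalities.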