M-estimators in regression models. (English) Zbl 1205.62091

Summary: Regression analysis plays a vital role in many areas of science. Almost all regression analyses rely on the method of least squares to estimate the parameters of the model. However, this method is derived under specific assumptions, such as normality of the error distribution. When outliers are present in the data, least squares estimation yields parameter estimates that do not provide useful information about the majority of the data. Robust regression analysis has been developed as an improvement over least squares estimation in the presence of outliers; its main purpose is to fit a model that represents the information in the majority of the data. Many researchers have worked in this field and developed methods for these problems. The most commonly used robust estimators include P. J. Huber’s M-estimator [Ann. Math. Stat. 35, 73–101 (1964; Zbl 0136.39805)], Hampel’s estimator, and Tukey’s bisquare estimator. In this paper, an attempt is made to review such estimators and to carry out a simulation study of them in regression models. R code has been written for this purpose and illustrations are provided.
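The estimators named above are available in standard R software. The following is a minimal sketch (assuming the MASS package; it is not the authors' own code) of how such estimators might be fitted to data contaminated with outliers and compared with least squares:

# Minimal sketch, assuming the MASS package (not the authors' code):
# fit least squares and three common M-estimators to contaminated data.
library(MASS)

set.seed(1)
x <- 1:20
y <- 2 + 0.5 * x + rnorm(20, sd = 0.5)
y[c(3, 17)] <- y[c(3, 17)] + 10                 # contaminate two observations

fit_ls       <- lm(y ~ x)                       # ordinary least squares
fit_huber    <- rlm(y ~ x, psi = psi.huber)     # Huber's M-estimator
fit_hampel   <- rlm(y ~ x, psi = psi.hampel)    # Hampel's estimator
fit_bisquare <- rlm(y ~ x, psi = psi.bisquare)  # Tukey's bisquare estimator

# Compare fitted coefficients: the robust fits stay close to the
# uncontaminated slope and intercept, while least squares is pulled
# towards the two outlying observations.
coef(fit_ls); coef(fit_huber); coef(fit_hampel); coef(fit_bisquare)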

MSC:

62J05 Linear regression; mixed models
62F35 Robustness and adaptive procedures (parametric inference)
65C60 Computational problems in statistics (MSC2010)
62J02 General nonlinear regression
62-04 Software, source code, etc. for problems pertaining to statistics

Citations:

Zbl 0136.39805

Software:

R