Article (original scientific text)

Title

Redescending M-estimators in regression analysis, cluster analysis and image analysis

Authors 1

Affiliations

  1. Carl von Ossietzky University Oldenburg, Institute for Mathematics, Postfach 2503, D-26111 Oldenburg, Germany

Abstract

We review the properties and applications of M-estimators with a redescending score function. In regression analysis, some of these redescending M-estimators can attain the maximum breakdown point that is possible in this setup. Moreover, some of them solve the problem of maximizing the efficiency under a bounded influence function when the regression coefficients and the scale parameter are estimated simultaneously. Hence redescending M-estimators satisfy several outlier robustness properties. However, computing redescending M-estimators in regression is not straightforward: while in the location-scale case, for example, the Cauchy estimator has only one local extremum, this is not the case in regression, where the objective function has several local minima reflecting several substructures in the data. This is why redescending M-estimators can be used to detect substructures in data, i.e. they can be used in cluster analysis. If the starting point of the iteration used to compute the estimator comes from one substructure, then the closest local minimum corresponds to that substructure. This property can also be used to construct an edge- and corner-preserving smoother for noisy images, so that there are applications in image analysis as well.
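
As an illustration of the multiple-local-minima property described above, the following minimal NumPy sketch (not part of the original article) fits a simple regression with a Cauchy-type redescending score via iteratively reweighted least squares. The simulated two-line data set, the tuning scale of 0.5, and the helper names cauchy_weights and irls are illustrative assumptions, not the author's construction; the point is only that different starting values make the iteration converge to different local minima, each corresponding to one substructure, which is the mechanism exploited for cluster analysis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two linear substructures ("clusters") in simple regression data:
#   y = 1 + 2x + noise   and   y = 8 - x + noise   (illustrative choices)
x = rng.uniform(0.0, 10.0, size=200)
labels = rng.integers(0, 2, size=200)
y = np.where(labels == 0,
             1.0 + 2.0 * x,
             8.0 - 1.0 * x) + rng.normal(scale=0.3, size=200)
X = np.column_stack([np.ones_like(x), x])   # design matrix with intercept

def cauchy_weights(r, scale):
    """IRLS weights w(r) = psi(r)/r for the Cauchy-type score psi(r) = r/(1 + r^2):
    the score redescends, so large residuals receive vanishing weight."""
    u = r / scale
    return 1.0 / (1.0 + u ** 2)

def irls(beta0, n_iter=100, scale=0.5):
    """Iteratively reweighted least squares from a given starting point.
    Converges to a *local* minimum of the M-estimation objective."""
    beta = np.asarray(beta0, dtype=float)
    for _ in range(n_iter):
        r = y - X @ beta
        w = cauchy_weights(r, scale)
        WX = X * w[:, None]                      # weighted design
        beta = np.linalg.solve(X.T @ WX, WX.T @ y)  # weighted normal equations
    return beta

# Different starting values lead to different local minima,
# each fitting one substructure of the data.
print(irls([1.0, 2.0]))    # stays near (1, 2): first substructure
print(irls([8.0, -1.0]))   # stays near (8, -1): second substructure
```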

Keywords

redescending M-estimator, regression, breakdown point, optimality, cluster analysis, image analysis, kernel estimator

Bibliography

  1. D.F. Andrews, P.J. Bickel, F.R. Hampel, P.J. Huber, W.H. Rogers and J.W. Tukey, Robust Estimates of Location. Survey and Advances, Princeton University Press, Princeton 1972.
  2. O. Arslan, A simple test to identify good solutions of redescending M estimating equations for regression, Developments in Robust Statistics, Proceedings of ICORS 2001, R. Dutter, U. Gather, P.J. Rousseeuw and P. Filzmoser, (Eds.), (2003), 50-61.
  3. T. Bednarski and Ch.H. Müller, Optimal bounded influence regression and scale M-estimators, Statistics 35 (2001), 349-369.
  4. P.J. Bickel, Quelques aspects de la statistique robuste, In École d'Été de Probabilités de St. Flour. Springer Lecture Notes in Math. 876 (1981), 1-72.
  5. P.J. Bickel, Robust regression based on infinitesimal neighbourhoods, Ann. Statist. 12 (1984), 1349-1368.
  6. H. Chen and P. Meer, Robust computer vision through kernel density estimation, ECCV 2002, LNCS 2350, A. Heyden et al. (Eds.), Springer, Berlin (2002), 236-250.
  7. H. Chen, P. Meer and D.E. Tyler, Robust regression for data with multiple structures, 2001 IEEE Conference on Computer Vision and Pattern Recognition, vol. I, Kauai, HI, (2001), 1069-1075.
  8. C.K. Chu, I.K. Glad, F. Godtliebsen and J.S. Marron, Edge-preserving smoothers for image processing, J. Amer. Statist. Assoc. 93, (1998), 526-541.
  9. B.R. Clarke, Uniqueness and Fréchet differentiability of functional solutions to maximum likelihood type equations, Ann. Statist. 11 (1983), 1196-1205.
  10. B.R. Clarke, Asymptotic theory for description of regions in which Newton-Raphson iterations converge to location M-estimators, J. Statist. Plann. Inference 15 (1986), 71-85.
  11. J.R. Collins, Robust estimation of a location parameter in the presence of asymmetry, Ann. Statist. 4 (1976), 68-85.
  12. J.B. Copas, On the unimodality of the likelihood for the Cauchy distribution, Biometrika 62 (1975), 701-704.
  13. D.L. Donoho and P.J. Huber, The notion of breakdown point, P.J. Bickel, K.A. Doksum and J.L. Hodges, Jr., Eds., A Festschrift for Erich L. Lehmann, Wadsworth, Belmont, CA, (1983), 157-184.
  14. S.P. Ellis and S. Morgenthaler, Leverage and breakdown in L1 regression, J. Amer. Statist. Assoc. 87 (1992), 143-148.
  15. D.A. Freedman and P. Diaconis, On inconsistent M-estimators, Ann. Statist. 10 (1982), 454-461.
  16. G. Gabrielsen, On the unimodality of the likelihood for the Cauchy distribution: Some comments, Biometrika 69 (1982), 677-678.
  17. F.R. Hampel, Optimally bounding the gross-error-sensitivity and the influence of position in factor space, Proceedings of the ASA Statistical Computing Section, ASA, Washington, D.C., (1978), 59-64.
  18. F.R. Hampel, E.M. Ronchetti, P.J. Rousseeuw and W.A. Stahel, Robust Statistics - The Approach Based on Influence Functions, John Wiley, New York 1986.
  19. W. Härdle and T. Gasser, Robust nonparametric function fitting, J. R. Statist. Soc. B 46 (1984), 42-51.
  20. X. He, J. Jurecková, R. Koenker and S. Portnoy, Tail behavior of regression estimators and their breakdown points, Econometrica 58 (1990), 1195-1214.
  21. X. He, D.G. Simpson and G. Wang, Breakdown points of t-type regression estimators, Biometrika 87 (2000), 675-687.
  22. C. Hennig, Regression fixed point clusters: motivation, consistency and simulations, Preprint 2000-02, Fachbereich Mathematik, Universität Hamburg 2000.
  23. C. Hennig, Clusters, outliers, and regression: Fixed point clusters, Journal of Multivariate Analysis. 86/1 (2003), 183-212.
  24. M. Hillebrand, On robust corner-preserving smoothing in image processing, Ph.D. thesis at the Carl von Ossietzky University Oldenburg, Germany 2002.
  25. M. Hillebrand and Ch.H. Müller, On consistency of redescending M-kernel smoothers, Submitted 2002.
  26. P.J. Huber, Minimax aspects of bounded-influence regression (with discussion), J. Amer. Statist. Assoc. 78 (1983), 66-80.
  27. J. Jurecková and P.K. Sen, Robust Statistical Procedures. Asymptotics and Interrelations, Wiley, New York 1996.
  28. W.S. Krasker, Estimation in linear regression models with disparate data points, Econometrica 48 (1980), 1333-1346.
  29. V. Kurotschka and Ch.H. Müller, Optimum robust estimation of linear aspects in conditionally contaminated linear models, Ann. Statist. 20 (1992), 331-350.
  30. K.L. Lange, R.J.A. Little and J.M.G. Taylor, Robust statistical modeling using the t distribution, J. Amer. Statist. Assoc. 84 (1989), 881-896.
  31. R.A. Maronna, O.H. Bustos and V.J. Yohai, Bias- and efficiency-robustness of general M-estimators for regression with random carriers, Smoothing Techniques for Curve Estimation (T. Gasser and M. Rosenblatt, eds.) Springer, Berlin, Lecture Notes in Mathematics 757 (1979), 91-116.
  32. I. Mizera, On consistent M-estimators: tuning constants, unimodality and breakdown, Kybernetika 30 (1994), 289-300.
  33. I. Mizera, Weak continuity of redescending M-estimators of location with an unbounded objective function, Tatra Mountains Math. Publ. 7 (1996), 343-347.
  34. I. Mizera and Ch.H. Müller, Breakdown points and variation exponents of robust M-estimators in linear models, Ann. Statist. 27 (1999), 1164-1177.
  35. I. Mizera and Ch.H. Müller, Breakdown points of Cauchy regression-scale estimators, Stat. & Prob. Letters 57 (2002), 79-89.
  36. S. Morgenthaler, Fitting redescending M-estimators in regression, Robust Regression, H.D. Lawrence and S. Arthur, (Eds.), Dekker, New York (1990), 105-128.
  37. Ch.H. Müller, Optimal designs for robust estimation in conditionally contaminated linear models, J. Statist. Plann. Inference 38 (1994), 125-140.
  38. Ch.H. Müller, Breakdown points for designed experiments, J. Statist. Plann. Inference, 45 (1995), 413-427.
  39. Ch.H. Müller, Optimal breakdown point maximizing designs, Tatra Mountains Math. Publ. 7, (1996), 79-85.
  40. Ch.H. Müller, Robust Planning and Analysis of Experiments, Springer, New York, Lecture Notes in Statistics 124 (1997).
  41. Ch.H. Müller, On the use of high breakdown point estimators in the image analysis, Tatra Mountains Math. Publ. 17 (1999), 283-293.
  42. Ch.H. Müller, Robust estimators for estimating discontinuous functions, Metrika 55 (2002a), 99-109.
  43. Ch.H. Müller, Comparison of high-breakdown-point estimators for image denoising, Allg. Stat. Archiv 86 (2002b), 307-321.
  44. Ch.H. Müller and T. Garlipp, Simple consistent cluster methods based on redescending M-estimators with an application to edge identification in images, To appear in Journal of Multivariate Analysis, (2002).
  45. P. Qiu, Nonparametric estimation of jump surface, Sankhyā: The Indian Journal of Statistics, Series A, 59 (1997), 268-294.
  46. H. Rieder, Robust regression estimators and their least favorable contamination curves, Stat. Decis. 5 (1987), 307-336.
  47. H. Rieder, Robust Asymptotic Statistics, Springer, New York 1994.
  48. B.W. Silverman, Density Estimation for Statistics and Data Analysis, Chapman and Hall, London 1986.
  49. S. Smith and J. Brady, SUSAN - a new approach to low level image processing, International Journal of Computer Vision 23 (1997), 45-78.
  50. R.H. Zamar, Robust estimation in the errors-in-variables model, Biometrika 76 (1989), 149-160.
Pages
59-75
Main language of publication
English
Received
2003-10-04
Published
2004