When using inverse regression methods in dimension reduction models, the popular linearity condition has a paradoxical effect: ignoring the linearity condition yields a more efficient estimator than making use of it. By considering classes of parametric models that include the linearity condition as a special case, we examine this phenomenon using a geometric approach and provide an intuitive, extended explanation. Our findings identify the real cause of the paradox, indicate how to handle the linearity condition properly, and reveal its true role. Our analysis directly leads to new estimators that further improve on the existing efficient estimator, which did not specifically account for the linearity condition or the possible constant variance condition.
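The paper itself gives no code, but the inverse regression methods it discusses can be illustrated with sliced inverse regression (SIR), the canonical estimator whose consistency rests on the linearity condition. The sketch below is an illustrative minimal implementation, not the authors' proposed estimator; the function name and parameters are assumptions for demonstration.

```python
import numpy as np

def sir(X, y, n_slices=5, d=1):
    """Minimal sliced inverse regression sketch (not the paper's estimator).

    Estimates d dimension-reduction directions from the inverse
    regression curve E[X | Y], assuming the linearity condition holds
    (e.g., when X is elliptically distributed).
    """
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    Sigma = Xc.T @ Xc / n
    # Whitening matrix Sigma^{-1/2} via eigendecomposition
    evals, evecs = np.linalg.eigh(Sigma)
    Sigma_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = Xc @ Sigma_inv_sqrt
    # Slice observations by the order of y and average Z within each slice
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)
    M = np.zeros((p, p))
    for idx in slices:
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # Leading eigenvectors of the slice-mean covariance, mapped back to X scale
    _, v = np.linalg.eigh(M)
    B = Sigma_inv_sqrt @ v[:, ::-1][:, :d]
    return B / np.linalg.norm(B, axis=0)
```

Under a single-index model with elliptical predictors, the leading estimated column of `B` is proportional to the true index direction up to sign.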