Statistics and Biostatistics seminar series
Robert Bassett
Room: M3 3127
One-Step Estimation with Quasi-Newton and Scaled Proximal Methods
In this talk, we focus on statistical estimators computed using iterative optimization methods that are terminated before reaching an extremum. Classical results by Le Cam on maximum likelihood estimators (MLEs) assert that a one-step estimator (OSE), in which a single Newton-Raphson iteration is performed from a starting point with certain properties, is asymptotically equivalent to the MLE. We further develop these early-stopping results by deriving properties of one-step estimators defined by a single iteration of scaled proximal methods. Our main results show the asymptotic equivalence of the likelihood-based estimator and various one-step estimators defined by scaled proximal methods. By interpreting OSEs as the last of a sequence of iterates, our results provide insight on scaling numerical tolerance with sample size. Our setting contains quasi-Newton algorithms and scaled proximal gradient descent applied to certain composite models as special cases, making our results applicable to many problems of practical interest.
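For intuition, here is a minimal sketch of the classical one-step construction in a Cauchy location model, where the MLE has no closed form and the sample median supplies the required root-n-consistent starting point. The model choice and function name are illustrative only, not taken from the talk.

    import numpy as np

    def one_step_estimator(x, theta0):
        """One Newton-Raphson step on the Cauchy location log-likelihood,
        started from a root-n-consistent estimate theta0 (e.g. the median)."""
        r = x - theta0
        # Score: derivative of sum_i log f(x_i; theta) for the Cauchy density
        score = np.sum(2.0 * r / (1.0 + r**2))
        # Observed information: negative second derivative of the log-likelihood
        info = np.sum(2.0 * (1.0 - r**2) / (1.0 + r**2)**2)
        # Single Newton step; no further iterations are performed
        return theta0 + score / info

    rng = np.random.default_rng(0)
    x = rng.standard_cauchy(1000) + 3.0   # true location parameter is 3
    theta_ose = one_step_estimator(x, np.median(x))

By Le Cam's result, theta_ose above is asymptotically equivalent to the fully iterated MLE; the talk extends this kind of guarantee to single iterations of scaled proximal methods.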