Department Seminar: Michael Schweinberger
Link to join seminar: Hosted on Webex.
Scalable statistical learning of network models in high-dimensional n = 1 and p → ∞ scenarios, with statistical guarantees
An important question in statistical network analysis is how to construct and estimate models of dependent network data without sacrificing computational scalability or statistical guarantees. We demonstrate that scalable estimation of models for dependent network data is possible by establishing the first consistency results and convergence rates for pseudo-likelihood-based M-estimators of parameter vectors of increasing dimension based on a single observation of dependent random variables. The main results cover models of dependent random variables with countable sample spaces and may be of independent interest. To showcase these results, we introduce a novel class of generalized β-models with dependent connections and parameter vectors of increasing dimension, and we establish consistency results and convergence rates for pseudo-likelihood-based M-estimators of these models in both dense- and sparse-network regimes.
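As background for the talk, a minimal sketch of pseudo-likelihood estimation in the classic β-model may help fix ideas. In that model each edge A_ij is an independent Bernoulli variable with log-odds β_i + β_j, so the pseudo-likelihood coincides with the likelihood and the estimator solves a degree-matching equation: each node's expected degree equals its observed degree. The generalized β-models of the talk replace these edge probabilities with full conditional distributions of dependent connections, which this sketch does not cover. All function names and the gradient-ascent solver below are illustrative choices, not the speaker's method.

```python
import numpy as np

def beta_model_pseudo_loglik(beta, A):
    """Pseudo-log-likelihood of the classic beta-model: sum over dyads i<j
    of A_ij * (beta_i + beta_j) - log(1 + exp(beta_i + beta_j))."""
    n = A.shape[0]
    ll = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            theta = beta[i] + beta[j]
            ll += A[i, j] * theta - np.log1p(np.exp(theta))
    return ll

def fit_beta_model(A, steps=500, lr=0.05):
    """Gradient ascent on the pseudo-log-likelihood. The gradient for
    beta_i is the observed degree of node i minus its expected degree
    sum_j sigmoid(beta_i + beta_j), so the fixed point matches degrees."""
    n = A.shape[0]
    beta = np.zeros(n)
    deg = A.sum(axis=1)
    for _ in range(steps):
        theta = beta[:, None] + beta[None, :]
        p = 1.0 / (1.0 + np.exp(-theta))   # edge probabilities
        np.fill_diagonal(p, 0.0)           # no self-loops
        grad = deg - p.sum(axis=1)
        beta += lr * grad
    return beta
```

At the optimum the fitted expected degrees reproduce the observed degree sequence, which is the moment equation the pseudo-likelihood estimator solves in this independent-edge special case.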