
Regularized principal manifolds




Many settings of unsupervised learning can be viewed as quantization problems - the minimization of the expected quantization error subject to some restrictions. This allows the use of tools such as regularization from the theory of (supervised) risk minimization for unsupervised learning. This setting turns out to be closely related to principal curves, the generative topographic map, and robust coding. We explore this connection in two ways: (1) we propose an algorithm for finding principal manifolds that can be regularized in a variety of ways; and (2) we derive uniform convergence bounds and hence bounds on the learning rates of the algorithm. In particular, we give bounds on the covering numbers which allow us to obtain nearly optimal learning rates for certain types of regularization operators. Experimental results demonstrate the feasibility of the approach.
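The core idea of the abstract - minimizing the expected quantization error of a low-dimensional manifold subject to a regularization constraint - can be illustrated with a minimal sketch. The following is not the paper's algorithm, but a toy variant under stated assumptions: a 1-D "manifold" is parameterized by a chain of nodes, and an alternating scheme assigns samples to their nearest node (quantization step) and then re-solves for the nodes under a second-difference smoothness penalty (regularization step). All data, node counts, and the penalty strength `lam` are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: noisy samples along a sine arc (illustrative, not from the paper).
t = rng.uniform(0, 2 * np.pi, 300)
X = np.column_stack([t, np.sin(t)]) + 0.1 * rng.normal(size=(300, 2))

K = 20     # number of nodes parameterizing the 1-D manifold
lam = 2.0  # regularization strength (curve smoothness)

# Initialize the nodes along the first principal direction of the data.
Xc = X - X.mean(axis=0)
u = np.linalg.svd(Xc, full_matrices=False)[2][0]
proj = Xc @ u
nodes = X.mean(axis=0) + np.outer(
    np.linspace(proj.min(), proj.max(), K), u)

# Second-difference penalty: lam * ||D2 @ nodes||^2 penalizes curvature,
# playing the role of the regularization term in the quantization functional.
D2 = np.zeros((K - 2, K))
for i in range(K - 2):
    D2[i, i:i + 3] = [1.0, -2.0, 1.0]
P = lam * (D2.T @ D2)

for _ in range(50):
    # Quantization step: assign each sample to its nearest node.
    d = ((X[:, None, :] - nodes[None, :, :]) ** 2).sum(-1)
    assign = d.argmin(axis=1)
    # Regularized update: minimize quantization error + penalty in closed form.
    counts = np.bincount(assign, minlength=K).astype(float)
    sums = np.zeros_like(nodes)
    np.add.at(sums, assign, X)
    nodes = np.linalg.solve(np.diag(counts) + P, sums)

# Mean quantization error of the fitted manifold.
d = ((X[:, None, :] - nodes[None, :, :]) ** 2).sum(-1)
err = d.min(axis=1).mean()
```

With `lam = 0` this degenerates to k-means on the nodes; increasing `lam` trades quantization error for smoothness, which is the regularization trade-off the paper's bounds quantify.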

Author(s): Smola, AJ. and Mika, S. and Schölkopf, B. and Williamson, RC.
Journal: Journal of Machine Learning Research
Volume: 1
Pages: 179-209
Year: 2001
Month: June

Department(s): Empirical Inference
Bibtex Type: Article (article)

Language: en
Organization: Max-Planck-Gesellschaft
School: Biologische Kybernetik



@article{smola2001regularized,
  title = {Regularized principal manifolds},
  author = {Smola, AJ. and Mika, S. and Sch{\"o}lkopf, B. and Williamson, RC.},
  journal = {Journal of Machine Learning Research},
  volume = {1},
  pages = {179--209},
  organization = {Max-Planck-Gesellschaft},
  school = {Biologische Kybernetik},
  month = jun,
  year = {2001}
}