## Statistical Theory for Nonlinear Function Approximation: Neural Nets, Mixture Models, and Adaptive Kernel Machines

- April 7, 2005
- 3:30 p.m.
- LeConte 412

## Abstract

The fields of statistics, information theory, approximation theory, and machine learning have long pursued methods of function fitting in high dimensions that are computationally efficient, flexible, and provably accurate. While the computational issues remain the most vexing, there have been surprising successes in uncovering desirable statistical properties of nonlinear function approximation and of basis adaptation from very large dictionaries of candidate basis functions. We illustrate the main principles and issues in three settings: artificial neural nets, mixture modeling, and adaptation in kernel machines.
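To make "basis adaptation from a large dictionary" concrete, here is a minimal sketch (not the speaker's method) of one standard greedy scheme, orthogonal matching pursuit: atoms are selected one at a time by their correlation with the current residual, with a least-squares refit after each selection. The random Gaussian dictionary and the problem sizes are illustrative assumptions.

```python
# Sketch: greedy basis selection (orthogonal matching pursuit) from a
# large dictionary of candidate basis functions. Sizes and the random
# Gaussian dictionary are assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(0)

n, p, k = 200, 500, 5            # samples, dictionary size, atoms to select
D = rng.standard_normal((n, p))
D /= np.linalg.norm(D, axis=0)   # unit-norm columns ("atoms")

# Synthetic target: a sparse combination of k atoms plus small noise.
true_idx = rng.choice(p, size=k, replace=False)
y = D[:, true_idx] @ rng.standard_normal(k) + 0.01 * rng.standard_normal(n)

selected = []
residual = y.copy()
for _ in range(k):
    # Pick the atom most correlated with the current residual.
    j = int(np.argmax(np.abs(D.T @ residual)))
    selected.append(j)
    # Refit coefficients on all selected atoms by least squares;
    # the residual becomes orthogonal to their span, so no atom
    # is ever selected twice.
    coef, *_ = np.linalg.lstsq(D[:, selected], y, rcond=None)
    residual = y - D[:, selected] @ coef

print(sorted(selected))
print(float(np.linalg.norm(residual)))
```

Each pass costs one dictionary-residual product, which is what makes greedy selection attractive when the dictionary is far too large for an exhaustive search.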