## The Manifold Moving Least-Squares (Manifold-MLS) Framework: approximating manifolds and reconstructing their atlas from discrete data sets

- Nov. 1, 2019
- 3:30 p.m.
- LeConte 317R

## Abstract

Differentiable manifolds are an indispensable ‘language’ in modern physics and mathematics. As such, there is a plethora of analytic tools designed to investigate manifold-based models (e.g., connections, differential forms, curvature tensors, parallel transport, bundles). However, in order to apply these tools, one normally assumes access to the manifold’s atlas of charts (i.e., local parametrizations). In recent years, manifold-based modeling has permeated data analysis as well, usually in order to avoid working in high dimensions. However, in these data-driven models the charts are not accessible, and the only information at hand is the samples themselves. As a result, the common practice in *Manifold Learning* is to project the data into a lower-dimensional Euclidean domain while maintaining some notion of distance (e.g., geodesic or diffusion).

In this talk we introduce an alternative approach, the Manifold Moving Least-Squares (*Manifold-MLS*), that, given a finite set of samples, reconstructs an atlas of charts and provides an approximation of the manifold itself. Under certain (non-restrictive) sampling assumptions, we prove that the *Manifold-MLS* produces a smooth Riemannian manifold approximating the sampled one, even when the samples are noisy. We show that the approximation converges to the sampled manifold as the number of samples tends to infinity, and we give exact convergence rates.
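To give a flavor of the moving least-squares idea underlying the framework, the sketch below projects a point onto a noisy sample of a 1-D manifold (the unit circle in the plane) by fitting a locally weighted polynomial in an estimated tangent frame. This is only an illustrative toy, not the *Manifold-MLS* construction from the talk: the kernel, bandwidth `h`, degree, and the weighted-PCA choice of local frame are all assumptions made here for simplicity.

```python
import numpy as np

def mls_project(points, x0, h=0.3, deg=2):
    """Project x0 toward the sampled manifold via a local
    weighted (moving) least-squares polynomial fit.

    Toy version for a 1-D manifold in R^2; the local coordinate
    frame comes from a weighted PCA of the neighborhood (a crude
    stand-in for a proper local parametrization).
    """
    d = points - x0
    w = np.exp(-np.sum(d**2, axis=1) / h**2)       # Gaussian weights
    mu = (w[:, None] * points).sum(0) / w.sum()    # weighted centroid
    # Weighted PCA gives a local tangent/normal estimate.
    c = (w[:, None] * (points - mu)).T @ (points - mu)
    _, vecs = np.linalg.eigh(c)
    t, n = vecs[:, -1], vecs[:, 0]                 # tangent, normal
    # Local coordinates: s along the tangent, y along the normal.
    s = (points - mu) @ t
    y = (points - mu) @ n
    # Weighted polynomial fit y(s) of degree `deg`.
    V = np.vander(s, deg + 1)
    W = np.diag(w)
    coef = np.linalg.solve(V.T @ W @ V, V.T @ W @ y)
    s0 = (x0 - mu) @ t
    y0 = np.polyval(coef, s0)
    return mu + s0 * t + y0 * n

# Noisy samples of the unit circle (a smooth 1-D manifold in R^2).
rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, 400)
pts = np.c_[np.cos(theta), np.sin(theta)] + 0.02 * rng.normal(size=(400, 2))

p = mls_project(pts, np.array([1.1, 0.0]))
print(np.linalg.norm(p))  # close to 1, the circle's radius
```

Despite the noise, the projected point lands near the underlying circle; the convergence rates discussed in the talk quantify how such local fits improve as the sampling density grows.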