LEARNING KERNELS WITH NONLOCAL DEPENDENCE IN MEAN FIELD EQUATIONS AND THE EXTENSION PROBLEM ON DIRICHLET SPACE
Date
2023-05-08
Publisher
Johns Hopkins University
Abstract
This thesis presents the research conducted during the author’s Ph.D. on topics in both applied and pure mathematics.
The first part of the thesis introduces a nonparametric algorithm for learning the interaction kernels of mean-field equations of first-order interacting particle systems. The algorithm employs a probabilistic error functional derived from the likelihood of the mean-field equation’s diffusion process, and it achieves convergence in a weighted L2 space by least squares with regularization on data-adaptive hypothesis spaces. Identifiability is also characterized: on the data-adaptive L2 spaces, the Fréchet derivative of the loss functional leads to a positive semi-definite integral operator, so identifiability holds on the eigenspaces of the integral operator with nonzero eigenvalues, and on the full L2 space if and only if the integral operator is strictly positive. Therefore, the inverse problem is ill-posed, and we propose an RKHS-based regularization and numerically demonstrate its accuracy.
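As a rough illustration of the regression step sketched above, the following is a minimal example of regularized least squares over a finite-dimensional hypothesis space; the design matrix, the Gram matrix of the basis, and the synthetic data are placeholder assumptions, not the thesis’s actual estimator or basis.

```python
# Minimal sketch: regularized least squares on a finite hypothesis space.
# The matrices A, B and the data b are hypothetical placeholders; the thesis's
# actual weighted-L2 norm, basis, and error functional may differ.
import numpy as np

def fit_kernel_coefficients(A, b, B, lam):
    """Solve min_c ||A c - b||^2 + lam * c^T B c.

    A   : (m, n) design matrix from evaluating n basis functions on the data.
    b   : (m,)   observation vector.
    B   : (n, n) Gram matrix of the basis (identity recovers plain ridge).
    lam : regularization parameter.
    """
    return np.linalg.solve(A.T @ A + lam * B, A.T @ b)

# Toy usage with synthetic data (purely illustrative).
rng = np.random.default_rng(0)
m, n = 200, 12
A = rng.normal(size=(m, n))
c_true = rng.normal(size=n)
b = A @ c_true + 0.1 * rng.normal(size=m)
B = np.eye(n)                      # placeholder for the weighted-L2 Gram matrix
c_hat = fit_kernel_coefficients(A, b, B, lam=1e-3)
print(np.linalg.norm(c_hat - c_true))
```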
The second part investigates the problem of learning kernels in operators from data. The nonlocal dependence of the kernels and the singular inversion operators make the inverse problem ill-posed. We present Data Adaptive RKHS Tikhonov Regularization (DARTR). The data-adaptive reproducing kernel Hilbert space (RKHS) is the function space of identifiability, and its norm is suitable for Tikhonov regularization. DARTR selects the optimal regularization parameter and is demonstrated to be accurate, robust, and convergent. Moreover, through the natural connection between Tikhonov regularization and Bayesian priors, DARTR can also be viewed as a data-adaptive prior, which is proved, both theoretically and computationally, to yield a stable posterior whose mean has a small-noise limit. Numerical tests show that a fixed prior can lead to a divergent posterior mean in the presence of errors, whereas the data-adaptive prior yields a convergent one.
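The connection between Tikhonov regularization and Bayesian priors mentioned above can be made concrete through the standard identification of a quadratic penalty with a Gaussian prior: with noise variance sigma^2 and prior covariance C, the posterior mean coincides with the Tikhonov solution using the norm induced by C^{-1} and parameter lambda = sigma^2. The sketch below uses synthetic placeholder matrices, not DARTR’s data-adaptive quantities.

```python
# Sketch of the Tikhonov <-> Gaussian-prior correspondence (placeholder data).
import numpy as np

rng = np.random.default_rng(1)
m, n, sigma = 100, 8, 0.05
A = rng.normal(size=(m, n))
x_true = rng.normal(size=n)
b = A @ x_true + sigma * rng.normal(size=m)

C = np.eye(n)                      # prior covariance; DARTR would use a
                                   # data-adaptive RKHS-based choice instead
lam = sigma**2                     # Tikhonov parameter matching the prior scale

# Tikhonov estimate: argmin ||Ax - b||^2 + lam * x^T C^{-1} x
x_tik = np.linalg.solve(A.T @ A + lam * np.linalg.inv(C), A.T @ b)

# Bayesian posterior mean for x ~ N(0, C), b | x ~ N(Ax, sigma^2 I)
x_post = np.linalg.solve(A.T @ A / sigma**2 + np.linalg.inv(C),
                         A.T @ b / sigma**2)

print(np.allclose(x_tik, x_post))  # the two estimates coincide
```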
Finally, we prove Hölder regularity and the Harnack principle for weak solutions of the equation (-L)^s u = 0 on fractals, in a quite general framework of Dirichlet spaces with generator L. The heat kernel of L is assumed to satisfy sub-Gaussian estimates, which is typical in the fractal setting. Our goal is to provide a general setup for the study of powers of generators of Dirichlet forms.
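For reference, the equation studied in this part and one common form of the two-sided sub-Gaussian heat kernel estimate can be written as below; the exponents alpha, beta and the constants c_1, ..., c_4 are assumptions of this general framework rather than values fixed by the abstract.

```latex
% The fractional equation studied in the last part, for 0 < s < 1:
\[
  (-L)^{s} u = 0 .
\]
% A typical two-sided sub-Gaussian heat kernel estimate assumed for L,
% with exponents \alpha, \beta and constants c_1, \dots, c_4 > 0:
\[
  \frac{c_1}{t^{\alpha/\beta}}
    \exp\!\Bigl(-c_2 \Bigl(\tfrac{d(x,y)^{\beta}}{t}\Bigr)^{\frac{1}{\beta-1}}\Bigr)
  \;\le\; p_t(x,y) \;\le\;
  \frac{c_3}{t^{\alpha/\beta}}
    \exp\!\Bigl(-c_4 \Bigl(\tfrac{d(x,y)^{\beta}}{t}\Bigr)^{\frac{1}{\beta-1}}\Bigr).
\]
```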
Primary Readers and Advisors:
Yannick Sire, Fei Lu
Keywords
Mean-Field Equations, Interacting Particle System, Inverse Problem, Tikhonov Regularization, Dirichlet Space, Harnack Inequality, Heat Kernel Estimate