Convergence of Langevin-Simulated Annealing algorithms with multiplicative noise II: Total Variation
Abstract
We study the convergence of Langevin-Simulated Annealing type algorithms with multiplicative noise, i.e. for $V : \mathbb{R}^d \to \mathbb{R}$ a potential function to minimize, we consider the stochastic differential equation $dY_t = - (\sigma \sigma^\top \nabla V)(Y_t) dt + a(t)\sigma(Y_t)dW_t + a(t)^2\Upsilon(Y_t)dt$, where $(W_t)$ is a Brownian motion, $\sigma : \mathbb{R}^d \to \mathcal{M}_d(\mathbb{R})$ is an adaptive (multiplicative) noise coefficient, $a : \mathbb{R}^+ \to \mathbb{R}^+$ is a function decreasing to $0$, and $\Upsilon$ is a correction term. Allowing $\sigma$ to depend on the position brings faster convergence in comparison with the classical Langevin equation $dY_t = -\nabla V(Y_t)dt + \sigma dW_t$ with constant $\sigma$. In a previous paper we established the convergence in $L^1$-Wasserstein distance of $Y_t$ and of its associated Euler scheme $\bar{Y}_t$ to $\operatorname{argmin}(V)$ with the classical schedule $a(t) = A\log^{-1/2}(t)$. In the present paper we prove the convergence in total variation distance. The total variation case proves more demanding and requires additional regularization lemmas.
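For illustration only (this sketch is not from the paper), the following Python code simulates one Euler–Maruyama path of the SDE above with the schedule $a(t) = A\log^{-1/2}(t)$. All names (`grad_V`, `sigma`, `upsilon`), the step size, and the starting time are hypothetical choices; the correction term $\Upsilon$ is passed in as a user-supplied function rather than derived here.

```python
# Illustrative sketch (assumed interface, not the paper's code):
# Euler-Maruyama discretization of
#   dY_t = -(sigma sigma^T grad V)(Y_t) dt
#          + a(t) sigma(Y_t) dW_t + a(t)^2 Upsilon(Y_t) dt,
# with the annealing schedule a(t) = A / sqrt(log t).
import numpy as np

def euler_langevin_sa(grad_V, sigma, upsilon, y0, *, A=1.0, h=1e-2,
                      n_steps=100_000, t0=np.e, rng=None):
    """Simulate one Euler path of the annealed SDE.

    grad_V  : R^d -> R^d, gradient of the potential V to minimize
    sigma   : R^d -> R^{d x d}, multiplicative noise coefficient
    upsilon : R^d -> R^d, correction term (as defined in the paper)
    t0 > 1 so that a(t) = A / sqrt(log t) is well defined.
    """
    rng = np.random.default_rng(rng)
    y = np.asarray(y0, dtype=float)
    t = t0
    for _ in range(n_steps):
        a = A / np.sqrt(np.log(t))                 # schedule a(t) -> 0
        S = sigma(y)                               # d x d noise matrix
        dW = rng.normal(scale=np.sqrt(h), size=y.shape)  # Brownian increment
        drift = -(S @ S.T) @ grad_V(y) + a**2 * upsilon(y)
        y = y + drift * h + a * (S @ dW)           # Euler step
        t += h
    return y

# Example: V(y) = |y|^2 / 2 with constant sigma = I_d, in which case
# the correction term involving derivatives of sigma sigma^T vanishes.
d = 2
y_final = euler_langevin_sa(lambda y: y, lambda y: np.eye(d),
                            lambda y: np.zeros(d), y0=np.ones(d))
```

In this sketch the noise matrix is frozen at the left endpoint of each time step, matching the usual Euler scheme; with constant $\sigma$ the dynamics reduce to the classical annealed Langevin equation.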