Winnie Xu, Ricky T.Q. Chen, Xuechen Li, David Duvenaud
We explore infinite-dimensional stochastic variational inference for learning the dynamics of continuous-time Neural ODEs with instantaneous noise. Our framework trains Bayesian neural networks by parameterizing weight uncertainty with stochastic differential equations (SDEs), together with efficient gradient-based training algorithms. We also derive gradient estimators for this infinite-dimensional setting whose variance approaches zero as the approximate posterior converges to the true posterior.
International Conference on Artificial Intelligence and Statistics (AISTATS), 2022.
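Below is a minimal sketch, not the authors' code, of the central idea in the abstract: uncertainty over a flattened weight vector is represented by an SDE over a depth-like variable t, and weight samples are drawn by integrating that SDE with the torchsde library. The names (WeightSDE, sample_weights), the fixed diffusion scale, and the network sizes are illustrative assumptions, and the KL term of the variational objective is omitted.

import torch
import torch.nn as nn
import torchsde


class WeightSDE(nn.Module):
    """Illustrative variational posterior over a flattened weight vector w(t)."""
    noise_type = "diagonal"
    sde_type = "ito"

    def __init__(self, n_weights, sigma=0.1):
        super().__init__()
        # Learned drift network: maps (w, t) to the instantaneous change in w.
        self.drift = nn.Sequential(
            nn.Linear(n_weights + 1, 64), nn.Tanh(), nn.Linear(64, n_weights)
        )
        self.sigma = sigma  # fixed diffusion scale (simplifying assumption)

    def f(self, t, w):
        # Drift term of the SDE.
        t_col = t.expand(w.shape[0], 1)
        return self.drift(torch.cat([w, t_col], dim=-1))

    def g(self, t, w):
        # Diagonal diffusion term of the SDE.
        return torch.full_like(w, self.sigma)


def sample_weights(sde, n_weights, batch=1):
    """Draw one weight sample by integrating the SDE from t=0 to t=1."""
    w0 = torch.zeros(batch, n_weights)
    ts = torch.tensor([0.0, 1.0])
    ws = torchsde.sdeint(sde, w0, ts)  # shape: (len(ts), batch, n_weights)
    return ws[-1]                      # weights at the final "depth"


if __name__ == "__main__":
    # Usage: sample weights for a toy linear model y = w1 * x + w0.
    sde = WeightSDE(n_weights=2)
    w = sample_weights(sde, n_weights=2)[0]
    x = torch.randn(8, 1)
    y_pred = w[1] * x + w[0]
    # A full training loop would add a likelihood term plus a KL penalty
    # between the posterior and prior SDEs (omitted in this sketch).
    print(y_pred.shape)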