Dmitry Vetrov
Dmitry Vetrov (graduated from Moscow State University in 2003, PhD in 2006) is a research professor at the Higher School of Economics, Moscow, and a leading researcher at AIRI, Moscow. He is the founder and head of the Bayesian Methods Research Group, which has become one of the strongest research groups in Russia. Three of his recent PhD students became researchers at DeepMind. His research focuses on combining the Bayesian framework with Deep Learning models. His group is also actively involved in developing more efficient algorithms for diffusion models, studying the properties of the loss landscape in deep neural networks, building scalable tools for stochastic optimization, applying tensor decomposition methods to large-scale Machine Learning, improving conditional text generation models, etc.
Recent Publications
- To Stay or Not to Stay in the Pre-train Basin: Insights on Ensembling in Transfer Learning (2023) NeurIPS 2023
- Star-Shaped Denoising Diffusion Probabilistic Models (2023) NeurIPS 2023
- Entropic Neural Optimal Transport via Diffusion Processes (2023) NeurIPS 2023
- StyleDomain: Efficient and Lightweight Parameterizations of StyleGAN for One-shot and Few-shot Domain Adaptation (2023) ICCV 2023
- UnDiff: Unsupervised Voice Restoration with Unconditional Diffusion Model (2023) Interspeech 2023
- HIFI++: A Unified Framework for Bandwidth Extension and Speech Enhancement (2023) ICASSP 2023
- MARS: Masked Automatic Ranks Selection in Tensor Decompositions (2023) AISTATS 2023
- HyperDomainNet: Universal Domain Adaptation for Generative Adversarial Network (2022) NeurIPS 2022
- Training Scale-Invariant Neural Networks on the Sphere Can Happen in Three Regimes (2022) NeurIPS 2022
- FFC-SE: Fast Fourier Convolution for Speech Enhancement (2022) Interspeech 2022
- Variational Autoencoders for Precoding Matrices with High Spectral Efficiency (2022) MOTOR 2022
- On the Periodic Behavior of Neural Network Training with Batch Normalization and Weight Decay (2021) NeurIPS 2021
- Leveraging Recursive Gumbel-Max Trick for Approximate Inference in Combinatorial Spaces (2021) NeurIPS 2021
- On Power Laws in Deep Ensembles (2020) NeurIPS 2020