1. Dr. Kirill Struminsky

    Congrats to Kirill Struminsky, who defended his PhD thesis on "Learning Guarantees and Efficient Inference for Structured Prediction" on February 13!

    This work examines gauge functions in a structured prediction setting, allowing learning guarantees to be obtained without assuming the consistency of the target function. In addition, within a probabilistic framework, gradient estimators were developed for working with latent structured variables in the form of permutations or subsets, and a number of practical applications were considered.

  2. Dr. Maxim Kodryan

    On January 23, Maxim Kodryan successfully defended his PhD thesis on "Training Dynamics and Loss Landscape of Neural Networks with Scale-Invariant Parameters".

    Scale invariance is one of the key properties of the parameters of most modern neural network architectures. Arising from the ubiquitous use of layers that normalize intermediate activations and/or weights, scale invariance, as the name implies, means that the function implemented by the neural network is unchanged when its parameters are multiplied by an arbitrary positive scalar. In his work, Maxim investigates how this property affects the training dynamics of neural network models, as well as its influence on the intrinsic structure of the loss landscape.

  3. CVPR'23: Outstanding Reviewing

    This June, Dmitry Vetrov was named one of the CVPR 2023 Outstanding Reviewers. Out of more than 7000 individuals who completed at least one review for CVPR 2023, the Program Chairs designated 232 whose reviews stood out for their exceptional quality and helpfulness, based on nominations and ratings submitted by the Area Chairs.

    Eduard Pockonechnyy, Maksim Nakhodnov, Semion Elistratov, Aibek Alanov, and Viacheslav Meshchaninov also helped review CVPR papers. Congratulations to our team!

  4. PhD Parade 2022

    We're happy to announce that several members of our lab have completed their PhD studies and successfully defended their theses:

    Our congratulations to the newly minted doctors!

  5. Dr. Alexander Novikov

    On December 16, our alumnus Alexander Novikov defended his PhD thesis on "Tensor Methods for Machine Learning". The thesis focuses on using low-rank tensor decomposition algorithms to develop faster methods for training models, methods for compressing and accelerating models, and less resource-intensive machine learning models built from scratch. The dissertation was supervised by Ivan Oseledets and Dmitry Vetrov. Congrats to Dr. Alexander!

  6. AI Super Grant

    HSE University, where most of our group is based, received a major grant from the Russian Government to support AI research in 2021-2024. The funding, totaling over 1 billion rubles, will come from the budget of the Russian Federation and from our partners: SberBank, Yandex, and MTS AI. The purpose of the project is to develop new AI technologies that expand the scope of AI applications, overcome existing limitations in solving applied tasks, and optimize AI models. Dmitry Vetrov will provide scientific guidance for the project.

  7. Sergey Troshin to receive a scholarship

    Sergey Troshin, a graduate student at BayesGroup, was awarded a personal scholarship for his contribution to Computer Science. The scholarship was recently established by the Faculty of Computer Science of HSE University for its students.

  8. Dr. Daniil Polykovskiy

    On May 20, Daniil Polykovskiy defended his thesis on "Generative Models for Drug Discovery". The dissertation was prepared at HSE University and supervised by Dmitry Vetrov. We congratulate Daniil — our new Doctor of Philosophy in Computer Science!

  9. Ilya Segalovich Scholarship Winners

    Three young researchers from our team became Ilya Segalovich Scholarship winners. The scholarship, established by the Faculty of Computer Science (HSE University) in 2015, rewards academic and scientific achievements. Congrats to our winners: Artyom Gadetsky, Denis Rakitin, and Sergey Troshin!
