I am interested in well-motivated optimization problems that lend themselves to insightful mathematical analysis.
Konstantin Mishchenko is a Ph.D. candidate at the Visual Computing Center (VCC) at King Abdullah University of Science and Technology (KAUST), where he works under the supervision of Professor Peter Richtárik as a member of his research group.
Education and Early Career
Konstantin Mishchenko obtained his bachelor's degree in Computer Science and Physics from the Moscow Institute of Physics and Technology in 2016. In 2017, he received his master's degree in Machine Learning (MASH program) from Université Paris-Dauphine in Paris, France.
Konstantin completed several internships during his studies. Most recently, he was an applied scientist intern at Amazon, where he built machine learning models. Before that, he interned at AIM High Tech in Moscow.
Research Interests
He is interested in optimization methods for machine learning, especially stochastic and distributed algorithms for large-scale problems, as well as medical imaging and numerical methods in computer science.
Honors and Awards
Konstantin won 1st place at the PlumeLabs Data Challenge in 2017 and was awarded the Paris Graduate School of Mathematics fellowship for 2016-2017. He won 1st prize at the Higher School of Economics Olympiad on Applied Math and Informatics in 2015, and 1st prize at both the Moscow Mathematical Olympiad and the Phystech International olympiad in Mathematics in 2012.
Selected Publications
- Konstantin Mishchenko, Filip Hanzely, Peter Richtárik. 99% of Parallel Optimization is Inevitably a Waste of Time, arXiv:1901.09437, 2019.
- Konstantin Mishchenko, Eduard Gorbunov, Martin Takáč, Peter Richtárik. Distributed Learning with Compressed Gradient Differences, arXiv:1901.09269, 2019.
- Konstantin Mishchenko, Peter Richtárik. A Stochastic Penalty Model for Convex and Nonconvex Optimization with Big Constraints, arXiv:1810.13387, 2018.
- Filip Hanzely, Konstantin Mishchenko, Peter Richtárik. SEGA: Variance Reduction via Gradient Sketching, Advances in Neural Information Processing Systems 31, 2018.
- Konstantin Mishchenko, Franck Iutzeler, Jérôme Malick. A Distributed Flexible Delay-tolerant Proximal Gradient Algorithm, arXiv:1806.09429, 2018.
- Konstantin Mishchenko, Franck Iutzeler, Jérôme Malick, Massih-Reza Amini. A Delay-tolerant Proximal-Gradient Algorithm for Distributed Learning, Proceedings of the 35th International Conference on Machine Learning, PMLR 80:3587-3595, 2018.
Education Profile
- M.Sc., Machine Learning, École normale supérieure Paris-Saclay and Paris-Dauphine, 2017
- B.Sc., Computer Science and Physics, Moscow Institute of Physics and Technology, 2016
Awards and Distinctions
- 1st place, PlumeLabs Data Challenge, 2017
- Paris Graduate School of Mathematics fellowship, 2016-2017
- 1st prize, Higher School of Economics Olympiad on Applied Math and Informatics, 2015
- 1st prize, Moscow Mathematical Olympiad, 2012
- 1st prize, Phystech International, Mathematics, 2012