2023 - Research.com Computer Science in Saudi Arabia Leader Award
2023 - Research.com Mathematics in Saudi Arabia Leader Award
2022 - Research.com Rising Star of Science Award
His primary scientific interests are in Algorithm, Convex function, Coordinate descent, Mathematical optimization and Sampling. His Algorithm study combines topics in areas such as Separable space and Stochastic optimization. His Convex function research incorporates elements of Combinatorics, Gradient descent, Lasso, Convex optimization and Function.
His Coordinate descent study integrates concerns from other disciplines, such as Linear system and Computational intelligence. His Mathematical optimization research incorporates themes from Robustification, Local algorithm, Rate of convergence, Speedup and Random coordinate descent. His study of Importance sampling, within the broader domain of Sampling, overlaps with other disciplines such as Context.
His primary areas of investigation include Algorithm, Convex function, Applied mathematics, Mathematical optimization and Coordinate descent. He has included themes like Matrix, Simple and Compression in his Algorithm study. His Convex function study spans a wide range of topics, including Combinatorics, Minimization, Convex optimization, Function and Differentiable function.
His Applied mathematics research is multidisciplinary, incorporating elements of Rate of convergence, Stochastic gradient descent, Hessian matrix and Importance sampling. His work deals with themes such as Convergence, Dual and Speedup, which intersect with Mathematical optimization. His studies deal with areas such as Discrete mathematics, Acceleration, Optimization problem, Random coordinate descent and Lipschitz continuity as well as Coordinate descent.
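Randomized coordinate descent, a recurring theme in the work summarized above, can be illustrated with a minimal sketch. The quadratic objective, the exact-minimization step rule, and all names below are illustrative assumptions for this page, not taken from any specific paper of his:

```python
import numpy as np

def randomized_coordinate_descent(A, b, iterations=5000, seed=0):
    """Illustrative sketch: minimize f(x) = 0.5 x^T A x - b^T x for a
    symmetric positive definite A by updating one random coordinate per step."""
    rng = np.random.default_rng(seed)
    n = len(b)
    x = np.zeros(n)
    for _ in range(iterations):
        i = rng.integers(n)            # pick a coordinate uniformly at random
        grad_i = A[i] @ x - b[i]       # i-th partial derivative of f at x
        x[i] -= grad_i / A[i, i]       # exact minimization along coordinate i
    return x

# Small SPD example: the minimizer of f solves the linear system A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x = randomized_coordinate_descent(A, b)
```

Each step touches a single coordinate, which is what makes such methods cheap per iteration and easy to parallelize over blocks of coordinates.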
Peter Richtárik mainly focuses on Convex function, Applied mathematics, Algorithm, Rate of convergence and Variance reduction. His Convex function research is multidisciplinary, incorporating perspectives in Function, Differentiable function, Importance sampling and Minimization. The various areas that he examines in his Applied mathematics study include Iterative method, Stochastic gradient descent, Newton's method and Coordinate descent.
His work in the fields of Computation overlaps with other areas such as Dither. His Rate of convergence study combines topics from a wide range of disciplines, such as Quadratic equation, Gradient descent, Ergodic theory, Mathematical optimization and Convex optimization. His Mathematical optimization research includes themes of Sampling, Convergence and Fixed point.
His main research concerns Applied mathematics, Stochastic gradient descent, Rate of convergence, Variance reduction and Convex function. His work explores how Applied mathematics and Function connect with Constant, Quadratic equation and Sublinear function, among other disciplines. His work in Stochastic gradient descent tackles topics such as Stochastic optimization, which relate to areas like Hessian matrix, Numerical linear algebra, Jacobian matrix and Importance sampling.
His research in Rate of convergence intersects with topics in Gradient descent, Iterated function, Mathematical optimization and Compression. His work in Gradient descent addresses issues such as Combinatorics, which connect to fields such as Federated learning. His research also combines Computation and Convex function.
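Variance reduction for stochastic gradient descent, named above as one of his main concerns, can be sketched in an SVRG-style form: a periodic full gradient at a snapshot point serves as a control variate for the stochastic steps. The least-squares objective, step-size rule, and all names here are illustrative assumptions, not drawn from a specific paper of his:

```python
import numpy as np

def svrg(A, b, step, outer=200, seed=0):
    """Illustrative SVRG-style variance-reduced SGD for least squares,
    f(x) = (1/2n) * ||A x - b||^2."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    for _ in range(outer):
        x_ref = x.copy()
        full_grad = A.T @ (A @ x_ref - b) / n      # full gradient at the snapshot
        for _ in range(2 * n):                     # inner stochastic loop
            i = rng.integers(n)
            g_i = A[i] * (A[i] @ x - b[i])         # stochastic gradient at x
            g_ref = A[i] * (A[i] @ x_ref - b[i])   # same sample at the snapshot
            x -= step * (g_i - g_ref + full_grad)  # control-variate update
    return x

# Noiseless synthetic least-squares problem, so the minimizer is x_true.
rng = np.random.default_rng(1)
A = rng.normal(size=(20, 3))
x_true = np.array([1.0, -2.0, 0.5])
b = A @ x_true
L_max = np.max((A ** 2).sum(axis=1))   # largest per-sample smoothness constant
x = svrg(A, b, step=1.0 / (4.0 * L_max))
```

The update `g_i - g_ref + full_grad` is unbiased for the full gradient, and its variance vanishes as the iterates approach the snapshot, which is what permits a constant step size and a linear convergence rate on strongly convex problems.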
This overview was generated by a machine learning system which analysed the scientist's body of work.
Federated Learning: Strategies for Improving Communication Efficiency
Jakub Konečný;H. Brendan McMahan;Felix X. Yu;Peter Richtarik.
arXiv: Learning (2016)
Federated Optimization: Distributed Machine Learning for On-Device Intelligence
Jakub Konečný;H. Brendan McMahan;Daniel Ramage;Peter Richtarik.
arXiv: Learning (2016)
Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
Peter Richtárik;Martin Takáč.
Mathematical Programming (2014)
Generalized Power Method for Sparse Principal Component Analysis
Michel Journée;Yurii Nesterov;Peter Richtárik;Rodolphe Sepulchre.
Journal of Machine Learning Research (2010)
Parallel coordinate descent methods for big data optimization
Peter Richtárik;Martin Takáč.
Mathematical Programming (2016)
Accelerated, Parallel, and Proximal Coordinate Descent
Olivier Fercoq;Peter Richtárik.
SIAM Journal on Optimization (2015)
Mini-Batch Semi-Stochastic Gradient Descent in the Proximal Setting
Jakub Konecny;Jie Liu;Peter Richtarik;Martin Takac.
IEEE Journal of Selected Topics in Signal Processing (2016)
Randomized Iterative Methods for Linear Systems
Robert Mansel Gower;Peter Richtárik.
SIAM Journal on Matrix Analysis and Applications (2015)
Mini-Batch Primal and Dual Methods for SVMs
Martin Takac;Avleen Bijral;Peter Richtarik;Nati Srebro.
International Conference on Machine Learning (2013)
Adding vs. Averaging in Distributed Primal-Dual Optimization
Chenxin Ma;Virginia Smith;Martin Jaggi;Michael Jordan.
International Conference on Machine Learning (2015)