Peter Richtárik

D-index (Discipline H-index) only includes papers and citation values for an examined discipline, in contrast to General H-index which accounts for publications across all disciplines.

| Discipline name | D-index | Citations | Publications | World Ranking | National Ranking |
| --- | --- | --- | --- | --- | --- |
| Mathematics | 38 | 6,653 | 131 | 1122 | 11 |
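
The H-index underlying this metric is the largest h such that the author has h papers with at least h citations each; the D-index applies the same rule restricted to publications classified under the discipline in question. A minimal illustrative computation, using made-up citation counts:

```python
def h_index(citations):
    """Largest h such that there are h papers with at least h citations each."""
    counts = sorted(citations, reverse=True)
    return sum(1 for rank, c in enumerate(counts, start=1) if c >= rank)

# Example with invented citation counts: five papers cited 10, 8, 5, 4 and 1 times.
print(h_index([10, 8, 5, 4, 1]))  # 4
```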

- Algorithm
- Statistics
- Mathematical optimization

His primary scientific interests are in Algorithm, Convex function, Coordinate descent, Mathematical optimization and Sampling. His Algorithm study combines topics in areas such as Separable space and Stochastic optimization. His Convex function research incorporates elements of Combinatorics, Gradient descent, Lasso, Convex optimization and Function.

His Coordinate descent study integrates concerns from other disciplines, such as Linear system, Computational intelligence and Type. His Mathematical optimization research incorporates themes from Robustification, Local algorithm, Rate of convergence, Speedup and Random coordinate descent. His Importance sampling research, within the broader domain of Sampling, also overlaps with areas such as Context.
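
As a concrete illustration of the coordinate descent theme above, here is a minimal sketch of randomized coordinate descent on a strongly convex quadratic. The uniform coordinate sampling, diagonal step sizes and test problem are generic textbook choices used only for illustration, not a reconstruction of any specific algorithm from his papers.

```python
import numpy as np

def randomized_coordinate_descent(A, b, x0, n_iters=5000, seed=0):
    """Minimize f(x) = 0.5 * x^T A x - b^T x (A symmetric positive definite)
    by updating one uniformly sampled coordinate per iteration."""
    rng = np.random.default_rng(seed)
    x = x0.astype(float).copy()
    n = x.size
    diag = np.diag(A)                # coordinate-wise curvature (Lipschitz constants)
    for _ in range(n_iters):
        i = rng.integers(n)          # sample a coordinate uniformly at random
        g_i = A[i] @ x - b[i]        # partial derivative along coordinate i
        x[i] -= g_i / diag[i]        # exact minimization along that coordinate
    return x

# Tiny usage example on a random well-conditioned problem.
rng = np.random.default_rng(1)
M = rng.standard_normal((5, 5))
A = M.T @ M + 5 * np.eye(5)
b = rng.standard_normal(5)
x = randomized_coordinate_descent(A, b, np.zeros(5))
print(np.linalg.norm(A @ x - b))     # should be close to zero
```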

- Federated Learning: Strategies for Improving Communication Efficiency (1080 citations)
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function (579 citations)
- Federated Optimization: Distributed Machine Learning for On-Device Intelligence (466 citations)

His primary areas of investigation include Algorithm, Convex function, Applied mathematics, Mathematical optimization and Coordinate descent. He has included themes like Matrix, Simple and Compression in his Algorithm study. His Convex function study spans a wide range of topics, including Combinatorics, Minification, Convex optimization, Function and Differentiable function.

His Applied mathematics research is multidisciplinary, incorporating elements of Rate of convergence, Stochastic gradient descent, Hessian matrix and Importance sampling. His work deals with themes such as Convergence, Dual and Speedup, which intersect with Mathematical optimization. His studies deal with areas such as Discrete mathematics, Acceleration, Optimization problem, Random coordinate descent and Lipschitz continuity as well as Coordinate descent.
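
Since stochastic gradient descent and importance sampling appear together above, the following sketch shows non-uniform sampling in SGD for a least-squares objective. Sampling rows with probability proportional to their squared norms is one common heuristic from the literature, used here purely for illustration; the step size is an arbitrary constant.

```python
import numpy as np

def sgd_importance_sampling(A, b, x0, step=0.01, n_iters=20000, seed=0):
    """SGD for f(x) = (1/2n) * ||A x - b||^2, sampling row i with probability
    proportional to ||a_i||^2 and reweighting to keep the gradient estimate unbiased."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    p = np.sum(A**2, axis=1)
    p = p / p.sum()                         # importance-sampling probabilities
    x = x0.astype(float).copy()
    for _ in range(n_iters):
        i = rng.choice(n, p=p)
        grad_i = A[i] * (A[i] @ x - b[i])   # gradient of f_i(x) = 0.5 * (a_i^T x - b_i)^2
        x -= step * grad_i / (n * p[i])     # unbiased estimate of the full gradient
    return x
```

Reweighting each sampled gradient by 1/(n·p_i) keeps the estimator unbiased, so non-uniform sampling changes the variance of the steps but not their expected direction.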

- Algorithm (37.69%)
- Convex function (36.94%)
- Applied mathematics (30.60%)

Peter Richtárik mainly focuses on Convex function, Applied mathematics, Algorithm, Rate of convergence and Variance reduction. His Convex function research is multidisciplinary, incorporating perspectives in Function, Differentiable function, Importance sampling and Minification. The various areas that he examines in his Applied mathematics study include Iterative method, Stochastic gradient descent, Newton's method and Coordinate descent.

His work in the fields of Computation overlaps with other areas such as Dither. His Rate of convergence study combines topics from a wide range of disciplines, such as Quadratic equation, Gradient descent, Ergodic theory, Mathematical optimization and Convex optimization. His Mathematical optimization research includes themes of Sampling, Convergence and Fixed point.
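
Variance reduction is one of the focal topics above. The sketch below implements a generic SVRG-style estimator on a finite-sum least-squares problem; it is a standard textbook variant meant only to illustrate the idea of anchoring stochastic gradients to a periodically recomputed full gradient, not a reconstruction of his algorithms.

```python
import numpy as np

def svrg(A, b, x0, step=0.05, n_epochs=30, seed=0):
    """SVRG-style variance reduction for f(x) = (1/2n) * ||A x - b||^2.
    Each epoch recomputes the full gradient at a reference point; inner steps use
    g = grad f_i(x) - grad f_i(x_ref) + full_grad(x_ref), an unbiased estimator
    whose variance shrinks as x and the reference point approach the solution."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    x = x0.astype(float).copy()
    for _ in range(n_epochs):
        x_ref = x.copy()
        full_grad = A.T @ (A @ x_ref - b) / n        # exact gradient at the reference point
        for _ in range(n):                           # one pass of inner stochastic steps
            i = rng.integers(n)
            g_i = A[i] * (A[i] @ x - b[i])           # grad f_i at the current iterate
            g_i_ref = A[i] * (A[i] @ x_ref - b[i])   # grad f_i at the reference point
            x -= step * (g_i - g_i_ref + full_grad)
    return x
```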

- Tighter Theory for Local SGD on Identical and Heterogeneous Data (57 citations)
- Federated Learning of a Mixture of Global and Local Models (41 citations)
- Natural Compression for Distributed Deep Learning (34 citations)

- Statistics
- Algorithm
- Mathematical analysis

His main research concerns Applied mathematics, Stochastic gradient descent, Rate of convergence, Variance reduction and Convex function. His work is dedicated to discovering how Applied mathematics and Function are connected with Constant, Quadratic equation, Sublinear function and other disciplines. His work in Stochastic gradient descent tackles topics such as Stochastic optimization, which are related to areas like Hessian matrix, Numerical linear algebra, Jacobian matrix and determinant, and Importance sampling.

His research in Rate of convergence intersects with topics in Gradient descent, Iterated function, Mathematical optimization and Compression. His work in Gradient descent addresses issues such as Combinatorics, which are connected to fields such as Federated learning. His research combines Computation and Convex function.
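
Compression of communicated gradients in distributed and federated training is one of the themes above (cf. the Natural Compression paper listed below). The sketch uses a generic unbiased random-sparsification ("rand-k") compressor, a standard construction in the distributed-optimization literature rather than the specific operator from that paper; the worker count, k and step size are arbitrary illustrative values.

```python
import numpy as np

def rand_k_compress(g, k, rng):
    """Unbiased random sparsification: keep k coordinates chosen uniformly at random
    and rescale by d/k so that the compressed vector equals g in expectation."""
    d = g.size
    mask = np.zeros(d)
    mask[rng.choice(d, size=k, replace=False)] = 1.0
    return (d / k) * mask * g

def compressed_gradient_step(x, worker_grads, k, step, rng):
    """One step of distributed gradient descent in which each worker sends a
    compressed gradient and the server averages the compressed messages."""
    compressed = [rand_k_compress(g, k, rng) for g in worker_grads]
    return x - step * np.mean(compressed, axis=0)

# Usage with invented data: 4 workers, 10-dimensional model, keep 3 coordinates each.
rng = np.random.default_rng(0)
x = np.zeros(10)
worker_grads = [rng.standard_normal(10) for _ in range(4)]
x = compressed_gradient_step(x, worker_grads, k=3, step=0.1, rng=rng)
```

Unbiased compression lowers per-round communication at the cost of extra gradient variance, which in practice is typically offset by larger batches, error feedback or variance-reduction techniques.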

This overview was generated by a machine learning system which analysed the scientist’s body of work.

Federated Learning: Strategies for Improving Communication Efficiency

Jakub Konečný;H. Brendan McMahan;Felix X. Yu;Peter Richtarik.

arXiv: Learning **(2016)**

1702 Citations

Federated Optimization: Distributed Machine Learning for On-Device Intelligence

Jakub Konečný;H. Brendan McMahan;Daniel Ramage;Peter Richtarik.

arXiv: Learning **(2016)**

710 Citations

Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function

Peter Richtárik;Martin Takáč.

Mathematical Programming **(2014)**

665 Citations

Generalized Power Method for Sparse Principal Component Analysis

Michel Journée;Yurii Nesterov;Peter Richtárik;Rodolphe Sepulchre.

Journal of Machine Learning Research **(2010)**

553 Citations

Parallel coordinate descent methods for big data optimization

Peter Richtárik;Martin Takáč.

Mathematical Programming **(2016)**

372 Citations

Accelerated, Parallel, and Proximal Coordinate Descent

Olivier Fercoq;Peter Richtárik.

Siam Journal on Optimization **(2015)**

272 Citations

Mini-Batch Semi-Stochastic Gradient Descent in the Proximal Setting

Jakub Konecny;Jie Liu;Peter Richtarik;Martin Takac.

IEEE Journal of Selected Topics in Signal Processing **(2016)**

242 Citations

Randomized Iterative Methods for Linear Systems

Robert Mansel Gower;Peter Richtárik.

SIAM Journal on Matrix Analysis and Applications **(2015)**

178 Citations

Distributed coordinate descent method for learning with big data

Peter Richtárik;Martin Takáč.

Journal of Machine Learning Research **(2016)**

138 Citations

Semi-Stochastic Gradient Descent Methods

Jakub Konečný;Peter Richtárik.

arXiv: Machine Learning **(2013)**

135 Citations

University of California, Berkeley

King Abdullah University of Science and Technology

KU Leuven

Cornell University

French Institute for Research in Computer Science and Automation - INRIA

Centrum Wiskunde & Informatica

Singapore University of Technology and Design

Louisiana State University

Paris Dauphine University

University of Cambridge

French Institute for Research in Computer Science and Automation - INRIA

Publications: 20

Profile was last updated on December 6th, 2021.

Research.com Ranking is based on data retrieved from the Microsoft Academic Graph (MAG).

The ranking d-index is inferred from publications deemed to belong to the considered discipline.
