Peter Richtárik: D-Index & Metrics and Best Publications

D-Index & Metrics

The D-index (Discipline H-index) only includes papers and citation values for the examined discipline, in contrast to the general H-index, which accounts for publications across all disciplines.

  • Discipline: Mathematics
  • D-index: 38
  • Citations: 6,653
  • Publications: 131
  • World Ranking: 1122
  • National Ranking: 11

Overview

What is he best known for?

The fields of study he is best known for:

  • Algorithm
  • Statistics
  • Mathematical optimization

His primary scientific interests are in Algorithm, Convex function, Coordinate descent, Mathematical optimization and Sampling. His Algorithm study combines topics in areas such as Separable space and Stochastic optimization. His Convex function research incorporates elements of Combinatorics, Gradient descent, Lasso, Convex optimization and Function.

His Coordinate descent study integrates concerns from other disciplines, such as Linear system, Computational intelligence and Type. His Mathematical optimization research incorporates themes from Robustification, Local algorithm, Rate of convergence, Speedup and Random coordinate descent. His study in the fields of Importance sampling under the domain of Sampling overlaps with other disciplines such as Context.
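Coordinate descent and its randomized variants, named throughout this summary, update a single randomly chosen coordinate per iteration instead of the full vector. As an illustrative sketch only (the function name, toy matrix and step rule below are my own choices, not taken from his papers), here is randomized coordinate descent on a small strongly convex quadratic:

```python
import numpy as np

def randomized_coordinate_descent(A, b, n_iters=2000, rng=None):
    """Minimize f(x) = 0.5 * x^T A x - b^T x (A symmetric positive
    definite) by exactly minimizing over one random coordinate per step."""
    rng = np.random.default_rng(rng)
    n = len(b)
    x = np.zeros(n)
    for _ in range(n_iters):
        i = rng.integers(n)          # pick a coordinate uniformly at random
        g_i = A[i] @ x - b[i]        # partial derivative df/dx_i
        x[i] -= g_i / A[i, i]        # exact line search along coordinate i
    return x

# Toy example: the minimizer of f solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x = randomized_coordinate_descent(A, b, rng=0)
```

Each step touches only one row of A, which is what makes the method attractive for very large problems.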

His most cited works include:

  • Federated Learning: Strategies for Improving Communication Efficiency (1080 citations)
  • Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function (579 citations)
  • Federated Optimization: Distributed Machine Learning for On-Device Intelligence (466 citations)

What are the main themes of his work throughout his whole career to date?

His primary areas of investigation include Algorithm, Convex function, Applied mathematics, Mathematical optimization and Coordinate descent. He has included themes like Matrix, Simple and Compression in his Algorithm study. His Convex function study spans a wide range of topics, including Combinatorics, Minification, Convex optimization, Function and Differentiable function.

His Applied mathematics research is multidisciplinary, incorporating elements of Rate of convergence, Stochastic gradient descent, Hessian matrix and Importance sampling. His work deals with themes such as Convergence, Dual and Speedup, which intersect with Mathematical optimization. His studies deal with areas such as Discrete mathematics, Acceleration, Optimization problem, Random coordinate descent and Lipschitz continuity as well as Coordinate descent.
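Importance sampling recurs in these themes. A hypothetical minimal sketch (my own toy setup, not his algorithm): least-squares SGD that samples example i with probability proportional to its smoothness constant ||x_i||^2 and reweights the gradient so the estimate stays unbiased:

```python
import numpy as np

def sgd_importance_sampling(X, y, lr=0.1, n_iters=1000, rng=None):
    """SGD for 0.5 * mean((X w - y)^2), sampling row i with probability
    p_i proportional to ||x_i||^2 and reweighting by 1/(n * p_i) so the
    stochastic gradient remains an unbiased estimate of the full gradient."""
    rng = np.random.default_rng(rng)
    n, d = X.shape
    L = np.sum(X**2, axis=1)           # per-example smoothness constants
    p = L / L.sum()                    # importance-sampling distribution
    w = np.zeros(d)
    for _ in range(n_iters):
        i = rng.choice(n, p=p)
        g = (X[i] @ w - y[i]) * X[i]   # gradient of the i-th loss term
        w -= lr * g / (n * p[i])       # reweighted, unbiased step
    return w

# Noiseless toy data: the iterates recover the generating weights.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 2))
w_true = np.array([1.0, -1.0])
y = X @ w_true
w = sgd_importance_sampling(X, y, rng=1)
```

Sampling "harder" examples more often while downweighting their gradients keeps the estimator unbiased and can sharply reduce its variance compared with uniform sampling.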

He most often published in these fields:

  • Algorithm (37.69%)
  • Convex function (36.94%)
  • Applied mathematics (30.60%)

What were the highlights of his more recent work (between 2019 and 2021)?


In recent papers he was focusing on the following fields of study:

Peter Richtárik mainly focuses on Convex function, Applied mathematics, Algorithm, Rate of convergence and Variance reduction. His Convex function research is multidisciplinary, incorporating perspectives in Function, Differentiable function, Importance sampling and Minification. The various areas that he examines in his Applied mathematics study include Iterative method, Stochastic gradient descent, Newton's method and Coordinate descent.

His work in the fields of Computation overlaps with other areas such as Dither. His Rate of convergence study combines topics from a wide range of disciplines, such as Quadratic equation, Gradient descent, Ergodic theory, Mathematical optimization and Convex optimization. His Mathematical optimization research includes themes of Sampling, Convergence and Fixed point.
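Variance reduction, one of the focus areas above, can be illustrated with an SVRG-style gradient estimator. The sketch below is my own minimal least-squares version on assumed toy data, not code from his papers: each inner step uses g_i(w) - g_i(w_ref) + full_grad(w_ref), which is unbiased and has variance that vanishes as w and w_ref approach the optimum:

```python
import numpy as np

def svrg(X, y, n_epochs=100, rng=None):
    """SVRG-style variance-reduced SGD for 0.5 * mean((X w - y)^2).
    Each epoch fixes an anchor point w_ref, computes its full gradient
    once, then runs n corrected stochastic steps."""
    rng = np.random.default_rng(rng)
    n, d = X.shape
    lr = 1.0 / (3.0 * np.max(np.sum(X**2, axis=1)))  # conservative step size
    grad_i = lambda w, i: (X[i] @ w - y[i]) * X[i]
    w = np.zeros(d)
    for _ in range(n_epochs):
        w_ref = w.copy()
        full = X.T @ (X @ w_ref - y) / n    # full gradient at the anchor
        for _ in range(n):
            i = rng.integers(n)
            g = grad_i(w, i) - grad_i(w_ref, i) + full  # unbiased, low variance
            w -= lr * g
    return w

# Noiseless toy data: the method converges to the generating weights.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 2))
w_true = np.array([2.0, -1.0])
y = X @ w_true
w = svrg(X, y, rng=1)
```

Unlike plain SGD with a constant step size, the corrected estimator lets the method converge linearly on smooth strongly convex problems rather than stalling in a noise ball.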

Between 2019 and 2021, his most popular works were:

  • Tighter Theory for Local SGD on Identical and Heterogeneous Data (57 citations)
  • Federated Learning of a Mixture of Global and Local Models (41 citations)
  • Natural Compression for Distributed Deep Learning (34 citations)

In his most recent research, the most cited papers focused on:

  • Statistics
  • Algorithm
  • Mathematical analysis

His main research concerns Applied mathematics, Stochastic gradient descent, Rate of convergence, Variance reduction and Convex function. His work explores how Applied mathematics and Function connect with Constant, Quadratic equation, Sublinear function and other disciplines. His work in Stochastic gradient descent tackles topics such as Stochastic optimization which are related to areas like Hessian matrix, Numerical linear algebra, Jacobian matrix and determinant and Importance sampling.

His research in Rate of convergence intersects with topics in Gradient descent, Iterated function, Mathematical optimization and Compression. His work in Gradient descent addresses issues such as Combinatorics, which are connected to fields such as Federated learning. His research combines Computation and Convex function.
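Local SGD, the subject of the 2019 paper listed above and a core pattern in federated learning, alternates cheap local steps with occasional model averaging. A minimal sketch under my own assumptions (least-squares losses, two synthetic data shards, names of my choosing):

```python
import numpy as np

def local_sgd(shards, n_rounds=100, local_steps=10, lr=0.05, rng=None):
    """Local SGD: each worker runs `local_steps` SGD steps on its own
    least-squares shard (X_m, y_m), then the server averages the models.
    Communication happens once per round rather than once per step."""
    rng = np.random.default_rng(rng)
    d = shards[0][0].shape[1]
    w = np.zeros(d)
    for _ in range(n_rounds):
        local_models = []
        for X, y in shards:
            w_m = w.copy()
            for _ in range(local_steps):
                i = rng.integers(len(y))
                w_m -= lr * (X[i] @ w_m - y[i]) * X[i]
            local_models.append(w_m)
        w = np.mean(local_models, axis=0)  # server-side averaging
    return w

# Two noiseless shards drawn from one model: averaging recovers it.
rng = np.random.default_rng(0)
X = rng.standard_normal((60, 2))
w_true = np.array([1.0, 2.0])
y = X @ w_true
shards = [(X[:30], y[:30]), (X[30:], y[30:])]
w = local_sgd(shards, rng=1)
```

Reducing communication from every step to every round is the central trade-off this line of work analyzes, especially when the shards are heterogeneous.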

This overview was generated by a machine learning system which analysed the scientist’s body of work. If you have any feedback, you can contact us here.

Best Publications

Federated Learning: Strategies for Improving Communication Efficiency

Jakub Konečný;H. Brendan McMahan;Felix X. Yu;Peter Richtarik.
arXiv: Learning (2016)

1702 Citations

Federated Optimization: Distributed Machine Learning for On-Device Intelligence

Jakub Konečný;H. Brendan McMahan;Daniel Ramage;Peter Richtarik.
arXiv: Learning (2016)

710 Citations

Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function

Peter Richtárik;Martin Takáč.
Mathematical Programming (2014)

665 Citations

Generalized Power Method for Sparse Principal Component Analysis

Michel Journée;Yurii Nesterov;Peter Richtárik;Rodolphe Sepulchre.
Journal of Machine Learning Research (2010)

553 Citations

Parallel coordinate descent methods for big data optimization

Peter Richtárik;Martin Takáč.
Mathematical Programming (2016)

372 Citations

Accelerated, Parallel, and Proximal Coordinate Descent

Olivier Fercoq;Peter Richtárik.
Siam Journal on Optimization (2015)

272 Citations

Mini-Batch Semi-Stochastic Gradient Descent in the Proximal Setting

Jakub Konecny;Jie Liu;Peter Richtarik;Martin Takac.
IEEE Journal of Selected Topics in Signal Processing (2016)

242 Citations

Randomized Iterative Methods for Linear Systems

Robert Mansel Gower;Peter Richtárik.
SIAM Journal on Matrix Analysis and Applications (2015)

178 Citations

Distributed coordinate descent method for learning with big data

Peter Richtárik;Martin Takáč.
Journal of Machine Learning Research (2016)

138 Citations

Semi-Stochastic Gradient Descent Methods

Jakub Konečný;Peter Richtárik.
arXiv: Machine Learning (2013)

135 Citations

Best Scientists Citing Peter Richtárik

H. Vincent Poor

Princeton University

Publications: 42

Mehdi Bennis

University of Oulu

Publications: 38

Mingyi Hong

University of Minnesota

Publications: 30

Tong Zhang

Hong Kong University of Science and Technology

Publications: 29

Wotao Yin

Alibaba Group (China)

Publications: 28

Dusit Niyato

Nanyang Technological University

Publications: 28

Han Liu

Princeton University

Publications: 26

Jean-Christophe Pesquet

CentraleSupélec

Publications: 22

Ji Liu

University of Rochester

Publications: 21

Alejandro Ribeiro

University of Pennsylvania

Publications: 21

Mark Schmidt

University of British Columbia

Publications: 21

Michael I. Jordan

University of California, Berkeley

Publications: 20

Francis Bach

French Institute for Research in Computer Science and Automation - INRIA

Publications: 20

Nathan Srebro

Toyota Technological Institute at Chicago

Publications: 19

Michael W. Mahoney

University of California, Berkeley

Publications: 19

Suvrit Sra

MIT

Publications: 19

Profile was last updated on December 6th, 2021.
Research.com Ranking is based on data retrieved from the Microsoft Academic Graph (MAG).
The ranking d-index is inferred from publications deemed to belong to the considered discipline.

If you think any of the details on this page are incorrect, let us know.
