Peter Richtárik: D-Index & Metrics and Best Publications
Research.com 2022 Rising Star of Science Award
Mathematics · Saudi Arabia · 2023

D-Index & Metrics. The D-index (Discipline H-index) includes only the papers and citation values within the examined discipline, in contrast to the general H-index, which accounts for publications across all disciplines.

Discipline          D-index   Citations   Publications   World Ranking   National Ranking
Rising Stars        52        12,143      199            214             8
Mathematics         46        8,916       234            973             10
Computer Science    50        11,576      236            3653            10
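
The D-index above is simply an h-index computed over one discipline's papers. A minimal sketch of the underlying computation, assuming citation counts are available as a plain list (the data layout here is hypothetical, not Research.com's actual model):

```python
def h_index(citations):
    """Largest h such that at least h papers have >= h citations each."""
    cites = sorted(citations, reverse=True)
    h = 0
    for i, c in enumerate(cites, start=1):
        if c >= i:
            h = i
        else:
            break
    return h

def d_index(papers, discipline):
    """Discipline H-index: the h-index restricted to one discipline's papers.
    `papers` is a list of (discipline_set, citation_count) pairs -- an
    illustrative representation, not Research.com's actual data model."""
    return h_index(c for d, c in papers if discipline in d)
```

For example, `h_index([10, 8, 5, 4, 3])` returns 4: four papers have at least 4 citations each, but there are not five papers with at least 5.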

Research.com Recognitions

Awards & Achievements

2023 - Research.com Computer Science in Saudi Arabia Leader Award

2023 - Research.com Mathematics in Saudi Arabia Leader Award

2022 - Research.com Rising Star of Science Award

Overview

What is he best known for?

The fields of study he is best known for:

  • Algorithm
  • Statistics
  • Mathematical optimization

His primary scientific interests are in Algorithm, Convex function, Coordinate descent, Mathematical optimization and Sampling. His Algorithm study combines topics in areas such as Separable space and Stochastic optimization. His Convex function research incorporates elements of Combinatorics, Gradient descent, Lasso, Convex optimization and Function.

His Coordinate descent study integrates concerns from other disciplines, such as Linear system, Computational intelligence and Type. His Mathematical optimization research incorporates themes from Robustification, Local algorithm, Rate of convergence, Speedup and Random coordinate descent. His study of Importance sampling, within the domain of Sampling, overlaps with other disciplines such as Context.
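
Coordinate descent, which recurs throughout these topics, minimizes a function by updating one randomly chosen coordinate at a time. A generic textbook-style sketch on a quadratic objective (not the specific block-coordinate method from any paper listed here; the objective, step sizes and iteration count are illustrative assumptions):

```python
import random

def randomized_cd(A, b, iters=5000, seed=0):
    """Minimize f(x) = 0.5 x^T A x - b^T x (A symmetric positive definite)
    by updating one randomly chosen coordinate per iteration.
    Step size 1/A[i][i] is the coordinate-wise Lipschitz choice."""
    rng = random.Random(seed)
    n = len(b)
    x = [0.0] * n
    for _ in range(iters):
        i = rng.randrange(n)
        # Partial derivative of f along coordinate i: (A x)_i - b_i
        grad_i = sum(A[i][j] * x[j] for j in range(n)) - b[i]
        x[i] -= grad_i / A[i][i]
    return x
```

On a symmetric positive-definite system this converges to the solution of Ax = b, since minimizing f is equivalent to solving that linear system.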

His most cited works include:

  • Federated Learning: Strategies for Improving Communication Efficiency (1080 citations)
  • Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function (579 citations)
  • Federated Optimization: Distributed Machine Learning for On-Device Intelligence (466 citations)

What are the main themes of his work throughout his whole career to date?

His primary areas of investigation include Algorithm, Convex function, Applied mathematics, Mathematical optimization and Coordinate descent. He has included themes like Matrix, Simple and Compression in his Algorithm study. His Convex function study spans a wide range of topics, including Combinatorics, Minification, Convex optimization, Function and Differentiable function.

His Applied mathematics research is multidisciplinary, incorporating elements of Rate of convergence, Stochastic gradient descent, Hessian matrix and Importance sampling. His work deals with themes such as Convergence, Dual and Speedup, which intersect with Mathematical optimization. His studies deal with areas such as Discrete mathematics, Acceleration, Optimization problem, Random coordinate descent and Lipschitz continuity as well as Coordinate descent.

He most often published in these fields:

  • Algorithm (37.69%)
  • Convex function (36.94%)
  • Applied mathematics (30.60%)

What were the highlights of his more recent work (2019-2021)?

  • Convex function (36.94%)
  • Applied mathematics (30.60%)
  • Algorithm (37.69%)

In recent papers he focused on the following fields of study:

Peter Richtárik mainly focuses on Convex function, Applied mathematics, Algorithm, Rate of convergence and Variance reduction. His Convex function research is multidisciplinary, incorporating perspectives in Function, Differentiable function, Importance sampling and Minification. The various areas that he examines in his Applied mathematics study include Iterative method, Stochastic gradient descent, Newton's method and Coordinate descent.

His work in the fields of Computation overlaps with other areas such as Dither. His Rate of convergence study combines topics from a wide range of disciplines, such as Quadratic equation, Gradient descent, Ergodic theory, Mathematical optimization and Convex optimization. His Mathematical optimization research includes themes of Sampling, Convergence and Fixed point.
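
Variance reduction, one of the recurring themes above, modifies the stochastic gradient so that its variance shrinks as the iterates approach the optimum. A minimal SVRG-style sketch on a toy one-dimensional least-squares problem (the problem, step size and epoch counts are illustrative assumptions, not taken from the papers discussed here):

```python
import random

def svrg_1d(a, b, step=0.05, epochs=50, inner=50, seed=0):
    """SVRG-style method on f(w) = (1/n) * sum_i 0.5 * (a_i*w - b_i)^2, scalar w.
    Each inner step uses grad_i(w) - grad_i(snapshot) + full_grad(snapshot):
    an unbiased gradient estimate whose variance vanishes at the optimum."""
    rng = random.Random(seed)
    n = len(a)
    grad_i = lambda w, i: a[i] * (a[i] * w - b[i])
    w = 0.0
    for _ in range(epochs):
        snap = w
        # Full gradient at the snapshot, recomputed once per epoch
        full = sum(grad_i(snap, i) for i in range(n)) / n
        for _ in range(inner):
            i = rng.randrange(n)
            w -= step * (grad_i(w, i) - grad_i(snap, i) + full)
    return w
```

Each epoch recomputes the full gradient at a snapshot; the inner steps then correct single-sample gradients with that snapshot, which shrinks the noise floor that plain SGD with a constant step size suffers from.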

Between 2019 and 2021, his most popular works were:

  • Tighter Theory for Local SGD on Identical and Heterogeneous Data (57 citations)
  • Federated Learning of a Mixture of Global and Local Models (41 citations)
  • Natural Compression for Distributed Deep Learning (34 citations)
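
Local SGD, analysed in the first paper above, lets each worker run several SGD steps on its own data before the models are averaged. A minimal sketch on a toy one-dimensional least-squares setup with per-worker data (all hyperparameters and the data layout are illustrative assumptions, not the setup of any paper listed here):

```python
import random

def local_sgd(worker_data, rounds=100, local_steps=5, step=0.1, seed=0):
    """Local SGD sketch: each worker runs `local_steps` SGD steps on its own
    (a, b) samples for the loss 0.5*(a*w - b)^2, then the server averages
    the worker models. Toy 1-D setup with illustrative hyperparameters."""
    rng = random.Random(seed)
    w = 0.0  # shared model
    for _ in range(rounds):
        local_models = []
        for data in worker_data:
            wk = w  # each worker starts from the current shared model
            for _ in range(local_steps):
                a, b = data[rng.randrange(len(data))]
                wk -= step * a * (a * wk - b)  # SGD step on one local sample
            local_models.append(wk)
        w = sum(local_models) / len(local_models)  # server averages models
    return w
```

With two workers whose local optima sit at 0 and 2, averaging pulls the shared model toward the global optimum 1; on heterogeneous data, longer local phases let workers drift further toward their own optima between averaging rounds.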

In his most recent research, the most cited papers focused on:

  • Statistics
  • Algorithm
  • Mathematical analysis

His main research concerns Applied mathematics, Stochastic gradient descent, Rate of convergence, Variance reduction and Convex function. His work explores how Applied mathematics and Function connect with Constant, Quadratic equation, Sublinear function and other disciplines. His work in Stochastic gradient descent tackles topics such as Stochastic optimization, which relate to areas like Hessian matrix, Numerical linear algebra, Jacobian matrix and determinant, and Importance sampling.

His research in Rate of convergence intersects with topics in Gradient descent, Iterated function, Mathematical optimization and Compression. His work in Gradient descent addresses issues such as Combinatorics, which are connected to fields such as Federated learning. His research combines Computation and Convex function.

This overview was generated by a machine learning system which analysed the scientist’s body of work.

Best Publications

Federated Learning: Strategies for Improving Communication Efficiency

Jakub Konečný;H. Brendan McMahan;Felix X. Yu;Peter Richtarik.
arXiv: Learning (2016)

2335 Citations

Federated Optimization: Distributed Machine Learning for On-Device Intelligence

Jakub Konečný;H. Brendan McMahan;Daniel Ramage;Peter Richtarik.
arXiv: Learning (2016)

1035 Citations

Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function

Peter Richtárik;Martin Takáč.
Mathematical Programming (2014)

721 Citations

Generalized Power Method for Sparse Principal Component Analysis

Michel Journée;Yurii Nesterov;Peter Richtárik;Rodolphe Sepulchre.
Journal of Machine Learning Research (2010)

598 Citations

Parallel coordinate descent methods for big data optimization

Peter Richtárik;Martin Takáč.
Mathematical Programming (2016)

402 Citations

Accelerated, Parallel, and Proximal Coordinate Descent

Olivier Fercoq;Peter Richtárik.
Siam Journal on Optimization (2015)

303 Citations

Mini-Batch Semi-Stochastic Gradient Descent in the Proximal Setting

Jakub Konecny;Jie Liu;Peter Richtarik;Martin Takac.
IEEE Journal of Selected Topics in Signal Processing (2016)

274 Citations

Randomized Iterative Methods for Linear Systems

Robert Mansel Gower;Peter Richtárik.
SIAM Journal on Matrix Analysis and Applications (2015)

222 Citations

Mini-Batch Primal and Dual Methods for SVMs

Martin Takac;Avleen Bijral;Peter Richtarik;Nati Srebro.
international conference on machine learning (2013)

179 Citations

Adding vs. Averaging in Distributed Primal-Dual Optimization

Chenxin Ma;Virginia Smith;Martin Jaggi;Michael Jordan.
international conference on machine learning (2015)

156 Citations


Best Scientists Citing Peter Richtárik

H. Vincent Poor

Princeton University

Publications: 42

Mehdi Bennis

University of Oulu

Publications: 38

Mingyi Hong

University of Minnesota

Publications: 30

Tong Zhang

Hong Kong University of Science and Technology

Publications: 29

Wotao Yin

Alibaba Group (China)

Publications: 28

Dusit Niyato

Nanyang Technological University

Publications: 28

Han Liu

Northwestern University

Publications: 26

Jean-Christophe Pesquet

CentraleSupélec

Publications: 22

Ji Liu

University of Rochester

Publications: 21

Alejandro Ribeiro

University of Pennsylvania

Publications: 21

Mark Schmidt

University of British Columbia

Publications: 21

Michael I. Jordan

University of California, Berkeley

Publications: 20

Francis Bach

École Normale Supérieure

Publications: 20

Nathan Srebro

Toyota Technological Institute at Chicago

Publications: 19

Michael W. Mahoney

University of California, Berkeley

Publications: 19

Suvrit Sra

MIT

Publications: 19

Trending Scientists

Ouri Wolfson

University of Illinois at Chicago

Minsu Cho

Pohang University of Science and Technology

Chenglong Tang

Xi'an Jiaotong University

Ronald G. Larson

University of Michigan–Ann Arbor

Jiandong Ding

Fudan University

Meine van Noordwijk

World Agroforestry Centre

Gerd E. Vegarud

Norwegian University of Life Sciences

Joel E. Cohen

Rockefeller University

Paul Cos

University of Antwerp

Elisabeth Alve

University of Oslo

Robert Arnone

University of Southern Mississippi

Kenneth D. Carr

New York University

Dae-Shik Kim

Korea Advanced Institute of Science and Technology

Martin Sarter

University of Michigan–Ann Arbor

Anthony H. Barnett

University of Birmingham

Constantine Frangakis

Johns Hopkins University
