D-Index & Metrics

The D-index (Discipline H-index) includes only papers and citation values from the examined discipline, in contrast to the General H-index, which accounts for publications across all disciplines.

Discipline: Computer Science
D-index: 54
Citations: 9,278
Publications: 139
World Ranking: 3068
National Ranking: 54
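
For readers unfamiliar with the metric, the h-index is the largest number h such that at least h papers have at least h citations each; the D-index applies the same rule but counts only papers from the examined discipline. The sketch below illustrates that computation in Python, using made-up data and hypothetical helper names rather than the site's actual methodology.

```python
# Minimal, illustrative sketch of a discipline-restricted h-index ("D-index"),
# as described above. Data and function names are hypothetical.

def h_index(citation_counts):
    """Largest h such that at least h papers have at least h citations each."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

def d_index(papers, discipline):
    """h-index computed only over papers tagged with the given discipline."""
    return h_index(c for d, c in papers if d == discipline)

# Hypothetical usage with made-up (discipline, citations) pairs:
papers = [("Computer Science", 430), ("Computer Science", 379), ("Mathematics", 12)]
print(d_index(papers, "Computer Science"))  # -> 2
```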

Overview

What is he best known for?

The fields of study he is best known for:

  • Machine learning
  • Statistics
  • Artificial intelligence

Mathematical optimization, Artificial intelligence, Convex optimization, Algorithm and Stochastic optimization are his primary areas of study. His research in Mathematical optimization intersects with topics in Stochastic gradient descent, Learnability and Regret. His work in Artificial intelligence integrates Machine learning, Theoretical computer science and Euclidean space.

His Convex optimization research also incorporates Function and Greedy algorithm. His Algorithm research is multidisciplinary, incorporating perspectives from Artificial neural network, Norm and Time series. In Stochastic optimization, Ohad Shamir combines subjects such as Distributed algorithm, Type and Applied mathematics.

His most cited works include:

  • Optimal distributed online prediction using mini-batches (430 citations)
  • Making Gradient Descent Optimal for Strongly Convex Stochastic Optimization (379 citations)
  • Stochastic Gradient Descent for Non-smooth Optimization: Convergence Results and Optimal Averaging Schemes (341 citations)

What are the main themes of his work throughout his whole career to date?

His primary areas of study are Mathematical optimization, Artificial intelligence, Algorithm, Machine learning and Applied mathematics. His Mathematical optimization research integrates issues from Sampling, Stochastic gradient descent, Regret and Convex optimization. His Artificial intelligence study includes themes such as Theoretical computer science and Pattern recognition.

Within his Algorithm research, Function is closely connected to Artificial neural network, Dimension, Lipschitz continuity and Kernel. In the field of Machine learning, his study of Semi-supervised learning overlaps with subjects such as Collaborative filtering. His Applied mathematics study incorporates themes from Regularization, Gradient descent, Optimization problem, Upper and lower bounds and Stationary point.

He most often published in these fields:

  • Mathematical optimization (26.11%)
  • Artificial intelligence (25.12%)
  • Algorithm (18.23%)

What were the highlights of his more recent work (2017-2021)?

  • Artificial neural network (12.32%)
  • Applied mathematics (14.29%)
  • Function (13.79%)

In recent papers he was focusing on the following fields of study:

Ohad Shamir mainly focuses on Artificial neural network, Applied mathematics, Function, Upper and lower bounds and Stochastic gradient descent. His Artificial neural network research is multidisciplinary, relying on Exponential growth, Algorithm, Bounded function and Theoretical computer science. His Applied mathematics research includes elements of Regularization, Non-convex optimization, Gradient descent, Norm and Stationary point.

His study spans a wide range of topics, including Manifold, Mathematical optimization, Grassmannian and Convex optimization. Minimax is the focus of his Mathematical optimization research. Ohad Shamir interconnects Optimization problem and Combinatorics in the investigation of issues within Stochastic gradient descent.

Between 2017 and 2021, his most popular works were:

  • Size-Independent Sample Complexity of Neural Networks. (176 citations)
  • Spurious Local Minima are Common in Two-Layer ReLU Neural Networks. (92 citations)
  • On the Power and Limitations of Random Features for Understanding Neural Networks (53 citations)

In his most recent research, the most cited papers focused on:

  • Statistics
  • Machine learning
  • Artificial intelligence

Ohad Shamir mainly investigates Artificial neural network, Applied mathematics, Upper and lower bounds, Algorithm and Stochastic gradient descent. His Artificial neural network study looks at the intersection with topics like Exponential growth, Gradient descent and Polynomial. His Upper and lower bounds research incorporates themes from Logarithm, Stochastic optimization, Minimax and Convex optimization.

His Convex optimization study combines topics from a wide range of disciplines, such as Regularization, Convex function, Numerical analysis and Mathematical optimization. Within his Algorithm research, his work on Residual neural network, Residual and Linear prediction often relates to Network architecture, connecting several areas of interest. His Stochastic gradient descent work deals with themes such as Optimization problem, Stationary point and Maxima and minima.

This overview was generated by a machine learning system which analysed the scientist’s body of work.

Best Publications

Optimal distributed online prediction using mini-batches

Ofer Dekel; Ran Gilad-Bachrach; Ohad Shamir; Lin Xiao.
Journal of Machine Learning Research (2012)

560 Citations

Making Gradient Descent Optimal for Strongly Convex Stochastic Optimization

Alexander Rakhlin; Ohad Shamir; Karthik Sridharan.
International Conference on Machine Learning (2012)

552 Citations

The Power of Depth for Feedforward Neural Networks

Ronen Eldan; Ohad Shamir.
Conference on Learning Theory (2016)

543 Citations

Stochastic Gradient Descent for Non-smooth Optimization: Convergence Results and Optimal Averaging Schemes

Ohad Shamir; Tong Zhang.
International Conference on Machine Learning (2013)

478 Citations

Communication-Efficient Distributed Optimization using an Approximate Newton-type Method

Ohad Shamir; Nati Srebro; Tong Zhang.
International Conference on Machine Learning (2014)

390 Citations

Learnability, Stability and Uniform Convergence

Shai Shalev-Shwartz; Ohad Shamir; Nathan Srebro; Karthik Sridharan.
Journal of Machine Learning Research (2010)

359 Citations

Stochastic Convex Optimization.

Shai Shalev-Shwartz; Ohad Shamir; Nathan Srebro; Karthik Sridharan.
Conference on Learning Theory (2009)

283 Citations

On the Computational Efficiency of Training Neural Networks

Roi Livni; Shai Shalev-Shwartz; Ohad Shamir.
Neural Information Processing Systems (2014)

234 Citations

Adaptively Learning the Crowd Kernel

Omer Tamuz; Ce Liu; Serge Belongie; Ohad Shamir.
arXiv: Learning (2011)

234 Citations

Learning and generalization with the information bottleneck

Ohad Shamir; Sivan Sabato; Naftali Tishby.
Theoretical Computer Science (2010)

200 Citations

Best Scientists Citing Ohad Shamir

  • Nathan Srebro, Toyota Technological Institute at Chicago, Publications: 44
  • Peter Richtárik, King Abdullah University of Science and Technology, Publications: 44
  • Elad Hazan, Princeton University, Publications: 43
  • Michael I. Jordan, University of California, Berkeley, Publications: 41
  • Rong Jin, Alibaba Group (China), Publications: 38
  • Tianbao Yang, Texas A&M University, Publications: 37
  • Karthik Sridharan, Cornell University, Publications: 35
  • Alexander Rakhlin, MIT, Publications: 34
  • Tong Zhang, Hong Kong University of Science and Technology, Publications: 34
  • Peter L. Bartlett, Google (United States), Publications: 33
  • Jason D. Lee, Princeton University, Publications: 33
  • Francis Bach, École Normale Supérieure, Publications: 32
  • Shai Shalev-Shwartz, Hebrew University of Jerusalem, Publications: 32
  • Lorenzo Rosasco, University of Genoa, Publications: 29
  • Huan Xu, Alibaba Group (China), Publications: 28
  • Martin J. Wainwright, University of California, Berkeley, Publications: 26

Trending Scientists

  • Elaine Shi, Carnegie Mellon University
  • Massimo Bergamasco, Sant'Anna School of Advanced Studies
  • Byungkyu Kim, Korea Aerospace University
  • Francesco Paolucci, University of Bologna
  • Di Chen, University of Science and Technology Beijing
  • Tim M. Townes, University of Alabama at Birmingham
  • P. Dick, Paul Dick and Associates
  • Véronique Chevalier, Institut Pasteur
  • Raymond M. Schiffelers, Utrecht University
  • Marc B. Parlange, Monash University
  • Nigel W. Arnell, University of Reading
  • Nicolas Toni, University of Lausanne
  • Karen M. Rodrigue, The University of Texas at Dallas
  • Charles Warlow, University of Edinburgh
  • Nicholas J. Wheeler, University of Birmingham
  • Zhi-Yun Li, University of Virginia
