
H-Index & Metrics

Discipline: Engineering and Technology
H-index: 55
Citations: 18,988
Publications: 125
World Ranking: 992
National Ranking: 434

Overview

What is he best known for?

The fields of study he is best known for:

  • Statistics
  • Machine learning
  • Artificial intelligence

His primary scientific interests are in Mathematical optimization, Regularization, Gradient descent, Artificial intelligence and Applied mathematics. His work on Optimization problem, as part of his broader Mathematical optimization research, is frequently linked to Uniform convergence, bridging the gap between disciplines. His work in Regularization addresses subjects such as Support vector machine, which are connected to disciplines such as Limit.

His Gradient descent studies deal with areas such as Stochastic gradient descent and Linear separability. His work investigates the relationship between Artificial intelligence and topics such as Machine learning that intersect with problems in Pattern recognition. His Applied mathematics research includes themes of Matrix decomposition, Quadratic equation, Stochastic optimization and Linear prediction.
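His most-cited work, Pegasos (listed below), ties several of these themes together: it is a primal stochastic sub-gradient solver for the regularized Support vector machine objective. The following is a minimal, hedged sketch of a Pegasos-style update on the L2-regularized hinge loss, not a faithful reproduction of the published solver; the synthetic data, iteration count and parameter names are illustrative, and details such as mini-batching, the optional projection step and the bias term are omitted.

```python
import numpy as np

def pegasos_sketch(X, y, lam=0.1, n_iters=2000, seed=0):
    """Minimal sketch of a Pegasos-style stochastic sub-gradient SVM solver.

    Objective: (lam/2) * ||w||^2 + (1/n) * sum_i max(0, 1 - y_i * <w, x_i>),
    with labels y_i in {-1, +1}.  Mini-batching, the optional projection
    step and the bias term of the published algorithm are omitted.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for t in range(1, n_iters + 1):
        i = rng.integers(n)              # pick one training example at random
        eta = 1.0 / (lam * t)            # 1/(lambda * t) step-size schedule
        margin = y[i] * (w @ X[i])       # margin before this step's update
        w *= (1.0 - eta * lam)           # shrinkage from the L2 regularizer
        if margin < 1.0:                 # hinge-loss sub-gradient is active
            w += eta * y[i] * X[i]
    return w

# Illustrative usage on synthetic, linearly separable 2-D data.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(+2.0, 1.0, (50, 2)), rng.normal(-2.0, 1.0, (50, 2))])
y = np.concatenate([np.ones(50), -np.ones(50)])
w = pegasos_sketch(X, y)
print("weights:", w, "train accuracy:", np.mean(np.sign(X @ w) == y))
```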

His most cited works include:

  • Pegasos: primal estimated sub-gradient solver for SVM (1247 citations)
  • Pegasos: Primal Estimated sub-GrAdient SOlver for SVM (937 citations)
  • Maximum-Margin Matrix Factorization (918 citations)

What are the main themes of his work throughout his whole career to date?

The scientist’s investigation covers issues in Mathematical optimization, Algorithm, Applied mathematics, Artificial intelligence and Gradient descent. He has included themes like Artificial neural network, Stochastic gradient descent, Upper and lower bounds and Regular polygon in his Mathematical optimization study. Within Algorithm, Nathan Srebro mostly focuses on Simple and, on occasion, Distribution.

His work carried out in the field of Applied mathematics brings together such families of science as Factorization, Regularization, Norm and Matrix. His Artificial intelligence research includes elements of Machine learning and Pattern recognition. His study focuses on the intersection of Support vector machine and Kernel, with connections to Discrete mathematics.
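The Factorization, Regularization, Norm and Matrix themes come together in his matrix-factorization work (e.g. Maximum-Margin Matrix Factorization and the rank, trace-norm and max-norm papers listed below). As a hedged illustration only, the sketch below fits a matrix through a factorization with a Frobenius-norm penalty on the factors, using the standard identity that (||U||_F² + ||V||_F²)/2 upper-bounds the trace (nuclear) norm of U Vᵀ; the data, rank and hyperparameters are hypothetical, and this is not the algorithm from those papers.

```python
import numpy as np

def factored_fit(M, rank=10, lam=0.5, lr=0.01, n_iters=2000, seed=0):
    """Sketch: fit M with U @ V.T under a Frobenius penalty on the factors.

    Minimizes 0.5 * ||M - U @ V.T||_F^2 + (lam/2) * (||U||_F^2 + ||V||_F^2)
    by plain gradient descent on U and V.  The quantity
    (||U||_F^2 + ||V||_F^2) / 2 always upper-bounds the trace (nuclear)
    norm of U @ V.T, which is how the factored penalty relates to
    trace-norm regularization.
    """
    rng = np.random.default_rng(seed)
    n, m = M.shape
    U = 0.1 * rng.standard_normal((n, rank))
    V = 0.1 * rng.standard_normal((m, rank))
    for _ in range(n_iters):
        R = U @ V.T - M                  # residual of the current fit
        grad_U = R @ V + lam * U
        grad_V = R.T @ U + lam * V
        U -= lr * grad_U
        V -= lr * grad_V
    return U, V

# Illustrative usage: a noisy observation of a rank-3 matrix.
rng = np.random.default_rng(1)
M = rng.standard_normal((30, 3)) @ rng.standard_normal((3, 20))
M += 0.1 * rng.standard_normal(M.shape)
U, V = factored_fit(M)
X = U @ V.T
penalty = 0.5 * (np.sum(U ** 2) + np.sum(V ** 2))
nuclear = np.linalg.svd(X, compute_uv=False).sum()
print("factored penalty:", round(penalty, 3), ">= nuclear norm:", round(nuclear, 3))
```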

He most often published in these fields:

  • Mathematical optimization (28.76%)
  • Algorithm (19.31%)
  • Applied mathematics (19.31%)

What were the highlights of his more recent work (between 2017 and 2021)?

In recent papers he was focusing on the following fields of study:

  • Applied mathematics (19.31%)
  • Gradient descent (12.88%)
  • Upper and lower bounds (9.87%)

His primary areas of investigation include Applied mathematics, Gradient descent, Upper and lower bounds, Algorithm and Mathematical optimization. His Applied mathematics study incorporates themes from Regularization, Iterated function and Stochastic gradient descent. His Gradient descent study integrates concerns from other disciplines, such as Contrast, Monotone polygon, Margin, Separable space and Norm.
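The Gradient descent themes of Margin, Separable space and Norm echo his 2017-2021 work on "The implicit bias of gradient descent on separable data", listed below. As a toy, hedged illustration (synthetic data and an arbitrary step size, not the paper's experiments), the sketch below shows that on linearly separable data, plain gradient descent on the unregularized logistic loss keeps increasing the norm of the weight vector while its direction stabilizes; that line of work characterizes the limiting direction as the maximum-margin separator.

```python
import numpy as np

# Toy illustration of the implicit-bias theme on linearly separable data:
# gradient descent on the unregularized logistic loss keeps growing ||w||,
# while the direction w / ||w|| stabilizes.  Data and step size are
# hypothetical; see "The implicit bias of gradient descent on separable
# data" (below) for the precise statement.

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(+3.0, 1.0, (40, 2)), rng.normal(-3.0, 1.0, (40, 2))])
y = np.concatenate([np.ones(40), -np.ones(40)])

w = np.zeros(2)
lr = 0.1
for t in range(1, 50001):
    margins = np.clip(y * (X @ w), -30.0, 30.0)   # clip to avoid overflow in exp
    # Gradient of the mean logistic loss (1/n) * sum_i log(1 + exp(-margin_i)).
    grad = -(X.T @ (y / (1.0 + np.exp(margins)))) / len(y)
    w -= lr * grad
    if t in (10, 100, 1000, 10000, 50000):
        norm = np.linalg.norm(w)
        print(f"step {t:6d}   ||w|| = {norm:8.3f}   direction = {w / norm}")
```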

His Upper and lower bounds research includes themes of Discrete mathematics, Class, Stochastic optimization, Distributed learning and Stationary point. His Algorithm research spans a wide range of topics, including Structure, Kernel method, Kernel and Simple. His Mathematical optimization study combines topics in areas such as Quadratic equation and Convex optimization.

Between 2017 and 2021, his most popular works were:

  • The implicit bias of gradient descent on separable data (264 citations)
  • Towards Understanding the Role of Over-Parametrization in Generalization of Neural Networks (164 citations)
  • Implicit Bias of Gradient Descent on Linear Convolutional Networks (140 citations)

In his most recent research, the most cited papers focused on:

  • Statistics
  • Machine learning
  • Artificial intelligence

His primary areas of study are Gradient descent, Applied mathematics, Norm, Upper and lower bounds and Discrete mathematics. His Gradient descent research is multidisciplinary, incorporating perspectives in Factorization, Underdetermined system, Margin and Non homogeneous. His Applied mathematics study incorporates themes from Linear prediction, Separable space, Monotone polygon and Regularization.

His research on Regularization falls under the broader field of Artificial intelligence. His Upper and lower bounds research incorporates elements of Mathematical optimization and Minimax. His study on VC dimension is often connected to Linear spline as part of a broader study in Discrete mathematics.

This overview was generated by a machine learning system which analysed the scientist’s body of work.

Top Publications

Pegasos: primal estimated sub-gradient solver for SVM

Shai Shalev-Shwartz;Yoram Singer;Nathan Srebro;Andrew Cotter.
Mathematical Programming (2011)

2484 Citations

Pegasos: Primal Estimated sub-GrAdient SOlver for SVM

Shai Shalev-Shwartz;Yoram Singer;Nathan Srebro.
international conference on machine learning (2007)

1506 Citations

Maximum-Margin Matrix Factorization

Nathan Srebro;Jason Rennie;Tommi S. Jaakkola.
neural information processing systems (2004)

1246 Citations

Fast maximum margin matrix factorization for collaborative prediction

Jasson D. M. Rennie;Nathan Srebro.
international conference on machine learning (2005)

1190 Citations

Equality of opportunity in supervised learning

Moritz Hardt;Eric Price;Nathan Srebro.
neural information processing systems (2016)

1026 Citations

Weighted low-rank approximations

Nathan Srebro;Tommi Jaakkola.
international conference on machine learning (2003)

920 Citations

Exploring Generalization in Deep Learning

Behnam Neyshabur;Srinadh Bhojanapalli;David McAllester;Nathan Srebro.
neural information processing systems (2017)

611 Citations

Rank, trace-norm and max-norm

Nathan Srebro;Adi Shraibman.
conference on learning theory (2005)

396 Citations

Uncovering shared structures in multiclass classification

Yonatan Amit;Michael Fink;Nathan Srebro;Shimon Ullman.
international conference on machine learning (2007)

372 Citations

The implicit bias of gradient descent on separable data

Daniel Soudry;Elad Hoffer;Mor Shpigel Nacson;Suriya Gunasekar.
Journal of Machine Learning Research (2018)

354 Citations

Profile was last updated on December 6th, 2021.
Research.com Ranking is based on data retrieved from the Microsoft Academic Graph (MAG).
The ranking h-index is inferred from publications deemed to belong to the considered discipline.


Top Scientists Citing Nathan Srebro

Francis Bach
French Institute for Research in Computer Science and Automation - INRIA
Publications: 71

Peter Richtárik
King Abdullah University of Science and Technology
Publications: 70

Shai Shalev-Shwartz
Hebrew University of Jerusalem
Publications: 54

Ohad Shamir
Weizmann Institute of Science
Publications: 54

Rong Jin
Alibaba Group (China)
Publications: 52

Tianbao Yang
University of Iowa
Publications: 45

Martin J. Wainwright
University of California, Berkeley
Publications: 40

Massimiliano Pontil
Italian Institute of Technology
Publications: 40

Dacheng Tao
University of Sydney
Publications: 38

Michael I. Jordan
University of California, Berkeley
Publications: 38

Ping Li
Baidu (United States)
Publications: 37

Tong Zhang
Hong Kong University of Science and Technology
Publications: 35

Quanquan Gu
University of California, Los Angeles
Publications: 33

Qiang Yang
Hong Kong University of Science and Technology
Publications: 31

Zhi-Hua Zhou
Nanjing University
Publications: 30
