His primary scientific interests are Mathematical optimization, Regularization, Gradient descent, Artificial intelligence and Applied mathematics. His work on Optimization problems, part of his general Mathematical optimization research, is frequently linked to Uniform convergence, bridging the gap between disciplines. His work in Regularization addresses subjects such as Support vector machines, which in turn connect to Limit.
His Gradient descent studies deal with areas such as Stochastic gradient descent and Linear separability. His work investigates the relationship between Artificial intelligence and Machine learning, where it intersects with problems in Pattern recognition. His Applied mathematics research includes themes of Matrix decomposition, Quadratic equations, Stochastic optimization and Linear prediction.
The scientist’s investigation covers issues in Mathematical optimization, Algorithms, Applied mathematics, Artificial intelligence and Gradient descent. His Mathematical optimization study includes themes such as Artificial neural networks, Stochastic gradient descent, Upper and lower bounds and Regular polygon. Within the same scientific family, Nathan Srebro mostly works in the field of Algorithms, focusing on Simple and, on occasion, Distribution.
His work in Applied mathematics brings together such families of science as Factorization, Regularization, Norm and Matrix. His Artificial intelligence research includes elements of Machine learning and Pattern recognition. His study of Support vector machines focuses on their intersection with Kernel, with further connections to Discrete mathematics.
His primary areas of investigation include Applied mathematics, Gradient descent, Upper and lower bounds, Algorithm and Mathematical optimization. His Applied mathematics study incorporates themes from Regularization, Iterated function and Stochastic gradient descent. His Gradient descent study integrates concerns from other disciplines, such as Contrast, Monotone polygon, Margin, Separable space and Norm.
His Upper and lower bounds research includes themes of Discrete mathematics, Class, Stochastic optimization, Distributed learning and Stationary point. His study spans a wide range of topics, including Structure, Kernel methods, Kernel and Simple. His Mathematical optimization study combines topics such as Quadratic equations and Convex optimization.
His primary areas of study are Gradient descent, Applied mathematics, Norm, Upper and lower bounds and Discrete mathematics. His Gradient descent research is multidisciplinary, incorporating perspectives in Factorization, Underdetermined system, Margin and Non homogeneous. His Applied mathematics study incorporates themes from Linear prediction, Separable space, Monotone polygon and Regularization.
Regularization is the subject of his research, which falls under Artificial intelligence. His Upper and lower bounds research incorporates elements of Mathematical optimization and Minimax. His study of VC dimension is often connected to Linear splines as part of a broader study of Discrete mathematics.
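Several of the themes above, notably Support vector machines, Stochastic gradient descent and Regularization, come together in the Pegasos solver that heads the publication list. As a rough, hedged sketch only (the function and parameter names below are invented for illustration, and the optional projection step of the published algorithm is omitted), a Pegasos-style stochastic sub-gradient update for a linear SVM can be written as:

```python
import random

def pegasos_train(data, labels, lam=0.01, epochs=200, seed=0):
    """Illustrative Pegasos-style SVM training.

    data: list of feature vectors (lists of floats); labels: +1 / -1.
    Names and defaults are invented for this sketch; the optional
    projection onto the ball of radius 1/sqrt(lam) is omitted.
    """
    rng = random.Random(seed)
    dim = len(data[0])
    w = [0.0] * dim
    t = 0
    for _ in range(epochs):
        for _ in range(len(data)):
            t += 1
            i = rng.randrange(len(data))          # pick one random example
            eta = 1.0 / (lam * t)                 # step size 1 / (lambda * t)
            margin = labels[i] * sum(wj * xj for wj, xj in zip(w, data[i]))
            # gradient of the L2 regularizer: shrink w toward zero
            w = [(1.0 - eta * lam) * wj for wj in w]
            if margin < 1:                        # hinge loss active: step along y_i * x_i
                w = [wj + eta * labels[i] * xj for wj, xj in zip(w, data[i])]
    return w
```

On a linearly separable toy set, repeated updates drive every margin above 1, after which only the shrinkage step remains active.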
This overview was generated by a machine learning system that analysed the scientist’s body of work.
Pegasos: primal estimated sub-gradient solver for SVM
Shai Shalev-Shwartz;Yoram Singer;Nathan Srebro;Andrew Cotter.
Mathematical Programming (2011)
Pegasos: Primal Estimated sub-GrAdient SOlver for SVM
Shai Shalev-Shwartz;Yoram Singer;Nathan Srebro.
international conference on machine learning (2007)
Maximum-Margin Matrix Factorization
Nathan Srebro;Jason Rennie;Tommi S. Jaakkola.
neural information processing systems (2004)
Fast maximum margin matrix factorization for collaborative prediction
Jasson D. M. Rennie;Nathan Srebro.
international conference on machine learning (2005)
Equality of opportunity in supervised learning
Moritz Hardt;Eric Price;Nathan Srebro.
neural information processing systems (2016)
Weighted low-rank approximations
Nathan Srebro;Tommi Jaakkola.
international conference on machine learning (2003)
Exploring Generalization in Deep Learning
Behnam Neyshabur;Srinadh Bhojanapalli;David McAllester;Nathan Srebro.
neural information processing systems (2017)
Rank, trace-norm and max-norm
Nathan Srebro;Adi Shraibman.
conference on learning theory (2005)
Uncovering shared structures in multiclass classification
Yonatan Amit;Michael Fink;Nathan Srebro;Shimon Ullman.
international conference on machine learning (2007)
The implicit bias of gradient descent on separable data
Daniel Soudry;Elad Hoffer;Mor Shpigel Nacson;Suriya Gunasekar.
Journal of Machine Learning Research (2018)
Profile was last updated on December 6th, 2021.
Research.com Ranking is based on data retrieved from the Microsoft Academic Graph (MAG).
The ranking h-index is inferred from publications deemed to belong to the considered discipline.
If you think any of the details on this page are incorrect, let us know.