2010 - Fellow of the Association for the Advancement of Artificial Intelligence (AAAI), for significant contributions to the theory and practice of efficient machine learning algorithms.
His scientific interests lie mainly in artificial intelligence, machine learning, algorithms, boosting, and mathematical optimization. His artificial intelligence research also draws on natural language processing, query expansion, and pattern recognition. His machine learning research focuses on synthetic data and how it relates to quadratic programming, learnability, coding, and special cases.
His algorithmic research intersects with the Margin Infused Relaxed Algorithm, with connections to decision problems and lemmas. His work on optimization problems, interior-point methods, and gradient projection is often tied to online learning within the broader study of mathematical optimization. His studies of optimization problems also integrate regularization, kernels, and support vector machines.
His primary areas of investigation include artificial intelligence, algorithms, machine learning, pattern recognition, and mathematical optimization. His artificial intelligence work incorporates themes from online algorithms and natural language processing. His algorithmic research is interdisciplinary, drawing on mixture models, supervised learning, probabilistic logic, and theoretical computer science.
His machine learning work on ranking SVMs, ranking, text categorization, and kernel methods bridges several disciplines. His pattern recognition research integrates concerns from other areas, such as margins and iterative methods. His mathematical optimization studies combine topics such as online machine learning, regularization, and applied mathematics.
His primary areas of study are artificial intelligence, pattern recognition, algorithms, machine learning, and representation. Within artificial intelligence, Yoram Singer focuses in particular on regularization. His pattern recognition research interconnects embeddings, zero-shot learning, word embeddings, and convex combinations.
His algorithmic research incorporates elements of sparse matrices and robustness. His machine learning work on empirical risk minimization and ranking connects to broader questions of subject matter and implementation. Many of his mathematical optimization projects are closely connected to Newton's method, tying diverse disciplines together.
Artificial intelligence, pattern recognition, mathematical optimization, algorithms, and word embeddings are his primary areas of study. His artificial intelligence research combines topics from a wide range of disciplines, such as machine learning, matrix norms, and collaborative filtering. His machine learning research is multidisciplinary, incorporating the cognitive neuroscience of visual object recognition, inference, categorization, and statistics.
Within mathematical optimization, his work strongly links representation and kernel smoothers to the eigendecomposition of a matrix. His algorithmic research brings together learnability and robustness. His word embedding studies combine convex combinations, image transformation, and zero-shot learning.
This overview was generated by a machine learning system which analysed the scientist's body of work.
Adaptive Subgradient Methods for Online Learning and Stochastic Optimization
John Duchi;Elad Hazan;Yoram Singer.
Journal of Machine Learning Research (2011)
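The paper above introduced AdaGrad, which adapts the step size per coordinate by dividing the gradient by the square root of that coordinate's accumulated squared gradients. A minimal sketch of the diagonal AdaGrad update, applied to a toy quadratic (the target vector, learning rate, and iteration count are illustrative choices, not from the paper):

```python
import math

def adagrad_step(w, grad, accum, lr=0.1, eps=1e-8):
    # One diagonal-AdaGrad update: each coordinate's step is scaled by
    # the inverse square root of its accumulated squared gradients, so
    # rarely-updated coordinates keep larger effective learning rates.
    for i in range(len(w)):
        accum[i] += grad[i] ** 2
        w[i] -= lr * grad[i] / (math.sqrt(accum[i]) + eps)

# Toy objective: f(w) = sum_i (w_i - t_i)^2, with gradient 2 * (w - t).
target = [1.0, -2.0]
w = [0.0, 0.0]
accum = [0.0, 0.0]
for _ in range(2000):
    grad = [2.0 * (wi - ti) for wi, ti in zip(w, target)]
    adagrad_step(w, grad, accum)
```

After enough iterations `w` approaches `target`; because the accumulator only grows, the effective step size shrinks over time, which is what gives AdaGrad its regret guarantees in the online setting.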
Improved boosting algorithms using confidence-rated predictions
Robert E. Schapire;Yoram Singer.
conference on learning theory (1998)
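The key idea in the paper above is letting weak hypotheses output a signed real "confidence" rather than a hard label, with example weights multiplied by exp(-y_i * h(x_i)) each round. A minimal sketch of that reweighting step (the three example margins below are made up for illustration):

```python
import math

def reweight(dist, margins):
    # Confidence-rated boosting reweighting: multiply each example's
    # weight by exp(-y_i * h(x_i)) and renormalize, so confidently
    # correct examples are downweighted and mistakes are upweighted.
    new = [d * math.exp(-m) for d, m in zip(dist, margins)]
    z = sum(new)  # the normalization factor Z_t that boosting minimizes
    return [d / z for d in new]

# Two examples the weak hypothesis gets right (margins 1.0 and 0.2)
# and one it gets wrong (margin -0.5): the mistake's weight grows.
dist = reweight([1 / 3, 1 / 3, 1 / 3], [1.0, 0.2, -0.5])
```

The full algorithm also chooses each weak hypothesis (and its confidence values) to minimize the normalizer `z`, which is what drives the improved training-error bound.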
Feature-rich part-of-speech tagging with a cyclic dependency network
Kristina Toutanova;Dan Klein;Christopher D. Manning;Yoram Singer.
north american chapter of the association for computational linguistics (2003)
BoosTexter: A Boosting-based System for Text Categorization
Robert E. Schapire;Yoram Singer.
Machine Learning (2000)
An efficient boosting algorithm for combining preferences
Yoav Freund;Raj Iyer;Robert E. Schapire;Yoram Singer.
Journal of Machine Learning Research (2003)
On the algorithmic implementation of multiclass kernel-based vector machines
Koby Crammer;Yoram Singer.
Journal of Machine Learning Research (2002)
Reducing multiclass to binary: a unifying approach for margin classifiers
Erin L. Allwein;Robert E. Schapire;Yoram Singer.
Journal of Machine Learning Research (2001)
Pegasos: primal estimated sub-gradient solver for SVM
Shai Shalev-Shwartz;Yoram Singer;Nathan Srebro;Andrew Cotter.
Mathematical Programming (2011)
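Pegasos runs stochastic sub-gradient descent on the regularized SVM objective with the schedule eta_t = 1/(lambda * t). A minimal sketch of the basic (unprojected) variant on a tiny separable toy set (the data, lambda, and iteration count are illustrative, not from the paper):

```python
import random

def pegasos(data, lam=0.1, T=2000, seed=0):
    # Pegasos: stochastic sub-gradient descent on the SVM objective
    #   (lam / 2) * ||w||^2 + (1 / n) * sum_i hinge(y_i * <w, x_i>)
    # using step size eta_t = 1 / (lam * t) at iteration t.
    rng = random.Random(seed)
    dim = len(data[0][0])
    w = [0.0] * dim
    for t in range(1, T + 1):
        x, y = rng.choice(data)
        eta = 1.0 / (lam * t)
        margin = y * sum(wi * xi for wi, xi in zip(w, x))
        # Shrink w (the regularization part of the sub-gradient) ...
        w = [(1 - eta * lam) * wi for wi in w]
        # ... and, on a margin violation, step toward the example.
        if margin < 1:
            w = [wi + eta * y * xi for wi, xi in zip(w, x)]
    return w

# Toy separable set: the label is the sign of the first coordinate.
data = [([2.0, 1.0], 1), ([1.5, -1.0], 1),
        ([-2.0, 0.5], -1), ([-1.0, -1.5], -1)]
w = pegasos(data)
```

The 1/(lambda * t) schedule is the paper's central design choice: it yields a runtime that scales with 1/(lambda * epsilon) rather than with the training-set size.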
Online Passive-Aggressive Algorithms
Koby Crammer;Ofer Dekel;Joseph Keshet;Shai Shalev-Shwartz;Yoram Singer.
Journal of Machine Learning Research (2006)
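Passive-Aggressive algorithms leave the weights unchanged when the current example satisfies the margin constraint and otherwise make the smallest update that satisfies it. A minimal sketch of the PA-I update for binary classification (the toy stream and the choice of C are illustrative, not from the paper):

```python
def pa_update(w, x, y, C=1.0):
    # Passive-Aggressive (PA-I) update: if y * <w, x> >= 1, do nothing
    # (passive); otherwise take the closed-form step of size
    # tau = min(C, loss / ||x||^2) toward satisfying the constraint
    # (aggressive), where loss is the hinge loss on this example.
    margin = y * sum(wi * xi for wi, xi in zip(w, x))
    loss = max(0.0, 1.0 - margin)
    if loss > 0.0:
        tau = min(C, loss / sum(xi * xi for xi in x))
        for i in range(len(w)):
            w[i] += tau * y * x[i]
    return w

# Stream a few separable examples repeatedly; w converges to a
# vector that classifies them all correctly.
stream = [([1.0, 0.5], 1), ([-1.0, 0.2], -1),
          ([0.8, -0.4], 1), ([-0.9, -0.3], -1)]
w = [0.0, 0.0]
for _ in range(20):
    for x, y in stream:
        pa_update(w, x, y)
```

The cap C trades off aggressiveness against robustness to label noise; the uncapped variant solves the margin constraint exactly on every mistake.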
Pegasos: Primal Estimated sub-GrAdient SOlver for SVM
Shai Shalev-Shwartz;Yoram Singer;Nathan Srebro.
international conference on machine learning (2007)