2016 - Member of the National Academy of Sciences
2014 - Member of the National Academy of Engineering, for contributions to machine learning through the invention and development of boosting algorithms.
2009 - Fellow of the Association for the Advancement of Artificial Intelligence (AAAI), for significant contributions to machine learning, including the theory and practice of boosting.
2004 - ACM Paris Kanellakis Theory and Practice Award, for the theory and practice of boosting.
His main research areas are Artificial intelligence, Machine learning, Boosting, Algorithms and AdaBoost. Within Artificial intelligence, Robert E. Schapire brings together Stability, the Bondareva–Shapley theorem, Query expansion and Pattern recognition. His work on Multiclass classification and Overfitting is also linked to topics such as Function, Multiple data and Gene ontology.
His Boosting research draws on Learning to rank, BrownBoost, Boosting methods for object categorization, Decision trees and Ensemble learning. He explores the links between BrownBoost and LPBoost, which in turn intersect with LogitBoost. His Algorithm research combines Expected value, Exponential function and Convex optimization.
Robert E. Schapire mostly works on Artificial intelligence, Boosting, Machine learning, Algorithms and Mathematical optimization. His Artificial intelligence research focuses on Stability and its relation to Semi-supervised learning. His Boosting work integrates BrownBoost, Convex optimization, AdaBoost and Generalization error.
Within the same family of methods, he works mainly on BrownBoost, with a focus on LPBoost and, occasionally, LogitBoost. His Machine learning research incorporates Probabilistic logic and Regression, and his work on Timed automata also touches on Function.
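To make the boosting theme above concrete, here is a minimal illustrative sketch of discrete AdaBoost with decision stumps, written in plain NumPy. The toy dataset, function names and number of rounds are assumptions made for the example only; they are not taken from Schapire's publications.

```python
# Minimal sketch of discrete AdaBoost with decision stumps (illustrative only).
# Assumes labels in {-1, +1} and a NumPy feature matrix X of shape (n, d).
import numpy as np

def fit_stump(X, y, w):
    """Find the threshold stump (feature, threshold, polarity) with the
    lowest weighted error under the current example weights w."""
    n, d = X.shape
    best = (None, None, 1, np.inf)  # feature, threshold, polarity, error
    for j in range(d):
        for thr in np.unique(X[:, j]):
            for polarity in (1, -1):
                pred = polarity * np.where(X[:, j] <= thr, 1, -1)
                err = np.sum(w[pred != y])
                if err < best[3]:
                    best = (j, thr, polarity, err)
    return best

def adaboost(X, y, n_rounds=20):
    """Return a list of (alpha, stump) pairs forming the boosted ensemble."""
    n = len(y)
    w = np.full(n, 1.0 / n)                     # start with uniform weights
    ensemble = []
    for _ in range(n_rounds):
        j, thr, pol, err = fit_stump(X, y, w)
        err = max(err, 1e-10)                   # avoid division by zero
        alpha = 0.5 * np.log((1 - err) / err)   # weak learner's vote weight
        pred = pol * np.where(X[:, j] <= thr, 1, -1)
        w *= np.exp(-alpha * y * pred)          # up-weight misclassified points
        w /= w.sum()
        ensemble.append((alpha, (j, thr, pol)))
    return ensemble

def predict(ensemble, X):
    """Sign of the weighted vote of all stumps."""
    agg = np.zeros(len(X))
    for alpha, (j, thr, pol) in ensemble:
        agg += alpha * pol * np.where(X[:, j] <= thr, 1, -1)
    return np.sign(agg)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)  # toy labels for the demo
    model = adaboost(X, y, n_rounds=30)
    print("training accuracy:", np.mean(predict(model, X) == y))
```

Each round re-weights the training examples so that the next weak learner concentrates on the points the current ensemble misclassifies, and the final classifier is the weighted vote of the stumps.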
His scientific interests lie mostly in Regret, Artificial intelligence, Oracles, Mathematical optimization and Theoretical computer science. His research spans a wide range of topics, including Machine learning and Pattern recognition. His Machine learning work focuses on Regression, Leverage, Realizability and LPBoost.
His Mathematical optimization work incorporates Conditional probability, Boosting and Convex combinations. His Boosting research intersects with Bounded functions, Residuals, Residual neural networks, AdaBoost and Generalization error, and his Theoretical computer science work draws on both State and Reinforcement learning.
Robert E. Schapire mainly focuses on Regret, Artificial intelligence, Oracles, Machine learning and Algorithms. His work on Regret brings together Discrete mathematics, Open problems, Reductions, Classes and Mathematical optimization. His Artificial intelligence research frequently connects to adjacent fields such as Game theory.
His Machine learning research touches on Classifiers, Optimization problems, Active learning and Realizability. His Algorithm work combines Overfitting, Prior probabilities, Robustness and Mirror descent, and his Reinforcement learning studies integrate the Bellman equation, Models of computation, Theoretical computer science and Enumeration.
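Since regret with respect to a pool of experts is a recurring theme in the online-learning work summarized above (and in the decision-theoretic paper listed below), the following is a small hedged sketch of the Hedge / multiplicative-weights update. The synthetic loss matrix, the learning rate eta and the variable names are illustrative assumptions, not details taken from the cited papers.

```python
# Minimal sketch of the Hedge (multiplicative-weights) update for the
# experts setting: maintain a distribution over N experts, suffer its
# expected loss each round, and exponentially down-weight poor experts.
import numpy as np

def hedge(losses, eta=0.5):
    """Run Hedge on a (T, N) array of per-round expert losses in [0, 1].
    Returns the algorithm's cumulative expected loss and the final weights."""
    T, N = losses.shape
    w = np.ones(N)
    total_loss = 0.0
    for t in range(T):
        p = w / w.sum()                 # play the normalized weight vector
        total_loss += p @ losses[t]     # expected loss this round
        w *= np.exp(-eta * losses[t])   # penalize experts with high loss
    return total_loss, w / w.sum()

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    losses = rng.random((1000, 5))
    losses[:, 2] *= 0.3                 # make expert 2 consistently better
    alg_loss, final_p = hedge(losses)
    best_expert_loss = losses.sum(axis=0).min()
    print(f"regret vs. best expert: {alg_loss - best_expert_loss:.1f}")
    print("final weights:", np.round(final_p, 3))
```

The printed quantity is the regret: the algorithm's cumulative expected loss minus that of the best single expert in hindsight, which the multiplicative-weights update keeps sublinear in the number of rounds.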
This overview was generated by a machine learning system which analysed the scientist's body of work.
A Decision Theoretic Generalization of On-Line Learning and an Application to Boosting
Y. Freund;R. Schapire.
Journal of Computer and System Sciences (1997)
Maximum entropy modeling of species geographic distributions
Steven J. Phillips;Robert P. Anderson;Robert E. Schapire.
Ecological Modelling (2006)
Experiments with a new boosting algorithm
Yoav Freund;Robert E. Schapire.
International Conference on Machine Learning (1996)
Novel methods improve prediction of species' distributions from occurrence data
Jane Elith;Catherine H. Graham;Robert P. Anderson;Miroslav Dudík.
(2006)
The Strength of Weak Learnability
Robert E. Schapire.
Machine Learning (1990)
Improved boosting algorithms using confidence-rated predictions
Robert E. Schapire;Yoram Singer.
Conference on Learning Theory (1998)
A Short Introduction to Boosting
Yoav Freund;Robert E. Schapire.
(1999)
Boosting the margin: a new explanation for the effectiveness of voting methods
Robert E. Schapire;Yoav Freund;Peter Bartlett;Wee Sun Lee.
Annals of Statistics (1998)
BoosTexter: A Boosting-based System for Text Categorization
Robert E. Schapire;Yoram Singer.
Machine Learning (2000)
An efficient boosting algorithm for combining preferences
Yoav Freund;Raj Iyer;Robert E. Schapire;Yoram Singer.
Journal of Machine Learning Research (2003)