2006 - Fellow of the Association for the Advancement of Artificial Intelligence (AAAI), for significant contributions to machine learning, especially knowledge-intensive approaches, and the application of machine learning to problems in computational biology.
His primary areas of study are Artificial intelligence, Machine learning, Artificial neural network, Time delay neural network and Nervous system network models. His work in Artificial intelligence is closely connected to Set, and his Machine learning research incorporates elements of Algorithm and Task.
Jude W. Shavlik works mostly in the field of Artificial neural network, narrowing it down to topics relating to Decision tree and, in certain cases, Tree, Range, Simple, Matching and Structure. His Time delay neural network research focuses on subjects such as Deep learning, which are linked to Recurrent neural network. His Nervous system network models research includes themes of Physical neural network and Stochastic neural network.
Artificial intelligence, Machine learning, Artificial neural network, Inductive logic programming and Task are his primary areas of study. His research in Artificial intelligence intersects with Set and Statistical relational learning. His Machine learning study combines topics from a wide range of disciplines, such as Information extraction and Data mining.
Within Inductive logic programming, he focuses mostly on Precision and recall and, on occasion, Field. His Task research incorporates themes from Natural language processing, Advice and Reinforcement learning. In his study, Algorithm is closely linked to Backpropagation, which falls within the broad field of Connectionism.
His primary areas of study are Artificial intelligence, Machine learning, Statistical relational learning, Boosting and Task. His Artificial intelligence study draws on Statistical inference, and his work on Support vector machine is also linked to topics such as Gradient boosting.
His Statistical relational learning research incorporates disciplines such as Ensemble learning and Data-driven approaches. His Boosting research integrates issues from Gradient based algorithm, Markov logic network, Conditional probability distribution and Missing data. His Task study draws on other disciplines, such as Relation, Set and Domain knowledge.
His primary scientific interests are in Artificial intelligence, Machine learning, Statistical relational learning, Markov chain and Data science. His Artificial intelligence studies deal with areas such as Simple and Big data. His Machine learning study combines topics in areas such as Domain, Scalability and Markov process.
His work in the field of Domain brings together Learning classifier system, Unsupervised learning, Reinforcement learning and Control theory. His Statistical relational learning study combines topics such as Ensemble learning and Boosting. His study in Data science is interdisciplinary in nature, drawing on Crowdsourcing, Web page, World Wide Web, Knowledge base and Workflow.
This overview was generated by a machine learning system which analysed the scientist's body of work.
Extracting Refined Rules from Knowledge-Based Neural Networks
Geoffrey G. Towell;Jude W. Shavlik.
Machine Learning (1993)
Knowledge-based artificial neural networks
Geoffrey G. Towell;Jude W. Shavlik.
Artificial Intelligence (1994)
Extracting Tree-Structured Representations of Trained Networks
Mark Craven;Jude W. Shavlik.
Neural Information Processing Systems (1995)
Refinement of approximate domain theories by knowledge-based neural networks
Geoffrey G. Towell;Jude W. Shavlik;Michiel O. Noordewier.
National Conference on Artificial Intelligence (1990)
Symbolic and neural learning algorithms: an experimental comparison
Jude W. Shavlik;Raymond J. Mooney;Geoffrey G. Towell.
Machine Learning (1991)
Readings in Machine Learning
Jude W. Shavlik;Thomas G. Dietterich.
(1991)
Actively Searching for an Effective Neural Network Ensemble
David W. Opitz;Jude W. Shavlik.
Connection Science (1996)
Using sampling and queries to extract rules from trained neural networks
Mark Craven;Jude W. Shavlik.
International Conference on Machine Learning (1994)
Generating Accurate and Diverse Members of a Neural-Network Ensemble
David W. Opitz;Jude W. Shavlik.
Neural Information Processing Systems (1995)
Using neural networks for data mining
Mark W. Craven;Jude W. Shavlik.
Future Generation Computer Systems (1997)