Jeff Clune's work focuses mainly on Artificial intelligence, Artificial neural networks, Machine learning, Evolutionary algorithms and Convolutional neural networks. In his Artificial intelligence research he combines subjects such as State and Pattern recognition, and his Pattern recognition work draws on Gradient descent and Deep neural networks.
His research on Artificial neural networks examines their relationship to Robotics, while his Evolutionary algorithm work brings together Evolutionary computation, HyperNEAT and Encoding. He also pursues multidisciplinary studies connecting Convolutional neural networks and Layer.
His primary scientific interests span Artificial intelligence, Artificial neural networks, Machine learning, Reinforcement learning and Evolutionary algorithms. He incorporates Modularity into his Artificial intelligence research, and explores links between Artificial neural networks and topics such as Feature that intersect with problems in Set.
Within Machine learning he concentrates on Visualization, sometimes also addressing Probabilistic logic. His Reinforcement learning research takes in themes such as Backpropagation, Robotics, Stochastic gradient descent and Forgetting, alongside continued work on Convolutional neural networks and Layer.
His other principal areas of study are Artificial intelligence, Artificial neural networks, Reinforcement learning, Function and Forgetting. His Artificial intelligence research draws on both Machine learning and Computer vision, and his Machine learning work covers Contextual image classification, Algorithms and Camera trap imagery.
His Artificial neural network research brings in Robotics, Task, Hybrid algorithms and Parameterized complexity, and extends to Data science, Computation and Human–computer interaction; his work on Computation also connects Variety and State.
Reinforcement learning, Function, Forgetting, Simple and Simplicity round out his primary areas of study. His Reinforcement learning research touches on Range, Human–computer interaction and Heuristics, while much of his Function research blends Sequence learning, Lifelong learning, Artificial intelligence and Artificial neural networks.
His work on Simple frequently draws connections to adjacent fields such as Data science.
This overview was generated by a machine learning system that analysed the scientist's body of work.
How transferable are features in deep neural networks?
Jason Yosinski;Jeff Clune;Yoshua Bengio;Hod Lipson.
Neural Information Processing Systems (2014)
Deep neural networks are easily fooled: High confidence predictions for unrecognizable images
Anh Nguyen;Jason Yosinski;Jeff Clune.
Computer Vision and Pattern Recognition (2015)
Understanding Neural Networks Through Deep Visualization
Jason Yosinski;Jeff Clune;Anh Mai Nguyen;Thomas J. Fuchs.
arXiv: Computer Vision and Pattern Recognition (2015)
Robots that can adapt like animals
Antoine Cully;Jeff Clune;Danesh Tarapore;Jean-Baptiste Mouret.
Nature (2015)
Deep Neuroevolution: Genetic Algorithms Are a Competitive Alternative for Training Deep Neural Networks for Reinforcement Learning
Felipe Petroski Such;Vashisht Madhavan;Edoardo Conti;Joel Lehman.
arXiv: Neural and Evolutionary Computing (2017)
The evolutionary origins of modularity
Jeff Clune;Jean-Baptiste Mouret;Hod Lipson.
Proceedings of the Royal Society B: Biological Sciences (2013)
Automatically identifying, counting, and describing wild animals in camera-trap images with deep learning.
Proceedings of the National Academy of Sciences of the United States of America (2018)
Automatically identifying wild animals in camera trap images with deep learning.
(2017)
Plug & Play Generative Networks: Conditional Iterative Generation of Images in Latent Space
Anh Nguyen;Jeff Clune;Yoshua Bengio;Alexey Dosovitskiy.
Computer Vision and Pattern Recognition (2017)
Synthesizing the preferred inputs for neurons in neural networks via deep generator networks
Anh Mai Nguyen;Alexey Dosovitskiy;Jason Yosinski;Thomas Brox.
Neural Information Processing Systems (2016)
University of Central Florida
Columbia University
University of Lorraine
Michigan State University
University of Montreal
Google (United States)
United States Department of Agriculture
University of Minnesota
Cornell University
The University of Texas at Austin
French Institute for Research in Computer Science and Automation - INRIA