2021 - IEEE John von Neumann Medal “For contributions to the science and engineering of large-scale distributed computer systems and artificial intelligence systems.”
2016 - Fellow of the American Academy of Arts and Sciences
2013 - Fellow of the American Association for the Advancement of Science (AAAS)
2012 - ACM Prize in Computing "For their leadership in the science and engineering of Internet-scale distributed systems."
2009 - Member of the National Academy of Engineering "For contributions to the science and engineering of large-scale distributed computer systems."
2009 - ACM Fellow "For contributions to the science and engineering of large-scale distributed computer systems."
His primary areas of investigation include artificial intelligence, machine learning, artificial neural networks, deep learning, and word embeddings. His artificial intelligence research integrates issues from pattern recognition, vocabulary, and natural language processing. His work on artificial neural networks deals with themes such as ensemble learning and acoustic modeling.
His deep learning research incorporates elements of context, CUDA, distributed computing, and reinforcement learning. Within the same line of study, Jeffrey Dean frequently works on word embeddings, concentrating on Word2vec, with related attention to syntax. His work on distributional semantics, as part of his general study of word representations, bridges gaps between disciplines.
Artificial intelligence, information retrieval, machine learning, the World Wide Web, and artificial neural networks are his primary areas of study. His artificial intelligence research combines topics from a wide range of disciplines, such as pattern recognition and natural language processing. His information retrieval work is interwoven with issues in sets and databases.
Jeffrey Dean has included themes like computation, inference, and dataflow in his machine learning research. His machine translation studies additionally incorporate language modeling and speech recognition. Word embeddings are a primary focus of his research on words.
Jeffrey Dean mainly investigates artificial intelligence, machine learning, deep learning, artificial neural networks, and machine translation. His reinforcement learning research, part of a larger body of work in artificial intelligence, is frequently linked to health informatics, bridging gaps between disciplines. His machine learning work combines areas such as signals, computation, inference, and dataflow.
His work in deep learning brings together fields such as routing, context, and data science. Jeffrey Dean has researched artificial neural networks in several areas, including language modeling, natural language understanding, and human–computer interaction. His machine translation research includes themes of sentences and natural language.
His scientific interests lie mostly in artificial intelligence, machine learning, deep learning, artificial neural networks, and machine translation. His research connects the topic of keys with artificial intelligence. He also investigates the link between machine learning and computation, crossing into problems in dataflow, multi-core processors, and CUDA.
His biomedical work spans a wide range of topics, including data points, statistical modeling, and interoperability. Jeffrey Dean combines subjects such as graphs and sets with his study of artificial neural networks. His machine translation studies focus on sentences and speech recognition.
This overview was generated by a machine learning system which analysed the scientist’s body of work.
MapReduce: simplified data processing on large clusters
Jeffrey Dean;Sanjay Ghemawat.
Communications of the ACM (2008)
Distributed Representations of Words and Phrases and their Compositionality
Tomas Mikolov;Ilya Sutskever;Kai Chen;Greg S. Corrado.
Neural Information Processing Systems (2013)
Efficient Estimation of Word Representations in Vector Space
Tomas Mikolov;Kai Chen;Greg S. Corrado;Jeffrey Dean.
arXiv: Computation and Language (2013)
Bigtable: A Distributed Storage System for Structured Data
Fay Chang;Jeffrey Dean;Sanjay Ghemawat;Wilson C. Hsieh.
ACM Transactions on Computer Systems (2008)
Distilling the Knowledge in a Neural Network
Geoffrey E. Hinton;Oriol Vinyals;Jeffrey Dean.
arXiv: Machine Learning (2015)
TensorFlow: a system for large-scale machine learning
Martín Abadi;Paul Barham;Jianmin Chen;Zhifeng Chen.
Operating Systems Design and Implementation (2016)
TensorFlow: Large-Scale Machine Learning on Heterogeneous Distributed Systems
Martín Abadi;Ashish Agarwal;Paul Barham;Eugene Brevdo.
arXiv: Distributed, Parallel, and Cluster Computing (2015)
Google's Neural Machine Translation System: Bridging the Gap between Human and Machine Translation
Yonghui Wu;Mike Schuster;Zhifeng Chen;Quoc V. Le.
arXiv: Computation and Language (2016)
Large Scale Distributed Deep Networks
Jeffrey Dean;Greg Corrado;Rajat Monga;Kai Chen.
Neural Information Processing Systems (2012)
Building high-level features using large scale unsupervised learning
Marc'aurelio Ranzato;Rajat Monga;Matthieu Devin;Kai Chen.
International Conference on Machine Learning (2012)