Chris Dyer spends much of his time researching Artificial intelligence, Natural language processing, Word, Parsing and Language model. His Artificial intelligence research focuses on Rule-based machine translation, Machine translation, Recurrent neural network, Inference and Translation. The concepts of his Machine translation study are interwoven with issues in Phrase and Word error rate.
His Natural language processing study integrates concerns from other disciplines, such as Structure, Linguistics and Syntax. His Word research is multidisciplinary, incorporating perspectives in Speech recognition, Binary number, Character, Similarity and Pattern recognition. His Parsing research incorporates themes from Tree, Embedding, Representation and Space.
His main research concerns Artificial intelligence, Natural language processing, Word, Machine translation and Language model. In his work, Inference is strongly intertwined with Machine learning, which is a subfield of Artificial intelligence. Chris Dyer combines subjects such as Context and Task with his study of Natural language processing.
Chris Dyer has included themes like WordNet, Set, Similarity, Vocabulary and Character in his Word study. His Machine translation study deals with issues like Translation, which in turn relates to fields such as Discriminative model. His Language model study combines topics in areas such as Artificial neural network, Sentence, Syntax and State.
His primary areas of study are Artificial intelligence, Natural language processing, Language model, Syntax and Artificial neural network. Generative grammar, Natural language, Inference, Segmentation and Training set are subfields of Artificial intelligence in which he conducts research. His Natural language processing research includes elements of Context, Structure and Semantics.
His Language model research incorporates elements of Machine learning, Recurrent neural network, Word and State. His Syntax study combines topics from a wide range of disciplines, such as Sentence and Representation. His Artificial neural network study incorporates themes from Synonym and Symbol.
Artificial intelligence, Language model, Natural language processing, Artificial neural network and Generalization are his primary areas of study. His research ties Software and Artificial intelligence together. His Language model study integrates concerns from other disciplines, such as Recurrent neural network, Binary number, Generator, Grammar induction and Convolutional neural network.
His studies in Recurrent neural network integrate themes in fields like Generative grammar, Parsing and Rule-based machine translation. His Natural language processing research is multidisciplinary, incorporating elements of Semantics and Syntax. He interconnects Algorithm, Contrast, Arithmetic logic unit and Source code in the investigation of issues within Artificial neural network.
This overview was generated by a machine learning system that analysed the scientist's body of work.
Moses: Open Source Toolkit for Statistical Machine Translation
Philipp Koehn;Hieu Hoang;Alexandra Birch;Chris Callison-Burch.
meeting of the association for computational linguistics (2007)
Hierarchical Attention Networks for Document Classification
Zichao Yang;Diyi Yang;Chris Dyer;Xiaodong He.
north american chapter of the association for computational linguistics (2016)
Neural Architectures for Named Entity Recognition
Guillaume Lample;Miguel Ballesteros;Sandeep Subramanian;Kazuya Kawakami.
north american chapter of the association for computational linguistics (2016)
Relational inductive biases, deep learning, and graph networks
Peter W. Battaglia;Jessica B. Hamrick;Victor Bapst;Alvaro Sanchez-Gonzalez.
arXiv: Learning (2018)
Retrofitting Word Vectors to Semantic Lexicons
Manaal Faruqui;Jesse Dodge;Sujay Kumar Jauhar;Chris Dyer.
north american chapter of the association for computational linguistics (2015)
Improved Part-of-Speech Tagging for Online Conversational Text with Word Clusters
Olutobi Owoputi;Brendan O'Connor;Chris Dyer;Kevin Gimpel.
north american chapter of the association for computational linguistics (2013)
Transition-Based Dependency Parsing with Stack Long Short-Term Memory
Chris Dyer;Miguel Ballesteros;Wang Ling;Austin Matthews.
international joint conference on natural language processing (2015)
Data-Intensive Text Processing with MapReduce
Jimmy Lin;Chris Dyer.
(2010)
A Simple, Fast, and Effective Reparameterization of IBM Model 2
Chris Dyer;Victor Chahuneau;Noah A. Smith.
north american chapter of the association for computational linguistics (2013)
Finding Function in Form: Compositional Character Models for Open Vocabulary Word Representation
Wang Ling;Chris Dyer;Alan W Black;Isabel Trancoso.
empirical methods in natural language processing (2015)