
H-Index & Metrics

Discipline: Computer Science
D-index: 69
Citations: 33,765
Publications: 179
World Ranking: 905
National Ranking: 545
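For context, metrics like the h-index (and the D-index variant reported here) are derived from a scientist's per-paper citation counts: the h-index is the largest h such that at least h papers each have at least h citations. A minimal sketch of that computation, assuming only a plain list of citation counts as input:

```python
def h_index(citations):
    """Largest h such that at least h papers have >= h citations each."""
    counts = sorted(citations, reverse=True)  # most-cited papers first
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # the top `rank` papers all have >= rank citations
        else:
            break
    return h

# Example: five papers with these citation counts yield h = 4,
# because the top 4 papers each have at least 4 citations.
print(h_index([10, 8, 5, 4, 3]))  # -> 4
```

The D-index shown on this page is a discipline-specific variant computed by the profiling site; the sketch above covers only the standard h-index definition.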

Overview

What is he best known for?

The fields of study he is best known for:

  • Artificial intelligence
  • Programming language
  • Linguistics

Chris Dyer spends much of his time researching Artificial intelligence, Natural language processing, Word, Parsing and Language model. His Artificial intelligence research focuses on Rule-based machine translation, Machine translation, Recurrent neural network, Inference and Translation. The concepts of his Machine translation study are interwoven with issues in Phrase and Word error rate.

His Natural language processing study integrates concerns from other disciplines, such as Structure and Linguistics, Syntax. His Word research is multidisciplinary, incorporating perspectives in Speech recognition, Binary number, Character, Similarity and Pattern recognition. His Parsing research incorporates themes from Tree, Embedding, Representation and Space.

His most cited works include:

  • Moses: Open Source Toolkit for Statistical Machine Translation (4525 citations)
  • Neural Architectures for Named Entity Recognition (2331 citations)
  • Hierarchical Attention Networks for Document Classification (2218 citations)

What are the main themes of his work throughout his whole career to date?

His main research concerns Artificial intelligence, Natural language processing, Word, Machine translation and Language model. In his work, Inference is strongly intertwined with Machine learning, which is a subfield of Artificial intelligence. Chris Dyer combines subjects such as Context and Task with his study of Natural language processing.

Chris Dyer has included themes like WordNet, Set, Similarity, Vocabulary and Character in his Word study. His Machine translation study deals with issues like Translation, which in turn involves Discriminative model. His Language model study combines topics in areas such as Artificial neural network, Sentence, Syntax and State.

He most often published in these fields:

  • Artificial intelligence (75.64%)
  • Natural language processing (59.64%)
  • Word (22.55%)

What were the highlights of his more recent work (2017-2020)?

  • Artificial intelligence (75.64%)
  • Natural language processing (59.64%)
  • Language model (20.00%)

In recent papers he was focusing on the following fields of study:

His primary areas of study are Artificial intelligence, Natural language processing, Language model, Syntax and Artificial neural network. Generative grammar, Natural language, Inference, Segmentation and Training set are subfields of Artificial intelligence in which he conducts research. His Natural language processing research includes elements of Context, Structure and Semantics.

His Language model research incorporates elements of Machine learning, Recurrent neural network, Word and State. His Syntax study combines topics from a wide range of disciplines, such as Sentence and Representation. His Artificial neural network study incorporates themes from Synonym and Symbol.

Between 2017 and 2020, his most popular works were:

  • Relational inductive biases, deep learning, and graph networks (891 citations)
  • Learning Deep Generative Models of Graphs (241 citations)
  • The NarrativeQA Reading Comprehension Challenge (232 citations)

In his most recent research, the most cited papers focused on:

  • Artificial intelligence
  • Programming language
  • Linguistics

Artificial intelligence, Language model, Natural language processing, Artificial neural network and Generalization are his primary areas of study. His research ties Software and Artificial intelligence together. His Language model study integrates concerns from other disciplines, such as Recurrent neural network, Binary number, Generator, Grammar induction and Convolutional neural network.

His studies in Recurrent neural network integrate themes in fields like Generative grammar, Parsing and Rule-based machine translation. His Natural language processing research is multidisciplinary, incorporating elements of Semantics and Syntax. He interconnects Algorithm, Contrast, Arithmetic logic unit and Source code in the investigation of issues within Artificial neural network.

This overview was generated by a machine learning system which analysed the scientist's body of work.

Best Publications

Moses: Open Source Toolkit for Statistical Machine Translation

Philipp Koehn;Hieu Hoang;Alexandra Birch;Chris Callison-Burch.
meeting of the association for computational linguistics (2007)

6096 Citations

Neural Architectures for Named Entity Recognition

Guillaume Lample;Miguel Ballesteros;Sandeep Subramanian;Kazuya Kawakami.
north american chapter of the association for computational linguistics (2016)

2849 Citations

Hierarchical Attention Networks for Document Classification

Zichao Yang;Diyi Yang;Chris Dyer;Xiaodong He.
north american chapter of the association for computational linguistics (2016)

2358 Citations

Relational inductive biases, deep learning, and graph networks

Peter W. Battaglia;Jessica B. Hamrick;Victor Bapst;Alvaro Sanchez-Gonzalez.
arXiv: Learning (2018)

873 Citations

Improved Part-of-Speech Tagging for Online Conversational Text with Word Clusters

Olutobi Owoputi;Brendan O'Connor;Chris Dyer;Kevin Gimpel.
north american chapter of the association for computational linguistics (2013)

851 Citations

Data-Intensive Text Processing with MapReduce

Jimmy Lin;Chris Dyer.
(2010)

841 Citations

A Simple, Fast, and Effective Reparameterization of IBM Model 2

Chris Dyer;Victor Chahuneau;Noah A. Smith.
north american chapter of the association for computational linguistics (2013)

679 Citations

Transition-Based Dependency Parsing with Stack Long Short-Term Memory

Chris Dyer;Miguel Ballesteros;Wang Ling;Austin Matthews.
arXiv: Computation and Language (2015)

649 Citations

Finding Function in Form: Compositional Character Models for Open Vocabulary Word Representation

Wang Ling;Chris Dyer;Alan W Black;Isabel Trancoso.
empirical methods in natural language processing (2015)

621 Citations

Improving Vector Space Word Representations Using Multilingual Correlation

Manaal Faruqui;Chris Dyer.
conference of the european chapter of the association for computational linguistics (2014)

599 Citations


Best Scientists Citing Chris Dyer

Andy Way
Dublin City University
Publications: 134

Graham Neubig
Carnegie Mellon University
Publications: 118

Philipp Koehn
Johns Hopkins University
Publications: 85

Eiichiro Sumita
National Institute of Information and Communications Technology
Publications: 85

Qun Liu
Huawei Technologies (China)
Publications: 83

Hermann Ney
RWTH Aachen University
Publications: 67

Yang Liu
Tsinghua University
Publications: 65

Marcello Federico
Amazon (United States)
Publications: 64

Francisco Casacuberta
Universitat Politècnica de València
Publications: 62

Noah A. Smith
University of Washington
Publications: 58

Holger Schwenk
Facebook (United States)
Publications: 57

Chris Callison-Burch
University of Pennsylvania
Publications: 56

Lucia Specia
Imperial College London
Publications: 53

Jörg Tiedemann
University of Helsinki
Publications: 52

Christopher D. Manning
Stanford University
Publications: 51

Yue Zhang
Westlake University
Publications: 51
