2023 - Research.com Computer Science in Australia Leader Award
His primary areas of study are Artificial intelligence, Natural language processing, Word, Machine learning and Machine translation. His Artificial intelligence work integrates the notion of Quality, while his Natural language processing studies draw on Exploit and Benchmark.
His Word research is multidisciplinary, relying on Lexical item, Similarity, Baseline and Component. His Machine learning study incorporates themes from Social network, Named-entity recognition, Friendship and Cognitive reframing. His Machine translation research covers Translation, which intersects with Phrase.
His main research concerns Artificial intelligence, Natural language processing, Machine learning, Machine translation and Word. His Artificial intelligence research incorporates perspectives from Speech recognition and Pattern recognition, while his Natural language processing work deals with themes such as Context and Transfer.
His work on Sentiment analysis, within his broader Machine learning research, frequently connects to Gaussian processes, bridging these disciplines. His Machine translation study is interwoven with issues in Grammar and Phrase, and his investigations of Inference interconnect Algorithm and Conditional random field.
Trevor Cohn mainly focuses on Artificial intelligence, Natural language processing, Machine learning, Word and Machine translation. His Artificial intelligence study is interdisciplinary, drawing from both Domain and Computer vision, and within Natural language processing he focuses on Parsing in particular.
His research on Stability, part of his general Machine learning study, is frequently linked to Training, connecting these domains. His Word study combines topics such as Sentence, Pragmatics, Speech act and Oracle, while his Machine translation studies integrate themes from Computer security, Adversary and Training set.
His primary scientific interests are in Artificial intelligence, Natural language processing, Named-entity recognition, Word and Machine learning. In his work, Linguistic sequence complexity is strongly linked to Domain, which falls under the umbrella of Artificial intelligence, and his Natural language processing research incorporates Context and Transfer.
His Named-entity recognition research includes elements of Event, Information extraction, Information retrieval and Key, while his Word studies deal with Sentence and Machine translation. He works mostly in Machine learning, narrowing it down to concerns involving Inference and, occasionally, Crowdsourcing, Benchmark, Graphical model, Conjugate prior and Bayesian probability.
This overview was generated by a machine learning system which analysed the scientist's body of work.
DyNet: The Dynamic Neural Network Toolkit
Graham Neubig, Chris Dyer, Yoav Goldberg, Austin Matthews.
arXiv: Machine Learning (2017)
Low Resource Dependency Parsing: Cross-lingual Parameter Sharing in a Neural Network Parser
Long Duong, Trevor Cohn, Steven Bird, Paul Cook.
International Joint Conference on Natural Language Processing (2015)
Graph-to-Sequence Learning using Gated Graph Neural Networks
Daniel Beck, Gholamreza Haffari, Trevor Cohn.
Meeting of the Association for Computational Linguistics (2018)
Sentence Compression Beyond Word Deletion
Trevor Cohn, Mirella Lapata.
International Conference on Computational Linguistics (2008)
QuEst - A translation quality estimation framework
Lucia Specia, Kashif Shah, Jose G.C. de Souza, Trevor Cohn.
(2013)
University of Melbourne
RMIT University
University of Oxford
Charles Darwin University
Imperial College London
University of Sheffield
Carnegie Mellon University
University of Edinburgh
Google (United States)
University of Edinburgh
University of Oxford
Grenoble Alpes University
National Institutes of Health
University of Pennsylvania
Commonwealth Scientific and Industrial Research Organisation
University of Tokyo
Carleton University
Institut de Recherche pour le Développement
University of Pisa
University of Pennsylvania
National Institutes of Health
University of North Carolina at Chapel Hill
Electric Power Research Institute
KU Leuven
University of Alabama in Huntsville
University of Tokyo