Mo Yu spends much of his time researching Artificial intelligence, Natural language processing, Question answering, Reading comprehension and Recurrent neural network. His research in the field of Entity linking overlaps with other disciplines such as Code. His Natural language processing study combines topics in areas such as Embedding, Word, Convolutional neural network and Selection.
The Open domain research Mo Yu does as part of his general Question answering study is frequently linked to disciplines such as Pipeline, Rank and Component, creating a bridge between diverse domains of science. Within Recurrent neural network, Mo Yu works on issues such as State, which connect to Dual, Pattern recognition, Super-resolution, Image and Benchmark. His study of Sentence integrates issues of Textual entailment, Regularization and SemEval.
Mo Yu focuses on Artificial intelligence, Natural language processing, Machine learning, Question answering and Domain. His work in Artificial intelligence addresses subjects such as Relationship extraction, which is connected to disciplines such as Named-entity recognition. Mo Yu integrates many fields, such as Natural language processing and Reading comprehension, in his work.
His work in the field of Question answering brings together families of science such as Semantic reasoner and Complex question. In his study, Pattern recognition is strongly linked to Feature, which falls under the umbrella field of Word. His study looks at the relationship between Embedding and fields such as Sentence, and at how they intersect with problems in chemistry.
Mo Yu mainly investigates Artificial intelligence, Question answering, Natural language processing, Natural language and Theoretical computer science. In the field of Artificial intelligence, his study of Natural language interaction and Interrogative word overlaps with subjects such as Domain, Interaction method and Product. His Question answering research includes elements of Dependency, Benchmark and Complex question.
His Natural language processing research is multidisciplinary, incorporating perspectives from Word and Translation language. His study explores the link between Natural language and topics such as Parsing that intersect with problems in Relationship extraction, Representation, Relation and Ambiguity. His research in Theoretical computer science focuses on subjects like Knowledge graph, which is connected to Small set.
His primary areas of investigation include Artificial intelligence, Question answering, Natural language processing, Object and Class. His work deals with themes such as Relation and Ambiguity, which intersect with Artificial intelligence. Mo Yu has researched Question answering in several fields, including Dependency, Theoretical computer science and Benchmark.
His studies in Natural language processing deal with areas such as Word and Translation language. His Object research is multidisciplinary, drawing on Segmentation, Machine learning, Feature and Synthetic data. His Feature research incorporates themes from Feature extraction and Pattern recognition.
This overview was generated by a machine learning system which analysed the scientist's body of work.
A Structured Self-Attentive Sentence Embedding.
Zhouhan Lin;Minwei Feng;Cicero Nogueira dos Santos;Mo Yu.
international conference on learning representations (2017)
Target-dependent Twitter Sentiment Classification
Long Jiang;Mo Yu;Ming Zhou;Xiaohua Liu.
meeting of the association for computational linguistics (2011)
Comparative Study of CNN and RNN for Natural Language Processing
Wenpeng Yin;Katharina Kann;Mo Yu;Hinrich Schütze.
arXiv: Computation and Language (2017)
Improving Lexical Embeddings with Semantic Knowledge
Mo Yu;Mark Dredze.
meeting of the association for computational linguistics (2014)
R³: Reinforced Ranker-Reader for Open-Domain Question Answering.
Shuohang Wang;Mo Yu;Xiaoxiao Guo;Zhiguo Wang.
national conference on artificial intelligence (2018)
Improved Neural Relation Detection for Knowledge Base Question Answering
Mo Yu;Wenpeng Yin;Kazi Saidul Hasan;Cícero Nogueira dos Santos.
meeting of the association for computational linguistics (2017)
Dilated Recurrent Neural Networks
Shiyu Chang;Yang Zhang;Wei Han;Mo Yu.
neural information processing systems (2017)
Image Super-Resolution via Dual-State Recurrent Networks
Wei Han;Shiyu Chang;Ding Liu;Mo Yu.
computer vision and pattern recognition (2018)
Diverse Few-Shot Text Classification with Multiple Metrics
Mo Yu;Xiaoxiao Guo;Jinfeng Yi;Shiyu Chang.
north american chapter of the association for computational linguistics (2018)
Simple Question Answering by Attentive Convolutional Neural Network
Wenpeng Yin;Mo Yu;Bing Xiang;Bowen Zhou.
international conference on computational linguistics (2016)