Nan Duan mainly investigates Artificial intelligence, Question answering, Natural language processing, Information retrieval and Recurrent neural networks. Many of his studies in Artificial intelligence also bear on Information flow. His Question answering research takes in themes such as DUAL, Task analysis, Knowledge extraction, Scheme and Visualization.
His Natural language processing research incorporates themes from Commonsense reasoning, Set and Transformer. His Information retrieval work covers Relation, Natural language and Chatbot. His Recurrent neural network research is multidisciplinary, drawing on Question generation, Probabilistic logic and Leverage.
His primary scientific interests are Artificial intelligence, Natural language processing, Question answering, Information retrieval and Natural language. In his work, Artificial intelligence is frequently linked with Machine learning. His Parsing research within Natural language processing is frequently connected to Encoder, bridging otherwise separate lines of study.
His Parsing work combines topics from a wide range of disciplines, such as Artificial neural networks, Multi-task learning, SQL and Logical form. His Question answering study brings in Question generation, Leverage and Knowledge base. He has researched Benchmark in several fields, including Natural language understanding, Set and Task.
His primary areas of study are Artificial intelligence, Natural language processing, Benchmark, Machine learning and Transformer. His research on Natural language and Question answering also extends into Encoder. His Question answering work brings together Dependency grammar, Information flow, Graph-based methods and Knowledge base.
His Natural language processing study integrates concerns from other disciplines, such as Machine reading, Comprehension and Generative grammar. Within Machine learning, his work on Overfitting is often linked to Base and Scale, combining diverse domains of study. His Transformer research includes themes of Language model, Rewriting, Layer and Continuous embedding.
Nan Duan mostly works on Artificial intelligence, Natural language processing, Machine learning, Transformer and Benchmark. Many of his research projects in Artificial intelligence are closely connected to Adapter, tying diverse disciplines of science together. His Question answering studies also deal with Inference, Graph-based methods, Commonsense knowledge, Knowledge base and Information flow.
His work on Natural language understanding within Natural language processing is frequently linked to Paragraph, bridging the gap between disciplines. His Machine learning research draws on n-gram models and Automatic summarization. His Transformer study is interwoven with Language model, Documentation and Natural language.
This overview was generated by a machine learning system which analysed the scientist's body of work.
Unicoder-VL: A Universal Encoder for Vision and Language by Cross-Modal Pre-Training
Gen Li;Nan Duan;Yuejian Fang;Ming Gong.
National Conference on Artificial Intelligence (2020)
CodeBERT: A Pre-Trained Model for Programming and Natural Languages
Zhangyin Feng;Daya Guo;Duyu Tang;Nan Duan.
Empirical Methods in Natural Language Processing (2020)
Question Answering and Question Generation as Dual Tasks
Duyu Tang;Nan Duan;Tao Qin;Zhao Yan.
arXiv: Computation and Language (2017)
Question Generation for Question Answering
Nan Duan;Duyu Tang;Peng Chen;Ming Zhou.
Empirical Methods in Natural Language Processing (2017)
Unicoder: A Universal Language Encoder by Pre-training with Multiple Cross-lingual Tasks
Haoyang Huang;Yaobo Liang;Nan Duan;Ming Gong.
Empirical Methods in Natural Language Processing (2019)
Knowledge-Based Question Answering as Machine Translation
Junwei Bao;Nan Duan;Ming Zhou;Tiejun Zhao.
Meeting of the Association for Computational Linguistics (2014)
K-Adapter: Infusing Knowledge into Pre-Trained Models with Adapters
Ruize Wang;Duyu Tang;Nan Duan;Zhongyu Wei.
arXiv: Computation and Language (2020)
Unicoder-VL: A Universal Encoder for Vision and Language by Cross-modal Pre-training
Gen Li;Nan Duan;Yuejian Fang;Ming Gong.
arXiv: Computer Vision and Pattern Recognition (2019)
Constraint-Based Question Answering with Knowledge Graph
Junwei Bao;Nan Duan;Zhao Yan;Ming Zhou.
International Conference on Computational Linguistics (2016)
XGLUE: A New Benchmark Dataset for Cross-lingual Pre-training, Understanding and Generation
Yaobo Liang;Nan Duan;Yeyun Gong;Ning Wu.
Empirical Methods in Natural Language Processing (2020)
Sinovation Ventures
Fudan University
Microsoft (United States)
Harbin Institute of Technology
Fudan University
University of Science and Technology of China
Beihang University
Harbin Institute of Technology
Microsoft (United States)
Beijing University of Posts and Telecommunications