H-Index & Metrics

Discipline: Computer Science
H-index: 54
Citations: 15,304
Publications: 337
World Ranking: 2325
National Ranking: 231

Overview

What is he best known for?

The fields of study he is best known for:

  • Artificial intelligence
  • Machine learning
  • Natural language processing

The scientist’s investigation covers issues in Artificial intelligence, Natural language processing, Word, Parsing and Machine learning. Sentence, Artificial neural network, Feature, Embedding and Representation are the core of his Artificial intelligence study. His Natural language processing research is mostly focused on the topic Sentiment analysis.

His Word studies deal with areas such as Context and Semantic similarity. His Parsing research spans a wide range of topics, including Syntax and Discriminative model. His Machine learning study combines topics in areas such as Data mining and Documentation.

His most cited works include:

  • Document Modeling with Gated Recurrent Neural Network for Sentiment Classification (829 citations)
  • Learning Sentiment-Specific Word Embedding for Twitter Sentiment Classification (637 citations)
  • Aspect Level Sentiment Classification with Deep Memory Network (379 citations)

What are the main themes of his work throughout his whole career to date?

His scientific interests lie mostly in Artificial intelligence, Natural language processing, Machine learning, Word and Sentence. As part of his studies on Artificial intelligence, Ting Liu frequently links adjacent subjects like Pattern recognition. His Natural language processing study combines topics in areas such as Context, Feature, Speech recognition, Semantics and SemEval.

In his Context studies, Ting Liu regularly links related areas like Representation. His research in Word tackles topics such as Paraphrase, which are related to areas like Information retrieval. His Parsing study frequently draws connections to adjacent fields such as Dependency.

He most often published in these fields:

  • Artificial intelligence (69.69%)
  • Natural language processing (51.31%)
  • Machine learning (15.51%)

What were the highlights of his more recent work (2019-2021)?

  • Artificial intelligence (69.69%)
  • Natural language processing (51.31%)
  • Graph (7.16%)

In recent papers he was focusing on the following fields of study:

His primary scientific interests are in Artificial intelligence, Natural language processing, Graph, Machine learning and Information retrieval. Ting Liu has included themes like Consistency and Dialog box in his Artificial intelligence study. His Natural language processing research spans a wide range of topics, including Style, Comprehension and Mechanism.

Ting Liu has researched Graph in several fields, including Paragraph, Utterance, Phrase and Logical form. His Machine learning study incorporates disciplines such as Domain, Space and Word. His research in Information retrieval intersects with topics in Scheme, Session and Action.

Between 2019 and 2021, his most popular works were:

  • CodeBERT: A Pre-Trained Model for Programming and Natural Languages (47 citations)
  • Dynamic Fusion Network for Multi-Domain End-to-end Task-Oriented Dialog (15 citations)
  • From static to dynamic word representations: a survey (13 citations)

In his most recent research, the most cited papers focused on:

  • Artificial intelligence
  • Machine learning
  • Programming language

His primary areas of study are Artificial intelligence, Natural language processing, Machine learning, Transformer and Persona. Many of his studies on Artificial intelligence involve topics that are commonly interrelated, such as Source code. His Parsing study, which is part of a larger body of work in Natural language processing, is frequently linked to Point, bridging the gap between disciplines.

His studies examine the connections between Machine learning and genetics, as well as issues in Training set with regard to Domain, Spoken language, Set and Supervised learning. His Transformer research integrates issues from Theoretical computer science, Graph neural networks, Adjacency matrix, Documentation and Natural language. His Word study combines topics from a wide range of disciplines, such as Artificial neural network, Polysemy, Feature selection and Pattern recognition.

This overview was generated by a machine learning system which analysed the scientist’s body of work.

Top Publications

Document Modeling with Gated Recurrent Neural Network for Sentiment Classification

Duyu Tang; Bing Qin; Ting Liu.
Empirical Methods in Natural Language Processing (2015)

1313 Citations

Learning Sentiment-Specific Word Embedding for Twitter Sentiment Classification

Duyu Tang; Furu Wei; Nan Yang; Ming Zhou.
Meeting of the Association for Computational Linguistics (2014)

1182 Citations

LTP: A Chinese Language Technology Platform

Wanxiang Che; Zhenghua Li; Ting Liu.
International Conference on Computational Linguistics (2010)

564 Citations

Deep learning for event-driven stock prediction

Xiao Ding; Yue Zhang; Ting Liu; Junwen Duan.
International Conference on Artificial Intelligence (2015)

518 Citations

Aspect Level Sentiment Classification with Deep Memory Network

Duyu Tang; Bing Qin; Ting Liu.
Empirical Methods in Natural Language Processing (2016)

473 Citations

Effective LSTMs for Target-Dependent Sentiment Classification

Duyu Tang; Bing Qin; Xiaocheng Feng; Ting Liu.
International Conference on Computational Linguistics (2016)

465 Citations

Computer-aided writing system and method with cross-language writing wizard

Ting Liu; Ming Zhou; Jian Wang.
(2001)

353 Citations

Learning Semantic Representations of Users and Products for Document Level Sentiment Classification

Duyu Tang; Bing Qin; Ting Liu.
International Joint Conference on Natural Language Processing (2015)

292 Citations

Learning Semantic Hierarchies via Word Embeddings

Ruiji Fu; Jiang Guo; Bing Qin; Wanxiang Che.
Meeting of the Association for Computational Linguistics (2014)

281 Citations

Coooolll: A Deep Learning System for Twitter Sentiment Classification

Duyu Tang; Furu Wei; Bing Qin; Ting Liu.
International Conference on Computational Linguistics (2014)

256 Citations

Profile was last updated on December 6th, 2021.
Research.com Ranking is based on data retrieved from the Microsoft Academic Graph (MAG).
The ranking h-index is inferred from publications deemed to belong to the considered discipline.
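
For reference, the h-index is the largest value h such that the author has at least h publications with h or more citations each; the h-index of 54 reported above is computed over all 337 Computer Science publications, not just the ten listed here. A minimal Python sketch of the calculation (the function name and the example call are illustrative, not Research.com's actual implementation):

    def h_index(citation_counts):
        # Sort descending so that the i-th entry (1-based) is the
        # i-th most-cited paper.
        counts = sorted(citation_counts, reverse=True)
        h = 0
        for i, citations in enumerate(counts, start=1):
            if citations >= i:
                h = i  # at least i papers each have >= i citations
            else:
                break
        return h

    # Example: the citation counts of the ten top publications listed above.
    print(h_index([1313, 1182, 564, 518, 473, 465, 353, 292, 281, 256]))  # -> 10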

Top Scientists Citing Ting Liu

  • Yue Zhang, Westlake University (Publications: 94)
  • Erik Cambria, Nanyang Technological University (Publications: 45)
  • Bing Liu, Peking University (Publications: 41)
  • Guodong Zhou, Soochow University (Publications: 41)
  • Hai Zhao, Shanghai Jiao Tong University (Publications: 41)
  • Xuanjing Huang, Fudan University (Publications: 38)
  • Heng Ji, University of Illinois at Urbana-Champaign (Publications: 33)
  • Wanxiang Che, Harbin Institute of Technology (Publications: 33)
  • Xipeng Qiu, Fudan University (Publications: 32)
  • Maosong Sun, Tsinghua University (Publications: 30)
  • Jiawei Han, University of Illinois at Urbana-Champaign (Publications: 30)
  • Noah A. Smith, University of Washington (Publications: 27)
  • Ivan Vulić, University of Cambridge (Publications: 26)
  • Thomas R. Gruber, Apple (United States) (Publications: 25)
  • Chris Dyer, Google (United States) (Publications: 25)
