2022 - Research.com Rising Star of Science Award
Yu Cheng spends much of his time researching Artificial intelligence, Artificial neural network, Machine learning, Benchmark and Pattern recognition, and he integrates Artificial intelligence and Code in his research. His Artificial neural network research incorporates elements of Circulant matrix, Fast Fourier transform and Time complexity, as sketched below.
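The connection among those last three topics is that a circulant weight matrix is fully determined by a single vector and can be applied using the Fast Fourier transform, cutting the time complexity of a matrix-vector product from O(n^2) to O(n log n). Below is a minimal NumPy sketch of that idea; it is illustrative only, not code from his publications, and the helper name circulant_matvec_fft is our own.

# Illustrative sketch (not from Yu Cheng's papers): multiplying by a circulant
# matrix reduces to circular convolution, which the FFT computes in O(n log n)
# instead of the O(n^2) cost of a dense matrix-vector product.
import numpy as np

def circulant_matvec_fft(c, x):
    # Product C @ x, where C is the circulant matrix whose first column is c;
    # it equals the circular convolution of c and x, computed here via FFT.
    return np.real(np.fft.ifft(np.fft.fft(c) * np.fft.fft(x)))

rng = np.random.default_rng(0)
n = 8
c = rng.standard_normal(n)
x = rng.standard_normal(n)

# Dense circulant matrix for comparison: column j is c cyclically shifted by j.
C = np.stack([np.roll(c, j) for j in range(n)], axis=1)
assert np.allclose(C @ x, circulant_matvec_fft(c, x))

This is the parameter-redundancy idea explored in the circulant-projections paper listed among his works below: a dense n-by-n layer is replaced by a single length-n vector.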
Many of his research projects under Machine learning are closely connected to Network architecture, Original data and Domain adaptation, tying the diverse disciplines of science together. The various areas that Yu Cheng examines in his Benchmark study include Anomaly detection, Data mining, Autoencoder and Kernel. His study of Segmentation, within the domain of Pattern recognition, overlaps with other disciplines such as Sequence modeling.
His primary areas of investigation include Artificial intelligence, Machine learning, Natural language processing, Pattern recognition and Computer vision. His work in Deep learning, Language model, Artificial neural network, Feature and Question answering all falls within Artificial intelligence research. His Artificial neural network research incorporates themes from Circulant matrix, Algorithm, Theoretical computer science and Convolutional neural network.
In Machine learning more generally, his work on Leverage is often linked to Metric, connecting many areas of study. His work in Natural language processing addresses subjects such as Representation, which are connected to disciplines such as Matching. His Pattern recognition study covers Facial recognition system, which intersects with Feature extraction.
His scientific interests lie mostly in Artificial intelligence, Natural language processing, Question answering, Language model and Machine learning. His research brings together the fields of Computer vision and Artificial intelligence. Yu Cheng has included themes like Relation and Coreference in his Natural language processing study.
His studies in Question answering integrate themes in fields like Theoretical computer science, Feature learning and Closed captioning. His Language model research spans a wide range of topics, including Compression, Transformer and Benchmark. Yu Cheng combines subjects such as Subspace topology and Robustness with his study of Machine learning.
Artificial intelligence, Question answering, Language model, Natural language processing and Inference are his primary areas of study. His Artificial intelligence research focuses on Machine learning and Benchmark. Yu Cheng interconnects Matching and Sentence in the investigation of issues within Question answering.
His Matching study combines topics in areas such as Theoretical computer science, Artificial neural network, Interpretability, Graph and Machine translation. His research investigates the connection between Language model and topics such as Transformer, which intersect with problems in Speech recognition, HERO and Closed captioning. His Natural language processing research focuses on Coreference and how it relates to Relation.
This overview was generated by a machine learning system which analysed the scientist’s body of work.
A Survey of Model Compression and Acceleration for Deep Neural Networks
Yu Cheng;Duo Wang;Pan Zhou;Tao Zhang.
arXiv: Learning (2017)
MMD GAN: Towards Deeper Understanding of Moment Matching Network
Chun-Liang Li;Wei-Cheng Chang;Yu Cheng;Yiming Yang.
Neural Information Processing Systems (2017)
Deep Model Based Domain Adaptation for Fault Diagnosis
Weining Lu;Bin Liang;Yu Cheng;Deshan Meng.
IEEE Transactions on Industrial Electronics (2017)
Patient Knowledge Distillation for BERT Model Compression
Siqi Sun;Yu Cheng;Zhe Gan;Jingjing Liu.
Empirical Methods in Natural Language Processing (2019)
UNITER: UNiversal Image-TExt Representation Learning
Yen-Chun Chen;Linjie Li;Licheng Yu;Ahmed El Kholy.
European Conference on Computer Vision (2020)
EnlightenGAN: Deep Light Enhancement Without Paired Supervision
Yifan Jiang;Xinyu Gong;Ding Liu;Yu Cheng.
IEEE Transactions on Image Processing (2021)
Model Compression and Acceleration for Deep Neural Networks: The Principles, Progress, and Challenges
Yu Cheng;Duo Wang;Pan Zhou;Tao Zhang.
IEEE Signal Processing Magazine (2018)
Risk Prediction with Electronic Health Records: A Deep Learning Approach.
Yu Cheng;Fei Wang;Ping Zhang;Jianying Hu.
SIAM International Conference on Data Mining (2016)
An Exploration of Parameter Redundancy in Deep Networks with Circulant Projections
Yu Cheng;Felix X. Yu;Rogerio S. Feris;Sanjiv Kumar.
International Conference on Computer Vision (2015)
Fully-Adaptive Feature Sharing in Multi-Task Networks with Applications in Person Attribute Classification
Yongxi Lu;Abhishek Kumar;Shuangfei Zhai;Yu Cheng.
Computer Vision and Pattern Recognition (2017)