2022 - Research.com Rising Star of Science Award
Tengyu Ma mostly deals with Artificial intelligence, Artificial neural network, Algorithm, Gradient descent and Machine learning. His work deals with themes such as Residual and Natural language processing, which intersect with Artificial intelligence. His Artificial neural network study incorporates themes from Paraphrase, Generative grammar, Generative model and Mathematical optimization.
His Generative model research integrates issues from Smoothing, Speech recognition, Sentence, Unsupervised learning and Transfer of learning. His work on Communication complexity and Data point, as part of his general Algorithm study, frequently connects to Function and Upper and lower bounds. Tengyu Ma has included themes like Time complexity, Stochastic gradient descent and Conjecture in his Gradient descent study.
His primary areas of study are Artificial neural network, Artificial intelligence, Mathematical optimization, Algorithm and Machine learning. His Gradient descent and Stochastic gradient descent studies, in the realm of Artificial neural network, connect with subjects such as Polynomial and Metric. His research investigates the connection between Gradient descent and areas like Combinatorics, which intersect with concerns in Regret.
His work in Artificial intelligence addresses subjects such as Pattern recognition, which is connected to disciplines such as Noise reduction. His study spans a wide range of topics, including Normalization, Estimator, Latent variable and Inference. His studies on Word deal with areas such as Theoretical computer science and Natural language processing.
The scientist’s investigation covers issues in Artificial intelligence, Artificial neural network, Regularization, Machine learning and Stochastic gradient descent. His Artificial intelligence study integrates concerns from other disciplines, such as Standard model and Pattern recognition. Tengyu Ma interconnects Margin and Combinatorics in his investigation of Artificial neural network.
His Regularization study combines topics in areas such as Sample size determination, Linear regression and Deep neural networks. He integrates issues of Data point and Conjugate gradient method in his study of Machine learning. His work in the field of Stochastic gradient descent brings together such families of science as Mathematical optimization, Leverage and Applied mathematics.
His primary scientific interests are in Regularization, Applied mathematics, Stochastic gradient descent, Upper and lower bounds and Algorithm. His Regularization research is multidisciplinary, incorporating perspectives in Artificial neural network and Gradient descent. His Artificial neural network study combines topics from a wide range of disciplines, such as Sample size determination and Linear regression.
The various areas that Tengyu Ma examines in his Applied mathematics study include Stability and Deep neural networks. His Stochastic gradient descent study combines topics in areas such as Covariance and Gaussian noise. Tengyu Ma focuses mostly on the field of Algorithm, narrowing it down to matters related to the MNIST database and, in some cases, Classifier and Sharpening.
This overview was generated by a machine learning system which analysed the scientist's body of work.
A Simple but Tough-to-Beat Baseline for Sentence Embeddings
Sanjeev Arora;Yingyu Liang;Tengyu Ma.
International Conference on Learning Representations (2017)
Matrix Completion has No Spurious Local Minimum
Rong Ge;Jason D. Lee;Tengyu Ma.
Neural Information Processing Systems (2016)
Generalization and Equilibrium in Generative Adversarial Nets (GANs)
Sanjeev Arora;Rong Ge;Yingyu Liang;Tengyu Ma.
International Conference on Machine Learning (2017)
Provable Bounds for Learning Some Deep Representations
Sanjeev Arora;Aditya Bhaskara;Rong Ge;Tengyu Ma.
International Conference on Machine Learning (2014)
A Latent Variable Model Approach to PMI-based Word Embeddings
Sanjeev Arora;Yuanzhi Li;Yingyu Liang;Tengyu Ma.
Transactions of the Association for Computational Linguistics (2016)
Learning Imbalanced Datasets with Label-Distribution-Aware Margin Loss
Kaidi Cao;Colin Wei;Adrien Gaidon;Nikos Arechiga;Tengyu Ma.
Neural Information Processing Systems (2019)
Identity Matters in Deep Learning
Moritz Hardt;Tengyu Ma.
International Conference on Learning Representations (2017)
Finding approximate local minima faster than gradient descent
Naman Agarwal;Zeyuan Allen-Zhu;Brian Bullins;Elad Hazan;Tengyu Ma.
Symposium on the Theory of Computing (2017)
Algorithmic Regularization in Over-parameterized Matrix Sensing and Neural Networks with Quadratic Activations
Yuanzhi Li;Tengyu Ma;Hongyang Zhang.
Conference on Learning Theory (2018)
Fixup Initialization: Residual Learning Without Normalization
Hongyi Zhang;Yann N. Dauphin;Tengyu Ma.
International Conference on Learning Representations (2019)