Ming Yuan focuses on mathematical optimization, estimation, applied mathematics, model selection and the Lasso. His optimization work draws together regression analysis, tensor products of Hilbert spaces, feature selection and regularization, while his study of estimators combines nonparametric statistics, quantile methods, econometrics, minimax theory and algorithms.
His applied mathematics research connects least squares with coefficient matrices, linear models, the coefficient of determination and shrinkage estimation. Within the Lasso literature he pays particular attention to elastic net regularization, and his work also reaches into machine learning and, occasionally, broader artificial intelligence; the standard Lasso and elastic net criteria are sketched below for reference.
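As a point of reference, and with no claim that these are his specific formulations, the Lasso and elastic net estimators mentioned above are usually written as penalized least-squares criteria of the following generic textbook form (response $y$, design matrix $X$, tuning parameters $\lambda$, $\lambda_1$, $\lambda_2$):

```latex
% Generic Lasso and elastic net criteria (textbook form)
\hat{\beta}^{\mathrm{lasso}} = \arg\min_{\beta}\; \tfrac{1}{2}\|y - X\beta\|_2^2 + \lambda\|\beta\|_1 ,
\qquad
\hat{\beta}^{\mathrm{enet}} = \arg\min_{\beta}\; \tfrac{1}{2}\|y - X\beta\|_2^2 + \lambda_1\|\beta\|_1 + \lambda_2\|\beta\|_2^2 .
```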
He also works on mathematical optimization, applied mathematics, estimation, artificial intelligence and statistics. His optimization research involves regularization, Monte Carlo methods, reproducing kernel Hilbert spaces and the Lasso; within the Lasso literature he concentrates on feature selection, with connections to data mining.
His work on estimators covers nonparametric statistics, rates of convergence, estimation theory, algorithms and mean squared error; his artificial intelligence research draws on machine learning and pattern recognition; and he brings regression analysis and model selection together in the study of linear regression, where the grouped-variable criterion sketched below is a representative formulation.
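The grouped-variable selection problem referred to above, studied in the Yuan and Lin (2006) paper listed below, is commonly posed with a group-wise $\ell_2$ penalty. A standard form of the criterion, with predictors partitioned into groups $g = 1, \dots, G$ of sizes $p_g$, is:

```latex
% Group lasso criterion: X_g and beta_g are the design columns and coefficients of group g
\hat{\beta} = \arg\min_{\beta}\; \tfrac{1}{2}\Big\| y - \sum_{g=1}^{G} X_g \beta_g \Big\|_2^2
  + \lambda \sum_{g=1}^{G} \sqrt{p_g}\, \|\beta_g\|_2 .
```

The $\ell_2$ norm on each coefficient block makes entire groups of variables enter or leave the model together, which is what suits the criterion to model selection with grouped predictors.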
Ming Yuan has also investigated estimation, minimax theory, applied mathematics, algorithms and nanophotonics. His work on estimators is usually tied to multivariate statistics and the singular value decomposition, while his minimax research brings in structure, iterated functions, groups and dimension.
His applied mathematics research intersects with kernel embeddings of distributions, upper and lower bounds, tensors and dimensionality reduction. Within algorithms he focuses on inference and, on occasion, statistical inference, consistent estimation and matrix completion, and he frequently combines these questions with mathematical optimization; a generic singular-value-thresholding sketch follows.
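As a generic illustration of the singular value decomposition and matrix completion themes above, the following is a minimal sketch of singular value soft-thresholding, a standard building block in low-rank matrix estimation; it is not an implementation of any particular estimator of his, and the threshold and test matrix are arbitrary choices.

```python
import numpy as np

def svd_soft_threshold(A, tau):
    """Soft-threshold the singular values of A at level tau.

    This is the proximal operator of tau * (nuclear norm), widely used
    as a building block in low-rank matrix estimation and completion.
    """
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    s_shrunk = np.maximum(s - tau, 0.0)   # shrink small singular values to zero
    return (U * s_shrunk) @ Vt            # reassemble the shrunken matrix

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    signal = rng.normal(size=(50, 3)) @ rng.normal(size=(3, 40))   # rank-3 signal
    noisy = signal + 0.1 * rng.normal(size=signal.shape)           # add noise
    denoised = svd_soft_threshold(noisy, tau=2.0)
    print(np.linalg.matrix_rank(denoised, tol=1e-8))               # effective rank after shrinkage
```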
His primary areas of study are applied mathematics, minimax theory, tensors, estimation and algorithms. His applied mathematics research combines upper and lower bounds, regression and dimensionality reduction, and his work on upper and lower bounds takes in entropy estimation, entropy, efficient estimation, mean squared error and independent and identically distributed observations.
His minimax research incorporates subspace topology, intrinsic dimension, statistical models and convex sets. In the study of tensors he concentrates on gradient descent and, in certain cases, optimization problems and multilinear maps (a toy factored gradient-descent sketch follows), while his algorithmic work draws on norms and statistical inference.
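As a toy illustration of gradient descent on low-rank factorizations of the kind alluded to above (a sketch under arbitrary choices of rank, step size and iteration count, not his algorithm; tensor versions typically apply the same idea to unfoldings or CP factors):

```python
import numpy as np

def factored_gradient_descent(Y, rank=3, step=1e-3, iters=5000, seed=0):
    """Fit Y ~= U @ V.T by plain gradient descent on 0.5 * ||Y - U V^T||_F^2."""
    rng = np.random.default_rng(seed)
    m, n = Y.shape
    U = 0.1 * rng.normal(size=(m, rank))   # small random initialization
    V = 0.1 * rng.normal(size=(n, rank))
    for _ in range(iters):
        R = U @ V.T - Y          # residual
        grad_U = R @ V           # gradient of the loss with respect to U
        grad_V = R.T @ U         # gradient of the loss with respect to V
        U -= step * grad_U
        V -= step * grad_V
    return U, V

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    truth = rng.normal(size=(30, 3)) @ rng.normal(size=(3, 20))     # rank-3 target
    U, V = factored_gradient_descent(truth)
    print(np.linalg.norm(U @ V.T - truth) / np.linalg.norm(truth))  # relative error
```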
Model selection and estimation in regression with grouped variables
Ming Yuan;Yi Lin.
Journal of the Royal Statistical Society: Series B (Statistical Methodology) (2006)
Model selection and estimation in the Gaussian graphical model
Ming Yuan;Yi Lin.
Biometrika (2007)
High Dimensional Semiparametric Gaussian Copula Graphical Models
Han Liu;Fang Han;Ming Yuan;John D. Lafferty.
International Conference on Machine Learning (2012)
Composite quantile regression and the oracle model selection theory
Hui Zou;Ming Yuan.
Annals of Statistics (2008)
High Dimensional Inverse Covariance Matrix Estimation via Linear Programming
Ming Yuan.
Journal of Machine Learning Research (2010)
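The two graphical model entries above concern estimating a sparse inverse covariance (precision) matrix. A standard formulation, consistent with the penalized-likelihood approach studied in the 2007 Biometrika paper, maximizes the $\ell_1$-penalized Gaussian log-likelihood over positive definite matrices, given a sample covariance matrix $S$:

```latex
% l1-penalized Gaussian log-likelihood for sparse precision matrix estimation
\hat{\Omega} = \arg\max_{\Omega \succ 0}\; \log\det\Omega - \operatorname{tr}(S\Omega)
  - \lambda \sum_{i \neq j} |\Omega_{ij}| .
```

The 2010 Journal of Machine Learning Research entry instead approaches the same estimation problem column by column via linear programming.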
University of Pennsylvania
University of Wisconsin–Madison
University of Minnesota
Northwestern University
University of Wisconsin–Madison
Rutgers, The State University of New Jersey
Carnegie Mellon University
MIT
Georgia Institute of Technology
University of Wisconsin–Madison
University of California, Irvine
University of Pennsylvania
Philips (Finland)
Hong Kong Baptist University
Mount Vernon Hospital
Huazhong University of Science and Technology
National Institute of Genetics
University of Copenhagen
University of Lleida
University of Giessen
University of Virginia
University of Miami
University of Colorado Boulder
University of California, Los Angeles
King's College London
University of Geneva