Jerome H. Friedman's primary areas of study are artificial intelligence, statistics, machine learning, algorithms and data mining. Within artificial intelligence, his work connects covariates and regression, and his statistical research on regression analysis, optimal discriminant analysis and recursive partitioning frequently touches on the Bayes error rate and Bayes' theorem, bridging otherwise separate strands of work.
His machine learning studies regularly draw on related areas such as econometrics. His algorithmic work spans topics including best-bin-first and nearest neighbor search, while his data mining research takes in regularization, range searching, sets, elastic net regularization and applied mathematics.
Artificial intelligence, machine learning, statistics, mathematical optimization and algorithms also figure among his primary areas of study. In artificial intelligence he concentrates on regression and, in some cases, regression analysis, while his machine learning work addresses data mining with connections to feature selection.
His mathematical optimization research is multidisciplinary, incorporating smoothing and the lasso; within the lasso his focus is on coordinate descent and regularization. His algorithmic research brings together variables, sets and projection pursuit.
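To make the coordinate-descent view of the lasso concrete, here is a minimal Python sketch of cyclical coordinate descent with soft-thresholding. It is an illustration under simplifying assumptions (standardized columns, a single fixed penalty), not Friedman's own implementation; the function and variable names are invented for the example.

```python
import numpy as np

def soft_threshold(rho, lam):
    """Soft-thresholding operator, the closed-form solution of each 1-D lasso subproblem."""
    return np.sign(rho) * max(abs(rho) - lam, 0.0)

def lasso_coordinate_descent(X, y, lam, n_iter=100):
    """Cyclical coordinate descent for (1/2n)||y - X b||^2 + lam * ||b||_1.

    Assumes the columns of X are centered and scaled.
    """
    n, p = X.shape
    beta = np.zeros(p)
    col_scale = (X ** 2).sum(axis=0) / n  # equals 1 for standardized columns
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual that excludes feature j's current contribution
            r_j = y - X @ beta + X[:, j] * beta[j]
            rho = X[:, j] @ r_j / n
            beta[j] = soft_threshold(rho, lam) / col_scale[j]
    return beta
```

In practice this update is run over a decreasing grid of penalty values with warm starts, which is the strategy behind the regularization-path software listed among the publications below.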
He likewise focuses on artificial intelligence, machine learning, the lasso, applied mathematics and pattern recognition. His artificial intelligence research includes tree-based methods, probability distributions and regression, while his machine learning studies draw on statistical learning and inference.
His lasso research integrates algorithms, generalization, sets and asymptotic distributions. His work on generalized linear models, part of his broader applied mathematics research, is closely tied to R software packages, and his pattern recognition research draws on linear least squares, sparse regression, nonparametric regression and generalized additive models.
His main lines of investigation also include the lasso, artificial intelligence, machine learning, applied mathematics and statistical learning. His lasso work combines regularization, sets, linear regression and consistency, and his artificial intelligence studies integrate tree-based methods and regression.
His machine learning research incorporates sampling and decision rules, his applied mathematics work takes in generalization and elastic net regularization, and his studies connect inference with statistical learning.
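As a concrete illustration of the elastic net penalty mentioned above, the sketch below fits scikit-learn's ElasticNet to synthetic data. The dataset, penalty settings and variable names are invented for the example and do not reproduce any particular study.

```python
import numpy as np
from sklearn.linear_model import ElasticNet

# Synthetic sparse regression problem (illustrative only)
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))
true_coef = np.zeros(50)
true_coef[:5] = [3.0, -2.0, 1.5, 2.5, 4.0]   # only the first five features matter
y = X @ true_coef + rng.normal(scale=0.5, size=200)

# alpha scales the overall penalty; l1_ratio mixes the L1 (sparsity) and L2 (shrinkage) terms
model = ElasticNet(alpha=0.1, l1_ratio=0.7)
model.fit(X, y)
print("non-zero coefficients:", int(np.sum(model.coef_ != 0)))
```

Setting l1_ratio=1.0 recovers a pure lasso fit and l1_ratio=0.0 corresponds to ridge regression; the mixture is what lets the elastic net retain groups of correlated predictors together.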
This overview was generated by a machine learning system that analysed the scientist's body of work.
The Elements of Statistical Learning: Data Mining, Inference, and Prediction
Trevor Hastie;Robert J. Tibshirani;Jerome Friedman.
(2013)
Classification and Regression Trees
Leo Breiman;Jerome H. Friedman;Richard A. Olshen;Charles J. Stone.
(1984)
Greedy function approximation: A gradient boosting machine.
Jerome H. Friedman.
Annals of Statistics (2001)
The Elements of Statistical Learning
Trevor Hastie;Robert Tibshirani;Jerome H. Friedman.
(2001)
Regularization Paths for Generalized Linear Models via Coordinate Descent
Jerome Friedman;Trevor Hastie;Robert Tibshirani.
Journal of Statistical Software (2010)
Multivariate Adaptive Regression Splines
Jerome H. Friedman.
Annals of Statistics (1991)
Additive Logistic Regression: A Statistical View of Boosting
Jerome Friedman;Trevor Hastie;Robert Tibshirani.
Annals of Statistics (2000)
Stochastic gradient boosting
Jerome H. Friedman.
Computational Statistics & Data Analysis (2002)
Sparse inverse covariance estimation with the graphical lasso
Jerome Friedman;Trevor Hastie;Robert Tibshirani.
Biostatistics (2008)