The scientist’s investigation covers issues in Artificial neural network, Algorithm, Artificial intelligence, Mars Exploration Program and Function. His work in the field of Artificial neural network brings together such families of science as Variation, Data parallelism and Computer engineering. His Algorithm study combines topics in areas such as Covariance function, Jacobian matrix and determinant, Convolutional neural network and Contextual image classification.
His Artificial intelligence research is multidisciplinary, incorporating elements of Machine learning and Computation. His Machine learning study combines topics from a wide range of disciplines, such as Sampling, Probabilistic logic, Density estimation and Inference. He interconnects Pixel, Optics and Remote sensing in the investigation of issues within the Mars Exploration Program.
His primary areas of investigation include Artificial intelligence, Artificial neural network, Algorithm, Machine learning and Gaussian process. His Artificial intelligence study frequently links to adjacent areas such as Pattern recognition. His Artificial neural network study combines topics in areas such as Regularization, Kernel and Task.
The concepts of his Algorithm study are interwoven with issues in Subspace topology, Stochastic gradient descent, Markov chain Monte Carlo, Jacobian matrix and determinant, and Function. His work in the fields of Unsupervised learning and Convolutional neural network overlaps with other areas such as Process. Jascha Sohl-Dickstein has included themes like Sampling, Probabilistic logic, Inference and Generative model in his Unsupervised learning study.
Jascha Sohl-Dickstein spends much of his time researching Artificial neural network, Algorithm, Artificial intelligence, Gaussian process and Machine learning. His study in the field of Gradient descent is also linked to topics like Kernel. His Algorithm research incorporates elements of Sampling and Markov chain Monte Carlo.
When carried out as part of a general Artificial intelligence research project, his work on Deep learning, Backpropagation and Recurrent neural network is frequently linked to work in Meta learning, thereby connecting diverse disciplines of study. He focuses mostly on the field of Deep learning, narrowing it down to topics relating to Statistical physics and, in certain cases, Linear model. The various areas that Jascha Sohl-Dickstein examines in his Machine learning study include Contextual image classification and Optimization problem.
His scientific interests lie mostly in Artificial neural network, Algorithm, Gaussian process, Deep learning and Artificial intelligence. Jascha Sohl-Dickstein is interested in Gradient descent, which is a branch of Artificial neural network research. His Algorithm study incorporates themes from Laplace transform, Activation function, Feedforward neural network and Nonlinear system.
His study of Deep learning draws on broader knowledge of Machine learning. His research in Machine learning focuses on subjects like Norm, which is connected to Backpropagation. His work often combines Artificial intelligence and Jamming studies.
Density estimation using Real NVP
Laurent Dinh;Jascha Sohl-Dickstein;Samy Bengio.
International Conference on Learning Representations (2016)
Unrolled Generative Adversarial Networks
Luke Metz;Ben Poole;David Pfau;Jascha Sohl-Dickstein.
International Conference on Learning Representations (2016)
Deep knowledge tracing
Chris Piech;Jonathan Bassen;Jonathan Huang;Surya Ganguli.
Neural Information Processing Systems (2015)
Stratigraphy and sedimentology of a dry to wet eolian depositional system, Burns formation, Meridiani Planum, Mars
J.P. Grotzinger;R.E. Arvidson;J.F. Bell;W. Calvin.
Earth and Planetary Science Letters (2005)
On the expressive power of deep neural networks
Maithra Raghu;Ben Poole;Jon M. Kleinberg;Surya Ganguli.
International Conference on Machine Learning (2017)
Deep Neural Networks as Gaussian Processes
Jaehoon Lee;Yasaman Bahri;Roman Novak;Samuel S. Schoenholz.
International Conference on Learning Representations (2018)
Mars Exploration Rover Athena Panoramic Camera (Pancam) investigation
J.F. Bell;S.W. Squyres;Kenneth E. Herkenhoff;J.N. Maki.
Journal of Geophysical Research (2003)
Exponential expressivity in deep neural networks through transient chaos
Ben Poole;Subhaneil Lahiri;Maithra Raghu;Jascha Sohl-Dickstein.
Neural Information Processing Systems (2016)
Wide Neural Networks of Any Depth Evolve as Linear Models Under Gradient Descent
Jaehoon Lee;Lechao Xiao;Samuel S Schoenholz;Yasaman Bahri.
Neural Information Processing Systems (2019)
Stanford University
Arizona State University
Cornell University
Johns Hopkins University Applied Physics Laboratory
University of California, Berkeley
Max Planck Society
California Institute of Technology
Google (United States)
Space Science Institute
University of California, Davis
LG Corporation (South Korea)
Doshisha University
Harvard University
Sapienza University of Rome
University of Montpellier
Georgia Institute of Technology
Thomas Jefferson University
University of Genoa
University of North Carolina at Chapel Hill
Charité - University Medicine Berlin
University of Queensland
Centre de Recherche en Neurosciences de Lyon
University of British Columbia
Queensland University of Technology
Université Paris Cité