His scientific interests lie mostly in Curse of dimensionality, Tensor, Applied mathematics, Density matrix renormalization group and Rank. The areas that Ivan V. Oseledets examines in his Curse of dimensionality study include Eigenvalues and eigenvectors, Matrix, Algorithm, Singular value decomposition, and Computation. To a larger extent, he studies Algebra with the aim of understanding Singular value decomposition.
His study of Tensor integrates issues of Backpropagation, Representation, Dimensionality reduction, Convolutional neural network and Least squares. His research on Applied mathematics frequently links to adjacent areas such as Tangent space. Ivan V. Oseledets has researched Rank in several fields, including Pure mathematics, Combinatorics, Iterative method, Preconditioner and Numerical analysis.
His primary areas of study are Applied mathematics, Algorithm, Artificial intelligence, Rank and Artificial neural network. The study incorporates disciplines such as Tensor, Matrix product state, Discretization, Low-rank approximation and Computation in addition to Applied mathematics. Ivan V. Oseledets combines subjects such as Discrete mathematics and Eigenvalues and eigenvectors with his study of Computation.
Ivan V. Oseledets mostly deals with Artificial intelligence, Artificial neural network, Algorithm, Applied mathematics and Deep learning. His Artificial intelligence research incorporates themes from Machine learning, Recommender system and Euclidean geometry. His Artificial neural network study combines topics in areas such as Convolutional neural network, Pattern recognition and Data mining.
His Algorithm research incorporates elements of Optimal design, Compression and Interpolation. His Applied mathematics research includes elements of Discretization, Stochastic gradient descent, Partial differential equation and Rank. He works mostly in the field of Simple, narrowing it down to concerns involving Combinatorics and, occasionally, Ring, Tensor and Computation.
His primary areas of investigation include Rank, Theoretical computer science, Artificial intelligence, Artificial neural network and Tensor. Ivan V. Oseledets has included themes like Tensor, Combinatorics, Ring, Compression and Computation in his Rank study. His work deals with themes such as Permutation, Numerical stability, Degeneracy, Algorithm and Convolutional neural network, which intersect with Compression.
His Theoretical computer science research is multidisciplinary, incorporating elements of Adversarial system and Multiple image. Ivan V. Oseledets frequently studies issues relating to Tucker decomposition and Artificial intelligence. His study in Tensor is interdisciplinary in nature, drawing from Approximation algorithm, Matrix decomposition, Iterative method, Sparse matrix and Dimensionality reduction.
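Several of the publications listed below build on a single linear-algebra primitive: the truncated singular value decomposition, which gives the best low-rank approximation of a matrix and, applied to tensor unfoldings, underlies the tensor-train (TT) format. A minimal Python/NumPy sketch of that primitive (the function name, sizes, and tolerances here are illustrative, not taken from the cited papers):

```python
import numpy as np

def truncated_svd(A, r):
    """Best rank-r approximation of A in the Frobenius norm
    (Eckart-Young theorem): keep the r largest singular triplets."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U[:, :r] * s[:r] @ Vt[:r, :]

rng = np.random.default_rng(0)
# A matrix with rank-5 structure plus tiny noise, so a rank-5
# truncation should recover it almost exactly.
A_noisy = (rng.standard_normal((100, 5)) @ rng.standard_normal((5, 100))
           + 1e-6 * rng.standard_normal((100, 100)))
A5 = truncated_svd(A_noisy, 5)
err = np.linalg.norm(A_noisy - A5) / np.linalg.norm(A_noisy)
print(err)  # tiny: only the noise beyond rank 5 is discarded
```

The TT-format generalizes this idea to d-dimensional arrays by compressing a sequence of unfoldings, which is what lets it sidestep the curse of dimensionality discussed above.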
This overview was generated by a machine learning system which analysed the scientist’s body of work.
Tensor-Train Decomposition
I. V. Oseledets.
SIAM Journal on Scientific Computing (2011)
Breaking the Curse of Dimensionality, Or How to Use SVD in Many Dimensions
I. V. Oseledets;E. E. Tyrtyshnikov.
SIAM Journal on Scientific Computing (2009)
TT-cross approximation for multidimensional arrays
Ivan Oseledets;Eugene Tyrtyshnikov.
Linear Algebra and its Applications (2010)
Speeding-up Convolutional Neural Networks Using Fine-tuned CP-Decomposition
Vadim Lebedev;Yaroslav Ganin;Maksim Rakhuba;Ivan Oseledets.
International Conference on Learning Representations (2015)
Unifying time evolution and optimization with matrix product states
Jutho Haegeman;Christian Lubich;Ivan Oseledets;Bart Vandereycken.
Physical Review B (2016)
Tensor Networks for Dimensionality Reduction and Large-scale Optimization: Part 1 Low-Rank Tensor Decompositions
Andrzej Cichocki;Namgil Lee;Ivan Oseledets;Anh-Huy Phan.
How to find a good submatrix
S. Goreinov;I. Oseledets;D. Savostyanov;E. Tyrtyshnikov.
Matrix Methods: Theory, Algorithms, Applications (2010)
Low-Rank Tensor Networks for Dimensionality Reduction and Large-Scale Optimization Problems: Perspectives and Challenges PART 1.
Andrzej Cichocki;Namgil Lee;Ivan V. Oseledets;Anh Huy Phan.
arXiv: Numerical Analysis (2016)
Tucker Dimensionality Reduction of Three-Dimensional Arrays in Linear Time
I. V. Oseledets;D. V. Savostianov;E. E. Tyrtyshnikov.
SIAM Journal on Matrix Analysis and Applications (2008)
Solution of Linear Systems and Matrix Inversion in the TT-Format
Ivan V. Oseledets;Sergey V. Dolgov.
SIAM Journal on Scientific Computing (2012)