2017 - SIAM Fellow: For fundamental contributions to continuous optimization theory, analysis, development of algorithms, and scientific applications.
His primary areas of investigation include Mathematical optimization, Convex optimization, Rate of convergence, Proximal Gradient Methods and Convex analysis. His Mathematical optimization research is multidisciplinary, incorporating elements of Bregman divergence, Algorithm, Convex function and Deblurring, and his broader work spans a wide range of topics, including Image processing and Minimization.
His study of Proximal Gradient Methods is interdisciplinary, drawing on Modes of convergence, Optimization problem, Proximal gradient methods for learning and Random coordinate descent. His Random coordinate descent research includes elements of Deconvolution and Inverse problem, while his Convex analysis research incorporates themes from Uniform convergence, Weak convergence, Subderivative and Applied mathematics.
His primary scientific interests are in Mathematical optimization, Convex optimization, Rate of convergence, Applied mathematics and Proximal Gradient Methods. His Mathematical optimization studies also incorporate disciplines such as Algorithm, Bounded function and Nonlinear programming, and in his Algorithm studies Marc Teboulle regularly ties in related areas like Deblurring.
Marc Teboulle has researched Convex optimization in several fields, including Linear matrix inequality, Quadratic programming and Duality. Under the umbrella topic of Rate of convergence, his research closely connects Characterization with Metric. His Proximal Gradient Methods research is multidisciplinary, incorporating perspectives from Proximal gradient methods for learning, Subgradient method and Random coordinate descent.
Marc Teboulle focuses on Mathematical optimization, Rate of convergence, Minimization, Applied mathematics and Lipschitz continuity. His work in Mathematical optimization brings together Nonlinear system and Proximal Gradient Methods, and his studies of Rate of convergence integrate themes from Fixed point, Iterated function, Metric, Sequence and Monotonic function.
His Minimization study incorporates themes from Subsequence, Vector-valued function and Convex set, and his Lipschitz continuity research integrates issues from Bregman divergence and Lemma. Within Bregman divergence, he concentrates largely on the problem of Convex function, connecting this research to questions surrounding Random coordinate descent.
His primary areas of study are Mathematical optimization, Minimization, Applied mathematics, Focus and Numerical analysis. His Mathematical optimization research is multidisciplinary, incorporating perspectives from Convex function, Proximal Gradient Methods and Feature, and he combines subjects such as Subderivative and Random coordinate descent with his study of Convex function.
His Proximal Gradient Methods research builds on a deeper knowledge of Convex optimization. His Numerical analysis study frequently links to related topics such as Rate of convergence, and his Rate of convergence study combines topics from disciplines such as Euclidean space and Combinatorics.
This overview was generated by a machine learning system which analysed the scientist's body of work.
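The first paper listed below introduces the fast iterative shrinkage-thresholding algorithm (FISTA), which augments the basic proximal gradient step with a Nesterov-type extrapolation, improving the worst-case convergence rate from O(1/k) to O(1/k^2). The following is a minimal sketch of the iteration applied to a LASSO-type problem; the data, dimensions, iteration count and regularization weight are hypothetical placeholders, not taken from any of the papers below.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1 (componentwise shrinkage).
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def fista(A, b, lam, n_iter=200):
    # FISTA applied to  min_x 0.5*||Ax - b||^2 + lam*||x||_1.
    L = np.linalg.norm(A, 2) ** 2       # Lipschitz constant of the smooth gradient
    x = np.zeros(A.shape[1])
    y = x.copy()                        # extrapolated point
    t = 1.0                             # momentum parameter
    for _ in range(n_iter):
        grad = A.T @ (A @ y - b)        # gradient of the smooth part at y
        x_next = soft_threshold(y - grad / L, lam / L)       # proximal gradient step
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0    # momentum update
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)     # extrapolation step
        x, t = x_next, t_next
    return x

# Hypothetical usage on synthetic sparse-recovery data:
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 100))
x_true = np.zeros(100)
x_true[:5] = 1.0
b = A @ x_true + 0.01 * rng.standard_normal(50)
x_hat = fista(A, b, lam=0.1)
```

The momentum sequence t and the extrapolated point y are what distinguish FISTA from the unaccelerated method: setting y = x_next at each iteration recovers plain iterative shrinkage-thresholding with its slower O(1/k) rate.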
A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
Amir Beck;Marc Teboulle.
SIAM Journal on Imaging Sciences (2009)
Fast Gradient-Based Algorithms for Constrained Total Variation Image Denoising and Deblurring Problems
Amir Beck;Marc Teboulle.
IEEE Transactions on Image Processing (2009)
Proximal alternating linearized minimization for nonconvex and nonsmooth problems
Jérôme Bolte;Shoham Sabach;Marc Teboulle.
Mathematical Programming (2014)
Mirror descent and nonlinear projected subgradient methods for convex optimization
Amir Beck;Marc Teboulle.
Operations Research Letters (2003)
Convergence Analysis of a Proximal-Like Minimization Algorithm Using Bregman Functions
Gong Chen;Marc Teboulle.
SIAM Journal on Optimization (1993)
Asymptotic cones and functions in optimization and variational inequalities
Alfred Auslender;Marc Teboulle.
(2002)
A proximal-based decomposition method for convex minimization problems
Gong Chen;Marc Teboulle.
Mathematical Programming (1994)
An Old-New Concept of Convex Risk Measures: The Optimized Certainty Equivalent
Aharon Ben-Tal;Marc Teboulle.
Mathematical Finance (2007)
Gradient-based algorithms with applications to signal-recovery problems
Amir Beck;Marc Teboulle.
Convex Optimization in Signal Processing and Communications (2009)
Interior Gradient and Proximal Methods for Convex and Conic Optimization
Alfred Auslender;Marc Teboulle.
SIAM Journal on Optimization (2006)
Tel Aviv University
Technion – Israel Institute of Technology
Instituto Nacional de Matemática Pura e Aplicada
University of Montpellier
University of British Columbia
Weizmann Institute of Science
University of Newcastle Australia
University of Chicago
Arizona State University
University of Pennsylvania
University of Alberta
Google (United States)
The Ohio State University
University of Messina
University of Technology Sydney
University of Manchester
University of Washington
University of Helsinki
University of North Carolina at Chapel Hill
University of Münster
University of Minnesota
KU Leuven
École Normale Supérieure
University of Haifa
Johns Hopkins University