Simon Lacoste-Julien: D-Index & Metrics

The D-index (Discipline H-index) includes only papers and citation values within the examined discipline, in contrast to the general H-index, which counts publications across all disciplines.

Discipline                   D-index  Citations  Publications  World Ranking  National Ranking
Engineering and Technology   36       6,750      74            3283           137
Computer Science             32       5,620      95            9131           382
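The D-index is simply an h-index computed over the subset of papers tagged with one discipline. As an informal sketch (the paper records and discipline tags below are made-up examples, not data from this profile), it can be computed like this:

```python
def h_index(citations):
    """Largest h such that at least h papers have >= h citations each."""
    h = 0
    for rank, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h

def d_index(papers, discipline):
    """H-index restricted to papers tagged with the given discipline."""
    return h_index(
        [p["citations"] for p in papers if discipline in p["disciplines"]]
    )

# Hypothetical paper records for illustration only
papers = [
    {"citations": 676, "disciplines": {"Computer Science"}},
    {"citations": 339, "disciplines": {"Computer Science", "Engineering and Technology"}},
    {"citations": 2,   "disciplines": {"Engineering and Technology"}},
]
print(d_index(papers, "Computer Science"))  # 2
```

Restricting the citation list before ranking is the only difference from the general H-index, which is why the two discipline rows above can each differ from a scientist's overall h-index.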

Overview

What is he best known for?

The fields of study he is best known for:

  • Artificial intelligence
  • Statistics
  • Machine learning

His primary areas of study are Rate of convergence, Mathematical optimization, Artificial intelligence, Convexity and Machine learning. His Rate of convergence research draws on Polytope, Submodular set function, Subgradient method and Applied mathematics. Within Mathematical optimization, he connects Convergence, Segmentation, Structured prediction and Markov chain.

His work on Convergence addresses Algorithm, which in turn connects to Reproducing kernel Hilbert space, Flow network and Video tracking. His Artificial intelligence research, including Cluster analysis and Unsupervised learning, intersects with The Internet and Set. His Machine learning research incorporates elements of Training set and Robustness.

His most cited works include:

  • SAGA: A Fast Incremental Gradient Method With Support for Non-Strongly Convex Composite Objectives (521 citations)
  • DiscLDA: Discriminative Learning for Dimensionality Reduction and Classification (302 citations)
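The SAGA paper listed above introduces an incremental gradient method that stores one past gradient per summand to reduce variance. The following is a minimal sketch of that idea on a made-up least-squares problem; the toy data, step-size choice, and iteration count are illustrative assumptions, not details from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy finite-sum problem: min_x (1/n) * sum_i 0.5 * (a_i @ x - b_i)^2
n, d = 100, 5
A = rng.normal(size=(n, d))
b = A @ rng.normal(size=d)  # consistent system, so the optimum has zero residual

def grad_i(x, i):
    """Gradient of the i-th summand."""
    return (A[i] @ x - b[i]) * A[i]

x = np.zeros(d)
table = np.array([grad_i(x, i) for i in range(n)])  # one stored gradient per summand
avg = table.mean(axis=0)
step = 1.0 / (3 * np.max(np.sum(A**2, axis=1)))  # 1/(3L) with L = max_i ||a_i||^2

for _ in range(100 * n):
    i = rng.integers(n)
    g = grad_i(x, i)
    # SAGA step: variance-reduced, unbiased gradient estimate
    x = x - step * (g - table[i] + avg)
    avg += (g - table[i]) / n
    table[i] = g

print(np.linalg.norm(A @ x - b))  # small residual: x is near the least-squares solution
```

Unlike plain SGD, the correction term `g - table[i] + avg` keeps the estimate unbiased while its variance shrinks as the iterates converge, which is what allows a constant step size.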

What are the main themes of his work throughout his whole career to date?

Simon Lacoste-Julien mainly focuses on Artificial intelligence, Mathematical optimization, Algorithm, Rate of convergence and Machine learning. His Artificial intelligence study integrates Sequence and Pattern recognition. Within Mathematical optimization, his work on Subgradient method often relates to Quadratic equation, connecting several areas of interest.

His Algorithm research integrates Pairwise comparison, Moment and Reproducing kernel Hilbert space. His Rate of convergence research includes themes of Overhead, Gradient method, Asynchronous communication, Applied mathematics and Speedup. His Machine learning study combines Base, Adaptation, Robustness and Benchmark.

He most often published in these fields:

  • Artificial intelligence (36.62%)
  • Mathematical optimization (29.58%)
  • Algorithm (26.06%)

What were the highlights of his more recent work (between 2019 and 2021)?

  • Artificial intelligence (36.62%)
  • Machine learning (20.42%)
  • Applied mathematics (13.38%)

In recent papers he was focusing on the following fields of study:

Simon Lacoste-Julien mainly investigates Artificial intelligence, Machine learning, Applied mathematics, Convergence and Artificial neural network. His Artificial intelligence study draws on Causal model, Stochastic optimization and Categorical variable. His Machine learning work on Proxy is frequently linked to Variable, connecting diverse disciplines.

His Applied mathematics study of Bregman divergence is often connected to Convex optimization and Operator. His Convergence study includes themes of Mathematical optimization and Constraint, and his Mathematical optimization work combines Stochastic gradient descent and Constant.

Between 2019 and 2021, his most popular works were:

  • Stochastic Polyak Step-size for SGD: An Adaptive Learning Rate for Fast Convergence (21 citations)
  • A Closer Look at the Optimization Landscapes of Generative Adversarial Networks (18 citations)
  • A Tight and Unified Analysis of Gradient-Based Methods for a Whole Spectrum of Differentiable Games (14 citations)
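The first title above adapts the classical Polyak step size to SGD. As an informal illustration of the underlying rule, here is the full-batch Polyak step on a made-up least-squares problem with known optimal value f* = 0; the paper's actual method is stochastic and per-sample, so this sketch only shows the step-size formula itself:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem: f(x) = 0.5 * ||A x - b||^2 with a consistent system, so f* = 0
A = rng.normal(size=(50, 5))
b = A @ rng.normal(size=5)

def loss(x):
    r = A @ x - b
    return 0.5 * r @ r

def grad(x):
    return A.T @ (A @ x - b)

x = np.zeros(5)
f_star = 0.0  # optimal value, assumed known
for _ in range(500):
    g = grad(x)
    gnorm2 = g @ g
    if gnorm2 < 1e-18:
        break
    # Polyak step size: (f(x) - f*) / ||grad f(x)||^2
    x = x - (loss(x) - f_star) / gnorm2 * g

print(loss(x))
```

The appeal of the rule is that it adapts the learning rate to the current suboptimality gap without any tuning, at the price of needing (an estimate of) the optimal value f*.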

In his most recent research, the most cited papers focused on:

  • Artificial intelligence
  • Statistics
  • Machine learning

Simon Lacoste-Julien mainly focuses on Applied mathematics, Gradient descent, Bilinear interpolation, Convex optimization and Convergence. His Applied mathematics work connects with Convex function. His Gradient descent research brings together Real line, Adversarial machine learning, Complex plane and Strongly monotone.

His Bilinear interpolation study combines Class, Stationary point and Hamiltonian. His Convergence research spans Mathematical optimization, Stochastic gradient descent, Newton's method and Training set. His Interpolation research integrates Binary classification, Rate of convergence, the Broyden–Fletcher–Goldfarb–Shanno algorithm and Hessian matrix.

This overview was generated by a machine learning system which analysed the scientist’s body of work.

Best Publications

A closer look at memorization in deep networks

Devansh Arpit;Stanisław Jastrzębski;Nicolas Ballas;David Krueger.
International Conference on Machine Learning (2017)

703 Citations

SAGA: A Fast Incremental Gradient Method With Support for Non-Strongly Convex Composite Objectives

Aaron Defazio;Francis Bach;Simon Lacoste-Julien.
Neural Information Processing Systems (2014)

676 Citations

DiscLDA: Discriminative Learning for Dimensionality Reduction and Classification

Simon Lacoste-Julien;Fei Sha;Michael I. Jordan.
Neural Information Processing Systems (2008)

510 Citations

Block-Coordinate Frank-Wolfe Optimization for Structural SVMs

Simon Lacoste-Julien;Martin Jaggi;Mark Schmidt;Patrick Pletscher.
International Conference on Machine Learning (2013)

339 Citations

A simpler approach to obtaining an O(1/t) convergence rate for the projected stochastic subgradient method

Simon Lacoste-Julien;Mark W. Schmidt;Francis R. Bach.
arXiv: Learning (2012)

238 Citations


Best Scientists Citing Simon Lacoste-Julien

  • Peter Richtárik, King Abdullah University of Science and Technology (Publications: 78)
  • Francis Bach, École Normale Supérieure (Publications: 55)
  • Suvrit Sra, MIT (Publications: 38)
  • Tianbao Yang, Texas A&M University (Publications: 35)
  • Eric P. Xing, Carnegie Mellon University (Publications: 33)
  • Masashi Sugiyama, RIKEN (Publications: 30)
  • Jun Zhu, Tsinghua University (Publications: 26)
  • Tong Zhang, Hong Kong University of Science and Technology (Publications: 25)
  • Alejandro Ribeiro, University of Pennsylvania (Publications: 24)
  • Ali H. Sayed, École Polytechnique Fédérale de Lausanne (Publications: 21)
  • Ivan Laptev, French Institute for Research in Computer Science and Automation - INRIA (Publications: 20)
  • Michael I. Jordan, University of California, Berkeley (Publications: 19)
  • Josef Sivic, Czech Technical University in Prague (Publications: 19)
  • Dacheng Tao, University of Sydney (Publications: 18)
  • Yoshua Bengio, University of Montreal (Publications: 18)
  • Julien Mairal, Grenoble Alpes University (Publications: 16)
