Angelia Nedic mainly investigates Mathematical optimization, Subgradient method, Convex optimization, Convergence and Distributed algorithm. Her Mathematical optimization work draws on Rate of convergence and Algorithm, while her Subgradient method research brings together Consensus, Slater's condition and Duality. Her Convex optimization study spans Convex function, Convergence of random variables, Stochastic optimization and Sequence, and her Convergence research also relies on Function and Linear programming. She further investigates the connection between Distributed algorithm and Algorithm design, with particular attention to Average consensus.
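To make the Subgradient method and Consensus themes concrete, the sketch below shows a consensus-based distributed subgradient update of the kind studied in "Distributed Subgradient Methods for Multi-Agent Optimization" (listed further down): each agent mixes its neighbors' iterates through a doubly stochastic weight matrix and then steps along a subgradient of its private cost. This is only a minimal illustration; the ring network, the objective f(x) = sum_i |x - b_i| and the constant step size are assumptions made for the demo, not details taken from the paper.

```python
import numpy as np

# Hypothetical setup: n agents cooperatively minimize f(x) = sum_i |x - b_i|
# over a scalar x, where agent i only knows its own term f_i(x) = |x - b_i|.
rng = np.random.default_rng(0)
n = 5
b = rng.normal(size=n)                  # private data held by each agent

# Doubly stochastic mixing matrix for a ring network (equal 1/3 weights).
W = np.zeros((n, n))
for i in range(n):
    W[i, (i - 1) % n] = W[i, (i + 1) % n] = 1.0 / 3.0
    W[i, i] = 1.0 / 3.0

def subgrad(i, x):
    """A subgradient of f_i(x) = |x - b_i|."""
    return np.sign(x - b[i])

x = np.zeros(n)                         # one scalar iterate per agent
alpha = 0.05                            # constant step size (illustrative)
for _ in range(2000):
    g = np.array([subgrad(i, x[i]) for i in range(n)])
    x = W @ x - alpha * g               # mix with neighbors, then step

print("agent iterates:", np.round(x, 3))
print("a minimizer of sum_i |x - b_i| is the median of b:", round(np.median(b), 3))
```

With a constant step size the agents only reach an O(alpha) neighborhood of the optimum; diminishing step sizes, as analyzed in the papers listed below, drive the error to zero.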
Mathematical optimization, Convex function, Rate of convergence, Convex optimization and Algorithm are her primary areas of study. Angelia Nedic brings Distributed algorithm and Convergence into her Mathematical optimization work, and her study of Convex function also encompasses Theoretical computer science, Hessian matrix, Directed graph, Combinatorics and Applied mathematics. Her Convex optimization research incorporates Generalization, Bounded function and Logarithm, while her work on Random projection, within the broader area of Algorithm, is frequently connected to Constraint, bridging otherwise separate lines of work.
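Since Directed graph appears prominently here, the following sketch illustrates the push-sum averaging primitive that underlies consensus and optimization methods over (time-varying) directed graphs, such as "Distributed optimization over time-varying directed graphs" listed below. Each node splits a "value" and a "weight" variable equally among itself and its out-neighbors, and the ratio of the two converges to the network-wide average. The graph, the node values and the iteration count are invented for this example.

```python
import numpy as np

# Hypothetical fixed directed graph on 4 nodes; (j, i) means node j sends to node i.
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
n = 4
values = np.array([1.0, 4.0, 2.0, 7.0])     # private value held by each node

# Column-stochastic mixing matrix: column j splits its mass equally among
# node j itself and the nodes it sends to (its out-neighbors).
A = np.zeros((n, n))
receivers = {j: [j] for j in range(n)}      # every node keeps a self-loop
for j, i in edges:
    receivers[j].append(i)
for j, recv in receivers.items():
    for i in recv:
        A[i, j] = 1.0 / len(recv)

x = values.copy()                           # "value" variable
y = np.ones(n)                              # "weight" variable
for _ in range(100):
    x, y = A @ x, A @ y                     # push-sum mixing step

print("push-sum ratios x/y:", np.round(x / y, 6))
print("true network average:", values.mean())
```

Because A is column stochastic, the totals of x and y are preserved, and on a strongly connected graph every ratio x_i/y_i converges to the average of the initial values; subgradient-push methods interleave this primitive with local subgradient steps.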
Her primary areas of study are Mathematical optimization, Convex function, Convex optimization, Rate of convergence and Optimization problem. Her Mathematical optimization research draws on Resource allocation, Convergence, Node, Distributed algorithm and Monotonic function, and her study of Convergence connects with Multi-agent system and Shared resource. Her research on Convex function intersects with Sequence, Gradient method, Applied mathematics, Function and Linear programming. Her Convex optimization work integrates concerns from Algorithm, Logarithm and Numerical analysis, and her study of Rate of convergence touches on Acceleration and its connections with Curvature.
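On the Rate of convergence theme, geometric (linear) rates for distributed optimization of smooth strongly convex objectives are obtained by gradient-tracking schemes such as the one in "Achieving Geometric Convergence for Distributed Optimization Over Time-Varying Graphs" (listed below). The sketch that follows runs a DIGing-style update on a toy quadratic problem over a fixed ring; the problem data, mixing weights and step size are illustrative assumptions rather than values from the paper.

```python
import numpy as np

# Toy problem: agent i holds f_i(x) = 0.5 * (x - c_i)**2, so the global
# minimizer of sum_i f_i is the mean of the c_i.
rng = np.random.default_rng(1)
n = 5
c = rng.normal(size=n)
x_star = c.mean()

# Doubly stochastic mixing matrix for a ring network (equal 1/3 weights).
W = np.zeros((n, n))
for i in range(n):
    W[i, (i - 1) % n] = W[i, (i + 1) % n] = 1.0 / 3.0
    W[i, i] = 1.0 / 3.0

grad = lambda x: x - c                      # stacked local gradients

x = np.zeros(n)
y = grad(x)                                 # gradient-tracking variable
alpha = 0.1                                 # constant step size (illustrative)
for _ in range(500):
    x_new = W @ x - alpha * y               # mix, then step along the tracked gradient
    y = W @ y + grad(x_new) - grad(x)       # track the average gradient
    x = x_new

print("max distance to the optimum:", np.abs(x - x_star).max())
```

The tracking variable y maintains a running estimate of the average gradient, which is what allows a constant step size to give exact, geometrically fast convergence instead of the O(alpha) neighborhood reached by plain distributed (sub)gradient methods.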
Her scientific interests lie mostly in Convex optimization, Convex function, Mathematical optimization, Algorithm and Function. Her Convex optimization work examines its relationship with Numerical analysis, and her Mathematical optimization research incorporates Node and Rate of convergence. Angelia Nedic has studied Algorithm in connection with Distributed algorithm, Dual, Measure, Finite set and Variational inequality. Her Function research incorporates Iterated function, Stochastic optimization and Constant, and her work on Optimization problem draws on Convergence and Multi-agent system.
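The Stochastic optimization theme is often expressed through projected stochastic subgradient iterations, as in "Distributed Stochastic Subgradient Projection Algorithms for Convex Optimization" listed below. A single-agent toy version is sketched here: each step samples one noisy subgradient, takes a diminishing-step move, and projects back onto the constraint set. The distribution, constraint box and step-size rule are made-up choices for illustration.

```python
import numpy as np

# Toy problem: minimize E[|x - Z|] over the box [-1, 1], with Z ~ N(0.3, 1).
# The unconstrained minimizer is the median of Z, i.e. 0.3, which lies in the box.
rng = np.random.default_rng(2)
project = lambda x: float(np.clip(x, -1.0, 1.0))  # Euclidean projection onto [-1, 1]

x, x_avg = 0.0, 0.0
for k in range(1, 20001):
    z = rng.normal(0.3, 1.0)                # one sample per iteration
    g = np.sign(x - z)                      # stochastic subgradient of |x - z|
    x = project(x - g / np.sqrt(k))         # diminishing step size 1/sqrt(k)
    x_avg += (x - x_avg) / k                # running average of the iterates

print("averaged iterate:", round(x_avg, 3), "(the optimum is the median, 0.3)")
```

Averaging the iterates is the classical way to obtain the O(1/sqrt(k)) rate on the expected objective error for stochastic subgradient methods.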
This overview was generated by a machine learning system which analysed the scientist’s body of work.
Distributed Subgradient Methods for Multi-Agent Optimization
A. Nedic;A. Ozdaglar.
IEEE Transactions on Automatic Control (2009)
Convex Analysis and Optimization
Dimitri P. Bertsekas;Angelia Nedić;Asuman E. Ozdaglar.
Athena Scientific (2003)
Constrained Consensus and Optimization in Multi-Agent Networks
A. Nedic;A. Ozdaglar;P.A. Parrilo.
IEEE Transactions on Automatic Control (2010)
Distributed optimization over time-varying directed graphs
Angelia Nedic;Alex Olshevsky.
IEEE Conference on Decision and Control (2013)
Distributed Stochastic Subgradient Projection Algorithms for Convex Optimization
S. Sundhar Ram;Angelia Nedic;Venugopal V. Veeravalli.
Journal of Optimization Theory and Applications (2010)
On distributed averaging algorithms and quantization effects
A. Nedic;A. Olshevsky;A. Ozdaglar;J.N. Tsitsiklis.
IEEE Conference on Decision and Control (2008)
Incremental Subgradient Methods for Nondifferentiable Optimization
Angelia Nedic;Dimitri P. Bertsekas.
SIAM Journal on Optimization (2001)
Achieving Geometric Convergence for Distributed Optimization Over Time-Varying Graphs
Angelia Nedić;Alex Olshevsky;Wei Shi.
SIAM Journal on Optimization (2017)
Subgradient Methods for Saddle-Point Problems
Angelia Nedic;Asuman E. Ozdaglar.
Journal of Optimization Theory and Applications (2009)
Approximate Primal Solutions and Rate Analysis for Dual Subgradient Methods
Angelia Nedić;Asuman Ozdaglar.
SIAM Journal on Optimization (2008)