Nonlinear Programming, 3rd Edition
ISBN: 978-1-886529-05-2
Description
This is a thoroughly rewritten version of the 1999 2nd edition of our best-selling nonlinear programming book. New material was included, some of the old material was discarded, and a large portion of the remainder was reorganized or revised. The number of pages has increased by about 100.
The book provides a comprehensive and accessible presentation of algorithms for solving continuous optimization problems. It relies on rigorous mathematical analysis, but also aims at an intuitive exposition that makes use of visualization where possible. It places particular emphasis on modern developments, and their widespread applications in fields such as large-scale resource allocation problems, signal processing, and machine learning.
The 3rd edition brings the book in closer harmony with the companion works Convex Optimization Theory (Athena Scientific, 2009), Convex Optimization Algorithms (Athena Scientific, 2015), Convex Analysis and Optimization (Athena Scientific, 2003), and Network Optimization (Athena Scientific, 1998).
These works are complementary in that they deal primarily with convex, possibly nondifferentiable, optimization problems and rely on convex analysis. By contrast, the nonlinear programming book focuses primarily on analytical and computational methods for possibly nonconvex differentiable problems. It relies mainly on calculus and variational analysis, yet it still contains a detailed presentation of duality theory and its uses for both convex and nonconvex problems.
Among its special features, the book:
- Provides extensive coverage of iterative optimization methods within a unifying framework
- Covers in depth duality theory from both a variational and a geometric point of view
- Provides a detailed treatment of interior point methods for linear programming
- Includes much new material on a number of topics, such as proximal algorithms, alternating direction methods of multipliers, and conic programming
- Focuses on large-scale optimization topics of much current interest, such as first-order methods, incremental methods, and distributed asynchronous computation, and their applications in machine learning, signal processing, neural network training, and big data problems (a small illustrative sketch of such a method follows this list)
- Includes a large number of examples and exercises
- Was developed through extensive classroom use in first-year graduate courses
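As a rough illustration of the first-order and proximal methods listed above, here is a minimal proximal-gradient (ISTA) iteration for the lasso problem, written in Python. This sketch is not taken from the book; the synthetic data, regularization weight, step size rule, and iteration count are all illustrative assumptions.

```python
import numpy as np

# Minimal proximal-gradient (ISTA) sketch for the lasso problem
#   minimize_x  (1/2) * ||A x - b||^2 + lam * ||x||_1
# Illustrative only: the random data, lam, step size, and iteration
# count are assumptions, not material from the book.

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (componentwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, b, lam, num_iters=500):
    # Step size 1/L, where L = ||A||_2^2 is the Lipschitz constant of
    # the gradient of the smooth (least-squares) term.
    L = np.linalg.norm(A, 2) ** 2
    step = 1.0 / L
    x = np.zeros(A.shape[1])
    for _ in range(num_iters):
        grad = A.T @ (A @ x - b)                         # gradient of smooth term
        x = soft_threshold(x - step * grad, step * lam)  # proximal (shrinkage) step
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 20))
    x_true = np.zeros(20)
    x_true[:3] = [1.0, -2.0, 0.5]                        # sparse ground truth
    b = A @ x_true + 0.01 * rng.standard_normal(50)
    print(np.round(ista(A, b, lam=0.1), 2))
```

The same gradient-plus-proximal-step pattern underlies many of the large-scale first-order methods mentioned in the features above; only the smooth term, the nonsmooth regularizer, and the step size rule change from problem to problem.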