Convex Optimization Theory (《凸最佳化理論》) is a book by Dimitri P. Bertsekas, published by Tsinghua University Press.
Basic Information
- Title: Convex Optimization Theory (凸最佳化理論)
- Author: Dimitri P. Bertsekas
- ISBN: 9787302237600
- Publisher: Tsinghua University Press
- Publication date: January 1, 2011
- Format: 16mo (16開)
Table of Contents
1. Basic Convexity Concepts
1.1. Linear Algebra and Real Analysis
1.1.1. Vectors and Matrices
1.1.2. Topological Properties
1.1.3. Square Matrices
1.1.4. Derivatives
1.2. Convex Sets and Functions
1.3. Convex and Affine Hulls
1.4. Relative Interior, Closure, and Continuity
1.5. Recession Cones
1.5.1. Nonemptiness of Intersections of Closed Sets
1.5.2. Closedness Under Linear Transformations
1.6. Notes, Sources, and Exercises
2. Convexity and Optimization
2.1. Global and Local Minima
2.2. The Projection Theorem
2.3. Directions of Recession and Existence of Optimal Solutions
2.3.1. Existence of Solutions of Convex Programs
2.3.2. Unbounded Optimal Solution Sets
2.3.3. Partial Minimization of Convex Functions
2.4. Hyperplanes
2.5. An Elementary Form of Duality
2.5.1. Nonvertical Hyperplanes
2.5.2. Min Common/Max Crossing Duality
2.6. Saddle Point and Minimax Theory
2.6.1. Min Common/Max Crossing Framework for Minimax
2.6.2. Minimax Theorems
2.6.3. Saddle Point Theorems
2.7. Notes, Sources, and Exercises
3. Polyhedral Convexity
3.1. Polar Cones
3.2. Polyhedral Cones and Polyhedral Sets
3.2.1. Farkas' Lemma and Minkowski-Weyl Theorem
3.2.2. Polyhedral Sets
3.2.3. Polyhedral Functions
3.3. Extreme Points
3.3.1. Extreme Points of Polyhedral Sets
3.4. Polyhedral Aspects of Optimization
3.4.1. Linear Programming
3.4.2. Integer Programming
3.5. Polyhedral Aspects of Duality
3.5.1. Polyhedral Proper Separation
3.5.2. Min Common/Max Crossing Duality
3.5.3. Minimax Theory Under Polyhedral Assumptions
3.5.4. A Nonlinear Version of Farkas' Lemma
3.5.5. Convex Programming
3.6. Notes, Sources, and Exercises
4. Subgradients and Constrained Optimization
4.1. Directional Derivatives
4.2. Subgradients and Subdifferentials
4.3. ε-Subgradients
4.4. Subgradients of Extended Real-Valued Functions
4.5. Directional Derivative of the Max Function
4.6. Conical Approximations
4.7. Optimality Conditions
4.8. Notes, Sources, and Exercises
5. Lagrange Multipliers
5.1. Introduction to Lagrange Multipliers
5.2. Enhanced Fritz John Optimality Conditions
5.3. Informative Lagrange Multipliers
5.3.1. Sensitivity
5.3.2. Alternative Lagrange Multipliers
5.4. Pseudonormality and Constraint Qualifications
5.5. Exact Penalty Functions
5.6. Using the Extended Representation
5.7. Extensions Under Convexity Assumptions
5.8. Notes, Sources, and Exercises
6. Lagrangian Duality
6.1. Geometric Multipliers
6.2. Duality Theory
6.3. Linear and Quadratic Programming Duality
6.4. Existence of Geometric Multipliers
6.4.1. Convex Cost Linear Constraints
6.4.2. Convex Cost Convex Constraints
6.5. Strong Duality and the Primal Function
6.5.1. Duality Gap and the Primal Function
6.5.2. Conditions for No Duality Gap
6.5.3. Subgradients of the Primal Function
6.5.4. Sensitivity Analysis
6.6. Fritz John Conditions when there is no Optimal Solution
6.6.1. Enhanced Fritz John Conditions
6.6.2. Informative Geometric Multipliers
6.7. Notes, Sources, and Exercises
7. Conjugate Duality
7.1. Conjugate Functions
7.2. Fenchel Duality Theorems
7.2.1. Connection of Fenchel Duality and Minimax Theory
7.2.2. Conic Duality
7.3. Exact Penalty Functions
7.4. Notes, Sources, and Exercises
8. Dual Computational Methods
8.1. Dual Derivatives and Subgradients
8.2. Subgradient Methods
8.2.1. Analysis of Subgradient Methods
8.2.2. Subgradient Methods with Randomization
8.3. Cutting Plane Methods
8.4. Ascent Methods
8.5. Notes, Sources, and Exercises
References
Index