Braun, Gabor

Conditional gradient methods: from core principles to AI applications. - Philadelphia: Society for Industrial and Applied Mathematics (SIAM), 2025. - ix, 195 p.: col. ill.; 25 cm; pbk. - (MOS-SIAM Series on Optimization).

Includes bibliography, glossary, and index.

Conditional Gradient Methods: From Core Principles to AI Applications offers a definitive, modern treatment of one of the most elegant and versatile algorithmic families in optimization: the Frank–Wolfe method and its many variants. Originally proposed in the 1950s, these projection-free techniques have seen a powerful resurgence and now play a central role in machine learning, signal processing, and large-scale data science. This comprehensive monograph guides readers from the foundations of constrained optimization into cutting-edge territory, including stochastic, online, and distributed settings, uniting deep theoretical insight with practical considerations. A clear narrative, rigorous proofs, and illuminating illustrations demystify adaptive variants, away steps, and the nuances of working with structured convex sets. Most of the algorithms in the book are implemented in the FrankWolfe.jl Julia package and are available on a supplementary website. https://epubs.siam.org/doi/book/10.1137/1.9781611978568
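To illustrate the projection-free idea described above, the following is a minimal Python sketch of the classic Frank–Wolfe iteration (not the book's FrankWolfe.jl implementation): the feasible set is assumed to be the probability simplex, whose linear minimization oracle (LMO) simply returns the vertex with the smallest gradient coordinate; the quadratic objective and the function name are illustrative choices, not taken from the book.

```python
# Sketch of the Frank-Wolfe (conditional gradient) method on the
# probability simplex, minimizing the illustrative objective
# f(x) = sum_i (x_i - b_i)^2 for a target b inside the simplex.
# No projection is ever computed: each step only calls the LMO.

def frank_wolfe(b, n_iters=200):
    n = len(b)
    x = [1.0 / n] * n                                  # start at the barycenter
    for t in range(n_iters):
        grad = [2.0 * (x[i] - b[i]) for i in range(n)]
        i_star = min(range(n), key=lambda i: grad[i])  # LMO over simplex vertices
        gamma = 2.0 / (t + 2.0)                        # standard open-loop step size
        # Move toward the vertex: x <- (1 - gamma) * x + gamma * e_{i_star}
        x = [(1.0 - gamma) * xi for xi in x]
        x[i_star] += gamma
    return x

x = frank_wolfe([0.5, 0.3, 0.2])
```

Because every iterate is a convex combination of simplex vertices, `x` stays feasible by construction; the step size 2/(t+2) is the textbook choice giving the well-known O(1/t) convergence rate of the primal gap.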

9781611978551


Conditional Gradient Methods
Frank–Wolfe Algorithm
Constrained Optimization
First-order Methods
Linear Minimization Oracle (LMO)
Projection-free Algorithms
Convex Optimization
Adaptive Step Sizes
Away-Step Frank–Wolfe
Fully-Corrective Frank–Wolfe
Mathematical Optimization Society

519.6 BRA