The absence of derivatives, often combined with the presence of noise or lack of smoothness, is a major challenge for optimization. This book explains how sampling and modeling techniques are used in derivative-free methods and how these methods are designed to solve optimization problems efficiently and rigorously. Although readily accessible to readers with a modest background in computational mathematics, it is also intended to be of interest to researchers in the field. Introduction to Derivative-Free Optimization is the first contemporary comprehensive treatment of optimization without derivatives.
This book covers most of the relevant classes of algorithms, from direct search to model-based approaches. It contains a comprehensive description of the sampling and modeling tools needed for derivative-free optimization; these tools allow the reader to better understand the convergence properties of the algorithms and to identify their differences and similarities. Introduction to Derivative-Free Optimization also contains convergence analyses of modified Nelder–Mead and implicit-filtering methods, as well as of model-based methods such as wedge methods and methods based on minimum Frobenius norm models.
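To give a flavor of the directional direct-search methods treated in Part II, the following is a minimal coordinate-search sketch in Python. The objective function, step-size rule, and stopping tolerance are illustrative assumptions for this sketch, not code or parameter choices from the book.

```python
# Minimal coordinate (directional direct-search) sketch: poll along the
# +/- coordinate directions, accept any improving point, and contract the
# step size after an unsuccessful poll. All parameters are illustrative.

def coordinate_search(f, x0, step=1.0, tol=1e-6, max_iters=1000):
    x, fx = list(x0), f(x0)
    n = len(x)
    for _ in range(max_iters):
        improved = False
        for i in range(n):
            for sign in (+1.0, -1.0):
                y = list(x)
                y[i] += sign * step
                fy = f(y)
                if fy < fx:  # simple decrease is accepted
                    x, fx, improved = y, fy, True
                    break
            if improved:
                break
        if not improved:
            step *= 0.5  # unsuccessful poll: shrink the step
            if step < tol:
                break
    return x, fx

# Example: minimize a smooth quadratic without evaluating its derivatives.
if __name__ == "__main__":
    f = lambda v: (v[0] - 1.0) ** 2 + 10.0 * (v[1] + 2.0) ** 2
    print(coordinate_search(f, [0.0, 0.0]))
```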
Product note: paperback, perfect binding
Dimensions: height 256 mm, width 177 mm, thickness 20 mm
ISBN-13: 978-0-89871-668-9 (9780898716689)
Andrew R. Conn is a research staff member at the IBM T. J. Watson Research Center in Yorktown Heights, NY. In 1994 he was a joint recipient, with N. I. M. Gould and Ph. L. Toint, of the Beale/Orchard-Hays Prize for Computational Excellence in Mathematical Programming, and in 2002 he received, with Chandu Visweswariah, an IBM Corporate Award for contributions to circuit tuning. His current major application projects are in the petroleum industry.
Contents:
Preface
1. Introduction
Part I. Sampling and Modeling
2. Sampling and linear models
3. Interpolating nonlinear models
4. Regression nonlinear models
5. Underdetermined interpolating models
6. Ensuring well poisedness and suitable derivative-free models
Part II. Frameworks and Algorithms
7. Directional direct-search methods
8. Simplicial direct-search methods
9. Line-search methods based on simplex derivatives
10. Trust-region methods based on derivative-free models
11. Trust-region interpolation-based methods
Part III. Review of Other Topics
12. Review of surrogate model management
13. Review of constrained and other extensions to derivative-free optimization
Appendix: Software for derivative-free optimization
Bibliography
Index