Resource Introduction
Introductory Lectures on Convex Optimization, complete and suitable for printing.
Applied Optimization, Volume 87
Series editors: Panos M. Pardalos, University of Florida, U.S.A.; Donald W. Hearn, University of Florida, U.S.A.

INTRODUCTORY LECTURES ON CONVEX OPTIMIZATION: A Basic Course
By Yurii Nesterov, Center of Operations Research and Econometrics (CORE), Université Catholique de Louvain (UCL), Louvain-la-Neuve, Belgium

Kluwer Academic Publishers, Boston / Dordrecht / London

Distributors for North, Central and South America: Kluwer Academic Publishers, 101 Philip Drive, Assinippi Park, Norwell, Massachusetts 02061, USA. Telephone (781) 871-6600, Fax (781) 871-6528, E-mail <kluwer@wkap.com>
Distributors for all other countries: Kluwer Academic Publishers Group, Post Office Box 322, 3300 AH Dordrecht, The Netherlands. Telephone 31 78 6576 000, Fax 31 78 6576 474, E-mail <orderdept@wkap.nl>
Electronic services: <http://www.wkap.nl>

Library of Congress Cataloging-in-Publication
Nesterov, Yurii. Introductory Lectures on Convex Optimization: A Basic Course
ISBN 1-4020-7553-7
Copyright © 2004 by Kluwer Academic Publishers
All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, photo-copying, microfilming, recording, or otherwise, without the prior written permission of the publisher, with the exception of any material supplied specifically for the purpose of being entered and executed on a computer system, for exclusive use by the purchaser of the work.
Permissions for books published in the USA: permissions@wkap.com
Permissions for books published in Europe: permissions@wkap.nl
Printed on acid-free paper. Printed in the United Kingdom by Biddles/IBT Global.

Contents

Preface
Acknowledgments xi
Introduction

1. NONLINEAR OPTIMIZATION
   1.1 World of nonlinear optimization
       1.1.1 General formulation of the problem
       1.1.2 Performance of numerical methods
       1.1.3 Complexity bounds for global optimization
       1.1.4 Identity cards of the fields 13
   1.2 Local methods in unconstrained minimization
       1.2.1 Relaxation and approximation
       1.2.2 Classes of differentiable functions
       1.2.3 Gradient method 25
       1.2.4 Newton method 32
   1.3 First-order methods in nonlinear optimization 37
       1.3.1 Gradient method and Newton method: what is different? 37
       1.3.2 Conjugate gradients 42
       1.3.3 Constrained minimization 46
2. SMOOTH CONVEX OPTIMIZATION 51
   2.1 Minimization of smooth functions 51
       2.1.1 Smooth convex functions 51
       2.1.2 Lower complexity bounds for F_L^{∞,1}(R^n) 58
       2.1.3 Strongly convex functions 63
       2.1.4 Lower complexity bounds for S_{μ,L}^{∞,1}(R^n) 66
       2.1.5 Gradient method 68
   2.2 Optimal methods
       2.2.1 Optimal methods 71
       2.2.2 Convex sets 81
       2.2.3 Gradient mapping 86
       2.2.4 Minimization methods for simple sets 87
   2.3 Minimization problem with smooth components 90
       2.3.1 Minimax problem 90
       2.3.2 Gradient mapping 93
       2.3.3 Minimization methods for minimax problem 96
       2.3.4 Optimization with functional constraints 100
       2.3.5 Method for constrained minimization 105
3. NONSMOOTH CONVEX OPTIMIZATION
   3.1 General convex functions
       3.1.1 Motivation and definitions
       3.1.2 Operations with convex functions 117
       3.1.3 Continuity and differentiability 121
       3.1.4 Separation theorems 124
       3.1.5 Subgradients 126
       3.1.6 Computing subgradients 130
   3.2 Nonsmooth minimization methods 135
       3.2.1 General lower complexity bounds 135
       3.2.2 Main lemma 138
       3.2.3 Subgradient method 141
       3.2.4 Minimization with functional constraints
       3.2.5 Complexity bounds in finite dimension 146
       3.2.6 Cutting plane schemes 149
   3.3 Methods with complete data 156
       3.3.1 Model of nonsmooth function 157
       3.3.2 Kelley method 158
       3.3.3 Level method 160
       3.3.4 Constrained minimization 164
4. STRUCTURAL OPTIMIZATION 171
   4.1 Self-concordant functions 171
       4.1.1 Black box concept in convex optimization 171
       4.1.2 What the Newton method actually does? 173
       4.1.3 Definition of self-concordant function 175
       4.1.4 Main inequalities 181
       4.1.5 Minimizing the self-concordant function 187
   4.2 Self-concordant barriers 192
       4.2.1 Motivation 192
       4.2.2 Definition of self-concordant barriers 193
       4.2.3 Main inequalities 196
       4.2.4 Path-following scheme 199
       4.2.5 Finding the analytic center 203
       4.2.6 Problems with functional constraints 206
   4.3 Applications of structural optimization 210
       4.3.1 Bounds on parameters of self-concordant barriers 210
       4.3.2 Linear and quadratic optimization 213
       4.3.3 Semidefinite optimization 216
       4.3.4 Extremal ellipsoids 220
       4.3.5 Separable optimization 224
       4.3.6 Choice of minimization scheme 227
Bibliography 231
References 233
Index 235

Preface

It was in the middle of the 1980s when the seminal paper by Karmarkar opened a new epoch in nonlinear optimization. The importance of this paper, containing a new polynomial-time algorithm for linear optimization problems, was not only in its complexity bound. At that time, the most surprising feature of this algorithm was that the theoretical prediction of its high efficiency was supported by excellent computational results. This unusual fact dramatically changed the style and directions of research in nonlinear optimization. Thereafter it became more and more common that new methods were provided with a complexity analysis, which was considered a better justification of their efficiency than computational experiments. In the new, rapidly developing field, which got the name "polynomial-time interior-point methods", such a justification was obligatory.

After almost fifteen years of intensive research, the main results of this development started to appear in monographs [12, 14, 16, 17, 18, 19]. Approximately at that time the author was asked to prepare a new course on nonlinear optimization for graduate students. The idea was to create a course which would reflect the new developments in the field. Actually, this was a major challenge.
At the time only the theory of interior-point methods for linear optimization was polished enough to be explained to students. The general theory of self-concordant functions had appeared in print only once, in the form of the research monograph [12]. Moreover, it was clear that the new theory of interior-point methods represented only a part of a general theory of convex optimization, a rather involved field with complexity bounds, optimal methods, etc. The majority of the latter results were published in different journals in Russian.

The book you see now is the result of an attempt to present serious things in an elementary form. As is always the case with a one-semester course, the most difficult problem is the selection of the material. For us the target notions were the complexity of optimization problems and the provable efficiency of numerical schemes supported by complexity bounds. In view of a severe volume limitation, we had to be very pragmatic: any concept or fact included in the book is absolutely necessary for the analysis of at least one optimization scheme. Surprisingly enough, none of the material presented requires any facts from duality theory; thus, this topic is completely omitted. This does not mean, of course, that the author neglects this fundamental concept. However, we hope that for a first treatment of the subject such a compromise is acceptable.

The main goal of this course is the development of a correct understanding of the complexity of different optimization problems. This goal was not chosen by chance. Every year I meet Ph.D. students of different specializations who ask me for advice on reasonable numerical schemes for their optimization models. And very often they seem to have come too late.
In my experience, if an optimization model is created without taking into account the abilities of numerical schemes, the chances that it will be possible to find an acceptable numerical solution are close to zero. In any field of human activity, if we create something, we know in advance why we are doing so and what we are going to do with the result. Only in numerical modelling is the situation still different.

This course was given during several years at the Université Catholique de Louvain (Louvain-la-Neuve, Belgium). The course is self-contained. It consists of four chapters (Nonlinear optimization, Smooth convex optimization, Nonsmooth convex optimization, and Structural optimization (interior-point methods)). The chapters are essentially independent and can be used as parts of more general courses on convex analysis or optimization. In our experience each chapter can be covered in three two-hour lectures. We assume the reader to have a standard undergraduate background in analysis and linear algebra. We provide the reader with short bibliographical notes which should help in a closer examination of the subject.

YURII NESTEROV
Louvain-la-Neuve, Belgium
May 2003

【Screenshots】
【Core code】
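This section of the page is empty in the original listing. As a small, hypothetical illustration of the book's subject matter, the sketch below implements the plain gradient method analyzed in Section 1.2.3 (minimizing a smooth convex quadratic with the classical step size 1/L, where L bounds the gradient's Lipschitz constant). It is not code from the book; the function name and the 2x2 test problem are invented for this example.

```python
def grad_method(A, b, x0, L, iters):
    """Gradient method x_{k+1} = x_k - (1/L) * grad f(x_k)
    for the quadratic f(x) = 0.5 * x^T A x - b^T x,
    whose gradient is grad f(x) = A x - b."""
    x = list(x0)
    n = len(x)
    for _ in range(iters):
        # Compute the gradient A x - b at the current point.
        g = [sum(A[i][j] * x[j] for j in range(n)) - b[i] for i in range(n)]
        # Take a step of length 1/L along the negative gradient.
        x = [x[i] - g[i] / L for i in range(n)]
    return x

# Toy problem: A diagonal with eigenvalues 2 and 1, so L = 2;
# the unique minimizer is x* = (1, 1).
A = [[2.0, 0.0], [0.0, 1.0]]
b = [2.0, 1.0]
x = grad_method(A, b, [0.0, 0.0], L=2.0, iters=100)
print(x)  # converges to (1.0, 1.0)
```

For strongly convex problems like this one the iterates converge linearly, which matches the rate derived for the gradient method in Chapter 2 of the book.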
Tags:
Tips
Thank you for writing a review on this site. Your review is a valuable reference for other users, so please write it carefully.
- Content-free comments such as "+1" or "nice" are discouraging feedback for a poster who worked hard on the contribution.
- Nobody wants to see a wall of identical text or emoji, so please do not post meaningless repeated characters, and try to avoid emoji-only replies.
- Before asking a question, please reread the poster's description carefully; you may simply have missed something.
- Please do not post traps or advertisements everywhere. They waste space, annoy people, get no response, and benefit no one.
About 好例子网
This site aims to provide a non-profit platform where IT learners can study, exchange, and share with one another. All resources on this site can be obtained free of charge for learning and research. The resources come from user sharing, and the site cannot foresee, identify, or control the legality of searched content; they are for study and research only. Please delete downloaded material within 24 hours and do not use it for any other purpose, or you bear the consequences yourself. Given the nature of the Internet, the platform cannot substantively review the ownership, legality, safety, compliance, authenticity, scientific validity, completeness, or effectiveness of works, information, or content transmitted by users. Whether or not the platform has conducted a review, users themselves bear any legal liability for infringement or ownership disputes that may arise or have arisen from the works, information, or content they transmit. The resources on this site do not represent the views or positions of the site. In accordance with Articles 22 and 23 of China's Regulations on the Protection of the Right of Communication through Information Networks, if a resource infringes rights or has related problems, please contact our customer service; click here to contact us. For more on copyright and disclaimers, see the Copyright and Disclaimer statement.