Optimization in Banach Spaces

· Springer Nature
E-book · 126 pages
เบšเปเปˆเป„เบ”เป‰เบขเบฑเป‰เบ‡เบขเบทเบ™เบเบฒเบ™เบˆเบฑเบ”เบญเบฑเบ™เบ”เบฑเบš เปเบฅเบฐ เบ„เบณเบ•เบดเบŠเบปเบก เบชเบถเบเบชเบฒเป€เบžเบตเปˆเบกเป€เบ•เบตเบก

เบเปˆเบฝเบงเบเบฑเบšเบ›เบถเป‰เบก e-book เบ™เบตเป‰

The book is devoted to the study of constrained minimization problems on closed convex sets in Banach spaces with a Fréchet differentiable objective function. Such problems are well studied in finite-dimensional spaces and in infinite-dimensional Hilbert spaces. When the space is a Hilbert space, many algorithms are available for solving optimization problems, including the gradient projection algorithm, which is one of the most important tools in optimization theory, nonlinear analysis and their applications. An optimization problem is described by an objective function and a set of feasible points. In the gradient projection algorithm, each iteration consists of two steps: the first is the calculation of a gradient of the objective function, and the second is the calculation of a projection onto the feasible set. Each of these two steps is carried out with a computational error. In our recent research we showed that the gradient projection algorithm generates a good approximate solution if all the computational errors are bounded from above by a small positive constant. It should be mentioned that the properties of a Hilbert space play an important role here. When we consider an optimization problem in a general Banach space, the situation becomes more difficult and less understood; on the other hand, such problems arise in approximation theory.

The book is of interest to mathematicians working in optimization. It can also be useful in the preparation of courses for graduate students. The main feature of the book that appeals specifically to this audience is the study of algorithms for convex and nonconvex minimization problems in a general Banach space. The book is also of interest to experts in applications of optimization to approximation theory.
In this book the goal is to obtain a good approximate solution of a constrained optimization problem in a general Banach space in the presence of computational errors. It is shown that the algorithm generates a good approximate solution if the sequence of computational errors is bounded from above by a small constant. The book consists of four chapters. In the first, we discuss several algorithms that are studied in the book and prove a convergence result for an unconstrained problem, which is a prototype of our results for the constrained problem. In Chapter 2 we analyze convex optimization problems. Nonconvex optimization problems are studied in Chapter 3. In Chapter 4 we study continuous algorithms for minimization problems in the presence of computational errors.
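To make the two-step iteration concrete, here is a minimal sketch of a gradient projection method with inexact steps in a Euclidean (hence Hilbert) setting. It is not taken from the book; the objective, feasible set, step size and error model are illustrative assumptions only.

```python
import numpy as np

def inexact_gradient_projection(grad, project, x0, step=0.1, n_iters=200,
                                delta=1e-3, rng=None):
    """Gradient projection with computational errors: each gradient
    evaluation and each projection is perturbed by a small error (~delta)."""
    rng = rng if rng is not None else np.random.default_rng(0)
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        # Step 1: gradient of the objective, computed inexactly.
        g = grad(x) + delta * rng.standard_normal(x.shape)
        # Step 2: projection onto the feasible set, also computed inexactly.
        x = project(x - step * g) + delta * rng.standard_normal(x.shape)
    return x

# Hypothetical example: minimize ||x - b||^2 over the closed unit ball in R^3.
b = np.array([2.0, -1.0, 0.5])
grad = lambda x: 2.0 * (x - b)
project = lambda y: y / max(1.0, np.linalg.norm(y))  # metric projection onto the ball
x_approx = inexact_gradient_projection(grad, project, np.zeros(3))
print(x_approx)  # close to b / ||b||, the exact constrained minimizer
```

With errors of size delta, the iterates only approximate the exact minimizer up to a tolerance depending on delta, which reflects the kind of behavior described above.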



เบเปˆเบฝเบงเบเบฑเบšเบœเบนเป‰เบ‚เบฝเบ™

โ€‹Alexander J. Zaslavski is professor in the Department of Mathematics, Technion-Israel Institute of Technology, Haifa, Israel. He has authored numerous books with Springer, the most recent of which include Turnpike Phenomenon and Symmetric Optimization Problems (978-3-030-96972-1), Turnpike Theory for the Robinsonโ€“Solowโ€“Srinivasan Model (978-3-030-60306-9), The Projected Subgradient Algorithm in Convex Optimization (978-3-030-60299-4), Convex Optimization with Computational Errors (978-3-030-37821-9), Turnpike Conditions in Infinite Dimensional Optimal Control (978-3-030-20177-7), Optimization on Solution Sets of Common Fixed Point Problems (978-3-030-78848-3).

เปƒเบซเป‰เบ„เบฐเปเบ™เบ™ e-book เบ™เบตเป‰

เบšเบญเบเบžเบงเบเป€เบฎเบปเบฒเบงเปˆเบฒเบ—เปˆเบฒเบ™เบ„เบดเบ”เปเบ™เบงเปƒเบ”.

เบญเปˆเบฒเบ™โ€‹เบ‚เปเป‰โ€‹เบกเบนเบ™โ€‹เบ‚เปˆเบฒเบงโ€‹เบชเบฒเบ™

เบชเบฐเบกเบฒเบ”เป‚เบŸเบ™ เปเบฅเบฐ เปเบ—เบฑเบšเป€เบฅเบฑเบ”
เบ•เบดเบ”เบ•เบฑเป‰เบ‡ เปเบญเบฑเบš Google Play Books เบชเบณเบฅเบฑเบš Android เปเบฅเบฐ iPad/iPhone. เบกเบฑเบ™เบŠเบดเป‰เบ‡เบ‚เปเป‰เบกเบนเบ™เป‚เบ”เบเบญเบฑเบ”เบ•เบฐเป‚เบ™เบกเบฑเบ”เบเบฑเบšเบšเบฑเบ™เบŠเบตเบ‚เบญเบ‡เบ—เปˆเบฒเบ™ เปเบฅเบฐ เบญเบฐเบ™เบธเบเบฒเบ”เปƒเบซเป‰เบ—เปˆเบฒเบ™เบญเปˆเบฒเบ™เบ—เบฒเบ‡เบญเบญเบ™เบฅเบฒเบ เบซเบผเบท เปเบšเบšเบญเบญเบšเบฅเบฒเบเป„เบ”เป‰ เบšเปเปˆเบงเปˆเบฒเบ—เปˆเบฒเบ™เบˆเบฐเบขเบนเปˆเปƒเบช.
เปเบฅเบฑเบšเบ—เบฑเบญเบš เปเบฅเบฐ เบ„เบญเบกเบžเบดเบงเป€เบ•เบต
เบ—เปˆเบฒเบ™เบชเบฒเบกเบฒเบ”เบŸเบฑเบ‡เบ›เบถเป‰เบกเบชเบฝเบ‡เบ—เบตเปˆเบŠเบทเป‰เปƒเบ™ Google Play เป‚เบ”เบเปƒเบŠเป‰เป‚เบ›เบฃเปเบเบฃเบกเบ—เปˆเบญเบ‡เป€เบงเบฑเบšเบ‚เบญเบ‡เบ„เบญเบกเบžเบดเบงเป€เบ•เบตเบ‚เบญเบ‡เบ—เปˆเบฒเบ™เป„เบ”เป‰.
eReaders เปเบฅเบฐเบญเบธเบ›เบฐเบเบญเบ™เบญเบทเปˆเบ™เป†
เป€เบžเบทเปˆเบญเบญเปˆเบฒเบ™เปƒเบ™เบญเบธเบ›เบฐเบเบญเบ™ e-ink เป€เบŠเบฑเปˆเบ™: Kobo eReader, เบ—เปˆเบฒเบ™เบˆเบณเป€เบ›เบฑเบ™เบ•เป‰เบญเบ‡เบ”เบฒเบงเป‚เบซเบผเบ”เป„เบŸเบฅเปŒ เปเบฅเบฐ เป‚เบญเบ™เบเป‰เบฒเบเบกเบฑเบ™เป„เบ›เปƒเบชเปˆเบญเบธเบ›เบฐเบเบญเบ™เบ‚เบญเบ‡เบ—เปˆเบฒเบ™เบเปˆเบญเบ™. เบ›เบฐเบ•เบดเบšเบฑเบ”เบ•เบฒเบกเบ„เบณเปเบ™เบฐเบ™เบณเบฅเบฐเบญเบฝเบ”เบ‚เบญเบ‡ เบชเบนเบ™เบŠเปˆเบงเบเป€เบซเบผเบทเบญ เป€เบžเบทเปˆเบญเป‚เบญเบ™เบเป‰เบฒเบเป„เบŸเบฅเปŒเป„เปƒเบชเปˆ eReader เบ—เบตเปˆเบฎเบญเบ‡เบฎเบฑเบš.