Convex Optimization with Computational Errors
By: Alexander J. Zaslavski (English, paperback)
This book is devoted to the study of approximate solutions of optimization problems in the presence of computational errors. It contains a number of results on the convergence behavior of algorithms in a Hilbert space, which are known to be important tools for solving optimization problems. The research presented here continues and further develops the author's book Numerical Optimization with Computational Errors (Springer, 2016), referred to below as [NOCE]. Both books study algorithms while taking into account the computational errors that are always present in practice. The main goal is, for a known computational error, to determine what approximate solution can be obtained and how many iterates are needed to obtain it.

The main difference between this book and the 2016 book is that the present discussion takes into account the fact that, for every algorithm, an iteration consists of several steps and that the computational errors of the different steps are, in general, different. This fact, which was not taken into account in the previous book, is important in practice. For example, the subgradient projection algorithm consists of two steps: the first is the calculation of a subgradient of the objective function, while in the second we calculate a projection onto the feasible set. Each of these two steps carries a computational error, and the two errors are in general different. It may happen that the feasible set is simple and the objective function is complicated; then the computational error made when calculating the projection is essentially smaller than the computational error of the subgradient calculation. Clearly, the opposite case is possible too. Another feature of this book is the study of a number of important algorithms that have appeared recently in the literature and are not discussed in the previous book.

The monograph contains 12 chapters. Chapter 1 is an introduction. In Chapter 2 we study the subgradient projection algorithm for minimization of convex nonsmooth functions; we generalize the results of [NOCE] and establish results that have no prototype there. In Chapter 3 we analyze the mirror descent algorithm for minimization of convex nonsmooth functions in the presence of computational errors. For this algorithm each iteration consists of two steps: the first is the calculation of a subgradient of the objective function, while in the second we solve an auxiliary minimization problem on the set of feasible points; each step carries its own computational error. Here too we generalize the results of [NOCE] and establish results that have no prototype there. In Chapter 4 we analyze the projected gradient algorithm with a smooth objective function in the presence of computational errors. In Chapter 5 we consider an algorithm that extends the projected gradient algorithm used for solving the linear inverse problems arising in signal/image processing. In Chapter 6 we study the continuous subgradient method and the continuous subgradient projection algorithm for minimization of convex nonsmooth functions and for computing the saddle points of convex-concave functions, in the presence of computational errors; none of the results of this chapter has a prototype in [NOCE]. In Chapters 7-12 we analyze several algorithms, not considered in [NOCE], in the presence of computational errors. Again, each step of an iteration carries a computational error, and we take into account that these errors are, in general, different. An optimization problem with a composite objective function is studied in Chapter 7. A two-player zero-sum game is considered in Chapter 8. A method based on predicted decrease approximation is used in Chapter 9 for constrained convex optimization. Chapter 10 is devoted to minimization of quasiconvex functions. Minimization of sharp weakly convex functions is discussed in Chapter 11. Chapter 12 is devoted to a generalized projected subgradient method for minimization of a convex function over a set which is not necessarily convex.

The book is of interest to researchers and engineers working in optimization, and it can also be useful in preparatory courses for graduate students. The main feature of the book that appeals specifically to this audience is the study of the influence of computational errors on several important optimization algorithms. The book is also of interest to experts in applications of optimization to engineering and economics.
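To make the per-step error model concrete, here is a minimal Python sketch of the two-step subgradient projection iteration described above, with separate error levels for the subgradient step and the projection step. All names (noisy_subgradient_projection, eps_grad, eps_proj) and the error model (a random perturbation of fixed norm) are illustrative assumptions, not taken from the book.

import numpy as np

def _perturb(v, eps, rng):
    # Model an inexact computation of v: add a random error of
    # Euclidean norm eps (an illustrative choice; the text only
    # assumes a known bound on the error).
    e = rng.standard_normal(v.shape)
    return v + eps * e / np.linalg.norm(e)

def noisy_subgradient_projection(subgrad, project, x0, steps,
                                 eps_grad, eps_proj, seed=0):
    # Iterate x_{k+1} = P_C(x_k - a_k * g_k), where the subgradient g_k
    # and the projection P_C are each computed inexactly, with generally
    # different error levels eps_grad and eps_proj.
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for a in steps:
        g = _perturb(subgrad(x), eps_grad, rng)          # step 1: inexact subgradient
        x = _perturb(project(x - a * g), eps_proj, rng)  # step 2: inexact projection
    return x

# Toy usage: minimize f(x) = ||x - c||_1 over the Euclidean unit ball.
# The feasible set is "simple" (cheap projection), so eps_proj is taken
# much smaller than eps_grad, matching the example in the text.
c = np.array([2.0, -1.0])
x_approx = noisy_subgradient_projection(
    subgrad=lambda x: np.sign(x - c),                   # a subgradient of the l1 objective
    project=lambda x: x / max(1.0, np.linalg.norm(x)),  # projection onto the unit ball
    x0=np.zeros(2), steps=[0.05] * 400,
    eps_grad=1e-2, eps_proj=1e-6)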
Exclusive member price: 365 kr
Member price: 374 kr
Regular price: 450 kr (save 85 kr)

The exclusive member price and member price apply to members only. You automatically become a member when you buy at the exclusive member price or member price. The first 7 days of membership are free (thereafter automatically 89 kr per 30 days).

Shipping, member orders: free, 23-25 business days, 10 kr low parcel fee
Shipping, regular price: 59 kr, 23-25 business days, 20 kr parcel fee
Product details
Language: English
Pages: 360
ISBN-13: 9783030378240
Binding: Paperback
Edition:
ISBN-10: 3030378241
Category: Calculus of variations
Publication date: 1 Feb 2021
Length: 23 mm
Width: 154 mm
Height: 234 mm
Impression date: 1 Feb 2021
Author(s): Alexander J. Zaslavski