Convex Optimization with Computational Errors
By: Alexander J. Zaslavski (English, Paperback)
The book is devoted to the study of approximate solutions of optimization problems in the presence of computational errors. It contains a number of results on the convergence behavior of algorithms in a Hilbert space; these algorithms are known to be important tools for solving optimization problems. The research presented here continues and further develops the author's book Numerical Optimization with Computational Errors (Springer, 2016). Both books study algorithms while taking into account the computational errors that are always present in practice. The main goal is, for a known computational error, to determine what approximate solution can be obtained and how many iterates are needed to obtain it.

The main difference between this book and the 2016 book is that here the discussion takes into account the fact that, for every algorithm, each iteration consists of several steps and that the computational errors of different steps are, in general, different. This fact, which was not taken into account in the previous book, is indeed important in practice. For example, the subgradient projection algorithm consists of two steps: the first is the calculation of a subgradient of the objective function, while the second is the calculation of a projection onto the feasible set. Each of these two steps carries a computational error, and the two errors are in general different. It may happen that the feasible set is simple and the objective function is complicated; as a result, the computational error made in calculating the projection is essentially smaller than the error made in calculating the subgradient. Clearly, the opposite case is possible too (a short code sketch of this two-step error model follows this description). Another feature of this book is the study of a number of important algorithms which appeared recently in the literature and which are not discussed in the previous book.

The monograph contains 12 chapters. Chapter 1 is an introduction. In Chapter 2 we study the subgradient projection algorithm for the minimization of convex nonsmooth functions; we generalize the results of [NOCE] and establish results which have no prototype in [NOCE]. In Chapter 3 we analyze the mirror descent algorithm for the minimization of convex nonsmooth functions in the presence of computational errors. For this algorithm each iteration consists of two steps: the first is the calculation of a subgradient of the objective function, while in the second we solve an auxiliary minimization problem on the set of feasible points; each of these two steps carries its own computational error. Here too we generalize the results of [NOCE] and establish results which have no prototype there. In Chapter 4 we analyze the projected gradient algorithm with a smooth objective function in the presence of computational errors. In Chapter 5 we consider an extension of the projected gradient algorithm used for solving linear inverse problems arising in signal/image processing. In Chapter 6 we study the continuous subgradient method and the continuous subgradient projection algorithm for the minimization of convex nonsmooth functions and for computing the saddle points of convex-concave functions, in the presence of computational errors; none of the results of this chapter has a prototype in [NOCE]. In Chapters 7-12 we analyze, in the presence of computational errors, several algorithms that were not considered in [NOCE].
Again, each step of an iteration carries its own computational error, and we take into account that these errors are, in general, different. An optimization problem with a composite objective function is studied in Chapter 7. A two-player zero-sum game is considered in Chapter 8. A method based on predicted decrease approximation is used in Chapter 9 for constrained convex optimization. Chapter 10 is devoted to the minimization of quasiconvex functions. The minimization of sharp weakly convex functions is discussed in Chapter 11. Chapter 12 is devoted to a generalized projected subgradient method for the minimization of a convex function over a set which is not necessarily convex.

The book is of interest to researchers and engineers working in optimization, and it can also be useful in preparatory courses for graduate students. The main feature of the book that appeals specifically to this audience is its study of the influence of computational errors on several important optimization algorithms. The book is also of interest to experts in applications of optimization to engineering and economics.
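To make the two-step error model concrete, here is a minimal Python sketch of a subgradient projection iteration in which the subgradient step and the projection step carry different, independently sized errors. The sketch is illustrative only and is not taken from the book; the function names, the additive-noise error model, and the levels delta_sub and delta_proj are assumptions made for this example.

```python
import numpy as np

def noisy_subgradient_projection(x0, subgrad, project, step_size,
                                 delta_sub=1e-2, delta_proj=1e-4,
                                 n_iter=200, seed=0):
    """Subgradient projection method in which each of the two steps of an
    iteration carries its own computational error, modeled here as additive
    noise of magnitude delta_sub and delta_proj, respectively."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for t in range(n_iter):
        # Step 1: compute a subgradient of the objective, with error delta_sub
        # (e.g. the objective is complicated, so this step is less accurate).
        g = subgrad(x) + delta_sub * rng.standard_normal(x.shape)
        # Step 2: project the update onto the feasible set, with its own,
        # generally different, error delta_proj (e.g. the set is simple,
        # so the projection is computed more accurately).
        y = x - step_size(t) * g
        x = project(y) + delta_proj * rng.standard_normal(x.shape)
    return x

# Usage: minimize the nonsmooth convex function ||x - c||_1 over the unit ball.
c = np.array([2.0, -1.0])
x_hat = noisy_subgradient_projection(
    x0=np.zeros(2),
    subgrad=lambda x: np.sign(x - c),                   # a subgradient of ||x - c||_1
    project=lambda y: y / max(1.0, np.linalg.norm(y)),  # exact projection onto the ball
    step_size=lambda t: 1.0 / np.sqrt(t + 1.0),         # diminishing step sizes
)
print(x_hat)
```

With diminishing step sizes, runs of this kind typically stall at an accuracy floor governed by the larger of the two error levels, which mirrors the question the book addresses: for a given computational error, what accuracy is attainable, and after how many iterates.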
Exclusive member price: 367 kr
Member price: 375 kr
Regular price: 450 kr
Save: 83 kr

The exclusive member price and the member price are for members only. You automatically become a member when you buy at either of these prices. Get 7 days of free membership (thereafter automatically 89 kr per 30 days).

Shipping, members: free, 23-25 business days, 10 kr low package fee
Shipping, regular: 59 kr, 23-25 business days, 20 kr package fee
Product details
Language: English
Pages: 360
ISBN-13: 9783030378240
Binding: Paperback
Edition:
ISBN-10: 3030378241
Category: Calculus of Variations
Publication date: 1 Feb 2021
Length: 23 mm
Width: 154 mm
Height: 234 mm
Print run date: 1 Feb 2021
Author(s): Alexander J. Zaslavski