Adil Bagirov, Napsu Karmitsa - Introduction to Nonsmooth Optimization

Part I
Nonsmooth Analysis and Optimization
Introduction
Convexity plays a crucial role in mathematical optimization; in particular, it is the most important concept in constructing optimality conditions. In smooth (continuously differentiable) optimization theory, differentiation amounts to locally linearizing a function by its gradient, which for a convex function yields a lower approximation. These ideas can be generalized to nonsmooth convex functions, resulting in the concepts of subgradients and subdifferentials. A subgradient preserves the key property of the gradient by providing a lower approximation of the function, but in the nonsmooth case it is no longer unique. Thus, instead of one gradient vector we end up with a set of subgradients, called the subdifferential.
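As a minimal worked example, using the standard definition of the convex subdifferential: for a convex function $f$ the subdifferential at $\mathbf{x}$ is the set
$$\partial f(\mathbf{x}) = \{\, \boldsymbol{\xi} \in \mathbb{R}^n : f(\mathbf{y}) \ge f(\mathbf{x}) + \boldsymbol{\xi}^T(\mathbf{y} - \mathbf{x}) \ \text{for all } \mathbf{y} \in \mathbb{R}^n \,\}.$$
For $f(x) = |x|$ on $\mathbb{R}$ one obtains $\partial f(x) = \{1\}$ for $x > 0$, $\partial f(x) = \{-1\}$ for $x < 0$, and $\partial f(0) = [-1, 1]$, since $|y| \ge \xi y$ holds for all $y$ exactly when $|\xi| \le 1$.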
Unfortunately, convexity is often too demanding an assumption in practical applications, and we have to be able to deal with nonconvex functions as well. From a practical point of view, locally Lipschitz continuous functions have proved to be a suitable and sufficiently general class of nonconvex functions. In the convex case, differentiation is based on the linearization of a function; for nonconvex functions, differentiation can be seen as convexification. All of the concepts of convex analysis can be generalized to the nonconvex case in a very natural way. As in convex analysis, the links between nonconvex analysis and geometry are strong: subgradients and generalized directional derivatives have a one-to-one correspondence with the tangent and normal cones of epigraphs and level sets.
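To make the "differentiation as convexification" idea concrete, a minimal example using the standard limiting-gradient construction: for a locally Lipschitz continuous function $f$ the subdifferential at $\mathbf{x}$ can be built as the convex hull of limits of nearby gradients,
$$\partial f(\mathbf{x}) = \operatorname{conv}\{\, \boldsymbol{\xi} \in \mathbb{R}^n : \boldsymbol{\xi} = \lim_{i\to\infty} \nabla f(\mathbf{x}_i),\ \mathbf{x}_i \to \mathbf{x},\ f \text{ differentiable at } \mathbf{x}_i \,\}.$$
For the nonconvex function $f(x) = -|x|$ on $\mathbb{R}$, the nearby gradients are $-1$ (for $x_i > 0$) and $1$ (for $x_i < 0$), so $\partial f(0) = \operatorname{conv}\{-1, 1\} = [-1, 1]$.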
Our aim here is to present the theory of nonsmooth analysis for optimization in a compact and easy-to-understand form. The reader is assumed to have some basic knowledge of linear algebra, elementary real analysis, and smooth nonlinear optimization. To give as self-contained a description as possible, we define every concept used and prove almost all theorems and lemmas.
This part is organized as follows: after recalling some basic results and notions from smooth analysis, we concentrate on Convex Analysis in Chap. 2. We start our consideration with geometry by defining convex sets and cones. The main result, and the foundation for the forthcoming theory, is the separation theorem, which states that two disjoint convex sets can always be separated by a hyperplane. Next we turn to analytical concepts by defining subgradients and subdifferentials. Finally, we show that these geometrical and analytical notions can be connected by utilizing epigraphs, level sets, and the distance function.
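A standard finite-dimensional formulation of this result reads as follows: if $S_1, S_2 \subset \mathbb{R}^n$ are nonempty disjoint convex sets, then there exist $\mathbf{p} \in \mathbb{R}^n$, $\mathbf{p} \ne \mathbf{0}$, and $\alpha \in \mathbb{R}$ such that
$$\mathbf{p}^T\mathbf{x} \le \alpha \le \mathbf{p}^T\mathbf{y} \quad \text{for all } \mathbf{x} \in S_1 \text{ and } \mathbf{y} \in S_2,$$
that is, the hyperplane $\{\mathbf{z} \in \mathbb{R}^n : \mathbf{p}^T\mathbf{z} = \alpha\}$ separates $S_1$ and $S_2$.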
Chapter 3 is devoted to Nonconvex Analysis. We generalize all of the convex concepts to locally Lipschitz continuous functions. We also show that nonsmooth analysis is a natural enlargement of classical differential theory by generalizing all of the familiar differentiation rules, such as the mean-value theorem and the chain rule. The price we have to pay when relaxing the assumptions of differentiability and convexity is that, instead of equalities, we only obtain inclusions in most of the results. However, by imposing some mild regularity assumptions, we can turn the inclusions back into equalities. After the analytical considerations, we generalize, in a similar way to the convex case, all of the geometrical concepts and give links between analysis and geometry. At the end of the chapter, we recall the definitions of quasidifferentials, codifferentials, and limiting subdifferentials, all of which can be used to generalize the subdifferential of convex functions to the nonconvex case.
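A typical instance of this phenomenon is the sum rule for locally Lipschitz continuous functions $f$ and $g$:
$$\partial (f + g)(\mathbf{x}) \subseteq \partial f(\mathbf{x}) + \partial g(\mathbf{x}),$$
with equality holding, for example, when both $f$ and $g$ are regular at $\mathbf{x}$.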
Chapter 4 concentrates on the theory of nonsmooth optimization. Optimality conditions are an essential part of mathematical optimization theory, heavily affecting, for example, the development of optimization methods. We formulate the necessary conditions for locally Lipschitz continuous functions to attain local minima in both the unconstrained and constrained cases. These conditions are proved to be sufficient for convex problems, in which case the minima found are global. We formulate both geometrical and analytical conditions based on cones and subdifferentials, respectively. We consider both general geometrical constraint sets and analytical constraints determined by functions and inequalities. In the latter case, we end up generalizing the classical Fritz John (FJ) and Karush-Kuhn-Tucker (KKT) conditions.
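As a sketch of the form these generalizations take (assuming the standard subdifferential-based statement): for the problem of minimizing a locally Lipschitz continuous $f$ subject to locally Lipschitz continuous constraints $g_i(\mathbf{x}) \le 0$, $i = 1, \dots, m$, the KKT-type necessary condition at a local minimizer $\mathbf{x}^*$ reads, under a suitable constraint qualification,
$$\mathbf{0} \in \partial f(\mathbf{x}^*) + \sum_{i=1}^{m} \lambda_i \, \partial g_i(\mathbf{x}^*), \qquad \lambda_i \ge 0, \qquad \lambda_i g_i(\mathbf{x}^*) = 0,$$
with the gradients of the smooth theory replaced by subdifferentials; in the unconstrained case this reduces to $\mathbf{0} \in \partial f(\mathbf{x}^*)$.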
The question "Can we still attain sufficient optimality conditions when relaxing the convexity assumptions?" has been the initiating force behind the study of generalized convexities. In Chap. 5, we define and analyze the properties of generalized pseudo- and quasiconvexity for locally Lipschitz continuous functions. Finally, we formulate relaxed versions of the sufficient optimality conditions, where convexity is replaced by generalized pseudo- and quasiconvexity.
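For orientation, commonly used forms of these notions for locally Lipschitz continuous functions read as follows (the chapter's own formulations may differ in detail): $f$ is quasiconvex if
$$f(\lambda \mathbf{x} + (1 - \lambda)\mathbf{y}) \le \max\{f(\mathbf{x}), f(\mathbf{y})\} \quad \text{for all } \mathbf{x}, \mathbf{y} \in \mathbb{R}^n \text{ and } \lambda \in [0, 1],$$
and pseudoconvex if $f(\mathbf{y}) < f(\mathbf{x})$ implies $f^{\circ}(\mathbf{x}; \mathbf{y} - \mathbf{x}) < 0$, where $f^{\circ}(\mathbf{x}; \mathbf{d})$ denotes the generalized directional derivative.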
The last chapter of this part, Chap. 6, deals with Approximations of Subdifferentials. We introduce the concept of continuous approximations of the subdifferential, and the discrete gradient, which can be used as an approximation of a subgradient at a given point.
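To convey the flavour of approximating subgradient information from function values only, a minimal Python sketch follows; it is only a naive forward-difference illustration, not the discrete gradient construction of the chapter.

    import numpy as np

    def forward_difference_estimate(f, x, h=1e-6):
        # Naive forward-difference estimate of derivative information for f at x,
        # using function values only. Illustrative sketch, not the discrete gradient.
        x = np.asarray(x, dtype=float)
        fx = f(x)
        g = np.zeros_like(x)
        for i in range(x.size):
            e = np.zeros_like(x)
            e[i] = h
            g[i] = (f(x + e) - fx) / h
        return g

    # f(x) = |x_1| + 2|x_2| is convex but nonsmooth at the origin.
    f = lambda x: abs(x[0]) + 2.0 * abs(x[1])
    print(forward_difference_estimate(f, [0.0, 0.0]))
    # approx [1.0, 2.0], which is one element of the subdifferential [-1, 1] x [-2, 2] at the origin.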
© Springer International Publishing Switzerland 2014
Adil Bagirov, Napsu Karmitsa and Marko M. Mäkelä, Introduction to Nonsmooth Optimization, DOI 10.1007/978-3-319-08114-4_1
1. Theoretical Background
Adil Bagirov (1) (Corresponding author), Napsu Karmitsa (2) and Marko M. Mäkelä (2)
(1) Centre for Informatics and Applied Optimization, School of Information Technology and Mathematical Sciences, University of Ballarat, Ballarat, VIC, Australia
(2) Department of Mathematics and Statistics, University of Turku, Turku, Finland
In this chapter, we first collect some notations and basic results of smooth analysis. We also recall fundamentals from matrix calculus.
1.1 Notations and Definitions
All the vectors $\mathbf{x} \in \mathbb{R}^n$ are considered as column vectors and, correspondingly, all the transposed vectors $\mathbf{x}^T$ are considered as row vectors (bolded symbols are used for vectors). We denote by $\mathbf{x}^T\mathbf{y}$ the usual inner product and by $\|\mathbf{x}\|$ the norm in the $n$-dimensional real Euclidean space $\mathbb{R}^n$. In other words,
$$\mathbf{x}^T\mathbf{y} = \sum_{i=1}^{n} x_i y_i \quad \text{and} \quad \|\mathbf{x}\| = \big(\mathbf{x}^T\mathbf{x}\big)^{1/2},$$
where $\mathbf{x}$ and $\mathbf{y}$ are in $\mathbb{R}^n$, and $x_i$ and $y_i$ are the $i$th components of the vectors $\mathbf{x}$ and $\mathbf{y}$.