In this book, the global optimization of a nonconvex objective function is studied via stochastic perturbation.
Stochastic perturbation is a method for transforming local minimization procedures into global ones in the framework of continuous optimization.
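To fix ideas, the mechanism can be sketched as a local descent step followed by a random perturbation whose magnitude decays over the iterations, so that early iterates can escape local minima while later iterates settle into a descent. The Python sketch below is only an illustration of this idea; the objective, step size, and noise schedule are invented for the example and are not the book's specific algorithm.

```python
import numpy as np

def perturbed_descent(f, grad, x0, steps=5000, lr=0.01, noise0=1.0, seed=0):
    """Gradient descent with a decaying stochastic perturbation.

    At iteration k the update is
        x_{k+1} = x_k - lr * grad(x_k) + sigma_k * xi_k,
    where xi_k is standard Gaussian noise and sigma_k = noise0 / sqrt(k+1).
    The shrinking perturbation lets the iterate escape shallow local minima.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    best_x, best_f = x.copy(), f(x)
    for k in range(steps):
        sigma = noise0 / np.sqrt(k + 1)  # perturbation level decreases over time
        x = x - lr * grad(x) + sigma * rng.normal(size=x.shape)
        fx = f(x)
        if fx < best_f:  # keep the best point seen so far
            best_x, best_f = x.copy(), fx
    return best_x, best_f

# Nonconvex test function with many local minima: f(x) = x^2 + 10*sin(3x).
f = lambda x: float(x[0]**2 + 10 * np.sin(3 * x[0]))
g = lambda x: np.array([2 * x[0] + 30 * np.cos(3 * x[0])])
x_star, f_star = perturbed_descent(f, g, x0=[4.0])
print(x_star, f_star)
```

Plain gradient descent started at the same point would typically stall in the nearest local minimum; the decaying noise gives the iterates a chance to cross the intervening barriers.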
We consider a general problem of continuous optimization, either unconstrained or subject to linear constraints, where the objective function may be nonsmooth.
Standard methods for smooth functions usually generate a descent direction by using the gradient; they may be extended to nonsmooth situations by using a generalized gradient in place of the standard one whenever necessary.
For instance, Clarke’s generalized gradients may be used at the points where the objective function is not differentiable.
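As a concrete illustration, consider the nonsmooth function f(x) = |x| + x^2/2, which is not differentiable at x = 0; there Clarke's generalized gradient of the |x| term is the interval [-1, 1], and a descent method may pick any element of it. The following sketch is a hypothetical example constructed for this blurb, not taken from the book:

```python
def subgradient(x):
    """One element of Clarke's generalized gradient of f(x) = |x| + 0.5*x^2.

    f is differentiable for x != 0; at x = 0 the generalized gradient is
    [-1, 1] (the smooth part contributes 0 there), and any element of it
    is a valid choice -- here we pick 0.
    """
    if x > 0:
        return 1.0 + x
    if x < 0:
        return -1.0 + x
    return 0.0  # any value in [-1, 1] would do at the kink

x = 3.0
for k in range(1, 200):
    x -= (0.5 / k) * subgradient(x)  # diminishing steps, standard for subgradient methods
print(x)  # approaches the minimizer x = 0
```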
Following this observation, we consider a variable metric descent method and introduce suitable affine local approximations.
The projected variable metric descent method is used for continuous optimization with linear constraints, and the generalized reduced gradient (GRG) method for nonlinearly constrained optimization where the objective function is twice differentiable.
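For the linearly constrained case, the projection step admits a closed form when the constraints are equalities Ax = b with A of full row rank. The sketch below uses the Euclidean projection (i.e., the identity metric) for simplicity; a variable metric version would weight the projection by the current metric. The problem instance is an invented illustration, not an example from the book.

```python
import numpy as np

def project_affine(x, A, b):
    """Euclidean projection of x onto the affine set {z : A z = b}.

    Uses the closed form  P(x) = x - A^T (A A^T)^{-1} (A x - b),
    valid when A has full row rank.
    """
    return x - A.T @ np.linalg.solve(A @ A.T, A @ x - b)

def projected_descent(grad, x0, A, b, steps=500, lr=0.05):
    """Projected gradient descent: take a descent step, then project back."""
    x = project_affine(np.asarray(x0, dtype=float), A, b)
    for _ in range(steps):
        x = project_affine(x - lr * grad(x), A, b)
    return x

# Minimize ||x||^2 subject to x1 + x2 = 1; the solution is (0.5, 0.5).
A = np.array([[1.0, 1.0]])
b = np.array([1.0])
x = projected_descent(lambda x: 2 * x, [3.0, -2.0], A, b)
print(x)  # ~ [0.5, 0.5]
```

GRG handles nonlinear constraints differently: it partitions the variables into basic and nonbasic sets and eliminates the basic ones through the constraints, which is more involved than the projection shown here.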
Abdelkrim El Mouatasim, born in 1973, received a Ph.D. degree in applied mathematics and scientific computation in 2007 from the Mohammadia Engineering School in Rabat.
He currently works in the Department of Mathematics at the Polydisciplinary Faculty of Ouarzazate (FPO), Ibn Zohr University, Morocco.
His research interests include global optimization modelling.