Rapid gradient penalty schemes and convergence for solving constrained convex optimization problems in Hilbert spaces
Keywords:
Rapid gradient penalty algorithm, penalization, constrained minimization, Fenchel conjugate

Abstract
The purpose of this paper is to establish and analyze the convergence of a new gradient scheme with penalization terms, called the rapid gradient penalty algorithm (RGPA), for minimizing a convex differentiable function over the set of minimizers of a convex differentiable constrained function. Under appropriate assumptions on the considered functions and on the scalar parameter sequences, the generated sequence converges weakly to a solution of the constrained minimization problem. Furthermore, we provide a numerical example comparing the rapid gradient penalty algorithm (RGPA) with the algorithm introduced by Peypouquet [20].
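To fix ideas, the problem treated here can be sketched in the following generic form; the symbols $f$, $\Psi$, the step sizes $\lambda_n$, and the penalty parameters $\beta_n$ are assumed notation for this illustration, and the display is only a template for the classical diagonal gradient penalty step (as in [20]), not the exact RGPA update, which is specified in the body of the paper. The model problem is
\[
  \min \{\, f(x) : x \in \operatorname*{argmin} \Psi \,\},
\]
and a generic gradient penalty iteration takes the form
\[
  x_{n+1} = x_n - \lambda_n \bigl( \nabla f(x_n) + \beta_n \nabla \Psi(x_n) \bigr),
\]
where $\lambda_n > 0$ are step sizes and $\beta_n \to \infty$ are penalization parameters satisfying suitable summability conditions.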