• Title/Abstract/Keywords: Nonsmooth optimization

Search results: 20

OPTIMALITY AND DUALITY IN NONSMOOTH VECTOR OPTIMIZATION INVOLVING GENERALIZED INVEX FUNCTIONS

  • Kim, Moon-Hee
    • Journal of applied mathematics & informatics / Vol. 28, No. 5-6 / pp. 1527-1534 / 2010
  • In this paper, we consider a nonsmooth vector optimization problem whose objective and constraint functions are locally Lipschitz. We establish sufficient optimality conditions and duality results under nearly strict invexity and near invexity-infineness assumptions.
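The abstract does not reproduce the conditions themselves; in this locally Lipschitz setting, sufficient optimality conditions are typically of KKT type, written with Clarke subdifferentials. A generic sketch of the pattern (not necessarily the paper's exact theorem), for objectives $f_1,\dots,f_p$ and constraints $g_1,\dots,g_m$:

```latex
% KKT-type condition at a feasible point \bar{x}, with \partial the Clarke
% subdifferential: there exist multipliers \tau_i \ge 0 (not all zero) and
% \lambda_j \ge 0 such that
\[
  0 \in \sum_{i=1}^{p} \tau_i\, \partial f_i(\bar{x})
      + \sum_{j=1}^{m} \lambda_j\, \partial g_j(\bar{x}),
  \qquad \lambda_j\, g_j(\bar{x}) = 0 \quad (j = 1,\dots,m).
\]
% Results of this type show that, under (generalized) invexity of the f_i
% and g_j, a point satisfying these conditions is (weakly) efficient.
```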

A SYNCRO-PARALLEL NONSMOOTH PGD ALGORITHM FOR NONSMOOTH OPTIMIZATION

  • Feng, Shan;Pang, Li-Ping
    • Journal of applied mathematics & informatics / Vol. 24, No. 1-2 / pp. 333-342 / 2007
  • A nonsmooth PGD scheme for minimizing a nonsmooth convex function is presented. In the parallelization step of the algorithm, a method due to Pang, Han and Rangaraj (1991) [7] is employed to solve a subproblem for constructing search directions. A convergence analysis is given as well.

ROBUST DUALITY FOR NONSMOOTH MULTIOBJECTIVE OPTIMIZATION PROBLEMS

  • Lee, Gue Myung;Kim, Moon Hee
    • Journal of the Chungcheong Mathematical Society / Vol. 30, No. 1 / pp. 31-40 / 2017
  • In this paper, we consider a nonsmooth multiobjective robust optimization problem, with two or more locally Lipschitz objective functions and locally Lipschitz constraint functions, in the face of data uncertainty. We prove a nonsmooth sufficient optimality theorem for a weakly robust efficient solution of the problem. We formulate a Wolfe-type dual problem and establish duality theorems that hold between the problem and its Wolfe-type dual.
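For orientation, a Wolfe-type dual of a nonsmooth multiobjective problem $\min\,(f_1(x),\dots,f_p(x))$ subject to $g(x) \le 0$ typically takes the following generic form (a sketch only; the paper's robust version additionally carries the uncertainty parameters through the data):

```latex
\[
  \max_{(u,\tau,\lambda)}\;
  \bigl(f_1(u) + \lambda^{\top} g(u),\; \dots,\; f_p(u) + \lambda^{\top} g(u)\bigr)
\]
% subject to a stationarity condition in Clarke subdifferentials and
% normalized multipliers:
\[
  0 \in \sum_{i=1}^{p} \tau_i\, \partial f_i(u)
      + \sum_{j=1}^{m} \lambda_j\, \partial g_j(u),
  \qquad \tau \ge 0,\quad \sum_{i=1}^{p} \tau_i = 1,\quad \lambda \ge 0.
\]
```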

ON THE SUBDIFFERENTIAL OF A NONLINEAR COMPLEMENTARITY PROBLEM FUNCTION WITH NONSMOOTH DATA

  • Gao, Yan
    • Journal of applied mathematics & informatics / Vol. 27, No. 1-2 / pp. 335-341 / 2009
  • In this paper, a system of nonsmooth equations reformulated from a nonlinear complementarity problem with nonsmooth data is studied. Formulas for the subdifferentials of the functions involved in this system are developed. The results can be applied in Newton methods for solving this class of nonlinear complementarity problems.
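The abstract leaves the reformulation implicit; one standard choice (not necessarily the one used in the paper) is the Fischer-Burmeister NCP function, whose zeros are exactly the NCP solutions, paired with a semismooth Newton method built on an element of the generalized Jacobian. A minimal sketch for an affine $F(x) = Mx + q$, with the data M, q below chosen purely for illustration:

```python
import numpy as np

def fischer_burmeister(a, b):
    # phi(a, b) = sqrt(a^2 + b^2) - a - b;  phi = 0  iff  a >= 0, b >= 0, a*b = 0
    return np.sqrt(a**2 + b**2) - a - b

def semismooth_newton_ncp(M, q, x0, tol=1e-10, max_iter=50):
    """Solve the NCP  x >= 0, F(x) = Mx + q >= 0, x^T F(x) = 0
    via the Fischer-Burmeister reformulation Phi(x) = 0."""
    x = x0.astype(float).copy()
    for _ in range(max_iter):
        F = M @ x + q
        Phi = fischer_burmeister(x, F)
        if np.linalg.norm(Phi) < tol:
            break
        r = np.sqrt(x**2 + F**2)
        # One element of the generalized Jacobian of Phi; where (x_i, F_i) = (0, 0)
        # the subgradient in the direction (1, 1)/sqrt(2) is taken.
        safe = r > 1e-14
        da = np.where(safe, x / np.where(safe, r, 1.0) - 1.0, 1.0 / np.sqrt(2) - 1.0)
        db = np.where(safe, F / np.where(safe, r, 1.0) - 1.0, 1.0 / np.sqrt(2) - 1.0)
        J = np.diag(da) + db[:, None] * M        # row i: da_i e_i + db_i * M[i, :]
        x = x + np.linalg.solve(J, -Phi)         # semismooth Newton step
    return x

M = np.array([[2.0, 1.0], [1.0, 2.0]])           # monotone (positive definite)
q = np.array([-1.0, -1.0])
x = semismooth_newton_ncp(M, q, np.array([1.0, 1.0]))
# For this data the unique solution is x = (1/3, 1/3), with F(x) = 0.
```

For the nonsmooth-data case studied in the paper, F itself is nonsmooth and the Jacobian rows would use elements of a subdifferential of F, which is exactly where the paper's subdifferential formulas enter.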


GENERALIZED PROXIMAL ITERATIVELY REWEIGHTED ℓ1 ALGORITHM WITH CO-COERCIVENESS FOR NONSMOOTH AND NONCONVEX MINIMIZATION PROBLEM

  • Myeongmin Kang
    • Journal of the Chungcheong Mathematical Society / Vol. 37, No. 1 / pp. 41-55 / 2024
  • Nonconvex and nonsmooth optimization problems are widely applicable in image processing and machine learning. In this paper, we propose an extension of the proximal iteratively reweighted ℓ1 algorithm for nonconvex and nonsmooth minimization problems. Instead of a Lipschitz gradient condition, we assume co-coerciveness of a term of the objective function, a generalization of Lipschitz continuity. We prove the global convergence of the proposed algorithm. Numerical results show that it converges faster than the original proximal iteratively reweighted ℓ1 algorithm and existing algorithms.
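As a rough sketch of the underlying scheme (the classical proximal iteratively reweighted ℓ1 algorithm, not the co-coercive extension proposed in the paper): each outer step majorizes a concave sparsity penalty by a weighted ℓ1 norm and then takes a proximal-gradient (shrinkage) step. The penalty, step size rule, and problem data below are illustrative choices:

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of the weighted l1 norm: componentwise shrinkage.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def prox_irl1(A, b, lam=0.1, eps=0.1, step=None, n_iter=200):
    """Proximal iteratively reweighted l1 for the nonconvex model
       min_x  0.5 * ||A x - b||^2  +  lam * sum_i log(1 + |x_i| / eps).
    Each step replaces the log penalty by its weighted-l1 majorizer at x^k."""
    m, n = A.shape
    if step is None:
        step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1/L for the smooth term
    x = np.zeros(n)
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)                 # gradient of the smooth term
        w = lam / (np.abs(x) + eps)              # weights: slope of the log penalty
        x = soft_threshold(x - step * grad, step * w)
    return x

# Toy usage: with A = I the shrinkage keeps large entries and zeros small ones.
A = np.eye(3)
b = np.array([1.0, 0.01, -1.0])
x = prox_irl1(A, b, lam=0.01, eps=0.1)
```

The paper's contribution, as described, is to relax the Lipschitz-gradient assumption on the smooth term to co-coerciveness while keeping a globally convergent scheme of this shape.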

A QUASI-NEWTON BUNDLE METHOD BASED ON APPROXIMATE SUBGRADIENTS

  • Jie, Shen;Pang, Li-Ping
    • Journal of applied mathematics & informatics / Vol. 23, No. 1-2 / pp. 361-367 / 2007
  • In this paper we propose an implementable method for solving a nonsmooth convex optimization problem by combining Moreau-Yosida regularization with bundle and quasi-Newton ideas. The method uses approximate subgradients of the objective function, which makes it easier to implement. We also prove the convergence of the proposed method under some additional assumptions.
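For intuition on the regularization these methods minimize: the Moreau-Yosida envelope F_mu(x) = min_y { f(y) + ||y - x||^2 / (2*mu) } is differentiable even when f is not, with gradient (x - p_mu(x)) / mu, where p_mu(x) is the proximal point. A one-dimensional illustration for f(y) = |y| (whose envelope is the Huber function); this is only an illustration, not the paper's algorithm:

```python
import numpy as np

def prox_abs(x, mu):
    # Proximal point of f(y) = |y|:  argmin_y |y| + (y - x)^2 / (2*mu),
    # which is the soft-thresholding of x at level mu.
    return np.sign(x) * max(abs(x) - mu, 0.0)

def moreau_yosida(x, mu):
    """Moreau-Yosida envelope of f(y) = |y| and its gradient.
    F_mu is C^1 even though f is nonsmooth at 0."""
    p = prox_abs(x, mu)
    F = abs(p) + (p - x) ** 2 / (2.0 * mu)       # envelope value
    grad = (x - p) / mu                          # gradient of the envelope
    return F, grad

F1, g1 = moreau_yosida(2.0, 0.5)   # away from the kink: F = |x| - mu/2, grad = sign(x)
F2, g2 = moreau_yosida(0.2, 0.5)   # near the kink:      F = x^2/(2*mu), grad = x/mu
```

Because minimizing f is equivalent to minimizing this smooth envelope, smooth machinery such as quasi-Newton updates can be applied; the bundle part supplies the inexact proximal points and approximate subgradients when no closed-form prox is available.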

A MODIFIED BFGS BUNDLE ALGORITHM BASED ON APPROXIMATE SUBGRADIENTS

  • Guo, Qiang;Liu, Jian-Guo
    • Journal of applied mathematics & informatics / Vol. 28, No. 5-6 / pp. 1239-1248 / 2010
  • In this paper, an implementable BFGS bundle algorithm for solving a nonsmooth convex optimization problem is presented. The method minimizes an approximate Moreau-Yosida regularization using a BFGS algorithm with inexact function values and approximate gradient values generated by a finite inner bundle algorithm. Approximate subgradients of the objective function are used, which makes the algorithm easier to implement. The convergence of the algorithm is proved under some additional assumptions.