• Title/Summary/Keyword: vector programming


A Decomposition Method for Two-Stage Stochastic Programming with Block Diagonal Structure (블록 대각 구조를 지닌 2단계 확률계획법의 분해원리)

  • 김태호;박순달
    • Journal of the Korean Operations Research and Management Science Society / v.10 no.1 / pp.9-13 / 1985
  • This paper develops a decomposition method for stochastic programming with a block diagonal structure. Here we assume that the right-hand side random vector of each subproblem is different from the others. We first transform this problem into a master problem and subproblems, in a manner similar to the Dantzig-Wolfe decomposition principle, and then solve the master problem by solving the subproblems. When we solve a subproblem, we first transform it into a deterministic equivalent form (DEF). The form of the DEF depends on the type of the random vector of the subproblem. We found that a subproblem with a finite discrete random vector can be transformed into a linear program, one with a continuous random vector into a convex quadratic program, and one with a random vector of unknown distribution but known mean and variance into a convex nonlinear program, while the master problem is always a linear program.

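A minimal LaTeX sketch of the kind of model the abstract above describes, assuming a standard two-stage formulation with block-diagonal recourse; the symbols (c, A, b, q_i, W_i, T_i, h_i(\omega_i)) are generic placeholders, not the paper's notation.

```latex
% Two-stage stochastic LP with block-diagonal second stage (generic notation).
% The first stage chooses x; each block i has its own recourse subproblem whose
% right-hand side h_i(\omega_i) is random, as in the abstract.
\begin{align*}
\min_{x \ge 0}\quad & c^{\top}x + \sum_{i=1}^{m} \mathbb{E}_{\omega_i}\!\bigl[\, Q_i(x,\omega_i) \,\bigr]
\qquad \text{s.t. } Ax = b, \\
\text{where}\quad   & Q_i(x,\omega_i) = \min_{y_i \ge 0}\;\bigl\{\, q_i^{\top} y_i \;:\; W_i y_i = h_i(\omega_i) - T_i x \,\bigr\}.
\end{align*}
% Dantzig-Wolfe-style treatment: a master LP coordinates the blocks, and each block's
% expected recourse is handled through its deterministic equivalent form (an LP for a
% finite discrete \omega_i, a convex QP for a continuous \omega_i, and a convex NLP
% when only the mean and variance of \omega_i are known).
```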

ANOTHER APPROACH TO MULTIOBJECTIVE PROGRAMMING PROBLEMS WITH F-CONVEX FUNCTIONS

Liu, Sanming;Feng, Enmin
    • Journal of applied mathematics & informatics / v.17 no.1_2_3 / pp.379-390 / 2005
  • In this paper, optimality conditions for multiobjective programming problems having F-convex objective and constraint functions are considered. An equivalent multiobjective programming problem is constructed by a modification of the objective function. Furthermore, an F-Lagrange function is introduced for the constructed multiobjective programming problem, and a new type of saddle point is introduced. Some results for this new type of saddle point are given.
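For orientation only, the commonly used definition of F-convexity with respect to a sublinear functional F is sketched below in LaTeX; the paper may work with a variant, so this is the standard textbook form rather than a restatement of its definitions.

```latex
% Usual definition of F-convexity (F(x, u; \cdot) sublinear in its third argument);
% the textbook form, not necessarily the exact variant used in the paper.
\begin{align*}
f \text{ is } F\text{-convex at } u \quad\Longleftrightarrow\quad
f(x) - f(u) \;\ge\; F\bigl(x, u;\, \nabla f(u)\bigr) \quad \text{for all feasible } x.
\end{align*}
% Choosing F(x, u; a) = a^{\top}(x - u) recovers ordinary differentiable convexity.
```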

FENCHEL DUALITY THEOREM IN MULTIOBJECTIVE PROGRAMMING PROBLEMS WITH SET FUNCTIONS

  • Liu, Sanming;Feng, Enmin
    • Journal of applied mathematics & informatics / v.13 no.1_2 / pp.139-152 / 2003
  • In this paper, we characterize a vector-valued convex set function by its epigraph. The concepts of a vector-valued convex set function and a vector-valued concave set function are given. The definitions of the conjugate functions for a vector-valued convex set function and a vector-valued concave set function are introduced. Then a Fenchel duality theorem for multiobjective programming problems with set functions is derived.
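As background for the duality result described above, the classical scalar Fenchel duality theorem for point functions is recalled below in LaTeX; the paper's contribution is the extension of this pattern to vector-valued convex and concave set functions, which is not reproduced here.

```latex
% Classical Fenchel duality for point functions, under suitable regularity conditions
% (f convex, g concave, with overlapping relative interiors of their domains).
\begin{align*}
\inf_{x}\,\bigl\{ f(x) - g(x) \bigr\}
= \sup_{\xi}\,\bigl\{ g_{*}(\xi) - f^{*}(\xi) \bigr\},
\qquad
f^{*}(\xi) = \sup_{x}\bigl\{ \langle \xi, x\rangle - f(x) \bigr\},\quad
g_{*}(\xi) = \inf_{x}\bigl\{ \langle \xi, x\rangle - g(x) \bigr\},
\end{align*}
% where f^{*} is the convex conjugate and g_{*} the concave conjugate.
```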

MULTIOBJECTIVE FRACTIONAL PROGRAMMING WITH A MODIFIED OBJECTIVE FUNCTION

  • Kim, Do-Sang
    • Communications of the Korean Mathematical Society / v.20 no.4 / pp.837-847 / 2005
  • We consider multiobjective fractional programming problems with generalized invexity. An equivalent multiobjective programming problem is formulated by using a modification of the objective function due to Antczak. We give relations between a multiobjective fractional programming problem and the equivalent multiobjective problem with the modified objective function, and we present modified vector saddle point theorems.
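One common way to pass from a fractional objective to an equivalent non-fractional one is the parametric (Dinkelbach-type) construction sketched below; it is shown only as a hedged illustration of the general idea, and no claim is made that it coincides with Antczak's modification used in the paper.

```latex
% Multiobjective fractional program and a parametric reformulation at a feasible
% point x^0 (illustrative only; the notation is generic, not the paper's).
\begin{align*}
\text{(MFP)}\quad & \min\; \Bigl( \tfrac{f_1(x)}{g_1(x)}, \dots, \tfrac{f_p(x)}{g_p(x)} \Bigr)
\quad \text{s.t. } x \in X, \\
\text{(MFP)}_{x^0}\quad & \min\; \bigl( f_1(x) - v_1^{0}\, g_1(x), \dots, f_p(x) - v_p^{0}\, g_p(x) \bigr)
\quad \text{s.t. } x \in X, \qquad v_i^{0} = \frac{f_i(x^{0})}{g_i(x^{0})}.
\end{align*}
```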

Quadratic Loss Support Vector Interval Regression Machine for Crisp Input-Output Data

  • Hwang, Chang-Ha
    • Journal of the Korean Data and Information Science Society / v.15 no.2 / pp.449-455 / 2004
  • The support vector machine (SVM) has been very successful in pattern recognition and function estimation problems for crisp data. This paper proposes a new method for evaluating interval regression models for crisp input-output data. The proposed method is based on the quadratic loss SVM, which implements a quadratic programming approach giving more diverse spread coefficients than a linear programming one. The proposed algorithm is a model-free method in the sense that we do not have to assume the underlying model function. Experimental results are then presented which indicate the performance of this algorithm.

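The abstract above contrasts a quadratic-loss (quadratic programming) SVM with a linear programming one. The Python sketch below illustrates only the quadratic-loss idea, using a least-squares SVM regressor whose quadratic loss reduces training to a linear system; it is not the paper's interval regression machine, and the kernel width `gamma` and regularization `C` are arbitrary choices.

```python
import numpy as np

def rbf_kernel(X1, X2, gamma=1.0):
    """Gaussian (RBF) kernel matrix between two sample sets."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def lssvm_fit(X, y, C=10.0, gamma=1.0):
    """Quadratic-loss (least-squares) SVM regression: solve the KKT linear system
    [[0, 1^T], [1, K + I/C]] [b; alpha] = [0; y]."""
    n = len(y)
    K = rbf_kernel(X, X, gamma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / C
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]          # alpha, b

def lssvm_predict(X_train, alpha, b, X_new, gamma=1.0):
    return rbf_kernel(X_new, X_train, gamma) @ alpha + b

# Tiny usage example on synthetic crisp input-output data.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(40, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(40)
alpha, b = lssvm_fit(X, y)
print(lssvm_predict(X, alpha, b, np.array([[0.0], [1.5]])))
```

An interval model in the spirit of the paper would fit lower and upper bound functions rather than a single regressor; the sketch only shows why the quadratic loss keeps the training problem convex and simple.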

Improvement of Support Vector Clustering using Evolutionary Programming and Bootstrap

  • Jun, Sung-Hae
    • International Journal of Fuzzy Logic and Intelligent Systems / v.8 no.3 / pp.196-201 / 2008
  • Statistical learning theory has three analytical tools: the support vector machine, support vector regression, and support vector clustering, for classification, regression, and clustering respectively. In general, their performance is good because they are formulated as convex optimization problems. But the methods have some problems. One of these is that the parameters of the kernel function and the regularization constant are determined subjectively, by the art of the researcher, and the results of the learning machines depend on the selected parameters. In this paper, we propose an efficient method for the objective determination of the parameters of support vector clustering, the clustering method of statistical learning theory. Using an evolutionary algorithm and the bootstrap method, we select the parameters of the kernel function and the regularization constant objectively. To verify the improved performance of the proposed method, we compare it with established learning algorithms using data sets from the UCI machine learning repository and synthetic data.
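A heavily hedged Python sketch of the overall recipe described above: an evolutionary-programming loop (Gaussian mutation plus truncation selection) searches over the kernel width and the regularization-type parameter, and each candidate is scored with a bootstrap-based fitness. The one-class SVM is used here as a stand-in for the support vector clustering core, and the out-of-bootstrap decision-value fitness is a placeholder criterion, not the one from the paper.

```python
import numpy as np
from sklearn.svm import OneClassSVM

def bootstrap_fitness(X, gamma, nu, n_boot=10, seed=0):
    """Placeholder bootstrap fitness: fit a one-class SVM (the core of support
    vector clustering) on each bootstrap resample and score the out-of-bootstrap
    points by their mean decision value (higher = better enclosed)."""
    rng = np.random.default_rng(seed)
    n, scores = len(X), []
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)
        oob = np.setdiff1d(np.arange(n), idx)
        if oob.size == 0:
            continue
        model = OneClassSVM(kernel="rbf", gamma=gamma, nu=nu).fit(X[idx])
        scores.append(model.decision_function(X[oob]).mean())
    return float(np.mean(scores))

def evolve_parameters(X, pop_size=6, generations=5, seed=1):
    """Simple evolutionary-programming loop over (log10 gamma, nu)."""
    rng = np.random.default_rng(seed)
    pop = [(rng.uniform(-2, 2), rng.uniform(0.05, 0.5)) for _ in range(pop_size)]
    for _ in range(generations):
        children = [(lg + rng.normal(0, 0.3),
                     float(np.clip(nu + rng.normal(0, 0.05), 0.01, 0.9)))
                    for lg, nu in pop]
        ranked = sorted(pop + children,
                        key=lambda p: bootstrap_fitness(X, 10.0 ** p[0], p[1]),
                        reverse=True)
        pop = ranked[:pop_size]  # truncation selection: keep the fittest half
    best_lg, best_nu = pop[0]
    return 10.0 ** best_lg, best_nu

# Usage on small synthetic data with two clusters.
rng = np.random.default_rng(2)
X = np.vstack([rng.normal(0.0, 0.3, size=(30, 2)), rng.normal(3.0, 0.3, size=(30, 2))])
gamma, nu = evolve_parameters(X)
print("selected gamma, nu:", gamma, nu)
```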

MODIFIED GEOMETRIC PROGRAMMING PROBLEM AND ITS APPLICATIONS

Islam, Sahidul;Kumar Roy, Tapan
    • Journal of applied mathematics & informatics / v.17 no.1_2_3 / pp.121-144 / 2005
  • In this paper, we propose unconstrained and constrained posynomial geometric programming (GP) problems with negative or positive integral degrees of difficulty. The conventional GP approach has been modified to solve some special types of GP problems. In the specific case when the degree of difficulty is negative, the normality and orthogonality conditions of the dual program give a system of linear equations with more equations than unknowns. No exact solution vector exists for this system in general, but an approximate solution can be determined by the least squares method and also by a max-min method. Here, a modified form of the geometric programming method is demonstrated, and the theorems necessary for it are derived. Finally, these are illustrated by numerical examples and applications.
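The least-squares device mentioned in the abstract is concrete enough to sketch in Python: with a negative degree of difficulty, the dual normality condition (the weights sum to one) and the orthogonality conditions (the exponent matrix applied to the weights equals zero) give more equations than unknowns, and an approximate solution can be obtained with `numpy.linalg.lstsq`. The exponent matrix below is a hypothetical example, not one of the paper's problems.

```python
import numpy as np

# Hypothetical unconstrained posynomial GP with 3 terms in 3 variables:
#   g(x) = c1*x1*x2 + c2*x1^-1*x3 + c3*x2^-2*x3^-1   (exponents made up for illustration)
# Degree of difficulty = T - (n + 1) = 3 - 4 = -1, so the dual conditions form
# an overdetermined linear system in the dual weights delta.
A_exp = np.array([
    [1.0, -1.0,  0.0],   # exponents of x1 in each term
    [1.0,  0.0, -2.0],   # exponents of x2 in each term
    [0.0,  1.0, -1.0],   # exponents of x3 in each term
])

# Stack normality (sum of deltas = 1) on top of the orthogonality rows (A_exp @ delta = 0).
M = np.vstack([np.ones((1, 3)), A_exp])
rhs = np.array([1.0, 0.0, 0.0, 0.0])

# Approximate dual weights in the least-squares sense, as described in the abstract.
delta, residual, rank, _ = np.linalg.lstsq(M, rhs, rcond=None)
print("approximate dual weights:", delta)
print("residual norm:", np.linalg.norm(M @ delta - rhs))
```

The max-min alternative the abstract also mentions would replace the least-squares objective with a Chebyshev-type criterion on the same residuals.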

Support Vector Machine Based on Type-2 Fuzzy Training Samples

  • Ha, Ming-Hu;Huang, Jia-Ying;Yang, Yang;Wang, Chao
    • Industrial Engineering and Management Systems / v.11 no.1 / pp.26-29 / 2012
  • In order to deal with the classification of type-2 fuzzy training samples on a generalized credibility space, the type-2 fuzzy training samples are first reduced to ordinary fuzzy samples by the mean reduction method. Secondly, the definition of strongly fuzzy linearly separable data for type-2 fuzzy samples on a generalized credibility space is introduced. Further, by utilizing fuzzy chance-constrained programming and the classic support vector machine, a support vector machine based on type-2 fuzzy training samples and established on a generalized credibility space is given. An example shows the efficiency of this support vector machine.
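A hedged Python sketch of the reduce-then-classify pattern the abstract outlines: the type-2 fuzzy samples are represented here, purely for illustration, as interval-valued features; the "mean reduction" step is simulated by averaging the interval bounds, after which a classic SVM is trained. Neither the data representation nor the credibility-space machinery of the paper is reproduced.

```python
import numpy as np
from sklearn.svm import SVC

def mean_reduce(lower, upper):
    """Illustrative 'mean reduction': collapse interval-valued (fuzzy-style)
    features to crisp features by taking the midpoint of each interval."""
    return 0.5 * (lower + upper)

# Hypothetical interval-valued training data (two classes in 2-D).
rng = np.random.default_rng(0)
centers = np.repeat(np.array([[0.0, 0.0], [2.0, 2.0]]), 25, axis=0)
lower = centers + rng.normal(0, 0.3, centers.shape) - 0.1
upper = lower + rng.uniform(0.0, 0.4, centers.shape)
labels = np.repeat([0, 1], 25)

# Reduce to crisp samples, then train a classic SVM on the reduced data.
X_crisp = mean_reduce(lower, upper)
clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X_crisp, labels)
print("training accuracy on reduced samples:", clf.score(X_crisp, labels))
```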

MULTIOBJECTIVE VARIATIONAL PROGRAMMING UNDER GENERALIZED VECTOR VARIATIONAL TYPE I INVEXITY

  • Kim, Moon-Hee
    • Communications of the Korean Mathematical Society / v.19 no.1 / pp.179-196 / 2004
  • Mond-Weir type duals for multiobjective variational problems are formulated. Under generalized vector variational type I invexity assumptions on the functions involved, sufficient optimality conditions and weak and strong duality theorems are proved for efficient and properly efficient solutions of the primal and dual problems.
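For orientation, the static (non-variational) Mond-Weir dual pair is recalled below in LaTeX under standard differentiability assumptions; the paper formulates the analogous dual for variational (integral) problems under generalized vector variational type I invexity, which is not reproduced here.

```latex
% Static Mond-Weir dual pair for a differentiable multiobjective program,
% shown only as background for the variational version studied in the paper.
\begin{align*}
\text{(MP)}\quad & \min\; f(x) = \bigl(f_1(x),\dots,f_p(x)\bigr)
   \quad \text{s.t. } g(x) \le 0, \\
\text{(MWD)}\quad & \max\; f(u)
   \quad \text{s.t. } \sum_{i=1}^{p} \lambda_i \nabla f_i(u) + \nabla\!\bigl(y^{\top} g\bigr)(u) = 0,\;\;
   y^{\top} g(u) \ge 0,\;\; y \ge 0,\;\; \lambda \ge 0,\;\; \textstyle\sum_i \lambda_i = 1.
\end{align*}
```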

Study on Support Vector Machines Using Mathematical Programming (수리계획법을 이용한 서포트 벡터 기계 방법에 관한 연구)

  • Yoon, Min;Lee, Hak-Bae
    • The Korean Journal of Applied Statistics / v.18 no.2 / pp.421-434 / 2005
  • Machine learning has been extensively studied in recent years as an effective tool for pattern classification problems. Although there have been several approaches to machine learning, we focus in this paper on mathematical programming (in particular, multi-objective and goal programming; MOP/GP) approaches. Among them, the support vector machine (SVM) has recently been gaining much popularity. In a pattern classification problem with two class sets, the idea is to find a maximal-margin separating hyperplane which gives the greatest separation between the classes in a high-dimensional feature space. However, the idea of maximal-margin separation is not quite new: in the 1960s the multi-surface method (MSM) was suggested by Mangasarian, and in the 1980s linear classifiers using goal programming were developed extensively. This paper proposes a new family of SVMs using MOP/GP techniques and discusses their effectiveness through several numerical experiments.
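As a small, hedged illustration of the goal-programming style of linear classifier the abstract alludes to (in the spirit of the 1980s LP classifiers rather than the paper's specific MOP/GP family), the Python sketch below minimizes the total misclassification deviation of a linear discriminant with `scipy.optimize.linprog`; the data, variable names, and the unit margin are arbitrary choices.

```python
import numpy as np
from scipy.optimize import linprog

# Two small synthetic classes (labels +1 / -1).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.5, size=(20, 2)), rng.normal(2.0, 0.5, size=(20, 2))])
y = np.concatenate([np.ones(20), -np.ones(20)])
n, d = X.shape

# Decision variables z = (w_1, w_2, b, xi_1, ..., xi_n):
# minimize the summed deviations xi_i subject to y_i (w.x_i + b) >= 1 - xi_i, xi_i >= 0.
c = np.concatenate([np.zeros(d + 1), np.ones(n)])
A_ub = np.hstack([-(y[:, None] * X), -y[:, None], -np.eye(n)])
b_ub = -np.ones(n)
bounds = [(None, None)] * (d + 1) + [(0, None)] * n

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
w, b = res.x[:d], res.x[d]
print("separating hyperplane w, b:", w, b)
print("training accuracy:", np.mean(np.sign(X @ w + b) == y))
```

Adding a norm term on w to this deviation-minimizing objective gives the familiar soft-margin SVM, which is the kind of connection between MOP/GP formulations and SVMs that the paper explores.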