• Title/Summary/Keyword: vector approximation

GEOMETRIC AND APPROXIMATION PROPERTIES OF GENERALIZED SINGULAR INTEGRALS IN THE UNIT DISK

  • Anastassiou, George A.; Gal, Sorin G.
    • Journal of the Korean Mathematical Society, v.43 no.2, pp.425-443, 2006
  • The aim of this paper is to obtain several approximation results for Jackson-type generalizations of the complex Picard, Poisson-Cauchy, and Gauss-Weierstrass singular integrals, in terms of higher-order moduli of smoothness. In addition, these generalized integrals preserve some sufficient conditions for starlikeness and univalence of analytic functions. Approximation results for vector-valued functions defined on the unit disk are also given.
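
For orientation, the classical complex Picard singular integral on the unit disk, which the Jackson-type generalizations modify, is commonly written as follows (a form standard in Gal's earlier work, assumed here rather than quoted from the paper):

$$P_{\xi}(f)(z)=\frac{1}{2\xi}\int_{-\infty}^{\infty}f(ze^{iu})\,e^{-|u|/\xi}\,du,\qquad z\in\overline{\mathbb{D}},\ \xi>0.$$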

Performance Enhancement of a DVA-tree by the Independent Vector Approximation

  • Choi, Hyun-Hwa; Lee, Kyu-Chul
    • The KIPS Transactions: Part D, v.19D no.2, pp.151-160, 2012
  • Most distributed high-dimensional indexing structures provide reasonable search performance when the dataset is uniformly distributed. However, when the dataset is clustered or skewed, search performance gradually degrades compared with the uniform case. We propose a method of improving k-nearest neighbor search performance for the distributed vector approximation-tree on strongly clustered or skewed datasets. The basic idea is to compute the volumes of the leaf nodes on the top-tree of a distributed vector approximation-tree and to assign a different number of bits to each of them, so that the vector approximation remains discriminative; in other words, more bits are assigned to high-density clusters. We conducted experiments comparing the search performance with the distributed hybrid spill-tree and the distributed vector approximation-tree on synthetic and real datasets. The results show that the proposed scheme consistently and significantly improves the performance of the distributed vector approximation-tree for strongly clustered or skewed datasets.
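
To make the bit-assignment idea concrete, here is a minimal Python sketch that gives denser leaf nodes a larger share of a global approximation-bit budget. The log-density weighting and the function name are my assumptions for illustration, not the authors' exact allocation rule.

```python
import numpy as np

def allocate_bits(leaf_volumes, leaf_counts, total_bits):
    """Hypothetical sketch: give leaves with higher point density
    (count / volume) a larger share of the approximation bits, so that
    vector approximations inside dense clusters stay discriminative.
    Not the authors' exact scheme."""
    density = np.asarray(leaf_counts) / np.maximum(np.asarray(leaf_volumes), 1e-12)
    share = np.log2(1.0 + density)        # diminishing returns per leaf
    share = share / share.sum()
    # note: rounding means the total may differ slightly from total_bits
    return np.maximum(1, np.round(share * total_bits).astype(int))

# toy example: three leaves, the first is a tight, dense cluster
print(allocate_bits(leaf_volumes=[0.01, 1.0, 5.0],
                    leaf_counts=[900, 80, 20],
                    total_bits=32))       # -> e.g. [21  8  3]
```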

Incremental Support Vector Learning Method for Function Approximation

  • Lim, Chae-Hwan; Park, Joo-Young
    • Proceedings of the IEEK Conference, 2002.06c, pp.135-138, 2002
  • This paper addresses an incremental learning method for regression. The SVM (support vector machine) is a recently proposed learning method. In general, training a support vector machine requires solving a QP (quadratic programming) problem. For very large or incrementally growing datasets, solving QP problems can be impractical. This paper therefore presents an incremental support vector learning method for function approximation problems.
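
A minimal sketch of one common way to make SVM regression incremental, assuming a chunk-wise retrain-on-support-vectors strategy; the paper's exact procedure is not reproduced here, and scikit-learn's SVR stands in for the QP solver.

```python
import numpy as np
from sklearn.svm import SVR

def incremental_svr(chunks, C=10.0, eps=0.1):
    """Hypothetical sketch of chunk-wise incremental SVR: retrain on the
    current support vectors plus each new data chunk, so no single QP
    ever covers the whole dataset. Not the authors' exact procedure."""
    X_keep = np.empty((0, chunks[0][0].shape[1]))
    y_keep = np.empty(0)
    model = None
    for X_new, y_new in chunks:
        X = np.vstack([X_keep, X_new])
        y = np.concatenate([y_keep, y_new])
        model = SVR(kernel="rbf", C=C, epsilon=eps).fit(X, y)
        X_keep, y_keep = X[model.support_], y[model.support_]  # keep only SVs
    return model

# toy usage: a noisy sine arriving in three chunks
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(300, 1))
y = np.sin(X).ravel() + 0.05 * rng.standard_normal(300)
chunks = [(X[i:i + 100], y[i:i + 100]) for i in (0, 100, 200)]
print(incremental_svr(chunks).predict([[0.5]]))
```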

Kernel Adatron Algorithm of Support Vector Machine for Function Approximation

  • Seok, Kyung-Ha; Hwang, Chang-Ha
    • The Transactions of the Korea Information Processing Society, v.7 no.6, pp.1867-1873, 2000
  • Function approximation from a set of input-output pairs has numerous applications in scientific and engineering areas. The support vector machine (SVM) is a new and very promising classification, regression, and function approximation technique developed by Vapnik and his group at AT&T Bell Laboratories. However, it has not yet established itself as a common machine learning tool, partly because it is not easy to implement: the standard implementation requires an optimization package for quadratic programming (QP). In this paper we present a simple iterative Kernel Adatron (KA) algorithm for function approximation and compare it with the standard QP-based SVM algorithm.
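
For illustration, a hedged sketch of a Kernel-Adatron-style solver for a bias-free epsilon-insensitive regression dual: coordinate-wise gradient steps with box clipping replace the QP package. The learning rate and iteration count are my assumptions; the paper's exact update rule may differ.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    sq = np.sum(X ** 2, axis=1)
    return np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))

def kernel_adatron_regression(K, y, C=10.0, eps=0.05, eta=None, n_iter=500):
    """Sketch of a Kernel-Adatron-style solver for the (bias-free)
    epsilon-insensitive SVR dual: cyclic coordinate-wise projected
    gradient ascent, clipping each coefficient to the box [-C, C]."""
    n = len(y)
    beta = np.zeros(n)
    eta = eta or 1.0 / np.max(np.diag(K))
    for _ in range(n_iter):
        for i in range(n):
            grad = y[i] - K[i] @ beta - eps * np.sign(beta[i])
            beta[i] = np.clip(beta[i] + eta * grad, -C, C)
    return beta

# toy usage: fit y = sin(x) on a small sample
rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, (60, 1))
y = np.sin(X).ravel()
K = rbf_kernel(X)
beta = kernel_adatron_regression(K, y)
print(np.abs(K @ beta - y).mean())   # training error of the fitted expansion
```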

COMMON FIXED POINT AND INVARIANT APPROXIMATION RESULTS

  • Abbas, Mujahid; Kim, Jong-Kyu
    • Bulletin of the Korean Mathematical Society, v.44 no.3, pp.537-545, 2007
  • Necessary conditions for the existence of common fixed points of noncommuting mappings satisfying generalized contractive conditions in the setting of certain metrizable topological vector spaces are obtained. As applications, related results on best approximation are derived. Our results extend, generalize, and unify various known results in the literature.

Automatic classification of power quality disturbances using orthogonal polynomial approximation and higher-order spectra

  • Lee, Jae-Sang; Lee, Chul-Ho; Nam, Sang-Won
    • Proceedings of the Institute of Control, Robotics and Systems (ICROS) Conference, 1997.10a, pp.1436-1439, 1997
  • The objective of this paper is to present an efficient and practical approach to the automatic classification of power quality (PQ) disturbances, where an orthogonal polynomial approximation method is employed for the detection and localization of PQ disturbances, and a feature vector, newly extracted from the bispectra of the detected signal, is utilized for the automatic recognition of the various types of PQ disturbances. To demonstrate the performance and applicability of the proposed approach, some simulation results are provided.
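
A minimal sketch of the detection stage under stated assumptions: Chebyshev polynomials stand in for the orthogonal family, and a residual-energy threshold flags disturbance windows. The bispectrum-based feature extraction and classifier are separate steps, not shown; window size, order, and threshold are illustrative.

```python
import numpy as np

def detect_disturbance(signal, win=64, order=5, thresh=3.0):
    """Fit a low-order Chebyshev (orthogonal) polynomial to each window
    of the waveform and flag windows whose residual RMS is anomalously
    large relative to the median over all windows."""
    t = np.linspace(-1, 1, win)
    rms = []
    for start in range(0, len(signal) - win + 1, win):
        seg = signal[start:start + win]
        coeffs = np.polynomial.chebyshev.chebfit(t, seg, order)
        resid = seg - np.polynomial.chebyshev.chebval(t, coeffs)
        rms.append(np.sqrt(np.mean(resid ** 2)))
    rms = np.array(rms)
    return np.where(rms > thresh * np.median(rms))[0]   # flagged window indices

# toy usage: a clean 60 Hz sine with a short transient added
fs = 3840
t = np.arange(0, 0.2, 1 / fs)
x = np.sin(2 * np.pi * 60 * t)
x[400:410] += 2.0                     # simulated disturbance
print(detect_disturbance(x))          # expect the window covering samples 384-447
```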

Forward Backward PAST (Projection Approximation Subspace Tracking) Algorithm for the Better Subspace Estimation Accuracy

  • Lim, Jun-Seok
    • The Journal of the Acoustical Society of Korea, v.27 no.1E, pp.25-29, 2008
  • The projection approximation subspace tracking (PAST) algorithm is one of the most attractive subspace tracking algorithms, because it estimates the signal subspace adaptively and continuously, and its computational complexity is relatively low. However, the algorithm still has room for improvement in subspace estimation accuracy. In this paper, we propose a new algorithm that improves the estimation accuracy by using a normally ordered input vector and a reversely ordered input vector simultaneously.
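
A sketch of the idea under stated assumptions: the standard PAST recursion (Yang, 1995) is applied to each snapshot and to its reversed, conjugated copy, which spans the same subspace for uniform-line-array data. This mirrors the forward-backward idea in the abstract; the paper's exact formulation may differ.

```python
import numpy as np

def past_step(W, P, x, beta=0.98):
    """One PAST update (Yang, 1995): recursive least-squares tracking of
    an r-dimensional signal subspace. W is the n x r basis estimate and
    P the r x r inverse correlation matrix of the projected data."""
    y = W.conj().T @ x
    h = P @ y
    g = h / (beta + y.conj() @ h)
    P = (P - np.outer(g, h.conj())) / beta
    e = x - W @ y                        # projection error of the snapshot
    W = W + np.outer(e, g.conj())
    return W, P

def fb_past(X, r, beta=0.98):
    """Forward-backward variant (hedged sketch): feed each snapshot and
    its reversed, conjugated copy through the same PAST recursion."""
    n = X.shape[0]
    W, P = np.eye(n, r), np.eye(r)
    for k in range(X.shape[1]):
        x = X[:, k]
        W, P = past_step(W, P, x, beta)                  # forward update
        W, P = past_step(W, P, np.conj(x[::-1]), beta)   # backward update
    return np.linalg.qr(W)[0]            # orthonormalized basis estimate

# toy usage: two complex exponentials on an 8-element uniform line array
rng = np.random.default_rng(0)
n, T = 8, 400
A = np.exp(1j * np.outer(np.arange(n), [0.6, 1.4]))     # steering vectors
S = np.exp(2j * np.pi * rng.random((2, T)))             # unit-modulus sources
N = 0.05 * (rng.standard_normal((n, T)) + 1j * rng.standard_normal((n, T)))
Q = fb_past(A @ S + N, r=2)
proj = Q @ (Q.conj().T @ A)              # A should lie in the tracked subspace
print(np.linalg.norm(A - proj) / np.linalg.norm(A))
```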

Multiclass Support Vector Machines with SCAD

  • Jung, Kang-Mo
    • Communications for Statistical Applications and Methods, v.19 no.5, pp.655-662, 2012
  • Classification is an important research field in pattern recognition with high-dimensional predictors. The support vector machine (SVM) is a penalized feature selector and classifier; here it is based on the hinge loss function and the non-convex smoothly clipped absolute deviation (SCAD) penalty suggested by Fan and Li (2001). We develop an algorithm for the multiclass SVM with the SCAD penalty using the local quadratic approximation. For multiclass problems we compare the performance of the developed method with that of the SVM under the $L_1$ and $L_2$ penalty functions.
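
To make the local quadratic approximation (LQA) concrete, here is a hedged sketch applying it to SCAD-penalized least squares rather than the paper's multiclass hinge loss. The SCAD derivative follows Fan and Li (2001) with a = 3.7; the solver details are my assumptions.

```python
import numpy as np

def scad_deriv(theta, lam, a=3.7):
    """SCAD penalty derivative p'_lambda(theta), theta >= 0 (Fan & Li, 2001)."""
    theta = np.abs(theta)
    return lam * ((theta <= lam) +
                  np.maximum(a * lam - theta, 0) / ((a - 1) * lam) * (theta > lam))

def lqa_weights(beta, lam, tol=1e-8):
    """LQA: near beta0 the penalty p(|b|) is replaced by a quadratic with
    curvature p'(|b0|)/|b0|, turning each step into a ridge-type problem."""
    b = np.maximum(np.abs(beta), tol)     # avoid division by zero
    return scad_deriv(b, lam) / b

def lqa_penalized_ls(X, y, lam, n_iter=50):
    """Hedged sketch: SCAD-penalized least squares solved by iterating
    ridge systems with LQA weights. The paper applies the same LQA idea
    to the multiclass SVM hinge loss, which is not reproduced here."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    for _ in range(n_iter):
        W = np.diag(lqa_weights(beta, lam))
        beta = np.linalg.solve(X.T @ X + len(y) * W, X.T @ y)
        beta[np.abs(beta) < 1e-6] = 0.0   # prune effectively-zero coefficients
    return beta

# toy usage: sparse signal recovery
rng = np.random.default_rng(3)
X = rng.standard_normal((100, 10))
true = np.zeros(10)
true[:3] = [2.0, -1.5, 1.0]
y = X @ true + 0.1 * rng.standard_normal(100)
print(np.round(lqa_penalized_ls(X, y, lam=0.2), 2))
```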

ON THE LEBESGUE SPACE OF VECTOR MEASURES

  • Choi, Chang-Sun; Lee, Keun-Young
    • Bulletin of the Korean Mathematical Society, v.48 no.4, pp.779-789, 2011
  • In this paper we study the Banach space $L^1(G)$ of real-valued measurable functions which are integrable with respect to a vector measure G in the sense of D. R. Lewis. First, we investigate conditions on a scalarly integrable function f which guarantee $f{\in}L^1(G)$. Next, we give a sufficient condition for a sequence to converge in $L^1(G)$. Moreover, for two vector measures F and G with values in the same Banach space, when F can be written as the integral of a function $f{\in}L^1(G)$, we show that certain properties of G are inherited by F, for instance, relative compactness or convexity of the range of the vector measure. Finally, we give some examples of $L^1(G)$ related to the approximation property.
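
For context, the norm on $L^1(G)$ in Lewis's theory of integration with respect to a vector measure is standardly given by the expression below (a textbook form, assumed here rather than quoted from the paper):

$$\|f\|_{L^1(G)}=\sup_{\|x^*\|\leq 1}\int_{\Omega}|f|\,d\,|\langle G,x^*\rangle|,$$

where $\langle G,x^*\rangle$ is the scalar measure $A\mapsto x^*(G(A))$ and $|\langle G,x^*\rangle|$ denotes its total variation.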