• Title/Summary/Keyword: Inverse Optimal Approach


Neural source localization using particle filter with optimal proportional set resampling

  • Veeramalla, Santhosh Kumar;Talari, V.K. Hanumantha Rao
    • ETRI Journal
    • /
    • v.42 no.6
    • /
    • pp.932-942
    • /
    • 2020
  • To recover neural activity from magnetoencephalography (MEG) and electroencephalography (EEG) measurements, we need to solve the inverse problem by exploiting the relation between dipole sources and the data they generate. In this study, we propose a new approach based on a particle filter (PF) that uses a minimum-sampling-variance resampling methodology to track the neural dipole sources of cerebral activity. We apply this approach to EEG data and demonstrate that it estimates the sources more precisely than the traditional systematic resampling scheme used in PFs.
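
The abstract contrasts the proposed minimum-sampling-variance (optimal proportional set) scheme with the standard systematic resampling step of a particle filter. As a point of reference only, here is a minimal, generic systematic-resampling routine, not the authors' method; the particle array, weights, and dipole-state layout are illustrative assumptions.

```python
import numpy as np

def systematic_resample(particles, weights, rng=np.random.default_rng()):
    """Baseline systematic resampling for a particle filter.

    particles : (N, d) array of state samples (e.g. dipole position/moment)
    weights   : (N,) normalized importance weights
    Returns resampled particles and reset (uniform) weights.
    """
    n = len(weights)
    # One uniform offset, then evenly spaced points over [0, 1)
    positions = (rng.random() + np.arange(n)) / n
    cumulative = np.cumsum(weights)
    cumulative[-1] = 1.0            # guard against round-off
    idx = np.searchsorted(cumulative, positions)
    return particles[idx], np.full(n, 1.0 / n)

# Example: 500 particles with a 6-D dipole state (3 position + 3 moment components, assumed)
rng = np.random.default_rng(0)
p = rng.normal(size=(500, 6))
w = rng.random(500); w /= w.sum()
p_new, w_new = systematic_resample(p, w, rng)
```

The paper's minimum-sampling-variance resampling would replace this step; its selection rule is not reproduced here.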

Learning Optimal Trajectory Generation for Low-Cost Redundant Manipulator using Deep Deterministic Policy Gradient(DDPG) (저가 Redundant Manipulator의 최적 경로 생성을 위한 Deep Deterministic Policy Gradient(DDPG) 학습)

  • Lee, Seunghyeon;Jin, Seongho;Hwang, Seonghyeon;Lee, Inho
    • The Journal of Korea Robotics Society
    • /
    • v.17 no.1
    • /
    • pp.58-67
    • /
    • 2022
  • In this paper, we propose an approach for resolving the workspace inaccuracy of a low-cost redundant manipulator built with low-cost encoders and low-stiffness links. When manipulators are manufactured from such components, they can run into workspace-inaccuracy issues, and trajectory generation based on conventional forward/inverse kinematics that ignores these issues introduces the risk of end-effector fluctuations. Hence, we propose an optimized trajectory-generation method based on the DDPG (Deep Deterministic Policy Gradient) algorithm for low-cost redundant manipulators reaching a target position in Euclidean space. We design the DDPG objective to minimize the distance to the target together with the Jacobian condition number. The training environment is a simulator with real-world physics in which joint configurations are generated randomly with an error rate, and the approach is demonstrated in a test environment consisting of real robot experiments.
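
The abstract states that the DDPG objective combines the end-effector distance to the target with the Jacobian condition number. A hedged sketch of such a reward term is shown below; the weighting factor and the `forward_kinematics`/`jacobian` helpers are hypothetical placeholders, not the authors' implementation.

```python
import numpy as np

def reward(q, target, forward_kinematics, jacobian, w_cond=0.01):
    """Reward penalizing target distance and Jacobian ill-conditioning.

    q                  : joint angles of the redundant manipulator
    target             : desired end-effector position in Euclidean space
    forward_kinematics : q -> end-effector position (user supplied)
    jacobian           : q -> manipulator Jacobian matrix (user supplied)
    w_cond             : weight on the condition-number penalty (assumed value)
    """
    dist = np.linalg.norm(forward_kinematics(q) - target)
    cond = np.linalg.cond(jacobian(q))   # grows large near singular configurations
    return -(dist + w_cond * cond)
```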

Study of Forming Analysis Auto-body Panel Using One-step Theory (One-Step 이론을 이용한 차체판넬 성형 해석에 관한 연구)

  • Ahn H.G.;Ko H.H.;Lee C.H.;Ahn B.I.;Moon W.S.;Jung D.W.
    • Proceedings of the Korean Society of Precision Engineering Conference
    • /
    • 2005.06a
    • /
    • pp.585-588
    • /
    • 2005
  • Many process parameters affect the auto-body panel forming process. A well-designed blank shape causes the material to flow smoothly, reduces the punch load, and yields a product with a uniform thickness distribution. Therefore, determining the initial blank shape plays an important role in saving time and cost in auto-body panel forming. For these reasons, several approaches to estimating the initial blank shape have been implemented. In this paper, a one-step approach using a finite element inverse method is introduced to predict the optimal forming under changes in blank pressure, and the developed program is applied to auto-body panel forming.


Determination of Optimal Cluster Size Using Bootstrap and Genetic Algorithm (붓스트랩 기법과 유전자 알고리즘을 이용한 최적 군집 수 결정)

  • Park, Min-Jae;Jun, Sung-Hae;Oh, Kyung-Whan
    • Journal of the Korean Institute of Intelligent Systems
    • /
    • v.13 no.1
    • /
    • pp.12-17
    • /
    • 2003
  • The choice of the number of clusters affects the result of clustering. In the K-means algorithm, clustering performance varies widely with the initial K, yet in most clustering work the initial cluster count is set from prior knowledge or subjective judgment, which may not be optimal. In this paper, a genetic-algorithm-based approach is proposed to determine the cluster count automatically and improve the resulting performance. An initial population based on the data attributes is generated to search for the optimal cluster count. The fitness value is defined as the inverse of the summed dissimilarity, so the search converges toward better overall performance. A mutation operation is used to escape local minima. Finally, bootstrap resampling is used to reduce the computational cost.
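
The abstract describes a genetic search over the cluster count K with fitness equal to the inverse of the summed within-cluster dissimilarity, mutation to escape local minima, and bootstrap resampling to cut computation. A minimal sketch along those lines, using scikit-learn's KMeans, is below; the population size, mutation rate, and bootstrap fraction are illustrative assumptions, and the paper's exact dissimilarity definition is not reproduced.

```python
import numpy as np
from sklearn.cluster import KMeans

def fitness(k, data, rng, boot_frac=0.5):
    """Inverse of the summed within-cluster dissimilarity on a bootstrap sample.
    NOTE: raw inertia shrinks monotonically with k; the paper's exact criterion
    is not given in the abstract, so this is only an assumed stand-in."""
    sample = data[rng.choice(len(data), int(boot_frac * len(data)), replace=True)]
    inertia = KMeans(n_clusters=k, n_init=5, random_state=0).fit(sample).inertia_
    return 1.0 / (inertia + 1e-12)

def ga_cluster_size(data, k_min=2, k_max=15, pop=10, gens=20, mut_rate=0.3, seed=0):
    rng = np.random.default_rng(seed)
    population = rng.integers(k_min, k_max + 1, size=pop)
    for _ in range(gens):
        scores = np.array([fitness(int(k), data, rng) for k in population])
        # Selection: keep the better half, refill by copying survivors
        survivors = population[np.argsort(scores)[::-1][: pop // 2]]
        population = np.concatenate([survivors, rng.choice(survivors, pop - len(survivors))])
        # Mutation: random +/-1 steps to escape local optima
        jitter = rng.integers(-1, 2, size=pop) * (rng.random(pop) < mut_rate)
        population = np.clip(population + jitter, k_min, k_max)
    return max(set(population.tolist()), key=lambda k: fitness(k, data, rng))
```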

A Study on the Analysis of Optimal Asset Allocation and Welfare Improvement Factors through ESG Investment (ESG투자를 통한 최적자산배분과 후생개선 요인분석에 관한 연구)

  • Hyun, Sangkyun;Lee, Jeongseok;Rhee, Joon-Hee
    • Journal of Korean Society for Quality Management
    • /
    • v.51 no.2
    • /
    • pp.171-184
    • /
    • 2023
  • Purpose: First, this paper suggests an alternative approach to finding the optimal portfolio (stocks, bonds, and ESG stocks) that maximizes investor utility. Second, we include ESG stocks in the optimal portfolio and compare the welfare improvement with and without ESG stocks in the portfolio. Methods: Our analysis follows Brennan et al. (2002) and is set in a continuous-time framework. We assume that the stock price follows a geometric Brownian motion (GBM) while the short rate follows the Vasicek model. For investor preferences we use the power utility function, which is common in financial studies. The optimal portfolio and welfare are derived in partial equilibrium, and the parameters are estimated with a Kalman filter and ordinary least squares. Results: Over the full analysis period, the portfolio including ESG did not show a clear welfare improvement. In 2017 it slightly exceeded the benchmark of 1, suggesting the possibility of improvement, but the ESG stocks we selected did not show statistically significant welfare gains. We found that the factors affecting optimal asset allocation and those affecting welfare improvement differ from each other, and that the optimal allocation depends on factors such as asset returns, volatility, and the inverse correlation between stocks and bonds, in line with traditional financial theory. Conclusion: The ESG portfolio did not show a significant welfare improvement because 1) the KRX ESG Leaders 150 index used in our study is based on ESG-integrated scores, which are designed to reflect stability rather than profitability, and 2) Korea has a short history of ESG investment; during the limited analysis period, stock-related assets underperformed bond assets as interest rates fell.
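
The methods section states that stock prices follow a geometric Brownian motion, the short rate follows the Vasicek model, and preferences are represented by a power utility function. The fragment below is a toy simulation of those two dynamics plus the power utility under made-up parameter values; it is not the authors' estimated model (their parameters come from a Kalman filter and OLS).

```python
import numpy as np

# Illustrative (not estimated) parameters
mu, sigma = 0.07, 0.20                  # GBM drift and volatility of the stock
kappa, theta, eta = 1.5, 0.03, 0.01     # Vasicek mean-reversion speed, long-run rate, volatility
gamma = 3.0                             # relative risk aversion for power utility
T, n = 1.0, 252
dt = T / n
rng = np.random.default_rng(1)

S = np.empty(n + 1); S[0] = 100.0       # stock price path (GBM)
r = np.empty(n + 1); r[0] = 0.02        # short-rate path (Vasicek)
for t in range(n):
    dW_s, dW_r = rng.normal(0.0, np.sqrt(dt), size=2)
    S[t + 1] = S[t] * np.exp((mu - 0.5 * sigma**2) * dt + sigma * dW_s)
    r[t + 1] = r[t] + kappa * (theta - r[t]) * dt + eta * dW_r

def power_utility(w, gamma=gamma):
    """CRRA (power) utility of terminal wealth."""
    return np.log(w) if gamma == 1.0 else (w ** (1.0 - gamma) - 1.0) / (1.0 - gamma)

print(power_utility(S[-1] / S[0]))      # utility of the gross stock return over one year
```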

Optimization Model on the World Wide Web Organization with respect to Content Centric Measures (월드와이드웹의 내용기반 구조최적화)

  • Lee Wookey;Kim Seung;Kim Hando;Kang Sukho
    • Journal of the Korean Operations Research and Management Science Society
    • /
    • v.30 no.1
    • /
    • pp.187-198
    • /
    • 2005
  • The structure of a Web site can prevent search robots or crawling agents from getting lost in the huge forest of Web pages. We formalize a view of the World Wide Web as a hierarchy of Web objects: the Web as a set of Web sites, and a Web site as a directed graph of Web nodes and Web edges. Our approach yields the optimal hierarchical structure that maximizes the tf-idf weight (term frequency and inverse document frequency), one of the most widely accepted content-centric measures in the information retrieval community, so that the measure can embody the semantics of a search query. The experimental results show that the optimization model is an effective alternative to conventional heuristic approaches in the dynamically changing Web environment.
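
Because the model's objective weight is tf-idf, a compact from-scratch term-frequency times inverse-document-frequency computation is sketched below; the toy "Web pages" and the particular tf-idf variant (raw counts, smoothed idf) are assumptions rather than the paper's exact formulation.

```python
import math
from collections import Counter

pages = {                                   # toy corpus standing in for Web pages
    "p1": "optimal web site structure search".split(),
    "p2": "web crawling agents search robots".split(),
    "p3": "inverse document frequency weighting".split(),
}

n_docs = len(pages)
# Document frequency: number of pages containing each term
df = Counter(term for words in pages.values() for term in set(words))

def tf_idf(page_words, term):
    """tf-idf of one term in one page: raw term count x smoothed idf."""
    tf = page_words.count(term)
    idf = math.log((1 + n_docs) / (1 + df[term])) + 1.0
    return tf * idf

print(tf_idf(pages["p1"], "web"))
```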

Optimal Non-Uniform Resampling Algorithm (최적 비정규 리샘플링 알고리즘)

  • Sin, Geon-Sik;Lee, Hak-Mu;Gang, Mun-Gi
    • Journal of the Institute of Electronics Engineers of Korea SP
    • /
    • v.39 no.2
    • /
    • pp.50-55
    • /
    • 2002
  • The standard approach to image resampling is to fit the original image with a continuous model and resample that function at the desired rate. We use the B-spline function as the continuous model because it oscillates less than alternatives. The main purpose of this paper is the derivation of a non-uniform optimal resampling algorithm. The required approximation can be computed in three steps: 1) determining the B-spline coefficients by a matrix inversion process, 2) obtaining the transformed spline coefficients with the optimal resampling algorithm derived from the orthogonal projection theorem, and 3) converting the result back into the signal domain by the indirect B-spline transform. With this method, the B-spline, which is known to be a good kernel for uniform resampling, can also be used for non-uniform resampling, and our experiments verify its applicability.
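
The three steps in the abstract amount to fitting a B-spline model to the uniformly sampled signal and then evaluating it at arbitrary (non-uniform) positions. The sketch below uses SciPy's cubic B-spline interpolant for the fit-and-evaluate part only; the paper's orthogonal-projection-based optimal coefficient transform is not reproduced.

```python
import numpy as np
from scipy.interpolate import make_interp_spline

# Uniformly sampled 1-D signal (a stand-in for one image row)
x_uniform = np.arange(32)
signal = np.sin(2 * np.pi * x_uniform / 16.0)

# Step 1: determine cubic B-spline coefficients that interpolate the uniform samples
spline = make_interp_spline(x_uniform, signal, k=3)

# Step 3 analogue: evaluate the continuous model at non-uniform positions
x_nonuniform = np.sort(np.random.default_rng(0).uniform(0, 31, size=20))
resampled = spline(x_nonuniform)
```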

An Analysis of Privacy and Accuracy for Privacy-Preserving Techniques by Matrix-based Randomization (행렬 기반 랜덤화를 적용한 프라이버시 보호 기술의 안전성 및 정확성 분석)

  • Kang, Ju-Sung;An, A-Ron;Hong, Do-Won
    • Journal of the Korea Institute of Information Security & Cryptology
    • /
    • v.18 no.4
    • /
    • pp.53-68
    • /
    • 2008
  • We study practical privacy-preserving techniques based on a matrix-based randomization approach. We examine the relationship between the two parameters associated with the measure of privacy breach and the condition number of the matrix in order to obtain the optimal transition matrix. We propose a simple formula for efficiently calculating the inverse of the transition matrix, which is needed in the reconstruction step of the random substitution algorithm, and we derive useful connections between the standard error and other parameters by obtaining condition numbers under different matrix norms together with the expectation and variance of the transformed data. Moreover, we give experimental results supporting these theoretical expressions by implementing the random substitution algorithm.
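
The reconstruction step described here multiplies the distribution of randomized responses by the inverse of the transition matrix to estimate the original distribution. A minimal sketch of that idea for categorical data is below; the 3-category "keep with probability gamma, otherwise substitute uniformly" matrix and the sample size are illustrative assumptions, and the paper's efficient inversion formula is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)
k = 3                                    # number of categories (assumed)
gamma = 0.7                              # probability of keeping the true value (assumed)
# Transition matrix M[i, j] = P(randomized = i | true = j)
M = np.full((k, k), (1 - gamma) / (k - 1))
np.fill_diagonal(M, gamma)

true = rng.integers(0, k, size=10_000)                               # original categorical data
randomized = np.array([rng.choice(k, p=M[:, v]) for v in true])      # random substitution

# Reconstruction: inverse transition matrix applied to the observed distribution
obs_dist = np.bincount(randomized, minlength=k) / len(randomized)
est_dist = np.linalg.inv(M) @ obs_dist
print(np.round(est_dist, 3))             # estimate of the original category proportions
print(np.linalg.cond(M))                 # condition number governs error amplification
```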

Quadrilateral Irregular Network for Mesh-Based Interpolation

  • Tae Beom Kim;Chihyung Lee
    • The Journal of Engineering Geology
    • /
    • v.33 no.3
    • /
    • pp.439-459
    • /
    • 2023
  • Numerical analysis has been adopted in nearly all modern scientific and engineering fields due to the rapid and ongoing evolution of computational technology, with the number of grid or mesh points in a given data field also increasing. Some values must be extracted from large data fields to evaluate and supplement numerical analysis results and observational data, thereby highlighting the need for a fast and effective interpolation approach. The quadrilateral irregular network (QIN) proposed in this study is a fast and reliable interpolation method that is capable of sufficiently satisfying these demands. A comparative sensitivity analysis is first performed using known test functions to assess the accuracy and computational requirements of QIN relative to conventional interpolation methods. These same interpolation methods are then employed to produce simple numerical model results for a real-world comparison. Unlike conventional interpolation methods, QIN can obtain reliable results with a guaranteed degree of accuracy since there is no need to determine the optimal parameter values. Furthermore, QIN is a computationally efficient method compared with conventional interpolation methods that require the entire data space to be evaluated during interpolation, even if only a subset of the data space requires interpolation.

A Statistical Image Segmentation Method in the Hierarchical Image Structure (계층적 영상구조에서 통계적 방법에 의한 영상분할)

  • 최성진
    • Journal of Broadcast Engineering
    • /
    • v.1 no.2
    • /
    • pp.165-175
    • /
    • 1996
  • In this paper, an image segmentation method based on a hierarchical pyramid of reduced-resolution versions of the image is presented to solve the problems of conventional methods. The method covers object detection and delineation by a statistical approach. For object detection, the IFSVR (inverse father-son variance ratio) and FSVR (father-son variance ratio) methods are proposed to solve the clustering validity problem that occurs in the hierarchical pyramid image structure; an optimal object pixel is detected at some level by these methods. For object delineation, an iterative top-down traversal algorithm is proposed to propagate the optimal object pixel to levels of higher resolution. Computer simulations investigate the results of the proposed statistical and object-traversal methods on binary and real images. The simulation results suggest that the proposed hierarchical pyramid segmentation methods have useful properties and deserve consideration as a possible alternative to existing segmentation methods. The computation required by the proposed method is O(log n) for an n×n input image.
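
To make the father-son structure concrete, the sketch below builds a mean pyramid by 2×2 averaging and computes, for each father pixel, the variance of its four sons; how the paper combines these quantities into the FSVR/IFSVR ratios and the detection rule is not given in the abstract, so that part is only an assumed illustration.

```python
import numpy as np

def build_pyramid(img, levels):
    """Mean pyramid: each father pixel is the average of its 2x2 sons."""
    pyr = [img.astype(float)]
    for _ in range(levels):
        a = pyr[-1]
        a = a[: a.shape[0] // 2 * 2, : a.shape[1] // 2 * 2]   # trim to even size
        fathers = (a[0::2, 0::2] + a[0::2, 1::2] + a[1::2, 0::2] + a[1::2, 1::2]) / 4.0
        pyr.append(fathers)
    return pyr

def son_variance(level_img):
    """Variance of each father pixel's four sons (one ingredient of an FSVR-style ratio)."""
    a = level_img[: level_img.shape[0] // 2 * 2, : level_img.shape[1] // 2 * 2]
    sons = np.stack([a[0::2, 0::2], a[0::2, 1::2], a[1::2, 0::2], a[1::2, 1::2]])
    return sons.var(axis=0)

img = np.random.default_rng(0).random((64, 64))
pyr = build_pyramid(img, levels=3)        # roughly log2(n) levels for an n x n image
print([p.shape for p in pyr], son_variance(pyr[0]).shape)
```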
