• Title/Summary/Keyword: Global Minimum

Multiple-Hypothesis RAIM Algorithm with an RRAIM Concept (RRAIM 기법을 활용한 다중 가설 사용자 무결성 감시 알고리듬)

  • Yun, Ho; Kee, Changdon
    • Journal of Advanced Navigation Technology / v.16 no.4 / pp.593-601 / 2012
  • This paper develops and analyzes a new multiple-hypothesis Receiver Autonomous Integrity Monitoring (RAIM) algorithm as a candidate for a future standard architecture. The proposed algorithm can handle simultaneous multiple failures as well as a single failure. It uses measurement residuals and satellite observation matrices of several consecutive epochs for Failure Detection and Exclusion (FDE). The proposed algorithm reduces the Minimum Detectable Bias (MDB) via the Relative RAIM (RRAIM) scheme. Simulation results show that the proposed algorithm can detect and filter out multiple failures on the order of tens of meters.
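As a point of reference for the residual-based detection step this abstract mentions, the following is a minimal sketch of the classical snapshot RAIM fault-detection test (a least-squares residual chi-square test). It is not the paper's multiple-hypothesis or RRAIM algorithm; the noise level sigma and false-alarm probability p_fa are illustrative assumptions.

```python
# Minimal sketch of a classical residual-based RAIM fault-detection test
# (the snapshot test that multiple-hypothesis / RRAIM schemes build on).
# The noise model and false-alarm probability are illustrative assumptions,
# not parameters taken from the paper.
import numpy as np
from scipy.stats import chi2

def raim_fault_detection(G, y, sigma=1.0, p_fa=1e-5):
    """Least-squares residual test for a single epoch.

    G     : (m, 4) geometry/observation matrix (unit vectors + clock column)
    y     : (m,)   pseudorange residuals w.r.t. the linearization point [m]
    sigma : assumed pseudorange noise standard deviation [m]
    p_fa  : allowed false-alarm probability
    """
    m, n = G.shape
    # Least-squares position/clock estimate and post-fit residuals
    x_hat, *_ = np.linalg.lstsq(G, y, rcond=None)
    r = y - G @ x_hat
    # Test statistic: sum of squared residuals, chi-square with m-n dof
    t = float(r @ r) / sigma**2
    threshold = chi2.ppf(1.0 - p_fa, df=m - n)
    return t > threshold, t, threshold
```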

Automated Segmentation of the Lateral Ventricle Based on Graph Cuts Algorithm and Morphological Operations

  • Park, Seongbeom; Yoon, Uicheul
    • Journal of Biomedical Engineering Research / v.38 no.2 / pp.82-88 / 2017
  • Enlargement of the lateral ventricles has been identified as a surrogate marker of neurological disorders. Quantitative measurement of the lateral ventricles from MRI would enable earlier and more accurate clinical diagnosis in monitoring disease progression. Although objective quantification requires an automated or semi-automated segmentation method, it is difficult to delineate the lateral ventricles due to insufficient contrast and brightness in structural imaging. In this study, we proposed a fully automated lateral ventricle segmentation method based on a graph cuts algorithm combined with atlas-based segmentation and connected component labeling. Initially, seeds for graph cuts were defined by atlas-based segmentation (ATS). They were then adjusted using partial volume images in order to provide accurate a priori information for graph cuts. The graph cuts algorithm finds a global minimum of the energy function on the graph using a minimum cut/maximum flow algorithm. In addition, connected component labeling was used to remove false ventricle regions. The proposed method was validated against well-known tools using the dice similarity index, recall, and precision values. The proposed method showed a significantly higher dice similarity index (0.860 ± 0.036, p < 0.001) and recall (0.833 ± 0.037, p < 0.001) compared with the other tools. Therefore, the proposed method yielded a robust and reliable segmentation result.
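For readers unfamiliar with the min-cut formulation mentioned above, here is a minimal sketch of binary segmentation posed as a minimum cut/maximum flow problem on a tiny 1-D "image". The seed statistics (fg_mean, bg_mean) and the smoothness weight are invented for illustration; the paper's atlas-based seeding and 3-D MRI graph are not reproduced.

```python
# Minimal sketch of binary segmentation as a min-cut / max-flow problem,
# the core step a graph-cuts segmentation stage relies on. The toy 1-D
# "image", seed statistics and weights are illustrative assumptions.
import networkx as nx

intensities = [0.1, 0.2, 0.15, 0.8, 0.9, 0.85]   # toy 1-D image
fg_mean, bg_mean = 0.9, 0.1                       # assumed seed statistics
smoothness = 0.5                                  # pairwise penalty

G = nx.DiGraph()
for i, v in enumerate(intensities):
    # Unary terms: cost paid when pixel i is labeled background / foreground
    G.add_edge("src", i, capacity=(v - bg_mean) ** 2)   # cut -> background
    G.add_edge(i, "sink", capacity=(v - fg_mean) ** 2)  # cut -> foreground
for i in range(len(intensities) - 1):
    # Pairwise terms: penalize label changes between neighbours
    G.add_edge(i, i + 1, capacity=smoothness)
    G.add_edge(i + 1, i, capacity=smoothness)

cut_value, (source_side, sink_side) = nx.minimum_cut(G, "src", "sink")
foreground = sorted(p for p in source_side if p != "src")
print("foreground pixels:", foreground, "energy:", round(cut_value, 3))
```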

An analysis of using trend and relationship among DRGs, Nursing Diagnoses and Nursing Interventions (DRG, 간호진단, 간호중재의 활용경향 및 관계분석;미국의 일 지역을 중심으로)

  • Jung, Myun-Sook
    • Journal of Korean Academy of Nursing Administration / v.8 no.2 / pp.207-219 / 2002
  • The purposes of this research were to: a) define the changing trends of DRGs in comparison to the national data, b) define the changing trends of nursing diagnoses and nursing interventions for the 5 most frequently occurring Diagnostic Related Groups (DRGs) across 3 years, and c) define the relationships between nursing diagnoses and nursing interventions for the 5 most frequently occurring DRGs across the 3 years. This study was a secondary data analysis of medical and nursing data based on the United States Nursing Minimum Data Set and the Uniform Hospital Discharge Data Set retrieved from a Midwestern USA medical center. The results showed interesting comparisons with national statistics as well as practice-relevant trends within the nursing data. Additionally, the results showed the possibility that nursing data can be extracted from the medical data, so they can be used for nursing productivity and cost analyses. In conclusion, this study supports the power of minimum data sets and nursing classifications to begin to describe, from a more global perspective, the inter-relationships and trends of nursing data within the medical diagnosis context.

Review of ISO Standards on Human-System Interaction Published during 2008-2013

  • Lee, Dhong Ha
    • Journal of the Ergonomics Society of Korea / v.33 no.5 / pp.433-452 / 2014
  • Objective: The aim of this study is to give ergonomists a brief summary of the recently published ISO standards on human-system interaction and tips for applying the standards. Background: Standard developers work hard to write each standard concisely, but most standards are still bulky in volume. Readers find it difficult to catch the key points from the voluminous contents of the standards and the interconnections among them. Method: Focusing on newly developed display/control technology, this study reviewed the 14 ISO standards on human-system interaction published during 2008-2013 and summarized their key points. Results: Schematic diagrams and tables concisely illustrate the processes, procedures, dimensions, and best practices recommended by the standards concerning the conception, design, and usability testing of consumer products. Conclusion: The standards provide the minimum level of requirements for the design and evaluation of physical input devices, electronic displays, and control interfaces based on the current state of technology. However, the minimum requirements specified in the standards are nowadays becoming mandatory ergonomic requirements in the global trade world. Application: Ergonomists can take a quick and broad view of international standardization activities on newly developed display/control technology from this summary study.

An Improved Mean-Variance Optimization for Nonconvex Economic Dispatch Problems

  • Kim, Min Jeong; Song, Hyoung-Yong; Park, Jong-Bae; Roh, Jae-Hyung; Lee, Sang Un; Son, Sung-Yong
    • Journal of Electrical Engineering and Technology / v.8 no.1 / pp.80-89 / 2013
  • This paper presents an efficient approach for solving economic dispatch (ED) problems with nonconvex cost functions using a 'Mean-Variance Optimization (MVO)' algorithm with the Kuhn-Tucker condition and a swap process. The aim of the ED problem, one of the most important activities in power system operation and planning, is to determine the optimal combination of power outputs of all generating units so as to meet the required load demand at minimum operating cost while satisfying system equality and inequality constraints. This paper applies the Kuhn-Tucker condition and a swap process to the MVO algorithm to improve its global minimum searching capability. The proposed MVO is applied to three different nonconvex ED problems with valve-point effects, prohibited operating zones, transmission network losses, and multi-fuels with valve-point effects. Additionally, it is applied to the large-scale power system of Korea. The results are compared with those of state-of-the-art methods as well.
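As context for the Kuhn-Tucker step mentioned in the abstract, the sketch below applies the equal-incremental-cost (Kuhn-Tucker) condition to a convex quadratic ED toy problem via bisection on the incremental cost lambda. It is not the paper's MVO algorithm; the cost coefficients, limits, and demand are made-up values, and valve-point effects, prohibited zones, and losses are ignored.

```python
# Minimal sketch of the Kuhn-Tucker (equal incremental cost) condition for
# a convex quadratic economic dispatch sub-problem. All numbers are toy
# assumptions; nonconvex effects treated in the paper are ignored.
import numpy as np

# Unit cost: C_i(P) = a_i*P^2 + b_i*P + c_i, with output limits [pmin, pmax]
a = np.array([0.008, 0.010, 0.012])
b = np.array([7.0, 8.0, 9.0])
c = np.array([200.0, 180.0, 150.0])
pmin = np.array([50.0, 40.0, 30.0])
pmax = np.array([300.0, 250.0, 200.0])
demand = 450.0  # MW

def dispatch(lam):
    """Outputs satisfying dC_i/dP_i = lambda, clipped to the unit limits."""
    return np.clip((lam - b) / (2.0 * a), pmin, pmax)

# Bisection on lambda until total generation matches the demand (power balance)
lo, hi = 0.0, 50.0
for _ in range(100):
    lam = 0.5 * (lo + hi)
    if dispatch(lam).sum() < demand:
        lo = lam
    else:
        hi = lam

P = dispatch(lam)
cost = float(np.sum(a * P**2 + b * P + c))
print("dispatch [MW]:", P.round(1), "total:", round(P.sum(), 1), "cost:", round(cost, 1))
```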

Effects of CSR Activities on Business Performance of Logistics Firms

  • JEON, Ho-Jin; KIM, Young-Min; YOUN, Myoung-Kil
    • Journal of Distribution Science / v.17 no.12 / pp.23-32 / 2019
  • Purpose - As consumer awareness grows, CSR becomes even more important for long-term growth. In response to this trend, the purpose of this study is to analyze the effect of CSR activities on the business performance of logistics companies. Research design, data, and methodology - Between CSR activities and growth, there was a generally positive (+) relationship between activities such as donation and volunteerism and the growth of the enterprise. In terms of the relationship between environmental factors and growth, negative results were found. In the case of profitability, improved welfare for workers had a positive impact on corporate profitability. Results and Conclusions - With respect to stability, firms with a high proportion of equity capital were not found to be more active in CSR activities. Significant negative results were found between the minimum factors for entry, transportation, and noise generation factors, which are representative friction factors in the community, and the ratio of liabilities.

ELIMINATION OF BIAS IN THE IIR LMS ALGORITHM (IIR LMS 알고리즘에서의 바이어스 제거)

  • Nam, Seung-Hyon; Kim, Yong-Hoh
    • The Journal of Natural Sciences / v.8 no.1 / pp.5-15 / 1995
  • The equation error formulation in adaptive IIR filtering provides convergence to a global minimum rather than a local minimum, with a large stability margin. However, the equation error formulation suffers from bias in the coefficient estimates. In this paper, a new algorithm, which does not require prespecification of the noise variance, is proposed for the equation error formulation. This algorithm is based on equation error smoothing and provides an unbiased parameter estimate in the presence of white noise. Through simulations, it is demonstrated that the algorithm eliminates the bias in the parameter estimate while retaining the good properties of the equation error formulation, such as fast convergence speed and the large stability margin.
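To make the bias issue concrete, the sketch below implements the baseline equation-error LMS update for adaptive IIR system identification, in which the regressor contains the noisy desired signal and the feedback coefficient estimate therefore comes out biased. The plant, noise level, and step size are illustrative assumptions; the paper's smoothing-based bias elimination is not implemented here.

```python
# Minimal sketch of the baseline equation-error LMS update for adaptive IIR
# system identification (the formulation whose bias the paper removes).
# Plant, noise level and step size are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
N = 20000
x = rng.standard_normal(N)            # white input
# Plant: y[n] = 0.6*y[n-1] + x[n] + 0.4*x[n-1]
y = np.zeros(N)
for n in range(1, N):
    y[n] = 0.6 * y[n - 1] + x[n] + 0.4 * x[n - 1]
d = y + 0.5 * rng.standard_normal(N)  # noisy desired (measured) signal

mu = 1e-3
w = np.zeros(3)                       # parameters [a1, b0, b1]
for n in range(1, N):
    phi = np.array([d[n - 1], x[n], x[n - 1]])  # regressor uses the noisy output
    e = d[n] - w @ phi                          # equation error
    w += mu * e * phi                           # LMS update

# With noise on d, the a1 estimate is biased away from the true 0.6
print("estimated [a1, b0, b1]:", w.round(3))
```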

On-line Vector Quantizer Design Using Simulated Annealing Method (Simulated Annealing 방법을 이용한 온라인 벡터 양자화기 설계)

  • Song, Geun-Bae; Lee, Haeng-Se
    • The KIPS Transactions:PartB / v.8B no.4 / pp.343-350 / 2001
  • Vector quantizer design requires a learning algorithm that minimizes a multidimensional objective function. The generalized Lloyd algorithm (GLA) is today the most widely used algorithm for vector quantizer design. GLA generates a codebook in batch mode and is a kind of descent algorithm that monotonically decreases the objective function. The Kohonen learning algorithm (KLA), on the other hand, is an on-line vector quantizer design algorithm in which the codebook is updated while the training vectors are being presented. KLA was originally proposed by Kohonen for neural network training and, like GLA, can also be regarded as a descent algorithm. Therefore, although both algorithms are easy to use and operate stably, they suffer from convergence to a local minimum. We discuss the application of the simulated annealing (SA) method to this problem. SA is, to date, the only method for which convergence to a global minimum, without being trapped in local minima, is (statistically) guaranteed. We first review previous work applying SA to GLA. Next, by applying the SA method to on-line vector quantizer design, we propose a new on-line learning algorithm based on SA, which we call the OLVQ-SA algorithm. Vector quantization experiments on a Gauss-Markov source and speech data show that the proposed method consistently produces better codebooks than KLA.
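To illustrate the Metropolis acceptance rule that separates simulated annealing from the pure descent of GLA/KLA, here is a minimal sketch of SA over a small codebook on toy 2-D data. The perturbation size, cooling schedule, and data are assumptions; this is not the paper's OLVQ-SA algorithm itself.

```python
# Conceptual sketch of simulated annealing over a codebook: worse moves are
# sometimes accepted, which is what lets SA escape local minima that trap
# descent methods. Toy data and schedule are assumptions.
import numpy as np

rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(-2, 0.3, (200, 2)),
                       rng.normal(+2, 0.3, (200, 2))])   # toy 2-D source

def distortion(codebook):
    # Mean squared error of nearest-codeword quantization
    d2 = ((data[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
    return d2.min(axis=1).mean()

codebook = rng.normal(0, 1, (4, 2))        # initial codewords
T = 1.0                                    # initial temperature
best, best_d = codebook.copy(), distortion(codebook)

for step in range(5000):
    cand = codebook + rng.normal(0, 0.1, codebook.shape)  # random perturbation
    dE = distortion(cand) - distortion(codebook)
    # Accept better moves always, worse moves with probability exp(-dE/T)
    if dE < 0 or rng.random() < np.exp(-dE / T):
        codebook = cand
    if distortion(codebook) < best_d:
        best, best_d = codebook.copy(), distortion(codebook)
    T *= 0.999                                            # geometric cooling

print("final distortion:", round(best_d, 4))
```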

Investigation of Thermal Behavior Characteristic in Chemical Mechanical Polishing Performance (CMP 결과에 영향을 미치는 열적거동 특성에 관한 연구)

  • Jeong, Young-Seok; Kim, Hyoung-Jae; Choi, Jae-Young; Kim, Goo-Youn; Jeong, Hae-Do
    • Proceedings of the Korean Institute of Electrical and Electronic Material Engineers Conference / 2004.07b / pp.1283-1287 / 2004
  • Design rules are becoming stricter with the requirements for operation speed and the development of the IC industry. For this reason, the required minimum line-width has narrowed into the sub-micron region. As the minimum line-width narrows, local and global planarization become more important. CMP (Chemical Mechanical Polishing), one of the planarization technologies, is a process that polishes through the combination of chemical reaction and the relative velocity between pad and wafer, without surface defects. CMP involves a complex interaction among many factors, and how CMP interacts with these factors is not yet evident; studies on this topic are still being carried out. Therefore, an examination of the CMP phenomena and an accurate understanding of the contributing factors are urgently needed. In this paper, we consider the relations between the effects of temperature, which influences many of the factors affecting polishing results, and the characteristics of CMP in order to understand and estimate the influence of temperature. Through the observed relationship between temperature and polishing results, we expect to build a better fundamental understanding of the complex CMP phenomena.

DL-RRT* algorithm for least dose path Re-planning in dynamic radioactive environments

  • Chao, Nan; Liu, Yong-kuo; Xia, Hong; Peng, Min-jun; Ayodeji, Abiodun
    • Nuclear Engineering and Technology / v.51 no.3 / pp.825-836 / 2019
  • One of the most challenging safety precautions for workers in dynamic, radioactive environments is avoiding radiation sources and sustaining low exposure. This paper presents a sampling-based algorithm, DL-RRT*, for minimum dose walk-path re-planning in radioactive environments, expedient for occupational workers in nuclear facilities to avoid unnecessary radiation exposure. The method combines the principles of rapidly-exploring random tree star (RRT*) and D* Lite, and uses the expansion strength of the grid search strategy from D* Lite to quickly find a high-quality initial path that accelerates the convergence rate of RRT*. The algorithm inherits probabilistic completeness and asymptotic optimality from RRT* to continually refine the existing paths by sampling the search graph obtained from the grid search process. It can not only be applied to continuous cost spaces, but also makes full use of the previous planning information to avoid global re-planning, so as to improve the efficiency of path planning in frequently changing environments. The effectiveness and superiority of the proposed method were verified by simulations of radiation fields with varying obstacles and radioactive environments, and the results were compared with the output of the RRT* algorithm.
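As a rough illustration of the grid-search stage described above, the sketch below runs Dijkstra over a toy dose-rate map with accumulated dose as the path cost, the kind of initial minimum-dose path that could seed an RRT*-style refinement. The map, grid size, and cost model are assumptions; D* Lite re-planning and RRT* sampling are not shown.

```python
# Minimal sketch of a minimum-dose grid search (Dijkstra over accumulated
# dose). The dose-rate map and step cost are toy assumptions.
import heapq
import numpy as np

rng = np.random.default_rng(2)
dose_rate = rng.uniform(0.1, 1.0, (20, 20))   # toy dose-rate map [mSv/h]
dose_rate[8:12, 5:15] = 20.0                  # a strong source region to avoid
start, goal = (0, 0), (19, 19)

def neighbors(cell):
    r, c = cell
    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nr, nc = r + dr, c + dc
        if 0 <= nr < dose_rate.shape[0] and 0 <= nc < dose_rate.shape[1]:
            yield (nr, nc)

# Dijkstra: step cost = average dose rate of the two cells (unit traversal time)
dist = {start: 0.0}
prev = {}
heap = [(0.0, start)]
while heap:
    d, cell = heapq.heappop(heap)
    if cell == goal:
        break
    if d > dist.get(cell, float("inf")):
        continue
    for nxt in neighbors(cell):
        nd = d + 0.5 * (dose_rate[cell] + dose_rate[nxt])
        if nd < dist.get(nxt, float("inf")):
            dist[nxt], prev[nxt] = nd, cell
            heapq.heappush(heap, (nd, nxt))

# Reconstruct the minimum-dose path from goal back to start
path = [goal]
while path[-1] != start:
    path.append(prev[path[-1]])
print("accumulated dose:", round(dist[goal], 2), "path length:", len(path))
```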