• Title/Summary/Keyword: Level Set Methods

Particle Swarm Optimization for Redundancy Allocation of Multi-level System considering Alternative Units (대안 부품을 고려한 다계층 시스템의 중복 할당을 위한 입자 군집 최적화)

  • Chung, Il Han
    • Journal of Korean Society for Quality Management
    • /
    • v.47 no.4
    • /
    • pp.701-711
    • /
    • 2019
  • Purpose: The problem of optimizing redundancy allocation in multi-level systems is considered when each item in the system has alternative items with the same function. Redundancy in the multi-level system is allocated to maximize system reliability under path-set and cost-limitation constraints. Methods: Based on the cost-limitation and path-set constraints, a mathematical model is established to maximize system reliability. Particle swarm optimization is employed for redundancy allocation and verified by numerical experiments. Results: Comparing particle swarm optimization with a memetic algorithm on 3- and 4-level systems, particle swarm optimization showed better solution quality and shorter search time. In particular, the variation of its results was much smaller than that of the memetic algorithm. Conclusion: The proposed particle swarm optimization considerably shortens the time needed to find a feasible solution in MRAP with path-set constraints, and is expected to reduce search time and yield better solutions for various MRAP-related problems.
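
The swarm update described in the abstract can be sketched minimally as follows. This is an illustrative toy, not the paper's exact MRAP formulation: the reliability function, component costs, budget, and penalty handling are all assumptions for demonstration.

```python
import random

# Minimal particle swarm optimization sketch (hypothetical toy problem):
# maximize a series-system reliability R = prod(1 - (1 - r_i)^n_i)
# under a cost budget, handling the constraint with a simple penalty.

COMPONENT_R = [0.8, 0.7, 0.9]   # assumed component reliabilities
COMPONENT_C = [4.0, 5.0, 3.0]   # assumed unit costs
BUDGET = 40.0

def fitness(n):
    reliability, cost = 1.0, 0.0
    for r, c, k in zip(COMPONENT_R, COMPONENT_C, n):
        reliability *= 1.0 - (1.0 - r) ** k
        cost += c * k
    # penalize infeasible allocations by the budget excess
    return reliability if cost <= BUDGET else reliability - (cost - BUDGET)

def pso(n_particles=20, iters=100, seed=0):
    rng = random.Random(seed)
    dim = len(COMPONENT_R)
    pos = [[rng.uniform(1, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]

    def score(x):  # round continuous positions to integer redundancy counts
        return fitness([max(1, round(v)) for v in x])

    gbest = max(pbest, key=score)[:]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (0.7 * vel[i][d]                       # inertia
                             + 1.5 * r1 * (pbest[i][d] - pos[i][d])  # cognitive
                             + 1.5 * r2 * (gbest[d] - pos[i][d]))    # social
                pos[i][d] = min(5.0, max(1.0, pos[i][d] + vel[i][d]))
            if score(pos[i]) > score(pbest[i]):
                pbest[i] = pos[i][:]
                if score(pbest[i]) > score(gbest):
                    gbest = pbest[i][:]
    return [max(1, round(v)) for v in gbest], score(gbest)

allocation, value = pso()
```

The real MRAP adds path-set constraints and multi-level structure; the penalty term here merely stands in for that constraint handling.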

Investigating Functional Level in Patients with Stroke using ICF Concept (ICF core-set를 이용한 뇌졸중 환자의 기능수행 분석)

  • Song, Jumin;Lee, Haejung
    • The Journal of Korean Physical Therapy
    • /
    • v.26 no.5
    • /
    • pp.351-357
    • /
    • 2014
  • Purpose: The purpose of this study was to investigate the level of functioning in patients with stroke using the Modified Barthel Index (MBI), the World Health Organization Disability Assessment Schedule 2.0 (WHODAS 2.0), and the ICF core-set for stroke. Methods: Sixty-four patients with stroke were recruited from nine medical institutes. The ICF core-set for stroke, WHODAS 2.0, and MBI were used to collect subjects' functional levels. The ICF core-set served as a standard frame for observing multiple dimensions of functioning, namely physiological body function, activity and participation (AP) in daily life, and current environmental factors (EF). WHODAS 2.0 and MBI were used to obtain specific functioning levels. The linkage of each item of WHODAS 2.0 and MBI to the ICF core-set for stroke was examined, and Pearson correlation coefficients were used to analyze their relationships. Results: Participants' functioning level was moderate according to MBI and WHODAS 2.0 (73.48 ± 22.27 and 35.55 ± 12.53, respectively). Strong relationships were observed between the ICF core-set and WHODAS 2.0 and between the ICF core-set and MBI. Each item of the disability scales could be linked to the ICF in the AP domain. However, no correlation was found between MBI and the ICF in the EF domain, owing to the absence of related items. Conclusion: MBI was found to link mainly to the ICF in the AP domain and to have limited linkage to EF. It is therefore suggested that the ICF concept frame be used as a multi-dimensional approach to patients with stroke.

Numerical Analysis of Three-dimensional Sloshing Flow Using Least-square and Level-set Method (최소자승법과 Level-set 방법을 적용한 3차원 슬로싱 유동의 수치해석)

  • Jeon, Byoung Jin;Choi, Hyoung Gwon
    • Transactions of the Korean Society of Mechanical Engineers B
    • /
    • v.41 no.11
    • /
    • pp.759-765
    • /
    • 2017
  • In this study, a three-dimensional least-square, level-set-based two-phase flow code was developed for the simulation of three-dimensional sloshing problems using finite element discretization. The code was validated by solving some benchmark problems, for which it provided improved results over existing methods while using a coarser mesh. The numerical experiments conducted in this study showed that the proposed method is both robust and accurate for simulating three-dimensional sloshing. Even with a substantially coarse grid, the time history of dynamic pressure at a selected position corresponded well with existing experimental data. The pressure history on a finer grid was similar to that on the coarse grid, although the fine grid produced higher peak pressures. Owing to the features of FEM, the present method can be extended to the analysis of sloshing in complex geometrical configurations using unstructured meshes.
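
The core idea shared by the level-set papers in this list can be shown in one dimension. This is a minimal illustrative sketch, not the paper's 3-D least-square FEM scheme: a signed-distance function whose zero crossing marks the interface is advected with first-order upwinding, and the interface is recovered from the sign change.

```python
# Minimal 1-D level-set sketch (illustrative assumptions throughout):
# phi(x) is a signed-distance function; its zero crossing is the interface.

N = 200
L = 1.0
dx = L / N
u = 0.5             # constant advection velocity (assumed)
dt = 0.5 * dx / u   # CFL-limited time step
x = [(i + 0.5) * dx for i in range(N)]

phi = [xi - 0.25 for xi in x]   # interface initially at x = 0.25

t = 0.0
while t < 0.5:
    # first-order upwind for u > 0:
    # phi_i <- phi_i - u * dt * (phi_i - phi_{i-1}) / dx
    new = phi[:]
    for i in range(1, N):
        new[i] = phi[i] - u * dt * (phi[i] - phi[i - 1]) / dx
    phi = new
    t += dt

# locate the zero crossing; the interface has been carried to about
# x = 0.25 + u * t = 0.5
interface = next(x[i] for i in range(N) if phi[i] >= 0.0)
```

Real two-phase solvers add reinitialization (or a conservative formulation, as in the electrowetting paper below) to keep phi a signed distance and to limit mass loss.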

ANALYSIS OF ELECTROWETTING DYNAMICS WITH LEVEL SET METHOD AND ASSESSMENT OF PROPERTY INTERPOLATION METHODS (레벨셋 기법을 이용한 전기습윤 현상의 동적 거동에 대한 해석 및 물성 보간 방법에 대한 고찰)

  • Park, J.K.;Kang, K.H.
    • Korean Society of Computational Fluids Engineering: Conference Proceedings
    • /
    • 2010.05a
    • /
    • pp.551-555
    • /
    • 2010
  • Electrowetting is a versatile tool for handling tiny droplets and forms a backbone of digital microfluidics. Numerical analysis is necessary to fully understand the dynamics of electrowetting, especially in designing electrowetting-based devices such as liquid lenses and reflective displays. We developed a numerical method for analyzing general contact-line problems, incorporating dynamic contact angle models. The method is based on the conservative level set method, which captures the interface of two fluids without loss of mass. We applied it to the spreading of a sessile droplet under step input voltages and to droplet oscillation under alternating input voltages in electrowetting, and compared the results with experimental data. Contact-line friction is shown to significantly affect the contact-line motion and the oscillation amplitude, and the pinning of the contact line was well represented by including the hysteresis effect in the contact angle models. Meanwhile, in the level set method, material properties are made to change smoothly across an interface between two materials with different properties by introducing an interpolation or smoothing scheme. So far, the weighted arithmetic mean (WAM) method has been adopted almost exclusively in the level set method, without a complete assessment of its validity. We show that the weighted harmonic mean (WHM) method, applied to properties such as viscosity, thermal conductivity, electrical conductivity, and permittivity, can be an alternative; that is, the WHM method gives more accurate results than the WAM method in certain circumstances. The interpolation scheme should be selected considering various characteristics, including the type of property, the property ratio of the two fluids, the geometry of the interface, and so on.
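
The two interpolation schemes compared in this abstract can be sketched directly. This is an illustrative example, not the paper's implementation: the smoothed Heaviside form and the sample property values (a hypothetical 1:100 viscosity ratio) are assumptions.

```python
import math

# Weighted arithmetic mean (WAM) vs. weighted harmonic mean (WHM)
# interpolation of a material property across a smeared level-set interface.

def smoothed_heaviside(phi, eps):
    """Smoothed Heaviside: 0 in fluid 1, 1 in fluid 2, smooth over |phi| < eps."""
    if phi < -eps:
        return 0.0
    if phi > eps:
        return 1.0
    return 0.5 * (1.0 + phi / eps + math.sin(math.pi * phi / eps) / math.pi)

def wam(p1, p2, h):
    return (1.0 - h) * p1 + h * p2

def whm(p1, p2, h):
    return 1.0 / ((1.0 - h) / p1 + h / p2)

# Hypothetical viscosity ratio 1:100, sampled at the interface midpoint (phi = 0).
mu1, mu2 = 1.0, 100.0
h = smoothed_heaviside(0.0, eps=1.5)
mu_wam = wam(mu1, mu2, h)   # dominated by the larger property value
mu_whm = whm(mu1, mu2, h)   # dominated by the smaller property value
```

The large gap between the two interpolated values at the same point illustrates why the abstract argues the scheme choice matters when the property ratio of the two fluids is large.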

SURFACE RECONSTRUCTION FROM SCATTERED POINT DATA ON OCTREE

  • Park, Chang-Soo;Min, Cho-Hon;Kang, Myung-Joo
    • Journal of the Korean Society for Industrial and Applied Mathematics
    • /
    • v.16 no.1
    • /
    • pp.31-49
    • /
    • 2012
  • In this paper, we propose a very efficient method that reconstructs a high-resolution surface from a set of unorganized points. Our method is based on the level set method on an adaptive octree. We start from the surface reconstruction model proposed in [20], which introduced a very fast and efficient approach that differs from previous level-set-based methods. Most existing methods [21, 22] evolve an initial surface toward the point cloud over time, but [20] instead formulated surface reconstruction as an elliptic problem in a narrow band containing the point cloud; this yields a very fast method because the time step no longer needs to be limited by the finite speed of propagation. However, that model was implemented only on a uniform grid, and therefore requires a large amount of memory. Its algorithm solves a large linear system whose size equals the number of grid points in the narrow band, and it is not easy to make the band narrow enough, since the band width depends on the distribution of the point data. Consequently, as long as the method is implemented on a uniform grid, generating a high-resolution surface is almost impossible because the memory requirement grows geometrically. We resolve this by adapting an octree data structure [12, 11] to the problem and by introducing a new redistancing algorithm that differs from the existing one [19].

Determination of safe levels and toxic levels for feed hazardous materials in broiler chickens: a review

  • Jong Hyuk Kim
    • Journal of Animal Science and Technology
    • /
    • v.65 no.3
    • /
    • pp.490-510
    • /
    • 2023
  • Feed safety is needed to produce and provide safe animal feeds for consumers, animals, and the environment. Although feed safety regulations have been set in each country, clear regulations for each livestock species are lacking. Feed safety regulations mainly cover heavy metals, mycotoxins, and pesticides, and each country sets different safe levels of hazardous materials in diets. These safe levels are mostly set for the mixed diets of general livestock; although the metabolism of toxic materials differs among animals, safe feed levels are not specified for individual species. Therefore, standardized animal testing methods and toxicity studies for each species are needed to determine correct safe and toxic levels of hazardous materials in diets. Achieving this would make it possible to improve livestock productivity, health, and product safety by establishing appropriate feed safety regulations, and would provide an opportunity to secure consumer confidence in feed and livestock products. It is therefore necessary to establish a scientific feed safety evaluation system suitable for each country's environment. Because the chance of outbreaks of new hazardous materials is increasing, various toxicity methods have been used to determine toxic levels of hazardous materials for humans and animals, and appropriate testing methods should be developed and used to accurately identify toxic and safe levels in food and feed.

Design and Implementation of an Efficient Buffer Replacement Method for Real-time Multimedia Databases Environments (실시간 멀티미디어 데이터베이스 환경을 위한 효율적인 버퍼교체 기법 설계 및 구현)

  • 신재룡;피준일;유재수;조기형
    • Journal of Korea Multimedia Society
    • /
    • v.5 no.4
    • /
    • pp.372-385
    • /
    • 2002
  • In this paper, we propose an efficient buffer replacement method for real-time multimedia data. The proposed method uses multi-level priorities to reflect real-time characteristics. Each priority level is divided into a cold data set, containing data likely to be referenced for the first time, and a hot data set, containing data likely to be re-referenced. Victim selection proceeds sequentially from the cold set at the minimum priority level to the hot set at the maximum priority level, and victims are chosen only from levels with priority lower than or equal to that of the transaction requesting buffer allocation. In a cold set, our method first selects the medium with the maximum size in that level as the victim; in a hot set, it first selects the medium with the maximum inter-reference interval. Because many popular media are kept in the limited buffer space, the buffer hit ratio increases and more service requests can be handled, improving overall system performance. We compare the proposed method with the Priority-Hints method in terms of buffer hit ratio and transaction deadline miss ratio, and the performance evaluation shows that our method outperforms the existing methods.
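
The victim-selection order described above can be sketched as follows. All names and data fields are hypothetical, and the scan order (all eligible cold sets from lowest to highest priority, then the hot sets) is one reading of the abstract, not code from the paper.

```python
# Sketch of multi-level-priority victim selection (hypothetical structures):
# levels maps a priority level to its cold and hot sets; each buffered medium
# is a dict with a 'size' and an inter-reference interval 'ref_interval'.

def select_victim(levels, requester_priority):
    """Pick a replacement victim, or None if no eligible medium exists."""
    eligible = [p for p in sorted(levels) if p <= requester_priority]
    # Cold sets first, from the minimum eligible priority upward:
    # evict the largest medium, since it frees the most buffer space.
    for p in eligible:
        cold = levels[p].get('cold', [])
        if cold:
            return max(cold, key=lambda m: m['size'])
    # Then hot sets: evict the medium with the longest inter-reference
    # interval, i.e. the one least likely to be re-referenced soon.
    for p in eligible:
        hot = levels[p].get('hot', [])
        if hot:
            return max(hot, key=lambda m: m['ref_interval'])
    return None

# Hypothetical buffer state with two priority levels.
levels = {
    1: {'cold': [{'id': 'a', 'size': 10, 'ref_interval': 3}],
        'hot':  [{'id': 'b', 'size': 5,  'ref_interval': 9}]},
    2: {'cold': [{'id': 'c', 'size': 40, 'ref_interval': 1}],
        'hot':  []},
}
victim = select_victim(levels, requester_priority=2)  # lowest-level cold set wins
```

A transaction with priority 1 could never evict medium 'c', since level 2 is above its own priority.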

Stable Bottom Detection and Optimum Bottom Offset for Echo Integration of Demersal Fish (저서어자원량의 음향추정에 있어서 해저기준과 해저 오프셋의 최소화)

  • 황두진
    • Journal of the Korean Society of Fisheries and Ocean Technology
    • /
    • v.36 no.3
    • /
    • pp.195-201
    • /
    • 2000
  • This paper discusses methods for stable bottom detection and for minimizing the bottom offset, which make it possible to separate fish echoes from bottom echoes in the echo integration of demersal fish. In preprocessing the echo signal, the bottom must be detected stably despite fluctuations in echo level, and the bottom offset must be set to the minimum height at which near-bottom fish echoes are still included. Two bottom-detection methods, an echo-level threshold method and a maximum echo slope method, were compared and analyzed. The echo-level method works well if an ideal threshold level is given, but it sometimes misses the bottom because of echo fluctuation. The maximum echo slope method, in contrast, provides simple and stable bottom detection. In addition, the bottom offset must be set close to the bottom without including the bottom echo: the optimum bottom offset should be a few samples before the detected bottom echo, where the exact number relates the beginning of the pulse shape and the acoustic beam pattern to the bottom features.
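
The two detection approaches compared above can be sketched on a synthetic echo trace. The sample values and the offset of two samples are illustrative assumptions, not data from the paper.

```python
# Sketch of bottom detection on a synthetic single-ping echo trace:
# a fixed echo-level threshold vs. the sample with the maximum positive slope.

def detect_by_threshold(echo, threshold):
    """Return the first sample index whose level reaches the threshold."""
    for i, level in enumerate(echo):
        if level >= threshold:
            return i
    return None

def detect_by_max_slope(echo):
    """Return the sample index just after the steepest rise in echo level."""
    slopes = [echo[i + 1] - echo[i] for i in range(len(echo) - 1)]
    return slopes.index(max(slopes)) + 1

# Hypothetical trace: weak near-bottom fish echoes, then a sharp bottom
# return starting at sample 7.
echo = [1, 2, 3, 2, 1, 2, 3, 40, 45, 30]

bottom = detect_by_max_slope(echo)   # robust to slow level fluctuations
offset = 2                           # bottom offset: a few samples earlier
integration_stop = bottom - offset   # integrate fish echoes up to this sample
```

The threshold method finds the same bottom here only because a suitable threshold is known in advance; the slope method needs no such tuning, which is the stability argument made in the abstract.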

Interpretation of Quality Statistics Using Sampling Error (샘플링오차에 의한 품질통계 모형의 해석)

  • Choi, Sung-Woon
    • Journal of the Korea Safety Management & Science
    • /
    • v.10 no.2
    • /
    • pp.205-210
    • /
    • 2008
  • This research interprets the principles of sampling-error design for quality statistics models such as hypothesis tests, interval estimation, control charts, and acceptance sampling. It first discusses the design of the significance level according to the use of the hypothesis test, presenting the two interpretations of significance due to Neyman-Pearson and Fisher. Second, the study addresses the design of the confidence level for interval estimation via Bayesian confidence sets, frequentist confidence sets, and fiducial intervals. Third, it covers the design of type I and type II errors for control charts, considering both productivity and customer claims. Finally, the study treats the design of producer's risk with operating characteristic curves, screening, and switching rules for purchasing and subcontracting purposes.
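
The control-chart error design mentioned above has a standard worked example: the type I and type II errors of a Shewhart chart with 3-sigma limits. This is textbook material, not a result from the paper, and the 2-sigma shift size is an assumption.

```python
import math

# Type I / type II error for a Shewhart chart with +/-3 sigma control limits.

def phi(z):
    """Standard normal CDF computed from the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Type I error: probability an in-control point falls outside +/-3 sigma.
alpha = 2.0 * (1.0 - phi(3.0))        # about 0.0027

# Type II error: probability of NOT signaling on one point after the mean
# shifts by k standard deviations of the plotted statistic:
# beta = P(-3 - k < Z < 3 - k).
k = 2.0                                # assumed shift size
beta = phi(3.0 - k) - phi(-3.0 - k)    # about 0.84 for a 2-sigma shift

# Average run length while in control: expected points until a false alarm.
arl_in_control = 1.0 / alpha           # about 370
```

Widening the limits trades a smaller alpha (fewer false alarms, fewer needless process stops) for a larger beta (slower detection of real shifts), which is exactly the productivity-versus-customer-claim trade-off the abstract refers to.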

A Model for Measuring Standardization Level of Information and Communication Technology (정보통신 표준화 지수측정 모형 개발 연구)

  • 이승환;박명철;이상우;구경철
    • Korean Management Science Review
    • /
    • v.20 no.2
    • /
    • pp.95-111
    • /
    • 2003
  • The standards issue in the information and telecommunication industry is increasingly important with the rapid development of technology. This paper proposes an index model that can measure the degree of standardization in the Korean information and telecommunication field. We first classified the ICT sector into 14 sub-sectors. For each sub-sector, we then selected a set of important determinants for measuring the level of standardization and constructed a linear equation over this set, estimating the relative importance of each determinant with the AHP methodology. The proposed model found that the overall level of standardization in the Korean ICT industry is relatively low, and that 'IMT-2000 technology' and 'computer network technology' are the most highly standardized of the 14 sub-sectors. The validity of the model was also partially demonstrated using two different methods, a holistic and a historical approach.
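
The AHP weighting step used in this model can be sketched as follows. The pairwise-comparison values and determinant scores are hypothetical; the weight extraction via the principal eigenvector (here by power iteration) is the standard AHP procedure, not code from the paper.

```python
# Sketch of AHP determinant weighting: derive weights from a reciprocal
# pairwise-comparison matrix via its principal eigenvector.

def ahp_weights(matrix, iters=100):
    """Power iteration on the comparison matrix; returns normalized weights."""
    n = len(matrix)
    w = [1.0 / n] * n
    for _ in range(iters):
        w_new = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        total = sum(w_new)
        w = [v / total for v in w_new]
    return w

# Hypothetical comparisons among three determinants of standardization level:
# matrix[i][j] states how much more important determinant i is than j.
A = [
    [1.0,     3.0,     5.0],
    [1.0 / 3, 1.0,     2.0],
    [1.0 / 5, 1.0 / 2, 1.0],
]

weights = ahp_weights(A)   # sums to 1; first determinant weighted highest

# Linear standardization index for one sub-sector, given hypothetical
# determinant scores on a 0-1 scale.
scores = [0.7, 0.5, 0.3]
index = sum(w * s for w, s in zip(weights, scores))
```

In the paper's setup, each of the 14 sub-sectors would have its own determinant set, weights, and linear equation of this form.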