• Title/Summary/Keyword: 절단자료 (censored data)

Search results: 248

Restoration for the censored image via EM algorithm (EM알고리즘을 이용한 중도절단화상에 대한 복원)

  • 김승구
    • The Korean Journal of Applied Statistics
    • /
    • v.10 no.2
    • /
    • pp.309-323
    • /
    • 1997
  • Many photochemical images are censored while they are recorded, yet standard restoration approaches are often applied to them. In that case the restored image may carry serious bias, but solutions to this problem are hardly found in the image-restoration literature. This article provides a method of image restoration via the EM algorithm for censored images contaminated with Gaussian noise and blur, and presents simulation results for artificially censored images.

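The EM idea behind this kind of restoration can be seen on a one-dimensional toy problem. The sketch below is a hypothetical illustration, not the paper's image model: Gaussian observations are right-censored at a known level, the E-step replaces each censored value by the conditional mean of the truncated normal, and the M-step re-averages.

```python
import math

def phi(z):  # standard normal density
    return math.exp(-z * z / 2) / math.sqrt(2 * math.pi)

def Phi(z):  # standard normal distribution function
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

def em_censored_mean(obs, censored, c, sigma=1.0, iters=50):
    """Estimate the mean of N(mu, sigma^2) when values above c are
    right-censored at c.  E-step: replace each censored value by
    E[X | X > c]; M-step: re-average the completed data."""
    mu = sum(obs) / len(obs)            # start from the naive mean
    for _ in range(iters):
        a = (c - mu) / sigma
        # mean of a normal truncated to (c, infinity)
        e_tail = mu + sigma * phi(a) / max(1 - Phi(a), 1e-12)
        completed = [e_tail if cen else x for x, cen in zip(obs, censored)]
        mu = sum(completed) / len(completed)
    return mu

# censoring at c = 1.0: any value recorded as 1.0 was actually >= 1.0
data     = [-0.3, 0.2, 0.5, 1.0, 1.0, 0.8, -0.1, 1.0]
censored = [x == 1.0 for x in data]
mu_hat = em_censored_mean(data, censored, c=1.0)
```

Because every censored value is replaced by a quantity above the censoring level, the EM estimate exceeds the naive mean of the recorded data, which is exactly the bias the paper addresses.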

Generalized Exponential Regression Model with Randomly Censored Data (임의중도절단자료를 갖는 일반화된 지수회귀모형)

  • 하일도
    • Journal of Korea Society of Industrial Information Systems
    • /
    • v.4 no.2
    • /
    • pp.39-43
    • /
    • 1999
  • We consider a generalized exponential regression model with randomly censored data and propose a modified Fisher scoring method to estimate the model parameters. The likelihood equations are derived and an estimating algorithm is developed. We illustrate the proposed method with real data.

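A minimal sketch of the scoring idea, under simplifying assumptions (a plain exponential hazard exp(beta * x) with a single covariate and no intercept, not the paper's generalized model):

```python
import math

def fit_exp_censored(times, status, x, iters=25):
    """Exponential regression with right censoring: hazard
    lambda_i = exp(beta * x_i).  Newton-type (scoring) updates on the
    censored-data log-likelihood
        l(beta) = sum_i [status_i * beta * x_i - exp(beta * x_i) * t_i],
    where status_i = 1 for an observed event, 0 for a censored time."""
    beta = 0.0
    for _ in range(iters):
        lam = [math.exp(beta * xi) for xi in x]
        score = sum(xi * (d - li * t)
                    for xi, d, li, t in zip(x, status, lam, times))
        info = sum(xi * xi * li * t for xi, li, t in zip(x, lam, times))
        beta += score / info
    return beta

# small synthetic data: larger x -> higher hazard -> shorter times
times  = [2.1, 1.5, 0.9, 0.4, 3.0, 0.3]
status = [1,   1,   1,   1,   0,   1]    # one right-censored subject
x      = [0,   0,   1,   1,   0,   1]
beta_hat = fit_exp_censored(times, status, x)
```

With a single binary covariate the estimate has a closed form (events over exposure in the x = 1 group), which makes the iteration easy to check.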

Developing statistical models and constructing clinical systems for analyzing semi-competing risks data produced from medicine, public health, and epidemiology (의료, 보건, 역학 분야에서 생산되는 준경쟁적 위험자료를 분석하기 위한 통계적 모형의 개발과 임상분석시스템 구축을 위한 연구)

  • Kim, Jinheum
    • The Korean Journal of Applied Statistics
    • /
    • v.33 no.4
    • /
    • pp.379-393
    • /
    • 2020
  • In semi-competing risks data, which are often seen in medicine, public health, and epidemiology, a terminal event such as death may censor an intermediate event such as relapse, but not vice versa. We propose a Weibull regression model with a normal frailty to analyze semi-competing risks data when all three transition times of the illness-death model may be interval-censored. We construct the conditional likelihood separately for each type of subject: alive with or without the intermediate event, dead with or without the intermediate event, and dead with the intermediate event status missing. Parameter estimates are obtained by an iterative quasi-Newton algorithm after marginalizing the full likelihood with adaptive importance sampling. We illustrate the proposed method with extensive simulation studies and the PAQUID (Personnes Agées Quid) data.
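The interval-censored likelihood underlying such models can be illustrated on a much simpler case: a single Weibull event time, where each subject known to fail in (L, R] contributes S(L) − S(R), maximized here by a crude grid search. The frailty structure and quasi-Newton/importance-sampling machinery of the paper are far beyond this sketch.

```python
import math

def weibull_S(t, shape, scale):
    """Weibull survival function S(t) = exp(-(t/scale)^shape)."""
    return math.exp(-((t / scale) ** shape))

def loglik_interval(intervals, shape, scale):
    """Log-likelihood of interval-censored times: each subject is only
    known to fail inside (L, R]; its contribution is S(L) - S(R).
    R = inf encodes a right-censored subject (contribution S(L))."""
    ll = 0.0
    for L, R in intervals:
        p = weibull_S(L, shape, scale) - \
            (0.0 if R == math.inf else weibull_S(R, shape, scale))
        ll += math.log(max(p, 1e-300))
    return ll

# crude grid maximization (illustration only; real fits use Newton-type solvers)
intervals = [(0.5, 1.5), (1.0, 2.0), (2.0, 3.5), (0.0, 1.0), (3.0, math.inf)]
best = max(((loglik_interval(intervals, a, b), a, b)
            for a in [0.5 + 0.1 * i for i in range(30)]
            for b in [0.5 + 0.1 * j for j in range(40)]),
           key=lambda t: t[0])
ll_max, shape_hat, scale_hat = best
```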

A two-sample test with interval censored competing risk data using multiple imputation (다중대체방법을 이용한 구간 중도 경쟁 위험 모형에서의 이표본 검정)

  • Kim, Yuwon;Kim, Yang-Jin
    • The Korean Journal of Applied Statistics
    • /
    • v.30 no.2
    • /
    • pp.233-241
    • /
    • 2017
  • Interval-censored data frequently occur in observational studies in which subjects are followed up periodically. In this paper, our interest is to suggest a test statistic for comparing the cumulative incidence functions (CIFs) of two groups with interval-censored failure-time data in the presence of competing risks. Gray (1988) suggested a test statistic for right-censored data that motivated the well-known Fine and Gray subdistribution hazard model. A multiple imputation technique is adopted to adapt Gray's test statistic to interval-censored data. The power and size of the suggested method are investigated through diverse simulation schemes. The main merit of the method is that it is simple to implement with existing software for right-censored data. The method is illustrated by analyzing a Bangkok HIV cohort dataset.
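The multiple-imputation idea can be sketched in a few lines: draw exact event times inside each censoring interval, apply a complete-data statistic, and average across imputations. In this sketch a simple difference in mean failure times stands in for Gray's statistic, and a uniform draw stands in for the fitted conditional distribution; both are simplifications, not the paper's procedure.

```python
import random

def impute_interval(intervals, rng):
    """Draw one exact time uniformly inside each censoring interval.
    (A crude stand-in for drawing from a fitted conditional
    distribution, which a real multiple-imputation analysis uses.)"""
    return [rng.uniform(L, R) for L, R in intervals]

def mi_mean_diff(group0, group1, M=200, seed=1):
    """Multiple imputation: build M complete datasets, apply a
    complete-data statistic (here a difference in mean failure times),
    and average the results across imputations."""
    rng = random.Random(seed)
    stats = []
    for _ in range(M):
        t0 = impute_interval(group0, rng)
        t1 = impute_interval(group1, rng)
        stats.append(sum(t1) / len(t1) - sum(t0) / len(t0))
    return sum(stats) / len(stats)

# group 1's censoring intervals sit later in time than group 0's
g0 = [(0.0, 1.0), (0.5, 1.5), (1.0, 2.0), (0.2, 0.8)]
g1 = [(1.5, 2.5), (2.0, 3.0), (1.0, 2.5), (2.5, 4.0)]
diff = mi_mean_diff(g0, g1)
```

The appeal noted in the abstract survives the simplification: once complete datasets are imputed, any existing right-censored or complete-data routine can be applied unchanged.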

Review for time-dependent ROC analysis under diverse survival models (생존 분석 자료에서 적용되는 시간 가변 ROC 분석에 대한 리뷰)

  • Kim, Yang-Jin
    • The Korean Journal of Applied Statistics
    • /
    • v.35 no.1
    • /
    • pp.35-47
    • /
    • 2022
  • The receiver operating characteristic (ROC) curve was developed to quantify the ability of marker values (covariates) to classify a response variable, and it has been extended to survival data with diverse missing-data structures. When survival data are viewed as binary data (alive or dead) at each time point, the ROC curve evaluated at every time point yields a time-dependent ROC curve and a time-dependent area under the curve (AUC). In particular, a follow-up study brings changes of the cohort and incomplete data structures such as censoring and competing risks. In this paper, we review time-dependent ROC estimators under several contexts and perform simulations to check the performance of each estimator. We also analyze a dementia dataset to compare the prognostic power of markers.
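A cumulative/dynamic time-dependent AUC can be estimated, ignoring censoring, by comparing marker values between subjects who have failed by time t and those still event-free at t; the estimators reviewed in the paper add reweighting to handle censoring, which this sketch omits.

```python
def auc_at_time(times, markers, t):
    """Cumulative/dynamic time-dependent AUC at time t: cases have
    failed by t, controls are still event-free at t; the AUC is the
    proportion of (case, control) pairs the marker orders correctly,
    counting ties as half.  (Censoring is ignored in this sketch.)"""
    cases = [m for ti, m in zip(times, markers) if ti <= t]
    ctrls = [m for ti, m in zip(times, markers) if ti > t]
    if not cases or not ctrls:
        return float('nan')
    wins = sum((mc > mk) + 0.5 * (mc == mk) for mc in cases for mk in ctrls)
    return wins / (len(cases) * len(ctrls))

# a marker that is higher for early failures should score near 1
times   = [1, 2, 3, 8, 9, 10]
markers = [5, 4, 4, 2, 1, 1]
auc_t5  = auc_at_time(times, markers, t=5)
```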

Estimating the Economic Value of Recreation Sea Fishing in the Yellow Sea: An Application of Count Data Model (가산자료모형을 이용한 서해 태안군 유어객의 편익추정)

  • Choi, Jong Du
    • Environmental and Resource Economics Review
    • /
    • v.23 no.2
    • /
    • pp.331-347
    • /
    • 2014
  • The purpose of this study is to estimate the economic value of recreational sea fishing in the Yellow Sea using count data models. To estimate consumer surplus, we used several count data models of travel-cost recreation demand: a Poisson model (PM), a negative binomial model (NBM), a truncated Poisson model (TPM), and a truncated negative binomial model (TNBM). Model results show no over-dispersion problem, and the NBM was statistically more suitable than the other models. All estimated parameters are statistically significant and theoretically valid. The NBM was applied to estimate travel demand and consumer surplus. Consumer surplus per trip was estimated to be 254,453 won, and total consumer surplus per person per year 1,536,896 won.
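The travel-cost logic can be sketched with a plain Poisson model, the simplest of the four models compared: fit trips_i ~ Poisson(exp(b0 + b1 * cost_i)) by Newton-Raphson, then read off consumer surplus per trip as −1/b1. The data below are synthetic, not the Taean survey data.

```python
import math

def poisson_travel_cost(trips, cost, iters=50):
    """Poisson travel-cost demand: trips_i ~ Poisson(exp(b0 + b1*cost_i)),
    fitted by Newton-Raphson on the two-parameter log-likelihood.
    In this demand model b1 < 0 (demand falls with travel cost) and the
    consumer surplus per trip is -1/b1."""
    b0, b1 = math.log(sum(trips) / len(trips)), 0.0   # sensible start
    for _ in range(iters):
        mu = [math.exp(b0 + b1 * c) for c in cost]
        g0 = sum(y - m for y, m in zip(trips, mu))            # score
        g1 = sum(c * (y - m) for y, m, c in zip(trips, mu, cost))
        h00 = sum(mu)                                          # information
        h01 = sum(c * m for c, m in zip(cost, mu))
        h11 = sum(c * c * m for c, m in zip(cost, mu))
        det = h00 * h11 - h01 * h01
        b0 += (h11 * g0 - h01 * g1) / det                      # Newton step
        b1 += (h00 * g1 - h01 * g0) / det
    return b0, b1

# synthetic anglers: fewer trips at higher travel cost
cost  = [10, 20, 30, 40, 50, 60]
trips = [8,  6,  5,  3,  2,  1]
b0, b1 = poisson_travel_cost(trips, cost)
cs_per_trip = -1.0 / b1
```

A negative binomial fit, as used in the paper, adds a dispersion parameter to the same demand equation; the surplus formula is unchanged.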

Development of mathematical learning materials through geometric problems and the invention of pentominoes (기하학적 문제와 펜토미노의 발명을 통한 수학 학습에서의 자료 개발)

  • Hwang, Sun-Wook;Shim, Sang-Kil
    • Journal for History of Mathematics
    • /
    • v.20 no.1
    • /
    • pp.57-72
    • /
    • 2007
  • Recently, dissection puzzles such as pentominoes have been used in mathematics education, but they are not actively used as tools for problem solving or for introducing mathematical concepts, because research on the historical background and the mathematical applications of such puzzles has not been carried out effectively. In this article, in order to use pentominoes effectively in mathematical learning, we first investigate geometric problems related to dissection puzzles and the historical background of the development of pentominoes. We then collect and classify data on pentomino activities applicable to mathematics classes based on the 7th national elementary school curriculum. Finally, we suggest several basic materials and directions for developing more systematic learning materials about pentominoes.


Nesting Algorithm for Optimal Layout of Cutting Parts in Laser Cutting Process (레이저 절단공정에서 절단부재의 최적배치를 위한 네스팅 알고리즘)

  • 한국찬;나석주
    • Journal of Welding and Joining
    • /
    • v.12 no.2
    • /
    • pp.11-19
    • /
    • 1994
  • Laser processing has a wide range of applications in materials processing; in cutting, welding, and heat treatment in particular, its high precision and ease of automation have made it a highly productive, high-value-added technology. Laser cutting especially has begun to spread rapidly in thin- and thick-plate cutting thanks to its advantages over other cutting methods in cutting accuracy, heat-affected zone, productivity, and working environment. Most laser processing machines today are CNC-controlled, and in laser cutting, automation through a CAD/CAM interface is indispensable for higher productivity and precision. Moreover, machines that mount a high-power laser generator directly on the machine body have made it possible to cut large members, and various systems aimed at unmanned cutting processes are being developed. One such effort toward unmanned operation, higher productivity, shorter working time, and savings in running cost and material is the development of computer-based automatic and semi-automatic nesting systems. In two-dimensional laser cutting, nesting is performed after each part is designed and before it is cut; once nesting is complete, the cutting path is determined and, together with the processing conditions, the NC code needed to control the numerically controlled machine tool is generated. Such nesting systems have recently been applied on some production floors, but most of them are imported from abroad. Optimal automatic layout of two-dimensional patterns applies not only to thermal processes such as laser cutting but also to blanking dies, clothing, glass, and wood, so domestic development of such packages is urgently needed. The requirements and constraints of nesting, and hence the algorithms and data structures, differ by application, but the common goal is to place parts within a given region without overlap while minimizing scrap. Over the past decade, nesting systems ranging from manual through semi-automatic to automatic have been actively introduced in many industries, but automatic nesting systems are still avoided on production floors because the reliability of their layout efficiency is comparatively low. From the viewpoint of layout algorithms, these problems are classified as NP-complete: combinatorial optimization problems for which an optimal solution is hard to obtain within a limited time. This article therefore gives an overview of nesting systems for laser cutting and recent research trends, introduces several typical nesting algorithms, and briefly discusses possible improvements through a comparative analysis.

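The common goal stated in the abstract, placing parts in a given region without overlap while minimizing scrap, can be illustrated with one of the simplest layout heuristics, a shelf packer for axis-aligned rectangles. Production nesting systems handle arbitrary polygons, rotation, and far stronger optimization; this is only a sketch of the problem shape.

```python
def shelf_nest(parts, sheet_width):
    """Minimal shelf (level) nesting heuristic for rectangles (w, h):
    sort tallest first, fill a shelf left to right, and open a new
    shelf above it when the next part no longer fits.  Returns the
    placements (x, y, w, h) and the total strip height used."""
    placements, shelf_y, shelf_h, x = [], 0.0, 0.0, 0.0
    for w, h in sorted(parts, key=lambda p: -p[1]):   # tallest first
        if x + w > sheet_width:          # part does not fit: new shelf
            shelf_y += shelf_h
            x, shelf_h = 0.0, 0.0
        placements.append((x, shelf_y, w, h))
        x += w
        shelf_h = max(shelf_h, h)
    used_height = shelf_y + shelf_h
    return placements, used_height

parts = [(4, 3), (3, 2), (5, 2), (2, 2), (6, 1)]
layout, height = shelf_nest(parts, sheet_width=10)
```

The gap between this heuristic's strip height and the area lower bound is exactly the "scrap rate" that stronger nesting algorithms fight to reduce.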

A Study on Drought Trend in Han River Basin (한강유역의 가뭄경향에 관한 연구)

  • Kim, Hyeong-Su;Mun, Jang-Won;Kim, Jae-Hyeong;Kim, Jung-Hun
    • Journal of Korea Water Resources Association
    • /
    • v.33 no.4
    • /
    • pp.437-446
    • /
    • 2000
  • A drought analysis is performed by applying the truncation-level method and the conditional-probability concept to hydrologic time series in the Han river basin, and the spatial trend of the conditional probability is determined using the kriging method. This study uses daily flowrate, monthly rainfall, and daily high-temperature data sets: 12 years (1986~1997) of daily flowrate, and 14 years (1986~1999) of monthly rainfall and daily high temperature obtained from the National Weather Service of Korea. For the flowrate and rainfall data, the value estimated at the truncation level decreases as the truncation level increases, whereas for the high-temperature data it increases. The conditional probability varies with the observations and sites, but the spatial trend of drought is similar over the basin. As a result, the possibility of drought is high in the middle and lower parts of the Han river basin, and it is therefore recommended that this spatial trend be considered when drought plans or measures are established.

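The truncation-level method can be sketched directly: a drought event is a maximal run of values below the chosen level, characterized by its duration and its accumulated deficit. The series below is made up for illustration.

```python
def drought_runs(series, level):
    """Truncation-level (run) analysis: a drought is a maximal run of
    consecutive values below the truncation level; report each run's
    duration and severity (accumulated deficit below the level)."""
    runs, dur, deficit = [], 0, 0.0
    for x in series:
        if x < level:                    # inside a drought run
            dur += 1
            deficit += level - x
        elif dur:                        # run just ended
            runs.append((dur, deficit))
            dur, deficit = 0, 0.0
    if dur:                              # series ended mid-drought
        runs.append((dur, deficit))
    return runs

flow = [5, 3, 2, 6, 7, 1, 2, 2, 8, 4]
events = drought_runs(flow, level=4)
```

Raising the truncation level makes more of the series count as drought, which is the sensitivity the abstract describes for the flowrate and rainfall data.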

Optimal Sampling Method of Censored Data for Optimizing Preventive Maintenance (예방정비 최적화를 위한 중도절단 자료의 최적 샘플링 방안)

  • Lee, In-Hyun;Oh, Sea-Hwa;Li, Chang-Long;Yang, Dong-In;Lee, Key-Seo
    • Journal of the Korean Society for Railway
    • /
    • v.16 no.3
    • /
    • pp.196-201
    • /
    • 2013
  • Because no failure data exist for the entire lifecycle of a product, the mean life estimated from early failure data alone may differ seriously from the real one: it can be underestimated, or, on the other hand, overestimated when a large amount of censored data is analyzed together with the failure data. To resolve this issue, this study proposes an optimal sampling estimation procedure that selects the proportion of censored data so that the estimated life distribution approximates the real one as closely as possible, on the premise that the failure data follow an intrinsic distribution in any situation. We validate the proposed procedure with an actual example. If the proposed method is applied to the maintenance policy of the TWC (Train to Wayside Communication) system, the optimal maintenance policy can be established; we therefore expect it to be effective for reliability improvement and cost savings.
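The underestimation the authors describe is easy to see with the exponential mean-life MLE, total time on test divided by the number of observed failures: dropping the censored exposure time biases the estimate downward. (The paper's actual contribution, choosing an optimal proportion of censored data, is not implemented here; the numbers are hypothetical.)

```python
def exp_mean_life(times, status):
    """MLE of the exponential mean life with right-censored data:
    total time on test divided by the number of observed failures
    (status 1 = failure, 0 = censored/still running)."""
    failures = sum(status)
    return sum(times) / failures if failures else float('inf')

# same three failures, with and without the still-running units' exposure
failures      = [120.0, 340.0, 510.0]
early_only    = exp_mean_life(failures, [1, 1, 1])
with_censored = exp_mean_life(failures + [600.0, 600.0, 600.0],
                              [1, 1, 1, 0, 0, 0])
```

Ignoring the 1,800 hours accumulated by the censored units cuts the mean-life estimate to roughly a third, which is the kind of error the proposed sampling procedure is designed to control.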