• Title/Summary/Keyword: misclassification


Review of Public Health Aspects of Exposure to Agent Orange (고엽제 노출에 따른 건강위해의 보건학적 고찰)

  • Yang, Won-Ho; Hong, Ga-Yeon; Kim, Geun-Bae
    • Journal of Environmental Health Sciences / v.38 no.3 / pp.175-183 / 2012
  • Objectives: Controversy regarding the relationship between exposure to Agent Orange and disease has continued for more than four decades, both at home and abroad. Recently, US veteran Steve House alleged that Agent Orange had been buried at Camp Carroll, a US Army base located in Waegwan-eup, Korea. We reviewed published articles and reports related to Agent Orange. Methods: Articles and reports were collected online using the keywords 'agent orange' and 'health' and then reviewed. Results: A number of epidemiologic studies have reported disease outcomes due to exposure to Agent Orange, while others were unable to establish a link to the injuries of veterans of the Vietnam War. This discrepancy can be explained by exposure misclassification, which can distort risk estimates in epidemiologic studies; accurate exposure assessment is therefore essential. In the case of the burial of Agent Orange at Camp Carroll, an exposure pathway could be through underground water supplies, which differs from the cases of Vietnam and Seveso in Italy. Conclusion: There still remains a dispute among academics regarding the relationship between exposure to Agent Orange and disease, although Agent Orange is a highly toxic chemical. This dispute indicates that accurate characterization of exposure pathways and accurate exposure assessment are needed.

A Note on the Bias in the Multi-nomial Classification (다항분류상 편의에 관한 연구)

  • 윤용운
    • Journal of Korean Society of Industrial and Systems Engineering / v.1 no.1 / pp.45-48 / 1978
  • If two inspectors classify the items in a lot into m classes, each of them may misclassify some items, thus introducing bias. Expressions have been obtained for the limits of this bias in estimating the proportions of the different classes. From the results of their classification, limits for the estimates of the proportions have been worked out, based on assumptions regarding the magnitudes of the probabilities of misclassification. Suppose that $P_{ti}\;(t=1,2)$ is the probability that the $t$-th inspector correctly classifies an item in class $A_i$, and $q_{tji}$ is the probability that he misclassifies into $A_j$ an item actually belonging to $A_i$; therefore, $P_{ti}+\sum\limits_{j{\neq}i}q_{tji}=1$. An estimate for the proportion $P_k$ of class $A_k$ in the lot would be $\hat{P}_k=r_{kk}+\frac{1}{2}\sum\limits_{j{\neq}k}(r_{kj}+r_{jk})$. The percentage bias in the proportion $\hat{P}_k$ is $\frac{E(\hat{P}_k)-P_k}{P_k}{\times}100$.
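
As a quick numerical illustration of the estimator above, the sketch below simulates two inspectors with assumed confusion matrices and checks the estimate $\hat{P}_k$ and its percentage bias by Monte Carlo. The class proportions, the matrices Q1 and Q2, and the reading of $r_{kj}$ as the observed joint classification proportions are illustrative assumptions, not values from the paper.

```python
# Minimal Monte Carlo sketch of the two-inspector proportion estimator.
import numpy as np

rng = np.random.default_rng(0)
m = 3                                   # number of classes
P = np.array([0.5, 0.3, 0.2])           # true class proportions (assumed)
# Q[t][j, i] = probability that inspector t assigns class j to an item of class i
Q1 = np.array([[0.9, 0.1, 0.0],
               [0.1, 0.8, 0.1],
               [0.0, 0.1, 0.9]])
Q2 = np.array([[0.85, 0.10, 0.05],
               [0.10, 0.85, 0.10],
               [0.05, 0.05, 0.85]])

n = 20_000
true_cls = rng.choice(m, size=n, p=P)
insp1 = np.array([rng.choice(m, p=Q1[:, c]) for c in true_cls])
insp2 = np.array([rng.choice(m, p=Q2[:, c]) for c in true_cls])

# r[i, j]: proportion of items labelled A_i by inspector 1 and A_j by inspector 2
r = np.zeros((m, m))
for i, j in zip(insp1, insp2):
    r[i, j] += 1.0 / n

# P_hat_k = r_kk + (1/2) * sum_{j != k} (r_kj + r_jk)
P_hat = np.array([r[k, k] + 0.5 * sum(r[k, j] + r[j, k] for j in range(m) if j != k)
                  for k in range(m)])
bias_pct = (P_hat - P) / P * 100        # Monte Carlo estimate of the % bias
print(np.round(P_hat, 4), np.round(bias_pct, 2))
```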


A Study on the Data Fusion Method using Decision Rule for Data Enrichment (의사결정 규칙을 이용한 데이터 통합에 관한 연구)

  • Kim, S.Y.; Chung, S.S.
    • The Korean Journal of Applied Statistics / v.19 no.2 / pp.291-303 / 2006
  • Data mining is the process of extracting information from existing data files, so one of the most important factors in the data mining process is the quality of the data to be used. In this paper, we propose a data fusion technique using decision rules for data enrichment, one phase of improving data quality in the KDD process. Simulations were performed to compare the proposed data fusion technique with existing techniques. As a result, our data fusion technique using decision rules is characterized by a low MSE or misclassification rate for the fusion variables.
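
The abstract does not spell out the fusion procedure, but the general idea of decision-rule data fusion can be sketched as follows: a donor file containing the fusion variable trains a decision rule on the common variables, and the rule then imputes that variable into a recipient file that lacks it. The synthetic data, the use of scikit-learn's DecisionTreeClassifier, and the tree settings are stand-in assumptions for illustration only.

```python
# Sketch of decision-rule data fusion for data enrichment (illustrative only).
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(1)
n_donor, n_recipient = 500, 300

def make_file(n):
    X = rng.normal(size=(n, 4))                      # common variables in both files
    z = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)    # categorical fusion variable Z
    return X, z

X_donor, z_donor = make_file(n_donor)
X_recip, z_true = make_file(n_recipient)             # z_true kept only to score the fusion

rule = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_donor, z_donor)
z_fused = rule.predict(X_recip)                      # enriched recipient file

misclassification_rate = 1 - accuracy_score(z_true, z_fused)
print(f"misclassification rate of fused Z: {misclassification_rate:.3f}")
```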

Robust Traffic Monitoring System by Spatio-Temporal Image Analysis (시공간 영상 분석에 의한 강건한 교통 모니터링 시스템)

  • 이대호; 박영태
    • Journal of KIISE: Software and Applications / v.31 no.11 / pp.1534-1542 / 2004
  • A novel vision-based scheme for extracting real-time traffic information parameters is presented. The method is based on region classification followed by spatio-temporal image analysis. The detection-region images for each traffic lane are classified into one of three categories: road, vehicle, and shadow, using statistical and structural features. Misclassification in a frame is corrected by using temporally correlated features of vehicles in the spatio-temporal image. Since only local images of detection regions are processed, real-time operation at more than 30 frames per second is realized without dedicated parallel processors, while ensuring detection performance robust to variations in weather conditions, shadows, and traffic load.
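
A rough sketch of the per-lane detection-region idea described above: each frame's detection region is labelled road, shadow, or vehicle from simple intensity statistics, and single-frame misclassifications are corrected by temporal consistency over the spatio-temporal stack. The features, thresholds, and majority-vote correction below are illustrative guesses, not the statistical/structural features or correction rule of the paper.

```python
# Toy region classification with a temporal majority-vote correction.
import numpy as np

ROAD, SHADOW, VEHICLE = 0, 1, 2

def classify_region(region, road_mean, road_std):
    """Label one detection-region patch using intensity statistics."""
    mean, std = region.mean(), region.std()
    if abs(mean - road_mean) < 2 * road_std and std < 2 * road_std:
        return ROAD                      # looks like the empty road surface
    if mean < road_mean and std < 2 * road_std:
        return SHADOW                    # darker but still homogeneous
    return VEHICLE                       # structured / high-variance region

def temporal_correction(labels, window=5):
    """Majority vote over a sliding temporal window of per-frame labels."""
    labels = np.asarray(labels)
    fixed = labels.copy()
    half = window // 2
    for t in range(len(labels)):
        lo, hi = max(0, t - half), min(len(labels), t + half + 1)
        fixed[t] = np.bincount(labels[lo:hi]).argmax()
    return fixed

# toy usage: 60 frames of a 20x20 detection region, a vehicle present in frames 20-40
frames = np.full((60, 20, 20), 120.0) + np.random.default_rng(2).normal(0, 3, (60, 20, 20))
frames[20:40] += np.random.default_rng(3).normal(40, 25, (20, 20, 20))
raw = [classify_region(f, road_mean=120.0, road_std=3.0) for f in frames]
print(temporal_correction(raw))
```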

An Analysis of Noise Robustness for Multilayer Perceptrons and Its Improvements (다층퍼셉트론의 잡음 강건성 분석 및 향상 방법)

  • Oh, Sang-Hoon
    • The Journal of the Korea Contents Association / v.9 no.1 / pp.159-166 / 2009
  • In this paper, we analyze the noise robustness of MLPs (multilayer perceptrons) by deriving the probability density function (p.d.f.) of the output nodes under additive input noise and the misclassification ratio as an integral of the p.d.f.s. We also propose linear preprocessing methods to improve noise robustness. As a preprocessing stage for MLPs, we consider ICA (independent component analysis) and PCA (principal component analysis). After analyzing the noise reduction effect of PCA or ICA from the viewpoint of SNR (signal-to-noise ratio), we verify the preprocessing effects through simulations of handwritten-digit recognition problems.
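
In the same spirit, a minimal experiment comparing an MLP's misclassification rate on noisy inputs with and without a PCA preprocessing stage might look like the sketch below; the dataset (scikit-learn digits), network size, and noise level are assumptions chosen only for illustration.

```python
# Compare MLP misclassification on noisy digit images with and without PCA preprocessing.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline

X, y = load_digits(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

rng = np.random.default_rng(0)
X_te_noisy = X_te + rng.normal(0, 2.0, X_te.shape)   # additive input noise

plain = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
with_pca = make_pipeline(PCA(n_components=20),
                         MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0))

for name, model in [("MLP only", plain), ("PCA + MLP", with_pca)]:
    model.fit(X_tr, y_tr)
    err = 1 - model.score(X_te_noisy, y_te)          # misclassification ratio
    print(f"{name}: misclassification on noisy inputs = {err:.3f}")
```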

Automatic Electronic Cleansing in Computed Tomography Colonography Images using Domain Knowledge

  • Manjunath, KN; Siddalingaswamy, PC; Prabhu, GK
    • Asian Pacific Journal of Cancer Prevention / v.16 no.18 / pp.8351-8358 / 2016
  • Electronic cleansing is an image post-processing technique in which tagged colonic content is subtracted from the colon in CTC images. There are post-processing artifacts such as: 1) soft tissue degradation; 2) incomplete cleansing; 3) misclassification of polyps due to pseudo-enhanced voxels; and 4) pseudo soft tissue structures. The objective of the study was to subtract the tagged colonic content without losing soft tissue structures. This paper proposes a novel adaptive method to solve the first three problems using a multi-step algorithm. It uses a new edge-model-based method which involves colon segmentation, prior information on the Hounsfield units (HU) of different colonic contents at specific tube voltages, subtraction of the tagging materials, restoration of soft tissue structures based on selective HU, removal of the boundary between air and contrast, and application of a filter to clean minute particles due to improperly tagged endoluminal fluids which appear as noise. The main finding of the study was that submerged soft tissue structures were completely preserved and the pseudo-enhanced intensities were corrected without any artifacts. The method was implemented with multithreading for parallel processing on a high-performance computer. The technique was applied to a fecal-tagged dataset (30 patients) in which the tagging agent had not been completely removed from the colon. The results were then qualitatively validated by radiologists for any image processing artifacts.
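
A deliberately simplified sketch of the HU-threshold part of such a cleansing pipeline is shown below: voxels inside a colon mask whose Hounsfield units fall in an assumed tagged-fluid range are replaced with air, while soft-tissue HU values are left untouched. The HU cut-offs, toy volume, and mask are assumptions, not the paper's values or its full multi-step algorithm.

```python
# Simplified HU-threshold electronic cleansing on a synthetic CT volume.
import numpy as np

AIR_HU = -1000
TAGGED_MIN, TAGGED_MAX = 200, 1200      # assumed HU range of tagged colonic fluid

def electronic_cleansing(volume_hu, colon_mask):
    """Replace tagged-fluid voxels inside the colon mask with air HU values."""
    cleansed = volume_hu.copy()
    tagged = colon_mask & (volume_hu >= TAGGED_MIN) & (volume_hu <= TAGGED_MAX)
    cleansed[tagged] = AIR_HU
    # Soft-tissue voxels (submerged folds, polyps) stay untouched because their
    # HU values fall outside the tagged-fluid range.
    return cleansed

# toy volume: air lumen (-1000), a pool of tagged fluid (~600 HU), a soft-tissue structure (~40 HU)
vol = np.full((32, 32, 32), -1000.0)
vol[:, 20:, :] = 600.0                  # tagged fluid pool
vol[:, 18:22, 10:14] = 40.0             # soft-tissue structure partially submerged
mask = np.ones_like(vol, dtype=bool)
out = electronic_cleansing(vol, mask)
print("tagged voxels removed:", int(((vol >= TAGGED_MIN) & (out == AIR_HU)).sum()))
```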

Retrospective Exposure Assessment of Wafer Fabrication Workers in the Semiconductor Industry (반도체 웨이퍼 가공 공정 역학 조사에서 과거 노출 평가 방법 고찰)

  • Park, Dong-Uk
    • Journal of Environmental Health Sciences / v.37 no.1 / pp.12-21 / 2011
  • The objective of this study is to review retrospective exposure assessment methods used in wafer fabrication operations to determine whether adverse health effects, including mortality or cancer incidence, are related to employment in particular work activities, and to recommend an appropriate approach to retrospective exposure assessment for epidemiological studies. The goal of retrospective exposure assessment for such studies is to assign each study subject to a workgroup in such a way that differences in exposure within workgroups are minimized and contrasts in exposure between workgroups are maximized. To reduce exposure misclassification and to determine whether adverse health effects including mortality or cancer incidence are related to particular work activities of wafer fabrication workers, work history information covering, at a minimum, the wafer manufacturing era, job, and department in which workers were exposed should be assessed. Retrospective assessment of the tasks that semiconductor workers performed should be conducted not only to determine the effect of a particular job on the development of adverse health effects including mortality or cancer incidence, but also to adjust for the healthy worker effect. In order to identify specific hazardous agents that may cause adverse health effects, past exposure to specific agents or agent matrices should also be assessed.

Spatial Epidemiology and Environmental Health: On the Use of Spatially Referenced Health and Environment Data (공간역학과 환경보건: 공간위치정보 활용에 대한 고찰)

  • Han, Dai-Kwon; Hwang, Seung-Sik
    • Journal of Environmental Health Sciences / v.37 no.1 / pp.1-11 / 2011
  • Recent advances in Geographic Information Systems and spatial statistical and analytical methods, along with the availability of spatially referenced health and environmental data, have created unique opportunities to investigate spatial associations between environmental exposures and health outcomes at multiple spatial scales and resolutions. However, the increased use of spatial data also faces challenges, one of which is to ensure the certainty and accuracy of locational data to meet the needs of a study. This article critically reviews the use of spatially referenced data in epidemiologic studies, focusing on the issue of locational uncertainty generated in the process of geocoding health and environmental data. Major issues involving the use of spatially referenced data are addressed, including completeness and positional accuracy, potential sources of bias and exposure misclassification, and implications for epidemiologic studies. The need for critical assessment and caution in designing and conducting spatial epidemiology studies is briefly discussed.

A new classification method using penalized partial least squares (벌점 부분최소자승법을 이용한 분류방법)

  • Kim, Yun-Dae; Jun, Chi-Hyuck; Lee, Hye-Seon
    • Journal of the Korean Data and Information Science Society / v.22 no.5 / pp.931-940 / 2011
  • Classification is the task of generating a rule for assigning objects to several categories based on a learning sample. A good classification model should classify new objects with low misclassification error. Many types of classification methods have been developed, including logistic regression, discriminant analysis, and trees. This paper presents a new classification method using penalized partial least squares. Penalized partial least squares can make the model more robust and remedy the multicollinearity problem. This paper compares the proposed method with logistic regression and PCA-based discriminant analysis on some real and artificial data. It is concluded that the new method has better power compared with the other methods.
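
Since the paper's penalized PLS is not available off the shelf, the sketch below uses ordinary PLS from scikit-learn as a stand-in (without the penalty term) to show how PLS scores can be thresholded into a classifier and compared with logistic regression on multicollinear synthetic data; it illustrates the comparison setup, not the proposed method itself.

```python
# Compare a PLS-based classifier with logistic regression on multicollinear data.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n, p = 300, 30
base = rng.normal(size=(n, 5))
X = base @ rng.normal(size=(5, p)) + 0.1 * rng.normal(size=(n, p))  # multicollinear predictors
y = (base[:, 0] - base[:, 1] > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

pls = PLSRegression(n_components=3).fit(X_tr, y_tr)
y_pls = (pls.predict(X_te).ravel() > 0.5).astype(int)      # threshold the PLS prediction

logit = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
y_log = logit.predict(X_te)

print("PLS   misclassification:", np.mean(y_pls != y_te).round(3))
print("logit misclassification:", np.mean(y_log != y_te).round(3))
```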

Investigations on the Optimal Support Vector Machine Classifiers for Predicting Design Feasibility in Analog Circuit Optimization

  • Lee, Jiho; Kim, Jaeha
    • JSTS: Journal of Semiconductor Technology and Science / v.15 no.5 / pp.437-444 / 2015
  • In simulation-based circuit optimization, many simulation runs may be wasted on evaluating infeasible designs, i.e., designs that do not meet the constraints. To avoid such waste, this paper investigates the use of support vector machine (SVM) classifiers to predict a design's feasibility prior to simulation, as well as the optimal selection of the SVM parameters, namely the Gaussian kernel shape parameter ${\gamma}$ and the misclassification penalty parameter C. These parameters affect the complexity as well as the accuracy of the model that the SVM represents. For instance, a higher ${\gamma}$ is good for detailed modeling and a higher C is good for rejecting noise in the training set. However, our empirical study shows that a low ${\gamma}$ value is preferable due to the high spatial correlation among the circuit design candidates, while C has negligible impact due to the smooth and clean constraint boundaries of most circuit designs. The experimental results with an LC-tank oscillator example show that an optimal selection of these parameters can improve the prediction accuracy from 80% to 98% and reduce the model complexity by $10{\times}$.
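
A small sketch of the kind of parameter study described above: RBF-kernel SVM feasibility classifiers are trained over a grid of ${\gamma}$ and C values and their prediction accuracies compared. The synthetic design space and the smooth constraint used as the feasibility boundary are stand-ins for the circuit simulation data of the paper.

```python
# Grid study of SVM gamma and C for predicting design feasibility.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(1000, 4))              # normalized design parameters
feasible = (X[:, 0] + 0.8 * X[:, 1] - 0.5 * X[:, 2] > 0.6).astype(int)  # smooth constraint boundary

X_tr, X_te, y_tr, y_te = train_test_split(X, feasible, test_size=0.3, random_state=0)

for gamma in [0.1, 1.0, 10.0]:                     # Gaussian kernel shape parameter
    for C in [0.1, 1.0, 10.0]:                     # misclassification penalty parameter
        clf = SVC(kernel="rbf", gamma=gamma, C=C).fit(X_tr, y_tr)
        acc = clf.score(X_te, y_te)
        print(f"gamma={gamma:5.1f}  C={C:5.1f}  accuracy={acc:.3f}")
```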