• Title/Summary/Keyword: Decision Threshold


Semi-Fragile Image Watermarking for Authentication Using Wavelet Packet Transform Based on The Subband Energy (부대역 에너지 기반 웨이블릿 패킷 변환을 이용한 인증을 위한 세미 프레자일 영상 워터마킹)

  • Park, Sang-Ju;Kwon, Tae-Hyeon
    • The KIPS Transactions:PartB
    • /
    • v.12B no.4 s.100
    • /
    • pp.421-428
    • /
    • 2005
  • A new method of semi-fragile image watermarking which ensures the integrity of the contents of a digital image is presented. The proposed watermarking scheme embeds the watermark in the form of quantization noise on the wavelet transform coefficients in specific mid-frequency subbands selected from a wavelet packet decomposition based on the energy distribution of the wavelet transform coefficients. By controlling the strength of the embedded watermark using HVS (Human Visual System) characteristics, the watermark is imperceptible to a human viewer while remaining robust against non-malicious attacks such as compression for storage and/or transmission. When an attack is applied to the original image, it is highly probable that the wavelet transform coefficients not only at the exact attack positions but also at neighboring positions are modified. Therefore, the proposed authentication method checks whether both the current coefficient and its neighbors are damaged together, so it can efficiently detect and accurately localize attacks inflicted on the content of the original image. The decision threshold for authentication can be user-controlled for different application areas as needed.
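
A minimal sketch of the embedding and verification idea described above. It assumes a plain 2-level Haar DWT in place of the paper's energy-guided wavelet packet selection, and a fixed quantization step DELTA instead of the HVS-controlled strength; all names and values are illustrative.

```python
# Hedged sketch: embed the watermark as quantization noise on wavelet
# coefficients and verify integrity from the quantization residuals.
# A plain 2-level Haar DWT and a fixed step DELTA stand in for the paper's
# energy-guided wavelet packet selection and HVS-controlled strength.
import numpy as np
import pywt

DELTA = 8.0  # illustrative quantization step (embedding strength)

def embed(image, bits):
    """Embed binary watermark `bits` into a level-2 mid-frequency subband."""
    coeffs = pywt.wavedec2(image.astype(float), "haar", level=2)
    cH2, cV2, cD2 = coeffs[1]
    flat = cH2.ravel()
    for i, b in enumerate(bits[: flat.size]):
        base = DELTA * np.floor(flat[i] / DELTA)
        flat[i] = base + (0.25 if b == 0 else 0.75) * DELTA  # QIM-style noise
    coeffs[1] = (flat.reshape(cH2.shape), cV2, cD2)
    return pywt.waverec2(coeffs, "haar")

def damaged_map(image, bits, tol=DELTA / 4):
    """Mark coefficients whose residual no longer matches the embedded bit."""
    coeffs = pywt.wavedec2(image.astype(float), "haar", level=2)
    cH2 = coeffs[1][0].ravel()
    damaged = np.zeros(min(len(bits), cH2.size), dtype=bool)
    for i in range(damaged.size):
        target = (0.25 if bits[i] == 0 else 0.75) * DELTA
        damaged[i] = abs(cH2[i] % DELTA - target) > tol
    return damaged
```

The paper's decision step additionally flags a position as tampered only when the coefficient and its neighbors are damaged together, comparing the result against the user-controlled threshold; that neighborhood logic is omitted from the sketch.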

Cost-Effectiveness Analysis for National Dyslipidemia Screening Program in Korea: Results of Best Case Scenario Analysis Using a Markov Model

  • Kim, Jae-Hyun;Park, Eun-Cheol;Kim, Tae-Hyun;Nam, Chung-Mo;Chun, Sung-Youn;Lee, Tae-Hoon;Park, Sohee
    • Health Policy and Management
    • /
    • v.29 no.3
    • /
    • pp.357-367
    • /
    • 2019
  • Background: This study evaluated the cost-effectiveness of 21 different national dyslipidemia screening strategies, defined by total cholesterol (TC) cutoff and screening interval, among adults aged 40 years or older for the primary prevention of coronary heart disease over a lifetime in Korea, from a societal perspective. Methods: A decision tree was used to estimate disease detection with the 21 different screening strategies, while a Markov model was used to model disease progression until death, quality-adjusted life years (QALYs), and costs from a Korean societal perspective. Results: The strategy with a TC cutoff of 200 mg/dL and a 4-year interval cost ₩4,625,446 for 16.65105 QALYs per person, and the strategy with TC 200 mg/dL and a 3-year interval cost ₩4,691,771 for 16.65164 QALYs, compared with ₩3,061,371 for 16.59877 QALYs with no screening. The incremental cost-effectiveness ratio of the TC 200 mg/dL, 4-year interval strategy versus no screening was ₩29,916,271/QALY. At a Korean willingness-to-pay threshold of ₩30,500,000/QALY, the strategy with TC 200 mg/dL and a 4-year interval is cost-effective compared with no screening. Sensitivity analyses showed that the results were robust to reasonable variations in model parameters. Conclusion: In this study, a revised national dyslipidemia screening strategy with a TC cutoff of 200 mg/dL and a 4-year interval could be a cost-effective option. A better understanding of the Korean dyslipidemia population may be necessary to aid future efforts to improve dyslipidemia diagnosis and management.
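
As a quick arithmetic check, the incremental cost-effectiveness ratio (ICER) quoted above can be reproduced from the reported costs and QALYs; the small discrepancy from ₩29,916,271/QALY comes from rounding of the published QALY values.

```python
# Reproducing the ICER from the abstract's figures (KRW and QALYs):
# strategy A = TC 200 mg/dL with a 4-year interval; strategy B = no screening.
cost_a, qaly_a = 4_625_446, 16.65105
cost_b, qaly_b = 3_061_371, 16.59877
wtp = 30_500_000  # Korean willingness-to-pay threshold (KRW/QALY)

icer = (cost_a - cost_b) / (qaly_a - qaly_b)  # incremental cost per QALY gained
print(f"ICER = {icer:,.0f} KRW/QALY")         # ~29.9 million KRW/QALY
print("cost-effective" if icer <= wtp else "not cost-effective")
```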

Development of an IoT-Based Dizziness Detection System for VR Applications (VR 애플리케이션을 위한 사물인터넷 기반 어지럼증 검출 시스템 개발)

  • Ko, Euni;Kim, Youngcheon;Park, Hyelee;Jung, Wonseok;Seo, Jeongwook
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference
    • /
    • 2019.05a
    • /
    • pp.423-425
    • /
    • 2019
  • Users may experience a subtype of motion sickness called cybersickness when interacting with virtual reality (VR) applications while wearing head-mounted display (HMD) devices. Although the root cause of cybersickness is still unclear, it is believed to result from a sensory mismatch between the visual and vestibular systems. However, there is a lack of studies developing data collection and analysis systems to measure cybersickness. In this paper, therefore, a system is designed that collects electroencephalography (EEG) and physiological data from a user wearing a VR HMD device through an Internet of Things (IoT) platform and decides, by using a decision threshold, whether the user experiences a symptom of cybersickness, namely dizziness. Experimental results showed that the proposed system achieved about 92% dizziness detection accuracy in an experiment with 14 participants.

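A minimal sketch of the final decision stage only: a scalar dizziness score per trial (its derivation from the EEG and physiological data is not specified here) is compared against a decision threshold chosen to maximize accuracy on labelled trials. The scores, labels, and selection rule below are illustrative.

```python
# Hedged sketch of a threshold-based dizziness decision; all data are toy values.
import numpy as np

def choose_threshold(scores, labels):
    """Return the threshold with the highest classification accuracy."""
    candidates = np.unique(scores)
    accuracies = [np.mean((scores >= t) == labels) for t in candidates]
    best = int(np.argmax(accuracies))
    return candidates[best], accuracies[best]

scores = np.array([0.2, 0.8, 0.4, 0.9, 0.1, 0.7])  # toy dizziness scores
labels = np.array([0, 1, 0, 1, 0, 1], dtype=bool)  # toy ground truth
threshold, accuracy = choose_threshold(scores, labels)
print(f"threshold={threshold:.2f}, accuracy={accuracy:.0%}")
```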

Sperm chromatin structure assay versus sperm chromatin dispersion kits: Technical repeatability and choice of assisted reproductive technology procedure

  • Laxme B, Vidya;Stephen, Silviya;Devaraj, Ramyashree;Mithraprabhu, Sridurga;Bertolla, Ricardo P.;Mahendran, Tara
    • Clinical and Experimental Reproductive Medicine
    • /
    • v.47 no.4
    • /
    • pp.277-283
    • /
    • 2020
  • Objective: The sperm DNA fragmentation index (DFI) guides the clinician's choice of an appropriate assisted reproductive technology (ART) procedure. The DFI can be determined using commercially available methodologies, including sperm chromatin dispersion (SCD) kits and sperm chromatin structure assay (SCSA). Currently, when DFI is evaluated using SCD kits, the result is analyzed in reference to the SCSA-derived threshold for the choice of an ART procedure. In this study, we compared DFI values obtained using SCSA with those obtained using SCD and determined whether the difference affects the choice of ART procedure. Methods: We compared SCSA to two SCD kits, CANfrag (n=36) and Halosperm (n=31), to assess the DFI values obtained, the correlations between tests, the technical repeatability, and the impact of DFI on the choice of ART. Results: We obtained higher median DFI values using SCD kits than when using SCSA, and this difference was significant for the CANfrag kit (p<0.001). The SCD kits had significantly higher coefficients of variation than SCSA (p<0.001). In vitro fertilization/intracytoplasmic sperm injection (IVF/ICSI) would be chosen for a significantly higher proportion of patients if a decision were made based on DFI derived from SCD rather than DFI determined using SCSA (p=0.003). Conclusion: Our results indicate that SCD kit-specific thresholds should be established in order to avoid the unnecessary use of IVF/ICSI based on sperm DNA damage for the management of infertility. Appropriate measures should be taken to mitigate the increased variability inherent to the methods used in these tests.
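
A toy illustration of the decision impact discussed above: with a single SCSA-derived cut-off, the same sample can be steered toward different ART procedures depending on which assay produced the DFI. The cut-off and DFI values are illustrative placeholders, not the study's data.

```python
# Hedged sketch of a DFI-threshold ART decision; numbers are illustrative only.
def recommend_art(dfi_percent, threshold_percent):
    return "IVF/ICSI" if dfi_percent > threshold_percent else "conventional ART"

scsa_cutoff = 30.0              # illustrative SCSA-derived threshold (%)
dfi_scsa, dfi_scd = 25.0, 34.0  # same sample measured by SCSA vs. an SCD kit

print(recommend_art(dfi_scsa, scsa_cutoff))  # stays below the cut-off
print(recommend_art(dfi_scd, scsa_cutoff))   # pushed over the same cut-off
```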

Underwater Navigation of AUVs Using Uncorrelated Measurement Error Model of USBL

  • Lee, Pan-Mook;Park, Jin-Yeong;Baek, Hyuk;Kim, Sea-Moon;Jun, Bong-Huan;Kim, Ho-Sung;Lee, Phil-Yeob
    • Journal of Ocean Engineering and Technology
    • /
    • v.36 no.5
    • /
    • pp.340-352
    • /
    • 2022
  • This article presents a modeling method for the uncorrelated measurement error of the ultra-short baseline (USBL) acoustic positioning system for aiding navigation of underwater vehicles. The Mahalanobis distance (MD) and principal component analysis are applied to decorrelate the errors of USBL measurements, which are correlated in the x- and y-directions and vary according to the relative direction and distance between a reference station and the underwater vehicles. The proposed method can decouple the radial-direction error and angular direction error from each USBL measurement, where the former and latter are independent and dependent, respectively, of the distance between the reference station and the vehicle. With the decorrelation of the USBL errors along the trajectory of the vehicles in every time step, the proposed method can reduce the threshold of the outlier decision level. To demonstrate the effectiveness of the proposed method, simulation studies were performed with motion data obtained from a field experiment involving an autonomous underwater vehicle and USBL signals generated numerically by matching the specifications of a specific USBL with the data of a global positioning system. The simulations indicated that the navigation system is more robust in rejecting outliers of the USBL measurements than conventional ones. In addition, it was shown that the erroneous estimation of the navigation system after a long USBL blackout can converge to the true states using the MD of the USBL measurements. The navigation systems using the uncorrelated error model of the USBL, therefore, can effectively eliminate USBL outliers without loss of uncontaminated signals.
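
A minimal sketch of the gating idea described above: the innovation between a USBL fix and the predicted position is whitened with a range- and bearing-dependent covariance built from independent radial and angular errors, and its squared Mahalanobis distance is compared against a chi-square gate. The covariance construction and all numbers are illustrative, not the paper's model.

```python
# Hedged sketch of Mahalanobis-distance outlier gating for USBL fixes.
import numpy as np

CHI2_GATE_2DOF_95 = 5.991   # 95% chi-square gate for a 2-D measurement

def usbl_covariance(range_m, bearing_rad,
                    sigma_radial=0.5, sigma_angular=0.01):
    """Rotate independent radial/angular errors into x-y coordinates.
    The angular error grows with range (sigma_angular is in radians)."""
    R = np.array([[np.cos(bearing_rad), -np.sin(bearing_rad)],
                  [np.sin(bearing_rad),  np.cos(bearing_rad)]])
    D = np.diag([sigma_radial**2, (range_m * sigma_angular)**2])
    return R @ D @ R.T

def is_outlier(measured_xy, predicted_xy, cov):
    innovation = measured_xy - predicted_xy
    md2 = innovation @ np.linalg.solve(cov, innovation)  # squared Mahalanobis distance
    return md2 > CHI2_GATE_2DOF_95

cov = usbl_covariance(range_m=800.0, bearing_rad=np.deg2rad(30.0))
print(is_outlier(np.array([412.0, 231.0]), np.array([405.0, 229.0]), cov))
```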

Development of Real-time QRS-complex Detection Algorithm for Portable ECG Measurement Device (휴대용 심전도 측정장치를 위한 실시간 QRS-complex 검출 알고리즘 개발)

  • An, Hwi;Shim, Hyoung-Jin;Park, Jae-Soon;Lhm, Jong-Tae;Joung, Yeun-Ho
    • Journal of Biomedical Engineering Research
    • /
    • v.43 no.4
    • /
    • pp.280-289
    • /
    • 2022
  • In this paper, we present a QRS-complex detection algorithm to calculate an accurate heartbeat and clearly recognize irregular rhythms from ECG signals. The conventional Pan-Tompkins algorithm produces false QRS detections in the derivative stage when QRS and noise signals have similar instantaneous variation. The proposed algorithm uses amplitude differences over 7 adjacent samples to detect the QRS-complex, which has the highest amplitude variation. The calculated amplitude is cubed so that the QRS-complex dominates, and the moving average method is applied to diminish the noise signal's amplitude. Finally, a decision rule with a threshold value is applied to detect the QRS-complex accurately. The signals calculated with the Pan-Tompkins and proposed algorithms were compared by signal-to-noise ratio to evaluate the degree of noise reduction, and QRS-complex detection performance was confirmed by sensitivity and positive predictive value (PPV). Normal ECG, muscle-noise ECG, PVC, and atrial fibrillation signals were acquired from an ECG simulator. The signal-to-noise ratio differences between the Pan-Tompkins and the proposed algorithm were 8.1, 8.5, 9.6, and 4.7, respectively; in every case the proposed algorithm's ratio is higher than the Pan-Tompkins value, indicating that the proposed algorithm is more robust to noise. The two algorithms showed similar sensitivity and PPV for most waveforms. However, with a noisy atrial fibrillation signal, the PPV for the QRS-complex differed: 42% for the Pan-Tompkins algorithm and 100% for the proposed algorithm. This means the proposed algorithm is superior for QRS-complex detection in a noisy environment.
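
A minimal sketch of the stages summarized above: a 7-sample amplitude difference, cubing to make the QRS complex dominate, a moving average to suppress noise, and a threshold decision with a refractory period. The window lengths, the absolute value taken before cubing, and the threshold rule are assumptions, not the paper's exact formulation.

```python
# Hedged sketch of a difference/cube/moving-average/threshold QRS detector.
import numpy as np

def detect_qrs(ecg, fs, threshold_ratio=0.4):
    diff = np.zeros_like(ecg, dtype=float)
    diff[3:-3] = ecg[6:] - ecg[:-6]              # difference over 7 adjacent samples
    feature = np.abs(diff) ** 3                  # cube to emphasise the QRS complex
    win = max(1, int(0.15 * fs))                 # ~150 ms moving average
    feature = np.convolve(feature, np.ones(win) / win, mode="same")
    threshold = threshold_ratio * feature.max()  # decision threshold
    refractory = int(0.2 * fs)                   # 200 ms between detections
    peaks, last = [], -refractory
    for i in range(1, len(feature) - 1):
        if (feature[i] > threshold and feature[i] >= feature[i - 1]
                and feature[i] > feature[i + 1] and i - last > refractory):
            peaks.append(i)
            last = i
    return np.array(peaks)

fs = 250
t = np.arange(0, 10, 1 / fs)
ecg = np.abs(np.sin(2 * np.pi * 1.2 * t)) ** 63  # crude spiky surrogate signal
print(len(detect_qrs(ecg, fs)), "beats detected")
```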

ROC Analysis of Diagnostic Performance in Liver Scan (간스캔의 ROC분석에 의한 진단적 평가)

  • Lee, Myung-Chul;Moon, Dae-Hyuk;Koh, Chang-Soon;Matumoto, Toru;Tateno, Yukio
    • The Korean Journal of Nuclear Medicine
    • /
    • v.22 no.1
    • /
    • pp.39-45
    • /
    • 1988
  • To evaluate the diagnostic accuracy of liver scintigraphy, we analysed liver scans of 143 normal subjects and 258 patients with various liver diseases. Three ROC curves, for SOL, liver cirrhosis, and diffuse liver disease, were fitted using rating methods, and the areas under the ROC curves and their standard errors were calculated by the trapezoidal rule and the variance of the Wilcoxon statistic suggested by McNeil. We compared these results with those of the National Institute of Radiological Science in Japan. 1) The sensitivity of liver scintigraphy was 74.2% in SOL, 71.8% in liver cirrhosis, and 34.0% in diffuse liver disease. The specificity was 96.0% in SOL, 94.2% in liver cirrhosis, and 87.6% in diffuse liver disease. 2) The ROC curves of SOL and liver cirrhosis approached the upper left-hand corner more closely than that of diffuse liver disease. The area (± standard error) under the ROC curve was 0.868 ± 0.024 in SOL and 0.867 ± 0.028 in liver cirrhosis; these were significantly higher than 0.658 ± 0.043 in diffuse liver disease. 3) There was no interobserver difference in terms of ROC curves, but the low sensitivity and high specificity of the authors' SOL diagnosis suggested that a stricter decision threshold was used.

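A sketch of the two computations named above: the area under an empirical ROC curve by the trapezoidal rule, and its standard error from the Hanley-McNeil approximation to the variance of the Wilcoxon statistic. The operating points below are toy values, not the study's rating data.

```python
# Hedged sketch: trapezoidal AUC and Hanley-McNeil standard error.
import numpy as np

def roc_auc_trapezoid(fpr, tpr):
    """fpr/tpr sorted by ascending false positive rate, including (0,0) and (1,1)."""
    return float(np.sum(np.diff(fpr) * (tpr[1:] + tpr[:-1]) / 2.0))

def hanley_mcneil_se(auc, n_diseased, n_normal):
    q1 = auc / (2.0 - auc)
    q2 = 2.0 * auc**2 / (1.0 + auc)
    var = (auc * (1.0 - auc)
           + (n_diseased - 1) * (q1 - auc**2)
           + (n_normal - 1) * (q2 - auc**2)) / (n_diseased * n_normal)
    return float(np.sqrt(var))

fpr = np.array([0.0, 0.04, 0.12, 0.30, 1.0])   # toy rating-method cut-offs
tpr = np.array([0.0, 0.45, 0.74, 0.90, 1.0])
auc = roc_auc_trapezoid(fpr, tpr)
print(f"AUC = {auc:.3f} +/- {hanley_mcneil_se(auc, 258, 143):.3f}")
```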

Data-driven Model Prediction of Harmful Cyanobacterial Blooms in the Nakdong River in Response to Increased Temperatures Under Climate Change Scenarios (기후변화 시나리오의 기온상승에 따른 낙동강 남세균 발생 예측을 위한 데이터 기반 모델 시뮬레이션)

  • Gayeon Jang;Minkyoung Jo;Jayun Kim;Sangjun Kim;Himchan Park;Joonhong Park
    • Journal of Korean Society on Water Environment
    • /
    • v.40 no.3
    • /
    • pp.121-129
    • /
    • 2024
  • Harmful cyanobacterial blooms (HCBs) are caused by the rapid proliferation of cyanobacteria and are believed to be exacerbated by climate change. However, the extent to which HCBs will be stimulated in the future due to increased temperature remains uncertain. This study aims to predict the future occurrence of cyanobacteria in the Nakdong River, which has the highest incidence of HCBs in South Korea, based on temperature rise scenarios. Representative Concentration Pathways (RCPs) were used as the basis for these scenarios. Data-driven model simulations were conducted, and out of the four machine learning techniques tested (multiple linear regression, support vector regressor, decision tree, and random forest), the random forest model was selected for its relatively high prediction accuracy. The random forest model was used to predict the occurrence of cyanobacteria. The results of boxplot and time-series analyses showed that under the worst-case scenario (RCP8.5 (2100)), where temperature increases significantly, cyanobacterial abundance across all study areas was greatly stimulated. The study also found that the frequencies of HCB occurrences exceeding certain thresholds (100,000 and 1,000,000 cells/mL) increased under both the best-case scenario (RCP2.6 (2050)) and worst-case scenario (RCP8.5 (2100)). These findings suggest that the frequency of HCB occurrences surpassing a certain threshold level can serve as a useful diagnostic indicator of vulnerability to temperature increases caused by climate change. Additionally, this study highlights that water bodies currently susceptible to HCBs are likely to become even more vulnerable with climate change compared to those that are currently less susceptible.
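
A minimal sketch of the data-driven workflow described above, assuming synthetic temperature and cell-count data in place of the Nakdong River dataset: fit a random forest, predict cell counts under a warmer scenario, and count how often the predictions exceed the alert thresholds mentioned in the abstract.

```python
# Hedged sketch: random-forest prediction of cyanobacterial abundance and
# frequency of threshold exceedance. Data and the warming offset are toy values.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
temp = rng.uniform(10, 30, size=500)                    # water temperature (°C)
cells = np.exp(0.45 * temp + rng.normal(0, 0.8, 500))   # cells/mL (synthetic)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(temp.reshape(-1, 1), cells)

scenario_temp = temp + 4.8        # illustrative RCP8.5-like warming offset
pred = model.predict(scenario_temp.reshape(-1, 1))

for threshold in (1e5, 1e6):      # cells/mL alert levels from the abstract
    freq = np.mean(pred > threshold)
    print(f"> {threshold:,.0f} cells/mL in {freq:.0%} of predictions")
```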

Statistical Voice Activity Detector Based on Signal Subspace Model (신호 준공간 모델에 기반한 통계적 음성 검출기)

  • Ryu, Kwang-Chun;Kim, Dong-Kook
    • The Journal of the Acoustical Society of Korea
    • /
    • v.27 no.7
    • /
    • pp.372-378
    • /
    • 2008
  • Voice activity detectors (VADs) are important in wireless communication and speech signal processing. In conventional VAD methods, an expression for the likelihood ratio test (LRT) based on statistical models is derived in the discrete Fourier transform (DFT) domain; speech or noise is then decided by comparing the value of this expression with a threshold. This paper presents a new statistical VAD method based on a signal subspace approach. Probabilistic principal component analysis (PPCA) is employed to obtain a signal subspace model that incorporates a probabilistic model of the noisy signal into the signal subspace method. The proposed approach provides a novel decision rule based on the LRT in the signal subspace domain. Experimental results show that the proposed signal-subspace-model-based VAD method outperforms those based on the widely used Gaussian distribution in the DFT domain.
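
A minimal sketch of a frame-level likelihood ratio test in a signal-subspace setting, assuming a noise-only Gaussian model and a PPCA-style low-rank-plus-noise model for noisy speech. Estimation of the subspace matrix from data is omitted, and all shapes and numbers are illustrative, so this is not the paper's exact formulation.

```python
# Hedged sketch: LRT between N(0, sigma2*I) and a PPCA model N(0, W W^T + sigma2*I).
import numpy as np

def gaussian_loglik(x, cov):
    d = x.size
    sign, logdet = np.linalg.slogdet(cov)
    quad = x @ np.linalg.solve(cov, x)
    return -0.5 * (d * np.log(2 * np.pi) + logdet + quad)

def vad_decision(frame, W, sigma2, threshold=0.0):
    """Return True (speech present) if the log-likelihood ratio exceeds the threshold."""
    d = frame.size
    cov_noise = sigma2 * np.eye(d)
    cov_speech = W @ W.T + cov_noise        # PPCA covariance model
    llr = gaussian_loglik(frame, cov_speech) - gaussian_loglik(frame, cov_noise)
    return llr > threshold

rng = np.random.default_rng(1)
d, q = 16, 3                                # frame dimension, subspace rank (toy)
W = rng.normal(size=(d, q))
speech_frame = W @ rng.normal(size=q) + 0.1 * rng.normal(size=d)
noise_frame = 0.1 * rng.normal(size=d)
print(vad_decision(speech_frame, W, sigma2=0.01),
      vad_decision(noise_frame, W, sigma2=0.01))
```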

A Recidivism Prediction Model Based on XGBoost Considering Asymmetric Error Costs (비대칭 오류 비용을 고려한 XGBoost 기반 재범 예측 모델)

  • Won, Ha-Ram;Shim, Jae-Seung;Ahn, Hyunchul
    • Journal of Intelligence and Information Systems
    • /
    • v.25 no.1
    • /
    • pp.127-137
    • /
    • 2019
  • Recidivism prediction has been a subject of constant research by experts since the early 1970s, but it has become more important as crimes committed by recidivists have steadily increased. In particular, in the 1990s, after the US and Canada adopted the 'Recidivism Risk Assessment Report' as a decisive criterion during trial and parole screening, research on recidivism prediction became more active, and in the same period empirical studies on recidivism factors were also started in Korea. Although most recidivism prediction studies have so far focused on the factors of recidivism or the accuracy of recidivism prediction, it is important to minimize the misclassification cost, because recidivism prediction has an asymmetric error cost structure. In general, the cost of misclassifying people who will not recidivate as likely to recidivate is lower than the cost of misclassifying people who will recidivate as unlikely to, because the former only increases additional monitoring costs, while the latter increases social and economic costs. Therefore, in this paper, we propose an XGBoost (eXtreme Gradient Boosting; XGB) based recidivism prediction model that considers asymmetric error costs. In the first step of the model, XGB, recognized as a high-performance ensemble method in the field of data mining, was applied, and its results were compared with various prediction models such as LOGIT (logistic regression analysis), DT (decision trees), ANN (artificial neural networks), and SVM (support vector machines). In the next step, the threshold is optimized to minimize the total misclassification cost, which is the weighted average of the false negative error (FNE) and the false positive error (FPE). To verify the usefulness of the model, it was applied to a real recidivism prediction dataset. As a result, it was confirmed that the XGB model not only showed better prediction accuracy than the other prediction models but also reduced the cost of misclassification most effectively.
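
A minimal sketch of the two-step procedure summarized above, assuming synthetic data and an illustrative 5:1 false-negative to false-positive cost ratio: fit an XGBoost classifier, then sweep the decision threshold to minimize the total misclassification cost.

```python
# Hedged sketch: XGBoost classifier plus cost-minimizing threshold sweep.
# The dataset and the cost ratio are illustrative placeholders.
import numpy as np
from xgboost import XGBClassifier

COST_FN, COST_FP = 5.0, 1.0                 # asymmetric error costs (toy ratio)

rng = np.random.default_rng(42)
X = rng.normal(size=(1000, 8))              # placeholder features
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 1, 1000) > 0.8).astype(int)

model = XGBClassifier(n_estimators=200, max_depth=3, learning_rate=0.1)
model.fit(X, y)
proba = model.predict_proba(X)[:, 1]        # predicted probability of recidivism

def total_cost(threshold):
    pred = proba >= threshold
    fn = np.sum((pred == 0) & (y == 1))     # missed recidivists
    fp = np.sum((pred == 1) & (y == 0))     # unnecessary monitoring
    return COST_FN * fn + COST_FP * fp

thresholds = np.linspace(0.05, 0.95, 91)
best = min(thresholds, key=total_cost)
print(f"cost-minimising threshold: {best:.2f} (vs default 0.50)")
```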