• Title/Summary/Keyword: meaningful error

New Watermarking Technique Using Data Matrix and Encryption Keys

  • Kim, Il-Hwan; Kwon, Chang-Hee; Lee, Wang-Heon
    • Journal of Electrical Engineering and Technology / v.7 no.4 / pp.646-651 / 2012
  • Current digital watermarking techniques for 2D bar codes use meaningful logos or random sequences. Meaningful logos can be created by copyright holders from their own unique information and are very effective for representing their copyrights, while random sequences enhance the security of the watermark when verifying copyright against intentional or unintentional attacks. In this paper, we propose a new watermarking technique that takes advantage of the Data Matrix as well as encryption keys. The Data Matrix not only recovers the original data through an error checking and correction algorithm, even when its high-density data storage or barcode is damaged, but also encrypts the copyright verification information, including the ownership keys, by randomizing the barcode. Furthermore, the encryption keys are used to localize the watermark, and the patterns are used to make the watermark robust against attacks. Comparison experiments on the copyright information extracted from the watermark verify that the proposed method yields good quality and is robust to various attacks such as JPEG compression, filtering, and resizing.
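
The paper's implementation details are not given here; as a rough illustration of the key-based randomization idea only, the sketch below XOR-scrambles a binary Data Matrix module grid with a pseudorandom mask seeded by an ownership key before it would be embedded as a watermark. The module pattern, key, and grid size are all hypothetical.

```python
import numpy as np

def scramble_datamatrix(modules: np.ndarray, ownership_key: int) -> np.ndarray:
    """XOR a binary Data Matrix module grid with a key-seeded pseudorandom mask.

    The operation is an involution: applying it again with the same key restores
    the original grid, so a verifier can recover the barcode and then let the
    Data Matrix error correction repair any damage.
    """
    rng = np.random.default_rng(ownership_key)            # key-seeded PRNG (illustrative)
    mask = rng.integers(0, 2, size=modules.shape, dtype=np.uint8)
    return modules ^ mask

# Hypothetical 12x12 module pattern standing in for an encoded Data Matrix symbol.
modules = np.random.default_rng(0).integers(0, 2, size=(12, 12), dtype=np.uint8)
key = 0xC0FFEE                                             # hypothetical ownership key
scrambled = scramble_datamatrix(modules, key)
restored = scramble_datamatrix(scrambled, key)
assert np.array_equal(restored, modules)
```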

The Operators' Non-compliance Behavior to Conduct Emergency Operating Procedures - Comparing with the Complexity of the Procedural Steps

  • Park Jinkyun; Jung Wondea
    • Nuclear Engineering and Technology / v.35 no.5 / pp.412-425 / 2003
  • According to the results of related studies, one of the typical factors behind procedure-related human errors is the complexity of procedures. This suggests that comparing changes in operators' behavior with respect to the complexity of procedures may be meaningful in clarifying the reasons for operators' non-compliance behavior. In this study, to obtain data on operators' non-compliance behavior, emergency training records were collected using a full-scope simulator. Three types of operator behavior observed in the collected emergency training records (strict adherence, skipping redundant actions, and modifying action sequences) were then compared with the complexity of the procedural steps. As a result, two notable relationships were obtained: 1) operators seem to frequently adopt non-compliance behavior when conducting procedural steps of intermediate complexity, and 2) operators seem to accommodate their non-compliance behavior to the complexity of the procedural steps. It is therefore expected that these relationships can be used as meaningful clues, not only to scrutinize the reasons for non-compliance behavior but also to suggest appropriate remedies for reducing non-compliance behavior that can result in procedure-related human error.
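
The comparison described above is essentially a cross-tabulation of observed behavior types against step complexity. A minimal sketch with hypothetical records and complexity scores (the paper's actual complexity measure and training data are not reproduced here):

```python
import pandas as pd

# Hypothetical observations: one row per procedural step executed during training.
records = pd.DataFrame({
    "complexity": [1.2, 2.8, 3.1, 4.5, 2.9, 1.0, 4.8, 3.0, 2.7, 4.4],
    "behavior":   ["strict", "skip", "modify", "strict", "skip",
                   "strict", "strict", "modify", "skip", "strict"],
})

# Bin step complexity into low / intermediate / high and tabulate behavior shares per bin.
records["complexity_level"] = pd.cut(
    records["complexity"], bins=[0, 2, 4, 6],
    labels=["low", "intermediate", "high"],
)
print(pd.crosstab(records["complexity_level"], records["behavior"], normalize="index"))
```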

Fast Image Stitching Based on Improved SURF Algorithm Using Meaningful Features (의미 있는 특징점을 이용한 향상된 SURF 알고리즘 기반의 고속 이미지 스티칭 기법)

  • Ahn, Hyo-Chang; Rhee, Sang-Burm
    • The KIPS Transactions: Part B / v.19B no.2 / pp.93-98 / 2012
  • Recently, high-resolution images can easily be created with high-performance digital cameras and used in a variety of fields. In particular, image stitching, which joins pairs of images, has been actively researched. Image stitching can be used for military purposes, such as satellite and reconnaissance-aircraft imagery, and for computer vision applications such as medical imaging and mapping. In this paper, we propose fast image stitching based on an improved SURF algorithm that uses meaningful features during image matching, after features have been extracted from the scenery images. Features are extracted from each image to find corresponding points, and the meaningful features are selected by removing erroneous features, such as those caused by noise, from the extracted set; these features are then used as corresponding points for image matching. The total processing time of image stitching is improved because the time spent searching for corresponding points is reduced. In our results, the processing time of feature matching and image stitching is faster than that of previous algorithms, and the method also produces natural-looking stitched images.
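
As a rough sketch of the matching-and-stitching pipeline described above (not the authors' improved SURF: OpenCV's SURF implementation is non-free, so ORB is used here as a stand-in, and the "meaningful feature" filter is approximated by Lowe's ratio test plus RANSAC):

```python
import cv2
import numpy as np

def stitch_pair(img_left, img_right, ratio=0.75):
    """Detect features, keep only reliable matches, estimate a homography, and warp."""
    detector = cv2.ORB_create(2000)                        # stand-in for SURF
    k1, d1 = detector.detectAndCompute(img_left, None)
    k2, d2 = detector.detectAndCompute(img_right, None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
    # Lowe's ratio test discards ambiguous, noise-like correspondences.
    good = [m for m, n in matcher.knnMatch(d1, d2, k=2) if m.distance < ratio * n.distance]

    src = np.float32([k2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([k1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)   # robust to remaining outliers

    h, w = img_left.shape[:2]
    canvas = cv2.warpPerspective(img_right, H, (w * 2, h)) # warp right image into left frame
    canvas[0:h, 0:w] = img_left                            # paste left image over the warp
    return canvas
```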

Analysis of the Relations between Design Errors Detected during BIM-based Design Validation and their Impacts Using Logistic Regression (로지스틱 회귀분석을 이용한 BIM 설계 검토에 의하여 발견된 설계 오류와 그 영향도간의 관계 분석)

  • Won, Jong-Sung; Kim, Jae-Yeo
    • Journal of the Korea Institute of Building Construction / v.17 no.6 / pp.535-544 / 2017
  • This paper analyzes the relations between design errors, prevented by building information modeling (BIM)-based design validation, and their impacts, in order to identify critical factors to consider when implementing BIM-based design validation in architecture, engineering, and construction (AEC) projects. More than 800 design errors detected by BIM-based design validation in two BIM-based projects in South Korea are categorized according to their causes (illogical error, discrepancy, and missing item) and work types (structure, architecture, and mechanical, electrical, and plumbing (MEP)). The probabilistic relations between the independent variables (the causes and work types of design errors) and the dependent variables (the project delays, cost overruns, low quality, and rework that these errors can cause) are analyzed using logistic regression. The characteristics of each design error are examined by means of face-to-face interviews with practitioners. According to the results, the impacts of design error causes in predicting the probability of project delays, cost overruns, low quality, and rework generation were statistically meaningful.
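
A minimal sketch of the kind of model described above: a logistic regression of a binary impact (here, whether rework resulted) on categorical error cause and work type, using statsmodels. The column names and toy records are hypothetical, not the paper's dataset.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical design-error records: cause, work type, and whether rework resulted.
errors = pd.DataFrame({
    "cause":     ["illogical", "illogical", "illogical", "illogical",
                  "discrepancy", "discrepancy", "discrepancy", "discrepancy",
                  "missing", "missing", "missing", "missing"],
    "work_type": ["structure", "architecture", "MEP", "structure",
                  "architecture", "MEP", "structure", "architecture",
                  "MEP", "structure", "architecture", "MEP"],
    "rework":    [1, 0, 1, 1, 0, 1, 0, 0, 1, 0, 1, 0],
})

# Logistic regression with categorical predictors; coefficients are log-odds of
# rework relative to the reference cause / work type.
model = smf.logit("rework ~ C(cause) + C(work_type)", data=errors).fit(disp=False)
print(model.summary())
print(model.predict(errors.iloc[:2]))   # predicted rework probabilities
```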

A Study on the 3-Dimensional Analysis by Bundle Adjustment in Close Range Photogrammetry (근접사진측량의 번들조정에 의한 삼차원 위치해석에 관한 연구)

  • 백은기; 목찬상
    • Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography / v.6 no.2 / pp.10-18 / 1988
  • In the three-dimensional analysis and deformation analysis of large structures, the multiple method of close-range photogrammetry, which works at short object distances, is efficient. To address the problems that arise in the multiple method, this study analyzes the influence of errors according to overlap, control points, and object distance. A wall-board, 7 meters by 3 meters, was used as a test field on which a total of 225 unknown points were evenly distributed. Photographs were taken with a P-31 camera system while varying the overlap and object distance. A total of 143 negatives were used to compute three-dimensional coordinates and their standard errors, and a bundle adjustment of strips and blocks developed as an on-line system was applied. As the number of control points decreases, the simulation error increases, while the actual error first decreases and then increases again. As the object distance changes, the Z error is large compared with the X and Y errors, but good results in Z can be obtained by increasing the redundancy. Both the simulation error and the actual error show the best results at an endlap of about 70%. In summary, an appropriate arrangement of control points and overlap is meaningful, and the multiple method with a short object distance can be widely used for precision and deformation analysis of critical structures.
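
For reference, bundle adjustment minimizes the reprojection residuals of the collinearity equations over all exterior-orientation and object-point unknowns; a textbook form of those equations (not reproduced from the paper itself) is:

```latex
\begin{aligned}
x - x_0 &= -f\,\frac{m_{11}(X - X_C) + m_{12}(Y - Y_C) + m_{13}(Z - Z_C)}
                    {m_{31}(X - X_C) + m_{32}(Y - Y_C) + m_{33}(Z - Z_C)},\\[4pt]
y - y_0 &= -f\,\frac{m_{21}(X - X_C) + m_{22}(Y - Y_C) + m_{23}(Z - Z_C)}
                    {m_{31}(X - X_C) + m_{32}(Y - Y_C) + m_{33}(Z - Z_C)},
\end{aligned}
```

where (x, y) are the measured image coordinates, (x_0, y_0, f) the interior orientation, (X_C, Y_C, Z_C) the camera station, m_ij the elements of the rotation matrix, and (X, Y, Z) the object-point coordinates. Increasing overlap and adding control points add redundant observations to this least-squares system, which is why Z accuracy improves with redundancy as noted above.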

Analysis of the Sixth Graders' Strategies and Errors of Division-With-Remainder Problems (나머지가 있는 나눗셈 문장제에 대한 초등학교 6학년 학생들의 해결 전략 및 오류 분석)

  • Ha, Mihyun; Chang, Hyewon
    • Journal of Elementary Mathematics Education in Korea / v.20 no.4 / pp.717-735 / 2016
  • For teaching division-with-remainder (DWR) problems, it is necessary to know students' strategies and errors on such problems. The purpose of this study is to investigate and analyze students' strategies and errors on DWR problems and to make some meaningful suggestions for teaching various methods of solving them. We constructed a test consisting of fifteen DWR problems, varying in both mathematical and syntactic structure, to investigate students' solving strategies and errors. To administer this test, we selected 177 students from eight elementary schools in various districts of Seoul. The results were analyzed both qualitatively and quantitatively. The sixth graders' strategies can be classified as Single strategies, Multi strategies, and Assistant strategies, with Division (D), Multiplication (M), and Additive Approach (A) strategies used as sub-strategies. We noticed that the most frequently used strategies do not coincide with the most successful ones: while students in the middle group used Assistant strategies frequently, students in the higher group used Single strategies frequently. The sixth graders' errors can be classified as Formula errors (F errors), Calculation errors (C errors), Calculation Product errors (P errors), and Interpretation errors (I errors). Four syntactic elements were varied in the problems: large numbers, the position of the divisor and dividend, divisor size, and vocabulary. F errors appeared most frequently among students in the lower group, whereas I errors appeared most frequently in the higher group. Based on these results, we offer some didactical suggestions.
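
As a concrete illustration of why remainder interpretation matters in DWR word problems (the specific items in the paper's test are not reproduced; this example is hypothetical):

```python
# Hypothetical DWR word problem: 35 students ride vans that hold 8 students each.
# The same division, 35 ÷ 8 = 4 remainder 3, must be interpreted differently
# depending on the question, which is where interpretation (I) errors arise.
students, van_capacity = 35, 8
full_vans, left_over = divmod(students, van_capacity)

print(full_vans)                    # 4 -> "How many vans are completely full?"
print(left_over)                    # 3 -> "How many students ride in the partly full van?"
print(full_vans + (left_over > 0))  # 5 -> "How many vans are needed for everyone?"
```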

An Analysis on the Repeated Error Patterns in Division of Fraction by Elementary Students (초등학생들이 분수의 나눗셈에서 보이는 반복적 오류 분석)

  • Kim, Kyung-Mi; Kang, Wan
    • Education of Primary School Mathematics / v.11 no.1 / pp.1-19 / 2008
  • This study analyzed the repeated error patterns shown by elementary students in division of fractions through observation of their test papers. The research questions were as follows. First, which of the repeated error patterns that appear in elementary students' division of fractions changes the most? Second, which error patterns appear most frequently? First, the rates of incorrect answers in division of fractions by general students were investigated. This survey was conducted only once; its purpose was to determine which problem compositions produced more errors. A total of 554 sixth-grade students (300 boys and 254 girls) from six elementary schools in Seoul participated. On this basis, the main analysis began: five tests were administered over about four months to a total of 181 sixth-grade students (92 boys and 89 girls) from S elementary school in Seoul. After each test, the errors were identified and classified, and the repeated error patterns were arranged into four types: alpha, beta, gamma, and delta. The conclusions are as follows. First, most students correct their errors as time goes by, even when they make errors on content they have already learned. Second, among the students who showed errors, most made them repeatedly because of the reciprocal of the natural-number divisor when computing '(fraction) ÷ (natural number)'. Third, most students recognize that the divisor has to be changed to its reciprocal when calculating division of fractions, as they modify their errors repeatedly.
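
To make the '(fraction) ÷ (natural number)' case concrete, the sketch below contrasts the correct computation with one typical reciprocal slip (inverting the dividend instead of the divisor); the exact alpha-delta error patterns observed in the paper are not reproduced here.

```python
from fractions import Fraction

a = Fraction(3, 4)   # dividend: 3/4
n = 5                # natural-number divisor

correct = a * Fraction(1, n)   # 3/4 ÷ 5 = 3/4 × 1/5 = 3/20
slip = (1 / a) * n             # inverting the dividend instead: 4/3 × 5 = 20/3

print(correct)   # 3/20
print(slip)      # 20/3 -- an illustrative reciprocal-related error, not the paper's exact pattern
```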

A Study on the Application of Constrained Bayes Estimation for Product Quality Control (Constrained 베이즈 추정방식의 제품 품질관리 활용방안에 관한 연구)

  • Kim, Tai-Kyoo; Kim, Myung Joon
    • Journal of Korean Society for Quality Management / v.43 no.1 / pp.57-66 / 2015
  • Purpose: The purpose of this study is to apply the constrained Bayes estimation methodology to the product quality control process and to demonstrate its effectiveness by comparing it with the well-known Bayes estimator on performance data. Methods: Bayes and constrained Bayes estimators were produced based on the theoretical background, and a deviation index was defined and calculated to confirm the effectiveness of the suggested application. Results: The statistical analysis shows that applying the suggested estimation methodology, that is, the constrained Bayes estimator, improves the index by reducing the error through matching the first two empirical moments. Conclusion: When advanced Bayesian approaches such as constrained Bayes estimation are considered for the product quality control process, the newly defined deviation index reduces the error in estimating the parameter histogram, which reflects both location and deviation parameters; furthermore, various Bayesian approaches appear to be meaningful for managing the product quality control process.
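
A minimal sketch of a Ghosh-style constrained Bayes adjustment under an assumed normal-normal model: the posterior means are rescaled about their average so that the empirical spread of the estimates matches the posterior expectation of the parameters' spread, i.e. the first two empirical moments are matched. This is a generic illustration, not the paper's deviation index or data.

```python
import numpy as np

def constrained_bayes(post_mean, post_var):
    """Rescale Bayes (posterior-mean) estimates so their first two empirical
    moments match the posterior expectation of the parameters' moments."""
    post_mean = np.asarray(post_mean, dtype=float)
    post_var = np.asarray(post_var, dtype=float)
    center = post_mean.mean()
    spread = np.sum((post_mean - center) ** 2)
    # E[ sum_i (theta_i - theta_bar)^2 | data ] is approximated here by
    # spread + sum_i Var(theta_i | data).
    target = spread + post_var.sum()
    a = np.sqrt(target / spread)          # expansion factor (> 1): undoes over-shrinkage
    return center + a * (post_mean - center)

# Hypothetical posterior summaries for five production lots under a normal-normal model.
post_mean = [9.8, 10.1, 10.4, 9.6, 10.2]
post_var = [0.04, 0.05, 0.04, 0.06, 0.05]
print(constrained_bayes(post_mean, post_var))
```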

A Study on the Bayes Estimation Application for Korean Standard-Quality Excellence Index(KS-QEI) (베이즈 추정방식의 품질우수성지수 적용 방안에 관한 연구)

  • Kim, Tai Kyoo; Kim, Myung Joon
    • Journal of Korean Society for Quality Management / v.42 no.4 / pp.747-756 / 2014
  • Purpose: The purpose of this study is to apply a Bayesian estimation methodology to produce the 'Korean Standard-Quality Excellence Index' model and to demonstrate the effectiveness of the new approach on survey data by comparing the current index with the new index produced by the Bayesian estimation method. Methods: The 'Korean Standard-Quality Excellence Index' was produced from the collected survey data by a Bayesian estimation method, and the deviations of the two results were compared to confirm the effectiveness of the suggested application. Results: The statistical analysis shows that the suggested estimator, that is, the empirical Bayes estimator, improves the index by reducing the error under a specific loss function suggested for checking the goodness of fit. Conclusion: Considering Bayesian techniques such as the empirical Bayes estimator for producing the quality excellence index reduces the error in estimating the parameter of interest; furthermore, various Bayesian approaches appear to be meaningful for producing the corresponding index.
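
As a generic illustration of the empirical Bayes idea for an index built from survey scores (not the KS-QEI model itself): company-level means are shrunk toward the grand mean with weights estimated from the between- and within-company variances. The data, variance values, and method-of-moments estimate below are assumptions for the sketch.

```python
import numpy as np

def empirical_bayes_means(group_means, group_sizes, within_var):
    """Normal-normal empirical Bayes: shrink each group mean toward the grand mean."""
    m = np.asarray(group_means, dtype=float)
    n = np.asarray(group_sizes, dtype=float)
    grand = np.average(m, weights=n)
    # Crude method-of-moments estimate of the between-group variance, floored at 0.
    tau2 = max(np.average((m - grand) ** 2, weights=n) - within_var / n.mean(), 0.0)
    weight = tau2 / (tau2 + within_var / n)      # shrinkage weight per group
    return weight * m + (1 - weight) * grand

# Hypothetical satisfaction scores: per-company mean, respondent count, common within-variance.
means = [72.0, 80.5, 65.3, 77.8]
sizes = [30, 120, 45, 60]
print(empirical_bayes_means(means, sizes, within_var=90.0))
```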

A Study on an Automatical BKLS Measurement By Programming Technology

  • Shin, YeounOuk; Kim, KiBum
    • International Journal of Advanced Smart Convergence / v.7 no.3 / pp.73-78 / 2018
  • This study focuses on presenting an IT program module that provides the BKLS measure, in order to address the capital-cost problem caused by information asymmetry between external investors and corporate executives. Barron et al. (1998) proposed the BKLS measure so that intermediary analysts can guide the market; the measure is computed from changes in analyst forecast dispersion and the squared analyst mean forecast error. This study suggests an algorithmic model by which the BKLS measure can be provided to all investors immediately by an IT program, so that the measure delivers meaningful value in the domestic capital market. The approach generates and analyzes real-time or non-real-time prediction models by transferring the predicted estimates, delivered to the Big Data Log Analysis System through the statistical DB, to the statistical forecasting engine. Because no concrete method for computing the BKLS measure has been established, it is practically very difficult to estimate. It is expected that the BKLS measure of Barron et al. (1998) introduced in this study and the IT module model provided in real time will be the starting point for follow-up studies on the introduction and realization of this technology in the future.
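
The abstract names the two inputs to the BKLS measure: analyst forecast dispersion and the squared mean forecast error. A minimal sketch of computing those two components for one firm-period from a list of analyst EPS forecasts and the realized value (the numbers are illustrative, and the full BKLS information-precision formulas are not reproduced here):

```python
import numpy as np

def bkls_components(forecasts, actual):
    """Return (dispersion, squared mean forecast error) for one firm-period."""
    f = np.asarray(forecasts, dtype=float)
    dispersion = f.var(ddof=1)                 # sample variance across analysts
    squared_error = (f.mean() - actual) ** 2   # squared error of the consensus forecast
    return dispersion, squared_error

# Hypothetical EPS forecasts from five analysts and the realized EPS.
forecasts = [1.10, 1.25, 1.05, 1.30, 1.18]
actual = 1.20
print(bkls_components(forecasts, actual))
```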