• Title/Summary/Keyword: Procedure robustness


A Study on the Methods for the Robust Job Stress Management for Nuclear Power Plant Workers using Response Surface Data Mining (반응표면 데이터마이닝 기법을 이용한 원전 종사자의 강건 직무 스트레스 관리 방법에 관한 연구)

  • Lee, Yonghee;Jang, Tong Il;Lee, Yong Hee
    • Journal of the Korean Society of Safety, v.28 no.1, pp.158-163, 2013
  • While job stress evaluations have been reported in recent surveys of nuclear power plants (NPPs), no significant advance in the types of questionnaires is currently found, and there are limits to their usefulness as analytic tools for the management of safety resources in NPPs. Data mining (DM) has emerged as a key approach to the data computing and analysis needed to conduct a survey analysis, but its capability is still limited by, for example, the high dimensionality associated with many survey questions and the quality of the information. Even though some survey methods may have significant advantages, these methods often do not provide enough evidence of causal relationships or statistical inferences among a large number of input factors and responses. To address these limitations on data computing and analysis capabilities, we propose an advanced survey-analysis procedure incorporating the DM method into a statistical analysis. The DM method can reduce the dimensionality of risk factors, but it may not address the robustness of solutions, either through data preprocessing for outliers and missing values or by considering uncontrollable noise factors. We propose three steps to address these limitations. The first step combines data mining with the response surface method (RSM) to deal with such situations, creating a new method called response surface data mining (RSDM). The second step follows the RSDM with detailed statistical relationships between the risk factors and the response of interest, and demonstrates that the proposed RSDM can effectively find significant physical, psycho-social, and environmental risk factors by reducing dimensionality while providing detailed statistical inferences. The final step suggests a robust stress management system that effectively manages the job stress of NPP workers as part of safety resource management, using the surrogate variable concept.
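
Response surface methodology is the statistical core of the RSDM step above. As a hedged illustration only (synthetic data; the variable names `workload`, `noise`, and `stress` are invented here, not drawn from the study), a second-order response surface can be fit as follows:

```python
# Illustrative sketch (not the authors' code): fitting a second-order
# response surface to survey-derived risk factors with statsmodels.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
# Hypothetical data: two risk factors and a stress response
df = pd.DataFrame({
    "workload": rng.uniform(-1, 1, 100),
    "noise": rng.uniform(-1, 1, 100),
})
df["stress"] = (2.0 * df.workload + 1.5 * df.noise
                + 0.8 * df.workload * df.noise
                - 0.5 * df.workload**2
                + rng.normal(0, 0.2, 100))

# Full quadratic (response surface) model: main, interaction, square terms
model = smf.ols(
    "stress ~ workload + noise + workload:noise"
    " + I(workload**2) + I(noise**2)",
    data=df).fit()
print(model.summary())
```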

Design of robust Watermarking Algorithm against the Geometric Transformation for Medical Image Security (의료 영상보안을 위한 기하학적 변형에 견고한 워터마킹 알고리즘 설계)

  • Lee, Yun-Bae;Oh, Guan-Tack
    • Journal of the Korea Institute of Information and Communication Engineering, v.13 no.12, pp.2586-2594, 2009
  • Digital watermarking is used as a protection and certification mechanism for copyrighted creations, including music, still images, and video, for detecting data loss, reproduction, and tracing. This study suggests selecting geometric invariant points that persist through the whole processing procedure of an image, and inserting and extracting the watermark relative to those invariant points so that it is robust against geometric transformation attacks. The algorithm introduced here is based on a watershed splitting method, in order to make medical images robust against RST (Rotation, Scale, Translation) transformations and other processing. It also helps maintain the watermark in images that are compressed and stored for a period of time. The algorithm is shown to be robust not only against JPEG compression attacks, but also against RST and filtering attacks.
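
As a toy illustration of the geometric-invariance idea only (this is not the paper's watershed-based algorithm; the moment-based frame below is a stand-in), embedding coordinates can be expressed in a frame derived from image moments, which rotates with the image and can therefore be recovered after an RST attack:

```python
# Toy sketch (assumption: not the paper's method): normalize an image to a
# moment-based invariant frame before embedding, so the same frame can be
# recovered after rotation/scale/translation (RST).
import numpy as np
from scipy import ndimage

def invariant_frame(img):
    """Return centroid and principal-axis angle of an intensity image."""
    ys, xs = np.indices(img.shape)
    m = img.sum()
    cy, cx = (ys * img).sum() / m, (xs * img).sum() / m
    mu11 = ((xs - cx) * (ys - cy) * img).sum()
    mu20 = ((xs - cx) ** 2 * img).sum()
    mu02 = ((ys - cy) ** 2 * img).sum()
    angle = 0.5 * np.arctan2(2 * mu11, mu20 - mu02)
    return (cy, cx), angle

img = np.zeros((64, 64))
img[20:40, 10:50] = 1.0                       # a bright rectangle
(cy, cx), theta = invariant_frame(img)

rot = ndimage.rotate(img, 30, reshape=False, order=1)
(cy2, cx2), theta2 = invariant_frame(rot)
# The two angles differ by ~30 degrees (up to sign convention), so embedding
# coordinates expressed in this frame can be re-located after an RST attack.
print(np.degrees(theta), np.degrees(theta2))
```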

A Hardware Architecture of Hough Transform Using an Improved Voting Scheme (개선된 보팅 정책을 적용한 허프 변환 하드웨어 구조)

  • Lee, Jeong-Rok;Bae, Kyeong-Ryeol;Moon, Byungin
    • The Journal of Korean Institute of Communications and Information Sciences, v.38A no.9, pp.773-781, 2013
  • The Hough transform for line detection is widely used in many machine vision applications due to its robustness against data loss and distortion. However, it is not well suited to real-time embedded vision systems, because it has an inefficient computation structure and demands a large number of memory accesses. Thus, this paper proposes an improved voting scheme for the Hough transform and applies it to a Hough transform hardware architecture so that it can provide real-time performance with fewer hardware resources. The proposed voting scheme reduces the computational overhead of the voting procedure by exploiting the correlation between adjacent pixels, and improves computational efficiency by increasing the reusability of vote values. The proposed hardware architecture, which adopts this improved scheme, maximizes throughput by computing and storing vote values for many adjacent pixels in parallel. This parallelization for throughput improvement is accomplished with little hardware overhead compared with sequential computation.
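
For reference, a minimal software baseline of the Hough voting step that such hardware accelerates might look like the sketch below (the paper's improved scheme reuses votes across adjacent pixels; this shows only the standard one-vote-per-pixel-per-angle loop):

```python
# Baseline Hough line voting: each edge pixel votes for all (rho, theta)
# bins consistent with a line passing through it.
import numpy as np

def hough_votes(edge_img, n_theta=180):
    h, w = edge_img.shape
    diag = int(np.ceil(np.hypot(h, w)))
    thetas = np.deg2rad(np.arange(n_theta))
    acc = np.zeros((2 * diag, n_theta), dtype=np.int32)
    cos_t, sin_t = np.cos(thetas), np.sin(thetas)
    ys, xs = np.nonzero(edge_img)
    for x, y in zip(xs, ys):
        rhos = np.round(x * cos_t + y * sin_t).astype(int) + diag
        acc[rhos, np.arange(n_theta)] += 1   # one vote per (rho, theta)
    return acc

edges = np.zeros((32, 32), dtype=np.uint8)
edges[np.arange(32), np.arange(32)] = 1      # a diagonal line y = x
acc = hough_votes(edges)
rho_i, th_i = np.unravel_index(acc.argmax(), acc.shape)
print("strongest line at theta =", th_i, "degrees")  # ~135 for y = x
```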

A new validated analytical method for the quality control of red ginseng products

  • Kim, Il-Woung;Cha, Kyu-Min;Wee, Jae Joon;Ye, Michael B.;Kim, Si-Kwan
    • Journal of Ginseng Research, v.37 no.4, pp.475-482, 2013
  • The main active components of Panax ginseng are ginsenosides. Ginsenosides Rb1 and Rg1 are accepted worldwide as marker substances for quality control. The analytical methods currently used to detect these two compounds unfairly penalize steamed and dried (red) P. ginseng preparations, because red ginseng has a lower content of those ginsenosides than white ginseng. To manufacture red ginseng products from fresh ginseng, the ginseng roots are exposed to high temperatures for many hours. This heating process converts the naturally occurring ginsenosides Rb1 and Rg1 into artifact ginsenosides such as Rg3, Rg5, Rh1, and Rh2, among others. This study highlights the absurdity of the current analytical practice by investigating the time-dependent changes in the crude saponin and the major natural and artifact ginsenoside contents during simmering. The results lead us to recommend (20S)- and (20R)-ginsenoside Rg3 as new reference materials to complement the current P. ginseng reference materials, ginsenosides Rb1 and Rg1. An attempt has also been made to establish validated qualitative and quantitative analytical procedures for these four compounds that meet International Conference on Harmonisation (ICH) guidelines for specificity, linearity, range, accuracy, precision, detection limit, quantitation limit, robustness, and system suitability. Based on these results, we suggest a validated analytical procedure that conforms to ICH guidelines and values the ginsenoside contents of white and red ginseng preparations equally.
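
For context, the ICH Q2-style linearity and detection/quantitation limits mentioned above are computed from a calibration curve roughly as follows (the concentrations and detector responses below are invented for illustration):

```python
# Hedged sketch: typical ICH-style calibration statistics (linearity, LOD,
# LOQ) for a marker compound; the data values are made up.
import numpy as np

conc = np.array([5, 10, 25, 50, 100.0])       # ug/mL standards
peak = np.array([52, 101, 260, 515, 1022.0])  # detector response

slope, intercept = np.polyfit(conc, peak, 1)
pred = slope * conc + intercept
ss_res = ((peak - pred) ** 2).sum()
ss_tot = ((peak - peak.mean()) ** 2).sum()
r2 = 1 - ss_res / ss_tot                      # linearity criterion

sigma = np.sqrt(ss_res / (len(conc) - 2))     # residual standard deviation
lod = 3.3 * sigma / slope                     # ICH Q2: LOD = 3.3 sigma / S
loq = 10.0 * sigma / slope                    # ICH Q2: LOQ = 10 sigma / S
print(f"R^2={r2:.4f}  LOD={lod:.2f}  LOQ={loq:.2f} ug/mL")
```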

Improvements on Speech Recognition for Fast Speech (고속 발화음에 대한 음성 인식 향상)

  • Lee Ki-Seung
    • The Journal of the Acoustical Society of Korea, v.25 no.2, pp.88-95, 2006
  • In this paper, a method for improving the performance of automatic speech recognition (ASR) systems on conversational speech is proposed, focusing mainly on increasing robustness against rapidly spoken utterances. The proposed method does not require an additional speech recognition pass to quantify speaking rate. The energy distribution of specific frequency bands is employed to detect vowel regions, and the number of vowels per second is then computed as the speaking rate. In previous methods, performance on fast speech was improved by expanding the sequence of feature vectors by a scaling factor computed as the ratio between the standard phoneme duration and the measured one. In the method proposed herein, however, utterances are classified by their speaking rates, and the scaling factor is determined individually for each class using a maximum likelihood criterion. Results from ASR experiments on 10-digit mobile phone numbers confirm that the overall error rate is reduced by 17.8% when the proposed method is employed.
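
A rough sketch of the speaking-rate estimate described above, counting vowel-like low-band energy bursts per second, might look like the following (band edges, thresholds, and the function name are assumptions, not the paper's values):

```python
# Rough sketch (thresholds and band edges are made up): estimate speaking
# rate as the number of vowel-like energy bursts per second.
import numpy as np

def speaking_rate(signal, sr, frame=0.025, hop=0.010):
    n, h = int(frame * sr), int(hop * sr)
    frames = [signal[i:i + n] for i in range(0, len(signal) - n, h)]
    # Vowels concentrate energy at low frequencies; use the <1 kHz band
    energies = []
    for f in frames:
        spec = np.abs(np.fft.rfft(f * np.hanning(n))) ** 2
        freqs = np.fft.rfftfreq(n, 1 / sr)
        energies.append(spec[freqs < 1000].sum())
    e = np.array(energies)
    active = (e > 0.5 * e.max()).astype(int)   # crude vowel-region mask
    # Rising edges (prepend 0 so a burst at t=0 counts) = vowel segments
    n_vowels = np.count_nonzero(np.diff(np.concatenate(([0], active))) == 1)
    return n_vowels / (len(signal) / sr)       # vowels per second

sr = 16000
t = np.arange(sr) / sr                         # one second of audio
toy = np.sin(2 * np.pi * 200 * t) * (np.sin(2 * np.pi * 3 * t) > 0)
print(speaking_rate(toy, sr))                  # ~3 "vowel" bursts/second
```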

Design and Implementation of OBCP Engine based on Lua VM for AT697F/VxWorks Platform (AT697F/VxWorks 플랫폼에서 Lua 가상머신 기반의 OBCP 엔진 설계 및 구현)

  • Choi, Jong-Wook;Park, Su-Hyun
    • Journal of Satellite, Information and Communications, v.12 no.3, pp.108-113, 2017
  • An OBCP (On-Board Control Procedure), sometimes described as an 'operator on board', is a procedure executed on-board that can easily be loaded, executed, and replaced without modifying the rest of the flight software (FSW). The use of OBCPs enhances on-board autonomy and increases robustness to ground station outages. The OBCP engine, the core module of the OBCP component in the FSW, interprets and executes procedures written in a high-level scripting language, possibly compiled, relying on a virtual machine. The FSW team at KARI has studied OBCP since 2010 through internal projects, and built study-only OBCP engines such as Java KVM, RTCS/C, and KKOMA for the ERC32 processor target. Recently, we have been studying ESA's OBCP standard and implementing Lua and MicroPython as OBCP engines on the LEON2-FT/AT697F processor target. This paper presents the design and implementation of Lua as the OBCP engine on the AT697F processor with the VxWorks RTOS, and describes the evaluation results and performance of the OBCP engine.
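
The paper embeds a Lua VM in C on VxWorks; purely to illustrate the load/execute/replace idea of an OBCP engine, here is a conceptual Python sketch (the class and procedure names are invented):

```python
# Conceptual sketch only: procedures are uploaded scripts that can be
# executed and replaced at runtime without rebuilding the host software.
class OBCPEngine:
    def __init__(self):
        self.procedures = {}

    def load(self, name, source):
        # Compile on upload so syntax errors are caught before execution
        self.procedures[name] = compile(source, name, "exec")

    def execute(self, name, env):
        # Run the procedure in a restricted namespace; results land in env
        exec(self.procedures[name], {"__builtins__": {}}, env)
        return env

engine = OBCPEngine()
engine.load("heater_ctrl", "cmd = 'HEATER_ON' if temp < 10 else 'NOOP'")
print(engine.execute("heater_ctrl", {"temp": 4}))
# Replace the procedure in flight, without touching the engine itself
engine.load("heater_ctrl", "cmd = 'HEATER_ON' if temp < 5 else 'NOOP'")
print(engine.execute("heater_ctrl", {"temp": 7}))
```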

Design of Sliding Mode Fuzzy Controller for Vibration Reduction of Large Structures (대형구조물의 진동 감소를 위한 슬라이딩 모드 퍼지 제어기의 설계)

  • 윤정방;김상범
    • Journal of the Earthquake Engineering Society of Korea, v.3 no.3, pp.63-74, 1999
  • A sliding mode fuzzy control (SMFC) algorithm is presented for the vibration control of large structures. The rule base of the fuzzy inference engine is constructed based on sliding mode control, a nonlinear control algorithm. The fuzziness of the controller makes the control system robust against uncertainties in the system parameters and the input excitation, and the nonlinearity of the control rule makes the controller more effective than linear controllers. The design procedure based on the present fuzzy control is more convenient than those of conventional algorithms that rely on complex mathematical analysis, such as the linear quadratic regulator and sliding mode control (SMC). The robustness of the presented controller is illustrated by examining the loop transfer function. For verification of the present algorithm, a numerical study is carried out on the benchmark problem initiated by the ASCE Committee on Structural Control. To achieve a high level of realism, various aspects are considered, such as actuator-structure interaction, modeling error, sensor noise, actuator time delay, precision of the A/D and D/A converters, magnitude of the control force, and order of the control model. The performance of the SMFC is examined in comparison with other control algorithms, such as mixed $H_2/H_\infty$ optimal polynomial control, neural network control, and SMC, as reported by other researchers. The results indicate that the present SMFC is an efficient and attractive control method, since the vibration responses of the structure can be reduced very effectively and the design procedure is simple and convenient.
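
As a hedged sketch of the sliding-mode-fuzzy idea (the gains, memberships, and one-degree-of-freedom structure model below are illustrative, not the paper's benchmark), a symmetric fuzzy rule base over the sliding variable collapses to a smooth saturation of the classical sign-type SMC law:

```python
# Illustrative SMFC: the fuzzy rules act on the sliding variable
# s = c*err + derr, smoothing the discontinuous -u_max*sign(s) law of SMC.
import numpy as np

def smfc_force(err, derr, c=2.0, u_max=1.0, phi=0.5):
    s = c * err + derr                  # sliding variable
    # A symmetric triangular rule base reduces to a saturated linear law:
    # a boundary-layer approximation of -u_max * sign(s)
    return -u_max * np.clip(s / phi, -1.0, 1.0)

# One-DOF structure: x'' = -2*zeta*w*x' - w^2*x + u + ground excitation
w, zeta, dt = 2 * np.pi, 0.02, 0.001
x, v = 0.0, 0.0
for k in range(5000):
    quake = 0.3 * np.sin(2 * np.pi * 1.2 * k * dt)
    u = smfc_force(x, v)
    a = -2 * zeta * w * v - w ** 2 * x + u + quake
    v += a * dt
    x += v * dt
print(f"final displacement {x:.4f}")
```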


Detection of Phantom Transaction using Data Mining: The Case of Agricultural Product Wholesale Market (데이터마이닝을 이용한 허위거래 예측 모형: 농산물 도매시장 사례)

  • Lee, Seon Ah;Chang, Namsik
    • Journal of Intelligence and Information Systems, v.21 no.1, pp.161-177, 2015
  • With the rapid evolution of technology, the size, number, and types of databases have increased concomitantly, so data mining approaches face many challenging applications. One such application is the discovery of fraud patterns in agricultural product wholesale transactions. The agricultural product wholesale market in Korea is huge, and vast numbers of transactions are made every day. The demand for agricultural products continues to grow, and the use of electronic auction systems raises the operational efficiency of the wholesale market. The number of unusual transactions can also be assumed to increase in proportion to the trading volume, and an unusual transaction is often the first sign of fraud. However, it is very difficult to identify and detect these transactions and the corresponding fraud in the agricultural product wholesale market, because the types of fraud are more sophisticated than ever before. Fraud can be detected by verifying the overall transaction records manually, but this requires a significant amount of human resources and is ultimately not a practical approach. Fraud can also be revealed by a victim's report or complaint, but there are usually no victims in agricultural product wholesale fraud, because it is committed through collusion between an auction company and an intermediary wholesaler. Nevertheless, transaction records must be monitored continuously to prevent fraud, because fraud not only disturbs the fair trade order of the market but also rapidly reduces the market's credibility. Applying data mining in such an environment is very useful, since it can properly discover unknown fraud patterns or features from a large volume of transaction data. The objective of this research is to empirically investigate the factors necessary to detect fraudulent transactions in an agricultural product wholesale market by developing a data-mining-based fraud detection model. One major fraud is the phantom transaction, a colluding transaction by the seller (auction company or forwarder) and buyer (intermediary wholesaler). They pretend to fulfill the transaction by recording false data in the online transaction processing system without actually selling products, and the seller receives money from the buyer. This leads to overstated sales performance and illegal money transfers, which reduce the credibility of the market. This paper reviews the environment of the wholesale market, such as the types of transactions, the roles of market participants, and the various types and characteristics of fraud, and introduces the whole process of developing the phantom transaction detection model. The process consists of four modules: (1) data cleaning and standardization; (2) statistical data analysis, such as distribution and correlation analysis; (3) construction of a classification model using a decision-tree induction approach; and (4) verification of the model in terms of hit ratio. We collected real data from six associations of agricultural producers in metropolitan markets. The final decision-tree model revealed that the monthly average trading price of an item offered by forwarders is a key variable in detecting phantom transactions. The verification procedure also confirmed the suitability of the results.
However, even though the performance of the results of this research is satisfactory, sensitive issues remain for improving classification accuracy and the conciseness of rules. One such issue is the robustness of the data mining model. Data mining is very much data-oriented, so data mining models tend to be very sensitive to changes in data or situations. It is thus evident that this non-robustness requires continuous remodeling as data or situations change. We hope that this paper suggests valuable guidelines to organizations and companies considering the introduction or construction of a fraud detection model in the future.
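
As an illustrative sketch of the decision-tree induction step (synthetic data, not the study's wholesale-market records; the feature names are invented), the model construction and hit-ratio check might look like:

```python
# Hedged sketch: train a decision tree to flag phantom transactions on
# synthetic data and report the hit ratio (accuracy on held-out data).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(1)
n = 2000
avg_price = rng.lognormal(3, 0.5, n)     # monthly average trading price
volume = rng.integers(1, 500, n)
# Synthetic rule: unusually high average price raises phantom probability
phantom = (avg_price > np.quantile(avg_price, 0.9)) & (rng.random(n) < 0.8)

X = np.column_stack([avg_price, volume])
X_tr, X_te, y_tr, y_te = train_test_split(X, phantom, random_state=0)
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)
print("hit ratio:", tree.score(X_te, y_te))
print(export_text(tree, feature_names=["avg_price", "volume"]))
```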

Analysis of Repeated Measurement Problem in SP data (SP 데이터의 Repeated Measurement Problem 분석)

  • CHO, Hye-Jin
    • Journal of Korean Society of Transportation, v.20 no.1, pp.111-119, 2002
  • One of the advantages of SP methods is the possibility of obtaining a number of responses from each respondent. However, when the repeated observations from each respondent are analysed with a simple modeling method, a potential problem arises from the upward-biased significance due to the repeated observations. This study uses a variety of approaches to explore this issue and to test the robustness of the simple model estimates. Among several different approaches, the Jackknife method and Kocur's method were applied; the Jackknife method was implemented using the program JACKKNIFE. The model estimates of the Jackknife method and Kocur's method were compared with the uncorrected estimates in order to test whether there was a repeated measurement problem and the extent to which this problem affected the model estimates. The standard errors of the uncorrected model estimates and the Jackknife estimates were also compared. The results reveal that the t-ratios of Kocur's method are much lower than those of the uncorrected method and the Jackknife estimates, indicating that Kocur's method underestimates the significance of the coefficients. The Jackknife method produced almost the same coefficients as the uncorrected model, but with lower t-ratios. These results indicate that the coefficients of the uncorrected method are accurate but that their significance is somewhat overestimated. In this study, I concluded that the repeated measurement problem did exist in our data, but that it did not affect the model estimation results significantly. It is recommended that such a test become a standard procedure; if it turns out that an analysis based on the simple uncorrected method is influenced by the repeated measurement problem, it should be corrected.
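
A minimal sketch of the respondent-level jackknife described above, re-estimating the model with each respondent left out in turn (the stand-in model here is ordinary least squares, not the SP choice model):

```python
# Hedged sketch: respondent-level jackknife for repeated SP observations.
import numpy as np

def fit(X, y):                          # stand-in for the choice model
    return np.linalg.lstsq(X, y, rcond=None)[0]

rng = np.random.default_rng(0)
n_resp, n_obs = 30, 8                   # 8 repeated SP responses each
resp = np.repeat(np.arange(n_resp), n_obs)
X = rng.normal(size=(n_resp * n_obs, 2))
y = X @ np.array([1.0, -0.5]) + rng.normal(0, 1, n_resp * n_obs)

full = fit(X, y)
# Leave out one entire respondent (all 8 responses) at a time
leave_one_out = np.array([fit(X[resp != r], y[resp != r])
                          for r in range(n_resp)])
# Jackknife pseudo-value estimate and standard error per coefficient
theta = n_resp * full - (n_resp - 1) * leave_one_out.mean(axis=0)
se = np.sqrt((n_resp - 1) / n_resp *
             ((leave_one_out - leave_one_out.mean(0)) ** 2).sum(0))
print("coef:", theta, "jackknife SE:", se)
```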

A Study on Security Level-based Authentication for Supporting Multiple Objects in RFID Systems (다중 객체 지원을 위한 RFID 시스템에서 보안 레벨 기반의 인증 기법에 관한 연구)

  • Kim, Ji-Yeon;Jung, Jong-Jin;Jo, Geun-Sik;Lee, Kyoon-Ha
    • The Journal of Society for e-Business Studies, v.13 no.1, pp.21-32, 2008
  • RFID systems provide automatic object identification through wireless communication over invisible ranges, and they adapt to various circumstances. These advantages allow RFID systems to be applied in various fields of industry and daily life. However, as tags come into wider use, it becomes difficult to distinguish them, because a tag usually stores only one object identifier in common RFID applications. In addition, RFID systems often cause serious violations of privacy through various attacks that exploit the weakness of radio frequency communication. Therefore, information sharing methods among applications are necessary for the expansive development of RFID systems. In this paper, we propose an efficient RFID scheme. First, we design a new RFID tag structure which supports many object identifiers of different applications in a single tag and allows those applications to access them simultaneously. Second, we propose an authentication protocol to support the proposed tag structure. The proposed protocol is designed with robustness against various attacks on low-cost RFID systems in mind. In particular, the protocol focuses on the efficiency of the authentication procedure by considering the security levels of applications: each application goes through one of several different authentication procedures according to its security level. Finally, we demonstrate the efficiency of the proposed scheme compared with other schemes through experiments and evaluation.
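
As a toy sketch of level-dependent authentication dispatch only (the two procedures below are invented for illustration; the paper defines its own protocol messages), applications at different security levels can be routed to different authentication procedures:

```python
# Toy sketch: route each application to a cheap or a strong authentication
# procedure according to its security level.
import hashlib
import os

def auth_low(tag_id, _secret, _nonce):
    return tag_id                         # cleartext ID, cheapest option

def auth_high(tag_id, secret, nonce):
    # Hash-based challenge-response over a fresh reader nonce
    return hashlib.sha256(tag_id + secret + nonce).hexdigest()

LEVELS = {0: auth_low, 1: auth_high}

def authenticate(level, tag_id, secret):
    nonce = os.urandom(8)                 # reader challenge
    response = LEVELS[level](tag_id, secret, nonce)   # tag side
    expected = LEVELS[level](tag_id, secret, nonce)   # reader recomputes
    return response == expected

print(authenticate(1, b"object-42", b"k3y"))
```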
