• Title/Summary/Keyword: Graphical Data

The realization of Load Flow under Graphic Interface I (그래픽 인터페이스를 통한 조류계산 구현 I)

  • Hwang, In-Jun; Kim, Kun-Joong; Kim, Kyu-Wang; Shin, Man-Cheol; O, Sung-Kun
    • Proceedings of the KIEE Conference / 2004.11b / pp.220-222 / 2004
  • In our earlier paper on the architecture of load flow under a graphic interface, we described the definition of the graphic interface and the approach to system modeling. From an engineering viewpoint, domain definition is the step of the process that seeks out a general solution, and it forms the basis of the program's structure. When relationships among internal components are established, their relevance and restrictions are affected by data types and related factors, and in this process logical data are realized as symbols. The system configuration then determines the main functions and extensions through analysis of the user's demands and requirements. Engineering analysis software, in particular, must also achieve accuracy through its numerical methods, and representing the results calls for more powerful graphical components, which improve the user's accessibility. Through domain definition we can combine I/O, a numerical library, and graphical components as system elements. In this paper we carry the design through the implementation stage and decide the direction of programming. In conclusion, we identify the functions an analysis program needs in order to be improved.
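As a rough illustration of the kind of numerical core such a graphical load-flow tool wraps, the sketch below runs a Gauss-Seidel load-flow iteration on a two-bus system; the admittances, load, and tolerance are illustrative assumptions, not values from the paper.

```python
# Minimal Gauss-Seidel load-flow sketch (illustrative two-bus data, not the paper's system).
import numpy as np

# Admittance matrix for a slack bus (0) and one PQ bus (1), per unit.
Y = np.array([[ 10 - 30j, -10 + 30j],
              [-10 + 30j,  10 - 30j]])

V = np.array([1.05 + 0j, 1.00 + 0j])   # initial voltages; bus 0 is the slack bus
S = np.array([0.0 + 0j, -0.5 - 0.2j])  # scheduled injections (a load at bus 1)

for _ in range(100):                   # Gauss-Seidel sweeps on the PQ bus
    V_new = (np.conj(S[1] / V[1]) - Y[1, 0] * V[0]) / Y[1, 1]
    done = abs(V_new - V[1]) < 1e-8
    V[1] = V_new
    if done:
        break

print(f"Bus 1: |V| = {abs(V[1]):.4f} p.u., angle = {np.degrees(np.angle(V[1])):.3f} deg")
```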

ILVA: Integrated audit-log analysis tool and its application. (시스템 보안 강화를 위한 로그 분석 도구 ILVA와 실제 적용 사례)

  • 차성덕
    • Journal of the Korea Institute of Information Security & Cryptology / v.9 no.3 / pp.13-26 / 1999
  • Despite its numerous positive aspects, the widespread use of the Internet has resulted in an increased number of system intrusions, and the need for enhanced security mechanisms is urgent. Systematic collection and analysis of log data are essential in intrusion investigation. Unfortunately, existing logs are stored in diverse and incompatible formats, making automated intrusion investigation practically impossible. We examined the types of log data essential to intrusion investigation and implemented a tool that enables systematic collection and efficient analysis of voluminous log data. Our tool, based on an RDBMS and SQL, provides a graphical and user-friendly interface. We describe our experience of using the tool in actual intrusion investigations and explain how it can be further enhanced.
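The RDBMS-plus-SQL approach described above can be sketched as follows; the schema, sample records, and the "repeated failed logins" query are illustrative assumptions, not ILVA's actual design.

```python
# Sketch: normalize heterogeneous log records into one table, then query for suspicious activity.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE audit_log (
    ts TEXT, host TEXT, facility TEXT, src_ip TEXT, event TEXT)""")

records = [
    ("1999-03-01 02:14:05", "www1", "sshd", "10.0.0.7", "failed password"),
    ("1999-03-01 02:14:09", "www1", "sshd", "10.0.0.7", "failed password"),
    ("1999-03-01 02:14:12", "www1", "sshd", "10.0.0.7", "failed password"),
    ("1999-03-01 03:01:44", "db1",  "ftpd", "10.0.0.9", "login ok"),
]
conn.executemany("INSERT INTO audit_log VALUES (?,?,?,?,?)", records)

# Hosts with repeated authentication failures from the same source address.
for row in conn.execute("""
        SELECT host, src_ip, COUNT(*) AS failures
        FROM audit_log
        WHERE event LIKE 'failed%'
        GROUP BY host, src_ip
        HAVING failures >= 3"""):
    print(row)
```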

Development of Control Algorithm and Pick & Placer (반도체 소자 Pick &Placer 및 제어 알고리즘 개발)

  • 심성보; 김재희; 유범상
    • Proceedings of the Korean Society of Precision Engineering Conference / 2004.10a / pp.1339-1343 / 2004
  • This paper presents the development of a Pick & Placer and its control algorithm. The Pick & Placer provides a powerful multi-tasking system that includes both graphical and remote interfaces. Users can easily set up sorting parameters and record important data, including wafer number, date, and operator information. The system incorporates a dustproof enclosure and massive machined components to provide an extremely stable sorting environment. Precise resolution and accuracy result from the use of machine vision, a pneumatic slide drive, and closed-loop positioning.
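The closed-loop positioning mentioned above can be illustrated, very roughly, by a discrete PID loop driving a simulated linear axis toward a target position; the gains and the first-order plant model are illustrative assumptions, not the developed controller.

```python
# Sketch: PID position loop on a crude simulated axis (illustrative gains and plant).
dt, target = 0.001, 10.0           # s, mm
kp, ki, kd = 8.0, 2.0, 0.05        # illustrative gains

pos, vel, integ, prev_err = 0.0, 0.0, 0.0, target
for _ in range(5000):
    err = target - pos             # feedback, e.g. from an encoder
    integ += err * dt
    deriv = (err - prev_err) / dt
    u = kp * err + ki * integ + kd * deriv   # command to the slide drive
    prev_err = err
    # crude first-order axis model: acceleration proportional to command minus damping
    vel += (u - 2.0 * vel) * dt
    pos += vel * dt

print(f"final position: {pos:.3f} mm (target {target} mm)")
```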

Note on classification and regression tree analysis (분류와 회귀나무분석에 관한 소고)

  • 임용빈; 오만숙
    • Journal of Korean Society for Quality Management / v.30 no.1 / pp.152-161 / 2002
  • The analysis of large data sets with hundreds of thousands of observations and thousands of independent variables is a formidable computational task. A less parametric method capable of identifying important independent variables and their interactions is the tree-structured approach to regression and classification, which gives a graphical and often illuminating way of looking at data in classification and regression problems. In this paper, we review and summarize the methodology used to construct a single tree and multiple trees, as well as the sequential strategy for identifying active compounds in large chemical databases.
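A quick way to see the tree-structured approach in practice is scikit-learn's CART implementation; the bundled iris data below are only an illustrative stand-in for the large chemical databases the note discusses.

```python
# Sketch: fit a shallow classification tree and print its readable, graphical-style summary.
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, export_text

X, y = load_iris(return_X_y=True)
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)

print(export_text(tree, feature_names=load_iris().feature_names))
```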

Firework Plot as a Graphical Exploratory Data Analysis Tool to Evaluate the Impact of Outliers in a Mixture Experiment (혼합물 실험에서 특이값의 영향을 평가하기 위한 그래픽 탐색적 자료분석 도구로서의 불꽃그림)

  • Jang, Dae-Heung; Ahn, SoJin; Kim, Youngil
    • The Korean Journal of Applied Statistics / v.27 no.4 / pp.629-643 / 2014
  • It is common to check the validity of an assumed model with heavy use of diagnostic tools when conducting data analysis with regression techniques; however, outliers and influential data points often distort the regression output in an undesired manner. Jang and Anderson-Cook (2013) proposed a graphical method called the firework plot for exploratory analysis that visualizes the trace of the impact of possibly outlying and/or influential data points on the individual regression coefficients and the overall residual sum of squares (SSE). They developed 3-D plots as well as pair-wise plots for the appropriate measures of interest. In this paper, the approach is extended further to demonstrate its strength, and a more meaningful interpretation is made possible by adding a measure not mentioned in their paper. The approach is applied to a mixture experiment, where a detailed analysis of the sensitivity of the statistical measures is especially important because the experiment is small.
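The quantity a firework plot traces can be sketched at its deletion endpoint as a leave-one-out loop, recording how each regression coefficient and the SSE move when one observation is removed; the simulated data and the reporting threshold are illustrative assumptions, not the mixture-experiment analysis.

```python
# Sketch: leave-one-out trace of coefficients and SSE on simulated data with one planted outlier.
import numpy as np

rng = np.random.default_rng(0)
n = 30
X = np.column_stack([np.ones(n), rng.uniform(0, 1, n)])
y = 2.0 + 3.0 * X[:, 1] + rng.normal(0, 0.3, n)
y[5] += 4.0                       # plant one outlier

def fit(Xs, ys):
    beta, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
    sse = np.sum((ys - Xs @ beta) ** 2)
    return beta, sse

beta_full, sse_full = fit(X, y)
for i in range(n):                # leave-one-out trace
    keep = np.arange(n) != i
    beta_i, sse_i = fit(X[keep], y[keep])
    if abs(sse_full - sse_i) > 0.5 * sse_full:
        print(f"obs {i}: beta {beta_i.round(3)}, SSE drop {sse_full - sse_i:.2f}")
```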

Evolutionary Hypernetwork Model for Higher Order Pattern Recognition on Real-valued Feature Data without Discretization (이산화 과정을 배제한 실수 값 인자 데이터의 고차 패턴 분석을 위한 진화연산 기반 하이퍼네트워크 모델)

  • Ha, Jung-Woo; Zhang, Byoung-Tak
    • Journal of KIISE: Software and Applications / v.37 no.2 / pp.120-128 / 2010
  • A hypernetwork is a generalized hypergraph and a probabilistic graphical model based on evolutionary learning. Hypernetwork models have been applied to various domains including pattern recognition and bioinformatics. Nevertheless, conventional hypernetwork models are limited to data with categorical or discrete attributes, because hypernetwork learning is based on equality comparison of hyperedges with the training data. Real-valued data therefore need to be discretized in preprocessing before learning with hypernetworks; however, discretization causes inevitable information loss and a possible decrease in pattern classification accuracy. To overcome this weakness, in this study we propose a novel feature-wise L1-distance-based method for learning hypernetwork models on real-valued attributes. We show that the proposed model improves classification accuracy compared with conventional hypernetworks and shows competitive performance against other machine learning methods.
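The core idea, matching a hyperedge to a real-valued example by a feature-wise L1 distance with a tolerance instead of exact equality on discretized values, can be sketched as below; the toy hyperedges, query vector, and threshold are illustrative assumptions, not the authors' learned model.

```python
# Sketch: L1-distance hyperedge matching and a simple class vote on a real-valued example.
import numpy as np

def matches(hyperedge_vals, x, idx, tau=0.15):
    """Hyperedge matches x if every referenced feature is within tau (feature-wise L1)."""
    return np.all(np.abs(x[idx] - hyperedge_vals) <= tau)

# each hyperedge: (feature indices, stored real values, class label)
hyperedges = [
    (np.array([0, 2]), np.array([0.20, 0.90]), 1),
    (np.array([1, 3]), np.array([0.70, 0.10]), 0),
]

x = np.array([0.25, 0.40, 0.85, 0.55])   # a real-valued query example
votes = {0: 0, 1: 0}
for idx, vals, label in hyperedges:
    if matches(vals, x, idx):
        votes[label] += 1
print("predicted class:", max(votes, key=votes.get))
```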

Regional Myocardial Blood Flow Estimation Using Rubidium-82 Dynamic Positron Emission Tomography and Dual Integration Method (Rubidium-82 심근 Dynamic PET 영상과 이중적분법을 이용한 국소 심근 혈류 예측의 기본 모델 연구)

  • 곽철은; 정재민
    • Journal of Biomedical Engineering Research / v.16 no.2 / pp.223-230 / 1995
  • This study investigates a combined mathematical model for the quantitative estimation of regional myocardial blood flow in experimental canine coronary artery occlusion and in patients with ischemic myocardial disease using Rb-82 dynamic myocardial positron emission tomography. Coronary thrombosis was induced with a new catheter technique that gradually narrowed the lumen of the coronary vessel, finally leading to partial obstruction of the coronary artery. Thirty-four Rb-82 dynamic myocardial PET scans were performed sequentially for each experiment using our 5-, 10-, and 20-second acquisition protocols, and six to seven regions of interest were drawn on each transaxial slice: one on the left ventricular chamber for the input function and the others on normal segments and segments with decreased perfusion for flow estimation in those regions. A two-compartment model and a graphical analysis method were applied to the measured sets of regional PET data, and the rate constants of influx into myocardial tissue were calculated as regional myocardial flow estimates by two-parameter fits of the raw data with the Levenberg-Marquardt method. The results showed that (1) the two-compartment model suggested by Kety and Schmidt, with proper modification of the measured data and the volume of distribution, can be used for simple estimation of regional myocardial blood flow; (2) the calculated regional myocardial blood flow estimates depend on the selection of the input function, which reflected the partial volume effect and left ventricular wall motion in the previously used graphical analysis; and (3) mathematically fitted input and tissue time-activity curves were more suitable than direct application of the measured data in terms of convergence.
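The two-parameter compartment fit can be sketched with a simulated arterial input function and SciPy's Levenberg-Marquardt-based curve_fit; the input function, frame timing, noise level, and true K1/k2 below are illustrative, not the canine or patient data.

```python
# Sketch: fit a one-tissue (Kety-Schmidt type) model Ct(t) = K1*exp(-k2*t) (*) Ca(t)
# to a simulated ROI time-activity curve by Levenberg-Marquardt (curve_fit default).
import numpy as np
from scipy.optimize import curve_fit

t = np.arange(0, 120, 5.0)                  # s, mid-frame times
Ca = 100 * t * np.exp(-t / 15.0)            # toy arterial input function

def tissue(t, K1, k2):
    # discrete convolution of Ca with K1*exp(-k2*t) on the frame grid
    dt = t[1] - t[0]
    kernel = K1 * np.exp(-k2 * t)
    return np.convolve(Ca, kernel)[:len(t)] * dt

rng = np.random.default_rng(1)
Ct_meas = tissue(t, 0.8, 0.1) + rng.normal(0, 20, len(t))   # noisy "ROI" curve

(K1_hat, k2_hat), _ = curve_fit(tissue, t, Ct_meas, p0=[0.5, 0.05])
print(f"K1 = {K1_hat:.3f} (flow-related), k2 = {k2_hat:.3f}")
```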

Development of Graphical Solution for Computer-Assisted Fault Diagnosis: Preliminary Study (컴퓨터 원용 결함진단을 위한 그래픽 솔루션 개발에 관한 연구)

  • Yoon, Han-Bean; Yun, Seung-Man; Han, Jong-Chul; Cho, Min-Kook; Lim, Chang-Hwy; Heo, Sung-Kyn; Shon, Cheol-Soon; Kim, Seong-Sik; Lee, Seok-Hee; Lee, Suk; Kim, Ho-Koung
    • Journal of the Korean Society for Nondestructive Testing / v.29 no.1 / pp.36-42 / 2009
  • We have developed software for converting the volumetric voxel data obtained from X-ray computed tomography (CT) into computer-aided design (CAD) data. The developed software can be used for non-destructive testing and evaluation, reverse engineering, rapid prototyping, and other applications. The main algorithms employed in the software are image reconstruction, volume rendering, segmentation, and mesh data generation. The feasibility of the developed software is demonstrated with CT data of human maxilla and mandible bones.
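The segmentation and mesh-generation step can be sketched with scikit-image's marching cubes on a synthetic volume; the sphere phantom and threshold are illustrative stand-ins for the CT data of the maxilla and mandible, not the paper's pipeline.

```python
# Sketch: threshold segmentation plus iso-surface extraction (marching cubes) on a synthetic volume.
import numpy as np
from skimage import measure

# synthetic 64^3 volume: a sphere of "bone-like" intensity on a dark background
z, y, x = np.mgrid[-1:1:64j, -1:1:64j, -1:1:64j]
volume = 1000.0 * (x**2 + y**2 + z**2 < 0.5)

verts, faces, normals, values = measure.marching_cubes(volume, level=500.0)
print(f"{len(verts)} vertices, {len(faces)} triangular faces")
# verts/faces can then be written out as STL or other mesh data for CAD use
```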

Statistical Analysis on the Web Using PHP3 (PHP3를 이용한 웹상에서의 통계분석)

  • Hwang, Jin-Soo; Uhm, Dae-Ho
    • Journal of the Korean Data and Information Science Society / v.10 no.2 / pp.501-510 / 1999
  • We have seen rapid development of the multimedia industry as computers have evolved, and the Internet has dramatically changed our way of life. There have been several attempts to teach elementary statistics on the web, but most of them are based on commercial products. The need for statistical data analysis, and for decision making based on such analyses, is growing. In this article we show one way of reaching that goal by using the server-side scripting language PHP3 together with an extra graphical module and a statistical distribution module on the web. We present some elementary exploratory graphical data analysis and statistical inference. There is plenty of room for improvement to make this a full-blown statistical analysis tool on the web in the near future. All the programs and databases used in our article are public programs. The main engine, PHP3, is included as an Apache web server module, so it is very light and fast. Processing speed will improve further when PHP4 (Zend) is officially released.
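For consistency with the other sketches here, the server-side idea is shown below in Python rather than PHP3: a tiny HTTP handler that computes a rough 95% interval for numbers passed in the query string and returns HTML. The handler, port, and normal approximation are illustrative assumptions, not the article's PHP3 code.

```python
# Sketch: server-side statistical inference returning an HTML result, e.g. GET /?x=1&x=2&x=3
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs
from statistics import mean, stdev

class StatHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        qs = parse_qs(urlparse(self.path).query)
        xs = [float(v) for v in qs.get("x", ["1", "2", "3", "4"])]
        m, s, n = mean(xs), stdev(xs), len(xs)
        half = 1.96 * s / n ** 0.5          # normal approximation for brevity
        body = (f"<html><body>mean = {m:.3f}, "
                f"approx. 95% CI = ({m - half:.3f}, {m + half:.3f})</body></html>")
        self.send_response(200)
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.end_headers()
        self.wfile.write(body.encode("utf-8"))

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), StatHandler).serve_forever()
```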

Goodness-of-fit tests based on generalized Lorenz curve for progressively Type II censored data from a location-scale distribution

  • Lee, Wonhee; Lee, Kyeongjun
    • Communications for Statistical Applications and Methods / v.26 no.2 / pp.191-203 / 2019
  • The problem of examining how well an assumed distribution fits the data of a sample is significant and must be addressed prior to any inferential process. In reliability and life-testing studies, the observed failure-time data of items are often not wholly available. Lowering the expense and duration of tests is important in statistical testing with censored data. Goodness-of-fit tests for complete data can no longer be used when the observed failure-time data are progressively Type II censored (PC) data. Therefore, we propose goodness-of-fit test statistics and a graphical method based on the generalized Lorenz curve for PC data from a location-scale distribution. The power of the proposed tests is assessed through Monte Carlo simulations. Finally, we analyze two real data sets for illustrative purposes.
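A complete-sample version of the generalized-Lorenz-curve comparison can be sketched as below (the paper's statistics additionally account for the progressively Type II censored order statistics); the normal null, sample size, and Monte Carlo settings are illustrative choices, not the paper's.

```python
# Sketch: compare the generalized Lorenz (GL) curve of a standardized sample with the
# standard normal GL curve, GL(p) = -phi(Phi^{-1}(p)), and calibrate the max gap by Monte Carlo.
import numpy as np
from scipy.stats import norm

def gl_stat(x):
    """Max gap between the sample GL curve and the standard normal GL curve."""
    n = len(x)
    z = np.sort((x - x.mean()) / x.std(ddof=1))
    gl_emp = np.cumsum(z) / n                 # empirical GL at p = i/n
    p = np.arange(1, n + 1) / n
    gl_model = -norm.pdf(norm.ppf(p))         # normal GL in closed form
    return np.max(np.abs(gl_emp - gl_model))

rng = np.random.default_rng(0)
null_stats = [gl_stat(rng.normal(size=40)) for _ in range(1000)]
crit = np.quantile(null_stats, 0.95)          # Monte Carlo critical value

x_obs = rng.exponential(size=40)              # clearly non-normal data
print(f"statistic = {gl_stat(x_obs):.3f}, 5% critical value = {crit:.3f}")
```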