• Title/Summary/Keyword: Analysts

Search Result: 386

A Comparative Study on Factor Recovery of Principal Component Analysis and Common Factor Analysis (주성분분석과 공통요인분석에 대한 비교연구: 요인구조 복원 관점에서)

  • Jung, Sunho;Seo, Sangyun
    • The Korean Journal of Applied Statistics
    • /
    • v.26 no.6
    • /
    • pp.933-942
    • /
    • 2013
  • Common factor analysis and principal component analysis represent two technically distinct approaches to exploratory factor analysis. Much of the psychometric literature recommends common factor analysis over principal component analysis. Nonetheless, factor analysts use principal component analysis more frequently because they believe that, although it may yield (relatively) less accurate estimates of factor loadings than common factor analysis, it most often produces a similar pattern of factor loadings, leading to essentially the same factor interpretations. A simulation study is conducted to evaluate the relative performance of the two approaches in terms of factor pattern recovery under different experimental conditions of sample size, overdetermination, and communality. The results show that principal component analysis performs better in factor recovery with small sample sizes (below 200). This tendency is more prominent when there are a small number of variables per factor. The present results are of practical use for factor analysts in marketing and the social sciences.
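As an illustration of the comparison this abstract describes (not the paper's actual simulation design), the following sketch contrasts the loading patterns recovered by PCA and common factor analysis on synthetic two-factor data; the sample size, loading values, and noise level are illustrative assumptions:

```python
import numpy as np
from sklearn.decomposition import PCA, FactorAnalysis

rng = np.random.default_rng(0)
# Simulated data: 2 latent factors, 6 observed variables (3 per factor)
n = 150
factors = rng.normal(size=(n, 2))
loadings = np.array([[0.8, 0], [0.7, 0], [0.6, 0],
                     [0, 0.8], [0, 0.7], [0, 0.6]])
X = factors @ loadings.T + 0.5 * rng.normal(size=(n, 6))

pca = PCA(n_components=2).fit(X)
fa = FactorAnalysis(n_components=2).fit(X)

# PCA "loadings" = components scaled by sqrt of the explained variances
pca_load = pca.components_.T * np.sqrt(pca.explained_variance_)
fa_load = fa.components_.T  # common-factor loadings

print(np.round(pca_load, 2))
print(np.round(fa_load, 2))
```

With a well-determined structure like this, the two loading matrices typically show the same pattern of large and small entries, which is the similarity of interpretation the abstract refers to.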

Development of the Financial Account Pre-screening System for Corporate Credit Evaluation (분식 적발을 위한 재무이상치 분석시스템 개발)

  • Roh, Tae-Hyup
    • The Journal of Information Systems
    • /
    • v.18 no.4
    • /
    • pp.41-57
    • /
    • 2009
  • Although financial information greatly influences the decisions of the groups that use it, detecting management fraud and earnings manipulation is a difficult task using normal audit procedures and corporate credit evaluation processes, due to the shortage of knowledge concerning the characteristics of management fraud and the limitations of time and cost. These limitations suggest the need for a systematic process for effectively assessing the risk of earnings manipulation for credit evaluators, external auditors, financial analysts, and regulators. Most research on management fraud has examined how various characteristics of a company's management affect the occurrence of corporate fraud. This study examines the financial characteristics of companies engaged in fraudulent financial reporting and suggests a model and system for detecting GAAP violations to improve the reliability of accounting information and the transparency of management. Since the detection of management fraud has little proven theory, this study used outlier detection (upper and lower bounds) on financial ratios as a real-field application. The strength of the outlier detection method is its ease of use and understandability. In the suggested model, 14 variables from 7 useful variable categories among 76 financial ratio variables are examined through distribution analysis as possible indicators of fraudulent financial statement accounts. The model developed from these variables shows an 80.82% hit ratio for the holdout sample. The model was implemented as a financial outlier detection system for a financial institution. External auditors, financial analysts, regulators, and other users of financial statements might use this model to pre-screen potential earnings manipulators in the credit evaluation system. In particular, this model will help loan evaluators at financial institutions assign more objective and effective credit ratings and improve the quality of financial statements.
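The upper/lower-bound outlier screening the abstract describes can be sketched as follows; the percentile thresholds and sample ratios are illustrative assumptions, not the paper's fitted bounds:

```python
import numpy as np

def flag_outliers(ratios, lower_pct=5, upper_pct=95):
    """Flag financial-ratio values falling outside percentile-based
    upper/lower bounds (a simple pre-screening rule)."""
    lo, hi = np.percentile(ratios, [lower_pct, upper_pct])
    return (ratios < lo) | (ratios > hi)

# Hypothetical values of one financial ratio across firms
ratios = np.array([0.8, 1.1, 0.9, 1.0, 5.2, 1.05, 0.95, -2.0])
flags = flag_outliers(ratios)
print(flags)  # the extreme values 5.2 and -2.0 are flagged
```

A production system would compute such bounds per ratio category from a reference population and combine the flags into an overall pre-screening score.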

An Elementary-Function-Based Refinement Method for Use Cases to Improve Reliability of Use Case Points (유스케이스 점수 측정의 신뢰도 향상을 위한 단위기능 중심의 유스케이스 정제 방법)

  • Heo, Ryoung;Seo, Young-Duk;Baik, Doo-Kwon
    • Journal of KIISE
    • /
    • v.42 no.9
    • /
    • pp.1117-1123
    • /
    • 2015
  • The Use Case Points method is a software estimation method based on user requirements. When requirement analysts elicit user requirements, they obtain different use cases because different levels of detail are possible for a use case, and this affects the Use Case Points. In this paper, we suggest a method to refine the level of detail of use cases by using the concept of an elementary function. This refinement method achieves the desired reliability for the Use Case Points because it produces less deviation in the Use Case Points across different requirement analysts than other methods based on the steps, transactions, and narrative of the use case.
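For context, the standard Use Case Points calculation the abstract builds on can be sketched as below; the actor/use-case classifications and the technical and environmental factors in the example are made-up inputs:

```python
# Standard Use Case Points weights: actors 1/2/3, use cases 5/10/15
ACTOR_W = {"simple": 1, "average": 2, "complex": 3}
UC_W = {"simple": 5, "average": 10, "complex": 15}

def use_case_points(actors, use_cases, tcf=1.0, ecf=1.0):
    """UCP = (UAW + UUCW) * TCF * ECF, where UAW/UUCW are the summed
    actor and use-case weights and TCF/ECF are adjustment factors."""
    uaw = sum(ACTOR_W[a] for a in actors)
    uucw = sum(UC_W[u] for u in use_cases)
    return (uaw + uucw) * tcf * ecf

ucp = use_case_points(["simple", "complex"],
                      ["average", "average", "complex"],
                      tcf=0.95, ecf=1.02)
print(ucp)
```

Because UUCW depends directly on how finely use cases are decomposed, two analysts splitting the same requirements differently get different totals, which is exactly the deviation the refinement method targets.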

A Study on Quality Control of Inorganic Acids using Ion Chromatograph (이온크로마토그래피를 활용한 무기산류 정도관리 방법 연구)

  • Park, Hae Dong;Park, Seung-Hyun;Jung, Kihyo
    • Journal of Korean Society of Occupational and Environmental Hygiene
    • /
    • v.31 no.1
    • /
    • pp.22-30
    • /
    • 2021
  • Objectives: The objectives of this study were to develop a quality control protocol for inorganic acids using ion chromatography and to evaluate the analytical proficiency of the legally designated agencies. Methods: This study prepared inorganic acid samples by injecting three anion certified solutions (chloride, nitrate, and sulfate) onto quartz filters. To investigate the storage stability and concentration consistency of the samples, 240 samples for each anion were tested at weeks 0, 2, 4, 8, 12, and 16 while stored at 4℃ and 25℃. To evaluate analytical proficiency, two separate tests were administered to six skilled analysts and 46 analysts affiliated with legally designated agencies. Results: Average recoveries of the three ions after 16 weeks of storage were fairly high (over 95%). In addition, average recoveries after 16 weeks of storage at low temperature (chloride = 97%, nitrate = 96%, and sulfate = 103%) were relatively higher than those at room temperature (94%, 93%, and 98%). The coefficients of variation (CV) for the three ions were less than 5%, except for the sulfate sample at 5.56 ㎍ (CV = 12.4%). The average ratios of the concentration values analyzed by the legally designated agencies to the injected concentrations were close to 1. However, their CVs were relatively larger (chloride ≤ 49%, nitrate ≤ 14%, and sulfate ≤ 28%), which implies a need for quality control. Conclusions: The quality control protocol used in this study for the three inorganic acids can be utilized in quality control for ion chromatography.
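The recovery and coefficient-of-variation statistics reported in the abstract are computed as below; the measured values in the example are hypothetical, not the study's data:

```python
import numpy as np

def recovery_and_cv(measured, injected):
    """Percent recovery (mean measured / injected) and coefficient of
    variation (sample SD / mean) for a set of QC samples."""
    mean = np.mean(measured)
    rec = 100.0 * mean / injected
    cv = 100.0 * np.std(measured, ddof=1) / mean
    return rec, cv

# Hypothetical replicate measurements of a 5.56 ug injection
measured = np.array([5.4, 5.6, 5.3, 5.7])
rec, cv = recovery_and_cv(measured, injected=5.56)
print(f"recovery = {rec:.1f}%, CV = {cv:.1f}%")
```

Under a protocol like the one described, recoveries near 100% and CVs under about 5% would indicate acceptable stability and consistency.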

ChatGPT-based Software Requirements Engineering (ChatGPT 기반 소프트웨어 요구공학)

  • Jongmyung Choi
    • Journal of Internet of Things and Convergence
    • /
    • v.9 no.6
    • /
    • pp.45-50
    • /
    • 2023
  • In software development, the elicitation and analysis of requirements is a crucial phase, and it involves considerable time and effort due to the involvement of various stakeholders. ChatGPT, a large language model trained on a diverse array of documents, possesses not only the ability to generate code and perform debugging but also the capability to be utilized in software analysis and design. This paper proposes a requirements engineering method that leverages ChatGPT's capabilities for eliciting software requirements, analyzing them to align with system goals, and documenting them in the form of use cases. It suggests that stakeholders, analysts, and ChatGPT should engage in a collaborative model, in which the outputs of ChatGPT serve as initial requirements that are then reviewed and augmented by analysts and stakeholders. As ChatGPT's capability improves, it is anticipated that the accuracy of requirements elicitation and analysis will increase, leading to time and cost savings in software requirements engineering.
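One way the elicitation step in such a collaborative model could look is sketched below; this is an illustrative prompt template only (the function name, wording, and the choice of model/API are assumptions, not the paper's method):

```python
def elicitation_prompt(system_goal, stakeholder_notes):
    """Build a prompt asking an LLM to draft candidate use cases from raw
    stakeholder input, for later review by analysts and stakeholders."""
    return (
        f"System goal: {system_goal}\n"
        f"Stakeholder notes:\n{stakeholder_notes}\n"
        "Draft candidate requirements as use cases (actor, goal, main flow), "
        "and flag any ambiguities or conflicts for analyst review."
    )

prompt = elicitation_prompt(
    "online library loans",
    "- users want to renew books by phone\n- staff need overdue reports",
)
print(prompt)
```

The LLM's draft would then serve as the initial requirements artifact that analysts refine, matching the review-and-augment loop the abstract describes.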

A Study on the Utilization of Artificial Intelligence-Based Infringement Analysis Tools (인공지능 기반 침해분석 도구 활용에 관한 연구)

  • Yang Hwan Seok
    • Convergence Security Journal
    • /
    • v.24 no.2
    • /
    • pp.3-8
    • /
    • 2024
  • Recently, cyber threats have increased in number and complexity, and these threats increase the risk of using personally owned devices for work. This research addresses how to utilize an AI-based breach analysis tool. To this end, we developed such a tool and demonstrated its feasibility: it reduces the workload of analysts and improves analysis efficiency through an automated analysis process, allowing analysts to focus on more important tasks. The purpose of this research is to propose the development and utilization of an AI-based breach analysis tool. We propose a new research direction in the field of breach analysis and suggest that automated tools should be improved in performance, coverage, and ease of use to enable organizations to respond to cyberattacks more effectively. As a research method, we developed a breach analysis tool using AI technology and studied various use cases. We also evaluated the performance, coverage, and ease of use of automated tools, and conducted research on predicting, preventing, and automatically responding to breaches. This research will serve as a foundation for the development and utilization of AI-based breach analysis tools, which can be used to respond to cyberattacks more effectively.
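A minimal sketch of the kind of automated analysis such a tool might perform (not the paper's tool): anomaly detection over per-host telemetry with an off-the-shelf isolation forest; the feature choices and data are invented for illustration:

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(1)
# Hypothetical per-host features: [login failures/hr, outbound MB/hr]
normal = rng.normal([2, 50], [1, 10], size=(200, 2))
suspicious = np.array([[40, 900]])  # an obvious outlier host
X = np.vstack([normal, suspicious])

clf = IsolationForest(random_state=0).fit(X)
scores = clf.predict(X)  # -1 = anomalous, 1 = normal
print(scores[-1])
```

Automating this triage step is what frees analysts to concentrate on investigating the flagged hosts rather than scanning raw telemetry.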

A Study on Absorbing Boundaries for Wave Propagation in Semi-Infinite Elastic Media (반무한 영역에서의 탄성파 진행문제를 위한 흡수경계에 관한 연구)

  • 이종세
    • Proceedings of the Earthquake Engineering Society of Korea Conference
    • /
    • 2000.04a
    • /
    • pp.451-457
    • /
    • 2000
  • In many dynamic problems, such as foundation vibrations, ultrasonic nondestructive evaluation, and blasting, analysts are confronted with the problem of wave propagation in an infinite or semi-infinite medium. In order to simulate this situation with a finite analytical model, provisions must be made to absorb the stress waves arriving at the boundary. Absorbing boundaries are mathematical artifacts used to prevent wave reflections at the boundaries of discrete models for infinite media under dynamic loads. An analytical study is carried out to examine the effectiveness of the Lysmer-Kuhlemeyer model, one of the most widely used absorbing boundaries. The validity of the absorbing boundary conditions suggested by Lysmer and Kuhlemeyer is examined by adopting the solution of Ewing et al. to the problem of plane waves from a harmonic normal force on the surface of an elastic half-space. Ewing's problem is then numerically simulated using the finite element method on a semi-circular mesh, with and without absorbing boundaries represented by viscous dashpots. The absorption ratios are calculated by comparing the displacements at the absorbing boundaries to those in the free field without absorbing boundaries.
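The viscous dashpots of the Lysmer-Kuhlemeyer boundary are set from the wave speeds of the medium (normal dashpot c_n = ρ·V_p·A, tangential c_t = ρ·V_s·A); the material values in this sketch are illustrative, not those of the paper:

```python
import math

def lysmer_kuhlemeyer_dashpots(rho, E, nu, area):
    """Viscous dashpot coefficients for the Lysmer-Kuhlemeyer absorbing
    boundary: c_n = rho*Vp*A (normal), c_t = rho*Vs*A (tangential)."""
    G = E / (2 * (1 + nu))                    # shear modulus
    lam = E * nu / ((1 + nu) * (1 - 2 * nu))  # Lame's first parameter
    vp = math.sqrt((lam + 2 * G) / rho)       # P-wave speed
    vs = math.sqrt(G / rho)                   # S-wave speed
    return rho * vp * area, rho * vs * area

# Illustrative soil-like medium: rho in kg/m^3, E in Pa, tributary area in m^2
cn, ct = lysmer_kuhlemeyer_dashpots(rho=2000.0, E=50e6, nu=0.3, area=1.0)
print(cn, ct)
```

These coefficients exactly absorb plane waves arriving normal to the boundary, which is why the absorption ratio degrades for oblique incidence, as studies of this boundary examine.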

Hadoop and MapReduce (하둡과 맵리듀스)

  • Park, Jeong-Hyeok;Lee, Sang-Yeol;Kang, Da Hyun;Won, Joong-Ho
    • Journal of the Korean Data and Information Science Society
    • /
    • v.24 no.5
    • /
    • pp.1013-1027
    • /
    • 2013
  • As the need for large-scale data analysis is rapidly increasing, Hadoop, a platform for large-scale data processing, and MapReduce, Hadoop's internal computational model, are receiving great attention. This paper reviews the basic concepts of Hadoop and MapReduce necessary for data analysts who are familiar with statistical programming, through examples that combine the R programming language and Hadoop.
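The MapReduce model the abstract refers to can be illustrated with the classic word-count example; this is a single-process sketch of the map and reduce phases (the paper's own examples use R with Hadoop), not an actual Hadoop job:

```python
from collections import defaultdict
from itertools import chain

def mapper(line):
    """Map phase: emit a (word, 1) pair for every word in a line."""
    for word in line.split():
        yield word, 1

def reducer(pairs):
    """Reduce phase: sum the counts emitted for each key."""
    counts = defaultdict(int)
    for key, value in pairs:
        counts[key] += value
    return dict(counts)

lines = ["hadoop map reduce", "map reduce map"]
result = reducer(chain.from_iterable(mapper(l) for l in lines))
print(result)  # {'hadoop': 1, 'map': 3, 'reduce': 2}
```

On a cluster, Hadoop runs many mapper instances in parallel over file splits and shuffles the emitted pairs by key to the reducers; the programmer only supplies the two functions above.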

A Study on the Relative Importance of Underlying Competencies of Business Analysts

  • Park, Joon;Jeong, Seung Ryul
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.10 no.8
    • /
    • pp.3986-4007
    • /
    • 2016
  • Business analysis is a key factor in the success or failure of information systems projects. However, there are few studies on business analysis competencies. The objective of this paper is to identify which competencies a business analyst (BA) needs and to analyze the importance weights and priorities of business analysis competencies. A literature review yielded 6 competency dimensions and 30 competencies. Based on interviews with 12 experts and an analytic hierarchy process analysis, the relative importance weight and priority of each business analysis competency were analyzed. Moreover, an importance-perception gap between stakeholders in different positions was identified. These results can be used as selection and development criteria for superior BAs who are responsible for solving business problems using information systems solutions.
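The analytic hierarchy process step mentioned in the abstract derives importance weights from expert pairwise comparisons; the sketch below uses the geometric-mean method on a made-up 3x3 comparison matrix (the paper's matrices cover 6 dimensions and 30 competencies):

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights from an AHP pairwise comparison matrix
    (row geometric means, normalized to sum to 1)."""
    n = pairwise.shape[0]
    gm = np.prod(pairwise, axis=1) ** (1.0 / n)
    return gm / gm.sum()

# Hypothetical comparison of three competency dimensions:
# entry (i, j) = how much more important dimension i is than j
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
w = ahp_weights(A)
print(np.round(w, 3))
```

In an actual AHP study, a consistency ratio would also be checked before the weights of each expert are aggregated.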

Saliency Score-Based Visualization for Data Quality Evaluation

  • Kim, Yong Ki;Lee, Keon Myung
    • International Journal of Fuzzy Logic and Intelligent Systems
    • /
    • v.15 no.4
    • /
    • pp.289-294
    • /
    • 2015
  • Data analysts explore collections of data to search for valuable information using various techniques and tricks. "Garbage in, garbage out" is a well-recognized idiom that emphasizes the importance of data quality in data analysis. It is therefore crucial to validate data quality in the early stage of data analysis, and an effective method of evaluating the quality of data is hence required. In this paper, a method to visually characterize the quality of data using the notion of a saliency score is introduced. The saliency score is a measure comprising five indexes that capture certain aspects of data quality. Some experimental results are presented to show the applicability of the proposed method.
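As a rough illustration of index-based quality scoring of the kind the abstract describes (the paper's saliency score combines five specific indexes; the two toy indexes below are assumptions for illustration):

```python
import numpy as np

def quality_indexes(col):
    """Toy per-column data-quality indexes: fraction of missing values
    and fraction of duplicated values among the non-missing ones."""
    vals = np.asarray(col, dtype=float)
    missing = float(np.mean(np.isnan(vals)))
    finite = vals[~np.isnan(vals)]
    dup = 1.0 - len(np.unique(finite)) / max(len(finite), 1)
    return {"missing_rate": missing, "duplicate_rate": dup}

q = quality_indexes([1.0, 2.0, 2.0, np.nan])
print(q)
```

Plotting such per-column indexes (e.g., as a heatmap over the dataset) gives the kind of visual quality characterization the method aims at.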