• Title/Summary/Keyword: Complexity measure


Measurement of Classes Complexity in the Object-Oriented Analysis Phase (객체지향 분석 단계에서의 클래스 복잡도 측정)

  • Kim, Yu-Kyung; Park, Jai-Nyun
    • Journal of KIISE: Software and Applications, v.28 no.10, pp.720-731, 2001
  • Complexity metrics developed for the structured paradigm of software development are not suitable for the object-oriented (OO) paradigm, because they do not support key OO concepts such as inheritance, polymorphism, message passing, and encapsulation. There has been much research on OO software metrics, such as program complexity and design metrics. However, metrics that measure the complexity of classes in the OO analysis phase are needed because they provide earlier feedback to the development project, and earlier feedback means more effective development and less costly maintenance. In this paper, we propose new metrics to measure the complexity of the analysis classes drawn out in the analysis phase of RUP (Rational Unified Process). The collaboration complexity, denoted CC, is the maximum number of collaborations that can be achieved with each collaborator and determines the potential complexity. The interface complexity, denoted IC, expresses the difficulty of understanding the collaborators' interfaces. We verify the suggested metrics theoretically against Weyuker's nine properties. Moreover, we show computation results for the analysis classes of a system that automatically responds to user questions using text mining techniques. Comparing CC with the CBO and WMC metrics suggested by Chidamber and Kemerer, classes with high values of the proposed metric also maintain high complexity in the design phase, and complexity is represented better by CC and IC than by CBO and WMC. We expect that our metrics provide earlier feedback and thus make it possible to predict the effort, cost, and time required for the remaining processes. As a result, we expect to develop cost-effective OO software by reviewing the complexity of analysis classes in the first stage of the SDLC (Software Development Life Cycle).
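
As a rough illustration of how such a metric could be computed, the sketch below assumes CC is the maximum number of collaborations tallied per collaborator class; the abstract does not spell out the exact counting rules, so the function and data are hypothetical.

```python
# Hypothetical illustration of the collaboration complexity (CC) idea:
# tally the collaborations an analysis class has with each collaborator
# and take the maximum as its potential complexity. The counting rule is
# assumed, not taken from the paper.

def collaboration_complexity(collaborations: dict) -> int:
    """collaborations maps each collaborator class name to the number of
    distinct collaborations the measured class has with it."""
    return max(collaborations.values(), default=0)

# Example: an analysis class collaborating with three other classes.
print(collaboration_complexity({"Order": 3, "Customer": 1, "Inventory": 2}))  # -> 3
```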


Complexity Metrics for Analysis Classes in the Unified Software Development Process (Unified Process의 분석 클래스에 대한 복잡도 척도)

  • 김유경; 박재년
    • The KIPS Transactions: Part D, v.8D no.1, pp.71-80, 2001
  • Object-oriented (OO) methodology, which uses concepts such as encapsulation, inheritance, polymorphism, and message passing, demands metrics different from those of structured methodology. There have been many studies of OO software metrics, such as program complexity and design metrics. However, metrics for analysis classes are needed to reduce complexity in the analysis phase and thereby greatly reduce the effort and cost of system development. In this paper, we propose new metrics to measure the complexity of the analysis classes drawn out in the analysis phase of the Unified Process. The collaboration complexity, denoted CC, is the maximum number of collaborations that can be achieved with each collaborator and determines the potential complexity. The interface complexity, denoted IC, expresses the difficulty of understanding the collaborators' interfaces. We prove mathematically that the suggested metrics satisfy OO characteristics such as class size and inheritance, and we verify them theoretically against Weyuker's nine properties. Moreover, we show computation results for the analysis classes of a system that automatically responds to user questions using text mining techniques. Comparing CC and IC with CBO and WMC, complexity is represented better by CC and IC than by CBO and WMC. We expect to develop cost-effective OO software by reviewing the complexity of analysis classes in the first stage of the SDLC (Software Development Life Cycle).
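
For the companion metric IC, a similarly hedged sketch is possible, assuming purely for illustration that interface difficulty grows with the number of a collaborator's operations and their parameters:

```python
# Hypothetical illustration of the interface complexity (IC) idea:
# aggregate the size of each collaborator's interface (operations plus
# their parameters) as a proxy for how hard it is to understand.
# The aggregation rule is an assumption, not the paper's definition.

def interface_complexity(interfaces: dict) -> int:
    """interfaces maps each collaborator to a list holding the parameter
    count of every operation it exposes to the measured class."""
    return sum(1 + n_params for ops in interfaces.values() for n_params in ops)

# Two collaborators: one exposing operations with 2 and 0 parameters,
# the other a single operation with 1 parameter.
print(interface_complexity({"Order": [2, 0], "Customer": [1]}))  # -> 6
```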


3D Vision-Based Local Path Planning System of a Humanoid Robot for Obstacle Avoidance

  • Kang, Tae-Koo; Lim, Myo-Taeg; Park, Gwi-Tae; Kim, Dong W.
    • Journal of Electrical Engineering and Technology, v.8 no.4, pp.879-888, 2013
  • This paper addresses a vision-based local path planning system for obstacle avoidance. To handle obstacles that lie beyond the field of view (FOV), we propose a Panoramic Environment Map (PEM) using the MDGHM-SIFT algorithm. Moreover, we propose a Complexity Measure (CM) and a Fuzzy logic-based Avoidance Motion Selection (FAMS) system to enable a humanoid robot to automatically decide its own direction and walking motion when avoiding an obstacle. The CM automates the decision on the avoidance direction, whereas the FAMS system chooses the avoidance path and walking motion based on environmental conditions such as the size of the obstacle and the available space around it. The proposed system was applied to a humanoid robot that we designed. The experimental results show that the proposed method can be effectively applied to decide the avoidance direction and the walking motion of a humanoid robot.
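
The abstract does not give the CM formula or the fuzzy rule base, so the sketch below only illustrates the flow it describes: score each candidate direction from the panoramic map, steer toward the less complex side, and pick a walking motion from the available free space. The occupancy-grid encoding and all thresholds are assumptions.

```python
import numpy as np

def complexity_measure(occupancy: np.ndarray) -> tuple:
    """occupancy: 2-D panoramic grid, 1 = obstacle cell, 0 = free cell.
    Returns an assumed per-side complexity: the obstacle density of the
    left and right halves of the map."""
    half = occupancy.shape[1] // 2
    return occupancy[:, :half].mean(), occupancy[:, half:].mean()

def select_motion(free_space_m: float) -> str:
    # Stand-in for the fuzzy rule base: wide gaps allow side-stepping,
    # narrow gaps force the robot to turn before walking. Threshold assumed.
    return "side_step" if free_space_m > 0.6 else "turn_then_walk"

grid = np.zeros((4, 8))
grid[:, 5:] = 1                                    # obstacles on the right half
left_cm, right_cm = complexity_measure(grid)
direction = "left" if left_cm < right_cm else "right"
print(direction, select_motion(free_space_m=0.8))  # -> left side_step
```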

Luminance Projection Model for Efficient Video Similarity Measure (효율적인 비디오 유사도 측정을 위한 휘도 투영모델)

  • Kim, Sang-Hyun
    • Journal of the Institute of Convergence Signal Processing, v.10 no.2, pp.132-135, 2009
  • The video similarity measure is a very important factor for indexing and retrieving video data. In this paper, we propose a luminance projection model to measure video similarity efficiently. Most algorithms for video indexing have commonly used histograms, edges, or motion features, whereas the proposed algorithm employs an efficient measure based on luminance projection. To index video sequences effectively and to decrease the computational complexity, we calculate video similarity using key frames extracted by a cumulative measure and compare the sets of key frames using the modified Hausdorff distance. Experimental results show that the proposed luminance projection model yields remarkable accuracy and better performance than conventional algorithms.
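
The modified Hausdorff distance (in the sense of Dubuisson and Jain) between two sets of key-frame features can be sketched as follows; the luminance-projection feature layout is an assumption, since the abstract does not fix it.

```python
import numpy as np

def luminance_projection(frame: np.ndarray) -> np.ndarray:
    # Assumed feature: mean luminance projected onto columns and rows.
    return np.concatenate([frame.mean(axis=0), frame.mean(axis=1)])

def modified_hausdorff(A: np.ndarray, B: np.ndarray) -> float:
    """A, B: arrays of feature vectors, one per key frame.
    MHD = max of the two directed average nearest-neighbor distances."""
    d = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=2)
    return max(d.min(axis=1).mean(), d.min(axis=0).mean())

rng = np.random.default_rng(0)
clip_a = np.stack([luminance_projection(rng.random((8, 8))) for _ in range(3)])
clip_b = np.stack([luminance_projection(rng.random((8, 8))) for _ in range(4)])
print(modified_hausdorff(clip_a, clip_b))  # dissimilarity of the two clips
```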


Model Parameter-based Rate Control Algorithm for Constant Quality Real-Time Video Coding (실시간 부호화를 위한 모델 파라미터 기반 일정 화질 비트율 제어 기법)

  • Jeong, Jin-Woo; Cho, Kyung-Min; Choe, Yoon-Sik
    • Journal of the Institute of Electronics Engineers of Korea SP, v.45 no.3, pp.93-102, 2008
  • In this paper, we propose a rate control algorithm for constant-quality real-time video coding. To achieve constant quality, previous algorithms exploit the mean absolute difference (MAD) as a measure of frame complexity. However, if the scene changes abruptly or the quantization parameter is not constant, the encoder produces varying output bits for the same MAD, so MAD does not appropriately reflect the characteristics of a frame. To solve this problem, we exploit the model parameter as the measure of frame complexity. Because the model parameter is the slope between output bits and MAD, it correctly reflects the complexity of a frame. The previous model, the R-MAD model, does not consider the quantization parameter, so the model parameter of a frame varies as the quantization parameter increases or decreases, and a model parameter obtained with the previous model cannot reflect the internal characteristics of the video. We solve this problem with the proposed model, which takes the quantization parameter into account. Experimental results show that our algorithm provides better performance in terms of quality smoothness than previous algorithms; especially when the scene changes abruptly, our algorithm alleviates quality drops.
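
A minimal sketch of the idea, assuming a one-parameter model of the form bits ≈ K·MAD/QP (the paper's exact QP-aware model is not given in the abstract): the slope K is estimated from an encoded frame and reused to predict the bits of upcoming frames.

```python
# Hedged sketch: the model parameter K relates output bits to MAD while
# accounting for the quantization parameter QP. The functional form
# bits = K * MAD / QP is an assumption chosen for illustration.

def estimate_model_parameter(bits: int, mad: float, qp: int) -> float:
    """Solve bits = K * mad / qp for K from one encoded frame."""
    return bits * qp / mad

def predict_bits(k: float, mad: float, qp: int) -> float:
    return k * mad / qp

k = estimate_model_parameter(bits=48_000, mad=12.5, qp=28)
print(predict_bits(k, mad=20.0, qp=28))  # a more complex frame needs more bits
```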

MEASURING THE INFLUENCE OF TASK COMPLEXITY ON HUMAN ERROR PROBABILITY: AN EMPIRICAL EVALUATION

  • Podofillini, Luca; Park, Jinkyun; Dang, Vinh N.
    • Nuclear Engineering and Technology, v.45 no.2, pp.151-164, 2013
  • A key input for the assessment of Human Error Probabilities (HEPs) with Human Reliability Analysis (HRA) methods is the evaluation of the factors influencing human performance (often referred to as Performance Shaping Factors, PSFs). In general, the definition of these factors and the supporting guidance are such that their evaluation involves significant subjectivity. This affects the repeatability of HRA results as well as the collection of HRA data for model construction and verification. In this context, the present paper considers the TAsk COMplexity (TACOM) measure, developed by one of the authors to quantify the complexity of procedure-guided tasks (performed by the operating crew of nuclear power plants in emergency situations), and evaluates its use to represent, objectively and quantitatively, the task complexity issues relevant to HRA methods. In particular, TACOM scores are calculated for five Human Failure Events (HFEs) for which empirical evidence on the HEPs (albeit with large uncertainty) and influencing factors is available from the International HRA Empirical Study. The empirical evaluation has shown promising results: the TACOM score increases as the empirical HEP of the selected HFEs increases. Except for one case, TACOM scores are well distinguished when they relate to different difficulty categories (e.g., "easy" vs. "somewhat difficult"), while values corresponding to tasks within the same category are very close. Despite some important limitations related to the small number of HFEs investigated and the large uncertainty in their HEPs, this paper presents one of the few attempts to empirically study the effect of a performance shaping factor on the human error probability. This type of study is important to enhance the empirical basis of HRA methods, to make sure that 1) the definitions of the PSFs cover the influences important for HRA (i.e., those influencing the error probability), and 2) the quantitative relationships among PSFs and error probability are adequately represented.
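
As a purely illustrative sketch of a composite complexity score of this kind, a weighted Euclidean combination of sub-measures can be computed as below; the sub-measure names, values, and weights are placeholders, not TACOM's published definition.

```python
import math

def composite_complexity(sub_scores: dict, weights: dict) -> float:
    # Weighted Euclidean combination of task-complexity sub-measures;
    # the combination rule itself is an assumption for illustration.
    return math.sqrt(sum(weights[k] * sub_scores[k] ** 2 for k in sub_scores))

subs = {"step_info": 3.1, "step_logic": 2.4, "step_size": 1.8}  # placeholder values
w = {"step_info": 0.4, "step_logic": 0.4, "step_size": 0.2}     # placeholder weights
print(round(composite_complexity(subs, w), 3))
```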

A study on extraction of the frames representing each phoneme in continuous speech (연속음에서의 각 음소의 대표구간 추출에 관한 연구)

  • 박찬응; 이쾌희
    • Journal of the Korean Institute of Telematics and Electronics B, v.33B no.4, pp.174-182, 1996
  • In a continuous speech recognition system, it is possible to implement a system that can handle an unlimited number of words by using a limited number of phonetic units such as phonemes. Dividing continuous speech into a string of phoneme terms prior to the recognition process can lower the complexity of the system, but because of coarticulation between neighboring phonemes, it is very difficult to extract their boundaries exactly. In this paper, we propose an algorithm to extract short terms that can represent each phoneme instead of extracting their boundaries. Short terms of lower spectral change and of higher spectral change are detected. Phoneme changes are then detected by applying a distance measure to the lower-spectral-change terms, while higher-spectral-change terms are regarded as transition terms or short phoneme terms. Finally, the lower-spectral-change terms and the mid-terms of the higher-spectral-change terms are taken to represent each phoneme. Cepstral coefficients and the weighted cepstral distance are used as the speech feature and the distance measure because of their low computational complexity, and the speech data used in this experiment were recorded in silent and ordinary indoor environments. The experimental results show that the proposed algorithm achieves higher performance with less computational complexity than conventional segmentation algorithms, and it can be usefully applied to phoneme-based continuous speech recognition.
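
A minimal sketch of the weighted cepstral distance used to score frame-to-frame spectral change is shown below; the linear quefrency weighting w_k = k is one common choice and is assumed here rather than taken from the paper.

```python
import numpy as np

def weighted_cepstral_distance(c1: np.ndarray, c2: np.ndarray) -> float:
    k = np.arange(1, len(c1) + 1)              # assumed quefrency weights w_k = k
    return float(np.sum((k * (c1 - c2)) ** 2))

def spectral_change(frames: np.ndarray) -> np.ndarray:
    """frames: (n_frames, n_ceps) cepstral coefficients per frame.
    Low-change runs would be kept as representative terms; high-change
    runs would be treated as transitions between phonemes."""
    return np.array([weighted_cepstral_distance(frames[i], frames[i + 1])
                     for i in range(len(frames) - 1)])

rng = np.random.default_rng(1)
print(spectral_change(rng.random((5, 12))))    # 4 frame-to-frame change scores
```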


Measurement of program volume complexity using fuzzy self-organizing control (퍼지 적응 제어를 이용한 프로그램 볼륨 복잡도 측정)

  • 김재웅
    • Journal of the Korea Computer Industry Society, v.2 no.3, pp.377-388, 2001
  • Software metrics provide effective methods for characterizing software. Metrics have traditionally been composed through the definition of an equation, but this approach is restricted by the lack of a full understanding of all the interrelationships among the parameters. This paper uses a fuzzy logic system, which is capable of uniformly approximating any nonlinear function, and applies cognitive psychology theory. First, we extract a multiple regression equation from the factors of 12 software complexity metrics collected from Java programs. We then apply cognitive psychology theory to the program volume factor and measure program volume complexity by executing fuzzy learning. This approach is sound and serves as the groundwork for further exploration into the analysis and design of software metrics.
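
The program volume factor is commonly Halstead's volume V = N·log2(n); assuming that definition, a minimal sketch of the quantity fed into the fuzzy model is:

```python
import math

# Halstead's program volume: N is the total number of operator and
# operand occurrences, n the number of distinct ones. Assumed here to
# be the paper's volume measure, since the abstract does not define it.

def halstead_volume(operators: list, operands: list) -> float:
    N = len(operators) + len(operands)
    n = len(set(operators)) + len(set(operands))
    return N * math.log2(n)

ops = ["=", "+", "=", "*"]
opnds = ["a", "b", "a", "c", "2"]
print(round(halstead_volume(ops, opnds), 2))  # 9 * log2(7) ~= 25.27
```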


A Study on Measurement Model of the Physical Complexity of Facade Design of Building on Street (경관 가이드라인 설정을 위한 가로변 건축물 외관디자인의 물리적 복합성 측정에 관한 연구)

  • 유창균; 이석주; 조용준
    • Journal of the Korean Housing Association, v.14 no.1, pp.87-94, 2003
  • As an important element of the city streetscape, the facade design of buildings is generally very significant. However, because a building also serves private purposes as personal property, it is not easy to intervene in its design or to establish reasonable and scientific standards of harmony without active acceptance and understanding of that fact. Therefore, for desirable streetscape planning, it is indispensable to know how to closely examine the visual harmony of the existing buildings along each street and how to realize that harmony. In this respect, this study examines and verifies the feasibility, in our present streetscape situation, of experimentally applying Y. Elsheshtawy's model (1997), a quantitative index of street buildings interpreted through Gestalt theory, in order to lay the foundation for institutions and standards of building design that contribute to visual and spatial harmony in our street space without damaging private property. From the results, we can confirm that the physical complexity measure, whose schema arose in a socially and culturally different context from ours, is applicable to our actual streetscape to some extent.

Changes in the Recognition Rate of Kodály Learning Devices using Machine Learning (머신러닝을 활용한 코다이 학습장치의 인식률 변화)

  • YunJeong LEE; Min-Soo KANG; Dong Kun CHUNG
    • Journal of Korea Artificial Intelligence Association, v.2 no.1, pp.25-30, 2024
  • Kodály hand signs are symbols that intuitively represent pitch and note names through the shape and height of the hand. They are an excellent tool that can be easily expressed with the human body, making them highly engaging for children who are new to music. Traditional hand signs help beginners easily understand pitch and significantly aid music learning and performance. Kodály hand signs also have distinctive features, such as the ability to indicate key changes or chords using both hands and to clearly represent accidentals, which enable their effective use. In this paper, we investigate how the recognition rate changes with the complexity of scales by building a device for learning Kodály hand signs, teaching simple Do-Re-Mi scales, and then gradually increasing the complexity to complex scales and children's songs (such as "Mary Had a Little Lamb"). The learning device utilizes an accelerometer and bending sensors: the accelerometer detects the tilt of the hand, while the bending sensors detect the degree of bending of the fingers. The accelerometer is a 6-axis unit that can also measure angular velocity, ensuring accurate data collection. The learning and performance evaluation of the Kodály learning device were conducted using Python.
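
As a hypothetical sketch of the recognition step, the snippet below trains a scikit-learn classifier on feature vectors holding the six IMU axes plus five finger-bend values; the feature layout, model choice, and random placeholder data are assumptions, not the paper's setup.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
X = rng.normal(size=(700, 11))     # 6 IMU axes + 5 bend sensors (placeholder data)
y = rng.integers(0, 7, size=700)   # Do..Ti hand-sign labels (placeholder)

# Hold out 20% of the samples to estimate the recognition rate.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print(f"recognition rate: {clf.score(X_te, y_te):.2f}")  # near chance on random data
```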