• Title/Summary/Keyword: Requirements Normalization


A Technique for Requirements Normalization and Selection based on Practical Approach (사례 기반의 요구사항 정형화 및 선정 평가 기법)

  • Byun, Jung-Won; Rhew, Sung-Yul; Kim, Jin-Su
    • Journal of the Korea Society of Computer and Information / v.17 no.11 / pp.149-161 / 2012
  • In a customer-centered world in which internet and social network services are used actively, it is important that customers' needs be embraced into systems. Our study proposes a technique to normalize and select 1,800 customer needs at C company. In order to normalize the requirements, we specified them against a standard for requirements specification, identified a set of keywords for each requirement, and constructed a relation graph of the requirements. The graph presents the objectives for designing and building a system and shows the relative importance of each requirement. We then propose a technique for requirements selection according to contribution points, which are calculated from the relative degree of each requirement in the graph. We demonstrate our technique through a case study at C company.
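
The abstract does not give the contribution-point formula, so the following is only a minimal sketch under the assumption that a requirement's contribution is its relative degree (its share of relation endpoints) in the relation graph; the graph data and function names are illustrative.

```python
# Minimal sketch (not the authors' implementation): ranking requirements by
# "contribution points" taken here to be their relative degree in a relation graph.
from collections import defaultdict

def contribution_points(edges):
    """edges: iterable of (req_a, req_b) pairs meaning the two requirements are related."""
    degree = defaultdict(int)
    for a, b in edges:
        degree[a] += 1
        degree[b] += 1
    total = sum(degree.values()) or 1
    # Relative degree: each requirement's share of all relation endpoints.
    return {req: deg / total for req, deg in degree.items()}

relations = [("R1", "R2"), ("R1", "R3"), ("R2", "R3"), ("R3", "R4")]
ranked = sorted(contribution_points(relations).items(), key=lambda kv: -kv[1])
print(ranked)  # requirements with higher relative degree are selected first
```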

Study on Improving Learning Speed of Artificial Neural Network Model for Ammunition Stockpile Reliability Classification (저장탄약 신뢰성분류 인공신경망모델의 학습속도 향상에 관한 연구)

  • Lee, Dong-Nyok; Yoon, Keun-Sig; Noh, Yoo-Chan
    • Journal of the Korea Academia-Industrial cooperation Society / v.21 no.6 / pp.374-382 / 2020
  • The purpose of this study is to improve the learning speed of an artificial neural network model for ammunition stockpile reliability classification by proposing a normalization method that reduces the number of input variables, based on the characteristics of Ammunition Stockpile Reliability Program (ASRP) data, without loss of classification performance. Ammunition performance requirements are specified in the Korea Defense Specification (KDS) and the Ammunition Stockpile Reliability Test Procedure (ASTP). Based on the characteristics of ASRP data, the input variables can be normalized to estimate the lot percent nonconforming or the failure rate. To maintain the unitary hypercube condition of the input variables, the min-max normalization method is also used. The Area Under the ROC Curve (AUC) of both general min-max normalization and the proposed 2-step normalization is over 0.95, and the machine-learning speed-up on ASRP field data is 1.74 to 1.99 times, depending on the number of training data and hidden-layer nodes.
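
A minimal sketch of the min-max step mentioned above, which scales each input variable into the unit hypercube; the preceding step that maps raw ASRP variables to a lot percent nonconforming or failure rate is paper-specific and is not reproduced here.

```python
import numpy as np

# Min-max normalization: scale each column (input variable) into [0, 1].
def min_max_normalize(x, eps=1e-12):
    x = np.asarray(x, dtype=float)
    x_min, x_max = x.min(axis=0), x.max(axis=0)
    return (x - x_min) / (x_max - x_min + eps)

raw = np.array([[3.0, 200.0],
                [5.0, 180.0],
                [4.0, 260.0]])
print(min_max_normalize(raw))  # every feature now lies within the unit interval
```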

Feature-Oriented Requirements Change Management with Value Analysis (가치분석을 통한 휘처 기반의 요구사항 변경 관리)

  • Ahn, Sang-Im; Chong, Ki-Won
    • The Journal of Society for e-Business Studies / v.12 no.3 / pp.33-47 / 2007
  • Requirements change as development progresses, since it is impossible to define all software requirements up front. These changes lead to mistakes because developers cannot completely understand the software's structure and behavior, or cannot discover all the parts affected by a change. Requirement changes have to be managed and assessed to ensure that they are feasible, make economic sense, and contribute to the business needs of the customer organization. We propose a feature-oriented requirements change management method that manages requirements change with value analysis and feature-oriented traceability links, using features as an intermediate catalyst. Our approach offers two contributions to the study of requirements change: (1) we define a requirements change tree to generalize user change requests at the feature level; (2) we provide an overall process covering change request normalization, change impact analysis, solution selection for the change request, change request implementation, and change request evaluation. In addition, we present in detail the results of a case study carried out on an asset management portal system.
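
The following is an illustrative sketch only, not the authors' method: it shows the general shape of change impact analysis by following feature-oriented traceability links from a changed requirement, through features, down to design and code artifacts. The link data and names are hypothetical.

```python
# Hypothetical traceability links: requirement -> features -> artifacts.
from collections import deque

trace = {
    "REQ-7": ["FeatureLogin"],
    "FeatureLogin": ["LoginController", "SessionTable"],
    "LoginController": [],
    "SessionTable": [],
}

def impacted(changed, links):
    """Return every artifact reachable from the changed item via trace links."""
    seen, queue = set(), deque([changed])
    while queue:
        for nxt in links.get(queue.popleft(), []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

print(impacted("REQ-7", trace))  # {'FeatureLogin', 'LoginController', 'SessionTable'}
```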


REVIEW OF DYNAMIC LOADING J-R TEST METHOD FOR LEAK BEFORE BREAK OF NUCLEAR PIPING

  • Oh, Young-Jin; Hwang, Il-Soon
    • Nuclear Engineering and Technology / v.38 no.7 / pp.639-656 / 2006
  • In order to apply the leak-before-break (LBB) concept to nuclear piping systems, the dynamic strain aging effect of low carbon steel materials has to be taken into account, in compliance with the requirements of the Korean Standard Review Guide (KSRG) 3.6.3-1. To this end, J-R tests are needed over a range of temperatures and loading rates, including dynamic loading conditions. In the dynamic loading J-R test, the unloading compliance method cannot be applied to measure crack growth, and the direct current potential drop (DCPD) method has a problem defining the crack initiation point. The normalization method is known as a very useful way to determine the J-R curve under dynamic loading because it does not require additional equipment or complicated loading sequences such as electric current or unloading. This method was accepted by the American Society for Testing and Materials (ASTM) as standard test method E1820 Annex A15 in 2001. However, it has not yet been clearly verified whether the normalization method is sufficiently reliable to be applied to LBB. In this study, the basic background of the J-integral, LBB, and the dynamic loading J-R test is explained, and the current status of dynamic loading J-R test methods is reviewed from the viewpoint of LBB for nuclear piping. In particular, the theoretical and historical background of the normalization method, which has received attention recently, is summarized. Recent studies of this method are introduced, and future work is suggested that may improve the reliability of LBB for nuclear piping.
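
As a hedged illustration of the normalization method referenced above, the load in ASTM E1820 Annex A15 is normalized by a function of the current crack length (symbol conventions and details such as the exact form of the exponent vary with specimen geometry and standard revision):

\[
P_{Ni} \;=\; \frac{P_i}{W\,B\left(\dfrac{W - a_{bi}}{W}\right)^{\eta_{pl}}},
\qquad
v'_{pl,i} \;=\; \frac{v_i - P_i\,C_i(a_{bi})}{W},
\]

where \(P_i\) is the measured load, \(B\) and \(W\) are the specimen thickness and width, \(a_{bi}\) is the blunting-corrected crack length, and \(C_i\) is the elastic compliance. The normalized load versus normalized plastic displacement data are then fitted with a function of the form \(P_N = \dfrac{a + b\,v'_{pl} + c\,v'^{2}_{pl}}{d + v'_{pl}}\), and the crack length at each point of the test record is inferred by forcing the point onto this curve, so no unloading or electrical measurement is needed.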

A Model of Quality Function Deployment with Cost-Quality Tradeoffs (품질과 비용의 득실관계를 고려한 품질기능전개 모형)

  • 우태희; 박재현
    • Proceedings of the Safety Management and Science Conference / 2002.05a / pp.227-230 / 2002
  • This paper presents an analytic method of quality function deployment (QFD) that maximizes customer satisfaction subject to technical and economic constraints in process design. We have used Wasserman's normalization method and the analytic hierarchy process (AHP) to determine the intensity of the relationship between customer requirements and process design attributes. This paper also formulates a cost-quality model of the tradeoff between quality and cost as a linear program (LP) with new constraints that require designated special processes to be allocated first. The cost-quality function deployment of a piston ring is presented to illustrate the feasibility of these techniques.


A Model of Quality Function Deployment with Cost-Quality Tradeoffs (품질과 비용의 득실관계를 고려한 품질기능전개 모형)

  • 우태희; 박재현
    • Journal of the Korea Safety Management & Science / v.4 no.2 / pp.169-178 / 2002
  • This paper presents an analytic method of quality function deployment (QFD) that maximizes customer satisfaction subject to technical and economic constraints in process design. We have used Wasserman's normalization method and the analytic hierarchy process (AHP) to determine the intensity of the relationship between customer requirements and process design attributes. This paper also formulates a cost-quality model of the tradeoff between quality and cost as a linear program (LP) with new constraints that require designated special processes to be allocated first. The cost-quality function deployment of a piston ring is presented to illustrate the feasibility of these techniques.
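
Wasserman's normalization, cited in both versions of this abstract, adjusts the raw customer-requirement/design-attribute relationship values for correlations among the design attributes so that each row sums to one. A sketch of the commonly cited form (the notation here is mine, not the paper's):

\[
r^{\,\mathrm{norm}}_{ij} \;=\; \frac{\displaystyle\sum_{k=1}^{n} r_{ik}\,\gamma_{kj}}{\displaystyle\sum_{j=1}^{n}\sum_{k=1}^{n} r_{ik}\,\gamma_{kj}},
\]

where \(r_{ik}\) is the raw relationship between customer requirement \(i\) and design attribute \(k\), \(\gamma_{kj}\) is the correlation between design attributes \(k\) and \(j\), and \(n\) is the number of design attributes. The AHP-derived importance weights of the customer requirements can then be propagated through the normalized matrix without double-counting correlated attributes.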

Requirements Trace Table Expansion and How to Normalization (요구사항추적테이블의 확장 및 정규화 방안)

  • Kim, Ju-Young; Rhew, Sung-Yul
    • The KIPS Transactions: Part D / v.16D no.2 / pp.201-212 / 2009
  • There are various methods to trace outputs in software development in order to verify the consistency and completeness of requirements. Existing studies present trace meta-models or automated tools, but fail to list specifically the traced outputs or traced items. Also, with regard to trace tables, which contain the traced items, existing studies do not consider the whole software development process but merely a sub-process of it. Given this, the present study suggests an extended requirements trace table that tracks outputs from the inception of the project through the architectural design phase to application delivery, following up on the researchers' previous study of a trace table that covered only a sub-process of the development process. In addition, to address the problem of tracing becoming complicated as the number of trace fields grows in the extended table, we suggest a method for normalizing the requirements trace table so that it can be integrated and separated by development process phase. We applied it to the H system development project of D company; this case study verified the applicability of the approach and confirmed that tracing makes it easy to find errors in requirements. This study improves the precision of tracing used to verify the consistency and completeness of requirements, and will help minimize software development failures.
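
The column names below are assumptions, not the trace fields defined in the paper; this is only a sketch of the general idea of normalizing a wide trace table by splitting it into per-phase tables keyed on the requirement ID, so phases can be separated or re-integrated.

```python
wide_trace = [
    {"req_id": "R-01", "analysis": "UC-3", "design": "CLS-Login",  "delivery": "REL-1.0"},
    {"req_id": "R-02", "analysis": "UC-5", "design": "CLS-Report", "delivery": "REL-1.1"},
]

def normalize_by_phase(rows, phases=("analysis", "design", "delivery")):
    """Return one small trace table per phase, each re-joinable on req_id."""
    return {p: [{"req_id": r["req_id"], p: r[p]} for r in rows] for p in phases}

per_phase = normalize_by_phase(wide_trace)
print(per_phase["design"])  # phase-local trace table keyed by requirement ID
```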

Deep Neural Network-based Jellyfish Distribution Recognition System Using a UAV (무인기를 이용한 심층 신경망 기반 해파리 분포 인식 시스템)

  • Koo, Jungmo; Myung, Hyun
    • The Journal of Korea Robotics Society / v.12 no.4 / pp.432-440 / 2017
  • In this paper, we propose a jellyfish distribution recognition and monitoring system using a UAV (unmanned aerial vehicle). The UAV was designed to satisfy the requirements for flight in an ocean environment. The target jellyfish, Aurelia aurita, is recognized through a convolutional neural network and its distribution is calculated. A modified deep neural network architecture has been developed to achieve reliable recognition accuracy and fast operation; recognition is about 400 times faster than GoogLeNet thanks to the lightweight network architecture. We also introduce the method for selecting candidates to be used as inputs to the proposed network. The recognition accuracy for jellyfish is improved by removing the probability value of the meaningless class from the probability vector of the evaluated input image and re-normalizing the remaining probabilities. The jellyfish distribution is calculated from the recognized unit jellyfish images, and the distribution level is defined using the novel concept of a distribution map buffer.
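
A minimal sketch of the re-normalization step described above: the probability assigned to a meaningless (background) class is dropped and the remaining class probabilities are rescaled to sum to one. The class layout and index are assumptions for illustration.

```python
import numpy as np

def renormalize_without_class(probs, drop_idx):
    """Remove one class from a probability vector and renormalize the rest."""
    probs = np.asarray(probs, dtype=float)
    kept = np.delete(probs, drop_idx)
    return kept / kept.sum()

softmax_out = np.array([0.48, 0.07, 0.45])        # [jellyfish, other, meaningless]
print(renormalize_without_class(softmax_out, 2))  # meaningless-class mass redistributed
```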

Automatic Generation of Cause-Effect Graph through Refining Requirements Specifications based on Semantic rules with Corpus Normalization (말뭉치 정규화와 의미 규칙 기반 요구사항 정제를 통한 원인-결과 그래프 자동 생성)

  • Jang, Woo Sung; Kim, R. Young Chul
    • Proceedings of the Korea Information Processing Society Conference / 2019.10a / pp.691-693 / 2019
  • In practice, ambiguity in requirements makes it difficult to extract test cases. User acceptance testing based on clear requirements improves software quality and reduces maintenance cost. However, small and medium-sized enterprises find it hard to perform high-quality testing because of tight development schedules, the cost burden of purchasing test tools, and low levels of testing expertise. To solve these problems, we propose a mechanism that transforms ambiguous requirements into concise, clear requirements using semantic rules based on corpus normalization. We also propose a method for automatically generating a cause-effect graph from the refined requirements. This can serve as a basis for generating as many test cases as possible from the cause-effect graph.
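
The rewrite rules below are illustrative stand-ins, not the paper's corpus-normalization or semantic rules; the sketch only shows the general shape of rule-based requirement refinement applied before cause-effect graph construction.

```python
import re

# Hypothetical refinement rules: make vague requirement wording concrete.
REWRITE_RULES = [
    (re.compile(r"\bshould be able to\b", re.I), "shall"),
    (re.compile(r"\bas soon as possible\b", re.I), "within 2 seconds"),  # assumed threshold
]

def refine(requirement: str) -> str:
    for pattern, replacement in REWRITE_RULES:
        requirement = pattern.sub(replacement, requirement)
    return requirement

print(refine("The system should be able to notify the user as soon as possible."))
```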

A STUDY ON INTER-RELATIONSHIP OF VEGETATION INDICES USING IKONOS AND LANDSAT-7 ETM+ IMAGERY

  • Yun, Young-Bo; Lee, Sung-Hun; Cho, Seong-Ik; Cho, Woo-Sug
    • Proceedings of the KSRS Conference / v.2 / pp.852-855 / 2006
  • There is an increasing need to use data from different sensors in order to maximize the chances of obtaining a cloud-free image and to meet timely requirements for information. However, the use of data from multiple sensor systems depends on comprehensive relationships between sensors of different types. Indeed, a good understanding of inter-sensor relationships advances the effective use of remotely sensed data from multiple sensors. This paper is concerned with relationships between sensors of different types for vegetation indices (VI). The study was conducted using IKONOS and Landsat-7 ETM+ images acquired on the same or nearly the same dates. The Landsat-7 ETM+ images were resampled to match the pixel size of IKONOS. Inter-relationships of vegetation indices between the images were analyzed using at-satellite reflectance obtained by converting image digital numbers (DN). A topographic normalization method was applied to all images in order to reduce topographic effects in the digital imagery. Inter-sensor model equations between the two sensors were also developed and applied to another study region. As a result, the relational equations can be used to compute or interpret the VI of one sensor using the VI of another sensor.
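
A minimal sketch of the kind of inter-sensor VI relationship the abstract describes: NDVI is computed from at-satellite reflectance for each sensor, then a simple model relates one sensor's NDVI to the other's. The linear form and the sample reflectance values are assumptions; the paper's actual model equations may differ.

```python
import numpy as np

def ndvi(red, nir):
    """NDVI from red and near-infrared reflectance."""
    red, nir = np.asarray(red, float), np.asarray(nir, float)
    return (nir - red) / (nir + red)

ikonos_ndvi  = ndvi([0.08, 0.10, 0.06], [0.42, 0.35, 0.50])
landsat_ndvi = ndvi([0.09, 0.11, 0.07], [0.40, 0.33, 0.47])

# Least-squares fit of an assumed linear inter-sensor model: IKONOS_NDVI ~ a * Landsat_NDVI + b
a, b = np.polyfit(landsat_ndvi, ikonos_ndvi, 1)
print(f"IKONOS_NDVI = {a:.3f} * Landsat_NDVI + {b:.3f}")
```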
