• Title/Summary/Keyword: Variability Modeling Techniques

Techniques for Designing Logic and Workflow Variability in Software Component Development (소프트웨어 컴포넌트 개발을 위한 논리 및 워크플로우 가변성 설계 기법)

  • 정광선;김수동
    • Journal of KIISE: Software and Applications, v.31 no.8, pp.1027-1042, 2004
  • A software component is a module that is reused across many projects, systems, and companies rather than within a single application. Components can be reused in various systems if they provide not only the common functionality required by many applications but also the variable aspects that can be customized to suit customers' demands. From the development phase onward, components should be designed with their variable aspects in mind so that customization is convenient; easily customized components are reused frequently across applications. The literature offers some modeling and customization techniques, but they suggest only conceptual or basic object-oriented methods, provide insufficient practical instructions for reusing components, and rarely consider the variability types that components actually have. Consequently, those techniques are not fully applicable to black-box components that have already been developed and released. In this paper, we classify the functional variability of components into two categories: logic variability and workflow variability. For each category we propose three modeling techniques, namely selection, plug-in, and externalization, and we provide detailed instructions for practical design and application.
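
The selection, plug-in, and externalization styles named in the abstract can be pictured with a short sketch. The Python fragment below is only an editor's illustration under assumed names (PricingComponent, discount_plugin, and the step names are hypothetical, not from the paper): it shows selection among predefined logic variants, a plug-in hook for customer-supplied logic, and an externalized workflow ordering.

```python
# Editor's sketch only: selection, plug-in, and externalization illustrated
# on a hypothetical PricingComponent (names and logic are not from the paper).

from typing import Callable, Dict, List, Optional

class PricingComponent:
    """A released (black-box) component whose variable parts stay customizable."""

    # Selection: the customer picks one of several predefined logic variants.
    _ROUNDING_VARIANTS: Dict[str, Callable[[float], float]] = {
        "floor": lambda x: float(int(x)),
        "nearest": lambda x: round(x, 2),
    }

    def __init__(self,
                 rounding: str = "nearest",
                 discount_plugin: Optional[Callable[[float], float]] = None,
                 workflow: Optional[List[str]] = None):
        self._round = self._ROUNDING_VARIANTS[rounding]               # selection
        self._discount = discount_plugin or (lambda amount: amount)   # plug-in
        # Externalization: the step ordering is supplied from outside the
        # component, e.g. read from a configuration file at deployment time.
        self._workflow = workflow or ["base_price", "discount", "round"]

    def price(self, base: float) -> float:
        steps = {
            "base_price": lambda v: v,
            "discount": self._discount,
            "round": self._round,
        }
        value = base
        for step in self._workflow:       # workflow variability
            value = steps[step](value)    # logic variability at each step
        return value

# A customer plugs in its own discount rule and reorders the workflow
# without modifying the released component.
custom = PricingComponent(rounding="floor",
                          discount_plugin=lambda amount: amount * 0.9,
                          workflow=["base_price", "round", "discount"])
print(custom.price(19.99))
```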

Multiresponse Optimization Using a Response Surface Approach to Taguchi's Parameter Design (다구찌의 파라미터 설계에 대한 반응표면 접근방법을 이용한 다반응 최적화)

  • 이우선;이종협;임성수
    • Journal of Korean Society for Quality Management, v.27 no.1, pp.165-194, 1999
  • Taguchi's parameter design seeks a proper choice of levels of the controllable factors (parameters in Taguchi's terminology) that makes the quality characteristic of a product optimal while keeping its variability small. This aim can be achieved by response surface techniques, which allow flexibility in modeling and analysis. In this article, a collection of response surface modeling and analysis techniques is proposed to deal with the multiresponse optimization problem in experimentation with Taguchi's signal and noise factors.
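
As a rough illustration of how response surface techniques can address parameter design with noise factors, the sketch below fits second-order response surfaces to the sample mean and log-variance over noise replicates and then minimizes predicted variability subject to the predicted mean hitting a target. This is a generic dual-response sketch on synthetic data, not the article's multiresponse procedure; all factor names and numbers are invented.

```python
# Dual response surface sketch: location and dispersion models fitted over
# a crossed control/noise array, followed by constrained optimization.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Crossed array: 9 control settings (x1, x2), 4 noise replicates each (synthetic).
X = np.array([[a, b] for a in (-1, 0, 1) for b in (-1, 0, 1)], dtype=float)
reps = np.array([[10 + 2*a + b - 0.5*a*b + rng.normal(0, 1 + 0.3*abs(a))
                  for _ in range(4)] for a, b in X])

ybar = reps.mean(axis=1)                   # location (mean) response
logs2 = np.log(reps.var(axis=1, ddof=1))   # dispersion (log-variance) response

def quad_terms(x1, x2):
    # Full second-order model terms: 1, x1, x2, x1^2, x2^2, x1*x2
    return np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1*x2])

Z = quad_terms(X[:, 0], X[:, 1])
beta_mean, *_ = np.linalg.lstsq(Z, ybar, rcond=None)
beta_disp, *_ = np.linalg.lstsq(Z, logs2, rcond=None)

target = 10.0

def predicted(x, beta):
    return (quad_terms(np.array([x[0]]), np.array([x[1]])) @ beta)[0]

# Minimize predicted dispersion subject to the predicted mean being on target.
res = minimize(lambda x: predicted(x, beta_disp),
               x0=[0.0, 0.0], bounds=[(-1, 1), (-1, 1)],
               constraints=[{"type": "eq",
                             "fun": lambda x: predicted(x, beta_mean) - target}])
print("Recommended control-factor levels:", res.x)
```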

A Study of Paper Couture Based on Paper Modeling Techniques

  • Hong, Sungsun
    • Journal of Fashion Business, v.18 no.3, pp.73-90, 2014
  • Paper, once known and used only as a medium for printing or handicrafts, is now being used in new fields, including artistic clothing and environment-friendly fashion materials, while the functionality of its formative characteristics and aesthetics has been newly highlighted. On this basis, this study performed a content analysis of paper couture and a categorization of paper modeling techniques based on 904 paper couture works submitted to paper fashion shows, exhibitions, and contest exhibits from 2001 to 2013. The analysis showed that paper textile types were most common at 86.64%, with techniques using laminating, bonding, overlapping, or paper as-is representing 62.17%. Expressive techniques in which paper was cut or torn and attached to paper clothing accounted for 11.62%, paper folding for 5.75%, drawing and coloring for 4.65%, and paper cutting for 2.65%. Among paper modeling techniques using paper yarn textiles, paper weaving accounted for 6.75%. Techniques in which paper modeling techniques or subsidiary clothing were blended represented 3.65%, and Dak peeling textiles 1.33%. Paper paste moulding textile types represented 1.44%, comprising papier mâché techniques at 0.55% and creasing and holding techniques at 0.88%. Paper is sufficient to express artists' creativity and has the qualities of an artistic medium, such as variability through combined use with other materials, variation in form, suitability for reusing waste paper, and environmental friendliness. Moreover, various paper modeling techniques can be blended with textiles into a generalized technology that overcomes the limits of paper and textiles.

Formal Specification and Modeling Techniques of Component Workflow Variability (컴포넌트 워크플로우 가변성의 정형 명세 및 모델링 기법)

  • Lee, Jong-Kook;Cho, Eun-Sook;Kim, Soo-Dong
    • Journal of KIISE: Software and Applications, v.29 no.10, pp.703-725, 2002
  • It is well recognized that component-based development (CBD) is an effective approach to managing the complexity of modern software development. To achieve the benefits of low-cost development and higher productivity, effective techniques to maximize component reusability should be developed. A component is a set of related concepts and objects that provides a particular coarse-grained business service. Often, such components include various message flows among the objects in the component, called the 'business workflow'. Black-box components that include but hide the business workflow provide higher reusability and productivity. A key difficulty in using black-box components with a business workflow is letting the workflow be customized by each enterprise. In this paper, we provide techniques to model the variability of family members and to customize the business workflow of components. Our approach is to provide a formal specification of component variability and to define techniques for customizing it by means of that formalism.
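
A flavor of specifying and customizing workflow variability can be given with a small sketch. The fragment below is not the paper's formalism; it merely encodes a workflow as fixed steps plus variation points with allowed alternatives, and validates an enterprise's binding against that specification. All class and step names are hypothetical.

```python
# Editor's sketch: a workflow specification with variation points and a
# customization (binding) checked against the allowed alternatives.
from dataclasses import dataclass

@dataclass(frozen=True)
class VariationPoint:
    name: str
    alternatives: frozenset

@dataclass
class WorkflowSpec:
    steps: list  # fixed step names (str) interleaved with VariationPoints

    def customize(self, binding: dict) -> list:
        """Resolve each variation point using the customer's binding."""
        resolved = []
        for step in self.steps:
            if isinstance(step, VariationPoint):
                choice = binding[step.name]
                if choice not in step.alternatives:
                    raise ValueError(f"{choice!r} is not allowed at {step.name}")
                resolved.append(choice)
            else:
                resolved.append(step)
        return resolved

# An order-processing workflow with two enterprise-specific variation points.
spec = WorkflowSpec(steps=[
    "receive_order",
    VariationPoint("check_credit", frozenset({"internal_scoring", "external_bureau"})),
    "reserve_stock",
    VariationPoint("notify", frozenset({"email", "sms"})),
    "ship",
])
print(spec.customize({"check_credit": "external_bureau", "notify": "sms"}))
```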

Potential of regression models in projecting sea level variability due to climate change at Haldia Port, India

  • Roshni, Thendiyath;K., Md. Sajid;Samui, Pijush
    • Ocean Systems Engineering, v.7 no.4, pp.319-328, 2017
  • Achieving high prediction efficacy is a challenging task in any field of engineering. Due to global warming, there is a considerable increase in the global sea level. In this work, an attempt has been made to estimate the sea level variability due to climate change at Haldia Port, India. Various statistical downscaling techniques are available, and this paper compares and illustrates the performance of three regression models: the Wavelet Neural Network (WNN), Minimax Probability Machine Regression (MPMR), and Feed-Forward Neural Network (FFNN), which are used to project the sea level variability due to climate change at Haldia Port. Model performance indices such as PI, RMSE, NSE, MAPE, and RSR were evaluated to obtain a clear picture of model accuracy. All indices point to the outperformance of the WNN in projecting the sea level variability. The findings strongly recommend ensembled models, especially wavelet-decomposed neural networks, to improve projection efficiency in any time series modeling task.
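
For readers unfamiliar with the reported indices, the sketch below computes RMSE, NSE, MAPE, and RSR from their standard definitions for an arbitrary observed/predicted pair; the numbers shown are placeholders, not data from the Haldia Port study.

```python
# Standard goodness-of-fit indices for a projected series against observations.
import numpy as np

def performance_indices(observed: np.ndarray, predicted: np.ndarray) -> dict:
    err = observed - predicted
    rmse = np.sqrt(np.mean(err**2))                                   # root mean square error
    nse = 1.0 - np.sum(err**2) / np.sum((observed - observed.mean())**2)  # Nash-Sutcliffe efficiency
    mape = 100.0 * np.mean(np.abs(err / observed))                    # mean absolute percentage error
    rsr = rmse / observed.std(ddof=0)                                 # RMSE / std. dev. of observations
    return {"RMSE": rmse, "NSE": nse, "MAPE": mape, "RSR": rsr}

# Example with synthetic monthly mean sea levels (metres).
obs = np.array([3.12, 3.18, 3.25, 3.21, 3.30, 3.34])
pred = np.array([3.10, 3.20, 3.22, 3.24, 3.28, 3.36])
print(performance_indices(obs, pred))
```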

UML-based OO Framework Modeling Techniques (UML 기반의 객체지향 프레임워크 모델링 기법)

  • Yoo, Young-Ran;Park, Dong-Hyuk;Kim, Soo-Dong
    • Journal of KIISE: Software and Applications, v.27 no.3, pp.227-240, 2000
  • Research on variability gains importance in component-based software development because variability extends the reusability of a component. The more variability a domain-specific component supports, the wider the scope of applications to which it can be applied. However, the more variability a component includes, the larger it becomes and the higher the cost of constructing it, which hinders building an optimized system. In this paper, we classify variability into three types according to its features and propose implementation techniques for each type based on COM. We also propose a process for analyzing and designing variability, together with its artifacts, covering tasks from variability extraction to implementation. The proposed process can be applied as part of a component development process.
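
The abstract does not enumerate the three variability types, and the paper's techniques target COM, so the sketch below only illustrates the general hot-spot idea behind framework variability in plain Python: a frozen spot fixes the algorithm skeleton while subclasses bind the variable parts. All names are hypothetical.

```python
# Editor's sketch of a framework hot spot (template method), not the paper's
# COM-specific techniques. ReportFramework and its methods are invented names.
from abc import ABC, abstractmethod

class ReportFramework(ABC):
    """Frozen spot: the fixed report-generation skeleton."""

    def generate(self, records: list) -> str:
        header = self.render_header()                             # hot spot
        body = "\n".join(self.render_row(r) for r in records)     # hot spot
        return f"{header}\n{body}"

    @abstractmethod
    def render_header(self) -> str: ...

    @abstractmethod
    def render_row(self, record: dict) -> str: ...

class CsvReport(ReportFramework):
    """Application-specific variant bound to the framework's hot spots."""

    def render_header(self) -> str:
        return "name,amount"

    def render_row(self, record: dict) -> str:
        return f"{record['name']},{record['amount']}"

print(CsvReport().generate([{"name": "widget", "amount": 3}]))
```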

Research Trend of DFN Modeling Methodology: Representation of Spatial Distribution Characteristics of Fracture Networks (DFN 모델링 연구 동향 소개: 균열망의 공간적 분포 특성 모사를 중심으로)

  • Jineon, Kim;Jiwon, Cho;Il-Seok, Kang;Jae-Joon, Song
    • Tunnel and Underground Space, v.32 no.6, pp.464-477, 2022
  • DFN (discrete fracture network) models that take into account the spatial variability of rock fractures and the correlation between them are in demand for reliable analysis of fractured rock mass behavior over wide areas, such as underground nuclear waste repositories. This report therefore describes the spatial distribution characteristics of fracture networks and the DFN modeling methodologies that aim to represent such characteristics. DFN modeling methods have been proposed to represent the spatial variability of rock fractures by defining fracture domains (Darcel et al., 2013) and the spatial correlation among fractures by genetic modeling techniques that imitate fracture growth processes (Davy et al., 2013, Libby et al., 2019, Lavoine et al., 2020). These methods, however, require further research before they can be applied to field surveys and to modeling in-situ rock fracture networks.
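
As context for the methods surveyed, the sketch below generates the simplest kind of stochastic DFN in 2D: a spatially uncorrelated Poisson model with uniform centers and orientations and power-law trace lengths. This is the baseline case that the cited fracture-domain and genetic (growth-based) approaches aim to improve upon; the parameter values are arbitrary illustrations.

```python
# Baseline (spatially uncorrelated) 2D DFN generator: Poisson fracture count,
# uniform centers and orientations, power-law trace lengths.
import numpy as np

def generate_dfn(region_size=100.0, intensity=0.02,
                 lmin=1.0, exponent=2.5, seed=0):
    rng = np.random.default_rng(seed)
    n = rng.poisson(intensity * region_size**2)         # number of fracture traces
    centers = rng.uniform(0.0, region_size, size=(n, 2))
    angles = rng.uniform(0.0, np.pi, size=n)            # trace orientations
    # Power-law (Pareto) trace lengths via inverse-transform sampling.
    lengths = lmin * (1.0 - rng.uniform(size=n)) ** (-1.0 / (exponent - 1.0))
    half = (lengths / 2.0)[:, None] * np.column_stack([np.cos(angles), np.sin(angles)])
    return np.stack([centers - half, centers + half], axis=1)  # shape (n, 2 endpoints, 2)

segments = generate_dfn()
print(f"{len(segments)} fracture traces generated")
```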

Variability Support in Component-based Product Lines using Component Code Generation (컴포넌트 코드 생성을 통한 컴포넌트 기반 제품 라인에서의 가변성 지원)

  • Choi, Seung-Hoon
    • Journal of Internet Computing and Services, v.6 no.4, pp.21-35, 2005
  • Software product lines are a software development paradigm for the rapid development of quality applications by customizing reconfigurable components and composing them on top of predefined software architectures. Various methodologies for component-based product lines have recently been proposed, but they do not provide specific implementation techniques for components in terms of a variability resolution mechanism. Conversely, several approaches to implementing components that support variability resolution have been developed, but they do not define a systematic analysis and design method that considers variability from the initial phase. This paper proposes integrating PLUS, a product line methodology that extends UML modeling, with a component code generation technique in order to increase the efficiency of producing a specific product in a software product line. In this paper, a component has a hierarchical architecture consisting of implementation elements, and each implementation element is implemented as an XSLT script. The component code is generated from the feature selection. Using a microwave oven product line as a case study, the development process for reconfigurable components that support automatic variability resolution is described.
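
The paper implements its components as XSLT scripts driven by a feature selection; the sketch below only shows that general mechanism with an invented feature model and stylesheet (the Oven component and its features are hypothetical, and the third-party lxml package is assumed to be available).

```python
# Editor's sketch: generate component source from a feature selection via XSLT.
from lxml import etree

FEATURE_MODEL = """\
<component name="Oven">
  <feature name="grill" selected="true"/>
  <feature name="turntable" selected="false"/>
</component>"""

STYLESHEET = """\
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:output method="text"/>
  <xsl:template match="/component">
class <xsl:value-of select="@name"/>:
    <xsl:apply-templates select="feature[@selected='true']"/>
  </xsl:template>
  <xsl:template match="feature">
    def run_<xsl:value-of select="@name"/>(self):
        pass
  </xsl:template>
</xsl:stylesheet>"""

transform = etree.XSLT(etree.XML(STYLESHEET.encode()))
generated = str(transform(etree.XML(FEATURE_MODEL.encode())))
print(generated)  # component source containing only the selected features
```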

A literature review on RSM-based robust parameter design (RPD): Experimental design, estimation modeling, and optimization methods (반응표면법기반 강건파라미터설계에 대한 문헌연구: 실험설계, 추정 모형, 최적화 방법)

  • Le, Tuan-Ho;Shin, Sangmun
    • Journal of Korean Society for Quality Management, v.46 no.1, pp.39-74, 2018
  • Purpose: For more than 30 years, robust parameter design (RPD), which attempts to minimize the process bias (i.e., the deviation between the mean and the target) and its variability simultaneously, has received consistent attention from researchers in academia and industry. Based on Taguchi's philosophy, a number of RPD methodologies have been developed to improve the quality of products and processes. The primary purpose of this paper is to review and discuss existing RPD methodologies in terms of the three sequential RPD procedures of experimental design, parameter estimation, and optimization. Methods: This literature study comprises three review aspects: experimental design, estimation modeling, and optimization methods. Results: To analyze the benefits and weaknesses of conventional RPD methods and to investigate the requirements of future research, we first analyze a variety of experimental formats associated with input control and noise factors, output responses and replication, and estimation approaches. Secondly, existing estimation methods are categorized according to their use of least squares, maximum likelihood estimation, generalized linear models, Bayesian techniques, or the response surface methodology. Thirdly, optimization models for single- and multiple-response problems are analyzed within their historical and functional framework. Conclusion: This study identifies the current RPD foundations and unresolved problems, including an ample discussion of further directions of study.
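
As a reference point for the optimization models discussed, one common RPD formulation combines squared process bias and variance into a single mean-squared-error criterion over the feasible control-factor region; the expression below is a generic statement of that criterion, not a formulation attributed to any particular surveyed paper.

```latex
% Generic MSE-type RPD criterion, with estimated mean and standard-deviation
% response surfaces \hat{\mu}(\mathbf{x}) and \hat{\sigma}(\mathbf{x}),
% target \tau, and feasible region \Omega:
\begin{equation*}
  \min_{\mathbf{x} \in \Omega} \;
  \bigl(\hat{\mu}(\mathbf{x}) - \tau\bigr)^{2} + \hat{\sigma}^{2}(\mathbf{x})
\end{equation*}
```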

Response Surface Approach to Integrated Optimization Modeling for Parameter and Tolerance Design (반응표면분석법을 이용한 모수 및 공차설계 통합모형)

  • Young Jin Kim
    • Journal of Korean Society for Quality Management, v.30 no.4, pp.58-67, 2002
  • Since the inception of off-line quality control, it has drawn particular attention from the research community and has been implemented in a wide variety of industries, mainly due to its extensive applicability to numerous real situations. Emphasizing design issues rather than control issues related to manufacturing processes, off-line quality control has been recognized as a cost-effective approach to quality improvement. It consists of three design stages, implemented sequentially: system design, parameter design, and tolerance design. Utilizing experimental designs and optimization techniques, off-line quality control aims at achieving product performance that is insensitive to external noise by reducing process variability. In spite of its conceptual soundness and practical significance, however, off-line quality control has been criticized for the heuristic nature of its investigation, and its process optimization procedures have been pointed out as inefficient. To enhance the current practice of off-line quality control, this study proposes an integrated optimization model utilizing a well-established statistical tool, the response surface methodology (RSM), together with a tolerance-cost relationship.
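
A generic way to write such an integrated model (not necessarily the paper's exact formulation) is to choose nominal settings and tolerances jointly so that expected quadratic quality loss plus tolerance cost is minimized, with the variance term depending on the tolerances through the fitted response surface:

```latex
% Generic integrated parameter-tolerance design model: nominal settings
% \mathbf{x}, tolerances \mathbf{t}, loss coefficient k, target \tau,
% and tolerance-cost function C(\mathbf{t}).
\begin{equation*}
  \min_{\mathbf{x},\,\mathbf{t}} \;
  k\Bigl[\bigl(\hat{\mu}(\mathbf{x}) - \tau\bigr)^{2}
        + \hat{\sigma}^{2}(\mathbf{x}, \mathbf{t})\Bigr]
  + C(\mathbf{t})
\end{equation*}
```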