• Title/Summary/Keyword: Software Test Process

Search Result 590, Processing Time 0.022 seconds

Finite Element Analysis of an Agricultural Tractor Cabin Based on the OECD Standard (Code 4) (OECD규정(제4항)에 기초한 농업용 트랙터 캐빈의 유한요소 해석)

  • 하창욱;김현진;구남서;권영두
    • Journal of Biosystems Engineering
    • /
    • v.28 no.4
    • /
    • pp.305-314
    • /
    • 2003
  • The ROPS of an agricultural tractor is designed to protect the driver when the tractor overturns. Although the current OECD tests for determining whether a ROPS meets the requirements of the OECD regulation are desirable, they take a long time to perform. We saved experimental time and effort by using CAE. We conducted a finite element analysis of the ROPS design of a Dae-Dong tractor cabin in an attempt to reduce the design and manufacturing time. This study demonstrates an analysis technique using MARC (v.2000) for designing a ROPS and examines the differences between the test results and the FEA results. The design process is generally divided into two phases: concept design and detail design. The concept design uses simple analysis to predict structural behavior, whereas the detail design involves a finite element analysis based on the results of the concept design. This study focused on the detail design and used Patran (v.2000r2) and MARC (v.2000) from the MSC Software Corporation. The model consisted of 4812 elements and 4582 nodes. Four tests specified in the OECD standards were performed: (1) longitudinal loading, (2) rear crushing, (3) side loading, and (4) front crushing. Independent analyses were performed for each test, along with a sequential analysis. When compared, the results of the independent and sequential analyses were found to be similar to the test results.

The algorithm design and the test bed construction method of processing for periodic delayed data (주기적 지연 데이터 처리를 위한 알고리즘 설계 및 테스트 베드 구축 방법)

  • Sang-hoon Koh;Ho-jin Song;Nam-ho Keum;Pil-joong Yoo;Se-kwon Oh;Young-sung Kim
    • Journal of Advanced Navigation Technology
    • /
    • v.27 no.1
    • /
    • pp.102-110
    • /
    • 2023
  • The MATS (Missile Assembly Test Set) is manufactured and used to check the function of a missile during the development period of a guided missile system, and the requirements for power and communication are managed for equipment production. The MATS developer implements software according to the proposed communication standard to guarantee the reliability of the data communicated with the guided missile. After implementation, a test bed is built and a self-performance evaluation is carried out, followed by a verification process using the standard equipment. The periodic delay characteristics of data transmission must be reflected when building the test bed. This paper describes a test bed construction method for processing data with periodic delay, and compares and evaluates performance by modifying the previously designed algorithm.

Enhanced Security Measurement of Web Application Testing by Outsourcing (외주 개발 웹 어플리케이션 테스팅의 보안성 강화 방안)

  • Choi, Kyong-Ho;Lee, DongHwi
    • Convergence Security Journal
    • /
    • v.15 no.4
    • /
    • pp.3-9
    • /
    • 2015
  • A web application created by an internal developer with security awareness shows a certain level of security. In the case of development by outsourcing, however, the developer inevitably implements the requested functions rather than addressing security issues. In this paper, we therefore improve the software testing process with a focus on security, to prevent the leakage of important information and the use of unauthorized services that result from vulnerable web applications. The proposed model makes it possible to consider security from the initial stage of development even for outsourced web applications; in particular, it can prevent schedule delays caused by modifications to programs created by programmers with low security awareness. The results show that this model can be applied to the national defense area, where demand for web-application-centered resource management systems is increasing, to prevent the deployment of web applications with security vulnerabilities through rigorous testing.

BIM-DRIVEN ENERGY ANALYSIS FOR ZERO NET ENERGY TEST HOME (ZNETH)

  • Yong K. Cho;Thaddaeus A. Bode;Sultan Alaskar
    • International conference on construction engineering and project management
    • /
    • 2009.05a
    • /
    • pp.276-284
    • /
    • 2009
  • As an on-going research project, the Zero Net Energy Test Home (ZNETH) project investigates effective approaches to achieving whole-house environmental and energy goals. The main research objectives are (1) to identify energy-saving solutions in designs, materials, and construction methods for the ZNETH house and (2) to verify whether the ZNETH house can produce more energy than it uses, by utilizing Building Information Modeling (BIM) and energy analysis tools. The initial project analysis is conducted using BIM and energy analysis tools. The BIM-driven research approach incorporates architectural and construction engineering methods to improve whole-building performance while minimizing increases in overall building cost. This paper discusses the advantages and disadvantages of BIM-integrated energy analysis, the related interoperability issues between BIM software and energy analysis software, and the results of the energy analysis for ZNETH. Although this investigation is in its early stage, several dramatic outcomes have already been observed. Utilizing BIM for energy analysis is an obvious benefit because of the ease with which the 3D model is transferred and the speed with which an energy model can be analyzed and interpreted to improve the design. The research will continue to use the ZNETH project as a test bed for the integration of sustainable design into the BIM process.


Scalability Estimations of a Workcase-based Workflow Engine (워크케이스 기반 워크플로우 엔진의 초대형성 성능 평가)

  • Ahn, Hyung-Jin;Park, Min-Jae;Lee, Ki-Won;Kim, Kwang-Hoon
    • Journal of Internet Computing and Services
    • /
    • v.9 no.6
    • /
    • pp.89-97
    • /
    • 2008
  • Recently, many organizations such as companies and institutions have demanded the introduction of very large-scale workflow management systems in order to process large numbers of business instances. Workflow vendors have focused on the physical extension of workflow engines through device-level clustering to provide very large-scale workflow services. However, improving workflow engine performance by simply connecting computer systems physically, without considering the logical-level software architecture, leads to wasted time and cost in constructing a very large-scale workflow service environment. In this paper, we propose a methodology for performance improvement based on the logical software architecture of the workflow engine. We also evaluate the scalability of workflow engines using the activity-instance-based architecture and our proposed workcase-based architecture. Analysis of the test results shows that the software architecture applied to a workflow engine affects its scalability.


The Study for NHPP Software Reliability Model based on Chi-Square Distribution (카이제곱 NHPP에 의한 소프트웨어 신뢰성 모형에 관한 연구)

  • Kim, Hee-Cheul
    • Journal of the Korea Society of Computer and Information
    • /
    • v.11 no.1 s.39
    • /
    • pp.45-53
    • /
    • 2006
  • Finite-failure NHPP models presented in the literature exhibit either constant, monotonically increasing, or monotonically decreasing failure occurrence rates per fault. In this paper, the Goel-Okumoto and Yamada-Ohba-Osaki models are reviewed and a chi-square (χ²) reliability model is proposed, which can capture the increasing nature of the failure occurrence rate per fault. The parameters are estimated by the maximum likelihood method combined with the bisection algorithm, and model selection for an efficient model is based on SSE, the AIC statistic, and the Kolmogorov distance. Failure analysis using a real data set, SYS2 (Allen P. Nikora and Michael R. Lyu), is employed to propose the shape parameter of the χ² distribution through its degrees of freedom. The failure data are compared between the χ² model and existing models using arithmetic and Laplace trend tests and the Kolmogorov test.

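The mean value function of such a finite-failure NHPP model can be sketched in a few lines. This is a minimal illustration, assuming an even number of degrees of freedom so the chi-square CDF has a closed form; it is not the paper's estimation code, and the parameter values are placeholders:

```python
import math

def chi2_cdf_even_df(t, k):
    """CDF of the chi-square distribution with 2k degrees of freedom
    (closed form, valid for even degrees of freedom only)."""
    if t <= 0:
        return 0.0
    s = sum((t / 2) ** i / math.factorial(i) for i in range(k))
    return 1.0 - math.exp(-t / 2) * s

def mean_value(t, a, k):
    """Finite-failure NHPP mean value function m(t) = a * F(t):
    expected cumulative failures by time t, with fault content a."""
    return a * chi2_cdf_even_df(t, k)
```

With k = 1 (two degrees of freedom) this reduces to the exponential Goel-Okumoto form m(t) = a(1 - e^(-t/2)); larger k gives the increasing failure occurrence rate per fault that the paper targets.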

Error-Resilience Enhancement based on Polyphase Down Sampling for the H.264 Video Coding Technology (에러 강인성 향상을 위한 다상 다운 샘플링 적용 H.264 동영상 부호화 기술)

  • Jung, Eun Ku;Jia, Jie;Kim, Hae Kwang;Choi, Hae Chul;Kim, Jae Gon
    • Journal of Broadcast Engineering
    • /
    • v.10 no.3
    • /
    • pp.340-347
    • /
    • 2005
  • This paper presents a polyphase down-sampling based multiple description coding applied to the H.264 video coding standard. For a given macroblock, a residual macroblock is calculated by motion estimation; before the DCT, quantization, and entropy coding of the H.264 coding process, polyphase down sampling is applied to the residual macroblock to code it as four separate descriptions. Experiments were performed on all nine test sequences of the JVT SVC standardization under various packet loss patterns. The results show that the proposed method gives a 0.5 to 5 dB enhancement over error concealment based on the slice group map technology.
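The polyphase split itself is plain 2×2 spatial subsampling of the residual macroblock. A minimal sketch of the split and its inverse (the surrounding H.264 motion estimation, DCT, quantization, and entropy coding are omitted):

```python
import numpy as np

def polyphase_split(block):
    """Split a residual macroblock into four polyphase descriptions
    by 2x2 spatial subsampling: one description per pixel phase."""
    return [block[i::2, j::2] for i in (0, 1) for j in (0, 1)]

def polyphase_merge(descs, shape):
    """Reassemble the macroblock from its four descriptions. In the
    lossy case, a missing description could instead be interpolated
    from its spatial neighbours for error concealment."""
    out = np.empty(shape, dtype=descs[0].dtype)
    k = 0
    for i in (0, 1):
        for j in (0, 1):
            out[i::2, j::2] = descs[k]
            k += 1
    return out
```

Because each description carries every other pixel in both dimensions, losing one packet degrades the whole block gracefully instead of erasing a contiguous region.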

A Study on Optimization of Injection-molded System Using CAE and Design of Experiment (CAE와 실험계획법을 연계한 사출 성형 시스템 최적화에 관한 연구)

  • Oh Jung-Yeol;Huh Yong-Jeong
    • Journal of the Korea Academia-Industrial cooperation Society
    • /
    • v.7 no.3
    • /
    • pp.271-277
    • /
    • 2006
  • Injection molding is a manufacturing process that can produce high-quality parts in large quantities at low cost. Since many input factors can influence part quality in any given situation, it is difficult to obtain exact data by experiment alone. Recently, CAE methods have been used to support the experiments, combined with the Design of Experiments approach to optimize the injection molding process. When there are many factors, the interactions among them must be considered by applying Design of Experiments, a technique for minimizing the number of experiments. Instead of real tests, simulation data were obtained using Moldflow® software. Moldflow® was used to analyze the injection molding process, the factors affecting warpage were analyzed using the Taguchi method, and the optimal injection molding process was then obtained.

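The Taguchi-style screening described in this abstract can be sketched as follows. The orthogonal array, factor names, and warpage values below are hypothetical placeholders, not the paper's data; the sketch only shows the mechanics of ranking factor levels by signal-to-noise ratio:

```python
import math

# Hypothetical L4(2^3) orthogonal array: three two-level factors
# (e.g. melt temperature, packing pressure, cooling time).
L4 = [(1, 1, 1), (1, 2, 2), (2, 1, 2), (2, 2, 1)]
# Hypothetical warpage results (mm) from four simulation runs.
warpage = [0.42, 0.35, 0.31, 0.38]

def sn_smaller_is_better(ys):
    """Taguchi smaller-is-better S/N ratio: -10 log10(mean of y^2)."""
    return -10 * math.log10(sum(y * y for y in ys) / len(ys))

def level_means(factor_idx):
    """Mean S/N ratio per level of one factor; the better level is
    the one with the higher mean S/N."""
    means = {}
    for lvl in (1, 2):
        vals = [sn_smaller_is_better([w])
                for run, w in zip(L4, warpage) if run[factor_idx] == lvl]
        means[lvl] = sum(vals) / len(vals)
    return means
```

Comparing the per-level means for each factor identifies the combination of levels expected to minimize warpage, which is then confirmed by one more simulation run.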

Improvement of Electroforming Process System Based on Double Hidden Layer Network (이중 비밀 다층구조 네트워크에 기반한 전기주조 공정 시스템의 개선)

  • Byung-Won Min
    • Journal of Internet of Things and Convergence
    • /
    • v.9 no.3
    • /
    • pp.61-67
    • /
    • 2023
  • In order to optimize the pulse electroforming copper process, a double hidden layer BP (back propagation) neural network is constructed. Through sample training, the mapping relationship between electroforming copper process conditions and target properties is accurately established, and the microhardness and tensile strength of the electroformed layer in the pulse electroforming copper process are predicted. The predicted results are verified by copper electrodeposition tests in a copper pyrophosphate solution system with a pulse power supply. The results show that the microhardness and tensile strength of the copper layer predicted by the "3-4-3-2" structure double hidden layer neural network are very close to the experimental values, with a relative error of less than 2.32%. Within the parameter range, the microhardness of the copper layer is between 100.3 and 205.6 MPa and the tensile strength is between 112 and 485 MPa. When the microhardness and tensile strength are optimal, the corresponding process conditions are as follows: current density 2 A/dm², pulse frequency 2 kHz, and pulse duty cycle 10%.
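The "3-4-3-2" structure can be read directly as layer widths: three inputs (process conditions), two hidden layers of four and three neurons, and two outputs (microhardness, tensile strength). A minimal forward-pass sketch; the weights below are random placeholders standing in for the trained back-propagation weights, and the input/output scaling of the real model is omitted:

```python
import numpy as np

rng = np.random.default_rng(0)
# Layer widths of the "3-4-3-2" double-hidden-layer network.
sizes = [3, 4, 3, 2]
# Placeholder weights and biases; the paper obtains these by training.
weights = [rng.normal(scale=0.5, size=(m, n))
           for m, n in zip(sizes[:-1], sizes[1:])]
biases = [np.zeros(n) for n in sizes[1:]]

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def forward(x):
    """Forward pass: process conditions in, normalized property
    predictions (microhardness, tensile strength) out."""
    a = np.asarray(x, dtype=float)
    for w, b in zip(weights, biases):
        a = sigmoid(a @ w + b)
    return a
```

In the trained model, the two sigmoid outputs would be rescaled back to physical units before comparison with the electrodeposition test values.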

Target bit allocation algorithm for generation of high quality static test stream (고화질 정지화 테스트 스트림의 생성을 위한 목표비트 할당 알고리즘)

  • Lee Gwang soon;Han Chan ho;Jang Soo wook;Kim Eun su;Sohng Kyu ik
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.30 no.3C
    • /
    • pp.147-152
    • /
    • 2005
  • In this paper, we propose a method for compressing static video test patterns in high quality to test picture quality in DTV. Our method uses the fact that the generated bits and the average quantization value have almost identical distribution characteristics in each GOP, and we propose a new target bit allocation method suitable for compressing static test patterns, whereas the target bit allocation method of MPEG-2 TM5 is suited to moving pictures. The proposed method maintains high video quality continuously by using normalized complexities that are updated or held according to the picture quality of each GOP. Experimental results showed that the test pattern stream encoded by MPEG-2 software with the proposed algorithm had a stable bit rate and good video quality during the decoding process.
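The TM5-style target bit allocation that the proposed method modifies can be sketched as follows. This is a minimal sketch with illustrative constants; the paper's per-GOP normalized-complexity update for static patterns is not reproduced here:

```python
def allocate_targets(R, n_pics, X, K):
    """TM5-style per-picture target bits.

    R: bits remaining for the GOP;
    n_pics: remaining pictures of each type (e.g. 'I', 'P', 'B');
    X: complexity measure per type (generated bits * mean quantizer);
    K: universal weighting constants per type (K['I'] = 1 in TM5).
    """
    targets = {}
    for t in X:
        # Share of R for one picture of type t, weighting each type's
        # remaining pictures by its relative complexity.
        denom = sum(n_pics[u] * X[u] * K[t] / (X[t] * K[u]) for u in X)
        targets[t] = R / denom
    return targets
```

For a static pattern, the complexities X barely change between GOPs, which is exactly the property the proposed method exploits by holding the normalized complexities fixed while quality stays high.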