Verification, Validation, and Accreditation (VV&A) Considering Military and Defense Characteristics

  • Received : 2015.02.18
  • Accepted : 2015.03.16
  • Published : 2015.03.30


In this paper, we identify the characteristics of modeling and simulation (M&S) for military and defense applications and propose methods of verification, validation, and accreditation (VV&A) that reflect those characteristics. M&S is widely used in military and defense for training, analysis, and acquisition. Researchers and M&S practitioners have proposed various VV&A methods and processes to ensure the correctness of M&S. The notion of applying formal credibility assessment in VV&A originated in software engineering reliability testing and the systems engineering development process. However, the VV&A techniques and processes proposed for M&S by the research community have not addressed the characteristics and issues specific to military and defense. We therefore identify these characteristics and issues and propose VV&A techniques and methods tailored to military/defense M&S, along with possible approaches for their development.


Supported by: Agency for Defense Development
