OAPR-HOML'1: Optimal automated program repair approach based on hybrid improved grasshopper optimization and opposition learning based artificial neural network

  • MAMATHA, T. (Dept. of CSE, JNTUA University) ;
  • RAMA SUBBA REDDY, B. (Dept. of CSE, SV College of Engineering) ;
  • BINDU, C. SHOBA (Dept. of CSE, JNTUA University)
  • Received : 2022.04.05
  • Published : 2022.04.30

Abstract

Over the last decade, the scientific community has been actively developing technologies for automated software bug fixing, commonly referred to as Automated Program Repair (APR). Several APR techniques have recently been proposed to effectively address errors in classroom programming assignments. However, little attention has been paid to advancing effective APR techniques for the software bugs that occur widely during the maintenance phase of the software life cycle. To further enhance software testing and debugging, we propose an optimal automated program repair approach based on hybrid technology (OAPR-HOML'1). The first contribution of the proposed OAPR-HOML'1 technique is an improved grasshopper optimization (IGO) algorithm for fault location identification in the given test projects. We then present an opposition learning based artificial neural network (OL-ANN) technique that selects AST node-level transformation schemas to create the sketches that provide automated program repair for those faulty projects. Finally, OAPR-HOML'1 is evaluated on the Defects4J benchmark, and its performance is compared with state-of-the-art techniques in terms of the number of bugs fixed, accuracy, precision, recall and F-measure.
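The IGO and OL-ANN components named in the abstract build on two generic, well-documented ideas: opposition-based learning and the grasshopper optimization position update. The short Python sketch below is a minimal illustration of how these two ideas can be combined to search a numeric encoding of candidate fault locations; it is not the paper's implementation, and the parameter values, search bounds and the suspiciousness fitness function are assumed placeholders.

```python
import numpy as np

# Illustrative constants; the paper does not specify its parameter values.
F, L = 0.5, 1.5            # social-force constants commonly used in the standard GOA
C_MAX, C_MIN = 1.0, 1e-4   # bounds of the shrinking coefficient c


def social_force(r):
    """Attraction/repulsion function s(r) of the standard grasshopper algorithm."""
    return F * np.exp(-r / L) - np.exp(-r)


def opposition(pop, lb, ub):
    """Opposition-based learning: mirror each candidate inside the search bounds."""
    return lb + ub - pop


def suspiciousness(candidate):
    """Placeholder fitness: in the paper this would score how suspicious the
    program locations encoded by `candidate` are for the failing tests."""
    return np.sum(candidate ** 2)  # dummy objective used only for this sketch


def goa_step(pop, best, lb, ub, t, max_iter):
    """One position update of the canonical grasshopper optimization algorithm."""
    n, dim = pop.shape
    c = C_MAX - t * (C_MAX - C_MIN) / max_iter
    new_pop = np.empty_like(pop)
    for i in range(n):
        interaction = np.zeros(dim)
        for j in range(n):
            if i == j:
                continue
            dist = np.linalg.norm(pop[j] - pop[i]) + 1e-12
            unit = (pop[j] - pop[i]) / dist
            interaction += c * (ub - lb) / 2.0 * social_force(dist) * unit
        new_pop[i] = np.clip(c * interaction + best, lb, ub)
    return new_pop


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    lb, ub, n, dim, iters = -1.0, 1.0, 20, 5, 50

    # Opposition-learning style initialisation: keep the fitter of each
    # random candidate and its opposite point.
    pop = rng.uniform(lb, ub, size=(n, dim))
    opp = opposition(pop, lb, ub)
    fit = np.array([suspiciousness(x) for x in pop])
    ofit = np.array([suspiciousness(x) for x in opp])
    pop = np.where((ofit < fit)[:, None], opp, pop)

    best = pop[np.argmin([suspiciousness(x) for x in pop])].copy()
    for t in range(iters):
        pop = goa_step(pop, best, lb, ub, t, iters)
        scores = np.array([suspiciousness(x) for x in pop])
        if scores.min() < suspiciousness(best):
            best = pop[np.argmin(scores)].copy()
    print("best candidate:", best)
```

In the paper's setting, the fitness would instead rank program entities by how suspicious they are with respect to a Defects4J project's failing and passing tests, and the OL-ANN stage (not sketched here) would then select AST node-level transformation schemas for the statements the search singles out.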


Acknowledgement

The authors thank K. Abijith Rao (CEO, SNIST) and the Management of Sreenidhi Institute of Science and Technology, JNT University Anantapuramu, for providing assistance to establish a working environment in the lab to carry out the present research. The authors also acknowledge Prof. C. V. Tomy (Director), T. Shiva Reddy (Principal) and Dr. Aruna Varanasi (Head of the Department) for their continuous moral support, help and encouragement.
