Photorealistic Ray-traced Visualization Approach for the Interactive Biomimetic Design of Insect Compound Eyes

  • Nguyen, Tung Lam (Department of Mechanical Engineering, Hanbat National University) ;
  • Trung, Hieu Tran Doan (Department of Mechanical Engineering, Hanbat National University) ;
  • Lee, Wooseok (Department of Mechanical Engineering, Hanbat National University) ;
  • Lee, Hocheol (Department of Mechanical Engineering, Hanbat National University)
  • Received : 2021.07.29
  • Accepted : 2021.10.01
  • Published : 2021.12.25

Abstract

In this study, we propose a biomimetic optical structure design methodology for investigating the micro-optical mechanisms of insect compound eyes. Compound eyes allow insects to respond quickly while maintaining a wide field of view, and considerable research attention has been devoted to exploiting these advantages. However, their nano- and micro-structures are complex and difficult to realize in practical applications, so an integrated design methodology that accounts for manufacturing difficulty is required. We show that photorealistic ray-traced visualization is an effective method for the biomimetic design of an insect micro-compound eye. We analyze the image formation mechanism and create a three-dimensional computer-aided design model. A ray-traced visualization is then applied to observe the optical image formation. Finally, the segmented images are stitched together into a wide-angle image, and the result is assessed for quality. The high structural similarity index (SSIM) values (approximately 0.84 to 0.89) of the stitched image show that the proposed MATLAB-based image stitching algorithm performs comparably to commercial software. The results can be used for understanding, researching, and designing advanced optical systems based on biological eyes, as well as for other industrial applications.
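The paper's MATLAB-based stitching pipeline is not reproduced on this page, but the following minimal MATLAB sketch illustrates the general idea described in the abstract: feature-based registration of two overlapping segment renders, compositing onto a wide-angle canvas, and an SSIM quality check. It assumes the Computer Vision and Image Processing Toolboxes are available; the file names (segment_A.png, segment_B.png, reference_wide_angle.png), the SURF/projective registration, and the simple max-compositing step are illustrative assumptions, not the authors' exact algorithm.

    % Illustrative sketch (not the paper's code): stitch two overlapping
    % ray-traced segment images and score the result with SSIM.
    % Assumes RGB renders and the Computer Vision / Image Processing Toolboxes.
    segA = rgb2gray(imread('segment_A.png'));   % hypothetical segment renders
    segB = rgb2gray(imread('segment_B.png'));

    % Detect and match SURF features in the overlap region
    ptsA = detectSURFFeatures(segA);
    ptsB = detectSURFFeatures(segB);
    [featA, validA] = extractFeatures(segA, ptsA);
    [featB, validB] = extractFeatures(segB, ptsB);
    pairs = matchFeatures(featA, featB, 'Unique', true);

    % Estimate a projective transform mapping segment B onto segment A
    tform = estimateGeometricTransform( ...
        validB(pairs(:,2)), validA(pairs(:,1)), 'projective');

    % Warp both segments onto a shared wide-angle canvas
    outView  = imref2d(size(segA) + [0, size(segB,2)]);
    warpedA  = imwarp(segA, affine2d(eye(3)), 'OutputView', outView);
    warpedB  = imwarp(segB, tform,            'OutputView', outView);
    stitched = max(warpedA, warpedB);   % simple overlay in place of blending

    % Assess similarity against a hypothetical full-field reference render
    reference = rgb2gray(imread('reference_wide_angle.png'));
    score = ssim(stitched, imresize(reference, size(stitched)));
    fprintf('SSIM of stitched image: %.2f\n', score);

In practice, a multiresolution (Laplacian pyramid) blend would replace the max-compositing line to hide seams between segments; the SSIM call is the standard built-in comparison and matches the quality metric reported in the abstract.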

Acknowledgement

This study was supported by a National Research Foundation of Korea (NRF) grant funded by the Korean government (MSIT) (No. NRF-2019R1A2C1004844) and the research fund of Hanbat National University in 2020.
