Real-time Interactive Particle-art with Human Motion Based on Computer Vision Techniques

컴퓨터 비전 기술을 활용한 관객의 움직임과 상호작용이 가능한 실시간 파티클 아트

  • Jo, Ik Hyun (School of Computer Science and Engineering, Kyungpook National University) ;
  • Park, Geo Tae (Department of Digital Media Art, Kyungpook National University) ;
  • Jung, Soon Ki (School of Computer Science and Engineering, Kyungpook National University)
  • Received : 2017.09.04
  • Accepted : 2017.12.15
  • Published : 2018.01.31

Abstract

We present real-time interactive particle art that responds to human motion using computer vision techniques. Computer vision allows us to reduce the amount of equipment required to appreciate the media art. We analyze the pros and cons of various computer vision methods that can be adapted to interactive digital media art. In our system, background subtraction is applied to detect the audience, and the audience image is converted into particles using grid cells. Optical flow is used to detect the motion of the audience and to create particle effects. We also define a virtual button for interaction. This paper introduces a series of computer vision modules for building interactive digital media art content that can be easily configured with a camera sensor.
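The pipeline described above can be approximated with standard OpenCV building blocks. Below is a minimal sketch, assuming a single webcam, OpenCV's MOG2 background subtractor, and Farnebäck dense optical flow; the grid size, the virtual-button region, and all names are illustrative assumptions, not the authors' implementation.

```python
# Sketch of the abstract's pipeline: background subtraction -> grid-cell
# particles -> optical-flow motion -> virtual button. Parameters are assumed.
import cv2

GRID = 16                      # particle grid cell size in pixels (assumed)
BUTTON = (20, 20, 120, 120)    # x, y, w, h of a hypothetical virtual button

cap = cv2.VideoCapture(0)
subtractor = cv2.createBackgroundSubtractorMOG2(detectShadows=True)
prev_gray = None

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # 1) Background subtraction isolates the audience silhouette.
    fg = subtractor.apply(frame)

    # 2) Grid cells covered by the silhouette become particle seeds.
    h, w = fg.shape
    particles = [(x + GRID // 2, y + GRID // 2)
                 for y in range(0, h - GRID, GRID)
                 for x in range(0, w - GRID, GRID)
                 if fg[y:y + GRID, x:x + GRID].mean() > 127]

    # 3) Dense optical flow gives each particle a motion vector.
    if prev_gray is not None:
        flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        for px, py in particles:
            dx, dy = flow[py, px]
            cv2.line(frame, (px, py),
                     (int(px + dx * 4), int(py + dy * 4)), (255, 255, 255), 1)

    # 4) The virtual button fires when enough foreground overlaps its region.
    bx, by, bw, bh = BUTTON
    if fg[by:by + bh, bx:bx + bw].mean() > 64:
        cv2.rectangle(frame, (bx, by), (bx + bw, by + bh), (0, 255, 0), 2)

    cv2.imshow("particle-art sketch", frame)
    prev_gray = gray
    if cv2.waitKey(1) == 27:   # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```

In an installation, the line-drawing step would be replaced by the particle renderer; the thresholds on the foreground mask and button region would need tuning to the lighting conditions of the exhibition space.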

