Dynamic swarm particle for fast motion vehicle tracking

  • Received : 2018.08.13
  • Accepted : 2019.02.13
  • Published : 2020.02.07

Abstract

Nowadays, the broad availability of cameras and embedded systems makes computer vision a promising supporting technology for intelligent transportation systems, particularly for vehicle tracking. Although several trackers already exist, the limitations of low-cost cameras, combined with the relatively low processing power of embedded systems, render most of them impractical. For a tracker to work under these conditions, the video frame rate must be reduced to lower the computational burden. However, this makes vehicles appear to move faster from the observer's viewpoint, a phenomenon known as the fast motion challenge. This paper proposes a tracker called dynamic swarm particle (DSP) that addresses this challenge. The term "particle" refers to the particle filter, while "swarm" refers to particle swarm optimization (PSO). The fundamental concept of our method is to exploit the continuity of vehicle motion by constructing dynamic models based on PSO. In our experiments, DSP achieves a precision of 0.896 and a success rate of 0.755, outperforming several benchmark trackers.
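To make the mechanism described in the abstract concrete, here is a minimal sketch, in Python, of how a PSO-style velocity update can act as the dynamic model inside a particle filter. This is not the authors' implementation; the state layout (image-plane position only), the parameter values, and all function names are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)

    def pso_dynamic_step(particles, velocities, pbest, gbest,
                         w=0.7, c1=1.5, c2=1.5, noise=2.0):
        """One PSO-flavored prediction step over (x, y) particle states.

        particles  : (N, 2) current particle positions
        velocities : (N, 2) per-particle velocities (these carry motion
                     continuity across frames, which is what helps under
                     the fast motion challenge)
        pbest      : (N, 2) best-scoring position each particle has visited
        gbest      : (2,)   best-scoring position found by the whole swarm
        """
        r1 = rng.random(particles.shape)
        r2 = rng.random(particles.shape)
        # Canonical PSO velocity update: inertia + cognitive + social terms.
        velocities = (w * velocities
                      + c1 * r1 * (pbest - particles)
                      + c2 * r2 * (gbest - particles))
        # Propagate and keep a little diffusion, as in a plain particle filter.
        particles = particles + velocities + rng.normal(0.0, noise, particles.shape)
        return particles, velocities

In a full tracker, each propagated particle would be scored by an appearance likelihood (for example, a template or histogram similarity), pbest and gbest would be refreshed from those scores, and the particles would then be weighted and resampled as usual.

The precision and success figures quoted above follow the standard OTB protocol of Wu et al. [5]. The following sketch shows how such scores are conventionally computed: a 20-pixel center-error threshold for precision and an overlap threshold for success (OTB also reports success as the area under the curve over all overlap thresholds, so the single-threshold version below is a simplification).

    def precision_at(pred_centers, gt_centers, thresh=20.0):
        # Fraction of frames whose predicted center is within `thresh` pixels.
        d = np.linalg.norm(np.asarray(pred_centers) - np.asarray(gt_centers), axis=1)
        return float(np.mean(d <= thresh))

    def iou(a, b):
        # Intersection-over-union of two boxes given as (x, y, w, h).
        ix = max(0.0, min(a[0] + a[2], b[0] + b[2]) - max(a[0], b[0]))
        iy = max(0.0, min(a[1] + a[3], b[1] + b[3]) - max(a[1], b[1]))
        inter = ix * iy
        union = a[2] * a[3] + b[2] * b[3] - inter
        return inter / union if union > 0 else 0.0

    def success_rate(pred_boxes, gt_boxes, thresh=0.5):
        # Fraction of frames whose overlap with ground truth exceeds `thresh`.
        return float(np.mean([iou(p, g) >= thresh
                              for p, g in zip(pred_boxes, gt_boxes)]))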

References

  1. F. J. Martinez et al., Emergency services in future intelligent transportation systems based on vehicular communication networks, IEEE Intell. Transp. Syst. Mag. 2 (2010), 6-20. https://doi.org/10.1109/MITS.2010.938166
  2. W. Jian et al., A survey on video-based vehicle behavior analysis algorithms, J. Multimed. 7 (2012), 223-230.
  3. M. Bommes et al., Video based intelligent transportation systems state of the art and future development, Transp. Res. Proc. 14 (2016), 4495-4504. https://doi.org/10.1016/j.trpro.2016.05.372
  4. E. Toropov et al., Traffic flow from a low frame rate city camera, in Proc. IEEE Int. Conf. Image Process., Quebec City, Canada, Sept. 2015, pp. 3802-3806.
  5. Y. Wu, J. Lim, and M. Yang, Object tracking benchmark, IEEE Trans. Pattern Anal. Mach. Intell. 37 (2015), 1834-1848. https://doi.org/10.1109/TPAMI.2014.2388226
  6. P. Korshunov and W. T. Ooi, Reducing frame rate for object tracking, Advances in multimedia modeling (S. Boll, Q. Tian, L. Zhang, Z. Zhang and Y.-P. Phoebe Chen, eds.), Springer, Berlin, Heidelberg, 2010, pp. 454-464.
  7. N. Wang et al., Understanding and diagnosing visual tracking systems, in Proc. IEEE Int. Conf. Comput. Vision, Santiago, Chile, Dec. 2015, pp. 3101-3109.
  8. A. Del Bimbo and F. Dini, Particle filter-based visual tracking with a first order dynamic model and uncertainty adaptation, Comput. Vis. Image Underst. 115 (2011), 771-786. https://doi.org/10.1016/j.cviu.2011.01.004
  9. W. Guo, Q. Zhao, and D. Gu, Visual tracking using an insect vision embedded particle filter, Math. Probl. Eng. 2015 (2015), 1-16.
  10. A. A. S. Gunawan and I. Wasito, Nonretinotopic particle filter for visual tracking, J. Theor. Appl. Inf. Technol. 63 (2014), 104-111.
  11. A. A. S. Gunawan, M. I. Fanany, and W. Jatmiko, Deep extreme tracker based on bootstrap particle filter, J. Theor. Appl. Inf. Technol. 66 (2014), 857-863.
  12. A. Yilmaz, O. Javed, and M. Shah, Object tracking: A survey, ACM Comput. Surv. 38 (2006), 13:1-13:45.
  13. K. Kang et al., Invariant-feature based object tracking using discrete dynamic swarm optimization, ETRI J. 39 (2017), 151-162. https://doi.org/10.4218/etrij.17.0116.0584
  14. N. Wang and D.-Y. Yeung, Learning a deep compact image representation for visual tracking, Advances in neural information processing systems 26 (C. J. C. Burges, L. Bottou, M. Welling, Z. Ghahramani and K. Q. Weinberger, eds.), Curran Associates, Inc., San Diego, CA, 2013, pp. 809-817.
  15. A. A. S. Gunawan and W. Jatmiko, Geometric deep particle filter for motorcycle tracking: Development of intelligent traffic system in Jakarta, Int. J. Smart Sens. Intell. Syst. 8 (2015), 429-463.
  16. K. Zhang et al., Robust visual tracking via convolutional networks without training, IEEE Trans. Image Process. 25 (2016), 1779-1792. https://doi.org/10.1109/TIP.2016.2531283
  17. C. Ma et al., Hierarchical convolutional features for visual tracking, in Proc. IEEE Int. Conf. Comput. Vision, Santiago, Chile, Dec. 2015, pp. 3074-3082.
  18. Z. Xiaowei, L. Hong, and S. Xiaohong, Object tracking with an evolutionary particle filter based on self-adaptive multi-features fusion, Int. J. Adv. Rob. Syst. 10 (2013), 61-71. https://doi.org/10.5772/54869
  19. G. S. Walia and R. Kapoor, Intelligent video target tracking using an evolutionary particle filter based upon improved cuckoo search, Expert Syst. Appl. 41 (2014), 6315-6326. https://doi.org/10.1016/j.eswa.2014.03.012
  20. J. Kwon, K. M. Lee, and F. C. Park, Visual tracking via geometric particle filtering on the affine group with optimal importance functions, in Proc. IEEE Conf. Comput. Vision Pattern Recogn., Miami, FL, USA, June 2009, pp. 991-998.
  21. Y. Xue and S.-Q. Dai, Continuum traffic model with the consideration of two delay time scales, Phys. Rev. E 68 (2003), 1-6.
  22. Q. Chen and Y. Wang, A cellular automata (CA) model for motorized vehicle flows influenced by bicycles along the roadside, J. Adv. Transport. 50 (2016), 949-966. https://doi.org/10.1002/atr.1382
  23. Y. Luo et al., Modeling the interactions between car and bicycle in heterogeneous traffic, J. Adv. Transport. 49 (2015), 29-47. https://doi.org/10.1002/atr.1257
  24. M. Clerc and J. Kennedy, The particle swarm - explosion, stability, and convergence in a multidimensional complex space, IEEE Trans. Evol. Comput. 6 (2002), 58-73. https://doi.org/10.1109/4235.985692
  25. A. Torralba, R. Fergus, and W. Freeman, 80 million tiny images: A large dataset for non-parametric object and scene recognition, IEEE Trans. Pattern Anal. Mach. Intell. 30 (2008), 1958-1970. https://doi.org/10.1109/TPAMI.2008.128
  26. M. Kristan et al., The visual object tracking VOT2016 challenge results, in Proc. Comput. Vision - ECCV 2016 Workshops, Amsterdam, Netherlands, Oct. 2016, pp. 777-823.
  27. A. Royet et al., Tracking benchmark and evaluation for manipulation tasks, in Proc. IEEE Int. Conf. Robotics Autom., Seattle, WA, USA, May 2015, pp. 2448-2453.
  28. B. Babenko, M.-H. Yang, and S. Belongie, Robust object tracking with online multiple instance learning, IEEE Trans. Pattern Anal. Mach. Intell. 33 (2011), 1619-1632. https://doi.org/10.1109/TPAMI.2010.226
  29. A. A. S. Gunawan et al., Tracking efficiency measurement of dynamic models on geometric particle filter using KLD-resampling, in Proc. Int. Conf. Adv. Comput. Sci. Inf. Syst., 2014, pp. 385-388.
  30. M. Danelljan et al., ECO: Efficient convolution operators for tracking, in Proc. IEEE Conf. Comput. Vision Pattern Recogn., Honolulu, HI, USA, July 2017, pp. 6931-6939.
  31. J. F. Henriques et al., High-speed tracking with kernelized correlation filters, IEEE Trans. Pattern Anal. Mach. Intell. 37 (2015), 583-596. https://doi.org/10.1109/TPAMI.2014.2345390
  32. L. Bertinetto et al., Fully-convolutional siamese networks for object tracking, in Proc. Comput. Vision - ECCV 2016 Workshops, Amsterdam, Netherlands, Oct. 2016, pp. 850-865.
  33. J. Valmadre et al., End-to-end representation learning for correlation filter based tracking, in Proc. IEEE Conf. Comput. Vision Pattern Recogn., Honolulu, HI, USA, July 2017, pp. 5000-5008.
  34. H. Nam and B. Han, Learning multi-domain convolutional neural networks for visual tracking, in Proc. IEEE Conf. Comput. Vision Pattern Recogn., Las Vegas, NV, USA, June 2016, pp. 4293-4302.
  35. C. Bao et al., Real time robust L1 tracker using accelerated proximal gradient approach, in Proc. IEEE Conf. Comput. Vision Pattern Recogn., Providence, RI, USA, June 2012, pp. 1830-1837.
  36. L. Sevilla-Lara and E. Learned-Miller, Distribution fields for tracking, in Proc. IEEE Conf. Comput. Vision Pattern Recogn., Providence, RI, USA, June 2012, pp. 1910-1917.
  37. D. A. Ross et al., Incremental learning for robust visual tracking, Int. J. Comput. Vision 77 (2007), 125-141. https://doi.org/10.1007/s11263-007-0075-7
  38. J. F. Henriques et al., Exploiting the circulant structure of tracking-by-detection with kernels, in Proc. Eur. Conf. Comput. Vision, Florence, Italy, Oct. 2012, pp. 702-715.
  39. M. Kristan et al., The sixth visual object tracking VOT-2018 challenge results, in Proc. Eur. Conf. Comput. Vision Workshops, Munich, Germany, Sept. 2018, pp. 3-53.

Cited by

  1. A Formal and Quantifiable Log Analysis Framework for Test Driving of Autonomous Vehicles, Sensors 20 (2020), no. 5, 1356. https://doi.org/10.3390/s20051356