Multiple Camera Collaboration Strategies for Dynamic Object Association

  • Cho, Shung-Han (Mobile Systems Design Laboratory, Dept. of Electrical and Computer Engineering, Stony Brook University-SUNY) ;
  • Nam, Yun-Young (Center of Excellence for Ubiquitous System, Ajou University) ;
  • Hong, Sang-Jin (Mobile Systems Design Laboratory, Dept. of Electrical and Computer Engineering, Stony Brook University-SUNY)
  • Received : 2010.08.03
  • Accepted : 2010.10.01
  • Published : 2010.12.23

Abstract

In this paper, we present and compare two multiple-camera collaboration strategies that reduce false associations when finding the correspondence of objects across views. Because homographic lines are ineffective for object association when cameras are insufficiently separated, collaboration matrices are defined with the minimum separation required for effective collaboration. The first strategy uses the collaboration matrices to select, among many cameras, the best pair with the maximum separation so that the cameras collaborate efficiently on object association. The association information obtained by the selected pair is propagated to the unselected cameras through global information constructed from the associated targets. Although the first strategy requires a long operation time to achieve a high association rate because of the limited view of the best pair, it reduces the computational cost of using homographic lines. The second strategy initiates the collaboration process for all camera pairs regardless of their separation. In each collaboration process, only the targets crossed by a homographic line transformed from the other collaborating camera generate homographic lines of their own. Although the repetitive association processes improve the association performance, the number of homographic line transformations increases exponentially. The proposed methods are evaluated with real video sequences and compared in terms of computational cost and association performance. The simulation results demonstrate that the proposed methods effectively reduce the false association rate compared with basic pair-wise collaboration.
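
To make the pair-selection step above concrete, the following minimal sketch (Python with NumPy) builds a simple pairwise collaboration matrix, picks the most separated camera pair that still satisfies a minimum-separation requirement, and transfers an image point between views with a ground-plane homography. The separation measure (camera-to-camera distance), the threshold value, and the homography used here are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

MIN_SEPARATION = 2.0  # hypothetical minimum separation required for effective collaboration


def collaboration_matrix(camera_positions):
    """Pairwise separation between cameras; Euclidean distance between camera
    positions is used here as a stand-in for the paper's separation measure."""
    n = len(camera_positions)
    C = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if i != j:
                C[i, j] = np.linalg.norm(camera_positions[i] - camera_positions[j])
    return C


def select_best_pair(C, min_sep=MIN_SEPARATION):
    """Return the camera pair with the maximum separation among pairs that
    meet the minimum separation; return None if no pair is eligible."""
    masked = np.where(C >= min_sep, C, -np.inf)
    i, j = np.unravel_index(np.argmax(masked), masked.shape)
    return (int(i), int(j)) if np.isfinite(masked[i, j]) else None


def transfer_point(H, p):
    """Map an image point from one view into the other with a 3x3 ground-plane
    homography H, using homogeneous coordinates."""
    q = H @ np.array([p[0], p[1], 1.0])
    return q[:2] / q[2]


if __name__ == "__main__":
    # Toy camera layout (2-D positions) purely for illustration.
    cams = np.array([[0.0, 0.0], [1.0, 0.5], [4.0, 0.0], [0.0, 5.0]])
    C = collaboration_matrix(cams)
    pair = select_best_pair(C)
    print("selected pair:", pair)  # the most separated eligible pair

    # Placeholder homography between the two selected views.
    H = np.eye(3)
    H[0, 2], H[1, 2] = 2.0, -1.0
    print("transferred point:", transfer_point(H, (10.0, 20.0)))
```

In the same spirit, a target crossed by the transferred line in the second view would itself generate a homographic line back to the first view, which is where the repeated transformations described above accumulate.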
