Smart Phone Picture Recognition Algorithm Using Electronic Maps of Architecture Configuration

건물 배치 전자도면을 이용한 모바일 폰의 피사체 인지 방법

  • 임재걸 (Dongguk University, Department of Computer Engineering) ;
  • 주재훈 (Dongguk University, College of Business and Tourism, Division of Business and Economics) ;
  • 이계영 (Dongguk University, Department of Computer Engineering)
  • Received : 2012.01.02
  • Accepted : 2012.04.23
  • Published : 2012.08.31

Abstract

As electronics and information technology advance, smart phones are gaining more powerful processors and larger storage. As a result, a variety of new and useful services are becoming available on smart phones; context-aware services and mobile augmented reality have recently been among the most popular research topics. For these newly developed services, identifying the object in a picture taken by the phone's camera plays an extremely important role. Many studies on picture identification have therefore been published, and most of them are based on time-consuming image recognition techniques. In contrast, this paper introduces a fast and effective method of identifying the objects in a photo using sensor data obtained from the smart phone together with electronic maps. Our method estimates the camera's line of sight from the location and orientation information provided by the smart phone, then finds the map elements that intersect that line of sight. By examining those intersecting elements, the method identifies the objects in the photo.

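The core step described in the abstract, casting the camera's line of sight from the phone's position and azimuth and finding the map elements it crosses, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the map format (named wall segments in local planar coordinates), the building names, and the azimuth convention (degrees clockwise from north) are all assumptions for the example.

```python
import math

# Hypothetical electronic map: each element is (name, (x1, y1), (x2, y2)),
# a wall segment of a building footprint in local planar coordinates (meters).
MAP_ELEMENTS = [
    ("Library",     (10.0, 5.0),  (10.0, 15.0)),
    ("Engineering", (25.0, -3.0), (25.0, 8.0)),
]

def ray_segment_intersection(origin, azimuth_deg, p1, p2):
    """Distance along the line of sight to segment p1-p2, or None if no hit."""
    ox, oy = origin
    # Assumed convention: azimuth measured clockwise from north (+y axis).
    theta = math.radians(azimuth_deg)
    dx, dy = math.sin(theta), math.cos(theta)   # unit direction vector
    (x1, y1), (x2, y2) = p1, p2
    ex, ey = x2 - x1, y2 - y1
    denom = dx * ey - dy * ex
    if abs(denom) < 1e-12:                      # sight line parallel to segment
        return None
    # Solve origin + t*(dx, dy) == p1 + u*(ex, ey) for t (ray) and u (segment).
    t = ((x1 - ox) * ey - (y1 - oy) * ex) / denom
    u = ((x1 - ox) * dy - (y1 - oy) * dx) / denom
    if t >= 0.0 and 0.0 <= u <= 1.0:
        return t                                # distance, since (dx, dy) is unit length
    return None

def identify_object(origin, azimuth_deg, elements=MAP_ELEMENTS):
    """Name of the nearest map element crossed by the camera's line of sight."""
    best = None
    for name, p1, p2 in elements:
        d = ray_segment_intersection(origin, azimuth_deg, p1, p2)
        if d is not None and (best is None or d < best[0]):
            best = (d, name)
    return best[1] if best else None
```

Taking the nearest intersection reflects the physical situation that a closer building occludes a farther one along the same sight line; with the sample map above, a camera at the origin facing due east sees "Engineering", while one facing northeast sees "Library".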

References

  1. Kim, K. and Cho, S., "Trends in Context-Aware Mobile Phone Technology Development," IITA Weekly Technology Trends, No. 1280, 2007.
  2. Joo, J. and Yim, J., "Personalized Content Service Method," Korean Patent Registration No. 10-1082953, 2011.
  3. Abe, N., Oogami, W., Shimada, A., Nagahara, H., and Taniguchi, R., "Clickable Real World : Interaction with Real-World Landmarks Using Mobile Phone Camera," IEEE Region 10 Conference, pp. 914-917, 2010.
  4. Abe, T., Takada, T., Kawamura, H., Yasuno, T., and Sonehara, N., "Image-Identification Methods for Camera-Equipped Mobile Phones," International Conference on Mobile Data Management, pp. 372-376, 2007.
  5. Bruns, E., Brombach, B., and Bimber, O., "Mobile Phone-Enabled Museum Guidance with Adaptive Classification," IEEE Computer Graphics and Applications, Vol. 28, No. 4, pp. 98-102, 2008. https://doi.org/10.1109/MCG.2008.77
  6. Bruns, E., Brombach, B., Zeidler, T., and Bimber, O., "Enabling Mobile Phones To Support Large-Scale Museum Guidance," IEEE Multimedia, Vol. 14, No. 2, pp. 16-25, 2007. https://doi.org/10.1109/MMUL.2007.33
  7. Chen, D., Tsai, S., Vedantham, R., Grzeszczuk, R., and Girod, B., "Streaming Mobile Augmented Reality on Mobile Phones," IEEE International Symposium on Mixed and Augmented Reality, pp. 181-182, 2009.
  8. Cipolla, R., Robertson, D., and Tordoff, B., "Image-based Localization," Proceedings of International Conference on Virtual Systems and Multimedia(VSMM), pp. 22-29, 2004.
  9. Han, E., Yang, H., and Jung, K., "Mobile Education through Camera-Equipped Mobile Phones," International Conference on Convergence Information Technology, pp. 1656-1661, 2007.
  10. Hile, H. and Borriello, G., "Positioning and Orientation in Indoor Environments Using Camera Phones," IEEE Computer Graphics and Applications, Vol. 28, No. 4, pp. 32-39, 2008. https://doi.org/10.1109/MCG.2008.80
  11. Kalkusch, M., Lidy, T., Knapp, M., Reitmayr, G., Kaufmann, H., and Schmalstieg, D., "Structured Visual Markers for Indoor Pathfinding," Proceedings of the First IEEE International Workshop on ARToolKit, 2002.
  12. Lee, A., Lee, J., Lee, S., and Choi, J., "Real-Time Camera Pose Estimation for Augmented Reality System Using a Square Marker," International Symposium on Wearable Computers(ISWC), pp. 1-2, 2010.
  13. Li, Y., Lim, J., Lim, K., and You, Y., "Showroom introduction using mobile phone based on scene image recognition," IEEE International Conference on Multimedia and Expo, pp. 1294-1297, 2009.
  14. Lim, J., Li, Y., You, Y., and Chevallet, J., "Scene Recognition with Camera Phones for Tourist Information Access," IEEE International Conference on Multimedia and Expo, pp. 100-103, 2007.
  15. Mitchell, K. and Race, N., "uLearn : Facilitating Ubiquitous Learning Through Camera Equipped Mobile Phones," IEEE International Workshop on Wireless and Mobile Technologies in Education, 2005.
  16. Mohring, M., Lessig, C., and Bimber, O., "Video See-Through Consumer Cell-Phones," Proceedings of International Symposium on Mixed and Augmented Reality (ISMAR), pp. 252-253, 2004.
  17. Mulloni, A., Wagner, D., Barakonyi, I., and Schmalstieg, D., "Indoor Positioning and Navigation with Camera Phones," IEEE Pervasive Computing, pp. 22-31, 2009.
  18. Naimark, L. and Foxlin, E., "Circular Data Matrix Fiducial System and Robust Image Processing for a Wearable Vision Inertial Self-Tracker," Proceedings of the 1st International Symposium on Mixed and Augmented Reality, p. 27, 2002.
  19. Noh, H., Lee, J., Oh, S., Hwang, K., and Cho, D., "Exploiting Indoor Location and Mobile Information for Context-Awareness Service," Information Processing and Management, In Press, Corrected Proof, Available online 12 March 2011.
  20. Oe, M., Sato, T., and Yokoya, N., "Estimating Camera Position and Posture by Using Feature Landmark Database," Proceedings of Scandinavian Conference on Image Analysis(SCIA), pp. 171-181, 2005.
  21. Paucher, R. and Turk, M., "Location-Based Augmented Reality on Mobile Phones," IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops(CVPRW), pp. 9-16, 2010.
  22. Rosten, E. and Drummond, T., "Fusing Points and Lines for High Performance Tracking," Proceedings of International Conference on Computer Vision(ICCV), Vol. 2, pp. 1508-1515, 2005.
  23. Sato, J., Takahashi, T., Ide, I., and Murase, H., "Change Detection in Streetscapes from GPS Coordinated Omni-Directional Image Sequences," Proceedings of International Conference on Pattern Recognition(ICPR), Vol. 4, pp. 935-938, 2006.
  24. Schmalstieg, D. and Wagner, D., "Experiences with Handheld Augmented Reality," Proceedings of 6th International Symposium on Mixed and Augmented Reality(ISMAR), pp. 3-8, 2007.
  25. Skrypnyk, I. and Lowe, D. G., "Scene Modelling, Recognition and Tracking with Invariant Image Features," Proceedings of International Symposium on Mixed and Augmented Reality(ISMAR), pp. 110-119, 2004.
  26. Vacchetti, L., Lepetit, V., and Fua, P., "Combining Edge and Texture Information for Real-Time Accurate 3d Camera Tracking," Proceedings of International Symposium on Mixed and Augmented Reality(ISMAR), pp. 48-57, 2004.
  27. Vandenhouten, R. and Selz, M., "Identification and tracking of goods with the mobile phone," International Symposium on Logistics and Industrial Informatics, pp. 25-29, 2007.
  28. Wagner, D. and Schmalstieg, D., "First Steps Towards Handheld Augmented Reality," Proceedings of International Symposium on Wearable Computers(ISWC), pp. 21-23, 2003.
  29. Yeo, C., Chia, L., Cham, T., and Rajon, D., "Click4BuildingID@NTU : Click for Building Identification with GPS-enabled Camera Cell Phone," IEEE International Conference on Multimedia and Expo, pp. 1059-1062, 2007.
  30. Yim, J., "Introducing a Decision Tree-based Indoor Positioning Technique," Expert Systems with Applications, Vol. 34, No. 2, pp. 1296-1302, 2008. https://doi.org/10.1016/j.eswa.2006.12.028

Cited by

  1. A Study on Automatic Tooth Root Segmentation For Dental CT Images vol.19, pp.4, 2014, https://doi.org/10.7838/jsebs.2014.19.4.045