An alternative method for smartphone input using AR markers

  • Kang, Yuna (1st R&D Institute-3, Agency for Defense Development)
  • Han, Soonhung (Department of Ocean System Engineering, Korea Advanced Institute of Science and Technology)
  • Received : 2014.01.08
  • Accepted : 2014.03.18
  • Published : 2014.07.01

Abstract

As smartphones have come into wide use, they have become popular not only among young people but among middle-aged people as well. Most smartphones adopt a capacitive full touch screen, so touch commands are made with the fingers, unlike the stylus-based PDAs of the past. A significant portion of the screen is therefore blocked by the finger, making it impossible to see the area around the point being touched and difficult to make precise inputs. To solve this problem, this research proposes a method that uses a simple AR marker to improve the smartphone interface. A marker is placed in front of the smartphone camera, and the camera image is analyzed to determine the marker's position, which is then used as the position of a mouse-style cursor. This method supports the click, double-click, and drag-and-drop operations used on PCs as well as the touch, slide, and long-touch inputs used on smartphones. With this approach, smartphone input becomes more precise and simple, and the work shows the possibility of a new concept of smartphone interface.
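
The paper gives no code, but the pipeline the abstract describes (detect a marker in the camera image, take its position, and map it to a cursor position on the screen) can be sketched roughly as below. The sketch uses OpenCV's ArUco marker support (OpenCV 4.7 or later); the marker dictionary, screen resolution, and the linear coordinate mapping are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch: AR-marker position as a cursor position.
# Assumptions (not from the paper): ArUco 4x4 markers, a 1080x1920 screen,
# and a simple linear mapping from camera-frame pixels to screen pixels.
import cv2

SCREEN_W, SCREEN_H = 1080, 1920  # assumed smartphone screen resolution
ARUCO_DICT = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
DETECTOR = cv2.aruco.ArucoDetector(ARUCO_DICT, cv2.aruco.DetectorParameters())

def marker_to_cursor(frame):
    """Detect one AR marker in a camera frame and map its center
    to screen coordinates; return None if no marker is visible."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = DETECTOR.detectMarkers(gray)
    if ids is None or len(corners) == 0:
        return None
    # Center of the first detected marker, in camera-image pixels.
    pts = corners[0].reshape(4, 2)
    cx, cy = pts[:, 0].mean(), pts[:, 1].mean()
    # Normalize by the camera frame size and scale to the screen.
    h, w = gray.shape
    return int(cx / w * SCREEN_W), int(cy / h * SCREEN_H)

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            cursor = marker_to_cursor(frame)
            if cursor is not None:
                print("cursor at", cursor)  # an app would draw/move the cursor here
    except KeyboardInterrupt:
        pass
    finally:
        cap.release()
```

On top of such a cursor stream, the click, double-click, drag, and long-touch events mentioned in the abstract would be derived from how long the marker dwells at one position or when it appears and disappears; the thresholds for that are again a design choice, not something specified here.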

Keywords
