• Title/Abstract/Keyword: Mobile interaction

Search results: 618 (processing time: 0.03 seconds)

모바일 장치를 위한 동작 추적형 이미지 브라우징 시스템 (Image Browsing in Mobile Devices Using User Motion Tracking)

  • 임성훈;황재인;최승문;김정현
    • 한국HCI학회논문지 / Vol. 3, No. 1 / pp. 49-56 / 2008
  • Most mobile devices today are equipped with digital cameras and can store large amounts of image data. This, however, makes browsing the images stored on the device difficult and time-consuming. The difficulty is compounded on mobile devices, which have small screens and, compared with desktop computers, unnatural and inconvenient interfaces. To address this, we propose a 3D visualization method and a motion-sensing interface, and explore effective image browsing on mobile devices through the combination of the two. We also conducted a usability evaluation comparing the proposed methods with existing ones.

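The abstract above does not spell out how motion sensing drives the browsing view; as a rough illustration only, the Python sketch below assumes a hypothetical tilt-to-pan mapping in which accelerometer tilt angles pan a virtual camera over a grid of thumbnails. The function name, gain, and dead-zone parameters are invented for this example and are not taken from the paper.

```python
import math

def tilt_to_pan(accel_x, accel_y, accel_z, gain=4.0, dead_zone_deg=3.0):
    """Map device tilt (raw accelerometer readings, m/s^2) to a 2D pan
    velocity over an image grid. Hypothetical mapping for illustration."""
    # Tilt angles around the device's x and y axes, in degrees.
    pitch = math.degrees(math.atan2(accel_y, math.sqrt(accel_x**2 + accel_z**2)))
    roll = math.degrees(math.atan2(accel_x, math.sqrt(accel_y**2 + accel_z**2)))

    def apply_dead_zone(angle):
        # Ignore small tilts so the view stays still when the device is held level.
        if abs(angle) < dead_zone_deg:
            return 0.0
        return angle - math.copysign(dead_zone_deg, angle)

    # Pan velocity in thumbnails per second along each screen axis.
    return gain * apply_dead_zone(roll), gain * apply_dead_zone(pitch)

# Example: device tilted slightly to the right and toward the user.
print(tilt_to_pan(accel_x=2.0, accel_y=-1.5, accel_z=9.3))
```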

4차 산업혁명과 멀티미디어 융합 기술: 모바일 로봇 기반 이동형 프로젝션 기술을 이용한 Pervasive AR 플랫폼 구축 (The Fourth Industrial Revolution and Multimedia Converging Technology: Pervasive AR Platform Construction using a Mobile Robot based Projection Technology)

  • 채승호;양윤식;한탁돈
    • 한국멀티미디어학회논문지 / Vol. 20, No. 2 / pp. 298-312 / 2017
  • The fourth industrial revolution is expected to bring technological innovation that crosses the boundaries between fields through convergence and integration. With the development and convergence of digital technology, users can receive information anywhere in the world. In this paper, we propose an adaptive interaction concept for diverse environments using a mobile robot with projection-based augmented reality (AR). Most previous studies have relied on a fixed projector or on projection into a pre-designed environment, and therefore provide only limited information. To overcome this problem, we provide adaptive information by implementing a projection AR system that can be mounted on a mobile robot. We define this mobile-robot-based projection system as Pervasive AR, which consists of a pervasive display, a pervasive interface, and seamless interaction. Pervasive AR enables users to access information immediately by expanding the display area into real space, creating an intuitive and convenient interaction environment through an expanded user interface. The system can be applied to various settings, such as home environments and public spaces.

An Automatic and Scalable Application Crawler for Large-Scale Mobile Internet Content Retrieval

  • Huang, Mingyi;Lyu, Yongqiang;Yin, Hao
    • KSII Transactions on Internet and Information Systems (TIIS) / Vol. 12, No. 10 / pp. 4856-4872 / 2018
  • The mobile internet has grown ubiquitous across the globe with the widespread use of smart devices. However, the designs of modern mobile operating systems and their applications limit content retrieval from mobile applications. The mobile internet is not as accessible as the traditional web: it has more man-made restrictions and lacks a unified approach for crawling and content retrieval. In this study, we propose an automatic and scalable mobile application content crawler, which recognizes the interaction paths of mobile applications, represents them as interaction graphs, and automatically collects content according to the graphs in a parallel manner. The crawler was verified by retrieving content from 50 non-game applications from the Google Play Store on the Android platform. The experiment showed the efficiency and scalability potential of our crawler for large-scale mobile internet content retrieval.
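As an illustration of the interaction-graph idea in the abstract above, the sketch below models screens as nodes and tappable widgets as edges and explores them depth-first. The `MobileApp` driver interface is hypothetical (a stand-in for whatever UI-automation layer is used), and the parallel collection mentioned in the abstract is omitted for brevity.

```python
class MobileApp:
    """Hypothetical driver around an instrumented mobile app; only the
    methods used by crawl() are assumed to exist."""
    def current_screen_id(self) -> str: raise NotImplementedError
    def list_actions(self) -> list: raise NotImplementedError    # tappable widgets on this screen
    def perform(self, action) -> None: raise NotImplementedError # tap a widget, possibly changing screens
    def extract_content(self) -> str: raise NotImplementedError  # text/media visible on this screen
    def go_back(self) -> None: raise NotImplementedError

def crawl(app: MobileApp, visited=None, max_screens=200):
    """Depth-first crawl over the app's interaction graph.
    Returns {screen_id: {"content": ..., "edges": {action: next_screen_id}}}."""
    if visited is None:
        visited = {}
    screen = app.current_screen_id()
    if screen in visited or len(visited) >= max_screens:
        return visited
    visited[screen] = {"content": app.extract_content(), "edges": {}}
    for action in app.list_actions():
        app.perform(action)                      # follow one edge of the graph
        visited[screen]["edges"][action] = app.current_screen_id()
        crawl(app, visited, max_screens)         # explore the screen we reached
        app.go_back()                            # return before trying the next edge
    return visited
```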

이동형 시선추적기를 활용한 초등교사의 과학 수업 분석 (Elementary Teacher's Science Class Analysis using Mobile Eye Tracker)

  • 신원섭;김장환;신동훈
    • 한국초등과학교육학회지:초등과학교육 / Vol. 36, No. 4 / pp. 303-315 / 2017
  • The purpose of this study is to analyze elementary teachers' science classes objectively and quantitatively using a mobile eye tracker. The mobile eye tracker is worn like a pair of eyeglasses, and its recordings are collected as video, making it well suited to capturing objective data on the teacher's classroom situation in real time. The participants were two elementary teachers teaching sixth-grade science in Seoul. Each participant taught a 40-minute class while wearing the mobile eye tracker. Eye movements were sampled at 60 Hz and analyzed with SMI BeGaze 3.7. Areas related to the class were set as areas of interest (AOIs), and we analyzed the teachers' visual occupancy of these areas, as well as the linguistic interaction between the teachers and their students. The results are as follows. First, we analyzed the visual occupancy of meaningful areas in the teaching-learning activities at each stage of the class. Second, analysis of eye movements during teacher-student interaction showed that teacher A attended to students' faces for a high proportion of the time, whereas teacher B showed high visual occupancy in areas unrelated to the class. Third, the participants' linguistic interaction was analyzed across questions, attention-focusing language, elementary science teaching terminology, everyday interaction, humor, and unnecessary words. This study shows that elementary science classes can be analyzed objectively and quantitatively through visual-occupancy analysis with mobile eye tracking, and suggests that teachers' visual attention during teaching can serve as an index for analyzing the form of their linguistic interaction.
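Visual occupancy per area of interest (AOI), as used in the abstract above, can be read as the share of gaze samples falling inside each AOI. Below is a minimal sketch assuming 60 Hz gaze samples already mapped to scene coordinates and rectangular AOIs; the data layout is an assumption for illustration, not the SMI BeGaze export format.

```python
def visual_occupancy(gaze_samples, aois, sample_rate_hz=60):
    """Compute dwell time and share of samples per area of interest (AOI).

    gaze_samples: iterable of (x, y) gaze points in scene-camera pixels.
    aois: dict mapping AOI name -> (x_min, y_min, x_max, y_max) rectangle.
    Returns dict: AOI name -> (dwell_seconds, fraction_of_all_samples).
    """
    counts = {name: 0 for name in aois}
    total = 0
    for x, y in gaze_samples:
        total += 1
        for name, (x0, y0, x1, y1) in aois.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                counts[name] += 1
                break  # credit each sample to at most one AOI
    return {name: (c / sample_rate_hz, c / total if total else 0.0)
            for name, c in counts.items()}

# Example with three AOIs of a classroom scene (coordinates are made up).
aois = {"students": (0, 200, 640, 480), "board": (0, 0, 640, 200), "materials": (640, 0, 1280, 480)}
samples = [(320, 300), (320, 100), (900, 240), (330, 310)]
print(visual_occupancy(samples, aois))
```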

모바일 디바이스에서 사용자의 입 바람을 이용한 연기 시뮬레이션의 상호작용 방법 (Interaction Technique in Smoke Simulations using Mouth-Wind on Mobile Devices)

  • 김종현
    • 한국컴퓨터그래픽스학회논문지 / Vol. 24, No. 4 / pp. 21-27 / 2018
  • This paper presents a technique for real-time interaction on mobile devices using the wind blown from the user's mouth. Although user interaction techniques are important in mobile and virtual reality applications, the variety of user interface technologies remains limited, with most relying on hand-based touch input or gesture recognition. In this study, we propose a new interface technique that enables real-time interaction using the user's mouth wind. The wind direction is determined from the angle and position between the user and the mobile device, and the wind magnitude is computed from the user's mouth wind and a weighting function. To demonstrate the effectiveness of the proposed technique, we apply the mouth-wind external force to the fluid equations and visualize the resulting vector-field flow in real time. Although we show the results in a mobile device environment, the method can also be applied to virtual and augmented reality applications that require such interface techniques.
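The abstract only outlines how wind direction and magnitude are computed. The sketch below assumes a simplified model in which direction comes from the device's orientation relative to the user's mouth and magnitude comes from microphone amplitude shaped by an exponential distance weighting, injected as an external force into one cell of a grid velocity field. The parameter names and the weighting function are assumptions for illustration, not the paper's formulation.

```python
import math

def mouth_wind_force(pitch_rad, yaw_rad, mic_amplitude, distance_m,
                     max_force=5.0, falloff=2.0):
    """Return a 2D external force (fx, fy) for a grid-based smoke solver.

    pitch_rad, yaw_rad: device orientation relative to the user's mouth.
    mic_amplitude: normalized microphone input in [0, 1] while blowing.
    distance_m: estimated mouth-to-device distance.
    """
    # Direction of the wind on the screen plane, from the device orientation.
    dx = math.cos(yaw_rad) * math.cos(pitch_rad)
    dy = math.sin(pitch_rad)
    norm = math.hypot(dx, dy) or 1.0
    # Magnitude: louder blowing and a closer mouth give a stronger force.
    weight = math.exp(-falloff * distance_m)
    magnitude = max_force * min(max(mic_amplitude, 0.0), 1.0) * weight
    return dx / norm * magnitude, dy / norm * magnitude

def add_force_to_field(velocity_field, cell, force, dt=1.0 / 60.0):
    """Inject the force into one cell of a velocity grid, as a stand-in for
    the external-force term of the fluid equations."""
    u, v = velocity_field[cell]
    velocity_field[cell] = (u + force[0] * dt, v + force[1] * dt)

# Example: blow toward the lower-left of the screen while tilting the phone.
field = {(8, 2): (0.0, 0.0)}
f = mouth_wind_force(pitch_rad=0.3, yaw_rad=-0.5, mic_amplitude=0.8, distance_m=0.1)
add_force_to_field(field, (8, 2), f)
print(field)
```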

키넥트와 스마트폰을 활용한 공용 공간상에서 모바일 상호작용 (Mobile Interaction Using Smartphones and Kinect in a Global Space)

  • 김민석;이재열
    • 대한산업공학회지 / Vol. 40, No. 1 / pp. 100-107 / 2014
  • This paper presents a co-located and mobile interaction technique using smartphones in a global space. To effectively detect the locations and orientations of smartphones, the proposed approach utilizes Kinect that captures RGB image as well as 3D depth information. Based on the locations and orientations of smartphones, the proposed approach can support direct, collaborative and private interactions with the global space. Thus, it can provide more effective mobile interactions for local space exploration and collaboration.
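As one way to picture how Kinect-derived smartphone poses could drive such co-located interaction, the sketch below assumes each phone's 3D position and pointing direction are already known in the Kinect/world frame and intersects that pointing ray with the plane of a shared display to find where the phone is aimed. This is a hypothetical simplification, not the paper's actual pipeline.

```python
import numpy as np

def pointed_location(phone_pos, phone_dir, plane_point, plane_normal):
    """Intersect the ray from a phone (position + pointing direction) with the
    plane of a shared display. Returns the 3D intersection point, or None if
    the phone points away from or parallel to the display."""
    phone_pos, phone_dir = np.asarray(phone_pos, float), np.asarray(phone_dir, float)
    plane_point, plane_normal = np.asarray(plane_point, float), np.asarray(plane_normal, float)
    denom = phone_dir.dot(plane_normal)
    if abs(denom) < 1e-6:
        return None                      # ray is parallel to the display plane
    t = (plane_point - phone_pos).dot(plane_normal) / denom
    if t < 0:
        return None                      # display is behind the phone
    return phone_pos + t * phone_dir

# Example: a phone about 2 m in front of a display lying on the z = 0 plane.
print(pointed_location([0.5, 1.2, 2.0], [0.1, -0.05, -1.0], [0, 0, 0], [0, 0, 1]))
```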

웹 및 모바일 폰에서의 인터랙티브 3D-View 이미지 서비스 기술 (Interactive 3D-View Image Service on Web and Mobile Phone)

  • Jeon, Kyeong-Won;Kwon, Yong-Moo;Jo, Sang-Woo;Ki, Jeong-Seok
    • 한국HCI학회:학술대회논문집 / 한국HCI학회 2007년도 학술대회 1부 / pp. 518-523 / 2007
  • This paper presents a web service and a mobile phone service based on research on a virtual URS (Ubiquitous Robotic Space). We modeled the URS, located the robot within the virtual URS on the web and on a mobile phone, and controlled the robot's view with the mobile phone. The paper addresses the concept of a virtual URS, introduces interaction between a robot in the virtual URS and a human via web and mobile phone services, and presents a case of the service running on a mobile phone.


A Study on Developmental Direction of Interface Design for Gesture Recognition Technology

  • Lee, Dong-Min;Lee, Jeong-Ju
    • 대한인간공학회지 / Vol. 31, No. 4 / pp. 499-505 / 2012
  • Objective: To study how interaction between mobile machines and users is changing, through an analysis of current trends in gesture interface technology. Background: For smooth interaction between machines and users, interface technology has evolved from the command line to the mouse, and more recently to touch and gesture recognition. In the future it is expected to evolve toward multi-modal interfaces that fuse the visual and auditory senses, and toward 3D multi-modal interfaces that use three-dimensional virtual worlds and brain waves. Method: The trends and developments of gesture interfaces and related technologies, which follow the evolution of mobile machines, are reviewed comprehensively. Based on how gesture information is gathered, they are divided into four categories: sensor-based, touch-based, vision-based, and multi-modal gesture interfaces. Each category is examined through technology trends and existing examples, and the resulting transformation of mobile machine-human interaction is studied. Conclusion: Gesture-based interface technology brings intelligent communication to the interaction between previously static machines and their users, and is therefore a key element technology for making human-machine interaction more dynamic. Application: The results of this study may help in the design of gesture interfaces currently in use.