• Title/Summary/Keyword: Touch-points extraction

Implementation of non-Wearable Air-Finger Mouse by Infrared Diffused Illumination (적외선 확산 투광에 의한 비장착형 공간 손가락 마우스 구현)

  • Lee, Woo-Beom
    • The Journal of the Institute of Internet, Broadcasting and Communication / v.15 no.2 / pp.167-173 / 2015
  • Extraction of finger-end points is one of the most important processes for issuing user multi-commands in hand-gesture interface technology. However, most previous works rely on geometric and morphological methods to extract the finger-end points. This paper therefore proposes a finger-end point extraction method motivated by infrared diffused illumination, as used for user commands in multi-touch display devices. The proposed air-mouse is operated by the number and moving direction of the extracted finger-end points. The system also provides the basic mouse events as well as a continuous command function for extending user multi-gestures. To evaluate its performance, the proposed method was applied to a web browser application as a command device. As a result, it showed an average 90% success rate over the various user commands.
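
The paper above does not publish code, but its pipeline (bright fingertip blobs under diffused IR illumination, commands driven by how many fingertip points are present and which way they move) can be illustrated. The following is a minimal sketch using OpenCV and NumPy; the threshold and blob-area values are hypothetical, not taken from the paper.

```python
# Illustrative sketch only: fingertip-blob extraction from a diffused-IR frame.
# Threshold and area values are hypothetical, not taken from the paper.
import cv2
import numpy as np

def extract_fingertip_points(ir_frame, thresh=200, min_area=30, max_area=400):
    """Return centroids of bright blobs that plausibly correspond to fingertips."""
    _, binary = cv2.threshold(ir_frame, thresh, 255, cv2.THRESH_BINARY)
    binary = cv2.morphologyEx(binary, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    points = []
    for c in contours:
        area = cv2.contourArea(c)
        if min_area <= area <= max_area:
            m = cv2.moments(c)
            points.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return points

def moving_direction(prev_points, cur_points):
    """Coarse motion of the fingertip set: mean displacement between frames."""
    if not prev_points or not cur_points:
        return (0.0, 0.0)
    px, py = np.mean(prev_points, axis=0)
    cx, cy = np.mean(cur_points, axis=0)
    return (cx - px, cy - py)
```

The number of returned points could select a command and the mean displacement vector could drive cursor motion, which loosely corresponds to the "number and moving direction" behaviour described in the abstract.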

Implementation of a DI Multi-Touch Display Using an Improved Touch-Points Detection and Gesture Recognition (개선된 터치점 검출과 제스쳐 인식에 의한 DI 멀티터치 디스플레이 구현)

  • Lee, Woo-Beom
    • Journal of the Institute of Convergence Signal Processing / v.11 no.1 / pp.13-18 / 2010
  • Most research in the multi-touch area is based on FTIR (Frustrated Total Internal Reflection) and simply reimplements the earlier approach. Moreover, there are no software solutions that improve the performance of multi-touch blob detection or user gesture recognition. We therefore implement a multi-touch table-top display based on DI (Diffused Illumination) with improved touch-point detection and user gesture recognition. The proposed method supports simultaneous multi-touch transformation commands on objects in the running application. In addition, system latency is reduced by the proposed pre-testing method in the multi-touch blob detection stage. The implemented device is driven by a Flash AS3 application in the TUIO (Tangible User Interface Object) environment, which is based on the OSC (Open Sound Control) protocol. As a result, our system shows a 37% reduction in system latency and succeeds in recognizing the multi-touch gestures.
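
The device above delivers touch events to a Flash AS3 client via TUIO over OSC. As context only, here is a minimal TUIO 1.1 cursor listener written with the python-osc package instead of Flash; it assumes the conventional /tuio/2Dcur profile and UDP port 3333, and it does not reproduce the paper's pre-testing blob detector.

```python
# Illustrative sketch only: a minimal TUIO 1.1 cursor listener over OSC
# (python-osc assumed installed), standing in for the Flash AS3 client.
from pythonosc import dispatcher, osc_server

def on_2dcur(address, *args):
    # TUIO 1.1 /tuio/2Dcur bundles carry "alive", "set" and "fseq" messages;
    # "set" messages include a session id and normalized x, y coordinates.
    if args and args[0] == "set":
        session_id, x, y = args[1], args[2], args[3]
        print(f"cursor {session_id}: ({x:.3f}, {y:.3f})")

disp = dispatcher.Dispatcher()
disp.map("/tuio/2Dcur", on_2dcur)

# TUIO trackers conventionally send to UDP port 3333.
server = osc_server.ThreadingOSCUDPServer(("127.0.0.1", 3333), disp)
server.serve_forever()
```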

Automation of Bio-Industrial Process Via Tele-Task Command(I) -identification and 3D coordinate extraction of object- (원격작업 지시를 이용한 생물산업공정의 생력화 (I) -대상체 인식 및 3차원 좌표 추출-)

  • Kim, S. C.;Choi, D. Y.;Hwang, H.
    • Journal of Biosystems Engineering / v.26 no.1 / pp.21-28 / 2001
  • Major deficiencies of current automation schemes, including various robots for bioproduction, are the lack of task adaptability and real-time processing, low job performance for diverse tasks, lack of robustness of task results, high system cost, and failure to gain the operator's trust. This paper proposes a scheme that could overcome these limitations of conventional computer-controlled automatic systems. The proposed scheme is man-machine hybrid automation via tele-operation, which can handle various bioproduction processes, and it comprises two parts: efficient task sharing between the operator and the CCM (computer-controlled machine), and an efficient interface between the operator and the CCM. To realize the proposed concept, the task of object identification and extraction of the 3D coordinates of an object was selected. 3D coordinate information was obtained from camera calibration, using the camera as a measurement device. Two stereo images were obtained by moving a camera a certain distance in the horizontal direction normal to the focal axis and acquiring an image at each location. The transformation matrix for camera calibration was obtained by a least-squares approach using six known pairs of data points in the 2D image and the 3D world space. The 3D world coordinates were then obtained from the two sets of image pixel coordinates using the calibrated transformation matrix. As the interface between the operator and the CCM, a touch-pad screen mounted on the monitor and a remotely captured imaging system were used. The operator indicated an object by touching the captured image on the touch-pad screen; a local image-processing area of a certain size was specified around the touch, and image processing was performed on that local area to extract the desired features of the object. An MS Windows based interface software was developed using Visual C++ 6.0, with four modules: remote image acquisition, task command, local image processing, and 3D coordinate extraction. The proposed scheme showed the feasibility of real-time processing, robust and precise object identification, and adaptability to various jobs and environments through the selected sample tasks.
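
The calibration and 3D-extraction steps described above are not given as formulas here, so the sketch below uses a standard direct linear transform (DLT): a 3x4 projection matrix is fitted by least squares to known 2D-3D point pairs, and a point seen from both camera positions is triangulated linearly. The paper's exact formulation may differ (it reports a single transformation matrix estimated from six point pairs); this is only an approximation under that assumption.

```python
# Illustrative sketch only: least-squares camera calibration (DLT) from known
# 2D-3D point pairs and linear triangulation of a point seen in two views.
import numpy as np

def calibrate_dlt(world_pts, image_pts):
    """Estimate a 3x4 projection matrix from >= 6 pairs of (X,Y,Z) and (u,v)."""
    rows = []
    for (X, Y, Z), (u, v) in zip(world_pts, image_pts):
        rows.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        rows.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    A = np.asarray(rows, float)
    # least-squares solution: right singular vector of the smallest singular value
    _, _, vt = np.linalg.svd(A)
    return vt[-1].reshape(3, 4)

def triangulate(P1, P2, uv1, uv2):
    """Recover (X,Y,Z) from the pixel coordinates of the two calibrated views."""
    u1, v1 = uv1
    u2, v2 = uv2
    A = np.stack([u1 * P1[2] - P1[0],
                  v1 * P1[2] - P1[1],
                  u2 * P2[2] - P2[0],
                  v2 * P2[2] - P2[1]])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]
```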

Fingertip Extraction and Hand Motion Recognition Method for Augmented Reality Applications (증강현실 응용을 위한 손 끝점 추출과 손 동작 인식 기법)

  • Lee, Jeong-Jin;Kim, Jong-Ho;Kim, Tae-Young
    • Journal of Korea Multimedia Society / v.13 no.2 / pp.316-323 / 2010
  • In this paper, we propose a fingertip extraction and hand motion recognition method for augmented reality applications. First, the input image is transformed from RGB color space into HSV color space. The hand area is segmented using double thresholding of the H and S values, region growing, and connected component analysis. Next, the end points of the index finger and thumb are extracted using morphology operations and subtraction, for use as a virtual keyboard and mouse interface. Finally, the angle between the end points of the index finger and thumb, measured with respect to the center of mass of the palm, is calculated to detect contact between the index finger and thumb, implementing the click of a mouse button. Experimental results on various input images showed that our method segments the hand and fingertips and recognizes hand movements quickly and accurately. The proposed methods can be used as an input interface for augmented reality applications.
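
The segmentation and click-detection steps described above map naturally onto OpenCV primitives. The sketch below shows one plausible version, assuming a BGR input frame; the HSV ranges and the pinch-angle threshold are hypothetical, and the fingertip and palm points are taken as given rather than computed by the paper's morphology-and-subtraction step.

```python
# Illustrative sketch only: HSV-threshold hand segmentation and a pinch
# (thumb-index contact) test based on the angle at the palm centroid.
# Threshold and angle values are hypothetical, not taken from the paper.
import cv2
import numpy as np

def segment_hand(bgr, h_range=(0, 25), s_range=(40, 255)):
    """Keep the largest skin-colored connected component as the hand mask."""
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (h_range[0], s_range[0], 0), (h_range[1], s_range[1], 255))
    n, labels, stats, _ = cv2.connectedComponentsWithStats(mask)
    if n < 2:
        return mask
    largest = 1 + np.argmax(stats[1:, cv2.CC_STAT_AREA])
    return np.where(labels == largest, 255, 0).astype(np.uint8)

def pinch_detected(thumb_tip, index_tip, palm_center, angle_thresh_deg=20.0):
    """Report a click when the thumb-palm-index angle falls below a threshold."""
    v1 = np.array(thumb_tip, float) - np.array(palm_center, float)
    v2 = np.array(index_tip, float) - np.array(palm_center, float)
    cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-9)
    angle = np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))
    return angle < angle_thresh_deg
```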

Region of Interest Extraction Method and Hardware Implementation of Matrix Pattern Image (매트릭스 패턴 영상의 관심 영역 추출 방법 및 하드웨어 구현)

  • Cho, Hosang;Kim, Geun-Jun;Kang, Bongsoon
    • Journal of the Korea Institute of Information and Communication Engineering / v.19 no.4 / pp.940-947 / 2015
  • This paper presents a region-of-interest extraction method for a matrix pattern image printed on a display. The proposed method does not rely on conventional sensing such as lasers, ultrasonic waves, or touch sensors. It searches for feature points and the rotation angle using the luminance and reliable pattern feature points of the input image, and then extracts the region of interest. To evaluate the extraction, we simulate the proposed method on pattern images written at various angles on a display panel. The method was developed with OpenCV as a Windows program, designed in Verilog-HDL, and verified on a Xilinx FPGA board (xc6vlx760).
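
As an illustration of the luminance-and-rotation-angle idea above, the sketch below thresholds the high-luminance pattern pixels, estimates the pattern's rotation with a minimum-area rectangle, derotates the image, and crops the region of interest. It is only an approximation of the paper's method, not the hardware pipeline; the luminance threshold is hypothetical.

```python
# Illustrative sketch only: estimate the rotation angle of a printed matrix
# pattern from its bright pixels and crop an axis-aligned ROI after derotation.
# The luminance threshold is hypothetical, not taken from the paper.
import cv2
import numpy as np

def extract_pattern_roi(gray, lum_thresh=180):
    # pick out high-luminance pattern pixels
    _, mask = cv2.threshold(gray, lum_thresh, 255, cv2.THRESH_BINARY)
    pts = cv2.findNonZero(mask)
    if pts is None:
        return None
    # the minimum-area rectangle around the pattern gives its rotation angle
    (cx, cy), (w, h), angle = cv2.minAreaRect(pts)
    rot = cv2.getRotationMatrix2D((cx, cy), angle, 1.0)
    upright = cv2.warpAffine(gray, rot, (gray.shape[1], gray.shape[0]))
    # crop the now axis-aligned pattern region
    x0, y0 = int(cx - w / 2), int(cy - h / 2)
    return upright[max(y0, 0):int(y0 + h), max(x0, 0):int(x0 + w)]
```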