• Title/Abstract/Keyword: hand language

Search Results: 471

Hand Language Translation Using Kinect

  • Pyo, Junghwan; Kang, Namhyuk; Bang, Jiwon; Jeong, Yongjin
    • Journal of IKEEE / v.18 no.2 / pp.291-297 / 2014
  • Since hand gesture recognition became feasible thanks to improved image processing algorithms, sign language translation has been a critical issue for the hearing-impaired. In this paper, we extract human hand shapes from a real-time image stream and detect gestures in order to determine which sign they represent. We used depth-color calibrated images from the Kinect to extract human hands and built a decision tree to recognize the hand gesture. The decision tree uses information such as the number of fingers, contours, and the hand's position inside a uniformly sized image. We succeeded in recognizing 'Hangul', the Korean alphabet, with a recognition rate of 98.16%. The average execution time per letter was about 76.5 ms, a reasonable speed considering that hand language translation is based on almost still images. We expect that this research will help communication between the hearing-impaired and other people who do not know sign language.
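
The abstract above describes classifying hand features (finger count, contours, hand position in a uniformly sized image) with a decision tree. Below is a minimal, hedged sketch of that kind of classifier using scikit-learn; the feature layout, synthetic training data, and label count are illustrative assumptions, not the authors' implementation.

```python
# Sketch only: a decision tree over hand-shape features similar to those the
# abstract lists. Feature columns and labels here are synthetic placeholders.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Each row: [finger_count, contour_area, center_x, center_y] in a 100x100 image.
X = np.column_stack([
    rng.integers(0, 6, 200),        # number of extended fingers
    rng.uniform(500, 4000, 200),    # area of the hand contour
    rng.uniform(0, 100, 200),       # hand-center x in the normalized image
    rng.uniform(0, 100, 200),       # hand-center y in the normalized image
])
y = rng.integers(0, 14, 200)        # placeholder class labels (e.g. letter classes)

clf = DecisionTreeClassifier(max_depth=8, random_state=0).fit(X, y)
print(clf.predict([[2, 1800.0, 48.0, 52.0]]))   # classify one feature vector
```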

A Structure and Framework for Sign Language Interaction

  • Kim, Soyoung; Pan, Younghwan
    • Journal of the Ergonomics Society of Korea / v.34 no.5 / pp.411-426 / 2015
  • Objective: The goal of this thesis is to design an interaction structure and framework for a system that recognizes sign language. Background: In sign language, meaningful individual gestures are combined to construct a sentence, so it is difficult for a system to interpret and recognize the meaning of a hand gesture within a sequence of continuous gestures. Therefore, in order to interpret the meaning of each individual gesture correctly, an interaction structure and framework are needed that can segment the boundaries of individual gestures. Method: We analyzed 700 sign language words to structure sign language gesture interaction. First, we analyzed the transformational patterns of hand shape. Second, we analyzed the movement of these transformational patterns. Third, we analyzed the types of gestures other than hand gestures. Based on this, we designed a framework for sign language interaction. Results: We elicited 8 hand-shape patterns based on whether the gesture changes from its starting point to its ending point. We then analyzed hand movement along 3 elements: pattern of movement, direction, and whether the movement repeats. Moreover, we defined 11 non-hand gestures and classified 8 types of interaction. The framework designed on this basis applies to more than 700 individual sign language gestures and can isolate an individual gesture even within a sequence of continuous gestures. Conclusion: This study structured sign language interaction along 3 aspects: the transformational pattern between the starting and ending hand shapes, hand movement, and non-hand gestures. Based on this, we designed a framework that can recognize individual gestures and interpret their meaning more accurately when a meaningful individual gesture appears within a sequence of continuous gestures. Application: The interaction framework can be applied when developing a sign language recognition system. The structured gestures can be used for building sign language databases, developing automatic recognition systems, and studying action gestures in other areas.
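
The framework above structures each sign along three aspects (hand-shape change, hand movement, and non-hand gestures). Below is a hedged sketch of one way such a record could be encoded for a gesture database; all class, enum, and field names are assumptions, not the authors' notation.

```python
# Sketch only: one possible record format for the three structural aspects the
# framework describes. Enum members and field names are illustrative assumptions.
from dataclasses import dataclass
from enum import Enum, auto

class ShapeChange(Enum):          # stand-in for the 8 hand-shape patterns
    STATIC = auto()
    OPEN_TO_CLOSE = auto()
    CLOSE_TO_OPEN = auto()        # remaining patterns omitted in this sketch

class MovementPattern(Enum):      # stand-in for the movement patterns
    LINEAR = auto()
    CIRCULAR = auto()
    ARC = auto()

@dataclass
class SignGesture:
    word: str
    shape_change: ShapeChange     # hand-shape transformation from start to end
    movement: MovementPattern     # trajectory pattern of the hand
    direction: str                # e.g. "up", "down", "forward"
    repeated: bool                # whether the movement repeats
    non_hand: tuple = ()          # non-hand gestures, e.g. ("head nod",)

example = SignGesture("hello", ShapeChange.STATIC, MovementPattern.ARC,
                      "forward", repeated=False, non_hand=("smile",))
print(example)
```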

Numeric Sign Language Interpreting Algorithm Based on Hand Image Processing (영상처리 기반 숫자 수화표현 인식 알고리즘)

  • Gwon, Kyungpil; Yoo, Joonhyuk
    • IEMEK Journal of Embedded Systems and Applications / v.14 no.3 / pp.133-142 / 2019
  • Existing communication aids for the hearing-impaired have the inconvenience of requiring additional expensive sensing devices. This paper presents a hand-image-detection-based algorithm to interpret the sign language of the hearing-impaired. The proposed sign language recognition system exploits only the hand image captured by a camera, without any additional gloves or extra sensors. Based on hand image processing, the system can accurately classify several numeric sign language representations. This work proposes a simple, lightweight classification algorithm that identifies the hand image of a signer so that they can communicate with others even in environments with complex backgrounds. Experimental results show that the proposed system can interpret numeric sign language quite well, with an average accuracy of 95.6%.
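
As a hedged illustration of camera-only numeric hand-sign recognition, the sketch below counts extended fingers from a single image using skin segmentation and convexity defects in OpenCV. The HSV skin range, angle threshold, and the overall approach are common-practice assumptions, not necessarily the paper's algorithm.

```python
# Sketch only: count extended fingers from one camera image via skin
# segmentation and convexity defects (a common camera-only technique).
import cv2
import numpy as np

def count_fingers(bgr_image):
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array([0, 30, 60]), np.array([20, 150, 255]))  # rough skin range (assumed)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return 0
    hand = max(contours, key=cv2.contourArea)        # largest blob taken as the hand
    hull = cv2.convexHull(hand, returnPoints=False)
    defects = cv2.convexityDefects(hand, hull)
    if defects is None:
        return 0
    fingers = 0
    for s, e, f, _ in defects[:, 0]:
        a = np.linalg.norm(hand[e][0] - hand[s][0])
        b = np.linalg.norm(hand[f][0] - hand[s][0])
        c = np.linalg.norm(hand[e][0] - hand[f][0])
        cos_angle = (b**2 + c**2 - a**2) / (2 * b * c + 1e-6)
        if np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0))) < 90:
            fingers += 1                             # narrow valley between two fingers
    return min(fingers + 1, 5) if fingers else 0

# Usage: count_fingers(cv2.imread("hand.jpg"))  # image path is hypothetical
```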

Development of Hand Shape Editor for Sign Language Motion (수화 동작을 위한 손 모양 편집 프로그램의 개발)

  • Oh, Young-Joon; Park, Kwang-Hyun; Bien, Zeung-Nam
    • Proceedings of the KIEE Conference / 2007.04a / pp.216-218 / 2007
  • Korean Sign Language (KSL) is a communication method for the Deaf in Korea, and hand shape is one of the important elements of sign language. In this paper, we developed a KSL hand shape editor that makes it simple to compose hand shapes and connect them to a database. Hand shapes can be edited through a graphical user interface (GUI) in a 3D virtual reality environment. Hand shape codes are connected to a sign word editor to synthesize sign motion and to decrease the total amount of KSL data.
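
A hedged sketch of the general idea of hand-shape codes tied to a database: each reusable shape code stores per-finger joint angles once, and a sign-word entry references the code instead of duplicating joint data. The schema, angle encoding, and all names below are illustrative assumptions, not the authors' editor.

```python
# Sketch only: reusable hand-shape codes referenced by sign-word entries,
# so joint data is stored once and shared. Schema and encoding are assumed.
import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE hand_shape (code TEXT PRIMARY KEY, joint_angles TEXT);
    CREATE TABLE sign_word  (word TEXT, frame INTEGER,
                             shape_code TEXT REFERENCES hand_shape(code));
""")

# One hand-shape code: flexion angles (degrees) for each finger's three joints.
fist = {finger: [90, 90, 45] for finger in
        ("thumb", "index", "middle", "ring", "little")}
conn.execute("INSERT INTO hand_shape VALUES (?, ?)", ("FIST", json.dumps(fist)))
conn.execute("INSERT INTO sign_word VALUES (?, ?, ?)", ("hello", 0, "FIST"))

row = conn.execute("""SELECT w.word, h.joint_angles FROM sign_word w
                      JOIN hand_shape h ON w.shape_code = h.code""").fetchone()
print(row[0], json.loads(row[1])["index"])
```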


Research on Development of VR Realistic Sign Language Education Content Using Hand Tracking and Conversational AI (Hand Tracking과 대화형 AI를 활용한 VR 실감형 수어 교육 콘텐츠 개발 연구)

  • Jae-Sung Chun; Il-Young Moon
    • Journal of Advanced Navigation Technology / v.28 no.3 / pp.369-374 / 2024
  • This study aims to improve the accessibility and efficiency of sign language education for both hearing-impaired and hearing people. To this end, we developed immersive VR sign language education content that integrates hand tracking technology and conversational AI. Through this content, users can learn sign language in real time and experience direct communication in a virtual environment. The study confirmed that this integrated approach significantly improves immersion in sign language learning and contributes to lowering the barriers to learning sign language by giving learners a deeper understanding. This presents a new paradigm for sign language education and shows how technology can change the accessibility and effectiveness of education.
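
As a purely conceptual sketch of how a hand-tracking recognizer and a conversational agent might be joined in a practice loop (the abstract does not publish implementation details), the placeholder functions below stand in for the real components; every name here is an assumption.

```python
# Sketch only: a conceptual practice loop linking hand tracking to a
# conversational agent. Both functions are placeholders, not real APIs.
def recognize_sign(hand_tracking_frames):
    """Placeholder: map tracked hand joints to a sign-language word."""
    return "hello"

def conversational_reply(user_sign):
    """Placeholder: a conversational-AI response used for practice dialogue."""
    return f"You signed '{user_sign}'. Try signing 'thank you' next."

frames = []                       # would come from the headset's hand tracker
print(conversational_reply(recognize_sign(frames)))
```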

A Method for Generating Inbetween Frames in Sign Language Animation (수화 애니메이션을 위한 중간 프레임 생성 방법)

  • O, Jeong-Geun; Kim, Sang-Cheol
    • The Transactions of the Korea Information Processing Society / v.7 no.5 / pp.1317-1329 / 2000
  • Advances in video processing and computer graphics have made sign language education systems possible: such a system can show sign language motion for an arbitrary sentence using captured video clips of sign language words. In this paper, we suggest a method that generates the frames between the last frame of a word and the first frame of the following word in order to animate the hand motion. In our method, we find the hand locations and angles required for in-between frame generation, then capture and store the hand images at those locations and angles. Generating the in-between frames is then simply a task of finding a sequence of hand angles and locations. Our method is computationally simple and requires a relatively small amount of disk space. Our experiments show that in-between frames for presentation at about 15 fps (frames per second) are achieved, so that smooth animation of the hand motion is possible. Our method improves on previous works in which the computation cost is relatively high or unnecessary images are generated.
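
A hedged sketch of the in-between idea described above: interpolate hand location and angle between the last pose of one word and the first pose of the next, then snap each interpolated pose to the nearest pre-captured hand image. The grid size, angle step, and frame rate below are assumptions, not the paper's parameters.

```python
# Sketch only: interpolate (x, y, angle) between two key poses and quantize
# each in-between pose to the nearest stored capture.
import numpy as np

def inbetween_poses(end_pose, start_pose, duration_s=0.5, fps=15):
    """Linearly interpolate (x, y, angle) between two key poses."""
    n = max(int(duration_s * fps), 2)
    t = np.linspace(0.0, 1.0, n)[:, None]
    return (1 - t) * np.asarray(end_pose, float) + t * np.asarray(start_pose, float)

def quantize(pose, grid=20, angle_step=30):
    """Snap a pose to the nearest stored capture (location grid, angle bin)."""
    x, y, angle = pose
    return (round(x / grid) * grid, round(y / grid) * grid,
            round(angle / angle_step) * angle_step % 360)

for p in inbetween_poses((100, 200, 0), (260, 120, 90)):
    print(quantize(p))   # index into the library of captured hand images
```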


Sign Language Recognition System Using SVM and Depth Camera (깊이 카메라와 SVM을 이용한 수화 인식 시스템)

  • Kim, Ki-Sang; Choi, Hyung-Il
    • Journal of the Korea Society of Computer and Information / v.19 no.11 / pp.63-72 / 2014
  • In this paper, we propose a sign language recognition system using an SVM and a depth camera, focusing on Korean Sign Language. For this system, we suggest two methods, one in the hand feature extraction stage and the other in the recognition stage. The hand features consist of the number of fingers, finger length, radius of the palm, and direction of the hand. To extract hand features, we use the distance transform to build a hand skeleton; this method is more accurate than the traditional approach based on contours. To recognize the hand posture, we develop a decision tree over the hand features. For higher accuracy, we use an SVM to determine the threshold values in the decision tree. The experimental results show that the suggested method is more accurate and faster when extracting hand features and recognizing hand postures.
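
One of the listed features, the palm radius, is commonly obtained from a distance transform of the binary hand mask; the hedged sketch below shows that step with OpenCV. The input format and the synthetic mask are assumptions, not the authors' code.

```python
# Sketch only: estimate palm center and radius from a distance transform of a
# binary hand mask (the largest inscribed circle approximates the palm).
import cv2
import numpy as np

def palm_center_and_radius(hand_mask):
    """hand_mask: uint8 binary image, 255 inside the hand region."""
    dist = cv2.distanceTransform(hand_mask, cv2.DIST_L2, 5)
    radius = float(dist.max())
    cy, cx = np.unravel_index(int(dist.argmax()), dist.shape)
    return (cx, cy), radius

# Tiny synthetic mask: a filled circle stands in for a segmented hand.
mask = np.zeros((200, 200), np.uint8)
cv2.circle(mask, (100, 100), 60, 255, -1)
center, r = palm_center_and_radius(mask)
print(center, round(r, 1))            # roughly (100, 100) and 60
```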

Enhanced Sign Language Transcription System via Hand Tracking and Pose Estimation

  • Kim, Jung-Ho; Kim, Najoung; Park, Hancheol; Park, Jong C.
    • Journal of Computing Science and Engineering / v.10 no.3 / pp.95-101 / 2016
  • In this study, we propose a new system for constructing parallel corpora for sign languages, which are generally under-resourced in comparison to spoken languages. In order to achieve scalability and accessibility in data collection and corpus construction, our system utilizes deep learning-based techniques and predicts depth information to perform pose estimation on hand information obtained from video recordings made with a single RGB camera. These estimated poses are then transcribed into expressions in SignWriting. We evaluate the accuracy of the hand tracking and hand pose estimation modules of our system quantitatively, using the American Sign Language Image Dataset and the American Sign Language Lexicon Video Dataset. The evaluation results show that our transcription system has a high potential to be successfully employed in constructing a sizable sign language corpus from various types of video resources.
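
As a hedged stand-in for the paper's deep-learning hand tracker, the sketch below uses MediaPipe Hands to obtain 2.5D hand landmarks (normalized x, y plus a relative depth z) from a single RGB frame and maps them to a coarse placeholder label; the frame path and the label mapping are assumptions, and real SignWriting transcription is only hinted at.

```python
# Sketch only: single-RGB-frame hand landmarks via MediaPipe Hands, used here
# as a stand-in tracker, followed by a placeholder transcription step.
import cv2
import mediapipe as mp

def hand_landmarks(image_path):
    image = cv2.imread(image_path)
    if image is None:
        return None
    with mp.solutions.hands.Hands(static_image_mode=True, max_num_hands=1) as hands:
        result = hands.process(cv2.cvtColor(image, cv2.COLOR_BGR2RGB))
    if not result.multi_hand_landmarks:
        return None
    # Each landmark carries normalized x, y and a relative z (estimated depth).
    return [(lm.x, lm.y, lm.z) for lm in result.multi_hand_landmarks[0].landmark]

def to_symbol(landmarks):
    """Placeholder transcription step: classify palm orientation from depth."""
    wrist_z = landmarks[0][2]
    middle_tip_z = landmarks[12][2]
    return "palm-toward-camera" if middle_tip_z < wrist_z else "palm-away"

lms = hand_landmarks("frame.png")     # hypothetical input frame
if lms:
    print(to_symbol(lms))
```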

Hand Motion Design for Performance Enhancement of Vision Based Hand Signal Recognizer (영상기반의 안정적 수신호 인식기를 위한 손동작 패턴 설계 방법)

  • Shon, Su-Won; Beh, Joung-Hoon; Yang, Cheol-Jong; Wang, Han; Ko, Han-Seok
    • Journal of the Institute of Electronics Engineers of Korea SP / v.48 no.4 / pp.30-37 / 2011
  • This paper proposes a language set of hand motions for enhancing the performance of a vision-based hand signal recognizer. Based on a statistical analysis of the angular tendencies of hand movements in sign language and of hand motions in practical use, we construct four motion primitives as building blocks for basic hand motions. By combining these motion primitives, we design a discernible 'fundamental hand motion set' aimed at improving hand signal recognition. To demonstrate the validity of the proposed design method, we develop a recognizer for the 'fundamental hand motion set' based on a hidden Markov model (HMM). The recognition system showed a 99.01% recognition rate on the proposed language set. This result validates that the proposed language set enhances discernibility among the hand motions, so that the performance of the hand signal recognizer is improved.
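
The recognizer described above follows the standard HMM classification scheme: train one model per motion class and pick the class whose model gives the highest likelihood for an input trajectory. The hedged sketch below reproduces that scheme with hmmlearn on synthetic trajectories; the class names and data are assumptions, not the authors' motion set.

```python
# Sketch only: one Gaussian HMM per motion class; classify a trajectory by the
# highest log-likelihood. Training data here is synthetic.
import numpy as np
from hmmlearn import hmm

rng = np.random.default_rng(0)

def synthetic_track(direction, n=30):
    """A noisy straight-line (x, y) trajectory used as stand-in training data."""
    steps = np.outer(np.arange(n), direction).astype(float)
    return steps + rng.normal(scale=0.3, size=(n, 2))

classes = {"right": (1, 0), "up": (0, 1), "diag": (1, 1), "left": (-1, 0)}
models = {}
for name, d in classes.items():
    tracks = [synthetic_track(d) for _ in range(5)]
    X, lengths = np.vstack(tracks), [len(t) for t in tracks]
    m = hmm.GaussianHMM(n_components=3, covariance_type="diag", n_iter=50)
    models[name] = m.fit(X, lengths)

test = synthetic_track((0, 1))                        # an "up" motion
print(max(models, key=lambda k: models[k].score(test)))
```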

On the Problems of North and South Korean Scholars' Studies on the Genealogy of Korean Language (남북한 학자의 국어 계통 연구의 제문제)

  • 정광
    • Lingua Humanitatis / v.6 / pp.169-183 / 2004
  • So far I have reviewed the two controversial opinions of North Korean and South Korean linguists concerning the position of the Koguryeo language in the formation of Korean. Many South Korean scholars in favor of the Altaic Language Family Hypothesis argue that ancient Korean consisted of two different branches: a northern group including the Koguryeo language (the largest in the area), the Puyo language, the Okche language, and the Yemaek language, and a southern group, the largest language of which was the Shinla language. On the other hand, the linguists of North Korea claim that the same language was spoken in Koguryeo and Shinla and that modern Korean was formed on the basis of the Koguryeo language. Before evaluating which of these claims is correct, I would like to point to the scarcity of linguistic data on the Koguryeo language. Compared with the pragmatic methodology of the South Korean linguists in the studies on the Altaic affinity of Korean, the North Korean scholars need to present still more evidence to support their argument. In Chung (1993) I argued that studies on the genealogy or history of the Korean language had to be performed regardless of any political purpose. We should admit the historical fact that there had been many tribal states in the Korean peninsula before the ancient Korean stage, which were later merged into the three kingdoms. Those kingdoms were unified by Shinla, which was succeeded by the Koryeo Dynasty. We cannot disregard the fact that the Korean language has developed hand in hand with this historical process, with steps related to each age. The first thing we should do now is to collect as much as possible of the remaining data of the Koguryeo language recorded in the old written materials that have been found in North Korea. Also, I hope that the linguists of South Korea achieve more academic success in the comparative studies of the Paekjae language, the Shinla language, and other adjacent Altaic languages.
