• Title/Summary/Keyword: music tempo

Search results: 53

Automatic Emotion Classification of Music Signals Using MDCT-Driven Timbre and Tempo Features

  • Kim, Hyoung-Gook; Eom, Ki-Wan
    • The Journal of the Acoustical Society of Korea / v.25 no.2E / pp.74-78 / 2006
  • This paper proposes an effective method for classifying the emotions of music from its acoustic signals. Two feature sets, timbre and tempo, are extracted directly from the modified discrete cosine transform (MDCT) coefficients output by a partial MP3 (MPEG-1 Layer 3) decoder. The tempo features are obtained by long-term modulation spectrum analysis. To combine these two feature sets, which have different time resolutions, in an integrated system, a two-layer classifier based on the AdaBoost algorithm is used: the first layer employs the MDCT-driven timbre features, and adding the MDCT-driven tempo features in the second layer improves the classification precision dramatically.
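The long-term modulation spectrum idea behind the tempo feature can be sketched in a few lines: take the FFT of an onset-strength envelope and read off the dominant periodicity in the plausible tempo band. The following is a minimal numpy illustration with a synthetic envelope, not the paper's MDCT-based pipeline; the frame rate, band limits, and test signal are invented for the example.

```python
import numpy as np

def estimate_tempo(envelope, frame_rate):
    """Estimate tempo (BPM) from an onset-strength envelope via its
    long-term modulation spectrum: FFT the envelope and pick the
    strongest peak in the plausible tempo band (0.5-4 Hz = 30-240 BPM)."""
    spectrum = np.abs(np.fft.rfft(envelope - envelope.mean()))
    freqs = np.fft.rfftfreq(len(envelope), d=1.0 / frame_rate)
    band = (freqs >= 0.5) & (freqs <= 4.0)
    peak_freq = freqs[band][np.argmax(spectrum[band])]
    return peak_freq * 60.0  # Hz -> beats per minute

# Synthetic test: a pulse train at 2 Hz (120 BPM), 100 frames/s, 10 s long
frame_rate = 100
t = np.arange(0, 10, 1.0 / frame_rate)
env = (np.mod(t, 0.5) < 0.05).astype(float)  # short pulse every 0.5 s
print(round(estimate_tempo(env, frame_rate)))  # 120
```

Real systems derive the envelope from spectral flux or, as in this paper, from MDCT subband energies, but the peak-picking step is the same.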

Depending on Mode and Tempo Cues for Musical Emotion Identification in Children With Cochlear Implants (조성 및 템포 단서에 따른 인공와우이식 아동의 음악 정서 지각)

  • Lee, Yoonji
    • Journal of Music and Human Behavior / v.21 no.1 / pp.29-47 / 2024
  • The purpose of this study was to investigate how children with cochlear implants (CI) perceive emotion in music depending on mode and tempo cues, and to compare them with normal-hearing (NH) children. Participants included 13 children aged 7 to 13 years with unilateral or bilateral cochlear implants, 36 NH children, and 20 NH adults. The musical stimuli were piano recordings in either a major or a minor mode, at tempos of 130 bpm and 56 bpm. A pre-experiment comparison of the emotion perception levels of NH children and NH adults showed no significant difference between the two groups. The two child groups, however, perceived the music conditions differently: CI children tended to judge the conditions as happy largely irrespective of mode, whereas NH children perceived major-key music as happy and minor-key music as sad. This supports the view that, unlike NH children, CI children rely primarily on tempo cues to process and identify emotional information in music. The study deepens and specifies our understanding of how CI children perceive musical emotion and which musical elements they use in the process, and the findings provide baseline data on music emotion perception in CI children.

Multi-channel EEG classification method according to music tempo stimuli using 3D convolutional bidirectional gated recurrent neural network (3차원 합성곱 양방향 게이트 순환 신경망을 이용한 음악 템포 자극에 따른 다채널 뇌파 분류 방식)

  • Kim, Min-Soo; Lee, Gi Yong; Kim, Hyoung-Gook
    • The Journal of the Acoustical Society of Korea / v.40 no.3 / pp.228-233 / 2021
  • In this paper, we propose a method for extracting and classifying features of multi-channel electroencephalography (EEG) signals that change under various musical tempo stimuli. In the proposed method, a 3D convolutional bidirectional gated recurrent neural network extracts spatio-temporal features with long-term dependencies from a 3D EEG input representation produced by preprocessing. The experimental results show that the proposed tempo-stimulus classification method outperforms the existing method and demonstrate the feasibility of a music-based brain-computer interface.
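The 3D EEG input representation mentioned above can be illustrated with a toy channel-to-scalp-grid mapping: each channel's time series is placed at a 2D grid cell, yielding a (time, height, width) tensor that a 3D convolution can scan for spatio-temporal patterns. The grid size and channel coordinates below are hypothetical stand-ins, not the paper's actual preprocessing.

```python
import numpy as np

# Hypothetical 2D scalp-grid coordinates (row, col) for 5 EEG channels;
# real systems place electrodes according to the 10-20 layout.
CHANNEL_POS = {"Fp1": (0, 1), "Fp2": (0, 3), "Cz": (2, 2), "O1": (4, 1), "O2": (4, 3)}

def to_3d_representation(eeg, channel_names, grid=(5, 5)):
    """Place each channel's time series at its scalp-grid cell, producing a
    (time, height, width) tensor; cells with no electrode stay zero."""
    n_samples = eeg.shape[1]
    out = np.zeros((n_samples,) + grid)
    for i, name in enumerate(channel_names):
        r, c = CHANNEL_POS[name]
        out[:, r, c] = eeg[i]
    return out

names = list(CHANNEL_POS)
eeg = np.random.randn(len(names), 128)   # 5 channels x 128 samples
tensor = to_3d_representation(eeg, names)
print(tensor.shape)  # (128, 5, 5)
```

A 3D CNN then convolves over the two spatial axes and time, and a bidirectional GRU consumes the resulting frame sequence.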

A Study on the Trend of Korean Pop Music Preference Through Digital Music Market (디지털 음악 시장을 통해 본 한국 대중가요 선호경향에 관한 연구)

  • Chung, Ji-Yun; Kim, Myoung-Jun
    • Journal of Digital Contents Society / v.18 no.6 / pp.1025-1032 / 2017
  • Recently, the domestic popular music market has grown mainly through digital sources. An analysis of the Top-100 music charts from 2012 to 2016, using digital sources and musical scores, shows that the average annual BPM fell by 11.26 over the five years. Musical styles diversified every year, and the share of hip-hop doubled from 8.5% in 2012 to 17.8% in 2015. Dance music and ballads both show high preference rates, but the two are inversely related. In singer composition, the share of female solo acts was inversely proportional to that of male groups. Notably, regarding the relationship between BPM and major/minor keys, 81.42% of slow-tempo songs were in a major key, while 53.85% of fast-tempo songs were in a minor key. For TV drama OSTs, solo singers' music was preferred, and the style was 80% pop and 20% ballad.
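Chart statistics of the kind reported above (yearly average BPM, the major-key share among slow-tempo songs) are straightforward to compute. The sketch below runs on a handful of invented entries, not the study's actual chart data; the slow/fast threshold is also an assumption.

```python
from statistics import mean

# Toy chart entries (year, bpm, mode); the study analyzed Top-100 charts 2012-2016.
songs = [
    (2012, 128, "major"), (2012, 95, "major"), (2013, 120, "minor"),
    (2014, 82, "major"), (2015, 110, "minor"), (2016, 78, "major"),
]

def yearly_mean_bpm(entries):
    """Average BPM per chart year."""
    years = sorted({y for y, _, _ in entries})
    return {y: mean(b for yy, b, _ in entries if yy == y) for y in years}

def major_share(entries, slow_threshold=100):
    """Fraction of slow-tempo songs (below the threshold) in a major key."""
    slow = [m for _, b, m in entries if b < slow_threshold]
    return sum(m == "major" for m in slow) / len(slow)

print(yearly_mean_bpm(songs))
print(major_share(songs))  # 1.0 on this toy data
```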

A Method for Measuring the Difficulty of Music Scores

  • Song, Yang-Eui; Lee, Yong Kyu
    • Journal of the Korea Society of Computer and Information / v.21 no.4 / pp.39-46 / 2016
  • While the difficulty of a piece of music can be classified by a variety of standards, conventional methods rely on the subjective judgment and experience of musicians or conductors; a score is hard to evaluate because there is no quantitative criterion for its degree of difficulty. In this paper, we propose a new classification method for determining the difficulty of a piece of music. To determine the difficulty, we convert the traditional printed score into an electronic score, and then quantify the elements needed to play it, such as the distance between notes, the tempo, and the ease of interpretation. By computing a difficulty value for the entire piece from these numerical data, we propose a difficulty evaluation for scores and demonstrate it through experiments.
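A quantitative difficulty criterion of the kind proposed can be sketched as a weighted combination of playing elements. The toy score below combines average pitch-jump size and tempo with illustrative weights; this is far simpler than the paper's method, and the weights and inputs are invented.

```python
def score_difficulty(notes, tempo_bpm, w_interval=0.5, w_tempo=0.01):
    """Toy difficulty score for a melody given as MIDI pitch numbers:
    larger jumps between consecutive notes and faster tempos both raise
    the difficulty. Weights are illustrative, not calibrated."""
    jumps = [abs(b - a) for a, b in zip(notes, notes[1:])]
    interval_cost = sum(jumps) / max(len(jumps), 1)
    return w_interval * interval_cost + w_tempo * tempo_bpm

easy = score_difficulty([60, 62, 64, 65], tempo_bpm=80)    # stepwise melody
hard = score_difficulty([60, 72, 55, 79], tempo_bpm=160)   # wide leaps, fast
print(easy < hard)  # True
```

A real criterion would add terms for rhythmic complexity, key, and hand span, but the structure (a weighted sum of per-element costs) stays the same.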

Conversion Program of Music Score Chord using OpenCV and Deep Learning (영상 처리와 딥러닝을 이용한 악보 코드 변환 프로그램)

  • Moon, Ji-su; Kim, Min-ji; Lim, Young-kyu; Kong, Ki-sok
    • The Journal of the Institute of Internet, Broadcasting and Communication / v.21 no.1 / pp.69-77 / 2021
  • This paper describes the development of an application that converts a PDF music score entered by the user into a MIDI file in the chord the user wants. When the user supplies the PDF score and the target chord, the application converts the PDF file into a PNG file for chord conversion. It recognizes the melody of the score through an image processing algorithm, recognizes the note durations through deep learning, and then produces a MIDI file of the chord for the existing score. The OpenCV algorithm and the deep learning model can recognize half notes, quarter notes, eighth notes, sixteenth notes, half rests, quarter rests, eighth rests, sixteenth rests, successive notes, and chord notes. Experiments show a note recognition rate of 100% and a tempo recognition rate of 90% or higher.
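Producing a MIDI file from recognized notes requires converting each note value and the tempo into a duration. A minimal sketch of that conversion, assuming a quarter note gets one beat (the paper does not specify its timing scheme), is:

```python
def note_duration_ms(note_value, tempo_bpm):
    """Duration of a note in milliseconds, where note_value is the fraction
    of a whole note (0.25 = quarter note) and a quarter note lasts
    60000 / BPM milliseconds."""
    quarter_ms = 60000.0 / tempo_bpm
    return note_value / 0.25 * quarter_ms

print(note_duration_ms(0.25, 120))  # quarter note at 120 BPM -> 500.0 ms
print(note_duration_ms(0.5, 120))   # half note -> 1000.0 ms
```

MIDI files actually store durations in ticks relative to a pulses-per-quarter-note resolution; the millisecond form above is the equivalent wall-clock view.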

Design of Music Learning Assistant Based on Audio Music and Music Score Recognition

  • Mulyadi, Ahmad Wisnu; Machbub, Carmadi; Prihatmanto, Ary S.; Sin, Bong-Kee
    • Journal of Korea Multimedia Society / v.19 no.5 / pp.826-836 / 2016
  • Mastering a musical instrument is not an easy task for an unskilled beginner: it requires playing every note correctly and maintaining the tempo accurately. Any piece of music comes in two forms, a music score and its rendition as audio. The proposed method assists beginning players in both aspects by employing two popular pattern recognition methods for audio-visual analysis: a support vector machine (SVM) for music score recognition and a hidden Markov model (HMM) for tracking the audio performance. With proper synchronization of the two results, the proposed music learning assistant system can give useful feedback to self-training beginners.

Music Tempo Tracking and Motion Pattern Selection for Dancing Robots (댄싱 로봇의 구현을 위한 음악 템포 추출 및 모션 패턴 결정 방법)

  • Jun, Myoung-Jae; Ryu, Minsoo
    • Proceedings of the Korea Information Processing Society Conference / 2009.11a / pp.369-370 / 2009
  • For a robot to move in time with music, it first needs the perceptual ability to understand the acoustic signal, and it must then express actions that match the perceived musical content as closely as possible with dance motions. This paper introduces a dancing robot system that tracks the tempo of music using signal processing and machine learning and uses the result to select motion patterns.

Finding Measure Position Using Combination Rules of Musical Notes in Monophonic Song (단일 음원 노래에서 음표의 조합 규칙을 이용한 마디 위치 찾기)

  • Park, En-Jong; Shin, Song-Yi; Lee, Joon-Whoan
    • The Journal of the Korea Contents Association / v.9 no.10 / pp.1-12 / 2009
  • When notes are combined within one measure, their durations exhibit regular multiple relations. This paper presents a method to find the exact measure positions in a monophonic song based on those relations. In the proposed method, the individual note intervals are segmented first, and rules stating the multiple relations are used to find the measure positions. The measures can serve as foundational information for extracting the beat and tempo of a song, which in turn can be used as background knowledge for an automatic music transcription system. The proposed method exactly detected the measure positions in 11 of 12 monophonic vocal songs sung by men and women, failing on one song. The extracted measure positions, combined with music theory, can also be used to derive the beat and tempo of a song.
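The basic bookkeeping behind measure finding, accumulating note durations until they fill a measure, can be sketched as follows. This is not the paper's rule-based use of multiple relations on raw audio, just the arithmetic such rules ultimately rest on, shown for a known time signature.

```python
from fractions import Fraction

def find_measure_positions(durations, measure_length=Fraction(1)):
    """Accumulate note durations (as fractions of a whole note) and record
    an index as a barline whenever the running total completes a measure.
    measure_length defaults to one whole note, i.e. 4/4 time."""
    positions, total = [], Fraction(0)
    for i, d in enumerate(durations):
        total += d
        if total % measure_length == 0:
            positions.append(i + 1)  # measure boundary after note i
    return positions

# Eight quarter notes in 4/4: a barline after every fourth note
quarters = [Fraction(1, 4)] * 8
print(find_measure_positions(quarters))  # [4, 8]
```

Exact rational arithmetic (`fractions.Fraction`) avoids the floating-point drift that would otherwise miss boundaries in long songs.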

The Effects of Visual and Auditory Stimulation on the Ability to Perform Exercise (시각과 청각 자극이 운동수행능력에 미치는 영향)

  • Park, Kwang-Hyun; Kim, Yu-Min; Kim, Hyun-A; Seo, Han-Bit; Son, Won-Bin; Song, Eun-Ji; Shin, Su-Jin; Ahn, Ha-Rim; Lee, Choong-Jung; Cho, Min-Ok; Kim, Min-Hee
    • PNF and Movement / v.14 no.2 / pp.121-130 / 2016
  • Purpose: The purpose of this study was to investigate the effects of visual and auditory stimulation on the ability to perform exercise. Methods: One hundred twenty subjects were randomly divided into four groups (green light and fast-tempo music, GF; green light and slow-tempo music, GS; red light and fast-tempo music, RF; and red light and slow-tempo music, RS). Each group received one of the two visual stimuli combined with one of the two auditory stimuli. The experiment was conducted twice, in random order, in two environments: one with the visual and auditory stimuli and one with no stimulation. Muscle strength, grip, endurance, quickness, agility, concentration, and balance were measured to assess the ability to perform exercise. Results: Significant differences in muscle strength were found for the auditory factor and for the interaction of the visual and auditory factors. In endurance, significant differences were found for all factors: visual, auditory, and their interaction. In quickness, agility, and balance there were significant differences for the visual factor, and in concentration for the auditory factor. Conclusion: Visual stimuli, auditory stimuli, and their interaction influenced the ability to perform exercise, implying that providing the proper environmental stimulation is important for increasing performance during exercise.