• Title/Summary/Keyword: Music Emotion

Listeners' Perception of Intended Emotions in Music

  • Chong, Hyun Ju;Jeong, Eunju;Kim, Soo Ji
    • International Journal of Contents
    • /
    • v.9 no.4
    • /
    • pp.78-85
    • /
    • 2013
  • Music functions as a catalyst for various emotional experiences. Among the numerous genres of music, film music has been reported to induce strong emotional responses. However, the effectiveness of film music in evoking different types of emotions, and which musical elements contribute to listeners' perception of the intended emotion, have rarely been investigated. The purpose of this study was to examine the congruence between the intended emotion and listeners' perceived emotion in film music listening, and to identify musical characteristics of film music that correspond with specific types of emotion. Additionally, the study aimed to investigate possible relationships between participants' identification responses and personal musical experience. A total of 147 college students listened to twelve 15-second music excerpts and identified the perceived emotion during music listening. The results showed a high degree of congruence between the intended emotion in film music and the participants' perceived emotion. The existence of tonality and the modality were found to play an important role in listeners' perception of the intended emotion. The findings suggest that identification of perceived emotion in film music excerpts was congruent regardless of individual differences. Specific music components that led to high congruence are further discussed.
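
As a minimal illustration of the congruence analysis described above, the sketch below compares intended and perceived emotion labels and reports a per-listener congruence rate. The excerpt labels and listener responses are hypothetical placeholders, not the study's data.

```python
# Hedged sketch: intended-vs-perceived emotion congruence on hypothetical data.
from collections import Counter

# Intended emotion for each of twelve hypothetical excerpts.
intended = ["happiness", "sadness", "fear", "anger"] * 3

# Hypothetical listener responses: one perceived label per excerpt.
responses = [
    ["happiness", "sadness", "fear", "anger"] * 3,   # listener 1
    ["happiness", "sadness", "anger", "anger"] * 3,  # listener 2
]

def congruence_rate(intended, perceived):
    """Fraction of excerpts whose perceived label matches the intended one."""
    hits = sum(i == p for i, p in zip(intended, perceived))
    return hits / len(intended)

for n, perceived in enumerate(responses, start=1):
    print(f"listener {n}: {congruence_rate(intended, perceived):.0%} congruent")

# Per-excerpt modal response, to see which excerpts are identified consistently.
for idx, label in enumerate(intended):
    votes = Counter(r[idx] for r in responses)
    print(f"excerpt {idx + 1} (intended {label}): modal response {votes.most_common(1)[0][0]}")
```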

A Study on the Variation of Music Characteristics based on User Controlled Music Emotion (음악 감성의 사용자 조절에 따른 음악의 특성 변형에 관한 연구)

  • Nguyen, Van Loi;Xubin, Xubin;Kim, Donglim;Lim, Younghwan
    • The Journal of the Korea Contents Association
    • /
    • v.17 no.3
    • /
    • pp.421-430
    • /
    • 2017
  • In this paper, research results on changing music emotion are described. Our goal was to provide a method for a human user to change the emotion of music, and then to find a way of transforming the original music into music whose emotion is similar to the changed emotion. For this purpose, a method of changing the emotion of playing music on a two-dimensional plane is described. The original music then has to be transformed into music whose emotion matches the changed emotion. As the first step, a method of deciding which musical factors should be changed, and by how much, is presented. Finally, an experimental method of editing the sound with an audio editor to change the emotion is described. There are many research results on recognizing music emotion, but attempts to change music emotion are very rare, so this paper opens another direction for research in the music emotion field.
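
The core idea, as described, is to translate a user's move on a two-dimensional emotion plane into concrete edits of musical factors. The sketch below illustrates one such mapping; the factor sensitivities are illustrative assumptions, not the paper's values.

```python
# Hedged sketch: map a user's shift on a valence-arousal plane to edit
# parameters for a sound editor. Sensitivities below are assumptions.

def factor_changes(d_valence, d_arousal):
    """Translate an emotion shift into hypothetical editing parameters."""
    return {
        # Arousal is commonly linked to tempo and loudness.
        "tempo_change_pct": 20.0 * d_arousal,
        "gain_db": 3.0 * d_arousal,
        # Valence is commonly linked to mode and brightness; a pitch shift
        # stands in here for a mode change.
        "pitch_shift_semitones": 1.0 * d_valence,
        "brightness_eq_db": 2.0 * d_valence,
    }

# Example: the user drags the emotion toward "calm/positive"
# (valence +0.5, arousal -0.6) on the plane.
print(factor_changes(0.5, -0.6))
```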

SYMMER: A Systematic Approach to Multiple Musical Emotion Recognition

  • Lee, Jae-Sung;Jo, Jin-Hyuk;Lee, Jae-Joon;Kim, Dae-Won
    • International Journal of Fuzzy Logic and Intelligent Systems
    • /
    • v.11 no.2
    • /
    • pp.124-128
    • /
    • 2011
  • Music emotion recognition is currently one of the most attractive research areas in music information retrieval. In order to use emotion as a clue when searching for a particular piece of music, music-based emotion recognition systems are fundamental. In order to maximize user satisfaction, the recognition accuracy is very important. In this paper, we develop a new music emotion recognition system, which employs a multilabel feature selector and a multilabel classifier. The performance of the proposed system is demonstrated using novel musical emotion data.
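
A minimal sketch of this kind of pipeline on synthetic data, assuming scikit-learn: a simple multilabel feature selector (averaging per-label ANOVA scores, a stand-in for the paper's selector) followed by a one-vs-rest multilabel classifier.

```python
# Hedged sketch of a multilabel selector + multilabel classifier pipeline.
import numpy as np
from sklearn.feature_selection import f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 40))                 # 200 clips x 40 acoustic features
Y = (rng.random((200, 3)) < 0.3).astype(int)   # 3 emotion tags per clip (multilabel)

# Multilabel feature selection: score each feature against each label,
# then keep the k features with the best average score.
scores = np.mean([f_classif(X, Y[:, j])[0] for j in range(Y.shape[1])], axis=0)
k = 10
selected = np.argsort(scores)[-k:]

# Multilabel classifier: one binary classifier per emotion label.
clf = OneVsRestClassifier(LogisticRegression(max_iter=1000))
clf.fit(X[:, selected], Y)
print(clf.predict(X[:5, selected]))            # predicted tag sets for 5 clips
```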

Emotion-based music visualization using LED lighting control system (LED조명 시스템을 이용한 음악 감성 시각화에 대한 연구)

  • Nguyen, Van Loi;Kim, Donglim;Lim, Younghwan
    • Journal of Korea Game Society
    • /
    • v.17 no.3
    • /
    • pp.45-52
    • /
    • 2017
  • This paper proposes a new strategy for emotion-based music visualization. An emotional LED lighting control system is suggested to help audiences enhance their musical experience. In the system, the emotion in music is recognized by a proposed algorithm using a dimensional approach. The algorithm uses music emotion variation detection to overcome some weaknesses of Thayer's model in detecting emotion in a one-second music segment. In addition, the IRI color model is combined with Thayer's model to determine LED light colors corresponding to 36 different music emotions, which are represented on the LED lighting control system through colors and animations. The accuracy of the music emotion visualization reached over 60%.
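
The dimensional mapping from a recognized emotion to an LED color might look like the sketch below; the angle-to-hue rule is an illustrative stand-in for the paper's IRI color model lookup, which is not reproduced here.

```python
# Hedged sketch: map a (valence, arousal) point on Thayer's plane to an
# LED color. The angle-to-hue mapping is an assumption for illustration.
import colorsys
import math

def emotion_to_rgb(valence, arousal):
    """valence, arousal in [-1, 1]; returns an (R, G, B) tuple in 0..255."""
    angle = math.atan2(arousal, valence)          # position on the plane
    hue = (angle / (2 * math.pi)) % 1.0           # angle -> hue
    sat = min(1.0, math.hypot(valence, arousal))  # stronger emotion -> more saturated
    r, g, b = colorsys.hsv_to_rgb(hue, sat, 1.0)
    return tuple(round(c * 255) for c in (r, g, b))

# Example: an "exuberant" segment (positive valence, high arousal).
print(emotion_to_rgb(0.8, 0.7))
```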

Ranking Tag Pairs for Music Recommendation Using Acoustic Similarity

  • Lee, Jaesung;Kim, Dae-Won
    • International Journal of Fuzzy Logic and Intelligent Systems
    • /
    • v.15 no.3
    • /
    • pp.159-165
    • /
    • 2015
  • The need for the recognition of music emotion has become apparent in many music information retrieval applications. In addition to the large pool of techniques that have already been developed in machine learning and data mining, various emerging applications have led to a wealth of newly proposed techniques. In the music information retrieval community, many studies and applications have concentrated on tag-based music recommendation. The limitation of music emotion tags is the ambiguity caused by a single music tag covering too many subcategories. To overcome this, multiple tags can be used simultaneously to specify music clips more precisely. In this paper, we propose a novel technique to rank the proper tag combinations based on the acoustic similarity of music clips.
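
A rough sketch of the ranking idea: score each tag pair by how acoustically coherent the clips carrying both tags are. The scoring function (mean pairwise cosine similarity) and the synthetic data are illustrative assumptions, not the paper's measure.

```python
# Hedged sketch: rank tag pairs by the acoustic coherence of clips
# that carry both tags at once.
from itertools import combinations
import numpy as np

rng = np.random.default_rng(1)
features = rng.normal(size=(50, 12))          # 50 clips x 12 acoustic features
tags = ["happy", "sad", "calm", "energetic"]
clip_tags = [set(rng.choice(tags, size=2, replace=False)) for _ in range(50)]

def mean_pairwise_cosine(X):
    X = X / np.linalg.norm(X, axis=1, keepdims=True)
    sims = X @ X.T
    n = len(X)
    return (sims.sum() - n) / (n * (n - 1))   # exclude self-similarity

ranking = []
for pair in combinations(tags, 2):
    idx = [i for i, t in enumerate(clip_tags) if set(pair) <= t]
    if len(idx) >= 2:                          # need at least two clips to compare
        ranking.append((pair, mean_pairwise_cosine(features[idx])))

for pair, score in sorted(ranking, key=lambda x: -x[1]):
    print(pair, round(score, 3))
```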

A Study on the Performance of Music Retrieval Based on the Emotion Recognition (감정 인식을 통한 음악 검색 성능 분석)

  • Seo, Jin Soo
    • The Journal of the Acoustical Society of Korea
    • /
    • v.34 no.3
    • /
    • pp.247-255
    • /
    • 2015
  • This paper presents a study on the performance of music search based on automatically recognized music-emotion labels. As with other media data, such as speech, images, and video, a song can evoke certain emotions in its listeners. When people look for songs to listen to, the emotions evoked by those songs can be an important consideration. However, very little work has been done on the contribution of music-emotion labels to music search. In this paper, we utilize the three axes of human music perception (valence, activity, tension) and the five basic emotion labels (happiness, sadness, tenderness, anger, fear) in measuring music similarity for music search. Experiments were conducted on both genre and singer datasets. The search accuracy of the proposed emotion-based music search was up to 75% of that of the conventional feature-based music search. By combining the proposed emotion-based method with the feature-based method, we achieved up to a 14% improvement in search accuracy.
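
A minimal sketch of emotion-based similarity as described: each song gets an 8-dimensional emotion descriptor (valence, activity, and tension plus the five basic-emotion scores), and distances over these descriptors can be blended with a conventional acoustic-feature distance. The blending weight and the synthetic data are assumptions, not the paper's values.

```python
# Hedged sketch: emotion-descriptor distance, optionally blended with a
# conventional feature distance for music search.
import numpy as np

def emotion_distance(a, b):
    return np.linalg.norm(a - b)

def combined_distance(emo_a, emo_b, feat_a, feat_b, w=0.5):
    # w balances the emotion-based and feature-based distances (assumed value).
    return w * emotion_distance(emo_a, emo_b) + (1 - w) * np.linalg.norm(feat_a - feat_b)

rng = np.random.default_rng(2)
emotions = rng.random((100, 8))   # per-song emotion descriptors (8 dims)
features = rng.random((100, 20))  # per-song acoustic features
query = 0

dists = [combined_distance(emotions[query], emotions[i], features[query], features[i])
         for i in range(100)]
print(np.argsort(dists)[:5])      # top-5 most similar songs (query itself first)
```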

A Convergence Study on Music-color Association Responses of People with Visual Impairment Mediated by Emotion (시각장애인의 정서 기반 음악-색채 연합에 대한 융복합적 연구)

  • Park, Hye-Young
    • Journal of the Korea Convergence Society
    • /
    • v.10 no.5
    • /
    • pp.313-321
    • /
    • 2019
  • The purpose of this study was to examine the music-color association responses (MCARs) of people with visual impairment through a music-emotion scale and a music-color scale. The study was conducted on 60 participants (30 congenital, 30 adventitious) who use the services of two welfare centers in cities S and B. Four basic emotions mediated by music (happiness, sadness, anger, and fear) were selected, and MCARs to emotion-inducing music were analyzed through a self-report method. As a result, first, MCARs contrasted between happiness and sadness according to the type of emotion, but were similar for anger and fear. Second, among the three variables of the music-emotion scale (valence, arousal, and intensity), valence was congruent with the MCAR according to the type of emotion, arousal scored high for negative emotions, and intensity scores for happiness and sadness were higher than those for anger and fear. Third, there were no significant differences between the congenital and adventitious groups. This study is meaningful in showing that color associations can be mediated by music, as demonstrated by investigating the responses of people with visual impairment.

Attention-based CNN-BiGRU for Bengali Music Emotion Classification

  • Subhasish Ghosh;Omar Faruk Riad
    • International Journal of Computer Science & Network Security
    • /
    • v.23 no.9
    • /
    • pp.47-54
    • /
    • 2023
  • For Bengali music emotion classification, deep learning models, particularly CNNs and RNNs, are frequently used, but previous research suffered from low accuracy and overfitting. In this research, an attention-based Conv1D and BiGRU model is designed for music emotion classification, and comparative experiments show that the proposed model classifies emotions more accurately. We propose a Conv1D and BiGRU model with attention for emotion classification on our Bengali music dataset. Preprocessing of the wav files makes use of MFCCs. To reduce the dimensionality of the feature space, contextual features are extracted by two Conv1D layers. Dropout is utilized to address overfitting. Two bidirectional GRU networks update past and future emotion representations of the output from the Conv1D layers, and the two BiGRU layers are connected to an attention mechanism that gives more weight to informative MFCC feature vectors; this attention mechanism increased the accuracy of the proposed classification model. The resulting vector is finally classified into four emotion classes (Angry, Happy, Relax, Sad) using a dense, fully connected layer with softmax activation. The proposed Conv1D+BiGRU+Attention model classifies emotions in the Bengali music dataset more effectively than baseline methods, achieving 95% accuracy on our dataset.
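
A hedged Keras sketch of the described architecture: two Conv1D layers over MFCC frames, dropout, two BiGRU layers, a simple additive attention pooling, and a 4-way softmax. Layer sizes and the input shape are illustrative assumptions, not the paper's hyperparameters.

```python
# Hedged sketch: Conv1D + BiGRU + attention classifier over MFCC sequences.
import tensorflow as tf
from tensorflow.keras import layers

n_frames, n_mfcc, n_classes = 128, 40, 4   # assumed MFCC sequence shape

inp = layers.Input(shape=(n_frames, n_mfcc))
x = layers.Conv1D(64, 5, activation="relu", padding="same")(inp)
x = layers.Conv1D(64, 5, activation="relu", padding="same")(x)
x = layers.Dropout(0.3)(x)                                  # counter overfitting
x = layers.Bidirectional(layers.GRU(64, return_sequences=True))(x)
x = layers.Bidirectional(layers.GRU(64, return_sequences=True))(x)

# Attention pooling: score each time step, softmax over time, weighted sum.
scores = layers.Dense(1)(x)                                 # (batch, time, 1)
weights = layers.Softmax(axis=1)(scores)
context = layers.Lambda(lambda t: tf.reduce_sum(t[0] * t[1], axis=1))([x, weights])

out = layers.Dense(n_classes, activation="softmax")(context)  # Angry/Happy/Relax/Sad
model = tf.keras.Model(inp, out)
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
model.summary()
```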

Music Similarity Search Based on Music Emotion Classification

  • Kim, Hyoung-Gook;Kim, Jang-Heon
    • The Journal of the Acoustical Society of Korea
    • /
    • v.26 no.3E
    • /
    • pp.69-73
    • /
    • 2007
  • This paper presents an efficient algorithm to retrieve similar music files from a large archive of digital music. Users are able to navigate and discover new music files that sound similar to a given query music file by searching the archive. Since most methods for finding similar music files in a large database require computing the distance between the query music file and every music file in the database, they are very time-consuming. By measuring the acoustic distance only between music files pre-classified with the same type of emotion, the proposed method significantly speeds up the search process and increases precision in comparison with the brute-force method.
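
The speed-up idea can be sketched as follows: distances are computed only within the query's pre-assigned emotion class rather than against the whole archive. The class count and feature data are synthetic placeholders.

```python
# Hedged sketch: emotion-class-filtered similarity search vs. brute force.
import numpy as np

rng = np.random.default_rng(3)
features = rng.normal(size=(10_000, 32))       # acoustic features per song
emotion = rng.integers(0, 8, size=10_000)      # pre-assigned emotion class (8 classes)

def search(query_idx, top_k=5):
    """Return the top_k nearest songs within the query's emotion class."""
    same = np.flatnonzero(emotion == emotion[query_idx])
    d = np.linalg.norm(features[same] - features[query_idx], axis=1)
    return same[np.argsort(d)[:top_k]]         # candidates from one class only

print(search(42))  # distances computed for ~1/8 of the archive, not all of it
```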

Social Network Based Music Recommendation System (소셜네트워크 기반 음악 추천시스템)

  • Park, Taesoo;Jeong, Ok-Ran
    • Journal of Internet Computing and Services
    • /
    • v.16 no.6
    • /
    • pp.133-141
    • /
    • 2015
  • Massive amounts of multimedia content are shared through various social media services, including social network services. As a social network reveals a user's current situation and interests, highly satisfactory personalized recommendations can be made when such features are applied to a recommendation system. In addition, classifying music by emotion and analyzing the user's social network for information about the user's recent emotion or current situation are useful when recommending music. In this paper, we propose a music recommendation method that builds an emotion model, classifies music according to that model, and extracts the user's current emotional state as represented on the social network in order to recommend music; we evaluate the validity of our method through experiments.
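
A toy sketch of the pipeline: estimate the user's current emotion from recent posts with a small keyword lexicon, then recommend songs pre-classified under that emotion. The lexicon, catalog, and song names are illustrative stand-ins for the paper's emotion model, not its actual method.

```python
# Hedged sketch: social-post emotion estimate -> emotion-matched recommendation.
from collections import Counter

LEXICON = {  # hypothetical keyword lexicon per emotion
    "happy": {"great", "fun", "love", "excited"},
    "sad": {"miss", "tired", "alone", "rain"},
    "calm": {"quiet", "peaceful", "coffee", "evening"},
}
CATALOG = {  # songs pre-classified by emotion (placeholder titles)
    "happy": ["Song A", "Song B"],
    "sad": ["Song C"],
    "calm": ["Song D", "Song E"],
}

def current_emotion(posts):
    """Vote for the emotion whose keywords appear most across recent posts."""
    votes = Counter()
    for post in posts:
        words = set(post.lower().split())
        for emotion, keys in LEXICON.items():
            votes[emotion] += len(words & keys)
    return votes.most_common(1)[0][0]

posts = ["Quiet evening with coffee", "So peaceful out here"]
print(CATALOG[current_emotion(posts)])  # -> the "calm" songs
```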