Music Emotion Research Based on Reinforcement Learning and Multimodal Information


Saved in:
Bibliographic Details
Main Author: Yue Hu
Format: Article
Language:English
Published: Wiley 2022-01-01
Series:Journal of Mathematics
Online Access:http://dx.doi.org/10.1155/2022/2446399
author Yue Hu
collection DOAJ
description Music is an important carrier of emotion and an indispensable part of daily life. With the rapid growth of digital music on the Internet, demand for music emotion analysis and retrieval is increasing, and automatic recognition of music emotion has become a major research focus. For music, emotion is its most essential feature and its deepest inner meaning. In a ubiquitous information environment, revealing the deep semantic information of multimodal information resources and providing users with integrated information services has significant research and application value. This paper proposes a multimodal fusion algorithm for music emotion analysis and constructs a dynamic model based on reinforcement learning to improve analysis accuracy. The model adjusts its emotion-analysis results by learning from user behavior, thereby personalizing them to each user's emotional preferences.
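The abstract describes fusing multimodal information and then adjusting the fused emotion predictions from user behavior via reinforcement learning, but gives no algorithmic detail. Below is a minimal, hypothetical sketch of that idea: weighted late fusion of per-modality emotion scores (audio and lyrics, here), with a simple bandit-style update that shifts fusion weights toward whichever modality agreed with the user's feedback. All class and variable names are illustrative assumptions, not the paper's actual model.

```python
# Hypothetical sketch: late fusion of per-modality emotion scores with
# feedback-driven weight updates — a bandit-style stand-in for the
# paper's reinforcement-learning model, not its actual algorithm.

EMOTIONS = ["happy", "sad", "calm", "angry"]

class EmotionFuser:
    def __init__(self, alpha=0.1):
        self.weights = {"audio": 0.5, "lyrics": 0.5}  # initial fusion weights
        self.alpha = alpha                            # learning rate

    def fuse(self, audio_scores, lyric_scores):
        """Weighted late fusion of per-modality emotion score dicts."""
        return {
            e: self.weights["audio"] * audio_scores[e]
               + self.weights["lyrics"] * lyric_scores[e]
            for e in EMOTIONS
        }

    def predict(self, audio_scores, lyric_scores):
        """Return the emotion label with the highest fused score."""
        fused = self.fuse(audio_scores, lyric_scores)
        return max(fused, key=fused.get)

    def update(self, audio_scores, lyric_scores, reward):
        """Adjust fusion weights from user feedback.

        reward is +1 if the user accepted the prediction, -1 otherwise
        (e.g. inferred from skips, likes, or playlist additions).
        """
        pred = self.predict(audio_scores, lyric_scores)
        # Credit each modality by how strongly it supported the prediction.
        for mod, scores in (("audio", audio_scores), ("lyrics", lyric_scores)):
            self.weights[mod] += self.alpha * reward * scores[pred]
        # Clamp and renormalize so the weights stay a convex combination.
        for mod in self.weights:
            self.weights[mod] = max(self.weights[mod], 1e-6)
        total = sum(self.weights.values())
        for mod in self.weights:
            self.weights[mod] /= total
```

A full treatment would replace the hand-set weights with a learned policy and the binary reward with a richer behavioral signal, but the feedback loop — predict, observe the user, adjust — is the personalization mechanism the abstract describes.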
format Article
id doaj-art-1b855258cac24fa8992d3bbff635b690
institution Kabale University
issn 2314-4785
language English
publishDate 2022-01-01
publisher Wiley
record_format Article
series Journal of Mathematics
spelling Yue Hu (Shanxi Jinzhong Institute of Technology). Music Emotion Research Based on Reinforcement Learning and Multimodal Information. Journal of Mathematics, Wiley, 2022-01-01, ISSN 2314-4785. http://dx.doi.org/10.1155/2022/2446399
title Music Emotion Research Based on Reinforcement Learning and Multimodal Information
url http://dx.doi.org/10.1155/2022/2446399