Title: Artificial intelligence in music : composition and emotion
Authors: Chen, Gong
Degree: Ph.D.
Issue Date: 2019
Abstract: Music is a universal feature of all human societies. People compose music to express their thoughts and feelings; in turn, music evokes emotion in its listeners. This thesis investigates new artificial intelligence techniques for music composition and emotion analysis. Algorithmic composition enables a computer to compose music much as a human musician does. We apply deep learning, a powerful artificial intelligence technology, to algorithmic composition. To achieve both good musicality and novelty in machine-composed music, we propose a musicality-novelty generative adversarial network (MNGAN) model. Two games, a musicality game and a novelty game, are designed and optimized alternately. Specifically, the musicality game drives the output samples toward the distribution of human-composed examples, while the novelty game pushes each output sample away from its nearest human-composed example. A series of experiments validates the effectiveness of the proposed algorithmic composition technique. To provide a more convincing evaluation of machine-composed music, we propose analyzing listeners' brainwaves using electroencephalography (EEG). A new psychological paradigm employing auditory stimuli with varying degrees of musicality is designed. Based on this paradigm, we propose a novel computational model to evaluate the musicality of machine-composed music. Several empirical experiments are performed, and the results validate the effectiveness of the proposed method. Some interesting neuroscience observations are also reported. For music emotion analysis, we present two pieces of work. The first is a novel multi-label emotion classification model named the quantum convolutional neural network (QCNN), which classifies music according to its audio signal. Inspired by the concepts of superposition and collapse in quantum mechanics, we model the emotion measurement process with a newly designed convolutional neural network.
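The novelty game described above, pushing each generated sample away from its nearest human-composed example, can be illustrated with a minimal sketch. This is an illustrative reading of the abstract only, not the thesis's actual loss: the function name, the Euclidean distance, and the toy feature vectors are all assumptions.

```python
import numpy as np

def novelty_loss(generated, human_examples):
    """Negative distance to the nearest human-composed example.

    Minimizing this quantity pushes each generated sample AWAY from
    its nearest neighbour in the human corpus (the 'novelty game').
    Illustrative only; the thesis's exact formulation may differ.
    """
    losses = []
    for g in generated:
        # Distance from this generated sample to every human example
        dists = np.linalg.norm(human_examples - g, axis=1)
        # Close to a human example -> high loss; far away -> low loss
        losses.append(-dists.min())
    return float(np.mean(losses))

# Toy 2-D "feature vectors" standing in for encoded music samples
human = np.array([[0.0, 0.0], [1.0, 1.0]])
near = np.array([[0.1, 0.0]])   # nearly copies a human example
far = np.array([[5.0, 5.0]])    # far from every human example
assert novelty_loss(near, human) > novelty_loss(far, human)
```

In the full model this term would be balanced against the musicality game, which pulls samples toward the human distribution, so that optimizing the novelty game alone cannot produce arbitrary, unmusical output.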
Empirical experiments validate that our method achieves good classification results by learning from human-provided labels. The second work on music emotion analysis is based on the EEG of the audience. Considering that EEG-based music emotion classification can be influenced by the inter-trial effect, we design a novel single-trial music-listening paradigm that avoids this effect on the EEG data. To address the severe inter-subject differences in the single-trial setting, we propose a novel computational model named resting-state alignment, which aims to eliminate the differences between subjects while identifying the features that contribute to emotion recognition. Empirical experiments demonstrate the performance of the proposed technique, and the observations are consistent with previous findings in neuroscience. Future work will explore algorithmic composition that evokes specified emotions, which has both academic value and commercial potential.
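One simple way to read "resting-state alignment" is as per-subject normalization of task EEG features against that subject's own resting-state statistics. The sketch below is an assumption built on that reading; the function name, the z-score formulation, and the synthetic data are all hypothetical and need not match the thesis's actual model.

```python
import numpy as np

def resting_state_align(task_feats, rest_feats):
    """Z-score a subject's task-EEG features using that subject's
    resting-state mean and std, removing subject-specific baselines.

    Illustrative sketch of one possible 'resting-state alignment';
    the thesis's computational model may be more sophisticated.
    """
    mu = rest_feats.mean(axis=0)
    sigma = rest_feats.std(axis=0) + 1e-8  # avoid division by zero
    return (task_feats - mu) / sigma

# Two hypothetical subjects whose raw EEG features sit on very
# different baselines (offsets 0 and 100), plus the same small
# task-related shift of +0.5.
rng = np.random.default_rng(0)
rest_a = rng.normal(0.0, 1.0, size=(50, 4))
rest_b = rng.normal(100.0, 1.0, size=(50, 4))
task_a = resting_state_align(rest_a + 0.5, rest_a)
task_b = resting_state_align(rest_b + 0.5, rest_b)
```

After alignment, the two subjects' features land on a comparable scale, so the task-related shift survives while the 100-unit inter-subject offset is removed.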
Subjects: Hong Kong Polytechnic University -- Dissertations
Artificial intelligence -- Musical applications
Computer music
Computer composition (Music)
Emotions in music
Pages: xviii, 111 pages : color illustrations
Appears in Collections: Thesis

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.