
Affective Calibration of Musical Feature Sets in an Emotionally Intelligent Music Composition System

Published: 10 May 2017

Abstract

Affectively driven algorithmic composition (AAC) is a rapidly growing field that exploits computer-aided composition to generate new music with particular emotional qualities or affective intentions. An AAC system was devised to generate a stimulus set covering nine discrete sectors of a two-dimensional emotion space by means of a 16-channel feed-forward artificial neural network. The system generated a set of short pieces of music, which were rendered using a sampled piano timbre and evaluated by a group of experienced listeners, each of whom ascribed a two-dimensional valence-arousal coordinate to each stimulus. The underlying musical feature set, initially drawn from the literature, was then adjusted by amplifying or attenuating the quantity of each feature so as to maximize the spread of stimuli in the valence-arousal space, after which a second listener evaluation was conducted. This process was repeated a third time, maximizing the spread of ascribed valence-arousal coordinates in comparison to the spread of an existing prerated stimulus database. The results demonstrate that this prototype AAC system can create short sequences of music with a slight improvement on the range of emotion found in a stimulus set composed of real-world, traditionally composed musical excerpts.
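The calibration loop described above rests on two operations: mapping each listener-ascribed valence-arousal coordinate to one of nine discrete sectors, and measuring how widely a stimulus set is spread across the space. The sketch below illustrates one plausible realization, assuming a [-1, 1]^2 valence-arousal plane divided into a 3x3 grid; the sector boundaries, function names, and the sector-count spread measure are illustrative assumptions, not the authors' implementation.

```python
def va_sector(valence, arousal):
    """Map a valence-arousal coordinate in [-1, 1]^2 to one of nine
    sectors, treating each axis as low / neutral / high (a 3x3 grid).
    Returns an integer sector index in 0..8."""
    def band(x):
        # Assumed boundaries at +/- 1/3; the paper does not specify these.
        if x < -1/3:
            return 0
        if x <= 1/3:
            return 1
        return 2
    return band(arousal) * 3 + band(valence)

def va_spread(ratings):
    """Crude spread measure: the number of distinct sectors covered by a
    set of (valence, arousal) ratings. A fully spread stimulus set
    would cover all nine sectors."""
    return len({va_sector(v, a) for v, a in ratings})
```

For example, ratings at (-0.8, -0.7), (0.0, 0.1), and (0.9, 0.8) fall in three distinct sectors, so `va_spread` returns 3; a calibration pass would amplify or attenuate musical features until the generated stimuli cover more of the nine sectors.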




    Published In

    ACM Transactions on Applied Perception  Volume 14, Issue 3
    July 2017
    148 pages
    ISSN:1544-3558
    EISSN:1544-3965
    DOI:10.1145/3066910

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 10 May 2017
    Accepted: 01 January 2017
    Revised: 01 January 2017
    Received: 01 January 2016
    Published in TAP Volume 14, Issue 3


    Author Tags

    1. Algorithmic composition
    2. emotional congruence
    3. music perception

    Qualifiers

    • Research-article
    • Research
    • Refereed

    Funding Sources

    • Ninth Triennial Conference of the European Society for the Cognitive Sciences of Music (ESCOM)
    • Engineering and Physical Sciences Research Council (EPSRC)
    • Royal Northern College of Music, Manchester, UK

    Cited By

    • (2024) AI-Based Affective Music Generation Systems: A Review of Methods and Challenges. ACM Computing Surveys 56, 11, 1-34. DOI: 10.1145/3672554. Online publication date: 17-Jun-2024.
    • (2023) AffectMachine-Classical: A novel system for generating affective classical music. Frontiers in Psychology 14. DOI: 10.3389/fpsyg.2023.1158172. Online publication date: 6-Jun-2023.
    • (2023) The Affective Bar Piano. In Proceedings of the 23rd ACM International Conference on Intelligent Virtual Agents, 1-3. DOI: 10.1145/3570945.3607358. Online publication date: 19-Sep-2023.
    • (2023) An Intelligent Calibration Testing of Electricity Meter using XGBoost for Manufacturing 4.0. In 2023 International Conference on Computer Science, Information Technology and Engineering (ICCoSITE), 183-188. DOI: 10.1109/ICCoSITE57641.2023.10127767. Online publication date: 16-Feb-2023.
    • (2023) Neural decoding of music from the EEG. Scientific Reports 13, 1. DOI: 10.1038/s41598-022-27361-x. Online publication date: 12-Jan-2023.
    • (2022) Analysis of Piano Performance Characteristics by Deep Learning and Artificial Intelligence and Its Application in Piano Teaching. Frontiers in Psychology 12. DOI: 10.3389/fpsyg.2021.751406. Online publication date: 27-Jan-2022.
    • (2022) Analysis of Different Keying Modes and Brain Sound Image Establishment Mode in Piano Playing. Wireless Communications & Mobile Computing 2022. DOI: 10.1155/2022/6713468. Online publication date: 1-Jan-2022.
    • (2022) Design of Music Teaching System Based on Artificial Intelligence. Mathematical Problems in Engineering 2022, 1-7. DOI: 10.1155/2022/2627395. Online publication date: 6-Jul-2022.
    • (2022) A systematic review of artificial intelligence-based music generation: Scope, applications, and future trends. Expert Systems with Applications 209, 118190. DOI: 10.1016/j.eswa.2022.118190. Online publication date: Dec-2022.
    • (2021) Affective Visualization in Virtual Reality: An Integrative Review. Frontiers in Virtual Reality 2. DOI: 10.3389/frvir.2021.630731. Online publication date: 6-Aug-2021.
    • Show More Cited By
