Publication Alert! This journal paper discusses our automatic music generation system, AffectMachine. Development of this system began in 2021 as an interdisciplinary effort (our team members have backgrounds in cognitive science, music composition, and computer science). AffectMachine is capable of creating emotional-sounding music in real time, based either on the user's inputs or on their physiological (e.g., heart rate) or neural signals. We will be embedding the system into a brain-computer interface (BCI) to help listeners regulate their emotional states.
Our paper testing a novel music-based brain-computer interface (BCI) for emotion regulation in listeners was published in PLoS ONE yesterday. The paper is freely available online here.
Title: A closed-loop, music-based brain-computer interface for emotion mediation