TY - GEN
T1 - Neurophone
T2 - 6th IEEE International Symposium on the Internet of Sounds, IS2 2025
AU - Velasquez-Pena, Maria Lucia
AU - Ramirez-Espinosa, Gustavo
AU - Garcia-Orjuela, Danilo Alejandro
AU - Alvarado-Rojas, Catalina
N1 - Publisher Copyright:
© 2025 IEEE.
PY - 2025/10/29
Y1 - 2025/10/29
N2 - Music is widely recognized as a powerful tool for emotional regulation and cognitive engagement. Recent advances in wearable systems, signal processing, and interactive technologies have enabled the real-time generation and modulation of music based on brain activity, leading to the development of Brain-Computer Music Interfaces (BCMIs). These systems allow users to interact with musical environments through brain electrical signals captured via electroencephalography (EEG). In this work, we present the Neurophone, a BCMI that integrates real-time EEG acquisition with dynamic music generation. EEG signals are collected using the Muse S wearable headband and transmitted via Bluetooth to a mobile application. The signals are then forwarded over WiFi using the OSC communication protocol to a custom musical feedback system. This system was implemented in Ableton Live, a digital audio workstation (DAW), using Max for Live for programming. The Neurophone enables the transduction of five EEG frequency bands into musical parameters such as pitch, velocity, duration, and rhythm. It functions as a real-time musification platform, allowing users to modulate musical expression through their cognitive and emotional states. The instrument emphasizes performance and versatility with simple plug-and-play integration. The Neurophone requires minimal setup and can be readily adapted to different EEG acquisition devices. Preliminary results demonstrate EEG-to-MIDI mapping, showing how distinct frequency bands correlate with specific musical features. This work aims to bridge neuroscience with musical technology, offering an adaptable tool for studying brain responses and interaction with musical environments. The Neurophone opens new possibilities for therapeutic applications, creative expression, adaptive music generation, and immersive sound experiences.
AB - Music is widely recognized as a powerful tool for emotional regulation and cognitive engagement. Recent advances in wearable systems, signal processing, and interactive technologies have enabled the real-time generation and modulation of music based on brain activity, leading to the development of Brain-Computer Music Interfaces (BCMIs). These systems allow users to interact with musical environments through brain electrical signals captured via electroencephalography (EEG). In this work, we present the Neurophone, a BCMI that integrates real-time EEG acquisition with dynamic music generation. EEG signals are collected using the Muse S wearable headband and transmitted via Bluetooth to a mobile application. The signals are then forwarded over WiFi using the OSC communication protocol to a custom musical feedback system. This system was implemented in Ableton Live, a digital audio workstation (DAW), using Max for Live for programming. The Neurophone enables the transduction of five EEG frequency bands into musical parameters such as pitch, velocity, duration, and rhythm. It functions as a real-time musification platform, allowing users to modulate musical expression through their cognitive and emotional states. The instrument emphasizes performance and versatility with simple plug-and-play integration. The Neurophone requires minimal setup and can be readily adapted to different EEG acquisition devices. Preliminary results demonstrate EEG-to-MIDI mapping, showing how distinct frequency bands correlate with specific musical features. This work aims to bridge neuroscience with musical technology, offering an adaptable tool for studying brain responses and interaction with musical environments. The Neurophone opens new possibilities for therapeutic applications, creative expression, adaptive music generation, and immersive sound experiences.
KW - Brain-Computer Music Interface
KW - Generative music
KW - Neuroengineering
KW - Signal Processing
UR - http://dx.doi.org/10.1109/is264627.2025.11284624
UR - https://www.mendeley.com/catalogue/85dab7d9-7a15-30a0-87f3-586714c9572d/
UR - https://www.scopus.com/pages/publications/105031771683
U2 - 10.1109/is264627.2025.11284624
DO - 10.1109/is264627.2025.11284624
M3 - Conference contribution
T3 - 2025 IEEE 6th International Symposium on the Internet of Sounds, IS2 2025
BT - 2025 IEEE 6th International Symposium on the Internet of Sounds, IS2 2025
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 29 October 2025 through 31 October 2025
ER -