Such projects include: the Listening to the Mind Listening (LML) concert staged at the Sydney Opera House Studio in 2004, which demonstrated sonification's potential both as a musical experience and as an avenue for research (Barrass, 2012); the continued development of brain-computer interfaces (BCI), especially the brain-computer music interface (BCMI), which extends musical activity to disabled individuals (Miranda & Brouse, 2005); Palacio-Quintin's work on sonifying movement with sensors that measure distance and pressure, relating physical movement to the expression of emotion (Palacio-Quintin & Zadel, 2008); as well as a host of performing artists who use biosignals primarily for entertainment, such as Lucky Dragons, The Heart Chamber Orchestra, and Manifold Motion (Paalasmaa, Murphy, & Holmqvist, 2012). The piece that first drew my attention to this topic was Daniel Dehaan's "Intelligence in the Human Machine," premiered in 2014 with Chicago Symphony cellist Katinka Kleijn, who performed from an interactive score while her brain waves were captured by an EEG headset and turned into electronic sounds (Sterbenz, 2014). Though the appearance of such works is promising, there is much work still to be done and many more advancements to be made.
Ideally, the equipment should allow performers to express themselves artistically without restricting them in any way. EEG is currently gaining popularity, partly because some commercial models are portable and affordably priced (Christopher, Kapur, Carnegie, & Grimshaw, 2014). One system that takes this approach is the brain-computer music interface (BCMI) being developed by researchers such as Eduardo Miranda and Andrew Brouse, who have built a BCMI that is fed EEG signals and uses the data to "play" a piano (Miranda & Brouse, 2005). The researchers point out that the system is not a direct presentation of brain activity; rather, the computer interprets the information and sonifies it in its own way, in this case as patterns on a piano (Miranda & Brouse, 2005). However, such a headset or cap system would also need to be supplemented with other sensors, such as galvanic skin response sensors and heart-rate monitors, to offset some of the previously mentioned limitations and to gain a clearer understanding of what the performer is experiencing. Such systems are taking shape, for example the BioMuse System, which uses kinematic sensors (measuring motion of the body) and physiological sensors that measure somatic/autonomic activity with the