A piano that captures the data of live performance offers the MIT community new possibilities for studying and experimenting with music.
Nicole Estvanik Taylor | Arts at MIT
Seated at the grand piano in MIT’s Killian Hall last fall, first-year student Jacqueline Wang played through the lively opening of Mozart’s “Sonata in B-flat major, K.333.” When she’d finished, Mi-Eun Kim, pianist and lecturer in MIT’s Music and Theater Arts Section (MTA), asked her to move to the rear of the hall. Kim tapped at an iPad. Suddenly, the sonata Wang had just played poured forth again from the piano — its keys dipping and rising just as they had with Wang’s fingers on them, the resonance of its strings filling the room. Wang stood among a row of empty seats with a slightly bemused expression, taking in a repeat of her own performance.
“That was a little strange,” Wang admitted when the playback concluded, then added thoughtfully: “It sounds different from what I imagine I’m playing.”
This unusual lesson took place during the Steinway Spirio | r’s nearly three-week residency at MIT; the Spirio | r is a piano embedded with technology for live performance capture and playback. “The residency offered students, faculty, staff, and campus visitors the opportunity to engage with this new technology through a series of workshops that focused on such topics as the historical analysis of piano design, an examination of the hardware and software used by the Spirio | r, and step-by-step guidance on how to use the features,” explains Keeril Makan, head of MIT Music and Theater Arts and associate dean of the School of Humanities, Arts, and Social Sciences.
Wang was one of several residency participants to have the out-of-body experience of hearing herself play from a different vantage point, while watching the data of her performance scroll across a screen: color-coded rectangles indicating the velocity and duration of each note, an undulating line charting her use of the damper pedal. Wang was even able to edit her own performance, as she discovered when Kim suggested her rhythmic use of the pedal might be superfluous. Using the iPad interface to erase the pedaling entirely, they listened to the playback again, the notes gaining new clarity.
“See? We don’t need it,” Kim confirmed with a smile.
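For the technically curious, here is a minimal sketch of what such captured performance data might look like, including the kind of pedal-stripping edit described above. The structure and names below are illustrative assumptions only; Steinway’s actual data format is proprietary and not described in this article.

```python
from __future__ import annotations
from dataclasses import dataclass

@dataclass
class NoteEvent:
    pitch: int         # MIDI-style note number, e.g. 60 = middle C
    onset_s: float     # when the key was struck, in seconds
    duration_s: float  # how long the key was held down
    velocity: int      # strike intensity on the instrument's discrete scale

@dataclass
class PedalEvent:
    time_s: float      # when the damper pedal position was sampled
    depth: float       # 0.0 = fully up, 1.0 = fully depressed

@dataclass
class Performance:
    notes: list[NoteEvent]
    pedal: list[PedalEvent]

def strip_pedal(performance: Performance) -> Performance:
    """Return a copy of the performance with all damper-pedal data removed,
    analogous to the edit Kim and Wang made from the iPad."""
    return Performance(notes=list(performance.notes), pedal=[])
```

Because the notes and pedal motion are stored as separate streams of events, an edit like erasing the pedaling leaves every keystroke untouched, which is why the playback Wang heard kept all of her notes while gaining clarity.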
“When MIT’s new music building (W18) opens in spring 2025, we hope it will include this type of advanced technology. It would add value not just to Wang’s cohort of 19 piano students in the Emerson/Harris Program, which provides a total of 71 scholars and fellows with support for conservatory-level instruction in classical, jazz, and world music, but could also offer educational opportunities to a much wider swath of the MIT community,” says Makan. “Music is the fifth-most popular minor at MIT; 1,700 students enroll in music and theater arts classes each semester, and the Institute is brimming with vocalists, composers, instrumentalists, and music history students.”
According to Kim, the Spirio enables insights beyond what musicians could learn from a conventional recording; hearing playback directly from the instrument reveals sonic dimensions an MP3 can’t capture. “Speaker systems sort of crunch everything down — the highs and the lows, they all kind of sound the same. But piano solo music is very dynamic. It’s supposed to be experienced in a room,” she says.
During the Spirio | r residency, students found they could review their playing at half speed, adjust the volume of certain notes to emphasize a melody, transpose a piece to another key, or layer their performance — prerecording one hand, for example, then accompanying it live with the other.
“It helps the student be part of the learning and the teaching process,” Kim says. “If there’s a gap between what they imagined and what they hear and then they come to me and say, ‘How do I fix this?’ they’re definitely more engaged. It’s an honest representation of their playing, and the students who are humbled by it will become better pianists.”
For Wang, reflecting on her lesson with Kim, the session introduced an element she’d never encountered in her piano studies, which began at age 5. “The visual display of how long each key was played and with what velocity gave me a more precise demonstration of the ideas of voicing and evenness,” Wang says. “Playing the piano is usually dependent solely on the ears, but this combines with the auditory experience a visual experience and statistics, which helped me get a more holistic view of my playing.”
As a first-year undergraduate considering a Course 6 major (electrical engineering and computer science, or EECS), Wang was also fascinated to watch Patrick Elisha, a representative from Steinway dealer M. Steinert & Sons, disassemble the piano action to point out the optical sensors that measure the velocity of each hammer strike at 1,020 levels of sensitivity, sampled 800 times per second.
“I was amazed by the precision of the laser sensors and inductors,” says Wang. “I have just begun to take introductory-level courses in EECS and am just coming across these concepts, and this certainly made me more excited to learn more about these electrical devices and their applications. I was also intrigued that the electrical system was added onto the piano without interfering with the mechanical structure, so that when we play the Spirio, our experience with the touch and finger control was just like that of playing a usual Steinway.”
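As a rough illustration of how such a measurement could work, a pair of optical gates can time the hammer shank over a known distance and quantize the resulting speed into discrete levels. The gate spacing, speed range, and scaling below are assumptions for the sake of the sketch; only the 1,020-level figure comes from the article, and Steinway’s actual hardware may work differently.

```python
# Illustrative sketch only: assumes a pair of optical gates a known distance
# apart near the end of the hammer's travel. Steinway's actual sensing
# hardware and scaling are proprietary and may differ.

GATE_SPACING_MM = 2.0       # assumed distance between the two optical gates
MAX_VELOCITY_MM_S = 5000.0  # assumed fastest hammer speed on the scale
LEVELS = 1020               # discrete velocity levels cited for the Spirio | r

def hammer_velocity_mm_s(t_first_gate_s: float, t_second_gate_s: float) -> float:
    """Speed of the hammer shank between the two gate crossings."""
    return GATE_SPACING_MM / (t_second_gate_s - t_first_gate_s)

def quantize_velocity(v_mm_s: float) -> int:
    """Map a measured speed onto one of 1,020 discrete levels (1-1020)."""
    fraction = min(max(v_mm_s / MAX_VELOCITY_MM_S, 0.0), 1.0)
    return max(1, round(fraction * LEVELS))

# Example: gates crossed 1 ms apart -> 2,000 mm/s -> level 408
level = quantize_velocity(hammer_velocity_mm_s(0.000, 0.001))
```

Because the sensing sits alongside the action rather than inside it, a scheme like this leaves the mechanical feel of the keys unchanged, which matches Wang’s observation that playing the Spirio felt like playing a conventional Steinway.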
Another Emerson/Harris scholar, Víctor Quintas-Martínez, a PhD candidate in economics who resumed his lapsed piano studies during the Covid-19 pandemic, visited Killian Hall during the residency to rehearse a Fauré piano quartet with a cellist, violist, and violinist. “We did a run of certain passages and recorded the piano part. Then I listened to the strings play with the recording from the back of the hall. That gave me an idea of what I needed to adjust in terms of volume, texture, pedal, etc., to achieve a better balance. Normally, when you’re playing, because you’re sitting behind the strings and close to the piano, your perception of balance may be somewhat distorted,” he notes.
Kim cites another campus demographic ripe for exploring instruments like the Spirio | r and its software: future participants in MIT’s relatively new Music Technology Master’s Program, along with others across the Institute whose work intersects with the wealth of data the instrument captures. Among them is Praneeth Namburi, a research scientist at the MIT.nano Immersion Lab. Typically, Namburi focuses his neuroscience expertise on the biomechanics of dancing and expert movement. For two days during the MTA/Spirio residency, he used the sensors at the Immersion Lab, along with those of the Spirio, to analyze how pianists use their bodies.
“We used motion capture that can help us contrast the motion paths of experts such as Mi-Eun with those of students, potentially aiding in music education,” Namburi recounts, “force plates that can give scientific insights into how movement timing is organized, and ultrasound to visualize the forearm tissues during playing, which can potentially help us understand musicianship-related injuries.”
“The encounter between MTA and MIT.nano was something unique to MIT,” Kim believes. “Not only is this super useful for the music world, but it’s also very exciting for movement researchers, because playing piano is one of the most complex activities that humans do with our hands.”
In Kim’s view, that quintessentially human complexity is complemented by these kinds of technical possibilities. “Some people might think, oh, it’s going to replace the pianist,” she says. “But in the end it is a tool. It doesn’t replace all of the things that go into learning music. I think it’s going to be an invaluable third partner: the student, the teacher, and the Spirio — or the musician, the researcher, and the Spirio. It’s going to play an integral role in a lot of musical endeavors.”