This AI can interpret the music played by an instrument only using visual cues

Machine learning has enabled a group of researchers from the University of Washington to develop a system, called Audeo, that creates audio from silent piano performances.

In other words, this artificial intelligence reconstructs a musician's performance on an instrument using only visual cues.

Audeo

Audeo uses a series of steps to decode what is happening in the video and then translate it into music. First, the system detects which keys are pressed in each video frame to build a diagram of the performance over time. Then it translates that diagram into something a music synthesizer actually recognizes as the sound a piano would make. This second step cleans up the data and adds more information, such as how hard each key is pressed and for how long.
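To make the idea of that intermediate diagram concrete, here is a minimal, illustrative sketch in Python. It is not the Audeo authors' code: it assumes a hypothetical per-frame key-press table (a "piano roll") has already been produced from the video, and it only converts that table into timed note events a synthesizer could play, ignoring the velocity (how hard each key is pressed) that the real system also estimates.

```python
# Illustrative sketch only, not the Audeo implementation.
# Assumptions: a fixed video frame rate and an 88-key piano roll
# where piano_roll[frame][key] is True if that key looks pressed.

FPS = 25  # assumed video frame rate

def piano_roll_to_notes(piano_roll, fps=FPS):
    """Turn a per-frame key-press table into note events:
    (midi_pitch, onset_seconds, duration_seconds)."""
    num_frames = len(piano_roll)
    num_keys = len(piano_roll[0]) if num_frames else 0
    notes = []
    for key in range(num_keys):
        onset = None
        for frame in range(num_frames):
            pressed = piano_roll[frame][key]
            if pressed and onset is None:
                onset = frame                      # key went down
            elif not pressed and onset is not None:
                midi_pitch = 21 + key              # key 0 = A0 on an 88-key piano
                notes.append((midi_pitch, onset / fps, (frame - onset) / fps))
                onset = None
        if onset is not None:                      # key still held at the end
            notes.append((21 + key, onset / fps, (num_frames - onset) / fps))
    return notes

# Tiny example: middle C (key index 39, MIDI 60) held for frames 2-5.
roll = [[False] * 88 for _ in range(8)]
for f in range(2, 6):
    roll[f][39] = True
print(piano_roll_to_notes(roll))  # [(60, 0.08, 0.16)]
```

The resulting note events are close to what a MIDI synthesizer consumes, which is the role the second step of the pipeline plays in the description above.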

The researchers trained and tested the system using YouTube videos of pianist Paul Barton. The dataset consisted of some 172,000 video frames of Barton playing music by well-known classical composers such as Bach and Mozart.

Audeo's output is recognizable enough that song-recognition apps can identify it: the apps correctly identified the piece that Audeo played approximately 86% of the time, compared with 93% when given the audio from the original videos.

Audeo was trained and tested only on Paul Barton's piano videos; future research is needed to see how well it can transcribe music from any musician or piano.