NSF REU: Computational Methods for Music, Media, and Minds at the University of Rochester. Faculty Mentors: Prof. Zhiyao Duan (ECE), Prof. Matthew Brown (Music). Collaborators: Christodoulos Benetatos; Christopher Winders, Ph.D.

nickcreel/automatic_music_events

# Automatic Rendering of Augmented Events in Immersive Concerts

Mentor(s): Zhiyao Duan (Electrical and Computer Engineering); Matthew Brown (Music Theory, Eastman School of Music)

In immersive concerts, the audience’s listening experience is often augmented with texts, images, lighting and sound effects, and other materials. Manually synchronizing these materials with a live music performance becomes increasingly challenging as their number grows. In this project, we will design an automatic system that follows the performance and controls pre-coded augmented events in real time. This allows immersive concert experiences to scale with the complexity of the texts, images, and lighting and sound effects. We will work with TableTopOpera at the Eastman School of Music on implementing and refining this system.
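The core idea described above can be sketched as a simple event scheduler: pre-coded events are keyed to positions in the score, and as a score follower reports the performer's current position, all events whose time has been reached are fired. This is a minimal illustrative sketch, not the project's actual implementation; the class and event names are hypothetical, and a real system would receive positions from a real-time score-following module.

```python
class EventScheduler:
    """Fires pre-coded augmented events (text, image, lighting cues)
    as a score follower reports the performer's position.

    Hypothetical sketch: score positions are in beats; in practice
    they would come from a real-time audio-to-score alignment module.
    """

    def __init__(self, events):
        # events: list of (score_time, payload), e.g. (4.0, "lights: dim")
        self._events = sorted(events)
        self._next = 0  # index of the first event not yet fired

    def advance(self, position):
        """Fire every pending event whose score time <= position."""
        fired = []
        while (self._next < len(self._events)
               and self._events[self._next][0] <= position):
            fired.append(self._events[self._next][1])
            self._next += 1
        return fired
```

For example, `EventScheduler([(1.0, "text: verse 1"), (4.0, "lights: dim")]).advance(2.0)` would fire only the first event; calling `advance` again with a later position fires the rest. Keeping the events sorted and tracking a single index makes each update cheap enough for real-time use.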

Funding for this research is provided by the National Science Foundation, award no. 1659250.
