FURI | Spring 2022

Creative Frameworks: Leveraging Deep Learning and Data Analysis to Create Accessible Artistic Technologies


Auditory analysis of musical features and lyric text, combined with flexible lighting controls driven by inputs such as keyboard or motion capture, can feed an end-to-end program that generates light shows and visuals, enabling creative expression. The researchers developed a program that extracts information from songs and records user input to automatically produce replayable light shows via Digital Multiplex (DMX) output. Combined with generative adversarial networks that produce visuals, this supports human-driven creative performances, making creative expression accessible to people of any background or ability and removing the limitations imposed by traditional technology and complicated physical interfaces.
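The core idea of mapping song information to DMX output can be illustrated with a minimal sketch. The code below is a hypothetical example, not the project's implementation: it frames a mono audio signal, computes per-frame RMS energy, and scales each frame to an 8-bit DMX dimmer value (0-255), the value range a DMX channel accepts. The frame size, sample rate, and function names are all illustrative assumptions.

```python
import math

def rms_frames(samples, frame_size):
    """Split mono audio samples into fixed-size frames and return
    the RMS energy of each frame (one value per frame)."""
    frames = []
    for i in range(0, len(samples) - frame_size + 1, frame_size):
        frame = samples[i:i + frame_size]
        frames.append(math.sqrt(sum(s * s for s in frame) / frame_size))
    return frames

def to_dmx(levels):
    """Scale per-frame energy to 8-bit DMX channel values (0-255),
    normalizing so the loudest frame maps to full brightness."""
    peak = max(levels) or 1.0  # avoid division by zero on silence
    return [min(255, int(round(255 * v / peak))) for v in levels]

# Illustrative input: one second of a 440 Hz tone with a linear
# fade-in, sampled at 8 kHz (stands in for decoded song audio).
rate = 8000
samples = [(i / rate) * math.sin(2 * math.pi * 440 * i / rate)
           for i in range(rate)]

dmx = to_dmx(rms_frames(samples, 400))  # 20 frames -> 20 dimmer values
```

A real system would stream these values to a DMX interface at the fixture refresh rate and record them alongside timestamps so the show can be replayed.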

Student researcher

Kang Yi Lim


Computer systems engineering

Hometown: Chandler, Arizona, United States

Graduation date: Spring 2022