CS 184: Computer Graphics and Imaging, Spring 2019

Final Project: Choreographed Particle Simulation

Yusuf Fateen, Brian Levis, Jayanth Sundaresan

Milestone Video

Wow (Instrumental) - Post Malone


At each timestep, forces between particles are computed, particles are modified according to musical features, and particle positions are updated.
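The per-timestep loop described above can be sketched as follows. This is a minimal illustration, not our exact implementation: the particle layout, the attraction constant, and the `beat_strength` feature name are all hypothetical.

```python
import numpy as np

# Illustrative sketch of one simulation timestep:
# (1) pairwise forces, (2) music-driven modulation, (3) Euler integration.
def step(pos, vel, beat_strength, dt=1.0 / 60.0, attract=0.01):
    # Pairwise attraction: each particle is pulled toward every other one.
    diff = pos[None, :, :] - pos[:, None, :]        # (N, N, 2) displacements
    dist = np.linalg.norm(diff, axis=-1) + 1e-6     # avoid divide-by-zero
    force = attract * (diff / dist[..., None]).sum(axis=1)
    # Modulate gravity by the current audio feature (here, beat strength).
    gravity = np.array([0.0, -9.8]) * beat_strength
    vel = vel + (force + gravity) * dt
    pos = pos + vel * dt
    return pos, vel
```

Elastic collisions and the color updates would run in the same loop, after the integration step.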

In the above video, particles are slightly attracted to each other and collide elastically. Particles are colored according to speed, gravity is modulated by the subsampled audio magnitude, particles bulge on detected "beats", and particles change color on the beats that form the song's main tempo. The color changes become visible after the song's 20-second introduction ends.
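One way to picture the speed-based coloring is an HSV ramp from blue (slow) to red (fast). This is a hypothetical sketch of such a mapping, not the shader we actually use; `max_speed` is an assumed tuning constant.

```python
import colorsys  # stdlib HSV-to-RGB conversion

def speed_to_rgb(speed, max_speed=10.0):
    # Clamp speed to [0, 1], then sweep hue from blue (0.66) to red (0.0).
    t = min(speed / max_speed, 1.0)
    return colorsys.hsv_to_rgb(0.66 * (1.0 - t), 1.0, 1.0)
```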

In order to support a large number of particles, we will implement a grid-based acceleration structure that maintains location-based references to particles, so that each particle interacts only with its closest neighbors. We also plan to replace the simple gravity-based equations shown here with fluid-simulation equations.
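A minimal sketch of the planned grid structure, under the assumption of a 2D uniform grid with cell size equal to the interaction radius (class and method names are illustrative): each particle hashes into a cell, and interaction candidates come only from the 3×3 block of surrounding cells.

```python
from collections import defaultdict

class SpatialGrid:
    """Uniform grid mapping cell coordinates to lists of particle indices."""

    def __init__(self, cell_size):
        self.cell_size = cell_size
        self.cells = defaultdict(list)

    def _key(self, p):
        # Integer cell coordinates for a 2D position.
        return (int(p[0] // self.cell_size), int(p[1] // self.cell_size))

    def insert(self, idx, p):
        self.cells[self._key(p)].append(idx)

    def neighbors(self, p):
        # Gather indices from the 3x3 block of cells around p's cell.
        cx, cy = self._key(p)
        out = []
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                out.extend(self.cells.get((cx + dx, cy + dy), []))
        return out
```

Rebuilding the grid each timestep is O(N), and with a roughly uniform particle distribution the per-particle neighbor query becomes O(1) instead of O(N).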


The render function brings the particle simulation up to date using the time elapsed since the music started playing, then redraws the entire canvas.
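The catch-up logic can be sketched as follows: convert elapsed wall-clock time into a target simulation step, then advance the simulation until it reaches that step. This is a hypothetical outline (the `state` layout and 60-steps-per-second rate are assumptions), not our renderer's actual code.

```python
def target_step(now, start_time, steps_per_second=60):
    # Which simulation step corresponds to this moment of playback?
    return int((now - start_time) * steps_per_second)

def render(state, now):
    # Advance the simulation until it catches up to wall-clock time.
    while state["step"] < target_step(now, state["start_time"]):
        state["step"] += 1  # one physics update per step would run here
    # ...redraw the full canvas from `state` here...
    return state["step"]
```

Keying the simulation to elapsed playback time (rather than counting frames) keeps the visuals synchronized with the audio even when rendering stalls.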

This pipeline is fairly slow and limits us to lower resolutions. We also plan to improve the appearance of our particles.


A Python script takes in a sound file and uses scipy and librosa to generate a low-frequency signal representing the audio's volume at each simulation step. Onset envelopes and beats are computed and resampled to match this lower rate. These vectors are saved to a data file that is given as input to the simulator. The simulator plays the audio in a child process and uses the playback start time to stay synchronized.
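The volume subsampling can be sketched with plain NumPy: chunk the raw samples into one window per simulation step and take the RMS of each window. This is a simplified stand-in for our preprocessing script (the 60-steps-per-second rate is an assumption, and the onset/beat extraction done with librosa is omitted here).

```python
import numpy as np

def volume_per_step(samples, sample_rate, steps_per_second=60):
    # Samples of raw audio covered by one simulation step.
    hop = sample_rate // steps_per_second
    n = len(samples) // hop
    # Drop the ragged tail, then compute one RMS value per step.
    frames = samples[: n * hop].reshape(n, hop)
    return np.sqrt((frames ** 2).mean(axis=1))
```

The resulting vector has one entry per simulation step, so the simulator can index it directly with the current step number.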

Demo Slides