This is written in the Processing programming language. The video portion is generated by Daniel Shiffman’s Flocking code.
I wrote the sound generation code. It uses two pulse oscillators: one for the x-axis and one for the y-axis.
The average position of the boids is mapped to the frequency of each respective oscillator. So if the boids, in general, move up, the pitch of one oscillator rises; similarly, if the boids move to the right, the second oscillator increases in pitch. Finally, the standard deviations of the boids’ locations are mapped to the pulse widths of the oscillators. I’m glossing over some details, but that’s the basic idea.
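The position-to-pitch idea can be sketched in a few lines of plain Python. The names, screen size, and frequency range here are illustrative, not taken from the project:

```python
# Minimal sketch of the position-to-frequency mapping.
# All names and constants below are hypothetical.

def map_range(value, in_min, in_max, out_min, out_max):
    """Linear remap, like Processing's map()."""
    return out_min + (out_max - out_min) * (value - in_min) / (in_max - in_min)

WIDTH, HEIGHT = 640, 360          # hypothetical canvas size
FREQ_MIN, FREQ_MAX = 110.0, 880.0 # hypothetical pitch range in Hz

def x_to_freq(x_avg):
    # Boids drifting right -> higher pitch on the x oscillator.
    return map_range(x_avg, 0, WIDTH, FREQ_MIN, FREQ_MAX)

def y_to_freq(y_avg):
    # Screen y grows downward, so invert: boids drifting up -> higher pitch.
    return map_range(HEIGHT - y_avg, 0, HEIGHT, FREQ_MIN, FREQ_MAX)
```

The inversion in `y_to_freq` matters because screen coordinates increase downward, while we want upward motion to sound like rising pitch.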
Here is the crux of my sound mapping code:
x_avg = self.get_x_avg()
y_avg = self.get_y_avg()
x_velocity_avg = self.get_x_velocity_avg()
y_velocity_avg = self.get_y_velocity_avg()
pos = (x_avg / w) * 2 - 1  # stereo pan from average x position
x_std_dev = std_dev([b.location.x for b in self.boids])
y_std_dev = std_dev([b.location.y for b in self.boids])
# Syntax: .set(freq, width, amp, add, pos)
ch_1.set( x_avg,       x_std_dev / (w / 2), 1, x_velocity_avg, pos)
ch_2.set(-(h - y_avg), y_std_dev / (h / 2), 1, y_velocity_avg, pos)
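The snippet above calls a `std_dev` helper that isn’t shown. A plain-Python version might look like this (the actual helper in the project may differ):

```python
import math

def std_dev(values):
    """Population standard deviation of a list of numbers."""
    n = len(values)
    mean = sum(values) / n
    return math.sqrt(sum((v - mean) ** 2 for v in values) / n)
```

Passing it the list of x (or y) coordinates of all boids gives a single number measuring how spread out the flock is along that axis, which is what gets mapped to pulse width.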
The full code is here. I recommend running the code yourself, as the video is only a random three-minute sample of the idea.
Of course, the screen wraps around. (The boids are flying on the surface of a torus.) The way I’ve written it, the audio also “wraps around.” It might make sense to use absolute frequencies instead and have the app stop once the pitch goes out of hearing range. Maybe I’ll try that.
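The “wrap around” behavior is analogous to folding the pitch back into a fixed band with a modulo, something like this sketch (the range constants are hypothetical, not from the project):

```python
FREQ_MIN, FREQ_MAX = 110.0, 880.0  # hypothetical audible band in Hz

def wrapped_freq(raw):
    """Fold any raw frequency back into [FREQ_MIN, FREQ_MAX),
    the way a boid leaving one screen edge reappears at the other."""
    span = FREQ_MAX - FREQ_MIN
    return FREQ_MIN + (raw - FREQ_MIN) % span
```

Dropping the modulo and letting `raw` run free would give the absolute-frequency variant described above, with playback stopping once the value leaves the hearing range.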
I didn’t realize until after finishing that nature was supposed to affect the rhythm. Also, this is a simulation of nature, rather than nature itself. So, I kind of broke the rules. Sorry.