I wrote a program in Processing that lets you sonically explore Agnes Martin’s painting “Summer 1964”.
I also made a short video of me running the app, but I consider that a poor substitute for actually running the app and interacting with the painting. Here are the links:
Here is the most relevant code:
float blueAtPixel = blue(pixels[mouseX + mouseY * width]);
float blueInRegion = blueConvolution(mouseX, mouseY, blueMatrix, blueMatrixSize, img);
// Map blue at the pixel under the mouse onto an exponential curve from roughly 150 to 1150 Hz to create the base frequency range
float frequency = pow(1000, blueAtPixel / 255.0) + 150;
// Use the blue averaged over a small region around the mouse, mapped from -0.5 to 0.5, as a detune argument
float detune = map(blueInRegion, 0, 255, -0.5, 0.5);
Basically, the synth pitch tracks the amount of blue present where the mouse is pointing. To be precise, the base pitch is mapped from the blue value of the single pixel under the cursor, while the detuning is mapped from the average of that pixel and a small neighborhood around it.
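For reference, here is a minimal sketch of how a neighborhood-averaging helper along the lines of blueConvolution might look; the exact signature, the edge clamping, and the assumption that the kernel weights sum to 1 are mine, not necessarily how the real helper works:

// Hypothetical sketch: weighted average of blue values in a square
// neighborhood around (x, y); expects img.loadPixels() to have been called
float blueConvolution(int x, int y, float[][] matrix, int matrixSize, PImage img) {
  float total = 0;
  int offset = matrixSize / 2;
  for (int i = 0; i < matrixSize; i++) {
    for (int j = 0; j < matrixSize; j++) {
      // Clamp sample coordinates so the window stays inside the image
      int sx = constrain(x + i - offset, 0, img.width - 1);
      int sy = constrain(y + j - offset, 0, img.height - 1);
      total += blue(img.pixels[sx + sy * img.width]) * matrix[i][j];
    }
  }
  return constrain(total, 0, 255);
}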
One counter-intuitive detail is that white has an RGB value of [255, 255, 255]. So the white dots yield a high pitch, even though they look like they contain no blue (after all, they’re white) and you would expect the pitch to be low.
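Plugging the two extremes into the frequency formula makes the spread concrete:

// The extremes of the mapping, worked out by hand
float whiteFreq  = pow(1000, 255 / 255.0) + 150;  // 1000 + 150 = 1150 Hz: white dots sound high
float noBlueFreq = pow(1000,   0 / 255.0) + 150;  //    1 + 150 =  151 Hz: a blue-free pixel sounds low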
I spent a lot of time trying to make some sort of sound composition out of this, but the painting is simply too detailed and I couldn’t find a good way to map its two spatial dimensions (in their full detail) to the one temporal dimension of sound. (I’ve run into this problem before.)