Dreams, Seen By Man-Made Machines

I’ve been writing my first algorithmic music composition tool in Processing. It’s not terribly complex, though it has stretched my understanding of Java a bit, which is nice.

Basically, it loads an array of the steps of a scale — for example, a minor scale is [0,2,3,5,7,8,10,12] steps from the root note. Then you feed it a root note and a note length to generate — eighth, quarter, etc. It then pipes the result out as MIDI data.
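
Something like this, roughly. This isn't the tool itself, just a plain-Java sketch of the idea using javax.sound.midi (the real thing talks to Reason over MIDI); the class name, root note, and tempo are placeholders:

```java
// Sketch of the idea: pick random steps from a scale array, offset them from a
// root note, and send the result out as MIDI. Not the actual tool.
import javax.sound.midi.*;

public class ScaleMelody {
    // Natural minor: semitone offsets from the root, as in the post.
    static final int[] MINOR = {0, 2, 3, 5, 7, 8, 10, 12};

    public static void main(String[] args) throws Exception {
        int root = 57;       // hypothetical root note (MIDI 57, the A below middle C)
        int eighthMs = 250;  // eighth note at an assumed 120 BPM
        Receiver out = MidiSystem.getReceiver();  // default MIDI device

        java.util.Random rng = new java.util.Random();
        for (int i = 0; i < 16; i++) {
            // Pick a random step of the scale and offset it from the root.
            int pitch = root + MINOR[rng.nextInt(MINOR.length)];
            out.send(new ShortMessage(ShortMessage.NOTE_ON, 0, pitch, 100), -1);
            Thread.sleep(eighthMs);
            out.send(new ShortMessage(ShortMessage.NOTE_OFF, 0, pitch, 0), -1);
        }
        out.close();
    }
}
```

Swap in a different array of steps and the same loop writes melodies in any other scale.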

When that data is sent to Reason, with a bass beat and some delay added, it sounds like this:

Generative Sample (0:32, MP3)

This is two melodic passes — one of eighth notes and one of quarter notes — plus a third pass where it just went apeshit on a Dr. Rex sample that I fucked with violently. That's the clicky bit.

Here’s the weird part: it doesn’t work exactly right. I’m sure it has something to do with the code I got from another Processing programmer to bypass Processing’s framerate resolution. I changed it and, I feel sure, fucked it up somewhere, because now the tool occasionally spits out notes on a semi-random rhythm.
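
For what it's worth, the framerate workaround is basically this pattern: keep a wall-clock deadline and only fire a note when the clock passes it, instead of counting draw() frames. This isn't the borrowed code, just a guess at its shape; botching the deadline update is one plausible way to end up with drifting, semi-random rhythms. The names and the 250 ms eighth note are made up:

```java
// Frame-rate-independent note timing: call tick() as often as you like
// (e.g., once per draw() frame in Processing) and it only says "play a note"
// when the wall-clock deadline has passed.
public class NoteClock {
    long nextNoteMillis = System.currentTimeMillis();
    final long noteLengthMillis = 250;  // assumed eighth note at 120 BPM

    boolean tick() {
        long now = System.currentTimeMillis();
        if (now >= nextNoteMillis) {
            // Advance from the old deadline, not from `now`, so timing error
            // doesn't accumulate. Getting this update wrong is one way to get
            // drifting, semi-random rhythms.
            nextNoteMillis += noteLengthMillis;
            return true;
        }
        return false;
    }

    public static void main(String[] args) throws InterruptedException {
        NoteClock clock = new NoteClock();
        for (int i = 0; i < 8; ) {
            if (clock.tick()) {
                System.out.println("note " + i + " at " + System.currentTimeMillis());
                i++;
            }
            Thread.sleep(1);  // stand-in for the gap between draw() frames
        }
    }
}
```

In an actual Processing sketch you'd run that check once per draw() call and send the MIDI note whenever it returns true, so the rhythm no longer depends on the frame rate.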

But it sounds cool that way. I’m gonna teach it how to write basslines next.

Whaddya think of this? I think it sounds lovely and strange — not random, precisely.
