
cfml: the context-free music language


For the ears: cfml-prototype.mp3

Context Free is an excellent tool for exploring generative spaces in the domain of 2D visual art (and Structure Synth does a fantastic job in 3D), but can a language of circles, rectangles, and triangles mutated by rotates, translates, and scales be translated into the domain of music? The result is not just a rich analogy, but a fun and expressive software performance instrument.

In creating cfml, I set my goal as making a translation of cfdg (the language for visual compositions) into the domain of live music. At the highest level this meant figuring out what sense of music I was going to map to. Context Free doesn’t do all kinds of visual art; really, it can only place many, many copies of a few primitives around the page with some interesting transformations applied to them. After this, it is up to a graphics library to render these shapes to pixels and shoot them out the display. Over in the music domain I decided that a “single note played on a particular instrument” was a good primitive and that common musical transformations such as pitch transposition, time-stretching, and volume control would make nice analogs to cfdg’s geometric and color transformations. These primitive musical objects are handed to the system as MIDI events and are then rendered (live!) to a nice, sampled waveform and shot out the speakers for the audience to hear.

Below is a side-by-side comparison of several concepts in cfdg and cfml.  Keep in mind that cfdg has a custom, Java-like syntax while cfml inherits its syntax from Scheme.

Primitives

  Visuals in cfdg:
    // solid black square
    // with unit size at the origin
    SQUARE

  Music in cfml:
    ; a one-beat literal:
    ; now, guitar, middle-C, standard volume, whole note
    (literal 1 '(0 3 0 1 1))

Transformations

  Visuals in cfdg:
    // draw a widget
    // shifted up and partially transparent
    widget{x 2 a -0.5}

  Music in cfml:
    ; play a riff
    ; transposed up and at a lower volume
    (vol 1/2 (tra +2 riff))

Recursion

  Visuals in cfdg:
    // a chain is an infinite pattern of links
    rule chain { link{} chain{x 1} }

  Music in cfml:
    ; a song is an infinite pattern of verses
    (define (song) (after verse song))

Non-determinism

  Visuals in cfdg:
    // to draw a foo
    // draw either a bar or a baz
    rule foo { bar{} }
    rule foo { baz{} }

  Music in cfml:
    ; to play a verse
    ; choose between happy and sad
    (define (verse) (choose happy sad))

Performance

  Visuals in cfdg:
    // start by drawing a metawidget
    startshape metawidget

  Music in cfml:
    ; start by composing a song
    ; with the given tempo and scale
    (perform song 120 (pc:scale 0 'dorian))

The comparison above shouldn’t be too scary (once you get past the syntax change). But what prompted a syntax change in the first place? Surely parens have no more special relation to music than they do to visual art.

Here’s where things get tricky. Visual art is static, timeless, and purely spatial. Music, on the other hand, gives up almost all of its spatial detail in exchange for rich temporal detail. Music happens over time — and time isn’t something we can talk about in cfdg.

When making cfml, the best tool I knew of for algorithmic composition of music was Impromptu, a Scheme-based livecoding environment. An essential idiom in live composition in Impromptu is “temporal recursion”, whereby a function schedules a call back to itself after a time delay. Having had such a great experience with Impromptu in the past (and being an appropriately lazy programmer), I decided to create cfml as an internal domain-specific language within Scheme, running inside of Impromptu and exploiting its musical performance libraries (for abstract musical note manipulation and MIDI playback). Mad propz @ Abelson and Sussman.
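To make “temporal recursion” concrete, here is a minimal sketch of the idiom in plain Impromptu Scheme (not cfml); it assumes an instrument has already been set up and bound to *piano*, and the time constants are illustrative (times are in audio samples, 44100 per second at the default sample rate):

    ;; play a middle C (pitch 60, velocity 80) for about half a second,
    ;; then schedule myself to run again one second from now
    (define (pulse time)
       (play-note time *piano* 60 80 22050)
       (callback (+ time 44100) 'pulse (+ time 44100)))

    (pulse (now))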

Time manifests itself in cfml not as a wimpy delay-by-n-beats operator, but as two super-powered functional combinators called during and after. The composite (during melody harmony) immediately begins composing both the melody and the harmony and assumes a total duration of the longer of the two pieces. Likewise, (after chorus verse) delays not only the performance but even the composition of the verse until the chorus has finished performing, and the result assumes the total duration of the sequence. Given the literal syntax shown in the comparison (which allows simple, constant time offsets), you can create rich musical pieces from individual notes by composing (in the mathematical sense) chunks using the two combinators and a sprinkling of transformations (to pitch, volume, and duration). Ok, choose is a pretty powerful combinator as well, but it didn’t require nearly as much continuation-related magic to implement as the others.
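As a sketch of how these pieces fit together (melody and harmony are placeholder names for rules defined elsewhere, and the exact nesting is illustrative), a verse might layer a quieter, transposed-down harmony under the melody and then repeat the whole texture transposed up:

    ; melody over a quieter, transposed-down harmony,
    ; followed by the same texture transposed up
    (define (verse)
      (after (during melody (vol 1/2 (tra -2 harmony)))
             (tra +2 (during melody (vol 1/2 harmony)))))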

The deferred composition in cfml isn’t just a gimmick to get the music playing sooner; it is an essential element of the language’s semantics. Consider the definition (define (song) (after (during melody harmony) song)). This simple rule is clearly recursive, and it is equally clearly missing a base case! If you tried to perform song you’d get an infinite sequence of notes. Well, you’d get tired after a while, modify the rule to have a likely base case, and the song would end naturally. Deferred composition not only saves the CPU the work of deciding which notes to play later, it also saves the human artist the work of deciding how to compose those notes that will come later. Livecoding is required as well, as there must be some way to affect the running program if you are ever going to tame that infinite recursion.
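For instance (a sketch; cadence is a made-up name for whatever closing material you like), dropping a choose into the recursive position is enough to let the song end on its own, without ever composing the continuations that never get chosen:

    ; each time through, non-deterministically either continue the song
    ; or fall through to a final cadence
    (define (song)
      (after (during melody harmony)
             (choose song cadence)))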

My hope is that cfml (in proximity to cfdg) makes it easier to think about generative art in domain-agnostic terms. There is a whole art (or maybe science (well, it’s clearly design)) to expressively crafting recursive definitions and tending nondeterministic rules while they are in the process of executing. There are teachable tricks for refactoring processes in-flight while preserving (or mutating) certain perceivable aspects. There are new spaces of concerns that go into the design of tools for this mode of artistic programming that don’t make sense in the big-project, enterprise software engineering mindset.

People talk agile, extreme, but try programming in a limited model of computation and responding to shifting client demands on a second-by-second basis by modifying the software while it runs. You’d be surprised by how relaxing it is. Really.

Ok, now that you’ve reached the end, go click the fancy image at the start of this article for a nice, practical comparison of cfml and cfdg. If you like what you see (and have a Mac), go pick up Impromptu, download my cfml library and example from github (patches welcome) and start hacking away.


About the author:  Adam is a PhD student, research scientist, software engineer, musician, artist, and hacker. He has a very special kind of respect for those elegant weapons like lisp (pronounced "scheme") and prolog, for a more civilized age. Read more from this author



7 Comments

  1. Posted November 12, 2009 at 10:04 AM | Permalink

    Adam,
    I think this program is one of the best things you’ve ever done. It’s awesome. Is it possible for you to continue plugging away at it? Perhaps some documentation about how it can be used from a different file (don’t want to munge the library itself!)

    Love it.

  2. Adam M. Smith
    Posted November 12, 2009 at 6:38 PM | Permalink

    @chris

    Unfortunately, in its current language-as-a-library form, you really have to be pretty familiar with Impromptu as a programming environment and a runtime environment before you can really wrap your head around the depth of cfml. I think the ideal way to package cfml would be in a stand-alone tool that kept you from using (and getting lost in) the apocalyptic power of livecoding + a turing-complete language.

    To just play some nifty music on good faith that the system will do what you want, the following should be sufficient:
    – launch Impromptu
    – open cfml.scm
    – evaluate the entire program (select all then press eval), you should hear one sample of the demo grammar
    – open a new buffer
    – start defining rules and calling perform in this new buffer, as in the sketch after this list, while making no changes to the other buffer (definitions from all of the buffers feed to a common process by default)
    – if it seems like there is no sound after a while when there should be, try re-performing the demo grammar and restart Impromptu if this doesn’t work
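
    For example (just a sketch reusing the forms from the article; the rule names here are made up), the new buffer might contain something like:

      ; a tiny grammar to try in the fresh buffer
      (define (riff) (literal 1 '(0 3 0 1 1)))
      (define (tune) (after riff (vol 1/2 (tra +2 riff))))
      (perform tune 120 (pc:scale 0 'dorian))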

    As for continuing to plug away at it, I intend the Impromptu implementation of cfml as a kind of prototype. Someday I’ll make a nice stand-alone tool more directly in the style of Context Free or Structure Synth.

  3. Andrew Plotkin
    Posted November 13, 2009 at 11:44 AM | Permalink

    (Risking the scourge of self-promotion…) This is nifty, because I have a generative audio project which has been simmering for years and years, and it’s aiming at the same ideas. You seem to have come to a lot of the same design decisions I did, too: control over volume/pitch/duration; elements that vary over time; delayed self-invocation; composable functions.

    My project is at http://boodler.org/ .

    Boodler is not determinedly functional like cfml is. I wrote it in Python, and all the sound elements (“agents”) are Python functions. Your example

    (define (song) (after (during melody harmony) song))

    …would be something like

    class Song(Agent):
        def run(self):
            self.sched_agent(Melody())
            self.sched_agent(Harmony())
            self.resched(30)

    A fairly bland procedural model, and you see I have no equivalent to your notion of an element’s duration. (You can get the duration of a primitive sound, but not an agent.)

    Nonetheless, it works. I’ve put together some nice wind, water, and weather soundscapes, which have enough layering and variation that they don’t sound repetitive. Then someone contributed an agent that took my soundscapes, added some more, and cross-faded between them to produce a terrific one-hour thunderstorm — distant thunder, then a shower, then wind, then heavy rain… and so on.

    Take a look if you’re interested — it requires only Python, builds on Mac/Unix, and people (not me) have gotten it to work on Windows too.

  4. Tom Boyd
    Posted July 30, 2010 at 9:54 AM | Permalink

    I wish I had a mac so I could download your cfml library and start using it.

  5. Elan Chalford
    Posted December 21, 2010 at 7:49 AM | Permalink

    This kind of program reminds me of an online game: Seaquence. I haven’t played with it, but my son has. It creates interesting moving shapes while playing simple music patterns.
    Could this be related to cfml?

  6. Realgon
    Posted December 30, 2010 at 6:10 PM | Permalink

    This is really, really amazing. I have never looked at music in such a way. Very interesting concept. I can visualize music a cleaner way. I will come back and study this some more. Thank you.

  7. Tom
    Posted April 22, 2011 at 9:40 AM | Permalink

    I agree, and it is amazing that composers and sound artists can use Impromptu (an OS X programming language and environment) for their visual support.

4 Trackbacks

  • [...] This post was mentioned on Twitter by Jillis ter Hove and rndmcnlly, Bun B. Bun B said: cfml: the context-free music language: For the ears: cfml-prototype.mp3 · Context Free is an excellent tool for.. http://bit.ly/6y5vY [...]

  • By Syntopia » Blog Archive » Assorted Links on November 18, 2009 at 1:40 PM

    [...] M. Smith has begun working on cfml – a context-free music language. It is a Context-Free Design Grammar – for music. [...]

  • [...] I digress. I’d just like to show you guys cfml, the context-free music language, which uses generative algorithms usually created for visual design for AUDIO. How neat is that? [...]

  • By links for 2010-04-20 on April 20, 2010 at 1:03 AM

    [...] cfml: the context-free music language Generative music in a livecoding environment. Uses Scheme in something called impromptu ( the livecoding environment). (tags: music generative contextfree language programming audio algorithm cfml scheme) This entry was posted in Links. Bookmark the permalink. Post a comment or leave a trackback: Trackback URL. « I am eating an apple [...]