Web Audio API: where to start?

Here’s a list of links that helped me grasp the Web Audio API. Have fun!

start here:
a simple tutorial about playing sounds, using filters, changing parameters, and loading samples into buffers: http://www.html5rocks.com/en/tutorials/webaudio/intro/
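To give a flavour of what that tutorial covers, here is a minimal sketch that plays a short filtered tone. This is my own illustration (the function name and parameter values are made up), using the unprefixed names from the spec; older Safari/Chrome builds used prefixed variants.

```javascript
// Minimal Web Audio sketch: oscillator -> lowpass filter -> gain -> speakers.
// The context is passed in so the routing logic stays testable.
function playTone(context, frequency, duration) {
    var osc = context.createOscillator();      // sound source
    var filter = context.createBiquadFilter(); // lowpass filter
    var gain = context.createGain();           // output volume

    osc.frequency.value = frequency;
    filter.type = 'lowpass';
    filter.frequency.value = 1000;
    gain.gain.value = 0.3;

    osc.connect(filter);                       // build the graph
    filter.connect(gain);
    gain.connect(context.destination);

    osc.start(context.currentTime);            // sample-accurate start/stop
    osc.stop(context.currentTime + duration);
}

// in a browser:
// var context = new (window.AudioContext || window.webkitAudioContext)();
// playTone(context, 440, 0.5);
```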

dealing with JavaScript's imprecise clock (and making full use of the accurate Web Audio one): http://www.html5rocks.com/en/tutorials/audio/scheduling/
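The core idea of that article can be sketched as follows: a jittery JavaScript timer only wakes the scheduler up, and every note gets an exact start time on the audio clock within a short look-ahead window. The function and parameter names here are my own, not from the article; the clock is injected so the logic stays testable.

```javascript
// Look-ahead scheduler: setTimeout/setInterval is jittery, so we only use it
// to wake up, and schedule each note at an exact audio-clock time.
function makeScheduler(getAudioTime, scheduleNote, tempo) {
    var lookahead = 0.1;                 // schedule everything within the next 100 ms
    var interval = 25;                   // wake up every ~25 ms (jitter doesn't matter)
    var nextNoteTime = getAudioTime();
    var secondsPerBeat = 60 / tempo;

    function tick() {
        // schedule all notes that fall inside the look-ahead window
        while (nextNoteTime < getAudioTime() + lookahead) {
            scheduleNote(nextNoteTime);  // e.g. osc.start(nextNoteTime)
            nextNoteTime += secondsPerBeat;
        }
    }
    return { tick: tick, interval: interval };
}

// in a browser:
// var context = new AudioContext();
// var s = makeScheduler(function () { return context.currentTime; },
//                       function (t) { /* osc.start(t) */ }, 120);
// setInterval(s.tick, s.interval);
```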

the Web Audio API specification should be your main reference document: http://www.w3.org/TR/webaudio/
Note that some of the names are different in Safari and Chrome; there is a fix called a monkey patch for dealing with these differences.
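The simplest part of such a monkey patch just aliases the prefixed constructor. Here is a minimal sketch (my own helper name, not from any library; full patches also alias renamed methods like createGainNode/createGain and noteOn/start):

```javascript
// Pick whichever AudioContext constructor the browser provides,
// so the rest of the code can use a single name.
function getAudioContextConstructor(global) {
    return global.AudioContext || global.webkitAudioContext;
}

// in a browser:
// var AC = getAudioContextConstructor(window);
// var context = new AC();
```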

writing your own ScriptProcessorNode (a Web Audio uGen):
when using these self-written nodes in a graph of other nodes, you may find that they have some limitations compared to native nodes (in terms of garbage collection). Here is an explanation of how to work around these: http://sriku.org/blog/2013/01/30/taming-the-scriptprocessornode/#vanishing-script-node
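The gist of the problem is that a ScriptProcessorNode can be garbage-collected while it is still connected and playing, if no JavaScript reference to it survives. One workaround (a sketch of the general idea, not the article's code) is to park the node in a long-lived registry until you are done with it:

```javascript
// Long-lived holder: as long as a node sits in this array,
// the garbage collector cannot reclaim it.
var liveNodes = [];

function keepAlive(node) {
    liveNodes.push(node);
    return node;
}

function release(node) {
    var i = liveNodes.indexOf(node);
    if (i !== -1) liveNodes.splice(i, 1); // now it may be collected
}

// in a browser:
// var script = keepAlive(context.createScriptProcessor(1024, 1, 1));
// script.onaudioprocess = function (e) { /* fill e.outputBuffer */ };
// script.connect(context.destination);
// ... later: script.disconnect(); release(script);
```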

dat.GUI can be a nice start for making a quick GUI, but don’t forget that making your own is not that hard with HTML5 and jQuery.

a video of Chris Rogers showing off some of the API with some nice demos and talking about its history.

Read the small book by Crockford, “JavaScript: The Good Parts”. It is not strictly a JavaScript tutorial; it is more a very precise description of how to code properly in the language. Other tutorials might be easier, but this one brings a deeper understanding of how to make coding pleasant.

Finally (this may be very obvious): don’t forget that you can test JavaScript snippets & commands in the error console, [alt+apple+c] in Safari and [alt+apple+j] in Chrome.

my own stuff:
maze ensemble web app
maze ensemble web app (earlier version)
feedback delay test

Keep an eye on this website for future experiments!

Euclidean Rhythms in ChucK

After having a lot of fun with this flash app, I thought it would be interesting to port this rhythmic algorithm to ChucK as well. The basic idea is that the algorithm divides the pulses over the beats as evenly as possible. Of course, 4 pulses in 16 beats would result in something like:
x . . . x . . . x . . . x . . .
Things become much more interesting when you take a number of pulses that doesn’t divide so neatly, like 6 pulses in 16 beats:
. x . . x . . x . x . . x . . x
Of course layering several of these patterns and using prime numbers for the number of beats can result in surprisingly complex patterns.
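To make the “as balanced as possible” idea concrete, here is a compact shortcut (my own illustration, not the ChucK code below): mark a pulse wherever the running count floor(i · pulses / beats) steps up. This produces a rotation of the same Euclidean pattern.

```javascript
// Euclidean-style pattern: a pulse falls wherever floor(i * pulses / beats)
// increases, which spreads the pulses as evenly as integer division allows.
function euclid(pulses, beats) {
    var pattern = [];
    for (var i = 0; i < beats; i++) {
        var now = Math.floor((i + 1) * pulses / beats);
        var before = Math.floor(i * pulses / beats);
        pattern.push(now > before ? 'x' : '.');
    }
    return pattern;
}

// euclid(6, 16).join(' ') gives a rotation of the pattern shown above.
```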

I based my code on the examples in this article.

By the way, there are also examples in Max/MSP and Pd.

class Euclid {
    int bitmap[];
    int remainder[];
    int count[];

    fun void buildString (int level) {
        if (level == -1) {
            append(bitmap, 0); // a rest
        } else if (level == -2) {
            append(bitmap, 1); // a pulse
        } else {
            for (0 => int i; i < count[level]; i++) {
                buildString(level - 1);
            }
            if (remainder[level] != 0) {
                buildString(level - 2);
            }
        }
    }

    fun void computeBitmap(int numSlots, int numPulses) {
        numSlots - numPulses => int divisor;
        null @=> remainder;
        null @=> count;
        null @=> bitmap;
        int a[100] @=> remainder;
        int b[100] @=> count;
        int c[0] @=> bitmap;
        numPulses => remainder[0];
        0 => int level;
        do {
            divisor / remainder[level] => count[level];
            divisor % remainder[level] => remainder[level + 1];
            remainder[level] => divisor;
            level++;
        } while (remainder[level] > 1);
        divisor => count[level];
        buildString(level);
    }

    fun int[] compute(int slots, int pulses) {
        computeBitmap(slots, pulses);
        return bitmap;
    }

    fun int[] append (int input[], int value) {
        input.size() => int size;
        size + 1 => input.size;
        value => input[size];
        return input;
    }

    fun void print () {
        chout <= "Euclid pattern =" <= IO.newline();
        for (int i; i < bitmap.size(); chout <= bitmap[i++] <= " ") {
            // nothing
        }
        chout <= IO.newline();
    }
}

class TestEuclid { // this is a little test class...
    Euclid myPattern;
    float freq;

    fun void init(int numSlots, int pulses, float _freq) {
        _freq => freq;
        myPattern.compute(numSlots, pulses); // compute the Euclidean pattern for these values
        myPattern.print();
        spork ~ schedule();
    }

    fun void ping(float gain, dur dura) { // a simple pulse
        SinOsc c => Envelope e => Pan2 p => dac;
        Math.random2f(-1, 1) => p.pan;
        .12 => e.gain;
        freq => c.freq;
        gain => c.gain;
        dura => e.duration;
        e.keyOn();      // fade in...
        dura => now;
        e.keyOff();     // ...and out again
        dura => now;
    }

    fun void schedule() { // sequencer
        0 => int i;
        while (1) {
            spork ~ ping(myPattern.bitmap[i++], .1::second);
            i % myPattern.bitmap.cap() => i;
            .12::second => now;
        }
    }
}

TestEuclid test[10];

test.cap() => int i;
while (i--) { // 10 test patterns with a random number of slots and pulses, a random harmonic of 55 Hz.
    // note: handpicking the values can give even nicer results
    test[i].init(Math.random2(8, 16), Math.random2(2, 7), 55.0 * Math.random2(1, 8)); // example ranges
}

hour => now;

Web Audio API experiment

I added a little experiment with the Web Audio API to this website. It sends a sine wave through 5 feedback delays. The read position in the delay lines changes over time, and a small interface allows you to influence how fast and where to.
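The basic building block of such an experiment, a delay line whose output is routed back into itself through an attenuating gain, can be sketched like this. This is an illustrative sketch, not the experiment's actual source; the function name and parameter values are my own.

```javascript
// One feedback delay: the delayed signal is fed back through a gain < 1
// so the echoes decay instead of building up endlessly.
function makeFeedbackDelay(context, delayTime, feedbackAmount) {
    var delay = context.createDelay();
    var feedback = context.createGain();

    delay.delayTime.value = delayTime;    // e.g. 0.3 seconds
    feedback.gain.value = feedbackAmount; // keep below 1.0!

    delay.connect(feedback);              // delay -> feedback gain
    feedback.connect(delay);              // ...and back into the delay
    return delay;
}

// in a browser, five of them in parallel might look like:
// var context = new (window.AudioContext || window.webkitAudioContext)();
// var osc = context.createOscillator();
// for (var i = 0; i < 5; i++) {
//     var d = makeFeedbackDelay(context, 0.1 + 0.1 * i, 0.5);
//     osc.connect(d);
//     d.connect(context.destination);
// }
// osc.start();
```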

find it here

For those interested, the source is available on GitHub: