Talk @ ARIGA Sessions

ARIGA Sessions:
Can we confuse a computer for our own benefit? – Casper Schipper

Tuesday 25.01.22, 17:00 CET

In this presentation, I will explore a tension I experienced in creating a programming language specifically for the purpose of artistic exploration. On the one hand, this language has the tool-like requirements that any programming language has: it should be easy to express ideas in it, easy to rewrite code while listening to the result, easy to extend, and it should avoid annoying bugs. On the other hand, there is something paradoxical in that my aim is often to program in order to find “unexpected” results. This means that a program produces material that surprises me as its author, which may create a tension with the aforementioned requirements of “casual” programming. To be clear, by unexpected I do not mean just some randomness in the output, but that the program’s structural functioning is actually different from how it was designed. To explore this idea, I will show some examples from my own work where I think this ambiguity was working and where it was perhaps missing.

The presentation will take approximately 20-25 minutes; participants are welcome to actively take part in the discussion afterwards, possibly sharing examples from their own artistic practice.

The session will be streamed live on the ARIGA YouTube channel:
https://www.youtube.com/channel/UCnFHs23ezU0Hq47aYDgWcoA

ARIGA stands for Artistic Research in Generative Art. ARIGA is a Special Interest Group within the Society for Artistic Research (SAR) that focuses on the computational, the algorithmic, and the generative. In particular, ARIGA addresses practices that engage with computational artifacts while suspending preconceived ideas of function, embracing instead the mutual evolution of technology and artistic thought. Computation is understood as an actor integrated in a mesh of irreducible interrelations, part of an ecology in which technological, historical, social, and scientific aspects diffract. ARIGA critically addresses fundamental questions common to diverse practices and seeks to gather artist researchers working with generative processes in heterogeneous media, including space, sound, image, video, sculpture, and language.

ARIGA is an idea by Luc Döbereiner and David Pirrò

https://www.researchcatalogue.net/portals?portal=1220978

EngelenZender app

Together with Wouter Snoei, I have had the honour of recreating the artwork “EngelenZender” (Angels) as a mobile app. This project by Moniek Toebosch (1948-2012) was a dedicated radio transmitter broadcasting at 98 FM, receivable only on the Houtribdijk, a 23km long dike running through the Markermeer between Enkhuizen and Lelystad. Those who tuned in would hear angels sing. This ‘angel music’ was generated live from vocal recordings using chance operations, which meant it would continuously create new variations. The original programming was done by Harm Visser.

Wouter and I have now created a mobile app that will also play the angels, but of course only when you are on the Houtribdijk (though there is a short preview if you open it within the museum 😉 ).

EngelenZender is part of the exhibition “Vrijheid – de vijftig Nederlandse kernkunstwerken vanaf 1968” (Freedom – the fifty key works of Dutch art since 1968) at Museum de Fundatie in Zwolle (link). You can download the EngelenZender app through the links below:

iOS: https://itunes.apple.com/nl/app/engelenzender/id1444546045

Android: https://play.google.com/store/search?q=engelenzender


Web Audio API: where to start?

Here’s a list of links that helped me grasp the Web Audio API. Have fun!

start here:
a simple tutorial about playing sounds, using filters, changing parameters, and loading samples into buffers: http://www.html5rocks.com/en/tutorials/webaudio/intro/
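
To give a first taste of what that tutorial covers, here is a minimal sketch that plays an oscillator through a lowpass filter (note that current browsers may require a user gesture before audio will start):

var context = new AudioContext();

var osc = context.createOscillator();
osc.frequency.value = 220; // Hz

var filter = context.createBiquadFilter();
filter.type = 'lowpass';
filter.frequency.value = 800; // cutoff frequency in Hz

osc.connect(filter);
filter.connect(context.destination);

osc.start(context.currentTime);
osc.stop(context.currentTime + 2); // play for two seconds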

timing:
dealing with JavaScript's imprecise clock (and making full use of the Web Audio API's accurate one): http://www.html5rocks.com/en/tutorials/audio/scheduling/
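
The core trick from that article, roughly: use setTimeout only as a coarse wake-up call, and schedule the actual events slightly ahead on the sample-accurate audio clock. A minimal sketch of this look-ahead pattern (playNote is just a stand-in for whatever you want to schedule):

var context = new AudioContext();
var lookahead = 0.1;  // how far ahead to schedule, in seconds
var interval = 25;    // wake-up period of the setTimeout loop, in milliseconds
var nextNoteTime = context.currentTime;

function playNote(time) { // a short blip, scheduled on the audio clock
    var osc = context.createOscillator();
    osc.connect(context.destination);
    osc.start(time);
    osc.stop(time + 0.05);
}

function scheduler() {
    // schedule everything that falls within the look-ahead window
    while (nextNoteTime < context.currentTime + lookahead) {
        playNote(nextNoteTime);
        nextNoteTime += 0.25; // sixteenth notes at 60 bpm
    }
    setTimeout(scheduler, interval);
}

scheduler();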

details:
the Web Audio API specification should be your main reference document: http://www.w3.org/TR/webaudio/
Note that some of the names differ between Safari and Chrome; there is a fix, a so-called monkey patch, for dealing with these differences.
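
The simplest version of such a patch just aliases the prefixed constructor, something like:

// fall back to the WebKit-prefixed constructor where necessary
window.AudioContext = window.AudioContext || window.webkitAudioContext;
var context = new AudioContext();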

writing your own ScriptProcessorNode (a Web Audio uGen):
When using these self-written nodes in a graph of other nodes, you may find that they have some limitations compared to normal nodes (in terms of garbage collection). Here is an explanation of how to work around these: http://sriku.org/blog/2013/01/30/taming-the-scriptprocessornode/#vanishing-script-node
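
As a minimal sketch of such a node, here is one that outputs white noise; note that you should keep a reference to it (as the linked post explains) so it does not vanish, and that ScriptProcessorNode has since been deprecated in favour of AudioWorklet:

var context = new AudioContext();

// keep this reference around, or the node may be garbage collected
var noiseNode = context.createScriptProcessor(4096, 1, 1); // buffer size, inputs, outputs

noiseNode.onaudioprocess = function (event) {
    var output = event.outputBuffer.getChannelData(0);
    for (var i = 0; i < output.length; i++) {
        output[i] = Math.random() * 2 - 1; // white noise
    }
};

noiseNode.connect(context.destination);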

GUI:
dat.GUI can be a nice start for making a quick GUI; don't forget that making your own is not that hard with HTML5 and jQuery.
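
For example, a control panel for a couple of synth parameters takes just a few lines (your audio code then reads the values from the params object):

// a quick control panel with dat.GUI
var params = { frequency: 440, gain: 0.5 };

var gui = new dat.GUI();
gui.add(params, 'frequency', 50, 2000); // slider from 50 to 2000 Hz
gui.add(params, 'gain', 0, 1);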

Chris:
a video of Chris Rogers showing off the API with some nice demos and talking about its history.

JavaScript:
Read the small book by Crockford, “JavaScript: The Good Parts”. It is not strictly a JavaScript tutorial; it is more a very precise description of how to code properly in the language. Other tutorials might be easier, but this one brings a deeper understanding of how to make coding pleasant.

Finally (this may be very obvious), don't forget that you can test JavaScript snippets & commands in the error console: [alt+apple+c] in Safari and [alt+apple+j] in Chrome.

my own stuff:
maze ensemble web app
maze ensemble web app (earlier version)
feedback delay test

Keep an eye on this website for future experiments!

Euclidean Rhythms in ChucK

After having a lot of fun with this Flash app, I thought it would be interesting to also port this rhythmic algorithm to ChucK. The basic idea is that the algorithm divides the pulses over the beats as evenly as possible. Of course, 4 pulses in 16 beats simply results in:
x...x...x...x...
Things become much more interesting when you take a number of pulses that doesn't divide evenly, like 6 pulses in 16 beats:
.x..x..x.x..x..x
Layering several of these patterns and using prime numbers for the number of beats can result in surprisingly complex patterns.

I based my code on the examples in this article.

By the way, there are also examples in Max/MSP and Pd.

class Euclid { // generates Euclidean rhythm patterns
    int bitmap[];
    int remainder[];
    int count[];
    
    fun void buildString(int level) {
        if (level == -1) {
            append(bitmap,0);
        } else if (level == -2) {
            append(bitmap,1);
        } else {
            for (0 => int i; i < count[level]; i++) {
                buildString (level-1);
            } 
            if (remainder[level] != 0) {
                buildString (level-2);
            }
        }
    }
    
    fun void computeBitmap(int numSlots, int numPulses) {
        numSlots - numPulses => int divisor;
        
        // allocate fresh working arrays for each computation
        int a[100] @=> remainder;
        int b[100] @=> count;
        int c[0] @=> bitmap;
        
        numPulses => remainder[0];
        0 => int level;
        do {
            divisor / remainder[level] => count[level];
            divisor % remainder[level] => remainder[level + 1];
            remainder[level] => divisor;
            level++;
        } while (remainder[level] > 1);
        
        divisor => count[level];
        
        buildString (level);    
    }
    
    fun int [] compute(int slots,int pulse) {
        computeBitmap(slots,pulse);
        return bitmap;
    }
    
    fun int [] append (int input[],int value) {
        input.size() => int size;
        size + 1 => input.size;
        value => input[size];
        return input;
    }
    
    fun void print() {
        chout <= "Euclid pattern = " <= IO.newline();
        for (0 => int i; i < bitmap.size(); i++) {
            chout <= bitmap[i] <= " ";
        }
        chout <= IO.newline();
    }
}

class TestEuclid { // a little test class
    Euclid myPattern;
    float freq;
    
    fun void init(int numSlots,int pulses,float _freq) {
        _freq => freq;
        myPattern.compute(numSlots,pulses); // compute a pattern with numSlots slots, of which 'pulses' are turned on
        myPattern.print(); 
        spork ~ schedule();
    }
    
    fun void ping(float gain, dur dura) { // a simple enveloped sine pulse
        SinOsc c => Envelope e => Pan2 p => dac;
        Math.random2f(-1,1) => p.pan; // random stereo position
        .12 => e.gain;
        freq => c.freq;
        gain => c.gain; // the bitmap value: 1 sounds, 0 stays silent
        e.value(1);
        e.target(0); // ramp from 1 down to 0
        dura * 2 => e.duration => now; // set the ramp time and wait for it to finish
    }
    
    fun void schedule() { // simple sequencer looping through the pattern
        0 => int i;
        while (true) {
            spork ~ ping(myPattern.bitmap[i++], .1::second);
            i % myPattern.bitmap.size() => i; // wrap around at the end of the pattern
            .12::second => now;
        }
    }
}

TestEuclid test[10];

test.size() => int i;
while (i--) { // 10 test patterns with a random number of slots and pulses, at a random harmonic of 110 Hz
    // note: handpicking the values can give even nicer results
    test[i].init(Math.random2(7,21), Math.random2(2,7), Math.random2(1,8)*110);
}


hour => now; // keep the shreds running for an hour

Web Audio API experiment

I added a little experiment with the Web Audio API to this website. It sends a sine wave through 5 feedback delays. The read position in the delay lines changes over time, and a small interface allows you to influence how fast and where to.

find it here

For those interested, the source is available on GitHub:
https://github.com/casperschipper/WebAudioFeedbackDelay
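
The actual patch lives in the repository above; as a rough illustration of the basic idea, a single feedback delay in the Web Audio API looks something like this:

var context = new AudioContext();

var osc = context.createOscillator();
osc.frequency.value = 440;

var delay = context.createDelay(2.0); // maximum delay time: 2 seconds
delay.delayTime.value = 0.3;

var feedback = context.createGain();
feedback.gain.value = 0.6; // keep below 1, or the loop will blow up

// source -> delay -> output, with a delay -> gain -> delay feedback loop
osc.connect(delay);
delay.connect(feedback);
feedback.connect(delay);
delay.connect(context.destination);

osc.start();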