noControl – chaos and belief
Towards the end of summer 2015 I was asked by a friend, colleague and mentor to contribute to his sound installation, which took place at Café Oto’s Project Space between the 10th and 13th of September 2015.
Some twenty people participated in this joint assignment, and I was delighted to have been asked and to be able to take part.
Inevitably, my contribution ended up exploring the area of belief further, albeit in a different format than the ones I am accustomed to, namely those originating from live, real-time performance.
For this event I worked without the element of improvisation, other than what is required to operate a single rotary dial. Instead, I concentrated on the conceptual aspect and synthesized some of my thoughts on gestures and their projection onto meaning/belief.
Tom Mudd’s event was named “Control”, and it set out to investigate and explore the affordances of interface design in new instruments.
CONTROL – AN INTERACTIVE SOUND INSTALLATION: TOM MUDD
10–13 SEPTEMBER 2015, 1–9PM, OTO PROJECT SPACE
Here is the link to Tom’s website:
Here is the short description of my contribution, as it appeared on the publicity for the event:
‘noControl’ investigates the apparent need of the average user to have immediate, big-featured feedback from a given interface. As patience for a response diminishes, so does the ability to be open to subtlety and to events that evolve over wider time windows. The illusion of control over the interface often accompanies this restlessness/impatience. Interactions can therefore manifest in big, dramatic and short-lived movements/gestures. Under the hood of noControl, several chaotic behaviours shape the sound events so that they morph continuously, and the degree of agency that the user is able to exercise is limited and partial. Nevertheless, he/she will make believe and assign meaningful mappings between his/her actions and the sonic output.
No belief is True. No. Belief. Is. True. (Jed McKenna)
And here is how my contribution sounded. As you will notice, quick gestures do not produce a proportionate output. You really need to let it be to hear what it does…
In the recording it is clear that when the user performs abrupt, dramatic changes in the angular velocity of the knob, no sound is output.
In fact, the patch works at its intended capacity when left alone. All sound processes are internally generated by chaotic behaviours, and the user has very limited say in what the result will be. He/she can slightly adjust the angular position, but must then wait to perceive the ever-evolving patterns that are generated.
The actual programming of noControl is quite involved and comprises several buffers of audio over which various degrees of signal processing are applied and combined in a non-linear, dynamical fashion. This is mostly achieved by coupling oscillators and driving their frequencies via stochastic processes and, partially, via the user’s input.
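The general idea of coupling oscillators and driving their frequencies stochastically can be sketched outside of Max. The following Python sketch is purely illustrative: the function name, base frequencies, drift amount and coupling strength are all invented for the example and are not taken from the actual patch.

```python
import math
import random

def coupled_oscillators(n_samples, sr=44100, coupling=0.3, seed=1):
    """Illustrative sketch: two sine oscillators whose frequencies both
    drift stochastically and are cross-coupled, so each oscillator's
    output nudges the other's frequency. All constants are placeholders."""
    rng = random.Random(seed)
    f1, f2 = 220.0, 330.0            # arbitrary base frequencies
    ph1 = ph2 = 0.0
    out = []
    for _ in range(n_samples):
        s1, s2 = math.sin(ph1), math.sin(ph2)
        # stochastic drift plus non-linear cross-coupling
        f1 += rng.gauss(0, 0.5) + coupling * s2
        f2 += rng.gauss(0, 0.5) + coupling * s1
        f1 = min(max(f1, 20.0), 2000.0)   # keep within an audible range
        f2 = min(max(f2, 20.0), 2000.0)
        ph1 += 2 * math.pi * f1 / sr
        ph2 += 2 * math.pi * f2 / sr
        out.append(0.5 * (s1 + s2))       # simple mix of the two voices
    return out

sig = coupled_oscillators(1000)
```

Even with tiny per-sample drift, the mutual coupling means the two voices never settle, which is the quality the patch relies on.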
The following figure illustrates a good part of the abstraction patch that is eventually dynamically loaded into Tom Mudd’s host patch:
There are several constituent modules that make up the global behaviour of the patch, as already mentioned. The following screenshot shows in more detail two of the three buffers where audio is stored and processed:
The mysterious bangMe, in red, is a trigger event that initiates most of the processes in the patch. It is NOT determined by the user, despite what he/she might think. There are two main buffers, named buf1 and buf2, whose audio content is combined and treated under operations such as modulus, multiplication, summation, average(buf1, buf2) and min(buf1, buf2). These buffers are loaded with new audio when triggered by the event newTrack, which does result from the user’s input. In particular, newTrack is blocked when the angular velocity of the knob’s rotation exceeds a threshold value and, conversely, allowed when it does not. The audio content resulting from the interpolation of the two buffers is loaded into yet another buffer, named buf3.
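The per-sample buffer operations and the velocity gate can be sketched as follows. This is a hedged reconstruction, not the patch itself: the function names, the dictionary of operations and the threshold value of 2.0 are all placeholders.

```python
import math

def combine_buffers(buf1, buf2, mode):
    """Sketch of the per-sample operations the patch applies when mixing
    buf1 and buf2: modulus, multiplication, summation, average and min."""
    ops = {
        "mod": lambda a, b: math.fmod(a, b) if b != 0 else 0.0,
        "mul": lambda a, b: a * b,
        "sum": lambda a, b: a + b,
        "avg": lambda a, b: 0.5 * (a + b),
        "min": min,
    }
    op = ops[mode]
    return [op(a, b) for a, b in zip(buf1, buf2)]

def new_track_allowed(angular_velocity, threshold=2.0):
    """newTrack is blocked when the knob turns faster than a threshold;
    the threshold value here is an arbitrary stand-in."""
    return abs(angular_velocity) < threshold

# Combining two toy buffers by averaging, as average(buf1, buf2) would:
buf3 = combine_buffers([0.2, -0.4, 0.8], [0.5, 0.5, -0.5], "avg")
```

Note how the gate inverts the usual expectation: the more energetically the user turns the knob, the less happens, since newTrack is withheld above the threshold.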
Buf3 then serves as the content for a wavetable oscillator, whose frequency is driven by the already seen bangMe and whose phase is driven by noControl, which in turn is the result of the chaotic coupling of two other oscillators.
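The key point here is that the wavetable's read position is supplied externally rather than by a free-running internal ramp. A minimal sketch of such a phase-driven table lookup, with an invented function name and a toy four-point table standing in for buf3:

```python
def wavetable_read(table, phase):
    """Read a wavetable at a normalized phase in [0, 1) with linear
    interpolation. The phase is supplied from outside — in the patch,
    by the noControl signal — instead of by an internal ramp."""
    n = len(table)
    pos = (phase % 1.0) * n
    i = int(pos)
    frac = pos - i
    return table[i] * (1.0 - frac) + table[(i + 1) % n] * frac

table = [0.0, 1.0, 0.0, -1.0]          # toy stand-in for buf3's content
sample = wavetable_read(table, 0.25)   # read a quarter of the way through
```

Because the phase input can move erratically (as noControl does), the table is not necessarily scanned left to right, and the same stored audio can yield very different timbres.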
These reside at the core of the patch, pictured on the right-hand side of the next figure.
The actual non-linear behaviour is implemented inside the block called gen~.
Gen is a welcome addition since Max 6: it allows not only coding directly in a text-based style (codebox) but also, and more importantly, processing with single-sample accuracy, which was previously impossible.
Two phasors are visible, both driven by the now familiar bangMe. However, one phasor is also informed directly by the user’s input. The yellow object box named numba holds the actual value between 0 and 1 (with 1024 points of definition) provided by the rotation of the knob. The result of the non-linear behaviour is finally output as noControl, in the purple object box, and, as we have seen, drives the phase of the wavetable oscillator (on the left-hand side of the figure below).
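The gen~ core described above can be approximated in a few lines. Again this is a sketch under stated assumptions: the base rates, the exact form of the cross-coupling and the way numba enters the second phasor are invented; only the overall shape (two 0–1 ramps feeding back into each other's rates, one also hearing the quantized knob value) follows the text.

```python
def no_control(n_samples, numba=0.5, sr=44100):
    """Sketch of the gen~ core: two phasors (0-1 ramps) whose rates are
    cross-coupled, one of which is also nudged by the knob value numba.
    All rate constants are placeholders, not values from the patch."""
    numba = round(numba * 1023) / 1023.0   # 1024-point quantization
    p1 = p2 = 0.0
    out = []
    for _ in range(n_samples):
        r1 = (2.0 + p2) / sr               # phasor 1's rate depends on phasor 2
        r2 = (3.0 + p1 + numba) / sr       # phasor 2 also hears the knob
        p1 = (p1 + r1) % 1.0
        p2 = (p2 + r2) % 1.0
        out.append(p1)                     # stand-in for the noControl output
    return out

phase_sig = no_control(100, numba=0.7)
```

Per-sample feedback of this kind is exactly what gen~'s single-sample processing makes possible; in vanilla MSP the loop would be delayed by a whole signal vector.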