Tuesday, November 17, 2015

Hitchhiker Laboratories Abattoir

After a year of learning all I could about electronics, I have some results to show. I created my first effect pedal from scratch: the Hitchhiker Laboratories abattoir.

Before you send in your order, you have to understand that Hitchhiker Laboratories is not a real company; it’s what I call my basement studio. Furthermore, the abattoir is what I call this pedal I created, because I felt like giving it a name; it’s not a commercially available product at this time (if ever).



At its core, the abattoir is a vactrol-based VCA with an attack and release envelope generator. In addition to the Attack and Release controls that set the envelope, there’s also a Mod control that affects both parameters at once. When Mod is turned fully counter-clockwise, the values of the Attack and Release knobs are used as-is, but turning the Mod knob clockwise gradually shortens both the attack and release times.
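To make the Mod behaviour concrete, here is a rough Python sketch of how one knob could scale both envelope times at once. The linear taper and the minimum-scale floor are my assumptions for illustration, not measurements of the actual circuit:

```python
# Toy model of the Mod control: one knob shortens both envelope times.
# The linear taper and the 5% floor are assumptions, not circuit values.

def effective_times(attack_s, release_s, mod):
    """mod in [0, 1]: 0 leaves the knob values untouched,
    1 shortens both times as much as possible."""
    scale = max(1.0 - mod, 0.05)   # never collapse to zero (no clicks)
    return attack_s * scale, release_s * scale
```

With Mod at noon (0.5), a 1 s attack and 2 s release would behave like 0.5 s and 1 s.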

Pushing the purple button triggers the attack and releasing it triggers the release (the left status LED lights purple while the envelope is triggered). This can be used in one of two settings determined by the topmost toggle switch: 1) make a sound (left position) or 2) break a sound. In the first setting, the audio sent to the pedal’s input is silenced unless the purple button is pressed; in the second, the input audio is heard unless the button is pressed. In both cases, the envelope settings control the vactrol VCA as it makes or breaks the sound. The abattoir’s envelope generator and vactrol VCA set it apart from other kill switches in that it can kill very smoothly. If you like clicks, look elsewhere… the abattoir will never cause clicks of any kind.

The abattoir also allows the recording and playback of trigger sequences played on the purple button. This is the mechanically perfect repetitive killing action that gives the abattoir its name. When the right status LED is off, pressing the orange button once will arm the recording (LED turns red). The sequence will begin recording with the next press of the purple button and will end with the next press of the orange button. Playback of the sequence will start immediately after this second press of the orange button (LED turns green). Pressing the orange button during playback will pause the playback of the sequence (LED turns orange) and pressing it again will restart the sequence. A long press of the orange button at any time will clear the recorded sequence (LED turns off).
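For readers who like their logic written out, the orange-button behaviour can be modelled as a small state machine. This Python sketch is my own paraphrase of the rules above, not the pedal’s actual firmware; the state names and LED mapping are mine:

```python
# Toy state machine for the sequence recorder; states and LED colours
# follow the description above, not the real firmware.

LED = {"empty": "off", "armed": "red", "recording": "red",
       "playing": "green", "paused": "orange"}

class Sequencer:
    def __init__(self):
        self.state = "empty"            # no sequence recorded, LED off

    def orange_press(self):
        if self.state == "empty":
            self.state = "armed"        # recording starts on next purple press
        elif self.state == "recording":
            self.state = "playing"      # playback starts immediately
        elif self.state == "playing":
            self.state = "paused"
        elif self.state == "paused":
            self.state = "playing"      # restart the sequence

    def purple_press(self):
        if self.state == "armed":
            self.state = "recording"

    def orange_long_press(self):
        self.state = "empty"            # clear the recorded sequence
```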

While a sequence is playing, the envelope settings can be manipulated to vary playback, but the abattoir also has a Speed control that makes it possible to vary the playback speed of the recorded sequence. Also, the break/make toggle switch can be used to play the “opposite” of what you recorded (the ground to your figure).

The bottom toggle switch is used to bypass the effect (left=off).


In February of 2010, Rick Walker posted to the Looper’s Delight mailing list a description of a pedal he had designed with Bill Putnam. Rick wanted a pedal “that could utilize a hand drummer or string or wind instrumentalists’ ability to move their fingers rapidly, either in arpeggiation modes, static ostinato rhythms or even just randomly” and that could be used to “glitch silence into an already existing sound file or to take a random ambient loop as a sound source and be able to constrain it to a very articulate rhythm.” Needless to say, this was music to my ears and I, among many other musicians, was eagerly awaiting the release of the “Walker Manual Glitch Pedal.”

While we still wait for this novel instrument, I had the notion last year of trying to implement some of its features in a design of my own. The pedal I would end up building a year later, while sharing some similarities, bears little resemblance to the original specification as described by Rick and perhaps owes more to recent pedals such as the Electro-Harmonix Chillswitch and the Dwarfcraft Memento. Many of the “Walker Manual Glitch Pedal” features are unfortunately nowhere to be found in my design, such as multiple buttons to allow finger drumming, onboard pink and white noise generators, and a filter with sweepable frequency and resonance. Oh well… I guess Rick will have to build his pedal! I know I’ll be first in line to buy one if it ever comes out.

Wednesday, November 26, 2014

Matt Davignon Remix Album

This is a quick update to spread the word about yesterday's release of Matt Davignon’s anniversary remix album, celebrating 10 years of his unique approach to drum machines, to which I had the privilege of contributing a track.

Monday, October 13, 2014

Meta-trombone on iOS

When I was invited to perform at IMOOfest 2014 as part of the IMOO Chamber Orchestra (featuring Jean Derome and Joane Hétu), festival organizer Craig Pedersen asked me to keep my electronic setup as simple as possible.

There were two considerations that guided the redesign of my meta-trombone. First, I wanted a lighter rig that could be quickly deployed on stage. Second, I had to make changes to adapt from solo performance to ensemble playing.

Lighter is better

I’ve already given details on all the hardware and software that I have been using to date to create the meta-trombone. If there is one problem with this rig, it is the weight of my Gator case (laptop + 2U). While I certainly like its sturdy construction, hauling it through the Paris subway last summer made me question its necessity. It certainly annuls one of the advantages of making music with a computer: not having to carry heavy gear all over the place.

The model I chose to adopt was that of the guitar player: I wanted my rig to consist of a small pedalboard, my instrument and an amplifier. I think this meets IMOOfest’s simplicity requirement as I know other performers will have a similar setup. It will also allow me to play in venues that don’t have their own PA, such as art galleries and house concerts.

Starting with the end of the signal chain, I began to research amplifiers. I quickly excluded guitar amps, since they’re an extension of the electric guitar and add too much tone coloration to the signal to be of any use to me. I briefly considered buying a keyboard amplifier, but ultimately I decided to buy an Electro-Voice ZLX powered loudspeaker. The ZLX is inexpensive, sounds decent and is quite versatile. It can output up to 126 dB, which can easily match the acoustic sound of the trombone. I bought a single speaker for the IMOO gig, but I think I’ll get a second one for solo work to abuse the stereo field.

Moving to the pedalboard, I decided to port my Bidule patch to Pure Data so that I could run it on my spare iPhone 5. I’ve been playing with libpd since it was rolled out, but instead of coding my own iOS application, I went for the simplicity and ease of use that MobMuPlat offers. I quickly created an interface in the MobMuPlat editor and I had a Pure Data patch running on my iPhone in no time (look below for further details on the patch).

The biggest hurdle to making music with the iPhone is that audio and MIDI both have to come through the iPhone’s single connector. Apogee’s Duet provides an elegant solution as it offers an audio interface (2 inputs / 4 outputs) with a USB MIDI connector. I connected my KMI 12 Step to it without any issue. I’m also using a Yamaha FC-7 expression pedal that connects to the 12 Step’s expression port. Audio input is through my trusty Audio-Technica ATM350 Cardioid Condenser Clip-On Microphone. All the gear fits on a Pedaltrain 2, which came with a nice carry bag.


(notice extra room for future iPad expansion or stomp boxes)

Working with iOS

Aside from the previously mentioned MobMuPlat, I’m making use of several other apps as well.  First, for pitch-to-MIDI conversion, I’m using MIDImorphosis.  There are other apps that provide this functionality on iOS, but this is the only one with Audiobus support, and that made the difference in the end, since Audiobus is the only way I’ve found of splitting an audio channel between two different apps. I’m also using Audiobus to route the output of MobMuPlat to Moog’s Filtatron.  Filtatron is a great sounding app that combines an analogue-modelled filter with modulation, a warm distortion and a delay with feedback and modulation.  There’s also an oscillator and a sample player/looper, but I haven’t needed those yet.

Below is what the routing looks like in Audiobus:


This is the interface to my Pure Data patch in MobMuplat:


I have a second iPhone 5 strapped to my trombone that can send MIDI control signals over Bluetooth (using the excellent Apollo MIDI over Bluetooth app from Secret Base Design).  I created the following interface in Lemur to control Filtatron and send MIDI notes to my Pure Data patch:


The patch I created in Pure Data is essentially a looping sampler.  Once a phrase is recorded, it can be played back at different rates. When the LOOP toggle is on, the notes sent to the sampler are held and the sample loops at the playback rate that corresponds to the given note.  Up to four notes can be held at one time (with voice stealing).

Additionally, the trombone performance can be analyzed with MIDImorphosis to convert the audio to MIDI notes. These notes can be used with the sampler in one of two modes.  The first is the “Sampler” mode in which the playback rate is fixed at a value determined by an adjustable pitch shift value. The MIDI notes triggered by the trombone will select the starting position within the sample in such a way that a given pitch class will always trigger playback from one of twelve divisions within the sample. The “Synth” mode will trigger playback from the beginning of the sample at a playback rate corresponding to the actual note played on trombone.
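As a rough illustration of the two mappings (the function names and the sample representation are my own inventions, not objects from the actual Pure Data patch), the mode logic boils down to something like this:

```python
# Hedged sketch of the two modes described above; names are mine,
# not taken from the Pure Data patch itself.

def sampler_mode_start(midi_note, sample_frames):
    """'Sampler' mode: the pitch class picks one of twelve start
    positions, so a given pitch class always triggers the same slice."""
    return (midi_note % 12) * sample_frames // 12

def synth_mode_rate(midi_note, base_note=60):
    """'Synth' mode: playback starts from the beginning of the sample,
    at a rate matching the played pitch (equal temperament; the
    middle-C base note is an assumption)."""
    return 2.0 ** ((midi_note - base_note) / 12.0)
```

So in “Synth” mode, playing one octave above the assumed base note doubles the playback rate, while in “Sampler” mode any C-sharp, in any octave, starts playback from the same one-twelfth division of the sample.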

Future development

This setup served me well in performance and I’m not looking back. I believe that all future development of the meta-trombone will take place on this platform. The next logical step would be to add a looper to this setup. I already have a nice hardware looper I could throw in, but I’ve had a lot of fun playing with a Pd patch created by Marco Baumgartner called ALFALOOP, a well designed delay-based looper that has all the features I need to build from.

Monday, April 28, 2014


In 1951 John Cage composed a piece for twelve radios titled Imaginary Landscape No. 4. This piece is a continuation of Cage's thinking from his manifesto The Future of Music: Credo in which the composer defined music as 'organized sounds' and constitutes an early use of sampling in music. However, Cage had another motivation for writing this piece: adjusting to the reality of radios in his environment. 
"Well, you know how I adjusted to that problem of the radio in the environment. Primitive people adjusted to the animals which frightened them, they drew pictures of them on their caves. And so I simply made a piece using radios. Now, whenever I hear radios, even a single one, not just twelve at a time, I think well, they're just playing my piece. [..] and I listen to it with pleasure. By pleasure, I mean, I notice what happens – I can attend to it, and become interested in the… well, what it actually is that you're interested in, is what superimposed what, what happens at the same time, together with what happens before and what happens after. Formerly, when I would go into any friends' home, out of deference, you know, to my tastes, seeing me coming they simply turned off any radio, or even a disc that was playing. Now they no longer do it, they know that I think that I composed all those things." 
John Cage from John Cage and Morton Feldman In Conversation, 1967

By composing music for radios, Cage was able to listen with interest to something he had previously found disagreeable. I've already had similar experiences by utilizing augmented reality musical apps to transform elevators full of chatting public servants or doing the dishes into new and interesting sonic experiences. In this regard, I'm already familiar with the idea and quite happy with the results. 

I've decided to take this idea and compose a new piece to help me adjust to my evolving environment. 

TANTRUM! for solo toddler

Friday, April 4, 2014

Hello 6502 Assembly

After many false starts (since age 12), I've finally managed to compile and run a machine language program on my Commodore 64. The code below is in ACME assembly language and displays "Hello World!" in the middle of the screen. Nothing groundbreaking, but it's a required first step to verify that I have all my tools set up and working.
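For the curious, a minimal “Hello World!” in ACME syntax looks something like this. This is a sketch of the idea rather than my original listing: it prints via the KERNAL’s CHROUT routine at the current cursor position instead of centering on screen, and the load address and labels are my own choices.

```asm
; Minimal sketch in ACME syntax, not the original listing.
; Run with SYS 49152 from BASIC; prints at the cursor position.

        * = $c000           ; assemble to free RAM
        ldx #0
loop    lda message,x       ; fetch next character
        beq done            ; a zero byte terminates the string
        jsr $ffd2           ; KERNAL CHROUT: print one character
        inx
        bne loop
done    rts
message !text "HELLO WORLD!"
        !byte 0
```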

Anyone looking to cross-develop for the Commodore 64 on OS X should have a look at Dust. Now, let's see if I can make this thing beep...

Thursday, April 3, 2014

2013 World Tour

I haven’t had a chance to say very much on the subject of last year's world tour. My first concert was in Toronto in March. I performed at Synesthesia II at the invitation of the organizers, FAWN Opera. There were many amazing performers that night and I was quite impressed with Sarah Gates’ inspiring performance on saxophone. Video of that concert has surfaced, but the stereo separation was not captured by the video camera, so the sound is not representative of what the audience heard.

This was not my best performance, unfortunately. The venue was interesting and delivered lots of ambiance; however, it did not provide a very large stage. Somehow, between the time I set up my equipment and when I made ready to play, my gear got tangled with other performers’ equipment and my headphones were damaged to the point that I was unable to use them. Luckily, the organizers were able to quickly get everything in readiness and I was able to perform the Canadian premiere of my meta-trombone.

In May, I travelled to New York City and had the opportunity to perform at Brooklyn’s Goodbye Blue Monday. This was the first time I performed my meta-trombone in a general setting where listeners’ expectations were not biased towards the unusual. Unfortunately, there weren’t that many people at the bar that night and I performed mostly to a group of my friends. I had also been invited to play at ABC No Rio that same week; however, that concert was cancelled due to other commitments by the organizers.

In July, I travelled to Paris to perform at the first edition of the Paris Loop Jubilee. This event was masterfully orchestrated by Emmanuel Reveneau and everything about this festival was superbly executed: the artists had free lodging a short metro ride from the venue; the venue was unusual and inspiring; the artists were provided with daily meals that were both copious and healthy; and all technical needs of the performances were met.

Above all, the lineup that Emmanuel brought together for his festival was eclectic and masterful. Not only did I have the opportunity to hear and hang out with old friends such as Rick Walker, Laurie Amat, Luca Formentini and Emmanuel Reveneau, but I also discovered some new performers worthy of a global following.
Luca and Rick getting ready to play in one of the three "voûtes" artists have taken over under the Tolbiac street bridge in the 13th.
I was fairly emotional during my performance, since only a few hours before my set, my wife showed up unannounced to surprise me! We spent a couple days together in Paris before I met up with my fellow travellers en route to Cologne.

The Cologne festival was organized by Michael Peters, whom I have known for years through the geographical defiance of the internet, but met for the first time in Paris, where he also performed at the Paris Loop Jubilee with Stefan Tiedje.

Michael Peters, Laurie Amat, Emmanuel Reveneau and Rick Walker standing in front of the yellow van that took us from Paris to Cologne.
Michael's backyard

Around town, near Michael's house
Michael hosted Rick, Emmanuel, Laurie and me at his house near Cologne. Luckily, I had a day off before the concert to unwind and to appreciate the German countryside. I spent most of it reading and walking around the small town near Michael’s house. The next day we went to Cologne early to set up our equipment and to meet the other performers.
Emmanuel, Laurie, Steve Moyes and Amy X. Neuburg getting ready to play.

Cellist Steve Moyes making some last-minute preparations.

Laurie's rig.

Amy's rig.

Rick Walker's fun house.
Michael arranged a wonderful mix of German and international performers to play at his one-day event. We all shared the lofty stage at the Alte Feuerwache and we set up our equipment well in advance of the performances with the brilliant assistance of two highly competent technicians. As a solo performer, this was my most satisfying performance: technically ideal conditions and a large, receptive audience.

After all that travelling, I intend to spend more time at home this year. I’m reorganizing my basement studio to help me create three albums I’ve sketched out. The first will document my meta-trombone in a studio setting, the second will explore the sound of the Commodore 64’s iconic sound chip with musique concrète manipulations and the third will pursue some ideas developed as contributions to the Disquiet Junto. Finally, I will also have the privilege to contribute a track to Matt Davignon’s anniversary remix album celebrating 10 years of his unique approach to drum machines.

Thursday, February 13, 2014

New comic book story

I’m finishing up my two-page contribution to Éditions Trip’s upcoming release of Trip #8. As you can see from the page below, it is about an astronaut floating out of control in space. This short piece is the prologue for a longer work I have outlined and it sets the ontological basis for magic in that graphic novel’s universe (more on this as it develops).
An obvious inspiration for the astronaut drawings was NASA’s photographs of the first American spacewalk.
The word captions were removed for this posting to provide further incentive to buy Trip #8 when it comes out in March.

I would like to thank Éditions TRIP for the opportunity to participate once again in one of their publications as it provided the necessary impetus to start this project.

Friday, February 7, 2014

OSCNotation 3.0

Version 3.0 of OSCNotation has just been released on the Apple App Store.

I started working on this update in December when composer Nicolas Fells contacted me about adding iPad support to OSCNotation for a piece he was working on. The appeal of working with the iPad is that OSCNotation can display multiple staves on the same screen.  Musicians can all read from the same device and see how their parts interact with those of their colleagues. This is probably the single most requested feature, and I had some spare time I decided to dedicate to this task.


The user guide has been updated to reflect all changes.

What I find most rewarding about my work in music technology is hearing the music to which my software contributes. Below is a concert from early 2013 featuring Dan Tepfer and Lee Konitz.  Dan is using OSCNotation to send musical notation to the Harlem String Quartet from his midi keyboard. How cool is that?

Tuesday, February 4, 2014

My Commodore 64

Lately, I've been obsessed with the Commodore 64.  It all started when I read a fantastic little book that discussed a one-line program from a variety of different angles.  After spending a couple of hours with emulators running on my MacBook, I wanted to get my hands on the real thing.

My first computer was a Commodore 64.  I remember vividly the happiness I felt that night in December when my father came home with all those boxes.  He laid everything on the floor and we spent hours putting everything together.  All the manuals and documentation were in English and at that age (eight), I had yet to master the language.  This is probably the greatest gift I ever received from anyone.  Not only was the Commodore 64 a fun game system, it was my first introduction to computer programming.  I spent a lot of time deciphering the codes in the manuals and I made little games in BASIC.  Unfortunately, I never figured out how to save anything, so I had to input the code every time I wanted to play one of my games (again, the manuals were in a "foreign" language).

All this reminiscence eventually led me to acquire two systems.  The first is an early model breadbin C64, just like the one I had as a kid.  I made some modifications to it:
  • I added an SD card reader to replace the bulky disk drive; 
  • I added some buttons to control the SD card reader; 
  • I installed JiffyDOS with a KERNAL selector switch;
  • and I installed heat-sinks on most integrated circuits.  
I will use this unit to play games and to create my own programs.

Programming the C64 may seem like an odd idea (or a waste of time), but the 6502, on which the C64's CPU is based, is still being manufactured and seeing lots of use in certain applications.  Learning to program in 6502 machine language may be a very worthwhile endeavour…  these chips are a lot cheaper than the micro-controller development boards available these days and, in some sense, a lot more powerful.

Also, the Commodore Basic language was ported to OS X, Windows and Linux as a powerful scripting language.  As such, it is an interesting alternative to other scripting languages and a nice way of recycling my old chops.

I also acquired a mint condition C64G.  This is a beautiful unit with a C64C motherboard in a creamy-white breadbin case.  I got this one specifically for musical purposes and I modified it in the following way:
  • I added a 1/4" audio output; 
  • I grounded the audio input to reduce noise; 
  • I installed an LCD screen;
  • I installed a power plug for the LCD screen;
  • I installed heat-sinks on most integrated circuits.
This C64 will live on a pedal board next to various guitar pedals. I'll use MSSIAH's mono-synth program as a sound source to be modified by the guitar pedals.  As a controller, I'll use my previously mentioned homemade MIDI guitar controller.  I'm also using a Raspberry Pi running Pure Data to process the MIDI signals from the guitar controller to the C64.  Here's the signal path I'm considering at this time:


Friday, July 26, 2013

Meta-trombone: European tour edition

Leading up to my currently ongoing European tour (concerts in Paris and Cologne), I tweaked my meta-trombone yet again.

The trombone audio input goes to an audio-to-MIDI converter, which interprets the trombone performance and outputs MIDI notes.  Those notes are either sent to the Midi Looper (more on this below) or to the sampler.  Before reaching the sampler, some MIDI effects can be applied to the notes (Midi Delay and Cthulhu).  The output of the sampler goes to the audio outputs (with reverb) and to the looper.

The KMI 12-Step controls either the Midi Looper or the Mobius Looper.  As commands are sent, the Head’s Up Display on my iPod Touch is updated (see image below).  I use the Line 6 FBV to change Mobius’ output volume, secondary feedback and playback rate.  I also use it to change the quantization setting of commands being sent to Mobius from the 12-Step.  As I change these parameters, the Head’s Up Display on the iPod is updated (four dials in upper left corner).  I select which parameter the pedal affects using the four switches on the FBV.  When a parameter is selected, its dial appears green on the iPod.  In this way, I can modify several parameters at the same time.

Finally, the output of the looper goes to audio outputs through reverb.  I have eliminated the post-looper effects, as they were more confusing than aesthetically satisfying.

Performance modes

The signal flow is only part of the story; to understand what is going on in a meta-trombone performance, I need to discuss the various performance modes.

Trombone Mode

In this mode, the acoustic trombone sound is sent to the looper and will be recorded.  When this mode is turned off, the trombone sound is no longer sent to the looper and will not be recorded, but it is still sent to the audio outputs.

Midi-note Mode

In this mode, the midi notes interpreted from the trombone performance are sent to the sampler.  This mode is only relevant once a phrase has been recorded into the sampler during performance.  There are two sub modes: synth and trigger.  Synth will cause the sampler to playback the recorded phrase from where playback last stopped at a playback rate relative to the note being played (e.g. higher notes cause faster playback).  Trigger will cause the sampler to playback at a defined playback rate starting from one of sixteen positions within the phrase relative to the note being played.  The playback rate of the trigger mode can be modified during performance.
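In rough pseudocode terms, the two sub-modes differ only in what the incoming note selects. This Python sketch is an illustration of the description above; the function names and the reference note are my assumptions, not parts of the actual Bidule patch:

```python
# Sketch of the synth and trigger sub-modes; names are mine,
# not taken from the Bidule patch.

def trigger_mode_start(midi_note, phrase_frames, positions=16):
    """Trigger: the note picks one of sixteen start positions within
    the phrase; the playback rate itself is a separate, adjustable
    parameter that can be modified during performance."""
    return (midi_note % positions) * phrase_frames // positions

def synth_mode_rate(midi_note, reference_note=60):
    """Synth: higher notes play the phrase back faster, resuming
    from wherever playback last stopped."""
    return 2.0 ** ((midi_note - reference_note) / 12.0)
```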

Midi Loop Mode

In this mode, the midi notes interpreted from the trombone performance are not sent to the sampler. Instead, they are sent directly to the Mobius looper and the Midi Looper.  Any given note will select a loop (1 through 4) and a starting position within that loop.  In this mode, I can “remix” all my loops together by playing trombone!

The midi looper can record those loops and since the output is sent to the looper, the “remix” I created in performance keeps going when I change mode or stop playing the phrase that gave it life.
I developed the Midi Looper in Cycling 74’s Max/MSP and the software is currently available for OS X (Windows support in the near future).

Monday, April 22, 2013

Tools of the Trade – Meta-trombone Edition

After every show, someone always wants to get more information about the technology that makes my meta-trombone possible. For the benefit of those who cannot make it out to one of my concerts, I thought I would briefly list and describe the hardware and software I rely on at this stage of the instrument’s development.



MacBook Pro (mid-2010 i7)

The central nervous system of my rig, my MBP is indispensable. These days you can use any manufacturer’s computer and almost any operating system to create music in real-time; however, there are advantages to using a Mac. Foremost is the availability of replacement computers that precisely match the specs of my current machine. In addition, third-party developers can test their hardware and software on exactly the same system as the one you are using, which may not be the case with other computers. The result is better system integration, which means less setup time and more music making.

Apple iPod Touch (4G)

I use the iPod touch (attached to my trombone) as a heads-up display for system information and looper status. This way I don't need to look down at my laptop too much. I can also use the iPod's accelerometers to control parameters.

RME Fireface 800

RME makes the audio interface of choice for anyone interested in reliability and sound quality. The FF800 features lots of ins and outs, direct monitoring and a matrix mixer with presets. This is more than I need, which is precisely what you want from your audio interface… your tools should not hinder your creativity.

ATM350 Cardioid Condenser Clip-On Microphone

I have been using this microphone for years… over a hundred gigs and I have never felt the need to look elsewhere.


KMI 12 Step

The 12 Step is a great little controller with a piano keyboard layout and illuminated keys. It is small enough to fit in a 1U rack drawer, it's USB-powered, it is solid and it is spill proof. What else do you need?


Line 6 FBV

I am still integrating the FBV into my set, but the four switches allow me to select which parameter the expression pedal affects. I think this will prove very useful as I continue development on the meta-trombone.

Gator GRC-Studio-2-Go ATA Case

I like this case because I can arrive at the gig with everything wired and ready to go. I added a 1U drawer to keep my microphone and my KMI 12 Step, so this single box contains almost everything I need for the gig.

YSL-697Z Professional Trombone

The 697z has been my horn of choice for the last five years. Yamaha built it for Al Kay, but it meets all of my expectations of what a great trombone should be.

K&M 15270 Trombone Stand (in-bell)

Since you should never leave your trombone on the floor, always bring a stand with you. The convenience of the in-bell stand outweighs the inconvenience of an unbalanced trombone case.

Yamaha Trombone Lyre

After many false starts, it turns out the best way to attach anything to your trombone (iPod Touch, sensors or whatever) is with a lyre.

Sennheiser HD25-1 II Headphones

Since I could never get used to playing a brass instrument with something stuck inside my ears, I only use over the ear headphones to monitor the mayhem on stage. The HD25-1 II provides a good level of noise isolation and gives me a great signal.




Hexler’s TouchOSC

I run TouchOSC on my iPod Touch to display system status information received wirelessly from my MacBook through OSC messages. I also use it to send the iPod’s accelerometer data to the MacBook. The long-term goal is to write my own performance software for iOS that will also display algorithmically generated musical notation.

Circular Labs’ Mobius

The Mobius looper is developed by Jeff Larson, who makes it available freely. A scriptable multitrack looper, Mobius brings a lot of creative potential to the table. I cannot imagine how hard it would be to make music without this tool, as I am unaware of anything quite like it.

Expert Sleepers’ Crossfade Loop Synth

While it is primarily a sampler, you can also view this versatile plugin as a creative delay or even a looper. I have a series of tips and tricks for this plugin that I will post shortly.

Audio Damage Eos

Eos is a good sounding reverb that does not tax your CPU too much.

Xfer records' Cthulhu

This nice little plugin consists of two independently selectable midi effects: a chord memorizer and an arpeggiator. The chord module allows me to assign a user-defined chord to any midi note. The arpeggiator takes the output of the chord module and sequences the chord notes according to a pre-defined pattern. Sending the output of Cthulhu to the Crossfade Loop Synth adds a lot of interesting possibilities.
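To give a flavour of what those two stages do, here is a toy illustration of the concept in Python. This is not Xfer’s actual algorithm, and the function names and data layout are mine:

```python
# Toy chord memorizer + arpeggiator, illustrating the two-stage idea
# described above; not Xfer's implementation.

def chord_module(note, chord_map):
    """Expand one incoming MIDI note into a user-defined chord.
    chord_map keys are pitch classes, values are interval lists."""
    intervals = chord_map.get(note % 12, [0])   # pass through if unmapped
    return [note + i for i in intervals]

def arpeggiator(chord, pattern):
    """Sequence the chord's notes according to a pre-defined pattern
    of indices (up, down, ping-pong, and so on)."""
    return [chord[i % len(chord)] for i in pattern]
```

Feeding middle C (note 60) through a major-triad map and an up-down pattern yields the notes 60, 64, 67, 64, and sending a stream like that into a sampler is where the interesting possibilities come from.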

Plogue Bidule

This is where the magic happens. Bidule is a graphical music programming environment. It is also a VST/AU host, so you can use your plugins as elements within your “code”. I use it to convert my trombone sound into MIDI notes and to route signals between plugins based on system state. I also use it to augment the functionality of the plugins I use. In a way, the Bidule patch is the instrument and the composition when I play meta-trombone.

Future Additions


GameTrak controller

The GameTrak controller is an intriguing option for gestural control of musical parameters. After reading about the development of the 3D Trombone, I ordered two GameTraks and I think I will incorporate them into my performance system. By determining the distance between the two hand units while playing trombone, I think I can use this controller to determine the slide position. There are other possibilities, of course.


Cycling 74’s Max

I've been learning Max since last summer and I can think of a few ways it will prove useful down the road. Presently, I really appreciate how easy it was to integrate with the Arduino to read the values coming from the GameTrak controller or other sensors. I've also been playing with GEN and the sounds I get from it are very surprising. There are also a number of interactive music patches available for Max that make it worthwhile to study this software.

Sunday, March 24, 2013

2012 - My year in review

A couple months ago, I made a track for a Disquiet Junto project called audio journal. Here is my contribution:

The year 2012 was quite good to me… On the personal side, the high point was the birth of my daughter Myriam in February and that adventure keeps getting better all the time.

On the musical side of things, I had a great year. I contributed to my first Chain Tape-Collective project, CT-One minute. One of the two tracks I submitted to that project, Twice Through the Looking Glass, was later selected for the 2012 60x60 Canadian Mix and has been heard in concerts all over Canada.

In May I released sans jamais ni demain, an album of electroacoustic compositions that brought together most of my musical ideas up to that point. Over the summer I took a class in Max at the Massachusetts College of Art and Design, released my first iOS app and made headway in the development of my meta-trombone. I also created a fun and intuitive vocal instrument in Bidule. Below is a video of a test performance, in case you missed it the first time around:

In October I had the pleasure of playing two concerts at the Y2KX+2 Livelooping festival in San Jose and Santa Cruz. Not only did I meet some great people, I used the recordings from those performances to document my work on the meta-trombone. While I was in California, I also released my second iOS app, OSCNotation, which I've recently updated and discussed on this blog.

In November I joined the Disquiet Junto and produced my first track with project 48 - libertederive:

I enjoy the challenge of making music within the constraints of each project.  As the above track should make clear, it prompts me to create music I would not otherwise create.

Things to come

The present year should be equally awesome…  For starters, I'm in the middle of a world tour to promote my meta-trombone:

  • Toronto (March)
  • New York City (May)
  • Brooklyn (May)
  • Paris (July)
  • Cologne (July)

Also, I have two musical releases planned and a new app for OS X and Windows in the works.

Keep the schedule hectic!


Thursday, March 14, 2013

The virtue of free

Last year I released two apps for iOS: BreakOSC! and OSCNotation. Both used Open Sound Control (OSC) to accomplish very different things.

In BreakOSC!, the user plays a game of Breakout to change parameters in their music software based on what occurs in the game. I thought this was a great idea… I spent a couple of months polishing this app and tried selling it for $0.99. Twelve people bought it. No one reviewed it and I received no emails from its users. The only reason I do not consider this project a complete waste of time is that I make use of the app in my own music from time to time. I do not plan to do any further work on this app.  (I have since made it available for free and over 200 people have downloaded it in only a few days.)

OSCNotation has been a very different story. For my main ongoing musical project, I needed to display programmatically generated musical notation on the iPhone. Once I found a way, I realized that other musicians and composers could also find uses for this and I packaged this part of my project into a simple app that displays notation based on messages it receives via OSC. It took me very little time to create this app and I did not polish it to the level of BreakOSC!. Consequently, I made it available for free.
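For anyone curious what "messages it receives via OSC" involves: an OSC message is just a UDP packet containing a null-padded address string, a type-tag string and big-endian arguments. Here is a minimal stdlib-only Python sketch; the /note address and its arguments are purely illustrative, not OSCNotation's documented protocol:

```python
import socket
import struct

def osc_message(address, *args):
    """Encode a minimal OSC message (int32 and float32 args only)."""
    def pad(b):  # OSC strings are null-terminated and padded to 4 bytes
        return b + b"\x00" * (4 - len(b) % 4)
    msg = pad(address.encode())
    tags, data = ",", b""
    for a in args:
        if isinstance(a, int):
            tags += "i"
            data += struct.pack(">i", a)
        else:
            tags += "f"
            data += struct.pack(">f", a)
    return msg + pad(tags.encode()) + data

# Fire-and-forget UDP send (the /note address is a made-up example):
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(osc_message("/note", 60, 1.0), ("127.0.0.1", 8000))
```

In practice a library like python-osc handles the encoding; the sketch is just to show how little is on the wire.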

The response has been amazing. CDM reviewed it and Music Tech Magazine spread the news to its readers. To date, over 500 people have installed OSCNotation. Furthermore, users also contributed back… Carl Testa created a tutorial for SuperCollider and Joel Matthys created ChucK code for a performance of Riley’s “In C”. Joel also coded an Android version of OSCNotation that mirrors the features of the first version of my app.

I have also received many emails from users describing their intended use of my app to teach, compose and perform. I look forward to hearing the music they create with my app.

Further, this interest in OSCNotation brought some attention to my own music and art. Indeed, my blog and bandcamp stats show a spike surrounding the dates of the original release.

Given all this, it is not very surprising that I felt it worthwhile to continue the development of this app. Today, I am very happy to announce the availability of OSCNotation version 2.0!

Some of the new features:

  • Note beaming
  • Triplets (half note, quarter note and eighth note)
  • User can choose to display accidentals as flats or sharps
  • User can specify beat duration (affects note beaming). 

You can refer to the user guide page on the OSCNotation website to see how that works. Enjoy (and please share your music).

Thursday, January 31, 2013

Artist Statement

Lately I’ve been giving some thought to developing an artist’s statement that would unite my various artistic endeavours.  Given my seemingly disparate output, I thought this would be a lot harder to do, but the statement wrote itself…  I rapidly discovered an underlying theme in (almost) all my artistic interests and it just fit and felt right.  I really believe this is what I’ve been doing all these years, but, for the first time, I’ve now described it with words.

What I’ve realised is that, in my art, I explore the distinction between the symbol (word, image or sound) and the object it represents.  By scrambling this distinction, symbols can become artistic building blocks and objects can acquire meaning.  My approach draws inspiration from the works of Magritte and Gödel's theorem on the incompleteness of mathematics.

In Les deux mystères, Magritte depicts a painting of a tobacco pipe on an easel. Below the pipe we can read the phrase: “Ceci n’est pas une pipe” (this is not a pipe).  Beside the painting, there is another pipe (the presumed model for the painting).  In this painting, Magritte brings our attention to the distinction between the symbol (the pipe on the easel) and the object it represents (the “real” pipe beside the easel).  However, this last pipe is no more an object than the pipe in the painting on the easel.  They’re both images of pipes…  With this realisation in mind, we can read the phrase on the painting once again and become aware that, just as these pipes are not really pipes, the words are not really words.  Rather, they’ve become coloured shapes on the canvas.  The symbol is objectified and manipulated to create art.

In his famous theorem, Gödel shatters the distinction between the discourse about numbers and the numbers themselves by producing an equation that talks about itself.  This equation tells us that it is part of the mathematical domain, but that it cannot be demonstrated.  The object of mathematical discourse participates in the discussion… the object is elevated to symbol and acquires meaning.

In my artistic practice, I explore this movement from object to symbol and from symbol to object.  I do this by producing self-referential films, by making images (films and comic books) through the manipulation of other images or words, by making music from language, by writing music where the notes are both musical material and control signals that change parameters, and through computer-assisted poetry.  Recently, I’ve also created a game that sends control messages to change musical parameters based on what’s happening in the game.

Where I propose to go

These last few months since Y2KX+2 have seen much development on my meta-trombone.  The first thing I wanted to do after those performances was to replace the first instance of Mobius in my signal chain.  I think the way I was using it (as a sampler, rather than a looper) caused it to crash in performance.  After some research, I opted for Expert Sleepers’ Crossfade Loop Synth.  I was able to recreate the functionality I was getting from Mobius by expanding my Bidule patch, which turned out to be fairly painless.  This new sampler does add some interesting possibilities such as:

  • Note polyphony;
  • Built-in filter, pitch modulation and LFOs;
  • Different loop playback modes (forward-and-backward being my favourite).

new flow

The other area of development was the addition of midi effects.  Whereas I only had midi note delay for my performances in California, I have now added Xfer’s Cthulhu to my patch.  This nice little plugin consists of two independently selectable midi effects: a chord memorizer and an arpeggiator.  The chord module allows me to assign a user-defined chord to any midi note.  Sending chords rather than single notes to the sampler plays back the sampled phrase at different playback rates all at once (something I find very satisfying).  The arpeggiator takes the output of the chord module and sequences the chord notes according to a pre-defined pattern.

The next aspect to see development will be the post-looper effects block.  Presently, I think I want to add both delay and tremolo slicing, but I may come up with other options as I work on this (suggestions?).

After that, I will concentrate on developing the iOS performance software component of this system.  Presently, I’m using TouchOSC to display system status information (such as what the looper is doing to what track or what performance mode I’m in), but I intend to build on the technology I’ve developed for my OSCNotation app (version 2.0 forthcoming) and display notation on my iPhone.  Since the system can already determine the notes that I’m playing (or have played recently), I’d like to use that information when deciding what notation to display.  For instance, the system could suggest new rhythmic or tonal material that either follows what I’ve played or contradicts it.  Ideally, I’d like to build some game mechanics into it that would react to whether or not I accept these suggestions.  For example, the “game” could start with only a few functions available to the performer and advanced functions needing to be “unlocked” by advancing in the game (i.e. playing what is suggested).  I’ve already explored this music game idea with my app BreakOSC!, but the idea still inspires me.

Monday, October 29, 2012

Live Recording from Y2KX+2

I have released the live recordings from my performances in San Jose and Santa Cruz as an album on bandcamp. All sounds were made with a trombone (with different mutes and at times singing through the instrument). Effects were limited to Rate Shifting in Mobius, reverb and some compression. Free download!

Friday, October 26, 2012

Y2KX+2 Livelooping Festival


Last week I had the privilege to perform at the 12th annual Y2K Livelooping Festival in California.   This festival is, by nature and design, as eclectic and wonderful as organizer Rick Walker.  At times, it seemed performers shared nothing but an attentive audience and an interest in using the techno-musical wizardry of livelooping.

Among the 50+ excellent artists I had a chance to hear, there are a few that stood out for me:

  • Luca Formentini (Italy) played a wonderful set of ambient guitar in San Jose and I really connected with his approach to improvised music.
  • Emmanuel Reveneau (France) had an amazing set in Santa Cruz.  For the second of his two sets at the festival, I felt that Emmanuel had soaked up a lot of whatever was in the Santa Cruz air that week and let it influence his music.  His loop slicing was especially inspired...  I can't wait for the release of the software he made with his computer-savvy partner.
  • Hideki Nakanishi  a.k.a Mandoman (Japan) gets an unbelievable sound out of a mandolin he built himself.
  • John Connell only used an iPhone and a DL4 for his set.  This minimalist approach really worked for him and it reminded me that the simple option is often the best option.  I hope he'll check out the soon to be released Audiobus app as it will open up some possibilities for his music.
  • Amy X Neuburg is one of my favourite loopers.  I have an insatiable  appetite for her unique combination of musicality and humour.  Unfortunately I was setting up during her set and I couldn't give her music my full attention. 
  • Moe! Staiano played a great set for percussion instruments such as the electric guitar.
  • Bill Walker played a laid back and masterful set of lap steel looping.
  • Laurie Amat's birthday set (with Rick Walker) was simply the most appropriate way to end the festival.
  • Shannon Hayden: Remember that name (you'll be hearing her music in your favourite TV shows soon enough).

The collegiality among the performers was a high point of my participation in this festival.  I had the occasion to enjoy discussing the philosophical aspects of improvised experimental music with Luca, sharing notes on the business side of music with Shannon, listening to Laurie tell us about her collaboration with Max Mathews, witnessing technical demonstrations from Emmanuel, Bill and Rick, and listening to my housemate Paul Haslem practice on hammered dulcimer.

2012 10 19 0176

The Test of Performance

Personally, my participation in the festival was an opportunity to put my meta-trombone project to the test of performance.  As with any new performance system, there were both positive and negative points to these first two maiden voyages.  Encouragingly, I was quite satisfied with the varied timbres I could produce with the meta-trombone.  I also enjoyed the drone-like feel of some of the loops and I liked the hypnotic phasing I employed.  

However, not everything went well.  My software crashed midway through my performance in Santa Cruz and I was forced to restart it. Thankfully, this is something I had practiced and I was able to keep playing acoustically on the trombone while the software came back online.  It did not take very long and many people told me they did not even notice the crash…  

More problematic, as I listen to the recorded performances, I feel there is something missing.  I find the conceptual aspects of the meta-trombone quite stimulating, however conceptually interesting music does not necessarily translate to good music (music people want to hear).  I tend to get overly interested in the conceptual part, but I need to focus on the music now that the concepts are firmly in place.  

I talked it over with other performers: Emmanuel suggested I form a trio with a bassist and a drummer so that I could rely on them to anchor the narrative aspects; Luca thought I needed to think more about my transitions.  Both suggestions will need to be explored as I continue work on the meta-trombone.

Next Steps

I'm currently editing the recordings of my two performances into accessible short 'songs' for easy consumption.  While the meta-trombone still requires work, I feel that this point in its development is still worthy of documentation and I stand by the recordings I made in California.  

One of the first things I want to develop further is the role of the notation metaphor in the meta-trombone.  Currently, trombone performance is interpreted by the computer software and the notes that I play execute code (specifically Mobius scripts).  I would like to expand this by creating algorithms that will send notation to be displayed on my iPod touch based on what notes were previously played.  Since meta-trombone notes serve both as musical material and as control signals, the software will be able to suggest changes to either the music or the system states by displaying music notation.  I already have a working app that displays music notation on iOS in real-time through OSC and it is generating quite a bit of buzz.  I'll have to integrate it into performance software for iOS that will ultimately replace TouchOSC, which I currently use as my heads-up display (see photo above).

Another avenue for further exploration would be to diversify the computer code that can be executed by playing notes.  I have a couple ideas for this and I think I will turn to Common Music to help implement them.  Out of the box, Common Music can respond to a MIDI note-on message by executing a scheme procedure, so it will be easy to integrate into my existing system.
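The dispatch idea itself is language-agnostic. In Common Music the note-on would be bound to a Scheme procedure; here is the same shape sketched in Python, with made-up action names:

```python
# Sketch of note-on -> procedure dispatch (illustrative; Common Music
# would bind a Scheme procedure to the MIDI note-on instead).

ACTIONS = {}

def on_note(note, fn):
    """Register a procedure to run when a given MIDI note arrives."""
    ACTIONS[note] = fn

def note_on(note, log):
    """Dispatch an incoming note-on to its registered procedure, if any."""
    if note in ACTIONS:
        ACTIONS[note](log)

on_note(36, lambda log: log.append("start-loop"))  # hypothetical action
events = []
note_on(36, events)   # runs the registered procedure
note_on(40, events)   # unregistered note: nothing happens
```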

I'm also looking to perform more with the meta-trombone and I'm actively looking for playing opportunities.  There's a possible gig in New York City in mid April (2013), so if anyone can help me find anything else around that time in NYC, it would make it a worthwhile trip.




Friday, July 27, 2012

Meta-Trombone Revisited

The recent release of version 2.0 of Mobius has spurred me to redesign my meta-trombone Bidule patch.  Since I can have both the new and the old version in the same patch, my matrix mixer (and some of the most complex patching) can be eliminated by using both versions of the looper. 

T set flow

The first looper will be the one that is “played” by trombone notes.  This is what I mean by playing the looper:

  • trombone notes will trigger the loop playback from a position determined by the note value
  • and/or trombone notes will change the playback rate relative to the note played
  • and the amplitude of the loop will follow the trombone performance by using an envelope follower.
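As a sketch of the first two mappings (the note range and formulas below are my own placeholders, not the actual Mobius script):

```python
def loop_start(note, loop_len_samples, low=40, high=76):
    """Map a MIDI note in [low, high] to a start position in the loop.
    The note range is an assumed trombone-friendly window."""
    t = (note - low) / (high - low)
    return int(max(0.0, min(1.0, t)) * loop_len_samples)

def playback_rate(note, reference=60):
    """Playback rate relative to a reference note: one semitone = 2**(1/12)."""
    return 2 ** ((note - reference) / 12)
```

The envelope follower for the third mapping would simply scale the loop's output gain by a smoothed measure of the trombone's input level.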

I’ll have a second instance of Mobius down the line that will resample the output of the first looper in addition to (or in the absence of) any incoming audio.  Effects will be applied after audio in, after the envelope follower and after the resampling looper.  I’ve yet to determine exactly what those effects will be, but the success of my vocal set patch leads me to consider a rather minimalist approach.

Speaking of minimalism, I’ve been listening to a lot of Steve Reich these days and I’d like to incorporate some phasing pattern play into my set for my upcoming performance at this year’s Y2K festival.  One way to quickly create some interesting phasing composition is to capture a loop to several tracks at once and then trim some of the tracks by a predetermined amount.  This can be easily accomplished with a script and I’ve been toying with some ideas along those lines. 
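The arithmetic behind such a phasing script is straightforward: trimming each track by a different amount gives loops of slightly different lengths, and the whole texture only re-aligns after the least common multiple of those lengths. A quick sketch (the sample counts are illustrative):

```python
from math import lcm  # Python 3.9+

def trimmed_lengths(base_len, trims):
    """Loop lengths after trimming each track by a fixed number of samples."""
    return [base_len - t for t in trims]

def realign_period(lengths):
    """Samples until all tracks line up again (least common multiple)."""
    return lcm(*lengths)

# e.g. a one-second loop at 44.1 kHz copied to three tracks:
lengths = trimmed_lengths(44100, [0, 100, 250])
```

Even tiny trims give re-alignment periods of many minutes, which is exactly what makes the phasing feel ever-evolving.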

Something else to which I’ve given some consideration is the development of midi effects to insert on the midi notes interpreted from the trombone performance.  Some midi effects that would be easy to implement:

  • midi note delay;
  • arpeggiator;
  • remapper (to specific key signature);
  • transposer.
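Two of these are easy to illustrate. A hedged sketch of a key remapper and a transposer (the snap-down behaviour and the C major scale are my own choices):

```python
C_MAJOR = {0, 2, 4, 5, 7, 9, 11}  # pitch classes of the target key

def remap_to_key(note, scale=C_MAJOR):
    """Snap a note down to the nearest pitch at or below it in the scale."""
    while note % 12 not in scale:
        note -= 1
    return note

def transpose(notes, semitones):
    """Shift every note by a fixed interval, clamped to the MIDI range."""
    return [min(127, max(0, n + semitones)) for n in notes]
```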

It will be interesting to see what impact these effects will have on the loop playback of the first looper.  Another idea is to remap notes to parameter selection or note velocity to parameter value.

Another significant change is that I’ve acquired hardware to interpret midi notes from trombone performance.  I’ve decided to go with the Sonuus i2M instead of my previously discussed approach, mainly because I was wasting too much time trying to make the ultrasonic sensor work properly.  Bottom line, it wasn’t that interesting and I’d rather be playing music. My current plan is to use a contact microphone to feed audio to the i2M and to have a gate on the midi notes it generates in Bidule that I’ll activate with a footswitch.

I’ll also be designing performance software for iOS as I intend to attach an iPod touch to the trombone to serve as my heads-up display for various system states (updated wirelessly with OSC).  I’ll be controlling the iPod with a Bluetooth page-turning footswitch.  One pedal on the footswitch will change between different screens and the other pedal will activate an action available on that screen.  For instance, on the notation screen, pressing the action pedal will display a new melodic line (either algorithmically generated or randomly chosen from previously composed fragments).

Now all I have to do is build it (and they will come…  or more accurately, I will go to them).

Thursday, July 12, 2012

Bring a map

Controller mapping, the art of selecting which parameters are controlled by which hardware, has been on my mind a lot these days as I prepare for an upcoming performance as a Featured Artist at this year's Y2K Live Looping Festival in California (I'll be playing in San Jose and Santa Cruz).

Before beginning this particular mapping, I had a vision I wanted to instantiate.  I wanted a system that would allow me to quickly create complex and ever-evolving loops using only words and other vocal sounds.  I also wanted to limit myself to musique concrète manipulations: loops, cutting and splicing sounds, delay, pitch shifting and reverb.

This is the audio flow I came up with:

V set flow

Incoming audio is sent to the outputs and also split to four tracks on a multitrack looper.  Before reaching the looper, each signal path goes through a pitch shifting effect.  Each track then goes to its own post-looper effect.  Tracks 1 and 2 go to a delay while Tracks 3 and 4 go to a reverb.  Those two groups of tracks are mixed together and the result is sent to a crossfader that selects between these two sources.  The output of the crossfader is mixed with the audio input and sent out.
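The crossfader at the end of that chain might look like this; I'm assuming an equal-power curve here, which keeps perceived loudness roughly constant across the fade, though the actual plugin may use a different law:

```python
import math

def crossfade(a, b, position):
    """Equal-power blend between two signal values; position in [0, 1].
    position 0 -> all a, position 1 -> all b."""
    theta = position * math.pi / 2
    return a * math.cos(theta) + b * math.sin(theta)
```

At the midpoint both gains sit at about 0.707 rather than 0.5, which is what avoids the loudness dip of a naive linear fade.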

My looper is Mobius.  I could’ve used another looper for this project, but my familiarity with this software and ease of implementation won out over wanting to play with another looper (I’ve had my eye on Augustus for a while).

My pitch shifter is Pitchwheel.   It’s a pretty interesting plugin that can be used on its own to create some interest in otherwise static loops.  Here, I’m only using it to shift the incoming audio, so it’s a pretty straightforward scenario.

My reverb is Eos by Audio Damage.  Do you know of a better sounding reverb that is also CPU friendly?  I can’t think of any.  My delay in this project is also by Audio Damage.  I’m using their Discord3 effect that combines a nice delay with some pitch shifting and filtering with an LFO thrown in to modulate everything.  This effect can really make things sound weird, but I’ll be using more subtle patches for this project.

To control all of this, I’ll be using my trusty Trigger Finger to control the looper and my Novation Nocturn to control the effects.  Here’s what I decided to do for looper control:

V set control

Starting on the left, the faders will control the volume of my tracks in Mobius.  The pads and rotary dials on the right are grouped by column and correspond to tracks 1 to 4.  Each button performs the same function, but on a different track.  The bottom row of pads calls the sustain substitute function on a given track.  The row immediately above it does the same thing, but will also turn off the incoming audio, so it will act like my erase button (with secondary feedback determining how much the sounds will be attenuated).  The next row up sends the track playing backwards for as long as the button is pressed and the final row of buttons mutes a given track.  The first rotary dial controls the playback rate of a given track and the top one controls its secondary feedback setting.

To control the effects, this is the mapping I came up with for the Nocturn:

V set effect control

The crossfader is obviously used to control the crossfader between the two track groups.  After that, each track has two knobs: one that controls the amount of pitch shift applied to audio coming in to the track and another that controls the wet/dry parameter of the track's post-looper effect.  The pads on the bottom will select different plugin patches, but the last one on the right is used to reset everything and prepare for performance.  Among other things, it will create an empty loop of a specified length in Mobius, which is needed before I can begin using the sustain substitute function.  Essentially, I’ll be replacing the silence of the original loop with the incoming audio.

One thing I won’t be doing is tweaking knobs and controlling every single parameter of my plugins.  I’ll rely on a few well-chosen and specifically created patches instead.  Also, keeping the effects parameters static can be an interesting performance strategy.  When I heard Mia Zabelka perform on violin and body sounds last year at the FIMAV, one thing that struck me was that she played her entire set through a delay effect without once modifying any of its parameters.  The same delay time and the same feedback throughout.  For me, this created a sense of a world in which her sounds existed or a canvas on which her work lived.  It’s like she changed a part of the physical reality of the world and it made it easier to be involved in her performance because I could predict what would happen.  Just as I can instinctively predict the movement of a bouncing ball in my everyday universe, I became able to predict the movements of sound within the universe she created for us with her performance.

Here's a recording I made tonight by fooling around with this setup:

Tuesday, May 29, 2012

New Album Release: sans jamais ni demain

I’ve just released a new album on Bandcamp: sans jamais ni demain. It’s a collection of experimental electronic music and recent explorations. Nothing grandiose, but I felt the need to update my bandcamp and share what I’ve been working on. A few more details about the songs:

the longing for repetition

“Happiness is the longing for repetition.”
---Milan Kundera
This is a song I made for CT-One Minute. All sounds are derived from a 10-second bass clarinet phrase sample that can be downloaded freely from the Philharmonia Orchestra's website. The sample was played back at various playback rates, forward and backward, through various envelopes using the Samplewiz sampler on my iPod. This performance was recorded in one take with all looping and effects done in Samplewiz. No further editing or effects except for copying and pasting the beginning at the end to bring closure to the piece.
I approached Samplewiz as a livelooper, since, in "note hold" mode, every note on the keyboard can be seen as a track on a multi-track looper (each with a different playback rate). For this piece, I used the forward and backward loop settings in the wave window, so things start to sound a bit different. I added some delay and messed with the envelope and it started to sound nice. Once I had a good bed of asynchronous loops, I left "note hold" by tapping rather than swiping the control box (this kept the held notes). I then changed the settings around and played over the loops without "overdubbing".
Samplewiz is quite powerful... You can also change the loop start and end points in between notes to add variety, without affecting the notes that are already being held.

tutus de chemin

This is the soundtrack for a short film I made in a weekend with my wife. I started with a vocal recording of my wife that I sent through Paul's Extreme Sound Stretch. The resulting audio file was played back as a loop in Bidule. I sent the audio to a pitch shifting plug-in (I believe I was using PitchWheel at the time) and then to a midi gate group and finally to the Mobius looper. I performed the sounds two-handed on my Trigger Finger. One hand was controlling a fader that was assigned to pitch shifting and the other was triggering pads to control the midi gate (the note envelope) and various functions in Mobius.

Three of a kind

This piece started out as an assignment for a course in Electroacoustic composition I took at SFU a few years ago.  The sound source was a homemade instrument, but everything was mangled and cut-up.  This piece features heavy use of the short loop fabrication technique familiar to readers of this blog.  I used Acid to assemble everything and add some effect automation throughout the piece.

le train

This is the soundtrack to a short animation film I made last year.  I used Soundgrain to isolate parts of the original sound's spectrum and used that software to create loops that I mixed  while recording.  I think this musical cutup is well-matched with the visual cutup it was meant to accompany.

Game music

This song was made using my soon-to-be-released iOS app: BreakOSC!  This app is a game that sends OSC messages based on in-game events.  In this case, when the ball hit blue and green bricks, Bidule triggered four instances of iZotope's Iris.  The paddle served as a cross-fader and mixed all those sounds together.  The results were sent to a generous amount of reverb courtesy of Audio Damage's Eos.

sans jamais ni demain

Another composition I made for the aforementioned course in electroacoustic composition I took at SFU.  The only sound source for this piece is a recording of myself reading an old poem I wrote in high school.  The slow moving textures were made by isolating parts of those words, slowing them down and layering them over each other to create very long notes of varying pitch that fade in and out over time.  The more rhythmic stuff I made using a now familiar technique.

July 8 2011

This piece is a recording of a live (from my basement) performance of what will one day become my meta-trombone.  A short loop is created at the top (what is heard twice in the beginning) and then altered in different ways determined by trombone performance.

Twice through the looking glass

This song was also made for CT-One Minute using the exact same sound source as the longing for repetition. This time, however, I used Iris to change the character of the sound and created two different sound patches.  I made two recordings with each of these patches by triggering the sounds with my new Pulse controller.  My three-month-old daughter also took part by adding her own surface-hitting contributions, making this our first father-daughter collaboration.  Once I had made these two recordings, I brought them into Bidule and placed them into Audio file players.  The output amplitude of each player was controlled via faders on my Trigger Finger and the result was recorded to file.