Monday, January 21, 2013

Expression, part 1 (MIDI Note On Velocity)

In our previous Arduino sketches, we created musical instruments that were very limited in terms of expression. In this post, I'll discuss two ways that we can give the instrument more expressive capabilities that utilize the performer's breath - mapping breath articulation to MIDI note on velocity, and mapping breath values to MIDI aftertouch after the initial attack.

These two techniques are useful if you are trying to play an existing synthesizer patch that is not specifically designed for wind control. They still leave a lot to be desired if you want to create an instrument that plays like an acoustic wind instrument, but I'll cover that topic in a later post.

An Inventory Of Expression Parameters

Before I go much further, let's take a moment to understand the various ways the MIDI spec conveys expressive data, and how that MIDI data relates to a breath sensor.

Note On Velocity

With a velocity-sensitive keyboard, the MIDI note on velocity describes how hard the key on the keyboard was pressed (strictly speaking, it's the speed with which the finger moves the piano key through its range of motion - that's why it's called "velocity" and not "force"). For percussive instruments like drums or keyboards, it usually controls the overall volume of the ADSR envelope. Note on velocity values range from 1 to 127, with 1 representing the lightest possible touch, and 127 representing the maximum force (a note on with a velocity of 0 is conventionally treated as a note off).

For a wind controller, it's not so clear. When a wind player articulates a note, the note starts from zero breath and increases at some rate controlled by the player. In the case where the player is using a hard articulation (that is, one with a fast rise time), it may make sense to try to map the rise time of the sensor to a note on velocity. This can be helpful if you are trying to play a patch that behaves like a percussive or plucked instrument. We'll implement this in the first sketch in this series, and see how it works out. As a hint, it's kind of a dead-end if we really want to build an instrument that responds like a real-world wind instrument, but the concepts here are generally useful, plus if you really do want to play a piano or drum patch, it'll help you play that patch with more expression.

There is also a corresponding note off velocity. I don't think this is widely used in synth patches, but it could be used to control the duration of the release (R) phase of the ADSR envelope. We won't implement it in our instrument's code, to keep things simple.


Aftertouch

In addition to detecting how hard a key was initially pressed, many MIDI keyboards can detect how hard a key is being pressed while it is being held down, and transmit that data via MIDI aftertouch messages. With many synth patches, pressing harder on the key engages an effect such as vibrato, much as the modulation wheel does.

With our wind controller, it's pretty simple to send aftertouch data that corresponds to the breath values. So, if a patch is configured to respond to aftertouch, we should be able to make it respond to how hard we blow after starting the note. We'll implement that in our sketch and see how it affects a few different synthesizer patches.

Continuous Controllers

We saw how continuous controllers work in a previous post. They are MIDI messages that aren't associated with any particular note. There are many different continuous controllers defined (here's a complete list).  Some synthesizers, and especially software-based synthesizers, allow you to route any continuous controller to any synthesizer parameter. When we get to the post about designing synth patches for wind control, we'll see how powerful this capability is.

Mapping Breath to Note On Velocity

Here's a sketch that will allow you to play MIDI patches that respond to note on velocity. Most piano, drum, and guitar patches respond to velocity information.

The basic approach is to take a reading from the breath sensor at the time the note begins, map it to a value in the range 0 to 127 (the minimum and maximum allowable MIDI note on velocity values), and send that value with the note on event. We can use the Arduino map() function to do that math, like we did in our continuous controller sketch.

The only slightly tricky thing is that, when the breath starts, it takes a few milliseconds to build to its final value, so if we use the first value we read, it won't be as "loud" as the player intended. To get around that, we actually wait a little while after we see the breath go above the note on threshold value, and then re-sample the pressure sensor. The second sample is the one we map to the velocity.

The other concept in this sketch that you may not be familiar with is the use of a very simple finite state machine. While it may sound complicated, it's actually pretty simple. At any time, the sketch is in one of three states:

  • Note Off State - no note is sounding
  • Rise Time State - the performer has started to blow into the sensor, and we're waiting a bit for the pressure to rise to its final value
  • Note On State - a note is sounding
For each state in a finite state machine, you also have to know what inputs will move you from that state to some other state. One way to do that is via a table:

Current State   Input                                 Next State
NOTE_OFF        Breath value goes above threshold     RISE_TIME
RISE_TIME       RISE_TIME milliseconds have elapsed   NOTE_ON
RISE_TIME       Breath value goes below threshold     NOTE_OFF
NOTE_ON         Breath value goes below threshold     NOTE_OFF

Another way to visualize a state machine is with a state graph. The states are shown as ovals, and the transitions between states are the edges (curved arrows) in the graph. Note that two of the states have transitions that point to themselves (I didn't include those in the table above). That's perfectly OK and shows that some inputs might not cause a transition out of the current state.

Here's the sketch: 

#define MIDI_CHANNEL 1

// The threshold level for sending a note on event. If the
// sensor is producing a level above this, we should be sounding
// a note. (Adjust to suit your sensor.)
#define NOTE_ON_THRESHOLD 80

// The maximum raw pressure value you can generate by
// blowing into the tube.
#define MAX_PRESSURE 500

// The three states of our state machine

// No note is sounding
#define NOTE_OFF 1

// We've observed a transition from below to above the
// threshold value. We wait a while to see how fast the
// breath velocity is increasing. This value also serves as
// the number of milliseconds we wait before taking the
// second sample.
#define RISE_TIME 10

// A note is sounding
#define NOTE_ON 3

// The five notes, from which we choose one at random
unsigned int notes[5] = {60, 62, 65, 67, 69};

// We keep track of which note is sounding, so we know
// which note to turn off when breath stops.
int noteSounding;
// The value read from the sensor
int sensorValue;
// The state of our state machine
int state;
// The time that we noticed the breath off -> on transition
unsigned long breath_on_time = 0L;
// The breath value at the time we observed the transition
int initial_breath_value;

void setup() {
  state = NOTE_OFF;  // initialize state machine
}

int get_note() {
  // random(min, max) returns min..max-1, so this picks one
  // of the five notes
  return notes[random(0, 5)];
}

int get_velocity(int initial, int final, unsigned long time_delta) {
  // For now, just map the second breath sample to the full
  // velocity range; initial and time_delta are unused.
  return map(final, NOTE_ON_THRESHOLD, MAX_PRESSURE, 0, 127);
}

void loop() {
  // read the input on analog pin 0
  sensorValue = analogRead(A0);
  if (state == NOTE_OFF) {
    if (sensorValue > NOTE_ON_THRESHOLD) {
      // Value has risen above threshold. Move to the RISE_TIME
      // state. Record time and initial breath value.
      breath_on_time = millis();
      initial_breath_value = sensorValue;
      state = RISE_TIME;  // Go to next state
    }
  } else if (state == RISE_TIME) {
    if (sensorValue > NOTE_ON_THRESHOLD) {
      // Has enough time passed for us to collect our second
      // sample?
      if (millis() - breath_on_time > RISE_TIME) {
        // Yes, so calculate MIDI note and velocity, then send
        // a note on event
        noteSounding = get_note();
        int velocity = get_velocity(initial_breath_value, sensorValue, RISE_TIME);
        usbMIDI.sendNoteOn(noteSounding, velocity, MIDI_CHANNEL);
        state = NOTE_ON;
      }
    } else {
      // Value fell below threshold before RISE_TIME passed. Return to
      // NOTE_OFF state (e.g. we're ignoring a short blip of breath)
      state = NOTE_OFF;
    }
  } else if (state == NOTE_ON) {
    if (sensorValue < NOTE_ON_THRESHOLD) {
      // Value has fallen below threshold - turn the note off
      usbMIDI.sendNoteOff(noteSounding, 100, MIDI_CHANNEL);
      state = NOTE_OFF;
    }
  }
}
Here's how it sounds. I play four notes quietly, four notes loud, four more notes quietly, and then do four notes of increasing volume (a crescendo, in musical terms), and eight notes of decreasing volume (a diminuendo). The sketch is randomly picking which notes to play, but the attack velocity of each note is under my control.

Looking at the MIDI data, the velocity values range from a low of about 4 to a maximum of about 100, so I could lower the MAX_PRESSURE value in the sketch to fix that. But you get the idea.

Here's another rhythmic motif in 9/8 that I played on a different wind controller, a Yamaha WX-7, that also maps breath to note on velocity. Because of this mapping, I'm able to accent the 1st, 4th, 6th, and 8th notes of each group of nine, resulting in a 3 + 2 + 2 + 2 pattern.

In part two, I'll add MIDI aftertouch to this sketch.
