Friday, July 12, 2024
My new favorite pressure sensor
Monday, March 8, 2021
Sonifying Air Traffic
A while ago I got a Flightaware Pro Stick Plus - a neat little software-defined radio that lets my Raspberry Pi Zero decode ADS-B transmissions from aircraft. These transmissions broadcast the location and altitude of each aircraft, in addition to other data.
So, I thought, why not turn those transmissions into music?
Code (evolving work, is a bit messy): https://github.com/ggood/adsbTheremin
General idea: Python code reads the stream of aircraft position reports and builds a map of where the aircraft are and their distance and altitude relative to my location. It then maps this information to MIDI note-on/note-off events and sends them to Ableton Live running on my Mac.
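The altitude-to-pitch part of that mapping can be sketched like this (a minimal illustration in C rather than the project's Python; the altitude range and note span here are made-up values, not the ones adsbTheremin actually uses):

```c
#include <stdint.h>

/* Map an altitude in feet to a MIDI note number. Altitudes are
 * clamped to [0, 40000] ft and spread linearly across MIDI notes
 * 36 (C2) to 96 (C7). These bounds are illustrative only. */
uint8_t altitude_to_midi_note(int32_t altitude_ft) {
    const int32_t max_alt = 40000;
    const uint8_t low_note = 36, high_note = 96;
    if (altitude_ft < 0) altitude_ft = 0;
    if (altitude_ft > max_alt) altitude_ft = max_alt;
    return (uint8_t)(low_note + (altitude_ft * (high_note - low_note)) / max_alt);
}
```

With a mapping like this, a Cessna at 1,500 ft and an airliner at 35,000 ft land at opposite ends of the keyboard, which is exactly the effect you hear in the recording.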
Listen to the long notes - lower pitches are aircraft at low altitudes, higher pitches are aircraft at higher altitudes. At about 2:50 you can hear the general aviation aircraft doing touch-and-goes at the Palo Alto Airport (low tones), while the commercial airliners pass overhead, with much higher pitches.
There is a lot of arpeggiation (jumping-around notes) to keep things interesting, but the long notes you hear represent the position and altitude of the aircraft.
Friday, July 8, 2016
My King VOX Ampliphonic Octavoice II
The octave divider, which was part of the Vox guitar amplifier line manufactured by the Thomas Organ Company, allowed a trumpet player to lower the pitch of his or her instrument by one or two octaves. It was called the Octavoice, and you can read more about it here:
http://www.voxshowroom.com/us/amp/octavoice.html
To use the Octavoice, the trumpet player drilled a hole in the mouthpiece, and screwed in an adapter that allowed the Octavoice's transducer fitting to be attached to the mouthpiece. Since I'm a trombonist, this picture shows the fitting on a trombone mouthpiece.
And when the transducer is attached to the mouthpiece, it looks like this.
This device was produced in the late 1960s or early 1970s, so there is no digital signal processing going on. Inside the box are a bunch of transistors, resistors, capacitors, and inductors - not even any IC op amps. I'm a little stymied as to how this box could produce an octave shift down, but I am a computer engineer, not an electrical engineer. Here are some photos of the inside of the box.
Whether or not I understand how it works, it does work. I used it to record the "tuba" track for an entry in the contest to be the song on the flexi-disc included with the Billy And The Boingers Bootleg collection of Bloom County comics by Berkeley Breathed. We didn't win, since we weren't so heavy metal, but I still do sort of like the song we made.
So, for DBaylies: if anyone can offer suggestions on how to build a similar mouthpiece-attached transducer, post here. The goal is to get an audio signal into the input of an analog-to-digital converter for processing in the digital domain.
Thursday, April 21, 2016
Building a Woodwind Controller Using the MPR121 Touch Sensor
The Akai EWI uses a set of touch-sensitive metal keys:
A key is actuated simply by touching it. The instrument detects the change in capacitance that results when your body touches the key. For more information on how this works, see https://en.wikipedia.org/wiki/Capacitive_sensing. And for some videos of people playing the EWI, look on YouTube. I especially like this one of the late Michael Brecker: https://www.youtube.com/watch?v=tPUBp9uTLIw
Can I Do This?
Is there a way to inexpensively build this into our own wind controllers? The answer is definitely yes.
NXP Semiconductors manufactures a very low-cost chip, the MPR121 ($1.95 in single quantities), that provides 12 separate touch inputs.
And Adafruit Industries has a nice breakout board that breaks out the tiny pins on the chip and includes some support circuitry that allows use with either 3.3V or 5V controllers.
Sparkfun also has a breakout board for the MPR121. It doesn't include the level shifters to allow 3.3/5v operation, but is less expensive.
I won't write a tutorial on how to wire up the sensor and use it, since the Adafruit tutorial does a great job of that. Instead, I'll show how we can incorporate it into a woodwind-style controller.
One thing that may occur to you is that there are only 12 inputs, but typical woodwinds have more than 12 keys. Is there a way to use more than one touch sensor chip in an instrument? Yes, there is. The MPR121 is an I2C device, which means you can attach more than one to the same two-wire communication bus, as long as they have different addresses. Both the Adafruit and Sparkfun breakouts include a way to set the address (up to 4 different addresses can be set), so it is possible to do touch-sensing on up to 48 keys. That should be enough.
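To give the idea, here's a minimal sketch of combining two sensors' readings into one key bitmap (plain C, with the two 12-bit values passed in as plain integers rather than read over I2C):

```c
#include <stdint.h>

/* Pack two 12-bit MPR121 readings into one 24-bit key bitmap.
 * The sensor at the first I2C address supplies bits 0-11, and
 * the sensor at the second address supplies bits 12-23. */
uint32_t pack_keys(uint16_t sensor_a, uint16_t sensor_b) {
    return ((uint32_t)(sensor_b & 0x0FFF) << 12) | (sensor_a & 0x0FFF);
}
```

The rest of the note-mapping code can then treat the combined value as a single bitmask, just as it does with one sensor.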
Modifying get_note() For Touch Sensing
In my post, Note Selection Basics, I defined a function named get_note() that reads the switches simulating our "trumpet valves" and returns the MIDI note to play. Let's replace that with a new function that reads the MPR121 and detects which of the 12 inputs are being touched.

First, a little background. Adafruit provides a library for the MPR121 that makes configuring and reading it very simple. To initialize the chip:
touchSensor.begin(0x5A);
(0x5A is the default I2C address of the Adafruit breakout)
Then, to read all of the pins:
uint16_t touchValue = touchSensor.touched();
This will return a 16-bit value where each of the lower 12 bits is 1 if the corresponding key is touched and 0 if it is not. Which bit corresponds to which key depends on how I wired the brass washers to the touch sensor input pins. In my case, the octave keys are mapped to bits 0 and 1, and the rest of the keys are:
LH index finger: bit 2
LH ring finger: bit 3
LH middle finger: bit 4
(bit 5 not used - it's an extra key on my instrument not currently used)
RH index finger: bit 6
RH ring finger: bit 7
RH middle finger: bit 8
RH pinkie: bit 9
(bits 10, 11 not used)
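With that wiring, checking an individual key is just a bit test on the value returned by touched(). A small sketch (the enum names are my own labels for the bit assignments listed above):

```c
#include <stdint.h>
#include <stdbool.h>

/* Bit positions for my wiring, per the list above. */
enum {
    KEY_OCT_1 = 0, KEY_OCT_2 = 1,
    KEY_LH_INDEX = 2, KEY_LH_RING = 3, KEY_LH_MIDDLE = 4,
    KEY_RH_INDEX = 6, KEY_RH_RING = 7, KEY_RH_MIDDLE = 8,
    KEY_RH_PINKIE = 9
};

/* True if the given key's bit is set in the value from touched(). */
bool key_touched(uint16_t touch_value, uint8_t key_bit) {
    return (touch_value >> key_bit) & 1;
}
```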
Table Lookup for Note Mapping
As I mentioned in the Note Selection Basics post, a C switch statement will start to get pretty ugly for an instrument with all these keys, so let's look into a table lookup approach. We'll build a table that has, in one column, the expected bit values read from the touched() function of the MPR121, and in the next column, the MIDI note to send.

In C, we can define a structure that holds one row of that table like this:
struct fmap_entry {
  uint16_t keys;
  uint8_t midi_note;
};
This is a chunk of memory that can hold a 16-bit value (named "keys", which will hold a bitmap of key values), and an 8-bit MIDI note value.
Next, we'll build an array of these:
#define FMAP_SIZE 33
struct fmap_entry fmap[FMAP_SIZE];
This defines an array (or table, if you prefer) with 33 rows of the structure we defined above.
To map from a fingering to a MIDI note, we start at the beginning of this table and check to see if the fingering we just read from the sensors matches the value in the "keys" field. If it does, we've found the MIDI note and we're done. Otherwise, we skip to the next entry, and so on. In programming, this is called a linear search.
We actually need to initialize the "fmap" array with all of the key and note values. That's very verbose, so I'll omit it here and will just include it with the full code.
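Just to illustrate the shape of that initialization, here are a few example rows (the fingerings and notes below are invented for illustration and are not the actual 33-row table from the full code):

```c
#include <stdint.h>

struct fmap_entry {
    uint16_t keys;      /* bitmap of touched keys */
    uint8_t midi_note;  /* note to send for that fingering */
};

/* Hypothetical rows: all main keys down plays a low C, and lifting
 * fingers from the bottom up raises the pitch. Bit positions follow
 * my wiring (bits 2-4 left hand, bits 6-9 right hand). */
struct fmap_entry fmap_example[] = {
    { 0b0000001111011100, 60 },  /* all fingers down */
    { 0b0000000111011100, 61 },  /* RH pinkie up */
    { 0b0000000011011100, 62 },  /* RH pinkie and RH middle up */
};
```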
So here's the revised get_note() function. Since my instrument has an extra key I'm not using, there's a line that masks off that key, so even if the player touches it, it won't change the value we read, and we don't need separate table entries for when that key is touched or not.
int get_note() {
  // This routine reads the touch-sensitive keys of the instrument and
  // maps the value read to a MIDI note. We use a lookup table that maps
  // valid combinations of keys to a note. If the lookup fails, this
  // routine returns -1 to indicate that the fingering was not valid.
  int ret = -1;  // Sentinel for unknown fingering
  uint16_t touchValue = touchSensor.touched();
  // Since we're not using the 4th finger of the left hand, mask off that key
  touchValue = touchValue & 0b1111111111011111;
  for (uint8_t i = 0; i < FMAP_SIZE; i++) {
    if (touchValue == fmap[i].keys) {
      ret = fmap[i].midi_note;
      break;
    }
  }
  return ret;
}
The for loop is where we do the linear search. We look at each keys value, and when we find a match, we stop looking and return the corresponding MIDI note.
If we don't find a match, then the player has their fingers in a non-supported position, and our function returns -1, which means to take no action. In other words, we ignore the glitch. That's not the only thing we could do. If, for example, we wanted to try to mimic how a real instrument behaves, we might send a pitch bend message to indicate that the pitch should be a little sharp or flat relative to the normal fingering. There are a lot of possibilities.
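For the curious, a MIDI pitch bend message is three bytes: a status byte (0xE0 plus the channel number), then the 14-bit bend amount split into a 7-bit LSB and MSB, with 8192 meaning no bend. A sketch of packing one:

```c
#include <stdint.h>

/* Build a 3-byte MIDI pitch bend message for the given channel (0-15).
 * bend ranges 0..16383; 8192 means no bend. */
void make_pitch_bend(uint8_t channel, uint16_t bend, uint8_t msg[3]) {
    msg[0] = 0xE0 | (channel & 0x0F);  /* status byte */
    msg[1] = bend & 0x7F;              /* LSB: low 7 bits */
    msg[2] = (bend >> 7) & 0x7F;       /* MSB: high 7 bits */
}
```

A "slightly sharp" fingering could then send a bend value a bit above 8192 along with the nearest normal note.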
Ok, now that we've had a look at some code, is it possible to build something?
A Test Platform
I've built a very hacky prototype woodwind-style controller as a platform to test these ideas. The body of the instrument is a 14" length of PVC plumbing pipe, 2" or so in diameter. I was going for roughly the dimensions of an alto recorder, but chose a thicker tube to make running wires inside a little easier.

To make the touch-sensitive keys, I soldered lengths of wire to ten brass washers:
Then I drilled holes in the tube, ran the wires from the outside to the inside, and hot-glued the washers to the outside of the tube, about where my fingers would fall. I didn't arrange the keys to resemble any particular instrument, although they're somewhat like a recorder layout (albeit with one extra key for the left hand, oops). The Frankenstein-like contraption looks like this:
The white mouthpiece protruding from the top is something I made to use with my Blowchucks Controller for the Electro-Music Festival in 2014. It connects to a tube that runs down the length of the instrument and connects to the Freescale pressure sensor I've been using in all of my wind controller projects.
On the back, I added two octave keys, and strapped all of the electronics near the base. This means I only have a USB cable running from the instrument to my laptop.
In this close up of the electronics, you can see the MPR121 touch sensor breakout on the left. The green and white wires that attach to that board are the wires coming from the brass washers. The board to the right has the Teensy microcontroller mounted on the right and the pressure sensor on the left. You can also see the power and ground (red/black) wires going to the touch sensor, as well as the two wires (yellow/white) that connect the touch sensor to the I2C bus on the microcontroller.
I'll do a proper video later, but for now, here's a quick audio demo of the instrument running the code below. You'll hear some glitches as I'm not a woodwind player and have a hard time getting multiple fingers to touch keys at exactly the same time. We can de-glitch that stuff in code, and I'll work on that for a future post.
The full code is posted below, and is also available at https://github.com/ggood/NoteSelectionTutorialRecorder
Wednesday, April 20, 2016
Note Selection - How fast do we need to be?
A Different Approach
As you probably noticed, I was lucky enough to have a guest post from Johan Berglund, who showed how to use a simple algorithm that reads all of the switches (keys) of the electronic woodwind and adds or subtracts semitones from the "center" MIDI note that the instrument produces with no keys pressed.

After reading Johan's post, I thought some of you might be interested in how important it is to have a fast algorithm for mapping key events to MIDI note on/note off events. For the impatient: it doesn't matter very much.
Johan's Algorithm
With no keys pressed, Johan's instrument will play MIDI note 61, which is a C#4 (one half-step above middle C on a piano). Pressing the left-hand index finger switch will lower the pitch by 2 semitones, which means his instrument will produce MIDI note 59, or a B3 (one half-step below middle C). When more than one key is pressed, all that's needed is to sum the effects of each key, although there are a few exceptions where several keys interact, so some logical AND and NOT operations are needed.

My Lookup Table Approach
By contrast, my approach uses a lookup table. I scan all of the switches and put the value of each switch (on/off) into a bit array. So if only the left-hand index finger of Johan's instrument is pressed, my bit value would look like 00100000 00000000. If all 14 keys of Johan's instrument are pressed, the bit value would look like 00111111 11111111. The leftmost 2 bits are always zero because we're packing 14 key switch values into a 16-bit number.

I maintain a table that has two columns: a fingering bitmask value and a MIDI note, like this:
Bitmask MIDI note
...
00100000 00000000 59
...
To figure out which note our instrument should be playing, we read all of the switches and pack them into a bitmask, then start at the beginning of this table and compare the current switch on/off values with the bitmask in the table. When we find a match, we know which MIDI note to produce.
Now, if we naively create a table that has every possible combination of those 14 switches, we'll end up with 2^14 slots, or 16,384 slots. Since the bitmask is 2 bytes (16 bits) and the MIDI note is 1 byte (8 bits), the table will occupy at least 3 * 16,384 = 49,152 bytes, or 48 kbytes. That's way more RAM than the Arduino Uno has (2 kbytes), so we need to be smarter here.
One trick is to realize that not all fingering combinations need to be considered valid. For example, no saxophone player will use a fingering that includes the ring finger of the left hand and the pinky of the right hand (unless they're doing some sort of weird extended technique). So we can get rid of the majority of the entries in our table because they'll never be used. So, really, we only need as many entries in the table as there are notes that our instrument can produce, plus any alternate fingerings (two ways of playing the same note). Since Johan's instrument produces all 12 semitones of the chromatic scale across two octaves, if we conservatively allow for 3 alternate fingerings for each of those 24 notes, we only need 72 table entries for a total size of 216 bytes. That's more like it!
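The arithmetic above is easy to double-check in code (a trivial helper; the 3 bytes per entry assumes the AVR's unpadded struct layout, since a desktop compiler may round the struct up to 4 bytes for alignment):

```c
#include <stdint.h>

/* Bytes needed for a fingering table of n entries, counting the
 * 2-byte bitmask plus 1-byte MIDI note per entry, with no padding
 * (which is how the AVR lays out this struct). */
uint32_t table_bytes(uint32_t entries) {
    return entries * (2 + 1);
}
```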
Let's Race!
Now that we have a compact way to represent all the fingering-to-MIDI note mappings, let's think for a bit about efficiency. In a drag race to convert a fingering to a MIDI note, will Johan's code or my code be faster?

(Caveat: it's been a long time since I took my computer architecture class, so I may bork some of this!)
The Atmel AVR chip in the Arduino Uno has the following timing characteristics, according to https://en.wikipedia.org/wiki/Atmel_AVR_instruction_set:
- Arithmetic operations take one clock cycle, except for multiplication, which takes two cycles
- Reading data from memory takes two clock cycles
Looking at Johan's note-calculation statement, I count:
- 22 memory accesses (44 clock cycles)
- 5 multiplications (10 clock cycles)
- 22 arithmetic or logical operations (22 clock cycles)
My code's performance will depend on which fingering is selected. If we get lucky and the fingering the player is using is the first entry in our lookup table, we'll only need to do one comparison. If we're unlucky, we'll need to look through all of the table. Let's consider the worst case, because no musician wants their instrument to slow down when she plays certain notes.
So let's assume we have to look through all 72 entries. Each table lookup involves:
- 2 memory accesses, to bring the two bytes of the mask into registers (4 clock cycles total)
- 2 clock cycles to compare those bytes
So, worst case, the table lookup is 72 * (4 + 2) = 432 clock cycles
Checkered flag to Johan!
So, clearly, Johan's code is more efficient. But how much does it really matter? Let's look at the clock frequency of the ATmega328 chip found in the Arduino Uno: 16 MHz.

That means each clock cycle takes 1 / (16 * 10^6) seconds, or about 0.06 microseconds. Both of our algorithms will execute in well under 30 microseconds (0.00003 seconds), which is really, really fast compared to the 0.1 second delay that a human can perceive.
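The cycle-to-time conversion is a one-liner worth sketching (assuming the Uno's 16 MHz clock; for example, the 76 total cycles counted above for Johan's statement come out to under 5 microseconds):

```c
/* Microseconds taken by a given cycle count on a 16 MHz clock:
 * one cycle is 1/16 of a microsecond. */
double cycles_to_us(unsigned long cycles) {
    return cycles / 16.0;
}
```

Either way, both algorithms finish thousands of times faster than the roughly 100,000 microseconds of latency a human can notice.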
Understandability Wins
The take-away here is to realize that, in many or most cases where you're reading a sensor in your electronic musical instrument project and then doing some computation, you will almost never need to worry about the efficiency of your algorithm*. Use the algorithm that is the simplest to implement and that makes sense to you.

Hope this was helpful! In my next post I'll show how to use a cool capacitive touch sensor you can buy for $8 to make an instrument like the Akai EWI.
- Gordon
*Computer science nerds: Yes, I'm intentionally omitting algorithms that have quadratic behavior here. On microcontrollers, even if the algorithm is quadratic, it's hard to make n very big, given typical microcontroller memory sizes.