I recently switched gears on my AMUP project so that I could focus on getting the controller to work with Traktor Duo. The video above is the culmination of my work from the past two weeks. I am happy to report that I have been able to get these two to play together very nicely.
Most of my time has been focused on testing different control layouts so that I could determine what functionality was most important, and how best to group the various controls. For the most part, I found Traktor’s mapping functionality to be pretty good – it ultimately let me do almost everything I wanted to. The issue is that the interface that you need to use to create the mappings is rather cumbersome.
To help you understand how AMUP actually works, here is a description of the mapping of Traktor functionality to AMUP switches, buttons and knobs.
Over the last couple of weeks, while I worked to get AMUP air working with Traktor Duo, I decided to create a generic midi map for AMUP that I could use with Traktor, Ableton and any other midi-controlled application.
The controls from the console send midi messages on channel 16, while the button and air panels send messages on channels 0 and 1. The led buttons have 5 different states – they are off when in state 0, and they have a different color for each of the other states.
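To illustrate the button-state scheme, here is a minimal Python sketch of a 5-state button. The state count comes from the description above, but the specific colors are placeholders, not the actual AMUP palette.

```python
# Illustrative sketch of a multi-state led button: state 0 is off,
# and each of the other states maps to a distinct color.
# These color names are placeholders, not the real AMUP colors.

STATE_COLORS = {0: "off", 1: "red", 2: "green", 3: "blue", 4: "amber"}

class MultiStateButton:
    def __init__(self, states=5):
        self.states = states
        self.state = 0            # state 0 = led off

    def press(self):
        """Each press advances to the next state, wrapping back to 0."""
        self.state = (self.state + 1) % self.states
        return STATE_COLORS[self.state]
```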
Here is a chart with a detailed overview of the midi mapping for all AMUP air switches and sensors.
In preparation for finalizing the code for each AMUP component I have developed the midi maps below. These maps assign every button, switch, and sensor to an action and, where appropriate, to a midi message.
For the monome I am using the standard midi message set-up from MonomeSerial. Under this configuration the monome outputs midi note on and off messages on channel 1. Notes 1 through 64 are assigned to the monome buttons.
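For illustration, here is a small Python sketch of how notes 1 through 64 could map to positions on the 8 x 8 grid. The row-major ordering is my assumption; MonomeSerial's actual layout may differ.

```python
# Sketch of converting MonomeSerial midi notes (1-64) to 8x8 grid
# coordinates and back. Row-major ordering is an assumption.

def note_to_grid(note):
    """Convert a midi note (1-64) to zero-indexed (row, col)."""
    if not 1 <= note <= 64:
        raise ValueError("monome notes run from 1 to 64")
    return divmod(note - 1, 8)    # (row, col)

def grid_to_note(row, col):
    """Convert zero-indexed (row, col) back to a midi note."""
    return row * 8 + col + 1
```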
Though this may be a bad decision, I have decided to completely disregard existing midi protocols. I am using control messages across channels one through four to control tracks one through four. The control messages within each channel are grouped by function:
mixer functions: control numbers 1 – 10;
device functions: control numbers 11 – 20;
clip functions: control numbers 21 – 30;
multi-function: control numbers 31 – 40.
Transport control messages are sent on channel 16, using control numbers 1 – 10. Below are detailed midi maps for the AMUP components and the monome controller.
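The grouping scheme above can be sketched as a small decoder. The labels are taken straight from the list, and the return format is illustrative.

```python
# Sketch of the control-message scheme described above: channels 1-4
# carry track controls grouped by control number, and channel 16
# carries transport controls. Labels only; actions are app-specific.

GROUPS = [(range(1, 11), "mixer"), (range(11, 21), "device"),
          (range(21, 31), "clip"), (range(31, 41), "multi-function")]

def decode_cc(channel, cc_number):
    """Return (target, function group) for a control change message."""
    if channel == 16 and 1 <= cc_number <= 10:
        return ("transport", "transport")
    if 1 <= channel <= 4:
        for cc_range, group in GROUPS:
            if cc_number in cc_range:
                return (f"track {channel}", group)
    return (None, None)    # unmapped message
```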
To support the new modular design of AMUP I needed to find an efficient way to handle various types of switches across multiple different components. With this in mind, I decided to create a library for Arduino that features 9 separate classes that are able to handle different types of inputs such as switches, buttons, rotary encoders, potentiometers and proximity sensors.
One benefit of this library is that all input types can be integrated into your code using the same design pattern. Debouncing and smoothing functionality is also built into these classes to help filter out noise.
Here is an image that shows the class hierarchy for the AMUP Input Library, followed by a short description of each class. The code for this library is available here on github. Feel free to use and change this code, but at your own risk.
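The library itself is written in C++ for Arduino; as a language-agnostic illustration, here is a Python sketch of a typical debounce algorithm of the kind these classes implement. The 20 ms threshold is an assumption, not the library's actual value.

```python
# Sketch of debounce logic for a digital switch: a raw reading only
# becomes the reported state after it has been stable for a minimum
# interval. The 20 ms default is illustrative.

class DebouncedSwitch:
    def __init__(self, debounce_ms=20):
        self.debounce_ms = debounce_ms
        self.stable_state = 0
        self.last_reading = 0
        self.last_change_time = 0

    def update(self, reading, now_ms):
        """Feed a raw reading; returns the debounced state."""
        if reading != self.last_reading:
            self.last_change_time = now_ms   # input changed; restart timer
            self.last_reading = reading
        elif now_ms - self.last_change_time >= self.debounce_ms:
            self.stable_state = reading      # held long enough to trust
        return self.stable_state
```

A bouncing contact that flips several times within the debounce window never reaches the reported state, which is the whole point of the filter.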
AMUP air is a DJ MIDI controller for Ableton Live. The way in which the switches and sensors from this controller are mapped to functionality within Live is a core element of the user experience. This mapping must feel intuitive for this device to deliver on my vision.
Therefore, before finalizing the Arduino sketches for each AMUP component, and in preparation for developing the Live MIDI script, I needed to think through the functionality that should be assigned to each and every switch and button (all 108 of them, including the monome).
There were several lessons from my experience using the original version of AMUP that I wanted to address in this prototype. First off, I wanted to add direct feedback on the air panel to support the control of volume using the proximity sensor. Secondly, I realized that it was best to use a single scene selection controller for all tracks since this is not a track-specific function. Lastly, it became clear that I needed access to transport controls such as coarse and fine tempo, tempo nudge, and clip launch quantization.
I also wanted to expand the functionality provided by AMUP in several ways. First off, I added three potentiometers to support direct control of Live parameters such as the effects send for each track/channel. Next, I integrated support for multi-modal functionality using button pads with RGB leds. This enables the potentiometers to control multiple different sets of parameters within Live, such as mixer and device settings associated with a specific channel. Lastly, I integrated a monome into my AMUP set-up to enable me to control clip launch, track monitor and stop controls, and more.
Here is an overview of the functionality controlled by each component within the AMUP set-up. In the next couple of days I will create a post with the full MIDI map for this device.
The main console was designed to control several transport functions from Live, including clip launch quantization and tempo controls. It also features controls for the assignment of control pads one and two to one of the four tracks in Live.
The button pad was designed to control various parameters associated with the mixer, tracks, devices and clip loops. This component supports multi-modal functionality that enables the potentiometers to toggle between controlling mixer-related parameters such as eq and effects send, and device-related parameters that vary by device.
The air panel uses a proximity sensor to control Mixer functions such as the volume of an assigned track. In the future this sensor may also feature multi-modal functionality similar to the potentiometer on the button pad, so that it can control device functions.
The monome is the latest addition to the AMUP set-up. It is used to handle several transport functions, such as scene and track scrolling; and mixer functions, such as track monitoring. It also enables direct control of clips within a moveable 4 x 7 clip slot matrix.
To create AMUP air I redesigned the circuit from scratch with three goals in mind. I wanted to fix all the bugs from the initial design to make the system reliable enough for live performance; to add support for functionality such as direct feedback on the device itself; and, to make the system more modular and extensible, so that I could change the components in a set-up, and even create new ones.
Before getting into the schematics of each individual component, I am going to provide an overview of the architecture for the entire circuit – this is definitely the first time I have been able to honestly use the word architecture to describe one of my circuits. Here is an image of the circuit architecture for the AMUP air:
In order to make the system more modular and expandable I gave each component its own Arduino. Based on size and price considerations I decided to use a set of bootloaded atmega328 microcontrollers rather than full-featured Arduino protoboards, except for the main console, which features an ArduinoMega 2560. It’s easy to make your own Arduino protoboard; this tutorial from the Arduino website shows how.
The process of learning how to bootload the chips was rather painful until I figured out how to solve the issue that was driving me crazy. Now I can set-up a personal bootloading factory. Soon I will create an arduino bootloading tutorial with tips gained from my personal suffering (at which point I will add a link here).
In order for this new approach to work I had to find a way to connect the three existing AMUP air components: the main console, button pads, and air sensors. I decided to connect the air sensors to the button pads using serial, since only one air sensor is connected to each button pad. For the connection between the button pads and the main console I chose i2c because it enables multiple components to be connected using the same two pins on the Arduino.
For comparison, above is a diagram that illustrates the circuit architecture of the AMUP classic. As you can see, the classic used a single ArduinoMega2560, which was directly connected to all the components using four 16-channel multiplexers.
To enable AMUP air to provide direct feedback I decided to use rgb leds – the button panel features 8 rgb leds, while the air sensor panel features 10 rgb leds. To drive such a large number of rgb leds from a single Arduino you can either create an led matrix, use a demultiplexer, or use an led driver. I decided to go with the last approach because it provides the most control and frees up processing power on the Arduino for other tasks.
After testing several different components I came across Texas Instruments’ tlc5940. These led drivers are awesome. Each one provides 16 dimmable pins and connects to Arduinos using a daisy-chainable serial interface. An arduino library exists that makes it really easy to integrate these chips into your code. I bought a bunch of extra ones to play around with on other projects. Here is a link to more information about using the tlc5940 with Arduinos.
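The tlc5940 works with 12-bit grayscale values (0 to 4095) per channel, so 8-bit color components need to be rescaled before they are sent to the driver. Here is a Python sketch of that conversion; the assignment of r, g, b to consecutive driver channels is my assumption for illustration.

```python
# Sketch of preparing rgb values for a 12-bit led driver such as the
# tlc5940, which expects grayscale values from 0 to 4095 per channel.
# Mapping r, g, b to consecutive channels is an assumed layout.

MAX_12BIT = 4095

def to_12bit(value_8bit):
    """Scale an 8-bit color component (0-255) to the 12-bit range."""
    return value_8bit * MAX_12BIT // 255

def rgb_channels(led_index, r, g, b):
    """Return {driver_channel: 12-bit value} for one rgb led."""
    base = led_index * 3
    return {base: to_12bit(r), base + 1: to_12bit(g), base + 2: to_12bit(b)}
```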
In order to design circuit boards for the button panel and air (prox) panel, I needed to create a full schematic of each circuit. Creating a full schematic using an application such as Eagle CAD or Fritzing is a really great way to plan out your circuit. In the past I had used hand-scribbled schematics, which are fine for simple projects, but for big projects these software packages make a big difference.
If you are interested in learning how to use Eagle CAD to build schematics for your own project I strongly recommend Sparkfun’s Eagle schematic tutorial. This tutorial guided my own schematic design work for this project.
Here is a slide show with the schematics for the button panel and air sensor panel. Note that multiple schematics refer to each one of these components.
The process for developing the final schematics, which were used as the basis for the printed circuit boards, took several rounds of prototyping, testing and revisions. It is important to test your entire schematic before you finalize the design of your pcb; and even so, you will likely discover issues with your first draft. I have already done one round of revisions to the pcbs and the button board still contains a few minor issues.
Printed Circuit Boards
My decision to integrate printed circuit boards into this project was driven by several considerations. I wanted to make the circuit more robust and clean by reducing the use of wires; I wanted to minimize the footprint of the circuit; and, I wanted to continue to develop my pcb design skills.
I have fabricated two different sets of protoboards for AMUP air. The two boards pictured above are from the first set. Before ordering the second set of boards I made sure to set up the initial version, making all of the required fixes to get them working properly. To get both boards working I had to scratch off a few short circuits and use wire to connect a few components that were left stranded.
Once the first prototypes were working, I updated the pcb design files and sent the second prototypes out for production. I haven’t yet assembled the new prototypes but I already know that some of the issues from the first boards have persisted.
I have limited experience working with printed circuit boards. Before AMUP air, I had only designed and fabricated a circuit board for my Emote project. To help me develop, fabricate and test my pcbs I relied on guidance from Sparkfun’s Eagle CAD pcb design tutorials. These tutorials were invaluable.
I also used Sparkfun’s SMD soldering tutorials to help me learn how to solder small surface-mounted components onto the PCB. These tutorials helped me finally learn to use desoldering braid to remove excess solder from existing joints.
Over the past several weeks I have gone completely silent. Though ITP has finished, this is not a sign of me taking time off. Rather this is a result of a sprint to finish prototype 2.0 of my AMUP project (dubbed AMUP air) before a party at Lauren’s school, at which I was scheduled to dj. This party came and went last week. Unfortunately, I was not able to finish the prototype on time – I am actually still working on it.
In this post I will provide a brief overview of the main updates, along with a sneak peek at how it is all coming together. To refresh your memory about AMUP, here is a link to previous journal entries that describe the original vision for this project and the development process for the first prototype.
The first version of this project had not come close to delivering on my vision, though it did provide me with inspiration to continue my pursuit. Its main flaws included buggy circuitry, which made it unreliable for live performances, and a lack of feedback on the device itself, which made it difficult to control certain parameters without looking at the screen. The physical design of the device felt clunky as well.
For version 2.0 I wanted to fix all the issues mentioned above as well as make numerous other improvements, such as adding new buttons and switches; making the hardware more extensible and robust; developing a more modular software architecture for the Arduino; and integrating deeper into Live’s functionality using the python API.
Here is a brief description of how I have attacked all of these updates over the past month:
Fixed bugs in the circuitry by rebuilding the schematic from the bottom up. To improve reliability I designed printed circuit boards for the button pad and air sensor components, and integrated new components such as LED drivers, multiplexers and capacitors.
Added support for direct feedback on the physical device itself using RGB led lights. On the air sensor, these leds provide direct feedback regarding channel volume, while on the button pad they enable the buttons to support multiple states.
Reconfigured the switches, buttons and potentiometers in response to lessons from the first prototype. Added inputs to control quantization, tempo, and effects send, along with support for multi-state functionality.
Designed a more modular and extensible architecture for the hardware. Each button and air sensor panel combination functions as a standalone unit that can connect to the main console. Theoretically I can connect over one hundred button pad/air sensor components to the main console, though due to latency considerations the practical limit is much lower (I assume around 4 or 5).
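As a rough sanity check on that limit, here is a back-of-envelope sketch in Python. Every number here (bus speed, bytes per poll, target scan rate) is an assumption for illustration, not a measurement from AMUP.

```python
# Back-of-envelope sketch of why latency caps the practical number of
# panels on a shared i2c bus. Assumed numbers: a 100 kHz standard-mode
# bus, ~9 bus clocks per byte (8 data bits + ack), ~8 bytes exchanged
# per panel per poll, and a 200 Hz scan rate for a responsive feel.

BUS_HZ = 100_000
CLOCKS_PER_BYTE = 9
BYTES_PER_POLL = 8
TARGET_SCANS_PER_SEC = 200

def max_panels():
    """Panels the bus can poll while sustaining the target scan rate."""
    seconds_per_poll = BYTES_PER_POLL * CLOCKS_PER_BYTE / BUS_HZ
    polls_per_second = 1 / seconds_per_poll
    return int(polls_per_second / TARGET_SCANS_PER_SEC)
```

Under these assumptions the bus tops out around 6 panels; once start/stop conditions and processing overhead are added, the practical number drops toward the 4-or-5 estimate above.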
Working on enabling deeper integration between AMUP and Live by using Live’s Remote MIDI scripts. These python scripts enable you to access a wide range of Live’s functionality that is otherwise unavailable.
Over the coming weeks I will provide more in-depth overviews about how I’ve addressed all of the areas above. I will also share several tutorials related to skills I had to acquire while working on this project:
Bootloading AVR chips with Arduino
Writing Ableton MIDI Remote Scripts in python
To support this project I’ve developed a set of related libraries that handle digital and analog switches by providing services such as: smoothing data from analog switches; debouncing input from digital switches; making it easy to set up and read rotary encoders; and supporting multi-state RGB buttons. All of the components in this library adhere to the same design pattern, which will hopefully make them easier to use. I will write up some additional documentation in a future post, but for now you can find the library on my github page here.
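As an illustration of the smoothing service, here is a Python sketch of an exponential moving average, one common way to smooth noisy analog readings. The library itself is C++ for Arduino, and its exact algorithm and parameters may differ; the alpha value here is illustrative.

```python
# Sketch of smoothing for analog inputs such as potentiometers and
# proximity sensors: an exponential moving average, where alpha trades
# responsiveness against noise rejection. Alpha here is illustrative.

class SmoothedAnalogInput:
    def __init__(self, alpha=0.25):
        self.alpha = alpha
        self.value = None

    def update(self, raw):
        """Blend each raw reading into the running average."""
        if self.value is None:
            self.value = float(raw)    # seed with the first reading
        else:
            self.value += self.alpha * (raw - self.value)
        return self.value
```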
It has been a long time since I updated my journal with documentation regarding Air-Mashup, the new DJ tool I designed as part of NIME. The development process for this project was challenging, sometimes frustrating, and ultimately very fulfilling. It was by far the most complex physical computing device I have ever built. It forced me to stretch my abilities in realms of physical prototyping and software development.
Today I will focus on the iterative prototyping process I used to develop this project. Between early October and mid-December I developed 5 different sets of physical prototypes and too many iterations of the code to count. Here is an overview of all my prototypes along with an outline of the software development challenges I encountered.
The first prototype focused on testing the design of the structure that holds the proximity sensor and was constructed out of hand-cut pieces of foam core. It featured the proximity sensor and a laser light. In this phase of the design process I was able to identify the appropriate height for the “barrier” that ensures one’s hands do not come too close to the proximity sensor. This is important because the sensor’s effective range starts a few inches away from the sensor itself.
The second prototype focused on testing the design of the control panel that enables users to control clips, loops, headphone monitoring, and filter sends. This prototype was developed using laser-cut pieces of foam core. This phase of the design process enabled me to identify several issues with the placement of the buttons. It also led me to re-evaluate the overall design of the piece, since I was unhappy with the large and clunky feel of this box.
The third prototype was an utter failure. My focus at this stage was on testing a new approach for the design of the control panel and the encasing for the proximity sensor. I decided to create this prototype using wood since I had a plank of basswood lying around. This was a mistake: I realized that it is always better to continue prototyping with easier-to-handle materials until the design is more developed. This piece went straight from the laser cutter to the garbage.
With the fourth prototype I was able to get back on track. My focus at this stage was still on testing a new approach for the design of the control panel and the encasing for the proximity sensor. This prototype was constructed out of laser-cut watercolor board. This material is a cardboard-like paper that is thicker and harder than foam core. This build helped me identify the final tweaks I needed to incorporate into the design of the control panel. It also inspired me to merge all of the control panels into a single unit while separating out the components that hold each individual proximity sensor.
The final prototype is far more complex than any of my previous builds. I went from working on a single control panel/proximity sensor combo to building out four integrated control panel/proximity sensor components. At this point in time I needed to finalize the physical prototype so that I could start preparing for the NIME performance. It took me several weeks to build this prototype: I had to order over 100 components and solder them (and often re-solder them) together to build the circuits; then I had to laser cut over 50 pieces of wood and plexi to construct the boxes. As can be expected, it took me much longer to build this prototype than any of the others. Here is a slideshow featuring pictures of the final prototype and the build process.
Over the past two weeks I have made a lot of progress on the Air Mashup project. First off, the project name has evolved (or devolved) into AM-UP – ok, this is really minor news. The real progress has been made in testing and setting up the proximity sensors, which will make up a core part of the interaction experience.
In this journal entry I will first discuss the process I used to select and test different types of sensors. Then I will write a post about the development of my first and second generation prototypes, and discuss some of the main challenges I encountered.
Selecting the Right Sensor
I looked into three different types of range finders. The main considerations I had in mind for these sensors were: (1) resolution and range; (2) beam width and interference; and (3) price and value.
The first sensors I considered were the Maxbotix Ultrasonic Range Finders, in the LV and XL ranges. Though these sensors tend to be very reliable and easy to use, they did not offer the right resolution and value. The LV range has a resolution of 1-inch, which is just not enough, while the XL range has a resolution that is twice that, but they cost $60 apiece. I also had some concerns regarding interference between range finders, since I am planning to space them out by only 10 to 12 inches.
The next sensors on my list were from Sharp’s family of infrared proximity sensors. These sensors feature the shortest range in the bunch but they also provide the best resolution. I looked at two models: the long range model, which can measure from 15 to 150 cm, and the mid range model, which can measure from 8 to 80 cm. These sensors also provide a narrow beam width and they offer a good price-to-value ratio.
The last sensors I looked into were the Parallax Ping Range Finders. These range finders are much more complex to use than the previous two. They require that the sensor send out a signal and then listen for a “ping” to come back in order to measure distances. These range finders are similar in price and technology to the Maxbotix. Therefore, they suffer from the same value and beam width issues.
After running these initial tests I decided that Sharp’s proximity sensors were the best solution for AM-UP. I went ahead and purchased seven of these sensors and shifted my efforts to developing prototypes.
Prototype One: Testing Dynamic Range
My first prototype was little more than a wooden board with three separate proximity sensors attached at varying distances. The purpose of this first test was to identify interference issues and to map out the dynamic range of the sensor. Here are some of the key findings from this initial prototype:
the sensor’s beam width is very narrow. This means that each sensor only needs to be separated by about 11-inches to avoid interference. This also means that there is a small sweet spot where a user’s hand must be placed for the sensor to work.
the dynamic range of the sensor is sufficient but there are some important characteristics that need to be addressed in the code. First, if the user’s hand gets too close to the sensor (within 15cm) then the readings become unstable and unreliable. Second, noise occasionally creeps into the data and needs to be filtered out.
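Here is one way these two issues could be handled in code, sketched in Python. The 15 cm cutoff comes from the finding above, while the window size is an assumption.

```python
# Sketch of cleaning up proximity sensor readings: values closer than
# ~15 cm are treated as invalid (the unstable near zone), and a small
# median window filters out occasional noise spikes.

MIN_VALID_CM = 15

def filter_reading(history, raw_cm, window=5):
    """Append a raw distance reading to history and return a cleaned
    value, or None if the hand is too close to read reliably."""
    if raw_cm < MIN_VALID_CM:
        return None                     # inside the unstable near zone
    history.append(raw_cm)
    recent = sorted(history[-window:])
    return recent[len(recent) // 2]     # median rejects lone spikes
```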
Over the past week I have started prototyping my Air Mashup (AM-UP) instrument. My first step was to purchase a few different varieties of infrared proximity sensors to identify the type that will work best. Based on my research I ruled out ultrasonic rangefinders because their beam is too wide, and would suffer from interference in the set-up that I envision.
I tested the sensors to understand two important characteristics of how they function: (1) how far apart do I need to place these sensors in order to avoid interference – this will determine how close I can place the sensors within my sensor array; (2) what is the sweet spot from a range perspective for each sensor model – this will determine the distance from the sensor that my gestures need to be confined to.
So here’s what I found out: I identified the best infrared sensor model for my instruments and I discovered that the distance between each sensor in the array needs to be roughly 12-inches and the sweet spot for each sensor is between 5 – 45 inches.
Here is an overview of the next steps I need to take:
write arduino code that is able to recognize different gestures that I can use to control the sound. This will determine whether I need to keep my gestures to simple up and down motions, or whether I can use more complex movements.
determine what additional buttons I need to add to the instrument for control of things such as clip selection, and filter/effects routing. I am currently considering using a monome to control the clip selection.
design and build the case/stand for the physical device.
write a max patch (or a max for live patch) that enables me to control the clips using the data that is sent from the sensors, which are hooked up to the arduino.
prepare the music that I want to use for my performance by creating clips and loops for use in ableton. Then play around with the clips to determine how I want to compose my performance (essentially, I need to plan my composition).
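As a starting point for the gesture-recognition step above, here is a Python sketch of the simplest case: classifying up and down motions from a short window of distance readings. The 10 cm threshold is an assumption.

```python
# Sketch of recognizing simple up/down gestures from a stream of
# distance readings: compare the net change across a short window
# against a threshold (larger distance = hand farther from sensor).

THRESHOLD_CM = 10

def classify_gesture(readings):
    """Return 'up', 'down', or 'hold' for a window of distance readings."""
    delta = readings[-1] - readings[0]
    if delta > THRESHOLD_CM:
        return "up"
    if delta < -THRESHOLD_CM:
        return "down"
    return "hold"
```

More complex movements would need richer features (velocity, dwell time), which is exactly what this step of the research is meant to determine.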
At this point most of my research will be accomplished by creating prototypes that are increasingly more refined. I will need to polish my max and ableton skills in order to get the programming work completed. Lastly, I need to listen to the music I will be using as a source to identify the loops I want to create.