Sunday, May 19, 2013

Dancing Lego and Five Servos

I have always wanted to control something interesting involving little servo motors. That's somewhat of an ill-defined dream, isn't it? But it has all been made possible by using my handy dandy Raspberry Pi computer. I spent quite a bit of time this past year on another project that involved reading signals from almost a dozen temperature-humidity sensors and reporting status details to a remote web server. That project is now unfortunately stuck on the backburner, but at least it got me started doing interesting things with my Raspberry Pi.

In this post I report on my current side project, which involves my Raspberry Pi, five servos, my son's Lego blocks, and synchronized groovy music. Let's get right to it: the final result is a video hosted on my YouTube channel. I recommend watching it full screen on a device with decent speakers. Enjoy!

What follows next are some interesting details of the work that went into this project. These are organized in approximate chronological order, and do not cover the time I spent goofing around on ideas that ultimately led nowhere.

Servo Controller

I initially tried to control my servos directly from the Raspberry Pi through its GPIO ports using the RPIO library, which has built-in support for software PWM. In my first implementation I saw excessive jitter in the resulting servo motion. This may simply have been a problem with my code. I bailed on that approach and switched over to the nifty yet inexpensive 16-channel, 12-bit hardware PWM controller board from Adafruit. It's based on the PCA9685 chip from NXP Semiconductors. If you click on the image, it should take you directly to Adafruit's product page. This servo controller board has far more potential than I used here. Multiple boards may be daisy-chained together and used to control a whole bunch of servos simultaneously over a single I2C channel: as many as 62 boards, for nearly a thousand servos! Yikes! I used only one board and five servos on this project.

Moving to a hardware-based controller was a great move for me since it offered the chance to learn more about I2C serial communication. Adafruit's Python library also made it very easy to start talking to the controller board with a minimal amount of code. The first time through, it may take a bit of playing around with the system configuration on the Raspberry Pi to enable the I2C Broadcom module. Let Google be your friend.
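To give a flavor of what "a minimal amount of code" means here, below is a sketch of the sort of helper I used: it converts a servo angle into a 12-bit "tick" count for the PCA9685. The 60 Hz update rate and the 1.0–2.0 ms pulse range are assumptions you would tune for your particular servos, and the commented-out `Adafruit_PWM_Servo_Driver` calls follow Adafruit's 2013-era Python library.

```python
PWM_FREQ_HZ = 60        # servo update rate (assumed)
TICKS_PER_CYCLE = 4096  # 12-bit resolution of the PCA9685
MIN_PULSE_MS = 1.0      # assumed pulse width at one end of travel
MAX_PULSE_MS = 2.0      # assumed pulse width at the other end

def angle_to_ticks(angle_deg):
    """Map an angle in [0, 180] degrees to a PCA9685 tick count."""
    cycle_ms = 1000.0 / PWM_FREQ_HZ
    pulse_ms = MIN_PULSE_MS + (MAX_PULSE_MS - MIN_PULSE_MS) * angle_deg / 180.0
    return int(round(pulse_ms / cycle_ms * TICKS_PER_CYCLE))

# On the Pi itself, this would drive channel 0 of the board:
#   from Adafruit_PWM_Servo_Driver import PWM
#   pwm = PWM(0x40)              # default I2C address of the board
#   pwm.setPWMFreq(PWM_FREQ_HZ)
#   pwm.setPWM(0, 0, angle_to_ticks(90))
```

The conversion itself is pure arithmetic, so it can be tested without any hardware attached.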

Five Servos

I bought three inexpensive servos from Adafruit: two small servos and one bigger servo. I also had two additional small, fancy servos left over from a defunct RC helicopter project. I played with controlling my servos one evening and got my sample code running in no time at all. A servo works by rotating to its commanded position as quickly as it possibly can, then holding that position until the next command is received. When running through a quick sequence of moves, the resulting motion is rather herky-jerky and does not look natural. I wanted something better for my dancing Lego robot project.

My solution was to operate the servo inside an event loop that forced it to follow a controlled path. This control loop repeatedly evaluates a system response function, whose output is the updated position used to command the servo. This function is implemented as an exponential or "first-order" response: \[y(t) = A + B \times \left[1- e^{-\left(t-t_0\right)/\tau}\right].\] This relationship describes the sort of "natural" path I want the servo to follow over time \(t\) in response to an impulse with magnitude \(B\) given at time \(t_0\). Note that \(t \ge t_0\) and \(A\) is the position of the servo before time \(t_0\). The parameter \(\tau\) is the "time constant" and represents the time scale over which the action takes place. Starting at position \(A\) at \(t = t_0\), the servo ends up at position \(A + B\) when \(t - t_0 \gg \tau\).

I implemented this response function on my Raspberry Pi within a threaded Python class that is instantiated for each servo in my project. A given servo is managed independently from the other servos while running in its own background thread. Upon commanding a servo to a new position, the background thread takes care of making sure the servo moves to its target position in a natural, fluid motion. I made a quick video a few days ago to illustrate this point.
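The skeleton of such a class might look like the sketch below. The class and method names here are illustrative rather than the actual project code, and the hardware write is replaced by a plain callable so the sketch can run anywhere:

```python
import math
import threading
import time

class SmoothServo(threading.Thread):
    """Background thread gliding a servo toward its target exponentially."""

    def __init__(self, write_position, tau=0.15, rate_hz=50):
        threading.Thread.__init__(self)
        self.daemon = True
        self.write_position = write_position  # e.g. wraps pwm.setPWM(...)
        self.tau = tau                        # time constant, seconds
        self.period = 1.0 / rate_hz           # update interval
        self.lock = threading.Lock()
        self.position = 0.0
        self.target = 0.0
        self.start_pos = 0.0
        self.t0 = time.time()
        self.running = True

    def command(self, target):
        """Ask the servo to glide toward a new target position."""
        with self.lock:
            self.start_pos = self.position    # impulse starts from here
            self.target = target
            self.t0 = time.time()

    def run(self):
        while self.running:
            with self.lock:
                dt = time.time() - self.t0
                B = self.target - self.start_pos
                self.position = self.start_pos + B * (1.0 - math.exp(-dt / self.tau))
            self.write_position(self.position)
            time.sleep(self.period)
```

Calling `command()` at any moment restarts the exponential from the servo's current position, which is what keeps rapid-fire commands from producing the herky-jerky snap of raw servo moves.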

Stolen Lego

My son has a ton of Lego parts and pieces all over the house and they served as the perfect foundation for a kinematic contraption to be controlled by my servos. The photo below shows a close up of the main unit, which consists of a two degree-of-freedom arm with a little green wheel on the end. The most interesting part of the whole setup is the rubber tank tread used to transfer rotation from the larger servo to the main control arm. Another fun bit is the spring-loaded suspension made from a pair of rubber bands supporting the middle wheel inside the tread.

This tank tread was sourced from my son's precious Lego creation he had put on display at the center of the kitchen table. It was the perfect piece for my Lego robot arm, so I carefully removed the tread and left my son's creation in somewhat of a broken state. I felt guilty about the whole thing, and I confessed to him later in the day when he came home from school. He just looked at me and said "That's OK, Dad." He appeared neither sad nor happy, so I took his statement as full approval of my actions. I still feel a tiny bit of guilt, but that tread remains on my robot to this day!

Synchronized to Music

Controlling my Lego device up to this point was interesting, but something was missing. The music! In the past I have added music to my YouTube videos to make them more fun. If one wishes to share such videos publicly, it's important to think about who owns the rights to the music. The easiest option is to use royalty-free music, and my favorite source is Kevin MacLeod's site at Incompetech.com. He has a huge selection from which to choose, and he makes it very clear how this royalty-free stuff works. The song I used in my video is titled "Rocket".

In order to control my Lego device in sync with my music selection, I needed to extract information from the audio signal and use that as a basis for when and how quickly to command a servo. I initially tried some simple FFT-based frequency analysis methods, but nothing really worked out. Then I came across the free web-based API from The Echo Nest and was stunned at what they offered. Their developer API essentially enables one to build a custom, state-of-the-art music system that understands things like songs, artists, fan favorites, and playlists. The only part I needed for this project is their Track API, which allows one to extract beat and loudness information from an audio file uploaded to their servers. Their client library takes care of all the hard parts, including uploading audio files and retrieving results. That just left me with the task of handling the high-level stuff related to my application.
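Once the analysis comes back, the application-level work amounts to boiling each segment down to a single loudness peak. The sketch below assumes segments arrive as dictionaries with `start`, `loudness_max_time`, and `loudness_max` fields, per the Echo Nest analyze documentation; the sample data is made up for illustration:

```python
def loudness_peaks(segments):
    """Return (absolute_time, loudness_dB) pairs, one peak per segment.

    'loudness_max_time' is relative to the segment start, so the two
    are summed to get a time on the song's own clock.
    """
    return [(seg['start'] + seg['loudness_max_time'], seg['loudness_max'])
            for seg in segments]

# Made-up sample of two segments, loudness in dB:
sample = [{'start': 0.00, 'loudness_max_time': 0.05, 'loudness_max': -12.0},
          {'start': 0.48, 'loudness_max_time': 0.03, 'loudness_max': -7.5}]
```

This list of time/loudness pairs is the entire interface between the music analysis and the servo control described next.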

The chart above shows a couple of seconds of audio from the song Rocket, along with analysis results from Echo Nest. The vertical orange lines indicate the times of every beat. Each beat is made up of one or more segments, and the small orange dots connected by red lines represent the loudness peaks through each segment. The times at which these peaks occur are the basis for when I issue a command to a servo. The loudness of the peak is the basis for how fast the servo moves to its new position.

Each servo is configured to oscillate back and forth between a low value and a high value. The synchronization with music comes about by issuing a command to change direction each time a new loudness peak is observed during a beat or segment. This controls the timing of the motion, but not the magnitude. The magnitude is accounted for in the servo command by weighting the target position (low or high) by the audio loudness. The sequence of time and loudness pairs derived from the music audio file is demultiplexed into multiple streams, one for each servo. The main loop in my application simply processes each loudness peak as it happens, and uses it to command a different servo each time. This is repeated many times each second, and the outcome is an interestingly-choreographed dancing Lego robot.
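The two pieces above, demultiplexing and loudness weighting, can be sketched as follows. The round-robin dealing and the particular loudness range (-30 dB to 0 dB) and position range (0 to 180) are assumptions for illustration, not the project's exact numbers:

```python
def demux_peaks(peaks, num_servos):
    """Deal (time, loudness) pairs out to servos in round-robin order."""
    streams = [[] for _ in range(num_servos)]
    for i, peak in enumerate(peaks):
        streams[i % num_servos].append(peak)
    return streams

def weighted_target(direction, loudness_db, low=0.0, high=180.0,
                    quiet_db=-30.0, loud_db=0.0):
    """Scale the stroke by loudness: quieter peaks make shorter moves.

    direction > 0 swings toward the high end, direction < 0 toward
    the low end, always centered on the midpoint of the range.
    """
    w = (loudness_db - quiet_db) / (loud_db - quiet_db)
    w = max(0.0, min(1.0, w))             # clamp to [0, 1]
    mid = (low + high) / 2.0
    half = (high - low) / 2.0 * w
    return mid + half if direction > 0 else mid - half
```

With the `SmoothServo`-style threads from earlier doing the actual motion, the main loop reduces to: on each peak, flip that servo's direction and call `command(weighted_target(...))`.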

Source Code

All source code for this project is available at my GitHub repository. It does not have a setup.py and thus isn't installable as a package. I mainly used this repository as a playground for the project, and it has evolved quite a bit recently. I have no idea what happens next.

Lego Batman says "Goodbye!"
