Over the last 6 months or so I’ve been working on a BeagleBoard-based robot. The motivation for this was to build a general robotics platform to try out some ideas I have on Simultaneous Localization and Mapping (SLAM) and robust sensor fusion. Here’s the result so far:
As you can probably tell from the photo, I’m a computer programmer, not a mechanical engineer!
The main features are:
- 2-channel DC motor controller, based on the L298.
- 2 Pololu quadrature encoders.
- 2-channel servo controller, based on the ATTiny2313.
- An SRF08 ultrasonic range finder.
- Logitech C600 webcam, streaming raw YUYV frames to a PC.
- 3-axis digital compass, based on the HMC5883.
- 3-axis accelerometer, based on the ADXL345.
- 3-axis gyroscope, based on the ITG-3200.
- Asus N10 802.11n wireless network adapter.
- Dual 5V switched-mode voltage regulators, based on the TPS5430. Each supply is powered by two 2000mAh LiPo batteries.
- Dual LiPo battery chargers (allowing the robot to charge its batteries and power itself simultaneously), based on the BQ24123.
- Dual coulomb counters (for monitoring battery usage), based on the DS2782.
Here’s the overall architecture of the system:
The robot is controlled from a WPF application running on a laptop, which communicates with the robot over an 802.11n wireless network. Here’s a screenshot of the app:
Most of the peripherals on the robot are accessed over the I2C bus, via a C program called i2cproxy. i2cproxy runs on the BeagleBoard and listens on a given port. It responds to simple text commands, for example, ‘get 30 10’ (get the value at i2c address 30, register 10), or ‘set 30 10 2’ (set address 30, register 10 to the value 2). It also supports burst reads (reading from multiple registers in a single I2C transaction), and automatic polling of I2C registers. The source code for i2cproxy is available here.
The WPF application on the laptop communicates with i2cproxy on the BeagleBoard via the I2CBus class. Here’s some sample code using this class (this code runs on the laptop, and accesses the I2C bus on the BeagleBoard):
// Open the channel.
var bus = new I2CBus();
bus.CommandPort = 2000;
bus.PollPort = 2001;
bus.Connect();

// Get the value at address 30, register 10.
var value = bus.Get(30, 10);

// Set the value of address 30, register 10, to value 2.
bus.Set(30, 10, 2);

// Poll address 30, registers 10-15, every 1000ms.
bus.AddPoll(1000, 30, 10, 6, MyPollCallback, null);
The BeagleBoard itself runs Ubuntu 11 with a patched 3.1.0 kernel. The root file system was generated with rootstock. I had previously been using Angstrom, however I ran into driver and network issues which were mostly resolved when I switched to Ubuntu.
mjpg-streamer lets you stream video from a UVC webcam as an MJPEG sequence over HTTP (MJPEG is essentially a sequence of JPEG images with the JPEG DHT segment omitted). It’s relatively small, so you can build it on the BeagleBoard itself and avoid the hassles of cross-compilation (though you will need your kernel headers and the libjpeg8 package). Here’s an example command line:
./mjpg_streamer -i plugins/input_uvc/input_uvc.so -o "plugins/output_http/output_http.so -p 5000"
You can view the resulting video stream in Chrome or Firefox by typing:
http://192.168.0.70:5000/?action=stream
into your address bar (obviously, change the IP to match your board).
gstreamer is incredibly flexible, and lets you do almost anything, provided you can work out the appropriate command-line incantation. Here’s a command line which transmits JPEG encoded frames over TCP/IP:
gst-launch v4l2src ! video/x-raw-yuv,width=320,height=240,framerate=\(fraction\)5/1 ! ffmpegcolorspace ! jpegenc ! multipartmux ! tcpserversink port=5000
This requires the ‘gstreamer0.10-plugins-good’ package. You can view the video stream on another machine using VLC media player: open VLC, go to Media->Open Network Stream, type in ‘tcp://192.168.0.70:5000’ (using your board’s IP), and you should be up and running.
Lossless Video Streaming
I’m planning to run the video stream through a set of image processing algorithms on the laptop. For this to work well, the transmitted frames can’t contain the compression artifacts introduced by lossy codecs like MJPEG or MPEG, as these are likely to confuse the image processing. I need either a lossless codec, or to transmit the raw frames.
I tried a few lossless codecs without much success:
- I tried PNG encoding frames in gstreamer, however I couldn’t get this to transmit a stream of images (it seems to stall the gstreamer pipeline after encoding a single image).
- The ffmpeg gstreamer package has a lossless video codec, ffenc_ffv1. Unfortunately this completely saturated the BeagleBoard’s CPU (see here for a review of other lossless video codecs).
- I also tried JPEG encoding the frames with maximum quality, which doesn’t produce any human visible artifacts (though they still may be visible to the image processing algorithms). This results in CPU usage of around 55% with a 320×200 image at 15fps, which is still too high.
Lossless video compression tends not to compress particularly well anyway (perhaps a ratio of 2:1), so I ended up writing a small C application, uvcstreamer, which simply transmits the raw image frames over TCP/IP. My webcam (a Logitech C600) natively outputs frames in YUYV pixel format, which down-samples the chroma channels and so reduces the frame size by a third compared to a full-resolution format. Here’s the source code for uvcstreamer.
The mainboard is responsible for supplying regulated power to the BeagleBoard, motors and expansion boards, level converting the BeagleBoard’s 1.8V I2C bus to 3.3V and 5V, and managing battery charging. The board is home-made using the photo-resist method.
Initially I used a single power supply for the BeagleBoard and DC motors, however the DC motors drew too much current on start-up, occasionally dropping the system voltage to the point where the BeagleBoard would reset, hence the dual power supply setup.
As described here, I’ve used switched mode regulators, rather than simple linear regulators. Switched mode regulators have a lot more external components, but can function at 95% efficiency, which makes a huge difference to battery life and heat output (a linear regulator would get 60% efficiency in the same situation).
The mainboard also includes two BQ24123 charger ICs (one for each channel). These can supply up to 2A, so they’re able to supply enough current to simultaneously charge the batteries and run the robot. Unfortunately these ICs come in a QFN package which is a bastard to solder.
The microcontroller in the center of the board is used to manage the charging, and expose the current charge state over the I2C bus. A couple of DS2782 coulomb counter ICs are also used to keep track of the voltage, current and charge remaining. The eventual plan is to have the robot autonomously find a charging station and charge itself when the battery is running low (note the ‘eventual’).
The motor controller is pretty simple: it uses an ATTiny2313 microcontroller to generate a PWM signal which is used to drive the L298 IC. The L298 contains two H-Bridges which drive the motors. The microcontroller exposes the motor state and speed through a set of I2C registers.
The microcontroller uses a modified version of Donald Blake’s TWI/I2C code (I’ve modified the original code to be register-based, and to support burst-reading). The modified I2C slave code is here.
One motor on each side of the robot is also equipped with a Pololu quadrature encoder for keeping track of wheel rotations. The output from the quadrature encoders is fed into the microcontroller and exposed over I2C (though I haven’t got the code for this running yet).
Electrical noise from the motors was initially a big problem, causing the microcontrollers to occasionally spontaneously reset. This was fixed by adding some capacitors across the motor terminals, as described here.
Motor Controller Source Code (An AVR Studio/GCC project)
Servos are easy to control with a microcontroller – have a look at this page for a quick description of the type of control signals a servo is expecting. The ATTiny2313 has a 16 bit timer with 2 output compare units (one for each servo) which makes generating the appropriate signals pretty easy (it can be done completely in hardware). The servos are controlled by a set of I2C registers.
Servo Controller Source Code (An AVR Studio/GCC project)
IMU Expansion Board
I’m still working on the code to sample, filter, and integrate the various values. I’ve got a prototype up and running which uses a dedicated interrupt line for each IC to notify the BeagleBoard when a new sample is ready to be read (this should give more efficient and accurate sampling than polling). The interrupt is passed through to userspace on the BeagleBoard via the gpio_keys driver. Using expansion header pins as interrupts also requires changing the mux settings. I’ll post this code when it’s up and running.
Thanks for reading!