Beaglebot – A BeagleBoard based robot

Over the last 6 months or so I’ve been working on a BeagleBoard based robot. The motivation for this was to build a general robotics platform to try out some ideas I have on Simultaneous Localization and Mapping (SLAM) and robust sensor fusion. Here’s the result so far:

As you can probably tell from the photo, I’m a computer programmer, not a mechanical engineer!

The main features are:


Here’s the overall architecture of the system:

The robot is controlled from a WPF application running on a laptop, which communicates with the robot over an 802.11n wireless network.  Here’s a screenshot of the app:

Most of the peripherals on the robot are accessed over the I2C bus, via a C program called i2cproxy. i2cproxy runs on the BeagleBoard and listens on a given port. It responds to simple text commands, for example, ‘get 30 10’ (get the value at i2c address 30, register 10), or ‘set 30 10 2’ (set address 30, register 10 to the value 2). It also supports burst reads (reading from multiple registers in a single I2C transaction), and automatic polling of I2C registers. The source code for i2cproxy is available here.

The WPF application on the laptop communicates with i2cproxy on the BeagleBoard via the I2CBus class. Here’s some sample code using this class (this code runs on the laptop, and accesses the I2C bus on the BeagleBoard):

// Open the channel.
var bus = new I2CBus();
bus.CommandPort = 2000;
bus.PollPort = 2001;

// Get the value at address 30, register 10.
var value = bus.Get(30, 10);

// Set the value of address 30, register 10, to value 2.
bus.Set(30, 10, 2);

// Poll address 30, registers 10-15, every 1000ms.
bus.AddPoll(1000, 30, 10, 6, MyPollCallback, null);

The BeagleBoard itself runs Ubuntu 11 with a patched 3.1.0 kernel. The root file system was generated with rootstock. I had previously been using Angstrom, however I ran into driver and network issues which were mostly resolved when I switched to Ubuntu.

Video Streaming

There are plenty of ways to get a webcam video stream off the BeagleBoard. Two good examples are mjpeg-streamer and gstreamer:

mjpeg-streamer lets you stream video from a UVC webcam as an MJPEG sequence over HTTP (MJPEG is essentially a sequence of JPEG images with the JPEG DHT segment omitted). It's relatively small, so you can build it on the BeagleBoard itself and avoid the hassles of cross-compilation (though you will need your kernel headers and the libjpeg8 package). Here's an example command line:

./mjpg_streamer -i plugins/input_uvc/ -o "plugins/output_http/ -p 5000"

You can view the resulting video stream in Chrome or Firefox by typing a URL of the form http://<beagleboard-ip>:5000/?action=stream into your address bar (substituting your board's IP address).

gstreamer is incredibly flexible, and lets you do almost anything, provided you can work out the appropriate command-line incantation. Here’s a command line which transmits JPEG encoded frames over TCP/IP:

gst-launch v4l2src ! video/x-raw-yuv,width=320,height=240,framerate=\(fraction\)5/1 ! ffmpegcolorspace ! jpegenc ! multipartmux ! tcpserversink port=5000

This requires the ‘gstreamer0.10-plugins-good’ package. You can view the video stream on another machine using VLC media player: open VLC, go to Media->Open Network Stream, type in ‘tcp://<beagleboard-ip>:5000’, and you should be up and running.

Lossless Video Streaming

I’m planning to run the video stream through a set of image processing algorithms on the laptop. To do this effectively the transmitted image frames can’t have compression artifacts introduced by lossy compression codecs, like MJPEG or MPEG, as they’re likely to cause issues with the image processing. I need either a lossless codec, or to transmit the raw frames.

I tried a few lossless codecs without much success:

  • I tried PNG encoding frames in gstreamer, however I couldn’t get this to transmit a stream of images (it seems to stall the gstreamer pipeline after encoding a single image).
  • The ffmpeg gstreamer package has a lossless video codec, ffenc_ffv1. Unfortunately this completely saturated the BeagleBoard’s CPU (see here for a review of other lossless video codecs).
  • I also tried JPEG encoding the frames with maximum quality, which doesn’t produce any human visible artifacts (though they still may be visible to the image processing algorithms). This results in CPU usage of around 55% with a 320×200 image at 15fps, which is still too high.

Lossless video compression tends not to compress particularly well anyway (perhaps a ratio of 2:1) so I ended up writing a small C application, uvcstreamer, which just transmits the raw image frames over TCP/IP. My webcam (a Logitech C600) outputs frames natively in YUYV pixel format which has a down-sampled chroma channel, which reduces the frame size by 25% anyway.  Here’s the source code for uvcstreamer.

The image stream is received on the laptop and converted to a System.Drawing.Bitmap suitable for display in the WPF application by the Camera class.


The mainboard is responsible for supplying regulated power to the BeagleBoard, motors and expansion boards, level converting the BeagleBoard’s 1.8V I2C bus to 3.3V and 5V, and managing battery charging. The board is home-made using the photo-resist method.

Initially I used a single power supply for the BeagleBoard and DC motors; however, the DC motors drew too much current on start-up, occasionally dropping the system voltage to the point where the BeagleBoard would reset, hence the dual power supply setup.

As described here, I’ve used switched mode regulators, rather than simple linear regulators. Switched mode regulators have a lot more external components, but can function at 95% efficiency, which makes a huge difference to battery life and heat output (a linear regulator would get 60% efficiency in the same situation).

The mainboard also includes two BQ24123 charger ICs (one for each channel). These can supply up to 2A, so they’re able to supply enough current to simultaneously charge the batteries and run the robot. Unfortunately these ICs come in a QFN package which is a bastard to solder.

The microcontroller in the center of the board is used to manage the charging, and expose the current charge state over the I2C bus. A couple of DS2782 coulomb counter ICs are also used to keep track of the voltage, current and charge remaining. The eventual plan is to have the robot autonomously find a charging station and charge itself when the battery is running low (note the ‘eventual’).

Motor Controller

The motor controller is pretty simple: it uses an ATTiny2313 microcontroller to generate a PWM signal which is used to drive the L298 IC. The L298 contains two H-Bridges which drive the motors. The microcontroller exposes the motor state and speed through a set of I2C registers.

The microcontroller uses a modified version of Donald Blake’s TWI/I2C code (I’ve modified the original code to be register-based, and to support burst-reading). The modified I2C slave code is here.

One motor on each side of the robot is also equipped with a Pololu quadrature encoder for keeping track of wheel rotations. The output from the quadrature encoders is fed into the microcontroller and exposed over I2C (though I haven’t got the code for this running yet).

Electrical noise from the motors was initially a big problem, causing the microcontrollers to occasionally spontaneously reset. This was fixed by adding some capacitors across the motor terminals, as described here.

Motor Controller Source Code (An AVR Studio/GCC project)

Servo Controller

Servos are easy to control with a microcontroller – have a look at this page for a quick description of the type of control signals a servo is expecting. The ATTiny2313 has a 16 bit timer with 2 output compare units (one for each servo) which makes generating the appropriate signals pretty easy (it can be done completely in hardware). The servos are controlled by a set of I2C registers.

Servo Controller Source Code (An AVR Studio/GCC project)

IMU Expansion Board

This expansion board contains an HMC5882 digital compass, an ADXL345 accelerometer, and an ITG-3200 gyroscope (all three-axis parts). All three ICs use the I2C bus.

I’m still working on the code to sample, filter, and integrate the various values. I’ve got a prototype up and running which uses a dedicated interrupt line for each IC to notify the BeagleBoard when a new sample is ready to be read (this should give more efficient and accurate sampling than polling). The interrupt is passed through to userspace on the BeagleBoard via the gpio_keys driver. Using expansion header pins as interrupts also requires changing the mux settings. I’ll post this code when it’s up and running.

Thanks for reading!


23 Responses to Beaglebot – A BeagleBoard based robot

  1. Matias says:

    Wow, great work!

    Perhaps you can help me a bit. I’m working in a similar project where i use the BeagleBoard to control automatically a quadrotor/quadcopter. You can take a look here:
    I need to send i2c commands to the motors but i’m having some troubles to configure i2c2 port on beagleboard. Can you help me to configure that?
    When i try to write in some slave i get “Write fail”. I’m using i2c-tools available here:

    Best regards,


    • bengalvin says:

      Hi Matias. It could be a lot of things. Some random thoughts: if you’re using an Atmel microcontroller as your I2C slave, make sure your clock speed is at least 8MHz and you’ve cleared the CKDIV8 fuse. Also check that you’ve lowered the I2C bus frequency on the BeagleBoard to 100KHz (it defaults to 400KHz, which is too fast for some microcontrollers). Check you don’t have SDA and SCL reversed, and that you have pull-up resistors on both lines (or that your level converter has them built in). If the slave is not a microcontroller, check the address is correct – some datasheets quote the address including the read/write bit (i2c-tools expects the address without this bit, so you have to right-shift the address by 1). I2C issues can be hard to diagnose without an oscilloscope. I’ve been using a cheap DSO Nano v2 (unfortunately it’s just 1 channel), but have my eye on a QA100.

  2. Ari says:

    Well done… I’m trying out robots on boards now and this would be a great place to start. May I ask how long it took you?

    • bengalvin says:

      Thanks. It’s probably taken me over 6 months, though most of that has been working out how to etch PCBs and solder surface mount reliably. I’ve had some problems with motor noise interfering with the I2C bus with the current design (I have separate power supplies for the motors and digital electronics but a shared ground), so I’m going to rebuild all of the boards with opto-isolators and separately grounded power supplies, which I’m hoping should take a few weekends. Good luck!

  3. Patrick says:

    Hi, I understand that in order to use gstreamer to stream over IP, we need to install gstreamer0.10-plugins-good. However, where should we install it, and where can we find it? I tried to install it on the board itself but couldn’t find the package via opkg… Hopefully you can help me with it. Thanks!

  4. Francesco says:

    I’m using a BeagleBoard-xM with Android 4.0.3 and I’m trying to implement I2C communication and USB webcam image acquisition, so I’m looking at your code and it’s really useful. But I have a problem with the webcam: I know I have to configure the Linux kernel to enable it, but I don’t know which part of the kernel I have to modify. I’ve looked over the internet for some info but haven’t found anything. Did you do something to enable the webcam on the Beagle?

    • bengalvin says:

      Hi Francesco. Unfortunately I don’t know much about Android (I’ve been using Ubuntu and Angstrom). Usually I’d try plugging the webcam into a PC running a full Linux distro and see which module is loaded when it’s plugged in (by looking in the kernel log or using lsmod). Once you know the name of the module you’ll need to find the source and compile it against your target kernel (or if the kernel source already includes it, find out which option in the kernel config enables it). You’ll also need to cross-compile if you’re compiling the code on a machine with a different architecture to the BeagleBoard. I think Android is quite a different beast though, so this procedure may not be appropriate. Good luck!

  5. Tom says:

    Hi, I’ve just installed your webcam streaming program uvcstreamer from the git repository, but I do not know how to read the stream. I have tried typing the IP followed by the port into Chrome, however I just get a terminal response saying ‘requested frame rate x fps is not supported’, and an endless download started in the browser.
    Any help would be greatly appreciated, Thanks,

    • bengalvin says:

      Hi Tom. If you just want to get a video stream off the BeagleBoard it’s probably easier to use gstreamer or mjpeg-streamer, as there are many clients which can decode the resulting streams. The uvcstreamer program I wrote more or less dumps the raw data stream from the camera to the network. This has the advantage that it’s easy on the CPU and that there are no compression artifacts in the image frames (useful if you’re doing image processing), but there are no existing video clients that can decode the stream. I’ve written a decoder in C# which can decode MJPG and YUYV streams:

  6. Tom says:

    Thanks for the quick response!
    I have tried in the past with mjpg-streamer, however I have been unable to compile and install it (I am using a Raspberry Pi, and I have little experience as I am only 15). However, I have managed to get programs such as ‘motion’ streaming the video. Unlike these programs, where I had a delay of at least 2 seconds, your server and client work fantastically, with only a tiny delay! The only other thing I would like to know is exactly how realistic it would be to get this video streaming to a web interface, perhaps using Java, or at least an interface without the other BeagleBot controller parts, as I had to compile your whole program.

    (this is probably a bit of a dumb question)

    Thanks very much for your quick response and help so far,

    • Tom says:

      p.s. Would you mind if I post something about this on the RPi forums, as I think this may help plenty of other people as well!

    • bengalvin says:

      Nice going for 15! If you want to remove the non-camera parts of the UI, edit the MainWindow.xaml file – this determines the layout of the user interface. If you delete controls from the xaml you’ll have to remove the corresponding code from the code-behind MainWindow.xaml.cs file. It shouldn’t be too difficult to get it streaming to a webpage. If you want to keep the code in C# you could look at writing a component in Silverlight – this would let you reuse the existing Camera class. It also should be reasonably straightforward to convert the code to Java if you wanted to use an applet instead. One thing to look out for is that the browser prevents Silverlight and Java applets from opening network connections to different domains from the host page, eg if you served the webpage from http://raspberrypi/tomspage.html, and your Silverlight control attempted to open a connection to tcp://raspberrypi:2000/, the TCP connection would be blocked by the browser’s cross-domain policy. There are workarounds though (google ‘silverlight cross domain’).

      Good luck!

  7. Bernd says:

    Hi, great work !!!! Is it possible to get circuit diagrams as eagle (or something else) files. Or only the diagrams as pic. It would be very helpful for me. Thanks Bernd

    • bengalvin says:

      Hi Bernd. I’ve had some problems in the design above with electrical noise from the motors interfering with the I2C bus, so I probably wouldn’t inflict this design on you 🙂 The noise seems to be partly caused by routing the I2C lines too close to a couple of high dv/dt traces, and by having a shared ground between the digital and motor power supplies. I’ve rebuilt the robot with separate power supplies with no shared ground, and optical isolators for driving the motors and servos. So far this seems to be working well, and hopefully I’ll write this up in a couple of weeks (hopefully with some eagle files).

      • MrRobotic says:

        Hi Ben, thanks for your response. OK, I will try to create my own design 🙂 taking care given your experience with noise and interference. Did you use an EEPROM on your mainboard for identification? However, your concept is really not bad 🙂

  8. Haley says:

    Do you mind if I quote a couple of your articles as long as I provide credit and sources back
    to your webpage? My blog site is in the very same niche as yours and my visitors would
    definitely benefit from a lot of the information you provide here.
    Please let me know if this ok with you. Cheers! regards

  9. now says:

    Is there any reason you used I2C rather than SPI?

    • bengalvin says:

      Hi. I2C requires only 2 signal lines, whereas SPI requires 3 plus additional lines to select the slave device. I have 8 slave devices so I’d need 10 wires for SPI, which is considerably more difficult to route than the 2 required for I2C. Having said that, the bi-directional nature of I2C makes debugging, buffering and level translation more difficult, and I’ve had a lot of problems with I2C bus stability, so I’d still consider SPI for a future project.

      • now says:

        Out of curiosity what problems did you run into when you used i2c.
        And i am fascinated by your project, congrats. I will be following your github and this page.

      • bengalvin says:

        Thanks! I’ve had quite a few issues with the bus locking up, I think because one of the slaves misses a clock signal and keeps holding SDA low. Initially this was due to electrical noise from the DC motors, but using isolated power supplies and minimizing bus length seems to have fixed that. Running the I2C bus through level converters seems to make problems more likely too (some slaves run at 5V, some at 1.8V).
