Programming Quadcopters, Part I

Video of my presentation for the Madison Perl Mongers about programming quadcopters. Part II will be given for the Milwaukee Perl Mongers on March 16th.

The quality isn’t up to snuff. I grabbed an action cam with a fisheye lens and a microphone that isn’t meant for this sort of thing. The brightness of the projector washes out the image of most of the slides. I’m putting it up for posterity, but I think I’ll redo it at home so people who weren’t there can tell what the hell is going on.

Low Latency FPV Streaming with the Raspberry Pi

One of the biggest challenges in running my local quadcopter racing group has been overlapping video channels. The manual for the transmitters lists a few dozen channels, but in practice, only four or five can be used at the same time without interference.

The transmitters being used right now are analog. Digital video streams could make much more efficient use of the spectrum, but digital can introduce latency. Lately, I’ve been noticing a few posts around /r/raspberry_pi asking how to do an FPV stream with an RPi, and since I’ve been experimenting along those lines, it seemed like a good time to share my progress.

It’s tempting to jump right into HD resolutions. Forget about it; it’s too many pixels. Fortunately, since we’re comparing to analog FPV gear, we don’t need that many pixels to be competitive. The Fatshark Dominator V3s are only 800×600, and that’s a $600 set of goggles.

You’ll want to disable wireless power management. Left on, it tends to take the wireless interface up and down a lot, introducing a delay each time. It’s not saving you much power, either: an RPi draws maybe 5 watts, while on a 250-sized quadcopter, each motor can easily draw 100 watts or more. So shut it off by adding this to /etc/network/interfaces:
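
    # One line is all it takes; strictly speaking it belongs under the
    # iface stanza for your wireless interface (wlan0 on stock Raspbian):
    wireless-power off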

And reboot. That should take care of that. Check the output of iwconfig to be sure. You should see a line that says “Power Management:off”.
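
For instance, assuming the interface is wlan0:

    $ iwconfig wlan0 | grep 'Power Management'
              Power Management:off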

You’ll want to install GStreamer 1.0 with the rpicamsrc plugin. This lets you take images directly off the RPi camera module, without using shell pipes to feed raspivid into GStreamer, which would introduce extra lag.

With GStreamer and its myriad plugins installed, you can start this up on the machine that will show the video:
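
    # A minimal receiver; avdec_h264 comes from the gst-libav plugins,
    # so swap in whatever h.264 decoder your install actually has.
    gst-launch-1.0 udpsrc port=5000 \
        caps='application/x-rtp,media=video,clock-rate=90000,encoding-name=H264,payload=96' \
        ! rtph264depay ! avdec_h264 ! autovideosink sync=false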

This will listen on UDP port 5000, waiting for an RTP-wrapped h.264 stream to come in, and then automatically display it by whatever means works for your system.

Now start this on the RPi:
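
    # Roughly what I’m running; 192.168.1.100 is a placeholder.
    gst-launch-1.0 rpicamsrc bitrate=2000000 ! \
        video/x-h264,width=640,height=480,framerate=30/1 ! \
        h264parse ! rtph264pay config-interval=1 pt=96 ! \
        udpsink host=192.168.1.100 port=5000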

Modify that last line to have the IP address of the machine that’s set to display the stream. This starts grabbing 640×480 frames off the camera with h.264 encoding, wraps them up in RTP, and sends them out.

On a wireless network with decent signal and OK ping times (80ms average over 100 pings), I measured about 100ms of video lag. I measured that by displaying a stopwatch on my screen, pointing the camera at it, and taking a screenshot:

[Image: screenshot measuring RPi camera stream latency against a stopwatch]

This was using an RPi 2, decoding on a fairly modest AMD A8-6410 laptop.

I’d like to tweak that down to the 50–75ms range. If you’re willing to give up some security, you could probably bring the lag down a bit by using an open WiFi network, since WPA adds a little per-packet overhead.

I’ll be putting together some estimates of bandwidth usage in another post, but suffice it to say, a 640×480@30fps stream comes in under 2Mbps with decent quality. There will be some overhead on top of that for frame and protocol headers, but even allowing that a 54Mbps wireless link only delivers 20–25Mbps of real throughput, that suggests it will take over 10 people no problem, and that’s on just one WiFi channel.

New Module: Device::Spektrum — Control Common RC Quadcopter Flight Controllers

I’ve been wanting to put a Raspberry Pi on a quadcopter ever since I realized that the AR.Drone was too limited and poorly implemented.

It’s a bad idea to run everything off a Raspberry Pi running Linux. The flight control loop makes fine PID-related adjustments every fraction of a second, and Linux could delay the process at the wrong moment and cause you to crash (in the physical sense as well as the computational sense). So you need a flight controller like the Naze32, MultiWii, or AeroQuad that takes care of that stuff on a dedicated processor.
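
To give a sense of what the flight controller is doing, here’s a toy PID update in Perl. This is illustrative only, nothing airworthy; real firmware runs a loop like this per axis at hundreds of Hz, with gains tuned to the airframe:

    use strict;
    use warnings;

    # Toy PID update: one control axis, called every $dt seconds.
    sub pid_update {
        my ($pid, $setpoint, $measured, $dt) = @_;
        my $error = $setpoint - $measured;
        $pid->{integral}  += $error * $dt;
        my $derivative     = ($error - $pid->{prev_error}) / $dt;
        $pid->{prev_error} = $error;

        return $pid->{kp} * $error
             + $pid->{ki} * $pid->{integral}
             + $pid->{kd} * $derivative;
    }

    my %roll = ( kp => 4.0, ki => 0.02, kd => 18, integral => 0, prev_error => 0 );
    my $current_roll = 2.5;    # degrees off level, from the IMU
    my $correction   = pid_update( \%roll, 0, $current_roll, 0.002 );    # 500Hz loop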

How to communicate with a flight controller? The typical quadcopter sends PPM or Combined PPM to the FC. PPM sends pulses of a specific length, with one wire for each channel. Combined PPM is the same idea, but sends all the channels together and only needs one wire total. This is the oldest, most analog way of sending RC signals, dating from the days of brushed motors, NiCad batteries, and 27MHz radios with ten-foot antennas.

PPM/CPPM won’t work from a Raspberry Pi running Linux either, for the same reason as above: a delay in the timing of the signal throws the whole system off. It has been done on Arduino and Teensy, which don’t have a scheduler getting in the way.

The modern way is to use a digital serial protocol. One of these protocols is called Spektrum, and it’s implemented on the Naze32 FC, among others.

I’ve implemented the Spektrum protocol in Perl with Device::Spektrum. It should be hitting the CPAN mirrors shortly.
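
The wire format itself is pleasantly simple: a 16-byte frame every 22ms over a 115200 8N1 serial line, two header bytes followed by seven big-endian channel words. Here’s a rough sketch of packing one frame by hand, assuming the 10-bit DSM2-style framing; the module’s POD is the authority on the actual API:

    use strict;
    use warnings;

    # Rough sketch of hand-packing one frame. Assumes the 10-bit,
    # DSM2-style framing: channel ID in the upper bits of each word.
    my @channels = ( 170, 512, 512, 512, 512, 512, 512 );   # low throttle, centered sticks

    my $frame = pack 'C2', 0x00, 0x00;   # two header bytes
    for my $ch ( 0 .. $#channels ) {
        $frame .= pack 'n', ( $ch << 10 ) | ( $channels[$ch] & 0x3FF );
    }
    # Write $frame to the UART at 115200 8N1, once every 22ms.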

Newwwwwww QuadCopter!

My AeroQuad kit came yesterday. It’s a Cyclone frame with an AeroQuad32 control board. I spent most of the day at The Bodgery building it, and got most of the way through the frame build:

[Image: half-finished AeroQuad build]

The plan is to put a Raspberry Pi on board for control over WiFi using UAV::Pilot, plus 1080p video.

Along those lines, I’m going back to the old WumpusRover code and getting the video streaming up to snuff. It’s ported to GStreamer1 now (instead of the original GStreamer CPAN module, which compiled against the old gst 0.10). One major item: on a new client connection, wait for an h.264 keyframe to come down the wire before sending the client anything. There are flags on the GstBuffer object for exactly that, but they’re implemented as C macros, which causes some trouble for the Glib introspection-based bindings. I’ll have to write some XS code in GStreamer1 to support it.
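
For reference, the check itself is trivial on the C side; the trouble is purely that it’s a macro, invisible to introspection. The XS helper only needs to expose something like:

    #include <gst/gst.h>

    /* A buffer holds a keyframe when the DELTA_UNIT flag is NOT set.
     * GST_BUFFER_FLAG_IS_SET is a C macro, which is why the
     * introspection-based bindings can't see it directly. */
    static gboolean
    buffer_is_keyframe( GstBuffer *buf )
    {
        return ! GST_BUFFER_FLAG_IS_SET( buf, GST_BUFFER_FLAG_DELTA_UNIT );
    }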

(I feel like a bit of an idiot having my name on the GStreamer1 module on CPAN. With the Glib introspection bindings doing a whole lot of magic, I really have no idea how it works. I was just the one who stepped up to the plate to get it done. If someone came to me directly asking for help, I’d probably have no clue.)

Building OpenTX on Gentoo

I just got a Turnigy 9XR radio for a new quadcopter. I had been thinking about the Parrot Bebop, but I decided that I wanted a grown-up quad, so I got the AeroQuad Cyclone kit instead.

Now, the reason I bought the Turnigy 9XR is that it has an ATmega on board with fully customizable firmware. It even has the standard Atmel ISP port for programming.

I set up my laptop to build one of the major FOSS firmwares out there, OpenTX. Building the firmware itself went fine. The tricky part was building Companion, which is basically a big GUI wrapper around avrdude for burning the firmware. I hit a build error that I couldn’t quite figure out at first.

This turned out to be because my Gentoo system was set up with python3.3 as the default, while the Companion build scripts are still on python2.7. I still had a build of 2.7 available, though, so it was an easy matter of switching:
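
    # eselect manages Gentoo's default interpreter symlink:
    eselect python list
    eselect python set python2.7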

Then follow the rest of the Companion build instructions, and switch it back to python3.3 when you’re done.

Bunch of UAV::Pilot Updates on CPAN

UAV::Pilot::Video::Ffmpeg v0.2, UAV::Pilot::WumpusRover v0.2, and UAV::Pilot::WumpusRover::Server v0.2

These modules just got some Yak Shaving done. Fixed up their CHANGELOGs, spammed license headers on everything, and added a test for POD errors.

UAV::Pilot v0.10

Fixed a regression (and added a test case) in FileDump taking a filehandle.

UAV::Pilot::Command will now call uav_module_quit() on the implementation libraries for cleanup purposes.

Same Yak Shaving as above.

UAV::Pilot::ARDrone v0.2

Added bin/ardrone_display_video.pl for processing the video data. The data can come either over STDIN or through a raw filehandle, implemented in a way that should work on both Unixy operating systems and Windows.

Added UAV::Pilot::ARDrone::Video::Stream and UAV::Pilot::ARDrone::Video::Fileno to support the above.

In the uav shell for the ARDrone, there is a parameter you can pass to start_video. Calling it without a parameter handles the video stream in an external process via the fileno() method, which keeps the stream latency down. Calling it with 1 will instead use a pipe, which has a small but often noticeable lag in the video output. Calling it with 2 uses the old-fashioned way, which does not open an external process at all. Using an external process tends to take better advantage of multicore CPUs.
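
In other words, something like this from the uav shell:

    start_video;      # external process via fileno(): lowest latency
    start_video 1;    # external process fed over a pipe: slight lag
    start_video 2;    # handled in-process, no external process spawned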

Nav data now correctly parses roll/pitch/yaw. The SDL output was updated to accommodate the corrected values.

Same Yak Shaving as above.

Well, I Feel Stupid

Given that the AR.Drone’s control system takes roll/pitch/yaw parameters as floats between -1.0 and 1.0, I assumed the navdata it sends back used the same range. I was never getting the output I expected in the SDL nav window, though.

Then last night, I compared how NodeCopter parses the same nav packets and saw completely different numbers. It turns out the AR.Drone sends back millidegrees instead, so a roll reading of 45000 means 45 degrees.

I feel dumb for not noticing this before. OTOH, this is only “documented” in the AR.Drone source code. The actual docs tell you to look at the navdata in Wireshark and figure out the rest for yourself.

Corrections are now up on github. A new release of the UAV::Pilot distros should be coming shortly. These releases clean up quite a few details that I wanted to get done before YAPC, so we should be in good shape.

Hopefully, the TSA won’t bother me too much with an AR.Drone in my carry-on. Some people at my local hackerspace managed to get a whole Power Wheels Racer, including the battery, into their carry-on, so I think I’ll be good.