Category Archives: avr/arduino

Odroid W ADC Fail!

2 days and 5 forum posts later, the picture is clearer. Let's start from the beginning. To simplify, I'll refer to the Raspberry Pi rev. 2 board and ignore rev. 1 and the A+/B+ (** check the notes for details).

The Raspberry Pi has two i2c busses: i2c-0 and i2c-1. The former** is used by the GPU to talk to the camera, while the latter is free for the user to use however she/he wants. Each of the 2 busses can be routed to several physical pins by changing the mode of those pins:

i2c-0:
GPIO 0/1 – ALT0 mode
GPIO 28/29 – ALT0 mode
GPIO 44/45 – ALT1 mode

i2c-1:
GPIO 2/3 – ALT0 mode
GPIO 44/45 – ALT2 mode

Only one of these pairs can be activated at any moment per bus.

The camera is physically connected to GPIO 0/1 **. These pins are set up as inputs by default; the GPU changes them to ALT0 whenever it needs to talk to the camera but _switches them back_ to INPUT immediately after. If you monitor the mode of GPIO 0/1 you'll see that most of the time they are INPUT, with random switches to ALT0 0-30 times per second. The more movement the camera sees, the more it talks to the GPU – it seems to be related to AWB and shutter speed.

So far – no problem. It's pretty clear from the design of the Raspberry Pi that i2c-0 is off limits: there is no way to synchronize the GPU access with the CPU, so i2c-0 cannot be shared between the camera and any other device. If one attempted to use i2c-0 and started 'raspivid -t 0', they would see weird things ranging from i2c errors, the image freezing for seconds, and random noise on the screen, to the board freezing completely.


The OdroidW has a nice PMC 5t619 chip that provides 2 free ADC pins, which I intended to use to monitor silkopter's voltage and current. The PMC uses i2c-0 to talk to the CPU, so care must be taken to somehow synchronize with the GPU. This is what I've been trying to achieve the whole weekend…

– 1st try: after every use of mmal I switched GPIO 0/1 from INPUT back to ALT0 so I could talk to the PMC. It didn't work, as the GPU seems to use the bus even between mmal calls (in hindsight it makes sense: the camera has to inform the GPU about starting and finishing transfers, and the GPU has to set AWB, gains and shutter speeds).

– 2nd try: use a semaphore to trigger the ADC measurement from the mmal callbacks, hoping that right after a callback I'd get a period of silence from the GPU. No such luck.

– 3rd try: give up on the hardware i2c-0 bus and try bitbanging on GPIO 0/1. This seemed like such a nice idea – let the GPU use the hardware i2c-0 bus while I bitbang… It didn't work, because both the camera and the PMC are wired to the GPIO 0/1 pins – they share the same physical pins.

So basically it looks like the OdroidW is not capable of using both the camera and the PMC at the same time, because they share not just the i2c-0 bus but _also the pins_! This could have been easily avoided by putting the PMC on some other GPIOs and bitbanging, or at the very least putting it on i2c-1.

So now I'm back at the drawing board, considering an Arduino Mini board as ADC + PWM generator. Connected through i2c-1, probably.



** Rev. 1 boards have i2c-0 and i2c-1 reversed, so i2c-0 is free while i2c-1 is the camera one.
** The A+ and B+ have the camera using the GPIO 28/29 pins to access i2c-0.

MPU6000 FIFO #2

Just realized a new way to use the fifo to avoid duplicated data.

Instead of routing all the data to the fifo, I will route just gyro.x – so 2 bytes.

Then, when polling the mpu:

1. request the fifo size

2. if it equals the old fifo size, return – no new data is available

3. read all the data from the fifo and discard it

4. read the data from the registers instead of the fifo


This has the advantage of not relying on a certain packet structure in the fifo and is immune to fifo overflow issues.

I will test it on Sunday and see how it works.


The ideal way to read data from the mpu is with an interrupt. I cannot do this, as the interrupt pin is not connected on the crius board. So I'm polling the imu at 1kHz hoping the data is ready, but as it turns out, the avr and imu clocks don't exactly match. For this reason I sometimes get the same sample twice in a row, or skip a sample. This results in errors in the gyro integration – a 90 degree rotation in one axis is interpreted as 89 or 91 degrees, for example.

FIFO to the rescue. The mpu6000/6050 has a built-in fifo that you can interrogate regularly. By polling the fifo counter I know when enough data has arrived and can read it – or, if there's not enough yet, just wait for it.

The problem is that the FIFO – being a First In First Out buffer – can only be read starting from the oldest data. And if the buffer overflows (it holds 1024 bytes), the oldest data gets discarded.

For this reason I have to make sure that the data packet size is a divisor of 1024, so that when a new packet arrives and overflows the 1024 byte buffer, it discards one entire packet and not just a fragment.

My packet is 14 bytes right now – 6 for the accelerometer, 6 for the gyro and 2 for temperature – so whenever there is an overflow I get corrupted data from the fifo, as I'm no longer reading whole packets but some data from one packet and the rest from the next.

My temporary solution is to throw away data from the fifo whenever there’s too much in it – say more than packet size * 10.

Something like:

if (fifo_size > packet_size * 10)
{
    i2c::discard(mpu_addr, fifo_rw_register, fifo_size - packet_size); // leave one packet intact
    fifo_size = packet_size;
}
if (fifo_size >= packet_size)
{
    i2c::read(mpu_addr, fifo_rw_register, packet);
}

This solved most of the too-much-rate problem, but not entirely. I still need to debug it a bit more.



By removing the rate pid from the crius, sending quantized data instead of floats through the serial port, and compiling with -DNDEBUG to remove asserts, I managed to send data at 1000Hz. I could go up to 1800Hz, but since the MPU6050 can sample at most at 1000Hz there's no need for that.

Good to know that I still have extra CPU for when I’ll add the GPS code.


So 1000Hz is possible through a serial port at 5000000 baud, with cpu to spare.


Case almost ready

Cover still WIP.


It took only 3 tries to figure out how to pack everything. It weighs 30g without the electronics and 180g with the raspberry pi, crius board, ESC, WIFI and sonar.


[Photos of the case]


There are 4 layers. First the sonar, at the bottom. Then the ESC and the raspberry pi. On top of them there is a plastic bridge holding the WIFI card and on top of it there’s the crius.

At the front there is space left for the camera with its servo.

The motor wires will go below the ESC board – there are around 15mm of clearing there.


UAV architecture #2 – frequencies

I finally have the ground station (gs) talking to the brain (raspberry pi) and the io_board (crius/avr). I get sensor data, can calibrate the accelerometer/gyro/compass, send uav and camera input and so on.

The io_board sends sensor data at 500Hz to the brain and also runs a rate PID at 250Hz. Motors are mixed on the io_board using a throttle input from the brain.
The brain will do AHRS at 500Hz using the sensor data from the io_board, and forwards data to the gs at 30Hz.
The next step is to get the io_board simulation working again to be able to test the rate pid, stability pid, motor mixer and GS.

New Frsky telemetry protocol and Ground Station software

This project allows you to send telemetry data from a megapirate board through a 2-way frsky link back to the Ground station TX, where it gets decoded and merged over the video feed.

I ordered a new frsky telemetry rx/tx from hobbyking a month back and while waiting for it to be shipped – 3 weeks to spain – I started reading about the telemetry protocol they use.

It seemed like such a useful thing to be able to send data back to the GS. My first thought was to remove the OSD from the quad and put it in the ground station. I'd save a few grams of weight and get the full range of the frsky radio instead of the measly ~200m I'm getting with my 5.8Ghz video tx (in the meantime I improved this with a 4-turn helical).
The only thing I had to do was plug serial 3 with the mavlink data stream from my crius aiop into the frsky receiver on the quad and get it from the th9x, plug it into the minimosd in the GS and voila! a more reliable OSD.

What I didn’t know was:
1. The bandwidth is ~1200 bytes per second. That means 40 bytes per frame at a 30Hz refresh rate – way too low for mavlink.
2. The frsky tx and rx use RS-232 levels while the aiop and minimosd use TTL. This requires 2 level inverters – and I only ordered one from HK.

The first problem could be solved with two mavlink <-> frsky protocol converters – there are a few projects for this.
The other alternative was to write a new protocol that compresses the data to fit in the 1200 bytes/s limit and decode it at the GS end.

So on the quad, I have this: crius serial 3 tx pin (yellow) -> ttl to 232 converter (blue) -> frsky receiver tx pin (red)

The ttl to 232 converter is this one:

I had to disable the serial3 mavlink from megapirate and plug my own function into the update loop.
To save bandwidth, the info is split into packets (motors, altitude, rc inputs etc.) and each packet is sent only if it changed (so it's delta compressed). Together with quantization, I managed to pack everything I wanted (except GPS – still waiting for it to arrive).
The result is real-time updates for the important things – altitude, heading, motors (to graph temperatures). The refresh rate depends on how much things change over time – and it turns out most data is pretty stable, so I usually get 15-20Hz – way more than the 2-5Hz I initially estimated I'd get without the compression.

For the second problem – I found a way to hack into the TX and get TTL levels out of it directly. Then connect this to a serial bluetooth adapter and voila – telemetry straight to my laptop or phone.
With a small QT+opencv application I can have the camera feed on my laptop combined with the rendered telemetry values in a nice HUD.

The app supports 2 deinterlace methods (bob + some other type that I forgot) and recording using a user chosen codec.
The interaction happens through shortcuts – no menus for now – and is pretty crude. I'll soon add touchscreen support, as I got a HP Omni tablet that will replace my laptop in the GS. Battery life is great on the tablet (6-9 hours) and the CPU is way stronger than my old eeepc 901's.

To use:
1. Download megapirate 3.0.1 R3
2. Apply my patch. If you want to do this manually over another revision you have to:

  • Disable the mavlink from serial3. This means commenting out all gcs3. instances (except for the one that sets non-blocking mode)
  • Add the code surrounded by // CT protocol v 1.0 in the GCS_Mavlink.pde file
  • Call the ct_process() function from static void gcs_data_stream_send(void).

3. Download the fpv-monitor code:
4. Make sure you have QT 5.2.0, opencv and boost 1.55 (header + binaries) installed
5. Compile using VS2012 (some C++11 requires it)
6. Run as fpv-monitor comX Y – where X is the com port number and Y is the camera index (0 – built-in camera, 1 – receiver)

After everything compiles, run it with these params:
fpv-monitor com5 0
even if the com port doesn’t exist, it should show the feed from your camera/easycap:

Of course, this assumes you have an EasyCap with the drivers installed.

The shortcuts are:
F11 – fullscreen
F5 – start/stop capture
D – cycle deinterlace methods
R – smooth video resize (more CPU)
CTRL-0 – use camera index 0
CTRL-1 – use camera index 1

Here’s how it works:

App source code:
In the megapirate/patches folder you’ll find the patch + already patched file for 3.0.1R3

Good luck.