Video streaming on the Raspberry Pi finally works. Time to move on to the actual UAV code.
So far I have this figured out:
There are 4 major parts that I need to develop:
1. IO Board. This runs on the Crius board and is responsible for reading all the sensors, computing the local frame of reference and controlling the motors using a rate PID.
2. Brain. This runs on the RPI and talks to the IO Board. It does dead reckoning and all the high-level AI like assists (avoid ground, stay in perimeter, etc.), camera streaming, return-to-home and scripting.
3. Ground Station. Runs on the laptop/tablet and talks to the Brain through TCP (comms)/UDP (streaming). It acts as a remote control + setup software.
4. Simulation. This can replace the IO Board with a Bullet physics simulation. It talks to the Brain and simulates everything the IO Board does – to allow me to test the Brain without actually crashing any quadcopters.
So far I have some of the brain code started:
– IO_Board class that talks to the IO Board or simulation
– Video Streamer
– Camera code, working and tested
– TCP comm channels
The Ground Station also has some code done since the AVR tests a few months back, but it needs to be adapted to the new Brain code:
– Calibration code
– Diagnostics for the AHRS and dead reckoning
The Simulator is mostly done, but for AVR. It needs to be adapted to the Brain to mimic the IO Board. It can simulate the motors + propellers, accelerometer, gyro and air drag.