In my last post I talked about using the si4463 chip to send video, telemetry and RC data to the quadcopter. I calculated the bandwidth and it seemed that 500kbps video with 5/7 FEC coding should be possible.
I wrote the code, linked everything together and it kind of worked. The video was stable and not many packets were lost, but the latency was pretty bad: it went up to ~200ms from the previous 100-130ms. Worse than that, the video was very choppy. I managed to narrow it down to the H264 encoder in the Raspberry Pi: the bitrate you configure in the codec is an average, per second. Each frame can vary a lot in size as long as the average is preserved (with some allowed over- and undershoot). I got I-frames of 12-16KB and P-frames of 400-500 bytes (at 30 FPS). The I-frames took way longer to send than the P-frames, and this resulted in choppy video. I tried playing around with settings – like disabling CABAC and activating CBR – but nothing made the bitstream uniform enough.
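To see why the I-frames hurt so much, here's the back-of-the-envelope transmit-time math (frame sizes taken from the measurements above; the 500kbps figure is the post-FEC video bitrate):

```python
LINK_BPS = 500_000  # effective video bitrate after FEC

def tx_time_ms(frame_bytes: int) -> float:
    """Time to push one frame through the link, ignoring packet overhead."""
    return frame_bytes * 8 / LINK_BPS * 1000

print(f"I-frame (14 KB): {tx_time_ms(14_000):.0f} ms")  # ~224 ms
print(f"P-frame (450 B): {tx_time_ms(450):.1f} ms")     # ~7.2 ms
```

At 30 FPS the per-frame budget is only ~33ms, so a single I-frame occupies the link for roughly seven frame periods – which matches the choppiness.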
The final, biggest problem was actually caused by the RF4463F30 module – and it’s the same problem I had months ago when I tried to use them: they introduce a LOT of noise in the power line, enough to cause all the I2C chips on the quad to fail.
I tried all kinds of capacitors to decouple the module and reduced the noise a lot, but there always seems to be some capacitively/inductively coupled noise persisting.
In the end I just gave up and went back to the RFM22B chip, which just works. For video I went back to the monitor mode/injection system but changed the modulation to CCK at 5.5Mbps to hopefully reduce the chance of interference with other 2.4GHz RC systems. CCK is a spread-spectrum modulation similar to DSSS and should be able to coexist with RC systems around.
So yeah, FPV through a very constrained channel with a temperamental H264 encoder and a very temperamental transceiver module is not fun…
It’s been a while since my last post and that’s not because I abandoned silkopter, but because I’ve been very busy working on it.
During my latest test 3 weeks ago I noticed that the video link wasn't as stable as I wanted. I'm using a system similar to wifibroadcast – wifi cards in monitor mode doing packet injection. They work on 2.4GHz and the area where I'm flying seems to have this band a bit crowded, causing the video link to be glitchy.
On the other hand my 433MHz RC link has been pretty solid. So, inspired by this, I started wondering if I could do the same thing: send video through my RC link.
I already had the RF4463F30 transceivers, which use the si4463 chip, and I knew them well (and hated them a lot), so I thought I'd give it a try.
Now – the max bandwidth the si4463 supports is 1Mbps, which might seem like enough for 640×480 resolution, but there is a lot of overhead that effectively limits this to around 755kbps. Since I want some FEC – say 5/7 coding – I end up with a 500kbps target video bitrate.
I made an xls to ease the calculations. It lets me specify the air bitrate, packet sizes, packet overhead, video bitrate, FPS, preamble/sync sizes, header sizes etc. and calculates all kinds of useful info, including whether the link is viable or not.
Here is a screenshot with my current calculations:
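The core of the spreadsheet math can be sketched like this (the ~24.5% overhead figure is my assumption, back-derived from the 1Mbps → ~755kbps numbers above):

```python
AIR_BPS  = 1_000_000   # si4463 max air bitrate
OVERHEAD = 0.245       # preamble/sync/headers/turnaround (assumed ~24.5%)
FEC_RATE = 5 / 7       # 5 data packets out of every 7 sent

effective_bps = AIR_BPS * (1 - OVERHEAD)  # ~755 kbps usable
video_bps     = effective_bps * FEC_RATE  # ~539 kbps left for video
print(f"usable: {effective_bps/1000:.0f} kbps, video: {video_bps/1000:.0f} kbps")
```

That leaves a bit of headroom above the 500kbps target, which the real spreadsheet eats up with per-packet header and sync costs.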
For the video link, 500kbps seems like enough as there's not a lot of movement in a quad, especially with a gimbal. To further improve quality and decrease latency I'm experimenting with the x264 software encoder library instead of the GPU encoder in the Pi. It seems able to compress 640×480 video in real time with good quality, using the zero-latency setting and only 2 cores.
I still have to compare the quality between the two (GPU and x264) though.
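For reference, a zero-latency x264 configuration along these lines is what I'd start from – the exact flag values here are my assumptions for a 500kbps 640×480 stream, not the settings actually used:

```shell
# Hypothetical x264 invocation; bitrate/vbv values are placeholders.
# --intra-refresh spreads the I-frame cost across many frames, which
# should also help with the bursty-bitstream problem described earlier.
x264 --preset ultrafast --tune zerolatency \
     --bitrate 500 --vbv-maxrate 500 --vbv-bufsize 100 \
     --intra-refresh --fps 30 \
     -o out.264 input.y4m
```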
Here’s the case after my best attempt:
It looks… bad. The paint coat is horrible and full of scratches and the screen is too big.
But worst of all, the screen is not bright enough in direct sunlight. Not even close. I don’t have a photo but after brief testing I’d say it’s unusable.
So I’m pretty disappointed with the result – I ended up with a big, heavy RC system that is too dim to be usable for FPV.
I searched for a week for alternative capacitive touch screens, preferably in the 5-7 inch range, but found nothing bright enough under 100 euros.
So after a mild diy depression I got an idea that will solve at least 3 of the issues – cost, screen brightness and RC size: use my Galaxy Note4 phone as the screen.
The setup will look like this:
- The quad will send video through 2.4GHz, packet injection (a.k.a. the wifibroadcast method) and the RC stream through 433MHz
- The RC will receive both video and RC data and relay them to the phone over a separate 5.8GHz wifi UDP connection. The phone will decompress the H264 video using OMX (or whatever is available) and display it with telemetry on top.
- The phone will also act like a touchscreen interface to control the RC/Quad
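The relay step in the middle is conceptually very simple – something like this sketch (ports and addresses are made-up placeholders, not from the actual design):

```python
import socket

def relay_one(rx: socket.socket, tx: socket.socket, phone_addr) -> bytes:
    """Forward a single datagram from the quad-facing socket to the phone."""
    data, _ = rx.recvfrom(2048)     # one video/telemetry packet from the quad
    tx.sendto(data, phone_addr)     # pass it on to the phone untouched
    return data

# On the ground station this would run in a loop, roughly:
#   rx.bind(("0.0.0.0", 5600))
#   while True: relay_one(rx, tx, ("192.168.2.2", 5601))
```

The phone side just binds a UDP socket, feeds the H264 packets to the decoder and draws telemetry on top.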
Basically this is what most commercial quads (like the Mavic) do. I'm pretty sure their video link is 2.4GHz due to longer range and better penetration than 5.8GHz, and the connection with the phone is done over a low-power 5.8GHz link.
So the next steps are:
- Redesign a smaller case that will accommodate a Raspberry Pi 3, the RC stick and fader + buttons and wifi cards
- Write a quick android application that can connect to the RC and decompress the video stream
Just did some latency tests using RUDP through wifi, 640×480@30fps, 2Mbps.
Both the laptop and the quadcopter are in the same room but they go through a router 3 walls away. Signal strengths are (as reported by iwconfig):
Link Quality=70/70 Signal level=-37 dBm
Link Quality=58/70 Signal level=-52 dBm
Ping reports these RTTs:
64 bytes from 192.168.1.110: icmp_seq=1 ttl=64 time=137 ms
64 bytes from 192.168.1.110: icmp_seq=2 ttl=64 time=160 ms
64 bytes from 192.168.1.110: icmp_seq=3 ttl=64 time=85.4 ms
64 bytes from 192.168.1.110: icmp_seq=4 ttl=64 time=108 ms
64 bytes from 192.168.1.110: icmp_seq=5 ttl=64 time=125 ms
64 bytes from 192.168.1.110: icmp_seq=6 ttl=64 time=149 ms
64 bytes from 192.168.1.110: icmp_seq=7 ttl=64 time=73.6 ms
64 bytes from 192.168.1.110: icmp_seq=9 ttl=64 time=119 ms
The quadcopter uses the more sensitive Alfa card while the laptop has its own crappy RTL8723be card that has many issues under Linux…
I happen to live in a building with a very noisy wifi situation so SNR is not good at all.
Average latency is around 100-160ms with random spikes of 300-400ms every 10-20 or so seconds.
[Edit – just realized that both the brain and the GS are debug versions…]
To measure I pointed the raspicam at my phone’s stopwatch app and then took photos of the phone and the screen at the same time. Here are some of them: