Monthly Archives: February 2015

The Pilot

So I have my nodes almost done. There are acceleration streams, angular velocity, compass, gravity, reference frames and sonar distances, and all of these have to somehow translate into PWM pulses for the motors.

My initial diagram (here) showed a rate PID – with angular velocity as input and torque as output – but it's missing something: the target angular velocity. I'm missing the input.

I kept thinking about how I could model the input that controls all these nodes, and then it hit me – I'm missing the pilot. The central node that reads all the input sources and gives commands to the actuators – the sink nodes. The pilot knows whether it's piloting an airplane, a multirotor, a rover or a boat, so I need one for each vehicle type.

The pilot receives the inputs from the Comms class and outputs a torque stream that will be fed into the motor mixer and then into the thrust node.

So I created the processing node Multirotor_Pilot that takes these streams:

– an angular velocity stream for the rate PIDs

– a reference frame stream for the stability (horizontal) PIDs

– a location stream for the assisted PIDs and for RTH and other AI behaviors

– a vertical distance stream (usually from a sonar, but maybe from a laser as well) for ground avoidance

It will use these streams based on the current input mode and state (landing, take-off, crash, etc.) and process them into a torque stream – which basically represents how the pilot wants the quad to rotate.
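A minimal sketch of what this mode-based dispatch might look like – all names here (`Input_Mode`, `compute_torque`) are my own illustrations, not the actual silkopter API:

```cpp
#include <cassert>

// Hypothetical input modes the pilot could be in.
enum class Input_Mode { RATE, STABILITY };

// One-axis torque computation. In rate mode the sticks request an angular
// velocity directly; a bare proportional term stands in for the full rate PID.
float compute_torque(Input_Mode mode, float target, float measured, float kp)
{
    if (mode == Input_Mode::RATE)
        return kp * (target - measured);
    // STABILITY would first turn a target attitude (from the reference
    // frame stream) into a target angular velocity; not modeled here.
    return 0.f;
}
```

The real node would do this per axis and also consult the location and vertical distance streams for the assisted modes.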

The streams are generated from sensors, passed through low-pass filters and resampled to around 20-50Hz. There's no need for more, as higher rates would just add noise.

If I want to use silkopter with an airplane I will have to add a new Plane_Pilot class to model the plane. Same for any other vehicle.

Things are clearer now.


Signal processing

While thinking about ways to implement multiple sensor support – for several GPSes or gyroscopes – I realized that a UAV can be modeled as a signal processing device.

It can be split like this:

  • sources – like the MPU9250, which generates 4 signal streams (acceleration, angular velocity, magnetic field and temperature), or the SRF02 sonar, which generates a stream of distances
  • streams – like the acceleration from the IMU or the location stream from the GPS. Each stream has a sample rate
  • sinks – like the pigpio PWM. Sinks only accept streams of a certain sample rate
  • processors – like low pass filters (LPF), the PIDs or the AHRS. Processors don't have an inherent sample rate; they usually adopt the rate of their input

All of these are implemented as nodes with several inputs and outputs. All nodes are added to the HAL layer and both their properties and connections are configured from a json file.

The important aspect is that – as in all signal processing – the sample rate is very important and has to be properly matched between nodes to satisfy the Nyquist criterion. For example, the PWM has a max sample rate of 490Hz, so any signal component above ~245Hz will show up as noise. Higher frequency signals have to be low-pass filtered and resampled before being fed into the PWM node.

I started implementation last week and – after many rewrites – I have a model that ‘feels right’.

Here are some drawings:

Ahrs_0

Here the MPU9250 node exposes 4 sub-nodes (not shown) – an accelerometer, a gyroscope, a magnetometer and a thermometer. Each of these outputs a stream: acceleration, angular velocity, magnetic field and temperature.

The frequency of these streams is determined by the sensor. Acceleration and angular velocity are at 1kHz, while magnetic field is at 100Hz. Temperature (not shown in the pic) is at 10Hz.

The magnetic field stream is fed into a resampler node that takes it from 100Hz to 1000Hz, ready to be fed into the AHRS processor node. The reason for this is that upsampling the stream needs a low-pass filter to avoid parasitic frequencies. A sample rate of 100Hz can carry at most a 50Hz signal, so when going to a 1000Hz sample rate this maximum frequency has to be maintained. Simply cloning each sample 10 times is not enough, as this adds a lot of high-frequency harmonics.

So now all signals are at a 1kHz sample rate and are fed into the AHRS, which outputs a reference frame stream, also at 1kHz. This reference frame represents the rotation of the UAV relative to the earth's frame of reference.

Right now I have a complementary filter for the AHRS node, but in the near future I will write an extended Kalman filter (EKF) for it – as soon as this tutorial is complete.
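For one axis, a complementary filter boils down to a single update step: integrate the gyro for short-term accuracy and pull the estimate toward the accelerometer-derived angle for long-term stability. A hedged sketch (the function name and parameters are mine, not from the silkopter node):

```cpp
#include <cassert>
#include <cmath>

// One-axis complementary filter step. `alpha` close to 1 trusts the gyro
// (good short-term), while (1 - alpha) slowly corrects drift using the
// accelerometer-derived angle (good long-term).
float complementary_step(float angle, float gyro_rate, float accel_angle,
                         float dt, float alpha)
{
    float gyro_angle = angle + gyro_rate * dt; // integrate angular velocity
    return alpha * gyro_angle + (1.f - alpha) * accel_angle;
}
```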

Note200215_0

This diagram uses the AHRS output – the reference frame stream – plus the acceleration stream again to output 2 new streams: gravity vector and linear acceleration. These 2 streams will be further used for dead reckoning and the stability PIDs.

Note200215_1_0

Here I have the basic rate PIDs diagram. The angular velocity stream is first resampled from 1kHz to 50Hz (the pic says LPF but it's actually a resampler node). The reason for this is that the usable bandwidth of the quadcopter is ~20-30Hz; anything higher than that is noise.

The stream is then passed to the rate PID processor node, which outputs a torque stream (also at 50Hz). Torque is converted to thrust using a (configurable) model of the motors + propellers, and that is fed into the motor mixer and then into the PWM sink.
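A one-axis rate PID step might look like the following sketch. The struct and gain names are illustrative; the real node would also clamp the integrator and handle all three axes:

```cpp
#include <cassert>

// Minimal one-axis rate PID sketch: angular velocity error in, torque out.
struct RatePid
{
    float kp, ki, kd;
    float integral = 0.f;
    float last_error = 0.f;

    float step(float target_rate, float measured_rate, float dt)
    {
        float error = target_rate - measured_rate;
        integral += error * dt;                       // accumulate I term
        float derivative = (error - last_error) / dt; // finite-difference D term
        last_error = error;
        return kp * error + ki * integral + kd * derivative; // torque
    }
};
```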

The advantage of this system is that new processors can be easily added to accommodate new hardware. For example, if I want to use 2 GPS sensors I will add a redundancy processor that takes 2 location streams and outputs the healthiest one. Or I can merge them together to get better accuracy – while the rest of the nodes neither know nor care about this.
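Such a redundancy processor could be as simple as picking the healthy source with the best reported accuracy each sample. The `Location` fields below are hypothetical stand-ins for whatever the real location stream carries:

```cpp
#include <cassert>
#include <vector>

// Hypothetical location sample; field names are illustrative only.
struct Location
{
    double lat, lon;
    float accuracy;  // estimated error radius, smaller is better
    bool healthy;    // source currently has a valid fix
};

// Redundancy-processor sketch: return the healthy source with the best
// accuracy, or nullptr if no source is usable this sample.
const Location* pick_healthiest(const std::vector<Location>& sources)
{
    const Location* best = nullptr;
    for (const auto& s : sources)
        if (s.healthy && (!best || s.accuracy < best->accuracy))
            best = &s;
    return best;
}
```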

Basically I have minimal coupling between the elements of the system, which talk through very well defined (and type-safe) interfaces.

The ground station is also simplified. It can enumerate all the nodes, read their configuration as a json value, change it and push it back to the nodes. Changing a filter cutoff will become trivial.

So far I'm still working to match the old functionality, but in 1-2 weeks I believe this system will be ready for a test.

GPS jamming

The Raspberry Pi camera completely jams my Ublox Neo6 GPS.

I'm monitoring the jamming, ADC and noise values, and whenever I turn on the camera and start capturing video, the jamming indicator goes up from 10 to 140, the ADC goes down from 4600 to 1000 and the noise goes up from 100 to 130.

This kills any fix the GPS has and it stops seeing any satellites. With the camera off I regularly get between 3-7 satellites indoors, and this is with a passive antenna. With the camera on I get zero, even outside.

I wrapped the entire camera in several layers of aluminum foil, and did the same for the ribbon cable, without noticing any improvement.

Only after placing a small ferrite bead on the RX/TX serial wires did the ADC go up from 1000 to ~3600. So far it looks like the issue is electrical noise.

In the meantime I'm waiting for the new GPS I ordered from HobbyKing, and I've also started working on supporting the NAVIO board, as it has a newer GPS (the ublox 8).

This brings me to the next issue – the brain architecture is not ready to support multiple sensors of the same type.

Time to work on the software while I’m waiting for both the new GPS and the Navio board.

Git

The repo is now on github:

https://github.com/jeanleflambeur/silkopter

git is still a bit scary compared to the familiar naivety of svn, but it's too powerful to ignore. The first thing I did after importing the svn repo was to recover the Crius version of silkopter from last year and make a branch from it. This was doable in svn as well – by checking out that revision and committing it to a branch – but on github it was 2 clicks.

Robust sensors

Thanks to HobbyKing's horrible shipping times, I'm forced to use my broken Crius UBLOX GPS to test the GPS code. I'm not sure what exactly is broken, but the symptoms vary from garbage data, wrong CRCs and losing the fix every 2 minutes to random periods of complete silence.

To be able to continue working I had to add a watchdog that reinitializes the GPS if it's silent for more than 5 seconds. However, my GPS::init function is blocking and takes a few seconds to do its work, as it has to wait for the ACK messages and retry a few times if needed. So I moved it onto the secondary thread using silk::async.

void GPS_UBLOX::process()
{
    QLOG_TOPIC("gps_ublox::process");
    if (!m_is_setup)
    {
        if (!m_setup_future.valid() || //first time we're starting this
             m_setup_future.get_state() == boost::future_state::ready) //the previous async setup finished (and failed)
        {
            //start the async setup
            m_setup_future = silk::async([this]() { setup(); });
        }
        return;
    }
    //...normal processing when the GPS is set up
}

And the watchdog:

//check if we need to reset
if (q::Clock::now() - m_sample.last_complete_time_point >= REINIT_WATCHDOG_TIMEOUT)
{
    m_is_setup = false;
}

So if at any moment the GPS stops responding, I fire the setup on the second thread – to avoid blocking the main control thread – and keep doing so until the GPS responds again.

I'm thinking of adding this kind of hot restart to all sensors, and of a way to trigger it from the GS in case something happens.

silk::async

No intro for this one, it’s too late.

Here’s the code:

extern boost::asio::io_service s_io_service;

namespace silk
{
    template<typename F> auto async(F f) -> boost::unique_future<decltype(f())>
    {
        typedef decltype(f()) result_type;
        typedef boost::packaged_task<result_type> packaged_task;
        auto task = std::make_shared<packaged_task>(std::move(f));
        boost::unique_future<result_type> future = task->get_future();
        s_io_service.post(boost::bind(&packaged_task::operator(), task));
        return future;
    }
}

It's a std::async replacement that uses a boost::asio::io_service as a dispatcher.

I use it for any operation that might take too long and sits in the middle of my control loop – like saving new PIDs or the battery state to disk.

Usage is pretty simple – pass a lambda, and that lambda will be executed some time later from the thread that is running io_service::run().

silk::async([=]()
{
    save_settings();
});

or

silk::async([=]()
{
    autojsoncxx::to_pretty_json_file("battery_state.cfg", m_saved_state);
    QLOGI("Capacity used: {}mAh", m_capacity_used_mah);
});

The function returns a boost::future, so the result of the lambda can be polled for.
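Polling from the control loop means checking the future with a zero timeout so it never blocks. A small sketch of the idea, shown with std::future for self-containedness (boost::future offers the equivalent is_ready()/wait_for interface):

```cpp
#include <cassert>
#include <chrono>
#include <future>

// Non-blocking readiness check: a zero-duration wait_for returns
// immediately, telling us whether the async task has finished.
bool is_done(std::future<void>& f)
{
    return f.wait_for(std::chrono::seconds(0)) == std::future_status::ready;
}
```

A control loop can call this each iteration and only act (or retry, as in the GPS setup above) once the task has completed.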

I like the async way of doing things. The future/promise mechanism is a very elegant way to express computations. It reminds me a bit of Haskell's laziness.