Another screenshot with the new UI:
The signal-processing view of a quadcopter leads to one obvious UI: a node editor.
Still WIP but here’s the first screenshot:
The top-left part has some nodes that compute a Reference Frame, a Location and a Battery State from an MPU9250 IMU, an MS5611 baro, a Ublox GPS and an RC5T619 ADC.
There are also some processor nodes that resample or convert streams (like an ADC_Voltmeter that converts ADC samples into voltages).
The nodes are read from the brain running on the Raspberry Pi. Each node has:
– a unique name
– a type (class_id)
– a list of inputs
– a list of outputs
– an init_params json which represents fixed configuration that is applied when the node is created. Stuff like IMU sample rate or gyro range.
– a config json with things that can be changed at any moment – like where the inputs/outputs are connected, calibration data, or the cutoff frequency for LPFs.
Each input/output has a name and a stream type that determines what it can connect to. There’s also a sample rate that has to match on both ends of a connection.
Next is to allow uploading of the json configs to the brain and more editing features.
The node editing is done with this library.
So I have my nodes almost done. There are streams for acceleration, angular velocity, compass, gravity, reference frames and sonar distances, and all of these have to translate somehow into PWM pulses for the motors.
My initial diagram (here) showed a rate PID – with angular velocity as input and torque as output – but it’s missing something: the target angular velocity. I’m missing the input.
I kept thinking about how to model the input that controls all these nodes, and it hit me – I was missing the pilot. The central node that reads all the input sources and gives commands to the actuators – the sink nodes. The pilot knows whether it’s piloting an airplane, a multirotor, a rover or a boat, so I need one for each vehicle type.
The pilot receives its inputs from the Comms class and outputs a torque stream that is fed into the motor mixer and then into the thrust nodes.
So I created the processing node Multirotor_Pilot that takes these streams:
– an angular velocity stream for the rate PIDs
– a reference frame stream for the stability (horizontal) PIDs
– a location stream for the assisted PIDs and for RTH and other AI behaviors
– a vertical distance stream (usually from a sonar, but maybe from a laser as well) for ground avoidance
It will use these streams based on the current input mode and state (landing, take-off, crash etc.) and process them into a torque stream – basically, how the pilot wants the quad to rotate.
The streams are generated from sensors, passed through low-pass filters and resampled to around 20–50Hz. There’s no need for more, as higher rates would just add noise.
If I want to use silkopter with an airplane, I will have to add a new Plane_Pilot class to model the plane. The same goes for any other vehicle type.
Things are clearer now.