Calculating Euler angles (Roll/Pitch/Yaw) using Rotation Vector

It’s been almost a year since my last post as the project was on a pause for a while.

In this post I will cover the most important piece of the quadcopter – the position sensor. This sensor calculates roll/pitch/yaw angles using Android’s Rotation Vector sensor and is a crucial part of the quad’s self-stabilization and control.

What I was trying to achieve:

  1. Calculate Roll/Pitch/Yaw angles
  2. The device is mounted in such a way that the camera faces forward. This means the XYZ axes need to be remapped to get correct roll/pitch/yaw readings
  3. It’s not always possible to perfectly align the device with the ground and the north pole, so the absolute angles will differ from 0. I needed a way to reset the angles to 0 and thus calculate relative instead of absolute values

As I mentioned already, the implementation is based on the Rotation Vector sensor. It is a virtual sensor which fuses data from the accelerometer and the gyro to calculate the device orientation. So it takes the best of the two worlds: the speed of the gyro without the gyro drift.

Basic implementation

Let’s now calculate Euler angles from the rotation vector:

private float[] rMatrix = new float[9];
/**
 * @param result the array of Euler angles in the order: yaw (azimuth), pitch, roll
 * @param rVector the rotation vector
 */
public void calculateAngles(float[] result, float[] rVector){
    //calculate the rotation matrix from the rotation vector first
    SensorManager.getRotationMatrixFromVector(rMatrix, rVector);

    //calculate Euler angles now
    SensorManager.getOrientation(rMatrix, result);

    //the results are in radians, need to convert them to degrees
    convertToDegrees(result);
}

private void convertToDegrees(float[] vector){
    for (int i = 0; i < vector.length; i++){
        vector[i] = Math.round(Math.toDegrees(vector[i]));
    }
}

Note the order in the javadoc above: per the Android documentation, getOrientation() fills the result vector in the order yaw (azimuth), pitch, roll.
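
For reference, here is a minimal sketch of how the rotation vector can be fed into calculateAngles() – standard SensorEventListener wiring from android.hardware; the delay constant is just an example:

private float[] angles = new float[3];

//assumes the enclosing class implements SensorEventListener
public void start(SensorManager sensorManager){
    Sensor sensor = sensorManager.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR);
    sensorManager.registerListener(this, sensor, SensorManager.SENSOR_DELAY_GAME);
}

@Override
public void onSensorChanged(SensorEvent event){
    if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR){
        //note: some devices report more than 4 values here; if
        //getRotationMatrixFromVector complains, pass only the first 4 elements
        calculateAngles(angles, event.values);
    }
}

@Override
public void onAccuracyChanged(Sensor sensor, int accuracy){
    //not used
}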

Change Device Orientation

But what if you need to change the orientation of the device, for example flip it so that the camera faces forward, like in my case?

It can result in changing the axes along which you calculate the rotation. Luckily it’s very easy to fix with the Android API. You just need to use the method SensorManager.remapCoordinateSystem(inR, NEW_AXIS_X, NEW_AXIS_Y, outR);

Now we can modify the code from above to apply the different orientation.

private float[] rMatrix = new float[9];
private float[] tempRMatrix = new float[9];
//the new axes, set according to how the device is mounted;
//these particular values are just an example
private int remapX = SensorManager.AXIS_X;
private int remapY = SensorManager.AXIS_Z;
/**
 * @param result the array of Euler angles in the order: yaw (azimuth), pitch, roll
 * @param rVector the rotation vector
 */
public void calculateAngles(float[] result, float[] rVector){
    //calculate the temp rotation matrix from the rotation vector first
    SensorManager.getRotationMatrixFromVector(tempRMatrix, rVector);
    //remap the rotation matrix according to the new orientation of the device
    SensorManager.remapCoordinateSystem(tempRMatrix, remapX, remapY, rMatrix);

    //calculate Euler angles now
    SensorManager.getOrientation(rMatrix, result);

    //now we can convert them to degrees
    convertToDegrees(result);
}

The results you will get

  • Yaw in the range [-180,180]
  • Pitch in the range [-180,180]
  • Roll in the range [-90,90]

Translate Angles from Device’s to Quadcopter’s Coordinate System

It’s not always possible to place the device so that its coordinate system is perfectly aligned with the quadcopter’s. So what we need to do is translate one into the other. But first I want to focus on translating angles (any integer) into the [-180, 180] range and then on calculating the delta.

First let’s look at how we can translate any angle in degrees into [-180,180]. This means an angle of 340 degrees will become -20 and -181 will become 179.

/**
 * Translates the given angle (degrees) into the [-180,180] range
 * @param angle the given angle to translate
 * @return the translated angle
 */
public static int translateAngle(int angle){
    if (angle == 0)
       return 0;

    int d = angle/180;
    if (d%2 == 0){
        return angle%180;
    }
    int signum = Integer.signum(angle);

    return angle%180 - signum*180;
}

Now let’s say we have two readings of the pitch angle: Pitch_1 [-179] and Pitch_2 [179]. We need to find the smallest correction to apply to Pitch_1 to get Pitch_2; let’s call this correction delta. Positive delta means you are going clockwise, negative – counter-clockwise. For example, the actual delta for Pitch_2 and Pitch_1 (i.e. Pitch_2 – Pitch_1) is -2, not 358.

It’s very easy now to calculate the delta using the translateAngle() method:

/**
 * Calculates minimal angle difference (left - right) between two angles
 * @param left the left angle
 * @param right the right angle
 * @return the delta
 */
public static int angleDiff(int left, int right){
   left = translateAngle(left);
   right = translateAngle(right);
   return translateAngle(left - right);
}
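
A few quick sanity checks for these two methods, matching the examples above:

//sanity checks for the examples above
System.out.println(translateAngle(340));   // -20
System.out.println(translateAngle(-181));  // 179
System.out.println(angleDiff(179, -179));  // -2, i.e. Pitch_2 - Pitch_1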

Let’s put all the pieces together and translate the angles into the quadcopter’s coordinate system. Though I want to mention that the solution based on calculating deltas is not ideal and doesn’t work well when the quadcopter CS is significantly rotated relative to the device’s CS (because the roll we get is in the range [-90,90], so you will get incorrect results closer to the extremes). The better approach is to use rotation matrix multiplication; however, I didn’t have much time to play with it, and the current solution works just fine for fine tuning, which is exactly what I needed.

private float[] rMatrix = new float[9];
private float[] tempRMatrix = new float[9];

/**
 * @param result the array of Euler angles in the order: yaw (azimuth), pitch, roll
 * @param rVector the rotation vector
 * @param referenceAngles the Euler angles of the reference position
 */
public void calculateAngles(float[] result, float[] rVector, int[] referenceAngles){
    //calculate the temp rotation matrix from the rotation vector first
    SensorManager.getRotationMatrixFromVector(tempRMatrix, rVector);
    //remap the rotation matrix according to the new orientation of the device
    SensorManager.remapCoordinateSystem(tempRMatrix, remapX, remapY, rMatrix);

    //calculate Euler angles now
    SensorManager.getOrientation(rMatrix, result);

    //convert to degrees and subtract the reference angles
    applyDeltaAndConvert(result, referenceAngles);
}

/**
 * @param result the array of Euler angles which need to be translated
 * @param reference the array of Euler angles of the reference coordinate system
 */
private void applyDeltaAndConvert(float[] result, int[] reference){
    for (int i = 0; i < result.length; i++) {
        //convert from radians to degrees
        int cur = (int)Math.round(Math.toDegrees(result[i]));
        int ref = reference[i];
        result[i] = angleDiff(cur, ref);
    }
}
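
This also gives the angle-reset feature from point 3 of the list at the top almost for free: capture the current reading once and use it as the reference from then on. Here is a minimal sketch, assuming a resetReference() method (my name, not part of the original code) wired to some user action:

private int[] referenceAngles = new int[3];
private float[] temp = new float[3];

//call this when the quad sits in its "zero" position, e.g. on a button press
public void resetReference(float[] rVector){
    //calculate the absolute angles using a zero reference...
    calculateAngles(temp, rVector, new int[3]);
    //...and remember them; subsequent readings become relative to this position
    for (int i = 0; i < referenceAngles.length; i++){
        referenceAngles[i] = (int) temp[i];
    }
}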

The full implementation can be found here.


Quadcopter in code

Background

Below is a primitive picture of a quad. The grey circles are the quad motors, denoted FL (front left), FR (front right), RL (rear left) and RR (rear right).

[Picture: quad layout – motors FL, FR, RL, RR; red arrows show the direction of prop rotation]

Let’s see how we can control the quad in each of the directions:

  1. X-axis. To rotate clockwise – increase thrust on the left motors and reduce it on the right ones. To rotate counter-clockwise – do the opposite.
  2. Y-axis. For clockwise rotation – increase thrust on the front motors and reduce it on the rear ones.
  3. Z-axis. This one is a bit more tricky. Notice the red arrows on the picture – these are the directions of prop rotation. The diagonal motors spin in the same direction! That’s why if you increase thrust on FL and RR and reduce it on FR and RL, the quad will start turning counter-clockwise around the Z-axis.

In aviation, rotations around X, Y and Z are usually referred to as roll, pitch and yaw.

How is it applied for quad control?

The control of the quad is basically an infinite loop recalculating the thrust for each of the 4 motors on every iteration:

while (true) {
  thrustFL = newThrust + α*rollCorrection + β*pitchCorrection - γ*yawCorrection;
  thrustFR = newThrust - α*rollCorrection + β*pitchCorrection + γ*yawCorrection;
  thrustRL = newThrust + α*rollCorrection - β*pitchCorrection + γ*yawCorrection;
  thrustRR = newThrust - α*rollCorrection - β*pitchCorrection - γ*yawCorrection;
}

0≤ α,β,γ ≤1

A positive (> 0) correction parameter (rollCorrection, pitchCorrection or yawCorrection) means clockwise rotation, a negative one – counter-clockwise.

What are roll, pitch and yaw corrections?

Here we come to some ground basics of Control Theory, but before that let’s assume we use the same axes for the sensor data as on the picture. These axes may not be the same as the default ones of an Android device placed on the quad… but it’s not a big deal, they can easily be remapped.

The correction basically tells us how much we need to adjust the thrust on a motor to achieve the desired value of roll, pitch or yaw. So obviously the proportional (P) component of the correction is the difference between the current and desired angles. The larger the difference, the more thrust correction we need to apply. Dead simple.

In an ideal world this would be enough. However, a quad is a heavy thing… it has a decent amount of inertia, its sensors have latency, the motors also have their own latency and so on. So if we implement only the proportional part, the quad will overshoot the desired point, which will lead to oscillation and potentially to a crash.

To eliminate this we will add the derivative (D) component. This component takes into account the speed of position change (in our case, of the angle) and will apply more thrust to resist faster changes, preventing overshooting. In the mathematical sense, it is the derivative of the proportional component.

Again, if the quad flies indoors, is perfectly balanced and no external forces like wind are applied, this would be enough. But let’s assume it’s windy outside. It might happen that the proportional component is not enough to get the quad into the needed position, because the force of the wind is trying to rotate it in the opposite direction. That’s when the integral (I) component comes into play.

The integral component is basically the accumulated error (i.e. the difference between the actual and desired position), in other words an integral of the proportional component:

Σ((actualRoll - desiredRoll) * Δt )

The same is true for yaw and pitch.

Now if we combine P, I and D we will get a classical PID controller:

rollCorrection = η*(Δroll) + λ*(Δroll/Δt) + μ*(Σ(Δroll*Δt)), where Δroll = actual_roll - desired_roll, and 0≤λ,μ,η≤1
pitchCorrection = η*(Δpitch) + λ*(Δpitch/Δt) + μ*(Σ(Δpitch*Δt)), where Δpitch = actual_pitch - desired_pitch, and 0≤λ,μ,η≤1
yawCorrection = η*(Δyaw) + λ*(Δyaw/Δt) + μ*(Σ(Δyaw*Δt)), where Δyaw = actual_yaw - desired_yaw, and 0≤λ,μ,η≤1
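
Putting P, I and D together in code, here is a minimal per-axis PID controller sketch (the class and gain names are mine, purely illustrative, not the actual implementation):

//a minimal per-axis PID controller sketch; the gains are illustrative
public class PidController {
    //kP, kI, kD correspond to η, μ, λ in the formulas above
    private final float kP, kI, kD;
    private float integral = 0;
    private float prevError = 0;

    public PidController(float kP, float kI, float kD){
        this.kP = kP;
        this.kI = kI;
        this.kD = kD;
    }

    /**
     * @param actual the current angle, e.g. the actual roll
     * @param desired the desired angle
     * @param dt the time since the previous update, in seconds (must be > 0)
     * @return the correction to mix into the motor thrusts
     */
    public float update(float actual, float desired, float dt){
        float error = actual - desired;                 //P component: Δ
        integral += error * dt;                         //I component: Σ(Δ*Δt)
        float derivative = (error - prevError) / dt;    //D component: Δ/Δt
        prevError = error;
        return kP * error + kI * integral + kD * derivative;
    }
}

One such controller per axis produces the rollCorrection, pitchCorrection and yawCorrection values that feed the thrust-mixing loop above.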

How to calculate current position?

That’s where we need to use the available sensors. Android provides a very comprehensive SensorManager API, and to calculate the P component it is probably best to use the RotationVector sensor (especially on Android 4.0 and higher, where it has better quality due to the use of a Kalman filter).

The I component can easily be calculated from the P values. But what about D?

Since D is a rotation speed, it can easily be calculated from the P values (see above)… but at the same time, rotation speed is exactly what the gyro sensor measures.

So should I use gyro or absolute values?

On one hand, the gyro has extremely low latency, which is always good. On the other, it suffers from “gyro drift”, i.e. the effect where sensor values drift from 0 over time even if the sensor is staying still. However, it seems that since Android 4.0 this problem has been fixed.

The other option is to use Android’s RotationVector, a virtual sensor that combines accelerometer and gyro data to provide absolute orientation quicker than the accelerometer alone and with more accuracy than the gyro alone. But even though it’s very quick, it is still not quick enough for the quad. So the answer to the question above is: use the gyro to calculate the D component.
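
Here is a minimal sketch of taking the D component straight from the gyro (TYPE_GYROSCOPE reports angular speed in rad/s around the device’s x, y and z axes; the class name is mine):

//a sketch of feeding the gyro output into the D component
public class GyroListener implements SensorEventListener {
    //angular speed in rad/s around the device x, y, z axes
    private final float[] rotationSpeed = new float[3];

    public void start(SensorManager sensorManager){
        Sensor gyro = sensorManager.getDefaultSensor(Sensor.TYPE_GYROSCOPE);
        sensorManager.registerListener(this, gyro, SensorManager.SENSOR_DELAY_FASTEST);
    }

    @Override
    public void onSensorChanged(SensorEvent event){
        System.arraycopy(event.values, 0, rotationSpeed, 0, 3);
        //rotationSpeed can be used directly as the D term,
        //instead of differentiating the P values
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy){
        //not used
    }
}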

In one of the next posts I will dig into the sensor topic and will describe how to interpret and improve sensor output.


Architecture. Or should I call it boring stuff?

In this post I will focus on the high-level Androbotus architecture. I know it might not be as exciting as some of the experiments or tricks encountered during development, but it’s the right thing to start with to set up the context. At the same time you won’t find much here about quadcopters. Instead I will focus on the generic framework architecture and will cover some specific aspects and challenges of building Android quadcopters (or robots in general).

Framework

Almost any robot more complicated than a coffee machine (hmm, probably not a good example) consists of multiple independent components which in turn need a way to communicate with each other. Some components correspond to physical elements of a robot, such as the servos which move the robot’s legs. Others can be logical (virtual); good examples would be stabilization logic versus navigation or object recognition logic.

So it’s pretty clear that if a robot consists of more than one such component, some sort of framework is needed. And here the fun part starts. You can choose to use an existing one – luckily, for Android there is a Java version of the Robot Operating System called rosjava. Or, like in my case, you can decide to develop your own. It doesn’t mean rosjava sucks 🙂 I just wanted more control over the way messages are generated and distributed to play around with performance issues; plus it is quite minimal, so it didn’t take much time.

In androbotus the individual components are called modules; they are somewhat similar to nodes in ROS. Also, just like ROS, androbotus uses the Publish-Subscribe pattern as its communication model. In a few words, it means that modules can send (publish) messages to certain topics as well as subscribe to topics to receive messages published by other modules.
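
To give a feel for the model, here is a stripped-down sketch of the pattern; the interfaces and names are illustrative, not the actual androbotus API:

import java.util.*;

//an illustrative publish-subscribe sketch, not the actual androbotus API
interface Module {
    void receiveMessage(String topic, Object message);
}

class MessageBroker {
    private final Map<String, List<Module>> subscribers = new HashMap<String, List<Module>>();

    public synchronized void subscribe(String topic, Module module){
        List<Module> list = subscribers.get(topic);
        if (list == null){
            list = new ArrayList<Module>();
            subscribers.put(topic, list);
        }
        list.add(module);
    }

    //reroute the message to every module subscribed to the topic
    public synchronized void publish(String topic, Object message){
        List<Module> list = subscribers.get(topic);
        if (list == null) return;
        for (Module m : list){
            m.receiveMessage(topic, message);
        }
    }
}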

Modules

The modules can be local or remote. Local modules are part of the robot software and are deployed and executed on board. Remote modules form the robot’s cloud service and in turn can use other cloud services if needed – for example, to perform some heavy task they can use a service backed by a decent-size server cluster. Remote modules also suit well for remote management of one or multiple robots, and they can serve as communication channels allowing robots to share information and communicate with each other.

I find it an interesting interpretation of cloud robotics. It means robots not only can use cloud services to perform some computations, but at the same time robots and the related infrastructure become a cloud service themselves – it actually becomes robot-as-a-service (RaaS).

Since the very beginning I decided to move towards RaaS, so whenever I use the buzzword cloud in the context of Androbotus, it means I’m talking about RaaS.

Some pictures

A post about architecture without a diagram is a job half done! Here it is, finally.

[Diagram: Androbotus architecture – a robot node (left) and a server (cloud) node (right), each with multiple modules and a message broker]

Note the two blocks on the picture, one on the left and one on the right side. The left one is a robot node; the right one is a server (cloud) node. Each of the nodes has multiple modules and a single message broker. The message broker basically reroutes messages generated and published by modules according to what topics the modules are subscribed to. Certain types of messages can be delivered to the message broker of another node, allowing the nodes to communicate.

As it usually happens in hobby projects, the intended design goes a bit ahead of the actual implementation, so don’t take it as 100% accurate. On the picture the brokers communicate via websockets and have some level of security implemented. That’s the plan; however, in the current implementation it’s done via plain TCP – quick and easy.

Instead of conclusion

Let’s now look at an example of how the framework is used for the first Androbotus implementation, the quadcopter. Should I maybe call it Quadrobotus? 🙂 Since it’s the first implementation it’s not very sophisticated. I think of it as a POC to try out the main concepts.

Quadcopters are in general quite simple flying machines; they have a minimal number of moving parts – only 4 motors. However, to make one fly smoothly the level of synchronization has to be very high. This implies a design where all 4 motors are managed within a single module. The modules are:

  • Sensor Module
  • Stabilization Module
  • Navigation Module
  • Script Execution Module
  • Camera Module

The Sensor Module does what it sounds like it does: it manages all the internal Android sensors. By manages I mean it receives values from the sensors, applies rotation transformations according to the initial device rotation, applies some math to clean up noisy data (a big and very interesting topic itself, which I will try to cover in one of the next posts) and translates the values into a form that other modules expect.

The Stabilization Module is responsible for the primitive reflexes of the quad, such as keeping certain roll, pitch and yaw angles and preventing arbitrary rotations and oscillations induced by external or internal forces. The stabilization module is the most important one in the robot, since it is the module that controls all 4 motors. It’s important to mention that the frequency of adjusting the motor values should be very high – the higher the better. Other modules that have something to do with the flying attitude will have to communicate with the stabilization module. Note, it doesn’t have anything to do with altitude, air or ground speed, or coordinates – those values do not require very low latency in the motor speed adjustments and are managed by the next module.

The Navigation Module, as was mentioned before, manages the higher-level flying attitude, such as altitude, absolute position and speed, using GPS sensor data. However, unlike the stabilization module, it doesn’t send commands directly to the motors; instead it sends commands to the stabilization module, requesting for example to change an angle or increase the thrust level.

The Script Execution Module is an even higher-level module and is responsible for executing predefined programs (scripts). It is probably the most interesting module in the robot; however, I’m not going to describe it now. Even though the design is in place, this work is still in an early stage.

The Camera Module has a lot of similarities with the Sensor Module – the camera is basically just another type of sensor, one which produces visual data.

On the picture below you can see an early prototype of the quadcopter’s dashboard.

[Screenshot: early prototype of the quadcopter’s dashboard]

On the left it displays the video stream; the three gauges in the middle display roll/pitch/yaw; and the cluster of four gauges on the right shows the motor speed individually for each motor (FL – front left; FR – front right; RL – rear left; RR – rear right).

The backend controller for this UI page is basically a remote module for the robot and can potentially allow controlling multiple robots over the internet…

I think this is a good point to end this post. In the next one I will probably talk a bit more about the framework and some performance considerations.

Is it really $100?

As I mentioned here, the first Androbotus I’m building is a quadcopter. The robot hobby can get very expensive very quickly, so I set a goal for myself to keep the price under $100 (probably didn’t want to hurt my racing budget – still have a lot to do to make my Miata faster, but that’s a subject for a different blog at a different time 🙂 ).

Is it really going to be a $100 robotic quad? Get the calculators ready, let’s count.

  • 4 motors – $7 a piece. Each was claimed to provide 900 grams of thrust… not sure about that though
  • 4 ESCs – $8 each, 18-20A
  • frame – $0. It’s super easy to make a DIY quad frame. But I was lazy and bought a simple one for 20 bucks (not sure it was a good idea though 🙂 )
  • batteries – $0. I don’t add them to the price since any RC fan will have boxes full of batteries of every kind. I found that two 4000 mAh 3-cell LiPo batteries are ok for a big-size quad.
  • Android IOIO – $35 on Amazon. This is a very nice board that allows controlling external modules using a simple Java API. Since I planned to stick with the phone’s sensors, this board made much more sense than an Arduino, which would be the way to go if I used external accelerometer/gyro boxes.
  • Android phone – $0. I use my old Samsung Infuse running Android 2.3.6 (Gingerbread). However, I would recommend using a more modern/better device running Android 4.0+. I will explain why in one of the upcoming posts.

The parts can be found in any online RC hobby store (e.g. hobbyking.com).
So what do we have in total?

4*$7 + 4*$8 + $35 = $95.

Well, extra $5 can be spent on glue, tape and bullet connectors 🙂

Update: Oh, I completely forgot – spend the remaining 5 bucks on decent propeller blades 🙂 In my next post I will describe in more detail the architecture and how this stuff works in general…