Hi,
I am working on a project that involves an accelerometer mounted in a vehicle. The orientation of the device could be arbitrary, so the X, Y and Z axes may not be aligned with those of the vehicle. I need to sense linear deceleration in the front/back direction of the car's travel.
I have researched rotation matrices and it seems the orientation can be calibrated using one. Does anyone have experience with rotation matrices on e.g. Arduino who could show an example? I have found lots of material online referencing Android, but it doesn't translate that well to implementation on a microcontroller.
If there are any other methods to do this I would also be keen to learn.
Many thanks
Tim
Hi Tim,
Thanks for posting on the forum, welcome to the community!
Yes, I suspect the Android material you've found would be using .NET MAUI, Flutter APIs or Kotlin libraries to detect motion on a mobile device.
Typically, those abstract away most of the trigonometry and calculus required, and hand you the X, Y, Z components of acceleration and/or velocity, plus the orientation of the device, in a frame where the gravity vector points down. I remember having to do something similar a few years back for an educational ROV project I was working on, to maintain orientation underwater.
Here’s a guide running you through getting roll and tilt with an accelerometer:
If you're keen to do it from scratch in Arduino C++:
Once you've got your frame's orientation in space relative to some unit vector (typically gravity, 9.8 m/s² pointing downward), it's as simple as applying the sine rule on right-angled triangles against that vector to get the X, Y, Z acceleration from an accelerometer sharing that orientation. Integrating that over time then gives you the velocity in each direction.
Getting that initial orientation of the frame is the most difficult part, in my opinion. I asked GPT for a cleaner explanation of how to do it with just an accelerometer. The big caveat is that if the orientation changes while you're accelerating, there's no easy way to tell that has happened with an accelerometer alone; that's part of the reason a filtered gyro output is typically used as well:
An accelerometer measures acceleration forces. In static situations, the only force it measures is gravity. You can use this fact to determine the device's orientation.
To apply the sine rule for determining the orientation, you first need the acceleration values on all 3 axes (usually labelled Ax, Ay and Az for the X, Y and Z axes respectively).
First, calculate the acceleration vector’s magnitude using Pythagoras’ theorem:
A = sqrt(Ax^2 + Ay^2 + Az^2)
Next, determine the inclination (tilt) angles on each axis:
Inclination_x = arcsin(Ax / A)
Inclination_y = arcsin(Ay / A)
Note that the arcsin function gives a value in the range -90 to +90 degrees.
For a full range of motion, you would need to use the atan2 function for azimuth:
Azimuth_z = atan2(Ay, Ax)
This will give you the orientation in 3D space. Note that each of these calculations assumes the other two axes lie on a plane perpendicular to the force of gravity.
Please note, these calculations give raw orientation values and do not account for gyroscope data. For better accuracy, consider using sensor fusion algorithms such as a Kalman filter or a complementary filter which can combine data from gyroscope and accelerometer.
Here’s some very basic working out to show you how you can imagine the frame of the accelerometer versus the frame where the vector G (gravity) is downward. Note that since you can take the resultant of the measured Ax Ay Az when it is still to get G in that frame, all you need to do is compute what rotation of that resultant vector makes it face downward, then apply that linear transform to all of the other measured vectors Ax Ay Az into that frame.
Once you know your orientation, you can go on to get the speed in each direction as I described: take the difference between what the accelerometer measures when moving and what it measured when static, as long as the orientation doesn't change.
The moral of the story is that this is significantly easier to do with a pre-written library. But not half as fun, of course!