Merging data from multiple sensors?

Has Core written any guides (or does anyone know of any) on comparing/merging sensor data from multiple sensors?
I’ve been working on and off on a project to try to achieve what Rokoko do with their inertial motion capture suits, using the now-retired PiicoDev MPU6050 board, but the sensor data is quite noisy, so using multiple sensors and solving for coherent motion is tricky.

Can anyone point me in a good starter direction for solving motion from two 6-axis sensors recording the same object?

As an extension to that, if anyone has some wisdom on solving for somewhat accurate motion using accelerometer/gyro data within a 10x10 m room, I’d love to hear from you. I’m hoping someone in the drone space might have some experience with this.

3 Likes

Hi Luke,

Cool project! I haven’t completed the 3rd-year uni courses needed to advise you properly, but I might be able to give you some pointers. I think the overarching field is called “Sensor Fusion”.

Have you looked into sticking a lowpass filter on this, or averaging it? What kind of latency are you happy to tolerate?

Generally I’ve seen people use Kalman filters for this, but I’ve never implemented one myself.
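
For what it’s worth, the simplest fusion approach I’ve seen (before jumping to a full Kalman filter) is a complementary filter: integrate the gyro for responsiveness and blend in the accelerometer’s gravity angle to cancel the drift. A minimal sketch, assuming the readings are already converted to g and deg/s (sign conventions will depend on how your sensor is mounted):

```cpp
// Minimal complementary filter: pitch from one 6-axis IMU.
// ax, az in g; gy in deg/s -- plug in your own driver's reads.
#include <math.h>

float pitch = 0.0f;         // filtered pitch estimate, degrees
const float alpha = 0.98f;  // 98% gyro (fast), 2% accel (drift-free)

void updatePitch(float ax, float az, float gy, float dt) {
  // Accelerometer gives an absolute (but noisy) angle from gravity.
  float accelPitch = atan2f(ax, az) * 180.0f / M_PI;
  // Gyro gives a smooth (but drifting) rate; integrate and blend.
  pitch = alpha * (pitch + gy * dt) + (1.0f - alpha) * accelPitch;
}
```

The alpha constant is your latency trade-off: closer to 1 means less accelerometer noise leaks in, but drift gets corrected more slowly.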

You might want to look into optical tracking solutions too. They’re expensive, but incredibly accurate and responsive, even in large spaces.

All the best with the project, I hope someone with more chops than I can reach out with something more fleshed-out 🙂

5 Likes

Thanks for the reply James! Super helpful 🙂 I’ll look into the Kalman filter approach.

I’m trying to see what I can do without optical tracking, at least to begin with. The Rokoko suit can get pretty amazing results with its inertial sensors, so I want to try to mimic that for prop tracking as well (which they don’t support). I’m a solar engineer by day and an animator on the side. I don’t know nearly as much about electronics as I’d like, and even ‘cheap’ mocap is too much for a hobby, so I figured this was a fun crossover project for me! I’m going to see if I can make a mocap ‘suit’ for my dog haha

5 Likes

Hi Luke and James!

Really interesting project!!

James has hit the nail on the head - the Kalman filter will also apply a level of filtering, so you don’t get as many outliers and a lot of the noise should be reduced (depending on how you weight the modelled versus measured states).

Phil’s Lab has a great video on the topic of sensor fusion and Kalman filters here: https://www.youtube.com/watch?v=RZd6XDx5VXo&ab_channel=Phil’sLab
To get readings that are as accurate as possible, you’ll also need some baseline measurements of where the object actually is, for testing. This could be done with an encoder, potentiometer, or another sensor with a small amount of error (I haven’t looked into specific distance sensors, but ToF seems like it would be appropriate).
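
To make the weighting idea concrete, here’s a minimal scalar Kalman filter sketch - the Q/R ratio is the knob I’m describing (the values here are illustrative, not tuned):

```cpp
// Minimal scalar Kalman filter: the q/r ratio is the model-vs-measurement weight.
// Large r (noisy sensor) -> trust the model; large q -> trust the measurement.
struct Kalman1D {
  float x = 0.0f;   // state estimate
  float p = 1.0f;   // estimate variance
  float q = 0.01f;  // process noise: how fast the true state can wander
  float r = 0.5f;   // measurement noise: how noisy the sensor is

  float update(float z) {
    p += q;                 // predict: uncertainty grows
    float k = p / (p + r);  // Kalman gain: blend factor in [0, 1]
    x += k * (z - x);       // correct towards the measurement
    p *= (1.0f - k);        // uncertainty shrinks after correction
    return x;
  }
};
```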

Drones typically use GPS, with an accuracy of around 2 m (your average mobile phone’s resolution) depending on the connection.

In the Maker realm a lot of people like to use the Xbox Kinect - might be worth a look into: Kinect - Wikipedia

Liam

5 Likes

Thanks for your responses guys - I’ll have a go at implementing the Kalman filter and report back with what I find!

3 Likes

SUMMARY:
I recommend using a camera with OpenPose. Your approach has many challenges (including wiring, data streaming…). The MPU6050 is not a good sensor choice, since it does not have a magnetometer to compensate for heading drift, and 9-DOF IMUs are in short supply right now too.
I have a project that is quite similar, so I’m aware of these challenges.

ORIGINAL ANSWER…

What kind of ‘motion’ are you trying to capture? Orientation (rotation) or XYZ position?
What sort of accuracy are you trying to achieve?

I believe the MPU6050 has its own orientation fusion algorithm running on its Digital Motion Processor, with a quaternion output.

From the accelerometer and gyroscope you can obtain:

  • Orientation (3D rotation), with a heading that is only accurate in the short term and subject to drift (you need a magnetometer to compensate for that drift).
  • Short-term position: transform acceleration into the global frame, high-pass filter it, then integrate to obtain velocity. High-pass again, then integrate once more to get your not-so-accurate short-term position (see the sketch after this list).
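
To illustrate the second point, here’s a rough single-axis sketch of that chain (it assumes acceleration is already rotated into the world frame with gravity removed; the filter constants are illustrative):

```cpp
// High-pass + integrate chain for short-term position, one axis.
struct HighPass {
  float alpha = 0.98f;  // closer to 1.0 = lower cutoff frequency
  float prevIn = 0.0f, prevOut = 0.0f;
  float step(float in) {
    prevOut = alpha * (prevOut + in - prevIn);  // one-pole high-pass
    prevIn = in;
    return prevOut;
  }
};

HighPass hpAccel, hpVel;
float velocity = 0.0f, position = 0.0f;

void step(float aWorld, float dt) {
  float a = hpAccel.step(aWorld);   // strip accelerometer bias/drift
  velocity += a * dt;               // integrate acceleration -> velocity
  velocity = hpVel.step(velocity);  // strip the drift integration accumulates
  position += velocity * dt;        // integrate again -> short-term position
}
```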

I have an incredibly similar project that uses these techniques.
Specifically, the Madgwick filter for orientation (there’s an existing library for that), as shown in the sketch below.
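
With the Arduino Madgwick library, usage looks roughly like this (readIMU() is a stand-in for whatever MPU6050 driver you use, with accel in g and gyro in deg/s):

```cpp
#include <MadgwickAHRS.h>

Madgwick filter;
const float sampleRateHz = 100.0f;

// Placeholder: replace the body with your MPU6050 driver's read call.
void readIMU(float &ax, float &ay, float &az, float &gx, float &gy, float &gz) {
  ax = ay = 0.0f; az = 1.0f;  // dummy values: flat and stationary
  gx = gy = gz = 0.0f;
}

void setup() {
  Serial.begin(115200);
  filter.begin(sampleRateHz);  // tell the filter the update rate
}

void loop() {
  float ax, ay, az, gx, gy, gz;
  readIMU(ax, ay, az, gx, gy, gz);
  filter.updateIMU(gx, gy, gz, ax, ay, az);  // 6-axis update, no magnetometer
  Serial.println(filter.getYaw());           // yaw will drift without a mag
  delay(10);                                 // crude pacing to ~100 Hz
}
```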

If you are trying to obtain accurate localization, then you are using the wrong sensor.
I would suggest RTK GPS, Bluetooth AoA, a SLAM-based approach (with a camera or lidar), or optical tracking as mentioned by Liam.

With Rokoko, they have multiple IMUs and likely use inverse kinematics with each footstep to calculate position.

If motion capture is what you are trying to achieve, have a look at OpenPose, which only requires a camera.

2 Likes

Thanks for your response Meng. I am trying to achieve what Rokoko does. So no, I don’t need GPS. Rokoko uses IMUs, same as my project. They don’t use IK for raw data collection (but they do have filters that use it for post-processing).

I am specifically, and deliberately, not opting for an optical approach here. There are many pose estimation libraries within the CV world; that’s not what I’m trying to solve. I am also not trying to find an off-the-shelf mocap solution. There are plenty available, and I’ve used many of them for animation productions. This is a hobby project to see if I can build my own; I won’t be using it in production 🙂

So you are right: the MPU6050 has its own orientation solver, but I need to be able to solve for orientation along a chain. As an example of the problem, imagine an arm: upper arm and forearm, connected by an elbow. The elbow is a hinge joint, meaning it rotates on a single axis. If I put an IMU just above the wrist and another just above the elbow, and I know the length between these sensors and the position of the elbow relative to each of them, then (if the sensor data is accurate and doesn’t have time-based errors) I can simply take the orientation of each of those sensors and calculate the rotation of the elbow.

Animation data in 3D is usually stored as joint rotations - that’s how skeletal animation is done in most 3D packages, anyway. So that’s essentially my goal: take a bunch of IMUs in predetermined positions, constrained along bones that form a joint chain, and solve for the orientation of those joints.

From my testing, however, simply taking the orientation output from the MPU6050 doesn’t enable this. Part of that will come down to the sample rate and accuracy of the MPU6050, but at the end of the day this is a prototype project that will have lots of practical limitations. I’m not trying to build a product to rival Rokoko, for instance 🙂
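
For what it’s worth, the per-joint step I’m describing boils down to a relative rotation between the two sensors’ quaternions. A minimal sketch of that maths (the names are illustrative, and it assumes both IMUs report unit quaternions in a shared reference frame after calibration):

```cpp
// Elbow rotation as the relative rotation between two sensor quaternions.
// Quaternions are unit length, in (w, x, y, z) order.
struct Quat { float w, x, y, z; };

Quat conjugate(const Quat &q) {  // inverse of a unit quaternion
  return {q.w, -q.x, -q.y, -q.z};
}

Quat multiply(const Quat &a, const Quat &b) {  // Hamilton product
  return {
    a.w * b.w - a.x * b.x - a.y * b.y - a.z * b.z,
    a.w * b.x + a.x * b.w + a.y * b.z - a.z * b.y,
    a.w * b.y - a.x * b.z + a.y * b.w + a.z * b.x,
    a.w * b.z + a.x * b.y - a.y * b.x + a.z * b.w
  };
}

// The forearm's orientation expressed in the upper arm's frame. For a true
// hinge joint you could then project this onto the known hinge axis, which
// also gives a way to reject some of the sensor error.
Quat elbowRotation(const Quat &qUpperArm, const Quat &qForearm) {
  return multiply(conjugate(qUpperArm), qForearm);
}
```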

Edit: Where did Meng’s post go?

3 Likes

Since a few people have said that I should use optical systems instead, or other people’s libraries, etc., I should mention that I am deliberately trying to reinvent the wheel here.
I know there are other things that do what I’m doing; I am trying to replicate those things. Cheers 🙂

2 Likes

Hi Luke,

My post got flagged as spam - currently getting reviewed.

What you are trying to do is very difficult algorithmically - probably more like research-level stuff. Do you have access to research papers?

The biggest issue I see is that the MPU6050 has no magnetometer for drift correction, so you’ll end up with weird results. You cannot correct for heading drift if there is no sensor to measure heading.

Another approach could be using BLE AoA tags to get up to 1 cm accuracy, so you don’t need to add up all the errors from the chain of bone orientations.

3 Likes

I have an idea. In my project, I calibrated the orientation by aligning the thigh and waist headings when standing. I believe a similar technique could be used for the arms. I think it’s impossible to correct this for the head, though.

2 Likes

I have a similar, but not so hard, issue with extracting pulse (heart rate) and respiration data from an oximeter waveform. One approach (Renesas) was to use various filters, including a Kalman filter.
I would be interested in anyone having an implementation in C++ for small micros such as an Arduino. In this case it is an ESP32-WROOM.

R

3 Likes

A Kalman filter requires you to have a mathematical model relating the multiple sensors. In your case, I cannot see how that applies.

It seems like something a peak detector could work for. What does your waveform look like?
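
For context, by “peak detector” I mean something as simple as this sketch - a slow baseline tracker plus a rising-edge threshold (the decay and threshold values are placeholders to tune against your signal):

```cpp
// Rising-edge peak detector with an adaptive DC baseline.
struct PeakDetector {
  float baseline = 0.0f;    // slow-tracking DC estimate
  float threshold = 50.0f;  // how far above baseline counts as a pulse
  bool above = false;       // currently above threshold?

  bool step(float sample) {
    baseline += 0.01f * (sample - baseline);  // slow low-pass for the DC
    bool isAbove = (sample - baseline) > threshold;
    bool newPeak = isAbove && !above;         // rising edge = one beat
    above = isAbove;
    return newPeak;
  }
};
```

Count the rising edges per unit time and you have your pulse rate.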

1 Like

Maybe so

The waveform is pulsatile at the pulse rate, with a large baseline DC component.

The Renesas discussion is at

https://www.renesas.com/us/en/document/apn/ob1203-pulse-oximeter-algorithm-spo2-heart-rate-and-respiration-rate

In particular, respiration is difficult - trying to extract a signal anywhere between 0.1-1 Hz from the pulsatile waveform at 1-4 Hz.

There is amplitude and frequency variation of the pulse waveform, and DC baseline variation, with respiration.
So in a sense you will have several “sensors” - the amplitude variation, the frequency variation and the baseline DC variation - all derived independently.

There is also noise with movement, and other confounding signals such as ambient light.

R

3 Likes

Very interesting. I wonder what the relationship is between the amplitude variation, the frequency variation and the baseline DC variation.

Since respiration occurs between 0.1-1 Hz, the other frequencies could be filtered out. A modern, computationally efficient technique is “wavelet filtering” - extracting the power of a small range of frequencies in a small time window. Like a Fourier transform, but localised to a specific frequency and time. But I don’t know enough about SpO2 to help you further.

2 Likes

Yes, I have seen a paper on wavelet transforms.

Is it possible to implement these on a small MCU?

2 Likes

Yes, it’s very simple. You probably don’t want a whole transform - only the frequencies and windows of interest:

  1. Create 1 sine and 1 cos wave at the chosen window size and frequency.

  2. Choose a window function and multiply it with the sine/cos wave to limit which frequencies contribute to the power.

  3. Convolve the signal with the windowed sine and cos waves.

  4. Amplitude is just sqrt( cosComponent^2 + sineComponent^2 )

You don’t have to use sine and cos - the two waveforms just have to be orthogonal (when you multiply the two functions together and integrate, the result is zero).
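
Putting those steps together for a single frequency and window, a minimal sketch could look like this (I’ve used a Hann window as an example; any reasonable window function works):

```cpp
#include <math.h>

// Amplitude of one frequency component over one window of samples.
// N = window length in samples, fs = sample rate in Hz.
float bandPower(const float *signal, int N, float fs, float freqHz) {
  float sinSum = 0.0f, cosSum = 0.0f;
  for (int n = 0; n < N; n++) {
    float window = 0.5f * (1.0f - cosf(2.0f * M_PI * n / (N - 1)));  // Hann
    float phase = 2.0f * M_PI * freqHz * n / fs;
    sinSum += signal[n] * window * sinf(phase);  // correlate with sine
    cosSum += signal[n] * window * cosf(phase);  // correlate with cosine
  }
  return sqrtf(sinSum * sinSum + cosSum * cosSum);  // amplitude at freqHz
}
```

Call it for each frequency of interest - e.g. stepping from 0.1 to 1 Hz - and take the strongest response as your respiration estimate.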

3 Likes

Thanks for that

I’ll use a peak detector for the pulse - the current library already does that, but it does lose track fairly frequently.

The Renesas algorithm uses a second windowing technique to improve the accuracy.

I am using the TI library for the AFE4490.

The range of possible signal frequencies for respiration is quite wide - a decade or so.

I guess you would have to do the transform repeatedly over narrow ranges to cover the whole window of interest.

R

3 Likes

The way Rokoko approach this is by taking a ‘snapshot’ of the actor in a T-Pose.

I think maybe the MPU6050 just isn’t a suitable part for this application, or maybe I need a different setup for coordinating the many sensors. I don’t know how to get the timestamp of each read - as in, when I read the sensor, how much time has passed since it made the measurement? And if I use a multiplexer to talk to several of them, how can I get a measurement from each one within a short enough window when I can’t talk to them all in parallel?

3 Likes

One possible solution is to group the sensors into smaller regions - e.g. arm, leg, torso - for higher-frequency sampling and calculations. I’d assume whatever Kalman filter you create would only be applicable within these groups. Precise timing can be achieved with a timer interrupt.

Then send the calculated quaternions to a main processor at a lower frequency.

Streaming to PC is another challenge.

Overall, I recommend using an ESP32 for this, since it has very fast dual cores for parallel processing and two I2C buses. Wireless streaming using ESP-NOW is almost as fast as USB UART.
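
As a rough sketch of the two-bus idea on the Arduino-ESP32 core - two MPU6050s at the same address, one per bus (the pin numbers are placeholders; pick whatever is free on your board):

```cpp
#include <Wire.h>

const uint8_t MPU_ADDR = 0x68;  // MPU6050 default I2C address

void wake(TwoWire &bus) {
  bus.beginTransmission(MPU_ADDR);
  bus.write(0x6B);  // PWR_MGMT_1 register
  bus.write(0x00);  // clear the sleep bit
  bus.endTransmission();
}

int16_t readAccelX(TwoWire &bus) {
  bus.beginTransmission(MPU_ADDR);
  bus.write(0x3B);  // ACCEL_XOUT_H register
  bus.endTransmission(false);
  bus.requestFrom(MPU_ADDR, (uint8_t)2);
  return (bus.read() << 8) | bus.read();
}

void setup() {
  Serial.begin(115200);
  Wire.begin(21, 22);   // bus 0: SDA=21, SCL=22 (common defaults)
  Wire1.begin(25, 26);  // bus 1: SDA=25, SCL=26 (placeholder pins)
  wake(Wire);
  wake(Wire1);
}

void loop() {
  Serial.printf("%d %d\n", readAccelX(Wire), readAccelX(Wire1));
  delay(10);
}
```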

2 Likes

Hi All,

Interesting discussion!

In my small amount of experience, the drift can be estimated as close enough to linear and compensated for with another state in the Kalman filter - if there are any other non-linear components, adding another state and compensating should remove those too.
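
To make that concrete, the common “angle plus gyro bias” Kalman filter does exactly this - the extra bias state soaks up the drift. A minimal sketch (the noise values are illustrative starting points, not tuned):

```cpp
// 2-state Kalman filter: angle + gyro bias. The bias state models the drift.
struct KalmanAngle {
  float angle = 0.0f, bias = 0.0f;        // state: angle and gyro bias
  float P[2][2] = {{1, 0}, {0, 1}};       // estimate covariance
  float qAngle = 0.001f, qBias = 0.003f;  // process noise
  float rMeasure = 0.03f;                 // accel-angle measurement noise

  float update(float accelAngle, float gyroRate, float dt) {
    // Predict: integrate the bias-corrected rate, grow the covariance.
    angle += dt * (gyroRate - bias);
    P[0][0] += dt * (dt * P[1][1] - P[0][1] - P[1][0] + qAngle);
    P[0][1] -= dt * P[1][1];
    P[1][0] -= dt * P[1][1];
    P[1][1] += qBias * dt;

    // Correct with the accelerometer's absolute (but noisy) angle.
    float S = P[0][0] + rMeasure;
    float K0 = P[0][0] / S, K1 = P[1][0] / S;  // Kalman gains
    float y = accelAngle - angle;              // innovation
    angle += K0 * y;
    bias += K1 * y;

    float P00 = P[0][0], P01 = P[0][1];
    P[0][0] -= K0 * P00;
    P[0][1] -= K0 * P01;
    P[1][0] -= K1 * P00;
    P[1][1] -= K1 * P01;
    return angle;
  }
};
```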

I’d say it could definitely be done! To expand on Meng’s idea, a Pico could be used as a ‘hub’ for up to four MPUs, and its fast clock speed at a mint price means that you can still keep everything cheapish.

Circling way back to the scope of your project Luke:

  • Would you oppose using another non-optical sensor? Core has a mint guide on using a magnetometer as an angle measurement sensor: DIY Magnetic Encoder - Tutorial Australia

  • Will you be interacting with the room itself or do you need tracking within the 10x10m space (accurate relative to the starting point)?

3 Likes