How can I convert IMU data to position data in simulink?

113 views (last 30 days)
Hello everybody, I'm trying to control a robot via ROS communication. I already have a Simulink model with the control logic that publishes the velocity topic to the robot based on the odom topic. Right now I have a BNO055 to receive IMU data from the robot, but the problem is that I have to convert it to odometry data. Can anyone suggest a way to convert IMU data to position with a Simulink model? My idea is to integrate acceleration to get velocity, and integrate again to get position, and maybe an EKF block is needed. But I'm not sure whether this idea is OK, and I also don't understand where to put the EKF or how to use it.
  1 Comment
Vivek
Vivek on 25 Jan 2024
Edited: Vivek on 25 Jan 2024
Hi Setthivadee,
You can use insEKF to estimate position from IMU data. This will work better than simple double integration of the IMU data with an Integrator block, because that technique accumulates error quickly and you will see drift in the vehicle trajectory as a result.
You can follow this example to do that. Create a MATLAB Function block based on that example, with a persistent ‘filt’ variable that you initialize on the first call to the block. Be sure to add a motion model appropriate for your vehicle; insMotionPose may suit your purpose. Then, on every data sample, run the loop that predicts and corrects the state using the IMU data. The inputs to the function block will be the IMU data, and the output will be the position, which you can extract from the ‘filt’ object using stateparts.
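Language aside, the persistent-filter pattern described above (create the filter once on the first call, then predict and correct on every sample) can be sketched as below. This is a hedged, language-agnostic illustration: `ImuPositionFilter`, its methods, and the driver loop are hypothetical stand-ins, not the actual insEKF API, and the dead-reckoning math is deliberately simplified.

```python
import numpy as np

class ImuPositionFilter:
    """Hypothetical stand-in for an INS EKF: keeps state between calls,
    predicts with IMU data, and exposes the position part of the state."""
    def __init__(self):
        self.pos = np.zeros(3)
        self.vel = np.zeros(3)

    def predict(self, accel, dt):
        # Dead-reckoning prediction; a real filter also propagates covariance.
        self.pos += self.vel * dt + 0.5 * accel * dt**2
        self.vel += accel * dt

    def correct(self, pos_meas, gain=0.2):
        # Blend in an external position fix (simplified correction step).
        self.pos += gain * (pos_meas - self.pos)

filt = None  # plays the role of the persistent 'filt' variable

def function_block(accel, dt):
    """Body of the MATLAB Function block: initialize the filter on the
    first call, then predict each sample and return the position."""
    global filt
    if filt is None:              # first call: create the filter
        filt = ImuPositionFilter()
    filt.predict(accel, dt)
    return filt.pos.copy()

# Drive the "block" at an assumed 100 Hz with 1 m/s^2 acceleration along x:
for _ in range(100):
    p = function_block(np.array([1.0, 0.0, 0.0]), 0.01)
```
After 1 s of constant 1 m/s² the returned position is about 0.5 m along x, as the kinematics predicts; in the real block the correct step would be fed by any absolute measurement you have.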

Sign in to comment.

Answers (1)

Harimurali
Harimurali on 11 Jan 2024
Hi Setthivadee,
Here is a step-by-step guide to help you create a Simulink model to control a robot via ROS communication:
1. Read IMU Data: Use the appropriate ROS blocks to subscribe to the IMU topic and receive the accelerometer and gyroscope data.
2. Preprocessing: Apply any necessary preprocessing to the IMU data, such as removing biases and scaling the measurements to the correct units.
3. Integration to Velocity: Integrate the linear acceleration data to estimate the velocity. This can be done using a simple integrator block, but remember that this will only give you relative velocity, which will drift over time.
4. Integration to Position: Integrate the estimated velocity to get an estimate of the position. Again, this will be a relative position and will drift.
5. Orientation Estimation: Use the gyroscope data to estimate the orientation of the robot. This can be done with techniques such as quaternion integration, or by integrating the angular rates into Euler angles.
6. Sensor Fusion with EKF: Implement an Extended Kalman Filter to fuse the IMU data with any other available sensors (e.g., wheel encoders, GPS, etc.) to improve the position and velocity estimates. The EKF will help correct the drift introduced by the double integration of acceleration.
7. EKF Implementation: The EKF will have two main steps: prediction and update.
  • Prediction Step: Use the motion model of your robot to predict the next state and covariance based on the control inputs (e.g., acceleration and angular velocity from the IMU).
  • Update Step: Use the measurements from the IMU (and possibly other sensors) to update the state and covariance estimates.
8. Output Odometry Data: Use the estimated state from the EKF to publish odometry data to a ROS topic. This will typically include the position and orientation of the robot.
9. Simulink Model Structure: Here is a basic structure for your Simulink model:
  1. ROS Subscribers for IMU data.
  2. Preprocessing blocks for noise reduction and bias removal.
  3. Integrator blocks for velocity and position estimation.
  4. Orientation estimation blocks.
  5. Extended Kalman Filter block for sensor fusion.
  6. ROS Publisher for odometry data.
10. Testing and Tuning
  • Test the Simulink model in a simulated environment first to tune the EKF parameters and ensure that the position and velocity estimates are accurate.
  • Once the simulation works well, you can proceed to test with the real robot.
Remember that this is a complex process and may require iterative testing and tuning to get accurate results. If you are not familiar with EKF, it might be beneficial to use existing ROS packages like "robot_localization" that provide sensor fusion capabilities, including the use of EKF for combining data from various sensors to produce accurate state estimates.
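Steps 2–4 above (bias removal followed by double integration) can be sketched as follows. This is an illustrative sketch only: the 100 Hz rate, the bias value, and the synthetic signals are assumptions, not values from the original post.

```python
import numpy as np

dt = 0.01                                            # assumed 100 Hz IMU
rng = np.random.default_rng(0)

# Step 2: estimate accelerometer bias from samples taken while the robot
# is known to be stationary (reading = bias + noise), then subtract it.
still = 0.05 + 0.01 * rng.standard_normal(200)
bias = still.mean()

moving = 1.0 + 0.05 + 0.01 * rng.standard_normal(100)  # true accel = 1 m/s^2
accel = moving - bias                                  # bias-corrected samples

vel = np.cumsum(accel * dt)   # Step 3: integrate acceleration -> velocity
pos = np.cumsum(vel * dt)     # Step 4: integrate velocity -> position
```
After 1 s of constant 1 m/s², `vel[-1]` is near 1 m/s and `pos[-1]` near 0.5 m; the residual noise and bias error in these integrals is exactly the drift that the EKF correction in steps 6–7 is meant to bound.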
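Step 5 (orientation from gyroscope data via quaternion integration) can be illustrated with a minimal first-order sketch, assuming a 100 Hz gyro and a pure yaw rotation; the helper functions are written out here rather than taken from any library.

```python
import numpy as np

def quat_multiply(q, r):
    """Hamilton product of two quaternions in (w, x, y, z) order."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def integrate_gyro(q, omega, dt):
    """One Euler step of q_dot = 0.5 * q (x) [0, omega], renormalized."""
    dq = 0.5 * quat_multiply(q, np.array([0.0, *omega])) * dt
    q = q + dq
    return q / np.linalg.norm(q)   # keep unit length

# Rotate about z at pi/2 rad/s for 1 s (100 steps) -> expect 90 deg yaw.
q = np.array([1.0, 0.0, 0.0, 0.0])
for _ in range(100):
    q = integrate_gyro(q, [0.0, 0.0, np.pi / 2], 0.01)

yaw = np.degrees(2.0 * np.arctan2(q[3], q[0]))
```
The renormalization after each step is what keeps the quaternion usable; without it, numerical error slowly destroys the unit-norm property.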
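The predict/update cycle of step 7 can be shown on a toy 1-D example: a [position; velocity] state driven by IMU acceleration as the control input, corrected by an external position measurement (e.g. a wheel encoder). Note this sketch is a linear Kalman filter, which the EKF reduces to when the motion and measurement models are linear; all numeric values are illustrative assumptions.

```python
import numpy as np

dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])  # state transition for x = [pos, vel]
B = np.array([[0.5 * dt**2], [dt]])    # control matrix for acceleration input
H = np.array([[1.0, 0.0]])             # we measure position only
Q = 0.01 * np.eye(2)                   # process noise (tune for your robot)
R = np.array([[0.05]])                 # measurement noise

x = np.zeros((2, 1))                   # initial state
P = np.eye(2)                          # initial covariance

def predict(x, P, accel):
    # Prediction step: propagate the state with the motion model + IMU input.
    x = F @ x + B * accel
    P = F @ P @ F.T + Q
    return x, P

def update(x, P, z):
    # Update step: correct with an external position measurement.
    y = z - H @ x                      # innovation
    S = H @ P @ H.T + R                # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Constant 0.2 m/s^2 acceleration; encoder reports the true position.
for k in range(50):
    x, P = predict(x, P, accel=0.2)
    x, P = update(x, P, z=np.array([[0.5 * 0.2 * ((k + 1) * dt) ** 2]]))
```
After 5 s the position estimate `x[0, 0]` sits at the true value of 2.5 m; with a real robot the encoder (or GPS) measurement is what keeps the IMU-driven prediction from drifting.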
