It would be great if someone working in the same area could suggest some solutions; your input will be valuable for my PhD work. Thanks in advance!
Monocular visual odometry for custom datasets
I am trying to implement the monocular visual odometry example at https://www.mathworks.com/help/vision/ug/monocular-visual-odometry.html using my own dataset. The dataset was prepared by moving a robot along a particular trajectory and capturing images together with ground-truth poses.
But when I ran the code on my own dataset (a circular trajectory), I obtained the output shown below.
![](https://www.mathworks.com/matlabcentral/answers/uploaded_files/1335059/image.jpeg)
As can be seen, the estimated trajectory covers only half of the path. I also observed that, after a particular frame, the translation vector stays the same and only the rotation component changes. Please suggest how to diagnose and rectify this.
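A quick sanity check for the symptom described above (translation frozen while rotation keeps changing) is to scan the estimated trajectory for frames whose translation step collapses to zero. This sketch is not part of the MathWorks example; it assumes you have exported the estimated camera positions to an N×3 array, and the function name and tolerance are my own choices. A long run of stalled frames usually points to the relative-pose estimation failing from that frame onward (e.g., too few feature matches/inliers, or a degenerate, nearly pure-rotation motion between views):

```python
import numpy as np

def find_stalled_frames(positions, tol=1e-6):
    """Return indices of frames whose translation step is ~zero.

    positions: (N, 3) array of estimated camera positions, one row per frame.
    A long run of stalled frames suggests the per-frame pose update is
    failing (e.g., degenerate essential matrix or too few matched features).
    """
    steps = np.linalg.norm(np.diff(positions, axis=0), axis=1)
    return np.where(steps < tol)[0] + 1  # index of the frame that failed to move

# Synthetic example: the camera stops moving after frame 2.
pos = np.array([[0, 0, 0], [1, 0, 0], [2, 0, 0], [2, 0, 0], [2, 0, 0]], float)
print(find_stalled_frames(pos))  # -> [3 4]
```

If the stall begins at a fixed frame regardless of the trajectory, it is worth inspecting the feature-match count at exactly that frame pair.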
Also, please explain how the ground-truth poses were generated for the New Tsukuba Stereo Dataset, as this would help me prepare my ground truth in the same way.
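I cannot speak to how the New Tsukuba ground truth was produced (it is a synthetic, computer-rendered dataset, so its poses presumably come from the rendering pipeline, and its file layout may differ from what follows). For a real robot, though, one widely used plain-text convention is the TUM trajectory format: one line per frame with `timestamp tx ty tz qx qy qz qw`. A minimal sketch for a planar robot, where orientation is just a yaw angle converted to a quaternion about Z (the function names here are my own):

```python
import math

def yaw_to_quaternion(yaw):
    """Quaternion (qx, qy, qz, qw) for a rotation of `yaw` radians about Z."""
    return (0.0, 0.0, math.sin(yaw / 2.0), math.cos(yaw / 2.0))

def write_tum_trajectory(path, poses):
    """Write ground-truth poses in the TUM trajectory format:
    one line per frame: timestamp tx ty tz qx qy qz qw.

    poses: iterable of (timestamp, x, y, z, yaw) tuples, e.g. from the
    robot's wheel odometry or a motion-capture system.
    """
    with open(path, "w") as f:
        for t, x, y, z, yaw in poses:
            qx, qy, qz, qw = yaw_to_quaternion(yaw)
            f.write(f"{t:.6f} {x:.6f} {y:.6f} {z:.6f} "
                    f"{qx:.6f} {qy:.6f} {qz:.6f} {qw:.6f}\n")

# Example: three poses along a quarter turn.
write_tum_trajectory("groundtruth.txt", [
    (0.0, 0.0, 0.0, 0.0, 0.0),
    (0.1, 0.5, 0.0, 0.0, math.pi / 4),
    (0.2, 1.0, 0.5, 0.0, math.pi / 2),
])
```

Whatever format you choose, the important part is that timestamps (or frame indices) in the ground-truth file can be matched one-to-one with the captured images.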
Answers (0)