Import data problem for training Faster R-CNN
I am trying to train my Faster R-CNN on the attached custom data (images with their bounding boxes in .txt files), but I am not able to do it. Kindly help me.
Note:
I have multiple images, each with its own .txt file.
4 Comments
Ganesh
on 13 Dec 2023
Hi,
Have you gone through the documentation on how the training data for Faster R-CNN must be structured? As I understand it, your multiple images would form the first column, your bounding-box data the second column, and the dataset should also contain a label. Please refer to this document: https://in.mathworks.com/help/vision/ref/trainfasterrcnnobjectdetector.html#bvkk009-1-trainingData:~:text=trainingData%20%E2%80%94%20Labeled%20ground%20truth
Kindly reach out if you have any challenges in structuring your dataset.
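For illustration, the table layout described above could be assembled like this. This is a minimal sketch, not the asker's actual code: the folder name, the "*.jpg" extension, the single class column named "object", and the assumption that each .txt file holds one [x y width height] row per box (in pixels) are all placeholders.

% Sketch: build a training table for trainFasterRCNNObjectDetector.
% Assumes images are in imageDir and each image has a matching .txt file
% containing one [x y width height] row per bounding box (pixel units).
imageDir   = fullfile(pwd, "images");          % placeholder folder
imageFiles = dir(fullfile(imageDir, "*.jpg")); % placeholder extension

imageFilename = strings(numel(imageFiles), 1);
object        = cell(numel(imageFiles), 1);    % one set of boxes per image

for k = 1:numel(imageFiles)
    imgPath   = fullfile(imageDir, imageFiles(k).name);
    [~, name] = fileparts(imgPath);
    boxFile   = fullfile(imageDir, name + ".txt");

    object{k}        = readmatrix(boxFile);    % M-by-4 [x y w h] matrix
    imageFilename(k) = imgPath;
end

% First column: image file names; each remaining column: the boxes for
% one class ("object" here is a placeholder class name).
trainingData = table(imageFilename, object);

A table like this can be passed to trainFasterRCNNObjectDetector directly, or converted into an imageDatastore plus boxLabelDatastore if a datastore input is preferred.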
Answers (1)
Githin George
on 18 Dec 2023
Hello Ahmad,
It is my understanding that you are trying to train a Faster R-CNN network and would like to create a MATLAB “groundTruth” object for it.
The “groundTruth” object is meant for use with the various “Labeler apps” available in MATLAB. You can create a “groundTruth” object programmatically, or use a labeling app such as the “Image Labeler” app to create labels and export them to the workspace or to a file. Even though this object contains all the label information, it cannot be used directly to train a Faster R-CNN network, because the input to the “trainFasterRCNNObjectDetector” function must be a table or a datastore. The data must first be extracted from the object and then passed to the training function.
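As a sketch of that extraction step, assuming a “groundTruth” object named gTruth already exists in the workspace (for example, exported from the Image Labeler), the objectDetectorTrainingData function can convert it into datastores that trainFasterRCNNObjectDetector accepts. The "resnet50" base network and the training options below are placeholder choices, not recommendations.

% Sketch: convert a groundTruth object into training data and train.
% Assumes gTruth contains rectangle ROI labels.
[imds, blds] = objectDetectorTrainingData(gTruth);  % image + box label datastores
ds = combine(imds, blds);                           % combined datastore input

options = trainingOptions("sgdm", ...
    "MiniBatchSize", 2, ...
    "InitialLearnRate", 1e-3, ...
    "MaxEpochs", 10);

% "resnet50" is one possible pretrained base network (it requires the
% corresponding support package); any supported network name or a
% layer graph could be used instead.
detector = trainFasterRCNNObjectDetector(ds, "resnet50", options);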
Please refer to the documentation for the “groundTruth” object and the “trainFasterRCNNObjectDetector” function for more details.
I hope this helps.
0 Comments