Why does AlexNet train slower and use much more memory with "simpler" brain CT images?

MATLAB R2017a, Windows, GPU
I have used transfer learning with AlexNet. I retrained it to classify 3 types of CT brain abnormalities by changing the last fully connected layer and using the trainNetwork function. As image input I converted the CT brain scans to uint8 TIFF images (pixel values in the 0-255 range). As a result, surrounding air and scalp fat were clipped to 0, which is water density on CT images, and bone saturated to 255. The brain tissues, both normal and abnormal, were unchanged in the 1-85 range. This worked well, with mini-batch accuracies of ~0.98 and test-batch accuracies of ~0.95.
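The conversion described above can be sketched as follows. This is a minimal illustration, not the poster's actual code: the variable names and the sample Hounsfield-unit values are assumptions chosen to match the behavior described (air and fat clip to 0, bone saturates to 255, brain tissue in 1-85 is unchanged).

```matlab
% Hypothetical sketch of the HU-to-uint8 conversion described above.
% Sample values are illustrative: air, fat, water, brain, brain, bone.
hu  = [-1000 -60 0 40 85 1200];   % Hounsfield units
img = uint8(hu);                  % uint8 cast saturates: <0 -> 0, >255 -> 255
% img is now [0 0 0 40 85 255]: air and fat clip to 0 (water density),
% bone saturates to 255, and brain tissue keeps its 1-85 values.
```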
I then thought that I could improve the network by removing the bone/skull in image preprocessing. So I wrote a function which sets the bone/skull to 0 (instead of 255). I assumed that this would allow the brain tissues and abnormalities to extend over a greater range in the normalized images. Now when I attempt to retrain AlexNet, the iterations take about 5 times longer, system memory usage maxes out (63 GB is available), and MATLAB freezes. Any idea why?
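The bone-removal preprocessing could look roughly like this. This is a sketch under stated assumptions, not the poster's function: the sample pixel row is invented, and the rescaling step (stretching the 1-85 brain range over 0-255) is one plausible reading of "extend over a greater range".

```matlab
% Hypothetical sketch of the bone-removal step; sample values are
% illustrative: air, two brain tissues, brain, skull (one pixel row).
img = uint8([0 3 40 85 255]);
img(img == 255) = 0;                % set bone/skull to 0 instead of 255
img = uint8(double(img) * 255/85);  % stretch the 1-85 brain range to 0-255
% img is now [0 9 120 255 0]: brain tissue spans the full range,
% while air and former bone pixels both sit at 0.
```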
2 Comments
Joss Knight on 28 Apr 2018
The failure to converge as fast is not a particular surprise - perhaps there simply isn't sufficient information in your dataset to do the classification now (i.e. the bone/water distinction was essential). As to why you max out system memory, that sounds like a memory leak. Are you able to upgrade MATLAB to a more recent version?
Peter on 30 Apr 2018
Thank you for your reply. I suspect you are correct that the bone/water distinction is necessary in this example.
I am using R2017b.


Answers (0)


