Low performance when training SVM model using "polynomial" kernel function
Hello,
I am trying to compare the performance of SVM regression (or SVR) between "rbf", "polynomial", and "linear".
Training works well with "rbf" and "linear" (e.g., R^2 of about 0.7–0.8).
However, when "polynomial" is used as the kernel function, performance degrades to an R^2 of about 0.001, or even negative.
I used the code:
Mdl = fitrsvm(X,Y,'Standardize',true,'KernelFunction','polynomial','OptimizeHyperparameters',{'BoxConstraint','Epsilon','KernelScale','PolynomialOrder'},'HyperparameterOptimizationOptions',struct('MaxObjectiveEvaluations',100))
Please help
Thank you.
Accepted Answer
More Answers (1)
Ganesh
on 14 Jun 2024
2 votes
The accuracy you achieve with a kernel function depends on the distribution of your data. Sharing your data would help us give you a better idea of the reason.
You could try out the following example in MATLAB:
First, run the example and note the number of iterations; then change the "KernelFunction" to "polynomial" and rerun the model. You will find that it now takes roughly 20 times as many iterations to converge.
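As a minimal sketch of that comparison (the data and variable names here are made up for illustration; your real data will behave differently):

```matlab
% Sketch: compare SVR training across kernel functions on a toy data set.
rng(1);                                  % reproducibility
X = rand(200, 3);                        % 200 observations, 3 predictors
Y = sin(2*pi*X(:,1)) + 0.5*X(:,2).^2 + 0.1*randn(200, 1);

kernels = {'linear', 'rbf', 'polynomial'};
for k = 1:numel(kernels)
    Mdl = fitrsvm(X, Y, ...
        'Standardize', true, ...
        'KernelFunction', kernels{k});
    % Report resubstitution MSE and whether the solver converged.
    fprintf('%-10s  MSE: %.4f  converged: %d\n', ...
        kernels{k}, resubLoss(Mdl), Mdl.ConvergenceInfo.Converged);
end
```

With the polynomial kernel you will typically see slower convergence (or no convergence at the default iteration limit) unless 'KernelScale' and 'PolynomialOrder' are tuned to the data.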
When your data has only two or three columns it is easy to visualize, but as the dimensionality grows it becomes harder to plot and interpret your findings.
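For a single predictor, a quick plot can show how the polynomial fit compares with "rbf" (again a hypothetical toy example, not your data):

```matlab
% Sketch: visualize SVR fits for a single-predictor toy problem.
rng(2);
x = linspace(-3, 3, 150)';
y = sin(x) + 0.2*randn(size(x));

mdlRbf  = fitrsvm(x, y, 'Standardize', true, 'KernelFunction', 'rbf');
mdlPoly = fitrsvm(x, y, 'Standardize', true, ...
                  'KernelFunction', 'polynomial', 'PolynomialOrder', 3);

% Overlay the raw data and both fitted curves.
plot(x, y, '.', x, predict(mdlRbf, x), '-', x, predict(mdlPoly, x), '--');
legend('data', 'rbf', 'polynomial (order 3)', 'Location', 'best');
```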
1 Comment
minhyuk jeung
on 17 Jun 2024
