
I have trained a classifier with some data, but now I don't know how to test it on other data.

Main script:
DataSet = readtable('Dataset.xlsx');
% DataSet is a table object
% The second argument is the number of cross-validation folds
fold=342;              % number of cross-validation folds
N=5;                   % number of repetitions
D_orani=zeros(N,3);    % accuracy of each classifier per repetition
k1=zeros(2,2,N);       % confusion matrices: Ensemble GentleBoost
k2=zeros(2,2,N);       % confusion matrices: weighted k-NN
k3=zeros(2,2,N);       % confusion matrices: logistic regression
for i=1:N
[k1(:,:,i),D_orani(i,1)]=Ensemble_GentleBoost(DataSet,fold);
[k2(:,:,i),D_orani(i,2)]=KnnClassifier(DataSet,fold);
[k3(:,:,i),D_orani(i,3)]=LojisticRegression(DataSet,fold);
end
Dogruluk_orani=sum(D_orani,1)/N;                  % mean accuracy over the N repetitions
k1=mean(k1,3);k2=mean(k2,3);k3=mean(k3,3);        % mean confusion matrices
fprintf('Ensemble_GentleBoost accuracy = %8.8f %%\n',Dogruluk_orani(1)*100)
fprintf('Weighted K-NN accuracy = %8.8f %%\n',Dogruluk_orani(2)*100)
fprintf('Logistic Regression accuracy = %8.8f %%\n',Dogruluk_orani(3)*100)
---------------------------------------------------------------------------------------------------------------
KnnClassifier.m
%*** 24/12/2018*********************************************%
%*** ALHASAN ALKHATIB B140100255****************************%
%*** Parkinson's disease detection from voice recordings **%
%*** KnnClassifier.m file **********************************%
%***********************************************************%
% Weighted K-NN
function [konfizyon,validationAccuracy]=KnnClassifier(DataSet,fold)
inputTable = DataSet;
predictorNames = {'Jitta', 'jitt', 'jit_rap', 'jit_ppq5', 'jit_DDP', 'sh_DB', 'shimmer', 'sh_apq3', 'sh_apq5', 'sh_apq11', 'shim_DDP', 'median_pitch', 'mean_pitch', 'max_pitch', 'min_pitch', 'range_pitch', 'variation', 'oto_KT', 'oto_K0', 'HNR', 'NHR'};
predictors = inputTable(:, predictorNames);
response = inputTable.class;
% Train a classifier
classificationKNN = fitcknn(...
predictors, ...
response, ...
'Distance', 'Euclidean', ...
'Exponent', [], ...
'NumNeighbors', 11, ...
'DistanceWeight', 'SquaredInverse', ...
'Standardize', true, ...
'ClassNames', [0; 1]);
% predict function
predictorExtractionFcn = @(t) t(:, predictorNames);
knnPredictFcn = @(x) predict(classificationKNN, x);
trainedClassifier.predictFcn = @(x) knnPredictFcn(predictorExtractionFcn(x));
trainedClassifier.RequiredVariables = {'Jitta', 'jitt', 'jit_rap', 'jit_ppq5', 'jit_DDP', 'sh_DB', 'shimmer', 'sh_apq3', 'sh_apq5', 'sh_apq11', 'shim_DDP', 'median_pitch', 'mean_pitch', 'max_pitch', 'min_pitch', 'range_pitch', 'variation', 'oto_KT', 'oto_K0', 'HNR', 'NHR'};
trainedClassifier.ClassificationKNN = classificationKNN;
trainedClassifier.About = 'This struct is a trained classifier exported from Classification Learner R2016a.';
trainedClassifier.HowToPredict = sprintf('To make predictions on a new table, T, use: \n yfit = c.predictFcn(T) \nreplacing ''c'' with the name of the variable that is this struct, e.g. ''trainedClassifier''. \n \nThe table, T, must contain the variables returned by: \n c.RequiredVariables \nVariable formats (e.g. matrix/vector, datatype) must match the original training data. \nAdditional variables are ignored. \n \nFor more information, see <a href="matlab:helpview(fullfile(docroot, ''stats'', ''stats.map''), ''appclassification_exportmodeltoworkspace'')">How to predict using an exported model</a>.');
% predictors and response
inputTable = DataSet;
predictorNames = {'Jitta', 'jitt', 'jit_rap', 'jit_ppq5', 'jit_DDP', 'sh_DB', 'shimmer', 'sh_apq3', 'sh_apq5', 'sh_apq11', 'shim_DDP', 'median_pitch', 'mean_pitch', 'max_pitch', 'min_pitch', 'range_pitch', 'variation', 'oto_KT', 'oto_K0', 'HNR', 'NHR'};
predictors = inputTable(:, predictorNames);
response = inputTable.class;
% cross-validation
partitionedModel = crossval(trainedClassifier.ClassificationKNN, 'KFold', fold);
% validation accuracy
validationAccuracy = 1 - kfoldLoss(partitionedModel, 'LossFun', 'ClassifError');
% validation predictions and scores
[validationPredictions, validationScores] = kfoldPredict(partitionedModel);
[N,~]=size(DataSet);
R=response;                     % true class labels
VP=validationPredictions;       % cross-validated predictions
AA=0;AN=0;NA=0;NN=0;            % confusion-matrix counts
for i=1:N
if R(i)==1 && VP(i)==1
AA=AA+1;                        % true positive
elseif R(i)==1 && VP(i)==0
AN=AN+1;                        % false negative
elseif R(i)==0 && VP(i)==1
NA=NA+1;                        % false positive
elseif R(i)==0 && VP(i)==0
NN=NN+1;                        % true negative
end
end
% Row-normalized confusion matrix (row 1: true class 1, row 2: true class 0)
konfizyon=zeros(2,2);
nPos=sum(R==1); nNeg=sum(R==0); % class counts instead of hard-coded 168/174
konfizyon(1,1)=AA/nPos;konfizyon(1,2)=AN/nPos;konfizyon(2,1)=NA/nNeg;konfizyon(2,2)=NN/nNeg;
end

Answers (1)

KALASH on 28 Feb 2024
Hi Gaurav,
I see that you have trained a classifier on some data and now want to test the model on a new set of data. To do so, you can use the "predict" function, which can be called as follows:
% ... your code above ...
predictions = predict(your_trained_model_name, newData);
disp(predictions);
% ... end ...
Your newData should have the same variable names and formats (and therefore the same number of features) as your training data.
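For the code you posted, a minimal sketch of this approach could look like the one below. It assumes you change KnnClassifier to also return the trainedClassifier struct as a third output, and 'NewData.xlsx' is a hypothetical file name with the same predictor columns as Dataset.xlsx:
% Sketch only: assumes the function signature is changed to
%   function [konfizyon,validationAccuracy,trainedClassifier]=KnnClassifier(DataSet,fold)
% and that 'NewData.xlsx' (hypothetical name) contains the same predictor columns.
newData = readtable('NewData.xlsx');
[~, ~, trainedClassifier] = KnnClassifier(DataSet, fold);
newPredictions = trainedClassifier.predictFcn(newData);   % predicted 0/1 class labels
disp(newPredictions);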
That should work. Alternatively, you can append your new data to the old data and then split it into training and test sets accordingly. Below is example code showing how to split the data:
% Append new test data to the training data
X = [X; newTestData];
Y = [Y; cell(size(newTestData, 1), 1)]; % Placeholder labels for the new rows (assumes Y is a cell array of labels)
% Specify the indices for the training set
trainIndices = 1:size(X, 1) - size(newTestData, 1);
% Specify the indices for the testing set
testIndices = size(X, 1) - size(newTestData, 1) + 1:size(X, 1);
% Training set
XTrain = X(trainIndices, :);
YTrain = Y(trainIndices, :);
% Testing set
XTest = X(testIndices, :);
YTest = Y(testIndices, :);
% Train a k-Nearest Neighbors classifier
knnModel = fitcknn(XTrain, YTrain, 'NumNeighbors', 3);
% Predict using the trained classifier on the test set
predictions = predict(knnModel, XTest);
% Display the predictions
disp('Predictions for test data:');
disp(predictions);
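As an alternative to the manual index bookkeeping above, here is a minimal sketch using cvpartition (a standard Statistics and Machine Learning Toolbox function). It assumes X and Y are your original predictors and numeric 0/1 class labels, before any unlabeled rows are appended:
% Sketch: random hold-out split with cvpartition (30% of rows reserved for testing)
cv = cvpartition(size(X, 1), 'HoldOut', 0.3);
XTrain = X(training(cv), :);
YTrain = Y(training(cv), :);
XTest  = X(test(cv), :);
YTest  = Y(test(cv), :);
% Train and predict exactly as above
knnModel = fitcknn(XTrain, YTrain, 'NumNeighbors', 3);
predictions = predict(knnModel, XTest);
testAccuracy = mean(predictions == YTest);   % works for numeric 0/1 labels
fprintf('Hold-out test accuracy: %.4f\n', testAccuracy);
With a random hold-out split, the test rows keep their true labels, so you can compute a test accuracy directly.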
I hope this helps. For more information, you may refer to the MATLAB documentation for the "predict" function and for splitting data into training and test sets.
