Recursive feature selection with a function generated from the Classification Learner app
Hi,
I have generated an SVM training function from the Classification Learner app (below). I wish to use the sequentialfs() function to perform recursive feature elimination for a data mining project. With over 600 features (and well over 250,000 observations in the final balanced test set), feature elimination makes a lot of sense.
I am having real difficulty bringing the training function together with the documentation provided for sequentialfs(). Does anyone have experience getting a Classification Learner function running with sequentialfs()?
I was thinking this would give me a faster turnaround, but it has been a miserable experience. I think my next step will be to strip everything back and start from the beginning with Mdl = fitcsvm(Tbl,ResponseVarName); however, if anyone has experience meshing the two functions together, I would appreciate learning where I am going wrong!
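For reference, fitcsvm fits only binary (or one-class) SVMs; for a multi-class response such as the five sleep stages here, the programmatic starting point is fitcecoc with an SVM template. A minimal sketch, assuming the table and response variable ('Stage') used in the generated code below:
% Minimal multi-class baseline (table and response name taken from the generated code)
template = templateSVM('Standardize', true);
Mdl = fitcecoc(trainingData, 'Stage', 'Learners', template);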
function [trainedClassifier, validationAccuracy] = trainClassifier(trainingData)
% [trainedClassifier, validationAccuracy] = trainClassifier(trainingData)
% Returns a trained classifier and its accuracy. This code recreates the
% classification model trained in Classification Learner app. Use the
% generated code to automate training the same model with new data, or to
% learn how to programmatically train models.
%
% Input:
% trainingData: A table containing the same predictor and response
% columns as those imported into the app.
%
%
% Output:
% trainedClassifier: A struct containing the trained classifier. The
% struct contains various fields with information about the trained
% classifier.
%
% trainedClassifier.predictFcn: A function to make predictions on new
% data.
%
% validationAccuracy: A double representing the validation accuracy as
% a percentage. In the app, the Models pane displays the validation
% accuracy for each model.
%
% Use the code to train the model with new data. To retrain your
% classifier, call the function from the command line with your original
% data or new data as the input argument trainingData.
%
% For example, to retrain a classifier trained with the original data set
% T, enter:
% [trainedClassifier, validationAccuracy] = trainClassifier(T)
%
% To make predictions with the returned 'trainedClassifier' on new data T2,
% use
% [yfit,scores] = trainedClassifier.predictFcn(T2)
%
% T2 must be a table containing at least the same predictor columns as used
% during training. For details, enter:
% trainedClassifier.HowToPredict
% Auto-generated by MATLAB on 05-Nov-2023 20:21:55
% Extract predictors and response
% This code processes the data into the right shape for training the
% model.
inputTable = trainingData;
% The app emits the full literal list of all 632 predictor names; the same
% list is built programmatically here for readability (same names, same order).
predictorNames = [{'Epoch', 'Var1'}, ...
    arrayfun(@(k) sprintf('features%d', k), 1:630, 'UniformOutput', false)];
predictors = inputTable(:, predictorNames);
response = inputTable.Stage;
isCategoricalPredictor = false(1, 632); % none of the 632 predictors are categorical
classNames = categorical({'STAGE - N1'; 'STAGE - N2'; 'STAGE - N3'; 'STAGE - R'; 'STAGE - W'}, {'STAGE - N1' 'STAGE - N2' 'STAGE - N3' 'STAGE - NO STAGE' 'STAGE - R' 'STAGE - W' 'PAUSED'});
% Train a classifier
% This code specifies all the classifier options and trains the classifier.
template = templateSVM(...
'KernelFunction', 'polynomial', ...
'PolynomialOrder', 3, ...
'KernelScale', 'auto', ...
'BoxConstraint', 1, ...
'Standardize', true);
classificationSVM = fitcecoc(...
predictors, ...
response, ...
'Learners', template, ...
'Coding', 'onevsone', ...
'ClassNames', classNames);
% Create the result struct with predict function
predictorExtractionFcn = @(t) t(:, predictorNames);
svmPredictFcn = @(x) predict(classificationSVM, x);
trainedClassifier.predictFcn = @(x) svmPredictFcn(predictorExtractionFcn(x));
% Add additional fields to the result struct
trainedClassifier.RequiredVariables = predictorNames; % the same 632 predictor names listed above
trainedClassifier.ClassificationSVM = classificationSVM;
trainedClassifier.About = 'This struct is a trained model exported from Classification Learner R2023b.';
trainedClassifier.HowToPredict = sprintf('To make predictions on a new table, T, use: \n [yfit,scores] = c.predictFcn(T) \nreplacing ''c'' with the name of the variable that is this struct, e.g. ''trainedModel''. \n \nThe table, T, must contain the variables returned by: \n c.RequiredVariables \nVariable formats (e.g. matrix/vector, datatype) must match the original training data. \nAdditional variables are ignored. \n \nFor more information, see <a href="matlab:helpview(fullfile(docroot, ''stats'', ''stats.map''), ''appclassification_exportmodeltoworkspace'')">How to predict using an exported model</a>.');
% Extract predictors and response
% This code processes the data into the right shape for training the
% model. (The app regenerates this extraction block a second time before
% cross-validation; it is condensed here the same way as above.)
inputTable = trainingData;
predictorNames = [{'Epoch', 'Var1'}, ...
    arrayfun(@(k) sprintf('features%d', k), 1:630, 'UniformOutput', false)];
predictors = inputTable(:, predictorNames);
response = inputTable.Stage;
isCategoricalPredictor = false(1, 632);
classNames = categorical({'STAGE - N1'; 'STAGE - N2'; 'STAGE - N3'; 'STAGE - R'; 'STAGE - W'}, {'STAGE - N1' 'STAGE - N2' 'STAGE - N3' 'STAGE - NO STAGE' 'STAGE - R' 'STAGE - W' 'PAUSED'});
% Perform cross-validation
partitionedModel = crossval(trainedClassifier.ClassificationSVM, 'KFold', 5);
% Compute validation predictions
[validationPredictions, validationScores] = kfoldPredict(partitionedModel);
% Compute validation accuracy
validationAccuracy = 1 - kfoldLoss(partitionedModel, 'LossFun', 'ClassifError');
2 Comments
Image Analyst
on 5 Nov 2023
Honestly, I'd like to run the Classification Learner app myself. Can you upload your training data table in a .mat file? And the ground truth class is the first column of the table, right?
Accepted Answer
Drew
on 14 Nov 2023
Edited: Drew on 14 Nov 2023
The short answer to your question is that sequentialfs needs a function that trains and tests one model on one fold. That code snippet can be lifted from the code generated by Classification Learner.
Steps:
(1) Load the data you sent into Classification Learner, excluding the three columns you indicated. Build the SVM model. Generate code for that model. (You have already done these steps, based on the generated code that you shared in your question.)
(2) Create a script that uses sequentialfs, with the model-building code as suggested by Classification Learner.
% Feature selection
load randSamp_reduced_to_1000_for_IA.mat;
X = table2array(randSamp(:,5:end)); % predictor columns
y = table2array(randSamp(:,3));     % response (sleep stage)
% For reproducibility
rng("default");
% Create stratified partition
cv = cvpartition(y,"KFold",10);
% Call sequentialfs
opts = statset("Display","iter","UseParallel",true);
[tf,history] = sequentialfs(@myfun,X,y,"CV",cv,"Options",opts);
function criterion = myfun(XTrain,yTrain,XTest,yTest)
% This code is from what was generated by Classification Learner.
% Essentially, use the definition of a templateSVM followed by a call to fitcecoc.
template = templateSVM(...
'KernelFunction', 'linear', ...
'PolynomialOrder', [], ...
'KernelScale', 'auto', ...
'BoxConstraint', 1, ...
'Standardize', true);
% In this call to fitcecoc to build the multi-class SVM, the 'ClassNames'
% Name-Value pair was removed, because the default will give the same values for 'ClassNames'.
% That is, including the 'ClassNames' Name-Value Pair would produce the
% same results.
classificationSVM = fitcecoc(...
XTrain, ...
yTrain, ...
'Learners', template, ...
'Coding', 'onevsone');
% sequentialfs sums the criterion values across folds and divides by the
% total number of test observations, so return the misclassification
% count: the classification error (loss) multiplied by the test-fold size.
criterion = loss(classificationSVM,XTest,yTest) * size(XTest,1);
end
The code above gave the following console output (ignore the color coding due to using a code block here):
Start forward sequential feature selection:
Initial columns included: none
Columns that can not be included: none
Step 1, added column 568, criterion value 0.574
Step 2, added column 64, criterion value 0.484
Step 3, added column 290, criterion value 0.45
Step 4, added column 496, criterion value 0.428
Step 5, added column 340, criterion value 0.406
Step 6, added column 118, criterion value 0.398
Step 7, added column 274, criterion value 0.387
Step 8, added column 368, criterion value 0.381
Final columns included: 64 118 274 290 340 368 496 568
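For reference, tf is a logical mask over the columns of X, and history records the candidate set at each step. A minimal sketch of training a final model on just the selected columns, reusing the template settings from myfun (the variable names are assumptions carried over from the script above):
selectedIdx = find(tf); % e.g. [64 118 274 290 340 368 496 568]
finalModel = fitcecoc(X(:,tf), y, ...
    'Learners', templateSVM('KernelFunction','linear','KernelScale','auto', ...
    'BoxConstraint',1,'Standardize',true), ...
    'Coding','onevsone');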
I tried re-running with 5-fold cross-validation, that is, with
cv = cvpartition(y,"KFold",5);
and obtained the following result in the console output:
Start forward sequential feature selection:
Initial columns included: none
Columns that can not be included: none
Step 1, added column 568, criterion value 0.574
Step 2, added column 64, criterion value 0.488
Step 3, added column 290, criterion value 0.447
Step 4, added column 496, criterion value 0.428
Step 5, added column 340, criterion value 0.414
Step 6, added column 118, criterion value 0.403
Step 7, added column 147, criterion value 0.39
Step 8, added column 436, criterion value 0.381
Final columns included: 64 118 147 290 340 436 496 568
In the two runs above, the first 6 chosen features were the same. After that, the choices varied, but the reduction in the classification error was similar. One could also adjust the stopping criteria if desired. If there are many features which are highly correlated, then it could be that different feature selection mechanisms (or different runs with different random seeds) will choose alternates among features that contain similar information.
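For example, sequentialfs accepts an 'NFeatures' name-value pair that keeps the search running until a fixed number of features is selected; a one-line sketch using the same inputs as above (the choice of 12 is arbitrary):
% Keep adding features until exactly 12 are selected
[tf,history] = sequentialfs(@myfun,X,y,"CV",cv,"Options",opts,"NFeatures",12);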
Remember that sequentialfs is a brute-force, compute-intensive way to select features. One can use a subset of the data (like the 1000 samples here), or change the cross-validation strategy, to reduce the computation. There are many other feature selection methods that are much faster, such as those provided in the Classification Learner app (MRMR, Chi2, ReliefF, ANOVA, Kruskal-Wallis).
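As an illustration of one such filter method, a minimal sketch using fscmrmr on the same X and y as above (keeping the top 15 ranked features is an arbitrary choice):
% Rank all features by MRMR in a single pass (no repeated model training)
[idx, scores] = fscmrmr(X, y);
% Train one ECOC SVM on the top-ranked subset
topK = idx(1:15);
mdl = fitcecoc(X(:,topK), y, 'Learners', templateSVM('Standardize', true));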
Adding more info about timing: after wrapping sequentialfs in tic and toc to measure elapsed time, a run time of 3 hours and 35 minutes was observed. The randomization in training was also a bit different on this run, so the sequential feature selection proceeded further and achieved a lower validation error rate.
Start forward sequential feature selection:
Initial columns included: none
Columns that can not be included: none
Step 1, added column 568, criterion value 0.564
Step 2, added column 64, criterion value 0.488
Step 3, added column 290, criterion value 0.447
Step 4, added column 538, criterion value 0.429
Step 5, added column 619, criterion value 0.409
Step 6, added column 61, criterion value 0.398
Step 7, added column 46, criterion value 0.387
Step 8, added column 1, criterion value 0.384
Step 9, added column 530, criterion value 0.371
Step 10, added column 151, criterion value 0.37
Step 11, added column 371, criterion value 0.369
Step 12, added column 218, criterion value 0.366
Step 13, added column 41, criterion value 0.365
Step 14, added column 442, criterion value 0.362
Step 15, added column 593, criterion value 0.359
Final columns included: 1 41 46 61 64 151 218 290 371 442 530 538 568 593 619
Elapsed time is 12907.820332 seconds.
>> fstime = duration(0,0,12907.820332)
fstime =
duration
03:35:07
If this answer helps you, please remember to accept the answer.
4 Comments
Drew
on 16 Nov 2023
Yes, the criterion can be changed. Note that the feature selection will seek to minimize the criterion, so the criterion must be constructed such that lower values (closer to negative infinity) are better. For PPV and F1, higher values are better. Therefore, use the negative of PPV or F1, if putting them in the criterion. For this multi-class classification problem, note that there is a per-class PPV and F1 for each class. So, you need to determine whether to optimize the F1 or PPV for one class, or to use some form of average across classes. For a similar situation for rocmetrics, see the average function of the rocmetrics object https://www.mathworks.com/help/stats/rocmetrics.average.html. This averaging function can do micro, macro, or weighted-macro average.
I did a quick update of the code to use the negative of the average per-class F1 score, and sequentialfs chose only three features in common with the previous run that used the error-rate loss criterion. There seems to be a lot of redundancy in the features.
% Feature selection
load randSamp_reduced_to_1000_for_IA.mat;
X = table2array(randSamp(:,5:end));
y = table2array(randSamp(:,3));
% y is now a categorical array with some unused categories which can be seen
% with categories(y), or categories(table2array(randSamp(:,3))).
% Let's trim y down to just the used categories, which can be seen with
% unique(y).
% This will avoid carrying around extra unused categories when building models
% and confusion matrices.
y=categorical(y,{'STAGE - N1' 'STAGE - N2' 'STAGE - N3' 'STAGE - R' 'STAGE - W'});
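% (Equivalent and simpler, if preferred: y = removecats(y); removes all
% unused categories without restating the category list.)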
% For reproducibility
rng("default");
% Create stratified partition
cv = cvpartition(y,"KFold",5);
% Call sequentialfs
opts = statset("Display","iter","UseParallel",true);
tic;
[tf,history] = sequentialfs(@myfun,X,y,"CV",cv,"Options",opts);
toc;
function criterion = myfun(XTrain,yTrain,XTest,yTest)
template = templateSVM(...
'KernelFunction', 'linear', ...
'PolynomialOrder', [], ...
'KernelScale', 'auto', ...
'BoxConstraint', 1, ...
'Standardize', true);
classificationSVM = fitcecoc(...
XTrain, ...
yTrain, ...
'Learners', template, ...
'Coding', 'onevsone');
% Use -mean(F1PerClass) as the criterion
[predicted_labels,scores] = predict(classificationSVM,XTest);
counts = confusionmat(yTest,predicted_labels); % rows = true class, columns = predicted class
precisionPerClass = diag(counts) ./ sum(counts,1)'; % TP ./ (TP + FP)
recallPerClass = diag(counts) ./ sum(counts,2);     % TP ./ (TP + FN)
F1PerClass = 2*diag(counts) ./ (sum(counts,2) + sum(counts,1)'); % 2TP ./ (2TP + FP + FN)
% Define the loss as the negative of the average F1PerClass,
% and scale it by the size of the Test data
% criterion = loss(classificationSVM,XTest,yTest) * size(XTest,1);
criterion = -mean(F1PerClass) * size(XTest,1);
end
Here is the console output:
Start forward sequential feature selection:
Initial columns included: none
Columns that can not be included: none
Step 1, added column 358, criterion value -0.411812
Step 2, added column 64, criterion value -0.491031
Step 3, added column 542, criterion value -0.544969
Step 4, added column 71, criterion value -0.558931
Step 5, added column 243, criterion value -0.564062
Step 6, added column 193, criterion value -0.566852
Step 7, added column 262, criterion value -0.574583
Step 8, added column 538, criterion value -0.589438
Step 9, added column 61, criterion value -0.605033
Step 10, added column 9, criterion value -0.618162
Step 11, added column 27, criterion value -0.625652
Step 12, added column 547, criterion value -0.629619
Final columns included: 9 27 61 64 71 193 243 262 358 538 542 547
Elapsed time is 12526.550517 seconds.
>> fstime=duration(0,0,12526.550517)
fstime =
duration
03:28:46
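A hedged sketch of the weighted-macro variant mentioned above, swapping the criterion line inside myfun (weights proportional to each class's support in the test fold; counts and F1PerClass are the variables already computed there):
% Weighted-macro F1: weight each class by its share of the test fold
support = sum(counts,2);          % true-class counts in the test fold
weights = support / sum(support); % per-class weights summing to 1
criterion = -(weights' * F1PerClass) * size(XTest,1);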
More Answers (0)