My understanding is that KNN classifies a query point using the labels of the k training points nearest to it. I was wondering how k-fold cross-validation affects the KNN classifier in the Classification Learner app in MATLAB. As I understand it, k-fold CV is used to check that out-of-sample data is predicted well, but it can also be used to optimize hyperparameters. Is something like that happening with KNN here too? If so, which hyperparameters are being optimized when KNN is used in Classification Learner?
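To make the question concrete: here is a small sketch (in Python with scikit-learn rather than MATLAB, purely for illustration) of what I mean by using k-fold CV to tune a KNN hyperparameter. The dataset, candidate values of k, and the choice of 5 folds are all arbitrary assumptions. Is Classification Learner doing something analogous to this internally, or does CV there serve only to estimate out-of-sample accuracy?

```python
# Illustrative sketch only (not what Classification Learner necessarily does):
# use 5-fold CV to both score held-out folds and pick the number of neighbors.
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.neighbors import KNeighborsClassifier

# Arbitrary synthetic data standing in for a real problem.
X, y = make_classification(n_samples=300, n_features=5, random_state=0)

# Each candidate k is evaluated on the held-out fold of every split;
# the k with the best mean validation accuracy is selected.
search = GridSearchCV(
    KNeighborsClassifier(),
    param_grid={"n_neighbors": [1, 3, 5, 7, 9, 15]},
    cv=5,
)
search.fit(X, y)
print(search.best_params_["n_neighbors"])  # the k chosen by cross-validation
```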