To draw an ROC curve for an SVM classifier trained with fitcsvm in MATLAB, first convert the trained model with fitPosterior (which maps SVM scores to posterior probabilities), then pass the converted model to predict to obtain posterior probabilities for the test set. Feeding those posterior probabilities into perfcurve yields the ROC curve and the AUC. An example follows.
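In condensed form, the pipeline looks like this (an illustrative sketch only; Xtrain, Ytrain, Xtest, Ytest, and posClass are placeholders, not variables from the script below):

mdl = fitcsvm(Xtrain, Ytrain, 'Standardize', true);
mdl = fitPosterior(mdl);                         % scores -> posterior probabilities
[~, post] = predict(mdl, Xtest);                 % post(:,2): positive-class posterior
[fpr, tpr, ~, auc] = perfcurve(Ytest, post(:,2), posClass);
plot(fpr, tpr);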

test_fitcsvm_predict_perfcurve1.m

close all; clear; clc;

% Load the ionosphere data set and hold out one cross-validation fold
% as an out-of-sample test set.
load ionosphere

n = size(X,1);                   % Total sample size
cp = cvpartition(Y, 'KFold', 5); % Stratified 5-fold partition
disp(cp)
trIdx = cp.training(1);
teIdx = cp.test(1);
isInds = find(trIdx);            % In-sample (training) indices
oosInds = find(teIdx);           % Out-of-sample (test) indices


% Train an SVM classifier. It is good practice to standardize the predictors and specify the order of the classes.
% Conserve memory by reducing the size of the trained SVM classifier.
SVMModel = fitcsvm(X(isInds,:),Y(isInds),'Standardize',true, 'ClassNames',{'b','g'});
CompactSVMModel = compact(SVMModel);
whos('SVMModel','CompactSVMModel')

% The positive class is 'g'. The CompactClassificationSVM classifier (CompactSVMModel) uses less space than
% the ClassificationSVM classifier (SVMModel) because the latter stores the data.

% Estimate the optimal score-to-posterior-probability-transformation function.
fCompactSVMModel = fitPosterior(CompactSVMModel, X(isInds,:),Y(isInds))

% The optimal score transformation function (CompactSVMModel.ScoreTransform) is the sigmoid function
% because the classes are inseparable.
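% The fitted transformation can be inspected directly (illustrative;
% the displayed form may vary by MATLAB release):
fCompactSVMModel.ScoreTransform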
% Predict the out-of-sample labels and positive class posterior probabilities.
% Since true labels are available, compare them with the predicted labels.

[labels,PostProbs] = predict(fCompactSVMModel,X(oosInds,:));
table(Y(oosInds),labels,PostProbs(:,2),'VariableNames', {'TrueLabels','PredictedLabels','PosClassPosterior'})

% Compute and plot the ROC curve for the positive class 'g' using the
% out-of-sample posterior probabilities. perfcurve also returns the AUC.
figure,
[falsePositiveRate, truePositiveRate, T, AUCg] = perfcurve(Y(oosInds), PostProbs(:,2), 'g');
plot(falsePositiveRate, truePositiveRate, 'LineWidth', 5)
xlabel('False positive rate');
ylabel('True positive rate');
title('ROC');
fprintf('AUC for positive class ''g'': %.3f\n', AUCg);

% To plot the ROC curve for the other class 'b' instead, use its
% posterior-probability column and pass 'b' as the positive class:
% figure,
% [fpr_b, tpr_b, T_b, AUCb] = perfcurve(Y(oosInds), PostProbs(:,1), 'b');
% plot(fpr_b, tpr_b, 'LineWidth', 5)
% xlabel('False positive rate');
% ylabel('True positive rate');
% title('ROC');

% predict returns raw SVM scores for the untransformed models, but
% posterior probabilities for the fitPosterior-transformed model.
[~,scoresSVMModel] = predict(SVMModel,X(oosInds,:));
[~,scoresCompactSVMModel] = predict(CompactSVMModel,X(oosInds,:));
[~,scoresfCompactSVMModel] = predict(fCompactSVMModel,X(oosInds,:));
% perfcurve accepts raw scores as well as posteriors; both yield the
% same ROC curve because the sigmoid transformation is monotonic.
figure,
[falsePositiveRate, truePositiveRate, T, AUCg] = perfcurve(Y(oosInds), scoresSVMModel(:,2), 'g');
plot(falsePositiveRate, truePositiveRate, 'LineWidth', 5)
xlabel('False positive rate');
ylabel('True positive rate');
title('ROC');
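% Sanity checks (a sketch, assuming the three score variables above):
% the full and compact models should return identical raw scores, and
% the posterior-transformed model should return probabilities in [0,1]
% that sum to 1 across the two classes.
assert(isequal(scoresSVMModel, scoresCompactSVMModel));
assert(all(scoresfCompactSVMModel(:) >= 0 & scoresfCompactSVMModel(:) <= 1));
assert(all(abs(sum(scoresfCompactSVMModel,2) - 1) < 1e-6));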


Posted by uniqueone