http://www.ninadthakoor.com/2016/07/25/getting-started-with-matconvnet/
MatConvNet is available at: http://www.vlfeat.org/matconvnet/
My goal is to use this toolbox to classify cars into four classes: Sedan, Minivan, SUV, and Pickup. I already have the data from my prior work (http://ieeexplore.ieee.org/xpl/articleDetails.jsp?arnumber=6558793).
My plan is to first use transfer learning, i.e., to use one of the deep networks pre-trained on ImageNet data to extract features and then use an SVM classifier to do the actual classification. I plan to do this with MATLAB R2016a on a Windows 10 PC equipped with a GeForce GTX 960. I am aware that MATLAB also has deep learning support in its Neural Network Toolbox, but I am going with MatConvNet in the hope that it will stay closer to the cutting edge.
The code below is derived from http://www.mathworks.com/company/newsletters/articles/deep-learning-for-computer-vision-with-matlab.html and http://www.vlfeat.org/matconvnet/quick/
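Before the script below will run, MatConvNet needs to be compiled and the pre-trained imagenet-vgg-f model downloaded into the working directory. A minimal one-time setup sketch based on the quick-start guide linked above; the release folder name is only an example, so substitute whichever version you download, and GPU compilation is optional:

% One-time setup, following the MatConvNet quick-start guide.
% The release name below is only an example; use the version you actually download.
untar('http://www.vlfeat.org/matconvnet/download/matconvnet-1.0-beta20.tar.gz');
cd matconvnet-1.0-beta20
run matlab/vl_compilenn   % or vl_compilenn('enableGpu', true) for GPU support
run matlab/vl_setupnn     % adds MatConvNet to the MATLAB path

% Download the pre-trained CNN used for feature extraction (needed once).
urlwrite('http://www.vlfeat.org/matconvnet/models/imagenet-vgg-f.mat', ...
    'imagenet-vgg-f.mat');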
clear; close all; clc;

% Load the pre-trained net (assumes MatConvNet is already on the MATLAB path)
net = load('imagenet-vgg-f.mat');
net = vl_simplenn_tidy(net);

% Remove the last layer (softmax layer)
net.layers = net.layers(1 : end - 1);

%% This deals with reading the data and getting the ground truth class labels
% All files are inside the root
root = 'C:\Users\ninad\Dropbox\Ninad\Lasagne\data\c200x200\';
Files = dir(fullfile(root, '*.bmp'));

% Load the map which stores the class information
load MakeModels.mat

h = waitbar(0, 'Extracting features...');
for i = 1 : length(Files)
    waitbar(i / length(Files), h);

    % Read class from the map
    Q = Files(i).name(end - 18 : end - 4);
    Qout = MakeModels(Q);
    Files(i).class = Qout.Type;

    % Preprocess the data and get it ready for the CNN
    im = imread(fullfile(root, Files(i).name));
    im_ = single(im); % note: 0-255 range
    im_ = imresize(im_, net.meta.normalization.imageSize(1:2));
    im_ = bsxfun(@minus, im_, net.meta.normalization.averageImage);

    % Run the CNN to compute the features
    feats = vl_simplenn(net, im_);
    Files(i).feats = squeeze(feats(end).x);
end
close(h);

%% Classifier training
% Select training data fraction
trainFraction = 0.5;
randomsort = randperm(length(Files));
nTrain = round(trainFraction * length(Files));
trainSamples = randomsort(1 : nTrain);
testSamples = randomsort(nTrain + 1 : end);

Labels = [Files.class];
Features = [Files.feats];

trainingFeatures = Features(:, trainSamples);
trainingLabels = Labels(:, trainSamples);

classifier = fitcecoc(trainingFeatures', trainingLabels);

%% Carry out the validation with the rest of the data
testFeatures = Features(:, testSamples);
testLabels = Labels(:, testSamples);

predictedLabels = predict(classifier, testFeatures');

confMat = confusionmat(testLabels, predictedLabels);

% Convert confusion matrix into percentage form
confMat = bsxfun(@rdivide, confMat, sum(confMat, 2))

% Display the mean accuracy
mean(diag(confMat))
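After validation, the same preprocessing and forward pass can be reused to classify a previously unseen image with the trained classifier. A quick sketch, where newcar.bmp is a hypothetical file name:

% Classify a single new image with the trained classifier (file name is a placeholder).
im = imread('newcar.bmp');
im_ = single(im); % note: 0-255 range
im_ = imresize(im_, net.meta.normalization.imageSize(1:2));
im_ = bsxfun(@minus, im_, net.meta.normalization.averageImage);

% Forward pass through the truncated network; take the last kept layer's output
res = vl_simplenn(net, im_);
newFeats = squeeze(res(end).x);

% Predict the class with the ECOC classifier trained above
predictedClass = predict(classifier, newFeats')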