1. ex3.m

%% Machine Learning Online Class - Exercise 3 | Part 1: One-vs-all

%  Instructions
%  ------------
%
%  This file contains code that helps you get started on the
%  exercise. You will need to complete the following functions
%  in this exercise:
%
%     lrCostFunction.m (logistic regression cost function)
%     oneVsAll.m
%     predictOneVsAll.m
%     predict.m
%
%  For this exercise, you will not need to change any code in this file,
%  or any other files other than those mentioned above.
%

%% Initialization
clear ; close all; clc

%% Setup the parameters you will use for this part of the exercise
input_layer_size  = 400;  % 20x20 Input Images of Digits
num_labels = 10;          % 10 labels, from 1 to 10
                          % (note that we have mapped "0" to label 10)

%% =========== Part 1: Loading and Visualizing Data =============
%  We start the exercise by first loading and visualizing the dataset.
%  You will be working with a dataset that contains handwritten digits.
%

% Load Training Data
fprintf('Loading and Visualizing Data ...\n')

load('ex3data1.mat'); % training data stored in arrays X, y
m = size(X, 1);

% Randomly select 100 data points to display
rand_indices = randperm(m);
sel = X(rand_indices(1:100), :);

displayData(sel);

fprintf('Program paused. Press enter to continue.\n');
pause;

%% ============ Part 2: Vectorize Logistic Regression ============
%  In this part of the exercise, you will reuse your logistic regression
%  code from the last exercise. Your task here is to make sure that your
%  regularized logistic regression implementation is vectorized. After
%  that, you will implement one-vs-all classification for the handwritten
%  digit dataset.
%

fprintf('\nTraining One-vs-All Logistic Regression...\n')

lambda = 0.1;
[all_theta] = oneVsAll(X, y, num_labels, lambda);

fprintf('Program paused. Press enter to continue.\n');
pause;

%% ================ Part 3: Predict for One-Vs-All ================
%  After ...

pred = predictOneVsAll(all_theta, X);

fprintf('\nTraining Set Accuracy: %f\n', mean(double(pred == y)) * 100);
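
One optional check after training (my own addition, not part of the assignment script): each row of all_theta holds one bias weight plus 400 pixel weights, so the same displayData helper used above can render what each one-vs-all classifier has learned.

% Visualize the learned weights of the 10 classifiers as 20x20 images
% (column 1, the bias term, is dropped).
displayData(all_theta(:, 2:end));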

2. ex3_nn.m

%% Machine Learning Online Class - Exercise 3 | Part 2: Neural Networks

%  Instructions
%  ------------
%
%  This file contains code that helps you get started on the
%  exercise. You will need to complete the following functions
%  in this exercise:
%
%     lrCostFunction.m (logistic regression cost function)
%     oneVsAll.m
%     predictOneVsAll.m
%     predict.m
%
%  For this exercise, you will not need to change any code in this file,
%  or any other files other than those mentioned above.
%

%% Initialization
clear ; close all; clc

%% Setup the parameters you will use for this exercise
input_layer_size  = 400;  % 20x20 Input Images of Digits
hidden_layer_size = 25;   % 25 hidden units
num_labels = 10;          % 10 labels, from 1 to 10
                          % (note that we have mapped "0" to label 10)

%% =========== Part 1: Loading and Visualizing Data =============
%  We start the exercise by first loading and visualizing the dataset.
%  You will be working with a dataset that contains handwritten digits.
%

% Load Training Data
fprintf('Loading and Visualizing Data ...\n')

load('ex3data1.mat');
m = size(X, 1);

% Randomly select 100 data points to display
sel = randperm(size(X, 1));
sel = sel(1:100);

displayData(X(sel, :));

fprintf('Program paused. Press enter to continue.\n');
pause;

%% ================ Part 2: Loading Parameters ================
% In this part of the exercise, we load some pre-initialized
% neural network parameters.

fprintf('\nLoading Saved Neural Network Parameters ...\n')

% Load the weights into variables Theta1 and Theta2
load('ex3weights.mat');

%% ================= Part 3: Implement Predict =================
%  After training the neural network, we would like to use it to predict
%  the labels. You will now implement the "predict" function to use the
%  neural network to predict the labels of the training set. This lets
%  you compute the training set accuracy.

pred = predict(Theta1, Theta2, X);

fprintf('\nTraining Set Accuracy: %f\n', mean(double(pred == y)) * 100);

fprintf('Program paused. Press enter to continue.\n');
pause;

%  To give you an idea of the network's output, you can also run
%  through the examples one at a time to see what it is predicting.

%  Randomly permute examples
rp = randperm(m);

for i = 1:m
    % Display
    fprintf('\nDisplaying Example Image\n');
    displayData(X(rp(i), :));

    pred = predict(Theta1, Theta2, X(rp(i),:));
    fprintf('\nNeural Network Prediction: %d (digit %d)\n', pred, mod(pred, 10));

    % Pause
    fprintf('Program paused. Press enter to continue.\n');
    pause;
end
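
A small diagnostic sketch (my own addition, assuming the workspace from ex3_nn.m above is still loaded): recompute predictions for the whole training set and report per-class accuracy, which shows which digits the network confuses most.

pred = predict(Theta1, Theta2, X);  % predictions for the full training set
for c = 1:num_labels
    idx = (y == c);                 % examples whose true label is class c
    fprintf('Class %2d (digit %d): %.2f%% accuracy\n', ...
            c, mod(c, 10), mean(double(pred(idx) == c)) * 100);
end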

3. lrCostFunction.m

function [J, grad] = lrCostFunction(theta, X, y, lambda)
%LRCOSTFUNCTION Compute cost and gradient for logistic regression with
%regularization
%   J = LRCOSTFUNCTION(theta, X, y, lambda) computes the cost of using
%   theta as the parameter for regularized logistic regression and the
%   gradient of the cost w.r.t. the parameters.

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly
J = 0;                     % scalar
grad = zeros(size(theta)); % (n+1) x 1

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta.
%               You should set J to the cost.
%               Compute the partial derivatives and set grad to the partial
%               derivatives of the cost w.r.t. each parameter in theta
%
% Hint: The computation of the cost function and gradients can be
%       efficiently vectorized. For example, consider the computation
%
%           sigmoid(X * theta)
%
%       Each row of the resulting matrix will contain the value of the
%       prediction for that example. You can make use of this to vectorize
%       the cost function and gradient computations.
%
% Hint: When computing the gradient of the regularized cost function,
%       there're many possible vectorized solutions, but one solution
%       looks like:
%           grad = (unregularized gradient for logistic regression)
%           temp = theta;
%           temp(1) = 0;   % because we don't add anything for j = 0
%           grad = grad + YOUR_CODE_HERE (using the temp variable)
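%
% For reference (my summary, not text from the assignment): the regularized
% cost and gradient implemented below are
%
%   J = (1/m) * sum( -y .* log(h) - (1-y) .* log(1-h) )
%       + (lambda/(2*m)) * sum( theta(2:end).^2 )
%
%   grad = (1/m) * X' * (h - y) + (lambda/m) * [0; theta(2:end)]
%
% where h = sigmoid(X * theta); theta(1), the bias weight, is never
% regularized.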
%

h = sigmoid(X * theta); % m x 1 vector of predictions

part1 = y .* log(h);           % m x 1
part2 = (1 - y) .* log(1 - h); % m x 1
J_ori = sum(-part1 - part2) / m; % unregularized cost, scalar

sz_theta = size(theta, 1);
theta_temp = theta(2:sz_theta); % exclude the bias term theta(1)
punish_J = sum(theta_temp.^2) * lambda / 2 / m; % regularization penalty
J = J_ori + punish_J;

% grad
diff = h - y;     % m x 1
temp = X' * diff; % (n+1) x m times m x 1 -> (n+1) x 1
temp = temp / m;
grad_ori = temp;  % unregularized gradient

punish_theta = theta_temp * lambda / m; % regularization term, bias excluded
punish_theta = [0; punish_theta];
grad = grad_ori + punish_theta;

% =============================================================

grad = grad(:);

end
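
Before wiring this into oneVsAll, it is worth sanity-checking the gradient. The snippet below is a small test harness of my own (the X_t/y_t/theta_t values are arbitrary, not from the assignment text); it compares grad against centered finite differences, which should agree to within roughly 1e-9 for a correct vectorized implementation.

% Finite-difference check for lrCostFunction (assumes sigmoid.m is on the path).
X_t = [ones(5,1) reshape(1:15, 5, 3) / 10]; % 5 examples, 3 features + bias
y_t = [1; 0; 1; 0; 1];
theta_t = [-2; -1; 1; 2];
lambda_t = 3;

[J, grad] = lrCostFunction(theta_t, X_t, y_t, lambda_t);

epsilon = 1e-4;
num_grad = zeros(size(theta_t));
for j = 1:numel(theta_t)
    e = zeros(size(theta_t));
    e(j) = epsilon;
    % Centered difference approximation of dJ/dtheta_j
    num_grad(j) = (lrCostFunction(theta_t + e, X_t, y_t, lambda_t) - ...
                   lrCostFunction(theta_t - e, X_t, y_t, lambda_t)) / (2 * epsilon);
end
fprintf('max |grad - num_grad| = %g\n', max(abs(grad - num_grad)));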

4. oneVsAll.m

function [all_theta] = oneVsAll(X, y, num_labels, lambda)
%ONEVSALL trains multiple logistic regression classifiers and returns all
%the classifiers in a matrix all_theta, where the i-th row of all_theta
%corresponds to the classifier for label i
%   [all_theta] = ONEVSALL(X, y, num_labels, lambda) trains num_labels
%   logisitc regression classifiers and returns each of these classifiers
%   in a matrix all_theta, where the i-th row of all_theta corresponds
%   to the classifier for label i

% Some useful variables
m = size(X, 1);
n = size(X, 2);

% You need to return the following variables correctly
all_theta = zeros(num_labels, n + 1); % num_labels x (n+1)

% Add ones to the X data matrix
X = [ones(m, 1) X]; % m x (n+1)

% ====================== YOUR CODE HERE ======================
% Instructions: You should complete the following code to train num_labels
%               logistic regression classifiers with regularization
%               parameter lambda.
%
% Hint: theta(:) will return a column vector.
%
% Hint: You can use y == c to obtain a vector of 1's and 0's that tells you
%       whether the ground truth is true/false for this class.
%
% Note: For this assignment, we recommend using fmincg to optimize the cost
%       function. It is okay to use a for-loop (for c = 1:num_labels) to
%       loop over the different classes.
%
%       fmincg works similarly to fminunc, but is more efficient when we
%       are dealing with a large number of parameters.
%
% Example Code for fmincg:
%
%     % Set Initial theta
%     initial_theta = zeros(n + 1, 1);
%
%     % Set options for fminunc
%     options = optimset('GradObj', 'on', 'MaxIter', 50);
%
%     % Run fmincg to obtain the optimal theta
%     % This function will return theta and the cost
%     [theta] = ...
%         fmincg (@(t)(lrCostFunction(t, X, (y == c), lambda)), ...
%                 initial_theta, options);
%

for c = 1:num_labels
    % Set initial parameters and optimizer options for this class
    initial_theta = zeros(n + 1, 1);
    options = optimset('GradObj', 'on', 'MaxIter', 50);

    % Train a binary classifier for "class c vs. everything else"
    [theta] = fmincg(@(t)(lrCostFunction(t, X, (y == c), lambda)), ...
                     initial_theta, options);

    all_theta(c, :) = theta'; % store as the c-th row
end

% =========================================================================

end
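
For orientation, a minimal usage sketch (assuming ex3data1.mat and the course-provided fmincg.m are on the path; the 5000 x 400 shape is that of the digit dataset):

load('ex3data1.mat');                % X: 5000 x 400, y: 5000 x 1
all_theta = oneVsAll(X, y, 10, 0.1); % trains 10 regularized classifiers
disp(size(all_theta));               % expected: 10 401 (one row per class)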

5. predictOneVsAll.m

function p = predictOneVsAll(all_theta, X)
%PREDICT Predict the label for a trained one-vs-all classifier. The labels
%are in the range 1..K, where K = size(all_theta, 1).
%  p = PREDICTONEVSALL(all_theta, X) will return a vector of predictions
%  for each example in the matrix X. Note that X contains the examples in
%  rows. all_theta is a matrix where the i-th row is a trained logistic
%  regression theta vector for the i-th class. You should set p to a vector
%  of values from 1..K (e.g., p = [1; 3; 1; 2] predicts classes 1, 3, 1, 2
%  of values from 1..K (e.g., p = [1; 3; 1; 2] predicts classes 1, 3, 1, 2
%  for 4 examples)

m = size(X, 1);
num_labels = size(all_theta, 1); % K

% You need to return the following variables correctly
p = zeros(size(X, 1), 1); % m x 1

% Add ones to the X data matrix
X = [ones(m, 1) X]; % m x (n+1)

% ====================== YOUR CODE HERE ======================
% Instructions: Complete the following code to make predictions using
%               your learned logistic regression parameters (one-vs-all).
%               You should set p to a vector of predictions (from 1 to
%               num_labels).
%
% Hint: This code can be done all vectorized using the max function.
%       In particular, the max function can also return the index of the
%       max element, for more information see 'help max'. If your examples
%       are in rows, then, you can use max(A, [], 2) to obtain the max
%       for each row.
%

x_theta = X * all_theta'; % m x (n+1) times (n+1) x K -> m x K class scores

for c = 1:m
    max_value = max(x_theta(c,:));         % highest score for example c
    idx = find(x_theta(c,:) == max_value); % index of the winning class
    p(c) = idx(1); % take the first index in the (unlikely) event of a tie
end

% =========================================================================

end
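
The loop works, but the file's own hint about max(A, [], 2) does the same thing in one line; max's second output is the row-wise argmax, which is exactly the predicted class, so this is a drop-in replacement for the loop above:

% Vectorized alternative to the loop: row-wise argmax over class scores.
[~, p] = max(x_theta, [], 2);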

6. predict.m

function p = predict(Theta1, Theta2, X)
%PREDICT Predict the label of an input given a trained neural network
%   p = PREDICT(Theta1, Theta2, X) outputs the predicted label of X given the
%   trained weights of a neural network (Theta1, Theta2)

% Useful values
m = size(X, 1);
num_labels = size(Theta2, 1);

% You need to return the following variables correctly
p = zeros(size(X, 1), 1); % m x 1

% ====================== YOUR CODE HERE ======================
% Instructions: Complete the following code to make predictions using
%               your learned neural network. You should set p to a
%               vector containing labels between 1 to num_labels.
%
% Hint: The max function might come in useful. In particular, the max
%       function can also return the index of the max element, for more
%       information see 'help max'. If your examples are in rows, then, you
%       can use max(A, [], 2) to obtain the max for each row.
%

X = [ones(m, 1) X]; % add bias column: m x (n+1)

% Hidden layer: (m x (n+1)) times ((n+1) x hidden) -> m x hidden
x_theta1 = X * Theta1';
x_theta1 = sigmoid(x_theta1);
x_theta1 = [ones(m, 1) x_theta1]; % add bias unit to hidden activations

% Output layer: (m x (hidden+1)) times ((hidden+1) x num_labels) -> m x num_labels
x_theta2 = x_theta1 * Theta2';
x_theta2 = sigmoid(x_theta2);

for c = 1:m
    max_value = max(x_theta2(c,:));         % largest output activation
    idx = find(x_theta2(c,:) == max_value); % index of the winning class
    p(c) = idx(1); % take the first index in the (unlikely) event of a tie
end

% =========================================================================

end
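
As in predictOneVsAll, the per-example loop can be collapsed with max(A, [], 2); on ties max returns the first index, matching the idx(1) choice above.

% Vectorized alternative: pick the output unit with the largest activation.
[~, p] = max(x_theta2, [], 2);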

7. submit results

(The grader output for this exercise is not preserved in this copy.)
