ex3-nn Neural Networks

%% Machine Learning Online Class - Exercise 3 | Part 2: Neural Networks
%
%  Instructions
%  ------------
%
%  This file contains code that helps you get started on the
%  neural networks exercise. You will need to complete the following
%  functions in this exercise:
%
%     lrCostFunction.m (logistic regression cost function)
%     oneVsAll.m
%     predictOneVsAll.m
%     predict.m
%
%  For this exercise, you will not need to change any code in this file,
%  or any other files other than those mentioned above.
%% Initialization
clear ; close all; clc

%% Setup the parameters you will use for this exercise
input_layer_size  = 400;  % 20x20 Input Images of Digits
hidden_layer_size = 25;   % 25 hidden units
num_labels = 10;          % 10 labels, from 1 to 10
                          % (note that we have mapped "0" to label 10)

Part 1: Loading and Visualizing Data

%% =========== Part 1: Loading and Visualizing Data =============
%  We start the exercise by first loading and visualizing the dataset.
%  You will be working with a dataset that contains handwritten digits.
%% Load Training Data
fprintf('Loading and Visualizing Data ...\n')

load('ex3data1.mat');
m = size(X, 1);

% Randomly select 100 data points to display
sel = randperm(size(X, 1));
sel = sel(1:100);

displayData(X(sel, :));

fprintf('Program paused. Press enter to continue.\n');
pause;

Function Definition

displayData

function [h, display_array] = displayData(X, example_width)
%DISPLAYDATA Display 2D data in a nice grid
%   [h, display_array] = DISPLAYDATA(X, example_width) displays 2D data
%   stored in X in a nice grid. It returns the figure handle h and the
%   displayed array if requested.

% Set example_width automatically if not passed in
if ~exist('example_width', 'var') || isempty(example_width)
	example_width = round(sqrt(size(X, 2)));
end

% Gray Image
colormap(gray);

% Compute rows, cols
[m, n] = size(X);
example_height = (n / example_width);

% Compute number of items to display
display_rows = floor(sqrt(m));
display_cols = ceil(m / display_rows);

% Between images padding
pad = 1;

% Setup blank display
display_array = - ones(pad + display_rows * (example_height + pad), ...
                       pad + display_cols * (example_width + pad));

% Copy each example into a patch on the display array
curr_ex = 1;
for j = 1:display_rows
	for i = 1:display_cols
		if curr_ex > m
			break;
		end
		% Copy the patch

		% Get the max value of the patch
		max_val = max(abs(X(curr_ex, :)));
		display_array(pad + (j - 1) * (example_height + pad) + (1:example_height), ...
		              pad + (i - 1) * (example_width + pad) + (1:example_width)) = ...
						reshape(X(curr_ex, :), example_height, example_width) / max_val;
		curr_ex = curr_ex + 1;
	end
	if curr_ex > m
		break;
	end
end

% Display Image
h = imagesc(display_array, [-1 1]);

% Do not show axis
axis image off

drawnow;

end
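The reshape call above turns each 1x400 row of X back into a 20x20 image patch. This relies on MATLAB/Octave's column-major ordering: reshape fills the output column by column. A tiny illustration:

```matlab
% reshape fills the result column-by-column (column-major order),
% so consecutive entries of the vector become consecutive rows
% within each column of the output.
v = 1:6;
A = reshape(v, 2, 3)
% A = [1 3 5
%      2 4 6]
```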

Part 2: Loading Parameters

% In this part of the exercise, we load some pre-initialized
% neural network parameters.

fprintf('\nLoading Saved Neural Network Parameters ...\n')

% Load the weights into variables Theta1 and Theta2
load('ex3weights.mat');
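Assuming the saved weights match the layer sizes set at the top of the script (400 inputs, 25 hidden units, 10 output labels), a quick sanity check on the loaded matrices can catch a bad load early; the expected sizes below follow from those parameters plus one bias column each:

```matlab
% Each Theta maps layer j (plus a bias unit) to layer j+1, so its size
% is (units in layer j+1) x (units in layer j + 1 for the bias).
size(Theta1)   % expected: 25 x 401  (hidden_layer_size x (input_layer_size + 1))
size(Theta2)   % expected: 10 x 26   (num_labels x (hidden_layer_size + 1))
```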

Part 3: Implement Predict

%  After training the neural network, we would like to use it to predict
%  the labels. You will now implement the "predict" function to use the
%  neural network to predict the labels of the training set. This lets
%  you compute the training set accuracy.

pred = predict(Theta1, Theta2, X);

fprintf('\nTraining Set Accuracy: %f\n', mean(double(pred == y)) * 100);

fprintf('Program paused. Press enter to continue.\n');
pause;

%  To give you an idea of the network's output, you can also run
%  through the examples one at a time to see what it is predicting.

%  Randomly permute examples
rp = randperm(m);

for i = 1:m
    % Display
    fprintf('\nDisplaying Example Image\n');
    displayData(X(rp(i), :));

    pred = predict(Theta1, Theta2, X(rp(i),:));
    fprintf('\nNeural Network Prediction: %d (digit %d)\n', pred, mod(pred, 10));

    % Pause with quit option
    s = input('Paused - press enter to continue, q to exit:','s');
    if s == 'q'
        break
    end
end

Function Definition

predict

function p = predict(Theta1, Theta2, X)
%PREDICT Predict the label of an input given a trained neural network
%   p = PREDICT(Theta1, Theta2, X) outputs the predicted label of X given the
%   trained weights of a neural network (Theta1, Theta2)

% Useful values
m = size(X, 1);
num_labels = size(Theta2, 1);

% You need to return the following variables correctly
p = zeros(size(X, 1), 1);

% ====================== YOUR CODE HERE ======================
% Instructions: Complete the following code to make predictions using
%               your learned neural network. You should set p to a
%               vector containing labels between 1 and num_labels.
%
% Hint: The max function might come in useful. In particular, the max
%       function can also return the index of the max element; for more
%       information see 'help max'. If your examples are in rows, you
%       can use max(A, [], 2) to obtain the max for each row.
%

a1 = [ones(m, 1) X];
z2 = a1*Theta1';
a2 = [ones(size(z2, 1), 1) sigmoid(z2)];
z3 = a2*Theta2';
a3 = sigmoid(z3);

[p_max, p] = max(a3, [], 2);

% =========================================================================

end
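The forward propagation above calls sigmoid, which is provided as a separate file in the exercise and is not shown in this post. A minimal version, assuming the standard element-wise logistic function, looks like:

```matlab
function g = sigmoid(z)
%SIGMOID Compute sigmoid function
%   g = SIGMOID(z) computes the sigmoid of z. The element-wise ./ and
%   exp make it work on scalars, vectors, and matrices alike, which is
%   what predict relies on when z2 and z3 are m x units matrices.
g = 1.0 ./ (1.0 + exp(-z));
end
```

Because it is vectorized, one call handles all m examples at once; no loop over rows is needed in predict.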
