MATLAB Deep Learning Notes: Deep Learning Toolbox Documentation
This article is the documentation for Rasmus Berg Palm's DeepLearnToolbox published on GitHub, in which the author describes the toolbox in detail (original link: https://github.com/rasmusbergpalm/DeepLearnToolbox#deeplearntoolbox).
Warning:
This toolbox is outdated and no longer maintained.
There are much better tools available for deep learning than this toolbox, e.g. Theano, Torch, or TensorFlow.
I suggest you use one of the tools mentioned above rather than this toolbox.
Best, Rasmus
DeepLearnToolbox: a MATLAB toolbox for deep learning
Deep learning is a new subfield of machine learning that focuses on learning deep, hierarchical models of data. It is inspired by the deep, layered architecture of the human brain. A good overview of deep learning theory is Learning Deep Architectures for AI.
For a more informal introduction, see the following videos by Geoffrey Hinton and Andrew Ng.
The Next Generation of Neural Networks (Hinton, 2007)
Recent Developments in Deep Learning (Hinton, 2010)
Unsupervised Feature Learning and Deep Learning (Ng, 2011)
Directories included in the toolbox
NN/ - A library for feedforward backpropagation neural networks
CNN/ - A library for convolutional neural networks
DBN/ - A library for deep belief networks
SAE/ - A library for stacked auto-encoders
CAE/ - A library for convolutional auto-encoders
util/ - Utility functions used by the libraries
data/ - Data used by the examples
tests/ - Unit tests to verify that the toolbox is working
Setup
1. Download the toolbox.
2. addpath(genpath('DeepLearnToolbox'));
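To confirm the path is set up correctly, a quick sanity check is to run one of the bundled example scripts (the script name below is taken from the examples later in this article; it assumes the toolbox directory sits in the current folder):

```matlab
% Put the toolbox and all its subdirectories on the MATLAB path,
% then run one of the example scripts shipped with the toolbox.
addpath(genpath('DeepLearnToolbox'));
test_example_NN   % trains a small net on MNIST and asserts the test error is low
```

If the toolbox is installed correctly, the script runs to completion without the assertion firing.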
Example: Deep Belief Network
function test_example_DBN
load mnist_uint8;
train_x = double(train_x) / 255;
test_x = double(test_x) / 255;
train_y = double(train_y);
test_y = double(test_y);

%% ex1 train a 100 hidden unit RBM and visualize its weights
rand('state',0)
dbn.sizes = [100];
opts.numepochs = 1;
opts.batchsize = 100;
opts.momentum = 0;
opts.alpha = 1;
dbn = dbnsetup(dbn, train_x, opts);
dbn = dbntrain(dbn, train_x, opts);
figure; visualize(dbn.rbm{1}.W'); % Visualize the RBM weights

%% ex2 train a 100-100 hidden unit DBN and use its weights to initialize a NN
rand('state',0)
%train dbn
dbn.sizes = [100 100];
opts.numepochs = 1;
opts.batchsize = 100;
opts.momentum = 0;
opts.alpha = 1;
dbn = dbnsetup(dbn, train_x, opts);
dbn = dbntrain(dbn, train_x, opts);

%unfold dbn to nn
nn = dbnunfoldtonn(dbn, 10);
nn.activation_function = 'sigm';

%train nn
opts.numepochs = 1;
opts.batchsize = 100;
nn = nntrain(nn, train_x, train_y, opts);
[er, bad] = nntest(nn, test_x, test_y);
assert(er < 0.10, 'Too big error');
Example: Stacked Auto-Encoders
function test_example_SAE
load mnist_uint8;
train_x = double(train_x)/255;
test_x = double(test_x)/255;
train_y = double(train_y);
test_y = double(test_y);

%% ex1 train a 100 hidden unit SDAE and use it to initialize a FFNN
% Setup and train a stacked denoising autoencoder (SDAE)
rand('state',0)
sae = saesetup([784 100]);
sae.ae{1}.activation_function = 'sigm';
sae.ae{1}.learningRate = 1;
sae.ae{1}.inputZeroMaskedFraction = 0.5;
opts.numepochs = 1;
opts.batchsize = 100;
sae = saetrain(sae, train_x, opts);
visualize(sae.ae{1}.W{1}(:,2:end)')

% Use the SDAE to initialize a FFNN
nn = nnsetup([784 100 10]);
nn.activation_function = 'sigm';
nn.learningRate = 1;
nn.W{1} = sae.ae{1}.W{1};

% Train the FFNN
opts.numepochs = 1;
opts.batchsize = 100;
nn = nntrain(nn, train_x, train_y, opts);
[er, bad] = nntest(nn, test_x, test_y);
assert(er < 0.16, 'Too big error');
Example: Convolutional Neural Nets
function test_example_CNN
load mnist_uint8;
train_x = double(reshape(train_x',28,28,60000))/255;
test_x = double(reshape(test_x',28,28,10000))/255;
train_y = double(train_y');
test_y = double(test_y');

%% ex1 Train a 6c-2s-12c-2s Convolutional neural network
%will run 1 epoch in about 200 seconds and get around 11% error.
%With 100 epochs you'll get around 1.2% error
rand('state',0)
cnn.layers = {
    struct('type', 'i') %input layer
    struct('type', 'c', 'outputmaps', 6, 'kernelsize', 5) %convolution layer
    struct('type', 's', 'scale', 2) %sub sampling layer
    struct('type', 'c', 'outputmaps', 12, 'kernelsize', 5) %convolution layer
    struct('type', 's', 'scale', 2) %subsampling layer
};
cnn = cnnsetup(cnn, train_x, train_y);

opts.alpha = 1;
opts.batchsize = 50;
opts.numepochs = 1;

cnn = cnntrain(cnn, train_x, train_y, opts);

[er, bad] = cnntest(cnn, test_x, test_y);

%plot mean squared error
figure; plot(cnn.rL);
assert(er<0.12, 'Too big error');
Example: Neural Networks
function test_example_NN
load mnist_uint8;
train_x = double(train_x) / 255;
test_x = double(test_x) / 255;
train_y = double(train_y);
test_y = double(test_y);

% normalize
[train_x, mu, sigma] = zscore(train_x);
test_x = normalize(test_x, mu, sigma);

%% ex1 vanilla neural net
rand('state',0)
nn = nnsetup([784 100 10]);
opts.numepochs = 1; % Number of full sweeps through data
opts.batchsize = 100; % Take a mean gradient step over this many samples
[nn, L] = nntrain(nn, train_x, train_y, opts);

[er, bad] = nntest(nn, test_x, test_y);
assert(er < 0.08, 'Too big error');

%% ex2 neural net with L2 weight decay
rand('state',0)
nn = nnsetup([784 100 10]);
nn.weightPenaltyL2 = 1e-4; % L2 weight decay
opts.numepochs = 1; % Number of full sweeps through data
opts.batchsize = 100; % Take a mean gradient step over this many samples

nn = nntrain(nn, train_x, train_y, opts);

[er, bad] = nntest(nn, test_x, test_y);
assert(er < 0.1, 'Too big error');

%% ex3 neural net with dropout
rand('state',0)
nn = nnsetup([784 100 10]);
nn.dropoutFraction = 0.5; % Dropout fraction
opts.numepochs = 1; % Number of full sweeps through data
opts.batchsize = 100; % Take a mean gradient step over this many samples

nn = nntrain(nn, train_x, train_y, opts);

[er, bad] = nntest(nn, test_x, test_y);
assert(er < 0.1, 'Too big error');

%% ex4 neural net with sigmoid activation function
rand('state',0)
nn = nnsetup([784 100 10]);
nn.activation_function = 'sigm'; % Sigmoid activation function
nn.learningRate = 1; % Sigm requires a lower learning rate
opts.numepochs = 1; % Number of full sweeps through data
opts.batchsize = 100; % Take a mean gradient step over this many samples

nn = nntrain(nn, train_x, train_y, opts);

[er, bad] = nntest(nn, test_x, test_y);
assert(er < 0.1, 'Too big error');

%% ex5 plotting functionality
rand('state',0)
nn = nnsetup([784 20 10]);
opts.numepochs = 5; % Number of full sweeps through data
nn.output = 'softmax'; % use softmax output
opts.batchsize = 1000; % Take a mean gradient step over this many samples
opts.plot = 1; % enable plotting

nn = nntrain(nn, train_x, train_y, opts);

[er, bad] = nntest(nn, test_x, test_y);
assert(er < 0.1, 'Too big error');

%% ex6 neural net with sigmoid activation and plotting of validation and training error
% split training data into training and validation data
vx = train_x(1:10000,:);
tx = train_x(10001:end,:);
vy = train_y(1:10000,:);
ty = train_y(10001:end,:);

rand('state',0)
nn = nnsetup([784 20 10]);
nn.output = 'softmax'; % use softmax output
opts.numepochs = 5; % Number of full sweeps through data
opts.batchsize = 1000; % Take a mean gradient step over this many samples
opts.plot = 1; % enable plotting
nn = nntrain(nn, tx, ty, opts, vx, vy); % nntrain takes validation set as last two arguments (optionally)

[er, bad] = nntest(nn, test_x, test_y);
assert(er < 0.1, 'Too big error');