I've been studying BP (backpropagation) neural networks recently and wanted to customize the network internals, but after searching online for a long time I couldn't find the source code, so I went and copied over MATLAB's official source myself.

Without further ado, here is the code, the unedited source (a short usage sketch follows the listing):
function out1 = newff(varargin)
%NEWFF Create a feed-forward backpropagation network.
%
%  Obsoleted in R2010b NNET 7.0.  Last used in R2010a NNET 6.0.4.
%  The recommended function is <a href="matlab:doc feedforwardnet">feedforwardnet</a>.
%
%  Syntax
%
%    net = newff(P,T,S)
%    net = newff(P,T,S,TF,BTF,BLF,PF,IPF,OPF,DDF)
%
%  Description
%
%    NEWFF(P,T,S) takes,
%      P  - RxQ1 matrix of Q1 representative R-element input vectors.
%      T  - SNxQ2 matrix of Q2 representative SN-element target vectors.
%      Si  - Sizes of N-1 hidden layers, S1 to S(N-1), default = [].
%            (Output layer size SN is determined from T.)
%    and returns an N layer feed-forward backprop network.
%
%    NEWFF(P,T,S,TF,BTF,BLF,PF,IPF,OPF,DDF) takes optional inputs,
%      TFi - Transfer function of ith layer. Default is 'tansig' for
%            hidden layers, and 'purelin' for output layer.
%      BTF - Backprop network training function, default = 'trainlm'.
%      BLF - Backprop weight/bias learning function, default = 'learngdm'.
%      PF  - Performance function, default = 'mse'.
%      IPF - Row cell array of input processing functions.
%            Default is {'fixunknowns','removeconstantrows','mapminmax'}.
%      OPF - Row cell array of output processing functions.
%            Default is {'removeconstantrows','mapminmax'}.
%      DDF - Data division function, default = 'dividerand'.
%    and returns an N layer feed-forward backprop network.
%
%    The transfer functions TF{i} can be any differentiable transfer
%    function such as TANSIG, LOGSIG, or PURELIN.
%
%    The training function BTF can be any of the backprop training
%    functions such as TRAINLM, TRAINBFG, TRAINRP, TRAINGD, etc.
%
%    *WARNING*: TRAINLM is the default training function because it
%    is very fast, but it requires a lot of memory to run.  If you get
%    an "out-of-memory" error when training try doing one of these:
%
%    (1) Slow TRAINLM training, but reduce memory requirements, by
%        setting NET.<a href="matlab:doc nnproperty.net_efficiency">efficiency</a>.<a href="matlab:doc nnproperty.net_efficiency_memoryReduction">memoryReduction</a> to 2 or more. (See HELP TRAINLM.)
%    (2) Use TRAINBFG, which is slower but more memory efficient than TRAINLM.
%    (3) Use TRAINRP which is slower but more memory efficient than TRAINBFG.
%
%    The learning function BLF can be either of the backpropagation
%    learning functions such as LEARNGD, or LEARNGDM.
%
%    The performance function can be any of the differentiable performance
%    functions such as MSE or MSEREG.
%
%  Examples
%
%    [inputs,targets] = simplefitdata;
%    net = newff(inputs,targets,20);
%    net = train(net,inputs,targets);
%    outputs = net(inputs);
%    errors = outputs - targets;
%    perf = perform(net,outputs,targets)
%
%  Algorithm
%
%    Feed-forward networks consist of Nl layers using the DOTPROD
%    weight function, NETSUM net input function, and the specified
%    transfer functions.
%
%    The first layer has weights coming from the input.  Each subsequent
%    layer has a weight coming from the previous layer.  All layers
%    have biases.  The last layer is the network output.
%
%    Each layer's weights and biases are initialized with INITNW.
%
%    Adaption is done with TRAINS which updates weights with the
%    specified learning function. Training is done with the specified
%    training function. Performance is measured according to the specified
%    performance function.
%
%  See also NEWCF, NEWELM, SIM, INIT, ADAPT, TRAIN, TRAINS

% Mark Beale, 11-31-97
% Copyright 1992-2010 The MathWorks, Inc.

%disp('NEWFF is no longer recommended. FEEDFORWARD is simpler and more efficient.');
% TODO - Recommendation function NNRECOMMEND

%% Boilerplate Code - Same for all Network Functions
persistent INFO;
if (nargin < 1), error(message('nnet:Args:NotEnough')); end
in1 = varargin{1};
if ischar(in1)
  switch in1
    case 'info'
      if isempty(INFO), INFO = get_info; end
      out1 = INFO;
  end
else
  out1 = create_network(varargin{:});
end
%% Boilerplate Code - Same for all Network Functions

%%
function info = get_info
info.function = mfilename;
info.name = 'Feed-Forward';
info.description = nnfcn.get_mhelp_title(mfilename);
info.type = 'nntype.network_fcn';
info.version = 6.0;

%%
function net = create_network(varargin)
if nargin < 2, error(message('nnet:Args:NotEnough')), end
v1 = varargin{1};
if isa(v1,'cell'), v1 = cell2mat(v1); end
v2 = varargin{2};
if nargin > 2, v3 = varargin{3}; end
if (nargin <= 6) && (size(v1,2)==2) && (~iscell(v2)) && (size(v2,1)==1) && ((nargin<3)||iscell(v3))
  nnerr.obs_use(mfilename,['See help for ' upper(mfilename) ' to update calls to the new argument list.']);
  net = new_5p0(varargin{:});
else
  net = new_5p1(varargin{:});
end

%=============================================================
function net = new_5p1(p,t,s,tf,btf,blf,pf,ipf,tpf,ddf)
if nargin < 2, error(message('nnet:Args:NotEnough')), end

% Defaults
if (nargin < 3), s = []; end
if (nargin < 4), tf = {}; end
if (nargin < 5), btf = 'trainlm'; end
if (nargin < 6), blf = 'learngdm'; end
if (nargin < 7), pf = 'mse'; end
if (nargin < 8), ipf = {'fixunknowns','removeconstantrows','mapminmax'}; end
if (nargin < 9), tpf = {'removeconstantrows','mapminmax'}; end
if (nargin < 10), ddf = 'dividerand'; end

% Format
if isa(p,'cell'), p = cell2mat(p); end
if isa(t,'cell'), t = cell2mat(t); end

% Error checking
if ~(isa(p,'double') || isreal(p) || islogical(p))
  error(message('nnet:NNet:XNotLegal'))
end
if ~(isa(t,'double') || isreal(t) || islogical(t))
  error(message('nnet:NNet:TNotLegal'))
end
if isa(s,'cell')
  if (size(s,1) ~= 1)
    error(message('nnet:NNet:LayerSizes'))
  end
  for i=1:length(s)
    si = s{i};
    if ~isa(si,'double') || ~isreal(si) || any(size(si) ~= 1) || any(si<1) || any(round(si) ~= si)
      error(message('nnet:NNet:LayerSizes'))
    end
  end
  s = cell2mat(s);
end
if (~isa(s,'double')) || ~isreal(s) || (size(s,1) > 1) || any(s<1) || any(round(s) ~= s)
  error(message('nnet:NNet:LayerSizes'))
end

% Architecture
Nl = length(s)+1;
net = network;
net.numInputs = 1;
net.numLayers = Nl;
net.biasConnect = ones(Nl,1);
net.inputConnect(1,1) = 1;
[j,i] = meshgrid(1:Nl,1:Nl);
net.layerConnect = (j == (i-1));
net.outputConnect(Nl) = 1;

% Simulation
net.inputs{1}.processFcns = ipf;
for i=1:Nl
  if (i < Nl)
    net.layers{i}.size = s(i);
    if (Nl == 2)
      net.layers{i}.name = 'Hidden Layer';
    else
      net.layers{i}.name = ['Hidden Layer ' num2str(i)];
    end
  else
    net.layers{i}.name = 'Output Layer';
  end
  if (length(tf) < i) || all(isnan(tf{i}))
    if (i < Nl)
      net.layers{i}.transferFcn = 'tansig';
    else
      net.layers{i}.transferFcn = 'purelin';
    end
  else
    net.layers{i}.transferFcn = tf{i};
  end
end
net.outputs{Nl}.processFcns = tpf;

% Adaption
net.adaptfcn = 'adaptwb';
net.inputWeights{1,1}.learnFcn = blf;
for i=1:Nl
  net.biases{i}.learnFcn = blf;
  net.layerWeights{i,:}.learnFcn = blf;
end

% Training
net.trainfcn = btf;
net.dividefcn = ddf;
net.performFcn = pf;

% Initialization
net.initFcn = 'initlay';
for i=1:Nl
  net.layers{i}.initFcn = 'initnw';
end

% Configuration
% Warning: Use of these properties is no longer recommended
net.inputs{1}.exampleInput = p;
net.outputs{Nl}.exampleOutput = t;

% Initialize
net = init(net);

% Plots
net.plotFcns = {'plotperform','plottrainstate','plotregression'};

%================================================================
function net = new_5p0(p,s,tf,btf,blf,pf)
% Backward compatible to NNT 5.0

if nargin < 2, error(message('nnet:Args:NotEnough')), end

% Defaults
Nl = length(s);
if nargin < 3, tf = {'tansig'}; tf = tf(ones(1,Nl)); end
if nargin < 4, btf = 'trainlm'; end
if nargin < 5, blf = 'learngdm'; end
if nargin < 6, pf = 'mse'; end

% Error checking
if isa(p,'cell') && all(size(p)==[1 1]), p = p{1,1}; end
if (~isa(p,'double')) || ~isreal(p)
  error(message('nnet:NNData:XNotMatorCell1Mat'))
end
if isa(s,'cell')
  if (size(s,1) ~= 1)
    error(message('nnet:NNet:LayerSizes'))
  end
  for i=1:length(s)
    si = s{i};
    if ~isa(si,'double') || ~isreal(si) || any(size(si) ~= 1) || any(si<1) || any(round(si) ~= si)
      error(message('nnet:NNet:LayerSizes'))
    end
  end
  s = cell2mat(s);
end
if (~isa(s,'double')) || ~isreal(s) || (size(s,1) ~= 1) || any(s<1) || any(round(s) ~= s)
  error(message('nnet:NNet:LayerSizes'))
end

% Architecture
net = network(1,Nl);
net.biasConnect = ones(Nl,1);
net.inputConnect(1,1) = 1;
[j,i] = meshgrid(1:Nl,1:Nl);
net.layerConnect = (j == (i-1));
net.outputConnect(Nl) = 1;

% Simulation
for i=1:Nl
  net.layers{i}.size = s(i);
  net.layers{i}.transferFcn = tf{i};
end

% Performance
net.performFcn = pf;

% Adaption
net.adaptfcn = 'adaptwb';
net.inputWeights{1,1}.learnFcn = blf;
for i=1:Nl
  net.biases{i}.learnFcn = blf;
  net.layerWeights{i,:}.learnFcn = blf;
end

% Training
net.trainfcn = btf;

% Initialization
net.initFcn = 'initlay';
for i=1:Nl
  net.layers{i}.initFcn = 'initnw';
end

% Warning: this property is no longer recommended for use
net.inputs{1}.exampleInput = p;

% Initialize
net = init(net);

% Plots
net.plotFcns = {'plotperform','plottrainstate','plotregression'};
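That is the whole file. To make the newer argument list (the new_5p1 path above) concrete, here is a minimal usage sketch; the data, layer sizes, and transfer functions are invented for illustration, and it assumes a toolbox version that still ships newff:

% Minimal sketch, assuming newff is still available (pre-R2010b-style toolbox).
% P, T and the layer sizes below are made-up illustration values.
P = rand(3,100);      % 3-element inputs, 100 samples
T = rand(1,100);      % 1-element targets
net = newff(P,T,[10 5],{'tansig','tansig','purelin'},'trainlm');
net = train(net,P,T); % train with trainlm, the default BTF
Y = sim(net,P);       % simulate the trained network
% If trainlm hits an out-of-memory error, the help above suggests setting
% net.efficiency.memoryReduction to 2 or more, or switching BTF to
% 'trainbfg' or 'trainrp'.

Since the header marks newff as obsoleted in R2010b in favor of feedforwardnet, a rough modern equivalent (my reading of that recommendation, not part of this source) would be:

net = feedforwardnet([10 5],'trainlm'); % hidden layer sizes only
net = train(net,P,T);                   % input/output sizes are inferred from the data
Y = net(P);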
