Contents

  • 1 Purpose
  • 2 Design approach
  • 3 Code
  • 4 Output

Project address

1 Purpose

Classify the prime numbers among the integers 1-100.

2 Design approach

1. Generate the integers 1-100 together with their binary representations (see the encoding sketch after this list).

2. Label the prime numbers with 1 and all other numbers with 0.

3. Use the first 60 samples for training and the remaining 40 for testing.

4. Use a three-layer neural network, with the sigmoid function applied at both the hidden and output layers.
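The following minimal sketch (not part of the original scripts) illustrates how steps 1 and 2 turn one integer into a dataset row: a 7-bit binary encoding plus a 0/1 prime label. Using MATLAB's built-in isprime here is an assumption for brevity; the dataset_generator function in Section 3 uses an explicit prime table instead.

% Illustration of steps 1 and 2 for a single integer (assumes MATLAB's
% built-in isprime; the real dataset_generator uses a hard-coded prime table).
n = 13;                        % any integer in 1..100
bits = dec2bin(n,7) - '0';     % 7-bit binary encoding, here [0 0 0 1 1 0 1]
label = double(isprime(n));    % 1 because 13 is prime
sample = [bits, label]         % one row of the 100x8 dataset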

3 Code

1. Dataset generation function

function f = dataset_generator
%Build a 100x8 dataset: columns 1-7 hold the 7-bit binary encoding of each
%integer 1-100, column 8 holds the label (1 = prime, 0 = not prime).

bits_num = 7;
prime_table = [2,3,5,7,11,13,17,19,23,29,31,37,41,43,47,53,59,61,67,71,73,79,83,89,97];
prime_count = 25;
prime_dataset = zeros(100,8);

%Generate prime number dataset 1-100
for count = 1:100
    bin_str = dec2bin(count,bits_num);
    for i = 1:bits_num
        prime_dataset(count,i) = str2num(bin_str(i));
    end
    for i = 1:prime_count
        if(count == prime_table(i))
            prime_dataset(count,bits_num+1) = 1;    % prime -> label 1
        end
    end
    if(prime_dataset(count,bits_num+1)~=1)
        prime_dataset(count,bits_num+1) = 0;        % non-prime -> label 0
    end
end

f = prime_dataset;
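A quick sanity check (a hedged usage example, assuming the function above is saved as dataset_generator.m on the MATLAB path):

data = dataset_generator;   % 100x8 matrix
data(2,:)                   % 2 is prime, so the last column is 1
data(4,:)                   % 4 is not prime, so the last column is 0
sum(data(:,8))              % 25, the number of primes below 100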

2. Script for selecting the optimal learning rate. With 8 and 15 hidden neurons, the best alpha turned out to be 1 and 0.1 respectively (ask why?).
function test_script1

%Training Set
data = dataset_generator;
x_train = data(1:60,1:7);
y_train = data(1:60,8);
x_test = data(61:100,1:7);
y_test = data(61:100,8);

for pow_num = 1:5

    %Learning Rate
    alpha = 10^(-3+pow_num);

    %Initialize the network
    syn0 = 2*rand(7,15)-1;
    syn1 = 2*rand(15,1)-1;

    %Training the network
    for i = 1:60000
        l0 = x_train;
        l1 = sigmoid(l0*syn0);
        l2 = sigmoid(l1*syn1);
        l2_error = l2 - y_train;
        if(i==1)
            overallerror(i) = mean(abs(l2_error));
        end
        if(mod(i,10000)==0)
            overallerror(i/10000+1) = mean(abs(l2_error));
        end
        l2_delta = l2_error.*sigmoid_derivation(l2);
        l1_error = l2_delta*syn1';
        l1_delta = l1_error.*sigmoid_derivation(l1);
        syn1 = syn1 - alpha*(l1'*l2_delta);
        syn0 = syn0 - alpha*(l0'*l1_delta);
    end
    alpha
    overallerror

end

%Testing progress
%testing_output = sigmoid(sigmoid(x_test*syn0)*syn1)
%testing_error = sum(abs(y_test - testing_output))

function s = sigmoid (x)
[m,n] = size(x);
for i = 1:m
    for j = 1:n
        s(i,j) = 1/(1+exp(-x(i,j)));
    end
end

function s = sigmoid_derivation(x)
s = x.*(1-x);
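To probe the "why?" above, one possible follow-up (a hedged sketch, not part of the original post) is to sweep the hidden-layer size and the learning rate together and compare the final training error of each combination. The grid values and the anonymous sigmoid functions below are choices made for this sketch only; only dataset_generator from Section 3.1 is assumed.

% Hedged sketch: joint grid search over hidden size and learning rate.
sigmoid = @(x) 1./(1+exp(-x));       % vectorized sigmoid
dsigmoid = @(y) y.*(1-y);            % derivative, given sigmoid output y

data = dataset_generator;
x_train = data(1:60,1:7);
y_train = data(1:60,8);

hidden_sizes = [8 15];
alphas = [0.01 0.1 1 10];
final_err = zeros(numel(hidden_sizes), numel(alphas));

for h = 1:numel(hidden_sizes)
    for a = 1:numel(alphas)
        rng(0);                                   % same initial weights in every cell
        syn0 = 2*rand(7,hidden_sizes(h))-1;
        syn1 = 2*rand(hidden_sizes(h),1)-1;
        for i = 1:60000
            l1 = sigmoid(x_train*syn0);
            l2 = sigmoid(l1*syn1);
            l2_delta = (l2 - y_train).*dsigmoid(l2);
            l1_delta = (l2_delta*syn1').*dsigmoid(l1);
            syn1 = syn1 - alphas(a)*(l1'*l2_delta);
            syn0 = syn0 - alphas(a)*(x_train'*l1_delta);
        end
        final_err(h,a) = mean(abs(sigmoid(sigmoid(x_train*syn0)*syn1) - y_train));
    end
end
final_err    % rows: hidden sizes, columns: learning rates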

3. Main program, covering data generation, training/testing split, network training, network testing, and comparison of the results
function test_script

%Training Set
data = dataset_generator;
x_train = data(1:60,1:7);
y_train = data(1:60,8);
x_test = data(61:100,1:7);
y_test = data(61:100,8);

%According to the result of "test_script1.m"
%Learning rate
%"alpha = 1"   --------- "number of hidden neurons = 8"
%"alpha = 0.1" --------- "number of hidden neurons = 15"
alpha = 0.1;

%Initialize the network
syn0 = 2*rand(7,15)-1;
syn1 = 2*rand(15,1)-1;

%Training the network
for i = 1:60000
    l0 = x_train;
    l1 = sigmoid(l0*syn0);
    l2 = sigmoid(l1*syn1);
    l2_error = l2 - y_train;
    if(i==1)
        overallerror(i) = mean(abs(l2_error));
    end
    if(mod(i,10000)==0)
        overallerror(i/10000+1) = mean(abs(l2_error));
    end
    l2_delta = l2_error.*sigmoid_derivation(l2);
    l1_error = l2_delta*syn1';
    l1_delta = l1_error.*sigmoid_derivation(l1);
    syn1 = syn1 - alpha*(l1'*l2_delta);
    syn0 = syn0 - alpha*(l0'*l1_delta);
end
overallerror

%Testing progress
testing_output = sigmoid(sigmoid(x_test*syn0)*syn1);
testing_output = round(testing_output);
testing_error = sum(abs(y_test - testing_output))
for cnt = 61:100
    testing_output(cnt-60,2) = cnt;    % second column: the number being classified
end
testing_output

function s = sigmoid (x)
[m,n] = size(x);
for i = 1:m
    for j = 1:n
        s(i,j) = 1/(1+exp(-x(i,j)));
    end
end

function s = sigmoid_derivation(x)
s = x.*(1-x);
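A slightly more readable summary of the test run (a hedged addition, reusing y_test and the two-column testing_output built above) could be appended at the end of test_script:

% Hedged follow-up: summarize the test results.
predictions = testing_output(:,1);                  % rounded network outputs for 61..100
accuracy = mean(predictions == y_test)              % fraction of the 40 test numbers classified correctly
missed = testing_output(predictions ~= y_test, 2)   % the numbers the network gets wrong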

4 Output

Analysis and follow-up work: unlike the earlier, simpler odd/even classification, the error rate here is very large. Is this due to the non-linear nature of primality? And how can the network's results be tied back to the mathematics?
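One small, hedged sanity check related to this question (not in the original post): compare testing_error with a trivial baseline that predicts 0 for every number, since only 8 of the 40 test numbers (61-100) are prime.

% Hedged baseline check: always predicting "not prime" on the test split.
data = dataset_generator;
y_test = data(61:100,8);
baseline_error = sum(y_test)    % = 8, the number of primes between 61 and 100
% If testing_error from test_script is close to or above this value, the
% network has learned little beyond the class imbalance.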
