Images from: http://galaxy.agh.edu.pl/~vlsi/AI/backp_t_en/backprop.html
Implementation based on: http://www.wildml.com/2015/09/implementing-a-neural-network-from-scratch/
Experimental data from: http://yann.lecun.com/exdb/mnist/
Thanks to the original authors; this material will be removed on request.

1. Forward Propagation and Back Propagation

  • 1.1 Network structure
  • Structure of a single neuron
  • 1.2 Forward propagation algorithm
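
The figures that originally illustrated these subsections are images from [2] and are not reproduced here. As a minimal sketch in the notation of the code in Section 2: a single neuron computes a = σ(wᵀx + b), and stacking neurons into layers, with input batch X, weights W1, W2, biases b1, b2, and the sigmoid σ, the forward pass is

    z_1 = X W_1 + b_1, \qquad a_1 = \sigma(z_1) = \frac{1}{1 + e^{-z_1}}
    z_2 = a_1 W_2 + b_2, \qquad \hat{y} = \operatorname{softmax}(z_2), \quad \hat{y}_k = \frac{e^{z_{2,k}}}{\sum_j e^{z_{2,j}}}
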
  • 1.3 Back propagation algorithm
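
The derivation figures are likewise from [2] and omitted. For a softmax output trained with the cross-entropy loss (as in the code below), with y the one-hot label matrix, the backward pass computes

    \delta_3 = \hat{y} - y
    \nabla_{W_2} L = a_1^{\top} \delta_3, \qquad \nabla_{b_2} L = \textstyle\sum_i \delta_3^{(i)}
    \delta_2 = (\delta_3 W_2^{\top}) \odot a_1 \odot (1 - a_1)
    \nabla_{W_1} L = X^{\top} \delta_2, \qquad \nabla_{b_1} L = \textstyle\sum_i \delta_2^{(i)}

where a_1 ⊙ (1 − a_1) is the derivative of the sigmoid; with the tanh activation it would be 1 − a_1² instead.
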
  • 1.4 Weight update
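
With learning rate ε (epsilon in the code) and L2 regularization strength λ (reg_lambda), plain gradient descent updates each layer as

    W \leftarrow W - \varepsilon\,(\nabla_W L + \lambda W), \qquad b \leftarrow b - \varepsilon\,\nabla_b L

The λW term is the derivative of the (λ/2)‖W‖² penalty added to the loss; biases are conventionally left unregularized.
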
2. Implementing the Neural Network

  • Data used by the program: https://raw.githubusercontent.com/lxrobot/lxrobot-s-code/master/data178x197.npy
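
A minimal loading sketch (assuming the file is still hosted at the URL above; urllib here is the Python 2 module, matching the Python 2 code below):

    import numpy as np
    import urllib  # Python 2; in Python 3 use urllib.request instead

    URL = ('https://raw.githubusercontent.com/lxrobot/'
           'lxrobot-s-code/master/data178x197.npy')
    urllib.urlretrieve(URL, 'data178x197.npy')

    datas = np.load('data178x197.npy')
    print datas.shape  # expected (178, 197): 178 samples, 14*14 pixels + 1 label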

    
    #!/usr/bin/env python2
    # -*- coding: utf-8 -*-
    """
    Created on Thu Jul 19 19:11:28 2018
    @author: rd
    """
    from __future__ import division
    import numpy as np

    def sig(_z):
        """Element-wise sigmoid activation."""
        _y = 1 / (1 + np.exp(-_z))
        return _y

    def predict(model, X):
        """Forward pass; returns the softmax class probabilities."""
        W1, b1, W2, b2 = model['W1'], model['b1'], model['W2'], model['b2']
        z1 = X.dot(W1) + b1
        # a1 = np.tanh(z1)
        a1 = sig(z1)
        z2 = a1.dot(W2) + b2
        exp_scores = np.exp(z2)
        probs = exp_scores / np.sum(exp_scores, axis=1, keepdims=True)
        return probs

    def get_accuracy(model, X, Y):
        """Fraction of samples whose arg-max prediction matches the label."""
        probs = predict(model, X)
        pre_Y = np.argmax(probs, axis=1)
        comp = pre_Y == Y
        return len(np.flatnonzero(comp)) / Y.shape[0]

    def get_loss(model, X, Y, reg_lambda):
        probs = predict(model, X)
        # Cross-entropy loss on the correct classes
        correct_logprobs = -np.log(probs[range(X.shape[0]), Y])
        data_loss = np.sum(correct_logprobs)
        # Add the L2 regularization term to the loss
        data_loss += reg_lambda / 2 * (np.sum(np.square(model['W1']))
                                       + np.sum(np.square(model['W2'])))
        loss = 1. / X.shape[0] * data_loss
        return loss

    def nn_model(X, Y, nn_hdim, nn_output_dim, steps, epsilon, reg_lambda):
        np.random.seed(0)
        W1 = np.random.randn(X.shape[1], nn_hdim)
        b1 = np.ones((1, nn_hdim))
        W2 = np.random.randn(nn_hdim, nn_output_dim)
        b2 = np.ones((1, nn_output_dim))
        model = {}
        for i in xrange(steps):
            # Forward propagation
            Z1 = np.dot(X, W1) + b1
            # a1 = np.tanh(Z1)
            a1 = sig(Z1)
            Z2 = np.dot(a1, W2) + b2
            # Softmax output
            exp_score = np.exp(Z2)
            prob = exp_score / np.sum(exp_score, axis=1, keepdims=True)
            # Backward propagation: delta3 = prob - one_hot(Y)
            delta3 = prob
            delta3[range(X.shape[0]), Y] -= 1
            dW2 = np.dot(a1.T, delta3)
            db2 = np.sum(delta3, axis=0, keepdims=True)
            # Sigmoid derivative is a1*(1-a1); for tanh it would be (1 - a1**2)
            delta2 = np.dot(delta3, W2.T) * a1 * (1 - a1)
            dW1 = np.dot(X.T, delta2)
            db1 = np.sum(delta2, axis=0, keepdims=True)
            # Add the regularization gradient, then take a gradient step
            dW2 += reg_lambda * W2
            dW1 += reg_lambda * W1
            W2 += -epsilon * dW2
            b2 += -epsilon * db2
            W1 += -epsilon * dW1
            b1 += -epsilon * db1
            if i % 500 == 0:
                model = {'W1': W1, 'b1': b1, 'W2': W2, 'b2': b2}
                print "The {} steps, Loss = {:2.5f}, Accuracy = {:2.5f}".format(
                    i, get_loss(model, X, Y, reg_lambda), get_accuracy(model, X, Y))
        return model

    def main():
        """The data is saved in a 178x197 numpy array in random order;
        197 = 14*14 + 1, where 14 is the image size and 1 is the label."""
        datas = np.load('data178x197.npy')
        np.random.seed(14)
        np.random.shuffle(datas)
        sp = int(datas.shape[0] / 3)
        train_X = datas[:sp, :-1]
        train_Y = datas[:sp, -1].astype(int)  # labels index the prob matrix
        test_X = datas[sp:, :-1]
        test_Y = datas[sp:, -1].astype(int)
        reg_lambda = 0.05
        epsilon = 0.01
        steps = 10000
        nn_output_dim = 2
        nn_hdim = 16
        model = nn_model(train_X, train_Y, nn_hdim, nn_output_dim,
                         steps, epsilon, reg_lambda)
        print "The test accuracy is {:2.5f}".format(get_accuracy(model, test_X, test_Y))

    if __name__ == '__main__':
        main()
    $ python nn_model.py
    The 0 steps, Loss = 116.85237, Accuracy = 0.47458
    The 500 steps, Loss = 7656.93264, Accuracy = 1.00000
    The 1000 steps, Loss = 11102.96887, Accuracy = 1.00000
    The 1500 steps, Loss = 14026.85439, Accuracy = 1.00000
    The 2000 steps, Loss = 15287.36413, Accuracy = 1.00000
    The 2500 steps, Loss = 16782.16622, Accuracy = 1.00000
    The 3000 steps, Loss = 18721.08597, Accuracy = 1.00000
    The 3500 steps, Loss = 19557.37682, Accuracy = 1.00000
    The 4000 steps, Loss = 20139.67117, Accuracy = 1.00000
    The 4500 steps, Loss = 21280.24345, Accuracy = 1.00000
    The 5000 steps, Loss = 21331.53461, Accuracy = 1.00000
    The 5500 steps, Loss = 22157.03441, Accuracy = 1.00000
    The 6000 steps, Loss = 21961.40862, Accuracy = 1.00000
    The 6500 steps, Loss = 22537.47486, Accuracy = 1.00000
    The 7000 steps, Loss = 22923.17602, Accuracy = 1.00000
    The 7500 steps, Loss = 23428.20322, Accuracy = 1.00000
    The 8000 steps, Loss = 23646.00209, Accuracy = 1.00000
    The 8500 steps, Loss = 23844.16144, Accuracy = 1.00000
    The 9000 steps, Loss = 24419.29215, Accuracy = 1.00000
    The 9500 steps, Loss = 23643.59117, Accuracy = 1.00000
    The test accuracy is 0.99160
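
A quick sanity check on these numbers: the 178-row array is split into sp = int(178/3) = 59 training rows and 178 − 59 = 119 test rows, so a test accuracy of 0.99160 corresponds to 118/119 correctly classified test samples (118/119 ≈ 0.99160).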

References

[1] http://www.wildml.com/2015/09/implementing-a-neural-network-from-scratch/

[2] http://galaxy.agh.edu.pl/~vlsi/AI/backp_t_en/backprop.html

[3] http://yann.lecun.com/exdb/mnist/
