The previous post covered the theory behind ResNet and DenseNet; this one walks through a concrete implementation.


ResNet

```python
def basic_block(input, in_features, out_features, stride, is_training, keep_prob):
    """Residual block: two 3x3 convs with a shortcut connection."""
    if stride == 1:
        shortcut = input
    else:
        # Downsample the shortcut with average pooling, then zero-pad the
        # channel dimension up to out_features (the "option A" shortcut).
        shortcut = tf.nn.avg_pool(input, [1, stride, stride, 1],
                                  [1, stride, stride, 1], 'VALID')
        shortcut = tf.pad(shortcut, [[0, 0], [0, 0], [0, 0],
                                     [(out_features - in_features) // 2,
                                      (out_features - in_features) // 2]])
    # Note: conv2d here is called with a stride argument, unlike the
    # stride-1 conv2d defined in the DenseNet section below.
    current = conv2d(input, in_features, out_features, 3, stride)
    current = tf.nn.dropout(current, keep_prob)
    current = tf.contrib.layers.batch_norm(current, scale=True,
                                           is_training=is_training,
                                           updates_collections=None)
    current = tf.nn.relu(current)
    current = conv2d(current, out_features, out_features, 3, 1)
    current = tf.nn.dropout(current, keep_prob)
    current = tf.contrib.layers.batch_norm(current, scale=True,
                                           is_training=is_training,
                                           updates_collections=None)
    return current + shortcut

def block_stack(input, in_features, out_features, stride, depth, is_training, keep_prob):
    """Stack `depth` residual blocks; only the first may change stride/width."""
    current = basic_block(input, in_features, out_features, stride,
                          is_training, keep_prob)
    for _d in range(depth - 1):
        current = basic_block(current, out_features, out_features, 1,
                              is_training, keep_prob)
    return current
```
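The only non-obvious part of `basic_block` is the shortcut when the spatial size or channel count changes. Here is a minimal NumPy sketch (my own illustration, not the author's code) of that shape bookkeeping, using strided subsampling as a stand-in for the stride-`s` average pooling:

```python
import numpy as np

def shortcut_pad(x, out_features, stride):
    """Downsample spatially and zero-pad the channel axis so the
    shortcut matches the residual branch's output shape.
    x has NHWC shape (N, H, W, C)."""
    if stride > 1:
        x = x[:, ::stride, ::stride, :]  # stand-in for avg_pool
    in_features = x.shape[-1]
    pad = (out_features - in_features) // 2
    # Pad only the last (channel) dimension, half on each side.
    return np.pad(x, [(0, 0), (0, 0), (0, 0), (pad, pad)])

x = np.ones((1, 32, 32, 16))
y = shortcut_pad(x, 32, 2)
# y has shape (1, 16, 16, 32); the first and last 8 channels are zeros.
```

The zero-padded channels carry no parameters, which is why this shortcut variant adds no cost when the width doubles.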

DenseNet

```python
def conv2d(input, in_features, out_features, kernel_size, with_bias=False):
    """kxk convolution with stride 1 and SAME padding."""
    W = weight_variable([kernel_size, kernel_size, in_features, out_features])
    conv = tf.nn.conv2d(input, W, [1, 1, 1, 1], padding='SAME')
    if with_bias:
        return conv + bias_variable([out_features])
    return conv

def batch_activ_conv(current, in_features, out_features, kernel_size, is_training, keep_prob):
    """BatchNorm + ReLU + conv + dropout (pre-activation ordering)."""
    current = tf.contrib.layers.batch_norm(current, scale=True,
                                           is_training=is_training,
                                           updates_collections=None)
    current = tf.nn.relu(current)
    current = conv2d(current, in_features, out_features, kernel_size)
    current = tf.nn.dropout(current, keep_prob)
    return current

def block(input, layers, in_features, growth, is_training, keep_prob):
    """Dense block: each layer's output is concatenated onto its input."""
    current = input
    features = in_features
    for idx in range(layers):
        tmp = batch_activ_conv(current, features, growth, 3,
                               is_training, keep_prob)
        current = tf.concat(3, (current, tmp))  # concat along the channel axis
        features += growth
    return current, features

def model():
    """DenseNet on CIFAR (the 32x32x3 input shows this is CIFAR,
    despite the original 'ImageNet' comment)."""
    current = tf.reshape(xs, [-1, 32, 32, 3])  # input images
    current = conv2d(current, 3, 16, 3)
    # Three dense blocks with growth rate 12, separated by
    # 1x1-conv + 2x2 average-pool transition layers.
    current, features = block(current, layers, 16, 12, is_training, keep_prob)
    current = batch_activ_conv(current, features, features, 1,
                               is_training, keep_prob)
    current = avg_pool(current, 2)
    current, features = block(current, layers, features, 12, is_training, keep_prob)
    current = batch_activ_conv(current, features, features, 1,
                               is_training, keep_prob)
    current = avg_pool(current, 2)
    current, features = block(current, layers, features, 12, is_training, keep_prob)
    current = tf.contrib.layers.batch_norm(current, scale=True,
                                           is_training=is_training,
                                           updates_collections=None)
    current = tf.nn.relu(current)
    current = avg_pool(current, 8)  # global pooling down to 1x1
    final_dim = features
    current = tf.reshape(current, [-1, final_dim])
    # Final fully connected layer + softmax classifier.
    Wfc = weight_variable([final_dim, label_count])
    bfc = bias_variable([label_count])
    ys_ = tf.nn.softmax(tf.matmul(current, Wfc) + bfc)
```
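A quick way to sanity-check `block` and `model` is to track the channel count: each of the `layers` convolutions in a dense block appends `growth` feature maps, and the 1x1 transition layers above keep the count unchanged. A small sketch of that arithmetic (my own illustration; it assumes `layers = 12`, the 40-layer DenseNet configuration, since `layers` is a free variable in `model`):

```python
def dense_channels(in_features, layers, growth):
    """Channels after a dense block: each layer concatenates
    `growth` new feature maps onto its input."""
    return in_features + layers * growth

# Mirror model(): 16 channels after the stem conv, then three
# dense blocks; the transition layers keep the channel count.
features = 16
for _ in range(3):
    features = dense_channels(features, 12, 12)
# features is now 16 + 3 * 12 * 12 = 448, the final_dim fed to the classifier
```

This is why `final_dim` never appears as a constant in `model`: it falls out of `in_features + layers * growth` per block.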

The code is not complete; it is only meant to convey the core idea in its most naive form.
