Table of Contents

  • CIFAR10 Custom Network in Practice

CIFAR10 Custom Network in Practice


import os
os.environ['TF_CPP_MIN_LOG_LEVEL'] = '2'

import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import datasets, layers, optimizers, Sequential, metrics

# Preprocessing: scale pixels from [0, 255] to [-1, 1]
def preprocess(x, y):
    x = 2 * tf.cast(x, dtype=tf.float32) / 255. - 1
    # y is already one-hot encoded at this point; the cast only fixes its dtype
    y = tf.cast(y, dtype=tf.int32)
    return x, y

batchsz = 128
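To see concretely what the scaling in preprocess does, here is a quick standalone check (a sketch; the three sample values are arbitrary):

xs = tf.constant([0, 128, 255], dtype=tf.uint8)
print((2 * tf.cast(xs, tf.float32) / 255. - 1).numpy())
# approximately [-1., 0.0039, 1.]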
# Load the dataset
# x: [b, 32, 32, 3], y: [b, 1]
(x, y), (x_val, y_val) = datasets.cifar10.load_data()
# Squeeze out the extra dimension: [b, 1] -> [b]
y = tf.squeeze(y)
y_val = tf.squeeze(y_val)
y = tf.one_hot(y, depth=10)
y_val = tf.one_hot(y_val, depth=10)
print('datasets:', x.shape, y.shape, x.min(), x.max())
# datasets: (50000, 32, 32, 3) (50000, 10) 0 255

# Build the train and test datasets
train_db = tf.data.Dataset.from_tensor_slices((x, y))
train_db = train_db.map(preprocess).shuffle(10000).batch(batchsz)
test_db = tf.data.Dataset.from_tensor_slices((x_val, y_val))
test_db = test_db.map(preprocess).batch(batchsz)

sample = next(iter(train_db))
print('batch:', sample[0].shape, sample[1].shape)
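With batchsz = 128 and one-hot depth 10, this prints batch: (128, 32, 32, 3) (128, 10): each batch pairs 128 images with their one-hot labels.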
# Create a custom layer to replace the standard layers.Dense
class MyDense(layers.Layer):

    def __init__(self, inp_dim, outp_dim):
        super(MyDense, self).__init__()
        self.kernel = self.add_weight('w', [inp_dim, outp_dim])
        # self.bias = self.add_weight('b', [outp_dim])

    # Forward pass: a plain matrix multiply, no bias
    def call(self, input, training=None):
        x = input @ self.kernel
        return x
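With the bias commented out, each layer is a pure matrix multiply. A variant with the bias and explicit initializers wired in could look like this (a sketch; this is not the layer the logs below were produced with):

class MyDenseWithBias(layers.Layer):

    def __init__(self, inp_dim, outp_dim):
        super(MyDenseWithBias, self).__init__()
        # glorot_uniform is the default kernel initializer of layers.Dense
        self.kernel = self.add_weight('w', [inp_dim, outp_dim],
                                      initializer='glorot_uniform')
        self.bias = self.add_weight('b', [outp_dim], initializer='zeros')

    def call(self, inputs, training=None):
        return inputs @ self.kernel + self.bias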
# Build a custom 5-layer network
class MyNetwork(keras.Model):

    def __init__(self):
        super(MyNetwork, self).__init__()
        # Wider layers add capacity, but overfit more easily
        self.fc1 = MyDense(32*32*3, 256)
        self.fc2 = MyDense(256, 128)
        self.fc3 = MyDense(128, 64)
        self.fc4 = MyDense(64, 32)
        self.fc5 = MyDense(32, 10)

    def call(self, inputs, training=None):
        """
        :param inputs: [b, 32, 32, 3]
        :param training:
        :return: logits of shape [b, 10]
        """
        # Flatten the image: [b, 32, 32, 3] -> [b, 3072]
        x = tf.reshape(inputs, [-1, 32*32*3])
        x = self.fc1(x)
        x = tf.nn.relu(x)
        x = self.fc2(x)
        x = tf.nn.relu(x)
        x = self.fc3(x)
        x = tf.nn.relu(x)
        x = self.fc4(x)
        x = tf.nn.relu(x)
        # x: [b, 32] -> [b, 10]
        x = self.fc5(x)
        return x

network = MyNetwork()
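For reference, the five kernels above hold 3072*256 + 256*128 + 128*64 + 64*32 + 32*10 = 829,760 trainable weights (there are no bias terms).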
network.compile(optimizer=optimizers.Adam(learning_rate=1e-3),
                loss=tf.losses.CategoricalCrossentropy(from_logits=True),
                metrics=['accuracy'])
network.fit(train_db, epochs=15, validation_data=test_db, validation_freq=1)

# Evaluate on the test set, then save the model weights
network.evaluate(test_db)
network.save_weights('ckpt/weights.ckpt')
del network
print('saved to ckpt/weights.ckpt')

# Re-create the network from scratch
network = MyNetwork()
network.compile(optimizer=optimizers.Adam(learning_rate=1e-3),
                loss=tf.losses.CategoricalCrossentropy(from_logits=True),
                metrics=['accuracy'])

# Load the saved weights back into the fresh network
network.load_weights('ckpt/weights.ckpt')
print('load weights from file')
network.evaluate(test_db)
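Because MyNetwork is a subclassed model, save_weights stores only the parameter values; the Python object has to be constructed and compiled again before load_weights can restore them, which is why the setup code is repeated above. Exporting the whole model is also possible via the SavedModel format (a sketch; support for subclassed models varies by TensorFlow version, and 'saved_model_dir' is an arbitrary path):

network.save('saved_model_dir')
restored = keras.models.load_model('saved_model_dir')
restored.evaluate(test_db)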
Epoch 14/15
391/391 [==============================] - 4s 10ms/step - loss: 0.5697 - accuracy: 0.7956 - val_loss: 1.9200 - val_accuracy: 0.5195
Epoch 15/15
391/391 [==============================] - 4s 10ms/step - loss: 0.5200 - accuracy: 0.8126 - val_loss: 2.0124 - val_accuracy: 0.5189
79/79 [==============================] - 0s 6ms/step - loss: 2.0124 - accuracy: 0.5189
saved to ckpt/weights.ckpt
load weights from file
79/79 [==============================] - 1s 7ms/step - loss: 2.0124 - accuracy: 0.5189
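The reloaded network reports exactly the same test loss and accuracy (2.0124 / 0.5189) as the one that was saved, confirming the weights survived the round trip.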

Without convolutions, this is about as good as it gets: the fully connected model reaches roughly 81% training accuracy but only about 52% on the validation set, i.e. it overfits heavily.
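A natural next step is a small convolutional network trained on the same train_db/test_db pipeline. A minimal sketch (the layer sizes here are assumptions, not tuned results):

conv_net = Sequential([
    layers.Conv2D(32, 3, activation='relu', input_shape=(32, 32, 3)),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation='relu'),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(128, activation='relu'),
    layers.Dense(10),  # logits, to match from_logits=True above
])
conv_net.compile(optimizer=optimizers.Adam(learning_rate=1e-3),
                 loss=tf.losses.CategoricalCrossentropy(from_logits=True),
                 metrics=['accuracy'])
conv_net.fit(train_db, epochs=15, validation_data=test_db)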
