Loading a self-trained Keras model and removing the fully connected layers
It is actually quite simple:
from keras.models import load_model

base_model = load_model('model_resenet.h5')  # load the saved model
base_model.summary()  # print the network structure (summary() prints directly and returns None)

This is the output for my network model; it is essentially a diagram of its structure:
__________________________________________________________________________________________________
Layer (type) Output Shape Param # Connected to
==================================================================================================
input_1 (InputLayer) (None, 227, 227, 1) 0
__________________________________________________________________________________________________
conv2d_1 (Conv2D) (None, 225, 225, 32) 320 input_1[0][0]
__________________________________________________________________________________________________
batch_normalization_1 (BatchNor (None, 225, 225, 32) 128 conv2d_1[0][0]
__________________________________________________________________________________________________
activation_1 (Activation) (None, 225, 225, 32) 0 batch_normalization_1[0][0]
__________________________________________________________________________________________________
conv2d_2 (Conv2D) (None, 225, 225, 32) 9248 activation_1[0][0]
__________________________________________________________________________________________________
batch_normalization_2 (BatchNor (None, 225, 225, 32) 128 conv2d_2[0][0]
__________________________________________________________________________________________________
activation_2 (Activation) (None, 225, 225, 32) 0 batch_normalization_2[0][0]
__________________________________________________________________________________________________
conv2d_3 (Conv2D) (None, 225, 225, 32) 9248 activation_2[0][0]
__________________________________________________________________________________________________
batch_normalization_3 (BatchNor (None, 225, 225, 32) 128 conv2d_3[0][0]
__________________________________________________________________________________________________
merge_1 (Merge) (None, 225, 225, 32) 0 batch_normalization_3[0][0] activation_1[0][0]
__________________________________________________________________________________________________
activation_3 (Activation) (None, 225, 225, 32) 0 merge_1[0][0]
__________________________________________________________________________________________________
conv2d_4 (Conv2D) (None, 225, 225, 32) 9248 activation_3[0][0]
__________________________________________________________________________________________________
batch_normalization_4 (BatchNor (None, 225, 225, 32) 128 conv2d_4[0][0]
__________________________________________________________________________________________________
activation_4 (Activation) (None, 225, 225, 32) 0 batch_normalization_4[0][0]
__________________________________________________________________________________________________
conv2d_5 (Conv2D) (None, 225, 225, 32) 9248 activation_4[0][0]
__________________________________________________________________________________________________
batch_normalization_5 (BatchNor (None, 225, 225, 32) 128 conv2d_5[0][0]
__________________________________________________________________________________________________
merge_2 (Merge) (None, 225, 225, 32) 0 batch_normalization_5[0][0] activation_3[0][0]
__________________________________________________________________________________________________
activation_5 (Activation) (None, 225, 225, 32) 0 merge_2[0][0]
__________________________________________________________________________________________________
max_pooling2d_1 (MaxPooling2D) (None, 112, 112, 32) 0 activation_5[0][0]
__________________________________________________________________________________________________
conv2d_6 (Conv2D) (None, 110, 110, 64) 18496 max_pooling2d_1[0][0]
__________________________________________________________________________________________________
batch_normalization_6 (BatchNor (None, 110, 110, 64) 256 conv2d_6[0][0]
__________________________________________________________________________________________________
activation_6 (Activation) (None, 110, 110, 64) 0 batch_normalization_6[0][0]
__________________________________________________________________________________________________
conv2d_7 (Conv2D) (None, 110, 110, 64) 36928 activation_6[0][0]
__________________________________________________________________________________________________
batch_normalization_7 (BatchNor (None, 110, 110, 64) 256 conv2d_7[0][0]
__________________________________________________________________________________________________
activation_7 (Activation) (None, 110, 110, 64) 0 batch_normalization_7[0][0]
__________________________________________________________________________________________________
conv2d_8 (Conv2D) (None, 110, 110, 64) 36928 activation_7[0][0]
__________________________________________________________________________________________________
batch_normalization_8 (BatchNor (None, 110, 110, 64) 256 conv2d_8[0][0]
__________________________________________________________________________________________________
merge_3 (Merge) (None, 110, 110, 64) 0 batch_normalization_8[0][0] activation_6[0][0]
__________________________________________________________________________________________________
activation_8 (Activation) (None, 110, 110, 64) 0 merge_3[0][0]
__________________________________________________________________________________________________
conv2d_9 (Conv2D) (None, 110, 110, 64) 36928 activation_8[0][0]
__________________________________________________________________________________________________
batch_normalization_9 (BatchNor (None, 110, 110, 64) 256 conv2d_9[0][0]
__________________________________________________________________________________________________
activation_9 (Activation) (None, 110, 110, 64) 0 batch_normalization_9[0][0]
__________________________________________________________________________________________________
conv2d_10 (Conv2D) (None, 110, 110, 64) 36928 activation_9[0][0]
__________________________________________________________________________________________________
batch_normalization_10 (BatchNo (None, 110, 110, 64) 256 conv2d_10[0][0]
__________________________________________________________________________________________________
merge_4 (Merge) (None, 110, 110, 64) 0 batch_normalization_10[0][0] activation_8[0][0]
__________________________________________________________________________________________________
activation_10 (Activation) (None, 110, 110, 64) 0 merge_4[0][0]
__________________________________________________________________________________________________
max_pooling2d_2 (MaxPooling2D) (None, 55, 55, 64) 0 activation_10[0][0]
__________________________________________________________________________________________________
conv2d_11 (Conv2D) (None, 53, 53, 64) 36928 max_pooling2d_2[0][0]
__________________________________________________________________________________________________
batch_normalization_11 (BatchNo (None, 53, 53, 64) 256 conv2d_11[0][0]
__________________________________________________________________________________________________
activation_11 (Activation) (None, 53, 53, 64) 0 batch_normalization_11[0][0]
__________________________________________________________________________________________________
max_pooling2d_3 (MaxPooling2D) (None, 26, 26, 64) 0 activation_11[0][0]
__________________________________________________________________________________________________
conv2d_12 (Conv2D) (None, 26, 26, 64) 36928 max_pooling2d_3[0][0]
__________________________________________________________________________________________________
batch_normalization_12 (BatchNo (None, 26, 26, 64) 256 conv2d_12[0][0]
__________________________________________________________________________________________________
activation_12 (Activation) (None, 26, 26, 64) 0 batch_normalization_12[0][0]
__________________________________________________________________________________________________
conv2d_13 (Conv2D) (None, 26, 26, 64) 36928 activation_12[0][0]
__________________________________________________________________________________________________
batch_normalization_13 (BatchNo (None, 26, 26, 64) 256 conv2d_13[0][0]
__________________________________________________________________________________________________
merge_5 (Merge) (None, 26, 26, 64) 0 batch_normalization_13[0][0] max_pooling2d_3[0][0]
__________________________________________________________________________________________________
activation_13 (Activation) (None, 26, 26, 64) 0 merge_5[0][0]
__________________________________________________________________________________________________
conv2d_14 (Conv2D) (None, 26, 26, 64) 36928 activation_13[0][0]
__________________________________________________________________________________________________
batch_normalization_14 (BatchNo (None, 26, 26, 64) 256 conv2d_14[0][0]
__________________________________________________________________________________________________
activation_14 (Activation) (None, 26, 26, 64) 0 batch_normalization_14[0][0]
__________________________________________________________________________________________________
conv2d_15 (Conv2D) (None, 26, 26, 64) 36928 activation_14[0][0]
__________________________________________________________________________________________________
batch_normalization_15 (BatchNo (None, 26, 26, 64) 256 conv2d_15[0][0]
__________________________________________________________________________________________________
merge_6 (Merge) (None, 26, 26, 64) 0 batch_normalization_15[0][0] activation_13[0][0]
__________________________________________________________________________________________________
activation_15 (Activation) (None, 26, 26, 64) 0 merge_6[0][0]
__________________________________________________________________________________________________
max_pooling2d_4 (MaxPooling2D) (None, 13, 13, 64) 0 activation_15[0][0]
__________________________________________________________________________________________________
conv2d_16 (Conv2D) (None, 11, 11, 32) 18464 max_pooling2d_4[0][0]
__________________________________________________________________________________________________
batch_normalization_16 (BatchNo (None, 11, 11, 32) 128 conv2d_16[0][0]
__________________________________________________________________________________________________
activation_16 (Activation) (None, 11, 11, 32) 0 batch_normalization_16[0][0]
__________________________________________________________________________________________________
conv2d_17 (Conv2D) (None, 11, 11, 32) 9248 activation_16[0][0]
__________________________________________________________________________________________________
batch_normalization_17 (BatchNo (None, 11, 11, 32) 128 conv2d_17[0][0]
__________________________________________________________________________________________________
activation_17 (Activation) (None, 11, 11, 32) 0 batch_normalization_17[0][0]
__________________________________________________________________________________________________
conv2d_18 (Conv2D) (None, 11, 11, 32) 9248 activation_17[0][0]
__________________________________________________________________________________________________
batch_normalization_18 (BatchNo (None, 11, 11, 32) 128 conv2d_18[0][0]
__________________________________________________________________________________________________
merge_7 (Merge) (None, 11, 11, 32) 0 batch_normalization_18[0][0] activation_16[0][0]
__________________________________________________________________________________________________
activation_18 (Activation) (None, 11, 11, 32) 0 merge_7[0][0]
__________________________________________________________________________________________________
conv2d_19 (Conv2D) (None, 11, 11, 32) 9248 activation_18[0][0]
__________________________________________________________________________________________________
batch_normalization_19 (BatchNo (None, 11, 11, 32) 128 conv2d_19[0][0]
__________________________________________________________________________________________________
activation_19 (Activation) (None, 11, 11, 32) 0 batch_normalization_19[0][0]
__________________________________________________________________________________________________
conv2d_20 (Conv2D) (None, 11, 11, 32) 9248 activation_19[0][0]
__________________________________________________________________________________________________
batch_normalization_20 (BatchNo (None, 11, 11, 32) 128 conv2d_20[0][0]
__________________________________________________________________________________________________
merge_8 (Merge) (None, 11, 11, 32) 0 batch_normalization_20[0][0] activation_18[0][0]
__________________________________________________________________________________________________
activation_20 (Activation) (None, 11, 11, 32) 0 merge_8[0][0]
__________________________________________________________________________________________________
max_pooling2d_5 (MaxPooling2D) (None, 5, 5, 32) 0 activation_20[0][0]
__________________________________________________________________________________________________
conv2d_21 (Conv2D) (None, 3, 3, 64) 18496 max_pooling2d_5[0][0]
__________________________________________________________________________________________________
batch_normalization_21 (BatchNo (None, 3, 3, 64) 256 conv2d_21[0][0]
__________________________________________________________________________________________________
activation_21 (Activation) (None, 3, 3, 64) 0 batch_normalization_21[0][0]
__________________________________________________________________________________________________
conv2d_22 (Conv2D) (None, 3, 3, 64) 36928 activation_21[0][0]
__________________________________________________________________________________________________
batch_normalization_22 (BatchNo (None, 3, 3, 64) 256 conv2d_22[0][0]
__________________________________________________________________________________________________
activation_22 (Activation) (None, 3, 3, 64) 0 batch_normalization_22[0][0]
__________________________________________________________________________________________________
conv2d_23 (Conv2D) (None, 3, 3, 64) 36928 activation_22[0][0]
__________________________________________________________________________________________________
batch_normalization_23 (BatchNo (None, 3, 3, 64) 256 conv2d_23[0][0]
__________________________________________________________________________________________________
merge_9 (Merge) (None, 3, 3, 64) 0 batch_normalization_23[0][0] activation_21[0][0]
__________________________________________________________________________________________________
activation_23 (Activation) (None, 3, 3, 64) 0 merge_9[0][0]
__________________________________________________________________________________________________
conv2d_24 (Conv2D) (None, 3, 3, 64) 36928 activation_23[0][0]
__________________________________________________________________________________________________
batch_normalization_24 (BatchNo (None, 3, 3, 64) 256 conv2d_24[0][0]
__________________________________________________________________________________________________
activation_24 (Activation) (None, 3, 3, 64) 0 batch_normalization_24[0][0]
__________________________________________________________________________________________________
conv2d_25 (Conv2D) (None, 3, 3, 64) 36928 activation_24[0][0]
__________________________________________________________________________________________________
batch_normalization_25 (BatchNo (None, 3, 3, 64) 256 conv2d_25[0][0]
__________________________________________________________________________________________________
merge_10 (Merge) (None, 3, 3, 64) 0 batch_normalization_25[0][0] activation_23[0][0]
__________________________________________________________________________________________________
activation_25 (Activation) (None, 3, 3, 64) 0 merge_10[0][0]
__________________________________________________________________________________________________
max_pooling2d_6 (MaxPooling2D) (None, 1, 1, 64) 0 activation_25[0][0]
__________________________________________________________________________________________________
flatten_1 (Flatten) (None, 64) 0 max_pooling2d_6[0][0]
__________________________________________________________________________________________________
dense_1 (Dense) (None, 256) 16640 flatten_1[0][0]
__________________________________________________________________________________________________
dropout_1 (Dropout) (None, 256) 0 dense_1[0][0]
__________________________________________________________________________________________________
dense_2 (Dense) (None, 2) 514 dropout_1[0][0]
==================================================================================================
Total params: 632,098
Trainable params: 629,538
Non-trainable params: 2,560
__________________________________________________________________________________________________
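To choose a cut-off point, it is handy to read the layer names programmatically instead of scanning the printed table, since those names are exactly the strings accepted by get_layer(). A minimal sketch using a small stand-in network (in the real code you would iterate over the base_model loaded from 'model_resenet.h5'; the layer sizes here are illustrative assumptions):

```python
from keras.models import Model
from keras.layers import Input, Conv2D, MaxPooling2D, Flatten, Dense

# Small stand-in network; in the real code this would be the loaded base_model
inp = Input(shape=(8, 8, 1))
x = Conv2D(4, (3, 3), name='conv2d_1')(inp)
x = MaxPooling2D(name='max_pooling2d_1')(x)
x = Flatten(name='flatten_1')(x)
out = Dense(2, name='dense_1')(x)
model = Model(inputs=inp, outputs=out)

# Print every layer's name -- these are the strings you pass to get_layer()
for layer in model.layers:
    print(layer.name)
```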
Removing the model's fully connected layers
from keras.models import load_model, Model

base_model = load_model('model_resenet.h5')
resnet_model = Model(inputs=base_model.input,
                     outputs=base_model.get_layer('max_pooling2d_6').output)
# 'max_pooling2d_6' is the layer immediately before the fully connected part of the
# network above. You can cut at any other layer as well: just substitute that layer's
# name for 'max_pooling2d_6'. This simply truncates the network; printing the
# structure beforehand makes it easy to read off each layer's name.
resnet_model.summary()

The structure of the newly created network:
__________________________________________________________________________________________________
Layer (type) Output Shape Param # Connected to
==================================================================================================
input_1 (InputLayer) (None, 227, 227, 1) 0
__________________________________________________________________________________________________
conv2d_1 (Conv2D) (None, 225, 225, 32) 320 input_1[0][0]
__________________________________________________________________________________________________
batch_normalization_1 (BatchNor (None, 225, 225, 32) 128 conv2d_1[0][0]
__________________________________________________________________________________________________
activation_1 (Activation) (None, 225, 225, 32) 0 batch_normalization_1[0][0]
__________________________________________________________________________________________________
conv2d_2 (Conv2D) (None, 225, 225, 32) 9248 activation_1[0][0]
__________________________________________________________________________________________________
batch_normalization_2 (BatchNor (None, 225, 225, 32) 128 conv2d_2[0][0]
__________________________________________________________________________________________________
activation_2 (Activation) (None, 225, 225, 32) 0 batch_normalization_2[0][0]
__________________________________________________________________________________________________
conv2d_3 (Conv2D) (None, 225, 225, 32) 9248 activation_2[0][0]
__________________________________________________________________________________________________
batch_normalization_3 (BatchNor (None, 225, 225, 32) 128 conv2d_3[0][0]
__________________________________________________________________________________________________
merge_1 (Merge) (None, 225, 225, 32) 0 batch_normalization_3[0][0] activation_1[0][0]
__________________________________________________________________________________________________
activation_3 (Activation) (None, 225, 225, 32) 0 merge_1[0][0]
__________________________________________________________________________________________________
conv2d_4 (Conv2D) (None, 225, 225, 32) 9248 activation_3[0][0]
__________________________________________________________________________________________________
batch_normalization_4 (BatchNor (None, 225, 225, 32) 128 conv2d_4[0][0]
__________________________________________________________________________________________________
activation_4 (Activation) (None, 225, 225, 32) 0 batch_normalization_4[0][0]
__________________________________________________________________________________________________
conv2d_5 (Conv2D) (None, 225, 225, 32) 9248 activation_4[0][0]
__________________________________________________________________________________________________
batch_normalization_5 (BatchNor (None, 225, 225, 32) 128 conv2d_5[0][0]
__________________________________________________________________________________________________
merge_2 (Merge) (None, 225, 225, 32) 0 batch_normalization_5[0][0] activation_3[0][0]
__________________________________________________________________________________________________
activation_5 (Activation) (None, 225, 225, 32) 0 merge_2[0][0]
__________________________________________________________________________________________________
max_pooling2d_1 (MaxPooling2D) (None, 112, 112, 32) 0 activation_5[0][0]
__________________________________________________________________________________________________
conv2d_6 (Conv2D) (None, 110, 110, 64) 18496 max_pooling2d_1[0][0]
__________________________________________________________________________________________________
batch_normalization_6 (BatchNor (None, 110, 110, 64) 256 conv2d_6[0][0]
__________________________________________________________________________________________________
activation_6 (Activation) (None, 110, 110, 64) 0 batch_normalization_6[0][0]
__________________________________________________________________________________________________
conv2d_7 (Conv2D) (None, 110, 110, 64) 36928 activation_6[0][0]
__________________________________________________________________________________________________
batch_normalization_7 (BatchNor (None, 110, 110, 64) 256 conv2d_7[0][0]
__________________________________________________________________________________________________
activation_7 (Activation) (None, 110, 110, 64) 0 batch_normalization_7[0][0]
__________________________________________________________________________________________________
conv2d_8 (Conv2D) (None, 110, 110, 64) 36928 activation_7[0][0]
__________________________________________________________________________________________________
batch_normalization_8 (BatchNor (None, 110, 110, 64) 256 conv2d_8[0][0]
__________________________________________________________________________________________________
merge_3 (Merge) (None, 110, 110, 64) 0 batch_normalization_8[0][0] activation_6[0][0]
__________________________________________________________________________________________________
activation_8 (Activation) (None, 110, 110, 64) 0 merge_3[0][0]
__________________________________________________________________________________________________
conv2d_9 (Conv2D) (None, 110, 110, 64) 36928 activation_8[0][0]
__________________________________________________________________________________________________
batch_normalization_9 (BatchNor (None, 110, 110, 64) 256 conv2d_9[0][0]
__________________________________________________________________________________________________
activation_9 (Activation) (None, 110, 110, 64) 0 batch_normalization_9[0][0]
__________________________________________________________________________________________________
conv2d_10 (Conv2D) (None, 110, 110, 64) 36928 activation_9[0][0]
__________________________________________________________________________________________________
batch_normalization_10 (BatchNo (None, 110, 110, 64) 256 conv2d_10[0][0]
__________________________________________________________________________________________________
merge_4 (Merge) (None, 110, 110, 64) 0 batch_normalization_10[0][0] activation_8[0][0]
__________________________________________________________________________________________________
activation_10 (Activation) (None, 110, 110, 64) 0 merge_4[0][0]
__________________________________________________________________________________________________
max_pooling2d_2 (MaxPooling2D) (None, 55, 55, 64) 0 activation_10[0][0]
__________________________________________________________________________________________________
conv2d_11 (Conv2D) (None, 53, 53, 64) 36928 max_pooling2d_2[0][0]
__________________________________________________________________________________________________
batch_normalization_11 (BatchNo (None, 53, 53, 64) 256 conv2d_11[0][0]
__________________________________________________________________________________________________
activation_11 (Activation) (None, 53, 53, 64) 0 batch_normalization_11[0][0]
__________________________________________________________________________________________________
max_pooling2d_3 (MaxPooling2D) (None, 26, 26, 64) 0 activation_11[0][0]
__________________________________________________________________________________________________
conv2d_12 (Conv2D) (None, 26, 26, 64) 36928 max_pooling2d_3[0][0]
__________________________________________________________________________________________________
batch_normalization_12 (BatchNo (None, 26, 26, 64) 256 conv2d_12[0][0]
__________________________________________________________________________________________________
activation_12 (Activation) (None, 26, 26, 64) 0 batch_normalization_12[0][0]
__________________________________________________________________________________________________
conv2d_13 (Conv2D) (None, 26, 26, 64) 36928 activation_12[0][0]
__________________________________________________________________________________________________
batch_normalization_13 (BatchNo (None, 26, 26, 64) 256 conv2d_13[0][0]
__________________________________________________________________________________________________
merge_5 (Merge) (None, 26, 26, 64) 0 batch_normalization_13[0][0] max_pooling2d_3[0][0]
__________________________________________________________________________________________________
activation_13 (Activation) (None, 26, 26, 64) 0 merge_5[0][0]
__________________________________________________________________________________________________
conv2d_14 (Conv2D) (None, 26, 26, 64) 36928 activation_13[0][0]
__________________________________________________________________________________________________
batch_normalization_14 (BatchNo (None, 26, 26, 64) 256 conv2d_14[0][0]
__________________________________________________________________________________________________
activation_14 (Activation) (None, 26, 26, 64) 0 batch_normalization_14[0][0]
__________________________________________________________________________________________________
conv2d_15 (Conv2D) (None, 26, 26, 64) 36928 activation_14[0][0]
__________________________________________________________________________________________________
batch_normalization_15 (BatchNo (None, 26, 26, 64) 256 conv2d_15[0][0]
__________________________________________________________________________________________________
merge_6 (Merge) (None, 26, 26, 64) 0 batch_normalization_15[0][0] activation_13[0][0]
__________________________________________________________________________________________________
activation_15 (Activation) (None, 26, 26, 64) 0 merge_6[0][0]
__________________________________________________________________________________________________
max_pooling2d_4 (MaxPooling2D) (None, 13, 13, 64) 0 activation_15[0][0]
__________________________________________________________________________________________________
conv2d_16 (Conv2D) (None, 11, 11, 32) 18464 max_pooling2d_4[0][0]
__________________________________________________________________________________________________
batch_normalization_16 (BatchNo (None, 11, 11, 32) 128 conv2d_16[0][0]
__________________________________________________________________________________________________
activation_16 (Activation) (None, 11, 11, 32) 0 batch_normalization_16[0][0]
__________________________________________________________________________________________________
conv2d_17 (Conv2D) (None, 11, 11, 32) 9248 activation_16[0][0]
__________________________________________________________________________________________________
batch_normalization_17 (BatchNo (None, 11, 11, 32) 128 conv2d_17[0][0]
__________________________________________________________________________________________________
activation_17 (Activation) (None, 11, 11, 32) 0 batch_normalization_17[0][0]
__________________________________________________________________________________________________
conv2d_18 (Conv2D) (None, 11, 11, 32) 9248 activation_17[0][0]
__________________________________________________________________________________________________
batch_normalization_18 (BatchNo (None, 11, 11, 32) 128 conv2d_18[0][0]
__________________________________________________________________________________________________
merge_7 (Merge) (None, 11, 11, 32) 0 batch_normalization_18[0][0] activation_16[0][0]
__________________________________________________________________________________________________
activation_18 (Activation) (None, 11, 11, 32) 0 merge_7[0][0]
__________________________________________________________________________________________________
conv2d_19 (Conv2D) (None, 11, 11, 32) 9248 activation_18[0][0]
__________________________________________________________________________________________________
batch_normalization_19 (BatchNo (None, 11, 11, 32) 128 conv2d_19[0][0]
__________________________________________________________________________________________________
activation_19 (Activation) (None, 11, 11, 32) 0 batch_normalization_19[0][0]
__________________________________________________________________________________________________
conv2d_20 (Conv2D) (None, 11, 11, 32) 9248 activation_19[0][0]
__________________________________________________________________________________________________
batch_normalization_20 (BatchNo (None, 11, 11, 32) 128 conv2d_20[0][0]
__________________________________________________________________________________________________
merge_8 (Merge) (None, 11, 11, 32) 0 batch_normalization_20[0][0] activation_18[0][0]
__________________________________________________________________________________________________
activation_20 (Activation) (None, 11, 11, 32) 0 merge_8[0][0]
__________________________________________________________________________________________________
max_pooling2d_5 (MaxPooling2D) (None, 5, 5, 32) 0 activation_20[0][0]
__________________________________________________________________________________________________
conv2d_21 (Conv2D) (None, 3, 3, 64) 18496 max_pooling2d_5[0][0]
__________________________________________________________________________________________________
batch_normalization_21 (BatchNo (None, 3, 3, 64) 256 conv2d_21[0][0]
__________________________________________________________________________________________________
activation_21 (Activation) (None, 3, 3, 64) 0 batch_normalization_21[0][0]
__________________________________________________________________________________________________
conv2d_22 (Conv2D) (None, 3, 3, 64) 36928 activation_21[0][0]
__________________________________________________________________________________________________
batch_normalization_22 (BatchNo (None, 3, 3, 64) 256 conv2d_22[0][0]
__________________________________________________________________________________________________
activation_22 (Activation) (None, 3, 3, 64) 0 batch_normalization_22[0][0]
__________________________________________________________________________________________________
conv2d_23 (Conv2D) (None, 3, 3, 64) 36928 activation_22[0][0]
__________________________________________________________________________________________________
batch_normalization_23 (BatchNo (None, 3, 3, 64) 256 conv2d_23[0][0]
__________________________________________________________________________________________________
merge_9 (Merge) (None, 3, 3, 64) 0 batch_normalization_23[0][0] activation_21[0][0]
__________________________________________________________________________________________________
activation_23 (Activation) (None, 3, 3, 64) 0 merge_9[0][0]
__________________________________________________________________________________________________
conv2d_24 (Conv2D) (None, 3, 3, 64) 36928 activation_23[0][0]
__________________________________________________________________________________________________
batch_normalization_24 (BatchNo (None, 3, 3, 64) 256 conv2d_24[0][0]
__________________________________________________________________________________________________
activation_24 (Activation) (None, 3, 3, 64) 0 batch_normalization_24[0][0]
__________________________________________________________________________________________________
conv2d_25 (Conv2D) (None, 3, 3, 64) 36928 activation_24[0][0]
__________________________________________________________________________________________________
batch_normalization_25 (BatchNo (None, 3, 3, 64) 256 conv2d_25[0][0]
__________________________________________________________________________________________________
merge_10 (Merge) (None, 3, 3, 64) 0 batch_normalization_25[0][0] activation_23[0][0]
__________________________________________________________________________________________________
activation_25 (Activation) (None, 3, 3, 64) 0 merge_10[0][0]
__________________________________________________________________________________________________
max_pooling2d_6 (MaxPooling2D) (None, 1, 1, 64) 0 activation_25[0][0]
==================================================================================================
Total params: 614,944
Trainable params: 612,384
Non-trainable params: 2,560
__________________________________________________________________________________________________
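With the fully connected layers cut off, the truncated model acts as a feature extractor: predict() now returns the pooled feature map (shape (1, 1, 64) for the network above) rather than the two class scores. A minimal sketch using a small stand-in network (the real code would load 'model_resenet.h5'; the layer sizes and the random input image are illustrative assumptions):

```python
import numpy as np
from keras.models import Model
from keras.layers import Input, Conv2D, MaxPooling2D, Flatten, Dense

# Stand-in for the trained network; the real code uses
# base_model = load_model('model_resenet.h5')
inp = Input(shape=(227, 227, 1))
x = Conv2D(4, (3, 3), name='conv2d_1')(inp)
x = MaxPooling2D(pool_size=(5, 5), name='max_pooling2d_1')(x)
x = Flatten()(x)
out = Dense(2)(x)
base_model = Model(inputs=inp, outputs=out)

# Truncate at the pooling layer, exactly as in the snippet above
feat_model = Model(inputs=base_model.input,
                   outputs=base_model.get_layer('max_pooling2d_1').output)

# A dummy grayscale image batch: (batch, height, width, channels)
img = np.random.rand(1, 227, 227, 1).astype('float32')
features = feat_model.predict(img)
print(features.shape)  # (1, 45, 45, 4) for this stand-in: a feature map, not class scores
```

The extracted features can then be fed to another classifier (e.g. an SVM) or used for transfer learning.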