Processing the Asian giant hornet data from 2021 MCM Problem C with the Deep Learning Toolbox in MATLAB R2021b
Image recognition
Transfer learning with MobileNet-v2
main1.m

%% Prepare the workspace
clear;clc;
%% Import the data
params = load("C:\Users\28605\Desktop\美赛预备\机器学习\大黄蜂识别\params_2022_01_12__19_12_54.mat");
digitDatasetPath = fullfile('C:\Users\28605\Desktop\美赛预备\机器学习\大黄蜂识别\dataset\');
imds = imageDatastore(digitDatasetPath, ...
    'IncludeSubfolders',true, ...
    'LabelSource','foldernames');
% Number of images per class in the dataset
countEachLabel(imds)
numTrainFiles = 0.8;    % use 80% of each class for training
[imdsTrain,imdsValidation] = splitEachLabel(imds,numTrainFiles,'randomize');
% Check the image size
img = readimage(imds,1);
size(img)
%% Define the CNN network architecture
lgraph = layerGraph();
tempLayers = [
    imageInputLayer([254 254 3],"Name","imageinput")
    convolution2dLayer([3 3],32,"Name","Conv1","Padding","same","Stride",[2 2],"Bias",params.Conv1.Bias,"Weights",params.Conv1.Weights)
    batchNormalizationLayer("Name","bn_Conv1","Epsilon",0.001,"Offset",params.bn_Conv1.Offset,"Scale",params.bn_Conv1.Scale,"TrainedMean",params.bn_Conv1.TrainedMean,"TrainedVariance",params.bn_Conv1.TrainedVariance)
    clippedReluLayer(6,"Name","Conv1_relu")
    groupedConvolution2dLayer([3 3],1,32,"Name","expanded_conv_depthwise","Padding","same","Bias",params.expanded_conv_depthwise.Bias,"Weights",params.expanded_conv_depthwise.Weights)
    batchNormalizationLayer("Name","expanded_conv_depthwise_BN","Epsilon",0.001,"Offset",params.expanded_conv_depthwise_BN.Offset,"Scale",params.expanded_conv_depthwise_BN.Scale,"TrainedMean",params.expanded_conv_depthwise_BN.TrainedMean,"TrainedVariance",params.expanded_conv_depthwise_BN.TrainedVariance)
    clippedReluLayer(6,"Name","expanded_conv_depthwise_relu")
    convolution2dLayer([1 1],16,"Name","expanded_conv_project","Padding","same","Bias",params.expanded_conv_project.Bias,"Weights",params.expanded_conv_project.Weights)
    batchNormalizationLayer("Name","expanded_conv_project_BN","Epsilon",0.001,"Offset",params.expanded_conv_project_BN.Offset,"Scale",params.expanded_conv_project_BN.Scale,"TrainedMean",params.expanded_conv_project_BN.TrainedMean,"TrainedVariance",params.expanded_conv_project_BN.TrainedVariance)
    convolution2dLayer([1 1],96,"Name","block_1_expand","Padding","same","Bias",params.block_1_expand.Bias,"Weights",params.block_1_expand.Weights)
    batchNormalizationLayer("Name","block_1_expand_BN","Epsilon",0.001,"Offset",params.block_1_expand_BN.Offset,"Scale",params.block_1_expand_BN.Scale,"TrainedMean",params.block_1_expand_BN.TrainedMean,"TrainedVariance",params.block_1_expand_BN.TrainedVariance)
    clippedReluLayer(6,"Name","block_1_expand_relu")
    groupedConvolution2dLayer([3 3],1,96,"Name","block_1_depthwise","Padding","same","Stride",[2 2],"Bias",params.block_1_depthwise.Bias,"Weights",params.block_1_depthwise.Weights)
    batchNormalizationLayer("Name","block_1_depthwise_BN","Epsilon",0.001,"Offset",params.block_1_depthwise_BN.Offset,"Scale",params.block_1_depthwise_BN.Scale,"TrainedMean",params.block_1_depthwise_BN.TrainedMean,"TrainedVariance",params.block_1_depthwise_BN.TrainedVariance)
    clippedReluLayer(6,"Name","block_1_depthwise_relu")
    convolution2dLayer([1 1],24,"Name","block_1_project","Padding","same","Bias",params.block_1_project.Bias,"Weights",params.block_1_project.Weights)
    batchNormalizationLayer("Name","block_1_project_BN","Epsilon",0.001,"Offset",params.block_1_project_BN.Offset,"Scale",params.block_1_project_BN.Scale,"TrainedMean",params.block_1_project_BN.TrainedMean,"TrainedVariance",params.block_1_project_BN.TrainedVariance)];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    convolution2dLayer([1 1],144,"Name","block_2_expand","Padding","same","Bias",params.block_2_expand.Bias,"Weights",params.block_2_expand.Weights)
    batchNormalizationLayer("Name","block_2_expand_BN","Epsilon",0.001,"Offset",params.block_2_expand_BN.Offset,"Scale",params.block_2_expand_BN.Scale,"TrainedMean",params.block_2_expand_BN.TrainedMean,"TrainedVariance",params.block_2_expand_BN.TrainedVariance)
    clippedReluLayer(6,"Name","block_2_expand_relu")
    groupedConvolution2dLayer([3 3],1,144,"Name","block_2_depthwise","Padding","same","Bias",params.block_2_depthwise.Bias,"Weights",params.block_2_depthwise.Weights)
    batchNormalizationLayer("Name","block_2_depthwise_BN","Epsilon",0.001,"Offset",params.block_2_depthwise_BN.Offset,"Scale",params.block_2_depthwise_BN.Scale,"TrainedMean",params.block_2_depthwise_BN.TrainedMean,"TrainedVariance",params.block_2_depthwise_BN.TrainedVariance)
    clippedReluLayer(6,"Name","block_2_depthwise_relu")
    convolution2dLayer([1 1],24,"Name","block_2_project","Padding","same","Bias",params.block_2_project.Bias,"Weights",params.block_2_project.Weights)
    batchNormalizationLayer("Name","block_2_project_BN","Epsilon",0.001,"Offset",params.block_2_project_BN.Offset,"Scale",params.block_2_project_BN.Scale,"TrainedMean",params.block_2_project_BN.TrainedMean,"TrainedVariance",params.block_2_project_BN.TrainedVariance)];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    additionLayer(2,"Name","block_2_add")
    convolution2dLayer([1 1],144,"Name","block_3_expand","Padding","same","Bias",params.block_3_expand.Bias,"Weights",params.block_3_expand.Weights)
    batchNormalizationLayer("Name","block_3_expand_BN","Epsilon",0.001,"Offset",params.block_3_expand_BN.Offset,"Scale",params.block_3_expand_BN.Scale,"TrainedMean",params.block_3_expand_BN.TrainedMean,"TrainedVariance",params.block_3_expand_BN.TrainedVariance)
    clippedReluLayer(6,"Name","block_3_expand_relu")
    groupedConvolution2dLayer([3 3],1,144,"Name","block_3_depthwise","Padding","same","Stride",[2 2],"Bias",params.block_3_depthwise.Bias,"Weights",params.block_3_depthwise.Weights)
    batchNormalizationLayer("Name","block_3_depthwise_BN","Epsilon",0.001,"Offset",params.block_3_depthwise_BN.Offset,"Scale",params.block_3_depthwise_BN.Scale,"TrainedMean",params.block_3_depthwise_BN.TrainedMean,"TrainedVariance",params.block_3_depthwise_BN.TrainedVariance)
    clippedReluLayer(6,"Name","block_3_depthwise_relu")
    convolution2dLayer([1 1],32,"Name","block_3_project","Padding","same","Bias",params.block_3_project.Bias,"Weights",params.block_3_project.Weights)
    batchNormalizationLayer("Name","block_3_project_BN","Epsilon",0.001,"Offset",params.block_3_project_BN.Offset,"Scale",params.block_3_project_BN.Scale,"TrainedMean",params.block_3_project_BN.TrainedMean,"TrainedVariance",params.block_3_project_BN.TrainedVariance)];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    convolution2dLayer([1 1],192,"Name","block_4_expand","Padding","same","Bias",params.block_4_expand.Bias,"Weights",params.block_4_expand.Weights)
    batchNormalizationLayer("Name","block_4_expand_BN","Epsilon",0.001,"Offset",params.block_4_expand_BN.Offset,"Scale",params.block_4_expand_BN.Scale,"TrainedMean",params.block_4_expand_BN.TrainedMean,"TrainedVariance",params.block_4_expand_BN.TrainedVariance)
    clippedReluLayer(6,"Name","block_4_expand_relu")
    groupedConvolution2dLayer([3 3],1,192,"Name","block_4_depthwise","Padding","same","Bias",params.block_4_depthwise.Bias,"Weights",params.block_4_depthwise.Weights)
    batchNormalizationLayer("Name","block_4_depthwise_BN","Epsilon",0.001,"Offset",params.block_4_depthwise_BN.Offset,"Scale",params.block_4_depthwise_BN.Scale,"TrainedMean",params.block_4_depthwise_BN.TrainedMean,"TrainedVariance",params.block_4_depthwise_BN.TrainedVariance)
    clippedReluLayer(6,"Name","block_4_depthwise_relu")
    convolution2dLayer([1 1],32,"Name","block_4_project","Padding","same","Bias",params.block_4_project.Bias,"Weights",params.block_4_project.Weights)
    batchNormalizationLayer("Name","block_4_project_BN","Epsilon",0.001,"Offset",params.block_4_project_BN.Offset,"Scale",params.block_4_project_BN.Scale,"TrainedMean",params.block_4_project_BN.TrainedMean,"TrainedVariance",params.block_4_project_BN.TrainedVariance)];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = additionLayer(2,"Name","block_4_add");
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    convolution2dLayer([1 1],192,"Name","block_5_expand","Padding","same","Bias",params.block_5_expand.Bias,"Weights",params.block_5_expand.Weights)
    batchNormalizationLayer("Name","block_5_expand_BN","Epsilon",0.001,"Offset",params.block_5_expand_BN.Offset,"Scale",params.block_5_expand_BN.Scale,"TrainedMean",params.block_5_expand_BN.TrainedMean,"TrainedVariance",params.block_5_expand_BN.TrainedVariance)
    clippedReluLayer(6,"Name","block_5_expand_relu")
    groupedConvolution2dLayer([3 3],1,192,"Name","block_5_depthwise","Padding","same","Bias",params.block_5_depthwise.Bias,"Weights",params.block_5_depthwise.Weights)
    batchNormalizationLayer("Name","block_5_depthwise_BN","Epsilon",0.001,"Offset",params.block_5_depthwise_BN.Offset,"Scale",params.block_5_depthwise_BN.Scale,"TrainedMean",params.block_5_depthwise_BN.TrainedMean,"TrainedVariance",params.block_5_depthwise_BN.TrainedVariance)
    clippedReluLayer(6,"Name","block_5_depthwise_relu")
    convolution2dLayer([1 1],32,"Name","block_5_project","Padding","same","Bias",params.block_5_project.Bias,"Weights",params.block_5_project.Weights)
    batchNormalizationLayer("Name","block_5_project_BN","Epsilon",0.001,"Offset",params.block_5_project_BN.Offset,"Scale",params.block_5_project_BN.Scale,"TrainedMean",params.block_5_project_BN.TrainedMean,"TrainedVariance",params.block_5_project_BN.TrainedVariance)];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    additionLayer(2,"Name","block_5_add")
    convolution2dLayer([1 1],192,"Name","block_6_expand","Padding","same","Bias",params.block_6_expand.Bias,"Weights",params.block_6_expand.Weights)
    batchNormalizationLayer("Name","block_6_expand_BN","Epsilon",0.001,"Offset",params.block_6_expand_BN.Offset,"Scale",params.block_6_expand_BN.Scale,"TrainedMean",params.block_6_expand_BN.TrainedMean,"TrainedVariance",params.block_6_expand_BN.TrainedVariance)
    clippedReluLayer(6,"Name","block_6_expand_relu")
    groupedConvolution2dLayer([3 3],1,192,"Name","block_6_depthwise","Padding","same","Stride",[2 2],"Bias",params.block_6_depthwise.Bias,"Weights",params.block_6_depthwise.Weights)
    batchNormalizationLayer("Name","block_6_depthwise_BN","Epsilon",0.001,"Offset",params.block_6_depthwise_BN.Offset,"Scale",params.block_6_depthwise_BN.Scale,"TrainedMean",params.block_6_depthwise_BN.TrainedMean,"TrainedVariance",params.block_6_depthwise_BN.TrainedVariance)
    clippedReluLayer(6,"Name","block_6_depthwise_relu")
    convolution2dLayer([1 1],64,"Name","block_6_project","Padding","same","Bias",params.block_6_project.Bias,"Weights",params.block_6_project.Weights)
    batchNormalizationLayer("Name","block_6_project_BN","Epsilon",0.001,"Offset",params.block_6_project_BN.Offset,"Scale",params.block_6_project_BN.Scale,"TrainedMean",params.block_6_project_BN.TrainedMean,"TrainedVariance",params.block_6_project_BN.TrainedVariance)];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    convolution2dLayer([1 1],384,"Name","block_7_expand","Padding","same","Bias",params.block_7_expand.Bias,"Weights",params.block_7_expand.Weights)
    batchNormalizationLayer("Name","block_7_expand_BN","Epsilon",0.001,"Offset",params.block_7_expand_BN.Offset,"Scale",params.block_7_expand_BN.Scale,"TrainedMean",params.block_7_expand_BN.TrainedMean,"TrainedVariance",params.block_7_expand_BN.TrainedVariance)
    clippedReluLayer(6,"Name","block_7_expand_relu")
    groupedConvolution2dLayer([3 3],1,384,"Name","block_7_depthwise","Padding","same","Bias",params.block_7_depthwise.Bias,"Weights",params.block_7_depthwise.Weights)
    batchNormalizationLayer("Name","block_7_depthwise_BN","Epsilon",0.001,"Offset",params.block_7_depthwise_BN.Offset,"Scale",params.block_7_depthwise_BN.Scale,"TrainedMean",params.block_7_depthwise_BN.TrainedMean,"TrainedVariance",params.block_7_depthwise_BN.TrainedVariance)
    clippedReluLayer(6,"Name","block_7_depthwise_relu")
    convolution2dLayer([1 1],64,"Name","block_7_project","Padding","same","Bias",params.block_7_project.Bias,"Weights",params.block_7_project.Weights)
    batchNormalizationLayer("Name","block_7_project_BN","Epsilon",0.001,"Offset",params.block_7_project_BN.Offset,"Scale",params.block_7_project_BN.Scale,"TrainedMean",params.block_7_project_BN.TrainedMean,"TrainedVariance",params.block_7_project_BN.TrainedVariance)];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = additionLayer(2,"Name","block_7_add");
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    convolution2dLayer([1 1],384,"Name","block_8_expand","Padding","same","Bias",params.block_8_expand.Bias,"Weights",params.block_8_expand.Weights)
    batchNormalizationLayer("Name","block_8_expand_BN","Epsilon",0.001,"Offset",params.block_8_expand_BN.Offset,"Scale",params.block_8_expand_BN.Scale,"TrainedMean",params.block_8_expand_BN.TrainedMean,"TrainedVariance",params.block_8_expand_BN.TrainedVariance)
    clippedReluLayer(6,"Name","block_8_expand_relu")
    groupedConvolution2dLayer([3 3],1,384,"Name","block_8_depthwise","Padding","same","Bias",params.block_8_depthwise.Bias,"Weights",params.block_8_depthwise.Weights)
    batchNormalizationLayer("Name","block_8_depthwise_BN","Epsilon",0.001,"Offset",params.block_8_depthwise_BN.Offset,"Scale",params.block_8_depthwise_BN.Scale,"TrainedMean",params.block_8_depthwise_BN.TrainedMean,"TrainedVariance",params.block_8_depthwise_BN.TrainedVariance)
    clippedReluLayer(6,"Name","block_8_depthwise_relu")
    convolution2dLayer([1 1],64,"Name","block_8_project","Padding","same","Bias",params.block_8_project.Bias,"Weights",params.block_8_project.Weights)
    batchNormalizationLayer("Name","block_8_project_BN","Epsilon",0.001,"Offset",params.block_8_project_BN.Offset,"Scale",params.block_8_project_BN.Scale,"TrainedMean",params.block_8_project_BN.TrainedMean,"TrainedVariance",params.block_8_project_BN.TrainedVariance)];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = additionLayer(2,"Name","block_8_add");
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    convolution2dLayer([1 1],384,"Name","block_9_expand","Padding","same","Bias",params.block_9_expand.Bias,"Weights",params.block_9_expand.Weights)
    batchNormalizationLayer("Name","block_9_expand_BN","Epsilon",0.001,"Offset",params.block_9_expand_BN.Offset,"Scale",params.block_9_expand_BN.Scale,"TrainedMean",params.block_9_expand_BN.TrainedMean,"TrainedVariance",params.block_9_expand_BN.TrainedVariance)
    clippedReluLayer(6,"Name","block_9_expand_relu")
    groupedConvolution2dLayer([3 3],1,384,"Name","block_9_depthwise","Padding","same","Bias",params.block_9_depthwise.Bias,"Weights",params.block_9_depthwise.Weights)
    batchNormalizationLayer("Name","block_9_depthwise_BN","Epsilon",0.001,"Offset",params.block_9_depthwise_BN.Offset,"Scale",params.block_9_depthwise_BN.Scale,"TrainedMean",params.block_9_depthwise_BN.TrainedMean,"TrainedVariance",params.block_9_depthwise_BN.TrainedVariance)
    clippedReluLayer(6,"Name","block_9_depthwise_relu")
    convolution2dLayer([1 1],64,"Name","block_9_project","Padding","same","Bias",params.block_9_project.Bias,"Weights",params.block_9_project.Weights)
    batchNormalizationLayer("Name","block_9_project_BN","Epsilon",0.001,"Offset",params.block_9_project_BN.Offset,"Scale",params.block_9_project_BN.Scale,"TrainedMean",params.block_9_project_BN.TrainedMean,"TrainedVariance",params.block_9_project_BN.TrainedVariance)];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    additionLayer(2,"Name","block_9_add")
    convolution2dLayer([1 1],384,"Name","block_10_expand","Padding","same","Bias",params.block_10_expand.Bias,"Weights",params.block_10_expand.Weights)
    batchNormalizationLayer("Name","block_10_expand_BN","Epsilon",0.001,"Offset",params.block_10_expand_BN.Offset,"Scale",params.block_10_expand_BN.Scale,"TrainedMean",params.block_10_expand_BN.TrainedMean,"TrainedVariance",params.block_10_expand_BN.TrainedVariance)
    clippedReluLayer(6,"Name","block_10_expand_relu")
    groupedConvolution2dLayer([3 3],1,384,"Name","block_10_depthwise","Padding","same","Bias",params.block_10_depthwise.Bias,"Weights",params.block_10_depthwise.Weights)
    batchNormalizationLayer("Name","block_10_depthwise_BN","Epsilon",0.001,"Offset",params.block_10_depthwise_BN.Offset,"Scale",params.block_10_depthwise_BN.Scale,"TrainedMean",params.block_10_depthwise_BN.TrainedMean,"TrainedVariance",params.block_10_depthwise_BN.TrainedVariance)
    clippedReluLayer(6,"Name","block_10_depthwise_relu")
    convolution2dLayer([1 1],96,"Name","block_10_project","Padding","same","Bias",params.block_10_project.Bias,"Weights",params.block_10_project.Weights)
    batchNormalizationLayer("Name","block_10_project_BN","Epsilon",0.001,"Offset",params.block_10_project_BN.Offset,"Scale",params.block_10_project_BN.Scale,"TrainedMean",params.block_10_project_BN.TrainedMean,"TrainedVariance",params.block_10_project_BN.TrainedVariance)];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    convolution2dLayer([1 1],576,"Name","block_11_expand","Padding","same","Bias",params.block_11_expand.Bias,"Weights",params.block_11_expand.Weights)
    batchNormalizationLayer("Name","block_11_expand_BN","Epsilon",0.001,"Offset",params.block_11_expand_BN.Offset,"Scale",params.block_11_expand_BN.Scale,"TrainedMean",params.block_11_expand_BN.TrainedMean,"TrainedVariance",params.block_11_expand_BN.TrainedVariance)
    clippedReluLayer(6,"Name","block_11_expand_relu")
    groupedConvolution2dLayer([3 3],1,576,"Name","block_11_depthwise","Padding","same","Bias",params.block_11_depthwise.Bias,"Weights",params.block_11_depthwise.Weights)
    batchNormalizationLayer("Name","block_11_depthwise_BN","Epsilon",0.001,"Offset",params.block_11_depthwise_BN.Offset,"Scale",params.block_11_depthwise_BN.Scale,"TrainedMean",params.block_11_depthwise_BN.TrainedMean,"TrainedVariance",params.block_11_depthwise_BN.TrainedVariance)
    clippedReluLayer(6,"Name","block_11_depthwise_relu")
    convolution2dLayer([1 1],96,"Name","block_11_project","Padding","same","Bias",params.block_11_project.Bias,"Weights",params.block_11_project.Weights)
    batchNormalizationLayer("Name","block_11_project_BN","Epsilon",0.001,"Offset",params.block_11_project_BN.Offset,"Scale",params.block_11_project_BN.Scale,"TrainedMean",params.block_11_project_BN.TrainedMean,"TrainedVariance",params.block_11_project_BN.TrainedVariance)];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = additionLayer(2,"Name","block_11_add");
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    convolution2dLayer([1 1],576,"Name","block_12_expand","Padding","same","Bias",params.block_12_expand.Bias,"Weights",params.block_12_expand.Weights)
    batchNormalizationLayer("Name","block_12_expand_BN","Epsilon",0.001,"Offset",params.block_12_expand_BN.Offset,"Scale",params.block_12_expand_BN.Scale,"TrainedMean",params.block_12_expand_BN.TrainedMean,"TrainedVariance",params.block_12_expand_BN.TrainedVariance)
    clippedReluLayer(6,"Name","block_12_expand_relu")
    groupedConvolution2dLayer([3 3],1,576,"Name","block_12_depthwise","Padding","same","Bias",params.block_12_depthwise.Bias,"Weights",params.block_12_depthwise.Weights)
    batchNormalizationLayer("Name","block_12_depthwise_BN","Epsilon",0.001,"Offset",params.block_12_depthwise_BN.Offset,"Scale",params.block_12_depthwise_BN.Scale,"TrainedMean",params.block_12_depthwise_BN.TrainedMean,"TrainedVariance",params.block_12_depthwise_BN.TrainedVariance)
    clippedReluLayer(6,"Name","block_12_depthwise_relu")
    convolution2dLayer([1 1],96,"Name","block_12_project","Padding","same","Bias",params.block_12_project.Bias,"Weights",params.block_12_project.Weights)
    batchNormalizationLayer("Name","block_12_project_BN","Epsilon",0.001,"Offset",params.block_12_project_BN.Offset,"Scale",params.block_12_project_BN.Scale,"TrainedMean",params.block_12_project_BN.TrainedMean,"TrainedVariance",params.block_12_project_BN.TrainedVariance)];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    additionLayer(2,"Name","block_12_add")
    convolution2dLayer([1 1],576,"Name","block_13_expand","Padding","same","Bias",params.block_13_expand.Bias,"Weights",params.block_13_expand.Weights)
    batchNormalizationLayer("Name","block_13_expand_BN","Epsilon",0.001,"Offset",params.block_13_expand_BN.Offset,"Scale",params.block_13_expand_BN.Scale,"TrainedMean",params.block_13_expand_BN.TrainedMean,"TrainedVariance",params.block_13_expand_BN.TrainedVariance)
    clippedReluLayer(6,"Name","block_13_expand_relu")
    groupedConvolution2dLayer([3 3],1,576,"Name","block_13_depthwise","Padding","same","Stride",[2 2],"Bias",params.block_13_depthwise.Bias,"Weights",params.block_13_depthwise.Weights)
    batchNormalizationLayer("Name","block_13_depthwise_BN","Epsilon",0.001,"Offset",params.block_13_depthwise_BN.Offset,"Scale",params.block_13_depthwise_BN.Scale,"TrainedMean",params.block_13_depthwise_BN.TrainedMean,"TrainedVariance",params.block_13_depthwise_BN.TrainedVariance)
    clippedReluLayer(6,"Name","block_13_depthwise_relu")
    convolution2dLayer([1 1],160,"Name","block_13_project","Padding","same","Bias",params.block_13_project.Bias,"Weights",params.block_13_project.Weights)
    batchNormalizationLayer("Name","block_13_project_BN","Epsilon",0.001,"Offset",params.block_13_project_BN.Offset,"Scale",params.block_13_project_BN.Scale,"TrainedMean",params.block_13_project_BN.TrainedMean,"TrainedVariance",params.block_13_project_BN.TrainedVariance)];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    convolution2dLayer([1 1],960,"Name","block_14_expand","Padding","same","Bias",params.block_14_expand.Bias,"Weights",params.block_14_expand.Weights)
    batchNormalizationLayer("Name","block_14_expand_BN","Epsilon",0.001,"Offset",params.block_14_expand_BN.Offset,"Scale",params.block_14_expand_BN.Scale,"TrainedMean",params.block_14_expand_BN.TrainedMean,"TrainedVariance",params.block_14_expand_BN.TrainedVariance)
    clippedReluLayer(6,"Name","block_14_expand_relu")
    groupedConvolution2dLayer([3 3],1,960,"Name","block_14_depthwise","Padding","same","Bias",params.block_14_depthwise.Bias,"Weights",params.block_14_depthwise.Weights)
    batchNormalizationLayer("Name","block_14_depthwise_BN","Epsilon",0.001,"Offset",params.block_14_depthwise_BN.Offset,"Scale",params.block_14_depthwise_BN.Scale,"TrainedMean",params.block_14_depthwise_BN.TrainedMean,"TrainedVariance",params.block_14_depthwise_BN.TrainedVariance)
    clippedReluLayer(6,"Name","block_14_depthwise_relu")
    convolution2dLayer([1 1],160,"Name","block_14_project","Padding","same","Bias",params.block_14_project.Bias,"Weights",params.block_14_project.Weights)
    batchNormalizationLayer("Name","block_14_project_BN","Epsilon",0.001,"Offset",params.block_14_project_BN.Offset,"Scale",params.block_14_project_BN.Scale,"TrainedMean",params.block_14_project_BN.TrainedMean,"TrainedVariance",params.block_14_project_BN.TrainedVariance)];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = additionLayer(2,"Name","block_14_add");
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    convolution2dLayer([1 1],960,"Name","block_15_expand","Padding","same","Bias",params.block_15_expand.Bias,"Weights",params.block_15_expand.Weights)
    batchNormalizationLayer("Name","block_15_expand_BN","Epsilon",0.001,"Offset",params.block_15_expand_BN.Offset,"Scale",params.block_15_expand_BN.Scale,"TrainedMean",params.block_15_expand_BN.TrainedMean,"TrainedVariance",params.block_15_expand_BN.TrainedVariance)
    clippedReluLayer(6,"Name","block_15_expand_relu")
    groupedConvolution2dLayer([3 3],1,960,"Name","block_15_depthwise","Padding","same","Bias",params.block_15_depthwise.Bias,"Weights",params.block_15_depthwise.Weights)
    batchNormalizationLayer("Name","block_15_depthwise_BN","Epsilon",0.001,"Offset",params.block_15_depthwise_BN.Offset,"Scale",params.block_15_depthwise_BN.Scale,"TrainedMean",params.block_15_depthwise_BN.TrainedMean,"TrainedVariance",params.block_15_depthwise_BN.TrainedVariance)
    clippedReluLayer(6,"Name","block_15_depthwise_relu")
    convolution2dLayer([1 1],160,"Name","block_15_project","Padding","same","Bias",params.block_15_project.Bias,"Weights",params.block_15_project.Weights)
    batchNormalizationLayer("Name","block_15_project_BN","Epsilon",0.001,"Offset",params.block_15_project_BN.Offset,"Scale",params.block_15_project_BN.Scale,"TrainedMean",params.block_15_project_BN.TrainedMean,"TrainedVariance",params.block_15_project_BN.TrainedVariance)];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    additionLayer(2,"Name","block_15_add")
    convolution2dLayer([1 1],960,"Name","block_16_expand","Padding","same","Bias",params.block_16_expand.Bias,"Weights",params.block_16_expand.Weights)
    batchNormalizationLayer("Name","block_16_expand_BN","Epsilon",0.001,"Offset",params.block_16_expand_BN.Offset,"Scale",params.block_16_expand_BN.Scale,"TrainedMean",params.block_16_expand_BN.TrainedMean,"TrainedVariance",params.block_16_expand_BN.TrainedVariance)
    clippedReluLayer(6,"Name","block_16_expand_relu")
    groupedConvolution2dLayer([3 3],1,960,"Name","block_16_depthwise","Padding","same","Bias",params.block_16_depthwise.Bias,"Weights",params.block_16_depthwise.Weights)
    batchNormalizationLayer("Name","block_16_depthwise_BN","Epsilon",0.001,"Offset",params.block_16_depthwise_BN.Offset,"Scale",params.block_16_depthwise_BN.Scale,"TrainedMean",params.block_16_depthwise_BN.TrainedMean,"TrainedVariance",params.block_16_depthwise_BN.TrainedVariance)
    clippedReluLayer(6,"Name","block_16_depthwise_relu")
    convolution2dLayer([1 1],320,"Name","block_16_project","Padding","same","Bias",params.block_16_project.Bias,"Weights",params.block_16_project.Weights)
    batchNormalizationLayer("Name","block_16_project_BN","Epsilon",0.001,"Offset",params.block_16_project_BN.Offset,"Scale",params.block_16_project_BN.Scale,"TrainedMean",params.block_16_project_BN.TrainedMean,"TrainedVariance",params.block_16_project_BN.TrainedVariance)
    convolution2dLayer([1 1],1280,"Name","Conv_1","Bias",params.Conv_1.Bias,"Weights",params.Conv_1.Weights)
    batchNormalizationLayer("Name","Conv_1_bn","Epsilon",0.001,"Offset",params.Conv_1_bn.Offset,"Scale",params.Conv_1_bn.Scale,"TrainedMean",params.Conv_1_bn.TrainedMean,"TrainedVariance",params.Conv_1_bn.TrainedVariance)
    clippedReluLayer(6,"Name","out_relu")
    globalAveragePooling2dLayer("Name","global_average_pooling2d_1")
    fullyConnectedLayer(2,"Name","Logits")
    softmaxLayer("Name","softmax")
    classificationLayer("Name","classoutput")];
lgraph = addLayers(lgraph,tempLayers);
% Clean up the helper variable
clear tempLayers;
lgraph = connectLayers(lgraph,"block_1_project_BN","block_2_expand");
lgraph = connectLayers(lgraph,"block_1_project_BN","block_2_add/in2");
lgraph = connectLayers(lgraph,"block_2_project_BN","block_2_add/in1");
lgraph = connectLayers(lgraph,"block_3_project_BN","block_4_expand");
lgraph = connectLayers(lgraph,"block_3_project_BN","block_4_add/in2");
lgraph = connectLayers(lgraph,"block_4_project_BN","block_4_add/in1");
lgraph = connectLayers(lgraph,"block_4_add","block_5_expand");
lgraph = connectLayers(lgraph,"block_4_add","block_5_add/in2");
lgraph = connectLayers(lgraph,"block_5_project_BN","block_5_add/in1");
lgraph = connectLayers(lgraph,"block_6_project_BN","block_7_expand");
lgraph = connectLayers(lgraph,"block_6_project_BN","block_7_add/in2");
lgraph = connectLayers(lgraph,"block_7_project_BN","block_7_add/in1");
lgraph = connectLayers(lgraph,"block_7_add","block_8_expand");
lgraph = connectLayers(lgraph,"block_7_add","block_8_add/in2");
lgraph = connectLayers(lgraph,"block_8_project_BN","block_8_add/in1");
lgraph = connectLayers(lgraph,"block_8_add","block_9_expand");
lgraph = connectLayers(lgraph,"block_8_add","block_9_add/in2");
lgraph = connectLayers(lgraph,"block_9_project_BN","block_9_add/in1");
lgraph = connectLayers(lgraph,"block_10_project_BN","block_11_expand");
lgraph = connectLayers(lgraph,"block_10_project_BN","block_11_add/in2");
lgraph = connectLayers(lgraph,"block_11_project_BN","block_11_add/in1");
lgraph = connectLayers(lgraph,"block_11_add","block_12_expand");
lgraph = connectLayers(lgraph,"block_11_add","block_12_add/in2");
lgraph = connectLayers(lgraph,"block_12_project_BN","block_12_add/in1");
lgraph = connectLayers(lgraph,"block_13_project_BN","block_14_expand");
lgraph = connectLayers(lgraph,"block_13_project_BN","block_14_add/in2");
lgraph = connectLayers(lgraph,"block_14_project_BN","block_14_add/in1");
lgraph = connectLayers(lgraph,"block_14_add","block_15_expand");
lgraph = connectLayers(lgraph,"block_14_add","block_15_add/in2");
lgraph = connectLayers(lgraph,"block_15_project_BN","block_15_add/in1");
options = trainingOptions('sgdm', ...
    'MaxEpochs',50, ...
    'ValidationData',imdsValidation, ...
    'ValidationFrequency',30, ...
    'Verbose',false, ...
    'InitialLearnRate',1e-3, ...
    'Plots','training-progress', ...
    'Shuffle','every-epoch', ...
    'ExecutionEnvironment','cpu');
%%
net = trainNetwork(imdsTrain,lgraph,options);
save('CNNtestmini2','net')
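
After training it is worth checking the network on the held-out images. The lines below are only a minimal sketch for the two-class hornet dataset above; augmentedImageDatastore resizes every validation image to the 254-by-254-by-3 input size the network expects.

augValidation = augmentedImageDatastore([254 254 3],imdsValidation);
YPred = classify(net,augValidation);
accuracy = mean(YPred == imdsValidation.Labels)
confusionchart(imdsValidation.Labels,YPred)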

Note: params and lgraph are generated automatically by the Deep Learning Toolbox (exported from Deep Network Designer) and have to be added to the script manually.
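
If the exported params file is not available, a similar transfer-learning setup can be assembled from the pretrained MobileNet-v2 in the "Deep Learning Toolbox Model for MobileNet-v2 Network" support package. This is only a sketch of that alternative, not the workflow used above; the layer names below are those of the pretrained model (check with analyzeNetwork), and the images would need resizing to its 224-by-224-by-3 input.

net0 = mobilenetv2;                 % pretrained MobileNet-v2 (support package)
lgraph = layerGraph(net0);
numClasses = numel(categories(imdsTrain.Labels));
lgraph = replaceLayer(lgraph,'Logits',fullyConnectedLayer(numClasses,'Name','Logits'));
lgraph = replaceLayer(lgraph,'ClassificationLayer_Logits',classificationLayer('Name','classoutput'));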

Prediction code
main2.m

clear;clc;
dbstop if error
load CNNtestmini2.mat
net.Layers;
%read the image to classify
[file,path] = uigetfile('*');
imgFile = fullfile(path,file);    % renamed so it does not shadow the built-in image function
I = imresize(imread(imgFile),[254,254]);
%adjust size of image
% sz = net.Layers(1).InputSize;
% I = I(1:sz(1),1:sz(2),1:sz(3));
%classify
label = classify(net,I);
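
To confirm the prediction visually, the image and its predicted label can be shown together (a small optional addition):

figure
imshow(I)
title(string(label))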

Text recognition
Built with an LSTM network
main3.m

clear;clc;
dbstop if error
%% Import the dataset
filename = "data.xlsx";
data = readtable(filename,"TextType","string");
data = table2cell(data);
for i = size(data,1):-1:1
    if ismissing(data{i,1})
        data(i,:) = [];
    end
end
data = data([1:10,12:1000],:);    % keep the first 1000 rows; row 11 is held out as the new sample for main4.m
data = cell2table(data,"VariableNames",{'Description','Category'});
head(data);
data.Category = categorical(data.Category);
figure(1)
histogram(data.Category);
xlabel('Class');
ylabel('Frequency');
title('Class Distribution');
cvp = cvpartition(data.Category,'HoldOut',0.2);
dataTrain = data(training(cvp),:);
dataValidation = data(test(cvp),:);
textDataTrain = dataTrain.Description;
textDataValidation = dataValidation.Description;
YTrain = dataTrain.Category;
YValidation = dataValidation.Category;
figure(2)
wordcloud(textDataTrain);
title("Training Data");
% Text preprocessing
% documentsTrain = preprocessText(textDataTrain);
documentsTrain = tokenizedDocument(textDataTrain);
documentsTrain = lower(documentsTrain);
documentsTrain = erasePunctuation(documentsTrain);
% documentsValidation = preprocessText(textDataValidation);
documentsValidation = tokenizedDocument(textDataValidation);
documentsValidation = lower(documentsValidation);
documentsValidation = erasePunctuation(documentsValidation);
documentsTrain(1:5);
% Create a word encoding and check the document lengths
enc = wordEncoding(documentsTrain);
documentLengths = doclength(documentsTrain);
figure(3)
histogram(documentLengths)
title("Document Lengths");
xlabel("Length");
ylabel("Number of Documents");
sequenceLength = 60;
XTrain = doc2sequence(enc,documentsTrain,'Length',sequenceLength);
XTrain(1:5);
XValidation = doc2sequence(enc,documentsValidation,'Length',sequenceLength);
% Create the LSTM model
inputSize = 1;
embeddingDimension = 50;
numHiddenUnits = 80;
numWords = enc.NumWords;
numClasses = numel(categories(YTrain));
layers = [ ...
    sequenceInputLayer(inputSize)
    wordEmbeddingLayer(embeddingDimension,numWords)
    lstmLayer(numHiddenUnits,'OutputMode','last')
    fullyConnectedLayer(numClasses)
    softmaxLayer
    classificationLayer];
% layers = [
%     sequenceInputLayer(1,"Name","input")
%     lstmLayer(128,"Name","lstm","OutputMode","last")
%     dropoutLayer(0.5,"Name","dropout")
%     fullyConnectedLayer(2,"Name","fc")
%     softmaxLayer("Name","softmax")
%     classificationLayer("Name","classification")];
% Set the training options
options = trainingOptions("adam", ...
    "MiniBatchSize",16, ...
    'Plots','training-progress', ...
    'GradientThreshold',2, ...
    'MaxEpochs',60, ...
    'ValidationData',{XValidation,YValidation}, ...
    'Shuffle','every-epoch', ...
    'Verbose',false);
% YTrain = cellstr(YTrain);
net = trainNetwork(XTrain,YTrain,layers,options);
save('LSTMtestmini2','net')
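
As with the image model, a quick pass over the validation sequences gives the LSTM's hold-out accuracy. The sketch below also saves the word encoding and sequence length alongside the network, which main4.m needs in order to map new text onto the same word indices (the original script saves only net):

YPred = classify(net,XValidation);
accuracy = mean(YPred == YValidation)
save('LSTMtestmini2','net','enc','sequenceLength')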

Prediction code
main4.m

clear;clc;
dbstop if error
load LSTMtestmini2.mat
filename = "data.xlsx";
data = readtable(filename,"TextType","string");
data = table2cell(data);
for i = size(data,1):-1:1
    if ismissing(data{i,1})
        data(i,:) = [];
    end
end
reportsNew = data(11,1);    % row 11 was held out from training in main3.m
% Text preprocessing (must mirror the preprocessing used in main3.m)
documentsNew = tokenizedDocument(reportsNew);
documentsNew = lower(documentsNew);
documentsNew = erasePunctuation(documentsNew);
% NOTE: rebuilding the encoding from a single new document will not reproduce the
% word indices used during training; ideally reuse the encoding from main3.m.
enc = wordEncoding(documentsNew);
documentLengths = doclength(documentsNew);
sequenceLength = 60;    % match the sequence length used during training
XNew = doc2sequence(enc,documentsNew,'Length',sequenceLength);
[labelsNew,scores] = classify(net,XNew);
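
Rebuilding the word encoding from one new report will not, in general, reproduce the indices the embedding layer was trained on, so the scores above are unreliable. A safer pattern, assuming main3.m was changed to save enc and sequenceLength as sketched earlier, is:

load LSTMtestmini2.mat                      % net, enc, sequenceLength from training
documentsNew = erasePunctuation(lower(tokenizedDocument(reportsNew)));
XNew = doc2sequence(enc,documentsNew,'Length',sequenceLength);
[labelsNew,scores] = classify(net,XNew);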
