Softmax classification on CIFAR-10
After collapsing CIFAR-10 to a single channel and reusing the earlier softmax classifier, accuracy is only around 40%. I could cry...
%matplotlib inline
from mxnet.gluon import data as gdata
from mxnet import autograd, nd
import gluonbook as gb
import sys

cifar_train = gdata.vision.CIFAR10(train=True)
cifar_test = gdata.vision.CIFAR10(train=False)
(len(cifar_train),len(cifar_test))
(50000, 10000)
feature,label = cifar_train[0]
feature.shape,feature.dtype
((32, 32, 3), numpy.uint8)
label,type(label),label.dtype
(6, numpy.int32, dtype('int32'))
batch_size = 256
transformer = gdata.vision.transforms.ToTensor()

if sys.platform.startswith('win'):
    num_workers = 0  # 0 means no extra worker processes to speed up data loading.
else:
    num_workers = 4
train_iter = gdata.DataLoader(cifar_train.transform_first(transformer),
                              batch_size, shuffle=True, num_workers=num_workers)
test_iter = gdata.DataLoader(cifar_test.transform_first(transformer),
                             batch_size, shuffle=False, num_workers=num_workers)
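`ToTensor` is what turns each `(32, 32, 3)` uint8 image into a `(3, 32, 32)` float32 tensor scaled to [0, 1]. A minimal numpy sketch of that transform (the input array here is synthetic, just to show the shape and dtype change):

```python
import numpy as np

def to_tensor(img):
    """Mimic gluon's ToTensor: HWC uint8 -> CHW float32 in [0, 1]."""
    return img.transpose(2, 0, 1).astype(np.float32) / 255.0

img = np.random.randint(0, 256, size=(32, 32, 3), dtype=np.uint8)
out = to_tensor(img)
print(out.shape, out.dtype)  # (3, 32, 32) float32
```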
len(train_iter)
196
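The 196 is just the 50000 training images divided into batches of 256, rounded up (the last batch is partial):

```python
import math

num_batches = math.ceil(50000 / 256)
print(num_batches)              # 196
print(50000 - 195 * 256)        # 80 images in the final partial batch
```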
for X, y in train_iter:
    print(X)
    break
[[[[0.3137255  0.3019608  0.34509805 ... ]
   ... (pixel values omitted) ...]]]]
<NDArray 256x3x32x32 @cpu(0)>
def wrapped_iter(data_iter):
    for X, y in data_iter:
        X = X[:, :1, :, :]  # keep only the first (red) channel
        yield X, y

for X, y in wrapped_iter(train_iter):
    print(X)
    print(y)
    break
for X, y in wrapped_iter(test_iter):
    print(X)
    print(y)
    break
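Note that `X[:, :1, :, :]` keeps only the red channel rather than computing a true grayscale image. A luminance-weighted average of the three channels (standard ITU-R BT.601 weights, sketched here on a synthetic numpy batch, not the gluon pipeline) preserves more of the image:

```python
import numpy as np

def to_grayscale(batch):
    """Collapse an NxCxHxW RGB batch to Nx1xHxW using BT.601 luminance weights."""
    w = np.array([0.299, 0.587, 0.114], dtype=np.float32)
    # Weighted sum over the channel axis, then restore the channel dimension.
    return np.tensordot(w, batch, axes=([0], [1]))[:, np.newaxis, :, :]

batch = np.random.rand(4, 3, 32, 32).astype(np.float32)
gray = to_grayscale(batch)
print(gray.shape)  # (4, 1, 32, 32)
```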
[[[[0.40784314 0.3882353  0.40392157 ... ]
   ... (pixel values omitted) ...]]]]
<NDArray 256x1x32x32 @cpu(0)>
[2 9 4 7 3 1 3 5 9 6 ... (labels truncated)]
<NDArray 256 @cpu(0)>
[[[[0.61960787 0.62352943 0.64705884 ... ]
   ... (pixel values omitted) ...]]]]
<NDArray 256x1x32x32 @cpu(0)>
[3 8 8 0 6 6 1 6 3 1 ... (labels truncated)]
<NDArray 256 @cpu(0)>
from mxnet import gluon, init
from mxnet.gluon import loss as gloss, nn

net = nn.Sequential()
net.add(nn.Dense(10))
net.initialize(init.Normal(sigma=0.01))
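The model is a single `Dense(10)` layer, i.e. softmax regression: each single-channel image is flattened to a vector of 1 * 32 * 32 = 1024 values and mapped linearly to 10 class logits. A numpy sketch of the forward pass (the weights here are random stand-ins for the `init.Normal(sigma=0.01)` initialization):

```python
import numpy as np

n, num_inputs, num_classes = 256, 1 * 32 * 32, 10
X = np.random.rand(n, 1, 32, 32).astype(np.float32)
W = np.random.normal(0, 0.01, size=(num_inputs, num_classes)).astype(np.float32)
b = np.zeros(num_classes, dtype=np.float32)

# Dense flattens the trailing dimensions, then applies an affine map.
logits = X.reshape(n, -1) @ W + b
print(logits.shape)  # (256, 10)
```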
loss = gloss.SoftmaxCrossEntropyLoss()
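`SoftmaxCrossEntropyLoss` fuses the softmax and the negative log-likelihood, which is more numerically stable than computing them separately. A minimal numpy equivalent of what it computes per sample:

```python
import numpy as np

def softmax_ce(logits, labels):
    """Per-sample cross-entropy from raw logits, with a max-shift for stability."""
    shifted = logits - logits.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(labels)), labels]

logits = np.array([[2.0, 0.5, -1.0], [0.0, 0.0, 0.0]])
labels = np.array([0, 2])
print(softmax_ce(logits, labels))  # uniform logits give loss ln(3) ≈ 1.0986
```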
trainer = gluon.Trainer(net.collect_params(), 'sgd', {'learning_rate': 0.0001})
num_epochs = 100
gb.train_ch3(net, train_iter, test_iter, loss, num_epochs, batch_size,
             None, None, trainer)
epoch 1, loss 1.6195, train acc 0.457, test acc 0.410
epoch 2, loss 1.6196, train acc 0.457, test acc 0.411
epoch 3, loss 1.6181, train acc 0.457, test acc 0.411
epoch 4, loss 1.6183, train acc 0.457, test acc 0.411
epoch 5, loss 1.6191, train acc 0.457, test acc 0.410
...
(epochs 6-99: loss drifts only from about 1.619 to 1.615; train acc stays at 0.457-0.458, test acc at 0.409-0.411)
...
epoch 100, loss 1.6158, train acc 0.457, test acc 0.410
gb.train_ch3??
Reprinted from: https://www.cnblogs.com/TreeDream/p/10020362.html