Two third-party libraries are recommended for this: torchstat and torchsummary. You can also count the parameters yourself with model.parameters(), as shown first.

  1. Counting parameters yourself with model.parameters
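All three variants below assume `model` is an arbitrary torch.nn.Module. As a minimal, self-contained setup (just one convenient choice; any model works), you could instantiate the same resnet34 used later in the torchstat and torchsummary sections:

import torch
from torchvision.models.resnet import resnet34

model = resnet34()  # stand-in model; replace with your own nn.Module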

Method 1
Use model.parameters() and sum numel() over every parameter tensor.

num_params = 0
for param in model.parameters():
    num_params += param.numel()
print(num_params)
print(f"params : {num_params / 1e6} M")
# print(model.net)

Method 2
Use model.named_parameters(), which yields each parameter's name along with the tensor.

num_params = 0
for name, param in model.named_parameters():
    # print(name, param.numel())
    num_params += param.numel()
print(num_params)
print(f"params : {num_params / 1e6} M")

Method 3
Wrap the counting in a helper that reports both the total and the trainable parameter counts:

def get_parameter_number(net):
    total_num = sum(p.numel() for p in net.parameters())
    trainable_num = sum(p.numel() for p in net.parameters() if p.requires_grad)
    return {'Total': total_num, 'Trainable': trainable_num}

print(get_parameter_number(model))
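One caveat worth knowing: parameters() does not include buffers such as BatchNorm's running_mean and running_var, which are stored in checkpoints but never trained. A small sketch that counts both:

# Buffers (e.g. BatchNorm running statistics) live in the state_dict but are
# excluded by parameters(); count them separately with model.buffers().
num_params = sum(p.numel() for p in model.parameters())
num_buffers = sum(b.numel() for b in model.buffers())
print(f"parameters: {num_params}, buffers: {num_buffers}, "
      f"state_dict elements: {num_params + num_buffers}")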
  2. torchstat

torchstat can be installed with pip:

pip install torchstat

from torchstat import stat
from torchvision.models.resnet import resnet34

model = resnet34()
stat(model, (3, 224, 224))
[MAdd]: AdaptiveAvgPool2d is not supported!
[Flops]: AdaptiveAvgPool2d is not supported!
[Memory]: AdaptiveAvgPool2d is not supported!
module name                  input shape output shape      params memory(MB)             MAdd            Flops  MemRead(B)  MemWrite(B) duration[%]    MemR+W(B)
0                      conv1    3 224 224   64 112 112      9408.0       3.06    235,225,088.0    118,013,952.0    639744.0    3211264.0       8.76%    3851008.0
1                        bn1   64 112 112   64 112 112       128.0       3.06      3,211,264.0      1,605,632.0   3211776.0    3211264.0       1.26%    6423040.0
2                       relu   64 112 112   64 112 112         0.0       3.06        802,816.0        802,816.0   3211264.0    3211264.0       0.43%    6422528.0
3                    maxpool   64 112 112   64  56  56         0.0       0.77      1,605,632.0        802,816.0   3211264.0     802816.0       2.00%    4014080.0
4             layer1.0.conv1   64  56  56   64  56  56     36864.0       0.77    231,010,304.0    115,605,504.0    950272.0     802816.0       3.26%    1753088.0
5               layer1.0.bn1   64  56  56   64  56  56       128.0       0.77        802,816.0        401,408.0    803328.0     802816.0       0.41%    1606144.0
6              layer1.0.relu   64  56  56   64  56  56         0.0       0.77        200,704.0        200,704.0    802816.0     802816.0       0.07%    1605632.0
7             layer1.0.conv2   64  56  56   64  56  56     36864.0       0.77    231,010,304.0    115,605,504.0    950272.0     802816.0       2.26%    1753088.0
8               layer1.0.bn2   64  56  56   64  56  56       128.0       0.77        802,816.0        401,408.0    803328.0     802816.0       0.40%    1606144.0
9             layer1.1.conv1   64  56  56   64  56  56     36864.0       0.77    231,010,304.0    115,605,504.0    950272.0     802816.0       2.43%    1753088.0
10              layer1.1.bn1   64  56  56   64  56  56       128.0       0.77        802,816.0        401,408.0    803328.0     802816.0       0.35%    1606144.0
11             layer1.1.relu   64  56  56   64  56  56         0.0       0.77        200,704.0        200,704.0    802816.0     802816.0       0.06%    1605632.0
12            layer1.1.conv2   64  56  56   64  56  56     36864.0       0.77    231,010,304.0    115,605,504.0    950272.0     802816.0       2.28%    1753088.0
13              layer1.1.bn2   64  56  56   64  56  56       128.0       0.77        802,816.0        401,408.0    803328.0     802816.0       0.38%    1606144.0
14            layer1.2.conv1   64  56  56   64  56  56     36864.0       0.77    231,010,304.0    115,605,504.0    950272.0     802816.0       2.38%    1753088.0
15              layer1.2.bn1   64  56  56   64  56  56       128.0       0.77        802,816.0        401,408.0    803328.0     802816.0       0.32%    1606144.0
16             layer1.2.relu   64  56  56   64  56  56         0.0       0.77        200,704.0        200,704.0    802816.0     802816.0       0.06%    1605632.0
17            layer1.2.conv2   64  56  56   64  56  56     36864.0       0.77    231,010,304.0    115,605,504.0    950272.0     802816.0       2.28%    1753088.0
18              layer1.2.bn2   64  56  56   64  56  56       128.0       0.77        802,816.0        401,408.0    803328.0     802816.0       0.41%    1606144.0
19            layer2.0.conv1   64  56  56  128  28  28     73728.0       0.38    115,505,152.0     57,802,752.0   1097728.0     401408.0       2.26%    1499136.0
20              layer2.0.bn1  128  28  28  128  28  28       256.0       0.38        401,408.0        200,704.0    402432.0     401408.0       0.17%     803840.0
21             layer2.0.relu  128  28  28  128  28  28         0.0       0.38        100,352.0        100,352.0    401408.0     401408.0       0.06%     802816.0
22            layer2.0.conv2  128  28  28  128  28  28    147456.0       0.38    231,110,656.0    115,605,504.0    991232.0     401408.0       2.46%    1392640.0
23              layer2.0.bn2  128  28  28  128  28  28       256.0       0.38        401,408.0        200,704.0    402432.0     401408.0       0.17%     803840.0
24     layer2.0.downsample.0   64  56  56  128  28  28      8192.0       0.38     12,744,704.0      6,422,528.0    835584.0     401408.0       1.67%    1236992.0
25     layer2.0.downsample.1  128  28  28  128  28  28       256.0       0.38        401,408.0        200,704.0    402432.0     401408.0       0.17%     803840.0
26            layer2.1.conv1  128  28  28  128  28  28    147456.0       0.38    231,110,656.0    115,605,504.0    991232.0     401408.0       1.77%    1392640.0
27              layer2.1.bn1  128  28  28  128  28  28       256.0       0.38        401,408.0        200,704.0    402432.0     401408.0       0.17%     803840.0
28             layer2.1.relu  128  28  28  128  28  28         0.0       0.38        100,352.0        100,352.0    401408.0     401408.0       0.06%     802816.0
29            layer2.1.conv2  128  28  28  128  28  28    147456.0       0.38    231,110,656.0    115,605,504.0    991232.0     401408.0       1.87%    1392640.0
30              layer2.1.bn2  128  28  28  128  28  28       256.0       0.38        401,408.0        200,704.0    402432.0     401408.0       0.19%     803840.0
31            layer2.2.conv1  128  28  28  128  28  28    147456.0       0.38    231,110,656.0    115,605,504.0    991232.0     401408.0       1.63%    1392640.0
32              layer2.2.bn1  128  28  28  128  28  28       256.0       0.38        401,408.0        200,704.0    402432.0     401408.0       0.18%     803840.0
33             layer2.2.relu  128  28  28  128  28  28         0.0       0.38        100,352.0        100,352.0    401408.0     401408.0       0.06%     802816.0
34            layer2.2.conv2  128  28  28  128  28  28    147456.0       0.38    231,110,656.0    115,605,504.0    991232.0     401408.0       1.68%    1392640.0
35              layer2.2.bn2  128  28  28  128  28  28       256.0       0.38        401,408.0        200,704.0    402432.0     401408.0       0.17%     803840.0
36            layer2.3.conv1  128  28  28  128  28  28    147456.0       0.38    231,110,656.0    115,605,504.0    991232.0     401408.0       1.66%    1392640.0
37              layer2.3.bn1  128  28  28  128  28  28       256.0       0.38        401,408.0        200,704.0    402432.0     401408.0       0.16%     803840.0
38             layer2.3.relu  128  28  28  128  28  28         0.0       0.38        100,352.0        100,352.0    401408.0     401408.0       0.06%     802816.0
39            layer2.3.conv2  128  28  28  128  28  28    147456.0       0.38    231,110,656.0    115,605,504.0    991232.0     401408.0       1.70%    1392640.0
40              layer2.3.bn2  128  28  28  128  28  28       256.0       0.38        401,408.0        200,704.0    402432.0     401408.0       0.16%     803840.0
41            layer3.0.conv1  128  28  28  256  14  14    294912.0       0.19    115,555,328.0     57,802,752.0   1581056.0     200704.0       2.17%    1781760.0
42              layer3.0.bn1  256  14  14  256  14  14       512.0       0.19        200,704.0        100,352.0    202752.0     200704.0       0.14%     403456.0
43             layer3.0.relu  256  14  14  256  14  14         0.0       0.19         50,176.0         50,176.0    200704.0     200704.0       0.10%     401408.0
44            layer3.0.conv2  256  14  14  256  14  14    589824.0       0.19    231,160,832.0    115,605,504.0   2560000.0     200704.0       2.64%    2760704.0
45              layer3.0.bn2  256  14  14  256  14  14       512.0       0.19        200,704.0        100,352.0    202752.0     200704.0       0.12%     403456.0
46     layer3.0.downsample.0  128  28  28  256  14  14     32768.0       0.19     12,794,880.0      6,422,528.0    532480.0     200704.0       1.23%     733184.0
47     layer3.0.downsample.1  256  14  14  256  14  14       512.0       0.19        200,704.0        100,352.0    202752.0     200704.0       0.13%     403456.0
48            layer3.1.conv1  256  14  14  256  14  14    589824.0       0.19    231,160,832.0    115,605,504.0   2560000.0     200704.0       1.68%    2760704.0
49              layer3.1.bn1  256  14  14  256  14  14       512.0       0.19        200,704.0        100,352.0    202752.0     200704.0       0.12%     403456.0
50             layer3.1.relu  256  14  14  256  14  14         0.0       0.19         50,176.0         50,176.0    200704.0     200704.0       0.06%     401408.0
51            layer3.1.conv2  256  14  14  256  14  14    589824.0       0.19    231,160,832.0    115,605,504.0   2560000.0     200704.0       1.64%    2760704.0
52              layer3.1.bn2  256  14  14  256  14  14       512.0       0.19        200,704.0        100,352.0    202752.0     200704.0       0.13%     403456.0
53            layer3.2.conv1  256  14  14  256  14  14    589824.0       0.19    231,160,832.0    115,605,504.0   2560000.0     200704.0       1.64%    2760704.0
54              layer3.2.bn1  256  14  14  256  14  14       512.0       0.19        200,704.0        100,352.0    202752.0     200704.0       0.12%     403456.0
55             layer3.2.relu  256  14  14  256  14  14         0.0       0.19         50,176.0         50,176.0    200704.0     200704.0       0.05%     401408.0
56            layer3.2.conv2  256  14  14  256  14  14    589824.0       0.19    231,160,832.0    115,605,504.0   2560000.0     200704.0       1.65%    2760704.0
57              layer3.2.bn2  256  14  14  256  14  14       512.0       0.19        200,704.0        100,352.0    202752.0     200704.0       0.12%     403456.0
58            layer3.3.conv1  256  14  14  256  14  14    589824.0       0.19    231,160,832.0    115,605,504.0   2560000.0     200704.0       1.73%    2760704.0
59              layer3.3.bn1  256  14  14  256  14  14       512.0       0.19        200,704.0        100,352.0    202752.0     200704.0       0.13%     403456.0
60             layer3.3.relu  256  14  14  256  14  14         0.0       0.19         50,176.0         50,176.0    200704.0     200704.0       0.06%     401408.0
61            layer3.3.conv2  256  14  14  256  14  14    589824.0       0.19    231,160,832.0    115,605,504.0   2560000.0     200704.0       2.13%    2760704.0
62              layer3.3.bn2  256  14  14  256  14  14       512.0       0.19        200,704.0        100,352.0    202752.0     200704.0       0.14%     403456.0
63            layer3.4.conv1  256  14  14  256  14  14    589824.0       0.19    231,160,832.0    115,605,504.0   2560000.0     200704.0       1.67%    2760704.0
64              layer3.4.bn1  256  14  14  256  14  14       512.0       0.19        200,704.0        100,352.0    202752.0     200704.0       0.14%     403456.0
65             layer3.4.relu  256  14  14  256  14  14         0.0       0.19         50,176.0         50,176.0    200704.0     200704.0       0.05%     401408.0
66            layer3.4.conv2  256  14  14  256  14  14    589824.0       0.19    231,160,832.0    115,605,504.0   2560000.0     200704.0       1.68%    2760704.0
67              layer3.4.bn2  256  14  14  256  14  14       512.0       0.19        200,704.0        100,352.0    202752.0     200704.0       0.12%     403456.0
68            layer3.5.conv1  256  14  14  256  14  14    589824.0       0.19    231,160,832.0    115,605,504.0   2560000.0     200704.0       1.95%    2760704.0
69              layer3.5.bn1  256  14  14  256  14  14       512.0       0.19        200,704.0        100,352.0    202752.0     200704.0       0.14%     403456.0
70             layer3.5.relu  256  14  14  256  14  14         0.0       0.19         50,176.0         50,176.0    200704.0     200704.0       0.06%     401408.0
71            layer3.5.conv2  256  14  14  256  14  14    589824.0       0.19    231,160,832.0    115,605,504.0   2560000.0     200704.0       2.01%    2760704.0
72              layer3.5.bn2  256  14  14  256  14  14       512.0       0.19        200,704.0        100,352.0    202752.0     200704.0       0.14%     403456.0
73            layer4.0.conv1  256  14  14  512   7   7   1179648.0       0.10    115,580,416.0     57,802,752.0   4919296.0     100352.0       2.81%    5019648.0
74              layer4.0.bn1  512   7   7  512   7   7      1024.0       0.10        100,352.0         50,176.0    104448.0     100352.0       0.13%     204800.0
75             layer4.0.relu  512   7   7  512   7   7         0.0       0.10         25,088.0         25,088.0    100352.0     100352.0       0.04%     200704.0
76            layer4.0.conv2  512   7   7  512   7   7   2359296.0       0.10    231,185,920.0    115,605,504.0   9537536.0     100352.0       6.47%    9637888.0
77              layer4.0.bn2  512   7   7  512   7   7      1024.0       0.10        100,352.0         50,176.0    104448.0     100352.0       0.13%     204800.0
78     layer4.0.downsample.0  256  14  14  512   7   7    131072.0       0.10     12,819,968.0      6,422,528.0    724992.0     100352.0       1.46%     825344.0
79     layer4.0.downsample.1  512   7   7  512   7   7      1024.0       0.10        100,352.0         50,176.0    104448.0     100352.0       0.14%     204800.0
80            layer4.1.conv1  512   7   7  512   7   7   2359296.0       0.10    231,185,920.0    115,605,504.0   9537536.0     100352.0       3.55%    9637888.0
81              layer4.1.bn1  512   7   7  512   7   7      1024.0       0.10        100,352.0         50,176.0    104448.0     100352.0       0.13%     204800.0
82             layer4.1.relu  512   7   7  512   7   7         0.0       0.10         25,088.0         25,088.0    100352.0     100352.0       0.03%     200704.0
83            layer4.1.conv2  512   7   7  512   7   7   2359296.0       0.10    231,185,920.0    115,605,504.0   9537536.0     100352.0       3.06%    9637888.0
84              layer4.1.bn2  512   7   7  512   7   7      1024.0       0.10        100,352.0         50,176.0    104448.0     100352.0       0.13%     204800.0
85            layer4.2.conv1  512   7   7  512   7   7   2359296.0       0.10    231,185,920.0    115,605,504.0   9537536.0     100352.0       2.94%    9637888.0
86              layer4.2.bn1  512   7   7  512   7   7      1024.0       0.10        100,352.0         50,176.0    104448.0     100352.0       0.14%     204800.0
87             layer4.2.relu  512   7   7  512   7   7         0.0       0.10         25,088.0         25,088.0    100352.0     100352.0       0.03%     200704.0
88            layer4.2.conv2  512   7   7  512   7   7   2359296.0       0.10    231,185,920.0    115,605,504.0   9537536.0     100352.0       3.02%    9637888.0
89              layer4.2.bn2  512   7   7  512   7   7      1024.0       0.10        100,352.0         50,176.0    104448.0     100352.0       0.12%     204800.0
90                   avgpool  512   7   7  512   1   1         0.0       0.00              0.0              0.0         0.0          0.0       0.38%          0.0
91                        fc          512         1000    513000.0       0.00      1,023,000.0        512,000.0   2054048.0       4000.0       1.14%    2058048.0
total                                                   21797672.0      37.62  7,342,524,440.0  3,674,223,104.0   2054048.0       4000.0     100.00%  167277632.0
=================================================================================================================================================================
Total params: 21,797,672
-----------------------------------------------------------------------------------------------------------------------------------------------------------------
Total memory: 37.62MB
Total MAdd: 7.34GMAdd
Total Flops: 3.67GFlops
Total MemR+W: 159.53MB
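These per-layer numbers can be reproduced by hand. For conv1 (a 7x7 convolution with 3 input channels, 64 output channels, no bias, and a 112x112 output), each output element costs 7*7*3 = 147 multiplications and 146 additions; the sketch below checks this against the first row of the table (torchstat's Flops column evidently counts only the multiplications, which is why MAdd is roughly twice Flops):

# Hand-derived cost of resnet34's conv1, checked against the torchstat row.
k, c_in, c_out = 7, 3, 64          # kernel size, input/output channels
h_out = w_out = 112                # output spatial size

muls = k * k * c_in                # 147 multiplications per output element
adds = muls - 1                    # 146 additions per output element (no bias)
out_elems = c_out * h_out * w_out  # 802,816 output elements

print((muls + adds) * out_elems)   # 235225088 -> matches the MAdd column
print(muls * out_elems)            # 118013952 -> matches the Flops column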
  3. torchsummary
    Install it with pip:
pip install torchsummary

import torch
from torchvision.models.resnet import resnet34
from torchsummary import summary

model = resnet34(pretrained=False).eval().cuda()
summary(model, input_size=(3, 224, 224), batch_size=-1)
----------------------------------------------------------------
        Layer (type)               Output Shape         Param #
================================================================
            Conv2d-1         [-1, 64, 112, 112]           9,408
       BatchNorm2d-2         [-1, 64, 112, 112]             128
              ReLU-3         [-1, 64, 112, 112]               0
         MaxPool2d-4           [-1, 64, 56, 56]               0
            Conv2d-5           [-1, 64, 56, 56]          36,864
       BatchNorm2d-6           [-1, 64, 56, 56]             128
              ReLU-7           [-1, 64, 56, 56]               0
            Conv2d-8           [-1, 64, 56, 56]          36,864
       BatchNorm2d-9           [-1, 64, 56, 56]             128
             ReLU-10           [-1, 64, 56, 56]               0
       BasicBlock-11           [-1, 64, 56, 56]               0
           Conv2d-12           [-1, 64, 56, 56]          36,864
      BatchNorm2d-13           [-1, 64, 56, 56]             128
             ReLU-14           [-1, 64, 56, 56]               0
           Conv2d-15           [-1, 64, 56, 56]          36,864
      BatchNorm2d-16           [-1, 64, 56, 56]             128
             ReLU-17           [-1, 64, 56, 56]               0
       BasicBlock-18           [-1, 64, 56, 56]               0
           Conv2d-19           [-1, 64, 56, 56]          36,864
      BatchNorm2d-20           [-1, 64, 56, 56]             128
             ReLU-21           [-1, 64, 56, 56]               0
           Conv2d-22           [-1, 64, 56, 56]          36,864
      BatchNorm2d-23           [-1, 64, 56, 56]             128
             ReLU-24           [-1, 64, 56, 56]               0
       BasicBlock-25           [-1, 64, 56, 56]               0
           Conv2d-26          [-1, 128, 28, 28]          73,728
      BatchNorm2d-27          [-1, 128, 28, 28]             256
             ReLU-28          [-1, 128, 28, 28]               0
           Conv2d-29          [-1, 128, 28, 28]         147,456
      BatchNorm2d-30          [-1, 128, 28, 28]             256
           Conv2d-31          [-1, 128, 28, 28]           8,192
      BatchNorm2d-32          [-1, 128, 28, 28]             256
             ReLU-33          [-1, 128, 28, 28]               0
       BasicBlock-34          [-1, 128, 28, 28]               0
           Conv2d-35          [-1, 128, 28, 28]         147,456
      BatchNorm2d-36          [-1, 128, 28, 28]             256
             ReLU-37          [-1, 128, 28, 28]               0
           Conv2d-38          [-1, 128, 28, 28]         147,456
      BatchNorm2d-39          [-1, 128, 28, 28]             256
             ReLU-40          [-1, 128, 28, 28]               0
       BasicBlock-41          [-1, 128, 28, 28]               0
           Conv2d-42          [-1, 128, 28, 28]         147,456
      BatchNorm2d-43          [-1, 128, 28, 28]             256
             ReLU-44          [-1, 128, 28, 28]               0
           Conv2d-45          [-1, 128, 28, 28]         147,456
      BatchNorm2d-46          [-1, 128, 28, 28]             256
             ReLU-47          [-1, 128, 28, 28]               0
       BasicBlock-48          [-1, 128, 28, 28]               0
           Conv2d-49          [-1, 128, 28, 28]         147,456
      BatchNorm2d-50          [-1, 128, 28, 28]             256
             ReLU-51          [-1, 128, 28, 28]               0
           Conv2d-52          [-1, 128, 28, 28]         147,456
      BatchNorm2d-53          [-1, 128, 28, 28]             256
             ReLU-54          [-1, 128, 28, 28]               0
       BasicBlock-55          [-1, 128, 28, 28]               0
           Conv2d-56          [-1, 256, 14, 14]         294,912
      BatchNorm2d-57          [-1, 256, 14, 14]             512
             ReLU-58          [-1, 256, 14, 14]               0
           Conv2d-59          [-1, 256, 14, 14]         589,824
      BatchNorm2d-60          [-1, 256, 14, 14]             512
           Conv2d-61          [-1, 256, 14, 14]          32,768
      BatchNorm2d-62          [-1, 256, 14, 14]             512
             ReLU-63          [-1, 256, 14, 14]               0
       BasicBlock-64          [-1, 256, 14, 14]               0
           Conv2d-65          [-1, 256, 14, 14]         589,824
      BatchNorm2d-66          [-1, 256, 14, 14]             512
             ReLU-67          [-1, 256, 14, 14]               0
           Conv2d-68          [-1, 256, 14, 14]         589,824
      BatchNorm2d-69          [-1, 256, 14, 14]             512
             ReLU-70          [-1, 256, 14, 14]               0
       BasicBlock-71          [-1, 256, 14, 14]               0
           Conv2d-72          [-1, 256, 14, 14]         589,824
      BatchNorm2d-73          [-1, 256, 14, 14]             512
             ReLU-74          [-1, 256, 14, 14]               0
           Conv2d-75          [-1, 256, 14, 14]         589,824
      BatchNorm2d-76          [-1, 256, 14, 14]             512
             ReLU-77          [-1, 256, 14, 14]               0
       BasicBlock-78          [-1, 256, 14, 14]               0
           Conv2d-79          [-1, 256, 14, 14]         589,824
      BatchNorm2d-80          [-1, 256, 14, 14]             512
             ReLU-81          [-1, 256, 14, 14]               0
           Conv2d-82          [-1, 256, 14, 14]         589,824
      BatchNorm2d-83          [-1, 256, 14, 14]             512
             ReLU-84          [-1, 256, 14, 14]               0
       BasicBlock-85          [-1, 256, 14, 14]               0
           Conv2d-86          [-1, 256, 14, 14]         589,824
      BatchNorm2d-87          [-1, 256, 14, 14]             512
             ReLU-88          [-1, 256, 14, 14]               0
           Conv2d-89          [-1, 256, 14, 14]         589,824
      BatchNorm2d-90          [-1, 256, 14, 14]             512
             ReLU-91          [-1, 256, 14, 14]               0
       BasicBlock-92          [-1, 256, 14, 14]               0
           Conv2d-93          [-1, 256, 14, 14]         589,824
      BatchNorm2d-94          [-1, 256, 14, 14]             512
             ReLU-95          [-1, 256, 14, 14]               0
           Conv2d-96          [-1, 256, 14, 14]         589,824
      BatchNorm2d-97          [-1, 256, 14, 14]             512
             ReLU-98          [-1, 256, 14, 14]               0
       BasicBlock-99          [-1, 256, 14, 14]               0
          Conv2d-100            [-1, 512, 7, 7]       1,179,648
     BatchNorm2d-101            [-1, 512, 7, 7]           1,024
            ReLU-102            [-1, 512, 7, 7]               0
          Conv2d-103            [-1, 512, 7, 7]       2,359,296
     BatchNorm2d-104            [-1, 512, 7, 7]           1,024
          Conv2d-105            [-1, 512, 7, 7]         131,072
     BatchNorm2d-106            [-1, 512, 7, 7]           1,024
            ReLU-107            [-1, 512, 7, 7]               0
      BasicBlock-108            [-1, 512, 7, 7]               0
          Conv2d-109            [-1, 512, 7, 7]       2,359,296
     BatchNorm2d-110            [-1, 512, 7, 7]           1,024
            ReLU-111            [-1, 512, 7, 7]               0
          Conv2d-112            [-1, 512, 7, 7]       2,359,296
     BatchNorm2d-113            [-1, 512, 7, 7]           1,024
            ReLU-114            [-1, 512, 7, 7]               0
      BasicBlock-115            [-1, 512, 7, 7]               0
          Conv2d-116            [-1, 512, 7, 7]       2,359,296
     BatchNorm2d-117            [-1, 512, 7, 7]           1,024
            ReLU-118            [-1, 512, 7, 7]               0
          Conv2d-119            [-1, 512, 7, 7]       2,359,296
     BatchNorm2d-120            [-1, 512, 7, 7]           1,024
            ReLU-121            [-1, 512, 7, 7]               0
      BasicBlock-122            [-1, 512, 7, 7]               0
AdaptiveAvgPool2d-123            [-1, 512, 1, 1]               0
          Linear-124                 [-1, 1000]         513,000
================================================================
Total params: 21,797,672
Trainable params: 21,797,672
Non-trainable params: 0
----------------------------------------------------------------
Input size (MB): 0.57
Forward/backward pass size (MB): 96.29
Params size (MB): 83.15
Estimated Total Size (MB): 180.01
----------------------------------------------------------------
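The size totals at the bottom are straightforward to verify: with float32 weights, every element takes 4 bytes. A quick check of the "Input size" and "Params size" lines:

total_params = 21_797_672
print(total_params * 4 / 1024 ** 2)   # 83.15 MB, matching "Params size"

input_elems = 3 * 224 * 224
print(input_elems * 4 / 1024 ** 2)    # 0.57 MB, matching "Input size"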
