3D ConvNet1 is a three-dimensional convolutional neural network proposed by Du Tran et al. in 2015 for extracting spatial and temporal features from video. This demo run follows the C3D User Guide. This post records a few caveats I ran into while running the demo, so that others can avoid the same problems.

Run Train 3D ConvNet Demo

Change directory to YOUR_C3D_HOME/examples/c3d_train_ucf101/

  • Compute volume mean from list

    • run sh create_volume_mean.sh to compute the volume mean file.
  • Train your own network from scratch
    • run sh train_ucf101.sh to train; expect a couple of days to finish.
    • when running train_ucf101.sh (after changing data_dir to my own path and setting the GPU device to #1, which was available), I encountered the following error:
I0921 21:17:04.632088 6032 video_data_layer.cpp:344] read video from /MY_DATA_PATH/v_JumpingJack_g25_c05/
F0921 21:17:04.632129 6032 video_data_layer.cpp:346] Check failed: ReadImageSequenceToVolumeDatum(file_list_[id].c_str(), 1, label_list_[id], new_length, new_height, new_width, sampling_rate, &datum)
*** Check failure stack trace: ***
@ 0x7efd01251c7d google::LogMessage::Fail()
@ 0x7efd01253b30 google::LogMessage::SendToLog()
@ 0x7efd01251842 google::LogMessage::Flush()
@ 0x7efd0125454e google::LogMessageFatal::~LogMessageFatal()
@ 0x4b01fe caffe::VideoDataLayer<>::SetUp()
@ 0x4458cb caffe::Net<>::Init()
@ 0x446b75 caffe::Net<>::Net()
@ 0x43412f caffe::Solver<>::Init()
@ 0x43783b caffe::Solver<>::Solver()
@ 0x40b826 main
@ 0x7efcfcfd9a40 (unknown)
@ 0x40e079 _start
Aborted (core dumped)
  • Solution: comparing my extracted frame files with the ones in the demo, I found that they differed in file extension: the author's frames use the JPG extension while mine used JPEG, so I renamed mine to JPG.
  • With the extension problem solved, I encountered a new error:
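The rename above can be scripted instead of done by hand. Below is a minimal sketch that walks a frame directory and renames every .jpeg file to .jpg; the sample directory it creates is purely illustrative, a stand-in for your own UCF101 frame folders.

```python
import os
import tempfile

def rename_jpeg_to_jpg(frames_dir):
    """Rename every *.jpeg file under frames_dir to *.jpg,
    matching the extension the C3D demo's frame reader expects."""
    renamed = 0
    for root, _dirs, files in os.walk(frames_dir):
        for name in files:
            if name.endswith(".jpeg"):
                src = os.path.join(root, name)
                dst = os.path.join(root, name[: -len(".jpeg")] + ".jpg")
                os.rename(src, dst)
                renamed += 1
    return renamed

# Demo on a throwaway directory (stands in for your frame folder):
demo_dir = tempfile.mkdtemp()
for i in (1, 2):
    open(os.path.join(demo_dir, "%06d.jpeg" % i), "w").close()
rename_jpeg_to_jpg(demo_dir)
print(sorted(os.listdir(demo_dir)))  # ['000001.jpg', '000002.jpg']
```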
sdy@sdy:~/Git/C3D/examples/c3d_train_ucf101$ sh train_ucf101.sh
I0108 11:25:15.496800  6760 train_net.cpp:26] Starting Optimization
I0108 11:25:15.496939  6760 solver.cpp:41] Creating training net.
I0108 11:25:15.505625  6760 net.cpp:76] Creating Layer data
I0108 11:25:15.505667  6760 net.cpp:112] data -> data
I0108 11:25:15.505692  6760 net.cpp:112] data -> label
I0108 11:25:15.512724  6760 video_data_layer.cpp:283] Opening file ../c3d_finetuning/train_01.lst
I0108 11:25:15.601454  6760 video_data_layer.cpp:312] Shuffling data
I0108 11:25:15.989821  6760 video_data_layer.cpp:317] A total of 107258 video chunks.
I0108 11:25:15.989881  6760 video_data_layer.cpp:344] read video from /home/sdy/Git/C3D/data/UCF101/frames/v_JumpingJack_g25_c05/
I0108 11:25:16.024459  6760 video_data_layer.cpp:365] output data size: 30,3,16,112,112
I0108 11:25:16.024513  6760 video_data_layer.cpp:387] Loading mean file from ucf101_train_mean.binaryproto
F0108 11:25:16.024785  6760 blob.cpp:98] Check failed: data_
*** Check failure stack trace: ***
@     0x7fa6f45a8daa  (unknown)
@     0x7fa6f45a8ce4  (unknown)
@     0x7fa6f45a86e6  (unknown)
@     0x7fa6f45ab687  (unknown)
@           0x436159  caffe::Blob<>::mutable_cpu_data()
@           0x4371e6  caffe::Blob<>::FromProto()
@           0x4ce1c5  caffe::VideoDataLayer<>::SetUp()
@           0x45ea1f  caffe::Net<>::Init()
@           0x4602c0  caffe::Net<>::Net()
@           0x43aaee  caffe::Solver<>::Init()
@           0x43fb8a  caffe::Solver<>::Solver()
@           0x40b15f  main
@     0x7fa6f0a95f45  (unknown)
@           0x40de1e  (unknown)
@              (nil)  (unknown)
Aborted (core dumped)
  • Solution: the error points at the ucf101_train_mean.binaryproto file, so I reran create_volume_mean.sh and got the error below. According to answers to related issues on GitHub, the author says this kind of failure comes from the train list: either a file path is wrong, or the frame files are not numbered starting from 000001. Re-checking the actual file paths against the list file, I found two video directories whose capitalization differed from the list entries, e.g. the directory is named v_HandstandPushups_g01_c01 but the list entry reads /home/…/v_HandStandPushups_g01_c01/ 1 36. I fixed the names accordingly.
sdy@sdy:~/Git/C3D/examples/c3d_train_ucf101$ sh create_volume_mean.sh
I0108 15:20:14.439486  8571 compute_volume_mean_from_list.cpp:53] using dropping rate 10
I0108 15:20:14.479281  8571 compute_volume_mean_from_list.cpp:80] Starting Iteration
F0108 15:20:54.331308  8571 compute_volume_mean_from_list.cpp:92] Check failed: size_in_datum == data_size (0 vs. 1050624) Incorrect data field size 0
*** Check failure stack trace: ***
@     0x7f0e683a0daa  (unknown)
@     0x7f0e683a0ce4  (unknown)
@     0x7f0e683a06e6  (unknown)
@     0x7f0e683a3687  (unknown)
@           0x408d20  main
@     0x7f0e64f91f45  (unknown)
@           0x408edd  (unknown)
@              (nil)  (unknown)
Aborted (core dumped)
  • After the fix, the ucf101_train_mean.binaryproto file was generated successfully:
sdy@sdy:~/Git/C3D/examples/c3d_train_ucf101$ sh create_volume_mean.sh
I0108 22:02:42.597549 22827 compute_volume_mean_from_list.cpp:53] using dropping rate 10
I0108 22:02:42.662868 22827 compute_volume_mean_from_list.cpp:80] Starting Iteration
E0108 22:05:40.792213 22827 compute_volume_mean_from_list.cpp:106] Processed 10000 files.
E0108 22:05:50.637400 22827 compute_volume_mean_from_list.cpp:112] Processed 10725 files.
I0108 22:05:50.639452 22827 compute_volume_mean_from_list.cpp:119] Write to ucf101_train_mean.binaryproto
  • Run train_ucf101.sh; training now starts normally:
sdy@sdy:~/Git/C3D/examples/c3d_train_ucf101$ sh train_ucf101.sh
I0108 22:10:21.996961 22888 train_net.cpp:26] Starting Optimization
I0108 22:10:21.997050 22888 solver.cpp:41] Creating training net.
I0108 22:10:21.997517 22888 net.cpp:76] Creating Layer data
I0108 22:10:21.997531 22888 net.cpp:112] data -> data
I0108 22:10:21.997541 22888 net.cpp:112] data -> label
I0108 22:10:21.997555 22888 video_data_layer.cpp:283] Opening file ../c3d_finetuning/train_01.lst
I0108 22:10:22.070096 22888 video_data_layer.cpp:312] Shuffling data
I0108 22:10:22.391844 22888 video_data_layer.cpp:317] A total of 107258 video chunks.
I0108 22:10:22.391902 22888 video_data_layer.cpp:344] read video from /home/sdy/Git/C3D/data/UCF101/frames/v_JumpingJack_g25_c05/
I0108 22:10:22.410243 22888 video_data_layer.cpp:365] output data size: 30,3,16,112,112
I0108 22:10:22.410284 22888 video_data_layer.cpp:387] Loading mean file from ucf101_train_mean.binaryproto
I0108 22:10:22.429491 22888 net.cpp:127] Top shape: 30 3 16 112 112 (18063360)
I0108 22:10:22.429550 22888 net.cpp:127] Top shape: 30 1 1 1 1 (30)
I0108 22:10:22.429569 22888 net.cpp:159] data does not need backward computation.
I0108 22:10:22.429590 22888 net.cpp:76] Creating Layer conv1a
I0108 22:10:22.429605 22888 net.cpp:86] conv1a <- data
I0108 22:10:22.429625 22888 net.cpp:112] conv1a -> conv1a
I0108 22:10:22.595605 22888 net.cpp:127] Top shape: 30 64 16 112 112 (385351680)
I0108 22:10:22.595657 22888 net.cpp:154] conv1a needs backward computation.
I0108 22:10:22.595681 22888 net.cpp:76] Creating Layer relu1a
I0108 22:10:22.595695 22888 net.cpp:86] relu1a <- conv1a
I0108 22:10:22.595716 22888 net.cpp:100] relu1a -> conv1a (in-place)
I0108 22:10:22.595736 22888 net.cpp:127] Top shape: 30 64 16 112 112 (385351680)
I0108 22:10:22.595749 22888 net.cpp:154] relu1a needs backward computation.
I0108 22:10:22.595764 22888 net.cpp:76] Creating Layer pool1
I0108 22:10:22.595777 22888 net.cpp:86] pool1 <- conv1a
I0108 22:10:22.595790 22888 net.cpp:112] pool1 -> pool1
I0108 22:10:22.598541 22888 net.cpp:127] Top shape: 30 64 16 56 56 (96337920)
I0108 22:10:22.598567 22888 net.cpp:154] pool1 needs backward computation.
I0108 22:10:22.598587 22888 net.cpp:76] Creating Layer conv2a
I0108 22:10:22.598598 22888 net.cpp:86] conv2a <- pool1
I0108 22:10:22.598610 22888 net.cpp:112] conv2a -> conv2a
I0108 22:10:22.605746 22888 net.cpp:127] Top shape: 30 128 16 56 56 (192675840)
I0108 22:10:22.605775 22888 net.cpp:154] conv2a needs backward computation.
I0108 22:10:22.605792 22888 net.cpp:76] Creating Layer relu2a
I0108 22:10:22.605809 22888 net.cpp:86] relu2a <- conv2a
I0108 22:10:22.605823 22888 net.cpp:100] relu2a -> conv2a (in-place)
I0108 22:10:22.605852 22888 net.cpp:127] Top shape: 30 128 16 56 56 (192675840)
I0108 22:10:22.605873 22888 net.cpp:154] relu2a needs backward computation.
I0108 22:10:22.605896 22888 net.cpp:76] Creating Layer pool2
I0108 22:10:22.605916 22888 net.cpp:86] pool2 <- conv2a
I0108 22:10:22.605937 22888 net.cpp:112] pool2 -> pool2
I0108 22:10:22.605958 22888 net.cpp:127] Top shape: 30 128 8 28 28 (24084480)
I0108 22:10:22.605978 22888 net.cpp:154] pool2 needs backward computation.
I0108 22:10:22.605993 22888 net.cpp:76] Creating Layer conv3a
I0108 22:10:22.606014 22888 net.cpp:86] conv3a <- pool2
I0108 22:10:22.606034 22888 net.cpp:112] conv3a -> conv3a
I0108 22:10:22.634486 22888 net.cpp:127] Top shape: 30 256 8 28 28 (48168960)
I0108 22:10:22.634656 22888 net.cpp:154] conv3a needs backward computation.
I0108 22:10:22.634786 22888 net.cpp:76] Creating Layer relu3a
I0108 22:10:22.634914 22888 net.cpp:86] relu3a <- conv3a
I0108 22:10:22.635069 22888 net.cpp:100] relu3a -> conv3a (in-place)
I0108 22:10:22.635203 22888 net.cpp:127] Top shape: 30 256 8 28 28 (48168960)
I0108 22:10:22.635341 22888 net.cpp:154] relu3a needs backward computation.
I0108 22:10:22.635465 22888 net.cpp:76] Creating Layer pool3
I0108 22:10:22.635601 22888 net.cpp:86] pool3 <- conv3a
I0108 22:10:22.635732 22888 net.cpp:112] pool3 -> pool3
I0108 22:10:22.635854 22888 net.cpp:127] Top shape: 30 256 4 14 14 (6021120)
I0108 22:10:22.635974 22888 net.cpp:154] pool3 needs backward computation.
I0108 22:10:22.636106 22888 net.cpp:76] Creating Layer conv4a
I0108 22:10:22.636245 22888 net.cpp:86] conv4a <- pool3
I0108 22:10:22.636371 22888 net.cpp:112] conv4a -> conv4a
I0108 22:10:22.694723 22888 net.cpp:127] Top shape: 30 256 4 14 14 (6021120)
I0108 22:10:22.694777 22888 net.cpp:154] conv4a needs backward computation.
I0108 22:10:22.694797 22888 net.cpp:76] Creating Layer relu4a
I0108 22:10:22.694818 22888 net.cpp:86] relu4a <- conv4a
I0108 22:10:22.694831 22888 net.cpp:100] relu4a -> conv4a (in-place)
I0108 22:10:22.694844 22888 net.cpp:127] Top shape: 30 256 4 14 14 (6021120)
I0108 22:10:22.694855 22888 net.cpp:154] relu4a needs backward computation.
I0108 22:10:22.694867 22888 net.cpp:76] Creating Layer pool4
I0108 22:10:22.694877 22888 net.cpp:86] pool4 <- conv4a
I0108 22:10:22.694890 22888 net.cpp:112] pool4 -> pool4
I0108 22:10:22.694903 22888 net.cpp:127] Top shape: 30 256 2 7 7 (752640)
I0108 22:10:22.694913 22888 net.cpp:154] pool4 needs backward computation.
I0108 22:10:22.694928 22888 net.cpp:76] Creating Layer conv5a
I0108 22:10:22.694938 22888 net.cpp:86] conv5a <- pool4
I0108 22:10:22.694949 22888 net.cpp:112] conv5a -> conv5a
I0108 22:10:22.773874 22888 net.cpp:127] Top shape: 30 256 2 7 7 (752640)
I0108 22:10:22.773926 22888 net.cpp:154] conv5a needs backward computation.
I0108 22:10:22.773949 22888 net.cpp:76] Creating Layer relu5a
I0108 22:10:22.773969 22888 net.cpp:86] relu5a <- conv5a
I0108 22:10:22.773988 22888 net.cpp:100] relu5a -> conv5a (in-place)
I0108 22:10:22.774005 22888 net.cpp:127] Top shape: 30 256 2 7 7 (752640)
I0108 22:10:22.774024 22888 net.cpp:154] relu5a needs backward computation.
I0108 22:10:22.774041 22888 net.cpp:76] Creating Layer pool5
I0108 22:10:22.774058 22888 net.cpp:86] pool5 <- conv5a
I0108 22:10:22.774075 22888 net.cpp:112] pool5 -> pool5
I0108 22:10:22.774094 22888 net.cpp:127] Top shape: 30 256 1 4 4 (122880)
I0108 22:10:22.774111 22888 net.cpp:154] pool5 needs backward computation.
I0108 22:10:22.774137 22888 net.cpp:76] Creating Layer fc6
I0108 22:10:22.774155 22888 net.cpp:86] fc6 <- pool5
I0108 22:10:22.774173 22888 net.cpp:112] fc6 -> fc6
I0108 22:10:23.052224 22888 net.cpp:127] Top shape: 30 2048 1 1 1 (61440)
I0108 22:10:23.052269 22888 net.cpp:154] fc6 needs backward computation.
I0108 22:10:23.052285 22888 net.cpp:76] Creating Layer relu6
I0108 22:10:23.052304 22888 net.cpp:86] relu6 <- fc6
I0108 22:10:23.052322 22888 net.cpp:100] relu6 -> fc6 (in-place)
I0108 22:10:23.052332 22888 net.cpp:127] Top shape: 30 2048 1 1 1 (61440)
I0108 22:10:23.052350 22888 net.cpp:154] relu6 needs backward computation.
I0108 22:10:23.052361 22888 net.cpp:76] Creating Layer drop6
I0108 22:10:23.052369 22888 net.cpp:86] drop6 <- fc6
I0108 22:10:23.052379 22888 net.cpp:100] drop6 -> fc6 (in-place)
I0108 22:10:23.052394 22888 net.cpp:127] Top shape: 30 2048 1 1 1 (61440)
I0108 22:10:23.052404 22888 net.cpp:154] drop6 needs backward computation.
I0108 22:10:23.052419 22888 net.cpp:76] Creating Layer fc7
I0108 22:10:23.052428 22888 net.cpp:86] fc7 <- fc6
I0108 22:10:23.052438 22888 net.cpp:112] fc7 -> fc7
I0108 22:10:23.188469 22888 net.cpp:127] Top shape: 30 2048 1 1 1 (61440)
I0108 22:10:23.188519 22888 net.cpp:154] fc7 needs backward computation.
I0108 22:10:23.188541 22888 net.cpp:76] Creating Layer relu7
I0108 22:10:23.188555 22888 net.cpp:86] relu7 <- fc7
I0108 22:10:23.188575 22888 net.cpp:100] relu7 -> fc7 (in-place)
I0108 22:10:23.188585 22888 net.cpp:127] Top shape: 30 2048 1 1 1 (61440)
I0108 22:10:23.188594 22888 net.cpp:154] relu7 needs backward computation.
I0108 22:10:23.188606 22888 net.cpp:76] Creating Layer drop7
I0108 22:10:23.188614 22888 net.cpp:86] drop7 <- fc7
I0108 22:10:23.188623 22888 net.cpp:100] drop7 -> fc7 (in-place)
I0108 22:10:23.188633 22888 net.cpp:127] Top shape: 30 2048 1 1 1 (61440)
I0108 22:10:23.188643 22888 net.cpp:154] drop7 needs backward computation.
I0108 22:10:23.188657 22888 net.cpp:76] Creating Layer fc8
I0108 22:10:23.188699 22888 net.cpp:86] fc8 <- fc7
I0108 22:10:23.188721 22888 net.cpp:112] fc8 -> fc8
I0108 22:10:23.195559 22888 net.cpp:127] Top shape: 30 101 1 1 1 (3030)
I0108 22:10:23.195588 22888 net.cpp:154] fc8 needs backward computation.
I0108 22:10:23.195605 22888 net.cpp:76] Creating Layer loss
I0108 22:10:23.195618 22888 net.cpp:86] loss <- fc8
I0108 22:10:23.195628 22888 net.cpp:86] loss <- label
I0108 22:10:23.195967 22888 net.cpp:154] loss needs backward computation.
I0108 22:10:23.196009 22888 net.cpp:183] Collecting Learning Rate and Weight Decay.
I0108 22:10:23.196032 22888 net.cpp:176] Network initialization done.
I0108 22:10:23.196040 22888 net.cpp:177] Memory required for Data 3113914320
I0108 22:10:23.196116 22888 solver.cpp:44] Creating testing net.
I0108 22:10:23.197083 22888 net.cpp:76] Creating Layer data
I0108 22:10:23.197106 22888 net.cpp:112] data -> data
I0108 22:10:23.197119 22888 net.cpp:112] data -> label
I0108 22:10:23.197134 22888 video_data_layer.cpp:283] Opening file ../c3d_finetuning/test_01.lst
I0108 22:10:23.227228 22888 video_data_layer.cpp:312] Shuffling data
I0108 22:10:23.228469 22888 video_data_layer.cpp:317] A total of 41822 video chunks.
I0108 22:10:23.228487 22888 video_data_layer.cpp:344] read video from /home/sdy/Git/C3D/data/UCF101/frames/v_HammerThrow_g03_c02/
I0108 22:10:23.252522 22888 video_data_layer.cpp:365] output data size: 30,3,16,112,112
I0108 22:10:23.252559 22888 video_data_layer.cpp:387] Loading mean file from ucf101_train_mean.binaryproto
I0108 22:10:23.273470 22888 net.cpp:127] Top shape: 30 3 16 112 112 (18063360)
I0108 22:10:23.273516 22888 net.cpp:127] Top shape: 30 1 1 1 1 (30)
I0108 22:10:23.273537 22888 net.cpp:159] data does not need backward computation.
I0108 22:10:23.273558 22888 net.cpp:76] Creating Layer conv1a
I0108 22:10:23.273573 22888 net.cpp:86] conv1a <- data
I0108 22:10:23.273586 22888 net.cpp:112] conv1a -> conv1a
I0108 22:10:23.273968 22888 net.cpp:127] Top shape: 30 64 16 112 112 (385351680)
I0108 22:10:23.274107 22888 net.cpp:154] conv1a needs backward computation.
I0108 22:10:23.274211 22888 net.cpp:76] Creating Layer relu1a
I0108 22:10:23.274319 22888 net.cpp:86] relu1a <- conv1a
I0108 22:10:23.274411 22888 net.cpp:100] relu1a -> conv1a (in-place)
I0108 22:10:23.274513 22888 net.cpp:127] Top shape: 30 64 16 112 112 (385351680)
I0108 22:10:23.274618 22888 net.cpp:154] relu1a needs backward computation.
I0108 22:10:23.274719 22888 net.cpp:76] Creating Layer pool1
I0108 22:10:23.274817 22888 net.cpp:86] pool1 <- conv1a
I0108 22:10:23.274917 22888 net.cpp:112] pool1 -> pool1
I0108 22:10:23.275020 22888 net.cpp:127] Top shape: 30 64 16 56 56 (96337920)
I0108 22:10:23.275104 22888 net.cpp:154] pool1 needs backward computation.
I0108 22:10:23.275205 22888 net.cpp:76] Creating Layer conv2a
I0108 22:10:23.275306 22888 net.cpp:86] conv2a <- pool1
I0108 22:10:23.275405 22888 net.cpp:112] conv2a -> conv2a
I0108 22:10:23.283287 22888 net.cpp:127] Top shape: 30 128 16 56 56 (192675840)
I0108 22:10:23.283311 22888 net.cpp:154] conv2a needs backward computation.
I0108 22:10:23.283332 22888 net.cpp:76] Creating Layer relu2a
I0108 22:10:23.283350 22888 net.cpp:86] relu2a <- conv2a
I0108 22:10:23.283368 22888 net.cpp:100] relu2a -> conv2a (in-place)
I0108 22:10:23.283387 22888 net.cpp:127] Top shape: 30 128 16 56 56 (192675840)
I0108 22:10:23.283404 22888 net.cpp:154] relu2a needs backward computation.
I0108 22:10:23.283422 22888 net.cpp:76] Creating Layer pool2
I0108 22:10:23.283439 22888 net.cpp:86] pool2 <- conv2a
I0108 22:10:23.283458 22888 net.cpp:112] pool2 -> pool2
I0108 22:10:23.283476 22888 net.cpp:127] Top shape: 30 128 8 28 28 (24084480)
I0108 22:10:23.283493 22888 net.cpp:154] pool2 needs backward computation.
I0108 22:10:23.283511 22888 net.cpp:76] Creating Layer conv3a
I0108 22:10:23.283529 22888 net.cpp:86] conv3a <- pool2
I0108 22:10:23.283545 22888 net.cpp:112] conv3a -> conv3a
I0108 22:10:23.311955 22888 net.cpp:127] Top shape: 30 256 8 28 28 (48168960)
I0108 22:10:23.311991 22888 net.cpp:154] conv3a needs backward computation.
I0108 22:10:23.312014 22888 net.cpp:76] Creating Layer relu3a
I0108 22:10:23.312033 22888 net.cpp:86] relu3a <- conv3a
I0108 22:10:23.312052 22888 net.cpp:100] relu3a -> conv3a (in-place)
I0108 22:10:23.312069 22888 net.cpp:127] Top shape: 30 256 8 28 28 (48168960)
I0108 22:10:23.312086 22888 net.cpp:154] relu3a needs backward computation.
I0108 22:10:23.312105 22888 net.cpp:76] Creating Layer pool3
I0108 22:10:23.312122 22888 net.cpp:86] pool3 <- conv3a
I0108 22:10:23.312140 22888 net.cpp:112] pool3 -> pool3
I0108 22:10:23.312160 22888 net.cpp:127] Top shape: 30 256 4 14 14 (6021120)
I0108 22:10:23.312177 22888 net.cpp:154] pool3 needs backward computation.
I0108 22:10:23.312196 22888 net.cpp:76] Creating Layer conv4a
I0108 22:10:23.312212 22888 net.cpp:86] conv4a <- pool3
I0108 22:10:23.312229 22888 net.cpp:112] conv4a -> conv4a
I0108 22:10:23.369153 22888 net.cpp:127] Top shape: 30 256 4 14 14 (6021120)
I0108 22:10:23.369189 22888 net.cpp:154] conv4a needs backward computation.
I0108 22:10:23.369210 22888 net.cpp:76] Creating Layer relu4a
I0108 22:10:23.369228 22888 net.cpp:86] relu4a <- conv4a
I0108 22:10:23.369247 22888 net.cpp:100] relu4a -> conv4a (in-place)
I0108 22:10:23.369257 22888 net.cpp:127] Top shape: 30 256 4 14 14 (6021120)
I0108 22:10:23.369267 22888 net.cpp:154] relu4a needs backward computation.
I0108 22:10:23.369277 22888 net.cpp:76] Creating Layer pool4
I0108 22:10:23.369287 22888 net.cpp:86] pool4 <- conv4a
I0108 22:10:23.369297 22888 net.cpp:112] pool4 -> pool4
I0108 22:10:23.369307 22888 net.cpp:127] Top shape: 30 256 2 7 7 (752640)
I0108 22:10:23.369318 22888 net.cpp:154] pool4 needs backward computation.
I0108 22:10:23.369329 22888 net.cpp:76] Creating Layer conv5a
I0108 22:10:23.369338 22888 net.cpp:86] conv5a <- pool4
I0108 22:10:23.369349 22888 net.cpp:112] conv5a -> conv5a
I0108 22:10:23.427034 22888 net.cpp:127] Top shape: 30 256 2 7 7 (752640)
I0108 22:10:23.427076 22888 net.cpp:154] conv5a needs backward computation.
I0108 22:10:23.427098 22888 net.cpp:76] Creating Layer relu5a
I0108 22:10:23.427117 22888 net.cpp:86] relu5a <- conv5a
I0108 22:10:23.427135 22888 net.cpp:100] relu5a -> conv5a (in-place)
I0108 22:10:23.427146 22888 net.cpp:127] Top shape: 30 256 2 7 7 (752640)
I0108 22:10:23.427165 22888 net.cpp:154] relu5a needs backward computation.
I0108 22:10:23.427182 22888 net.cpp:76] Creating Layer pool5
I0108 22:10:23.427199 22888 net.cpp:86] pool5 <- conv5a
I0108 22:10:23.427214 22888 net.cpp:112] pool5 -> pool5
I0108 22:10:23.427233 22888 net.cpp:127] Top shape: 30 256 1 4 4 (122880)
I0108 22:10:23.427251 22888 net.cpp:154] pool5 needs backward computation.
I0108 22:10:23.427274 22888 net.cpp:76] Creating Layer fc6
I0108 22:10:23.427284 22888 net.cpp:86] fc6 <- pool5
I0108 22:10:23.427300 22888 net.cpp:112] fc6 -> fc6
I0108 22:10:23.692828 22888 net.cpp:127] Top shape: 30 2048 1 1 1 (61440)
I0108 22:10:23.692981 22888 net.cpp:154] fc6 needs backward computation.
I0108 22:10:23.693073 22888 net.cpp:76] Creating Layer relu6
I0108 22:10:23.693157 22888 net.cpp:86] relu6 <- fc6
I0108 22:10:23.693241 22888 net.cpp:100] relu6 -> fc6 (in-place)
I0108 22:10:23.693325 22888 net.cpp:127] Top shape: 30 2048 1 1 1 (61440)
I0108 22:10:23.693408 22888 net.cpp:154] relu6 needs backward computation.
I0108 22:10:23.693492 22888 net.cpp:76] Creating Layer drop6
I0108 22:10:23.693577 22888 net.cpp:86] drop6 <- fc6
I0108 22:10:23.693660 22888 net.cpp:100] drop6 -> fc6 (in-place)
I0108 22:10:23.693743 22888 net.cpp:127] Top shape: 30 2048 1 1 1 (61440)
I0108 22:10:23.693826 22888 net.cpp:154] drop6 needs backward computation.
I0108 22:10:23.693910 22888 net.cpp:76] Creating Layer fc7
I0108 22:10:23.693990 22888 net.cpp:86] fc7 <- fc6
I0108 22:10:23.694073 22888 net.cpp:112] fc7 -> fc7
I0108 22:10:23.833493 22888 net.cpp:127] Top shape: 30 2048 1 1 1 (61440)
I0108 22:10:23.833534 22888 net.cpp:154] fc7 needs backward computation.
I0108 22:10:23.833549 22888 net.cpp:76] Creating Layer relu7
I0108 22:10:23.833559 22888 net.cpp:86] relu7 <- fc7
I0108 22:10:23.833571 22888 net.cpp:100] relu7 -> fc7 (in-place)
I0108 22:10:23.833581 22888 net.cpp:127] Top shape: 30 2048 1 1 1 (61440)
I0108 22:10:23.833590 22888 net.cpp:154] relu7 needs backward computation.
I0108 22:10:23.833601 22888 net.cpp:76] Creating Layer drop7
I0108 22:10:23.833611 22888 net.cpp:86] drop7 <- fc7
I0108 22:10:23.833621 22888 net.cpp:100] drop7 -> fc7 (in-place)
I0108 22:10:23.833631 22888 net.cpp:127] Top shape: 30 2048 1 1 1 (61440)
I0108 22:10:23.833640 22888 net.cpp:154] drop7 needs backward computation.
I0108 22:10:23.833652 22888 net.cpp:76] Creating Layer fc8
I0108 22:10:23.833662 22888 net.cpp:86] fc8 <- fc7
I0108 22:10:23.833673 22888 net.cpp:112] fc8 -> fc8
I0108 22:10:23.840504 22888 net.cpp:127] Top shape: 30 101 1 1 1 (3030)
I0108 22:10:23.840533 22888 net.cpp:154] fc8 needs backward computation.
I0108 22:10:23.840553 22888 net.cpp:76] Creating Layer prob
I0108 22:10:23.840569 22888 net.cpp:86] prob <- fc8
I0108 22:10:23.840589 22888 net.cpp:112] prob -> prob
I0108 22:10:23.840607 22888 net.cpp:127] Top shape: 30 101 1 1 1 (3030)
I0108 22:10:23.840625 22888 net.cpp:154] prob needs backward computation.
I0108 22:10:23.840642 22888 net.cpp:76] Creating Layer accuracy
I0108 22:10:23.840659 22888 net.cpp:86] accuracy <- prob
I0108 22:10:23.840703 22888 net.cpp:86] accuracy <- label
I0108 22:10:23.840724 22888 net.cpp:112] accuracy -> accuracy
I0108 22:10:23.840745 22888 net.cpp:127] Top shape: 1 2 1 1 1 (2)
I0108 22:10:23.840764 22888 net.cpp:154] accuracy needs backward computation.
I0108 22:10:23.840780 22888 net.cpp:165] This network produces output accuracy
I0108 22:10:23.840811 22888 net.cpp:183] Collecting Learning Rate and Weight Decay.
I0108 22:10:23.840834 22888 net.cpp:176] Network initialization done.
I0108 22:10:23.840852 22888 net.cpp:177] Memory required for Data 3113926448
I0108 22:10:23.840927 22888 solver.cpp:49] Solver scaffolding done.
I0108 22:10:24.079597 22888 solver.cpp:61] Solving deep_c3d_ucf101
I0108 22:10:24.079658 22888 solver.cpp:106] Iteration 0, Testing net
I0108 22:11:26.784036 22888 solver.cpp:142] Test score #0: 0.00733333
I0108 22:11:26.784090 22888 solver.cpp:142] Test score #1: 4.69726
I0108 22:12:01.860625 22888 solver.cpp:237] Iteration 20, lr = 0.003
I0108 22:12:01.869343 22888 solver.cpp:87] Iteration 20, loss = 5.26183
I0108 22:12:37.542536 22888 solver.cpp:237] Iteration 40, lr = 0.003
I0108 22:12:37.542987 22888 solver.cpp:87] Iteration 40, loss = 4.71496
I0108 22:13:13.279296 22888 solver.cpp:237] Iteration 60, lr = 0.003
I0108 22:13:13.279765 22888 solver.cpp:87] Iteration 60, loss = 4.70241
I0108 22:13:49.041620 22888 solver.cpp:237] Iteration 80, lr = 0.003
I0108 22:13:49.056371 22888 solver.cpp:87] Iteration 80, loss = 4.65352

  1. D. Tran, L. Bourdev, R. Fergus, L. Torresani, and M. Paluri. Learning spatiotemporal features with 3D convolutional networks. In ICCV, 2015. ↩
