    ResNet152 Network Reproduction (Caffe)

    I. Prepare the Dataset

    1) Download the dataset

    Three classes of images were downloaded from the ImageNet website: big cat, dog, and fish. The training set contains 4149 images and the test set contains 1003 images, roughly a 4:1 split. The training images are stored in the train folder and the test images in the val folder.

    Both the train and val folders contain three subfolders, bigcat, dog, and fish, each holding the images of the corresponding class.

     

    2) Generate train.txt and val.txt with a Python script

    train.txt and val.txt list the file name and class label of each training and test image respectively (label mapping: bigcat: 0, dog: 1, fish: 2). The format looks like this:

    n02084071_9865.JPEG 1
    n02512053_3388.JPEG 2
    n02512053_6294.JPEG 2
    n02512053_2413.JPEG 2
    n02084071_5655.JPEG 1
    n02127808_9965.JPEG 0
    n02127808_8206.JPEG 0
    n02127808_4887.JPEG 0
    n02512053_1952.JPEG 2

    The Python code is shown below (it can be run under Windows first to produce the txt files):

    import os
    import random

    # Root folders of the training and validation images (Windows paths)
    trainPath = 'F:\\Resnet152\\train\\'
    valPath = 'F:\\Resnet152\\val\\'

    train = {}
    val = {}

    # Map each training image file name to its class label
    for name in os.listdir(trainPath + "bigcat\\"):
        train[name] = 0

    for name in os.listdir(trainPath + "dog\\"):
        train[name] = 1

    for name in os.listdir(trainPath + "fish\\"):
        train[name] = 2

    # Map each validation image file name to its class label
    for name in os.listdir(valPath + "bigcat\\"):
        val[name] = 0

    for name in os.listdir(valPath + "dog\\"):
        val[name] = 1

    for name in os.listdir(valPath + "fish\\"):
        val[name] = 2

    ftrain = open("F:\\Resnet152\\train\\train.txt", 'w')
    fval = open("F:\\Resnet152\\val\\val.txt", 'w')

    # Shuffle the file names so the classes are interleaved in the txt files
    trainName = list(train.keys())
    valName = list(val.keys())
    random.shuffle(trainName)
    random.shuffle(valName)

    for name in trainName:
        ftrain.write(name + " " + str(train[name]) + "\n")

    for name in valName:
        fval.write(name + " " + str(val[name]) + "\n")

    ftrain.close()
    fval.close()
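
    After running the script, a quick sanity check can confirm that the label distribution in the generated files matches the folder contents. The snippet below is a minimal sketch using the same Windows paths as above:

    from collections import Counter

    def label_counts(txt_path):
        # Count how many images carry each class label in the txt file
        with open(txt_path) as f:
            return Counter(line.split()[-1] for line in f if line.strip())

    print(label_counts('F:\\Resnet152\\train\\train.txt'))   # e.g. Counter({'0': ..., '1': ..., '2': ...})
    print(label_counts('F:\\Resnet152\\val\\val.txt'))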

     

    3) Generate the lmdb datasets

    Write an lmdb.sh script that uses train.txt and val.txt to generate train_lmdb and val_lmdb.

    The lmdb.sh script is as follows:

    #!/usr/bin/env sh
    # Create the ResNet152 lmdb inputs
    # N.B. set the path to the train + val data dirs

    EXAMPLE=/home/wy/ResNet152          # directory where the lmdb databases are written
    DATA=/home/wy/ResNet152             # directory containing train.txt and val.txt
    TOOLS=/home/wy/caffe/build/tools    # caffe tools directory

    TRAIN_DATA_ROOT=/home/wy/ResNet152/train/
    VAL_DATA_ROOT=/home/wy/ResNet152/val/

    # Set RESIZE=true to resize the images to 224 x 224. Leave as false if images have
    # already been resized using another tool.
    RESIZE=true
    if $RESIZE; then
      RESIZE_HEIGHT=224   # target image size
      RESIZE_WIDTH=224
    else
      RESIZE_HEIGHT=0
      RESIZE_WIDTH=0
    fi

    if [ ! -d "$TRAIN_DATA_ROOT" ]; then
      echo "Error: TRAIN_DATA_ROOT is not a path to a directory: $TRAIN_DATA_ROOT"
      echo "Set the TRAIN_DATA_ROOT variable in lmdb.sh to the path" \
           "where the training data is stored."
      exit 1
    fi

    if [ ! -d "$VAL_DATA_ROOT" ]; then
      echo "Error: VAL_DATA_ROOT is not a path to a directory: $VAL_DATA_ROOT"
      echo "Set the VAL_DATA_ROOT variable in lmdb.sh to the path" \
           "where the validation data is stored."
      exit 1
    fi

    echo "Creating train lmdb..."

    GLOG_logtostderr=1 $TOOLS/convert_imageset \
        --resize_height=$RESIZE_HEIGHT \
        --resize_width=$RESIZE_WIDTH \
        --shuffle \
        $TRAIN_DATA_ROOT \
        $DATA/train.txt \
        $EXAMPLE/train_lmdb

    echo "Creating val lmdb..."

    GLOG_logtostderr=1 $TOOLS/convert_imageset \
        --resize_height=$RESIZE_HEIGHT \
        --resize_width=$RESIZE_WIDTH \
        --shuffle \
        $VAL_DATA_ROOT \
        $DATA/val.txt \
        $EXAMPLE/val_lmdb

    echo "Done."

    Note: the script sets RESIZE_HEIGHT=224 and RESIZE_WIDTH=224 because the ResNet network expects a fixed input size. If the network you use has no size requirement, or the images are already 224x224, the resize step can be skipped.

    Run sh lmdb.sh in a Linux terminal to generate the lmdb files and finish the dataset preparation.

    When the script completes, two new folders, train_lmdb and val_lmdb, appear under the ResNet152 directory.
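
    To double-check that the databases were written correctly, one record can be read back with the lmdb and pycaffe Python packages. This is an optional sketch and assumes the default convert_imageset settings (raw, unencoded image data):

    import lmdb
    import numpy as np
    from caffe.proto import caffe_pb2

    # Open the freshly created database read-only and decode its first record
    env = lmdb.open('/home/wy/ResNet152/train_lmdb', readonly=True)
    with env.begin() as txn:
        key, value = next(txn.cursor().iternext())
        datum = caffe_pb2.Datum()
        datum.ParseFromString(value)
        # With --encoded off, datum.data holds raw pixels in C x H x W order
        img = np.frombuffer(datum.data, dtype=np.uint8).reshape(
            datum.channels, datum.height, datum.width)
        print(key, img.shape, 'label =', datum.label)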

     

    4) Compute the image mean

    During training the images are mean-subtracted, so the image mean is first saved to a file and then referenced from the training configuration. The command that generates the mean file is shown below; the place where it is referenced later will be pointed out explicitly:

    /home/wy/caffe/build/tools/compute_image_mean /home/wy/ResNet152/train_lmdb /home/wy/ResNet152/mean.binaryproto

    This uses the compute_image_mean tool under the Caffe build directory to compute the image mean; the input is train_lmdb and the output is mean.binaryproto.
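
    If the mean is later needed from Python (for example in the pycaffe sketch in the test section below), the binaryproto can be converted to a NumPy array. A minimal sketch, assuming pycaffe is on the PYTHONPATH:

    import numpy as np
    import caffe

    # Parse the binaryproto written by compute_image_mean
    blob = caffe.proto.caffe_pb2.BlobProto()
    with open('/home/wy/ResNet152/mean.binaryproto', 'rb') as f:
        blob.ParseFromString(f.read())

    # blobproto_to_array returns shape (1, 3, H, W); take the first element.
    # Channels are in BGR order because convert_imageset uses OpenCV.
    mean = caffe.io.blobproto_to_array(blob)[0]
    print(mean.shape, mean.mean(axis=(1, 2)))    # per-channel mean values
    np.save('/home/wy/ResNet152/mean.npy', mean)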

          

         

    II. Modify the Network Configuration Files

    Before modifying anything, here are the files that are needed: train_val.prototxt, deploy.prototxt, and solver.prototxt (the full content of the three files is given at the end of this post).

    1)  train_val.prototxt

    The file name train_val covers both train and val: a simple way to think about it is that every 500 training iterations a validation pass is run, so the person training can keep an eye on progress. For that to happen, this file has to point to both the train dataset and the val dataset.

          

           

    The rest of train_val.prototxt is simply the ResNet152 network definition and needs no modification.

     

    2)  deploy.prototxt

    deploy.prototxt is the network file used once the model has been trained, when the model is applied to classify a single image submitted by a user. Only one forward pass is needed to compute the probability of each class for the input image, so this file contains no loss-layer definition.

     

    3) solver.prototxt

    This file specifies the rules and parameters that govern training, such as the learning rate policy, the test interval, and the snapshot settings.
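
    As a reference, the sketch below writes a plausible solver.prototxt for this setup. The test_interval of 500 matches the description above, and max_iter and snapshot_prefix are inferred from the model file name used later (solver_iter_60000.caffemodel); the remaining values are illustrative assumptions, not necessarily the ones used here.

    # An example solver.prototxt for this setup; see the note above about which
    # values are assumptions.
    solver_lines = [
        'net: "/home/wy/ResNet152/train_val.prototxt"',
        'test_iter: 1000',        # val batches per test pass (batch_size 1)
        'test_interval: 500',     # validate every 500 training iterations
        'base_lr: 0.001',         # starting learning rate
        'lr_policy: "step"',
        'gamma: 0.1',             # multiply the learning rate by 0.1 ...
        'stepsize: 20000',        # ... every 20000 iterations
        'momentum: 0.9',
        'weight_decay: 0.0005',
        'display: 100',           # print the loss every 100 iterations
        'max_iter: 60000',        # total number of training iterations
        'snapshot: 10000',        # save a caffemodel every 10000 iterations
        'snapshot_prefix: "/home/wy/ResNet152/model/solver"',
        'solver_mode: GPU',
    ]

    with open('/home/wy/ResNet152/solver.prototxt', 'w') as f:
        f.write('\n'.join(solver_lines) + '\n')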

     

    III. Train the Network

    Once the network files have been edited, training can be started directly from a Linux terminal with the following command:

    /home/wy/caffe/build/tools/caffe train --solver=/home/wy/ResNet152/solver.prototxt

    The first part is the path to the caffe binary under the Caffe build directory, and the second is the path to solver.prototxt; press Enter and training starts. Alternatively, save the command above in a train.sh script.

    Then, in a Linux terminal (after navigating to the directory holding train.sh), run sh train.sh.

    A screenshot of the training log:

     

    IV. Test an Image

    Use the trained model to classify an image:

    /home/wy/caffe/build/examples/cpp_classification/classification.bin /home/wy/ResNet152/deploy.prototxt /home/wy/ResNet152/model/solver_iter_60000.caffemodel /home/wy/ResNet152/mean.binaryproto /home/wy/ResNet152/class_name.txt /home/wy/ResNet152/testImages/ISIC_0000001.jpg

    Running the command above on Linux classifies the image 'ISIC_0000001.jpg'.

    The content of class_name.txt (the labels must match those used in train.txt):

    0 bigcat  
    1 dog  
    2 fish 

    The test image:

    The test result (a different machine was used for testing, so the directories look slightly different):

    So the image is classified as bigcat with probability 0.9999, as fish with probability 0.0001, and as dog with probability 0.
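
    For reference, the same prediction can also be made from Python with pycaffe instead of classification.bin. This is a sketch under a few assumptions: the deploy.prototxt input blob is named data, its output blob is named prob, and mean.npy is the file saved by the conversion sketch in section I.

    import numpy as np
    import caffe

    caffe.set_mode_gpu()   # or caffe.set_mode_cpu()
    net = caffe.Net('/home/wy/ResNet152/deploy.prototxt',
                    '/home/wy/ResNet152/model/solver_iter_60000.caffemodel',
                    caffe.TEST)

    # Preprocessing: resize to the input shape, HWC->CHW, RGB->BGR,
    # scale to 0-255, subtract the per-channel mean
    transformer = caffe.io.Transformer({'data': net.blobs['data'].data.shape})
    transformer.set_transpose('data', (2, 0, 1))
    transformer.set_channel_swap('data', (2, 1, 0))
    transformer.set_raw_scale('data', 255)
    transformer.set_mean('data', np.load('/home/wy/ResNet152/mean.npy').mean(1).mean(1))

    img = caffe.io.load_image('/home/wy/ResNet152/testImages/ISIC_0000001.jpg')
    net.blobs['data'].data[...] = transformer.preprocess('data', img)
    probs = net.forward()['prob'][0]            # assumes the output blob is 'prob'

    labels = [line.split(None, 1)[1].strip()
              for line in open('/home/wy/ResNet152/class_name.txt')]
    for i in np.argsort(probs)[::-1]:
        print('%.4f  %s' % (probs[i], labels[i]))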

     

    train_val.prototxt

       1 name: "ResNet-152"
       2 layer {
       3     name: "data"
       4     type: "Data"
       5     top: "data"
       6     top: "label"
       7     include {
       8         phase: TRAIN
       9     }
      10     transform_param {
      11         mirror: true
      12         crop_size: 224
      13         mean_file: "/home/wy/ResNet152/mean.binaryproto"
      14     }
      15     data_param {
      16         source: "/home/wy/ResNet152/train_lmdb"
      17         batch_size: 1
      18         backend: LMDB
      19     }
      20 }
      21 layer {
      22     name: "data"
      23     type: "Data"
      24     top: "data"
      25     top: "label"
      26     include {
      27         phase: TEST
      28     }
      29     transform_param {
      30         mirror: false
      31         crop_size: 224
      32         mean_file:"/usr/develop/repertory/ResNet152/mean.binaryproto"
      33     }
      34     data_param {
      35         source: "/usr/develop/repertory/ResNet152/val_lmdb"
      36         batch_size: 1
      37         backend: LMDB
      38     }
      39 }
      40 
      41 layer {
      42     bottom: "data"
      43     top: "conv1"
      44     name: "conv1"
      45     type: "Convolution"
      46     convolution_param {
      47         num_output: 64
      48         kernel_size: 7
      49         pad: 3
      50         stride: 2
      51         weight_filler {
      52             type: "msra"
      53         }
      54         bias_term: false
      55 
      56     }
      57 }
      58 
      59 layer {
      60     bottom: "conv1"
      61     top: "conv1"
      62     name: "bn_conv1"
      63     type: "BatchNorm"
      64     batch_norm_param {
      65         use_global_stats: false
      66     }
      67 }
      68 
      69 layer {
      70     bottom: "conv1"
      71     top: "conv1"
      72     name: "scale_conv1"
      73     type: "Scale"
      74     scale_param {
      75         bias_term: true
      76     }
      77 }
      78 
      79 layer {
      80     bottom: "conv1"
      81     top: "conv1"
      82     name: "conv1_relu"
      83     type: "ReLU"
      84 }
      85 
      86 layer {
      87     bottom: "conv1"
      88     top: "pool1"
      89     name: "pool1"
      90     type: "Pooling"
      91     pooling_param {
      92         kernel_size: 3
      93         stride: 2
      94         pool: MAX
      95     }
      96 }
      97 
      98 layer {
      99     bottom: "pool1"
     100     top: "res2a_branch1"
     101     name: "res2a_branch1"
     102     type: "Convolution"
     103     convolution_param {
     104         num_output: 256
     105         kernel_size: 1
     106         pad: 0
     107         stride: 1
     108         weight_filler {
     109             type: "msra"
     110         }
     111         bias_term: false
     112 
     113     }
     114 }
     115 
     116 layer {
     117     bottom: "res2a_branch1"
     118     top: "res2a_branch1"
     119     name: "bn2a_branch1"
     120     type: "BatchNorm"
     121     batch_norm_param {
     122         use_global_stats: false
     123     }
     124 }
     125 
     126 layer {
     127     bottom: "res2a_branch1"
     128     top: "res2a_branch1"
     129     name: "scale2a_branch1"
     130     type: "Scale"
     131     scale_param {
     132         bias_term: true
     133     }
     134 }
     135 
     136 layer {
     137     bottom: "pool1"
     138     top: "res2a_branch2a"
     139     name: "res2a_branch2a"
     140     type: "Convolution"
     141     convolution_param {
     142         num_output: 64
     143         kernel_size: 1
     144         pad: 0
     145         stride: 1
     146         weight_filler {
     147             type: "msra"
     148         }
     149         bias_term: false
     150 
     151     }
     152 }
     153 
     154 layer {
     155     bottom: "res2a_branch2a"
     156     top: "res2a_branch2a"
     157     name: "bn2a_branch2a"
     158     type: "BatchNorm"
     159     batch_norm_param {
     160         use_global_stats: false
     161     }
     162 }
     163 
     164 layer {
     165     bottom: "res2a_branch2a"
     166     top: "res2a_branch2a"
     167     name: "scale2a_branch2a"
     168     type: "Scale"
     169     scale_param {
     170         bias_term: true
     171     }
     172 }
     173 
     174 layer {
     175     bottom: "res2a_branch2a"
     176     top: "res2a_branch2a"
     177     name: "res2a_branch2a_relu"
     178     type: "ReLU"
     179 }
     180 
     181 layer {
     182     bottom: "res2a_branch2a"
     183     top: "res2a_branch2b"
     184     name: "res2a_branch2b"
     185     type: "Convolution"
     186     convolution_param {
     187         num_output: 64
     188         kernel_size: 3
     189         pad: 1
     190         stride: 1
     191         weight_filler {
     192             type: "msra"
     193         }
     194         bias_term: false
     195 
     196     }
     197 }
     198 
     199 layer {
     200     bottom: "res2a_branch2b"
     201     top: "res2a_branch2b"
     202     name: "bn2a_branch2b"
     203     type: "BatchNorm"
     204     batch_norm_param {
     205         use_global_stats: false
     206     }
     207 }
     208 
     209 layer {
     210     bottom: "res2a_branch2b"
     211     top: "res2a_branch2b"
     212     name: "scale2a_branch2b"
     213     type: "Scale"
     214     scale_param {
     215         bias_term: true
     216     }
     217 }
     218 
     219 layer {
     220     bottom: "res2a_branch2b"
     221     top: "res2a_branch2b"
     222     name: "res2a_branch2b_relu"
     223     type: "ReLU"
     224 }
     225 
     226 layer {
     227     bottom: "res2a_branch2b"
     228     top: "res2a_branch2c"
     229     name: "res2a_branch2c"
     230     type: "Convolution"
     231     convolution_param {
     232         num_output: 256
     233         kernel_size: 1
     234         pad: 0
     235         stride: 1
     236         weight_filler {
     237             type: "msra"
     238         }
     239         bias_term: false
     240 
     241     }
     242 }
     243 
     244 layer {
     245     bottom: "res2a_branch2c"
     246     top: "res2a_branch2c"
     247     name: "bn2a_branch2c"
     248     type: "BatchNorm"
     249     batch_norm_param {
     250         use_global_stats: false
     251     }
     252 }
     253 
     254 layer {
     255     bottom: "res2a_branch2c"
     256     top: "res2a_branch2c"
     257     name: "scale2a_branch2c"
     258     type: "Scale"
     259     scale_param {
     260         bias_term: true
     261     }
     262 }
     263 
     264 layer {
     265     bottom: "res2a_branch1"
     266     bottom: "res2a_branch2c"
     267     top: "res2a"
     268     name: "res2a"
     269     type: "Eltwise"
     270     eltwise_param {
     271         operation: SUM
     272     }
     273 }
     274 
     275 layer {
     276     bottom: "res2a"
     277     top: "res2a"
     278     name: "res2a_relu"
     279     type: "ReLU"
     280 }
     281 
     282 layer {
     283     bottom: "res2a"
     284     top: "res2b_branch2a"
     285     name: "res2b_branch2a"
     286     type: "Convolution"
     287     convolution_param {
     288         num_output: 64
     289         kernel_size: 1
     290         pad: 0
     291         stride: 1
     292         weight_filler {
     293             type: "msra"
     294         }
     295         bias_term: false
     296 
     297     }
     298 }
     299 
     300 layer {
     301     bottom: "res2b_branch2a"
     302     top: "res2b_branch2a"
     303     name: "bn2b_branch2a"
     304     type: "BatchNorm"
     305     batch_norm_param {
     306         use_global_stats: false
     307     }
     308 }
     309 
     310 layer {
     311     bottom: "res2b_branch2a"
     312     top: "res2b_branch2a"
     313     name: "scale2b_branch2a"
     314     type: "Scale"
     315     scale_param {
     316         bias_term: true
     317     }
     318 }
     319 
     320 layer {
     321     bottom: "res2b_branch2a"
     322     top: "res2b_branch2a"
     323     name: "res2b_branch2a_relu"
     324     type: "ReLU"
     325 }
     326 
     327 layer {
     328     bottom: "res2b_branch2a"
     329     top: "res2b_branch2b"
     330     name: "res2b_branch2b"
     331     type: "Convolution"
     332     convolution_param {
     333         num_output: 64
     334         kernel_size: 3
     335         pad: 1
     336         stride: 1
     337         weight_filler {
     338             type: "msra"
     339         }
     340         bias_term: false
     341 
     342     }
     343 }
     344 
     345 layer {
     346     bottom: "res2b_branch2b"
     347     top: "res2b_branch2b"
     348     name: "bn2b_branch2b"
     349     type: "BatchNorm"
     350     batch_norm_param {
     351         use_global_stats: false
     352     }
     353 }
     354 
     355 layer {
     356     bottom: "res2b_branch2b"
     357     top: "res2b_branch2b"
     358     name: "scale2b_branch2b"
     359     type: "Scale"
     360     scale_param {
     361         bias_term: true
     362     }
     363 }
     364 
     365 layer {
     366     bottom: "res2b_branch2b"
     367     top: "res2b_branch2b"
     368     name: "res2b_branch2b_relu"
     369     type: "ReLU"
     370 }
     371 
     372 layer {
     373     bottom: "res2b_branch2b"
     374     top: "res2b_branch2c"
     375     name: "res2b_branch2c"
     376     type: "Convolution"
     377     convolution_param {
     378         num_output: 256
     379         kernel_size: 1
     380         pad: 0
     381         stride: 1
     382         weight_filler {
     383             type: "msra"
     384         }
     385         bias_term: false
     386 
     387     }
     388 }
     389 
     390 layer {
     391     bottom: "res2b_branch2c"
     392     top: "res2b_branch2c"
     393     name: "bn2b_branch2c"
     394     type: "BatchNorm"
     395     batch_norm_param {
     396         use_global_stats: false
     397     }
     398 }
     399 
     400 layer {
     401     bottom: "res2b_branch2c"
     402     top: "res2b_branch2c"
     403     name: "scale2b_branch2c"
     404     type: "Scale"
     405     scale_param {
     406         bias_term: true
     407     }
     408 }
     409 
     410 layer {
     411     bottom: "res2a"
     412     bottom: "res2b_branch2c"
     413     top: "res2b"
     414     name: "res2b"
     415     type: "Eltwise"
     416     eltwise_param {
     417         operation: SUM
     418     }
     419 }
     420 
     421 layer {
     422     bottom: "res2b"
     423     top: "res2b"
     424     name: "res2b_relu"
     425     type: "ReLU"
     426 }
     427 
     428 layer {
     429     bottom: "res2b"
     430     top: "res2c_branch2a"
     431     name: "res2c_branch2a"
     432     type: "Convolution"
     433     convolution_param {
     434         num_output: 64
     435         kernel_size: 1
     436         pad: 0
     437         stride: 1
     438         weight_filler {
     439             type: "msra"
     440         }
     441         bias_term: false
     442 
     443     }
     444 }
     445 
     446 layer {
     447     bottom: "res2c_branch2a"
     448     top: "res2c_branch2a"
     449     name: "bn2c_branch2a"
     450     type: "BatchNorm"
     451     batch_norm_param {
     452         use_global_stats: false
     453     }
     454 }
     455 
     456 layer {
     457     bottom: "res2c_branch2a"
     458     top: "res2c_branch2a"
     459     name: "scale2c_branch2a"
     460     type: "Scale"
     461     scale_param {
     462         bias_term: true
     463     }
     464 }
     465 
     466 layer {
     467     bottom: "res2c_branch2a"
     468     top: "res2c_branch2a"
     469     name: "res2c_branch2a_relu"
     470     type: "ReLU"
     471 }
     472 
     473 layer {
     474     bottom: "res2c_branch2a"
     475     top: "res2c_branch2b"
     476     name: "res2c_branch2b"
     477     type: "Convolution"
     478     convolution_param {
     479         num_output: 64
     480         kernel_size: 3
     481         pad: 1
     482         stride: 1
     483         weight_filler {
     484             type: "msra"
     485         }
     486         bias_term: false
     487 
     488     }
     489 }
     490 
     491 layer {
     492     bottom: "res2c_branch2b"
     493     top: "res2c_branch2b"
     494     name: "bn2c_branch2b"
     495     type: "BatchNorm"
     496     batch_norm_param {
     497         use_global_stats: false
     498     }
     499 }
     500 
     501 layer {
     502     bottom: "res2c_branch2b"
     503     top: "res2c_branch2b"
     504     name: "scale2c_branch2b"
     505     type: "Scale"
     506     scale_param {
     507         bias_term: true
     508     }
     509 }
     510 
     511 layer {
     512     bottom: "res2c_branch2b"
     513     top: "res2c_branch2b"
     514     name: "res2c_branch2b_relu"
     515     type: "ReLU"
     516 }
     517 
     518 layer {
     519     bottom: "res2c_branch2b"
     520     top: "res2c_branch2c"
     521     name: "res2c_branch2c"
     522     type: "Convolution"
     523     convolution_param {
     524         num_output: 256
     525         kernel_size: 1
     526         pad: 0
     527         stride: 1
     528         weight_filler {
     529             type: "msra"
     530         }
     531         bias_term: false
     532 
     533     }
     534 }
     535 
     536 layer {
     537     bottom: "res2c_branch2c"
     538     top: "res2c_branch2c"
     539     name: "bn2c_branch2c"
     540     type: "BatchNorm"
     541     batch_norm_param {
     542         use_global_stats: false
     543     }
     544 }
     545 
     546 layer {
     547     bottom: "res2c_branch2c"
     548     top: "res2c_branch2c"
     549     name: "scale2c_branch2c"
     550     type: "Scale"
     551     scale_param {
     552         bias_term: true
     553     }
     554 }
     555 
     556 layer {
     557     bottom: "res2b"
     558     bottom: "res2c_branch2c"
     559     top: "res2c"
     560     name: "res2c"
     561     type: "Eltwise"
     562     eltwise_param {
     563         operation: SUM
     564     }
     565 }
     566 
     567 layer {
     568     bottom: "res2c"
     569     top: "res2c"
     570     name: "res2c_relu"
     571     type: "ReLU"
     572 }
     573 
     574 layer {
     575     bottom: "res2c"
     576     top: "res3a_branch1"
     577     name: "res3a_branch1"
     578     type: "Convolution"
     579     convolution_param {
     580         num_output: 512
     581         kernel_size: 1
     582         pad: 0
     583         stride: 2
     584         weight_filler {
     585             type: "msra"
     586         }
     587         bias_term: false
     588 
     589     }
     590 }
     591 
     592 layer {
     593     bottom: "res3a_branch1"
     594     top: "res3a_branch1"
     595     name: "bn3a_branch1"
     596     type: "BatchNorm"
     597     batch_norm_param {
     598         use_global_stats: false
     599     }
     600 }
     601 
     602 layer {
     603     bottom: "res3a_branch1"
     604     top: "res3a_branch1"
     605     name: "scale3a_branch1"
     606     type: "Scale"
     607     scale_param {
     608         bias_term: true
     609     }
     610 }
     611 
     612 layer {
     613     bottom: "res2c"
     614     top: "res3a_branch2a"
     615     name: "res3a_branch2a"
     616     type: "Convolution"
     617     convolution_param {
     618         num_output: 128
     619         kernel_size: 1
     620         pad: 0
     621         stride: 2
     622         weight_filler {
     623             type: "msra"
     624         }
     625         bias_term: false
     626 
     627     }
     628 }
     629 
     630 layer {
     631     bottom: "res3a_branch2a"
     632     top: "res3a_branch2a"
     633     name: "bn3a_branch2a"
     634     type: "BatchNorm"
     635     batch_norm_param {
     636         use_global_stats: false
     637     }
     638 }
     639 
     640 layer {
     641     bottom: "res3a_branch2a"
     642     top: "res3a_branch2a"
     643     name: "scale3a_branch2a"
     644     type: "Scale"
     645     scale_param {
     646         bias_term: true
     647     }
     648 }
     649 
     650 layer {
     651     bottom: "res3a_branch2a"
     652     top: "res3a_branch2a"
     653     name: "res3a_branch2a_relu"
     654     type: "ReLU"
     655 }
     656 
     657 layer {
     658     bottom: "res3a_branch2a"
     659     top: "res3a_branch2b"
     660     name: "res3a_branch2b"
     661     type: "Convolution"
     662     convolution_param {
     663         num_output: 128
     664         kernel_size: 3
     665         pad: 1
     666         stride: 1
     667         weight_filler {
     668             type: "msra"
     669         }
     670         bias_term: false
     671 
     672     }
     673 }
     674 
     675 layer {
     676     bottom: "res3a_branch2b"
     677     top: "res3a_branch2b"
     678     name: "bn3a_branch2b"
     679     type: "BatchNorm"
     680     batch_norm_param {
     681         use_global_stats: false
     682     }
     683 }
     684 
     685 layer {
     686     bottom: "res3a_branch2b"
     687     top: "res3a_branch2b"
     688     name: "scale3a_branch2b"
     689     type: "Scale"
     690     scale_param {
     691         bias_term: true
     692     }
     693 }
     694 
     695 layer {
     696     bottom: "res3a_branch2b"
     697     top: "res3a_branch2b"
     698     name: "res3a_branch2b_relu"
     699     type: "ReLU"
     700 }
     701 
     702 layer {
     703     bottom: "res3a_branch2b"
     704     top: "res3a_branch2c"
     705     name: "res3a_branch2c"
     706     type: "Convolution"
     707     convolution_param {
     708         num_output: 512
     709         kernel_size: 1
     710         pad: 0
     711         stride: 1
     712         weight_filler {
     713             type: "msra"
     714         }
     715         bias_term: false
     716 
     717     }
     718 }
     719 
     720 layer {
     721     bottom: "res3a_branch2c"
     722     top: "res3a_branch2c"
     723     name: "bn3a_branch2c"
     724     type: "BatchNorm"
     725     batch_norm_param {
     726         use_global_stats: false
     727     }
     728 }
     729 
     730 layer {
     731     bottom: "res3a_branch2c"
     732     top: "res3a_branch2c"
     733     name: "scale3a_branch2c"
     734     type: "Scale"
     735     scale_param {
     736         bias_term: true
     737     }
     738 }
     739 
     740 layer {
     741     bottom: "res3a_branch1"
     742     bottom: "res3a_branch2c"
     743     top: "res3a"
     744     name: "res3a"
     745     type: "Eltwise"
     746     eltwise_param {
     747         operation: SUM
     748     }
     749 }
     750 
     751 layer {
     752     bottom: "res3a"
     753     top: "res3a"
     754     name: "res3a_relu"
     755     type: "ReLU"
     756 }
     757 
     758 layer {
     759     bottom: "res3a"
     760     top: "res3b1_branch2a"
     761     name: "res3b1_branch2a"
     762     type: "Convolution"
     763     convolution_param {
     764         num_output: 128
     765         kernel_size: 1
     766         pad: 0
     767         stride: 1
     768         weight_filler {
     769             type: "msra"
     770         }
     771         bias_term: false
     772 
     773     }
     774 }
     775 
     776 layer {
     777     bottom: "res3b1_branch2a"
     778     top: "res3b1_branch2a"
     779     name: "bn3b1_branch2a"
     780     type: "BatchNorm"
     781     batch_norm_param {
     782         use_global_stats: false
     783     }
     784 }
     785 
     786 layer {
     787     bottom: "res3b1_branch2a"
     788     top: "res3b1_branch2a"
     789     name: "scale3b1_branch2a"
     790     type: "Scale"
     791     scale_param {
     792         bias_term: true
     793     }
     794 }
     795 
     796 layer {
     797     bottom: "res3b1_branch2a"
     798     top: "res3b1_branch2a"
     799     name: "res3b1_branch2a_relu"
     800     type: "ReLU"
     801 }
     802 
     803 layer {
     804     bottom: "res3b1_branch2a"
     805     top: "res3b1_branch2b"
     806     name: "res3b1_branch2b"
     807     type: "Convolution"
     808     convolution_param {
     809         num_output: 128
     810         kernel_size: 3
     811         pad: 1
     812         stride: 1
     813         weight_filler {
     814             type: "msra"
     815         }
     816         bias_term: false
     817 
     818     }
     819 }
     820 
     821 layer {
     822     bottom: "res3b1_branch2b"
     823     top: "res3b1_branch2b"
     824     name: "bn3b1_branch2b"
     825     type: "BatchNorm"
     826     batch_norm_param {
     827         use_global_stats: false
     828     }
     829 }
     830 
     831 layer {
     832     bottom: "res3b1_branch2b"
     833     top: "res3b1_branch2b"
     834     name: "scale3b1_branch2b"
     835     type: "Scale"
     836     scale_param {
     837         bias_term: true
     838     }
     839 }
     840 
     841 layer {
     842     bottom: "res3b1_branch2b"
     843     top: "res3b1_branch2b"
     844     name: "res3b1_branch2b_relu"
     845     type: "ReLU"
     846 }
     847 
     848 layer {
     849     bottom: "res3b1_branch2b"
     850     top: "res3b1_branch2c"
     851     name: "res3b1_branch2c"
     852     type: "Convolution"
     853     convolution_param {
     854         num_output: 512
     855         kernel_size: 1
     856         pad: 0
     857         stride: 1
     858         weight_filler {
     859             type: "msra"
     860         }
     861         bias_term: false
     862 
     863     }
     864 }
     865 
     866 layer {
     867     bottom: "res3b1_branch2c"
     868     top: "res3b1_branch2c"
     869     name: "bn3b1_branch2c"
     870     type: "BatchNorm"
     871     batch_norm_param {
     872         use_global_stats: false
     873     }
     874 }
     875 
     876 layer {
     877     bottom: "res3b1_branch2c"
     878     top: "res3b1_branch2c"
     879     name: "scale3b1_branch2c"
     880     type: "Scale"
     881     scale_param {
     882         bias_term: true
     883     }
     884 }
     885 
     886 layer {
     887     bottom: "res3a"
     888     bottom: "res3b1_branch2c"
     889     top: "res3b1"
     890     name: "res3b1"
     891     type: "Eltwise"
     892     eltwise_param {
     893         operation: SUM
     894     }
     895 }
     896 
     897 layer {
     898     bottom: "res3b1"
     899     top: "res3b1"
     900     name: "res3b1_relu"
     901     type: "ReLU"
     902 }
     903 
     904 layer {
     905     bottom: "res3b1"
     906     top: "res3b2_branch2a"
     907     name: "res3b2_branch2a"
     908     type: "Convolution"
     909     convolution_param {
     910         num_output: 128
     911         kernel_size: 1
     912         pad: 0
     913         stride: 1
     914         weight_filler {
     915             type: "msra"
     916         }
     917         bias_term: false
     918 
     919     }
     920 }
     921 
     922 layer {
     923     bottom: "res3b2_branch2a"
     924     top: "res3b2_branch2a"
     925     name: "bn3b2_branch2a"
     926     type: "BatchNorm"
     927     batch_norm_param {
     928         use_global_stats: false
     929     }
     930 }
     931 
     932 layer {
     933     bottom: "res3b2_branch2a"
     934     top: "res3b2_branch2a"
     935     name: "scale3b2_branch2a"
     936     type: "Scale"
     937     scale_param {
     938         bias_term: true
     939     }
     940 }
     941 
     942 layer {
     943     bottom: "res3b2_branch2a"
     944     top: "res3b2_branch2a"
     945     name: "res3b2_branch2a_relu"
     946     type: "ReLU"
     947 }
     948 
     949 layer {
     950     bottom: "res3b2_branch2a"
     951     top: "res3b2_branch2b"
     952     name: "res3b2_branch2b"
     953     type: "Convolution"
     954     convolution_param {
     955         num_output: 128
     956         kernel_size: 3
     957         pad: 1
     958         stride: 1
     959         weight_filler {
     960             type: "msra"
     961         }
     962         bias_term: false
     963 
     964     }
     965 }
     966 
     967 layer {
     968     bottom: "res3b2_branch2b"
     969     top: "res3b2_branch2b"
     970     name: "bn3b2_branch2b"
     971     type: "BatchNorm"
     972     batch_norm_param {
     973         use_global_stats: false
     974     }
     975 }
     976 
     977 layer {
     978     bottom: "res3b2_branch2b"
     979     top: "res3b2_branch2b"
     980     name: "scale3b2_branch2b"
     981     type: "Scale"
     982     scale_param {
     983         bias_term: true
     984     }
     985 }
     986 
     987 layer {
     988     bottom: "res3b2_branch2b"
     989     top: "res3b2_branch2b"
     990     name: "res3b2_branch2b_relu"
     991     type: "ReLU"
     992 }
     993 
     994 layer {
     995     bottom: "res3b2_branch2b"
     996     top: "res3b2_branch2c"
     997     name: "res3b2_branch2c"
     998     type: "Convolution"
     999     convolution_param {
    1000         num_output: 512
    1001         kernel_size: 1
    1002         pad: 0
    1003         stride: 1
    1004         weight_filler {
    1005             type: "msra"
    1006         }
    1007         bias_term: false
    1008 
    1009     }
    1010 }
    1011 
    1012 layer {
    1013     bottom: "res3b2_branch2c"
    1014     top: "res3b2_branch2c"
    1015     name: "bn3b2_branch2c"
    1016     type: "BatchNorm"
    1017     batch_norm_param {
    1018         use_global_stats: false
    1019     }
    1020 }
    1021 
    1022 layer {
    1023     bottom: "res3b2_branch2c"
    1024     top: "res3b2_branch2c"
    1025     name: "scale3b2_branch2c"
    1026     type: "Scale"
    1027     scale_param {
    1028         bias_term: true
    1029     }
    1030 }
    1031 
    1032 layer {
    1033     bottom: "res3b1"
    1034     bottom: "res3b2_branch2c"
    1035     top: "res3b2"
    1036     name: "res3b2"
    1037     type: "Eltwise"
    1038     eltwise_param {
    1039         operation: SUM
    1040     }
    1041 }
    1042 
    1043 layer {
    1044     bottom: "res3b2"
    1045     top: "res3b2"
    1046     name: "res3b2_relu"
    1047     type: "ReLU"
    1048 }
    1049 
    1050 layer {
    1051     bottom: "res3b2"
    1052     top: "res3b3_branch2a"
    1053     name: "res3b3_branch2a"
    1054     type: "Convolution"
    1055     convolution_param {
    1056         num_output: 128
    1057         kernel_size: 1
    1058         pad: 0
    1059         stride: 1
    1060         weight_filler {
    1061             type: "msra"
    1062         }
    1063         bias_term: false
    1064 
    1065     }
    1066 }
    1067 
    1068 layer {
    1069     bottom: "res3b3_branch2a"
    1070     top: "res3b3_branch2a"
    1071     name: "bn3b3_branch2a"
    1072     type: "BatchNorm"
    1073     batch_norm_param {
    1074         use_global_stats: false
    1075     }
    1076 }
    1077 
    1078 layer {
    1079     bottom: "res3b3_branch2a"
    1080     top: "res3b3_branch2a"
    1081     name: "scale3b3_branch2a"
    1082     type: "Scale"
    1083     scale_param {
    1084         bias_term: true
    1085     }
    1086 }
    1087 
    1088 layer {
    1089     bottom: "res3b3_branch2a"
    1090     top: "res3b3_branch2a"
    1091     name: "res3b3_branch2a_relu"
    1092     type: "ReLU"
    1093 }
    1094 
    1095 layer {
    1096     bottom: "res3b3_branch2a"
    1097     top: "res3b3_branch2b"
    1098     name: "res3b3_branch2b"
    1099     type: "Convolution"
    1100     convolution_param {
    1101         num_output: 128
    1102         kernel_size: 3
    1103         pad: 1
    1104         stride: 1
    1105         weight_filler {
    1106             type: "msra"
    1107         }
    1108         bias_term: false
    1109 
    1110     }
    1111 }
    1112 
    1113 layer {
    1114     bottom: "res3b3_branch2b"
    1115     top: "res3b3_branch2b"
    1116     name: "bn3b3_branch2b"
    1117     type: "BatchNorm"
    1118     batch_norm_param {
    1119         use_global_stats: false
    1120     }
    1121 }
    1122 
    1123 layer {
    1124     bottom: "res3b3_branch2b"
    1125     top: "res3b3_branch2b"
    1126     name: "scale3b3_branch2b"
    1127     type: "Scale"
    1128     scale_param {
    1129         bias_term: true
    1130     }
    1131 }
    1132 
    1133 layer {
    1134     bottom: "res3b3_branch2b"
    1135     top: "res3b3_branch2b"
    1136     name: "res3b3_branch2b_relu"
    1137     type: "ReLU"
    1138 }
    1139 
    1140 layer {
    1141     bottom: "res3b3_branch2b"
    1142     top: "res3b3_branch2c"
    1143     name: "res3b3_branch2c"
    1144     type: "Convolution"
    1145     convolution_param {
    1146         num_output: 512
    1147         kernel_size: 1
    1148         pad: 0
    1149         stride: 1
    1150         weight_filler {
    1151             type: "msra"
    1152         }
    1153         bias_term: false
    1154 
    1155     }
    1156 }
    1157 
    1158 layer {
    1159     bottom: "res3b3_branch2c"
    1160     top: "res3b3_branch2c"
    1161     name: "bn3b3_branch2c"
    1162     type: "BatchNorm"
    1163     batch_norm_param {
    1164         use_global_stats: false
    1165     }
    1166 }
    1167 
    1168 layer {
    1169     bottom: "res3b3_branch2c"
    1170     top: "res3b3_branch2c"
    1171     name: "scale3b3_branch2c"
    1172     type: "Scale"
    1173     scale_param {
    1174         bias_term: true
    1175     }
    1176 }
    1177 
    1178 layer {
    1179     bottom: "res3b2"
    1180     bottom: "res3b3_branch2c"
    1181     top: "res3b3"
    1182     name: "res3b3"
    1183     type: "Eltwise"
    1184     eltwise_param {
    1185         operation: SUM
    1186     }
    1187 }
    1188 
    1189 layer {
    1190     bottom: "res3b3"
    1191     top: "res3b3"
    1192     name: "res3b3_relu"
    1193     type: "ReLU"
    1194 }
    1195 
    1196 layer {
    1197     bottom: "res3b3"
    1198     top: "res3b4_branch2a"
    1199     name: "res3b4_branch2a"
    1200     type: "Convolution"
    1201     convolution_param {
    1202         num_output: 128
    1203         kernel_size: 1
    1204         pad: 0
    1205         stride: 1
    1206         weight_filler {
    1207             type: "msra"
    1208         }
    1209         bias_term: false
    1210 
    1211     }
    1212 }
    1213 
    1214 layer {
    1215     bottom: "res3b4_branch2a"
    1216     top: "res3b4_branch2a"
    1217     name: "bn3b4_branch2a"
    1218     type: "BatchNorm"
    1219     batch_norm_param {
    1220         use_global_stats: false
    1221     }
    1222 }
    1223 
    1224 layer {
    1225     bottom: "res3b4_branch2a"
    1226     top: "res3b4_branch2a"
    1227     name: "scale3b4_branch2a"
    1228     type: "Scale"
    1229     scale_param {
    1230         bias_term: true
    1231     }
    1232 }
    1233 
    1234 layer {
    1235     bottom: "res3b4_branch2a"
    1236     top: "res3b4_branch2a"
    1237     name: "res3b4_branch2a_relu"
    1238     type: "ReLU"
    1239 }
    1240 
    1241 layer {
    1242     bottom: "res3b4_branch2a"
    1243     top: "res3b4_branch2b"
    1244     name: "res3b4_branch2b"
    1245     type: "Convolution"
    1246     convolution_param {
    1247         num_output: 128
    1248         kernel_size: 3
    1249         pad: 1
    1250         stride: 1
    1251         weight_filler {
    1252             type: "msra"
    1253         }
    1254         bias_term: false
    1255 
    1256     }
    1257 }
    1258 
    1259 layer {
    1260     bottom: "res3b4_branch2b"
    1261     top: "res3b4_branch2b"
    1262     name: "bn3b4_branch2b"
    1263     type: "BatchNorm"
    1264     batch_norm_param {
    1265         use_global_stats: false
    1266     }
    1267 }
    1268 
    1269 layer {
    1270     bottom: "res3b4_branch2b"
    1271     top: "res3b4_branch2b"
    1272     name: "scale3b4_branch2b"
    1273     type: "Scale"
    1274     scale_param {
    1275         bias_term: true
    1276     }
    1277 }
    1278 
    1279 layer {
    1280     bottom: "res3b4_branch2b"
    1281     top: "res3b4_branch2b"
    1282     name: "res3b4_branch2b_relu"
    1283     type: "ReLU"
    1284 }
    1285 
    1286 layer {
    1287     bottom: "res3b4_branch2b"
    1288     top: "res3b4_branch2c"
    1289     name: "res3b4_branch2c"
    1290     type: "Convolution"
    1291     convolution_param {
    1292         num_output: 512
    1293         kernel_size: 1
    1294         pad: 0
    1295         stride: 1
    1296         weight_filler {
    1297             type: "msra"
    1298         }
    1299         bias_term: false
    1300 
    1301     }
    1302 }
    1303 
    1304 layer {
    1305     bottom: "res3b4_branch2c"
    1306     top: "res3b4_branch2c"
    1307     name: "bn3b4_branch2c"
    1308     type: "BatchNorm"
    1309     batch_norm_param {
    1310         use_global_stats: false
    1311     }
    1312 }
    1313 
    1314 layer {
    1315     bottom: "res3b4_branch2c"
    1316     top: "res3b4_branch2c"
    1317     name: "scale3b4_branch2c"
    1318     type: "Scale"
    1319     scale_param {
    1320         bias_term: true
    1321     }
    1322 }
    1323 
    1324 layer {
    1325     bottom: "res3b3"
    1326     bottom: "res3b4_branch2c"
    1327     top: "res3b4"
    1328     name: "res3b4"
    1329     type: "Eltwise"
    1330     eltwise_param {
    1331         operation: SUM
    1332     }
    1333 }
    1334 
    1335 layer {
    1336     bottom: "res3b4"
    1337     top: "res3b4"
    1338     name: "res3b4_relu"
    1339     type: "ReLU"
    1340 }
    1341 
    1342 layer {
    1343     bottom: "res3b4"
    1344     top: "res3b5_branch2a"
    1345     name: "res3b5_branch2a"
    1346     type: "Convolution"
    1347     convolution_param {
    1348         num_output: 128
    1349         kernel_size: 1
    1350         pad: 0
    1351         stride: 1
    1352         weight_filler {
    1353             type: "msra"
    1354         }
    1355         bias_term: false
    1356 
    1357     }
    1358 }
    1359 
    1360 layer {
    1361     bottom: "res3b5_branch2a"
    1362     top: "res3b5_branch2a"
    1363     name: "bn3b5_branch2a"
    1364     type: "BatchNorm"
    1365     batch_norm_param {
    1366         use_global_stats: false
    1367     }
    1368 }
    1369 
    1370 layer {
    1371     bottom: "res3b5_branch2a"
    1372     top: "res3b5_branch2a"
    1373     name: "scale3b5_branch2a"
    1374     type: "Scale"
    1375     scale_param {
    1376         bias_term: true
    1377     }
    1378 }
    1379 
    1380 layer {
    1381     bottom: "res3b5_branch2a"
    1382     top: "res3b5_branch2a"
    1383     name: "res3b5_branch2a_relu"
    1384     type: "ReLU"
    1385 }
    1386 
    1387 layer {
    1388     bottom: "res3b5_branch2a"
    1389     top: "res3b5_branch2b"
    1390     name: "res3b5_branch2b"
    1391     type: "Convolution"
    1392     convolution_param {
    1393         num_output: 128
    1394         kernel_size: 3
    1395         pad: 1
    1396         stride: 1
    1397         weight_filler {
    1398             type: "msra"
    1399         }
    1400         bias_term: false
    1401 
    1402     }
    1403 }
    1404 
    1405 layer {
    1406     bottom: "res3b5_branch2b"
    1407     top: "res3b5_branch2b"
    1408     name: "bn3b5_branch2b"
    1409     type: "BatchNorm"
    1410     batch_norm_param {
    1411         use_global_stats: false
    1412     }
    1413 }
    1414 
    1415 layer {
    1416     bottom: "res3b5_branch2b"
    1417     top: "res3b5_branch2b"
    1418     name: "scale3b5_branch2b"
    1419     type: "Scale"
    1420     scale_param {
    1421         bias_term: true
    1422     }
    1423 }
    1424 
    1425 layer {
    1426     bottom: "res3b5_branch2b"
    1427     top: "res3b5_branch2b"
    1428     name: "res3b5_branch2b_relu"
    1429     type: "ReLU"
    1430 }
    1431 
    1432 layer {
    1433     bottom: "res3b5_branch2b"
    1434     top: "res3b5_branch2c"
    1435     name: "res3b5_branch2c"
    1436     type: "Convolution"
    1437     convolution_param {
    1438         num_output: 512
    1439         kernel_size: 1
    1440         pad: 0
    1441         stride: 1
    1442         weight_filler {
    1443             type: "msra"
    1444         }
    1445         bias_term: false
    1446 
    1447     }
    1448 }
    1449 
    1450 layer {
    1451     bottom: "res3b5_branch2c"
    1452     top: "res3b5_branch2c"
    1453     name: "bn3b5_branch2c"
    1454     type: "BatchNorm"
    1455     batch_norm_param {
    1456         use_global_stats: false
    1457     }
    1458 }
    1459 
    1460 layer {
    1461     bottom: "res3b5_branch2c"
    1462     top: "res3b5_branch2c"
    1463     name: "scale3b5_branch2c"
    1464     type: "Scale"
    1465     scale_param {
    1466         bias_term: true
    1467     }
    1468 }
    1469 
    1470 layer {
    1471     bottom: "res3b4"
    1472     bottom: "res3b5_branch2c"
    1473     top: "res3b5"
    1474     name: "res3b5"
    1475     type: "Eltwise"
    1476     eltwise_param {
    1477         operation: SUM
    1478     }
    1479 }
    1480 
    1481 layer {
    1482     bottom: "res3b5"
    1483     top: "res3b5"
    1484     name: "res3b5_relu"
    1485     type: "ReLU"
    1486 }
    1487 
    1488 layer {
    1489     bottom: "res3b5"
    1490     top: "res3b6_branch2a"
    1491     name: "res3b6_branch2a"
    1492     type: "Convolution"
    1493     convolution_param {
    1494         num_output: 128
    1495         kernel_size: 1
    1496         pad: 0
    1497         stride: 1
    1498         weight_filler {
    1499             type: "msra"
    1500         }
    1501         bias_term: false
    1502 
    1503     }
    1504 }
    1505 
    1506 layer {
    1507     bottom: "res3b6_branch2a"
    1508     top: "res3b6_branch2a"
    1509     name: "bn3b6_branch2a"
    1510     type: "BatchNorm"
    1511     batch_norm_param {
    1512         use_global_stats: false
    1513     }
    1514 }
    1515 
    1516 layer {
    1517     bottom: "res3b6_branch2a"
    1518     top: "res3b6_branch2a"
    1519     name: "scale3b6_branch2a"
    1520     type: "Scale"
    1521     scale_param {
    1522         bias_term: true
    1523     }
    1524 }
    1525 
    1526 layer {
    1527     bottom: "res3b6_branch2a"
    1528     top: "res3b6_branch2a"
    1529     name: "res3b6_branch2a_relu"
    1530     type: "ReLU"
    1531 }
    1532 
    1533 layer {
    1534     bottom: "res3b6_branch2a"
    1535     top: "res3b6_branch2b"
    1536     name: "res3b6_branch2b"
    1537     type: "Convolution"
    1538     convolution_param {
    1539         num_output: 128
    1540         kernel_size: 3
    1541         pad: 1
    1542         stride: 1
    1543         weight_filler {
    1544             type: "msra"
    1545         }
    1546         bias_term: false
    1547 
    1548     }
    1549 }
    1550 
    1551 layer {
    1552     bottom: "res3b6_branch2b"
    1553     top: "res3b6_branch2b"
    1554     name: "bn3b6_branch2b"
    1555     type: "BatchNorm"
    1556     batch_norm_param {
    1557         use_global_stats: false
    1558     }
    1559 }
    1560 
    1561 layer {
    1562     bottom: "res3b6_branch2b"
    1563     top: "res3b6_branch2b"
    1564     name: "scale3b6_branch2b"
    1565     type: "Scale"
    1566     scale_param {
    1567         bias_term: true
    1568     }
    1569 }
    1570 
    1571 layer {
    1572     bottom: "res3b6_branch2b"
    1573     top: "res3b6_branch2b"
    1574     name: "res3b6_branch2b_relu"
    1575     type: "ReLU"
    1576 }
    1577 
    1578 layer {
    1579     bottom: "res3b6_branch2b"
    1580     top: "res3b6_branch2c"
    1581     name: "res3b6_branch2c"
    1582     type: "Convolution"
    1583     convolution_param {
    1584         num_output: 512
    1585         kernel_size: 1
    1586         pad: 0
    1587         stride: 1
    1588         weight_filler {
    1589             type: "msra"
    1590         }
    1591         bias_term: false
    1592 
    1593     }
    1594 }
    1595 
    1596 layer {
    1597     bottom: "res3b6_branch2c"
    1598     top: "res3b6_branch2c"
    1599     name: "bn3b6_branch2c"
    1600     type: "BatchNorm"
    1601     batch_norm_param {
    1602         use_global_stats: false
    1603     }
    1604 }
    1605 
    1606 layer {
    1607     bottom: "res3b6_branch2c"
    1608     top: "res3b6_branch2c"
    1609     name: "scale3b6_branch2c"
    1610     type: "Scale"
    1611     scale_param {
    1612         bias_term: true
    1613     }
    1614 }
    1615 
    1616 layer {
    1617     bottom: "res3b5"
    1618     bottom: "res3b6_branch2c"
    1619     top: "res3b6"
    1620     name: "res3b6"
    1621     type: "Eltwise"
    1622     eltwise_param {
    1623         operation: SUM
    1624     }
    1625 }
    1626 
    1627 layer {
    1628     bottom: "res3b6"
    1629     top: "res3b6"
    1630     name: "res3b6_relu"
    1631     type: "ReLU"
    1632 }
    1633 
    1634 layer {
    1635     bottom: "res3b6"
    1636     top: "res3b7_branch2a"
    1637     name: "res3b7_branch2a"
    1638     type: "Convolution"
    1639     convolution_param {
    1640         num_output: 128
    1641         kernel_size: 1
    1642         pad: 0
    1643         stride: 1
    1644         weight_filler {
    1645             type: "msra"
    1646         }
    1647         bias_term: false
    1648 
    1649     }
    1650 }
    1651 
    1652 layer {
    1653     bottom: "res3b7_branch2a"
    1654     top: "res3b7_branch2a"
    1655     name: "bn3b7_branch2a"
    1656     type: "BatchNorm"
    1657     batch_norm_param {
    1658         use_global_stats: false
    1659     }
    1660 }
    1661 
    1662 layer {
    1663     bottom: "res3b7_branch2a"
    1664     top: "res3b7_branch2a"
    1665     name: "scale3b7_branch2a"
    1666     type: "Scale"
    1667     scale_param {
    1668         bias_term: true
    1669     }
    1670 }
    1671 
    1672 layer {
    1673     bottom: "res3b7_branch2a"
    1674     top: "res3b7_branch2a"
    1675     name: "res3b7_branch2a_relu"
    1676     type: "ReLU"
    1677 }
    1678 
    1679 layer {
    1680     bottom: "res3b7_branch2a"
    1681     top: "res3b7_branch2b"
    1682     name: "res3b7_branch2b"
    1683     type: "Convolution"
    1684     convolution_param {
    1685         num_output: 128
    1686         kernel_size: 3
    1687         pad: 1
    1688         stride: 1
    1689         weight_filler {
    1690             type: "msra"
    1691         }
    1692         bias_term: false
    1693 
    1694     }
    1695 }
    1696 
    1697 layer {
    1698     bottom: "res3b7_branch2b"
    1699     top: "res3b7_branch2b"
    1700     name: "bn3b7_branch2b"
    1701     type: "BatchNorm"
    1702     batch_norm_param {
    1703         use_global_stats: false
    1704     }
    1705 }
    1706 
    1707 layer {
    1708     bottom: "res3b7_branch2b"
    1709     top: "res3b7_branch2b"
    1710     name: "scale3b7_branch2b"
    1711     type: "Scale"
    1712     scale_param {
    1713         bias_term: true
    1714     }
    1715 }
    1716 
    1717 layer {
    1718     bottom: "res3b7_branch2b"
    1719     top: "res3b7_branch2b"
    1720     name: "res3b7_branch2b_relu"
    1721     type: "ReLU"
    1722 }
    1723 
    1724 layer {
    1725     bottom: "res3b7_branch2b"
    1726     top: "res3b7_branch2c"
    1727     name: "res3b7_branch2c"
    1728     type: "Convolution"
    1729     convolution_param {
    1730         num_output: 512
    1731         kernel_size: 1
    1732         pad: 0
    1733         stride: 1
    1734         weight_filler {
    1735             type: "msra"
    1736         }
    1737         bias_term: false
    1738 
    1739     }
    1740 }
    1741 
    1742 layer {
    1743     bottom: "res3b7_branch2c"
    1744     top: "res3b7_branch2c"
    1745     name: "bn3b7_branch2c"
    1746     type: "BatchNorm"
    1747     batch_norm_param {
    1748         use_global_stats: false
    1749     }
    1750 }
    1751 
    1752 layer {
    1753     bottom: "res3b7_branch2c"
    1754     top: "res3b7_branch2c"
    1755     name: "scale3b7_branch2c"
    1756     type: "Scale"
    1757     scale_param {
    1758         bias_term: true
    1759     }
    1760 }
    1761 
    1762 layer {
    1763     bottom: "res3b6"
    1764     bottom: "res3b7_branch2c"
    1765     top: "res3b7"
    1766     name: "res3b7"
    1767     type: "Eltwise"
    1768     eltwise_param {
    1769         operation: SUM
    1770     }
    1771 }
    1772 
    1773 layer {
    1774     bottom: "res3b7"
    1775     top: "res3b7"
    1776     name: "res3b7_relu"
    1777     type: "ReLU"
    1778 }
    1779 
    1780 layer {
    1781     bottom: "res3b7"
    1782     top: "res4a_branch1"
    1783     name: "res4a_branch1"
    1784     type: "Convolution"
    1785     convolution_param {
    1786         num_output: 1024
    1787         kernel_size: 1
    1788         pad: 0
    1789         stride: 2
    1790         weight_filler {
    1791             type: "msra"
    1792         }
    1793         bias_term: false
    1794 
    1795     }
    1796 }
    1797 
    1798 layer {
    1799     bottom: "res4a_branch1"
    1800     top: "res4a_branch1"
    1801     name: "bn4a_branch1"
    1802     type: "BatchNorm"
    1803     batch_norm_param {
    1804         use_global_stats: false
    1805     }
    1806 }
    1807 
    1808 layer {
    1809     bottom: "res4a_branch1"
    1810     top: "res4a_branch1"
    1811     name: "scale4a_branch1"
    1812     type: "Scale"
    1813     scale_param {
    1814         bias_term: true
    1815     }
    1816 }
    1817 
    1818 layer {
    1819     bottom: "res3b7"
    1820     top: "res4a_branch2a"
    1821     name: "res4a_branch2a"
    1822     type: "Convolution"
    1823     convolution_param {
    1824         num_output: 256
    1825         kernel_size: 1
    1826         pad: 0
    1827         stride: 2
    1828         weight_filler {
    1829             type: "msra"
    1830         }
    1831         bias_term: false
    1832 
    1833     }
    1834 }
    1835 
    1836 layer {
    1837     bottom: "res4a_branch2a"
    1838     top: "res4a_branch2a"
    1839     name: "bn4a_branch2a"
    1840     type: "BatchNorm"
    1841     batch_norm_param {
    1842         use_global_stats: false
    1843     }
    1844 }
    1845 
    1846 layer {
    1847     bottom: "res4a_branch2a"
    1848     top: "res4a_branch2a"
    1849     name: "scale4a_branch2a"
    1850     type: "Scale"
    1851     scale_param {
    1852         bias_term: true
    1853     }
    1854 }
    1855 
    1856 layer {
    1857     bottom: "res4a_branch2a"
    1858     top: "res4a_branch2a"
    1859     name: "res4a_branch2a_relu"
    1860     type: "ReLU"
    1861 }
    1862 
    1863 layer {
    1864     bottom: "res4a_branch2a"
    1865     top: "res4a_branch2b"
    1866     name: "res4a_branch2b"
    1867     type: "Convolution"
    1868     convolution_param {
    1869         num_output: 256
    1870         kernel_size: 3
    1871         pad: 1
    1872         stride: 1
    1873         weight_filler {
    1874             type: "msra"
    1875         }
    1876         bias_term: false
    1877 
    1878     }
    1879 }
    1880 
    1881 layer {
    1882     bottom: "res4a_branch2b"
    1883     top: "res4a_branch2b"
    1884     name: "bn4a_branch2b"
    1885     type: "BatchNorm"
    1886     batch_norm_param {
    1887         use_global_stats: false
    1888     }
    1889 }
    1890 
    1891 layer {
    1892     bottom: "res4a_branch2b"
    1893     top: "res4a_branch2b"
    1894     name: "scale4a_branch2b"
    1895     type: "Scale"
    1896     scale_param {
    1897         bias_term: true
    1898     }
    1899 }
    1900 
    1901 layer {
    1902     bottom: "res4a_branch2b"
    1903     top: "res4a_branch2b"
    1904     name: "res4a_branch2b_relu"
    1905     type: "ReLU"
    1906 }
    1907 
    1908 layer {
    1909     bottom: "res4a_branch2b"
    1910     top: "res4a_branch2c"
    1911     name: "res4a_branch2c"
    1912     type: "Convolution"
    1913     convolution_param {
    1914         num_output: 1024
    1915         kernel_size: 1
    1916         pad: 0
    1917         stride: 1
    1918         weight_filler {
    1919             type: "msra"
    1920         }
    1921         bias_term: false
    1922 
    1923     }
    1924 }
    1925 
    1926 layer {
    1927     bottom: "res4a_branch2c"
    1928     top: "res4a_branch2c"
    1929     name: "bn4a_branch2c"
    1930     type: "BatchNorm"
    1931     batch_norm_param {
    1932         use_global_stats: false
    1933     }
    1934 }
    1935 
    1936 layer {
    1937     bottom: "res4a_branch2c"
    1938     top: "res4a_branch2c"
    1939     name: "scale4a_branch2c"
    1940     type: "Scale"
    1941     scale_param {
    1942         bias_term: true
    1943     }
    1944 }
    1945 
    1946 layer {
    1947     bottom: "res4a_branch1"
    1948     bottom: "res4a_branch2c"
    1949     top: "res4a"
    1950     name: "res4a"
    1951     type: "Eltwise"
    1952     eltwise_param {
    1953         operation: SUM
    1954     }
    1955 }
    1956 
    1957 layer {
    1958     bottom: "res4a"
    1959     top: "res4a"
    1960     name: "res4a_relu"
    1961     type: "ReLU"
    1962 }
    1963 
    1964 layer {
    1965     bottom: "res4a"
    1966     top: "res4b1_branch2a"
    1967     name: "res4b1_branch2a"
    1968     type: "Convolution"
    1969     convolution_param {
    1970         num_output: 256
    1971         kernel_size: 1
    1972         pad: 0
    1973         stride: 1
    1974         weight_filler {
    1975             type: "msra"
    1976         }
    1977         bias_term: false
    1978 
    1979     }
    1980 }
    1981 
    1982 layer {
    1983     bottom: "res4b1_branch2a"
    1984     top: "res4b1_branch2a"
    1985     name: "bn4b1_branch2a"
    1986     type: "BatchNorm"
    1987     batch_norm_param {
    1988         use_global_stats: false
    1989     }
    1990 }
    1991 
    1992 layer {
    1993     bottom: "res4b1_branch2a"
    1994     top: "res4b1_branch2a"
    1995     name: "scale4b1_branch2a"
    1996     type: "Scale"
    1997     scale_param {
    1998         bias_term: true
    1999     }
    2000 }
    2001 
    2002 layer {
    2003     bottom: "res4b1_branch2a"
    2004     top: "res4b1_branch2a"
    2005     name: "res4b1_branch2a_relu"
    2006     type: "ReLU"
    2007 }
    2008 
    2009 layer {
    2010     bottom: "res4b1_branch2a"
    2011     top: "res4b1_branch2b"
    2012     name: "res4b1_branch2b"
    2013     type: "Convolution"
    2014     convolution_param {
    2015         num_output: 256
    2016         kernel_size: 3
    2017         pad: 1
    2018         stride: 1
    2019         weight_filler {
    2020             type: "msra"
    2021         }
    2022         bias_term: false
    2023 
    2024     }
    2025 }
    2026 
    2027 layer {
    2028     bottom: "res4b1_branch2b"
    2029     top: "res4b1_branch2b"
    2030     name: "bn4b1_branch2b"
    2031     type: "BatchNorm"
    2032     batch_norm_param {
    2033         use_global_stats: false
    2034     }
    2035 }
    2036 
    2037 layer {
    2038     bottom: "res4b1_branch2b"
    2039     top: "res4b1_branch2b"
    2040     name: "scale4b1_branch2b"
    2041     type: "Scale"
    2042     scale_param {
    2043         bias_term: true
    2044     }
    2045 }
    2046 
    2047 layer {
    2048     bottom: "res4b1_branch2b"
    2049     top: "res4b1_branch2b"
    2050     name: "res4b1_branch2b_relu"
    2051     type: "ReLU"
    2052 }
    2053 
    2054 layer {
    2055     bottom: "res4b1_branch2b"
    2056     top: "res4b1_branch2c"
    2057     name: "res4b1_branch2c"
    2058     type: "Convolution"
    2059     convolution_param {
    2060         num_output: 1024
    2061         kernel_size: 1
    2062         pad: 0
    2063         stride: 1
    2064         weight_filler {
    2065             type: "msra"
    2066         }
    2067         bias_term: false
    2068 
    2069     }
    2070 }
    2071 
    2072 layer {
    2073     bottom: "res4b1_branch2c"
    2074     top: "res4b1_branch2c"
    2075     name: "bn4b1_branch2c"
    2076     type: "BatchNorm"
    2077     batch_norm_param {
    2078         use_global_stats: false
    2079     }
    2080 }
    2081 
    2082 layer {
    2083     bottom: "res4b1_branch2c"
    2084     top: "res4b1_branch2c"
    2085     name: "scale4b1_branch2c"
    2086     type: "Scale"
    2087     scale_param {
    2088         bias_term: true
    2089     }
    2090 }
    2091 
    2092 layer {
    2093     bottom: "res4a"
    2094     bottom: "res4b1_branch2c"
    2095     top: "res4b1"
    2096     name: "res4b1"
    2097     type: "Eltwise"
    2098     eltwise_param {
    2099         operation: SUM
    2100     }
    2101 }
    2102 
    2103 layer {
    2104     bottom: "res4b1"
    2105     top: "res4b1"
    2106     name: "res4b1_relu"
    2107     type: "ReLU"
    2108 }
    2109 
    2110 layer {
    2111     bottom: "res4b1"
    2112     top: "res4b2_branch2a"
    2113     name: "res4b2_branch2a"
    2114     type: "Convolution"
    2115     convolution_param {
    2116         num_output: 256
    2117         kernel_size: 1
    2118         pad: 0
    2119         stride: 1
    2120         weight_filler {
    2121             type: "msra"
    2122         }
    2123         bias_term: false
    2124 
    2125     }
    2126 }
    2127 
    2128 layer {
    2129     bottom: "res4b2_branch2a"
    2130     top: "res4b2_branch2a"
    2131     name: "bn4b2_branch2a"
    2132     type: "BatchNorm"
    2133     batch_norm_param {
    2134         use_global_stats: false
    2135     }
    2136 }
    2137 
    2138 layer {
    2139     bottom: "res4b2_branch2a"
    2140     top: "res4b2_branch2a"
    2141     name: "scale4b2_branch2a"
    2142     type: "Scale"
    2143     scale_param {
    2144         bias_term: true
    2145     }
    2146 }
    2147 
    2148 layer {
    2149     bottom: "res4b2_branch2a"
    2150     top: "res4b2_branch2a"
    2151     name: "res4b2_branch2a_relu"
    2152     type: "ReLU"
    2153 }
    2154 
    2155 layer {
    2156     bottom: "res4b2_branch2a"
    2157     top: "res4b2_branch2b"
    2158     name: "res4b2_branch2b"
    2159     type: "Convolution"
    2160     convolution_param {
    2161         num_output: 256
    2162         kernel_size: 3
    2163         pad: 1
    2164         stride: 1
    2165         weight_filler {
    2166             type: "msra"
    2167         }
    2168         bias_term: false
    2169 
    2170     }
    2171 }
    2172 
    2173 layer {
    2174     bottom: "res4b2_branch2b"
    2175     top: "res4b2_branch2b"
    2176     name: "bn4b2_branch2b"
    2177     type: "BatchNorm"
    2178     batch_norm_param {
    2179         use_global_stats: false
    2180     }
    2181 }
    2182 
    2183 layer {
    2184     bottom: "res4b2_branch2b"
    2185     top: "res4b2_branch2b"
    2186     name: "scale4b2_branch2b"
    2187     type: "Scale"
    2188     scale_param {
    2189         bias_term: true
    2190     }
    2191 }
    2192 
    2193 layer {
    2194     bottom: "res4b2_branch2b"
    2195     top: "res4b2_branch2b"
    2196     name: "res4b2_branch2b_relu"
    2197     type: "ReLU"
    2198 }
    2199 
    2200 layer {
    2201     bottom: "res4b2_branch2b"
    2202     top: "res4b2_branch2c"
    2203     name: "res4b2_branch2c"
    2204     type: "Convolution"
    2205     convolution_param {
    2206         num_output: 1024
    2207         kernel_size: 1
    2208         pad: 0
    2209         stride: 1
    2210         weight_filler {
    2211             type: "msra"
    2212         }
    2213         bias_term: false
    2214 
    2215     }
    2216 }
    2217 
    2218 layer {
    2219     bottom: "res4b2_branch2c"
    2220     top: "res4b2_branch2c"
    2221     name: "bn4b2_branch2c"
    2222     type: "BatchNorm"
    2223     batch_norm_param {
    2224         use_global_stats: false
    2225     }
    2226 }
    2227 
    2228 layer {
    2229     bottom: "res4b2_branch2c"
    2230     top: "res4b2_branch2c"
    2231     name: "scale4b2_branch2c"
    2232     type: "Scale"
    2233     scale_param {
    2234         bias_term: true
    2235     }
    2236 }
    2237 
    2238 layer {
    2239     bottom: "res4b1"
    2240     bottom: "res4b2_branch2c"
    2241     top: "res4b2"
    2242     name: "res4b2"
    2243     type: "Eltwise"
    2244     eltwise_param {
    2245         operation: SUM
    2246     }
    2247 }
    2248 
    2249 layer {
    2250     bottom: "res4b2"
    2251     top: "res4b2"
    2252     name: "res4b2_relu"
    2253     type: "ReLU"
    2254 }
    2255 
    2256 layer {
    2257     bottom: "res4b2"
    2258     top: "res4b3_branch2a"
    2259     name: "res4b3_branch2a"
    2260     type: "Convolution"
    2261     convolution_param {
    2262         num_output: 256
    2263         kernel_size: 1
    2264         pad: 0
    2265         stride: 1
    2266         weight_filler {
    2267             type: "msra"
    2268         }
    2269         bias_term: false
    2270 
    2271     }
    2272 }
    2273 
    2274 layer {
    2275     bottom: "res4b3_branch2a"
    2276     top: "res4b3_branch2a"
    2277     name: "bn4b3_branch2a"
    2278     type: "BatchNorm"
    2279     batch_norm_param {
    2280         use_global_stats: false
    2281     }
    2282 }
    2283 
    2284 layer {
    2285     bottom: "res4b3_branch2a"
    2286     top: "res4b3_branch2a"
    2287     name: "scale4b3_branch2a"
    2288     type: "Scale"
    2289     scale_param {
    2290         bias_term: true
    2291     }
    2292 }
    2293 
    2294 layer {
    2295     bottom: "res4b3_branch2a"
    2296     top: "res4b3_branch2a"
    2297     name: "res4b3_branch2a_relu"
    2298     type: "ReLU"
    2299 }
    2300 
    2301 layer {
    2302     bottom: "res4b3_branch2a"
    2303     top: "res4b3_branch2b"
    2304     name: "res4b3_branch2b"
    2305     type: "Convolution"
    2306     convolution_param {
    2307         num_output: 256
    2308         kernel_size: 3
    2309         pad: 1
    2310         stride: 1
    2311         weight_filler {
    2312             type: "msra"
    2313         }
    2314         bias_term: false
    2315 
    2316     }
    2317 }
    2318 
    2319 layer {
    2320     bottom: "res4b3_branch2b"
    2321     top: "res4b3_branch2b"
    2322     name: "bn4b3_branch2b"
    2323     type: "BatchNorm"
    2324     batch_norm_param {
    2325         use_global_stats: false
    2326     }
    2327 }
    2328 
    2329 layer {
    2330     bottom: "res4b3_branch2b"
    2331     top: "res4b3_branch2b"
    2332     name: "scale4b3_branch2b"
    2333     type: "Scale"
    2334     scale_param {
    2335         bias_term: true
    2336     }
    2337 }
    2338 
    2339 layer {
    2340     bottom: "res4b3_branch2b"
    2341     top: "res4b3_branch2b"
    2342     name: "res4b3_branch2b_relu"
    2343     type: "ReLU"
    2344 }
    2345 
    2346 layer {
    2347     bottom: "res4b3_branch2b"
    2348     top: "res4b3_branch2c"
    2349     name: "res4b3_branch2c"
    2350     type: "Convolution"
    2351     convolution_param {
    2352         num_output: 1024
    2353         kernel_size: 1
    2354         pad: 0
    2355         stride: 1
    2356         weight_filler {
    2357             type: "msra"
    2358         }
    2359         bias_term: false
    2360 
    2361     }
    2362 }
    2363 
    2364 layer {
    2365     bottom: "res4b3_branch2c"
    2366     top: "res4b3_branch2c"
    2367     name: "bn4b3_branch2c"
    2368     type: "BatchNorm"
    2369     batch_norm_param {
    2370         use_global_stats: false
    2371     }
    2372 }
    2373 
    2374 layer {
    2375     bottom: "res4b3_branch2c"
    2376     top: "res4b3_branch2c"
    2377     name: "scale4b3_branch2c"
    2378     type: "Scale"
    2379     scale_param {
    2380         bias_term: true
    2381     }
    2382 }
    2383 
    2384 layer {
    2385     bottom: "res4b2"
    2386     bottom: "res4b3_branch2c"
    2387     top: "res4b3"
    2388     name: "res4b3"
    2389     type: "Eltwise"
    2390     eltwise_param {
    2391         operation: SUM
    2392     }
    2393 }
    2394 
    2395 layer {
    2396     bottom: "res4b3"
    2397     top: "res4b3"
    2398     name: "res4b3_relu"
    2399     type: "ReLU"
    2400 }
    2401 
    2402 layer {
    2403     bottom: "res4b3"
    2404     top: "res4b4_branch2a"
    2405     name: "res4b4_branch2a"
    2406     type: "Convolution"
    2407     convolution_param {
    2408         num_output: 256
    2409         kernel_size: 1
    2410         pad: 0
    2411         stride: 1
    2412         weight_filler {
    2413             type: "msra"
    2414         }
    2415         bias_term: false
    2416 
    2417     }
    2418 }
    2419 
    2420 layer {
    2421     bottom: "res4b4_branch2a"
    2422     top: "res4b4_branch2a"
    2423     name: "bn4b4_branch2a"
    2424     type: "BatchNorm"
    2425     batch_norm_param {
    2426         use_global_stats: false
    2427     }
    2428 }
    2429 
    2430 layer {
    2431     bottom: "res4b4_branch2a"
    2432     top: "res4b4_branch2a"
    2433     name: "scale4b4_branch2a"
    2434     type: "Scale"
    2435     scale_param {
    2436         bias_term: true
    2437     }
    2438 }
    2439 
    2440 layer {
    2441     bottom: "res4b4_branch2a"
    2442     top: "res4b4_branch2a"
    2443     name: "res4b4_branch2a_relu"
    2444     type: "ReLU"
    2445 }
    2446 
    2447 layer {
    2448     bottom: "res4b4_branch2a"
    2449     top: "res4b4_branch2b"
    2450     name: "res4b4_branch2b"
    2451     type: "Convolution"
    2452     convolution_param {
    2453         num_output: 256
    2454         kernel_size: 3
    2455         pad: 1
    2456         stride: 1
    2457         weight_filler {
    2458             type: "msra"
    2459         }
    2460         bias_term: false
    2461 
    2462     }
    2463 }
    2464 
    2465 layer {
    2466     bottom: "res4b4_branch2b"
    2467     top: "res4b4_branch2b"
    2468     name: "bn4b4_branch2b"
    2469     type: "BatchNorm"
    2470     batch_norm_param {
    2471         use_global_stats: false
    2472     }
    2473 }
    2474 
    2475 layer {
    2476     bottom: "res4b4_branch2b"
    2477     top: "res4b4_branch2b"
    2478     name: "scale4b4_branch2b"
    2479     type: "Scale"
    2480     scale_param {
    2481         bias_term: true
    2482     }
    2483 }
    2484 
    2485 layer {
    2486     bottom: "res4b4_branch2b"
    2487     top: "res4b4_branch2b"
    2488     name: "res4b4_branch2b_relu"
    2489     type: "ReLU"
    2490 }
    2491 
    2492 layer {
    2493     bottom: "res4b4_branch2b"
    2494     top: "res4b4_branch2c"
    2495     name: "res4b4_branch2c"
    2496     type: "Convolution"
    2497     convolution_param {
    2498         num_output: 1024
    2499         kernel_size: 1
    2500         pad: 0
    2501         stride: 1
    2502         weight_filler {
    2503             type: "msra"
    2504         }
    2505         bias_term: false
    2506 
    2507     }
    2508 }
    2509 
    2510 layer {
    2511     bottom: "res4b4_branch2c"
    2512     top: "res4b4_branch2c"
    2513     name: "bn4b4_branch2c"
    2514     type: "BatchNorm"
    2515     batch_norm_param {
    2516         use_global_stats: false
    2517     }
    2518 }
    2519 
    2520 layer {
    2521     bottom: "res4b4_branch2c"
    2522     top: "res4b4_branch2c"
    2523     name: "scale4b4_branch2c"
    2524     type: "Scale"
    2525     scale_param {
    2526         bias_term: true
    2527     }
    2528 }
    2529 
    2530 layer {
    2531     bottom: "res4b3"
    2532     bottom: "res4b4_branch2c"
    2533     top: "res4b4"
    2534     name: "res4b4"
    2535     type: "Eltwise"
    2536     eltwise_param {
    2537         operation: SUM
    2538     }
    2539 }
    2540 
    2541 layer {
    2542     bottom: "res4b4"
    2543     top: "res4b4"
    2544     name: "res4b4_relu"
    2545     type: "ReLU"
    2546 }
    2547 
    2548 layer {
    2549     bottom: "res4b4"
    2550     top: "res4b5_branch2a"
    2551     name: "res4b5_branch2a"
    2552     type: "Convolution"
    2553     convolution_param {
    2554         num_output: 256
    2555         kernel_size: 1
    2556         pad: 0
    2557         stride: 1
    2558         weight_filler {
    2559             type: "msra"
    2560         }
    2561         bias_term: false
    2562 
    2563     }
    2564 }
    2565 
    2566 layer {
    2567     bottom: "res4b5_branch2a"
    2568     top: "res4b5_branch2a"
    2569     name: "bn4b5_branch2a"
    2570     type: "BatchNorm"
    2571     batch_norm_param {
    2572         use_global_stats: false
    2573     }
    2574 }
    2575 
    2576 layer {
    2577     bottom: "res4b5_branch2a"
    2578     top: "res4b5_branch2a"
    2579     name: "scale4b5_branch2a"
    2580     type: "Scale"
    2581     scale_param {
    2582         bias_term: true
    2583     }
    2584 }
    2585 
    2586 layer {
    2587     bottom: "res4b5_branch2a"
    2588     top: "res4b5_branch2a"
    2589     name: "res4b5_branch2a_relu"
    2590     type: "ReLU"
    2591 }
    2592 
    2593 layer {
    2594     bottom: "res4b5_branch2a"
    2595     top: "res4b5_branch2b"
    2596     name: "res4b5_branch2b"
    2597     type: "Convolution"
    2598     convolution_param {
    2599         num_output: 256
    2600         kernel_size: 3
    2601         pad: 1
    2602         stride: 1
    2603         weight_filler {
    2604             type: "msra"
    2605         }
    2606         bias_term: false
    2607 
    2608     }
    2609 }
    2610 
    2611 layer {
    2612     bottom: "res4b5_branch2b"
    2613     top: "res4b5_branch2b"
    2614     name: "bn4b5_branch2b"
    2615     type: "BatchNorm"
    2616     batch_norm_param {
    2617         use_global_stats: false
    2618     }
    2619 }
    2620 
    2621 layer {
    2622     bottom: "res4b5_branch2b"
    2623     top: "res4b5_branch2b"
    2624     name: "scale4b5_branch2b"
    2625     type: "Scale"
    2626     scale_param {
    2627         bias_term: true
    2628     }
    2629 }
    2630 
    2631 layer {
    2632     bottom: "res4b5_branch2b"
    2633     top: "res4b5_branch2b"
    2634     name: "res4b5_branch2b_relu"
    2635     type: "ReLU"
    2636 }
    2637 
    2638 layer {
    2639     bottom: "res4b5_branch2b"
    2640     top: "res4b5_branch2c"
    2641     name: "res4b5_branch2c"
    2642     type: "Convolution"
    2643     convolution_param {
    2644         num_output: 1024
    2645         kernel_size: 1
    2646         pad: 0
    2647         stride: 1
    2648         weight_filler {
    2649             type: "msra"
    2650         }
    2651         bias_term: false
    2652 
    2653     }
    2654 }
    2655 
    2656 layer {
    2657     bottom: "res4b5_branch2c"
    2658     top: "res4b5_branch2c"
    2659     name: "bn4b5_branch2c"
    2660     type: "BatchNorm"
    2661     batch_norm_param {
    2662         use_global_stats: false
    2663     }
    2664 }
    2665 
    2666 layer {
    2667     bottom: "res4b5_branch2c"
    2668     top: "res4b5_branch2c"
    2669     name: "scale4b5_branch2c"
    2670     type: "Scale"
    2671     scale_param {
    2672         bias_term: true
    2673     }
    2674 }
    2675 
    2676 layer {
    2677     bottom: "res4b4"
    2678     bottom: "res4b5_branch2c"
    2679     top: "res4b5"
    2680     name: "res4b5"
    2681     type: "Eltwise"
    2682     eltwise_param {
    2683         operation: SUM
    2684     }
    2685 }
    2686 
    2687 layer {
    2688     bottom: "res4b5"
    2689     top: "res4b5"
    2690     name: "res4b5_relu"
    2691     type: "ReLU"
    2692 }
    2693 
    2694 layer {
    2695     bottom: "res4b5"
    2696     top: "res4b6_branch2a"
    2697     name: "res4b6_branch2a"
    2698     type: "Convolution"
    2699     convolution_param {
    2700         num_output: 256
    2701         kernel_size: 1
    2702         pad: 0
    2703         stride: 1
    2704         weight_filler {
    2705             type: "msra"
    2706         }
    2707         bias_term: false
    2708 
    2709     }
    2710 }
    2711 
    2712 layer {
    2713     bottom: "res4b6_branch2a"
    2714     top: "res4b6_branch2a"
    2715     name: "bn4b6_branch2a"
    2716     type: "BatchNorm"
    2717     batch_norm_param {
    2718         use_global_stats: false
    2719     }
    2720 }
    2721 
    2722 layer {
    2723     bottom: "res4b6_branch2a"
    2724     top: "res4b6_branch2a"
    2725     name: "scale4b6_branch2a"
    2726     type: "Scale"
    2727     scale_param {
    2728         bias_term: true
    2729     }
    2730 }
    2731 
    2732 layer {
    2733     bottom: "res4b6_branch2a"
    2734     top: "res4b6_branch2a"
    2735     name: "res4b6_branch2a_relu"
    2736     type: "ReLU"
    2737 }
    2738 
    2739 layer {
    2740     bottom: "res4b6_branch2a"
    2741     top: "res4b6_branch2b"
    2742     name: "res4b6_branch2b"
    2743     type: "Convolution"
    2744     convolution_param {
    2745         num_output: 256
    2746         kernel_size: 3
    2747         pad: 1
    2748         stride: 1
    2749         weight_filler {
    2750             type: "msra"
    2751         }
    2752         bias_term: false
    2753 
    2754     }
    2755 }
    2756 
    2757 layer {
    2758     bottom: "res4b6_branch2b"
    2759     top: "res4b6_branch2b"
    2760     name: "bn4b6_branch2b"
    2761     type: "BatchNorm"
    2762     batch_norm_param {
    2763         use_global_stats: false
    2764     }
    2765 }
    2766 
    2767 layer {
    2768     bottom: "res4b6_branch2b"
    2769     top: "res4b6_branch2b"
    2770     name: "scale4b6_branch2b"
    2771     type: "Scale"
    2772     scale_param {
    2773         bias_term: true
    2774     }
    2775 }
    2776 
    2777 layer {
    2778     bottom: "res4b6_branch2b"
    2779     top: "res4b6_branch2b"
    2780     name: "res4b6_branch2b_relu"
    2781     type: "ReLU"
    2782 }
    2783 
    2784 layer {
    2785     bottom: "res4b6_branch2b"
    2786     top: "res4b6_branch2c"
    2787     name: "res4b6_branch2c"
    2788     type: "Convolution"
    2789     convolution_param {
    2790         num_output: 1024
    2791         kernel_size: 1
    2792         pad: 0
    2793         stride: 1
    2794         weight_filler {
    2795             type: "msra"
    2796         }
    2797         bias_term: false
    2798 
    2799     }
    2800 }
    2801 
    2802 layer {
    2803     bottom: "res4b6_branch2c"
    2804     top: "res4b6_branch2c"
    2805     name: "bn4b6_branch2c"
    2806     type: "BatchNorm"
    2807     batch_norm_param {
    2808         use_global_stats: false
    2809     }
    2810 }
    2811 
    2812 layer {
    2813     bottom: "res4b6_branch2c"
    2814     top: "res4b6_branch2c"
    2815     name: "scale4b6_branch2c"
    2816     type: "Scale"
    2817     scale_param {
    2818         bias_term: true
    2819     }
    2820 }
    2821 
    2822 layer {
    2823     bottom: "res4b5"
    2824     bottom: "res4b6_branch2c"
    2825     top: "res4b6"
    2826     name: "res4b6"
    2827     type: "Eltwise"
    2828     eltwise_param {
    2829         operation: SUM
    2830     }
    2831 }
    2832 
    2833 layer {
    2834     bottom: "res4b6"
    2835     top: "res4b6"
    2836     name: "res4b6_relu"
    2837     type: "ReLU"
    2838 }
    2839 
    2840 layer {
    2841     bottom: "res4b6"
    2842     top: "res4b7_branch2a"
    2843     name: "res4b7_branch2a"
    2844     type: "Convolution"
    2845     convolution_param {
    2846         num_output: 256
    2847         kernel_size: 1
    2848         pad: 0
    2849         stride: 1
    2850         weight_filler {
    2851             type: "msra"
    2852         }
    2853         bias_term: false
    2854 
    2855     }
    2856 }
    2857 
    2858 layer {
    2859     bottom: "res4b7_branch2a"
    2860     top: "res4b7_branch2a"
    2861     name: "bn4b7_branch2a"
    2862     type: "BatchNorm"
    2863     batch_norm_param {
    2864         use_global_stats: false
    2865     }
    2866 }
    2867 
    2868 layer {
    2869     bottom: "res4b7_branch2a"
    2870     top: "res4b7_branch2a"
    2871     name: "scale4b7_branch2a"
    2872     type: "Scale"
    2873     scale_param {
    2874         bias_term: true
    2875     }
    2876 }
    2877 
    2878 layer {
    2879     bottom: "res4b7_branch2a"
    2880     top: "res4b7_branch2a"
    2881     name: "res4b7_branch2a_relu"
    2882     type: "ReLU"
    2883 }
    2884 
    2885 layer {
    2886     bottom: "res4b7_branch2a"
    2887     top: "res4b7_branch2b"
    2888     name: "res4b7_branch2b"
    2889     type: "Convolution"
    2890     convolution_param {
    2891         num_output: 256
    2892         kernel_size: 3
    2893         pad: 1
    2894         stride: 1
    2895         weight_filler {
    2896             type: "msra"
    2897         }
    2898         bias_term: false
    2899 
    2900     }
    2901 }
    2902 
    2903 layer {
    2904     bottom: "res4b7_branch2b"
    2905     top: "res4b7_branch2b"
    2906     name: "bn4b7_branch2b"
    2907     type: "BatchNorm"
    2908     batch_norm_param {
    2909         use_global_stats: false
    2910     }
    2911 }
    2912 
    2913 layer {
    2914     bottom: "res4b7_branch2b"
    2915     top: "res4b7_branch2b"
    2916     name: "scale4b7_branch2b"
    2917     type: "Scale"
    2918     scale_param {
    2919         bias_term: true
    2920     }
    2921 }
    2922 
    2923 layer {
    2924     bottom: "res4b7_branch2b"
    2925     top: "res4b7_branch2b"
    2926     name: "res4b7_branch2b_relu"
    2927     type: "ReLU"
    2928 }
    2929 
    2930 layer {
    2931     bottom: "res4b7_branch2b"
    2932     top: "res4b7_branch2c"
    2933     name: "res4b7_branch2c"
    2934     type: "Convolution"
    2935     convolution_param {
    2936         num_output: 1024
    2937         kernel_size: 1
    2938         pad: 0
    2939         stride: 1
    2940         weight_filler {
    2941             type: "msra"
    2942         }
    2943         bias_term: false
    2944 
    2945     }
    2946 }
    2947 
    2948 layer {
    2949     bottom: "res4b7_branch2c"
    2950     top: "res4b7_branch2c"
    2951     name: "bn4b7_branch2c"
    2952     type: "BatchNorm"
    2953     batch_norm_param {
    2954         use_global_stats: false
    2955     }
    2956 }
    2957 
    2958 layer {
    2959     bottom: "res4b7_branch2c"
    2960     top: "res4b7_branch2c"
    2961     name: "scale4b7_branch2c"
    2962     type: "Scale"
    2963     scale_param {
    2964         bias_term: true
    2965     }
    2966 }
    2967 
    2968 layer {
    2969     bottom: "res4b6"
    2970     bottom: "res4b7_branch2c"
    2971     top: "res4b7"
    2972     name: "res4b7"
    2973     type: "Eltwise"
    2974     eltwise_param {
    2975         operation: SUM
    2976     }
    2977 }
    2978 
    2979 layer {
    2980     bottom: "res4b7"
    2981     top: "res4b7"
    2982     name: "res4b7_relu"
    2983     type: "ReLU"
    2984 }
    2985 
    2986 layer {
    2987     bottom: "res4b7"
    2988     top: "res4b8_branch2a"
    2989     name: "res4b8_branch2a"
    2990     type: "Convolution"
    2991     convolution_param {
    2992         num_output: 256
    2993         kernel_size: 1
    2994         pad: 0
    2995         stride: 1
    2996         weight_filler {
    2997             type: "msra"
    2998         }
    2999         bias_term: false
    3000 
    3001     }
    3002 }
    3003 
    3004 layer {
    3005     bottom: "res4b8_branch2a"
    3006     top: "res4b8_branch2a"
    3007     name: "bn4b8_branch2a"
    3008     type: "BatchNorm"
    3009     batch_norm_param {
    3010         use_global_stats: false
    3011     }
    3012 }
    3013 
    3014 layer {
    3015     bottom: "res4b8_branch2a"
    3016     top: "res4b8_branch2a"
    3017     name: "scale4b8_branch2a"
    3018     type: "Scale"
    3019     scale_param {
    3020         bias_term: true
    3021     }
    3022 }
    3023 
    3024 layer {
    3025     bottom: "res4b8_branch2a"
    3026     top: "res4b8_branch2a"
    3027     name: "res4b8_branch2a_relu"
    3028     type: "ReLU"
    3029 }
    3030 
    3031 layer {
    3032     bottom: "res4b8_branch2a"
    3033     top: "res4b8_branch2b"
    3034     name: "res4b8_branch2b"
    3035     type: "Convolution"
    3036     convolution_param {
    3037         num_output: 256
    3038         kernel_size: 3
    3039         pad: 1
    3040         stride: 1
    3041         weight_filler {
    3042             type: "msra"
    3043         }
    3044         bias_term: false
    3045 
    3046     }
    3047 }
    3048 
    3049 layer {
    3050     bottom: "res4b8_branch2b"
    3051     top: "res4b8_branch2b"
    3052     name: "bn4b8_branch2b"
    3053     type: "BatchNorm"
    3054     batch_norm_param {
    3055         use_global_stats: false
    3056     }
    3057 }
    3058 
    3059 layer {
    3060     bottom: "res4b8_branch2b"
    3061     top: "res4b8_branch2b"
    3062     name: "scale4b8_branch2b"
    3063     type: "Scale"
    3064     scale_param {
    3065         bias_term: true
    3066     }
    3067 }
    3068 
    3069 layer {
    3070     bottom: "res4b8_branch2b"
    3071     top: "res4b8_branch2b"
    3072     name: "res4b8_branch2b_relu"
    3073     type: "ReLU"
    3074 }
    3075 
    3076 layer {
    3077     bottom: "res4b8_branch2b"
    3078     top: "res4b8_branch2c"
    3079     name: "res4b8_branch2c"
    3080     type: "Convolution"
    3081     convolution_param {
    3082         num_output: 1024
    3083         kernel_size: 1
    3084         pad: 0
    3085         stride: 1
    3086         weight_filler {
    3087             type: "msra"
    3088         }
    3089         bias_term: false
    3090 
    3091     }
    3092 }
    3093 
    3094 layer {
    3095     bottom: "res4b8_branch2c"
    3096     top: "res4b8_branch2c"
    3097     name: "bn4b8_branch2c"
    3098     type: "BatchNorm"
    3099     batch_norm_param {
    3100         use_global_stats: false
    3101     }
    3102 }
    3103 
    3104 layer {
    3105     bottom: "res4b8_branch2c"
    3106     top: "res4b8_branch2c"
    3107     name: "scale4b8_branch2c"
    3108     type: "Scale"
    3109     scale_param {
    3110         bias_term: true
    3111     }
    3112 }
    3113 
    3114 layer {
    3115     bottom: "res4b7"
    3116     bottom: "res4b8_branch2c"
    3117     top: "res4b8"
    3118     name: "res4b8"
    3119     type: "Eltwise"
    3120     eltwise_param {
    3121         operation: SUM
    3122     }
    3123 }
    3124 
    3125 layer {
    3126     bottom: "res4b8"
    3127     top: "res4b8"
    3128     name: "res4b8_relu"
    3129     type: "ReLU"
    3130 }
    3131 
    3132 layer {
    3133     bottom: "res4b8"
    3134     top: "res4b9_branch2a"
    3135     name: "res4b9_branch2a"
    3136     type: "Convolution"
    3137     convolution_param {
    3138         num_output: 256
    3139         kernel_size: 1
    3140         pad: 0
    3141         stride: 1
    3142         weight_filler {
    3143             type: "msra"
    3144         }
    3145         bias_term: false
    3146 
    3147     }
    3148 }
    3149 
    3150 layer {
    3151     bottom: "res4b9_branch2a"
    3152     top: "res4b9_branch2a"
    3153     name: "bn4b9_branch2a"
    3154     type: "BatchNorm"
    3155     batch_norm_param {
    3156         use_global_stats: false
    3157     }
    3158 }
    3159 
    3160 layer {
    3161     bottom: "res4b9_branch2a"
    3162     top: "res4b9_branch2a"
    3163     name: "scale4b9_branch2a"
    3164     type: "Scale"
    3165     scale_param {
    3166         bias_term: true
    3167     }
    3168 }
    3169 
    3170 layer {
    3171     bottom: "res4b9_branch2a"
    3172     top: "res4b9_branch2a"
    3173     name: "res4b9_branch2a_relu"
    3174     type: "ReLU"
    3175 }
    3176 
    3177 layer {
    3178     bottom: "res4b9_branch2a"
    3179     top: "res4b9_branch2b"
    3180     name: "res4b9_branch2b"
    3181     type: "Convolution"
    3182     convolution_param {
    3183         num_output: 256
    3184         kernel_size: 3
    3185         pad: 1
    3186         stride: 1
    3187         weight_filler {
    3188             type: "msra"
    3189         }
    3190         bias_term: false
    3191 
    3192     }
    3193 }
    3194 
    3195 layer {
    3196     bottom: "res4b9_branch2b"
    3197     top: "res4b9_branch2b"
    3198     name: "bn4b9_branch2b"
    3199     type: "BatchNorm"
    3200     batch_norm_param {
    3201         use_global_stats: false
    3202     }
    3203 }
    3204 
    3205 layer {
    3206     bottom: "res4b9_branch2b"
    3207     top: "res4b9_branch2b"
    3208     name: "scale4b9_branch2b"
    3209     type: "Scale"
    3210     scale_param {
    3211         bias_term: true
    3212     }
    3213 }
    3214 
    3215 layer {
    3216     bottom: "res4b9_branch2b"
    3217     top: "res4b9_branch2b"
    3218     name: "res4b9_branch2b_relu"
    3219     type: "ReLU"
    3220 }
    3221 
    3222 layer {
    3223     bottom: "res4b9_branch2b"
    3224     top: "res4b9_branch2c"
    3225     name: "res4b9_branch2c"
    3226     type: "Convolution"
    3227     convolution_param {
    3228         num_output: 1024
    3229         kernel_size: 1
    3230         pad: 0
    3231         stride: 1
    3232         weight_filler {
    3233             type: "msra"
    3234         }
    3235         bias_term: false
    3236 
    3237     }
    3238 }
    3239 
    3240 layer {
    3241     bottom: "res4b9_branch2c"
    3242     top: "res4b9_branch2c"
    3243     name: "bn4b9_branch2c"
    3244     type: "BatchNorm"
    3245     batch_norm_param {
    3246         use_global_stats: false
    3247     }
    3248 }
    3249 
    3250 layer {
    3251     bottom: "res4b9_branch2c"
    3252     top: "res4b9_branch2c"
    3253     name: "scale4b9_branch2c"
    3254     type: "Scale"
    3255     scale_param {
    3256         bias_term: true
    3257     }
    3258 }
    3259 
    3260 layer {
    3261     bottom: "res4b8"
    3262     bottom: "res4b9_branch2c"
    3263     top: "res4b9"
    3264     name: "res4b9"
    3265     type: "Eltwise"
    3266     eltwise_param {
    3267         operation: SUM
    3268     }
    3269 }
    3270 
    3271 layer {
    3272     bottom: "res4b9"
    3273     top: "res4b9"
    3274     name: "res4b9_relu"
    3275     type: "ReLU"
    3276 }
    3277 
    3278 layer {
    3279     bottom: "res4b9"
    3280     top: "res4b10_branch2a"
    3281     name: "res4b10_branch2a"
    3282     type: "Convolution"
    3283     convolution_param {
    3284         num_output: 256
    3285         kernel_size: 1
    3286         pad: 0
    3287         stride: 1
    3288         weight_filler {
    3289             type: "msra"
    3290         }
    3291         bias_term: false
    3292 
    3293     }
    3294 }
    3295 
    3296 layer {
    3297     bottom: "res4b10_branch2a"
    3298     top: "res4b10_branch2a"
    3299     name: "bn4b10_branch2a"
    3300     type: "BatchNorm"
    3301     batch_norm_param {
    3302         use_global_stats: false
    3303     }
    3304 }
    3305 
    3306 layer {
    3307     bottom: "res4b10_branch2a"
    3308     top: "res4b10_branch2a"
    3309     name: "scale4b10_branch2a"
    3310     type: "Scale"
    3311     scale_param {
    3312         bias_term: true
    3313     }
    3314 }
    3315 
    3316 layer {
    3317     bottom: "res4b10_branch2a"
    3318     top: "res4b10_branch2a"
    3319     name: "res4b10_branch2a_relu"
    3320     type: "ReLU"
    3321 }
    3322 
    3323 layer {
    3324     bottom: "res4b10_branch2a"
    3325     top: "res4b10_branch2b"
    3326     name: "res4b10_branch2b"
    3327     type: "Convolution"
    3328     convolution_param {
    3329         num_output: 256
    3330         kernel_size: 3
    3331         pad: 1
    3332         stride: 1
    3333         weight_filler {
    3334             type: "msra"
    3335         }
    3336         bias_term: false
    3337 
    3338     }
    3339 }
    3340 
    3341 layer {
    3342     bottom: "res4b10_branch2b"
    3343     top: "res4b10_branch2b"
    3344     name: "bn4b10_branch2b"
    3345     type: "BatchNorm"
    3346     batch_norm_param {
    3347         use_global_stats: false
    3348     }
    3349 }
    3350 
    3351 layer {
    3352     bottom: "res4b10_branch2b"
    3353     top: "res4b10_branch2b"
    3354     name: "scale4b10_branch2b"
    3355     type: "Scale"
    3356     scale_param {
    3357         bias_term: true
    3358     }
    3359 }
    3360 
    3361 layer {
    3362     bottom: "res4b10_branch2b"
    3363     top: "res4b10_branch2b"
    3364     name: "res4b10_branch2b_relu"
    3365     type: "ReLU"
    3366 }
    3367 
    3368 layer {
    3369     bottom: "res4b10_branch2b"
    3370     top: "res4b10_branch2c"
    3371     name: "res4b10_branch2c"
    3372     type: "Convolution"
    3373     convolution_param {
    3374         num_output: 1024
    3375         kernel_size: 1
    3376         pad: 0
    3377         stride: 1
    3378         weight_filler {
    3379             type: "msra"
    3380         }
    3381         bias_term: false
    3382 
    3383     }
    3384 }
    3385 
    3386 layer {
    3387     bottom: "res4b10_branch2c"
    3388     top: "res4b10_branch2c"
    3389     name: "bn4b10_branch2c"
    3390     type: "BatchNorm"
    3391     batch_norm_param {
    3392         use_global_stats: false
    3393     }
    3394 }
    3395 
    3396 layer {
    3397     bottom: "res4b10_branch2c"
    3398     top: "res4b10_branch2c"
    3399     name: "scale4b10_branch2c"
    3400     type: "Scale"
    3401     scale_param {
    3402         bias_term: true
    3403     }
    3404 }
    3405 
    3406 layer {
    3407     bottom: "res4b9"
    3408     bottom: "res4b10_branch2c"
    3409     top: "res4b10"
    3410     name: "res4b10"
    3411     type: "Eltwise"
    3412     eltwise_param {
    3413         operation: SUM
    3414     }
    3415 }
    3416 
    3417 layer {
    3418     bottom: "res4b10"
    3419     top: "res4b10"
    3420     name: "res4b10_relu"
    3421     type: "ReLU"
    3422 }
    3423 
    3424 layer {
    3425     bottom: "res4b10"
    3426     top: "res4b11_branch2a"
    3427     name: "res4b11_branch2a"
    3428     type: "Convolution"
    3429     convolution_param {
    3430         num_output: 256
    3431         kernel_size: 1
    3432         pad: 0
    3433         stride: 1
    3434         weight_filler {
    3435             type: "msra"
    3436         }
    3437         bias_term: false
    3438 
    3439     }
    3440 }
    3441 
    3442 layer {
    3443     bottom: "res4b11_branch2a"
    3444     top: "res4b11_branch2a"
    3445     name: "bn4b11_branch2a"
    3446     type: "BatchNorm"
    3447     batch_norm_param {
    3448         use_global_stats: false
    3449     }
    3450 }
    3451 
    3452 layer {
    3453     bottom: "res4b11_branch2a"
    3454     top: "res4b11_branch2a"
    3455     name: "scale4b11_branch2a"
    3456     type: "Scale"
    3457     scale_param {
    3458         bias_term: true
    3459     }
    3460 }
    3461 
    3462 layer {
    3463     bottom: "res4b11_branch2a"
    3464     top: "res4b11_branch2a"
    3465     name: "res4b11_branch2a_relu"
    3466     type: "ReLU"
    3467 }
    3468 
    3469 layer {
    3470     bottom: "res4b11_branch2a"
    3471     top: "res4b11_branch2b"
    3472     name: "res4b11_branch2b"
    3473     type: "Convolution"
    3474     convolution_param {
    3475         num_output: 256
    3476         kernel_size: 3
    3477         pad: 1
    3478         stride: 1
    3479         weight_filler {
    3480             type: "msra"
    3481         }
    3482         bias_term: false
    3483 
    3484     }
    3485 }
    3486 
    3487 layer {
    3488     bottom: "res4b11_branch2b"
    3489     top: "res4b11_branch2b"
    3490     name: "bn4b11_branch2b"
    3491     type: "BatchNorm"
    3492     batch_norm_param {
    3493         use_global_stats: false
    3494     }
    3495 }
    3496 
    3497 layer {
    3498     bottom: "res4b11_branch2b"
    3499     top: "res4b11_branch2b"
    3500     name: "scale4b11_branch2b"
    3501     type: "Scale"
    3502     scale_param {
    3503         bias_term: true
    3504     }
    3505 }
    3506 
    3507 layer {
    3508     bottom: "res4b11_branch2b"
    3509     top: "res4b11_branch2b"
    3510     name: "res4b11_branch2b_relu"
    3511     type: "ReLU"
    3512 }
    3513 
    3514 layer {
    3515     bottom: "res4b11_branch2b"
    3516     top: "res4b11_branch2c"
    3517     name: "res4b11_branch2c"
    3518     type: "Convolution"
    3519     convolution_param {
    3520         num_output: 1024
    3521         kernel_size: 1
    3522         pad: 0
    3523         stride: 1
    3524         weight_filler {
    3525             type: "msra"
    3526         }
    3527         bias_term: false
    3528 
    3529     }
    3530 }
    3531 
    3532 layer {
    3533     bottom: "res4b11_branch2c"
    3534     top: "res4b11_branch2c"
    3535     name: "bn4b11_branch2c"
    3536     type: "BatchNorm"
    3537     batch_norm_param {
    3538         use_global_stats: false
    3539     }
    3540 }
    3541 
    3542 layer {
    3543     bottom: "res4b11_branch2c"
    3544     top: "res4b11_branch2c"
    3545     name: "scale4b11_branch2c"
    3546     type: "Scale"
    3547     scale_param {
    3548         bias_term: true
    3549     }
    3550 }
    3551 
    3552 layer {
    3553     bottom: "res4b10"
    3554     bottom: "res4b11_branch2c"
    3555     top: "res4b11"
    3556     name: "res4b11"
    3557     type: "Eltwise"
    3558     eltwise_param {
    3559         operation: SUM
    3560     }
    3561 }
    3562 
    3563 layer {
    3564     bottom: "res4b11"
    3565     top: "res4b11"
    3566     name: "res4b11_relu"
    3567     type: "ReLU"
    3568 }
    3569 
    3570 layer {
    3571     bottom: "res4b11"
    3572     top: "res4b12_branch2a"
    3573     name: "res4b12_branch2a"
    3574     type: "Convolution"
    3575     convolution_param {
    3576         num_output: 256
    3577         kernel_size: 1
    3578         pad: 0
    3579         stride: 1
    3580         weight_filler {
    3581             type: "msra"
    3582         }
    3583         bias_term: false
    3584 
    3585     }
    3586 }
    3587 
    3588 layer {
    3589     bottom: "res4b12_branch2a"
    3590     top: "res4b12_branch2a"
    3591     name: "bn4b12_branch2a"
    3592     type: "BatchNorm"
    3593     batch_norm_param {
    3594         use_global_stats: false
    3595     }
    3596 }
    3597 
    3598 layer {
    3599     bottom: "res4b12_branch2a"
    3600     top: "res4b12_branch2a"
    3601     name: "scale4b12_branch2a"
    3602     type: "Scale"
    3603     scale_param {
    3604         bias_term: true
    3605     }
    3606 }
    3607 
    3608 layer {
    3609     bottom: "res4b12_branch2a"
    3610     top: "res4b12_branch2a"
    3611     name: "res4b12_branch2a_relu"
    3612     type: "ReLU"
    3613 }
    3614 
    3615 layer {
    3616     bottom: "res4b12_branch2a"
    3617     top: "res4b12_branch2b"
    3618     name: "res4b12_branch2b"
    3619     type: "Convolution"
    3620     convolution_param {
    3621         num_output: 256
    3622         kernel_size: 3
    3623         pad: 1
    3624         stride: 1
    3625         weight_filler {
    3626             type: "msra"
    3627         }
    3628         bias_term: false
    3629 
    3630     }
    3631 }
    3632 
    3633 layer {
    3634     bottom: "res4b12_branch2b"
    3635     top: "res4b12_branch2b"
    3636     name: "bn4b12_branch2b"
    3637     type: "BatchNorm"
    3638     batch_norm_param {
    3639         use_global_stats: false
    3640     }
    3641 }
    3642 
    3643 layer {
    3644     bottom: "res4b12_branch2b"
    3645     top: "res4b12_branch2b"
    3646     name: "scale4b12_branch2b"
    3647     type: "Scale"
    3648     scale_param {
    3649         bias_term: true
    3650     }
    3651 }
    3652 
    3653 layer {
    3654     bottom: "res4b12_branch2b"
    3655     top: "res4b12_branch2b"
    3656     name: "res4b12_branch2b_relu"
    3657     type: "ReLU"
    3658 }
    3659 
    3660 layer {
    3661     bottom: "res4b12_branch2b"
    3662     top: "res4b12_branch2c"
    3663     name: "res4b12_branch2c"
    3664     type: "Convolution"
    3665     convolution_param {
    3666         num_output: 1024
    3667         kernel_size: 1
    3668         pad: 0
    3669         stride: 1
    3670         weight_filler {
    3671             type: "msra"
    3672         }
    3673         bias_term: false
    3674 
    3675     }
    3676 }
    3677 
    3678 layer {
    3679     bottom: "res4b12_branch2c"
    3680     top: "res4b12_branch2c"
    3681     name: "bn4b12_branch2c"
    3682     type: "BatchNorm"
    3683     batch_norm_param {
    3684         use_global_stats: false
    3685     }
    3686 }
    3687 
    3688 layer {
    3689     bottom: "res4b12_branch2c"
    3690     top: "res4b12_branch2c"
    3691     name: "scale4b12_branch2c"
    3692     type: "Scale"
    3693     scale_param {
    3694         bias_term: true
    3695     }
    3696 }
    3697 
    3698 layer {
    3699     bottom: "res4b11"
    3700     bottom: "res4b12_branch2c"
    3701     top: "res4b12"
    3702     name: "res4b12"
    3703     type: "Eltwise"
    3704     eltwise_param {
    3705         operation: SUM
    3706     }
    3707 }
    3708 
    3709 layer {
    3710     bottom: "res4b12"
    3711     top: "res4b12"
    3712     name: "res4b12_relu"
    3713     type: "ReLU"
    3714 }
    3715 
    3716 layer {
    3717     bottom: "res4b12"
    3718     top: "res4b13_branch2a"
    3719     name: "res4b13_branch2a"
    3720     type: "Convolution"
    3721     convolution_param {
    3722         num_output: 256
    3723         kernel_size: 1
    3724         pad: 0
    3725         stride: 1
    3726         weight_filler {
    3727             type: "msra"
    3728         }
    3729         bias_term: false
    3730 
    3731     }
    3732 }
    3733 
    3734 layer {
    3735     bottom: "res4b13_branch2a"
    3736     top: "res4b13_branch2a"
    3737     name: "bn4b13_branch2a"
    3738     type: "BatchNorm"
    3739     batch_norm_param {
    3740         use_global_stats: false
    3741     }
    3742 }
    3743 
    3744 layer {
    3745     bottom: "res4b13_branch2a"
    3746     top: "res4b13_branch2a"
    3747     name: "scale4b13_branch2a"
    3748     type: "Scale"
    3749     scale_param {
    3750         bias_term: true
    3751     }
    3752 }
    3753 
    3754 layer {
    3755     bottom: "res4b13_branch2a"
    3756     top: "res4b13_branch2a"
    3757     name: "res4b13_branch2a_relu"
    3758     type: "ReLU"
    3759 }
    3760 
    3761 layer {
    3762     bottom: "res4b13_branch2a"
    3763     top: "res4b13_branch2b"
    3764     name: "res4b13_branch2b"
    3765     type: "Convolution"
    3766     convolution_param {
    3767         num_output: 256
    3768         kernel_size: 3
    3769         pad: 1
    3770         stride: 1
    3771         weight_filler {
    3772             type: "msra"
    3773         }
    3774         bias_term: false
    3775 
    3776     }
    3777 }
    3778 
    3779 layer {
    3780     bottom: "res4b13_branch2b"
    3781     top: "res4b13_branch2b"
    3782     name: "bn4b13_branch2b"
    3783     type: "BatchNorm"
    3784     batch_norm_param {
    3785         use_global_stats: false
    3786     }
    3787 }
    3788 
    3789 layer {
    3790     bottom: "res4b13_branch2b"
    3791     top: "res4b13_branch2b"
    3792     name: "scale4b13_branch2b"
    3793     type: "Scale"
    3794     scale_param {
    3795         bias_term: true
    3796     }
    3797 }
    3798 
    3799 layer {
    3800     bottom: "res4b13_branch2b"
    3801     top: "res4b13_branch2b"
    3802     name: "res4b13_branch2b_relu"
    3803     type: "ReLU"
    3804 }
    3805 
    3806 layer {
    3807     bottom: "res4b13_branch2b"
    3808     top: "res4b13_branch2c"
    3809     name: "res4b13_branch2c"
    3810     type: "Convolution"
    3811     convolution_param {
    3812         num_output: 1024
    3813         kernel_size: 1
    3814         pad: 0
    3815         stride: 1
    3816         weight_filler {
    3817             type: "msra"
    3818         }
    3819         bias_term: false
    3820 
    3821     }
    3822 }
    3823 
    3824 layer {
    3825     bottom: "res4b13_branch2c"
    3826     top: "res4b13_branch2c"
    3827     name: "bn4b13_branch2c"
    3828     type: "BatchNorm"
    3829     batch_norm_param {
    3830         use_global_stats: false
    3831     }
    3832 }
    3833 
    3834 layer {
    3835     bottom: "res4b13_branch2c"
    3836     top: "res4b13_branch2c"
    3837     name: "scale4b13_branch2c"
    3838     type: "Scale"
    3839     scale_param {
    3840         bias_term: true
    3841     }
    3842 }
    3843 
    3844 layer {
    3845     bottom: "res4b12"
    3846     bottom: "res4b13_branch2c"
    3847     top: "res4b13"
    3848     name: "res4b13"
    3849     type: "Eltwise"
    3850     eltwise_param {
    3851         operation: SUM
    3852     }
    3853 }
    3854 
    3855 layer {
    3856     bottom: "res4b13"
    3857     top: "res4b13"
    3858     name: "res4b13_relu"
    3859     type: "ReLU"
    3860 }
    3861 
    3862 layer {
    3863     bottom: "res4b13"
    3864     top: "res4b14_branch2a"
    3865     name: "res4b14_branch2a"
    3866     type: "Convolution"
    3867     convolution_param {
    3868         num_output: 256
    3869         kernel_size: 1
    3870         pad: 0
    3871         stride: 1
    3872         weight_filler {
    3873             type: "msra"
    3874         }
    3875         bias_term: false
    3876 
    3877     }
    3878 }
    3879 
    3880 layer {
    3881     bottom: "res4b14_branch2a"
    3882     top: "res4b14_branch2a"
    3883     name: "bn4b14_branch2a"
    3884     type: "BatchNorm"
    3885     batch_norm_param {
    3886         use_global_stats: false
    3887     }
    3888 }
    3889 
    3890 layer {
    3891     bottom: "res4b14_branch2a"
    3892     top: "res4b14_branch2a"
    3893     name: "scale4b14_branch2a"
    3894     type: "Scale"
    3895     scale_param {
    3896         bias_term: true
    3897     }
    3898 }
    3899 
    3900 layer {
    3901     bottom: "res4b14_branch2a"
    3902     top: "res4b14_branch2a"
    3903     name: "res4b14_branch2a_relu"
    3904     type: "ReLU"
    3905 }
    3906 
    3907 layer {
    3908     bottom: "res4b14_branch2a"
    3909     top: "res4b14_branch2b"
    3910     name: "res4b14_branch2b"
    3911     type: "Convolution"
    3912     convolution_param {
    3913         num_output: 256
    3914         kernel_size: 3
    3915         pad: 1
    3916         stride: 1
    3917         weight_filler {
    3918             type: "msra"
    3919         }
    3920         bias_term: false
    3921 
    3922     }
    3923 }
    3924 
    3925 layer {
    3926     bottom: "res4b14_branch2b"
    3927     top: "res4b14_branch2b"
    3928     name: "bn4b14_branch2b"
    3929     type: "BatchNorm"
    3930     batch_norm_param {
    3931         use_global_stats: false
    3932     }
    3933 }
    3934 
    3935 layer {
    3936     bottom: "res4b14_branch2b"
    3937     top: "res4b14_branch2b"
    3938     name: "scale4b14_branch2b"
    3939     type: "Scale"
    3940     scale_param {
    3941         bias_term: true
    3942     }
    3943 }
    3944 
    3945 layer {
    3946     bottom: "res4b14_branch2b"
    3947     top: "res4b14_branch2b"
    3948     name: "res4b14_branch2b_relu"
    3949     type: "ReLU"
    3950 }
    3951 
    3952 layer {
    3953     bottom: "res4b14_branch2b"
    3954     top: "res4b14_branch2c"
    3955     name: "res4b14_branch2c"
    3956     type: "Convolution"
    3957     convolution_param {
    3958         num_output: 1024
    3959         kernel_size: 1
    3960         pad: 0
    3961         stride: 1
    3962         weight_filler {
    3963             type: "msra"
    3964         }
    3965         bias_term: false
    3966 
    3967     }
    3968 }
    3969 
    3970 layer {
    3971     bottom: "res4b14_branch2c"
    3972     top: "res4b14_branch2c"
    3973     name: "bn4b14_branch2c"
    3974     type: "BatchNorm"
    3975     batch_norm_param {
    3976         use_global_stats: false
    3977     }
    3978 }
    3979 
    3980 layer {
    3981     bottom: "res4b14_branch2c"
    3982     top: "res4b14_branch2c"
    3983     name: "scale4b14_branch2c"
    3984     type: "Scale"
    3985     scale_param {
    3986         bias_term: true
    3987     }
    3988 }
    3989 
    3990 layer {
    3991     bottom: "res4b13"
    3992     bottom: "res4b14_branch2c"
    3993     top: "res4b14"
    3994     name: "res4b14"
    3995     type: "Eltwise"
    3996     eltwise_param {
    3997         operation: SUM
    3998     }
    3999 }
    4000 
    4001 layer {
    4002     bottom: "res4b14"
    4003     top: "res4b14"
    4004     name: "res4b14_relu"
    4005     type: "ReLU"
    4006 }
    4007 
    4008 layer {
    4009     bottom: "res4b14"
    4010     top: "res4b15_branch2a"
    4011     name: "res4b15_branch2a"
    4012     type: "Convolution"
    4013     convolution_param {
    4014         num_output: 256
    4015         kernel_size: 1
    4016         pad: 0
    4017         stride: 1
    4018         weight_filler {
    4019             type: "msra"
    4020         }
    4021         bias_term: false
    4022 
    4023     }
    4024 }
    4025 
    4026 layer {
    4027     bottom: "res4b15_branch2a"
    4028     top: "res4b15_branch2a"
    4029     name: "bn4b15_branch2a"
    4030     type: "BatchNorm"
    4031     batch_norm_param {
    4032         use_global_stats: false
    4033     }
    4034 }
    4035 
    4036 layer {
    4037     bottom: "res4b15_branch2a"
    4038     top: "res4b15_branch2a"
    4039     name: "scale4b15_branch2a"
    4040     type: "Scale"
    4041     scale_param {
    4042         bias_term: true
    4043     }
    4044 }
    4045 
    4046 layer {
    4047     bottom: "res4b15_branch2a"
    4048     top: "res4b15_branch2a"
    4049     name: "res4b15_branch2a_relu"
    4050     type: "ReLU"
    4051 }
    4052 
    4053 layer {
    4054     bottom: "res4b15_branch2a"
    4055     top: "res4b15_branch2b"
    4056     name: "res4b15_branch2b"
    4057     type: "Convolution"
    4058     convolution_param {
    4059         num_output: 256
    4060         kernel_size: 3
    4061         pad: 1
    4062         stride: 1
    4063         weight_filler {
    4064             type: "msra"
    4065         }
    4066         bias_term: false
    4067 
    4068     }
    4069 }
    4070 
    4071 layer {
    4072     bottom: "res4b15_branch2b"
    4073     top: "res4b15_branch2b"
    4074     name: "bn4b15_branch2b"
    4075     type: "BatchNorm"
    4076     batch_norm_param {
    4077         use_global_stats: false
    4078     }
    4079 }
    4080 
    4081 layer {
    4082     bottom: "res4b15_branch2b"
    4083     top: "res4b15_branch2b"
    4084     name: "scale4b15_branch2b"
    4085     type: "Scale"
    4086     scale_param {
    4087         bias_term: true
    4088     }
    4089 }
    4090 
    4091 layer {
    4092     bottom: "res4b15_branch2b"
    4093     top: "res4b15_branch2b"
    4094     name: "res4b15_branch2b_relu"
    4095     type: "ReLU"
    4096 }
    4097 
    4098 layer {
    4099     bottom: "res4b15_branch2b"
    4100     top: "res4b15_branch2c"
    4101     name: "res4b15_branch2c"
    4102     type: "Convolution"
    4103     convolution_param {
    4104         num_output: 1024
    4105         kernel_size: 1
    4106         pad: 0
    4107         stride: 1
    4108         weight_filler {
    4109             type: "msra"
    4110         }
    4111         bias_term: false
    4112 
    4113     }
    4114 }
    4115 
    4116 layer {
    4117     bottom: "res4b15_branch2c"
    4118     top: "res4b15_branch2c"
    4119     name: "bn4b15_branch2c"
    4120     type: "BatchNorm"
    4121     batch_norm_param {
    4122         use_global_stats: false
    4123     }
    4124 }
    4125 
    4126 layer {
    4127     bottom: "res4b15_branch2c"
    4128     top: "res4b15_branch2c"
    4129     name: "scale4b15_branch2c"
    4130     type: "Scale"
    4131     scale_param {
    4132         bias_term: true
    4133     }
    4134 }
    4135 
    4136 layer {
    4137     bottom: "res4b14"
    4138     bottom: "res4b15_branch2c"
    4139     top: "res4b15"
    4140     name: "res4b15"
    4141     type: "Eltwise"
    4142     eltwise_param {
    4143         operation: SUM
    4144     }
    4145 }
    4146 
    4147 layer {
    4148     bottom: "res4b15"
    4149     top: "res4b15"
    4150     name: "res4b15_relu"
    4151     type: "ReLU"
    4152 }
    4153 
    4154 layer {
    4155     bottom: "res4b15"
    4156     top: "res4b16_branch2a"
    4157     name: "res4b16_branch2a"
    4158     type: "Convolution"
    4159     convolution_param {
    4160         num_output: 256
    4161         kernel_size: 1
    4162         pad: 0
    4163         stride: 1
    4164         weight_filler {
    4165             type: "msra"
    4166         }
    4167         bias_term: false
    4168 
    4169     }
    4170 }
    4171 
    4172 layer {
    4173     bottom: "res4b16_branch2a"
    4174     top: "res4b16_branch2a"
    4175     name: "bn4b16_branch2a"
    4176     type: "BatchNorm"
    4177     batch_norm_param {
    4178         use_global_stats: false
    4179     }
    4180 }
    4181 
    4182 layer {
    4183     bottom: "res4b16_branch2a"
    4184     top: "res4b16_branch2a"
    4185     name: "scale4b16_branch2a"
    4186     type: "Scale"
    4187     scale_param {
    4188         bias_term: true
    4189     }
    4190 }
    4191 
    4192 layer {
    4193     bottom: "res4b16_branch2a"
    4194     top: "res4b16_branch2a"
    4195     name: "res4b16_branch2a_relu"
    4196     type: "ReLU"
    4197 }
    4198 
    4199 layer {
    4200     bottom: "res4b16_branch2a"
    4201     top: "res4b16_branch2b"
    4202     name: "res4b16_branch2b"
    4203     type: "Convolution"
    4204     convolution_param {
    4205         num_output: 256
    4206         kernel_size: 3
    4207         pad: 1
    4208         stride: 1
    4209         weight_filler {
    4210             type: "msra"
    4211         }
    4212         bias_term: false
    4213 
    4214     }
    4215 }
    4216 
    4217 layer {
    4218     bottom: "res4b16_branch2b"
    4219     top: "res4b16_branch2b"
    4220     name: "bn4b16_branch2b"
    4221     type: "BatchNorm"
    4222     batch_norm_param {
    4223         use_global_stats: false
    4224     }
    4225 }
    4226 
    4227 layer {
    4228     bottom: "res4b16_branch2b"
    4229     top: "res4b16_branch2b"
    4230     name: "scale4b16_branch2b"
    4231     type: "Scale"
    4232     scale_param {
    4233         bias_term: true
    4234     }
    4235 }
    4236 
    4237 layer {
    4238     bottom: "res4b16_branch2b"
    4239     top: "res4b16_branch2b"
    4240     name: "res4b16_branch2b_relu"
    4241     type: "ReLU"
    4242 }
    4243 
    4244 layer {
    4245     bottom: "res4b16_branch2b"
    4246     top: "res4b16_branch2c"
    4247     name: "res4b16_branch2c"
    4248     type: "Convolution"
    4249     convolution_param {
    4250         num_output: 1024
    4251         kernel_size: 1
    4252         pad: 0
    4253         stride: 1
    4254         weight_filler {
    4255             type: "msra"
    4256         }
    4257         bias_term: false
    4258 
    4259     }
    4260 }
    4261 
    4262 layer {
    4263     bottom: "res4b16_branch2c"
    4264     top: "res4b16_branch2c"
    4265     name: "bn4b16_branch2c"
    4266     type: "BatchNorm"
    4267     batch_norm_param {
    4268         use_global_stats: false
    4269     }
    4270 }
    4271 
    4272 layer {
    4273     bottom: "res4b16_branch2c"
    4274     top: "res4b16_branch2c"
    4275     name: "scale4b16_branch2c"
    4276     type: "Scale"
    4277     scale_param {
    4278         bias_term: true
    4279     }
    4280 }
    4281 
    4282 layer {
    4283     bottom: "res4b15"
    4284     bottom: "res4b16_branch2c"
    4285     top: "res4b16"
    4286     name: "res4b16"
    4287     type: "Eltwise"
    4288     eltwise_param {
    4289         operation: SUM
    4290     }
    4291 }
    4292 
    4293 layer {
    4294     bottom: "res4b16"
    4295     top: "res4b16"
    4296     name: "res4b16_relu"
    4297     type: "ReLU"
    4298 }
    4299 
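# ---- res4b17: same bottleneck pattern, identity shortcut from res4b16 ----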
    4300 layer {
    4301     bottom: "res4b16"
    4302     top: "res4b17_branch2a"
    4303     name: "res4b17_branch2a"
    4304     type: "Convolution"
    4305     convolution_param {
    4306         num_output: 256
    4307         kernel_size: 1
    4308         pad: 0
    4309         stride: 1
    4310         weight_filler {
    4311             type: "msra"
    4312         }
    4313         bias_term: false
    4314 
    4315     }
    4316 }
    4317 
    4318 layer {
    4319     bottom: "res4b17_branch2a"
    4320     top: "res4b17_branch2a"
    4321     name: "bn4b17_branch2a"
    4322     type: "BatchNorm"
    4323     batch_norm_param {
    4324         use_global_stats: false
    4325     }
    4326 }
    4327 
    4328 layer {
    4329     bottom: "res4b17_branch2a"
    4330     top: "res4b17_branch2a"
    4331     name: "scale4b17_branch2a"
    4332     type: "Scale"
    4333     scale_param {
    4334         bias_term: true
    4335     }
    4336 }
    4337 
    4338 layer {
    4339     bottom: "res4b17_branch2a"
    4340     top: "res4b17_branch2a"
    4341     name: "res4b17_branch2a_relu"
    4342     type: "ReLU"
    4343 }
    4344 
    4345 layer {
    4346     bottom: "res4b17_branch2a"
    4347     top: "res4b17_branch2b"
    4348     name: "res4b17_branch2b"
    4349     type: "Convolution"
    4350     convolution_param {
    4351         num_output: 256
    4352         kernel_size: 3
    4353         pad: 1
    4354         stride: 1
    4355         weight_filler {
    4356             type: "msra"
    4357         }
    4358         bias_term: false
    4359 
    4360     }
    4361 }
    4362 
    4363 layer {
    4364     bottom: "res4b17_branch2b"
    4365     top: "res4b17_branch2b"
    4366     name: "bn4b17_branch2b"
    4367     type: "BatchNorm"
    4368     batch_norm_param {
    4369         use_global_stats: false
    4370     }
    4371 }
    4372 
    4373 layer {
    4374     bottom: "res4b17_branch2b"
    4375     top: "res4b17_branch2b"
    4376     name: "scale4b17_branch2b"
    4377     type: "Scale"
    4378     scale_param {
    4379         bias_term: true
    4380     }
    4381 }
    4382 
    4383 layer {
    4384     bottom: "res4b17_branch2b"
    4385     top: "res4b17_branch2b"
    4386     name: "res4b17_branch2b_relu"
    4387     type: "ReLU"
    4388 }
    4389 
    4390 layer {
    4391     bottom: "res4b17_branch2b"
    4392     top: "res4b17_branch2c"
    4393     name: "res4b17_branch2c"
    4394     type: "Convolution"
    4395     convolution_param {
    4396         num_output: 1024
    4397         kernel_size: 1
    4398         pad: 0
    4399         stride: 1
    4400         weight_filler {
    4401             type: "msra"
    4402         }
    4403         bias_term: false
    4404 
    4405     }
    4406 }
    4407 
    4408 layer {
    4409     bottom: "res4b17_branch2c"
    4410     top: "res4b17_branch2c"
    4411     name: "bn4b17_branch2c"
    4412     type: "BatchNorm"
    4413     batch_norm_param {
    4414         use_global_stats: false
    4415     }
    4416 }
    4417 
    4418 layer {
    4419     bottom: "res4b17_branch2c"
    4420     top: "res4b17_branch2c"
    4421     name: "scale4b17_branch2c"
    4422     type: "Scale"
    4423     scale_param {
    4424         bias_term: true
    4425     }
    4426 }
    4427 
    4428 layer {
    4429     bottom: "res4b16"
    4430     bottom: "res4b17_branch2c"
    4431     top: "res4b17"
    4432     name: "res4b17"
    4433     type: "Eltwise"
    4434     eltwise_param {
    4435         operation: SUM
    4436     }
    4437 }
    4438 
    4439 layer {
    4440     bottom: "res4b17"
    4441     top: "res4b17"
    4442     name: "res4b17_relu"
    4443     type: "ReLU"
    4444 }
    4445 
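# ---- res4b18: same bottleneck pattern, identity shortcut from res4b17 ----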
    4446 layer {
    4447     bottom: "res4b17"
    4448     top: "res4b18_branch2a"
    4449     name: "res4b18_branch2a"
    4450     type: "Convolution"
    4451     convolution_param {
    4452         num_output: 256
    4453         kernel_size: 1
    4454         pad: 0
    4455         stride: 1
    4456         weight_filler {
    4457             type: "msra"
    4458         }
    4459         bias_term: false
    4460 
    4461     }
    4462 }
    4463 
    4464 layer {
    4465     bottom: "res4b18_branch2a"
    4466     top: "res4b18_branch2a"
    4467     name: "bn4b18_branch2a"
    4468     type: "BatchNorm"
    4469     batch_norm_param {
    4470         use_global_stats: false
    4471     }
    4472 }
    4473 
    4474 layer {
    4475     bottom: "res4b18_branch2a"
    4476     top: "res4b18_branch2a"
    4477     name: "scale4b18_branch2a"
    4478     type: "Scale"
    4479     scale_param {
    4480         bias_term: true
    4481     }
    4482 }
    4483 
    4484 layer {
    4485     bottom: "res4b18_branch2a"
    4486     top: "res4b18_branch2a"
    4487     name: "res4b18_branch2a_relu"
    4488     type: "ReLU"
    4489 }
    4490 
    4491 layer {
    4492     bottom: "res4b18_branch2a"
    4493     top: "res4b18_branch2b"
    4494     name: "res4b18_branch2b"
    4495     type: "Convolution"
    4496     convolution_param {
    4497         num_output: 256
    4498         kernel_size: 3
    4499         pad: 1
    4500         stride: 1
    4501         weight_filler {
    4502             type: "msra"
    4503         }
    4504         bias_term: false
    4505 
    4506     }
    4507 }
    4508 
    4509 layer {
    4510     bottom: "res4b18_branch2b"
    4511     top: "res4b18_branch2b"
    4512     name: "bn4b18_branch2b"
    4513     type: "BatchNorm"
    4514     batch_norm_param {
    4515         use_global_stats: false
    4516     }
    4517 }
    4518 
    4519 layer {
    4520     bottom: "res4b18_branch2b"
    4521     top: "res4b18_branch2b"
    4522     name: "scale4b18_branch2b"
    4523     type: "Scale"
    4524     scale_param {
    4525         bias_term: true
    4526     }
    4527 }
    4528 
    4529 layer {
    4530     bottom: "res4b18_branch2b"
    4531     top: "res4b18_branch2b"
    4532     name: "res4b18_branch2b_relu"
    4533     type: "ReLU"
    4534 }
    4535 
    4536 layer {
    4537     bottom: "res4b18_branch2b"
    4538     top: "res4b18_branch2c"
    4539     name: "res4b18_branch2c"
    4540     type: "Convolution"
    4541     convolution_param {
    4542         num_output: 1024
    4543         kernel_size: 1
    4544         pad: 0
    4545         stride: 1
    4546         weight_filler {
    4547             type: "msra"
    4548         }
    4549         bias_term: false
    4550 
    4551     }
    4552 }
    4553 
    4554 layer {
    4555     bottom: "res4b18_branch2c"
    4556     top: "res4b18_branch2c"
    4557     name: "bn4b18_branch2c"
    4558     type: "BatchNorm"
    4559     batch_norm_param {
    4560         use_global_stats: false
    4561     }
    4562 }
    4563 
    4564 layer {
    4565     bottom: "res4b18_branch2c"
    4566     top: "res4b18_branch2c"
    4567     name: "scale4b18_branch2c"
    4568     type: "Scale"
    4569     scale_param {
    4570         bias_term: true
    4571     }
    4572 }
    4573 
    4574 layer {
    4575     bottom: "res4b17"
    4576     bottom: "res4b18_branch2c"
    4577     top: "res4b18"
    4578     name: "res4b18"
    4579     type: "Eltwise"
    4580     eltwise_param {
    4581         operation: SUM
    4582     }
    4583 }
    4584 
    4585 layer {
    4586     bottom: "res4b18"
    4587     top: "res4b18"
    4588     name: "res4b18_relu"
    4589     type: "ReLU"
    4590 }
    4591 
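# ---- res4b19: same bottleneck pattern, identity shortcut from res4b18 ----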
    4592 layer {
    4593     bottom: "res4b18"
    4594     top: "res4b19_branch2a"
    4595     name: "res4b19_branch2a"
    4596     type: "Convolution"
    4597     convolution_param {
    4598         num_output: 256
    4599         kernel_size: 1
    4600         pad: 0
    4601         stride: 1
    4602         weight_filler {
    4603             type: "msra"
    4604         }
    4605         bias_term: false
    4606 
    4607     }
    4608 }
    4609 
    4610 layer {
    4611     bottom: "res4b19_branch2a"
    4612     top: "res4b19_branch2a"
    4613     name: "bn4b19_branch2a"
    4614     type: "BatchNorm"
    4615     batch_norm_param {
    4616         use_global_stats: false
    4617     }
    4618 }
    4619 
    4620 layer {
    4621     bottom: "res4b19_branch2a"
    4622     top: "res4b19_branch2a"
    4623     name: "scale4b19_branch2a"
    4624     type: "Scale"
    4625     scale_param {
    4626         bias_term: true
    4627     }
    4628 }
    4629 
    4630 layer {
    4631     bottom: "res4b19_branch2a"
    4632     top: "res4b19_branch2a"
    4633     name: "res4b19_branch2a_relu"
    4634     type: "ReLU"
    4635 }
    4636 
    4637 layer {
    4638     bottom: "res4b19_branch2a"
    4639     top: "res4b19_branch2b"
    4640     name: "res4b19_branch2b"
    4641     type: "Convolution"
    4642     convolution_param {
    4643         num_output: 256
    4644         kernel_size: 3
    4645         pad: 1
    4646         stride: 1
    4647         weight_filler {
    4648             type: "msra"
    4649         }
    4650         bias_term: false
    4651 
    4652     }
    4653 }
    4654 
    4655 layer {
    4656     bottom: "res4b19_branch2b"
    4657     top: "res4b19_branch2b"
    4658     name: "bn4b19_branch2b"
    4659     type: "BatchNorm"
    4660     batch_norm_param {
    4661         use_global_stats: false
    4662     }
    4663 }
    4664 
    4665 layer {
    4666     bottom: "res4b19_branch2b"
    4667     top: "res4b19_branch2b"
    4668     name: "scale4b19_branch2b"
    4669     type: "Scale"
    4670     scale_param {
    4671         bias_term: true
    4672     }
    4673 }
    4674 
    4675 layer {
    4676     bottom: "res4b19_branch2b"
    4677     top: "res4b19_branch2b"
    4678     name: "res4b19_branch2b_relu"
    4679     type: "ReLU"
    4680 }
    4681 
    4682 layer {
    4683     bottom: "res4b19_branch2b"
    4684     top: "res4b19_branch2c"
    4685     name: "res4b19_branch2c"
    4686     type: "Convolution"
    4687     convolution_param {
    4688         num_output: 1024
    4689         kernel_size: 1
    4690         pad: 0
    4691         stride: 1
    4692         weight_filler {
    4693             type: "msra"
    4694         }
    4695         bias_term: false
    4696 
    4697     }
    4698 }
    4699 
    4700 layer {
    4701     bottom: "res4b19_branch2c"
    4702     top: "res4b19_branch2c"
    4703     name: "bn4b19_branch2c"
    4704     type: "BatchNorm"
    4705     batch_norm_param {
    4706         use_global_stats: false
    4707     }
    4708 }
    4709 
    4710 layer {
    4711     bottom: "res4b19_branch2c"
    4712     top: "res4b19_branch2c"
    4713     name: "scale4b19_branch2c"
    4714     type: "Scale"
    4715     scale_param {
    4716         bias_term: true
    4717     }
    4718 }
    4719 
    4720 layer {
    4721     bottom: "res4b18"
    4722     bottom: "res4b19_branch2c"
    4723     top: "res4b19"
    4724     name: "res4b19"
    4725     type: "Eltwise"
    4726     eltwise_param {
    4727         operation: SUM
    4728     }
    4729 }
    4730 
    4731 layer {
    4732     bottom: "res4b19"
    4733     top: "res4b19"
    4734     name: "res4b19_relu"
    4735     type: "ReLU"
    4736 }
    4737 
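# ---- res4b20: same bottleneck pattern, identity shortcut from res4b19 ----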
    4738 layer {
    4739     bottom: "res4b19"
    4740     top: "res4b20_branch2a"
    4741     name: "res4b20_branch2a"
    4742     type: "Convolution"
    4743     convolution_param {
    4744         num_output: 256
    4745         kernel_size: 1
    4746         pad: 0
    4747         stride: 1
    4748         weight_filler {
    4749             type: "msra"
    4750         }
    4751         bias_term: false
    4752 
    4753     }
    4754 }
    4755 
    4756 layer {
    4757     bottom: "res4b20_branch2a"
    4758     top: "res4b20_branch2a"
    4759     name: "bn4b20_branch2a"
    4760     type: "BatchNorm"
    4761     batch_norm_param {
    4762         use_global_stats: false
    4763     }
    4764 }
    4765 
    4766 layer {
    4767     bottom: "res4b20_branch2a"
    4768     top: "res4b20_branch2a"
    4769     name: "scale4b20_branch2a"
    4770     type: "Scale"
    4771     scale_param {
    4772         bias_term: true
    4773     }
    4774 }
    4775 
    4776 layer {
    4777     bottom: "res4b20_branch2a"
    4778     top: "res4b20_branch2a"
    4779     name: "res4b20_branch2a_relu"
    4780     type: "ReLU"
    4781 }
    4782 
    4783 layer {
    4784     bottom: "res4b20_branch2a"
    4785     top: "res4b20_branch2b"
    4786     name: "res4b20_branch2b"
    4787     type: "Convolution"
    4788     convolution_param {
    4789         num_output: 256
    4790         kernel_size: 3
    4791         pad: 1
    4792         stride: 1
    4793         weight_filler {
    4794             type: "msra"
    4795         }
    4796         bias_term: false
    4797 
    4798     }
    4799 }
    4800 
    4801 layer {
    4802     bottom: "res4b20_branch2b"
    4803     top: "res4b20_branch2b"
    4804     name: "bn4b20_branch2b"
    4805     type: "BatchNorm"
    4806     batch_norm_param {
    4807         use_global_stats: false
    4808     }
    4809 }
    4810 
    4811 layer {
    4812     bottom: "res4b20_branch2b"
    4813     top: "res4b20_branch2b"
    4814     name: "scale4b20_branch2b"
    4815     type: "Scale"
    4816     scale_param {
    4817         bias_term: true
    4818     }
    4819 }
    4820 
    4821 layer {
    4822     bottom: "res4b20_branch2b"
    4823     top: "res4b20_branch2b"
    4824     name: "res4b20_branch2b_relu"
    4825     type: "ReLU"
    4826 }
    4827 
    4828 layer {
    4829     bottom: "res4b20_branch2b"
    4830     top: "res4b20_branch2c"
    4831     name: "res4b20_branch2c"
    4832     type: "Convolution"
    4833     convolution_param {
    4834         num_output: 1024
    4835         kernel_size: 1
    4836         pad: 0
    4837         stride: 1
    4838         weight_filler {
    4839             type: "msra"
    4840         }
    4841         bias_term: false
    4842 
    4843     }
    4844 }
    4845 
    4846 layer {
    4847     bottom: "res4b20_branch2c"
    4848     top: "res4b20_branch2c"
    4849     name: "bn4b20_branch2c"
    4850     type: "BatchNorm"
    4851     batch_norm_param {
    4852         use_global_stats: false
    4853     }
    4854 }
    4855 
    4856 layer {
    4857     bottom: "res4b20_branch2c"
    4858     top: "res4b20_branch2c"
    4859     name: "scale4b20_branch2c"
    4860     type: "Scale"
    4861     scale_param {
    4862         bias_term: true
    4863     }
    4864 }
    4865 
    4866 layer {
    4867     bottom: "res4b19"
    4868     bottom: "res4b20_branch2c"
    4869     top: "res4b20"
    4870     name: "res4b20"
    4871     type: "Eltwise"
    4872     eltwise_param {
    4873         operation: SUM
    4874     }
    4875 }
    4876 
    4877 layer {
    4878     bottom: "res4b20"
    4879     top: "res4b20"
    4880     name: "res4b20_relu"
    4881     type: "ReLU"
    4882 }
    4883 
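# ---- res4b21: same bottleneck pattern, identity shortcut from res4b20 ----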
    4884 layer {
    4885     bottom: "res4b20"
    4886     top: "res4b21_branch2a"
    4887     name: "res4b21_branch2a"
    4888     type: "Convolution"
    4889     convolution_param {
    4890         num_output: 256
    4891         kernel_size: 1
    4892         pad: 0
    4893         stride: 1
    4894         weight_filler {
    4895             type: "msra"
    4896         }
    4897         bias_term: false
    4898 
    4899     }
    4900 }
    4901 
    4902 layer {
    4903     bottom: "res4b21_branch2a"
    4904     top: "res4b21_branch2a"
    4905     name: "bn4b21_branch2a"
    4906     type: "BatchNorm"
    4907     batch_norm_param {
    4908         use_global_stats: false
    4909     }
    4910 }
    4911 
    4912 layer {
    4913     bottom: "res4b21_branch2a"
    4914     top: "res4b21_branch2a"
    4915     name: "scale4b21_branch2a"
    4916     type: "Scale"
    4917     scale_param {
    4918         bias_term: true
    4919     }
    4920 }
    4921 
    4922 layer {
    4923     bottom: "res4b21_branch2a"
    4924     top: "res4b21_branch2a"
    4925     name: "res4b21_branch2a_relu"
    4926     type: "ReLU"
    4927 }
    4928 
    4929 layer {
    4930     bottom: "res4b21_branch2a"
    4931     top: "res4b21_branch2b"
    4932     name: "res4b21_branch2b"
    4933     type: "Convolution"
    4934     convolution_param {
    4935         num_output: 256
    4936         kernel_size: 3
    4937         pad: 1
    4938         stride: 1
    4939         weight_filler {
    4940             type: "msra"
    4941         }
    4942         bias_term: false
    4943 
    4944     }
    4945 }
    4946 
    4947 layer {
    4948     bottom: "res4b21_branch2b"
    4949     top: "res4b21_branch2b"
    4950     name: "bn4b21_branch2b"
    4951     type: "BatchNorm"
    4952     batch_norm_param {
    4953         use_global_stats: false
    4954     }
    4955 }
    4956 
    4957 layer {
    4958     bottom: "res4b21_branch2b"
    4959     top: "res4b21_branch2b"
    4960     name: "scale4b21_branch2b"
    4961     type: "Scale"
    4962     scale_param {
    4963         bias_term: true
    4964     }
    4965 }
    4966 
    4967 layer {
    4968     bottom: "res4b21_branch2b"
    4969     top: "res4b21_branch2b"
    4970     name: "res4b21_branch2b_relu"
    4971     type: "ReLU"
    4972 }
    4973 
    4974 layer {
    4975     bottom: "res4b21_branch2b"
    4976     top: "res4b21_branch2c"
    4977     name: "res4b21_branch2c"
    4978     type: "Convolution"
    4979     convolution_param {
    4980         num_output: 1024
    4981         kernel_size: 1
    4982         pad: 0
    4983         stride: 1
    4984         weight_filler {
    4985             type: "msra"
    4986         }
    4987         bias_term: false
    4988 
    4989     }
    4990 }
    4991 
    4992 layer {
    4993     bottom: "res4b21_branch2c"
    4994     top: "res4b21_branch2c"
    4995     name: "bn4b21_branch2c"
    4996     type: "BatchNorm"
    4997     batch_norm_param {
    4998         use_global_stats: false
    4999     }
    5000 }
    5001 
    5002 layer {
    5003     bottom: "res4b21_branch2c"
    5004     top: "res4b21_branch2c"
    5005     name: "scale4b21_branch2c"
    5006     type: "Scale"
    5007     scale_param {
    5008         bias_term: true
    5009     }
    5010 }
    5011 
    5012 layer {
    5013     bottom: "res4b20"
    5014     bottom: "res4b21_branch2c"
    5015     top: "res4b21"
    5016     name: "res4b21"
    5017     type: "Eltwise"
    5018     eltwise_param {
    5019         operation: SUM
    5020     }
    5021 }
    5022 
    5023 layer {
    5024     bottom: "res4b21"
    5025     top: "res4b21"
    5026     name: "res4b21_relu"
    5027     type: "ReLU"
    5028 }
    5029 
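# ---- res4b22: same bottleneck pattern, identity shortcut from res4b21 ----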
    5030 layer {
    5031     bottom: "res4b21"
    5032     top: "res4b22_branch2a"
    5033     name: "res4b22_branch2a"
    5034     type: "Convolution"
    5035     convolution_param {
    5036         num_output: 256
    5037         kernel_size: 1
    5038         pad: 0
    5039         stride: 1
    5040         weight_filler {
    5041             type: "msra"
    5042         }
    5043         bias_term: false
    5044 
    5045     }
    5046 }
    5047 
    5048 layer {
    5049     bottom: "res4b22_branch2a"
    5050     top: "res4b22_branch2a"
    5051     name: "bn4b22_branch2a"
    5052     type: "BatchNorm"
    5053     batch_norm_param {
    5054         use_global_stats: false
    5055     }
    5056 }
    5057 
    5058 layer {
    5059     bottom: "res4b22_branch2a"
    5060     top: "res4b22_branch2a"
    5061     name: "scale4b22_branch2a"
    5062     type: "Scale"
    5063     scale_param {
    5064         bias_term: true
    5065     }
    5066 }
    5067 
    5068 layer {
    5069     bottom: "res4b22_branch2a"
    5070     top: "res4b22_branch2a"
    5071     name: "res4b22_branch2a_relu"
    5072     type: "ReLU"
    5073 }
    5074 
    5075 layer {
    5076     bottom: "res4b22_branch2a"
    5077     top: "res4b22_branch2b"
    5078     name: "res4b22_branch2b"
    5079     type: "Convolution"
    5080     convolution_param {
    5081         num_output: 256
    5082         kernel_size: 3
    5083         pad: 1
    5084         stride: 1
    5085         weight_filler {
    5086             type: "msra"
    5087         }
    5088         bias_term: false
    5089 
    5090     }
    5091 }
    5092 
    5093 layer {
    5094     bottom: "res4b22_branch2b"
    5095     top: "res4b22_branch2b"
    5096     name: "bn4b22_branch2b"
    5097     type: "BatchNorm"
    5098     batch_norm_param {
    5099         use_global_stats: false
    5100     }
    5101 }
    5102 
    5103 layer {
    5104     bottom: "res4b22_branch2b"
    5105     top: "res4b22_branch2b"
    5106     name: "scale4b22_branch2b"
    5107     type: "Scale"
    5108     scale_param {
    5109         bias_term: true
    5110     }
    5111 }
    5112 
    5113 layer {
    5114     bottom: "res4b22_branch2b"
    5115     top: "res4b22_branch2b"
    5116     name: "res4b22_branch2b_relu"
    5117     type: "ReLU"
    5118 }
    5119 
    5120 layer {
    5121     bottom: "res4b22_branch2b"
    5122     top: "res4b22_branch2c"
    5123     name: "res4b22_branch2c"
    5124     type: "Convolution"
    5125     convolution_param {
    5126         num_output: 1024
    5127         kernel_size: 1
    5128         pad: 0
    5129         stride: 1
    5130         weight_filler {
    5131             type: "msra"
    5132         }
    5133         bias_term: false
    5134 
    5135     }
    5136 }
    5137 
    5138 layer {
    5139     bottom: "res4b22_branch2c"
    5140     top: "res4b22_branch2c"
    5141     name: "bn4b22_branch2c"
    5142     type: "BatchNorm"
    5143     batch_norm_param {
    5144         use_global_stats: false
    5145     }
    5146 }
    5147 
    5148 layer {
    5149     bottom: "res4b22_branch2c"
    5150     top: "res4b22_branch2c"
    5151     name: "scale4b22_branch2c"
    5152     type: "Scale"
    5153     scale_param {
    5154         bias_term: true
    5155     }
    5156 }
    5157 
    5158 layer {
    5159     bottom: "res4b21"
    5160     bottom: "res4b22_branch2c"
    5161     top: "res4b22"
    5162     name: "res4b22"
    5163     type: "Eltwise"
    5164     eltwise_param {
    5165         operation: SUM
    5166     }
    5167 }
    5168 
    5169 layer {
    5170     bottom: "res4b22"
    5171     top: "res4b22"
    5172     name: "res4b22_relu"
    5173     type: "ReLU"
    5174 }
    5175 
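# ---- res4b23: same bottleneck pattern, identity shortcut from res4b22 ----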
    5176 layer {
    5177     bottom: "res4b22"
    5178     top: "res4b23_branch2a"
    5179     name: "res4b23_branch2a"
    5180     type: "Convolution"
    5181     convolution_param {
    5182         num_output: 256
    5183         kernel_size: 1
    5184         pad: 0
    5185         stride: 1
    5186         weight_filler {
    5187             type: "msra"
    5188         }
    5189         bias_term: false
    5190 
    5191     }
    5192 }
    5193 
    5194 layer {
    5195     bottom: "res4b23_branch2a"
    5196     top: "res4b23_branch2a"
    5197     name: "bn4b23_branch2a"
    5198     type: "BatchNorm"
    5199     batch_norm_param {
    5200         use_global_stats: false
    5201     }
    5202 }
    5203 
    5204 layer {
    5205     bottom: "res4b23_branch2a"
    5206     top: "res4b23_branch2a"
    5207     name: "scale4b23_branch2a"
    5208     type: "Scale"
    5209     scale_param {
    5210         bias_term: true
    5211     }
    5212 }
    5213 
    5214 layer {
    5215     bottom: "res4b23_branch2a"
    5216     top: "res4b23_branch2a"
    5217     name: "res4b23_branch2a_relu"
    5218     type: "ReLU"
    5219 }
    5220 
    5221 layer {
    5222     bottom: "res4b23_branch2a"
    5223     top: "res4b23_branch2b"
    5224     name: "res4b23_branch2b"
    5225     type: "Convolution"
    5226     convolution_param {
    5227         num_output: 256
    5228         kernel_size: 3
    5229         pad: 1
    5230         stride: 1
    5231         weight_filler {
    5232             type: "msra"
    5233         }
    5234         bias_term: false
    5235 
    5236     }
    5237 }
    5238 
    5239 layer {
    5240     bottom: "res4b23_branch2b"
    5241     top: "res4b23_branch2b"
    5242     name: "bn4b23_branch2b"
    5243     type: "BatchNorm"
    5244     batch_norm_param {
    5245         use_global_stats: false
    5246     }
    5247 }
    5248 
    5249 layer {
    5250     bottom: "res4b23_branch2b"
    5251     top: "res4b23_branch2b"
    5252     name: "scale4b23_branch2b"
    5253     type: "Scale"
    5254     scale_param {
    5255         bias_term: true
    5256     }
    5257 }
    5258 
    5259 layer {
    5260     bottom: "res4b23_branch2b"
    5261     top: "res4b23_branch2b"
    5262     name: "res4b23_branch2b_relu"
    5263     type: "ReLU"
    5264 }
    5265 
    5266 layer {
    5267     bottom: "res4b23_branch2b"
    5268     top: "res4b23_branch2c"
    5269     name: "res4b23_branch2c"
    5270     type: "Convolution"
    5271     convolution_param {
    5272         num_output: 1024
    5273         kernel_size: 1
    5274         pad: 0
    5275         stride: 1
    5276         weight_filler {
    5277             type: "msra"
    5278         }
    5279         bias_term: false
    5280 
    5281     }
    5282 }
    5283 
    5284 layer {
    5285     bottom: "res4b23_branch2c"
    5286     top: "res4b23_branch2c"
    5287     name: "bn4b23_branch2c"
    5288     type: "BatchNorm"
    5289     batch_norm_param {
    5290         use_global_stats: false
    5291     }
    5292 }
    5293 
    5294 layer {
    5295     bottom: "res4b23_branch2c"
    5296     top: "res4b23_branch2c"
    5297     name: "scale4b23_branch2c"
    5298     type: "Scale"
    5299     scale_param {
    5300         bias_term: true
    5301     }
    5302 }
    5303 
    5304 layer {
    5305     bottom: "res4b22"
    5306     bottom: "res4b23_branch2c"
    5307     top: "res4b23"
    5308     name: "res4b23"
    5309     type: "Eltwise"
    5310     eltwise_param {
    5311         operation: SUM
    5312     }
    5313 }
    5314 
    5315 layer {
    5316     bottom: "res4b23"
    5317     top: "res4b23"
    5318     name: "res4b23_relu"
    5319     type: "ReLU"
    5320 }
    5321 
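# ---- res4b24: same bottleneck pattern, identity shortcut from res4b23 ----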
    5322 layer {
    5323     bottom: "res4b23"
    5324     top: "res4b24_branch2a"
    5325     name: "res4b24_branch2a"
    5326     type: "Convolution"
    5327     convolution_param {
    5328         num_output: 256
    5329         kernel_size: 1
    5330         pad: 0
    5331         stride: 1
    5332         weight_filler {
    5333             type: "msra"
    5334         }
    5335         bias_term: false
    5336 
    5337     }
    5338 }
    5339 
    5340 layer {
    5341     bottom: "res4b24_branch2a"
    5342     top: "res4b24_branch2a"
    5343     name: "bn4b24_branch2a"
    5344     type: "BatchNorm"
    5345     batch_norm_param {
    5346         use_global_stats: false
    5347     }
    5348 }
    5349 
    5350 layer {
    5351     bottom: "res4b24_branch2a"
    5352     top: "res4b24_branch2a"
    5353     name: "scale4b24_branch2a"
    5354     type: "Scale"
    5355     scale_param {
    5356         bias_term: true
    5357     }
    5358 }
    5359 
    5360 layer {
    5361     bottom: "res4b24_branch2a"
    5362     top: "res4b24_branch2a"
    5363     name: "res4b24_branch2a_relu"
    5364     type: "ReLU"
    5365 }
    5366 
    5367 layer {
    5368     bottom: "res4b24_branch2a"
    5369     top: "res4b24_branch2b"
    5370     name: "res4b24_branch2b"
    5371     type: "Convolution"
    5372     convolution_param {
    5373         num_output: 256
    5374         kernel_size: 3
    5375         pad: 1
    5376         stride: 1
    5377         weight_filler {
    5378             type: "msra"
    5379         }
    5380         bias_term: false
    5381 
    5382     }
    5383 }
    5384 
    5385 layer {
    5386     bottom: "res4b24_branch2b"
    5387     top: "res4b24_branch2b"
    5388     name: "bn4b24_branch2b"
    5389     type: "BatchNorm"
    5390     batch_norm_param {
    5391         use_global_stats: false
    5392     }
    5393 }
    5394 
    5395 layer {
    5396     bottom: "res4b24_branch2b"
    5397     top: "res4b24_branch2b"
    5398     name: "scale4b24_branch2b"
    5399     type: "Scale"
    5400     scale_param {
    5401         bias_term: true
    5402     }
    5403 }
    5404 
    5405 layer {
    5406     bottom: "res4b24_branch2b"
    5407     top: "res4b24_branch2b"
    5408     name: "res4b24_branch2b_relu"
    5409     type: "ReLU"
    5410 }
    5411 
    5412 layer {
    5413     bottom: "res4b24_branch2b"
    5414     top: "res4b24_branch2c"
    5415     name: "res4b24_branch2c"
    5416     type: "Convolution"
    5417     convolution_param {
    5418         num_output: 1024
    5419         kernel_size: 1
    5420         pad: 0
    5421         stride: 1
    5422         weight_filler {
    5423             type: "msra"
    5424         }
    5425         bias_term: false
    5426 
    5427     }
    5428 }
    5429 
    5430 layer {
    5431     bottom: "res4b24_branch2c"
    5432     top: "res4b24_branch2c"
    5433     name: "bn4b24_branch2c"
    5434     type: "BatchNorm"
    5435     batch_norm_param {
    5436         use_global_stats: false
    5437     }
    5438 }
    5439 
    5440 layer {
    5441     bottom: "res4b24_branch2c"
    5442     top: "res4b24_branch2c"
    5443     name: "scale4b24_branch2c"
    5444     type: "Scale"
    5445     scale_param {
    5446         bias_term: true
    5447     }
    5448 }
    5449 
    5450 layer {
    5451     bottom: "res4b23"
    5452     bottom: "res4b24_branch2c"
    5453     top: "res4b24"
    5454     name: "res4b24"
    5455     type: "Eltwise"
    5456     eltwise_param {
    5457         operation: SUM
    5458     }
    5459 }
    5460 
    5461 layer {
    5462     bottom: "res4b24"
    5463     top: "res4b24"
    5464     name: "res4b24_relu"
    5465     type: "ReLU"
    5466 }
    5467 
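# ---- res4b25: same bottleneck pattern, identity shortcut from res4b24 ----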
    5468 layer {
    5469     bottom: "res4b24"
    5470     top: "res4b25_branch2a"
    5471     name: "res4b25_branch2a"
    5472     type: "Convolution"
    5473     convolution_param {
    5474         num_output: 256
    5475         kernel_size: 1
    5476         pad: 0
    5477         stride: 1
    5478         weight_filler {
    5479             type: "msra"
    5480         }
    5481         bias_term: false
    5482 
    5483     }
    5484 }
    5485 
    5486 layer {
    5487     bottom: "res4b25_branch2a"
    5488     top: "res4b25_branch2a"
    5489     name: "bn4b25_branch2a"
    5490     type: "BatchNorm"
    5491     batch_norm_param {
    5492         use_global_stats: false
    5493     }
    5494 }
    5495 
    5496 layer {
    5497     bottom: "res4b25_branch2a"
    5498     top: "res4b25_branch2a"
    5499     name: "scale4b25_branch2a"
    5500     type: "Scale"
    5501     scale_param {
    5502         bias_term: true
    5503     }
    5504 }
    5505 
    5506 layer {
    5507     bottom: "res4b25_branch2a"
    5508     top: "res4b25_branch2a"
    5509     name: "res4b25_branch2a_relu"
    5510     type: "ReLU"
    5511 }
    5512 
    5513 layer {
    5514     bottom: "res4b25_branch2a"
    5515     top: "res4b25_branch2b"
    5516     name: "res4b25_branch2b"
    5517     type: "Convolution"
    5518     convolution_param {
    5519         num_output: 256
    5520         kernel_size: 3
    5521         pad: 1
    5522         stride: 1
    5523         weight_filler {
    5524             type: "msra"
    5525         }
    5526         bias_term: false
    5527 
    5528     }
    5529 }
    5530 
    5531 layer {
    5532     bottom: "res4b25_branch2b"
    5533     top: "res4b25_branch2b"
    5534     name: "bn4b25_branch2b"
    5535     type: "BatchNorm"
    5536     batch_norm_param {
    5537         use_global_stats: false
    5538     }
    5539 }
    5540 
    5541 layer {
    5542     bottom: "res4b25_branch2b"
    5543     top: "res4b25_branch2b"
    5544     name: "scale4b25_branch2b"
    5545     type: "Scale"
    5546     scale_param {
    5547         bias_term: true
    5548     }
    5549 }
    5550 
    5551 layer {
    5552     bottom: "res4b25_branch2b"
    5553     top: "res4b25_branch2b"
    5554     name: "res4b25_branch2b_relu"
    5555     type: "ReLU"
    5556 }
    5557 
    5558 layer {
    5559     bottom: "res4b25_branch2b"
    5560     top: "res4b25_branch2c"
    5561     name: "res4b25_branch2c"
    5562     type: "Convolution"
    5563     convolution_param {
    5564         num_output: 1024
    5565         kernel_size: 1
    5566         pad: 0
    5567         stride: 1
    5568         weight_filler {
    5569             type: "msra"
    5570         }
    5571         bias_term: false
    5572 
    5573     }
    5574 }
    5575 
    5576 layer {
    5577     bottom: "res4b25_branch2c"
    5578     top: "res4b25_branch2c"
    5579     name: "bn4b25_branch2c"
    5580     type: "BatchNorm"
    5581     batch_norm_param {
    5582         use_global_stats: false
    5583     }
    5584 }
    5585 
    5586 layer {
    5587     bottom: "res4b25_branch2c"
    5588     top: "res4b25_branch2c"
    5589     name: "scale4b25_branch2c"
    5590     type: "Scale"
    5591     scale_param {
    5592         bias_term: true
    5593     }
    5594 }
    5595 
    5596 layer {
    5597     bottom: "res4b24"
    5598     bottom: "res4b25_branch2c"
    5599     top: "res4b25"
    5600     name: "res4b25"
    5601     type: "Eltwise"
    5602     eltwise_param {
    5603         operation: SUM
    5604     }
    5605 }
    5606 
    5607 layer {
    5608     bottom: "res4b25"
    5609     top: "res4b25"
    5610     name: "res4b25_relu"
    5611     type: "ReLU"
    5612 }
    5613 
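# ---- res4b26: same bottleneck pattern, identity shortcut from res4b25 ----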
    5614 layer {
    5615     bottom: "res4b25"
    5616     top: "res4b26_branch2a"
    5617     name: "res4b26_branch2a"
    5618     type: "Convolution"
    5619     convolution_param {
    5620         num_output: 256
    5621         kernel_size: 1
    5622         pad: 0
    5623         stride: 1
    5624         weight_filler {
    5625             type: "msra"
    5626         }
    5627         bias_term: false
    5628 
    5629     }
    5630 }
    5631 
    5632 layer {
    5633     bottom: "res4b26_branch2a"
    5634     top: "res4b26_branch2a"
    5635     name: "bn4b26_branch2a"
    5636     type: "BatchNorm"
    5637     batch_norm_param {
    5638         use_global_stats: false
    5639     }
    5640 }
    5641 
    5642 layer {
    5643     bottom: "res4b26_branch2a"
    5644     top: "res4b26_branch2a"
    5645     name: "scale4b26_branch2a"
    5646     type: "Scale"
    5647     scale_param {
    5648         bias_term: true
    5649     }
    5650 }
    5651 
    5652 layer {
    5653     bottom: "res4b26_branch2a"
    5654     top: "res4b26_branch2a"
    5655     name: "res4b26_branch2a_relu"
    5656     type: "ReLU"
    5657 }
    5658 
    5659 layer {
    5660     bottom: "res4b26_branch2a"
    5661     top: "res4b26_branch2b"
    5662     name: "res4b26_branch2b"
    5663     type: "Convolution"
    5664     convolution_param {
    5665         num_output: 256
    5666         kernel_size: 3
    5667         pad: 1
    5668         stride: 1
    5669         weight_filler {
    5670             type: "msra"
    5671         }
    5672         bias_term: false
    5673 
    5674     }
    5675 }
    5676 
    5677 layer {
    5678     bottom: "res4b26_branch2b"
    5679     top: "res4b26_branch2b"
    5680     name: "bn4b26_branch2b"
    5681     type: "BatchNorm"
    5682     batch_norm_param {
    5683         use_global_stats: false
    5684     }
    5685 }
    5686 
    5687 layer {
    5688     bottom: "res4b26_branch2b"
    5689     top: "res4b26_branch2b"
    5690     name: "scale4b26_branch2b"
    5691     type: "Scale"
    5692     scale_param {
    5693         bias_term: true
    5694     }
    5695 }
    5696 
    5697 layer {
    5698     bottom: "res4b26_branch2b"
    5699     top: "res4b26_branch2b"
    5700     name: "res4b26_branch2b_relu"
    5701     type: "ReLU"
    5702 }
    5703 
    5704 layer {
    5705     bottom: "res4b26_branch2b"
    5706     top: "res4b26_branch2c"
    5707     name: "res4b26_branch2c"
    5708     type: "Convolution"
    5709     convolution_param {
    5710         num_output: 1024
    5711         kernel_size: 1
    5712         pad: 0
    5713         stride: 1
    5714         weight_filler {
    5715             type: "msra"
    5716         }
    5717         bias_term: false
    5718 
    5719     }
    5720 }
    5721 
    5722 layer {
    5723     bottom: "res4b26_branch2c"
    5724     top: "res4b26_branch2c"
    5725     name: "bn4b26_branch2c"
    5726     type: "BatchNorm"
    5727     batch_norm_param {
    5728         use_global_stats: false
    5729     }
    5730 }
    5731 
    5732 layer {
    5733     bottom: "res4b26_branch2c"
    5734     top: "res4b26_branch2c"
    5735     name: "scale4b26_branch2c"
    5736     type: "Scale"
    5737     scale_param {
    5738         bias_term: true
    5739     }
    5740 }
    5741 
    5742 layer {
    5743     bottom: "res4b25"
    5744     bottom: "res4b26_branch2c"
    5745     top: "res4b26"
    5746     name: "res4b26"
    5747     type: "Eltwise"
    5748     eltwise_param {
    5749         operation: SUM
    5750     }
    5751 }
    5752 
    5753 layer {
    5754     bottom: "res4b26"
    5755     top: "res4b26"
    5756     name: "res4b26_relu"
    5757     type: "ReLU"
    5758 }
    5759 
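# ---- res4b27: same bottleneck pattern, identity shortcut from res4b26 ----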
    5760 layer {
    5761     bottom: "res4b26"
    5762     top: "res4b27_branch2a"
    5763     name: "res4b27_branch2a"
    5764     type: "Convolution"
    5765     convolution_param {
    5766         num_output: 256
    5767         kernel_size: 1
    5768         pad: 0
    5769         stride: 1
    5770         weight_filler {
    5771             type: "msra"
    5772         }
    5773         bias_term: false
    5774 
    5775     }
    5776 }
    5777 
    5778 layer {
    5779     bottom: "res4b27_branch2a"
    5780     top: "res4b27_branch2a"
    5781     name: "bn4b27_branch2a"
    5782     type: "BatchNorm"
    5783     batch_norm_param {
    5784         use_global_stats: false
    5785     }
    5786 }
    5787 
    5788 layer {
    5789     bottom: "res4b27_branch2a"
    5790     top: "res4b27_branch2a"
    5791     name: "scale4b27_branch2a"
    5792     type: "Scale"
    5793     scale_param {
    5794         bias_term: true
    5795     }
    5796 }
    5797 
    5798 layer {
    5799     bottom: "res4b27_branch2a"
    5800     top: "res4b27_branch2a"
    5801     name: "res4b27_branch2a_relu"
    5802     type: "ReLU"
    5803 }
    5804 
    5805 layer {
    5806     bottom: "res4b27_branch2a"
    5807     top: "res4b27_branch2b"
    5808     name: "res4b27_branch2b"
    5809     type: "Convolution"
    5810     convolution_param {
    5811         num_output: 256
    5812         kernel_size: 3
    5813         pad: 1
    5814         stride: 1
    5815         weight_filler {
    5816             type: "msra"
    5817         }
    5818         bias_term: false
    5819 
    5820     }
    5821 }
    5822 
    5823 layer {
    5824     bottom: "res4b27_branch2b"
    5825     top: "res4b27_branch2b"
    5826     name: "bn4b27_branch2b"
    5827     type: "BatchNorm"
    5828     batch_norm_param {
    5829         use_global_stats: false
    5830     }
    5831 }
    5832 
    5833 layer {
    5834     bottom: "res4b27_branch2b"
    5835     top: "res4b27_branch2b"
    5836     name: "scale4b27_branch2b"
    5837     type: "Scale"
    5838     scale_param {
    5839         bias_term: true
    5840     }
    5841 }
    5842 
    5843 layer {
    5844     bottom: "res4b27_branch2b"
    5845     top: "res4b27_branch2b"
    5846     name: "res4b27_branch2b_relu"
    5847     type: "ReLU"
    5848 }
    5849 
    5850 layer {
    5851     bottom: "res4b27_branch2b"
    5852     top: "res4b27_branch2c"
    5853     name: "res4b27_branch2c"
    5854     type: "Convolution"
    5855     convolution_param {
    5856         num_output: 1024
    5857         kernel_size: 1
    5858         pad: 0
    5859         stride: 1
    5860         weight_filler {
    5861             type: "msra"
    5862         }
    5863         bias_term: false
    5864 
    5865     }
    5866 }
    5867 
    5868 layer {
    5869     bottom: "res4b27_branch2c"
    5870     top: "res4b27_branch2c"
    5871     name: "bn4b27_branch2c"
    5872     type: "BatchNorm"
    5873     batch_norm_param {
    5874         use_global_stats: false
    5875     }
    5876 }
    5877 
    5878 layer {
    5879     bottom: "res4b27_branch2c"
    5880     top: "res4b27_branch2c"
    5881     name: "scale4b27_branch2c"
    5882     type: "Scale"
    5883     scale_param {
    5884         bias_term: true
    5885     }
    5886 }
    5887 
    5888 layer {
    5889     bottom: "res4b26"
    5890     bottom: "res4b27_branch2c"
    5891     top: "res4b27"
    5892     name: "res4b27"
    5893     type: "Eltwise"
    5894     eltwise_param {
    5895         operation: SUM
    5896     }
    5897 }
    5898 
    5899 layer {
    5900     bottom: "res4b27"
    5901     top: "res4b27"
    5902     name: "res4b27_relu"
    5903     type: "ReLU"
    5904 }
    5905 
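# ---- res4b28: same bottleneck pattern, identity shortcut from res4b27 ----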
    5906 layer {
    5907     bottom: "res4b27"
    5908     top: "res4b28_branch2a"
    5909     name: "res4b28_branch2a"
    5910     type: "Convolution"
    5911     convolution_param {
    5912         num_output: 256
    5913         kernel_size: 1
    5914         pad: 0
    5915         stride: 1
    5916         weight_filler {
    5917             type: "msra"
    5918         }
    5919         bias_term: false
    5920 
    5921     }
    5922 }
    5923 
    5924 layer {
    5925     bottom: "res4b28_branch2a"
    5926     top: "res4b28_branch2a"
    5927     name: "bn4b28_branch2a"
    5928     type: "BatchNorm"
    5929     batch_norm_param {
    5930         use_global_stats: false
    5931     }
    5932 }
    5933 
    5934 layer {
    5935     bottom: "res4b28_branch2a"
    5936     top: "res4b28_branch2a"
    5937     name: "scale4b28_branch2a"
    5938     type: "Scale"
    5939     scale_param {
    5940         bias_term: true
    5941     }
    5942 }
    5943 
    5944 layer {
    5945     bottom: "res4b28_branch2a"
    5946     top: "res4b28_branch2a"
    5947     name: "res4b28_branch2a_relu"
    5948     type: "ReLU"
    5949 }
    5950 
    5951 layer {
    5952     bottom: "res4b28_branch2a"
    5953     top: "res4b28_branch2b"
    5954     name: "res4b28_branch2b"
    5955     type: "Convolution"
    5956     convolution_param {
    5957         num_output: 256
    5958         kernel_size: 3
    5959         pad: 1
    5960         stride: 1
    5961         weight_filler {
    5962             type: "msra"
    5963         }
    5964         bias_term: false
    5965 
    5966     }
    5967 }
    5968 
    5969 layer {
    5970     bottom: "res4b28_branch2b"
    5971     top: "res4b28_branch2b"
    5972     name: "bn4b28_branch2b"
    5973     type: "BatchNorm"
    5974     batch_norm_param {
    5975         use_global_stats: false
    5976     }
    5977 }
    5978 
    5979 layer {
    5980     bottom: "res4b28_branch2b"
    5981     top: "res4b28_branch2b"
    5982     name: "scale4b28_branch2b"
    5983     type: "Scale"
    5984     scale_param {
    5985         bias_term: true
    5986     }
    5987 }
    5988 
    5989 layer {
    5990     bottom: "res4b28_branch2b"
    5991     top: "res4b28_branch2b"
    5992     name: "res4b28_branch2b_relu"
    5993     type: "ReLU"
    5994 }
    5995 
    5996 layer {
    5997     bottom: "res4b28_branch2b"
    5998     top: "res4b28_branch2c"
    5999     name: "res4b28_branch2c"
    6000     type: "Convolution"
    6001     convolution_param {
    6002         num_output: 1024
    6003         kernel_size: 1
    6004         pad: 0
    6005         stride: 1
    6006         weight_filler {
    6007             type: "msra"
    6008         }
    6009         bias_term: false
    6010 
    6011     }
    6012 }
    6013 
    6014 layer {
    6015     bottom: "res4b28_branch2c"
    6016     top: "res4b28_branch2c"
    6017     name: "bn4b28_branch2c"
    6018     type: "BatchNorm"
    6019     batch_norm_param {
    6020         use_global_stats: false
    6021     }
    6022 }
    6023 
    6024 layer {
    6025     bottom: "res4b28_branch2c"
    6026     top: "res4b28_branch2c"
    6027     name: "scale4b28_branch2c"
    6028     type: "Scale"
    6029     scale_param {
    6030         bias_term: true
    6031     }
    6032 }
    6033 
    6034 layer {
    6035     bottom: "res4b27"
    6036     bottom: "res4b28_branch2c"
    6037     top: "res4b28"
    6038     name: "res4b28"
    6039     type: "Eltwise"
    6040     eltwise_param {
    6041         operation: SUM
    6042     }
    6043 }
    6044 
    6045 layer {
    6046     bottom: "res4b28"
    6047     top: "res4b28"
    6048     name: "res4b28_relu"
    6049     type: "ReLU"
    6050 }
    6051 
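# ---- res4b29: same bottleneck pattern, identity shortcut from res4b28 ----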
    6052 layer {
    6053     bottom: "res4b28"
    6054     top: "res4b29_branch2a"
    6055     name: "res4b29_branch2a"
    6056     type: "Convolution"
    6057     convolution_param {
    6058         num_output: 256
    6059         kernel_size: 1
    6060         pad: 0
    6061         stride: 1
    6062         weight_filler {
    6063             type: "msra"
    6064         }
    6065         bias_term: false
    6066 
    6067     }
    6068 }
    6069 
    6070 layer {
    6071     bottom: "res4b29_branch2a"
    6072     top: "res4b29_branch2a"
    6073     name: "bn4b29_branch2a"
    6074     type: "BatchNorm"
    6075     batch_norm_param {
    6076         use_global_stats: false
    6077     }
    6078 }
    6079 
    6080 layer {
    6081     bottom: "res4b29_branch2a"
    6082     top: "res4b29_branch2a"
    6083     name: "scale4b29_branch2a"
    6084     type: "Scale"
    6085     scale_param {
    6086         bias_term: true
    6087     }
    6088 }
    6089 
    6090 layer {
    6091     bottom: "res4b29_branch2a"
    6092     top: "res4b29_branch2a"
    6093     name: "res4b29_branch2a_relu"
    6094     type: "ReLU"
    6095 }
    6096 
    6097 layer {
    6098     bottom: "res4b29_branch2a"
    6099     top: "res4b29_branch2b"
    6100     name: "res4b29_branch2b"
    6101     type: "Convolution"
    6102     convolution_param {
    6103         num_output: 256
    6104         kernel_size: 3
    6105         pad: 1
    6106         stride: 1
    6107         weight_filler {
    6108             type: "msra"
    6109         }
    6110         bias_term: false
    6111 
    6112     }
    6113 }
    6114 
    6115 layer {
    6116     bottom: "res4b29_branch2b"
    6117     top: "res4b29_branch2b"
    6118     name: "bn4b29_branch2b"
    6119     type: "BatchNorm"
    6120     batch_norm_param {
    6121         use_global_stats: false
    6122     }
    6123 }
    6124 
    6125 layer {
    6126     bottom: "res4b29_branch2b"
    6127     top: "res4b29_branch2b"
    6128     name: "scale4b29_branch2b"
    6129     type: "Scale"
    6130     scale_param {
    6131         bias_term: true
    6132     }
    6133 }
    6134 
    6135 layer {
    6136     bottom: "res4b29_branch2b"
    6137     top: "res4b29_branch2b"
    6138     name: "res4b29_branch2b_relu"
    6139     type: "ReLU"
    6140 }
    6141 
    6142 layer {
    6143     bottom: "res4b29_branch2b"
    6144     top: "res4b29_branch2c"
    6145     name: "res4b29_branch2c"
    6146     type: "Convolution"
    6147     convolution_param {
    6148         num_output: 1024
    6149         kernel_size: 1
    6150         pad: 0
    6151         stride: 1
    6152         weight_filler {
    6153             type: "msra"
    6154         }
    6155         bias_term: false
    6156 
    6157     }
    6158 }
    6159 
    6160 layer {
    6161     bottom: "res4b29_branch2c"
    6162     top: "res4b29_branch2c"
    6163     name: "bn4b29_branch2c"
    6164     type: "BatchNorm"
    6165     batch_norm_param {
    6166         use_global_stats: false
    6167     }
    6168 }
    6169 
    6170 layer {
    6171     bottom: "res4b29_branch2c"
    6172     top: "res4b29_branch2c"
    6173     name: "scale4b29_branch2c"
    6174     type: "Scale"
    6175     scale_param {
    6176         bias_term: true
    6177     }
    6178 }
    6179 
    6180 layer {
    6181     bottom: "res4b28"
    6182     bottom: "res4b29_branch2c"
    6183     top: "res4b29"
    6184     name: "res4b29"
    6185     type: "Eltwise"
    6186     eltwise_param {
    6187         operation: SUM
    6188     }
    6189 }
    6190 
    6191 layer {
    6192     bottom: "res4b29"
    6193     top: "res4b29"
    6194     name: "res4b29_relu"
    6195     type: "ReLU"
    6196 }
    6197 
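# ---- res4b30: same bottleneck pattern, identity shortcut from res4b29 ----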
    6198 layer {
    6199     bottom: "res4b29"
    6200     top: "res4b30_branch2a"
    6201     name: "res4b30_branch2a"
    6202     type: "Convolution"
    6203     convolution_param {
    6204         num_output: 256
    6205         kernel_size: 1
    6206         pad: 0
    6207         stride: 1
    6208         weight_filler {
    6209             type: "msra"
    6210         }
    6211         bias_term: false
    6212 
    6213     }
    6214 }
    6215 
    6216 layer {
    6217     bottom: "res4b30_branch2a"
    6218     top: "res4b30_branch2a"
    6219     name: "bn4b30_branch2a"
    6220     type: "BatchNorm"
    6221     batch_norm_param {
    6222         use_global_stats: false
    6223     }
    6224 }
    6225 
    6226 layer {
    6227     bottom: "res4b30_branch2a"
    6228     top: "res4b30_branch2a"
    6229     name: "scale4b30_branch2a"
    6230     type: "Scale"
    6231     scale_param {
    6232         bias_term: true
    6233     }
    6234 }
    6235 
    6236 layer {
    6237     bottom: "res4b30_branch2a"
    6238     top: "res4b30_branch2a"
    6239     name: "res4b30_branch2a_relu"
    6240     type: "ReLU"
    6241 }
    6242 
    6243 layer {
    6244     bottom: "res4b30_branch2a"
    6245     top: "res4b30_branch2b"
    6246     name: "res4b30_branch2b"
    6247     type: "Convolution"
    6248     convolution_param {
    6249         num_output: 256
    6250         kernel_size: 3
    6251         pad: 1
    6252         stride: 1
    6253         weight_filler {
    6254             type: "msra"
    6255         }
    6256         bias_term: false
    6257 
    6258     }
    6259 }
    6260 
    6261 layer {
    6262     bottom: "res4b30_branch2b"
    6263     top: "res4b30_branch2b"
    6264     name: "bn4b30_branch2b"
    6265     type: "BatchNorm"
    6266     batch_norm_param {
    6267         use_global_stats: false
    6268     }
    6269 }
    6270 
    6271 layer {
    6272     bottom: "res4b30_branch2b"
    6273     top: "res4b30_branch2b"
    6274     name: "scale4b30_branch2b"
    6275     type: "Scale"
    6276     scale_param {
    6277         bias_term: true
    6278     }
    6279 }
    6280 
    6281 layer {
    6282     bottom: "res4b30_branch2b"
    6283     top: "res4b30_branch2b"
    6284     name: "res4b30_branch2b_relu"
    6285     type: "ReLU"
    6286 }
    6287 
    6288 layer {
    6289     bottom: "res4b30_branch2b"
    6290     top: "res4b30_branch2c"
    6291     name: "res4b30_branch2c"
    6292     type: "Convolution"
    6293     convolution_param {
    6294         num_output: 1024
    6295         kernel_size: 1
    6296         pad: 0
    6297         stride: 1
    6298         weight_filler {
    6299             type: "msra"
    6300         }
    6301         bias_term: false
    6302 
    6303     }
    6304 }
    6305 
    6306 layer {
    6307     bottom: "res4b30_branch2c"
    6308     top: "res4b30_branch2c"
    6309     name: "bn4b30_branch2c"
    6310     type: "BatchNorm"
    6311     batch_norm_param {
    6312         use_global_stats: false
    6313     }
    6314 }
    6315 
    6316 layer {
    6317     bottom: "res4b30_branch2c"
    6318     top: "res4b30_branch2c"
    6319     name: "scale4b30_branch2c"
    6320     type: "Scale"
    6321     scale_param {
    6322         bias_term: true
    6323     }
    6324 }
    6325 
    6326 layer {
    6327     bottom: "res4b29"
    6328     bottom: "res4b30_branch2c"
    6329     top: "res4b30"
    6330     name: "res4b30"
    6331     type: "Eltwise"
    6332     eltwise_param {
    6333         operation: SUM
    6334     }
    6335 }
    6336 
    6337 layer {
    6338     bottom: "res4b30"
    6339     top: "res4b30"
    6340     name: "res4b30_relu"
    6341     type: "ReLU"
    6342 }
    6343 
    6344 layer {
    6345     bottom: "res4b30"
    6346     top: "res4b31_branch2a"
    6347     name: "res4b31_branch2a"
    6348     type: "Convolution"
    6349     convolution_param {
    6350         num_output: 256
    6351         kernel_size: 1
    6352         pad: 0
    6353         stride: 1
    6354         weight_filler {
    6355             type: "msra"
    6356         }
    6357         bias_term: false
    6358 
    6359     }
    6360 }
    6361 
    6362 layer {
    6363     bottom: "res4b31_branch2a"
    6364     top: "res4b31_branch2a"
    6365     name: "bn4b31_branch2a"
    6366     type: "BatchNorm"
    6367     batch_norm_param {
    6368         use_global_stats: false
    6369     }
    6370 }
    6371 
    6372 layer {
    6373     bottom: "res4b31_branch2a"
    6374     top: "res4b31_branch2a"
    6375     name: "scale4b31_branch2a"
    6376     type: "Scale"
    6377     scale_param {
    6378         bias_term: true
    6379     }
    6380 }
    6381 
    6382 layer {
    6383     bottom: "res4b31_branch2a"
    6384     top: "res4b31_branch2a"
    6385     name: "res4b31_branch2a_relu"
    6386     type: "ReLU"
    6387 }
    6388 
    6389 layer {
    6390     bottom: "res4b31_branch2a"
    6391     top: "res4b31_branch2b"
    6392     name: "res4b31_branch2b"
    6393     type: "Convolution"
    6394     convolution_param {
    6395         num_output: 256
    6396         kernel_size: 3
    6397         pad: 1
    6398         stride: 1
    6399         weight_filler {
    6400             type: "msra"
    6401         }
    6402         bias_term: false
    6403 
    6404     }
    6405 }
    6406 
    6407 layer {
    6408     bottom: "res4b31_branch2b"
    6409     top: "res4b31_branch2b"
    6410     name: "bn4b31_branch2b"
    6411     type: "BatchNorm"
    6412     batch_norm_param {
    6413         use_global_stats: false
    6414     }
    6415 }
    6416 
    6417 layer {
    6418     bottom: "res4b31_branch2b"
    6419     top: "res4b31_branch2b"
    6420     name: "scale4b31_branch2b"
    6421     type: "Scale"
    6422     scale_param {
    6423         bias_term: true
    6424     }
    6425 }
    6426 
    6427 layer {
    6428     bottom: "res4b31_branch2b"
    6429     top: "res4b31_branch2b"
    6430     name: "res4b31_branch2b_relu"
    6431     type: "ReLU"
    6432 }
    6433 
    6434 layer {
    6435     bottom: "res4b31_branch2b"
    6436     top: "res4b31_branch2c"
    6437     name: "res4b31_branch2c"
    6438     type: "Convolution"
    6439     convolution_param {
    6440         num_output: 1024
    6441         kernel_size: 1
    6442         pad: 0
    6443         stride: 1
    6444         weight_filler {
    6445             type: "msra"
    6446         }
    6447         bias_term: false
    6448 
    6449     }
    6450 }
    6451 
    6452 layer {
    6453     bottom: "res4b31_branch2c"
    6454     top: "res4b31_branch2c"
    6455     name: "bn4b31_branch2c"
    6456     type: "BatchNorm"
    6457     batch_norm_param {
    6458         use_global_stats: false
    6459     }
    6460 }
    6461 
    6462 layer {
    6463     bottom: "res4b31_branch2c"
    6464     top: "res4b31_branch2c"
    6465     name: "scale4b31_branch2c"
    6466     type: "Scale"
    6467     scale_param {
    6468         bias_term: true
    6469     }
    6470 }
    6471 
    6472 layer {
    6473     bottom: "res4b30"
    6474     bottom: "res4b31_branch2c"
    6475     top: "res4b31"
    6476     name: "res4b31"
    6477     type: "Eltwise"
    6478     eltwise_param {
    6479         operation: SUM
    6480     }
    6481 }
    6482 
    6483 layer {
    6484     bottom: "res4b31"
    6485     top: "res4b31"
    6486     name: "res4b31_relu"
    6487     type: "ReLU"
    6488 }
    6489 
    6490 layer {
    6491     bottom: "res4b31"
    6492     top: "res4b32_branch2a"
    6493     name: "res4b32_branch2a"
    6494     type: "Convolution"
    6495     convolution_param {
    6496         num_output: 256
    6497         kernel_size: 1
    6498         pad: 0
    6499         stride: 1
    6500         weight_filler {
    6501             type: "msra"
    6502         }
    6503         bias_term: false
    6504 
    6505     }
    6506 }
    6507 
    6508 layer {
    6509     bottom: "res4b32_branch2a"
    6510     top: "res4b32_branch2a"
    6511     name: "bn4b32_branch2a"
    6512     type: "BatchNorm"
    6513     batch_norm_param {
    6514         use_global_stats: false
    6515     }
    6516 }
    6517 
    6518 layer {
    6519     bottom: "res4b32_branch2a"
    6520     top: "res4b32_branch2a"
    6521     name: "scale4b32_branch2a"
    6522     type: "Scale"
    6523     scale_param {
    6524         bias_term: true
    6525     }
    6526 }
    6527 
    6528 layer {
    6529     bottom: "res4b32_branch2a"
    6530     top: "res4b32_branch2a"
    6531     name: "res4b32_branch2a_relu"
    6532     type: "ReLU"
    6533 }
    6534 
    6535 layer {
    6536     bottom: "res4b32_branch2a"
    6537     top: "res4b32_branch2b"
    6538     name: "res4b32_branch2b"
    6539     type: "Convolution"
    6540     convolution_param {
    6541         num_output: 256
    6542         kernel_size: 3
    6543         pad: 1
    6544         stride: 1
    6545         weight_filler {
    6546             type: "msra"
    6547         }
    6548         bias_term: false
    6549 
    6550     }
    6551 }
    6552 
    6553 layer {
    6554     bottom: "res4b32_branch2b"
    6555     top: "res4b32_branch2b"
    6556     name: "bn4b32_branch2b"
    6557     type: "BatchNorm"
    6558     batch_norm_param {
    6559         use_global_stats: false
    6560     }
    6561 }
    6562 
    6563 layer {
    6564     bottom: "res4b32_branch2b"
    6565     top: "res4b32_branch2b"
    6566     name: "scale4b32_branch2b"
    6567     type: "Scale"
    6568     scale_param {
    6569         bias_term: true
    6570     }
    6571 }
    6572 
    6573 layer {
    6574     bottom: "res4b32_branch2b"
    6575     top: "res4b32_branch2b"
    6576     name: "res4b32_branch2b_relu"
    6577     type: "ReLU"
    6578 }
    6579 
    6580 layer {
    6581     bottom: "res4b32_branch2b"
    6582     top: "res4b32_branch2c"
    6583     name: "res4b32_branch2c"
    6584     type: "Convolution"
    6585     convolution_param {
    6586         num_output: 1024
    6587         kernel_size: 1
    6588         pad: 0
    6589         stride: 1
    6590         weight_filler {
    6591             type: "msra"
    6592         }
    6593         bias_term: false
    6594 
    6595     }
    6596 }
    6597 
    6598 layer {
    6599     bottom: "res4b32_branch2c"
    6600     top: "res4b32_branch2c"
    6601     name: "bn4b32_branch2c"
    6602     type: "BatchNorm"
    6603     batch_norm_param {
    6604         use_global_stats: false
    6605     }
    6606 }
    6607 
    6608 layer {
    6609     bottom: "res4b32_branch2c"
    6610     top: "res4b32_branch2c"
    6611     name: "scale4b32_branch2c"
    6612     type: "Scale"
    6613     scale_param {
    6614         bias_term: true
    6615     }
    6616 }
    6617 
    6618 layer {
    6619     bottom: "res4b31"
    6620     bottom: "res4b32_branch2c"
    6621     top: "res4b32"
    6622     name: "res4b32"
    6623     type: "Eltwise"
    6624     eltwise_param {
    6625         operation: SUM
    6626     }
    6627 }
    6628 
    6629 layer {
    6630     bottom: "res4b32"
    6631     top: "res4b32"
    6632     name: "res4b32_relu"
    6633     type: "ReLU"
    6634 }
    6635 
    6636 layer {
    6637     bottom: "res4b32"
    6638     top: "res4b33_branch2a"
    6639     name: "res4b33_branch2a"
    6640     type: "Convolution"
    6641     convolution_param {
    6642         num_output: 256
    6643         kernel_size: 1
    6644         pad: 0
    6645         stride: 1
    6646         weight_filler {
    6647             type: "msra"
    6648         }
    6649         bias_term: false
    6650 
    6651     }
    6652 }
    6653 
    6654 layer {
    6655     bottom: "res4b33_branch2a"
    6656     top: "res4b33_branch2a"
    6657     name: "bn4b33_branch2a"
    6658     type: "BatchNorm"
    6659     batch_norm_param {
    6660         use_global_stats: false
    6661     }
    6662 }
    6663 
    6664 layer {
    6665     bottom: "res4b33_branch2a"
    6666     top: "res4b33_branch2a"
    6667     name: "scale4b33_branch2a"
    6668     type: "Scale"
    6669     scale_param {
    6670         bias_term: true
    6671     }
    6672 }
    6673 
    6674 layer {
    6675     bottom: "res4b33_branch2a"
    6676     top: "res4b33_branch2a"
    6677     name: "res4b33_branch2a_relu"
    6678     type: "ReLU"
    6679 }
    6680 
    6681 layer {
    6682     bottom: "res4b33_branch2a"
    6683     top: "res4b33_branch2b"
    6684     name: "res4b33_branch2b"
    6685     type: "Convolution"
    6686     convolution_param {
    6687         num_output: 256
    6688         kernel_size: 3
    6689         pad: 1
    6690         stride: 1
    6691         weight_filler {
    6692             type: "msra"
    6693         }
    6694         bias_term: false
    6695 
    6696     }
    6697 }
    6698 
    6699 layer {
    6700     bottom: "res4b33_branch2b"
    6701     top: "res4b33_branch2b"
    6702     name: "bn4b33_branch2b"
    6703     type: "BatchNorm"
    6704     batch_norm_param {
    6705         use_global_stats: false
    6706     }
    6707 }
    6708 
    6709 layer {
    6710     bottom: "res4b33_branch2b"
    6711     top: "res4b33_branch2b"
    6712     name: "scale4b33_branch2b"
    6713     type: "Scale"
    6714     scale_param {
    6715         bias_term: true
    6716     }
    6717 }
    6718 
    6719 layer {
    6720     bottom: "res4b33_branch2b"
    6721     top: "res4b33_branch2b"
    6722     name: "res4b33_branch2b_relu"
    6723     type: "ReLU"
    6724 }
    6725 
    6726 layer {
    6727     bottom: "res4b33_branch2b"
    6728     top: "res4b33_branch2c"
    6729     name: "res4b33_branch2c"
    6730     type: "Convolution"
    6731     convolution_param {
    6732         num_output: 1024
    6733         kernel_size: 1
    6734         pad: 0
    6735         stride: 1
    6736         weight_filler {
    6737             type: "msra"
    6738         }
    6739         bias_term: false
    6740 
    6741     }
    6742 }
    6743 
    6744 layer {
    6745     bottom: "res4b33_branch2c"
    6746     top: "res4b33_branch2c"
    6747     name: "bn4b33_branch2c"
    6748     type: "BatchNorm"
    6749     batch_norm_param {
    6750         use_global_stats: false
    6751     }
    6752 }
    6753 
    6754 layer {
    6755     bottom: "res4b33_branch2c"
    6756     top: "res4b33_branch2c"
    6757     name: "scale4b33_branch2c"
    6758     type: "Scale"
    6759     scale_param {
    6760         bias_term: true
    6761     }
    6762 }
    6763 
    6764 layer {
    6765     bottom: "res4b32"
    6766     bottom: "res4b33_branch2c"
    6767     top: "res4b33"
    6768     name: "res4b33"
    6769     type: "Eltwise"
    6770     eltwise_param {
    6771         operation: SUM
    6772     }
    6773 }
    6774 
    6775 layer {
    6776     bottom: "res4b33"
    6777     top: "res4b33"
    6778     name: "res4b33_relu"
    6779     type: "ReLU"
    6780 }
    6781 
    6782 layer {
    6783     bottom: "res4b33"
    6784     top: "res4b34_branch2a"
    6785     name: "res4b34_branch2a"
    6786     type: "Convolution"
    6787     convolution_param {
    6788         num_output: 256
    6789         kernel_size: 1
    6790         pad: 0
    6791         stride: 1
    6792         weight_filler {
    6793             type: "msra"
    6794         }
    6795         bias_term: false
    6796 
    6797     }
    6798 }
    6799 
    6800 layer {
    6801     bottom: "res4b34_branch2a"
    6802     top: "res4b34_branch2a"
    6803     name: "bn4b34_branch2a"
    6804     type: "BatchNorm"
    6805     batch_norm_param {
    6806         use_global_stats: false
    6807     }
    6808 }
    6809 
    6810 layer {
    6811     bottom: "res4b34_branch2a"
    6812     top: "res4b34_branch2a"
    6813     name: "scale4b34_branch2a"
    6814     type: "Scale"
    6815     scale_param {
    6816         bias_term: true
    6817     }
    6818 }
    6819 
    6820 layer {
    6821     bottom: "res4b34_branch2a"
    6822     top: "res4b34_branch2a"
    6823     name: "res4b34_branch2a_relu"
    6824     type: "ReLU"
    6825 }
    6826 
    6827 layer {
    6828     bottom: "res4b34_branch2a"
    6829     top: "res4b34_branch2b"
    6830     name: "res4b34_branch2b"
    6831     type: "Convolution"
    6832     convolution_param {
    6833         num_output: 256
    6834         kernel_size: 3
    6835         pad: 1
    6836         stride: 1
    6837         weight_filler {
    6838             type: "msra"
    6839         }
    6840         bias_term: false
    6841 
    6842     }
    6843 }
    6844 
    6845 layer {
    6846     bottom: "res4b34_branch2b"
    6847     top: "res4b34_branch2b"
    6848     name: "bn4b34_branch2b"
    6849     type: "BatchNorm"
    6850     batch_norm_param {
    6851         use_global_stats: false
    6852     }
    6853 }
    6854 
    6855 layer {
    6856     bottom: "res4b34_branch2b"
    6857     top: "res4b34_branch2b"
    6858     name: "scale4b34_branch2b"
    6859     type: "Scale"
    6860     scale_param {
    6861         bias_term: true
    6862     }
    6863 }
    6864 
    6865 layer {
    6866     bottom: "res4b34_branch2b"
    6867     top: "res4b34_branch2b"
    6868     name: "res4b34_branch2b_relu"
    6869     type: "ReLU"
    6870 }
    6871 
    6872 layer {
    6873     bottom: "res4b34_branch2b"
    6874     top: "res4b34_branch2c"
    6875     name: "res4b34_branch2c"
    6876     type: "Convolution"
    6877     convolution_param {
    6878         num_output: 1024
    6879         kernel_size: 1
    6880         pad: 0
    6881         stride: 1
    6882         weight_filler {
    6883             type: "msra"
    6884         }
    6885         bias_term: false
    6886 
    6887     }
    6888 }
    6889 
    6890 layer {
    6891     bottom: "res4b34_branch2c"
    6892     top: "res4b34_branch2c"
    6893     name: "bn4b34_branch2c"
    6894     type: "BatchNorm"
    6895     batch_norm_param {
    6896         use_global_stats: false
    6897     }
    6898 }
    6899 
    6900 layer {
    6901     bottom: "res4b34_branch2c"
    6902     top: "res4b34_branch2c"
    6903     name: "scale4b34_branch2c"
    6904     type: "Scale"
    6905     scale_param {
    6906         bias_term: true
    6907     }
    6908 }
    6909 
    6910 layer {
    6911     bottom: "res4b33"
    6912     bottom: "res4b34_branch2c"
    6913     top: "res4b34"
    6914     name: "res4b34"
    6915     type: "Eltwise"
    6916     eltwise_param {
    6917         operation: SUM
    6918     }
    6919 }
    6920 
    6921 layer {
    6922     bottom: "res4b34"
    6923     top: "res4b34"
    6924     name: "res4b34_relu"
    6925     type: "ReLU"
    6926 }
    6927 
    6928 layer {
    6929     bottom: "res4b34"
    6930     top: "res4b35_branch2a"
    6931     name: "res4b35_branch2a"
    6932     type: "Convolution"
    6933     convolution_param {
    6934         num_output: 256
    6935         kernel_size: 1
    6936         pad: 0
    6937         stride: 1
    6938         weight_filler {
    6939             type: "msra"
    6940         }
    6941         bias_term: false
    6942 
    6943     }
    6944 }
    6945 
    6946 layer {
    6947     bottom: "res4b35_branch2a"
    6948     top: "res4b35_branch2a"
    6949     name: "bn4b35_branch2a"
    6950     type: "BatchNorm"
    6951     batch_norm_param {
    6952         use_global_stats: false
    6953     }
    6954 }
    6955 
    6956 layer {
    6957     bottom: "res4b35_branch2a"
    6958     top: "res4b35_branch2a"
    6959     name: "scale4b35_branch2a"
    6960     type: "Scale"
    6961     scale_param {
    6962         bias_term: true
    6963     }
    6964 }
    6965 
    6966 layer {
    6967     bottom: "res4b35_branch2a"
    6968     top: "res4b35_branch2a"
    6969     name: "res4b35_branch2a_relu"
    6970     type: "ReLU"
    6971 }
    6972 
    6973 layer {
    6974     bottom: "res4b35_branch2a"
    6975     top: "res4b35_branch2b"
    6976     name: "res4b35_branch2b"
    6977     type: "Convolution"
    6978     convolution_param {
    6979         num_output: 256
    6980         kernel_size: 3
    6981         pad: 1
    6982         stride: 1
    6983         weight_filler {
    6984             type: "msra"
    6985         }
    6986         bias_term: false
    6987 
    6988     }
    6989 }
    6990 
    6991 layer {
    6992     bottom: "res4b35_branch2b"
    6993     top: "res4b35_branch2b"
    6994     name: "bn4b35_branch2b"
    6995     type: "BatchNorm"
    6996     batch_norm_param {
    6997         use_global_stats: false
    6998     }
    6999 }
    7000 
    7001 layer {
    7002     bottom: "res4b35_branch2b"
    7003     top: "res4b35_branch2b"
    7004     name: "scale4b35_branch2b"
    7005     type: "Scale"
    7006     scale_param {
    7007         bias_term: true
    7008     }
    7009 }
    7010 
    7011 layer {
    7012     bottom: "res4b35_branch2b"
    7013     top: "res4b35_branch2b"
    7014     name: "res4b35_branch2b_relu"
    7015     type: "ReLU"
    7016 }
    7017 
    7018 layer {
    7019     bottom: "res4b35_branch2b"
    7020     top: "res4b35_branch2c"
    7021     name: "res4b35_branch2c"
    7022     type: "Convolution"
    7023     convolution_param {
    7024         num_output: 1024
    7025         kernel_size: 1
    7026         pad: 0
    7027         stride: 1
    7028         weight_filler {
    7029             type: "msra"
    7030         }
    7031         bias_term: false
    7032 
    7033     }
    7034 }
    7035 
    7036 layer {
    7037     bottom: "res4b35_branch2c"
    7038     top: "res4b35_branch2c"
    7039     name: "bn4b35_branch2c"
    7040     type: "BatchNorm"
    7041     batch_norm_param {
    7042         use_global_stats: false
    7043     }
    7044 }
    7045 
    7046 layer {
    7047     bottom: "res4b35_branch2c"
    7048     top: "res4b35_branch2c"
    7049     name: "scale4b35_branch2c"
    7050     type: "Scale"
    7051     scale_param {
    7052         bias_term: true
    7053     }
    7054 }
    7055 
    7056 layer {
    7057     bottom: "res4b34"
    7058     bottom: "res4b35_branch2c"
    7059     top: "res4b35"
    7060     name: "res4b35"
    7061     type: "Eltwise"
    7062     eltwise_param {
    7063         operation: SUM
    7064     }
    7065 }
    7066 
    7067 layer {
    7068     bottom: "res4b35"
    7069     top: "res4b35"
    7070     name: "res4b35_relu"
    7071     type: "ReLU"
    7072 }
    7073 
    7074 layer {
    7075     bottom: "res4b35"
    7076     top: "res5a_branch1"
    7077     name: "res5a_branch1"
    7078     type: "Convolution"
    7079     convolution_param {
    7080         num_output: 2048
    7081         kernel_size: 1
    7082         pad: 0
    7083         stride: 2
    7084         weight_filler {
    7085             type: "msra"
    7086         }
    7087         bias_term: false
    7088 
    7089     }
    7090 }
    7091 
    7092 layer {
    7093     bottom: "res5a_branch1"
    7094     top: "res5a_branch1"
    7095     name: "bn5a_branch1"
    7096     type: "BatchNorm"
    7097     batch_norm_param {
    7098         use_global_stats: false
    7099     }
    7100 }
    7101 
    7102 layer {
    7103     bottom: "res5a_branch1"
    7104     top: "res5a_branch1"
    7105     name: "scale5a_branch1"
    7106     type: "Scale"
    7107     scale_param {
    7108         bias_term: true
    7109     }
    7110 }
    7111 
    7112 layer {
    7113     bottom: "res4b35"
    7114     top: "res5a_branch2a"
    7115     name: "res5a_branch2a"
    7116     type: "Convolution"
    7117     convolution_param {
    7118         num_output: 512
    7119         kernel_size: 1
    7120         pad: 0
    7121         stride: 2
    7122         weight_filler {
    7123             type: "msra"
    7124         }
    7125         bias_term: false
    7126 
    7127     }
    7128 }
    7129 
    7130 layer {
    7131     bottom: "res5a_branch2a"
    7132     top: "res5a_branch2a"
    7133     name: "bn5a_branch2a"
    7134     type: "BatchNorm"
    7135     batch_norm_param {
    7136         use_global_stats: false
    7137     }
    7138 }
    7139 
    7140 layer {
    7141     bottom: "res5a_branch2a"
    7142     top: "res5a_branch2a"
    7143     name: "scale5a_branch2a"
    7144     type: "Scale"
    7145     scale_param {
    7146         bias_term: true
    7147     }
    7148 }
    7149 
    7150 layer {
    7151     bottom: "res5a_branch2a"
    7152     top: "res5a_branch2a"
    7153     name: "res5a_branch2a_relu"
    7154     type: "ReLU"
    7155 }
    7156 
    7157 layer {
    7158     bottom: "res5a_branch2a"
    7159     top: "res5a_branch2b"
    7160     name: "res5a_branch2b"
    7161     type: "Convolution"
    7162     convolution_param {
    7163         num_output: 512
    7164         kernel_size: 3
    7165         pad: 1
    7166         stride: 1
    7167         weight_filler {
    7168             type: "msra"
    7169         }
    7170         bias_term: false
    7171 
    7172     }
    7173 }
    7174 
    7175 layer {
    7176     bottom: "res5a_branch2b"
    7177     top: "res5a_branch2b"
    7178     name: "bn5a_branch2b"
    7179     type: "BatchNorm"
    7180     batch_norm_param {
    7181         use_global_stats: false
    7182     }
    7183 }
    7184 
    7185 layer {
    7186     bottom: "res5a_branch2b"
    7187     top: "res5a_branch2b"
    7188     name: "scale5a_branch2b"
    7189     type: "Scale"
    7190     scale_param {
    7191         bias_term: true
    7192     }
    7193 }
    7194 
    7195 layer {
    7196     bottom: "res5a_branch2b"
    7197     top: "res5a_branch2b"
    7198     name: "res5a_branch2b_relu"
    7199     type: "ReLU"
    7200 }
    7201 
    7202 layer {
    7203     bottom: "res5a_branch2b"
    7204     top: "res5a_branch2c"
    7205     name: "res5a_branch2c"
    7206     type: "Convolution"
    7207     convolution_param {
    7208         num_output: 2048
    7209         kernel_size: 1
    7210         pad: 0
    7211         stride: 1
    7212         weight_filler {
    7213             type: "msra"
    7214         }
    7215         bias_term: false
    7216 
    7217     }
    7218 }
    7219 
    7220 layer {
    7221     bottom: "res5a_branch2c"
    7222     top: "res5a_branch2c"
    7223     name: "bn5a_branch2c"
    7224     type: "BatchNorm"
    7225     batch_norm_param {
    7226         use_global_stats: false
    7227     }
    7228 }
    7229 
    7230 layer {
    7231     bottom: "res5a_branch2c"
    7232     top: "res5a_branch2c"
    7233     name: "scale5a_branch2c"
    7234     type: "Scale"
    7235     scale_param {
    7236         bias_term: true
    7237     }
    7238 }
    7239 
    7240 layer {
    7241     bottom: "res5a_branch1"
    7242     bottom: "res5a_branch2c"
    7243     top: "res5a"
    7244     name: "res5a"
    7245     type: "Eltwise"
    7246     eltwise_param {
    7247         operation: SUM
    7248     }
    7249 }
    7250 
    7251 layer {
    7252     bottom: "res5a"
    7253     top: "res5a"
    7254     name: "res5a_relu"
    7255     type: "ReLU"
    7256 }
    7257 
    7258 layer {
    7259     bottom: "res5a"
    7260     top: "res5b_branch2a"
    7261     name: "res5b_branch2a"
    7262     type: "Convolution"
    7263     convolution_param {
    7264         num_output: 512
    7265         kernel_size: 1
    7266         pad: 0
    7267         stride: 1
    7268         weight_filler {
    7269             type: "msra"
    7270         }
    7271         bias_term: false
    7272 
    7273     }
    7274 }
    7275 
    7276 layer {
    7277     bottom: "res5b_branch2a"
    7278     top: "res5b_branch2a"
    7279     name: "bn5b_branch2a"
    7280     type: "BatchNorm"
    7281     batch_norm_param {
    7282         use_global_stats: false
    7283     }
    7284 }
    7285 
    7286 layer {
    7287     bottom: "res5b_branch2a"
    7288     top: "res5b_branch2a"
    7289     name: "scale5b_branch2a"
    7290     type: "Scale"
    7291     scale_param {
    7292         bias_term: true
    7293     }
    7294 }
    7295 
    7296 layer {
    7297     bottom: "res5b_branch2a"
    7298     top: "res5b_branch2a"
    7299     name: "res5b_branch2a_relu"
    7300     type: "ReLU"
    7301 }
    7302 
    7303 layer {
    7304     bottom: "res5b_branch2a"
    7305     top: "res5b_branch2b"
    7306     name: "res5b_branch2b"
    7307     type: "Convolution"
    7308     convolution_param {
    7309         num_output: 512
    7310         kernel_size: 3
    7311         pad: 1
    7312         stride: 1
    7313         weight_filler {
    7314             type: "msra"
    7315         }
    7316         bias_term: false
    7317 
    7318     }
    7319 }
    7320 
    7321 layer {
    7322     bottom: "res5b_branch2b"
    7323     top: "res5b_branch2b"
    7324     name: "bn5b_branch2b"
    7325     type: "BatchNorm"
    7326     batch_norm_param {
    7327         use_global_stats: false
    7328     }
    7329 }
    7330 
    7331 layer {
    7332     bottom: "res5b_branch2b"
    7333     top: "res5b_branch2b"
    7334     name: "scale5b_branch2b"
    7335     type: "Scale"
    7336     scale_param {
    7337         bias_term: true
    7338     }
    7339 }
    7340 
    7341 layer {
    7342     bottom: "res5b_branch2b"
    7343     top: "res5b_branch2b"
    7344     name: "res5b_branch2b_relu"
    7345     type: "ReLU"
    7346 }
    7347 
    7348 layer {
    7349     bottom: "res5b_branch2b"
    7350     top: "res5b_branch2c"
    7351     name: "res5b_branch2c"
    7352     type: "Convolution"
    7353     convolution_param {
    7354         num_output: 2048
    7355         kernel_size: 1
    7356         pad: 0
    7357         stride: 1
    7358         weight_filler {
    7359             type: "msra"
    7360         }
    7361         bias_term: false
    7362 
    7363     }
    7364 }
    7365 
    7366 layer {
    7367     bottom: "res5b_branch2c"
    7368     top: "res5b_branch2c"
    7369     name: "bn5b_branch2c"
    7370     type: "BatchNorm"
    7371     batch_norm_param {
    7372         use_global_stats: false
    7373     }
    7374 }
    7375 
    7376 layer {
    7377     bottom: "res5b_branch2c"
    7378     top: "res5b_branch2c"
    7379     name: "scale5b_branch2c"
    7380     type: "Scale"
    7381     scale_param {
    7382         bias_term: true
    7383     }
    7384 }
    7385 
    7386 layer {
    7387     bottom: "res5a"
    7388     bottom: "res5b_branch2c"
    7389     top: "res5b"
    7390     name: "res5b"
    7391     type: "Eltwise"
    7392     eltwise_param {
    7393         operation: SUM
    7394     }
    7395 }
    7396 
    7397 layer {
    7398     bottom: "res5b"
    7399     top: "res5b"
    7400     name: "res5b_relu"
    7401     type: "ReLU"
    7402 }
    7403 
    7404 layer {
    7405     bottom: "res5b"
    7406     top: "res5c_branch2a"
    7407     name: "res5c_branch2a"
    7408     type: "Convolution"
    7409     convolution_param {
    7410         num_output: 512
    7411         kernel_size: 1
    7412         pad: 0
    7413         stride: 1
    7414         weight_filler {
    7415             type: "msra"
    7416         }
    7417         bias_term: false
    7418 
    7419     }
    7420 }
    7421 
    7422 layer {
    7423     bottom: "res5c_branch2a"
    7424     top: "res5c_branch2a"
    7425     name: "bn5c_branch2a"
    7426     type: "BatchNorm"
    7427     batch_norm_param {
    7428         use_global_stats: false
    7429     }
    7430 }
    7431 
    7432 layer {
    7433     bottom: "res5c_branch2a"
    7434     top: "res5c_branch2a"
    7435     name: "scale5c_branch2a"
    7436     type: "Scale"
    7437     scale_param {
    7438         bias_term: true
    7439     }
    7440 }
    7441 
    7442 layer {
    7443     bottom: "res5c_branch2a"
    7444     top: "res5c_branch2a"
    7445     name: "res5c_branch2a_relu"
    7446     type: "ReLU"
    7447 }
    7448 
    7449 layer {
    7450     bottom: "res5c_branch2a"
    7451     top: "res5c_branch2b"
    7452     name: "res5c_branch2b"
    7453     type: "Convolution"
    7454     convolution_param {
    7455         num_output: 512
    7456         kernel_size: 3
    7457         pad: 1
    7458         stride: 1
    7459         weight_filler {
    7460             type: "msra"
    7461         }
    7462         bias_term: false
    7463 
    7464     }
    7465 }
    7466 
    7467 layer {
    7468     bottom: "res5c_branch2b"
    7469     top: "res5c_branch2b"
    7470     name: "bn5c_branch2b"
    7471     type: "BatchNorm"
    7472     batch_norm_param {
    7473         use_global_stats: false
    7474     }
    7475 }
    7476 
    7477 layer {
    7478     bottom: "res5c_branch2b"
    7479     top: "res5c_branch2b"
    7480     name: "scale5c_branch2b"
    7481     type: "Scale"
    7482     scale_param {
    7483         bias_term: true
    7484     }
    7485 }
    7486 
    7487 layer {
    7488     bottom: "res5c_branch2b"
    7489     top: "res5c_branch2b"
    7490     name: "res5c_branch2b_relu"
    7491     type: "ReLU"
    7492 }
    7493 
    7494 layer {
    7495     bottom: "res5c_branch2b"
    7496     top: "res5c_branch2c"
    7497     name: "res5c_branch2c"
    7498     type: "Convolution"
    7499     convolution_param {
    7500         num_output: 2048
    7501         kernel_size: 1
    7502         pad: 0
    7503         stride: 1
    7504         weight_filler {
    7505             type: "msra"
    7506         }
    7507         bias_term: false
    7508 
    7509     }
    7510 }
    7511 
    7512 layer {
    7513     bottom: "res5c_branch2c"
    7514     top: "res5c_branch2c"
    7515     name: "bn5c_branch2c"
    7516     type: "BatchNorm"
    7517     batch_norm_param {
    7518         use_global_stats: false
    7519     }
    7520 }
    7521 
    7522 layer {
    7523     bottom: "res5c_branch2c"
    7524     top: "res5c_branch2c"
    7525     name: "scale5c_branch2c"
    7526     type: "Scale"
    7527     scale_param {
    7528         bias_term: true
    7529     }
    7530 }
    7531 
    7532 layer {
    7533     bottom: "res5b"
    7534     bottom: "res5c_branch2c"
    7535     top: "res5c"
    7536     name: "res5c"
    7537     type: "Eltwise"
    7538     eltwise_param {
    7539         operation: SUM
    7540     }
    7541 }
    7542 
    7543 layer {
    7544     bottom: "res5c"
    7545     top: "res5c"
    7546     name: "res5c_relu"
    7547     type: "ReLU"
    7548 }
    7549 
    7550 layer {
    7551     bottom: "res5c"
    7552     top: "pool5"
    7553     name: "pool5"
    7554     type: "Pooling"
    7555     pooling_param {
    7556         kernel_size: 7
    7557         stride: 1
    7558         pool: AVE
    7559     }
    7560 }
    7561 
    7562 layer {
    7563     bottom: "pool5"
    7564     top: "fc3"
    7565     name: "fc3"
    7566     type: "InnerProduct"
    7567     param {
    7568         lr_mult: 1
    7569         decay_mult: 1
    7570     }
    7571     param {
    7572         lr_mult: 2
    7573         decay_mult: 1
    7574     }
    7575     inner_product_param {
    7576         num_output: 3
    7577         weight_filler {
    7578             type: "xavier"
    7579         }
    7580         bias_filler {
    7581             type: "constant"
    7582             value: 0
    7583         }
    7584     }
    7585 }
    7586 
    7587 layer {
    7588     bottom: "fc3"
    7589     bottom: "label"
    7590     name: "loss"
    7591     type: "SoftmaxWithLoss"
    7592     top: "loss"
    7593 }
    7594 
    7595 layer {
    7596     name: "probt"
    7597     type: "Softmax"
    7598     bottom: "fc3"
    7599     top: "probt"
    7600     include {
    7601         phase: TEST
    7602     }
    7603 }
    7604 
    7605 layer {
    7606     bottom: "fc3"
    7607     bottom: "label"
    7608     top: "accuracy@1"
    7609     name: "accuracy/top1"
    7610     type: "Accuracy"
    7611     accuracy_param {
    7612         top_k: 1
    7613     }
    7614 }
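    train_val.prototxt到此结束：最后是3路输出的全连接层fc3（对应bigcat、dog、fish三个类别）、训练用的SoftmaxWithLoss层，以及TEST阶段的Softmax和Accuracy层。前面的主体部分则是同一个bottleneck模式的反复堆叠：1x1卷积、3x3卷积、1x1卷积，每个卷积后接BatchNorm + Scale（前两个再接ReLU），最后与shortcut做Eltwise SUM并接ReLU。与其手工编辑几千行，不如用脚本批量生成这些块。下面的Python代码只是一个示意（并非生成上面清单的原始脚本），它沿用本prototxt中res4b30_branch2a、bn4b30_branch2a、scale4b30_branch2a这样的命名规则，打印res4b30到res4b35这几个identity块：

    # -*- coding: utf-8 -*-
    # 示意脚本：批量生成上面反复出现的identity bottleneck块。
    # 并非原文使用的脚本；命名沿用清单中的 res<编号>_branch2a/2b/2c、bn...、scale... 约定。

    CONV = """layer {{
        bottom: "{bottom}"
        top: "{top}"
        name: "{top}"
        type: "Convolution"
        convolution_param {{
            num_output: {nout}
            kernel_size: {k}
            pad: {pad}
            stride: 1
            weight_filler {{ type: "msra" }}
            bias_term: false
        }}
    }}
    """

    BN_SCALE = """layer {{
        bottom: "{top}"
        top: "{top}"
        name: "bn{suffix}"
        type: "BatchNorm"
        batch_norm_param {{ use_global_stats: false }}
    }}
    layer {{
        bottom: "{top}"
        top: "{top}"
        name: "scale{suffix}"
        type: "Scale"
        scale_param {{ bias_term: true }}
    }}
    """

    RELU = """layer {{
        bottom: "{top}"
        top: "{top}"
        name: "{top}_relu"
        type: "ReLU"
    }}
    """

    ELTWISE = """layer {{
        bottom: "{shortcut}"
        bottom: "{branch}"
        top: "{top}"
        name: "{top}"
        type: "Eltwise"
        eltwise_param {{ operation: SUM }}
    }}
    """

    def identity_block(name, shortcut, mid, out):
        """返回一个identity bottleneck的prototxt文本，例如 identity_block("4b30", "res4b29", 256, 1024)。"""
        bottom, parts = shortcut, []
        for branch, nout, k, pad, relu in (("2a", mid, 1, 0, True),
                                           ("2b", mid, 3, 1, True),
                                           ("2c", out, 1, 0, False)):
            top = "res{0}_branch{1}".format(name, branch)
            parts.append(CONV.format(bottom=bottom, top=top, nout=nout, k=k, pad=pad))
            parts.append(BN_SCALE.format(top=top, suffix="{0}_branch{1}".format(name, branch)))
            if relu:
                parts.append(RELU.format(top=top))
            bottom = top
        parts.append(ELTWISE.format(shortcut=shortcut, branch=bottom, top="res" + name))
        parts.append(RELU.format(top="res" + name))
        return "".join(parts)

    if __name__ == "__main__":
        prev = "res4b29"
        for i in range(30, 36):                 # 生成 res4b30 ... res4b35
            print(identity_block("4b{0}".format(i), prev, 256, 1024))
            prev = "res4b{0}".format(i)

    带降采样的projection块（如res5a，其branch1 shortcut是stride为2的1x1卷积）需要在此基础上再加一个模板，把branch2a和branch1的stride改为2即可。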

           deploy.prototxt
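    与训练用的网络定义相比，下面的deploy版本保持了同样的层结构，但把读取lmdb的数据层换成固定的单张图片输入（1 x 3 x 224 x 224），把所有BatchNorm层的use_global_stats改为true（推理时使用训练中累积的滑动均值/方差），并去掉了weight_filler设置（权重由训练得到的.caffemodel加载）。由于文件很长且是手工拼接的，使用前可以先用Caffe的protobuf定义解析一遍做个检查；下面的代码只是一个最小示意，假设pycaffe可以import、文件保存为deploy.prototxt（路径均为假设）：

    # 对deploy定义做最小的完整性检查（pycaffe可用、文件路径均为假设，非原文内容）
    from caffe.proto import caffe_pb2
    from google.protobuf import text_format

    net = caffe_pb2.NetParameter()
    with open("deploy.prototxt") as f:
        text_format.Merge(f.read(), net)

    print("total layers:", len(net.layer))
    # deploy文件中每个BatchNorm层都应使用全局统计量（use_global_stats: true）
    bad = [l.name for l in net.layer
           if l.type == "BatchNorm" and not l.batch_norm_param.use_global_stats]
    print("BatchNorm layers not using global stats:", bad if bad else "none")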

       1 name: "ResNet-152"
       2 input: "data"
       3 input_shape {
       4   dim: 1
       5   dim: 3
       6   dim: 224
       7   dim: 224
       8 }
       9 
      10 layer {
      11     bottom: "data"
      12     top: "conv1"
      13     name: "conv1"
      14     type: "Convolution"
      15     convolution_param {
      16         num_output: 64
      17         kernel_size: 7
      18         pad: 3
      19         stride: 2
      20         bias_term: false
      21     }
      22 }
      23 
      24 layer {
      25     bottom: "conv1"
      26     top: "conv1"
      27     name: "bn_conv1"
      28     type: "BatchNorm"
      29     batch_norm_param {
      30         use_global_stats: true
      31     }
      32 }
      33 
      34 layer {
      35     bottom: "conv1"
      36     top: "conv1"
      37     name: "scale_conv1"
      38     type: "Scale"
      39     scale_param {
      40         bias_term: true
      41     }
      42 }
      43 
      44 layer {
      45     top: "conv1"
      46     bottom: "conv1"
      47     name: "conv1_relu"
      48     type: "ReLU"
      49 }
      50 
      51 layer {
      52     bottom: "conv1"
      53     top: "pool1"
      54     name: "pool1"
      55     type: "Pooling"
      56     pooling_param {
      57         kernel_size: 3
      58         stride: 2
      59         pool: MAX
      60     }
      61 }
      62 
      63 layer {
      64     bottom: "pool1"
      65     top: "res2a_branch1"
      66     name: "res2a_branch1"
      67     type: "Convolution"
      68     convolution_param {
      69         num_output: 256
      70         kernel_size: 1
      71         pad: 0
      72         stride: 1
      73         bias_term: false
      74     }
      75 }
      76 
      77 layer {
      78     bottom: "res2a_branch1"
      79     top: "res2a_branch1"
      80     name: "bn2a_branch1"
      81     type: "BatchNorm"
      82     batch_norm_param {
      83         use_global_stats: true
      84     }
      85 }
      86 
      87 layer {
      88     bottom: "res2a_branch1"
      89     top: "res2a_branch1"
      90     name: "scale2a_branch1"
      91     type: "Scale"
      92     scale_param {
      93         bias_term: true
      94     }
      95 }
      96 
      97 layer {
      98     bottom: "pool1"
      99     top: "res2a_branch2a"
     100     name: "res2a_branch2a"
     101     type: "Convolution"
     102     convolution_param {
     103         num_output: 64
     104         kernel_size: 1
     105         pad: 0
     106         stride: 1
     107         bias_term: false
     108     }
     109 }
     110 
     111 layer {
     112     bottom: "res2a_branch2a"
     113     top: "res2a_branch2a"
     114     name: "bn2a_branch2a"
     115     type: "BatchNorm"
     116     batch_norm_param {
     117         use_global_stats: true
     118     }
     119 }
     120 
     121 layer {
     122     bottom: "res2a_branch2a"
     123     top: "res2a_branch2a"
     124     name: "scale2a_branch2a"
     125     type: "Scale"
     126     scale_param {
     127         bias_term: true
     128     }
     129 }
     130 
     131 layer {
     132     top: "res2a_branch2a"
     133     bottom: "res2a_branch2a"
     134     name: "res2a_branch2a_relu"
     135     type: "ReLU"
     136 }
     137 
     138 layer {
     139     bottom: "res2a_branch2a"
     140     top: "res2a_branch2b"
     141     name: "res2a_branch2b"
     142     type: "Convolution"
     143     convolution_param {
     144         num_output: 64
     145         kernel_size: 3
     146         pad: 1
     147         stride: 1
     148         bias_term: false
     149     }
     150 }
     151 
     152 layer {
     153     bottom: "res2a_branch2b"
     154     top: "res2a_branch2b"
     155     name: "bn2a_branch2b"
     156     type: "BatchNorm"
     157     batch_norm_param {
     158         use_global_stats: true
     159     }
     160 }
     161 
     162 layer {
     163     bottom: "res2a_branch2b"
     164     top: "res2a_branch2b"
     165     name: "scale2a_branch2b"
     166     type: "Scale"
     167     scale_param {
     168         bias_term: true
     169     }
     170 }
     171 
     172 layer {
     173     top: "res2a_branch2b"
     174     bottom: "res2a_branch2b"
     175     name: "res2a_branch2b_relu"
     176     type: "ReLU"
     177 }
     178 
     179 layer {
     180     bottom: "res2a_branch2b"
     181     top: "res2a_branch2c"
     182     name: "res2a_branch2c"
     183     type: "Convolution"
     184     convolution_param {
     185         num_output: 256
     186         kernel_size: 1
     187         pad: 0
     188         stride: 1
     189         bias_term: false
     190     }
     191 }
     192 
     193 layer {
     194     bottom: "res2a_branch2c"
     195     top: "res2a_branch2c"
     196     name: "bn2a_branch2c"
     197     type: "BatchNorm"
     198     batch_norm_param {
     199         use_global_stats: true
     200     }
     201 }
     202 
     203 layer {
     204     bottom: "res2a_branch2c"
     205     top: "res2a_branch2c"
     206     name: "scale2a_branch2c"
     207     type: "Scale"
     208     scale_param {
     209         bias_term: true
     210     }
     211 }
     212 
     213 layer {
     214     bottom: "res2a_branch1"
     215     bottom: "res2a_branch2c"
     216     top: "res2a"
     217     name: "res2a"
     218     type: "Eltwise"
     219 }
     220 
     221 layer {
     222     bottom: "res2a"
     223     top: "res2a"
     224     name: "res2a_relu"
     225     type: "ReLU"
     226 }
     227 
     228 layer {
     229     bottom: "res2a"
     230     top: "res2b_branch2a"
     231     name: "res2b_branch2a"
     232     type: "Convolution"
     233     convolution_param {
     234         num_output: 64
     235         kernel_size: 1
     236         pad: 0
     237         stride: 1
     238         bias_term: false
     239     }
     240 }
     241 
     242 layer {
     243     bottom: "res2b_branch2a"
     244     top: "res2b_branch2a"
     245     name: "bn2b_branch2a"
     246     type: "BatchNorm"
     247     batch_norm_param {
     248         use_global_stats: true
     249     }
     250 }
     251 
     252 layer {
     253     bottom: "res2b_branch2a"
     254     top: "res2b_branch2a"
     255     name: "scale2b_branch2a"
     256     type: "Scale"
     257     scale_param {
     258         bias_term: true
     259     }
     260 }
     261 
     262 layer {
     263     top: "res2b_branch2a"
     264     bottom: "res2b_branch2a"
     265     name: "res2b_branch2a_relu"
     266     type: "ReLU"
     267 }
     268 
     269 layer {
     270     bottom: "res2b_branch2a"
     271     top: "res2b_branch2b"
     272     name: "res2b_branch2b"
     273     type: "Convolution"
     274     convolution_param {
     275         num_output: 64
     276         kernel_size: 3
     277         pad: 1
     278         stride: 1
     279         bias_term: false
     280     }
     281 }
     282 
     283 layer {
     284     bottom: "res2b_branch2b"
     285     top: "res2b_branch2b"
     286     name: "bn2b_branch2b"
     287     type: "BatchNorm"
     288     batch_norm_param {
     289         use_global_stats: true
     290     }
     291 }
     292 
     293 layer {
     294     bottom: "res2b_branch2b"
     295     top: "res2b_branch2b"
     296     name: "scale2b_branch2b"
     297     type: "Scale"
     298     scale_param {
     299         bias_term: true
     300     }
     301 }
     302 
     303 layer {
     304     top: "res2b_branch2b"
     305     bottom: "res2b_branch2b"
     306     name: "res2b_branch2b_relu"
     307     type: "ReLU"
     308 }
     309 
     310 layer {
     311     bottom: "res2b_branch2b"
     312     top: "res2b_branch2c"
     313     name: "res2b_branch2c"
     314     type: "Convolution"
     315     convolution_param {
     316         num_output: 256
     317         kernel_size: 1
     318         pad: 0
     319         stride: 1
     320         bias_term: false
     321     }
     322 }
     323 
     324 layer {
     325     bottom: "res2b_branch2c"
     326     top: "res2b_branch2c"
     327     name: "bn2b_branch2c"
     328     type: "BatchNorm"
     329     batch_norm_param {
     330         use_global_stats: true
     331     }
     332 }
     333 
     334 layer {
     335     bottom: "res2b_branch2c"
     336     top: "res2b_branch2c"
     337     name: "scale2b_branch2c"
     338     type: "Scale"
     339     scale_param {
     340         bias_term: true
     341     }
     342 }
     343 
     344 layer {
     345     bottom: "res2a"
     346     bottom: "res2b_branch2c"
     347     top: "res2b"
     348     name: "res2b"
     349     type: "Eltwise"
     350 }
     351 
     352 layer {
     353     bottom: "res2b"
     354     top: "res2b"
     355     name: "res2b_relu"
     356     type: "ReLU"
     357 }
     358 
     359 layer {
     360     bottom: "res2b"
     361     top: "res2c_branch2a"
     362     name: "res2c_branch2a"
     363     type: "Convolution"
     364     convolution_param {
     365         num_output: 64
     366         kernel_size: 1
     367         pad: 0
     368         stride: 1
     369         bias_term: false
     370     }
     371 }
     372 
     373 layer {
     374     bottom: "res2c_branch2a"
     375     top: "res2c_branch2a"
     376     name: "bn2c_branch2a"
     377     type: "BatchNorm"
     378     batch_norm_param {
     379         use_global_stats: true
     380     }
     381 }
     382 
     383 layer {
     384     bottom: "res2c_branch2a"
     385     top: "res2c_branch2a"
     386     name: "scale2c_branch2a"
     387     type: "Scale"
     388     scale_param {
     389         bias_term: true
     390     }
     391 }
     392 
     393 layer {
     394     top: "res2c_branch2a"
     395     bottom: "res2c_branch2a"
     396     name: "res2c_branch2a_relu"
     397     type: "ReLU"
     398 }
     399 
     400 layer {
     401     bottom: "res2c_branch2a"
     402     top: "res2c_branch2b"
     403     name: "res2c_branch2b"
     404     type: "Convolution"
     405     convolution_param {
     406         num_output: 64
     407         kernel_size: 3
     408         pad: 1
     409         stride: 1
     410         bias_term: false
     411     }
     412 }
     413 
     414 layer {
     415     bottom: "res2c_branch2b"
     416     top: "res2c_branch2b"
     417     name: "bn2c_branch2b"
     418     type: "BatchNorm"
     419     batch_norm_param {
     420         use_global_stats: true
     421     }
     422 }
     423 
     424 layer {
     425     bottom: "res2c_branch2b"
     426     top: "res2c_branch2b"
     427     name: "scale2c_branch2b"
     428     type: "Scale"
     429     scale_param {
     430         bias_term: true
     431     }
     432 }
     433 
     434 layer {
     435     top: "res2c_branch2b"
     436     bottom: "res2c_branch2b"
     437     name: "res2c_branch2b_relu"
     438     type: "ReLU"
     439 }
     440 
     441 layer {
     442     bottom: "res2c_branch2b"
     443     top: "res2c_branch2c"
     444     name: "res2c_branch2c"
     445     type: "Convolution"
     446     convolution_param {
     447         num_output: 256
     448         kernel_size: 1
     449         pad: 0
     450         stride: 1
     451         bias_term: false
     452     }
     453 }
     454 
     455 layer {
     456     bottom: "res2c_branch2c"
     457     top: "res2c_branch2c"
     458     name: "bn2c_branch2c"
     459     type: "BatchNorm"
     460     batch_norm_param {
     461         use_global_stats: true
     462     }
     463 }
     464 
     465 layer {
     466     bottom: "res2c_branch2c"
     467     top: "res2c_branch2c"
     468     name: "scale2c_branch2c"
     469     type: "Scale"
     470     scale_param {
     471         bias_term: true
     472     }
     473 }
     474 
     475 layer {
     476     bottom: "res2b"
     477     bottom: "res2c_branch2c"
     478     top: "res2c"
     479     name: "res2c"
     480     type: "Eltwise"
     481 }
     482 
     483 layer {
     484     bottom: "res2c"
     485     top: "res2c"
     486     name: "res2c_relu"
     487     type: "ReLU"
     488 }
     489 
     490 layer {
     491     bottom: "res2c"
     492     top: "res3a_branch1"
     493     name: "res3a_branch1"
     494     type: "Convolution"
     495     convolution_param {
     496         num_output: 512
     497         kernel_size: 1
     498         pad: 0
     499         stride: 2
     500         bias_term: false
     501     }
     502 }
     503 
     504 layer {
     505     bottom: "res3a_branch1"
     506     top: "res3a_branch1"
     507     name: "bn3a_branch1"
     508     type: "BatchNorm"
     509     batch_norm_param {
     510         use_global_stats: true
     511     }
     512 }
     513 
     514 layer {
     515     bottom: "res3a_branch1"
     516     top: "res3a_branch1"
     517     name: "scale3a_branch1"
     518     type: "Scale"
     519     scale_param {
     520         bias_term: true
     521     }
     522 }
     523 
     524 layer {
     525     bottom: "res2c"
     526     top: "res3a_branch2a"
     527     name: "res3a_branch2a"
     528     type: "Convolution"
     529     convolution_param {
     530         num_output: 128
     531         kernel_size: 1
     532         pad: 0
     533         stride: 2
     534         bias_term: false
     535     }
     536 }
     537 
     538 layer {
     539     bottom: "res3a_branch2a"
     540     top: "res3a_branch2a"
     541     name: "bn3a_branch2a"
     542     type: "BatchNorm"
     543     batch_norm_param {
     544         use_global_stats: true
     545     }
     546 }
     547 
     548 layer {
     549     bottom: "res3a_branch2a"
     550     top: "res3a_branch2a"
     551     name: "scale3a_branch2a"
     552     type: "Scale"
     553     scale_param {
     554         bias_term: true
     555     }
     556 }
     557 
     558 layer {
     559     top: "res3a_branch2a"
     560     bottom: "res3a_branch2a"
     561     name: "res3a_branch2a_relu"
     562     type: "ReLU"
     563 }
     564 
     565 layer {
     566     bottom: "res3a_branch2a"
     567     top: "res3a_branch2b"
     568     name: "res3a_branch2b"
     569     type: "Convolution"
     570     convolution_param {
     571         num_output: 128
     572         kernel_size: 3
     573         pad: 1
     574         stride: 1
     575         bias_term: false
     576     }
     577 }
     578 
     579 layer {
     580     bottom: "res3a_branch2b"
     581     top: "res3a_branch2b"
     582     name: "bn3a_branch2b"
     583     type: "BatchNorm"
     584     batch_norm_param {
     585         use_global_stats: true
     586     }
     587 }
     588 
     589 layer {
     590     bottom: "res3a_branch2b"
     591     top: "res3a_branch2b"
     592     name: "scale3a_branch2b"
     593     type: "Scale"
     594     scale_param {
     595         bias_term: true
     596     }
     597 }
     598 
     599 layer {
     600     top: "res3a_branch2b"
     601     bottom: "res3a_branch2b"
     602     name: "res3a_branch2b_relu"
     603     type: "ReLU"
     604 }
     605 
     606 layer {
     607     bottom: "res3a_branch2b"
     608     top: "res3a_branch2c"
     609     name: "res3a_branch2c"
     610     type: "Convolution"
     611     convolution_param {
     612         num_output: 512
     613         kernel_size: 1
     614         pad: 0
     615         stride: 1
     616         bias_term: false
     617     }
     618 }
     619 
     620 layer {
     621     bottom: "res3a_branch2c"
     622     top: "res3a_branch2c"
     623     name: "bn3a_branch2c"
     624     type: "BatchNorm"
     625     batch_norm_param {
     626         use_global_stats: true
     627     }
     628 }
     629 
     630 layer {
     631     bottom: "res3a_branch2c"
     632     top: "res3a_branch2c"
     633     name: "scale3a_branch2c"
     634     type: "Scale"
     635     scale_param {
     636         bias_term: true
     637     }
     638 }
     639 
     640 layer {
     641     bottom: "res3a_branch1"
     642     bottom: "res3a_branch2c"
     643     top: "res3a"
     644     name: "res3a"
     645     type: "Eltwise"
     646 }
     647 
     648 layer {
     649     bottom: "res3a"
     650     top: "res3a"
     651     name: "res3a_relu"
     652     type: "ReLU"
     653 }
     654 
     655 layer {
     656     bottom: "res3a"
     657     top: "res3b1_branch2a"
     658     name: "res3b1_branch2a"
     659     type: "Convolution"
     660     convolution_param {
     661         num_output: 128
     662         kernel_size: 1
     663         pad: 0
     664         stride: 1
     665         bias_term: false
     666     }
     667 }
     668 
     669 layer {
     670     bottom: "res3b1_branch2a"
     671     top: "res3b1_branch2a"
     672     name: "bn3b1_branch2a"
     673     type: "BatchNorm"
     674     batch_norm_param {
     675         use_global_stats: true
     676     }
     677 }
     678 
     679 layer {
     680     bottom: "res3b1_branch2a"
     681     top: "res3b1_branch2a"
     682     name: "scale3b1_branch2a"
     683     type: "Scale"
     684     scale_param {
     685         bias_term: true
     686     }
     687 }
     688 
     689 layer {
     690     top: "res3b1_branch2a"
     691     bottom: "res3b1_branch2a"
     692     name: "res3b1_branch2a_relu"
     693     type: "ReLU"
     694 }
     695 
     696 layer {
     697     bottom: "res3b1_branch2a"
     698     top: "res3b1_branch2b"
     699     name: "res3b1_branch2b"
     700     type: "Convolution"
     701     convolution_param {
     702         num_output: 128
     703         kernel_size: 3
     704         pad: 1
     705         stride: 1
     706         bias_term: false
     707     }
     708 }
     709 
     710 layer {
     711     bottom: "res3b1_branch2b"
     712     top: "res3b1_branch2b"
     713     name: "bn3b1_branch2b"
     714     type: "BatchNorm"
     715     batch_norm_param {
     716         use_global_stats: true
     717     }
     718 }
     719 
     720 layer {
     721     bottom: "res3b1_branch2b"
     722     top: "res3b1_branch2b"
     723     name: "scale3b1_branch2b"
     724     type: "Scale"
     725     scale_param {
     726         bias_term: true
     727     }
     728 }
     729 
     730 layer {
     731     top: "res3b1_branch2b"
     732     bottom: "res3b1_branch2b"
     733     name: "res3b1_branch2b_relu"
     734     type: "ReLU"
     735 }
     736 
     737 layer {
     738     bottom: "res3b1_branch2b"
     739     top: "res3b1_branch2c"
     740     name: "res3b1_branch2c"
     741     type: "Convolution"
     742     convolution_param {
     743         num_output: 512
     744         kernel_size: 1
     745         pad: 0
     746         stride: 1
     747         bias_term: false
     748     }
     749 }
     750 
     751 layer {
     752     bottom: "res3b1_branch2c"
     753     top: "res3b1_branch2c"
     754     name: "bn3b1_branch2c"
     755     type: "BatchNorm"
     756     batch_norm_param {
     757         use_global_stats: true
     758     }
     759 }
     760 
     761 layer {
     762     bottom: "res3b1_branch2c"
     763     top: "res3b1_branch2c"
     764     name: "scale3b1_branch2c"
     765     type: "Scale"
     766     scale_param {
     767         bias_term: true
     768     }
     769 }
     770 
     771 layer {
     772     bottom: "res3a"
     773     bottom: "res3b1_branch2c"
     774     top: "res3b1"
     775     name: "res3b1"
     776     type: "Eltwise"
     777 }
     778 
     779 layer {
     780     bottom: "res3b1"
     781     top: "res3b1"
     782     name: "res3b1_relu"
     783     type: "ReLU"
     784 }
     785 
     786 layer {
     787     bottom: "res3b1"
     788     top: "res3b2_branch2a"
     789     name: "res3b2_branch2a"
     790     type: "Convolution"
     791     convolution_param {
     792         num_output: 128
     793         kernel_size: 1
     794         pad: 0
     795         stride: 1
     796         bias_term: false
     797     }
     798 }
     799 
     800 layer {
     801     bottom: "res3b2_branch2a"
     802     top: "res3b2_branch2a"
     803     name: "bn3b2_branch2a"
     804     type: "BatchNorm"
     805     batch_norm_param {
     806         use_global_stats: true
     807     }
     808 }
     809 
     810 layer {
     811     bottom: "res3b2_branch2a"
     812     top: "res3b2_branch2a"
     813     name: "scale3b2_branch2a"
     814     type: "Scale"
     815     scale_param {
     816         bias_term: true
     817     }
     818 }
     819 
     820 layer {
     821     top: "res3b2_branch2a"
     822     bottom: "res3b2_branch2a"
     823     name: "res3b2_branch2a_relu"
     824     type: "ReLU"
     825 }
     826 
     827 layer {
     828     bottom: "res3b2_branch2a"
     829     top: "res3b2_branch2b"
     830     name: "res3b2_branch2b"
     831     type: "Convolution"
     832     convolution_param {
     833         num_output: 128
     834         kernel_size: 3
     835         pad: 1
     836         stride: 1
     837         bias_term: false
     838     }
     839 }
     840 
     841 layer {
     842     bottom: "res3b2_branch2b"
     843     top: "res3b2_branch2b"
     844     name: "bn3b2_branch2b"
     845     type: "BatchNorm"
     846     batch_norm_param {
     847         use_global_stats: true
     848     }
     849 }
     850 
     851 layer {
     852     bottom: "res3b2_branch2b"
     853     top: "res3b2_branch2b"
     854     name: "scale3b2_branch2b"
     855     type: "Scale"
     856     scale_param {
     857         bias_term: true
     858     }
     859 }
     860 
     861 layer {
     862     top: "res3b2_branch2b"
     863     bottom: "res3b2_branch2b"
     864     name: "res3b2_branch2b_relu"
     865     type: "ReLU"
     866 }
     867 
     868 layer {
     869     bottom: "res3b2_branch2b"
     870     top: "res3b2_branch2c"
     871     name: "res3b2_branch2c"
     872     type: "Convolution"
     873     convolution_param {
     874         num_output: 512
     875         kernel_size: 1
     876         pad: 0
     877         stride: 1
     878         bias_term: false
     879     }
     880 }
     881 
     882 layer {
     883     bottom: "res3b2_branch2c"
     884     top: "res3b2_branch2c"
     885     name: "bn3b2_branch2c"
     886     type: "BatchNorm"
     887     batch_norm_param {
     888         use_global_stats: true
     889     }
     890 }
     891 
     892 layer {
     893     bottom: "res3b2_branch2c"
     894     top: "res3b2_branch2c"
     895     name: "scale3b2_branch2c"
     896     type: "Scale"
     897     scale_param {
     898         bias_term: true
     899     }
     900 }
     901 
     902 layer {
     903     bottom: "res3b1"
     904     bottom: "res3b2_branch2c"
     905     top: "res3b2"
     906     name: "res3b2"
     907     type: "Eltwise"
     908 }
     909 
     910 layer {
     911     bottom: "res3b2"
     912     top: "res3b2"
     913     name: "res3b2_relu"
     914     type: "ReLU"
     915 }
     916 
     917 layer {
     918     bottom: "res3b2"
     919     top: "res3b3_branch2a"
     920     name: "res3b3_branch2a"
     921     type: "Convolution"
     922     convolution_param {
     923         num_output: 128
     924         kernel_size: 1
     925         pad: 0
     926         stride: 1
     927         bias_term: false
     928     }
     929 }
     930 
     931 layer {
     932     bottom: "res3b3_branch2a"
     933     top: "res3b3_branch2a"
     934     name: "bn3b3_branch2a"
     935     type: "BatchNorm"
     936     batch_norm_param {
     937         use_global_stats: true
     938     }
     939 }
     940 
     941 layer {
     942     bottom: "res3b3_branch2a"
     943     top: "res3b3_branch2a"
     944     name: "scale3b3_branch2a"
     945     type: "Scale"
     946     scale_param {
     947         bias_term: true
     948     }
     949 }
     950 
     951 layer {
     952     top: "res3b3_branch2a"
     953     bottom: "res3b3_branch2a"
     954     name: "res3b3_branch2a_relu"
     955     type: "ReLU"
     956 }
     957 
     958 layer {
     959     bottom: "res3b3_branch2a"
     960     top: "res3b3_branch2b"
     961     name: "res3b3_branch2b"
     962     type: "Convolution"
     963     convolution_param {
     964         num_output: 128
     965         kernel_size: 3
     966         pad: 1
     967         stride: 1
     968         bias_term: false
     969     }
     970 }
     971 
     972 layer {
     973     bottom: "res3b3_branch2b"
     974     top: "res3b3_branch2b"
     975     name: "bn3b3_branch2b"
     976     type: "BatchNorm"
     977     batch_norm_param {
     978         use_global_stats: true
     979     }
     980 }
     981 
     982 layer {
     983     bottom: "res3b3_branch2b"
     984     top: "res3b3_branch2b"
     985     name: "scale3b3_branch2b"
     986     type: "Scale"
     987     scale_param {
     988         bias_term: true
     989     }
     990 }
     991 
     992 layer {
     993     top: "res3b3_branch2b"
     994     bottom: "res3b3_branch2b"
     995     name: "res3b3_branch2b_relu"
     996     type: "ReLU"
     997 }
     998 
     999 layer {
    1000     bottom: "res3b3_branch2b"
    1001     top: "res3b3_branch2c"
    1002     name: "res3b3_branch2c"
    1003     type: "Convolution"
    1004     convolution_param {
    1005         num_output: 512
    1006         kernel_size: 1
    1007         pad: 0
    1008         stride: 1
    1009         bias_term: false
    1010     }
    1011 }
    1012 
    1013 layer {
    1014     bottom: "res3b3_branch2c"
    1015     top: "res3b3_branch2c"
    1016     name: "bn3b3_branch2c"
    1017     type: "BatchNorm"
    1018     batch_norm_param {
    1019         use_global_stats: true
    1020     }
    1021 }
    1022 
    1023 layer {
    1024     bottom: "res3b3_branch2c"
    1025     top: "res3b3_branch2c"
    1026     name: "scale3b3_branch2c"
    1027     type: "Scale"
    1028     scale_param {
    1029         bias_term: true
    1030     }
    1031 }
    1032 
    1033 layer {
    1034     bottom: "res3b2"
    1035     bottom: "res3b3_branch2c"
    1036     top: "res3b3"
    1037     name: "res3b3"
    1038     type: "Eltwise"
    1039 }
    1040 
    1041 layer {
    1042     bottom: "res3b3"
    1043     top: "res3b3"
    1044     name: "res3b3_relu"
    1045     type: "ReLU"
    1046 }
    1047 
    1048 layer {
    1049     bottom: "res3b3"
    1050     top: "res3b4_branch2a"
    1051     name: "res3b4_branch2a"
    1052     type: "Convolution"
    1053     convolution_param {
    1054         num_output: 128
    1055         kernel_size: 1
    1056         pad: 0
    1057         stride: 1
    1058         bias_term: false
    1059     }
    1060 }
    1061 
    1062 layer {
    1063     bottom: "res3b4_branch2a"
    1064     top: "res3b4_branch2a"
    1065     name: "bn3b4_branch2a"
    1066     type: "BatchNorm"
    1067     batch_norm_param {
    1068         use_global_stats: true
    1069     }
    1070 }
    1071 
    1072 layer {
    1073     bottom: "res3b4_branch2a"
    1074     top: "res3b4_branch2a"
    1075     name: "scale3b4_branch2a"
    1076     type: "Scale"
    1077     scale_param {
    1078         bias_term: true
    1079     }
    1080 }
    1081 
    1082 layer {
    1083     top: "res3b4_branch2a"
    1084     bottom: "res3b4_branch2a"
    1085     name: "res3b4_branch2a_relu"
    1086     type: "ReLU"
    1087 }
    1088 
    1089 layer {
    1090     bottom: "res3b4_branch2a"
    1091     top: "res3b4_branch2b"
    1092     name: "res3b4_branch2b"
    1093     type: "Convolution"
    1094     convolution_param {
    1095         num_output: 128
    1096         kernel_size: 3
    1097         pad: 1
    1098         stride: 1
    1099         bias_term: false
    1100     }
    1101 }
    1102 
    1103 layer {
    1104     bottom: "res3b4_branch2b"
    1105     top: "res3b4_branch2b"
    1106     name: "bn3b4_branch2b"
    1107     type: "BatchNorm"
    1108     batch_norm_param {
    1109         use_global_stats: true
    1110     }
    1111 }
    1112 
    1113 layer {
    1114     bottom: "res3b4_branch2b"
    1115     top: "res3b4_branch2b"
    1116     name: "scale3b4_branch2b"
    1117     type: "Scale"
    1118     scale_param {
    1119         bias_term: true
    1120     }
    1121 }
    1122 
    1123 layer {
    1124     top: "res3b4_branch2b"
    1125     bottom: "res3b4_branch2b"
    1126     name: "res3b4_branch2b_relu"
    1127     type: "ReLU"
    1128 }
    1129 
    1130 layer {
    1131     bottom: "res3b4_branch2b"
    1132     top: "res3b4_branch2c"
    1133     name: "res3b4_branch2c"
    1134     type: "Convolution"
    1135     convolution_param {
    1136         num_output: 512
    1137         kernel_size: 1
    1138         pad: 0
    1139         stride: 1
    1140         bias_term: false
    1141     }
    1142 }
    1143 
    1144 layer {
    1145     bottom: "res3b4_branch2c"
    1146     top: "res3b4_branch2c"
    1147     name: "bn3b4_branch2c"
    1148     type: "BatchNorm"
    1149     batch_norm_param {
    1150         use_global_stats: true
    1151     }
    1152 }
    1153 
    1154 layer {
    1155     bottom: "res3b4_branch2c"
    1156     top: "res3b4_branch2c"
    1157     name: "scale3b4_branch2c"
    1158     type: "Scale"
    1159     scale_param {
    1160         bias_term: true
    1161     }
    1162 }
    1163 
    1164 layer {
    1165     bottom: "res3b3"
    1166     bottom: "res3b4_branch2c"
    1167     top: "res3b4"
    1168     name: "res3b4"
    1169     type: "Eltwise"
    1170 }
    1171 
    1172 layer {
    1173     bottom: "res3b4"
    1174     top: "res3b4"
    1175     name: "res3b4_relu"
    1176     type: "ReLU"
    1177 }
    1178 
    1179 layer {
    1180     bottom: "res3b4"
    1181     top: "res3b5_branch2a"
    1182     name: "res3b5_branch2a"
    1183     type: "Convolution"
    1184     convolution_param {
    1185         num_output: 128
    1186         kernel_size: 1
    1187         pad: 0
    1188         stride: 1
    1189         bias_term: false
    1190     }
    1191 }
    1192 
    1193 layer {
    1194     bottom: "res3b5_branch2a"
    1195     top: "res3b5_branch2a"
    1196     name: "bn3b5_branch2a"
    1197     type: "BatchNorm"
    1198     batch_norm_param {
    1199         use_global_stats: true
    1200     }
    1201 }
    1202 
    1203 layer {
    1204     bottom: "res3b5_branch2a"
    1205     top: "res3b5_branch2a"
    1206     name: "scale3b5_branch2a"
    1207     type: "Scale"
    1208     scale_param {
    1209         bias_term: true
    1210     }
    1211 }
    1212 
    1213 layer {
    1214     top: "res3b5_branch2a"
    1215     bottom: "res3b5_branch2a"
    1216     name: "res3b5_branch2a_relu"
    1217     type: "ReLU"
    1218 }
    1219 
    1220 layer {
    1221     bottom: "res3b5_branch2a"
    1222     top: "res3b5_branch2b"
    1223     name: "res3b5_branch2b"
    1224     type: "Convolution"
    1225     convolution_param {
    1226         num_output: 128
    1227         kernel_size: 3
    1228         pad: 1
    1229         stride: 1
    1230         bias_term: false
    1231     }
    1232 }
    1233 
    1234 layer {
    1235     bottom: "res3b5_branch2b"
    1236     top: "res3b5_branch2b"
    1237     name: "bn3b5_branch2b"
    1238     type: "BatchNorm"
    1239     batch_norm_param {
    1240         use_global_stats: true
    1241     }
    1242 }
    1243 
    1244 layer {
    1245     bottom: "res3b5_branch2b"
    1246     top: "res3b5_branch2b"
    1247     name: "scale3b5_branch2b"
    1248     type: "Scale"
    1249     scale_param {
    1250         bias_term: true
    1251     }
    1252 }
    1253 
    1254 layer {
    1255     top: "res3b5_branch2b"
    1256     bottom: "res3b5_branch2b"
    1257     name: "res3b5_branch2b_relu"
    1258     type: "ReLU"
    1259 }
    1260 
    1261 layer {
    1262     bottom: "res3b5_branch2b"
    1263     top: "res3b5_branch2c"
    1264     name: "res3b5_branch2c"
    1265     type: "Convolution"
    1266     convolution_param {
    1267         num_output: 512
    1268         kernel_size: 1
    1269         pad: 0
    1270         stride: 1
    1271         bias_term: false
    1272     }
    1273 }
    1274 
    1275 layer {
    1276     bottom: "res3b5_branch2c"
    1277     top: "res3b5_branch2c"
    1278     name: "bn3b5_branch2c"
    1279     type: "BatchNorm"
    1280     batch_norm_param {
    1281         use_global_stats: true
    1282     }
    1283 }
    1284 
    1285 layer {
    1286     bottom: "res3b5_branch2c"
    1287     top: "res3b5_branch2c"
    1288     name: "scale3b5_branch2c"
    1289     type: "Scale"
    1290     scale_param {
    1291         bias_term: true
    1292     }
    1293 }
    1294 
    1295 layer {
    1296     bottom: "res3b4"
    1297     bottom: "res3b5_branch2c"
    1298     top: "res3b5"
    1299     name: "res3b5"
    1300     type: "Eltwise"
    1301 }
    1302 
    1303 layer {
    1304     bottom: "res3b5"
    1305     top: "res3b5"
    1306     name: "res3b5_relu"
    1307     type: "ReLU"
    1308 }
    1309 
    1310 layer {
    1311     bottom: "res3b5"
    1312     top: "res3b6_branch2a"
    1313     name: "res3b6_branch2a"
    1314     type: "Convolution"
    1315     convolution_param {
    1316         num_output: 128
    1317         kernel_size: 1
    1318         pad: 0
    1319         stride: 1
    1320         bias_term: false
    1321     }
    1322 }
    1323 
    1324 layer {
    1325     bottom: "res3b6_branch2a"
    1326     top: "res3b6_branch2a"
    1327     name: "bn3b6_branch2a"
    1328     type: "BatchNorm"
    1329     batch_norm_param {
    1330         use_global_stats: true
    1331     }
    1332 }
    1333 
    1334 layer {
    1335     bottom: "res3b6_branch2a"
    1336     top: "res3b6_branch2a"
    1337     name: "scale3b6_branch2a"
    1338     type: "Scale"
    1339     scale_param {
    1340         bias_term: true
    1341     }
    1342 }
    1343 
    1344 layer {
    1345     top: "res3b6_branch2a"
    1346     bottom: "res3b6_branch2a"
    1347     name: "res3b6_branch2a_relu"
    1348     type: "ReLU"
    1349 }
    1350 
    1351 layer {
    1352     bottom: "res3b6_branch2a"
    1353     top: "res3b6_branch2b"
    1354     name: "res3b6_branch2b"
    1355     type: "Convolution"
    1356     convolution_param {
    1357         num_output: 128
    1358         kernel_size: 3
    1359         pad: 1
    1360         stride: 1
    1361         bias_term: false
    1362     }
    1363 }
    1364 
    1365 layer {
    1366     bottom: "res3b6_branch2b"
    1367     top: "res3b6_branch2b"
    1368     name: "bn3b6_branch2b"
    1369     type: "BatchNorm"
    1370     batch_norm_param {
    1371         use_global_stats: true
    1372     }
    1373 }
    1374 
    1375 layer {
    1376     bottom: "res3b6_branch2b"
    1377     top: "res3b6_branch2b"
    1378     name: "scale3b6_branch2b"
    1379     type: "Scale"
    1380     scale_param {
    1381         bias_term: true
    1382     }
    1383 }
    1384 
    1385 layer {
    1386     top: "res3b6_branch2b"
    1387     bottom: "res3b6_branch2b"
    1388     name: "res3b6_branch2b_relu"
    1389     type: "ReLU"
    1390 }
    1391 
    1392 layer {
    1393     bottom: "res3b6_branch2b"
    1394     top: "res3b6_branch2c"
    1395     name: "res3b6_branch2c"
    1396     type: "Convolution"
    1397     convolution_param {
    1398         num_output: 512
    1399         kernel_size: 1
    1400         pad: 0
    1401         stride: 1
    1402         bias_term: false
    1403     }
    1404 }
    1405 
    1406 layer {
    1407     bottom: "res3b6_branch2c"
    1408     top: "res3b6_branch2c"
    1409     name: "bn3b6_branch2c"
    1410     type: "BatchNorm"
    1411     batch_norm_param {
    1412         use_global_stats: true
    1413     }
    1414 }
    1415 
    1416 layer {
    1417     bottom: "res3b6_branch2c"
    1418     top: "res3b6_branch2c"
    1419     name: "scale3b6_branch2c"
    1420     type: "Scale"
    1421     scale_param {
    1422         bias_term: true
    1423     }
    1424 }
    1425 
    1426 layer {
    1427     bottom: "res3b5"
    1428     bottom: "res3b6_branch2c"
    1429     top: "res3b6"
    1430     name: "res3b6"
    1431     type: "Eltwise"
    1432 }
    1433 
    1434 layer {
    1435     bottom: "res3b6"
    1436     top: "res3b6"
    1437     name: "res3b6_relu"
    1438     type: "ReLU"
    1439 }
    1440 
    1441 layer {
    1442     bottom: "res3b6"
    1443     top: "res3b7_branch2a"
    1444     name: "res3b7_branch2a"
    1445     type: "Convolution"
    1446     convolution_param {
    1447         num_output: 128
    1448         kernel_size: 1
    1449         pad: 0
    1450         stride: 1
    1451         bias_term: false
    1452     }
    1453 }
    1454 
    1455 layer {
    1456     bottom: "res3b7_branch2a"
    1457     top: "res3b7_branch2a"
    1458     name: "bn3b7_branch2a"
    1459     type: "BatchNorm"
    1460     batch_norm_param {
    1461         use_global_stats: true
    1462     }
    1463 }
    1464 
    1465 layer {
    1466     bottom: "res3b7_branch2a"
    1467     top: "res3b7_branch2a"
    1468     name: "scale3b7_branch2a"
    1469     type: "Scale"
    1470     scale_param {
    1471         bias_term: true
    1472     }
    1473 }
    1474 
    1475 layer {
    1476     top: "res3b7_branch2a"
    1477     bottom: "res3b7_branch2a"
    1478     name: "res3b7_branch2a_relu"
    1479     type: "ReLU"
    1480 }
    1481 
    1482 layer {
    1483     bottom: "res3b7_branch2a"
    1484     top: "res3b7_branch2b"
    1485     name: "res3b7_branch2b"
    1486     type: "Convolution"
    1487     convolution_param {
    1488         num_output: 128
    1489         kernel_size: 3
    1490         pad: 1
    1491         stride: 1
    1492         bias_term: false
    1493     }
    1494 }
    1495 
    1496 layer {
    1497     bottom: "res3b7_branch2b"
    1498     top: "res3b7_branch2b"
    1499     name: "bn3b7_branch2b"
    1500     type: "BatchNorm"
    1501     batch_norm_param {
    1502         use_global_stats: true
    1503     }
    1504 }
    1505 
    1506 layer {
    1507     bottom: "res3b7_branch2b"
    1508     top: "res3b7_branch2b"
    1509     name: "scale3b7_branch2b"
    1510     type: "Scale"
    1511     scale_param {
    1512         bias_term: true
    1513     }
    1514 }
    1515 
    1516 layer {
    1517     top: "res3b7_branch2b"
    1518     bottom: "res3b7_branch2b"
    1519     name: "res3b7_branch2b_relu"
    1520     type: "ReLU"
    1521 }
    1522 
    1523 layer {
    1524     bottom: "res3b7_branch2b"
    1525     top: "res3b7_branch2c"
    1526     name: "res3b7_branch2c"
    1527     type: "Convolution"
    1528     convolution_param {
    1529         num_output: 512
    1530         kernel_size: 1
    1531         pad: 0
    1532         stride: 1
    1533         bias_term: false
    1534     }
    1535 }
    1536 
    1537 layer {
    1538     bottom: "res3b7_branch2c"
    1539     top: "res3b7_branch2c"
    1540     name: "bn3b7_branch2c"
    1541     type: "BatchNorm"
    1542     batch_norm_param {
    1543         use_global_stats: true
    1544     }
    1545 }
    1546 
    1547 layer {
    1548     bottom: "res3b7_branch2c"
    1549     top: "res3b7_branch2c"
    1550     name: "scale3b7_branch2c"
    1551     type: "Scale"
    1552     scale_param {
    1553         bias_term: true
    1554     }
    1555 }
    1556 
    1557 layer {
    1558     bottom: "res3b6"
    1559     bottom: "res3b7_branch2c"
    1560     top: "res3b7"
    1561     name: "res3b7"
    1562     type: "Eltwise"
    1563 }
    1564 
    1565 layer {
    1566     bottom: "res3b7"
    1567     top: "res3b7"
    1568     name: "res3b7_relu"
    1569     type: "ReLU"
    1570 }
    1571 
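# (annotation added for readability; not part of the original prototxt)
# Transition from stage 3 (512-channel outputs) to stage 4: res4a uses a projection
# shortcut, res4a_branch1, a 1x1 convolution with stride 2 and 1024 outputs, so the
# shortcut path is downsampled and widened to match branch2. res4a_branch2a also uses
# stride 2, halving the spatial resolution for the whole stage.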
    1572 layer {
    1573     bottom: "res3b7"
    1574     top: "res4a_branch1"
    1575     name: "res4a_branch1"
    1576     type: "Convolution"
    1577     convolution_param {
    1578         num_output: 1024
    1579         kernel_size: 1
    1580         pad: 0
    1581         stride: 2
    1582         bias_term: false
    1583     }
    1584 }
    1585 
    1586 layer {
    1587     bottom: "res4a_branch1"
    1588     top: "res4a_branch1"
    1589     name: "bn4a_branch1"
    1590     type: "BatchNorm"
    1591     batch_norm_param {
    1592         use_global_stats: true
    1593     }
    1594 }
    1595 
    1596 layer {
    1597     bottom: "res4a_branch1"
    1598     top: "res4a_branch1"
    1599     name: "scale4a_branch1"
    1600     type: "Scale"
    1601     scale_param {
    1602         bias_term: true
    1603     }
    1604 }
    1605 
    1606 layer {
    1607     bottom: "res3b7"
    1608     top: "res4a_branch2a"
    1609     name: "res4a_branch2a"
    1610     type: "Convolution"
    1611     convolution_param {
    1612         num_output: 256
    1613         kernel_size: 1
    1614         pad: 0
    1615         stride: 2
    1616         bias_term: false
    1617     }
    1618 }
    1619 
    1620 layer {
    1621     bottom: "res4a_branch2a"
    1622     top: "res4a_branch2a"
    1623     name: "bn4a_branch2a"
    1624     type: "BatchNorm"
    1625     batch_norm_param {
    1626         use_global_stats: true
    1627     }
    1628 }
    1629 
    1630 layer {
    1631     bottom: "res4a_branch2a"
    1632     top: "res4a_branch2a"
    1633     name: "scale4a_branch2a"
    1634     type: "Scale"
    1635     scale_param {
    1636         bias_term: true
    1637     }
    1638 }
    1639 
    1640 layer {
    1641     top: "res4a_branch2a"
    1642     bottom: "res4a_branch2a"
    1643     name: "res4a_branch2a_relu"
    1644     type: "ReLU"
    1645 }
    1646 
    1647 layer {
    1648     bottom: "res4a_branch2a"
    1649     top: "res4a_branch2b"
    1650     name: "res4a_branch2b"
    1651     type: "Convolution"
    1652     convolution_param {
    1653         num_output: 256
    1654         kernel_size: 3
    1655         pad: 1
    1656         stride: 1
    1657         bias_term: false
    1658     }
    1659 }
    1660 
    1661 layer {
    1662     bottom: "res4a_branch2b"
    1663     top: "res4a_branch2b"
    1664     name: "bn4a_branch2b"
    1665     type: "BatchNorm"
    1666     batch_norm_param {
    1667         use_global_stats: true
    1668     }
    1669 }
    1670 
    1671 layer {
    1672     bottom: "res4a_branch2b"
    1673     top: "res4a_branch2b"
    1674     name: "scale4a_branch2b"
    1675     type: "Scale"
    1676     scale_param {
    1677         bias_term: true
    1678     }
    1679 }
    1680 
    1681 layer {
    1682     top: "res4a_branch2b"
    1683     bottom: "res4a_branch2b"
    1684     name: "res4a_branch2b_relu"
    1685     type: "ReLU"
    1686 }
    1687 
    1688 layer {
    1689     bottom: "res4a_branch2b"
    1690     top: "res4a_branch2c"
    1691     name: "res4a_branch2c"
    1692     type: "Convolution"
    1693     convolution_param {
    1694         num_output: 1024
    1695         kernel_size: 1
    1696         pad: 0
    1697         stride: 1
    1698         bias_term: false
    1699     }
    1700 }
    1701 
    1702 layer {
    1703     bottom: "res4a_branch2c"
    1704     top: "res4a_branch2c"
    1705     name: "bn4a_branch2c"
    1706     type: "BatchNorm"
    1707     batch_norm_param {
    1708         use_global_stats: true
    1709     }
    1710 }
    1711 
    1712 layer {
    1713     bottom: "res4a_branch2c"
    1714     top: "res4a_branch2c"
    1715     name: "scale4a_branch2c"
    1716     type: "Scale"
    1717     scale_param {
    1718         bias_term: true
    1719     }
    1720 }
    1721 
    1722 layer {
    1723     bottom: "res4a_branch1"
    1724     bottom: "res4a_branch2c"
    1725     top: "res4a"
    1726     name: "res4a"
    1727     type: "Eltwise"
    1728 }
    1729 
    1730 layer {
    1731     bottom: "res4a"
    1732     top: "res4a"
    1733     name: "res4a_relu"
    1734     type: "ReLU"
    1735 }
    1736 
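# (annotation added for readability; not part of the original prototxt)
# From res4b1 onward the stage-4 blocks use identity shortcuts: bottleneck
# 1x1 conv (256) -> 3x3 conv (256, pad 1) -> 1x1 conv (1024), BatchNorm + Scale after
# each convolution, Eltwise sum with the previous block's output, then ReLU.
# In the standard ResNet-152 definition this unit repeats through res4b35.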
    1737 layer {
    1738     bottom: "res4a"
    1739     top: "res4b1_branch2a"
    1740     name: "res4b1_branch2a"
    1741     type: "Convolution"
    1742     convolution_param {
    1743         num_output: 256
    1744         kernel_size: 1
    1745         pad: 0
    1746         stride: 1
    1747         bias_term: false
    1748     }
    1749 }
    1750 
    1751 layer {
    1752     bottom: "res4b1_branch2a"
    1753     top: "res4b1_branch2a"
    1754     name: "bn4b1_branch2a"
    1755     type: "BatchNorm"
    1756     batch_norm_param {
    1757         use_global_stats: true
    1758     }
    1759 }
    1760 
    1761 layer {
    1762     bottom: "res4b1_branch2a"
    1763     top: "res4b1_branch2a"
    1764     name: "scale4b1_branch2a"
    1765     type: "Scale"
    1766     scale_param {
    1767         bias_term: true
    1768     }
    1769 }
    1770 
    1771 layer {
    1772     top: "res4b1_branch2a"
    1773     bottom: "res4b1_branch2a"
    1774     name: "res4b1_branch2a_relu"
    1775     type: "ReLU"
    1776 }
    1777 
    1778 layer {
    1779     bottom: "res4b1_branch2a"
    1780     top: "res4b1_branch2b"
    1781     name: "res4b1_branch2b"
    1782     type: "Convolution"
    1783     convolution_param {
    1784         num_output: 256
    1785         kernel_size: 3
    1786         pad: 1
    1787         stride: 1
    1788         bias_term: false
    1789     }
    1790 }
    1791 
    1792 layer {
    1793     bottom: "res4b1_branch2b"
    1794     top: "res4b1_branch2b"
    1795     name: "bn4b1_branch2b"
    1796     type: "BatchNorm"
    1797     batch_norm_param {
    1798         use_global_stats: true
    1799     }
    1800 }
    1801 
    1802 layer {
    1803     bottom: "res4b1_branch2b"
    1804     top: "res4b1_branch2b"
    1805     name: "scale4b1_branch2b"
    1806     type: "Scale"
    1807     scale_param {
    1808         bias_term: true
    1809     }
    1810 }
    1811 
    1812 layer {
    1813     top: "res4b1_branch2b"
    1814     bottom: "res4b1_branch2b"
    1815     name: "res4b1_branch2b_relu"
    1816     type: "ReLU"
    1817 }
    1818 
    1819 layer {
    1820     bottom: "res4b1_branch2b"
    1821     top: "res4b1_branch2c"
    1822     name: "res4b1_branch2c"
    1823     type: "Convolution"
    1824     convolution_param {
    1825         num_output: 1024
    1826         kernel_size: 1
    1827         pad: 0
    1828         stride: 1
    1829         bias_term: false
    1830     }
    1831 }
    1832 
    1833 layer {
    1834     bottom: "res4b1_branch2c"
    1835     top: "res4b1_branch2c"
    1836     name: "bn4b1_branch2c"
    1837     type: "BatchNorm"
    1838     batch_norm_param {
    1839         use_global_stats: true
    1840     }
    1841 }
    1842 
    1843 layer {
    1844     bottom: "res4b1_branch2c"
    1845     top: "res4b1_branch2c"
    1846     name: "scale4b1_branch2c"
    1847     type: "Scale"
    1848     scale_param {
    1849         bias_term: true
    1850     }
    1851 }
    1852 
    1853 layer {
    1854     bottom: "res4a"
    1855     bottom: "res4b1_branch2c"
    1856     top: "res4b1"
    1857     name: "res4b1"
    1858     type: "Eltwise"
    1859 }
    1860 
    1861 layer {
    1862     bottom: "res4b1"
    1863     top: "res4b1"
    1864     name: "res4b1_relu"
    1865     type: "ReLU"
    1866 }
    1867 
    1868 layer {
    1869     bottom: "res4b1"
    1870     top: "res4b2_branch2a"
    1871     name: "res4b2_branch2a"
    1872     type: "Convolution"
    1873     convolution_param {
    1874         num_output: 256
    1875         kernel_size: 1
    1876         pad: 0
    1877         stride: 1
    1878         bias_term: false
    1879     }
    1880 }
    1881 
    1882 layer {
    1883     bottom: "res4b2_branch2a"
    1884     top: "res4b2_branch2a"
    1885     name: "bn4b2_branch2a"
    1886     type: "BatchNorm"
    1887     batch_norm_param {
    1888         use_global_stats: true
    1889     }
    1890 }
    1891 
    1892 layer {
    1893     bottom: "res4b2_branch2a"
    1894     top: "res4b2_branch2a"
    1895     name: "scale4b2_branch2a"
    1896     type: "Scale"
    1897     scale_param {
    1898         bias_term: true
    1899     }
    1900 }
    1901 
    1902 layer {
    1903     top: "res4b2_branch2a"
    1904     bottom: "res4b2_branch2a"
    1905     name: "res4b2_branch2a_relu"
    1906     type: "ReLU"
    1907 }
    1908 
    1909 layer {
    1910     bottom: "res4b2_branch2a"
    1911     top: "res4b2_branch2b"
    1912     name: "res4b2_branch2b"
    1913     type: "Convolution"
    1914     convolution_param {
    1915         num_output: 256
    1916         kernel_size: 3
    1917         pad: 1
    1918         stride: 1
    1919         bias_term: false
    1920     }
    1921 }
    1922 
    1923 layer {
    1924     bottom: "res4b2_branch2b"
    1925     top: "res4b2_branch2b"
    1926     name: "bn4b2_branch2b"
    1927     type: "BatchNorm"
    1928     batch_norm_param {
    1929         use_global_stats: true
    1930     }
    1931 }
    1932 
    1933 layer {
    1934     bottom: "res4b2_branch2b"
    1935     top: "res4b2_branch2b"
    1936     name: "scale4b2_branch2b"
    1937     type: "Scale"
    1938     scale_param {
    1939         bias_term: true
    1940     }
    1941 }
    1942 
    1943 layer {
    1944     top: "res4b2_branch2b"
    1945     bottom: "res4b2_branch2b"
    1946     name: "res4b2_branch2b_relu"
    1947     type: "ReLU"
    1948 }
    1949 
    1950 layer {
    1951     bottom: "res4b2_branch2b"
    1952     top: "res4b2_branch2c"
    1953     name: "res4b2_branch2c"
    1954     type: "Convolution"
    1955     convolution_param {
    1956         num_output: 1024
    1957         kernel_size: 1
    1958         pad: 0
    1959         stride: 1
    1960         bias_term: false
    1961     }
    1962 }
    1963 
    1964 layer {
    1965     bottom: "res4b2_branch2c"
    1966     top: "res4b2_branch2c"
    1967     name: "bn4b2_branch2c"
    1968     type: "BatchNorm"
    1969     batch_norm_param {
    1970         use_global_stats: true
    1971     }
    1972 }
    1973 
    1974 layer {
    1975     bottom: "res4b2_branch2c"
    1976     top: "res4b2_branch2c"
    1977     name: "scale4b2_branch2c"
    1978     type: "Scale"
    1979     scale_param {
    1980         bias_term: true
    1981     }
    1982 }
    1983 
    1984 layer {
    1985     bottom: "res4b1"
    1986     bottom: "res4b2_branch2c"
    1987     top: "res4b2"
    1988     name: "res4b2"
    1989     type: "Eltwise"
    1990 }
    1991 
    1992 layer {
    1993     bottom: "res4b2"
    1994     top: "res4b2"
    1995     name: "res4b2_relu"
    1996     type: "ReLU"
    1997 }
    1998 
    1999 layer {
    2000     bottom: "res4b2"
    2001     top: "res4b3_branch2a"
    2002     name: "res4b3_branch2a"
    2003     type: "Convolution"
    2004     convolution_param {
    2005         num_output: 256
    2006         kernel_size: 1
    2007         pad: 0
    2008         stride: 1
    2009         bias_term: false
    2010     }
    2011 }
    2012 
    2013 layer {
    2014     bottom: "res4b3_branch2a"
    2015     top: "res4b3_branch2a"
    2016     name: "bn4b3_branch2a"
    2017     type: "BatchNorm"
    2018     batch_norm_param {
    2019         use_global_stats: true
    2020     }
    2021 }
    2022 
    2023 layer {
    2024     bottom: "res4b3_branch2a"
    2025     top: "res4b3_branch2a"
    2026     name: "scale4b3_branch2a"
    2027     type: "Scale"
    2028     scale_param {
    2029         bias_term: true
    2030     }
    2031 }
    2032 
    2033 layer {
    2034     top: "res4b3_branch2a"
    2035     bottom: "res4b3_branch2a"
    2036     name: "res4b3_branch2a_relu"
    2037     type: "ReLU"
    2038 }
    2039 
    2040 layer {
    2041     bottom: "res4b3_branch2a"
    2042     top: "res4b3_branch2b"
    2043     name: "res4b3_branch2b"
    2044     type: "Convolution"
    2045     convolution_param {
    2046         num_output: 256
    2047         kernel_size: 3
    2048         pad: 1
    2049         stride: 1
    2050         bias_term: false
    2051     }
    2052 }
    2053 
    2054 layer {
    2055     bottom: "res4b3_branch2b"
    2056     top: "res4b3_branch2b"
    2057     name: "bn4b3_branch2b"
    2058     type: "BatchNorm"
    2059     batch_norm_param {
    2060         use_global_stats: true
    2061     }
    2062 }
    2063 
    2064 layer {
    2065     bottom: "res4b3_branch2b"
    2066     top: "res4b3_branch2b"
    2067     name: "scale4b3_branch2b"
    2068     type: "Scale"
    2069     scale_param {
    2070         bias_term: true
    2071     }
    2072 }
    2073 
    2074 layer {
    2075     top: "res4b3_branch2b"
    2076     bottom: "res4b3_branch2b"
    2077     name: "res4b3_branch2b_relu"
    2078     type: "ReLU"
    2079 }
    2080 
    2081 layer {
    2082     bottom: "res4b3_branch2b"
    2083     top: "res4b3_branch2c"
    2084     name: "res4b3_branch2c"
    2085     type: "Convolution"
    2086     convolution_param {
    2087         num_output: 1024
    2088         kernel_size: 1
    2089         pad: 0
    2090         stride: 1
    2091         bias_term: false
    2092     }
    2093 }
    2094 
    2095 layer {
    2096     bottom: "res4b3_branch2c"
    2097     top: "res4b3_branch2c"
    2098     name: "bn4b3_branch2c"
    2099     type: "BatchNorm"
    2100     batch_norm_param {
    2101         use_global_stats: true
    2102     }
    2103 }
    2104 
    2105 layer {
    2106     bottom: "res4b3_branch2c"
    2107     top: "res4b3_branch2c"
    2108     name: "scale4b3_branch2c"
    2109     type: "Scale"
    2110     scale_param {
    2111         bias_term: true
    2112     }
    2113 }
    2114 
    2115 layer {
    2116     bottom: "res4b2"
    2117     bottom: "res4b3_branch2c"
    2118     top: "res4b3"
    2119     name: "res4b3"
    2120     type: "Eltwise"
    2121 }
    2122 
    2123 layer {
    2124     bottom: "res4b3"
    2125     top: "res4b3"
    2126     name: "res4b3_relu"
    2127     type: "ReLU"
    2128 }
    2129 
    2130 layer {
    2131     bottom: "res4b3"
    2132     top: "res4b4_branch2a"
    2133     name: "res4b4_branch2a"
    2134     type: "Convolution"
    2135     convolution_param {
    2136         num_output: 256
    2137         kernel_size: 1
    2138         pad: 0
    2139         stride: 1
    2140         bias_term: false
    2141     }
    2142 }
    2143 
    2144 layer {
    2145     bottom: "res4b4_branch2a"
    2146     top: "res4b4_branch2a"
    2147     name: "bn4b4_branch2a"
    2148     type: "BatchNorm"
    2149     batch_norm_param {
    2150         use_global_stats: true
    2151     }
    2152 }
    2153 
    2154 layer {
    2155     bottom: "res4b4_branch2a"
    2156     top: "res4b4_branch2a"
    2157     name: "scale4b4_branch2a"
    2158     type: "Scale"
    2159     scale_param {
    2160         bias_term: true
    2161     }
    2162 }
    2163 
    2164 layer {
    2165     top: "res4b4_branch2a"
    2166     bottom: "res4b4_branch2a"
    2167     name: "res4b4_branch2a_relu"
    2168     type: "ReLU"
    2169 }
    2170 
    2171 layer {
    2172     bottom: "res4b4_branch2a"
    2173     top: "res4b4_branch2b"
    2174     name: "res4b4_branch2b"
    2175     type: "Convolution"
    2176     convolution_param {
    2177         num_output: 256
    2178         kernel_size: 3
    2179         pad: 1
    2180         stride: 1
    2181         bias_term: false
    2182     }
    2183 }
    2184 
    2185 layer {
    2186     bottom: "res4b4_branch2b"
    2187     top: "res4b4_branch2b"
    2188     name: "bn4b4_branch2b"
    2189     type: "BatchNorm"
    2190     batch_norm_param {
    2191         use_global_stats: true
    2192     }
    2193 }
    2194 
    2195 layer {
    2196     bottom: "res4b4_branch2b"
    2197     top: "res4b4_branch2b"
    2198     name: "scale4b4_branch2b"
    2199     type: "Scale"
    2200     scale_param {
    2201         bias_term: true
    2202     }
    2203 }
    2204 
    2205 layer {
    2206     top: "res4b4_branch2b"
    2207     bottom: "res4b4_branch2b"
    2208     name: "res4b4_branch2b_relu"
    2209     type: "ReLU"
    2210 }
    2211 
    2212 layer {
    2213     bottom: "res4b4_branch2b"
    2214     top: "res4b4_branch2c"
    2215     name: "res4b4_branch2c"
    2216     type: "Convolution"
    2217     convolution_param {
    2218         num_output: 1024
    2219         kernel_size: 1
    2220         pad: 0
    2221         stride: 1
    2222         bias_term: false
    2223     }
    2224 }
    2225 
    2226 layer {
    2227     bottom: "res4b4_branch2c"
    2228     top: "res4b4_branch2c"
    2229     name: "bn4b4_branch2c"
    2230     type: "BatchNorm"
    2231     batch_norm_param {
    2232         use_global_stats: true
    2233     }
    2234 }
    2235 
    2236 layer {
    2237     bottom: "res4b4_branch2c"
    2238     top: "res4b4_branch2c"
    2239     name: "scale4b4_branch2c"
    2240     type: "Scale"
    2241     scale_param {
    2242         bias_term: true
    2243     }
    2244 }
    2245 
    2246 layer {
    2247     bottom: "res4b3"
    2248     bottom: "res4b4_branch2c"
    2249     top: "res4b4"
    2250     name: "res4b4"
    2251     type: "Eltwise"
    2252 }
    2253 
    2254 layer {
    2255     bottom: "res4b4"
    2256     top: "res4b4"
    2257     name: "res4b4_relu"
    2258     type: "ReLU"
    2259 }
    2260 
    2261 layer {
    2262     bottom: "res4b4"
    2263     top: "res4b5_branch2a"
    2264     name: "res4b5_branch2a"
    2265     type: "Convolution"
    2266     convolution_param {
    2267         num_output: 256
    2268         kernel_size: 1
    2269         pad: 0
    2270         stride: 1
    2271         bias_term: false
    2272     }
    2273 }
    2274 
    2275 layer {
    2276     bottom: "res4b5_branch2a"
    2277     top: "res4b5_branch2a"
    2278     name: "bn4b5_branch2a"
    2279     type: "BatchNorm"
    2280     batch_norm_param {
    2281         use_global_stats: true
    2282     }
    2283 }
    2284 
    2285 layer {
    2286     bottom: "res4b5_branch2a"
    2287     top: "res4b5_branch2a"
    2288     name: "scale4b5_branch2a"
    2289     type: "Scale"
    2290     scale_param {
    2291         bias_term: true
    2292     }
    2293 }
    2294 
    2295 layer {
    2296     top: "res4b5_branch2a"
    2297     bottom: "res4b5_branch2a"
    2298     name: "res4b5_branch2a_relu"
    2299     type: "ReLU"
    2300 }
    2301 
    2302 layer {
    2303     bottom: "res4b5_branch2a"
    2304     top: "res4b5_branch2b"
    2305     name: "res4b5_branch2b"
    2306     type: "Convolution"
    2307     convolution_param {
    2308         num_output: 256
    2309         kernel_size: 3
    2310         pad: 1
    2311         stride: 1
    2312         bias_term: false
    2313     }
    2314 }
    2315 
    2316 layer {
    2317     bottom: "res4b5_branch2b"
    2318     top: "res4b5_branch2b"
    2319     name: "bn4b5_branch2b"
    2320     type: "BatchNorm"
    2321     batch_norm_param {
    2322         use_global_stats: true
    2323     }
    2324 }
    2325 
    2326 layer {
    2327     bottom: "res4b5_branch2b"
    2328     top: "res4b5_branch2b"
    2329     name: "scale4b5_branch2b"
    2330     type: "Scale"
    2331     scale_param {
    2332         bias_term: true
    2333     }
    2334 }
    2335 
    2336 layer {
    2337     top: "res4b5_branch2b"
    2338     bottom: "res4b5_branch2b"
    2339     name: "res4b5_branch2b_relu"
    2340     type: "ReLU"
    2341 }
    2342 
    2343 layer {
    2344     bottom: "res4b5_branch2b"
    2345     top: "res4b5_branch2c"
    2346     name: "res4b5_branch2c"
    2347     type: "Convolution"
    2348     convolution_param {
    2349         num_output: 1024
    2350         kernel_size: 1
    2351         pad: 0
    2352         stride: 1
    2353         bias_term: false
    2354     }
    2355 }
    2356 
    2357 layer {
    2358     bottom: "res4b5_branch2c"
    2359     top: "res4b5_branch2c"
    2360     name: "bn4b5_branch2c"
    2361     type: "BatchNorm"
    2362     batch_norm_param {
    2363         use_global_stats: true
    2364     }
    2365 }
    2366 
    2367 layer {
    2368     bottom: "res4b5_branch2c"
    2369     top: "res4b5_branch2c"
    2370     name: "scale4b5_branch2c"
    2371     type: "Scale"
    2372     scale_param {
    2373         bias_term: true
    2374     }
    2375 }
    2376 
    2377 layer {
    2378     bottom: "res4b4"
    2379     bottom: "res4b5_branch2c"
    2380     top: "res4b5"
    2381     name: "res4b5"
    2382     type: "Eltwise"
    2383 }
    2384 
    2385 layer {
    2386     bottom: "res4b5"
    2387     top: "res4b5"
    2388     name: "res4b5_relu"
    2389     type: "ReLU"
    2390 }
    2391 
    2392 layer {
    2393     bottom: "res4b5"
    2394     top: "res4b6_branch2a"
    2395     name: "res4b6_branch2a"
    2396     type: "Convolution"
    2397     convolution_param {
    2398         num_output: 256
    2399         kernel_size: 1
    2400         pad: 0
    2401         stride: 1
    2402         bias_term: false
    2403     }
    2404 }
    2405 
    2406 layer {
    2407     bottom: "res4b6_branch2a"
    2408     top: "res4b6_branch2a"
    2409     name: "bn4b6_branch2a"
    2410     type: "BatchNorm"
    2411     batch_norm_param {
    2412         use_global_stats: true
    2413     }
    2414 }
    2415 
    2416 layer {
    2417     bottom: "res4b6_branch2a"
    2418     top: "res4b6_branch2a"
    2419     name: "scale4b6_branch2a"
    2420     type: "Scale"
    2421     scale_param {
    2422         bias_term: true
    2423     }
    2424 }
    2425 
    2426 layer {
    2427     top: "res4b6_branch2a"
    2428     bottom: "res4b6_branch2a"
    2429     name: "res4b6_branch2a_relu"
    2430     type: "ReLU"
    2431 }
    2432 
    2433 layer {
    2434     bottom: "res4b6_branch2a"
    2435     top: "res4b6_branch2b"
    2436     name: "res4b6_branch2b"
    2437     type: "Convolution"
    2438     convolution_param {
    2439         num_output: 256
    2440         kernel_size: 3
    2441         pad: 1
    2442         stride: 1
    2443         bias_term: false
    2444     }
    2445 }
    2446 
    2447 layer {
    2448     bottom: "res4b6_branch2b"
    2449     top: "res4b6_branch2b"
    2450     name: "bn4b6_branch2b"
    2451     type: "BatchNorm"
    2452     batch_norm_param {
    2453         use_global_stats: true
    2454     }
    2455 }
    2456 
    2457 layer {
    2458     bottom: "res4b6_branch2b"
    2459     top: "res4b6_branch2b"
    2460     name: "scale4b6_branch2b"
    2461     type: "Scale"
    2462     scale_param {
    2463         bias_term: true
    2464     }
    2465 }
    2466 
    2467 layer {
    2468     top: "res4b6_branch2b"
    2469     bottom: "res4b6_branch2b"
    2470     name: "res4b6_branch2b_relu"
    2471     type: "ReLU"
    2472 }
    2473 
    2474 layer {
    2475     bottom: "res4b6_branch2b"
    2476     top: "res4b6_branch2c"
    2477     name: "res4b6_branch2c"
    2478     type: "Convolution"
    2479     convolution_param {
    2480         num_output: 1024
    2481         kernel_size: 1
    2482         pad: 0
    2483         stride: 1
    2484         bias_term: false
    2485     }
    2486 }
    2487 
    2488 layer {
    2489     bottom: "res4b6_branch2c"
    2490     top: "res4b6_branch2c"
    2491     name: "bn4b6_branch2c"
    2492     type: "BatchNorm"
    2493     batch_norm_param {
    2494         use_global_stats: true
    2495     }
    2496 }
    2497 
    2498 layer {
    2499     bottom: "res4b6_branch2c"
    2500     top: "res4b6_branch2c"
    2501     name: "scale4b6_branch2c"
    2502     type: "Scale"
    2503     scale_param {
    2504         bias_term: true
    2505     }
    2506 }
    2507 
    2508 layer {
    2509     bottom: "res4b5"
    2510     bottom: "res4b6_branch2c"
    2511     top: "res4b6"
    2512     name: "res4b6"
    2513     type: "Eltwise"
    2514 }
    2515 
    2516 layer {
    2517     bottom: "res4b6"
    2518     top: "res4b6"
    2519     name: "res4b6_relu"
    2520     type: "ReLU"
    2521 }
    2522 
    2523 layer {
    2524     bottom: "res4b6"
    2525     top: "res4b7_branch2a"
    2526     name: "res4b7_branch2a"
    2527     type: "Convolution"
    2528     convolution_param {
    2529         num_output: 256
    2530         kernel_size: 1
    2531         pad: 0
    2532         stride: 1
    2533         bias_term: false
    2534     }
    2535 }
    2536 
    2537 layer {
    2538     bottom: "res4b7_branch2a"
    2539     top: "res4b7_branch2a"
    2540     name: "bn4b7_branch2a"
    2541     type: "BatchNorm"
    2542     batch_norm_param {
    2543         use_global_stats: true
    2544     }
    2545 }
    2546 
    2547 layer {
    2548     bottom: "res4b7_branch2a"
    2549     top: "res4b7_branch2a"
    2550     name: "scale4b7_branch2a"
    2551     type: "Scale"
    2552     scale_param {
    2553         bias_term: true
    2554     }
    2555 }
    2556 
    2557 layer {
    2558     top: "res4b7_branch2a"
    2559     bottom: "res4b7_branch2a"
    2560     name: "res4b7_branch2a_relu"
    2561     type: "ReLU"
    2562 }
    2563 
    2564 layer {
    2565     bottom: "res4b7_branch2a"
    2566     top: "res4b7_branch2b"
    2567     name: "res4b7_branch2b"
    2568     type: "Convolution"
    2569     convolution_param {
    2570         num_output: 256
    2571         kernel_size: 3
    2572         pad: 1
    2573         stride: 1
    2574         bias_term: false
    2575     }
    2576 }
    2577 
    2578 layer {
    2579     bottom: "res4b7_branch2b"
    2580     top: "res4b7_branch2b"
    2581     name: "bn4b7_branch2b"
    2582     type: "BatchNorm"
    2583     batch_norm_param {
    2584         use_global_stats: true
    2585     }
    2586 }
    2587 
    2588 layer {
    2589     bottom: "res4b7_branch2b"
    2590     top: "res4b7_branch2b"
    2591     name: "scale4b7_branch2b"
    2592     type: "Scale"
    2593     scale_param {
    2594         bias_term: true
    2595     }
    2596 }
    2597 
    2598 layer {
    2599     top: "res4b7_branch2b"
    2600     bottom: "res4b7_branch2b"
    2601     name: "res4b7_branch2b_relu"
    2602     type: "ReLU"
    2603 }
    2604 
    2605 layer {
    2606     bottom: "res4b7_branch2b"
    2607     top: "res4b7_branch2c"
    2608     name: "res4b7_branch2c"
    2609     type: "Convolution"
    2610     convolution_param {
    2611         num_output: 1024
    2612         kernel_size: 1
    2613         pad: 0
    2614         stride: 1
    2615         bias_term: false
    2616     }
    2617 }
    2618 
    2619 layer {
    2620     bottom: "res4b7_branch2c"
    2621     top: "res4b7_branch2c"
    2622     name: "bn4b7_branch2c"
    2623     type: "BatchNorm"
    2624     batch_norm_param {
    2625         use_global_stats: true
    2626     }
    2627 }
    2628 
    2629 layer {
    2630     bottom: "res4b7_branch2c"
    2631     top: "res4b7_branch2c"
    2632     name: "scale4b7_branch2c"
    2633     type: "Scale"
    2634     scale_param {
    2635         bias_term: true
    2636     }
    2637 }
    2638 
    2639 layer {
    2640     bottom: "res4b6"
    2641     bottom: "res4b7_branch2c"
    2642     top: "res4b7"
    2643     name: "res4b7"
    2644     type: "Eltwise"
    2645 }
    2646 
    2647 layer {
    2648     bottom: "res4b7"
    2649     top: "res4b7"
    2650     name: "res4b7_relu"
    2651     type: "ReLU"
    2652 }
    2653 
    2654 layer {
    2655     bottom: "res4b7"
    2656     top: "res4b8_branch2a"
    2657     name: "res4b8_branch2a"
    2658     type: "Convolution"
    2659     convolution_param {
    2660         num_output: 256
    2661         kernel_size: 1
    2662         pad: 0
    2663         stride: 1
    2664         bias_term: false
    2665     }
    2666 }
    2667 
    2668 layer {
    2669     bottom: "res4b8_branch2a"
    2670     top: "res4b8_branch2a"
    2671     name: "bn4b8_branch2a"
    2672     type: "BatchNorm"
    2673     batch_norm_param {
    2674         use_global_stats: true
    2675     }
    2676 }
    2677 
    2678 layer {
    2679     bottom: "res4b8_branch2a"
    2680     top: "res4b8_branch2a"
    2681     name: "scale4b8_branch2a"
    2682     type: "Scale"
    2683     scale_param {
    2684         bias_term: true
    2685     }
    2686 }
    2687 
    2688 layer {
    2689     top: "res4b8_branch2a"
    2690     bottom: "res4b8_branch2a"
    2691     name: "res4b8_branch2a_relu"
    2692     type: "ReLU"
    2693 }
    2694 
    2695 layer {
    2696     bottom: "res4b8_branch2a"
    2697     top: "res4b8_branch2b"
    2698     name: "res4b8_branch2b"
    2699     type: "Convolution"
    2700     convolution_param {
    2701         num_output: 256
    2702         kernel_size: 3
    2703         pad: 1
    2704         stride: 1
    2705         bias_term: false
    2706     }
    2707 }
    2708 
    2709 layer {
    2710     bottom: "res4b8_branch2b"
    2711     top: "res4b8_branch2b"
    2712     name: "bn4b8_branch2b"
    2713     type: "BatchNorm"
    2714     batch_norm_param {
    2715         use_global_stats: true
    2716     }
    2717 }
    2718 
    2719 layer {
    2720     bottom: "res4b8_branch2b"
    2721     top: "res4b8_branch2b"
    2722     name: "scale4b8_branch2b"
    2723     type: "Scale"
    2724     scale_param {
    2725         bias_term: true
    2726     }
    2727 }
    2728 
    2729 layer {
    2730     top: "res4b8_branch2b"
    2731     bottom: "res4b8_branch2b"
    2732     name: "res4b8_branch2b_relu"
    2733     type: "ReLU"
    2734 }
    2735 
    2736 layer {
    2737     bottom: "res4b8_branch2b"
    2738     top: "res4b8_branch2c"
    2739     name: "res4b8_branch2c"
    2740     type: "Convolution"
    2741     convolution_param {
    2742         num_output: 1024
    2743         kernel_size: 1
    2744         pad: 0
    2745         stride: 1
    2746         bias_term: false
    2747     }
    2748 }
    2749 
    2750 layer {
    2751     bottom: "res4b8_branch2c"
    2752     top: "res4b8_branch2c"
    2753     name: "bn4b8_branch2c"
    2754     type: "BatchNorm"
    2755     batch_norm_param {
    2756         use_global_stats: true
    2757     }
    2758 }
    2759 
    2760 layer {
    2761     bottom: "res4b8_branch2c"
    2762     top: "res4b8_branch2c"
    2763     name: "scale4b8_branch2c"
    2764     type: "Scale"
    2765     scale_param {
    2766         bias_term: true
    2767     }
    2768 }
    2769 
    2770 layer {
    2771     bottom: "res4b7"
    2772     bottom: "res4b8_branch2c"
    2773     top: "res4b8"
    2774     name: "res4b8"
    2775     type: "Eltwise"
    2776 }
    2777 
    2778 layer {
    2779     bottom: "res4b8"
    2780     top: "res4b8"
    2781     name: "res4b8_relu"
    2782     type: "ReLU"
    2783 }
    2784 
    2785 layer {
    2786     bottom: "res4b8"
    2787     top: "res4b9_branch2a"
    2788     name: "res4b9_branch2a"
    2789     type: "Convolution"
    2790     convolution_param {
    2791         num_output: 256
    2792         kernel_size: 1
    2793         pad: 0
    2794         stride: 1
    2795         bias_term: false
    2796     }
    2797 }
    2798 
    2799 layer {
    2800     bottom: "res4b9_branch2a"
    2801     top: "res4b9_branch2a"
    2802     name: "bn4b9_branch2a"
    2803     type: "BatchNorm"
    2804     batch_norm_param {
    2805         use_global_stats: true
    2806     }
    2807 }
    2808 
    2809 layer {
    2810     bottom: "res4b9_branch2a"
    2811     top: "res4b9_branch2a"
    2812     name: "scale4b9_branch2a"
    2813     type: "Scale"
    2814     scale_param {
    2815         bias_term: true
    2816     }
    2817 }
    2818 
    2819 layer {
    2820     top: "res4b9_branch2a"
    2821     bottom: "res4b9_branch2a"
    2822     name: "res4b9_branch2a_relu"
    2823     type: "ReLU"
    2824 }
    2825 
    2826 layer {
    2827     bottom: "res4b9_branch2a"
    2828     top: "res4b9_branch2b"
    2829     name: "res4b9_branch2b"
    2830     type: "Convolution"
    2831     convolution_param {
    2832         num_output: 256
    2833         kernel_size: 3
    2834         pad: 1
    2835         stride: 1
    2836         bias_term: false
    2837     }
    2838 }
    2839 
    2840 layer {
    2841     bottom: "res4b9_branch2b"
    2842     top: "res4b9_branch2b"
    2843     name: "bn4b9_branch2b"
    2844     type: "BatchNorm"
    2845     batch_norm_param {
    2846         use_global_stats: true
    2847     }
    2848 }
    2849 
    2850 layer {
    2851     bottom: "res4b9_branch2b"
    2852     top: "res4b9_branch2b"
    2853     name: "scale4b9_branch2b"
    2854     type: "Scale"
    2855     scale_param {
    2856         bias_term: true
    2857     }
    2858 }
    2859 
    2860 layer {
    2861     top: "res4b9_branch2b"
    2862     bottom: "res4b9_branch2b"
    2863     name: "res4b9_branch2b_relu"
    2864     type: "ReLU"
    2865 }
    2866 
    2867 layer {
    2868     bottom: "res4b9_branch2b"
    2869     top: "res4b9_branch2c"
    2870     name: "res4b9_branch2c"
    2871     type: "Convolution"
    2872     convolution_param {
    2873         num_output: 1024
    2874         kernel_size: 1
    2875         pad: 0
    2876         stride: 1
    2877         bias_term: false
    2878     }
    2879 }
    2880 
    2881 layer {
    2882     bottom: "res4b9_branch2c"
    2883     top: "res4b9_branch2c"
    2884     name: "bn4b9_branch2c"
    2885     type: "BatchNorm"
    2886     batch_norm_param {
    2887         use_global_stats: true
    2888     }
    2889 }
    2890 
    2891 layer {
    2892     bottom: "res4b9_branch2c"
    2893     top: "res4b9_branch2c"
    2894     name: "scale4b9_branch2c"
    2895     type: "Scale"
    2896     scale_param {
    2897         bias_term: true
    2898     }
    2899 }
    2900 
    2901 layer {
    2902     bottom: "res4b8"
    2903     bottom: "res4b9_branch2c"
    2904     top: "res4b9"
    2905     name: "res4b9"
    2906     type: "Eltwise"
    2907 }
    2908 
    2909 layer {
    2910     bottom: "res4b9"
    2911     top: "res4b9"
    2912     name: "res4b9_relu"
    2913     type: "ReLU"
    2914 }
    2915 
    2916 layer {
    2917     bottom: "res4b9"
    2918     top: "res4b10_branch2a"
    2919     name: "res4b10_branch2a"
    2920     type: "Convolution"
    2921     convolution_param {
    2922         num_output: 256
    2923         kernel_size: 1
    2924         pad: 0
    2925         stride: 1
    2926         bias_term: false
    2927     }
    2928 }
    2929 
    2930 layer {
    2931     bottom: "res4b10_branch2a"
    2932     top: "res4b10_branch2a"
    2933     name: "bn4b10_branch2a"
    2934     type: "BatchNorm"
    2935     batch_norm_param {
    2936         use_global_stats: true
    2937     }
    2938 }
    2939 
    2940 layer {
    2941     bottom: "res4b10_branch2a"
    2942     top: "res4b10_branch2a"
    2943     name: "scale4b10_branch2a"
    2944     type: "Scale"
    2945     scale_param {
    2946         bias_term: true
    2947     }
    2948 }
    2949 
    2950 layer {
    2951     top: "res4b10_branch2a"
    2952     bottom: "res4b10_branch2a"
    2953     name: "res4b10_branch2a_relu"
    2954     type: "ReLU"
    2955 }
    2956 
    2957 layer {
    2958     bottom: "res4b10_branch2a"
    2959     top: "res4b10_branch2b"
    2960     name: "res4b10_branch2b"
    2961     type: "Convolution"
    2962     convolution_param {
    2963         num_output: 256
    2964         kernel_size: 3
    2965         pad: 1
    2966         stride: 1
    2967         bias_term: false
    2968     }
    2969 }
    2970 
    2971 layer {
    2972     bottom: "res4b10_branch2b"
    2973     top: "res4b10_branch2b"
    2974     name: "bn4b10_branch2b"
    2975     type: "BatchNorm"
    2976     batch_norm_param {
    2977         use_global_stats: true
    2978     }
    2979 }
    2980 
    2981 layer {
    2982     bottom: "res4b10_branch2b"
    2983     top: "res4b10_branch2b"
    2984     name: "scale4b10_branch2b"
    2985     type: "Scale"
    2986     scale_param {
    2987         bias_term: true
    2988     }
    2989 }
    2990 
    2991 layer {
    2992     top: "res4b10_branch2b"
    2993     bottom: "res4b10_branch2b"
    2994     name: "res4b10_branch2b_relu"
    2995     type: "ReLU"
    2996 }
    2997 
    2998 layer {
    2999     bottom: "res4b10_branch2b"
    3000     top: "res4b10_branch2c"
    3001     name: "res4b10_branch2c"
    3002     type: "Convolution"
    3003     convolution_param {
    3004         num_output: 1024
    3005         kernel_size: 1
    3006         pad: 0
    3007         stride: 1
    3008         bias_term: false
    3009     }
    3010 }
    3011 
    3012 layer {
    3013     bottom: "res4b10_branch2c"
    3014     top: "res4b10_branch2c"
    3015     name: "bn4b10_branch2c"
    3016     type: "BatchNorm"
    3017     batch_norm_param {
    3018         use_global_stats: true
    3019     }
    3020 }
    3021 
    3022 layer {
    3023     bottom: "res4b10_branch2c"
    3024     top: "res4b10_branch2c"
    3025     name: "scale4b10_branch2c"
    3026     type: "Scale"
    3027     scale_param {
    3028         bias_term: true
    3029     }
    3030 }
    3031 
    3032 layer {
    3033     bottom: "res4b9"
    3034     bottom: "res4b10_branch2c"
    3035     top: "res4b10"
    3036     name: "res4b10"
    3037     type: "Eltwise"
    3038 }
    3039 
    3040 layer {
    3041     bottom: "res4b10"
    3042     top: "res4b10"
    3043     name: "res4b10_relu"
    3044     type: "ReLU"
    3045 }
    3046 
    3047 layer {
    3048     bottom: "res4b10"
    3049     top: "res4b11_branch2a"
    3050     name: "res4b11_branch2a"
    3051     type: "Convolution"
    3052     convolution_param {
    3053         num_output: 256
    3054         kernel_size: 1
    3055         pad: 0
    3056         stride: 1
    3057         bias_term: false
    3058     }
    3059 }
    3060 
    3061 layer {
    3062     bottom: "res4b11_branch2a"
    3063     top: "res4b11_branch2a"
    3064     name: "bn4b11_branch2a"
    3065     type: "BatchNorm"
    3066     batch_norm_param {
    3067         use_global_stats: true
    3068     }
    3069 }
    3070 
    3071 layer {
    3072     bottom: "res4b11_branch2a"
    3073     top: "res4b11_branch2a"
    3074     name: "scale4b11_branch2a"
    3075     type: "Scale"
    3076     scale_param {
    3077         bias_term: true
    3078     }
    3079 }
    3080 
    3081 layer {
    3082     top: "res4b11_branch2a"
    3083     bottom: "res4b11_branch2a"
    3084     name: "res4b11_branch2a_relu"
    3085     type: "ReLU"
    3086 }
    3087 
    3088 layer {
    3089     bottom: "res4b11_branch2a"
    3090     top: "res4b11_branch2b"
    3091     name: "res4b11_branch2b"
    3092     type: "Convolution"
    3093     convolution_param {
    3094         num_output: 256
    3095         kernel_size: 3
    3096         pad: 1
    3097         stride: 1
    3098         bias_term: false
    3099     }
    3100 }
    3101 
    3102 layer {
    3103     bottom: "res4b11_branch2b"
    3104     top: "res4b11_branch2b"
    3105     name: "bn4b11_branch2b"
    3106     type: "BatchNorm"
    3107     batch_norm_param {
    3108         use_global_stats: true
    3109     }
    3110 }
    3111 
    3112 layer {
    3113     bottom: "res4b11_branch2b"
    3114     top: "res4b11_branch2b"
    3115     name: "scale4b11_branch2b"
    3116     type: "Scale"
    3117     scale_param {
    3118         bias_term: true
    3119     }
    3120 }
    3121 
    3122 layer {
    3123     top: "res4b11_branch2b"
    3124     bottom: "res4b11_branch2b"
    3125     name: "res4b11_branch2b_relu"
    3126     type: "ReLU"
    3127 }
    3128 
    3129 layer {
    3130     bottom: "res4b11_branch2b"
    3131     top: "res4b11_branch2c"
    3132     name: "res4b11_branch2c"
    3133     type: "Convolution"
    3134     convolution_param {
    3135         num_output: 1024
    3136         kernel_size: 1
    3137         pad: 0
    3138         stride: 1
    3139         bias_term: false
    3140     }
    3141 }
    3142 
    3143 layer {
    3144     bottom: "res4b11_branch2c"
    3145     top: "res4b11_branch2c"
    3146     name: "bn4b11_branch2c"
    3147     type: "BatchNorm"
    3148     batch_norm_param {
    3149         use_global_stats: true
    3150     }
    3151 }
    3152 
    3153 layer {
    3154     bottom: "res4b11_branch2c"
    3155     top: "res4b11_branch2c"
    3156     name: "scale4b11_branch2c"
    3157     type: "Scale"
    3158     scale_param {
    3159         bias_term: true
    3160     }
    3161 }
    3162 
    3163 layer {
    3164     bottom: "res4b10"
    3165     bottom: "res4b11_branch2c"
    3166     top: "res4b11"
    3167     name: "res4b11"
    3168     type: "Eltwise"
    3169 }
    3170 
    3171 layer {
    3172     bottom: "res4b11"
    3173     top: "res4b11"
    3174     name: "res4b11_relu"
    3175     type: "ReLU"
    3176 }
    3177 
    3178 layer {
    3179     bottom: "res4b11"
    3180     top: "res4b12_branch2a"
    3181     name: "res4b12_branch2a"
    3182     type: "Convolution"
    3183     convolution_param {
    3184         num_output: 256
    3185         kernel_size: 1
    3186         pad: 0
    3187         stride: 1
    3188         bias_term: false
    3189     }
    3190 }
    3191 
    3192 layer {
    3193     bottom: "res4b12_branch2a"
    3194     top: "res4b12_branch2a"
    3195     name: "bn4b12_branch2a"
    3196     type: "BatchNorm"
    3197     batch_norm_param {
    3198         use_global_stats: true
    3199     }
    3200 }
    3201 
    3202 layer {
    3203     bottom: "res4b12_branch2a"
    3204     top: "res4b12_branch2a"
    3205     name: "scale4b12_branch2a"
    3206     type: "Scale"
    3207     scale_param {
    3208         bias_term: true
    3209     }
    3210 }
    3211 
    3212 layer {
    3213     top: "res4b12_branch2a"
    3214     bottom: "res4b12_branch2a"
    3215     name: "res4b12_branch2a_relu"
    3216     type: "ReLU"
    3217 }
    3218 
    3219 layer {
    3220     bottom: "res4b12_branch2a"
    3221     top: "res4b12_branch2b"
    3222     name: "res4b12_branch2b"
    3223     type: "Convolution"
    3224     convolution_param {
    3225         num_output: 256
    3226         kernel_size: 3
    3227         pad: 1
    3228         stride: 1
    3229         bias_term: false
    3230     }
    3231 }
    3232 
    3233 layer {
    3234     bottom: "res4b12_branch2b"
    3235     top: "res4b12_branch2b"
    3236     name: "bn4b12_branch2b"
    3237     type: "BatchNorm"
    3238     batch_norm_param {
    3239         use_global_stats: true
    3240     }
    3241 }
    3242 
    3243 layer {
    3244     bottom: "res4b12_branch2b"
    3245     top: "res4b12_branch2b"
    3246     name: "scale4b12_branch2b"
    3247     type: "Scale"
    3248     scale_param {
    3249         bias_term: true
    3250     }
    3251 }
    3252 
    3253 layer {
    3254     top: "res4b12_branch2b"
    3255     bottom: "res4b12_branch2b"
    3256     name: "res4b12_branch2b_relu"
    3257     type: "ReLU"
    3258 }
    3259 
    3260 layer {
    3261     bottom: "res4b12_branch2b"
    3262     top: "res4b12_branch2c"
    3263     name: "res4b12_branch2c"
    3264     type: "Convolution"
    3265     convolution_param {
    3266         num_output: 1024
    3267         kernel_size: 1
    3268         pad: 0
    3269         stride: 1
    3270         bias_term: false
    3271     }
    3272 }
    3273 
    3274 layer {
    3275     bottom: "res4b12_branch2c"
    3276     top: "res4b12_branch2c"
    3277     name: "bn4b12_branch2c"
    3278     type: "BatchNorm"
    3279     batch_norm_param {
    3280         use_global_stats: true
    3281     }
    3282 }
    3283 
    3284 layer {
    3285     bottom: "res4b12_branch2c"
    3286     top: "res4b12_branch2c"
    3287     name: "scale4b12_branch2c"
    3288     type: "Scale"
    3289     scale_param {
    3290         bias_term: true
    3291     }
    3292 }
    3293 
    3294 layer {
    3295     bottom: "res4b11"
    3296     bottom: "res4b12_branch2c"
    3297     top: "res4b12"
    3298     name: "res4b12"
    3299     type: "Eltwise"
    3300 }
    3301 
    3302 layer {
    3303     bottom: "res4b12"
    3304     top: "res4b12"
    3305     name: "res4b12_relu"
    3306     type: "ReLU"
    3307 }
    3308 
    3309 layer {
    3310     bottom: "res4b12"
    3311     top: "res4b13_branch2a"
    3312     name: "res4b13_branch2a"
    3313     type: "Convolution"
    3314     convolution_param {
    3315         num_output: 256
    3316         kernel_size: 1
    3317         pad: 0
    3318         stride: 1
    3319         bias_term: false
    3320     }
    3321 }
    3322 
    3323 layer {
    3324     bottom: "res4b13_branch2a"
    3325     top: "res4b13_branch2a"
    3326     name: "bn4b13_branch2a"
    3327     type: "BatchNorm"
    3328     batch_norm_param {
    3329         use_global_stats: true
    3330     }
    3331 }
    3332 
    3333 layer {
    3334     bottom: "res4b13_branch2a"
    3335     top: "res4b13_branch2a"
    3336     name: "scale4b13_branch2a"
    3337     type: "Scale"
    3338     scale_param {
    3339         bias_term: true
    3340     }
    3341 }
    3342 
    3343 layer {
    3344     top: "res4b13_branch2a"
    3345     bottom: "res4b13_branch2a"
    3346     name: "res4b13_branch2a_relu"
    3347     type: "ReLU"
    3348 }
    3349 
    3350 layer {
    3351     bottom: "res4b13_branch2a"
    3352     top: "res4b13_branch2b"
    3353     name: "res4b13_branch2b"
    3354     type: "Convolution"
    3355     convolution_param {
    3356         num_output: 256
    3357         kernel_size: 3
    3358         pad: 1
    3359         stride: 1
    3360         bias_term: false
    3361     }
    3362 }
    3363 
    3364 layer {
    3365     bottom: "res4b13_branch2b"
    3366     top: "res4b13_branch2b"
    3367     name: "bn4b13_branch2b"
    3368     type: "BatchNorm"
    3369     batch_norm_param {
    3370         use_global_stats: true
    3371     }
    3372 }
    3373 
    3374 layer {
    3375     bottom: "res4b13_branch2b"
    3376     top: "res4b13_branch2b"
    3377     name: "scale4b13_branch2b"
    3378     type: "Scale"
    3379     scale_param {
    3380         bias_term: true
    3381     }
    3382 }
    3383 
    3384 layer {
    3385     top: "res4b13_branch2b"
    3386     bottom: "res4b13_branch2b"
    3387     name: "res4b13_branch2b_relu"
    3388     type: "ReLU"
    3389 }
    3390 
    3391 layer {
    3392     bottom: "res4b13_branch2b"
    3393     top: "res4b13_branch2c"
    3394     name: "res4b13_branch2c"
    3395     type: "Convolution"
    3396     convolution_param {
    3397         num_output: 1024
    3398         kernel_size: 1
    3399         pad: 0
    3400         stride: 1
    3401         bias_term: false
    3402     }
    3403 }
    3404 
    3405 layer {
    3406     bottom: "res4b13_branch2c"
    3407     top: "res4b13_branch2c"
    3408     name: "bn4b13_branch2c"
    3409     type: "BatchNorm"
    3410     batch_norm_param {
    3411         use_global_stats: true
    3412     }
    3413 }
    3414 
    3415 layer {
    3416     bottom: "res4b13_branch2c"
    3417     top: "res4b13_branch2c"
    3418     name: "scale4b13_branch2c"
    3419     type: "Scale"
    3420     scale_param {
    3421         bias_term: true
    3422     }
    3423 }
    3424 
    3425 layer {
    3426     bottom: "res4b12"
    3427     bottom: "res4b13_branch2c"
    3428     top: "res4b13"
    3429     name: "res4b13"
    3430     type: "Eltwise"
    3431 }
    3432 
    3433 layer {
    3434     bottom: "res4b13"
    3435     top: "res4b13"
    3436     name: "res4b13_relu"
    3437     type: "ReLU"
    3438 }
    3439 
    3440 layer {
    3441     bottom: "res4b13"
    3442     top: "res4b14_branch2a"
    3443     name: "res4b14_branch2a"
    3444     type: "Convolution"
    3445     convolution_param {
    3446         num_output: 256
    3447         kernel_size: 1
    3448         pad: 0
    3449         stride: 1
    3450         bias_term: false
    3451     }
    3452 }
    3453 
    3454 layer {
    3455     bottom: "res4b14_branch2a"
    3456     top: "res4b14_branch2a"
    3457     name: "bn4b14_branch2a"
    3458     type: "BatchNorm"
    3459     batch_norm_param {
    3460         use_global_stats: true
    3461     }
    3462 }
    3463 
    3464 layer {
    3465     bottom: "res4b14_branch2a"
    3466     top: "res4b14_branch2a"
    3467     name: "scale4b14_branch2a"
    3468     type: "Scale"
    3469     scale_param {
    3470         bias_term: true
    3471     }
    3472 }
    3473 
    3474 layer {
    3475     top: "res4b14_branch2a"
    3476     bottom: "res4b14_branch2a"
    3477     name: "res4b14_branch2a_relu"
    3478     type: "ReLU"
    3479 }
    3480 
    3481 layer {
    3482     bottom: "res4b14_branch2a"
    3483     top: "res4b14_branch2b"
    3484     name: "res4b14_branch2b"
    3485     type: "Convolution"
    3486     convolution_param {
    3487         num_output: 256
    3488         kernel_size: 3
    3489         pad: 1
    3490         stride: 1
    3491         bias_term: false
    3492     }
    3493 }
    3494 
    3495 layer {
    3496     bottom: "res4b14_branch2b"
    3497     top: "res4b14_branch2b"
    3498     name: "bn4b14_branch2b"
    3499     type: "BatchNorm"
    3500     batch_norm_param {
    3501         use_global_stats: true
    3502     }
    3503 }
    3504 
    3505 layer {
    3506     bottom: "res4b14_branch2b"
    3507     top: "res4b14_branch2b"
    3508     name: "scale4b14_branch2b"
    3509     type: "Scale"
    3510     scale_param {
    3511         bias_term: true
    3512     }
    3513 }
    3514 
    3515 layer {
    3516     top: "res4b14_branch2b"
    3517     bottom: "res4b14_branch2b"
    3518     name: "res4b14_branch2b_relu"
    3519     type: "ReLU"
    3520 }
    3521 
    3522 layer {
    3523     bottom: "res4b14_branch2b"
    3524     top: "res4b14_branch2c"
    3525     name: "res4b14_branch2c"
    3526     type: "Convolution"
    3527     convolution_param {
    3528         num_output: 1024
    3529         kernel_size: 1
    3530         pad: 0
    3531         stride: 1
    3532         bias_term: false
    3533     }
    3534 }
    3535 
    3536 layer {
    3537     bottom: "res4b14_branch2c"
    3538     top: "res4b14_branch2c"
    3539     name: "bn4b14_branch2c"
    3540     type: "BatchNorm"
    3541     batch_norm_param {
    3542         use_global_stats: true
    3543     }
    3544 }
    3545 
    3546 layer {
    3547     bottom: "res4b14_branch2c"
    3548     top: "res4b14_branch2c"
    3549     name: "scale4b14_branch2c"
    3550     type: "Scale"
    3551     scale_param {
    3552         bias_term: true
    3553     }
    3554 }
    3555 
    3556 layer {
    3557     bottom: "res4b13"
    3558     bottom: "res4b14_branch2c"
    3559     top: "res4b14"
    3560     name: "res4b14"
    3561     type: "Eltwise"
    3562 }
    3563 
    3564 layer {
    3565     bottom: "res4b14"
    3566     top: "res4b14"
    3567     name: "res4b14_relu"
    3568     type: "ReLU"
    3569 }
    3570 
    3571 layer {
    3572     bottom: "res4b14"
    3573     top: "res4b15_branch2a"
    3574     name: "res4b15_branch2a"
    3575     type: "Convolution"
    3576     convolution_param {
    3577         num_output: 256
    3578         kernel_size: 1
    3579         pad: 0
    3580         stride: 1
    3581         bias_term: false
    3582     }
    3583 }
    3584 
    3585 layer {
    3586     bottom: "res4b15_branch2a"
    3587     top: "res4b15_branch2a"
    3588     name: "bn4b15_branch2a"
    3589     type: "BatchNorm"
    3590     batch_norm_param {
    3591         use_global_stats: true
    3592     }
    3593 }
    3594 
    3595 layer {
    3596     bottom: "res4b15_branch2a"
    3597     top: "res4b15_branch2a"
    3598     name: "scale4b15_branch2a"
    3599     type: "Scale"
    3600     scale_param {
    3601         bias_term: true
    3602     }
    3603 }
    3604 
    3605 layer {
    3606     top: "res4b15_branch2a"
    3607     bottom: "res4b15_branch2a"
    3608     name: "res4b15_branch2a_relu"
    3609     type: "ReLU"
    3610 }
    3611 
    3612 layer {
    3613     bottom: "res4b15_branch2a"
    3614     top: "res4b15_branch2b"
    3615     name: "res4b15_branch2b"
    3616     type: "Convolution"
    3617     convolution_param {
    3618         num_output: 256
    3619         kernel_size: 3
    3620         pad: 1
    3621         stride: 1
    3622         bias_term: false
    3623     }
    3624 }
    3625 
    3626 layer {
    3627     bottom: "res4b15_branch2b"
    3628     top: "res4b15_branch2b"
    3629     name: "bn4b15_branch2b"
    3630     type: "BatchNorm"
    3631     batch_norm_param {
    3632         use_global_stats: true
    3633     }
    3634 }
    3635 
    3636 layer {
    3637     bottom: "res4b15_branch2b"
    3638     top: "res4b15_branch2b"
    3639     name: "scale4b15_branch2b"
    3640     type: "Scale"
    3641     scale_param {
    3642         bias_term: true
    3643     }
    3644 }
    3645 
    3646 layer {
    3647     top: "res4b15_branch2b"
    3648     bottom: "res4b15_branch2b"
    3649     name: "res4b15_branch2b_relu"
    3650     type: "ReLU"
    3651 }
    3652 
    3653 layer {
    3654     bottom: "res4b15_branch2b"
    3655     top: "res4b15_branch2c"
    3656     name: "res4b15_branch2c"
    3657     type: "Convolution"
    3658     convolution_param {
    3659         num_output: 1024
    3660         kernel_size: 1
    3661         pad: 0
    3662         stride: 1
    3663         bias_term: false
    3664     }
    3665 }
    3666 
    3667 layer {
    3668     bottom: "res4b15_branch2c"
    3669     top: "res4b15_branch2c"
    3670     name: "bn4b15_branch2c"
    3671     type: "BatchNorm"
    3672     batch_norm_param {
    3673         use_global_stats: true
    3674     }
    3675 }
    3676 
    3677 layer {
    3678     bottom: "res4b15_branch2c"
    3679     top: "res4b15_branch2c"
    3680     name: "scale4b15_branch2c"
    3681     type: "Scale"
    3682     scale_param {
    3683         bias_term: true
    3684     }
    3685 }
    3686 
    3687 layer {
    3688     bottom: "res4b14"
    3689     bottom: "res4b15_branch2c"
    3690     top: "res4b15"
    3691     name: "res4b15"
    3692     type: "Eltwise"
    3693 }
    3694 
    3695 layer {
    3696     bottom: "res4b15"
    3697     top: "res4b15"
    3698     name: "res4b15_relu"
    3699     type: "ReLU"
    3700 }
    3701 
    3702 layer {
    3703     bottom: "res4b15"
    3704     top: "res4b16_branch2a"
    3705     name: "res4b16_branch2a"
    3706     type: "Convolution"
    3707     convolution_param {
    3708         num_output: 256
    3709         kernel_size: 1
    3710         pad: 0
    3711         stride: 1
    3712         bias_term: false
    3713     }
    3714 }
    3715 
    3716 layer {
    3717     bottom: "res4b16_branch2a"
    3718     top: "res4b16_branch2a"
    3719     name: "bn4b16_branch2a"
    3720     type: "BatchNorm"
    3721     batch_norm_param {
    3722         use_global_stats: true
    3723     }
    3724 }
    3725 
    3726 layer {
    3727     bottom: "res4b16_branch2a"
    3728     top: "res4b16_branch2a"
    3729     name: "scale4b16_branch2a"
    3730     type: "Scale"
    3731     scale_param {
    3732         bias_term: true
    3733     }
    3734 }
    3735 
    3736 layer {
    3737     top: "res4b16_branch2a"
    3738     bottom: "res4b16_branch2a"
    3739     name: "res4b16_branch2a_relu"
    3740     type: "ReLU"
    3741 }
    3742 
    3743 layer {
    3744     bottom: "res4b16_branch2a"
    3745     top: "res4b16_branch2b"
    3746     name: "res4b16_branch2b"
    3747     type: "Convolution"
    3748     convolution_param {
    3749         num_output: 256
    3750         kernel_size: 3
    3751         pad: 1
    3752         stride: 1
    3753         bias_term: false
    3754     }
    3755 }
    3756 
    3757 layer {
    3758     bottom: "res4b16_branch2b"
    3759     top: "res4b16_branch2b"
    3760     name: "bn4b16_branch2b"
    3761     type: "BatchNorm"
    3762     batch_norm_param {
    3763         use_global_stats: true
    3764     }
    3765 }
    3766 
    3767 layer {
    3768     bottom: "res4b16_branch2b"
    3769     top: "res4b16_branch2b"
    3770     name: "scale4b16_branch2b"
    3771     type: "Scale"
    3772     scale_param {
    3773         bias_term: true
    3774     }
    3775 }
    3776 
    3777 layer {
    3778     top: "res4b16_branch2b"
    3779     bottom: "res4b16_branch2b"
    3780     name: "res4b16_branch2b_relu"
    3781     type: "ReLU"
    3782 }
    3783 
    3784 layer {
    3785     bottom: "res4b16_branch2b"
    3786     top: "res4b16_branch2c"
    3787     name: "res4b16_branch2c"
    3788     type: "Convolution"
    3789     convolution_param {
    3790         num_output: 1024
    3791         kernel_size: 1
    3792         pad: 0
    3793         stride: 1
    3794         bias_term: false
    3795     }
    3796 }
    3797 
    3798 layer {
    3799     bottom: "res4b16_branch2c"
    3800     top: "res4b16_branch2c"
    3801     name: "bn4b16_branch2c"
    3802     type: "BatchNorm"
    3803     batch_norm_param {
    3804         use_global_stats: true
    3805     }
    3806 }
    3807 
    3808 layer {
    3809     bottom: "res4b16_branch2c"
    3810     top: "res4b16_branch2c"
    3811     name: "scale4b16_branch2c"
    3812     type: "Scale"
    3813     scale_param {
    3814         bias_term: true
    3815     }
    3816 }
    3817 
    3818 layer {
    3819     bottom: "res4b15"
    3820     bottom: "res4b16_branch2c"
    3821     top: "res4b16"
    3822     name: "res4b16"
    3823     type: "Eltwise"
    3824 }
    3825 
    3826 layer {
    3827     bottom: "res4b16"
    3828     top: "res4b16"
    3829     name: "res4b16_relu"
    3830     type: "ReLU"
    3831 }
    3832 
    3833 layer {
    3834     bottom: "res4b16"
    3835     top: "res4b17_branch2a"
    3836     name: "res4b17_branch2a"
    3837     type: "Convolution"
    3838     convolution_param {
    3839         num_output: 256
    3840         kernel_size: 1
    3841         pad: 0
    3842         stride: 1
    3843         bias_term: false
    3844     }
    3845 }
    3846 
    3847 layer {
    3848     bottom: "res4b17_branch2a"
    3849     top: "res4b17_branch2a"
    3850     name: "bn4b17_branch2a"
    3851     type: "BatchNorm"
    3852     batch_norm_param {
    3853         use_global_stats: true
    3854     }
    3855 }
    3856 
    3857 layer {
    3858     bottom: "res4b17_branch2a"
    3859     top: "res4b17_branch2a"
    3860     name: "scale4b17_branch2a"
    3861     type: "Scale"
    3862     scale_param {
    3863         bias_term: true
    3864     }
    3865 }
    3866 
    3867 layer {
    3868     top: "res4b17_branch2a"
    3869     bottom: "res4b17_branch2a"
    3870     name: "res4b17_branch2a_relu"
    3871     type: "ReLU"
    3872 }
    3873 
    3874 layer {
    3875     bottom: "res4b17_branch2a"
    3876     top: "res4b17_branch2b"
    3877     name: "res4b17_branch2b"
    3878     type: "Convolution"
    3879     convolution_param {
    3880         num_output: 256
    3881         kernel_size: 3
    3882         pad: 1
    3883         stride: 1
    3884         bias_term: false
    3885     }
    3886 }
    3887 
    3888 layer {
    3889     bottom: "res4b17_branch2b"
    3890     top: "res4b17_branch2b"
    3891     name: "bn4b17_branch2b"
    3892     type: "BatchNorm"
    3893     batch_norm_param {
    3894         use_global_stats: true
    3895     }
    3896 }
    3897 
    3898 layer {
    3899     bottom: "res4b17_branch2b"
    3900     top: "res4b17_branch2b"
    3901     name: "scale4b17_branch2b"
    3902     type: "Scale"
    3903     scale_param {
    3904         bias_term: true
    3905     }
    3906 }
    3907 
    3908 layer {
    3909     top: "res4b17_branch2b"
    3910     bottom: "res4b17_branch2b"
    3911     name: "res4b17_branch2b_relu"
    3912     type: "ReLU"
    3913 }
    3914 
    3915 layer {
    3916     bottom: "res4b17_branch2b"
    3917     top: "res4b17_branch2c"
    3918     name: "res4b17_branch2c"
    3919     type: "Convolution"
    3920     convolution_param {
    3921         num_output: 1024
    3922         kernel_size: 1
    3923         pad: 0
    3924         stride: 1
    3925         bias_term: false
    3926     }
    3927 }
    3928 
    3929 layer {
    3930     bottom: "res4b17_branch2c"
    3931     top: "res4b17_branch2c"
    3932     name: "bn4b17_branch2c"
    3933     type: "BatchNorm"
    3934     batch_norm_param {
    3935         use_global_stats: true
    3936     }
    3937 }
    3938 
    3939 layer {
    3940     bottom: "res4b17_branch2c"
    3941     top: "res4b17_branch2c"
    3942     name: "scale4b17_branch2c"
    3943     type: "Scale"
    3944     scale_param {
    3945         bias_term: true
    3946     }
    3947 }
    3948 
    3949 layer {
    3950     bottom: "res4b16"
    3951     bottom: "res4b17_branch2c"
    3952     top: "res4b17"
    3953     name: "res4b17"
    3954     type: "Eltwise"
    3955 }
    3956 
    3957 layer {
    3958     bottom: "res4b17"
    3959     top: "res4b17"
    3960     name: "res4b17_relu"
    3961     type: "ReLU"
    3962 }
    3963 
    3964 layer {
    3965     bottom: "res4b17"
    3966     top: "res4b18_branch2a"
    3967     name: "res4b18_branch2a"
    3968     type: "Convolution"
    3969     convolution_param {
    3970         num_output: 256
    3971         kernel_size: 1
    3972         pad: 0
    3973         stride: 1
    3974         bias_term: false
    3975     }
    3976 }
    3977 
    3978 layer {
    3979     bottom: "res4b18_branch2a"
    3980     top: "res4b18_branch2a"
    3981     name: "bn4b18_branch2a"
    3982     type: "BatchNorm"
    3983     batch_norm_param {
    3984         use_global_stats: true
    3985     }
    3986 }
    3987 
    3988 layer {
    3989     bottom: "res4b18_branch2a"
    3990     top: "res4b18_branch2a"
    3991     name: "scale4b18_branch2a"
    3992     type: "Scale"
    3993     scale_param {
    3994         bias_term: true
    3995     }
    3996 }
    3997 
    3998 layer {
    3999     top: "res4b18_branch2a"
    4000     bottom: "res4b18_branch2a"
    4001     name: "res4b18_branch2a_relu"
    4002     type: "ReLU"
    4003 }
    4004 
    4005 layer {
    4006     bottom: "res4b18_branch2a"
    4007     top: "res4b18_branch2b"
    4008     name: "res4b18_branch2b"
    4009     type: "Convolution"
    4010     convolution_param {
    4011         num_output: 256
    4012         kernel_size: 3
    4013         pad: 1
    4014         stride: 1
    4015         bias_term: false
    4016     }
    4017 }
    4018 
    4019 layer {
    4020     bottom: "res4b18_branch2b"
    4021     top: "res4b18_branch2b"
    4022     name: "bn4b18_branch2b"
    4023     type: "BatchNorm"
    4024     batch_norm_param {
    4025         use_global_stats: true
    4026     }
    4027 }
    4028 
    4029 layer {
    4030     bottom: "res4b18_branch2b"
    4031     top: "res4b18_branch2b"
    4032     name: "scale4b18_branch2b"
    4033     type: "Scale"
    4034     scale_param {
    4035         bias_term: true
    4036     }
    4037 }
    4038 
    4039 layer {
    4040     top: "res4b18_branch2b"
    4041     bottom: "res4b18_branch2b"
    4042     name: "res4b18_branch2b_relu"
    4043     type: "ReLU"
    4044 }
    4045 
    4046 layer {
    4047     bottom: "res4b18_branch2b"
    4048     top: "res4b18_branch2c"
    4049     name: "res4b18_branch2c"
    4050     type: "Convolution"
    4051     convolution_param {
    4052         num_output: 1024
    4053         kernel_size: 1
    4054         pad: 0
    4055         stride: 1
    4056         bias_term: false
    4057     }
    4058 }
    4059 
    4060 layer {
    4061     bottom: "res4b18_branch2c"
    4062     top: "res4b18_branch2c"
    4063     name: "bn4b18_branch2c"
    4064     type: "BatchNorm"
    4065     batch_norm_param {
    4066         use_global_stats: true
    4067     }
    4068 }
    4069 
    4070 layer {
    4071     bottom: "res4b18_branch2c"
    4072     top: "res4b18_branch2c"
    4073     name: "scale4b18_branch2c"
    4074     type: "Scale"
    4075     scale_param {
    4076         bias_term: true
    4077     }
    4078 }
    4079 
    4080 layer {
    4081     bottom: "res4b17"
    4082     bottom: "res4b18_branch2c"
    4083     top: "res4b18"
    4084     name: "res4b18"
    4085     type: "Eltwise"
    4086 }
    4087 
    4088 layer {
    4089     bottom: "res4b18"
    4090     top: "res4b18"
    4091     name: "res4b18_relu"
    4092     type: "ReLU"
    4093 }
    4094 
    4095 layer {
    4096     bottom: "res4b18"
    4097     top: "res4b19_branch2a"
    4098     name: "res4b19_branch2a"
    4099     type: "Convolution"
    4100     convolution_param {
    4101         num_output: 256
    4102         kernel_size: 1
    4103         pad: 0
    4104         stride: 1
    4105         bias_term: false
    4106     }
    4107 }
    4108 
    4109 layer {
    4110     bottom: "res4b19_branch2a"
    4111     top: "res4b19_branch2a"
    4112     name: "bn4b19_branch2a"
    4113     type: "BatchNorm"
    4114     batch_norm_param {
    4115         use_global_stats: true
    4116     }
    4117 }
    4118 
    4119 layer {
    4120     bottom: "res4b19_branch2a"
    4121     top: "res4b19_branch2a"
    4122     name: "scale4b19_branch2a"
    4123     type: "Scale"
    4124     scale_param {
    4125         bias_term: true
    4126     }
    4127 }
    4128 
    4129 layer {
    4130     top: "res4b19_branch2a"
    4131     bottom: "res4b19_branch2a"
    4132     name: "res4b19_branch2a_relu"
    4133     type: "ReLU"
    4134 }
    4135 
    4136 layer {
    4137     bottom: "res4b19_branch2a"
    4138     top: "res4b19_branch2b"
    4139     name: "res4b19_branch2b"
    4140     type: "Convolution"
    4141     convolution_param {
    4142         num_output: 256
    4143         kernel_size: 3
    4144         pad: 1
    4145         stride: 1
    4146         bias_term: false
    4147     }
    4148 }
    4149 
    4150 layer {
    4151     bottom: "res4b19_branch2b"
    4152     top: "res4b19_branch2b"
    4153     name: "bn4b19_branch2b"
    4154     type: "BatchNorm"
    4155     batch_norm_param {
    4156         use_global_stats: true
    4157     }
    4158 }
    4159 
    4160 layer {
    4161     bottom: "res4b19_branch2b"
    4162     top: "res4b19_branch2b"
    4163     name: "scale4b19_branch2b"
    4164     type: "Scale"
    4165     scale_param {
    4166         bias_term: true
    4167     }
    4168 }
    4169 
    4170 layer {
    4171     top: "res4b19_branch2b"
    4172     bottom: "res4b19_branch2b"
    4173     name: "res4b19_branch2b_relu"
    4174     type: "ReLU"
    4175 }
    4176 
    4177 layer {
    4178     bottom: "res4b19_branch2b"
    4179     top: "res4b19_branch2c"
    4180     name: "res4b19_branch2c"
    4181     type: "Convolution"
    4182     convolution_param {
    4183         num_output: 1024
    4184         kernel_size: 1
    4185         pad: 0
    4186         stride: 1
    4187         bias_term: false
    4188     }
    4189 }
    4190 
    4191 layer {
    4192     bottom: "res4b19_branch2c"
    4193     top: "res4b19_branch2c"
    4194     name: "bn4b19_branch2c"
    4195     type: "BatchNorm"
    4196     batch_norm_param {
    4197         use_global_stats: true
    4198     }
    4199 }
    4200 
    4201 layer {
    4202     bottom: "res4b19_branch2c"
    4203     top: "res4b19_branch2c"
    4204     name: "scale4b19_branch2c"
    4205     type: "Scale"
    4206     scale_param {
    4207         bias_term: true
    4208     }
    4209 }
    4210 
    4211 layer {
    4212     bottom: "res4b18"
    4213     bottom: "res4b19_branch2c"
    4214     top: "res4b19"
    4215     name: "res4b19"
    4216     type: "Eltwise"
    4217 }
    4218 
    4219 layer {
    4220     bottom: "res4b19"
    4221     top: "res4b19"
    4222     name: "res4b19_relu"
    4223     type: "ReLU"
    4224 }
    4225 
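# res4b20 onwards: the same identity bottleneck pattern continues unchanged; only the
# layer-name indices differ.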
    4226 layer {
    4227     bottom: "res4b19"
    4228     top: "res4b20_branch2a"
    4229     name: "res4b20_branch2a"
    4230     type: "Convolution"
    4231     convolution_param {
    4232         num_output: 256
    4233         kernel_size: 1
    4234         pad: 0
    4235         stride: 1
    4236         bias_term: false
    4237     }
    4238 }
    4239 
    4240 layer {
    4241     bottom: "res4b20_branch2a"
    4242     top: "res4b20_branch2a"
    4243     name: "bn4b20_branch2a"
    4244     type: "BatchNorm"
    4245     batch_norm_param {
    4246         use_global_stats: true
    4247     }
    4248 }
    4249 
    4250 layer {
    4251     bottom: "res4b20_branch2a"
    4252     top: "res4b20_branch2a"
    4253     name: "scale4b20_branch2a"
    4254     type: "Scale"
    4255     scale_param {
    4256         bias_term: true
    4257     }
    4258 }
    4259 
    4260 layer {
    4261     top: "res4b20_branch2a"
    4262     bottom: "res4b20_branch2a"
    4263     name: "res4b20_branch2a_relu"
    4264     type: "ReLU"
    4265 }
    4266 
    4267 layer {
    4268     bottom: "res4b20_branch2a"
    4269     top: "res4b20_branch2b"
    4270     name: "res4b20_branch2b"
    4271     type: "Convolution"
    4272     convolution_param {
    4273         num_output: 256
    4274         kernel_size: 3
    4275         pad: 1
    4276         stride: 1
    4277         bias_term: false
    4278     }
    4279 }
    4280 
    4281 layer {
    4282     bottom: "res4b20_branch2b"
    4283     top: "res4b20_branch2b"
    4284     name: "bn4b20_branch2b"
    4285     type: "BatchNorm"
    4286     batch_norm_param {
    4287         use_global_stats: true
    4288     }
    4289 }
    4290 
    4291 layer {
    4292     bottom: "res4b20_branch2b"
    4293     top: "res4b20_branch2b"
    4294     name: "scale4b20_branch2b"
    4295     type: "Scale"
    4296     scale_param {
    4297         bias_term: true
    4298     }
    4299 }
    4300 
    4301 layer {
    4302     top: "res4b20_branch2b"
    4303     bottom: "res4b20_branch2b"
    4304     name: "res4b20_branch2b_relu"
    4305     type: "ReLU"
    4306 }
    4307 
    4308 layer {
    4309     bottom: "res4b20_branch2b"
    4310     top: "res4b20_branch2c"
    4311     name: "res4b20_branch2c"
    4312     type: "Convolution"
    4313     convolution_param {
    4314         num_output: 1024
    4315         kernel_size: 1
    4316         pad: 0
    4317         stride: 1
    4318         bias_term: false
    4319     }
    4320 }
    4321 
    4322 layer {
    4323     bottom: "res4b20_branch2c"
    4324     top: "res4b20_branch2c"
    4325     name: "bn4b20_branch2c"
    4326     type: "BatchNorm"
    4327     batch_norm_param {
    4328         use_global_stats: true
    4329     }
    4330 }
    4331 
    4332 layer {
    4333     bottom: "res4b20_branch2c"
    4334     top: "res4b20_branch2c"
    4335     name: "scale4b20_branch2c"
    4336     type: "Scale"
    4337     scale_param {
    4338         bias_term: true
    4339     }
    4340 }
    4341 
    4342 layer {
    4343     bottom: "res4b19"
    4344     bottom: "res4b20_branch2c"
    4345     top: "res4b20"
    4346     name: "res4b20"
    4347     type: "Eltwise"
    4348 }
    4349 
    4350 layer {
    4351     bottom: "res4b20"
    4352     top: "res4b20"
    4353     name: "res4b20_relu"
    4354     type: "ReLU"
    4355 }
    4356 
    4357 layer {
    4358     bottom: "res4b20"
    4359     top: "res4b21_branch2a"
    4360     name: "res4b21_branch2a"
    4361     type: "Convolution"
    4362     convolution_param {
    4363         num_output: 256
    4364         kernel_size: 1
    4365         pad: 0
    4366         stride: 1
    4367         bias_term: false
    4368     }
    4369 }
    4370 
    4371 layer {
    4372     bottom: "res4b21_branch2a"
    4373     top: "res4b21_branch2a"
    4374     name: "bn4b21_branch2a"
    4375     type: "BatchNorm"
    4376     batch_norm_param {
    4377         use_global_stats: true
    4378     }
    4379 }
    4380 
    4381 layer {
    4382     bottom: "res4b21_branch2a"
    4383     top: "res4b21_branch2a"
    4384     name: "scale4b21_branch2a"
    4385     type: "Scale"
    4386     scale_param {
    4387         bias_term: true
    4388     }
    4389 }
    4390 
    4391 layer {
    4392     top: "res4b21_branch2a"
    4393     bottom: "res4b21_branch2a"
    4394     name: "res4b21_branch2a_relu"
    4395     type: "ReLU"
    4396 }
    4397 
    4398 layer {
    4399     bottom: "res4b21_branch2a"
    4400     top: "res4b21_branch2b"
    4401     name: "res4b21_branch2b"
    4402     type: "Convolution"
    4403     convolution_param {
    4404         num_output: 256
    4405         kernel_size: 3
    4406         pad: 1
    4407         stride: 1
    4408         bias_term: false
    4409     }
    4410 }
    4411 
    4412 layer {
    4413     bottom: "res4b21_branch2b"
    4414     top: "res4b21_branch2b"
    4415     name: "bn4b21_branch2b"
    4416     type: "BatchNorm"
    4417     batch_norm_param {
    4418         use_global_stats: true
    4419     }
    4420 }
    4421 
    4422 layer {
    4423     bottom: "res4b21_branch2b"
    4424     top: "res4b21_branch2b"
    4425     name: "scale4b21_branch2b"
    4426     type: "Scale"
    4427     scale_param {
    4428         bias_term: true
    4429     }
    4430 }
    4431 
    4432 layer {
    4433     top: "res4b21_branch2b"
    4434     bottom: "res4b21_branch2b"
    4435     name: "res4b21_branch2b_relu"
    4436     type: "ReLU"
    4437 }
    4438 
    4439 layer {
    4440     bottom: "res4b21_branch2b"
    4441     top: "res4b21_branch2c"
    4442     name: "res4b21_branch2c"
    4443     type: "Convolution"
    4444     convolution_param {
    4445         num_output: 1024
    4446         kernel_size: 1
    4447         pad: 0
    4448         stride: 1
    4449         bias_term: false
    4450     }
    4451 }
    4452 
    4453 layer {
    4454     bottom: "res4b21_branch2c"
    4455     top: "res4b21_branch2c"
    4456     name: "bn4b21_branch2c"
    4457     type: "BatchNorm"
    4458     batch_norm_param {
    4459         use_global_stats: true
    4460     }
    4461 }
    4462 
    4463 layer {
    4464     bottom: "res4b21_branch2c"
    4465     top: "res4b21_branch2c"
    4466     name: "scale4b21_branch2c"
    4467     type: "Scale"
    4468     scale_param {
    4469         bias_term: true
    4470     }
    4471 }
    4472 
    4473 layer {
    4474     bottom: "res4b20"
    4475     bottom: "res4b21_branch2c"
    4476     top: "res4b21"
    4477     name: "res4b21"
    4478     type: "Eltwise"
    4479 }
    4480 
    4481 layer {
    4482     bottom: "res4b21"
    4483     top: "res4b21"
    4484     name: "res4b21_relu"
    4485     type: "ReLU"
    4486 }
    4487 
    4488 layer {
    4489     bottom: "res4b21"
    4490     top: "res4b22_branch2a"
    4491     name: "res4b22_branch2a"
    4492     type: "Convolution"
    4493     convolution_param {
    4494         num_output: 256
    4495         kernel_size: 1
    4496         pad: 0
    4497         stride: 1
    4498         bias_term: false
    4499     }
    4500 }
    4501 
    4502 layer {
    4503     bottom: "res4b22_branch2a"
    4504     top: "res4b22_branch2a"
    4505     name: "bn4b22_branch2a"
    4506     type: "BatchNorm"
    4507     batch_norm_param {
    4508         use_global_stats: true
    4509     }
    4510 }
    4511 
    4512 layer {
    4513     bottom: "res4b22_branch2a"
    4514     top: "res4b22_branch2a"
    4515     name: "scale4b22_branch2a"
    4516     type: "Scale"
    4517     scale_param {
    4518         bias_term: true
    4519     }
    4520 }
    4521 
    4522 layer {
    4523     top: "res4b22_branch2a"
    4524     bottom: "res4b22_branch2a"
    4525     name: "res4b22_branch2a_relu"
    4526     type: "ReLU"
    4527 }
    4528 
    4529 layer {
    4530     bottom: "res4b22_branch2a"
    4531     top: "res4b22_branch2b"
    4532     name: "res4b22_branch2b"
    4533     type: "Convolution"
    4534     convolution_param {
    4535         num_output: 256
    4536         kernel_size: 3
    4537         pad: 1
    4538         stride: 1
    4539         bias_term: false
    4540     }
    4541 }
    4542 
    4543 layer {
    4544     bottom: "res4b22_branch2b"
    4545     top: "res4b22_branch2b"
    4546     name: "bn4b22_branch2b"
    4547     type: "BatchNorm"
    4548     batch_norm_param {
    4549         use_global_stats: true
    4550     }
    4551 }
    4552 
    4553 layer {
    4554     bottom: "res4b22_branch2b"
    4555     top: "res4b22_branch2b"
    4556     name: "scale4b22_branch2b"
    4557     type: "Scale"
    4558     scale_param {
    4559         bias_term: true
    4560     }
    4561 }
    4562 
    4563 layer {
    4564     top: "res4b22_branch2b"
    4565     bottom: "res4b22_branch2b"
    4566     name: "res4b22_branch2b_relu"
    4567     type: "ReLU"
    4568 }
    4569 
    4570 layer {
    4571     bottom: "res4b22_branch2b"
    4572     top: "res4b22_branch2c"
    4573     name: "res4b22_branch2c"
    4574     type: "Convolution"
    4575     convolution_param {
    4576         num_output: 1024
    4577         kernel_size: 1
    4578         pad: 0
    4579         stride: 1
    4580         bias_term: false
    4581     }
    4582 }
    4583 
    4584 layer {
    4585     bottom: "res4b22_branch2c"
    4586     top: "res4b22_branch2c"
    4587     name: "bn4b22_branch2c"
    4588     type: "BatchNorm"
    4589     batch_norm_param {
    4590         use_global_stats: true
    4591     }
    4592 }
    4593 
    4594 layer {
    4595     bottom: "res4b22_branch2c"
    4596     top: "res4b22_branch2c"
    4597     name: "scale4b22_branch2c"
    4598     type: "Scale"
    4599     scale_param {
    4600         bias_term: true
    4601     }
    4602 }
    4603 
    4604 layer {
    4605     bottom: "res4b21"
    4606     bottom: "res4b22_branch2c"
    4607     top: "res4b22"
    4608     name: "res4b22"
    4609     type: "Eltwise"
    4610 }
    4611 
    4612 layer {
    4613     bottom: "res4b22"
    4614     top: "res4b22"
    4615     name: "res4b22_relu"
    4616     type: "ReLU"
    4617 }
    4618 
    4619 layer {
    4620     bottom: "res4b22"
    4621     top: "res4b23_branch2a"
    4622     name: "res4b23_branch2a"
    4623     type: "Convolution"
    4624     convolution_param {
    4625         num_output: 256
    4626         kernel_size: 1
    4627         pad: 0
    4628         stride: 1
    4629         bias_term: false
    4630     }
    4631 }
    4632 
    4633 layer {
    4634     bottom: "res4b23_branch2a"
    4635     top: "res4b23_branch2a"
    4636     name: "bn4b23_branch2a"
    4637     type: "BatchNorm"
    4638     batch_norm_param {
    4639         use_global_stats: true
    4640     }
    4641 }
    4642 
    4643 layer {
    4644     bottom: "res4b23_branch2a"
    4645     top: "res4b23_branch2a"
    4646     name: "scale4b23_branch2a"
    4647     type: "Scale"
    4648     scale_param {
    4649         bias_term: true
    4650     }
    4651 }
    4652 
    4653 layer {
    4654     top: "res4b23_branch2a"
    4655     bottom: "res4b23_branch2a"
    4656     name: "res4b23_branch2a_relu"
    4657     type: "ReLU"
    4658 }
    4659 
    4660 layer {
    4661     bottom: "res4b23_branch2a"
    4662     top: "res4b23_branch2b"
    4663     name: "res4b23_branch2b"
    4664     type: "Convolution"
    4665     convolution_param {
    4666         num_output: 256
    4667         kernel_size: 3
    4668         pad: 1
    4669         stride: 1
    4670         bias_term: false
    4671     }
    4672 }
    4673 
    4674 layer {
    4675     bottom: "res4b23_branch2b"
    4676     top: "res4b23_branch2b"
    4677     name: "bn4b23_branch2b"
    4678     type: "BatchNorm"
    4679     batch_norm_param {
    4680         use_global_stats: true
    4681     }
    4682 }
    4683 
    4684 layer {
    4685     bottom: "res4b23_branch2b"
    4686     top: "res4b23_branch2b"
    4687     name: "scale4b23_branch2b"
    4688     type: "Scale"
    4689     scale_param {
    4690         bias_term: true
    4691     }
    4692 }
    4693 
    4694 layer {
    4695     top: "res4b23_branch2b"
    4696     bottom: "res4b23_branch2b"
    4697     name: "res4b23_branch2b_relu"
    4698     type: "ReLU"
    4699 }
    4700 
    4701 layer {
    4702     bottom: "res4b23_branch2b"
    4703     top: "res4b23_branch2c"
    4704     name: "res4b23_branch2c"
    4705     type: "Convolution"
    4706     convolution_param {
    4707         num_output: 1024
    4708         kernel_size: 1
    4709         pad: 0
    4710         stride: 1
    4711         bias_term: false
    4712     }
    4713 }
    4714 
    4715 layer {
    4716     bottom: "res4b23_branch2c"
    4717     top: "res4b23_branch2c"
    4718     name: "bn4b23_branch2c"
    4719     type: "BatchNorm"
    4720     batch_norm_param {
    4721         use_global_stats: true
    4722     }
    4723 }
    4724 
    4725 layer {
    4726     bottom: "res4b23_branch2c"
    4727     top: "res4b23_branch2c"
    4728     name: "scale4b23_branch2c"
    4729     type: "Scale"
    4730     scale_param {
    4731         bias_term: true
    4732     }
    4733 }
    4734 
    4735 layer {
    4736     bottom: "res4b22"
    4737     bottom: "res4b23_branch2c"
    4738     top: "res4b23"
    4739     name: "res4b23"
    4740     type: "Eltwise"
    4741 }
    4742 
    4743 layer {
    4744     bottom: "res4b23"
    4745     top: "res4b23"
    4746     name: "res4b23_relu"
    4747     type: "ReLU"
    4748 }
    4749 
    4750 layer {
    4751     bottom: "res4b23"
    4752     top: "res4b24_branch2a"
    4753     name: "res4b24_branch2a"
    4754     type: "Convolution"
    4755     convolution_param {
    4756         num_output: 256
    4757         kernel_size: 1
    4758         pad: 0
    4759         stride: 1
    4760         bias_term: false
    4761     }
    4762 }
    4763 
    4764 layer {
    4765     bottom: "res4b24_branch2a"
    4766     top: "res4b24_branch2a"
    4767     name: "bn4b24_branch2a"
    4768     type: "BatchNorm"
    4769     batch_norm_param {
    4770         use_global_stats: true
    4771     }
    4772 }
    4773 
    4774 layer {
    4775     bottom: "res4b24_branch2a"
    4776     top: "res4b24_branch2a"
    4777     name: "scale4b24_branch2a"
    4778     type: "Scale"
    4779     scale_param {
    4780         bias_term: true
    4781     }
    4782 }
    4783 
    4784 layer {
    4785     top: "res4b24_branch2a"
    4786     bottom: "res4b24_branch2a"
    4787     name: "res4b24_branch2a_relu"
    4788     type: "ReLU"
    4789 }
    4790 
    4791 layer {
    4792     bottom: "res4b24_branch2a"
    4793     top: "res4b24_branch2b"
    4794     name: "res4b24_branch2b"
    4795     type: "Convolution"
    4796     convolution_param {
    4797         num_output: 256
    4798         kernel_size: 3
    4799         pad: 1
    4800         stride: 1
    4801         bias_term: false
    4802     }
    4803 }
    4804 
    4805 layer {
    4806     bottom: "res4b24_branch2b"
    4807     top: "res4b24_branch2b"
    4808     name: "bn4b24_branch2b"
    4809     type: "BatchNorm"
    4810     batch_norm_param {
    4811         use_global_stats: true
    4812     }
    4813 }
    4814 
    4815 layer {
    4816     bottom: "res4b24_branch2b"
    4817     top: "res4b24_branch2b"
    4818     name: "scale4b24_branch2b"
    4819     type: "Scale"
    4820     scale_param {
    4821         bias_term: true
    4822     }
    4823 }
    4824 
    4825 layer {
    4826     top: "res4b24_branch2b"
    4827     bottom: "res4b24_branch2b"
    4828     name: "res4b24_branch2b_relu"
    4829     type: "ReLU"
    4830 }
    4831 
    4832 layer {
    4833     bottom: "res4b24_branch2b"
    4834     top: "res4b24_branch2c"
    4835     name: "res4b24_branch2c"
    4836     type: "Convolution"
    4837     convolution_param {
    4838         num_output: 1024
    4839         kernel_size: 1
    4840         pad: 0
    4841         stride: 1
    4842         bias_term: false
    4843     }
    4844 }
    4845 
    4846 layer {
    4847     bottom: "res4b24_branch2c"
    4848     top: "res4b24_branch2c"
    4849     name: "bn4b24_branch2c"
    4850     type: "BatchNorm"
    4851     batch_norm_param {
    4852         use_global_stats: true
    4853     }
    4854 }
    4855 
    4856 layer {
    4857     bottom: "res4b24_branch2c"
    4858     top: "res4b24_branch2c"
    4859     name: "scale4b24_branch2c"
    4860     type: "Scale"
    4861     scale_param {
    4862         bias_term: true
    4863     }
    4864 }
    4865 
    4866 layer {
    4867     bottom: "res4b23"
    4868     bottom: "res4b24_branch2c"
    4869     top: "res4b24"
    4870     name: "res4b24"
    4871     type: "Eltwise"
    4872 }
    4873 
    4874 layer {
    4875     bottom: "res4b24"
    4876     top: "res4b24"
    4877     name: "res4b24_relu"
    4878     type: "ReLU"
    4879 }
    4880 
    4881 layer {
    4882     bottom: "res4b24"
    4883     top: "res4b25_branch2a"
    4884     name: "res4b25_branch2a"
    4885     type: "Convolution"
    4886     convolution_param {
    4887         num_output: 256
    4888         kernel_size: 1
    4889         pad: 0
    4890         stride: 1
    4891         bias_term: false
    4892     }
    4893 }
    4894 
    4895 layer {
    4896     bottom: "res4b25_branch2a"
    4897     top: "res4b25_branch2a"
    4898     name: "bn4b25_branch2a"
    4899     type: "BatchNorm"
    4900     batch_norm_param {
    4901         use_global_stats: true
    4902     }
    4903 }
    4904 
    4905 layer {
    4906     bottom: "res4b25_branch2a"
    4907     top: "res4b25_branch2a"
    4908     name: "scale4b25_branch2a"
    4909     type: "Scale"
    4910     scale_param {
    4911         bias_term: true
    4912     }
    4913 }
    4914 
    4915 layer {
    4916     top: "res4b25_branch2a"
    4917     bottom: "res4b25_branch2a"
    4918     name: "res4b25_branch2a_relu"
    4919     type: "ReLU"
    4920 }
    4921 
    4922 layer {
    4923     bottom: "res4b25_branch2a"
    4924     top: "res4b25_branch2b"
    4925     name: "res4b25_branch2b"
    4926     type: "Convolution"
    4927     convolution_param {
    4928         num_output: 256
    4929         kernel_size: 3
    4930         pad: 1
    4931         stride: 1
    4932         bias_term: false
    4933     }
    4934 }
    4935 
    4936 layer {
    4937     bottom: "res4b25_branch2b"
    4938     top: "res4b25_branch2b"
    4939     name: "bn4b25_branch2b"
    4940     type: "BatchNorm"
    4941     batch_norm_param {
    4942         use_global_stats: true
    4943     }
    4944 }
    4945 
    4946 layer {
    4947     bottom: "res4b25_branch2b"
    4948     top: "res4b25_branch2b"
    4949     name: "scale4b25_branch2b"
    4950     type: "Scale"
    4951     scale_param {
    4952         bias_term: true
    4953     }
    4954 }
    4955 
    4956 layer {
    4957     top: "res4b25_branch2b"
    4958     bottom: "res4b25_branch2b"
    4959     name: "res4b25_branch2b_relu"
    4960     type: "ReLU"
    4961 }
    4962 
    4963 layer {
    4964     bottom: "res4b25_branch2b"
    4965     top: "res4b25_branch2c"
    4966     name: "res4b25_branch2c"
    4967     type: "Convolution"
    4968     convolution_param {
    4969         num_output: 1024
    4970         kernel_size: 1
    4971         pad: 0
    4972         stride: 1
    4973         bias_term: false
    4974     }
    4975 }
    4976 
    4977 layer {
    4978     bottom: "res4b25_branch2c"
    4979     top: "res4b25_branch2c"
    4980     name: "bn4b25_branch2c"
    4981     type: "BatchNorm"
    4982     batch_norm_param {
    4983         use_global_stats: true
    4984     }
    4985 }
    4986 
    4987 layer {
    4988     bottom: "res4b25_branch2c"
    4989     top: "res4b25_branch2c"
    4990     name: "scale4b25_branch2c"
    4991     type: "Scale"
    4992     scale_param {
    4993         bias_term: true
    4994     }
    4995 }
    4996 
    4997 layer {
    4998     bottom: "res4b24"
    4999     bottom: "res4b25_branch2c"
    5000     top: "res4b25"
    5001     name: "res4b25"
    5002     type: "Eltwise"
    5003 }
    5004 
    5005 layer {
    5006     bottom: "res4b25"
    5007     top: "res4b25"
    5008     name: "res4b25_relu"
    5009     type: "ReLU"
    5010 }
    5011 
    5012 layer {
    5013     bottom: "res4b25"
    5014     top: "res4b26_branch2a"
    5015     name: "res4b26_branch2a"
    5016     type: "Convolution"
    5017     convolution_param {
    5018         num_output: 256
    5019         kernel_size: 1
    5020         pad: 0
    5021         stride: 1
    5022         bias_term: false
    5023     }
    5024 }
    5025 
    5026 layer {
    5027     bottom: "res4b26_branch2a"
    5028     top: "res4b26_branch2a"
    5029     name: "bn4b26_branch2a"
    5030     type: "BatchNorm"
    5031     batch_norm_param {
    5032         use_global_stats: true
    5033     }
    5034 }
    5035 
    5036 layer {
    5037     bottom: "res4b26_branch2a"
    5038     top: "res4b26_branch2a"
    5039     name: "scale4b26_branch2a"
    5040     type: "Scale"
    5041     scale_param {
    5042         bias_term: true
    5043     }
    5044 }
    5045 
    5046 layer {
    5047     top: "res4b26_branch2a"
    5048     bottom: "res4b26_branch2a"
    5049     name: "res4b26_branch2a_relu"
    5050     type: "ReLU"
    5051 }
    5052 
    5053 layer {
    5054     bottom: "res4b26_branch2a"
    5055     top: "res4b26_branch2b"
    5056     name: "res4b26_branch2b"
    5057     type: "Convolution"
    5058     convolution_param {
    5059         num_output: 256
    5060         kernel_size: 3
    5061         pad: 1
    5062         stride: 1
    5063         bias_term: false
    5064     }
    5065 }
    5066 
    5067 layer {
    5068     bottom: "res4b26_branch2b"
    5069     top: "res4b26_branch2b"
    5070     name: "bn4b26_branch2b"
    5071     type: "BatchNorm"
    5072     batch_norm_param {
    5073         use_global_stats: true
    5074     }
    5075 }
    5076 
    5077 layer {
    5078     bottom: "res4b26_branch2b"
    5079     top: "res4b26_branch2b"
    5080     name: "scale4b26_branch2b"
    5081     type: "Scale"
    5082     scale_param {
    5083         bias_term: true
    5084     }
    5085 }
    5086 
    5087 layer {
    5088     top: "res4b26_branch2b"
    5089     bottom: "res4b26_branch2b"
    5090     name: "res4b26_branch2b_relu"
    5091     type: "ReLU"
    5092 }
    5093 
    5094 layer {
    5095     bottom: "res4b26_branch2b"
    5096     top: "res4b26_branch2c"
    5097     name: "res4b26_branch2c"
    5098     type: "Convolution"
    5099     convolution_param {
    5100         num_output: 1024
    5101         kernel_size: 1
    5102         pad: 0
    5103         stride: 1
    5104         bias_term: false
    5105     }
    5106 }
    5107 
    5108 layer {
    5109     bottom: "res4b26_branch2c"
    5110     top: "res4b26_branch2c"
    5111     name: "bn4b26_branch2c"
    5112     type: "BatchNorm"
    5113     batch_norm_param {
    5114         use_global_stats: true
    5115     }
    5116 }
    5117 
    5118 layer {
    5119     bottom: "res4b26_branch2c"
    5120     top: "res4b26_branch2c"
    5121     name: "scale4b26_branch2c"
    5122     type: "Scale"
    5123     scale_param {
    5124         bias_term: true
    5125     }
    5126 }
    5127 
    5128 layer {
    5129     bottom: "res4b25"
    5130     bottom: "res4b26_branch2c"
    5131     top: "res4b26"
    5132     name: "res4b26"
    5133     type: "Eltwise"
    5134 }
    5135 
    5136 layer {
    5137     bottom: "res4b26"
    5138     top: "res4b26"
    5139     name: "res4b26_relu"
    5140     type: "ReLU"
    5141 }
    5142 
    5143 layer {
    5144     bottom: "res4b26"
    5145     top: "res4b27_branch2a"
    5146     name: "res4b27_branch2a"
    5147     type: "Convolution"
    5148     convolution_param {
    5149         num_output: 256
    5150         kernel_size: 1
    5151         pad: 0
    5152         stride: 1
    5153         bias_term: false
    5154     }
    5155 }
    5156 
    5157 layer {
    5158     bottom: "res4b27_branch2a"
    5159     top: "res4b27_branch2a"
    5160     name: "bn4b27_branch2a"
    5161     type: "BatchNorm"
    5162     batch_norm_param {
    5163         use_global_stats: true
    5164     }
    5165 }
    5166 
    5167 layer {
    5168     bottom: "res4b27_branch2a"
    5169     top: "res4b27_branch2a"
    5170     name: "scale4b27_branch2a"
    5171     type: "Scale"
    5172     scale_param {
    5173         bias_term: true
    5174     }
    5175 }
    5176 
    5177 layer {
    5178     top: "res4b27_branch2a"
    5179     bottom: "res4b27_branch2a"
    5180     name: "res4b27_branch2a_relu"
    5181     type: "ReLU"
    5182 }
    5183 
    5184 layer {
    5185     bottom: "res4b27_branch2a"
    5186     top: "res4b27_branch2b"
    5187     name: "res4b27_branch2b"
    5188     type: "Convolution"
    5189     convolution_param {
    5190         num_output: 256
    5191         kernel_size: 3
    5192         pad: 1
    5193         stride: 1
    5194         bias_term: false
    5195     }
    5196 }
    5197 
    5198 layer {
    5199     bottom: "res4b27_branch2b"
    5200     top: "res4b27_branch2b"
    5201     name: "bn4b27_branch2b"
    5202     type: "BatchNorm"
    5203     batch_norm_param {
    5204         use_global_stats: true
    5205     }
    5206 }
    5207 
    5208 layer {
    5209     bottom: "res4b27_branch2b"
    5210     top: "res4b27_branch2b"
    5211     name: "scale4b27_branch2b"
    5212     type: "Scale"
    5213     scale_param {
    5214         bias_term: true
    5215     }
    5216 }
    5217 
    5218 layer {
    5219     top: "res4b27_branch2b"
    5220     bottom: "res4b27_branch2b"
    5221     name: "res4b27_branch2b_relu"
    5222     type: "ReLU"
    5223 }
    5224 
    5225 layer {
    5226     bottom: "res4b27_branch2b"
    5227     top: "res4b27_branch2c"
    5228     name: "res4b27_branch2c"
    5229     type: "Convolution"
    5230     convolution_param {
    5231         num_output: 1024
    5232         kernel_size: 1
    5233         pad: 0
    5234         stride: 1
    5235         bias_term: false
    5236     }
    5237 }
    5238 
    5239 layer {
    5240     bottom: "res4b27_branch2c"
    5241     top: "res4b27_branch2c"
    5242     name: "bn4b27_branch2c"
    5243     type: "BatchNorm"
    5244     batch_norm_param {
    5245         use_global_stats: true
    5246     }
    5247 }
    5248 
    5249 layer {
    5250     bottom: "res4b27_branch2c"
    5251     top: "res4b27_branch2c"
    5252     name: "scale4b27_branch2c"
    5253     type: "Scale"
    5254     scale_param {
    5255         bias_term: true
    5256     }
    5257 }
    5258 
    5259 layer {
    5260     bottom: "res4b26"
    5261     bottom: "res4b27_branch2c"
    5262     top: "res4b27"
    5263     name: "res4b27"
    5264     type: "Eltwise"
    5265 }
    5266 
    5267 layer {
    5268     bottom: "res4b27"
    5269     top: "res4b27"
    5270     name: "res4b27_relu"
    5271     type: "ReLU"
    5272 }
    5273 
    5274 layer {
    5275     bottom: "res4b27"
    5276     top: "res4b28_branch2a"
    5277     name: "res4b28_branch2a"
    5278     type: "Convolution"
    5279     convolution_param {
    5280         num_output: 256
    5281         kernel_size: 1
    5282         pad: 0
    5283         stride: 1
    5284         bias_term: false
    5285     }
    5286 }
    5287 
    5288 layer {
    5289     bottom: "res4b28_branch2a"
    5290     top: "res4b28_branch2a"
    5291     name: "bn4b28_branch2a"
    5292     type: "BatchNorm"
    5293     batch_norm_param {
    5294         use_global_stats: true
    5295     }
    5296 }
    5297 
    5298 layer {
    5299     bottom: "res4b28_branch2a"
    5300     top: "res4b28_branch2a"
    5301     name: "scale4b28_branch2a"
    5302     type: "Scale"
    5303     scale_param {
    5304         bias_term: true
    5305     }
    5306 }
    5307 
    5308 layer {
    5309     top: "res4b28_branch2a"
    5310     bottom: "res4b28_branch2a"
    5311     name: "res4b28_branch2a_relu"
    5312     type: "ReLU"
    5313 }
    5314 
    5315 layer {
    5316     bottom: "res4b28_branch2a"
    5317     top: "res4b28_branch2b"
    5318     name: "res4b28_branch2b"
    5319     type: "Convolution"
    5320     convolution_param {
    5321         num_output: 256
    5322         kernel_size: 3
    5323         pad: 1
    5324         stride: 1
    5325         bias_term: false
    5326     }
    5327 }
    5328 
    5329 layer {
    5330     bottom: "res4b28_branch2b"
    5331     top: "res4b28_branch2b"
    5332     name: "bn4b28_branch2b"
    5333     type: "BatchNorm"
    5334     batch_norm_param {
    5335         use_global_stats: true
    5336     }
    5337 }
    5338 
    5339 layer {
    5340     bottom: "res4b28_branch2b"
    5341     top: "res4b28_branch2b"
    5342     name: "scale4b28_branch2b"
    5343     type: "Scale"
    5344     scale_param {
    5345         bias_term: true
    5346     }
    5347 }
    5348 
    5349 layer {
    5350     top: "res4b28_branch2b"
    5351     bottom: "res4b28_branch2b"
    5352     name: "res4b28_branch2b_relu"
    5353     type: "ReLU"
    5354 }
    5355 
    5356 layer {
    5357     bottom: "res4b28_branch2b"
    5358     top: "res4b28_branch2c"
    5359     name: "res4b28_branch2c"
    5360     type: "Convolution"
    5361     convolution_param {
    5362         num_output: 1024
    5363         kernel_size: 1
    5364         pad: 0
    5365         stride: 1
    5366         bias_term: false
    5367     }
    5368 }
    5369 
    5370 layer {
    5371     bottom: "res4b28_branch2c"
    5372     top: "res4b28_branch2c"
    5373     name: "bn4b28_branch2c"
    5374     type: "BatchNorm"
    5375     batch_norm_param {
    5376         use_global_stats: true
    5377     }
    5378 }
    5379 
    5380 layer {
    5381     bottom: "res4b28_branch2c"
    5382     top: "res4b28_branch2c"
    5383     name: "scale4b28_branch2c"
    5384     type: "Scale"
    5385     scale_param {
    5386         bias_term: true
    5387     }
    5388 }
    5389 
    5390 layer {
    5391     bottom: "res4b27"
    5392     bottom: "res4b28_branch2c"
    5393     top: "res4b28"
    5394     name: "res4b28"
    5395     type: "Eltwise"
    5396 }
    5397 
    5398 layer {
    5399     bottom: "res4b28"
    5400     top: "res4b28"
    5401     name: "res4b28_relu"
    5402     type: "ReLU"
    5403 }
    5404 
    5405 layer {
    5406     bottom: "res4b28"
    5407     top: "res4b29_branch2a"
    5408     name: "res4b29_branch2a"
    5409     type: "Convolution"
    5410     convolution_param {
    5411         num_output: 256
    5412         kernel_size: 1
    5413         pad: 0
    5414         stride: 1
    5415         bias_term: false
    5416     }
    5417 }
    5418 
    5419 layer {
    5420     bottom: "res4b29_branch2a"
    5421     top: "res4b29_branch2a"
    5422     name: "bn4b29_branch2a"
    5423     type: "BatchNorm"
    5424     batch_norm_param {
    5425         use_global_stats: true
    5426     }
    5427 }
    5428 
    5429 layer {
    5430     bottom: "res4b29_branch2a"
    5431     top: "res4b29_branch2a"
    5432     name: "scale4b29_branch2a"
    5433     type: "Scale"
    5434     scale_param {
    5435         bias_term: true
    5436     }
    5437 }
    5438 
    5439 layer {
    5440     top: "res4b29_branch2a"
    5441     bottom: "res4b29_branch2a"
    5442     name: "res4b29_branch2a_relu"
    5443     type: "ReLU"
    5444 }
    5445 
    5446 layer {
    5447     bottom: "res4b29_branch2a"
    5448     top: "res4b29_branch2b"
    5449     name: "res4b29_branch2b"
    5450     type: "Convolution"
    5451     convolution_param {
    5452         num_output: 256
    5453         kernel_size: 3
    5454         pad: 1
    5455         stride: 1
    5456         bias_term: false
    5457     }
    5458 }
    5459 
    5460 layer {
    5461     bottom: "res4b29_branch2b"
    5462     top: "res4b29_branch2b"
    5463     name: "bn4b29_branch2b"
    5464     type: "BatchNorm"
    5465     batch_norm_param {
    5466         use_global_stats: true
    5467     }
    5468 }
    5469 
    5470 layer {
    5471     bottom: "res4b29_branch2b"
    5472     top: "res4b29_branch2b"
    5473     name: "scale4b29_branch2b"
    5474     type: "Scale"
    5475     scale_param {
    5476         bias_term: true
    5477     }
    5478 }
    5479 
    5480 layer {
    5481     top: "res4b29_branch2b"
    5482     bottom: "res4b29_branch2b"
    5483     name: "res4b29_branch2b_relu"
    5484     type: "ReLU"
    5485 }
    5486 
    5487 layer {
    5488     bottom: "res4b29_branch2b"
    5489     top: "res4b29_branch2c"
    5490     name: "res4b29_branch2c"
    5491     type: "Convolution"
    5492     convolution_param {
    5493         num_output: 1024
    5494         kernel_size: 1
    5495         pad: 0
    5496         stride: 1
    5497         bias_term: false
    5498     }
    5499 }
    5500 
    5501 layer {
    5502     bottom: "res4b29_branch2c"
    5503     top: "res4b29_branch2c"
    5504     name: "bn4b29_branch2c"
    5505     type: "BatchNorm"
    5506     batch_norm_param {
    5507         use_global_stats: true
    5508     }
    5509 }
    5510 
    5511 layer {
    5512     bottom: "res4b29_branch2c"
    5513     top: "res4b29_branch2c"
    5514     name: "scale4b29_branch2c"
    5515     type: "Scale"
    5516     scale_param {
    5517         bias_term: true
    5518     }
    5519 }
    5520 
    5521 layer {
    5522     bottom: "res4b28"
    5523     bottom: "res4b29_branch2c"
    5524     top: "res4b29"
    5525     name: "res4b29"
    5526     type: "Eltwise"
    5527 }
    5528 
    5529 layer {
    5530     bottom: "res4b29"
    5531     top: "res4b29"
    5532     name: "res4b29_relu"
    5533     type: "ReLU"
    5534 }
    5535 
    5536 layer {
    5537     bottom: "res4b29"
    5538     top: "res4b30_branch2a"
    5539     name: "res4b30_branch2a"
    5540     type: "Convolution"
    5541     convolution_param {
    5542         num_output: 256
    5543         kernel_size: 1
    5544         pad: 0
    5545         stride: 1
    5546         bias_term: false
    5547     }
    5548 }
    5549 
    5550 layer {
    5551     bottom: "res4b30_branch2a"
    5552     top: "res4b30_branch2a"
    5553     name: "bn4b30_branch2a"
    5554     type: "BatchNorm"
    5555     batch_norm_param {
    5556         use_global_stats: true
    5557     }
    5558 }
    5559 
    5560 layer {
    5561     bottom: "res4b30_branch2a"
    5562     top: "res4b30_branch2a"
    5563     name: "scale4b30_branch2a"
    5564     type: "Scale"
    5565     scale_param {
    5566         bias_term: true
    5567     }
    5568 }
    5569 
    5570 layer {
    5571     top: "res4b30_branch2a"
    5572     bottom: "res4b30_branch2a"
    5573     name: "res4b30_branch2a_relu"
    5574     type: "ReLU"
    5575 }
    5576 
    5577 layer {
    5578     bottom: "res4b30_branch2a"
    5579     top: "res4b30_branch2b"
    5580     name: "res4b30_branch2b"
    5581     type: "Convolution"
    5582     convolution_param {
    5583         num_output: 256
    5584         kernel_size: 3
    5585         pad: 1
    5586         stride: 1
    5587         bias_term: false
    5588     }
    5589 }
    5590 
    5591 layer {
    5592     bottom: "res4b30_branch2b"
    5593     top: "res4b30_branch2b"
    5594     name: "bn4b30_branch2b"
    5595     type: "BatchNorm"
    5596     batch_norm_param {
    5597         use_global_stats: true
    5598     }
    5599 }
    5600 
    5601 layer {
    5602     bottom: "res4b30_branch2b"
    5603     top: "res4b30_branch2b"
    5604     name: "scale4b30_branch2b"
    5605     type: "Scale"
    5606     scale_param {
    5607         bias_term: true
    5608     }
    5609 }
    5610 
    5611 layer {
    5612     top: "res4b30_branch2b"
    5613     bottom: "res4b30_branch2b"
    5614     name: "res4b30_branch2b_relu"
    5615     type: "ReLU"
    5616 }
    5617 
    5618 layer {
    5619     bottom: "res4b30_branch2b"
    5620     top: "res4b30_branch2c"
    5621     name: "res4b30_branch2c"
    5622     type: "Convolution"
    5623     convolution_param {
    5624         num_output: 1024
    5625         kernel_size: 1
    5626         pad: 0
    5627         stride: 1
    5628         bias_term: false
    5629     }
    5630 }
    5631 
    5632 layer {
    5633     bottom: "res4b30_branch2c"
    5634     top: "res4b30_branch2c"
    5635     name: "bn4b30_branch2c"
    5636     type: "BatchNorm"
    5637     batch_norm_param {
    5638         use_global_stats: true
    5639     }
    5640 }
    5641 
    5642 layer {
    5643     bottom: "res4b30_branch2c"
    5644     top: "res4b30_branch2c"
    5645     name: "scale4b30_branch2c"
    5646     type: "Scale"
    5647     scale_param {
    5648         bias_term: true
    5649     }
    5650 }
    5651 
    5652 layer {
    5653     bottom: "res4b29"
    5654     bottom: "res4b30_branch2c"
    5655     top: "res4b30"
    5656     name: "res4b30"
    5657     type: "Eltwise"
    5658 }
    5659 
    5660 layer {
    5661     bottom: "res4b30"
    5662     top: "res4b30"
    5663     name: "res4b30_relu"
    5664     type: "ReLU"
    5665 }
    5666 
    5667 layer {
    5668     bottom: "res4b30"
    5669     top: "res4b31_branch2a"
    5670     name: "res4b31_branch2a"
    5671     type: "Convolution"
    5672     convolution_param {
    5673         num_output: 256
    5674         kernel_size: 1
    5675         pad: 0
    5676         stride: 1
    5677         bias_term: false
    5678     }
    5679 }
    5680 
    5681 layer {
    5682     bottom: "res4b31_branch2a"
    5683     top: "res4b31_branch2a"
    5684     name: "bn4b31_branch2a"
    5685     type: "BatchNorm"
    5686     batch_norm_param {
    5687         use_global_stats: true
    5688     }
    5689 }
    5690 
    5691 layer {
    5692     bottom: "res4b31_branch2a"
    5693     top: "res4b31_branch2a"
    5694     name: "scale4b31_branch2a"
    5695     type: "Scale"
    5696     scale_param {
    5697         bias_term: true
    5698     }
    5699 }
    5700 
    5701 layer {
    5702     top: "res4b31_branch2a"
    5703     bottom: "res4b31_branch2a"
    5704     name: "res4b31_branch2a_relu"
    5705     type: "ReLU"
    5706 }
    5707 
    5708 layer {
    5709     bottom: "res4b31_branch2a"
    5710     top: "res4b31_branch2b"
    5711     name: "res4b31_branch2b"
    5712     type: "Convolution"
    5713     convolution_param {
    5714         num_output: 256
    5715         kernel_size: 3
    5716         pad: 1
    5717         stride: 1
    5718         bias_term: false
    5719     }
    5720 }
    5721 
    5722 layer {
    5723     bottom: "res4b31_branch2b"
    5724     top: "res4b31_branch2b"
    5725     name: "bn4b31_branch2b"
    5726     type: "BatchNorm"
    5727     batch_norm_param {
    5728         use_global_stats: true
    5729     }
    5730 }
    5731 
    5732 layer {
    5733     bottom: "res4b31_branch2b"
    5734     top: "res4b31_branch2b"
    5735     name: "scale4b31_branch2b"
    5736     type: "Scale"
    5737     scale_param {
    5738         bias_term: true
    5739     }
    5740 }
    5741 
    5742 layer {
    5743     top: "res4b31_branch2b"
    5744     bottom: "res4b31_branch2b"
    5745     name: "res4b31_branch2b_relu"
    5746     type: "ReLU"
    5747 }
    5748 
    5749 layer {
    5750     bottom: "res4b31_branch2b"
    5751     top: "res4b31_branch2c"
    5752     name: "res4b31_branch2c"
    5753     type: "Convolution"
    5754     convolution_param {
    5755         num_output: 1024
    5756         kernel_size: 1
    5757         pad: 0
    5758         stride: 1
    5759         bias_term: false
    5760     }
    5761 }
    5762 
    5763 layer {
    5764     bottom: "res4b31_branch2c"
    5765     top: "res4b31_branch2c"
    5766     name: "bn4b31_branch2c"
    5767     type: "BatchNorm"
    5768     batch_norm_param {
    5769         use_global_stats: true
    5770     }
    5771 }
    5772 
    5773 layer {
    5774     bottom: "res4b31_branch2c"
    5775     top: "res4b31_branch2c"
    5776     name: "scale4b31_branch2c"
    5777     type: "Scale"
    5778     scale_param {
    5779         bias_term: true
    5780     }
    5781 }
    5782 
    5783 layer {
    5784     bottom: "res4b30"
    5785     bottom: "res4b31_branch2c"
    5786     top: "res4b31"
    5787     name: "res4b31"
    5788     type: "Eltwise"
    5789 }
    5790 
    5791 layer {
    5792     bottom: "res4b31"
    5793     top: "res4b31"
    5794     name: "res4b31_relu"
    5795     type: "ReLU"
    5796 }
    5797 
    5798 layer {
    5799     bottom: "res4b31"
    5800     top: "res4b32_branch2a"
    5801     name: "res4b32_branch2a"
    5802     type: "Convolution"
    5803     convolution_param {
    5804         num_output: 256
    5805         kernel_size: 1
    5806         pad: 0
    5807         stride: 1
    5808         bias_term: false
    5809     }
    5810 }
    5811 
    5812 layer {
    5813     bottom: "res4b32_branch2a"
    5814     top: "res4b32_branch2a"
    5815     name: "bn4b32_branch2a"
    5816     type: "BatchNorm"
    5817     batch_norm_param {
    5818         use_global_stats: true
    5819     }
    5820 }
    5821 
    5822 layer {
    5823     bottom: "res4b32_branch2a"
    5824     top: "res4b32_branch2a"
    5825     name: "scale4b32_branch2a"
    5826     type: "Scale"
    5827     scale_param {
    5828         bias_term: true
    5829     }
    5830 }
    5831 
    5832 layer {
    5833     top: "res4b32_branch2a"
    5834     bottom: "res4b32_branch2a"
    5835     name: "res4b32_branch2a_relu"
    5836     type: "ReLU"
    5837 }
    5838 
    5839 layer {
    5840     bottom: "res4b32_branch2a"
    5841     top: "res4b32_branch2b"
    5842     name: "res4b32_branch2b"
    5843     type: "Convolution"
    5844     convolution_param {
    5845         num_output: 256
    5846         kernel_size: 3
    5847         pad: 1
    5848         stride: 1
    5849         bias_term: false
    5850     }
    5851 }
    5852 
    5853 layer {
    5854     bottom: "res4b32_branch2b"
    5855     top: "res4b32_branch2b"
    5856     name: "bn4b32_branch2b"
    5857     type: "BatchNorm"
    5858     batch_norm_param {
    5859         use_global_stats: true
    5860     }
    5861 }
    5862 
    5863 layer {
    5864     bottom: "res4b32_branch2b"
    5865     top: "res4b32_branch2b"
    5866     name: "scale4b32_branch2b"
    5867     type: "Scale"
    5868     scale_param {
    5869         bias_term: true
    5870     }
    5871 }
    5872 
    5873 layer {
    5874     top: "res4b32_branch2b"
    5875     bottom: "res4b32_branch2b"
    5876     name: "res4b32_branch2b_relu"
    5877     type: "ReLU"
    5878 }
    5879 
    5880 layer {
    5881     bottom: "res4b32_branch2b"
    5882     top: "res4b32_branch2c"
    5883     name: "res4b32_branch2c"
    5884     type: "Convolution"
    5885     convolution_param {
    5886         num_output: 1024
    5887         kernel_size: 1
    5888         pad: 0
    5889         stride: 1
    5890         bias_term: false
    5891     }
    5892 }
    5893 
    5894 layer {
    5895     bottom: "res4b32_branch2c"
    5896     top: "res4b32_branch2c"
    5897     name: "bn4b32_branch2c"
    5898     type: "BatchNorm"
    5899     batch_norm_param {
    5900         use_global_stats: true
    5901     }
    5902 }
    5903 
    5904 layer {
    5905     bottom: "res4b32_branch2c"
    5906     top: "res4b32_branch2c"
    5907     name: "scale4b32_branch2c"
    5908     type: "Scale"
    5909     scale_param {
    5910         bias_term: true
    5911     }
    5912 }
    5913 
    5914 layer {
    5915     bottom: "res4b31"
    5916     bottom: "res4b32_branch2c"
    5917     top: "res4b32"
    5918     name: "res4b32"
    5919     type: "Eltwise"
    5920 }
    5921 
    5922 layer {
    5923     bottom: "res4b32"
    5924     top: "res4b32"
    5925     name: "res4b32_relu"
    5926     type: "ReLU"
    5927 }
    5928 
    5929 layer {
    5930     bottom: "res4b32"
    5931     top: "res4b33_branch2a"
    5932     name: "res4b33_branch2a"
    5933     type: "Convolution"
    5934     convolution_param {
    5935         num_output: 256
    5936         kernel_size: 1
    5937         pad: 0
    5938         stride: 1
    5939         bias_term: false
    5940     }
    5941 }
    5942 
    5943 layer {
    5944     bottom: "res4b33_branch2a"
    5945     top: "res4b33_branch2a"
    5946     name: "bn4b33_branch2a"
    5947     type: "BatchNorm"
    5948     batch_norm_param {
    5949         use_global_stats: true
    5950     }
    5951 }
    5952 
    5953 layer {
    5954     bottom: "res4b33_branch2a"
    5955     top: "res4b33_branch2a"
    5956     name: "scale4b33_branch2a"
    5957     type: "Scale"
    5958     scale_param {
    5959         bias_term: true
    5960     }
    5961 }
    5962 
    5963 layer {
    5964     top: "res4b33_branch2a"
    5965     bottom: "res4b33_branch2a"
    5966     name: "res4b33_branch2a_relu"
    5967     type: "ReLU"
    5968 }
    5969 
    5970 layer {
    5971     bottom: "res4b33_branch2a"
    5972     top: "res4b33_branch2b"
    5973     name: "res4b33_branch2b"
    5974     type: "Convolution"
    5975     convolution_param {
    5976         num_output: 256
    5977         kernel_size: 3
    5978         pad: 1
    5979         stride: 1
    5980         bias_term: false
    5981     }
    5982 }
    5983 
    5984 layer {
    5985     bottom: "res4b33_branch2b"
    5986     top: "res4b33_branch2b"
    5987     name: "bn4b33_branch2b"
    5988     type: "BatchNorm"
    5989     batch_norm_param {
    5990         use_global_stats: true
    5991     }
    5992 }
    5993 
    5994 layer {
    5995     bottom: "res4b33_branch2b"
    5996     top: "res4b33_branch2b"
    5997     name: "scale4b33_branch2b"
    5998     type: "Scale"
    5999     scale_param {
    6000         bias_term: true
    6001     }
    6002 }
    6003 
    6004 layer {
    6005     top: "res4b33_branch2b"
    6006     bottom: "res4b33_branch2b"
    6007     name: "res4b33_branch2b_relu"
    6008     type: "ReLU"
    6009 }
    6010 
    6011 layer {
    6012     bottom: "res4b33_branch2b"
    6013     top: "res4b33_branch2c"
    6014     name: "res4b33_branch2c"
    6015     type: "Convolution"
    6016     convolution_param {
    6017         num_output: 1024
    6018         kernel_size: 1
    6019         pad: 0
    6020         stride: 1
    6021         bias_term: false
    6022     }
    6023 }
    6024 
    6025 layer {
    6026     bottom: "res4b33_branch2c"
    6027     top: "res4b33_branch2c"
    6028     name: "bn4b33_branch2c"
    6029     type: "BatchNorm"
    6030     batch_norm_param {
    6031         use_global_stats: true
    6032     }
    6033 }
    6034 
    6035 layer {
    6036     bottom: "res4b33_branch2c"
    6037     top: "res4b33_branch2c"
    6038     name: "scale4b33_branch2c"
    6039     type: "Scale"
    6040     scale_param {
    6041         bias_term: true
    6042     }
    6043 }
    6044 
    6045 layer {
    6046     bottom: "res4b32"
    6047     bottom: "res4b33_branch2c"
    6048     top: "res4b33"
    6049     name: "res4b33"
    6050     type: "Eltwise"
    6051 }
    6052 
    6053 layer {
    6054     bottom: "res4b33"
    6055     top: "res4b33"
    6056     name: "res4b33_relu"
    6057     type: "ReLU"
    6058 }
    6059 
    6060 layer {
    6061     bottom: "res4b33"
    6062     top: "res4b34_branch2a"
    6063     name: "res4b34_branch2a"
    6064     type: "Convolution"
    6065     convolution_param {
    6066         num_output: 256
    6067         kernel_size: 1
    6068         pad: 0
    6069         stride: 1
    6070         bias_term: false
    6071     }
    6072 }
    6073 
    6074 layer {
    6075     bottom: "res4b34_branch2a"
    6076     top: "res4b34_branch2a"
    6077     name: "bn4b34_branch2a"
    6078     type: "BatchNorm"
    6079     batch_norm_param {
    6080         use_global_stats: true
    6081     }
    6082 }
    6083 
    6084 layer {
    6085     bottom: "res4b34_branch2a"
    6086     top: "res4b34_branch2a"
    6087     name: "scale4b34_branch2a"
    6088     type: "Scale"
    6089     scale_param {
    6090         bias_term: true
    6091     }
    6092 }
    6093 
    6094 layer {
    6095     top: "res4b34_branch2a"
    6096     bottom: "res4b34_branch2a"
    6097     name: "res4b34_branch2a_relu"
    6098     type: "ReLU"
    6099 }
    6100 
    6101 layer {
    6102     bottom: "res4b34_branch2a"
    6103     top: "res4b34_branch2b"
    6104     name: "res4b34_branch2b"
    6105     type: "Convolution"
    6106     convolution_param {
    6107         num_output: 256
    6108         kernel_size: 3
    6109         pad: 1
    6110         stride: 1
    6111         bias_term: false
    6112     }
    6113 }
    6114 
    6115 layer {
    6116     bottom: "res4b34_branch2b"
    6117     top: "res4b34_branch2b"
    6118     name: "bn4b34_branch2b"
    6119     type: "BatchNorm"
    6120     batch_norm_param {
    6121         use_global_stats: true
    6122     }
    6123 }
    6124 
    6125 layer {
    6126     bottom: "res4b34_branch2b"
    6127     top: "res4b34_branch2b"
    6128     name: "scale4b34_branch2b"
    6129     type: "Scale"
    6130     scale_param {
    6131         bias_term: true
    6132     }
    6133 }
    6134 
    6135 layer {
    6136     top: "res4b34_branch2b"
    6137     bottom: "res4b34_branch2b"
    6138     name: "res4b34_branch2b_relu"
    6139     type: "ReLU"
    6140 }
    6141 
    6142 layer {
    6143     bottom: "res4b34_branch2b"
    6144     top: "res4b34_branch2c"
    6145     name: "res4b34_branch2c"
    6146     type: "Convolution"
    6147     convolution_param {
    6148         num_output: 1024
    6149         kernel_size: 1
    6150         pad: 0
    6151         stride: 1
    6152         bias_term: false
    6153     }
    6154 }
    6155 
    6156 layer {
    6157     bottom: "res4b34_branch2c"
    6158     top: "res4b34_branch2c"
    6159     name: "bn4b34_branch2c"
    6160     type: "BatchNorm"
    6161     batch_norm_param {
    6162         use_global_stats: true
    6163     }
    6164 }
    6165 
    6166 layer {
    6167     bottom: "res4b34_branch2c"
    6168     top: "res4b34_branch2c"
    6169     name: "scale4b34_branch2c"
    6170     type: "Scale"
    6171     scale_param {
    6172         bias_term: true
    6173     }
    6174 }
    6175 
    6176 layer {
    6177     bottom: "res4b33"
    6178     bottom: "res4b34_branch2c"
    6179     top: "res4b34"
    6180     name: "res4b34"
    6181     type: "Eltwise"
    6182 }
    6183 
    6184 layer {
    6185     bottom: "res4b34"
    6186     top: "res4b34"
    6187     name: "res4b34_relu"
    6188     type: "ReLU"
    6189 }
    6190 
    6191 layer {
    6192     bottom: "res4b34"
    6193     top: "res4b35_branch2a"
    6194     name: "res4b35_branch2a"
    6195     type: "Convolution"
    6196     convolution_param {
    6197         num_output: 256
    6198         kernel_size: 1
    6199         pad: 0
    6200         stride: 1
    6201         bias_term: false
    6202     }
    6203 }
    6204 
    6205 layer {
    6206     bottom: "res4b35_branch2a"
    6207     top: "res4b35_branch2a"
    6208     name: "bn4b35_branch2a"
    6209     type: "BatchNorm"
    6210     batch_norm_param {
    6211         use_global_stats: true
    6212     }
    6213 }
    6214 
    6215 layer {
    6216     bottom: "res4b35_branch2a"
    6217     top: "res4b35_branch2a"
    6218     name: "scale4b35_branch2a"
    6219     type: "Scale"
    6220     scale_param {
    6221         bias_term: true
    6222     }
    6223 }
    6224 
    6225 layer {
    6226     top: "res4b35_branch2a"
    6227     bottom: "res4b35_branch2a"
    6228     name: "res4b35_branch2a_relu"
    6229     type: "ReLU"
    6230 }
    6231 
    6232 layer {
    6233     bottom: "res4b35_branch2a"
    6234     top: "res4b35_branch2b"
    6235     name: "res4b35_branch2b"
    6236     type: "Convolution"
    6237     convolution_param {
    6238         num_output: 256
    6239         kernel_size: 3
    6240         pad: 1
    6241         stride: 1
    6242         bias_term: false
    6243     }
    6244 }
    6245 
    6246 layer {
    6247     bottom: "res4b35_branch2b"
    6248     top: "res4b35_branch2b"
    6249     name: "bn4b35_branch2b"
    6250     type: "BatchNorm"
    6251     batch_norm_param {
    6252         use_global_stats: true
    6253     }
    6254 }
    6255 
    6256 layer {
    6257     bottom: "res4b35_branch2b"
    6258     top: "res4b35_branch2b"
    6259     name: "scale4b35_branch2b"
    6260     type: "Scale"
    6261     scale_param {
    6262         bias_term: true
    6263     }
    6264 }
    6265 
    6266 layer {
    6267     top: "res4b35_branch2b"
    6268     bottom: "res4b35_branch2b"
    6269     name: "res4b35_branch2b_relu"
    6270     type: "ReLU"
    6271 }
    6272 
    6273 layer {
    6274     bottom: "res4b35_branch2b"
    6275     top: "res4b35_branch2c"
    6276     name: "res4b35_branch2c"
    6277     type: "Convolution"
    6278     convolution_param {
    6279         num_output: 1024
    6280         kernel_size: 1
    6281         pad: 0
    6282         stride: 1
    6283         bias_term: false
    6284     }
    6285 }
    6286 
    6287 layer {
    6288     bottom: "res4b35_branch2c"
    6289     top: "res4b35_branch2c"
    6290     name: "bn4b35_branch2c"
    6291     type: "BatchNorm"
    6292     batch_norm_param {
    6293         use_global_stats: true
    6294     }
    6295 }
    6296 
    6297 layer {
    6298     bottom: "res4b35_branch2c"
    6299     top: "res4b35_branch2c"
    6300     name: "scale4b35_branch2c"
    6301     type: "Scale"
    6302     scale_param {
    6303         bias_term: true
    6304     }
    6305 }
    6306 
    6307 layer {
    6308     bottom: "res4b34"
    6309     bottom: "res4b35_branch2c"
    6310     top: "res4b35"
    6311     name: "res4b35"
    6312     type: "Eltwise"
    6313 }
    6314 
    6315 layer {
    6316     bottom: "res4b35"
    6317     top: "res4b35"
    6318     name: "res4b35_relu"
    6319     type: "ReLU"
    6320 }
    6321 
    6322 layer {
    6323     bottom: "res4b35"
    6324     top: "res5a_branch1"
    6325     name: "res5a_branch1"
    6326     type: "Convolution"
    6327     convolution_param {
    6328         num_output: 2048
    6329         kernel_size: 1
    6330         pad: 0
    6331         stride: 2
    6332         bias_term: false
    6333     }
    6334 }
    6335 
    6336 layer {
    6337     bottom: "res5a_branch1"
    6338     top: "res5a_branch1"
    6339     name: "bn5a_branch1"
    6340     type: "BatchNorm"
    6341     batch_norm_param {
    6342         use_global_stats: true
    6343     }
    6344 }
    6345 
    6346 layer {
    6347     bottom: "res5a_branch1"
    6348     top: "res5a_branch1"
    6349     name: "scale5a_branch1"
    6350     type: "Scale"
    6351     scale_param {
    6352         bias_term: true
    6353     }
    6354 }
    6355 
    6356 layer {
    6357     bottom: "res4b35"
    6358     top: "res5a_branch2a"
    6359     name: "res5a_branch2a"
    6360     type: "Convolution"
    6361     convolution_param {
    6362         num_output: 512
    6363         kernel_size: 1
    6364         pad: 0
    6365         stride: 2
    6366         bias_term: false
    6367     }
    6368 }
    6369 
    6370 layer {
    6371     bottom: "res5a_branch2a"
    6372     top: "res5a_branch2a"
    6373     name: "bn5a_branch2a"
    6374     type: "BatchNorm"
    6375     batch_norm_param {
    6376         use_global_stats: true
    6377     }
    6378 }
    6379 
    6380 layer {
    6381     bottom: "res5a_branch2a"
    6382     top: "res5a_branch2a"
    6383     name: "scale5a_branch2a"
    6384     type: "Scale"
    6385     scale_param {
    6386         bias_term: true
    6387     }
    6388 }
    6389 
    6390 layer {
    6391     top: "res5a_branch2a"
    6392     bottom: "res5a_branch2a"
    6393     name: "res5a_branch2a_relu"
    6394     type: "ReLU"
    6395 }
    6396 
    6397 layer {
    6398     bottom: "res5a_branch2a"
    6399     top: "res5a_branch2b"
    6400     name: "res5a_branch2b"
    6401     type: "Convolution"
    6402     convolution_param {
    6403         num_output: 512
    6404         kernel_size: 3
    6405         pad: 1
    6406         stride: 1
    6407         bias_term: false
    6408     }
    6409 }
    6410 
    6411 layer {
    6412     bottom: "res5a_branch2b"
    6413     top: "res5a_branch2b"
    6414     name: "bn5a_branch2b"
    6415     type: "BatchNorm"
    6416     batch_norm_param {
    6417         use_global_stats: true
    6418     }
    6419 }
    6420 
    6421 layer {
    6422     bottom: "res5a_branch2b"
    6423     top: "res5a_branch2b"
    6424     name: "scale5a_branch2b"
    6425     type: "Scale"
    6426     scale_param {
    6427         bias_term: true
    6428     }
    6429 }
    6430 
    6431 layer {
    6432     top: "res5a_branch2b"
    6433     bottom: "res5a_branch2b"
    6434     name: "res5a_branch2b_relu"
    6435     type: "ReLU"
    6436 }
    6437 
    6438 layer {
    6439     bottom: "res5a_branch2b"
    6440     top: "res5a_branch2c"
    6441     name: "res5a_branch2c"
    6442     type: "Convolution"
    6443     convolution_param {
    6444         num_output: 2048
    6445         kernel_size: 1
    6446         pad: 0
    6447         stride: 1
    6448         bias_term: false
    6449     }
    6450 }
    6451 
    6452 layer {
    6453     bottom: "res5a_branch2c"
    6454     top: "res5a_branch2c"
    6455     name: "bn5a_branch2c"
    6456     type: "BatchNorm"
    6457     batch_norm_param {
    6458         use_global_stats: true
    6459     }
    6460 }
    6461 
    6462 layer {
    6463     bottom: "res5a_branch2c"
    6464     top: "res5a_branch2c"
    6465     name: "scale5a_branch2c"
    6466     type: "Scale"
    6467     scale_param {
    6468         bias_term: true
    6469     }
    6470 }
    6471 
    6472 layer {
    6473     bottom: "res5a_branch1"
    6474     bottom: "res5a_branch2c"
    6475     top: "res5a"
    6476     name: "res5a"
    6477     type: "Eltwise"
    6478 }
    6479 
    6480 layer {
    6481     bottom: "res5a"
    6482     top: "res5a"
    6483     name: "res5a_relu"
    6484     type: "ReLU"
    6485 }
    6486 
    6487 layer {
    6488     bottom: "res5a"
    6489     top: "res5b_branch2a"
    6490     name: "res5b_branch2a"
    6491     type: "Convolution"
    6492     convolution_param {
    6493         num_output: 512
    6494         kernel_size: 1
    6495         pad: 0
    6496         stride: 1
    6497         bias_term: false
    6498     }
    6499 }
    6500 
    6501 layer {
    6502     bottom: "res5b_branch2a"
    6503     top: "res5b_branch2a"
    6504     name: "bn5b_branch2a"
    6505     type: "BatchNorm"
    6506     batch_norm_param {
    6507         use_global_stats: true
    6508     }
    6509 }
    6510 
    6511 layer {
    6512     bottom: "res5b_branch2a"
    6513     top: "res5b_branch2a"
    6514     name: "scale5b_branch2a"
    6515     type: "Scale"
    6516     scale_param {
    6517         bias_term: true
    6518     }
    6519 }
    6520 
    6521 layer {
    6522     top: "res5b_branch2a"
    6523     bottom: "res5b_branch2a"
    6524     name: "res5b_branch2a_relu"
    6525     type: "ReLU"
    6526 }
    6527 
    6528 layer {
    6529     bottom: "res5b_branch2a"
    6530     top: "res5b_branch2b"
    6531     name: "res5b_branch2b"
    6532     type: "Convolution"
    6533     convolution_param {
    6534         num_output: 512
    6535         kernel_size: 3
    6536         pad: 1
    6537         stride: 1
    6538         bias_term: false
    6539     }
    6540 }
    6541 
    6542 layer {
    6543     bottom: "res5b_branch2b"
    6544     top: "res5b_branch2b"
    6545     name: "bn5b_branch2b"
    6546     type: "BatchNorm"
    6547     batch_norm_param {
    6548         use_global_stats: true
    6549     }
    6550 }
    6551 
    6552 layer {
    6553     bottom: "res5b_branch2b"
    6554     top: "res5b_branch2b"
    6555     name: "scale5b_branch2b"
    6556     type: "Scale"
    6557     scale_param {
    6558         bias_term: true
    6559     }
    6560 }
    6561 
    6562 layer {
    6563     top: "res5b_branch2b"
    6564     bottom: "res5b_branch2b"
    6565     name: "res5b_branch2b_relu"
    6566     type: "ReLU"
    6567 }
    6568 
    6569 layer {
    6570     bottom: "res5b_branch2b"
    6571     top: "res5b_branch2c"
    6572     name: "res5b_branch2c"
    6573     type: "Convolution"
    6574     convolution_param {
    6575         num_output: 2048
    6576         kernel_size: 1
    6577         pad: 0
    6578         stride: 1
    6579         bias_term: false
    6580     }
    6581 }
    6582 
    6583 layer {
    6584     bottom: "res5b_branch2c"
    6585     top: "res5b_branch2c"
    6586     name: "bn5b_branch2c"
    6587     type: "BatchNorm"
    6588     batch_norm_param {
    6589         use_global_stats: true
    6590     }
    6591 }
    6592 
    6593 layer {
    6594     bottom: "res5b_branch2c"
    6595     top: "res5b_branch2c"
    6596     name: "scale5b_branch2c"
    6597     type: "Scale"
    6598     scale_param {
    6599         bias_term: true
    6600     }
    6601 }
    6602 
    6603 layer {
    6604     bottom: "res5a"
    6605     bottom: "res5b_branch2c"
    6606     top: "res5b"
    6607     name: "res5b"
    6608     type: "Eltwise"
    6609 }
    6610 
    6611 layer {
    6612     bottom: "res5b"
    6613     top: "res5b"
    6614     name: "res5b_relu"
    6615     type: "ReLU"
    6616 }
    6617 
    6618 layer {
    6619     bottom: "res5b"
    6620     top: "res5c_branch2a"
    6621     name: "res5c_branch2a"
    6622     type: "Convolution"
    6623     convolution_param {
    6624         num_output: 512
    6625         kernel_size: 1
    6626         pad: 0
    6627         stride: 1
    6628         bias_term: false
    6629     }
    6630 }
    6631 
    6632 layer {
    6633     bottom: "res5c_branch2a"
    6634     top: "res5c_branch2a"
    6635     name: "bn5c_branch2a"
    6636     type: "BatchNorm"
    6637     batch_norm_param {
    6638         use_global_stats: true
    6639     }
    6640 }
    6641 
    6642 layer {
    6643     bottom: "res5c_branch2a"
    6644     top: "res5c_branch2a"
    6645     name: "scale5c_branch2a"
    6646     type: "Scale"
    6647     scale_param {
    6648         bias_term: true
    6649     }
    6650 }
    6651 
    6652 layer {
    6653     top: "res5c_branch2a"
    6654     bottom: "res5c_branch2a"
    6655     name: "res5c_branch2a_relu"
    6656     type: "ReLU"
    6657 }
    6658 
    6659 layer {
    6660     bottom: "res5c_branch2a"
    6661     top: "res5c_branch2b"
    6662     name: "res5c_branch2b"
    6663     type: "Convolution"
    6664     convolution_param {
    6665         num_output: 512
    6666         kernel_size: 3
    6667         pad: 1
    6668         stride: 1
    6669         bias_term: false
    6670     }
    6671 }
    6672 
    6673 layer {
    6674     bottom: "res5c_branch2b"
    6675     top: "res5c_branch2b"
    6676     name: "bn5c_branch2b"
    6677     type: "BatchNorm"
    6678     batch_norm_param {
    6679         use_global_stats: true
    6680     }
    6681 }
    6682 
    6683 layer {
    6684     bottom: "res5c_branch2b"
    6685     top: "res5c_branch2b"
    6686     name: "scale5c_branch2b"
    6687     type: "Scale"
    6688     scale_param {
    6689         bias_term: true
    6690     }
    6691 }
    6692 
    6693 layer {
    6694     top: "res5c_branch2b"
    6695     bottom: "res5c_branch2b"
    6696     name: "res5c_branch2b_relu"
    6697     type: "ReLU"
    6698 }
    6699 
    6700 layer {
    6701     bottom: "res5c_branch2b"
    6702     top: "res5c_branch2c"
    6703     name: "res5c_branch2c"
    6704     type: "Convolution"
    6705     convolution_param {
    6706         num_output: 2048
    6707         kernel_size: 1
    6708         pad: 0
    6709         stride: 1
    6710         bias_term: false
    6711     }
    6712 }
    6713 
    6714 layer {
    6715     bottom: "res5c_branch2c"
    6716     top: "res5c_branch2c"
    6717     name: "bn5c_branch2c"
    6718     type: "BatchNorm"
    6719     batch_norm_param {
    6720         use_global_stats: true
    6721     }
    6722 }
    6723 
    6724 layer {
    6725     bottom: "res5c_branch2c"
    6726     top: "res5c_branch2c"
    6727     name: "scale5c_branch2c"
    6728     type: "Scale"
    6729     scale_param {
    6730         bias_term: true
    6731     }
    6732 }
    6733 
    6734 layer {
    6735     bottom: "res5b"
    6736     bottom: "res5c_branch2c"
    6737     top: "res5c"
    6738     name: "res5c"
    6739     type: "Eltwise"
    6740 }
    6741 
    6742 layer {
    6743     bottom: "res5c"
    6744     top: "res5c"
    6745     name: "res5c_relu"
    6746     type: "ReLU"
    6747 }
    6748 
    6749 layer {
    6750     bottom: "res5c"
    6751     top: "pool5"
    6752     name: "pool5"
    6753     type: "Pooling"
    6754     pooling_param {
    6755         kernel_size: 7
    6756         stride: 1
    6757         pool: AVE
    6758     }
    6759 }
    6760 
    6761 layer {
    6762     bottom: "pool5"
    6763     top: "fc3"
    6764     name: "fc3"
    6765     type: "InnerProduct"
    6766     inner_product_param {
    6767         num_output: 3
    6768     }
    6769 }
    6770 
    6771 layer {
    6772     bottom: "fc3"
    6773     top: "prob"
    6774     name: "prob"
    6775     type: "Softmax"
    6776 }
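    Every residual unit in the listing above repeats the same bottleneck pattern: a 1x1 convolution that reduces the channels (256 in stage 4, 512 in stage 5), a 3x3 convolution, and a 1x1 convolution that expands them back (1024 / 2048), each followed by BatchNorm (use_global_stats: true) and Scale, and finally an Eltwise sum with the unit's input plus a ReLU. Writing thousands of such lines by hand is error-prone, so one alternative is to generate them with pycaffe's NetSpec. The sketch below emits a single identity-shortcut unit; the helper names, the Input stand-in blob and the generated layer names are my own illustration and do not exactly reproduce the res4b*/bn4b*/scale4b* naming used above.

    # Minimal NetSpec sketch (pycaffe) that generates one bottleneck unit of the
    # kind listed above. Helper names and the Input stand-in are illustrative only.
    import caffe
    from caffe import layers as L

    def conv_bn_scale(ns, name, bottom, num_output, kernel_size, pad, stride, relu=True):
        conv = L.Convolution(bottom, num_output=num_output, kernel_size=kernel_size,
                             pad=pad, stride=stride, bias_term=False)
        bn = L.BatchNorm(conv, use_global_stats=True, in_place=True)
        scale = L.Scale(bn, bias_term=True, in_place=True)
        setattr(ns, name, conv)
        setattr(ns, 'bn_' + name, bn)
        setattr(ns, 'scale_' + name, scale)
        if not relu:
            return scale
        top = L.ReLU(scale, in_place=True)
        setattr(ns, name + '_relu', top)
        return top

    def bottleneck(ns, name, bottom, mid_channels, out_channels):
        # 1x1 reduce -> 3x3 -> 1x1 expand, then identity shortcut + ReLU
        a = conv_bn_scale(ns, name + '_branch2a', bottom, mid_channels, 1, 0, 1)
        b = conv_bn_scale(ns, name + '_branch2b', a, mid_channels, 3, 1, 1)
        c = conv_bn_scale(ns, name + '_branch2c', b, out_channels, 1, 0, 1, relu=False)
        s = L.Eltwise(bottom, c)            # element-wise sum with the block input
        setattr(ns, name, s)
        top = L.ReLU(s, in_place=True)
        setattr(ns, name + '_relu', top)
        return top

    ns = caffe.NetSpec()
    ns.data = L.Input(shape=dict(dim=[1, 1024, 14, 14]))   # stand-in for res4b26
    bottleneck(ns, 'res4b27', ns.data, 256, 1024)
    print(ns.to_proto())                                    # prints the prototxt text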
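    After res5c, the network ends with a 7x7 average pooling layer (pool5), a fully connected layer fc3 with num_output: 3 (one output per class: bigcat, dog, fish) and a Softmax that produces the probability blob prob. Note that a plain Softmax only outputs probabilities; for the training phase Caffe also needs a loss layer (typically SoftmaxWithLoss taking fc3 and the label blob), so make sure one is present in the train/val network. Once training has produced a snapshot, a single image can be classified from Python roughly as follows, assuming a deploy version of the network (with an Input layer named "data" instead of the lmdb data layers) has been written; the deploy path, the snapshot file name and the mean values are placeholders I assume here, so substitute your own.

    # Single-image inference sketch with pycaffe. The deploy prototxt, snapshot
    # file name and mean values below are assumed placeholders -- adjust them.
    import numpy as np
    import caffe

    caffe.set_mode_gpu()
    net = caffe.Net('/home/wy/ResNet152/deploy.prototxt',               # assumed path
                    '/home/wy/ResNet152/model/_iter_40000.caffemodel',  # assumed snapshot
                    caffe.TEST)

    # Preprocess: HxWxC RGB in [0,1] -> CxHxW BGR in [0,255], mean-subtracted.
    transformer = caffe.io.Transformer({'data': net.blobs['data'].data.shape})
    transformer.set_transpose('data', (2, 0, 1))
    transformer.set_raw_scale('data', 255)
    transformer.set_channel_swap('data', (2, 1, 0))
    transformer.set_mean('data', np.array([104.0, 117.0, 123.0]))       # assumed BGR mean

    img = caffe.io.load_image('test.jpg')                               # any test image
    net.blobs['data'].data[...] = transformer.preprocess('data', img)
    prob = net.forward()['prob'][0]

    labels = ['bigcat', 'dog', 'fish']                                  # label order 0/1/2
    print('prediction: %s (%.3f)' % (labels[prob.argmax()], prob.max()))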

       

          solver.prototxt

     1 net: "/home/wy/ResNet152/train_val.prototxt"
     2 test_iter: 1003
     3 test_interval: 4000
     4 test_initialization: false
     5 display: 100
     6 average_loss: 100
     7 base_lr: 0.05
     8 lr_policy: "step"
     9 stepsize: 150000
    10 gamma: 0.1
    11 max_iter: 600000
    12 momentum: 0.9
    13 weight_decay: 0.0001
    14 snapshot: 40000
    15 snapshot_prefix: "/home/wy/ResNet152/model/"
    16 solver_mode: GPU
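    A few notes on these settings: base_lr: 0.05 with lr_policy: "step", stepsize: 150000 and gamma: 0.1 divides the learning rate by 10 every 150,000 iterations until max_iter: 600000; momentum 0.9 and weight_decay 0.0001 are the usual ResNet training values. test_interval: 4000 runs validation every 4,000 training iterations, and test_iter: 1003 sweeps the whole validation set once per test pass provided the validation data layer uses batch_size: 1. snapshot: 40000 writes a .caffemodel/.solverstate pair under snapshot_prefix every 40,000 iterations. Training is usually launched with the caffe command-line tool (caffe train --solver=/path/to/solver.prototxt), but it can also be driven from Python, for example (the solver path below is assumed to be wherever the file above is saved):

    # Driving training from Python instead of the command-line tool.
    import caffe

    caffe.set_device(0)
    caffe.set_mode_gpu()

    solver = caffe.SGDSolver('/home/wy/ResNet152/solver.prototxt')      # assumed path

    # Optional warm start from pre-trained ResNet-152 weights (placeholder path):
    # solver.net.copy_from('/home/wy/ResNet152/ResNet-152-model.caffemodel')

    solver.solve()   # runs until max_iter, snapshotting every 40000 iterations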

    As a technical novice, I consulted many blog posts and papers while writing this note, and I would like to thank their authors here. Also, please do not repost without permission.
