  • Debugging the Mask R-CNN shapes dataset example

    The shapes dataset example

      To see clearly how the tensor shapes change through the network, we print the model summary with model.summary(). Concretely, open model.py in the mrcnn folder and add the following right after the model is compiled:

    # Compile
    self.keras_model.compile(
        optimizer=optimizer,
        loss=[None] * len(self.keras_model.outputs))
    self.keras_model.summary()

      Below is the summary printed when running the shapes dataset test. (Here I changed the Input's spatial size from None to 128 so the shapes are easier to follow; a sketch of that change is shown below.)
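      For reference, that change is a single line in MaskRCNN.build() in mrcnn/model.py; the exact signature varies by version, so treat the following as a sketch rather than the verbatim line:

    import keras.layers as KL

    # Sketch only: build() normally declares the image input with dynamic
    # spatial dimensions, which makes every spatial size in the summary print
    # as None. Hard-coding the resolution used by the shapes config (128x128x3)
    # lets model.summary() show concrete shapes.
    # Roughly the original:
    # input_image = KL.Input(shape=[None, None, 3], name="input_image")
    input_image = KL.Input(shape=[128, 128, 3], name="input_image")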

    __________________________________________________________________________________________________
    Layer (type)                Output Shape   Param #     Connected to                     
    ==================================================================================================
    input_image (InputLayer) (None, 128, 128, 3) 0

    stage1:
    zero_padding2d_1 (ZeroPadding2D) (None, 134, 134, 3) 0 input_image[0][0]
    conv1 (Conv2D) (None, 64, 64, 64) 9472 zero_padding2d_1[0][0]
    bn_conv1 (BatchNorm) (None, 64, 64, 64) 256 conv1[0][0]
    activation_1 (Activation) (None, 64, 64, 64) 0 bn_conv1[0][0]
    max_pooling2d_1 (MaxPooling2D) (None, 32, 32, 64) 0 activation_1[0][0]
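    These five rows are the standard ResNet stem in resnet_graph(): 3-pixel zero padding, a 7x7 stride-2 convolution (128 -> 64), BatchNorm + ReLU, then a 3x3 stride-2 max-pool (64 -> 32). A minimal standalone sketch (not the verbatim library code; mrcnn uses its own BatchNorm subclass):

    import keras.layers as KL

    def resnet_stage1(input_image):
        """Stage-1 stem as printed above (sketch)."""
        x = KL.ZeroPadding2D((3, 3))(input_image)            # 128 -> 134
        x = KL.Conv2D(64, (7, 7), strides=(2, 2),
                      name='conv1', use_bias=True)(x)        # 134 -> 64, 9472 params
        x = KL.BatchNormalization(name='bn_conv1')(x)        # mrcnn wraps this as BatchNorm
        x = KL.Activation('relu')(x)
        # 3x3/2 max-pool with "same" padding: 64 -> 32 (input to stage 2)
        return KL.MaxPooling2D((3, 3), strides=(2, 2), padding="same")(x)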


    stage2:
    res2a_branch2a (Conv2D) (None, 32, 32, 64) 4160 max_pooling2d_1[0][0]
    bn2a_branch2a (BatchNorm) (None, 32, 32, 64) 256 res2a_branch2a[0][0]
    activation_2 (Activation) (None, 32, 32, 64) 0 bn2a_branch2a[0][0]
    res2a_branch2b (Conv2D) (None, 32, 32, 64) 36928 activation_2[0][0]
    bn2a_branch2b (BatchNorm) (None, 32, 32, 64) 256 res2a_branch2b[0][0]
    activation_3 (Activation) (None, 32, 32, 64) 0 bn2a_branch2b[0][0]
    res2a_branch2c (Conv2D) (None, 32, 32, 256) 16640 activation_3[0][0]
    res2a_branch1 (Conv2D) (None, 32, 32, 256) 16640 max_pooling2d_1[0][0]
    bn2a_branch2c (BatchNorm) (None, 32, 32, 256) 1024 res2a_branch2c[0][0]
    bn2a_branch1 (BatchNorm) (None, 32, 32, 256) 1024 res2a_branch1[0][0]
    add_1 (Add) (None, 32, 32, 256) 0 bn2a_branch2c[0][0] bn2a_branch1[0][0]
    res2a_out (Activation) (None, 32, 32, 256) 0 add_1[0][0]
    The layers above form a conv_block (see the sketch after stage 2).
    res2b_branch2a (Conv2D) (None, 32, 32, 64) 16448 res2a_out[0][0]
    bn2b_branch2a (BatchNorm) (None, 32, 32, 64) 256 res2b_branch2a[0][0]
    activation_4 (Activation) (None, 32, 32, 64) 0 bn2b_branch2a[0][0]
    res2b_branch2b (Conv2D) (None, 32, 32, 64) 36928 activation_4[0][0]
    bn2b_branch2b (BatchNorm) (None, 32, 32, 64) 256 res2b_branch2b[0][0]
    activation_5 (Activation) (None, 32, 32, 64) 0 bn2b_branch2b[0][0]
    res2b_branch2c (Conv2D) (None, 32, 32, 256) 16640 activation_5[0][0]
    bn2b_branch2c (BatchNorm) (None, 32, 32, 256) 1024 res2b_branch2c[0][0]
    add_2 (Add) (None, 32, 32, 256) 0 bn2b_branch2c[0][0] res2a_out[0][0]
    res2b_out (Activation) (None, 32, 32, 256) 0 add_2[0][0]
    An identity_block ends here. The block below produces C2, the feature map that the FPN later projects with a 1x1 convolution (fpn_c2p2) and adds to the upsampled P3.
    res2c_branch2a (Conv2D) (None, 32, 32, 64) 16448 res2b_out[0][0]
    bn2c_branch2a (BatchNorm) (None, 32, 32, 64) 256 res2c_branch2a[0][0]
    activation_6 (Activation) (None, 32, 32, 64) 0 bn2c_branch2a[0][0]
    res2c_branch2b (Conv2D) (None, 32, 32, 64) 36928 activation_6[0][0]
    bn2c_branch2b (BatchNorm) (None, 32, 32, 64) 256 res2c_branch2b[0][0]
    activation_7 (Activation) (None, 32, 32, 64) 0 bn2c_branch2b[0][0]
    res2c_branch2c (Conv2D) (None, 32, 32, 256) 16640 activation_7[0][0]
    bn2c_branch2c (BatchNorm) (None, 32, 32, 256) 1024 res2c_branch2c[0][0]
    add_3 (Add) (None, 32, 32, 256) 0 bn2c_branch2c[0][0] res2b_out[0][0]
    res2c_out (Activation) (None, 32, 32, 256) 0 add_3[0][0]
    C2=res2c_out
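    The two block types labelled above correspond to conv_block() and identity_block() in mrcnn/model.py. Both are 1x1 -> 3x3 -> 1x1 bottlenecks with a residual Add; conv_block additionally pushes the shortcut through a 1x1 convolution (the res*_branch1 rows) so that channel count and stride match, while identity_block adds its input unchanged. A simplified sketch, not the verbatim source:

    import keras.layers as KL

    def bottleneck(x, filters, stage, block, shortcut_conv=False, strides=(1, 1)):
        """Sketch of the ResNet bottleneck behind conv_block/identity_block.

        filters = (f1, f2, f3), e.g. (64, 64, 256) for stage 2.
        shortcut_conv=True mimics conv_block (res*_branch1 in the summary);
        False mimics identity_block, which adds the input unchanged.
        """
        name = 'res{}{}_branch'.format(stage, block)
        bn = 'bn{}{}_branch'.format(stage, block)
        f1, f2, f3 = filters

        y = KL.Conv2D(f1, (1, 1), strides=strides, name=name + '2a')(x)
        y = KL.BatchNormalization(name=bn + '2a')(y)
        y = KL.Activation('relu')(y)

        y = KL.Conv2D(f2, (3, 3), padding='same', name=name + '2b')(y)
        y = KL.BatchNormalization(name=bn + '2b')(y)
        y = KL.Activation('relu')(y)

        y = KL.Conv2D(f3, (1, 1), name=name + '2c')(y)
        y = KL.BatchNormalization(name=bn + '2c')(y)

        shortcut = x
        if shortcut_conv:  # conv_block: 1x1 projection on the shortcut branch
            shortcut = KL.Conv2D(f3, (1, 1), strides=strides, name=name + '1')(x)
            shortcut = KL.BatchNormalization(name=bn + '1')(shortcut)

        y = KL.Add()([y, shortcut])
        return KL.Activation('relu', name='res{}{}_out'.format(stage, block))(y)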


    stage3:
    res3a_branch2a (Conv2D) (None, 16, 16, 128) 32896 res2c_out[0][0]
    bn3a_branch2a (BatchNorm) (None, 16, 16, 128) 512 res3a_branch2a[0][0]
    activation_8 (Activation) (None, 16, 16, 128) 0 bn3a_branch2a[0][0]
    res3a_branch2b (Conv2D) (None, 16, 16, 128) 147584 activation_8[0][0]
    bn3a_branch2b (BatchNorm) (None, 16, 16, 128) 512 res3a_branch2b[0][0]
    activation_9 (Activation) (None, 16, 16, 128) 0 bn3a_branch2b[0][0]
    res3a_branch2c (Conv2D) (None, 16, 16, 512) 66048 activation_9[0][0]
    res3a_branch1 (Conv2D) (None, 16, 16, 512) 131584 res2c_out[0][0]
    bn3a_branch2c (BatchNorm) (None, 16, 16, 512) 2048 res3a_branch2c[0][0]
    bn3a_branch1 (BatchNorm) (None, 16, 16, 512) 2048 res3a_branch1[0][0]
    add_4 (Add) (None, 16, 16, 512) 0 bn3a_branch2c[0][0] bn3a_branch1[0][0]
    res3a_out (Activation) (None, 16, 16, 512) 0 add_4[0][0]
    res3b_branch2a (Conv2D) (None, 16, 16, 128) 65664 res3a_out[0][0]
    bn3b_branch2a (BatchNorm) (None, 16, 16, 128) 512 res3b_branch2a[0][0]
    activation_10 (Activation) (None, 16, 16, 128) 0 bn3b_branch2a[0][0]
    res3b_branch2b (Conv2D) (None, 16, 16, 128) 147584 activation_10[0][0]
    bn3b_branch2b (BatchNorm) (None, 16, 16, 128) 512 res3b_branch2b[0][0]
    activation_11 (Activation) (None, 16, 16, 128) 0 bn3b_branch2b[0][0]
    res3b_branch2c (Conv2D) (None, 16, 16, 512) 66048 activation_11[0][0]
    bn3b_branch2c (BatchNorm) (None, 16, 16, 512) 2048 res3b_branch2c[0][0]
    add_5 (Add) (None, 16, 16, 512) 0 bn3b_branch2c[0][0] res3a_out[0][0]
    res3b_out (Activation) (None, 16, 16, 512) 0 add_5[0][0]
    res3c_branch2a (Conv2D) (None, 16, 16, 128) 65664 res3b_out[0][0]
    bn3c_branch2a (BatchNorm) (None, 16, 16, 128) 512 res3c_branch2a[0][0]
    activation_12 (Activation) (None, 16, 16, 128) 0 bn3c_branch2a[0][0]
    res3c_branch2b (Conv2D) (None, 16, 16, 128) 147584 activation_12[0][0]
    bn3c_branch2b (BatchNorm) (None, 16, 16, 128) 512 res3c_branch2b[0][0]
    activation_13 (Activation) (None, 16, 16, 128) 0 bn3c_branch2b[0][0]
    res3c_branch2c (Conv2D) (None, 16, 16, 512) 66048 activation_13[0][0]
    bn3c_branch2c (BatchNorm) (None, 16, 16, 512) 2048 res3c_branch2c[0][0]
    add_6 (Add) (None, 16, 16, 512) 0 bn3c_branch2c[0][0] res3b_out[0][0]
    res3c_out (Activation) (None, 16, 16, 512) 0 add_6[0][0]
    res3d_branch2a (Conv2D) (None, 16, 16, 128) 65664 res3c_out[0][0]
    bn3d_branch2a (BatchNorm) (None, 16, 16, 128) 512 res3d_branch2a[0][0]
    activation_14 (Activation) (None, 16, 16, 128) 0 bn3d_branch2a[0][0]
    res3d_branch2b (Conv2D) (None, 16, 16, 128) 147584 activation_14[0][0]
    bn3d_branch2b (BatchNorm) (None, 16, 16, 128) 512 res3d_branch2b[0][0]
    activation_15 (Activation) (None, 16, 16, 128) 0 bn3d_branch2b[0][0]
    res3d_branch2c (Conv2D) (None, 16, 16, 512) 66048 activation_15[0][0]
    bn3d_branch2c (BatchNorm) (None, 16, 16, 512) 2048 res3d_branch2c[0][0]
    add_7 (Add) (None, 16, 16, 512) 0 bn3d_branch2c[0][0] res3c_out[0][0]
    res3d_out (Activation) (None, 16, 16, 512) 0 add_7[0][0]
    C3=res3d_out


    stage4:
    res4a_branch2a (Conv2D) (None, 8, 8, 256) 131328 res3d_out[0][0]
    bn4a_branch2a (BatchNorm) (None, 8, 8, 256) 1024 res4a_branch2a[0][0]
    activation_16 (Activation) (None, 8, 8, 256) 0 bn4a_branch2a[0][0]
    res4a_branch2b (Conv2D) (None, 8, 8, 256) 590080 activation_16[0][0]
    bn4a_branch2b (BatchNorm) (None, 8, 8, 256) 1024 res4a_branch2b[0][0]
    activation_17 (Activation) (None, 8, 8, 256) 0 bn4a_branch2b[0][0]
    res4a_branch2c (Conv2D) (None, 8, 8, 1024) 263168 activation_17[0][0]
    res4a_branch1 (Conv2D) (None, 8, 8, 1024) 525312 res3d_out[0][0]
    bn4a_branch2c (BatchNorm) (None, 8, 8, 1024) 4096 res4a_branch2c[0][0]
    bn4a_branch1 (BatchNorm) (None, 8, 8, 1024) 4096 res4a_branch1[0][0]
    add_8 (Add) (None, 8, 8, 1024) 0 bn4a_branch2c[0][0] bn4a_branch1[0][0]
    res4a_out (Activation) (None, 8, 8, 1024) 0 add_8[0][0]
    res4b_branch2a (Conv2D) (None, 8, 8, 256) 262400 res4a_out[0][0]
    bn4b_branch2a (BatchNorm) (None, 8, 8, 256) 1024 res4b_branch2a[0][0]
    activation_18 (Activation) (None, 8, 8, 256) 0 bn4b_branch2a[0][0]
    res4b_branch2b (Conv2D) (None, 8, 8, 256) 590080 activation_18[0][0]
    bn4b_branch2b (BatchNorm) (None, 8, 8, 256) 1024 res4b_branch2b[0][0]
    activation_19 (Activation) (None, 8, 8, 256) 0 bn4b_branch2b[0][0]
    res4b_branch2c (Conv2D) (None, 8, 8, 1024) 263168 activation_19[0][0]
    bn4b_branch2c (BatchNorm) (None, 8, 8, 1024) 4096 res4b_branch2c[0][0]
    add_9 (Add) (None, 8, 8, 1024) 0 bn4b_branch2c[0][0] res4a_out[0][0]
    res4b_out (Activation) (None, 8, 8, 1024) 0 add_9[0][0]
    res4c_branch2a (Conv2D) (None, 8, 8, 256) 262400 res4b_out[0][0]
    bn4c_branch2a (BatchNorm) (None, 8, 8, 256) 1024 res4c_branch2a[0][0]
    activation_20 (Activation) (None, 8, 8, 256) 0 bn4c_branch2a[0][0]
    res4c_branch2b (Conv2D) (None, 8, 8, 256) 590080 activation_20[0][0]
    bn4c_branch2b (BatchNorm) (None, 8, 8, 256) 1024 res4c_branch2b[0][0]
    activation_21 (Activation) (None, 8, 8, 256) 0 bn4c_branch2b[0][0]
    res4c_branch2c (Conv2D) (None, 8, 8, 1024) 263168 activation_21[0][0]
    bn4c_branch2c (BatchNorm) (None, 8, 8, 1024) 4096 res4c_branch2c[0][0]
    add_10 (Add) (None, 8, 8, 1024) 0 bn4c_branch2c[0][0] res4b_out[0][0]
    res4c_out (Activation) (None, 8, 8, 1024) 0 add_10[0][0]
    res4d_branch2a (Conv2D) (None, 8, 8, 256) 262400 res4c_out[0][0]
    bn4d_branch2a (BatchNorm) (None, 8, 8, 256) 1024 res4d_branch2a[0][0]
    activation_22 (Activation) (None, 8, 8, 256) 0 bn4d_branch2a[0][0]
    res4d_branch2b (Conv2D) (None, 8, 8, 256) 590080 activation_22[0][0]
    bn4d_branch2b (BatchNorm) (None, 8, 8, 256) 1024 res4d_branch2b[0][0]
    activation_23 (Activation) (None, 8, 8, 256) 0 bn4d_branch2b[0][0]
    res4d_branch2c (Conv2D) (None, 8, 8, 1024) 263168 activation_23[0][0]
    bn4d_branch2c (BatchNorm) (None, 8, 8, 1024) 4096 res4d_branch2c[0][0]
    add_11 (Add) (None, 8, 8, 1024) 0 bn4d_branch2c[0][0] res4c_out[0][0]
    res4d_out (Activation) (None, 8, 8, 1024) 0 add_11[0][0]
    res4e_branch2a (Conv2D) (None, 8, 8, 256) 262400 res4d_out[0][0]
    bn4e_branch2a (BatchNorm) (None, 8, 8, 256) 1024 res4e_branch2a[0][0]
    activation_24 (Activation) (None, 8, 8, 256) 0 bn4e_branch2a[0][0]
    res4e_branch2b (Conv2D) (None, 8, 8, 256) 590080 activation_24[0][0]
    bn4e_branch2b (BatchNorm) (None, 8, 8, 256) 1024 res4e_branch2b[0][0]
    activation_25 (Activation) (None, 8, 8, 256) 0 bn4e_branch2b[0][0]
    res4e_branch2c (Conv2D) (None, 8, 8, 1024) 263168 activation_25[0][0]
    bn4e_branch2c (BatchNorm) (None, 8, 8, 1024) 4096 res4e_branch2c[0][0]
    add_12 (Add) (None, 8, 8, 1024) 0 bn4e_branch2c[0][0] res4d_out[0][0]
    res4e_out (Activation) (None, 8, 8, 1024) 0 add_12[0][0]
    res4f_branch2a (Conv2D) (None, 8, 8, 256) 262400 res4e_out[0][0]
    bn4f_branch2a (BatchNorm) (None, 8, 8, 256) 1024 res4f_branch2a[0][0]
    activation_26 (Activation) (None, 8, 8, 256) 0 bn4f_branch2a[0][0]
    res4f_branch2b (Conv2D) (None, 8, 8, 256) 590080 activation_26[0][0]
    bn4f_branch2b (BatchNorm) (None, 8, 8, 256) 1024 res4f_branch2b[0][0]
    activation_27 (Activation) (None, 8, 8, 256) 0 bn4f_branch2b[0][0]
    res4f_branch2c (Conv2D) (None, 8, 8, 1024) 263168 activation_27[0][0]
    bn4f_branch2c (BatchNorm) (None, 8, 8, 1024) 4096 res4f_branch2c[0][0]
    add_13 (Add) (None, 8, 8, 1024) 0 bn4f_branch2c[0][0] res4e_out[0][0]
    res4f_out (Activation) (None, 8, 8, 1024) 0 add_13[0][0]
    res4g_branch2a (Conv2D) (None, 8, 8, 256) 262400 res4f_out[0][0]
    bn4g_branch2a (BatchNorm) (None, 8, 8, 256) 1024 res4g_branch2a[0][0]
    activation_28 (Activation) (None, 8, 8, 256) 0 bn4g_branch2a[0][0]
    res4g_branch2b (Conv2D) (None, 8, 8, 256) 590080 activation_28[0][0]
    bn4g_branch2b (BatchNorm) (None, 8, 8, 256) 1024 res4g_branch2b[0][0]
    activation_29 (Activation) (None, 8, 8, 256) 0 bn4g_branch2b[0][0]
    res4g_branch2c (Conv2D) (None, 8, 8, 1024) 263168 activation_29[0][0]
    bn4g_branch2c (BatchNorm) (None, 8, 8, 1024) 4096 res4g_branch2c[0][0]
    add_14 (Add) (None, 8, 8, 1024) 0 bn4g_branch2c[0][0] res4f_out[0][0]
    res4g_out (Activation) (None, 8, 8, 1024) 0 add_14[0][0]
    res4h_branch2a (Conv2D) (None, 8, 8, 256) 262400 res4g_out[0][0]
    bn4h_branch2a (BatchNorm) (None, 8, 8, 256) 1024 res4h_branch2a[0][0]
    activation_30 (Activation) (None, 8, 8, 256) 0 bn4h_branch2a[0][0]
    res4h_branch2b (Conv2D) (None, 8, 8, 256) 590080 activation_30[0][0]
    bn4h_branch2b (BatchNorm) (None, 8, 8, 256) 1024 res4h_branch2b[0][0]
    activation_31 (Activation) (None, 8, 8, 256) 0 bn4h_branch2b[0][0]
    res4h_branch2c (Conv2D) (None, 8, 8, 1024) 263168 activation_31[0][0]
    bn4h_branch2c (BatchNorm) (None, 8, 8, 1024) 4096 res4h_branch2c[0][0]
    add_15 (Add) (None, 8, 8, 1024) 0 bn4h_branch2c[0][0] res4g_out[0][0]
    res4h_out (Activation) (None, 8, 8, 1024) 0 add_15[0][0]
    res4i_branch2a (Conv2D) (None, 8, 8, 256) 262400 res4h_out[0][0]
    bn4i_branch2a (BatchNorm) (None, 8, 8, 256) 1024 res4i_branch2a[0][0]
    activation_32 (Activation) (None, 8, 8, 256) 0 bn4i_branch2a[0][0]
    res4i_branch2b (Conv2D) (None, 8, 8, 256) 590080 activation_32[0][0]
    bn4i_branch2b (BatchNorm) (None, 8, 8, 256) 1024 res4i_branch2b[0][0]
    activation_33 (Activation) (None, 8, 8, 256) 0 bn4i_branch2b[0][0]
    res4i_branch2c (Conv2D) (None, 8, 8, 1024) 263168 activation_33[0][0]
    bn4i_branch2c (BatchNorm) (None, 8, 8, 1024) 4096 res4i_branch2c[0][0]
    add_16 (Add) (None, 8, 8, 1024) 0 bn4i_branch2c[0][0] res4h_out[0][0]
    res4i_out (Activation) (None, 8, 8, 1024) 0 add_16[0][0]
    res4j_branch2a (Conv2D) (None, 8, 8, 256) 262400 res4i_out[0][0]
    bn4j_branch2a (BatchNorm) (None, 8, 8, 256) 1024 res4j_branch2a[0][0]
    activation_34 (Activation) (None, 8, 8, 256) 0 bn4j_branch2a[0][0]
    res4j_branch2b (Conv2D) (None, 8, 8, 256) 590080 activation_34[0][0]
    bn4j_branch2b (BatchNorm) (None, 8, 8, 256) 1024 res4j_branch2b[0][0]
    activation_35 (Activation) (None, 8, 8, 256) 0 bn4j_branch2b[0][0]
    res4j_branch2c (Conv2D) (None, 8, 8, 1024) 263168 activation_35[0][0]
    bn4j_branch2c (BatchNorm) (None, 8, 8, 1024) 4096 res4j_branch2c[0][0]
    add_17 (Add) (None, 8, 8, 1024) 0 bn4j_branch2c[0][0] res4i_out[0][0]
    res4j_out (Activation) (None, 8, 8, 1024) 0 add_17[0][0]
    res4k_branch2a (Conv2D) (None, 8, 8, 256) 262400 res4j_out[0][0]
    bn4k_branch2a (BatchNorm) (None, 8, 8, 256) 1024 res4k_branch2a[0][0]
    activation_36 (Activation) (None, 8, 8, 256) 0 bn4k_branch2a[0][0]
    res4k_branch2b (Conv2D) (None, 8, 8, 256) 590080 activation_36[0][0]
    bn4k_branch2b (BatchNorm) (None, 8, 8, 256) 1024 res4k_branch2b[0][0]
    activation_37 (Activation) (None, 8, 8, 256) 0 bn4k_branch2b[0][0]
    res4k_branch2c (Conv2D) (None, 8, 8, 1024) 263168 activation_37[0][0]
    bn4k_branch2c (BatchNorm) (None, 8, 8, 1024) 4096 res4k_branch2c[0][0]
    add_18 (Add) (None, 8, 8, 1024) 0 bn4k_branch2c[0][0] res4j_out[0][0]
    res4k_out (Activation) (None, 8, 8, 1024) 0 add_18[0][0]
    res4l_branch2a (Conv2D) (None, 8, 8, 256) 262400 res4k_out[0][0]
    bn4l_branch2a (BatchNorm) (None, 8, 8, 256) 1024 res4l_branch2a[0][0]
    activation_38 (Activation) (None, 8, 8, 256) 0 bn4l_branch2a[0][0]
    res4l_branch2b (Conv2D) (None, 8, 8, 256) 590080 activation_38[0][0]
    bn4l_branch2b (BatchNorm) (None, 8, 8, 256) 1024 res4l_branch2b[0][0]
    activation_39 (Activation) (None, 8, 8, 256) 0 bn4l_branch2b[0][0]
    res4l_branch2c (Conv2D) (None, 8, 8, 1024) 263168 activation_39[0][0]
    bn4l_branch2c (BatchNorm) (None, 8, 8, 1024) 4096 res4l_branch2c[0][0]
    add_19 (Add) (None, 8, 8, 1024) 0 bn4l_branch2c[0][0] res4k_out[0][0]
    res4l_out (Activation) (None, 8, 8, 1024) 0 add_19[0][0]
    res4m_branch2a (Conv2D) (None, 8, 8, 256) 262400 res4l_out[0][0]
    bn4m_branch2a (BatchNorm) (None, 8, 8, 256) 1024 res4m_branch2a[0][0]
    activation_40 (Activation) (None, 8, 8, 256) 0 bn4m_branch2a[0][0]
    res4m_branch2b (Conv2D) (None, 8, 8, 256) 590080 activation_40[0][0]
    bn4m_branch2b (BatchNorm) (None, 8, 8, 256) 1024 res4m_branch2b[0][0]
    activation_41 (Activation) (None, 8, 8, 256) 0 bn4m_branch2b[0][0]
    res4m_branch2c (Conv2D) (None, 8, 8, 1024) 263168 activation_41[0][0]
    bn4m_branch2c (BatchNorm) (None, 8, 8, 1024) 4096 res4m_branch2c[0][0]
    add_20 (Add) (None, 8, 8, 1024) 0 bn4m_branch2c[0][0] res4l_out[0][0]
    res4m_out (Activation) (None, 8, 8, 1024) 0 add_20[0][0]
    res4n_branch2a (Conv2D) (None, 8, 8, 256) 262400 res4m_out[0][0]
    bn4n_branch2a (BatchNorm) (None, 8, 8, 256) 1024 res4n_branch2a[0][0]
    activation_42 (Activation) (None, 8, 8, 256) 0 bn4n_branch2a[0][0]
    res4n_branch2b (Conv2D) (None, 8, 8, 256) 590080 activation_42[0][0]
    bn4n_branch2b (BatchNorm) (None, 8, 8, 256) 1024 res4n_branch2b[0][0]
    activation_43 (Activation) (None, 8, 8, 256) 0 bn4n_branch2b[0][0]
    res4n_branch2c (Conv2D) (None, 8, 8, 1024) 263168 activation_43[0][0]
    bn4n_branch2c (BatchNorm) (None, 8, 8, 1024) 4096 res4n_branch2c[0][0]
    add_21 (Add) (None, 8, 8, 1024) 0 bn4n_branch2c[0][0] res4m_out[0][0]
    res4n_out (Activation) (None, 8, 8, 1024) 0 add_21[0][0]
    res4o_branch2a (Conv2D) (None, 8, 8, 256) 262400 res4n_out[0][0]
    bn4o_branch2a (BatchNorm) (None, 8, 8, 256) 1024 res4o_branch2a[0][0]
    activation_44 (Activation) (None, 8, 8, 256) 0 bn4o_branch2a[0][0]
    res4o_branch2b (Conv2D) (None, 8, 8, 256) 590080 activation_44[0][0]
    bn4o_branch2b (BatchNorm) (None, 8, 8, 256) 1024 res4o_branch2b[0][0]
    activation_45 (Activation) (None, 8, 8, 256) 0 bn4o_branch2b[0][0]
    res4o_branch2c (Conv2D) (None, 8, 8, 1024) 263168 activation_45[0][0]
    bn4o_branch2c (BatchNorm) (None, 8, 8, 1024) 4096 res4o_branch2c[0][0]
    add_22 (Add) (None, 8, 8, 1024) 0 bn4o_branch2c[0][0] res4n_out[0][0]
    res4o_out (Activation) (None, 8, 8, 1024) 0 add_22[0][0]
    res4p_branch2a (Conv2D) (None, 8, 8, 256) 262400 res4o_out[0][0]
    bn4p_branch2a (BatchNorm) (None, 8, 8, 256) 1024 res4p_branch2a[0][0]
    activation_46 (Activation) (None, 8, 8, 256) 0 bn4p_branch2a[0][0]
    res4p_branch2b (Conv2D) (None, 8, 8, 256) 590080 activation_46[0][0]
    bn4p_branch2b (BatchNorm) (None, 8, 8, 256) 1024 res4p_branch2b[0][0]
    activation_47 (Activation) (None, 8, 8, 256) 0 bn4p_branch2b[0][0]
    res4p_branch2c (Conv2D) (None, 8, 8, 1024) 263168 activation_47[0][0]
    bn4p_branch2c (BatchNorm) (None, 8, 8, 1024) 4096 res4p_branch2c[0][0]
    add_23 (Add) (None, 8, 8, 1024) 0 bn4p_branch2c[0][0] res4o_out[0][0]
    res4p_out (Activation) (None, 8, 8, 1024) 0 add_23[0][0]
    res4q_branch2a (Conv2D) (None, 8, 8, 256) 262400 res4p_out[0][0]
    bn4q_branch2a (BatchNorm) (None, 8, 8, 256) 1024 res4q_branch2a[0][0]
    activation_48 (Activation) (None, 8, 8, 256) 0 bn4q_branch2a[0][0]
    res4q_branch2b (Conv2D) (None, 8, 8, 256) 590080 activation_48[0][0]
    bn4q_branch2b (BatchNorm) (None, 8, 8, 256) 1024 res4q_branch2b[0][0]
    activation_49 (Activation) (None, 8, 8, 256) 0 bn4q_branch2b[0][0]
    res4q_branch2c (Conv2D) (None, 8, 8, 1024) 263168 activation_49[0][0]
    bn4q_branch2c (BatchNorm) (None, 8, 8, 1024) 4096 res4q_branch2c[0][0]
    add_24 (Add) (None, 8, 8, 1024) 0 bn4q_branch2c[0][0] res4p_out[0][0]
    res4q_out (Activation) (None, 8, 8, 1024) 0 add_24[0][0]
    res4r_branch2a (Conv2D) (None, 8, 8, 256) 262400 res4q_out[0][0]
    bn4r_branch2a (BatchNorm) (None, 8, 8, 256) 1024 res4r_branch2a[0][0]
    activation_50 (Activation) (None, 8, 8, 256) 0 bn4r_branch2a[0][0]
    res4r_branch2b (Conv2D) (None, 8, 8, 256) 590080 activation_50[0][0]
    bn4r_branch2b (BatchNorm) (None, 8, 8, 256) 1024 res4r_branch2b[0][0]
    activation_51 (Activation) (None, 8, 8, 256) 0 bn4r_branch2b[0][0]
    res4r_branch2c (Conv2D) (None, 8, 8, 1024) 263168 activation_51[0][0]
    bn4r_branch2c (BatchNorm) (None, 8, 8, 1024) 4096 res4r_branch2c[0][0]
    add_25 (Add) (None, 8, 8, 1024) 0 bn4r_branch2c[0][0] res4q_out[0][0]
    res4r_out (Activation) (None, 8, 8, 1024) 0 add_25[0][0]
    res4s_branch2a (Conv2D) (None, 8, 8, 256) 262400 res4r_out[0][0]
    bn4s_branch2a (BatchNorm) (None, 8, 8, 256) 1024 res4s_branch2a[0][0]
    activation_52 (Activation) (None, 8, 8, 256) 0 bn4s_branch2a[0][0]
    res4s_branch2b (Conv2D) (None, 8, 8, 256) 590080 activation_52[0][0]
    bn4s_branch2b (BatchNorm) (None, 8, 8, 256) 1024 res4s_branch2b[0][0]
    activation_53 (Activation) (None, 8, 8, 256) 0 bn4s_branch2b[0][0]
    res4s_branch2c (Conv2D) (None, 8, 8, 1024) 263168 activation_53[0][0]
    bn4s_branch2c (BatchNorm) (None, 8, 8, 1024) 4096 res4s_branch2c[0][0]
    add_26 (Add) (None, 8, 8, 1024) 0 bn4s_branch2c[0][0] res4r_out[0][0]
    res4s_out (Activation) (None, 8, 8, 1024) 0 add_26[0][0]
    res4t_branch2a (Conv2D) (None, 8, 8, 256) 262400 res4s_out[0][0]
    bn4t_branch2a (BatchNorm) (None, 8, 8, 256) 1024 res4t_branch2a[0][0]
    activation_54 (Activation) (None, 8, 8, 256) 0 bn4t_branch2a[0][0]
    res4t_branch2b (Conv2D) (None, 8, 8, 256) 590080 activation_54[0][0]
    bn4t_branch2b (BatchNorm) (None, 8, 8, 256) 1024 res4t_branch2b[0][0]
    activation_55 (Activation) (None, 8, 8, 256) 0 bn4t_branch2b[0][0]
    res4t_branch2c (Conv2D) (None, 8, 8, 1024) 263168 activation_55[0][0]
    bn4t_branch2c (BatchNorm) (None, 8, 8, 1024) 4096 res4t_branch2c[0][0]
    add_27 (Add) (None, 8, 8, 1024) 0 bn4t_branch2c[0][0] res4s_out[0][0]
    res4t_out (Activation) (None, 8, 8, 1024) 0 add_27[0][0]
    res4u_branch2a (Conv2D) (None, 8, 8, 256) 262400 res4t_out[0][0]
    bn4u_branch2a (BatchNorm) (None, 8, 8, 256) 1024 res4u_branch2a[0][0]
    activation_56 (Activation) (None, 8, 8, 256) 0 bn4u_branch2a[0][0]
    res4u_branch2b (Conv2D) (None, 8, 8, 256) 590080 activation_56[0][0]
    bn4u_branch2b (BatchNorm) (None, 8, 8, 256) 1024 res4u_branch2b[0][0]
    activation_57 (Activation) (None, 8, 8, 256) 0 bn4u_branch2b[0][0]
    res4u_branch2c (Conv2D) (None, 8, 8, 1024) 263168 activation_57[0][0]
    bn4u_branch2c (BatchNorm) (None, 8, 8, 1024) 4096 res4u_branch2c[0][0]
    add_28 (Add) (None, 8, 8, 1024) 0 bn4u_branch2c[0][0] res4t_out[0][0]
    res4u_out (Activation) (None, 8, 8, 1024) 0 add_28[0][0]
    res4v_branch2a (Conv2D) (None, 8, 8, 256) 262400 res4u_out[0][0]
    bn4v_branch2a (BatchNorm) (None, 8, 8, 256) 1024 res4v_branch2a[0][0]
    activation_58 (Activation) (None, 8, 8, 256) 0 bn4v_branch2a[0][0]
    res4v_branch2b (Conv2D) (None, 8, 8, 256) 590080 activation_58[0][0]
    bn4v_branch2b (BatchNorm) (None, 8, 8, 256) 1024 res4v_branch2b[0][0]
    activation_59 (Activation) (None, 8, 8, 256) 0 bn4v_branch2b[0][0]
    res4v_branch2c (Conv2D) (None, 8, 8, 1024) 263168 activation_59[0][0]
    bn4v_branch2c (BatchNorm) (None, 8, 8, 1024) 4096 res4v_branch2c[0][0]
    add_29 (Add) (None, 8, 8, 1024) 0 bn4v_branch2c[0][0] res4u_out[0][0]
    res4v_out (Activation) (None, 8, 8, 1024) 0 add_29[0][0]
    res4w_branch2a (Conv2D) (None, 8, 8, 256) 262400 res4v_out[0][0]
    bn4w_branch2a (BatchNorm) (None, 8, 8, 256) 1024 res4w_branch2a[0][0]
    activation_60 (Activation) (None, 8, 8, 256) 0 bn4w_branch2a[0][0]
    res4w_branch2b (Conv2D) (None, 8, 8, 256) 590080 activation_60[0][0]
    bn4w_branch2b (BatchNorm) (None, 8, 8, 256) 1024 res4w_branch2b[0][0]
    activation_61 (Activation) (None, 8, 8, 256) 0 bn4w_branch2b[0][0]
    res4w_branch2c (Conv2D) (None, 8, 8, 1024) 263168 activation_61[0][0]
    bn4w_branch2c (BatchNorm) (None, 8, 8, 1024) 4096 res4w_branch2c[0][0]
    add_30 (Add) (None, 8, 8, 1024) 0 bn4w_branch2c[0][0] res4v_out[0][0]
    res4w_out (Activation) (None, 8, 8, 1024) 0 add_30[0][0]
__________________________________________________________________________________________________
    C4=res4w_out
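    Stage 4 is where the backbone variants differ: the summary shows one conv_block (res4a) followed by 22 identity_blocks (res4b through res4w), i.e. this model was built with the resnet101 backbone. In resnet_graph() the repetition is driven by a block-count table; a sketch reusing the bottleneck() helper above:

    def resnet_stage4(x, architecture="resnet101"):
        """Sketch of the stage-4 portion of resnet_graph()."""
        # conv_block: downsample 16x16 -> 8x8 and widen to 1024 channels (res4a)
        x = bottleneck(x, (256, 256, 1024), stage=4, block='a',
                       shortcut_conv=True, strides=(2, 2))
        # resnet50 repeats the identity block 5 times, resnet101 22 times (res4b..res4w)
        block_count = {"resnet50": 5, "resnet101": 22}[architecture]
        for i in range(block_count):
            x = bottleneck(x, (256, 256, 1024), stage=4, block=chr(98 + i))
        return x  # C4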

    stage5:
    res5a_branch2a (Conv2D) (None, 4, 4, 512) 524800 res4w_out[0][0]
    bn5a_branch2a (BatchNorm) (None, 4, 4, 512) 2048 res5a_branch2a[0][0]
    activation_62 (Activation) (None, 4, 4, 512) 0 bn5a_branch2a[0][0]
    res5a_branch2b (Conv2D) (None, 4, 4, 512) 2359808 activation_62[0][0]
    bn5a_branch2b (BatchNorm) (None, 4, 4, 512) 2048 res5a_branch2b[0][0]
    activation_63 (Activation) (None, 4, 4, 512) 0 bn5a_branch2b[0][0]
    res5a_branch2c (Conv2D) (None, 4, 4, 2048) 1050624 activation_63[0][0]
    res5a_branch1 (Conv2D) (None, 4, 4, 2048) 2099200 res4w_out[0][0]
    bn5a_branch2c (BatchNorm) (None, 4, 4, 2048) 8192 res5a_branch2c[0][0]
    bn5a_branch1 (BatchNorm) (None, 4, 4, 2048) 8192 res5a_branch1[0][0]
    add_31 (Add) (None, 4, 4, 2048) 0 bn5a_branch2c[0][0] bn5a_branch1[0][0]
    res5a_out (Activation) (None, 4, 4, 2048) 0 add_31[0][0]
    res5b_branch2a (Conv2D) (None, 4, 4, 512) 1049088 res5a_out[0][0]
    bn5b_branch2a (BatchNorm) (None, 4, 4, 512) 2048 res5b_branch2a[0][0]
    activation_64 (Activation) (None, 4, 4, 512) 0 bn5b_branch2a[0][0]
    res5b_branch2b (Conv2D) (None, 4, 4, 512) 2359808 activation_64[0][0]
    bn5b_branch2b (BatchNorm) (None, 4, 4, 512) 2048 res5b_branch2b[0][0]
    activation_65 (Activation) (None, 4, 4, 512) 0 bn5b_branch2b[0][0]
    res5b_branch2c (Conv2D) (None, 4, 4, 2048) 1050624 activation_65[0][0]
    bn5b_branch2c (BatchNorm) (None, 4, 4, 2048) 8192 res5b_branch2c[0][0]
    add_32 (Add) (None, 4, 4, 2048) 0 bn5b_branch2c[0][0] res5a_out[0][0]
    res5b_out (Activation) (None, 4, 4, 2048) 0 add_32[0][0]
    res5c_branch2a (Conv2D) (None, 4, 4, 512) 1049088 res5b_out[0][0]
    bn5c_branch2a (BatchNorm) (None, 4, 4, 512) 2048 res5c_branch2a[0][0]
    activation_66 (Activation) (None, 4, 4, 512) 0 bn5c_branch2a[0][0]
    res5c_branch2b (Conv2D) (None, 4, 4, 512) 2359808 activation_66[0][0]
    bn5c_branch2b (BatchNorm) (None, 4, 4, 512) 2048 res5c_branch2b[0][0]
    activation_67 (Activation) (None, 4, 4, 512) 0 bn5c_branch2b[0][0]
    res5c_branch2c (Conv2D) (None, 4, 4, 2048) 1050624 activation_67[0][0]
    bn5c_branch2c (BatchNorm) (None, 4, 4, 2048) 8192 res5c_branch2c[0][0]
    add_33 (Add) (None, 4, 4, 2048) 0 bn5c_branch2c[0][0] res5b_out[0][0]
    res5c_out (Activation) (None, 4, 4, 2048) 0 add_33[0][0]
    C5=res5c_out
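  Every stage-5 row above belongs to the same 1x1-3x3-1x1 bottleneck pattern. Below is a minimal sketch of the identity block that yields the res5b/res5c rows, assuming plain Keras layers; the real mrcnn/model.py uses its own BatchNorm subclass and helper functions, so the function name and argument layout here are illustrative only.

    # Minimal sketch of a stage-5 identity block (bottleneck plus skip connection).
    # Hypothetical helper for illustration; not the exact code in mrcnn/model.py.
    import keras.layers as KL

    def identity_block_sketch(x, filters=(512, 512, 2048), stage=5, block='b'):
        """x: a (None, 4, 4, 2048) tensor such as res5a_out."""
        n1, n2, n3 = filters
        prefix = 'res{}{}_branch'.format(stage, block)                    # e.g. res5b_branch
        y = KL.Conv2D(n1, (1, 1), name=prefix + '2a')(x)                  # 2048*512 + 512  = 1,049,088 params
        y = KL.BatchNormalization(name='bn' + prefix[3:] + '2a')(y)
        y = KL.Activation('relu')(y)
        y = KL.Conv2D(n2, (3, 3), padding='same', name=prefix + '2b')(y)  # 512*512*9 + 512 = 2,359,808 params
        y = KL.BatchNormalization(name='bn' + prefix[3:] + '2b')(y)
        y = KL.Activation('relu')(y)
        y = KL.Conv2D(n3, (1, 1), name=prefix + '2c')(y)                  # 512*2048 + 2048 = 1,050,624 params
        y = KL.BatchNormalization(name='bn' + prefix[3:] + '2c')(y)
        y = KL.Add()([y, x])                                              # the add_32 / add_33 rows
        return KL.Activation('relu', name='res{}{}_out'.format(stage, block))(y)

  The per-layer parameter counts in the comments match the res5b/res5c rows of the summary. res5a is the one conv block of the stage: it also carries a 1x1 convolution on the shortcut (res5a_branch1) because it raises the channel count from 1024 to 2048 and reduces the feature map from 8x8 to 4x4.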
    fpn_c5p5 (Conv2D)               (None, 4, 4, 256)    524544      res5c_out[0][0]                  
    __________________________________________________________________________________________________
    P5=fpn_c5p5
    fpn_p5upsampled (UpSampling2D)  (None, 8, 8, 256)    0           fpn_c5p5[0][0]                   
    __________________________________________________________________________________________________
    fpn_c4p4 (Conv2D)               (None, 8, 8, 256)    262400      res4w_out[0][0]                  
    __________________________________________________________________________________________________
    fpn_p4add (Add)                 (None, 8, 8, 256)    0           fpn_p5upsampled[0][0]            
                                                                     fpn_c4p4[0][0]                   
    __________________________________________________________________________________________________
    P4=fpn_p4add

    fpn_p4upsampled (UpSampling2D)  (None, 16, 16, 256)  0           fpn_p4add[0][0]                  
    __________________________________________________________________________________________________
    fpn_c3p3 (Conv2D)               (None, 16, 16, 256)  131328      res3d_out[0][0]                  
    __________________________________________________________________________________________________
    fpn_p3add (Add)                 (None, 16, 16, 256)  0           fpn_p4upsampled[0][0]            
                                                                     fpn_c3p3[0][0]                   
    __________________________________________________________________________________________________
    P3=fpn_p3add
    fpn_p3upsampled (UpSampling2D)  (None, 32, 32, 256)  0           fpn_p3add[0][0]                  
    __________________________________________________________________________________________________
    fpn_c2p2 (Conv2D)               (None, 32, 32, 256)  65792       res2c_out[0][0]                  
    __________________________________________________________________________________________________
    fpn_p2add (Add)                 (None, 32, 32, 256)  0           fpn_p3upsampled[0][0]            
                                                                     fpn_c2p2[0][0]                   
    __________________________________________________________________________________________________
    P2=fpn_p2add

    fpn_p5 (Conv2D)                 (None, 4, 4, 256)     590080      fpn_c5p5[0][0]
    __________________________________________________________________________________________________
    fpn_p2 (Conv2D)                 (None, 32, 32, 256)   590080      fpn_p2add[0][0]
    __________________________________________________________________________________________________
    fpn_p3 (Conv2D)                 (None, 16, 16, 256)   590080      fpn_p3add[0][0]
    __________________________________________________________________________________________________
    fpn_p4 (Conv2D)                 (None, 8, 8, 256)     590080      fpn_p4add[0][0]
    __________________________________________________________________________________________________
    fpn_p6 (MaxPooling2D)           (None, 2, 2, 256)     0           fpn_p5[0][0]
    __________________________________________________________________________________________________
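  Putting the fpn_* rows together: each Cx gets a 1x1 lateral convolution, the result is upsampled and added to the next finer level, each merged map gets a 3x3 smoothing convolution, and P6 is just a stride-2 pooling of P5. Below is a rough sketch of that top-down path, assuming plain Keras and a 256-channel pyramid; the layer names mirror the summary, but this is not the exact library code.

    # Rough sketch of the FPN top-down path; C2..C5 are res2c_out..res5c_out.
    import keras.layers as KL

    def fpn_sketch(C2, C3, C4, C5, channels=256):
        P5 = KL.Conv2D(channels, (1, 1), name='fpn_c5p5')(C5)                 # (None, 4, 4, 256)
        P4 = KL.Add(name='fpn_p4add')([
            KL.UpSampling2D(size=(2, 2), name='fpn_p5upsampled')(P5),
            KL.Conv2D(channels, (1, 1), name='fpn_c4p4')(C4)])                # (None, 8, 8, 256)
        P3 = KL.Add(name='fpn_p3add')([
            KL.UpSampling2D(size=(2, 2), name='fpn_p4upsampled')(P4),
            KL.Conv2D(channels, (1, 1), name='fpn_c3p3')(C3)])                # (None, 16, 16, 256)
        P2 = KL.Add(name='fpn_p2add')([
            KL.UpSampling2D(size=(2, 2), name='fpn_p3upsampled')(P3),
            KL.Conv2D(channels, (1, 1), name='fpn_c2p2')(C2)])                # (None, 32, 32, 256)
        # 3x3 convolutions smooth the merged maps (the fpn_p2..fpn_p5 rows, 590,080 params each).
        P2 = KL.Conv2D(channels, (3, 3), padding='same', name='fpn_p2')(P2)
        P3 = KL.Conv2D(channels, (3, 3), padding='same', name='fpn_p3')(P3)
        P4 = KL.Conv2D(channels, (3, 3), padding='same', name='fpn_p4')(P4)
        P5 = KL.Conv2D(channels, (3, 3), padding='same', name='fpn_p5')(P5)
        # P6 exists only to give the RPN a coarser level for large anchors.
        P6 = KL.MaxPooling2D(pool_size=(1, 1), strides=2, name='fpn_p6')(P5)  # (None, 2, 2, 256)
        return P2, P3, P4, P5, P6

  Note in the summary that the RPN consumes all of P2-P6, while the later roi_align_classifier/roi_align_mask layers only take fpn_p2 through fpn_p5.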
    P2 through P6 above are the feature maps that are fed into the RPN.
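  rpn_model in the next block is a single shared sub-model: it is applied once to each of P2-P6, and its per-level outputs are then concatenated into rpn_class_logits, rpn_class and rpn_bbox. A rough sketch of that shared head follows, assuming 3 anchors per feature-map cell and plain Keras/TensorFlow (the function and some layer names are illustrative, not the exact library code). With these sizes the parameter count works out to 1,180,160 + 3,078 + 6,156 = 1,189,394, matching the rpn_model row below.

    # Rough sketch of the shared RPN head applied to every pyramid level.
    import tensorflow as tf
    import keras.layers as KL
    import keras.models as KM

    def build_rpn_sketch(anchors_per_location=3, depth=256):
        feature_map = KL.Input(shape=(None, None, depth), name='input_rpn_feature_map')
        shared = KL.Conv2D(512, (3, 3), padding='same', activation='relu',
                           name='rpn_conv_shared')(feature_map)                          # 1,180,160 params
        # Objectness: 2 scores (bg/fg) per anchor, flattened to (batch, anchors, 2)
        x = KL.Conv2D(2 * anchors_per_location, (1, 1), name='rpn_class_raw')(shared)    # 3,078 params
        rpn_class_logits = KL.Lambda(lambda t: tf.reshape(t, [tf.shape(t)[0], -1, 2]))(x)
        rpn_probs = KL.Activation('softmax')(rpn_class_logits)
        # Box refinement: 4 deltas per anchor, flattened to (batch, anchors, 4)
        x = KL.Conv2D(4 * anchors_per_location, (1, 1), name='rpn_bbox_pred')(shared)    # 6,156 params
        rpn_bbox = KL.Lambda(lambda t: tf.reshape(t, [tf.shape(t)[0], -1, 4]))(x)
        return KM.Model([feature_map], [rpn_class_logits, rpn_probs, rpn_bbox],
                        name='rpn_model')

    # Usage: call the same sub-model on every level, then concatenate along axis 1,
    # which is what the rpn_class_logits / rpn_class / rpn_bbox Concatenate rows below do.
    # rpn = build_rpn_sketch()
    # per_level_outputs = [rpn([p]) for p in (P2, P3, P4, P5, P6)]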
    rpn_model (Model)               [(None, None, 2), (N  1189394     fpn_p2[0][0]
                                                                      fpn_p3[0][0]
                                                                      fpn_p4[0][0]
                                                                      fpn_p5[0][0]
                                                                      fpn_p6[0][0]
    __________________________________________________________________________________________________
    rpn_class (Concatenate)         (None, None, 2)       0           rpn_model[1][1]
                                                                      rpn_model[2][1]
                                                                      rpn_model[3][1]
                                                                      rpn_model[4][1]
                                                                      rpn_model[5][1]
    __________________________________________________________________________________________________
    rpn_bbox (Concatenate)          (None, None, 4)       0           rpn_model[1][2]
                                                                      rpn_model[2][2]
                                                                      rpn_model[3][2]
                                                                      rpn_model[4][2]
                                                                      rpn_model[5][2]
    __________________________________________________________________________________________________
    anchors (Lambda)                (8, 4092, 4)          0           input_image[0][0]
    __________________________________________________________________________________________________
    input_gt_boxes (InputLayer)     (None, None, 4)       0
    __________________________________________________________________________________________________
    ROI (ProposalLayer)             (None, 2000, 4)       0           rpn_class[0][0]
                                                                      rpn_bbox[0][0]
                                                                      anchors[0][0]
    __________________________________________________________________________________________________
    input_gt_class_ids (InputLayer) (None, None)          0
    __________________________________________________________________________________________________
    lambda_1 (Lambda)               (None, None, 4)       0           input_gt_boxes[0][0]
    __________________________________________________________________________________________________
    input_gt_masks (InputLayer)     (None, 56, 56, None)  0
    __________________________________________________________________________________________________
    proposal_targets (DetectionTarg [(None, 32, 4), (Non  0           ROI[0][0]
                                                                      input_gt_class_ids[0][0]
                                                                      lambda_1[0][0]
                                                                      input_gt_masks[0][0]
    __________________________________________________________________________________________________
    input_image_meta (InputLayer)   (None, 16)            0
    __________________________________________________________________________________________________
    roi_align_mask (PyramidROIAlign (None, 32, 14, 14, 2  0           proposal_targets[0][0]
                                                                      input_image_meta[0][0]
                                                                      fpn_p2[0][0]
                                                                      fpn_p3[0][0]
                                                                      fpn_p4[0][0]
                                                                      fpn_p5[0][0]
    __________________________________________________________________________________________________
    mrcnn_mask_conv1 (TimeDistribut (None, 32, 14, 14, 2  590080      roi_align_mask[0][0]
    __________________________________________________________________________________________________
    mrcnn_mask_bn1 (TimeDistributed (None, 32, 14, 14, 2  1024        mrcnn_mask_conv1[0][0]
    __________________________________________________________________________________________________
    activation_71 (Activation)      (None, 32, 14, 14, 2  0           mrcnn_mask_bn1[0][0]
    __________________________________________________________________________________________________
    mrcnn_mask_conv2 (TimeDistribut (None, 32, 14, 14, 2  590080      activation_71[0][0]
    __________________________________________________________________________________________________
    roi_align_classifier (PyramidRO (None, 32, 7, 7, 256  0           proposal_targets[0][0]
                                                                      input_image_meta[0][0]
                                                                      fpn_p2[0][0]
                                                                      fpn_p3[0][0]
                                                                      fpn_p4[0][0]
                                                                      fpn_p5[0][0]
    __________________________________________________________________________________________________
    mrcnn_mask_bn2 (TimeDistributed (None, 32, 14, 14, 2  1024        mrcnn_mask_conv2[0][0]
    __________________________________________________________________________________________________
    mrcnn_class_conv1 (TimeDistribu (None, 32, 1, 1, 102  12846080    roi_align_classifier[0][0]
    __________________________________________________________________________________________________
    activation_72 (Activation)      (None, 32, 14, 14, 2  0           mrcnn_mask_bn2[0][0]
    __________________________________________________________________________________________________
    mrcnn_class_bn1 (TimeDistribute (None, 32, 1, 1, 102  4096        mrcnn_class_conv1[0][0]
    __________________________________________________________________________________________________
    mrcnn_mask_conv3 (TimeDistribut (None, 32, 14, 14, 2  590080      activation_72[0][0]
    __________________________________________________________________________________________________
    activation_68 (Activation)      (None, 32, 1, 1, 102  0           mrcnn_class_bn1[0][0]
    __________________________________________________________________________________________________
    mrcnn_mask_bn3 (TimeDistributed (None, 32, 14, 14, 2  1024        mrcnn_mask_conv3[0][0]
    __________________________________________________________________________________________________
    mrcnn_class_conv2 (TimeDistribu (None, 32, 1, 1, 102  1049600     activation_68[0][0]
    __________________________________________________________________________________________________
    activation_73 (Activation)      (None, 32, 14, 14, 2  0           mrcnn_mask_bn3[0][0]
    __________________________________________________________________________________________________
    mrcnn_class_bn2 (TimeDistribute (None, 32, 1, 1, 102  4096        mrcnn_class_conv2[0][0]
    __________________________________________________________________________________________________
    mrcnn_mask_conv4 (TimeDistribut (None, 32, 14, 14, 2  590080      activation_73[0][0]
    __________________________________________________________________________________________________
    activation_69 (Activation)      (None, 32, 1, 1, 102  0           mrcnn_class_bn2[0][0]
    __________________________________________________________________________________________________
    mrcnn_mask_bn4 (TimeDistributed (None, 32, 14, 14, 2  1024        mrcnn_mask_conv4[0][0]
    __________________________________________________________________________________________________
    pool_squeeze (Lambda)           (None, 32, 1024)      0           activation_69[0][0]
    __________________________________________________________________________________________________
    activation_74 (Activation)      (None, 32, 14, 14, 2  0           mrcnn_mask_bn4[0][0]
    __________________________________________________________________________________________________
    mrcnn_bbox_fc (TimeDistributed) (None, 32, 16)        16400       pool_squeeze[0][0]
    __________________________________________________________________________________________________
    mrcnn_mask_deconv (TimeDistribu (None, 32, 28, 28, 2  262400      activation_74[0][0]
    __________________________________________________________________________________________________
    rpn_class_logits (Concatenate)  (None, None, 2)       0           rpn_model[1][0]
                                                                      rpn_model[2][0]
                                                                      rpn_model[3][0]
                                                                      rpn_model[4][0]
                                                                      rpn_model[5][0]
    __________________________________________________________________________________________________
    mrcnn_class_logits (TimeDistrib (None, 32, 4)         4100        pool_squeeze[0][0]
    __________________________________________________________________________________________________
    mrcnn_bbox (Reshape)            (None, 32, 4, 4)      0           mrcnn_bbox_fc[0][0]
    __________________________________________________________________________________________________
    mrcnn_mask (TimeDistributed)    (None, 32, 28, 28, 4  1028        mrcnn_mask_deconv[0][0]
    __________________________________________________________________________________________________
    input_rpn_match (InputLayer)    (None, None, 1)       0
    __________________________________________________________________________________________________
    input_rpn_bbox (InputLayer)     (None, None, 4)       0
    __________________________________________________________________________________________________
    lambda_4 (Lambda)               (None, 4)             0           input_image_meta[0][0]
    __________________________________________________________________________________________________
    mrcnn_class (TimeDistributed)   (None, 32, 4)         0           mrcnn_class_logits[0][0]
    __________________________________________________________________________________________________
    output_rois (Lambda)            (None, 32, 4)         0           proposal_targets[0][0]
    __________________________________________________________________________________________________
    rpn_class_loss (Lambda)         ()                    0           input_rpn_match[0][0]
                                                                      rpn_class_logits[0][0]
    __________________________________________________________________________________________________
    rpn_bbox_loss (Lambda)          ()                    0           input_rpn_bbox[0][0]
                                                                      input_rpn_match[0][0]
                                                                      rpn_bbox[0][0]
    __________________________________________________________________________________________________
    mrcnn_class_loss (Lambda)       ()                    0           proposal_targets[0][1]
                                                                      mrcnn_class_logits[0][0]
                                                                      lambda_4[0][0]
    __________________________________________________________________________________________________
    mrcnn_bbox_loss (Lambda)        ()                    0           proposal_targets[0][2]
                                                                      proposal_targets[0][1]
                                                                      mrcnn_bbox[0][0]
    __________________________________________________________________________________________________
    mrcnn_mask_loss (Lambda)        ()                    0           proposal_targets[0][3]
                                                                      proposal_targets[0][1]
                                                                      mrcnn_mask[0][0]
    ==================================================================================================
    Total params: 63,744,170
    Trainable params: 21,079,850
    Non-trainable params: 42,664,320
    __________________________________________________________________________________________________
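  A quick sanity check on the anchors row above, which has shape (8, 4092, 4): with the input fixed at 128x128, the P2-P6 feature maps are 32, 16, 8, 4 and 2 cells on a side, and 3 anchors are generated at every cell (one per aspect ratio), which gives exactly 4092 anchors of 4 coordinates each; the leading 8 is presumably the batch size of this configuration.

    # Verifying the 4092 in the anchors shape (8, 4092, 4) for a 128x128 input,
    # assuming 3 anchors per feature-map cell (one per aspect ratio).
    feature_sides = [32, 16, 8, 4, 2]        # P2..P6 side lengths
    anchors_per_cell = 3
    total = sum(side * side for side in feature_sides) * anchors_per_cell
    print(total)                             # 4092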
