  • 【593】ResNet Residual Networks

    Reference: Keras implementation of residual connections for Inception models and convolutional layers

    Reference: Keras Implementation of ResNet-50 (Residual Networks) Architecture from Scratch

    Reference: Understanding the residual network ResNet in one article

    Reference: ResNet Keras implementation


      As shown in the figure below, $F(x)$ denotes one or more convolutional layers, and the output is added to the input: $F(x) + x$. The addition is element-wise, not a concatenation.

      [figure: residual block, output $F(x) + x$]

    from keras.layers import Conv2D, Input, Add, Activation
    
    # input tensor for a 3-channel 256x256 image
    x = Input(shape=(256, 256, 3))
    
    # 3x3 conv with 3 output channels (same as input channels)
    y = Conv2D(3, (3, 3), padding='same')(x)
    
    # this returns x + y. (SKIP Connection)
    z = Add()([x, y])
    
    z = Activation('relu')(z) 
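
      Wrapping the tensors in a Model makes the shape behavior easy to verify. Below is a minimal sketch (assuming a standard Keras install, reusing x, y, z from the snippet above) that also contrasts Add with Concatenate to illustrate the element-wise point:

    from keras.layers import Concatenate
    from keras.models import Model
    
    # Add sums element-wise, so the output shape matches the inputs
    print(Model(inputs=x, outputs=z).output_shape)  # (None, 256, 256, 3)
    
    # Concatenate would instead stack the tensors along the channel axis
    c = Concatenate()([x, y])
    print(Model(inputs=x, outputs=c).output_shape)  # (None, 256, 256, 6)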

      


    1. Identity Block

      The identity block is the standard block used in ResNets and corresponds to the case where the input activation has the same dimension as the output activation.

      

    from keras.layers import Conv2D, BatchNormalization, Activation, Add
    from keras.initializers import glorot_uniform
    
    def identity_block(X, f, filters, stage, block):
        # layer-naming convention: res{stage}{block}_branch / bn{stage}{block}_branch
        conv_name_base = 'res' + str(stage) + block + '_branch'
        bn_name_base = 'bn' + str(stage) + block + '_branch'
        F1, F2, F3 = filters
    
        # save the input for the skip connection
        X_shortcut = X
    
        # main path, first component: 1x1 conv reduces the channel count to F1
        X = Conv2D(filters=F1, kernel_size=(1, 1), strides=(1, 1), padding='valid', name=conv_name_base + '2a', kernel_initializer=glorot_uniform(seed=0))(X)
        X = BatchNormalization(axis=3, name=bn_name_base + '2a')(X)
        X = Activation('relu')(X)
    
        # main path, second component: f x f conv with 'same' padding keeps the spatial size
        X = Conv2D(filters=F2, kernel_size=(f, f), strides=(1, 1), padding='same', name=conv_name_base + '2b', kernel_initializer=glorot_uniform(seed=0))(X)
        X = BatchNormalization(axis=3, name=bn_name_base + '2b')(X)
        X = Activation('relu')(X)
    
        # main path, third component: 1x1 conv restores F3 channels; no ReLU before the addition
        X = Conv2D(filters=F3, kernel_size=(1, 1), strides=(1, 1), padding='valid', name=conv_name_base + '2c', kernel_initializer=glorot_uniform(seed=0))(X)
        X = BatchNormalization(axis=3, name=bn_name_base + '2c')(X)
    
        # skip connection: element-wise addition, then ReLU
        X = Add()([X, X_shortcut])
        X = Activation('relu')(X)
    
        return X
    
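
      A quick usage sketch (hypothetical shapes, following the ResNet-50 convention where stage 2 works on 56x56 feature maps). Note that the number of input channels must equal F3 so that Add() sees matching shapes:

    from keras.layers import Input
    
    X_in = Input(shape=(56, 56, 256))  # input channels (256) must equal F3
    out = identity_block(X_in, f=3, filters=[64, 64, 256], stage=2, block='b')
    # out keeps the input shape: (None, 56, 56, 256)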
    

    2. Convolutional Block

      We use this type of block when the input and output dimensions don't match. The difference from the identity block is that there is a Conv2D layer on the shortcut path (with stride s), so the shortcut is resized to match the main path.

      

    def convolutional_block(X, f, filters, stage, block, s=2):
        # uses the same imports as identity_block above
        conv_name_base = 'res' + str(stage) + block + '_branch'
        bn_name_base = 'bn' + str(stage) + block + '_branch'
    
        F1, F2, F3 = filters
    
        # save the input for the shortcut path
        X_shortcut = X
    
        # main path, first component: 1x1 conv with stride s downsamples spatially
        X = Conv2D(filters=F1, kernel_size=(1, 1), strides=(s, s), padding='valid', name=conv_name_base + '2a', kernel_initializer=glorot_uniform(seed=0))(X)
        X = BatchNormalization(axis=3, name=bn_name_base + '2a')(X)
        X = Activation('relu')(X)
    
        # main path, second component: f x f conv with 'same' padding
        X = Conv2D(filters=F2, kernel_size=(f, f), strides=(1, 1), padding='same', name=conv_name_base + '2b', kernel_initializer=glorot_uniform(seed=0))(X)
        X = BatchNormalization(axis=3, name=bn_name_base + '2b')(X)
        X = Activation('relu')(X)
    
        # main path, third component: 1x1 conv expands to F3 channels; no ReLU before the addition
        X = Conv2D(filters=F3, kernel_size=(1, 1), strides=(1, 1), padding='valid', name=conv_name_base + '2c', kernel_initializer=glorot_uniform(seed=0))(X)
        X = BatchNormalization(axis=3, name=bn_name_base + '2c')(X)
    
        # shortcut path: 1x1 conv with the same stride s resizes the input to match the main path
        X_shortcut = Conv2D(filters=F3, kernel_size=(1, 1), strides=(s, s), padding='valid', name=conv_name_base + '1', kernel_initializer=glorot_uniform(seed=0))(X_shortcut)
        X_shortcut = BatchNormalization(axis=3, name=bn_name_base + '1')(X_shortcut)
    
        # add shortcut and main path, then ReLU
        X = Add()([X, X_shortcut])
        X = Activation('relu')(X)
    
        return X
    
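
      Putting the two together: a ResNet stage starts with one convolutional block (which changes dimensions) followed by identity blocks (which preserve them). Below is a minimal sketch of stage 2 of ResNet-50, following the layout in the reference above (s=1 here because the preceding max-pooling already downsampled):

    from keras.layers import Input
    
    X_in = Input(shape=(56, 56, 64))
    X = convolutional_block(X_in, f=3, filters=[64, 64, 256], stage=2, block='a', s=1)
    X = identity_block(X, f=3, filters=[64, 64, 256], stage=2, block='b')
    X = identity_block(X, f=3, filters=[64, 64, 256], stage=2, block='c')
    # X now has shape (None, 56, 56, 256)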
    
  • Original post: https://www.cnblogs.com/alex-bn-lee/p/14972481.html