  • Activation Functions

    Sigmoid

    Sigmoids saturate and kill gradients (see the sketch below).

    Sigmoid outputs are not zero-centered.

    The exponential function is a bit computationally expensive.
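
    A minimal NumPy sketch (my own illustration, not from the original notes) of the sigmoid and its gradient; the printed values show the non-zero-centered outputs and how the saturated tails kill the gradient:

```python
import numpy as np

def sigmoid(x):
    # sigma(x) = 1 / (1 + exp(-x)); outputs lie in (0, 1), so they are never zero-centered
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_grad(x):
    # d/dx sigma(x) = sigma(x) * (1 - sigma(x)); at most 0.25, and ~0 once |x| is large
    s = sigmoid(x)
    return s * (1.0 - s)

x = np.array([-10.0, -2.0, 0.0, 2.0, 10.0])
print(sigmoid(x))       # [0.0000454, 0.119, 0.5, 0.881, 0.9999546]
print(sigmoid_grad(x))  # [0.0000454, 0.105, 0.25, 0.105, 0.0000454]  <- saturated tails kill the gradient
```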

    Tanh

    Kills gradients when saturated (see the sketch below).

    It's zero-centered! : )
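
    A quick sketch (again my own, assuming NumPy) showing that tanh is zero-centered but still saturates; its gradient 1 - tanh(x)^2 also vanishes in the tails:

```python
import numpy as np

x = np.array([-10.0, -2.0, 0.0, 2.0, 10.0])
t = np.tanh(x)
print(t)           # [-1.0, -0.964, 0.0, 0.964, 1.0]  -> outputs in (-1, 1), centered at 0
print(1.0 - t**2)  # [~0.0, 0.071, 1.0, 0.071, ~0.0]  -> gradient still dies in the saturated tails
```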

    ReLU

    Does not saturate (in the positive region).

    Very computationally efficient.

    Converges much faster than sigmoid/tanh in practice (roughly 6x).

    Seems more biologically plausible than sigmoid.

    BUT!

    Not zero-centered.

    No gradient when x < 0, so a unit can "die" (see the sketch below).

    Be careful with the learning rate when using ReLU.
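
    A minimal sketch (my own illustration) of ReLU and its subgradient; a unit whose pre-activations stay negative gets zero gradient and never recovers, which is why the learning rate matters:

```python
import numpy as np

def relu(x):
    # max(0, x): linear (no saturation) for x > 0, exactly zero for x < 0
    return np.maximum(0.0, x)

def relu_grad(x):
    # subgradient: 1 for x > 0, 0 for x <= 0
    # if a bad update (e.g. too large a learning rate) pushes a unit's pre-activations
    # below zero for all inputs, its gradient is 0 everywhere and the unit is "dead"
    return (x > 0).astype(x.dtype)

x = np.array([-3.0, -0.5, 0.0, 0.5, 3.0])
print(relu(x))       # [0.0, 0.0, 0.0, 0.5, 3.0]
print(relu_grad(x))  # [0.0, 0.0, 0.0, 1.0, 1.0]
```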

    Leaky ReLU

    Does not saturate.

    Very computationally efficient.

    Converges much faster than sigmoid/tanh in practice (roughly 6x).

    Will not "die".

     

    Parametric ReLU

    Exponential Linear Unit
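
    Rough NumPy definitions of the three variants above (my own sketch; the alpha values are commonly used defaults, not something fixed by the original notes):

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    # small fixed slope alpha for x < 0 keeps a nonzero gradient, so units do not "die"
    return np.where(x > 0, x, alpha * x)

def prelu(x, alpha):
    # Parametric ReLU: same form as Leaky ReLU, but alpha is a learned parameter (He et al., 2015)
    return np.where(x > 0, x, alpha * x)

def elu(x, alpha=1.0):
    # Exponential Linear Unit: smooth, saturates to -alpha for very negative x,
    # and gives outputs with mean closer to zero than ReLU (Clevert et al., 2015)
    return np.where(x > 0, x, alpha * np.expm1(x))
```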

  • Original post: https://www.cnblogs.com/hizhaolei/p/10623472.html