  • The difference between softmax_cross_entropy_with_logits and sparse_softmax_cross_entropy_with_logits in TensorFlow

    http://stackoverflow.com/questions/37312421/tensorflow-whats-the-difference-between-sparse-softmax-cross-entropy-with-logi

    Having two different functions is a convenience, as they produce the same result.

    The difference is simple:

    • For sparse_softmax_cross_entropy_with_logits, labels must have the shape [batch_size] and the dtype int32 or int64. Each label is an int in the range [0, num_classes-1].
    • For softmax_cross_entropy_with_logits, labels must have the shape [batch_size, num_classes] and the dtype float32 or float64.

    The labels used in softmax_cross_entropy_with_logits are the one-hot version of the labels used in sparse_softmax_cross_entropy_with_logits.

    Another tiny difference is that with sparse_softmax_cross_entropy_with_logits, you can give -1 as a label to get a loss of 0 for that entry. A minimal sketch of the correspondence follows.
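    To make the equivalence concrete, here is a minimal sketch using the TF 2.x tf.nn API (the original discussion predates TF 2; the function names are the same, but eager execution is assumed here). Converting the integer labels with tf.one_hot and feeding them to the dense variant yields the same per-example losses as the sparse variant:

        import tensorflow as tf

        # Logits for a batch of 3 examples over 4 classes (made-up values).
        logits = tf.constant([[2.0, 1.0, 0.1, 0.5],
                              [0.3, 2.5, 0.2, 0.1],
                              [1.2, 0.4, 3.1, 0.0]])

        # Sparse form: integer class indices, shape [batch_size], dtype int32/int64.
        sparse_labels = tf.constant([0, 1, 2])
        sparse_loss = tf.nn.sparse_softmax_cross_entropy_with_logits(
            labels=sparse_labels, logits=logits)

        # Dense form: one-hot labels, shape [batch_size, num_classes], dtype float32.
        dense_labels = tf.one_hot(sparse_labels, depth=4)
        dense_loss = tf.nn.softmax_cross_entropy_with_logits(
            labels=dense_labels, logits=logits)

        # Both calls produce the same per-example cross-entropy losses.
        print(sparse_loss.numpy())
        print(dense_loss.numpy())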

  • Original post: https://www.cnblogs.com/welhzh/p/6830494.html