  • difference between sparse_softmax_cross_entropy_with_logits and softmax_cross_entropy_with_logits

    via: stackoverflow

    The two functions compute the same result; having both is simply a convenience.

    The difference is simple:

    • For sparse_softmax_cross_entropy_with_logits, labels must have the shape [batch_size] and the dtype int32 or int64. Each label is an int in range [0, num_classes-1].
    • For softmax_cross_entropy_with_logits, labels must have the shape [batch_size, num_classes] and dtype float32 or float64.

    The labels used in softmax_cross_entropy_with_logits are the one-hot version of the labels used in sparse_softmax_cross_entropy_with_logits, as sketched below.
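    A minimal sketch of this equivalence using the TensorFlow 2.x tf.nn API in eager mode (the logits and label values here are made up for illustration):

    import tensorflow as tf

    # Same logits for both calls: shape [batch_size, num_classes].
    logits = tf.constant([[2.0, 0.5, -1.0],
                          [0.1, 1.5,  0.3]])

    # Sparse labels: one class index per example, shape [batch_size].
    sparse_labels = tf.constant([0, 1], dtype=tf.int32)

    # Dense labels: the one-hot encoding of the sparse labels,
    # shape [batch_size, num_classes].
    dense_labels = tf.one_hot(sparse_labels, depth=3)

    loss_sparse = tf.nn.sparse_softmax_cross_entropy_with_logits(
        labels=sparse_labels, logits=logits)
    loss_dense = tf.nn.softmax_cross_entropy_with_logits(
        labels=dense_labels, logits=logits)

    # Both print the same per-example cross-entropy values.
    print(loss_sparse.numpy())
    print(loss_dense.numpy())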

    Another tiny difference is that with sparse_softmax_cross_entropy_with_logits, you can give -1 as a label to get a loss of 0 for that label.

  • Original post: https://www.cnblogs.com/yuelien/p/14650299.html