[Machine Learning] Gradient Checking

Gradient checking lets us verify that our backpropagation implementation computes correct gradients. We can approximate each partial derivative of the cost function J(Θ) with a centered difference:

    ∂/∂Θᵢ J(Θ) ≈ (J(Θ₁, …, Θᵢ + ε, …, Θₙ) − J(Θ₁, …, Θᵢ − ε, …, Θₙ)) / (2ε)

using a small ε, such as ε = 10⁻⁴. In Octave:

    % Numerically approximate each partial derivative of the cost
    % function J with a centered difference around theta.
    epsilon = 1e-4;
    for i = 1:n,
      thetaPlus = theta;
      thetaPlus(i) += epsilon;    % nudge the i-th parameter up
      thetaMinus = theta;
      thetaMinus(i) -= epsilon;   % nudge the i-th parameter down
      gradApprox(i) = (J(thetaPlus) - J(thetaMinus)) / (2*epsilon);
    end;
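Because J, theta, and n are defined elsewhere in the course exercises, the loop above is not runnable on its own. To see the approximation work end to end, we can test it against a cost function with a known gradient; the quadratic below is an illustrative choice, not from the original notes:

    % Sanity check: J(theta) = sum(theta.^2) has gradient 2*theta.
    J = @(t) sum(t .^ 2);
    theta = [1; -2; 3];
    n = numel(theta);
    gradApprox = zeros(n, 1);
    epsilon = 1e-4;
    for i = 1:n,
      thetaPlus = theta;   thetaPlus(i) += epsilon;
      thetaMinus = theta;  thetaMinus(i) -= epsilon;
      gradApprox(i) = (J(thetaPlus) - J(thetaMinus)) / (2*epsilon);
    end;
    disp([gradApprox, 2*theta]);   % the two columns should agree closely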

We previously saw how to calculate deltaVector, the unrolled gradient produced by backpropagation. So once we compute our gradApprox vector, we can check that gradApprox ≈ deltaVector, as in the sketch below.
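One way to make that comparison concrete is a relative-difference test, which is insensitive to the overall scale of the gradients. A minimal Octave sketch, assuming deltaVector already holds the unrolled backpropagation gradient; the 1e-9 threshold is a common rule of thumb, not part of the original notes:

    % Relative difference between numerical and analytical gradients.
    relDiff = norm(gradApprox - deltaVector) / norm(gradApprox + deltaVector);
    if relDiff < 1e-9
      disp('backpropagation gradient looks correct');
    else
      disp('gradients disagree; inspect max(abs(gradApprox - deltaVector))');
    end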

Once you have verified that your backpropagation algorithm is correct, you don't need to compute gradApprox again: the check evaluates the full cost function twice per parameter, which makes it far too slow to run during training.
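In practice this is often handled with a debug flag that is switched off before training starts. A minimal sketch (the flag name is an illustrative choice, not from the original notes):

    doGradientCheck = false;   % enable only while debugging backpropagation
    if doGradientCheck
      relDiff = norm(gradApprox - deltaVector) / norm(gradApprox + deltaVector);
      printf('relative difference: %g\n', relDiff);
    end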
