  • 【350】Linear Algebra for Machine Learning: Matrix Derivatives

    Reference: 机器学习中的线性代数之矩阵求导

    Reference: Matrix calculus - Wikipedia

    The matrix derivative (closely related to the matrix differential) appears frequently in the formula derivations of machine learning, image processing, optimization, and related fields.

    Layout: matrix calculus uses two layout conventions, the numerator layout and the denominator layout. The differentiation rules under the two layouts are not the same, so a derivation must stick to one convention throughout.

    My understanding:

    Numerator layout: the result is arranged according to the numerator. If the numerator $\mathbf{y}$ has $m$ components, they index the $m$ rows of the result, while the $n$ components of the denominator $\mathbf{x}$ index the columns. This is the more commonly used convention.

    Denominator layout: exactly the opposite arrangement; the result is the transpose of its numerator-layout counterpart.
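
    The transpose relation between the two layouts can be checked numerically. Below is a minimal sketch (my own illustration, not from the original post; assumes NumPy): for the linear map $f(\mathbf{x}) = A\mathbf{x}$, the numerator-layout Jacobian is exactly $A$, and the denominator-layout result is $A^{\mathsf T}$.

    ```python
    import numpy as np

    # f(x) = A @ x maps R^3 -> R^2; its numerator-layout Jacobian is A.
    A = np.array([[1.0, 2.0, 3.0],
                  [4.0, 5.0, 6.0]])

    def f(x):
        return A @ x

    def numerator_layout_jacobian(f, x, eps=1e-6):
        """Finite-difference Jacobian J with J[i, j] = dy_i / dx_j."""
        y0 = f(x)
        J = np.zeros((y0.size, x.size))
        for j in range(x.size):
            dx = np.zeros_like(x)
            dx[j] = eps
            J[:, j] = (f(x + dx) - y0) / eps
        return J

    x0 = np.array([0.5, -1.0, 2.0])
    J_num = numerator_layout_jacobian(f, x0)   # rows follow y: matches A
    J_den = J_num.T                            # denominator layout: A.T
    ```

    Because $f$ is linear, the finite differences are exact up to rounding; for a nonlinear $f$ the same helper gives a first-order approximation of the Jacobian.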

    Numerator-layout notation

    Using numerator-layout notation, we have:

    $
    \frac{\partial y}{\partial \mathbf{x}} = \left[ \frac{\partial y}{\partial x_{1}} \; \frac{\partial y}{\partial x_{2}} \; \cdots \; \frac{\partial y}{\partial x_{n}} \right].
    $

    $
    \frac{\partial \mathbf{y}}{\partial x} = \begin{bmatrix} \frac{\partial y_{1}}{\partial x} \\ \frac{\partial y_{2}}{\partial x} \\ \vdots \\ \frac{\partial y_{m}}{\partial x} \end{bmatrix}.
    $

    $
    \frac{\partial \mathbf{y}}{\partial \mathbf{x}} = \begin{bmatrix} \frac{\partial y_{1}}{\partial x_{1}} & \frac{\partial y_{1}}{\partial x_{2}} & \cdots & \frac{\partial y_{1}}{\partial x_{n}} \\ \frac{\partial y_{2}}{\partial x_{1}} & \frac{\partial y_{2}}{\partial x_{2}} & \cdots & \frac{\partial y_{2}}{\partial x_{n}} \\ \vdots & \vdots & \ddots & \vdots \\ \frac{\partial y_{m}}{\partial x_{1}} & \frac{\partial y_{m}}{\partial x_{2}} & \cdots & \frac{\partial y_{m}}{\partial x_{n}} \end{bmatrix}.
    $

    $
    \frac{\partial y}{\partial \mathbf{X}} = \begin{bmatrix} \frac{\partial y}{\partial x_{11}} & \frac{\partial y}{\partial x_{21}} & \cdots & \frac{\partial y}{\partial x_{p1}} \\ \frac{\partial y}{\partial x_{12}} & \frac{\partial y}{\partial x_{22}} & \cdots & \frac{\partial y}{\partial x_{p2}} \\ \vdots & \vdots & \ddots & \vdots \\ \frac{\partial y}{\partial x_{1q}} & \frac{\partial y}{\partial x_{2q}} & \cdots & \frac{\partial y}{\partial x_{pq}} \end{bmatrix}.
    $
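
    The scalar-by-matrix definition above can be verified numerically. Here is a small sketch (my own hypothetical example, assumes NumPy): for $y = \mathbf{a}^{\mathsf T} \mathbf{X} \mathbf{b}$ we have $\partial y / \partial x_{ij} = a_i b_j$, so the numerator-layout derivative, whose entry $(j, i)$ holds $\partial y / \partial x_{ij}$, is the $q \times p$ matrix $\mathbf{b}\mathbf{a}^{\mathsf T}$ (denominator layout would give $\mathbf{a}\mathbf{b}^{\mathsf T}$).

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    p, q = 3, 4
    a = rng.standard_normal(p)
    b = rng.standard_normal(q)
    X = rng.standard_normal((p, q))

    def y(X):
        # scalar-valued function of a p x q matrix
        return a @ X @ b

    # Build the numerator-layout derivative entrywise by finite differences:
    # entry (j, i) of G holds dy / dx_ij, so G is q x p.
    eps = 1e-6
    G = np.zeros((q, p))
    for i in range(p):
        for j in range(q):
            dX = np.zeros((p, q))
            dX[i, j] = eps
            G[j, i] = (y(X + dX) - y(X)) / eps

    # G should match the outer product b a^T.
    ```

    Since $y$ is linear in $\mathbf{X}$, the finite-difference quotient recovers $b_j a_i$ essentially exactly.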

    The following definitions are only provided in numerator-layout notation:

    $
    \frac{\partial \mathbf{Y}}{\partial x} = \begin{bmatrix} \frac{\partial y_{11}}{\partial x} & \frac{\partial y_{12}}{\partial x} & \cdots & \frac{\partial y_{1n}}{\partial x} \\ \frac{\partial y_{21}}{\partial x} & \frac{\partial y_{22}}{\partial x} & \cdots & \frac{\partial y_{2n}}{\partial x} \\ \vdots & \vdots & \ddots & \vdots \\ \frac{\partial y_{m1}}{\partial x} & \frac{\partial y_{m2}}{\partial x} & \cdots & \frac{\partial y_{mn}}{\partial x} \end{bmatrix}.
    $

    $
    d\mathbf{X} = \begin{bmatrix} dx_{11} & dx_{12} & \cdots & dx_{1n} \\ dx_{21} & dx_{22} & \cdots & dx_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ dx_{m1} & dx_{m2} & \cdots & dx_{mn} \end{bmatrix}.
    $
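
    The differential $d\mathbf{X}$ is what makes the compact identity $dy = \operatorname{tr}(G \, d\mathbf{X})$ work, where $G$ is the numerator-layout derivative. A quick numeric sketch (my own hypothetical example, assumes NumPy): for $y = \operatorname{tr}(A\mathbf{X})$ the numerator-layout derivative is $G = A$, and because $y$ is linear the first-order expansion holds exactly.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    A = rng.standard_normal((3, 3))
    X = rng.standard_normal((3, 3))
    dX = 1e-3 * rng.standard_normal((3, 3))   # a small perturbation matrix

    def y(M):
        return np.trace(A @ M)

    dy_exact = y(X + dX) - y(X)       # actual change in y
    dy_formula = np.trace(A @ dX)     # tr(G dX) with G = A
    ```

    For a nonlinear scalar function the same identity holds to first order in $d\mathbf{X}$.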


  • Original post: https://www.cnblogs.com/alex-bn-lee/p/10292729.html