  • Expectation-Maximization Algorithm ---- PRML Reading Notes

    An elegant and powerful method for finding maximum likelihood solutions for models with latent variables is called the expectation-maximization algorithm, or EM algorithm.

    If we assume that the data points are drawn independently from the distribution, then the log of the likelihood function is given by

    ln p(X | π, μ, Σ) = Σ_{n=1}^{N} ln { Σ_{k=1}^{K} π_k N(x_n | μ_k, Σ_k) }
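As a quick sanity check, this log likelihood can be evaluated directly. The sketch below is a minimal illustration using NumPy and SciPy; the function name and toy data are my own choices, not from PRML:

```python
import numpy as np
from scipy.stats import multivariate_normal

def gmm_log_likelihood(X, pi, mu, sigma):
    # ln p(X|π,μ,Σ) = Σ_n ln { Σ_k π_k N(x_n | μ_k, Σ_k) }
    # dens[n, k] = π_k * N(x_n | μ_k, Σ_k)
    dens = np.stack(
        [p * multivariate_normal.pdf(X, mean=m, cov=s)
         for p, m, s in zip(pi, mu, sigma)],
        axis=1)
    return float(np.log(dens.sum(axis=1)).sum())

# Toy check: one standard-normal component in 2-D, three points at the origin,
# where N(0 | 0, I) = 1/(2π), so the total log likelihood is 3 * (-ln 2π).
X = np.zeros((3, 2))
ll = gmm_log_likelihood(X, pi=[1.0], mu=[np.zeros(2)], sigma=[np.eye(2)])
```

For numerical robustness with many components or high dimensions, a practical implementation would work with log densities and `scipy.special.logsumexp` instead of summing raw densities.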

    EM for Gaussian Mixtures

    Given a Gaussian mixture model, the goal is to maximize the likelihood function with respect to the parameters (comprising the means and covariances of the components and the mixing coefficients).

    1. Initialize the means μ_k, covariances Σ_k and mixing coefficients π_k, and evaluate the initial value of the log likelihood.

    2. E step. Evaluate the responsibilities using the current parameter values:

    γ(z_nk) = π_k N(x_n | μ_k, Σ_k) / Σ_{j=1}^{K} π_j N(x_n | μ_j, Σ_j)

    3. M step. Re-estimate the parameters using the current responsibilities:

    N_k = Σ_n γ(z_nk)
    μ_k^new = (1/N_k) Σ_n γ(z_nk) x_n
    Σ_k^new = (1/N_k) Σ_n γ(z_nk) (x_n − μ_k^new)(x_n − μ_k^new)^T
    π_k^new = N_k / N

    4. Evaluate the log likelihood

    ln p(X | π, μ, Σ) = Σ_{n=1}^{N} ln { Σ_{k=1}^{K} π_k N(x_n | μ_k, Σ_k) }

    and check for convergence of either the parameters or the log likelihood. If the convergence criterion is not satisfied, return to step 2.
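The four steps above can be sketched end to end as follows. This is a minimal NumPy/SciPy illustration, not reference code from PRML; the function name, the random-data-point initialization, and the small ridge added to the covariances for numerical stability are my own choices:

```python
import numpy as np
from scipy.stats import multivariate_normal

def em_gmm(X, K, n_iter=200, tol=1e-6, seed=0):
    """EM for a K-component Gaussian mixture, following steps 1-4 above."""
    rng = np.random.default_rng(seed)
    N, D = X.shape
    # 1. Initialize: means at random data points, shared data covariance, uniform weights.
    mu = X[rng.choice(N, size=K, replace=False)].copy()
    sigma = np.stack([np.cov(X.T) + 1e-6 * np.eye(D) for _ in range(K)])
    pi = np.full(K, 1.0 / K)
    prev_ll = -np.inf
    for _ in range(n_iter):
        # dens[n, k] = π_k * N(x_n | μ_k, Σ_k) under the current parameters.
        dens = np.stack([pi[k] * multivariate_normal.pdf(X, mu[k], sigma[k])
                         for k in range(K)], axis=1)
        # 4. Evaluate the log likelihood and check for convergence.
        ll = float(np.log(dens.sum(axis=1)).sum())
        if ll - prev_ll < tol:
            break
        prev_ll = ll
        # 2. E step: responsibilities γ(z_nk).
        gamma = dens / dens.sum(axis=1, keepdims=True)
        # 3. M step: re-estimate means, covariances, and mixing coefficients.
        Nk = gamma.sum(axis=0)
        mu = (gamma.T @ X) / Nk[:, None]
        for k in range(K):
            d = X - mu[k]
            sigma[k] = (gamma[:, k, None] * d).T @ d / Nk[k] + 1e-6 * np.eye(D)
        pi = Nk / N
    return pi, mu, sigma, ll
```

On well-separated clusters this recovers the component means; since EM only finds a local maximum of the likelihood, practical implementations usually run several random restarts and keep the best result.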

  • Original post: https://www.cnblogs.com/donggongdechen/p/9813183.html