Bayes for Beginners: Probability and Likelihood. Well worth a read, very useful.
For the longest time I could not understand the difference between Probability and Likelihood, or why the two become equal when you swap the conditioning.
Definitions:
Probability is the chance of an outcome given fixed parameters. It must lie between 0 and 1, and the outcomes are mutually exclusive and sum to 1. This is exactly what the familiar probability (mass/density) plots of the Poisson, binomial, and normal distributions describe.
Likelihood fixes the observed outcome and varies the parameters instead. Likelihood values need not sum to 1 and are not mutually exclusive, so only ratios of likelihoods are meaningful.
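To make the two readings concrete, here is a small sketch of my own (not from the article) using the binomial pmf: the same formula is a probability when you fix the parameter and vary the outcome, and a likelihood when you fix the outcome and vary the parameter.

```python
from math import comb

# Binomial pmf: probability of k heads in n tosses with heads-probability p.
def binom_pmf(k: int, n: int, p: float) -> float:
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

# Probability view: fix the parameter p, vary the outcome k.
# The values are mutually exclusive and sum to 1.
probs = [binom_pmf(k, 10, 0.5) for k in range(11)]
print(sum(probs))  # ~1.0

# Likelihood view: fix the outcome (say 7 heads in 10 tosses), vary p.
# These values have no reason to sum to 1; only their ratios matter.
liks = [binom_pmf(7, 10, p) for p in (0.3, 0.5, 0.7)]
print(liks)
```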
As for why L = P: that is simply how likelihood is defined, and the Wikipedia article explains it very clearly.
Consider a simple statistical model of a coin flip, with a single parameter p_H that expresses the "fairness" of the coin. This parameter is the probability that a given coin lands heads up ("H") when tossed. p_H can take on any numeric value within the range 0.0 to 1.0. For a perfectly fair coin, p_H = 0.5.
Imagine flipping a coin twice, and observing the following data: two heads in two tosses ("HH"). Assuming that each successive coin flip is IID, the probability of observing HH is

P(HH | p_H = 0.5) = 0.5^2 = 0.25.
Hence: given the observed data HH, the likelihood that the model parameter p_H equals 0.5 is 0.25. Mathematically, this is written as

L(p_H = 0.5 | HH) = P(HH | p_H = 0.5) = 0.25.
This is not the same as saying that the probability that p_H = 0.5, given the observation HH, is 0.25. (For that, we could apply Bayes' theorem, which implies that the posterior probability is proportional to the likelihood times the prior probability.)
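As an aside, that Bayes computation can be sketched on a grid. This is my own illustration (not from the article), and it assumes a uniform prior over p_H; the posterior after observing HH is then proportional to the likelihood p_H^2, renormalized to sum to 1.

```python
# Grid of candidate values for p_H.
ps = [i / 100 for i in range(101)]

prior = [1 / len(ps)] * len(ps)        # uniform prior (an assumption)
likelihood = [p ** 2 for p in ps]      # L(p_H | HH) = p_H^2

# Posterior ∝ likelihood × prior, normalized so the masses sum to 1.
unnorm = [pr * lk for pr, lk in zip(prior, likelihood)]
z = sum(unnorm)
posterior = [u / z for u in unnorm]

# After seeing HH, most posterior mass sits above p_H = 0.5
# (the exact continuous answer is 1 - 0.5^3 = 0.875).
mass_above_half = sum(m for p, m in zip(ps, posterior) if p > 0.5)
print(mass_above_half)
```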
Suppose that the coin is not a fair coin, but instead has p_H = 0.3. Then the probability of getting two heads is

P(HH | p_H = 0.3) = 0.3^2 = 0.09.

Hence

L(p_H = 0.3 | HH) = P(HH | p_H = 0.3) = 0.09.
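The two likelihood values above are just the same function p_H^2 evaluated at two points; a minimal check:

```python
# Likelihood of the model parameter p_H given the observed data HH.
# With independent tosses, L(p_H | HH) = P(HH | p_H) = p_H ** 2.
def likelihood_hh(p_h: float) -> float:
    return p_h ** 2

print(likelihood_hh(0.5))  # 0.25
print(likelihood_hh(0.3))  # ~0.09 (up to float rounding)
```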
More generally, for each value of p_H, we can calculate the corresponding likelihood. The result of such calculations is displayed in Figure 1.
In Figure 1, the integral of the likelihood over the interval [0, 1] is 1/3. That illustrates an important aspect of likelihoods: likelihoods do not have to integrate (or sum) to 1, unlike probabilities.
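The 1/3 figure can be checked numerically; this sketch approximates the integral of L(p_H | HH) = p_H^2 over [0, 1] with a midpoint Riemann sum.

```python
# Midpoint-rule approximation of ∫_0^1 p^2 dp, which equals 1/3 exactly.
# The fact that this is not 1 shows the likelihood is not a probability
# distribution over p_H.
n = 100_000
total = sum(((i + 0.5) / n) ** 2 for i in range(n)) / n
print(total)  # ~0.3333
```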