Bayes Example: Biased Coin
Consider Bernoulli trials of flipping a coin. What is the probability of heads if we got \(k\) heads in \(n\) trials?
Maximum Likelihood Estimate
Let \(\theta\) be the probability of heads. We represent the \(n\) Bernoulli trial results with \(n\) random variables \(X_1, \ldots, X_n\):

\[X_i = \begin{cases} 1 & \text{if the } i\text{-th flip is heads} \\ 0 & \text{otherwise} \end{cases}\]

where \(i \in [1, n]\).
Since \(X_1, \ldots, X_n\) are independent and identically distributed (i.i.d.), their conditional probability of heads given \(\theta\) is the same for every trial:

\[P(X_i = 1 \mid \theta) = \theta\]

where \(i \in [1, n]\).
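As a concrete illustration, such trials can be simulated directly. This is a minimal sketch; the true bias `theta_true` and trial count `n` are arbitrary illustrative values, not from the text:

```python
import numpy as np

rng = np.random.default_rng(0)

theta_true, n = 0.7, 10  # assumed true bias and trial count (illustrative)

# Each X_i is an independent Bernoulli(theta) draw: 1 for heads, 0 for tails.
x = rng.binomial(1, theta_true, size=n)
k = x.sum()  # number of heads observed
print(x, k)
```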
The likelihood of observing \(k\) heads in \(n\) trials is

\[L(\theta) = P(X_1, \ldots, X_n \mid \theta) = \theta^k (1 - \theta)^{n - k}\]

Finding the maximum using derivatives (of the log-likelihood, which has the same maximizer):

\[\frac{d}{d\theta}\left[k \ln \theta + (n - k) \ln(1 - \theta)\right] = \frac{k}{\theta} - \frac{n - k}{1 - \theta} = 0 \implies \hat{\theta}_{\mathrm{MLE}} = \frac{k}{n}\]
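As a sanity check, here is a short Python sketch (the counts \(k\) and \(n\) below are arbitrary illustrative values) that maximizes the log-likelihood on a dense grid and compares the result with the closed form \(k/n\):

```python
import numpy as np

k, n = 7, 10  # illustrative counts: k heads in n flips

# Log-likelihood of k heads in n Bernoulli trials with parameter theta.
def log_likelihood(theta):
    return k * np.log(theta) + (n - k) * np.log(1 - theta)

# Evaluate on a dense grid and take the argmax as a numerical MLE.
grid = np.linspace(1e-6, 1 - 1e-6, 100_000)
theta_numeric = grid[np.argmax(log_likelihood(grid))]

print(theta_numeric)  # ~0.7
print(k / n)          # closed-form MLE: 0.7
```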
Bayes Estimate
The Maximum Likelihood Estimate (MLE) only gives the most probable value; it does not rule out other values. In fact, the probability being estimated is itself a random variable with its own distribution and expectation, and the Bayes estimate gives that expectation.
Preliminary Knowledge
To calculate the posterior probability of a coin head, we need to know the Beta function:

\[B(a, b) = \int_0^1 \theta^{a - 1} (1 - \theta)^{b - 1} \, d\theta = \frac{\Gamma(a)\,\Gamma(b)}{\Gamma(a + b)}\]

For the integer arguments used below, this gives \(B(k + 1, n - k + 1) = \frac{k!\,(n - k)!}{(n + 1)!}\).
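To double-check the integer-argument identity, a short sketch using `scipy.special.beta` (the counts are again illustrative):

```python
from math import factorial

from scipy.special import beta  # B(a, b) = Gamma(a) Gamma(b) / Gamma(a + b)

k, n = 7, 10  # illustrative counts

# B(k + 1, n - k + 1) should equal k! (n - k)! / (n + 1)!.
lhs = beta(k + 1, n - k + 1)
rhs = factorial(k) * factorial(n - k) / factorial(n + 1)
print(lhs, rhs)  # both ~0.000758
```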
Bayes Formula
Before the trials, we believe that all values of \(\theta\) are equally likely, so the prior is uniform:

\[p(\theta) = 1, \quad \theta \in [0, 1]\]

By Bayes' formula, the posterior density after observing \(k\) heads in \(n\) trials is

\[p(\theta \mid k) = \frac{P(k \mid \theta)\, p(\theta)}{\int_0^1 P(k \mid t)\, p(t)\, dt} = \frac{\theta^k (1 - \theta)^{n - k}}{B(k + 1, n - k + 1)}\]

and the Bayes estimate is its expectation:

\[\hat{\theta}_{\mathrm{Bayes}} = E[\theta \mid k] = \int_0^1 \theta \, p(\theta \mid k)\, d\theta = \frac{B(k + 2, n - k + 1)}{B(k + 1, n - k + 1)} = \frac{k + 1}{n + 2}\]
Suppose we got all heads in 100 trials (\(k = n = 100\)). The expected value of the head probability is \(\frac{k + 1}{n + 2} = \frac{101}{102} \approx 99.02\%\).
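A quick numerical check of this example, as a sketch under the uniform prior: a plain Riemann sum over a grid stands in for the exact Beta integral (the grid spacing cancels in the ratio), and the result should match \(\frac{k + 1}{n + 2}\):

```python
import numpy as np

def posterior_mean(k, n):
    # Unnormalized posterior under a uniform prior: theta^k (1 - theta)^(n - k).
    theta = np.linspace(0.0, 1.0, 1_000_001)
    post = theta**k * (1 - theta)**(n - k)
    # Posterior mean as a ratio of Riemann sums; the grid spacing cancels.
    return (theta * post).sum() / post.sum()

print(posterior_mean(100, 100))  # ~0.990196
print(101 / 102)                 # 0.990196...
```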