Whiteboard from today's meeting on Bayesian ML:
Cox: P(A, B) = P(A|B) P(B) = P(B|A) P(A)
=> P(A|B) = [P(B|A) P(A)] / P(B)  (Bayes' theorem)
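To make the formula concrete, here is a quick numeric check of Bayes' theorem. The numbers (a rare-event prior of 0.01 and the two likelihoods) are illustrative choices, not from the meeting:

```python
# Hypothetical numbers: A is a rare event, B is noisy evidence for it.
p_a = 0.01            # prior P(A)
p_b_given_a = 0.9     # likelihood P(B|A)
p_b_given_not_a = 0.05  # false-positive rate P(B|not A)

# Law of total probability: P(B) = P(B|A)P(A) + P(B|not A)P(not A)
p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)

# Bayes: P(A|B) = P(B|A)P(A) / P(B)
p_a_given_b = p_b_given_a * p_a / p_b
print(p_a_given_b)
```

Even with a fairly reliable test, the posterior stays modest (about 0.15 here) because the prior is small.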
The ML model needs to be a generative model: a statistical model that generates data.
Example: a mixture of normal distributions. Take two normal distributions; pick one of them with probability 1/2 each, then generate a data point from the chosen distribution.
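The two-step generative process above can be sketched directly; the means, standard deviation, and seed below are illustrative choices, not from the meeting:

```python
import random

def sample_mixture(n, mu1=0.0, mu2=5.0, sigma=1.0, seed=0):
    """Draw n points from an equal-weight mixture of two normals."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        # Step 1: choose a component with probability 1/2 each.
        mu = mu1 if rng.random() < 0.5 else mu2
        # Step 2: generate a data point from the chosen normal.
        samples.append(rng.gauss(mu, sigma))
    return samples

data = sample_mixture(1000)
print(sum(data) / len(data))  # roughly midway between the two means
```

A histogram of `data` would show the characteristic two-bump shape of the mixture.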
(n+1)! = (n+1) · n!
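The factorial recurrence jotted above translates directly into a recursive definition; a minimal sketch:

```python
def factorial(n):
    # Base case 0! = 1; otherwise use the recurrence n! = n * (n-1)!
    return 1 if n == 0 else n * factorial(n - 1)

print(factorial(5))  # 120
```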
https://en.wikipedia.org/wiki/Conjugate_prior
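A minimal sketch of why conjugate priors are convenient, using the Beta-Bernoulli pair (the Beta(2, 2) prior and the observations are illustrative choices, not from the meeting): conjugacy means the posterior stays in the same family, so the update reduces to counting.

```python
alpha, beta = 2.0, 2.0        # Beta prior hyperparameters (assumed)
data = [1, 0, 1, 1, 0, 1]     # Bernoulli observations (1 = success)

# Conjugate update: posterior is Beta(alpha + successes, beta + failures),
# with no integration required.
alpha_post = alpha + sum(data)
beta_post = beta + len(data) - sum(data)

posterior_mean = alpha_post / (alpha_post + beta_post)
print(alpha_post, beta_post, posterior_mean)  # 6.0 4.0 0.6
```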
Tuesday, February 25, 2020
Saturday, February 22, 2020
We reviewed Bayesian thinking and its relation to ML. https://drive.google.com/file/d/1CHxUPqtkyNWCklT7sgOC6I7YzAj29R-s/view?usp=sharing
Our next ML study group meeting will take place on Monday the 8th of October. I'll cover the contraction theorem. See relevant s...
ML crash directory: Are you familiar with regression - https://m.youtube.com/watch?v=aq8VU5KLmkY ? One way to view ML is regression on ster...
We'll cover LDA in tw's meeting. Here is the slide - https://drive.google.com/open?id=1KRoCA4vo9H9oJOl3iD-qRqIHl9qQq9vf This is ...