Chapters 2 and 3
Lecture 2
Chapters 1 and 2 are basically all garbage. Come on, you know that stuff already, get over it. I'm going to talk about chapters 3+.
Laws
Permutations
Laws Of Probability
Addition Rules
For mutually exclusive events A and B (like two different hair colors), $P(A \cup B) = P(A) + P(B)$.
If A and B are not mutually exclusive (like having a worm, a virus, or both), then $P(A \cup B) = P(A) + P(B) - P(A \cap B)$.
Complement Rule
For any event A, $P(\bar{A}) = 1 - P(A)$.
Fun Bonus Rules
If A is a subset of B, then $P(A) \le P(B)$.
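A quick sanity check with a die roll (a standard fact, not from the lecture): rolling a 6 is a subset of rolling an even number, so

$$\{6\} \subseteq \{2, 4, 6\} \implies P(6) = \tfrac{1}{6} \le P(\text{even}) = \tfrac{1}{2}.$$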
Examples
Let V be the event that a computer has a virus and let W be the event that a computer has a worm. Suppose $P(V) = 0.15$, $P(W) = 0.05$, and $P(V \cup W)$ is given.

- Probability of both a worm and a virus: rearrange the addition rule to get $P(V \cap W) = P(V) + P(W) - P(V \cup W)$.
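To make the computation concrete, assume for illustration that $P(V \cup W) = 0.17$ (a made-up stand-in value). Then:

$$P(V \cap W) = P(V) + P(W) - P(V \cup W) = 0.15 + 0.05 - 0.17 = 0.03.$$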
Lecture 3
"If B, then what I the probability of A" this also means "A given B"
and of course
This is for quantifying the relationship between two events.
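Plugging in the virus/worm numbers from Lecture 2 (including the assumed, made-up value $P(V \cap W) = 0.03$ from the illustration there):

$$P(V \mid W) = \frac{P(V \cap W)}{P(W)} = \frac{0.03}{0.05} = 0.6.$$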
Marginal Distribution
Uhhhhhh just add together all the values of the variables you don't care about. Like in the Total column below we're marginalizing out eye color to get the distribution of hair color.
|  | Brown Eyes | Black Eyes | Total |
| --- | --- | --- | --- |
| Red hair | 5 | 6 | 11 |
| Blue hair | 6 | 5 | 11 |
| Total | 11 | 11 | 22 |
So $P(\text{Red hair}) = \frac{11}{22} = 0.5$.
So $P(\text{Blue hair}) = \frac{11}{22} = 0.5$.
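Spelling out the "add together" step for one row of the table:

$$P(\text{Red hair}) = P(\text{Red} \cap \text{Brown eyes}) + P(\text{Red} \cap \text{Black eyes}) = \tfrac{5}{22} + \tfrac{6}{22} = \tfrac{11}{22}.$$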
Independent Events
Just make sure that the probability of one thing given another is the same as its unconditional probability. Is the probability of red hair given brown eyes the same as the probability of red hair? If so, then the brown eyes have no influence.
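Checking that with the table above:

$$P(\text{Red hair} \mid \text{Brown eyes}) = \frac{5/22}{11/22} = \frac{5}{11} \approx 0.45 \ne 0.5 = P(\text{Red hair}),$$

so hair color and eye color in this table are not independent.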
Multiplicative Law of Probability
If A and B are two events with $P(A) > 0$ and $P(B) > 0$, then $P(A \cap B) = P(A)\,P(B \mid A)$.
Similarly we can have $P(A \cap B) = P(B)\,P(A \mid B)$.
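For instance, with the hair/eye table:

$$P(\text{Red hair} \cap \text{Brown eyes}) = P(\text{Brown eyes})\,P(\text{Red hair} \mid \text{Brown eyes}) = \tfrac{11}{22} \cdot \tfrac{5}{11} = \tfrac{5}{22},$$

which matches the table cell (5 out of 22).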
If A and B are independent: $P(A \cap B) = P(A)\,P(B)$.
Now, given the multiplicative rules, we have three ways to check for independence:

- $P(A \mid B) = P(A)$
- $P(B \mid A) = P(B)$
- $P(A \cap B) = P(A)\,P(B)$

If any one of the above is true, then they're all true.
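A case where all three checks pass (a standard example, not from the lecture): flip a fair coin and roll a fair die, with H = heads and 6 = rolling a six.

$$P(H \mid 6) = \tfrac{1}{2} = P(H), \qquad P(6 \mid H) = \tfrac{1}{6} = P(6), \qquad P(H \cap 6) = \tfrac{1}{12} = P(H)\,P(6).$$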
Law of Total Probability
Really intuitive (picture the Venn diagram): the event A splits into the piece inside B and the piece inside $\bar{B}$. If $0 < P(B) < 1$, then $P(A) = P(A \mid B)\,P(B) + P(A \mid \bar{B})\,P(\bar{B})$.
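Using the hair/eye table as a sanity check:

$$P(\text{Brown eyes}) = P(\text{Brown} \mid \text{Red})\,P(\text{Red}) + P(\text{Brown} \mid \text{Blue})\,P(\text{Blue}) = \tfrac{5}{11}\cdot\tfrac{11}{22} + \tfrac{6}{11}\cdot\tfrac{11}{22} = \tfrac{11}{22}.$$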
Gigachad Bayes' Rule
Combine the law of total probability with the multiplicative rule:

$$P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B \mid A)\,P(A) + P(B \mid \bar{A})\,P(\bar{A})}$$

This is important: notice that the denominator of the rule is just $P(B)$ directly, expanded via the law of total probability.
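Same table numbers run through Bayes' rule:

$$P(\text{Red hair} \mid \text{Brown eyes}) = \frac{P(\text{Brown} \mid \text{Red})\,P(\text{Red})}{P(\text{Brown} \mid \text{Red})\,P(\text{Red}) + P(\text{Brown} \mid \text{Blue})\,P(\text{Blue})} = \frac{(5/11)(11/22)}{(5/11)(11/22) + (6/11)(11/22)} = \frac{5}{11},$$

and the denominator works out to exactly $P(\text{Brown eyes}) = 11/22$.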
General Bayes' Rule
Let $A_1, A_2, \dots, A_k$ be mutually exclusive and exhaustive events with $P(A_i) > 0$ for each $i$. Furthermore, let B be any event with $P(B) > 0$. Then

$$P(A_j \mid B) = \frac{P(B \mid A_j)\,P(A_j)}{\sum_{i=1}^{k} P(B \mid A_i)\,P(A_i)}$$
Basically the same thing as above, except the denominator sums over every event in the partition: expand $P(B)$ as $\sum_{i} P(B \mid A_i)\,P(A_i)$, i.e., intersect B with each $A_i$ and add up the pieces.
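A made-up illustration (the events and numbers are hypothetical, not from the lecture): suppose a file comes from one of three servers $A_1, A_2, A_3$ with $P(A_1) = 0.5$, $P(A_2) = 0.3$, $P(A_3) = 0.2$, and the chance it is corrupted (event B) is $P(B \mid A_1) = 0.01$, $P(B \mid A_2) = 0.02$, $P(B \mid A_3) = 0.05$. Then

$$P(A_3 \mid B) = \frac{(0.05)(0.2)}{(0.01)(0.5) + (0.02)(0.3) + (0.05)(0.2)} = \frac{0.010}{0.021} \approx 0.48.$$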