[SOLVED] CS Bayesian Page 1


Page 1
EX TL A commuter encounters two traffic signals on the drive home. There is a 0.80 probability of being stopped at at least one of the signals, a 0.65 probability of being stopped at the first signal, and a 0.55 probability of being stopped at the second.
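By inclusion-exclusion, P(stopped at both) = 0.65 + 0.55 − 0.80 = 0.40, and the other joint probabilities follow. A quick check (the variable names are just labels for this sketch):

```python
# Inclusion-exclusion for EX TL: P(A or B) = P(A) + P(B) - P(A and B).
p_either = 0.80   # stopped at at least one signal
p_first  = 0.65   # stopped at the first signal
p_second = 0.55   # stopped at the second signal

p_both = p_first + p_second - p_either   # P(stopped at both) = 0.40
p_first_only = p_first - p_both          # stopped at first but not second
p_neither = 1 - p_either                 # stopped at neither signal

print(p_both, p_first_only, p_neither)
```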
EX 2D Toss two fair 6-sided dice. The sample space is S = { (i, j) : i = 1, …, 6; j = 1, …, 6 }. The probability of an event is the number of outcomes in the event divided by 36. Some events:
A1 = sum is 7              = { (1,6), (2,5), (3,4), (4,3), (5,2), (6,1) }
A2 = second die is 2       = { (1,2), (2,2), (3,2), (4,2), (5,2), (6,2) }
A3 = first die is 3        = { (3,1), (3,2), (3,3), (3,4), (3,5), (3,6) }
A7 = first die is 1, 2 or 3
A8 = first die is 3, 4 or 5
A9 = sum is 9              = { (3,6), (4,5), (5,4), (6,3) }

P(A1) = 1/6, P(A2) = 1/6, P(A3) = 1/6, P(A7) = 1/2, P(A8) = 1/2, P(A9) = 1/9
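The listed probabilities can be verified by brute-force enumeration of the 36 equally likely outcomes; a small sketch:

```python
from fractions import Fraction

# EX 2D: enumerate the sample space and check the stated probabilities.
S = [(i, j) for i in range(1, 7) for j in range(1, 7)]

def prob(event):
    """Probability = (# outcomes in the event) / 36, as an exact fraction."""
    return Fraction(sum(1 for o in S if event(o)), len(S))

A1 = prob(lambda o: o[0] + o[1] == 7)    # sum is 7
A2 = prob(lambda o: o[1] == 2)           # second die is 2
A7 = prob(lambda o: o[0] in (1, 2, 3))   # first die is 1, 2 or 3
A9 = prob(lambda o: o[0] + o[1] == 9)    # sum is 9

print(A1, A2, A7, A9)  # 1/6 1/6 1/2 1/9
```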
EX 27 A bucket contains nB black and nW white balls (define n = nB + nW). Balls are randomly selected, without replacement. What's the probability we first see a white ball on the ith selection? Twist: suppose instead the sampling is random with replacement.
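A sketch of both sampling schemes (the function names are ours; exact fractions keep the sanity check exact). Without replacement, the first i − 1 draws must all be black and the ith white; with replacement the answer is geometric.

```python
from fractions import Fraction

# EX 27 sketch: P(first white ball appears on draw i).
def first_white_no_replacement(nB, nW, i):
    """Draws 1..i-1 are black, draw i is white, sampling without replacement."""
    n = nB + nW
    p = Fraction(1)
    for j in range(i - 1):                  # j black balls already removed
        p *= Fraction(nB - j, n - j)
    return p * Fraction(nW, n - (i - 1))

def first_white_with_replacement(nB, nW, i):
    """Geometric: (nB/n)^(i-1) * (nW/n)."""
    n = nB + nW
    return Fraction(nB, n) ** (i - 1) * Fraction(nW, n)

# Sanity check: without replacement a white must appear by draw nB + 1,
# so the probabilities over i = 1, ..., nB + 1 sum to 1.
total = sum(first_white_no_replacement(3, 2, i) for i in range(1, 5))
print(total)  # 1
```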
EX 28 A bucket contains 5 black, 7 yellow and 9 red balls. A set of 3 balls is randomly selected.
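The specific questions for EX 28 are not reproduced above, so as an illustration here is the standard counting approach for two typical questions; the quantities computed are our own choices, not necessarily the original parts.

```python
from math import comb

# EX 28 sketch: 5 black, 7 yellow, 9 red; choose 3 of the 21 without replacement.
total = comb(21, 3)                                    # 1330 equally likely sets

# P(all three balls the same color): choose 3 from a single color group.
p_same = (comb(5, 3) + comb(7, 3) + comb(9, 3)) / total

# P(one ball of each color): one choice from each group.
p_one_each = (5 * 7 * 9) / total

print(p_same, p_one_each)
```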
EX CARDS N cards labeled 1, 2, …, N are randomly permuted into N bins labeled 1, 2, …, N (one card per bin). What's the probability that exactly k of the card labels match the bin's label? Recall from earlier that if there are M cards & bins, the probability of no matches, denoted P_M, is
P_M = Σ_{j=0}^{M} (−1)^j / j!  =  Σ_{j=2}^{M} (−1)^j / j!
    = 1/2! − 1/3! + 1/4! − 1/5! + … + (−1)^M / M!
    ≈ e^{−1}  (when M is large)
The first sum's first two terms are 1 and −1 and cancel, leading to the equivalent second sum. See the next page for some results.
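A quick numerical check of how fast P_M approaches e^{−1} (a sketch; the function name is ours):

```python
from math import e, factorial

# P_M = sum_{j=0}^{M} (-1)^j / j! : probability of no matches with M cards.
def P(M):
    return sum((-1) ** j / factorial(j) for j in range(M + 1))

for M in (3, 5, 7, 10):
    print(M, P(M), abs(P(M) - 1 / e))   # the error shrinks like 1/(M+1)!
```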
EX VOTE There are 4 precincts in a city. 15% of voters reside in precinct 1, 35% in precinct 2; 10% in precinct 3 and 40% in precinct 4. In precinct 1 the incumbent is favored by 60% of voters; favorability in the remaining precincts, respectively, is 45%, 20% and 55%.
a. If we select a voter at random from the city, what is the probability the voter is a Precinct 1 resident that favors the incumbent?
b. If we select a voter at random from the city, what is the probability the voter favors the incumbent?
c. Suppose the selected voter indeed does favor the incumbent. What's the probability the voter comes from precinct 1? 2? 3? 4?
EX RRS How do we ensure reasonable privacy of responses when surveying people on a sensitive issue? Here's one potential way. A sample of people is handed this survey.
If your father was born in a month with 31 days, respond to only Q1 below; otherwise respond to only Q2. (Do not identify the question you answer.)
Q1: Is the second-to-last digit of your social security number odd (1, 3, 5, 7, or 9)? Q2: Have you ever cheated in school?
Your response (circle one): YES   NO
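Assuming birth months are uniform over the twelve months (so a respondent answers Q1 with probability 7/12, since seven months have 31 days) and the SSN digit is odd with probability 1/2, the unknown cheating rate can be recovered from the overall YES proportion. A sketch (the function name and the 40% figure are illustrative):

```python
# EX RRS sketch: recovering the sensitive rate from randomized responses.
p_q1 = 7 / 12          # P(directed to Q1): seven months have 31 days (assumed uniform)
p_yes_q1 = 1 / 2       # P(yes | Q1): an SSN digit is odd with probability 1/2

def estimate_cheat_rate(p_yes_observed):
    """Invert P(yes) = p_q1 * p_yes_q1 + (1 - p_q1) * theta for theta."""
    return (p_yes_observed - p_q1 * p_yes_q1) / (1 - p_q1)

# If, say, 40% of the sample circles YES, the implied cheating rate is:
print(estimate_cheat_rate(0.40))
```

No individual answer reveals which question was answered, yet the aggregate cheating rate is still estimable.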

Page 2
EX SMOKE Heavy smokers are 8 times more likely than non-smokers to develop lung cancer; light smokers are 3 times as likely. The probability a (randomly selected) adult is a heavy smoker is 0.1; the probability is 0.25 that the adult is a light smoker. If a person is found to have lung cancer, what is the probability the person was a heavy smoker? A light smoker? A non-smoker?
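Since only relative risks are given, write P(cancer | non-smoker) = c; the unknown c cancels in Bayes' rule. A sketch:

```python
# EX SMOKE: relative risks suffice -- the base rate c = P(cancer | non-smoker)
# cancels out of Bayes' rule.
prior = {"heavy": 0.10, "light": 0.25, "non": 0.65}
risk  = {"heavy": 8, "light": 3, "non": 1}    # P(cancer | group) = risk * c

weights = {g: prior[g] * risk[g] for g in prior}   # proportional to P(group AND cancer)
total = sum(weights.values())                      # 0.8 + 0.75 + 0.65 = 2.2 (times c)
posterior = {g: w / total for g, w in weights.items()}

print(posterior)   # heavy ~ 0.364, light ~ 0.341, non ~ 0.295
```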
EX CARDS (results)
Suppose N = 7.
P(exactly k matches) = P_{N−k} / k!; the approximation is e^{−1}/k!.

k    Prob of exactly k matches                            exact           approx e^{−1}/k!
0    1 − 1 + 1/2! − 1/3! + 1/4! − 1/5! + 1/6! − 1/7!      0.367857143     0.3678794
1    (1 − 1 + 1/2! − 1/3! + 1/4! − 1/5! + 1/6!) / 1!      0.368055556     0.3678794
2    (1 − 1 + 1/2! − 1/3! + 1/4! − 1/5!) / 2!             0.183333333     0.1839397
3    (1 − 1 + 1/2! − 1/3! + 1/4!) / 3!                    0.062500000     0.06131324
4    (1 − 1 + 1/2! − 1/3!) / 4!                           0.013888889     0.01532831
5    (1 − 1 + 1/2!) / 5!                                  0.004166667     0.003065662
6    (1 − 1) / 6!                                         0               0.0005109437
7    1 / 7!                                               0.0001984127    0.00007299195
SUM                                                       1.0000000000    0.9999898
The approximation requires N − k large. Here that's 7 − k. It's quite a good approximation even for fairly small values of N − k (for instance k = 3, where 7 − k = 4, which is not usually considered large among nonnegative integers).
Beyond 4 or 5 cards, it really doesn't matter how many cards there are. (Yes, it matters a bit.)
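The exact column can be reproduced by counting derangements, D_m = m! · P_m, since P(exactly k matches) = C(N, k) · D_{N−k} / N!. A sketch (the function names are ours):

```python
from math import comb, factorial, e

# EX CARDS: exact P(exactly k matches) for N cards via derangement counts,
# next to the e^{-1}/k! approximation tabulated above.
def derangements(m):
    """D_m = m! * sum_{j=0}^{m} (-1)^j / j!, rounded back to an integer."""
    return round(factorial(m) * sum((-1) ** j / factorial(j) for j in range(m + 1)))

def p_exact(N, k):
    """Choose which k cards match, derange the remaining N - k."""
    return comb(N, k) * derangements(N - k) / factorial(N)

N = 7
for k in range(N + 1):
    print(k, p_exact(N, k), e ** -1 / factorial(k))

print(sum(p_exact(N, k) for k in range(N + 1)))  # sums to 1 (up to float rounding)
```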


Bayesian Decision Theory
Two (complementary) hypotheses: H1 and H2. An event A.
Prior odds: P(H1)/P(H2).   Data: an event A.   Likelihood ratio: P(A|H1)/P(A|H2).

Example
There are two types of coin: Type 1 comes up Heads with probability p1; Type 2 comes up Heads with probability p2. A coin is randomly selected; the probability it is a Type 1 coin is p. (The probability it's a Type 2 coin is 1 − p. Be careful: p is a fundamentally different thing from p1 and p2.) The coin is then tossed n times, independently, resulting in the sequence R1 R2 R3 … Rn. For example, if n = 6 one possible sequence is H T H T H H. (The number of Heads in this sequence is x = 4.) The probability of this particular sequence is p_i(1 − p_i)p_i(1 − p_i)p_i p_i = p_i^4 (1 − p_i)^2, where i = 1 or 2. All other sequences with 4 Heads and 2 Tails have this same probability. In general, if there are x Heads (and n − x Tails) in some particular sequence, the probability of that sequence is p_i^x (1 − p_i)^(n−x). We now have:

Posterior odds: P(H1|A)/P(H2|A) = [P(H1)/P(H2)] × [P(A|H1)/P(A|H2)] = Prior Odds × Likelihood Ratio
For the coin example:

Prior odds:       P(H1)/P(H2) = p / (1 − p)
Data:             A = R1 R2 … Rn, where x of the Ri are Heads
Likelihood ratio: P(A|H1)/P(A|H2) = p1^x (1 − p1)^(n−x) / [ p2^x (1 − p2)^(n−x) ]
Posterior odds:   P(H1|A)/P(H2|A) = [p / (1 − p)] × p1^x (1 − p1)^(n−x) / [ p2^x (1 − p2)^(n−x) ]
A reasonable strategy for deciding which coin has been selected is to predict it is a Type 1 coin if the posterior odds are greater than 1 (then P(H1|A) is above 1/2) and, if not, predict it's a Type 2 coin. (This is somewhat arbitrary should the posterior odds be exactly 1; this generally can't happen. We will ignore the situation, though there are ways to work around it, such as simply calling it a tie and refusing to make a decision.1)
Prediction: Type 1 coin if and only if  [p / (1 − p)] × p1^x (1 − p1)^(n−x) / [ p2^x (1 − p2)^(n−x) ]  >  1.
1 When do we see posterior odds of 1? Try p = 0.5, take p1 = 0.3 and p2 = 0.7, and suppose the tossing results in 10 Heads in 20 tosses. (You can sort of see why neither coin is preferred here.)

