Originally Posted by psugirl2011
I need help with this problem.
A family buys two policies from the same insurance company. Losses under the two policies are independent and have continuous uniform distributions on the interval from 0 to 10. One policy has a deductible of 1 and the other has a deductible of 2. The family experiences exactly one loss under each policy.
Calculate the probability that the total benefit paid to the family does not exceed 5.
They have a graph with 0 < x < 10 and 0 < y < 10 and the line x + y = 8. How did they get x + y = 8?
There are already threads on this, so you'd probably help yourself by searching. The quick answer: let x and y be the two losses. When both losses exceed their deductibles, the benefits paid are (x - 1) and (y - 2). We want the total benefit to be at most 5, so (x - 1) + (y - 2) <= 5, which rearranges to x + y <= 8. In other words, the company keeps the first 1 + 2 = 3 of the losses as deductibles, so a total benefit of 5 corresponds to total losses of 1 + 2 + 5 = 8.
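Not part of the exam solution, but if you want to sanity-check the geometry, here's a quick Monte Carlo sketch (standard library only; the function name `total_benefit` is just my own label). The exact answer from integrating over the region is 29.5/100 = 0.295, and the simulation should land close to that:

```python
import random

random.seed(42)

def total_benefit(x, y):
    # Benefit on each policy is the loss minus the deductible, floored at 0.
    return max(x - 1, 0) + max(y - 2, 0)

n = 200_000
hits = sum(
    total_benefit(random.uniform(0, 10), random.uniform(0, 10)) <= 5
    for _ in range(n)
)
estimate = hits / n
print(estimate)  # should be close to the exact value 0.295
```

The 0.295 comes from adding up the areas where the total benefit stays at or below 5: both losses under their deductibles (area 2), only one policy paying (areas 5 and 10), and the triangle under x + y = 8 where both pay (area 12.5), all divided by the total area of 100.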