
View Full Version : Husband and Wife Problem

05-16-2002, 09:21 PM
I posted the following problem a while ago, but I'm still uncomfortable about how I solved it. Basically I assumed that the covariance of the two plans would be the average of Cov(H,H) and Cov(H,W), which works out to one half of Var(H) since H and W are independent. While this leads to the correct answer, I'm not sure why. Please help!

Suppose the remaining lifetimes of a husband and wife are independently and uniformly distributed on the interval [0, 40]. An insurance company offers two products to married couples:

One which pays when the husband dies.
One which pays when both the husband and wife have died.

Calculate the covariance of the two payment times.

Pillow
05-16-2002, 10:07 PM
If this is from Course 1...I skip it every time.

Gandalf
05-17-2002, 08:35 AM
I recommend Pillow's approach on this one.

Maybe (1/2) Var(H) is a valid formula, but I don't know why it would be.

Here's the "first principles / brute force" solution, which you might be able to do in 5-6 minutes, if you don't make any false steps along the way.

General Covariance Formula: Cov(X,H) = E(XH) - E(X)E(H). Let X = time until second death, H = time until husband's death, W = time until wife's death.

Since the lifetimes are uniform and independent, the husband's density is 1/40 and the husband/wife joint density is 1/1600.

E(X) = 26.667 (work not shown); E(H) = 20 (just 40/2 since uniform)

E(XH) = double integral h=0 to 40 w=0 to 40 of (joint density * second death time * h)

Over that entire range, the second death time is not given by a single algebraic expression in h and w, so split the range into two pieces:

Case 1: situation where w dies before h:

Double integral h=0 to 40, w=0 to h of (joint density * h * h), because the second death time is the husband's death time. Integral = 400

Case 2: situation where h dies before w:

Double integral h=0 to 40, w=h to 40 of (joint density * w * h), because the second death time is the wife's death time. Integral = 200

Thus E(XH) = 400 + 200 = 600

Cov = 600 - 26.6667 * 20 = 66.667

Isn't Pillow's approach looking better and better?

There may be a shorter way; hope someone else posts it.
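[Editor's note: the numbers above are easy to sanity-check numerically. The following midpoint-rule sketch is mine, not from the thread; it evaluates E(X), E(XH), and the covariance on a grid over the [0, 40] x [0, 40] square.]

```python
# Midpoint-rule evaluation of E(X) and E(XH), where X = max(H, W) is the
# second death time and (H, W) is uniform on the [0, 40] x [0, 40] square.
N = 400                      # grid points per axis
step = 40.0 / N
density = 1.0 / 1600.0       # joint density of (H, W)

E_X = E_XH = 0.0
for i in range(N):
    h = (i + 0.5) * step
    for j in range(N):
        w = (j + 0.5) * step
        x = max(h, w)                    # second death time
        cell = density * step * step
        E_X += x * cell
        E_XH += x * h * cell

E_H = 20.0                               # mean of Uniform[0, 40]
cov = E_XH - E_X * E_H
print(round(E_X, 3), round(E_XH, 2), round(cov, 2))
# near 26.667, 600, and 66.67, matching the brute-force solution
```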

4sigma
05-18-2002, 05:47 AM
Husband-Wife covariance problems are among the most difficult to solve. This is not unexpected, since the intimate relationship between the H-W variables frequently results in a highly interlaced variance-covariance matrix which is not algebraically invertible. Numerical recursive methods are sometimes successful, but for many H-W relationships these do not successfully converge, even after several iterations.

For practical purposes, the best solution to this sort of problem is to treat both Husband and Wife as dependent variables, and to regress them both against a 3rd, independent variable with which they are both positively correlated. This is done as covered in the Course 4 materials (P&R p. 101). Having done this, the residual Husband and Wife variables can be directly regressed, usually resulting in the desired positive correlation.

The key is to find the correct independent variable. Matchmaker is usually best if available, but data is often historical and current observations may not be available or may not be credible.
Under no circumstances should Mother-In-Law be used. While Mother-In-Law will no doubt be found positively correlated with either H or W, the other spousal variable is routinely found to have a correlation of -0.9 or less. The resultant residuals will be uncorrelated at best.
Therapist has been known to occasionally appear effective as an independent variable. However, one must be careful as Therapist is often a Random Walk and may appear correlated when not actually valid. One should be sure to test any Therapist variable with a Dickey-Fuller test before committing to use one as an independent correlated variable.
If MatchMaker data is no longer available or is no longer correlated with both Husband and Wife, other successful choices have been found to include Best Friend, Minister, or sometimes Coworker. :D

Macroman
05-18-2002, 10:55 AM
We were talking course #1 here.

Since the problem specifies independence of the lives, the suggestions about regression are irrelevant. This is really a course #3 problem.

I don't have an improvement on Gandalf's solution, but here is what may be a simplifying suggestion. Since the base distributions are continuous uniform, the expected values for H and max(H,W) can be found by the centroid formulas from geometry. For H the expected value is (0+40)/2 = 20, and for max(H,W) the expected value is (2/3)*40 = 26.667. To see the latter, draw the square representing possible lives, then split the square with the line h = w. Now the expected value of max(H,W) given H&gt;W is the centroid of one triangle, and the other triangle yields the same result for E(max(H,W)) given H&lt;W. You still need to integrate (as far as I can see) to find E(XH), so this may not really save you any time. I just point it out as an alternative.

As far as skipping questions, I certainly recommend guessing on the most difficult questions on the actuarial exam. I would generally recommend practicing all the types of problems as the best way to know which problems are the most difficult and ultimately make an informed decision about what to skip. At this point, being so close to the exam, I think it is better to enhance your proficiency with problems that you have some ability with, as opposed to working on problems where you feel totally in the dark.

Good luck.

05-19-2002, 12:31 PM
Here is another explanation.

Policy A pays when the husband dies.
Policy B pays when they both die.

Let H = time until Husband dies.
Let W = time until Wife dies.

Let A = time when Policy A is paid (time when Husband dies).
Let B = time when Policy B is paid (when the second spouse has died).

Note that

A = H.

B = H when H > W,
B = W when W > H.

H and W are identically and independently distributed, so that
P(H > W) = P(W > H) = 0.5

H and W are uniformly distributed on [0, 40]

Cov(A, B | H &gt; W) = Cov(H, H) with P(H &gt; W) = 0.5
Cov(A, B | W &gt; H) = Cov(H, W) with P(W &gt; H) = 0.5

Cov(H, H) = Var(H) = 40 ^ 2 / 12
Cov(H, W) = 0 (because they are independent).

Cov(A, B) = ½ Cov(A, B | H &gt; W) + ½ Cov(A, B | W &gt; H) = 66.7
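[Editor's note: as a sanity check on the final number, here is a simulation sketch of mine, not part of the post. It simulates the two payment times A = H and B = max(H, W) directly and estimates their covariance.]

```python
# Monte Carlo estimate of Cov(A, B) with A = H (Policy A's payment time)
# and B = max(H, W) (Policy B's payment time), H, W ~ Uniform[0, 40] iid.
import random

random.seed(12345)
n = 1_000_000
sum_a = sum_b = sum_ab = 0.0
for _ in range(n):
    h = random.uniform(0.0, 40.0)   # husband's remaining lifetime
    w = random.uniform(0.0, 40.0)   # wife's remaining lifetime
    a, b = h, max(h, w)             # payment times of the two policies
    sum_a += a
    sum_b += b
    sum_ab += a * b

cov = sum_ab / n - (sum_a / n) * (sum_b / n)
print(round(cov, 2))                # should land near 66.67
print(40.0 ** 2 / 12.0 / 2.0)       # 66.666..., half of Var(H)
```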

Gandalf
05-19-2002, 02:27 PM
You make several statements here where it is not clear which parts are generally true and which parts are unique to the special conditions here.

Cov(A, B | H > W) = Cov(H, H) with P(H > W) = 0.5
Cov(A, B | W > H) = Cov(H, W) with P(W > H) = 0.5

Are you saying "Cov(A, B | H > W) = Cov(H, H)" is always true and that in this problem "P(H > W) = 0.5" is true? If that is what you mean, I agree with the second statement but see no reason the first should be true. I would agree "Cov(A, B | H > W) = Cov(H, H | H > W)", but do not see any reason it is OK to drop the conditional.

Ditto with the situation where W > H.

Cov(H, H) = Var(H) = 40 ^ 2 / 12
Cov(H, W) = 0 (because they are independent).

Thinking about the second, unless it is OK to drop the conditional, we want Cov(H, W | W > H). It would surprise me if that conditional covariance were 0. Certainly, in the conditional case the two random variables are not independent.

Cov(A, B) = ½ Cov(A, B | H > W) + ½ Cov(A, B | W > H) = 66.7

Is the following formula always true
Cov(A, B) = Prob(X) Cov(A, B | X) + (1 - Prob(X)) Cov(A, B | not X)

Perhaps it is. I'm not trying to shake you up before your exam. I really don't know if it is true. (And I really can't imagine this problem could show up on the exam. The long way is quite long. Even if the shortcut way is valid, how could they possibly expect anyone to decide if it is valid in the time allowed?)

05-21-2002, 11:26 AM
It seems to work for me.

P(H &lt; W) = P(W &lt; H) = 1/2

Cov(X,H | H &lt; W) = \int_0^{40} \int_0^w (h - 20)(w - 20)/800 dh dw = 0

Cov(X,H | W &lt; H) = \int_0^{40} \int_0^h (h - 20)^2/800 dw dh = 133.33

Cov(X,H) = Cov(X,H | H &lt; W) P(H &lt; W) + Cov(X,H | W &lt; H) P(W &lt; H) = 66.67
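[Editor's note: a numerical check of the two integrals above, my own sketch, not from the thread. Each piece uses the conditional density 1/800 on its triangle but plugs in the unconditional mean 20, which is the point Gandalf questions below.]

```python
# Midpoint-rule check of the two pieces: conditional density 1/800 on
# each triangle of the [0, 40] square, with the UNCONDITIONAL mean 20.
N = 800
step = 40.0 / N
dens = 1.0 / 800.0

cov_h_lt_w = cov_w_lt_h = 0.0
for i in range(N):
    h = (i + 0.5) * step
    for j in range(N):
        w = (j + 0.5) * step
        cell = dens * step * step
        if h < w:                        # husband dies first
            cov_h_lt_w += (h - 20.0) * (w - 20.0) * cell
        else:                            # wife dies first
            cov_w_lt_h += (h - 20.0) ** 2 * cell

print(round(cov_h_lt_w, 2), round(cov_w_lt_h, 2))
# near 0 and 133.33; their average is the answer, 66.67
```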

Gandalf
05-21-2002, 12:10 PM
It seems to work for me.

P(H &lt; W) = P(W &lt; H) = 1/2

Cov(X,H | H &lt; W) = \int_0^{40} \int_0^w (h - 20)(w - 20)/800 dh dw = 0

Cov(X,H | W &lt; H) = \int_0^{40} \int_0^h (h - 20)^2/800 dw dh = 133.33

Cov(X,H) = Cov(X,H | H &lt; W) P(H &lt; W) + Cov(X,H | W &lt; H) P(W &lt; H) = 66.67

Those are effectively the same integrals I did, with the order of integration reversed. I was also evaluating E(XY) - E(X)E(Y) instead of E[(X-E(X))(Y-E(Y))], but those are equivalent.

Where I disagree is in the conditional covariance formulas. How can you justify Cov(X,H | W &lt; H) = E[(H - E(H))(H - E(H)) | W &lt; H] instead of
= E[(H - E(H | W &lt; H))(H - E(H | W &lt; H)) | W &lt; H] ?

Again, you may want to delay responding until after your exam. Even if this exact problem were to show up, you would be better off spending your exam time elsewhere.

05-21-2002, 01:04 PM
This should be right.

E(H | H &lt; W) = \int_0^{40} \int_0^w h/800 dh dw = 13.33

E(W | H &lt; W) = \int_0^{40} \int_0^w w/800 dh dw = 26.67

Cov(H,W | H &lt; W) = \int_0^{40} \int_0^w (h - E(H | H &lt; W))(w - E(W | H &lt; W))/800 dh dw = 44.44

E(H | W &lt; H) = \int_0^{40} \int_0^h h/800 dw dh = 26.67

Cov(H,H | W &lt; H) = \int_0^{40} \int_0^h (h - E(H | W &lt; H))^2/800 dw dh = 88.89

Cov(X, H) = 1/2 Cov(H,W | H &lt; W) + 1/2 Cov(H,H | W &lt; H) = 66.67
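[Editor's note: these conditional quantities can also be checked by simulation. The sketch below is mine, not from the post; it splits the simulated couples by which spouse dies first and uses the conditional means within each group.]

```python
# Simulation check of the conditional means and covariances above:
# condition on who dies first, then center each variable on its
# CONDITIONAL mean before forming the covariance.
import random

random.seed(7)
n = 400_000
h_first, w_first = [], []     # couples split by which spouse dies first
for _ in range(n):
    h = random.uniform(0.0, 40.0)
    w = random.uniform(0.0, 40.0)
    (h_first if h < w else w_first).append((h, w))

def mean(xs):
    return sum(xs) / len(xs)

# Husband dies first (H < W): the second payment time is W.
e_h = mean([h for h, w in h_first])                          # near 13.33
e_w = mean([w for h, w in h_first])                          # near 26.67
cov1 = mean([(h - e_h) * (w - e_w) for h, w in h_first])     # near 44.44

# Wife dies first (W < H): the second payment time is H.
e_h2 = mean([h for h, w in w_first])                         # near 26.67
cov2 = mean([(h - e_h2) ** 2 for h, w in w_first])           # near 88.89

print(round(cov1, 1), round(cov2, 1), round(0.5 * (cov1 + cov2), 1))
```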

Gandalf
05-21-2002, 01:39 PM
OK, I'll now agree with the calculations and the result.

But look how far we've come from the original shortcut solution.

The original shortcut solution said we could take the average of two unconditional covariances, one of which was 0 and one of which was the variance of a uniform (hence a known formula).

It is now proposed that we should take the average of two conditional covariances, each of which must be evaluated as a double integral, neither of which is 0.

And we still have no idea (at least I don't) whether the average of the two conditional covariances is correct anytime P(H &lt; W) = 0.5, or whether it is unique to the fact that each of H and W was uniform.

The quality of work you've done on this problem suggests to me you are well prepared for the exam. But if something like this comes up, skip it unless you have spare time at the end.

Good luck.

Gandalf
05-10-2004, 06:48 AM
:bump:

GTMike
03-17-2013, 05:56 PM
I recommend Pillow's approach on this one.

E(X) = 26.667 (work not shown); E(H) = 20 (just 40/2 since uniform)

Do you mind showing the work, or at least describing how you get E(X)?

Gandalf
03-17-2013, 07:40 PM
Method 1: for any function g(h,w), E(g(h,w))=double integral of g(h,w)f(h,w) over region where density f(h,w) is nonzero.

So with g(h,w)=max(h,w),
E[g(h,w)] = \int_0^{40} \int_0^{40} g(h,w) f(h,w) dh dw
E[g(h,w)] = \int_0^{40} \int_0^w (w/1600) dh dw + \int_0^{40} \int_w^{40} (h/1600) dh dw
details to be finished by you

Had to split the integral because g(h,w) = h if h>w but = w if h<w

Method 2:

Calculate the cdf F(x) = Prob(max(h,w) < x) = Prob(h < x and w < x) = (x/40)^2 for 0 < x < 40, since h and w are independent.

Use Darth Vader rule: E(X)=\int_0^{40} (1-F(x))dx
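[Editor's note: Method 2 is easy to verify numerically; this small sketch is mine, using the survival function stated above.]

```python
# Riemann (midpoint) sum of the survival function 1 - F(x) = 1 - (x/40)^2
# over [0, 40], which the Darth Vader rule says equals E(X).
N = 100_000
step = 40.0 / N
e_x = sum((1.0 - ((i + 0.5) * step / 40.0) ** 2) * step for i in range(N))
print(round(e_x, 4))   # 26.6667, i.e. 80/3
```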

GTMike
03-18-2013, 10:44 AM
Thanks! I approached it with a fresh mind this morning. All is clear.

actuary-r
03-20-2013, 10:14 PM
Husband-Wife covariance problems are among the most difficult to solve. This is not unexpected, since the intimate relationship between the H-W variables frequently results in a highly interlaced variance-covariance matrix which is not algebraically invertible. Numerical recursive methods are sometimes successful, but for many H-W relationships these do not successfully converge, even after several iterations.

For practical purposes, the best solution to this sort of problem is to treat both Husband and Wife as dependent variables, and to regress them both against a 3rd, independent variable with which they are both positively correlated. This is done as covered in the Course 4 materials (P&R p. 101). Having done this, the residual Husband and Wife variables can be directly regressed, usually resulting in the desired positive correlation.

The key is to find the correct independent variable. Matchmaker is usually best if available, but data is often historical and current observations may not be available or may not be credible.
Under no circumstances should Mother-In-Law be used. While Mother-In-Law will no doubt be found positively correlated with either H or W, the other spousal variable is routinely found to have a correlation of -0.9 or less. The resultant residuals will be uncorrelated at best.
Therapist has been known to occasionally appear effective as an independent variable. However, one must be careful as Therapist is often a Random Walk and may appear correlated when not actually valid. One should be sure to test any Therapist variable with a Dickey-Fuller test before committing to use one as an independent correlated variable.
If MatchMaker data is no longer available or is no longer correlated with both Husband and Wife, other successful choices have been found to include Best Friend, Minister, or sometimes Coworker. :D

Thank you for making my day :)