This Content Downloaded From 193.0.118.39 On Sun, 23 May 2021 13:27:39 UTC
University of Pittsburgh
cov(X, Y) = ∫_{−∞}^∞ ∫_{−∞}^∞ {P(X > x, Y > y) − P(X > x)P(Y > y)} dx dy.

(1) EX = ∫_0^∞ (1 − F(x)) dx − ∫_{−∞}^0 F(x) dx.
The extension to three random variables is given by Jogdeo:

(4) E(X1 − Y1)(X2 − Y2)(X3 − Y3) = ∫∫∫ K(u1, u2, u3) du1 du2 du3,

where K(u1, u2, u3) contains, among others, the terms

−{P(A1A2A3) + P(A1)P(A2A3) − P(A2)P(A1A3) − P(A3)P(A1A2)},

and Ai = {Xi ≤ ui}, i = 1, 2, 3, B1 = {X1 ≥ −u1}. Jogdeo mentioned that a
similar result holds for n ≥ 3. We give a different generalization of Hoeffding's
lemma. Notice that expression (3) can be rewritten as

cov(X, Y) = ∫_{−∞}^∞ ∫_{−∞}^∞ cov(χ_X(x), χ_Y(y)) dx dy,

where χ_X(x) = 1 if X > x and 0 otherwise, and that the covariance is the second-order
joint cumulant of the random vector (X, Y). In the following we extend
the results to rth-order joint cumulants, r ≥ 3.
2. Main results. Consider a random vector (X1, ..., Xr), where E|Xi|^r < ∞,
i = 1, ..., r.

(vii) cum(Xj) = EXj, cum(Xj, Xj) = Var Xj and cum(Xi, Xj) = cov(Xi, Xj).
To represent certain moments by cumulants, we have the following useful
identity.
THEOREM 1. For the random vector (X1, ..., Xr), r > 1, if E|Xi|^r < ∞,
i = 1, 2, ..., r, then

(10) cum(X1, ..., Xr) = ∫_{−∞}^∞ ⋯ ∫_{−∞}^∞ cum(χ_{X1}(x1), ..., χ_{Xr}(xr)) dx1 ⋯ dxr.

PROOF. Expand the cumulant on each side over all partitions {ν1, ..., νp} of
{1, ..., r} by the moment-cumulant relation

cum(X1, ..., Xr) = Σ (−1)^{p−1}(p − 1)! ∏_{j=1}^p E(∏_{i∈νj} Xi),

whose last term is (−1)^{r−1}(r − 1)! ∏_{j=1}^r EXj. Each expectation
E(∏_{i∈ν} Xi), by the one-dimensional identity (1), is an integral involving the
joint distribution function F(xi, i ∈ ν) and the marginal factors; matching the
resulting integrals term by term yields (10). This completes the proof. □

By multilinearity of the cumulant, we also have

cum(X1, ..., Xr) = (−1)^{card A} ∫_{−∞}^∞ ⋯ ∫_{−∞}^∞ cum(χ_{−Xi}(xi), i ∈ A; χ_{Xj}(xj), j ∈ B) dx1 ⋯ dxr,

where A ∪ B = {1, 2, ..., r}. We can therefore use the distribution function
and/or the survival function in the representation.
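The partition expansion used in the proof can be checked mechanically for r = 3. The sketch below compares the general partition formula against the explicit third-order formula; the pmf on {0,1}³ is illustrative and not taken from the paper.

```python
from itertools import product
from math import factorial, prod

# Illustrative joint pmf on {0,1}^3 (not from the paper).
pts = list(product([0, 1], repeat=3))
probs = [0.15, 0.1, 0.1, 0.1, 0.1, 0.1, 0.15, 0.2]
pmf = dict(zip(pts, probs))

def E(idx):
    """Expectation of the product of X_i over i in idx."""
    return sum(p * prod(x[i] for i in idx) for x, p in pmf.items())

def partitions(items):
    """All set partitions of a list, yielded as lists of blocks."""
    if not items:
        yield []
        return
    first, rest = items[0], items[1:]
    for P in partitions(rest):
        yield [[first]] + P                      # first in its own block
        for k in range(len(P)):                  # or first joins block k
            yield P[:k] + [[first] + P[k]] + P[k + 1:]

# cum(X1,...,Xr) = sum over partitions {v1,...,vp} of
#   (-1)^(p-1) (p-1)! prod_j E(prod_{i in vj} Xi).
cum3_partition = sum(
    (-1) ** (len(P) - 1) * factorial(len(P) - 1) * prod(E(b) for b in P)
    for P in partitions([0, 1, 2])
)
# Explicit r = 3 formula for comparison.
cum3_direct = (E([0, 1, 2]) - E([0]) * E([1, 2]) - E([1]) * E([0, 2])
               - E([2]) * E([0, 1]) + 2 * E([0]) * E([1]) * E([2]))
print(cum3_partition, cum3_direct)
```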
for all x1, x2, x3, where Δ and the Δi each denote one of the inequalities ≤ or >.
Then Xi, Xj for all i ≠ j are uncorrelated and EX1X2X3 = EX1EX2EX3 if and
only if the Xi's are mutually independent.
Using Theorem 1, we get this conclusion directly. The "if" part is trivial.
Conversely, since F ∈ ℳ(3) we know F_{XiXj}(xi, xj) ∈ ℳ(2) [ℳ(n) can be defined
similarly]. Since Xi and Xj are uncorrelated, this implies that the Xi's are pairwise
independent by Hoeffding's lemma. Thus, using Remark 4, (9) becomes

EX1X2X3 − EX1EX2EX3 = cum(X1, X2, X3).
THEOREM 2. If F_{X1,...,Xn}(x1, ..., xn) ∈ ℳ(n), then E X_{i1} ⋯ X_{ik} = ∏_{j=1}^k E X_{ij}
for all subsets {i1, ..., ik} of {1, ..., n} if and only if X1, ..., Xn are independent.

PROOF. F_{X1,...,Xn}(x1, ..., xn) ∈ ℳ(n) means F_{X_{i1},...,X_{ik}}(x_{i1}, ..., x_{ik}) ∈ ℳ(k)
for any subset {i1, ..., ik}. By induction on n, using Theorem 1, we obtain an
integral representation whose integrand keeps a constant sign; since the integral
vanishes, the integrand is zero almost everywhere, and the Xi are mutually
independent.
If the reverse inequalities between the probabilities in (a) and (b) hold the three
concepts are called NUOD, NLOD and NOD, respectively.
NOTE. In Definition 2 in Block and Ting (1981), POD is used for what is
called PUOD in this paper.
(d) P(X ≤ x) ≥ P(Xi ≤ xi, i ∈ A)P(Xj ≤ xj, j ∈ Aᶜ),

(15) Association ⇒ SPOD ⇒ POD ⇒ PUOD, PLOD ⊂ ℳ(n).

Since association, SPOD, POD, PLOD, PUOD are all subclasses of ℳ(n),
Theorem 2 generalizes some results in Lehmann (1966) and it gives us another
proof of Theorem 2 in Joag-Dev (1983) as well as some new characterizations of
independence for POD random variables. Corollary 1 is the result of Joag-Dev.
COROLLARY 1. Let X1, ..., Xn be SPOD and assume cov(Xi, Xj) = 0 for all
i ≠ j. Then X1, ..., Xn are mutually independent.

PROOF. We proceed by induction on n. By the induction hypothesis, any
n − 1 r.v.'s are mutually independent and thus E X_{i1} ⋯ X_{ik} = ∏_{j=1}^k E X_{ij} for all
1 ≤ k ≤ n − 1. Hence cum(Xk, k ∈ νp) = 0 whenever 1 < card(νp) ≤ n − 1.
So we only need to check EX1 ⋯ Xn = ∏_{j=1}^n EXj. By Lemma 1, Theorem 1 and
the independence of any (n − 1) r.v.'s,

(16) EX1 ⋯ Xn − ∏_{j=1}^n EXj = ∫_{−∞}^∞ ⋯ ∫_{−∞}^∞ {P(X1 > x1, ..., Xn > xn) − ∏_{i=1}^n P(Xi > xi)} dx1 ⋯ dxn ≥ 0.

Similarly,

(17) EX1 ⋯ Xn − ∏_{j=1}^n EXj = ∫_{−∞}^∞ ⋯ ∫_{−∞}^∞ {P(−X1 > x1, −X2 > x2, X3 > x3, ..., Xn > xn)
− P(X1 < −x1)P(X2 < −x2) ∏_{i=3}^n P(Xi > xi)} dx1 ⋯ dxn ≤ 0.

The last equality holds by the induction assumption of mutual independence and
the last inequality is due to SPOD. Combining (16) and (17) completes the
proof. □
THEOREM 3. Let X1, X2, X3 be POD and assume Xi, Xj for all i ≠ j are
uncorrelated. Then X1, X2, X3 are mutually independent.

PROOF. The following two summands are nonnegative since X1, X2, X3 are
POD. By Lemma 1 we then have

[P(X1 > x1, X2 > x2, X3 > x3) − ∏_{i=1}^3 P(Xi > xi)]
+ [P(X1 ≤ x1, X2 ≤ x2, X3 ≤ x3) − ∏_{i=1}^3 P(Xi ≤ xi)]
= Σ_{i<j} cov(χ_{Xi}(xi), χ_{Xj}(xj)).

Since Xi, Xj are POD and cov(Xi, Xj) = 0, we obtain cov(χ_{Xi}(xi), χ_{Xj}(xj)) = 0.
Thus P(Xi > xi, i = 1, 2, 3) − ∏_{i=1}^3 P(Xi > xi) = 0, i.e., X1, X2, X3 are mutually
independent. □
REMARK 5. For three r.v.'s X1, X2, X3 the mixed positive dependence defined
in Chhetry, Kimeldorf and Zahed (1986) implies POD, but the converse is
not true, as shown by an example in Joag-Dev (1983). Notice that since the mixed
positive dependence implies POD in Corollary 1, the SPOD can be relaxed to this
mixed condition.
PROOF. By Theorem 2, we need only check EX1 ⋯ Xn = EX1 ⋯ EXn. On
the one hand,

EX1 ⋯ Xn − ∏_{j=1}^n EXj = ∫_{−∞}^∞ ⋯ ∫_{−∞}^∞ {P(X1 > x1, ..., Xn > xn) − ∏_{j=1}^n P(Xj > xj)} dx1 ⋯ dxn ≥ 0.
X1 X2 X3 X4 Pr
1 1 1 1 1/8
1 1 0 0 1/8
1 0 1 0 1/8
0 1 1 0 1/8
1 0 0 1 1/8
0 1 0 1 1/8
0 0 1 1 1/8
0 0 0 0 1/8
Since P(Xi > 1/2, i = 1, ..., 4) − ∏_{i=1}^4 P(Xi > 1/2) = 1/16 > 0, X1, ..., X4 are
not mutually independent. Notice also that

P(Xi > xi, i = 1, ..., 4) − ∏_{i=1}^4 P(Xi > xi) = cum(χ_{Xi}(xi), i = 1, ..., 4) > 0

at x1 = ⋯ = x4 = 1/2, so these r.v.'s are not SPOD.
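A direct enumeration confirms the claimed properties of this distribution: pairwise independence, a positive orthant gap at 1/2, and a strictly positive fourth-order indicator cumulant. This check is an added sketch, not part of the original.

```python
from itertools import combinations
from math import factorial, prod

# The eight equiprobable rows of the table above.
rows = [(1, 1, 1, 1), (1, 1, 0, 0), (1, 0, 1, 0), (0, 1, 1, 0),
        (1, 0, 0, 1), (0, 1, 0, 1), (0, 0, 1, 1), (0, 0, 0, 0)]
pmf = {r: 1 / 8 for r in rows}

def E(idx):
    """Expectation of the product of X_i over i in idx."""
    return sum(p * prod(x[i] for i in idx) for x, p in pmf.items())

# Pairwise independence: E(Xi Xj) = E(Xi) E(Xj) for every pair.
pairwise = all(abs(E([i, j]) - E([i]) * E([j])) < 1e-12
               for i, j in combinations(range(4), 2))

# P(Xi > 1/2, i = 1..4) - prod P(Xi > 1/2) = 1/8 - 1/16 = 1/16 > 0.
p_all = sum(p for x, p in pmf.items() if all(v > 0.5 for v in x))
gap = p_all - prod(sum(p for x, p in pmf.items() if x[i] > 0.5)
                   for i in range(4))

def partitions(items):
    """All set partitions of a list, as lists of blocks."""
    if not items:
        yield []
        return
    first, rest = items[0], items[1:]
    for P in partitions(rest):
        yield [[first]] + P
        for k in range(len(P)):
            yield P[:k] + [[first] + P[k]] + P[k + 1:]

# Fourth joint cumulant of the indicators chi_{Xi}(1/2) = Xi (0/1-valued).
cum4 = sum((-1) ** (len(P) - 1) * factorial(len(P) - 1) * prod(E(b) for b in P)
           for P in partitions([0, 1, 2, 3]))
print(pairwise, gap, cum4)   # True 0.0625 0.0625
```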
X1 X2 X3 X4 X5 Pr
1 1 1 1 1 1/16
1 1 0 0 1 1/16
1 0 1 0 1 1/16
0 1 1 0 1 1/16
1 0 0 1 1 1/16
0 1 0 1 1 1/16
0 0 1 1 1 1/16
0 0 0 0 1 1/16
1 1 1 1 0 1/16
1 1 0 0 0 1/16
1 0 1 0 0 1/16
0 1 1 0 0 1/16
1 0 0 1 0 1/16
0 1 0 1 0 1/16
0 0 1 1 0 1/16
0 0 0 0 0 1/16
It is easy to check that this is PUOD and PLOD, thus it is POD. However,
EXiXj = 4/16 and EXi = 1/2 for all i, j, so the Xi are uncorrelated.
In this example we can use Theorem 3 to prove that any Xi, Xj, Xk are
mutually independent, since subsets of POD r.v.'s are still POD.
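For the five-variable distribution above, brute force confirms that every triple is mutually independent while the full vector is not. The enumeration below is an added verification sketch.

```python
from itertools import combinations, product

# The table duplicates the four-variable patterns with X5 = 1 and X5 = 0.
base = [(1, 1, 1, 1), (1, 1, 0, 0), (1, 0, 1, 0), (0, 1, 1, 0),
        (1, 0, 0, 1), (0, 1, 0, 1), (0, 0, 1, 1), (0, 0, 0, 0)]
pmf = {r + (b,): 1 / 16 for r in base for b in (0, 1)}

def P(constraint):
    return sum(p for x, p in pmf.items() if constraint(x))

# Every triple (Xi, Xj, Xk) is uniform on {0,1}^3, hence mutually independent.
triples_ok = all(
    abs(P(lambda x: (x[i], x[j], x[k]) == pat) - 1 / 8) < 1e-12
    for i, j, k in combinations(range(5), 3)
    for pat in product([0, 1], repeat=3)
)

# But the five r.v.'s are not mutually independent:
# P(all = 1) = 1/16 differs from (1/2)^5 = 1/32.
p_all = P(lambda x: all(v == 1 for v in x))
print(triples_ok, p_all)   # True 0.0625
```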
Newman and Wright (1981), using an inequality for the ch.f.'s of r.v.'s
X1, ..., Xm, provided another proof for the characterization of the independence
of associated r.v.'s. This is Theorem 1 of Newman and Wright (1981). These
authors proved that if X1, ..., Xm are associated with finite variance, joint and
marginal ch.f.'s φ(r1, ..., rm) and φj(rj), then

(18) |φ(r1, ..., rm) − ∏_{j=1}^m φj(rj)| ≤ (1/2) Σ_{1≤j≠k≤m} |rj||rk| cov(Xj, Xk).
LEMMA 3. For the r.v.'s (X1, ..., Xm) with E|Xi|^m < ∞, m ≥ 1,

cum(exp(ir1X1), ..., exp(irmXm))
= (ir1) ⋯ (irm) ∫_{−∞}^∞ ⋯ ∫_{−∞}^∞ exp(i Σ_{j=1}^m rjxj) cum(χ_{X1}(x1), ..., χ_{Xm}(xm)) dx1 ⋯ dxm.

PROOF. The proof of this result is similar to that of Lemma 2. Use the
identity

exp(irX) − 1 = ir ∫_0^X exp(irx) dx.

Notice that

cum(exp(irkXk), k = 1, ..., m) = cum(exp(irkXk) − 1, k = 1, ..., m)
= ∫_{−∞}^∞ ⋯ ∫_{−∞}^∞ ∏_{j=1}^m (irj exp(irjxj)) cum(χ_{X1}(x1), ..., χ_{Xm}(xm)) dx1 ⋯ dxm. □
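Lemma 3 can be checked numerically in the bivariate case, where the joint cumulant of the two exponentials is just their covariance. The sketch below uses an illustrative pmf on {0,1}² (not from the paper), for which cov(χ_X(s), χ_Y(t)) vanishes outside [0, 1)².

```python
import cmath

# Illustrative joint pmf for (X, Y) on {0,1}^2.
pmf = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
r1, r2 = 0.7, -1.3

# Left side: cum(exp(ir1 X), exp(ir2 Y)) = cov of the two exponentials.
Ejoint = sum(p * cmath.exp(1j * (r1 * x + r2 * y)) for (x, y), p in pmf.items())
E1 = sum(p * cmath.exp(1j * r1 * x) for (x, y), p in pmf.items())
E2 = sum(p * cmath.exp(1j * r2 * y) for (x, y), p in pmf.items())
lhs = Ejoint - E1 * E2

def cov_ind(s, t):
    """cov(chi_X(s), chi_Y(t)) = P(X > s, Y > t) - P(X > s)P(Y > t)."""
    joint = sum(p for (x, y), p in pmf.items() if x > s and y > t)
    px = sum(p for (x, _), p in pmf.items() if x > s)
    py = sum(p for (_, y), p in pmf.items() if y > t)
    return joint - px * py

# Right side of Lemma 3 via a midpoint Riemann sum; for {0,1}-valued X, Y
# the integrand is supported on [0,1)^2.
h = 0.005
grid = [(i + 0.5) * h for i in range(200)]
rhs = (1j * r1) * (1j * r2) * sum(
    cmath.exp(1j * (r1 * s + r2 * t)) * cov_ind(s, t)
    for s in grid for t in grid
) * h * h
print(abs(lhs - rhs))   # small (midpoint-rule error only)
```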
Using Lemma 3, we can obtain a result parallel to (18) for certain classes of
r.v.'s.
THEOREM 5. If X1, ..., Xm are r.v.'s such that E|Xj|^m < ∞, j = 1, ..., m,
and cum(χ_{X_{i1}}(x_{i1}), ..., χ_{X_{ik}}(x_{ik})) has the same sign for all subsets {i1, ..., ik} of
{1, ..., m} and all x_{i1}, ..., x_{ik}, then

|φ(r1, ..., rm) − ∏_{j=1}^m φj(rj)| ≤ Σ ∏_{p: card νp > 1} (∏_{j∈νp} |rj|) |cum(Xj, j ∈ νp)|,

where the sum extends over all partitions {ν1, ..., νk} of {1, ..., m} containing at
least one block with more than one element. Here φ(r1, ..., rm) and φj(rj) are the
joint and marginal ch.f.'s of (X1, ..., Xm) and Xj.

PROOF. From Lemma 3, Theorem 1 and the fact that the cum(χ_{X1}(x1), ...,
χ_{Xm}(xm)) all have the same sign, we have for each subset ν with card ν > 1,

|cum(exp(irjXj), j ∈ ν)| ≤ ∏_{j∈ν} |rj| ∫_{−∞}^∞ ⋯ ∫_{−∞}^∞ |cum(χ_{Xj}(xj), j ∈ ν)| ∏_{j∈ν} dxj
= ∏_{j∈ν} |rj| |cum(Xj, j ∈ ν)|.

Expand φ(r1, ..., rm) − ∏_{j=1}^m φj(rj) over partitions of {1, ..., m}. Whenever
card νp = 1, the corresponding factor is a marginal ch.f. with modulus at most 1;
bounding the remaining factors as above gives the result. □
EXAMPLE 3. Consider the r.v.'s X1, X2, X3 with the following distribution:
X1 X2 X3 Pr
1 1 1 1/4
1 0 0 1/4
0 1 0 1/4
0 0 1 1/4
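The four-point distribution of Example 3 is pairwise independent but not mutually independent, as a direct check shows. This enumeration is an added sketch.

```python
from itertools import combinations

# The four equiprobable points of Example 3.
pmf = {(1, 1, 1): 0.25, (1, 0, 0): 0.25, (0, 1, 0): 0.25, (0, 0, 1): 0.25}

def P(constraint):
    return sum(p for x, p in pmf.items() if constraint(x))

marg = [P(lambda x: x[i] == 1) for i in range(3)]      # each 1/2
pair = [P(lambda x: x[i] == 1 and x[j] == 1)
        for i, j in combinations(range(3), 2)]         # each 1/4 = (1/2)(1/2)
p_all = P(lambda x: x == (1, 1, 1))                    # 1/4, not (1/2)^3 = 1/8
print(marg, pair, p_all)
```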
PROOF. We prove the PUOD case only. Using Lemmas 1 and 3 and
Remark 3, the quantity in question equals

|r1| ⋯ |rm| | ∫_{−∞}^∞ ⋯ ∫_{−∞}^∞ {F̄(x1, ..., xm) − F̄1(x1) ⋯ F̄m(xm)} dx1 ⋯ dxm |,

where F̄ and F̄j denote the joint and marginal survival functions; the integrand
is nonnegative by PUOD.
REMARK 9. Assume EXi ≥ 0 for i = 1, 2, 3. Also assume that cov(Xi, Xj) ≥ 0
for i, j = 1, 2, 3 [or the even stronger conditions cov(χ_{Xi}(xi), χ_{Xj}(xj)) ≥ 0 and
cum(X1, X2, X3) ≥ 0]. These do not imply PUOD, as is shown in the following
example.
EXAMPLE 4. Let X1, X2, X3 take the values 0, ±1 with: P(X1 = x1, X2 = x2,
X3 = x3, x1x2x3 ≠ 0) = 0; P(X1 = X2 = X3 = 0) = 0; P(Xi = 0, Xj = xj,
Xk = xk, xjxk > 0) = 1/9, i, j, k = 1, 2, 3, xj = xk = 1 or xj = xk = −1; and
P(X1 = x1, X2 = x2, X3 = x3) = 1/36 for the remaining cases. It is easy to
check that EXi = EX1X2X3 = 0, EXiXj > 0 and cum(X1, X2, X3) = 0, but

P(X1 > 0, X2 > 0, X3 > 0) − P(X1 > 0)P(X2 > 0)P(X3 > 0) = −(11/36)³ < 0.
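Under the reading of the construction above, the following enumeration reproduces the stated quantities. It is an added verification sketch, not part of the original.

```python
from itertools import product

# Build the distribution of Example 4 on {-1, 0, 1}^3.
pmf = {}
for x in product([-1, 0, 1], repeat=3):
    nz = [v for v in x if v != 0]
    if len(nz) in (0, 3):
        pmf[x] = 0.0              # all coordinates nonzero, or all zero
    elif len(nz) == 2 and nz[0] == nz[1]:
        pmf[x] = 1 / 9            # one zero, the other two of equal sign
    else:
        pmf[x] = 1 / 36           # remaining cases

def E(f):
    return sum(p * f(x) for x, p in pmf.items())

total = sum(pmf.values())                           # 1
means = [E(lambda x: x[i]) for i in range(3)]       # all 0
exy = E(lambda x: x[0] * x[1])                      # 1/6 > 0
exyz = E(lambda x: x[0] * x[1] * x[2])              # 0, so cum(X1,X2,X3) = 0
px = [sum(p for x, p in pmf.items() if x[i] > 0) for i in range(3)]  # each 11/36
orthant = (sum(p for x, p in pmf.items() if min(x) > 0)
           - px[0] * px[1] * px[2])                 # = -(11/36)^3 < 0
print(total, means, exy, exyz, orthant)
```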
REMARK 10. Let EXi ≥ 0 and assume (X1, X2, X3) is PUOD. This does not
imply cum(X1, X2, X3) ≥ 0, as is shown in Example 5.
EXAMPLE 5. Let (X1, X2, X3) have the following distribution. It is easy to
check that (X1, X2, X3) is PUOD and that EXi = 0, but cum(X1, X2, X3) =
-0.15 < 0.
X1 X2 X3 Pr
1 1 1 0.35
1 1 -1 0.05
1 -1 1 0.05
-1 1 1 0.05
0 0 -1 0.05
-1 0 0 0.05
0 -1 0 0.05
-1 -1 -1 0.35
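The PUOD property and the negative third cumulant of Example 5 can both be checked by enumeration; since the coordinates take values in {−1, 0, 1}, it suffices to test thresholds between the support points. An added sketch:

```python
from itertools import product
from math import prod

# The distribution of Example 5.
pmf = {(1, 1, 1): 0.35, (1, 1, -1): 0.05, (1, -1, 1): 0.05, (-1, 1, 1): 0.05,
       (0, 0, -1): 0.05, (-1, 0, 0): 0.05, (0, -1, 0): 0.05, (-1, -1, -1): 0.35}

def E(f):
    return sum(p * f(x) for x, p in pmf.items())

means = [E(lambda x: x[i]) for i in range(3)]     # all 0
# With zero means, cum(X1, X2, X3) = E X1 X2 X3.
cum3 = E(lambda x: x[0] * x[1] * x[2])            # ≈ -0.15

def S(ts):
    """Joint survival P(Xi > ti for all i)."""
    return sum(p for x, p in pmf.items() if all(v > t for v, t in zip(x, ts)))

def marg(i, t):
    """Marginal survival P(Xi > t)."""
    return sum(p for x, p in pmf.items() if x[i] > t)

# PUOD: joint survival dominates the product of marginal survivals at every
# threshold vector; thresholds between the support points suffice.
puod = all(S(ts) >= prod(marg(i, ts[i]) for i in range(3)) - 1e-12
           for ts in product([-1.5, -0.5, 0.5], repeat=3))
print(means, cum3, puod)
```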
REMARK 11. Let (X1, X2, X3) be associated. It need not be true that
cum(X1, X2, X3) ≥ 0, as is shown in Example 6.
EXAMPLE 6. Assume (X1, X2, X3) are binary r.v.'s with distribution P(X1 =
X2 = X3 = 0) = 0.3 and P(X1 = x1, X2 = x2, X3 = x3) = 0.1 for all other
(x1, x2, x3) ∈ {0, 1}³.
Checking all binary nondecreasing functions Γ(X1, X2, X3) and Δ(X1, X2, X3),
we have cov(Γ, Δ) ≥ 0. Thus (X1, X2, X3) are associated, but cum(X1, X2, X3) =
−0.012 < 0.
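The association check in Example 6 is finite: there are exactly 20 nondecreasing 0/1 functions on {0,1}³ (the Dedekind number for n = 3), so all pairs can be enumerated. An added sketch:

```python
from itertools import product

pts = list(product([0, 1], repeat=3))
pmf = {x: (0.3 if x == (0, 0, 0) else 0.1) for x in pts}

def E(vals):
    return sum(pmf[x] * vals[x] for x in pts)

def cov(f, g):
    return E({x: f[x] * g[x] for x in pts}) - E(f) * E(g)

def leq(a, b):
    return all(u <= v for u, v in zip(a, b))

# Enumerate all monotone (nondecreasing) 0/1 functions on {0,1}^3.
monotone = []
for vals in product([0, 1], repeat=8):
    f = dict(zip(pts, vals))
    if all(f[a] <= f[b] for a in pts for b in pts if leq(a, b)):
        monotone.append(f)

# Association (for these binary functions): every pair has nonnegative covariance.
associated = all(cov(f, g) >= -1e-12 for f in monotone for g in monotone)

# Third cumulant; the distribution is symmetric in the coordinates.
m1 = E({x: x[0] for x in pts})
m2 = E({x: x[0] * x[1] for x in pts})
m3 = E({x: x[0] * x[1] * x[2] for x in pts})
cum3 = m3 - 3 * m1 * m2 + 2 * m1 ** 3   # ≈ -0.012
print(len(monotone), associated, cum3)
```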
REMARK 12. If (X, Y) are binary and cov(X, Y) ≥ 0, then (X, Y) is associated,
as was shown in Barlow and Proschan (1981). However, if (X1, X2, X3) are
binary, then cov(Xi, Xj) ≥ 0, i, j = 1, 2, 3, and cum(X1, X2, X3) ≥ 0 do not
imply (X1, X2, X3) associated, as is seen in Example 7.
EXAMPLE 7. Assume (X1, X2, X3) are binary r.v.'s with the following distribution.
Then cov(Xi, Xj) = 1/180 > 0 and cum(X1, X2, X3) = 1/135 > 0. However,
for the increasing functions max(X1, X2) and max(X1, X3),
cov(max(X1, X2), max(X1, X3)) = −1/900 < 0, so (X1, X2, X3) are not associated.
X1 X2 X3 Pr
0 0 0 0
0 0 1 1/30
0 1 0 1/30
1 0 0 1/30
1 1 0 1/10
1 0 1 1/10
0 1 1 1/10
1 1 1 6/10
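All three quantities of Example 7 can be recomputed directly from the table; this enumeration is an added sketch.

```python
from itertools import combinations

# The distribution of Example 7.
pmf = {(0, 0, 0): 0, (0, 0, 1): 1 / 30, (0, 1, 0): 1 / 30, (1, 0, 0): 1 / 30,
       (1, 1, 0): 1 / 10, (1, 0, 1): 1 / 10, (0, 1, 1): 1 / 10, (1, 1, 1): 6 / 10}

def E(f):
    return sum(p * f(x) for x, p in pmf.items())

covs = [E(lambda x: x[i] * x[j]) - E(lambda x: x[i]) * E(lambda x: x[j])
        for i, j in combinations(range(3), 2)]       # each 1/180

# Third cumulant; the distribution is symmetric in the coordinates.
m1 = E(lambda x: x[0])                               # 5/6
m2 = E(lambda x: x[0] * x[1])                        # 7/10
m3 = E(lambda x: x[0] * x[1] * x[2])                 # 6/10
cum3 = m3 - 3 * m1 * m2 + 2 * m1 ** 3                # 1/135

# Covariance of the increasing functions max(X1, X2) and max(X1, X3).
cov_max = (E(lambda x: max(x[0], x[1]) * max(x[0], x[2]))
           - E(lambda x: max(x[0], x[1])) * E(lambda x: max(x[0], x[2])))
print(covs, cum3, cov_max)                           # cov_max = -1/900 < 0
```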
If we add some restrictions, some results can be obtained. We will give these
and omit the easy proofs.
REMARK 13. Notice that under the preceding assumptions we have the
peculiar situation that PUOD ⇔ NLOD and PLOD ⇔ NUOD.
REFERENCES