This post gives further discussion of conditional distributions. The basic discussion can be found at https://gaomj.cn/probability4/
The following definitions and theorems are not stated in a completely rigorous style; for rigorous statements, the reader is referred to more advanced materials on probability theory.
Suppose we have a random vector $(X,Y)$ with joint distribution measure $P_{X,Y}$. We first define the marginal distribution:
Definition: Define the projection $\pi_X\colon (x,y)\mapsto x$; then the marginal distribution of $X$ is defined as the pushforward measure of $P_{X,Y}$:
$$P_X := P_{X,Y}\circ \pi_X^{-1}.$$
With this definition, the property of the pushforward measure gives
$$P(X\in E_x)=\int_{\pi_X^{-1}(E_x)} dP_{X,Y}=\int_{E_x} dP_X.$$

Let $\kappa_{Y,X}$ be a regular conditional distribution of $Y$ given $X$, so that $\kappa_{Y,X}(x,\cdot)=P(\,\cdot\mid X=x)$. Then,
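In the discrete case the pushforward definition simply says that the marginal of $X$ is obtained by summing the joint table over $y$: for $E_x=\{x\}$, the preimage $\pi_X^{-1}(\{x\})$ is the whole row $\{x\}\times\Omega_Y$. A minimal sketch in Python (the joint table is made up for illustration):

```python
import numpy as np

# Toy joint distribution P_{X,Y} on a 2x3 grid: rows index x, columns index y.
P_XY = np.array([[0.10, 0.20, 0.10],
                 [0.25, 0.15, 0.20]])
assert np.isclose(P_XY.sum(), 1.0)

# Pushforward under pi_X: P_X(E_x) = P_{X,Y}(pi_X^{-1}(E_x)).
# For E_x = {x} the preimage is the row {x} x Omega_Y, so summing out y
# yields the marginal.
P_X = P_XY.sum(axis=1)
print(P_X)  # [0.4 0.6]
```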
Theorem:
1. $$P_{X,Y}(E)=\int \kappa_{Y,X}\big(x,\pi_X^{-1}(x)\cap E\big)\,dP_X.$$
2. $$\int f(x,y)\,P_{X,Y}(dx,dy)=\int dP_X\int f(x,y)\,\kappa_{Y,X}(x,dy).$$
3. $$P_Y(E_y)=\int_{\Omega_1}\kappa_{Y,X}(x,E_y)\,dP_X.$$
Therefore, one often writes $P_{X,Y}=P_X P_{Y\mid X}$.
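On a finite space this factorization can be checked directly: the kernel $\kappa_{Y,X}(x,\cdot)$ is the $x$-th row of the joint table normalized by $P_X(x)$, and multiplying back by $P_X$ recovers the joint. A sketch with a made-up joint table:

```python
import numpy as np

# Toy joint distribution P_{X,Y}: rows index x, columns index y.
P_XY = np.array([[0.10, 0.20, 0.10],
                 [0.25, 0.15, 0.20]])
P_X = P_XY.sum(axis=1)  # marginal of X

# Regular conditional distribution kappa_{Y,X}(x, .) = P(Y = . | X = x):
# each row of the joint table normalized by the marginal at x.
kappa = P_XY / P_X[:, None]
assert np.allclose(kappa.sum(axis=1), 1.0)  # each kappa(x, .) is a probability measure

# Discrete version of P_{X,Y} = P_X P_{Y|X}:
# P_{X,Y}(x, y) = kappa(x, y) * P_X(x).
reconstructed = kappa * P_X[:, None]
assert np.allclose(reconstructed, P_XY)
```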
The following is an informal proof.

(1) We write
$$P(E)=\int \mathbf{1}\{(x,y)\in E\}\,dP_{X,Y}=\int_{\Omega_1}\kappa_{Y,X}\big(x,\pi_X^{-1}(x)\cap E\big)\,d(P_{X,Y}\circ\pi_X^{-1})=\int_{\Omega_1}\kappa_{Y,X}\big(x,\pi_X^{-1}(x)\cap E\big)\,dP_X.$$

(2) We derive it from (see Theorem 24 in https://gaomj.cn/probability4/#sec:2.3)
$$\int f(Y(\omega))\,P(d\omega)=\int P(d\omega)\int f(y)\,\kappa_{Y,\sigma(X)}(\omega,dy).$$
Then
$$\begin{aligned}
\int f(X(\omega),Y(\omega))\,P(d\omega)&=\int P(d\omega)\int f(X(\omega),y)\,\kappa_{Y,\sigma(X)}(\omega,dy),\\
\int f(x,y)\,P_{X,Y}(dx,dy)&=\int dP_X\int f(x,y)\,\kappa_{Y,X}(x,dy).
\end{aligned}$$
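In the discrete case, identity (2) says that the double sum of $f$ against the joint table equals an iterated sum: first against the kernel $\kappa_{Y,X}(x,\cdot)$ for each $x$, then against the marginal $P_X$. A sketch with a made-up joint table and an arbitrary bounded $f$:

```python
import numpy as np

# Toy joint distribution and the induced kernel, as in the discrete setting above.
P_XY = np.array([[0.10, 0.20, 0.10],
                 [0.25, 0.15, 0.20]])
P_X = P_XY.sum(axis=1)
kappa = P_XY / P_X[:, None]  # kappa(x, y) = P(Y = y | X = x)

# An arbitrary bounded f(x, y), tabulated on the same grid.
f = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])

lhs = (f * P_XY).sum()             # integral of f against dP_{X,Y}
inner = (f * kappa).sum(axis=1)    # integral of f(x, y) against kappa(x, dy), for each x
rhs = (inner * P_X).sum()          # then integrate over dP_X
assert np.isclose(lhs, rhs)
```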
(3) With the result of (2), we have
$$P(Y\in E_y)=\int_{y\in E_y} dP_{X,Y}=\int dP_X\int_{E_y}\kappa_{Y,X}(x,dy)=\int dP_X\,\kappa_{Y,X}(x,E_y).$$
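In the discrete case, identity (3) says the marginal of $Y$ is recovered by averaging the kernel rows against $P_X$. A sketch with a made-up joint table, taking $E_y=\{y\}$ for each $y$:

```python
import numpy as np

# Toy joint distribution and the induced kernel.
P_XY = np.array([[0.10, 0.20, 0.10],
                 [0.25, 0.15, 0.20]])
P_X = P_XY.sum(axis=1)
kappa = P_XY / P_X[:, None]  # kappa(x, y) = P(Y = y | X = x)

# Theorem (3): P_Y({y}) = sum over x of kappa(x, {y}) * P_X(x).
P_Y_via_kernel = (kappa * P_X[:, None]).sum(axis=0)
P_Y_direct = P_XY.sum(axis=0)  # marginal of Y by summing out x
assert np.allclose(P_Y_via_kernel, P_Y_direct)
print(P_Y_via_kernel)  # [0.35 0.35 0.3 ]
```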