Expectation of the reciprocal of a variable

Answers:

27

Can it be $1/E(X)$?

No, in general it is not; Jensen's inequality tells us that if $X$ is a random variable and $\varphi$ is a convex function, then $\varphi(E[X]) \le E[\varphi(X)]$. If $X$ is strictly positive, then $1/X$ is convex, so $E[1/X] \ge 1/E[X]$, and for a strictly convex function, equality holds only if $X$ has zero variance... so in the cases we tend to be interested in, the two are generally unequal.

Assuming we are dealing with a positive variable, if it is clear to you that $X$ and $1/X$ will be inversely related ($\operatorname{Cov}(X, 1/X) \le 0$), this would imply $E(X \cdot 1/X) - E(X)\,E(1/X) \le 0$, which implies $E(X)\,E(1/X) \ge 1$, so $E(1/X) \ge 1/E(X)$.
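A quick numerical sketch of the inequality (my own illustration, not part of the answer), assuming a strictly positive, non-degenerate variable. For $X \sim \text{Gamma}(\text{shape}=3, \text{scale}=1)$, we know $E[X] = 3$ and $E[1/X] = 1/(3-1) = 0.5$, so $E[1/X]$ should clearly exceed $1/E[X]$:

```python
# Monte Carlo check that E[1/X] >= 1/E[X] for a positive random variable.
# X ~ Gamma(shape=3, scale=1): E[X] = 3, and the exact E[1/X] = 1/(3-1) = 0.5.
import numpy as np

rng = np.random.default_rng(0)
x = rng.gamma(shape=3.0, scale=1.0, size=1_000_000)

e_inv = (1.0 / x).mean()   # Monte Carlo estimate of E[1/X], ~0.5
inv_e = 1.0 / x.mean()     # 1/E[X], ~0.333

print(e_inv, inv_e)
```

The gap between the two values is exactly the effect Jensen's inequality predicts: it shrinks only as the variance of $X$ shrinks.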

I am confused about applying the expectation to the denominator.

Use the law of the unconscious statistician:

$$E[g(X)] = \int_{-\infty}^{\infty} g(x)\, f_X(x)\, dx$$

(in the continuous case)

so when $g(X) = \frac{1}{X}$, $E\left[\frac{1}{X}\right] = \int_{-\infty}^{\infty} \frac{f(x)}{x}\, dx$.

In some cases, the expectation can be evaluated by inspection (for example with gamma random variables), or by deriving the distribution of the inverse, or by other means.
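The LOTUS recipe can be checked numerically (a sketch of my own, not from the answer): integrate $f(x)/x$ for a gamma variable and compare against the closed form $E[1/X] = 1/(\text{shape}-1)$, which holds for shape $> 1$.

```python
# Evaluate E[1/X] = integral of f(x)/x dx numerically for X ~ Gamma(shape=3, scale=1)
# and compare with the known closed form E[1/X] = 1/(shape - 1) = 0.5.
import numpy as np
from scipy import stats
from scipy.integrate import quad

shape = 3.0
integrand = lambda x: stats.gamma.pdf(x, a=shape) / x  # f(x)/x, i.e. LOTUS with g(x) = 1/x
lotus_value, _ = quad(integrand, 0, np.inf)

print(lotus_value)   # ~0.5
```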

Glen_b -Reinstate Monica
14

As Glen_b says, it is probably wrong, because the reciprocal is a nonlinear function. If you want an approximation of $E(1/X)$, maybe you can use a Taylor expansion around $E(X)$:

$$E\left(\frac{1}{X}\right) \approx E\left(\frac{1}{E(X)} - \frac{1}{E(X)^2}\big(X - E(X)\big) + \frac{1}{E(X)^3}\big(X - E(X)\big)^2\right) = \frac{1}{E(X)} + \frac{1}{E(X)^3}\operatorname{Var}(X)$$
so you just need the mean and variance of $X$, and if the distribution of $X$ is symmetric this approximation can be very accurate.

EDIT: the "maybe" above is quite critical; see the comment from BloXX below.

Matteo Fasiolo
Oh yes, yes... I am very sorry that I could not apprehend that fact... I have one more question... Is this applicable to any kind of function? Actually I am stuck with $|X|$... How can the expectation of $|X|$ be deduced in terms of $E(X)$ and $V(X)$?
Sandipan Karmakar
2
I don't think you can use it for $|X|$, as that function is not differentiable. I would rather split the problem into cases and say $E(|X|) = E(X \mid X>0)\,p(X>0) - E(X \mid X<0)\,p(X<0)$, I guess.
Matteo Fasiolo
1
@MatteoFasiolo Can you please explain why the symmetry of the distribution of $X$ (or lack thereof) has an effect on the accuracy of the Taylor approximation? Do you have a source that you could point me to that explains why this is?
Aaron Hendrickson
1
@AaronHendrickson my reasoning is simply that the next term in the expansion is proportional to $E\{(X-E(X))^3\}$, which is related to the skewness of the distribution of $X$. Skewness is an asymmetry measure. However, zero skewness does not guarantee symmetry, and I am not sure whether symmetry guarantees zero skewness. Hence, this is all heuristic and there might be plenty of counterexamples.
Matteo Fasiolo
4
I don't understand how this solution gets so many upvotes. For a single random variable $X$ there is no justification of the quality of this approximation. The third derivative of $f(x) = 1/x$ is not bounded. Moreover, the remainder of the approximation is $\frac{1}{6} f'''(\xi)(X-\mu)^3$, where $\xi$ is itself a random variable between $X$ and $\mu$. The remainder won't vanish in general and may be very large. The Taylor approximation may only be useful if one has a sequence of random variables $X_n - \mu = O_p(a_n)$ with $a_n \to 0$. Even then, uniform integrability is additionally needed if one is interested in the expectation.
BloXX
8

Others have already explained that the answer to the question is NO, except in trivial cases. Below we give an approach to finding $E\left(\frac{1}{X}\right)$ when $X > 0$ with probability one and the moment generating function $M_X(t) = E\,e^{tX}$ exists. An application of this method (and a generalization) is given in "Expected value of $1/x$ when $x$ follows a Beta distribution"; we will here also give a simpler example.

First, note that $\int_0^\infty e^{-tx}\, dt = \frac{1}{x}$ (a simple calculus exercise). Then write
$$E\left(\frac{1}{X}\right) = \int_0^\infty x^{-1} f(x)\, dx = \int_0^\infty \left(\int_0^\infty e^{-tx}\, dt\right) f(x)\, dx = \int_0^\infty \left(\int_0^\infty e^{-tx} f(x)\, dx\right) dt = \int_0^\infty M_X(-t)\, dt.$$
A simple application: let $X$ have the exponential distribution with rate 1, that is, with density $e^{-x},\ x>0$, and moment generating function $M_X(t) = \frac{1}{1-t},\ t<1$. Then $\int_0^\infty M_X(-t)\, dt = \int_0^\infty \frac{1}{1+t}\, dt = \ln(1+t)\Big|_0^\infty = \infty$, so it definitely does not converge, and is very different from $\frac{1}{EX} = \frac{1}{1} = 1$.
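The identity can also be verified numerically in a case where the integral does converge (my own example, in the same spirit): for $X \sim \text{Gamma}(\text{shape}=2, \text{rate}=1)$ we have $M_X(-t) = (1+t)^{-2}$ and, exactly, $E(1/X) = 1/(\text{shape}-1) = 1$.

```python
# Check E(1/X) = integral from 0 to infinity of M_X(-t) dt
# for X ~ Gamma(shape=2, rate=1), where M_X(-t) = (1 + t)^(-2) and E(1/X) = 1.
import numpy as np
from scipy.integrate import quad

mgf_neg = lambda t: (1.0 + t) ** -2.0   # M_X(-t) for Gamma(shape=2, rate=1)
value, _ = quad(mgf_neg, 0, np.inf)

print(value)   # ~1.0
```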
kjetil b halvorsen
7

An alternative approach to calculating $E(1/X)$, knowing $X$ is a positive random variable, is through its moment generating function $E[e^{-\lambda X}]$. Since by elementary calculus
$$\int_0^\infty e^{-\lambda x}\, d\lambda = \frac{1}{x},$$
we have, by Fubini's theorem,
$$\int_0^\infty E[e^{-\lambda X}]\, d\lambda = E\left[\frac{1}{X}\right].$$
user172761
2
The idea here is right, but the details are wrong. Please check.
kjetil b halvorsen
1
@Kjetil I don't see what the problem is: apart from the inconsequential differences of using $-tX$ instead of $tX$ in the definition of the MGF and naming the variable $t$ instead of $\lambda$, the answer you just posted is identical to this one.
whuber
1
You are right, the problems were smaller than I thought. Still, this answer would be better with some more details. I will upvote it tomorrow (when I have new votes).
kjetil b halvorsen
1

To first give an intuition, what about using the discrete case in a finite sample to illustrate that $E(1/X) \neq 1/E(X)$ (putting aside cases such as $E(X) = 0$)?

In a finite sample, using the term average for expectation is not that abusive; thus if one has on the one hand

$$E(X) = \frac{1}{N} \sum_{i=1}^{N} X_i$$

and one has on the other hand

$$E(1/X) = \frac{1}{N} \sum_{i=1}^{N} \frac{1}{X_i}$$

it becomes obvious that, with $N > 1$,

$$E(1/X) = \frac{1}{N} \sum_{i=1}^{N} \frac{1}{X_i} \neq \frac{N}{\sum_{i=1}^{N} X_i} = 1/E(X)$$

which leads one to say that, basically, $E(1/X) \neq 1/E(X)$, since the inverse of the (discrete) sum is not the (discrete) sum of inverses.
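The finite-sample point fits in a couple of lines (the sample values are my own, purely illustrative):

```python
# The mean of the inverses is not the inverse of the mean.
import numpy as np

x = np.array([1.0, 2.0, 4.0])

mean_of_inv = np.mean(1.0 / x)   # (1 + 1/2 + 1/4) / 3 = 7/12
inv_of_mean = 1.0 / np.mean(x)   # 3 / 7

print(mean_of_inv, inv_of_mean)
```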

Analogously in the asymptotic 0-centered continuous case, one has

$$E(1/X) = \int \frac{f(x)}{x}\, dx \neq \frac{1}{\int x f(x)\, dx} = 1/E(X).$$

keepAlive