Expectation of inverse of sum of positive iid variables




Let $(X_i)_i$ be a sequence of iid positive variables of mean $1$ and variance $\sigma^2$. Let $\bar{X}_n = \frac{\sum_{i=1}^n X_i}{n}$.



My question is: can we bound $\mathbb{E}(1/\bar{X}_n)$ as a function of $\sigma$ and $n$?



There seems to be a strategy that may work based on the Taylor expansion, but I'm not sure about:



  • the hypotheses that need to be met;

  • whether it works in this case; and

  • whether we can say something definite about $\bar{X}_n$, or whether we need the central limit theorem and can then only say this for the normal approximation.

More details about the Taylor expansion. According to this Wikipedia article,
$$\mathbb{E}(f(X)) \approx f(\mu_X) + \frac{f''(\mu_X)}{2}\sigma_X^2$$



So in my case it would give something like:
$$\mathbb{E}(1/\bar{X}_n) \approx 1 + \frac{\sigma^2}{4n}$$
I'm trying to find a formal proof of a similar result, or hypotheses under which it works. Maybe references?
Thanks



EDIT: if needed, we can consider that the $(X_i)_i$ are discrete: there exist $v_1<\cdots<v_K$ such that $\mathbb{P}(X=v_k)=p_k$ and $\sum p_k = 1$. In this case we know that $\bar{X}_n \geq v_1$. Although I believe something can be said in the general case.



PS: this is almost a cross-post of this question on Math.SE.
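As a quick numerical sanity check (an illustration added here; the two-point distribution below is an assumption, not something from the thread), one can estimate $\mathbb{E}(1/\bar{X}_n)$ by simulation and compare it with the Jensen lower bound $1/\mathbb{E}(\bar{X}_n) = 1$:

```python
import numpy as np

# Assumed example distribution: X in {0.5, 1.5} with equal probability, so that
# E(X) = 1, Var(X) = sigma^2 = 0.25, and X_bar_n >= v_1 = 0.5 > 0, which
# guarantees that E(1/X_bar_n) is finite.
rng = np.random.default_rng(0)
n, reps = 50, 100_000
x = rng.choice([0.5, 1.5], size=(reps, n))  # reps independent samples of size n
xbar = x.mean(axis=1)                       # the sample means X_bar_n
est = (1.0 / xbar).mean()                   # Monte Carlo estimate of E(1/X_bar_n)

sigma2 = 0.25
print(est)           # slightly above 1, as Jensen's inequality requires
print(sigma2 / n)    # Var(X_bar_n), the scale of the second-order correction
```

The estimate exceeds $1$ by an amount on the order of $\mathrm{Var}(\bar{X}_n) = \sigma^2/n$, which is the scale the Taylor-style corrections above predict.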
































  • Hello Gopi, thank you for your question. If this is ALMOST a cross-post, your question is welcome here; but if it is a cross-post, you should rather put a bounty on the post on Math.SE.
    – Ferdi
    Mar 25 at 7:58










  • It's almost; the question is different: bounds on the expectation vs. convergence speed. If I get the answer on Math.SE it probably won't help me with this question, but I've referenced it because it's likely that someone who has the answer to one has the answer to both.
    – Gopi
    Mar 25 at 8:04










  • Perhaps Markov's inequality or Chebyshev's inequality are useful.
    – Ertxiem
    Mar 25 at 8:07










  • Without the discreteness criterion, the answer is no, because any finite bound based on $\sigma^2, n$ will be exceeded when the underlying distribution is a suitable mixture of a $\Gamma\left(\frac{1}{n+1}, (n+1)\right)$ distribution with some other distribution. The presence of this Gamma component assures the sum of $n$ iid values will have a Gamma component with shape less than $1$, whose PDF diverges at $0$, forcing the reciprocal sum to have infinite expectation. This demonstrates the hopelessness of using a Taylor expansion in the analysis.
    – whuber
    Mar 26 at 18:46











  • @whuber thanks, I'm not sure I've understood the details. This means that I've probably made a mistake in my answer, but I can't see where? In addition, by the CLT $\bar{X}_n$ converges towards a Gaussian, so if $X_i$ has finite moments, shouldn't the Gaussian approximation of $\bar{X}_n$ have finite moments?
    – Gopi
    Mar 27 at 1:42

















Tags: variance, expected-value, iid






edited Mar 25 at 12:26







Gopi

















asked Mar 25 at 7:49









Gopi










2 Answers
You cannot bound that expectation in terms of $\sigma, n$. That's because there is the distinct possibility that the expectation does not exist at all (or is $\infty$). See "I've heard that ratios or inverses of random variables often are problematic, in not having expectations. Why is that?". If the conditions given there are fulfilled for the density of $X_1$, they will also be for the density of $\bar{X}_n$. If densities do not exist but probability mass functions do, it is simpler, since your assumptions prohibit a probability atom at zero; but a probability density can still be positive at zero even if $P(X>0)=1$.



For a useful bound you will at least need to restrict the common distribution of $X_1, \dotsc, X_n$ much more.



EDIT


After your new information, and with $v_1>0$, the expectation of $1/\bar{X}_n$ certainly will exist (irrespective of whether $K$ is finite or not). And, since the function $x \mapsto 1/x$ is convex for $x>0$, we can use the Jensen inequality to conclude that $\mathbb{E}(1/\bar{X}_n) \ge 1/\mathbb{E}(\bar{X}_n)$.
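To see the non-existence mechanism numerically (an added illustration; the $\Gamma(1/2, 1)$ density is my assumed example of a density that is positive, indeed divergent, at $0$ even though $P(X>0)=1$), one can watch the truncated integral $\int_\epsilon^1 x^{-1} f(x)\,dx$ blow up as the truncation point $\epsilon$ shrinks:

```python
import numpy as np
from math import gamma

# Assumed example: f(x) = x^{-1/2} e^{-x} / Gamma(1/2), the Gamma(1/2, 1)
# density. P(X > 0) = 1, yet f diverges at 0, so E(1/X) does not exist.
def truncated_reciprocal_mass(eps, upper=1.0, m=200_000):
    # Trapezoidal approximation of the integral of (1/x) f(x) on [eps, upper],
    # on a log-spaced grid to resolve the behaviour near zero.
    grid = np.logspace(np.log10(eps), np.log10(upper), m)
    y = grid ** (-1.5) * np.exp(-grid) / gamma(0.5)  # (1/x) * f(x)
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(grid)))

vals = [truncated_reciprocal_mass(eps) for eps in (1e-2, 1e-4, 1e-6)]
print(vals)  # grows without bound as eps -> 0: no finite expectation
```

The truncated mass grows roughly like $2\epsilon^{-1/2}/\sqrt{\pi}$, so no finite bound in $\sigma, n$ alone can hold without extra restrictions on the distribution.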






answered Mar 25 at 9:02 (edited Mar 25 at 11:27) – kjetil b halvorsen























  • But the thing is that this is really not a general case but a very specific case: it's very unlikely that there is probability mass near $0$ (we can even evaluate it with Markov's inequality): $\bar{X}_n$ is centered around $1$ and has a variance of $\sigma^2/n$.
    – Gopi
    Mar 25 at 9:18







  • But even a tiny (but positive) probability close to zero can lead to the expectation of the inverse being $\infty$. Can you rule out that possibility?
    – kjetil b halvorsen
    Mar 25 at 9:26










  • In my case I can indeed (the distribution of $X_i$ is discrete). I'll add this hypothesis to the question. I'm still interested, from a theoretical perspective, in the general case; I believe something can still be said there (or I'm interested in a counter-example that would show your statement).
    – Gopi
    Mar 25 at 9:54






  • Information about discreteness is really important! And, on what support? If positive integers, $P(X=0)=0$ and $\mu=1$ is really restrictive ...
    – kjetil b halvorsen
    Mar 25 at 9:58






  • Obviously the support is not positive integers ;). The support is the set of rational numbers.
    – Gopi
    Mar 25 at 10:07


















I think I have the gist of it.
Given that $f(x)=1/x$ is infinitely differentiable at $1$, Taylor's theorem tells us:



There exists $\varepsilon>0$ such that $f(x) = f(1) + f'(1)(x-1) + \frac{f''(1)(x-1)^2}{2} + \frac{f'''(\varepsilon)(x-1)^2}{2}$.



In our case, if $X_i$ lies in the domain $[v_1; +\infty[$, then $\bar{X}_n$ has the same domain and we have $\varepsilon \geq v_1$.



Hence
$\mathbb{E}(1/\bar{X}_n) = \mathbb{E}\left(1 - (\bar{X}_n-1) + \frac{(\bar{X}_n-1)^2}{4} + \frac{f'''(\varepsilon)(\bar{X}_n-1)^2}{2}\right)$, and
\begin{align*}
\mathbb{E}(1/\bar{X}_n) &= 1 + \frac{V(\bar{X}_n)}{4} + \frac{f'''(\varepsilon)\,\mathbb{E}\left((\bar{X}_n-1)^2\right)}{2} = 1 + \frac{V(\bar{X}_n)}{4} - \frac{V(\bar{X}_n)}{12\,\varepsilon^4}
\end{align*}

and hence
$$1 + \frac{\sigma^2}{4n} - \frac{\sigma^2}{12 v_1^4 n} \leq \mathbb{E}(1/\bar{X}_n) \leq 1 + \frac{\sigma^2}{4n}.$$



For the case where the $X_i$ do not admit a minimum but have an unlimited number of moments, one can do a similar transformation using the full Taylor expansion:



\begin{align*}
\mathbb{E}(1/\bar{X}_n) &= \sum_{i=0}^{+\infty} \frac{f^{(i)}(1)}{i!}\mathbb{E}\left((\bar{X}_n-1)^i\right)\\
&= \sum_{i=0}^{+\infty} \frac{(-1)^i\, i!}{i!}\mathbb{E}\left((\bar{X}_n-1)^i\right)
\end{align*}



Now if we can say something about the $k^{th}$ moment of $\tilde{X}_n = \bar{X}_n - 1$ being $O(1/n^{k/2})$, this validates that $\mathbb{E}(1/\bar{X}_n) \approx 1 + \frac{\sigma^2}{4n}$.
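The moment-scaling step at the end can be checked by simulation (an added sketch; the two-point distribution is an assumed example): even central moments of $\bar{X}_n$ shrink like $n^{-k/2}$, so multiplying $n$ by four should divide the second moment by about $4$ and the fourth by about $16$:

```python
import numpy as np

rng = np.random.default_rng(1)

def central_moment(n, k, reps=50_000):
    # k-th empirical central moment of X_bar_n for the assumed distribution
    # X in {0.5, 1.5} (mean 1, sigma^2 = 0.25).
    x = rng.choice([0.5, 1.5], size=(reps, n))
    d = x.mean(axis=1) - 1.0        # X_bar_n - mu, with mu = 1
    return float((d ** k).mean())

m2_small, m2_big = central_moment(25, 2), central_moment(100, 2)
m4_small, m4_big = central_moment(25, 4), central_moment(100, 4)
print(m2_small / m2_big)  # close to 4:  2nd central moment scales like 1/n
print(m4_small / m4_big)  # close to 16: 4th central moment scales like 1/n^2
```

This is consistent with the $O(n^{-p/2})$ behaviour referenced in the comments.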





























  • Turns out that $\bar{X}_n$ does admit $n$ moments and that they are of the form $O(n^{-p/2})$: arxiv.org/pdf/1105.6283.pdf
    – Gopi
    Mar 25 at 12:47













Your Answer








StackExchange.ready(function()
var channelOptions =
tags: "".split(" "),
id: "65"
;
initTagRenderer("".split(" "), "".split(" "), channelOptions);

StackExchange.using("externalEditor", function()
// Have to fire editor after snippets, if snippets enabled
if (StackExchange.settings.snippets.snippetsEnabled)
StackExchange.using("snippets", function()
createEditor();
);

else
createEditor();

);

function createEditor()
StackExchange.prepareEditor(
heartbeatType: 'answer',
autoActivateHeartbeat: false,
convertImagesToLinks: false,
noModals: true,
showLowRepImageUploadWarning: true,
reputationToPostImages: null,
bindNavPrevention: true,
postfix: "",
imageUploader:
brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/3.0/"u003ecc by-sa 3.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
allowUrls: true
,
onDemand: true,
discardSelector: ".discard-answer"
,immediatelyShowMarkdownHelp:true
);



);













draft saved

draft discarded


















StackExchange.ready(
function ()
StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fstats.stackexchange.com%2fquestions%2f399261%2fexpectation-of-inverse-of-sum-of-positive-iid-variables%23new-answer', 'question_page');

);

Post as a guest















Required, but never shown

























2 Answers
2






active

oldest

votes








2 Answers
2






active

oldest

votes









active

oldest

votes






active

oldest

votes









2












$begingroup$

You cannot bound that expectation in $sigma, n$. That's because there is the distinct possibility that the expectation do not exist at all (or, is $infty$.) See I've heard that ratios or inverses of random variables often are problematic, in not having expectations. Why is that?. If the conditions given there is fulfilled for the density of $X_1$, it will so be for the density of $barX_n$. If densities do not exist, but probability mass functions do, it is simpler, since your assumptions prohibit a probability atom at zero, but a probability density can still be positive at zero even if $P(X >0)=1$.



For a useful bound you will at least need to restrict the common distribution of $X_1, dotsc, X_n$ much more.



EDIT


After your new information, and with $v_1>0$, the expectation of $1/barX_n$ certainly will exist (irrespective if $K$ is finite or not.) And, since the function $xmapsto 1/x$ is convex for $x>0$, we can use the Jensen Inequality to conclude that $DeclareMathOperatorEmathbbEE 1/barX_n ge 1/E barX_n$.






share|cite|improve this answer











$endgroup$












  • $begingroup$
    But the thinig is that this is really not a general case but a very specific case: it's very unlikely that there is a probability mass near 0 (we can even evaluate it with Markov's inequality): $barX_n$ is centered around 1 and has a variance of $sigma^2 / n$.
    $endgroup$
    – Gopi
    Mar 25 at 9:18







  • 1




    $begingroup$
    But even a tiny (but positive) probability close to zero can lead to expectation of inverse being $infty$. Can you rule out that possibility?
    $endgroup$
    – kjetil b halvorsen
    Mar 25 at 9:26










  • $begingroup$
    In my case I can indeed (the distribution of $X_i$ is discrete). I'll add this hypothesis to the question. I'm still interested from a theoretical perspective by the general case, I believe something can still be said in this case (or I'm interested by a counter example that would show your statement).
    $endgroup$
    – Gopi
    Mar 25 at 9:54






  • 1




    $begingroup$
    Information about discreteness is really important! And, n what support? If positive integers, $P(X=0)=0$ and $mu=1$ is really restrictive ...
    $endgroup$
    – kjetil b halvorsen
    Mar 25 at 9:58






  • 1




    $begingroup$
    Obviously the support is not positive integers ;). The support is the set of rational numbers.
    $endgroup$
    – Gopi
    Mar 25 at 10:07















2












$begingroup$

You cannot bound that expectation in $sigma, n$. That's because there is the distinct possibility that the expectation do not exist at all (or, is $infty$.) See I've heard that ratios or inverses of random variables often are problematic, in not having expectations. Why is that?. If the conditions given there is fulfilled for the density of $X_1$, it will so be for the density of $barX_n$. If densities do not exist, but probability mass functions do, it is simpler, since your assumptions prohibit a probability atom at zero, but a probability density can still be positive at zero even if $P(X >0)=1$.



For a useful bound you will at least need to restrict the common distribution of $X_1, dotsc, X_n$ much more.



EDIT


After your new information, and with $v_1>0$, the expectation of $1/barX_n$ certainly will exist (irrespective if $K$ is finite or not.) And, since the function $xmapsto 1/x$ is convex for $x>0$, we can use the Jensen Inequality to conclude that $DeclareMathOperatorEmathbbEE 1/barX_n ge 1/E barX_n$.






share|cite|improve this answer











$endgroup$












  • $begingroup$
    But the thinig is that this is really not a general case but a very specific case: it's very unlikely that there is a probability mass near 0 (we can even evaluate it with Markov's inequality): $barX_n$ is centered around 1 and has a variance of $sigma^2 / n$.
    $endgroup$
    – Gopi
    Mar 25 at 9:18







  • 1




    $begingroup$
    But even a tiny (but positive) probability close to zero can lead to expectation of inverse being $infty$. Can you rule out that possibility?
    $endgroup$
    – kjetil b halvorsen
    Mar 25 at 9:26










  • $begingroup$
    In my case I can indeed (the distribution of $X_i$ is discrete). I'll add this hypothesis to the question. I'm still interested from a theoretical perspective by the general case, I believe something can still be said in this case (or I'm interested by a counter example that would show your statement).
    $endgroup$
    – Gopi
    Mar 25 at 9:54






  • 1




    $begingroup$
    Information about discreteness is really important! And, n what support? If positive integers, $P(X=0)=0$ and $mu=1$ is really restrictive ...
    $endgroup$
    – kjetil b halvorsen
    Mar 25 at 9:58






  • 1




    $begingroup$
    Obviously the support is not positive integers ;). The support is the set of rational numbers.
    $endgroup$
    – Gopi
    Mar 25 at 10:07













2












2








2





$begingroup$

You cannot bound that expectation in $sigma, n$. That's because there is the distinct possibility that the expectation do not exist at all (or, is $infty$.) See I've heard that ratios or inverses of random variables often are problematic, in not having expectations. Why is that?. If the conditions given there is fulfilled for the density of $X_1$, it will so be for the density of $barX_n$. If densities do not exist, but probability mass functions do, it is simpler, since your assumptions prohibit a probability atom at zero, but a probability density can still be positive at zero even if $P(X >0)=1$.



For a useful bound you will at least need to restrict the common distribution of $X_1, dotsc, X_n$ much more.



EDIT


After your new information, and with $v_1>0$, the expectation of $1/barX_n$ certainly will exist (irrespective if $K$ is finite or not.) And, since the function $xmapsto 1/x$ is convex for $x>0$, we can use the Jensen Inequality to conclude that $DeclareMathOperatorEmathbbEE 1/barX_n ge 1/E barX_n$.






edited Mar 25 at 11:27
answered Mar 25 at 9:02
kjetil b halvorsen
34.8k reputation · 9 gold badges · 90 silver badges · 265 bronze badges











  • $begingroup$
    But the thing is that this is really not the general case but a very specific one: it is very unlikely that there is probability mass near 0 (we can even bound it with Markov's inequality): $\bar{X}_n$ is centered around 1 and has a variance of $\sigma^2/n$.
    $endgroup$
    – Gopi
    Mar 25 at 9:18







  • 1




    $begingroup$
    But even a tiny (but positive) probability close to zero can lead to the expectation of the inverse being $\infty$. Can you rule out that possibility?
    $endgroup$
    – kjetil b halvorsen
    Mar 25 at 9:26










  • $begingroup$
    In my case I can indeed (the distribution of $X_i$ is discrete). I'll add this hypothesis to the question. I'm still interested from a theoretical perspective in the general case; I believe something can still be said there (or I'm interested in a counterexample that would show your statement).
    $endgroup$
    – Gopi
    Mar 25 at 9:54






  • 1




    $begingroup$
    Information about discreteness is really important! And, on what support? If positive integers, $P(X=0)=0$ and $\mu=1$ is really restrictive ...
    $endgroup$
    – kjetil b halvorsen
    Mar 25 at 9:58






  • 1




    $begingroup$
    Obviously the support is not positive integers ;). The support is the set of rational numbers.
    $endgroup$
    – Gopi
    Mar 25 at 10:07
















I think I have the gist of it.
Given that $f(x) = 1/x$ is infinitely differentiable at $1$, Taylor's theorem tells us:



There exists $\varepsilon$ between $x$ and $1$ such that $f(x) = f(1) + f'(1)(x-1) + \frac{f''(1)(x-1)^2}{2} + \frac{f'''(\varepsilon)(x-1)^2}{2}$.



In our case, if $X_i$ lies in the domain $[v_1; +\infty[$, then $\bar{X}_n$ has the same domain and we have $\varepsilon \geq v_1$.



Hence
$\mathbb{E}(1/\bar{X}_n) = \mathbb{E}\left(1 - (\bar{X}_n - 1) + \frac{(\bar{X}_n-1)^2}{4} + \frac{f'''(\varepsilon)(\bar{X}_n-1)^2}{2}\right)$, and
\begin{align*}
\mathbb{E}(1/\bar{X}_n) &= 1 + \frac{f'''(\varepsilon)\,\mathbb{E}\left((\bar{X}_n-1)^2\right)}{2} = 1 + \frac{V(\bar{X}_n)}{4} - \frac{V(\bar{X}_n)}{12\,\varepsilon^4}
\end{align*}

and hence
$$1 + \frac{\sigma^2}{4n} - \frac{\sigma^2}{12\, v_1^4\, n} \leq \mathbb{E}(1/\bar{X}_n) \leq 1 + \frac{\sigma^2}{4n}.$$



For the case where the $X_i$ do not admit a minimum but have an unlimited number of moments, one can do a similar transformation using the full Taylor expansion:



\begin{align*}
\mathbb{E}(1/\bar{X}_n) &= \sum_{i=0}^{+\infty} \frac{f^{(i)}(1)}{i!}\,\mathbb{E}\left((\bar{X}_n-1)^i\right)\\
&= \sum_{i=0}^{+\infty} \frac{(-1)^i\, i!}{i!}\,\mathbb{E}\left((\bar{X}_n-1)^i\right)
\end{align*}



Now if we can say something about the $k^{\text{th}}$ moment of $\tilde{X}_n$ being $O(1/n^{k/2})$, this validates that $\mathbb{E}(1/\bar{X}_n) \approx 1 + \frac{\sigma^2}{4n}$.
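As a numerical sanity check on the full expansion (a sketch under an assumed two-point distribution, $X_i \in \{0.5, 1.5\}$ with equal probability, so that $|\bar{X}_n - 1| \le 0.5 < 1$ and the series converges), one can compare its partial sums against the exactly enumerated value of $\mathbb{E}(1/\bar{X}_n)$:

```python
import itertools

# Assumed two-point distribution: X_i in {0.5, 1.5} with probability 1/2,
# mean 1. Then |Xbar_n - 1| <= 0.5 < 1, so the expansion
#   E[1/Xbar_n] = sum_i (-1)^i E[(Xbar_n - 1)^i]
# converges. We check its partial sums against the exact value for n = 5.
n = 5
outcomes = list(itertools.product((0.5, 1.5), repeat=n))
p = 1.0 / len(outcomes)

exact = sum(p / (sum(o) / n) for o in outcomes)

def partial_sum(num_terms):
    # Truncate the series after num_terms central-moment terms.
    total = 0.0
    for i in range(num_terms):
        moment = sum(p * (sum(o) / n - 1.0) ** i for o in outcomes)
        total += (-1) ** i * moment
    return total

for num_terms in (2, 4, 8, 16):
    print(num_terms, abs(partial_sum(num_terms) - exact))
```

The truncation error shrinks geometrically, since each omitted term is bounded by $\mathbb{E}\left[|\bar{X}_n - 1|^i\right] \le 0.5^i$.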






  • $begingroup$
    Turns out that $\bar{X}_n$ does admit $n$ moments and that they are of the form $O(n^{-p/2})$: arxiv.org/pdf/1105.6283.pdf
    $endgroup$
    – Gopi
    Mar 25 at 12:47















edited Mar 26 at 17:23
answered Mar 25 at 11:16
Gopi
140 reputation · 6 bronze badges










