Computing the expectation of the number of balls in a box

  • There are $r$ boxes and $n$ balls.

  • Each ball is placed in a box with equal probability, independently of the other balls.

  • Let $X_{i}$ be the number of balls in box $i$,
    $1 \leq i \leq r$.

  • Compute $\mathbb{E}\left[X_{i}\right]$ and $\mathbb{E}\left[X_{i}X_{j}\right]$.


I am preparing for an exam, and I have no idea how to approach this problem. Can someone push me in the right direction?










probability-theory

asked 2 days ago by 631; edited 2 days ago by Felix Marin












  • Are there any restrictions on $j$?
    – Sean Lee
    2 days ago










  • @SeanLee In the question, no. I'm guessing it would have the same restrictions as $i$.
    – 631
    2 days ago










  • Computationally, the answer to the second part appears to be $\frac{n^2}{r^2}$.
    – Sean Lee
    2 days ago










  • Have you studied covariance matrices, or vector-valued random variables, at all? That would seem to me to provide the most compact notation for solving this problem.
    – Daniel Schepler
    2 days ago
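
As a quick sanity check on Sean Lee's "computationally" comment above, here is a minimal Monte Carlo sketch in Python (the function name and parameters are mine, not from the thread). It drops $n$ balls into $r$ boxes uniformly at random, estimates both moments, and prints the two candidate values for $\mathbb{E}[X_iX_j]$ that come up in the answers below, for comparison.

    import random

    def estimate_moments(n=5, r=3, trials=200_000, seed=0):
        """Estimate E[X_1] and E[X_1 X_2] by direct simulation."""
        rng = random.Random(seed)
        sum_x1 = 0.0
        sum_x1x2 = 0.0
        for _ in range(trials):
            counts = [0] * r
            for _ in range(n):                  # each ball picks a box uniformly
                counts[rng.randrange(r)] += 1
            sum_x1 += counts[0]
            sum_x1x2 += counts[0] * counts[1]   # two distinct boxes, i != j
        return sum_x1 / trials, sum_x1x2 / trials

    n, r = 5, 3
    e_x1, e_x1x2 = estimate_moments(n, r)
    print(f"E[X_1]     ~ {e_x1:.4f}  (n/r = {n/r:.4f})")
    print(f"E[X_1 X_2] ~ {e_x1x2:.4f}  (n^2/r^2 = {n**2/r**2:.4f}, n(n-1)/r^2 = {n*(n-1)/r**2:.4f})")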
















3 Answers












Since there are $r$ boxes and $n$ balls, and each ball is placed in a box with equal probability, we have:

$$ \mathbb{E}[X_i] = \frac{n}{r} $$

Now, we would like to know what $\mathbb{E}[X_i X_j]$ is.

We begin by making the following observation:

$$ X_i = n - \sum_{j\neq i}X_j, $$

which gives us:

$$ X_i\sum_{j\neq i}X_j = nX_i - X_i^2. $$

Now, fix $i$ (we can do this because of the symmetry in the question), and thus we have:

\begin{align}
\mathbb{E}[X_i X_j] &= \frac{1}{r}\Big(\mathbb{E}\Big[X_i \sum_{j\neq i} X_j\Big] + \mathbb{E}[X_i^2]\Big) \\
&= \frac{1}{r} \mathbb{E}[nX_i] \\
&= \frac{n^2}{r^2}
\end{align}






– Sean Lee (answered 2 days ago; edited 2 days ago)









  • If indeed $E(X_i X_j) = E(X_i) E(X_j)$ for $i \ne j$ then that implies zero correlation. I would expect a bit of negative correlation. (And indeed, my preliminary calculation based on the decomposition from VHarisop's answer seems to result in $E(X_i X_j) = \frac{n(n-1)}{r^2}$ for $i \ne j$ and $E(X_i^2) = \frac{n}{r} + \frac{n(n-1)}{r^2}$.)
    – Daniel Schepler
    2 days ago












  • Yeah, it seemed a little strange to me initially, but it's consistent with your results btw: $\frac{1}{r}[(r-1)E(X_iX_j) + E(X_i^2)] = \frac{n^2}{r^2}$
    – Sean Lee
    2 days ago








  • I've now expanded VHarisop's answer with my calculations for part two of the question.
    – Daniel Schepler
    2 days ago
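
Daniel Schepler's values can also be verified exactly, without simulation, by enumerating all $r^n$ equally likely placements. A small brute-force sketch (my code, feasible only for small $n$ and $r$):

    from itertools import product
    from fractions import Fraction

    def exact_moments(n, r):
        """Exact E[X_1 X_2] and E[X_1^2] over all r**n equally likely placements."""
        total = r ** n
        e_x1x2 = Fraction(0)
        e_x1sq = Fraction(0)
        for placement in product(range(r), repeat=n):   # placement[k] = box of ball k
            x1 = placement.count(0)                     # balls landing in box 1
            x2 = placement.count(1)                     # balls landing in box 2
            e_x1x2 += Fraction(x1 * x2, total)
            e_x1sq += Fraction(x1 * x1, total)
        return e_x1x2, e_x1sq

    n, r = 4, 3
    e_x1x2, e_x1sq = exact_moments(n, r)
    assert e_x1x2 == Fraction(n * (n - 1), r * r)                   # n(n-1)/r^2
    assert e_x1sq == Fraction(n, r) + Fraction(n * (n - 1), r * r)  # n/r + n(n-1)/r^2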



















For the first part, you can use linearity of expectation to compute $\mathbb{E}[X_i]$.
Specifically, you know that for a fixed box, the probability of putting a ball in it
is $\frac{1}{r}$. Let

$$
Y_k^{(i)} = \begin{cases}
1 &, \text{ if ball $k$ was placed in box $i$} \\
0 &, \text{ otherwise}
\end{cases},
$$

which satisfies $\mathbb{E}[Y_k^{(i)}] = \mathbb{P}(Y_k^{(i)} = 1) = \frac{1}{r}.$
Then you can write

$$
X_i = \sum_{j=1}^n Y_j^{(i)} \Rightarrow \mathbb{E}X_i = \sum_{j=1}^n \frac{1}{r} = \frac{n}{r}.
$$


For the second part, you can proceed similarly: $X_i = \sum_{k=1}^n Y_k^{(i)}$ and $X_j = \sum_{\ell=1}^n Y_{\ell}^{(j)}$, so:
$$
X_i X_j = \sum_{k=1}^n \sum_{\ell=1}^n Y_k^{(i)} Y_{\ell}^{(j)} \implies
\mathbb{E}(X_i X_j) = \sum_{k=1}^n \sum_{\ell=1}^n \mathbb{E}(Y_k^{(i)} Y_{\ell}^{(j)}).
$$

We will first treat the case where $i \ne j$. Then, for each term in the sum such that $k = \ell$, we must have $Y_k^{(i)} Y_{\ell}^{(j)} = Y_k^{(i)} Y_k^{(j)} = 0$, since it is impossible for ball $k$ to be placed both in box $i$ and in box $j$. On the other hand, if $k \ne \ell$, then the events corresponding to $Y_k^{(i)}$ and $Y_{\ell}^{(j)}$ are independent, since the placements of balls $k$ and $\ell$ are independent, which implies that $Y_k^{(i)}$ and $Y_{\ell}^{(j)}$ are independent random variables. Therefore, in this case,
$$\mathbb{E}(Y_k^{(i)} Y_{\ell}^{(j)}) = \mathbb{E}(Y_k^{(i)}) \, \mathbb{E}(Y_{\ell}^{(j)}) = \frac{1}{r} \cdot \frac{1}{r}.$$
In summary, if $i \ne j$, then
$$\mathbb{E}(X_i X_j) = \sum_{k=1}^n \sum_{\ell=1}^n \delta_{k \ne \ell} \cdot \frac{1}{r^2} = \frac{n(n-1)}{r^2},$$
where $\delta_{k \ne \ell}$ represents the indicator value which is $1$ when $k \ne \ell$ and $0$ when $k = \ell$.

For the case $i = j$, I will leave the similar computation of $\mathbb{E}(X_i^2)$ to you, with just the hint that the difference is in the value of $\mathbb{E}(Y_k^{(i)} Y_{\ell}^{(j)})$ for the case $k = \ell$.






– VHarisop (answered 2 days ago; edited 2 days ago by Daniel Schepler)









  • I decided to add my solution to part two, using your notation, to your answer to avoid having an answer split between your part and a part I would post separately. Feel free to edit it more to your liking, or even revert the addition if you prefer.
    – Daniel Schepler
    2 days ago










  • @DanielSchepler: Looks good, thank you!
    – VHarisop
    yesterday
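
To make the case analysis above concrete, and to cover the $i = j$ case left as an exercise, here is a minimal sketch (my own naming) that assembles $\mathbb{E}(X_iX_j)$ term by term from the pairwise expectations $\mathbb{E}(Y_k^{(i)} Y_{\ell}^{(j)})$:

    from fractions import Fraction

    def expected_product(n, r, same_box):
        """Sum E[Y_k^(i) Y_l^(j)] over all n*n pairs (k, l) of balls.

        same_box=False (i != j): k == l terms vanish, k != l terms give 1/r^2.
        same_box=True  (i == j): k == l terms give E[Y_k^2] = E[Y_k] = 1/r.
        """
        total = Fraction(0)
        for k in range(n):
            for l in range(n):
                if k == l:
                    # one ball cannot sit in two different boxes
                    total += Fraction(1, r) if same_box else Fraction(0)
                else:
                    # placements of distinct balls are independent
                    total += Fraction(1, r * r)
        return total

    n, r = 7, 4
    assert expected_product(n, r, same_box=False) == Fraction(n * (n - 1), r * r)
    assert expected_product(n, r, same_box=True) == Fraction(n, r) + Fraction(n * (n - 1), r * r)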



















Think of placing the ball in box "$i$" as a success and not placing it as a failure.

This situation can be represented using the Hypergeometric Distribution:
$$
P(X=k) = \frac{{K \choose k} {N-K \choose n-k}}{{N \choose n}}.
$$

$N$ is the population size (the number of boxes $r$).

$K$ is the number of success states in the population (just $1$, because success is defined as placing the ball in box "$i$").

$n$ is the number of draws (the number of balls $n$).

$k$ is the number of observed successes (the number of balls in box "$i$").

The expectation of the Hypergeometric Distribution is $n\frac{K}{N}$, hence the mean of your variable is
$$E[X_i]=n\cdot\frac{1}{r}=\frac{n}{r}.$$






– RScrlli (answered 2 days ago)
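
Taking the formula above at face value, here is a quick check (my code; it only makes sense when the number of draws does not exceed the population, i.e. $n \le r$) that the pmf with $K = 1$ success state indeed has mean $n\frac{K}{N} = \frac{n}{r}$:

    from math import comb
    from fractions import Fraction

    def hypergeom_mean(N, K, n):
        """Mean of the pmf P(X=k) = C(K,k) C(N-K,n-k) / C(N,n)."""
        return sum(
            Fraction(k * comb(K, k) * comb(N - K, n - k), comb(N, n))
            for k in range(0, min(K, n) + 1)
        )

    r, n = 10, 6                 # N = r boxes, K = 1 success state, n draws
    assert hypergeom_mean(N=r, K=1, n=n) == Fraction(n, r)   # n * K / N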













