What precisely does it mean to borrow information?











I often hear people talk about information borrowing or information sharing in Bayesian hierarchical models. I can't seem to get a straight answer about what this actually means and whether it is unique to Bayesian hierarchical models. I sort of get the idea: some levels in your hierarchy share a common parameter. I have no idea how this translates to "information borrowing", though.




  1. Is "information borrowing"/"information sharing" a buzzword people like to throw out?


  2. Is there an example with closed-form posteriors that illustrates this sharing phenomenon?


  3. Is this unique to a Bayesian analysis? Generally, when I see examples of "information borrowing" they are just mixed models. Maybe I learned these models in an old-fashioned way, but I don't see any sharing.



I am not interested in starting a philosophical debate about methods. I am just curious about the use of this term.










machine-learning bayesian multilevel-analysis terminology hierarchical-bayesian

asked 4 hours ago by EliK

  • For your question 2., you may find this link illuminating: tjmahr.com/plotting-partial-pooling-in-mixed-effects-models.
    – Isabella Ghement
    2 hours ago















2 Answers






Consider a simple problem like estimating means of multiple groups. If your model treats them as completely unrelated then the only information you have about each mean is the information within that group. If your model treats their means as somewhat related (such as in some mixed-effects type model) then the estimates will be more precise because information from other groups informs the estimate for a given group. That's an example of 'borrowing information'.
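For question 2., the conjugate normal-normal case gives this in closed form. Below is a minimal illustrative sketch in Python, assuming (purely for illustration) that the population mean mu, between-group sd tau, and within-group sd sigma are known; in practice they would be estimated. The posterior mean for each group is a precision-weighted average of that group's own sample mean and the population mean, so groups with little data get pulled harder toward the ensemble:

    import numpy as np

    rng = np.random.default_rng(42)

    # Eight groups of unequal size; true group means are drawn from
    # N(mu, tau^2) and observations within a group from N(theta_j, sigma^2).
    # mu, tau, and sigma are treated as known purely for illustration.
    J, mu, tau, sigma = 8, 10.0, 1.5, 4.0
    n = rng.integers(3, 30, size=J)
    theta = rng.normal(mu, tau, size=J)
    ybar = np.array([rng.normal(t, sigma, size=m).mean()
                     for t, m in zip(theta, n)])

    # No pooling: each group mean is estimated from its own data alone.
    no_pool = ybar

    # Partial pooling (normal-normal conjugacy, closed form): the posterior
    # mean of theta_j is a precision-weighted average of the group's own
    # sample mean and the population mean mu.
    prec_data = n / sigma**2            # precision of ybar_j
    prec_prior = 1.0 / tau**2           # precision of the N(mu, tau^2) prior
    partial = (prec_data * ybar + prec_prior * mu) / (prec_data + prec_prior)

    for j in range(J):
        print(f"group {j}: n={n[j]:2d}  no-pool={no_pool[j]:6.2f}  "
              f"partial-pool={partial[j]:6.2f}  true={theta[j]:6.2f}")

The pull toward mu is exactly the "borrowed" information: a small group's estimate leans on what the other groups collectively say about where group means tend to sit.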






answered 3 hours ago by Glen_b
This term comes specifically from empirical Bayes (EB); in fact, the concept it refers to does not exist in true Bayesian inference. The original term was "borrowing strength", coined by John Tukey back in the 1960s and popularized further by Bradley Efron and Carl Morris in a series of statistical articles on parametric EB in the 1970s and 1980s. Many people now use "information borrowing" or "information sharing" as synonyms for the same concept. The reason you may hear it in the context of mixed models is that mixed models have an EB interpretation.



EB has many applications and applies to many statistical models, but the context is always that you have a large number of (possibly independent) cases and you are trying to estimate a particular parameter (such as the mean or variance) in each case. In Bayesian inference, you make posterior inferences about the parameter based on both the observed data for each case and the prior distribution for that parameter. In EB inference, the prior distribution for the parameter is estimated from the whole collection of data cases, after which inference proceeds as for Bayesian inference. Hence, when you estimate the parameter for a particular case, you use both the data for that case and the estimated prior distribution, and the latter represents the "information" or "strength" that you borrow from the whole ensemble of cases when making inference about one particular case.
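As a minimal sketch of parametric EB in the classic normal-means setting: one noisy estimate per case, a N(mu, tau2) prior whose parameters are estimated from the whole ensemble by the method of moments (the resulting plug-in estimator is closely related to the James-Stein estimator):

    import numpy as np

    rng = np.random.default_rng(0)

    # Normal-means setting: one noisy observation x_j ~ N(theta_j, s2) per
    # case, with theta_j ~ N(mu, tau2). In parametric EB, mu and tau2 are
    # estimated from the whole ensemble instead of being fixed in advance.
    J, s2 = 50, 1.0
    theta = rng.normal(5.0, 2.0, size=J)       # true case-level parameters
    x = rng.normal(theta, np.sqrt(s2))         # one observation per case

    # Method-of-moments estimate of the prior: marginally,
    # x_j ~ N(mu, tau2 + s2), so the sample variance of x estimates tau2 + s2.
    mu_hat = x.mean()
    tau2_hat = max(x.var(ddof=1) - s2, 0.0)

    # Plug-in posterior mean: shrink each x_j toward the estimated prior mean.
    w = s2 / (s2 + tau2_hat)                   # weight on the ensemble mean
    eb = w * mu_hat + (1.0 - w) * x

    print(f"MSE of raw estimates: {np.mean((x - theta) ** 2):.3f}")
    print(f"MSE of EB estimates:  {np.mean((eb - theta) ** 2):.3f}")

The shrinkage weight w is computed entirely from quantities estimated on the whole collection of cases, which is precisely the "strength" being borrowed when inferring any one theta_j.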






answered 6 mins ago by Gordon Smyth