



importance sampling from posterior distribution in R





























Today I read that importance sampling can be used to draw samples from a posterior distribution, just like rejection sampling. However, my understanding of importance sampling is that its main purpose is to compute expectations (i.e., integrals) that would otherwise be hard to compute. How can importance sampling be used to draw samples from the posterior distribution of a Bayesian model?



I only know the posterior density up to a constant, i.e., the normalisation constant is unknown.
























migrated from stackoverflow.com yesterday


This question came from our site for professional and enthusiast programmers.





























































      bayesian simulation monte-carlo posterior importance-sampling














asked yesterday by 陳寧寬 · edited 20 hours ago by Xi'an








2 Answers






























The boundary between estimating expectations and producing simulations is rather vague, in the sense that, once given an importance sampling sample
$$(x_1,\omega_1),\ldots,(x_T,\omega_T),\qquad \omega_t=f(x_t)/g(x_t),$$
estimating $\mathbb{E}_f[h(X)]$ by
$$\frac{1}{T}\sum_{t=1}^{T} \omega_t\,h(x_t)$$
and estimating the cdf $F(\cdot)$ by
$$\hat{F}(x)=\frac{1}{T}\sum_{t=1}^{T} \omega_t\,\mathbb{I}_{x_t\le x}$$
are of the same nature. Simulating from $\hat{F}$ is easily done by inversion, and this is also the concept at the basis of the weighted bootstrap.

In the case where the density $f$ is not properly normalised, as in most Bayesian settings, renormalising the $\omega_t$'s by their sum also gives a converging approximation.

(illustration omitted)

The field of particle filters and sequential Monte Carlo (SMC) takes advantage of this principle to handle sequential targets and state-space models, as in the illustrations above and below.

(illustration omitted)

answered 23 hours ago, edited 20 hours ago · Xi'an
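The weight-then-resample recipe above can be sketched numerically. Below is a minimal Python sketch (the question asks about R, but the same steps apply); the unnormalised target $f(x)\propto e^{-x^4}$, the standard normal proposal $g$, and all variable names are illustrative assumptions, not part of the original answer.

```python
import numpy as np

rng = np.random.default_rng(0)

# Unnormalised toy target: f(x) proportional to exp(-x^4) (assumed example).
def log_target(x):
    return -x**4

# Proposal g: standard normal.
T = 100_000
x = rng.standard_normal(T)
log_g = -0.5 * x**2 - 0.5 * np.log(2 * np.pi)

# Unnormalised log-weights omega_t = f(x_t)/g(x_t); self-normalise so they
# sum to one (this is the renormalisation step for an unnormalised f).
log_w = log_target(x) - log_g
w = np.exp(log_w - log_w.max())
w /= w.sum()

# Self-normalised IS estimate of E_f[h(X)] with h(x) = x^2.
est = np.sum(w * x**2)

# Resampling according to the weights draws from the weighted empirical
# cdf F-hat, giving approximate samples from the target.
draws = rng.choice(x, size=10_000, replace=True, p=w)
```

After resampling, `draws` can be treated (approximately) as unweighted samples from the target, at the cost of some duplication among the points.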


















Using importance sampling (IS), you can compute expectations under the posterior; for that purpose it is enough to take the prior as the importance distribution. To generate random samples from the posterior, you should resample from the sample generated from the prior; that is, you should use SIR (sampling importance resampling).

SIR is an extension of IS that lets you draw samples from a posterior distribution.

SIR

To generate $m$ samples from the posterior distribution
$$\pi(\theta\mid x) \propto f(x\mid\theta)\,\pi(\theta),$$
follow these steps:

Step 1: draw $k$ i.i.d. samples from $\pi(\theta)$, with $k>m$:
$$\{\theta_i\}_{i=1}^{k} \overset{\text{i.i.d.}}{\sim} \pi(\theta)$$

Step 2: assign each draw a weight proportional to its likelihood:
$$\begin{array}{c|cccc}
\theta & \theta_1 & \theta_2 & \cdots & \theta_k \\ \hline
w & w_1=\dfrac{f(x\mid\theta_1)}{\sum_{i=1}^{k} f(x\mid\theta_i)} & w_2=\dfrac{f(x\mid\theta_2)}{\sum_{i=1}^{k} f(x\mid\theta_i)} & \cdots & w_k=\dfrac{f(x\mid\theta_k)}{\sum_{i=1}^{k} f(x\mid\theta_i)}
\end{array}$$

Step 3: sample $\{\theta_i\}_{i=1}^{m}$ (without replacement) from $\{\theta_i\}_{i=1}^{k}$ with probabilities $\{w_i\}_{i=1}^{k}$; this yields a random sample from the posterior.

answered 18 hours ago · masoud
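The three SIR steps above can be sketched in Python. The conjugate normal model here is an assumed toy example (chosen because its exact posterior is known, so the output can be sanity-checked); every name in the snippet is illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed toy model: prior theta ~ N(0, 1), likelihood x_j | theta ~ N(theta, 1).
# The exact posterior is then N(n*xbar/(n+1), 1/(n+1)).
data = rng.normal(1.5, 1.0, size=20)

def log_lik(theta):
    # log f(x | theta), summed over the observations, for each theta
    return -0.5 * np.sum((data[None, :] - theta[:, None]) ** 2, axis=1)

k, m = 200_000, 5_000          # k prior draws, m << k posterior draws

# Step 1: draw k i.i.d. samples from the prior pi(theta).
theta = rng.standard_normal(k)

# Step 2: weights w_i = f(x|theta_i) / sum_j f(x|theta_j),
# computed on the log scale for numerical stability.
lw = log_lik(theta)
w = np.exp(lw - lw.max())
w /= w.sum()

# Step 3: resample m values without replacement with probabilities w_i.
post = rng.choice(theta, size=m, replace=False, p=w)
```

With the seed fixed, `post.mean()` should sit close to the exact posterior mean $n\bar{x}/(n+1)$, which is the point of choosing a conjugate toy model.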




















