Motivating Example 1: Generic Bayesian Model

Let X be a vector parameter of interest with an associated prior µ; i.e. X ∼ µ(·).

We observe a realization y of Y, which is assumed to satisfy Y | (X = x) ∼ g(· | x); i.e. the likelihood function is g(y | x).

Bayesian inference on X relies on the posterior of X given Y = y:

$$p(x \mid y) = \frac{\mu(x)\, g(y \mid x)}{p(y)},$$

where the marginal likelihood/evidence satisfies

$$p(y) = \int \mu(x)\, g(y \mid x)\, dx.$$

"Machine learning" examples: Latent Dirichlet Allocation, (Hierarchical) Dirichlet processes...
A . Doucet (MLSS Sept. 2012) Sept. 2012 2 / 136
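As a concrete illustration of the two formulas above, the posterior p(x | y) and the evidence p(y) can be approximated on a grid. This is a minimal sketch, not from the slides: the Gaussian prior µ = N(0, 1), the Gaussian likelihood g(y | x) = N(y; x, 0.5²), the observed value y = 1, and the grid bounds are all assumptions, chosen so the grid answer can be checked against the conjugate closed form.

```python
import numpy as np

# Hypothetical conjugate choice: prior mu = N(0, 1),
# likelihood g(y | x) = N(y; x, 0.5^2).
def prior(x):
    return np.exp(-0.5 * x**2) / np.sqrt(2.0 * np.pi)

def likelihood(y, x):
    s = 0.5  # assumed observation noise standard deviation
    return np.exp(-0.5 * ((y - x) / s) ** 2) / (s * np.sqrt(2.0 * np.pi))

y_obs = 1.0                        # a single observed realization y
xs = np.linspace(-6.0, 6.0, 4001)  # grid over the parameter space
dx = xs[1] - xs[0]

unnorm = prior(xs) * likelihood(y_obs, xs)   # mu(x) g(y | x)
evidence = float(np.sum(unnorm) * dx)        # p(y) = ∫ mu(x) g(y | x) dx
posterior = unnorm / evidence                # p(x | y)

# Conjugacy gives the posterior N(0.8, 0.2) for this choice, so the
# grid-based posterior mean can be verified against the closed form.
post_mean = float(np.sum(xs * posterior) * dx)
```

Such grid quadrature only works in very low dimension; for the realistic models the tutorial targets, p(y) and the posterior are intractable, which is precisely what motivates Monte Carlo methods.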