dialog
Given a post–response pair (post, resp)
likelihood of the pair is p(post, resp)
p(post, resp) = p(post) × p(resp | post)
training: loss function = -log p(resp | post) [per-token cross-entropy against a categorical distribution over the vocabulary]
testing: with post given, output the resp that maximizes p(resp | post)
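The training objective above can be sketched in a few lines: because p(resp | post) factorizes over target tokens, -log p(resp | post) is a sum of per-token negative log-probabilities. The per-token probabilities below are hypothetical stand-ins for a decoder's output, not a real model.

```python
import math

def neg_log_likelihood(token_probs):
    """-log p(resp | post), given p(resp_t | post, resp_<t) per token.

    token_probs: probability the decoder assigns to each gold target
    token; the loss is the sum of per-token cross-entropy terms.
    """
    return -sum(math.log(p) for p in token_probs)

# Hypothetical per-token probabilities for a 3-token response.
loss = neg_log_likelihood([0.5, 0.25, 0.8])
# 0.5 * 0.25 * 0.8 = 0.1, so loss = -log(0.1) ≈ 2.3026
```

In a trained seq2seq model these probabilities come from a softmax over the vocabulary at each decoding step; the sum-of-logs form is why the loss is usually reported as token-level cross-entropy.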
dialog & vae
From the VAE, the likelihood of a single sample point is lower-bounded:
plain VAE (joint):  log p(post, resp) ≥ L = -KL( q(z|post,resp) ‖ p(z) ) + E_{z~q} [ log p(post, resp | z) ]
conditional VAE (dialog):  log p(resp | post) ≥ L = -KL( q(z|post,resp) ‖ p(z|post) ) + E_{z~q} [ log p(resp | z, post) ]
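The conditional bound above can be sketched numerically if we assume, as is standard in CVAE implementations, that both the recognition network q(z|post,resp) and the prior network p(z|post) output diagonal Gaussians, so the KL term has a closed form; the expectation is approximated by a single sample in practice. Everything below uses toy numbers, not a trained model.

```python
import math

def gaussian_kl(mu_q, var_q, mu_p, var_p):
    """KL( N(mu_q, var_q) || N(mu_p, var_p) ), diagonal covariance.

    Per dimension: 0.5 * ( log(var_p/var_q)
                           + (var_q + (mu_q - mu_p)^2) / var_p - 1 ).
    """
    return sum(
        0.5 * (math.log(vp / vq) + (vq + (mq - mp) ** 2) / vp - 1.0)
        for mq, vq, mp, vp in zip(mu_q, var_q, mu_p, var_p)
    )

def elbo(mu_q, var_q, mu_p, var_p, recon_log_prob):
    """L = -KL( q(z|post,resp) || p(z|post) ) + E_q[ log p(resp|z,post) ].

    recon_log_prob stands in for a one-sample Monte Carlo estimate of
    the reconstruction term.
    """
    return -gaussian_kl(mu_q, var_q, mu_p, var_p) + recon_log_prob

# When q matches the prior, KL = 0 and the bound is just the
# reconstruction term; any mismatch makes the KL strictly positive.
kl_zero = gaussian_kl([0.0], [1.0], [0.0], [1.0])  # 0.0
```

Training maximizes L (equivalently, minimizes -L), trading off staying close to the post-conditioned prior against reconstructing the response.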