DEEP PROBABILISTIC PROGRAMMING

Code & tools available at http://edwardlib.org.
Paper available at http://openreview.net/pdf?id=Hy6b4Pqee

We propose Edward, a new Turing-complete probabilistic programming language which builds on two compositional representations—random variables and inference.

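The core idea is that random variables are composable nodes: a random variable's parameters may themselves be random variables, so hierarchical models fall out of ordinary composition. As a rough illustrative sketch only (plain Python, not Edward's actual TensorFlow-backed API), the abstraction looks like this:

```python
import random

class RandomVariable:
    """Toy stand-in for Edward's random-variable abstraction (hypothetical,
    not the real API): a graph node holding a sampler, composable into
    larger generative programs."""
    def __init__(self, sample_fn):
        self.sample_fn = sample_fn

    def sample(self):
        return self.sample_fn()

def Normal(loc=0.0, scale=1.0):
    # loc and scale may themselves be RandomVariables,
    # which is what makes models compose hierarchically.
    def draw():
        l = loc.sample() if isinstance(loc, RandomVariable) else loc
        s = scale.sample() if isinstance(scale, RandomVariable) else scale
        return random.gauss(l, s)
    return RandomVariable(draw)

# A two-level model: the prior's draw feeds the likelihood's location.
mu = Normal(loc=0.0, scale=1.0)
x = Normal(loc=mu, scale=0.1)
print(x.sample())
```

In Edward proper, these nodes are TensorFlow tensors, so sampling and gradient computation run on the same computational graph as the rest of the program.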
  1. We show how to integrate our language into existing computational graph frameworks such as TensorFlow; this provides significant speedups over existing probabilistic systems.
  2. We also show how Edward makes it easy to fit the same model using a variety of composable inference methods, ranging from point estimation, to variational inference, to MCMC. By treating inference as a first-class citizen, on a par with modeling, we show that probabilistic programming can be as computationally efficient and flexible as traditional deep learning.
  3. For flexibility, we show how to reuse the modeling representation within inference to design rich variational models and generative adversarial networks.
  4. For efficiency, we show that our implementation of Hamiltonian Monte Carlo is 35x faster than hand-optimized software such as Stan.
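"Composable inference" in point 2 means the same model definition can be handed to different inference algorithms. As a minimal self-contained sketch of that idea (toy Gaussian model with hand-written inference, not Edward's actual `ed.MAP`/`ed.HMC` machinery), the same log joint is reused by a point estimator and by an MCMC sampler:

```python
import math
import random

random.seed(0)
# Synthetic data from a Normal(2.0, 0.5) likelihood.
data = [random.gauss(2.0, 0.5) for _ in range(200)]

def log_joint(mu):
    """Log joint of a toy model: mu ~ Normal(0, 10), x_i ~ Normal(mu, 0.5).
    This single function is the 'model' both inference methods share."""
    lp = -mu ** 2 / (2 * 10.0 ** 2)
    lp += sum(-(x - mu) ** 2 / (2 * 0.5 ** 2) for x in data)
    return lp

def map_estimate(steps=200, lr=1e-4, eps=1e-4):
    """Inference method 1: point estimation (MAP) via gradient ascent,
    using a finite-difference gradient for brevity."""
    mu = 0.0
    for _ in range(steps):
        g = (log_joint(mu + eps) - log_joint(mu - eps)) / (2 * eps)
        mu += lr * g
    return mu

def mcmc_mean(steps=2000, step_size=0.1):
    """Inference method 2: MCMC (random-walk Metropolis) on the same log joint;
    returns the posterior mean over the second half of the chain."""
    mu, samples = 0.0, []
    cur = log_joint(mu)
    for _ in range(steps):
        prop = mu + random.gauss(0, step_size)
        lp = log_joint(prop)
        if math.log(random.random()) < lp - cur:
            mu, cur = prop, lp
        samples.append(mu)
    half = samples[len(samples) // 2:]
    return sum(half) / len(half)

# Both methods recover mu near the true value of 2.0.
print(map_estimate(), mcmc_mean())
```

In Edward the analogous swap is a one-line change of inference class over an unchanged model, which is what makes mixing point estimation, variational inference, and MCMC practical.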