====== Likelihood Free Inference ======

We want estimates of θ that are robust to uncertainty in nuisance parameters.

https://arxiv.org/abs/1506.02169 Approximating Likelihood Ratios with Calibrated Discriminative Classifiers

In many fields of science, generalized likelihood ratio tests are established tools for statistical inference. At the same time, it has become increasingly common that a simulator (or generative model) is used to describe complex processes that tie parameters θ of an underlying theory and measurement apparatus to high-dimensional observations x∈ℝp. However, simulators often do not provide a way to evaluate the likelihood function for a given observation x, which motivates a new class of likelihood-free inference algorithms. In this paper, we show that likelihood ratios are invariant under a specific class of dimensionality reduction maps ℝp↦ℝ. As a direct consequence, we show that discriminative classifiers can be used to approximate the generalized likelihood ratio statistic when only a generative model for the data is available. This leads to a new machine learning-based approach to likelihood-free inference that is complementary to Approximate Bayesian Computation, and which does not require a prior on the model parameters. Experimental results on artificial problems with known exact likelihoods illustrate the potential of the proposed method.

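The core trick is simple: train a probabilistic classifier s(x) to tell apart samples simulated at θ₀ from samples simulated at θ₁; the likelihood ratio is then recovered as r(x) ≈ (1 − s(x)) / s(x). Below is a minimal sketch of this idea in plain scikit-learn rather than the carl package linked below; the one-dimensional Gaussian simulator is a made-up toy, chosen so the learned ratio can be checked against the exact one.

<code python>
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy simulator (hypothetical): draws x ~ N(theta, 1). In a real problem x
# would be high-dimensional and the likelihood intractable; here the
# likelihood is known, so the classifier-based ratio can be sanity-checked.
def simulate(theta, n, rng):
    return rng.normal(theta, 1.0, size=(n, 1))

rng = np.random.default_rng(0)
theta0, theta1 = 0.0, 1.0
x0 = simulate(theta0, 50_000, rng)   # samples from p(x | theta0), label 0
x1 = simulate(theta1, 50_000, rng)   # samples from p(x | theta1), label 1

X = np.vstack([x0, x1])
y = np.concatenate([np.zeros(len(x0)), np.ones(len(x1))])

# Train a probabilistic classifier to distinguish the two simulator settings.
clf = LogisticRegression().fit(X, y)

def ratio_hat(x):
    """Approximate r(x) = p(x|theta0) / p(x|theta1) from s(x) = P(label=1 | x)
    via r(x) ~ (1 - s(x)) / s(x), valid with equal-sized simulated samples."""
    s = clf.predict_proba(x)[:, 1]
    return (1 - s) / s

x_test = np.array([[0.5]])
exact = np.exp(-0.5 * (x_test - theta0) ** 2 + 0.5 * (x_test - theta1) ** 2)
print(ratio_hat(x_test), exact.ravel())  # the two should roughly agree
</code>

Raw classifier outputs can be systematically biased, which is why the paper (and carl) add a calibration step on top of the classifier; the sketch skips that.
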
https://github.com/diana-hep/carl

https://figshare.com/articles/NIPS_2016_Keynote_Machine_Learning_Likelihood_Free_Inference_in_Particle_Physics/4291565/1 NIPS 2016 Keynote: Machine Learning & Likelihood Free Inference in Particle Physics

https://arxiv.org/pdf/1605.07826.pdf Asymptotically exact inference in likelihood-free models

Many generative models can be expressed as a differentiable function of random inputs drawn from some simple probability density. This framework includes both deep generative architectures such as Variational Autoencoders and a large class of ‘likelihood-free’ simulator models. We present a method for performing efficient MCMC inference in such models when conditioning on observations of the model output. For some models this offers an asymptotically exact inference method where Approximate Bayesian Computation might otherwise be employed. We use the intuition that inference corresponds to integrating a density across the manifold corresponding to the set of inputs consistent with the observed outputs. This motivates the use of a constrained variant of Hamiltonian Monte Carlo which leverages the smooth geometry of the manifold to coherently move between inputs exactly consistent with observations. We validate the method by performing inference tasks in a diverse set of models.
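
To make the manifold picture concrete: if the simulator is written as a deterministic function g(u) of its random inputs u, then conditioning on an output y restricts u to the set {u : g(u) = y}, and inference means sampling the input density restricted to that surface. The sketch below is a made-up toy (all names hypothetical) that uses a random-walk move in the tangent space followed by a Newton projection back onto the constraint, not the paper's constrained HMC; for this linear constraint the proposal is symmetric, so plain Metropolis is valid, while curved constraints need the full machinery of the paper.

<code python>
import numpy as np

# Toy differentiable simulator: g(u) = u[0] + u[1], inputs u ~ N(0, I).
# Conditioning on y_obs restricts u to the manifold {u : g(u) = y_obs},
# here a straight line in R^2.

def g(u):
    return u[0] + u[1]

def grad_g(u):
    return np.array([1.0, 1.0])  # constant Jacobian for this linear toy

def log_prior(u):
    return -0.5 * np.dot(u, u)   # standard normal, up to a constant

def project(u, y_obs, n_steps=20, tol=1e-10):
    """Newton projection back onto {u : g(u) = y_obs} along grad g."""
    for _ in range(n_steps):
        r = g(u) - y_obs
        if abs(r) < tol:
            break
        J = grad_g(u)
        u = u - J * (r / np.dot(J, J))
    return u

def manifold_mcmc(y_obs, n_samples=5000, step=0.5, seed=0):
    rng = np.random.default_rng(seed)
    u = project(np.zeros(2), y_obs)  # start on the constraint surface
    samples = []
    for _ in range(n_samples):
        # Propose in the tangent space of the manifold at u (remove the
        # component along grad g), then project back onto the constraint.
        J = grad_g(u)
        xi = rng.normal(size=2) * step
        tangent = xi - J * (np.dot(J, xi) / np.dot(J, J))
        u_prop = project(u + tangent, y_obs)
        # Metropolis accept/reject under the input density on the manifold.
        if np.log(rng.uniform()) < log_prior(u_prop) - log_prior(u):
            u = u_prop
        samples.append(u.copy())
    return np.array(samples)

samples = manifold_mcmc(y_obs=1.0)
print(samples.mean(axis=0))  # both coordinates concentrate near y_obs / 2
</code>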