Normalizing flows and autoregressive models have been successfully combined to produce state-of-the-art results in density estimation, via Masked Autoregressive Flows (MAF), and to accelerate state-of-the-art WaveNet-based speech synthesis to 20x faster than real-time, via Inverse Autoregressive Flows (IAF). We unify and generalize these approaches, replacing the (conditionally) affine univariate transformations of MAF/IAF with a more general class of invertible univariate transformations expressed as monotonic neural networks. We demonstrate that the proposed neural autoregressive flows (NAF) are universal approximators for continuous probability distributions, and their greater expressivity allows them to better capture multimodal target distributions. Experimentally, NAF yields state-of-the-art performance on a suite of density estimation tasks and outperforms IAF in variational autoencoders trained on binarized MNIST.
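
As a rough illustration of the monotonic univariate transformations the abstract refers to, below is a minimal PyTorch sketch in the spirit of a deep sigmoidal flow: positive slopes and softmax-normalized mixture weights keep the map strictly increasing, hence invertible. The module name, the number of units, and the use of free parameters in place of NAF's autoregressive conditioner are illustrative assumptions, not the paper's exact architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DeepSigmoidalFlow(nn.Module):
    """One monotonic scalar transformation, sketched in the style of NAF:
    y = logit(w^T sigmoid(a * x + b)) with a > 0 and w a softmax, so
    dy/dx > 0 everywhere and the map is invertible."""
    def __init__(self, n_units=8):  # n_units is an assumed hyperparameter
        super().__init__()
        # In NAF these would be emitted by an autoregressive conditioner
        # network; here they are free parameters, for a single scalar input.
        self.pre_a = nn.Parameter(torch.zeros(n_units))
        self.b = nn.Parameter(torch.zeros(n_units))
        self.pre_w = nn.Parameter(torch.zeros(n_units))

    def forward(self, x):
        a = F.softplus(self.pre_a)             # positive slopes
        w = torch.softmax(self.pre_w, dim=0)   # convex combination weights
        h = torch.sigmoid(a * x.unsqueeze(-1) + self.b)   # (..., n_units)
        s = (w * h).sum(-1).clamp(1e-6, 1 - 1e-6)
        return torch.logit(s)  # inverse sigmoid keeps the output unbounded

flow = DeepSigmoidalFlow()
y = flow(torch.randn(4))  # strictly increasing in each input entry
```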
  
https://arxiv.org/abs/1806.05575 Autoregressive Quantile Networks for Generative Modeling
  
We introduce autoregressive implicit quantile networks (AIQN), a fundamentally different approach to generative modeling than those commonly used, that implicitly captures the distribution using quantile regression. AIQN is able to achieve superior perceptual quality and improvements in evaluation metrics, without incurring a loss of sample diversity.
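
The quantile regression the AIQN abstract relies on reduces to the pinball loss, whose minimizer is the tau-quantile of the target; sampling tau uniformly per example is what makes the quantile network implicit. A minimal sketch under those assumptions follows (AIQN itself uses a Huber-smoothed variant; the function name and shapes here are hypothetical):

```python
import torch

def pinball_loss(pred, target, tau):
    # (tau - 1{u < 0}) * u penalizes under- and over-estimation
    # asymmetrically, so the minimizer is the tau-quantile of target.
    u = target - pred
    return torch.mean((tau - (u < 0).float()) * u)

# Sampling tau ~ U(0, 1) per example trains one network to represent
# the entire conditional distribution implicitly.
pred = torch.randn(32, requires_grad=True)
target = torch.randn(32)
tau = torch.rand(32)
loss = pinball_loss(pred, target, tau)
loss.backward()
```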
  
https://arxiv.org/pdf/1802.06901.pdf Deterministic Non-Autoregressive Neural Sequence Modeling by Iterative Refinement
  
https://github.com/ikostrikov/pytorch-flows PyTorch implementations of Masked Autoregressive Flow and some other invertible transformations from Glow: Generative Flow with Invertible 1x1 Convolutions and Density estimation using Real NVP.
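
For orientation, here is a self-contained sketch of the masked affine autoregressive step that MAF builds on. This is not the API of the pytorch-flows repo, and the single strictly-lower-triangular mask stands in for the degree-based MADE masks a deeper conditioner would use.

```python
import torch
import torch.nn as nn

class MaskedLinear(nn.Linear):
    """Linear layer whose weight is multiplied by a fixed binary mask,
    enforcing the autoregressive ordering used by MAF/MADE."""
    def __init__(self, in_features, out_features, mask):
        super().__init__(in_features, out_features)
        self.register_buffer("mask", mask)

    def forward(self, x):
        return nn.functional.linear(x, self.weight * self.mask, self.bias)

# Strictly lower-triangular mask: output i may only see inputs j < i.
d = 3
mask = torch.tril(torch.ones(d, d), diagonal=-1)
layer = MaskedLinear(d, d, mask)

x = torch.randn(4, d)
mu = layer(x)                        # autoregressive shift; a second
log_sigma = torch.zeros_like(mu)     # masked head would give the log-scale
u = (x - mu) * torch.exp(-log_sigma)   # one MAF step: data x -> base noise u
log_det = -log_sigma.sum(-1)           # log-det term for density estimation
```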