through a comprehensive study over both real world and synthetic data. Also, we illustrate the utility of our models in outlier detection and digit modeling tasks.

https://arxiv.org/abs/1710.10304v4 Few-shot Autoregressive Density Estimation: Towards Learning to Learn Distributions

In this paper, we show how 1) neural attention and 2) meta learning techniques can be used in combination with autoregressive models to enable effective few-shot density estimation.

https://arxiv.org/abs/1804.00779 Neural Autoregressive Flows

Normalizing flows and autoregressive models have been successfully combined to produce state-of-the-art results in density estimation, via Masked Autoregressive Flows (MAF), and to accelerate state-of-the-art WaveNet-based speech synthesis to 20x faster than real-time, via Inverse Autoregressive Flows (IAF). We unify and generalize these approaches, replacing the (conditionally) affine univariate transformations of MAF/IAF with a more general class of invertible univariate transformations expressed as monotonic neural networks. We demonstrate that the proposed neural autoregressive flows (NAF) are universal approximators for continuous probability distributions, and their greater expressivity allows them to better capture multimodal target distributions. Experimentally, NAF yields state-of-the-art performance on a suite of density estimation tasks and outperforms IAF in variational autoencoders trained on binarized MNIST.
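
To make the "monotonic neural network" idea concrete, here is a minimal sketch of a sigmoidal monotonic univariate transform in PyTorch. The function name, the unit count K, and the way a, b, w are passed in are illustrative assumptions; in NAF these parameters come from an autoregressive conditioner network, and this is not the paper's reference implementation.

<code python>
import torch
import torch.nn.functional as F

def sigmoidal_flow(x, a, b, w, eps=1e-6):
    # Monotonic map t(x) = logit(sum_k w_k * sigmoid(a_k * x + b_k)).
    # Monotonicity (hence invertibility) holds because a_k > 0 and
    # w is a convex combination, so t can serve as a flow transform.
    a = F.softplus(a)             # enforce positive slopes
    w = torch.softmax(w, dim=-1)  # enforce weights on the simplex
    s = (w * torch.sigmoid(a * x.unsqueeze(-1) + b)).sum(-1)
    s = s.clamp(eps, 1 - eps)     # keep the logit finite
    return torch.log(s) - torch.log1p(-s)

# Example: a batch of 5 scalars, each transformed by K=8 sigmoid units
# whose parameters would normally be predicted from x_{<i}.
x = torch.randn(5)
a, b, w = (torch.randn(5, 8) for _ in range(3))
y = sigmoidal_flow(x, a, b, w)
</code>

Because every sigmoid unit is increasing and the mixture weights are positive, the composite map is strictly increasing; stacking such units is what gives NAF more expressivity than the affine transforms of MAF/IAF.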

https://arxiv.org/abs/1806.05575 Autoregressive Quantile Networks for Generative Modeling

We introduce autoregressive implicit quantile networks (AIQN), a fundamentally different approach to generative modeling from those commonly used, which implicitly captures the distribution using quantile regression. AIQN achieves superior perceptual quality and improvements in evaluation metrics, without incurring a loss of sample diversity.
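
The quantile-regression idea can be illustrated with the pinball loss, whose minimizer is the tau-quantile of the target distribution. This is a hedged sketch of that loss only, not AIQN's training code; the name `quantile_loss` and the tensor shapes are assumptions, and the network that maps (input, tau) to a predicted quantile is not shown.

<code python>
import torch

def quantile_loss(pred, target, tau):
    # Pinball loss: under-predictions are weighted by tau and
    # over-predictions by (1 - tau), so the expected loss is
    # minimized when pred equals the tau-quantile of target.
    err = target - pred
    return torch.mean(torch.maximum(tau * err, (tau - 1.0) * err))

# Example: taus sampled uniformly per example, as in implicit
# quantile training, so the model learns the whole quantile function.
pred = torch.randn(32)
target = torch.randn(32)
tau = torch.rand(32)
loss = quantile_loss(pred, target, tau)
</code>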

https://arxiv.org/pdf/1802.06901.pdf Deterministic Non-Autoregressive Neural Sequence Modeling by Iterative Refinement

https://github.com/ikostrikov/pytorch-flows PyTorch implementations of Masked Autoregressive Flow and other invertible transformations from Glow: Generative Flow with Invertible 1x1 Convolutions and Density estimation using Real NVP.
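
As a rough illustration of what such a library implements (this is a generic sketch, not the pytorch-flows API), the affine autoregressive step of MAF can be written with MADE-style masked linear layers; `MaskedLinear`, `mu_net`, and `alpha_net` are hypothetical names, and real implementations stack several masked layers rather than using a single linear map.

<code python>
import torch
import torch.nn as nn
import torch.nn.functional as F

class MaskedLinear(nn.Linear):
    # Linear layer whose weight is elementwise-masked so that
    # output i can only depend on inputs j < i (autoregressive order).
    def __init__(self, in_features, out_features, mask):
        super().__init__(in_features, out_features)
        self.register_buffer("mask", mask)

    def forward(self, x):
        return F.linear(x, self.weight * self.mask, self.bias)

D = 4
# Strictly lower-triangular mask: row i sees only columns j < i.
mask = torch.tril(torch.ones(D, D), diagonal=-1)
mu_net = MaskedLinear(D, D, mask)     # predicts per-dimension shift
alpha_net = MaskedLinear(D, D, mask)  # predicts per-dimension log-scale

x = torch.randn(8, D)
# Density-evaluation direction of MAF: one parallel pass recovers the
# base noise u_i = (x_i - mu_i(x_{<i})) * exp(-alpha_i(x_{<i})).
u = (x - mu_net(x)) * torch.exp(-alpha_net(x))
log_det = -alpha_net(x).sum(-1)  # log|det dU/dX| of the affine map
</code>

The masking is what makes density evaluation a single parallel pass, while sampling must proceed one dimension at a time; IAF makes the opposite trade-off.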