https://arxiv.org/abs/1801.09819v2 Transformation Autoregressive Networks

The fundamental task of general density estimation has been of keen interest to machine learning. Recent advances in density estimation have either: a) proposed a flexible model to estimate the conditional factors of the chain rule, p(x_i | x_{i-1}, ...); or b) used flexible, non-linear transformations of variables of a simple base distribution. Instead, this work jointly leverages transformations of variables and autoregressive conditional models, and proposes novel methods for both. We provide a deeper understanding of our methods, showing a considerable improvement through a comprehensive study over both real-world and synthetic data. Moreover, we illustrate the use of our models in outlier detection and image modeling tasks.
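
As a toy illustration of option (a) above, the chain-rule factorization log p(x) = Σ_i log p(x_i | x_{<i}) can be written out directly; the Gaussian conditionals and the linear conditioner in this sketch are illustrative assumptions, not the model the paper proposes.

<code python>
# Toy chain-rule density estimator: log p(x) = sum_i log p(x_i | x_{<i}).
# Gaussian conditionals and a linear conditioner are illustrative
# assumptions, not the paper's actual model.
import numpy as np

def log_gaussian(x, mu, log_sigma=0.0):
    return -0.5 * np.log(2 * np.pi) - log_sigma \
           - 0.5 * ((x - mu) / np.exp(log_sigma)) ** 2

def autoregressive_log_density(x, W, b):
    logp = 0.0
    for i in range(len(x)):
        mu = b[i] + W[i, :i] @ x[:i]  # conditioner sees only x_{<i}
        logp += log_gaussian(x[i], mu)
    return logp

rng = np.random.default_rng(0)
x = rng.normal(size=4)
print(autoregressive_log_density(x, W=0.1 * rng.normal(size=(4, 4)), b=np.zeros(4)))
</code>
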
https://arxiv.org/abs/1710.10304v4 Few-shot Autoregressive Density Estimation: Towards Learning to Learn Distributions

In this paper, we show how 1) neural attention and 2) meta learning techniques can be used in combination with autoregressive models to enable effective few-shot density estimation.

https://arxiv.org/abs/1804.00779 Neural Autoregressive Flows

Normalizing flows and autoregressive models have been successfully combined to produce state-of-the-art results in density estimation, via Masked Autoregressive Flows (MAF), and to accelerate state-of-the-art WaveNet-based speech synthesis to 20x faster than real-time, via Inverse Autoregressive Flows (IAF). We unify and generalize these approaches, replacing the (conditionally) affine univariate transformations of MAF/IAF with a more general class of invertible univariate transformations expressed as monotonic neural networks. We demonstrate that the proposed neural autoregressive flows (NAF) are universal approximators for continuous probability distributions, and their greater expressivity allows them to better capture multimodal target distributions. Experimentally, NAF yields state-of-the-art performance on a suite of density estimation tasks and outperforms IAF in variational autoencoders trained on binarized MNIST. https://github.com/CW-Huang/NAF
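
A minimal sketch of the monotonicity idea behind NAF's invertible univariate transformations: constraining the pre-activation scales and mixing weights to be positive makes the network strictly increasing in its input, hence invertible. Parameter shapes and names here are assumptions, not the paper's exact architecture.

<code python>
# Monotonic univariate network: positive scales (exp) and positive mixing
# weights (softmax) make y strictly increasing in x, hence invertible.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def monotonic_transform(x, a, b, w):
    mix = np.exp(w) / np.exp(w).sum()          # positive weights summing to 1
    return np.sum(mix * sigmoid(np.exp(a) * x + b))

a, b, w = np.zeros(8), np.linspace(-2.0, 2.0, 8), np.zeros(8)
ys = [monotonic_transform(x, a, b, w) for x in np.linspace(-3, 3, 7)]
print(np.all(np.diff(ys) > 0))                 # True: monotonic, so invertible
</code>
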
https://arxiv.org/abs/1806.05575 Autoregressive Quantile Networks for Generative Modeling

We introduce autoregressive implicit quantile networks (AIQN), a fundamentally different approach to generative modeling than those commonly used, that implicitly captures the distribution using quantile regression. AIQN is able to achieve superior perceptual quality and improvements in evaluation metrics, without incurring a loss of sample diversity.
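
The quantile-regression idea can be illustrated with the standard pinball loss, whose minimizer over a constant prediction is the tau-quantile of the targets; the names below are illustrative and the quantile network itself is omitted, so this is only a sketch of the underlying loss, not AIQN.

<code python>
# Pinball (quantile) loss: minimized when pred is the tau-quantile of target.
import numpy as np

def pinball_loss(pred, target, tau):
    diff = target - pred
    return np.mean(np.maximum(tau * diff, (tau - 1.0) * diff))

rng = np.random.default_rng(1)
samples = rng.normal(size=100_000)
true_q90 = np.quantile(samples, 0.9)
# The true 0.9-quantile attains the lowest loss among these candidates:
for cand in [0.0, true_q90, 3.0]:
    print(f"{cand:+.3f}  loss={pinball_loss(cand, samples, tau=0.9):.4f}")
</code>
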
https://arxiv.org/pdf/1802.06901.pdf Deterministic Non-Autoregressive Neural Sequence Modeling by Iterative Refinement

https://arxiv.org/abs/1811.00002v1 WaveGlow: A Flow-based Generative Network for Speech Synthesis

WaveGlow combines insights from Glow and WaveNet in order to provide fast, efficient and high-quality audio synthesis, without the need for auto-regression. WaveGlow is implemented using only a single network, trained using only a single cost function: maximizing the likelihood of the training data, which makes the training procedure simple and stable. Our PyTorch implementation produces audio samples at a rate of more than 500 kHz on an NVIDIA V100 GPU. Mean Opinion Scores show that it delivers audio quality as good as the best publicly available WaveNet implementation. All code will be made publicly available online. https://nv-adlr.github.io/WaveGlow
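
The single cost function mentioned above is the standard flow likelihood from the change-of-variables formula, log p(x) = log p(z) + log|det dz/dx| with z = f(x); the elementwise affine f in this sketch is a stand-in assumption, not WaveGlow's actual network.

<code python>
# Flow training objective in miniature: negative log-likelihood under
# log p(x) = log p(z) + log|det dz/dx|, with z = f(x).
import numpy as np

def flow_nll(x, log_scale, shift):
    z = np.exp(log_scale) * x + shift              # toy invertible transform
    log_pz = -0.5 * (np.log(2 * np.pi) + z ** 2)   # standard normal base
    log_det = log_scale                            # log|dz/dx| per dimension
    return -np.sum(log_pz + log_det)

x = np.array([0.3, -1.2, 0.7])
print(flow_nll(x, log_scale=np.zeros(3), shift=np.zeros(3)))
</code>
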
https://github.com/ikostrikov/pytorch-flows PyTorch implementations of Masked Autoregressive Flow and some other invertible transformations from Glow: Generative Flow with Invertible 1x1 Convolutions and Density estimation using Real NVP.
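
For reference, MAF density evaluation (which the repo above implements in PyTorch) takes a single pass, because each mu_i and alpha_i depends only on x_{<i}; the strictly lower-triangular linear conditioner below is an illustrative assumption standing in for the repo's masked networks.

<code python>
# MAF-style density evaluation: z_i = (x_i - mu_i) * exp(-alpha_i) with
# mu, alpha functions of x_{<i}, so log p(x) = log N(z; 0, I) - sum(alpha).
import numpy as np

def maf_log_density(x, W_mu, W_alpha):
    mask = np.tril(np.ones_like(W_mu), k=-1)   # strictly lower-triangular
    mu = (W_mu * mask) @ x
    alpha = (W_alpha * mask) @ x
    z = (x - mu) * np.exp(-alpha)
    log_pz = -0.5 * np.sum(np.log(2 * np.pi) + z ** 2)
    return log_pz - np.sum(alpha)

rng = np.random.default_rng(2)
x = rng.normal(size=5)
print(maf_log_density(x, 0.1 * rng.normal(size=(5, 5)), 0.1 * rng.normal(size=(5, 5))))
</code>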