https://arxiv.org/abs/1804.00779 Neural Autoregressive Flows
  
Normalizing flows and autoregressive models have been successfully combined to produce state-of-the-art results in density estimation, via Masked Autoregressive Flows (MAF), and to accelerate state-of-the-art WaveNet-based speech synthesis to 20x faster than real-time, via Inverse Autoregressive Flows (IAF). We unify and generalize these approaches, replacing the (conditionally) affine univariate transformations of MAF/IAF with a more general class of invertible univariate transformations expressed as monotonic neural networks. We demonstrate that the proposed neural autoregressive flows (NAF) are universal approximators for continuous probability distributions, and their greater expressivity allows them to better capture multimodal target distributions. Experimentally, NAF yields state-of-the-art performance on a suite of density estimation tasks and outperforms IAF in variational autoencoders trained on binarized MNIST. https://github.com/CW-Huang/NAF
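
To illustrate the "monotonic neural network as univariate transformer" idea, here is a minimal PyTorch sketch of a deep-sigmoidal-flow style transform. The function name, tensor shapes, and the way the pseudo-parameters are supplied are assumptions for illustration; in NAF these parameters would come from an autoregressive conditioner over x_{<i}, which is not shown, and this is not the authors' released code.

<code python>
import torch
import torch.nn.functional as F

def dsf_transform(x, a, b, w, eps=1e-6):
    """Monotonic univariate transform in the deep-sigmoidal-flow style (sketch).

    x:       (batch,) scalar inputs x_i.
    a, b, w: (batch, K) pseudo-parameters; in NAF they would be emitted by an
             autoregressive conditioner over x_{<i} (assumed given here).
    Monotonicity holds because the slopes are positive and the weights lie on
    the probability simplex.
    """
    a = F.softplus(a)                  # positive slopes
    w = torch.softmax(w, dim=-1)       # convex combination weights
    # Convex combination of sigmoids in (0, 1), strictly increasing in x.
    s = (w * torch.sigmoid(a * x.unsqueeze(-1) + b)).sum(-1)
    s = s.clamp(eps, 1 - eps)
    # Map back to the real line with a logit (inverse sigmoid).
    return torch.log(s) - torch.log1p(-s)

# Example: 5 scalar inputs, K = 4 sigmoid units each.
x = torch.randn(5)
a, b, w = torch.randn(5, 4), torch.randn(5, 4), torch.randn(5, 4)
y = dsf_transform(x, a, b, w)
</code>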

https://arxiv.org/abs/1806.05575 Autoregressive Quantile Networks for Generative Modeling

We introduce autoregressive implicit quantile networks (AIQN), a fundamentally different approach to generative modeling from those commonly used, which implicitly captures the distribution using quantile regression. AIQN achieves superior perceptual quality and improvements in evaluation metrics without incurring a loss of sample diversity.
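
For intuition about the quantile-regression objective mentioned above, below is a generic pinball-loss sketch in PyTorch. The function name and tensor shapes are assumptions, and this is not the paper's AIQN architecture or its exact training loss; it only shows the basic quantile-regression idea.

<code python>
import torch

def quantile_loss(pred, target, tau):
    """Pinball (quantile regression) loss.

    pred:   (batch,) predicted values at quantile levels tau.
    target: (batch,) observed values.
    tau:    (batch,) quantile levels sampled uniformly from (0, 1).
    The minimizer of this loss is the tau-quantile of the target distribution.
    """
    err = target - pred
    return torch.mean(torch.maximum(tau * err, (tau - 1.0) * err))

# Example: random quantile levels per sample.
target = torch.randn(8)
tau = torch.rand(8)
pred = torch.zeros(8, requires_grad=True)
loss = quantile_loss(pred, target, tau)
loss.backward()
</code>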

https://arxiv.org/pdf/1802.06901.pdf Deterministic Non-Autoregressive Neural Sequence Modeling by Iterative Refinement

https://arxiv.org/abs/1811.00002v1 WaveGlow: A Flow-based Generative Network for Speech Synthesis

WaveGlow combines insights from Glow and WaveNet to provide fast, efficient, and high-quality audio synthesis without the need for autoregression. WaveGlow is implemented as a single network, trained with a single cost function: maximizing the likelihood of the training data, which makes the training procedure simple and stable. Our PyTorch implementation produces audio samples at a rate of more than 500 kHz on an NVIDIA V100 GPU. Mean Opinion Scores show that it delivers audio quality as good as the best publicly available WaveNet implementation. All code will be made publicly available online. https://nv-adlr.github.io/WaveGlow
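
The "single cost function" in the abstract is the negative log-likelihood of a flow with a spherical Gaussian prior plus the log-determinant terms contributed by the invertible layers. The sketch below illustrates that objective under assumed inputs (a latent z and an already-accumulated log-determinant); it is not the released WaveGlow implementation.

<code python>
import math
import torch

def flow_nll(z, log_det_sum, sigma=1.0):
    """Negative log-likelihood objective for a normalizing flow (sketch).

    z:           latent obtained by pushing audio x through the invertible network.
    log_det_sum: summed log|det Jacobian| from the coupling and invertible
                 1x1 convolution layers (assumed given here).
    Minimizing this corresponds to maximizing the likelihood of the data.
    """
    n = z.numel()
    log_prior = -0.5 * (z ** 2).sum() / (sigma ** 2) \
                - n * math.log(sigma * math.sqrt(2.0 * math.pi))
    return -(log_prior + log_det_sum) / n   # average NLL per dimension

# Example with a dummy latent and accumulated log-determinant.
z = torch.randn(2, 8, 1000)
log_det_sum = torch.tensor(0.0)
print(flow_nll(z, log_det_sum))
</code>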

https://github.com/ikostrikov/pytorch-flows A PyTorch implementation of Masked Autoregressive Flow and some other invertible transformations from Glow: Generative Flow with Invertible 1x1 Convolutions and Density estimation using Real NVP.
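
As a reminder of the common building block behind Real NVP and Glow style flows, here is a minimal affine coupling layer sketch in PyTorch. The class name and layer sizes are illustrative assumptions, not code from the linked repository.

<code python>
import torch
import torch.nn as nn

class AffineCoupling(nn.Module):
    """Real NVP-style affine coupling layer (illustrative sketch).

    The first half of the features passes through unchanged and parameterizes
    an elementwise scale-and-shift of the second half, so the Jacobian is
    triangular and its log-determinant is just the sum of the log-scales.
    """
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.half = dim // 2
        self.net = nn.Sequential(
            nn.Linear(self.half, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * (dim - self.half)),
        )

    def forward(self, x):
        x1, x2 = x[:, :self.half], x[:, self.half:]
        log_s, t = self.net(x1).chunk(2, dim=-1)
        log_s = torch.tanh(log_s)          # keep scales well-behaved
        y2 = x2 * torch.exp(log_s) + t
        log_det = log_s.sum(dim=-1)        # per-sample log|det Jacobian|
        return torch.cat([x1, y2], dim=-1), log_det

# Example: transform a batch of 4-dimensional samples.
layer = AffineCoupling(dim=4)
y, log_det = layer(torch.randn(16, 4))
</code>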