datasets and improves upon results obtained using real data alone.
https://arxiv.org/pdf/1805.10561v1.pdf Adversarial Constraint Learning for Structured Prediction
Learning requires a black-box simulator of structured outputs, which generates valid labels but need not model their corresponding inputs or the input-label relationship. At training time, we constrain the model to produce outputs that cannot be distinguished from simulated labels by adversarial training. Providing our framework with a small number of labeled inputs gives rise to a new semi-supervised structured prediction model; we evaluate this model on multiple tasks (tracking, pose estimation and time series prediction) and find that it achieves high accuracy with only a small number of labeled inputs. In some cases, no labels are required at all.
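The core idea above can be sketched in a toy setting: a black-box simulator emits valid labels only, a discriminator tries to tell simulated labels from model outputs, and the predictor is trained to fool it. This is a minimal numpy sketch under strong simplifying assumptions (scalar labels, linear predictor, logistic discriminator, manual gradients); all names here are illustrative, not from the paper's code.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_labels(n):
    # Black-box simulator: draws structurally valid labels only;
    # it never sees inputs or the input-label mapping.
    t = rng.uniform(0, 2 * np.pi, n)
    return np.sin(t)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Linear predictor and logistic discriminator stand in for the
# paper's deep networks.
w_f, b_f = 0.1, 0.0          # predictor: y_hat = w_f * x + b_f
w_d, b_d = 0.1, 0.0          # discriminator on scalar labels

x = rng.uniform(-1, 1, 256)  # unlabeled inputs
lr = 0.05
for _ in range(200):
    y_hat = w_f * x + b_f
    y_sim = simulate_labels(256)
    # Discriminator step: simulated labels -> 1, model outputs -> 0.
    for y, target in ((y_sim, 1.0), (y_hat, 0.0)):
        p = sigmoid(w_d * y + b_d)
        g = p - target               # dL/dz of the logistic loss
        w_d -= lr * np.mean(g * y)
        b_d -= lr * np.mean(g)
    # Predictor step: fool the discriminator (non-saturating GAN loss).
    p = sigmoid(w_d * y_hat + b_d)
    g = (p - 1.0) * w_d              # chain rule through the discriminator
    w_f -= lr * np.mean(g * x)
    b_f -= lr * np.mean(g)

y_out = w_f * x + b_f
print(y_out.shape)
```

The adversarial constraint replaces a per-example supervised loss, which is why only a handful of labeled inputs (or none) are needed to anchor the model.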
https://arxiv.org/abs/1809.01219v1 Graph-based Deep-Tree Recursive Neural Network (DTRNN) for Text Classification
The DTG method can generate a richer and more accurate representation for nodes (or vertices) in graphs. It adds flexibility in exploring the vertex neighborhood information, better reflecting the second-order proximity and homophily equivalence in a graph.
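One plausible reading of the graph-to-tree step: each vertex is expanded into a tree rooted at a target node so a recursive (tree-structured) network can consume it, with vertices allowed to reappear on different branches so second-order neighbors become visible. This short sketch is an illustrative stand-in, not the paper's exact DTG algorithm.

```python
def expand_tree(adj, root, depth):
    # Each vertex becomes a tree node whose children are its graph
    # neighbors, expanded recursively down to `depth` levels; a vertex
    # may reappear on several branches, exposing second-order proximity.
    if depth == 0:
        return (root, [])
    return (root, [expand_tree(adj, n, depth - 1) for n in adj[root]])

# Tiny undirected graph as an adjacency dict (hypothetical example data).
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
tree = expand_tree(adj, 0, 2)
```

Here vertex 3, two hops from the root, appears in the tree even though it is not a direct neighbor of vertex 0.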
https://arxiv.org/abs/1811.11264v1 Synthesizing Tabular Data using Generative Adversarial Networks
Generative adversarial networks (GANs) implicitly learn the probability distribution of a dataset and can draw samples from it. This paper presents Tabular GAN (TGAN), a generative adversarial network that can generate tabular data such as medical or educational records. Using the power of deep neural networks, TGAN generates high-quality, fully synthetic tables with both discrete and continuous variables. Evaluated on three datasets, TGAN outperforms conventional statistical generative models both in capturing the correlation between columns and in scaling up to large datasets.
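The mixed discrete/continuous output is the distinctive part of tabular generation. A minimal numpy sketch of a generator's output side, assuming one continuous column squashed by tanh and one categorical column taken as the softmax argmax (real TGAN uses a more careful architecture, e.g. mode-specific normalization and sampled categories; shapes and names here are illustrative only):

```python
import numpy as np

rng = np.random.default_rng(1)

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

n, latent_dim, n_categories = 8, 4, 3
W_cont = rng.normal(size=(latent_dim, 1))            # untrained toy weights
W_disc = rng.normal(size=(latent_dim, n_categories))

z = rng.normal(size=(n, latent_dim))                 # latent noise, one row per record
value_scaled = np.tanh(z @ W_cont).ravel()           # continuous column in (-1, 1)
probs = softmax(z @ W_disc)                          # per-category probabilities
category = probs.argmax(axis=-1)                     # discrete column (simplified: argmax)

rows = list(zip(value_scaled, category))
print(len(rows))
```

Emitting each column through its own head from a shared latent vector is what lets a single sample preserve cross-column correlations.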
https://arxiv.org/abs/1806.03384 Data Synthesis based on Generative Adversarial Networks