a small number of labeled inputs. In some cases, no labels are required at all.
https://arxiv.org/abs/1809.01219v1 Graph-based Deep-Tree Recursive Neural Network (DTRNN) for Text Classification
The DTG method can generate a richer and more accurate representation for nodes (or vertices) in graphs. It adds flexibility in exploring the vertex neighborhood information to better reflect the second-order proximity and homophily equivalence in a graph.
https://arxiv.org/abs/1811.11264v1 Synthesizing Tabular Data using Generative Adversarial Networks
Generative adversarial networks (GANs) implicitly learn the probability distribution of a dataset and can draw samples from it. This paper presents Tabular GAN (TGAN), a generative adversarial network that can generate tabular data such as medical or educational records. Using the power of deep neural networks, TGAN generates high-quality, fully synthetic tables containing both discrete and continuous variables. Evaluated on three datasets, TGAN outperforms conventional statistical generative models both in capturing the correlation between columns and in scaling up to large datasets.
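The core difficulty TGAN addresses is emitting discrete and continuous columns in a single sample. A minimal sketch of what such a generator's output looks like (not the TGAN implementation — the column names, mixture parameters, and category probabilities below are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

def generate_rows(n_rows):
    """Draw n_rows synthetic records with mixed column types:
    a continuous column modeled as a two-component Gaussian mixture
    (a stand-in for the mode-specific normalization TGAN applies to
    numeric columns) and a discrete column drawn from a fixed
    categorical distribution (a stand-in for a softmax over
    one-hot-encoded categories)."""
    # Continuous column: pick a mixture component per row, then sample.
    comp = rng.integers(0, 2, size=n_rows)
    means = np.array([30.0, 55.0])   # illustrative component means
    stds = np.array([5.0, 8.0])      # illustrative component std devs
    continuous = rng.normal(means[comp], stds[comp])
    # Discrete column: categorical draw over 3 classes.
    discrete = rng.choice(3, size=n_rows, p=[0.5, 0.3, 0.2])
    return continuous, discrete

continuous, discrete = generate_rows(1000)
```

In TGAN these distribution parameters are produced by the generator network and trained adversarially; the sketch only shows the shape of the sampling problem, i.e. that continuous and discrete columns must be drawn jointly per row.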