We design a model consisting of three modules: a profile detector to decide whether a post should be responded to using the profile and which profile key should be addressed, a bidirectional decoder that generates the response forward and backward starting from a selected profile value, and a position detector that predicts the word position from which decoding should start given a selected profile value. We show that general conversation data from social media can be used to generate profile-coherent responses.
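
A minimal control-flow sketch of how those three modules could fit together. Everything below is illustrative: the function names, toy profile, and canned outputs are hypothetical stand-ins for the learned neural components described in the abstract; only the overall flow mirrors the description.

```python
# Sketch of the three-module design above: a profile detector picks a profile
# key, a position detector picks where the profile value should sit in the
# response, and a bidirectional decoder fills in the words to its left
# (backward) and right (forward). All internals are hypothetical stand-ins.

PROFILE = {"name": "Wang Zai", "hobby": "comics", "speciality": "cooking"}

def profile_detector(post, profile):
    """Decide whether to answer with the profile and, if so, which key."""
    for key in profile:                      # stand-in: keyword match, not a classifier
        if key in post.lower():
            return key
    return None

def position_detector(post, value):
    """Predict the position of the profile value in the response (stand-in: 2)."""
    return 2

def backward_decoder(post, value, start_pos):
    """Generate the words to the LEFT of the value, right to left (canned here)."""
    return ["i", "love"][:start_pos]

def forward_decoder(post, value, prefix):
    """Generate the words to the RIGHT of the value, given the left context (canned)."""
    return ["very", "much"]

def respond(post, profile):
    key = profile_detector(post, profile)
    if key is None:
        return "(fall back to an ordinary seq2seq response)"
    value = profile[key]
    start = position_detector(post, value)
    left = backward_decoder(post, value, start)
    right = forward_decoder(post, value, left)
    return " ".join(left + [value] + right)

print(respond("so what is your hobby?", PROFILE))  # -> i love comics very much
```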

https://arxiv.org/abs/1711.00889 Structured Generative Adversarial Networks

https://github.com/VinceMarron/style_transfer Style Transfer as Optimal Transport

https://nlp.stanford.edu/pubs/li2018transfer.pdf Delete, Retrieve, Generate: A Simple Approach to Sentiment and Style Transfer

In this paper, we propose simpler methods motivated by the observation that text attributes are often marked by distinctive phrases (e.g., “too small”). Our strongest method extracts content words by deleting phrases associated with the sentence’s original attribute value, retrieves new phrases associated with the target attribute, and uses a neural model to fluently combine these into a final output.
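
A rough sketch of the delete and retrieve steps on toy sentiment data. The salience scoring, threshold, and tiny corpora below are illustrative choices rather than the paper's exact formulation, and the final neural "generate" step that fluently recombines content and retrieved phrases is omitted.

```python
# Delete: drop phrases that strongly mark the source attribute.
# Retrieve: find a target-attribute example that shares the remaining content.
from collections import Counter

NEG = ["the food was too small and too cold",
       "service was slow and the food was cold"]
POS = ["the food was delicious and the service was great",
       "portions were huge and the staff was friendly"]

def ngram_counts(corpus, n_max=2):
    counts = Counter()
    for sent in corpus:
        toks = sent.split()
        for n in range(1, n_max + 1):
            for i in range(len(toks) - n + 1):
                counts[" ".join(toks[i:i + n])] += 1
    return counts

NEG_C, POS_C = ngram_counts(NEG), ngram_counts(POS)

def salience(phrase, src, tgt, smooth=1.0):
    """How much more often a phrase appears with the source attribute than the target."""
    return (src[phrase] + smooth) / (tgt[phrase] + smooth)

def delete(sentence, src, tgt, threshold=2.0):
    """Keep only tokens that are NOT salient markers of the source attribute
    (unigrams only, for brevity)."""
    return " ".join(t for t in sentence.split() if salience(t, src, tgt) < threshold)

def retrieve(content, tgt_corpus):
    """Return the target-attribute sentence with the most word overlap."""
    content_set = set(content.split())
    return max(tgt_corpus, key=lambda s: len(content_set & set(s.split())))

sent = "the food was too small and too cold"
content = delete(sent, NEG_C, POS_C)   # content words with negative markers removed
template = retrieve(content, POS)      # positive example to borrow phrases from
print(content)
print(template)
```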

https://arxiv.org/abs/1808.10122v1 Learning Neural Templates for Text Generation

Encoder-decoder models are largely (a) uninterpretable, and (b) difficult to control in terms of their phrasing or content. This work proposes a neural generation system using a hidden semi-Markov model (HSMM) decoder, which learns latent, discrete templates jointly with learning to generate. We show that this model learns useful templates, and that these templates make generation both more interpretable and controllable.
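
A toy illustration of the semi-Markov idea behind this: each latent state emits a whole multi-word segment, so a sampled state sequence reads off as a discrete template. The hand-written states, segments, and restaurant-style content below are invented for illustration; in the paper both the transition and emission distributions are neural and conditioned on the source data.

```python
# Each state emits a variable-length segment; the state sequence is the "template".
import random

TRANSITIONS = {             # state -> possible next states
    "NAME":    ["IS_A"],
    "IS_A":    ["CUISINE"],
    "CUISINE": ["PRICE", "END"],
    "PRICE":   ["END"],
}
SEGMENTS = {                # state -> candidate multi-word segments
    "NAME":    [["The", "Golden", "Palace"], ["Loch", "Fyne"]],
    "IS_A":    [["is", "a"], ["is", "an", "affordable"]],
    "CUISINE": [["Italian", "restaurant"], ["coffee", "shop"]],
    "PRICE":   [["in", "the", "high", "price", "range"]],
}

def sample(seed=0):
    rng = random.Random(seed)
    state, words, template = "NAME", [], []
    while state != "END":
        template.append(state)
        words.extend(rng.choice(SEGMENTS[state]))   # emit a whole segment
        state = rng.choice(TRANSITIONS[state])      # semi-Markov transition
    return template, " ".join(words)

template, text = sample()
print(template)   # e.g. ['NAME', 'IS_A', 'CUISINE', 'PRICE']
print(text)
```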

https://web.cs.hacettepe.edu.tr/~karacan/projects/attribute_hallucination/# Manipulating Attributes of Natural Scenes via Hallucination

https://arxiv.org/abs/1810.01175 Line Drawings from 3D Models

https://hal.inria.fr/hal-01802131v2/document Unsupervised Learning of Artistic Styles with Archetypal Style Analysis

https://compvis.github.io/adaptive-style-transfer/ A Style-Aware Content Loss for Real-time HD Style Transfer