We describe a mechanism by which artificial neural networks can learn rapid adaptation - the ability to adapt on the fly, with little data, to new tasks - that we call conditionally shifted neurons. We apply this mechanism in the framework of metalearning, where the aim is to replicate some of the flexibility of human learning in machines. Conditionally shifted neurons modify their activation values with task-specific shifts retrieved from a memory module, which is populated rapidly based on limited task experience. On metalearning benchmarks from the vision and language domains, models augmented with conditionally shifted neurons achieve state-of-the-art results.
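
A minimal sketch of that mechanism as the abstract describes it, not the authors' implementation: a memory of (key, shift) pairs is written from a few task examples, and at inference each neuron's activation is offset by a shift retrieved via soft attention over the keys. The key/shift layout and the attention form here are assumptions.

<code python>
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

class ConditionallyShiftedLayer:
    def __init__(self, in_dim, out_dim, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(scale=0.1, size=(in_dim, out_dim))
        self.keys = []    # one memory key per stored task example
        self.shifts = []  # per-neuron activation shifts stored with each key

    def write_memory(self, key, shift):
        # Populated rapidly from limited task experience (a support set);
        # in the paper the shifts are derived from error signals, here the
        # caller supplies them directly.
        self.keys.append(key)
        self.shifts.append(shift)

    def forward(self, x):
        h = np.tanh(x @ self.W)
        if self.keys:
            # Soft attention over memory keys; the retrieved shift is a
            # convex combination of the stored per-neuron shifts.
            attn = softmax(np.stack(self.keys) @ h)
            h = h + attn @ np.stack(self.shifts)
        return h

layer = ConditionallyShiftedLayer(in_dim=4, out_dim=3)
x = np.ones(4)
base = layer.forward(x)                        # activations before adaptation
layer.write_memory(key=base, shift=np.array([0.5, -0.5, 0.0]))
print(layer.forward(x))                        # activations now task-shifted
</code>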

https://arxiv.org/abs/1612.08083v3 Language Modeling with Gated Convolutional Networks
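
The gating this paper introduces is the gated linear unit, h = (X*W + b) ⊗ σ(X*V + c): one convolution's output is multiplied elementwise by the sigmoid of a second convolution over the same input. A minimal numpy sketch; the shapes and the causal padding are illustrative choices, not the paper's exact setup.

<code python>
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def causal_conv1d(x, w):
    # x: (T, d_in); w: (k, d_in, d_out). Left-pad so each output step sees
    # only current and past inputs; returns (T, d_out).
    k = w.shape[0]
    xp = np.concatenate([np.zeros((k - 1, x.shape[1])), x], axis=0)
    return np.stack([np.einsum('kd,kde->e', xp[t:t + k], w)
                     for t in range(x.shape[0])])

def glu_layer(x, w_a, b_a, w_b, b_b):
    # h = (x * w_a + b_a) gated elementwise by sigmoid(x * w_b + b_b).
    return (causal_conv1d(x, w_a) + b_a) * sigmoid(causal_conv1d(x, w_b) + b_b)

rng = np.random.default_rng(0)
T, d_in, d_out, k = 5, 8, 8, 3
x = rng.normal(size=(T, d_in))
h = glu_layer(x,
              rng.normal(size=(k, d_in, d_out)), np.zeros(d_out),
              rng.normal(size=(k, d_in, d_out)), np.zeros(d_out))
print(h.shape)  # (5, 8)
</code>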

https://arxiv.org/abs/1711.06640v2 Neural Motifs: Scene Graph Parsing with Global Context

Our analysis motivates a new baseline: given object detections, predict the most frequent relation between object pairs with the given labels, as seen in the training set. This baseline improves on the previous state of the art by an average of 3.6% relative improvement across evaluation settings. We then introduce Stacked Motif Networks, a new architecture designed to capture higher-order motifs in scene graphs that further improves over our strong baseline by an average 7.1% relative gain. Our code is available at github.com/rowanz/neural-motifs.
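
That frequency baseline is simple enough to sketch directly; the data format below (label triples) is an assumption, not the paper's interface.

<code python>
# For each ordered pair of object labels, remember the relation seen most
# often in training, and predict that relation at test time.
from collections import Counter, defaultdict

def build_freq_baseline(training_triples):
    """training_triples: iterable of (subject_label, relation, object_label)."""
    counts = defaultdict(Counter)
    for subj, rel, obj in training_triples:
        counts[(subj, obj)][rel] += 1
    return {pair: ctr.most_common(1)[0][0] for pair, ctr in counts.items()}

train = [("man", "riding", "horse"), ("man", "feeding", "horse"),
         ("man", "riding", "horse"), ("dog", "on", "surfboard")]
predict = build_freq_baseline(train)
print(predict[("man", "horse")])  # -> "riding"
</code>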

https://arxiv.org/abs/1808.08493 Contextual Parameter Generation for Universal Neural Machine Translation

https://arxiv.org/abs/1809.01997 Dual Ask-Answer Network for Machine Reading Comprehension

https://openreview.net/forum?id=BylBfnRqFm CAML: Fast Context Adaptation via Meta-Learning

https://arxiv.org/pdf/1810.03642v1.pdf CAML: Fast Context Adaptation via Meta-Learning

CAML: Fast Context Adaptation via Meta-Learning
Luisa M Zintgraf, Kyriacos Shiarlis, Vitaly Kurin, Katja Hofmann, Shimon Whiteson
(Submitted on 8 Oct 2018 (this version), latest version 12 Oct 2018 (v2))
We propose CAML, a meta-learning method for fast adaptation that partitions the model parameters into two parts: context parameters that serve as additional input to the model and are adapted on individual tasks, and shared parameters that are meta-trained and shared across tasks.
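
A toy sketch of that split: shared weights stay fixed during task adaptation, and only a small context vector, fed to the model as extra input features, is updated on the task's support data. The linear model, squared loss, and plain gradient steps are illustrative assumptions rather than the paper's setup.

<code python>
import numpy as np

def predict(x, context, W_shared):
    # Context parameters enter the model as additional input features.
    return np.concatenate([x, context]) @ W_shared

def adapt_context(support, W_shared, dim_ctx=2, lr=0.1, steps=20):
    ctx = np.zeros(dim_ctx)          # context is re-initialised per task
    for _ in range(steps):
        grad = np.zeros_like(ctx)
        for x, y in support:
            err = predict(x, ctx, W_shared) - y
            grad += 2 * err * W_shared[len(x):]   # d(squared loss)/d(context)
        ctx -= lr * grad / len(support)
    return ctx

rng = np.random.default_rng(0)
W = rng.normal(size=3 + 2)           # meta-trained in the full method; random here
support = [(rng.normal(size=3), 1.0) for _ in range(4)]
ctx = adapt_context(support, W)      # only the context adapts to the task
print(predict(support[0][0], ctx, W))
</code>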

https://arxiv.org/abs/1810.08135 Contextual Topic Modeling For Dialog Systems

Our work on detecting conversation topics and keywords can be used to guide chatbots toward coherent dialog.
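
The page gives no implementation details; as a generic illustration only (everything below is assumed, not the paper's method), one could extract frequent content words from the dialog history and prefer candidate responses that overlap with them.

<code python>
from collections import Counter

STOPWORDS = {"the", "a", "is", "to", "and", "i", "you", "in", "are", "for"}

def tokens(text):
    return [w.strip("?!.,") for w in text.lower().split()]

def topic_keywords(history, k=3):
    words = [w for turn in history for w in tokens(turn) if w not in STOPWORDS]
    return {w for w, _ in Counter(words).most_common(k)}

def pick_response(candidates, keywords):
    # Prefer the candidate that shares the most words with the detected topic.
    return max(candidates, key=lambda c: len(keywords & set(tokens(c))))

history = ["i love hiking in the mountains", "mountains are great for hiking"]
kw = topic_keywords(history)   # e.g. {"hiking", "mountains", "love"}
print(pick_response(["do you like hiking?", "what is your favorite movie?"], kw))
</code>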

https://www.nature.com/articles/s41467-018-06781-2 Reference-point centering and range-adaptation enhance human reinforcement learning at the cost of irrational preferences