active_learning [2018/03/15 11:27]
https://arxiv.org/abs/1801.08230 Deep Interactive Evolution
https://arxiv.org/abs/1802.04877 Learning via social awareness: improving sketch representations with facial feedback
 This paper argues that such research has overlooked an important and useful intrinsic motivator: social interaction. We posit that making an AI agent aware of implicit social feedback from humans can allow for faster learning of more generalizable and useful representations, and could potentially impact AI safety. We collect social feedback in the form of facial expression reactions to samples from Sketch RNN, an LSTM-based variational autoencoder (VAE) designed to produce sketch drawings. We use a Latent Constraints GAN (LC-GAN) to learn from the facial feedback of a small group of viewers, and then show in an independent evaluation with 76 users that this model produced sketches that lead to significantly more positive facial expressions. Thus, we establish that implicit social feedback can improve the output of a deep learning model.
https://arxiv.org/pdf/1802.07427.pdf Active Learning with Partial Feedback
https://arxiv.org/abs/1708.00489v3 Active Learning for Convolutional Neural Networks: A Core-Set Approach
 Our empirical analysis showed that classical uncertainty-based methods have limited applicability to CNNs due to the correlations caused by batch sampling. We re-formulate the active learning problem as core-set selection and study the core-set problem for CNNs. We further validated our algorithm using an extensive empirical study. Empirical results on three datasets showed state-of-the-art performance by a large margin.
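The core-set formulation above selects a batch of points that covers the unlabeled pool in feature space rather than ranking points by per-example uncertainty. A minimal sketch of the greedy k-center heuristic typically used for this (the function name and the raw NumPy feature matrix are illustrative assumptions, not the paper's exact code):

```python
import numpy as np

def k_center_greedy(features, labeled_idx, budget):
    """Greedy 2-approximation to k-center: repeatedly pick the pool point
    farthest from its nearest already-selected point.

    features    : (n, d) array of feature embeddings, one row per example
    labeled_idx : indices of the already-labeled examples (initial centers)
    budget      : how many new examples to select for labeling
    """
    # distance from every point to its nearest current center
    centers = features[list(labeled_idx)]
    dists = np.min(
        np.linalg.norm(features[:, None, :] - centers[None, :, :], axis=2),
        axis=1,
    )
    selected = []
    for _ in range(budget):
        # the point worst covered by the current centers
        i = int(np.argmax(dists))
        selected.append(i)
        # adding i as a center can only shrink nearest-center distances
        dists = np.minimum(dists, np.linalg.norm(features - features[i], axis=1))
    return selected
```

In the paper's setting the features would come from a trained CNN's penultimate layer; here any embedding works. The greedy loop is O(budget × n), which is what makes the core-set view practical for batch-mode selection.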