Sparse Coding for Multi-task and Transfer Learning

Oct 6, 2014

We consider the problem of learning many regression or binary classification tasks simultaneously, under the assumption that the tasks' weight vectors are well approximated as sparse combinations of the atoms of a dictionary. This assumption, together with the large quantity of available tasks, allows for a principled method for choosing the dictionary. We provide theoretical and experimental justifications of this claim, both in the domain of multitask learning, where the learned dictionary is applied to a fixed set of tasks, and in the domain of learning to learn, where the tasks are randomly generated and the learned dictionary is applied to new tasks sampled by the same process. These results also imply that, as the number of tasks grows, our method matches the performance of the Lasso with the best dictionary known a priori. Finally, we discuss extensions of our method to other coding schemes beyond sparse coding and to multilayer networks. This is joint work with Andreas Maurer and Bernardino Romera-Paredes.
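The core assumption above can be sketched in code. The following is a minimal illustration, not the authors' algorithm: it generates synthetic tasks whose true weight vectors are sparse combinations of shared atoms, estimates per-task weights by ordinary least squares, learns a dictionary over those estimates with scikit-learn's `DictionaryLearning`, and then transfers to a new task by running the Lasso in the learned dictionary's coordinates. All dimensions and regularization constants are illustrative choices, not values from the talk.

```python
import numpy as np
from sklearn.decomposition import DictionaryLearning
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# Hypothetical synthetic setup (dimensions are illustrative, not from the
# talk): each task's true weight vector is a sparse combination of K atoms.
d, K, T, n = 20, 6, 60, 100          # features, atoms, tasks, samples/task
D_true = rng.standard_normal((K, d))
codes = np.where(rng.random((T, K)) < 0.3, rng.standard_normal((T, K)), 0.0)
W_true = codes @ D_true              # row t = weight vector of task t

# Stage 1: per-task ordinary least squares estimates of the weight vectors.
W_hat = np.empty_like(W_true)
for t in range(T):
    X = rng.standard_normal((n, d))
    y = X @ W_true[t] + 0.1 * rng.standard_normal(n)
    W_hat[t] = np.linalg.lstsq(X, y, rcond=None)[0]

# Stage 2: learn a dictionary whose atoms sparsely reconstruct the
# estimated weight vectors (rows of dl.components_ are the atoms).
dl = DictionaryLearning(n_components=K, alpha=0.1, random_state=0)
dl.fit(W_hat)
D = dl.components_                   # shape (K, d)

# Transfer ("learning to learn"): for a new task with few samples, fit a
# Lasso in the dictionary's coordinates instead of the raw feature space.
w_new = np.where(rng.random(K) < 0.3, rng.standard_normal(K), 0.0) @ D_true
X_new = rng.standard_normal((15, d)) # fewer samples than features
y_new = X_new @ w_new + 0.1 * rng.standard_normal(15)
lasso = Lasso(alpha=0.05, fit_intercept=False, max_iter=10000)
lasso.fit(X_new @ D.T, y_new)        # regress on the K atom responses
w_est = D.T @ lasso.coef_            # back to the d-dimensional weights
```

Because the new task's weights live near the span of a few atoms, the Lasso only has to identify a sparse code over `K` atoms rather than `d` raw coordinates, which is the sense in which the shared dictionary eases learning from few samples.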


Except where otherwise noted, content on this site is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 4.0 International license.