A Unifying Framework for Multi-task Learning
Carlo Ciliberto

Transcript
Page 1: A Unifying Framework for Multi-task Learning

A Unifying Framework for Multi-task Learning
Carlo Ciliberto

Page 2: A Unifying Framework for Multi-task Learning

Sharing Information

Page 3: A Unifying Framework for Multi-task Learning

Sharing Information

Page 4: A Unifying Framework for Multi-task Learning

Sharing Information

Page 5: A Unifying Framework for Multi-task Learning

Without Sharing Information

Page 6: A Unifying Framework for Multi-task Learning

Sharing Information

Page 7: A Unifying Framework for Multi-task Learning

Multi-task Learning: Assumption
Leveraging the tasks' relations/structure reduces the complexity of the problem
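
A toy sketch of this assumption (a hypothetical setup, not taken from the slides): two regression tasks with nearly identical weight vectors, where even the crudest form of information sharing, averaging the per-task estimates, typically recovers each task better than solving it in isolation.

```python
import numpy as np

# Toy illustration of the assumption (hypothetical setup, not from the slides):
# two regression tasks whose weight vectors are almost identical. With few
# samples per task, sharing information across tasks typically helps.

rng = np.random.default_rng(0)
d, n = 20, 15                                   # more dimensions than samples per task
w_common = rng.normal(size=d)
w1 = w_common + 0.05 * rng.normal(size=d)       # task 1: small deviation
w2 = w_common + 0.05 * rng.normal(size=d)       # task 2: small deviation

def ridge(X, y, lam=1.0):
    """Standard single-task ridge regression estimate."""
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

X1, X2 = rng.normal(size=(n, d)), rng.normal(size=(n, d))
y1 = X1 @ w1 + 0.1 * rng.normal(size=n)
y2 = X2 @ w2 + 0.1 * rng.normal(size=n)

w1_alone = ridge(X1, y1)                              # task 1 solved in isolation
w1_shared = 0.5 * (ridge(X1, y1) + ridge(X2, y2))     # simplest possible sharing

print("error, no sharing:  ", np.linalg.norm(w1_alone - w1))
print("error, with sharing:", np.linalg.norm(w1_shared - w1))
```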

Page 8: A Unifying Framework for Multi-task Learning
Page 9: A Unifying Framework for Multi-task Learning
Page 10: A Unifying Framework for Multi-task Learning

Impose known structures
[Evgeniou et al. 2005, Fergus et al. 2010, Kadri et al. 2010, Minh et al. 2013, Jayaraman et al. 2014, and many others]

Parametrize and learn the structure
[Argyriou et al. 2008, Jacob et al. 2009, Zhang et al. 2010, Dinuzzo et al. 2011, Zhong 2012, and many others]

Page 11: A Unifying Framework for Multi-task Learning

Learning To Learn

Output Representation Learning

Page 12: A Unifying Framework for Multi-task Learning

To Abstract, Understand & Organize

Page 13: A Unifying Framework for Multi-task Learning

Can we design a unifying (convex) framework for learning multiple tasks and their structure?

Page 14: A Unifying Framework for Multi-task Learning

Can we design a unifying (convex) framework for learning multiple tasks and their structure?

Yes!

Page 15: A Unifying Framework for Multi-task Learning

Can we provide a general meta-strategy for optimization… with convergence guarantees?

Page 16: A Unifying Framework for Multi-task Learning

Can we provide a general meta-strategy for optimization… with convergence guarantees?

Yes!

Page 17: A Unifying Framework for Multi-task Learning

Can we derive new models of task structures from such a framework?

Page 18: A Unifying Framework for Multi-task Learning

Can we derive new models of task structures from such a framework?

Yes!

Page 19: A Unifying Framework for Multi-task Learning

Can we derive new models of task structures from such a framework?

Can we provide a general meta-strategy for optimization, with convergence guarantees?

Can we design a unifying (convex) framework for learning multiple tasks and their structure?

[Ciliberto et al. - ICML 2015]

[Ciliberto et al. - CVPR 2015]

Page 20: A Unifying Framework for Multi-task Learning

RKHS for Vector-Valued Functions
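
For concreteness, a minimal numpy sketch of regression in an RKHS of vector-valued functions, assuming a separable operator-valued kernel Gamma(x, x') = k(x, x') * A, where the T x T structure matrix A encodes the task relations. The Gaussian scalar kernel and all function names are illustrative choices, not taken from the slides.

```python
import numpy as np

# Minimal sketch (not the authors' code) of vector-valued kernel ridge regression
# with a separable operator-valued kernel Gamma(x, x') = k(x, x') * A, where the
# T x T structure matrix A encodes the relations among the T tasks.

def gaussian_kernel(X1, X2, sigma=1.0):
    """Scalar Gaussian kernel matrix between two sets of inputs."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def fit_multitask(X, Y, A, lam=0.1, sigma=1.0):
    """Solve (K kron A + n*lam*I) vec(C) = vec(Y) for the coefficients.

    X: (n, d) inputs shared by all tasks; Y: (n, T) outputs, one column per task.
    Returns C of shape (n, T), whose i-th row is the coefficient vector c_i."""
    n, T = Y.shape
    K = gaussian_kernel(X, X, sigma)                # scalar kernel matrix
    G = np.kron(K, A) + n * lam * np.eye(n * T)     # block (operator-valued) kernel matrix
    c = np.linalg.solve(G, Y.reshape(-1))           # Y.reshape(-1) stacks the rows y_i
    return c.reshape(n, T)

def predict(X_new, X, C, A, sigma=1.0):
    """Evaluate f(x) = sum_i k(x, x_i) * A @ c_i at the test points."""
    return gaussian_kernel(X_new, X, sigma) @ C @ A.T   # (m, T) predictions
```

With A equal to the identity the tasks decouple into independent scalar problems; the off-diagonal entries of A are what make the tasks share information.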

Page 21: A Unifying Framework for Multi-task Learning

Examples

Graph Laplacian [Evgeniou et al. 2005, Argyriou et al. 2013]

Low-dimensional subspace sharing [Argyriou et al. 2008, Zhang et al. 2010]

Cluster multi-task learning [Jacob et al. 2009, Kwok et al. 2012]

Sparse kernel multi-task learning [Ciliberto et al. 2015]
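
To make the graph-Laplacian and cluster examples above concrete, here are hypothetical constructions of the structure matrix A (illustrative sketches only, not the cited papers' code).

```python
import numpy as np

# Illustrative structure matrices A for a separable kernel k(x, x') * A.
# These constructions are assumptions made for illustration, not the cited papers' code.

def laplacian_structure(W, mu=1.0):
    """Graph-Laplacian coupling: tasks joined by an edge in W are encouraged
    to have similar predictors. Penalizing differences along edges corresponds
    to A^{-1} = mu * L + I, so the structure matrix is its inverse."""
    L = np.diag(W.sum(axis=1)) - W              # Laplacian of the task graph
    return np.linalg.inv(mu * L + np.eye(W.shape[0]))

def cluster_structure(labels, within=1.0, eps=1e-2):
    """Cluster multi-task learning: tasks in the same cluster share information,
    tasks in different clusters are (almost) independent."""
    labels = np.asarray(labels)
    A = within * (labels[:, None] == labels[None, :]).astype(float)
    return A + eps * np.eye(len(labels))        # small ridge keeps A positive definite

# Example: 4 tasks, where tasks {0, 1} and {2, 3} form two clusters.
A_cluster = cluster_structure([0, 0, 1, 1])
```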

Page 22: A Unifying Framework for Multi-task Learning

Can we derive new models of task structures from such a framework?

Can we provide a general meta-strategy for optimization, with convergence guarantees?

Can we design a unifying (convex) framework for learning multiple tasks and their structure?

[Ciliberto et al. - ICML 2015]

[Ciliberto et al. - CVPR 2015]

Page 23: A Unifying Framework for Multi-task Learning

Are we done?

Page 24: A Unifying Framework for Multi-task Learning
Page 25: A Unifying Framework for Multi-task Learning
Page 26: A Unifying Framework for Multi-task Learning
Page 27: A Unifying Framework for Multi-task Learning

Can we find a parametrization for all operator-valued kernels?

Can we still learn them?

Page 28: A Unifying Framework for Multi-task Learning

Can we find a parametrization for all operator-valued kernels?

Can we still learn them?

Spoiler alert: Yes! [Ciliberto et al. - In Preparation]

Page 29: A Unifying Framework for Multi-task Learning

Take home messages

Page 30: A Unifying Framework for Multi-task Learning

Take home messages

Multi-task learning: if tasks are related, solving them jointly can be much more favorable!


Page 31: A Unifying Framework for Multi-task Learning

Take home messages

Multi-task learning: if tasks are related, solving them jointly can be much more favorable!

RKHS for vector-valued functions are the way to go! You can:


Page 32: A Unifying Framework for Multi-task Learning

Take home messages

Multi-task learning: if tasks are related, solving them jointly can be much more favorable!

RKHS for vector-valued functions are the way to go! You can:

Impose prior knowledge on the structure, by designing a suitable structure matrix A

Learn the relations, by imposing a structure penalty F(A) on the problem


Page 33: A Unifying Framework for Multi-task Learning

Take home messages

Multi-task learning: if tasks are related, solving them jointly can be much more favorable!

RKHS for vector-valued functions are the way to go! You can:

Impose prior knowledge on the structure, by designing a suitable structure matrix A

Learn the relations, by imposing a structure penalty F(A) on the problem (see the optimization sketch after these messages)

Future Work

More complex intra-task relations: impose or learn more complex input-output relations

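A rough sketch of how "learn the relations" can be carried out by the alternating meta-strategy, assuming a squared loss, a separable kernel k(x, x') * A, and the simple structure penalty F(A) = tr(A). The objective, function names, and closed-form A-step are illustrative assumptions, not the implementation from the cited papers.

```python
import numpy as np
from scipy.linalg import solve_sylvester, sqrtm

# Hedged sketch of alternating minimization for learning tasks and their structure,
# assuming squared loss, a separable kernel k(x, x') * A, and F(A) = tr(A).
# Illustrative objective: ||Y - K C||_F^2 + lam * tr(A^{-1} C^T K C) + lam * tr(A).

def alternating_mtl(K, Y, lam=0.1, n_iters=20, eps=1e-6):
    """K: (n, n) scalar kernel matrix; Y: (n, T) outputs, one column per task."""
    n, T = Y.shape
    A = np.eye(T)                                   # start from unrelated tasks
    for _ in range(n_iters):
        # C-step: with A fixed, the optimality condition K C + C (lam * A^{-1}) = Y
        # is a Sylvester equation.
        A_inv = np.linalg.inv(A + eps * np.eye(T))
        C = solve_sylvester(K, lam * A_inv, Y)
        # A-step: with C fixed, min_A tr(A^{-1} M) + tr(A) over positive definite A
        # has the closed-form solution A = M^{1/2}, with M = C^T K C.
        M = C.T @ K @ C
        A = np.real(sqrtm(M + eps * np.eye(T)))
    return C, A
```

Swapping in a different penalty F (for instance one favouring sparse or low-rank task relations) changes only the A-step.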

