Date posted: 12-Apr-2017
Category: Technology
Uploaded by: marujirou
Deep Multi-Task Learning with Shared Memory
Pengfei Liu, Xipeng Qiu, Xuanjing Huang. EMNLP 2016
Reading group presenter: Ryosuke Miyazaki
Abstract
Due to their large number of parameters, neural models need a large-scale corpus. → Unsupervised pre-training is effective. Multi-task learning also improves the final performance. This paper proposes an LSTM with external memory for multi-task learning.
Model: ME-LSTM
Key vector, erase vector, add vector
Model: ME-LSTM
Reading operation
K segments, M dimensions per segment
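The reading operation can be sketched as content-based attention over the K memory segments. A minimal numpy sketch, assuming dot-product scoring (the paper's exact addressing function may differ):

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def read_memory(memory, key):
    """Read from external memory by attending over K segments.

    memory: (K, M) array; key: (M,) key vector. Dot-product scoring
    is an assumption, not necessarily the paper's similarity measure.
    """
    alpha = softmax(memory @ key)   # attention weights over K segments
    return alpha @ memory           # read vector r_t, shape (M,)

mem = 2.0 * np.eye(4)               # toy memory: K=4 segments, M=4 dims
r = read_memory(mem, np.array([1.0, 0.0, 0.0, 0.0]))
```

The attention weights sum to one, so the read vector is a convex combination of the memory segments.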
Model: ME-LSTM
Deep fusion strategy
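The deep fusion strategy gates how much memory content enters the hidden state. A sketch of one plausible gating form, g_t = σ(W_g h_t + U_g r_t), interpolating between the LSTM state h_t and the read vector r_t (the interpolation form here is an assumption, not the paper's verbatim equation):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def deep_fusion(h_t, r_t, W_g, U_g):
    """Fuse LSTM state h_t with memory read r_t via a learned gate.

    g_t decides, per dimension, how much shared-memory information
    flows in; this common gating form is a sketch, not the exact paper.
    """
    g_t = sigmoid(W_g @ h_t + U_g @ r_t)   # fusion gate in (0, 1)
    return (1.0 - g_t) * h_t + g_t * r_t   # element-wise interpolation

h = np.ones(3)
r = np.zeros(3)
W = U = np.zeros((3, 3))                   # zero weights -> g_t = 0.5
fused = deep_fusion(h, r, W, U)            # halfway between h and r
```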
Model: ME-LSTM
Writing operation
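The writing operation updates memory with the erase and add vectors: each segment is first erased in proportion to its attention weight, then the add vector is blended in. A sketch assuming the NTM-style update (the paper's exact parameterization may differ):

```python
import numpy as np

def write_memory(memory, alpha, erase, add):
    """Per-segment update: M_t = M_{t-1} * (1 - outer(alpha, e)) + outer(alpha, a).

    memory: (K, M); alpha: (K,) attention weights; erase, add: (M,).
    """
    return memory * (1.0 - np.outer(alpha, erase)) + np.outer(alpha, add)

mem = np.ones((2, 3))
alpha = np.array([1.0, 0.0])        # write only to segment 0
new_mem = write_memory(mem, alpha, erase=np.ones(3), add=np.zeros(3))
```

With full attention on segment 0 and a full erase vector, segment 0 is wiped while segment 1 is untouched.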
Two architectures
ARC-1 / ARC-2
Training: Task-specific output layer
Linear combination of cost functions
λm is the weight for each task m
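The training objective above is just a weighted sum of per-task costs; a minimal sketch (the λm are per-task hyperparameters):

```python
def multitask_loss(task_losses, lambdas):
    """Linear combination of cost functions: L = sum_m lambda_m * L_m."""
    return sum(lam * loss for lam, loss in zip(lambdas, task_losses))

# Two tasks with losses 1.0 and 2.0, weighted 0.5 and 0.25
total = multitask_loss([1.0, 2.0], [0.5, 0.25])
```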
Experiment: text classification
Result: Movie
Result: Product
Analysis: Visualize the deep fusion gate
Sentiment score
Dimensions of the deep fusion gate gt
Activated → black
Analysis: Visualize the deep fusion gate
Conclusion
・ This paper proposes two deep architectures for multi-task learning.
・ They design an external memory to store knowledge shared across related tasks.
・ The deep fusion strategy enables the model to incorporate shared information.