9. Multi-task Learning
• What is a Task?
• A job that a machine learning algorithm must perform.
• Ex. telling dogs and cats apart; recognizing which digit a handwritten number is.
• Multi + Task
• A methodology for learning several Tasks at the same time.
Prerequisites
[Diagram: <Single-task Learning> vs. <Multi-task Learning> — Models 1–3, each paired with its own Training Set 1–3]
10. Multi-task Learning
• Why learn them at the same time?
• Because performance can be better than when each Task is learned separately!
• Given a Task of telling dogs and cats apart and a Task of telling wolves and tigers apart, couldn't we expect better performance by training on both at once rather than training each one separately?
11. Multi-task Learning
• Given a Task of telling dogs and cats apart and a Task of telling dogs and tigers apart, couldn't we expect better performance by training on both at once rather than training each one separately?
• Knowledge Transfer
• When Multi-task Learning performs better than Single-task Learning, we say that knowledge from one Task has transferred to another Task and improved its performance.
• The goal of Multi-task Learning is to make this knowledge transfer happen!
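The shared-learning idea above can be sketched in code. This is a hypothetical toy example (not from the slides): two noise-free regression Tasks share one "trunk" weight `w` (hard parameter sharing), each with its own head weight, and gradient descent minimizes the sum of both Tasks' losses. All names and data here are made up for illustration.

```python
# Toy multi-task learning via hard parameter sharing (illustrative sketch).
# Two regression tasks share the trunk weight w; each has its own head.

# Toy data: both tasks are linear in x and share the same underlying
# factor w_true = 2.0, so learning one task helps estimate the trunk.
w_true, head1_true, head2_true = 2.0, 1.5, -0.5
data1 = [(x, head1_true * w_true * x) for x in [0.5, 1.0, 2.0, 3.0]]
data2 = [(x, head2_true * w_true * x) for x in [0.5, 1.0, 2.0, 3.0]]

# Shared trunk weight and per-task head weights, small initialization.
w, h1, h2 = 0.1, 0.1, 0.1
lr = 0.005

def mtl_loss():
    # Joint objective: sum of both tasks' squared errors.
    l1 = sum((h1 * w * x - y) ** 2 for x, y in data1)
    l2 = sum((h2 * w * x - y) ** 2 for x, y in data2)
    return l1 + l2

initial = mtl_loss()
for _ in range(2000):
    # The shared w receives gradient signal from BOTH tasks --
    # this shared update is where "knowledge transfer" happens.
    gw = sum(2 * (h1 * w * x - y) * h1 * x for x, y in data1) \
       + sum(2 * (h2 * w * x - y) * h2 * x for x, y in data2)
    g1 = sum(2 * (h1 * w * x - y) * w * x for x, y in data1)
    g2 = sum(2 * (h2 * w * x - y) * w * x for x, y in data2)
    w, h1, h2 = w - lr * gw, h1 - lr * g1, h2 - lr * g2

print(initial, mtl_loss())
```

The design choice to sum the task losses is the simplest multi-task objective; real systems often weight each task's loss separately.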
13. Online Learning
• Offline learning
• The usual machine learning we already know.
• "Model! I'll hand you this whole training set at once, so go learn!"
• Online learning
• Machine learning in a setting where the data arrives as a stream.
• "Model! The training data is too large to hand over all at once, so I'll give it to you a little at a time. Go learn!"
• "And since what I give you will be too large to store, use the training data and then throw it away!"
• In online learning there are necessarily cases where data, once used, can never be used again.
• This does not mean all data can only be used for 1 epoch!
• It is fine to run several epochs on a received batch of data and then discard it.
[Diagram: offline learning — one Model trained on the full Training Set; online learning — the Model fed a sequence of Training Subsets]