Research
Research projects, papers, books, etc.
- Papers
Mitigating the Diminishing Effect of Elastic Weight Consolidation
- #NLP
- #Other
The 29th International Conference on Computational Linguistics (COLING 2022)
Elastic weight consolidation (EWC, Kirkpatrick et al. 2017) is a promising approach to addressing catastrophic forgetting in sequential training. We find that the effect of EWC can diminish when fine-tuning large-scale pre-trained language models on different datasets. We present two simple objective functions to mitigate this problem by rescaling the components of EWC. Experiments on natural language inference and fact-checking tasks indicate that our methods require much smaller values for the trade-off parameters to achieve results comparable to EWC.
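The EWC objective referenced in the abstract adds a quadratic penalty that anchors parameters important to the previous task. As a minimal sketch of that standard regularizer (Kirkpatrick et al. 2017) — not the rescaled variants proposed in this paper — the penalty can be computed as follows, where `lam` is the trade-off parameter the abstract discusses:

```python
import numpy as np

def ewc_penalty(params, old_params, fisher, lam):
    """Standard EWC regularizer: (lam / 2) * sum_i F_i * (theta_i - theta*_i)^2.

    params:     current task's parameters (theta)
    old_params: parameters after the previous task (theta*)
    fisher:     diagonal Fisher information estimated on the previous task
    lam:        trade-off parameter balancing old and new tasks
    """
    params = np.asarray(params, dtype=float)
    old_params = np.asarray(old_params, dtype=float)
    fisher = np.asarray(fisher, dtype=float)
    return 0.5 * lam * np.sum(fisher * (params - old_params) ** 2)
```

The paper's contribution is to rescale the components of this penalty so that a much smaller `lam` suffices; the exact rescaling is defined in the paper itself.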