๋ฐ˜์‘ํ˜•

BERT 1

[NLP 1] BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding Paper Review - Introduction & Related Works

#This is material I studied and wrote up on my own. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, https://arxiv.org/abs/1810.04805 (original paper). BERT is one of the most fundamental and important papers in natural language processing. The explanation follows the original paper, and I have researched and added a few Korean-language examples to aid understanding! The review will likely be split into five parts:

- Introduction & Related Works
- Pre-training
- Fine-tuning
- Experiment
- Conclusion + koBERT

BERT is a pre-trained NLP model developed by Google; rather than a technique limited to a specific field, it..
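Before diving into the paper, here is a minimal sketch (not part of the original post) of what "pre-trained model" means in practice, assuming the Hugging Face `transformers` library and the publicly released `bert-base-uncased` checkpoint:

```python
# Minimal sketch: load a pre-trained BERT checkpoint and get contextual
# token representations. The model/tokenizer names are assumptions for
# illustration; the review itself is about the paper, not this library.
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")

# Tokenize a sentence and run it through the pre-trained encoder.
inputs = tokenizer("BERT learns deep bidirectional representations.",
                   return_tensors="pt")
outputs = model(**inputs)

# One contextual vector per input token: shape (1, sequence_length, 768).
print(outputs.last_hidden_state.shape)
```

Fine-tuning (covered in part 3 of this series) starts from exactly these pre-trained weights and adds a small task-specific head on top.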

๋ฐ˜์‘ํ˜•