Paper Review - BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
In this post, I'm going to review the BERT paper. It is written assuming the reader already has a basic background in machine learning and is familiar with the Transformer architecture.

Paper: https://arxiv.org/abs/1810.04805
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding

"We introduce a new language representation model called BERT, which stands for Bidirectional Encoder Representations from Transformers. Unlike recent language representation models, BERT is designed to pre-train .."