
My summary for the paper “Unified Language Model Pre-training for Natural Language Understanding and Generation”

Tags: machine learning | study | BERT | Transformer
ROBIN DONG, posted on 2021-08-10 08:56
For NLU (Natural Language Understanding) we use a bidirectional language model (like BERT), but for NLG (Natural Language Generation) the left-to-right unidirectional language model (like GPT) is the only choice.

Could we accomplish both tasks with a single unified pre-trained model?
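The paper's answer (UniLM) is to share one Transformer across tasks and switch only the self-attention mask. Below is a minimal PyTorch sketch of the two masks contrasted above; the function and variable names are my own, and UniLM's additional sequence-to-sequence mask is omitted:

```python
import torch

def attention_mask(seq_len: int, mode: str) -> torch.Tensor:
    """Build the self-attention mask that selects the LM flavour.

    mode="bidirectional": every token attends to every token (BERT-style, NLU).
    mode="left-to-right": token i attends only to tokens 0..i (GPT-style, NLG).
    Returns a (seq_len, seq_len) matrix of 0.0 (allowed) and -inf (blocked),
    to be added to the attention logits before the softmax.
    """
    if mode == "bidirectional":
        return torch.zeros(seq_len, seq_len)
    if mode == "left-to-right":
        # Block the strict upper triangle, i.e. all future positions.
        mask = torch.full((seq_len, seq_len), float("-inf"))
        return torch.triu(mask, diagonal=1)
    raise ValueError(f"unknown mode: {mode}")

# The same Transformer weights serve both tasks; only the mask differs.
logits = torch.randn(4, 4)  # toy attention scores for a 4-token sequence
nlu_attn = torch.softmax(logits + attention_mask(4, "bidirectional"), dim=-1)
nlg_attn = torch.softmax(logits + attention_mask(4, "left-to-right"), dim=-1)
```

In `nlg_attn`, each row's probability mass falls only on current and earlier positions, which is exactly what makes left-to-right generation possible with the same shared weights.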

Full text: http://www.udpwork.com/item/18130.html