---
My summary of the paper “Unified Language Model Pre-training for Natural Language Understanding and Generation”
ROBIN DONG, posted on 2021-08-10 08:56
For NLU (Natural Language Understanding), we use a bidirectional language model (like BERT), but for NLG (Natural Language Generation), a left-to-right unidirectional language model (like GPT) is the only choice.
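The contrast between the two model families comes down to the self-attention mask. A minimal sketch with NumPy (the helper names are mine, not from the paper): a bidirectional model lets every token attend to every other token, while a left-to-right model restricts each token to itself and earlier positions.

```python
import numpy as np

def bidirectional_mask(n):
    # Bidirectional LM (BERT-style): every token can attend to all n tokens.
    return np.ones((n, n), dtype=int)

def left_to_right_mask(n):
    # Unidirectional LM (GPT-style): token i attends only to tokens 0..i,
    # i.e. a lower-triangular mask.
    return np.tril(np.ones((n, n), dtype=int))

print(bidirectional_mask(3))
# [[1 1 1]
#  [1 1 1]
#  [1 1 1]]
print(left_to_right_mask(3))
# [[1 0 0]
#  [1 1 0]
#  [1 1 1]]
```

UniLM's key idea is that these (and a sequence-to-sequence variant) are just different masks applied to one shared Transformer, so a single pre-trained network can serve both NLU and NLG.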
Could we accomplish the
Full text: http://www.udpwork.com/item/18130.html