GPT-3 - Wikipedia
https://en.wikipedia.org/wiki/GPT-3
Generative Pre-trained Transformer 3 (GPT-3) is an autoregressive language model that uses deep learning to produce human-like text. It is the third-generation language prediction model in the GPT-n series (and the successor to GPT-2) created by OpenAI, a San Francisco-based artificial intelligence research laboratory.
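"Autoregressive" means each next token is sampled from a distribution conditioned on everything generated so far. A minimal sketch of that decoding loop, using a toy hand-written transition table in place of the real network (the table and function names are illustrative assumptions, not GPT-3 itself):

```python
# Toy stand-in for a language model: maps the last token to the next one.
# (Hypothetical table for illustration; GPT-3 conditions on the full prefix
# with a transformer, not a lookup.)
TOY_MODEL = {
    "<s>": "the",
    "the": "cat",
    "cat": "sat",
    "sat": "<eos>",
}

def generate(model, max_tokens=10):
    """Autoregressive decoding: extend the sequence one token at a time."""
    tokens = ["<s>"]
    for _ in range(max_tokens):
        nxt = model.get(tokens[-1], "<eos>")  # condition on generated prefix
        if nxt == "<eos>":                    # stop at end-of-sequence token
            break
        tokens.append(nxt)
    return " ".join(tokens[1:])               # drop the start marker

print(generate(TOY_MODEL))  # → the cat sat
```

The same loop structure underlies real decoding; only the next-token distribution (and a sampling rule such as temperature or top-p) changes.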
GPT-3 Summary - Zhihu
https://zhuanlan.zhihu.com/p/165882989
GPT-3 achieves strong performance on many NLP datasets, including translation, question-answering, and cloze tasks, as well as several tasks that require on-the-fly reasoning or domain adaptation, such as unscrambling words, using a novel word in a sentence, or performing 3-digit arithmetic. We find that GPT-3 can generate samples of news articles which human evaluators have difficulty distinguishing from articles written by humans.
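Tasks like 3-digit arithmetic are posed to GPT-3 through few-shot prompting: worked demonstrations are packed into the prompt and the model completes the final line. A sketch of that prompt format, with no API call made; the question template is an assumption for illustration, not the paper's exact wording:

```python
def build_prompt(examples, query):
    """Assemble a few-shot prompt: solved demonstrations, then an open query.

    examples: list of (a, b) addend pairs shown with their answers.
    query:    (a, b) pair the model is asked to complete.
    (Template text is a hypothetical stand-in for the paper's prompts.)
    """
    lines = [f"Q: What is {a} plus {b}? A: {a + b}" for a, b in examples]
    lines.append(f"Q: What is {query[0]} plus {query[1]}? A:")
    return "\n".join(lines)

prompt = build_prompt([(123, 456), (710, 205)], (381, 218))
print(prompt)
```

The model sees the pattern in the demonstrations and, without any weight updates, continues the last line with the answer; that in-context adaptation is what the abstract calls on-the-fly reasoning.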