Jun 11, 2018 - Generative Pre-Training, a milestone in language modeling (Alec Radford et al.)

Description:

Alec Radford, Karthik Narasimhan, Tim Salimans, and Ilya Sutskever from OpenAI introduce the concept of Generative Pre-Training, a two-step approach to language model training consisting of unsupervised pre-training on unlabeled text followed by task-specific supervised fine-tuning. This approach leads to the development of the GPT (Generative Pre-trained Transformer) models, which achieve state-of-the-art performance on a range of natural language processing tasks. GPT and its successors, GPT-2 and GPT-3, demonstrate remarkable language understanding and generation capabilities, shaping the future of AI-generated content and NLP applications.

https://openai.com/research/language-unsupervised
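
Since the entry describes the two-phase recipe only in prose, here is a minimal sketch of the flow, assuming PyTorch and toy random data: a causal language model is first trained on unlabeled token streams, then its body is reused with a small classification head for supervised fine-tuning. The TinyGPT module, its hyperparameters, and the use of nn.TransformerEncoder with a causal mask are illustrative stand-ins, not the paper's actual 12-layer decoder-only setup or BooksCorpus training data.

import torch
import torch.nn as nn

VOCAB, D_MODEL, SEQ_LEN, N_CLASSES = 100, 64, 16, 2

class TinyGPT(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, D_MODEL)  # positional embeddings omitted for brevity
        layer = nn.TransformerEncoderLayer(D_MODEL, nhead=4, batch_first=True)
        self.body = nn.TransformerEncoder(layer, num_layers=2)
        self.lm_head = nn.Linear(D_MODEL, VOCAB)       # phase 1: next-token prediction
        self.clf_head = nn.Linear(D_MODEL, N_CLASSES)  # phase 2: task-specific head

    def forward(self, x, task=False):
        # Causal mask: each position attends only to itself and earlier positions.
        n = x.size(1)
        mask = torch.triu(torch.full((n, n), float("-inf")), diagonal=1)
        h = self.body(self.embed(x), mask=mask)
        return self.clf_head(h[:, -1]) if task else self.lm_head(h)

model = TinyGPT()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
ce = nn.CrossEntropyLoss()

# Phase 1: unsupervised pre-training on raw token streams (random toy data here).
tokens = torch.randint(0, VOCAB, (8, SEQ_LEN + 1))
logits = model(tokens[:, :-1])                 # predict token t+1 from tokens up to t
loss = ce(logits.reshape(-1, VOCAB), tokens[:, 1:].reshape(-1))
loss.backward()
opt.step()
opt.zero_grad()

# Phase 2: supervised fine-tuning on labeled examples, reusing the pre-trained body.
x = torch.randint(0, VOCAB, (8, SEQ_LEN))
y = torch.randint(0, N_CLASSES, (8,))
loss = ce(model(x, task=True), y)
loss.backward()
opt.step()

The key design point the sketch illustrates is weight reuse: both phases update the same embedding and transformer body, so the fine-tuned classifier inherits everything learned during pre-training.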

Added to the timeline:

Apr 18, 2023

Date:

Jun 11, 2018