11 Jun 2018 - Generative Pre-Training, a milestone in language modeling (Alec Radford et al.)

Description:

Alec Radford, Karthik Narasimhan, Tim Salimans, and Ilya Sutskever of OpenAI introduce generative pre-training, a two-step approach to language model training: unsupervised pre-training on a large text corpus, followed by task-specific supervised fine-tuning. This approach leads to the GPT (Generative Pre-trained Transformer) family of models, which achieve state-of-the-art results on a range of natural language processing tasks. GPT and its successors, GPT-2 and GPT-3, demonstrate remarkable language understanding and generation capabilities, shaping subsequent work on AI-generated content and NLP applications.
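The two-step recipe is easy to state in code. Below is a minimal, illustrative PyTorch sketch of the idea, not the paper's implementation: the model sizes, layer choices, and toy data are all assumptions made here for brevity (the actual GPT was a 12-layer Transformer decoder trained on BooksCorpus with byte-pair-encoded text).

```python
# Minimal sketch of the two-step GPT recipe (illustrative, not the paper's code):
# (1) unsupervised next-token pre-training, (2) supervised fine-tuning of the
# same body with a small task head. All sizes and data below are toy assumptions.
import torch
import torch.nn as nn

class TinyGPT(nn.Module):
    def __init__(self, vocab_size=100, d_model=64, n_heads=4, n_layers=2, max_len=32):
        super().__init__()
        self.tok = nn.Embedding(vocab_size, d_model)
        self.pos = nn.Embedding(max_len, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads,
                                           dim_feedforward=128, batch_first=True)
        self.blocks = nn.TransformerEncoder(layer, n_layers)
        self.lm_head = nn.Linear(d_model, vocab_size)  # used in pre-training
        self.cls_head = nn.Linear(d_model, 2)          # used in fine-tuning

    def features(self, x):
        T = x.size(1)
        h = self.tok(x) + self.pos(torch.arange(T, device=x.device))
        # Causal mask so each position only attends to earlier positions.
        mask = nn.Transformer.generate_square_subsequent_mask(T).to(x.device)
        return self.blocks(h, mask=mask)

model = TinyGPT()
opt = torch.optim.Adam(model.parameters(), lr=3e-4)

# Step 1: unsupervised pre-training -- predict token t+1 from tokens <= t.
tokens = torch.randint(0, 100, (8, 16))   # toy stand-in for an unlabeled corpus
h = model.features(tokens[:, :-1])
lm_loss = nn.functional.cross_entropy(
    model.lm_head(h).reshape(-1, 100), tokens[:, 1:].reshape(-1))
lm_loss.backward(); opt.step(); opt.zero_grad()

# Step 2: supervised fine-tuning -- reuse the pre-trained body and train a
# small task-specific head (here: binary classification) on labeled examples.
labels = torch.randint(0, 2, (8,))        # toy task labels
h = model.features(tokens)
clf_loss = nn.functional.cross_entropy(model.cls_head(h[:, -1]), labels)
clf_loss.backward(); opt.step(); opt.zero_grad()
```

The point the paper stresses is visible in the sketch: the same pre-trained Transformer body is reused across tasks, and only a lightweight head (plus an input transformation) is added per task during fine-tuning.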

https://openai.com/research/language-unsupervised

Added to timeline:

18 Apr 2023

Date:

11 Jun 2018