June 11, 2018 - Generative Pre-Training, a milestone in language modeling (Alec Radford et al.)

Description:

Alec Radford, Karthik Narasimhan, Tim Salimans, and Ilya Sutskever from OpenAI introduce the concept of Generative Pre-Training, a two-step approach to language model training that includes unsupervised pre-training followed by task-specific supervised fine-tuning. This approach leads to the development of GPT (Generative Pre-trained Transformer) models, which achieve state-of-the-art performance on various natural language processing tasks. GPT and its successors, GPT-2 and GPT-3, demonstrate remarkable language understanding and generation capabilities, shaping the future of AI-generated content and NLP applications.

https://openai.com/research/language-unsupervised
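
The two-phase recipe described above can be pictured as one Transformer backbone trained twice. The sketch below is only an illustration of that pattern, not the authors' code: it uses PyTorch, a deliberately tiny model, and random toy tensors standing in for a real unlabeled corpus and a real labeled task. Phase 1 optimizes a causal next-token language-modeling loss; phase 2 reuses the pre-trained weights and fine-tunes a small classification head on labeled examples.

# Illustrative sketch of the GPT two-phase training recipe (assumed sizes,
# toy data, and hyperparameters; not OpenAI's implementation).
import torch
import torch.nn as nn

VOCAB, DIM, CLASSES = 100, 32, 2

class TinyCausalTransformer(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, DIM)
        layer = nn.TransformerEncoderLayer(DIM, nhead=4, batch_first=True)
        self.blocks = nn.TransformerEncoder(layer, num_layers=2)
        self.lm_head = nn.Linear(DIM, VOCAB)    # next-token prediction head
        self.cls_head = nn.Linear(DIM, CLASSES) # task head added for fine-tuning

    def forward(self, ids):
        seq = ids.size(1)
        # Causal mask so each position only attends to earlier tokens.
        mask = torch.triu(torch.full((seq, seq), float("-inf")), diagonal=1)
        return self.blocks(self.embed(ids), mask=mask)

model = TinyCausalTransformer()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

# Phase 1: unsupervised pre-training, predict token t+1 from tokens <= t.
tokens = torch.randint(0, VOCAB, (8, 16))       # toy "unlabeled corpus"
hidden = model(tokens[:, :-1])
loss_lm = nn.functional.cross_entropy(
    model.lm_head(hidden).reshape(-1, VOCAB), tokens[:, 1:].reshape(-1))
loss_lm.backward(); opt.step(); opt.zero_grad()

# Phase 2: supervised fine-tuning, classify each sequence from the
# representation of its last token, reusing the pre-trained weights.
labels = torch.randint(0, CLASSES, (8,))        # toy task labels
hidden = model(tokens)
loss_cls = nn.functional.cross_entropy(model.cls_head(hidden[:, -1]), labels)
loss_cls.backward(); opt.step(); opt.zero_grad()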

Added to the timeline:

April 18, 2023

Date:

June 11, 2018
Now
~ 7 years ago