
Jun 11, 2018 - Generative Pre-Training, a milestone in language modeling (Alec Radford et al.)

Description:

Alec Radford, Karthik Narasimhan, Tim Salimans, and Ilya Sutskever from OpenAI introduce the concept of Generative Pre-Training, a two-step approach to language model training that includes unsupervised pre-training followed by task-specific supervised fine-tuning. This approach leads to the development of GPT (Generative Pre-trained Transformer) models, which achieve state-of-the-art performance on various natural language processing tasks. GPT and its successors, GPT-2 and GPT-3, demonstrate remarkable language understanding and generation capabilities, shaping the future of AI-generated content and NLP applications.

https://openai.com/research/language-unsupervised
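The two-step recipe can be illustrated in miniature. Below is a minimal sketch, assuming PyTorch; TinyGPT (a one-block causal Transformer), the random toy corpus, and the two-class head are illustrative stand-ins for the paper's 12-layer decoder, its book-corpus pre-training data, and a downstream task, not OpenAI's actual code:

import torch
import torch.nn as nn

torch.manual_seed(0)
VOCAB, DIM, MAXLEN = 50, 32, 16

class TinyGPT(nn.Module):
    # One causal Transformer block standing in for the paper's 12-layer decoder.
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, DIM)
        self.pos = nn.Parameter(torch.zeros(MAXLEN, DIM))  # learned positions
        self.block = nn.TransformerEncoderLayer(d_model=DIM, nhead=4,
                                                dim_feedforward=64, batch_first=True)
        self.lm_head = nn.Linear(DIM, VOCAB)

    def forward(self, x):
        n = x.size(1)
        # Causal mask so each position attends only to earlier tokens.
        causal = torch.triu(torch.full((n, n), float("-inf")), diagonal=1)
        h = self.block(self.embed(x) + self.pos[:n], src_mask=causal)
        return self.lm_head(h), h

model = TinyGPT()
xent = nn.CrossEntropyLoss()

# Step 1: unsupervised pre-training -- predict the next token on unlabeled text.
corpus = torch.randint(0, VOCAB, (64, MAXLEN))       # placeholder for a real corpus
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(100):
    logits, _ = model(corpus[:, :-1])
    loss = xent(logits.reshape(-1, VOCAB), corpus[:, 1:].reshape(-1))
    opt.zero_grad(); loss.backward(); opt.step()

# Step 2: supervised fine-tuning -- reuse the pre-trained body, add a task head.
head = nn.Linear(DIM, 2)                             # e.g. a 2-class labeling task
x = torch.randint(0, VOCAB, (32, MAXLEN))            # fake labeled inputs
y = torch.randint(0, 2, (32,))                       # fake labels
opt = torch.optim.Adam(list(model.parameters()) + list(head.parameters()), lr=1e-4)
for _ in range(50):
    _, h = model(x)
    loss = xent(head(h[:, -1]), y)                   # last hidden state summarizes the sequence
    opt.zero_grad(); loss.backward(); opt.step()

In the paper itself, the fine-tuning loss also keeps the language-modeling objective as an auxiliary term; the sketch omits that for brevity.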

Added to the timeline:

Apr 18, 2023

Date:

Jun 11, 2018
Now
~ 7 years and 3 months ago