Dataset (machine learning) in the context of Generative pre-trained transformer

A generative pre-trained transformer (GPT) is a type of large language model (LLM) widely used in generative AI chatbots. GPTs are based on a deep learning architecture called the transformer. They are pre-trained on large datasets of unlabeled text and can then generate novel content.
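
For illustration, here is a minimal Python sketch of how unlabeled text becomes a self-supervised pre-training dataset (the whitespace tokenizer, toy corpus, and fixed context window are assumptions for the example, not details from the source). Each training example pairs a window of tokens with the token that follows it, so the "labels" come from the raw data itself rather than from human annotation.

```python
# Minimal sketch: turning unlabeled text into next-token-prediction
# examples. Assumptions: whitespace tokenization, a toy corpus, and a
# fixed context window; real GPT pipelines use subword tokenizers and
# far larger corpora.

from typing import Iterator

def build_vocab(corpus: list[str]) -> dict[str, int]:
    """Map every distinct token in the corpus to an integer id."""
    tokens = sorted({tok for doc in corpus for tok in doc.split()})
    return {tok: i for i, tok in enumerate(tokens)}

def examples(corpus: list[str], vocab: dict[str, int],
             context: int = 4) -> Iterator[tuple[list[int], int]]:
    """Yield (input ids, next-token id) pairs from raw, unlabeled text."""
    for doc in corpus:
        ids = [vocab[tok] for tok in doc.split()]
        for i in range(len(ids) - context):
            # The target is simply the token that follows the window:
            # no annotation step is needed.
            yield ids[i:i + context], ids[i + context]

corpus = ["the model predicts the next token given the previous tokens"]
vocab = build_vocab(corpus)
for x, y in examples(corpus, vocab):
    print(x, "->", y)
```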

OpenAI was the first to apply generative pre-training to the transformer architecture, introducing the GPT-1 model in 2018. The company has since released a series of larger GPT models. The popular chatbot ChatGPT, released in late 2022 and initially based on GPT-3.5, was followed by many competing chatbots built on their own generative pre-trained transformers, such as Gemini, DeepSeek, and Claude.
