GPT-3.5 in the context of Generative pre-trained transformer

A generative pre-trained transformer (GPT) is a type of large language model (LLM) widely used in generative AI chatbots. GPTs are based on a deep learning architecture called the transformer. They are pre-trained on large datasets of unlabeled content and are able to generate novel content.
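The transformer architecture mentioned above is built around self-attention. The sketch below is a minimal NumPy illustration of single-head, causally masked scaled dot-product attention (the core operation of a GPT-style decoder); the dimensions, weight matrices, and inputs are made up for demonstration, and this is not OpenAI's implementation.

```python
# Minimal sketch of scaled dot-product self-attention with a causal mask,
# the core operation of a GPT-style transformer decoder (illustrative only).
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """One attention head over a sequence of token embeddings X."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)          # pairwise token similarity
    # Causal mask: a GPT-style decoder attends only to earlier positions.
    mask = np.triu(np.ones_like(scores), k=1).astype(bool)
    scores[mask] = -np.inf
    weights = softmax(scores)                # each row sums to 1
    return weights @ V                       # weighted mix of value vectors

rng = np.random.default_rng(0)
seq_len, d_model = 4, 8                      # 4 tokens, 8-dim embeddings (arbitrary)
X = rng.normal(size=(seq_len, d_model))
Wq, Wk, Wv = (rng.normal(size=(d_model, d_model)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)                             # (4, 8): one vector per token
```

Because of the causal mask, the first token can attend only to itself, so its output is exactly its own value vector; a real GPT stacks many such heads and layers and learns the weight matrices during pre-training.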

OpenAI was the first to apply generative pre-training to the transformer architecture, introducing the GPT-1 model in 2018. The company has since released many bigger GPT models. The popular chatbot ChatGPT, released in late 2022 (using GPT-3.5), was followed by many competitor chatbots using their own generative pre-trained transformers to generate text, such as Gemini, DeepSeek or Claude.


GPT-3.5 in the context of GPT-4

Generative Pre-trained Transformer 4 (GPT-4) is a large language model developed by OpenAI and the fourth in its series of GPT foundation models.

GPT-4 is more capable than its predecessor, GPT-3.5, and was followed by its successor, GPT-5. GPT-4V is a version of GPT-4 that can process images in addition to text. OpenAI has not revealed technical details and statistics about GPT-4, such as the precise size of the model.
