AI Glossary
Decode the buzzwords. A simple dictionary for modern AI terminology.
Artificial General Intelligence (AGI)
Concepts: A hypothetical type of AI that can understand, learn, and apply intelligence across a wide range of tasks at or above human level.
Context Window
Architecture: The limit on the amount of text (tokens) a model can consider at one time (input + output combined).
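As a concrete illustration, a capacity check can be sketched in Python. The 4-characters-per-token heuristic and the 8,192-token window below are illustrative assumptions, not any particular model's values; real counts depend on the tokenizer.

```python
def estimate_tokens(text: str) -> int:
    """Very rough token estimate: ~4 characters per token for English.
    Real tokenizers can differ substantially from this heuristic."""
    return max(1, len(text) // 4)

def fits_context(prompt: str, max_output_tokens: int,
                 context_window: int = 8192) -> bool:
    """Input and output share one window, so both must fit together."""
    return estimate_tokens(prompt) + max_output_tokens <= context_window
```

Because the window covers input and output together, a long prompt leaves less room for the model's answer.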
Deep Learning
Concepts: A subset of machine learning based on artificial neural networks with multiple layers, capable of learning complex patterns from large amounts of data.
Fine-tuning
Training: The process of taking a pre-trained model and training it further on a specific dataset to specialize it for a certain task.
Generative AI
Concepts: A broad category of AI systems designed to generate new content, such as text, images, audio, or code, based on patterns learned from existing data.
Hallucination
Issues: When an AI model confidently generates false or nonsensical information and presents it as fact.
Large Language Model (LLM)
Architecture: A type of AI model trained on massive amounts of text data, capable of understanding and generating human-like language.
Machine Learning
Concepts: A branch of AI focused on building systems that learn from data and improve their performance over time without being explicitly programmed.
Multimodal
Capabilities: The ability of a model to process and generate multiple types of media (text, images, audio) simultaneously.
Natural Language Processing (NLP)
Concepts: A field of AI focused on the interaction between computers and human language, enabling machines to read, understand, and derive meaning from text.
Neural Network
Architecture: A computing system inspired by the human brain, consisting of interconnected nodes (neurons) that process information and learn patterns.
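To make the "interconnected nodes" idea concrete, here is a minimal single-neuron sketch in Python; the sigmoid activation is one common choice among several, and the inputs, weights, and bias in any call are arbitrary illustrative values.

```python
import math

def neuron(inputs, weights, bias):
    """One artificial neuron: a weighted sum of its inputs plus a bias,
    passed through a sigmoid activation that squashes the result into (0, 1)."""
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 / (1 + math.exp(-z))
```

A network stacks many such neurons in layers, and learning means adjusting the weights and biases to reduce prediction error.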
Prompt Engineering
Techniques: The practice of designing and refining the input text (prompts) given to an AI model to elicit the most accurate, relevant, or creative response.
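One common pattern is a structured prompt template: an assigned role, a task, and explicit output constraints. A minimal sketch (the role and constraint strings are made-up examples):

```python
def build_prompt(role, task, constraints, text):
    """Assemble a structured prompt: a role, the task, explicit
    output constraints, then the input text."""
    rules = "\n".join(f"- {c}" for c in constraints)
    return (f"You are {role}.\n"
            f"Task: {task}\n"
            f"Constraints:\n{rules}\n"
            f"Text: {text}")
```

Keeping the template separate from the input text makes it easy to refine the instructions without touching the rest of the pipeline.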
RAG
Architecture: Retrieval-Augmented Generation. A technique in which an LLM is supplied with relevant external data (context), retrieved at query time, so it can generate more accurate, grounded answers.
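The retrieve-then-generate flow can be sketched in a few lines of Python. Word overlap here is a toy stand-in for the embedding-based similarity search a real RAG system would use:

```python
def words(text):
    """Normalize to a set of lowercase words, stripping punctuation."""
    return {w.strip(".,?!").lower() for w in text.split()}

def retrieve(query, documents, k=1):
    """Rank documents by word overlap with the query; a toy stand-in
    for real vector-similarity retrieval."""
    return sorted(documents,
                  key=lambda d: len(words(query) & words(d)),
                  reverse=True)[:k]

def rag_prompt(query, documents):
    """Prepend the retrieved context so the model can ground its answer."""
    context = "\n".join(retrieve(query, documents))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"
```

The final prompt is still ordinary text; RAG changes what goes into the context window, not how the model itself works.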
Temperature
Parameters: A parameter that controls the randomness of the model's output. Higher values (e.g., 0.8+) make output more creative/random; lower values (e.g., 0.2) make it more focused/deterministic.
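Mechanically, temperature divides the model's logits before the softmax that turns them into token probabilities. A minimal sketch (the logit values below are made up for illustration):

```python
import math

def softmax_with_temperature(logits, temperature):
    """Scale logits by 1/temperature, then softmax. Low temperature
    sharpens the distribution toward the top token; high temperature
    flattens it toward uniform."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)                      # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]
```

With low temperature the most likely token dominates, which is why low settings feel deterministic; high temperature spreads probability onto less likely tokens.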
Token
Basics: The basic unit of text an LLM processes. One token is roughly 0.75 English words on average; "Hamburger" might be 2-3 tokens.
Zero-shot Learning
Training: The ability of a model to perform a task without having seen any examples of that specific task during training.
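In practice, "zero-shot" also describes the prompt: the task is stated with no worked examples, in contrast to few-shot prompting. A sketch of the two (the task and example strings are illustrative):

```python
def zero_shot_prompt(task, text):
    """No examples: the model must rely entirely on what it learned in training."""
    return f"{task}\n\nInput: {text}\nOutput:"

def few_shot_prompt(task, examples, text):
    """For contrast, few-shot includes worked (input, output) pairs in the prompt."""
    shots = "\n".join(f"Input: {i}\nOutput: {o}" for i, o in examples)
    return f"{task}\n\n{shots}\n\nInput: {text}\nOutput:"
```

The only difference is whether demonstration pairs appear in the prompt; the model weights are identical in both cases.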