
AI Buzzword Dictionary (March 2026 Edition), Recommended to Bookmark

Author|Golem (@Web3Golem)


Nowadays, if someone in the crypto space isn’t paying attention to AI, they’re likely to get roasted (yes, my friend, think about why you clicked on this).

Are you completely clueless about the basic concepts of AI, asking Doubao what every abbreviation in a sentence means? Or are you at an AI offline event, utterly confused by all the jargon, trying to pretend you’re still in the loop?

While mastering the AI industry overnight isn’t realistic, knowing the high-frequency basics is never a waste. Fortunately, the article below has you covered ↓ We sincerely recommend reading it through and bookmarking it.

Basic Vocabulary (12)

LLM (Large Language Model)

LLM is a deep learning model trained on massive amounts of data, excelling at understanding and generating language. It processes text and is increasingly capable of handling other types of content.

Its counterpart is the SLM (Small Language Model), which typically denotes language models that are lower cost, lighter to deploy, and easier to run locally.

AI Agent

AI Agent refers not just to a “chatty model,” but to a system that can understand goals, call tools, execute tasks step-by-step, and even plan and verify when necessary. Google defines an agent as software that can reason based on multimodal inputs and perform actions on behalf of the user.

Multimodal

This refers to AI models that don’t just read text but can simultaneously process and generate various forms of input and output like text, images, audio, and video. Google explicitly defines multimodal as the ability to process and generate different types of content.

Prompt

The instruction a user gives to a model, the most fundamental form of human-machine interaction.

Generative AI (AIGC)

Emphasizes AI’s ability to “generate” rather than just classify or predict. Generative models can create text, code, images, memes, videos, and other content based on prompts.

Token

This is one of the concepts in the AI world most akin to a “Gas unit.” Models don’t understand content by “word count” but process input and output by tokens. Billing, context length, and response speed are all strongly tied to tokens.
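The token-as-"Gas-unit" idea can be sketched in a few lines. This is a crude illustration only: real models use subword tokenizers (e.g. BPE), so counts differ from a whitespace split, and the price below is a made-up placeholder, not any vendor’s actual rate.

```python
# Crude illustration of token-based accounting. Real tokenizers split text
# into subword units, so whitespace splitting is only a stand-in.

def count_tokens(text: str) -> int:
    """Approximate token count by splitting on whitespace."""
    return len(text.split())

def estimate_cost(prompt: str, completion: str,
                  price_per_1k_tokens: float = 0.002) -> float:
    """Billing is typically per token, often quoted per 1,000 tokens.
    The price here is a hypothetical placeholder."""
    total = count_tokens(prompt) + count_tokens(completion)
    return total / 1000 * price_per_1k_tokens

print(count_tokens("Models process input and output by tokens"))  # 7
```

The same per-token accounting is why long prompts cost more and respond more slowly.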

Context Window (Context Length)

Refers to the total number of tokens a model can “see” and utilize at once. It can also be described as the number of tokens a model can consider or “remember” during a single processing instance.
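One practical consequence of a finite window: when a conversation outgrows it, the oldest turns must be dropped or summarized before the next call. A minimal sketch, again faking token counts with a whitespace split:

```python
# Sketch of fitting conversation history into a fixed context window.
# Token counting is approximated with a whitespace split for illustration.

def fit_to_window(messages: list[str], max_tokens: int) -> list[str]:
    """Keep the most recent messages whose total token count fits."""
    kept: list[str] = []
    used = 0
    for msg in reversed(messages):       # walk from newest to oldest
        n = len(msg.split())
        if used + n > max_tokens:
            break
        kept.append(msg)
        used += n
    return list(reversed(kept))          # restore chronological order

history = ["turn one is old", "turn two", "turn three is the newest"]
print(fit_to_window(history, 7))  # drops the oldest turn
```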

Memory

Allows a model or Agent to retain user preferences, task context, and historical states.

Training

The process by which a model learns parameters from data.

Inference

Opposite of training, it refers to the process where a deployed model receives input and generates output. The industry often says “training is expensive, inference is even more costly” because many costs in real commercialization occur during inference. The training/inference distinction is also a fundamental framework for mainstream vendors when discussing deployment costs.

Tool Use / Tool Calling

Means the model doesn’t just output text but can call tools like search, code execution, databases, and external APIs. This is now considered a key capability of Agents.
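The shape of tool calling can be sketched as a dispatch loop. Everything here is invented for illustration: the `{"tool": ..., "args": ...}` format and the tool names are assumptions, not any vendor’s actual schema.

```python
# Minimal sketch of tool calling: the model emits a structured request,
# and the runtime routes it to a registered function.

def calculator(expression: str) -> str:
    # Toy "calculator" tool; real agents would sandbox execution.
    allowed = set("0123456789+-*/(). ")
    if not set(expression) <= allowed:
        return "error: unsupported expression"
    return str(eval(expression))

TOOLS = {"calculator": calculator}

def dispatch(tool_call: dict) -> str:
    """Route a model-issued tool call to the matching function."""
    fn = TOOLS.get(tool_call["tool"])
    if fn is None:
        return f"error: unknown tool {tool_call['tool']!r}"
    return fn(**tool_call["args"])

print(dispatch({"tool": "calculator",
                "args": {"expression": "2 * (3 + 4)"}}))  # 14
```

The tool result would then be fed back to the model as context for its next step.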

API

The infrastructure used when AI products, applications, or Agents connect to third-party services.

Advanced Vocabulary (18)

Transformer

A model architecture that makes AI better at understanding contextual relationships. It’s the technical foundation for most large language models today. Its key feature is the ability to simultaneously consider the relationship between each word and all other words in a segment of text.

Attention (Mechanism)

The most crucial core mechanism of the Transformer. Its function is to let the model automatically determine “which words are most worth focusing on” when reading a sentence.
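The mechanism itself is compact enough to show directly. This is the textbook scaled dot-product formula, softmax(q·K / √d) · V, run on toy 2-d vectors; real models apply it across many heads and layers.

```python
import math

# Scaled dot-product attention for one query over a few key/value vectors.
# The numbers are toy values chosen only to show the weighting behavior.

def attention(q, keys, values):
    d = len(q)
    scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
              for k in keys]
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]     # numerically stable softmax
    total = sum(exps)
    weights = [e / total for e in exps]
    # Output is the attention-weighted blend of the value vectors.
    return [sum(w * v[i] for w, v in zip(weights, values))
            for i in range(len(values[0]))]

q = [1.0, 0.0]
keys = [[1.0, 0.0], [0.0, 1.0]]
values = [[10.0, 0.0], [0.0, 10.0]]
out = attention(q, keys, values)
print(out)  # leans toward the first value vector, whose key matched the query
```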

Agentic / Agentic Workflow

A very hot term recently. It means a system is no longer just “Q&A” but has a degree of autonomy to decompose tasks, decide next steps, and call external capabilities. Many vendors see it as a marker of moving “from Chatbot to executable system.”

Subagents

An Agent further broken down into multiple specialized smaller Agents to handle subtasks.

Skills (Reusable Capability Modules)

With the explosion of OpenClaw, this term has become noticeably more common. Skills are installable, reusable, composable capability modules ("operation manuals") for AI Agents, though they also introduce specific risks of tool misuse and data exposure.

Hallucination

Refers to the model confidently generating false or nonsensical output, "perceiving patterns that don’t exist": answers that look plausible but are simply wrong.

Latency

The time it takes for a model to output results after receiving a request. One of the most common engineering buzzwords, frequently appearing in discussions about implementation and productization.

Guardrails

Used to limit what a model/Agent can do, when it should stop, and what content it cannot output.
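In code, the simplest guardrail is a pre-output check. The rules below (a blocklist and a length cap) are invented toy examples; production guardrail stacks involve policy models, allowlists, and human review.

```python
# Toy guardrail: vet a model/agent output against simple rules before it is
# shown to the user or executed. The specific rules are illustrative only.

BLOCKED_TERMS = {"rm -rf", "drop table"}
MAX_CHARS = 500

def check_output(text: str) -> tuple[bool, str]:
    """Return (allowed, reason)."""
    lowered = text.lower()
    for term in BLOCKED_TERMS:
        if term in lowered:
            return False, f"blocked term: {term}"
    if len(text) > MAX_CHARS:
        return False, "output too long"
    return True, "ok"

print(check_output("Here is a summary of the document."))  # allowed
print(check_output("Run `rm -rf /` to free space."))       # blocked
```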

Vibe Coding

Another of today’s hottest AI buzzwords. It means users directly express requirements through conversation, AI writes the code, and the user doesn’t need to specifically know how to code.

Parameters

The internal numerical scale of a model used to store capabilities and knowledge. Often used as a rough measure of model size. “Billions of parameters” and “trillions of parameters” are among the most common intimidating phrases in the AI circle.

Reasoning Model

Typically refers to models more adept at multi-step reasoning, planning, verification, and executing complex tasks.

MCP (Model Context Protocol)

A very hot new buzzword in the past year. Its function is similar to establishing a universal interface between models and external tools/data sources.

Fine-tuning / Tuning

Continuing training on a base model to make it more suitable for specific tasks, styles, or domains. Google’s terminology list directly treats tuning and fine-tuning as related concepts.

Distillation

Compressing the capabilities of a large model into a smaller one as much as possible, akin to having a “teacher” instruct a “student.”

RAG (Retrieval-Augmented Generation)

This has almost become a basic configuration for enterprise AI. Microsoft defines it as a “search + LLM” pattern that uses external data to ground answers, addressing problems like outdated training data or no access to private knowledge bases. The goal is to base answers on real documents and private knowledge, not just the model’s own memory.
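The "search + LLM" pattern reduces to two steps: retrieve relevant text, then build a grounded prompt around it. This sketch uses keyword overlap as a stand-in for retrieval; real systems use vector search over embeddings, and the documents here are invented.

```python
# Bare-bones RAG sketch: retrieve the most relevant document for a
# question, then assemble a prompt grounded in it.

DOCS = [
    "The refund window is 30 days from the date of purchase.",
    "Support is available by email on weekdays.",
]

def retrieve(question: str, docs: list[str]) -> str:
    """Pick the doc sharing the most words with the question (toy scoring)."""
    q_words = set(question.lower().split())
    return max(docs, key=lambda d: len(q_words & set(d.lower().split())))

def build_prompt(question: str) -> str:
    context = retrieve(question, DOCS)
    # The grounded prompt instructs the model to answer from the context.
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

print(build_prompt("How many days do I have to request a refund?"))
```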

Grounding

Often appears together with RAG. It means making the model’s answers based on external sources like documents, databases, web pages, etc., rather than relying solely on parameter memory for “free improvisation.” Microsoft explicitly lists grounding as a core value in its RAG documentation.

Embedding (Vector Embedding / Semantic Vector)

Encoding content like text, images, or audio into high-dimensional numerical vectors to facilitate semantic similarity calculations.
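Once content lives in vector space, "semantic similarity" becomes plain vector math, most commonly cosine similarity. The 3-d vectors below are toy numbers; real embedding models produce vectors with hundreds or thousands of dimensions.

```python
import math

# Cosine similarity: 1.0 means identical direction, near 0 means unrelated.

def cosine_similarity(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = (math.sqrt(sum(x * x for x in a))
            * math.sqrt(sum(y * y for y in b)))
    return dot / norm

# Toy vectors standing in for real embeddings.
cat    = [0.9, 0.1, 0.0]
kitten = [0.85, 0.15, 0.05]
car    = [0.0, 0.2, 0.9]

print(cosine_similarity(cat, kitten))  # high: related meanings
print(cosine_similarity(cat, car))     # low: unrelated meanings
```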

Benchmark

A method of evaluating model capabilities using a unified set of standards. It’s also the leaderboard language that various model vendors love to use to “prove they are strong.”

Recommended Reading

Lobster’s Key 11 Questions: The Most Accessible Breakdown of OpenClaw Principles

This article is sourced from the internet: AI Buzzword Dictionary (March 2026 Edition), Recommended to Bookmark

