📌 Table of Contents
- Why talk about agents?
- n8n essentials at a glance
- Walk-through: a Home-Inventory Agent
- Scaling up with sub-agents
- Best practices & tips
- Where to go next
- Conclusion
1. Why talk about agents?
Most automation tutorials stick to workflows—a linear series of steps where every output path is pre-wired. Useful, but rigid.
By contrast, an agentic system pairs a Large Language Model (LLM) with a toolbox of actions; at run time the model decides which tool(s) to invoke and how to combine the results. Think of:
| Workflow | Agent |
|---|---|
| Post-purchase email ✔️ | Customer-support LLM that can open tickets, pull orders, and issue refunds 🛠️ |
n8n (pronounced "n-eight-n") is one of the first open-source platforms to treat both paradigms as first-class citizens. Its Advanced AI Agent node drops an LLM into any workflow and lets you bolt on tools à la carte.
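To make that distinction concrete, here is a minimal, hypothetical tool-selection loop in JavaScript. Every name in it (the tools, the `decide` callback) is illustrative rather than part of n8n:

```javascript
// Illustrative agent loop (all names hypothetical): the model, not the
// workflow author, decides which tool to call next.
const tools = {
  lookupOrder: async ({ orderId }) => ({ orderId, status: "shipped" }),
  issueRefund: async ({ orderId }) => ({ orderId, refunded: true }),
};

async function runAgent(decide, userMessage) {
  // `decide` stands in for an LLM call that returns either
  // { type: "tool", tool, args } or { type: "answer", text }.
  const history = [{ role: "user", content: userMessage }];
  for (let step = 0; step < 5; step++) { // cap the number of tool calls
    const choice = await decide(history, Object.keys(tools));
    if (choice.type === "answer") return choice.text;
    const result = await tools[choice.tool](choice.args);
    history.push({ role: "tool", name: choice.tool, content: JSON.stringify(result) });
  }
  return "Gave up after too many steps.";
}
```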
2. n8n essentials at a glance
| Concept | What it is | Why it matters for agents |
|---|---|---|
| Workspace | Cloud or self-hosted project root | Holds your credentials, executions, and multiple workflows |
| Node | A single step in a workflow | Five families: Triggers, Actions, Utilities, Code, Advanced AI |
| Execution | A run of a workflow/agent | Every chat message or API hit shows up here for debugging |
| Credentials | API keys, OAuth tokens, etc. | Centralised; reuse the same OpenAI key in many agents |
The five node families
- Triggers · Start the flow (cron, webhook, chat, etc.).
- Actions · Concrete API calls (Gmail, Telegram, Airtable…).
- Utilities · Branching, data transforms, storage.
- Code · JavaScript, HTTP Request, custom Webhook.
- Advanced AI · Chat Model and AI Agent nodes—where the magic lives.
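Whatever mix of families you use, a workflow exports to JSON with a `nodes` array and a `connections` map. The sketch below (written as a JavaScript object) is deliberately simplified, and the `type` strings are placeholders, since real exports use longer, versioned identifiers:

```javascript
// Simplified sketch of an exported n8n workflow. The "type" values are
// placeholders; real exports use longer, versioned type identifiers.
const workflow = {
  nodes: [
    { name: "On Chat Message", type: "chatTrigger", parameters: {} }, // Trigger family
    { name: "AI Agent", type: "agent", parameters: {} },              // Advanced AI family
  ],
  connections: {
    "On Chat Message": { main: [[{ node: "AI Agent", type: "main", index: 0 }]] },
  },
};
```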
3. Walk-through: a Home-Inventory Agent
We'll replicate the demo from the video: a chat-based agent that searches and updates an Airtable stock list.
Prerequisites
- n8n Cloud workspace (or self-hosted ≥ v1.50)
- OpenAI (or Anthropic, Groq, etc.) API key
- Airtable base with fields: Item Name, Current Qty, Order Threshold
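For reference, a row in that base might look like the object below; the item and numbers are invented sample values:

```javascript
// Illustrative Home Inventory record (field names from the prerequisites,
// values invented for the example).
const sampleItem = {
  "Item Name": "Toothpaste",
  "Current Qty": 1,
  "Order Threshold": 2, // reorder once Current Qty falls below this
};
```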
3.1 Create the skeleton
1. Trigger → On Chat Message
   Set to "Public" so anyone with the URL can chat.
2. Advanced AI → AI Agent
   Think of this node as the agent's "brain".
3. Chat Model inside the agent
   Provider: OpenAI, Model: gpt-4o
4. Memory → Window Buffer
   Store the last 5 turns so the agent remembers context across messages.
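A window buffer is simply a sliding list of the most recent conversation turns. The sketch below shows the idea in plain JavaScript; it mirrors what the Window Buffer memory does conceptually, not n8n's actual implementation:

```javascript
// Minimal sketch of window-buffer memory: keep only the last N turns.
class WindowBufferMemory {
  constructor(windowSize = 5) {
    this.windowSize = windowSize;
    this.turns = []; // each turn: { user, agent }
  }
  add(user, agent) {
    this.turns.push({ user, agent });
    if (this.turns.length > this.windowSize) this.turns.shift(); // drop the oldest turn
  }
  asContext() {
    return this.turns
      .map((t) => `User: ${t.user}\nAgent: ${t.agent}`)
      .join("\n");
  }
}
```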
3.2 Add the first tool – Search Airtable
Inside the AI Agent node → Tools → ➕
| Setting | Value |
|---|---|
| Type | Airtable |
| Operation | Search |
| Credential | Personal access token with scopes schema.bases:read and data.records:read |
| Tool description | Searches my "Home Inventory" Airtable base for items and quantities. |
Now the LLM can decide, "User asked 'What's out of stock?'—I'll run the Search tool."
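Under the hood, that search amounts to a filtered read against Airtable's REST API. Here is a plain-JavaScript sketch of the equivalent request (not what the n8n node literally runs); the base ID, table name, and token are placeholders:

```javascript
// Sketch: find items that are out of stock via Airtable's REST API.
// AIRTABLE_TOKEN, BASE_ID, and the table name are placeholders.
async function searchOutOfStock() {
  const url =
    `https://api.airtable.com/v0/${process.env.BASE_ID}/Home%20Inventory` +
    `?filterByFormula=${encodeURIComponent("{Current Qty} = 0")}`;
  const res = await fetch(url, {
    headers: { Authorization: `Bearer ${process.env.AIRTABLE_TOKEN}` },
  });
  const { records } = await res.json();
  return records.map((r) => ({ id: r.id, ...r.fields }));
}
```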
3.3 Add the second tool – Update records
Add another Airtable tool:
| Setting | Value |
|---|---|
| Operation | Update |
| Credential | Same personal access token, with the data.records:write scope added |
| Tool description | Updates item quantities in the Home Inventory base. |
Inside the field mapper, use $fromAI expressions so the model injects parameters dynamically:

```
id: {{ $fromAI("record_id", "The Airtable record ID of the item to update") }}
Current Qty: {{ $fromAI("new_qty", "New quantity for this item", "number") }}
```
When you say, "I just bought two tubes of toothpaste—update my inventory," the agent:
- Searches the base to find the record_id for toothpaste.
- Calls the Update tool with new_qty = previous + 2.
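The update half is equivalent to a PATCH against the same REST API. Again, a plain-JavaScript sketch with placeholder identifiers rather than the node's actual code:

```javascript
// Sketch: set a new quantity on one record via Airtable's REST API.
// BASE_ID, recordId, and the token are placeholders.
async function updateQuantity(recordId, newQty) {
  const url = `https://api.airtable.com/v0/${process.env.BASE_ID}/Home%20Inventory/${recordId}`;
  const res = await fetch(url, {
    method: "PATCH",
    headers: {
      Authorization: `Bearer ${process.env.AIRTABLE_TOKEN}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ fields: { "Current Qty": newQty } }),
  });
  return res.json();
}
```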
3.4 Test the loop
Chat window →
- User: Is anything out of stock?
- Agent: Butter has Current Qty = 0.
- User: I picked up 2 butters, please update.
- Agent: Updated Butter → Current Qty = 2.
No hard-coded paths; the agent chose both tools on its own.
4. Scaling up with sub-agents
Large projects rarely fit in one canvas. n8n lets you:
- Convert a workflow into a callable tool (Trigger → When Called by Another Workflow).
- Nest agents – A "front-door" agent can route intents to specialised sub-agents:
```
Customer Agent
├── Inventory Agent (Airtable tools)
├── Order Agent (Stripe, Shopify)
└── FAQ Agent (Vector search, RAG)
```
Each sub-agent has its own memory, prompt, and credentials—yet the parent agent uses a single Call Workflow tool to orchestrate them.
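Conceptually, the front-door agent's routing boils down to something like the sketch below. The `callWorkflow` helper and the workflow IDs are placeholders standing in for n8n's Call Workflow tool, not a real API:

```javascript
// Stand-in for n8n's "Call Workflow" tool (placeholder implementation).
async function callWorkflow(workflowId, payload) {
  console.log(`calling ${workflowId} with`, payload);
  return { workflowId, handled: true };
}

// Hypothetical routing logic inside a "front-door" agent.
async function routeIntent(intent, payload) {
  const subAgents = {
    inventory: "wf_inventory_agent",
    order: "wf_order_agent",
    faq: "wf_faq_agent",
  };
  const workflowId = subAgents[intent];
  if (!workflowId) return "Sorry, I can't help with that yet.";
  // Each sub-agent keeps its own memory, prompt, and credentials.
  return callWorkflow(workflowId, payload);
}
```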
5. Best practices & tips
| Tip | Why |
|---|---|
| Write rich tool descriptions. | The LLM's planner relies on them to pick the right action. |
| Start with small context windows. | Shorter memories are cheaper and reduce hallucinations. |
| Guard mutating operations (see the sketch below). | Add an IF node or a human-approval step before updates that spend money or hit prod systems. |
| Log executions. | n8n's Executions tab captures every agent decision, which is vital for prompt tuning. |
| Version your prompts. | Store long system prompts in an n8n Static Data node or pull from Git. |
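To make the guard tip concrete: a check like the one below could sit in a Code or IF node ahead of the Airtable Update tool. The field name comes from the demo base; the 100-unit sanity limit is an arbitrary assumption:

```javascript
// Sketch of a guard before a mutating update (e.g. in a Code node placed
// ahead of the Airtable Update step). Field names match the demo base;
// the 100-unit sanity limit is an arbitrary example.
function guardUpdate(item) {
  const newQty = item["new_qty"];
  if (!Number.isInteger(newQty) || newQty < 0) {
    throw new Error(`Refusing update: invalid quantity ${newQty}`);
  }
  if (newQty > 100) {
    // Suspiciously large jump: route to a human-approval branch instead.
    return { approved: false, reason: "quantity above sanity limit" };
  }
  return { approved: true };
}
```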
6. Where to go next
- n8n Docs – Advanced AI · Deep-dive on fromAI, vector stores, and function calling.
- AI Foundations Community (featured in the video) · 25-module course + weekly calls on LLM ops.
- Source-control your workflows · n8n supports JSON export; check it into Git to diff prompt changes.
Conclusion
Agents turn automation upside-down: instead of us defining the path, we define capabilities and let the model think. n8n's new AI Agent node, paired with a vast catalog of Triggers and Actions, makes it one of the fastest ways to prototype and ship production-grade agentic systems—no heavyweight orchestration layer required.
Spin up a workspace, wire in a chat model, and give your LLM a toolbox. Your first autonomous workflow is only a few nodes away. Happy building!