Dify API Guide 2026: Integrate AI into Your App with REST
The Dify REST API lets you call any Dify app from your own code — embed a chatbot in your website, automate document processing, or add AI features to any SaaS product. This guide covers everything from getting your API key to streaming responses and self-hosted setups.
What is the Dify API?
The Dify API is a REST API that lets you programmatically call any app you build in Dify — from any programming language or platform. Once you publish a Dify app (chatbot, agent, workflow, or completion app), it gets its own API endpoint you can call with a secret key.
The base URL for Dify Cloud is https://api.dify.ai/v1. If you self-host Dify, replace this with your own domain: https://your-server.com/v1.
Common use cases
Website chatbot
Embed AI chat into any site without using the Dify widget.
Document automation
Send PDFs or text to a Dify workflow and get structured output back.
SaaS AI features
Add AI writing, summarization, or Q&A to your existing product.
Backend pipelines
Trigger Dify agents from cron jobs, webhooks, or queue processors.
Mobile apps
Call Dify from iOS or Android apps using standard HTTP.
No-code tools
Connect Dify to n8n, Make, or Zapier via HTTP request nodes.
Getting Your API Key
Every Dify app has its own API key. You need to create one for each app you want to access via API. Here's how:
Open your Dify app in Studio
Go to cloud.dify.ai (or your self-hosted URL) and open the app you want to call via API.
Click "API Access" in the top-right corner
This opens the API reference panel for that specific app.
Click "Create API Key"
Give it a name (e.g., "production" or "testing") to identify it later.
Copy the secret key immediately
The key is shown only once. Store it in your environment variables — never hardcode it in source code.
Use it in the Authorization header
All API requests need this header: Authorization: Bearer YOUR_API_KEY
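A quick way to confirm a new key works is to call the read-only /parameters endpoint (covered in the table below) with the Bearer header. A minimal sketch, assuming you exported the key as a `DIFY_API_KEY` environment variable (the variable name is just an example):

```python
import os

import requests

BASE_URL = "https://api.dify.ai/v1"


def auth_headers(api_key):
    """Build the Authorization header every Dify endpoint expects."""
    return {"Authorization": f"Bearer {api_key}"}


def key_is_valid(api_key):
    """Hit the read-only /parameters endpoint; a 401 means the key was rejected."""
    resp = requests.get(f"{BASE_URL}/parameters", headers=auth_headers(api_key))
    return resp.status_code != 401


# Example: read the key from the environment, as recommended above
# api_key = os.environ["DIFY_API_KEY"]
# print("valid" if key_is_valid(api_key) else "rejected")
```

Remember that keys are per-app, so a key that works for one app will be rejected by another.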
API Endpoints Overview
The Dify API provides endpoints for sending messages, managing conversations, uploading files, and more. All endpoints are relative to the base URL https://api.dify.ai/v1.
| Method | Endpoint | Description |
|---|---|---|
| POST | /chat-messages | Send a chat message and get a response. Supports blocking and streaming modes. |
| POST | /completion-messages | Send a prompt to a completion-type app. Returns a single generated text. |
| POST | /files/upload | Upload a file (PDF, image, etc.) for use in RAG or vision-enabled apps. |
| GET | /conversations | List all conversations for a user. Use the user parameter to scope results. |
| GET | /messages | Get the message history for a specific conversation. |
| DELETE | /conversations/:id | Delete a conversation and all its messages permanently. |
| POST | /messages/:id/feedbacks | Submit a thumbs-up or thumbs-down rating for a specific message. |
| GET | /parameters | Get the app's input parameters, intro text, and suggested questions. |
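The endpoints above are uniform enough to wrap in a thin client. The sketch below is illustrative, not an official SDK: the endpoint paths come from the table, but the class and method names are made up here, and the `rating` values in the feedback call are an assumption about that endpoint's body.

```python
import requests


class DifyClient:
    """Minimal illustrative wrapper over the endpoints listed above."""

    def __init__(self, api_key, base_url="https://api.dify.ai/v1"):
        self.base_url = base_url.rstrip("/")
        self.session = requests.Session()
        self.session.headers["Authorization"] = f"Bearer {api_key}"

    def _url(self, path):
        return f"{self.base_url}/{path.lstrip('/')}"

    def chat(self, query, user, conversation_id="", inputs=None):
        """POST /chat-messages in blocking mode."""
        resp = self.session.post(self._url("/chat-messages"), json={
            "inputs": inputs or {},
            "query": query,
            "response_mode": "blocking",
            "conversation_id": conversation_id,
            "user": user,
        })
        resp.raise_for_status()
        return resp.json()

    def list_conversations(self, user):
        """GET /conversations, scoped to one end user via the user parameter."""
        resp = self.session.get(self._url("/conversations"), params={"user": user})
        resp.raise_for_status()
        return resp.json()

    def feedback(self, message_id, rating, user):
        """POST /messages/:id/feedbacks (rating values assumed: "like"/"dislike")."""
        resp = self.session.post(
            self._url(f"/messages/{message_id}/feedbacks"),
            json={"rating": rating, "user": user},
        )
        resp.raise_for_status()
        return resp.json()
```

Putting the key in a `requests.Session` keeps the Authorization header out of every call site, and `base_url` makes the same client work against Dify Cloud or a self-hosted instance.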
Your First API Call
Let's make a real API call. Replace YOUR_API_KEY with your actual key. The user field is a unique identifier for the end user — use any string that identifies them in your system.
curl example
curl -X POST 'https://api.dify.ai/v1/chat-messages' \
-H 'Authorization: Bearer YOUR_API_KEY' \
-H 'Content-Type: application/json' \
-d '{
"inputs": {},
"query": "Hello! What can you help me with?",
"response_mode": "blocking",
"conversation_id": "",
"user": "user-123"
}'

Example response
{
"event": "message",
"task_id": "abc123",
"id": "msg_456",
"message_id": "msg_456",
"conversation_id": "conv_789",
"mode": "chat",
"answer": "Hello! I can help you with questions, writing, analysis, and more. What would you like to explore?",
"metadata": { "usage": { "prompt_tokens": 12, "completion_tokens": 22 } },
"created_at": 1711234567
}

Python example (using requests)
import os

import requests

# Read the key from the environment, as recommended above (variable name is an example)
API_KEY = os.environ["DIFY_API_KEY"]
BASE_URL = "https://api.dify.ai/v1"
def chat(query, conversation_id="", user="user-123"):
response = requests.post(
f"{BASE_URL}/chat-messages",
headers={
"Authorization": f"Bearer {API_KEY}",
"Content-Type": "application/json",
},
json={
"inputs": {},
"query": query,
"response_mode": "blocking",
"conversation_id": conversation_id,
"user": user,
}
)
response.raise_for_status()
return response.json()
# First message
result = chat("What is Dify?")
print(result["answer"])
conversation_id = result["conversation_id"]
# Follow-up in the same conversation
result2 = chat("Can you elaborate?", conversation_id=conversation_id)
print(result2["answer"])

Save the conversation_id from the first response and pass it in subsequent calls. This maintains conversation history so the AI remembers what was said before.
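Blocking responses also carry token counts under metadata.usage, as the example response earlier shows. Since LLM tokens are what you actually pay for, a small helper for logging them can be useful; this is a sketch, not part of any official SDK:

```python
def token_usage(response_json):
    """Pull prompt/completion token counts out of a blocking /chat-messages response.

    Returns (0, 0) if the usage block is missing rather than raising.
    """
    usage = response_json.get("metadata", {}).get("usage", {})
    return (usage.get("prompt_tokens", 0), usage.get("completion_tokens", 0))


# Using the shape of the example response shown earlier in this guide:
example = {"metadata": {"usage": {"prompt_tokens": 12, "completion_tokens": 22}}}
prompt, completion = token_usage(example)  # (12, 22)
```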
Streaming Responses (SSE)
Streaming lets you display tokens as they're generated — like ChatGPT's typing effect. Dify uses Server-Sent Events (SSE) for streaming. Set "response_mode": "streaming" in your request body.
Each event arrives as a line prefixed with data:. The event types include message (a token chunk), message_end (final message with metadata), and error.
Streaming curl example
curl -X POST 'https://api.dify.ai/v1/chat-messages' \
-H 'Authorization: Bearer YOUR_API_KEY' \
-H 'Content-Type: application/json' \
--no-buffer \
-d '{
"inputs": {},
"query": "Write a short poem about AI.",
"response_mode": "streaming",
"conversation_id": "",
"user": "user-123"
}'
# Output (one line per token chunk):
# data: {"event":"message","answer":"In",...}
# data: {"event":"message","answer":" circuits",...}
# data: {"event":"message","answer":" deep",...}
# data: {"event":"message_end","metadata":{"usage":{...}}}

Python streaming example
import requests
import json
def stream_chat(query, user="user-123"):
with requests.post(
"https://api.dify.ai/v1/chat-messages",
headers={
"Authorization": "Bearer YOUR_API_KEY",
"Content-Type": "application/json",
},
json={
"inputs": {},
"query": query,
"response_mode": "streaming",
"conversation_id": "",
"user": user,
},
stream=True,
) as response:
for line in response.iter_lines():
if line and line.startswith(b"data: "):
data = json.loads(line[6:])
if data.get("event") == "message":
print(data["answer"], end="", flush=True)
elif data.get("event") == "message_end":
print() # newline at end
break
stream_chat("Explain quantum computing in simple terms.")

JavaScript / Browser (EventSource)
// Note: EventSource doesn't support POST with a body.
// Use fetch with ReadableStream instead for browser streaming.
async function streamChat(query) {
const response = await fetch("https://api.dify.ai/v1/chat-messages", {
method: "POST",
headers: {
"Authorization": "Bearer YOUR_API_KEY",
"Content-Type": "application/json",
},
body: JSON.stringify({
inputs: {},
query,
response_mode: "streaming",
conversation_id: "",
user: "user-browser",
}),
});
const reader = response.body.getReader();
const decoder = new TextDecoder();
while (true) {
const { done, value } = await reader.read();
if (done) break;
    const lines = decoder.decode(value, { stream: true }).split("\n");
for (const line of lines) {
if (line.startsWith("data: ")) {
const data = JSON.parse(line.slice(6));
if (data.event === "message") {
document.getElementById("output").textContent += data.answer;
}
}
}
}
}
streamChat("What are the benefits of self-hosting AI?");

Using the API with Self-Hosted Dify
If you self-host Dify on your own server, the API works exactly the same way — just replace the base URL. This is the main advantage of self-hosting: you control the infrastructure, data stays on your server, and there are no per-message credit limits.
Dify Cloud

BASE_URL = "https://api.dify.ai/v1"

Self-Hosted

BASE_URL = "https://your-server.com/v1"

Everything else — the API key format, request bodies, response format, streaming — stays identical. This makes it easy to develop against Dify Cloud and then switch to a self-hosted instance for production by changing one environment variable.
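That switch can be made a pure configuration change. A minimal sketch, assuming an optional `DIFY_BASE_URL` environment variable (the variable name is illustrative):

```python
import os


def dify_base_url():
    """Resolve the API base URL: self-hosted if DIFY_BASE_URL is set, Cloud otherwise."""
    url = os.environ.get("DIFY_BASE_URL", "https://api.dify.ai/v1")
    # Normalize a trailing slash so path joins stay predictable
    return url.rstrip("/")
```

With no override set, every request goes to https://api.dify.ai/v1; exporting DIFY_BASE_URL=https://your-server.com/v1 in production redirects all traffic to the self-hosted instance without touching code.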
Best self-hosting options for API-heavy workloads
Hetzner VPS (DIY)
From €3.79/month. Best cost-to-performance. Ideal for developers comfortable with Linux.
Get Hetzner →

Elestio (Managed)
Managed Dify deployment. Auto-updates, backups, SSL. No server management required.
Get Elestio →

Frequently Asked Questions
How do I get a Dify API key?
Open your Dify app, click 'API Access' in the top-right corner, then create a new API key. Copy the secret key — it won't be shown again. Use it in the Authorization header as 'Bearer YOUR_KEY'.
Is the Dify API free to use?
The Dify API itself is free — you only pay for the LLM tokens (OpenAI, Anthropic, etc.) consumed by your requests. Self-hosted Dify has no per-call fees. Dify Cloud charges based on message credits.
Can I stream Dify API responses?
Yes. Set 'response_mode':'streaming' in your request body to receive Server-Sent Events (SSE). This lets you display tokens as they're generated, just like ChatGPT's typing effect.
What programming languages work with the Dify API?
Any language that supports HTTP requests: Python, JavaScript/Node.js, PHP, Ruby, Go, Java, C#, and more. Dify provides official SDKs for Python and Node.js, plus curl examples for every endpoint.
Ready to Self-Host Dify?
Get the most out of the Dify API by self-hosting: no credit limits, full data privacy, custom domain, and the same API interface you've learned here. Compare the best hosting options for your needs.