How to Integrate AI Agents with Notion

Complete guide to connecting AI agents with Notion. Learn how to build knowledge base search agents, meeting note processors, and daily standup writers using LangChain, Lindy AI, and n8n with working Python code examples.

Before You Start

Prerequisites: Notion account (free plan works), Notion Integration Token (Internal Integration), Basic Python knowledge (for LangChain option)

Works with: LangChain, Lindy AI, n8n

Notion holds the institutional knowledge of thousands of teams — product specs, meeting notes, SOPs, decision logs, and wikis. An AI agent with Notion access can turn that static documentation into an active, queryable intelligence layer: answering questions from your knowledge base, converting meeting notes into tasks, and writing structured updates automatically.

This guide covers the complete Notion integration setup, from creating your API token to deploying production-ready agents, with working Python examples.

What AI Agents Can Do With Notion#

Before wiring up the integration, map out what becomes genuinely useful when an agent can read from and write to your Notion workspace:

Reading and searching

  • Full-text search across your entire workspace
  • Query databases with filters (e.g., "all tasks assigned to Alice that are overdue")
  • Read page content for summarization or Q&A
  • Retrieve structured data from databases (sprint trackers, CRM databases, content calendars)

Writing and creating

  • Create new pages with formatted content (headings, bullet lists, code blocks, tables)
  • Add entries to databases (meeting notes, task items, CRM contacts)
  • Update existing page properties (status, date, assignee fields)
  • Append blocks to existing pages (e.g., add a summary section to a meeting note)

Automation workflows

  • Meeting notes transcript → extract action items → create Notion tasks
  • Daily standup digest → write a structured update to a team page
  • Support ticket themes → generate a weekly summary report in Notion
  • Research sources → compile and organize into a structured knowledge page

Step 1: Create a Notion Internal Integration#

The Notion API uses Integration Tokens for authentication. Internal Integrations are the right choice for agent automation — they're simpler than OAuth and don't require a callback URL.

Create the Integration#

  1. Go to notion.so/my-integrations and click New integration
  2. Give it a descriptive name (e.g., "AI Agent - Knowledge Base")
  3. Select the workspace it should belong to
  4. Set Type to Internal
  5. Under Capabilities, select:
    • Read content — required for all read operations
    • Insert content — required for creating pages and appending blocks
    • Update content — required for modifying existing pages and properties
  6. Click Submit and copy the Internal Integration Token (new tokens start with ntn_; older ones start with secret_...)

Store the token as an environment variable:

export NOTION_TOKEN="secret_your_token_here"

Share Pages and Databases with Your Integration#

The integration can only access pages you explicitly share with it. For each database or page your agent needs:

  1. Open the page in Notion
  2. Click ... (top right) → Add connections
  3. Search for your integration name and select it

The integration inherits access to all child pages automatically. For a knowledge base agent, share your top-level wiki or documentation space rather than page by page.


Option 1: LangChain with the Notion Client#

Best for: Engineering teams, custom logic, multi-tool agents that combine Notion with other systems

Install Dependencies#

pip install langchain langchain-openai notion-client python-dotenv

Create the Notion Tools#

from notion_client import Client
from langchain.tools import tool
from dotenv import load_dotenv
import os
import json

load_dotenv()

notion = Client(auth=os.getenv("NOTION_TOKEN"))

@tool
def search_notion(query: str) -> str:
    """Search your Notion workspace for pages and databases matching a query."""
    results = notion.search(query=query, page_size=5)
    if not results["results"]:
        return f"No results found for: {query}"

    pages = []
    for item in results["results"]:
        title = ""
        if item["object"] == "page":
            # Page titles live in whichever property has type "title"
            for prop in item.get("properties", {}).values():
                if prop["type"] == "title" and prop["title"]:
                    title = prop["title"][0]["plain_text"]
                    break
        elif item["object"] == "database":
            # Database titles are a top-level list of rich text objects
            title = "".join(rt.get("plain_text", "") for rt in item.get("title", []))

        url = item.get("url", "")
        pages.append(f"- {title or 'Untitled'}: {url}")

    return "Found pages:\n" + "\n".join(pages)


@tool
def get_page_content(page_id: str) -> str:
    """Retrieve the full text content of a Notion page by its ID."""
    blocks = notion.blocks.children.list(block_id=page_id)
    content_parts = []

    for block in blocks["results"]:
        block_type = block["type"]
        block_data = block.get(block_type, {})

        # Extract rich text from common block types
        if "rich_text" in block_data:
            text = "".join(rt["plain_text"] for rt in block_data["rich_text"])
            if text.strip():
                content_parts.append(text)
        elif block_type == "child_page":
            content_parts.append(f"[Sub-page: {block_data.get('title', 'Untitled')}]")

    return "\n".join(content_parts) if content_parts else "Page appears to be empty."


@tool
def query_notion_database(database_id: str, filter_json: str = "{}") -> str:
    """
    Query a Notion database with optional filters.
    filter_json: JSON string with Notion filter object, e.g.,
    '{"property": "Status", "select": {"equals": "In Progress"}}'
    """
    try:
        filter_obj = json.loads(filter_json)
        query_params = {"database_id": database_id}
        if filter_obj:
            query_params["filter"] = filter_obj

        results = notion.databases.query(**query_params, page_size=10)

        if not results["results"]:
            return "No matching records found in database."

        rows = []
        for page in results["results"]:
            props = {}
            for name, prop in page["properties"].items():
                prop_type = prop["type"]
                if prop_type == "title" and prop["title"]:
                    props[name] = prop["title"][0]["plain_text"]
                elif prop_type == "rich_text" and prop["rich_text"]:
                    props[name] = prop["rich_text"][0]["plain_text"]
                elif prop_type == "select" and prop["select"]:
                    props[name] = prop["select"]["name"]
                elif prop_type == "date" and prop["date"]:
                    props[name] = prop["date"]["start"]
                elif prop_type == "checkbox":
                    props[name] = str(prop["checkbox"])
            rows.append(str(props))

        return f"Found {len(rows)} records:\n" + "\n".join(rows)

    except json.JSONDecodeError:
        return "Error: filter_json must be valid JSON."


@tool
def create_notion_page(parent_database_id: str, title: str, content: str) -> str:
    """Create a new page in a Notion database with a title and markdown-style content."""
    new_page = notion.pages.create(
        parent={"database_id": parent_database_id},
        properties={
            # "Name" must match the title property name of the target database
            "Name": {
                "title": [{"text": {"content": title}}]
            }
        },
        children=[
            {
                "object": "block",
                "type": "paragraph",
                "paragraph": {
                    "rich_text": [{"type": "text", "text": {"content": content[:2000]}}]
                }
            }
        ]
    )
    return f"Page created: {new_page['url']}"
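The writing capabilities listed earlier include appending blocks to existing pages, but the tools above only create new ones. A small helper (a sketch; the 2,000-character cap is Notion's per-rich-text-object limit) splits long text into paragraph blocks ready to pass as `children` to `notion.blocks.children.append`:

```python
def paragraph_blocks(text: str, limit: int = 2000) -> list[dict]:
    """Split long text into Notion paragraph blocks.

    Notion caps each rich text object at 2,000 characters, so longer
    content must be split across multiple blocks before calling
    notion.blocks.children.append(block_id=page_id, children=...).
    """
    chunks = [text[i:i + limit] for i in range(0, len(text), limit)] or [""]
    return [
        {
            "object": "block",
            "type": "paragraph",
            "paragraph": {"rich_text": [{"type": "text", "text": {"content": chunk}}]},
        }
        for chunk in chunks
    ]
```

Wrapping this in a `@tool`-decorated `append_to_page(page_id, text)` function, in the same style as the tools above, gives the agent an append action alongside page creation.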

Build the Knowledge Base Agent#

from langchain_openai import ChatOpenAI
from langchain.agents import create_tool_calling_agent, AgentExecutor
from langchain_core.prompts import ChatPromptTemplate

llm = ChatOpenAI(model="gpt-4o", temperature=0)
tools = [search_notion, get_page_content, query_notion_database, create_notion_page]

prompt = ChatPromptTemplate.from_messages([
    ("system", """You are a knowledge base assistant with access to a Notion workspace.

    When answering questions:
    1. Search Notion for relevant pages first
    2. Read the content of the most relevant pages
    3. Synthesize a clear, accurate answer from the actual documentation
    4. If no relevant information exists, say so clearly — never guess
    5. Include the Notion page URL as a source reference

    Always ground your answers in the actual Notion content, not general knowledge."""),
    ("human", "{input}"),
    ("placeholder", "{agent_scratchpad}"),
])

agent = create_tool_calling_agent(llm, tools, prompt)
executor = AgentExecutor(agent=agent, tools=tools, verbose=True)

# Example: answer a question from your knowledge base
result = executor.invoke({
    "input": "What is our process for handling customer refund requests?"
})
print(result["output"])

Use Case 1: Knowledge Base RAG Agent#

The most powerful Notion integration pattern is retrieval-augmented generation (RAG) — an agent that searches and reads your Notion documentation to answer questions with direct citations.

The basic flow uses the search_notion and get_page_content tools shown above. For higher-quality retrieval over large workspaces (100+ pages), add a vector search layer:

pip install langchain-community chromadb openai

from langchain_community.vectorstores import Chroma
from langchain_openai import OpenAIEmbeddings
from langchain.text_splitter import RecursiveCharacterTextSplitter

# 1. Fetch all pages from your key databases
def fetch_all_pages(database_id: str) -> list[dict]:
    """Fetch all pages from a Notion database."""
    pages = []
    cursor = None
    while True:
        response = notion.databases.query(
            database_id=database_id,
            start_cursor=cursor,
            page_size=100
        )
        pages.extend(response["results"])
        if not response.get("has_more"):
            break
        cursor = response["next_cursor"]
    return pages

# 2. Chunk content for embedding
splitter = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=200)

# 3. Build vector store (run once, then persist)
embeddings = OpenAIEmbeddings(model="text-embedding-3-small")
vectorstore = Chroma(embedding_function=embeddings, persist_directory="./notion_index")

# Index each page (placeholder database ID; reuses the get_page_content tool from above)
for page in fetch_all_pages("your-db-id"):
    text = get_page_content.invoke(page["id"])
    chunks = splitter.split_text(text)
    if chunks:
        vectorstore.add_texts(chunks, metadatas=[{"url": page.get("url", "")}] * len(chunks))

# 4. Add a similarity_search tool for the agent
from langchain.tools import Tool

retriever = vectorstore.as_retriever(search_kwargs={"k": 4})
rag_tool = Tool(
    name="search_knowledge_base",
    func=lambda q: "\n\n".join(doc.page_content for doc in retriever.invoke(q)),
    description="Semantic search over the Notion knowledge base. Use for finding relevant documentation, SOPs, and policies."
)

This approach handles larger workspaces and returns semantically similar content even when the query wording differs from the page content.


Use Case 2: Meeting Notes to Notion Task Creator#

This agent reads a meeting transcript or raw notes, extracts action items, and creates structured Notion database entries.

MEETING_AGENT_PROMPT = """You are a meeting notes processor.

Given raw meeting notes, you will:
1. Identify all action items (tasks assigned to specific people with deadlines)
2. For each action item, create a Notion page in the tasks database
3. Include the task owner, due date (if mentioned), and context
4. Return a summary of tasks created

Tasks database ID: {tasks_db_id}

Be precise — only create tasks for explicit commitments, not general discussion points."""

tasks_db_id = "your-tasks-database-id-here"

prompt = ChatPromptTemplate.from_messages([
    ("system", MEETING_AGENT_PROMPT.format(tasks_db_id=tasks_db_id)),
    ("human", "Meeting notes:\n\n{input}"),
    ("placeholder", "{agent_scratchpad}"),
])

meeting_agent = AgentExecutor(
    agent=create_tool_calling_agent(llm, [create_notion_page], prompt),
    tools=[create_notion_page],
    verbose=True
)

meeting_notes = """
Product sync 2026-02-25
Attendees: Alice, Bob, Carlos

- Alice to finalize the onboarding copy by Friday
- Bob will review the Stripe integration PR before EOD Wednesday
- Carlos: set up monitoring alerts for the new API endpoint, target next Monday
- General discussion about Q2 roadmap — no decisions made yet
"""

result = meeting_agent.invoke({"input": meeting_notes})

Use Case 3: Daily Standup Summary Writer#

This agent queries a team tasks database, generates a standup summary, and appends it to a designated Notion page — replacing the manual standup update entirely.

The agent uses query_notion_database to pull tasks completed yesterday and in-progress today, then calls create_notion_page (or a custom append-blocks tool) to write the structured standup entry.

import datetime

yesterday = (datetime.date.today() - datetime.timedelta(days=1)).isoformat()

standup_input = f"""
Generate the daily standup summary for {datetime.date.today().isoformat()}.

1. Query the Tasks database (ID: your-db-id) for tasks with Status = "Done"
   and completion date = {yesterday}
2. Query for tasks with Status = "In Progress"
3. Write a standup summary in this format:
   ## Standup - [Date]
   **Done yesterday:** [list]
   **Working on today:** [list]
   **Blockers:** [any tasks overdue by more than 3 days]
4. Create this as a new page in the Standup Log database (ID: your-standup-db-id)
"""

Option 2: n8n No-Code Notion Workflows#

Best for: Teams who want automation without writing code

n8n has a native Notion node with full API support. The setup takes about 10 minutes:

  1. In n8n, add a Notion credential with your Integration Token
  2. Use the Notion node to perform operations: Get, Create, Update, Search
  3. Connect an AI Agent node (n8n's built-in LLM agent) as the processing step
  4. Chain multiple Notion nodes for read → process → write workflows

A typical n8n Notion workflow: Schedule Trigger → Notion: Get Database Items → AI Agent: Summarize → Notion: Create Page.


Option 3: Lindy AI (No-Code Agent Platform)#

Best for: Non-technical teams, quick deployment, conversational agents

Lindy AI connects to Notion via OAuth (no token management needed). You can build a Notion-aware agent through their visual builder in under 30 minutes. Lindy supports reading pages, creating entries, and running database queries as native agent actions.


Database Query Examples#

Notion database filters use a structured JSON format. These are the most common patterns:

import datetime

# Filter by select property
filter_by_status = {
    "property": "Status",
    "select": {"equals": "In Progress"}
}

# Filter by date (overdue items)
filter_overdue = {
    "property": "Due Date",
    "date": {"before": datetime.date.today().isoformat()}
}

# Filter by person
filter_by_assignee = {
    "property": "Assignee",
    "people": {"contains": "user-id-here"}
}

# Compound filter (AND)
compound_filter = {
    "and": [
        {"property": "Status", "select": {"equals": "In Progress"}},
        {"property": "Priority", "select": {"equals": "High"}}
    ]
}

Pass these as the filter_json parameter when calling the query_notion_database tool.
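Since the tool expects a JSON string rather than a Python dict, serialize the filter before invoking it (the database ID here is a placeholder):

```python
import json

# The compound AND filter from above, serialized the way
# query_notion_database expects its filter_json argument
compound_filter = {
    "and": [
        {"property": "Status", "select": {"equals": "In Progress"}},
        {"property": "Priority", "select": {"equals": "High"}}
    ]
}
filter_json = json.dumps(compound_filter)

# The agent tool then takes the string, e.g.:
# query_notion_database.invoke({"database_id": "your-db-id", "filter_json": filter_json})
```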


Known Limitations#

Understanding what the Notion API cannot do prevents frustration:

No real-time webhooks: The Notion API does not push events to your agent. If you need to react to page changes (e.g., "when a task status changes to Done"), you must poll the API on a schedule. For event-driven workflows, combine Notion with a platform that supports webhooks like Zapier or n8n.

No file attachment uploads: The API can reference existing files by URL but cannot upload new files to Notion. If your workflow needs to attach PDFs or images, this must be done through the Notion UI.

Block depth limit: Requests that create or append content accept at most 2 levels of nested blocks. Reads are also shallow: blocks.children.list returns only direct children, so fully reading a nested page takes one API call per level of depth.
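Because children come back one level at a time, fully reading a nested page takes a small recursion. A sketch (the client is passed in so it can be stubbed in tests; the depth cap is a safety choice, not an API rule):

```python
def read_blocks_recursive(notion, block_id: str,
                          depth: int = 0, max_depth: int = 5) -> list[str]:
    """Collect plain text from a block tree, one API call per nesting level."""
    if depth > max_depth:
        return []
    texts = []
    for block in notion.blocks.children.list(block_id=block_id)["results"]:
        data = block.get(block["type"], {})
        # Most text-bearing block types expose a rich_text array
        if "rich_text" in data:
            text = "".join(rt.get("plain_text", "") for rt in data["rich_text"])
            if text.strip():
                texts.append(text)
        # has_children signals another level to fetch with a separate call
        if block.get("has_children"):
            texts.extend(read_blocks_recursive(notion, block["id"], depth + 1, max_depth))
    return texts
```

This can replace the flat loop in get_page_content when your workspace makes heavy use of toggles and nested bullets.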

Rate limits: 3 requests per second. For bulk operations (indexing large workspaces), implement exponential backoff and batch your requests.
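A minimal backoff wrapper, assuming nothing about the call being retried (in practice you would catch notion-client's rate-limit error specifically rather than every exception):

```python
import random
import time

def with_backoff(call, max_retries: int = 5, base_delay: float = 1.0):
    """Retry a zero-argument API call with exponential backoff plus jitter.

    Sketch only: retries on any exception; narrow the except clause to
    rate-limit errors for production use.
    """
    for attempt in range(max_retries):
        try:
            return call()
        except Exception:
            if attempt == max_retries - 1:
                raise  # Out of retries; surface the error
            # 1x, 2x, 4x, ... the base delay, plus jitter so
            # parallel workers don't retry in lockstep
            time.sleep(base_delay * (2 ** attempt) + random.uniform(0, base_delay))

# Usage (hypothetical): with_backoff(lambda: notion.search(query="roadmap"))
```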


Next Steps#

Your Notion agent is a strong foundation for a broader knowledge management system. Extend it with these resources: