Building an AI-Powered Telegram Bot with LangGraph: A Practical Guide
AI agents as imagined by AI
Introduction
Hello, tech enthusiasts! I’m Grigory, head of special projects at AllSee. Today, AI agents, large language models (LLMs), and agent-based systems are everywhere—new solutions and articles appear daily. It’s an exciting time, but sometimes it’s worth pausing to explore how these technologies can solve real-world problems.
In this article, we’ll dive into the world of AI agents, review some recent advancements, and walk through building a practical Telegram bot. This bot will answer questions about company services (in our case, AllSee), but you can easily adapt it for your own business needs.
What Are AI Agents?
Stripping away the marketing buzz, an AI agent is essentially an LLM (like GPT-4 or Claude) equipped with a set of rules and the ability to interact with the outside world. Imagine a character who can only listen and speak—this is your basic LLM. By defining what actions it can take (like clicking a button or sending a file), you turn it into an agent capable of meaningful interaction.
There are many ways to teach LLMs about their environment and available actions, but for this guide, we’ll focus on practical implementation rather than theory.
Frameworks for AI Agents in Python
Several frameworks make it easier to build AI agents and multi-agent systems in Python; popular options include LangChain, LlamaIndex, CrewAI, and Microsoft’s AutoGen.
For this tutorial, we’ll use LangGraph. Here’s why:
- Built on LangChain: LangGraph extends LangChain, so you get a familiar API and a rich set of tools out of the box.
- Graph-Based Architecture: LangGraph models workflows as graphs, making it easy to design controlled, flexible scenarios.
- Scalability: While some users note scalability limitations for high-traffic scenarios, it’s more than sufficient for most business bots.
Project Overview: AI Agent Manager Bot
Let’s build a Telegram bot that acts as a company consultant, answering user questions about services, sharing files, and providing relevant links.
Functional Requirements
Our bot should:
- Respond to users following a defined scenario.
- Send hyperlinks.
- Send files (e.g., PDF lead magnets).
- Retrieve information from a company knowledge base.
- Remember conversation context and distinguish between users.
Technology Stack
- python-telegram-bot: For Telegram API integration.
- pydantic-settings: For configuration management via `.env` files.
- LangGraph: For agent orchestration and workflow management.
- ChatOpenAI: As the LLM interface.
- Chroma: As a vector database for semantic search.
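To show what configuration management looks like in practice, here is a dependency-free sketch of the idea behind pydantic-settings: typed settings populated from environment variables. The variable names (`TELEGRAM_TOKEN`, `OPENAI_API_KEY`, `POSTGRES_DSN`) are illustrative placeholders, not necessarily the names used in the project; pydantic-settings adds `.env` file parsing and validation on top of this.

```python
import os
from dataclasses import dataclass, field


@dataclass(frozen=True)
class Settings:
    """Stdlib stand-in for a pydantic-settings BaseSettings class:
    each field is read from an environment variable with a fallback."""
    telegram_token: str = field(
        default_factory=lambda: os.environ.get("TELEGRAM_TOKEN", ""))
    openai_api_key: str = field(
        default_factory=lambda: os.environ.get("OPENAI_API_KEY", ""))
    postgres_dsn: str = field(
        default_factory=lambda: os.environ.get(
            "POSTGres_DSN".upper(), "postgresql://bot:bot@localhost:5432/bot"))


settings = Settings()
```

With pydantic-settings, the same class would subclass `BaseSettings` and pick these values up from a `.env` file automatically.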
Step-by-Step Implementation
1. Agent Tools: Sending Files
To enable the agent to send files, we define a function `send_document_to_user` that takes a file path and an optional message, then sends the file to the user via Telegram. We wrap this function with LangChain’s `StructuredTool`, which allows a custom input model and easy integration.
Key point: Telegram’s `Update` objects are not serializable, so we pass them via `RunnableConfig` to avoid issues with state persistence.
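A dependency-free sketch of this pattern: the tool body pulls the live, non-serializable object out of the per-invocation config (the same `{"configurable": {...}}` shape `RunnableConfig` uses) instead of keeping it in checkpointed state. `FakeUpdate`, the file name, and the exact tool signature are placeholders for illustration, not the article’s actual code.

```python
from dataclasses import dataclass
from typing import Any


@dataclass
class FakeUpdate:
    """Stand-in for telegram.Update, which is not serializable."""
    chat_id: int


def send_document_to_user(file_path: str, caption: str,
                          config: dict[str, Any]) -> str:
    """Tool body sketch: the live Update travels in the per-call config,
    never in the persisted graph state."""
    update: FakeUpdate = config["configurable"]["update"]
    # The real bot would call update.message.reply_document(...) here.
    return f"sent {file_path} to chat {update.chat_id}: {caption}"


# Per-invocation config, mirroring LangChain's RunnableConfig shape:
config = {"configurable": {"update": FakeUpdate(chat_id=42)}}
result = send_document_to_user("services.pdf", "Our services", config)
```

Because the config is rebuilt on every invocation, nothing non-serializable ever reaches the checkpointer.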
2. Tool-Triggered Retrieval: Knowledge Base Search
We set up a Chroma vector database to store and search company information. The retrieval tool is only called when needed, saving LLM tokens and improving efficiency. Documents are split and indexed for semantic search, and the retrieval tool is exposed to the agent for on-demand use.
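To make the retrieval step concrete without pulling in Chroma or an embedding model, here is a toy in-memory version of the same idea: chunks are "embedded" (here, as bag-of-words counts) and ranked by cosine similarity on demand. The chunk texts are made up for illustration; the real setup stores proper embeddings in Chroma.

```python
import math
from collections import Counter


def embed(text: str) -> Counter:
    """Toy 'embedding': bag-of-words counts instead of a neural model."""
    return Counter(text.lower().split())


def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0


# Split company docs into chunks and "index" them once.
chunks = [
    "AllSee builds custom AI solutions for business",
    "Our team offers Telegram bot development services",
    "Contact us for a free project estimate",
]
index = [(c, embed(c)) for c in chunks]


def retrieve(query: str, k: int = 1) -> list[str]:
    """The agent calls this tool only when it actually needs company facts,
    so no tokens are spent stuffing the whole knowledge base into the prompt."""
    q = embed(query)
    ranked = sorted(index, key=lambda item: cosine(q, item[1]), reverse=True)
    return [c for c, _ in ranked[:k]]
```

Swapping `embed` for real embeddings and `index` for a Chroma collection gives the production version; the on-demand call pattern stays the same.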
3. Agent Manager: Orchestrating Tools and LLM
LangGraph models workflows as graphs with three main components:
- State: Stores important information (e.g., conversation history).
- Node: Functions that process state and perform actions (like calling the LLM or tools).
- Edge: Determines the next node based on the current state.
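The three concepts can be sketched in plain Python before reaching for the library. This is a conceptual stand-in, not LangGraph’s actual API: state is a dict, a node is a function from state to state, and an edge is a routing function that names the next node.

```python
from typing import Callable

State = dict  # e.g. {"messages": [...], "done": bool}


def llm_node(state: State) -> State:
    """Node: reads the state and appends a (canned) model reply."""
    state["messages"].append("assistant: how can I help?")
    state["done"] = True
    return state


def route(state: State) -> str:
    """Edge: picks the next node based on the current state."""
    return "END" if state["done"] else "llm"


nodes: dict[str, Callable[[State], State]] = {"llm": llm_node}


def run(state: State) -> State:
    """Minimal graph executor: apply the current node, then follow the edge."""
    current = "llm"
    while current != "END":
        state = nodes[current](state)
        current = route(state)
    return state
```

LangGraph’s `StateGraph` formalizes exactly this loop, adding typed state, persistence, and streaming on top.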
We use LangGraph’s `create_react_agent` to assemble our agent, passing in the LLM, the system prompt, and the available tools (file sending and retrieval). The system prompt defines the agent’s role, behavior, and available resources.
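Under the hood, a ReAct-style agent is a loop: the model either requests a tool call or produces a final answer, and tool results are fed back as observations. Stripped of the library, the loop looks roughly like this; `fake_llm`, the tool output, and the answer text are all stand-ins for a real tool-calling model.

```python
def retrieve_tool(query: str) -> str:
    """Stand-in for the knowledge-base retrieval tool."""
    return "AllSee offers AI development services."


tools = {"retrieve": retrieve_tool}


def fake_llm(messages: list[str]) -> dict:
    """Stand-in for ChatOpenAI: requests the tool once, then answers."""
    if not any(m.startswith("tool:") for m in messages):
        return {"tool": "retrieve", "args": "company services"}
    return {"answer": "We offer AI development services. Anything else?"}


def react_agent(question: str) -> str:
    """The decide-act-observe loop that create_react_agent wires up for you."""
    messages = [f"user: {question}"]
    while True:
        step = fake_llm(messages)
        if "tool" in step:
            observation = tools[step["tool"]](step["args"])
            messages.append(f"tool: {observation}")  # feed the result back
        else:
            return step["answer"]
```

`create_react_agent` builds this same loop as a graph, with real tool schemas and the checkpointer handling the message history.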
4. Telegram Bot Integration
We use `python-telegram-bot` to handle user interactions. To maintain conversation context and user-specific state, we leverage LangGraph’s `AsyncPostgresSaver` for checkpointing. Each user’s chat ID is used as a unique thread identifier, ensuring a personalized experience.
Challenge: passing non-serializable objects (like `Update`) to the workflow. Solution: use `RunnableConfig` to inject these objects at runtime without persisting them in the database.
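The checkpointing idea itself is simple: every invocation carries a thread ID (here, the Telegram chat ID), and the saver keys stored state by it. A dict-backed sketch of what `AsyncPostgresSaver` does against Postgres (the real class is async and persists across restarts; this toy version is neither):

```python
class MemorySaver:
    """Toy checkpointer: per-thread state in a dict instead of Postgres."""

    def __init__(self) -> None:
        self._store: dict[str, list[str]] = {}

    def load(self, thread_id: str) -> list[str]:
        return self._store.get(thread_id, [])

    def save(self, thread_id: str, messages: list[str]) -> None:
        self._store[thread_id] = messages


saver = MemorySaver()


def handle_message(chat_id: int, text: str) -> list[str]:
    """One Telegram update: resume the user's thread, append, persist."""
    thread_id = str(chat_id)  # chat ID doubles as the thread ID
    history = saver.load(thread_id) + [f"user: {text}"]
    saver.save(thread_id, history)
    return history
```

Because each chat ID maps to its own thread, two users talking to the bot at once never see each other’s history.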
5. Deployment with Docker
To simplify deployment, we use Docker and Docker Compose:
- Dockerfile: Sets up the Python environment and installs dependencies.
- docker-compose.yaml: Orchestrates the bot and a PostgreSQL database for state persistence.
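A minimal shape such a setup could take; service names, the image tag, and the environment variable names are illustrative, not the repository’s actual files:

```yaml
services:
  bot:
    build: .             # Dockerfile: Python base image + dependencies
    env_file: .env       # TELEGRAM_TOKEN, OPENAI_API_KEY, ...
    depends_on:
      - db
  db:
    image: postgres:16
    environment:
      POSTGRES_USER: bot
      POSTGRES_PASSWORD: bot
      POSTGRES_DB: bot
    volumes:
      - pgdata:/var/lib/postgresql/data   # persist agent checkpoints
volumes:
  pgdata:
```

The named volume keeps conversation checkpoints across container restarts, so users resume where they left off after a redeploy.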
Demo
Sample conversation with the bot
Conclusion & Tips
We’ve explored the basics of LangGraph, built an AI-powered agent, and automated a business process with a conversational Telegram bot. This approach is flexible and can be adapted for various industries and use cases.
Key takeaways:
- LangGraph makes it easy to design controlled, multi-step agent workflows.
- Combining LLMs with retrieval tools and file sharing creates powerful business bots.
- Docker simplifies deployment and scaling.
The full project code is available on GitHub. Feel free to fork, adapt, and enhance it for your own company. The repository will be updated with new features, so star it to stay up to date!
Have experience with AI agents or questions about different frameworks? Share your thoughts in the comments. Good luck, and happy coding! ✌️