Step‑by‑Step Deployment of Suna: The Open‑Source AI Agent That Beats Manus
This article walks through the complete installation and configuration of Suna, an open-source AI agent built on Claude 3.7, LangChain, and Daytona that can be self-hosted for greater data privacy; it compares Suna's richer UI and privacy posture with Manus and demonstrates its capabilities by automating a Kaggle competition analysis.
What is Suna?
Suna is an open‑source general‑purpose AI agent that supports deep research, sandboxed computer use, browser automation, file management, web crawling, command‑line execution, site deployment, and integration with various APIs.
Technical Architecture
Base model: Claude 3.7 (default), with optional access to OpenRouter and LiteLLM.
Development framework: LangChain and LangGraph.
Sandbox environment: Daytona, which supports MCP tool calls.
Data store: Redis for fast interaction.
Backend services: Supabase for authentication, database, and API layers.
MCP tools: Tavily for search and FireCrawl for intelligent web crawling.
Local Installation (Windows)
Create a Conda virtual environment with Python 3.11 for the backend:
conda create -n suna python=3.11  # create the Suna backend environment
conda activate suna
Install the backend dependencies:
pip install -r requirements.txt
Install the frontend dependencies:
npm install
Obtain API keys for the required services and add them to .env in the backend directory:
Tavily search engine – register on the Tavily website and copy the API key to TAVILY_API_KEY.
FireCrawl – register and copy the API key to FIRECRAWL_API_KEY.
Claude model – obtain an API key and set it in the appropriate variable.
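With the keys in place, a quick sanity check helps catch typos before starting the services. This is a minimal sketch, not part of Suna itself: it assumes the variable names listed above, plus a hypothetical ANTHROPIC_API_KEY for the Claude credential (the article does not name that variable, so adjust it to whatever your backend actually reads).

```python
import os

# Keys the article requires; ANTHROPIC_API_KEY is an assumed name for the
# Claude credential -- replace it with the variable your backend expects.
REQUIRED_KEYS = ["TAVILY_API_KEY", "FIRECRAWL_API_KEY", "ANTHROPIC_API_KEY"]

def missing_keys(env, required=REQUIRED_KEYS):
    """Return the required keys that are absent or empty in the given mapping."""
    return [k for k in required if not env.get(k)]

if __name__ == "__main__":
    absent = missing_keys(os.environ)
    if absent:
        print("Missing keys in .env:", ", ".join(absent))
    else:
        print("All required API keys are set.")
```

Run it from the backend directory after loading your .env (for example via `python-dotenv` or your shell) to confirm nothing is missing.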
Configure the Daytona sandbox:
Create a Daytona account, build a Suna image, generate a Daytona API key, and add it to .env.
Set up Supabase:
Register on supabase.com, create a project (e.g., suna-test), and note the project reference ID.
Link the local project to Supabase:
supabase login
supabase link --project-ref <PROJECT_ID>
Push the database schema:
npx supabase db push
Configure model access for the frontend by editing .env.local and adding the DeepSeek API key as OPENAI_API_KEY.
Startup Procedure
Start Redis via Docker Compose:
conda activate suna
docker compose up redis
Launch the backend service:
conda activate suna
python api.py
Run the frontend development server:
npm run dev
Access the UI at http://localhost:3000 and register with any email address to begin interacting with Suna.
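Once Redis, the backend, and the frontend are all running, a small readiness poll can confirm the UI is actually serving before you open the browser. This is a sketch under the assumption that the frontend listens on the default port 3000; the probe is passed in as a callable so the retry logic stays easy to test.

```python
import time
import urllib.request

def wait_until(probe, retries=30, delay=1.0):
    """Call probe() until it returns True or retries are exhausted."""
    for _ in range(retries):
        if probe():
            return True
        time.sleep(delay)
    return False

def frontend_up(url="http://localhost:3000"):
    # True if the Suna frontend answers with any non-server-error response.
    try:
        with urllib.request.urlopen(url, timeout=2) as resp:
            return resp.status < 500
    except OSError:
        return False

if __name__ == "__main__":
    print("frontend ready" if wait_until(frontend_up) else "frontend not reachable")
```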
Capability Demonstration
A complex task was used to evaluate Suna: retrieve Kaggle competition information from the past six months related to large‑scale multimodal models, extract titles, links, dates, organizers, task types, and resource constraints, and output a structured report (table or JSON). Suna planned the task, executed browser actions, ran command‑line tools, wrote Python scripts, and created files inside the sandbox, demonstrating end‑to‑end autonomous operation.
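The structured report the task asks for can be pinned down with a small schema. The sketch below is an assumed layout built from the fields named in the prompt (title, link, date, organizer, task type, resource constraints); Suna's actual output format may differ, and the sample entry is placeholder data, not a real competition.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class Competition:
    # Fields requested in the task prompt; names here are illustrative.
    title: str
    link: str
    date: str            # e.g. "2025-01-15"
    organizer: str
    task_type: str
    resource_constraints: str

def to_report(entries):
    """Serialize a list of Competition entries as a JSON report."""
    return json.dumps([asdict(e) for e in entries], indent=2, ensure_ascii=False)

if __name__ == "__main__":
    sample = Competition(
        title="Example Multimodal Challenge",  # placeholder, not a real listing
        link="https://www.kaggle.com/competitions/example",
        organizer="Example Org",
        date="2025-01-15",
        task_type="multimodal reasoning",
        resource_constraints="single GPU, 9h runtime",
    )
    print(to_report([sample]))
```

A table output is the same data with the six fields as columns; the JSON form is easier to validate programmatically.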
Fun with Large Models
A master's graduate of Beijing Institute of Technology with four papers in top journals, formerly a developer at ByteDance and Alibaba, now researching large models at a major state-owned enterprise. Committed to sharing concise, practical experience in large-model development, in the belief that AI large models will become as essential as the PC. Let's start experimenting now!