How Code2Video Turns Code into High‑Quality Teaching Videos in Minutes
Code2Video, an open‑source framework from NUS Show Lab, automates the creation of professional teaching videos. It converts a knowledge point into executable Manim code and uses multi‑agent AI collaboration to produce coherent, visually appealing animations with subtitles across 117 topics, dramatically reducing production time.
Overview
Code2Video is an open‑source, code‑driven video generation framework that converts a textual knowledge point (e.g., “matrix and linear transformations”) into an executable Manim script, renders animated scenes, adds subtitles and illustrations, and outputs a polished teaching video.
Core Architecture
Three AI agents:
Planner – decomposes the knowledge point into a storyboard (introduction, principle, example, summary) to ensure logical flow.
Coder – generates Manim code that defines precise animation logic (e.g., vector rotation, tower of Hanoi disks) and inserts subtitles and graphics.
Critic – evaluates visual layout, color scheme, and overall aesthetic, then refines the script.
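As a mental model, the Planner → Coder → Critic hand‑off can be sketched in plain Python. Everything below (the function names, the Storyboard class, the section labels) is illustrative and does not come from the Code2Video codebase:

```python
# Hypothetical sketch of the Planner -> Coder -> Critic loop.
# Names are illustrative, not taken from Code2Video.
from dataclasses import dataclass, field

SECTIONS = ["introduction", "principle", "example", "summary"]

@dataclass
class Storyboard:
    topic: str
    sections: list = field(default_factory=lambda: list(SECTIONS))

def plan(topic: str) -> Storyboard:
    """Planner: decompose the knowledge point into a storyboard."""
    return Storyboard(topic)

def generate_code(board: Storyboard) -> str:
    """Coder: emit one scene stub per storyboard section."""
    return "\n".join(f"# Scene: {s} of {board.topic}" for s in board.sections)

def critique(code: str) -> str:
    """Critic: placeholder pass where layout/colors would be refined."""
    return code  # a real critic would edit the script and re-render

def run_pipeline(topic: str) -> str:
    return critique(generate_code(plan(topic)))

script = run_pipeline("matrix and linear transformations")
```

In the real system the Critic's feedback loops back into the Coder until the rendered frames pass its visual checks; the sketch collapses that loop into a single pass.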
Dataset: A built‑in MMMC benchmark provides 117 high‑frequency topics, ranging from linear algebra to algorithms, enabling out‑of‑the‑box video generation without manual design.
Reproducibility: The generated Manim code is fully open‑source and can be edited (e.g., change the animation speed or the number of disks) to produce variant videos while preserving style consistency.
Evaluation framework: Scores each video on knowledge transmission, visual appeal, and generation efficiency, and can automatically fix code bugs.
License: Distributed under the MIT license, allowing unrestricted local deployment.
Typical Use Cases
1. Teachers creating mathematics videos
Install Python 3.8+ and Manim Community v0.19.0.
Run the single‑concept script with the desired knowledge point:
sh run_agent_single.sh --knowledge_point "Linear transformations and matrices"
The Planner creates a storyboard, the Coder produces Manim code for vector rotation and scaling, and the Critic optimizes the layout.
Render the video (≈10 minutes) and deliver a clear animation that illustrates matrix effects.
Modify parameters in the generated script (e.g., number of disks in a Tower of Hanoi animation) to create new variants in minutes.
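To illustrate why the disk count is such an easy knob to turn, here is a plain‑Python Tower of Hanoi move generator (a standalone sketch, not Code2Video's generated Manim code). An n‑disk puzzle always takes 2**n − 1 moves, so changing one parameter rescales the whole animation:

```python
def hanoi_moves(n, src="A", aux="B", dst="C"):
    """Yield (disk, from_peg, to_peg) moves solving an n-disk Tower of Hanoi."""
    if n == 0:
        return
    # Move the n-1 smaller disks out of the way onto the auxiliary peg
    yield from hanoi_moves(n - 1, src, dst, aux)
    # Move the largest disk to its destination
    yield (n, src, dst)
    # Move the n-1 smaller disks on top of it
    yield from hanoi_moves(n - 1, aux, src, dst)

moves = list(hanoi_moves(3))
# A 3-disk puzzle takes 2**3 - 1 = 7 moves
```

An animation script would simply iterate over `moves` and play one transition per tuple, which is why regenerating a 5‑ or 7‑disk variant takes minutes rather than a redesign.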
2. Technical bloggers producing algorithm series
Add topics such as "Quick Sort", "Binary Search", "Tower of Hanoi" to json_files/long_video_topics_list.json.
Run the batch script:
sh run_agent.sh --MAX_CONCEPTS 5
The three agents automatically generate videos with consistent styling, subtitles, and algorithmic visualizations.
New topics can be added simply by editing the JSON file; no redesign is required.
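For illustration, such a topics file could be as simple as a JSON array; the actual schema is an assumption on my part, so check the sample file shipped in the repository for the real format:

```json
[
  "Quick Sort",
  "Binary Search",
  "Tower of Hanoi"
]
```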
3. Students creating personal revision videos
Specify a concept (e.g., "Pure Fourier Series") and select an academic visual style.
The system explains the definition, then animates the decomposition of a square wave into sine components.
Exported Manim code allows the student to adjust parameters such as the number of sine waves.
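The square‑wave decomposition the video animates can be checked numerically: a unit square wave's Fourier series keeps only odd harmonics, f(x) = (4/π) Σ sin((2k+1)x)/(2k+1). A minimal plain‑Python sketch, unrelated to the generated Manim code:

```python
import math

def square_wave_partial_sum(x: float, n_terms: int) -> float:
    """Sum the first n_terms odd harmonics of a unit square wave."""
    return (4 / math.pi) * sum(
        math.sin((2 * k + 1) * x) / (2 * k + 1) for k in range(n_terms)
    )

# More sine terms -> closer to the ideal value of 1 on (0, pi)
approx = [square_wave_partial_sum(math.pi / 2, n) for n in (1, 5, 50)]
```

The number of terms here is exactly the kind of parameter the exported Manim script lets the student adjust: each extra sine wave visibly sharpens the square wave's edges.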
Getting Started
Step 1 – Prepare the environment
Ensure Python 3.8+ is installed and install Manim Community v0.19.0.
Clone the repository and install Python dependencies:
git clone https://github.com/showlab/Code2Video.git
cd Code2Video/src
pip install -r requirements.txt
Step 2 – Configure API keys
Edit api_config.json to provide required keys:
LLM API (recommended Claude‑4‑Opus) for high‑quality Manim code generation.
VLM API (recommended Gemini‑2.5‑Pro) for layout optimization.
Optional IconFinder API for additional icons.
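A filled‑in api_config.json might look like the sketch below; every key name here is hypothetical, so use the field names from the template that ships with the repository:

```json
{
  "llm_api_key": "<your Claude key>",
  "vlm_api_key": "<your Gemini key>",
  "iconfinder_api_key": "<optional>"
}
```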
Step 3 – Generate videos
Single‑concept mode (new users):
sh run_agent_single.sh --knowledge_point "Tower of Hanoi"
Output is saved in CASES/TEST-single/.
Batch mode (series production): edit json_files/long_video_topics_list.json with multiple concepts and run:
sh run_agent.sh --MAX_CONCEPTS 3  # generate the first three videos
Repository
Project source code:
https://github.com/showlab/Code2Video

Old Meng AI Explorer
