Hyperframes vs Remotion: Write HTML to Auto‑Generate Movie‑Quality Videos with AI
This article introduces Hyperframes, an open-source video rendering framework that lets developers compose videos from plain HTML with the help of AI agents. It compares the framework with Remotion and covers step-by-step usage, the component library, and the tooling that makes low-barrier, automated video creation possible.
Problem
Front‑end developers and AI engineers often need to create quick product demos, data visualizations, or short‑video hooks, but traditional video tools require learning professional editing software and do not support code‑based or AI‑compatible workflows.
Solution – Hyperframes
Hyperframes is an open-source video rendering framework that lets developers define video compositions with plain HTML, CSS, and GSAP, preview them in real time, and render the final output as an MP4. Its tagline: "Write HTML. Render video. Built for agents." It runs without any front-end framework, requires no bundling step, and renders deterministically, which suits automated pipelines.
Quick‑start methods
Method 1 – AI coding agents
Install the Hyperframes skill package with npx skills add heygen-com/hyperframes. Once installed, AI tools such as Claude Code, Cursor, Gemini CLI, and Codex can compose videos, invoke the CLI, and use GSAP animations. Three slash commands become available: /hyperframes (create a video), /hyperframes-cli (CLI usage), and /gsap (animation support). Example prompts include:
creating a 10-second product intro with a fade-in title, background video, and music;
converting a PDF summary into a 45-second pitch video;
generating a 9:16 TikTok-style hook with subtitles synced to TTS;
iterative edits such as enlarging the title, switching to dark mode, or adding a subtitle at a specific timestamp.
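For instance, a first prompt handed to an agent might read as follows (the title, file name, and exact durations are placeholders, not taken from the project docs):

Create a 10-second 1920x1080 product intro. Fade in the title "My Product" over the first 2 seconds, play intro.mp4 as the background, and add background music at 50% volume.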
Method 2 – Manual project creation
Run npx hyperframes init my-video, then npx hyperframes preview for a hot-reload browser preview and npx hyperframes render to produce an MP4. The init command also installs the skill package, so the project can be handed to an AI agent for further iteration. Requirements: Node.js ≥ 22 and FFmpeg.
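A typical first session, using only the commands above (the directory name is illustrative):

npx hyperframes init my-video   # scaffold a project and install the skill package
cd my-video
npx hyperframes preview         # hot-reload preview in the browser
npx hyperframes render          # capture frames and encode the final MP4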
Key design principles
HTML-native: a composition is plain HTML with data attributes; no framework or DSL.
AI-first: a non-interactive CLI designed for direct invocation by agents.
Deterministic rendering: identical input always yields identical output, enabling reliable automation.
Frame Adapter pattern: pluggable animation runtimes such as GSAP, Lottie, CSS, or Three.js (a minimal GSAP sketch follows this list).
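A minimal sketch of the GSAP path. The IDs, tween values, and the assumption that the frame adapter (rather than the wall clock) advances the timeline are illustrative, not taken from the docs:

<div id="stage" data-composition-id="gsap-demo" data-start="0" data-width="1920" data-height="1080">
  <h1 id="title" class="clip" data-start="0" data-duration="4">Hyperframes</h1>
  <script src="https://cdn.jsdelivr.net/npm/gsap@3/dist/gsap.min.js"></script>
  <script>
    // Plain GSAP: fade and lift the title over the first second.
    // Per the Frame Adapter design, Hyperframes is assumed to drive
    // this timeline frame by frame for deterministic output.
    const tl = gsap.timeline();
    tl.from("#title", { opacity: 0, y: 40, duration: 1, ease: "power2.out" });
  </script>
</div>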
Comparison with Remotion
Authoring: Hyperframes uses HTML + CSS + GSAP; Remotion uses React TSX components.
Build step: Hyperframes runs directly from index.html; Remotion requires bundling and a build step.
Animation clock: Hyperframes offers frame-precise, seekable timing; Remotion animations follow the real-time clock.
HTML/CSS directness: Hyperframes lets you paste existing HTML/CSS and animate it directly (see the sketch after this list); Remotion requires converting it to JSX.
Distributed rendering: Hyperframes currently supports single-machine rendering only; Remotion offers production-ready distributed rendering on Lambda.
License: Hyperframes is Apache 2.0 (fully open source); Remotion's source is visible, but commercial use requires a license.
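To make the paste-and-use point concrete, here is a minimal sketch of an existing CSS keyframe animation dropped into a composition unchanged (the IDs and values are illustrative; frame-accurate playback of CSS animations relies on the CSS frame adapter noted earlier):

<div id="stage" data-composition-id="css-demo" data-start="0" data-width="1080" data-height="1920">
  <style>
    /* An ordinary CSS animation, pasted as-is from any web page. */
    #hook { animation: rise 1s ease-out both; }
    @keyframes rise {
      from { opacity: 0; transform: translateY(40px); }
      to   { opacity: 1; transform: none; }
    }
  </style>
  <h1 id="hook" class="clip" data-start="0" data-duration="3">Wait for it</h1>
</div>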
Core usage pattern
Video elements are described with data attributes such as data-start, data-duration, data-track-index, and data-volume. Example composition:
<div id="stage" data-composition-id="my-video" data-start="0" data-width="1920" data-height="1080">
<video id="clip-1" data-start="0" data-duration="5" data-track-index="0" src="intro.mp4" muted playsinline></video>
<img id="overlay" class="clip" data-start="2" data-duration="3" data-track-index="1" src="logo.png" />
<audio id="bg-music" data-start="0" data-duration="9" data-track-index="2" data-volume="0.5" src="music.wav"></audio>
</div>

Defining these attributes is sufficient; the composition can be previewed in a browser and rendered to MP4 with a single command.
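The timing model extends naturally: a second clip placed on the same track, with data-start equal to the first clip's end point, plays immediately after it. A sketch (the file name is a placeholder, and the exact sequencing semantics are an assumption inferred from the attributes above):

<video id="clip-2" data-start="5" data-duration="4" data-track-index="0" src="outro.mp4" muted playsinline></video>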
Component library
Hyperframes ships with over 50 pre‑built blocks (social overlays, transitions, data charts, cinematic effects). Components can be added via commands, for example:
npx hyperframes add flash-through-white # white flash transition
npx hyperframes add instagram-follow # social follow widget
npx hyperframes add data-chart # animated chart
Toolchain packages
hyperframes: CLI for creating, previewing, checking, and rendering.
@hyperframes/core: types, parser, runtime, frame adapter.
@hyperframes/engine: Puppeteer + FFmpeg capture engine.
@hyperframes/producer: full render pipeline (capture + encode + mix).
@hyperframes/studio: browser-based visual editor.
@hyperframes/player: embeddable Web Component player.
@hyperframes/shader-transitions: WebGL transition library.
AI skill system
Skills expose framework functionality to agents:
hyperframes: video creation, subtitles, TTS, animation, transitions.
hyperframes-cli: all CLI commands.
hyperframes-registry: component installation via hyperframes add.
website-to-hyperframes: convert a web page URL directly to video.
gsap: GSAP animations, timelines, easings, and plugins.
Target users
Front‑end engineers who want to quickly produce product videos or animated demos using HTML.
AI developers building automated video‑generation pipelines.
Short‑video teams needing batch‑processable, reusable, iteratively editable templates.
Teams requiring a fully open‑source solution without commercial restrictions.
Project repository
https://github.com/heygen-com/hyperframes