# HyperFrames — programmatic HTML-to-video framework, agent-first
## Why this is in the vault
Founder shared the catalog + GitHub links on iMessage 2026-05-06 with no comment beyond a follow-up "Update to HyperFrames." File it. This is the agent-native Remotion alternative — and the "agent-first" framing isn't marketing fluff: there's a real MCP server, a skills registry, and a deterministic-render CLI. RDCO has multiple video-adjacent projects (MAC landing motion, Sanity Check intros, Ray mascot animation) that have been blocked on exactly this missing layer; HyperFrames is shaped to fill it.
## What it is
HyperFrames is an open-source video rendering framework that takes HTML/CSS compositions and renders them deterministically to MP4. Compositions are authored as plain HTML with `data-*` attributes for timing/composition — not React, which is Remotion's approach. Rendering is fully local: it needs Node 22 + FFmpeg, and no HeyGen account or API key.
Despite being a HeyGen-released project, it's architecturally independent of the HeyGen avatar/lipsync API — a different product class entirely. HeyGen does generative avatar/voice; HyperFrames does deterministic motion graphics.
## Capability vs current RDCO video stack
| Need | Current tool | HyperFrames adds |
|---|---|---|
| Avatar / lipsync video | HeyGen MCP (existing) | nothing — different product |
| Generative pixels (image/video) | xAI grok-imagine, Kling | nothing — generative is different domain |
| Voice / sound effects | ElevenLabs MCP | nothing — out of scope |
| Motion graphics overlays | none | ✅ first solution we have |
| Deterministic HTML→video | none | ✅ first solution |
| Programmatic transitions / shaders | none | ✅ catalog includes 60+ blocks |
| Sanity Check episode plates | manual / Canva | ✅ scriptable |
| MAC landing hero loop | static cuts | ✅ deterministic motion |
| Ray mascot composition | per-frame xAI generation | ✅ assemble + transition layer |
This is purely additive — no overlap with what we run today.
## Agent-friendly features (the load-bearing ones for RDCO)
- Non-interactive CLI: `npx hyperframes render` — batch-renderable from a skill
- MCP server in registry — wire-up to Claude Code is <1 hour
- Skills system: `npx skills add heygen-com/hyperframes` — composition patterns auto-load into Claude
- Deterministic output — same input → byte-identical video. Critical for verify-action-style post-condition checks
- HTML authoring — familiar territory for the Astro/CF Pages stack we already build sites in
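Determinism is what makes the post-condition checks cheap: a skill can render twice (or compare against a stored hash) and assert byte equality. A minimal sketch of that check — the file paths are placeholders, and nothing here is HyperFrames-specific:

```python
import hashlib
from pathlib import Path


def sha256_of(path: Path) -> str:
    """Hash a rendered file so two renders can be compared byte-for-byte."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def renders_match(a: Path, b: Path) -> bool:
    """Post-condition check: deterministic rendering implies identical hashes."""
    return sha256_of(a) == sha256_of(b)
```

A verify-action skill would call `renders_match` after a re-render and fail loudly on any drift.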
## Catalog (60+ blocks)
- Social card overlays: Instagram Follow, X Post, Spotify Now Playing, YouTube Subscribe
- Shader transitions, glitch effects, shimmer, grain
- VFX micro-components
These are the polish layer that elevates baseline footage. Squarely in the founder’s high-personality / engraving / glitch / Memphis design bucket.
## Mapping against Ray Data Co
Strong. Three concrete near-term applications:
- MAC landing-page hero motion — current v0-polish hero is static blocky type. A 4-6 second HyperFrames loop (subtle shader glitch on the type, scroll-trigger scrub) would push the design from “good first pass” to “vibrant landing.” Apache 2.0 + deterministic = fits the Cloudflare Pages build pipeline cleanly.
- Sanity Check intro/outro plates — instead of paying Canva or manually cutting, write the intro composition once as a Hyperframe template and parameterize per episode. RDCO-IP-shaped (the template lives in our repo, regenerates with each episode title).
- Ray mascot animation system (Notion task `344f7d49-36d1-81a7-b4db-dacff823c9f9`) — the current bottleneck is composing per-frame xAI sprite outputs into smooth motion. HyperFrames' HTML authoring + deterministic timing solves the "stitch frames into video" half of that pipeline. The "instantiation pattern" memory says variation is acceptable; HyperFrames doesn't enforce frame-to-frame identicality, it just handles composition.
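For context on the stitch half, the bare-FFmpeg fallback we already have locally looks like the sketch below: build the argument list for turning numbered sprite frames into an MP4. The filename pattern, fps, and codec choices are illustrative defaults, not anything HyperFrames prescribes:

```python
def ffmpeg_stitch_cmd(pattern: str, out: str, fps: int = 24) -> list[str]:
    """Assemble the FFmpeg argument list for stitching numbered sprite
    frames (e.g. frames/ray_0001.png, frames/ray_0002.png, ...) into an MP4."""
    return [
        "ffmpeg", "-y",             # overwrite output without prompting
        "-framerate", str(fps),
        "-i", pattern,              # e.g. "frames/ray_%04d.png"
        "-c:v", "libx264",
        "-pix_fmt", "yuv420p",      # broad player compatibility
        out,
    ]
```

This is the piece HyperFrames would replace with its HTML composition layer; the frame-numbering convention stays useful either way.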
## Sponsor / bias notes
HeyGen sponsors the project (it's released under their org). Two readings:
- The OSS license is Apache 2.0 with no commercial restrictions, no telemetry called out in the README. Genuinely free.
- The catalog page is hosted on hyperframes.heygen.com — that surface MIGHT push paid HeyGen avatar integrations as you scroll. The OSS framework itself does not require HeyGen.
## Recommended action
Tier 1 (do now, low cost): Install the HyperFrames MCP into Claude Code. ~1hr. Creates the optionality without committing to any specific video project.
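For Claude Code, the wire-up is likely just a project-scoped `.mcp.json` entry. The launch command below is an assumption (verify the actual MCP invocation against the repo README):

```json
{
  "mcpServers": {
    "hyperframes": {
      "command": "npx",
      "args": ["hyperframes", "mcp"]
    }
  }
}
```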
Tier 2 (when MAC v0 ships): Use HyperFrames for the MAC landing hero motion as the canary integration. Validates the workflow on a live asset before deeper integration.
Tier 3 (if Sanity Check v3 ships): Build a parameterized intro/outro template for episodes. Compounds with each episode.
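The per-episode parameterization in Tier 3 can be a plain template substitution over the HTML source before it's handed to the render CLI. A sketch with a made-up plate structure — the real template would live in our repo and carry HyperFrames timing attributes:

```python
from string import Template

# Hypothetical intro-plate markup; placeholder structure only.
INTRO = Template(
    '<main class="intro-plate">'
    "<h1>Sanity Check</h1>"
    "<h2>Ep. $number: $title</h2>"
    "</main>"
)


def render_intro(number: int, title: str) -> str:
    """Fill the per-episode parameters into the shared intro template."""
    return INTRO.substitute(number=number, title=title)
```

Each episode build then writes `render_intro(...)` output to an HTML file and feeds it to the renderer, so the template compounds instead of being redone per episode.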
## Cross-references
- `~/.claude/skills/ray-mascot-anim/SKILL.md` — current mascot pipeline, candidate for HyperFrames composition layer
- `~/.claude/skills/build-landing-page/SKILL.md` — landing-page workflow, candidate for HyperFrames hero loops
- `~/rdco-vault/01-projects/mac-landing/2026-05-05-build-spec.md` — MAC v0 build spec, where the Tier-2 canary lives
- `~/.claude/projects/-Users-ray/memory/feedback_design_taste_high_personality.md` — design taste, why HyperFrames' shader/glitch catalog fits
- HeyGen MCP (currently installed) — different product class; HyperFrames is parallel, not redundant
## Related
- remotion-alternatives
- programmatic-video
- mac-landing
- ray-mascot
- sanity-check