LongCut: Deep Learning from Long YouTube Videos
Don’t Take the Shortcut. Take the LongCut.
YouTube is now the biggest podcasting platform in the US. Video podcasts, founder interviews, lectures, tutorials; many run over an hour and are packed with high-value information. The temptation is to toss the link into an AI tool and get a quick text summary. But as Sarah Zhang argues, converting a high-bandwidth format like video into a low-bandwidth format like text loses everything worth watching: the demo on screen, the speaker’s body language, the anecdote that hits you in the gut, the turn of phrase that makes a line memorable.
LongCut (formerly TLDW, "Too Long; Didn't Watch") is her answer. Instead of using AI for compression (1-hour video → 1-paragraph summary), it uses AI for filtering and curation: it picks out the 5 minutes you should pay attention to, then gives you the original clips without watering them down.
“Don’t take the shortcut in your learning. Take the longcut.”
The Origin Story
Sarah Zhang built LongCut on nights and weekends with two partners: Samuel Zhang (developer) and Yiqi Yan (designer). It started from a personal pain point: she was spending more hours watching long YouTube videos to learn, but noticed many of her followers lacked the attention span to sit through them. Rather than accept that tradeoff, she shipped a tool that makes deep engagement practical.
How It Works
Paste a YouTube URL and LongCut generates a structured learning workspace:
🎯 Personalized Highlight Reels
The core innovation. An LLM analyzes the video transcript and identifies the most high-signal segments. These highlights appear as colored markers on the progress bar; click any marker to jump straight to that timestamp. Crucially, this is personalized: an AI researcher watching a Sam Altman talk might search for “model training,” while a product manager searches for “productization.” Different viewers, different highlights, same video.
Two generation modes are available:
- Smart - Prioritizes quality with more thorough analysis
- Fast - Prioritizes speed for quick iteration
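The personalized-highlights idea can be sketched as a prompt builder: the transcript (with timestamps) and the viewer's focus go to the LLM together, and the mode toggles analysis depth. This is an illustrative sketch, not LongCut's actual code; `TranscriptSegment`, `buildHighlightPrompt`, and the JSON shape are assumptions.

```typescript
// Illustrative sketch of personalized highlight generation.
// All names and the output format are assumptions, not LongCut's API.
interface TranscriptSegment {
  start: number; // seconds into the video
  text: string;
}

function buildHighlightPrompt(
  segments: TranscriptSegment[],
  focus: string, // e.g. "model training" vs "productization"
  mode: "smart" | "fast"
): string {
  const transcript = segments
    .map((s) => `[${s.start}s] ${s.text}`)
    .join("\n");
  const depth =
    mode === "smart"
      ? "Analyze the whole transcript carefully before choosing."
      : "Choose quickly from the most obvious candidates.";
  return [
    `A viewer cares about: "${focus}".`,
    depth,
    "Return the 5 highest-signal segments as JSON:",
    `[{"start": <sec>, "end": <sec>, "label": "<short title>"}]`,
    "",
    "Transcript:",
    transcript,
  ].join("\n");
}
```

Because the viewer's focus is part of the prompt, the same video yields different highlight markers for different viewers.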
💬 Transcript-Aware Chat
Chat with the entire video transcript. Ask “what are the juiciest quotes?” or “how does the speaker feel about [topic]?” The AI responds with structured answers and timestamp citations, grounded in the full transcript context.
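Grounding chat in the full transcript amounts to putting the transcript into the system message and instructing the model to cite timestamps. A minimal sketch, assuming a generic chat-message shape (not LongCut's internals):

```typescript
// Sketch: ground a chat question in the full transcript and require
// timestamp citations. Message shape is a common convention, assumed here.
interface ChatTurn {
  role: "system" | "user";
  content: string;
}

function buildChatMessages(transcript: string, question: string): ChatTurn[] {
  return [
    {
      role: "system",
      content:
        "Answer using ONLY the transcript below. Cite a [mm:ss] timestamp " +
        "for every claim you make.\n\nTranscript:\n" + transcript,
    },
    { role: "user", content: question },
  ];
}
```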
📝 Context-Specific Explanations
Select any term or jargon in the transcript and click "Explain." Because the AI has the full transcript in its context window, the explanation is context-specific: not a generic definition pulled from a search engine, but what the term means in this particular conversation.
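The difference from a dictionary lookup is entirely in how the prompt is framed; a hedged sketch (the function and wording are illustrative assumptions):

```typescript
// Sketch of a context-specific "Explain" prompt: the selected term and the
// surrounding transcript go to the model together, so the answer reflects
// this conversation rather than a generic definition.
function buildExplainPrompt(term: string, surroundingTranscript: string): string {
  return [
    `Explain what "${term}" means in this specific conversation.`,
    "Do not give a generic definition; use the speakers' own framing and examples.",
    "",
    "Transcript excerpt:",
    surroundingTranscript,
  ].join("\n");
}
```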
📋 Personal Notes
Select memorable lines from the transcript and save them to a personal notebook. All notes from all videos are aggregated in an /all-notes dashboard with filtering, sorting, and markdown rendering.
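The cross-video dashboard described above boils down to one data shape plus filtering and sorting. Field names here are assumptions for illustration, not LongCut's schema:

```typescript
// Illustrative note shape and the filter/sort behind an all-notes view.
interface Note {
  videoId: string;
  timestamp: number; // seconds into the source video
  text: string;
  createdAt: Date;
}

// Optionally restrict to one video; newest notes first.
function filterAndSortNotes(notes: Note[], videoId?: string): Note[] {
  return notes
    .filter((n) => videoId === undefined || n.videoId === videoId)
    .sort((a, b) => b.createdAt.getTime() - a.createdAt.getTime());
}
```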
📺 Transcript Viewer
The transcript stays in sync with the YouTube player. Click any sentence to jump to that exact moment. Capture quotes with one click.
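Click-to-seek and transcript sync can be sketched in two parts: a pure helper that finds the sentence containing the current playback time, and (in the browser) a call to the YouTube IFrame Player API's real `player.seekTo(seconds, allowSeekAhead)` method. The `Sentence` shape is an assumption:

```typescript
// Each transcript sentence carries its start time (an assumed shape).
interface Sentence {
  start: number; // seconds
  text: string;
}

// Return the index of the sentence whose span contains currentTime,
// so the transcript view can highlight it as the video plays.
// Assumes sentences are sorted by start time.
function activeSentenceIndex(sentences: Sentence[], currentTime: number): number {
  let active = 0;
  for (let i = 0; i < sentences.length; i++) {
    if (sentences[i].start <= currentTime) active = i;
    else break;
  }
  return active;
}

// In the browser, clicking a sentence would seek the embedded player:
//   onClick={() => player.seekTo(sentence.start, true)}
```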
Key Features
- AI highlight reels with Smart and Fast generation modes, Play All playback, and theme-based re-generation
- Quick preview, structured summary, suggested questions, and memorable quotes surfaced in parallel
- Transcript-aware chat with timestamp citations and provider fallback handling
- Personal notes workspace with transcript, chat, and takeaway sources
- Cross-video notebook (/all-notes) for reviewing notes across all analyzed videos
- Authenticated library for saved analyses, favorites, and generation limits
- Aggressive caching of previous analyses with background refresh
- Security: CSP headers, CSRF protection, body-size caps, and rate limiting
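The "aggressive caching with background refresh" bullet describes a stale-while-revalidate pattern: serve the cached analysis immediately, and if it is old, kick off a refresh without blocking the reader. A minimal sketch under assumed names (this is not LongCut's implementation):

```typescript
// Sketch of cache-then-background-refresh (stale-while-revalidate).
interface CacheEntry<T> {
  value: T;
  fetchedAt: number; // epoch ms
}

const cache = new Map<string, CacheEntry<string>>();
const MAX_AGE_MS = 60 * 60 * 1000; // treat analyses older than 1h as stale

async function getAnalysis(
  videoUrl: string,
  analyze: (url: string) => Promise<string> // the (assumed) expensive LLM pass
): Promise<string> {
  const hit = cache.get(videoUrl);
  if (hit) {
    if (Date.now() - hit.fetchedAt > MAX_AGE_MS) {
      // Stale: refresh in the background; the caller still gets the old copy now.
      analyze(videoUrl)
        .then((v) => cache.set(videoUrl, { value: v, fetchedAt: Date.now() }))
        .catch(() => {
          /* on refresh failure, keep serving the stale copy */
        });
    }
    return hit.value;
  }
  // Cache miss: do the expensive analysis once and store it.
  const value = await analyze(videoUrl);
  cache.set(videoUrl, { value, fetchedAt: Date.now() });
  return value;
}
```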
Tech Stack
LongCut is open source under AGPL-3.0, built with a modern stack:
- Frontend: Next.js 15, React 19, TypeScript, Tailwind CSS v4, shadcn/ui
- AI: xAI Grok 4 Fast (default) with optional Gemini adapter
- Transcripts: Supadata API
- Persistence & Auth: Supabase (Postgres + Auth)
- Security: CSRF tokens, Zod validation, rate limiting, CSP headers
Get Started
The fastest way is to visit longcut.ai and paste a YouTube URL. You can try a pre-generated example to see it in action.
To self-host:
```
git clone https://github.com/SamuelZ12/longcut.git
cd longcut
npm install
```

Configure .env.local with your API keys:
| Variable | Required | Description |
|---|---|---|
| XAI_API_KEY | yes* | xAI Grok API key |
| GEMINI_API_KEY | optional* | Google Gemini API key |
| SUPADATA_API_KEY | yes | Transcript API key |
| NEXT_PUBLIC_SUPABASE_URL | yes | Supabase project URL |
| NEXT_PUBLIC_SUPABASE_ANON_KEY | yes | Supabase anon key |
| CSRF_SALT | yes | Random string for CSRF tokens |
*At least one AI provider key is required.
```
npm run dev   # → http://localhost:3000
```

🔗 GitHub: github.com/SamuelZ12/longcut (⭐ 1k stars)
Why This Tool Rocks
- Deep over shallow: Preserves the fidelity of original video content instead of flattening it into generic summaries
- Personalized, not generic: Different viewers get different highlights based on what they care about
- Learn in public: Notes and highlights become artifacts you can revisit and share
- Open source: AGPL-3.0 - fork it, extend it, self-host it
- Built by learners, for learners: Born from genuine frustration with how AI tools were being used for video consumption
Crepi il lupo! ("May the wolf die!") 🐺