How to Set Up and Run Feynman: AI Research Assistant


Andrej Karpathy released Feynman to automate the boring parts of research. It is an AI assistant that dispatches multiple agents to search, read, cross-reference, and synthesize information from academic papers and the web.

Here is how it works:

Feynman uses Google search and arXiv to fetch full texts, summaries, and citations automatically.

100% Open Source https://github.com/getcomposition-ai/feynman/

Why Feynman?

Karpathy built Feynman to automate research tasks end to end. It is not just a chat interface: its agents do the searching, reading, and cross-referencing for you, then synthesize what they find.

The flagship workflow is deep research. You type a question, and Feynman spins up agents that:

  • Search arXiv and Google Scholar
  • Read and extract key findings
  • Cross-reference multiple sources
  • Produce a structured research report with citations

It also handles literature reviews, peer reviews, code audits, replications, source comparisons, draft writing, and more. All from a single terminal command.

Prerequisites

  • A machine you can install on (macOS, Linux, or Windows)
  • Node.js version 20.19.0 or higher (if installing via npm)
  • Or curl (macOS/Linux) or PowerShell (Windows) for the standalone installer
  • An API key for your preferred model provider (OpenAI, Anthropic, OpenRouter, etc.)

Step 1: Install Feynman

The fastest way to get up and running is the one-line installer:

macOS or Linux:

curl -fsSL https://feynman.is/install | bash

The installer detects your OS and architecture automatically. On macOS it supports both Intel and Apple Silicon. On Linux it supports x64 and arm64. The launcher is installed to ~/.local/bin, the bundled runtime is unpacked into ~/.local/share/feynman, and your PATH is updated when needed.
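If `feynman` is not found right after installing, the PATH update may not have reached your current shell yet. A quick check, assuming the default install location described above:

```shell
# Verify the launcher is reachable after the installer runs.
if command -v feynman >/dev/null 2>&1; then
  feynman --version
else
  # New shells pick up the installer's PATH change automatically;
  # for the current shell, add the default install dir manually:
  export PATH="$HOME/.local/bin:$PATH"
  feynman --version
fi
```

Opening a fresh terminal achieves the same thing without touching PATH by hand.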

Windows (PowerShell as Administrator):

irm https://feynman.is/install.ps1 | iex

This installs the Windows runtime bundle under %LOCALAPPDATA%\Programs\feynman, adds its launcher to your user PATH, and lets you re-run the installer at any time to update.

Alternative: install via npm:

npm install -g @composition-ai/feynman

This uses your local Node.js runtime instead of the bundled standalone runtime. It requires a compatible Node.js version that satisfies Feynman’s current engine range: >=20.19.0 <25.
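Before choosing the npm route, you can check whether your local Node.js falls inside that engine range. A rough sketch using `sort -V` for version comparison (the `max` value mirrors the `<25` upper bound):

```shell
# Check the local Node.js against Feynman's engine range (>=20.19.0 <25).
min="20.19.0"
max="25.0.0"
current="$(node --version | sed 's/^v//')"

# sort -V orders version strings numerically; whichever of (min, current)
# sorts first tells us whether current >= min, and likewise for the bound.
if [ "$(printf '%s\n%s\n' "$min" "$current" | sort -V | head -n1)" = "$min" ] \
   && [ "$(printf '%s\n%s\n' "$current" "$max" | sort -V | head -n1)" = "$current" ] \
   && [ "$current" != "$max" ]; then
  echo "Node $current satisfies >=20.19.0 <25; npm install is fine"
else
  echo "Node $current is outside the range; use the standalone installer"
fi
```

If the check fails, the curl/PowerShell installer above sidesteps the requirement entirely because it ships its own runtime.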

Skills only (no runtime): If you only want Feynman’s research skills and not the full terminal runtime, install the skill library separately:

For a user-level install into ~/.codex/skills/feynman:

curl -fsSL https://feynman.is/install-skills | bash

For a repo-local install into .agents/skills/feynman under the current repository:

curl -fsSL https://feynman.is/install-skills | bash -s -- --repo
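Either command drops the skill files into a predictable directory, so you can confirm the install with a quick listing (the two paths are the defaults named above):

```shell
# Check whichever skills location you installed to.
for dir in "$HOME/.codex/skills/feynman" ".agents/skills/feynman"; do
  if [ -d "$dir" ]; then
    echo "skills found in $dir:"
    ls "$dir"
  fi
done
```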

Step 2: Initial Setup

After installation, run the guided setup wizard:

feynman setup

This walks you through:

  • Selecting a default model
  • Authenticating with your provider
  • Optionally installing extra packages for features like web search and document preview

Verify the install:

feynman --version

If you see a version number, you are ready to go. Run feynman doctor at any time to diagnose configuration issues, missing dependencies, or authentication problems.

Step 3: Launch the REPL

Start an interactive session by running:

feynman

You are dropped into a conversational REPL where you can ask research questions, run workflows, and interact with agents in natural language. Type your question and press Enter.

Step 4: Run a One-Shot Prompt

If you want a quick answer without entering the REPL, use the --prompt flag:

feynman --prompt "Summarize the key findings of Attention Is All You Need"

Feynman processes the prompt, prints the response, and exits. This is useful for scripting or piping output into other tools.

Step 5: Start a Deep Research Session

Deep research is the flagship workflow. One question fans out to multiple agents, whose findings are merged into a single report:

feynman
> /deepresearch What are the current approaches to mechanistic interpretability in LLMs?

The agents collaborate to produce a structured research report with citations, key findings, and open questions. The full report is saved to your session directory for later reference.

Step 6: Work with Files

Feynman can read and write files in your working directory. Point it at a paper or codebase for targeted analysis:

feynman --cwd ~/papers
> /review arxiv:2301.07041

You can also ask Feynman to draft documents, audit code, or compare multiple sources by referencing local files directly in your prompts.
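Combining the --cwd and --prompt flags shown above gives a one-shot variant of this; the directory and file names here are placeholders for your own documents:

```shell
# One-shot comparison of two local drafts (file names are hypothetical).
feynman --cwd ~/papers \
  --prompt "Compare the methods sections of draft_v1.md and draft_v2.md"
```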

Step 7: Explore Slash Commands

Type /help inside the REPL to see all available slash commands. Each command maps to a workflow or utility, such as /deepresearch, /review, /draft, /watch, and more. You can also run any workflow directly from the CLI:

feynman deepresearch "transformer architectures for protein folding"

See the Slash Commands reference for the complete list.
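Because workflows run directly from the CLI, they compose with ordinary shell scripting. A sketch that batches several topics from a text file (topics.txt and the reports/ directory are hypothetical names, not Feynman conventions):

```shell
# Run deepresearch once per line of topics.txt, saving each report.
mkdir -p reports
while IFS= read -r topic; do
  # Replace spaces so each topic maps to a clean file name.
  slug="$(printf '%s' "$topic" | tr ' ' '_')"
  feynman deepresearch "$topic" > "reports/$slug.md"
done < topics.txt
```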

What Works

  • Deep research with citations from arXiv and the web
  • Literature reviews across thousands of papers
  • Code audits with specifications
  • Draft writing with source attribution
  • Session persistence (pick up where you left off)
  • Multiple model providers (OpenAI, Anthropic, OpenRouter, and more)

Important Notes

Feynman is a tool for research, literature review, and analysis. Use it for legitimate research, education, and quality-assurance purposes, in compliance with applicable laws and the terms of service of the websites and APIs it touches.

Feynman respects robots.txt and rate limits, and it does not circumvent paywalls or access restricted content. It is designed for research, not content theft.

Result

After running the install command you will have:

  • Feynman installed on your own machine at `~/.local/bin/feynman`
  • Access to deep research, literature reviews, and more
  • Your machine, your rules, no company pressure
  • 100% open source, MIT License

Time to go research. Try asking Feynman about a topic in your field and see the structured report it produces.


Crepi il lupo! 🐺