AlphaCommons

The Human API for the agentic era.

Xuan Zhao — xuanzhao630@gmail.com

01 / 11

Building is free. The bottleneck has shifted.

02 / 11

A closed-loop product evolution engine.

AlphaCommons (public platform) → Codified (MCP in your IDE) → Agent builds m versions → Eval compares → Builder ships

AlphaCommons (upstream)

A public-facing platform where builders and agents source human signal. Post your product → get matched with real humans → receive structured feedback. For cold starts, product launches, agentic eval. No users required.

Codified (downstream)

An MCP product that lives in your coding environment (Claude Code, Cursor, Claude Desktop). Connects to your analytics (PostHog, Amplitude) to monitor your existing users, classifies feedback, and outputs PRDs that plug directly into the builder's dev agent. Sources signal from both your own user base and the AlphaCommons contributor network.

n humans provide feedback → Codified classifies and outputs m PRDs → agents build m versions → eval compares → builder ships → new versions generate new demand for signal. The loop accelerates.
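The loop above can be sketched in a few lines of Python. This is a toy illustration under assumed names — `Feedback`, `classify`, `build`, and `evaluate` are invented for the sketch, not the product's actual API:

```python
from dataclasses import dataclass

# Hypothetical types and functions illustrating the n-feedback -> m-PRD loop;
# none of these names come from the real AlphaCommons/Codified interface.

@dataclass
class Feedback:
    author: str
    text: str

def classify(feedback: list[Feedback]) -> list[str]:
    """Codified step: turn n pieces of human feedback into m PRDs."""
    # Toy classification: one PRD per distinct leading word ("theme").
    themes = {f.text.split()[0] for f in feedback}
    return [f"PRD: address '{t}' feedback" for t in sorted(themes)]

def build(prd: str) -> str:
    """Agent step: each PRD yields a candidate product version."""
    return f"version built from [{prd}]"

def evaluate(versions: list[str]) -> str:
    """Eval step: compare the m versions and pick one to ship."""
    return min(versions)  # placeholder ranking

feedback = [Feedback("a", "onboarding is confusing"),
            Feedback("b", "onboarding takes too long"),
            Feedback("c", "pricing is unclear")]
prds = classify(feedback)             # n feedback items -> m PRDs
versions = [build(p) for p in prds]   # m PRDs -> m versions
shipped = evaluate(versions)          # eval compares -> builder ships
```

Each shipped version would then feed new demand for signal, restarting the loop.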

03 / 11

AlphaCommons — the public-facing platform

Post

Builder or agent requests human signal — feedback on a product, eval of an AI output, expert review of a decision.

Match

AlphaCommons matches the request with the right humans: real users for breadth, domain experts for depth. AI-guided conversations keep signal extraction fast and structured.

Return

Structured human signal flows back into the agent stack — tagged, parsed, ready for product decisions, eval pipelines, or fine-tuning.
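"Tagged, parsed" signal might be shaped like the record below. This is a hypothetical schema sketched for illustration — the field names are invented, not the platform's actual format:

```python
from dataclasses import dataclass, field, asdict
import json

# Hypothetical schema for the structured signal returned to the agent stack;
# field names are illustrative, not the real AlphaCommons format.

@dataclass
class Signal:
    request_id: str
    contributor: str                 # matched human: real user or domain expert
    kind: str                        # "feedback" | "eval" | "expert_review"
    tags: list[str] = field(default_factory=list)
    body: str = ""

record = Signal(
    request_id="req-001",
    contributor="expert-42",
    kind="eval",
    tags=["hallucination", "severity:high"],
    body="The model's answer cites a non-existent RFC.",
)
payload = json.dumps(asdict(record))  # ready for eval pipelines or fine-tuning
```

A JSON payload like this is what would plug into product decisions, eval pipelines, or fine-tuning datasets downstream.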

Powered by the Codified Engine

  • Detective: monitors analytics continuously, detects anomalies, proposes studies proactively
  • Planner: designs the engagement (study type, sampling strategy, outcome-first framing)
  • Recruiter: sources participants from the platform (behavioral criteria, anti-bias screening)
  • Interviewer: runs AI-guided conversations with real humans (behavioral anchoring, depth probing)
  • Synthesizer: classifies feedback (common / subjective / measurable) and outputs structured PRDs for the dev agent
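The five stages above form a sequential handoff. A minimal sketch of that pipeline, with the stage names from the deck but invented data shapes:

```python
# Illustrative sketch of the five Codified Engine stages as sequential
# handoffs. Stage names come from the deck; all data shapes are assumptions.

def detective(analytics: dict) -> str:
    """Monitor analytics; flag the metric with the largest drop."""
    metric, _ = max(analytics.items(), key=lambda kv: kv[1]["drop"])
    return f"anomaly in {metric}"

def planner(anomaly: str) -> dict:
    """Design the engagement: study type, sampling, framing."""
    return {"study": "interview", "about": anomaly, "sample_size": 10}

def recruiter(plan: dict) -> list[str]:
    """Source participants matching behavioral criteria."""
    return [f"participant_{i}" for i in range(plan["sample_size"])]

def interviewer(participants: list[str], plan: dict) -> list[str]:
    """Run AI-guided conversations; collect raw responses."""
    return [f"{p}: response about {plan['about']}" for p in participants]

def synthesizer(responses: list[str]) -> dict:
    """Classify feedback and emit a structured PRD for the dev agent."""
    return {"classes": {"common": len(responses)}, "prd": "PRD draft"}

analytics = {"activation": {"drop": 0.4}, "retention": {"drop": 0.1}}
anomaly = detective(analytics)
plan = planner(anomaly)
participants = recruiter(plan)
responses = interviewer(participants, plan)
result = synthesizer(responses)
```

Each stage consumes the previous stage's output, which is what lets the engine run end to end without a human research team in the middle.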

A decade of research methodology from Robinhood, Airbnb, Stripe, Google, and Meta — encoded into AI agents.

04 / 11

Three ways teams use the Human API today.

Vibe Coder — Cold Start

launch into silence → launch into signal

Solo builder ships a product with AI and posts it to AlphaCommons. Within hours, 10 real humans try it and return structured feedback on what works and what doesn't. The agent iterates overnight.

AI Startup — Product Intelligence

guessing from dashboards → knowing what to build next

Early-stage team building fast but flying blind. Connects their analytics data. The Codified Engine reconstructs user journeys, identifies where users get stuck, surfaces what's blocking the next level of growth.

Enterprise — Agentic Eval

AI evaluating AI → humans in the loop, at scale

Large AI team needs human ground truth. LLM-as-judge isn't enough for fine-tuning, eval pipelines, or model improvement. Embeds the Human API directly. Real humans evaluate AI outputs — structured, tagged, continuous.

05 / 11

The Human API is a new category.

AI Interview Platforms

Faster, but no data integration, no agent feedback loop. Still separate execution.

Traditional Research SaaS

Expensive, requires research team, days to months. Disconnected from the build cycle.

Why we're structurally different

06 / 11

The engine in action

Full flow — human signal sourced via AlphaCommons → classified by Codified → PRD generated → agent builds → eval compares → builder ships.

Video — coming soon

07 / 11

The community seeds itself.

  • Builders help builders. The same people who build products also try other people's products. They give feedback because they want feedback on theirs.
  • AI enthusiasts and early adopters — free early access to new products in exchange for structured input.
  • Contributors earn product credits they can use across the platform. First cohort is hand-curated.

The incentive: earned equity

  • Product credits — redeemable for access to other products on the platform
  • Revenue share — tied to the impact of your signal
  • Equity-like stakes — earn ownership in what you helped build

08 / 11

Xuan Zhao

Founder & CEO

Former research director and one of the first 60 employees at Robinhood. Directly responsible for 0→1 research behind billion-dollar revenue products such as Robinhood Options, Cash Management, and Banking. Led monetization research teams at Airbnb and Instagram. PhD from the University of Michigan.

Bin Xu

Founding Engineer

Previously led quantitative research at Google Ads, Stripe, AWS, and Meta. PhD from Cornell.

Design partnerships in progress...

We encoded a decade of user research methodology into AI agents — then saw something bigger: the entire way companies form is shifting. We built the Human API: AlphaCommons for builders who need signal now, Codified for companies who need it continuously.

09 / 11

What we need to move.

10 / 11

Every AI agent needs humans in the loop.

No infrastructure exists to provide it.

Until now.

The Human API.

xuanzhao630@gmail.com

11 / 11