Hypandra and AI

Hypandra exists to protect and promote curiosity.

A lot of "AI literacy" today is either hype ("this changes everything!") or fear ("keep it away!"). We think both miss the point. The future isn't inevitable — it's shaped by the tools we build, the norms we tolerate, and the questions we keep asking.

So we focus on how people and AI work together: what gets delegated, what stays human, what tradeoffs are being made, and who's accountable when something goes wrong. We build small things, critique them honestly, and repair them in public — because real understanding comes from participation, not spectatorship.

We don't start with "what can AI do?" We start with five expressions of curiosity, practiced and shared across many areas of life. Our AI principles flow from them.

We teach people to demand better tools — and to build them.

Five expressions of curiosity


How we practice this

These principles shape what we build, and what we build teaches us more about the principles.

Our approach is guided by curiosity as a value and grounded in research on how people and technology work together.

The Handoff framework (Mulligan & Nissenbaum, 2020; Goldenfein et al., 2020; Goldenfein & Griffin, 2022) asks what happens when a task moves from a person to a machine. Even if the job gets done, something changes: who's responsible, who learns, who has control.

Legitimate peripheral participation (Lave & Wenger, 1991) shows that learning happens through doing real work alongside others. When AI takes on a function previously performed by people, what changes? What values are at stake? How do people learn to work with and around these new configurations?

These frameworks grounded Daniel's dissertation research at UC Berkeley: Situating Web Searching in Data Engineering: Admissions, Extensions, Repairs, and Ownership.

See how these principles map to AI literacy frameworks →

Handoff Framework

Read Handoff Stories →
