The Four-Layer Wall Around Your Library’s Public API

James Phoenix

When an agent loop writes most of your library, the largest risk is not a bug in a feature. It is the loop helpfully exporting an internal helper, an experimental type, or a half-finished module. Once that ships in a minor release, you own it forever. Four package-level layers stop the loop from doing this without anyone having to remember.


The Drift That Costs the Most

I have watched agent loops do many things wrong. Most of them are recoverable. A bad implementation gets rewritten. A wrong test gets fixed. A regression gets reverted.

The one that is not recoverable is a leaked internal in your public API. The loop adds a useful helper, marks it export, the next iteration imports it from the package root, the changeset bot bumps the version, and now thousands of users depend on the symbol. You cannot remove it without a major release. You cannot change its signature without breaking everyone. The loop did exactly what you asked, and you ended up paying for a permanent commitment you never approved.

The fix is to build a wall at the package boundary that does not depend on anyone reading the diff. Matt Pocock’s sandcastle has the cleanest version of this I have seen. It uses four independent layers, each enforcing a different constraint. If any one of them is missing, the loop has a path through. With all four in place, the loop physically cannot leak.


Layer One: A Hand-Curated Barrel

Sandcastle’s src/index.ts contains no wildcards. Every public symbol is listed by name:

export { run } from "./run.js";
export type {
  RunOptions,
  RunResult,
  LoggingOption,
  IterationResult,
  IterationUsage,
  Timeouts,
} from "./run.js";

export { interactive } from "./interactive.js";
export type { InteractiveOptions, InteractiveResult } from "./interactive.js";

export { createSandbox } from "./createSandbox.js";
// ... and so on for ~80 named exports

Why this matters: export * from "./foo.js" is the loop’s favourite shortcut. It compiles. It silences the test that imports the new symbol. It feels like the right thing to do. And it leaks every symbol the loop chose to mark export somewhere deep in foo.ts, including the ones it added five minutes ago without thinking.

A hand-curated barrel turns this into a deliberate act. To add a symbol to the public API, the agent must edit src/index.ts. That edit shows up in the diff under a file whose entire job is “what do users see?” PR review collapses from “scan every file for accidentally exported helpers” to “look at this one file.”

Practical rule: never use export * in the package root. Treat src/index.ts as a manifest, not a re-exporter.
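That rule can be enforced mechanically rather than by convention. Here is a minimal sketch of a CI guard; findWildcardExports is a hypothetical helper, not part of sandcastle, and in a real pipeline you would read src/index.ts from disk instead of inlining it:

```typescript
// Sketch of a CI guard against wildcard exports in the barrel.
// findWildcardExports is hypothetical, not part of sandcastle.
function findWildcardExports(source: string): string[] {
  // Matches `export * from "..."` and `export * as ns from "..."`.
  const pattern = /export\s+\*(?:\s+as\s+\w+)?\s+from\s+["'][^"']+["']/g;
  return source.match(pattern) ?? [];
}

// Inlined for illustration; in CI, read src/index.ts instead.
const barrel = [
  'export { run } from "./run.js";',
  'export * from "./helpers.js";', // the shortcut this layer bans
].join("\n");

const violations = findWildcardExports(barrel);
console.log(violations.length); // 1 — the wildcard re-export is caught
```

Wire this into the build (or an ESLint rule restricting wildcard re-exports) and the loop’s favourite shortcut becomes a hard failure instead of a silent leak.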


Layer Two: package.json#exports Subpaths

Sandcastle’s package.json declares which import paths are even legal:

{
  "exports": {
    ".": {
      "import": "./dist/index.js",
      "types": "./dist/index.d.ts"
    },
    "./sandboxes/docker": {
      "import": "./dist/sandboxes/docker.js",
      "types": "./dist/sandboxes/docker.d.ts"
    },
    "./sandboxes/podman": { ... },
    "./sandboxes/vercel": { ... },
    "./sandboxes/daytona": { ... },
    "./sandboxes/no-sandbox": { ... }
  }
}

The list is short and intentional. Five sandbox provider subpaths, plus the root. Every other path inside the package is unreachable from outside. A user trying to import { something } from "@ai-hero/sandcastle/internal/foo" gets ERR_PACKAGE_PATH_NOT_EXPORTED from the Node resolver before TypeScript even runs.

Why this is independent of layer one: layer one limits what gets re-exported through the root. Layer two limits which paths users can target at all. Without exports, a user can deep-import any file under dist/ directly, bypassing your barrel entirely. With exports, the barrel is the only door.
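The allowlist behaviour is simple enough to sketch. This illustrates the principle only — Node’s real resolver also handles export patterns and conditional exports — and resolveSubpath is a hypothetical name:

```typescript
// Illustration of the exports-map allowlist, not Node's actual algorithm.
const exportsMap: Record<string, string> = {
  ".": "./dist/index.js",
  "./sandboxes/docker": "./dist/sandboxes/docker.js",
  "./sandboxes/podman": "./dist/sandboxes/podman.js",
};

function resolveSubpath(subpath: string): string {
  const target = exportsMap[subpath];
  if (target === undefined) {
    // What Node surfaces as ERR_PACKAGE_PATH_NOT_EXPORTED.
    throw new Error(`Package subpath '${subpath}' is not defined by "exports"`);
  }
  return target;
}

console.log(resolveSubpath("./sandboxes/docker")); // "./dist/sandboxes/docker.js"
// resolveSubpath("./internal/foo");               // throws: not defined by "exports"
```

The key property is that resolution is a lookup against a closed list: anything not declared fails, rather than falling back to the filesystem.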

The other thing this gives you is modular subpaths by design. Sandcastle does not re-export docker from the root. It lives at @ai-hero/sandcastle/sandboxes/docker. That decision shows up in tree-shaking, in the README’s import examples, and in the user’s mental model of what is core and what is a plugin. The loop cannot fold a subpath into the root by accident, because the root barrel does not import it.


Layer Three: Optional Peer Dependencies for Plugin Surface

The Vercel and Daytona providers each need a heavy SDK (@vercel/sandbox, @daytona/sdk). Most users will never use them. Sandcastle’s package.json makes them optional peers:

{
  "peerDependencies": {
    "@vercel/sandbox": ">=1.0.0",
    "@daytona/sdk": "^0.164.0"
  },
  "peerDependenciesMeta": {
    "@vercel/sandbox": { "optional": true },
    "@daytona/sdk": { "optional": true }
  }
}

The user only installs the SDK if they import the matching subpath. No install bloat for everyone. No dependencies section ballooning every time a new provider lands.

Why this is independent of the first two layers: barrel curation and subpath exports control what users can reach. Optional peers control what users have to install to reach it. Without this, every new provider is a forced dependency on every consumer of the package, and an agent loop adding “support for X” silently grows your install tree.

The pattern generalises. Anything plugin-shaped, anything optional, anything tied to one cloud or one runtime, belongs as a subpath export plus an optional peer dependency. The loop adds the new file under src/sandboxes/, exposes it as a new entry under exports, and adds the SDK to peerDependenciesMeta. The package root never grows.
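Inside the provider subpath, a common way to pair the code with its optional peer is a guarded dynamic import that turns a missing SDK into an actionable error. This is a sketch of that general pattern, not sandcastle’s actual code; loadVercelSdk is a hypothetical name:

```typescript
// Sketch: fail with an install hint when the optional peer is absent.
// loadVercelSdk is hypothetical; "@vercel/sandbox" is the peer from the article.
async function loadVercelSdk(): Promise<unknown> {
  // Specifier kept in a variable so type-checkers and bundlers
  // don't try to resolve the optional peer eagerly.
  const specifier = "@vercel/sandbox";
  try {
    return await import(specifier);
  } catch {
    throw new Error(
      'The Vercel provider needs the optional peer "@vercel/sandbox".\n' +
        "Install it with: npm install @vercel/sandbox"
    );
  }
}
```

A user who never imports the Vercel subpath never pays the cost; a user who does gets an install command instead of a raw MODULE_NOT_FOUND.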


Layer Four: files: ["dist"]

The smallest layer, and the most often forgotten:

{
  "files": ["dist"]
}

Without it, npm packs everything that is not in .npmignore or .gitignore. src/, tests, fixtures, ADRs, CONTEXT.md, prompts, recordings. All of it goes to the registry. Anything shipped to npm is reachable by users in some way, even if it is not exported, because they can read your repo on disk after install and copy code straight out of it.

files: ["dist"] says: only the compiled output ships. Everything else stays in the source repo. Internal modules under src/ cannot be reached by anyone, because they are not on disk after npm install.

Why this is independent of the first three layers: even with curated barrels, subpath exports, and optional peers, the loop can still ship a half-finished module under src/experiments/. Users will find it in node_modules. Users will copy from it. Bug reports will follow. files: ["dist"] deletes that surface entirely.
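This layer is also cheap to self-enforce. A sketch of a pre-publish guard, assuming you load the real package.json in CI; assertOnlyDistShips is a hypothetical helper, not an npm feature:

```typescript
// Sketch: fail the publish pipeline unless "files" pins the package to dist.
// assertOnlyDistShips is hypothetical, not an npm feature.
interface PackageJson {
  files?: string[];
}

function assertOnlyDistShips(pkg: PackageJson): void {
  if (!Array.isArray(pkg.files) || pkg.files.length !== 1 || pkg.files[0] !== "dist") {
    throw new Error('package.json "files" must be exactly ["dist"]');
  }
}

assertOnlyDistShips({ files: ["dist"] }); // passes silently
// assertOnlyDistShips({});               // throws
```

The manual equivalent is running npm pack --dry-run before publishing and reading the file list it prints; the guard just makes that check impossible to forget.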


Why All Four, Not Three

Each layer fails differently:

| Layer | Fails when… | What leaks |
| --- | --- | --- |
| Curated barrel | The loop adds export * | Every internal in the re-exported file |
| Subpath exports | Missing or contains a wildcard | Deep imports into any file under dist/ |
| Optional peers | Plugin SDK lands in dependencies | Forced install bloat for every user |
| files: ["dist"] | Missing or wrong | Source files, tests, fixtures shipped to npm |

There is no single layer that catches all four failure modes. There is no clever abstraction that collapses them. The wall holds because each brick is independent.


The Agent Loop Angle

If a human is writing every line of the library, you can probably get away with two of these. You will catch the others in review.

If an agent loop is writing most of the lines, you cannot. The loop will, eventually, do every one of these wrong. It will export * because the test failed. It will skip peerDependenciesMeta because the example in its training data did not have one. It will forget files. It will add a subpath without updating exports.

The four layers are what let me trust the loop with the package boundary. Each layer is a deterministic guardrail that fires at install time, build time, or import time. None of them depend on me reading the diff. None of them depend on the loop remembering anything. The wall is in the configuration, not the workflow.


Where This Sits In My Stack

This is the package-boundary cousin of DDD bounded contexts for LLMs. Bounded contexts wall off domains inside an app. The four layers wall off published packages from their own internals. Same instinct, different scope.

It also pairs with hand-rolling the core. The four layers are what let you safely delegate the periphery of a library, because you have a structural wall between what users see and what the loop is allowed to touch. Hand-roll the public surface. Let the loop run free behind it.


Key Takeaway

A library’s public API is a permanent commitment. Curated barrel, subpath exports, optional peer dependencies, files: ["dist"]. Four lines of configuration, four independent failure modes covered, no diff review required. If you are letting an agent loop write most of your library, this is the cheapest insurance you will ever buy.

The wall holds because each brick is independent. There is no clever abstraction that collapses them.
