Context
During the Protocol v2.0 PR review, we performed the first real multi-construct composition in the ecosystem: Observer (riding 4 dApp codebases) → Protocol (wallet boundary verification) → Artisan (material taste context) → Bridgebuilder (adversarial review). This composition worked — it produced findings none of these constructs could have surfaced alone (shared utility lineage across repos, approve-then-act race conditions, ZeroDev migration scars in 4/4 dApps).
But the composition happened entirely inside the context window. No construct stdout. No event fired. No knowledge artifact persisted for next time. The pipes exist (event bus #109, BUTTERFREEZONE mesh, event topology in manifests) but they don't flow.
This RFC proposes the knowledge layer — the missing piece between signals (lightweight events) and composition (rich structured data flowing construct-to-construct).
The Three User Stories
1. As a user, I want to string together constructs like UNIX pipes
Current: Invoke `/ride` (Observer) manually, copy findings, invoke Protocol skills manually, copy findings, invoke Artisan manually.
Desired: Something like:

```
/ride set-and-forgetti | /protocol dapp-lint --manifest ecosystem.yaml | /artisan taste-check
```
Where each construct's structured output becomes the next construct's structured input.
2. As a construct, I want to benefit from coordination and capture tacit knowledge
Current: Protocol's `dapp-lint` scans one codebase at a time. It doesn't know that `useOnSuccess` is a shared utility that exists in 3 other repos. It can't learn from Observer's ground truth about which repos are related.
Desired: Protocol declares `knowledge.consumes: [observer.ground_truth, observer.utc_canvas]` and receives rich context about the ecosystem before scanning. When Protocol discovers patterns (e.g., "approve-then-act race condition in mibera"), it emits a `protocol.ecosystem_pattern` artifact (declared under `knowledge.emits`) that other constructs can consume next time.
3. As a network, I want to offer best-in-class runtime support
Current: The runtime contract (`docs/integration/runtime-contract.md`) defines exit codes, checkpoints, and context signals. No mention of events, knowledge routing, or correlation propagation. The event bus and runtime contract are completely disconnected.
Desired: Runtime contract includes a "Knowledge Routing" section. When a skill completes, the runtime automatically emits the skill's declared events with the checkpoint data. Correlation IDs propagate through skill chains.
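A minimal sketch of that correlation propagation, assuming an environment-variable mechanism; `LOA_CORRELATION_ID` and `run_skill` are hypothetical names for illustration, not contract language:

```shell
#!/usr/bin/env bash
# Hedged sketch: correlation ID propagation through a skill chain.
# LOA_CORRELATION_ID is an assumed variable name; the real contract may differ.
set -euo pipefail

run_skill() {
  local name="$1"
  # The first skill in a chain mints the correlation ID; later skills inherit it.
  : "${LOA_CORRELATION_ID:=$(date +%s)-$$}"
  export LOA_CORRELATION_ID
  echo "skill=${name} correlation=${LOA_CORRELATION_ID}"
}

run_skill ride
run_skill dapp-lint   # reuses the same correlation ID as /ride
```

Both lines print the same correlation value, which is what lets downstream tooling stitch a multi-skill chain into a single trace.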
What Already Exists (and What's Missing)
| Infrastructure | Status | Gap |
|---|---|---|
| Event bus (`event-bus.sh`) | Written, CloudEvents 1.0, flock-atomic, DLQ | No runtime auto-emission. Skills must call `emit_event` manually. |
| Event schemas in manifests | Declared in all 5 pack `construct.yaml` files | 9 of 11 event types have zero consumers. `data_schema` is freeform inline, not `$ref` to JSON Schema. |
| Event registry (`event-registry.sh`) | Written, topology validation | Scans `manifest.json` but installed packs only have `construct.yaml` — finds nothing. |
| BUTTERFREEZONE mesh | Written, cross-repo graph traversal | Edges carry `{role, interface, protocol}` — no event topology. Can't answer "which events connect these repos." |
| Runtime contract | Written, exit codes + checkpoints + context signals | Zero mention of events, knowledge routing, or correlation propagation. |
| Migration guide (#109) | Proven for Observer + Artisan in midi-interface | Not applied to Crucible, Beacon, GTM, or Protocol. |
| Compound learning | Specified (3 levels: session → cycle → upstream) | Cross-session patterns registry (`patterns.json`) is `.gitkeep` only. |
| MoE routing (#119) | Designed in STRATEGIC-GAP Cycle C | `domain`/`expertise` fields not in manifest schema yet. |
Proposal: Three Phases
Phase 0: Make the Pipes Flow (connect existing infrastructure)
The cheapest wins. No new schemas. No new abstractions. Just wire what we have.
- **Fix `event-registry.sh` to scan `construct.yaml` (not just `manifest.json`)**
  - 1-line change: add a glob for `construct.yaml` alongside `manifest.json`
  - Unblocks: `validate_event_topology` actually works against installed constructs
- **Add a runtime event emission hook to the runtime contract**
  - After the skill checkpoint write, if the skill's pack declares `events.emits`, auto-call `emit_event` with the checkpoint summary as the data payload
  - Correlate via `execution_id` from the checkpoint schema
  - Add to `docs/integration/runtime-contract.md` section 7: "Event Emission"
- **Register Protocol as event consumer/emitter**
  - Protocol emits: `protocol.verification_complete`, `protocol.tx_decoded`, `protocol.qa_report` (already in `construct.yaml`)
  - Protocol should consume: `observer.canvas_created` (for UTC context during `dapp-lint`) and `artisan.taste_inscribed` (for material constraints during `dapp-e2e`)
  - Establishes the first 3-way event topology: Observer → Protocol → Artisan
- **Add event edges to the BUTTERFREEZONE mesh**
  - When `butterfreezone-mesh.sh` traverses the graph, read `events.emits`/`events.consumes` from each repo's `construct.yaml`
  - Add an `events` field to mesh edges: `{ from, to, role, interface, protocol, events: [...] }`
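The first item above is small enough to sketch. A hedged illustration of the glob change, where `scan_manifests` is a hypothetical helper rather than the real `event-registry.sh` internals:

```shell
#!/usr/bin/env bash
# Illustrative version of the Phase 0 registry fix: discover event
# declarations in construct.yaml as well as manifest.json.
# scan_manifests is a hypothetical helper, not the real script's API.
set -euo pipefail

scan_manifests() {
  local root="$1"
  # Before the fix: find "$root" -name manifest.json
  # After: one extra glob so installed packs (construct.yaml only) are found.
  find "$root" -type f \( -name manifest.json -o -name construct.yaml \)
}

# Demo fixture: an installed pack that ships only construct.yaml.
root="$(mktemp -d)"
mkdir -p "$root/packs/protocol"
touch "$root/packs/protocol/construct.yaml"
scan_manifests "$root"
```

Running this prints the `construct.yaml` path that the current `manifest.json`-only scan would miss.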
Phase 1: Knowledge Stanza (rich data alongside signals)
Signals are lightweight ("something happened"). Knowledge is structured ("here's what I found, here's the schema, here's how to use it").
Add knowledge to construct.yaml schema v4:
```yaml
knowledge:
  provides:
    - type: ground_truth
      schema: schemas/ground-truth.schema.json
      description: "Codebase reality extracted by /ride"
      format: markdown
      retention: permanent
    - type: ecosystem_pattern
      schema: schemas/ecosystem-pattern.schema.json
      description: "Cross-repo pattern detected during verification"
      format: json
      retention: cycle
  consumes:
    - type: ground_truth
      from: observer
      required: false
      description: "Ecosystem context for cross-repo verification"
    - type: taste_tokens
      from: artisan
      required: false
      description: "Material constraints for UX verification"
```
Key design decisions:
- `knowledge` is separate from `events`: Events are fire-and-forget signals. Knowledge is queryable structured data with retention policies.
- Schema references are required (not freeform inline): Prevents the drift we see in current `data_schema` fields.
- `retention` controls lifecycle: `session` (ephemeral), `cycle` (sprint-scoped), `permanent` (ground truth).
- `required: false` default: Knowledge consumption is advisory, not blocking. A construct runs without it but runs better with it.
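These retention levels imply a sweep at lifecycle boundaries. A hedged sketch, assuming artifacts are grouped under `{store}/{retention}/` — the actual storage location is an open question later in this RFC:

```shell
#!/usr/bin/env bash
# Hedged sketch of a retention sweep: session artifacts die with the
# session, cycle artifacts die at cycle end, permanent artifacts stay.
# The {store}/{retention}/ layout is an assumption, not a decided design.
set -euo pipefail

sweep_retention() {
  local store="$1" boundary="$2"   # boundary: session | cycle
  rm -rf "${store}/session"
  if [ "$boundary" = "cycle" ]; then
    rm -rf "${store}/cycle"
  fi
  # ${store}/permanent is never touched.
}

# Demo: at a cycle boundary, only permanent knowledge survives.
store="$(mktemp -d)"
mkdir -p "$store"/{session,cycle,permanent}
sweep_retention "$store" cycle
ls "$store"   # → permanent
```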
Phase 2: Construct Stdout (UNIX piping)
The user's first story: "string together constructs like UNIX."
Skill output convention: Every skill can optionally write a structured output to a well-known path:

```
.loa-output/{skill-slug}/output.json
```
This is the skill's "stdout." It conforms to the skill's declared `knowledge.provides` schema. The runtime reads this after skill completion and:
- Emits it as the event's `data` payload (connecting to the event bus)
- Makes it available to the next skill in a pipeline via `$LOA_PIPE_INPUT`
- Stores it in the knowledge store per the `retention` policy
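The pipeline hand-off in the second bullet can be sketched as a post-completion hook; `pipe_output_to_next` is a hypothetical name built on the `$LOA_PIPE_INPUT` convention:

```shell
#!/usr/bin/env bash
# Minimal sketch of the construct-stdout convention: after a skill
# completes, the runtime points $LOA_PIPE_INPUT at its output.json.
# pipe_output_to_next is a hypothetical hook name, not a shipped API.
set -euo pipefail
cd "$(mktemp -d)"

pipe_output_to_next() {
  local slug="$1"
  local out=".loa-output/${slug}/output.json"
  if [ -f "$out" ]; then
    export LOA_PIPE_INPUT="$out"
  fi
}

# Demo: /ride writes structured output; the next skill reads it as input.
mkdir -p .loa-output/ride
printf '{"ground_truth":["set-and-forgetti"]}\n' > .loa-output/ride/output.json
pipe_output_to_next ride
echo "$LOA_PIPE_INPUT"   # → .loa-output/ride/output.json
```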
Pipeline syntax (golden path integration):

```
# Single skill
/ride set-and-forgetti

# Piped skills (sequential, output → input)
/ride set-and-forgetti | /protocol dapp-lint

# Fan-out (parallel, shared input)
/ride set-and-forgetti |& /protocol dapp-lint /artisan taste-check
```
This maps directly to the golden path state machine (`golden-path.sh`) — each `|` creates a dependency edge. The runtime reads `construct.yaml` to validate that the output schema of skill A matches the input schema of skill B.
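That schema validation can be sketched as a type check over declared knowledge types. The grep-based matching below is a toy stand-in for real YAML parsing; the file names and `validate_pipe` helper are illustrative:

```shell
#!/usr/bin/env bash
# Toy pipeline type check: skill A's knowledge.provides types must cover
# skill B's knowledge.consumes types. Real validation would parse the
# construct.yaml manifests; plain text files stand in for them here.
set -euo pipefail
cd "$(mktemp -d)"

cat > observer.types <<'EOF'
ground_truth
utc_canvas
EOF
cat > protocol.needs <<'EOF'
ground_truth
EOF

validate_pipe() {
  local provides="$1" consumes="$2" t
  while read -r t; do
    grep -qx "$t" "$provides" || { echo "missing: $t"; return 1; }
  done < "$consumes"
  echo "pipe ok"
}

validate_pipe observer.types protocol.needs   # → pipe ok
```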
Motivating Case: Protocol Ecosystem Verification
What this looks like for the wallet boundary review that prompted this RFC:
```
# Phase 0 (today, with wiring fixes):
/ride set-and-forgetti       # Observer emits ground_truth event
/ride cubquests-interface    # Observer emits ground_truth event
/ride apdao-auction-house    # Observer emits ground_truth event
/ride mibera-interface       # Observer emits ground_truth event
/protocol dapp-lint          # Protocol consumes ground_truth events, runs scans

# Phase 1 (with knowledge stanza):
# Protocol's dapp-lint automatically receives Observer ground truth
# Protocol knows useOnSuccess is a shared utility because Observer's ground truth says so
# Protocol emits ecosystem_pattern knowledge ("approve-then-act race in mibera")

# Phase 2 (with piping):
/ride --ecosystem manifest.yaml | /protocol dapp-lint --manifest | /artisan taste-check
# One command. Full pipeline. Each construct enriches the next.
```
Relationship to Existing Issues
| Issue | Relationship |
|---|---|
| #109 (Event Bus Migration) | Phase 0 depends on the proven migration pattern from midi-interface |
| #104 (Unified Feedback Schema) | Knowledge stanza generalizes the feedback schema to all domain data |
| #119 (MoE Routing) | Phase 2 piping depends on `domain`/`expertise` fields for type-safe routing |
| #128 (Verification Gradient) | L3 composability tier = Phase 2 piping capability |
| #129 (Workflow Gates) | Pipeline validation uses declared workflow gates for type checking |
| #131 (Construct Lifecycle) | Bidirectional sync carries knowledge artifacts alongside code |
| #118 (Reference Ingestion) | `knowledge.consumes` is the declarative version of methodology ingestion |
| construct-protocol#2 (Ecosystem Manifest) | The `manifest.yaml` that Protocol scans is a knowledge artifact from Observer |
Open Questions
- Knowledge storage location: `grimoires/loa/a2a/knowledge/{type}/` alongside events? Or per-construct at `grimoires/{pack}/knowledge/`?
- Cross-repo knowledge: The BUTTERFREEZONE mesh discovers capabilities. Should it also cache/proxy knowledge artifacts from remote repos, or only advertise them?
- Retention GC: Who compacts `cycle`-scoped knowledge after the cycle ends? The `compact_events` pattern from the event bus could be extended.
- Schema evolution: Knowledge schemas need versioning. Use the same `compatibility: backward | forward | full | none` pattern from `pack-manifest.schema.json`?
Non-Goals
- No new daemon or service: Knowledge routing happens at skill invocation time, not as a background process.
- No breaking changes to existing manifests: `knowledge` is additive to `construct.yaml`. Existing packs without it continue to work.
- No changes to loa-dixie: Dixie is a product layer that will eventually consume knowledge from the construct network. This RFC is about the infrastructure that produces what Dixie will consume.
Originated from Protocol v2.0 PR #1 review — the first real multi-construct composition in the ecosystem.