Idora Context Graph

Technical reference · April 2026


Foundation

What is a context graph?

A context graph is a labeled property graph where nodes represent entities and edges represent typed relationships between them. Unlike relational databases, a graph stores relationships as first-class citizens: not derived from foreign key joins but existing as independent, traversable structures in the data model.

Neo4j (the engine Idora uses via AuraDB, its fully managed cloud service) implements this model with labeled, property-bearing nodes; typed, directed, property-bearing relationships; and the Cypher query language for pattern-matching traversals.

The word “context” is deliberate. This is not a knowledge graph. It is a context graph: it captures the evidence context around software artifacts. What was checked, what was built, what shipped, and the tamper-evident receipts that prove each claim.

The context graph has two writers, each owning a distinct layer. WIT writes the structural layer synchronously: RequirementSource and Seam nodes are available immediately after ingestion so verification can begin without delay. G4 writes the evidence layer asynchronously: Receipt, File, and Artifact nodes flow through the gate pipeline after each event. This separation means verification findings are returned to the user immediately while tamper-evident receipts are written to the graph in the background. Neither layer can serve the other’s purpose: evidence data written without the pipeline would lack integrity guarantees; structural data routed through the pipeline would introduce latency that blocks the user.
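The structural write WIT performs at ingestion can be sketched in Cypher. This is an illustrative sketch, not Idora’s actual write path: the labels and the CONTAINS edge come from the data model described below, while the parameter names (`$sourceId`, `$version`, `$seamId`, and so on) are assumptions.

```cypher
// Hypothetical sketch of WIT's synchronous structural write at ingestion.
// Labels and edge types follow the data model; parameter names are assumed.
MERGE (rs:RequirementSource {externalId: $sourceId, version: $version})
MERGE (s:Seam {id: $seamId})   // $seamId = SHA-256(topic | content | source_ref)
  ON CREATE SET s.topic = $topic, s.content = $content
MERGE (rs)-[:CONTAINS]->(s)
```

Because each MERGE is idempotent, re-running ingestion for the same seam reuses the existing node instead of duplicating it.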

The proof that a requirement was verified and satisfied in production is a path through the graph. In a relational database, that requires a multi-table JOIN across specs, receipts, files, and artifacts. In a graph, it is a single Cypher pattern match:

```cypher
MATCH path = (rs:RequirementSource)-[:CONTAINS]->(s:Seam)
  <-[:VERIFIED_BY]-(v:Receipt {type: 'verify-code-generation', determination: 'conforms'})
  -[:EVALUATED]->(f:File)
  <-[:CONSUMED]-(b:Receipt {type: 'build'})
  -[:PRODUCED]->(a:Artifact)
WHERE rs.externalId = 'PROJ-456'
RETURN path
```

The Problem

Why integrity is a graph problem

Software integrity has three layers of questions. The first two are solved. The third is not.

| Layer | Question | Current tooling | Gap |
| --- | --- | --- | --- |
| CI | Does the code compile and pass tests? | GitHub Actions, Jenkins | Answers “did it pass?” but not “was it supposed to pass for this reason?” Tests verify behavior, not intent. |
| CD | Can passing code reach production? | ArgoCD, Vercel | Answers deployment, not traceability. |
| Continuous Integrity | Does the code satisfy its requirements, and can you prove it from spec to production? | No system joins requirement → verification → execution → artifact → deployment as one continuous model. | Attestations and provenance tools exist for individual pipeline steps. None maintain the full joined chain. |

The third layer is a graph problem because the proof is a path traversal:

[Diagram: §auth → AuthService.ts → build → api-server → test → deploy]

Each arrow is an edge. Purple for seams and requirement sources, amber for code, blue for receipts, green for artifacts. The proof that seam §auth is satisfied in production is a graph traversal, not a table join or a log search. Note: the diagram above simplifies the path for visual clarity. In the actual graph a verification receipt node sits between the seam and the file, connected by VERIFIED_BY and EVALUATED edges. The Cypher query above reflects the full path.

Graph Data Model

Nodes, edges, and properties

Node types

| Type | What it represents | Identity model | Example |
| --- | --- | --- | --- |
| RequirementSource | A ticket, spec file, or document containing requirements. Versioned on re-ingestion; prior versions and all their receipts are preserved permanently. | Type + external ID + version | PROJ-456, AUTH_SPEC.md |
| Seam | A single atomic requirement extracted from a RequirementSource. The unit of verification. | SHA-256(topic \| content \| source_ref) | §auth, §payments |
| File | Source code or test file. Path-keyed: persists across commits as the same logical entity accumulating evidence. | Path + repo | AuthService.ts |
| Receipt | Tamper-evident record of a verification, execution, or ingestion event. Append-only, never overwritten. | Content-addressed (SHA-256) | verify-code · conforms |
| Artifact | Build output (binary, container, bundle). | Content-addressed (SHA-256) | api-server |
| Repo | Container boundary for all nodes. | Repo path | idora/core |

Seam nodes are content-addressed: two seams with identical topic, content, and source reference are the same seam, regardless of when they were ingested. File nodes are path-keyed: they persist across commits as the same logical entity accumulating evidence. Artifact and Receipt nodes are content-addressed. Files are mutable entities. Artifacts and Receipts are immutable records. RequirementSource nodes are versioned: re-ingestion creates a new version and preserves the old one with all its seams and receipts intact.
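Content addressing makes seam writes naturally idempotent. A minimal sketch, assuming the APOC library is available for hashing (`apoc.util.sha256`) and that the topic, content, and source reference arrive as parameters:

```cypher
// Hypothetical sketch: upsert a content-addressed Seam.
// Assumes APOC is installed; parameter names are assumptions.
WITH apoc.util.sha256([$topic, $content, $sourceRef]) AS seamId
MERGE (s:Seam {id: seamId})   // identical topic/content/source_ref → same node
  ON CREATE SET s.topic = $topic, s.content = $content, s.sourceRef = $sourceRef
RETURN s.id
```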

Violations are not a separate node type. A violation is a Receipt with determination: does-not-conform. Visually distinguished by rose color but ontologically identical to any other receipt.
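Because a violation is just a receipt, listing every one is an ordinary pattern match over the types defined below, with no special node type required. A sketch:

```cypher
// Every violation in the graph: receipts whose determination failed.
MATCH (v:Receipt {determination: 'does-not-conform'})-[:VERIFIED_BY]->(s:Seam)
RETURN s, v
```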

Edge types

| Edge | From → To | What it means |
| --- | --- | --- |
| CONTAINS | RequirementSource → Seam | “This source contains this atomic requirement” · written by WIT at ingestion |
| SUPERSEDED_BY | RequirementSource → RequirementSource | “This version was replaced by this version” · old version and all receipts preserved permanently |
| MAPS_TO | Seam → File | “This seam governs this file” · accumulates verification_count on every receipt |
| VERIFIED_BY | Receipt → Seam | “This receipt verified this seam” · the audit trail. Traverse from Seam via incoming edges to get full verification history. |
| DEPENDS_ON | Seam → Seam | “This seam cannot be meaningfully verified without this one” |
| CONTRADICTS | Seam → Seam | “These seams conflict” · blocks verification until a human resolves the contradiction |
| EVALUATED | Receipt → File | “This receipt evaluated this file” · verification stream |
| CONSUMED | Receipt → File or Artifact | “This execution used this as input” · execution stream |
| PRODUCED | Receipt → Artifact | “This execution created this output” |
| CONFIRMED_BY | Seam → Receipt | “This seam was captured by this ingestion receipt” |
| PARENT_OF | Receipt → Receipt | “This execution’s outputs became that execution’s inputs” · build → test → deploy |
| IN_REPO | Any → Repo | Container boundary |

Edge thickness on MAPS_TO is not decorative. It encodes how many times that seam has been verified against that file. The thick “verified 12×” edge between seam §auth and AuthService.ts means that seam has been checked against that file twelve times across eight commits.

Edge properties: where compounding lives

| Edge | Key properties | What they encode |
| --- | --- | --- |
| MAPS_TO | verification_count, last_verified, last_determination | How many times this seam was verified against this file, when, and the most recent result. This is the compounding counter. |
| CONSUMED / PRODUCED | commit_sha, timestamp | Which commit and when |
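Maintaining the compounding counter fits in a single MERGE. A sketch of the write, assuming the seam and file identifiers arrive as parameters:

```cypher
// Hypothetical sketch: record one verification on the MAPS_TO summary edge.
MATCH (s:Seam {id: $seamId}), (f:File {path: $path, repo: $repo})
MERGE (s)-[m:MAPS_TO]->(f)
  ON CREATE SET m.verification_count = 1
  ON MATCH  SET m.verification_count = m.verification_count + 1
SET m.last_verified = $timestamp,
    m.last_determination = $determination
```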

MAPS_TO vs. VERIFIED_BY

MAPS_TO (Seam → File) is the structural relationship: “seam §auth governs AuthService.ts.” It exists independently of any verification. Its verification_count property increments on every receipt. This edge persists and accumulates. It is the compounding mechanism.

VERIFIED_BY connects Receipts back to the Seams they verified. One MAPS_TO edge aggregates many VERIFIED_BY relationships over time. MAPS_TO is the summary. VERIFIED_BY is the audit trail. Both are needed. Neither can substitute for the other.
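The split shows up directly in the queries. A sketch, assuming seams carry a topic property as in the identity model above:

```cypher
// Summary (MAPS_TO): how much proof has accumulated, per governed file.
MATCH (s:Seam {topic: 'auth'})-[m:MAPS_TO]->(f:File)
RETURN f.path, m.verification_count, m.last_determination;

// Audit trail (VERIFIED_BY): every individual receipt behind that summary.
MATCH (s:Seam {topic: 'auth'})<-[:VERIFIED_BY]-(r:Receipt)
RETURN r
```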

Architecture

The file bridge

The architectural innovation that connects two evidence streams no existing tool links.

[Diagram: two evidence streams meeting at the File node. Verification stream: Receipt −VERIFIED_BY→ Seam, Receipt −EVALUATED→ File. Execution stream: build Receipt −CONSUMED→ File and −PRODUCED→ Artifact. The File node is the bridge.]

The File node is the shared key between two evidence streams:

Verification: “AuthService.ts was evaluated against seam §auth”
Execution: “AuthService.ts was consumed by build”

One hop connects verification to execution.

This answers a question no CI/CD tool or code scanner answers today: was the code that was deployed also the code that was verified? The file path is the join key. No vendor captures both streams because each vendor sees only their own execution boundary. Idora connects them. And it works without any changes to the team’s CI pipeline. The File node already exists in both streams because the same file path appears in verification and in execution. Idora doesn’t create this relationship. It reveals it.
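As a Cypher sketch, the one-hop join through the File node looks like this (labels and edge types are from the data model above):

```cypher
// Was the code that was deployed also the code that was verified?
MATCH (v:Receipt {determination: 'conforms'})-[:EVALUATED]->(f:File)
      <-[:CONSUMED]-(b:Receipt {type: 'build'})-[:PRODUCED]->(a:Artifact)
RETURN f.path AS file, a AS artifact, v AS verification
```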

Demo Queries

Three queries

Each query in the /graph demo highlights a subgraph and returns a one-line answer. Connected nodes brighten. Everything else dims. Together they form a complete arc: the system sees what CI misses, traces everything, and catches and tracks problems.

| Query | What it asks | What it proves |
| --- | --- | --- |
| what did CI miss | “What passed CI but failed integrity?” | CI said 142 tests passed. Idora found a spec gap CI can’t see: key rotation test missing. Tests verify behavior. Verification checks intent. |
| trace deploy | “What’s in this deployment?” | Full traceability chain: every file consumed, every artifact produced, every execution linked. From production back to requirements in one traversal. |
| violations | “Show all integrity violations” | 6 total, 4 resolved, 2 unresolved. The graph preserves history. Nothing is lost. |

What did CI miss

This query demonstrates the difference between testing and verification. Tests verify behavior: does this function return the right output? Verification checks intent: does this code implement what the spec requires? A test suite can pass with a perfect score while missing entire categories of required behavior, because tests only check what someone wrote a test for. Verification against §auth catches the gap because the spec says key rotation is required and the verification agent can see no test covers it.

CI told you everything was fine. It was wrong.
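One way to express the gap as a traversal — a sketch, not necessarily the demo’s actual query:

```cypher
// Seams no receipt has ever verified as conforming: the gaps CI cannot see.
MATCH (rs:RequirementSource)-[:CONTAINS]->(s:Seam)
WHERE NOT (s)<-[:VERIFIED_BY]-(:Receipt {determination: 'conforms'})
RETURN rs.externalId AS source, s AS unverifiedSeam
```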

Trace deploy

This query exercises the file bridge end to end. It starts at the deploy receipt and walks backwards: deploy consumed api-server, api-server was produced by build, build consumed 8 files, those files have verification receipts linking to spec sections. The entire chain from production back to requirements, in one traversal. No other tool produces this path because no other tool links verification evidence to execution evidence through shared file nodes.
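A sketch of the backward walk, using only the edge types defined above (the deploy receipt’s type value is an assumption):

```cypher
// From a deploy receipt back to the requirements it rests on.
MATCH (d:Receipt {type: 'deploy'})-[:CONSUMED]->(a:Artifact)
      <-[:PRODUCED]-(b:Receipt {type: 'build'})-[:CONSUMED]->(f:File)
OPTIONAL MATCH (v:Receipt)-[:EVALUATED]->(f)
OPTIONAL MATCH (v)-[:VERIFIED_BY]->(s:Seam)<-[:CONTAINS]-(rs:RequirementSource)
RETURN f.path, v.determination, s.topic, rs.externalId
```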

Violations

Six violations across the graph’s history. Four resolved, each labeled with the deploy where it was fixed. Two still unresolved. This demonstrates three things: the system catches real problems, the graph preserves history across deploys, and the graph distinguishes active risk from resolved risk.
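The resolved/unresolved split can come out of the same receipt pattern. A sketch — the resolved property is an assumption for illustration, since the data model above does not specify how resolution is recorded:

```cypher
// Hypothetical: split violations into resolved and still-active.
// The `resolved` property is assumed, not part of the documented model.
MATCH (v:Receipt {determination: 'does-not-conform'})
RETURN coalesce(v.resolved, false) AS resolved, count(v) AS violations
```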

Compounding

How the graph compounds

The graph does not grow by adding new node types. It grows by accumulating more proof on the same structure. Every push adds receipts that connect to existing Seam, File, and Artifact nodes. The topology stays the same. The evidence density increases.

The compounding lives in the verification_count property on MAPS_TO edges. Every verification write increments this counter. As the count rises, the marginal cost of each subsequent verification of an established connection approaches zero. The same structure that gets denser also gets less expensive to operate.
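That density is directly queryable. A sketch that ranks seam–file connections by accumulated proof, using the MAPS_TO properties listed earlier:

```cypher
// The densest evidence in the graph: most-verified seam → file connections.
MATCH (s:Seam)-[m:MAPS_TO]->(f:File)
RETURN s.topic, f.path, m.verification_count
ORDER BY m.verification_count DESC
LIMIT 10
```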

The visual encoding in the /graph demo reflects this directly. MAPS_TO edge thickness grows with verification_count. The density of receipt satellites around a File node is the integrity signal, not a chart derived from the data.

For the full defensibility argument, see The Compounding Moat.