Aqium Documentation

Guides, architecture references, and administration docs for Aqium — the AI-powered platform that generates production-ready business software from structured specifications.

Welcome to Aqium Documentation

Aqium turns structured conversations about your business into running, production-ready software. It does this by building a living specification graph of your requirements, then driving a deterministic code-generation pipeline that stays in sync with that graph as your business evolves.


What's New

Config Kits — ERPNext/Frappe-informed output path

Config Kits introduce a config-driven generation path alongside the existing code-generation path. Informed by Frappe/ERPNext's DocType model, Config Kits let you describe business objects and workflows declaratively; Aqium generates the correct config artifacts (JSON, YAML, DocType definitions) rather than imperative code when that is the right output for your target platform.

Read the Config Kits docs →
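To make the declarative path concrete, here is a sketch of what a Config Kit business-object definition might look like. The field keys (`fieldname`, `fieldtype`, `options`, `reqd`) echo Frappe's DocType JSON; the `workflow` block and the overall shape are illustrative assumptions, not Aqium's documented schema.

```json
{
  "doctype": "Customer Invoice",
  "module": "Billing",
  "fields": [
    { "fieldname": "customer", "fieldtype": "Link", "options": "Customer", "reqd": 1 },
    { "fieldname": "due_date", "fieldtype": "Date" },
    { "fieldname": "total", "fieldtype": "Currency", "read_only": 1 }
  ],
  "workflow": {
    "states": ["Draft", "Submitted", "Paid"],
    "transitions": [
      { "from": "Draft", "to": "Submitted", "action": "Submit" },
      { "from": "Submitted", "to": "Paid", "action": "Record Payment" }
    ]
  }
}
```

A definition like this is what the generation pipeline would emit on the config path, in place of imperative code, when the target platform consumes declarative artifacts.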

Six-Layer Unified Graph

The project graph has been extended from four layers to six, adding explicit Config and Deployment layers alongside Spec, Code, Data, and API. Every node in the graph carries cross-layer edges so drift, blast-radius, and coverage calculations span the full stack.

Read the architecture docs →
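The cross-layer traversal described above can be sketched in a few lines. This is a minimal model, not the platform's implementation: the six layer names come from the text, but the node-id scheme and adjacency representation are assumptions for illustration.

```python
from collections import deque
from enum import Enum


class Layer(Enum):
    """The six layers of the unified graph, per the architecture docs."""
    SPEC = "spec"
    CODE = "code"
    DATA = "data"
    API = "api"
    CONFIG = "config"
    DEPLOYMENT = "deployment"


def blast_radius(edges: dict[str, list[str]], start: str) -> set[str]:
    """Every node reachable from `start` via cross-layer edges.

    `edges` maps a node id to the node ids it points at; because edges
    cross layers, a change to one spec node can surface impact in code,
    data, API, config, and deployment nodes alike.
    """
    seen, queue = {start}, deque([start])
    while queue:
        for nxt in edges.get(queue.popleft(), []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen - {start}
```

Drift and coverage calculations work the same way: they are graph traversals over the cross-layer edges rather than per-layer bookkeeping.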

Deterministic Convergence Model

Convergence is now evaluated against five binary conditions rather than a heuristic score. A generation batch is considered converged only when all five pass: schema match, interface contract match, test-suite pass, lint clean, and CEGIS structural-diff match. This makes convergence a fact, not an opinion.

Read the convergence docs →
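Because all five conditions are binary, convergence reduces to a conjunction with no weighting or thresholds. A minimal sketch (the field names are taken from the five conditions above; the class itself is illustrative, not the platform's API):

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class ConvergenceCheck:
    """The five binary convergence conditions for a generation batch."""
    schema_match: bool
    interface_contract_match: bool
    test_suite_pass: bool
    lint_clean: bool
    cegis_structural_diff_match: bool

    def converged(self) -> bool:
        # Converged only when every condition passes; a single failure
        # means the batch is not converged, full stop.
        return all((
            self.schema_match,
            self.interface_contract_match,
            self.test_suite_pass,
            self.lint_clean,
            self.cegis_structural_diff_match,
        ))
```

There is no partial credit: a batch that passes four of five conditions is simply not converged.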

Brownfield Pipeline with CPG Extraction

Existing codebases can be imported via the brownfield pipeline. Aqium runs Code Property Graph (CPG) extraction on your repository, maps the resulting nodes into the unified graph, and identifies spec gaps between what the code does and what a generated specification would require.

Read the brownfield docs →
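Conceptually, spec-gap identification is a comparison between two sets: behaviors the CPG extraction found in the code, and behaviors the specification graph accounts for. A hedged sketch, assuming behaviors can be keyed by a stable identifier (the real pipeline's matching is presumably richer):

```python
def spec_gaps(
    code_behaviors: set[str],
    spec_behaviors: set[str],
) -> tuple[set[str], set[str]]:
    """Compare CPG-extracted behaviors against the specification graph.

    Returns (undocumented, unimplemented):
      undocumented  - present in code but missing from the spec (spec gaps)
      unimplemented - specified but with no matching code found
    """
    return code_behaviors - spec_behaviors, spec_behaviors - code_behaviors
```

The "undocumented" set is what the brownfield pipeline surfaces for review: code the imported repository does that no specification yet requires.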

Per-Customer Railway Deployment

Each customer instance is deployed as an isolated Railway project via the automated deployment pipeline. Provisioning, environment injection, and lifecycle management are handled by the platform — no manual infrastructure work required per tenant.

Read the deployment docs →
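The environment-injection step can be pictured as building a per-tenant environment on top of shared platform settings. This is a hypothetical sketch: the variable names, the URL shape, and the merge strategy are all assumptions, not Aqium's or Railway's actual configuration.

```python
def tenant_environment(tenant_id: str, base_env: dict[str, str]) -> dict[str, str]:
    """Compose the environment injected into one tenant's isolated project.

    Hypothetical: tenant-scoped values are layered over shared settings,
    so each isolated Railway project sees only its own configuration.
    """
    return {
        **base_env,               # shared platform settings
        "TENANT_ID": tenant_id,   # scopes the instance to one customer (assumed name)
        "DATABASE_URL": f"postgres://db.internal/{tenant_id}",  # per-tenant DB (assumed naming)
    }
```

The point of the sketch is the isolation property: nothing tenant-specific lives in shared state, so provisioning a new customer is a pure function of the tenant id plus the platform baseline.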


Documentation Sections

Concepts: Discovery, obligations, specification structure, and the requirement lifecycle
Architecture: Unified graph, subsystem boundaries, data flow, and integration points
Code Generation: Generation pipeline, Config Kits, dependency ordering, pattern reuse, and CEGIS verification
Convergence: The five-condition convergence model and how the platform evaluates generation completeness
Flywheels: Pattern flywheel, standards flywheel, and how successful generations improve future output
Deployment: Railway per-customer deployment, environment configuration, and infrastructure operations
Guides: Step-by-step guides for greenfield projects, brownfield imports, and common workflows
Admin: Platform administration, tenant management, and operational runbooks
Reference: API reference, kit authoring reference, and schema reference
