Deployment

One command. No containers, no database servers, no cloud dependencies. RiskNodes runs where your data lives.

One Command

uvx risknodes

That is the complete deployment. No Docker. No database server. No reverse proxy. No orchestration service. No cloud account.

RiskNodes is a single Python application. It stores state in SQLite — a single file, no server process. It runs LLM inference locally via Ollama. The entire system — application, database, and AI reasoning — operates on one machine within your physical perimeter.

Most enterprise software demands weeks of infrastructure planning, procurement approvals, and specialist contractors before it produces any value. RiskNodes demands a terminal and a few minutes.

What runs where

A complete RiskNodes deployment consists of two components:

  • RiskNodes — a Python application built on Starlette/ASGI, installed and run via uvx. All business logic, API endpoints, questionnaire processing, and workflow management. Data is stored in a SQLite database file alongside the application.
  • Ollama — local LLM inference server for agentic review. Runs models between 14B and 24B parameters on current-generation hardware. Installed separately unless a remote LLM service is configured.

No other services are required. No message queues. No job schedulers. No separate query services. Background tasks run within the ASGI process itself.
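A minimal sketch of what in-process background work looks like, using only the standard library's asyncio (the function and submission names are hypothetical, not RiskNodes' actual internals):

```python
import asyncio

async def review_submission(submission_id: int) -> str:
    """Hypothetical background job standing in for an agentic review."""
    await asyncio.sleep(0)  # placeholder for real work (LLM call, scoring)
    return f"submission {submission_id} reviewed"

async def handle_request() -> str:
    # Schedule the job on the same event loop -- no broker, no worker pool.
    task = asyncio.create_task(review_submission(42))
    # A real handler could return immediately; awaiting here makes the
    # example observable end to end.
    return await task

print(asyncio.run(handle_request()))  # submission 42 reviewed
```

Because the task runs on the same event loop as the request handler, there is nothing extra to deploy, monitor, or restart: if the application process is up, background processing is up.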

Why this matters

The deployment model is not incidental. It is a direct consequence of the sovereignty principle that governs the entire platform.

Air-gap ready. The system has no outbound network dependencies. No telemetry, no licence servers, no cloud APIs. Install the Python package and an Ollama model on a machine with no internet connection, and it works. For defence, government, and financial-services clients who cannot permit data to leave a secured environment, this is not a convenience — it is a prerequisite.

Data stays in one place. SQLite produces a single database file. That file can be backed up with cp. It can be encrypted at rest using filesystem-level encryption. It can be moved between machines by copying it. There is no connection string, no credentials, no database server to harden. The data resides exactly where you put it.
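For a database that may be in use, SQLite's online backup API gives a consistent snapshot where a plain cp of a live file might not. A sketch using the standard library (file and table names are illustrative):

```python
import sqlite3
import tempfile
from pathlib import Path

def backup_db(live_path: str, backup_path: str) -> None:
    """Copy a SQLite database safely, even while it is being written to."""
    with sqlite3.connect(live_path) as src, sqlite3.connect(backup_path) as dst:
        src.backup(dst)  # consistent snapshot via SQLite's backup API

# Demo with a throwaway database.
workdir = Path(tempfile.mkdtemp())
live = workdir / "risknodes.db"
with sqlite3.connect(live) as con:
    con.execute("CREATE TABLE assessments (id INTEGER PRIMARY KEY, score REAL)")
    con.execute("INSERT INTO assessments (score) VALUES (0.7)")

backup_db(str(live), str(workdir / "risknodes.backup.db"))
with sqlite3.connect(workdir / "risknodes.backup.db") as con:
    rows = con.execute("SELECT COUNT(*) FROM assessments").fetchone()[0]
print(rows)  # 1
```

When the application is stopped, a plain file copy is equally valid — the database really is just one file.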

Minimal attack surface. Every additional service in a deployment is a surface to secure and patch — database servers, message brokers, reverse proxies, orchestration engines. RiskNodes eliminates them. What remains to defend is the application process and the filesystem it writes to.

Reproducible. Two analysts running the same questionnaire against the same code change on different machines will produce comparable results. The system is deterministic where it matters: same questions, same structured response format, same scoring. Reproducibility is what turns an AI opinion into an auditable assessment.
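To illustrate the determinism claim: a scoring step implemented as a pure function over structured answers gives the same result for the same input on any machine. The question keys and weights below are hypothetical, not RiskNodes' actual rubric:

```python
# Hypothetical rubric: question id -> weight. Not RiskNodes' real scoring.
WEIGHTS = {"data_exposure": 0.5, "auth_change": 0.3, "test_coverage": 0.2}

def score(answers: dict[str, bool]) -> float:
    """Pure function: same structured answers -> same score, every time."""
    return round(sum(w for q, w in WEIGHTS.items() if answers.get(q)), 2)

a = {"data_exposure": True, "auth_change": False, "test_coverage": True}
print(score(a))        # 0.7
print(score(dict(a)))  # identical input, identical output: 0.7
```

The LLM's free-text reasoning may vary between runs; what stays fixed is the structured response format it must produce and the deterministic scoring applied to it.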

Data sovereignty

Where your data lives is not a configuration option. It is a physical fact. The machine running RiskNodes is the machine holding your data. If that machine is in your London office, your data is in the United Kingdom. If it is in a Zurich server room, your data is in Switzerland.

There is no cloud tier, no SaaS backend, no “phone home.” Jurisdiction is determined by where you plug in the machine.

For European consultancies operating under GDPR, and for clients subject to financial-services regulation or government classification requirements, this eliminates an entire category of compliance work. There is no data-processing agreement to negotiate with a cloud vendor, because there is no cloud vendor.

Getting started

  1. Install Python 3.11 or later
  2. Install Ollama and pull a model (e.g. ollama pull qwen2.5:14b), or configure an external AI service
  3. Run uvx risknodes
  4. Open the local URL in a browser

From zero to a working deployment in minutes, not weeks.