Download hafen
hafen ships as a self-hosted bundle. The web UI, API, parser, AI gateway, and data-movement runner all run on your machine. Nothing phones home.
Two-command quick start
Works on macOS, Linux, and Windows with Docker Desktop or any Docker daemon.
# 1. Clone the repo
git clone https://github.com/davidnhg74/hafen.git
cd hafen

# 2. Boot the whole stack
docker compose up -d

# That's it. Open the UI:
open http://localhost:3000
First boot pulls the Postgres, API, and web images and runs the schema migrations: roughly two minutes on a warm cache, five on a first install.
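If you script the install, you may want to block until the API is actually answering before opening the UI. A minimal sketch, assuming the default health endpoint at http://localhost:8000/health; the helper name is our own, not part of the bundle:

```shell
#!/bin/sh
# wait_for URL [TRIES]: poll an endpoint once per second until it answers,
# then print "up". Prints "timed out" and fails if it never comes up.
wait_for() {
  url=$1
  tries=${2:-120}
  i=0
  while [ "$i" -lt "$tries" ]; do
    if curl -fsS "$url" >/dev/null 2>&1; then
      echo "up"
      return 0
    fi
    i=$((i + 1))
    sleep 1
  done
  echo "timed out"
  return 1
}

# Usage after `docker compose up -d`:
#   wait_for http://localhost:8000/health && open http://localhost:3000
```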
What runs where
Every box below lives inside your firewall. The only outbound connection hafen ever makes is AI conversion calls to the provider you choose, using the API key you provide.
┌──────────────── your infrastructure ────────────────┐
│ │
│ ┌─ hafen bundle (Docker compose) ──┐ │
│ │ │ │
│ │ Next.js UI ── localhost:3000 │ │
│ │ FastAPI ── localhost:8000 │ │
│ │ Postgres ── internal (metadata) │ │
│ │ │ │
│ │ Embedded: parser, AI gateway, │ │
│ │ data-movement runner │ │
│ └─────────┬─────────────────────────┘ │
│ │ │
│ ▼ │
│ ┌─ your Oracle (read-only) ─┐ │
│ │ ALL_TABLES, ALL_TAB_COLS │ │
│ │ SELECT for row streaming │ │
│ └───────────────────────────┘ │
│ │ │
│ ▼ │
│ ┌─ your Postgres target ─┐ │
│ │ CREATE TABLE (DDL gen)│ │
│ │ COPY (binary) │ │
│ └────────────────────────┘ │
│ │
└──────────────────────────────────────────┬──────────┘
│ only outbound:
▼ (optional) BYOK AI
┌─ Anthropic / OpenAI / local model ─┐
│ key stays in your local config │
│ we never see it │
└────────────────────────────────────┘

What hafen can see
- Only what runs on your host
- Your Oracle via the read-only user you create
- Your Postgres via the admin user you provide
What hafen cannot see
- Anything at all from our side, because nothing phones home
- Your DDL, your data, your connection strings
- Your AI provider keys (they stay in local config)
Install methods
Docker Compose
Available now. Dev, demo, and most production use.
git clone ... && docker compose up -d
Single Docker image
Shipping with v0.2. Staging, CI pipelines, Kubernetes.
docker run -p 3000:3000 -p 8000:8000 hafen/hafen:latest
OVA / tarball
Enterprise tier. Air-gapped enterprise installs; all dependencies vendored.
# unpack on the target, run installer.sh
Requirements
Host
- Docker 24+
- 4 GB RAM
- 5 GB disk
- Linux/macOS/Windows
Source
- Oracle 11g+
- Read-only user
- SELECT on ALL_*
- Network reach
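Creating that read-only source user comes down to a few grants. A hedged sketch that writes a grant script to a file for a DBA to review and run with sqlplus; the `hafen_ro` name, the password placeholder, and the exact grant list are illustrative assumptions, not an official checklist:

```shell
#!/bin/sh
# Emit a grant script for a read-only hafen source user.
# CREATE SESSION lets it connect; SELECT on the schemas you migrate makes
# those objects visible in ALL_TABLES / ALL_TAB_COLS and enables row streaming.
cat > create_hafen_ro.sql <<'SQL'
CREATE USER hafen_ro IDENTIFIED BY "change_me";
GRANT CREATE SESSION TO hafen_ro;
-- Grant read access per object you plan to migrate, e.g.:
-- GRANT SELECT ON hr.employees TO hafen_ro;
SQL
echo "wrote create_hafen_ro.sql"
```

Review the generated file, then run it as a DBA: `sqlplus / as sysdba @create_hafen_ro.sql`.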
Target
- Postgres 13+
- Superuser / CREATE
- Network reach
- RDS/Aurora/on-prem
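The same pattern works on the target side. A sketch that emits a setup script; the `hafen_admin` role and `hafen_target` database names are illustrative, and on managed Postgres (RDS/Aurora), where no true superuser exists, database ownership is what stands in for it:

```shell
#!/bin/sh
# Emit a setup script for the hafen target role.
# hafen runs CREATE TABLE for generated DDL and COPY for row loading,
# so owning the target database is sufficient; superuser is not required.
cat > create_hafen_target.sql <<'SQL'
CREATE ROLE hafen_admin LOGIN PASSWORD 'change_me';
CREATE DATABASE hafen_target OWNER hafen_admin;
-- Ownership covers CREATE TABLE and COPY FROM STDIN on the tables it creates.
SQL
echo "wrote create_hafen_target.sql"
```

Run it with an admin connection, e.g. `psql -f create_hafen_target.sql` against your cluster.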
Verify your install
After docker compose up finishes:
1. Health check: curl http://localhost:8000/health should return {"status":"ok"}.
2. Open the UI: http://localhost:3000 should render the landing page.
3. Run an assessment: go to /assess, click "Try the HR sample," and confirm you see a risk list.
4. (Optional) Add your AI key: drop your Anthropic API key into .env; AI conversion previews then expand to live conversion of your actual snippets.
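Step 4's key lives only in the bundle's local env file on your host. A sketch of what those entries might look like; the variable names here are assumptions, so check the env template that ships with the bundle for the exact spelling:

```shell
# .env (local only; never committed, never sent anywhere)
# Variable names are illustrative; confirm against the bundle's own template.
ANTHROPIC_API_KEY=sk-ant-your-key-here

# Or point the AI gateway at another provider or a local model instead:
# OPENAI_API_KEY=sk-your-key-here
# AI_BASE_URL=http://localhost:11434/v1
```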
Next steps
- 📘 Getting started — first-migration walkthrough against the HR sample schema.
- 🔐 Buy a Pro license — $25k–$75k per project, offline-verified, unlocks full AI conversion and runbook PDF.
- 💬 Open an issue — grammar gaps, type-mapping requests, feature asks.