
Plan 1 — Project Bootstrap Implementation Plan

For agentic workers: REQUIRED SUB-SKILL: Use superpowers:subagent-driven-development (recommended) or superpowers:executing-plans to implement this plan task-by-task. Steps use checkbox (- [ ]) syntax for tracking.

Goal: Replace the legacy aiogram-2.4 / SQLite tutorial code with a fresh uv-based Python 3.12 skeleton that boots in docker-compose, runs Alembic-managed migrations against PostgreSQL 16, exposes Prometheus metrics, and lands a /start-responding aiogram 3 bot guarded by the whitelist middleware. This is the foundation that Plans 2–8 build on.
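
The whitelist guard mentioned above can be pictured with a short sketch. This is illustrative only — the real middleware lands in src/finance_bot/adapters/telegram/middlewares/access.py later in the plan, and the class name here is an assumption; aiogram 3 does, however, supply event_from_user in the middleware data dict:

```python
class AccessMiddleware:
    """Sketch of a whitelist middleware (assumed shape, not the final code)."""

    def __init__(self, whitelist: frozenset[int]) -> None:
        self._whitelist = whitelist

    async def __call__(self, handler, event, data):
        # aiogram 3 puts the update's author into data["event_from_user"].
        user = data.get("event_from_user")
        if user is None or user.id not in self._whitelist:
            return None  # silently drop updates from unknown users
        return await handler(event, data)
```

An update from a non-whitelisted user is dropped silently, which is the cheapest possible response to scanning bots.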

Architecture: Modular monolith. One Python service (finance_bot) using src-layout, hexagonal-ready directory skeleton (domain/, ports/, application/, adapters/), bootstrap.py as composition root. PostgreSQL 16 with two schemas (app, ledger) created up-front; only the smallest app.* tables are populated in this plan (user, processed_update). UUIDv7 IDs generated client-side in Python (TigerBeetle-friendly) — no PG extensions required.

Tech Stack: Python 3.12 · uv · ruff · mypy · pytest + pytest-asyncio + hypothesis · aiogram 3.x · asyncpg · SQLAlchemy + Alembic · pydantic-settings · structlog · prometheus-client · uuid-utils · PostgreSQL 16 · Grafana · Prometheus · Docker · docker-compose · GitHub Actions.

ADRs delivered in this plan: ADR-0004 (Modular Monolith), ADR-0011 (PII masking in logs), ADR-0012 (Long-polling), ADR-0013 (Single VPS docker-compose). All four describe decisions whose implementations land in this plan; per project policy ADRs are co-authored with the code that realizes them.
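
ADR-0011's PII masking amounts to a structlog processor that rewrites sensitive keys before any renderer sees them. A minimal sketch (the key names are assumptions; the real redaction rules ship with the logging adapter and are covered by tests/unit/test_logging_redaction.py):

```python
# Keys whose values must never reach log output (assumed field names).
PII_KEYS = {"raw_text", "amount", "display_name"}


def mask_pii(logger: object, method_name: str, event_dict: dict) -> dict:
    """structlog processor: replace PII values with a fixed placeholder."""
    for key in PII_KEYS & event_dict.keys():
        event_dict[key] = "***"
    return event_dict
```

Registered in the structlog processor chain ahead of the renderer, this masks fields regardless of log level.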

Branch: feature/bootstrap (worktree at /Users/zipsybok/dev/telegram-finance-bot-bootstrap).


  • Working directory for every command in this plan is the worktree root: /Users/zipsybok/dev/telegram-finance-bot-bootstrap unless explicitly stated otherwise.
  • Commit style: Conventional Commits (feat:, chore:, docs:, test:, ci:, build:). English. Body wraps at 72 chars.
  • One topic per commit. Each task ends with exactly one commit.
  • Postgres version: 16 (alpine variant in compose) — design pins 16; PG 17’s native uuidv7() is intentionally NOT used because we generate IDs client-side for TB-compatibility (see ADR-0009 in a later plan).
  • Bot token for manual verification: create a throwaway test bot via @BotFather. Save the token to .env (gitignored). The plan does not include an automated end-to-end Telegram test in Plan 1 — that lands in Plan 6.
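
For intuition on the client-side UUIDv7 generation mentioned above: in the codebase this comes from uuid-utils, but the RFC 9562 layout can be sketched with the stdlib alone (illustration only — use the library in real code):

```python
import os
import time
import uuid


def uuid7() -> uuid.UUID:
    """RFC 9562 UUIDv7: 48-bit Unix-ms timestamp, then random bits."""
    ts_ms = time.time_ns() // 1_000_000
    raw = bytearray(ts_ms.to_bytes(6, "big") + os.urandom(10))
    raw[6] = (raw[6] & 0x0F) | 0x70  # version nibble = 7
    raw[8] = (raw[8] & 0x3F) | 0x80  # RFC 4122 variant bits
    return uuid.UUID(bytes=bytes(raw))
```

The leading timestamp is what makes these IDs time-ordered, which is the property TigerBeetle-style ledgers want from client-generated keys.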

File Structure

```text
telegram-finance-bot-bootstrap/
├── .env.example                  # template for env vars
├── .github/
│   └── workflows/
│       └── sanity.yml            # lint+typecheck+unit on push
├── .gitignore                    # extended for Python/uv/.env
├── Dockerfile                    # multi-stage uv build
├── README.md                     # rewritten in English
├── compose.yml                   # postgres + bot + prom + grafana
├── docs/
│   ├── adr/
│   │   ├── 0004-modular-monolith.md
│   │   ├── 0011-pii-masking-in-logs.md
│   │   ├── 0012-long-polling-over-webhook.md
│   │   └── 0013-single-vps-docker-compose.md
│   ├── design/
│   │   └── 2026-04-27-mvp1-architecture.md   # already exists
│   └── plans/
│       └── 2026-04-28-plan-1-bootstrap.md    # this file
├── infra/
│   ├── grafana/
│   │   └── provisioning/
│   │       └── datasources/
│   │           └── prometheus.yml
│   ├── postgres/
│   │   └── init/
│   │       └── 01-create-schemas.sql
│   └── prometheus/
│       └── prometheus.yml
├── migrations/
│   ├── env.py
│   ├── script.py.mako
│   └── versions/
│       └── 0001_initial_app_user_processed_update.py
├── pyproject.toml
├── uv.lock                       # generated
├── src/
│   └── finance_bot/
│       ├── __init__.py
│       ├── __main__.py
│       ├── adapters/
│       │   ├── __init__.py
│       │   ├── ledger/
│       │   │   └── __init__.py
│       │   ├── observability/
│       │   │   ├── __init__.py
│       │   │   ├── logging.py
│       │   │   └── metrics.py
│       │   ├── repositories/
│       │   │   ├── __init__.py
│       │   │   └── postgres/
│       │   │       └── __init__.py
│       │   └── telegram/
│       │       ├── __init__.py
│       │       ├── handlers/
│       │       │   ├── __init__.py
│       │       │   └── start.py
│       │       └── middlewares/
│       │           ├── __init__.py
│       │           └── access.py
│       ├── application/
│       │   └── __init__.py
│       ├── bootstrap.py
│       ├── config.py
│       ├── domain/
│       │   └── __init__.py
│       └── ports/
│           └── __init__.py
└── tests/
    ├── __init__.py
    ├── conftest.py
    └── unit/
        ├── __init__.py
        ├── test_config.py
        ├── test_logging_redaction.py
        └── test_sanity.py
```

Files DELETED by this plan (legacy tutorial code):

  • Dockerfile (replaced)
  • categories.py, db.py, expenses.py, exceptions.py, middlewares.py, server.py
  • createdb.sql
  • db/ directory
  • pip_requirements.txt
  • README.md (replaced)

docs/ (design, templates-proposed) is kept intact.


Task 1: Delete legacy tutorial code

Files:

  • Delete: Dockerfile, categories.py, db.py, expenses.py, exceptions.py, middlewares.py, server.py, createdb.sql, pip_requirements.txt

  • Delete: db/ directory

  • Delete: README.md (will be rewritten in Task 21)

  • Step 1.1: Verify the files exist and we’re on the right branch

```sh
cd /Users/zipsybok/dev/telegram-finance-bot-bootstrap
git status
git branch --show-current
ls -la
```

Expected: branch is feature/bootstrap; files listed above are present.

  • Step 1.2: Remove legacy files
```sh
git rm Dockerfile categories.py db.py expenses.py exceptions.py middlewares.py server.py createdb.sql pip_requirements.txt README.md
git rm -r db
```

Expected: git status shows 11 deletions, no other changes.

  • Step 1.3: Commit
```sh
git commit -m "chore: remove legacy aiogram-2.4 tutorial implementation

The original code (aiogram 2.4, SQLite, RUB-hardcoded, single-user
via ACCESS_ID env var) is being replaced wholesale by a Python 3.12 /
aiogram 3 / PostgreSQL stack as designed in
docs/design/2026-04-27-mvp1-architecture.md. Tutorial code is
preserved in git history (last commit on master before this branch:
a17682d) and used only as a UX reference (message format, alias-based
categories, /del<id> removal pattern).

Co-Authored-By: Claude Opus 4.7 (1M context) <[email protected]>"
```

Task 2: Initialize the uv project (Python 3.12)

Files:

  • Create: pyproject.toml

  • Create: .python-version

  • Generated: uv.lock, .venv/

  • Step 2.1: Verify uv is installed

```sh
uv --version
```

Expected: prints uv 0.x.y (any 0.4+ release). If not installed: curl -LsSf https://astral.sh/uv/install.sh | sh then re-source shell.

  • Step 2.2: Pin Python version
```sh
echo "3.12" > .python-version
```
  • Step 2.3: Write pyproject.toml

Create pyproject.toml with:

```toml
[project]
name = "finance-bot"
version = "0.1.0"
description = "Personal finance Telegram bot — Tier-2 MVP-1"
readme = "README.md"
requires-python = ">=3.12"
license = { text = "MIT" }
authors = [{ name = "zipsybok" }]
dependencies = []

[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"

[tool.hatch.build.targets.wheel]
packages = ["src/finance_bot"]

[tool.uv]
dev-dependencies = []
```

Note: readme = "README.md" will be invalid until Task 21 creates the file. We add it now so re-running uv sync after Task 21 needs no edit. To bypass the pre-Task-21 build hook check, do NOT install the project itself yet — uv sync --no-install-project is used in the next step.

  • Step 2.4: Sync (creates .venv and uv.lock)
```sh
uv sync --no-install-project
```

Expected: prints Resolved 0 packages, creates .venv/, creates uv.lock.

  • Step 2.5: Verify the venv works
```sh
uv run python -c "import sys; print(sys.version)"
```

Expected: 3.12.x (...).

  • Step 2.6: Update .gitignore

Replace existing .gitignore content with:

```text
# Python
__pycache__/
*.py[cod]
*$py.class
*.so
.Python
*.egg-info/
.eggs/

# uv / venv
.venv/
.python-version-override

# Environment
.env
.env.local

# Tests / coverage
.pytest_cache/
.mypy_cache/
.ruff_cache/
.coverage
htmlcov/

# IDE
.idea/
.vscode/
*.swp

# OS
.DS_Store
Thumbs.db

# Project local — legacy data dir; never recreate
db/
```

Note: .gitignore does not support trailing comments, so the note about db/ sits on its own comment line above the pattern.
  • Step 2.7: Commit
```sh
git add pyproject.toml .python-version uv.lock .gitignore
git commit -m "build: initialize uv project with Python 3.12 pin

- pyproject.toml with hatchling backend, src layout
- .python-version pins to 3.12
- uv.lock generated; deps will be added in subsequent tasks
- .gitignore covers Python, uv, env, mypy/ruff/pytest caches

Co-Authored-By: Claude Opus 4.7 (1M context) <[email protected]>"
```

Task 3: Add dev tooling (ruff, mypy, pytest, hypothesis)


Files:

  • Modify: pyproject.toml (add [tool.uv] dev-dependencies, [tool.ruff], [tool.mypy], [tool.pytest.ini_options])

  • Step 3.1: Add dev dependencies

```sh
uv add --dev ruff mypy pytest pytest-asyncio hypothesis
```

Expected: lockfile updated, deps installed under .venv.

  • Step 3.2: Configure ruff in pyproject.toml

Append to pyproject.toml:

```toml
[tool.ruff]
line-length = 100
target-version = "py312"
src = ["src", "tests"]

[tool.ruff.lint]
select = [
  "E",   # pycodestyle errors
  "W",   # pycodestyle warnings
  "F",   # pyflakes
  "I",   # isort
  "B",   # flake8-bugbear
  "UP",  # pyupgrade
  "C4",  # flake8-comprehensions
  "SIM", # flake8-simplify
  "RUF", # ruff-specific
]
ignore = [
  "E501", # line-too-long handled by formatter
]

[tool.ruff.format]
quote-style = "double"
indent-style = "space"

[tool.ruff.lint.per-file-ignores]
"tests/**/*.py" = ["B011"] # allow `assert False` in tests
```
  • Step 3.3: Configure mypy in pyproject.toml

Append to pyproject.toml:

```toml
[tool.mypy]
python_version = "3.12"
namespace_packages = true
explicit_package_bases = true
mypy_path = "src"
files = ["src", "tests"]

# Defaults — relaxed for adapters
ignore_missing_imports = true
warn_unused_ignores = true
warn_redundant_casts = true

# Strict for the core
[[tool.mypy.overrides]]
module = [
  "finance_bot.domain.*",
  "finance_bot.ports.*",
  "finance_bot.application.*",
]
strict = true
```
  • Step 3.4: Configure pytest in pyproject.toml

Append to pyproject.toml:

```toml
[tool.pytest.ini_options]
asyncio_mode = "auto"
testpaths = ["tests"]
addopts = [
  "-ra",
  "--strict-markers",
  "--strict-config",
  "-v",
]
markers = [
  "integration: tests that require external services (PG, etc.)",
  "e2e: end-to-end tests (slow)",
]
```
  • Step 3.5: Verify tooling
```sh
uv run ruff --version
uv run mypy --version
uv run pytest --version
```

Expected: all three print version strings without error.

  • Step 3.6: Commit
```sh
git add pyproject.toml uv.lock
git commit -m "build: add ruff, mypy, pytest, hypothesis as dev deps

- ruff: lint+format, line-length 100, py312 target
- mypy: strict for domain/ports/application; relaxed elsewhere
- pytest: asyncio_mode auto, integration+e2e markers
- hypothesis: available for property-based tests

Co-Authored-By: Claude Opus 4.7 (1M context) <[email protected]>"
```
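
hypothesis joins the dev dependencies here but is first exercised in later tasks. The kind of property it will check can be previewed with plain stdlib random — an assumed example, mirroring the CSV whitelist parsing that Task 7 implements:

```python
import random


def parse_whitelist(csv: str) -> frozenset[int]:
    """Reference CSV parser mirroring the Settings validator from Task 7."""
    parts = [p.strip() for p in csv.split(",") if p.strip()]
    if not parts:
        raise ValueError("whitelist must contain at least one id")
    return frozenset(int(p) for p in parts)


# Property: parsing is order-insensitive and duplicate-insensitive.
for _ in range(100):
    ids = [random.randint(1, 10**9) for _ in range(random.randint(1, 5))]
    shuffled = random.sample(ids, len(ids))
    assert parse_whitelist(",".join(map(str, ids))) == parse_whitelist(
        ",".join(map(str, shuffled + ids))
    )
```

With hypothesis the loop becomes a @given(st.lists(st.integers(min_value=1), min_size=1)) strategy, and shrinking finds minimal counterexamples automatically.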

Task 4: Add runtime dependencies

Files:

  • Modify: pyproject.toml (deps will be added by uv add)

  • Step 4.1: Add deps

```sh
uv add aiogram asyncpg "sqlalchemy[asyncio]" alembic pydantic-settings structlog prometheus-client uuid-utils
```

Expected: uv.lock updated, deps installed. Specific minor versions resolved by uv (we don’t pin minors here — pyproject leaves them open).

  • Step 4.2: Sanity-check imports
```sh
uv run python -c "import aiogram, asyncpg, sqlalchemy, alembic, pydantic_settings, structlog, prometheus_client, uuid_utils; print('all imports ok'); print(f'aiogram {aiogram.__version__}'); print(f'asyncpg {asyncpg.__version__}')"
```

Expected: all imports ok followed by version lines. aiogram should be 3.x.

  • Step 4.3: Commit
```sh
git add pyproject.toml uv.lock
git commit -m "$(cat <<'EOF'
build: add aiogram 3 + Postgres + observability runtime deps

Adds aiogram, asyncpg, sqlalchemy[asyncio], alembic,
pydantic-settings, structlog, prometheus-client, uuid-utils.
uuid-utils provides client-side UUIDv7 generation (TigerBeetle-
compatible time-ordered IDs) without a PG extension dependency.

Co-Authored-By: Claude Opus 4.7 (1M context) <[email protected]>
EOF
)"
```

Task 5: Create src-layout package skeleton


Files:

  • Create directories per File Structure section above.

  • Create empty __init__.py files in each.

  • Create stub src/finance_bot/{config.py,bootstrap.py,__main__.py}.

  • Step 5.1: Create directory tree

```sh
mkdir -p \
  src/finance_bot/domain \
  src/finance_bot/ports \
  src/finance_bot/application \
  src/finance_bot/adapters/ledger \
  src/finance_bot/adapters/repositories/postgres \
  src/finance_bot/adapters/telegram/handlers \
  src/finance_bot/adapters/telegram/middlewares \
  src/finance_bot/adapters/observability \
  tests/unit
```
  • Step 5.2: Create __init__.py for every package
```sh
touch \
  src/finance_bot/__init__.py \
  src/finance_bot/domain/__init__.py \
  src/finance_bot/ports/__init__.py \
  src/finance_bot/application/__init__.py \
  src/finance_bot/adapters/__init__.py \
  src/finance_bot/adapters/ledger/__init__.py \
  src/finance_bot/adapters/repositories/__init__.py \
  src/finance_bot/adapters/repositories/postgres/__init__.py \
  src/finance_bot/adapters/telegram/__init__.py \
  src/finance_bot/adapters/telegram/handlers/__init__.py \
  src/finance_bot/adapters/telegram/middlewares/__init__.py \
  src/finance_bot/adapters/observability/__init__.py \
  tests/__init__.py \
  tests/unit/__init__.py
```
  • Step 5.3: Stub config.py, bootstrap.py, __main__.py

Create src/finance_bot/config.py:

```python
"""Application configuration. Real implementation in Task 7."""
```

Create src/finance_bot/bootstrap.py:

```python
"""Composition root. Real implementation in Task 12."""
```

Create src/finance_bot/__main__.py:

```python
"""Entry point. Real implementation in Task 13."""
```
  • Step 5.4: Verify the package imports
```sh
uv run python -c "import finance_bot; print(finance_bot.__path__)"
```

Expected: prints a list with one path ending in src/finance_bot.

  • Step 5.5: Commit
```sh
git add src tests
git commit -m "feat: scaffold finance_bot package (hexagonal layout)

Empty __init__.py files for domain, ports, application, and adapters
(ledger, repositories/postgres, telegram/{handlers,middlewares},
observability). Stub config.py, bootstrap.py, __main__.py — real
implementations land in later tasks.

Co-Authored-By: Claude Opus 4.7 (1M context) <[email protected]>"
```

Task 6: Sanity test (verify pytest collection works)


Files:

  • Create: tests/unit/test_sanity.py

  • Create: tests/conftest.py

  • Step 6.1: Write tests/conftest.py

```python
"""Top-level pytest fixtures."""
```
  • Step 6.2: Write tests/unit/test_sanity.py
```python
"""Sanity test — verify pytest collects and runs tests."""


def test_truth() -> None:
    assert True


def test_finance_bot_importable() -> None:
    import finance_bot

    assert finance_bot.__name__ == "finance_bot"
```
  • Step 6.3: Run pytest
```sh
uv run pytest -v
```

Expected: 2 passed in <1s. No collection errors.

  • Step 6.4: Run ruff and mypy as a sanity check
```sh
uv run ruff check
uv run ruff format --check
uv run mypy
```

Expected: all three exit 0 with no findings.

  • Step 6.5: Commit
```sh
git add tests
git commit -m "test: add sanity tests verifying pytest + import work

Confirms pytest collects from tests/, asyncio_mode=auto is honored,
and finance_bot imports cleanly via src layout.

Co-Authored-By: Claude Opus 4.7 (1M context) <[email protected]>"
```

Task 7: Settings module (pydantic-settings)

Files:

  • Create: tests/unit/test_config.py

  • Modify: src/finance_bot/config.py

  • Create: .env.example

  • Step 7.1: Write the failing test

Replace tests/unit/test_config.py with:

```python
"""Tests for finance_bot.config.Settings."""

from __future__ import annotations

import pytest

from finance_bot.config import Settings


def _env(monkeypatch: pytest.MonkeyPatch, **overrides: str) -> None:
    base = {
        "TELEGRAM_BOT_TOKEN": "123:fake-token",
        "WHITELIST_TELEGRAM_IDS": "111,222",
        "DATABASE_URL": "postgresql+asyncpg://u:p@localhost/db",
        "LOG_LEVEL": "INFO",
        "METRICS_PORT": "9090",
    }
    base.update(overrides)
    for key, value in base.items():
        monkeypatch.setenv(key, value)


def test_settings_loads_from_env(monkeypatch: pytest.MonkeyPatch) -> None:
    _env(monkeypatch)
    settings = Settings()
    assert settings.telegram_bot_token == "123:fake-token"
    assert settings.whitelist_telegram_ids == frozenset({111, 222})
    assert str(settings.database_url) == "postgresql+asyncpg://u:p@localhost/db"
    assert settings.log_level == "INFO"
    assert settings.metrics_port == 9090


def test_settings_rejects_empty_whitelist(monkeypatch: pytest.MonkeyPatch) -> None:
    _env(monkeypatch, WHITELIST_TELEGRAM_IDS="")
    with pytest.raises(ValueError, match="at least one"):
        Settings()


def test_settings_rejects_invalid_log_level(monkeypatch: pytest.MonkeyPatch) -> None:
    _env(monkeypatch, LOG_LEVEL="VERBOSE")
    with pytest.raises(ValueError):
        Settings()


def test_settings_default_metrics_port(monkeypatch: pytest.MonkeyPatch) -> None:
    _env(monkeypatch)
    monkeypatch.delenv("METRICS_PORT", raising=False)
    settings = Settings()
    assert settings.metrics_port == 9090
```
  • Step 7.2: Run the test, expect it to fail
```sh
uv run pytest tests/unit/test_config.py -v
```

Expected: ImportError or AttributeError — Settings does not exist yet.

  • Step 7.3: Implement Settings

Replace src/finance_bot/config.py with:

```python
"""Application configuration loaded from environment variables."""

from __future__ import annotations

from typing import Annotated, Literal

from pydantic import Field, PostgresDsn, field_validator
from pydantic_settings import BaseSettings, NoDecode, SettingsConfigDict

LogLevel = Literal["DEBUG", "INFO", "WARNING", "ERROR"]


class Settings(BaseSettings):
    """Strongly-typed runtime configuration.

    All values come from environment variables. A `.env` file at the
    project root is loaded automatically in development; in production
    real env vars supersede it.
    """

    model_config = SettingsConfigDict(
        env_file=".env",
        env_file_encoding="utf-8",
        case_sensitive=False,
        extra="ignore",
    )

    telegram_bot_token: str = Field(min_length=10)
    # NoDecode stops pydantic-settings from JSON-parsing the complex type,
    # so the raw CSV string reaches the validator below.
    whitelist_telegram_ids: Annotated[frozenset[int], NoDecode] = Field(
        default_factory=frozenset
    )
    database_url: PostgresDsn
    log_level: LogLevel = "INFO"
    metrics_port: int = Field(default=9090, ge=1, le=65535)

    @field_validator("whitelist_telegram_ids", mode="before")
    @classmethod
    def _parse_whitelist(cls, value: object) -> frozenset[int]:
        if isinstance(value, frozenset):
            return value
        if isinstance(value, str):
            ids = [part.strip() for part in value.split(",") if part.strip()]
            if not ids:
                raise ValueError("whitelist_telegram_ids must contain at least one id")
            return frozenset(int(v) for v in ids)
        raise TypeError(f"unsupported whitelist type: {type(value).__name__}")
```
  • Step 7.4: Run the test, expect pass
```sh
uv run pytest tests/unit/test_config.py -v
```

Expected: 4 passed.

  • Step 7.5: Lint + typecheck
```sh
uv run ruff check
uv run ruff format
uv run mypy
```

Expected: all three pass clean.

  • Step 7.6: Create .env.example
```sh
cat > .env.example <<'EOF'
# Telegram bot token from @BotFather
TELEGRAM_BOT_TOKEN=123456:replace-with-real-token

# Comma-separated list of Telegram user IDs allowed to use the bot.
# Get your own ID by sending /start to @userinfobot.
WHITELIST_TELEGRAM_IDS=111111111,222222222

# PostgreSQL connection string (asyncpg driver).
DATABASE_URL=postgresql+asyncpg://finance_bot:finance_bot@postgres:5432/finance_bot

# DEBUG enables verbose logging including PII (amounts, raw text).
# Never set DEBUG in production.
LOG_LEVEL=INFO

# Port for Prometheus /metrics endpoint.
METRICS_PORT=9090
EOF
```
  • Step 7.7: Commit
```sh
git add src/finance_bot/config.py tests/unit/test_config.py .env.example
git commit -m "feat: add Settings module (pydantic-settings)

- TELEGRAM_BOT_TOKEN, WHITELIST_TELEGRAM_IDS (CSV -> frozenset[int]),
  DATABASE_URL (PostgresDsn), LOG_LEVEL (Literal), METRICS_PORT (int).
- .env auto-loaded in dev; real env vars win in prod.
- .env.example documents every variable with safe placeholder values.
- 4 unit tests cover: happy path, empty whitelist, invalid log level,
  default metrics port.

Co-Authored-By: Claude Opus 4.7 (1M context) <[email protected]>"
```

Task 8: Postgres init script and compose.yml skeleton


Files:

  • Create: infra/postgres/init/01-create-schemas.sql

  • Create: infra/prometheus/prometheus.yml

  • Create: compose.yml

  • Step 8.1: Create Postgres init script

```sh
mkdir -p infra/postgres/init
```

Create infra/postgres/init/01-create-schemas.sql:

```sql
-- Bootstrap script run once on a fresh PostgreSQL volume.
-- Creates the two top-level schemas referenced by Alembic migrations.
CREATE SCHEMA IF NOT EXISTS app;
CREATE SCHEMA IF NOT EXISTS ledger;

COMMENT ON SCHEMA app IS 'System of record: users, accounts, categories, budgets, processed_update, audit_log.';
COMMENT ON SCHEMA ledger IS 'Double-entry ledger: ledger.account, ledger.transfer.';
```
  • Step 8.2: Create Prometheus config
```sh
mkdir -p infra/prometheus
```

Create infra/prometheus/prometheus.yml:

```yaml
global:
  scrape_interval: 15s
  evaluation_interval: 15s

scrape_configs:
  - job_name: bot
    static_configs:
      - targets: ["bot:9090"]
```
  • Step 8.3: Create compose.yml
```yaml
name: finance-bot

services:
  postgres:
    image: postgres:16-alpine
    restart: unless-stopped
    environment:
      POSTGRES_DB: finance_bot
      POSTGRES_USER: finance_bot
      POSTGRES_PASSWORD: finance_bot
    volumes:
      - pgdata:/var/lib/postgresql/data
      - ./infra/postgres/init:/docker-entrypoint-initdb.d:ro
    ports:
      - "127.0.0.1:5432:5432"
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U finance_bot -d finance_bot"]
      interval: 5s
      timeout: 3s
      retries: 10

  bot:
    build:
      context: .
      dockerfile: Dockerfile
    restart: unless-stopped
    depends_on:
      postgres:
        condition: service_healthy
    environment:
      TELEGRAM_BOT_TOKEN: ${TELEGRAM_BOT_TOKEN}
      WHITELIST_TELEGRAM_IDS: ${WHITELIST_TELEGRAM_IDS}
      DATABASE_URL: postgresql+asyncpg://finance_bot:finance_bot@postgres:5432/finance_bot
      LOG_LEVEL: ${LOG_LEVEL:-INFO}
      METRICS_PORT: 9090
    ports:
      - "127.0.0.1:9091:9090"

  prometheus:
    image: prom/prometheus:v2.55.0
    restart: unless-stopped
    volumes:
      - ./infra/prometheus/prometheus.yml:/etc/prometheus/prometheus.yml:ro
      - promdata:/prometheus
    ports:
      - "127.0.0.1:9092:9090"

  grafana:
    image: grafana/grafana:11.3.0
    restart: unless-stopped
    depends_on:
      - prometheus
    environment:
      GF_SECURITY_ADMIN_USER: admin
      GF_SECURITY_ADMIN_PASSWORD: admin
      GF_USERS_ALLOW_SIGN_UP: "false"
    volumes:
      - ./infra/grafana/provisioning:/etc/grafana/provisioning:ro
      - grafanadata:/var/lib/grafana
    ports:
      - "127.0.0.1:3000:3000"

volumes:
  pgdata:
  promdata:
  grafanadata:
```
  • Step 8.4: Bring up Postgres only and verify
```sh
docker compose up -d postgres
sleep 5
docker compose exec -T postgres psql -U finance_bot -d finance_bot -c "\dn"
```

Expected: \dn lists schemas including app and ledger.

  • Step 8.5: Tear it down (Dockerfile not yet built)
```sh
docker compose down
```
  • Step 8.6: Commit
```sh
git add infra compose.yml
git commit -m "build: add compose.yml + postgres init + prometheus config

- compose.yml services: postgres 16-alpine, bot (built locally),
  prometheus 2.55, grafana 11.3.
- Postgres init script creates app + ledger schemas on first boot.
- Prometheus scrapes bot:9090 every 15s.
- All host ports bound to 127.0.0.1 only (no public exposure on VPS).

Co-Authored-By: Claude Opus 4.7 (1M context) <[email protected]>"
```
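
For orientation, what Prometheus actually scrapes from bot:9090 is plain-text exposition format. A hand-rolled sketch of the counter output prometheus-client will render (the metric name is an assumption; the real metrics module lands in a later task of this plan):

```python
# Minimal stand-in for what prometheus_client serves at /metrics.
updates_total = 0  # would be incremented per processed Telegram update


def render_metrics() -> str:
    """Render one counter in Prometheus text exposition format."""
    return (
        "# HELP bot_updates_total Telegram updates processed.\n"
        "# TYPE bot_updates_total counter\n"
        f"bot_updates_total {updates_total}\n"
    )
```

In the real service, prometheus_client.start_http_server(settings.metrics_port) replaces all of this with a background-thread HTTP endpoint.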

Task 9: Multi-stage Dockerfile (uv builder + slim runtime)


Files:

  • Create: Dockerfile

  • Create: .dockerignore

  • Step 9.1: Create .dockerignore

```text
.venv
.git
.github
.idea
.vscode
.mypy_cache
.pytest_cache
.ruff_cache
__pycache__
*.pyc
docs
tests
.env
.env.local
```

Note: README.md is intentionally not ignored. The Dockerfile’s COPY pyproject.toml uv.lock README.md ./ requires it because pyproject.toml declares readme = "README.md" and uv reads the README when building the project wheel during uv sync.

  • Step 9.2: Create Dockerfile
```dockerfile
# syntax=docker/dockerfile:1.7-labs

# ---- Builder stage ----
FROM python:3.12-slim-bookworm AS builder

ENV UV_LINK_MODE=copy \
    UV_COMPILE_BYTECODE=1 \
    UV_PYTHON_DOWNLOADS=never

# Install uv
COPY --from=ghcr.io/astral-sh/uv:0.5.4 /uv /uvx /usr/local/bin/

WORKDIR /app

# Copy lock + manifest first for layer caching
COPY pyproject.toml uv.lock README.md ./
COPY src ./src

# Install runtime deps + project (skip dev)
RUN --mount=type=cache,target=/root/.cache/uv \
    uv sync --frozen --no-dev --no-install-project \
    && uv sync --frozen --no-dev

# ---- Runtime stage ----
FROM python:3.12-slim-bookworm AS runtime

ENV PATH="/app/.venv/bin:$PATH" \
    PYTHONUNBUFFERED=1 \
    PYTHONDONTWRITEBYTECODE=1

RUN groupadd --system bot && useradd --system --gid bot --create-home bot

WORKDIR /app

COPY --from=builder --chown=bot:bot /app /app
COPY --chown=bot:bot migrations ./migrations
COPY --chown=bot:bot alembic.ini ./alembic.ini

USER bot
EXPOSE 9090
ENTRYPOINT ["python", "-m", "finance_bot"]
```

Note: migrations/ and alembic.ini don’t exist yet; they’re created in Task 10. Building the Dockerfile now will fail at the COPY step. The image is built and verified at the end of Task 10. We commit the Dockerfile here so the related changes are co-located.

  • Step 9.3: Commit
```sh
git add Dockerfile .dockerignore
git commit -m "build: add multi-stage Dockerfile (uv builder + slim runtime)

- Builder uses uv 0.5.4 to install deps from frozen lockfile.
- Runtime is python:3.12-slim with non-root 'bot' user.
- migrations/ and alembic.ini are copied into the runtime image but
  do not yet exist on disk; Task 10 creates them and verifies the
  build.

Co-Authored-By: Claude Opus 4.7 (1M context) <[email protected]>"
```

Task 10: Initialize Alembic (async template)

Files:

  • Create: alembic.ini

  • Create: migrations/env.py

  • Create: migrations/script.py.mako

  • Create: migrations/versions/ (empty dir)

  • Step 10.1: Run alembic init for an async template

```sh
uv run alembic init --template async migrations
```

Expected: creates alembic.ini at root and migrations/ with env.py, README, script.py.mako, versions/.

  • Step 10.2: Edit alembic.ini

In the freshly generated alembic.ini, find the line sqlalchemy.url = ... and replace the entire [alembic] section’s sqlalchemy.url with a placeholder we’ll override from env:

```ini
sqlalchemy.url =
```

(blank — env.py reads from Settings).

Also remove or comment out the [loggers], [handlers], [formatters] sections at the bottom — Alembic’s logging conflicts with structlog. Replace the entire logging block at the bottom of alembic.ini with:

```ini
[loggers]
keys = root

[handlers]
keys =

[formatters]
keys =

[logger_root]
level = WARNING
handlers =
qualname =
```
  • Step 10.3: Replace migrations/env.py

Overwrite migrations/env.py with:

```python
"""Alembic environment for async SQLAlchemy with finance_bot.config.Settings."""

from __future__ import annotations

import asyncio
from logging.config import fileConfig

from alembic import context
from sqlalchemy import pool
from sqlalchemy.ext.asyncio import async_engine_from_config

from finance_bot.config import Settings

config = context.config
if config.config_file_name is not None:
    fileConfig(config.config_file_name)

# Inject DATABASE_URL from Settings into Alembic config.
settings = Settings()  # type: ignore[call-arg]
config.set_main_option("sqlalchemy.url", str(settings.database_url))

# We do not yet have SQLAlchemy declarative models. When models are
# added in Plan 4 (repositories), point target_metadata at them so
# `alembic revision --autogenerate` works.
target_metadata = None


def run_migrations_offline() -> None:
    url = config.get_main_option("sqlalchemy.url")
    context.configure(
        url=url,
        target_metadata=target_metadata,
        literal_binds=True,
        dialect_opts={"paramstyle": "named"},
    )
    with context.begin_transaction():
        context.run_migrations()


def do_run_migrations(connection) -> None:  # type: ignore[no-untyped-def]
    context.configure(connection=connection, target_metadata=target_metadata)
    with context.begin_transaction():
        context.run_migrations()


async def run_migrations_online() -> None:
    connectable = async_engine_from_config(
        config.get_section(config.config_ini_section, {}),
        prefix="sqlalchemy.",
        poolclass=pool.NullPool,
    )
    async with connectable.connect() as connection:
        await connection.run_sync(do_run_migrations)
    await connectable.dispose()


if context.is_offline_mode():
    run_migrations_offline()
else:
    asyncio.run(run_migrations_online())
```
  • Step 10.4: Verify alembic finds an empty version chain

Bring Postgres up, then run:

```sh
docker compose up -d postgres
sleep 3
TELEGRAM_BOT_TOKEN=1234567890:stub WHITELIST_TELEGRAM_IDS=1 \
  DATABASE_URL=postgresql+asyncpg://finance_bot:finance_bot@localhost:5432/finance_bot \
  uv run alembic current
```

Expected: prints nothing (no migrations applied yet) and exits 0.

  • Step 10.5: Build the Docker image (verifies Task 9’s Dockerfile)
```sh
docker compose build bot
```

Expected: image builds successfully. (No need to run it yet — the bot’s __main__.py is still a stub.)

  • Step 10.6: Tear down
```sh
docker compose down
```
  • Step 10.7: Commit
```sh
git add alembic.ini migrations
git commit -m "build: initialize Alembic with async SQLAlchemy env

- alembic init --template async created migrations/.
- env.py overridden to read DATABASE_URL from finance_bot.config.Settings
  (so migrations honor the same env layer as the bot).
- target_metadata=None for now; models land in Plan 4.
- alembic.ini logging block neutered to avoid clashing with structlog
  (configured in Task 15).

Co-Authored-By: Claude Opus 4.7 (1M context) <[email protected]>"
```

Task 11: First migration — app.user and app.processed_update


Files:

  • Create: migrations/versions/0001_initial_app_user_processed_update.py

  • Step 11.1: Generate the empty revision

```sh
TELEGRAM_BOT_TOKEN=1234567890:stub WHITELIST_TELEGRAM_IDS=1 \
  DATABASE_URL=postgresql+asyncpg://finance_bot:finance_bot@localhost:5432/finance_bot \
  uv run alembic revision -m "initial app user and processed_update" --rev-id 0001
```

Expected: creates migrations/versions/0001_initial_app_user_processed_update.py.

  • Step 11.2: Replace the generated revision body

Open the new file and replace contents with:

```python
"""initial app user and processed_update

Revision ID: 0001
Revises:
Create Date: 2026-04-28
"""

from __future__ import annotations

from collections.abc import Sequence

import sqlalchemy as sa
from alembic import op
from sqlalchemy.dialects import postgresql

revision: str = "0001"
down_revision: str | None = None
branch_labels: Sequence[str] | None = None
depends_on: Sequence[str] | None = None


def upgrade() -> None:
    op.create_table(
        "user",
        sa.Column("id", postgresql.UUID(as_uuid=True), primary_key=True),
        sa.Column("telegram_id", sa.BigInteger(), nullable=False),
        sa.Column("display_name", sa.Text(), nullable=False),
        sa.Column("default_currency", sa.Text(), nullable=False, server_default="UAH"),
        sa.Column(
            "created_at",
            postgresql.TIMESTAMP(timezone=True),
            server_default=sa.func.now(),
            nullable=False,
        ),
        sa.CheckConstraint(
            "default_currency ~ '^[A-Z]{3}$'",
            name="user_default_currency_iso4217",
        ),
        schema="app",
    )
    op.create_index("ix_user_telegram_id", "user", ["telegram_id"], unique=True, schema="app")
    op.create_table(
        "processed_update",
        sa.Column("update_id", sa.BigInteger(), primary_key=True),
        sa.Column(
            "processed_at",
            postgresql.TIMESTAMP(timezone=True),
            server_default=sa.func.now(),
            nullable=False,
        ),
        schema="app",
    )


def downgrade() -> None:
    op.drop_table("processed_update", schema="app")
    op.drop_index("ix_user_telegram_id", table_name="user", schema="app")
    op.drop_table("user", schema="app")
```

Note: the dialect types are imported explicitly (from sqlalchemy.dialects import postgresql) rather than reached via sa.dialects.postgresql, which is not reliably populated until the submodule is imported.
  • Step 11.3: Apply migration
```sh
docker compose up -d postgres
sleep 3
TELEGRAM_BOT_TOKEN=1234567890:stub WHITELIST_TELEGRAM_IDS=1 \
  DATABASE_URL=postgresql+asyncpg://finance_bot:finance_bot@localhost:5432/finance_bot \
  uv run alembic upgrade head
```

Expected: prints Running upgrade -> 0001, initial app user and processed_update.

  • Step 11.4: Verify tables exist
```sh
docker compose exec -T postgres psql -U finance_bot -d finance_bot -c "\dt app.*"
```

Expected: lists app.user and app.processed_update.

  • Step 11.5: Verify downgrade is clean
```sh
TELEGRAM_BOT_TOKEN=1234567890:stub WHITELIST_TELEGRAM_IDS=1 \
  DATABASE_URL=postgresql+asyncpg://finance_bot:finance_bot@localhost:5432/finance_bot \
  uv run alembic downgrade base
docker compose exec -T postgres psql -U finance_bot -d finance_bot -c "\dt app.*"
```

Expected: downgrade prints Running downgrade 0001 -> , ...; \dt app.* returns “Did not find any relations”.

  • Step 11.6: Re-apply, leave PG running for next tasks
Terminal window
TELEGRAM_BOT_TOKEN=1234567890:stub WHITELIST_TELEGRAM_IDS=1 \
DATABASE_URL=postgresql+asyncpg://finance_bot:finance_bot@localhost:5432/finance_bot \
uv run alembic upgrade head
  • Step 11.7: Commit
Terminal window
git add migrations/versions/0001_initial_app_user_processed_update.py
git commit -m "feat(db): add initial migration for app.user + app.processed_update
- app.user: id (UUID, client-set in Python — TB-friendly), telegram_id
(BigInt, UK), display_name, default_currency (CHECK ISO-4217 alpha),
created_at.
- app.processed_update: update_id (BigInt PK) for idempotency
middleware to dedupe Telegram redeliveries.
- Verified upgrade->head, downgrade->base, and re-upgrade all clean.
Co-Authored-By: Claude Opus 4.7 (1M context) <[email protected]>"

Files:

  • Modify: src/finance_bot/bootstrap.py

  • Create: tests/unit/test_bootstrap.py

  • Step 12.1: Write the failing test

Create tests/unit/test_bootstrap.py:

"""Bootstrap composition root tests."""
from __future__ import annotations
import pytest
from finance_bot.bootstrap import Bootstrap, BootstrapResult
from finance_bot.config import Settings
@pytest.fixture
def settings(monkeypatch: pytest.MonkeyPatch) -> Settings:
monkeypatch.setenv("TELEGRAM_BOT_TOKEN", "123:fake-token")
monkeypatch.setenv("WHITELIST_TELEGRAM_IDS", "111")
monkeypatch.setenv(
"DATABASE_URL", "postgresql+asyncpg://u:p@localhost/db"
)
monkeypatch.setenv("LOG_LEVEL", "INFO")
return Settings() # type: ignore[call-arg]
def test_bootstrap_returns_result_with_settings(settings: Settings) -> None:
result = Bootstrap(settings).build()
assert isinstance(result, BootstrapResult)
assert result.settings is settings
def test_bootstrap_provides_dispatcher_and_bot(settings: Settings) -> None:
result = Bootstrap(settings).build()
# aiogram types — checked by attribute presence to avoid heavy imports
assert hasattr(result.dispatcher, "include_router")
assert hasattr(result.bot, "session")
  • Step 12.2: Run test, expect ImportError
Terminal window
uv run pytest tests/unit/test_bootstrap.py -v

Expected: ImportError — Bootstrap / BootstrapResult don’t exist.

  • Step 12.3: Implement bootstrap.py

Replace src/finance_bot/bootstrap.py with:

"""Composition root.
The ONLY module allowed to import both ports and concrete adapters.
Wires Settings -> aiogram Bot/Dispatcher -> handlers/middlewares.
Real dependencies (DB pool, ledger adapter, repos) get wired in here
in subsequent plans (2-6).
"""
from __future__ import annotations
from dataclasses import dataclass
from aiogram import Bot, Dispatcher
from aiogram.client.default import DefaultBotProperties
from aiogram.enums import ParseMode
from finance_bot.config import Settings
@dataclass(frozen=True)
class BootstrapResult:
settings: Settings
bot: Bot
dispatcher: Dispatcher
class Bootstrap:
"""Build the assembled application from a Settings instance."""
def __init__(self, settings: Settings) -> None:
self._settings = settings
def build(self) -> BootstrapResult:
bot = Bot(
token=self._settings.telegram_bot_token,
default=DefaultBotProperties(parse_mode=ParseMode.HTML),
)
dispatcher = Dispatcher()
# Routers and middlewares are wired in __main__.py (Task 13).
return BootstrapResult(
settings=self._settings,
bot=bot,
dispatcher=dispatcher,
)
  • Step 12.4: Run test, expect pass
Terminal window
uv run pytest tests/unit/test_bootstrap.py -v

Expected: 2 passed.

  • Step 12.5: Lint + typecheck
Terminal window
uv run ruff check
uv run ruff format
uv run mypy

Expected: clean.

  • Step 12.6: Commit
Terminal window
git add src/finance_bot/bootstrap.py tests/unit/test_bootstrap.py
git commit -m "feat: add Bootstrap composition root
- Bootstrap(Settings).build() returns BootstrapResult(settings, bot,
dispatcher).
- Bot uses aiogram 3 DefaultBotProperties with HTML parse_mode.
- Dispatcher created without routers/middlewares — wired in __main__.
- 2 unit tests cover settings round-trip and dispatcher/bot presence.
Co-Authored-By: Claude Opus 4.7 (1M context) <[email protected]>"

Task 13: AccessMiddleware + /start handler + __main__


Files:

  • Create: src/finance_bot/adapters/telegram/middlewares/access.py

  • Create: src/finance_bot/adapters/telegram/handlers/start.py

  • Modify: src/finance_bot/__main__.py

  • Create: tests/unit/test_access_middleware.py

  • Step 13.1: Write the failing AccessMiddleware test

Create tests/unit/test_access_middleware.py:

"""Tests for AccessMiddleware."""
from __future__ import annotations
from typing import Any
from unittest.mock import AsyncMock, MagicMock
import pytest
from finance_bot.adapters.telegram.middlewares.access import AccessMiddleware
def _message(user_id: int) -> Any:
msg = MagicMock()
msg.from_user = MagicMock()
msg.from_user.id = user_id
msg.answer = AsyncMock()
return msg
@pytest.mark.asyncio
async def test_passes_when_user_in_whitelist() -> None:
mw = AccessMiddleware(whitelist=frozenset({111, 222}))
handler = AsyncMock(return_value="handled")
msg = _message(111)
result = await mw(handler, msg, {})
assert result == "handled"
handler.assert_awaited_once_with(msg, {})
msg.answer.assert_not_awaited()
@pytest.mark.asyncio
async def test_rejects_when_user_not_in_whitelist() -> None:
mw = AccessMiddleware(whitelist=frozenset({111}))
handler = AsyncMock()
msg = _message(999)
result = await mw(handler, msg, {})
assert result is None
handler.assert_not_awaited()
msg.answer.assert_awaited_once()
args, _ = msg.answer.call_args
assert "Access Denied" in args[0]
@pytest.mark.asyncio
async def test_rejects_when_no_user() -> None:
mw = AccessMiddleware(whitelist=frozenset({111}))
handler = AsyncMock()
msg = MagicMock()
msg.from_user = None
msg.answer = AsyncMock()
result = await mw(handler, msg, {})
assert result is None
handler.assert_not_awaited()
  • Step 13.2: Run test, expect ImportError
Terminal window
uv run pytest tests/unit/test_access_middleware.py -v

Expected: ImportError — AccessMiddleware doesn’t exist.

  • Step 13.3: Implement AccessMiddleware

Create src/finance_bot/adapters/telegram/middlewares/access.py:

"""Whitelist-based access control middleware (aiogram 3)."""
from __future__ import annotations
from collections.abc import Awaitable, Callable
from typing import Any
from aiogram import BaseMiddleware
from aiogram.types import TelegramObject
class AccessMiddleware(BaseMiddleware):
"""Drop messages from users not in the whitelist."""
def __init__(self, whitelist: frozenset[int]) -> None:
self._whitelist = whitelist
async def __call__(
self,
handler: Callable[[TelegramObject, dict[str, Any]], Awaitable[Any]],
event: TelegramObject,
data: dict[str, Any],
) -> Any:
user = getattr(event, "from_user", None)
if user is None or user.id not in self._whitelist:
answer = getattr(event, "answer", None)
if answer is not None:
await answer("Access Denied")
return None
return await handler(event, data)
  • Step 13.4: Run test, expect pass
Terminal window
uv run pytest tests/unit/test_access_middleware.py -v

Expected: 3 passed.

  • Step 13.5: Implement /start handler

Create src/finance_bot/adapters/telegram/handlers/start.py:

"""/start command handler."""
from __future__ import annotations
from aiogram import Router
from aiogram.filters import CommandStart
from aiogram.types import Message
start_router = Router(name="start")
@start_router.message(CommandStart())
async def handle_start(message: Message) -> None:
await message.answer(
"<b>Finance Bot</b> — MVP-1 bootstrap.\n\n"
"Real commands land in Plan 6. For now this is a connectivity "
"check: if you see this message, the bot is up, the whitelist "
"lets you through, and Postgres + metrics are running."
)
  • Step 13.6: Implement __main__.py

Replace src/finance_bot/__main__.py with:

"""Application entry point.
Wires routers + middlewares onto the bootstrapped Dispatcher and starts
long-polling. Prometheus metrics endpoint is started in a background
thread.
"""
from __future__ import annotations
import asyncio
from prometheus_client import start_http_server
from finance_bot.adapters.telegram.handlers.start import start_router
from finance_bot.adapters.telegram.middlewares.access import AccessMiddleware
from finance_bot.bootstrap import Bootstrap
from finance_bot.config import Settings
async def amain() -> None:
settings = Settings() # type: ignore[call-arg]
deps = Bootstrap(settings).build()
# Wire middlewares
deps.dispatcher.message.middleware(
AccessMiddleware(whitelist=settings.whitelist_telegram_ids)
)
# Wire routers
deps.dispatcher.include_router(start_router)
# Start metrics server (synchronous, runs in background thread)
start_http_server(settings.metrics_port)
try:
await deps.dispatcher.start_polling(deps.bot)
finally:
await deps.bot.session.close()
def main() -> None:
asyncio.run(amain())
if __name__ == "__main__":
main()
  • Step 13.7: Run all unit tests + lint + typecheck
Terminal window
uv run pytest -v
uv run ruff check
uv run ruff format
uv run mypy

Expected: all pass clean.

  • Step 13.8: Manual end-to-end smoke test

Create a .env with a real Telegram bot token (from @BotFather) and your own Telegram ID:

Terminal window
cp .env.example .env
# edit .env: paste real TELEGRAM_BOT_TOKEN and your real Telegram ID into WHITELIST_TELEGRAM_IDS

Bring everything up:

Terminal window
docker compose up -d postgres
sleep 3
TELEGRAM_BOT_TOKEN=1234567890:stub WHITELIST_TELEGRAM_IDS=1 \
DATABASE_URL=postgresql+asyncpg://finance_bot:finance_bot@localhost:5432/finance_bot \
uv run alembic upgrade head
# run bot from host (faster than rebuilding the image)
uv run python -m finance_bot

In Telegram, send /start to your bot. Expected: bot replies with the bootstrap message.

In a second shell:

Terminal window
curl -s http://localhost:9090/metrics | head -20

Expected: Prometheus metrics text format. Stop the bot (Ctrl+C) and tear down: docker compose down.

  • Step 13.9: Commit
Terminal window
git add src/finance_bot/adapters/telegram/middlewares/access.py \
src/finance_bot/adapters/telegram/handlers/start.py \
src/finance_bot/__main__.py \
tests/unit/test_access_middleware.py
git commit -m "feat(bot): add /start handler, AccessMiddleware, and entry point
- AccessMiddleware drops messages from non-whitelist users with
'Access Denied'; works for events with or without from_user.
- /start handler answers with a connectivity-check message.
- __main__ wires middleware + router, starts Prometheus metrics on
METRICS_PORT, then dispatcher.start_polling(bot).
- Manual smoke verified: real bot replies to /start; /metrics returns
Prometheus exposition format.
Co-Authored-By: Claude Opus 4.7 (1M context) <[email protected]>"

Task 14: Prometheus metrics module (placeholder for now)


Files:

  • Create: src/finance_bot/adapters/observability/metrics.py

This task formalizes the metrics module; in Plan 6/7 it grows real counters and histograms. For now we expose a single sentinel counter so curl /metrics shows something app-specific.

  • Step 14.1: Write the metrics module

Create src/finance_bot/adapters/observability/metrics.py:

"""Prometheus metric registrations.
Plan 6 fills in ledger / handler metrics. This bootstrap only exposes
a heartbeat so that `/metrics` returns at least one finance_bot_*
series and Grafana dashboards have something to bind to.
"""
from __future__ import annotations
from prometheus_client import Counter
bot_started_total = Counter(
"bot_started_total",
"Number of times the bot process has started.",
)
  • Step 14.2: Wire it in __main__.py

Modify src/finance_bot/__main__.py — add an import and an increment:

# at top with other imports
from finance_bot.adapters.observability.metrics import bot_started_total

In amain, immediately after start_http_server(settings.metrics_port):

bot_started_total.inc()
  • Step 14.3: Verify lint + typecheck
Terminal window
uv run ruff check
uv run ruff format
uv run mypy

Expected: clean.

  • Step 14.4: Commit
Terminal window
git add src/finance_bot/adapters/observability/metrics.py src/finance_bot/__main__.py
git commit -m "feat(observability): expose bot_started_total counter
Tiny heartbeat metric so /metrics has at least one finance_bot_*
series. Plan 6 adds real counters and histograms (messages_received,
ledger_transfer_duration, invariant_violations, etc.).
Co-Authored-By: Claude Opus 4.7 (1M context) <[email protected]>"

Task 15: structlog setup with PII-redaction


Files:

  • Create: src/finance_bot/adapters/observability/logging.py

  • Create: tests/unit/test_logging_redaction.py

  • Modify: src/finance_bot/__main__.py (call configure_logging)

  • Step 15.1: Write the failing redaction test

Create tests/unit/test_logging_redaction.py:

"""Tests for structlog redaction processor."""
from __future__ import annotations
import logging
import pytest
import structlog
from finance_bot.adapters.observability.logging import (
configure_logging,
redact_pii_processor,
)
def test_redact_pii_replaces_known_keys_at_info() -> None:
event = {
"amount_minor": 25000,
"raw_text": "250 кафе",
"category_name": "cafe",
"account_name": "card",
"user_id": "uuid-here", # not PII — kept
"level": "info",
}
result = redact_pii_processor(None, "info", event)
assert result["amount_minor"] == "<redacted>"
assert result["raw_text"] == "<redacted>"
assert result["category_name"] == "<redacted>"
assert result["account_name"] == "<redacted>"
assert result["user_id"] == "uuid-here"
def test_redact_pii_passthrough_at_debug() -> None:
event = {
"amount_minor": 25000,
"raw_text": "250 кафе",
"level": "debug",
}
result = redact_pii_processor(None, "debug", event)
assert result["amount_minor"] == 25000
assert result["raw_text"] == "250 кафе"
def test_configure_logging_idempotent(capsys: pytest.CaptureFixture[str]) -> None:
configure_logging(level="INFO")
configure_logging(level="INFO") # second call should not crash
log = structlog.get_logger("test")
log.info("hello", user_id="abc")
log.warning("world", amount_minor=999)
out = capsys.readouterr().err
assert "hello" in out
assert "abc" in out
assert "<redacted>" in out # 999 must be redacted at INFO/WARNING
# ensure root logging level is set
assert logging.getLogger().level == logging.INFO
  • Step 15.2: Run test, expect ImportError
Terminal window
uv run pytest tests/unit/test_logging_redaction.py -v

Expected: ImportError.

  • Step 15.3: Implement logging module

Create src/finance_bot/adapters/observability/logging.py:

"""structlog configuration with PII-redaction at INFO+.
DEBUG is verbose: amounts, raw_text, category/account names appear
in logs. INFO/WARNING/ERROR redact those fields. ADR-0011 governs
this decision.
"""
from __future__ import annotations
import logging
import sys
from typing import Any
import structlog
from structlog.types import EventDict, WrappedLogger
PII_KEYS: frozenset[str] = frozenset(
{
"amount_minor",
"amount",
"raw_text",
"category_name",
"account_name",
"display_name",
}
)
REDACTED = "<redacted>"
def redact_pii_processor(
logger: WrappedLogger | None,
method_name: str,
event_dict: EventDict,
) -> EventDict:
"""Replace PII keys with <redacted> unless the call is at DEBUG level."""
if method_name == "debug":
return event_dict
for key in PII_KEYS:
if key in event_dict:
event_dict[key] = REDACTED
return event_dict
_configured = False
def configure_logging(*, level: str = "INFO") -> None:
"""Idempotently configure structlog and stdlib logging."""
global _configured
log_level = getattr(logging, level.upper(), logging.INFO)
logging.basicConfig(
format="%(message)s",
stream=sys.stderr,
level=log_level,
force=True,
)
processors: list[Any] = [
structlog.contextvars.merge_contextvars,
structlog.processors.add_log_level,
structlog.processors.TimeStamper(fmt="iso", utc=True),
redact_pii_processor,
structlog.processors.JSONRenderer(),
]
structlog.configure(
processors=processors,
wrapper_class=structlog.make_filtering_bound_logger(log_level),
context_class=dict,
logger_factory=structlog.PrintLoggerFactory(file=sys.stderr),
cache_logger_on_first_use=True,
)
_configured = True
  • Step 15.4: Run test, expect pass
Terminal window
uv run pytest tests/unit/test_logging_redaction.py -v

Expected: 3 passed.

  • Step 15.5: Wire configure_logging in __main__.py

Edit src/finance_bot/__main__.py:

  1. Add this import alongside the others at the top of the file:
from finance_bot.adapters.observability.logging import configure_logging
  2. Insert one new line in amain() immediately after settings = Settings(). The opening of amain() should now read:
async def amain() -> None:
    settings = Settings()  # type: ignore[call-arg]
    configure_logging(level=settings.log_level)
    deps = Bootstrap(settings).build()
    # ... rest unchanged

No other lines in amain() change; in particular start_http_server, bot_started_total.inc(), middleware/router wiring, and start_polling keep their order from Tasks 13 and 14.

  • Step 15.6: Lint + typecheck + full test run
Terminal window
uv run ruff check
uv run ruff format
uv run mypy
uv run pytest -v

Expected: all clean.

  • Step 15.7: Commit
Terminal window
git add src/finance_bot/adapters/observability/logging.py \
src/finance_bot/__main__.py \
tests/unit/test_logging_redaction.py
git commit -m "feat(observability): structlog setup with PII redaction (ADR-0011)
- redact_pii_processor replaces amount_minor, raw_text, category_name,
account_name, display_name with '<redacted>' at INFO/WARNING/ERROR.
- DEBUG level is verbose: PII passes through for local debugging.
- configure_logging is idempotent; called once at amain() startup.
- 3 unit tests cover redaction at info, passthrough at debug, and
configure_logging idempotency.
Co-Authored-By: Claude Opus 4.7 (1M context) <[email protected]>"

Task 16: Grafana provisioning (Prometheus datasource)


Files:

  • Create: infra/grafana/provisioning/datasources/prometheus.yml

  • Step 16.1: Create Grafana datasource provisioning

Terminal window
mkdir -p infra/grafana/provisioning/datasources

Create infra/grafana/provisioning/datasources/prometheus.yml:

apiVersion: 1

datasources:
  - name: Prometheus
    type: prometheus
    access: proxy
    url: http://prometheus:9090
    isDefault: true
    editable: true
  • Step 16.2: Verify Grafana picks it up
Terminal window
docker compose up -d
sleep 10
curl -s -u admin:admin http://localhost:3000/api/datasources | python -m json.tool

Expected: returns a JSON array containing one datasource named Prometheus.

  • Step 16.3: Tear down
Terminal window
docker compose down
  • Step 16.4: Commit
Terminal window
git add infra/grafana
git commit -m "ops: provision Prometheus datasource in Grafana
Datasource auto-loaded on container start via
provisioning/datasources/prometheus.yml. Plan 7 adds dashboards.
Co-Authored-By: Claude Opus 4.7 (1M context) <[email protected]>"

Task 17: ADR-0004 — Modular Monolith over Microservices


Files:

  • Create: docs/adr/0004-modular-monolith.md

  • Step 17.1: Create the ADR

Terminal window
mkdir -p docs/adr

Create docs/adr/0004-modular-monolith.md:

---
type: adr
id: ADR-0004
status: accepted
date: 2026-04-28
deciders:
- "@zipsybok"
related: [ADR-0006, ADR-0013]
tags: [architecture, topology]
---
# ADR-0004 — Adopt a modular monolith for MVP-1
## Status
**Accepted** · *2026-04-28*
## Context
MVP-1 has two real users (the author and his partner) and a Tier-2
feature scope (expenses, income, transfers, budgets, simple stats).
Throughput is < 100 messages/day. The system must run on a single VPS
under docker-compose (ADR-0013) with one operator.
The codebase is brand-new and has to ship within ~4 weeks of part-time
work. The portfolio narrative has to support a credible answer to "why
this topology", not "we chose microservices because everyone does".
### Considered alternatives
| Option | Summary | Pros | Cons | Outcome |
|--------|---------|------|------|---------|
| A — Microservices | Bot, ledger, reports as separate services | Independent deploys; clean fault isolation; trendy on CV | 3× ops overhead; inter-service calls add latency and failure modes; no real reason to split at 100 msg/day; portfolio narrative becomes "I cargo-culted Netflix" | rejected |
| B — Plain monolith | One Python module with no internal boundaries | Fastest to write | Refactoring later is painful; ledger swap to TigerBeetle becomes a rewrite | rejected |
| **C (chosen) — Modular monolith** | One process, hexagonal layout (`domain`, `ports`, `application`, `adapters`); all run in one container | All upsides of (B) for ops; (A)'s clean module boundaries baked in; ledger backend swap is a one-file change in `adapters/ledger/` | Requires self-discipline (importing `adapters.*` from `domain.*` is forbidden — enforced by mypy & code review) | **selected** |
## Decision
We will run all MVP-1 functionality as **one Python process** packaged
in **one Docker image**, with internal modules separated by hexagonal
boundaries (`domain`, `ports`, `application`, `adapters`). Cross-module
calls happen via Python imports, never network.
Concretely:
- One container in `compose.yml` named `bot`.
- One entry point: `python -m finance_bot`.
- `bootstrap.py` is the only place that wires concrete adapters to
ports; everything else depends on interfaces.
- A future split into services is a non-goal; if traffic ever justifies
it, the candidate seam is between `application/` and `adapters/telegram/`
(split the bot front-end from the domain backend), not splitting the
domain itself.
## Consequences
### Positive
- One artifact to build, push, deploy, and observe.
- Lowest cognitive overhead for solo development.
- Clean swap to TigerBeetle (the only hard module-replacement on the
roadmap) is a one-adapter change — no service contract negotiation.
### Negative / trade-offs
- A bug in one module can crash the whole bot. Acceptable for MVP-1
(Telegram redelivers updates after restart).
- The hexagonal discipline must be enforced by mypy boundaries and
code review; the language doesn't enforce it natively.
### Neutral / follow-ups
- [ ] Add an architecture-fitness test in CI (Plan 7) that asserts no
module under `domain/` or `application/` imports anything under
`adapters/`.
## References
- Sam Newman, *Monolith to Microservices*, Ch. 1 — when to split.
- TigerBeetle docs/coding/data-modeling.md — adapter swap is the only
planned cross-module replacement, and that's a library swap, not a
service split.
  • Step 17.2: Commit
Terminal window
git add docs/adr
git commit -m "docs(adr): ADR-0004 modular monolith over microservices
Co-Authored-By: Claude Opus 4.7 (1M context) <[email protected]>"

Files:

  • Create: docs/adr/0011-pii-masking-in-logs.md

  • Step 18.1: Create the ADR

Create docs/adr/0011-pii-masking-in-logs.md:

---
type: adr
id: ADR-0011
status: accepted
date: 2026-04-28
deciders:
- "@zipsybok"
related: []
tags: [observability, security, privacy]
---
# ADR-0011 — Mask PII in logs by default; opt-in DEBUG verbosity
## Status
**Accepted** · *2026-04-28*
## Context
The bot processes financial transactions whose payload — amounts,
free-text categories, account names — is privacy-sensitive even at
two-user scale. Logs are written to stderr in JSON and end up on a
shared VPS, in CI run artifacts, and (later) in any log shipper.
We want enough information in logs to debug production issues without
leaking spend patterns of either user. Local debugging occasionally
needs the full payload.
### Considered alternatives
| Option | Summary | Pros | Cons | Outcome |
|--------|---------|------|------|---------|
| A — Log everything | Default to DEBUG-style logs in production | Easiest to debug | Every retained log archive becomes a soft data leak; reviewers/operators see partner's spending | rejected |
| B — Log nothing PII-shaped | Strip amounts/raw_text/category at all levels | Bulletproof privacy | Production debugging becomes guesswork (we lose the ability to correlate "what message produced what error") | rejected |
| **C (chosen) — Mask by default, opt-in verbose** | INFO/WARNING/ERROR redact known PII keys; DEBUG passes through. `LOG_LEVEL=DEBUG` is opt-in via env, not a default | Production logs safe by construction; local debugging keeps full power; one knob to flip | Discipline required: new code must use the standard PII keys (`amount_minor`, `raw_text`, `category_name`, `account_name`, `display_name`) so the redactor catches them | **selected** |
## Decision
We will configure structlog with a `redact_pii_processor` that
replaces a fixed set of PII-flagged keys with the literal string
`<redacted>` whenever the log call is at INFO, WARNING, or ERROR.
At DEBUG the processor is a no-op.
The PII key set is centralized in `finance_bot/adapters/observability/logging.py`:
- `amount_minor`, `amount`
- `raw_text`
- `category_name`, `account_name`, `display_name`
`LOG_LEVEL=DEBUG` must never be set in production. Local development
turns it on through `.env`.
## Consequences
### Positive
- Default-safe: a fresh deployment cannot leak PII through logs.
- Single source of truth (PII_KEYS constant) — adding a new sensitive
field means adding a name to one frozenset.
- Test asserts the contract: `test_redact_pii_replaces_known_keys_at_info`,
`test_redact_pii_passthrough_at_debug`.
### Negative / trade-offs
- New code that introduces a new sensitive field but forgets to add
it to `PII_KEYS` will leak. Mitigation: code review checklist + a
fitness test in Plan 7 that scans for log calls with raw amounts.
- `<redacted>` makes some production traces hard to correlate. When
needed, the operator flips `LOG_LEVEL=DEBUG` for one process,
reproduces, then flips back.
### Neutral / follow-ups
- [ ] Plan 7: add a structlog filter that drops the entire log line
if a sensitive key is detected at WARNING/ERROR with a non-string
value (defensive: catches "I forgot to redact" by failing closed).
- [ ] Add `LOG_LEVEL=DEBUG` to runbook with a warning banner.
## References
- `src/finance_bot/adapters/observability/logging.py` — implementation.
- `tests/unit/test_logging_redaction.py` — contract tests.
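The fail-closed filter deferred in the follow-ups can be sketched as a plain processor function. This standalone version raises a local exception instead of structlog's `DropEvent` so it carries no dependency; the shape of the check (non-string value under a PII key at WARNING/ERROR means the redactor was bypassed) is an assumption about the Plan 7 design:

```python
PII_KEYS = frozenset(
    {"amount_minor", "amount", "raw_text",
     "category_name", "account_name", "display_name"}
)


class WouldLeakPII(Exception):
    """Raised to drop the whole event (structlog's DropEvent would play
    this role inside a real processor chain)."""


def fail_closed_processor(logger, method_name: str, event_dict: dict) -> dict:
    """Sketch of the defensive filter: at WARNING/ERROR, a PII key
    holding a non-string value means redaction did not run, so refuse
    to emit the line at all rather than leak it."""
    if method_name in ("warning", "error"):
        for key in PII_KEYS:
            value = event_dict.get(key)
            if value is not None and not isinstance(value, str):
                raise WouldLeakPII(key)
    return event_dict
```

A redacted event (value is the string `<redacted>`) passes through untouched; a raw integer amount kills the line.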
  • Step 18.2: Commit
Terminal window
git add docs/adr/0011-pii-masking-in-logs.md
git commit -m "docs(adr): ADR-0011 PII masking in logs by default
Co-Authored-By: Claude Opus 4.7 (1M context) <[email protected]>"

Task 19: ADR-0012 — Long-polling over webhook for MVP-1


Files:

  • Create: docs/adr/0012-long-polling-over-webhook.md

  • Step 19.1: Create the ADR

Create docs/adr/0012-long-polling-over-webhook.md:

---
type: adr
id: ADR-0012
status: accepted
date: 2026-04-28
deciders:
- "@zipsybok"
related: [ADR-0013]
tags: [telegram, ops, networking]
---
# ADR-0012 — Use long-polling (not webhooks) for Telegram updates in MVP-1
## Status
**Accepted** · *2026-04-28*
## Context
Telegram offers two delivery modes for bot updates: long-polling
(`getUpdates`) and webhooks (Telegram POSTs to your HTTPS endpoint).
MVP-1 hosts on a single VPS (ADR-0013) without a public-facing
domain or TLS termination. Traffic is sub-100 messages/day, latency
budget is "user doesn't notice" (a couple of seconds is fine).
### Considered alternatives
| Option | Summary | Pros | Cons | Outcome |
|--------|---------|------|------|---------|
| A — Webhook | Telegram POSTs to `https://bot.example.com/...` | Lowest latency; no polling overhead | Requires public domain, TLS cert (Let's Encrypt + renew), reverse proxy or aiohttp TLS, firewall rule, hardening against abuse — none of which add value at 100 msg/day | rejected |
| **B (chosen) — Long-polling** | Bot calls `getUpdates` with `timeout=30` | Works behind NAT; no domain or cert needed; trivial to develop locally; aiogram defaults | Slightly more wasted CPU on idle polling; one extra second of latency in the worst case | **selected** |
## Decision
We will use **`Dispatcher.start_polling(bot)`** with aiogram's default
long-polling (timeout 30 s, `skip_updates=False`). Telegram queues
updates for up to 24 h if the bot is offline; on restart we fetch
the backlog and dedupe via `app.processed_update.update_id`.
Webhook is a future ADR when:
- the bot must respond in < 1 s (currently no SLO requires it), OR
- the bot fans out to many users (current scope: 2), OR
- we add a web companion that already needs HTTPS (deferred to v1).
## Consequences
### Positive
- Zero infrastructure beyond the VPS + Telegram.
- Local development is identical to production: same `start_polling`
loop, no need for ngrok or local TLS proxy.
- No surface area for inbound abuse (no public HTTP endpoint).
### Negative / trade-offs
- Continuous outbound HTTPS connection from VPS to api.telegram.org.
Acceptable.
- ~30 s of update backlog is possible after a restart; idempotency
via `processed_update.update_id` makes redelivery safe.
### Neutral / follow-ups
- [ ] When traffic > 1 msg/sec sustained or external onboarding ships
in MVP-2, write a successor ADR proposing webhooks and revisit.
## References
- aiogram 3 docs — `Dispatcher.start_polling`.
- Telegram Bot API — `getUpdates` semantics, including 24 h queue.
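The redelivery-safety contract above (`app.processed_update.update_id`) reduces to "process each update_id at most once". An in-memory sketch of that check; the real middleware lands in a later plan and persists the set to Postgres:

```python
class UpdateDeduper:
    """In-memory model of the idempotency check backed by
    app.processed_update in the real bot: claim(update_id) returns
    True exactly once per id, so updates redelivered after a restart
    become no-ops."""

    def __init__(self) -> None:
        self._seen: set[int] = set()

    def claim(self, update_id: int) -> bool:
        """True if this update has not been processed before."""
        if update_id in self._seen:
            return False
        self._seen.add(update_id)
        return True
```

The Postgres version gets the same semantics for free from the `update_id` primary key: an `INSERT ... ON CONFLICT DO NOTHING` that affects zero rows means "already processed".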
  • Step 19.2: Commit
Terminal window
git add docs/adr/0012-long-polling-over-webhook.md
git commit -m "docs(adr): ADR-0012 long-polling over webhook for MVP-1
Co-Authored-By: Claude Opus 4.7 (1M context) <[email protected]>"

Task 20: ADR-0013 — Single VPS docker-compose


Files:

  • Create: docs/adr/0013-single-vps-docker-compose.md

  • Step 20.1: Create the ADR

Create docs/adr/0013-single-vps-docker-compose.md:

---
type: adr
id: ADR-0013
status: accepted
date: 2026-04-28
deciders:
- "@zipsybok"
related: [ADR-0004, ADR-0012]
tags: [ops, hosting]
---
# ADR-0013 — Run MVP-1 on a single VPS via docker-compose
## Status
**Accepted** · *2026-04-28*
## Context
MVP-1 traffic is < 100 messages/day from two known users. The bot
must be reachable from Telegram (long-polling — ADR-0012 — so no
inbound HTTPS endpoint required) and store data in PostgreSQL with
local Prometheus + Grafana.
The operator is a solo developer who is also the primary user. There
is no on-call rotation, no SLO commitment beyond "best effort", and
no compliance requirement.
### Considered alternatives
| Option | Summary | Pros | Cons | Outcome |
|--------|---------|------|------|---------|
| A — Managed Kubernetes (GKE / EKS) | Each service as a Deployment | "Production-grade"; trivial scale-out | Months of ops setup; ~$70/mo idle cost; comically over-engineered for 100 msg/day; no scale story justifies it | rejected |
| B — Self-hosted k3s on the VPS | k3s + Helm chart per service | Lighter than full k8s; Helm portfolio sample | Still wildly more YAML than two docker-compose services; troubleshooting eats focus | rejected |
| C — Managed services (Heroku / Fly / Render) | Each container as a managed dyno | Zero ops | $$ at idle; Postgres add-ons cost; vendor lock-in for a bot that should be trivially portable | rejected |
| **D (chosen) — Single VPS + docker-compose** | One $5–10/mo VPS, all containers via `docker compose up` | Simplest possible; portable across providers; all logs/metrics local | Single point of failure (unimportant for personal bot); manual scale-up later if needed | **selected** |
## Decision
We will host the entire MVP-1 stack on **one VPS** managed by
**docker-compose**. The compose file (`compose.yml`) defines four
services: `bot`, `postgres`, `prometheus`, `grafana`. Volumes are
named (not bind-mounts) so backup/restore is `docker run --rm -v
pgdata:/data ... tar`-style.
All host ports bind to `127.0.0.1` only. External access is via SSH
tunnel for Grafana and via Telegram itself for the bot. No reverse
proxy, no public TLS, no firewall rules beyond default-deny inbound.
## Consequences
### Positive
- Reproducible local dev: same `compose.yml` on laptop and on VPS.
- Two-line deploy: `docker compose pull && docker compose up -d`.
- ~$10/mo all-in (VPS only).
### Negative / trade-offs
- VPS reboot kills the service for ~30 s. Acceptable: Telegram
redelivers updates.
- No automatic horizontal scaling. Re-evaluate at MVP-2 if user count
grows past ~50.
- Single-VPS backup story is "rsync `pgdata` volume to S3 nightly" —
basic but sufficient. Will be formalized in Plan 8 deploy task.
### Neutral / follow-ups
- [ ] Plan 8: write a `make deploy` Makefile target that does
`docker compose pull && up -d --remove-orphans`.
- [ ] Plan 8: add nightly `pg_dump` cron + S3 upload.
- [ ] Re-evaluate at MVP-2 when external onboarding starts (potential
trigger to move to a small managed Postgres for backup off-host).
## References
- ADR-0004 (Modular monolith) — only one bot container is needed.
- ADR-0012 (Long-polling) — no inbound HTTPS required.
- compose.yml — current deployment definition.
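The Decision section above leaves the exact backup command elided. A hedged sketch of both ops paths it implies — named-volume backup and SSH-tunnel access — assuming the compose volume is literally named `pgdata` and `vps` is your SSH host alias (both are assumptions, not pinned down by this plan):

```shell
# Back up the named pgdata volume to a dated tarball in the current directory.
# Guarded: no-op unless docker and the volume actually exist on this host.
if command -v docker >/dev/null 2>&1 && docker volume inspect pgdata >/dev/null 2>&1; then
  docker run --rm -v pgdata:/data -v "$PWD":/backup alpine \
    tar czf "/backup/pgdata-$(date +%F).tar.gz" -C /data .
fi

# Reach Grafana (bound to 127.0.0.1 on the VPS) through an SSH tunnel:
#   ssh -N -L 3000:127.0.0.1:3000 vps
```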
  • Step 20.2: Commit

```bash
git add docs/adr/0013-single-vps-docker-compose.md
git commit -m "docs(adr): ADR-0013 single VPS docker-compose for MVP-1

Co-Authored-By: Claude Opus 4.7 (1M context) <[email protected]>"
```

Files:

  • Create: README.md

  • Step 21.1: Write the new README

Create README.md:

# telegram-finance-bot
A personal-finance Telegram bot. Records expenses, income, and inter-account
transfers from natural-language messages (`250 кафе`, `+5000 зарплата`,
`/transfer 1500 card->cash`) into a strictly double-entry ledger.
This is **MVP-1**: whitelisted users only (the author and his partner),
single hryvnia ledger, no multi-currency, no recurring transactions, no
goals. The system is built so the ledger backend can later be swapped from
PostgreSQL to **TigerBeetle** without touching domain or application code.
The project doubles as a System Analyst portfolio piece — the
**documentation** is the public artifact. Source remains private.
## Status
| | |
|---|---|
| Phase | MVP-1 (Tier-2 scope) |
| Branch | `feature/bootstrap` (Plan 1 of 8 in flight) |
| Design doc | [`docs/design/2026-04-27-mvp1-architecture.md`](docs/design/2026-04-27-mvp1-architecture.md) |
| ADRs | [`docs/adr/`](docs/adr/) |
| Plans | [`docs/plans/`](docs/plans/) |
## Tech stack
Python 3.12 · uv · aiogram 3 · asyncpg · SQLAlchemy + Alembic ·
PostgreSQL 16 · pydantic-settings · structlog · Prometheus + Grafana ·
Docker · GitHub Actions.
## Quickstart (development)
Prerequisites: `uv` (≥ 0.4), Docker, `docker compose`, a Telegram bot
token from `@BotFather`, and your Telegram numeric ID (ask
`@userinfobot`).
```bash
# 1. Install Python deps
uv sync
# 2. Configure env
cp .env.example .env
$EDITOR .env # paste bot token, your Telegram id, etc.
# 3. Bring up Postgres + Prometheus + Grafana
docker compose up -d postgres prometheus grafana
# 4. Apply migrations
uv run alembic upgrade head
# 5. Run the bot from the host
uv run python -m finance_bot
# 6. Send /start to the bot in Telegram
```

Grafana is at http://localhost:3000 (admin / admin).

```bash
uv run ruff check
uv run ruff format --check
uv run mypy
uv run pytest -v
```

Integration tests (require Docker for testcontainers-postgres) land in Plan 4.

See docs/design/2026-04-27-mvp1-architecture.md for the full architecture; the short version:

```text
src/finance_bot/
├── domain/       pure types — no I/O
├── ports/        Protocol interfaces
├── application/  use-cases (one file per use-case)
├── adapters/     PG, Telegram, observability
├── bootstrap.py  composition root
└── __main__.py   entry point
```

MIT.

- [ ] **Step 21.2: Verify `uv sync` resolves the README**

```bash
uv sync
```

Expected: completes without error (the `readme = "README.md"` reference now resolves).
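
The reference in question is presumably the standard readme field in the `[project]` table of `pyproject.toml`; a minimal sketch (the `name` value is illustrative, not taken from this plan):

```toml
[project]
name = "finance-bot"   # illustrative; the actual project name may differ
readme = "README.md"   # building the package fails if this file is missing
```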

  • Step 21.3: Commit

```bash
git add README.md uv.lock
git commit -m "docs: rewrite README in English (project overview + quickstart)

Replaces the original Russian tutorial README with project status,
tech stack, dev quickstart, and a layout pointer to the design doc.

Co-Authored-By: Claude Opus 4.7 (1M context) <[email protected]>"
```
  • Step 21.4: Retroactively complete deferred Task 10.5 (docker compose build bot)

Task 10 deferred this verification because `README.md` did not exist on disk; the Dockerfile’s `COPY pyproject.toml uv.lock README.md ./` would have failed. With Task 21 done, run it now:

```bash
docker compose build bot
```

Expected: the image builds successfully end-to-end (builder stage syncs deps via uv 0.5.4, runtime stage produces a non-root `bot` user image with `EXPOSE 9090`). No need to run the container — `__main__.py` is the polling entry point and we don’t have a Telegram token in this verification path. If the build fails, fix the Dockerfile or this task accordingly. No commit required for this step (it’s verification only).
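
For orientation, a multi-stage Dockerfile matching that description might look like the sketch below. This is an assumption-laden illustration (uv binary image tag, user name, venv path are all guesses); the actual Dockerfile from Task 10 is authoritative.

```dockerfile
# Builder stage: sync locked dependencies with uv into a project-local venv.
FROM python:3.12-slim AS builder
COPY --from=ghcr.io/astral-sh/uv:0.5.4 /uv /usr/local/bin/uv
WORKDIR /app
COPY pyproject.toml uv.lock README.md ./
COPY src/ src/
RUN uv sync --frozen --no-dev

# Runtime stage: non-root user, metrics port exposed, polling entry point.
FROM python:3.12-slim
RUN useradd --create-home bot
WORKDIR /app
COPY --from=builder /app /app
ENV PATH="/app/.venv/bin:$PATH"
USER bot
EXPOSE 9090
CMD ["python", "-m", "finance_bot"]
```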


Files:

  • Create: .github/workflows/sanity.yml

  • Step 22.1: Create the workflow

```bash
mkdir -p .github/workflows
```

Create .github/workflows/sanity.yml:

```yaml
name: sanity
on:
  push:
    branches: ["**"]
  pull_request:
    branches: [master]
jobs:
  lint:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: astral-sh/setup-uv@v3
        with:
          version: "0.5.4"
      - run: uv python install 3.12
      - run: uv sync --frozen
      - run: uv run ruff check
      - run: uv run ruff format --check
  typecheck:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: astral-sh/setup-uv@v3
        with:
          version: "0.5.4"
      - run: uv python install 3.12
      - run: uv sync --frozen
      - run: uv run mypy
  unit:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: astral-sh/setup-uv@v3
        with:
          version: "0.5.4"
      - run: uv python install 3.12
      - run: uv sync --frozen
      - run: uv run pytest tests/unit -v
  migrations:
    runs-on: ubuntu-latest
    services:
      postgres:
        image: postgres:16-alpine
        env:
          POSTGRES_DB: finance_bot
          POSTGRES_USER: finance_bot
          POSTGRES_PASSWORD: finance_bot
        ports: ["5432:5432"]
        options: >-
          --health-cmd "pg_isready -U finance_bot"
          --health-interval 5s
          --health-timeout 3s
          --health-retries 10
    steps:
      - uses: actions/checkout@v4
      - uses: astral-sh/setup-uv@v3
        with:
          version: "0.5.4"
      - run: uv python install 3.12
      - run: uv sync --frozen
      - name: Create app + ledger schemas
        run: |
          PGPASSWORD=finance_bot psql -h localhost -U finance_bot -d finance_bot \
            -c "CREATE SCHEMA IF NOT EXISTS app; CREATE SCHEMA IF NOT EXISTS ledger;"
      - name: alembic upgrade head
        env:
          TELEGRAM_BOT_TOKEN: 1234567890:stub
          WHITELIST_TELEGRAM_IDS: "1"
          DATABASE_URL: postgresql+asyncpg://finance_bot:finance_bot@localhost:5432/finance_bot
        run: uv run alembic upgrade head
      - name: alembic downgrade base
        env:
          TELEGRAM_BOT_TOKEN: 1234567890:stub
          WHITELIST_TELEGRAM_IDS: "1"
          DATABASE_URL: postgresql+asyncpg://finance_bot:finance_bot@localhost:5432/finance_bot
        run: uv run alembic downgrade base
```
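
The stub env vars in the migrations job hint at the settings contract: `WHITELIST_TELEGRAM_IDS` is a comma-separated list of Telegram numeric IDs. A stdlib-only sketch of how such a value might be parsed — the real project presumably does this inside its pydantic-settings model, so treat `parse_whitelist` as a hypothetical helper:

```python
import os


def parse_whitelist(raw: str) -> frozenset[int]:
    """Parse a comma-separated list of Telegram numeric IDs.

    Hypothetical helper for illustration; empty entries are skipped so a
    trailing comma in a .env file is harmless.
    """
    return frozenset(int(part) for part in raw.split(",") if part.strip())


# Mirror the CI stub value: WHITELIST_TELEGRAM_IDS: "1"
os.environ.setdefault("WHITELIST_TELEGRAM_IDS", "1")
whitelist = parse_whitelist(os.environ["WHITELIST_TELEGRAM_IDS"])
print(whitelist)  # → frozenset({1})
```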
  • Step 22.2: Commit

```bash
git add .github/workflows/sanity.yml
git commit -m "ci: add sanity workflow (lint, typecheck, unit, migrations)

Runs on every branch push and on PRs to master:

- lint: ruff check + ruff format --check
- typecheck: mypy
- unit: pytest tests/unit
- migrations: alembic upgrade head + downgrade base against PG 16
  service container

Plans 4-6 add integration and e2e jobs.

Co-Authored-By: Claude Opus 4.7 (1M context) <[email protected]>"
```
  • Step 22.3: Push the branch and watch CI go green

```bash
git push -u origin feature/bootstrap
```

In the GitHub UI, open the Actions tab and watch the sanity workflow run on `feature/bootstrap`. Expected: all four jobs (lint, typecheck, unit, migrations) pass.

If a job fails: investigate, fix, push a follow-up commit (do NOT amend — keep the trail of what broke). Recurring fixes go in this same task.


The plan is done when:

  • All 22 tasks are committed on `feature/bootstrap`.
  • `uv run ruff check && uv run ruff format --check && uv run mypy && uv run pytest -v` is green locally.
  • `docker compose up -d postgres && uv run alembic upgrade head` succeeds against a fresh volume.
  • Sending `/start` to the configured Telegram bot returns the bootstrap message.
  • `curl http://localhost:9090/metrics` returns Prometheus exposition with `bot_started_total` present.
  • Grafana on `localhost:3000` shows the auto-provisioned Prometheus datasource.
  • The GitHub Actions sanity workflow on `feature/bootstrap` is fully green.
  • `feature/bootstrap` is merged into `master` (squash or rebase — operator’s choice).

After merge, Plan 2 (Domain & Ports) starts on a new worktree.