Add Traefik file-provider generator and docs

Bas Nijholt
2025-12-13 09:51:43 -08:00
parent d432667584
commit 7d0f829895
6 changed files with 728 additions and 77 deletions

1
.gitignore vendored

@@ -39,3 +39,4 @@ htmlcov/
 # Local config (don't commit real configs)
 compose-farm.yaml
 !examples/compose-farm.yaml
+coverage.xml

106
PLAN.md

@@ -1,83 +1,35 @@
-# SDC (Simple Distributed Compose) - Implementation Plan
-## Overview
-Minimal CLI tool to run docker compose commands on remote hosts via SSH.
-## Tech Stack
-- **uv** - project init & dependency management
-- **Hatch** - build backend
-- **Typer** - CLI
-- **Pydantic** - config parsing
-- **asyncssh** - async SSH with streaming
-## Project Structure
-```
-sdc/
-├── pyproject.toml
-├── src/
-│   └── sdc/
-│       ├── __init__.py
-│       ├── cli.py      # Typer CLI
-│       ├── config.py   # Pydantic models
-│       └── ssh.py      # SSH execution
-└── sdc.yaml            # Example config
-```
-## Config Schema (`~/.config/sdc/sdc.yaml` or `./sdc.yaml`)
-```yaml
-compose_dir: /opt/compose
-hosts:
-  nas01:
-    address: 192.168.1.10
-    user: docker  # optional, defaults to current user
-  nas02:
-    address: 192.168.1.11
-services:
-  plex: nas01
-  jellyfin: nas02
-```
-## CLI Commands
-```bash
-sdc up <service...>       # docker compose up -d
-sdc down <service...>     # docker compose down
-sdc pull <service...>     # docker compose pull
-sdc restart <service...>  # down + up
-sdc logs <service...>     # stream logs
-sdc ps                    # show all services & status
-sdc update <service...>   # pull + restart (end-to-end update)
-# Flags
---all                     # run on all services
-```
-## Implementation Steps
-### Step 1: Project Setup
-- `uv init sdc`
-- Configure pyproject.toml with Hatch build backend
-- Add dependencies: typer, pydantic, asyncssh, pyyaml
-### Step 2: Config Module (`config.py`)
-- Pydantic models: `Host`, `Config`
-- Config loading: check `./sdc.yaml` then `~/.config/sdc/sdc.yaml`
-- Validation: ensure services reference valid hosts
-### Step 3: SSH Module (`ssh.py`)
-- `run_command(host, command)` - async SSH with streaming
-- `run_compose(service, compose_cmd)` - build and run compose command
-- `run_on_services(services, compose_cmd)` - parallel execution
-### Step 4: CLI Module (`cli.py`)
-- Typer app with commands: up, down, pull, restart, logs, ps, update
-- `--all` flag for operating on all services
-- Streaming output with service name prefix
-## Design Decisions
-1. **asyncssh** - native async, built-in streaming stdout/stderr
-2. **SSH key auth** - uses ssh-agent, no password handling
-3. **Parallel by default** - asyncio.gather for multiple services
-4. **Streaming output** - real-time with `[service]` prefix
-5. **Compose path** - `{compose_dir}/{service}/docker-compose.yml`
+# Compose Farm Traefik Multihost Ingress Plan
+## Goal
+Generate a Traefik file-provider fragment from existing docker-compose Traefik labels (no config duplication) so a single front-door Traefik on 192.168.1.66 with wildcard `*.lab.mydomain.org` can route to services running on other hosts. Keep the current simplicity (SSH + docker compose); no Swarm/K8s.
+## Requirements
+- Traefik stays on main host; keep current `dynamic.yml` and Docker provider for local containers.
+- Add a watched directory provider (any path works) and load a generated fragment (e.g., `compose-farm.generated.yml`).
+- No edits to compose files: reuse existing `traefik.*` labels as the single source of truth; Compose Farm only reads them.
+- Generator infers routing from labels and reachability from `ports:` mappings; prefer host-published ports so Traefik can reach services across hosts. Upstreams point to `<host address>:<published host port>`; warn if no published port is found.
+- Only minimal data in `compose-farm.yaml`: hosts map and service→host mapping (already present).
+- No new orchestration/discovery layers; respect KISS/YAGNI/DRY.
+## Non-Goals
+- No Swarm/Kubernetes adoption.
+- No global Docker provider across hosts.
+- No health checks/service discovery layer.
+## Current State (Dec 2025)
+- Compose Farm: Typer CLI wrapping `docker compose` over SSH; config in `compose-farm.yaml`; parallel by default; snapshot/log tooling present.
+- Traefik: single instance on 192.168.1.66, wildcard `*.lab.mydomain.org`, Docker provider for local services, file provider via `dynamic.yml` already in use.
+## Proposed Implementation Steps
+1) Add generator command: `compose-farm traefik-file --output <path>`.
+2) Resolve per-service host from `compose-farm.yaml`; read compose file at `{compose_dir}/{service}/docker-compose.yml`.
+3) Parse `traefik.*` labels to build routers/services/middlewares as in compose; map container port to published host port (from `ports:`) to form upstream URLs with host address.
+4) Emit file-provider YAML to the watched directory (recommended default: `/mnt/data/traefik/dynamic.d/compose-farm.generated.yml`, but user chooses via `--output`).
+5) Warnings: if no published port is found, warn that cross-host reachability requires L3 reachability to container IPs.
+6) Tests: label parsing, port mapping, YAML render; scenario with published port; scenario without published port.
+7) Docs: update README/CLAUDE to describe directory provider flags and the generator workflow; note that compose files remain unchanged.
+## Open Questions
+- How to derive target host address: use `hosts.<name>.address` verbatim, or allow override per service? (Default: use host address.)
+- Should we support multiple hosts/backends per service for LB/HA? (Start with single server.)
+- Where to store generated file by default? (Default to user-specified `--output`; maybe fallback to `./compose-farm-traefik.yml`.)
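For reference, the minimal `compose-farm.yaml` the generator relies on is just the hosts map and the service→host mapping; a sketch with illustrative paths and addresses:

```yaml
compose_dir: /opt/compose
hosts:
  nas01:
    address: 192.168.1.10
  nas02:
    address: 192.168.1.11
services:
  plex: nas01
  jellyfin: nas02
```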


@@ -92,6 +92,60 @@ compose-farm logs -f plex # follow
compose-farm ps
```
## Traefik Multihost Ingress (File Provider)
If you run a single Traefik instance on one “front-door” host and want it to route to
Compose Farm services on other hosts, Compose Farm can generate a Traefik file-provider
fragment from your existing compose labels.
**How it works**
- Your `docker-compose.yml` remains the source of truth. Put normal `traefik.*` labels on
the container you want exposed.
- Labels and port specs may use `${VAR}` / `${VAR:-default}`; Compose Farm resolves these
using the stack's `.env` file and your current environment, just like Docker Compose.
- Publish a host port for that container (via `ports:`). The generator prefers
host-published ports so Traefik can reach the service across hosts; if none are found,
it warns and you'd need L3 reachability to container IPs.
- If a router label doesn't specify `traefik.http.routers.<name>.service` and there's only
one Traefik service defined on that container, Compose Farm wires the router to it.
- `compose-farm.yaml` stays unchanged: just `hosts` and `services: service → host`.
Example `docker-compose.yml` pattern:
```yaml
services:
plex:
ports: ["32400:32400"]
labels:
- traefik.enable=true
- traefik.http.routers.plex.rule=Host(`plex.lab.mydomain.org`)
- traefik.http.routers.plex.entrypoints=websecure
- traefik.http.routers.plex.tls.certresolver=letsencrypt
- traefik.http.services.plex.loadbalancer.server.port=32400
```
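For the `plex` example above, the emitted fragment would look roughly like this (illustrative; the server URL combines the host's address from `compose-farm.yaml`, here assumed to be `192.168.1.10`, with the published port):

```yaml
http:
  routers:
    plex:
      rule: Host(`plex.lab.mydomain.org`)
      entrypoints: [websecure]
      tls:
        certresolver: letsencrypt
      service: plex
  services:
    plex:
      loadbalancer:
        servers:
          - url: http://192.168.1.10:32400
```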
**One-time Traefik setup**
Enable a file provider watching a directory (any path is fine; a common choice is on your
shared/NFS mount):
```yaml
providers:
file:
directory: /mnt/data/traefik/dynamic.d
watch: true
```
**Generate the fragment**
```bash
compose-farm traefik-file --output /mnt/data/traefik/dynamic.d/compose-farm.generated.yml
```
Re-run this after changing Traefik labels, moving a service to another host, or changing
published ports.
## Requirements
- Python 3.11+


@@ -7,6 +7,7 @@ from pathlib import Path
from typing import TYPE_CHECKING, Annotated, TypeVar

import typer
import yaml

from .config import Config, load_config
from .logs import snapshot_services
@@ -181,5 +182,42 @@ def snapshot(
    typer.echo(f"Snapshot written to {path}")


@app.command("traefik-file")
def traefik_file(
    services: ServicesArg = None,
    all_services: AllOption = False,
    output: Annotated[
        Path | None,
        typer.Option(
            "--output",
            "-o",
            help="Write Traefik file-provider YAML to this path (stdout if omitted)",
        ),
    ] = None,
    config: ConfigOption = None,
) -> None:
    """Generate a Traefik file-provider fragment from compose Traefik labels."""
    from .traefik import generate_traefik_config

    svc_list, cfg = _get_services(services or [], all_services, config)
    try:
        dynamic, warnings = generate_traefik_config(cfg, svc_list)
    except (FileNotFoundError, ValueError) as exc:
        typer.echo(str(exc), err=True)
        raise typer.Exit(1) from exc
    rendered = yaml.safe_dump(dynamic, sort_keys=False)
    if output:
        output.parent.mkdir(parents=True, exist_ok=True)
        output.write_text(rendered)
        typer.echo(f"Traefik config written to {output}")
    else:
        typer.echo(rendered)
    for warning in warnings:
        typer.echo(warning, err=True)


if __name__ == "__main__":
    app()

411
src/compose_farm/traefik.py Normal file

@@ -0,0 +1,411 @@
"""Generate Traefik file-provider config from compose labels.

Compose Farm keeps compose files as the source of truth for Traefik routing.
This module reads `traefik.*` labels from a stack's docker-compose.yml and
emits an equivalent file-provider fragment with upstream servers rewritten to
use host-published ports for cross-host reachability.
"""

from __future__ import annotations

import os
import re
from dataclasses import dataclass
from typing import TYPE_CHECKING, Any

import yaml

if TYPE_CHECKING:
    from pathlib import Path

    from .config import Config


@dataclass(frozen=True)
class PortMapping:
    """Port mapping for a compose service."""

    target: int
    published: int | None
    protocol: str | None = None


@dataclass
class TraefikServiceSource:
    """Source information to build an upstream for a Traefik service."""

    traefik_service: str
    stack: str
    compose_service: str
    host_address: str
    ports: list[PortMapping]
    container_port: int | None = None
    scheme: str | None = None


LIST_VALUE_KEYS = {"entrypoints", "middlewares"}
SINGLE_PART = 1
PUBLISHED_TARGET_PARTS = 2
HOST_PUBLISHED_PARTS = 3
MIN_ROUTER_PARTS = 3
MIN_SERVICE_LABEL_PARTS = 6
_VAR_PATTERN = re.compile(r"\$\{([A-Za-z_][A-Za-z0-9_]*)(?::-(.*?))?\}")
def _load_env(compose_path: Path) -> dict[str, str]:
    """Load environment variables for compose interpolation."""
    env: dict[str, str] = {}
    env_path = compose_path.parent / ".env"
    if env_path.exists():
        for line in env_path.read_text().splitlines():
            stripped = line.strip()
            if not stripped or stripped.startswith("#") or "=" not in stripped:
                continue
            key, value = stripped.split("=", 1)
            key = key.strip()
            value = value.strip()
            if (value.startswith('"') and value.endswith('"')) or (
                value.startswith("'") and value.endswith("'")
            ):
                value = value[1:-1]
            env[key] = value
    env.update({k: v for k, v in os.environ.items() if isinstance(v, str)})
    return env


def _interpolate(value: str, env: dict[str, str]) -> str:
    """Perform a minimal `${VAR}`/`${VAR:-default}` interpolation."""

    def replace(match: re.Match[str]) -> str:
        var = match.group(1)
        default = match.group(2)
        resolved = env.get(var)
        if resolved:
            return resolved
        return default or ""

    return _VAR_PATTERN.sub(replace, value)


def _normalize_labels(raw: Any, env: dict[str, str]) -> dict[str, str]:
    if raw is None:
        return {}
    if isinstance(raw, dict):
        return {
            _interpolate(str(k), env): _interpolate(str(v), env)
            for k, v in raw.items()
            if k is not None
        }
    if isinstance(raw, list):
        labels: dict[str, str] = {}
        for item in raw:
            if not isinstance(item, str) or "=" not in item:
                continue
            key_raw, value_raw = item.split("=", 1)
            key = _interpolate(key_raw.strip(), env)
            value = _interpolate(value_raw.strip(), env)
            labels[key] = value
        return labels
    return {}
def _parse_ports(raw: Any, env: dict[str, str]) -> list[PortMapping]:  # noqa: PLR0912
    if raw is None:
        return []
    mappings: list[PortMapping] = []
    items = raw if isinstance(raw, list) else [raw]
    for item in items:
        if isinstance(item, str):
            interpolated = _interpolate(item, env)
            port_spec, _, protocol = interpolated.partition("/")
            parts = port_spec.split(":")
            published: int | None = None
            target: int | None = None
            if len(parts) == SINGLE_PART and parts[0].isdigit():
                target = int(parts[0])
            elif len(parts) == PUBLISHED_TARGET_PARTS and parts[0].isdigit() and parts[1].isdigit():
                published = int(parts[0])
                target = int(parts[1])
            elif len(parts) == HOST_PUBLISHED_PARTS and parts[-2].isdigit() and parts[-1].isdigit():
                published = int(parts[-2])
                target = int(parts[-1])
            if target is not None:
                mappings.append(
                    PortMapping(target=target, published=published, protocol=protocol or None)
                )
        elif isinstance(item, dict):
            target_raw = item.get("target")
            if isinstance(target_raw, str):
                target_raw = _interpolate(target_raw, env)
            if target_raw is None:
                continue
            try:
                target_val = int(str(target_raw))
            except (TypeError, ValueError):
                continue
            published_raw = item.get("published")
            if isinstance(published_raw, str):
                published_raw = _interpolate(published_raw, env)
            published_val: int | None
            try:
                published_val = int(str(published_raw)) if published_raw is not None else None
            except (TypeError, ValueError):
                published_val = None
            protocol_val = item.get("protocol")
            mappings.append(
                PortMapping(
                    target=target_val,
                    published=published_val,
                    protocol=str(protocol_val) if protocol_val else None,
                )
            )
    return mappings
def _parse_value(key: str, raw_value: str) -> Any:
    value = raw_value.strip()
    lower = value.lower()
    if lower in {"true", "false"}:
        return lower == "true"
    if value.isdigit():
        return int(value)
    last_segment = key.rsplit(".", 1)[-1]
    if last_segment in LIST_VALUE_KEYS:
        parts = [v.strip() for v in value.split(",")] if "," in value else [value]
        return [part for part in parts if part]
    return value


def _parse_segment(segment: str) -> tuple[str, int | None]:
    if "[" in segment and segment.endswith("]"):
        name, index_raw = segment[:-1].split("[", 1)
        if index_raw.isdigit():
            return name, int(index_raw)
    return segment, None
def _insert(root: dict[str, Any], key_path: list[str], value: Any) -> None:  # noqa: PLR0912
    current: Any = root
    for idx, segment in enumerate(key_path):
        is_last = idx == len(key_path) - 1
        name, list_index = _parse_segment(segment)
        if list_index is None:
            if is_last:
                if not isinstance(current, dict):
                    return
                current[name] = value
            else:
                if not isinstance(current, dict):
                    return
                next_container = current.get(name)
                if not isinstance(next_container, dict):
                    next_container = {}
                current[name] = next_container
                current = next_container
            continue
        if not isinstance(current, dict):
            return
        container_list = current.get(name)
        if not isinstance(container_list, list):
            container_list = []
        current[name] = container_list
        while len(container_list) <= list_index:
            container_list.append({})
        if is_last:
            container_list[list_index] = value
        else:
            if not isinstance(container_list[list_index], dict):
                container_list[list_index] = {}
            current = container_list[list_index]
def _resolve_published_port(source: TraefikServiceSource) -> tuple[int | None, str | None]:
    """Resolve host-published port for a Traefik service.

    Returns (published_port, warning_message).
    """
    published_ports = [m for m in source.ports if m.published is not None]
    if not published_ports:
        return None, None
    if source.container_port is not None:
        for mapping in published_ports:
            if mapping.target == source.container_port:
                return mapping.published, None
        if len(published_ports) == 1:
            port = published_ports[0].published
            warn = (
                f"[{source.stack}/{source.compose_service}] "
                f"No published port matches container port {source.container_port} "
                f"for Traefik service '{source.traefik_service}', using {port}."
            )
            return port, warn
        return None, (
            f"[{source.stack}/{source.compose_service}] "
            f"No published port matches container port {source.container_port} "
            f"for Traefik service '{source.traefik_service}'."
        )
    if len(published_ports) == 1:
        return published_ports[0].published, None
    return None, (
        f"[{source.stack}/{source.compose_service}] "
        f"Multiple published ports found for Traefik service '{source.traefik_service}', "
        "but no loadbalancer.server.port label to disambiguate."
    )
def generate_traefik_config(  # noqa: C901, PLR0912, PLR0915
    config: Config,
    services: list[str],
) -> tuple[dict[str, Any], list[str]]:
    """Generate Traefik dynamic config from compose labels.

    Returns (config_dict, warnings).
    """
    dynamic: dict[str, Any] = {}
    warnings: list[str] = []
    sources: dict[str, TraefikServiceSource] = {}
    for stack in services:
        compose_path = config.get_compose_path(stack)
        if not compose_path.exists():
            message = f"[{stack}] Compose file not found: {compose_path}"
            raise FileNotFoundError(message)
        env = _load_env(compose_path)
        compose_data = yaml.safe_load(compose_path.read_text()) or {}
        raw_services = compose_data.get("services", {})
        if not isinstance(raw_services, dict):
            continue
        host_address = config.get_host(stack).address
        for compose_service, definition in raw_services.items():
            if not isinstance(definition, dict):
                continue
            labels = _normalize_labels(definition.get("labels"), env)
            if not labels:
                continue
            enable_raw = labels.get("traefik.enable")
            if enable_raw is not None and _parse_value("enable", enable_raw) is False:
                continue
            ports = _parse_ports(definition.get("ports"), env)
            routers: dict[str, bool] = {}
            service_names: set[str] = set()
            for label_key, label_value in labels.items():
                if not label_key.startswith("traefik."):
                    continue
                if label_key in {"traefik.enable", "traefik.docker.network"}:
                    continue
                key_without_prefix = label_key[len("traefik.") :]
                if not key_without_prefix.startswith(("http.", "tcp.", "udp.")):
                    continue
                _insert(
                    dynamic,
                    key_without_prefix.split("."),
                    _parse_value(key_without_prefix, label_value),
                )
                if key_without_prefix.startswith("http.routers."):
                    router_parts = key_without_prefix.split(".")
                    if len(router_parts) >= MIN_ROUTER_PARTS:
                        router_name = router_parts[2]
                        router_remainder = router_parts[3:]
                        explicit = routers.get(router_name, False)
                        if router_remainder == ["service"]:
                            explicit = True
                        routers[router_name] = explicit
                if not key_without_prefix.startswith("http.services."):
                    continue
                parts = key_without_prefix.split(".")
                if len(parts) < MIN_SERVICE_LABEL_PARTS:
                    continue
                traefik_service = parts[2]
                service_names.add(traefik_service)
                remainder = parts[3:]
                source = sources.get(traefik_service)
                if source is None:
                    source = TraefikServiceSource(
                        traefik_service=traefik_service,
                        stack=stack,
                        compose_service=compose_service,
                        host_address=host_address,
                        ports=ports,
                    )
                    sources[traefik_service] = source
                if remainder == ["loadbalancer", "server", "port"]:
                    parsed = _parse_value(key_without_prefix, label_value)
                    if isinstance(parsed, int):
                        source.container_port = parsed
                elif remainder == ["loadbalancer", "server", "scheme"]:
                    source.scheme = str(_parse_value(key_without_prefix, label_value))
            if routers:
                if len(service_names) == 1:
                    default_service = next(iter(service_names))
                    for router_name, explicit in routers.items():
                        if explicit:
                            continue
                        _insert(
                            dynamic,
                            ["http", "routers", router_name, "service"],
                            default_service,
                        )
                elif len(service_names) == 0:
                    for router_name, explicit in routers.items():
                        if explicit:
                            continue
                        warnings.append(
                            f"[{stack}/{compose_service}] Router '{router_name}' has no service "
                            "and no traefik.http.services labels were found."
                        )
                else:
                    for router_name, explicit in routers.items():
                        if explicit:
                            continue
                        warnings.append(
                            f"[{stack}/{compose_service}] Router '{router_name}' has no explicit "
                            "service and multiple Traefik services are defined; add "
                            f"traefik.http.routers.{router_name}.service."
                        )
    for traefik_service, source in sources.items():
        published_port, warn = _resolve_published_port(source)
        if warn:
            warnings.append(warn)
        if published_port is None:
            warnings.append(
                f"[{source.stack}/{source.compose_service}] "
                f"No host-published port found for Traefik service '{traefik_service}'. "
                "Traefik will require L3 reachability to container IPs."
            )
            continue
        scheme = source.scheme or "http"
        upstream_url = f"{scheme}://{source.host_address}:{published_port}"
        http_section = dynamic.setdefault("http", {})
        services_section = http_section.setdefault("services", {})
        service_cfg = services_section.setdefault(traefik_service, {})
        lb_cfg = service_cfg.setdefault("loadbalancer", {})
        if isinstance(lb_cfg, dict):
            lb_cfg.pop("server", None)
            lb_cfg["servers"] = [{"url": upstream_url}]
    return dynamic, warnings
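The heart of the module above is `_insert`, which turns dotted Traefik label keys (including `domains[0]`-style list indices) into nested dicts. A stripped-down standalone sketch of the same idea (simplified: no collision handling, hypothetical `insert_label` helper):

```python
import re
from typing import Any

# Matches a trailing list index such as "domains[0]"
_INDEX = re.compile(r"^(.*)\[(\d+)\]$")


def insert_label(root: dict[str, Any], key: str, value: Any) -> None:
    """Insert a dotted key like 'http.routers.plex.rule' as nested dicts/lists."""
    current: Any = root
    parts = key.split(".")
    for i, segment in enumerate(parts):
        last = i == len(parts) - 1
        m = _INDEX.match(segment)
        if m:  # list-index segment, e.g. domains[0]
            name, idx = m.group(1), int(m.group(2))
            lst = current.setdefault(name, [])
            while len(lst) <= idx:  # pad the list up to the requested index
                lst.append({})
            if last:
                lst[idx] = value
            else:
                current = lst[idx]
        elif last:
            current[segment] = value
        else:
            current = current.setdefault(segment, {})


cfg: dict[str, Any] = {}
insert_label(cfg, "http.routers.plex.rule", "Host(`plex.lab.mydomain.org`)")
insert_label(cfg, "http.routers.plex.tls.domains[0].main", "plex.lab.mydomain.org")
print(cfg["http"]["routers"]["plex"]["tls"]["domains"][0]["main"])
# → plex.lab.mydomain.org
```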

195
tests/test_traefik.py Normal file

@@ -0,0 +1,195 @@
"""Tests for Traefik config generator."""

from pathlib import Path

import yaml

from compose_farm.config import Config, Host
from compose_farm.traefik import generate_traefik_config


def _write_compose(path: Path, data: dict[str, object]) -> None:
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(yaml.safe_dump(data, sort_keys=False))
def test_generate_traefik_config_with_published_port(tmp_path: Path) -> None:
    cfg = Config(
        compose_dir=tmp_path,
        hosts={"nas01": Host(address="192.168.1.10")},
        services={"plex": "nas01"},
    )
    compose_path = tmp_path / "plex" / "docker-compose.yml"
    _write_compose(
        compose_path,
        {
            "services": {
                "plex": {
                    "ports": ["32400:32400"],
                    "labels": [
                        "traefik.enable=true",
                        "traefik.http.routers.plex.rule=Host(`plex.lab.mydomain.org`)",
                        "traefik.http.routers.plex.entrypoints=web,websecure",
                        "traefik.http.routers.plex.tls.domains[0].main=plex.lab.mydomain.org",
                        "traefik.http.services.plex.loadbalancer.server.port=32400",
                    ],
                }
            }
        },
    )
    dynamic, warnings = generate_traefik_config(cfg, ["plex"])
    assert warnings == []
    assert dynamic["http"]["routers"]["plex"]["rule"] == "Host(`plex.lab.mydomain.org`)"
    assert dynamic["http"]["routers"]["plex"]["entrypoints"] == ["web", "websecure"]
    assert (
        dynamic["http"]["routers"]["plex"]["tls"]["domains"][0]["main"] == "plex.lab.mydomain.org"
    )
    servers = dynamic["http"]["services"]["plex"]["loadbalancer"]["servers"]
    assert servers == [{"url": "http://192.168.1.10:32400"}]
def test_generate_traefik_config_without_published_port_warns(tmp_path: Path) -> None:
    cfg = Config(
        compose_dir=tmp_path,
        hosts={"nas01": Host(address="192.168.1.10")},
        services={"app": "nas01"},
    )
    compose_path = tmp_path / "app" / "docker-compose.yml"
    _write_compose(
        compose_path,
        {
            "services": {
                "app": {
                    "ports": ["8080"],
                    "labels": [
                        "traefik.http.routers.app.rule=Host(`app.lab.mydomain.org`)",
                        "traefik.http.services.app.loadbalancer.server.port=8080",
                    ],
                }
            }
        },
    )
    dynamic, warnings = generate_traefik_config(cfg, ["app"])
    assert dynamic["http"]["routers"]["app"]["rule"] == "Host(`app.lab.mydomain.org`)"
    assert any("No host-published port found" in warning for warning in warnings)
def test_generate_interpolates_env_and_infers_router_service(tmp_path: Path) -> None:
    cfg = Config(
        compose_dir=tmp_path,
        hosts={"nas01": Host(address="192.168.1.10")},
        services={"wakapi": "nas01"},
    )
    compose_dir = tmp_path / "wakapi"
    compose_dir.mkdir(parents=True, exist_ok=True)
    (compose_dir / ".env").write_text("DOMAIN=lab.mydomain.org\n")
    compose_path = compose_dir / "docker-compose.yml"
    _write_compose(
        compose_path,
        {
            "services": {
                "wakapi": {
                    "ports": ["3009:3000"],
                    "labels": [
                        "traefik.enable=true",
                        "traefik.http.routers.wakapi.rule=Host(`wakapi.${DOMAIN}`)",
                        "traefik.http.routers.wakapi.entrypoints=websecure",
                        "traefik.http.routers.wakapi-local.rule=Host(`wakapi.local`)",
                        "traefik.http.routers.wakapi-local.entrypoints=web",
                        "traefik.http.services.wakapi.loadbalancer.server.port=3000",
                    ],
                }
            }
        },
    )
    dynamic, warnings = generate_traefik_config(cfg, ["wakapi"])
    assert warnings == []
    routers = dynamic["http"]["routers"]
    assert routers["wakapi"]["rule"] == "Host(`wakapi.lab.mydomain.org`)"
    assert routers["wakapi"]["entrypoints"] == ["websecure"]
    assert routers["wakapi-local"]["entrypoints"] == ["web"]
    assert routers["wakapi-local"]["service"] == "wakapi"
    servers = dynamic["http"]["services"]["wakapi"]["loadbalancer"]["servers"]
    assert servers == [{"url": "http://192.168.1.10:3009"}]
def test_generate_interpolates_label_keys_and_ports(tmp_path: Path) -> None:
    cfg = Config(
        compose_dir=tmp_path,
        hosts={"nas01": Host(address="192.168.1.10")},
        services={"supabase": "nas01"},
    )
    compose_dir = tmp_path / "supabase"
    compose_dir.mkdir(parents=True, exist_ok=True)
    (compose_dir / ".env").write_text(
        "CONTAINER_PREFIX=supa\n"
        "SUBDOMAIN=api\n"
        "DOMAIN=lab.mydomain.org\n"
        "PUBLIC_DOMAIN=public.example.org\n"
        "KONG_HTTP_PORT=8000\n"
    )
    compose_path = compose_dir / "docker-compose.yml"
    _write_compose(
        compose_path,
        {
            "services": {
                "kong": {
                    "ports": ["${KONG_HTTP_PORT}:8000/tcp"],
                    "labels": [
                        "traefik.enable=true",
                        "traefik.http.routers.${CONTAINER_PREFIX}.rule=Host(`${SUBDOMAIN}.${DOMAIN}`) || Host(`${SUBDOMAIN}.${PUBLIC_DOMAIN}`)",
                        "traefik.http.routers.${CONTAINER_PREFIX}-studio.rule=Host(`studio.${DOMAIN}`)",
                        "traefik.http.services.${CONTAINER_PREFIX}.loadbalancer.server.port=8000",
                    ],
                }
            }
        },
    )
    dynamic, warnings = generate_traefik_config(cfg, ["supabase"])
    assert warnings == []
    routers = dynamic["http"]["routers"]
    assert "supa" in routers
    assert "supa-studio" in routers
    assert routers["supa"]["service"] == "supa"
    assert routers["supa-studio"]["service"] == "supa"
    servers = dynamic["http"]["services"]["supa"]["loadbalancer"]["servers"]
    assert servers == [{"url": "http://192.168.1.10:8000"}]
def test_generate_skips_services_with_enable_false(tmp_path: Path) -> None:
    cfg = Config(
        compose_dir=tmp_path,
        hosts={"nas01": Host(address="192.168.1.10")},
        services={"stack": "nas01"},
    )
    compose_path = tmp_path / "stack" / "docker-compose.yml"
    _write_compose(
        compose_path,
        {
            "services": {
                "studio": {
                    "ports": ["3000:3000"],
                    "labels": [
                        "traefik.enable=false",
                        "traefik.http.routers.studio.rule=Host(`studio.lab.mydomain.org`)",
                        "traefik.http.services.studio.loadbalancer.server.port=3000",
                    ],
                }
            }
        },
    )
    dynamic, warnings = generate_traefik_config(cfg, ["stack"])
    assert dynamic == {}
    assert warnings == []
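The `${VAR}` / `${VAR:-default}` handling exercised by these tests can be reproduced standalone; this sketch reuses the same regex as the module's `_VAR_PATTERN` and mirrors its `_interpolate` logic (helper name `interpolate` is illustrative):

```python
import re

# Same pattern as _VAR_PATTERN in src/compose_farm/traefik.py
_VAR_PATTERN = re.compile(r"\$\{([A-Za-z_][A-Za-z0-9_]*)(?::-(.*?))?\}")


def interpolate(value: str, env: dict[str, str]) -> str:
    """Replace ${VAR} and ${VAR:-default} occurrences using env."""

    def replace(match: re.Match[str]) -> str:
        resolved = env.get(match.group(1))
        if resolved:
            return resolved
        # Fall back to the :-default if given, else empty string
        return match.group(2) or ""

    return _VAR_PATTERN.sub(replace, value)


print(interpolate("Host(`wakapi.${DOMAIN}`)", {"DOMAIN": "lab.mydomain.org"}))
# → Host(`wakapi.lab.mydomain.org`)
print(interpolate("${MISSING:-fallback}", {}))
# → fallback
```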