2026-05-05 16:52:40 +02:00
commit bdb523d4b8
58 changed files with 8880 additions and 0 deletions

6
sop-back/.dockerignore Normal file
View File

@@ -0,0 +1,6 @@
.venv
__pycache__
*.pyc
.env
data/
.git

17
sop-back/.env.example Normal file
View File

@@ -0,0 +1,17 @@
# API
PROJECT_NAME=Smash or Pass
ALLOWED_ORIGINS=["http://localhost:5173","http://localhost:8080"]
# Admin gate
ADMIN_ENABLED=true
# Database (SQLite file path)
DATABASE_URL=sqlite:///./data/sop.db
# MinIO / S3
S3_ENDPOINT_URL=http://minio:9000
S3_PUBLIC_URL=http://localhost:9000
S3_ACCESS_KEY=minioadmin
S3_SECRET_KEY=minioadmin
S3_BUCKET=sop
S3_REGION=us-east-1

18
sop-back/Dockerfile Normal file
View File

@@ -0,0 +1,18 @@
FROM python:3.12-slim
ENV PYTHONDONTWRITEBYTECODE=1 \
    PYTHONUNBUFFERED=1 \
    PIP_NO_CACHE_DIR=1
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY app ./app
RUN mkdir -p /app/data
EXPOSE 8000
CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000"]

142
sop-back/README.md Normal file
View File

@@ -0,0 +1,142 @@
# Smash or Pass — Backend
FastAPI + SQLAlchemy (SQLite) + boto3 (MinIO/S3). Serves the API consumed by [`sop-front`](../sop-front/).
See [`../CLAUDE.md`](../CLAUDE.md) for full architecture notes.

---

## Stack
- Python 3.12, FastAPI, Uvicorn
- SQLAlchemy 2 (sync) + SQLite
- Pydantic v2 + `pydantic-settings`
- `boto3` for S3-compatible object storage (MinIO)
---
## Project layout
```
sop-back/
├── app/
│   ├── main.py                 # FastAPI factory + lifespan (DB + bucket init)
│   ├── core/{config,deps}.py   # Settings, admin gate
│   ├── db/database.py          # Engine, SessionLocal, get_db, Base
│   ├── models/models.py        # Collection, Character
│   ├── schemas/schemas.py      # Pydantic schemas
│   ├── services/storage.py     # MinIO upload/delete, bucket bootstrap
│   └── api/routes/
│       ├── health.py           # /health, /admin/status
│       ├── collections.py      # /collections
│       └── admin.py            # /admin/* (require_admin)
├── requirements.txt
├── Dockerfile
└── .env.example
```
---
## Configuration
Copy `.env.example` to `.env` and adjust:

| Var | Default | Notes |
|---|---|---|
| `ADMIN_ENABLED` | `false` | When `true`, `/admin/*` routes are exposed and the frontend renders the admin panel |
| `ALLOWED_ORIGINS` | `["*"]` | CORS, JSON array string |
| `DATABASE_URL` | `sqlite:///./data/sop.db` | SQLite file path |
| `S3_ENDPOINT_URL` | `http://localhost:9000` | What the **backend** uses to reach MinIO |
| `S3_PUBLIC_URL` | `http://localhost:9000` | What gets stored in `s3_url` and dereferenced by the **browser** |
| `S3_ACCESS_KEY` / `S3_SECRET_KEY` | `minioadmin` / `minioadmin` | Credentials |
| `S3_BUCKET` | `sop` | Auto-created on startup, set public-read |
| `S3_REGION` | `us-east-1` | Required by boto3 |
> **Security note:** `ADMIN_ENABLED=true` exposes admin endpoints to anyone who can reach the backend. There is no user auth — by design, per project spec. In production, either (a) put the backend behind an authenticated reverse proxy / VPN, (b) keep `ADMIN_ENABLED=false` outside of admin sessions, or (c) replace the gate with proper auth.
---
## Local development
You need a running MinIO. Easiest path: spin it up via Docker.
```bash
# from repo root
docker compose up -d minio minio-init
```
Then run the backend natively:
```bash
cd sop-back
python -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
cp .env.example .env
# edit .env: set ADMIN_ENABLED=true if you want to upload
uvicorn app.main:app --reload
```
- API: http://localhost:8000
- Interactive docs: http://localhost:8000/docs
- Health: http://localhost:8000/health

The SQLite DB file is created at `sop-back/data/sop.db` on first startup; tables are auto-created. There is no Alembic — if you change models, delete the file during dev.

---

## API surface
| Method | Path | Auth | Purpose |
|---|---|---|---|
| GET | `/health` | — | Liveness |
| GET | `/admin/status` | — | `{admin_enabled: bool}` |
| GET | `/collections` | — | List collections (with `character_count`) |
| GET | `/collections/{id}` | — | Collection + characters |
| POST | `/admin/collections` | admin | multipart `name` + `files[]` |
| POST | `/admin/collections/{id}/characters` | admin | multipart `files[]` |
| DELETE | `/admin/collections/{id}` | admin | Delete collection + S3 objects |
| DELETE | `/admin/characters/{id}` | admin | Delete one character + its S3 object |
Allowed image types: `image/jpeg`, `image/png`, `image/webp`, `image/gif`.
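
Assuming the backend is running locally on port 8000 with `ADMIN_ENABLED=true`, the surface above can be exercised with `curl` (collection/character IDs and file names here are illustrative):

```bash
# Public reads
curl http://localhost:8000/health
curl http://localhost:8000/admin/status
curl http://localhost:8000/collections
curl http://localhost:8000/collections/1

# Admin writes (multipart form data; repeat -F "files=@..." per image)
curl -X POST http://localhost:8000/admin/collections \
  -F "name=Mario Cast" \
  -F "files=@mario.png" \
  -F "files=@luigi.png"

curl -X DELETE http://localhost:8000/admin/characters/1
```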

---

## Production deployment
### Docker (recommended)
The repo-root `docker-compose.yml` already wires backend + MinIO + bucket init. From the repo root:
```bash
docker compose up -d --build backend minio minio-init
```
For a real deployment you should override these defaults:
1. **Use long, random MinIO credentials.** Edit the `MINIO_ROOT_USER` / `MINIO_ROOT_PASSWORD` env on the `minio` service and the matching `S3_ACCESS_KEY` / `S3_SECRET_KEY` on the backend.
2. **Set `S3_PUBLIC_URL` to the public hostname** browsers will hit (e.g. `https://media.example.com`). Put MinIO behind a TLS-terminating reverse proxy on that hostname.
3. **Set `ALLOWED_ORIGINS`** to the exact frontend origin(s) — never `["*"]` in prod.
4. **Persist volumes**: `minio-data` and `backend-data` (the SQLite file) — already declared in compose. Back them up.
5. **Run behind a reverse proxy** (Caddy / Traefik / nginx) terminating TLS in front of port 8000.
6. Consider switching `DATABASE_URL` to PostgreSQL if you expect concurrent writes.
### Without Docker
Same install steps as local, but:
```bash
pip install -r requirements.txt gunicorn
gunicorn app.main:app -k uvicorn.workers.UvicornWorker -w 2 -b 0.0.0.0:8000
```
Run under systemd or a supervisor of your choice. Front it with nginx/Caddy for TLS.

---

## Schema migrations
There are none yet — `Base.metadata.create_all` runs on startup. If the schema changes incompatibly, either:
- Wipe `data/sop.db` (dev), or
- Add Alembic before the next prod deploy.

0
sop-back/app/__init__.py Normal file
View File

109
sop-back/app/api/routes/admin.py Normal file
View File

@@ -0,0 +1,109 @@
from fastapi import APIRouter, Depends, HTTPException, UploadFile, File, Form
from sqlalchemy.orm import Session
from sqlalchemy.exc import IntegrityError
from typing import List

from app.db.database import get_db
from app.core.deps import require_admin
from app.models.models import Collection, Character
from app.schemas.schemas import CollectionOut
from app.services.storage import upload_image, delete_object_by_url

router = APIRouter(dependencies=[Depends(require_admin)])

ALLOWED_TYPES = {"image/jpeg", "image/png", "image/webp", "image/gif"}


@router.post("/collections", response_model=CollectionOut, status_code=201)
async def create_collection(
    name: str = Form(...),
    files: List[UploadFile] = File(...),
    db: Session = Depends(get_db),
):
    name = name.strip()
    if not name:
        raise HTTPException(status_code=400, detail="Collection name required")
    if not files:
        raise HTTPException(status_code=400, detail="At least one image is required")
    collection = Collection(name=name)
    db.add(collection)
    try:
        db.flush()
    except IntegrityError:
        db.rollback()
        raise HTTPException(status_code=409, detail="A collection with this name already exists")
    # Validate every file up front so a rejected file can't leave
    # already-uploaded objects orphaned in S3.
    for f in files:
        if f.content_type not in ALLOWED_TYPES:
            db.rollback()
            raise HTTPException(
                status_code=415,
                detail=f"Unsupported file type: {f.content_type}",
            )
    for f in files:
        data = await f.read()
        _, url = upload_image(data, f.filename or "image", f.content_type or "")
        char_name = (f.filename or "image").rsplit(".", 1)[0]
        db.add(
            Character(
                name=char_name,
                filename=f.filename or "image",
                s3_url=url,
                collection_id=collection.id,
            )
        )
    db.commit()
    db.refresh(collection)
    return collection


@router.post("/collections/{collection_id}/characters", response_model=CollectionOut)
async def add_characters(
    collection_id: int,
    files: List[UploadFile] = File(...),
    db: Session = Depends(get_db),
):
    collection = db.get(Collection, collection_id)
    if not collection:
        raise HTTPException(status_code=404, detail="Collection not found")
    # Same validate-first pass as create_collection.
    for f in files:
        if f.content_type not in ALLOWED_TYPES:
            raise HTTPException(status_code=415, detail=f"Unsupported file type: {f.content_type}")
    for f in files:
        data = await f.read()
        _, url = upload_image(data, f.filename or "image", f.content_type or "")
        char_name = (f.filename or "image").rsplit(".", 1)[0]
        db.add(
            Character(
                name=char_name,
                filename=f.filename or "image",
                s3_url=url,
                collection_id=collection.id,
            )
        )
    db.commit()
    db.refresh(collection)
    return collection


@router.delete("/collections/{collection_id}", status_code=204)
def delete_collection(collection_id: int, db: Session = Depends(get_db)):
    collection = db.get(Collection, collection_id)
    if not collection:
        raise HTTPException(status_code=404, detail="Collection not found")
    urls = [c.s3_url for c in collection.characters]
    db.delete(collection)
    db.commit()
    # Remove S3 objects only after the DB commit succeeds.
    for u in urls:
        delete_object_by_url(u)


@router.delete("/characters/{character_id}", status_code=204)
def delete_character(character_id: int, db: Session = Depends(get_db)):
    ch = db.get(Character, character_id)
    if not ch:
        raise HTTPException(status_code=404, detail="Character not found")
    url = ch.s3_url
    db.delete(ch)
    db.commit()
    delete_object_by_url(url)

31
sop-back/app/api/routes/collections.py Normal file
View File

@@ -0,0 +1,31 @@
from fastapi import APIRouter, Depends, HTTPException
from sqlalchemy.orm import Session
from typing import List

from app.db.database import get_db
from app.models.models import Collection, Character
from app.schemas.schemas import CollectionSummary, CollectionOut

router = APIRouter()


@router.get("", response_model=List[CollectionSummary])
def list_collections(db: Session = Depends(get_db)):
    rows = db.query(Collection).order_by(Collection.created_at.desc()).all()
    return [
        CollectionSummary(
            id=c.id,
            name=c.name,
            created_at=c.created_at,
            character_count=len(c.characters),
        )
        for c in rows
    ]


@router.get("/{collection_id}", response_model=CollectionOut)
def get_collection(collection_id: int, db: Session = Depends(get_db)):
    c = db.get(Collection, collection_id)
    if not c:
        raise HTTPException(status_code=404, detail="Collection not found")
    return c

15
sop-back/app/api/routes/health.py Normal file
View File

@@ -0,0 +1,15 @@
from fastapi import APIRouter

from app.core.config import settings
from app.schemas.schemas import AdminStatus

router = APIRouter()


@router.get("/health")
def health():
    return {"status": "ok"}


@router.get("/admin/status", response_model=AdminStatus)
def admin_status():
    return AdminStatus(admin_enabled=settings.ADMIN_ENABLED)

24
sop-back/app/core/config.py Normal file
View File

@@ -0,0 +1,24 @@
from typing import List

from pydantic_settings import BaseSettings, SettingsConfigDict


class Settings(BaseSettings):
    model_config = SettingsConfigDict(env_file=".env", extra="ignore")

    PROJECT_NAME: str = "Smash or Pass"
    VERSION: str = "0.1.0"
    ALLOWED_ORIGINS: List[str] = ["*"]
    ADMIN_ENABLED: bool = False
    DATABASE_URL: str = "sqlite:///./data/sop.db"
    S3_ENDPOINT_URL: str = "http://localhost:9000"
    S3_PUBLIC_URL: str = "http://localhost:9000"
    S3_ACCESS_KEY: str = "minioadmin"
    S3_SECRET_KEY: str = "minioadmin"
    S3_BUCKET: str = "sop"
    S3_REGION: str = "us-east-1"


settings = Settings()
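
`pydantic-settings` parses list-typed fields such as `ALLOWED_ORIGINS` from environment variables as JSON, which is why `.env.example` quotes the value as a JSON array. A quick check of the expected format, using the stdlib `json` module to stand in for that parsing:

```python
import json

raw = '["http://localhost:5173","http://localhost:8080"]'  # as it appears in .env
print(json.loads(raw))  # ['http://localhost:5173', 'http://localhost:8080']

# A bare comma-separated value is not valid JSON and would be rejected:
try:
    json.loads("http://localhost:5173,http://localhost:8080")
except json.JSONDecodeError:
    print("invalid: must be a JSON array")
```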

10
sop-back/app/core/deps.py Normal file
View File

@@ -0,0 +1,10 @@
from fastapi import HTTPException, status

from app.core.config import settings


def require_admin() -> None:
    if not settings.ADMIN_ENABLED:
        raise HTTPException(
            status_code=status.HTTP_403_FORBIDDEN,
            detail="Admin module is disabled",
        )

24
sop-back/app/db/database.py Normal file
View File

@@ -0,0 +1,24 @@
import os

from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker, declarative_base

from app.core.config import settings

connect_args = {}
if settings.DATABASE_URL.startswith("sqlite"):
    connect_args["check_same_thread"] = False
    # Make sure the directory holding the SQLite file exists.
    db_path = settings.DATABASE_URL.replace("sqlite:///", "")
    parent = os.path.dirname(db_path)
    if parent:
        os.makedirs(parent, exist_ok=True)

engine = create_engine(settings.DATABASE_URL, connect_args=connect_args)
SessionLocal = sessionmaker(bind=engine, autoflush=False, autocommit=False)
Base = declarative_base()


def get_db():
    db = SessionLocal()
    try:
        yield db
    finally:
        db.close()
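
The SQLite bootstrap above turns the URL into a filesystem path by stripping the `sqlite:///` scheme, then ensures the parent directory exists. The path arithmetic in isolation (the URL is the project default):

```python
import os

DATABASE_URL = "sqlite:///./data/sop.db"  # default from app.core.config

db_path = DATABASE_URL.replace("sqlite:///", "")
parent = os.path.dirname(db_path)
print(db_path)  # ./data/sop.db
print(parent)   # ./data
# startup then runs: os.makedirs(parent, exist_ok=True)
```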

39
sop-back/app/main.py Normal file
View File

@@ -0,0 +1,39 @@
from contextlib import asynccontextmanager

from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware

from app.core.config import settings
from app.db.database import Base, engine
from app.models import models  # noqa: F401 (register models)
from app.services.storage import ensure_bucket
from app.api.routes import health, collections, admin


@asynccontextmanager
async def lifespan(app: FastAPI):
    Base.metadata.create_all(bind=engine)
    try:
        ensure_bucket()
    except Exception as e:
        # Don't crash the API if MinIO is briefly unavailable at startup.
        print(f"[startup] MinIO bucket init skipped: {e}")
    yield


app = FastAPI(
    title=settings.PROJECT_NAME,
    version=settings.VERSION,
    lifespan=lifespan,
)

app.add_middleware(
    CORSMiddleware,
    allow_origins=settings.ALLOWED_ORIGINS,
    allow_credentials=True,
    allow_methods=["*"],
    allow_headers=["*"],
)

app.include_router(health.router, tags=["meta"])
app.include_router(collections.router, prefix="/collections", tags=["collections"])
app.include_router(admin.router, prefix="/admin", tags=["admin"])

31
sop-back/app/models/models.py Normal file
View File

@@ -0,0 +1,31 @@
from sqlalchemy import Column, Integer, String, ForeignKey, DateTime, func
from sqlalchemy.orm import relationship

from app.db.database import Base


class Collection(Base):
    __tablename__ = "collections"

    id = Column(Integer, primary_key=True, index=True)
    name = Column(String(120), nullable=False, unique=True)
    created_at = Column(DateTime, server_default=func.now())
    characters = relationship(
        "Character",
        back_populates="collection",
        cascade="all, delete-orphan",
    )


class Character(Base):
    __tablename__ = "characters"

    id = Column(Integer, primary_key=True, index=True)
    name = Column(String(255), nullable=False)
    filename = Column(String(255), nullable=False)
    s3_url = Column(String(1024), nullable=False)
    collection_id = Column(
        Integer, ForeignKey("collections.id", ondelete="CASCADE"), nullable=False
    )
    collection = relationship("Collection", back_populates="characters")

39
sop-back/app/schemas/schemas.py Normal file
View File

@@ -0,0 +1,39 @@
from datetime import datetime
from typing import List

from pydantic import BaseModel, ConfigDict


class CharacterOut(BaseModel):
    model_config = ConfigDict(from_attributes=True)

    id: int
    name: str
    filename: str
    s3_url: str
    collection_id: int


class CollectionSummary(BaseModel):
    model_config = ConfigDict(from_attributes=True)

    id: int
    name: str
    created_at: datetime
    character_count: int = 0


class CollectionOut(BaseModel):
    model_config = ConfigDict(from_attributes=True)

    id: int
    name: str
    created_at: datetime
    characters: List[CharacterOut] = []


class CollectionCreate(BaseModel):
    name: str


class AdminStatus(BaseModel):
    admin_enabled: bool

63
sop-back/app/services/storage.py Normal file
View File

@@ -0,0 +1,63 @@
import uuid

import boto3
from botocore.client import Config
from botocore.exceptions import ClientError

from app.core.config import settings


def _client():
    return boto3.client(
        "s3",
        endpoint_url=settings.S3_ENDPOINT_URL,
        aws_access_key_id=settings.S3_ACCESS_KEY,
        aws_secret_access_key=settings.S3_SECRET_KEY,
        region_name=settings.S3_REGION,
        config=Config(signature_version="s3v4"),
    )


def ensure_bucket() -> None:
    s3 = _client()
    try:
        s3.head_bucket(Bucket=settings.S3_BUCKET)
    except ClientError:
        s3.create_bucket(Bucket=settings.S3_BUCKET)
    # Make objects publicly readable so the frontend can <img src=...> directly.
    policy = (
        '{"Version":"2012-10-17","Statement":[{"Effect":"Allow","Principal":{"AWS":["*"]},'
        '"Action":["s3:GetObject"],"Resource":["arn:aws:s3:::'
        + settings.S3_BUCKET
        + '/*"]}]}'
    )
    try:
        s3.put_bucket_policy(Bucket=settings.S3_BUCKET, Policy=policy)
    except ClientError:
        pass


def upload_image(file_bytes: bytes, filename: str, content_type: str) -> tuple[str, str]:
    """Upload bytes to S3, return (object_key, public_url)."""
    ext = ""
    if "." in filename:
        ext = "." + filename.rsplit(".", 1)[-1].lower()
    key = f"characters/{uuid.uuid4().hex}{ext}"
    s3 = _client()
    s3.put_object(
        Bucket=settings.S3_BUCKET,
        Key=key,
        Body=file_bytes,
        ContentType=content_type or "application/octet-stream",
    )
    public_url = f"{settings.S3_PUBLIC_URL.rstrip('/')}/{settings.S3_BUCKET}/{key}"
    return key, public_url


def delete_object_by_url(url: str) -> None:
    prefix = f"{settings.S3_PUBLIC_URL.rstrip('/')}/{settings.S3_BUCKET}/"
    if not url.startswith(prefix):
        return
    key = url[len(prefix):]
    try:
        _client().delete_object(Bucket=settings.S3_BUCKET, Key=key)
    except ClientError:
        pass

7
sop-back/requirements.txt Normal file
View File

@@ -0,0 +1,7 @@
fastapi>=0.111.0
uvicorn[standard]>=0.29.0
pydantic>=2.7.0
pydantic-settings>=2.2.0
sqlalchemy>=2.0.30
boto3>=1.34.0
python-multipart>=0.0.9