Initial commit - copied workspace after database cleanup

This commit is contained in:
RobbStarkAustria
2025-10-10 15:20:14 +00:00
commit 1efe40a03b
142 changed files with 23625 additions and 0 deletions

.env.example Normal file

@@ -0,0 +1,36 @@
# Copy this file to .env and fill in values as needed for local development.
# NOTE: No secrets should be committed. Use placeholders below.
# General
ENV=development
# Database (used if DB_CONN not provided)
DB_USER=your_user
DB_PASSWORD=your_password
DB_NAME=infoscreen_by_taa
DB_HOST=db
# Preferred connection string for services (overrides the above if set)
# DB_CONN=mysql+pymysql://${DB_USER}:${DB_PASSWORD}@${DB_HOST}/${DB_NAME}
# MQTT
MQTT_BROKER_HOST=mqtt
MQTT_BROKER_PORT=1883
# MQTT_USER=your_mqtt_user
# MQTT_PASSWORD=your_mqtt_password
MQTT_KEEPALIVE=60
# Dashboard
# Used when building the production dashboard image
# VITE_API_URL=https://your.api.example.com/api
# Groups alive windows (seconds)
HEARTBEAT_GRACE_PERIOD_DEV=15
HEARTBEAT_GRACE_PERIOD_PROD=180
# Scheduler
# Optional: force periodic republish even without changes
# REFRESH_SECONDS=0
# Default admin bootstrap (server/init_defaults.py)
DEFAULT_ADMIN_USERNAME=infoscreen_admin
DEFAULT_ADMIN_PASSWORD=change_me_strong_password

.github/copilot-instructions.md vendored Normal file

@@ -0,0 +1,116 @@
# Copilot instructions for infoscreen_2025
Use this as your shared context when proposing changes. Keep edits minimal and match existing patterns referenced below.
## Big picture
- Multi-service app orchestrated by Docker Compose.
- API: Flask + SQLAlchemy (MariaDB), in `server/` exposed on :8000 (health: `/health`).
- Dashboard: React + Vite in `dashboard/`, dev on :5173, served via Nginx in prod.
- MQTT broker: Eclipse Mosquitto, config in `mosquitto/config/mosquitto.conf`.
- Listener: MQTT consumer handling discovery + heartbeats in `listener/listener.py`.
- Scheduler: Publishes active events (per group) to MQTT retained topics in `scheduler/scheduler.py`.
- Nginx: Reverse proxy routes `/api/*` and `/screenshots/*` to API; everything else to dashboard (`nginx.conf`).
## Service boundaries & data flow
- Database connection string is passed as `DB_CONN` (mysql+pymysql) to Python services.
- API builds its engine in `server/database.py` (loads `.env` only in development).
- Scheduler loads `DB_CONN` in `scheduler/db_utils.py`.
- Listener also creates its own engine for writes to `clients`.
- MQTT topics (paho-mqtt v2, use Callback API v2):
- Discovery: `infoscreen/discovery` (JSON includes `uuid`, hw/ip data). ACK to `infoscreen/{uuid}/discovery_ack`. See `listener/listener.py`.
- Heartbeat: `infoscreen/{uuid}/heartbeat` updates `Client.last_alive` (UTC).
- Event lists (retained): `infoscreen/events/{group_id}` from `scheduler/scheduler.py`.
- Per-client group assignment (retained): `infoscreen/{uuid}/group_id` via `server/mqtt_helper.py`.
- Screenshots: server-side folders `server/received_screenshots/` and `server/screenshots/`; Nginx exposes `/screenshots/{uuid}.jpg` via `server/wsgi.py` route.
- Presentation conversion (PPT/PPTX/ODP → PDF):
- Trigger: on upload in `server/routes/eventmedia.py` for media types `ppt|pptx|odp` (compute sha256, upsert `Conversion`, enqueue job).
- Worker: RQ worker runs `server.worker.convert_event_media_to_pdf`, calls Gotenberg LibreOffice endpoint, writes to `server/media/converted/`.
- Services: Redis (queue) and Gotenberg added in compose; worker service consumes the `conversions` queue.
- Env: `REDIS_URL` (default `redis://redis:6379/0`), `GOTENBERG_URL` (default `http://gotenberg:3000`).
- Endpoints: `POST /api/conversions/<media_id>/pdf` (ensure/enqueue), `GET /api/conversions/<media_id>/status`, `GET /api/files/converted/<path>` (serve PDFs).
- Storage: originals under `server/media/…`, outputs under `server/media/converted/` (prod compose mounts a shared volume for this path).
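The retained-topic flow above can be sketched with paho-mqtt v2 (Callback API v2). The broker host/port mirror the documented `MQTT_BROKER_HOST`/`MQTT_BROKER_PORT` defaults; `publish_event_list` is an illustrative helper, not the scheduler's actual function:

```python
import json


def event_topic(group_id: int) -> str:
    """Retained topic the scheduler publishes per group (see topic list above)."""
    return f"infoscreen/events/{group_id}"


def publish_event_list(group_id: int, events: list) -> None:
    """Publish a retained event list; requires a reachable broker."""
    import paho.mqtt.client as mqtt  # paho-mqtt v2

    client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)  # Callback API v2
    client.connect("mqtt", 1883, keepalive=60)  # in-container defaults
    # retain=True so clients recover the current list after a reconnect
    client.publish(event_topic(group_id), json.dumps({"events": events}),
                   qos=1, retain=True)
    client.disconnect()
```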
## Data model highlights (see `models/models.py`)
- Enums: `EventType` (presentation, website, video, message, webuntis), `MediaType` (file/website types), and `AcademicPeriodType` (schuljahr, semester, trimester).
- Tables: `clients`, `client_groups`, `events`, `event_media`, `users`, `academic_periods`, `school_holidays`.
- Academic periods: `academic_periods` table supports educational institution cycles (school years, semesters). Events and media can be optionally linked via `academic_period_id` (nullable for backward compatibility).
- Times are stored as timezone-aware; treat comparisons in UTC (see scheduler and routes/events).
- Conversions:
- Enum `ConversionStatus`: `pending`, `processing`, `ready`, `failed`.
- Table `conversions`: `id`, `source_event_media_id` (FK→`event_media.id` ondelete CASCADE), `target_format`, `target_path`, `status`, `file_hash` (sha256), `started_at`, `completed_at`, `error_message`.
- Indexes: `(source_event_media_id, target_format)`, `(status, target_format)`; Unique: `(source_event_media_id, target_format, file_hash)`.
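A minimal sketch of the dedup key implied by the unique constraint: hash the uploaded file, then look up `(source_event_media_id, target_format, file_hash)` before enqueueing. `file_sha256` and `conversion_key` are illustrative names, not the server's actual helpers:

```python
import hashlib


def file_sha256(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream the file in 1 MiB chunks so large uploads don't load into RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()


def conversion_key(media_id: int, file_hash: str, target_format: str = "pdf"):
    # Tuple mirroring the unique constraint; an existing 'ready' row with
    # the same key means the PDF can be reused instead of re-enqueueing.
    return (media_id, target_format, file_hash)
```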
## API patterns
- Blueprints live in `server/routes/*` and are registered in `server/wsgi.py` with `/api/...` prefixes.
- Session usage: instantiate `Session()` per request, commit when mutating, and always `session.close()` before returning.
- Examples:
- Clients: `server/routes/clients.py` includes bulk group updates and MQTT sync (`publish_multiple_client_groups`).
- Groups: `server/routes/groups.py` computes “alive” using a grace period that varies by `ENV`.
- Events: `server/routes/events.py` serializes enum values to strings and normalizes times to UTC.
- Media: `server/routes/eventmedia.py` implements a simple file manager API rooted at `server/media/`.
- Academic periods: `server/routes/academic_periods.py` exposes:
- `GET /api/academic_periods` — list all periods
- `GET /api/academic_periods/active` — currently active period
- `POST /api/academic_periods/active` — set active period (deactivates others)
- `GET /api/academic_periods/for_date?date=YYYY-MM-DD` — period covering given date
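The enum/datetime serialization rule used across the routes can be sketched with the stdlib; `EventType` here is an illustrative subset of the enum in `models/models.py`:

```python
import enum
from datetime import datetime, timezone


class EventType(enum.Enum):  # illustrative subset of models/models.py
    PRESENTATION = "presentation"
    WEBSITE = "website"


def to_json_safe(value):
    """Return a JSON-serializable value: enums as strings, datetimes as UTC ISO."""
    if isinstance(value, enum.Enum):
        return value.value
    if isinstance(value, datetime):
        # Treat naive DB datetimes as UTC, then emit ISO 8601.
        if value.tzinfo is None:
            value = value.replace(tzinfo=timezone.utc)
        return value.astimezone(timezone.utc).isoformat()
    return value
```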
## Frontend patterns (dashboard)
- Vite React app; proxies `/api` and `/screenshots` to API in dev (`vite.config.ts`).
- Uses Syncfusion components; Vite config pre-bundles specific packages to avoid alias issues.
- Environment: `VITE_API_URL` provided at build/run; in dev compose, proxy handles `/api` so local fetches can use relative `/api/...` paths.
- Scheduler (appointments page): top bar includes Group and Academic Period selectors (Syncfusion DropDownList). Selecting a period calls `POST /api/academic_periods/active`, moves the calendar to today's month/day within the period year, and refreshes a right-aligned indicator row showing:
- Holidays present in the current view (count)
- Period label (display_name or name) with a badge indicating whether any holidays exist in that period (overlap check)
Note: Syncfusion usage in the dashboard is already documented above; if a UI for conversion status/downloads is added later, link its routes and components here.
## Local development
- Compose: development is `docker-compose.yml` + `docker-compose.override.yml`.
- API (dev): `server/Dockerfile.dev` with debugpy on 5678, Flask app `wsgi:app` on :8000.
- Dashboard (dev): `dashboard/Dockerfile.dev` exposes :5173 and waits for API via `dashboard/wait-for-backend.sh`.
- Mosquitto: allows anonymous in dev; WebSocket on :9001.
- Common env vars: `DB_CONN`, `DB_USER`, `DB_PASSWORD`, `DB_HOST=db`, `DB_NAME`, `ENV`, `MQTT_USER`, `MQTT_PASSWORD`.
- Alembic: prod compose runs `alembic ... upgrade head` and `server/init_defaults.py` before gunicorn.
- Use `server/init_academic_periods.py` to populate default Austrian school years after migration.
## Production
- `docker-compose.prod.yml` uses prebuilt images (`ghcr.io/robbstarkaustria/*`).
- Nginx serves dashboard and proxies API; TLS certs expected in `certs/` and mounted to `/etc/nginx/certs`.
## Environment variables (reference)
- DB_CONN — Preferred DB URL for services. Example: `mysql+pymysql://${DB_USER}:${DB_PASSWORD}@db/${DB_NAME}`
- DB_USER, DB_PASSWORD, DB_NAME, DB_HOST — Used to assemble DB_CONN in dev if missing; inside containers `DB_HOST=db`.
- ENV — `development` or `production`; in development, `server/database.py` loads `.env`.
- MQTT_BROKER_HOST, MQTT_BROKER_PORT — Defaults `mqtt` and `1883`; MQTT_USER/MQTT_PASSWORD optional (dev often anonymous per Mosquitto config).
- VITE_API_URL — Dashboard build-time base URL (prod); in dev the Vite proxy serves `/api` to `server:8000`.
- HEARTBEAT_GRACE_PERIOD_DEV / HEARTBEAT_GRACE_PERIOD_PROD — Groups “alive” window (defaults ~15s dev / 180s prod).
- REFRESH_SECONDS — Optional scheduler republish interval; `0` disables periodic refresh.
## Conventions & gotchas
- Always compare datetimes in UTC; some DB values may be naive—normalize before comparing (see `routes/events.py`).
- Use retained MQTT messages for state that clients must recover after reconnect (events per group, client group_id).
- In-container DB host is `db`; do not use `localhost` inside services.
- No separate dev vs prod secret conventions: use the same env var keys across environments (e.g., `DB_CONN`, `MQTT_USER`, `MQTT_PASSWORD`).
- When adding a new route:
1) Create a Blueprint in `server/routes/...`,
2) Register it in `server/wsgi.py`,
3) Manage `Session()` lifecycle, and
4) Return JSON-safe values (serialize enums and datetimes).
- When extending media types, update `MediaType` and any logic in `eventmedia` and dashboard that depends on it.
- Academic periods: Events/media can be optionally associated with periods for educational organization. Only one period should be active at a time (`is_active=True`).
## Quick examples
- Add client description persists to DB and publishes group via MQTT: see `PUT /api/clients/<uuid>/description` in `routes/clients.py`.
- Bulk group assignment emits retained messages for each client: `PUT /api/clients/group`.
- Listener heartbeat path: `infoscreen/<uuid>/heartbeat` → sets `clients.last_alive`.
Questions or unclear areas? Tell us if you need: exact devcontainer debugging steps, stricter Alembic workflow, or a seed dataset beyond `init_defaults.py`.
## Academic Periods System
- **Purpose**: Organize events and media by educational cycles (school years, semesters, trimesters).
- **Design**: Fully backward compatible - existing events/media continue to work without period assignment.
- **Usage**: New events/media can optionally reference `academic_period_id` for better organization and filtering.
- **Constraints**: Only one period can be active at a time; use `init_academic_periods.py` for Austrian school year setup.
- **UI Integration**: The dashboard highlights the currently selected period and whether a holiday plan exists within that date range. Holiday linkage currently uses date overlap with `school_holidays`; an explicit `academic_period_id` on `school_holidays` can be added later if tighter association is required.
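The date-overlap linkage described above can be sketched as a closed-range intersection; field names are assumptions, since `school_holidays` has no explicit `academic_period_id` yet:

```python
from datetime import date


def overlaps(p_start: date, p_end: date, h_start: date, h_end: date) -> bool:
    # Two closed date ranges intersect iff each starts before the other ends.
    return p_start <= h_end and h_start <= p_end
```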

.stylelintrc.json Normal file

@@ -0,0 +1,6 @@
{
"extends": [
"stylelint-config-standard",
"stylelint-config-tailwindcss"
]
}


@@ -0,0 +1,100 @@
# Maintaining AI Assistant Instructions (copilot-instructions.md)
This repo uses `.github/copilot-instructions.md` to brief AI coding agents about your architecture, workflows, and conventions. Keep it concise, repo-specific, and always in sync with your code.
This guide explains when and how to update it, plus small guardrails to help—even for a solo developer.
## When to update
Update the instructions in the same commit as your change whenever you:
- Add/rename services, ports, or container wiring (docker-compose*.yml, Nginx, Mosquitto)
- Introduce/rename MQTT topics or change retained-message behavior
- Add/rename environment variables or change defaults (`.env.example`, `deployment.md`)
- Change DB models or time/UTC handling (e.g., `models/models.py`, UTC normalization in routes/scheduler)
- Add/modify API route patterns or session lifecycle (files in `server/routes/*`, `server/wsgi.py`)
- Adjust frontend dev proxy or build settings (`dashboard/vite.config.ts`, Dockerfiles)
## What to update (and where)
- `.github/copilot-instructions.md`
- Big picture: services and ports
- Service boundaries & data flow: DB connection rules, MQTT topics, retained messages, screenshots
- API patterns: Blueprints, Session per request, enum/datetime serialization
- Frontend patterns: Vite dev proxy and pre-bundled dependencies
- Environment variables (reference): names, purposes, example patterns
- Conventions & gotchas: UTC comparisons, retained MQTT, container hostnames
- `.env.example`
- Add new variable names with placeholders and comments (never secrets)
- Keep in-container defaults (e.g., `DB_HOST=db`, `MQTT_BROKER_HOST=mqtt`)
- `deployment.md`
- Update Quickstart URLs/ports/commands
- Document prod-specific env usage (e.g., `VITE_API_URL`, `DB_CONN`)
## How to write good updates
- Keep it short (approx. 20-50 lines total). Link to code by path or route rather than long prose.
- Document real, present patterns—not plans.
- Use UTC consistently and call out any special handling.
- Include concrete examples from this repo when describing patterns (e.g., which route shows enum serialization).
- Never include secrets or real tokens; show only variable names and example formats.
## Solo-friendly workflow
- Update docs in the same commit as your change:
- Code changed → docs changed (copilot-instructions, `.env.example`, `deployment.md` as needed)
- Use a quick self-checklist before pushing:
- Services/ports changed? Update “Big picture”.
- MQTT topics/retained behavior changed? Update “Service boundaries & data flow”.
- API/Session/UTC rules changed? Update “API patterns” and “Conventions & gotchas”.
- Frontend proxy/build changed? Update “Frontend patterns”.
- Env vars changed? Update “Environment variables (reference)” + `.env.example`.
- Dev/prod run steps changed? Update `deployment.md` Quickstart.
- Keep commits readable by pairing code and doc changes:
- `feat(api): add events endpoint; docs: update routes and UTC note`
- `chore(compose): rename service; docs: update ports + nginx`
- `docs(env): add MQTT_USER to .env.example + instructions`
## Optional guardrails (even for solo)
- PR (or MR) template (useful even if you self-merge)
- Add `.github/pull_request_template.md` with:
```
Checklist
- [ ] Updated .github/copilot-instructions.md (services/MQTT/API/UTC/env)
- [ ] Synced .env.example (new/renamed vars)
- [ ] Adjusted deployment.md (dev/prod steps, URLs/ports)
- [ ] Verified referenced files/paths in the instructions exist
```
- Lightweight docs check (optional pre-commit hook)
- Non-blocking script that warns if referenced files/paths don't exist. Example sketch:
```
#!/usr/bin/env bash
# Warn-only check: reports missing paths but never blocks the commit.
missing=0
for path in \
server/wsgi.py \
server/routes/clients.py \
server/routes/events.py \
server/routes/groups.py \
dashboard/vite.config.ts \
docker-compose.yml \
docker-compose.override.yml; do
if ! test -e "$path"; then
echo "[warn] referenced path not found: $path"; missing=1
fi
done
exit 0 # warn only; do not block commit
```
- Weekly 2-minute sweep
- Read `.github/copilot-instructions.md` top-to-bottom and remove anything stale.
## FAQ
- Where do the AI assistants look?
- `.github/copilot-instructions.md` + the code you have open. Keep this file synced with the codebase.
- Is it safe to commit this file?
- Yes—no secrets. It should contain only structure, patterns, and example formats.
- How detailed should it be?
- Concise and actionable; point to exact files for details. Avoid generic advice.
## Pointers to key files
- Compose & infra: `docker-compose*.yml`, `nginx.conf`, `mosquitto/config/mosquitto.conf`
- Backend: `server/database.py`, `server/wsgi.py`, `server/routes/*`, `models/models.py`
- MQTT workers: `listener/listener.py`, `scheduler/scheduler.py`, `server/mqtt_helper.py`
- Frontend: `dashboard/vite.config.ts`, `dashboard/package.json`, `dashboard/src/*`
- Dev/Prod docs: `deployment.md`, `.env.example`

CLEANUP_SUMMARY.md Normal file

@@ -0,0 +1,39 @@
# Database Cleanup Summary
## Files Removed ✅
The following obsolete database initialization files have been removed:
### Removed Files:
- **`server/init_database.py`** - Manual table creation (superseded by Alembic migrations)
- **`server/init_db.py`** - Alternative initialization (superseded by `init_defaults.py`)
- **`server/init_mariadb.py`** - Database/user creation (handled by Docker Compose)
- **`server/test_sql.py`** - Outdated connection test (used localhost instead of container)
### Why These Were Safe to Remove:
1. **No references found** in any Docker files, scripts, or code
2. **Functionality replaced** by modern Alembic-based approach
3. **Hardcoded connection strings** that don't match current Docker setup
4. **Manual processes** now automated in production deployment
## Current Database Management ✅
### Active Scripts:
- **`server/initialize_database.py`** - Complete initialization (NEW)
- **`server/init_defaults.py`** - Default data creation
- **`server/init_academic_periods.py`** - Academic periods setup
- **`alembic/`** - Schema migrations (version control)
### Development Scripts (Kept):
- **`server/dummy_clients.py`** - Test client data generation
- **`server/dummy_events.py`** - Test event data generation
- **`server/sync_existing_clients.py`** - MQTT synchronization utility
## Result
- **4 obsolete files removed**
- **Documentation updated** to reflect current state
- **No breaking changes** - all functionality preserved
- **Cleaner codebase** with single initialization path
The database initialization process is now streamlined and uses only modern, maintained approaches.

DATABASE_GUIDE.md Normal file

@@ -0,0 +1,147 @@
# Database Initialization and Management Guide
## Quick Start
Your database has been successfully initialized! Here's what you need to know:
### ✅ Current Status
- **Database**: MariaDB 11.2 running in Docker container `infoscreen-db`
- **Schema**: Up to date (Alembic revision: `b5a6c3d4e7f8`)
- **Default Data**: Admin user and client group created
- **Academic Periods**: Austrian school years 2024/25 (active), 2025/26, 2026/27
### 🔐 Default Credentials
- **Admin Username**: `infoscreen_admin`
- **Admin Password**: Check your `.env` file for `DEFAULT_ADMIN_PASSWORD`
- **Database User**: `infoscreen_admin`
- **Database Name**: `infoscreen_by_taa`
## Database Management Commands
### Initialize/Reinitialize Database
```bash
cd /workspace/server
python initialize_database.py
```
### Check Migration Status
```bash
cd /workspace/server
alembic current
alembic history --verbose
```
### Run Migrations Manually
```bash
cd /workspace/server
alembic upgrade head # Apply all pending migrations
alembic upgrade +1 # Apply next migration
alembic downgrade -1 # Rollback one migration
```
### Create New Migration
```bash
cd /workspace/server
alembic revision --autogenerate -m "Description of changes"
```
### Database Connection Test
```bash
cd /workspace/server
python -c "
from database import Session
session = Session()
print('✅ Database connection successful')
session.close()
"
```
## Initialization Scripts
### Core Scripts (recommended order):
1. **`alembic upgrade head`** - Apply database schema migrations
2. **`init_defaults.py`** - Create default user groups and admin user
3. **`init_academic_periods.py`** - Set up Austrian school year periods
### All-in-One Script:
- **`initialize_database.py`** - Complete database initialization (runs all above scripts)
### Development/Testing Scripts:
- **`dummy_clients.py`** - Creates test client data for development
- **`dummy_events.py`** - Creates test event data for development
- **`sync_existing_clients.py`** - One-time MQTT sync for existing clients
## Database Schema Overview
### Main Tables:
- **`users`** - User authentication and roles
- **`clients`** - Registered client devices
- **`client_groups`** - Client organization groups
- **`events`** - Scheduled events and presentations
- **`event_media`** - Media files for events
- **`conversions`** - File conversion jobs (PPT → PDF)
- **`academic_periods`** - School year/semester management
- **`school_holidays`** - Holiday calendar
- **`alembic_version`** - Migration tracking
### Environment Variables:
```bash
DB_CONN=mysql+pymysql://infoscreen_admin:your_password@db/infoscreen_by_taa
DB_USER=infoscreen_admin
DB_PASSWORD=your_password
DB_NAME=infoscreen_by_taa
DB_HOST=db
```
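When `DB_CONN` is absent, services assemble it from the individual variables; a hedged sketch of that fallback (the function name is illustrative, the URL format matches the example above):

```python
def build_db_conn(env: dict) -> str:
    """Prefer DB_CONN; otherwise assemble it from the individual variables."""
    if env.get("DB_CONN"):
        return env["DB_CONN"]
    return "mysql+pymysql://{u}:{p}@{h}/{n}".format(
        u=env["DB_USER"], p=env["DB_PASSWORD"],
        h=env.get("DB_HOST", "db"),  # in-container default host
        n=env["DB_NAME"],
    )
```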
## Troubleshooting
### Database Connection Issues:
```bash
# Check if database container is running
docker ps | grep db
# Check database logs
docker logs infoscreen-db
# Test direct connection
docker exec -it infoscreen-db mysql -u infoscreen_admin -p infoscreen_by_taa
```
### Migration Issues:
```bash
# Check current state
cd /workspace/server && alembic current
# Show migration history
cd /workspace/server && alembic history
# Show pending migrations
cd /workspace/server && alembic show head
```
### Reset Database (⚠️ DESTRUCTIVE):
```bash
# Stop services
docker-compose down
# Remove database volume
docker volume rm infoscreen_2025_db-data
# Restart and reinitialize
docker-compose up -d db
cd /workspace/server && python initialize_database.py
```
## Production Deployment
The production setup in `docker-compose.prod.yml` includes automatic database initialization:
```yaml
server:
command: >
bash -c "alembic -c /app/server/alembic.ini upgrade head &&
python /app/server/init_defaults.py &&
exec gunicorn server.wsgi:app --bind 0.0.0.0:8000"
```
This ensures the database is properly initialized on every deployment.


@@ -0,0 +1,18 @@
"2.11.","Allerseelen",20251102,20251102,
"Ferien2","Weihnachtsferien",20251224,20260106,
"Ferien3","Semesterferien",20260216,20260222,
"Ferien4_2","Osterferien",20260328,20260406,
"Ferien4","Hl. Florian",20260504,20260504,
"26.10.","Nationalfeiertag",20251026,20251026,"F"
"27.10.","Herbstferien",20251027,20251027,"F"
"28.10.","Herbstferien",20251028,20251028,"F"
"29.10.","Herbstferien",20251029,20251029,"F"
"30.10.","Herbstferien",20251030,20251030,"F"
"31.10.","Herbstferien",20251031,20251031,"F"
"1.11.","Allerheiligen",20251101,20251101,"F"
"8.12.","Mariä Empfängnis",20251208,20251208,"F"
"1.5.","Staatsfeiertag",20260501,20260501,"F"
"14.5.","Christi Himmelfahrt",20260514,20260514,"F"
"24.5.","Pfingstsonntag",20260524,20260524,"F"
"25.5.","Pfingstmontag",20260525,20260525,"F"
"4.6.","Fronleichnam",20260604,20260604,"F"

Makefile Normal file

@@ -0,0 +1,92 @@
# Makefile for infoscreen_2025
# Usage: run `make help` to see available targets.
# Default compose files
COMPOSE_FILES=-f docker-compose.yml -f docker-compose.override.yml
COMPOSE=docker compose $(COMPOSE_FILES)
# Registry and image names (adjust if needed)
REGISTRY=ghcr.io/robbstarkaustria
API_IMAGE=$(REGISTRY)/infoscreen-api:latest
DASH_IMAGE=$(REGISTRY)/infoscreen-dashboard:latest
LISTENER_IMAGE=$(REGISTRY)/infoscreen-listener:latest
SCHED_IMAGE=$(REGISTRY)/infoscreen-scheduler:latest
.PHONY: help
help:
@echo "Available targets:"
@echo " up - Start dev stack (compose + override)"
@echo " down - Stop dev stack"
@echo " logs - Tail logs for all services"
@echo " logs-% - Tail logs for a specific service (e.g., make logs-server)"
@echo " build - Build all images locally"
@echo " push - Push built images to GHCR"
@echo " pull-prod - Pull prod images from GHCR"
@echo " up-prod - Start prod stack (docker-compose.prod.yml)"
@echo " down-prod - Stop prod stack"
@echo " health - Quick health checks"
@echo " fix-perms - Recursively chown workspace to current user"
# ---------- Development stack ----------
.PHONY: up
up: ## Start dev stack
$(COMPOSE) up -d --build
.PHONY: down
down: ## Stop dev stack
$(COMPOSE) down
.PHONY: logs
logs: ## Tail logs for all services
$(COMPOSE) logs -f
.PHONY: logs-%
logs-%: ## Tail logs for a specific service, e.g. `make logs-server`
$(COMPOSE) logs -f $*
# ---------- Images: build/push ----------
.PHONY: build
build: ## Build all images locally
docker build -f server/Dockerfile -t $(API_IMAGE) .
docker build -f dashboard/Dockerfile -t $(DASH_IMAGE) .
docker build -f listener/Dockerfile -t $(LISTENER_IMAGE) .
docker build -f scheduler/Dockerfile -t $(SCHED_IMAGE) .
.PHONY: push
push: ## Push all images to GHCR
docker push $(API_IMAGE)
docker push $(DASH_IMAGE)
docker push $(LISTENER_IMAGE)
docker push $(SCHED_IMAGE)
# ---------- Production stack ----------
PROD_COMPOSE=docker compose -f docker-compose.prod.yml
.PHONY: pull-prod
pull-prod: ## Pull prod images
$(PROD_COMPOSE) pull
.PHONY: up-prod
up-prod: ## Start prod stack
$(PROD_COMPOSE) up -d
.PHONY: down-prod
down-prod: ## Stop prod stack
$(PROD_COMPOSE) down
# ---------- Health ----------
.PHONY: health
health: ## Quick health checks
@echo "API health:" && curl -fsS http://localhost:8000/health || true
@echo "Dashboard (dev):" && curl -fsS http://localhost:5173/ || true
@echo "MQTT TCP 1883:" && nc -z localhost 1883 && echo OK || echo FAIL
@echo "MQTT WS 9001:" && nc -z localhost 9001 && echo OK || echo FAIL
# ---------- Permissions ----------
.PHONY: fix-perms
fix-perms:
@echo "Fixing ownership to current user recursively (may prompt for sudo password)..."
sudo chown -R $$(id -u):$$(id -g) .
@echo "Done. Consider adding UID and GID to your .env to prevent future root-owned files:"
@echo " echo UID=$$(id -u) >> .env && echo GID=$$(id -g) >> .env"

README.md Normal file

@@ -0,0 +1,407 @@
# Infoscreen 2025
[![Docker](https://img.shields.io/badge/Docker-Multi--Service-blue?logo=docker)](https://www.docker.com/)
[![React](https://img.shields.io/badge/React-19.1.0-61DAFB?logo=react)](https://reactjs.org/)
[![Flask](https://img.shields.io/badge/Flask-REST_API-green?logo=flask)](https://flask.palletsprojects.com/)
[![MariaDB](https://img.shields.io/badge/MariaDB-11.2-003545?logo=mariadb)](https://mariadb.org/)
[![MQTT](https://img.shields.io/badge/MQTT-Eclipse_Mosquitto-purple)](https://mosquitto.org/)
A comprehensive multi-service digital signage solution for educational institutions, featuring client management, event scheduling, presentation conversion, and real-time MQTT communication.
## 🏗️ Architecture Overview
```
┌─────────────────┐ ┌─────────────────┐ ┌─────────────────┐
│ Dashboard │ │ API Server │ │ Listener │
│ (React/Vite) │◄──►│ (Flask) │◄──►│ (MQTT Client) │
└─────────────────┘ └─────────────────┘ └─────────────────┘
│ │ │
│ ▼ │
│ ┌─────────────────┐ │
│ │ MariaDB │ │
│ │ (Database) │ │
│ └─────────────────┘ │
│ │
└────────────────────┬───────────────────────────┘
┌─────────────────┐
│ MQTT Broker │
│ (Mosquitto) │
└─────────────────┘
┌────────────────────┼────────────────────┐
▼ ▼ ▼
┌─────────────────┐ ┌─────────────────┐ ┌─────────────────┐
│ Scheduler │ │ Worker │ │ Infoscreen │
│ (Events) │ │ (Conversions) │ │ Clients │
└─────────────────┘ └─────────────────┘ └─────────────────┘
```
## 🌟 Key Features
### 📊 **Dashboard Management**
- Modern React-based web interface with Syncfusion components
- Real-time client monitoring and group management
- Event scheduling with academic period support
- Media management with presentation conversion
- Holiday calendar integration
### 🎯 **Event System**
- **Presentations**: PowerPoint/LibreOffice → PDF conversion via Gotenberg
- **Websites**: URL-based content display
- **Videos**: Media file streaming
- **Messages**: Text announcements
- **WebUntis**: Educational schedule integration
### 🏫 **Academic Period Management**
- Support for school years, semesters, and trimesters
- Austrian school system integration
- Holiday calendar synchronization
- Period-based event organization
### 📡 **Real-time Communication**
- MQTT-based client discovery and heartbeat monitoring
- Retained topics for reliable state synchronization
- WebSocket support for browser clients
- Automatic client group assignment
### 🔄 **Background Processing**
- Redis-based job queues for presentation conversion
- Gotenberg integration for LibreOffice/PowerPoint processing
- Asynchronous file processing with status tracking
- RQ (Redis Queue) worker management
## 🚀 Quick Start
### Prerequisites
- Docker & Docker Compose
- Git
- SSL certificates (for production)
### Development Setup
1. **Clone the repository**
```bash
git clone <repository-url>
cd infoscreen_2025
```
2. **Environment Configuration**
```bash
cp .env.example .env
# Edit .env with your configuration
```
3. **Start the development stack**
```bash
make up
# or: docker compose up -d --build
```
4. **Access the services**
- Dashboard: http://localhost:5173
- API: http://localhost:8000
- Database: localhost:3306
- MQTT: localhost:1883 (WebSocket: 9001)
### Production Deployment
1. **Build and push images**
```bash
make build
make push
```
2. **Deploy on server**
```bash
make pull-prod
make up-prod
```
For detailed deployment instructions, see:
- [Debian Deployment Guide](deployment-debian.md)
- [Ubuntu Deployment Guide](deployment-ubuntu.md)
## 🛠️ Services
### 🖥️ **Dashboard** (`dashboard/`)
- **Technology**: React 19 + TypeScript + Vite
- **UI Framework**: Syncfusion components + Tailwind CSS
- **Features**: Responsive design, real-time updates, file management
- **Port**: 5173 (dev), served via Nginx (prod)
### 🔧 **API Server** (`server/`)
- **Technology**: Flask + SQLAlchemy + Alembic
- **Database**: MariaDB with timezone-aware timestamps
- **Features**: RESTful API, file uploads, MQTT integration
- **Port**: 8000
- **Health Check**: `/health`
### 👂 **Listener** (`listener/`)
- **Technology**: Python + paho-mqtt
- **Purpose**: MQTT message processing, client discovery
- **Features**: Heartbeat monitoring, automatic client registration
### ⏰ **Scheduler** (`scheduler/`)
- **Technology**: Python + SQLAlchemy
- **Purpose**: Event publishing, group-based content distribution
- **Features**: Time-based event activation, MQTT publishing
### 🔄 **Worker** (Conversion Service)
- **Technology**: RQ (Redis Queue) + Gotenberg
- **Purpose**: Background presentation conversion
- **Features**: PPT/PPTX/ODP → PDF conversion, status tracking
### 🗄️ **Database** (MariaDB 11.2)
- **Features**: Health checks, automatic initialization
- **Migrations**: Alembic-based schema management
- **Timezone**: UTC-aware timestamps
### 📡 **MQTT Broker** (Eclipse Mosquitto 2.0.21)
- **Features**: WebSocket support, health monitoring
- **Topics**:
- `infoscreen/discovery` - Client registration
- `infoscreen/{uuid}/heartbeat` - Client alive status
- `infoscreen/events/{group_id}` - Event distribution
- `infoscreen/{uuid}/group_id` - Client group assignment
## 📁 Project Structure
```
infoscreen_2025/
├── dashboard/ # React frontend
│ ├── src/ # React components and logic
│ ├── public/ # Static assets
│ └── Dockerfile # Production build
├── server/ # Flask API backend
│ ├── routes/ # API endpoints
│ ├── alembic/ # Database migrations
│ ├── media/ # File storage
│ └── worker.py # Background jobs
├── listener/ # MQTT listener service
├── scheduler/ # Event scheduling service
├── models/ # Shared database models
├── mosquitto/ # MQTT broker configuration
├── certs/ # SSL certificates
├── docker-compose.yml # Development setup
├── docker-compose.prod.yml # Production setup
└── Makefile # Development shortcuts
```
## 🔧 Development
### Available Commands
```bash
# Development
make up # Start dev stack
make down # Stop dev stack
make logs # View all logs
make logs-server # View specific service logs
# Building & Deployment
make build # Build all images
make push # Push to registry
make pull-prod # Pull production images
make up-prod # Start production stack
# Maintenance
make health # Health checks
make fix-perms # Fix file permissions
```
### Database Management
```bash
# Access database directly
docker exec -it infoscreen-db mysql -u${DB_USER} -p${DB_PASSWORD} ${DB_NAME}
# Run migrations
docker exec -it infoscreen-api alembic upgrade head
# Initialize academic periods (Austrian school system)
docker exec -it infoscreen-api python init_academic_periods.py
```
### MQTT Testing
```bash
# Subscribe to all topics
mosquitto_sub -h localhost -t "infoscreen/#" -v
# Publish test message
mosquitto_pub -h localhost -t "infoscreen/test" -m "Hello World"
# Monitor client heartbeats
mosquitto_sub -h localhost -t "infoscreen/+/heartbeat" -v
```
## 🌐 API Endpoints
### Core Resources
- `GET /api/clients` - List all registered clients
- `PUT /api/clients/{uuid}/group` - Assign client to group
- `GET /api/groups` - List client groups with alive status
- `GET /api/events` - List events with filtering
- `POST /api/events` - Create new event
- `GET /api/academic_periods` - List academic periods
- `POST /api/academic_periods/active` - Set active period
### File Management
- `POST /api/files` - Upload media files
- `GET /api/files/{path}` - Download files
- `GET /api/files/converted/{path}` - Download converted PDFs
- `POST /api/conversions/{media_id}/pdf` - Request conversion
- `GET /api/conversions/{media_id}/status` - Check conversion status
### Health & Monitoring
- `GET /health` - Service health check
- `GET /api/screenshots/{uuid}.jpg` - Client screenshots
## 🎨 Frontend Features
### Syncfusion Components Used
- **Schedule**: Event calendar with drag-drop support
- **Grid**: Data tables with filtering and sorting
- **DropDownList**: Group and period selectors
- **FileManager**: Media upload and organization
- **Kanban**: Task management views
- **Notifications**: Toast messages and alerts
### Pages Overview
- **Dashboard**: System overview and statistics
- **Clients**: Device management and monitoring
- **Groups**: Client group organization
- **Events**: Schedule management
- **Media**: File upload and conversion
- **Settings**: System configuration
- **Holidays**: Academic calendar management
## 🔒 Security & Authentication
- **Environment Variables**: Sensitive data via `.env`
- **SSL/TLS**: HTTPS support with custom certificates
- **MQTT Security**: Username/password authentication
- **Database**: Parameterized queries, connection pooling
- **File Uploads**: Type validation, size limits
- **CORS**: Configured for production deployment
## 📊 Monitoring & Logging
### Health Checks
All services include Docker health checks:
- API: HTTP endpoint monitoring
- Database: Connection and initialization status
- MQTT: Pub/sub functionality test
- Dashboard: Nginx availability
### Logging Strategy
- **Development**: Docker Compose logs with service prefixes
- **Production**: Centralized logging via Docker log drivers
- **MQTT**: Message-level debugging available
- **Database**: Query logging in development mode
## 🌍 Deployment Options
### Development
- **Hot Reload**: Vite dev server + Flask debug mode
- **Volume Mounts**: Live code editing
- **Debug Ports**: Python debugger support (port 5678)
- **Local Certificates**: Self-signed SSL for testing
### Production
- **Optimized Builds**: Multi-stage Dockerfiles
- **Reverse Proxy**: Nginx with SSL termination
- **Health Monitoring**: Comprehensive healthchecks
- **Registry**: GitHub Container Registry integration
- **Scaling**: Docker Compose for single-node deployment
## 🤝 Contributing
1. Fork the repository
2. Create a feature branch: `git checkout -b feature/amazing-feature`
3. Commit your changes: `git commit -m 'Add amazing feature'`
4. Push to the branch: `git push origin feature/amazing-feature`
5. Open a Pull Request
### Development Guidelines
- Follow existing code patterns and naming conventions
- Add appropriate tests for new features
- Update documentation for API changes
- Use TypeScript for frontend development
- Follow Python PEP 8 for backend code
## 📋 Requirements
### System Requirements
- **CPU**: 2+ cores recommended
- **RAM**: 4GB minimum, 8GB recommended
- **Storage**: 20GB+ for media files and database
- **Network**: Reliable internet for client communication
### Software Dependencies
- Docker 24.0+
- Docker Compose 2.0+
- Git 2.30+
- Modern web browser (Chrome, Firefox, Safari, Edge)
## 🐛 Troubleshooting
### Common Issues
**Services won't start**
```bash
# Check service health
make health
# View specific service logs
make logs-server
make logs-db
```
**Database connection errors**
```bash
# Verify database is running
docker exec -it infoscreen-db mysqladmin ping
# Check credentials in .env file
# Restart dependent services
```
**MQTT communication issues**
```bash
# Test MQTT broker
mosquitto_pub -h localhost -t test -m "hello"
# Check client certificates and credentials
# Verify firewall settings for ports 1883/9001
```
**File conversion problems**
```bash
# Check Gotenberg service
curl http://localhost:3000/health
# Monitor worker logs
make logs-worker
# Check Redis queue status
docker exec -it infoscreen-redis redis-cli LLEN conversions
```
## 📄 License
This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.
## 🙏 Acknowledgments
- **Syncfusion**: UI components for React dashboard
- **Eclipse Mosquitto**: MQTT broker implementation
- **Gotenberg**: Document conversion service
- **MariaDB**: Reliable database engine
- **Flask**: Python web framework
- **React**: Frontend user interface library
---
For detailed technical documentation, deployment guides, and API specifications, please refer to the additional documentation files in this repository.

__init__.py Normal file (empty)

create_init_files.py Normal file

@@ -0,0 +1,17 @@
import os

folders = [
    "server",
    "dashboard",
    "dashboard/callbacks",
    "dashboard/utils",
]

for folder in folders:
    path = os.path.join(os.getcwd(), folder, "__init__.py")
    if not os.path.exists(path):
        with open(path, "w"):
            pass  # create an empty file
        print(f"Created: {path}")
    else:
        print(f"Already exists: {path}")

dashboard/.dockerignore Normal file

@@ -0,0 +1 @@
node_modules

dashboard/.eslintrc.cjs Normal file

@@ -0,0 +1,34 @@
module.exports = {
root: true,
env: {
browser: true,
es2021: true,
},
extends: [
'eslint:recommended',
'plugin:react/recommended',
'plugin:react-hooks/recommended',
'plugin:@typescript-eslint/recommended',
'plugin:prettier/recommended'
],
parser: '@typescript-eslint/parser',
parserOptions: {
ecmaVersion: 'latest',
sourceType: 'module',
ecmaFeatures: {
jsx: true,
},
},
plugins: ['react', '@typescript-eslint'],
settings: {
react: {
version: 'detect',
},
},
rules: {
// Examples of sensible adjustments
'react/react-in-jsx-scope': 'off', // not needed with React 17+
'@typescript-eslint/explicit-module-boundary-types': 'off',
'@typescript-eslint/no-unused-vars': ['warn', { argsIgnorePattern: '^_' }],
},
};

dashboard/.prettierrc Normal file

@@ -0,0 +1,9 @@
{
"semi": true,
"singleQuote": true,
"trailingComma": "es5",
"tabWidth": 2,
"printWidth": 100,
"bracketSpacing": true,
"arrowParens": "avoid"
}


@@ -0,0 +1,9 @@
{
"extends": [
"stylelint-config-standard",
"stylelint-config-tailwindcss"
],
"rules": {
"at-rule-no-unknown": null
}
}

dashboard/Dockerfile Normal file

@@ -0,0 +1,25 @@
# ==========================================
# dashboard/Dockerfile (Production)
# ==========================================
FROM node:20-alpine AS build
WORKDIR /app
# Copy package.json and the lockfile from the build context (./dashboard)
COPY package*.json ./

# Install all dependencies: the build step itself needs devDependencies (vite, tsc)
RUN npm ci

# Copy the source code and build
COPY . .
ARG VITE_API_URL
ENV VITE_API_URL=${VITE_API_URL}
RUN npm run build

FROM nginx:1.25-alpine
COPY --from=build /app/dist /usr/share/nginx/html
EXPOSE 80
CMD ["nginx", "-g", "daemon off;"]

dashboard/Dockerfile.dev Normal file

@@ -0,0 +1,28 @@
# ==========================================
# dashboard/Dockerfile.dev (Development)
# 🔧 OPTIMIZED: for fast development with Vite and npm
# ==========================================
FROM node:20-alpine
# Make sure the required tools are available (e.g. for wait-for-backend.sh)
RUN apk add --no-cache curl

# Set the working directory straight to the dashboard directory in the container
# (the build context is ./dashboard, see docker-compose.override.yml)
WORKDIR /workspace/dashboard

# COPY: only the package files, relative to the build context (NO /workspace paths)
# package*.json covers both package.json and package-lock.json, if present
COPY package*.json ./

# Make the install robust: npm ci requires package-lock.json; fall back to npm install
RUN if [ -f package-lock.json ]; then \
        npm ci --legacy-peer-deps; \
    else \
        npm install --legacy-peer-deps; \
    fi && \
    npm cache clean --force
EXPOSE 5173 9230
CMD ["npm", "run", "dev", "--", "--host", "0.0.0.0", "--port", "5173"]

dashboard/README.md Normal file

@@ -0,0 +1,54 @@
# React + TypeScript + Vite
This template provides a minimal setup to get React working in Vite with HMR and some ESLint rules.
Currently, two official plugins are available:
- [@vitejs/plugin-react](https://github.com/vitejs/vite-plugin-react/blob/main/packages/plugin-react) uses [Babel](https://babeljs.io/) for Fast Refresh
- [@vitejs/plugin-react-swc](https://github.com/vitejs/vite-plugin-react/blob/main/packages/plugin-react-swc) uses [SWC](https://swc.rs/) for Fast Refresh
## Expanding the ESLint configuration
If you are developing a production application, we recommend updating the configuration to enable type-aware lint rules:
```js
export default tseslint.config({
extends: [
// Remove ...tseslint.configs.recommended and replace with this
...tseslint.configs.recommendedTypeChecked,
// Alternatively, use this for stricter rules
...tseslint.configs.strictTypeChecked,
// Optionally, add this for stylistic rules
...tseslint.configs.stylisticTypeChecked,
],
languageOptions: {
// other options...
parserOptions: {
project: ['./tsconfig.node.json', './tsconfig.app.json'],
tsconfigRootDir: import.meta.dirname,
},
},
})
```
You can also install [eslint-plugin-react-x](https://github.com/Rel1cx/eslint-react/tree/main/packages/plugins/eslint-plugin-react-x) and [eslint-plugin-react-dom](https://github.com/Rel1cx/eslint-react/tree/main/packages/plugins/eslint-plugin-react-dom) for React-specific lint rules:
```js
// eslint.config.js
import reactX from 'eslint-plugin-react-x'
import reactDom from 'eslint-plugin-react-dom'
export default tseslint.config({
plugins: {
// Add the react-x and react-dom plugins
'react-x': reactX,
'react-dom': reactDom,
},
rules: {
// other rules...
// Enable its recommended typescript rules
...reactX.configs['recommended-typescript'].rules,
...reactDom.configs.recommended.rules,
},
})
```


@@ -0,0 +1,28 @@
import js from '@eslint/js'
import globals from 'globals'
import reactHooks from 'eslint-plugin-react-hooks'
import reactRefresh from 'eslint-plugin-react-refresh'
import tseslint from 'typescript-eslint'
export default tseslint.config(
{ ignores: ['dist'] },
{
extends: [js.configs.recommended, ...tseslint.configs.recommended],
files: ['**/*.{ts,tsx}'],
languageOptions: {
ecmaVersion: 2020,
globals: globals.browser,
},
plugins: {
'react-hooks': reactHooks,
'react-refresh': reactRefresh,
},
rules: {
...reactHooks.configs.recommended.rules,
'react-refresh/only-export-components': [
'warn',
{ allowConstantExport: true },
],
},
},
)

dashboard/index.html Normal file

@@ -0,0 +1,13 @@
<!doctype html>
<html lang="en">
<head>
<meta charset="UTF-8" />
<link rel="icon" type="image/svg+xml" href="/vite.svg" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
<title>Vite + React + TS</title>
</head>
<body>
<div id="root"></div>
<script type="module" src="/src/main.tsx"></script>
</body>
</html>

dashboard/package-lock.json generated Normal file
(diff suppressed because it is too large)

dashboard/package.json Normal file

@@ -0,0 +1,75 @@
{
"name": "dashboard",
"private": true,
"version": "0.0.0",
"type": "module",
"scripts": {
"dev": "vite",
"build": "tsc -b && vite build",
"lint": "eslint .",
"preview": "vite preview"
},
"dependencies": {
"@syncfusion/ej2-base": "^30.2.0",
"@syncfusion/ej2-buttons": "^30.2.0",
"@syncfusion/ej2-calendars": "^30.2.0",
"@syncfusion/ej2-dropdowns": "^30.2.0",
"@syncfusion/ej2-grids": "^30.2.0",
"@syncfusion/ej2-icons": "^30.2.0",
"@syncfusion/ej2-inputs": "^30.2.0",
"@syncfusion/ej2-kanban": "^30.2.0",
"@syncfusion/ej2-layouts": "^30.2.0",
"@syncfusion/ej2-lists": "^30.2.0",
"@syncfusion/ej2-navigations": "^30.2.0",
"@syncfusion/ej2-notifications": "^30.2.0",
"@syncfusion/ej2-popups": "^30.2.0",
"@syncfusion/ej2-react-base": "^30.2.0",
"@syncfusion/ej2-react-buttons": "^30.2.0",
"@syncfusion/ej2-react-calendars": "^30.2.0",
"@syncfusion/ej2-react-dropdowns": "^30.2.0",
"@syncfusion/ej2-react-filemanager": "^30.2.0",
"@syncfusion/ej2-react-grids": "^30.2.0",
"@syncfusion/ej2-react-inputs": "^30.2.0",
"@syncfusion/ej2-react-kanban": "^30.2.0",
"@syncfusion/ej2-react-layouts": "^30.2.0",
"@syncfusion/ej2-react-navigations": "^30.2.0",
"@syncfusion/ej2-react-notifications": "^30.2.0",
"@syncfusion/ej2-react-popups": "^30.2.0",
"@syncfusion/ej2-react-schedule": "^30.2.0",
"@syncfusion/ej2-splitbuttons": "^30.2.0",
"cldr-data": "^36.0.4",
"lucide-react": "^0.522.0",
"react": "^19.1.0",
"react-dom": "^19.1.0",
"react-router-dom": "^7.6.2"
},
"devDependencies": {
"@eslint/js": "^9.25.0",
"@tailwindcss/aspect-ratio": "^0.4.2",
"@tailwindcss/forms": "^0.5.10",
"@tailwindcss/typography": "^0.5.16",
"@types/react": "^19.1.8",
"@types/react-dom": "^19.1.6",
"@types/react-router-dom": "^5.3.3",
"@typescript-eslint/eslint-plugin": "^8.34.1",
"@typescript-eslint/parser": "^8.34.1",
"@vitejs/plugin-react": "^4.4.1",
"autoprefixer": "^10.4.21",
"eslint": "^9.29.0",
"eslint-config-prettier": "^10.1.5",
"eslint-plugin-prettier": "^5.5.0",
"eslint-plugin-react": "^7.37.5",
"eslint-plugin-react-hooks": "^5.2.0",
"eslint-plugin-react-refresh": "^0.4.19",
"globals": "^16.0.0",
"postcss": "^8.5.6",
"prettier": "^3.5.3",
"stylelint": "^16.21.0",
"stylelint-config-standard": "^38.0.0",
"stylelint-config-tailwindcss": "^1.0.0",
"tailwindcss": "^3.4.17",
"typescript": "~5.8.3",
"typescript-eslint": "^8.30.1",
"vite": "^6.3.5"
}
}


@@ -0,0 +1,6 @@
module.exports = {
plugins: {
tailwindcss: {},
autoprefixer: {},
},
}


@@ -0,0 +1,96 @@
{
"appName": "Infoscreen-Management",
"version": "2025.1.0-alpha.7",
"copyright": "© 2025 Third-Age-Applications",
"supportContact": "support@third-age-applications.com",
"description": "Eine zentrale Verwaltungsoberfläche für digitale Informationsbildschirme.",
"techStack": {
"Frontend": "React, Vite, TypeScript",
"Backend": "Python (Flask), SQLAlchemy",
"Database": "MariaDB",
"Realtime": "Mosquitto (MQTT)",
"Containerization": "Docker"
},
"openSourceComponents": {
"frontend": [
{ "name": "React", "license": "MIT" },
{ "name": "Vite", "license": "MIT" },
{ "name": "Lucide Icons", "license": "ISC" },
{ "name": "Syncfusion UI Components", "license": "Kommerziell / Community" }
],
"backend": [
{ "name": "Flask", "license": "BSD" },
{ "name": "SQLAlchemy", "license": "MIT" },
{ "name": "Paho-MQTT", "license": "EPL/EDL" },
{ "name": "Alembic", "license": "MIT" }
]
},
"buildInfo": {
"buildDate": "2025-09-20T11:00:00Z",
"commitId": "8d1df7199cb7"
},
"changelog": [
{
"version": "2025.1.0-alpha.7",
"date": "2025-09-21",
"changes": [
"🧭 UI: Periode-Auswahl (Syncfusion) neben Gruppenauswahl; kompaktes Layout",
"✅ Anzeige: Abzeichen für vorhandenen Ferienplan + Zähler Ferien im Blick",
"🛠️ API: Endpunkte für akademische Perioden (list, active GET/POST, for_date)",
"📅 Scheduler: Standardmäßig keine Terminierung in Ferien; Block-Darstellung wie Ganztagesereignis; schwarze Textfarbe",
"📤 Ferien: Upload von TXT/CSV (headless TXT nutzt Spalten 24)",
"🔧 UX: Schalter in einer Reihe; Dropdown-Breiten optimiert"
]
},
{
"version": "2025.1.0-alpha.6",
"date": "2025-09-20",
"changes": [
"🗓️ NEU: Akademische Perioden System - Unterstützung für Schuljahre, Semester und Trimester",
"🏗️ DATENBANK: Neue 'academic_periods' Tabelle für zeitbasierte Organisation",
"🔗 ERWEITERT: Events und Medien können jetzt optional einer akademischen Periode zugeordnet werden",
"📊 ARCHITEKTUR: Vollständig rückwärtskompatible Implementierung für schrittweise Einführung",
"🎯 BILDUNG: Fokus auf Schulumgebung mit Erweiterbarkeit für Hochschulen",
"⚙️ TOOLS: Automatische Erstellung von Standard-Schuljahren für österreichische Schulen"
]
},
{
"version": "2025.1.0-alpha.5",
"date": "2025-09-14",
"changes": [
"Komplettes Redesign des Backend-Handlings der Gruppenzuordnungen von neuen Clients und der Schritte bei Änderung der Gruppenzuordnung."
]
},
{
"version": "2025.1.0-alpha.4",
"date": "2025-09-01",
"changes": [
"Grundstruktur für Deployment getestet und optimiert.",
"FIX: Programmfehler beim Umschalten der Ansicht auf der Medien-Seite behoben."
]
},
{
"version": "2025.1.0-alpha.3",
"date": "2025-08-30",
"changes": [
"NEU: Programminfo-Seite mit dynamischen Daten, Build-Infos und Changelog.",
"NEU: Logout-Funktionalität implementiert.",
"FIX: Breite der Sidebar im eingeklappten Zustand korrigiert."
]
},
{
"version": "2025.1.0-alpha.2",
"date": "2025-08-29",
"changes": [
"INFO: Analyse und Anzeige der verwendeten Open-Source-Bibliotheken."
]
},
{
"version": "2025.1.0-alpha.1",
"date": "2025-08-28",
"changes": [
"Initiales Setup des Projekts und der Grundstruktur."
]
}
]
}


@@ -0,0 +1 @@
<svg xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" aria-hidden="true" role="img" class="iconify iconify--logos" width="31.88" height="32" preserveAspectRatio="xMidYMid meet" viewBox="0 0 256 257"><defs><linearGradient id="IconifyId1813088fe1fbc01fb466" x1="-.828%" x2="57.636%" y1="7.652%" y2="78.411%"><stop offset="0%" stop-color="#41D1FF"></stop><stop offset="100%" stop-color="#BD34FE"></stop></linearGradient><linearGradient id="IconifyId1813088fe1fbc01fb467" x1="43.376%" x2="50.316%" y1="2.242%" y2="89.03%"><stop offset="0%" stop-color="#FFEA83"></stop><stop offset="8.333%" stop-color="#FFDD35"></stop><stop offset="100%" stop-color="#FFA800"></stop></linearGradient></defs><path fill="url(#IconifyId1813088fe1fbc01fb466)" d="M255.153 37.938L134.897 252.976c-2.483 4.44-8.862 4.466-11.382.048L.875 37.958c-2.746-4.814 1.371-10.646 6.827-9.67l120.385 21.517a6.537 6.537 0 0 0 2.322-.004l117.867-21.483c5.438-.991 9.574 4.796 6.877 9.62Z"></path><path fill="url(#IconifyId1813088fe1fbc01fb467)" d="M185.432.063L96.44 17.501a3.268 3.268 0 0 0-2.634 3.014l-5.474 92.456a3.268 3.268 0 0 0 3.997 3.378l24.777-5.718c2.318-.535 4.413 1.507 3.936 3.838l-7.361 36.047c-.495 2.426 1.782 4.5 4.151 3.78l15.304-4.649c2.372-.72 4.652 1.36 4.15 3.788l-11.698 56.621c-.732 3.542 3.979 5.473 5.943 2.437l1.313-2.028l72.516-144.72c1.215-2.423-.88-5.186-3.54-4.672l-25.505 4.922c-2.396.462-4.435-1.77-3.759-4.114l16.646-57.705c.677-2.35-1.37-4.583-3.769-4.113Z"></path></svg>


dashboard/src/App.css Normal file

@@ -0,0 +1,275 @@
@import "../node_modules/@syncfusion/ej2-base/styles/material.css";
@import "../node_modules/@syncfusion/ej2-buttons/styles/material.css";
@import "../node_modules/@syncfusion/ej2-calendars/styles/material.css";
@import "../node_modules/@syncfusion/ej2-dropdowns/styles/material.css";
@import "../node_modules/@syncfusion/ej2-inputs/styles/material.css";
@import "../node_modules/@syncfusion/ej2-lists/styles/material.css";
@import "../node_modules/@syncfusion/ej2-navigations/styles/material.css";
@import "../node_modules/@syncfusion/ej2-popups/styles/material.css";
@import "../node_modules/@syncfusion/ej2-splitbuttons/styles/material.css";
@import "../node_modules/@syncfusion/ej2-react-schedule/styles/material.css";
@import "../node_modules/@syncfusion/ej2-kanban/styles/material.css";
@import "../node_modules/@syncfusion/ej2-notifications/styles/material.css";
@import "../node_modules/@syncfusion/ej2-react-filemanager/styles/material.css";
@import "../node_modules/@syncfusion/ej2-layouts/styles/material.css";
@import "../node_modules/@syncfusion/ej2-grids/styles/material.css";
@import "../node_modules/@syncfusion/ej2-icons/styles/material.css";
body {
font-family: Inter, 'Segoe UI', Roboto, Arial, sans-serif;
overflow: hidden; /* prevents the scrollbar at the top level */
}
:root {
--sidebar-bg: #e5d8c7;
--sidebar-fg: #78591c;
--sidebar-border: #d6c3a6;
--sidebar-text: #000;
--sidebar-hover-bg: #d1b89b;
--sidebar-hover-text: #000;
--sidebar-active-bg: #cda76b;
--sidebar-active-text: #fff;
}
/* Layout container for sidebar and content */
.layout-container {
display: flex;
height: 100vh; /* fix the height to that of the viewport */
overflow: hidden; /* keeps the scrollbar from affecting the whole container */
}
/* Pin the sidebar: no scrollbars, full height */
.sidebar-theme {
background-color: var(--sidebar-bg);
color: var(--sidebar-text);
font-size: 1.15rem;
font-family: ui-sans-serif, system-ui, -apple-system, BlinkMacSystemFont, "Segoe UI", Roboto, "Helvetica Neue", Arial, "Noto Sans", sans-serif;
flex-shrink: 0;
z-index: 10; /* makes sure the sidebar sits above the content */
height: 100vh; /* full browser height */
min-height: 100vh; /* minimum height: full browser height */
max-height: 100vh; /* cap the maximum height */
display: flex !important;
flex-direction: column !important;
overflow: hidden !important;
}
/* Ensure the navigation stacks vertically, with the footer at the bottom */
.sidebar-theme nav {
display: flex !important;
flex-direction: column !important;
flex: 1 1 auto !important;
overflow-y: auto !important;
min-height: 0 !important; /* allows flex shrinking */
}
/* Pin the footer area to the bottom */
.sidebar-theme > div:last-child {
margin-top: auto !important;
flex-shrink: 0 !important;
min-height: auto !important;
padding-bottom: 0.5rem !important; /* extra spacing from the bottom edge */
}
.sidebar-theme .sidebar-link {
text-decoration: none;
white-space: nowrap;
overflow: hidden;
text-overflow: ellipsis;
display: flex !important;
width: 100% !important;
box-sizing: border-box;
}
.sidebar-theme .sidebar-logout {
border: none;
cursor: pointer;
text-align: left;
width: 100%;
font-size: 1.15rem;
display: flex !important;
box-sizing: border-box;
}
.sidebar-link:hover,
.sidebar-logout:hover {
background-color: var(--sidebar-hover-bg);
color: var(--sidebar-hover-text);
}
.sidebar-link.active {
background-color: var(--sidebar-active-bg);
color: var(--sidebar-active-text);
font-weight: bold;
}
/* === START: SYNCFUSION-COMPATIBLE LAYOUT === */
/* The content area works with Syncfusion's natural layout */
.content-area {
display: flex;
flex-direction: column;
flex: 1;
min-width: 0; /* prevents flex-item overflow */
}
.content-header {
flex-shrink: 0; /* the header must not shrink */
}
.page-content {
flex-grow: 1; /* fills the remaining space */
overflow-y: auto; /* ONLY this area scrolls */
padding: 2rem;
background-color: #f3f4f6;
}
/* === END: SYNCFUSION-COMPATIBLE LAYOUT === */
/* Kanban cards in the sidebar style */
.e-kanban .e-card,
.e-kanban .e-card .e-card-content,
.e-kanban .e-card .e-card-header {
background-color: var(--sidebar-bg) !important;
color: var(--sidebar-fg) !important;
}
.e-kanban .e-card:hover,
.e-kanban .e-card.e-selection,
.e-kanban .e-card.e-card-active,
.e-kanban .e-card:hover .e-card-content,
.e-kanban .e-card.e-selection .e-card-content,
.e-kanban .e-card.e-card-active .e-card-content,
.e-kanban .e-card:hover .e-card-header,
.e-kanban .e-card.e-selection .e-card-header,
.e-kanban .e-card.e-card-active .e-card-header {
background-color: var(--sidebar-fg) !important;
color: var(--sidebar-bg) !important;
}
/* Optional: focus style for keyboard navigation */
.e-kanban .e-card:focus {
outline: 2px solid var(--sidebar-fg);
outline-offset: 2px;
}
/* Kanban column headers: override background and text color */
.e-kanban .e-kanban-table .e-header-cells {
background-color: color-mix(in srgb, var(--sidebar-bg) 80%, #fff 20%) !important;
color: var(--sidebar-fg) !important;
font-weight: 700;
font-size: 1.08rem;
border-bottom: 2px solid var(--sidebar-fg);
box-shadow: 0 2px 6px 0 color-mix(in srgb, #78591c 8%, transparent);
letter-spacing: 0.02em;
}
/* Header text, even more specific and with !important */
.e-kanban .e-kanban-table .e-header-cells .e-header-text {
color: color-mix(in srgb, var(--sidebar-fg) 85%, #000 15%) !important;
}
/* Remove the global scrollbar from .main-content! */
.main-content {
width: 100%;
overflow-x: auto; /* restore the original scroll behavior */
padding-bottom: 8px;
}
/* Removed: Syncfusion manages the layout itself */
/* Basic sidebar styles, Syncfusion-compatible */
#sidebar .sidebar-link,
#sidebar .sidebar-logout {
display: flex !important;
align-items: center !important;
gap: 8px !important;
}
#sidebar .sidebar-link svg,
#sidebar .sidebar-logout svg {
flex-shrink: 0 !important;
}
/* Text is ALWAYS visible by default */
#sidebar .sidebar-link .sidebar-text,
#sidebar .sidebar-logout .sidebar-text {
margin-left: 0 !important;
display: inline-block !important;
opacity: 1 !important;
transition: opacity 0.3s, transform 0.3s !important;
}
#sidebar .sidebar-link:hover,
#sidebar .sidebar-logout:hover {
background-color: var(--sidebar-hover-bg) !important;
color: var(--sidebar-hover-text) !important;
}
/* Expanded state: text visible (default) */
#sidebar .sidebar-theme.expanded .sidebar-link,
#sidebar .sidebar-theme.expanded .sidebar-logout {
justify-content: flex-start !important;
padding: 12px 24px !important;
gap: 8px !important;
}
#sidebar .sidebar-theme.expanded .sidebar-text {
display: inline-block !important;
opacity: 1 !important;
}
#sidebar .sidebar-theme.expanded .sidebar-link svg,
#sidebar .sidebar-theme.expanded .sidebar-logout svg {
margin-right: 8px !important;
}
/* Collapsed state: icons only */
#sidebar .sidebar-theme.collapsed .sidebar-link,
#sidebar .sidebar-theme.collapsed .sidebar-logout {
justify-content: center !important;
padding: 12px 8px !important;
gap: 0 !important;
position: relative !important;
}
#sidebar .sidebar-theme.collapsed .sidebar-text {
display: none !important;
}
#sidebar .sidebar-theme.collapsed .sidebar-link svg,
#sidebar .sidebar-theme.collapsed .sidebar-logout svg {
margin-right: 0 !important;
}
/* The Syncfusion TooltipComponent is used now; CSS tooltips removed */
/* Hide the logo and version number in the collapsed state */
@keyframes fade-in {
from {
opacity: 0;
transform: translateY(-50%) translateX(-5px);
}
to {
opacity: 1;
transform: translateY(-50%) translateX(0);
}
}
/* Hide the logo and version number in the collapsed state */
#sidebar .sidebar-theme.collapsed img {
display: none !important;
}
#sidebar .sidebar-theme.collapsed .version-info {
display: none !important;
}

dashboard/src/App.tsx Normal file

@@ -0,0 +1,337 @@
import React, { useState } from 'react';
import { BrowserRouter as Router, Routes, Route, Link, Outlet } from 'react-router-dom';
import { SidebarComponent } from '@syncfusion/ej2-react-navigations';
import { ButtonComponent } from '@syncfusion/ej2-react-buttons';
import { TooltipComponent } from '@syncfusion/ej2-react-popups';
import logo from './assets/logo.png';
import './App.css';
// Import Lucide icons
import {
LayoutDashboard,
Calendar,
Boxes,
Image,
User,
Settings,
Monitor,
MonitorDotIcon,
LogOut,
Wrench,
Info,
} from 'lucide-react';
import { ToastProvider } from './components/ToastProvider';
const sidebarItems = [
{ name: 'Dashboard', path: '/', icon: LayoutDashboard },
{ name: 'Termine', path: '/termine', icon: Calendar },
{ name: 'Ressourcen', path: '/ressourcen', icon: Boxes },
{ name: 'Raumgruppen', path: '/infoscr_groups', icon: MonitorDotIcon },
{ name: 'Infoscreen-Clients', path: '/clients', icon: Monitor },
{ name: 'Erweiterungsmodus', path: '/setup', icon: Wrench },
{ name: 'Medien', path: '/medien', icon: Image },
{ name: 'Benutzer', path: '/benutzer', icon: User },
{ name: 'Einstellungen', path: '/einstellungen', icon: Settings },
{ name: 'Programminfo', path: '/programminfo', icon: Info },
];
// Dummy components (can be split out into their own files)
import Dashboard from './dashboard';
import Appointments from './appointments';
import Ressourcen from './ressourcen';
import Infoscreens from './clients';
import Infoscreen_groups from './infoscreen_groups';
import Media from './media';
import Benutzer from './benutzer';
import Einstellungen from './einstellungen';
import SetupMode from './SetupMode';
import Programminfo from './programminfo';
import Logout from './logout';
// Read ENV from .env (placeholder; in the real project via process.env or an API)
// const ENV = import.meta.env.VITE_ENV || 'development';
const Layout: React.FC = () => {
const [version, setVersion] = useState('');
const [isCollapsed, setIsCollapsed] = useState(false);
let sidebarRef: SidebarComponent | null = null;
React.useEffect(() => {
fetch('/program-info.json')
.then(res => res.json())
.then(data => setVersion(data.version))
.catch(err => console.error('Failed to load version info:', err));
}, []);
const toggleSidebar = () => {
if (sidebarRef) {
sidebarRef.toggle();
}
};
const onSidebarChange = () => {
// Syncfusion distinguishes between isOpen (true/false) and dock mode
// In dock mode isOpen is true, but the sidebar is collapsed
const sidebar = sidebarRef?.element;
if (sidebar) {
const currentWidth = sidebar.style.width;
const newCollapsedState = currentWidth === '60px' || currentWidth.includes('60');
setIsCollapsed(newCollapsedState);
}
};
const sidebarTemplate = () => (
<div
className={`sidebar-theme ${isCollapsed ? 'collapsed' : 'expanded'}`}
style={{
display: 'flex',
flexDirection: 'column',
height: '100vh',
minHeight: '100vh',
overflow: 'hidden',
}}
>
<div
style={{
borderColor: 'var(--sidebar-border)',
height: '68px',
flexShrink: 0,
display: 'flex',
alignItems: 'center',
justifyContent: 'center',
borderBottom: '1px solid var(--sidebar-border)',
margin: 0,
padding: 0,
}}
>
<img
src={logo}
alt="Logo"
style={{
height: '64px',
maxHeight: '60px',
display: 'block',
margin: '0 auto',
}}
/>
</div>
<nav
style={{
flex: '1 1 auto',
display: 'flex',
flexDirection: 'column',
marginTop: '1rem',
overflowY: 'auto',
minHeight: 0, // important for flex shrinking
}}
>
{sidebarItems.map(item => {
const Icon = item.icon;
const linkContent = (
<Link
key={item.path}
to={item.path}
className="sidebar-link no-underline w-full"
style={{
display: 'flex',
alignItems: 'center',
justifyContent: 'flex-start',
gap: '8px',
padding: '12px 24px',
transition: 'background 0.2s, color 0.2s, justify-content 0.3s',
textDecoration: 'none',
color: 'var(--sidebar-fg)',
backgroundColor: 'var(--sidebar-bg)',
}}
>
<Icon size={22} style={{ flexShrink: 0, marginRight: 0 }} />
<span className="sidebar-text" style={{ marginLeft: 0, transition: 'opacity 0.3s' }}>
{item.name}
</span>
</Link>
);
// Show the Syncfusion tooltip only in the collapsed state
return isCollapsed ? (
<TooltipComponent
key={item.path}
content={item.name}
position="RightCenter"
opensOn="Hover"
showTipPointer={true}
animation={{
open: { effect: 'FadeIn', duration: 200 },
close: { effect: 'FadeOut', duration: 200 },
}}
>
{linkContent}
</TooltipComponent>
) : (
linkContent
);
})}
</nav>
<div
style={{
flexShrink: 0,
marginTop: 'auto',
display: 'flex',
flexDirection: 'column',
minHeight: 'auto',
}}
>
{(() => {
const logoutContent = (
<Link
to="/logout"
className="sidebar-logout no-underline w-full"
style={{
display: 'flex',
alignItems: 'center',
justifyContent: 'flex-start',
gap: '8px',
padding: '12px 24px',
transition: 'background 0.2s, color 0.2s, justify-content 0.3s',
textDecoration: 'none',
color: 'var(--sidebar-fg)',
backgroundColor: 'var(--sidebar-bg)',
border: 'none',
cursor: 'pointer',
fontSize: '1.15rem',
}}
>
<LogOut size={22} style={{ flexShrink: 0, marginRight: 0 }} />
<span className="sidebar-text" style={{ marginLeft: 0, transition: 'opacity 0.3s' }}>
Abmelden
</span>
</Link>
);
// Show the Syncfusion tooltip only in the collapsed state
return isCollapsed ? (
<TooltipComponent
content="Abmelden"
position="RightCenter"
opensOn="Hover"
showTipPointer={true}
animation={{
open: { effect: 'FadeIn', duration: 200 },
close: { effect: 'FadeOut', duration: 200 },
}}
>
{logoutContent}
</TooltipComponent>
) : (
logoutContent
);
})()}
{version && (
<div
className="version-info px-6 py-2 text-xs text-center opacity-70 border-t"
style={{ borderColor: 'var(--sidebar-border)' }}
>
Version {version}
</div>
)}
</div>
</div>
);
return (
<div className="layout-container">
<SidebarComponent
id="sidebar"
ref={(sidebar: SidebarComponent | null) => {
sidebarRef = sidebar;
}}
width="256px"
target=".layout-container"
isOpen={true}
closeOnDocumentClick={false}
enableGestures={false}
type="Auto"
enableDock={true}
dockSize="60px"
change={onSidebarChange}
>
{sidebarTemplate()}
</SidebarComponent>
<div className="content-area">
<header
className="content-header flex items-center shadow"
style={{
backgroundColor: '#e5d8c7',
color: '#78591c',
height: '68px', // exactly the same height as the sidebar header
fontSize: '1.15rem',
fontFamily:
'ui-sans-serif, system-ui, -apple-system, BlinkMacSystemFont, "Segoe UI", Roboto, "Helvetica Neue", Arial, "Noto Sans", sans-serif',
margin: 0,
padding: '0 2rem 0 0', // right padding only, none on the left
boxSizing: 'border-box',
display: 'flex',
alignItems: 'center',
}}
>
<ButtonComponent
cssClass="e-inherit"
iconCss="e-icons e-menu"
onClick={toggleSidebar}
isToggle={true}
style={{
margin: '0 1rem 0 0', // right margin only, as spacing before the logo
padding: '8px 12px',
minWidth: '44px',
height: '44px',
flexShrink: 0,
}}
/>
<img src={logo} alt="Logo" className="h-16 mr-4" style={{ maxHeight: '60px' }} />
<span className="text-2xl font-bold mr-8" style={{ color: '#78591c' }}>
Infoscreen-Management
</span>
<span className="ml-auto text-lg font-medium" style={{ color: '#78591c' }}>
[Organisationsname]
</span>
</header>
<main className="page-content">
<Outlet />
</main>
</div>
</div>
);
};
const App: React.FC = () => {
// automatic navigation to /clients on empty description was removed
return (
<ToastProvider>
<Routes>
<Route path="/" element={<Layout />}>
<Route index element={<Dashboard />} />
<Route path="termine" element={<Appointments />} />
<Route path="ressourcen" element={<Ressourcen />} />
<Route path="infoscr_groups" element={<Infoscreen_groups />} />
<Route path="medien" element={<Media />} />
<Route path="benutzer" element={<Benutzer />} />
<Route path="einstellungen" element={<Einstellungen />} />
<Route path="clients" element={<Infoscreens />} />
<Route path="setup" element={<SetupMode />} />
<Route path="programminfo" element={<Programminfo />} />
</Route>
<Route path="/logout" element={<Logout />} />
</Routes>
</ToastProvider>
);
};
const AppWrapper: React.FC = () => (
<Router>
<App />
</Router>
);
export default AppWrapper;

174
dashboard/src/SetupMode.tsx Normal file

@@ -0,0 +1,174 @@
import React, { useEffect, useState } from 'react';
import { fetchClientsWithoutDescription, setClientDescription } from './apiClients';
import { ButtonComponent } from '@syncfusion/ej2-react-buttons';
import { TextBoxComponent } from '@syncfusion/ej2-react-inputs';
import { GridComponent, ColumnsDirective, ColumnDirective } from '@syncfusion/ej2-react-grids';
import { DialogComponent } from '@syncfusion/ej2-react-popups';
import { useClientDelete } from './hooks/useClientDelete';
type Client = {
uuid: string;
hostname?: string;
ip_address?: string;
last_alive?: string;
};
const SetupMode: React.FC = () => {
const [clients, setClients] = useState<Client[]>([]);
const [descriptions, setDescriptions] = useState<Record<string, string>>({});
const [loading /* setLoading */] = useState(false);
const [inputActive, setInputActive] = useState(false);
// delete logic from a hook (mirrors clients.tsx)
const { showDialog, deleteClientId, handleDelete, confirmDelete, cancelDelete } = useClientDelete(
async uuid => {
// reload after deleting
const updated = await fetchClientsWithoutDescription();
setClients(updated);
setDescriptions(prev => {
const copy = { ...prev };
delete copy[uuid];
return copy;
});
}
);
// helper to compare client lists
const isEqual = (a: Client[], b: Client[]) => {
if (a.length !== b.length) return false;
const aSorted = [...a].sort((x, y) => x.uuid.localeCompare(y.uuid));
const bSorted = [...b].sort((x, y) => x.uuid.localeCompare(y.uuid));
for (let i = 0; i < aSorted.length; i++) {
if (aSorted[i].uuid !== bSorted[i].uuid) return false;
if (aSorted[i].hostname !== bSorted[i].hostname) return false;
if (aSorted[i].ip_address !== bSorted[i].ip_address) return false;
if (aSorted[i].last_alive !== bSorted[i].last_alive) return false;
}
return true;
};
useEffect(() => {
let polling: ReturnType<typeof setInterval> | null = null;
const fetchClients = () => {
if (inputActive) return;
fetchClientsWithoutDescription().then(list => {
setClients(prev => (isEqual(prev, list) ? prev : list));
});
};
fetchClients();
polling = setInterval(fetchClients, 5000);
return () => {
if (polling) clearInterval(polling);
};
}, [inputActive]);
const handleDescriptionChange = (uuid: string, value: string) => {
setDescriptions(prev => ({ ...prev, [uuid]: value }));
};
const handleSave = (uuid: string) => {
setClientDescription(uuid, descriptions[uuid] || '')
.then(() => {
setClients(prev => prev.filter(c => c.uuid !== uuid));
})
.catch(err => {
console.error('Fehler beim Speichern der Beschreibung:', err);
});
};
if (loading) return <div>Lade neue Clients ...</div>;
return (
<div>
<h2>Erweiterungsmodus: Neue Clients zuordnen</h2>
<GridComponent
dataSource={clients}
allowPaging={true}
pageSettings={{ pageSize: 10 }}
rowHeight={50}
width="100%"
allowTextWrap={false}
>
<ColumnsDirective>
<ColumnDirective field="uuid" headerText="UUID" width="180" />
<ColumnDirective field="hostname" headerText="Hostname" width="90" />
<ColumnDirective field="ip_address" headerText="IP" width="80" />
<ColumnDirective
headerText="Letzter Kontakt"
width="120"
template={(props: Client) => {
if (!props.last_alive) return '';
let iso = props.last_alive;
if (!iso.endsWith('Z')) iso += 'Z';
const date = new Date(iso);
const pad = (n: number) => n.toString().padStart(2, '0');
return `${pad(date.getDate())}.${pad(date.getMonth() + 1)}.${date.getFullYear()} ${pad(date.getHours())}:${pad(date.getMinutes())}:${pad(date.getSeconds())}`;
}}
/>
<ColumnDirective
headerText="Beschreibung"
width="220"
template={(props: Client) => (
<TextBoxComponent
value={descriptions[props.uuid] || ''}
placeholder="Beschreibung eingeben"
change={e => handleDescriptionChange(props.uuid, e.value as string)}
focus={() => setInputActive(true)}
blur={() => setInputActive(false)}
/>
)}
/>
<ColumnDirective
headerText="Aktion"
width="180"
template={(props: Client) => (
<div style={{ display: 'flex', gap: '8px' }}>
<ButtonComponent
content="Speichern"
disabled={!descriptions[props.uuid]}
onClick={() => handleSave(props.uuid)}
/>
<ButtonComponent
content="Entfernen"
cssClass="e-danger"
onClick={e => {
e.stopPropagation();
handleDelete(props.uuid);
}}
/>
</div>
)}
/>
</ColumnsDirective>
</GridComponent>
{clients.length === 0 && <div>Keine neuen Clients ohne Beschreibung.</div>}
{/* Syncfusion dialog for the delete confirmation */}
{showDialog && deleteClientId && (
<DialogComponent
visible={showDialog}
header="Bestätigung"
content={(() => {
const client = clients.find(c => c.uuid === deleteClientId);
const hostname = client?.hostname ? ` (${client.hostname})` : '';
return client
? `Möchten Sie diesen Client${hostname} wirklich entfernen?`
: 'Client nicht gefunden.';
})()}
showCloseIcon={true}
width="400px"
buttons={[
{ click: confirmDelete, buttonModel: { content: 'Ja', isPrimary: true } },
{ click: cancelDelete, buttonModel: { content: 'Abbrechen' } },
]}
close={cancelDelete}
/>
)}
</div>
);
};
export default SetupMode;
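The "Letzter Kontakt" column template above zero-pads day, month, and time fields by hand. A standalone sketch of that formatting (the helper name `formatTimestamp` is ours, not part of the file):

```typescript
// Sketch of the timestamp formatting used in the "Letzter Kontakt" column.
// The helper name is hypothetical; the padding logic mirrors the template above.
const pad = (n: number): string => n.toString().padStart(2, '0');

function formatTimestamp(date: Date): string {
  return (
    `${pad(date.getDate())}.${pad(date.getMonth() + 1)}.${date.getFullYear()} ` +
    `${pad(date.getHours())}:${pad(date.getMinutes())}:${pad(date.getSeconds())}`
  );
}
```

Note that the template first appends `'Z'` to `last_alive` before constructing the Date, so the server value is treated as UTC and rendered in local time.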


@@ -0,0 +1,42 @@
export type AcademicPeriod = {
id: number;
name: string;
display_name?: string | null;
start_date: string; // YYYY-MM-DD
end_date: string; // YYYY-MM-DD
period_type: 'schuljahr' | 'semester' | 'trimester';
is_active: boolean;
};
async function api<T>(url: string, init?: RequestInit): Promise<T> {
const res = await fetch(url, { credentials: 'include', ...init });
if (!res.ok) throw new Error(`HTTP ${res.status}`);
return res.json();
}
export async function getAcademicPeriodForDate(date: Date): Promise<AcademicPeriod | null> {
const iso = date.toISOString().slice(0, 10);
const { period } = await api<{ period: AcademicPeriod | null }>(
`/api/academic_periods/for_date?date=${iso}`
);
return period ?? null;
}
export async function listAcademicPeriods(): Promise<AcademicPeriod[]> {
const { periods } = await api<{ periods: AcademicPeriod[] }>(`/api/academic_periods`);
return Array.isArray(periods) ? periods : [];
}
export async function getActiveAcademicPeriod(): Promise<AcademicPeriod | null> {
const { period } = await api<{ period: AcademicPeriod | null }>(`/api/academic_periods/active`);
return period ?? null;
}
export async function setActiveAcademicPeriod(id: number): Promise<AcademicPeriod> {
const { period } = await api<{ period: AcademicPeriod }>(`/api/academic_periods/active`, {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({ id }),
});
return period;
}
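`getAcademicPeriodForDate` above serializes the query date with `toISOString().slice(0, 10)`. Since `toISOString` works in UTC, dates constructed near local midnight can shift by a day; a minimal sketch of that serialization (the helper name is ours):

```typescript
// UTC-based YYYY-MM-DD serialization, as used by getAcademicPeriodForDate.
function toIsoDateUTC(date: Date): string {
  return date.toISOString().slice(0, 10);
}
```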

105
dashboard/src/apiClients.ts Normal file

@@ -0,0 +1,105 @@
export interface Client {
uuid: string;
hardware_token?: string;
ip?: string;
type?: string;
hostname?: string;
os_version?: string;
software_version?: string;
macs?: string;
model?: string;
description?: string;
registration_time?: string;
last_alive?: string;
is_active?: boolean;
group_id?: number;
// for health status
is_alive?: boolean;
}
export interface Group {
id: number;
name: string;
created_at?: string;
is_active?: boolean;
clients: Client[];
}
// returns all groups with their associated clients
export async function fetchGroupsWithClients(): Promise<Group[]> {
const response = await fetch('/api/groups/with_clients');
if (!response.ok) {
throw new Error('Fehler beim Laden der Gruppen mit Clients');
}
return await response.json();
}
export async function fetchClients(): Promise<Client[]> {
const response = await fetch('/api/clients');
if (!response.ok) {
throw new Error('Fehler beim Laden der Clients');
}
return await response.json();
}
export async function fetchClientsWithoutDescription(): Promise<Client[]> {
const response = await fetch('/api/clients/without_description');
if (!response.ok) {
throw new Error('Fehler beim Laden der Clients ohne Beschreibung');
}
return await response.json();
}
export async function setClientDescription(uuid: string, description: string) {
const res = await fetch(`/api/clients/${uuid}/description`, {
method: 'PUT',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({ description }),
});
if (!res.ok) throw new Error('Fehler beim Setzen der Beschreibung');
return await res.json();
}
export async function updateClientGroup(clientIds: string[], groupId: number) {
const res = await fetch('/api/clients/group', {
method: 'PUT',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({ client_ids: clientIds, group_id: groupId }),
});
if (!res.ok) throw new Error('Fehler beim Aktualisieren der Clients');
return await res.json();
}
export async function updateClient(uuid: string, data: { description?: string; model?: string }) {
const res = await fetch(`/api/clients/${uuid}`, {
method: 'PATCH',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify(data),
});
if (!res.ok) throw new Error('Fehler beim Aktualisieren des Clients');
return await res.json();
}
export async function restartClient(uuid: string): Promise<{ success: boolean; message?: string }> {
const response = await fetch(`/api/clients/${uuid}/restart`, {
method: 'POST',
});
if (!response.ok) {
const error = await response.json();
throw new Error(error.error || 'Fehler beim Neustart des Clients');
}
return await response.json();
}
export async function deleteClient(uuid: string) {
const res = await fetch(`/api/clients/${uuid}`, {
method: 'DELETE',
});
if (!res.ok) throw new Error('Fehler beim Entfernen des Clients');
return await res.json();
}
export async function fetchMediaById(mediaId: number | string) {
const response = await fetch(`/api/eventmedia/${mediaId}`);
if (!response.ok) throw new Error('Fehler beim Laden der Mediainformationen');
return await response.json();
}
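`fetchGroupsWithClients` above returns groups that embed their clients. A sketch (types and helper name are ours, not part of this module) of inverting that shape into a uuid-to-group lookup, as consumers like the grid views might do:

```typescript
// Invert groups-with-clients into a uuid -> group-name index.
interface ClientRef { uuid: string; }
interface GroupRef { id: number; name: string; clients: ClientRef[]; }

function clientGroupIndex(groups: GroupRef[]): Map<string, string> {
  const index = new Map<string, string>();
  for (const g of groups) {
    for (const c of g.clients) index.set(c.uuid, g.name);
  }
  return index;
}
```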


@@ -0,0 +1,42 @@
export interface Event {
id: string;
title: string;
start: string;
end: string;
allDay: boolean;
classNames: string[];
extendedProps: Record<string, unknown>;
}
export async function fetchEvents(groupId: string, showInactive = false) {
const res = await fetch(
`/api/events?group_id=${encodeURIComponent(groupId)}&show_inactive=${showInactive ? '1' : '0'}`
);
const data = await res.json();
if (!res.ok || data.error) throw new Error(data.error || 'Fehler beim Laden der Termine');
return data;
}
export async function deleteEvent(eventId: string) {
const res = await fetch(`/api/events/${encodeURIComponent(eventId)}`, {
method: 'DELETE',
});
const data = await res.json();
if (!res.ok || data.error) throw new Error(data.error || 'Fehler beim Löschen des Termins');
return data;
}
export interface UpdateEventPayload {
[key: string]: unknown;
}
export async function updateEvent(eventId: string, payload: UpdateEventPayload) {
const res = await fetch(`/api/events/${encodeURIComponent(eventId)}`, {
method: 'PUT',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify(payload),
});
const data = await res.json();
if (!res.ok || data.error) throw new Error(data.error || 'Fehler beim Aktualisieren des Termins');
return data;
}
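`fetchEvents` above assembles its query string by hand. The same encoding, extracted into a standalone helper for illustration (the name `eventsUrl` is ours):

```typescript
// Builds the /api/events query string exactly as fetchEvents does.
function eventsUrl(groupId: string, showInactive: boolean): string {
  return `/api/events?group_id=${encodeURIComponent(groupId)}&show_inactive=${showInactive ? '1' : '0'}`;
}
```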


@@ -0,0 +1,40 @@
export async function createGroup(name: string) {
const res = await fetch('/api/groups', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({ name }),
});
const data = await res.json();
if (!res.ok || data.error) throw new Error(data.error || 'Fehler beim Erstellen der Gruppe');
return data;
}
export async function fetchGroups() {
const res = await fetch('/api/groups');
const data = await res.json();
if (!res.ok || data.error) throw new Error(data.error || 'Fehler beim Laden der Gruppen');
return data;
}
export async function deleteGroup(groupName: string) {
const res = await fetch(`/api/groups/byname/${encodeURIComponent(groupName)}`, {
method: 'DELETE',
});
const data = await res.json();
if (!res.ok || data.error) throw new Error(data.error || 'Fehler beim Löschen der Gruppe');
return data;
}
export async function renameGroup(oldName: string, newName: string) {
const res = await fetch(`/api/groups/byname/${encodeURIComponent(oldName)}`, {
method: 'PUT',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({ newName: newName }),
});
const data = await res.json();
if (!res.ok || data.error) throw new Error(data.error || 'Fehler beim Umbenennen der Gruppe');
return data;
}
// Further functions can be added here later:
// export async function updateGroup(id: number, name: string) { ... }


@@ -0,0 +1,26 @@
export type Holiday = {
id: number;
name: string;
start_date: string;
end_date: string;
region?: string | null;
source_file_name?: string | null;
imported_at?: string | null;
};
export async function listHolidays(region?: string) {
const url = region ? `/api/holidays?region=${encodeURIComponent(region)}` : '/api/holidays';
const res = await fetch(url);
const data = await res.json();
if (!res.ok || data.error) throw new Error(data.error || 'Fehler beim Laden der Ferien');
return data as { holidays: Holiday[] };
}
export async function uploadHolidaysCsv(file: File) {
const form = new FormData();
form.append('file', file);
const res = await fetch('/api/holidays/upload', { method: 'POST', body: form });
const data = await res.json();
if (!res.ok || data.error) throw new Error(data.error || 'Fehler beim Import der Ferien');
return data as { success: boolean; inserted: number; updated: number };
}
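Holiday rows carry plain `YYYY-MM-DD` strings; the scheduler in `Appointments` later expands each one to 00:00:00-23:59:59 local time and tests for overlap with event ranges. A condensed sketch of that overlap test (the helper name is ours; the three-clause check in `Appointments` reduces to this single condition):

```typescript
// Does [start, end] intersect the holiday's expanded local-time range?
interface HolidayRange { start_date: string; end_date: string; }

function overlapsHoliday(h: HolidayRange, start: Date, end: Date): boolean {
  const hs = new Date(h.start_date + 'T00:00:00');
  const he = new Date(h.end_date + 'T23:59:59');
  return start <= he && end >= hs;
}
```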


@@ -0,0 +1,813 @@
import React, { useEffect, useMemo, useState } from 'react';
import {
ScheduleComponent,
Day,
Week,
WorkWeek,
Month,
Agenda,
Inject,
ViewsDirective,
ViewDirective,
} from '@syncfusion/ej2-react-schedule';
import { DropDownListComponent } from '@syncfusion/ej2-react-dropdowns';
import { L10n, loadCldr, setCulture } from '@syncfusion/ej2-base';
import type {
EventRenderedArgs,
ActionEventArgs,
RenderCellEventArgs,
} from '@syncfusion/ej2-react-schedule';
import { fetchEvents } from './apiEvents';
import { fetchGroups } from './apiGroups';
import { getGroupColor } from './groupColors';
import { deleteEvent } from './apiEvents';
import CustomEventModal from './components/CustomEventModal';
import { fetchMediaById } from './apiClients';
import { listHolidays, type Holiday } from './apiHolidays';
import {
getAcademicPeriodForDate,
listAcademicPeriods,
setActiveAcademicPeriod,
} from './apiAcademicPeriods';
import {
Presentation,
Globe,
Video,
MessageSquare,
School,
CheckCircle,
AlertCircle,
} from 'lucide-react';
import { renderToStaticMarkup } from 'react-dom/server';
import caGregorian from './cldr/ca-gregorian.json';
import numbers from './cldr/numbers.json';
import timeZoneNames from './cldr/timeZoneNames.json';
import numberingSystems from './cldr/numberingSystems.json';
// group type
type Group = {
id: string;
name: string;
};
// event type
type Event = {
Id: string;
Subject: string;
StartTime: Date;
EndTime: Date;
IsAllDay: boolean;
IsBlock?: boolean; // Syncfusion block appointment
isHoliday?: boolean; // marker for styling/logic
MediaId?: string | number;
SlideshowInterval?: number;
WebsiteUrl?: string;
Icon?: string; // optional icon name
Type?: string; // optional event type, if needed
};
type RawEvent = {
Id: string;
Subject: string;
StartTime: string;
EndTime: string;
IsAllDay: boolean;
MediaId?: string | number;
Icon?: string; // optional icon name
Type?: string;
};
// load CLDR data (pass the JSON objects directly)
loadCldr(
caGregorian as object,
numbers as object,
timeZoneNames as object,
numberingSystems as object
);
// German localization for the scheduler
L10n.load({
de: {
schedule: {
day: 'Tag',
week: 'Woche',
workWeek: 'Arbeitswoche',
month: 'Monat',
agenda: 'Agenda',
today: 'Heute',
noEvents: 'Keine Termine',
allDay: 'Ganztägig',
start: 'Start',
end: 'Ende',
event: 'Termin',
save: 'Speichern',
cancel: 'Abbrechen',
delete: 'Löschen',
edit: 'Bearbeiten',
newEvent: 'Neuer Termin',
title: 'Titel',
description: 'Beschreibung',
location: 'Ort',
recurrence: 'Wiederholung',
repeat: 'Wiederholen',
deleteEvent: 'Termin löschen',
deleteContent: 'Möchten Sie diesen Termin wirklich löschen?',
moreDetails: 'Mehr Details',
addTitle: 'Termintitel',
},
},
});
// set the culture
setCulture('de');
// mapping for Lucide icons
const iconMap: Record<string, React.ElementType> = {
Presentation,
Globe,
Video,
MessageSquare,
School,
};
const eventTemplate = (event: Event) => {
const IconComponent = iconMap[event.Icon ?? ''] || null;
// format the time range
const start = event.StartTime instanceof Date ? event.StartTime : new Date(event.StartTime);
const end = event.EndTime instanceof Date ? event.EndTime : new Date(event.EndTime);
const timeString = `${start.toLocaleTimeString([], { hour: '2-digit', minute: '2-digit' })} - ${end.toLocaleTimeString([], { hour: '2-digit', minute: '2-digit' })}`;
return (
<div style={{ display: 'flex', flexDirection: 'column', alignItems: 'flex-start' }}>
<div style={{ display: 'flex', alignItems: 'center', color: '#000', marginBottom: 2 }}>
{IconComponent && (
<span style={{ verticalAlign: 'middle', display: 'inline-block', marginRight: 6 }}>
<IconComponent size={18} color="#000" />
</span>
)}
<span style={{ marginTop: 3 }}>{event.Subject}</span>
</div>
<div style={{ fontSize: '0.95em', color: '#000', marginTop: -2 }}>{timeString}</div>
</div>
);
};
const Appointments: React.FC = () => {
const [groups, setGroups] = useState<Group[]>([]);
const [selectedGroupId, setSelectedGroupId] = useState<string | null>(null);
const [events, setEvents] = useState<Event[]>([]);
const [holidays, setHolidays] = useState<Holiday[]>([]);
const [modalOpen, setModalOpen] = useState(false);
const [modalInitialData, setModalInitialData] = useState({});
const [schedulerKey, setSchedulerKey] = useState(0);
const [editMode, setEditMode] = useState(false); // NEW: edit mode
const [showInactive, setShowInactive] = React.useState(true);
const [allowScheduleOnHolidays, setAllowScheduleOnHolidays] = React.useState(false);
const [showHolidayList, setShowHolidayList] = React.useState(true);
const scheduleRef = React.useRef<ScheduleComponent | null>(null);
const [holidaysInView, setHolidaysInView] = React.useState<number>(0);
const [schoolYearLabel, setSchoolYearLabel] = React.useState<string>('');
const [hasSchoolYearPlan, setHasSchoolYearPlan] = React.useState<boolean>(false);
const [periods, setPeriods] = React.useState<{ id: number; label: string }[]>([]);
const [activePeriodId, setActivePeriodId] = React.useState<number | null>(null);
// load groups
useEffect(() => {
fetchGroups()
.then(data => {
// only include groups with id != 1 (skip the "unassigned" group)
const filtered = Array.isArray(data) ? data.filter(g => g.id && g.name && g.id !== 1) : [];
setGroups(filtered);
if (filtered.length > 0) setSelectedGroupId(filtered[0].id);
})
.catch(console.error);
}, []);
// load holidays
useEffect(() => {
listHolidays()
.then(res => setHolidays(res.holidays || []))
.catch(err => console.error('Ferien laden fehlgeschlagen:', err));
}, []);
// load periods (dropdown)
useEffect(() => {
listAcademicPeriods()
.then(all => {
setPeriods(all.map(p => ({ id: p.id, label: p.display_name || p.name })));
const active = all.find(p => p.is_active);
setActivePeriodId(active ? active.id : null);
})
.catch(err => console.error('Akademische Perioden laden fehlgeschlagen:', err));
}, []);
// define fetchAndSetEvents as a useCallback so the effect dependency is correct:
const fetchAndSetEvents = React.useCallback(async () => {
if (!selectedGroupId) {
setEvents([]);
return;
}
try {
const data = await fetchEvents(selectedGroupId, showInactive); // selectedGroupId is guaranteed to be a string here
const mapped: Event[] = data.map((e: RawEvent) => ({
Id: e.Id,
Subject: e.Subject,
StartTime: new Date(e.StartTime.endsWith('Z') ? e.StartTime : e.StartTime + 'Z'),
EndTime: new Date(e.EndTime.endsWith('Z') ? e.EndTime : e.EndTime + 'Z'),
IsAllDay: e.IsAllDay,
MediaId: e.MediaId,
Icon: e.Icon, // carry over the icon
Type: e.Type, // carry over the type
}));
setEvents(mapped);
} catch (err) {
console.error('Fehler beim Laden der Termine:', err);
}
}, [selectedGroupId, showInactive]);
React.useEffect(() => {
if (selectedGroupId) {
// selectedGroupId can be null, but fetchEvents expects a string
fetchAndSetEvents();
} else {
setEvents([]);
}
}, [selectedGroupId, showInactive, fetchAndSetEvents]);
// helper: check whether a time range intersects a holiday range
const isWithinHolidayRange = React.useCallback(
(start: Date, end: Date) => {
// normalize the end time minimally (Syncfusion uses exclusive ends in some cases)
const adjEnd = new Date(end);
// no adjustment needed; our own events are precise
for (const h of holidays) {
// Holiday dates are strings YYYY-MM-DD (local date)
const hs = new Date(h.start_date + 'T00:00:00');
const he = new Date(h.end_date + 'T23:59:59');
if (
(start >= hs && start <= he) ||
(adjEnd >= hs && adjEnd <= he) ||
(start <= hs && adjEnd >= he)
) {
return true;
}
}
return false;
},
[holidays]
);
// build holiday display events and block events
const holidayDisplayEvents: Event[] = useMemo(() => {
if (!showHolidayList) return [];
const out: Event[] = [];
for (const h of holidays) {
const start = new Date(h.start_date + 'T00:00:00');
const end = new Date(h.end_date + 'T23:59:59');
out.push({
Id: `holiday-${h.id}-display`,
Subject: h.name,
StartTime: start,
EndTime: end,
IsAllDay: true,
isHoliday: true,
});
}
return out;
}, [holidays, showHolidayList]);
const holidayBlockEvents: Event[] = useMemo(() => {
if (allowScheduleOnHolidays) return [];
const out: Event[] = [];
for (const h of holidays) {
const start = new Date(h.start_date + 'T00:00:00');
const end = new Date(h.end_date + 'T23:59:59');
out.push({
Id: `holiday-${h.id}-block`,
Subject: h.name,
StartTime: start,
EndTime: end,
IsAllDay: true,
IsBlock: true,
isHoliday: true,
});
}
return out;
}, [holidays, allowScheduleOnHolidays]);
const dataSource = useMemo(() => {
return [...events, ...holidayDisplayEvents, ...holidayBlockEvents];
}, [events, holidayDisplayEvents, holidayBlockEvents]);
// determine the active academic period for a date via the backend
const refreshAcademicPeriodFor = React.useCallback(
async (baseDate: Date) => {
try {
const p = await getAcademicPeriodForDate(baseDate);
if (!p) {
setSchoolYearLabel('');
setHasSchoolYearPlan(false);
return;
}
// display: prefer display_name, otherwise name
const label = p.display_name ? p.display_name : p.name;
setSchoolYearLabel(label);
// does a holiday plan exist within the period?
const start = new Date(p.start_date + 'T00:00:00');
const end = new Date(p.end_date + 'T23:59:59');
let exists = false;
for (const h of holidays) {
const hs = new Date(h.start_date + 'T00:00:00');
const he = new Date(h.end_date + 'T23:59:59');
if (hs <= end && he >= start) {
exists = true;
break;
}
}
setHasSchoolYearPlan(exists);
} catch (e) {
console.error('Akademische Periode laden fehlgeschlagen:', e);
setSchoolYearLabel('');
setHasSchoolYearPlan(false);
}
},
[holidays]
);
// count holiday ranges in the current view and update the period indicator
const updateHolidaysInView = React.useCallback(() => {
const inst = scheduleRef.current;
if (!inst) {
setHolidaysInView(0);
return;
}
const view = inst.currentView as 'Day' | 'Week' | 'WorkWeek' | 'Month' | 'Agenda';
const baseDate = inst.selectedDate as Date;
if (!baseDate) {
setHolidaysInView(0);
return;
}
let rangeStart = new Date(baseDate);
let rangeEnd = new Date(baseDate);
if (view === 'Day' || view === 'Agenda') {
rangeStart.setHours(0, 0, 0, 0);
rangeEnd.setHours(23, 59, 59, 999);
} else if (view === 'Week' || view === 'WorkWeek') {
const day = baseDate.getDay();
const diffToMonday = (day + 6) % 7; // Monday=0
rangeStart = new Date(baseDate);
rangeStart.setDate(baseDate.getDate() - diffToMonday);
rangeStart.setHours(0, 0, 0, 0);
rangeEnd = new Date(rangeStart);
rangeEnd.setDate(rangeStart.getDate() + 6);
rangeEnd.setHours(23, 59, 59, 999);
} else if (view === 'Month') {
rangeStart = new Date(baseDate.getFullYear(), baseDate.getMonth(), 1, 0, 0, 0, 0);
rangeEnd = new Date(baseDate.getFullYear(), baseDate.getMonth() + 1, 0, 23, 59, 59, 999);
}
let count = 0;
for (const h of holidays) {
const hs = new Date(h.start_date + 'T00:00:00');
const he = new Date(h.end_date + 'T23:59:59');
const overlaps =
(hs >= rangeStart && hs <= rangeEnd) ||
(he >= rangeStart && he <= rangeEnd) ||
(hs <= rangeStart && he >= rangeEnd);
if (overlaps) count += 1;
}
setHolidaysInView(count);
// check the period indicator via the backend
refreshAcademicPeriodFor(baseDate);
}, [holidays, refreshAcademicPeriodFor]);
// refresh the indicator when holidays or the view (key) change
React.useEffect(() => {
updateHolidaysInView();
}, [holidays, updateHolidaysInView, schedulerKey]);
return (
<div>
<h1 className="text-2xl font-bold mb-4">Terminmanagement</h1>
<div
style={{
marginBottom: 16,
display: 'flex',
alignItems: 'center',
gap: 16,
flexWrap: 'wrap',
}}
>
<label htmlFor="groupDropdown" className="mb-0 mr-2" style={{ whiteSpace: 'nowrap' }}>
Raumgruppe auswählen:
</label>
<DropDownListComponent
id="groupDropdown"
dataSource={groups}
fields={{ text: 'name', value: 'id' }}
placeholder="Gruppe auswählen"
value={selectedGroupId}
width="240px"
change={(e: { value: string }) => {
// typed change-event argument
setEvents([]); // clear events immediately
setSelectedGroupId(e.value);
}}
style={{}}
/>
{/* academic period selector + plan badge */}
<span style={{ marginLeft: 8, whiteSpace: 'nowrap' }}>Periode:</span>
<DropDownListComponent
id="periodDropdown"
dataSource={periods}
fields={{ text: 'label', value: 'id' }}
placeholder="Periode wählen"
value={activePeriodId ?? undefined}
width="260px"
change={async (e: { value: number }) => {
const id = Number(e.value);
if (!id) return;
try {
const updated = await setActiveAcademicPeriod(id);
setActivePeriodId(updated.id);
// jump to the same day/month (today) within the selected period
const today = new Date();
const targetYear = new Date(updated.start_date).getFullYear();
const target = new Date(targetYear, today.getMonth(), today.getDate(), 12, 0, 0);
if (scheduleRef.current) {
scheduleRef.current.selectedDate = target;
scheduleRef.current.dataBind?.();
}
updateHolidaysInView();
} catch (err) {
console.error('Aktive Periode setzen fehlgeschlagen:', err);
}
}}
style={{}}
/>
{/* School-year/period plan badge (adjacent) */}
<span
title={hasSchoolYearPlan ? 'Ferienplan ist hinterlegt' : 'Kein Ferienplan hinterlegt'}
style={{
background: hasSchoolYearPlan ? '#dcfce7' : '#f3f4f6',
border: hasSchoolYearPlan ? '1px solid #86efac' : '1px solid #e5e7eb',
color: '#000',
padding: '4px 10px',
borderRadius: 16,
fontSize: 12,
display: 'flex',
alignItems: 'center',
gap: 6,
}}
>
{hasSchoolYearPlan ? (
<CheckCircle size={14} color="#166534" />
) : (
<AlertCircle size={14} color="#6b7280" />
)}
{schoolYearLabel || 'Periode'}
</span>
</div>
<button
className="e-btn e-success mb-4"
onClick={() => {
const now = new Date();
// round up to the next half hour
const minutes = now.getMinutes();
const roundedMinutes = minutes < 30 ? 30 : 0;
const startTime = new Date(now);
startTime.setMinutes(roundedMinutes, 0, 0);
if (roundedMinutes === 0) startTime.setHours(startTime.getHours() + 1);
const endTime = new Date(startTime);
endTime.setMinutes(endTime.getMinutes() + 30);
setModalInitialData({
startDate: startTime,
startTime: startTime,
endTime: endTime,
});
setEditMode(false);
setModalOpen(true);
}}
>
Neuen Termin anlegen
</button>
<div
style={{
display: 'flex',
alignItems: 'center',
gap: 24,
marginBottom: 16,
flexWrap: 'wrap',
}}
>
<label>
<input
type="checkbox"
checked={showInactive}
onChange={e => setShowInactive(e.target.checked)}
style={{ marginRight: 8 }}
/>
Vergangene Termine anzeigen
</label>
<label>
<input
type="checkbox"
checked={allowScheduleOnHolidays}
onChange={e => setAllowScheduleOnHolidays(e.target.checked)}
style={{ marginRight: 8 }}
/>
Termine an Ferientagen erlauben
</label>
<label>
<input
type="checkbox"
checked={showHolidayList}
onChange={e => setShowHolidayList(e.target.checked)}
style={{ marginRight: 8 }}
/>
Ferien im Kalender anzeigen
</label>
{/* Right-aligned indicators */}
<div style={{ marginLeft: 'auto', display: 'flex', gap: 8, alignItems: 'center' }}>
{/* Holidays-in-view badge */}
<span
title="Anzahl der Ferientage/-zeiträume in der aktuellen Ansicht"
style={{
background: holidaysInView > 0 ? '#ffe8cc' : '#f3f4f6',
border: holidaysInView > 0 ? '1px solid #ffcf99' : '1px solid #e5e7eb',
color: '#000',
padding: '4px 10px',
borderRadius: 16,
fontSize: 12,
}}
>
{holidaysInView > 0 ? `Ferien im Blick: ${holidaysInView}` : 'Keine Ferien in Ansicht'}
</span>
</div>
</div>
<CustomEventModal
open={modalOpen}
onClose={() => {
setModalOpen(false);
setEditMode(false); // reset edit mode
}}
onSave={async () => {
setModalOpen(false);
setEditMode(false);
if (selectedGroupId) {
const data = await fetchEvents(selectedGroupId, showInactive);
const mapped: Event[] = data.map((e: RawEvent) => ({
Id: e.Id,
Subject: e.Subject,
StartTime: new Date(e.StartTime.endsWith('Z') ? e.StartTime : e.StartTime + 'Z'),
EndTime: new Date(e.EndTime.endsWith('Z') ? e.EndTime : e.EndTime + 'Z'),
IsAllDay: e.IsAllDay,
MediaId: e.MediaId,
}));
setEvents(mapped);
setSchedulerKey(prev => prev + 1); // bump the key to force a remount
}
}}
initialData={modalInitialData}
groupName={groups.find(g => g.id === selectedGroupId) ?? { id: selectedGroupId, name: '' }}
groupColor={selectedGroupId ? getGroupColor(selectedGroupId, groups) : undefined}
editMode={editMode} // prop for edit mode
blockHolidays={!allowScheduleOnHolidays}
isHolidayRange={(s, e) => isWithinHolidayRange(s, e)}
/>
<ScheduleComponent
ref={scheduleRef}
key={schedulerKey} // dynamic key forces a remount
height="750px"
locale="de"
currentView="Week"
eventSettings={{
dataSource: dataSource,
fields: { isBlock: 'IsBlock' },
template: eventTemplate, // use the custom event template
}}
actionComplete={() => updateHolidaysInView()}
cellClick={args => {
if (!allowScheduleOnHolidays && isWithinHolidayRange(args.startTime, args.endTime)) {
args.cancel = true;
return; // block creation on holidays
}
// args.startTime and args.endTime are Date objects
args.cancel = true; // prevent the default action
setModalInitialData({
startDate: args.startTime,
startTime: args.startTime,
endTime: args.endTime,
});
setEditMode(false); // not in edit mode
setModalOpen(true);
}}
popupOpen={async args => {
if (args.type === 'Editor') {
args.cancel = true;
const event = args.data;
console.log('Event zum Bearbeiten:', event);
let media = null;
if (event.MediaId) {
try {
const mediaData = await fetchMediaById(event.MediaId);
media = {
id: mediaData.id,
path: mediaData.file_path,
name: mediaData.name || mediaData.url,
};
} catch (err) {
console.error('Fehler beim Laden der Mediainfos:', err);
}
}
const initialData = {
Id: event.Id,
title: event.Subject,
startDate: event.StartTime,
startTime: event.StartTime,
endTime: event.EndTime,
description: event.Description ?? '',
type: event.Type ?? 'presentation',
repeat: event.Repeat ?? false,
weekdays: event.Weekdays ?? [],
repeatUntil: event.RepeatUntil ?? null,
skipHolidays: event.SkipHolidays ?? false,
media, // media metadata is only loaded on demand
slideshowInterval: event.SlideshowInterval ?? 10,
websiteUrl: event.WebsiteUrl ?? '',
};
setModalInitialData(initialData);
console.log('Modal initial data:', initialData);
setEditMode(true);
setModalOpen(true);
}
}}
eventRendered={(args: EventRenderedArgs) => {
// Hide non-holiday events that fall within holidays while scheduling on holidays is disallowed
if (!allowScheduleOnHolidays && args.data && !args.data.isHoliday) {
const s =
args.data.StartTime instanceof Date
? args.data.StartTime
: new Date(args.data.StartTime);
const e =
args.data.EndTime instanceof Date ? args.data.EndTime : new Date(args.data.EndTime);
if (isWithinHolidayRange(s, e)) {
args.cancel = true;
return;
}
}
if (selectedGroupId && args.data && args.data.Id) {
const groupColor = getGroupColor(selectedGroupId, groups);
const now = new Date();
let IconComponent: React.ElementType | null = null;
switch (args.data.Type) {
case 'presentation':
IconComponent = Presentation;
break;
case 'website':
IconComponent = Globe;
break;
case 'video':
IconComponent = Video;
break;
case 'message':
IconComponent = MessageSquare;
break;
case 'webuntis':
IconComponent = School;
break;
default:
IconComponent = null;
}
// Only target the .e-subject element
const titleElement = args.element.querySelector('.e-subject');
if (titleElement && IconComponent) {
const svgString = renderToStaticMarkup(<IconComponent size={18} color="#78591c" />);
// Always take only the plain text, never an existing icon
const subjectText = (titleElement as HTMLElement).textContent ?? '';
(titleElement as HTMLElement).innerHTML =
`<span style="vertical-align:middle;display:inline-block;margin-right:6px;">${svgString}</span>` +
subjectText;
}
// Past events: room-group color
if (args.data.EndTime && args.data.EndTime < now) {
args.element.style.backgroundColor = groupColor ? `${groupColor}` : '#f3f3f3';
args.element.style.color = '#000';
} else if (groupColor) {
args.element.style.backgroundColor = groupColor;
args.element.style.color = '#000';
}
// Unified styling for holiday events (display and block events alike)
if (args.data.isHoliday) {
args.element.style.backgroundColor = '#ffe8cc'; // soft orange
args.element.style.border = '1px solid #ffcf99';
args.element.style.color = '#000';
}
}
}}
actionBegin={async (args: ActionEventArgs) => {
if (args.requestType === 'eventRemove') {
// args.data is an array of events to delete
const toDelete = Array.isArray(args.data) ? args.data : [args.data];
for (const ev of toDelete) {
try {
await deleteEvent(ev.Id); // API helper
} catch (err) {
// Optional: error handling
console.error('Fehler beim Löschen:', err);
}
}
// Reload events after deletion
if (selectedGroupId) {
fetchEvents(selectedGroupId, showInactive)
.then((data: RawEvent[]) => {
const mapped: Event[] = data.map((e: RawEvent) => ({
Id: e.Id,
Subject: e.Subject,
// Normalize to UTC like the initial load (append 'Z' if missing)
StartTime: new Date(e.StartTime.endsWith('Z') ? e.StartTime : e.StartTime + 'Z'),
EndTime: new Date(e.EndTime.endsWith('Z') ? e.EndTime : e.EndTime + 'Z'),
IsAllDay: e.IsAllDay,
MediaId: e.MediaId,
}));
setEvents(mapped);
})
.catch(console.error);
}
// Syncfusion must not delete the event itself
args.cancel = true;
} else if (
(args.requestType === 'eventCreate' || args.requestType === 'eventChange') &&
!allowScheduleOnHolidays
) {
// Prevent create/change within holiday ranges (failsafe in case the built-in editor is used)
type PartialEventLike = { StartTime?: Date | string; EndTime?: Date | string };
const raw = (args as ActionEventArgs).data as
| PartialEventLike
| PartialEventLike[]
| undefined;
const data = Array.isArray(raw) ? raw[0] : raw;
if (data && data.StartTime && data.EndTime) {
const s = data.StartTime instanceof Date ? data.StartTime : new Date(data.StartTime);
const e = data.EndTime instanceof Date ? data.EndTime : new Date(data.EndTime);
if (isWithinHolidayRange(s, e)) {
args.cancel = true;
return;
}
}
}
}}
firstDayOfWeek={1}
renderCell={(args: RenderCellEventArgs) => {
// Only for work cells (hour/day cells)
if (args.elementType === 'workCells') {
const now = new Date();
// args.element is typed as Element, so cast it to HTMLElement:
const cell = args.element as HTMLElement;
if (args.date && args.date < now) {
cell.style.backgroundColor = '#fff9e3'; // light yellow for the past
cell.style.opacity = '0.7';
}
}
}}
>
<ViewsDirective>
<ViewDirective option="Day" />
<ViewDirective option="Week" />
<ViewDirective option="WorkWeek" />
<ViewDirective option="Month" />
<ViewDirective option="Agenda" />
</ViewsDirective>
<Inject services={[Day, Week, WorkWeek, Month, Agenda]} />
</ScheduleComponent>
</div>
);
};
export default Appointments;
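The `endsWith('Z') ? … : … + 'Z'` pattern appears in several places above to normalize API timestamps that may arrive without a UTC suffix. A minimal sketch of how it could be factored into a shared helper (the `parseUtc` name is an assumption, not part of the codebase):

```typescript
// Hypothetical helper (not in the repo): normalize API timestamps to UTC.
// The backend returns UTC times, sometimes without the trailing 'Z';
// appending it makes `new Date()` parse the string as UTC instead of local time.
function parseUtc(value: string): Date {
  return new Date(value.endsWith('Z') ? value : value + 'Z');
}

// Example: both forms resolve to the same instant.
const a = parseUtc('2025-10-10T15:20:14');
const b = parseUtc('2025-10-10T15:20:14Z');
console.log(a.getTime() === b.getTime()); // true
```

Using one helper in the event-mapping callbacks would keep the initial load and the post-delete reload consistent.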

Binary file not shown (image added, 1.7 MiB).

Binary file not shown (image added, 225 KiB).


@@ -0,0 +1 @@
<svg xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" aria-hidden="true" role="img" class="iconify iconify--logos" width="35.93" height="32" preserveAspectRatio="xMidYMid meet" viewBox="0 0 256 228"><path fill="#00D8FF" d="M210.483 73.824a171.49 171.49 0 0 0-8.24-2.597c.465-1.9.893-3.777 1.273-5.621c6.238-30.281 2.16-54.676-11.769-62.708c-13.355-7.7-35.196.329-57.254 19.526a171.23 171.23 0 0 0-6.375 5.848a155.866 155.866 0 0 0-4.241-3.917C100.759 3.829 77.587-4.822 63.673 3.233C50.33 10.957 46.379 33.89 51.995 62.588a170.974 170.974 0 0 0 1.892 8.48c-3.28.932-6.445 1.924-9.474 2.98C17.309 83.498 0 98.307 0 113.668c0 15.865 18.582 31.778 46.812 41.427a145.52 145.52 0 0 0 6.921 2.165a167.467 167.467 0 0 0-2.01 9.138c-5.354 28.2-1.173 50.591 12.134 58.266c13.744 7.926 36.812-.22 59.273-19.855a145.567 145.567 0 0 0 5.342-4.923a168.064 168.064 0 0 0 6.92 6.314c21.758 18.722 43.246 26.282 56.54 18.586c13.731-7.949 18.194-32.003 12.4-61.268a145.016 145.016 0 0 0-1.535-6.842c1.62-.48 3.21-.974 4.76-1.488c29.348-9.723 48.443-25.443 48.443-41.52c0-15.417-17.868-30.326-45.517-39.844Zm-6.365 70.984c-1.4.463-2.836.91-4.3 1.345c-3.24-10.257-7.612-21.163-12.963-32.432c5.106-11 9.31-21.767 12.459-31.957c2.619.758 5.16 1.557 7.61 2.4c23.69 8.156 38.14 20.213 38.14 29.504c0 9.896-15.606 22.743-40.946 31.14Zm-10.514 20.834c2.562 12.94 2.927 24.64 1.23 33.787c-1.524 8.219-4.59 13.698-8.382 15.893c-8.067 4.67-25.32-1.4-43.927-17.412a156.726 156.726 0 0 1-6.437-5.87c7.214-7.889 14.423-17.06 21.459-27.246c12.376-1.098 24.068-2.894 34.671-5.345a134.17 134.17 0 0 1 1.386 6.193ZM87.276 214.515c-7.882 2.783-14.16 2.863-17.955.675c-8.075-4.657-11.432-22.636-6.853-46.752a156.923 156.923 0 0 1 1.869-8.499c10.486 2.32 22.093 3.988 34.498 4.994c7.084 9.967 14.501 19.128 21.976 27.15a134.668 134.668 0 0 1-4.877 4.492c-9.933 8.682-19.886 14.842-28.658 17.94ZM50.35 144.747c-12.483-4.267-22.792-9.812-29.858-15.863c-6.35-5.437-9.555-10.836-9.555-15.216c0-9.322 
13.897-21.212 37.076-29.293c2.813-.98 5.757-1.905 8.812-2.773c3.204 10.42 7.406 21.315 12.477 32.332c-5.137 11.18-9.399 22.249-12.634 32.792a134.718 134.718 0 0 1-6.318-1.979Zm12.378-84.26c-4.811-24.587-1.616-43.134 6.425-47.789c8.564-4.958 27.502 2.111 47.463 19.835a144.318 144.318 0 0 1 3.841 3.545c-7.438 7.987-14.787 17.08-21.808 26.988c-12.04 1.116-23.565 2.908-34.161 5.309a160.342 160.342 0 0 1-1.76-7.887Zm110.427 27.268a347.8 347.8 0 0 0-7.785-12.803c8.168 1.033 15.994 2.404 23.343 4.08c-2.206 7.072-4.956 14.465-8.193 22.045a381.151 381.151 0 0 0-7.365-13.322Zm-45.032-43.861c5.044 5.465 10.096 11.566 15.065 18.186a322.04 322.04 0 0 0-30.257-.006c4.974-6.559 10.069-12.652 15.192-18.18ZM82.802 87.83a323.167 323.167 0 0 0-7.227 13.238c-3.184-7.553-5.909-14.98-8.134-22.152c7.304-1.634 15.093-2.97 23.209-3.984a321.524 321.524 0 0 0-7.848 12.897Zm8.081 65.352c-8.385-.936-16.291-2.203-23.593-3.793c2.26-7.3 5.045-14.885 8.298-22.6a321.187 321.187 0 0 0 7.257 13.246c2.594 4.48 5.28 8.868 8.038 13.147Zm37.542 31.03c-5.184-5.592-10.354-11.779-15.403-18.433c4.902.192 9.899.29 14.978.29c5.218 0 10.376-.117 15.453-.343c-4.985 6.774-10.018 12.97-15.028 18.486Zm52.198-57.817c3.422 7.8 6.306 15.345 8.596 22.52c-7.422 1.694-15.436 3.058-23.88 4.071a382.417 382.417 0 0 0 7.859-13.026a347.403 347.403 0 0 0 7.425-13.565Zm-16.898 8.101a358.557 358.557 0 0 1-12.281 19.815a329.4 329.4 0 0 1-23.444.823c-7.967 0-15.716-.248-23.178-.732a310.202 310.202 0 0 1-12.513-19.846h.001a307.41 307.41 0 0 1-10.923-20.627a310.278 310.278 0 0 1 10.89-20.637l-.001.001a307.318 307.318 0 0 1 12.413-19.761c7.613-.576 15.42-.876 23.31-.876H128c7.926 0 15.743.303 23.354.883a329.357 329.357 0 0 1 12.335 19.695a358.489 358.489 0 0 1 11.036 20.54a329.472 329.472 0 0 1-11 20.722Zm22.56-122.124c8.572 4.944 11.906 24.881 6.52 51.026c-.344 1.668-.73 3.367-1.15 5.09c-10.622-2.452-22.155-4.275-34.23-5.408c-7.034-10.017-14.323-19.124-21.64-27.008a160.789 160.789 0 0 1 5.888-5.4c18.9-16.447 36.564-22.941 
44.612-18.3ZM128 90.808c12.625 0 22.86 10.235 22.86 22.86s-10.235 22.86-22.86 22.86s-22.86-10.235-22.86-22.86s10.235-22.86 22.86-22.86Z"></path></svg>

(SVG image added, 4.0 KiB)


@@ -0,0 +1,8 @@
import React from 'react';
const Benutzer: React.FC = () => (
<div>
<h2 className="text-xl font-bold mb-4">Benutzer</h2>
<p>Willkommen im Infoscreen-Management Benutzer.</p>
</div>
);
export default Benutzer;


@@ -0,0 +1,569 @@
{
"main": {
"de": {
"identity": {
"language": "de"
},
"dates": {
"calendars": {
"gregorian": {
"months": {
"format": {
"abbreviated": {
"1": "Jan.",
"2": "Feb.",
"3": "März",
"4": "Apr.",
"5": "Mai",
"6": "Juni",
"7": "Juli",
"8": "Aug.",
"9": "Sept.",
"10": "Okt.",
"11": "Nov.",
"12": "Dez."
},
"narrow": {
"1": "J",
"2": "F",
"3": "M",
"4": "A",
"5": "M",
"6": "J",
"7": "J",
"8": "A",
"9": "S",
"10": "O",
"11": "N",
"12": "D"
},
"wide": {
"1": "Januar",
"2": "Februar",
"3": "März",
"4": "April",
"5": "Mai",
"6": "Juni",
"7": "Juli",
"8": "August",
"9": "September",
"10": "Oktober",
"11": "November",
"12": "Dezember"
}
},
"stand-alone": {
"abbreviated": {
"1": "Jan",
"2": "Feb",
"3": "Mär",
"4": "Apr",
"5": "Mai",
"6": "Jun",
"7": "Jul",
"8": "Aug",
"9": "Sep",
"10": "Okt",
"11": "Nov",
"12": "Dez"
},
"narrow": {
"1": "J",
"2": "F",
"3": "M",
"4": "A",
"5": "M",
"6": "J",
"7": "J",
"8": "A",
"9": "S",
"10": "O",
"11": "N",
"12": "D"
},
"wide": {
"1": "Januar",
"2": "Februar",
"3": "März",
"4": "April",
"5": "Mai",
"6": "Juni",
"7": "Juli",
"8": "August",
"9": "September",
"10": "Oktober",
"11": "November",
"12": "Dezember"
}
}
},
"days": {
"format": {
"abbreviated": {
"sun": "So.",
"mon": "Mo.",
"tue": "Di.",
"wed": "Mi.",
"thu": "Do.",
"fri": "Fr.",
"sat": "Sa."
},
"narrow": {
"sun": "S",
"mon": "M",
"tue": "D",
"wed": "M",
"thu": "D",
"fri": "F",
"sat": "S"
},
"short": {
"sun": "So.",
"mon": "Mo.",
"tue": "Di.",
"wed": "Mi.",
"thu": "Do.",
"fri": "Fr.",
"sat": "Sa."
},
"wide": {
"sun": "Sonntag",
"mon": "Montag",
"tue": "Dienstag",
"wed": "Mittwoch",
"thu": "Donnerstag",
"fri": "Freitag",
"sat": "Samstag"
}
},
"stand-alone": {
"abbreviated": {
"sun": "So",
"mon": "Mo",
"tue": "Di",
"wed": "Mi",
"thu": "Do",
"fri": "Fr",
"sat": "Sa"
},
"narrow": {
"sun": "S",
"mon": "M",
"tue": "D",
"wed": "M",
"thu": "D",
"fri": "F",
"sat": "S"
},
"short": {
"sun": "So.",
"mon": "Mo.",
"tue": "Di.",
"wed": "Mi.",
"thu": "Do.",
"fri": "Fr.",
"sat": "Sa."
},
"wide": {
"sun": "Sonntag",
"mon": "Montag",
"tue": "Dienstag",
"wed": "Mittwoch",
"thu": "Donnerstag",
"fri": "Freitag",
"sat": "Samstag"
}
}
},
"quarters": {
"format": {
"abbreviated": {
"1": "Q1",
"2": "Q2",
"3": "Q3",
"4": "Q4"
},
"narrow": {
"1": "1",
"2": "2",
"3": "3",
"4": "4"
},
"wide": {
"1": "1. Quartal",
"2": "2. Quartal",
"3": "3. Quartal",
"4": "4. Quartal"
}
},
"stand-alone": {
"abbreviated": {
"1": "Q1",
"2": "Q2",
"3": "Q3",
"4": "Q4"
},
"narrow": {
"1": "1",
"2": "2",
"3": "3",
"4": "4"
},
"wide": {
"1": "1. Quartal",
"2": "2. Quartal",
"3": "3. Quartal",
"4": "4. Quartal"
}
}
},
"dayPeriods": {
"format": {
"abbreviated": {
"midnight": "Mitternacht",
"am": "AM",
"pm": "PM",
"morning1": "morgens",
"morning2": "vorm.",
"afternoon1": "mittags",
"afternoon2": "nachm.",
"evening1": "abends",
"night1": "nachts"
},
"narrow": {
"midnight": "Mitternacht",
"am": "AM",
"pm": "PM",
"morning1": "morgens",
"morning2": "vorm.",
"afternoon1": "mittags",
"afternoon2": "nachm.",
"evening1": "abends",
"night1": "nachts"
},
"wide": {
"midnight": "Mitternacht",
"am": "AM",
"pm": "PM",
"morning1": "morgens",
"morning2": "vormittags",
"afternoon1": "mittags",
"afternoon2": "nachmittags",
"evening1": "abends",
"night1": "nachts"
}
},
"stand-alone": {
"abbreviated": {
"midnight": "Mitternacht",
"am": "AM",
"pm": "PM",
"morning1": "Morgen",
"morning2": "Vorm.",
"afternoon1": "Mittag",
"afternoon2": "Nachm.",
"evening1": "Abend",
"night1": "Nacht"
},
"narrow": {
"midnight": "Mitternacht",
"am": "AM",
"pm": "PM",
"morning1": "Morgen",
"morning2": "Vorm.",
"afternoon1": "Mittag",
"afternoon2": "Nachm.",
"evening1": "Abend",
"night1": "Nacht"
},
"wide": {
"midnight": "Mitternacht",
"am": "AM",
"pm": "PM",
"morning1": "Morgen",
"morning2": "Vormittag",
"afternoon1": "Mittag",
"afternoon2": "Nachmittag",
"evening1": "Abend",
"night1": "Nacht"
}
}
},
"eras": {
"eraNames": {
"0": "v. Chr.",
"0-alt-variant": "vor unserer Zeitrechnung",
"1": "n. Chr.",
"1-alt-variant": "unserer Zeitrechnung"
},
"eraAbbr": {
"0": "v. Chr.",
"0-alt-variant": "v. u. Z.",
"1": "n. Chr.",
"1-alt-variant": "u. Z."
},
"eraNarrow": {
"0": "v. Chr.",
"0-alt-variant": "v. u. Z.",
"1": "n. Chr.",
"1-alt-variant": "u. Z."
}
},
"dateFormats": {
"full": "EEEE, d. MMMM y",
"long": "d. MMMM y",
"medium": "dd.MM.y",
"short": "dd.MM.yy"
},
"dateSkeletons": {
"full": "yMMMMEEEEd",
"long": "yMMMMd",
"medium": "yMMdd",
"short": "yyMMdd"
},
"timeFormats": {
"full": "HH:mm:ss zzzz",
"long": "HH:mm:ss z",
"medium": "HH:mm:ss",
"short": "HH:mm"
},
"timeSkeletons": {
"full": "HHmmsszzzz",
"long": "HHmmssz",
"medium": "HHmmss",
"short": "HHmm"
},
"dateTimeFormats": {
"full": "{1}, {0}",
"long": "{1}, {0}",
"medium": "{1}, {0}",
"short": "{1}, {0}",
"availableFormats": {
"Bh": "h B",
"Bhm": "h:mm B",
"Bhms": "h:mm:ss B",
"d": "d",
"E": "ccc",
"EBhm": "E h:mm B",
"EBhms": "E h:mm:ss B",
"Ed": "E, d.",
"Ehm": "E h:mma",
"EHm": "E, HH:mm",
"Ehms": "E, h:mm:ssa",
"EHms": "E, HH:mm:ss",
"Gy": "y G",
"GyMd": "dd.MM.y G",
"GyMMM": "MMM y G",
"GyMMMd": "d. MMM y G",
"GyMMMEd": "E, d. MMM y G",
"h": "h 'Uhr' a",
"H": "HH 'Uhr'",
"hm": "h:mma",
"Hm": "HH:mm",
"hms": "h:mm:ssa",
"Hms": "HH:mm:ss",
"hmsv": "h:mm:ssa v",
"Hmsv": "HH:mm:ss v",
"hmv": "h:mma v",
"Hmv": "HH:mm v",
"M": "L",
"Md": "d.M.",
"MEd": "E, d.M.",
"MMd": "d.MM.",
"MMdd": "dd.MM.",
"MMM": "LLL",
"MMMd": "d. MMM",
"MMMEd": "E, d. MMM",
"MMMMd": "d. MMMM",
"MMMMEd": "E, d. MMMM",
"MMMMW-count-one": "'Woche' W 'im' MMMM",
"MMMMW-count-other": "'Woche' W 'im' MMMM",
"ms": "mm:ss",
"y": "y",
"yM": "M/y",
"yMd": "d.M.y",
"yMEd": "E, d.M.y",
"yMM": "MM.y",
"yMMdd": "dd.MM.y",
"yMMM": "MMM y",
"yMMMd": "d. MMM y",
"yMMMEd": "E, d. MMM y",
"yMMMM": "MMMM y",
"yQQQ": "QQQ y",
"yQQQQ": "QQQQ y",
"yw-count-one": "'Woche' w 'des' 'Jahres' Y",
"yw-count-other": "'Woche' w 'des' 'Jahres' Y"
},
"appendItems": {
"Day": "{0} ({2}: {1})",
"Day-Of-Week": "{0} {1}",
"Era": "{1} {0}",
"Hour": "{0} ({2}: {1})",
"Minute": "{0} ({2}: {1})",
"Month": "{0} ({2}: {1})",
"Quarter": "{0} ({2}: {1})",
"Second": "{0} ({2}: {1})",
"Timezone": "{0} {1}",
"Week": "{0} ({2}: {1})",
"Year": "{1} {0}"
},
"intervalFormats": {
"intervalFormatFallback": "{0}{1}",
"Bh": {
"B": "h 'Uhr' Bh 'Uhr' B",
"h": "hh 'Uhr' B"
},
"Bhm": {
"B": "h:mm 'Uhr' Bh:mm 'Uhr' B",
"h": "h:mmh:mm 'Uhr' B",
"m": "h:mmh:mm 'Uhr' B"
},
"d": {
"d": "d.d."
},
"Gy": {
"G": "y Gy G",
"y": "yy G"
},
"GyM": {
"G": "MM/y GMM/y G",
"M": "MM/yMM/y G",
"y": "MM/yMM/y G"
},
"GyMd": {
"d": "dd.dd.MM.y G",
"G": "dd.MM.y Gdd.MM.y G",
"M": "dd.MM.dd.MM.y G",
"y": "dd.MM.ydd.MM.y G"
},
"GyMEd": {
"d": "E, dd.MM.yE, dd.MM.y G",
"G": "E, dd.MM.y GE, dd.MM.y G",
"M": "E, dd.MM.E, dd.MM.y G",
"y": "E, dd.MM.yE, dd.MM.y G"
},
"GyMMM": {
"G": "MMM y GMMM y G",
"M": "MMMMMM y G",
"y": "MMM yMMM y G"
},
"GyMMMd": {
"d": "d.d. MMM y G",
"G": "d. MMM y Gd. MMM y G",
"M": "d. MMMd. MMM y G",
"y": "d. MMM yd. MMM y G"
},
"GyMMMEd": {
"d": "E, d.E, d. MMM y G",
"G": "E, d. MMM y GE E, d. MMM y G",
"M": "E, d. MMME, d. MMM y G",
"y": "E, d. MMM yE, d. MMM y G"
},
"h": {
"a": "h 'Uhr' ah 'Uhr' a",
"h": "hh 'Uhr' a"
},
"H": {
"H": "HHHH 'Uhr'"
},
"hm": {
"a": "h:mmah:mma",
"h": "h:mmh:mma",
"m": "h:mmh:mma"
},
"Hm": {
"H": "HH:mmHH:mm 'Uhr'",
"m": "HH:mmHH:mm 'Uhr'"
},
"hmv": {
"a": "h:mmah:mma v",
"h": "h:mmh:mma v",
"m": "h:mmh:mma v"
},
"Hmv": {
"H": "HH:mmHH:mm 'Uhr' v",
"m": "HH:mmHH:mm 'Uhr' v"
},
"hv": {
"a": "haha v",
"h": "hha v"
},
"Hv": {
"H": "HHHH 'Uhr' v"
},
"M": {
"M": "MMMM"
},
"Md": {
"d": "dd.dd.MM.",
"M": "dd.MM.dd.MM."
},
"MEd": {
"d": "E, dd.E, dd.MM.",
"M": "E, dd.MM.E, dd.MM."
},
"MMM": {
"M": "MMMMMM"
},
"MMMd": {
"d": "d.d. MMM",
"M": "d. MMMd. MMM"
},
"MMMEd": {
"d": "E, d.E, d. MMM",
"M": "E, d. MMME, d. MMM"
},
"MMMM": {
"M": "LLLLLLLL"
},
"y": {
"y": "yy"
},
"yM": {
"M": "M/yM/y",
"y": "M/yM/y"
},
"yMd": {
"d": "dd.dd.MM.y",
"M": "dd.MM.dd.MM.y",
"y": "dd.MM.ydd.MM.y"
},
"yMEd": {
"d": "E, dd.E, dd.MM.y",
"M": "E, dd.MM.E, dd.MM.y",
"y": "E, dd.MM.yE, dd.MM.y"
},
"yMMM": {
"M": "MMMMMM y",
"y": "MMM yMMM y"
},
"yMMMd": {
"d": "d.d. MMM y",
"M": "d. MMMd. MMM y",
"y": "d. MMM yd. MMM y"
},
"yMMMEd": {
"d": "E, d.E, d. MMM y",
"M": "E, d. MMME, d. MMM y",
"y": "E, d. MMM yE, d. MMM y"
},
"yMMMM": {
"M": "MMMMMMMM y",
"y": "MMMM yMMMM y"
}
}
},
"dateTimeFormats-atTime": {
"standard": {
"full": "{1} 'um' {0}",
"long": "{1} 'um' {0}",
"medium": "{1}, {0}",
"short": "{1}, {0}"
}
}
}
}
}
}
}
}


@@ -0,0 +1,394 @@
{
"supplemental": {
"version": {
"_unicodeVersion": "16.0.0",
"_cldrVersion": "47"
},
"numberingSystems": {
"adlm": {
"_digits": "𞥐𞥑𞥒𞥓𞥔𞥕𞥖𞥗𞥘𞥙",
"_type": "numeric"
},
"ahom": {
"_digits": "𑜰𑜱𑜲𑜳𑜴𑜵𑜶𑜷𑜸𑜹",
"_type": "numeric"
},
"arab": {
"_digits": "٠١٢٣٤٥٦٧٨٩",
"_type": "numeric"
},
"arabext": {
"_digits": "۰۱۲۳۴۵۶۷۸۹",
"_type": "numeric"
},
"armn": {
"_rules": "armenian-upper",
"_type": "algorithmic"
},
"armnlow": {
"_rules": "armenian-lower",
"_type": "algorithmic"
},
"bali": {
"_digits": "᭐᭑᭒᭓᭔᭕᭖᭗᭘᭙",
"_type": "numeric"
},
"beng": {
"_digits": "০১২৩৪৫৬৭৮৯",
"_type": "numeric"
},
"bhks": {
"_digits": "𑱐𑱑𑱒𑱓𑱔𑱕𑱖𑱗𑱘𑱙",
"_type": "numeric"
},
"brah": {
"_digits": "𑁦𑁧𑁨𑁩𑁪𑁫𑁬𑁭𑁮𑁯",
"_type": "numeric"
},
"cakm": {
"_digits": "𑄶𑄷𑄸𑄹𑄺𑄻𑄼𑄽𑄾𑄿",
"_type": "numeric"
},
"cham": {
"_digits": "꩐꩑꩒꩓꩔꩕꩖꩗꩘꩙",
"_type": "numeric"
},
"cyrl": {
"_rules": "cyrillic-lower",
"_type": "algorithmic"
},
"deva": {
"_digits": "०१२३४५६७८९",
"_type": "numeric"
},
"diak": {
"_digits": "𑥐𑥑𑥒𑥓𑥔𑥕𑥖𑥗𑥘𑥙",
"_type": "numeric"
},
"ethi": {
"_rules": "ethiopic",
"_type": "algorithmic"
},
"fullwide": {
"_digits": "",
"_type": "numeric"
},
"gara": {
"_digits": "𐵀𐵁𐵂𐵃𐵄𐵅𐵆𐵇𐵈𐵉",
"_type": "numeric"
},
"geor": {
"_rules": "georgian",
"_type": "algorithmic"
},
"gong": {
"_digits": "𑶠𑶡𑶢𑶣𑶤𑶥𑶦𑶧𑶨𑶩",
"_type": "numeric"
},
"gonm": {
"_digits": "𑵐𑵑𑵒𑵓𑵔𑵕𑵖𑵗𑵘𑵙",
"_type": "numeric"
},
"grek": {
"_rules": "greek-upper",
"_type": "algorithmic"
},
"greklow": {
"_rules": "greek-lower",
"_type": "algorithmic"
},
"gujr": {
"_digits": "૦૧૨૩૪૫૬૭૮૯",
"_type": "numeric"
},
"gukh": {
"_digits": "𖄰𖄱𖄲𖄳𖄴𖄵𖄶𖄷𖄸𖄹",
"_type": "numeric"
},
"guru": {
"_digits": "੦੧੨੩੪੫੬੭੮੯",
"_type": "numeric"
},
"hanidays": {
"_rules": "zh/SpelloutRules/spellout-numbering-days",
"_type": "algorithmic"
},
"hanidec": {
"_digits": "〇一二三四五六七八九",
"_type": "numeric"
},
"hans": {
"_rules": "zh/SpelloutRules/spellout-cardinal",
"_type": "algorithmic"
},
"hansfin": {
"_rules": "zh/SpelloutRules/spellout-cardinal-financial",
"_type": "algorithmic"
},
"hant": {
"_rules": "zh_Hant/SpelloutRules/spellout-cardinal",
"_type": "algorithmic"
},
"hantfin": {
"_rules": "zh_Hant/SpelloutRules/spellout-cardinal-financial",
"_type": "algorithmic"
},
"hebr": {
"_rules": "hebrew",
"_type": "algorithmic"
},
"hmng": {
"_digits": "𖭐𖭑𖭒𖭓𖭔𖭕𖭖𖭗𖭘𖭙",
"_type": "numeric"
},
"hmnp": {
"_digits": "𞅀𞅁𞅂𞅃𞅄𞅅𞅆𞅇𞅈𞅉",
"_type": "numeric"
},
"java": {
"_digits": "꧐꧑꧒꧓꧔꧕꧖꧗꧘꧙",
"_type": "numeric"
},
"jpan": {
"_rules": "ja/SpelloutRules/spellout-cardinal",
"_type": "algorithmic"
},
"jpanfin": {
"_rules": "ja/SpelloutRules/spellout-cardinal-financial",
"_type": "algorithmic"
},
"jpanyear": {
"_rules": "ja/SpelloutRules/spellout-numbering-year-latn",
"_type": "algorithmic"
},
"kali": {
"_digits": "꤀꤁꤂꤃꤄꤅꤆꤇꤈꤉",
"_type": "numeric"
},
"kawi": {
"_digits": "𑽐𑽑𑽒𑽓𑽔𑽕𑽖𑽗𑽘𑽙",
"_type": "numeric"
},
"khmr": {
"_digits": "០១២៣៤៥៦៧៨៩",
"_type": "numeric"
},
"knda": {
"_digits": "೦೧೨೩೪೫೬೭೮೯",
"_type": "numeric"
},
"krai": {
"_digits": "𖵰𖵱𖵲𖵳𖵴𖵵𖵶𖵷𖵸𖵹",
"_type": "numeric"
},
"lana": {
"_digits": "᪀᪁᪂᪃᪄᪅᪆᪇᪈᪉",
"_type": "numeric"
},
"lanatham": {
"_digits": "᪐᪑᪒᪓᪔᪕᪖᪗᪘᪙",
"_type": "numeric"
},
"laoo": {
"_digits": "໐໑໒໓໔໕໖໗໘໙",
"_type": "numeric"
},
"latn": {
"_digits": "0123456789",
"_type": "numeric"
},
"lepc": {
"_digits": "᱀᱁᱂᱃᱄᱅᱆᱇᱈᱉",
"_type": "numeric"
},
"limb": {
"_digits": "᥆᥇᥈᥉᥊᥋᥌᥍᥎᥏",
"_type": "numeric"
},
"mathbold": {
"_digits": "𝟎𝟏𝟐𝟑𝟒𝟓𝟔𝟕𝟖𝟗",
"_type": "numeric"
},
"mathdbl": {
"_digits": "𝟘𝟙𝟚𝟛𝟜𝟝𝟞𝟟𝟠𝟡",
"_type": "numeric"
},
"mathmono": {
"_digits": "𝟶𝟷𝟸𝟹𝟺𝟻𝟼𝟽𝟾𝟿",
"_type": "numeric"
},
"mathsanb": {
"_digits": "𝟬𝟭𝟮𝟯𝟰𝟱𝟲𝟳𝟴𝟵",
"_type": "numeric"
},
"mathsans": {
"_digits": "𝟢𝟣𝟤𝟥𝟦𝟧𝟨𝟩𝟪𝟫",
"_type": "numeric"
},
"mlym": {
"_digits": "൦൧൨൩൪൫൬൭൮൯",
"_type": "numeric"
},
"modi": {
"_digits": "𑙐𑙑𑙒𑙓𑙔𑙕𑙖𑙗𑙘𑙙",
"_type": "numeric"
},
"mong": {
"_digits": "᠐᠑᠒᠓᠔᠕᠖᠗᠘᠙",
"_type": "numeric"
},
"mroo": {
"_digits": "𖩠𖩡𖩢𖩣𖩤𖩥𖩦𖩧𖩨𖩩",
"_type": "numeric"
},
"mtei": {
"_digits": "꯰꯱꯲꯳꯴꯵꯶꯷꯸꯹",
"_type": "numeric"
},
"mymr": {
"_digits": "၀၁၂၃၄၅၆၇၈၉",
"_type": "numeric"
},
"mymrepka": {
"_digits": "𑛚𑛛𑛜𑛝𑛞𑛟𑛠𑛡𑛢𑛣",
"_type": "numeric"
},
"mymrpao": {
"_digits": "𑛐𑛑𑛒𑛓𑛔𑛕𑛖𑛗𑛘𑛙",
"_type": "numeric"
},
"mymrshan": {
"_digits": "႐႑႒႓႔႕႖႗႘႙",
"_type": "numeric"
},
"mymrtlng": {
"_digits": "꧰꧱꧲꧳꧴꧵꧶꧷꧸꧹",
"_type": "numeric"
},
"nagm": {
"_digits": "𞓰𞓱𞓲𞓳𞓴𞓵𞓶𞓷𞓸𞓹",
"_type": "numeric"
},
"newa": {
"_digits": "𑑐𑑑𑑒𑑓𑑔𑑕𑑖𑑗𑑘𑑙",
"_type": "numeric"
},
"nkoo": {
"_digits": "߀߁߂߃߄߅߆߇߈߉",
"_type": "numeric"
},
"olck": {
"_digits": "᱐᱑᱒᱓᱔᱕᱖᱗᱘᱙",
"_type": "numeric"
},
"onao": {
"_digits": "𞗱𞗲𞗳𞗴𞗵𞗶𞗷𞗸𞗹𞗺",
"_type": "numeric"
},
"orya": {
"_digits": "୦୧୨୩୪୫୬୭୮୯",
"_type": "numeric"
},
"osma": {
"_digits": "𐒠𐒡𐒢𐒣𐒤𐒥𐒦𐒧𐒨𐒩",
"_type": "numeric"
},
"outlined": {
"_digits": "𜳰𜳱𜳲𜳳𜳴𜳵𜳶𜳷𜳸𜳹",
"_type": "numeric"
},
"rohg": {
"_digits": "𐴰𐴱𐴲𐴳𐴴𐴵𐴶𐴷𐴸𐴹",
"_type": "numeric"
},
"roman": {
"_rules": "roman-upper",
"_type": "algorithmic"
},
"romanlow": {
"_rules": "roman-lower",
"_type": "algorithmic"
},
"saur": {
"_digits": "꣐꣑꣒꣓꣔꣕꣖꣗꣘꣙",
"_type": "numeric"
},
"segment": {
"_digits": "🯰🯱🯲🯳🯴🯵🯶🯷🯸🯹",
"_type": "numeric"
},
"shrd": {
"_digits": "𑇐𑇑𑇒𑇓𑇔𑇕𑇖𑇗𑇘𑇙",
"_type": "numeric"
},
"sind": {
"_digits": "𑋰𑋱𑋲𑋳𑋴𑋵𑋶𑋷𑋸𑋹",
"_type": "numeric"
},
"sinh": {
"_digits": "෦෧෨෩෪෫෬෭෮෯",
"_type": "numeric"
},
"sora": {
"_digits": "𑃰𑃱𑃲𑃳𑃴𑃵𑃶𑃷𑃸𑃹",
"_type": "numeric"
},
"sund": {
"_digits": "᮰᮱᮲᮳᮴᮵᮶᮷᮸᮹",
"_type": "numeric"
},
"sunu": {
"_digits": "𑯰𑯱𑯲𑯳𑯴𑯵𑯶𑯷𑯸𑯹",
"_type": "numeric"
},
"takr": {
"_digits": "𑛀𑛁𑛂𑛃𑛄𑛅𑛆𑛇𑛈𑛉",
"_type": "numeric"
},
"talu": {
"_digits": "᧐᧑᧒᧓᧔᧕᧖᧗᧘᧙",
"_type": "numeric"
},
"taml": {
"_rules": "tamil",
"_type": "algorithmic"
},
"tamldec": {
"_digits": "௦௧௨௩௪௫௬௭௮௯",
"_type": "numeric"
},
"telu": {
"_digits": "౦౧౨౩౪౫౬౭౮౯",
"_type": "numeric"
},
"thai": {
"_digits": "๐๑๒๓๔๕๖๗๘๙",
"_type": "numeric"
},
"tibt": {
"_digits": "༠༡༢༣༤༥༦༧༨༩",
"_type": "numeric"
},
"tirh": {
"_digits": "𑓐𑓑𑓒𑓓𑓔𑓕𑓖𑓗𑓘𑓙",
"_type": "numeric"
},
"tnsa": {
"_digits": "𖫀𖫁𖫂𖫃𖫄𖫅𖫆𖫇𖫈𖫉",
"_type": "numeric"
},
"vaii": {
"_digits": "꘠꘡꘢꘣꘤꘥꘦꘧꘨꘩",
"_type": "numeric"
},
"wara": {
"_digits": "𑣠𑣡𑣢𑣣𑣤𑣥𑣦𑣧𑣨𑣩",
"_type": "numeric"
},
"wcho": {
"_digits": "𞋰𞋱𞋲𞋳𞋴𞋵𞋶𞋷𞋸𞋹",
"_type": "numeric"
}
}
}
}


@@ -0,0 +1,164 @@
{
"main": {
"de": {
"identity": {
"language": "de"
},
"numbers": {
"defaultNumberingSystem": "latn",
"otherNumberingSystems": {
"native": "latn"
},
"minimumGroupingDigits": "1",
"symbols-numberSystem-latn": {
"decimal": ",",
"group": ".",
"list": ";",
"percentSign": "%",
"plusSign": "+",
"minusSign": "-",
"approximatelySign": "≈",
"exponential": "E",
"superscriptingExponent": "·",
"perMille": "‰",
"infinity": "∞",
"nan": "NaN",
"timeSeparator": ":"
},
"decimalFormats-numberSystem-latn": {
"standard": "#,##0.###",
"long": {
"decimalFormat": {
"1000-count-one": "0 Tausend",
"1000-count-other": "0 Tausend",
"10000-count-one": "00 Tausend",
"10000-count-other": "00 Tausend",
"100000-count-one": "000 Tausend",
"100000-count-other": "000 Tausend",
"1000000-count-one": "0 Million",
"1000000-count-other": "0 Millionen",
"10000000-count-one": "00 Millionen",
"10000000-count-other": "00 Millionen",
"100000000-count-one": "000 Millionen",
"100000000-count-other": "000 Millionen",
"1000000000-count-one": "0 Milliarde",
"1000000000-count-other": "0 Milliarden",
"10000000000-count-one": "00 Milliarden",
"10000000000-count-other": "00 Milliarden",
"100000000000-count-one": "000 Milliarden",
"100000000000-count-other": "000 Milliarden",
"1000000000000-count-one": "0 Billion",
"1000000000000-count-other": "0 Billionen",
"10000000000000-count-one": "00 Billionen",
"10000000000000-count-other": "00 Billionen",
"100000000000000-count-one": "000 Billionen",
"100000000000000-count-other": "000 Billionen"
}
},
"short": {
"decimalFormat": {
"1000-count-one": "0",
"1000-count-other": "0",
"10000-count-one": "0",
"10000-count-other": "0",
"100000-count-one": "0",
"100000-count-other": "0",
"1000000-count-one": "0 Mio'.'",
"1000000-count-other": "0 Mio'.'",
"10000000-count-one": "00 Mio'.'",
"10000000-count-other": "00 Mio'.'",
"100000000-count-one": "000 Mio'.'",
"100000000-count-other": "000 Mio'.'",
"1000000000-count-one": "0 Mrd'.'",
"1000000000-count-other": "0 Mrd'.'",
"10000000000-count-one": "00 Mrd'.'",
"10000000000-count-other": "00 Mrd'.'",
"100000000000-count-one": "000 Mrd'.'",
"100000000000-count-other": "000 Mrd'.'",
"1000000000000-count-one": "0 Bio'.'",
"1000000000000-count-other": "0 Bio'.'",
"10000000000000-count-one": "00 Bio'.'",
"10000000000000-count-other": "00 Bio'.'",
"100000000000000-count-one": "000 Bio'.'",
"100000000000000-count-other": "000 Bio'.'"
}
}
},
"scientificFormats-numberSystem-latn": {
"standard": "#E0"
},
"percentFormats-numberSystem-latn": {
"standard": "#,##0 %"
},
"currencyFormats-numberSystem-latn": {
"currencySpacing": {
"beforeCurrency": {
"currencyMatch": "[[:^S:]&[:^Z:]]",
"surroundingMatch": "[:digit:]",
"insertBetween": " "
},
"afterCurrency": {
"currencyMatch": "[[:^S:]&[:^Z:]]",
"surroundingMatch": "[:digit:]",
"insertBetween": " "
}
},
"standard": "#,##0.00 ¤",
"standard-alphaNextToNumber": "¤ #,##0.00",
"standard-noCurrency": "#,##0.00",
"accounting": "#,##0.00 ¤",
"accounting-alphaNextToNumber": "¤ #,##0.00",
"accounting-noCurrency": "#,##0.00",
"short": {
"standard": {
"1000-count-one": "0",
"1000-count-other": "0",
"10000-count-one": "0",
"10000-count-other": "0",
"100000-count-one": "0",
"100000-count-other": "0",
"1000000-count-one": "0 Mio'.' ¤",
"1000000-count-other": "0 Mio'.' ¤",
"10000000-count-one": "00 Mio'.' ¤",
"10000000-count-other": "00 Mio'.' ¤",
"100000000-count-one": "000 Mio'.' ¤",
"100000000-count-other": "000 Mio'.' ¤",
"1000000000-count-one": "0 Mrd'.' ¤",
"1000000000-count-other": "0 Mrd'.' ¤",
"10000000000-count-one": "00 Mrd'.' ¤",
"10000000000-count-other": "00 Mrd'.' ¤",
"100000000000-count-one": "000 Mrd'.' ¤",
"100000000000-count-other": "000 Mrd'.' ¤",
"1000000000000-count-one": "0 Bio'.' ¤",
"1000000000000-count-other": "0 Bio'.' ¤",
"10000000000000-count-one": "00 Bio'.' ¤",
"10000000000000-count-other": "00 Bio'.' ¤",
"100000000000000-count-one": "000 Bio'.' ¤",
"100000000000000-count-other": "000 Bio'.' ¤"
}
},
"currencyPatternAppendISO": "{0} ¤¤",
"unitPattern-count-other": "{0} {1}"
},
"miscPatterns-numberSystem-latn": {
"approximately": "≈{0}",
"atLeast": "{0}+",
"atMost": "≤{0}",
"range": "{0}{1}"
},
"minimalPairs": {
"pluralMinimalPairs-count-one": "{0} Tag",
"pluralMinimalPairs-count-other": "{0} Tage",
"other": "{0}. Abzweigung nach rechts nehmen",
"accusative": "… für {0} …",
"dative": "… mit {0} …",
"genitive": "Anstatt {0} …",
"nominative": "{0} kostet (kosten) € 3,50.",
"feminine": "Die {0} ist …",
"masculine": "Der {0} ist …",
"neuter": "Das {0} ist …"
}
}
}
}
}

File diff suppressed because it is too large.

278
dashboard/src/clients.tsx Normal file

@@ -0,0 +1,278 @@
import SetupModeButton from './components/SetupModeButton';
import React, { useEffect, useState } from 'react';
import { useClientDelete } from './hooks/useClientDelete';
import { fetchClients, updateClient } from './apiClients';
import type { Client } from './apiClients';
import {
GridComponent,
ColumnsDirective,
ColumnDirective,
Page,
Inject,
Toolbar,
Search,
Sort,
Edit,
} from '@syncfusion/ej2-react-grids';
import { DialogComponent } from '@syncfusion/ej2-react-popups';
// Room groups are loaded dynamically from the API
interface DetailsModalProps {
open: boolean;
client: Client | null;
groupIdToName: Record<string | number, string>;
onClose: () => void;
}
function DetailsModal({ open, client, groupIdToName, onClose }: DetailsModalProps) {
if (!open || !client) return null;
return (
<div
style={{
position: 'fixed',
top: 0,
left: 0,
right: 0,
bottom: 0,
background: 'rgba(0,0,0,0.3)',
zIndex: 1000,
}}
>
<div
style={{
background: 'white',
padding: 0,
margin: '100px auto',
maxWidth: 500,
borderRadius: 12,
boxShadow: '0 4px 24px rgba(0,0,0,0.12)',
}}
>
<div style={{ padding: 32 }}>
<h3 style={{ fontSize: '1.25rem', fontWeight: 700, marginBottom: 18 }}>Client-Details</h3>
<table style={{ width: '100%', borderCollapse: 'collapse', marginBottom: 24 }}>
<tbody>
{Object.entries(client)
.filter(
([key]) =>
![
'index',
'is_active',
'type',
'column',
'group_name',
'foreignKeyData',
'hardware_token',
].includes(key)
)
.map(([key, value]) => (
<tr key={key}>
<td style={{ fontWeight: 'bold', padding: '6px 8px' }}>
{key === 'group_id'
? 'Raumgruppe'
: key === 'ip'
? 'IP-Adresse'
: key === 'registration_time'
? 'Registriert am'
: key === 'description'
? 'Beschreibung'
: key === 'last_alive'
? 'Letzter Kontakt'
: key === 'model'
? 'Modell'
: key === 'uuid'
? 'Client-Code'
: key === "os_version"
? 'Betriebssystem'
: key === 'software_version'
? 'Clientsoftware'
: key === 'macs'
? 'MAC-Adressen'
: key.charAt(0).toUpperCase() + key.slice(1)}
:
</td>
<td style={{ padding: '6px 8px' }}>
{key === 'group_id'
? value !== undefined
? groupIdToName[value as string | number] || value
: ''
: key === 'registration_time' && value
? new Date(
(value as string).endsWith('Z') ? (value as string) : value + 'Z'
).toLocaleString()
: key === 'last_alive' && value
? String(value) // display the value directly, don't parse it again
: String(value)}
</td>
</tr>
))}
</tbody>
</table>
<div style={{ textAlign: 'right' }}>
<button className="e-btn e-outline" onClick={onClose}>
Schließen
</button>
</div>
</div>
</div>
</div>
);
}
const Clients: React.FC = () => {
const [clients, setClients] = useState<Client[]>([]);
const [groups, setGroups] = useState<{ id: number; name: string }[]>([]);
const [detailsClient, setDetailsClient] = useState<Client | null>(null);
const { showDialog, deleteClientId, handleDelete, confirmDelete, cancelDelete } = useClientDelete(
uuid => setClients(prev => prev.filter(c => c.uuid !== uuid))
);
useEffect(() => {
fetchClients().then(setClients);
// Load groups
import('./apiGroups').then(mod => mod.fetchGroups()).then(setGroups);
}, []);
// Map group_id to group_name
const groupIdToName: Record<string | number, string> = {};
groups.forEach(g => {
groupIdToName[g.id] = g.name;
});
// DataGrid data with resolved group names and formatted timestamps
const gridData = clients.map(c => ({
...c,
group_name: c.group_id !== undefined ? groupIdToName[c.group_id] || String(c.group_id) : '',
last_alive: c.last_alive
? new Date(
(c.last_alive as string).endsWith('Z') ? (c.last_alive as string) : c.last_alive + 'Z'
).toLocaleString()
: '',
}));
// DataGrid row template for the details and remove buttons
const detailsButtonTemplate = (props: Client) => (
<div style={{ display: 'flex', gap: '8px' }}>
<button
className="e-btn e-primary"
onClick={() => setDetailsClient(props)}
style={{ minWidth: 80 }}
>
Details
</button>
<button
className="e-btn e-danger"
onClick={e => {
e.stopPropagation();
handleDelete(props.uuid);
}}
style={{ minWidth: 80 }}
>
Entfernen
</button>
</div>
);
return (
<div>
<div className="flex justify-between items-center mb-4">
<h2 className="text-xl font-bold">Client-Übersicht</h2>
<SetupModeButton />
</div>
{groups.length > 0 ? (
<>
<GridComponent
dataSource={gridData}
allowPaging={true}
pageSettings={{ pageSize: 10 }}
toolbar={['Search', 'Edit', 'Update', 'Cancel']}
allowSorting={true}
allowFiltering={true}
height={400}
editSettings={{
allowEditing: true,
allowAdding: false,
allowDeleting: false,
mode: 'Normal',
}}
actionComplete={async (args: {
requestType: string;
data: Record<string, unknown>;
}) => {
if (args.requestType === 'save') {
const { uuid, description, model } = args.data as {
uuid: string;
description: string;
model: string;
};
// API call to save
await updateClient(uuid, { description, model });
// Reload after saving
fetchClients().then(setClients);
}
}}
>
<ColumnsDirective>
<ColumnDirective
field="description"
headerText="Beschreibung"
allowEditing={true}
width="180"
/>
<ColumnDirective
field="group_name"
headerText="Raumgruppe"
allowEditing={false}
width="140"
/>
<ColumnDirective field="uuid" headerText="UUID" allowEditing={false} width="160" />
<ColumnDirective field="ip" headerText="IP-Adresse" allowEditing={false} width="80" />
<ColumnDirective
field="last_alive"
headerText="Last Alive"
allowEditing={false}
width="120"
/>
<ColumnDirective field="model" headerText="Model" allowEditing={true} width="120" />
<ColumnDirective
headerText="Aktion"
width="190"
template={detailsButtonTemplate}
textAlign="Center"
allowEditing={false}
/>
</ColumnsDirective>
<Inject services={[Page, Toolbar, Search, Sort, Edit]} />
</GridComponent>
<DetailsModal
open={!!detailsClient}
client={detailsClient}
groupIdToName={groupIdToName}
onClose={() => setDetailsClient(null)}
/>
</>
) : (
<div className="text-gray-500">Raumgruppen werden geladen ...</div>
)}
{/* DialogComponent for confirmation */}
{showDialog && deleteClientId && (
<DialogComponent
visible={showDialog}
header="Bestätigung"
content="Möchten Sie diesen Client wirklich entfernen?"
showCloseIcon={true}
width="400px"
buttons={[
{ click: confirmDelete, buttonModel: { content: 'Ja', isPrimary: true } },
{ click: cancelDelete, buttonModel: { content: 'Abbrechen' } },
]}
close={cancelDelete}
/>
)}
</div>
);
};
export default Clients;
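The grid above normalizes naive server timestamps by appending `'Z'` in two places (`last_alive`, `registration_time`). A small helper could factor that out; the names below are hypothetical and not part of the codebase:

```typescript
// Hypothetical helper: treat a naive server timestamp as UTC by
// appending 'Z' when no zone designator is present, then parse it.
export function parseServerTimestamp(value: string): Date {
  return new Date(value.endsWith('Z') ? value : value + 'Z');
}

// The per-row formatting in gridData would then reduce to:
export function formatServerTimestamp(value: string | null | undefined): string {
  return value ? parseServerTimestamp(value).toLocaleString() : '';
}
```

This keeps the UTC assumption in one place instead of repeating the ternary per column.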

View File

@@ -0,0 +1,515 @@
import React from 'react';
import { DialogComponent } from '@syncfusion/ej2-react-popups';
import { TextBoxComponent } from '@syncfusion/ej2-react-inputs';
import { DatePickerComponent, TimePickerComponent } from '@syncfusion/ej2-react-calendars';
import { DropDownListComponent, MultiSelectComponent } from '@syncfusion/ej2-react-dropdowns';
import { CheckBoxComponent } from '@syncfusion/ej2-react-buttons';
import CustomSelectUploadEventModal from './CustomSelectUploadEventModal';
import { updateEvent } from '../apiEvents';
type CustomEventData = {
title: string;
startDate: Date | null;
startTime: Date | null;
endTime: Date | null;
type: string;
description: string;
repeat: boolean;
weekdays: number[];
repeatUntil: Date | null;
skipHolidays: boolean;
media?: { id: string; path: string; name: string } | null; // <--- added
slideshowInterval?: number; // <--- added
websiteUrl?: string; // <--- added
};
// Extend the initialData type so that Id is supported
type CustomEventModalProps = {
open: boolean;
onClose: () => void;
onSave: (eventData: CustomEventData) => void;
initialData?: Partial<CustomEventData> & { Id?: string }; // <--- Id added
groupName: string | { id: string | null; name: string };
groupColor?: string;
editMode?: boolean;
blockHolidays?: boolean;
isHolidayRange?: (start: Date, end: Date) => boolean;
};
const weekdayOptions = [
{ value: 0, label: 'Montag' },
{ value: 1, label: 'Dienstag' },
{ value: 2, label: 'Mittwoch' },
{ value: 3, label: 'Donnerstag' },
{ value: 4, label: 'Freitag' },
{ value: 5, label: 'Samstag' },
{ value: 6, label: 'Sonntag' },
];
const typeOptions = [
{ value: 'presentation', label: 'Präsentation' },
{ value: 'website', label: 'Website' },
{ value: 'video', label: 'Video' },
{ value: 'message', label: 'Nachricht' },
{ value: 'webuntis', label: 'WebUntis' },
];
const CustomEventModal: React.FC<CustomEventModalProps> = ({
open,
onClose,
onSave,
initialData = {},
groupName,
groupColor,
editMode,
blockHolidays,
isHolidayRange,
}) => {
const [title, setTitle] = React.useState(initialData.title || '');
const [startDate, setStartDate] = React.useState(initialData.startDate || null);
const [startTime, setStartTime] = React.useState(
initialData.startTime || new Date(0, 0, 0, 9, 0)
);
const [endTime, setEndTime] = React.useState(initialData.endTime || new Date(0, 0, 0, 9, 30));
const [type, setType] = React.useState(initialData.type ?? 'presentation');
const [description, setDescription] = React.useState(initialData.description || '');
const [repeat, setRepeat] = React.useState(initialData.repeat || false);
const [weekdays, setWeekdays] = React.useState<number[]>(initialData.weekdays || []);
const [repeatUntil, setRepeatUntil] = React.useState(initialData.repeatUntil || null);
const [skipHolidays, setSkipHolidays] = React.useState(initialData.skipHolidays || false);
const [errors, setErrors] = React.useState<{ [key: string]: string }>({});
// --- FIX: take media, slideshowInterval and websiteUrl from initialData ---
const [media, setMedia] = React.useState<{ id: string; path: string; name: string } | null>(
initialData.media ?? null
);
const [pendingMedia, setPendingMedia] = React.useState<{
id: string;
path: string;
name: string;
} | null>(null);
const [slideshowInterval, setSlideshowInterval] = React.useState<number>(
initialData.slideshowInterval ?? 10
);
const [websiteUrl, setWebsiteUrl] = React.useState<string>(initialData.websiteUrl ?? '');
const [mediaModalOpen, setMediaModalOpen] = React.useState(false);
React.useEffect(() => {
if (open) {
setTitle(initialData.title || '');
setStartDate(initialData.startDate || null);
setStartTime(initialData.startTime || new Date(0, 0, 0, 9, 0));
setEndTime(initialData.endTime || new Date(0, 0, 0, 9, 30));
setType(initialData.type ?? 'presentation');
setDescription(initialData.description || '');
setRepeat(initialData.repeat || false);
setWeekdays(initialData.weekdays || []);
setRepeatUntil(initialData.repeatUntil || null);
setSkipHolidays(initialData.skipHolidays || false);
// --- FIX: take media, slideshowInterval and websiteUrl from initialData ---
setMedia(initialData.media ?? null);
setSlideshowInterval(initialData.slideshowInterval ?? 10);
setWebsiteUrl(initialData.websiteUrl ?? '');
}
}, [open, initialData]);
React.useEffect(() => {
if (!mediaModalOpen && pendingMedia) {
setMedia(pendingMedia);
setPendingMedia(null);
}
}, [mediaModalOpen, pendingMedia]);
const handleSave = async () => {
const newErrors: { [key: string]: string } = {};
if (!title.trim()) newErrors.title = 'Titel ist erforderlich';
if (!startDate) newErrors.startDate = 'Startdatum ist erforderlich';
if (!startTime) newErrors.startTime = 'Startzeit ist erforderlich';
if (!endTime) newErrors.endTime = 'Endzeit ist erforderlich';
if (!type) newErrors.type = 'Termintyp ist erforderlich';
// Past-date check
const startDateTime =
startDate && startTime
? new Date(
startDate.getFullYear(),
startDate.getMonth(),
startDate.getDate(),
startTime.getHours(),
startTime.getMinutes()
)
: null;
const isPast = startDateTime && startDateTime < new Date();
if (isPast) {
newErrors.startDate = 'Termine in der Vergangenheit sind nicht erlaubt!';
}
if (type === 'presentation') {
if (!media) newErrors.media = 'Bitte eine Präsentation auswählen';
if (!slideshowInterval || slideshowInterval < 1)
newErrors.slideshowInterval = 'Intervall angeben';
}
if (type === 'website') {
if (!websiteUrl.trim()) newErrors.websiteUrl = 'Webseiten-URL ist erforderlich';
}
// Holiday blocking: prevent creating when range overlaps
if (
!editMode &&
blockHolidays &&
startDate &&
startTime &&
endTime &&
typeof isHolidayRange === 'function'
) {
const s = new Date(
startDate.getFullYear(),
startDate.getMonth(),
startDate.getDate(),
startTime.getHours(),
startTime.getMinutes()
);
const e = new Date(
startDate.getFullYear(),
startDate.getMonth(),
startDate.getDate(),
endTime.getHours(),
endTime.getMinutes()
);
if (isHolidayRange(s, e)) {
newErrors.startDate = 'Dieser Zeitraum liegt in den Ferien und ist gesperrt.';
}
}
if (Object.keys(newErrors).length > 0) {
setErrors(newErrors);
return;
}
setErrors({});
const group_id = typeof groupName === 'object' && groupName !== null ? groupName.id : groupName;
const payload: CustomEventData & { [key: string]: unknown } = {
group_id,
title,
description,
start:
startDate && startTime
? new Date(
startDate.getFullYear(),
startDate.getMonth(),
startDate.getDate(),
startTime.getHours(),
startTime.getMinutes()
).toISOString()
: null,
end:
startDate && endTime
? new Date(
startDate.getFullYear(),
startDate.getMonth(),
startDate.getDate(),
endTime.getHours(),
endTime.getMinutes()
).toISOString()
: null,
type,
startDate,
startTime,
endTime,
repeat,
weekdays,
repeatUntil,
skipHolidays,
event_type: type,
is_active: 1,
created_by: 1,
};
if (type === 'presentation') {
payload.event_media_id = media?.id;
payload.slideshow_interval = slideshowInterval;
}
if (type === 'website') {
payload.website_url = websiteUrl;
}
try {
let res;
if (editMode && initialData && typeof initialData.Id === 'string') {
// UPDATE instead of CREATE
res = await updateEvent(initialData.Id, payload);
} else {
// CREATE
res = await fetch('/api/events', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify(payload),
});
res = await res.json();
}
if (res.success) {
onSave(payload);
onClose(); // <--- close the dialog after a successful save
} else {
setErrors({ api: res.error || 'Fehler beim Speichern' });
}
} catch {
setErrors({ api: 'Netzwerkfehler beim Speichern' });
}
};
// Past-date check (outside handleSave so it is available during render)
const startDateTime =
startDate && startTime
? new Date(
startDate.getFullYear(),
startDate.getMonth(),
startDate.getDate(),
startTime.getHours(),
startTime.getMinutes()
)
: null;
const isPast = !!(startDateTime && startDateTime < new Date());
return (
<DialogComponent
target="#root"
visible={open}
width="800px"
header={() => (
<div
style={{
background: groupColor || '#f5f5f5',
padding: '8px 16px',
borderRadius: '6px 6px 0 0',
color: '#fff',
fontWeight: 600,
fontSize: 20,
display: 'flex',
alignItems: 'center',
}}
>
{editMode ? 'Termin bearbeiten' : 'Neuen Termin anlegen'}
{groupName && (
<span style={{ fontWeight: 400, fontSize: 16, marginLeft: 16, color: '#fff' }}>
für Raumgruppe: <b>{typeof groupName === 'object' ? groupName.name : groupName}</b>
</span>
)}
</div>
)}
showCloseIcon={true}
close={onClose}
isModal={true}
footerTemplate={() => (
<div className="flex gap-2 justify-end">
<button className="e-btn e-danger" onClick={onClose}>
Schließen
</button>
<button
className="e-btn e-success"
onClick={handleSave}
disabled={isPast} // <--- disable the button when the event is in the past
>
Termin(e) speichern
</button>
</div>
)}
>
<div style={{ padding: '24px' }}>
<div style={{ display: 'flex', gap: 24, flexWrap: 'wrap' }}>
<div style={{ flex: 1, minWidth: 260 }}>
{/* ...title, description, date, time... */}
<div style={{ marginBottom: 12 }}>
<TextBoxComponent
placeholder="Titel"
floatLabelType="Auto"
value={title}
change={e => setTitle(e.value)}
/>
{errors.title && <div style={{ color: 'red', fontSize: 12 }}>{errors.title}</div>}
</div>
<div style={{ marginBottom: 12 }}>
<TextBoxComponent
placeholder="Beschreibung"
floatLabelType="Auto"
multiline={true}
value={description}
change={e => setDescription(e.value)}
/>
</div>
<div style={{ marginBottom: 12 }}>
<DatePickerComponent
placeholder="Startdatum"
floatLabelType="Auto"
value={startDate ?? undefined}
change={e => setStartDate(e.value)}
/>
{errors.startDate && (
<div style={{ color: 'red', fontSize: 12 }}>{errors.startDate}</div>
)}
{isPast && (
<span
style={{
color: 'orange',
fontWeight: 600,
marginLeft: 8,
display: 'inline-block',
background: '#fff3cd',
borderRadius: 4,
padding: '2px 8px',
border: '1px solid #ffeeba',
}}
>
Termin liegt in der Vergangenheit!
</span>
)}
</div>
<div style={{ display: 'flex', gap: 8, marginBottom: 12 }}>
<div style={{ flex: 1 }}>
<TimePickerComponent
placeholder="Startzeit"
floatLabelType="Auto"
value={startTime}
step={30}
change={e => setStartTime(e.value)}
/>
{errors.startTime && (
<div style={{ color: 'red', fontSize: 12 }}>{errors.startTime}</div>
)}
</div>
<div style={{ flex: 1 }}>
<TimePickerComponent
placeholder="Endzeit"
floatLabelType="Auto"
value={endTime}
step={30}
change={e => setEndTime(e.value)}
/>
{errors.endTime && (
<div style={{ color: 'red', fontSize: 12 }}>{errors.endTime}</div>
)}
</div>
</div>
</div>
<div style={{ flex: 1, minWidth: 260 }}>
{/* ...repeat, multi-select, repeat until, holidays... */}
<div style={{ marginBottom: 12 }}>
<CheckBoxComponent
label="Wiederholender Termin"
checked={repeat}
change={e => setRepeat(e.checked)}
/>
</div>
<div style={{ marginBottom: 12 }}>
<MultiSelectComponent
key={repeat ? 'enabled' : 'disabled'}
dataSource={weekdayOptions}
fields={{ text: 'label', value: 'value' }}
placeholder="Wochentage"
value={weekdays}
change={e => setWeekdays(e.value as number[])}
disabled={!repeat}
showDropDownIcon={true}
closePopupOnSelect={false}
/>
</div>
<div style={{ marginBottom: 12 }}>
<DatePickerComponent
key={repeat ? 'enabled' : 'disabled'}
placeholder="Wiederholung bis"
floatLabelType="Auto"
value={repeatUntil ?? undefined}
change={e => setRepeatUntil(e.value)}
disabled={!repeat}
/>
</div>
<div style={{ marginBottom: 12 }}>
<CheckBoxComponent
label="Ferientage berücksichtigen"
checked={skipHolidays}
change={e => setSkipHolidays(e.checked)}
disabled={!repeat}
/>
</div>
</div>
</div>
{/* NEW TWO-COLUMN SECTION for event type and extra fields */}
<div style={{ display: 'flex', gap: 24, flexWrap: 'wrap', marginTop: 8 }}>
<div style={{ flex: 1, minWidth: 260 }}>
<div style={{ marginBottom: 12, marginTop: 16 }}>
<DropDownListComponent
dataSource={typeOptions}
fields={{ text: 'label', value: 'value' }}
placeholder="Termintyp"
value={type}
change={e => setType(e.value as string)}
style={{ width: '100%' }}
/>
{errors.type && <div style={{ color: 'red', fontSize: 12 }}>{errors.type}</div>}
</div>
</div>
<div style={{ flex: 1, minWidth: 260 }}>
<div style={{ marginBottom: 12, minHeight: 60 }}>
{type === 'presentation' && (
<div>
<div style={{ marginBottom: 8, marginTop: 16 }}>
<button
className="e-btn"
onClick={() => setMediaModalOpen(true)}
style={{ width: '100%' }}
>
Medium auswählen/hochladen
</button>
</div>
<div style={{ marginBottom: 8 }}>
<b>Ausgewähltes Medium:</b>{' '}
{media ? (
media.path
) : (
<span style={{ color: '#888' }}>Kein Medium ausgewählt</span>
)}
</div>
<TextBoxComponent
placeholder="Slideshow-Intervall (Sekunden)"
floatLabelType="Auto"
type="number"
value={String(slideshowInterval)}
change={e => setSlideshowInterval(Number(e.value))}
/>
</div>
)}
{type === 'website' && (
<div>
<TextBoxComponent
placeholder="Webseiten-URL"
floatLabelType="Always"
value={websiteUrl}
change={e => setWebsiteUrl(e.value)}
/>
</div>
)}
</div>
</div>
</div>
</div>
{mediaModalOpen && (
<CustomSelectUploadEventModal
open={mediaModalOpen}
onClose={() => setMediaModalOpen(false)}
onSelect={({ id, path, name }) => {
setPendingMedia({ id, path, name });
setMediaModalOpen(false);
}}
selectedFileId={null}
/>
)}
</DialogComponent>
);
};
export default CustomEventModal;

View File

@@ -0,0 +1,58 @@
import React from 'react';
interface CustomMediaInfoPanelProps {
name: string;
size: number;
type: string;
dateModified: number;
description?: string | null;
}
const CustomMediaInfoPanel: React.FC<CustomMediaInfoPanelProps> = ({
name,
size,
type,
dateModified,
description,
}) => {
function formatLocalDate(timestamp: number | undefined | null) {
if (!timestamp || isNaN(timestamp)) return '-';
const date = new Date(timestamp * 1000);
return date.toLocaleString('de-DE');
}
return (
<div
style={{
padding: 16,
border: '1px solid #eee',
borderRadius: 8,
background: '#fafafa',
maxWidth: 400,
}}
>
<h3 style={{ marginBottom: 12 }}>Datei-Eigenschaften</h3>
<div>
<b>Name:</b> {name || '-'}
</div>
<div>
<b>Typ:</b> {type || '-'}
</div>
<div>
<b>Größe:</b> {typeof size === 'number' && !isNaN(size) ? size + ' Bytes' : '-'}
</div>
<div>
<b>Geändert:</b> {formatLocalDate(dateModified)}
</div>
<div>
<b>Beschreibung:</b>{' '}
{description && description !== 'null' ? (
description
) : (
<span style={{ color: '#888' }}>Keine Beschreibung</span>
)}
</div>
</div>
);
};
export default CustomMediaInfoPanel;

View File

@@ -0,0 +1,119 @@
import React, { useState } from 'react';
import { DialogComponent } from '@syncfusion/ej2-react-popups';
import {
FileManagerComponent,
Inject,
NavigationPane,
DetailsView,
Toolbar,
} from '@syncfusion/ej2-react-filemanager';
const hostUrl = '/api/eventmedia/filemanager/';
type CustomSelectUploadEventModalProps = {
open: boolean;
onClose: () => void;
onSelect: (file: { id: string; path: string; name: string }) => void; // name added
selectedFileId?: string | null;
};
const CustomSelectUploadEventModal: React.FC<CustomSelectUploadEventModalProps> = props => {
const { open, onClose, onSelect } = props;
const [selectedFile, setSelectedFile] = useState<{
id: string;
path: string;
name: string;
} | null>(null);
// Callback for file selection
interface FileSelectEventArgs {
fileDetails: {
name: string;
isFile: boolean;
size: number;
// more fields if needed
};
}
const handleFileSelect = async (args: FileSelectEventArgs) => {
if (args.fileDetails.isFile && args.fileDetails.size > 0) {
const filename = args.fileDetails.name;
try {
const response = await fetch(
`/api/eventmedia/find_by_filename?filename=${encodeURIComponent(filename)}`
);
if (response.ok) {
const data = await response.json();
setSelectedFile({ id: data.id, path: data.file_path, name: filename });
} else {
setSelectedFile({ id: filename, path: filename, name: filename });
}
} catch (e) {
console.error('Error fetching file details:', e);
}
}
};
// Button-Handler
const handleSelectClick = () => {
if (selectedFile) {
onSelect(selectedFile);
}
};
return (
<DialogComponent
target="#root"
visible={open}
width="700px"
header="Medium auswählen/hochladen"
showCloseIcon={true}
close={onClose}
isModal={true}
footerTemplate={() => (
<div className="flex gap-2 justify-end">
<button className="e-btn" onClick={onClose}>
Abbrechen
</button>
<button className="e-btn e-primary" disabled={!selectedFile} onClick={handleSelectClick}>
Auswählen
</button>
</div>
)}
>
<FileManagerComponent
ajaxSettings={{
url: hostUrl + 'operations',
getImageUrl: hostUrl + 'get-image',
uploadUrl: hostUrl + 'upload',
downloadUrl: hostUrl + 'download',
}}
toolbarSettings={{
items: [
'NewFolder',
'Upload',
'Download',
'Rename',
'Delete',
'SortBy',
'Refresh',
'Details',
],
}}
contextMenuSettings={{
file: ['Open', '|', 'Download', '|', 'Rename', 'Delete', '|', 'Details'],
folder: ['Open', '|', 'Rename', 'Delete', '|', 'Details'],
layout: ['SortBy', 'Refresh', '|', 'View', 'Details'],
}}
allowMultiSelection={false}
fileSelect={handleFileSelect}
>
<Inject services={[NavigationPane, DetailsView, Toolbar]} />
</FileManagerComponent>
</DialogComponent>
);
};
export default CustomSelectUploadEventModal;

View File

@@ -0,0 +1,19 @@
import React from 'react';
import { useNavigate } from 'react-router-dom';
import { Wrench } from 'lucide-react';
const SetupModeButton: React.FC = () => {
const navigate = useNavigate();
return (
<button
className="setupmode-btn flex items-center gap-2 px-4 py-2 bg-yellow-200 hover:bg-yellow-300 rounded"
onClick={() => navigate('/setup')}
title="Erweiterungsmodus starten"
>
<Wrench size={18} />
Erweiterungsmodus
</button>
);
};
export default SetupModeButton;

View File

@@ -0,0 +1,24 @@
import React, { createContext, useRef, useContext } from 'react';
import { ToastComponent, type ToastModel } from '@syncfusion/ej2-react-notifications';
const ToastContext = createContext<{ show: (opts: ToastModel) => void }>({ show: () => {} });
export const useToast = () => useContext(ToastContext);
export const ToastProvider: React.FC<{ children: React.ReactNode }> = ({ children }) => {
const toastRef = useRef<ToastComponent>(null);
const show = (opts: ToastModel) => toastRef.current?.show(opts);
return (
<ToastContext.Provider value={{ show }}>
{children}
<ToastComponent
ref={toastRef}
position={{ X: 'Right', Y: 'Top' }}
timeOut={5000} // default: 5 seconds
showCloseButton={false}
/>
</ToastContext.Provider>
);
};

204
dashboard/src/dashboard.tsx Normal file
View File

@@ -0,0 +1,204 @@
import React, { useEffect, useState, useRef } from 'react';
import { fetchGroupsWithClients, restartClient } from './apiClients';
import type { Group, Client } from './apiClients';
import {
GridComponent,
ColumnsDirective,
ColumnDirective,
Page,
DetailRow,
Inject,
Sort,
} from '@syncfusion/ej2-react-grids';
const REFRESH_INTERVAL = 15000; // 15 seconds
// Type for the collapse event
// type DetailRowCollapseArgs = {
// data?: { id?: string | number };
// };
// Type for the dataBound event
type DetailRowDataBoundArgs = {
data?: { id?: string | number };
};
const Dashboard: React.FC = () => {
const [groups, setGroups] = useState<Group[]>([]);
const [expandedGroupIds, setExpandedGroupIds] = useState<string[]>([]);
const gridRef = useRef<GridComponent | null>(null);
// Handler for closing a group (collapse)
// const onDetailCollapse = (args: DetailRowCollapseArgs) => {
// if (args && args.data && args.data.id) {
// const groupId = String(args.data.id);
// setExpandedGroupIds(prev => prev.filter(id => String(id) !== groupId));
// }
// };
// // Register the event on the grid after mount
// useEffect(() => {
// if (gridRef.current) {
// gridRef.current.detailCollapse = onDetailCollapse;
// }
// }, []);
// Optimized update: the grid is only refreshed when the data actually changed
useEffect(() => {
let lastGroups: Group[] = [];
const fetchAndUpdate = async () => {
const newGroups = await fetchGroupsWithClients();
// Compare only the relevant fields (id, clients, is_alive)
const changed =
lastGroups.length !== newGroups.length ||
lastGroups.some((g, i) => {
const ng = newGroups[i];
if (!ng || g.id !== ng.id || g.clients.length !== ng.clients.length) return true;
// Optional: compare more deeply, e.g. alive status
for (let j = 0; j < g.clients.length; j++) {
if (
g.clients[j].uuid !== ng.clients[j].uuid ||
g.clients[j].is_alive !== ng.clients[j].is_alive
) {
return true;
}
}
return false;
});
if (changed) {
setGroups(newGroups);
lastGroups = newGroups;
setTimeout(() => {
expandedGroupIds.forEach(id => {
const rowIndex = newGroups.findIndex(g => String(g.id) === String(id));
if (rowIndex !== -1 && gridRef.current) {
gridRef.current.detailRowModule.expand(rowIndex);
}
});
}, 100);
}
};
fetchAndUpdate();
const interval = setInterval(fetchAndUpdate, REFRESH_INTERVAL);
return () => clearInterval(interval);
}, [expandedGroupIds]);
// Health-Badge
const getHealthBadge = (group: Group) => {
const total = group.clients.length;
const alive = group.clients.filter((c: Client) => c.is_alive).length;
const ratio = total === 0 ? 0 : alive / total;
let color = 'danger';
let text = `${alive} / ${total} offline`;
if (ratio === 1) {
color = 'success';
text = `${alive} / ${total} alive`;
} else if (ratio >= 0.5) {
color = 'warning';
text = `${alive} / ${total} teilw. alive`;
}
return <span className={`e-badge e-badge-${color}`}>{text}</span>;
};
// Simple table for a group's clients
const getClientTable = (group: Group) => (
<div style={{ maxHeight: 300, overflowY: 'auto', marginBottom: '5px' }}>
<GridComponent dataSource={group.clients} allowSorting={true} height={'auto'}>
<ColumnsDirective>
<ColumnDirective field="description" headerText="Beschreibung" width="150" />
<ColumnDirective field="ip" headerText="IP" width="120" />
{/* <ColumnDirective
field="last_alive"
headerText="Letztes Lebenszeichen"
width="180"
template={(props: { last_alive: string | null }) => {
if (!props.last_alive) return '-';
const dateStr = props.last_alive.endsWith('Z')
? props.last_alive
: props.last_alive + 'Z';
const date = new Date(dateStr);
return isNaN(date.getTime()) ? props.last_alive : date.toLocaleString();
}}
/> */}
<ColumnDirective
field="is_alive"
headerText="Alive"
width="100"
template={(props: { is_alive: boolean }) => (
<span className={`e-badge e-badge-${props.is_alive ? 'success' : 'danger'}`}>
{props.is_alive ? 'alive' : 'offline'}
</span>
)}
sortComparer={(a, b) => (a === b ? 0 : a ? -1 : 1)}
/>
<ColumnDirective
headerText="Aktionen"
width="150"
template={(props: { uuid: string }) => (
<button className="e-btn e-primary" onClick={() => handleRestartClient(props.uuid)}>
Neustart
</button>
)}
/>
</ColumnsDirective>
<Inject services={[Sort]} />
</GridComponent>
</div>
);
// Restart logic
const handleRestartClient = async (uuid: string) => {
try {
const result = await restartClient(uuid);
alert(`Neustart erfolgreich: ${result.message}`);
} catch (error: unknown) {
if (error && typeof error === 'object' && 'message' in error) {
alert(`Fehler beim Neustart: ${(error as { message: string }).message}`);
} else {
alert('Unbekannter Fehler beim Neustart');
}
}
};
// The Syncfusion grid passes the row/group in the event
const onDetailDataBound = (args: DetailRowDataBoundArgs) => {
if (args && args.data && args.data.id) {
const groupId = String(args.data.id);
setExpandedGroupIds(prev => (prev.includes(groupId) ? prev : [...prev, groupId]));
}
};
return (
<div>
<header className="mb-8 pb-4 border-b-2 border-[#d6c3a6]">
<h2 className="text-3xl font-extrabold mb-2">Dashboard</h2>
</header>
<h3 className="text-lg font-semibold mt-6 mb-4">Raumgruppen Übersicht</h3>
<GridComponent
dataSource={groups}
allowPaging={true}
pageSettings={{ pageSize: 5 }}
height={400}
detailTemplate={(props: Group) => getClientTable(props)}
detailDataBound={onDetailDataBound}
ref={gridRef}
>
<Inject services={[Page, DetailRow]} />
<ColumnsDirective>
<ColumnDirective field="name" headerText="Raumgruppe" width="180" />
<ColumnDirective
headerText="Health"
width="160"
template={(props: Group) => getHealthBadge(props)}
/>
</ColumnsDirective>
</GridComponent>
{groups.length === 0 && (
<div className="col-span-full text-center text-gray-400">Keine Gruppen gefunden.</div>
)}
</div>
);
};
export default Dashboard;
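The change detection inside `fetchAndUpdate` can be read as a pure comparison of two group snapshots. A self-contained sketch, with the types reduced to the fields actually compared (the names here are illustrative, not from the codebase):

```typescript
type ClientSnapshot = { uuid: string; is_alive: boolean };
type GroupSnapshot = { id: string | number; clients: ClientSnapshot[] };

// Pure version of the comparison in fetchAndUpdate: two snapshots are
// considered unchanged only when group ids, client lists and alive
// flags all match pairwise, in order.
function groupsChanged(prev: GroupSnapshot[], next: GroupSnapshot[]): boolean {
  if (prev.length !== next.length) return true;
  return prev.some((g, i) => {
    const ng = next[i];
    if (!ng || g.id !== ng.id || g.clients.length !== ng.clients.length) return true;
    return g.clients.some(
      (c, j) => c.uuid !== ng.clients[j].uuid || c.is_alive !== ng.clients[j].is_alive
    );
  });
}
```

Extracting the comparison this way would also make the polling logic unit-testable without mounting the grid.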

View File

@@ -0,0 +1,87 @@
import React from 'react';
import { listHolidays, uploadHolidaysCsv, type Holiday } from './apiHolidays';
const Einstellungen: React.FC = () => {
const [file, setFile] = React.useState<File | null>(null);
const [busy, setBusy] = React.useState(false);
const [message, setMessage] = React.useState<string | null>(null);
const [holidays, setHolidays] = React.useState<Holiday[]>([]);
const refresh = React.useCallback(async () => {
try {
const data = await listHolidays();
setHolidays(data.holidays);
} catch (e) {
const msg = e instanceof Error ? e.message : 'Fehler beim Laden der Ferien';
setMessage(msg);
}
}, []);
React.useEffect(() => {
refresh();
}, [refresh]);
const onUpload = async () => {
if (!file) return;
setBusy(true);
setMessage(null);
try {
const res = await uploadHolidaysCsv(file);
setMessage(`Import erfolgreich: ${res.inserted} neu, ${res.updated} aktualisiert.`);
await refresh();
} catch (e) {
const msg = e instanceof Error ? e.message : 'Fehler beim Import.';
setMessage(msg);
} finally {
setBusy(false);
}
};
return (
<div>
<h2 className="text-xl font-bold mb-4">Einstellungen</h2>
<div className="space-y-4">
<section className="p-4 border rounded-md">
<h3 className="font-semibold mb-2">Schulferien importieren</h3>
<p className="text-sm text-gray-600 mb-2">
Unterstützte Formate:
<br /> CSV mit Kopfzeile: <code>name</code>, <code>start_date</code>,{' '}
<code>end_date</code>, optional <code>region</code>
<br /> TXT/CSV ohne Kopfzeile mit Spalten: interner Name, <strong>Name</strong>,{' '}
<strong>Start (YYYYMMDD)</strong>, <strong>Ende (YYYYMMDD)</strong>, optional interne
Info (ignoriert)
</p>
<div className="flex items-center gap-3">
<input
type="file"
accept=".csv,text/csv,.txt,text/plain"
onChange={e => setFile(e.target.files?.[0] ?? null)}
/>
<button className="e-btn e-primary" onClick={onUpload} disabled={!file || busy}>
{busy ? 'Importiere…' : 'CSV/TXT importieren'}
</button>
</div>
{message && <div className="mt-2 text-sm">{message}</div>}
</section>
<section className="p-4 border rounded-md">
<h3 className="font-semibold mb-2">Importierte Ferien</h3>
{holidays.length === 0 ? (
<div className="text-sm text-gray-600">Keine Einträge vorhanden.</div>
) : (
<ul className="text-sm list-disc pl-6">
{holidays.slice(0, 20).map(h => (
<li key={h.id}>
{h.name}: {h.start_date} – {h.end_date}
{h.region ? ` (${h.region})` : ''}
</li>
))}
</ul>
)}
</section>
</div>
</div>
);
};
export default Einstellungen;
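The headerless import format described above encodes dates as `YYYYMMDD`. A minimal sketch of that date convention (the actual parsing happens server-side via `uploadHolidaysCsv` and may differ; this only illustrates the format the UI documents):

```typescript
// Sketch: convert a YYYYMMDD date column from the headerless holiday
// format into an ISO date string (YYYY-MM-DD).
export function yyyymmddToIso(value: string): string {
  if (!/^\d{8}$/.test(value)) throw new Error(`not a YYYYMMDD date: ${value}`);
  return `${value.slice(0, 4)}-${value.slice(4, 6)}-${value.slice(6, 8)}`;
}
```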

View File

@@ -0,0 +1,52 @@
// 20 easily distinguishable colors for groups
export const groupColorPalette: string[] = [
'#1E90FF', // blue
'#28A745', // green
'#FFC107', // yellow
'#DC3545', // red
'#6F42C1', // purple
'#20C997', // teal
'#FD7E14', // orange
'#6610F2', // violet
'#17A2B8', // cyan
'#E83E8C', // pink
'#FF5733', // coral
'#2ECC40', // light green
'#FFB300', // dark yellow
'#00796B', // petrol
'#C70039', // dark red
'#8D6E63', // brown
'#607D8B', // gray-blue
'#00B8D4', // turquoise blue
'#FF6F00', // dark orange
'#9C27B0', // dark purple
];
// Gibt für eine Gruppen-ID immer dieselbe Farbe zurück (Index basiert auf Gruppenliste)
export function getGroupColor(groupId: string, groups: { id: string }[]): string {
  // Die exportierte groupColorPalette wiederverwenden statt sie hier zu duplizieren
  const idx = groups.findIndex(g => g.id === groupId);
  // findIndex liefert -1 für unbekannte IDs; dann auf die erste Farbe zurückfallen
  const safeIdx = idx < 0 ? 0 : idx;
  return groupColorPalette[safeIdx % groupColorPalette.length];
}

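The `getGroupColor` helper above can be exercised with a small self-contained sketch (shortened palette; the fallback for unknown IDs is a defensive assumption added here, since `findIndex` returns -1 for IDs that are not in the list):

```typescript
// Self-contained sketch of getGroupColor's contract, with a shortened palette.
const palette = ['#1E90FF', '#28A745', '#FFC107'];

function colorFor(groupId: string, groups: { id: string }[]): string {
  const idx = groups.findIndex(g => g.id === groupId);
  // Defensive fallback: findIndex returns -1 for unknown IDs, which would index past the array.
  if (idx < 0) return palette[0];
  return palette[idx % palette.length];
}

const groups = [{ id: 'a' }, { id: 'b' }, { id: 'c' }, { id: 'd' }];
console.log(colorFor('b', groups)); // '#28A745'
console.log(colorFor('d', groups)); // '#1E90FF' (index 3 wraps: 3 % 3 === 0)
```

Because the index is based on the group list's order, inserting or removing a group reassigns the colors of all groups after it.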

@@ -0,0 +1,36 @@
import { useState } from 'react';
import { deleteClient } from '../apiClients';
export function useClientDelete(onDeleted?: (uuid: string) => void) {
const [showDialog, setShowDialog] = useState(false);
const [deleteClientId, setDeleteClientId] = useState<string | null>(null);
// Details-Modal separat im Parent verwalten!
const handleDelete = (uuid: string) => {
setDeleteClientId(uuid);
setShowDialog(true);
};
const confirmDelete = async () => {
if (deleteClientId) {
await deleteClient(deleteClientId);
setShowDialog(false);
if (onDeleted) onDeleted(deleteClientId);
setDeleteClientId(null);
}
};
const cancelDelete = () => {
setShowDialog(false);
setDeleteClientId(null);
};
return {
showDialog,
deleteClientId,
handleDelete,
confirmDelete,
cancelDelete,
};
}

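The confirm/cancel flow of `useClientDelete` can be sketched framework-free; the API call is injected so the transitions can be followed without React or a backend. The names mirror the hook, but this class is illustrative only:

```typescript
// Framework-free sketch of useClientDelete's state transitions.
type DeleteApi = (uuid: string) => Promise<void>;

class ClientDeleteFlow {
  showDialog = false;
  deleteClientId: string | null = null;
  private api: DeleteApi;
  private onDeleted?: (uuid: string) => void;

  constructor(api: DeleteApi, onDeleted?: (uuid: string) => void) {
    this.api = api;
    this.onDeleted = onDeleted;
  }

  handleDelete(uuid: string): void {
    this.deleteClientId = uuid;
    this.showDialog = true;
  }

  async confirmDelete(): Promise<void> {
    if (!this.deleteClientId) return; // nothing selected
    await this.api(this.deleteClientId);
    this.showDialog = false;
    this.onDeleted?.(this.deleteClientId);
    this.deleteClientId = null;
  }

  cancelDelete(): void {
    this.showDialog = false;
    this.deleteClientId = null;
  }
}

// Example: cancel leaves the client untouched, confirm reports it as deleted.
const deleted: string[] = [];
const flow = new ClientDeleteFlow(async () => {}, uuid => deleted.push(uuid));
flow.handleDelete('abc-123');
flow.cancelDelete();
flow.handleDelete('abc-123');
flow.confirmDelete().then(() => console.log(deleted)); // ['abc-123']
```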
76
dashboard/src/index.css Normal file

@@ -0,0 +1,76 @@
/* @tailwind base;
@tailwind components; */
/* @tailwind utilities; */
/* :root {
font-family: system-ui, Avenir, Helvetica, Arial, sans-serif;
line-height: 1.5;
font-weight: 400;
color-scheme: light dark;
color: rgb(255 255 255 / 87%);
background-color: #242424;
font-synthesis: none;
text-rendering: optimizelegibility;
-webkit-font-smoothing: antialiased;
-moz-osx-font-smoothing: grayscale;
}
a {
font-weight: 500;
color: #646cff;
text-decoration: inherit;
}
a:hover {
color: #535bf2;
}
body {
margin: 0;
display: flex;
place-items: center;
min-width: 320px;
min-height: 100vh;
}
h1 {
font-size: 3.2em;
line-height: 1.1;
}
/* button {
border-radius: 8px;
border: 1px solid transparent;
padding: 0.6em 1.2em;
font-size: 1em;
font-weight: 500;
font-family: inherit;
background-color: #1a1a1a;
cursor: pointer;
transition: border-color 0.25s;
}
button:hover {
border-color: #646cff;
}
button:focus,
button:focus-visible {
outline: 4px auto -webkit-focus-ring-color;
} */
/* @media (prefers-color-scheme: light) {
:root {
color: #213547;
background-color: #fff;
}
a:hover {
color: #747bff;
}
button {
background-color: #f9f9f9;
}
} */


@@ -0,0 +1,519 @@
import React, { useEffect, useState, useRef } from 'react';
import { KanbanComponent } from '@syncfusion/ej2-react-kanban';
import { fetchClients, updateClientGroup } from './apiClients';
import { fetchGroups, createGroup, deleteGroup, renameGroup } from './apiGroups';
import type { Client } from './apiClients';
import type { KanbanComponent as KanbanComponentType } from '@syncfusion/ej2-react-kanban';
import { DialogComponent } from '@syncfusion/ej2-react-popups';
import { useToast } from './components/ToastProvider';
import { L10n } from '@syncfusion/ej2-base';
interface KanbanClient extends Client {
Id: string;
Status: string; // Raumgruppe (Gruppenname)
Summary: string; // Anzeigename
}
interface Group {
id: number;
name: string;
// weitere Felder möglich
}
interface KanbanDragEventArgs {
element: HTMLElement | HTMLElement[];
data: KanbanClient | KanbanClient[];
event?: { event?: MouseEvent };
[key: string]: unknown;
}
interface KanbanComponentWithClear extends KanbanComponentType {
clearSelection: () => void;
}
const de = {
title: 'Gruppen',
newGroup: 'Neue Raumgruppe',
renameGroup: 'Gruppe umbenennen',
deleteGroup: 'Gruppe löschen',
add: 'Hinzufügen',
cancel: 'Abbrechen',
rename: 'Umbenennen',
confirmDelete: 'Löschbestätigung',
reallyDelete: (name: string) => `Möchten Sie die Gruppe <b>${name}</b> wirklich löschen?`,
clientsMoved: 'Alle Clients werden in "Nicht zugeordnet" verschoben.',
groupCreated: 'Gruppe angelegt',
groupDeleted: 'Gruppe gelöscht. Clients in "Nicht zugeordnet" verschoben',
groupRenamed: 'Gruppenname geändert',
selectGroup: 'Gruppe wählen',
newName: 'Neuer Name',
warning: 'Achtung:',
yesDelete: 'Ja, löschen',
};
L10n.load({
de: {
kanban: {
items: 'Clients',
addTitle: 'Neue Karte hinzufügen',
editTitle: 'Karte bearbeiten',
deleteTitle: 'Karte löschen',
edit: 'Bearbeiten',
delete: 'Löschen',
save: 'Speichern',
cancel: 'Abbrechen',
yes: 'Ja',
no: 'Nein',
noCard: 'Keine Clients vorhanden',
},
},
});
const Infoscreen_groups: React.FC = () => {
const toast = useToast();
const [clients, setClients] = useState<KanbanClient[]>([]);
const [groups, setGroups] = useState<{ keyField: string; headerText: string; id?: number }[]>([]);
const [showDialog, setShowDialog] = useState(false);
const [newGroupName, setNewGroupName] = useState('');
const [draggedCard, setDraggedCard] = useState<{ id: string; fromColumn: string } | null>(null);
const [renameDialog, setRenameDialog] = useState<{
open: boolean;
oldName: string;
newName: string;
}>({ open: false, oldName: '', newName: '' });
const [deleteDialog, setDeleteDialog] = useState<{ open: boolean; groupName: string }>({
open: false,
groupName: '',
});
const [showDeleteConfirm, setShowDeleteConfirm] = useState(false);
const kanbanRef = useRef<KanbanComponentType | null>(null); // Ref für Kanban
// Lade Gruppen und Clients
useEffect(() => {
let groupMap: Record<number, string> = {};
fetchGroups().then((groupData: Group[]) => {
const kanbanGroups = groupData.map(g => ({
keyField: g.name,
headerText: g.name,
id: g.id,
}));
setGroups(kanbanGroups);
groupMap = Object.fromEntries(groupData.map(g => [g.id, g.name]));
fetchClients().then(data => {
setClients(
data.map((c, i) => ({
...c,
Id: c.uuid,
Status:
c.group_id === 1
? 'Nicht zugeordnet'
: typeof c.group_id === 'number' && groupMap[c.group_id]
? groupMap[c.group_id]
: 'Nicht zugeordnet',
Summary: c.description || `Client ${i + 1}`,
}))
);
});
});
}, []);
// Neue Gruppe anlegen (persistiert per API)
const handleAddGroup = async () => {
if (!newGroupName.trim()) return;
try {
const newGroup = await createGroup(newGroupName);
toast.show({
content: de.groupCreated,
cssClass: 'e-toast-success',
timeOut: 5000,
showCloseButton: false,
});
setGroups([
...groups,
{ keyField: newGroup.name, headerText: newGroup.name, id: newGroup.id },
]);
setNewGroupName('');
setShowDialog(false);
} catch (err) {
toast.show({
content: (err as Error).message,
cssClass: 'e-toast-danger',
timeOut: 0,
showCloseButton: true,
});
}
};
// Löschen einer Gruppe
const handleDeleteGroup = async (groupName: string) => {
try {
// Clients der Gruppe in "Nicht zugeordnet" verschieben
const groupClients = clients.filter(c => c.Status === groupName);
if (groupClients.length > 0) {
// Ermittle die ID der Zielgruppe "Nicht zugeordnet"
const target = groups.find(g => g.headerText === 'Nicht zugeordnet');
if (!target || !target.id) throw new Error('Zielgruppe "Nicht zugeordnet" nicht gefunden');
await updateClientGroup(
groupClients.map(c => c.Id),
target.id
);
}
await deleteGroup(groupName);
toast.show({
content: de.groupDeleted,
cssClass: 'e-toast-success',
timeOut: 5000,
showCloseButton: false,
});
// Gruppen und Clients neu laden
const groupData = await fetchGroups();
const groupMap = Object.fromEntries(groupData.map((g: Group) => [g.id, g.name]));
setGroups(groupData.map((g: Group) => ({ keyField: g.name, headerText: g.name, id: g.id })));
const data = await fetchClients();
setClients(
data.map((c, i) => ({
...c,
Id: c.uuid,
Status:
typeof c.group_id === 'number' && groupMap[c.group_id]
? groupMap[c.group_id]
: 'Nicht zugeordnet',
Summary: c.description || `Client ${i + 1}`,
}))
);
} catch (err) {
toast.show({
content: (err as Error).message,
cssClass: 'e-toast-danger',
timeOut: 0,
showCloseButton: true,
});
}
setDeleteDialog({ open: false, groupName: '' });
};
// Umbenennen einer Gruppe
const handleRenameGroup = async () => {
try {
await renameGroup(renameDialog.oldName, renameDialog.newName);
toast.show({
content: de.groupRenamed,
cssClass: 'e-toast-success',
timeOut: 5000,
showCloseButton: false,
});
// Gruppen und Clients neu laden
const groupData = await fetchGroups();
const groupMap = Object.fromEntries(groupData.map((g: Group) => [g.id, g.name]));
setGroups(groupData.map((g: Group) => ({ keyField: g.name, headerText: g.name, id: g.id })));
const data = await fetchClients();
setClients(
data.map((c, i) => ({
...c,
Id: c.uuid,
Status:
typeof c.group_id === 'number' && groupMap[c.group_id]
? groupMap[c.group_id]
: 'Nicht zugeordnet',
Summary: c.description || `Client ${i + 1}`,
}))
);
} catch (err) {
toast.show({
content: (err as Error).message,
cssClass: 'e-toast-danger',
timeOut: 0,
showCloseButton: true,
});
}
setRenameDialog({ open: false, oldName: '', newName: '' });
};
const handleDragStart = (args: KanbanDragEventArgs) => {
const element = Array.isArray(args.element) ? args.element[0] : args.element;
const cardId = element.getAttribute('data-id');
const fromColumn = element.getAttribute('data-key');
setDraggedCard({ id: cardId || '', fromColumn: fromColumn || '' });
};
const handleCardDrop = async (args: KanbanDragEventArgs) => {
if (!draggedCard) return;
const mouseEvent = args.event?.event;
let targetGroupName = '';
if (mouseEvent && mouseEvent.clientX && mouseEvent.clientY) {
const targetElement = document.elementFromPoint(mouseEvent.clientX, mouseEvent.clientY);
const kanbanColumn =
targetElement?.closest('[data-key]') || targetElement?.closest('.e-content-row');
if (kanbanColumn) {
const columnKey = kanbanColumn.getAttribute('data-key');
if (columnKey) {
targetGroupName = columnKey;
} else {
const headerElement = kanbanColumn.querySelector('.e-header-text');
targetGroupName = headerElement?.textContent?.trim() || '';
}
}
}
// Fallback
if (!targetGroupName) {
const targetElement = Array.isArray(args.element) ? args.element[0] : args.element;
const cardWrapper = targetElement.closest('.e-card-wrapper');
const contentRow = cardWrapper?.closest('.e-content-row');
const headerText = contentRow?.querySelector('.e-header-text');
targetGroupName = headerText?.textContent?.trim() || '';
}
if (!targetGroupName || targetGroupName === draggedCard.fromColumn) {
setDraggedCard(null);
return;
}
const dropped = Array.isArray(args.data) ? args.data : [args.data];
const clientIds = dropped.map((card: KanbanClient) => card.Id);
try {
// Ermittle Zielgruppen-ID anhand des Namens
const target = groups.find(g => g.headerText === targetGroupName);
if (!target || !target.id) throw new Error('Zielgruppe nicht gefunden');
await updateClientGroup(clientIds, target.id);
fetchGroups().then((groupData: Group[]) => {
const groupMap = Object.fromEntries(groupData.map(g => [g.id, g.name]));
setGroups(
groupData.map(g => ({
keyField: g.name,
headerText: g.name,
id: g.id,
}))
);
fetchClients().then(data => {
setClients(
data.map((c, i) => ({
...c,
Id: c.uuid,
Status:
typeof c.group_id === 'number' && groupMap[c.group_id]
? groupMap[c.group_id]
: 'Nicht zugeordnet',
Summary: c.description || `Client ${i + 1}`,
}))
);
// Nach dem Laden: Karten deselektieren
setTimeout(() => {
(kanbanRef.current as KanbanComponentWithClear)?.clearSelection();
setTimeout(() => {
(kanbanRef.current as KanbanComponentWithClear)?.clearSelection();
}, 100);
}, 50);
});
});
} catch {
alert('Fehler beim Aktualisieren der Clients');
}
setDraggedCard(null);
};
// Spalten-Array ohne Header-Buttons/Template
const kanbanColumns = groups.map(group => ({
keyField: group.keyField,
headerText: group.headerText,
}));
return (
<div id="dialog-target">
<h2 className="text-xl font-bold mb-4">{de.title}</h2>
<div className="flex gap-2 mb-4">
<button
className="px-4 py-2 bg-blue-500 text-white rounded"
onClick={() => setShowDialog(true)}
>
{de.newGroup}
</button>
<button
className="px-4 py-2 bg-yellow-500 text-white rounded"
onClick={() => setRenameDialog({ open: true, oldName: '', newName: '' })}
>
{de.renameGroup}
</button>
<button
className="px-4 py-2 bg-red-500 text-white rounded"
onClick={() => setDeleteDialog({ open: true, groupName: '' })}
>
{de.deleteGroup}
</button>
</div>
<KanbanComponent
locale="de"
id="kanban"
keyField="Status"
dataSource={clients}
cardSettings={{
headerField: 'Summary',
selectionType: 'Multiple',
}}
allowDragAndDrop={true}
dragStart={handleDragStart}
dragStop={handleCardDrop}
ref={kanbanRef}
columns={kanbanColumns}
/>
{showDialog && (
<div className="fixed inset-0 bg-black bg-opacity-30 flex items-center justify-center">
<div className="bg-white p-6 rounded shadow">
<h3 className="mb-2 font-bold">{de.newGroup}</h3>
<input
className="border p-2 mb-2 w-full"
value={newGroupName}
onChange={e => setNewGroupName(e.target.value)}
placeholder="Raumname"
/>
<div className="flex gap-2">
<button className="bg-blue-500 text-white px-4 py-2 rounded" onClick={handleAddGroup}>
{de.add}
</button>
<button
className="bg-gray-300 px-4 py-2 rounded"
onClick={() => setShowDialog(false)}
>
{de.cancel}
</button>
</div>
</div>
</div>
)}
{renameDialog.open && (
<div className="fixed inset-0 bg-black bg-opacity-30 flex items-center justify-center">
<div className="bg-white p-6 rounded shadow">
<h3 className="mb-2 font-bold">{de.renameGroup}</h3>
<select
className="border p-2 mb-2 w-full"
value={renameDialog.oldName}
onChange={e =>
setRenameDialog({
...renameDialog,
oldName: e.target.value,
newName: e.target.value,
})
}
>
<option value="">{de.selectGroup}</option>
{groups
.filter(g => g.headerText !== 'Nicht zugeordnet')
.map(g => (
<option key={g.keyField} value={g.headerText}>
{g.headerText}
</option>
))}
</select>
<input
className="border p-2 mb-2 w-full"
value={renameDialog.newName}
onChange={e => setRenameDialog({ ...renameDialog, newName: e.target.value })}
placeholder={de.newName}
/>
<div className="flex gap-2">
<button
className="bg-blue-500 text-white px-4 py-2 rounded"
onClick={handleRenameGroup}
disabled={!renameDialog.oldName || !renameDialog.newName}
>
{de.rename}
</button>
<button
className="bg-gray-300 px-4 py-2 rounded"
onClick={() => setRenameDialog({ open: false, oldName: '', newName: '' })}
>
{de.cancel}
</button>
</div>
</div>
</div>
)}
{deleteDialog.open && (
<div className="fixed inset-0 bg-black bg-opacity-30 flex items-center justify-center">
<div className="bg-white p-6 rounded shadow">
<h3 className="mb-2 font-bold">{de.deleteGroup}</h3>
<select
className="border p-2 mb-2 w-full"
value={deleteDialog.groupName}
onChange={e => setDeleteDialog({ ...deleteDialog, groupName: e.target.value })}
>
<option value="">{de.selectGroup}</option>
{groups
.filter(g => g.headerText !== 'Nicht zugeordnet')
.map(g => (
<option key={g.keyField} value={g.headerText}>
{g.headerText}
</option>
))}
</select>
<p>{de.clientsMoved}</p>
{deleteDialog.groupName && (
<div className="bg-yellow-100 text-yellow-800 p-2 rounded mb-2 text-sm">
<strong>{de.warning}</strong> Möchten Sie die Gruppe <b>{deleteDialog.groupName}</b>{' '}
wirklich löschen?
</div>
)}
<div className="flex gap-2 mt-2">
<button
className="bg-red-500 text-white px-4 py-2 rounded"
onClick={() => setShowDeleteConfirm(true)}
disabled={!deleteDialog.groupName}
>
{de.deleteGroup}
</button>
<button
className="bg-gray-300 px-4 py-2 rounded"
onClick={() => setDeleteDialog({ open: false, groupName: '' })}
>
{de.cancel}
</button>
</div>
</div>
{showDeleteConfirm && deleteDialog.groupName && (
<DialogComponent
width="350px"
header={de.confirmDelete}
visible={showDeleteConfirm}
close={() => setShowDeleteConfirm(false)}
footerTemplate={() => (
<div className="flex gap-2 justify-end">
<button
className="bg-red-500 text-white px-4 py-2 rounded"
onClick={() => {
handleDeleteGroup(deleteDialog.groupName);
setShowDeleteConfirm(false);
}}
>
{de.yesDelete}
</button>
<button
className="bg-gray-300 px-4 py-2 rounded"
onClick={() => {
setShowDeleteConfirm(false);
setDeleteDialog({ open: false, groupName: '' });
}}
>
{de.cancel}
</button>
</div>
)}
>
<div>
Möchten Sie die Gruppe <b>{deleteDialog.groupName}</b> wirklich löschen?
<br />
<span className="text-sm text-gray-500">{de.clientsMoved}</span>
</div>
</DialogComponent>
)}
</div>
)}
</div>
);
};
export default Infoscreen_groups;

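The `group_id` → Kanban column mapping that the component above repeats in three places (initial load, group delete, group rename) can be factored into one pure helper. `toKanbanCards` is a hypothetical name, not part of the actual file:

```typescript
// Pure sketch of the repeated client -> Kanban card mapping.
interface ClientLike { uuid: string; group_id?: number; description?: string; }
interface CardLike { Id: string; Status: string; Summary: string; }

function toKanbanCards(clients: ClientLike[], groupMap: Record<number, string>): CardLike[] {
  return clients.map((c, i) => ({
    ...c,
    Id: c.uuid,
    Status:
      typeof c.group_id === 'number' && groupMap[c.group_id]
        ? groupMap[c.group_id]
        : 'Nicht zugeordnet',
    Summary: c.description || `Client ${i + 1}`,
  }));
}

const cards = toKanbanCards(
  [
    { uuid: 'u1', group_id: 2, description: 'Aula' },
    { uuid: 'u2', group_id: 99 }, // unknown group -> fallback column
  ],
  { 1: 'Nicht zugeordnet', 2: 'EG' }
);
console.log(cards[0].Status); // 'EG'
console.log(cards[1].Status); // 'Nicht zugeordnet'
console.log(cards[1].Summary); // 'Client 2'
```

With such a helper, the three reload paths in the component would only differ in how they obtain `groupMap`.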
12
dashboard/src/logout.tsx Normal file

@@ -0,0 +1,12 @@
import React from 'react';
const Logout: React.FC = () => (
<div className="flex items-center justify-center h-screen">
<div className="text-center">
<h2 className="text-2xl font-bold mb-4">Abmeldung</h2>
<p>Sie haben sich erfolgreich abgemeldet.</p>
</div>
</div>
);
export default Logout;

19
dashboard/src/main.tsx Normal file

@@ -0,0 +1,19 @@
import { StrictMode } from 'react';
import { createRoot } from 'react-dom/client';
import './index.css';
import App from './App.tsx';
import { registerLicense } from '@syncfusion/ej2-base';
import '@syncfusion/ej2-base/styles/material3.css';
import '@syncfusion/ej2-navigations/styles/material3.css';
import '@syncfusion/ej2-buttons/styles/material3.css';
// Setze hier deinen Lizenzschlüssel ein
registerLicense(
'ORg4AjUWIQA/Gnt3VVhhQlJDfV5AQmBIYVp/TGpJfl96cVxMZVVBJAtUQF1hTH5VdENiXX1dcHxUQWNVWkd2'
);
createRoot(document.getElementById('root')!).render(
<StrictMode>
<App />
</StrictMode>
);

117
dashboard/src/media.tsx Normal file

@@ -0,0 +1,117 @@
import React, { useState, useRef } from 'react';
import CustomMediaInfoPanel from './components/CustomMediaInfoPanel';
import {
FileManagerComponent,
Inject,
NavigationPane,
DetailsView,
Toolbar,
} from '@syncfusion/ej2-react-filemanager';
const hostUrl = '/api/eventmedia/filemanager/'; // Dein Backend-Endpunkt für FileManager
const Media: React.FC = () => {
// State für die angezeigten Dateidetails
const [fileDetails] = useState<null | {
name: string;
size: number;
type: string;
dateModified: number;
description?: string | null;
}>(null);
// Ansicht: 'LargeIcons', 'Details'
const [viewMode, setViewMode] = useState<'LargeIcons' | 'Details'>('LargeIcons');
const fileManagerRef = useRef<FileManagerComponent | null>(null);
// Hilfsfunktion für Datum in Browser-Zeitzone
function formatLocalDate(timestamp: number) {
if (!timestamp) return '';
const date = new Date(timestamp * 1000);
return date.toLocaleString('de-DE'); // Zeigt lokale Zeit des Browsers
}
// Ansicht umschalten, ohne Remount
React.useEffect(() => {
if (fileManagerRef.current) {
const element = fileManagerRef.current.element as HTMLElement & { ej2_instances?: unknown[] };
if (element && element.ej2_instances && element.ej2_instances[0]) {
// Typisiere Instanz als unknown, da kein offizieller Typ vorhanden
const instanz = element.ej2_instances[0] as { view: string; dataBind: () => void };
instanz.view = viewMode;
instanz.dataBind();
}
}
}, [viewMode]);
return (
<div>
<h2 className="text-xl font-bold mb-4">Medien</h2>
{/* Ansicht-Umschalter */}
<div style={{ marginBottom: 12 }}>
<button
className={viewMode === 'LargeIcons' ? 'e-btn e-active' : 'e-btn'}
onClick={() => setViewMode('LargeIcons')}
style={{ marginRight: 8 }}
>
Icons
</button>
<button
className={viewMode === 'Details' ? 'e-btn e-active' : 'e-btn'}
onClick={() => setViewMode('Details')}
>
Details
</button>
</div>
{/* Debug-Ausgabe entfernt, da ReactNode erwartet wird */}
<FileManagerComponent
ref={fileManagerRef}
ajaxSettings={{
url: hostUrl + 'operations',
getImageUrl: hostUrl + 'get-image',
uploadUrl: hostUrl + 'upload',
downloadUrl: hostUrl + 'download',
}}
toolbarSettings={{
items: [
'NewFolder',
'Upload',
'Download',
'Rename',
'Delete',
'SortBy',
'Refresh',
'Details',
],
}}
contextMenuSettings={{
file: ['Open', '|', 'Download', '|', 'Rename', 'Delete', '|', 'Details'],
folder: ['Open', '|', 'Rename', 'Delete', '|', 'Details'],
layout: ['SortBy', 'Refresh', '|', 'View', 'Details'],
}}
allowMultiSelection={false}
view={viewMode}
detailsViewSettings={{
columns: [
{ field: 'name', headerText: 'Name', minWidth: '120', width: '200' },
{ field: 'size', headerText: 'Größe', minWidth: '80', width: '100' },
{
field: 'dateModified',
headerText: 'Upload-Datum',
minWidth: '120',
width: '180',
template: (data: { dateModified: number }) => formatLocalDate(data.dateModified),
},
{ field: 'type', headerText: 'Typ', minWidth: '80', width: '100' },
],
}}
menuClick={() => {}}
>
<Inject services={[NavigationPane, DetailsView, Toolbar]} />
</FileManagerComponent>
{/* Details-Panel anzeigen, wenn Details verfügbar sind */}
{fileDetails && <CustomMediaInfoPanel {...fileDetails} />}
</div>
);
};
export default Media;

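The `formatLocalDate` helper above assumes the backend sends Unix timestamps in seconds, while JavaScript's `Date` expects milliseconds, hence the `* 1000`. A self-contained sketch:

```typescript
// Sketch of media.tsx's formatLocalDate: Unix seconds -> local date string.
function formatLocalDate(timestamp: number): string {
  if (!timestamp) return ''; // 0/undefined means "no date"
  const date = new Date(timestamp * 1000); // seconds -> milliseconds
  return date.toLocaleString('de-DE');
}

console.log(formatLocalDate(0)); // ''
console.log(formatLocalDate(1728570000)); // e.g. '10.10.2024, 15:00:00' (depends on host timezone and ICU data)
```

The Syncfusion details view then renders this via the template on the `dateModified` column.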

@@ -0,0 +1,172 @@
import React, { useState, useEffect } from 'react';
interface ProgramInfo {
appName: string;
version: string;
copyright: string;
supportContact: string;
description: string;
techStack: {
[key: string]: string;
};
openSourceComponents: {
frontend: { name: string; license: string }[];
backend: { name: string; license: string }[];
};
buildInfo: {
buildDate: string;
commitId: string;
};
changelog: {
version: string;
date: string;
changes: string[];
}[];
}
const Programminfo: React.FC = () => {
const [info, setInfo] = useState<ProgramInfo | null>(null);
const [error, setError] = useState<string | null>(null);
useEffect(() => {
fetch('/program-info.json')
.then(response => {
if (!response.ok) {
throw new Error('Netzwerk-Antwort war nicht ok');
}
return response.json();
})
.then(data => setInfo(data))
.catch(error => {
console.error('Fehler beim Laden der Programminformationen:', error);
setError('Informationen konnten nicht geladen werden.');
});
}, []);
if (error) {
return (
<div>
<h2 className="text-xl font-bold mb-4 text-red-600">Fehler</h2>
<p>{error}</p>
</div>
);
}
if (!info) {
return (
<div>
<h2 className="text-xl font-bold mb-4">Programminfo</h2>
<p>Lade Informationen...</p>
</div>
);
}
return (
<div className="space-y-8">
<div>
<h2 className="text-2xl font-bold mb-2">{info.appName}</h2>
<p className="text-gray-600">{info.description}</p>
</div>
<div className="grid grid-cols-1 md:grid-cols-2 gap-8">
{/* Allgemeine Infos & Build */}
<div className="bg-white p-6 rounded-lg shadow">
<h3 className="text-xl font-semibold mb-4 border-b pb-2">Allgemein</h3>
<div className="space-y-3">
<p>
<strong>Version:</strong> {info.version}
</p>
<p>
<strong>Copyright:</strong> {info.copyright}
</p>
<p>
<strong>Support:</strong>{' '}
<a href={`mailto:${info.supportContact}`} className="text-blue-600 hover:underline">
{info.supportContact}
</a>
</p>
<hr className="my-4" />
<h4 className="font-semibold">Build-Informationen</h4>
<p>
<strong>Build-Datum:</strong>{' '}
{new Date(info.buildInfo.buildDate).toLocaleString('de-DE')}
</p>
<p>
<strong>Commit-ID:</strong>{' '}
<span className="font-mono text-sm bg-gray-100 p-1 rounded">
{info.buildInfo.commitId}
</span>
</p>
</div>
</div>
{/* Technischer Stack */}
<div className="bg-white p-6 rounded-lg shadow">
<h3 className="text-xl font-semibold mb-4 border-b pb-2">Technologie-Stack</h3>
<ul className="list-disc list-inside space-y-2">
{Object.entries(info.techStack).map(([key, value]) => (
<li key={key}>
<span className="font-semibold capitalize">{key}:</span> {value}
</li>
))}
</ul>
</div>
</div>
{/* Changelog */}
<div>
<h3 className="text-xl font-semibold mb-4">Änderungsprotokoll (Changelog)</h3>
<div className="space-y-6">
{info.changelog.map(log => (
<div key={log.version} className="bg-white p-6 rounded-lg shadow">
<h4 className="font-bold text-lg mb-2">
Version {log.version}{' '}
<span className="text-sm font-normal text-gray-500">
- {new Date(log.date).toLocaleDateString('de-DE')}
</span>
</h4>
<ul className="list-disc list-inside space-y-1 text-gray-700">
{log.changes.map((change, index) => (
<li key={index}>{change}</li>
))}
</ul>
</div>
))}
</div>
</div>
{/* Open Source Komponenten */}
<div>
<h3 className="text-xl font-semibold mb-4">Verwendete Open-Source-Komponenten</h3>
<div className="grid grid-cols-1 md:grid-cols-2 gap-8">
{info.openSourceComponents.frontend && (
<div className="bg-white p-6 rounded-lg shadow">
<h4 className="font-bold mb-3">Frontend</h4>
<ul className="list-disc list-inside space-y-1">
{info.openSourceComponents.frontend.map(item => (
<li key={item.name}>
{item.name} ({item.license}-Lizenz)
</li>
))}
</ul>
</div>
)}
{info.openSourceComponents.backend && (
<div className="bg-white p-6 rounded-lg shadow">
<h4 className="font-bold mb-3">Backend</h4>
<ul className="list-disc list-inside space-y-1">
{info.openSourceComponents.backend.map(item => (
<li key={item.name}>
{item.name} ({item.license}-Lizenz)
</li>
))}
</ul>
</div>
)}
</div>
</div>
</div>
);
};
export default Programminfo;


@@ -0,0 +1,8 @@
import React from 'react';
const Ressourcen: React.FC = () => (
<div>
<h2 className="text-xl font-bold mb-4">Ressourcen</h2>
<p>Willkommen im Infoscreen-Management Ressourcen.</p>
</div>
);
export default Ressourcen;

4
dashboard/src/types/json.d.ts vendored Normal file

@@ -0,0 +1,4 @@
declare module '*.json' {
const value: unknown;
export default value;
}

1
dashboard/src/vite-env.d.ts vendored Normal file

@@ -0,0 +1 @@
/// <reference types="vite/client" />


@@ -0,0 +1,10 @@
module.exports = {
content: ['./index.html', './src/**/*.{js,ts,jsx,tsx}'],
corePlugins: {
preflight: false,
},
theme: {
extend: {},
},
plugins: [],
};


@@ -0,0 +1,27 @@
{
"compilerOptions": {
"tsBuildInfoFile": "./node_modules/.tmp/tsconfig.app.tsbuildinfo",
"target": "ES2020",
"useDefineForClassFields": true,
"lib": ["ES2020", "DOM", "DOM.Iterable"],
"module": "ESNext",
"skipLibCheck": true,
/* Bundler mode */
"moduleResolution": "bundler",
"allowImportingTsExtensions": true,
"verbatimModuleSyntax": true,
"moduleDetection": "force",
"noEmit": true,
"jsx": "react-jsx",
/* Linting */
"strict": true,
"noUnusedLocals": true,
"noUnusedParameters": true,
"erasableSyntaxOnly": true,
"noFallthroughCasesInSwitch": true,
"noUncheckedSideEffectImports": true
},
"include": ["src"]
}

10
dashboard/tsconfig.json Normal file

@@ -0,0 +1,10 @@
{
"compilerOptions": {
"typeRoots": ["./src/types", "./node_modules/@types"]
},
"files": [],
"references": [
{ "path": "./tsconfig.app.json" },
{ "path": "./tsconfig.node.json" }
]
}


@@ -0,0 +1,25 @@
{
"compilerOptions": {
"tsBuildInfoFile": "./node_modules/.tmp/tsconfig.node.tsbuildinfo",
"target": "ES2022",
"lib": ["ES2023"],
"module": "ESNext",
"skipLibCheck": true,
/* Bundler mode */
"moduleResolution": "bundler",
"allowImportingTsExtensions": true,
"verbatimModuleSyntax": true,
"moduleDetection": "force",
"noEmit": true,
/* Linting */
"strict": true,
"noUnusedLocals": true,
"noUnusedParameters": true,
"erasableSyntaxOnly": true,
"noFallthroughCasesInSwitch": true,
"noUncheckedSideEffectImports": true
},
"include": ["vite.config.ts"]
}

54
dashboard/vite.config.ts Normal file

@@ -0,0 +1,54 @@
import { defineConfig } from 'vite';
import react from '@vitejs/plugin-react';
// import path from 'path';
// https://vite.dev/config/
export default defineConfig({
cacheDir: './.vite',
plugins: [react()],
resolve: {
// 🔧 KORRIGIERT: Entferne die problematischen Aliases komplett
// Diese verursachen das "not an absolute path" Problem
// alias: {
// '@syncfusion/ej2-react-navigations': '@syncfusion/ej2-react-navigations/index.js',
// '@syncfusion/ej2-react-buttons': '@syncfusion/ej2-react-buttons/index.js',
// },
},
optimizeDeps: {
// 🔧 NEU: Force pre-bundling der Syncfusion Module
include: [
'@syncfusion/ej2-react-navigations',
'@syncfusion/ej2-react-buttons',
'@syncfusion/ej2-base',
'@syncfusion/ej2-navigations',
'@syncfusion/ej2-buttons',
'@syncfusion/ej2-react-base',
],
// 🔧 NEU: Force dependency re-optimization
force: true,
esbuildOptions: {
target: 'es2020',
},
},
build: {
target: 'es2020',
commonjsOptions: {
include: [/node_modules/],
transformMixedEsModules: true,
},
},
server: {
host: '0.0.0.0',
port: 5173,
watch: {
usePolling: true,
},
fs: {
strict: false,
},
proxy: {
'/api': 'http://server:8000',
'/screenshots': 'http://server:8000',
},
},
});

24
dashboard/wait-for-backend.sh Executable file

@@ -0,0 +1,24 @@
#!/bin/sh
# wait-for-backend.sh
# Stellt sicher, dass das Skript bei einem Fehler abbricht
set -e
# Der erste Parameter ist der Host, der erreicht werden soll
host="$1"
# Alle weiteren Parameter bilden den Befehl, der danach ausgeführt werden soll
shift
# Schleife, die so lange läuft, bis der Host mit einem erfolgreichen HTTP-Status antwortet
# curl -s: silent mode (kein Fortschrittsbalken)
# curl -f: fail silently (gibt einen Fehlercode > 0 zurück, wenn der HTTP-Status nicht 2xx ist)
until curl -s -f "$host" > /dev/null; do
  >&2 echo "Backend ist noch nicht erreichbar - schlafe für 2 Sekunden"
  sleep 2
done
# Wenn die Schleife beendet ist, ist das Backend erreichbar
>&2 echo "Backend ist erreichbar - starte Vite-Server..."
# Führe den eigentlichen Befehl aus (z.B. npm run dev); "$@" statt cmd="$@",
# damit Argumente mit Leerzeichen unverändert erhalten bleiben
exec "$@"

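The same wait loop can be sketched in TypeScript with an injected health probe, so the retry logic can be exercised without a network. Function and parameter names here are illustrative, and unlike the shell script this sketch gives up after `maxAttempts` instead of looping forever, so failures surface in CI:

```typescript
// Sketch of wait-for-backend.sh's retry loop with an injected probe.
async function waitForBackend(
  probe: () => Promise<boolean>, // resolves true once the backend answers with 2xx
  retryMs = 2000,
  maxAttempts = 30
): Promise<number> {
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    if (await probe()) return attempt; // reachable: report how many tries it took
    console.error('Backend not reachable yet, sleeping...');
    await new Promise(resolve => setTimeout(resolve, retryMs));
  }
  throw new Error(`backend not reachable after ${maxAttempts} attempts`);
}

// Example: a fake probe that succeeds on the third call.
let calls = 0;
waitForBackend(async () => ++calls >= 3, 1).then(n => console.log(n)); // 3
```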
336
deployment-debian.md Normal file

@@ -0,0 +1,336 @@
# Infoscreen Deployment Guide (Debian)
Complete guide to deploying the Infoscreen system on a Debian server (Bookworm/Trixie) using the GitHub Container Registry.
## 📋 Overview
- **Phase 0**: Docker installation (optional)
- **Phase 1**: Build images and push them to the registry
- **Phase 2**: Prepare the Debian server
- **Phase 3**: System configuration and startup
---
## 🐳 Phase 0: Docker installation (optional)
If Docker is not installed yet, choose one of the following options:
### Option A: Debian repository (quick, but often an older version)
```bash
sudo apt update
sudo apt install -y docker.io docker-compose-plugin
sudo systemctl enable docker
sudo systemctl start docker
```
### Option B: Official Docker installation (recommended for production)
```bash
# Remove old Docker versions (if any)
sudo apt remove -y docker docker-engine docker.io containerd runc
# Install prerequisites
sudo apt update
sudo apt install -y ca-certificates curl gnupg lsb-release
# Add the repository key
sudo install -m 0755 -d /etc/apt/keyrings
curl -fsSL https://download.docker.com/linux/debian/gpg | sudo gpg --dearmor -o /etc/apt/keyrings/docker.gpg
sudo chmod a+r /etc/apt/keyrings/docker.gpg
# Determine the Debian codename (bookworm / trixie / bullseye ...)
source /etc/os-release
echo "Using Debian release: $VERSION_CODENAME"
# Add the Docker repository
echo \
  "deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/docker.gpg] https://download.docker.com/linux/debian \
  $VERSION_CODENAME stable" | sudo tee /etc/apt/sources.list.d/docker.list > /dev/null
# Install Docker (latest from the official repo)
sudo apt update
sudo apt install -y docker-ce docker-ce-cli containerd.io docker-buildx-plugin docker-compose-plugin
# Enable Docker
sudo systemctl enable docker
sudo systemctl start docker
# Add your user to the docker group (for rootless access)
sudo usermod -aG docker $USER
echo "Please log out and back in (close and reopen the SSH session) so the group membership takes effect."
```
### Test the Docker installation
```bash
docker run hello-world
docker --version
docker compose version
```
---
## 🏗️ Phase 1: Build and push images (local development machine)
### 1. Log in to the GitHub Container Registry
```bash
# Use a personal access token (write:packages)
echo $GITHUB_TOKEN | docker login ghcr.io -u robbstarkaustria --password-stdin
# Or interactively
docker login ghcr.io
# Username: robbstarkaustria
# Password: [GITHUB_TOKEN]
```
### 2. Build and tag the images
```bash
cd /workspace
docker build -f server/Dockerfile -t ghcr.io/robbstarkaustria/infoscreen-api:latest .
docker build -f dashboard/Dockerfile -t ghcr.io/robbstarkaustria/infoscreen-dashboard:latest .
docker build -f listener/Dockerfile -t ghcr.io/robbstarkaustria/infoscreen-listener:latest .
docker build -f scheduler/Dockerfile -t ghcr.io/robbstarkaustria/infoscreen-scheduler:latest .
```
### 3. Push the images
```bash
docker push ghcr.io/robbstarkaustria/infoscreen-api:latest
docker push ghcr.io/robbstarkaustria/infoscreen-dashboard:latest
docker push ghcr.io/robbstarkaustria/infoscreen-listener:latest
docker push ghcr.io/robbstarkaustria/infoscreen-scheduler:latest
docker images | grep ghcr.io
```
---
## 🖥️ Phase 2: Debian server preparation
### 4. Update the base system
```bash
sudo apt update && sudo apt upgrade -y
sudo apt install -y git curl wget
# If Docker is still missing → run Phase 0
```
### 5. Transfer the deployment files
```bash
mkdir -p ~/infoscreen-deployment
cd ~/infoscreen-deployment
scp user@dev-machine:/workspace/docker-compose.prod.yml .
scp user@dev-machine:/workspace/.env .
scp user@dev-machine:/workspace/nginx.conf .
scp -r user@dev-machine:/workspace/certs ./
scp -r user@dev-machine:/workspace/mosquitto ./
# Alternative packaging:
# (on the development machine)
# tar -czf infoscreen-deployment.tar.gz docker-compose.prod.yml .env nginx.conf certs/ mosquitto/
# scp infoscreen-deployment.tar.gz user@server:~/
# (on the server)
# tar -xzf infoscreen-deployment.tar.gz -C ~/infoscreen-deployment
```
### 6. Mosquitto configuration (if not copied)
```bash
mkdir -p mosquitto/{config,data,log}
cat > mosquitto/config/mosquitto.conf << 'EOF'
listener 1883
allow_anonymous true
listener 9001
protocol websockets
persistence true
persistence_location /mosquitto/data/
log_dest file /mosquitto/log/mosquitto.log
EOF
sudo chown -R 1883:1883 mosquitto/data mosquitto/log
chmod 755 mosquitto/config mosquitto/data mosquitto/log
```
### 7. Check/adjust the environment (.env)
```bash
nano .env
# Check, among other things:
# DB_HOST=db
# DB_CONN=mysql+pymysql://${DB_USER}:${DB_PASSWORD}@db/${DB_NAME}
# VITE_API_URL=https://YOUR_SERVER_HOST/api
# Set strong passwords & secrets
```
Notes:
- Use the `.env.example` template from the repository: `cp .env.example .env` (if not already done).
- In production the code does not load a `.env` automatically (only when `ENV=development`).
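The fallback order for the database connection in `.env.example` (a set `DB_CONN` wins over the individual `DB_*` variables) can be sketched as a small helper. This is illustrative only; `resolve_db_conn` is a hypothetical name, not actual project code:

```python
def resolve_db_conn(env: dict) -> str:
    """Return the DB connection string using the fallback order from
    .env.example: DB_CONN wins; otherwise it is assembled from parts."""
    conn = env.get("DB_CONN")
    if conn:
        return conn
    user = env.get("DB_USER", "your_user")
    password = env.get("DB_PASSWORD", "your_password")
    host = env.get("DB_HOST", "db")
    name = env.get("DB_NAME", "infoscreen_by_taa")
    return f"mysql+pymysql://{user}:{password}@{host}/{name}"

# DB_CONN takes precedence when set
print(resolve_db_conn({"DB_CONN": "mysql+pymysql://u:p@db/x"}))
# otherwise the string is assembled from the individual variables
print(resolve_db_conn({"DB_USER": "u", "DB_PASSWORD": "p", "DB_NAME": "x"}))
```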
---
## 🚀 Phase 3: System startup and configuration
### 8. Pull images from the registry
```bash
echo $GITHUB_TOKEN | docker login ghcr.io -u robbstarkaustria --password-stdin
docker compose -f docker-compose.prod.yml pull
```
### 9. Start the system
```bash
docker compose -f docker-compose.prod.yml up -d
docker compose ps
docker compose logs -f
```
### 10. Configure the firewall
Debian ships with nftables/iptables enabled by default. If simpler management is preferred, `ufw` can be installed:
```bash
sudo apt install -y ufw
sudo ufw allow ssh
sudo ufw allow 80/tcp
sudo ufw allow 443/tcp
sudo ufw allow 1883/tcp
sudo ufw allow 9001/tcp
sudo ufw enable
sudo ufw status
```
Alternatively, configure the rules directly via nftables / iptables (optional, not shown here).
### 11. Validate the installation
```bash
curl http://localhost/api/health
curl -k https://localhost # -k for self-signed certificates
docker compose ps
docker compose logs server
docker compose logs mqtt
```
---
## 🧪 Quickstart (local development)
```bash
cp -n .env.example .env
docker compose up -d --build
docker compose ps
docker compose logs -f server
```
Dev endpoints:
- Dashboard: http://localhost:5173
- API: http://localhost:8000/api
- Health: http://localhost:8000/health
- Screenshots: http://localhost:8000/screenshots/<uuid>.jpg
- MQTT: localhost:1883 (WebSocket: 9001)
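The listener stamps each client's `last_alive` on every heartbeat, and the `HEARTBEAT_GRACE_PERIOD_DEV` / `HEARTBEAT_GRACE_PERIOD_PROD` values in `.env` define how long a client still counts as online afterwards. A minimal sketch of that check (illustrative only; `is_alive` is a hypothetical helper, not the actual server code):

```python
from datetime import datetime, timedelta, timezone

def is_alive(last_alive, grace_seconds, now=None):
    """A client counts as online while its last heartbeat lies within
    the grace window (HEARTBEAT_GRACE_PERIOD_* seconds)."""
    now = now or datetime.now(timezone.utc)
    return (now - last_alive) <= timedelta(seconds=grace_seconds)

base = datetime(2025, 1, 1, tzinfo=timezone.utc)
print(is_alive(base - timedelta(seconds=10), 15, base))  # within the 15 s dev window: True
print(is_alive(base - timedelta(seconds=30), 15, base))  # dev window exceeded: False
```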
### 12. Systemd autostart (optional)
```bash
# Note: unquoted EOF so that $USER expands to your username when the unit file is written
sudo tee /etc/systemd/system/infoscreen.service > /dev/null << EOF
[Unit]
Description=Infoscreen Application
Requires=docker.service
After=docker.service
[Service]
Type=oneshot
RemainAfterExit=yes
WorkingDirectory=/home/$USER/infoscreen-deployment
ExecStart=/usr/bin/docker compose -f docker-compose.prod.yml up -d
ExecStop=/usr/bin/docker compose -f docker-compose.prod.yml down
TimeoutStartSec=300
[Install]
WantedBy=multi-user.target
EOF
sudo systemctl daemon-reload
sudo systemctl enable infoscreen.service
sudo systemctl start infoscreen.service
```
---
## 🌐 Accessing the application
- HTTPS Dashboard: `https://YOUR_SERVER_IP`
- HTTP (Redirect): `http://YOUR_SERVER_IP`
- API: `http://YOUR_SERVER_IP/api/`
- MQTT: `YOUR_SERVER_IP:1883`
- MQTT WebSocket: `YOUR_SERVER_IP:9001`
---
## 🔧 Troubleshooting
```bash
docker compose ps
docker compose logs -f server
docker compose restart server
```
Common causes:
| Problem | Solution |
|---------|--------|
| Container won't start | check `docker compose logs <service>` |
| Ports already in use | `ss -tulpn \| grep -E ':80\|:443\|:1883\|:9001'` |
| No Docker permission | add the user to the `docker` group & log in again |
| DB connection fails | check the `.env` entries (host = db) |
| Mosquitto errors | check folder permissions (`1883:1883`) |
Restart / update the stack:
```bash
docker compose down
docker compose pull
docker compose up -d
```
---
## 📝 Maintenance
### Updates
```bash
docker compose pull
docker compose up -d
sudo apt update && sudo apt upgrade -y
```
### Backup (depends on what data is persisted; shown here only for Mosquitto + certs as an example)
```bash
docker compose down
sudo tar -czf infoscreen-backup-$(date +%Y%m%d).tar.gz mosquitto/data/ certs/
# Restore
sudo tar -xzf infoscreen-backup-YYYYMMDD.tar.gz
docker compose up -d
```
---
**The Infoscreen system is now deployed on your Debian server.**
For suggestions or problems: issues / pull requests in the repository are welcome.

417
deployment-ubuntu.md Normal file
View File

@@ -0,0 +1,417 @@
# Infoscreen Deployment Guide
Complete guide for deploying the Infoscreen system on an Ubuntu server with the GitHub Container Registry.
## 📋 Overview
- **Phase 0**: Docker installation (optional)
- **Phase 1**: Build images and push them to the registry
- **Phase 2**: Ubuntu server installation
- **Phase 3**: System configuration and startup
---
## 🐳 Phase 0: Docker installation (optional)
If Docker is not yet installed, choose one of the following options:
### Option A: Ubuntu repository (quick)
```bash
# Standard Ubuntu Docker packages
sudo apt update
sudo apt install docker.io docker-compose-plugin -y
sudo systemctl enable docker
sudo systemctl start docker
```
### Option B: Official Docker installation (recommended)
```bash
# Remove old Docker versions
sudo apt remove docker docker-engine docker.io containerd runc -y
# Install dependencies
sudo apt update
sudo apt install ca-certificates curl gnupg lsb-release -y
# Add the Docker GPG key
sudo mkdir -p /etc/apt/keyrings
curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo gpg --dearmor -o /etc/apt/keyrings/docker.gpg
# Add the Docker repository
echo \
"deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/docker.gpg] https://download.docker.com/linux/ubuntu \
$(lsb_release -cs) stable" | sudo tee /etc/apt/sources.list.d/docker.list > /dev/null
# Install Docker (latest version)
sudo apt update
sudo apt install docker-ce docker-ce-cli containerd.io docker-buildx-plugin docker-compose-plugin -y
# Enable and start Docker
sudo systemctl enable docker
sudo systemctl start docker
# Add the user to the docker group
sudo usermod -aG docker $USER
# Re-login required for the group change to take effect
exit
# Log back in via SSH
```
### Test the Docker installation
```bash
# Run a test container
docker run hello-world
# Check the Docker version
docker --version
docker compose version
```
---
## 🏗️ Phase 1: Build and push images (development machine)
### 1. GitHub Container Registry Login
```bash
# Create a GitHub personal access token with write:packages permission
echo $GITHUB_TOKEN | docker login ghcr.io -u robbstarkaustria --password-stdin
# Or interactively:
docker login ghcr.io
# Username: robbstarkaustria
# Password: [GITHUB_TOKEN]
```
### 2. Build and tag the images
```bash
cd /workspace
# Build the server image
docker build -f server/Dockerfile -t ghcr.io/robbstarkaustria/infoscreen-api:latest .
# Build the dashboard image
docker build -f dashboard/Dockerfile -t ghcr.io/robbstarkaustria/infoscreen-dashboard:latest .
# Build the listener image (if present)
docker build -f listener/Dockerfile -t ghcr.io/robbstarkaustria/infoscreen-listener:latest .
# Build the scheduler image (if present)
docker build -f scheduler/Dockerfile -t ghcr.io/robbstarkaustria/infoscreen-scheduler:latest .
```
### 3. Push the images to the registry
```bash
# Push all images
docker push ghcr.io/robbstarkaustria/infoscreen-api:latest
docker push ghcr.io/robbstarkaustria/infoscreen-dashboard:latest
docker push ghcr.io/robbstarkaustria/infoscreen-listener:latest
docker push ghcr.io/robbstarkaustria/infoscreen-scheduler:latest
# Check the status
docker images | grep ghcr.io
```
---
## 🖥️ Phase 2: Ubuntu server installation
### 4. Prepare the Ubuntu server
```bash
sudo apt update && sudo apt upgrade -y
# Install basic tools
sudo apt install git curl wget -y
# Install Docker (see Phase 0)
```
### 5. Transfer the deployment files
```bash
# Create the deployment folder
mkdir -p ~/infoscreen-deployment
cd ~/infoscreen-deployment
# Copy the files from the dev system (via SCP)
scp user@dev-machine:/workspace/docker-compose.prod.yml .
scp user@dev-machine:/workspace/.env .
scp user@dev-machine:/workspace/nginx.conf .
scp -r user@dev-machine:/workspace/certs ./
scp -r user@dev-machine:/workspace/mosquitto ./
# Alternative: use a deployment package
# On the dev machine (/workspace):
# tar -czf infoscreen-deployment.tar.gz docker-compose.prod.yml .env nginx.conf certs/ mosquitto/
# scp infoscreen-deployment.tar.gz user@server:~/
# On the server: tar -xzf infoscreen-deployment.tar.gz
```
### 6. Prepare the Mosquitto configuration
```bash
# If the mosquitto folder does not fully exist yet:
mkdir -p mosquitto/{config,data,log}
# Create the Mosquitto configuration (if it was not transferred)
cat > mosquitto/config/mosquitto.conf << 'EOF'
# -----------------------------
# Network configuration
# -----------------------------
listener 1883
allow_anonymous true
# password_file /mosquitto/config/passwd
# WebSocket (optional)
listener 9001
protocol websockets
# -----------------------------
# Persistence & paths
# -----------------------------
persistence true
persistence_location /mosquitto/data/
log_dest file /mosquitto/log/mosquitto.log
EOF
# Set permissions for Mosquitto
sudo chown -R 1883:1883 mosquitto/data mosquitto/log
chmod 755 mosquitto/config mosquitto/data mosquitto/log
```
### 7. Adjust the environment variables
```bash
# Adjust .env for the production environment
nano .env
# Important adjustments:
# VITE_API_URL=https://YOUR_SERVER_HOST/api # For the dashboard build (production)
# DB_HOST=db # Always 'db' inside containers
# DB_CONN=mysql+pymysql://${DB_USER}:${DB_PASSWORD}@db/${DB_NAME}
# Change all passwords for production
```
Notes:
- A `.env.example` template ships with the repo. Copy it as a starting point: `cp .env.example .env`.
- For local development, `server/database.py` loads the `.env` when `ENV=development` is set.
- In production, Compose/the containers manage the variables; no automatic `.env` loading in code is needed.
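The development-only `.env` loading described above follows the same gate as `listener/listener.py` in this repository; a minimal sketch (the helper name `should_load_dotenv` is illustrative, not project code):

```python
import os

def should_load_dotenv(environ):
    """Mirror the gate used in listener/listener.py: the .env file is only
    loaded when ENV is 'development' (also the default when ENV is unset)."""
    return environ.get("ENV", "development") == "development"

print(should_load_dotenv({}))                     # default is development: True
print(should_load_dotenv({"ENV": "production"}))  # production: False
print(should_load_dotenv(os.environ))             # depends on the current shell
```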
---
## 🚀 Phase 3: System startup and configuration
### 8. Pull images from the registry
```bash
# GitHub Container Registry login (if the repository is private)
echo $GITHUB_TOKEN | docker login ghcr.io -u robbstarkaustria --password-stdin
# Pull the images
docker compose -f docker-compose.prod.yml pull
```
### 9. Start the system
```bash
# Start the containers
docker compose -f docker-compose.prod.yml up -d
# Check the status
docker compose ps
docker compose logs -f
```
### 10. Configure the firewall
```bash
# Allow SSH first so enabling ufw does not lock you out
sudo ufw allow ssh
sudo ufw allow 80/tcp
sudo ufw allow 443/tcp
sudo ufw allow 1883/tcp # MQTT
sudo ufw allow 9001/tcp # MQTT WebSocket
sudo ufw enable
sudo ufw status
```
### 11. Validate the installation
```bash
# Health checks
curl http://localhost/api/health
curl https://localhost -k # -k for self-signed certificates
# Container status
docker compose ps
# Show logs when problems occur
docker compose logs server
docker compose logs dashboard
docker compose logs mqtt
```
---
## 🧪 Quickstart (development)
Quick start of the development environment with automatic proxies and hot reload.
```bash
# In the repository root
# 1) Create .env from the template (locally, if not already present)
cp -n .env.example .env
# 2) Start the dev stack (uses docker-compose.yml + docker-compose.override.yml)
docker compose up -d --build
# 3) Status & logs
docker compose ps
docker compose logs -f server
docker compose logs -f dashboard
docker compose logs -f mqtt
# 4) Stop the stack
docker compose down
```
Dev endpoints:
- Dashboard (Vite): http://localhost:5173
- API (Flask Dev): http://localhost:8000/api
- API Health: http://localhost:8000/health
- Screenshots: http://localhost:8000/screenshots/<uuid>.jpg
- MQTT: localhost:1883 (WebSocket: localhost:9001)
Notes:
- `ENV=development` loads the `.env` automatically in `server/database.py`.
- The Vite proxy routes `/api` and `/screenshots` directly to the API in dev (see `dashboard/vite.config.ts`).
### 12. Automatic start (optional)
```bash
# Create a systemd service
# Note: unquoted EOF so that $USER expands to your username when the unit file is written
sudo tee /etc/systemd/system/infoscreen.service > /dev/null << EOF
[Unit]
Description=Infoscreen Application
Requires=docker.service
After=docker.service
[Service]
Type=oneshot
RemainAfterExit=yes
WorkingDirectory=/home/$USER/infoscreen-deployment
ExecStart=/usr/bin/docker compose -f docker-compose.prod.yml up -d
ExecStop=/usr/bin/docker compose -f docker-compose.prod.yml down
TimeoutStartSec=300
[Install]
WantedBy=multi-user.target
EOF
# Enable the service
sudo systemctl daemon-reload
sudo systemctl enable infoscreen.service
sudo systemctl start infoscreen.service
```
---
## 🌐 Accessing the application
After a successful deployment the application is reachable at the following URLs:
- **HTTPS Dashboard**: `https://YOUR_SERVER_IP`
- **HTTP Dashboard**: `http://YOUR_SERVER_IP` (redirects to HTTPS)
- **API**: `http://YOUR_SERVER_IP/api/`
- **MQTT**: `YOUR_SERVER_IP:1883`
- **MQTT WebSocket**: `YOUR_SERVER_IP:9001`
---
## 🔧 Troubleshooting
### Check container status
```bash
# Show all containers
docker compose ps
# Show logs of a specific service
docker compose logs -f [service-name]
# Restart a single container
docker compose restart [service-name]
```
### Restart the system
```bash
# Full restart
docker compose down
docker compose up -d
# Re-pull the images
docker compose pull
docker compose up -d
```
### Common problems
| Problem | Solution |
|---------|--------|
| Container won't start | check `docker compose logs [service]` |
| Ports already in use | check `ss -tulpn \| grep :80` |
| No permission | add the user to the docker group |
| DB connection fails | check the environment variables in `.env` |
| Mosquitto won't start | set folder permissions to `1883:1883` |
---
## 📊 Docker version comparison
| Aspect | Ubuntu repository | Official installation |
|--------|------------------|------------------------|
| **Installation** | ✅ Quick (1 command) | ⚠️ Several steps |
| **Version** | ⚠️ Often older | ✅ Latest version |
| **Updates** | ✅ Via apt | ✅ Via apt (after setup) |
| **Stability** | ✅ Well tested | ✅ Current |
| **Features** | ⚠️ Possibly limited | ✅ All features |
**Recommendation:** Use the official Docker installation for production.
---
## 📝 Maintenance
### Regular updates
```bash
# Update the images
docker compose pull
docker compose up -d
# System updates
sudo apt update && sudo apt upgrade -y
```
### Backup
```bash
# Container-Daten sichern
docker compose down
sudo tar -czf infoscreen-backup-$(date +%Y%m%d).tar.gz mosquitto/data/ certs/
# Restore the backup
sudo tar -xzf infoscreen-backup-YYYYMMDD.tar.gz
docker compose up -d
```
---
**The Infoscreen system is now fully deployed via the GitHub Container Registry.**

143
docker-compose.prod.yml Normal file
View File

@@ -0,0 +1,143 @@
networks:
infoscreen-net:
driver: bridge
services:
proxy:
image: nginx:1.25
container_name: infoscreen-proxy
restart: unless-stopped
ports:
- "80:80"
- "443:443"
volumes:
- ./nginx.conf:/etc/nginx/nginx.conf:ro
- ./certs:/etc/nginx/certs:ro
depends_on:
- server
- dashboard
networks:
- infoscreen-net
db:
image: mariadb:11.2
container_name: infoscreen-db
restart: unless-stopped
environment:
MYSQL_ROOT_PASSWORD: ${DB_ROOT_PASSWORD}
MYSQL_DATABASE: ${DB_NAME}
MYSQL_USER: ${DB_USER}
MYSQL_PASSWORD: ${DB_PASSWORD}
volumes:
- db-data:/var/lib/mysql
networks:
- infoscreen-net
healthcheck:
test: ["CMD", "healthcheck.sh", "--connect", "--innodb_initialized"]
interval: 30s
timeout: 5s
retries: 3
start_period: 30s
mqtt:
image: eclipse-mosquitto:2.0.21
container_name: infoscreen-mqtt
restart: unless-stopped
volumes:
- ./mosquitto/config/mosquitto.conf:/mosquitto/config/mosquitto.conf:ro
ports:
- "1883:1883"
- "9001:9001"
networks:
- infoscreen-net
healthcheck:
test: ["CMD-SHELL", "mosquitto_pub -h localhost -t test -m 'health' || exit 1"]
interval: 30s
timeout: 5s
retries: 3
start_period: 10s
  # Use prebuilt images instead of building locally
server:
image: ghcr.io/robbstarkaustria/infoscreen-api:latest
container_name: infoscreen-api
restart: unless-stopped
depends_on:
db:
condition: service_healthy
mqtt:
condition: service_healthy
environment:
DB_CONN: "mysql+pymysql://${DB_USER}:${DB_PASSWORD}@db/${DB_NAME}"
DB_USER: ${DB_USER}
DB_PASSWORD: ${DB_PASSWORD}
DB_NAME: ${DB_NAME}
DB_ROOT_PASSWORD: ${DB_ROOT_PASSWORD}
DB_HOST: db
FLASK_ENV: production
MQTT_BROKER_URL: mqtt://mqtt:1883
MQTT_USER: ${MQTT_USER}
MQTT_PASSWORD: ${MQTT_PASSWORD}
networks:
- infoscreen-net
healthcheck:
test: ["CMD", "curl", "-f", "http://localhost:8000/health"]
interval: 30s
timeout: 5s
retries: 3
start_period: 40s
command: >
bash -c "alembic -c /app/server/alembic.ini upgrade head &&
python /app/server/init_defaults.py &&
exec gunicorn server.wsgi:app --bind 0.0.0.0:8000"
dashboard:
    image: ghcr.io/robbstarkaustria/infoscreen-dashboard:latest # or wherever your images are hosted
container_name: infoscreen-dashboard
restart: unless-stopped
depends_on:
server:
condition: service_healthy
environment:
NODE_ENV: production
VITE_API_URL: ${API_URL}
networks:
- infoscreen-net
listener:
    image: ghcr.io/robbstarkaustria/infoscreen-listener:latest # or wherever your images are hosted
container_name: infoscreen-listener
restart: unless-stopped
depends_on:
db:
condition: service_healthy
mqtt:
condition: service_healthy
environment:
DB_CONN: "mysql+pymysql://${DB_USER}:${DB_PASSWORD}@db/${DB_NAME}"
DB_USER: ${DB_USER}
DB_PASSWORD: ${DB_PASSWORD}
DB_NAME: ${DB_NAME}
DB_ROOT_PASSWORD: ${DB_ROOT_PASSWORD}
networks:
- infoscreen-net
scheduler:
image: ghcr.io/robbstarkaustria/infoscreen-scheduler:latest
container_name: infoscreen-scheduler
restart: unless-stopped
depends_on:
      # ADDED: ensures the DB starts before the scheduler
db:
condition: service_healthy
mqtt:
condition: service_healthy
environment:
      # ADDED: database connection string
DB_CONN: "mysql+pymysql://${DB_USER}:${DB_PASSWORD}@db/${DB_NAME}"
MQTT_PORT: 1883
networks:
- infoscreen-net
volumes:
db-data:

209
docker-compose.yml Normal file
View File

@@ -0,0 +1,209 @@
networks:
infoscreen-net:
driver: bridge
services:
listener:
build:
context: .
dockerfile: listener/Dockerfile
image: infoscreen-listener:latest
container_name: infoscreen-listener
restart: unless-stopped
depends_on:
db:
condition: service_healthy
mqtt:
condition: service_healthy
environment:
- DB_CONN=mysql+pymysql://${DB_USER}:${DB_PASSWORD}@db/${DB_NAME}
- DB_URL=mysql+pymysql://${DB_USER}:${DB_PASSWORD}@db/${DB_NAME}
      # 🔧 REMOVED: the volume mount is for development only
networks:
- infoscreen-net
proxy:
    image: nginx:1.25 # 🔧 CHANGED: pinned version
container_name: infoscreen-proxy
ports:
- "80:80"
- "443:443"
volumes:
      - ./nginx.conf:/etc/nginx/nginx.conf:ro # 🔧 CHANGED: relative path
      - ./certs:/etc/nginx/certs:ro # 🔧 CHANGED: relative path
depends_on:
- server
- dashboard
networks:
- infoscreen-net
db:
    image: mariadb:11.2 # 🔧 CHANGED: pinned version
container_name: infoscreen-db
restart: unless-stopped
environment:
MYSQL_ROOT_PASSWORD: ${DB_ROOT_PASSWORD}
MYSQL_DATABASE: ${DB_NAME}
MYSQL_USER: ${DB_USER}
MYSQL_PASSWORD: ${DB_PASSWORD}
volumes:
- db-data:/var/lib/mysql
ports:
- "3306:3306"
networks:
- infoscreen-net
healthcheck:
test: ["CMD", "healthcheck.sh", "--connect", "--innodb_initialized"]
interval: 30s
timeout: 5s
retries: 3
start_period: 30s
mqtt:
    image: eclipse-mosquitto:2.0.21 # ✅ GOOD: version is already pinned
container_name: infoscreen-mqtt
restart: unless-stopped
volumes:
- ./mosquitto/config:/mosquitto/config
- ./mosquitto/data:/mosquitto/data
- ./mosquitto/log:/mosquitto/log
ports:
      - "1883:1883" # standard MQTT
      - "9001:9001" # WebSocket (if needed)
networks:
- infoscreen-net
healthcheck:
test:
[
"CMD-SHELL",
"mosquitto_pub -h localhost -t test -m 'health' || exit 1",
]
interval: 30s
timeout: 5s
retries: 3
start_period: 10s
server:
build:
context: .
dockerfile: server/Dockerfile
image: infoscreen-api:latest
container_name: infoscreen-api
restart: unless-stopped
depends_on:
db:
condition: service_healthy
mqtt:
condition: service_healthy
environment:
DB_CONN: "mysql+pymysql://${DB_USER}:${DB_PASSWORD}@db/${DB_NAME}"
FLASK_ENV: ${FLASK_ENV}
ENV_FILE: ${ENV_FILE}
MQTT_BROKER_URL: ${MQTT_BROKER_URL}
MQTT_USER: ${MQTT_USER}
MQTT_PASSWORD: ${MQTT_PASSWORD}
REDIS_URL: "${REDIS_URL:-redis://redis:6379/0}"
GOTENBERG_URL: "${GOTENBERG_URL:-http://gotenberg:3000}"
ports:
- "8000:8000"
networks:
- infoscreen-net
volumes:
- media-data:/app/server/media
healthcheck:
test: ["CMD", "curl", "-f", "http://localhost:8000/health"]
interval: 30s
timeout: 5s
retries: 3
start_period: 40s
  # ✅ CHANGED: dashboard now uses Node.js/React instead of Python/Dash
dashboard:
build:
context: ./dashboard
dockerfile: Dockerfile
args:
- VITE_API_URL=${API_URL}
image: infoscreen-dashboard:latest
container_name: infoscreen-dashboard
restart: unless-stopped
depends_on:
server:
condition: service_healthy
environment:
- NODE_ENV=production
- VITE_API_URL=${API_URL}
      # 🔧 REMOVED: the port is not exposed directly in production; access goes through the proxy
networks:
- infoscreen-net
healthcheck:
      # 🔧 CHANGED: the health check probes the Nginx server inside the container
test: ["CMD", "curl", "-f", "http://localhost/"]
interval: 30s
timeout: 5s
retries: 3
      # 🔧 INCREASED: gives the backend more time to start before this
      # container is marked "healthy".
start_period: 60s
scheduler:
build:
context: .
dockerfile: scheduler/Dockerfile
image: infoscreen-scheduler:latest
container_name: infoscreen-scheduler
restart: unless-stopped
depends_on:
      # ADDED: ensures the DB starts before the scheduler
db:
condition: service_healthy
mqtt:
condition: service_healthy
environment:
      # ADDED: database connection string
- DB_CONN=mysql+pymysql://${DB_USER}:${DB_PASSWORD}@db/${DB_NAME}
- MQTT_PORT=1883
networks:
- infoscreen-net
volumes:
- ./scheduler:/app/scheduler
redis:
image: redis:7-alpine
container_name: infoscreen-redis
restart: unless-stopped
networks:
- infoscreen-net
gotenberg:
image: gotenberg/gotenberg:8
container_name: infoscreen-gotenberg
restart: unless-stopped
networks:
- infoscreen-net
worker:
build:
context: .
dockerfile: server/Dockerfile
image: infoscreen-worker:latest
container_name: infoscreen-worker
restart: unless-stopped
depends_on:
- redis
- gotenberg
- db
environment:
DB_CONN: "mysql+pymysql://${DB_USER}:${DB_PASSWORD}@db/${DB_NAME}"
REDIS_URL: "${REDIS_URL:-redis://redis:6379/0}"
GOTENBERG_URL: "${GOTENBERG_URL:-http://gotenberg:3000}"
PYTHONPATH: /app
command: ["rq", "worker", "conversions"]
networks:
- infoscreen-net
volumes:
server-pip-cache:
db-data:
media-data:

100
early-validation.sh Normal file
View File

@@ -0,0 +1,100 @@
#!/bin/bash
# Early hardware validation at ~25% development progress
# Goal: catch architecture problems early, not a full test
echo "🧪 Infoscreen Early Hardware Validation"
echo "======================================"
echo "Development progress: ~25-30%"
echo "Goal: basic deployment + performance baseline"
echo ""
# Phase 1: Quick setup (30 min)
echo "📦 Phase 1: Container setup test"
echo "- Does docker compose start all services?"
echo "- Do the health checks turn green?"
echo "- Are the ports reachable?"
echo ""
# Phase 2: Connectivity test (1 hour)
echo "🌐 Phase 2: Service communication"
echo "- Database connection from the server?"
echo "- Does the MQTT broker receive messages?"
echo "- Does Nginx route to the services?"
echo "- Do the basic API endpoints respond?"
echo ""
# Phase 3: Performance baseline (2 hours)
echo "📊 Phase 3: Performance snapshot"
echo "- Memory usage per container"
echo "- CPU usage at idle"
echo "- Measure startup times"
echo "- Network latency between services"
echo ""
# Phase 4: Basic load test (4 hours)
echo "🔥 Phase 4: Basic load test"
echo "- 10 parallel API requests"
echo "- Send 1000 MQTT messages"
echo "- Database insert performance"
echo "- Memory leak check (1 h runtime)"
echo ""
# Create the test checklist
cat > early-validation-checklist.md << 'EOF'
# Early Hardware Validation Checklist
## ✅ Container setup
- [ ] `docker compose up -d` succeeds
- [ ] All services report "healthy"
- [ ] No error logs during the first 5 minutes
- [ ] Ports 80, 8000, 3306, 1883 reachable
## ✅ Service communication
- [ ] Server can connect to the database
- [ ] MQTT test message is received
- [ ] Nginx shows the service status page
- [ ] API health endpoint responds (200 OK)
## ✅ Performance baseline
- [ ] Total memory < 4GB at idle
- [ ] CPU usage < 10% at idle
- [ ] Container startup < 60s
- [ ] API response time < 500ms
## ✅ Basic load test
- [ ] 10 parallel requests without errors
- [ ] 1000 MQTT messages without message loss
- [ ] Memory usage stable over 1h
- [ ] No container restarts
## 📊 Baseline metrics (document these)
- Memory per container: ___MB
- CPU usage under load: ___%
- API response time: ___ms
- Database query time: ___ms
- Container startup time: ___s
## 🚨 Problems found
- [ ] Performance bottlenecks: ____________
- [ ] Memory issues: ____________________
- [ ] Network problems: _________________
- [ ] Container problems: _______________
## ✅ Architecture validation
- [ ] Container orchestration works
- [ ] Service discovery works
- [ ] Volume mounting correct
- [ ] Environment variables are loaded
- [ ] Health checks are meaningful
EOF
echo "✅ Early validation checklist created: early-validation-checklist.md"
echo ""
echo "🎯 Expected outcome:"
echo "- Architecture problems identified"
echo "- Performance baseline documented"
echo "- Deployment process validated"
echo "- Foundation laid for later tests"
echo ""
echo "⏰ Estimated effort: 8-12 hours over 2-3 days"
echo "💰 ROI: prevents expensive architecture changes later"

0
entrypoint.sh Normal file
View File

140
hardware-test-setup.sh Normal file
View File

@@ -0,0 +1,140 @@
#!/bin/bash
# Infoscreen hardware test setup for a quad-core 16GB system
echo "🖥️ Infoscreen Hardware Test Setup"
echo "=================================="
echo "System: Quad-Core, 16GB RAM, SSD"
echo ""
# Show system information
echo "📊 System information:"
echo "CPU Cores: $(nproc)"
echo "RAM Total: $(free -h | grep Mem | awk '{print $2}')"
echo "Disk Free: $(df -h / | tail -1 | awk '{print $4}')"
echo ""
# Docker setup
echo "🐳 Docker setup..."
sudo apt update -y
sudo apt install -y docker.io docker-compose-plugin
sudo systemctl enable docker
sudo systemctl start docker
sudo usermod -aG docker $USER
# Create test directories
echo "📁 Creating the test environment..."
mkdir -p ~/infoscreen-hardware-test/{prod,dev,monitoring,scripts,backups}
# Performance monitoring tools
echo "📊 Installing monitoring tools..."
sudo apt install -y htop iotop nethogs ncdu stress-ng
# Create the test script
cat > ~/infoscreen-hardware-test/scripts/system-monitor.sh << 'EOF'
#!/bin/bash
# System monitoring during tests
echo "=== Infoscreen System Monitor ==="
echo "Time: $(date)"
echo ""
echo "🖥️ CPU info:"
echo "Load: $(uptime | awk -F'load average:' '{print $2}')"
echo "Cores: $(nproc) | Usage: $(top -bn1 | grep "Cpu(s)" | awk '{print $2}' | cut -d'%' -f1)%"
echo ""
echo "💾 Memory info:"
free -h
echo ""
echo "💿 Disk info:"
df -h /
echo ""
echo "🐳 Docker info:"
docker ps --format "table {{.Names}}\t{{.Status}}\t{{.Ports}}"
echo ""
echo "🌡️ System temperature (if available):"
sensors 2>/dev/null || echo "lm-sensors not installed"
echo ""
echo "🌐 Network connections:"
ss -tuln | grep -E ':80|:443|:8000|:3306|:1883'
EOF
chmod +x ~/infoscreen-hardware-test/scripts/system-monitor.sh
# Create the load test script
cat > ~/infoscreen-hardware-test/scripts/load-test.sh << 'EOF'
#!/bin/bash
# Load test for the Infoscreen system
echo "🔥 Starting Infoscreen load test..."
# Generate CPU load (for thermal tests)
echo "CPU stress test (30s)..."
stress-ng --cpu $(nproc) --timeout 30s &
# Memory test
echo "Memory stress test..."
stress-ng --vm 2 --vm-bytes 2G --timeout 30s &
# Disk I/O test
echo "Disk I/O test..."
stress-ng --hdd 1 --hdd-bytes 1G --timeout 30s &
# Wait for the tests to finish
wait
echo "✅ Load test finished"
EOF
chmod +x ~/infoscreen-hardware-test/scripts/load-test.sh
# Docker test setup
echo "🧪 Docker test setup..."
cat > ~/infoscreen-hardware-test/docker-compose.test.yml << 'EOF'
version: '3.8'
services:
test-web:
image: nginx:alpine
ports: ["8080:80"]
deploy:
resources:
limits:
memory: 256M
reservations:
memory: 128M
test-db:
image: mariadb:11.2
environment:
MYSQL_ROOT_PASSWORD: test123
MYSQL_DATABASE: testdb
deploy:
resources:
limits:
memory: 512M
reservations:
memory: 256M
test-load:
image: alpine
command: sh -c "while true; do wget -q -O- http://test-web/ > /dev/null; sleep 0.1; done"
depends_on: [test-web]
EOF
echo ""
echo "✅ Setup complete!"
echo ""
echo "🚀 Next steps:"
echo "1. Log out and back in for the docker group"
echo "2. Test: docker run hello-world"
echo "3. System monitor: ~/infoscreen-hardware-test/scripts/system-monitor.sh"
echo "4. Load test: ~/infoscreen-hardware-test/scripts/load-test.sh"
echo "5. Docker test: cd ~/infoscreen-hardware-test && docker compose -f docker-compose.test.yml up"
echo ""
echo "📁 Test directory: ~/infoscreen-hardware-test/"
echo "📊 Monitoring: run system-monitor.sh in parallel with the tests"

0
helpers/__init__.py Normal file
View File

56
helpers/check_folder.py Normal file
View File

@@ -0,0 +1,56 @@
import os
from pathlib import Path
def ensure_folder_exists(folder_path):
"""
Check if a folder exists and create it if it doesn't.
Args:
folder_path (str or Path): Path to the folder to check/create
Returns:
bool: True if folder was created, False if it already existed
Raises:
OSError: If folder creation fails due to permissions or other issues
"""
folder_path = Path(folder_path)
if folder_path.exists():
if folder_path.is_dir():
return False # Folder already exists
else:
raise OSError(f"Path '{folder_path}' exists but is not a directory")
try:
folder_path.mkdir(parents=True, exist_ok=True)
return True # Folder was created
    except OSError as e:
        raise OSError(f"Failed to create folder '{folder_path}': {e}") from e
# Alternative simpler version using os module
def ensure_folder_exists_simple(folder_path):
"""
Simple version using os.makedirs with exist_ok parameter.
Args:
folder_path (str): Path to the folder to check/create
"""
os.makedirs(folder_path, exist_ok=True)
# Usage examples
if __name__ == "__main__":
# Example 1: Create a single folder
folder_created = ensure_folder_exists("my_new_folder")
print(f"Folder created: {folder_created}")
# Example 2: Create nested folders
ensure_folder_exists("data/processed/results")
# Example 3: Using the simple version
ensure_folder_exists_simple("logs/2024")
# Example 4: Using with absolute path
import tempfile
temp_dir = tempfile.gettempdir()
ensure_folder_exists(os.path.join(temp_dir, "my_app", "cache"))

4
listener/.dockerignore Normal file
View File

@@ -0,0 +1,4 @@
__pycache__/
*.pyc
*.pyo
*.log

16
listener/Dockerfile Normal file
View File

@@ -0,0 +1,16 @@
# Listener Dockerfile
FROM python:3.13-slim
WORKDIR /app
COPY listener/requirements.txt ./
RUN pip install --no-cache-dir -r requirements.txt
# Install Mosquitto client tools for MQTT tests
RUN apt-get update && apt-get install -y --no-install-recommends mosquitto-clients && rm -rf /var/lib/apt/lists/*
COPY listener/ ./listener
COPY models/ ./models
ENV PYTHONPATH=/app
CMD ["python", "listener/listener.py"]

102
listener/listener.py Normal file
View File

@@ -0,0 +1,102 @@
import os
import json
import logging
import datetime
import paho.mqtt.client as mqtt
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker
from models.models import Client
if os.getenv("ENV", "development") == "development":
from dotenv import load_dotenv
load_dotenv(".env")
# Environment-dependent configuration
ENV = os.getenv("ENV", "development")
LOG_LEVEL = os.getenv("LOG_LEVEL", "INFO" if ENV == "production" else "DEBUG")
DB_URL = os.environ.get(
"DB_CONN", "mysql+pymysql://user:password@db/infoscreen")
# Logging (honor LOG_LEVEL instead of hardcoding DEBUG)
logging.basicConfig(level=getattr(logging, LOG_LEVEL.upper(), logging.INFO),
format='%(asctime)s [%(levelname)s] %(message)s')
# Database configuration
engine = create_engine(DB_URL)
Session = sessionmaker(bind=engine)
def on_message(client, userdata, msg):
topic = msg.topic
logging.debug(f"Message received on topic: {topic}")
try:
# Heartbeat handling
if topic.startswith("infoscreen/") and topic.endswith("/heartbeat"):
uuid = topic.split("/")[1]
session = Session()
client_obj = session.query(Client).filter_by(uuid=uuid).first()
if client_obj:
client_obj.last_alive = datetime.datetime.now(datetime.UTC)
session.commit()
logging.info(
f"Heartbeat received from {uuid}, last_alive (UTC) updated.")
session.close()
return
# Discovery handling
if topic == "infoscreen/discovery":
payload = json.loads(msg.payload.decode())
logging.info(f"Discovery received: {payload}")
if "uuid" in payload:
uuid = payload["uuid"]
session = Session()
existing = session.query(Client).filter_by(uuid=uuid).first()
if not existing:
new_client = Client(
uuid=uuid,
hardware_token=payload.get("hardware_token"),
ip=payload.get("ip"),
type=payload.get("type"),
hostname=payload.get("hostname"),
os_version=payload.get("os_version"),
software_version=payload.get("software_version"),
macs=",".join(payload.get("macs", [])),
model=payload.get("model"),
registration_time=datetime.datetime.now(datetime.UTC),
)
session.add(new_client)
session.commit()
logging.info(f"New client registered: {uuid}")
else:
logging.info(f"Client already known: {uuid}")
session.close()
# Send discovery ACK
ack_topic = f"infoscreen/{uuid}/discovery_ack"
client.publish(ack_topic, json.dumps({"status": "ok"}))
logging.info(f"Discovery ACK sent to {ack_topic}")
else:
logging.warning("Discovery received without UUID, ignored.")
except Exception as e:
logging.error(f"Error during processing: {e}")
def main():
mqtt_client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2, protocol=mqtt.MQTTv311)
mqtt_client.on_message = on_message
mqtt_client.connect(os.getenv("MQTT_BROKER_HOST", "mqtt"), int(os.getenv("MQTT_BROKER_PORT", "1883")))
mqtt_client.subscribe("infoscreen/discovery")
mqtt_client.subscribe("infoscreen/+/heartbeat")
logging.info(
"Listener started; subscribed to infoscreen/discovery and infoscreen/+/heartbeat")
mqtt_client.loop_forever()
if __name__ == "__main__":
main()
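The listener extracts the client UUID with `topic.split("/")[1]` after matching the `infoscreen/<uuid>/heartbeat` pattern. That parsing step can be exercised without a broker; a minimal sketch (the helper name is illustrative, not part of the repo):

```python
def parse_heartbeat_uuid(topic: str):
    """Return the client UUID from an infoscreen heartbeat topic, else None."""
    parts = topic.split("/")
    if len(parts) == 3 and parts[0] == "infoscreen" and parts[2] == "heartbeat":
        return parts[1]
    return None

print(parse_heartbeat_uuid("infoscreen/1234-abcd/heartbeat"))  # -> 1234-abcd
print(parse_heartbeat_uuid("infoscreen/discovery"))            # -> None
```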

4
listener/requirements.txt Normal file
View File

@@ -0,0 +1,4 @@
paho-mqtt>=2.0
SQLAlchemy>=2.0
pymysql
python-dotenv

1
models/__init__.py Normal file
View File

@@ -0,0 +1 @@
# models package for shared SQLAlchemy models

271
models/models.py Normal file
View File

@@ -0,0 +1,271 @@
from sqlalchemy import (
Column, Integer, String, Enum, TIMESTAMP, func, Boolean, ForeignKey, Float, Text, Index, DateTime, Date, UniqueConstraint
)
from sqlalchemy.orm import declarative_base, relationship
import enum
from datetime import datetime, timezone
Base = declarative_base()
class UserRole(enum.Enum):
user = "user"
admin = "admin"
superadmin = "superadmin"
class AcademicPeriodType(enum.Enum):
schuljahr = "schuljahr"
semester = "semester"
trimester = "trimester"
class User(Base):
__tablename__ = 'users'
id = Column(Integer, primary_key=True, autoincrement=True)
username = Column(String(50), unique=True, nullable=False, index=True)
password_hash = Column(String(128), nullable=False)
role = Column(Enum(UserRole), nullable=False, default=UserRole.user)
is_active = Column(Boolean, default=True, nullable=False)
created_at = Column(TIMESTAMP(timezone=True),
server_default=func.current_timestamp())
updated_at = Column(TIMESTAMP(timezone=True), server_default=func.current_timestamp(
), onupdate=func.current_timestamp())
class AcademicPeriod(Base):
__tablename__ = 'academic_periods'
id = Column(Integer, primary_key=True, autoincrement=True)
name = Column(String(100), nullable=False) # "Schuljahr 2024/25"
display_name = Column(String(50), nullable=True)  # "SJ 24/25" (short form)
start_date = Column(Date, nullable=False, index=True)
end_date = Column(Date, nullable=False, index=True)
period_type = Column(Enum(AcademicPeriodType),
nullable=False, default=AcademicPeriodType.schuljahr)
# only one active period at a time
is_active = Column(Boolean, default=False, nullable=False)
created_at = Column(TIMESTAMP(timezone=True),
server_default=func.current_timestamp())
updated_at = Column(TIMESTAMP(timezone=True), server_default=func.current_timestamp(
), onupdate=func.current_timestamp())
# Note: "only one active period at a time" is enforced in application code; the index below only speeds up lookups
__table_args__ = (
Index('ix_academic_periods_active', 'is_active'),
UniqueConstraint('name', name='uq_academic_periods_name'),
)
def to_dict(self):
return {
"id": self.id,
"name": self.name,
"display_name": self.display_name,
"start_date": self.start_date.isoformat() if self.start_date else None,
"end_date": self.end_date.isoformat() if self.end_date else None,
"period_type": self.period_type.value if self.period_type else None,
"is_active": self.is_active,
"created_at": self.created_at.isoformat() if self.created_at else None,
"updated_at": self.updated_at.isoformat() if self.updated_at else None,
}
class ClientGroup(Base):
__tablename__ = 'client_groups'
id = Column(Integer, primary_key=True, autoincrement=True)
name = Column(String(100), unique=True, nullable=False)
description = Column(String(255), nullable=True)  # set manually
created_at = Column(TIMESTAMP(timezone=True),
server_default=func.current_timestamp())
is_active = Column(Boolean, default=True, nullable=False)
class Client(Base):
__tablename__ = 'clients'
uuid = Column(String(36), primary_key=True, nullable=False)
hardware_token = Column(String(64), nullable=True)
ip = Column(String(45), nullable=True)
type = Column(String(50), nullable=True)
hostname = Column(String(100), nullable=True)
os_version = Column(String(100), nullable=True)
software_version = Column(String(100), nullable=True)
macs = Column(String(255), nullable=True)
model = Column(String(100), nullable=True)
description = Column(String(255), nullable=True)  # set manually
registration_time = Column(TIMESTAMP(
timezone=True), server_default=func.current_timestamp(), nullable=False)
last_alive = Column(TIMESTAMP(timezone=True), server_default=func.current_timestamp(
), onupdate=func.current_timestamp(), nullable=False)
is_active = Column(Boolean, default=True, nullable=False)
group_id = Column(Integer, ForeignKey(
'client_groups.id'), nullable=False, default=1)
class EventType(enum.Enum):
presentation = "presentation"
website = "website"
video = "video"
message = "message"
other = "other"
webuntis = "webuntis"
class MediaType(enum.Enum):
# Presentations
pdf = "pdf"
ppt = "ppt"
pptx = "pptx"
odp = "odp"
# Videos (common VLC-playable formats)
mp4 = "mp4"
avi = "avi"
mkv = "mkv"
mov = "mov"
wmv = "wmv"
flv = "flv"
webm = "webm"
mpg = "mpg"
mpeg = "mpeg"
ogv = "ogv"
# Images (user-friendly formats)
jpg = "jpg"
jpeg = "jpeg"
png = "png"
gif = "gif"
bmp = "bmp"
tiff = "tiff"
svg = "svg"
# HTML message
html = "html"
# Websites
website = "website"
class Event(Base):
__tablename__ = 'events'
id = Column(Integer, primary_key=True, autoincrement=True)
group_id = Column(Integer, ForeignKey(
'client_groups.id'), nullable=False, index=True)
academic_period_id = Column(Integer, ForeignKey(
# Optional for backwards compatibility
'academic_periods.id'), nullable=True, index=True)
title = Column(String(100), nullable=False)
description = Column(Text, nullable=True)
start = Column(TIMESTAMP(timezone=True), nullable=False, index=True)
end = Column(TIMESTAMP(timezone=True), nullable=False, index=True)
event_type = Column(Enum(EventType), nullable=False)
event_media_id = Column(Integer, ForeignKey(
'event_media.id'), nullable=True)
autoplay = Column(Boolean, nullable=True)  # NEW
loop = Column(Boolean, nullable=True)  # NEW
volume = Column(Float, nullable=True)  # NEW
slideshow_interval = Column(Integer, nullable=True)  # NEW
created_at = Column(TIMESTAMP(timezone=True),
server_default=func.current_timestamp())
updated_at = Column(TIMESTAMP(timezone=True), server_default=func.current_timestamp(
), onupdate=func.current_timestamp())
created_by = Column(Integer, ForeignKey('users.id'), nullable=False)
updated_by = Column(Integer, ForeignKey('users.id'), nullable=True)
is_active = Column(Boolean, default=True, nullable=False)
# Add relationships
academic_period = relationship(
"AcademicPeriod", foreign_keys=[academic_period_id])
event_media = relationship("EventMedia", foreign_keys=[event_media_id])
class EventMedia(Base):
__tablename__ = 'event_media'
id = Column(Integer, primary_key=True, autoincrement=True)
academic_period_id = Column(Integer, ForeignKey(
# Optional, for better organization
'academic_periods.id'), nullable=True, index=True)
media_type = Column(Enum(MediaType), nullable=False)
url = Column(String(255), nullable=False)
file_path = Column(String(255), nullable=True)
message_content = Column(Text, nullable=True)
uploaded_at = Column(TIMESTAMP, nullable=False,
default=lambda: datetime.now(timezone.utc))
# Add relationship
academic_period = relationship(
"AcademicPeriod", foreign_keys=[academic_period_id])
def to_dict(self):
return {
"id": self.id,
"academic_period_id": self.academic_period_id,
"media_type": self.media_type.value if self.media_type else None,
"url": self.url,
"file_path": self.file_path,
"message_content": self.message_content,
}
class SchoolHoliday(Base):
__tablename__ = 'school_holidays'
id = Column(Integer, primary_key=True, autoincrement=True)
name = Column(String(150), nullable=False)
start_date = Column(Date, nullable=False, index=True)
end_date = Column(Date, nullable=False, index=True)
region = Column(String(100), nullable=True, index=True)
source_file_name = Column(String(255), nullable=True)
imported_at = Column(TIMESTAMP(timezone=True),
server_default=func.current_timestamp())
__table_args__ = (
UniqueConstraint('name', 'start_date', 'end_date',
'region', name='uq_school_holidays_unique'),
)
def to_dict(self):
return {
"id": self.id,
"name": self.name,
"start_date": self.start_date.isoformat() if self.start_date else None,
"end_date": self.end_date.isoformat() if self.end_date else None,
"region": self.region,
"source_file_name": self.source_file_name,
"imported_at": self.imported_at.isoformat() if self.imported_at else None,
}
# --- Conversions: Track PPT/PPTX/ODP -> PDF processing state ---
class ConversionStatus(enum.Enum):
pending = "pending"
processing = "processing"
ready = "ready"
failed = "failed"
class Conversion(Base):
__tablename__ = 'conversions'
id = Column(Integer, primary_key=True, autoincrement=True)
# Source media to be converted
source_event_media_id = Column(
Integer,
ForeignKey('event_media.id', ondelete='CASCADE'),
nullable=False,
index=True,
)
target_format = Column(String(10), nullable=False,
index=True) # e.g. 'pdf'
# relative to server/media
target_path = Column(String(512), nullable=True)
status = Column(Enum(ConversionStatus), nullable=False,
default=ConversionStatus.pending)
file_hash = Column(String(64), nullable=False) # sha256 of source file
started_at = Column(TIMESTAMP(timezone=True), nullable=True)
completed_at = Column(TIMESTAMP(timezone=True), nullable=True)
error_message = Column(Text, nullable=True)
__table_args__ = (
# Fast lookup per media/format
Index('ix_conv_source_target', 'source_event_media_id', 'target_format'),
# Operational filtering
Index('ix_conv_status_target', 'status', 'target_format'),
# Idempotency: same source + target + file content should be unique
UniqueConstraint('source_event_media_id', 'target_format',
'file_hash', name='uq_conv_source_target_hash'),
)

29
nginx.conf Normal file
View File

@@ -0,0 +1,29 @@
events {}
http {
upstream dashboard {
server infoscreen-dashboard:80;
}
upstream infoscreen_api {
server infoscreen-api:8000;
}
server {
listen 80;
server_name _;
# Forward /api/ and /screenshots/ to the API server
location /api/ {
proxy_pass http://infoscreen_api/api/;
}
location /screenshots/ {
proxy_pass http://infoscreen_api/screenshots/;
}
# Everything else goes to the frontend
location / {
proxy_pass http://dashboard;
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;
}
}
}

47
nginx.dev.conf Normal file
View File

@@ -0,0 +1,47 @@
events {}
http {
upstream dashboard {
# Vite dev server inside the dashboard container
server infoscreen-dashboard:5173;
}
upstream infoscreen_api {
server infoscreen-api:8000;
}
server {
listen 80;
server_name _;
# Proxy /api and /screenshots to the Flask API
location /api/ {
proxy_pass http://infoscreen_api/api/;
proxy_http_version 1.1;
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;
}
location /screenshots/ {
proxy_pass http://infoscreen_api/screenshots/;
proxy_http_version 1.1;
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;
}
# Everything else to the Vite dev server
location / {
proxy_pass http://dashboard;
proxy_http_version 1.1;
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;
# WebSocket upgrade for Vite HMR
proxy_set_header Upgrade $http_upgrade;
proxy_set_header Connection "upgrade";
}
}
}

477
pptx_conversion_guide.md Normal file
View File

@@ -0,0 +1,477 @@
# Recommended Implementation: PPTX-to-PDF Conversion System
## Architecture Overview
**Asynchronous server-side conversion with database tracking**
```
User Upload → API saves PPTX + DB entry → Job in Queue
Client requests → API checks DB status → PDF ready? → Download PDF
→ Pending? → "Please wait"
→ Failed? → Retry/Error
```
## 1. Database Schema
```sql
CREATE TABLE media_files (
id UUID PRIMARY KEY,
filename VARCHAR(255),
original_path VARCHAR(512),
file_type VARCHAR(10),
mime_type VARCHAR(100),
uploaded_at TIMESTAMP,
updated_at TIMESTAMP
);
CREATE TABLE conversions (
id UUID PRIMARY KEY,
source_file_id UUID REFERENCES media_files(id) ON DELETE CASCADE,
target_format VARCHAR(10), -- 'pdf'
target_path VARCHAR(512), -- Path to generated PDF
status VARCHAR(20), -- 'pending', 'processing', 'ready', 'failed'
started_at TIMESTAMP,
completed_at TIMESTAMP,
error_message TEXT,
file_hash VARCHAR(64) -- Hash of PPTX for cache invalidation
);
CREATE INDEX idx_conversions_source ON conversions(source_file_id, target_format);
```
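The `file_hash` column is what makes the pipeline idempotent: if the same content is uploaded again, no new conversion job is needed. A sketch of that check with an in-memory stand-in for the `conversions` table (names are illustrative, not the project's API):

```python
import hashlib

# In-memory stand-in for the conversions table:
# (source_id, target_format, file_hash) -> status
conversions = {}

def enqueue_if_needed(source_id, content: bytes, target_format="pdf"):
    """Enqueue a conversion only if this exact content wasn't handled before."""
    file_hash = hashlib.sha256(content).hexdigest()
    key = (source_id, target_format, file_hash)
    if conversions.get(key) in ("pending", "processing", "ready"):
        return False  # identical content already queued or converted
    conversions[key] = "pending"
    return True

assert enqueue_if_needed("f1", b"slides-v1") is True   # first upload -> enqueue
assert enqueue_if_needed("f1", b"slides-v1") is False  # same content -> skip
assert enqueue_if_needed("f1", b"slides-v2") is True   # changed content -> new job
```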
## 2. Components
### **API Server (existing)**
- Accepts uploads
- Creates DB entries
- Enqueues jobs
- Delivers status and files
### **Background Worker (new)**
- Runs as separate process in **same container** as API
- Processes conversion jobs from queue
- Can run multiple worker instances in parallel
- Technology: Python RQ, Celery, or similar
### **Message Queue**
- Redis (recommended for start - simple, fast)
- Alternative: RabbitMQ for more features
### **Redis Container (new)**
- Separate container for Redis
- Handles job queue
- Minimal resource footprint
## 3. Detailed Workflow
### **Upload Process:**
```python
@app.post("/upload")
async def upload_file(file):
# 1. Save PPTX
file_path = save_to_disk(file)
# 2. DB entry for original file
file_record = db.create_media_file({
'filename': file.filename,
'original_path': file_path,
'file_type': 'pptx'
})
# 3. Create conversion record
conversion = db.create_conversion({
'source_file_id': file_record.id,
'target_format': 'pdf',
'status': 'pending',
'file_hash': calculate_hash(file_path)
})
# 4. Enqueue job (asynchronous!)
queue.enqueue(convert_to_pdf, conversion.id)
# 5. Return immediately to user
return {
'file_id': file_record.id,
'status': 'uploaded',
'conversion_status': 'pending'
}
```
### **Worker Process:**
```python
def convert_to_pdf(conversion_id):
conversion = db.get_conversion(conversion_id)
source_file = db.get_media_file(conversion.source_file_id)
# Status update: processing
db.update_conversion(conversion_id, {
'status': 'processing',
'started_at': now()
})
try:
# LibreOffice names the output PDF after the source file's
# basename inside --outdir, so rename it to the conversion id afterwards
subprocess.run([
'libreoffice',
'--headless',
'--convert-to', 'pdf',
'--outdir', '/data/converted/',
source_file.original_path
], check=True)
source_stem = os.path.splitext(os.path.basename(source_file.original_path))[0]
pdf_path = f"/data/converted/{conversion.id}.pdf"
os.rename(f"/data/converted/{source_stem}.pdf", pdf_path)
# Success
db.update_conversion(conversion_id, {
'status': 'ready',
'target_path': pdf_path,
'completed_at': now()
})
except Exception as e:
# Error
db.update_conversion(conversion_id, {
'status': 'failed',
'error_message': str(e),
'completed_at': now()
})
```
### **Client Download:**
```python
@app.get("/files/{file_id}/display")
async def get_display_file(file_id):
file = db.get_media_file(file_id)
# Only for PPTX: check PDF conversion
if file.file_type == 'pptx':
conversion = db.get_latest_conversion(file.id, target_format='pdf')
if not conversion:
# Shouldn't happen, but just to be safe
trigger_new_conversion(file.id)
return {'status': 'pending', 'message': 'Conversion is being created'}
if conversion.status == 'ready':
return FileResponse(conversion.target_path)
elif conversion.status == 'failed':
# Optional: Auto-retry
trigger_new_conversion(file.id)
return {'status': 'failed', 'error': conversion.error_message}
else: # pending or processing
return {'status': conversion.status, 'message': 'Please wait...'}
# Serve other file types directly
return FileResponse(file.original_path)
```
## 4. Docker Setup
```yaml
version: '3.8'
services:
# Your API Server
api:
build: ./api
command: uvicorn main:app --host 0.0.0.0 --port 8000
ports:
- "8000:8000"
volumes:
- ./data/uploads:/data/uploads
- ./data/converted:/data/converted
environment:
- REDIS_URL=redis://redis:6379
- DATABASE_URL=postgresql://postgres:password@postgres:5432/infoscreen
depends_on:
- redis
- postgres
restart: unless-stopped
# Worker (same codebase as API, different command)
worker:
build: ./api # Same build as API!
command: python worker.py # or: rq worker
volumes:
- ./data/uploads:/data/uploads
- ./data/converted:/data/converted
environment:
- REDIS_URL=redis://redis:6379
- DATABASE_URL=postgresql://postgres:password@postgres:5432/infoscreen
depends_on:
- redis
- postgres
restart: unless-stopped
# Optional: Multiple workers
deploy:
replicas: 2
# Redis - separate container
redis:
image: redis:7-alpine
volumes:
- redis-data:/data
# Optional: persistent configuration
command: redis-server --appendonly yes
restart: unless-stopped
# Your existing Postgres
postgres:
image: postgres:15
environment:
- POSTGRES_DB=infoscreen
- POSTGRES_PASSWORD=password
volumes:
- postgres-data:/var/lib/postgresql/data
restart: unless-stopped
# Optional: Redis Commander (UI for debugging)
redis-commander:
image: rediscommander/redis-commander
environment:
- REDIS_HOSTS=local:redis:6379
ports:
- "8081:8081"
depends_on:
- redis
volumes:
redis-data:
postgres-data:
```
## 5. Container Communication
Containers communicate via **Docker's internal network**:
```python
# In your API/Worker code:
import redis
# Connection to Redis
redis_client = redis.from_url('redis://redis:6379')
# ^^^^^^
# Container name = hostname in Docker network
```
Docker automatically creates DNS entries, so `redis` resolves to the Redis container.
## 6. Client Behavior (Pi5)
```python
# On the Pi5 client
def display_file(file_id, max_attempts=60):
# Poll in a loop instead of recursing (unbounded recursion
# would eventually hit Python's recursion limit)
for _ in range(max_attempts):
response = api.get(f"/files/{file_id}/display")
if response.content_type == 'application/pdf':
# PDF is ready
downloaded_pdf = download_and_display(response)
subprocess.run(['impressive', downloaded_pdf])
return
elif response.json()['status'] in ['pending', 'processing']:
# Wait and retry
show_loading_screen("Presentation is being prepared...")
time.sleep(5)
else:
# Error
show_error_screen("Error loading presentation")
return
```
## 7. Additional Features
### **Cache Invalidation on PPTX Update:**
```python
@app.put("/files/{file_id}")
async def update_file(file_id, new_file):
# Delete old conversions
db.mark_conversions_as_obsolete(file_id)
# Update file
update_media_file(file_id, new_file)
# Trigger new conversion
trigger_conversion(file_id, 'pdf')
```
### **Status API for Monitoring:**
```python
@app.get("/admin/conversions/status")
async def get_conversion_stats():
return {
'pending': db.count(status='pending'),
'processing': db.count(status='processing'),
'failed': db.count(status='failed'),
'avg_duration_seconds': db.avg_duration()
}
```
### **Cleanup Job (Cronjob):**
```python
def cleanup_old_conversions():
# Remove PDFs from deleted files
db.delete_orphaned_conversions()
# Clean up old failed conversions
db.delete_old_failed_conversions(older_than_days=7)
```
## 8. Redis Container Details
### **Why Separate Container?**
- **Separation of Concerns**: Each service has its own responsibility
- **Independent Lifecycle Management**: Redis can be restarted/updated independently
- **Better Scaling**: Redis can be moved to different hardware
- **Easier Backup**: Redis data can be backed up separately
- **Standard Docker Pattern**: Microservices architecture
### **Resource Usage:**
- RAM: ~10-50 MB for your use case
- CPU: Minimal
- Disk: Only for persistence (optional)
For 10 clients with occasional PPTX uploads, this is absolutely no problem.
## 9. Advantages of This Solution
- **Scalable**: Workers can be scaled horizontally
- **Performant**: Clients don't wait for conversion
- **Robust**: Status tracking and error handling
- **Maintainable**: Clear separation of responsibilities
- **Transparent**: Status queryable at any time
- **Efficient**: One-time conversion per file
- **Future-proof**: Easily extensible for other formats
- **Professional**: Industry-standard architecture
## 10. Migration Path
### **Phase 1 (MVP):**
- 1 worker process in API container
- Redis for queue (separate container)
- Basic DB schema
- Simple retry logic
### **Phase 2 (as needed):**
- Multiple worker instances
- Dedicated conversion service container
- Monitoring & alerting
- Prioritization logic
- Advanced caching strategies
**Start simple, scale when needed!**
## 11. Key Decisions Summary
| Aspect | Decision | Reason |
|--------|----------|--------|
| **Conversion Location** | Server-side | One conversion per file, consistent results |
| **Conversion Timing** | Asynchronous (on upload) | No client waiting time, predictable performance |
| **Data Storage** | Database-tracked | Status visibility, robust error handling |
| **Queue System** | Redis (separate container) | Standard pattern, scalable, maintainable |
| **Worker Architecture** | Background process in API container | Simple start, easy to separate later |
## 12. File Flow Diagram
```
┌─────────────┐
│ User Upload │
│ (PPTX) │
└──────┬──────┘
┌──────────────────┐
│ API Server │
│ 1. Save PPTX │
│ 2. Create DB rec │
│ 3. Enqueue job │
└──────┬───────────┘
┌──────────────────┐
│ Redis Queue │◄─────┐
└──────┬───────────┘ │
│ │
▼ │
┌──────────────────┐ │
│ Worker Process │ │
│ 1. Get job │ │
│ 2. Convert PPTX │ │
│ 3. Update DB │ │
└──────┬───────────┘ │
│ │
▼ │
┌──────────────────┐ │
│ PDF Storage │ │
└──────┬───────────┘ │
│ │
▼ │
┌──────────────────┐ │
│ Client Requests │ │
│ 1. Check DB │ │
│ 2. Download PDF │ │
│ 3. Display │──────┘
└──────────────────┘
(via impressive)
```
## 13. Implementation Checklist
### Database Setup
- [ ] Create `media_files` table
- [ ] Create `conversions` table
- [ ] Add indexes for performance
- [ ] Set up foreign key constraints
### API Changes
- [ ] Modify upload endpoint to create DB records
- [ ] Add conversion job enqueueing
- [ ] Implement file download endpoint with status checking
- [ ] Add status API for monitoring
- [ ] Implement cache invalidation on file update
### Worker Setup
- [ ] Create worker script/module
- [ ] Implement LibreOffice conversion logic
- [ ] Add error handling and retry logic
- [ ] Set up logging and monitoring
### Docker Configuration
- [ ] Add Redis container to docker-compose.yml
- [ ] Configure worker container
- [ ] Set up volume mounts for file storage
- [ ] Configure environment variables
- [ ] Set up container dependencies
### Client Updates
- [ ] Modify client to check conversion status
- [ ] Implement retry logic for pending conversions
- [ ] Add loading/waiting screens
- [ ] Implement error handling
### Testing
- [ ] Test upload → conversion → download flow
- [ ] Test multiple concurrent conversions
- [ ] Test error handling (corrupted PPTX, etc.)
- [ ] Test cache invalidation on file update
- [ ] Load test with multiple clients
### Monitoring & Operations
- [ ] Set up logging for conversions
- [ ] Implement cleanup job for old files
- [ ] Add metrics for conversion times
- [ ] Set up alerts for failed conversions
- [ ] Document backup procedures
---
**This architecture provides a solid foundation that's simple to start with but scales professionally as your needs grow!**

View File

@@ -0,0 +1,815 @@
# Recommended Implementation: PPTX-to-PDF Conversion System with Gotenberg
## Architecture Overview
**Asynchronous server-side conversion using Gotenberg with shared storage**
```
User Upload → API saves PPTX → Job in Queue → Worker calls Gotenberg API
Gotenberg converts via shared volume
Client requests → API checks DB status → PDF ready? → Download PDF from shared storage
→ Pending? → "Please wait"
→ Failed? → Retry/Error
```
## 1. Database Schema
```sql
CREATE TABLE media_files (
id UUID PRIMARY KEY,
filename VARCHAR(255),
original_path VARCHAR(512),
file_type VARCHAR(10),
mime_type VARCHAR(100),
uploaded_at TIMESTAMP,
updated_at TIMESTAMP
);
CREATE TABLE conversions (
id UUID PRIMARY KEY,
source_file_id UUID REFERENCES media_files(id) ON DELETE CASCADE,
target_format VARCHAR(10), -- 'pdf'
target_path VARCHAR(512), -- Path to generated PDF
status VARCHAR(20), -- 'pending', 'processing', 'ready', 'failed'
started_at TIMESTAMP,
completed_at TIMESTAMP,
error_message TEXT,
file_hash VARCHAR(64) -- Hash of PPTX for cache invalidation
);
CREATE INDEX idx_conversions_source ON conversions(source_file_id, target_format);
```
## 2. Components
### **API Server (existing)**
- Accepts uploads
- Creates DB entries
- Enqueues jobs
- Delivers status and files
### **Background Worker (new)**
- Runs as separate process in **same container** as API
- Processes conversion jobs from queue
- Calls Gotenberg API for conversion
- Updates database with results
- Technology: Python RQ, Celery, or similar
### **Gotenberg Container (new)**
- Dedicated conversion service
- HTTP API for document conversion
- Handles LibreOffice conversions internally
- Accesses files via shared volume
### **Message Queue**
- Redis (recommended for start - simple, fast)
- Alternative: RabbitMQ for more features
### **Redis Container (separate)**
- Handles job queue
- Minimal resource footprint
### **Shared Storage**
- Docker volume mounted to all containers that need file access
- API, Worker, and Gotenberg all access same files
- Simplifies file exchange between services
## 3. Detailed Workflow
### **Upload Process:**
```python
@app.post("/upload")
async def upload_file(file):
# 1. Save PPTX to shared volume
file_path = save_to_disk(file) # e.g., /shared/uploads/abc123.pptx
# 2. DB entry for original file
file_record = db.create_media_file({
'filename': file.filename,
'original_path': file_path,
'file_type': 'pptx'
})
# 3. Create conversion record
conversion = db.create_conversion({
'source_file_id': file_record.id,
'target_format': 'pdf',
'status': 'pending',
'file_hash': calculate_hash(file_path)
})
# 4. Enqueue job (asynchronous!)
queue.enqueue(convert_to_pdf_via_gotenberg, conversion.id)
# 5. Return immediately to user
return {
'file_id': file_record.id,
'status': 'uploaded',
'conversion_status': 'pending'
}
```
### **Worker Process (calls Gotenberg):**
```python
import requests
import os
GOTENBERG_URL = os.getenv('GOTENBERG_URL', 'http://gotenberg:3000')
def convert_to_pdf_via_gotenberg(conversion_id):
conversion = db.get_conversion(conversion_id)
source_file = db.get_media_file(conversion.source_file_id)
# Status update: processing
db.update_conversion(conversion_id, {
'status': 'processing',
'started_at': now()
})
try:
# Prepare output path
pdf_filename = f"{conversion.id}.pdf"
pdf_path = f"/shared/converted/{pdf_filename}"
# Call Gotenberg API
# Gotenberg accesses the file via shared volume
with open(source_file.original_path, 'rb') as f:
files = {
'files': (os.path.basename(source_file.original_path), f)
}
response = requests.post(
f'{GOTENBERG_URL}/forms/libreoffice/convert',
files=files,
timeout=300 # 5 minutes timeout
)
response.raise_for_status()
# Save PDF to shared volume
with open(pdf_path, 'wb') as pdf_file:
pdf_file.write(response.content)
# Success
db.update_conversion(conversion_id, {
'status': 'ready',
'target_path': pdf_path,
'completed_at': now()
})
except requests.exceptions.Timeout:
db.update_conversion(conversion_id, {
'status': 'failed',
'error_message': 'Conversion timeout after 5 minutes',
'completed_at': now()
})
except requests.exceptions.RequestException as e:
db.update_conversion(conversion_id, {
'status': 'failed',
'error_message': f'Gotenberg API error: {str(e)}',
'completed_at': now()
})
except Exception as e:
db.update_conversion(conversion_id, {
'status': 'failed',
'error_message': str(e),
'completed_at': now()
})
```
### **Note: Gotenberg and the Shared Volume**
Gotenberg's LibreOffice route only accepts files as multipart uploads, so the worker always streams the source over HTTP; the shared volume is where the source file and the resulting PDF are stored:
```python
def convert_to_pdf_via_gotenberg_shared(conversion_id):
conversion = db.get_conversion(conversion_id)
source_file = db.get_media_file(conversion.source_file_id)
db.update_conversion(conversion_id, {
'status': 'processing',
'started_at': now()
})
try:
pdf_filename = f"{conversion.id}.pdf"
pdf_path = f"/shared/converted/{pdf_filename}"
# The source is still streamed to Gotenberg over HTTP;
# the shared volume only persists the input and output files
with open(source_file.original_path, 'rb') as f:
files = {'files': f}
response = requests.post(
f'{GOTENBERG_URL}/forms/libreoffice/convert',
files=files,
timeout=300
)
response.raise_for_status()
# Write result to shared volume
with open(pdf_path, 'wb') as pdf_file:
pdf_file.write(response.content)
db.update_conversion(conversion_id, {
'status': 'ready',
'target_path': pdf_path,
'completed_at': now()
})
except Exception as e:
db.update_conversion(conversion_id, {
'status': 'failed',
'error_message': str(e),
'completed_at': now()
})
```
### **Client Download:**
```python
@app.get("/files/{file_id}/display")
async def get_display_file(file_id):
    file = db.get_media_file(file_id)
    # Only for PPTX: check the PDF conversion
    if file.file_type == 'pptx':
        conversion = db.get_latest_conversion(file.id, target_format='pdf')
        if not conversion:
            # Shouldn't happen, but just to be safe
            trigger_new_conversion(file.id)
            return {'status': 'pending', 'message': 'Conversion is being created'}
        if conversion.status == 'ready':
            # Serve the PDF from shared storage
            return FileResponse(conversion.target_path)
        elif conversion.status == 'failed':
            # Optional: auto-retry
            trigger_new_conversion(file.id)
            return {'status': 'failed', 'error': conversion.error_message}
        else:  # pending or processing
            return {'status': conversion.status, 'message': 'Please wait...'}
    # Serve other file types directly
    return FileResponse(file.original_path)
```
## 4. Docker Setup
```yaml
version: '3.8'

services:
  # Your API server
  api:
    build: ./api
    command: uvicorn main:app --host 0.0.0.0 --port 8000
    ports:
      - "8000:8000"
    volumes:
      - shared-storage:/shared  # Shared volume
    environment:
      - REDIS_URL=redis://redis:6379
      - DATABASE_URL=postgresql://postgres:password@postgres:5432/infoscreen
      - GOTENBERG_URL=http://gotenberg:3000
    depends_on:
      - redis
      - postgres
      - gotenberg
    restart: unless-stopped

  # Worker (same codebase as the API, different command)
  worker:
    build: ./api  # Same build as the API!
    command: python worker.py  # or: rq worker
    volumes:
      - shared-storage:/shared  # Shared volume
    environment:
      - REDIS_URL=redis://redis:6379
      - DATABASE_URL=postgresql://postgres:password@postgres:5432/infoscreen
      - GOTENBERG_URL=http://gotenberg:3000
    depends_on:
      - redis
      - postgres
      - gotenberg
    restart: unless-stopped
    # Optional: multiple workers
    deploy:
      replicas: 2

  # Gotenberg - document conversion service
  gotenberg:
    image: gotenberg/gotenberg:8
    # Gotenberg doesn't need the shared volume if files are sent via HTTP,
    # but mount it if you want direct file access
    volumes:
      - shared-storage:/shared  # Optional: for direct file access
    environment:
      # Gotenberg configuration
      - GOTENBERG_API_TIMEOUT=300s
      - GOTENBERG_LOG_LEVEL=info
    restart: unless-stopped
    # Resource limits (optional but recommended)
    deploy:
      resources:
        limits:
          cpus: '2.0'
          memory: 2G
        reservations:
          cpus: '0.5'
          memory: 512M

  # Redis - separate container
  redis:
    image: redis:7-alpine
    volumes:
      - redis-data:/data
    command: redis-server --appendonly yes
    restart: unless-stopped

  # Your existing Postgres
  postgres:
    image: postgres:15
    environment:
      - POSTGRES_DB=infoscreen
      - POSTGRES_PASSWORD=password
    volumes:
      - postgres-data:/var/lib/postgresql/data
    restart: unless-stopped

  # Optional: Redis Commander (UI for debugging)
  redis-commander:
    image: rediscommander/redis-commander
    environment:
      - REDIS_HOSTS=local:redis:6379
    ports:
      - "8081:8081"
    depends_on:
      - redis

volumes:
  shared-storage:  # New: shared storage for all file operations
  redis-data:
  postgres-data:
```
## 5. Storage Structure
```
/shared/
├── uploads/ # Original uploaded files (PPTX, etc.)
│ ├── abc123.pptx
│ ├── def456.pptx
│ └── ...
└── converted/ # Converted PDF files
├── uuid-1.pdf
├── uuid-2.pdf
└── ...
```
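This layout can be captured in a few path helpers. The sketch below is illustrative: the `/shared` root mirrors the compose volume mount, while `sanitized_name` and the random upload prefix are assumptions rather than part of the original design:

```python
import os
import re
import uuid

SHARED_ROOT = "/shared"  # assumed to match the compose volume mount

def sanitized_name(filename):
    """Keep only the base name and a safe character set - blocks path traversal."""
    base = os.path.basename(filename)
    return re.sub(r"[^A-Za-z0-9._-]", "_", base)

def upload_path(filename):
    """Originals get a random prefix so identical names cannot collide."""
    return os.path.join(SHARED_ROOT, "uploads",
                        f"{uuid.uuid4().hex}_{sanitized_name(filename)}")

def converted_path(conversion_id):
    """Converted PDFs are named after the conversion row, as in the worker code."""
    return os.path.join(SHARED_ROOT, "converted", f"{conversion_id}.pdf")
```

Resolving every path through helpers like these keeps client-supplied names from ever escaping the media root.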
## 6. Gotenberg Integration Details
### **Gotenberg API Endpoints:**
Gotenberg provides various conversion endpoints:
```python
# LibreOffice conversion (for PPTX, DOCX, ODT, etc.)
POST http://gotenberg:3000/forms/libreoffice/convert
# HTML to PDF
POST http://gotenberg:3000/forms/chromium/convert/html
# Markdown to PDF
POST http://gotenberg:3000/forms/chromium/convert/markdown
# Merge PDFs
POST http://gotenberg:3000/forms/pdfengines/merge
```
### **Example Conversion Request:**
```python
import os

import requests

def convert_with_gotenberg(input_file_path, output_file_path):
    """Convert a document to PDF using Gotenberg."""
    with open(input_file_path, 'rb') as f:
        files = {
            'files': (os.path.basename(input_file_path), f,
                      'application/vnd.openxmlformats-officedocument.presentationml.presentation')
        }
        # Optional: add conversion parameters
        data = {
            'landscape': 'false',      # Portrait mode
            'nativePageRanges': '1-',  # All pages
        }
        response = requests.post(
            'http://gotenberg:3000/forms/libreoffice/convert',
            files=files,
            data=data,
            timeout=300
        )
    if response.status_code == 200:
        with open(output_file_path, 'wb') as out:
            out.write(response.content)
        return True
    else:
        raise Exception(f"Gotenberg error: {response.status_code} - {response.text}")
```
### **Advanced Options:**
```python
# With custom PDF properties
data = {
    'landscape': 'false',
    'nativePageRanges': '1-10',  # Only the first 10 pages
    'pdfFormat': 'PDF/A-1a',     # PDF/A format
    'exportFormFields': 'false',
}

# With password protection
data = {
    'userPassword': 'secret123',
    'ownerPassword': 'admin456',
}
```
## 7. Client Behavior (Pi5)
```python
# On the Pi5 client
def display_file(file_id):
    response = api.get(f"/files/{file_id}/display")
    if response.content_type == 'application/pdf':
        # PDF is ready
        pdf_path = download_pdf(response)
        subprocess.run(['impressive', pdf_path])
    elif response.json()['status'] in ['pending', 'processing']:
        # Wait and retry
        show_loading_screen("Presentation is being prepared...")
        time.sleep(5)
        display_file(file_id)  # Retry
    else:
        # Error
        show_error_screen("Error loading presentation")
```
## 8. Additional Features
### **Cache Invalidation on PPTX Update:**
```python
@app.put("/files/{file_id}")
async def update_file(file_id, new_file):
    # Delete old conversions and PDFs
    conversions = db.get_conversions_for_file(file_id)
    for conv in conversions:
        if conv.target_path and os.path.exists(conv.target_path):
            os.remove(conv.target_path)
    db.mark_conversions_as_obsolete(file_id)
    # Update the file
    update_media_file(file_id, new_file)
    # Trigger a new conversion
    trigger_conversion(file_id, 'pdf')
```
### **Status API for Monitoring:**
```python
@app.get("/admin/conversions/status")
async def get_conversion_stats():
    return {
        'pending': db.count(status='pending'),
        'processing': db.count(status='processing'),
        'failed': db.count(status='failed'),
        'avg_duration_seconds': db.avg_duration(),
        'gotenberg_health': check_gotenberg_health()
    }

def check_gotenberg_health():
    try:
        response = requests.get(
            f'{GOTENBERG_URL}/health',
            timeout=5
        )
        return response.status_code == 200
    except requests.RequestException:
        return False
```
### **Cleanup Job (Cronjob):**
```python
def cleanup_old_conversions():
    # Remove PDFs belonging to deleted files
    orphaned = db.get_orphaned_conversions()
    for conv in orphaned:
        if conv.target_path and os.path.exists(conv.target_path):
            os.remove(conv.target_path)
        db.delete_conversion(conv.id)
    # Clean up old failed conversions
    old_failed = db.get_old_failed_conversions(older_than_days=7)
    for conv in old_failed:
        db.delete_conversion(conv.id)
```
## 9. Advantages of Using Gotenberg
- **Specialized Service**: Optimized specifically for document conversion
- **No LibreOffice Management**: Gotenberg handles the LibreOffice lifecycle internally
- **Better Resource Management**: Isolated conversion process
- **HTTP API**: Clean, standard interface
- **Production Ready**: Battle-tested, actively maintained
- **Multiple Formats**: Supports PPTX, DOCX, ODT, HTML, Markdown, and more
- **PDF Features**: Merge, encrypt, and watermark PDFs
- **Health Checks**: Built-in health endpoint
- **Horizontal Scaling**: Can run multiple Gotenberg instances
- **Memory Safe**: Automatic cleanup and restart on issues
## 10. Migration Path
### **Phase 1 (MVP):**
- 1 worker process in API container
- Redis for queue (separate container)
- Gotenberg for conversion (separate container)
- Basic DB schema
- Shared volume for file exchange
- Simple retry logic
### **Phase 2 (as needed):**
- Multiple worker instances
- Multiple Gotenberg instances (load balancing)
- Monitoring & alerting
- Prioritization logic
- Advanced caching strategies
- PDF optimization/compression
**Start simple, scale when needed!**
## 11. Key Decisions Summary
| Aspect | Decision | Reason |
|--------|----------|--------|
| **Conversion Location** | Server-side (Gotenberg) | One conversion per file, consistent results |
| **Conversion Service** | Dedicated Gotenberg container | Specialized, production-ready, better isolation |
| **Conversion Timing** | Asynchronous (on upload) | No client waiting time, predictable performance |
| **Data Storage** | Database-tracked | Status visibility, robust error handling |
| **File Exchange** | Shared Docker volume | Simple, efficient, no network overhead |
| **Queue System** | Redis (separate container) | Standard pattern, scalable, maintainable |
| **Worker Architecture** | Background process in API container | Simple start, easy to separate later |
## 12. File Flow Diagram
```
┌─────────────┐
│ User Upload │
│   (PPTX)    │
└──────┬──────┘
       ▼
┌──────────────────────┐
│ API Server           │
│ 1. Save to /shared   │
│ 2. Create DB record  │
│ 3. Enqueue job       │
└──────┬───────────────┘
       ▼
┌──────────────────┐
│ Redis Queue      │
└──────┬───────────┘
       ▼
┌──────────────────────┐
│ Worker Process       │
│ 1. Get job           │
│ 2. Call Gotenberg    │
│ 3. Update DB         │
└──────┬───────────────┘
       ▼
┌──────────────────────┐
│ Gotenberg            │
│ 1. Read from /shared │
│ 2. Convert PPTX      │
│ 3. Return PDF        │
└──────┬───────────────┘
       ▼
┌──────────────────────┐
│ Worker saves PDF     │
│ to /shared/converted │
└──────┬───────────────┘
       ▼
┌──────────────────────┐
│ Client Requests      │
│ 1. Check DB          │
│ 2. Download PDF      │
│ 3. Display           │
│    (via impressive)  │
└──────────────────────┘
```
## 13. Implementation Checklist
### Database Setup
- [ ] Create `media_files` table
- [ ] Create `conversions` table
- [ ] Add indexes for performance
- [ ] Set up foreign key constraints
### Storage Setup
- [ ] Create shared Docker volume
- [ ] Set up directory structure (/shared/uploads, /shared/converted)
- [ ] Configure proper permissions
### API Changes
- [ ] Modify upload endpoint to save to shared storage
- [ ] Create DB records for uploads
- [ ] Add conversion job enqueueing
- [ ] Implement file download endpoint with status checking
- [ ] Add status API for monitoring
- [ ] Implement cache invalidation on file update
### Worker Setup
- [ ] Create worker script/module
- [ ] Implement Gotenberg API calls
- [ ] Add error handling and retry logic
- [ ] Set up logging and monitoring
- [ ] Handle timeouts and failures
### Docker Configuration
- [ ] Add Gotenberg container to docker-compose.yml
- [ ] Add Redis container to docker-compose.yml
- [ ] Configure worker container
- [ ] Set up shared volume mounts
- [ ] Configure environment variables
- [ ] Set up container dependencies
- [ ] Configure resource limits for Gotenberg
### Client Updates
- [ ] Modify client to check conversion status
- [ ] Implement retry logic for pending conversions
- [ ] Add loading/waiting screens
- [ ] Implement error handling
### Testing
- [ ] Test upload → conversion → download flow
- [ ] Test multiple concurrent conversions
- [ ] Test error handling (corrupted PPTX, etc.)
- [ ] Test Gotenberg timeout handling
- [ ] Test cache invalidation on file update
- [ ] Load test with multiple clients
- [ ] Test Gotenberg health checks
### Monitoring & Operations
- [ ] Set up logging for conversions
- [ ] Monitor Gotenberg health endpoint
- [ ] Implement cleanup job for old files
- [ ] Add metrics for conversion times
- [ ] Set up alerts for failed conversions
- [ ] Monitor shared storage disk usage
- [ ] Document backup procedures
### Security
- [ ] Validate file types before conversion
- [ ] Set file size limits
- [ ] Sanitize filenames
- [ ] Implement rate limiting
- [ ] Secure inter-container communication
## 14. Gotenberg Configuration Options
### **Environment Variables:**
```yaml
gotenberg:
  image: gotenberg/gotenberg:8
  environment:
    # API configuration
    - GOTENBERG_API_TIMEOUT=300s
    - GOTENBERG_API_PORT=3000
    # Logging
    - GOTENBERG_LOG_LEVEL=info  # debug, info, warn, error
    # LibreOffice
    - GOTENBERG_LIBREOFFICE_DISABLE_ROUTES=false
    - GOTENBERG_LIBREOFFICE_AUTO_START=true
    # Chromium (if needed for HTML/Markdown)
    - GOTENBERG_CHROMIUM_DISABLE_ROUTES=true  # Disable if not needed
    # Resource limits
    - GOTENBERG_LIBREOFFICE_MAX_QUEUE_SIZE=100
```
### **Custom Gotenberg Configuration:**
For advanced configurations, create a `gotenberg.yml`:
```yaml
api:
  timeout: 300s
  port: 3000

libreoffice:
  autoStart: true
  maxQueueSize: 100

chromium:
  disableRoutes: true
```
Mount it in docker-compose:
```yaml
gotenberg:
  image: gotenberg/gotenberg:8
  volumes:
    - ./gotenberg.yml:/etc/gotenberg/config.yml:ro
    - shared-storage:/shared
```
## 15. Troubleshooting
### **Common Issues:**
**Gotenberg timeout:**
```python
# Increase the timeout for large files
response = requests.post(
    f'{GOTENBERG_URL}/forms/libreoffice/convert',
    files=files,
    timeout=600  # 10 minutes for large PPTX files
)
```
**Memory issues:**
```yaml
# Increase the Gotenberg memory limit
gotenberg:
  deploy:
    resources:
      limits:
        memory: 4G
```
**File permission issues:**
```bash
# Ensure proper permissions on shared volume
chmod -R 755 /shared
chown -R 1000:1000 /shared
```
**Gotenberg not responding:**
```python
# Check health before converting
def ensure_gotenberg_healthy():
    try:
        response = requests.get(f'{GOTENBERG_URL}/health', timeout=5)
        if response.status_code != 200:
            raise Exception("Gotenberg unhealthy")
    except Exception as e:
        logger.error(f"Gotenberg health check failed: {e}")
        raise
```
---
**This architecture provides a production-ready, scalable solution using Gotenberg as a specialized conversion service with efficient file sharing via Docker volumes!**
## 16. Best Practices Specific to Infoscreen
- Idempotency by content: Always compute a SHA256 of the uploaded source and include it in the unique key (source_event_media_id, target_format, file_hash). This prevents duplicate work for identical content and auto-busts cache on change.
- Strict MIME/type validation: Accept only .ppt, .pptx, .odp for conversion. Reject unknown types early. Consider reading the first bytes (magic) for extra safety.
- Bounded retries with jitter: Retry conversions on transient HTTP 5xx or timeouts up to N times with exponential backoff. Do not retry on 4xx or clear user errors.
- Output naming: Derive deterministic output paths under media/converted/, e.g., <basename>.pdf. Ensure no path traversal and sanitize names.
- Timeouts and size limits: Enforce server-side max upload size and per-job conversion timeout (e.g., 10 minutes). Return clear errors for oversized/long-running files.
- Isolation and quotas: Set CPU/memory limits for Gotenberg; consider a concurrency cap per worker to avoid DB starvation.
- Health probes before work: Check Gotenberg /health prior to enqueue spikes; fail-fast to avoid queue pile-ups when Gotenberg is down.
- Observability: Log job IDs, file hashes, durations, and sizes. Expose a small /api/conversions/status summary for operational visibility.
- Cleanup policy: Periodically delete orphaned conversions (media deleted) and failed jobs older than X days. Keep successful PDFs aligned with DB rows.
- Security: Never trust client paths; always resolve relative to the known media root. Do not expose the shared volume directly; serve via API only.
- Backpressure: If queue length exceeds a threshold, surface 503/“try later” on new uploads or pause enqueue to protect the system.
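The retry bullet above can be sketched as two small helpers - retry only on timeouts and HTTP 5xx, never on 4xx client errors, with full-jitter exponential backoff. Function names and defaults here are illustrative assumptions, not part of the existing codebase:

```python
import random

def is_retryable(status_code=None, timed_out=False):
    """Transient failures only: timeouts and HTTP 5xx. 4xx client errors are final."""
    if timed_out:
        return True
    return status_code is not None and 500 <= status_code < 600

def backoff_delay(attempt, base=1.0, cap=60.0, rng=random.random):
    """Full-jitter exponential backoff: uniform in [0, min(cap, base * 2**attempt))."""
    return rng() * min(cap, base * (2 ** attempt))
```

A worker would loop up to N attempts, sleeping `backoff_delay(attempt)` between tries and giving up immediately when `is_retryable` returns False.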
8
scheduler/Dockerfile Normal file
View File
@@ -0,0 +1,8 @@
FROM python:3.13-slim
WORKDIR /app
COPY scheduler/requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY scheduler/ ./scheduler
COPY models/ ./models
ENV PYTHONPATH=/app
CMD ["python", "-m", "scheduler.scheduler"]
113
scheduler/db_utils.py Normal file
View File
@@ -0,0 +1,113 @@
# scheduler/db_utils.py
from dotenv import load_dotenv
import os
from datetime import datetime
from sqlalchemy.orm import sessionmaker, joinedload
from sqlalchemy import create_engine
from models.models import Event, EventMedia

load_dotenv('/workspace/.env')

# DB URL from environment variable, with a fallback
DB_CONN = os.environ.get("DB_CONN", "mysql+pymysql://user:password@db/dbname")
engine = create_engine(DB_CONN)
Session = sessionmaker(bind=engine)

# Base URL from .env for file URLs
API_BASE_URL = os.environ.get("API_BASE_URL", "http://server:8000")


def get_active_events(start: datetime, end: datetime, group_id: int = None):
    session = Session()
    try:
        # Now this will work with the relationship defined
        query = session.query(Event).options(
            joinedload(Event.event_media)
        ).filter(Event.is_active == True)
        if start and end:
            query = query.filter(Event.start < end, Event.end > start)
        if group_id:
            query = query.filter(Event.group_id == group_id)
        events = query.all()
        formatted_events = []
        for event in events:
            formatted_event = format_event_with_media(event)
            formatted_events.append(formatted_event)
        return formatted_events
    finally:
        session.close()


def format_event_with_media(event):
    """Transform Event + EventMedia into the client-expected format"""
    event_dict = {
        "id": event.id,
        "title": event.title,
        "start": str(event.start),
        "end": str(event.end),
        "group_id": event.group_id,
    }
    # Now you can directly access event.event_media
    import logging
    if event.event_media:
        media = event.event_media
        if event.event_type.value == "presentation":
            event_dict["presentation"] = {
                "type": "slideshow",
                "files": [],
                "slide_interval": event.slideshow_interval or 5000,
                "auto_advance": True
            }
            # Debug: log media_type
            logging.debug(
                f"[Scheduler] EventMedia id={media.id} media_type={getattr(media.media_type, 'value', str(media.media_type))}")
            # Check for a PDF conversion for ppt/pptx/odp
            from sqlalchemy.orm import scoped_session
            from models.models import Conversion, ConversionStatus
            session = scoped_session(Session)
            pdf_url = None
            if getattr(media.media_type, 'value', str(media.media_type)) in ("ppt", "pptx", "odp"):
                conversion = session.query(Conversion).filter_by(
                    source_event_media_id=media.id,
                    target_format="pdf",
                    status=ConversionStatus.ready
                ).order_by(Conversion.completed_at.desc()).first()
                logging.debug(
                    f"[Scheduler] Conversion lookup for media_id={media.id}: found={bool(conversion)}, path={getattr(conversion, 'target_path', None) if conversion else None}")
                if conversion and conversion.target_path:
                    # Serve via /api/files/converted/<path>
                    pdf_url = f"{API_BASE_URL}/api/files/converted/{conversion.target_path}"
            session.remove()
            if pdf_url:
                filename = os.path.basename(pdf_url)
                event_dict["presentation"]["files"].append({
                    "name": filename,
                    "url": pdf_url,
                    "checksum": None,
                    "size": None
                })
                logging.info(
                    f"[Scheduler] Using converted PDF for event_media_id={media.id}: {pdf_url}")
            elif media.file_path:
                filename = os.path.basename(media.file_path)
                event_dict["presentation"]["files"].append({
                    "name": filename,
                    "url": f"{API_BASE_URL}/api/files/{media.id}/(unknown)",
                    "checksum": None,
                    "size": None
                })
                logging.info(
                    f"[Scheduler] Using original file for event_media_id={media.id}: (unknown)")
    # Add other event types...
    return event_dict
4
scheduler/requirements.txt Normal file
View File
@@ -0,0 +1,4 @@
paho-mqtt
sqlalchemy
pymysql
python-dotenv
112
scheduler/scheduler.py Normal file
View File
@@ -0,0 +1,112 @@
# scheduler/scheduler.py
import os
import logging
from .db_utils import get_active_events
import paho.mqtt.client as mqtt
import json
import datetime
import time

# Logging configuration
ENV = os.getenv("ENV", "development")
LOG_LEVEL = os.getenv("LOG_LEVEL", "DEBUG" if ENV == "development" else "INFO")
LOG_PATH = os.path.join(os.path.dirname(__file__), "scheduler.log")
os.makedirs(os.path.dirname(LOG_PATH), exist_ok=True)
log_handlers = []
if ENV == "production":
    from logging.handlers import RotatingFileHandler
    log_handlers.append(RotatingFileHandler(
        LOG_PATH, maxBytes=2*1024*1024, backupCount=5, encoding="utf-8"))
else:
    log_handlers.append(logging.FileHandler(LOG_PATH, encoding="utf-8"))
if os.getenv("DEBUG_MODE", "1" if ENV == "development" else "0") in ("1", "true", "True"):
    log_handlers.append(logging.StreamHandler())
logging.basicConfig(
    level=getattr(logging, LOG_LEVEL.upper(), logging.INFO),
    format="%(asctime)s [%(levelname)s] %(message)s",
    handlers=log_handlers
)


def main():
    # Fix for the deprecated API - set callback_api_version explicitly
    client = mqtt.Client(callback_api_version=mqtt.CallbackAPIVersion.VERSION2)
    client.reconnect_delay_set(min_delay=1, max_delay=30)
    POLL_INTERVAL = 30  # seconds; recommended for infrequent changes
    # 0 = off; e.g. 600 for every 10 minutes
    REFRESH_SECONDS = int(os.getenv("REFRESH_SECONDS", "0"))
    last_payloads = {}  # group_id -> payload
    last_published_at = {}  # group_id -> epoch seconds

    # Re-send all known retained payloads on (re)connect
    def on_connect(client, userdata, flags, reasonCode, properties=None):
        logging.info(
            f"MQTT connected (reasonCode={reasonCode}) - republishing {len(last_payloads)} groups")
        for gid, payload in last_payloads.items():
            topic = f"infoscreen/events/{gid}"
            client.publish(topic, payload, retain=True)

    client.on_connect = on_connect
    client.connect("mqtt", 1883)
    client.loop_start()
    while True:
        now = datetime.datetime.now(datetime.timezone.utc)
        # Fetch all active events (already formatted dictionaries)
        events = get_active_events(now, now)
        # Group events by group_id
        groups = {}
        for event in events:
            gid = event.get("group_id")
            if gid not in groups:
                groups[gid] = []
            # The event is already a dictionary in the expected format
            groups[gid].append(event)
        # Publish the event list per group as a retained message, only on change
        for gid, event_list in groups.items():
            # Stable ordering to avoid unnecessary publishes
            event_list.sort(key=lambda e: (e.get("start"), e.get("id")))
            payload = json.dumps(
                event_list, sort_keys=True, separators=(",", ":"))
            topic = f"infoscreen/events/{gid}"
            should_send = (last_payloads.get(gid) != payload)
            if not should_send and REFRESH_SECONDS:
                last_ts = last_published_at.get(gid, 0)
                if time.time() - last_ts >= REFRESH_SECONDS:
                    should_send = True
            if should_send:
                result = client.publish(topic, payload, retain=True)
                if result.rc != mqtt.MQTT_ERR_SUCCESS:
                    logging.error(
                        f"Publish failed for group {gid}: {mqtt.error_string(result.rc)}")
                else:
                    logging.info(f"Events for group {gid} published")
                    last_payloads[gid] = payload
                    last_published_at[gid] = time.time()
        # Remove groups that no longer exist (publish an empty retained message)
        for gid in list(last_payloads.keys()):
            if gid not in groups:
                topic = f"infoscreen/events/{gid}"
                result = client.publish(topic, payload="[]", retain=True)
                if result.rc != mqtt.MQTT_ERR_SUCCESS:
                    logging.error(
                        f"Removal failed for group {gid}: {mqtt.error_string(result.rc)}")
                else:
                    logging.info(
                        f"Events for group {gid} removed (empty retained message sent)")
                    del last_payloads[gid]
                    last_published_at.pop(gid, None)
        time.sleep(POLL_INTERVAL)


if __name__ == "__main__":
    main()
58
server-setup-script.sh Executable file
View File
@@ -0,0 +1,58 @@
#!/bin/bash
# Ubuntu Server Setup Script for Infoscreen Development
set -e
echo "🚀 Setting up Ubuntu Server for Infoscreen Development..."
# Update system
sudo apt update && sudo apt upgrade -y
# Install essential tools
sudo apt install -y git curl wget vim nano htop
# Install Docker (official installation method)
echo "📦 Installing Docker..."
# Remove old Docker packages
sudo apt remove -y docker docker-engine docker.io containerd runc 2>/dev/null || true
# Install Docker dependencies
sudo apt install -y ca-certificates curl gnupg lsb-release
# Add Docker GPG key
sudo mkdir -p /etc/apt/keyrings
curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo gpg --dearmor -o /etc/apt/keyrings/docker.gpg
# Add Docker repository
echo \
"deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/docker.gpg] https://download.docker.com/linux/ubuntu \
$(lsb_release -cs) stable" | sudo tee /etc/apt/sources.list.d/docker.list > /dev/null
# Install Docker
sudo apt update
sudo apt install -y docker-ce docker-ce-cli containerd.io docker-buildx-plugin docker-compose-plugin
# Start and enable Docker
sudo systemctl enable docker
sudo systemctl start docker
# Add user to docker group
sudo usermod -aG docker $USER
# Install Node.js (for dashboard development)
echo "📦 Installing Node.js..."
curl -fsSL https://deb.nodesource.com/setup_lts.x | sudo -E bash -
sudo apt install -y nodejs
# Verify installations
echo "✅ Verifying installations..."
echo "Docker version: $(docker --version)"
echo "Docker Compose version: $(docker compose version)"
echo "Node.js version: $(node --version)"
echo "npm version: $(npm --version)"
echo ""
echo "🎉 Server setup complete!"
echo "⚠️ Please log out and log back in for Docker group changes to take effect."
echo "💡 You can now clone your repository and start development."
61
server/Dockerfile Normal file
View File
@@ -0,0 +1,61 @@
# server/Dockerfile
# 🔧 OPTIMIZED: multi-stage build for a minimal and secure production image

# Stage 1: builder - installs dependencies
FROM python:3.13-slim AS builder
WORKDIR /app

# Install only the system dependencies needed for the build
RUN apt-get update \
    && apt-get install -y --no-install-recommends \
    libmariadb-dev-compat libmariadb-dev gcc \
    && rm -rf /var/lib/apt/lists/*

# Copy only requirements.txt to make optimal use of the Docker cache
COPY /server/requirements.txt .

# Install the Python packages into a separate directory
RUN pip install --no-cache-dir --prefix="/install" -r requirements.txt

# Stage 2: final - the actual production image
FROM python:3.13-slim

# --- Working directory ---
WORKDIR /app

# Install only the system dependencies needed at runtime
RUN apt-get update \
    && apt-get install -y --no-install-recommends \
    libmariadb-dev-compat locales curl \
    && rm -rf /var/lib/apt/lists/*

# --- Configure locale ---
RUN sed -i 's/# de_DE.UTF-8 UTF-8/de_DE.UTF-8 UTF-8/' /etc/locale.gen \
    && locale-gen
ENV LANG=de_DE.UTF-8 \
    LC_ALL=de_DE.UTF-8

# Copy the installed packages from the builder stage
COPY --from=builder /install /usr/local

# --- Application code ---
# Copy the server code into the working directory
COPY server/ ./server
COPY models/ ./models

# --- Create a non-root user and set permissions ---
ARG USER_ID=1000
ARG GROUP_ID=1000
RUN groupadd -g ${GROUP_ID} infoscreen_taa \
    && useradd -u ${USER_ID} -g ${GROUP_ID} \
    --shell /bin/bash --create-home infoscreen_taa \
    && chown -R infoscreen_taa:infoscreen_taa /app
USER infoscreen_taa

# --- Expose the API port ---
EXPOSE 8000

# --- Gunicorn start command ---
CMD ["gunicorn", "-w", "4", "-b", "0.0.0.0:8000", "server.wsgi:app"]
42
server/Dockerfile.dev Normal file
View File
@@ -0,0 +1,42 @@
# File: server/Dockerfile.dev
# 🔧 OPTIMIZED: for development inside the dev container
# ==========================================
FROM python:3.13-slim

# Creating the non-root user and configuring the locale are not strictly
# required for the dev container, since VS Code connects as 'root'
# (per devcontainer.json). They do no harm, though.
ARG USER_ID=1000
ARG GROUP_ID=1000
RUN apt-get update && apt-get install -y --no-install-recommends locales curl git \
    && groupadd -g ${GROUP_ID} infoscreen_taa \
    && useradd -u ${USER_ID} -g ${GROUP_ID} --shell /bin/bash --create-home infoscreen_taa \
    && sed -i 's/# de_DE.UTF-8 UTF-8/de_DE.UTF-8 UTF-8/' /etc/locale.gen \
    && locale-gen \
    && rm -rf /var/lib/apt/lists/*
ENV LANG=de_DE.UTF-8 \
    LC_ALL=de_DE.UTF-8

# Set the working directory to the workspace root, matching the mounts.
WORKDIR /app

# Copy the requirements files into the correct subdirectory.
# ✅ FIXED: paths are now relative to the build context (the 'server' directory)
COPY server/requirements.txt server/requirements-dev.txt ./server/

# Install the Python dependencies
RUN pip install --upgrade pip \
    && pip install --no-cache-dir -r server/requirements.txt \
    && pip install --no-cache-dir -r server/requirements-dev.txt

# Copying the code is not necessary because the directory is mounted.

# Expose the ports for the Flask API and the debugger
EXPOSE 8000 5678

# The start command is defined in docker-compose.override.yml.
# A default CMD serves as a fallback.
CMD ["python", "-m", "debugpy", "--listen", "0.0.0.0:5678", "-m", "flask", "run", "--host=0.0.0.0", "--port=8000"]
8
server/__init__.py Normal file
View File
@@ -0,0 +1,8 @@
"""Server package initializer.
Expose submodules required by external importers (e.g., RQ string paths).
"""
# Ensure 'server.worker' is available as an attribute of the 'server' package
# so that RQ can resolve 'server.worker.convert_event_media_to_pdf'.
from . import worker # noqa: F401
141
server/alembic.ini Normal file
View File
@@ -0,0 +1,141 @@
# A generic, single database configuration.
[alembic]
# path to migration scripts.
# this is typically a path given in POSIX (e.g. forward slashes)
# format, relative to the token %(here)s which refers to the location of this
# ini file
script_location = %(here)s/alembic
# template used to generate migration file names; The default value is %%(rev)s_%%(slug)s
# Uncomment the line below if you want the files to be prepended with date and time
# see https://alembic.sqlalchemy.org/en/latest/tutorial.html#editing-the-ini-file
# for all available tokens
# file_template = %%(year)d_%%(month).2d_%%(day).2d_%%(hour).2d%%(minute).2d-%%(rev)s_%%(slug)s
# sys.path path, will be prepended to sys.path if present.
# defaults to the current working directory. for multiple paths, the path separator
# is defined by "path_separator" below.
prepend_sys_path = .
# timezone to use when rendering the date within the migration file
# as well as the filename.
# If specified, requires the python>=3.9 or backports.zoneinfo library and tzdata library.
# Any required deps can installed by adding `alembic[tz]` to the pip requirements
# string value is passed to ZoneInfo()
# leave blank for localtime
# timezone =
# max length of characters to apply to the "slug" field
# truncate_slug_length = 40
# set to 'true' to run the environment during
# the 'revision' command, regardless of autogenerate
# revision_environment = false
# set to 'true' to allow .pyc and .pyo files without
# a source .py file to be detected as revisions in the
# versions/ directory
# sourceless = false
# version location specification; This defaults
# to <script_location>/versions. When using multiple version
# directories, initial revisions must be specified with --version-path.
# The path separator used here should be the separator specified by "path_separator"
# below.
# version_locations = %(here)s/bar:%(here)s/bat:%(here)s/alembic/versions
# path_separator; This indicates what character is used to split lists of file
# paths, including version_locations and prepend_sys_path within configparser
# files such as alembic.ini.
# The default rendered in new alembic.ini files is "os", which uses os.pathsep
# to provide os-dependent path splitting.
#
# Note that in order to support legacy alembic.ini files, this default does NOT
# take place if path_separator is not present in alembic.ini. If this
# option is omitted entirely, fallback logic is as follows:
#
# 1. Parsing of the version_locations option falls back to using the legacy
# "version_path_separator" key, which if absent then falls back to the legacy
# behavior of splitting on spaces and/or commas.
# 2. Parsing of the prepend_sys_path option falls back to the legacy
# behavior of splitting on spaces, commas, or colons.
#
# Valid values for path_separator are:
#
# path_separator = :
# path_separator = ;
# path_separator = space
# path_separator = newline
#
# Use os.pathsep. Default configuration used for new projects.
path_separator = os
# set to 'true' to search source files recursively
# in each "version_locations" directory
# new in Alembic version 1.10
# recursive_version_locations = false
# the output encoding used when revision files
# are written from script.py.mako
# output_encoding = utf-8
# database URL. This is consumed by the user-maintained env.py script only.
# other means of configuring database URLs may be customized within the env.py
# file.
# sqlalchemy.url = driver://user:pass@localhost/dbname
[post_write_hooks]
# post_write_hooks defines scripts or Python functions that are run
# on newly generated revision scripts. See the documentation for further
# detail and examples
# format using "black" - use the console_scripts runner, against the "black" entrypoint
# hooks = black
# black.type = console_scripts
# black.entrypoint = black
# black.options = -l 79 REVISION_SCRIPT_FILENAME
# lint with attempts to fix using "ruff" - use the exec runner, execute a binary
# hooks = ruff
# ruff.type = exec
# ruff.executable = %(here)s/.venv/bin/ruff
# ruff.options = check --fix REVISION_SCRIPT_FILENAME
# Logging configuration. This is also consumed by the user-maintained
# env.py script only.
[loggers]
keys = root,sqlalchemy,alembic
[handlers]
keys = console
[formatters]
keys = generic
[logger_root]
level = WARNING
handlers = console
qualname =
[logger_sqlalchemy]
level = WARNING
handlers =
qualname = sqlalchemy.engine
[logger_alembic]
level = INFO
handlers =
qualname = alembic
[handler_console]
class = StreamHandler
args = (sys.stderr,)
level = NOTSET
formatter = generic
[formatter_generic]
format = %(levelname)-5.5s [%(name)s] %(message)s
datefmt = %H:%M:%S
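For reference, a minimal sketch of the log lines the `generic` formatter above produces (the record name, level, and message here are hypothetical, built by hand rather than emitted by Alembic):

```python
import logging

# Mirror [formatter_generic] from the ini above: padded level, logger name, message
formatter = logging.Formatter("%(levelname)-5.5s [%(name)s] %(message)s", "%H:%M:%S")

# Hand-built record standing in for a typical Alembic log call
record = logging.LogRecord("alembic", logging.INFO, "env.py", 0,
                           "Running upgrade", None, None)
assert formatter.format(record) == "INFO  [alembic] Running upgrade"
```

Note that `datefmt` only takes effect if the format string includes `%(asctime)s`, which the generic formatter above omits.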

server/alembic/README Normal file

@@ -0,0 +1 @@
Generic single-database configuration.

server/alembic/env.py Normal file

@@ -0,0 +1,115 @@
# isort: skip_file
import os
import sys

# Make /workspace importable before the models package is imported below.
sys.path.insert(0, '/workspace')

from alembic import context
from sqlalchemy import pool
from sqlalchemy import engine_from_config
from logging.config import fileConfig
from dotenv import load_dotenv
from models.models import Base
print("sys.path:", sys.path)
print("models dir exists:", os.path.isdir('/workspace/models'))
print("models/models.py exists:", os.path.isfile('/workspace/models/models.py'))
print("models/__init__.py exists:",
os.path.isfile('/workspace/models/__init__.py'))
# Load the .env file (optional)
env_path = os.path.abspath(os.path.join(
os.path.dirname(__file__), '../../.env'))
print(f"Loading environment variables from: {env_path}")
load_dotenv(env_path)
DB_CONN = os.getenv("DB_CONN")
if DB_CONN:
    DATABASE_URL = DB_CONN
else:
    # Database credentials from .env
    DB_USER = os.getenv("DB_USER")
    DB_PASSWORD = os.getenv("DB_PASSWORD")
    DB_HOST = os.getenv("DB_HOST", "db")  # default is now the compose service name 'db'
    DB_PORT = os.getenv("DB_PORT", "3306")
    DB_NAME = os.getenv("DB_NAME")
    DATABASE_URL = f"mysql+pymysql://{DB_USER}:{DB_PASSWORD}@{DB_HOST}:{DB_PORT}/{DB_NAME}"
# this is the Alembic Config object, which provides
# access to the values within the .ini file in use.
config = context.config
# Interpret the config file for Python logging.
# This line sets up loggers basically.
if config.config_file_name is not None:
    fileConfig(config.config_file_name)
print(f"Using DATABASE_URL: {DATABASE_URL}")
config.set_main_option("sqlalchemy.url", DATABASE_URL)
# add your model's MetaData object here
# for 'autogenerate' support
# from myapp import mymodel
# target_metadata = mymodel.Base.metadata
target_metadata = Base.metadata
# other values from the config, defined by the needs of env.py,
# can be acquired:
# my_important_option = config.get_main_option("my_important_option")
# ... etc.
def run_migrations_offline() -> None:
    """Run migrations in 'offline' mode.

    This configures the context with just a URL
    and not an Engine, though an Engine is acceptable
    here as well. By skipping the Engine creation
    we don't even need a DBAPI to be available.

    Calls to context.execute() here emit the given string to the
    script output.
    """
    url = config.get_main_option("sqlalchemy.url")
    context.configure(
        url=url,
        target_metadata=target_metadata,
        literal_binds=True,
        dialect_opts={"paramstyle": "named"},
    )

    with context.begin_transaction():
        context.run_migrations()
def run_migrations_online() -> None:
    """Run migrations in 'online' mode.

    In this scenario we need to create an Engine
    and associate a connection with the context.
    """
    connectable = engine_from_config(
        config.get_section(config.config_ini_section, {}),
        prefix="sqlalchemy.",
        poolclass=pool.NullPool,
    )

    with connectable.connect() as connection:
        context.configure(
            connection=connection, target_metadata=target_metadata
        )

        with context.begin_transaction():
            context.run_migrations()


if context.is_offline_mode():
    run_migrations_offline()
else:
    run_migrations_online()
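The fallback logic in env.py above can be sketched as a standalone helper (hypothetical function name; it mirrors the behavior that `DB_CONN`, when set, overrides the individual parts):

```python
def resolve_database_url(env: dict) -> str:
    """Mimic env.py's URL resolution: DB_CONN wins; otherwise assemble from parts."""
    db_conn = env.get("DB_CONN")
    if db_conn:
        return db_conn
    user = env.get("DB_USER")
    password = env.get("DB_PASSWORD")
    host = env.get("DB_HOST", "db")   # compose service name as default
    port = env.get("DB_PORT", "3306")
    name = env.get("DB_NAME")
    return f"mysql+pymysql://{user}:{password}@{host}:{port}/{name}"

# DB_CONN takes precedence over the individual parts
assert resolve_database_url({"DB_CONN": "sqlite:///dev.db"}) == "sqlite:///dev.db"
# Otherwise the MariaDB URL is assembled with defaults for host and port
assert resolve_database_url(
    {"DB_USER": "u", "DB_PASSWORD": "p", "DB_NAME": "infoscreen"}
) == "mysql+pymysql://u:p@db:3306/infoscreen"
```

Passing the environment as a plain dict keeps the helper testable without touching `os.environ`.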


@@ -0,0 +1,28 @@
"""${message}
Revision ID: ${up_revision}
Revises: ${down_revision | comma,n}
Create Date: ${create_date}
"""
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
${imports if imports else ""}
# revision identifiers, used by Alembic.
revision: str = ${repr(up_revision)}
down_revision: Union[str, None] = ${repr(down_revision)}
branch_labels: Union[str, Sequence[str], None] = ${repr(branch_labels)}
depends_on: Union[str, Sequence[str], None] = ${repr(depends_on)}
def upgrade() -> None:
    """Upgrade schema."""
    ${upgrades if upgrades else "pass"}


def downgrade() -> None:
    """Downgrade schema."""
    ${downgrades if downgrades else "pass"}


@@ -0,0 +1,36 @@
"""Rename location to description in client_groups, add description to clients
Revision ID: 0c47280d3e2d
Revises: 3a09ef909689
Create Date: 2025-07-16 08:47:00.355445
"""
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects import mysql
# revision identifiers, used by Alembic.
revision: str = '0c47280d3e2d'
down_revision: Union[str, None] = '3a09ef909689'
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
    """Upgrade schema."""
    # ### commands auto generated by Alembic - please adjust! ###
    op.add_column('client_groups', sa.Column('description', sa.String(length=255), nullable=True))
    op.drop_column('client_groups', 'location')
    op.add_column('clients', sa.Column('description', sa.String(length=255), nullable=True))
    # ### end Alembic commands ###


def downgrade() -> None:
    """Downgrade schema."""
    # ### commands auto generated by Alembic - please adjust! ###
    op.drop_column('clients', 'description')
    op.add_column('client_groups', sa.Column('location', mysql.VARCHAR(length=100), nullable=True))
    op.drop_column('client_groups', 'description')
    # ### end Alembic commands ###


@@ -0,0 +1,56 @@
"""Update clients table for new fields
Revision ID: 207f5b190f93
Revises: 3d15c3cac7b6
Create Date: 2025-07-15 14:12:42.427274
"""
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects import mysql
# revision identifiers, used by Alembic.
revision: str = '207f5b190f93'
down_revision: Union[str, None] = '3d15c3cac7b6'
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
    """Upgrade schema."""
    # ### commands auto generated by Alembic - please adjust! ###
    op.add_column('clients', sa.Column('hardware_token', sa.String(length=64), nullable=True))
    op.add_column('clients', sa.Column('ip', sa.String(length=45), nullable=True))
    op.add_column('clients', sa.Column('type', sa.String(length=50), nullable=True))
    op.add_column('clients', sa.Column('hostname', sa.String(length=100), nullable=True))
    op.add_column('clients', sa.Column('os_version', sa.String(length=100), nullable=True))
    op.add_column('clients', sa.Column('software_version', sa.String(length=100), nullable=True))
    op.add_column('clients', sa.Column('macs', sa.String(length=255), nullable=True))
    op.add_column('clients', sa.Column('model', sa.String(length=100), nullable=True))
    op.drop_index(op.f('ix_clients_hardware_hash'), table_name='clients')
    op.drop_index(op.f('ix_clients_ip_address'), table_name='clients')
    op.drop_column('clients', 'location')
    op.drop_column('clients', 'hardware_hash')
    op.drop_column('clients', 'ip_address')
    # ### end Alembic commands ###


def downgrade() -> None:
    """Downgrade schema."""
    # ### commands auto generated by Alembic - please adjust! ###
    op.add_column('clients', sa.Column('ip_address', mysql.VARCHAR(length=45), nullable=True))
    op.add_column('clients', sa.Column('hardware_hash', mysql.VARCHAR(length=64), nullable=False))
    op.add_column('clients', sa.Column('location', mysql.VARCHAR(length=100), nullable=True))
    op.create_index(op.f('ix_clients_ip_address'), 'clients', ['ip_address'], unique=False)
    op.create_index(op.f('ix_clients_hardware_hash'), 'clients', ['hardware_hash'], unique=False)
    op.drop_column('clients', 'model')
    op.drop_column('clients', 'macs')
    op.drop_column('clients', 'software_version')
    op.drop_column('clients', 'os_version')
    op.drop_column('clients', 'hostname')
    op.drop_column('clients', 'type')
    op.drop_column('clients', 'ip')
    op.drop_column('clients', 'hardware_token')
    # ### end Alembic commands ###

Some files were not shown because too many files have changed in this diff.