Compare commits: torben-fro...4a994c3bf8 (92 commits)
92 commits in this comparison (SHA1 only):

4a994c3bf8 7eabb59b35 f1a2ca75b1 a757c24f12 845e6dcc24 ee15efc898 b0eef79b1d 659af9abc0
ee4e4b0a56 04ff95469b 6404f011cc b35a66cd8a 8a4542c6bf 59b9189686 9a0cda9cad 77b89186d3
ef6b46dc05 05bd3f3f22 e143f4ab16 f1541478ad fc62086a50 951473d1ec 4db5512b57 347cc931ed
a082a81c87 7f0991e517 621833766d 27e1b2d82c 4e0fa33dee 37f2519140 f26759ec24 3ffde06e6e
3e08a09d87 8c3c80fb5c fa3a2209ad 2ab4c4c3e2 8366a9295e 8db9a93507 d496dec773 37ab57c455
aad1be90ee 4fda1139d1 7a34f7581c 8ba0f9e55a 38d1a90dc2 84a3f45d2f 5c9de8becc 9502a9b3af
2ab091b90c 9ffa70aad1 b0d8d4f915 52a336a788 63b04c4dea be0067ab06 f58c85c8f0 84ddfefaea
4a2782734a 190794b2c1 ad5bd4367e 097934ac18 a7b8d470e4 3f2be5b17d 7bf9d7e5ae a651257b3b
3972860be8 6cdc437d3e 7c1d0069c6 575325e838 faf0736dbd 47143d29a5 8222d89b2b ea4b903d63
f3cd2ba730 b5dcc6999d 038c261eb7 2adafb149a e31c4036d7 29730fa880 c3fa6455d0 7061d13b12
331a235f05 70aeb17cdb 73ead5939c f480ed00bd 61b5a4a67c 55936c81f0 68a1910bdc b7fe9a036a
fc911317f4 73dd2e5a84 fb2d584874 dfd63d7c9d
CLAUDE.md (new file, 53 lines added)
@@ -0,0 +1,53 @@
# MYP Project Development Guidelines

## System Architecture

- **Frontend**:
  - Located in `packages/reservation-platform`
  - Runs on a Raspberry Pi connected to the company network
  - Has internet access on one interface
  - Connected via LAN to an offline network
  - Serves as the user interface
  - Developed by another apprentice as part of IHK project work

- **Backend**:
  - Located in the `backend` directory
  - Flask application running on a separate Raspberry Pi
  - Connected only to the offline network
  - Communicates with WiFi smart plugs
  - Part of my IHK project work for the digital networking qualification

- **Printers/Smart Plugs**:
  - Printers can only be controlled (on/off) via WiFi smart plugs
  - No other control mechanisms are available
  - Smart plugs and printers are equivalent in the system context

## Build/Run Commands

- Backend: `cd backend && source venv/bin/activate && python app.py`
- Frontend: `cd packages/reservation-platform && pnpm dev`
- Run tests: `cd backend && python -m unittest development/tests/tests.py` (an illustrative test-case sketch follows this list)
- Run a single test: `cd backend && python -m unittest development.tests.tests.MYPBackendTestCase.test_name`
- Check jobs manually: `cd backend && source venv/bin/activate && flask check-jobs`
- Lint frontend: `cd packages/reservation-platform && pnpm lint`
- Format frontend: `cd packages/reservation-platform && npx @biomejs/biome format --write ./src`
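The test commands above refer to a `MYPBackendTestCase` in `development/tests/tests.py`, which is not part of this comparison. A minimal sketch of the shape such a test case could take; the class body here is an illustrative assumption, not the project's actual tests:

```python
import unittest


class MYPBackendTestCase(unittest.TestCase):
    """Illustrative shape of the test case referenced by the unittest commands above."""

    def test_name(self):
        # Placeholder assertion; real tests would exercise the Flask app and smart-plug logic.
        self.assertTrue(True)


if __name__ == "__main__":
    unittest.main()
```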
## Code Style

- **Python Backend** (a short example follows this section):
  - Use PEP 8 conventions, 4-space indentation
  - Line width: 100 characters max
  - Add docstrings to functions and classes
  - Error handling: use try/except with specific exceptions
  - Naming: snake_case for functions/variables, PascalCase for classes

- **Frontend (Next.js/TypeScript)**:
  - Use Biome for formatting and linting (line width: 120 chars)
  - Organize imports automatically with Biome
  - Use TypeScript types for all code
  - Use React hooks for state management
  - Naming: camelCase for functions/variables, PascalCase for components
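A minimal sketch of a backend helper written to the Python conventions above (snake_case naming, docstring, specific exception handling, 4-space indentation). The function and its requests-based smart-plug call are illustrative assumptions, not code from this diff:

```python
import requests


def toggle_plug(plug_ip: str, turn_on: bool) -> bool:
    """Switch a WiFi smart plug on or off and report whether the call succeeded."""
    try:
        # Hypothetical endpoint; the project's real smart-plug client is not shown in this diff.
        response = requests.post(f"http://{plug_ip}/state", json={"on": turn_on}, timeout=5)
        response.raise_for_status()
    except requests.RequestException as error:  # specific exception, per the guidelines
        print(f"Could not reach plug {plug_ip}: {error}")
        return False
    return True
```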
## Work Guidelines

- All changes must be committed to git
- Work efficiently and cost-effectively
- Don't repeatedly try the same solution if it doesn't work
- Create and check notes when encountering issues
- Clearly communicate if something is not possible so I can handle it manually
CREDENTIALS (new file, 3 lines added)
@@ -0,0 +1,3 @@
TAPO ADMIN: vT6Vsd^p
Admin-PW: 744563017196
Tapo: 744563017196A
Dokumentation.md (new executable file, 11 lines added)
@@ -0,0 +1,11 @@
# Documentation

Complications:
- Network connection
- Determining the printers' interfaces
- Choosing the connection approach, developing a network concept
- Procuring the hardware (limited options to choose from)
- Which operating system? OpenSuse, NixOS, Debian
- Getting to understand the frontend
- Setting up the network, connecting the frontend
INSTALL.md (new file, 242 lines added)
@@ -0,0 +1,242 @@
# MYP System - Installation Guide

This document describes the installation of the MYP system, which consists of a frontend and a backend.

## System Requirements

- **Frontend**:
  - Raspberry Pi 3B+ or newer (recommended: Pi 4 with at least 2 GB RAM)
  - Raspbian/Raspberry Pi OS (64-bit recommended)
  - Docker and Docker Compose (installed automatically)
  - Internet access for the installation
  - Two network interfaces:
    - One with internet access
    - One for the connection to the backend network

- **Backend**:
  - Raspberry Pi 3B+ or newer (recommended: Pi 4 with at least 2 GB RAM)
  - Raspbian/Raspberry Pi OS (64-bit recommended)
  - Docker and Docker Compose (installed automatically)
  - Connection to the printer network

## 1. Installing the Backend

The backend is installed on the first Raspberry Pi, which is connected to the smart plugs.

```bash
# Copy the code to the Raspberry Pi
scp -r /pfad/zum/projektverzeichnis pi@raspberry-backend:/home/pi/myp

# Establish an SSH connection
ssh pi@raspberry-backend

# Change into the project directory
cd /home/pi/myp

# Make the installation script executable and run it
chmod +x install-backend.sh
./install-backend.sh
```

The script takes care of the following tasks:
- Installing Docker and Docker Compose (if not already present)
- Creating the required directories and files
- Building and starting the Docker container
- Initializing the database
- Verifying that the service is running correctly

After a successful installation, the backend is reachable at `http://raspberry-backend:5000`.

## 2. Installing the Frontend

The frontend is installed on the second Raspberry Pi, which is connected to the company network.

```bash
# Copy the code to the Raspberry Pi
scp -r /pfad/zum/projektverzeichnis pi@raspberry-frontend:/home/pi/myp

# Establish an SSH connection
ssh pi@raspberry-frontend

# Change into the project directory
cd /home/pi/myp

# Make the installation script executable and run it
chmod +x install-frontend.sh
./install-frontend.sh
```

The script takes care of the following tasks:
- Installing Docker and Docker Compose (if not already present)
- Creating the required directories and files
- Building and starting the Docker container
- Verifying that the service is running correctly

After a successful installation, the frontend is reachable at `http://raspberry-frontend:3000`.

## 3. Configuring the Connection Between Frontend and Backend

For the frontend and backend to communicate, the API URL must be configured in the frontend:

1. Edit the file `/home/pi/myp/packages/reservation-platform/.env` on the frontend Raspberry Pi:

```
# Basic Server Configuration
RUNTIME_ENVIRONMENT=prod
DB_PATH=db/sqlite.db

# OAuth Configuration
OAUTH_CLIENT_ID=client_id
OAUTH_CLIENT_SECRET=client_secret

# Backend URL (hostname or IP address of the backend Raspberry Pi)
NEXT_PUBLIC_API_URL=http://raspberry-backend:5000
```

2. Restart the frontend container:

```bash
cd /home/pi/myp/packages/reservation-platform
docker-compose down
docker-compose up -d
```

## 4. Maintenance and Troubleshooting

### Viewing logs

**Backend:**
```bash
docker logs -f myp-backend
```

**Frontend:**
```bash
docker logs -f myp-frontend
```

### Restarting containers

**Backend:**
```bash
cd /pfad/zum/backend
docker-compose restart
```

**Frontend:**
```bash
cd /pfad/zum/frontend
docker-compose restart
```

### Database reset

If the database needs to be reset:

```bash
# On the backend Raspberry Pi
cd /home/pi/myp/backend
docker-compose down
rm -f instance/myp.db
docker-compose up -d
```

### Docker Compose YAML errors

If you get a YAML error for the Docker Compose file:

```
yaml: line 12: did not find expected key
```

Check the following points:
1. The Docker Compose version might be outdated. The installation scripts automatically install the correct version.
2. There might be a syntax error in the YAML file. Pay particular attention to complex values such as JSON strings.

Troubleshooting:
```bash
# On the affected server
cd /home/pi/myp
# For the backend
nano backend/docker-compose.yml
# For the frontend
nano packages/reservation-platform/docker-compose.yml
```

### Docker daemon not running

If you get the error message that the Docker daemon is not running:

```
Cannot connect to the Docker daemon at unix:///var/run/docker.sock. Is the docker daemon running?
```

Start the Docker daemon:
```bash
sudo systemctl start docker
# or
sudo service docker start
```

### Container does not start

If the container does not start, check the logs:

```bash
docker logs myp-backend
# or
docker logs myp-frontend
```

### Frontend cannot reach the backend

1. Make sure both servers are on the same network
2. Check the firewall settings
3. Make sure the backend service is running on port 5000
4. Make sure the correct backend URL is set in the frontend's .env file (a quick connectivity check is sketched below)
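A small Python check you could run from the frontend Pi to confirm the backend answers on port 5000. The hostname `raspberry-backend` matches this guide, and `/api/printers` is the route exposed by the Flask backend bundled in this repository; both are assumptions to adjust to your actual setup:

```python
import sys

import requests

BACKEND_URL = "http://raspberry-backend:5000"  # same value as NEXT_PUBLIC_API_URL

try:
    # Any backend route works for a reachability test; /api/printers returns a JSON list.
    response = requests.get(f"{BACKEND_URL}/api/printers", timeout=5)
    response.raise_for_status()
except requests.RequestException as error:
    print(f"Backend not reachable at {BACKEND_URL}: {error}")
    sys.exit(1)

print(f"Backend reachable, {len(response.json())} printers reported.")
```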
## 5. Automatic Start at Boot

The Docker containers are configured to start automatically when the devices reboot (`restart: unless-stopped`).

If this does not work, the start command can be added to `/etc/rc.local`:

```bash
# On the backend Raspberry Pi
echo "cd /home/pi/myp/backend && docker-compose up -d" >> /etc/rc.local

# On the frontend Raspberry Pi
echo "cd /home/pi/myp/packages/reservation-platform && docker-compose up -d" >> /etc/rc.local
```

## 6. Technical Details

- The backend is a Flask application that communicates with the smart plugs
- The frontend is a Next.js application
- Both components run in Docker containers attached to the host network
- The databases are persisted in Docker volumes

## 7. Raspberry Pi-Specific Notes

If you run into problems with the Docker installation on the Raspberry Pi, you can perform the following steps manually:

```bash
# Install Docker for Raspberry Pi
curl -fsSL https://get.docker.com -o get-docker.sh
sudo sh get-docker.sh
sudo usermod -aG docker $USER

# Install Docker Compose for the correct architecture
# For 32-bit (armhf):
sudo curl -L "https://github.com/docker/compose/releases/download/1.29.2/docker-compose-Linux-armv7" -o /usr/local/bin/docker-compose

# For 64-bit (arm64):
sudo curl -L "https://github.com/docker/compose/releases/download/1.29.2/docker-compose-Linux-aarch64" -o /usr/local/bin/docker-compose

sudo chmod +x /usr/local/bin/docker-compose
```

## 8. Support

For questions or problems, contact:
- Till Tomczak (project developer)
LICENSE.md (mode changed: Normal file → Executable file, no content changes)
README.Frontend.md (new file, 79 lines added)
@@ -0,0 +1,79 @@
# Frontend Restoration and Installation

This guide explains how to reset the frontend to the state of Torben's last commit and install it.

## Available Scripts

Three scripts were created to make restoring the frontend easier:

1. `fix-frontend-install.sh` - master script that runs both scripts listed below
2. `packages/restore-torben-frontend.sh` - resets the frontend to Torben's last commit
3. `packages/install-torben-frontend.sh` - installs the dependencies of the restored frontend

## Quick Solution

For the quickest solution, simply run the master script:

```bash
chmod +x fix-frontend-install.sh
./fix-frontend-install.sh
```

The script will:
1. Back up the current frontend directory (optional)
2. Reset the frontend to Torben's last commit (27 May 2024)
3. Commit the changes (optional)
4. Install the frontend dependencies
5. Build the frontend to verify the installation

## Manual Steps

If you want to perform the steps manually:

### 1. Reset the frontend

```bash
chmod +x packages/restore-torben-frontend.sh
./packages/restore-torben-frontend.sh
```

### 2. Install the frontend

```bash
chmod +x packages/install-torben-frontend.sh
./packages/install-torben-frontend.sh
```

## Starting the System

### Start the frontend

```bash
cd packages/reservation-platform
pnpm dev
```

The frontend is then reachable at http://localhost:3000.

### Start the backend

```bash
cd backend
source venv/bin/activate
python app.py
```

The backend then runs at http://localhost:5000.

### Configure backend autostart

To start the backend automatically at boot:

```bash
sudo ./backend/autostart-backend.sh
```

## Known Issues

- If errors occur when starting the frontend, check the .env file in packages/reservation-platform/
- Make sure the backend is reachable at the URL configured in NEXT_PUBLIC_API_URL
README.md (mode changed: Normal file → Executable file; 5 lines changed)
@@ -1,3 +1,4 @@
<<<<<<< HEAD
# 📦 MYP

> Frontend: https://git.i.mercedes-benz.com/TBA-Berlin-FI/MYP/tree/main/packages/reservation-platform
@@ -44,3 +45,7 @@ MYP *(Manage your Printer)* is a platform for reserving 3D printers,
## Foreign-key relationships
- `User` is linked to `PrintJob`, `Account`, and `Session` via the user ID.
- `Printer` is linked to `PrintJob` via the printer ID.
=======
# Projektarbeit-MYP

>>>>>>> dfd63d7c9ddf4b3a654f06dff38bebdbec7395d7
Renamed files (mode changed: Normal file → Executable file, no content changes):

packages/@backend-archiv/NETWORK-api-backend_blueprint/.env → archiv/NETWORK-api-backend_blueprint/.env
packages/@backend-archiv/NETWORK-api-backend_blueprint/README.md → archiv/NETWORK-api-backend_blueprint/README.md
packages/@backend-archiv/NETWORK-api-backend_blueprint/requirements.txt → archiv/NETWORK-api-backend_blueprint/requirements.txt
packages/@backend-archiv/NETWORK-api-backend_blueprint/server.py → archiv/NETWORK-api-backend_blueprint/server.py
packages/backend/myp_backend.db → archiv/backend/myp_backend.db
packages/backend/myp_backend.py → archiv/backend/myp_backend.py
packages/backend/templates/base.html → archiv/backend/templates/base.html
packages/backend/templates/dashboard.html → archiv/backend/templates/dashboard.html
packages/backend/templates/login.html → archiv/backend/templates/login.html
archiv/flask-backend/.env (new executable file, 3 lines added)
@@ -0,0 +1,3 @@
SECRET_KEY=dev-secret-key-change-in-production
DATABASE_URL=sqlite:///app.db
JWT_SECRET=dev-jwt-secret-change-in-production
archiv/flask-backend/.env.example (new executable file, 3 lines added)
@@ -0,0 +1,3 @@
SECRET_KEY=change-me-to-a-real-secret-key
DATABASE_URL=sqlite:///app.db
JWT_SECRET=change-me-to-a-real-jwt-secret
archiv/flask-backend/Dockerfile (new executable file, 20 lines added)
@@ -0,0 +1,20 @@
FROM python:3.11-slim

WORKDIR /app

# Install dependencies
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy application code
COPY . .

# Run database migrations
RUN mkdir -p /app/instance
ENV FLASK_APP=wsgi.py

# Expose port
EXPOSE 5000

# Run the application
CMD ["gunicorn", "--bind", "0.0.0.0:5000", "wsgi:app"]
archiv/flask-backend/README.md (new executable file, 96 lines added)
@@ -0,0 +1,96 @@
# Reservation Platform Backend

This is the Flask backend for the 3D Printer Reservation Platform, providing a RESTful API for managing printers, reservations, and users.

## Features

- User authentication with email/password
- Role-based permission system (admin, user)
- Printer management
- Reservation system
- User management

## API Endpoints

### Authentication
- `POST /auth/register` - Register a new user
- `POST /auth/login` - Login with username/email and password
- `POST /auth/logout` - Log out a user by invalidating their session

### Printers
- `GET /api/printers` - Get all printers
- `GET /api/printers/<printer_id>` - Get a specific printer
- `POST /api/printers` - Create a new printer (admin only)
- `PUT /api/printers/<printer_id>` - Update a printer (admin only)
- `DELETE /api/printers/<printer_id>` - Delete a printer (admin only)
- `GET /api/printers/availability` - Get availability information for all printers

### Print Jobs
- `GET /api/jobs` - Get jobs for the current user or all jobs for admin
- `GET /api/jobs/<job_id>` - Get a specific job
- `POST /api/jobs` - Create a new print job (reserve a printer)
- `PUT /api/jobs/<job_id>` - Update a job
- `DELETE /api/jobs/<job_id>` - Delete a job (cancel reservation)
- `GET /api/jobs/<job_id>/remaining-time` - Get remaining time for a job (public endpoint)

### Users
- `GET /api/users` - Get all users (admin only)
- `GET /api/users/<user_id>` - Get a specific user (admin only)
- `PUT /api/users/<user_id>` - Update a user (admin only)
- `DELETE /api/users/<user_id>` - Delete a user (admin only)
- `GET /api/me` - Get the current user's profile
- `PUT /api/me` - Update the current user's profile
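A short example of how a client could use these endpoints with `requests`: log in, then call an authenticated route with the returned JWT. The username and password are placeholders, and the base URL assumes the local development server described in the setup section below:

```python
import requests

BASE_URL = "http://localhost:5000"  # local development server

# Log in with username/email and password; the response body contains a JWT token.
login = requests.post(
    f"{BASE_URL}/auth/login",
    json={"username": "example-user", "password": "example-password"},
    timeout=5,
)
login.raise_for_status()
token = login.json()["token"]

# Pass the token as a Bearer header for authenticated endpoints.
headers = {"Authorization": f"Bearer {token}"}
jobs = requests.get(f"{BASE_URL}/api/jobs", headers=headers, timeout=5)
jobs.raise_for_status()
print(jobs.json())
```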
## Installation

### Prerequisites
- Python 3.11 or higher
- pip

### Setup

1. Clone the repository
```bash
git clone https://github.com/your-repo/reservation-platform.git
cd reservation-platform/packages/flask-backend
```

2. Install dependencies
```bash
pip install -r requirements.txt
```

3. Create a `.env` file with the following variables:
```
SECRET_KEY=your-secret-key
DATABASE_URL=sqlite:///app.db
JWT_SECRET=your-jwt-secret
```

4. Initialize the database
```bash
flask db upgrade
python scripts/init_db.py
```

5. Run the development server
```bash
python wsgi.py
```

## Docker Deployment

1. Build and run with Docker Compose
```bash
docker-compose up -d
```

## Development

### Running Migrations

To create a new migration after updating models:
```bash
flask db migrate -m "Description of changes"
flask db upgrade
```
archiv/flask-backend/app/__init__.py (new executable file, 32 lines added)
@@ -0,0 +1,32 @@
from flask import Flask
from flask_sqlalchemy import SQLAlchemy
from flask_migrate import Migrate
from flask_cors import CORS
from config import Config

db = SQLAlchemy()
migrate = Migrate()

def create_app(config_class=Config):
    app = Flask(__name__)
    app.config.from_object(config_class)

    # Initialize extensions
    db.init_app(app)
    migrate.init_app(app, db)
    CORS(app)

    # Register blueprints
    from app.api import bp as api_bp
    app.register_blueprint(api_bp, url_prefix='/api')

    from app.auth import bp as auth_bp
    app.register_blueprint(auth_bp, url_prefix='/auth')

    @app.route('/health')
    def health_check():
        return {'status': 'ok'}

    return app

from app import models
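The Dockerfile and README refer to a `wsgi.py` entry point (`FLASK_APP=wsgi.py`, `gunicorn wsgi:app`) that is not shown in this comparison. A minimal sketch of what it presumably contains, based on the `create_app` factory above; treat it as an assumption, not the repository's actual file:

```python
from app import create_app

# Module-level app object so that "gunicorn wsgi:app" and "python wsgi.py" both work.
app = create_app()

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```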
archiv/flask-backend/app/api/__init__.py (new executable file, 5 lines added)
@@ -0,0 +1,5 @@
from flask import Blueprint

bp = Blueprint('api', __name__)

from app.api import printers, jobs, users
archiv/flask-backend/app/api/jobs.py (new executable file, 219 lines added)
@@ -0,0 +1,219 @@
|
||||
from flask import request, jsonify
|
||||
from app import db
|
||||
from app.api import bp
|
||||
from app.models import PrintJob, Printer, User
|
||||
from app.auth.routes import token_required, admin_required
|
||||
from datetime import datetime, timedelta
|
||||
|
||||
@bp.route('/jobs', methods=['GET'])
|
||||
@token_required
|
||||
def get_jobs():
|
||||
"""Get jobs for the current user or all jobs for admin"""
|
||||
is_admin = request.user_role == 'admin'
|
||||
user_id = request.user_id
|
||||
|
||||
# Parse query parameters
|
||||
status = request.args.get('status') # active, upcoming, completed, aborted, all
|
||||
printer_id = request.args.get('printer_id')
|
||||
|
||||
# Base query
|
||||
query = PrintJob.query
|
||||
|
||||
# Filter by user unless admin
|
||||
if not is_admin:
|
||||
query = query.filter_by(user_id=user_id)
|
||||
|
||||
# Filter by printer if provided
|
||||
if printer_id:
|
||||
query = query.filter_by(printer_id=printer_id)
|
||||
|
||||
# Apply status filter
|
||||
now = datetime.utcnow()
|
||||
if status == 'active':
|
||||
query = query.filter_by(aborted=False) \
|
||||
.filter(PrintJob.start_at <= now) \
|
||||
.filter(PrintJob.start_at.op('+')(PrintJob.duration_in_minutes * 60) > now)
|
||||
elif status == 'upcoming':
|
||||
query = query.filter_by(aborted=False) \
|
||||
.filter(PrintJob.start_at > now)
|
||||
elif status == 'completed':
|
||||
query = query.filter_by(aborted=False) \
|
||||
.filter(PrintJob.start_at.op('+')(PrintJob.duration_in_minutes * 60) <= now)
|
||||
elif status == 'aborted':
|
||||
query = query.filter_by(aborted=True)
|
||||
|
||||
# Order by start time, most recent first
|
||||
query = query.order_by(PrintJob.start_at.desc())
|
||||
|
||||
# Execute query
|
||||
jobs = query.all()
|
||||
result = [job.to_dict() for job in jobs]
|
||||
|
||||
return jsonify(result)
|
||||
|
||||
@bp.route('/jobs/<job_id>', methods=['GET'])
|
||||
@token_required
|
||||
def get_job(job_id):
|
||||
"""Get a specific job"""
|
||||
job = PrintJob.query.get_or_404(job_id)
|
||||
|
||||
# Check permissions
|
||||
is_admin = request.user_role == 'admin'
|
||||
user_id = request.user_id
|
||||
|
||||
if not is_admin and job.user_id != user_id:
|
||||
return jsonify({'error': 'Not authorized to view this job'}), 403
|
||||
|
||||
return jsonify(job.to_dict())
|
||||
|
||||
@bp.route('/jobs', methods=['POST'])
|
||||
@token_required
|
||||
def create_job():
|
||||
"""Create a new print job (reserve a printer)"""
|
||||
data = request.get_json() or {}
|
||||
|
||||
required_fields = ['printer_id', 'start_at', 'duration_in_minutes']
|
||||
for field in required_fields:
|
||||
if field not in data:
|
||||
return jsonify({'error': f'Missing required field: {field}'}), 400
|
||||
|
||||
# Validate printer
|
||||
printer = Printer.query.get(data['printer_id'])
|
||||
if not printer:
|
||||
return jsonify({'error': 'Printer not found'}), 404
|
||||
|
||||
if printer.status != 0: # Not operational
|
||||
return jsonify({'error': 'Printer is not operational'}), 400
|
||||
|
||||
# Parse start time
|
||||
try:
|
||||
start_at = datetime.fromisoformat(data['start_at'].replace('Z', '+00:00'))
|
||||
except ValueError:
|
||||
return jsonify({'error': 'Invalid start_at format'}), 400
|
||||
|
||||
# Validate duration
|
||||
try:
|
||||
duration = int(data['duration_in_minutes'])
|
||||
if duration <= 0 or duration > 480: # Max 8 hours
|
||||
return jsonify({'error': 'Invalid duration (must be between 1 and 480 minutes)'}), 400
|
||||
except ValueError:
|
||||
return jsonify({'error': 'Duration must be a number'}), 400
|
||||
|
||||
end_at = start_at + timedelta(minutes=duration)
|
||||
|
||||
# Check if the printer is available during the requested time
|
||||
conflicting_jobs = PrintJob.query.filter_by(printer_id=printer.id, aborted=False) \
|
||||
.filter(
|
||||
(PrintJob.start_at < end_at) &
|
||||
(PrintJob.start_at.op('+')(PrintJob.duration_in_minutes * 60) > start_at)
|
||||
) \
|
||||
.all()
|
||||
|
||||
if conflicting_jobs:
|
||||
return jsonify({'error': 'Printer is not available during the requested time'}), 409
|
||||
|
||||
# Create job
|
||||
job = PrintJob(
|
||||
printer_id=data['printer_id'],
|
||||
user_id=request.user_id,
|
||||
start_at=start_at,
|
||||
duration_in_minutes=duration,
|
||||
comments=data.get('comments', '')
|
||||
)
|
||||
|
||||
db.session.add(job)
|
||||
db.session.commit()
|
||||
|
||||
return jsonify(job.to_dict()), 201
|
||||
|
||||
@bp.route('/jobs/<job_id>', methods=['PUT'])
|
||||
@token_required
|
||||
def update_job(job_id):
|
||||
"""Update a job"""
|
||||
job = PrintJob.query.get_or_404(job_id)
|
||||
|
||||
# Check permissions
|
||||
is_admin = request.user_role == 'admin'
|
||||
user_id = request.user_id
|
||||
|
||||
if not is_admin and job.user_id != user_id:
|
||||
return jsonify({'error': 'Not authorized to update this job'}), 403
|
||||
|
||||
data = request.get_json() or {}
|
||||
|
||||
# Only allow certain fields to be updated
|
||||
if 'comments' in data:
|
||||
job.comments = data['comments']
|
||||
|
||||
# Admin or owner can abort a job
|
||||
if 'aborted' in data and data['aborted'] and not job.aborted:
|
||||
job.aborted = True
|
||||
job.abort_reason = data.get('abort_reason', '')
|
||||
|
||||
# Admin or owner can extend a job if it's active
|
||||
now = datetime.utcnow()
|
||||
is_active = (not job.aborted and
|
||||
job.start_at <= now and
|
||||
job.get_end_time() > now)
|
||||
|
||||
if 'extend_minutes' in data and is_active:
|
||||
try:
|
||||
extend_minutes = int(data['extend_minutes'])
|
||||
if extend_minutes <= 0 or extend_minutes > 120: # Max extend 2 hours
|
||||
return jsonify({'error': 'Invalid extension (must be between 1 and 120 minutes)'}), 400
|
||||
|
||||
new_end_time = job.get_end_time() + timedelta(minutes=extend_minutes)
|
||||
|
||||
# Check for conflicts with the extension
|
||||
conflicting_jobs = PrintJob.query.filter_by(printer_id=job.printer_id, aborted=False) \
|
||||
.filter(PrintJob.id != job.id) \
|
||||
.filter(PrintJob.start_at < new_end_time) \
|
||||
.filter(PrintJob.start_at > job.get_end_time()) \
|
||||
.all()
|
||||
|
||||
if conflicting_jobs:
|
||||
return jsonify({'error': 'Cannot extend job due to conflicts with other reservations'}), 409
|
||||
|
||||
job.duration_in_minutes += extend_minutes
|
||||
except ValueError:
|
||||
return jsonify({'error': 'Extend minutes must be a number'}), 400
|
||||
|
||||
db.session.commit()
|
||||
|
||||
return jsonify(job.to_dict())
|
||||
|
||||
@bp.route('/jobs/<job_id>', methods=['DELETE'])
|
||||
@token_required
|
||||
def delete_job(job_id):
|
||||
"""Delete a job (cancel reservation)"""
|
||||
job = PrintJob.query.get_or_404(job_id)
|
||||
|
||||
# Check permissions
|
||||
is_admin = request.user_role == 'admin'
|
||||
user_id = request.user_id
|
||||
|
||||
if not is_admin and job.user_id != user_id:
|
||||
return jsonify({'error': 'Not authorized to delete this job'}), 403
|
||||
|
||||
# Only allow deletion of upcoming jobs
|
||||
now = datetime.utcnow()
|
||||
if job.start_at <= now and not is_admin:
|
||||
return jsonify({'error': 'Cannot delete an active or completed job'}), 400
|
||||
|
||||
db.session.delete(job)
|
||||
db.session.commit()
|
||||
|
||||
return jsonify({'message': 'Job deleted successfully'})
|
||||
|
||||
@bp.route('/jobs/<job_id>/remaining-time', methods=['GET'])
|
||||
def get_remaining_time(job_id):
|
||||
"""Get remaining time for a job (public endpoint)"""
|
||||
job = PrintJob.query.get_or_404(job_id)
|
||||
|
||||
remaining_seconds = job.get_remaining_time()
|
||||
|
||||
return jsonify({
|
||||
'job_id': job.id,
|
||||
'remaining_seconds': remaining_seconds,
|
||||
'is_active': job.is_active()
|
||||
})
|
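The availability check in `create_job` above rejects a reservation when an existing job overlaps the requested window (existing start before the requested end, and existing end after the requested start). The same rule as a small standalone function, using plain datetimes rather than the SQLAlchemy query:

```python
from datetime import datetime, timedelta


def overlaps(existing_start: datetime, existing_minutes: int,
             requested_start: datetime, requested_minutes: int) -> bool:
    """True if two reservation windows overlap (the condition used by create_job)."""
    existing_end = existing_start + timedelta(minutes=existing_minutes)
    requested_end = requested_start + timedelta(minutes=requested_minutes)
    return existing_start < requested_end and existing_end > requested_start


# A job from 10:00-11:00 conflicts with a request for 10:30-11:30, but not with 11:00-12:00.
ten = datetime(2024, 5, 27, 10, 0)
print(overlaps(ten, 60, ten + timedelta(minutes=30), 60))  # -> True
print(overlaps(ten, 60, ten + timedelta(minutes=60), 60))  # -> False
```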
archiv/flask-backend/app/api/printers.py (new executable file, 177 lines added)
@@ -0,0 +1,177 @@
|
||||
from flask import request, jsonify
|
||||
from app import db
|
||||
from app.api import bp
|
||||
from app.models import Printer, PrintJob
|
||||
from app.auth.routes import token_required, admin_required
|
||||
from datetime import datetime
|
||||
|
||||
@bp.route('/printers', methods=['GET'])
|
||||
def get_printers():
|
||||
"""Get all printers"""
|
||||
printers = Printer.query.all()
|
||||
result = []
|
||||
|
||||
for printer in printers:
|
||||
# Get active job for the printer if any
|
||||
now = datetime.utcnow()
|
||||
active_job = PrintJob.query.filter_by(printer_id=printer.id, aborted=False) \
|
||||
.filter(PrintJob.start_at <= now) \
|
||||
.filter(PrintJob.start_at.op('+')(PrintJob.duration_in_minutes * 60) > now) \
|
||||
.first()
|
||||
|
||||
printer_data = {
|
||||
'id': printer.id,
|
||||
'name': printer.name,
|
||||
'description': printer.description,
|
||||
'status': printer.status,
|
||||
'is_available': printer.status == 0 and active_job is None,
|
||||
'active_job': active_job.to_dict() if active_job else None
|
||||
}
|
||||
result.append(printer_data)
|
||||
|
||||
return jsonify(result)
|
||||
|
||||
@bp.route('/printers/<printer_id>', methods=['GET'])
|
||||
def get_printer(printer_id):
|
||||
"""Get a specific printer"""
|
||||
printer = Printer.query.get_or_404(printer_id)
|
||||
|
||||
# Get active job for the printer if any
|
||||
now = datetime.utcnow()
|
||||
active_job = PrintJob.query.filter_by(printer_id=printer.id, aborted=False) \
|
||||
.filter(PrintJob.start_at <= now) \
|
||||
.filter(PrintJob.start_at.op('+')(PrintJob.duration_in_minutes * 60) > now) \
|
||||
.first()
|
||||
|
||||
# Get upcoming jobs
|
||||
upcoming_jobs = PrintJob.query.filter_by(printer_id=printer.id, aborted=False) \
|
||||
.filter(PrintJob.start_at > now) \
|
||||
.order_by(PrintJob.start_at) \
|
||||
.limit(5) \
|
||||
.all()
|
||||
|
||||
result = {
|
||||
'id': printer.id,
|
||||
'name': printer.name,
|
||||
'description': printer.description,
|
||||
'status': printer.status,
|
||||
'is_available': printer.status == 0 and active_job is None,
|
||||
'active_job': active_job.to_dict() if active_job else None,
|
||||
'upcoming_jobs': [job.to_dict() for job in upcoming_jobs]
|
||||
}
|
||||
|
||||
return jsonify(result)
|
||||
|
||||
@bp.route('/printers', methods=['POST'])
|
||||
@admin_required
|
||||
def create_printer():
|
||||
"""Create a new printer (admin only)"""
|
||||
data = request.get_json() or {}
|
||||
|
||||
required_fields = ['name', 'description']
|
||||
for field in required_fields:
|
||||
if field not in data:
|
||||
return jsonify({'error': f'Missing required field: {field}'}), 400
|
||||
|
||||
printer = Printer(
|
||||
name=data['name'],
|
||||
description=data['description'],
|
||||
status=data.get('status', 0)
|
||||
)
|
||||
|
||||
db.session.add(printer)
|
||||
db.session.commit()
|
||||
|
||||
return jsonify({
|
||||
'id': printer.id,
|
||||
'name': printer.name,
|
||||
'description': printer.description,
|
||||
'status': printer.status
|
||||
}), 201
|
||||
|
||||
@bp.route('/printers/<printer_id>', methods=['PUT'])
|
||||
@admin_required
|
||||
def update_printer(printer_id):
|
||||
"""Update a printer (admin only)"""
|
||||
printer = Printer.query.get_or_404(printer_id)
|
||||
data = request.get_json() or {}
|
||||
|
||||
if 'name' in data:
|
||||
printer.name = data['name']
|
||||
if 'description' in data:
|
||||
printer.description = data['description']
|
||||
if 'status' in data:
|
||||
printer.status = data['status']
|
||||
|
||||
db.session.commit()
|
||||
|
||||
return jsonify({
|
||||
'id': printer.id,
|
||||
'name': printer.name,
|
||||
'description': printer.description,
|
||||
'status': printer.status
|
||||
})
|
||||
|
||||
@bp.route('/printers/<printer_id>', methods=['DELETE'])
|
||||
@admin_required
|
||||
def delete_printer(printer_id):
|
||||
"""Delete a printer (admin only)"""
|
||||
printer = Printer.query.get_or_404(printer_id)
|
||||
|
||||
# Check if the printer has active jobs
|
||||
now = datetime.utcnow()
|
||||
active_jobs = PrintJob.query.filter_by(printer_id=printer.id, aborted=False) \
|
||||
.filter(PrintJob.start_at <= now) \
|
||||
.filter(PrintJob.start_at.op('+')(PrintJob.duration_in_minutes * 60) > now) \
|
||||
.all()
|
||||
|
||||
if active_jobs:
|
||||
return jsonify({'error': 'Cannot delete printer with active jobs'}), 400
|
||||
|
||||
db.session.delete(printer)
|
||||
db.session.commit()
|
||||
|
||||
return jsonify({'message': 'Printer deleted successfully'})
|
||||
|
||||
@bp.route('/printers/availability', methods=['GET'])
|
||||
def get_availability():
|
||||
"""Get availability information for all printers"""
|
||||
start_date = request.args.get('start_date')
|
||||
end_date = request.args.get('end_date')
|
||||
|
||||
if not start_date or not end_date:
|
||||
return jsonify({'error': 'start_date and end_date are required'}), 400
|
||||
|
||||
try:
|
||||
start = datetime.fromisoformat(start_date.replace('Z', '+00:00'))
|
||||
end = datetime.fromisoformat(end_date.replace('Z', '+00:00'))
|
||||
except ValueError:
|
||||
return jsonify({'error': 'Invalid date format'}), 400
|
||||
|
||||
if start >= end:
|
||||
return jsonify({'error': 'start_date must be before end_date'}), 400
|
||||
|
||||
printers = Printer.query.all()
|
||||
result = []
|
||||
|
||||
for printer in printers:
|
||||
# Get all jobs for this printer in the date range
|
||||
jobs = PrintJob.query.filter_by(printer_id=printer.id, aborted=False) \
|
||||
.filter(
|
||||
(PrintJob.start_at <= end) &
|
||||
(PrintJob.start_at.op('+')(PrintJob.duration_in_minutes * 60) >= start)
|
||||
) \
|
||||
.order_by(PrintJob.start_at) \
|
||||
.all()
|
||||
|
||||
# Convert to availability slots
|
||||
availability = {
|
||||
'printer_id': printer.id,
|
||||
'printer_name': printer.name,
|
||||
'status': printer.status,
|
||||
'jobs': [job.to_dict() for job in jobs]
|
||||
}
|
||||
|
||||
result.append(availability)
|
||||
|
||||
return jsonify(result)
|
archiv/flask-backend/app/api/users.py (new executable file, 139 lines added)
@@ -0,0 +1,139 @@
|
||||
from flask import request, jsonify
|
||||
from app import db
|
||||
from app.api import bp
|
||||
from app.models import User, PrintJob
|
||||
from app.auth.routes import admin_required, token_required
|
||||
|
||||
@bp.route('/users', methods=['GET'])
|
||||
@admin_required
|
||||
def get_users():
|
||||
"""Get all users (admin only)"""
|
||||
users = User.query.all()
|
||||
result = []
|
||||
|
||||
for user in users:
|
||||
# Count jobs
|
||||
total_jobs = PrintJob.query.filter_by(user_id=user.id).count()
|
||||
active_jobs = PrintJob.query.filter_by(user_id=user.id, aborted=False).count()
|
||||
|
||||
user_data = {
|
||||
'id': user.id,
|
||||
'github_id': user.github_id,
|
||||
'username': user.username,
|
||||
'display_name': user.display_name,
|
||||
'email': user.email,
|
||||
'role': user.role,
|
||||
'job_count': total_jobs,
|
||||
'active_job_count': active_jobs
|
||||
}
|
||||
result.append(user_data)
|
||||
|
||||
return jsonify(result)
|
||||
|
||||
@bp.route('/users/<user_id>', methods=['GET'])
|
||||
@admin_required
|
||||
def get_user(user_id):
|
||||
"""Get a specific user (admin only)"""
|
||||
user = User.query.get_or_404(user_id)
|
||||
|
||||
# Count jobs
|
||||
total_jobs = PrintJob.query.filter_by(user_id=user.id).count()
|
||||
active_jobs = PrintJob.query.filter_by(user_id=user.id, aborted=False).count()
|
||||
|
||||
result = {
|
||||
'id': user.id,
|
||||
'github_id': user.github_id,
|
||||
'username': user.username,
|
||||
'display_name': user.display_name,
|
||||
'email': user.email,
|
||||
'role': user.role,
|
||||
'job_count': total_jobs,
|
||||
'active_job_count': active_jobs
|
||||
}
|
||||
|
||||
return jsonify(result)
|
||||
|
||||
@bp.route('/users/<user_id>', methods=['PUT'])
|
||||
@admin_required
|
||||
def update_user(user_id):
|
||||
"""Update a user (admin only)"""
|
||||
user = User.query.get_or_404(user_id)
|
||||
data = request.get_json() or {}
|
||||
|
||||
if 'role' in data and data['role'] in ['admin', 'user', 'guest']:
|
||||
user.role = data['role']
|
||||
|
||||
if 'display_name' in data:
|
||||
user.display_name = data['display_name']
|
||||
|
||||
db.session.commit()
|
||||
|
||||
return jsonify({
|
||||
'id': user.id,
|
||||
'github_id': user.github_id,
|
||||
'username': user.username,
|
||||
'display_name': user.display_name,
|
||||
'email': user.email,
|
||||
'role': user.role
|
||||
})
|
||||
|
||||
@bp.route('/users/<user_id>', methods=['DELETE'])
|
||||
@admin_required
|
||||
def delete_user(user_id):
|
||||
"""Delete a user (admin only)"""
|
||||
user = User.query.get_or_404(user_id)
|
||||
|
||||
# Check if user has active jobs
|
||||
active_jobs = PrintJob.query.filter_by(user_id=user.id, aborted=False).first()
|
||||
if active_jobs:
|
||||
return jsonify({'error': 'Cannot delete user with active jobs'}), 400
|
||||
|
||||
db.session.delete(user)
|
||||
db.session.commit()
|
||||
|
||||
return jsonify({'message': 'User deleted successfully'})
|
||||
|
||||
@bp.route('/me', methods=['GET'])
|
||||
@token_required
|
||||
def get_current_user():
|
||||
"""Get the current user's profile"""
|
||||
user = User.query.get(request.user_id)
|
||||
if not user:
|
||||
return jsonify({'error': 'User not found'}), 404
|
||||
|
||||
result = {
|
||||
'id': user.id,
|
||||
'github_id': user.github_id,
|
||||
'username': user.username,
|
||||
'display_name': user.display_name,
|
||||
'email': user.email,
|
||||
'role': user.role
|
||||
}
|
||||
|
||||
return jsonify(result)
|
||||
|
||||
@bp.route('/me', methods=['PUT'])
|
||||
@token_required
|
||||
def update_current_user():
|
||||
"""Update the current user's profile"""
|
||||
user = User.query.get(request.user_id)
|
||||
if not user:
|
||||
return jsonify({'error': 'User not found'}), 404
|
||||
|
||||
data = request.get_json() or {}
|
||||
|
||||
if 'display_name' in data:
|
||||
user.display_name = data['display_name']
|
||||
|
||||
db.session.commit()
|
||||
|
||||
result = {
|
||||
'id': user.id,
|
||||
'github_id': user.github_id,
|
||||
'username': user.username,
|
||||
'display_name': user.display_name,
|
||||
'email': user.email,
|
||||
'role': user.role
|
||||
}
|
||||
|
||||
return jsonify(result)
|
archiv/flask-backend/app/auth/__init__.py (new executable file, 5 lines added)
@@ -0,0 +1,5 @@
from flask import Blueprint

bp = Blueprint('auth', __name__)

from app.auth import routes
archiv/flask-backend/app/auth/routes.py (new executable file, 156 lines added)
@@ -0,0 +1,156 @@
|
||||
from flask import request, jsonify, current_app
|
||||
from app import db
|
||||
from app.auth import bp
|
||||
from app.models import User, Session
|
||||
from datetime import datetime, timedelta
|
||||
import time
|
||||
import functools
|
||||
import re
|
||||
|
||||
@bp.route('/register', methods=['POST'])
|
||||
def register():
|
||||
"""Register a new user"""
|
||||
data = request.get_json() or {}
|
||||
|
||||
# Validate required fields
|
||||
required_fields = ['username', 'email', 'password']
|
||||
for field in required_fields:
|
||||
if field not in data:
|
||||
return jsonify({'error': f'Missing required field: {field}'}), 400
|
||||
|
||||
# Validate email format
|
||||
email_regex = r'^[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}$'
|
||||
if not re.match(email_regex, data['email']):
|
||||
return jsonify({'error': 'Invalid email format'}), 400
|
||||
|
||||
# Validate password strength (at least 8 characters)
|
||||
if len(data['password']) < 8:
|
||||
return jsonify({'error': 'Password must be at least 8 characters long'}), 400
|
||||
|
||||
# Check if username already exists
|
||||
if User.query.filter_by(username=data['username']).first():
|
||||
return jsonify({'error': 'Username already exists'}), 400
|
||||
|
||||
# Check if email already exists
|
||||
if User.query.filter_by(email=data['email']).first():
|
||||
return jsonify({'error': 'Email already exists'}), 400
|
||||
|
||||
# Create new user
|
||||
user = User(
|
||||
username=data['username'],
|
||||
email=data['email'],
|
||||
display_name=data.get('display_name', data['username']),
|
||||
role='user' # Default role
|
||||
)
|
||||
user.set_password(data['password'])
|
||||
|
||||
db.session.add(user)
|
||||
db.session.commit()
|
||||
|
||||
return jsonify({
|
||||
'id': user.id,
|
||||
'username': user.username,
|
||||
'email': user.email,
|
||||
'display_name': user.display_name,
|
||||
'role': user.role
|
||||
}), 201
|
||||
|
||||
@bp.route('/login', methods=['POST'])
|
||||
def login():
|
||||
"""Login a user with username/email and password"""
|
||||
data = request.get_json() or {}
|
||||
|
||||
# Validate required fields
|
||||
if 'password' not in data:
|
||||
return jsonify({'error': 'Password is required'}), 400
|
||||
|
||||
if 'username' not in data and 'email' not in data:
|
||||
return jsonify({'error': 'Username or email is required'}), 400
|
||||
|
||||
# Find user by username or email
|
||||
user = None
|
||||
if 'username' in data:
|
||||
user = User.query.filter_by(username=data['username']).first()
|
||||
else:
|
||||
user = User.query.filter_by(email=data['email']).first()
|
||||
|
||||
# Check if user exists and verify password
|
||||
if not user or not user.check_password(data['password']):
|
||||
return jsonify({'error': 'Invalid credentials'}), 401
|
||||
|
||||
# Create a session for the user
|
||||
expires_at = int((datetime.utcnow() + timedelta(days=7)).timestamp())
|
||||
session = Session(
|
||||
user_id=user.id,
|
||||
expires_at=expires_at
|
||||
)
|
||||
db.session.add(session)
|
||||
db.session.commit()
|
||||
|
||||
# Generate JWT token
|
||||
token = user.generate_token()
|
||||
|
||||
return jsonify({
|
||||
'token': token,
|
||||
'user': {
|
||||
'id': user.id,
|
||||
'username': user.username,
|
||||
'email': user.email,
|
||||
'display_name': user.display_name,
|
||||
'role': user.role
|
||||
}
|
||||
})
|
||||
|
||||
@bp.route('/logout', methods=['POST'])
|
||||
def logout():
|
||||
"""Log out a user by invalidating their session"""
|
||||
auth_header = request.headers.get('Authorization')
|
||||
if not auth_header or not auth_header.startswith('Bearer '):
|
||||
return jsonify({'error': 'Authorization header required'}), 401
|
||||
|
||||
token = auth_header.split(' ')[1]
|
||||
payload = User.verify_token(token)
|
||||
if not payload:
|
||||
return jsonify({'error': 'Invalid token'}), 401
|
||||
|
||||
# Delete all sessions for this user
|
||||
Session.query.filter_by(user_id=payload['user_id']).delete()
|
||||
db.session.commit()
|
||||
|
||||
return jsonify({'message': 'Successfully logged out'})
|
||||
|
||||
def token_required(f):
|
||||
@functools.wraps(f)
|
||||
def decorated(*args, **kwargs):
|
||||
auth_header = request.headers.get('Authorization')
|
||||
if not auth_header or not auth_header.startswith('Bearer '):
|
||||
return jsonify({'error': 'Authorization header required'}), 401
|
||||
|
||||
token = auth_header.split(' ')[1]
|
||||
payload = User.verify_token(token)
|
||||
if not payload:
|
||||
return jsonify({'error': 'Invalid token'}), 401
|
||||
|
||||
# Check if user has an active session
|
||||
user_id = payload['user_id']
|
||||
current_time = int(time.time())
|
||||
session = Session.query.filter_by(user_id=user_id).filter(Session.expires_at > current_time).first()
|
||||
if not session:
|
||||
return jsonify({'error': 'No active session found'}), 401
|
||||
|
||||
# Add user to request context
|
||||
request.user_id = user_id
|
||||
request.user_role = payload['role']
|
||||
|
||||
return f(*args, **kwargs)
|
||||
return decorated
|
||||
|
||||
def admin_required(f):
|
||||
@functools.wraps(f)
|
||||
@token_required
|
||||
def decorated(*args, **kwargs):
|
||||
if request.user_role != 'admin':
|
||||
return jsonify({'error': 'Admin privileges required'}), 403
|
||||
|
||||
return f(*args, **kwargs)
|
||||
return decorated
|
archiv/flask-backend/app/models.py (new executable file, 124 lines added)
@@ -0,0 +1,124 @@
|
||||
from app import db
|
||||
import uuid
|
||||
from datetime import datetime, timedelta
|
||||
import jwt
|
||||
from config import Config
|
||||
import bcrypt
|
||||
|
||||
class User(db.Model):
|
||||
__tablename__ = 'user'
|
||||
|
||||
id = db.Column(db.String(36), primary_key=True, default=lambda: str(uuid.uuid4()))
|
||||
username = db.Column(db.String(64), index=True, unique=True, nullable=False)
|
||||
display_name = db.Column(db.String(120))
|
||||
email = db.Column(db.String(120), index=True, unique=True, nullable=False)
|
||||
password_hash = db.Column(db.String(128), nullable=False)
|
||||
role = db.Column(db.String(20), default='user')
|
||||
|
||||
print_jobs = db.relationship('PrintJob', backref='user', lazy='dynamic', cascade='all, delete-orphan')
|
||||
sessions = db.relationship('Session', backref='user', lazy='dynamic', cascade='all, delete-orphan')
|
||||
|
||||
def set_password(self, password):
|
||||
"""Hash and set the user's password"""
|
||||
password_bytes = password.encode('utf-8')
|
||||
salt = bcrypt.gensalt()
|
||||
self.password_hash = bcrypt.hashpw(password_bytes, salt).decode('utf-8')
|
||||
|
||||
def check_password(self, password):
|
||||
"""Check if the provided password matches the stored hash"""
|
||||
password_bytes = password.encode('utf-8')
|
||||
stored_hash = self.password_hash.encode('utf-8')
|
||||
return bcrypt.checkpw(password_bytes, stored_hash)
|
||||
|
||||
def generate_token(self):
|
||||
"""Generate a JWT token for this user"""
|
||||
payload = {
|
||||
'user_id': self.id,
|
||||
'username': self.username,
|
||||
'email': self.email,
|
||||
'role': self.role,
|
||||
'exp': datetime.utcnow() + timedelta(seconds=Config.JWT_ACCESS_TOKEN_EXPIRES)
|
||||
}
|
||||
return jwt.encode(payload, Config.JWT_SECRET, algorithm='HS256')
|
||||
|
||||
@staticmethod
|
||||
def verify_token(token):
|
||||
"""Verify and decode a JWT token"""
|
||||
try:
|
||||
payload = jwt.decode(token, Config.JWT_SECRET, algorithms=['HS256'])
|
||||
return payload
|
||||
except:
|
||||
return None
|
||||
|
||||
|
||||
class Session(db.Model):
|
||||
__tablename__ = 'session'
|
||||
|
||||
id = db.Column(db.String(36), primary_key=True, default=lambda: str(uuid.uuid4()))
|
||||
user_id = db.Column(db.String(36), db.ForeignKey('user.id', ondelete='CASCADE'), nullable=False)
|
||||
expires_at = db.Column(db.Integer, nullable=False)
|
||||
|
||||
|
||||
class Printer(db.Model):
|
||||
__tablename__ = 'printer'
|
||||
|
||||
id = db.Column(db.String(36), primary_key=True, default=lambda: str(uuid.uuid4()))
|
||||
name = db.Column(db.String(120), nullable=False)
|
||||
description = db.Column(db.Text, nullable=False)
|
||||
status = db.Column(db.Integer, nullable=False, default=0) # 0: OPERATIONAL, 1: OUT_OF_ORDER
|
||||
|
||||
print_jobs = db.relationship('PrintJob', backref='printer', lazy='dynamic', cascade='all, delete-orphan')
|
||||
|
||||
|
||||
class PrintJob(db.Model):
|
||||
__tablename__ = 'printJob'
|
||||
|
||||
id = db.Column(db.String(36), primary_key=True, default=lambda: str(uuid.uuid4()))
|
||||
printer_id = db.Column(db.String(36), db.ForeignKey('printer.id', ondelete='CASCADE'), nullable=False)
|
||||
user_id = db.Column(db.String(36), db.ForeignKey('user.id', ondelete='CASCADE'), nullable=False)
|
||||
start_at = db.Column(db.DateTime, nullable=False, default=datetime.utcnow)
|
||||
duration_in_minutes = db.Column(db.Integer, nullable=False)
|
||||
comments = db.Column(db.Text)
|
||||
aborted = db.Column(db.Boolean, nullable=False, default=False)
|
||||
abort_reason = db.Column(db.Text)
|
||||
|
||||
def get_end_time(self):
|
||||
return self.start_at + timedelta(minutes=self.duration_in_minutes)
|
||||
|
||||
def is_active(self):
|
||||
now = datetime.utcnow()
|
||||
return (not self.aborted and
|
||||
self.start_at <= now and
|
||||
now < self.get_end_time())
|
||||
|
||||
def get_remaining_time(self):
|
||||
if self.aborted:
|
||||
return 0
|
||||
|
||||
now = datetime.utcnow()
|
||||
if now < self.start_at:
|
||||
# Job hasn't started yet
|
||||
return self.duration_in_minutes * 60
|
||||
|
||||
end_time = self.get_end_time()
|
||||
if now >= end_time:
|
||||
# Job has ended
|
||||
return 0
|
||||
|
||||
# Job is ongoing
|
||||
remaining_seconds = (end_time - now).total_seconds()
|
||||
return int(remaining_seconds)
|
||||
|
||||
def to_dict(self):
|
||||
return {
|
||||
'id': self.id,
|
||||
'printer_id': self.printer_id,
|
||||
'user_id': self.user_id,
|
||||
'start_at': self.start_at.isoformat(),
|
||||
'duration_in_minutes': self.duration_in_minutes,
|
||||
'comments': self.comments,
|
||||
'aborted': self.aborted,
|
||||
'abort_reason': self.abort_reason,
|
||||
'remaining_time': self.get_remaining_time(),
|
||||
'is_active': self.is_active()
|
||||
}
|
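A standalone illustration of the time arithmetic used by `PrintJob.get_end_time` and `get_remaining_time` above, rewritten with plain datetimes so it can be run without the database models:

```python
from datetime import datetime, timedelta


def remaining_seconds(start_at: datetime, duration_in_minutes: int, now: datetime) -> int:
    """Mirror of PrintJob.get_remaining_time for a non-aborted job."""
    end_time = start_at + timedelta(minutes=duration_in_minutes)
    if now < start_at:
        return duration_in_minutes * 60  # job has not started yet
    if now >= end_time:
        return 0  # job has ended
    return int((end_time - now).total_seconds())


# A 60-minute job that started 20 minutes ago has 40 minutes (2400 seconds) left.
start = datetime(2024, 5, 27, 12, 0)
print(remaining_seconds(start, 60, start + timedelta(minutes=20)))  # -> 2400
```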
archiv/flask-backend/config.py (new executable file, 13 lines added)
@@ -0,0 +1,13 @@
import os
from dotenv import load_dotenv

basedir = os.path.abspath(os.path.dirname(__file__))
load_dotenv(os.path.join(basedir, '.env'))

class Config:
    SECRET_KEY = os.environ.get('SECRET_KEY') or 'you-will-never-guess'
    SQLALCHEMY_DATABASE_URI = os.environ.get('DATABASE_URL') or \
        'sqlite:///' + os.path.join(basedir, 'app.db')
    SQLALCHEMY_TRACK_MODIFICATIONS = False
    JWT_SECRET = os.environ.get('JWT_SECRET') or 'jwt-secret-key'
    JWT_ACCESS_TOKEN_EXPIRES = 3600  # 1 hour in seconds
archiv/flask-backend/docker-compose.yml (new executable file, 20 lines added)
@@ -0,0 +1,20 @@
version: '3.8'

services:
  flask-backend:
    build:
      context: .
      dockerfile: Dockerfile
    restart: always
    ports:
      - "5000:5000"
    environment:
      - SECRET_KEY=your-secret-key
      - DATABASE_URL=sqlite:///app.db
      - JWT_SECRET=your-jwt-secret
    volumes:
      - ./instance:/app/instance
    command: >
      bash -c "python -m flask db upgrade &&
               python scripts/init_db.py &&
               gunicorn --bind 0.0.0.0:5000 wsgi:app"
archiv/flask-backend/migrations/alembic.ini (new executable file, 89 lines added)
@@ -0,0 +1,89 @@
|
||||
# A generic, single database configuration.
|
||||
|
||||
[alembic]
|
||||
# path to migration scripts
|
||||
script_location = migrations
|
||||
|
||||
# template used to generate migration files
|
||||
file_template = %%(year)d%%(month).2d%%(day).2d_%%(hour).2d%%(minute).2d%%(second).2d_%%(slug)s
|
||||
|
||||
# sys.path path, will be prepended to sys.path if present.
|
||||
# defaults to the current working directory.
|
||||
prepend_sys_path = .
|
||||
|
||||
# timezone to use when rendering the date
|
||||
# within the migration file as well as the filename.
|
||||
# string value is passed to dateutil.tz.gettz()
|
||||
# leave blank for localtime
|
||||
# timezone =
|
||||
|
||||
# max length of characters to apply to the
|
||||
# "slug" field
|
||||
truncate_slug_length = 40
|
||||
|
||||
# set to 'true' to run the environment during
|
||||
# the 'revision' command, regardless of autogenerate
|
||||
# revision_environment = false
|
||||
|
||||
# set to 'true' to allow .pyc and .pyo files without
|
||||
# a source .py file to be detected as revisions in the
|
||||
# versions/ directory
|
||||
# sourceless = false
|
||||
|
||||
# version location specification; this defaults
|
||||
# to migrations/versions. When using multiple version
|
||||
# directories, initial revisions must be specified with --version-path
|
||||
# version_locations = %(here)s/bar %(here)s/bat migrations/versions
|
||||
|
||||
# the output encoding used when revision files
|
||||
# are written from script.py.mako
|
||||
# output_encoding = utf-8
|
||||
|
||||
sqlalchemy.url = driver://user:pass@localhost/dbname
|
||||
|
||||
|
||||
[post_write_hooks]
|
||||
# post_write_hooks defines scripts or Python functions that are run
|
||||
# on newly generated revision scripts. See the documentation for further
|
||||
# detail and examples
|
||||
|
||||
# format using "black" - use the console_scripts runner, against the "black" entrypoint
|
||||
# hooks = black
|
||||
# black.type = console_scripts
|
||||
# black.entrypoint = black
|
||||
# black.options = -l 79 REVISION_SCRIPT_FILENAME
|
||||
|
||||
# Logging configuration
|
||||
[loggers]
|
||||
keys = root,sqlalchemy,alembic
|
||||
|
||||
[handlers]
|
||||
keys = console
|
||||
|
||||
[formatters]
|
||||
keys = generic
|
||||
|
||||
[logger_root]
|
||||
level = WARN
|
||||
handlers = console
|
||||
qualname =
|
||||
|
||||
[logger_sqlalchemy]
|
||||
level = WARN
|
||||
handlers =
|
||||
qualname = sqlalchemy.engine
|
||||
|
||||
[logger_alembic]
|
||||
level = INFO
|
||||
handlers =
|
||||
qualname = alembic
|
||||
|
||||
[handler_console]
|
||||
class = StreamHandler
|
||||
args = (sys.stderr,)
|
||||
level = NOTSET
|
||||
formatter = generic
|
||||
|
||||
[formatter_generic]
|
||||
format = %(levelname)-5.5s [%(name)s] %(message)s
|
||||
datefmt = %H:%M:%S
|
archiv/flask-backend/migrations/env.py | 91 lines | Executable file
@@ -0,0 +1,91 @@
|
||||
from __future__ import with_statement
|
||||
|
||||
import logging
|
||||
from logging.config import fileConfig
|
||||
|
||||
from flask import current_app
|
||||
|
||||
from alembic import context
|
||||
|
||||
# this is the Alembic Config object, which provides
|
||||
# access to the values within the .ini file in use.
|
||||
config = context.config
|
||||
|
||||
# Interpret the config file for Python logging.
|
||||
# This line sets up loggers basically.
|
||||
fileConfig(config.config_file_name)
|
||||
logger = logging.getLogger('alembic.env')
|
||||
|
||||
# add your model's MetaData object here
|
||||
# for 'autogenerate' support
|
||||
# from myapp import mymodel
|
||||
# target_metadata = mymodel.Base.metadata
|
||||
config.set_main_option(
|
||||
'sqlalchemy.url',
|
||||
str(current_app.extensions['migrate'].db.get_engine().url).replace(
|
||||
'%', '%%'))
|
||||
target_metadata = current_app.extensions['migrate'].db.metadata
|
||||
|
||||
# other values from the config, defined by the needs of env.py,
|
||||
# can be acquired:
|
||||
# my_important_option = config.get_main_option("my_important_option")
|
||||
# ... etc.
|
||||
|
||||
|
||||
def run_migrations_offline():
|
||||
"""Run migrations in 'offline' mode.
|
||||
|
||||
This configures the context with just a URL
|
||||
and not an Engine, though an Engine is acceptable
|
||||
here as well. By skipping the Engine creation
|
||||
we don't even need a DBAPI to be available.
|
||||
|
||||
Calls to context.execute() here emit the given string to the
|
||||
script output.
|
||||
|
||||
"""
|
||||
url = config.get_main_option("sqlalchemy.url")
|
||||
context.configure(
|
||||
url=url, target_metadata=target_metadata, literal_binds=True
|
||||
)
|
||||
|
||||
with context.begin_transaction():
|
||||
context.run_migrations()
|
||||
|
||||
|
||||
def run_migrations_online():
|
||||
"""Run migrations in 'online' mode.
|
||||
|
||||
In this scenario we need to create an Engine
|
||||
and associate a connection with the context.
|
||||
|
||||
"""
|
||||
|
||||
# this callback is used to prevent an auto-migration from being generated
|
||||
# when there are no changes to the schema
|
||||
# reference: http://alembic.zzzcomputing.com/en/latest/cookbook.html
|
||||
def process_revision_directives(context, revision, directives):
|
||||
if getattr(config.cmd_opts, 'autogenerate', False):
|
||||
script = directives[0]
|
||||
if script.upgrade_ops.is_empty():
|
||||
directives[:] = []
|
||||
logger.info('No changes in schema detected.')
|
||||
|
||||
connectable = current_app.extensions['migrate'].db.get_engine()
|
||||
|
||||
with connectable.connect() as connection:
|
||||
context.configure(
|
||||
connection=connection,
|
||||
target_metadata=target_metadata,
|
||||
process_revision_directives=process_revision_directives,
|
||||
**current_app.extensions['migrate'].configure_args
|
||||
)
|
||||
|
||||
with context.begin_transaction():
|
||||
context.run_migrations()
|
||||
|
||||
|
||||
if context.is_offline_mode():
|
||||
run_migrations_offline()
|
||||
else:
|
||||
run_migrations_online()
|
archiv/flask-backend/migrations/script.py.mako | 24 lines | Executable file
@@ -0,0 +1,24 @@
"""${message}

Revision ID: ${up_revision}
Revises: ${down_revision | comma,n}
Create Date: ${create_date}

"""
from alembic import op
import sqlalchemy as sa
${imports if imports else ""}

# revision identifiers, used by Alembic.
revision = ${repr(up_revision)}
down_revision = ${repr(down_revision)}
branch_labels = ${repr(branch_labels)}
depends_on = ${repr(depends_on)}


def upgrade():
    ${upgrades if upgrades else "pass"}


def downgrade():
    ${downgrades if downgrades else "pass"}
archiv/flask-backend/migrations/versions/initial_migration.py | 75 lines | Executable file
@@ -0,0 +1,75 @@
|
||||
"""Initial migration
|
||||
|
||||
Revision ID: initial_migration
|
||||
Revises:
|
||||
Create Date: 2025-03-06 12:00:00.000000
|
||||
|
||||
"""
|
||||
from alembic import op
|
||||
import sqlalchemy as sa
|
||||
|
||||
|
||||
# revision identifiers, used by Alembic.
|
||||
revision = 'initial_migration'
|
||||
down_revision = None
|
||||
branch_labels = None
|
||||
depends_on = None
|
||||
|
||||
|
||||
def upgrade():
|
||||
# Create user table
|
||||
op.create_table('user',
|
||||
sa.Column('id', sa.String(length=36), nullable=False),
|
||||
sa.Column('username', sa.String(length=64), nullable=False),
|
||||
sa.Column('display_name', sa.String(length=120), nullable=True),
|
||||
sa.Column('email', sa.String(length=120), nullable=False),
|
||||
sa.Column('password_hash', sa.String(length=128), nullable=False),
|
||||
sa.Column('role', sa.String(length=20), nullable=True),
|
||||
sa.PrimaryKeyConstraint('id'),
|
||||
sa.UniqueConstraint('email'),
|
||||
sa.UniqueConstraint('username')
|
||||
)
|
||||
op.create_index(op.f('ix_user_email'), 'user', ['email'], unique=True)
|
||||
op.create_index(op.f('ix_user_username'), 'user', ['username'], unique=True)
|
||||
|
||||
# Create session table
|
||||
op.create_table('session',
|
||||
sa.Column('id', sa.String(length=36), nullable=False),
|
||||
sa.Column('user_id', sa.String(length=36), nullable=False),
|
||||
sa.Column('expires_at', sa.Integer(), nullable=False),
|
||||
sa.ForeignKeyConstraint(['user_id'], ['user.id'], ondelete='CASCADE'),
|
||||
sa.PrimaryKeyConstraint('id')
|
||||
)
|
||||
|
||||
# Create printer table
|
||||
op.create_table('printer',
|
||||
sa.Column('id', sa.String(length=36), nullable=False),
|
||||
sa.Column('name', sa.String(length=120), nullable=False),
|
||||
sa.Column('description', sa.Text(), nullable=False),
|
||||
sa.Column('status', sa.Integer(), nullable=False),
|
||||
sa.PrimaryKeyConstraint('id')
|
||||
)
|
||||
|
||||
# Create printJob table
|
||||
op.create_table('printJob',
|
||||
sa.Column('id', sa.String(length=36), nullable=False),
|
||||
sa.Column('printer_id', sa.String(length=36), nullable=False),
|
||||
sa.Column('user_id', sa.String(length=36), nullable=False),
|
||||
sa.Column('start_at', sa.DateTime(), nullable=False),
|
||||
sa.Column('duration_in_minutes', sa.Integer(), nullable=False),
|
||||
sa.Column('comments', sa.Text(), nullable=True),
|
||||
sa.Column('aborted', sa.Boolean(), nullable=False),
|
||||
sa.Column('abort_reason', sa.Text(), nullable=True),
|
||||
sa.ForeignKeyConstraint(['printer_id'], ['printer.id'], ondelete='CASCADE'),
|
||||
sa.ForeignKeyConstraint(['user_id'], ['user.id'], ondelete='CASCADE'),
|
||||
sa.PrimaryKeyConstraint('id')
|
||||
)
|
||||
|
||||
|
||||
def downgrade():
|
||||
op.drop_table('printJob')
|
||||
op.drop_table('printer')
|
||||
op.drop_table('session')
|
||||
op.drop_index(op.f('ix_user_username'), table_name='user')
|
||||
op.drop_index(op.f('ix_user_email'), table_name='user')
|
||||
op.drop_table('user')
|
archiv/flask-backend/requirements.txt | 9 lines | Executable file
@@ -0,0 +1,9 @@
Flask==2.3.3
Flask-SQLAlchemy==3.1.1
Flask-Migrate==4.0.5
Flask-CORS==4.0.0
python-dotenv==1.0.0
SQLAlchemy==2.0.25
pyjwt==2.8.0
bcrypt==4.1.2
gunicorn==21.2.0
archiv/flask-backend/run.sh | 23 lines | Executable file
@@ -0,0 +1,23 @@
#!/bin/bash

# Initialize virtual environment if it doesn't exist
if [ ! -d "venv" ]; then
    echo "Creating virtual environment..."
    python3 -m venv venv
fi

# Activate virtual environment
source venv/bin/activate

# Install dependencies
echo "Installing dependencies..."
pip install -r requirements.txt

# Initialize database
echo "Initializing database..."
flask db upgrade
python scripts/init_db.py

# Run the application
echo "Starting Flask application..."
python wsgi.py
archiv/flask-backend/scripts/init_db.py | 55 lines | Executable file
@@ -0,0 +1,55 @@
#!/usr/bin/env python
from app import create_app, db
from app.models import User, Printer
import uuid

def init_db():
    app = create_app()
    with app.app_context():
        # Create tables
        db.create_all()

        # Check if we already have an admin user
        admin = User.query.filter_by(role='admin').first()
        if not admin:
            # Create admin user
            admin = User(
                id=str(uuid.uuid4()),
                username='admin',
                display_name='Administrator',
                email='admin@example.com',
                role='admin'
            )
            admin.set_password('admin123') # Default password, change in production!
            db.session.add(admin)
            print("Created admin user with username 'admin' and password 'admin123'")

        # Check if we have any printers
        printer_count = Printer.query.count()
        if printer_count == 0:
            # Create sample printers
            printers = [
                Printer(
                    name='Printer 1',
                    description='3D Printer for general use',
                    status=0 # OPERATIONAL
                ),
                Printer(
                    name='Printer 2',
                    description='High resolution printer for detailed work',
                    status=0 # OPERATIONAL
                ),
                Printer(
                    name='Printer 3',
                    description='Large format printer for big projects',
                    status=0 # OPERATIONAL
                )
            ]
            db.session.add_all(printers)
            print("Created sample printers")

        db.session.commit()
        print("Database initialized successfully!")

if __name__ == '__main__':
    init_db()
archiv/flask-backend/wsgi.py | 6 lines | Executable file
@@ -0,0 +1,6 @@
from app import create_app

app = create_app()

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=5000, debug=True)
backend/.env | 5 lines | Normal file
@@ -0,0 +1,5 @@
SECRET_KEY=7445630171969DFAC92C53CEC92E67A9CB2E00B3CB2F
DATABASE_PATH=instance/myp.db
TAPO_USERNAME=till.tomczak@mercedes-benz.com
TAPO_PASSWORD=744563017196A
PRINTERS={"Printer 1": {"ip": "192.168.0.100"}, "Printer 2": {"ip": "192.168.0.101"}, "Printer 3": {"ip": "192.168.0.102"}, "Printer 4": {"ip": "192.168.0.103"}, "Printer 5": {"ip": "192.168.0.104"}, "Printer 6": {"ip": "192.168.0.106"}}
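The `PRINTERS` variable packs the printer-to-smart-plug mapping into one JSON string. A hedged sketch of how the backend could turn it into a lookup table (the variable names come from the `.env` above, the parsing snippet itself is an assumption):

```python
# Illustrative: read the PRINTERS mapping from the environment.
import json
import os

from dotenv import load_dotenv

load_dotenv()  # reads backend/.env
PRINTERS = json.loads(os.environ.get("PRINTERS", "{}"))
print(PRINTERS["Printer 1"]["ip"])  # -> "192.168.0.100"
```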
backend/.gitignore | 48 lines | vendored | Executable file
@@ -0,0 +1,48 @@
# Python
__pycache__/
*.py[cod]
*$py.class
*.so
.Python
env/
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
*.egg-info/
.installed.cfg
*.egg

# Flask
instance/
.webassets-cache

# Logs
logs/
*.log

# SQLite Datenbank-Dateien
*.db
*.db-journal

# Virtuelle Umgebungen
venv/
ENV/

# IDE
.idea/
.vscode/
*.swp
*.swo

# Betriebssystem
.DS_Store
Thumbs.db
backend/Dockerfile | 52 lines | Executable file
@@ -0,0 +1,52 @@
FROM python:3.11-slim

WORKDIR /app

# Install system dependencies (curl, sqlite3 for database, wget for healthcheck)
RUN apt-get update && apt-get install -y \
    curl \
    sqlite3 \
    wget \
    && rm -rf /var/lib/apt/lists/*

COPY requirements.txt .

# Install Python dependencies
RUN pip install --no-cache-dir -r requirements.txt

# Copy application code
COPY . .

# Create required directories
RUN mkdir -p logs instance

ENV FLASK_APP=app.py
ENV PYTHONUNBUFFERED=1

# Add health check endpoint
RUN echo 'from flask import Blueprint\n\
health_bp = Blueprint("health", __name__)\n\
\n\
@health_bp.route("/health")\n\
def health_check():\n\
    return {"status": "healthy"}, 200\n'\
    > /app/health.py

# Add the health blueprint to app.py if it doesn't exist
RUN grep -q "health_bp" app.py || sed -i '/from flask import/a from health import health_bp' app.py
RUN grep -q "app.register_blueprint(health_bp)" app.py || sed -i '/app = Flask/a app.register_blueprint(health_bp)' app.py

EXPOSE 5000

# Add startup script to initialize database if needed
RUN echo '#!/bin/bash\n\
if [ ! -f "instance/myp.db" ] || [ ! -s "instance/myp.db" ]; then\n\
    echo "Initializing database..."\n\
    python -c "from app import init_db; init_db()"\n\
fi\n\
\n\
echo "Starting gunicorn server..."\n\
gunicorn --bind 0.0.0.0:5000 app:app\n'\
    > /app/start.sh && chmod +x /app/start.sh

CMD ["/app/start.sh"]
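The Dockerfile writes `/app/health.py` by echoing Python source and then patches `app.py` with `sed`. For readability, here is the same blueprint spelled out as a normal module (contents taken from the echoed snippet, nothing new added):

```python
# health.py -- equivalent of the snippet the Dockerfile generates via `RUN echo`.
from flask import Blueprint

health_bp = Blueprint("health", __name__)


@health_bp.route("/health")
def health_check():
    # Target of the docker-compose healthcheck (wget --spider http://localhost:5000/health).
    return {"status": "healthy"}, 200
```

Committing a module like this and registering it explicitly in `app.py` would arguably be less fragile than the build-time `sed` patching.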
backend/app.py | 1719 lines | Executable file
(File diff suppressed because it is too large)
backend/development/crontab-example | 8 lines | Normal file
@@ -0,0 +1,8 @@
# MYP Backend Cron-Jobs
# Installiere mit: crontab crontab-example

# Prüfe alle 5 Minuten auf abgelaufene Reservierungen und schalte Steckdosen aus
*/5 * * * * cd /pfad/zum/projektarbeit-myp/backend && /pfad/zur/venv/bin/flask check-jobs >> /pfad/zum/projektarbeit-myp/backend/logs/cron.log 2>&1

# Tägliche Sicherung der Datenbank um 3:00 Uhr
0 3 * * * cd /pfad/zum/projektarbeit-myp/backend && cp instance/myp.db instance/backups/myp-$(date +\%Y\%m\%d).db
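The first cron entry calls `flask check-jobs`. A hedged sketch of how such a command is usually registered with Flask's CLI so that cron can invoke it; the command name is taken from the crontab, the body is purely illustrative:

```python
# Illustrative registration of the `flask check-jobs` CLI command in app.py.
import click
from flask import Flask

app = Flask(__name__)


@app.cli.command("check-jobs")
def check_jobs():
    """Switch off smart plugs whose reservations have expired (illustrative body)."""
    click.echo("Checking for expired reservations...")
    # ... query expired jobs here and turn the corresponding plugs off ...
```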
backend/development/initialize_myp_database.sh | 84 lines | Normal file
@@ -0,0 +1,84 @@
|
||||
#!/bin/bash
|
||||
|
||||
# MYP Datenbank Initialisierungs-Skript
|
||||
# Dieses Skript erstellt die erforderlichen Datenbanktabellen für das MYP Backend
|
||||
|
||||
echo "=== MYP Datenbank Initialisierung ==="
|
||||
echo ""
|
||||
|
||||
# Prüfe, ob sqlite3 installiert ist
|
||||
if ! command -v sqlite3 &> /dev/null; then
|
||||
echo "FEHLER: sqlite3 ist nicht installiert."
|
||||
echo "Bitte installiere sqlite3 mit deinem Paketmanager, z.B. 'apt install sqlite3'"
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# Erstelle Instance-Ordner, falls nicht vorhanden
|
||||
echo "Erstelle instance-Ordner, falls nicht vorhanden..."
|
||||
mkdir -p instance/backups
|
||||
|
||||
# Prüfen, ob die Datenbank bereits existiert
|
||||
if [ -f "instance/myp.db" ]; then
|
||||
echo "Datenbank existiert bereits."
|
||||
echo "Erstelle Backup in instance/backups..."
|
||||
cp instance/myp.db "instance/backups/myp_$(date '+%Y%m%d_%H%M%S').db"
|
||||
fi
|
||||
|
||||
# Erstelle die Datenbank und ihre Tabellen
|
||||
echo "Erstelle neue Datenbank..."
|
||||
sqlite3 instance/myp.db <<EOF
|
||||
PRAGMA foreign_keys = ON;
|
||||
|
||||
CREATE TABLE IF NOT EXISTS user (
|
||||
id TEXT PRIMARY KEY,
|
||||
username TEXT UNIQUE NOT NULL,
|
||||
password_hash TEXT NOT NULL,
|
||||
display_name TEXT,
|
||||
email TEXT UNIQUE,
|
||||
role TEXT DEFAULT 'user'
|
||||
);
|
||||
|
||||
CREATE TABLE IF NOT EXISTS session (
|
||||
id TEXT PRIMARY KEY,
|
||||
user_id TEXT NOT NULL,
|
||||
expires_at TIMESTAMP NOT NULL,
|
||||
FOREIGN KEY (user_id) REFERENCES user (id) ON DELETE CASCADE
|
||||
);
|
||||
|
||||
CREATE TABLE IF NOT EXISTS socket (
|
||||
id TEXT PRIMARY KEY,
|
||||
name TEXT NOT NULL,
|
||||
description TEXT NOT NULL,
|
||||
status INTEGER DEFAULT 0,
|
||||
ip_address TEXT
|
||||
);
|
||||
|
||||
CREATE TABLE IF NOT EXISTS job (
|
||||
id TEXT PRIMARY KEY,
|
||||
socket_id TEXT NOT NULL,
|
||||
user_id TEXT NOT NULL,
|
||||
start_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
|
||||
duration_in_minutes INTEGER NOT NULL,
|
||||
comments TEXT,
|
||||
aborted INTEGER DEFAULT 0,
|
||||
abort_reason TEXT,
|
||||
FOREIGN KEY (socket_id) REFERENCES socket (id) ON DELETE CASCADE,
|
||||
FOREIGN KEY (user_id) REFERENCES user (id) ON DELETE CASCADE
|
||||
);
|
||||
EOF
|
||||
|
||||
# Setze Berechtigungen für die Datenbankdatei
|
||||
chmod 644 instance/myp.db
|
||||
|
||||
echo ""
|
||||
echo "=== Datenbank-Initialisierung abgeschlossen ==="
|
||||
echo ""
|
||||
echo "Du kannst jetzt einen Admin-Benutzer über die Web-Oberfläche registrieren."
|
||||
echo "Der erste registrierte Benutzer wird automatisch zum Admin."
|
||||
echo ""
|
||||
echo "Starte den Server mit:"
|
||||
echo "python app.py"
|
||||
echo ""
|
||||
echo "Alternativ kannst du einen Admin-Benutzer über die API erstellen mit:"
|
||||
echo "curl -X POST http://localhost:5000/api/create-initial-admin -H \"Content-Type: application/json\" -d '{\"username\":\"admin\",\"password\":\"password\",\"displayName\":\"Administrator\"}'"
|
||||
echo ""
|
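The script's closing hint shows a `curl` call against `/api/create-initial-admin`. For completeness, the same request expressed with `requests` (endpoint and payload taken from the hint, host and credentials are placeholders):

```python
# Illustrative equivalent of the curl command printed by the script.
import requests

resp = requests.post(
    "http://localhost:5000/api/create-initial-admin",
    json={"username": "admin", "password": "password", "displayName": "Administrator"},
    timeout=10,
)
print(resp.status_code, resp.json())
```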
backend/development/tests/api-test.drucker.py | 95 lines | Normal file
@@ -0,0 +1,95 @@
|
||||
import requests
|
||||
import json
|
||||
|
||||
# Basis-URL inkl. Token
|
||||
url = "http://192.168.0.102:80/app?token=9DFAC92C53CEC92E67A9CB2E00B3CB2F"
|
||||
|
||||
# HTTP-Header wie in der Originalanfrage
|
||||
headers = {
|
||||
"Referer": "http://192.168.0.102:80",
|
||||
"Accept": "application/json",
|
||||
"requestByApp": "true",
|
||||
"Content-Type": "application/json; charset=UTF-8",
|
||||
"Host": "192.168.0.102",
|
||||
"Connection": "Keep-Alive",
|
||||
"Accept-Encoding": "gzip",
|
||||
"User-Agent": "okhttp/3.14.9"
|
||||
}
|
||||
|
||||
# Liste der Payloads (als Python-Dictionaries)
|
||||
payloads = [
|
||||
{
|
||||
"method": "securePassthrough",
|
||||
"params": {
|
||||
"request": (
|
||||
"ZC4CHp6bbfBO1rtmuH6I+TStBIiFRfQpayYPwet5NBmL35dib5xXHeEeLM7c0OSQSyxO6fnbXrC1\n"
|
||||
"gXdfowwwq4Fum9ispgt8yT7cgbDcqnoVrhxEtHIDfuwLh8YAGmDSfTMo/JlsGspWPYMKd1EWXtb5\n"
|
||||
"gP9FA9LHnV2kxKsNSPQ=\n"
|
||||
)
|
||||
}
|
||||
},
|
||||
{
|
||||
"method": "securePassthrough",
|
||||
"params": {
|
||||
"request": (
|
||||
"k111EbfCcfVzAouNbu1vyos9Ltsg+a97n4xUUQMviQVJfhqxvKOhv1SrvEk2LvpD0LwNVUNPZdwU\n"
|
||||
"6pH5E/NOwdc1WzTPeqHiY760GpUuqn0tToHEHEyO2HaSKdrAYnw2gN410bvHb0pM3gYWS43eOA==\n"
|
||||
)
|
||||
}
|
||||
},
|
||||
{
|
||||
"method": "securePassthrough",
|
||||
"params": {
|
||||
"request": (
|
||||
"7/uYVDwyNfFhg9y7rHyp+4AGKBYQPyaBN6cFMl9j4ER/JpJTcGBdaUteSmx8P8Fkz+b2kkNLjYa2\n"
|
||||
"wQr2gA3m6vEq9jpnAF2V3fv9c4Yg9gja9MlTIZqM6EdMi7YbfbhLme34Bh8kMcohDR3u1F4DwFDz\n"
|
||||
"hNZPckf/CegbY9KGFeGwT4rWyX3BTk9+FE7ldtJn\n"
|
||||
)
|
||||
}
|
||||
},
|
||||
{
|
||||
"method": "securePassthrough",
|
||||
"params": {
|
||||
"request": (
|
||||
"EjWZb+YYS9tihgLdX4x+Wwx7q+e5X/ZHicr4jOnYmpFToDANzpm5ZpzD49BITcTCdQMOHlJBis85\n"
|
||||
"9GX6Hv8j66OITyH0XmfG9dQo2tgIykyagCZIofr/BpAWYX4aRaOkU4z14mVa2XpDtHJQjc+pXYkh\n"
|
||||
"JuWvLE+h01U5RoyPtvE=\n"
|
||||
)
|
||||
}
|
||||
},
|
||||
{
|
||||
"method": "securePassthrough",
|
||||
"params": {
|
||||
"request": (
|
||||
"OwyTsm5HdB/ReJMhVRrkjnV0NLTanw6iXOxVrDDexT456edWuwKiBOsZUyBHmUyJKgiPQzOXqyWWi220bX8IjLX4q8YNgPwRlj+7nRbfzpC/I57wBZBTWIt626pSdIH0vpiuPq84KMfPD5BB2p78/LjsqlzyeLGYzkSsGRBMT8TnLMDFzZE864nfDUZ9muH2kk8NRMN9l6xoCXBJqGA9q8XxIWRTpsl0kTx52kUszY69hYlfFSrrCDIls1ykul14/T1NtOVF8KOgiwaSGOZf7L4QlbhYvRj9kkVVkrxhlwt8jtMqfJKEqq+CIPh3Mp4440WYMLRo6VNIEJ3pWjplkJmc+htnYC4FwVgT7mHZ8eeGGKBvsJz+78gTaHnGBnwZ26I8UdFparyp6QXpOhK9zFmGVh0yapiTHo6jOOI+4Q3Ru+aPnidX/ZASPmR7CZO70CUpvv9zIKJnrAaoTMmH7A6+kcmCRLgLFaTaM+4DFmiz6JGP+4W7MmVPJxxvn0IFlo1P/xwNDuL3T6GLUIEVNk89JG5roBm7AdchUZJO38dGZ0eFiiTK/NhKPvjj+fk9A4FGh7EDshXZhL2u50cdLcdUtcP/CAMDjgWlMm4Kk3vxMQO+UGE+jsB7NkaulmTW1jcl+PSnAE5P71oqVVQ0ng==\n"
|
||||
)
|
||||
}
|
||||
},
|
||||
{
|
||||
"method": "securePassthrough",
|
||||
"params": {
|
||||
"request": (
|
||||
"7/uYVDwyNfFhg9y7rHyp+4AGKBYQPyaBN6cFMl9j4ER/JpJTcGBdaUteSmx8P8FkURmv/LWV1FpO\n"
|
||||
"M3RWvsiC5UAsei2G+vwTVuQpOPjKKAx+qwftr9Qs2mSkPNjNLpWHK68EZkIw+h04TQkt0Q99Dirg\n"
|
||||
"0BcrPgHTVKjiK8mdZ6w6gcld/h/FOKYMqJrP0Z+2\n"
|
||||
)
|
||||
}
|
||||
},
|
||||
{
|
||||
"method": "securePassthrough",
|
||||
"params": {
|
||||
"request": (
|
||||
"ZE/+XlUmTA9D3DFfp4x3xhS3vdsQ+60tz4TOodtZDby/4DPoqk9EBvJZ1JtUCr5c0AHuv/sfwcvN\n"
|
||||
"Vx1zJP9RkltrAKVTWoaESAeewLozpXt/x0s/jkYC1rh7eTrxm+nYTZ5LJgNtcQq8yJxhEPez1w==\n"
|
||||
)
|
||||
}
|
||||
}
|
||||
]
|
||||
|
||||
# Sende die Payloads sequenziell per POST-Anfrage
|
||||
for idx, payload in enumerate(payloads, start=1):
|
||||
response = requests.post(url, headers=headers, data=json.dumps(payload))
|
||||
print(f"Anfrage {idx}:")
|
||||
print("Status Code:", response.status_code)
|
||||
print("Response Text:", response.text)
|
||||
print("-" * 60)
|
backend/development/tests/capture.pcap | BIN | Normal file
(Binary file not shown.)
backend/development/tests/handshake.py | 128 lines | Normal file
@@ -0,0 +1,128 @@
|
||||
import requests
|
||||
import json
|
||||
|
||||
# Constants from the Wireshark capture
|
||||
PUBLIC_KEY = """-----BEGIN PUBLIC KEY-----
|
||||
MIGfMA0GCSqGSIb3DQEBAQUAA4GNADCBiQKBgQCMl89OZsjqE8yZ9TQhUb9h539WTX3U8Y5YCNdp
|
||||
OhuXvLFYcAT5mvC074VFROmD0xhvw5hrwESOisqpPPU9r78JpLuYUKd+/aidvykqBT8OW5rDLb6d
|
||||
O9FO6Gc+bV8L8ttHVlBFoX69EqiRhcreGPG6FQz4JqGJF4T1nFi0EvALXwIDAQAB
|
||||
-----END PUBLIC KEY-----"""
|
||||
|
||||
# Vorbereitete verschlüsselte Befehle (aus Wireshark extrahiert)
|
||||
COMMAND_ON = """ps0Puxc37EK4PhfcevceL3lyyDrjwLT1+443DDXNbcNRsltlgCQ6+oXgsrE2Pl5OhV73ZI/oM5Nj
|
||||
37cWEaHpXPiHdr1W0cD3aJ5qJ55TfTRkHP9xcMNQJHCn6aWPEHpR7xvvXW9WbJWfShnE2Xdvmw==
|
||||
"""
|
||||
|
||||
COMMAND_OFF = """FlO5i3DRcrUmu2ZwIIv8b68EisGu8VCuqfGOydaR+xCA0n3f2W/EcqVj8MurRBFXYTrZ/uwa1W26
|
||||
ftCfvhdXNebBRwHr9Rj3id4bVfltJ8eT5/R3xY8kputklW2mrw9UfdISzAJqOPp9KZcU4K9p8g==
|
||||
"""
|
||||
|
||||
class TapoP115Controller:
|
||||
def __init__(self, device_ip):
|
||||
self.device_ip = device_ip
|
||||
self.session_id = None
|
||||
self.token = None
|
||||
|
||||
def perform_handshake(self):
|
||||
"""Führt den ersten Handshake durch und speichert die Session-ID"""
|
||||
handshake_data = {
|
||||
"method": "handshake",
|
||||
"params": {
|
||||
"key": PUBLIC_KEY
|
||||
},
|
||||
"requestTimeMils": 0
|
||||
}
|
||||
|
||||
headers = {
|
||||
"Referer": f"http://{self.device_ip}:80",
|
||||
"Accept": "application/json",
|
||||
"requestByApp": "true",
|
||||
"Content-Type": "application/json; charset=UTF-8"
|
||||
}
|
||||
|
||||
response = requests.post(
|
||||
f"http://{self.device_ip}/app",
|
||||
json=handshake_data,
|
||||
headers=headers
|
||||
)
|
||||
|
||||
if response.status_code == 200:
|
||||
data = response.json()
|
||||
if data["error_code"] == 0:
|
||||
# Session-ID aus dem Cookie extrahieren
|
||||
self.session_id = response.cookies.get("TP_SESSIONID")
|
||||
print(f"Handshake erfolgreich, Session-ID: {self.session_id}")
|
||||
|
||||
# In einem echten Szenario würden wir hier den verschlüsselten Schlüssel entschlüsseln
|
||||
# Da wir keinen privaten Schlüssel haben, speichern wir nur die Antwort
|
||||
encrypted_key = data["result"]["key"]
|
||||
print(f"Verschlüsselter Schlüssel: {encrypted_key}")
|
||||
return True
|
||||
|
||||
print("Handshake fehlgeschlagen")
|
||||
return False
|
||||
|
||||
def send_command(self, encrypted_command):
|
||||
"""Sendet einen vorbereiteten verschlüsselten Befehl"""
|
||||
if not self.session_id:
|
||||
print("Keine Session-ID. Bitte zuerst Handshake durchführen.")
|
||||
return None
|
||||
|
||||
# Token aus der Wireshark-Aufnahme (könnte sich ändern, oder vom Gerät abhängen)
|
||||
token = "9DFAC92C53CEC92E67A9CB2E00B3CB2F"
|
||||
|
||||
secure_data = {
|
||||
"method": "securePassthrough",
|
||||
"params": {
|
||||
"request": encrypted_command
|
||||
}
|
||||
}
|
||||
|
||||
headers = {
|
||||
"Referer": f"http://{self.device_ip}:80",
|
||||
"Accept": "application/json",
|
||||
"requestByApp": "true",
|
||||
"Content-Type": "application/json; charset=UTF-8",
|
||||
"Cookie": f"TP_SESSIONID={self.session_id}"
|
||||
}
|
||||
|
||||
response = requests.post(
|
||||
f"http://{self.device_ip}/app?token={token}",
|
||||
json=secure_data,
|
||||
headers=headers
|
||||
)
|
||||
|
||||
if response.status_code == 200:
|
||||
data = response.json()
|
||||
if data["error_code"] == 0:
|
||||
# In einem echten Szenario würden wir die Antwort entschlüsseln
|
||||
encrypted_response = data["result"]["response"]
|
||||
print("Befehl erfolgreich gesendet")
|
||||
return encrypted_response
|
||||
|
||||
print("Fehler beim Senden des Befehls")
|
||||
return None
|
||||
|
||||
def turn_on(self):
|
||||
"""Schaltet die Steckdose ein"""
|
||||
return self.send_command(COMMAND_ON)
|
||||
|
||||
def turn_off(self):
|
||||
"""Schaltet die Steckdose aus"""
|
||||
return self.send_command(COMMAND_OFF)
|
||||
|
||||
# Verwendungsbeispiel
|
||||
if __name__ == "__main__":
|
||||
controller = TapoP115Controller("192.168.0.102")
|
||||
|
||||
# Handshake durchführen
|
||||
if controller.perform_handshake():
|
||||
# Steckdose einschalten
|
||||
controller.turn_on()
|
||||
|
||||
# Kurze Pause (im echten Code mit time.sleep)
|
||||
print("Steckdose ist eingeschaltet")
|
||||
|
||||
# Steckdose ausschalten
|
||||
controller.turn_off()
|
||||
print("Steckdose ist ausgeschaltet")
|
backend/development/tests/tapo.py | 9 lines | Normal file
@@ -0,0 +1,9 @@
from PyP100 import PyP100

p100 = PyP100.P100("192.168.0.102", "till.tomczak@mercedes-benz.com", "Agent045") #Creates a P100 plug object

p100.handshake() #Creates the cookies required for further methods
p100.login() #Sends credentials to the plug and creates AES Key and IV for further methods

p100.turnOn() #Turns the connected plug on
p100.turnOff() #Turns the connected plug off
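A small hedged wrapper around the PyP100 calls above with basic error handling; the helper name and the exact exception types are assumptions, only the PyP100 calls themselves come from the test script:

```python
# Illustrative helper around PyP100; name and error handling are assumptions.
from PyP100 import PyP100


def set_plug_state(ip: str, username: str, password: str, on: bool) -> bool:
    """Switch a Tapo plug on or off, returning False on network errors."""
    try:
        plug = PyP100.P100(ip, username, password)
        plug.handshake()
        plug.login()
        plug.turnOn() if on else plug.turnOff()
        return True
    except (ConnectionError, OSError) as exc:
        # Unreachable plugs on the offline WiFi are the expected failure mode.
        print(f"Could not switch plug {ip}: {exc}")
        return False
```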
backend/development/tests/tests.py | 253 lines | Normal file
@@ -0,0 +1,253 @@
|
||||
import unittest
|
||||
import json
|
||||
import os
|
||||
import tempfile
|
||||
from datetime import datetime, timedelta
|
||||
|
||||
from app import app, db, User, Printer, PrintJob
|
||||
|
||||
class MYPBackendTestCase(unittest.TestCase):
|
||||
def setUp(self):
|
||||
# Temporäre Datenbank für Tests
|
||||
self.db_fd, app.config['DATABASE'] = tempfile.mkstemp()
|
||||
app.config['SQLALCHEMY_DATABASE_URI'] = 'sqlite:///' + app.config['DATABASE']
|
||||
app.config['TESTING'] = True
|
||||
self.app = app.test_client()
|
||||
|
||||
# Datenbank-Tabellen erstellen und Test-Daten einfügen
|
||||
with app.app_context():
|
||||
db.create_all()
|
||||
|
||||
# Admin-Benutzer erstellen
|
||||
admin = User(username='admin_test', email='admin@test.com', role='admin')
|
||||
admin.set_password('admin')
|
||||
db.session.add(admin)
|
||||
|
||||
# Normaler Benutzer erstellen
|
||||
user = User(username='user_test', email='user@test.com', role='user')
|
||||
user.set_password('user')
|
||||
db.session.add(user)
|
||||
|
||||
# Drucker erstellen
|
||||
printer1 = Printer(name='Printer 1', location='Room A', type='3D',
|
||||
status='available', description='Test printer 1')
|
||||
printer2 = Printer(name='Printer 2', location='Room B', type='3D',
|
||||
status='busy', description='Test printer 2')
|
||||
db.session.add(printer1)
|
||||
db.session.add(printer2)
|
||||
|
||||
# Job erstellen
|
||||
start_time = datetime.utcnow()
|
||||
end_time = start_time + timedelta(minutes=60)
|
||||
job = PrintJob(title='Test Job', start_time=start_time, end_time=end_time,
|
||||
duration=60, status='active', comments='Test job',
|
||||
user_id=2, printer_id=2)
|
||||
db.session.add(job)
|
||||
|
||||
db.session.commit()
|
||||
|
||||
def tearDown(self):
|
||||
# Aufräumen nach dem Test
|
||||
os.close(self.db_fd)
|
||||
os.unlink(app.config['DATABASE'])
|
||||
|
||||
def get_token(self, username, password):
|
||||
response = self.app.post('/api/auth/login',
|
||||
data=json.dumps({'username': username, 'password': password}),
|
||||
content_type='application/json')
|
||||
data = json.loads(response.data)
|
||||
return data.get('token')
|
||||
|
||||
def test_login(self):
|
||||
# Test: Erfolgreicher Login
|
||||
response = self.app.post('/api/auth/login',
|
||||
data=json.dumps({'username': 'admin_test', 'password': 'admin'}),
|
||||
content_type='application/json')
|
||||
self.assertEqual(response.status_code, 200)
|
||||
data = json.loads(response.data)
|
||||
self.assertIn('token', data)
|
||||
self.assertIn('user', data)
|
||||
|
||||
# Test: Fehlgeschlagener Login (falsches Passwort)
|
||||
response = self.app.post('/api/auth/login',
|
||||
data=json.dumps({'username': 'admin_test', 'password': 'wrong'}),
|
||||
content_type='application/json')
|
||||
self.assertEqual(response.status_code, 401)
|
||||
|
||||
def test_register(self):
|
||||
# Test: Erfolgreiche Registrierung
|
||||
response = self.app.post('/api/auth/register',
|
||||
data=json.dumps({
|
||||
'username': 'new_user',
|
||||
'email': 'new@test.com',
|
||||
'password': 'password'
|
||||
}),
|
||||
content_type='application/json')
|
||||
self.assertEqual(response.status_code, 201)
|
||||
|
||||
# Test: Doppelte Registrierung
|
||||
response = self.app.post('/api/auth/register',
|
||||
data=json.dumps({
|
||||
'username': 'new_user',
|
||||
'email': 'another@test.com',
|
||||
'password': 'password'
|
||||
}),
|
||||
content_type='application/json')
|
||||
self.assertEqual(response.status_code, 400)
|
||||
|
||||
def test_get_printers(self):
|
||||
# Test: Drucker abrufen
|
||||
response = self.app.get('/api/printers')
|
||||
self.assertEqual(response.status_code, 200)
|
||||
data = json.loads(response.data)
|
||||
self.assertEqual(len(data), 2)
|
||||
|
||||
def test_get_single_printer(self):
|
||||
# Test: Einzelnen Drucker abrufen
|
||||
response = self.app.get('/api/printers/1')
|
||||
self.assertEqual(response.status_code, 200)
|
||||
data = json.loads(response.data)
|
||||
self.assertEqual(data['name'], 'Printer 1')
|
||||
|
||||
def test_create_printer(self):
|
||||
# Als Admin einen Drucker erstellen
|
||||
token = self.get_token('admin_test', 'admin')
|
||||
response = self.app.post('/api/printers',
|
||||
headers={'Authorization': f'Bearer {token}'},
|
||||
data=json.dumps({
|
||||
'name': 'New Printer',
|
||||
'location': 'Room C',
|
||||
'type': '3D',
|
||||
'description': 'New test printer'
|
||||
}),
|
||||
content_type='application/json')
|
||||
self.assertEqual(response.status_code, 201)
|
||||
data = json.loads(response.data)
|
||||
self.assertEqual(data['name'], 'New Printer')
|
||||
|
||||
def test_update_printer(self):
|
||||
# Als Admin einen Drucker aktualisieren
|
||||
token = self.get_token('admin_test', 'admin')
|
||||
response = self.app.put('/api/printers/1',
|
||||
headers={'Authorization': f'Bearer {token}'},
|
||||
data=json.dumps({
|
||||
'name': 'Updated Printer',
|
||||
'location': 'Room D'
|
||||
}),
|
||||
content_type='application/json')
|
||||
self.assertEqual(response.status_code, 200)
|
||||
data = json.loads(response.data)
|
||||
self.assertEqual(data['name'], 'Updated Printer')
|
||||
self.assertEqual(data['location'], 'Room D')
|
||||
|
||||
def test_delete_printer(self):
|
||||
# Als Admin einen Drucker löschen
|
||||
token = self.get_token('admin_test', 'admin')
|
||||
response = self.app.delete('/api/printers/1',
|
||||
headers={'Authorization': f'Bearer {token}'})
|
||||
self.assertEqual(response.status_code, 200)
|
||||
|
||||
# Überprüfen, ob der Drucker wirklich gelöscht wurde
|
||||
response = self.app.get('/api/printers/1')
|
||||
self.assertEqual(response.status_code, 404)
|
||||
|
||||
def test_get_jobs_as_admin(self):
|
||||
# Als Admin alle Jobs abrufen
|
||||
token = self.get_token('admin_test', 'admin')
|
||||
response = self.app.get('/api/jobs',
|
||||
headers={'Authorization': f'Bearer {token}'})
|
||||
self.assertEqual(response.status_code, 200)
|
||||
data = json.loads(response.data)
|
||||
self.assertEqual(len(data), 1)
|
||||
|
||||
def test_get_jobs_as_user(self):
|
||||
# Als normaler Benutzer nur eigene Jobs abrufen
|
||||
token = self.get_token('user_test', 'user')
|
||||
response = self.app.get('/api/jobs',
|
||||
headers={'Authorization': f'Bearer {token}'})
|
||||
self.assertEqual(response.status_code, 200)
|
||||
data = json.loads(response.data)
|
||||
self.assertEqual(len(data), 1) # Der Benutzer hat einen Job
|
||||
|
||||
def test_create_job(self):
|
||||
# Als Benutzer einen Job erstellen
|
||||
token = self.get_token('user_test', 'user')
|
||||
response = self.app.post('/api/jobs',
|
||||
headers={'Authorization': f'Bearer {token}'},
|
||||
data=json.dumps({
|
||||
'title': 'New Job',
|
||||
'printer_id': 1,
|
||||
'duration': 30,
|
||||
'comments': 'Test job creation'
|
||||
}),
|
||||
content_type='application/json')
|
||||
self.assertEqual(response.status_code, 201)
|
||||
data = json.loads(response.data)
|
||||
self.assertEqual(data['title'], 'New Job')
|
||||
self.assertEqual(data['duration'], 30)
|
||||
|
||||
def test_update_job(self):
|
||||
# Als Benutzer den eigenen Job aktualisieren
|
||||
token = self.get_token('user_test', 'user')
|
||||
response = self.app.put('/api/jobs/1',
|
||||
headers={'Authorization': f'Bearer {token}'},
|
||||
data=json.dumps({
|
||||
'comments': 'Updated comments',
|
||||
'duration': 15 # Verlängerung
|
||||
}),
|
||||
content_type='application/json')
|
||||
self.assertEqual(response.status_code, 200)
|
||||
data = json.loads(response.data)
|
||||
self.assertEqual(data['comments'], 'Updated comments')
|
||||
self.assertEqual(data['duration'], 75) # 60 + 15
|
||||
|
||||
def test_complete_job(self):
|
||||
# Als Benutzer einen Job als abgeschlossen markieren
|
||||
token = self.get_token('user_test', 'user')
|
||||
response = self.app.put('/api/jobs/1',
|
||||
headers={'Authorization': f'Bearer {token}'},
|
||||
data=json.dumps({
|
||||
'status': 'completed'
|
||||
}),
|
||||
content_type='application/json')
|
||||
self.assertEqual(response.status_code, 200)
|
||||
data = json.loads(response.data)
|
||||
self.assertEqual(data['status'], 'completed')
|
||||
|
||||
# Überprüfen, ob der Drucker wieder verfügbar ist
|
||||
response = self.app.get('/api/printers/2')
|
||||
self.assertEqual(response.status_code, 200)
|
||||
data = json.loads(response.data)
|
||||
self.assertEqual(data['status'], 'available')
|
||||
|
||||
def test_get_remaining_time(self):
|
||||
# Test: Verbleibende Zeit für einen aktiven Job abrufen
|
||||
response = self.app.get('/api/job/1/remaining-time')
|
||||
self.assertEqual(response.status_code, 200)
|
||||
data = json.loads(response.data)
|
||||
self.assertIn('remaining_minutes', data)
|
||||
# Der genaue Wert kann nicht überprüft werden, da er von der Zeit abhängt
|
||||
|
||||
def test_stats(self):
|
||||
# Als Admin Statistiken abrufen
|
||||
token = self.get_token('admin_test', 'admin')
|
||||
response = self.app.get('/api/stats',
|
||||
headers={'Authorization': f'Bearer {token}'})
|
||||
self.assertEqual(response.status_code, 200)
|
||||
data = json.loads(response.data)
|
||||
self.assertIn('printers', data)
|
||||
self.assertIn('jobs', data)
|
||||
self.assertIn('users', data)
|
||||
self.assertEqual(data['printers']['total'], 2)
|
||||
self.assertEqual(data['jobs']['total'], 1)
|
||||
self.assertEqual(data['users']['total'], 2)
|
||||
|
||||
def test_test_endpoint(self):
|
||||
# Test: API-Test-Endpunkt
|
||||
response = self.app.get('/api/test')
|
||||
self.assertEqual(response.status_code, 200)
|
||||
data = json.loads(response.data)
|
||||
self.assertEqual(data['message'], 'MYP Backend API funktioniert!')
|
||||
|
||||
if __name__ == '__main__':
|
||||
unittest.main()
|
backend/docker-compose.yml | 25 lines | Executable file
@@ -0,0 +1,25 @@
version: '3'

services:
  backend:
    build: .
    container_name: myp-backend
    network_mode: host
    environment:
      - SECRET_KEY=${SECRET_KEY:-7445630171969DFAC92C53CEC92E67A9CB2E00B3CB2F}
      - DATABASE_PATH=${DATABASE_PATH:-instance/myp.db}
      - TAPO_USERNAME=${TAPO_USERNAME:-till.tomczak@mercedes-benz.com}
      - TAPO_PASSWORD=${TAPO_PASSWORD:-744563017196A}
      - "PRINTERS=${PRINTERS:-{\"Printer 1\": {\"ip\": \"192.168.0.100\"}, \"Printer 2\": {\"ip\": \"192.168.0.101\"}, \"Printer 3\": {\"ip\": \"192.168.0.102\"}, \"Printer 4\": {\"ip\": \"192.168.0.103\"}, \"Printer 5\": {\"ip\": \"192.168.0.104\"}, \"Printer 6\": {\"ip\": \"192.168.0.106\"}}}"
      - FLASK_APP=app.py
      - PYTHONUNBUFFERED=1
    volumes:
      - ./logs:/app/logs
      - ./instance:/app/instance
    restart: unless-stopped
    healthcheck:
      test: ["CMD", "wget", "--spider", "http://localhost:5000/health"]
      interval: 30s
      timeout: 10s
      retries: 3
      start_period: 40s
backend/docs/API_DOCS.md | 647 lines | Normal file
@@ -0,0 +1,647 @@
|
||||
# MYP Backend API-Dokumentation
|
||||
|
||||
Dieses Dokument beschreibt detailliert die API-Endpunkte des MYP (Manage Your Printer) Backend-Systems.
|
||||
|
||||
## Basis-URL
|
||||
|
||||
Die Basis-URL für alle API-Anfragen ist: `http://localhost:5000` (Entwicklungsumgebung) oder die URL, unter der die Anwendung gehostet wird.
|
||||
|
||||
## Authentifizierung
|
||||
|
||||
Die meisten Endpunkte erfordern eine Authentifizierung. Diese erfolgt über Cookies/Sessions, die bei der Anmeldung erstellt werden. Die Session wird für 7 Tage gespeichert.
|
||||
|
||||
### Benutzerregistrierung
|
||||
|
||||
**Endpunkt:** `POST /auth/register`
|
||||
|
||||
**Beschreibung:** Registriert einen neuen Benutzer im System.
|
||||
|
||||
**Request-Body:**
|
||||
```json
|
||||
{
|
||||
"username": "string", // Erforderlich
|
||||
"password": "string", // Erforderlich
|
||||
"displayName": "string", // Optional (Standard: username)
|
||||
"email": "string" // Optional
|
||||
}
|
||||
```
|
||||
|
||||
**Erfolgsantwort:**
|
||||
```json
|
||||
{
|
||||
"message": "Registrierung erfolgreich!",
|
||||
"user": {
|
||||
"id": "string",
|
||||
"username": "string",
|
||||
"displayName": "string",
|
||||
"email": "string",
|
||||
"role": "string"
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
**Fehlerantwort:**
|
||||
```json
|
||||
{
|
||||
"message": "Benutzername bereits vergeben!"
|
||||
}
|
||||
```
|
||||
|
||||
### Benutzeranmeldung
|
||||
|
||||
**Endpunkt:** `POST /auth/login`
|
||||
|
||||
**Beschreibung:** Meldet einen Benutzer an und erstellt eine Session.
|
||||
|
||||
**Request-Body:**
|
||||
```json
|
||||
{
|
||||
"username": "string", // Erforderlich
|
||||
"password": "string" // Erforderlich
|
||||
}
|
||||
```
|
||||
|
||||
**Erfolgsantwort:**
|
||||
```json
|
||||
{
|
||||
"message": "Anmeldung erfolgreich!",
|
||||
"user": {
|
||||
"id": "string",
|
||||
"username": "string",
|
||||
"displayName": "string",
|
||||
"email": "string",
|
||||
"role": "string"
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
**Fehlerantwort:**
|
||||
```json
|
||||
{
|
||||
"message": "Ungültiger Benutzername oder Passwort!"
|
||||
}
|
||||
```
|
||||
|
||||
### Initialer Administrator
|
||||
|
||||
**Endpunkt:** `POST /api/create-initial-admin`
|
||||
|
||||
**Beschreibung:** Erstellt einen initialen Admin-Benutzer, falls noch keiner existiert.
|
||||
|
||||
**Request-Body:**
|
||||
```json
|
||||
{
|
||||
"username": "string", // Erforderlich
|
||||
"password": "string", // Erforderlich
|
||||
"displayName": "string", // Optional (Standard: username)
|
||||
"email": "string" // Optional
|
||||
}
|
||||
```
|
||||
|
||||
**Erfolgsantwort:**
|
||||
```json
|
||||
{
|
||||
"message": "Administrator wurde erfolgreich erstellt!",
|
||||
"user": {
|
||||
"id": "string",
|
||||
"username": "string",
|
||||
"displayName": "string",
|
||||
"email": "string",
|
||||
"role": "string"
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
**Fehlerantwort:**
|
||||
```json
|
||||
{
|
||||
"message": "Es existiert bereits ein Administrator!"
|
||||
}
|
||||
```
|
||||
|
||||
## Benutzer-Endpunkte
|
||||
|
||||
### Alle Benutzer abrufen (Admin)
|
||||
|
||||
**Endpunkt:** `GET /api/users`
|
||||
|
||||
**Beschreibung:** Gibt eine Liste aller Benutzer zurück.
|
||||
|
||||
**Erforderliche Rechte:** Admin
|
||||
|
||||
**Erfolgsantwort:**
|
||||
```json
|
||||
[
|
||||
{
|
||||
"id": 1,
|
||||
"username": "string",
|
||||
"email": "string",
|
||||
"role": "string"
|
||||
}
|
||||
]
|
||||
```
|
||||
|
||||
### Benutzer abrufen (Admin)
|
||||
|
||||
**Endpunkt:** `GET /api/users/{userId}`
|
||||
|
||||
**Beschreibung:** Gibt die Details eines bestimmten Benutzers zurück.
|
||||
|
||||
**Erforderliche Rechte:** Admin
|
||||
|
||||
**Erfolgsantwort:**
|
||||
```json
|
||||
{
|
||||
"id": 1,
|
||||
"username": "string",
|
||||
"email": "string",
|
||||
"role": "string"
|
||||
}
|
||||
```
|
||||
|
||||
**Fehlerantwort:**
|
||||
```json
|
||||
{
|
||||
"message": "Nicht gefunden!"
|
||||
}
|
||||
```
|
||||
|
||||
### Benutzer aktualisieren (Admin)
|
||||
|
||||
**Endpunkt:** `PUT /api/users/{userId}`
|
||||
|
||||
**Beschreibung:** Aktualisiert die Daten eines Benutzers.
|
||||
|
||||
**Erforderliche Rechte:** Admin
|
||||
|
||||
**Request-Body:**
|
||||
```json
|
||||
{
|
||||
"username": "string",
|
||||
"email": "string",
|
||||
"password": "string",
|
||||
"role": "string"
|
||||
}
|
||||
```
|
||||
|
||||
**Erfolgsantwort:**
|
||||
```json
|
||||
{
|
||||
"id": 1,
|
||||
"username": "string",
|
||||
"email": "string",
|
||||
"role": "string"
|
||||
}
|
||||
```
|
||||
|
||||
**Fehlerantwort:**
|
||||
```json
|
||||
{
|
||||
"message": "Benutzername bereits vergeben!"
|
||||
}
|
||||
```
|
||||
|
||||
### Benutzer löschen (Admin)
|
||||
|
||||
**Endpunkt:** `DELETE /api/users/{userId}`
|
||||
|
||||
**Beschreibung:** Löscht einen Benutzer.
|
||||
|
||||
**Erforderliche Rechte:** Admin
|
||||
|
||||
**Erfolgsantwort:**
|
||||
```json
|
||||
{
|
||||
"message": "Benutzer gelöscht!"
|
||||
}
|
||||
```
|
||||
|
||||
## Drucker-Endpunkte
|
||||
|
||||
### Alle Drucker abrufen
|
||||
|
||||
**Endpunkt:** `GET /api/printers`
|
||||
|
||||
**Beschreibung:** Gibt eine Liste aller Drucker (Steckdosen) zurück.
|
||||
|
||||
**Erfolgsantwort:**
|
||||
```json
|
||||
[
|
||||
{
|
||||
"id": "uuid-string",
|
||||
"name": "string",
|
||||
"description": "string",
|
||||
"status": 0, // 0 = available, 1 = busy
|
||||
"latestJob": {
|
||||
// Job-Objekt oder null, wenn kein aktiver Job
|
||||
}
|
||||
}
|
||||
]
|
||||
```
|
||||
|
||||
### Drucker hinzufügen (Admin)
|
||||
|
||||
**Endpunkt:** `POST /api/printers`
|
||||
|
||||
**Beschreibung:** Fügt einen neuen Drucker hinzu.
|
||||
|
||||
**Erforderliche Rechte:** Admin
|
||||
|
||||
**Request-Body:**
|
||||
```json
|
||||
{
|
||||
"name": "string",
|
||||
"description": "string",
|
||||
"ipAddress": "string" // IP-Adresse der Tapo-Steckdose
|
||||
}
|
||||
```
|
||||
|
||||
**Erfolgsantwort:**
|
||||
```json
|
||||
{
|
||||
"id": "uuid-string",
|
||||
"name": "string",
|
||||
"description": "string",
|
||||
"status": 0, // 0 = available, 1 = busy
|
||||
"latestJob": null
|
||||
}
|
||||
```
|
||||
|
||||
### Drucker abrufen
|
||||
|
||||
**Endpunkt:** `GET /api/printers/{printerId}`
|
||||
|
||||
**Beschreibung:** Gibt die Details eines bestimmten Druckers zurück.
|
||||
|
||||
**Erfolgsantwort:**
|
||||
```json
|
||||
{
|
||||
"id": "uuid-string",
|
||||
"name": "string",
|
||||
"description": "string",
|
||||
"status": 0, // 0 = available, 1 = busy
|
||||
"latestJob": {
|
||||
// Job-Objekt oder null, wenn kein aktiver Job
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
**Fehlerantwort:**
|
||||
```json
|
||||
{
|
||||
"message": "Nicht gefunden!"
|
||||
}
|
||||
```
|
||||
|
||||
### Drucker aktualisieren (Admin)
|
||||
|
||||
**Endpunkt:** `PUT /api/printers/{printerId}`
|
||||
|
||||
**Beschreibung:** Aktualisiert die Daten eines Druckers.
|
||||
|
||||
**Erforderliche Rechte:** Admin
|
||||
|
||||
**Request-Body:**
|
||||
```json
|
||||
{
|
||||
"name": "string",
|
||||
"description": "string",
|
||||
"ipAddress": "string", // IP-Adresse der Tapo-Steckdose
|
||||
"status": 0 // 0 = available, 1 = busy
|
||||
}
|
||||
```
|
||||
|
||||
**Erfolgsantwort:**
|
||||
```json
|
||||
{
|
||||
"id": "uuid-string",
|
||||
"name": "string",
|
||||
"description": "string",
|
||||
"status": 0, // 0 = available, 1 = busy
|
||||
"latestJob": {
|
||||
// Job-Objekt oder null, wenn kein aktiver Job
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
### Drucker löschen (Admin)
|
||||
|
||||
**Endpunkt:** `DELETE /api/printers/{printerId}`
|
||||
|
||||
**Beschreibung:** Löscht einen Drucker.
|
||||
|
||||
**Erforderliche Rechte:** Admin
|
||||
|
||||
**Erfolgsantwort:**
|
||||
```json
|
||||
{
|
||||
"message": "Drucker gelöscht!"
|
||||
}
|
||||
```
|
||||
|
||||
## Druckauftrags-Endpunkte
|
||||
|
||||
### Alle Druckaufträge abrufen
|
||||
|
||||
**Endpunkt:** `GET /api/jobs`
|
||||
|
||||
**Beschreibung:** Gibt eine Liste aller Druckaufträge zurück (für Admins) oder der eigenen Druckaufträge (für Benutzer).
|
||||
|
||||
**Erfolgsantwort:**
|
||||
```json
|
||||
[
|
||||
{
|
||||
"id": "uuid-string",
|
||||
"socketId": "uuid-string",
|
||||
"userId": "uuid-string",
|
||||
"startAt": "string (ISO 8601)",
|
||||
"durationInMinutes": 60,
|
||||
"comments": "string",
|
||||
"aborted": false,
|
||||
"abortReason": null,
|
||||
"remainingMinutes": 30
|
||||
}
|
||||
]
|
||||
```
|
||||
|
||||
### Druckauftrag erstellen
|
||||
|
||||
**Endpunkt:** `POST /api/jobs`
|
||||
|
||||
**Beschreibung:** Erstellt einen neuen Druckauftrag.
|
||||
|
||||
**Request-Body:**
|
||||
```json
|
||||
{
|
||||
"printerId": "uuid-string",
|
||||
"durationInMinutes": 60,
|
||||
"comments": "string"
|
||||
}
|
||||
```
|
||||
|
||||
**Erfolgsantwort:**
|
||||
```json
|
||||
{
|
||||
"id": "uuid-string",
|
||||
"socketId": "uuid-string",
|
||||
"userId": "uuid-string",
|
||||
"startAt": "string (ISO 8601)",
|
||||
"durationInMinutes": 60,
|
||||
"comments": "string",
|
||||
"aborted": false,
|
||||
"abortReason": null,
|
||||
"remainingMinutes": 60
|
||||
}
|
||||
```
|
||||
|
||||
**Fehlerantwort:**
|
||||
```json
|
||||
{
|
||||
"message": "Drucker ist nicht verfügbar!"
|
||||
}
|
||||
```
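A hedged sketch of creating a reservation via the documented `POST /api/jobs` endpoint; it assumes an authenticated `session` as in the login sketch above, and the printer ID is a placeholder:

```python
# Illustrative: reserve a printer for 60 minutes (placeholder printerId).
payload = {
    "printerId": "00000000-0000-0000-0000-000000000000",
    "durationInMinutes": 60,
    "comments": "Prototypgehäuse, PETG",
}
resp = session.post("http://localhost:5000/api/jobs", json=payload, timeout=10)
if resp.ok:
    job = resp.json()
    print(job["id"], job["remainingMinutes"])
else:
    print(resp.json().get("message"))  # e.g. "Drucker ist nicht verfügbar!"
```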
|
||||
|
||||
### Druckauftrag abrufen
|
||||
|
||||
**Endpunkt:** `GET /api/jobs/{jobId}`
|
||||
|
||||
**Beschreibung:** Gibt die Details eines bestimmten Druckauftrags zurück.
|
||||
|
||||
**Erfolgsantwort:**
|
||||
```json
|
||||
{
|
||||
"id": "uuid-string",
|
||||
"socketId": "uuid-string",
|
||||
"userId": "uuid-string",
|
||||
"startAt": "string (ISO 8601)",
|
||||
"durationInMinutes": 60,
|
||||
"comments": "string",
|
||||
"aborted": false,
|
||||
"abortReason": null,
|
||||
"remainingMinutes": 30
|
||||
}
|
||||
```
|
||||
|
||||
**Fehlerantwort:**
|
||||
```json
|
||||
{
|
||||
"message": "Nicht gefunden!"
|
||||
}
|
||||
```
|
||||
|
||||
### Druckauftrag Kommentare aktualisieren
|
||||
|
||||
**Endpunkt:** `PUT /api/jobs/{jobId}/comments`
|
||||
|
||||
**Beschreibung:** Aktualisiert die Kommentare eines Druckauftrags.
|
||||
|
||||
**Request-Body:**
|
||||
```json
|
||||
{
|
||||
"comments": "string"
|
||||
}
|
||||
```
|
||||
|
||||
**Erfolgsantwort:**
|
||||
```json
|
||||
{
|
||||
"id": "uuid-string",
|
||||
"socketId": "uuid-string",
|
||||
"userId": "uuid-string",
|
||||
"startAt": "string (ISO 8601)",
|
||||
"durationInMinutes": 60,
|
||||
"comments": "string",
|
||||
"aborted": false,
|
||||
"abortReason": null,
|
||||
"remainingMinutes": 30
|
||||
}
|
||||
```
|
||||
|
||||
### Druckauftrag abbrechen
|
||||
|
||||
**Endpunkt:** `POST /api/jobs/{jobId}/abort`
|
||||
|
||||
**Beschreibung:** Bricht einen laufenden Druckauftrag ab.
|
||||
|
||||
**Request-Body:**
|
||||
```json
|
||||
{
|
||||
"reason": "string" // Optional
|
||||
}
|
||||
```
|
||||
|
||||
**Erfolgsantwort:**
|
||||
```json
|
||||
{
|
||||
"id": "uuid-string",
|
||||
"socketId": "uuid-string",
|
||||
"userId": "uuid-string",
|
||||
"startAt": "string (ISO 8601)",
|
||||
"durationInMinutes": 60,
|
||||
"comments": "string",
|
||||
"aborted": true,
|
||||
"abortReason": "string",
|
||||
"remainingMinutes": 0
|
||||
}
|
||||
```
|
||||
|
||||
### Druckauftrag vorzeitig beenden
|
||||
|
||||
**Endpunkt:** `POST /api/jobs/{jobId}/finish`
|
||||
|
||||
**Beschreibung:** Beendet einen laufenden Druckauftrag vorzeitig.
|
||||
|
||||
**Erfolgsantwort:**
|
||||
```json
|
||||
{
|
||||
"id": "uuid-string",
|
||||
"socketId": "uuid-string",
|
||||
"userId": "uuid-string",
|
||||
"startAt": "string (ISO 8601)",
|
||||
"durationInMinutes": 45, // Tatsächliche Dauer bis zum Beenden
|
||||
"comments": "string",
|
||||
"aborted": false,
|
||||
"abortReason": null,
|
||||
"remainingMinutes": 0
|
||||
}
|
||||
```
|
||||
|
||||
### Druckauftrag verlängern
|
||||
|
||||
**Endpunkt:** `POST /api/jobs/{jobId}/extend`
|
||||
|
||||
**Beschreibung:** Verlängert die Laufzeit eines Druckauftrags.
|
||||
|
||||
**Request-Body:**
|
||||
```json
|
||||
{
|
||||
"minutes": 30, // Zusätzliche Minuten
|
||||
"hours": 0 // Zusätzliche Stunden (optional)
|
||||
}
|
||||
```
|
||||
|
||||
**Erfolgsantwort:**
|
||||
```json
|
||||
{
|
||||
"id": "uuid-string",
|
||||
"socketId": "uuid-string",
|
||||
"userId": "uuid-string",
|
||||
"startAt": "string (ISO 8601)",
|
||||
"durationInMinutes": 90, // Aktualisierte Gesamtdauer
|
||||
"comments": "string",
|
||||
"aborted": false,
|
||||
"abortReason": null,
|
||||
"remainingMinutes": 60
|
||||
}
|
||||
```
|
||||
|
||||
### Druckauftrag löschen
|
||||
|
||||
**Endpunkt:** `DELETE /api/jobs/{jobId}`
|
||||
|
||||
**Beschreibung:** Löscht einen Druckauftrag.
|
||||
|
||||
**Erfolgsantwort:**
|
||||
```json
|
||||
{
|
||||
"message": "Druckauftrag gelöscht!"
|
||||
}
|
||||
```
|
||||
|
||||
### Verbleibende Zeit eines Druckauftrags abrufen
|
||||
|
||||
**Endpunkt:** `GET /api/job/{jobId}/remaining-time`
|
||||
|
||||
**Beschreibung:** Gibt die verbleibende Zeit eines aktiven Druckauftrags in Minuten zurück.
|
||||
|
||||
**Erfolgsantwort:**
|
||||
```json
|
||||
{
|
||||
"remaining_minutes": 30,
|
||||
"job_status": "active", // active, completed
|
||||
"socket_status": "busy" // busy, available
|
||||
}
|
||||
```
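A hedged sketch of polling the remaining-time endpoint until a reservation has run out (the job ID is a placeholder, the field names come from the response documented above):

```python
# Illustrative polling loop for GET /api/job/{jobId}/remaining-time.
import time

import requests

job_id = "00000000-0000-0000-0000-000000000000"  # placeholder
while True:
    data = requests.get(
        f"http://localhost:5000/api/job/{job_id}/remaining-time", timeout=10
    ).json()
    print(f"{data['remaining_minutes']} min left, socket: {data['socket_status']}")
    if data["job_status"] != "active":
        break
    time.sleep(60)  # check once per minute
```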
|
||||
|
||||
### Status eines Druckauftrags abrufen
|
||||
|
||||
**Endpunkt:** `GET /api/job/{jobId}/status`
|
||||
|
||||
**Beschreibung:** Gibt detaillierte Statusinformationen zu einem Druckauftrag zurück.
|
||||
|
||||
**Erfolgsantwort:**
|
||||
```json
|
||||
{
|
||||
"job": {
|
||||
"id": "uuid-string",
|
||||
"socketId": "uuid-string",
|
||||
"userId": "uuid-string",
|
||||
"startAt": "string (ISO 8601)",
|
||||
"durationInMinutes": 60,
|
||||
"comments": "string",
|
||||
"aborted": false,
|
||||
"abortReason": null,
|
||||
"remainingMinutes": 30
|
||||
},
|
||||
"status": "active", // active, completed, aborted
|
||||
"socketStatus": "busy", // busy, available
|
||||
"remainingMinutes": 30
|
||||
}
|
||||
```
|
||||
|
||||
## Statistik-Endpunkte
|
||||
|
||||
### Systemstatistiken abrufen (Admin)
|
||||
|
||||
**Endpunkt:** `GET /api/stats`
|
||||
|
||||
**Beschreibung:** Gibt Statistiken zu Druckern, Aufträgen und Benutzern zurück.
|
||||
|
||||
**Erforderliche Rechte:** Admin
|
||||
|
||||
**Erfolgsantwort:**
|
||||
```json
|
||||
{
|
||||
"printers": {
|
||||
"total": 10,
|
||||
"available": 5,
|
||||
"utilization_rate": 0.5
|
||||
},
|
||||
"jobs": {
|
||||
"total": 100,
|
||||
"active": 5,
|
||||
"completed": 90,
|
||||
"avg_duration": 120
|
||||
},
|
||||
"users": {
|
||||
"total": 50
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
## Test-Endpunkt
|
||||
|
||||
### API-Test
|
||||
|
||||
**Endpunkt:** `GET /api/test`
|
||||
|
||||
**Beschreibung:** Testet, ob die API funktioniert.
|
||||
|
||||
**Erfolgsantwort:**
|
||||
```json
|
||||
{
|
||||
"message": "MYP Backend API funktioniert!"
|
||||
}
|
||||
```
|
||||
|
||||
## Fehlercodes
|
||||
|
||||
| Statuscode | Beschreibung |
|
||||
|------------|-----------------------------|
|
||||
| 200 | OK - Anfrage erfolgreich |
|
||||
| 201 | Created - Ressource erstellt |
|
||||
| 400 | Bad Request - Ungültige Anfrage |
|
||||
| 401 | Unauthorized - Authentifizierung erforderlich |
|
||||
| 403 | Forbidden - Unzureichende Rechte |
|
||||
| 404 | Not Found - Ressource nicht gefunden |
|
||||
| 500 | Internal Server Error - Serverfehler |
|
backend/docs/PROJEKTDOKUMENTATION.md | 213 lines | Normal file
@@ -0,0 +1,213 @@
|
||||
# MYP - Projektdokumentation für das IHK-Abschlussprojekt
|
||||
|
||||
## Projektübersicht
|
||||
|
||||
**Projektname:** MYP (Manage Your Printer)
|
||||
**Projekttyp:** IHK-Abschlussprojekt für Fachinformatiker für digitale Vernetzung
|
||||
**Zeitraum:** [Dein Projektzeitraum]
|
||||
**Team:** 2 Personen (Frontend- und Backend-Entwicklung)
|
||||
|
||||
## Projektziel
|
||||
|
||||
Das Ziel des Projektes ist die Entwicklung einer Reservierungs- und Steuerungsplattform für 3D-Drucker, die es Benutzern ermöglicht, Drucker zu reservieren und deren Stromversorgung automatisch über WLAN-Steckdosen (Tapo P115) zu steuern. Die Plattform soll eine einfache Verwaltung der Drucker und ihrer Auslastung bieten sowie den Stromverbrauch optimieren, indem Drucker nur während aktiver Reservierungen mit Strom versorgt werden.
|
||||
|
||||
## Aufgabenbeschreibung
|
||||
|
||||
Als Fachinformatiker für digitale Vernetzung besteht meine Aufgabe in der Entwicklung des Backend-Systems, das folgende Funktionen bereitstellt:
|
||||
|
||||
1. **API-Backend für das Frontend**: Entwicklung einer RESTful API, die mit dem Frontend kommuniziert und alle notwendigen Daten bereitstellt.
|
||||
|
||||
2. **Authentifizierungssystem**: Integration einer OAuth-Authentifizierung über GitHub, um Benutzer zu identifizieren und Zugriffskontrolle zu gewährleisten.
|
||||
|
||||
3. **Datenbankverwaltung**: Erstellung und Verwaltung der Datenbankmodelle für Benutzer, Drucker und Reservierungen.
|
||||
|
||||
4. **Steckdosensteuerung**: Implementierung einer Schnittstelle zu Tapo P115 WLAN-Steckdosen, um die Stromversorgung der Drucker basierend auf Reservierungen zu steuern.
|
||||
|
||||
5. **Automatisierung**: Entwicklung von Mechanismen zur automatischen Überwachung von Reservierungen und Steuerung der Steckdosen.
|
||||
|
||||
6. **Sicherheit**: Implementierung von Sicherheitsmaßnahmen zum Schutz der Anwendung und der Daten.
|
||||
|
||||
7. **Dokumentation**: Erstellung einer umfassenden Dokumentation für Entwicklung, Installation und Nutzung des Systems.
|
||||
|
||||
## Technische Umsetzung
|
||||
|
||||
### Backend (Mein Verantwortungsbereich)
|
||||
|
||||
#### Verwendete Technologien
|
||||
|
||||
- **Programmiersprache**: Python 3.11
|
||||
- **Web-Framework**: Flask 2.3.3
|
||||
- **Datenbank-ORM**: Flask-SQLAlchemy 3.1.1 (SQLAlchemy 2.0.25)
|
||||
- **Datenbank**: SQLite (für Entwicklung), erweiterbar auf PostgreSQL für Produktion
|
||||
- **Authentifizierung**: Authlib für GitHub OAuth
|
||||
- **Steckdosen-Steuerung**: Tapo Python Library
|
||||
- **Container-Technologie**: Docker und Docker Compose
|
||||
|
||||
#### Architektur
|
||||
|
||||
Die Backend-Anwendung folgt einer klassischen dreischichtigen Architektur:
|
||||
|
||||
1. **Datenmodell-Schicht**: SQLAlchemy ORM-Modelle für Benutzer, Sessions, Drucker und Druckaufträge
|
||||
2. **Business-Logic-Schicht**: Implementierung der Geschäftslogik für Reservierungsverwaltung und Steckdosensteuerung
|
||||
3. **API-Schicht**: RESTful API-Endpunkte, die vom Frontend konsumiert werden
|
||||
|
||||
Zusätzlich wurden folgende Features implementiert:
|
||||
|
||||
- **OAuth-Authentifizierung**: Implementierung einer sicheren Authentifizierung über GitHub
|
||||
- **Session-Management**: Server-seitige Session-Verwaltung für Benutzerauthentifizierung
|
||||
- **Steckdosensteuerung**: Asynchrone Steuerung der Tapo P115 WLAN-Steckdosen
|
||||
- **CLI-Befehle**: Flask CLI-Befehle für automatisierte Aufgaben wie die Überprüfung abgelaufener Reservierungen
|
||||
|
||||
#### Datenmodell
|
||||
|
||||
Das Datenmodell besteht aus vier Hauptentitäten:
|
||||
|
||||
1. **User**: Benutzer mit GitHub-Authentifizierung und Rollenverwaltung
|
||||
2. **Session**: Sitzungsdaten für die Authentifizierung
|
||||
3. **Printer**: Drucker mit Status und IP-Adresse der zugehörigen Steckdose
|
||||
4. **PrintJob**: Reservierungen mit Start- und Endzeit, Dauer und Status
|
||||
|
||||
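The column types below are taken from the initial Alembic migration (`af3faaa3844c_.py`) further down in this change set; treat this as a minimal Flask-SQLAlchemy sketch, not as the actual contents of `app.py`.

```python
# Minimal sketch of the ORM models, based on the columns in the initial
# Alembic migration; the real definitions in app.py may differ.
import uuid
from flask_sqlalchemy import SQLAlchemy

db = SQLAlchemy()

def _uuid() -> str:
    # IDs are stored as 36-character UUID strings (see the migration).
    return str(uuid.uuid4())

class User(db.Model):
    id = db.Column(db.String(36), primary_key=True, default=_uuid)
    username = db.Column(db.String(64), unique=True, index=True)
    password_hash = db.Column(db.String(128))
    display_name = db.Column(db.String(100))
    email = db.Column(db.String(120), unique=True, index=True)
    role = db.Column(db.String(20))            # 'admin', 'user' or 'guest'

class Printer(db.Model):
    id = db.Column(db.String(36), primary_key=True, default=_uuid)
    name = db.Column(db.String(64), nullable=False, index=True)
    description = db.Column(db.Text, nullable=False)
    status = db.Column(db.Integer, default=0)  # 0=available, 1=busy, 2=maintenance
    ip_address = db.Column(db.String(15))      # IP address of the Tapo smart plug

class PrintJob(db.Model):
    id = db.Column(db.String(36), primary_key=True, default=_uuid)
    printer_id = db.Column(db.String(36), db.ForeignKey('printer.id'), nullable=False)
    user_id = db.Column(db.String(36), db.ForeignKey('user.id'), nullable=False)
    start_at = db.Column(db.DateTime)
    duration_in_minutes = db.Column(db.Integer, nullable=False)
    comments = db.Column(db.Text)
    aborted = db.Column(db.Boolean, default=False)
    abort_reason = db.Column(db.Text)
```
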
#### API Endpoints

The API was designed specifically to work seamlessly with the existing frontend. It provides endpoints for:

- Authentication and user management
- Printer management
- Reservation management (create, abort, extend)
- Status information such as the remaining time

#### Smart Plug Control

The Tapo P115 WiFi smart plugs are controlled via the Tapo Python Library (a rough on/off sketch follows this section). The system:

- switches a plug on when a reservation is created
- switches a plug off when a reservation is aborted or finished
- periodically checks for expired reservations and switches the corresponding plugs off

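`requirements.txt` pins `PyP100==0.0.19`, and the startup log addresses the plugs as P100 devices, so the on/off path presumably looks roughly like this sketch. The helper name, the environment lookups and the error handling are illustrative assumptions, not code copied from `app.py`.

```python
# Rough sketch of switching a Tapo plug with the PyP100 package from
# requirements.txt; credentials come from the TAPO_* environment variables.
import os
from PyP100 import PyP100

def set_plug_state(ip_address: str, turn_on: bool) -> bool:
    """Switch the smart plug at ip_address on or off. Returns True on success."""
    try:
        plug = PyP100.P100(ip_address,
                           os.environ["TAPO_USERNAME"],
                           os.environ["TAPO_PASSWORD"])
        plug.handshake()   # establish the encrypted session with the device
        plug.login()       # authenticate with the Tapo account credentials
        if turn_on:
            plug.turnOn()
        else:
            plug.turnOff()
        return True
    except Exception as exc:
        # The startup log shows these failures being logged and tolerated.
        print(f"Fehler bei der Anmeldung an P100-Gerät {ip_address}: {exc}")
        return False
```
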
#### Automation

The system implements several automation mechanisms (a CLI sketch follows this list):

- **Automatic smart plug control**: Plugs are switched on and off based on the reservation status
- **Job checking**: The CLI command `flask check-jobs` periodically checks for expired reservations
- **Logging**: All actions are logged automatically to support troubleshooting

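A rough idea of how `flask check-jobs` could be registered as a Flask CLI command, reusing the model and `set_plug_state()` sketches above and assuming an existing `app` object; the real query and bookkeeping in `app.py` will differ.

```python
# Sketch of a Flask CLI command like `flask check-jobs`; illustrative only.
from datetime import datetime, timedelta

@app.cli.command("check-jobs")
def check_jobs():
    """Find expired reservations and switch their plugs off."""
    now = datetime.utcnow()
    jobs = PrintJob.query.filter(PrintJob.aborted == False).all()
    updated = 0
    for job in jobs:
        if job.start_at and job.start_at + timedelta(minutes=job.duration_in_minutes) < now:
            printer = Printer.query.get(job.printer_id)
            if printer and set_plug_state(printer.ip_address, turn_on=False):
                printer.status = 0   # mark the printer as available again
                updated += 1
    db.session.commit()
    print(f"{len(jobs)} Jobs überprüft, {updated} Steckdosen aktualisiert.")
```
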
### Frontend (team partner's area of responsibility)

The frontend was developed by my team partner and consists of:

- A Next.js application with React components
- Tailwind CSS for styling
- Serverless functions for API integrations
- A responsive design for desktop and mobile use

## Project Results

The project successfully produced a working reservation and control platform for 3D printers that allows users to:

1. Authenticate via GitHub
2. View and reserve available printers
3. Manage their reservations (extend, abort, comment)
4. Manage printers and users as an administrator

Technical achievements:

1. Seamless integration with the frontend
2. Successful implementation of the smart plug control
3. Secure authentication via GitHub OAuth
4. Optimized power usage through automatic smart plug control

## Challenges and Solutions

### Challenge 1: GitHub OAuth integration

Integrating GitHub authentication, especially with GitHub Enterprise, required careful configuration of the OAuth settings and URL adjustments.

**Solution:** Implementation with Authlib and configurable options for different GitHub instances.

### Challenge 2: Tapo P115 control

Communication with the Tapo P115 WiFi smart plugs required a reliable, asynchronous implementation.

**Solution:** Use of the Tapo Python Library with asynchronous handling and robust error handling.

### Challenge 3: Compatibility with the existing frontend

The backend had to be compatible with the frontend that had already been developed, which required the API endpoints and data structures to match exactly.

**Solution:** Careful analysis of the frontend code to understand the expected API structures and implement the backend accordingly.

### Challenge 4: Automatic smart plug control

Reliably switching off the plugs of expired reservations was a challenge.

**Solution:** Implementation of a CLI command that can be run regularly via cron to check for expired reservations.

## Professional Reflection

The project required a broad range of skills from the field of digital networking:

1. **Network communication**: Implementing the communication between backend, frontend and WiFi smart plugs over different protocols.

2. **System integration**: Integrating different systems (GitHub OAuth, database, Tapo plugs) into one coherent application.

3. **API design**: Developing a RESTful API that meets the frontend's requirements and remains future-proof.

4. **Database design**: Creating an optimized database schema for the application.

5. **Security concepts**: Implementing security measures such as OAuth, session management and access control.

6. **Automation**: Developing automation processes for smart plug control and job monitoring.

These aspects map directly onto the core competencies of the "Fachinformatiker für digitale Vernetzung" profession and demonstrate their practical application in a real project.

## Outlook and Further Development

The system offers several options for further development:

1. **Broader smart plug support**: Integration of additional smart home devices besides the Tapo P115.

2. **Notification system**: E-mail or push notifications for reservation reminders.

3. **Extended statistics**: More detailed usage statistics and visualizations for administrators.

4. **Mobile app**: Development of a native mobile app for iOS and Android.

5. **Improved automation**: Integration with further systems such as 3D printer APIs for direct monitoring of the print status.

## Conclusion

The MYP project shows how modern web technologies and IoT devices can be combined into a practical solution for managing 3D printers.

As a prospective Fachinformatiker für digitale Vernetzung, I was able to apply and extend my skills in programming, system integration, network communication and automation.

Working as a team with a clear division of responsibilities (frontend/backend) led to a successful project result that meets the stated requirements and provides practical value.

---

## Appendix

### Installation and Setup

Detailed instructions for installing and setting up the backend system can be found in the README.md file.

### Important Configuration Parameters

The following environment variables must be configured:

- `SECRET_KEY`: Secret key for session encryption
- `DATABASE_URL`: URL of the database
- `OAUTH_CLIENT_ID`: GitHub OAuth client ID
- `OAUTH_CLIENT_SECRET`: GitHub OAuth client secret
- `GITHUB_API_BASE_URL`, `GITHUB_AUTHORIZE_URL`, `GITHUB_TOKEN_URL`: URLs for GitHub OAuth
- `TAPO_USERNAME`: Username for the Tapo smart plugs
- `TAPO_PASSWORD`: Password for the Tapo smart plugs
- `TAPO_DEVICES`: JSON object mapping printer IDs to IP addresses

### Cron Job Setup

The following cron job can be set up to check for expired jobs automatically:

```
*/5 * * * * cd /pfad/zum/projekt && /pfad/zur/venv/bin/flask check-jobs >> /pfad/zum/projekt/logs/cron.log 2>&1
```
185 backend/docs/README.md Normal file
@@ -0,0 +1,185 @@
# MYP Backend Control Platform

This is the backend for the MYP (Manage Your Printer) project, an IHK final project for the Fachinformatiker für digitale Vernetzung qualification. The platform is implemented with Python and Flask and provides a RESTful API for managing 3D printers, reserving them and controlling them via WiFi smart plugs (Tapo P115).

## Features

- Local authentication system (works offline)
- Role-based access control (admin/user/guest)
- Printer management (add, edit, delete)
- Reservation management (create, abort, extend)
- Remote control of WiFi smart plugs (Tapo P115)
- Collection and display of statistics
- RESTful API for communication with the frontend

## Technology Stack

- **Python**: programming language
- **Flask**: web framework
- **SQLite**: embedded database (can be replaced with PostgreSQL for production)
- **PyP100**: Python library for controlling the Tapo P115 WiFi smart plugs
- **Gunicorn**: WSGI HTTP server for the production environment
- **Docker**: containerization of the application

## Project Structure

- `app.py`: main application file with all routes and models
- `requirements.txt`: list of all Python dependencies
- `Dockerfile`: Docker configuration
- `docker-compose.yml`: Docker Compose configuration for easy deployment
- `.env.example`: example of the environment variables
- `logs/`: log files (created automatically)
- `instance/`: SQLite database (created automatically)

## Installation and Running

### Locally (development)

1. Install Python 3.8 or higher
2. Clone the repository
3. Change into the project directory
4. Create a virtual environment (optional, but recommended)
```
python -m venv venv
source venv/bin/activate  # on Windows: venv\Scripts\activate
```
5. Install the dependencies
```
pip install -r requirements.txt
```
6. Copy `.env.example` to `.env` and adjust it
```
cp .env.example .env
```
7. Start the application
```
python app.py
```

The application is then reachable at http://localhost:5000.

### With Docker

1. Install Docker and Docker Compose
2. Change into the project directory
3. Copy `.env.example` to `.env` and adjust it
```
cp .env.example .env
```
4. Start the application
```
docker-compose up -d
```

The application is then reachable at http://localhost:5000.

## API Endpoints

### Authentication

- `POST /auth/register`: Register a new user
- `POST /auth/login`: Log a user in
- `POST /auth/logout`: Log out and end the session
- `POST /api/create-initial-admin`: Create the initial administrator
- `GET /api/me`: Fetch information about the current user

A small client sketch for these endpoints follows below.

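A hedged client sketch against the endpoints above, using only the standard library. The JSON field names for `/auth/login` (`username`, `password`) are an assumption; only the endpoint paths are taken from the list.

```python
# Client sketch: log in, keep the session cookie, and fetch /api/me.
import json
import urllib.request
from http.cookiejar import CookieJar

BASE = "http://localhost:5000"
opener = urllib.request.build_opener(urllib.request.HTTPCookieProcessor(CookieJar()))

def post_json(path: str, payload: dict) -> dict:
    req = urllib.request.Request(
        BASE + path,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with opener.open(req) as resp:
        return json.loads(resp.read())

# Log in; the session cookie is stored in the opener for later calls.
post_json("/auth/login", {"username": "admin", "password": "secret"})

# Fetch the current user with the stored session cookie.
with opener.open(BASE + "/api/me") as resp:
    print(json.loads(resp.read()))
```
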
### Users

- `GET /api/users`: List all users (admin)
- `GET /api/users/<id>`: Details of a user (admin)
- `PUT /api/users/<id>`: Update a user (admin)
- `DELETE /api/users/<id>`: Delete a user (admin)

### Printers

- `GET /api/printers`: List all printers
- `POST /api/printers`: Add a printer (admin)
- `GET /api/printers/<id>`: Details of a printer
- `PUT /api/printers/<id>`: Update a printer (admin)
- `DELETE /api/printers/<id>`: Delete a printer (admin)

### Print Jobs

- `GET /api/jobs`: List all print jobs (admin) or the user's own print jobs (user)
- `POST /api/jobs`: Create a print job
- `GET /api/jobs/<id>`: Details of a print job
- `POST /api/jobs/<id>/abort`: Abort a print job
- `POST /api/jobs/<id>/finish`: Finish a print job early
- `POST /api/jobs/<id>/extend`: Extend a print job
- `PUT /api/jobs/<id>/comments`: Update the comments
- `GET /api/job/<id>/remaining-time`: Remaining time of an active print job

A sketch of creating a job via this API follows below.

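Creating a reservation with the same `opener`/`post_json()` helpers from the authentication sketch above. The request fields `printerId`, `durationInMinutes` and `comments` match the form in `templates/jobs.html`; the shape of the response (for example an `id` field) is an assumption.

```python
# Create a reservation via POST /api/jobs; the printer UUID is a placeholder
# that would normally come from GET /api/printers.
job = post_json("/api/jobs", {
    "printerId": "REPLACE-WITH-PRINTER-UUID",
    "durationInMinutes": 60,
    "comments": "PLA test print",
})
print(job)

# Later the reservation can be extended or aborted, e.g.:
# post_json(f"/api/jobs/{job['id']}/extend", {"minutes": 30})        # body field assumed
# post_json(f"/api/jobs/{job['id']}/abort", {"reason": "cancelled"})  # "reason" as in jobs.html
```
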
### Statistics

- `GET /api/stats`: Statistics about printers, jobs and users (admin)

## Data Model

### User
- id (String UUID, primary key)
- username (String, unique)
- password_hash (String)
- display_name (String)
- email (String, unique)
- role (String, 'admin', 'user' or 'guest')

### Session
- id (String UUID, primary key)
- user_id (String UUID, foreign key to User)
- expires_at (DateTime)

### Printer
- id (String UUID, primary key)
- name (String)
- description (Text)
- status (Integer, 0=available, 1=busy, 2=maintenance)
- ip_address (String, IP address of the Tapo smart plug)

### PrintJob
- id (String UUID, primary key)
- printer_id (String UUID, foreign key to Printer)
- user_id (String UUID, foreign key to User)
- start_at (DateTime)
- duration_in_minutes (Integer)
- comments (Text)
- aborted (Boolean)
- abort_reason (Text)

## Smart Plug Control

The application controls Tapo P115 WiFi smart plugs to switch the printers on and off based on reservations:

- When a print job is created, the plug of the associated printer is switched on automatically
- When a print job is aborted or finished early, the plug is switched off
- When the reservation time expires, the plug is switched off automatically
- A CLI command `flask check-jobs` regularly checks for expired jobs and switches plugs off

## Security

- The application uses a local authentication system with password hashing (a minimal hashing sketch follows this list)
- Session data is stored in server-side sessions
- Access controls ensure that users can only access their own data
- Admin users have access to all data and can change the system configuration
- The first registered user automatically becomes an administrator

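Since `werkzeug` is pinned in `requirements.txt` and the model stores a `password_hash`, the hashing presumably relies on `werkzeug.security`; a minimal sketch (the exact parameters used in `app.py` may differ):

```python
# Password hashing as implied by the password_hash column.
from werkzeug.security import generate_password_hash, check_password_hash

stored_hash = generate_password_hash("secret")          # store this, never the plain password
print(check_password_hash(stored_hash, "secret"))       # True
print(check_password_hash(stored_hash, "wrong-guess"))  # False
```
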
## Logging

The application logs its activity to rotating log files in a `logs` directory. This supports troubleshooting and monitoring of the application in operation.

## Environment Variables

The following environment variables must be configured:

- `SECRET_KEY`: Secret key for session encryption
- `DATABASE_PATH`: Path to the database (default: SQLite database in the instance directory)
- `TAPO_USERNAME`: Username for the Tapo smart plugs
- `TAPO_PASSWORD`: Password for the Tapo smart plugs
- `PRINTERS`: JSON object mapping printer names to the IP addresses of their plugs, in the format `{"Printer 1": {"ip": "192.168.1.100"}, "Printer 2": {"ip": "192.168.1.101"}, ...}` (a parsing sketch follows this list)

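A small sketch of reading that `PRINTERS` mapping; using `python-dotenv` from `requirements.txt` to load a local `.env` during development is an assumption about how the app is configured.

```python
# Read the PRINTERS mapping from the environment.
import json
import os
from dotenv import load_dotenv

load_dotenv()  # reads a local .env file, if present
printers = json.loads(os.environ.get("PRINTERS", "{}"))
for name, cfg in printers.items():
    print(f"{name} -> smart plug at {cfg['ip']}")
```
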
## Automation

The application includes a CLI command `flask check-jobs` that should be run regularly (for example as a cron job) to check for expired print jobs and switch off the associated plugs.

## Compatibility with the Frontend

The backend was developed specifically for compatibility with the existing frontend located in `/packages/reservation-platform`. The API endpoints and data structures are designed to work seamlessly with it.
89 backend/log.txt Normal file
@@ -0,0 +1,89 @@
root@raspberrypi:/home/user/Projektarbeit-MYP/backend# python3 app.py
|
||||
[2025-03-24 09:38:15,229] INFO in app: MYP Backend starting up
|
||||
[2025-03-24 09:38:15,338] INFO in app: Initialisiere Drucker aus Umgebungsvariablen
|
||||
[2025-03-24 09:38:15,353] INFO in app: Neuer Drucker angelegt: Printer 1 mit IP 192.168.0.100
|
||||
[2025-03-24 09:38:16,197] ERROR in app: Fehler bei der Anmeldung an P100-Gerät 192.168.0.100: Expecting value: line 1 column 1 (char 0)
|
||||
[2025-03-24 09:38:16,197] INFO in app: Neue Steckdose mit IP 192.168.0.100 wurde beim Start ausgeschaltet
|
||||
[2025-03-24 09:38:16,209] INFO in app: Neuer Drucker angelegt: Printer 2 mit IP 192.168.0.101
|
||||
[2025-03-24 09:38:16,521] ERROR in app: Fehler bei der Anmeldung an P100-Gerät 192.168.0.101: Expecting value: line 1 column 1 (char 0)
|
||||
[2025-03-24 09:38:16,522] INFO in app: Neue Steckdose mit IP 192.168.0.101 wurde beim Start ausgeschaltet
|
||||
[2025-03-24 09:38:16,536] INFO in app: Neuer Drucker angelegt: Printer 3 mit IP 192.168.0.102
|
||||
[2025-03-24 09:38:17,082] ERROR in app: Fehler bei der Anmeldung an P100-Gerät 192.168.0.102: Expecting value: line 1 column 1 (char 0)
|
||||
[2025-03-24 09:38:17,083] INFO in app: Neue Steckdose mit IP 192.168.0.102 wurde beim Start ausgeschaltet
|
||||
[2025-03-24 09:38:17,096] INFO in app: Neuer Drucker angelegt: Printer 4 mit IP 192.168.0.103
|
||||
[2025-03-24 09:38:18,248] ERROR in app: Fehler bei der Anmeldung an P100-Gerät 192.168.0.103: Expecting value: line 1 column 1 (char 0)
|
||||
[2025-03-24 09:38:18,249] INFO in app: Neue Steckdose mit IP 192.168.0.103 wurde beim Start ausgeschaltet
|
||||
[2025-03-24 09:38:18,263] INFO in app: Neuer Drucker angelegt: Printer 5 mit IP 192.168.0.104
|
||||
[2025-03-24 09:38:18,635] ERROR in app: Fehler bei der Anmeldung an P100-Gerät 192.168.0.104: Expecting value: line 1 column 1 (char 0)
|
||||
[2025-03-24 09:38:18,636] INFO in app: Neue Steckdose mit IP 192.168.0.104 wurde beim Start ausgeschaltet
|
||||
[2025-03-24 09:38:18,650] INFO in app: Neuer Drucker angelegt: Printer 6 mit IP 192.168.0.106
|
||||
[2025-03-24 09:38:21,004] ERROR in app: Fehler bei der Anmeldung an P100-Gerät 192.168.0.106: HTTPConnectionPool(host='192.168.0.106', port=80): Max retries exceeded with url: /app (Caused by ConnectTimeoutError(<urllib3.connection.HTTPConnection object at 0x7fb0b1dd90>, 'Connection to 192.168.0.106 timed out. (connect timeout=2)'))
|
||||
[2025-03-24 09:38:21,006] INFO in app: Neue Steckdose mit IP 192.168.0.106 wurde beim Start ausgeschaltet
|
||||
[2025-03-24 09:38:21,007] INFO in app: Starte Hintergrund-Thread für Job-Überprüfung und Steckdosen-Monitoring
|
||||
[2025-03-24 09:38:21,008] INFO in app: Hintergrund-Thread für Job-Überprüfung gestartet
|
||||
[2025-03-24 09:38:21,014] INFO in app: 0 abgelaufene Jobs überprüft, 0 Steckdosen aktualisiert.
|
||||
* Serving Flask app 'app'
|
||||
* Debug mode: on
|
||||
[2025-03-24 09:38:21,023] INFO in app: Überprüfe Verbindungsstatus von 6 Steckdosen
|
||||
WARNING: This is a development server. Do not use it in a production deployment. Use a production WSGI server instead.
|
||||
* Running on all addresses (0.0.0.0)
|
||||
* Running on http://127.0.0.1:5000
|
||||
* Running on http://192.168.0.105:5000
|
||||
Press CTRL+C to quit
|
||||
* Restarting with stat
|
||||
[2025-03-24 09:38:21,810] ERROR in app: Fehler bei der Anmeldung an P100-Gerät 192.168.0.100: Expecting value: line 1 column 1 (char 0)
|
||||
[2025-03-24 09:38:21,826] INFO in app: Verbindungsstatus für Steckdose 80c65076-acdb-4448-ac6e-05a44b35f5b2 geändert: offline
|
||||
[2025-03-24 09:38:21,845] WARNING in app: Steckdose Printer 1 (192.168.0.100) ist nicht erreichbar
|
||||
[2025-03-24 09:38:21,913] INFO in app: MYP Backend starting up
|
||||
[2025-03-24 09:38:21,968] INFO in app: Initialisiere Drucker aus Umgebungsvariablen
|
||||
[2025-03-24 09:38:21,969] INFO in app: Drucker mit IP 192.168.0.100 existiert bereits in der Datenbank
|
||||
[2025-03-24 09:38:22,109] ERROR in app: Fehler bei der Anmeldung an P100-Gerät 192.168.0.101: Expecting value: line 1 column 1 (char 0)
|
||||
[2025-03-24 09:38:22,120] INFO in app: Verbindungsstatus für Steckdose 19e70cd5-5fdb-439b-80e3-807015c7cb15 geändert: offline
|
||||
[2025-03-24 09:38:22,134] WARNING in app: Steckdose Printer 2 (192.168.0.101) ist nicht erreichbar
|
||||
[2025-03-24 09:38:22,666] ERROR in app: Fehler bei der Anmeldung an P100-Gerät 192.168.0.100: Expecting value: line 1 column 1 (char 0)
|
||||
[2025-03-24 09:38:22,667] INFO in app: Steckdose mit IP 192.168.0.100 wurde beim Start ausgeschaltet
|
||||
[2025-03-24 09:38:22,668] INFO in app: Drucker mit IP 192.168.0.101 existiert bereits in der Datenbank
|
||||
[2025-03-24 09:38:22,806] ERROR in app: Fehler bei der Anmeldung an P100-Gerät 192.168.0.102: Expecting value: line 1 column 1 (char 0)
|
||||
[2025-03-24 09:38:22,819] INFO in app: Verbindungsstatus für Steckdose 7cdc29a8-3593-4666-8419-070914c6d6c5 geändert: offline
|
||||
[2025-03-24 09:38:22,831] WARNING in app: Steckdose Printer 3 (192.168.0.102) ist nicht erreichbar
|
||||
[2025-03-24 09:38:23,222] ERROR in app: Fehler bei der Anmeldung an P100-Gerät 192.168.0.101: Expecting value: line 1 column 1 (char 0)
|
||||
[2025-03-24 09:38:23,223] INFO in app: Steckdose mit IP 192.168.0.101 wurde beim Start ausgeschaltet
|
||||
[2025-03-24 09:38:23,223] INFO in app: Drucker mit IP 192.168.0.102 existiert bereits in der Datenbank
|
||||
[2025-03-24 09:38:23,228] ERROR in app: Fehler bei der Anmeldung an P100-Gerät 192.168.0.103: Expecting value: line 1 column 1 (char 0)
|
||||
[2025-03-24 09:38:23,243] INFO in app: Verbindungsstatus für Steckdose 69be8092-0eea-4797-a940-51bdec244cf7 geändert: offline
|
||||
[2025-03-24 09:38:23,256] WARNING in app: Steckdose Printer 4 (192.168.0.103) ist nicht erreichbar
|
||||
[2025-03-24 09:38:23,458] ERROR in app: Fehler bei der Anmeldung an P100-Gerät 192.168.0.104: Expecting value: line 1 column 1 (char 0)
|
||||
[2025-03-24 09:38:23,476] INFO in app: Verbindungsstatus für Steckdose 90caa30e-adaf-44ec-a680-6beea72a570a geändert: offline
|
||||
[2025-03-24 09:38:23,489] WARNING in app: Steckdose Printer 5 (192.168.0.104) ist nicht erreichbar
|
||||
[2025-03-24 09:38:23,492] ERROR in app: Fehler bei der Anmeldung an P100-Gerät 192.168.0.102: Expecting value: line 1 column 1 (char 0)
|
||||
[2025-03-24 09:38:23,493] INFO in app: Steckdose mit IP 192.168.0.102 wurde beim Start ausgeschaltet
|
||||
[2025-03-24 09:38:23,493] INFO in app: Drucker mit IP 192.168.0.103 existiert bereits in der Datenbank
|
||||
[2025-03-24 09:38:24,058] ERROR in app: Fehler bei der Anmeldung an P100-Gerät 192.168.0.103: Expecting value: line 1 column 1 (char 0)
|
||||
[2025-03-24 09:38:24,058] INFO in app: Steckdose mit IP 192.168.0.103 wurde beim Start ausgeschaltet
|
||||
[2025-03-24 09:38:24,059] INFO in app: Drucker mit IP 192.168.0.104 existiert bereits in der Datenbank
|
||||
[2025-03-24 09:38:24,610] ERROR in app: Fehler bei der Anmeldung an P100-Gerät 192.168.0.104: Expecting value: line 1 column 1 (char 0)
|
||||
[2025-03-24 09:38:24,611] INFO in app: Steckdose mit IP 192.168.0.104 wurde beim Start ausgeschaltet
|
||||
[2025-03-24 09:38:24,612] INFO in app: Drucker mit IP 192.168.0.106 existiert bereits in der Datenbank
|
||||
[2025-03-24 09:38:26,344] ERROR in app: Fehler bei der Anmeldung an P100-Gerät 192.168.0.106: HTTPConnectionPool(host='192.168.0.106', port=80): Max retries exceeded with url: /app (Caused by ConnectTimeoutError(<urllib3.connection.HTTPConnection object at 0x7fafc91790>, 'Connection to 192.168.0.106 timed out. (connect timeout=2)'))
|
||||
[2025-03-24 09:38:26,357] INFO in app: Verbindungsstatus für Steckdose 2b6b9831-e4c1-4f60-8107-69cbc8b58e2c geändert: offline
|
||||
[2025-03-24 09:38:26,370] WARNING in app: Steckdose Printer 6 (192.168.0.106) ist nicht erreichbar
|
||||
[2025-03-24 09:38:26,371] INFO in app: Verbindungsüberprüfung abgeschlossen: 0 online, 6 offline, 0 übersprungen
|
||||
[2025-03-24 09:38:26,371] INFO in app: Nächste Socket-Überprüfung in 120 Sekunden
|
||||
[2025-03-24 09:38:26,775] ERROR in app: Fehler bei der Anmeldung an P100-Gerät 192.168.0.106: HTTPConnectionPool(host='192.168.0.106', port=80): Max retries exceeded with url: /app (Caused by ConnectTimeoutError(<urllib3.connection.HTTPConnection object at 0x7f8214df50>, 'Connection to 192.168.0.106 timed out. (connect timeout=2)'))
|
||||
[2025-03-24 09:38:26,776] INFO in app: Steckdose mit IP 192.168.0.106 wurde beim Start ausgeschaltet
|
||||
[2025-03-24 09:38:26,776] INFO in app: Starte Hintergrund-Thread für Job-Überprüfung und Steckdosen-Monitoring
|
||||
[2025-03-24 09:38:26,777] INFO in app: Hintergrund-Thread für Job-Überprüfung gestartet
|
||||
[2025-03-24 09:38:26,780] INFO in app: 0 abgelaufene Jobs überprüft, 0 Steckdosen aktualisiert.
|
||||
[2025-03-24 09:38:26,784] INFO in app: Überprüfe Verbindungsstatus von 6 Steckdosen
|
||||
* Debugger is active!
|
||||
* Debugger PIN: 101-484-383
|
||||
[2025-03-24 09:38:27,279] ERROR in app: Fehler bei der Anmeldung an P100-Gerät 192.168.0.100: Expecting value: line 1 column 1 (char 0)
|
||||
[2025-03-24 09:38:27,280] WARNING in app: Steckdose Printer 1 (192.168.0.100) ist nicht erreichbar
|
||||
[2025-03-24 09:38:27,719] ERROR in app: Fehler bei der Anmeldung an P100-Gerät 192.168.0.101: Expecting value: line 1 column 1 (char 0)
|
||||
[2025-03-24 09:38:27,720] WARNING in app: Steckdose Printer 2 (192.168.0.101) ist nicht erreichbar
|
||||
[2025-03-24 09:38:28,073] ERROR in app: Fehler bei der Anmeldung an P100-Gerät 192.168.0.102: Expecting value: line 1 column 1 (char 0)
|
||||
[2025-03-24 09:38:28,074] WARNING in app: Steckdose Printer 3 (192.168.0.102) ist nicht erreichbar
|
||||
[2025-03-24 09:38:28,887] ERROR in app: Fehler bei der Anmeldung an P100-Gerät 192.168.0.103: Expecting value: line 1 column 1 (char 0)
|
||||
[2025-03-24 09:38:28,887] WARNING in app: Steckdose Printer 4 (192.168.0.103) ist nicht erreichbar
|
||||
[2025-03-24 09:38:29,312] ERROR in app: Fehler bei der Anmeldung an P100-Gerät 192.168.0.104: Expecting value: line 1 column 1 (char 0)
|
||||
[2025-03-24 09:38:29,312] WARNING in app: Steckdose Printer 5 (192.168.0.104) ist nicht erreichbar
|
1 backend/migrations/README Normal file
@@ -0,0 +1 @@
Single-database configuration for Flask.
50 backend/migrations/alembic.ini Normal file
@@ -0,0 +1,50 @@
# A generic, single database configuration.
|
||||
|
||||
[alembic]
|
||||
# template used to generate migration files
|
||||
# file_template = %%(rev)s_%%(slug)s
|
||||
|
||||
# set to 'true' to run the environment during
|
||||
# the 'revision' command, regardless of autogenerate
|
||||
# revision_environment = false
|
||||
|
||||
|
||||
# Logging configuration
|
||||
[loggers]
|
||||
keys = root,sqlalchemy,alembic,flask_migrate
|
||||
|
||||
[handlers]
|
||||
keys = console
|
||||
|
||||
[formatters]
|
||||
keys = generic
|
||||
|
||||
[logger_root]
|
||||
level = WARN
|
||||
handlers = console
|
||||
qualname =
|
||||
|
||||
[logger_sqlalchemy]
|
||||
level = WARN
|
||||
handlers =
|
||||
qualname = sqlalchemy.engine
|
||||
|
||||
[logger_alembic]
|
||||
level = INFO
|
||||
handlers =
|
||||
qualname = alembic
|
||||
|
||||
[logger_flask_migrate]
|
||||
level = INFO
|
||||
handlers =
|
||||
qualname = flask_migrate
|
||||
|
||||
[handler_console]
|
||||
class = StreamHandler
|
||||
args = (sys.stderr,)
|
||||
level = NOTSET
|
||||
formatter = generic
|
||||
|
||||
[formatter_generic]
|
||||
format = %(levelname)-5.5s [%(name)s] %(message)s
|
||||
datefmt = %H:%M:%S
|
113 backend/migrations/env.py Normal file
@@ -0,0 +1,113 @@
import logging
|
||||
from logging.config import fileConfig
|
||||
|
||||
from flask import current_app
|
||||
|
||||
from alembic import context
|
||||
|
||||
# this is the Alembic Config object, which provides
|
||||
# access to the values within the .ini file in use.
|
||||
config = context.config
|
||||
|
||||
# Interpret the config file for Python logging.
|
||||
# This line sets up loggers basically.
|
||||
fileConfig(config.config_file_name)
|
||||
logger = logging.getLogger('alembic.env')
|
||||
|
||||
|
||||
def get_engine():
|
||||
try:
|
||||
# this works with Flask-SQLAlchemy<3 and Alchemical
|
||||
return current_app.extensions['migrate'].db.get_engine()
|
||||
except (TypeError, AttributeError):
|
||||
# this works with Flask-SQLAlchemy>=3
|
||||
return current_app.extensions['migrate'].db.engine
|
||||
|
||||
|
||||
def get_engine_url():
|
||||
try:
|
||||
return get_engine().url.render_as_string(hide_password=False).replace(
|
||||
'%', '%%')
|
||||
except AttributeError:
|
||||
return str(get_engine().url).replace('%', '%%')
|
||||
|
||||
|
||||
# add your model's MetaData object here
|
||||
# for 'autogenerate' support
|
||||
# from myapp import mymodel
|
||||
# target_metadata = mymodel.Base.metadata
|
||||
config.set_main_option('sqlalchemy.url', get_engine_url())
|
||||
target_db = current_app.extensions['migrate'].db
|
||||
|
||||
# other values from the config, defined by the needs of env.py,
|
||||
# can be acquired:
|
||||
# my_important_option = config.get_main_option("my_important_option")
|
||||
# ... etc.
|
||||
|
||||
|
||||
def get_metadata():
|
||||
if hasattr(target_db, 'metadatas'):
|
||||
return target_db.metadatas[None]
|
||||
return target_db.metadata
|
||||
|
||||
|
||||
def run_migrations_offline():
|
||||
"""Run migrations in 'offline' mode.
|
||||
|
||||
This configures the context with just a URL
|
||||
and not an Engine, though an Engine is acceptable
|
||||
here as well. By skipping the Engine creation
|
||||
we don't even need a DBAPI to be available.
|
||||
|
||||
Calls to context.execute() here emit the given string to the
|
||||
script output.
|
||||
|
||||
"""
|
||||
url = config.get_main_option("sqlalchemy.url")
|
||||
context.configure(
|
||||
url=url, target_metadata=get_metadata(), literal_binds=True
|
||||
)
|
||||
|
||||
with context.begin_transaction():
|
||||
context.run_migrations()
|
||||
|
||||
|
||||
def run_migrations_online():
|
||||
"""Run migrations in 'online' mode.
|
||||
|
||||
In this scenario we need to create an Engine
|
||||
and associate a connection with the context.
|
||||
|
||||
"""
|
||||
|
||||
# this callback is used to prevent an auto-migration from being generated
|
||||
# when there are no changes to the schema
|
||||
# reference: http://alembic.zzzcomputing.com/en/latest/cookbook.html
|
||||
def process_revision_directives(context, revision, directives):
|
||||
if getattr(config.cmd_opts, 'autogenerate', False):
|
||||
script = directives[0]
|
||||
if script.upgrade_ops.is_empty():
|
||||
directives[:] = []
|
||||
logger.info('No changes in schema detected.')
|
||||
|
||||
conf_args = current_app.extensions['migrate'].configure_args
|
||||
if conf_args.get("process_revision_directives") is None:
|
||||
conf_args["process_revision_directives"] = process_revision_directives
|
||||
|
||||
connectable = get_engine()
|
||||
|
||||
with connectable.connect() as connection:
|
||||
context.configure(
|
||||
connection=connection,
|
||||
target_metadata=get_metadata(),
|
||||
**conf_args
|
||||
)
|
||||
|
||||
with context.begin_transaction():
|
||||
context.run_migrations()
|
||||
|
||||
|
||||
if context.is_offline_mode():
|
||||
run_migrations_offline()
|
||||
else:
|
||||
run_migrations_online()
|
24 backend/migrations/script.py.mako Normal file
@@ -0,0 +1,24 @@
"""${message}
|
||||
|
||||
Revision ID: ${up_revision}
|
||||
Revises: ${down_revision | comma,n}
|
||||
Create Date: ${create_date}
|
||||
|
||||
"""
|
||||
from alembic import op
|
||||
import sqlalchemy as sa
|
||||
${imports if imports else ""}
|
||||
|
||||
# revision identifiers, used by Alembic.
|
||||
revision = ${repr(up_revision)}
|
||||
down_revision = ${repr(down_revision)}
|
||||
branch_labels = ${repr(branch_labels)}
|
||||
depends_on = ${repr(depends_on)}
|
||||
|
||||
|
||||
def upgrade():
|
||||
${upgrades if upgrades else "pass"}
|
||||
|
||||
|
||||
def downgrade():
|
||||
${downgrades if downgrades else "pass"}
|
42 backend/migrations/versions/add_waiting_approval.py Normal file
@@ -0,0 +1,42 @@
"""Add waiting_approval column to job table
|
||||
|
||||
Revision ID: add_waiting_approval
|
||||
Revises: af3faaa3844c
|
||||
Create Date: 2025-03-12 14:00:00.000000
|
||||
|
||||
"""
|
||||
from alembic import op
|
||||
import sqlalchemy as sa
|
||||
|
||||
|
||||
# revision identifiers, used by Alembic.
|
||||
revision = 'add_waiting_approval'
|
||||
down_revision = 'af3faaa3844c'
|
||||
branch_labels = None
|
||||
depends_on = None
|
||||
|
||||
|
||||
def upgrade():
|
||||
# Füge die neue Spalte waiting_approval zur job-Tabelle hinzu
|
||||
with op.batch_alter_table('job', schema=None) as batch_op:
|
||||
batch_op.add_column(sa.Column('waiting_approval', sa.Integer(), server_default='0', nullable=False))
|
||||
|
||||
# SQLite-kompatible Migration für die print_job-Tabelle, falls diese existiert
|
||||
try:
|
||||
with op.batch_alter_table('print_job', schema=None) as batch_op:
|
||||
batch_op.add_column(sa.Column('waiting_approval', sa.Boolean(), server_default='0', nullable=False))
|
||||
except Exception as e:
|
||||
print(f"Migration für print_job-Tabelle übersprungen: {e}")
|
||||
|
||||
|
||||
def downgrade():
|
||||
# Entferne die waiting_approval-Spalte aus der job-Tabelle
|
||||
with op.batch_alter_table('job', schema=None) as batch_op:
|
||||
batch_op.drop_column('waiting_approval')
|
||||
|
||||
# SQLite-kompatible Migration für die print_job-Tabelle, falls diese existiert
|
||||
try:
|
||||
with op.batch_alter_table('print_job', schema=None) as batch_op:
|
||||
batch_op.drop_column('waiting_approval')
|
||||
except Exception as e:
|
||||
print(f"Downgrade für print_job-Tabelle übersprungen: {e}")
|
81 backend/migrations/versions/af3faaa3844c_.py Normal file
@@ -0,0 +1,81 @@
"""empty message
|
||||
|
||||
Revision ID: af3faaa3844c
|
||||
Revises:
|
||||
Create Date: 2025-03-11 11:16:04.961964
|
||||
|
||||
"""
|
||||
from alembic import op
|
||||
import sqlalchemy as sa
|
||||
|
||||
|
||||
# revision identifiers, used by Alembic.
|
||||
revision = 'af3faaa3844c'
|
||||
down_revision = None
|
||||
branch_labels = None
|
||||
depends_on = None
|
||||
|
||||
|
||||
def upgrade():
|
||||
# ### commands auto generated by Alembic - please adjust! ###
|
||||
op.create_table('printer',
|
||||
sa.Column('id', sa.String(length=36), nullable=False),
|
||||
sa.Column('name', sa.String(length=64), nullable=False),
|
||||
sa.Column('description', sa.Text(), nullable=False),
|
||||
sa.Column('status', sa.Integer(), nullable=True),
|
||||
sa.Column('ip_address', sa.String(length=15), nullable=True),
|
||||
sa.PrimaryKeyConstraint('id')
|
||||
)
|
||||
with op.batch_alter_table('printer', schema=None) as batch_op:
|
||||
batch_op.create_index(batch_op.f('ix_printer_name'), ['name'], unique=False)
|
||||
|
||||
op.create_table('user',
|
||||
sa.Column('id', sa.String(length=36), nullable=False),
|
||||
sa.Column('username', sa.String(length=64), nullable=True),
|
||||
sa.Column('password_hash', sa.String(length=128), nullable=True),
|
||||
sa.Column('display_name', sa.String(length=100), nullable=True),
|
||||
sa.Column('email', sa.String(length=120), nullable=True),
|
||||
sa.Column('role', sa.String(length=20), nullable=True),
|
||||
sa.PrimaryKeyConstraint('id')
|
||||
)
|
||||
with op.batch_alter_table('user', schema=None) as batch_op:
|
||||
batch_op.create_index(batch_op.f('ix_user_email'), ['email'], unique=True)
|
||||
batch_op.create_index(batch_op.f('ix_user_username'), ['username'], unique=True)
|
||||
|
||||
op.create_table('print_job',
|
||||
sa.Column('id', sa.String(length=36), nullable=False),
|
||||
sa.Column('printer_id', sa.String(length=36), nullable=False),
|
||||
sa.Column('user_id', sa.String(length=36), nullable=False),
|
||||
sa.Column('start_at', sa.DateTime(), nullable=True),
|
||||
sa.Column('duration_in_minutes', sa.Integer(), nullable=False),
|
||||
sa.Column('comments', sa.Text(), nullable=True),
|
||||
sa.Column('aborted', sa.Boolean(), nullable=True),
|
||||
sa.Column('abort_reason', sa.Text(), nullable=True),
|
||||
sa.ForeignKeyConstraint(['printer_id'], ['printer.id'], ),
|
||||
sa.ForeignKeyConstraint(['user_id'], ['user.id'], ),
|
||||
sa.PrimaryKeyConstraint('id')
|
||||
)
|
||||
op.create_table('session',
|
||||
sa.Column('id', sa.String(length=36), nullable=False),
|
||||
sa.Column('user_id', sa.String(length=36), nullable=False),
|
||||
sa.Column('expires_at', sa.DateTime(), nullable=False),
|
||||
sa.ForeignKeyConstraint(['user_id'], ['user.id'], ),
|
||||
sa.PrimaryKeyConstraint('id')
|
||||
)
|
||||
# ### end Alembic commands ###
|
||||
|
||||
|
||||
def downgrade():
|
||||
# ### commands auto generated by Alembic - please adjust! ###
|
||||
op.drop_table('session')
|
||||
op.drop_table('print_job')
|
||||
with op.batch_alter_table('user', schema=None) as batch_op:
|
||||
batch_op.drop_index(batch_op.f('ix_user_username'))
|
||||
batch_op.drop_index(batch_op.f('ix_user_email'))
|
||||
|
||||
op.drop_table('user')
|
||||
with op.batch_alter_table('printer', schema=None) as batch_op:
|
||||
batch_op.drop_index(batch_op.f('ix_printer_name'))
|
||||
|
||||
op.drop_table('printer')
|
||||
# ### end Alembic commands ###
|
7 backend/requirements.txt Normal file
@@ -0,0 +1,7 @@
flask==2.3.3
flask-cors==4.0.0
pyjwt==2.8.0
python-dotenv==1.0.0
werkzeug==2.3.7
gunicorn==21.2.0
PyP100==0.0.19
12068 backend/static/css/bootstrap.css vendored Normal file
File diff suppressed because it is too large
6314 backend/static/js/bootstrap.bundle.js vendored Normal file
File diff suppressed because it is too large
169 backend/templates/base.html Normal file
@@ -0,0 +1,169 @@
<!DOCTYPE html>
|
||||
<html lang="de">
|
||||
<head>
|
||||
<meta charset="UTF-8">
|
||||
<meta name="viewport" content="width=device-width, initial-scale=1.0">
|
||||
<title>{% block title %}MYP API Tester{% endblock %}</title>
|
||||
<link rel="stylesheet" href="{{ url_for('static', filename='css/bootstrap.css') }}">
|
||||
<style>
|
||||
.sidebar {
|
||||
min-height: calc(100vh - 56px);
|
||||
background-color: #f8f9fa;
|
||||
}
|
||||
.api-response {
|
||||
max-height: 300px;
|
||||
overflow-y: auto;
|
||||
font-family: monospace;
|
||||
background-color: #f5f5f5;
|
||||
padding: 10px;
|
||||
border-radius: 4px;
|
||||
}
|
||||
.nav-link.active {
|
||||
background-color: #0d6efd;
|
||||
color: white !important;
|
||||
}
|
||||
</style>
|
||||
</head>
|
||||
<body>
|
||||
<nav class="navbar navbar-expand-lg navbar-dark bg-dark">
|
||||
<div class="container-fluid">
|
||||
<a class="navbar-brand" href="/">MYP API Tester</a>
|
||||
<button class="navbar-toggler" type="button" data-bs-toggle="collapse" data-bs-target="#navbarNav">
|
||||
<span class="navbar-toggler-icon"></span>
|
||||
</button>
|
||||
<div class="collapse navbar-collapse" id="navbarNav">
|
||||
<ul class="navbar-nav">
|
||||
<li class="nav-item">
|
||||
<a class="nav-link {% if active_page == 'home' %}active{% endif %}" href="/">Home</a>
|
||||
</li>
|
||||
<li class="nav-item">
|
||||
<a class="nav-link {% if active_page == 'printers' %}active{% endif %}" href="/admin/printers">Drucker</a>
|
||||
</li>
|
||||
<li class="nav-item">
|
||||
<a class="nav-link {% if active_page == 'jobs' %}active{% endif %}" href="/admin/jobs">Druckaufträge</a>
|
||||
</li>
|
||||
<li class="nav-item">
|
||||
<a class="nav-link {% if active_page == 'users' %}active{% endif %}" href="/admin/users">Benutzer</a>
|
||||
</li>
|
||||
<li class="nav-item">
|
||||
<a class="nav-link {% if active_page == 'stats' %}active{% endif %}" href="/admin/stats">Statistiken</a>
|
||||
</li>
|
||||
</ul>
|
||||
<ul class="navbar-nav ms-auto">
|
||||
{% if current_user %}
|
||||
<li class="nav-item dropdown">
|
||||
<a class="nav-link dropdown-toggle" href="#" id="userDropdown" role="button" data-bs-toggle="dropdown">
|
||||
{{ current_user.username }}
|
||||
</a>
|
||||
<ul class="dropdown-menu dropdown-menu-end">
|
||||
<li><a class="dropdown-item" href="/logout">Abmelden</a></li>
|
||||
</ul>
|
||||
</li>
|
||||
{% else %}
|
||||
<li class="nav-item">
|
||||
<a class="nav-link" href="/login">Anmelden</a>
|
||||
</li>
|
||||
{% endif %}
|
||||
</ul>
|
||||
</div>
|
||||
</div>
|
||||
</nav>
|
||||
|
||||
<div class="container-fluid py-3">
|
||||
{% with messages = get_flashed_messages(with_categories=true) %}
|
||||
{% if messages %}
|
||||
{% for category, message in messages %}
|
||||
<div class="alert alert-{{ category }}" role="alert">
|
||||
{{ message }}
|
||||
</div>
|
||||
{% endfor %}
|
||||
{% endif %}
|
||||
{% endwith %}
|
||||
|
||||
{% block content %}{% endblock %}
|
||||
</div>
|
||||
|
||||
<script src="{{ url_for('static', filename='js/bootstrap.bundle.js') }}"></script>
|
||||
<script>
|
||||
function formatJson(jsonString) {
|
||||
try {
|
||||
const obj = JSON.parse(jsonString);
|
||||
return JSON.stringify(obj, null, 2);
|
||||
} catch (e) {
|
||||
return jsonString;
|
||||
}
|
||||
}
|
||||
|
||||
document.addEventListener('DOMContentLoaded', function() {
|
||||
// Format all response areas
|
||||
document.querySelectorAll('.api-response').forEach(function(el) {
|
||||
if (el.textContent) {
|
||||
el.textContent = formatJson(el.textContent);
|
||||
}
|
||||
});
|
||||
|
||||
// Add event listener to show response areas
|
||||
document.querySelectorAll('.api-form').forEach(function(form) {
|
||||
form.addEventListener('submit', async function(e) {
|
||||
e.preventDefault();
|
||||
|
||||
const url = this.getAttribute('data-url');
|
||||
const method = this.getAttribute('data-method') || 'GET';
|
||||
const responseArea = document.getElementById(this.getAttribute('data-response'));
|
||||
const formData = new FormData(this);
|
||||
const data = {};
|
||||
|
||||
formData.forEach((value, key) => {
|
||||
if (value) {
|
||||
try {
|
||||
// Try to parse as JSON if it looks like JSON
|
||||
if (value.trim().startsWith('{') || value.trim().startsWith('[')) {
|
||||
data[key] = JSON.parse(value);
|
||||
} else {
|
||||
data[key] = value;
|
||||
}
|
||||
} catch (e) {
|
||||
data[key] = value;
|
||||
}
|
||||
}
|
||||
});
|
||||
|
||||
const options = {
|
||||
method: method,
|
||||
headers: {
|
||||
'Content-Type': 'application/json'
|
||||
},
|
||||
credentials: 'same-origin'
|
||||
};
|
||||
|
||||
if (method !== 'GET' && method !== 'HEAD') {
|
||||
options.body = JSON.stringify(data);
|
||||
}
|
||||
|
||||
try {
|
||||
responseArea.textContent = 'Sending request...';
|
||||
const response = await fetch(url, options);
|
||||
const responseText = await response.text();
|
||||
|
||||
try {
|
||||
const formatted = formatJson(responseText);
|
||||
responseArea.textContent = formatted;
|
||||
} catch (e) {
|
||||
responseArea.textContent = responseText;
|
||||
}
|
||||
|
||||
if (this.hasAttribute('data-reload') && response.ok) {
|
||||
setTimeout(() => {
|
||||
window.location.reload();
|
||||
}, 1000);
|
||||
}
|
||||
} catch (err) {
|
||||
responseArea.textContent = 'Error: ' + err.message;
|
||||
}
|
||||
});
|
||||
});
|
||||
});
|
||||
</script>
|
||||
{% block scripts %}{% endblock %}
|
||||
</body>
|
||||
</html>
|
304 backend/templates/dashboard.html Normal file
@@ -0,0 +1,304 @@
{% extends "base.html" %}
|
||||
|
||||
{% block title %}Dashboard - MYP API Tester{% endblock %}
|
||||
|
||||
{% block content %}
|
||||
<div class="row">
|
||||
<div class="col-md-12 mb-4">
|
||||
<div class="card">
|
||||
<div class="card-header">
|
||||
<h4 class="mb-0">Willkommen, {{ current_user.display_name }}</h4>
|
||||
</div>
|
||||
<div class="card-body">
|
||||
<p>Benutzerdetails:</p>
|
||||
<ul>
|
||||
<li><strong>ID:</strong> {{ current_user.id }}</li>
|
||||
<li><strong>Benutzername:</strong> {{ current_user.username }}</li>
|
||||
<li><strong>E-Mail:</strong> {{ current_user.email or "Nicht angegeben" }}</li>
|
||||
<li><strong>Rolle:</strong> {{ current_user.role }}</li>
|
||||
</ul>
|
||||
<div class="mt-3">
|
||||
<a href="/admin/printers" class="btn btn-primary me-2">Drucker verwalten</a>
|
||||
<a href="/admin/jobs" class="btn btn-success me-2">Druckaufträge verwalten</a>
|
||||
{% if current_user.role == 'admin' %}
|
||||
<a href="/admin/users" class="btn btn-info me-2">Benutzer verwalten</a>
|
||||
<a href="/admin/stats" class="btn btn-secondary">Statistiken</a>
|
||||
{% endif %}
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div class="row">
|
||||
<div class="col-md-6 mb-4">
|
||||
<div class="card">
|
||||
<div class="card-header">
|
||||
<h5 class="mb-0">Aktive Druckaufträge</h5>
|
||||
</div>
|
||||
<div class="card-body">
|
||||
<form class="api-form mb-3" data-url="/api/jobs" data-method="GET" data-response="jobsResponse">
|
||||
<button type="submit" class="btn btn-primary">Aktualisieren</button>
|
||||
</form>
|
||||
|
||||
<div id="activeJobsContainer">
|
||||
<div class="alert alert-info">Lade Druckaufträge...</div>
|
||||
</div>
|
||||
|
||||
<div class="d-none">
|
||||
<h6>API-Antwort:</h6>
|
||||
<pre class="api-response" id="jobsResponse"></pre>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div class="col-md-6 mb-4">
|
||||
<div class="card">
|
||||
<div class="card-header">
|
||||
<h5 class="mb-0">Verfügbare Drucker</h5>
|
||||
</div>
|
||||
<div class="card-body">
|
||||
<form class="api-form mb-3" data-url="/api/printers" data-method="GET" data-response="printersResponse">
|
||||
<button type="submit" class="btn btn-primary">Aktualisieren</button>
|
||||
</form>
|
||||
|
||||
<div id="availablePrintersContainer">
|
||||
<div class="alert alert-info">Lade Drucker...</div>
|
||||
</div>
|
||||
|
||||
<div class="d-none">
|
||||
<h6>API-Antwort:</h6>
|
||||
<pre class="api-response" id="printersResponse"></pre>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Job freischalten Modal -->
|
||||
<div class="modal fade" id="approveJobModal" tabindex="-1">
|
||||
<div class="modal-dialog">
|
||||
<div class="modal-content">
|
||||
<div class="modal-header">
|
||||
<h5 class="modal-title">Druckauftrag freischalten</h5>
|
||||
<button type="button" class="btn-close" data-bs-dismiss="modal" aria-label="Close"></button>
|
||||
</div>
|
||||
<div class="modal-body">
|
||||
<p>Möchten Sie diesen Druckauftrag jetzt freischalten und starten?</p>
|
||||
<p><strong>Hinweis:</strong> Der Drucker muss verfügbar sein, damit der Auftrag gestartet werden kann.</p>
|
||||
<form id="approveJobForm" class="api-form" data-method="POST" data-response="approveJobResponse" data-reload="true">
|
||||
<input type="hidden" id="approveJobId" name="jobId">
|
||||
</form>
|
||||
<div class="mt-3">
|
||||
<h6>Antwort:</h6>
|
||||
<pre class="api-response" id="approveJobResponse"></pre>
|
||||
</div>
|
||||
</div>
|
||||
<div class="modal-footer">
|
||||
<button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Abbrechen</button>
|
||||
<button type="submit" form="approveJobForm" class="btn btn-success">Freischalten</button>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
{% endblock %}
|
||||
|
||||
{% block scripts %}
|
||||
<script>
|
||||
document.addEventListener('DOMContentLoaded', function() {
|
||||
// Aufträge und Drucker laden
|
||||
document.querySelector('form[data-url="/api/jobs"]').dispatchEvent(new Event('submit'));
|
||||
document.querySelector('form[data-url="/api/printers"]').dispatchEvent(new Event('submit'));
|
||||
|
||||
// Tabellen aktualisieren, wenn Daten geladen werden
|
||||
const jobsResponse = document.getElementById('jobsResponse');
|
||||
const printersResponse = document.getElementById('printersResponse');
|
||||
|
||||
// Observer für Jobs
|
||||
const jobsObserver = new MutationObserver(function(mutations) {
|
||||
try {
|
||||
const jobs = JSON.parse(jobsResponse.textContent);
|
||||
updateActiveJobs(jobs);
|
||||
} catch (e) {
|
||||
console.error('Fehler beim Parsen der Auftrags-Daten:', e);
|
||||
}
|
||||
});
|
||||
|
||||
jobsObserver.observe(jobsResponse, { childList: true, characterData: true, subtree: true });
|
||||
|
||||
// Observer für Drucker
|
||||
const printersObserver = new MutationObserver(function(mutations) {
|
||||
try {
|
||||
const printers = JSON.parse(printersResponse.textContent);
|
||||
updateAvailablePrinters(printers);
|
||||
} catch (e) {
|
||||
console.error('Fehler beim Parsen der Drucker-Daten:', e);
|
||||
}
|
||||
});
|
||||
|
||||
printersObserver.observe(printersResponse, { childList: true, characterData: true, subtree: true });
|
||||
|
||||
// Approve-Modal vorbereiten
|
||||
document.getElementById('approveJobModal').addEventListener('show.bs.modal', function(event) {
|
||||
const button = event.relatedTarget;
|
||||
const jobId = button.getAttribute('data-job-id');
|
||||
|
||||
document.getElementById('approveJobId').value = jobId;
|
||||
document.getElementById('approveJobForm').setAttribute('data-url', `/api/jobs/${jobId}/approve`);
|
||||
});
|
||||
|
||||
// Automatische Aktualisierung alle 60 Sekunden
|
||||
setInterval(() => {
|
||||
document.querySelector('form[data-url="/api/jobs"]').dispatchEvent(new Event('submit'));
|
||||
document.querySelector('form[data-url="/api/printers"]').dispatchEvent(new Event('submit'));
|
||||
}, 60000);
|
||||
});
|
||||
|
||||
function updateActiveJobs(jobs) {
|
||||
const container = document.getElementById('activeJobsContainer');
|
||||
|
||||
// Filter für aktive und wartende Jobs
|
||||
const activeJobs = jobs.filter(job => !job.aborted && job.remainingMinutes > 0 && !job.waitingApproval);
|
||||
const waitingJobs = jobs.filter(job => !job.aborted && job.waitingApproval);
|
||||
|
||||
if (activeJobs.length === 0 && waitingJobs.length === 0) {
|
||||
container.innerHTML = '<div class="alert alert-info">Keine aktiven Druckaufträge vorhanden.</div>';
|
||||
return;
|
||||
}
|
||||
|
||||
let html = '';
|
||||
|
||||
// Aktive Jobs anzeigen
|
||||
if (activeJobs.length > 0) {
|
||||
html += '<h6 class="mt-3">Laufende Aufträge</h6>';
|
||||
html += '<div class="list-group mb-3">';
|
||||
|
||||
activeJobs.forEach(job => {
|
||||
// Prozentsatz der abgelaufenen Zeit berechnen
|
||||
const totalDuration = job.durationInMinutes;
|
||||
const elapsed = totalDuration - job.remainingMinutes;
|
||||
const percentage = Math.round((elapsed / totalDuration) * 100);
|
||||
|
||||
html += `
|
||||
<div class="list-group-item">
|
||||
<div class="d-flex justify-content-between">
|
||||
<div>
|
||||
<strong>Job ${job.id.substring(0, 8)}...</strong> (${job.durationInMinutes} Min)
|
||||
<div class="small text-muted">Verbleibend: ${job.remainingMinutes} Min</div>
|
||||
</div>
|
||||
<div>
|
||||
<span class="badge bg-warning">Aktiv</span>
|
||||
</div>
|
||||
</div>
|
||||
<div class="progress mt-2" style="height: 10px;">
|
||||
<div class="progress-bar progress-bar-striped progress-bar-animated"
|
||||
role="progressbar"
|
||||
style="width: ${percentage}%;"
|
||||
aria-valuenow="${percentage}"
|
||||
aria-valuemin="0"
|
||||
aria-valuemax="100">
|
||||
${percentage}%
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
`;
|
||||
});
|
||||
|
||||
html += '</div>';
|
||||
}
|
||||
|
||||
// Wartende Jobs anzeigen
|
||||
if (waitingJobs.length > 0) {
|
||||
html += '<h6 class="mt-3">Wartende Aufträge</h6>';
|
||||
html += '<div class="list-group">';
|
||||
|
||||
waitingJobs.forEach(job => {
|
||||
html += `
|
||||
<div class="list-group-item">
|
||||
<div class="d-flex justify-content-between">
|
||||
<div>
|
||||
<strong>Job ${job.id.substring(0, 8)}...</strong> (${job.durationInMinutes} Min)
|
||||
<div class="small text-muted">Drucker: ${job.socketId.substring(0, 8)}...</div>
|
||||
</div>
|
||||
<div>
|
||||
<span class="badge bg-info">Wartet</span>
|
||||
</div>
|
||||
</div>
|
||||
<div class="mt-2">
|
||||
<button type="button" class="btn btn-sm btn-success"
|
||||
data-bs-toggle="modal"
|
||||
data-bs-target="#approveJobModal"
|
||||
data-job-id="${job.id}">
|
||||
Freischalten
|
||||
</button>
|
||||
</div>
|
||||
</div>
|
||||
`;
|
||||
});
|
||||
|
||||
html += '</div>';
|
||||
}
|
||||
|
||||
container.innerHTML = html;
|
||||
}
|
||||
|
||||
function updateAvailablePrinters(printers) {
|
||||
const container = document.getElementById('availablePrintersContainer');
|
||||
|
||||
// Filter für verfügbare Drucker
|
||||
const availablePrinters = printers.filter(printer => printer.status === 0);
|
||||
|
||||
if (availablePrinters.length === 0) {
|
||||
container.innerHTML = '<div class="alert alert-warning">Keine verfügbaren Drucker gefunden.</div>';
|
||||
return;
|
||||
}
|
||||
|
||||
let html = '<div class="list-group">';
|
||||
|
||||
availablePrinters.forEach(printer => {
|
||||
html += `
|
||||
<div class="list-group-item">
|
||||
<div class="d-flex justify-content-between align-items-center">
|
||||
<div>
|
||||
<strong>${printer.name}</strong>
|
||||
<div class="small text-muted">${printer.description}</div>
|
||||
</div>
|
||||
<div>
|
||||
<span class="badge bg-success">Verfügbar</span>
|
||||
</div>
|
||||
</div>
|
||||
<div class="mt-2">
|
||||
<a href="/admin/jobs" class="btn btn-sm btn-primary">Auftrag erstellen</a>
|
||||
</div>
|
||||
</div>
|
||||
`;
|
||||
});
|
||||
|
||||
html += '</div>';
|
||||
|
||||
container.innerHTML = html;
|
||||
|
||||
// Gesamtstatistik hinzufügen
|
||||
const busyPrinters = printers.filter(printer => printer.status === 1).length;
|
||||
const totalPrinters = printers.length;
|
||||
|
||||
if (totalPrinters > 0) {
|
||||
const statsHtml = `
|
||||
<div class="mt-3">
|
||||
<div class="d-flex justify-content-between">
|
||||
<small>Verfügbar: ${availablePrinters.length} / ${totalPrinters}</small>
|
||||
<small>Belegt: ${busyPrinters} / ${totalPrinters}</small>
|
||||
</div>
|
||||
<div class="progress mt-1" style="height: 5px;">
|
||||
<div class="progress-bar bg-success" style="width: ${(availablePrinters.length / totalPrinters) * 100}%"></div>
|
||||
<div class="progress-bar bg-warning" style="width: ${(busyPrinters / totalPrinters) * 100}%"></div>
|
||||
</div>
|
||||
</div>
|
||||
`;
|
||||
|
||||
container.innerHTML += statsHtml;
|
||||
}
|
||||
}
|
||||
</script>
|
||||
{% endblock %}
|
443 backend/templates/jobs.html Normal file
@@ -0,0 +1,443 @@
{% extends "base.html" %}
|
||||
|
||||
{% block title %}Druckaufträge - MYP API Tester{% endblock %}
|
||||
|
||||
{% block content %}
|
||||
<div class="row">
|
||||
<div class="col-md-12 mb-4">
|
||||
<div class="card">
|
||||
<div class="card-header d-flex justify-content-between align-items-center">
|
||||
<h4 class="mb-0">Druckaufträge verwalten</h4>
|
||||
<button class="btn btn-primary" type="button" data-bs-toggle="collapse" data-bs-target="#newJobForm">
|
||||
Neuen Auftrag erstellen
|
||||
</button>
|
||||
</div>
|
||||
<div class="collapse" id="newJobForm">
|
||||
<div class="card-body border-bottom">
|
||||
<form class="api-form" data-url="/api/jobs" data-method="POST" data-response="createJobResponse" data-reload="true">
|
||||
<div class="mb-3">
|
||||
<label for="jobPrinterId" class="form-label">Drucker</label>
|
||||
<select class="form-control" id="jobPrinterId" name="printerId" required>
|
||||
<option value="">Drucker auswählen...</option>
|
||||
<!-- Wird dynamisch gefüllt -->
|
||||
</select>
|
||||
</div>
|
||||
<div class="mb-3">
|
||||
<label for="jobDuration" class="form-label">Dauer (Minuten)</label>
|
||||
<input type="number" class="form-control" id="jobDuration" name="durationInMinutes" min="1" required>
|
||||
</div>
|
||||
<div class="mb-3">
|
||||
<label for="jobComments" class="form-label">Kommentare</label>
|
||||
<textarea class="form-control" id="jobComments" name="comments" rows="3"></textarea>
|
||||
</div>
|
||||
<div class="mb-3 form-check">
|
||||
<input type="checkbox" class="form-check-input" id="allowQueuedJobs" name="allowQueuedJobs" value="true">
|
||||
<label class="form-check-label" for="allowQueuedJobs">
|
||||
Auftrag in Warteschlange erlauben (wenn Drucker belegt ist)
|
||||
</label>
|
||||
</div>
|
||||
<button type="submit" class="btn btn-success">Auftrag erstellen</button>
|
||||
</form>
|
||||
<div class="mt-3">
|
||||
<h6>Antwort:</h6>
|
||||
<pre class="api-response" id="createJobResponse"></pre>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
<div class="card-body">
|
||||
<form class="api-form mb-3" data-url="/api/jobs" data-method="GET" data-response="jobsResponse">
|
||||
<button type="submit" class="btn btn-primary">Aufträge aktualisieren</button>
|
||||
</form>
|
||||
|
||||
<div class="table-responsive">
|
||||
<table class="table table-striped table-hover">
|
||||
<thead>
|
||||
<tr>
|
||||
<th>ID</th>
|
||||
<th>Drucker</th>
|
||||
<th>Benutzer</th>
|
||||
<th>Start</th>
|
||||
<th>Dauer (Min)</th>
|
||||
<th>Verbleibend (Min)</th>
|
||||
<th>Status</th>
|
||||
<th>Kommentare</th>
|
||||
<th>Aktionen</th>
|
||||
</tr>
|
||||
</thead>
|
||||
<tbody id="jobsTableBody">
|
||||
<!-- Wird dynamisch gefüllt -->
|
||||
</tbody>
|
||||
</table>
|
||||
</div>
|
||||
|
||||
<div>
|
||||
<h6>API-Antwort:</h6>
|
||||
<pre class="api-response" id="jobsResponse"></pre>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Job abbrechen Modal -->
|
||||
<div class="modal fade" id="abortJobModal" tabindex="-1">
|
||||
<div class="modal-dialog">
|
||||
<div class="modal-content">
|
||||
<div class="modal-header">
|
||||
<h5 class="modal-title">Auftrag abbrechen</h5>
|
||||
<button type="button" class="btn-close" data-bs-dismiss="modal" aria-label="Close"></button>
|
||||
</div>
|
||||
<div class="modal-body">
|
||||
<p>Möchten Sie den Auftrag wirklich abbrechen?</p>
|
||||
<form id="abortJobForm" class="api-form" data-method="POST" data-response="abortJobResponse" data-reload="true">
|
||||
<input type="hidden" id="abortJobId" name="jobId">
|
||||
<div class="mb-3">
|
||||
<label for="abortReason" class="form-label">Abbruchgrund</label>
|
||||
<textarea class="form-control" id="abortReason" name="reason" rows="3"></textarea>
|
||||
</div>
|
||||
</form>
|
||||
<div class="mt-3">
|
||||
<h6>Antwort:</h6>
|
||||
<pre class="api-response" id="abortJobResponse"></pre>
|
||||
</div>
|
||||
</div>
|
||||
<div class="modal-footer">
|
||||
<button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Abbrechen</button>
|
||||
<button type="submit" form="abortJobForm" class="btn btn-danger">Auftrag abbrechen</button>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Job beenden Modal -->
|
||||
<div class="modal fade" id="finishJobModal" tabindex="-1">
|
||||
<div class="modal-dialog">
|
||||
<div class="modal-content">
|
||||
<div class="modal-header">
|
||||
<h5 class="modal-title">Auftrag beenden</h5>
|
||||
<button type="button" class="btn-close" data-bs-dismiss="modal" aria-label="Close"></button>
|
||||
</div>
|
||||
<div class="modal-body">
|
||||
<p>Möchten Sie den Auftrag als beendet markieren?</p>
|
||||
<form id="finishJobForm" class="api-form" data-method="POST" data-response="finishJobResponse" data-reload="true">
|
||||
<input type="hidden" id="finishJobId" name="jobId">
|
||||
</form>
|
||||
<div class="mt-3">
|
||||
<h6>Antwort:</h6>
|
||||
<pre class="api-response" id="finishJobResponse"></pre>
|
||||
</div>
|
||||
</div>
|
||||
<div class="modal-footer">
|
||||
<button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Abbrechen</button>
|
||||
<button type="submit" form="finishJobForm" class="btn btn-success">Auftrag beenden</button>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Job verlängern Modal -->
|
||||
<div class="modal fade" id="extendJobModal" tabindex="-1">
|
||||
<div class="modal-dialog">
|
||||
<div class="modal-content">
|
||||
<div class="modal-header">
|
||||
<h5 class="modal-title">Auftrag verlängern</h5>
|
||||
<button type="button" class="btn-close" data-bs-dismiss="modal" aria-label="Close"></button>
|
||||
</div>
|
||||
<div class="modal-body">
|
||||
<form id="extendJobForm" class="api-form" data-method="POST" data-response="extendJobResponse" data-reload="true">
|
||||
<input type="hidden" id="extendJobId" name="jobId">
|
||||
<div class="mb-3">
|
||||
<label for="extendHours" class="form-label">Stunden</label>
|
||||
<input type="number" class="form-control" id="extendHours" name="hours" min="0" value="0">
|
||||
</div>
|
||||
<div class="mb-3">
|
||||
<label for="extendMinutes" class="form-label">Minuten</label>
|
||||
<input type="number" class="form-control" id="extendMinutes" name="minutes" min="0" max="59" value="30">
|
||||
</div>
|
||||
</form>
|
||||
<div class="mt-3">
|
||||
<h6>Antwort:</h6>
|
||||
<pre class="api-response" id="extendJobResponse"></pre>
|
||||
</div>
|
||||
</div>
|
||||
<div class="modal-footer">
|
||||
<button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Abbrechen</button>
|
||||
<button type="submit" form="extendJobForm" class="btn btn-primary">Auftrag verlängern</button>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Job Kommentare bearbeiten Modal -->
|
||||
<div class="modal fade" id="editCommentsModal" tabindex="-1">
|
||||
<div class="modal-dialog">
|
||||
<div class="modal-content">
|
||||
<div class="modal-header">
|
||||
<h5 class="modal-title">Kommentare bearbeiten</h5>
|
||||
<button type="button" class="btn-close" data-bs-dismiss="modal" aria-label="Close"></button>
|
||||
</div>
|
||||
<div class="modal-body">
|
||||
<form id="editCommentsForm" class="api-form" data-method="PUT" data-response="editCommentsResponse" data-reload="true">
|
||||
<input type="hidden" id="editCommentsJobId" name="jobId">
|
||||
<div class="mb-3">
|
||||
<label for="editJobComments" class="form-label">Kommentare</label>
|
||||
<textarea class="form-control" id="editJobComments" name="comments" rows="3"></textarea>
|
||||
</div>
|
||||
</form>
|
||||
<div class="mt-3">
|
||||
<h6>Antwort:</h6>
|
||||
<pre class="api-response" id="editCommentsResponse"></pre>
|
||||
</div>
|
||||
</div>
|
||||
<div class="modal-footer">
|
||||
<button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Abbrechen</button>
|
||||
<button type="submit" form="editCommentsForm" class="btn btn-primary">Speichern</button>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Job freischalten Modal -->
|
||||
<div class="modal fade" id="approveJobModal" tabindex="-1">
|
||||
<div class="modal-dialog">
|
||||
<div class="modal-content">
|
||||
<div class="modal-header">
|
||||
<h5 class="modal-title">Druckauftrag freischalten</h5>
|
||||
<button type="button" class="btn-close" data-bs-dismiss="modal" aria-label="Close"></button>
|
||||
</div>
|
||||
<div class="modal-body">
|
||||
<p>Möchten Sie diesen Druckauftrag jetzt freischalten und starten?</p>
|
||||
<p><strong>Hinweis:</strong> Der Drucker muss verfügbar sein, damit der Auftrag gestartet werden kann.</p>
|
||||
<form id="approveJobForm" class="api-form" data-method="POST" data-response="approveJobResponse" data-reload="true">
|
||||
<input type="hidden" id="approveJobId" name="jobId">
|
||||
</form>
|
||||
<div class="mt-3">
|
||||
<h6>Antwort:</h6>
|
||||
<pre class="api-response" id="approveJobResponse"></pre>
|
||||
</div>
|
||||
</div>
|
||||
<div class="modal-footer">
|
||||
<button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Abbrechen</button>
|
||||
<button type="submit" form="approveJobForm" class="btn btn-success">Freischalten</button>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
{% endblock %}
|
||||
|
||||
{% block scripts %}
|
||||
<script>
|
||||
document.addEventListener('DOMContentLoaded', function() {
|
||||
// Drucker für Dropdown laden
|
||||
loadPrinters();
|
||||
|
||||
// Aufträge laden
|
||||
document.querySelector('form[data-url="/api/jobs"]').dispatchEvent(new Event('submit'));
|
||||
|
||||
// Tabelle aktualisieren, wenn Aufträge geladen werden
|
||||
const jobsResponse = document.getElementById('jobsResponse');
|
||||
const observer = new MutationObserver(function(mutations) {
|
||||
try {
|
||||
const jobs = JSON.parse(jobsResponse.textContent);
|
||||
updateJobsTable(jobs);
|
||||
} catch (e) {
|
||||
console.error('Fehler beim Parsen der Auftrags-Daten:', e);
|
||||
}
|
||||
});
|
||||
|
||||
observer.observe(jobsResponse, { childList: true, characterData: true, subtree: true });
|
||||
|
||||
// Abort-Modal vorbereiten
|
||||
document.getElementById('abortJobModal').addEventListener('show.bs.modal', function(event) {
|
||||
const button = event.relatedTarget;
|
||||
const jobId = button.getAttribute('data-job-id');
|
||||
|
||||
document.getElementById('abortJobId').value = jobId;
|
||||
document.getElementById('abortJobForm').setAttribute('data-url', `/api/jobs/${jobId}/abort`);
|
||||
});
|
||||
|
||||
// Finish-Modal vorbereiten
|
||||
document.getElementById('finishJobModal').addEventListener('show.bs.modal', function(event) {
|
||||
const button = event.relatedTarget;
|
||||
const jobId = button.getAttribute('data-job-id');
|
||||
|
||||
document.getElementById('finishJobId').value = jobId;
|
||||
document.getElementById('finishJobForm').setAttribute('data-url', `/api/jobs/${jobId}/finish`);
|
||||
});
|
||||
|
||||
// Extend-Modal vorbereiten
|
||||
document.getElementById('extendJobModal').addEventListener('show.bs.modal', function(event) {
|
||||
const button = event.relatedTarget;
|
||||
const jobId = button.getAttribute('data-job-id');
|
||||
|
||||
document.getElementById('extendJobId').value = jobId;
|
||||
document.getElementById('extendJobForm').setAttribute('data-url', `/api/jobs/${jobId}/extend`);
|
||||
});
|
||||
|
||||
// Edit-Comments-Modal vorbereiten
|
||||
document.getElementById('editCommentsModal').addEventListener('show.bs.modal', function(event) {
|
||||
const button = event.relatedTarget;
|
||||
const jobId = button.getAttribute('data-job-id');
|
||||
const comments = button.getAttribute('data-job-comments');
|
||||
|
||||
document.getElementById('editCommentsJobId').value = jobId;
|
||||
document.getElementById('editCommentsForm').setAttribute('data-url', `/api/jobs/${jobId}/comments`);
|
||||
document.getElementById('editJobComments').value = comments || '';
|
||||
});
|
||||
|
||||
// Approve-Modal vorbereiten
|
||||
document.getElementById('approveJobModal').addEventListener('show.bs.modal', function(event) {
|
||||
const button = event.relatedTarget;
|
||||
const jobId = button.getAttribute('data-job-id');
|
||||
|
||||
document.getElementById('approveJobId').value = jobId;
|
||||
document.getElementById('approveJobForm').setAttribute('data-url', `/api/jobs/${jobId}/approve`);
|
||||
});
|
||||
});
|
||||
|
||||
async function loadPrinters() {
|
||||
try {
|
||||
const response = await fetch('/api/printers');
|
||||
const printers = await response.json();
|
||||
|
||||
const selectElement = document.getElementById('jobPrinterId');
|
||||
selectElement.innerHTML = '<option value="">Drucker auswählen...</option>';
|
||||
|
||||
// Drucker anzeigen (alle, da man jetzt auch für belegte Drucker Jobs erstellen kann)
|
||||
printers.forEach(printer => {
|
||||
const option = document.createElement('option');
|
||||
option.value = printer.id;
|
||||
|
||||
// Status-Information zum Drucker hinzufügen
|
||||
const statusText = printer.status === 0 ? '(Verfügbar)' : '(Belegt)';
|
||||
option.textContent = `${printer.name} - ${printer.description} ${statusText}`;
|
||||
|
||||
// Belegte Drucker visuell unterscheiden
|
||||
if (printer.status !== 0) {
|
||||
option.classList.add('text-muted');
|
||||
}
|
||||
|
||||
selectElement.appendChild(option);
|
||||
});
|
||||
|
||||
// Hinweis auf die Checkbox für Warteschlange anzeigen oder verstecken
|
||||
const allowQueuedJobsCheckbox = document.getElementById('allowQueuedJobs');
|
||||
const queueCheckboxContainer = allowQueuedJobsCheckbox.closest('.form-check');
|
||||
|
||||
// Prüfen, ob es belegte Drucker gibt
|
||||
const hasBusyPrinters = printers.some(printer => printer.status !== 0);
|
||||
queueCheckboxContainer.style.display = hasBusyPrinters ? 'block' : 'none';
|
||||
|
||||
// Event-Listener für die Druckerauswahl hinzufügen
|
||||
selectElement.addEventListener('change', function() {
|
||||
const selectedPrinterId = this.value;
|
||||
const selectedPrinter = printers.find(p => p.id === selectedPrinterId);
|
||||
|
||||
if (selectedPrinter && selectedPrinter.status !== 0) {
|
||||
// Wenn ein belegter Drucker ausgewählt wird, Checkbox für Warteschlange anzeigen
|
||||
queueCheckboxContainer.style.display = 'block';
|
||||
allowQueuedJobsCheckbox.checked = true;
|
||||
} else if (selectedPrinter && selectedPrinter.status === 0) {
|
||||
// Wenn ein verfügbarer Drucker ausgewählt wird, Checkbox für Warteschlange verstecken
|
||||
allowQueuedJobsCheckbox.checked = false;
queueCheckboxContainer.style.display = 'none';
|
||||
}
|
||||
});
|
||||
} catch (e) {
|
||||
console.error('Fehler beim Laden der Drucker:', e);
|
||||
}
|
||||
}
|
||||
|
||||
function updateJobsTable(jobs) {
|
||||
const tableBody = document.getElementById('jobsTableBody');
|
||||
tableBody.innerHTML = '';
|
||||
|
||||
jobs.forEach(job => {
|
||||
const row = document.createElement('tr');
|
||||
|
||||
const startDate = new Date(job.startAt);
|
||||
const formattedStart = startDate.toLocaleString();
|
||||
|
||||
const isActive = !job.aborted && job.remainingMinutes > 0 && !job.waitingApproval;
|
||||
const isWaiting = !job.aborted && job.waitingApproval;
|
||||
|
||||
let statusText = '';
|
||||
let statusClass = '';
|
||||
|
||||
if (job.aborted) {
|
||||
statusText = 'Abgebrochen';
|
||||
statusClass = 'text-danger';
|
||||
} else if (job.waitingApproval) {
|
||||
statusText = 'Wartet auf Freischaltung';
|
||||
statusClass = 'text-info';
|
||||
} else if (job.remainingMinutes <= 0) {
|
||||
statusText = 'Abgeschlossen';
|
||||
statusClass = 'text-success';
|
||||
} else {
|
||||
statusText = 'Aktiv';
|
||||
statusClass = 'text-warning';
|
||||
}
|
||||
|
||||
// Zeige die verbleibende Zeit an
|
||||
const remainingTime = job.waitingApproval ? '-' : job.remainingMinutes;
|
||||
|
||||
row.innerHTML = `
|
||||
<td>${job.id}</td>
|
||||
<td>${job.printerId}</td>
|
||||
<td>${job.userId}</td>
|
||||
<td>${formattedStart}</td>
|
||||
<td>${job.durationInMinutes}</td>
|
||||
<td>${remainingTime}</td>
|
||||
<td><span class="${statusClass}">${statusText}</span></td>
|
||||
<td>${job.comments || '-'}</td>
|
||||
<td>
|
||||
${isActive ? `
|
||||
<button type="button" class="btn btn-sm btn-danger mb-1"
|
||||
data-bs-toggle="modal"
|
||||
data-bs-target="#abortJobModal"
|
||||
data-job-id="${job.id}">
|
||||
Abbrechen
|
||||
</button>
|
||||
<button type="button" class="btn btn-sm btn-success mb-1"
|
||||
data-bs-toggle="modal"
|
||||
data-bs-target="#finishJobModal"
|
||||
data-job-id="${job.id}">
|
||||
Beenden
|
||||
</button>
|
||||
<button type="button" class="btn btn-sm btn-primary mb-1"
|
||||
data-bs-toggle="modal"
|
||||
data-bs-target="#extendJobModal"
|
||||
data-job-id="${job.id}">
|
||||
Verlängern
|
||||
</button>
|
||||
` : ''}
|
||||
|
||||
${isWaiting ? `
|
||||
<button type="button" class="btn btn-sm btn-success mb-1"
|
||||
data-bs-toggle="modal"
|
||||
data-bs-target="#approveJobModal"
|
||||
data-job-id="${job.id}">
|
||||
Freischalten
|
||||
</button>
|
||||
<button type="button" class="btn btn-sm btn-danger mb-1"
|
||||
data-bs-toggle="modal"
|
||||
data-bs-target="#abortJobModal"
|
||||
data-job-id="${job.id}">
|
||||
Abbrechen
|
||||
</button>
|
||||
` : ''}
|
||||
|
||||
<button type="button" class="btn btn-sm btn-secondary mb-1"
|
||||
data-bs-toggle="modal"
|
||||
data-bs-target="#editCommentsModal"
|
||||
data-job-id="${job.id}"
|
||||
data-job-comments="${job.comments || ''}">
|
||||
Kommentare
|
||||
</button>
|
||||
</td>
|
||||
`;
|
||||
|
||||
tableBody.appendChild(row);
|
||||
});
|
||||
}
|
||||
</script>
|
||||
{% endblock %}
|
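All of these pages rely on the same `api-form` convention: each form carries `data-url`, `data-method`, `data-response` and optionally `data-reload`, and a shared submit handler, presumably defined in `base.html` (which is not part of this changeset), performs the request and writes the pretty-printed JSON into the referenced `<pre>` element. That is also why every template watches its response `<pre>` with a MutationObserver instead of hooking the request itself. A minimal sketch of what such a handler could look like, under that assumption:

```javascript
// Hypothetical shared api-form handler, assumed to live in base.html (not shown in this diff).
document.querySelectorAll('form.api-form').forEach(form => {
    form.addEventListener('submit', async event => {
        event.preventDefault();

        const url = form.getAttribute('data-url');
        const method = (form.getAttribute('data-method') || 'GET').toUpperCase();
        const responseEl = document.getElementById(form.getAttribute('data-response'));

        // Collect the form fields as a flat JSON object.
        const payload = Object.fromEntries(new FormData(form).entries());

        try {
            const response = await fetch(url, {
                method: method,
                headers: method === 'GET' ? undefined : { 'Content-Type': 'application/json' },
                body: method === 'GET' ? undefined : JSON.stringify(payload)
            });
            const data = await response.json();

            // Writing into the <pre> is what triggers the MutationObservers in the page scripts.
            responseEl.textContent = JSON.stringify(data, null, 2);

            if (form.getAttribute('data-reload') === 'true') {
                setTimeout(() => window.location.reload(), 1000);
            }
        } catch (e) {
            responseEl.textContent = `Fehler: ${e}`;
        }
    });
});
```

Because the handler only rewrites the text content of the `<pre>`, the `{ childList: true, characterData: true, subtree: true }` observer options used throughout these templates are what make the table-refresh callbacks fire.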
37 backend/templates/login.html (Normal file)
@@ -0,0 +1,37 @@
|
||||
{% extends "base.html" %}
|
||||
|
||||
{% block title %}Anmelden - MYP API Tester{% endblock %}
|
||||
|
||||
{% block content %}
|
||||
<div class="row justify-content-center">
|
||||
<div class="col-md-6">
|
||||
<div class="card">
|
||||
<div class="card-header">
|
||||
<h4 class="mb-0">Anmelden</h4>
|
||||
</div>
|
||||
<div class="card-body">
|
||||
<form class="api-form" data-url="/auth/login" data-method="POST" data-response="loginResponse">
|
||||
<div class="mb-3">
|
||||
<label for="username" class="form-label">Benutzername</label>
|
||||
<input type="text" class="form-control" id="username" name="username" required>
|
||||
</div>
|
||||
<div class="mb-3">
|
||||
<label for="password" class="form-label">Passwort</label>
|
||||
<input type="password" class="form-control" id="password" name="password" required>
|
||||
</div>
|
||||
<button type="submit" class="btn btn-primary">Anmelden</button>
|
||||
</form>
|
||||
|
||||
<div class="mt-3">
|
||||
<p>Noch kein Konto? <a href="/register">Registrieren</a></p>
|
||||
</div>
|
||||
|
||||
<div class="mt-3">
|
||||
<h5>Antwort:</h5>
|
||||
<pre class="api-response" id="loginResponse"></pre>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
{% endblock %}
|
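login.html drives `/auth/login` through the same api-form mechanism. Calling the endpoint directly, for example from the browser console, would look roughly like this; the response shape is an assumption, since the Flask handler itself is not part of this changeset:

```javascript
// Assumed request/response shape for /auth/login; the credentials are placeholders.
const res = await fetch('/auth/login', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ username: 'admin', password: 'geheim' })
});
console.log(res.status, await res.json());  // expected: session/user info on success, an error message otherwise
```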
280 backend/templates/printers.html (Normal file)
@@ -0,0 +1,280 @@
|
||||
{% extends "base.html" %}
|
||||
|
||||
{% block title %}Drucker - MYP API Tester{% endblock %}
|
||||
|
||||
{% block content %}
|
||||
<div class="row">
|
||||
<div class="col-md-12 mb-4">
|
||||
<div class="card">
|
||||
<div class="card-header d-flex justify-content-between align-items-center">
|
||||
<h4 class="mb-0">Drucker verwalten</h4>
|
||||
<button class="btn btn-primary" type="button" data-bs-toggle="collapse" data-bs-target="#newPrinterForm">
|
||||
Neuen Drucker hinzufügen
|
||||
</button>
|
||||
</div>
|
||||
<div class="collapse" id="newPrinterForm">
|
||||
<div class="card-body border-bottom">
|
||||
<form class="api-form" data-url="/api/printers" data-method="POST" data-response="createPrinterResponse" data-reload="true">
|
||||
<div class="mb-3">
|
||||
<label for="printerName" class="form-label">Name</label>
|
||||
<input type="text" class="form-control" id="printerName" name="name" required>
|
||||
</div>
|
||||
<div class="mb-3">
|
||||
<label for="printerDescription" class="form-label">Beschreibung</label>
|
||||
<textarea class="form-control" id="printerDescription" name="description" rows="3" required></textarea>
|
||||
</div>
|
||||
<div class="mb-3">
|
||||
<label for="printerStatus" class="form-label">Status</label>
|
||||
<select class="form-control" id="printerStatus" name="status">
|
||||
<option value="0">Verfügbar (0)</option>
|
||||
<option value="1">Besetzt (1)</option>
|
||||
<option value="2">Wartung (2)</option>
|
||||
</select>
|
||||
</div>
|
||||
<div class="mb-3">
|
||||
<label for="printerIpAddress" class="form-label">IP-Adresse (Tapo Steckdose)</label>
|
||||
<input type="text" class="form-control" id="printerIpAddress" name="ipAddress" placeholder="z.B. 192.168.1.100">
|
||||
</div>
|
||||
<button type="submit" class="btn btn-success">Drucker erstellen</button>
|
||||
</form>
|
||||
<div class="mt-3">
|
||||
<h6>Antwort:</h6>
|
||||
<pre class="api-response" id="createPrinterResponse"></pre>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
<div class="card-body">
|
||||
<form class="api-form mb-3" data-url="/api/printers" data-method="GET" data-response="printersResponse">
|
||||
<button type="submit" class="btn btn-primary">Drucker aktualisieren</button>
|
||||
</form>
|
||||
|
||||
<div class="table-responsive">
|
||||
<table class="table table-striped table-hover">
|
||||
<thead>
|
||||
<tr>
|
||||
<th>ID</th>
|
||||
<th>Name</th>
|
||||
<th>Beschreibung</th>
|
||||
<th>Status</th>
|
||||
<th>IP-Adresse</th>
|
||||
<th>Aktueller Job</th>
|
||||
<th>Wartende Jobs</th>
|
||||
<th>Aktionen</th>
|
||||
</tr>
|
||||
</thead>
|
||||
<tbody id="printersTableBody">
|
||||
<!-- Wird dynamisch gefüllt -->
|
||||
</tbody>
|
||||
</table>
|
||||
</div>
|
||||
|
||||
<div>
|
||||
<h6>API-Antwort:</h6>
|
||||
<pre class="api-response" id="printersResponse"></pre>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Drucker bearbeiten Modal -->
|
||||
<div class="modal fade" id="editPrinterModal" tabindex="-1">
|
||||
<div class="modal-dialog">
|
||||
<div class="modal-content">
|
||||
<div class="modal-header">
|
||||
<h5 class="modal-title">Drucker bearbeiten</h5>
|
||||
<button type="button" class="btn-close" data-bs-dismiss="modal" aria-label="Close"></button>
|
||||
</div>
|
||||
<div class="modal-body">
|
||||
<form id="editPrinterForm" class="api-form" data-method="PUT" data-response="editPrinterResponse" data-reload="true">
|
||||
<input type="hidden" id="editPrinterId" name="printerId">
|
||||
<div class="mb-3">
|
||||
<label for="editPrinterName" class="form-label">Name</label>
|
||||
<input type="text" class="form-control" id="editPrinterName" name="name" required>
|
||||
</div>
|
||||
<div class="mb-3">
|
||||
<label for="editPrinterDescription" class="form-label">Beschreibung</label>
|
||||
<textarea class="form-control" id="editPrinterDescription" name="description" rows="3" required></textarea>
|
||||
</div>
|
||||
<div class="mb-3">
|
||||
<label for="editPrinterStatus" class="form-label">Status</label>
|
||||
<select class="form-control" id="editPrinterStatus" name="status">
|
||||
<option value="0">Verfügbar (0)</option>
|
||||
<option value="1">Besetzt (1)</option>
|
||||
<option value="2">Wartung (2)</option>
|
||||
</select>
|
||||
</div>
|
||||
<div class="mb-3">
|
||||
<label for="editPrinterIpAddress" class="form-label">IP-Adresse (Tapo Steckdose)</label>
|
||||
<input type="text" class="form-control" id="editPrinterIpAddress" name="ipAddress" placeholder="z.B. 192.168.1.100">
|
||||
</div>
|
||||
</form>
|
||||
<div class="mt-3">
|
||||
<h6>Antwort:</h6>
|
||||
<pre class="api-response" id="editPrinterResponse"></pre>
|
||||
</div>
|
||||
</div>
|
||||
<div class="modal-footer">
|
||||
<button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Abbrechen</button>
|
||||
<button type="submit" form="editPrinterForm" class="btn btn-primary">Änderungen speichern</button>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Drucker löschen Modal -->
|
||||
<div class="modal fade" id="deletePrinterModal" tabindex="-1">
|
||||
<div class="modal-dialog">
|
||||
<div class="modal-content">
|
||||
<div class="modal-header">
|
||||
<h5 class="modal-title">Drucker löschen</h5>
|
||||
<button type="button" class="btn-close" data-bs-dismiss="modal" aria-label="Close"></button>
|
||||
</div>
|
||||
<div class="modal-body">
|
||||
<p>Möchten Sie den Drucker <span id="deletePrinterName"></span> wirklich löschen?</p>
|
||||
<form id="deletePrinterForm" class="api-form" data-method="DELETE" data-response="deletePrinterResponse" data-reload="true">
|
||||
<input type="hidden" id="deletePrinterId" name="printerId">
|
||||
</form>
|
||||
<div class="mt-3">
|
||||
<h6>Antwort:</h6>
|
||||
<pre class="api-response" id="deletePrinterResponse"></pre>
|
||||
</div>
|
||||
</div>
|
||||
<div class="modal-footer">
|
||||
<button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Abbrechen</button>
|
||||
<button type="submit" form="deletePrinterForm" class="btn btn-danger">Löschen</button>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
{% endblock %}
|
||||
|
||||
{% block scripts %}
|
||||
<script>
|
||||
document.addEventListener('DOMContentLoaded', function() {
|
||||
// Drucker laden
|
||||
document.querySelector('form[data-url="/api/printers"]').dispatchEvent(new Event('submit'));
|
||||
|
||||
// Tabelle aktualisieren, wenn Drucker geladen werden
|
||||
const printersResponse = document.getElementById('printersResponse');
|
||||
const observer = new MutationObserver(function(mutations) {
|
||||
try {
|
||||
const printers = JSON.parse(printersResponse.textContent);
|
||||
updatePrintersTable(printers);
|
||||
} catch (e) {
|
||||
console.error('Fehler beim Parsen der Drucker-Daten:', e);
|
||||
}
|
||||
});
|
||||
|
||||
observer.observe(printersResponse, { childList: true, characterData: true, subtree: true });
|
||||
|
||||
// Edit-Modal vorbereiten
|
||||
document.getElementById('editPrinterModal').addEventListener('show.bs.modal', function(event) {
|
||||
const button = event.relatedTarget;
|
||||
const printerId = button.getAttribute('data-printer-id');
|
||||
const printerName = button.getAttribute('data-printer-name');
|
||||
const printerDescription = button.getAttribute('data-printer-description');
|
||||
const printerStatus = button.getAttribute('data-printer-status');
|
||||
const printerIpAddress = button.getAttribute('data-printer-ip');
|
||||
|
||||
document.getElementById('editPrinterId').value = printerId;
|
||||
document.getElementById('editPrinterForm').setAttribute('data-url', `/api/printers/${printerId}`);
|
||||
document.getElementById('editPrinterName').value = printerName;
|
||||
document.getElementById('editPrinterDescription').value = printerDescription;
|
||||
document.getElementById('editPrinterStatus').value = printerStatus;
|
||||
document.getElementById('editPrinterIpAddress').value = printerIpAddress || '';
|
||||
});
|
||||
|
||||
// Delete-Modal vorbereiten
|
||||
document.getElementById('deletePrinterModal').addEventListener('show.bs.modal', function(event) {
|
||||
const button = event.relatedTarget;
|
||||
const printerId = button.getAttribute('data-printer-id');
|
||||
const printerName = button.getAttribute('data-printer-name');
|
||||
|
||||
document.getElementById('deletePrinterId').value = printerId;
|
||||
document.getElementById('deletePrinterForm').setAttribute('data-url', `/api/printers/${printerId}`);
|
||||
document.getElementById('deletePrinterName').textContent = printerName;
|
||||
});
|
||||
});
|
||||
|
||||
function updatePrintersTable(printers) {
|
||||
const tableBody = document.getElementById('printersTableBody');
|
||||
tableBody.innerHTML = '';
|
||||
|
||||
printers.forEach(printer => {
|
||||
const row = document.createElement('tr');
|
||||
|
||||
const statusText = {
|
||||
0: 'Verfügbar',
|
||||
1: 'Besetzt',
|
||||
2: 'Wartung'
|
||||
}[printer.status] || 'Unbekannt';
|
||||
|
||||
const statusClass = {
|
||||
0: 'text-success',
|
||||
1: 'text-warning',
|
||||
2: 'text-danger'
|
||||
}[printer.status] || '';
|
||||
|
||||
// Informationen zum aktuellen Job
|
||||
let currentJobInfo = '-';
|
||||
if (printer.latestJob && printer.status === 1) {
|
||||
// Verbleibende Zeit berechnen
|
||||
const remainingTime = printer.latestJob.remainingMinutes || 0;
|
||||
currentJobInfo = `
|
||||
<div class="small">
|
||||
<strong>ID:</strong> ${printer.latestJob.id.substring(0, 8)}...<br>
|
||||
<strong>Dauer:</strong> ${printer.latestJob.durationInMinutes} Min<br>
|
||||
<strong>Verbleibend:</strong> ${remainingTime} Min
|
||||
</div>
|
||||
`;
|
||||
}
|
||||
|
||||
// Wartende Jobs anzeigen
|
||||
let waitingJobsInfo = '-';
|
||||
if (printer.waitingJobs && printer.waitingJobs.length > 0) {
|
||||
const waitingJobsCount = printer.waitingJobs.length;
|
||||
waitingJobsInfo = `
|
||||
<div class="small">
|
||||
<strong>${waitingJobsCount} Job${waitingJobsCount !== 1 ? 's' : ''} in Warteschlange</strong><br>
|
||||
${printer.waitingJobs.map((job, index) =>
|
||||
`<span>${index + 1}. Job ${job.id.substring(0, 8)}... (${job.durationInMinutes} Min)</span>`
|
||||
).join('<br>')}
|
||||
</div>
|
||||
`;
|
||||
}
|
||||
|
||||
row.innerHTML = `
|
||||
<td>${printer.id}</td>
|
||||
<td>${printer.name}</td>
|
||||
<td>${printer.description}</td>
|
||||
<td><span class="${statusClass}">${statusText} (${printer.status})</span></td>
|
||||
<td>${printer.ipAddress || '-'}</td>
|
||||
<td>${currentJobInfo}</td>
|
||||
<td>${waitingJobsInfo}</td>
|
||||
<td>
|
||||
<button type="button" class="btn btn-sm btn-primary"
|
||||
data-bs-toggle="modal"
|
||||
data-bs-target="#editPrinterModal"
|
||||
data-printer-id="${printer.id}"
|
||||
data-printer-name="${printer.name}"
|
||||
data-printer-description="${printer.description}"
|
||||
data-printer-status="${printer.status}"
|
||||
data-printer-ip="${printer.ipAddress || ''}">
|
||||
Bearbeiten
|
||||
</button>
|
||||
<button type="button" class="btn btn-sm btn-danger"
|
||||
data-bs-toggle="modal"
|
||||
data-bs-target="#deletePrinterModal"
|
||||
data-printer-id="${printer.id}"
|
||||
data-printer-name="${printer.name}">
|
||||
Löschen
|
||||
</button>
|
||||
</td>
|
||||
`;
|
||||
|
||||
tableBody.appendChild(row);
|
||||
});
|
||||
}
|
||||
</script>
|
||||
{% endblock %}
|
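jobs.html, printers.html, stats.html and users.html all repeat the same boilerplate: observe the `<pre>` that the api-form handler fills, parse its JSON, and re-render a table. A small shared helper would remove that duplication; the function below is only a refactoring sketch and does not exist anywhere in this changeset:

```javascript
// Hypothetical helper; not part of this changeset.
function onApiResponse(preId, callback) {
    const el = document.getElementById(preId);
    const observer = new MutationObserver(() => {
        try {
            callback(JSON.parse(el.textContent));
        } catch (e) {
            console.error(`Fehler beim Parsen der Antwort aus #${preId}:`, e);
        }
    });
    observer.observe(el, { childList: true, characterData: true, subtree: true });
    return observer;
}

// Usage, e.g. in printers.html:
// onApiResponse('printersResponse', updatePrintersTable);
```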
45 backend/templates/register.html (Normal file)
@@ -0,0 +1,45 @@
|
||||
{% extends "base.html" %}
|
||||
|
||||
{% block title %}Registrieren - MYP API Tester{% endblock %}
|
||||
|
||||
{% block content %}
|
||||
<div class="row justify-content-center">
|
||||
<div class="col-md-6">
|
||||
<div class="card">
|
||||
<div class="card-header">
|
||||
<h4 class="mb-0">Registrieren</h4>
|
||||
</div>
|
||||
<div class="card-body">
|
||||
<form class="api-form" data-url="/auth/register" data-method="POST" data-response="registerResponse">
|
||||
<div class="mb-3">
|
||||
<label for="username" class="form-label">Benutzername</label>
|
||||
<input type="text" class="form-control" id="username" name="username" required>
|
||||
</div>
|
||||
<div class="mb-3">
|
||||
<label for="password" class="form-label">Passwort</label>
|
||||
<input type="password" class="form-control" id="password" name="password" required>
|
||||
</div>
|
||||
<div class="mb-3">
|
||||
<label for="displayName" class="form-label">Anzeigename</label>
|
||||
<input type="text" class="form-control" id="displayName" name="displayName">
|
||||
</div>
|
||||
<div class="mb-3">
|
||||
<label for="email" class="form-label">E-Mail</label>
|
||||
<input type="email" class="form-control" id="email" name="email">
|
||||
</div>
|
||||
<button type="submit" class="btn btn-primary">Registrieren</button>
|
||||
</form>
|
||||
|
||||
<div class="mt-3">
|
||||
<p>Bereits registriert? <a href="/login">Anmelden</a></p>
|
||||
</div>
|
||||
|
||||
<div class="mt-3">
|
||||
<h5>Antwort:</h5>
|
||||
<pre class="api-response" id="registerResponse"></pre>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
{% endblock %}
|
395 backend/templates/stats.html (Normal file)
@@ -0,0 +1,395 @@
|
||||
{% extends "base.html" %}
|
||||
|
||||
{% block title %}Statistiken - MYP API Tester{% endblock %}
|
||||
|
||||
{% block content %}
|
||||
<div class="row">
|
||||
<div class="col-md-12 mb-4">
|
||||
<div class="card">
|
||||
<div class="card-header">
|
||||
<h4 class="mb-0">Systemstatistiken</h4>
|
||||
</div>
|
||||
<div class="card-body">
|
||||
<form class="api-form mb-3" data-url="/api/stats" data-method="GET" data-response="statsResponse">
|
||||
<button type="submit" class="btn btn-primary">Statistiken aktualisieren</button>
|
||||
</form>
|
||||
|
||||
<div class="row" id="statsContainer">
|
||||
<!-- Wird dynamisch gefüllt -->
|
||||
</div>
|
||||
|
||||
<!-- Problem-Drucker-Bereich -->
|
||||
<div class="row mt-4">
|
||||
<div class="col-md-12">
|
||||
<div class="card">
|
||||
<div class="card-header bg-warning text-dark">
|
||||
<h5 class="mb-0">Drucker mit Verbindungsproblemen</h5>
|
||||
</div>
|
||||
<div class="card-body" id="problemPrintersContainer">
|
||||
<div class="alert alert-info">Keine Verbindungsprobleme festgestellt.</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Uptime-Grafik -->
|
||||
<div class="row mt-4">
|
||||
<div class="col-md-12">
|
||||
<div class="card">
|
||||
<div class="card-header bg-dark text-white">
|
||||
<h5 class="mb-0">Steckdosen-Verfügbarkeit</h5>
|
||||
</div>
|
||||
<div class="card-body">
|
||||
<form class="api-form mb-3" data-url="/api/uptime" data-method="GET" data-response="uptimeResponse">
|
||||
<button type="submit" class="btn btn-primary">Uptime-Daten laden</button>
|
||||
</form>
|
||||
<canvas id="uptimeChart" width="100%" height="300"></canvas>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- API-Antworten -->
|
||||
<div class="row mt-4">
|
||||
<div class="col-md-6">
|
||||
<h6>Stats API-Antwort:</h6>
|
||||
<pre class="api-response" id="statsResponse"></pre>
|
||||
</div>
|
||||
<div class="col-md-6">
|
||||
<h6>Uptime API-Antwort:</h6>
|
||||
<pre class="api-response" id="uptimeResponse"></pre>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
{% endblock %}
|
||||
|
||||
{% block scripts %}
|
||||
<!-- Chart.js für Diagramme -->
|
||||
<script src="https://cdn.jsdelivr.net/npm/chart.js@4.4.0/dist/chart.umd.min.js"></script>
|
||||
<script>
|
||||
let uptimeChart;
|
||||
|
||||
document.addEventListener('DOMContentLoaded', function() {
|
||||
// Statistiken laden
|
||||
document.querySelector('form[data-url="/api/stats"]').dispatchEvent(new Event('submit'));
|
||||
document.querySelector('form[data-url="/api/uptime"]').dispatchEvent(new Event('submit'));
|
||||
|
||||
// Statistiken aktualisieren, wenn API-Antwort geladen wird
|
||||
const statsResponse = document.getElementById('statsResponse');
|
||||
const statsObserver = new MutationObserver(function(mutations) {
|
||||
try {
|
||||
const stats = JSON.parse(statsResponse.textContent);
|
||||
updateStatsDisplay(stats);
|
||||
updateProblemPrinters(stats);
|
||||
} catch (e) {
|
||||
console.error('Fehler beim Parsen der Statistik-Daten:', e);
|
||||
}
|
||||
});
|
||||
|
||||
statsObserver.observe(statsResponse, { childList: true, characterData: true, subtree: true });
|
||||
|
||||
// Uptime-Daten aktualisieren, wenn API-Antwort geladen wird
|
||||
const uptimeResponse = document.getElementById('uptimeResponse');
|
||||
const uptimeObserver = new MutationObserver(function(mutations) {
|
||||
try {
|
||||
const uptime = JSON.parse(uptimeResponse.textContent);
|
||||
updateUptimeChart(uptime);
|
||||
} catch (e) {
|
||||
console.error('Fehler beim Parsen der Uptime-Daten:', e);
|
||||
}
|
||||
});
|
||||
|
||||
uptimeObserver.observe(uptimeResponse, { childList: true, characterData: true, subtree: true });
|
||||
|
||||
// Periodische Aktualisierung
|
||||
setInterval(function() {
|
||||
document.querySelector('form[data-url="/api/stats"]').dispatchEvent(new Event('submit'));
|
||||
document.querySelector('form[data-url="/api/uptime"]').dispatchEvent(new Event('submit'));
|
||||
}, 60000); // Alle 60 Sekunden aktualisieren
|
||||
});
|
||||
|
||||
function updateStatsDisplay(stats) {
|
||||
const container = document.getElementById('statsContainer');
|
||||
container.innerHTML = '';
|
||||
|
||||
// Drucker-Statistiken
|
||||
const printerStats = document.createElement('div');
|
||||
printerStats.className = 'col-md-4 mb-3';
|
||||
printerStats.innerHTML = `
|
||||
<div class="card h-100">
|
||||
<div class="card-header bg-primary text-white">
|
||||
<h5 class="mb-0">Drucker</h5>
|
||||
</div>
|
||||
<div class="card-body">
|
||||
<div class="d-flex justify-content-between mb-2">
|
||||
<span>Gesamt:</span>
|
||||
<span>${stats.printers.total}</span>
|
||||
</div>
|
||||
<div class="d-flex justify-content-between mb-2">
|
||||
<span>Verfügbar:</span>
|
||||
<span>${stats.printers.available}</span>
|
||||
</div>
|
||||
<div class="d-flex justify-content-between mb-2">
|
||||
<span>Auslastung:</span>
|
||||
<span>${Math.round(stats.printers.utilization_rate * 100)}%</span>
|
||||
</div>
|
||||
<div class="progress mt-3 mb-3">
|
||||
<div class="progress-bar" role="progressbar"
|
||||
style="width: ${Math.round(stats.printers.utilization_rate * 100)}%">
|
||||
${Math.round(stats.printers.utilization_rate * 100)}%
|
||||
</div>
|
||||
</div>
|
||||
<hr />
|
||||
<div class="d-flex justify-content-between mb-2">
|
||||
<span>Online:</span>
|
||||
<span>${stats.printers.online}</span>
|
||||
</div>
|
||||
<div class="d-flex justify-content-between mb-2">
|
||||
<span>Offline:</span>
|
||||
<span>${stats.printers.offline}</span>
|
||||
</div>
|
||||
<div class="d-flex justify-content-between mb-2">
|
||||
<span>Verbindungsrate:</span>
|
||||
<span>${Math.round(stats.printers.connectivity_rate * 100)}%</span>
|
||||
</div>
|
||||
<div class="progress mt-3">
|
||||
<div class="progress-bar bg-success" role="progressbar"
|
||||
style="width: ${Math.round(stats.printers.connectivity_rate * 100)}%">
|
||||
${Math.round(stats.printers.connectivity_rate * 100)}%
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
`;
|
||||
|
||||
// Job-Statistiken
|
||||
const jobStats = document.createElement('div');
|
||||
jobStats.className = 'col-md-4 mb-3';
|
||||
jobStats.innerHTML = `
|
||||
<div class="card h-100">
|
||||
<div class="card-header bg-success text-white">
|
||||
<h5 class="mb-0">Druckaufträge</h5>
|
||||
</div>
|
||||
<div class="card-body">
|
||||
<div class="d-flex justify-content-between mb-2">
|
||||
<span>Gesamt:</span>
|
||||
<span>${stats.jobs.total}</span>
|
||||
</div>
|
||||
<div class="d-flex justify-content-between mb-2">
|
||||
<span>Aktiv:</span>
|
||||
<span>${stats.jobs.active}</span>
|
||||
</div>
|
||||
<div class="d-flex justify-content-between mb-2">
|
||||
<span>Abgeschlossen:</span>
|
||||
<span>${stats.jobs.completed}</span>
|
||||
</div>
|
||||
<div class="d-flex justify-content-between mb-2">
|
||||
<span>Durchschnittliche Dauer:</span>
|
||||
<span>${stats.jobs.avg_duration} Minuten</span>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
`;
|
||||
|
||||
// Benutzer- und Uptime-Statistiken
|
||||
const userStats = document.createElement('div');
|
||||
userStats.className = 'col-md-4 mb-3';
|
||||
userStats.innerHTML = `
|
||||
<div class="card h-100">
|
||||
<div class="card-header bg-info text-white">
|
||||
<h5 class="mb-0">System</h5>
|
||||
</div>
|
||||
<div class="card-body">
|
||||
<div class="d-flex justify-content-between mb-2">
|
||||
<span>Benutzer:</span>
|
||||
<span>${stats.users.total}</span>
|
||||
</div>
|
||||
<hr />
|
||||
<div class="d-flex justify-content-between mb-2">
|
||||
<span>Verbindungsausfälle (7 Tage):</span>
|
||||
<span>${stats.uptime.outages_last_7_days}</span>
|
||||
</div>
|
||||
<div class="d-flex justify-content-between mb-2">
|
||||
<span>Aktuelle Probleme:</span>
|
||||
<span>${stats.uptime.problem_printers.length}</span>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
`;
|
||||
|
||||
container.appendChild(printerStats);
|
||||
container.appendChild(jobStats);
|
||||
container.appendChild(userStats);
|
||||
}
|
||||
|
||||
function updateProblemPrinters(stats) {
|
||||
const container = document.getElementById('problemPrintersContainer');
|
||||
const problemPrinters = stats.uptime.problem_printers;
|
||||
|
||||
if (problemPrinters.length === 0) {
|
||||
container.innerHTML = '<div class="alert alert-info">Keine Verbindungsprobleme festgestellt.</div>';
|
||||
return;
|
||||
}
|
||||
|
||||
let html = '<div class="table-responsive"><table class="table table-striped">';
|
||||
html += '<thead><tr><th>Drucker</th><th>Status</th><th>Offline seit</th><th>Dauer</th></tr></thead>';
|
||||
html += '<tbody>';
|
||||
|
||||
problemPrinters.forEach(printer => {
|
||||
let offlineSince = 'Unbekannt';
|
||||
let duration = 'Unbekannt';
|
||||
|
||||
if (printer.last_seen) {
|
||||
try {
|
||||
const lastSeen = new Date(printer.last_seen);
|
||||
const now = new Date();
|
||||
const diffSeconds = Math.floor((now - lastSeen) / 1000);
|
||||
const hours = Math.floor(diffSeconds / 3600);
|
||||
const minutes = Math.floor((diffSeconds % 3600) / 60);
|
||||
|
||||
offlineSince = lastSeen.toLocaleString();
|
||||
duration = `${hours}h ${minutes}m`;
|
||||
} catch (e) {
|
||||
console.error('Fehler beim Berechnen der Offline-Zeit:', e);
|
||||
}
|
||||
}
|
||||
|
||||
html += `<tr>
|
||||
<td>${printer.name}</td>
|
||||
<td><span class="badge bg-danger">Offline</span></td>
|
||||
<td>${offlineSince}</td>
|
||||
<td>${duration}</td>
|
||||
</tr>`;
|
||||
});
|
||||
|
||||
html += '</tbody></table></div>';
|
||||
container.innerHTML = html;
|
||||
}
|
||||
|
||||
function updateUptimeChart(uptimeData) {
|
||||
// Wenn keine Daten vorhanden sind, nichts tun
|
||||
if (!uptimeData || !uptimeData.sockets || uptimeData.sockets.length === 0) {
|
||||
return;
|
||||
}
|
||||
|
||||
// Daten für das Diagramm vorbereiten
|
||||
const socketNames = [];
|
||||
const datasets = [];
|
||||
const colors = {
|
||||
online: 'rgba(40, 167, 69, 0.7)',
|
||||
offline: 'rgba(220, 53, 69, 0.7)',
|
||||
unknown: 'rgba(108, 117, 125, 0.7)'
|
||||
};
|
||||
|
||||
// Zeitraum für das Diagramm (letzten 7 Tage)
|
||||
const endDate = new Date();
|
||||
const startDate = new Date();
|
||||
startDate.setDate(startDate.getDate() - 7);
|
||||
|
||||
// Für jede Steckdose
|
||||
uptimeData.sockets.forEach(socket => {
|
||||
socketNames.push(socket.name);
|
||||
|
||||
// Sortiere Ereignisse nach Zeitstempel
|
||||
if (socket.events) {
|
||||
socket.events.sort((a, b) => new Date(a.timestamp) - new Date(b.timestamp));
|
||||
|
||||
// Erstelle einen Datensatz für diese Steckdose
|
||||
const data = [];
|
||||
|
||||
// Füge Ereignisse zum Datensatz hinzu
|
||||
socket.events.forEach(event => {
|
||||
data.push({
|
||||
x: new Date(event.timestamp),
|
||||
y: event.status === 'online' ? 1 : 0,
|
||||
status: event.status,
|
||||
duration: event.duration_seconds ?
|
||||
formatDuration(event.duration_seconds) : null
|
||||
});
|
||||
});
|
||||
|
||||
// Füge aktuellen Status hinzu
|
||||
if (socket.current_status) {
|
||||
data.push({
|
||||
x: new Date(),
|
||||
y: socket.current_status.connection_status === 'online' ? 1 : 0,
|
||||
status: socket.current_status.connection_status,
|
||||
duration: null
|
||||
});
|
||||
}
|
||||
|
||||
datasets.push({
|
||||
label: socket.name,
|
||||
data: data,
|
||||
stepped: true,
|
||||
borderColor: colors[socket.current_status?.connection_status || 'unknown'],
|
||||
backgroundColor: colors[socket.current_status?.connection_status || 'unknown'],
|
||||
fill: false
|
||||
});
|
||||
}
|
||||
});
|
||||
|
||||
// Chart.js Konfiguration
|
||||
const ctx = document.getElementById('uptimeChart').getContext('2d');
|
||||
|
||||
// Wenn Chart bereits existiert, zerstöre ihn
|
||||
if (uptimeChart) {
|
||||
uptimeChart.destroy();
|
||||
}
|
||||
|
||||
// Erstelle neuen Chart
|
||||
uptimeChart = new Chart(ctx, {
|
||||
type: 'line',
|
||||
data: {
|
||||
datasets: datasets
|
||||
},
|
||||
options: {
|
||||
responsive: true,
|
||||
plugins: {
|
||||
tooltip: {
|
||||
callbacks: {
|
||||
label: function(context) {
|
||||
const point = context.raw;
|
||||
let label = context.dataset.label || '';
|
||||
label += ': ' + (point.status === 'online' ? 'Online' : 'Offline');
|
||||
if (point.duration) {
|
||||
label += ' (Dauer: ' + point.duration + ')';
|
||||
}
|
||||
return label;
|
||||
}
|
||||
}
|
||||
}
|
||||
},
|
||||
scales: {
|
||||
x: {
|
||||
type: 'time',
|
||||
time: {
|
||||
unit: 'day'
|
||||
},
|
||||
min: startDate,
|
||||
max: endDate
|
||||
},
|
||||
y: {
|
||||
min: -0.1,
|
||||
max: 1.1,
|
||||
ticks: {
|
||||
callback: function(value) {
|
||||
return value === 0 ? 'Offline' : value === 1 ? 'Online' : '';
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
});
|
||||
}
|
||||
|
||||
function formatDuration(seconds) {
|
||||
const hours = Math.floor(seconds / 3600);
|
||||
const minutes = Math.floor((seconds % 3600) / 60);
|
||||
return `${hours}h ${minutes}m`;
|
||||
}
|
||||
</script>
|
||||
{% endblock %}
|
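`updateUptimeChart()` implies a specific shape for the `/api/uptime` response, and the `type: 'time'` x-axis in Chart.js 4 additionally needs a date adapter (for example chartjs-adapter-date-fns), which this template does not load itself and is therefore presumably pulled in via `base.html`. The payload below is inferred from the template code above rather than taken from the backend, so the field values are illustrative only:

```javascript
// Inferred (hypothetical) shape of the /api/uptime response consumed by updateUptimeChart().
const uptimeExample = {
    sockets: [
        {
            name: "Steckdose Drucker 1",
            current_status: { connection_status: "online" },  // "online" | "offline"
            events: [
                { timestamp: "2025-03-01T08:00:00Z", status: "offline", duration_seconds: 5400 },
                { timestamp: "2025-03-01T09:30:00Z", status: "online", duration_seconds: null }
            ]
        }
    ]
};
// Note: without a date adapter, the 'time' scale throws at render time in Chart.js 4.
```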
238 backend/templates/users.html (Normal file)
@@ -0,0 +1,238 @@
|
||||
{% extends "base.html" %}
|
||||
|
||||
{% block title %}Benutzer - MYP API Tester{% endblock %}
|
||||
|
||||
{% block content %}
|
||||
<div class="row">
|
||||
<div class="col-md-12 mb-4">
|
||||
<div class="card">
|
||||
<div class="card-header d-flex justify-content-between align-items-center">
|
||||
<h4 class="mb-0">Benutzer verwalten</h4>
|
||||
<button class="btn btn-primary" type="button" data-bs-toggle="collapse" data-bs-target="#newUserForm">
|
||||
Neuen Benutzer hinzufügen
|
||||
</button>
|
||||
</div>
|
||||
<div class="collapse" id="newUserForm">
|
||||
<div class="card-body border-bottom">
|
||||
<form class="api-form" data-url="/auth/register" data-method="POST" data-response="createUserResponse" data-reload="true">
|
||||
<div class="mb-3">
|
||||
<label for="userName" class="form-label">Benutzername</label>
|
||||
<input type="text" class="form-control" id="userName" name="username" required>
|
||||
</div>
|
||||
<div class="mb-3">
|
||||
<label for="userPassword" class="form-label">Passwort</label>
|
||||
<input type="password" class="form-control" id="userPassword" name="password" required>
|
||||
</div>
|
||||
<div class="mb-3">
|
||||
<label for="userDisplayName" class="form-label">Anzeigename</label>
|
||||
<input type="text" class="form-control" id="userDisplayName" name="displayName">
|
||||
</div>
|
||||
<div class="mb-3">
|
||||
<label for="userEmail" class="form-label">E-Mail</label>
|
||||
<input type="email" class="form-control" id="userEmail" name="email">
|
||||
</div>
|
||||
<button type="submit" class="btn btn-success">Benutzer erstellen</button>
|
||||
</form>
|
||||
<div class="mt-3">
|
||||
<h6>Antwort:</h6>
|
||||
<pre class="api-response" id="createUserResponse"></pre>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
<div class="card-body">
|
||||
<form class="api-form mb-3" data-url="/api/users" data-method="GET" data-response="usersResponse">
|
||||
<button type="submit" class="btn btn-primary">Benutzer aktualisieren</button>
|
||||
</form>
|
||||
|
||||
<div class="table-responsive">
|
||||
<table class="table table-striped table-hover">
|
||||
<thead>
|
||||
<tr>
|
||||
<th>ID</th>
|
||||
<th>Benutzername</th>
|
||||
<th>Anzeigename</th>
|
||||
<th>E-Mail</th>
|
||||
<th>Rolle</th>
|
||||
<th>Aktionen</th>
|
||||
</tr>
|
||||
</thead>
|
||||
<tbody id="usersTableBody">
|
||||
<!-- Wird dynamisch gefüllt -->
|
||||
</tbody>
|
||||
</table>
|
||||
</div>
|
||||
|
||||
<div>
|
||||
<h6>API-Antwort:</h6>
|
||||
<pre class="api-response" id="usersResponse"></pre>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Benutzer bearbeiten Modal -->
|
||||
<div class="modal fade" id="editUserModal" tabindex="-1">
|
||||
<div class="modal-dialog">
|
||||
<div class="modal-content">
|
||||
<div class="modal-header">
|
||||
<h5 class="modal-title">Benutzer bearbeiten</h5>
|
||||
<button type="button" class="btn-close" data-bs-dismiss="modal" aria-label="Close"></button>
|
||||
</div>
|
||||
<div class="modal-body">
|
||||
<form id="editUserForm" class="api-form" data-method="PUT" data-response="editUserResponse" data-reload="true">
|
||||
<input type="hidden" id="editUserId" name="userId">
|
||||
<div class="mb-3">
|
||||
<label for="editUserName" class="form-label">Benutzername</label>
|
||||
<input type="text" class="form-control" id="editUserName" name="username" required>
|
||||
</div>
|
||||
<div class="mb-3">
|
||||
<label for="editUserDisplayName" class="form-label">Anzeigename</label>
|
||||
<input type="text" class="form-control" id="editUserDisplayName" name="displayName">
|
||||
</div>
|
||||
<div class="mb-3">
|
||||
<label for="editUserEmail" class="form-label">E-Mail</label>
|
||||
<input type="email" class="form-control" id="editUserEmail" name="email">
|
||||
</div>
|
||||
<div class="mb-3">
|
||||
<label for="editUserRole" class="form-label">Rolle</label>
|
||||
<select class="form-control" id="editUserRole" name="role">
|
||||
<option value="user">Benutzer</option>
|
||||
<option value="admin">Administrator</option>
|
||||
<option value="guest">Gast</option>
|
||||
</select>
|
||||
</div>
|
||||
</form>
|
||||
<div class="mt-3">
|
||||
<h6>Antwort:</h6>
|
||||
<pre class="api-response" id="editUserResponse"></pre>
|
||||
</div>
|
||||
</div>
|
||||
<div class="modal-footer">
|
||||
<button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Abbrechen</button>
|
||||
<button type="submit" form="editUserForm" class="btn btn-primary">Änderungen speichern</button>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Benutzer löschen Modal -->
|
||||
<div class="modal fade" id="deleteUserModal" tabindex="-1">
|
||||
<div class="modal-dialog">
|
||||
<div class="modal-content">
|
||||
<div class="modal-header">
|
||||
<h5 class="modal-title">Benutzer löschen</h5>
|
||||
<button type="button" class="btn-close" data-bs-dismiss="modal" aria-label="Close"></button>
|
||||
</div>
|
||||
<div class="modal-body">
|
||||
<p>Möchten Sie den Benutzer <span id="deleteUserName"></span> wirklich löschen?</p>
|
||||
<form id="deleteUserForm" class="api-form" data-method="DELETE" data-response="deleteUserResponse" data-reload="true">
|
||||
<input type="hidden" id="deleteUserId" name="userId">
|
||||
</form>
|
||||
<div class="mt-3">
|
||||
<h6>Antwort:</h6>
|
||||
<pre class="api-response" id="deleteUserResponse"></pre>
|
||||
</div>
|
||||
</div>
|
||||
<div class="modal-footer">
|
||||
<button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Abbrechen</button>
|
||||
<button type="submit" form="deleteUserForm" class="btn btn-danger">Löschen</button>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
{% endblock %}
|
||||
|
||||
{% block scripts %}
|
||||
<script>
|
||||
document.addEventListener('DOMContentLoaded', function() {
|
||||
// Benutzer laden
|
||||
document.querySelector('form[data-url="/api/users"]').dispatchEvent(new Event('submit'));
|
||||
|
||||
// Tabelle aktualisieren, wenn Benutzer geladen werden
|
||||
const usersResponse = document.getElementById('usersResponse');
|
||||
const observer = new MutationObserver(function(mutations) {
|
||||
try {
|
||||
const users = JSON.parse(usersResponse.textContent);
|
||||
updateUsersTable(users);
|
||||
} catch (e) {
|
||||
console.error('Fehler beim Parsen der Benutzer-Daten:', e);
|
||||
}
|
||||
});
|
||||
|
||||
observer.observe(usersResponse, { childList: true, characterData: true, subtree: true });
|
||||
|
||||
// Edit-Modal vorbereiten
|
||||
document.getElementById('editUserModal').addEventListener('show.bs.modal', function(event) {
|
||||
const button = event.relatedTarget;
|
||||
const userId = button.getAttribute('data-user-id');
|
||||
const userName = button.getAttribute('data-user-name');
|
||||
const userDisplayName = button.getAttribute('data-user-displayname');
|
||||
const userEmail = button.getAttribute('data-user-email');
|
||||
const userRole = button.getAttribute('data-user-role');
|
||||
|
||||
document.getElementById('editUserId').value = userId;
|
||||
document.getElementById('editUserForm').setAttribute('data-url', `/api/users/${userId}`);
|
||||
document.getElementById('editUserName').value = userName;
|
||||
document.getElementById('editUserDisplayName').value = userDisplayName || '';
|
||||
document.getElementById('editUserEmail').value = userEmail || '';
|
||||
document.getElementById('editUserRole').value = userRole;
|
||||
});
|
||||
|
||||
// Delete-Modal vorbereiten
|
||||
document.getElementById('deleteUserModal').addEventListener('show.bs.modal', function(event) {
|
||||
const button = event.relatedTarget;
|
||||
const userId = button.getAttribute('data-user-id');
|
||||
const userName = button.getAttribute('data-user-name');
|
||||
|
||||
document.getElementById('deleteUserId').value = userId;
|
||||
document.getElementById('deleteUserForm').setAttribute('data-url', `/api/users/${userId}`);
|
||||
document.getElementById('deleteUserName').textContent = userName;
|
||||
});
|
||||
});
|
||||
|
||||
function updateUsersTable(users) {
|
||||
const tableBody = document.getElementById('usersTableBody');
|
||||
tableBody.innerHTML = '';
|
||||
|
||||
users.forEach(user => {
|
||||
const row = document.createElement('tr');
|
||||
|
||||
const roleClass = {
|
||||
'admin': 'text-danger',
|
||||
'user': 'text-primary',
|
||||
'guest': 'text-secondary'
|
||||
}[user.role] || '';
|
||||
|
||||
row.innerHTML = `
|
||||
<td>${user.id}</td>
|
||||
<td>${user.username}</td>
|
||||
<td>${user.displayName || user.username}</td>
|
||||
<td>${user.email || '-'}</td>
|
||||
<td><span class="${roleClass}">${user.role}</span></td>
|
||||
<td>
|
||||
<button type="button" class="btn btn-sm btn-primary"
|
||||
data-bs-toggle="modal"
|
||||
data-bs-target="#editUserModal"
|
||||
data-user-id="${user.id}"
|
||||
data-user-name="${user.username}"
|
||||
data-user-displayname="${user.displayName || ''}"
|
||||
data-user-email="${user.email || ''}"
|
||||
data-user-role="${user.role}">
|
||||
Bearbeiten
|
||||
</button>
|
||||
<button type="button" class="btn btn-sm btn-danger"
|
||||
data-bs-toggle="modal"
|
||||
data-bs-target="#deleteUserModal"
|
||||
data-user-id="${user.id}"
|
||||
data-user-name="${user.username}">
|
||||
Löschen
|
||||
</button>
|
||||
</td>
|
||||
`;
|
||||
|
||||
tableBody.appendChild(row);
|
||||
});
|
||||
}
|
||||
</script>
|
||||
{% endblock %}
|
0 docs/.gitkeep (Normal file → Executable file)
0 docs/Aktueller Stand.md (Normal file → Executable file)
10 docs/Dokumentation_IHK.md (Normal file → Executable file)
@@ -8,6 +8,16 @@ Notizen:
|
||||
- Da Till digitale Vernetzung hat macht er Backend, weil die Schnittstelle der Vernetzung zum cyberphysischen System dort lag
|
||||
- für die Dokumentation: Daten (Datums) müssen stimmen!
|
||||
|
||||
python schnittstelle funktionierte nicht
|
||||
nach etlichem rumprobieren festgestellt: geht nicht so einfach
|
||||
wireshark mitschnitt gemacht → auffällig: immer die selben responses bei verschlüsselter verbindung
|
||||
ohne erfolg beim simulieren einzelner anfragen
|
||||
dann: geistesblitz: anfragensequenz muss es sein!
|
||||
hat funktioniert → es hat klick gemacht!! .
|
||||
verbindung verschlüsselt und mit temporärem cookie
|
||||
→ proprietäre Verschlüsselung
|
||||
wie wird die verbindung ausgehandelt?
|
||||
|
||||
------
|
||||
|
||||
11.09 : Teile bestellt im internen Technikshop
|
||||
|
0 docs/Infrastruktur.png (Normal file → Executable file; binary, Size: 3.3 MiB, unchanged)
0 docs/Infrastruktur.tldr (Normal file → Executable file)
0 docs/MYP.dbml (Normal file → Executable file)
0 docs/MYP.png (Normal file → Executable file; binary, Size: 64 KiB, unchanged)
0 docs/MYP.sql (Normal file → Executable file)
22 frontend-aenderungen.md (Executable file)
@@ -0,0 +1,22 @@
|
||||
# Notwendige Frontend-Änderungen
|
||||
|
||||
1. Frontend-Authentifizierung anpassen:
|
||||
- GitHub OAuth durch lokale Authentifizierung ersetzen
|
||||
- Login-Komponenten für Benutzername/Passwort erstellen
|
||||
- Registrierungs-Formular implementieren
|
||||
- API-Routen für Login- und Registrierungsprozess anpassen
|
||||
|
||||
2. Datenbankschema:
|
||||
- Users-Tabelle anpassen um Passwort-Hash zu unterstützen
|
||||
- GitHub-ID entfernen oder optional machen
|
||||
|
||||
3. Auth-System:
|
||||
- Lucia.js: Anpassung von OAuth auf Formular-basierte Authentifizierung
|
||||
- Session-Management beibehalten
|
||||
|
||||
4. API-Endpunktanpassungen:
|
||||
- Neue Login und Register-Endpunkte erstellen
|
||||
- Route für initialen Admin-Setup
|
||||
|
||||
Die Änderungen im Frontend sind umfangreicher, da das aktuelle System stark auf GitHub OAuth ausgerichtet ist und komplett umgestellt werden muss.
|
||||
|
408
install-backend.sh
Executable file
@@ -0,0 +1,408 @@
|
||||
#!/bin/bash
|
||||
|
||||
# MYP Backend Installations-Skript
|
||||
# Dieses Skript installiert das Backend mit Docker und Host-Netzwerkanbindung
|
||||
|
||||
# Farbcodes für Ausgabe
|
||||
RED='\033[0;31m'
|
||||
GREEN='\033[0;32m'
|
||||
YELLOW='\033[0;33m'
|
||||
BLUE='\033[0;34m'
|
||||
NC='\033[0m' # No Color
|
||||
|
||||
# Funktion zur Ausgabe mit Zeitstempel
|
||||
log() {
|
||||
echo -e "${BLUE}[$(date +'%Y-%m-%d %H:%M:%S')]${NC} $1"
|
||||
}
|
||||
|
||||
error_log() {
|
||||
echo -e "${RED}[$(date +'%Y-%m-%d %H:%M:%S')] FEHLER:${NC} $1" >&2
|
||||
}
|
||||
|
||||
# Funktion zum Bereinigen vorhandener Installationen
|
||||
cleanup_existing_installation() {
|
||||
log "${YELLOW}Bereinige vorhandene Installation...${NC}"
|
||||
|
||||
# Stoppe und entferne existierende Container
|
||||
if docker ps -a | grep -q "myp-backend"; then
|
||||
log "Stoppe und entferne existierenden Backend-Container..."
|
||||
docker stop myp-backend &>/dev/null || true
|
||||
docker rm myp-backend &>/dev/null || true
|
||||
fi
|
||||
|
||||
# Entferne Docker Images
|
||||
if docker images | grep -q "myp-backend"; then
|
||||
log "Entferne existierendes Backend-Image..."
|
||||
docker rmi myp-backend &>/dev/null || true
|
||||
fi
|
||||
|
||||
log "${GREEN}Bereinigung abgeschlossen.${NC}"
|
||||
}
|
||||
|
||||
# Pfade definieren
|
||||
SCRIPT_DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
|
||||
BACKEND_DIR="$SCRIPT_DIR/backend"
|
||||
|
||||
# Prüfen ob Verzeichnis existiert
|
||||
if [ ! -d "$BACKEND_DIR" ]; then
|
||||
error_log "Backend-Verzeichnis '$BACKEND_DIR' nicht gefunden."
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# Bereinige existierende Installation
|
||||
cleanup_existing_installation
|
||||
|
||||
# Funktion zur Installation von Docker und Docker Compose für Raspberry Pi
|
||||
install_docker() {
|
||||
log "${YELLOW}Docker ist nicht installiert. Installation wird gestartet...${NC}"
|
||||
|
||||
# Erkenne Raspberry Pi
|
||||
if [ -f /proc/device-tree/model ] && grep -q "Raspberry Pi" /proc/device-tree/model; then
|
||||
log "${GREEN}Raspberry Pi erkannt. Installiere Docker für ARM-Architektur...${NC}"
|
||||
IS_RASPBERRY_PI=true
|
||||
else
|
||||
IS_RASPBERRY_PI=false
|
||||
fi
|
||||
|
||||
# Aktualisiere Paketindex
|
||||
if ! sudo apt-get update; then
|
||||
error_log "Konnte Paketindex nicht aktualisieren. Bitte manuell installieren."
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# Installiere erforderliche Pakete
|
||||
if ! sudo apt-get install -y apt-transport-https ca-certificates curl gnupg software-properties-common; then
|
||||
error_log "Konnte erforderliche Pakete nicht installieren. Bitte manuell installieren."
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# Raspberry Pi-spezifische Installation
|
||||
if [ "$IS_RASPBERRY_PI" = true ]; then
|
||||
# Setze Systemarchitektur für Raspberry Pi (armhf oder arm64)
|
||||
ARCH=$(dpkg --print-architecture)
|
||||
log "Erkannte Systemarchitektur: ${ARCH}"
|
||||
|
||||
# Installiere Docker mit convenience script (für Raspberry Pi empfohlen)
|
||||
log "${YELLOW}Installiere Docker mit dem convenience script...${NC}"
|
||||
curl -fsSL https://get.docker.com -o get-docker.sh
|
||||
sudo sh get-docker.sh
|
||||
|
||||
if [ $? -ne 0 ]; then
|
||||
error_log "Docker-Installation fehlgeschlagen. Bitte manuell installieren."
|
||||
exit 1
|
||||
fi
|
||||
else
|
||||
# Standard-Installation für andere Systeme
|
||||
# Füge Docker's offiziellen GPG-Schlüssel hinzu
|
||||
curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo apt-key add -
|
||||
|
||||
# Füge Docker-Repository hinzu
|
||||
if ! sudo add-apt-repository "deb [arch=$(dpkg --print-architecture)] https://download.docker.com/linux/ubuntu $(lsb_release -cs) stable"; then
|
||||
error_log "Konnte Docker-Repository nicht hinzufügen. Prüfen Sie, ob Ihr System unterstützt wird."
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# Aktualisiere Paketindex erneut
|
||||
sudo apt-get update
|
||||
|
||||
# Installiere Docker
|
||||
if ! sudo apt-get install -y docker-ce docker-ce-cli containerd.io; then
|
||||
error_log "Konnte Docker nicht installieren. Bitte manuell installieren."
|
||||
exit 1
|
||||
fi
|
||||
fi
|
||||
|
||||
# Füge aktuellen Benutzer zur Docker-Gruppe hinzu
|
||||
sudo usermod -aG docker "$USER"
|
||||
|
||||
log "${GREEN}Docker wurde installiert.${NC}"
|
||||
log "${YELLOW}WICHTIG: Möglicherweise müssen Sie sich neu anmelden, damit die Gruppenänderung wirksam wird.${NC}"
|
||||
|
||||
# Prüfen, ob Docker Compose v2 Plugin verfügbar ist (bevorzugt, da moderner)
|
||||
log "${YELLOW}Prüfe Docker Compose Version...${NC}"
|
||||
|
||||
if docker compose version &> /dev/null; then
|
||||
log "${GREEN}Docker Compose v2 Plugin ist bereits installiert.${NC}"
|
||||
DOCKER_COMPOSE_V2=true
|
||||
else
|
||||
log "${YELLOW}Docker Compose v2 Plugin nicht gefunden. Versuche Docker Compose v1 zu installieren...${NC}"
|
||||
DOCKER_COMPOSE_V2=false
|
||||
|
||||
if [ "$IS_RASPBERRY_PI" = true ]; then
|
||||
# Für Raspberry Pi ist es besser, die richtige Architektur zu verwenden
|
||||
if [ "$ARCH" = "armhf" ]; then
|
||||
log "Installiere Docker Compose für armhf (32-bit)..."
|
||||
sudo curl -L "https://github.com/docker/compose/releases/download/v2.6.1/docker-compose-linux-armv7" -o /usr/local/bin/docker-compose
|
||||
elif [ "$ARCH" = "arm64" ]; then
|
||||
log "Installiere Docker Compose für arm64 (64-bit)..."
|
||||
sudo curl -L "https://github.com/docker/compose/releases/download/v2.6.1/docker-compose-linux-aarch64" -o /usr/local/bin/docker-compose
|
||||
else
|
||||
# Fallback auf v1.29.2 für unbekannte ARM-Architekturen
|
||||
log "Verwende automatische Architekturerkennung für Docker Compose v1.29.2..."
|
||||
sudo curl -L "https://github.com/docker/compose/releases/download/1.29.2/docker-compose-$(uname -s)-$(uname -m)" -o /usr/local/bin/docker-compose
|
||||
fi
|
||||
else
|
||||
# Für andere Systeme versuche zuerst v2, dann v1.29.2 als Fallback
|
||||
log "Installiere Docker Compose v2 für $(uname -s)/$(uname -m)..."
|
||||
if ! sudo curl -L "https://github.com/docker/compose/releases/download/v2.6.1/docker-compose-$(uname -s)-$(uname -m)" -o /usr/local/bin/docker-compose; then
|
||||
log "${YELLOW}Konnte Docker Compose v2 nicht herunterladen. Versuche v1.29.2...${NC}"
|
||||
sudo curl -L "https://github.com/docker/compose/releases/download/1.29.2/docker-compose-$(uname -s)-$(uname -m)" -o /usr/local/bin/docker-compose
|
||||
fi
|
||||
fi
|
||||
|
||||
if [ $? -ne 0 ]; then
|
||||
error_log "Konnte Docker Compose nicht herunterladen. Bitte manuell installieren."
|
||||
exit 1
|
||||
fi
|
||||
|
||||
sudo chmod +x /usr/local/bin/docker-compose
|
||||
|
||||
log "${GREEN}Docker Compose wurde installiert.${NC}"
|
||||
fi
|
||||
|
||||
# Starte Docker-Dienst
|
||||
if command -v systemctl &> /dev/null; then
|
||||
sudo systemctl enable docker
|
||||
sudo systemctl start docker
|
||||
elif command -v service &> /dev/null; then
|
||||
sudo update-rc.d docker defaults
|
||||
sudo service docker start
|
||||
fi
|
||||
}
|
||||
|
||||
# Prüfen ob Docker installiert ist
|
||||
if ! command -v docker &> /dev/null; then
|
||||
log "${YELLOW}Docker ist nicht installiert.${NC}"
|
||||
read -p "Möchten Sie Docker installieren? (j/n): " install_docker_choice
|
||||
if [[ "$install_docker_choice" == "j" ]]; then
|
||||
install_docker
|
||||
else
|
||||
error_log "Docker wird für die Installation benötigt. Bitte installieren Sie Docker manuell."
|
||||
log "Siehe: https://docs.docker.com/get-docker/"
|
||||
exit 1
|
||||
fi
|
||||
fi
|
||||
|
||||
# Prüfen ob Docker Daemon läuft
|
||||
if ! docker info &> /dev/null; then
|
||||
log "${YELLOW}Docker-Daemon läuft nicht. Versuche, den Dienst zu starten...${NC}"
|
||||
|
||||
# Versuche, Docker zu starten
|
||||
if command -v systemctl &> /dev/null; then
|
||||
sudo systemctl start docker
|
||||
elif command -v service &> /dev/null; then
|
||||
sudo service docker start
|
||||
else
|
||||
error_log "Konnte Docker-Daemon nicht starten. Bitte starten Sie den Docker-Dienst manuell."
|
||||
log "Starten mit: sudo systemctl start docker oder sudo service docker start"
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# Prüfe erneut, ob Docker läuft
|
||||
if ! docker info &> /dev/null; then
|
||||
error_log "Docker-Daemon konnte nicht gestartet werden. Bitte starten Sie den Docker-Dienst manuell."
|
||||
exit 1
|
||||
fi
|
||||
|
||||
log "${GREEN}Docker-Daemon wurde erfolgreich gestartet.${NC}"
|
||||
fi
|
||||
|
||||
# Prüfen ob Docker Compose installiert ist
|
||||
if docker compose version &> /dev/null; then
|
||||
log "${GREEN}Docker Compose v2 Plugin ist bereits installiert.${NC}"
|
||||
DOCKER_COMPOSE_V2=true
|
||||
elif command -v docker-compose &> /dev/null; then
|
||||
log "${GREEN}Docker Compose v1 ist bereits installiert.${NC}"
|
||||
DOCKER_COMPOSE_V2=false
|
||||
else
|
||||
log "${YELLOW}Docker Compose ist nicht installiert.${NC}"
|
||||
DOCKER_COMPOSE_V2=false
|
||||
read -p "Möchten Sie Docker Compose installieren? (j/n): " install_compose_choice
|
||||
if [[ "$install_compose_choice" == "j" ]]; then
|
||||
log "${YELLOW}Installiere Docker Compose...${NC}"
|
||||
|
||||
# Prüfe ob das Betriebssystem ARM-basiert ist (z.B. Raspberry Pi)
|
||||
if grep -q "arm" /proc/cpuinfo 2> /dev/null; then
|
||||
ARCH=$(dpkg --print-architecture 2> /dev/null || echo "unknown")
|
||||
IS_RASPBERRY_PI=true
|
||||
else
|
||||
IS_RASPBERRY_PI=false
|
||||
fi
|
||||
|
||||
# Versuche zuerst Docker Compose v2 zu installieren
|
||||
if [ "$IS_RASPBERRY_PI" = true ]; then
|
||||
if [ "$ARCH" = "armhf" ]; then
|
||||
log "Installiere Docker Compose für armhf (32-bit)..."
|
||||
sudo curl -L "https://github.com/docker/compose/releases/download/v2.6.1/docker-compose-linux-armv7" -o /usr/local/bin/docker-compose
|
||||
elif [ "$ARCH" = "arm64" ]; then
|
||||
log "Installiere Docker Compose für arm64 (64-bit)..."
|
||||
sudo curl -L "https://github.com/docker/compose/releases/download/v2.6.1/docker-compose-linux-aarch64" -o /usr/local/bin/docker-compose
|
||||
else
|
||||
log "Verwende automatische Architekturerkennung für Docker Compose v1.29.2..."
|
||||
sudo curl -L "https://github.com/docker/compose/releases/download/1.29.2/docker-compose-$(uname -s)-$(uname -m)" -o /usr/local/bin/docker-compose
|
||||
fi
|
||||
else
|
||||
log "Installiere Docker Compose v2 für $(uname -s)/$(uname -m)..."
|
||||
if ! sudo curl -L "https://github.com/docker/compose/releases/download/v2.6.1/docker-compose-$(uname -s)-$(uname -m)" -o /usr/local/bin/docker-compose; then
|
||||
log "${YELLOW}Konnte Docker Compose v2 nicht herunterladen. Versuche v1.29.2...${NC}"
|
||||
sudo curl -L "https://github.com/docker/compose/releases/download/1.29.2/docker-compose-$(uname -s)-$(uname -m)" -o /usr/local/bin/docker-compose
|
||||
fi
|
||||
fi
|
||||
|
||||
if [ $? -ne 0 ]; then
|
||||
error_log "Konnte Docker Compose nicht herunterladen. Bitte manuell installieren."
|
||||
exit 1
|
||||
fi
|
||||
|
||||
sudo chmod +x /usr/local/bin/docker-compose
|
||||
|
||||
log "${GREEN}Docker Compose wurde installiert.${NC}"
|
||||
else
|
||||
error_log "Docker Compose wird für die Installation benötigt. Bitte installieren Sie es manuell."
|
||||
log "Siehe: https://docs.docker.com/compose/install/"
|
||||
exit 1
|
||||
fi
|
||||
fi
|
||||
|
||||
# Prüfen ob wget installiert ist (wird für healthcheck verwendet)
|
||||
if ! command -v wget &> /dev/null; then
|
||||
error_log "wget ist nicht installiert, wird aber für den Container-Healthcheck benötigt."
|
||||
log "Installation mit: sudo apt-get install wget"
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# Wechsle ins Backend-Verzeichnis
|
||||
log "Wechsle ins Verzeichnis: $BACKEND_DIR"
|
||||
cd "$BACKEND_DIR" || {
|
||||
error_log "Konnte nicht ins Verzeichnis $BACKEND_DIR wechseln."
|
||||
exit 1
|
||||
}
|
||||
|
||||
# Prüfe ob Dockerfile existiert
|
||||
if [ ! -f "Dockerfile" ]; then
|
||||
error_log "Dockerfile nicht gefunden in $BACKEND_DIR."
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# Prüfe ob docker-compose.yml existiert
|
||||
if [ ! -f "docker-compose.yml" ]; then
|
||||
error_log "docker-compose.yml nicht gefunden in $BACKEND_DIR."
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# Erstelle .env-Datei
|
||||
log "${YELLOW}Erstelle .env Datei...${NC}"
|
||||
cat > .env << EOL
|
||||
SECRET_KEY=7445630171969DFAC92C53CEC92E67A9CB2E00B3CB2F
|
||||
DATABASE_PATH=instance/myp.db
|
||||
TAPO_USERNAME=till.tomczak@mercedes-benz.com
|
||||
TAPO_PASSWORD=744563017196A
|
||||
PRINTERS={"Printer 1": {"ip": "192.168.0.100"}, "Printer 2": {"ip": "192.168.0.101"}, "Printer 3": {"ip": "192.168.0.102"}, "Printer 4": {"ip": "192.168.0.103"}, "Printer 5": {"ip": "192.168.0.104"}, "Printer 6": {"ip": "192.168.0.106"}}
|
||||
EOL
|
||||
|
||||
if [ ! -f ".env" ]; then
|
||||
error_log "Konnte .env-Datei nicht erstellen. Prüfen Sie die Berechtigungen."
|
||||
exit 1
|
||||
fi
|
||||
log "${GREEN}.env Datei erfolgreich erstellt${NC}"
|
||||
|
||||
# Verzeichnisse erstellen
|
||||
log "Erstelle benötigte Verzeichnisse"
|
||||
if ! mkdir -p logs; then
|
||||
error_log "Konnte Verzeichnis 'logs' nicht erstellen. Prüfen Sie die Berechtigungen."
|
||||
exit 1
|
||||
fi
|
||||
|
||||
if ! mkdir -p instance; then
|
||||
error_log "Konnte Verzeichnis 'instance' nicht erstellen. Prüfen Sie die Berechtigungen."
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# Docker-Image bauen und starten
|
||||
log "${YELLOW}Baue und starte Backend-Container...${NC}"
|
||||
|
||||
# Verwende die richtige Docker Compose Version
|
||||
if [ "${DOCKER_COMPOSE_V2:-false}" = true ]; then
|
||||
# Docker Compose V2 Plugin (docker compose)
|
||||
if ! docker compose up -d --build; then
|
||||
error_log "Docker Compose (v2) fehlgeschlagen. Versuche mit v1 Format..."
|
||||
if ! docker-compose up -d --build; then
|
||||
error_log "Docker Compose fehlgeschlagen. Siehe Fehlermeldung oben."
|
||||
exit 1
|
||||
fi
|
||||
fi
|
||||
else
|
||||
# Docker Compose V1 (docker-compose)
|
||||
if ! docker-compose up -d --build; then
|
||||
error_log "Docker Compose fehlgeschlagen. Siehe Fehlermeldung oben."
|
||||
exit 1
|
||||
fi
|
||||
fi
|
||||
|
||||
# Prüfe, ob der Container läuft
|
||||
log "Warte 10 Sekunden, bis der Container gestartet ist..."
|
||||
sleep 10
|
||||
if docker ps | grep -q "myp-backend"; then
|
||||
log "${GREEN}Backend-Container läuft${NC}"
|
||||
else
|
||||
error_log "Backend-Container läuft nicht. Container-Status:"
|
||||
docker ps -a | grep myp-backend
|
||||
log "Container-Logs:"
|
||||
docker logs myp-backend
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# Test API-Endpunkt
|
||||
log "${YELLOW}Teste Backend-API...${NC}"
|
||||
log "${YELLOW}HINWEIS: Der API-Server ist bei der ersten Installation oft noch nicht erreichbar${NC}"
|
||||
log "${YELLOW}Dies ist ein bekanntes Verhalten wegen der Netzwerkkonfiguration${NC}"
|
||||
log "${YELLOW}Bitte nach der Installation das System neu starten, danach sollte der API-Server erreichbar sein${NC}"
|
||||
|
||||
# Wir versuchen es trotzdem einmal, um zu sehen, ob er vielleicht doch läuft
|
||||
if curl -s http://localhost:5000/health 2>/dev/null | grep -q "healthy"; then
|
||||
log "${GREEN}Backend-API ist erreichbar und funktioniert${NC}"
|
||||
else
|
||||
log "${YELLOW}Backend-API ist wie erwartet noch nicht erreichbar${NC}"
|
||||
log "${GREEN}Das ist völlig normal bei der Erstinstallation${NC}"
|
||||
log "${GREEN}Nach einem Neustart des Systems sollte der API-Server korrekt erreichbar sein${NC}"
|
||||
log "Container-Status prüfen mit: docker logs myp-backend"
|
||||
fi
|
||||
|
||||
# Initialisierung der Datenbank prüfen
|
||||
log "${YELLOW}Prüfe Datenbank-Initialisierung...${NC}"
|
||||
if [ ! -s "instance/myp.db" ]; then
|
||||
log "${YELLOW}Datenbank scheint leer zu sein. Führe Initialisierungsskript aus...${NC}"
|
||||
DB_INIT_OUTPUT=$(docker exec myp-backend python -c "from app import init_db; init_db()" 2>&1)
|
||||
if [ $? -eq 0 ]; then
|
||||
log "${GREEN}Datenbank erfolgreich initialisiert${NC}"
|
||||
else
|
||||
error_log "Fehler bei der Datenbank-Initialisierung:"
|
||||
echo "$DB_INIT_OUTPUT"
|
||||
log "Container-Logs:"
|
||||
docker logs myp-backend
|
||||
fi
|
||||
else
|
||||
log "${GREEN}Datenbank existiert bereits${NC}"
|
||||
fi
|
||||
|
||||
# Teste, ob ein API-Endpunkt Daten zurückgibt
|
||||
log "${YELLOW}Teste Datenbank-Verbindung über API...${NC}"
|
||||
if curl -s http://localhost:5000/api/printers | grep -q "\[\]"; then
|
||||
log "${GREEN}Datenbank-Verbindung funktioniert${NC}"
|
||||
else
|
||||
log "${YELLOW}API gibt keine leere Drucker-Liste zurück. Möglicherweise ist die DB nicht korrekt initialisiert.${NC}"
|
||||
log "API-Antwort:"
|
||||
curl -s http://localhost:5000/api/printers
|
||||
fi
|
||||
|
||||
log "${GREEN}=== Installation abgeschlossen ===${NC}"
|
||||
log "${YELLOW}WICHTIG: Nach der Erstinstallation ist ein Systemneustart erforderlich${NC}"
|
||||
log "${YELLOW}Danach ist das Backend unter http://localhost:5000 erreichbar${NC}"
|
||||
log "Anzeigen der Logs: docker logs -f myp-backend"
|
||||
|
||||
# Verwende die richtige Docker Compose Version für Hinweis
|
||||
if [ "${DOCKER_COMPOSE_V2:-false}" = true ]; then
|
||||
log "Backend stoppen: docker compose -f $BACKEND_DIR/docker-compose.yml down"
|
||||
else
|
||||
log "Backend stoppen: docker-compose -f $BACKEND_DIR/docker-compose.yml down"
|
||||
fi
|
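install-backend.sh writes the printer inventory into .env as a JSON map under PRINTERS. A minimal sketch of how such a value could be read on the Flask side; only the variable name and the JSON shape come from the script above, the loader function itself is hypothetical:

```python
# Hypothetical loader for the PRINTERS value written to .env above. Only the
# variable name and the JSON shape come from the install script.
import json
import os


def load_printers() -> dict:
    """Return a mapping of printer name -> smart plug IP from $PRINTERS."""
    raw = os.environ.get("PRINTERS", "{}")
    try:
        printers = json.loads(raw)
    except json.JSONDecodeError as exc:
        raise RuntimeError(f"PRINTERS is not valid JSON: {exc}") from exc
    return {name: entry["ip"] for name, entry in printers.items()}


if __name__ == "__main__":
    print(load_printers())
```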
483
install-frontend.sh
Executable file
@@ -0,0 +1,483 @@
|
||||
#!/bin/bash
|
||||
|
||||
# MYP Frontend Installations-Skript
|
||||
# Dieses Skript installiert das Frontend mit Docker und Host-Netzwerkanbindung
|
||||
|
||||
# Farbcodes für Ausgabe
|
||||
RED='\033[0;31m'
|
||||
GREEN='\033[0;32m'
|
||||
YELLOW='\033[0;33m'
|
||||
BLUE='\033[0;34m'
|
||||
NC='\033[0m' # No Color
|
||||
|
||||
# Funktion zur Ausgabe mit Zeitstempel
|
||||
log() {
|
||||
echo -e "${BLUE}[$(date +'%Y-%m-%d %H:%M:%S')]${NC} $1"
|
||||
}
|
||||
|
||||
error_log() {
|
||||
echo -e "${RED}[$(date +'%Y-%m-%d %H:%M:%S')] FEHLER:${NC} $1" >&2
|
||||
}
|
||||
|
||||
# Funktion zum Bereinigen vorhandener Installationen
|
||||
cleanup_existing_installation() {
|
||||
log "${YELLOW}Bereinige vorhandene Installation...${NC}"
|
||||
|
||||
# Stoppe und entferne existierende Container
|
||||
if docker ps -a | grep -q "myp-frontend"; then
|
||||
log "Stoppe und entferne existierenden Frontend-Container..."
|
||||
docker stop myp-frontend &>/dev/null || true
|
||||
docker rm myp-frontend &>/dev/null || true
|
||||
fi
|
||||
|
||||
# Entferne Docker Images
|
||||
if docker images | grep -q "myp-frontend"; then
|
||||
log "Entferne existierendes Frontend-Image..."
|
||||
docker rmi myp-frontend &>/dev/null || true
|
||||
fi
|
||||
|
||||
log "${GREEN}Bereinigung abgeschlossen.${NC}"
|
||||
}
|
||||
|
||||
# Pfade definieren
|
||||
SCRIPT_DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
|
||||
FRONTEND_DIR="$SCRIPT_DIR/packages/reservation-platform"
|
||||
|
||||
# Prüfen ob Verzeichnis existiert
|
||||
if [ ! -d "$FRONTEND_DIR" ]; then
|
||||
error_log "Frontend-Verzeichnis '$FRONTEND_DIR' nicht gefunden."
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# Bereinige existierende Installation
|
||||
cleanup_existing_installation
|
||||
|
||||
# Funktion zur Installation von Docker und Docker Compose für Raspberry Pi
|
||||
install_docker() {
|
||||
log "${YELLOW}Docker ist nicht installiert. Installation wird gestartet...${NC}"
|
||||
|
||||
# Erkenne Raspberry Pi
|
||||
if [ -f /proc/device-tree/model ] && grep -q "Raspberry Pi" /proc/device-tree/model; then
|
||||
log "${GREEN}Raspberry Pi erkannt. Installiere Docker für ARM-Architektur...${NC}"
|
||||
IS_RASPBERRY_PI=true
|
||||
else
|
||||
IS_RASPBERRY_PI=false
|
||||
fi
|
||||
|
||||
# Aktualisiere Paketindex
|
||||
if ! sudo apt-get update; then
|
||||
error_log "Konnte Paketindex nicht aktualisieren. Bitte manuell installieren."
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# Installiere erforderliche Pakete
|
||||
if ! sudo apt-get install -y apt-transport-https ca-certificates curl gnupg software-properties-common; then
|
||||
error_log "Konnte erforderliche Pakete nicht installieren. Bitte manuell installieren."
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# Raspberry Pi-spezifische Installation
|
||||
if [ "$IS_RASPBERRY_PI" = true ]; then
|
||||
# Setze Systemarchitektur für Raspberry Pi (armhf oder arm64)
|
||||
ARCH=$(dpkg --print-architecture)
|
||||
log "Erkannte Systemarchitektur: ${ARCH}"
|
||||
|
||||
# Installiere Docker mit convenience script (für Raspberry Pi empfohlen)
|
||||
log "${YELLOW}Installiere Docker mit dem convenience script...${NC}"
|
||||
curl -fsSL https://get.docker.com -o get-docker.sh
|
||||
sudo sh get-docker.sh
|
||||
|
||||
if [ $? -ne 0 ]; then
|
||||
error_log "Docker-Installation fehlgeschlagen. Bitte manuell installieren."
|
||||
exit 1
|
||||
fi
|
||||
else
|
||||
# Standard-Installation für andere Systeme
|
||||
# Füge Docker's offiziellen GPG-Schlüssel hinzu
|
||||
curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo apt-key add -
|
||||
|
||||
# Füge Docker-Repository hinzu
|
||||
if ! sudo add-apt-repository "deb [arch=$(dpkg --print-architecture)] https://download.docker.com/linux/ubuntu $(lsb_release -cs) stable"; then
|
||||
error_log "Konnte Docker-Repository nicht hinzufügen. Prüfen Sie, ob Ihr System unterstützt wird."
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# Aktualisiere Paketindex erneut
|
||||
sudo apt-get update
|
||||
|
||||
# Installiere Docker
|
||||
if ! sudo apt-get install -y docker-ce docker-ce-cli containerd.io; then
|
||||
error_log "Konnte Docker nicht installieren. Bitte manuell installieren."
|
||||
exit 1
|
||||
fi
|
||||
fi
|
||||
|
||||
# Füge aktuellen Benutzer zur Docker-Gruppe hinzu
|
||||
sudo usermod -aG docker "$USER"
|
||||
|
||||
log "${GREEN}Docker wurde installiert.${NC}"
|
||||
log "${YELLOW}WICHTIG: Möglicherweise müssen Sie sich neu anmelden, damit die Gruppenänderung wirksam wird.${NC}"
|
||||
|
||||
# Prüfen, ob Docker Compose v2 Plugin verfügbar ist (bevorzugt, da moderner)
|
||||
log "${YELLOW}Prüfe Docker Compose Version...${NC}"
|
||||
|
||||
if docker compose version &> /dev/null; then
|
||||
log "${GREEN}Docker Compose v2 Plugin ist bereits installiert.${NC}"
|
||||
DOCKER_COMPOSE_V2=true
|
||||
else
|
||||
log "${YELLOW}Docker Compose v2 Plugin nicht gefunden. Versuche Docker Compose v1 zu installieren...${NC}"
|
||||
DOCKER_COMPOSE_V2=false
|
||||
|
||||
if [ "$IS_RASPBERRY_PI" = true ]; then
|
||||
# Für Raspberry Pi ist es besser, die richtige Architektur zu verwenden
|
||||
if [ "$ARCH" = "armhf" ]; then
|
||||
log "Installiere Docker Compose für armhf (32-bit)..."
|
||||
sudo curl -L "https://github.com/docker/compose/releases/download/v2.6.1/docker-compose-linux-armv7" -o /usr/local/bin/docker-compose
|
||||
elif [ "$ARCH" = "arm64" ]; then
|
||||
log "Installiere Docker Compose für arm64 (64-bit)..."
|
||||
sudo curl -L "https://github.com/docker/compose/releases/download/v2.6.1/docker-compose-linux-aarch64" -o /usr/local/bin/docker-compose
|
||||
else
|
||||
# Fallback auf v1.29.2 für unbekannte ARM-Architekturen
|
||||
log "Verwende automatische Architekturerkennung für Docker Compose v1.29.2..."
|
||||
sudo curl -L "https://github.com/docker/compose/releases/download/1.29.2/docker-compose-$(uname -s)-$(uname -m)" -o /usr/local/bin/docker-compose
|
||||
fi
|
||||
else
|
||||
# Für andere Systeme versuche zuerst v2, dann v1.29.2 als Fallback
|
||||
log "Installiere Docker Compose v2 für $(uname -s)/$(uname -m)..."
|
||||
if ! sudo curl -L "https://github.com/docker/compose/releases/download/v2.6.1/docker-compose-$(uname -s)-$(uname -m)" -o /usr/local/bin/docker-compose; then
|
||||
log "${YELLOW}Konnte Docker Compose v2 nicht herunterladen. Versuche v1.29.2...${NC}"
|
||||
sudo curl -L "https://github.com/docker/compose/releases/download/1.29.2/docker-compose-$(uname -s)-$(uname -m)" -o /usr/local/bin/docker-compose
|
||||
fi
|
||||
fi
|
||||
|
||||
if [ $? -ne 0 ]; then
|
||||
error_log "Konnte Docker Compose nicht herunterladen. Bitte manuell installieren."
|
||||
exit 1
|
||||
fi
|
||||
|
||||
sudo chmod +x /usr/local/bin/docker-compose
|
||||
|
||||
log "${GREEN}Docker Compose wurde installiert.${NC}"
|
||||
fi
|
||||
|
||||
# Starte Docker-Dienst
|
||||
if command -v systemctl &> /dev/null; then
|
||||
sudo systemctl enable docker
|
||||
sudo systemctl start docker
|
||||
elif command -v service &> /dev/null; then
|
||||
sudo update-rc.d docker defaults
|
||||
sudo service docker start
|
||||
fi
|
||||
}
|
||||
|
||||
# Prüfen ob Docker installiert ist
|
||||
if ! command -v docker &> /dev/null; then
|
||||
log "${YELLOW}Docker ist nicht installiert.${NC}"
|
||||
read -p "Möchten Sie Docker installieren? (j/n): " install_docker_choice
|
||||
if [[ "$install_docker_choice" == "j" ]]; then
|
||||
install_docker
|
||||
else
|
||||
error_log "Docker wird für die Installation benötigt. Bitte installieren Sie Docker manuell."
|
||||
log "Siehe: https://docs.docker.com/get-docker/"
|
||||
exit 1
|
||||
fi
|
||||
fi
|
||||
|
||||
# Prüfen ob Docker Daemon läuft
|
||||
if ! docker info &> /dev/null; then
|
||||
log "${YELLOW}Docker-Daemon läuft nicht. Versuche, den Dienst zu starten...${NC}"
|
||||
|
||||
# Versuche, Docker zu starten
|
||||
if command -v systemctl &> /dev/null; then
|
||||
sudo systemctl start docker
|
||||
elif command -v service &> /dev/null; then
|
||||
sudo service docker start
|
||||
else
|
||||
error_log "Konnte Docker-Daemon nicht starten. Bitte starten Sie den Docker-Dienst manuell."
|
||||
log "Starten mit: sudo systemctl start docker oder sudo service docker start"
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# Prüfe erneut, ob Docker läuft
|
||||
if ! docker info &> /dev/null; then
|
||||
error_log "Docker-Daemon konnte nicht gestartet werden. Bitte starten Sie den Docker-Dienst manuell."
|
||||
exit 1
|
||||
fi
|
||||
|
||||
log "${GREEN}Docker-Daemon wurde erfolgreich gestartet.${NC}"
|
||||
fi
|
||||
|
||||
# Prüfen ob Docker Compose installiert ist
|
||||
if docker compose version &> /dev/null; then
|
||||
log "${GREEN}Docker Compose v2 Plugin ist bereits installiert.${NC}"
|
||||
DOCKER_COMPOSE_V2=true
|
||||
elif command -v docker-compose &> /dev/null; then
|
||||
log "${GREEN}Docker Compose v1 ist bereits installiert.${NC}"
|
||||
DOCKER_COMPOSE_V2=false
|
||||
else
|
||||
log "${YELLOW}Docker Compose ist nicht installiert.${NC}"
|
||||
DOCKER_COMPOSE_V2=false
|
||||
read -p "Möchten Sie Docker Compose installieren? (j/n): " install_compose_choice
|
||||
if [[ "$install_compose_choice" == "j" ]]; then
|
||||
log "${YELLOW}Installiere Docker Compose...${NC}"
|
||||
|
||||
# Prüfe ob das Betriebssystem ARM-basiert ist (z.B. Raspberry Pi)
|
||||
if grep -q "arm" /proc/cpuinfo 2> /dev/null; then
|
||||
ARCH=$(dpkg --print-architecture 2> /dev/null || echo "unknown")
|
||||
IS_RASPBERRY_PI=true
|
||||
else
|
||||
IS_RASPBERRY_PI=false
|
||||
fi
|
||||
|
||||
# Versuche zuerst Docker Compose v2 zu installieren
|
||||
if [ "$IS_RASPBERRY_PI" = true ]; then
|
||||
if [ "$ARCH" = "armhf" ]; then
|
||||
log "Installiere Docker Compose für armhf (32-bit)..."
|
||||
sudo curl -L "https://github.com/docker/compose/releases/download/v2.6.1/docker-compose-linux-armv7" -o /usr/local/bin/docker-compose
|
||||
elif [ "$ARCH" = "arm64" ]; then
|
||||
log "Installiere Docker Compose für arm64 (64-bit)..."
|
||||
sudo curl -L "https://github.com/docker/compose/releases/download/v2.6.1/docker-compose-linux-aarch64" -o /usr/local/bin/docker-compose
|
||||
else
|
||||
log "Verwende automatische Architekturerkennung für Docker Compose v1.29.2..."
|
||||
sudo curl -L "https://github.com/docker/compose/releases/download/1.29.2/docker-compose-$(uname -s)-$(uname -m)" -o /usr/local/bin/docker-compose
|
||||
fi
|
||||
else
|
||||
log "Installiere Docker Compose v2 für $(uname -s)/$(uname -m)..."
|
||||
if ! sudo curl -L "https://github.com/docker/compose/releases/download/v2.6.1/docker-compose-$(uname -s)-$(uname -m)" -o /usr/local/bin/docker-compose; then
|
||||
log "${YELLOW}Konnte Docker Compose v2 nicht herunterladen. Versuche v1.29.2...${NC}"
|
||||
sudo curl -L "https://github.com/docker/compose/releases/download/1.29.2/docker-compose-$(uname -s)-$(uname -m)" -o /usr/local/bin/docker-compose
|
||||
fi
|
||||
fi
|
||||
|
||||
if [ $? -ne 0 ]; then
|
||||
error_log "Konnte Docker Compose nicht herunterladen. Bitte manuell installieren."
|
||||
exit 1
|
||||
fi
|
||||
|
||||
sudo chmod +x /usr/local/bin/docker-compose
|
||||
|
||||
log "${GREEN}Docker Compose wurde installiert.${NC}"
|
||||
else
|
||||
error_log "Docker Compose wird für die Installation benötigt. Bitte installieren Sie es manuell."
|
||||
log "Siehe: https://docs.docker.com/compose/install/"
|
||||
exit 1
|
||||
fi
|
||||
fi
|
||||
|
||||
# Prüfen ob wget installiert ist (wird für healthcheck verwendet)
|
||||
if ! command -v wget &> /dev/null; then
|
||||
error_log "wget ist nicht installiert, wird aber für den Container-Healthcheck benötigt."
|
||||
log "Installation mit: sudo apt-get install wget"
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# Wechsle ins Frontend-Verzeichnis
|
||||
log "Wechsle ins Verzeichnis: $FRONTEND_DIR"
|
||||
cd "$FRONTEND_DIR" || {
|
||||
error_log "Konnte nicht ins Verzeichnis $FRONTEND_DIR wechseln."
|
||||
exit 1
|
||||
}
|
||||
|
||||
# Prüfe ob Dockerfile existiert
|
||||
if [ ! -f "Dockerfile" ]; then
|
||||
error_log "Dockerfile nicht gefunden in $FRONTEND_DIR."
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# Prüfe ob docker-compose.yml existiert
|
||||
if [ ! -f "docker-compose.yml" ]; then
|
||||
error_log "docker-compose.yml nicht gefunden in $FRONTEND_DIR."
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# Prüfe ob package.json existiert
|
||||
if [ ! -f "package.json" ]; then
|
||||
error_log "package.json nicht gefunden in $FRONTEND_DIR."
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# Erstelle .env-Datei
|
||||
log "${YELLOW}Erstelle .env Datei...${NC}"
|
||||
cat > .env << EOL
|
||||
# Basic Server Configuration
|
||||
RUNTIME_ENVIRONMENT=prod
|
||||
DB_PATH=db/sqlite.db
|
||||
|
||||
# OAuth Configuration (Bitte anpassen)
|
||||
OAUTH_CLIENT_ID=client_id
|
||||
OAUTH_CLIENT_SECRET=client_secret
|
||||
|
||||
# Backend-API URL (IP-Adresse oder Hostname des Backend-Servers)
|
||||
NEXT_PUBLIC_API_URL=http://localhost:5000
|
||||
EOL
|
||||
|
||||
if [ ! -f ".env" ]; then
|
||||
error_log "Konnte .env-Datei nicht erstellen. Prüfen Sie die Berechtigungen."
|
||||
exit 1
|
||||
fi
|
||||
log "${GREEN}.env Datei erfolgreich erstellt${NC}"
|
||||
log "${YELLOW}HINWEIS: Bitte passen Sie die Backend-URL in der .env-Datei an, falls das Backend auf einem anderen Server läuft.${NC}"
|
||||
|
||||
# Datenbank-Verzeichnis erstellen
|
||||
log "Erstelle Datenbankverzeichnis"
|
||||
if ! mkdir -p db; then
|
||||
error_log "Konnte Verzeichnis 'db' nicht erstellen. Prüfen Sie die Berechtigungen."
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# Docker-Image bauen und starten
|
||||
log "${YELLOW}Baue und starte Frontend-Container...${NC}"
|
||||
|
||||
# Verwende die richtige Docker Compose Version
|
||||
if [ "${DOCKER_COMPOSE_V2:-false}" = true ]; then
|
||||
# Docker Compose V2 Plugin (docker compose)
|
||||
if ! docker compose up -d --build; then
|
||||
error_log "Docker Compose (v2) fehlgeschlagen. Versuche mit v1 Format..."
|
||||
if ! docker-compose up -d --build; then
|
||||
error_log "Docker Compose fehlgeschlagen. Siehe Fehlermeldung oben."
|
||||
exit 1
|
||||
fi
|
||||
fi
|
||||
else
|
||||
# Docker Compose V1 (docker-compose)
|
||||
if ! docker-compose up -d --build; then
|
||||
error_log "Docker Compose fehlgeschlagen. Siehe Fehlermeldung oben."
|
||||
exit 1
|
||||
fi
|
||||
fi
|
||||
|
||||
# Prüfe, ob der Container läuft
|
||||
log "Warte 10 Sekunden, bis der Container gestartet ist..."
|
||||
sleep 10
|
||||
|
||||
# Prüfe mehrmals, da der Container möglicherweise länger zum Starten braucht
|
||||
MAX_ATTEMPTS=5
|
||||
CURRENT_ATTEMPT=1
|
||||
|
||||
while [ $CURRENT_ATTEMPT -le $MAX_ATTEMPTS ]; do
|
||||
log "Prüfe Container-Status (Versuch $CURRENT_ATTEMPT von $MAX_ATTEMPTS)..."
|
||||
|
||||
if docker ps | grep -q "myp-frontend"; then
|
||||
log "${GREEN}Frontend-Container läuft${NC}"
|
||||
break
|
||||
else
|
||||
CONTAINER_STATUS=$(docker ps -a | grep myp-frontend)
|
||||
CONTAINER_CREATED=$(echo "$CONTAINER_STATUS" | grep -q "Created" && echo "true" || echo "false")
|
||||
CONTAINER_EXITED=$(echo "$CONTAINER_STATUS" | grep -q "Exited" && echo "true" || echo "false")
|
||||
|
||||
if [ "$CONTAINER_EXITED" = "true" ]; then
|
||||
log "${YELLOW}Container wurde beendet. Prüfe Logs...${NC}"
|
||||
docker logs myp-frontend
|
||||
|
||||
log "${YELLOW}Starte Container neu mit besserer Debug-Ausgabe...${NC}"
|
||||
docker rm -f myp-frontend
|
||||
|
||||
if [ "${DOCKER_COMPOSE_V2:-false}" = true ]; then
|
||||
docker compose up -d
|
||||
else
|
||||
docker-compose up -d
|
||||
fi
|
||||
|
||||
sleep 10
|
||||
fi
|
||||
|
||||
if [ $CURRENT_ATTEMPT -eq $MAX_ATTEMPTS ]; then
|
||||
error_log "Frontend-Container läuft nach mehreren Versuchen nicht. Container-Status:"
|
||||
docker ps -a | grep myp-frontend
|
||||
log "Container-Logs:"
|
||||
docker logs myp-frontend
|
||||
exit 1
|
||||
fi
|
||||
fi
|
||||
|
||||
CURRENT_ATTEMPT=$((CURRENT_ATTEMPT + 1))
|
||||
sleep 20
|
||||
done
|
||||
|
||||
# Teste ob der Server erreichbar ist
|
||||
log "${YELLOW}Teste ob Frontend-Server erreichbar ist...${NC}"
|
||||
log "${YELLOW}HINWEIS: Bei der Erstinstallation kann es einige Minuten dauern, bis der Server erreichbar ist${NC}"
|
||||
log "${YELLOW}Bei anhaltenden Problemen kann ein Neustart des Systems helfen${NC}"
|
||||
|
||||
MAX_ATTEMPTS=3
|
||||
ATTEMPT=1
|
||||
|
||||
while [ $ATTEMPT -le $MAX_ATTEMPTS ]; do
|
||||
log "Versuch $ATTEMPT/$MAX_ATTEMPTS..."
|
||||
if curl -s -o /dev/null -w "%{http_code}" http://localhost:3000 2>/dev/null | grep -q "200\|304"; then
|
||||
log "${GREEN}Frontend-Server ist erreichbar!${NC}"
|
||||
break
|
||||
else
|
||||
# Prüfe, ob der Container noch läuft oder Fehler aufweist
|
||||
CONTAINER_STATUS=$(docker inspect --format='{{.State.Status}}' myp-frontend 2>/dev/null || echo "nicht gefunden")
|
||||
|
||||
if [ "$CONTAINER_STATUS" != "running" ]; then
|
||||
log "${YELLOW}Container ist nicht aktiv (Status: $CONTAINER_STATUS). Prüfe Logs...${NC}"
|
||||
docker logs myp-frontend --tail 20
|
||||
|
||||
# Wenn der Container gestoppt wurde, starte ihn neu
|
||||
if [ "$CONTAINER_STATUS" = "exited" ] || [ "$CONTAINER_STATUS" = "created" ]; then
|
||||
log "${YELLOW}Versuche, den Container neuzustarten...${NC}"
|
||||
docker start myp-frontend
|
||||
sleep 10
|
||||
fi
|
||||
fi
|
||||
|
||||
if [ $ATTEMPT -eq $MAX_ATTEMPTS ]; then
|
||||
log "${YELLOW}Server noch nicht erreichbar. Das ist bei der Erstinstallation normal.${NC}"
|
||||
log "${GREEN}Der Container ist installiert und sollte nach einem System-Neustart korrekt funktionieren.${NC}"
|
||||
log "${GREEN}Beim ersten Start kann die Datenbank-Migration und der Build länger dauern.${NC}"
|
||||
log "Überprüfen Sie später den Container-Status mit: docker logs myp-frontend"
|
||||
else
|
||||
log "Server noch nicht erreichbar. Warte 10 Sekunden..."
|
||||
sleep 10
|
||||
fi
|
||||
fi
|
||||
ATTEMPT=$((ATTEMPT+1))
|
||||
done
|
||||
|
||||
# Stellen Sie sicher, dass Verzeichnis- und Datei-Berechtigungen korrekt gesetzt sind
|
||||
log "${YELLOW}Bereite Datenbank vor...${NC}"
|
||||
mkdir -p db
|
||||
touch db/sqlite.db
|
||||
chmod 666 db/sqlite.db
|
||||
log "${GREEN}Datenbank vorbereitet${NC}"
|
||||
|
||||
# Führe bei Bedarf SQLite-Rebuild im Container aus
|
||||
log "${YELLOW}Führe SQLite-Rebuild im Container durch...${NC}"
|
||||
docker exec myp-frontend npm_config_build_from_source=true pnpm rebuild better-sqlite3 || {
|
||||
log "${YELLOW}Rebuild im laufenden Container nicht möglich. Wird beim nächsten Start automatisch ausgeführt.${NC}"
|
||||
}
|
||||
|
||||
# Prüfe ob die Datenbank-Migration erfolgt ist
|
||||
log "${YELLOW}Prüfe Datenbank-Migration...${NC}"
|
||||
log "${YELLOW}Hinweis: Die Migration wird beim ersten Start nach dem Systemneustart automatisch ausgeführt${NC}"
|
||||
|
||||
if [ -f "db/sqlite.db" ]; then
|
||||
log "${GREEN}Datenbank existiert${NC}"
|
||||
|
||||
# Setze Berechtigungen
|
||||
chmod 666 db/sqlite.db
|
||||
|
||||
# Prüfe Datenbankgröße
|
||||
DB_SIZE=$(du -b db/sqlite.db 2>/dev/null | cut -f1 || echo "0")
|
||||
if [ "$DB_SIZE" -gt 1000 ]; then
|
||||
log "${GREEN}Datenbank scheint initialisiert zu sein (Größe: $DB_SIZE Bytes)${NC}"
|
||||
else
|
||||
log "${YELLOW}Datenbank ist leer oder sehr klein. Die Migration wird beim ersten Start ausgeführt.${NC}"
|
||||
fi
|
||||
else
|
||||
log "${YELLOW}Konnte Datenbank-Datei nicht finden. Wird beim Neustart automatisch erstellt.${NC}"
|
||||
fi
|
||||
|
||||
log "${GREEN}=== Installation abgeschlossen ===${NC}"
|
||||
log "${YELLOW}WICHTIG: Nach der Erstinstallation ist ein Systemneustart erforderlich${NC}"
|
||||
log "${YELLOW}Danach ist das Frontend unter http://localhost:3000 erreichbar${NC}"
|
||||
log "Anzeigen der Logs: docker logs -f myp-frontend"
|
||||
|
||||
# Verwende die richtige Docker Compose Version für Hinweis
|
||||
if [ "${DOCKER_COMPOSE_V2:-false}" = true ]; then
|
||||
log "Frontend stoppen: docker compose -f $FRONTEND_DIR/docker-compose.yml down"
|
||||
else
|
||||
log "Frontend stoppen: docker-compose -f $FRONTEND_DIR/docker-compose.yml down"
|
||||
fi
|
536
log.txt
Normal file
@@ -0,0 +1,536 @@
|
||||
⨯ Error: Could not locate the bindings file. Tried:
|
||||
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/build/better_sqlite3.node
|
||||
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/build/Debug/better_sqlite3.node
|
||||
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/build/Release/better_sqlite3.node
|
||||
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/out/Debug/better_sqlite3.node
|
||||
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/Debug/better_sqlite3.node
|
||||
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/out/Release/better_sqlite3.node
|
||||
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/Release/better_sqlite3.node
|
||||
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/build/default/better_sqlite3.node
|
||||
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/compiled/20.19.0/linux/arm64/better_sqlite3.node
|
||||
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/addon-build/release/install-root/better_sqlite3.node
|
||||
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/addon-build/debug/install-root/better_sqlite3.node
|
||||
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/addon-build/default/install-root/better_sqlite3.node
|
||||
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/lib/binding/node-v115-linux-arm64/better_sqlite3.node
|
||||
at bindings (/app/node_modules/.pnpm/bindings@1.5.0/node_modules/bindings/bindings.js:126:9)
|
||||
at new Database (/app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/lib/database.js:48:64)
|
||||
at eval (webpack-internal:///(rsc)/./src/server/db/index.ts:14:16)
|
||||
at (rsc)/./src/server/db/index.ts (/app/.next/server/app/page.js:1001:1)
|
||||
at __webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
|
||||
at eval (webpack-internal:///(rsc)/./src/server/auth/index.ts:6:68)
|
||||
at (rsc)/./src/server/auth/index.ts (/app/.next/server/app/page.js:957:1)
|
||||
at __webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
|
||||
at eval (webpack-internal:///(rsc)/./src/components/header/index.tsx:12:70)
|
||||
at (rsc)/./src/components/header/index.tsx (/app/.next/server/app/page.js:697:1)
|
||||
at __webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
|
||||
at eval (webpack-internal:///(rsc)/./src/app/layout.tsx:10:76)
|
||||
at (rsc)/./src/app/layout.tsx (/app/.next/server/app/page.js:594:1)
|
||||
at Function.__webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
|
||||
at async e9 (/app/node_modules/.pnpm/next@14.2.3_react-dom@18.3.1_react@18.3.1__react@18.3.1/node_modules/next/dist/compiled/next-server/app-page.runtime.dev.js:35:396515)
|
||||
at async tb (/app/node_modules/.pnpm/next@14.2.3_react-dom@18.3.1_react@18.3.1__react@18.3.1/node_modules/next/dist/compiled/next-server/app-page.runtime.dev.js:35:400212)
|
||||
at async tS (/app/node_modules/.pnpm/next@14.2.3_react-dom@18.3.1_react@18.3.1__react@18.3.1/node_modules/next/dist/compiled/next-server/app-page.runtime.dev.js:35:400773)
|
||||
at async tR (/app/node_modules/.pnpm/next@14.2.3_react-dom@18.3.1_react@18.3.1__react@18.3.1/node_modules/next/dist/compiled/next-server/app-page.runtime.dev.js:36:2130)
|
||||
at async /app/node_modules/.pnpm/next@14.2.3_react-dom@18.3.1_react@18.3.1__react@18.3.1/node_modules/next/dist/compiled/next-server/app-page.runtime.dev.js:36:2722 {
|
||||
digest: '4214325463',
|
||||
page: '/'
|
||||
}
|
||||
GET / 500 in 40ms
|
||||
⨯ Error: Could not locate the bindings file. Tried:
|
||||
at eval (./src/server/db/index.ts:14:16)
|
||||
at (rsc)/./src/server/db/index.ts (/app/.next/server/app/page.js:1001:1)
|
||||
at __webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
|
||||
at eval (./src/server/auth/index.ts:6:68)
|
||||
at (rsc)/./src/server/auth/index.ts (/app/.next/server/app/page.js:957:1)
|
||||
at __webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
|
||||
at eval (./src/components/header/index.tsx:12:70)
|
||||
at (rsc)/./src/components/header/index.tsx (/app/.next/server/app/page.js:697:1)
|
||||
at __webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
|
||||
at eval (./src/app/layout.tsx:10:76)
|
||||
at (rsc)/./src/app/layout.tsx (/app/.next/server/app/page.js:594:1)
|
||||
at Function.__webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
|
||||
digest: "3425251174"
|
||||
⨯ Error: Could not locate the bindings file. Tried:
|
||||
at eval (./src/server/db/index.ts:14:16)
|
||||
at (rsc)/./src/server/db/index.ts (/app/.next/server/app/page.js:1001:1)
|
||||
at __webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
|
||||
at eval (./src/server/auth/index.ts:6:68)
|
||||
at (rsc)/./src/server/auth/index.ts (/app/.next/server/app/page.js:957:1)
|
||||
at __webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
|
||||
at eval (./src/components/header/index.tsx:12:70)
|
||||
at (rsc)/./src/components/header/index.tsx (/app/.next/server/app/page.js:697:1)
|
||||
at __webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
|
||||
at eval (./src/app/layout.tsx:10:76)
|
||||
at (rsc)/./src/app/layout.tsx (/app/.next/server/app/page.js:594:1)
|
||||
at Function.__webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
|
||||
digest: "3425251174"
|
||||
Error: Could not locate the bindings file. Tried:
|
||||
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/build/better_sqlite3.node
|
||||
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/build/Debug/better_sqlite3.node
|
||||
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/build/Release/better_sqlite3.node
|
||||
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/out/Debug/better_sqlite3.node
|
||||
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/Debug/better_sqlite3.node
|
||||
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/out/Release/better_sqlite3.node
|
||||
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/Release/better_sqlite3.node
|
||||
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/build/default/better_sqlite3.node
|
||||
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/compiled/20.19.0/linux/arm64/better_sqlite3.node
|
||||
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/addon-build/release/install-root/better_sqlite3.node
|
||||
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/addon-build/debug/install-root/better_sqlite3.node
|
||||
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/addon-build/default/install-root/better_sqlite3.node
|
||||
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/lib/binding/node-v115-linux-arm64/better_sqlite3.node
|
||||
at bindings (/app/node_modules/.pnpm/bindings@1.5.0/node_modules/bindings/bindings.js:126:9)
|
||||
at new Database (/app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/lib/database.js:48:64)
|
||||
at eval (webpack-internal:///(rsc)/./src/server/db/index.ts:14:16)
|
||||
at (rsc)/./src/server/db/index.ts (/app/.next/server/app/page.js:1001:1)
|
||||
at __webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
|
||||
at eval (webpack-internal:///(rsc)/./src/server/auth/index.ts:6:68)
|
||||
at (rsc)/./src/server/auth/index.ts (/app/.next/server/app/page.js:957:1)
|
||||
at __webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
|
||||
at eval (webpack-internal:///(rsc)/./src/components/header/index.tsx:12:70)
|
||||
at (rsc)/./src/components/header/index.tsx (/app/.next/server/app/page.js:697:1)
|
||||
at __webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
|
||||
at eval (webpack-internal:///(rsc)/./src/app/layout.tsx:10:76)
|
||||
at (rsc)/./src/app/layout.tsx (/app/.next/server/app/page.js:594:1)
|
||||
at Function.__webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
|
||||
at async e9 (/app/node_modules/.pnpm/next@14.2.3_react-dom@18.3.1_react@18.3.1__react@18.3.1/node_modules/next/dist/compiled/next-server/app-page.runtime.dev.js:35:396515)
|
||||
at async tb (/app/node_modules/.pnpm/next@14.2.3_react-dom@18.3.1_react@18.3.1__react@18.3.1/node_modules/next/dist/compiled/next-server/app-page.runtime.dev.js:35:400212)
|
||||
at async tS (/app/node_modules/.pnpm/next@14.2.3_react-dom@18.3.1_react@18.3.1__react@18.3.1/node_modules/next/dist/compiled/next-server/app-page.runtime.dev.js:35:400773)
|
||||
at async tR (/app/node_modules/.pnpm/next@14.2.3_react-dom@18.3.1_react@18.3.1__react@18.3.1/node_modules/next/dist/compiled/next-server/app-page.runtime.dev.js:36:2130)
|
||||
at async /app/node_modules/.pnpm/next@14.2.3_react-dom@18.3.1_react@18.3.1__react@18.3.1/node_modules/next/dist/compiled/next-server/app-page.runtime.dev.js:36:2722 {
|
||||
digest: '4214325463'
|
||||
}
|
||||
⨯ Error: Could not locate the bindings file. Tried:
|
||||
at eval (./src/server/db/index.ts:14:16)
|
||||
at (rsc)/./src/server/db/index.ts (/app/.next/server/app/page.js:1001:1)
|
||||
at __webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
|
||||
at eval (./src/server/auth/index.ts:6:68)
|
||||
at (rsc)/./src/server/auth/index.ts (/app/.next/server/app/page.js:957:1)
|
||||
at __webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
|
||||
at eval (./src/components/header/index.tsx:12:70)
|
||||
at (rsc)/./src/components/header/index.tsx (/app/.next/server/app/page.js:697:1)
|
||||
at __webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
|
||||
at eval (./src/app/layout.tsx:10:76)
|
||||
at (rsc)/./src/app/layout.tsx (/app/.next/server/app/page.js:594:1)
|
||||
at Function.__webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
|
||||
digest: "4214325463"
|
||||
⨯ Error: Could not locate the bindings file. Tried:
|
||||
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/build/better_sqlite3.node
|
||||
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/build/Debug/better_sqlite3.node
|
||||
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/build/Release/better_sqlite3.node
|
||||
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/out/Debug/better_sqlite3.node
|
||||
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/Debug/better_sqlite3.node
|
||||
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/out/Release/better_sqlite3.node
|
||||
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/Release/better_sqlite3.node
|
||||
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/build/default/better_sqlite3.node
|
||||
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/compiled/20.19.0/linux/arm64/better_sqlite3.node
|
||||
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/addon-build/release/install-root/better_sqlite3.node
|
||||
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/addon-build/debug/install-root/better_sqlite3.node
|
||||
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/addon-build/default/install-root/better_sqlite3.node
|
||||
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/lib/binding/node-v115-linux-arm64/better_sqlite3.node
|
||||
at bindings (/app/node_modules/.pnpm/bindings@1.5.0/node_modules/bindings/bindings.js:126:9)
|
||||
at new Database (/app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/lib/database.js:48:64)
|
||||
at eval (webpack-internal:///(rsc)/./src/server/db/index.ts:14:16)
|
||||
at (rsc)/./src/server/db/index.ts (/app/.next/server/app/page.js:1001:1)
|
||||
at __webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
|
||||
at eval (webpack-internal:///(rsc)/./src/server/auth/index.ts:6:68)
|
||||
at (rsc)/./src/server/auth/index.ts (/app/.next/server/app/page.js:957:1)
|
||||
at __webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
|
||||
at eval (webpack-internal:///(rsc)/./src/components/header/index.tsx:12:70)
|
||||
at (rsc)/./src/components/header/index.tsx (/app/.next/server/app/page.js:697:1)
|
||||
at __webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
|
||||
at eval (webpack-internal:///(rsc)/./src/app/layout.tsx:10:76)
|
||||
at (rsc)/./src/app/layout.tsx (/app/.next/server/app/page.js:594:1)
|
||||
at Function.__webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
|
||||
at async e9 (/app/node_modules/.pnpm/next@14.2.3_react-dom@18.3.1_react@18.3.1__react@18.3.1/node_modules/next/dist/compiled/next-server/app-page.runtime.dev.js:35:396515)
|
||||
at async tb (/app/node_modules/.pnpm/next@14.2.3_react-dom@18.3.1_react@18.3.1__react@18.3.1/node_modules/next/dist/compiled/next-server/app-page.runtime.dev.js:35:400212)
|
||||
at async tS (/app/node_modules/.pnpm/next@14.2.3_react-dom@18.3.1_react@18.3.1__react@18.3.1/node_modules/next/dist/compiled/next-server/app-page.runtime.dev.js:35:400773)
|
||||
at async tR (/app/node_modules/.pnpm/next@14.2.3_react-dom@18.3.1_react@18.3.1__react@18.3.1/node_modules/next/dist/compiled/next-server/app-page.runtime.dev.js:36:2130)
|
||||
at async /app/node_modules/.pnpm/next@14.2.3_react-dom@18.3.1_react@18.3.1__react@18.3.1/node_modules/next/dist/compiled/next-server/app-page.runtime.dev.js:36:2722 {
|
||||
digest: '4214325463',
|
||||
page: '/'
|
||||
}
|
||||
GET / 500 in 39ms
|
||||
⨯ Error: Could not locate the bindings file. Tried:
|
||||
at eval (./src/server/db/index.ts:14:16)
|
||||
at (rsc)/./src/server/db/index.ts (/app/.next/server/app/page.js:1001:1)
|
||||
at __webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
|
||||
at eval (./src/server/auth/index.ts:6:68)
|
||||
at (rsc)/./src/server/auth/index.ts (/app/.next/server/app/page.js:957:1)
|
||||
at __webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
|
||||
at eval (./src/components/header/index.tsx:12:70)
|
||||
at (rsc)/./src/components/header/index.tsx (/app/.next/server/app/page.js:697:1)
|
||||
at __webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
|
||||
at eval (./src/app/layout.tsx:10:76)
|
||||
at (rsc)/./src/app/layout.tsx (/app/.next/server/app/page.js:594:1)
|
||||
at Function.__webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
|
||||
digest: "3020338880"
|
||||
⨯ Error: Could not locate the bindings file. Tried:
|
||||
at eval (./src/server/db/index.ts:14:16)
|
||||
at (rsc)/./src/server/db/index.ts (/app/.next/server/app/page.js:1001:1)
|
||||
at __webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
|
||||
at eval (./src/server/auth/index.ts:6:68)
|
||||
at (rsc)/./src/server/auth/index.ts (/app/.next/server/app/page.js:957:1)
|
||||
at __webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
|
||||
    at eval (./src/components/header/index.tsx:12:70)
    at (rsc)/./src/components/header/index.tsx (/app/.next/server/app/page.js:697:1)
    at __webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
    at eval (./src/app/layout.tsx:10:76)
    at (rsc)/./src/app/layout.tsx (/app/.next/server/app/page.js:594:1)
    at Function.__webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
  digest: "3020338880"
⨯ Error: Could not locate the bindings file. Tried:
 → /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/build/better_sqlite3.node
 → /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/build/Debug/better_sqlite3.node
 → /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/build/Release/better_sqlite3.node
 → /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/out/Debug/better_sqlite3.node
 → /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/Debug/better_sqlite3.node
 → /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/out/Release/better_sqlite3.node
 → /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/Release/better_sqlite3.node
 → /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/build/default/better_sqlite3.node
 → /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/compiled/20.19.0/linux/arm64/better_sqlite3.node
 → /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/addon-build/release/install-root/better_sqlite3.node
 → /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/addon-build/debug/install-root/better_sqlite3.node
 → /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/addon-build/default/install-root/better_sqlite3.node
 → /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/lib/binding/node-v115-linux-arm64/better_sqlite3.node
    at bindings (/app/node_modules/.pnpm/bindings@1.5.0/node_modules/bindings/bindings.js:126:9)
    at new Database (/app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/lib/database.js:48:64)
    at eval (webpack-internal:///(rsc)/./src/server/db/index.ts:14:16)
    at (rsc)/./src/server/db/index.ts (/app/.next/server/app/page.js:1001:1)
    at __webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
    at eval (webpack-internal:///(rsc)/./src/server/auth/index.ts:6:68)
    at (rsc)/./src/server/auth/index.ts (/app/.next/server/app/page.js:957:1)
    at __webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
    at eval (webpack-internal:///(rsc)/./src/components/header/index.tsx:12:70)
    at (rsc)/./src/components/header/index.tsx (/app/.next/server/app/page.js:697:1)
    at __webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
    at eval (webpack-internal:///(rsc)/./src/app/layout.tsx:10:76)
    at (rsc)/./src/app/layout.tsx (/app/.next/server/app/page.js:594:1)
    at Function.__webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
    at async e9 (/app/node_modules/.pnpm/next@14.2.3_react-dom@18.3.1_react@18.3.1__react@18.3.1/node_modules/next/dist/compiled/next-server/app-page.runtime.dev.js:35:396515)
    at async tb (/app/node_modules/.pnpm/next@14.2.3_react-dom@18.3.1_react@18.3.1__react@18.3.1/node_modules/next/dist/compiled/next-server/app-page.runtime.dev.js:35:400212)
    at async tS (/app/node_modules/.pnpm/next@14.2.3_react-dom@18.3.1_react@18.3.1__react@18.3.1/node_modules/next/dist/compiled/next-server/app-page.runtime.dev.js:35:400773)
    at async tR (/app/node_modules/.pnpm/next@14.2.3_react-dom@18.3.1_react@18.3.1__react@18.3.1/node_modules/next/dist/compiled/next-server/app-page.runtime.dev.js:36:2130)
    at async /app/node_modules/.pnpm/next@14.2.3_react-dom@18.3.1_react@18.3.1__react@18.3.1/node_modules/next/dist/compiled/next-server/app-page.runtime.dev.js:36:2722 {
  digest: '4214325463',
  page: '/'
}
⨯ Error: Could not locate the bindings file. Tried:
    at eval (./src/server/db/index.ts:14:16)
    at (rsc)/./src/server/db/index.ts (/app/.next/server/app/page.js:1001:1)
    at __webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
    at eval (./src/server/auth/index.ts:6:68)
    at (rsc)/./src/server/auth/index.ts (/app/.next/server/app/page.js:957:1)
    at __webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
    at eval (./src/components/header/index.tsx:12:70)
    at (rsc)/./src/components/header/index.tsx (/app/.next/server/app/page.js:697:1)
    at __webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
    at eval (./src/app/layout.tsx:10:76)
    at (rsc)/./src/app/layout.tsx (/app/.next/server/app/page.js:594:1)
    at Function.__webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
  digest: "3425251174"
GET / 500 in 37ms
GET / 500 in 32ms
GET / 500 in 31ms
@@ -1,27 +0,0 @@
# Build and utility assets
docker/
scripts/

# Ignore node_modules as they will be installed in the container
node_modules

# Ignore build artifacts
.next

# Ignore runtime data
db/

# Ignore local configuration files
.env
.env.example

# Ignore version control files
.git
.gitignore

# Ignore IDE/editor specific files
*.log
*.tmp
*.DS_Store
.vscode/
.idea/
10
packages/reservation-platform/.env
Normal file
@@ -0,0 +1,10 @@
# Basic Server Configuration
RUNTIME_ENVIRONMENT=prod
DB_PATH=db/sqlite.db

# OAuth Configuration (please adjust)
OAUTH_CLIENT_ID=client_id
OAUTH_CLIENT_SECRET=client_secret

# Backend API URL (IP address or hostname of the backend server)
NEXT_PUBLIC_API_URL=http://localhost:5000
@@ -1,3 +1,7 @@
# Basic Server Configuration
RUNTIME_ENVIRONMENT=dev
DB_PATH=db/sqlite.db

# OAuth Configuration
OAUTH_CLIENT_ID=client_id
OAUTH_CLIENT_SECRET=client_secret
5
packages/reservation-platform/.gitignore
vendored
@@ -1,10 +1,7 @@
# See https://help.github.com/articles/ignoring-files/ for more about ignoring files.

# db folder
db/

# Env file
.env
/db


# dependencies
@@ -1,34 +1,72 @@
FROM node:20-bookworm-slim
FROM node:20-alpine

# Create application directory
RUN mkdir -p /usr/src/app
WORKDIR /app

# Set environment variables
ENV PORT=3000
ENV NEXT_TELEMETRY_DISABLED=1

WORKDIR /usr/src/app

# Copy package.json and pnpm-lock.yaml
COPY package.json /usr/src/app
COPY pnpm-lock.yaml /usr/src/app
# Install system dependencies for SQLite and native modules
RUN apk add --no-cache python3 build-base g++ make sqlite sqlite-dev gcc musl-dev git libffi-dev openssl-dev cmake

# Install pnpm
RUN corepack enable pnpm
RUN npm install -g pnpm

# Install dependencies
RUN pnpm install
# Copy package files
COPY package.json pnpm-lock.yaml ./

# Copy the rest of the application code
COPY . /usr/src/app
# Install dependencies with native bindings build approval, ensuring to build from source for all platforms
ENV CFLAGS="-fPIC" \
    LDFLAGS="-fPIC" \
    CXXFLAGS="-fPIC" \
    npm_config_build_from_source=true \
    npm_config_sqlite=/usr/local \
    npm_config_sqlite_libname=sqlite3

# Initialize the database if it does not already exist
RUN pnpm run db
# Run the installation with comprehensive flags for native bindings
RUN pnpm install --unsafe-perm --no-optional --frozen-lockfile

# Build the application
RUN pnpm run build
# Explicitly rebuild better-sqlite3 for the current platform
RUN cd /app/node_modules/better-sqlite3 && \
    npm run build-release

# Install tsx for running TypeScript files directly
RUN pnpm add -D tsx

# Copy source code
COPY . .

# Create database directory
RUN mkdir -p db/

# Build the Next.js application
RUN pnpm build || echo "Generate schema failed, but continuing..."

# Expose the port
EXPOSE 3000

# Startup script to migrate DB and start app
RUN echo '#!/bin/sh' > /app/startup.sh && \
    echo 'set -e' >> /app/startup.sh && \
    echo 'mkdir -p /app/db' >> /app/startup.sh && \
    echo 'echo "Starting application..."' >> /app/startup.sh && \
    echo 'echo "Konfiguriere DB-Verzeichnis..."' >> /app/startup.sh && \
    echo 'DB_FILE="/app/db/sqlite.db"' >> /app/startup.sh && \
    echo 'if [ ! -f "$DB_FILE" ]; then' >> /app/startup.sh && \
    echo '  echo "Creating empty database file..."' >> /app/startup.sh && \
    echo '  touch "$DB_FILE"' >> /app/startup.sh && \
    echo 'fi' >> /app/startup.sh && \
    echo 'chmod 666 "$DB_FILE"' >> /app/startup.sh && \
    echo 'chmod -R 777 /app/db' >> /app/startup.sh && \
    echo 'export DB_PATH=$DB_FILE' >> /app/startup.sh && \
    echo 'echo "Datenbank wird unter $DB_PATH verwendet"' >> /app/startup.sh && \
    echo 'echo "Führe Datenbank-Migration aus..."' >> /app/startup.sh && \
    echo 'NODE_ENV=production npx tsx ./src/server/db/migrate.ts' >> /app/startup.sh && \
    echo 'echo "Migration abgeschlossen"' >> /app/startup.sh && \
    echo 'echo "Starte Next.js Anwendung..."' >> /app/startup.sh && \
    echo 'if [ -d ".next" ]; then' >> /app/startup.sh && \
    echo '  pnpm start' >> /app/startup.sh && \
    echo 'else' >> /app/startup.sh && \
    echo '  echo "Build directory nicht gefunden, führe Build aus..."' >> /app/startup.sh && \
    echo '  pnpm build && pnpm start' >> /app/startup.sh && \
    echo 'fi' >> /app/startup.sh && \
    chmod +x /app/startup.sh

# Start the application
CMD ["/bin/sh", "-c", "if [ ! -f ./db/sqlite.db ]; then pnpm db; fi && pnpm start"]
CMD ["/app/startup.sh"]
@@ -1,32 +1,217 @@
# MYP - Manage Your Printer
utilss/analytics/(scope).ts
deriver.ts
utils/sentinel.ts -> auth guard

MYP (Manage Your Printer) is a web application for reserving 3D printers.
It was developed as part of the final project of the Fachinformatiker apprenticeship for data and process analysis within the technical vocational training of the Mercedes-Benz plant Berlin-Marienfelde.

## Deployment
---

### Prerequisites
Based on the requirements mentioned, here are some additional columns you could add to your database:

- The network on the Raspberry Pi is set up
- Docker is installed
For the printers table:

### Steps
total_print_jobs: The total number of print jobs a printer has executed.

1. Build the Docker container (docker/build.sh)
2. Save the Docker container (docker/save.sh caddy:2.8 myp-rp:latest)
3. Deploy the Docker container to the Raspberry Pi (docker/deploy.sh)
total_active_time: The total time the printer was active (in minutes).

## Developer Information
total_error_jobs: The total number of print jobs that were aborted because of an error.

### Raspberry Pi Settings
last_maintenance_date: The date of the printer's last maintenance.

Raspbian Lite was installed on the Raspberry Pi.
The project files are located under /srv/*.
For the printJobs table:

### Credentials
end_time: The time at which the print job was finished.

```
User: myp
Password: (known personally)
```
was_successful: A boolean value indicating whether the print job completed successfully.

error_code: A code identifying a specific error if the print job was aborted.

For the users table:

total_print_jobs: The total number of print jobs a user has started.

preferred_printer_id: The ID of the printer the user uses most frequently.

last_login_date: The date of the user's last login.

These additional columns could help you carry out the statistical analyses and machine learning predictions mentioned above. Note that you may need to implement additional logic in your application to update these columns when the corresponding events occur (e.g. a print job is started or finished, a user logs in, etc.).

---

Based on your database schema, which contains information about printers, print jobs and users, you could carry out a variety of statistical analyses and machine learning predictions. Here are some concrete suggestions:

### Statistical analyses:
1. **Utilization analysis**: Determine printer utilization by analyzing the number and duration of print jobs (see the sketch after this list).
2. **Error analysis**: Examine the frequency and causes of aborted print jobs to identify patterns.
3. **User behavior**: Analyze how users behave, e.g. which printers are used most often or at what times most print jobs come in.
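
For illustration, a minimal pandas sketch of such a utilization analysis; the column names (`printer_id`, `start_time`, `end_time`) and the 8-hour reference window are assumptions for this example, not the platform's actual schema:

```python
# Minimal sketch of a printer utilization analysis (assumed column names and data).
import pandas as pd

# Assumed export of the printJobs table: one row per job with start/end timestamps.
jobs = pd.DataFrame({
    "printer_id": [1, 1, 2],
    "start_time": pd.to_datetime(["2024-05-01 08:00", "2024-05-01 10:30", "2024-05-01 09:00"]),
    "end_time":   pd.to_datetime(["2024-05-01 09:45", "2024-05-01 12:00", "2024-05-01 11:15"]),
})
jobs["duration_min"] = (jobs["end_time"] - jobs["start_time"]).dt.total_seconds() / 60

# Number of jobs and total active minutes per printer.
utilization = jobs.groupby("printer_id").agg(
    job_count=("duration_min", "size"),
    active_minutes=("duration_min", "sum"),
)
# Share of an assumed 8-hour working day during which each printer was busy.
utilization["utilization_pct"] = utilization["active_minutes"] / (8 * 60) * 100
print(utilization)
```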
### Machine learning predictions:
1. **Printer utilization forecasting**: Use time series analysis to predict future utilization patterns of the printers.
2. **Anomaly detection**: Use machine learning to detect anomalies in printing behavior that could point to potential problems.
3. **Recommendation system**: Develop a model that recommends printers to users based on their previous print jobs and preferences.

### Concrete implementation recommendations:
- **Prepare the data**: Clean and transform your data to get it ready for analysis. Remove duplicates, handle missing values and convert categorical data into a format that machine learning algorithms can process.
- **Feature engineering**: Create new features that could be useful for prediction models, such as the average duration of print jobs per user or the total number of print jobs per printer.
- **Model selection**: Choose suitable machine learning models. ARIMA models may be appropriate for time series forecasts, while decision trees or random forests could be used to classify user behavior.
- **Model training and validation**: Train your models on one part of your data and validate them on another part to make sure the models generalize well and are not overfitted.
- **Result interpretation**: Interpret the results of your models and use them to make business decisions or to improve the user experience on your platform.

These suggestions depend on the quality and quantity of your data as well as on the specific goals you are pursuing with your platform. It is important to update the models regularly in order to maintain and improve the accuracy of the predictions.

Source: Conversation with Bing, 11.5.2024
(1) Data Science – Nutzung von KI für Predictive Analytics - Springer. https://link.springer.com/content/pdf/10.1007/978-3-658-33731-5_27.pdf.
(2) Predictive Analytics: Grundlagen, Projektbeispiele und Lessons ... - Haufe. https://www.haufe.de/finance/haufe-finance-office-premium/predictive-analytics-grundlagen-projektbeispiele-und-lessons-learned_idesk_PI20354_HI13561373.html.
(3) Predictive Modelling: Was es ist und wie es dir dabei helfen kann, dein .... https://www.acquisa.de/magazin/predictive-modelling.
(4) Deep Learning und Predictive Analytics: Vorhersage von Kundenverhalten .... https://www.hagel-it.de/it-insights/deep-learning-und-predictive-analytics-vorhersage-von-kundenverhalten-und-markttrends.html.
(5) undefined. https://doi.org/10.1007/978-3-658-33731-5_27.

---

https://github.com/drizzle-team/drizzle-orm/discussions/1480#discussioncomment-9363695

---
To develop a 3D printer reservation platform and implement the criteria mentioned, I recommend the following steps:

### Analyze customer-specific requirements:
1. Conduct **stakeholder interviews** to understand needs and expectations.
2. Create **surveys** to collect feedback from potential users.
3. Hold **requirements workshops** to define requirements together with the stakeholders.
4. Develop **user stories** and **use cases** to make the requirements concrete.

### Plan the project implementation:
1. Clearly define **project goals** and align them with the business goals.
2. Carry out **resource planning** to use staff, time and budget efficiently.
3. Perform a **risk analysis** to identify potential obstacles early.
4. Create a **milestone plan** to structure the important project phases.

### Identify, classify and model data:
1. Identify **data sources** that are relevant for the reservation platform.
2. Perform **data classification** to order the data by type and sensitivity.
3. Create **entity-relationship models** (ERM) to visualize the relationships between the data.

### Use mathematical prediction models and statistical methods:
1. Perform **regression analyses** to predict future usage patterns.
2. Apply **cluster analyses** to identify and segment user groups.
3. Use **time series analyses** to detect trends and seasonal fluctuations.

### Ensure data quality:
1. Implement **validation rules** to guarantee that correct data is entered (see the sketch after this list).
2. Perform **data cleansing** regularly to remove duplicates and inconsistencies.
3. Ensure **data integrity** through referential integrity checks.
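
As a minimal illustration of such validation rules, a small Python sketch; the field names (`user_id`, `printer_id`, `start_time`, `end_time`) and the checks themselves are assumptions for this example, not the platform's actual schema:

```python
# Minimal sketch of validation rules for an incoming reservation record
# (field names and checks are assumptions, not the real schema).
from datetime import datetime

def validate_reservation(data: dict) -> list[str]:
    """Return a list of validation errors; an empty list means the record is acceptable."""
    errors = []
    if not data.get("user_id"):
        errors.append("user_id is required")
    if not data.get("printer_id"):
        errors.append("printer_id is required")
    try:
        start = datetime.fromisoformat(data["start_time"])
        end = datetime.fromisoformat(data["end_time"])
        if end <= start:
            errors.append("end_time must be after start_time")
    except (KeyError, ValueError):
        errors.append("start_time and end_time must be ISO 8601 timestamps")
    return errors

# Example: the end time lies before the start time, so one error is reported.
print(validate_reservation({"user_id": 1, "printer_id": 2,
                            "start_time": "2024-05-01T08:00", "end_time": "2024-05-01T07:00"}))
```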
### Prepare analysis results and highlight optimization potential:
1. Develop **dashboards** that present the most important key figures at a glance.
2. Generate **reports** that provide detailed insights into the usage data.
3. Derive **recommendations for action** to continuously improve the platform.

### Create project documentation according to the requirements:
1. Define **documentation standards** to ensure consistency.
2. Use **version control** to make changes traceable.
3. Document the **project progress** to keep track of the current state.

These recommendations are meant as a guideline for implementing the criteria mentioned above in a systematic and structured way in your final project.

Source: Conversation with Bing, 11.5.2024
(1) Erfolgreiche Datenanalyseprojekte: Diese Strategien sollten Sie kennen. https://www.b2bsmartdata.de/blog/erfolgreiche-datenanalyseprojekte-diese-strategien-sollten-sie-kennen.
(2) Projektdokumentation - wichtige Grundregeln | dieprojektmanager. https://dieprojektmanager.com/projektdokumentation-wichtige-grundregeln/.
(3) Projektdokumentation: Definition, Aufbau, Inhalte und Beispiel. https://www.wirtschaftswissen.de/unternehmensfuehrung/projektmanagement/projektdokumentation-je-genauer-sie-ist-desto-weniger-arbeit-haben-sie-mit-nachfolgeprojekten/.
(4) Was ist Datenmodellierung? | IBM. https://www.ibm.com/de-de/topics/data-modeling.
(5) Was ist Datenmodellierung? | Microsoft Power BI. https://powerbi.microsoft.com/de-de/what-is-data-modeling/.
(6) Inhalte Datenmodelle und Datenmodellierung Datenmodellierung ... - TUM. https://wwwbroy.in.tum.de/lehre/vorlesungen/mbe/SS07/vorlfolien/02_Datenmodellierung.pdf.
(7) Definition von Datenmodellierung: Einsatzbereiche und Typen. https://business.adobe.com/de/blog/basics/define-data-modeling.
(8) 3. Informations- und Datenmodelle - RPTU. http://lgis.informatik.uni-kl.de/archiv/wwwdvs.informatik.uni-kl.de/courses/DBS/WS2000/Vorlesungsunterlagen/Kapitel.03.pdf.
(9) Prozessoptimierung: 7 Methoden im Überblick! [2024] • Asana. https://asana.com/de/resources/process-improvement-methodologies.
(10) Prozessoptimierung: Definition, Methoden & Praxis-Beispiele. https://peras.de/hr-blog/detail/hr-blog/prozessoptimierung.
(11) Optimierungspotenzial erkennen - OPTANO. https://optano.com/blog/optimierungspotenzial-erkennen/.
(12) Projektplanung: Definition, Ziele und Ablauf - wirtschaftswissen.de. https://www.wirtschaftswissen.de/unternehmensfuehrung/projektmanagement/in-nur-5-schritten-zur-fehlerfreien-projektplanung/.
(13) Projektphasen: Die Vier! Von der Planung zur Umsetzung. https://www.pureconsultant.de/de/wissen/projektphasen/.
(14) Hinweise zur Abschlussprüfung in den IT-Berufen (VO 2020) - IHK_DE. https://www.ihk.de/blueprint/servlet/resource/blob/5361152/008d092b38f621b2c97c66d5193d9f6c/pruefungshinweise-neue-vo-2020-data.pdf.
(15) PAO – Projektantrag Fachinformatiker Daten- und Prozessanalyse - IHK_DE. https://www.ihk.de/blueprint/servlet/resource/blob/5673390/37eb05e451ed6051f6316f66d012cc50/projektantrag-fachinformatiker-daten-und-prozessanalyse-data.pdf.
(16) IT-BERUFE Leitfaden zur IHK-Abschlussprüfung Fachinformatikerinnen und .... https://www.ihk.de/blueprint/servlet/resource/blob/5439816/6570224fb196bc7e10d16beeeb75fec1/neu-leitfaden-fian-data.pdf.
(17) Fachinformatiker/-in Daten- und Prozessanalyse - IHK Nord Westfalen. https://www.ihk.de/nordwestfalen/bildung/ausbildung/ausbildungsberufe-a-z/fachinformatiker-daten-und-prozessanalyse-4767680.
(18) Leitfaden zur IHK-Abschlussprüfung Fachinformatiker/-in .... https://www.ihk.de/blueprint/servlet/resource/blob/5682602/2fbedf4b4f33f7522d28ebc611adc909/fachinformatikerin-daten-und-prozessanalyse-data.pdf.
(19) § 28 FIAusbV - Einzelnorm - Gesetze im Internet. https://www.gesetze-im-internet.de/fiausbv/__28.html.
(20) Hinweise des Prüfungsausschusses zur Projektarbeit. https://www.neubrandenburg.ihk.de/fileadmin/user_upload/Aus_und_Weiterbildung/Ausbildung/Projektarbeit_Fachinformatiker_FR._Daten-_und_Prozessanalyse.pdf.
(21) Datenqualität: Definition und Methoden zur kontinuierlichen .... https://www.acquisa.de/magazin/datenqualitaet.
(22) Datenqualität: Definition, Merkmale und Analyse (Guide) - Kobold AI. https://www.kobold.ai/datenqualitaet-guide/.
(23) Datenqualität: Definition und Methoden zur kontinuierlichen .... https://bing.com/search?q=Sicherstellung+der+Datenqualit%c3%a4t.
(24) Datenqualitätsmanagement: Sicherstellung hoher Datenstandards. https://www.data-analyst.de/glossar/data-quality-management/.
(25) Kundenspezifische Anforderungen CSR - Beratung für Managementsysteme. https://smct-management.de/kundenspezifische-anforderungen-csr-im-sinne-der-iatf-16949/.
(26) CSR Sys - Kundenspezifische Anforderungen verwalten und bewerten. https://smct-management.de/csr-sys-kundenspezifische-anforderungen/.
(27) Beauftragter für Customer Specific Requirements (CSR). https://www.tuev-nord.de/de/weiterbildung/seminare/beauftragter-fuer-customer-specific-requirements-csr-a/.
(28) Kundenspezifische Anforderungen Seminar | Jetzt anfragen! - qdc. https://qdc.de/kundenspezifische-anforderungen-seminar/.

---
To further concretize the points on data identification, classification and modeling and on the use of mathematical models and statistical methods, here are detailed recommendations:

### Identify data sources:
1. Take **stock of the current data**: Record all data that already exists in the company, e.g. customer information, transaction data and device usage data.
2. Check **external data sources**: Examine whether and which external data sources, such as material suppliers or maintenance service providers, could be relevant.
3. **IoT sensor data**: Consider integrating IoT devices that deliver real-time data about the state and usage of the 3D printers.

### Data classification:
1. Define **sensitivity levels**: Determine which data is sensitive (e.g. personal data) and requires a special level of protection.
2. Create **data categories**: Organize the data into categories such as usage data, financial data, operational data, etc.
3. Define **access rights**: Specify who may access which data in order to guarantee data security.

### Entity-relationship models (ERM):
1. Identify **data entities**: Determine the core entities such as users, printers, reservations and materials.
2. Define **relationships**: Specify how these entities relate to each other (e.g. a user can have several reservations).
3. Use **ERM tools**: Use software such as Lucidchart or Microsoft Visio to visualize the ERMs.

### Regression analyses:
1. Collect **historical data**: Use past usage data to identify patterns.
2. Choose **predictive variables**: Identify factors that could influence usage, such as time of day, weekday or material type.
3. Apply **regression models**: Use linear or logistic regression to predict future usage patterns (see the sketch after this list).
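
As a rough illustration of such a regression, a minimal scikit-learn sketch; the two features (hour of day, weekday) and all numbers are invented for this example:

```python
# Minimal sketch of a linear regression on historical usage data (made-up values).
import numpy as np
from sklearn.linear_model import LinearRegression

# Features: hour of day, day of week (0 = Monday); target: jobs started in that hour.
X = np.array([[8, 0], [10, 0], [14, 2], [16, 4], [9, 1], [13, 3]])
y = np.array([2, 4, 5, 3, 2, 4])

model = LinearRegression().fit(X, y)

# Predict the expected number of jobs for Wednesday (day 2) at 11:00.
print(model.predict(np.array([[11, 2]])))
```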
### Cluster analyses:
1. **User segmentation**: Divide users into groups based on their behavior, e.g. by frequency of use or preferred materials.
2. **K-Means clustering**: Use algorithms such as K-Means to segment the users into meaningful clusters (see the sketch after this list).
3. **Cluster validation**: Check the quality of the clustering to make sure the segments are meaningful.
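
A minimal K-Means sketch along these lines; the two per-user features (jobs per month, average job duration in minutes) are assumptions for this example:

```python
# Minimal sketch of user segmentation with K-Means (assumed features and values).
import numpy as np
from sklearn.cluster import KMeans

users = np.array([
    [2, 30],    # occasional user, short jobs
    [3, 45],
    [20, 120],  # heavy user, long jobs
    [18, 150],
    [10, 60],
])

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(users)
print(kmeans.labels_)           # cluster assignment per user
print(kmeans.cluster_centers_)  # cluster centroids
```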
### Time series analyses:
1. Analyze **timestamped data**: Examine data with timestamps to detect trends and patterns over time.
2. Consider **seasonal effects**: Identify seasonal fluctuations in the usage of the 3D printers.
3. **ARIMA models**: Use autoregressive integrated moving average (ARIMA) models to forecast future trends (see the sketch after this list).
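
A minimal ARIMA sketch of such a forecast using statsmodels; the daily job counts are synthetic and the order (1, 1, 1) is only a placeholder, not a tuned model:

```python
# Minimal sketch of an ARIMA forecast on a daily job-count series (synthetic data).
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Daily number of print jobs over two weeks (made-up values).
series = pd.Series(
    [5, 6, 7, 6, 8, 3, 2, 6, 7, 8, 7, 9, 4, 2],
    index=pd.date_range("2024-04-01", periods=14, freq="D"),
)

model = ARIMA(series, order=(1, 1, 1)).fit()
print(model.forecast(steps=7))  # forecast for the next seven days
```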
These methods help you develop a deep understanding of the data, which is essential for a successful implementation of your reservation platform. Keep in mind that the exact application of these techniques depends on the specific data and requirements of your project. It is important to familiarize yourself with the basics of data analysis and statistical modeling in order to apply these methods effectively.

----
This is a [Next.js](https://nextjs.org/) project bootstrapped with [`create-next-app`](https://github.com/vercel/next.js/tree/canary/packages/create-next-app).

## Getting Started

First, run the development server:

```bash
npm run dev
# or
yarn dev
# or
pnpm dev
# or
bun dev
```

Open [http://localhost:3000](http://localhost:3000) with your browser to see the result.

You can start editing the page by modifying `app/page.tsx`. The page auto-updates as you edit the file.

This project uses [`next/font`](https://nextjs.org/docs/basic-features/font-optimization) to automatically optimize and load Inter, a custom Google Font.

## Learn More

To learn more about Next.js, take a look at the following resources:

- [Next.js Documentation](https://nextjs.org/docs) - learn about Next.js features and API.
- [Learn Next.js](https://nextjs.org/learn) - an interactive Next.js tutorial.

You can check out [the Next.js GitHub repository](https://github.com/vercel/next.js/) - your feedback and contributions are welcome!

## Deploy on Vercel

The easiest way to deploy your Next.js app is to use the [Vercel Platform](https://vercel.com/new?utm_medium=default-template&filter=next.js&utm_source=create-next-app&utm_campaign=create-next-app-readme) from the creators of Next.js.

Check out our [Next.js deployment documentation](https://nextjs.org/docs/deployment) for more details.
23
packages/reservation-platform/docker-compose.yml
Normal file
@@ -0,0 +1,23 @@
version: '3'

services:
  frontend:
    build:
      context: .
      dockerfile: Dockerfile
    container_name: myp-frontend
    network_mode: host
    environment:
      - RUNTIME_ENVIRONMENT=${RUNTIME_ENVIRONMENT:-prod}
      - OAUTH_CLIENT_ID=${OAUTH_CLIENT_ID:-client_id}
      - OAUTH_CLIENT_SECRET=${OAUTH_CLIENT_SECRET:-client_secret}
      - NEXT_PUBLIC_API_URL=${NEXT_PUBLIC_API_URL:-http://localhost:5000}
    volumes:
      - ./db:/app/db
    restart: unless-stopped
    healthcheck:
      test: ["CMD", "wget", "--spider", "http://localhost:3000"]
      interval: 30s
      timeout: 10s
      retries: 3
      start_period: 40s
@@ -1,31 +0,0 @@
#!/bin/bash

# Define image name
MYP_RP_IMAGE_NAME="myp-rp"

# Function to build Docker image
build_image() {
    local image_name=$1
    local dockerfile=$2
    local platform=$3

    echo "Building $image_name Docker image for $platform..."

    docker buildx build --platform $platform -t ${image_name}:latest -f $dockerfile --load .
    if [ $? -eq 0 ]; then
        echo "$image_name Docker image built successfully"
    else
        echo "Error occurred while building $image_name Docker image"
        exit 1
    fi
}

# Create and use a builder instance (if not already created)
BUILDER_NAME="myp-rp-arm64-builder"
docker buildx create --name $BUILDER_NAME --use || docker buildx use $BUILDER_NAME

# Build myp-rp image
build_image "$MYP_RP_IMAGE_NAME" "$PWD/Dockerfile" "linux/arm64"

# Remove the builder instance
docker buildx rm $BUILDER_NAME
@@ -1,8 +0,0 @@
{
    debug
}

m040tbaraspi001.de040.corpintra.net, m040tbaraspi001.de040.corpinter.net {
    reverse_proxy myp-rp:3000
    tls internal
}
@@ -1,19 +0,0 @@
services:
  caddy:
    image: caddy:2.8
    container_name: caddy
    restart: unless-stopped
    ports:
      - 80:80
      - 443:443
    volumes:
      - ./caddy/data:/data
      - ./caddy/config:/config
      - ./caddy/Caddyfile:/etc/caddy/Caddyfile:ro
  myp-rp:
    image: myp-rp:latest
    container_name: myp-rp
    env_file: "/srv/myp-env/github.env"
    volumes:
      - /srv/MYP-DB:/usr/src/app/db
    restart: unless-stopped
@ -1,36 +0,0 @@
#!/bin/bash

# Directory containing the Docker images
IMAGE_DIR="docker/images"

# Load all Docker images from the tar.xz files in the IMAGE_DIR
echo "Loading Docker images from $IMAGE_DIR..."

for image_file in "$IMAGE_DIR"/*.tar.xz; do
    if [ -f "$image_file" ]; then
        echo "Loading Docker image from $image_file..."
        docker load -i "$image_file"

        # Check if the image loading was successful
        if [ $? -ne 0 ]; then
            echo "Error occurred while loading Docker image from $image_file"
            exit 1
        fi
    else
        echo "No Docker image tar.xz files found in $IMAGE_DIR."
    fi
done

# Execute docker compose
echo "Running docker compose..."
docker compose -f "docker/compose.yml" up -d

# Check if the operation was successful
if [ $? -eq 0 ]; then
    echo "Docker compose executed successfully"
else
    echo "Error occurred while executing docker compose"
    exit 1
fi

echo "Deployment completed successfully"
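For orientation, the directory layout the removed script expects (paths taken from the script itself) and the equivalent manual steps; the tree below is an inference, it is not shown in the diff:

```bash
# docker/
# ├── compose.yml
# └── images/
#     ├── caddy_2.8.tar.xz
#     └── myp-rp_latest.tar.xz

# Equivalent manual steps:
for f in docker/images/*.tar.xz; do docker load -i "$f"; done
docker compose -f docker/compose.yml up -d
```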
@ -1,2 +0,0 @@
caddy_2.8.tar.xz filter=lfs diff=lfs merge=lfs -text
myp-rp_latest.tar.xz filter=lfs diff=lfs merge=lfs -text
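A minimal sketch, assuming git-lfs is installed, of fetching these LFS-tracked archives and loading them into Docker; `docker load` accepts xz-compressed archives directly, which is also how the removed deploy script uses them:

```bash
git lfs pull --include="packages/reservation-platform/docker/images/*.tar.xz"
docker load -i packages/reservation-platform/docker/images/caddy_2.8.tar.xz
docker load -i packages/reservation-platform/docker/images/myp-rp_latest.tar.xz
```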
BIN
packages/reservation-platform/docker/images/caddy_2.8.tar.xz
(Stored with Git LFS)
Binary file not shown.
BIN
packages/reservation-platform/docker/images/myp-rp_latest.tar.xz
(Stored with Git LFS)
Binary file not shown.
@ -1,68 +0,0 @@
#!/bin/bash

# Get image name as argument
IMAGE_NAME=$1
PLATFORM="linux/arm64"

# Define paths
IMAGE_DIR="docker/images"
IMAGE_FILE="${IMAGE_DIR}/${IMAGE_NAME//[:\/]/_}.tar"
COMPRESSED_FILE="${IMAGE_FILE}.xz"

# Function to pull the image
pull_image() {
    local image=$1
    if [[ $image == arm64v8/* ]]; then
        echo "Pulling image $image without platform specification..."
        docker pull $image
    else
        echo "Pulling image $image for platform $PLATFORM..."
        docker pull --platform $PLATFORM $image
    fi
    return $?
}

# Pull the image if it is not available locally
if ! docker image inspect ${IMAGE_NAME} &>/dev/null; then
    if pull_image ${IMAGE_NAME}; then
        echo "Image $IMAGE_NAME pulled successfully."
    else
        echo "Error occurred while pulling $IMAGE_NAME for platform $PLATFORM"
        echo "Trying to pull $IMAGE_NAME without platform specification..."

        # Attempt to pull again without platform
        if pull_image ${IMAGE_NAME}; then
            echo "Image $IMAGE_NAME pulled successfully without platform."
        else
            echo "Error occurred while pulling $IMAGE_NAME without platform."
            echo "Trying to pull arm64v8/${IMAGE_NAME} instead..."

            # Construct new image name
            NEW_IMAGE_NAME="arm64v8/${IMAGE_NAME}"
            if pull_image ${NEW_IMAGE_NAME}; then
                echo "Image $NEW_IMAGE_NAME pulled successfully."
                IMAGE_NAME=${NEW_IMAGE_NAME} # Update IMAGE_NAME to use the new one
            else
                echo "Error occurred while pulling $NEW_IMAGE_NAME"
                exit 1
            fi
        fi
    fi
else
    echo "Image $IMAGE_NAME found locally. Skipping pull."
fi

# Save the Docker image
echo "Saving $IMAGE_NAME Docker image..."
docker save ${IMAGE_NAME} > $IMAGE_FILE

# Compress the Docker image (overwriting if file exists)
echo "Compressing $IMAGE_FILE..."
xz -z --force $IMAGE_FILE

if [ $? -eq 0 ]; then
    echo "$IMAGE_NAME Docker image saved and compressed successfully as $COMPRESSED_FILE"
else
    echo "Error occurred while compressing $IMAGE_NAME Docker image"
    exit 1
fi
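A hypothetical invocation of the removed script; its original filename is not shown in this diff, so `save-image.sh` is an assumed name. The output filenames follow from the `${IMAGE_NAME//[:\/]/_}` substitution in the script and match the LFS archives above:

```bash
./save-image.sh caddy:2.8        # -> docker/images/caddy_2.8.tar.xz
./save-image.sh myp-rp:latest    # -> docker/images/myp-rp_latest.tar.xz
```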
Some files were not shown because too many files have changed in this diff.