torben frontend restored

This commit is contained in:
parent 1f6feafecc
commit cef6abec32

INSTALL.md | 242 lines removed

@ -1,242 +0,0 @@
# MYP System - Installation Guide

This document describes the installation of the MYP system, which consists of a frontend and a backend.

## System Requirements

- **Frontend**:
  - Raspberry Pi 3B+ or newer (recommended: Pi 4 with at least 2 GB RAM)
  - Raspbian/Raspberry Pi OS (64-bit recommended)
  - Docker and Docker Compose (installed automatically)
  - Internet access for the installation
  - Two network interfaces:
    - One with internet access
    - One for the connection to the backend network

- **Backend**:
  - Raspberry Pi 3B+ or newer (recommended: Pi 4 with at least 2 GB RAM)
  - Raspbian/Raspberry Pi OS (64-bit recommended)
  - Docker and Docker Compose (installed automatically)
  - Connection to the printer network

## 1. Installing the Backend

The backend is installed on the first Raspberry Pi, the one connected to the smart plugs.

```bash
# Copy the code to the Raspberry Pi
scp -r /path/to/project-directory pi@raspberry-backend:/home/pi/myp

# Open an SSH connection
ssh pi@raspberry-backend

# Change into the project directory
cd /home/pi/myp

# Make the installation script executable and run it
chmod +x install-backend.sh
./install-backend.sh
```

The script performs the following tasks:
- Installs Docker and Docker Compose (if not already present)
- Creates the required directories and files
- Builds and starts the Docker container
- Initializes the database
- Verifies that the service is running correctly

After a successful installation, the backend is reachable at `http://raspberry-backend:5000`.

## 2. Installing the Frontend

The frontend is installed on the second Raspberry Pi, the one connected to the corporate network.

```bash
# Copy the code to the Raspberry Pi
scp -r /path/to/project-directory pi@raspberry-frontend:/home/pi/myp

# Open an SSH connection
ssh pi@raspberry-frontend

# Change into the project directory
cd /home/pi/myp

# Make the installation script executable and run it
chmod +x install-frontend.sh
./install-frontend.sh
```

The script performs the following tasks:
- Installs Docker and Docker Compose (if not already present)
- Creates the required directories and files
- Builds and starts the Docker container
- Verifies that the service is running correctly

After a successful installation, the frontend is reachable at `http://raspberry-frontend:3000`.

## 3. Configuring the Connection Between Frontend and Backend

For the frontend and backend to communicate, the API URL must be configured in the frontend:

1. Edit the file `/home/pi/myp/packages/reservation-platform/.env` on the frontend Raspberry Pi:

```
# Basic Server Configuration
RUNTIME_ENVIRONMENT=prod
DB_PATH=db/sqlite.db

# OAuth Configuration
OAUTH_CLIENT_ID=client_id
OAUTH_CLIENT_SECRET=client_secret

# Backend URL (hostname or IP address of the backend Raspberry Pi)
NEXT_PUBLIC_API_URL=http://raspberry-backend:5000
```

2. Restart the frontend container:

```bash
cd /home/pi/myp/packages/reservation-platform
docker-compose down
docker-compose up -d
```
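For reference, the `.env` file above is plain `KEY=VALUE` lines with `#` comments. A minimal sketch of how such a file is parsed (the helper name is ours, not from the project):

```python
def parse_env(text):
    """Parse simple KEY=VALUE lines, skipping blanks and # comments."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        # split on the first '=' only, so values may contain '=' or URLs
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
    return env

sample = """
# Backend URL
NEXT_PUBLIC_API_URL=http://raspberry-backend:5000
RUNTIME_ENVIRONMENT=prod
"""
print(parse_env(sample)["NEXT_PUBLIC_API_URL"])  # → http://raspberry-backend:5000
```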
## 4. Maintenance and Troubleshooting

### Viewing Logs

**Backend:**
```bash
docker logs -f myp-backend
```

**Frontend:**
```bash
docker logs -f myp-frontend
```

### Restarting Containers

**Backend:**
```bash
cd /path/to/backend
docker-compose restart
```

**Frontend:**
```bash
cd /path/to/frontend
docker-compose restart
```

### Database Reset

If the database needs to be reset:

```bash
# On the backend Raspberry Pi
cd /home/pi/myp/backend
docker-compose down
rm -f instance/myp.db
docker-compose up -d
```

### Docker Compose YAML Errors

If you get a YAML error in the Docker Compose file:

```
yaml: line 12: did not find expected key
```

Check the following points:
1. The Docker Compose version may be outdated. The installation scripts automatically install the correct version.
2. There may be a syntax error in the YAML file. Pay particular attention to complex values such as JSON strings.

To fix it:
```bash
# On the affected server
cd /home/pi/myp
# For the backend
nano backend/docker-compose.yml
# For the frontend
nano packages/reservation-platform/docker-compose.yml
```

### Docker Daemon Not Running

If you get an error message saying that the Docker daemon is not running:

```
Cannot connect to the Docker daemon at unix:///var/run/docker.sock. Is the docker daemon running?
```

Start the Docker daemon:
```bash
sudo systemctl start docker
# or
sudo service docker start
```

### Container Does Not Start

If the container does not start, check the logs:

```bash
docker logs myp-backend
# or
docker logs myp-frontend
```

### Frontend Cannot Reach the Backend

1. Make sure both servers are on the same network
2. Check the firewall settings
3. Make sure the backend service is running on port 5000
4. Make sure the correct backend URL is set in the frontend's `.env` file
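Point 3 can be checked quickly without installing anything, using a small TCP probe (a sketch; `raspberry-backend` is the example hostname used throughout this guide):

```python
import socket

def port_open(host, port, timeout=3.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # covers refused connections, timeouts, and DNS failures
        return False

# Example: check whether the backend answers on port 5000
print(port_open("raspberry-backend", 5000))
```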
## 5. Automatic Start at Boot

The Docker containers are configured to start automatically when the devices reboot (`restart: unless-stopped`).

If this does not work, the start command can be added to `/etc/rc.local`. Note that editing the file requires root privileges, and the commands must be inserted *before* the final `exit 0` line, otherwise they never run:

```bash
# On the backend Raspberry Pi
sudo sed -i '/^exit 0/i cd /home/pi/myp/backend && docker-compose up -d' /etc/rc.local

# On the frontend Raspberry Pi
sudo sed -i '/^exit 0/i cd /home/pi/myp/packages/reservation-platform && docker-compose up -d' /etc/rc.local
```
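A more robust alternative to `rc.local` on modern Raspberry Pi OS is a systemd unit (a sketch; the unit name and paths mirror the backend example above and are not part of the project):

```ini
# /etc/systemd/system/myp-backend.service
[Unit]
Description=MYP backend containers
Requires=docker.service
After=docker.service

[Service]
Type=oneshot
RemainAfterExit=yes
WorkingDirectory=/home/pi/myp/backend
ExecStart=/usr/local/bin/docker-compose up -d
ExecStop=/usr/local/bin/docker-compose down

[Install]
WantedBy=multi-user.target
```

Enable it with `sudo systemctl enable myp-backend.service`.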
## 6. Technical Details

- The backend is a Flask application that communicates with the smart plugs
- The frontend is a Next.js application
- Both components run in Docker containers with host networking
- The databases are persisted in Docker volumes
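The host-networking and volume setup described above corresponds roughly to a compose file like the following (a sketch, not the project's actual file; service and volume names are illustrative):

```yaml
version: "3"
services:
  myp-backend:
    build: .
    container_name: myp-backend
    network_mode: host        # host networking, as noted above
    restart: unless-stopped   # auto-start after reboot
    volumes:
      - backend-data:/app/instance   # database persisted in a named volume
volumes:
  backend-data:
```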
## 7. Raspberry Pi-Specific Notes

If you run into problems with the Docker installation on the Raspberry Pi, you can perform the following steps manually:

```bash
# Install Docker for Raspberry Pi
curl -fsSL https://get.docker.com -o get-docker.sh
sudo sh get-docker.sh
sudo usermod -aG docker $USER

# Install Docker Compose for the correct architecture
# For 32-bit (armhf):
sudo curl -L "https://github.com/docker/compose/releases/download/1.29.2/docker-compose-Linux-armv7" -o /usr/local/bin/docker-compose

# For 64-bit (arm64):
sudo curl -L "https://github.com/docker/compose/releases/download/1.29.2/docker-compose-Linux-aarch64" -o /usr/local/bin/docker-compose

sudo chmod +x /usr/local/bin/docker-compose
```

## 8. Support

For questions or problems, contact:
- Till Tomczak (project developer)
@ -1,79 +0,0 @@
# Frontend Restoration and Installation

This guide explains how to reset the frontend to the state of Torben's last commit and install it.

## Available Scripts

Three scripts were created to make restoring the frontend easier:

1. `fix-frontend-install.sh` - master script that runs both scripts below
2. `packages/restore-torben-frontend.sh` - resets the frontend to Torben's last commit
3. `packages/install-torben-frontend.sh` - installs the dependencies of the restored frontend

## Quick Solution

For the quickest solution, simply run the master script:

```bash
chmod +x fix-frontend-install.sh
./fix-frontend-install.sh
```

The script will:
1. Back up the current frontend directory (optional)
2. Reset the frontend to Torben's last commit (May 27, 2024)
3. Commit the changes (optional)
4. Install the frontend dependencies
5. Build the frontend to verify the installation

## Manual Steps

If you prefer to run the steps manually:

### 1. Reset the frontend

```bash
chmod +x packages/restore-torben-frontend.sh
./packages/restore-torben-frontend.sh
```

### 2. Install the frontend

```bash
chmod +x packages/install-torben-frontend.sh
./packages/install-torben-frontend.sh
```

## Starting the System

### Start the frontend

```bash
cd packages/reservation-platform
pnpm dev
```

The frontend is then reachable at http://localhost:3000.

### Start the backend

```bash
cd backend
source venv/bin/activate
python app.py
```

The backend then runs at http://localhost:5000.

### Configure backend autostart

To start the backend automatically at boot:

```bash
sudo ./backend/autostart-backend.sh
```

## Known Problems

- If errors occur when starting the frontend, check the `.env` file in packages/reservation-platform/
- Make sure the backend is reachable at the URL configured in NEXT_PUBLIC_API_URL
@ -1 +0,0 @@
PRINTER_IPS=192.168.0.10,192.168.0.11,192.168.0.12
@ -1,106 +0,0 @@
# 🖨️ 3D Printer Status API 📊

Welcome to the blueprint of the 3D Printer Status API! This API lets you monitor the status of several 3D printers connected via LAN and send print jobs to them.

## 🌟 Features

- 🔍 Retrieve the status of 3D printers, including their current state, progress, and temperature.
- 📥 Send print jobs to available 3D printers.
- 💾 Store and update each printer's status in a SQLite database.

## 🛠️ Technologies Used

- 🐍 Python
- 🌶️ Flask
- 🗄️ SQLite
- 🌐 HTTP requests

## 📋 Prerequisites

Before starting the API, make sure you have:

- Python 3.x installed
- The Flask and python-dotenv libraries installed (`pip install flask python-dotenv`)
- A list of the IP addresses of the 3D printers you want to monitor

## 🚀 Getting Started

1. Clone the repository:
```
git clone https://git.i.mercedes-benz.com/TBA-Berlin-FI/MYP
```

2. Install the required dependencies:
```
pip install -r requirements.txt
```

3. Create a `.env` file in the project directory and list the IP addresses of your 3D printers:
```
PRINTER_IPS=192.168.0.10,192.168.0.11,192.168.0.12
```

4. Run the script to create the SQLite database:
```
python create_db.py
```

5. Start the API server:
```
python app.py
```

6. The API is reachable at `http://localhost:5000`.

## 📡 API Endpoints

- `GET /printer_status`: Retrieve the status of all 3D printers.
- `POST /print_job`: Send a print job to a specific 3D printer.

## 📝 API Usage

### Retrieving printer status

Send a `GET` request to `/printer_status` to retrieve the status of all 3D printers.

Response:
```json
[
  {
    "ip": "192.168.0.10",
    "status": "frei",
    "progress": 0,
    "temperature": 25
  },
  {
    "ip": "192.168.0.11",
    "status": "besetzt",
    "progress": 50,
    "temperature": 180
  },
  ...
]
```
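A client consuming the response above typically filters for available printers first; a minimal sketch (the helper name is ours, and `"frei"` is the API's German status value for "free"):

```python
def free_printers(status_list):
    """Return the IPs of printers reported as 'frei' (free)."""
    return [p["ip"] for p in status_list if p["status"] == "frei"]

response = [
    {"ip": "192.168.0.10", "status": "frei", "progress": 0, "temperature": 25},
    {"ip": "192.168.0.11", "status": "besetzt", "progress": 50, "temperature": 180},
]
print(free_printers(response))  # → ['192.168.0.10']
```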
### Sending a print job

Send a `POST` request to `/print_job` with the following JSON payload to send a print job to a specific 3D printer:

```json
{
  "printer_ip": "192.168.0.10",
  "file_url": "http://example.com/print_file.gcode"
}
```

Response:
```json
{
  "message": "Druckauftrag gestartet"
}
```

## 📄 License

Not yet available.
@ -1,25 +0,0 @@
import sqlite3
from dotenv import load_dotenv
import os

load_dotenv()
printers = os.getenv('PRINTER_IPS').split(',')

def create_db():
    conn = sqlite3.connect('printers.db')
    c = conn.cursor()

    # Create the 'printers' table if it does not exist yet
    c.execute('''CREATE TABLE IF NOT EXISTS printers
                 (ip TEXT PRIMARY KEY, status TEXT)''')

    # Insert the printer IPs into the table if they are not present yet
    for printer_ip in printers:
        c.execute("INSERT OR IGNORE INTO printers (ip, status) VALUES (?, ?)", (printer_ip, "frei"))

    conn.commit()
    conn.close()
    print("Datenbank 'printers.db' erfolgreich erstellt.")

if __name__ == '__main__':
    create_db()
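Once the database exists, the stored statuses can be inspected directly with the `sqlite3` standard library; a small sketch (the function name is ours, demonstrated against an in-memory database with the same schema):

```python
import sqlite3

def read_statuses(conn):
    """Return {ip: status} for all rows in the printers table."""
    return dict(conn.execute("SELECT ip, status FROM printers").fetchall())

# Example with the same schema as create_db()
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE printers (ip TEXT PRIMARY KEY, status TEXT)")
conn.execute("INSERT INTO printers VALUES ('192.168.0.10', 'frei')")
conn.commit()
print(read_statuses(conn))  # → {'192.168.0.10': 'frei'}
```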
@ -1,3 +0,0 @@
flask==2.1.0
requests==2.25.1
python-dotenv==0.20.0
@ -1,94 +0,0 @@
from flask import Flask, jsonify, request
import requests
import sqlite3
from dotenv import load_dotenv
import os

load_dotenv()
printers = os.getenv('PRINTER_IPS').split(',')

app = Flask(__name__)

# Initialize the SQLite database
def init_db():
    conn = sqlite3.connect('printers.db')
    c = conn.cursor()
    c.execute('''CREATE TABLE IF NOT EXISTS printers
                 (ip TEXT PRIMARY KEY, status TEXT)''')
    for printer_ip in printers:
        c.execute("INSERT OR IGNORE INTO printers (ip, status) VALUES (?, ?)", (printer_ip, "frei"))
    conn.commit()
    conn.close()

@app.route('/printer_status', methods=['GET'])
def get_printer_status():
    printer_status = []
    conn = sqlite3.connect('printers.db')
    c = conn.cursor()

    for printer_ip in printers:
        c.execute("SELECT status FROM printers WHERE ip = ?", (printer_ip,))
        status = c.fetchone()[0]

        try:
            # a finite timeout prevents one unreachable printer from blocking the request
            response = requests.get(f"http://{printer_ip}/api/printer/status", timeout=5)

            if response.status_code == 200:
                status_data = response.json()
                printer_status.append({
                    "ip": printer_ip,
                    "status": status,
                    "progress": status_data["progress"],
                    "temperature": status_data["temperature"]
                })
            else:
                printer_status.append({
                    "ip": printer_ip,
                    "status": "Fehler bei der Abfrage",
                    "progress": None,
                    "temperature": None
                })
        except requests.RequestException:
            printer_status.append({
                "ip": printer_ip,
                "status": "Drucker nicht erreichbar",
                "progress": None,
                "temperature": None
            })

    conn.close()
    return jsonify(printer_status)

@app.route('/print_job', methods=['POST'])
def submit_print_job():
    print_job = request.json
    printer_ip = print_job["printer_ip"]
    file_url = print_job["file_url"]

    conn = sqlite3.connect('printers.db')
    c = conn.cursor()
    c.execute("SELECT status FROM printers WHERE ip = ?", (printer_ip,))
    status = c.fetchone()[0]

    if status == "frei":
        try:
            response = requests.post(f"http://{printer_ip}/api/print_job", json={"file_url": file_url}, timeout=5)

            if response.status_code == 200:
                c.execute("UPDATE printers SET status = 'besetzt' WHERE ip = ?", (printer_ip,))
                conn.commit()
                conn.close()
                return jsonify({"message": "Druckauftrag gestartet"}), 200
            else:
                conn.close()
                return jsonify({"message": "Fehler beim Starten des Druckauftrags"}), 500
        except requests.RequestException:
            conn.close()
            return jsonify({"message": "Drucker nicht erreichbar"}), 500
    else:
        conn.close()
        return jsonify({"message": "Drucker ist nicht frei"}), 400

if __name__ == '__main__':
    init_db()
    app.run(host='0.0.0.0', port=5000)
@ -1,38 +0,0 @@
# borrowed from:
# https://github.com/ut-hnl-lab/ultimakerpy

# also worth reading:
# https://github.com/MartinBienz/SDPremote?tab=readme-ov-file

import time
from ultimakerpy import UMS3, JobState

def print_started(state):
    if state == JobState.PRINTING:
        time.sleep(6.0)
        return True
    return False

def layer_reached(pos, n):
    if round(pos / 0.2) >= n:  # set layer pitch: 0.2 mm
        return True
    return False

printer = UMS3(name='MyPrinterName')
targets = {
    'job_state': printer.job_state,
    'bed_pos': printer.bed.position,
}

printer.print_from_dialog()  # select file to print
printer.peripherals.camera_streaming()
with printer.data_logger('output2.csv', targets) as dl:
    timer = dl.get_timer()

    # sleep until active leveling finishes
    timer.wait_for_datalog('job_state', print_started)

    for n in range(1, 101):
        # sleep until the printing of the specified layer starts
        timer.wait_for_datalog('bed_pos', lambda v: layer_reached(v, n))
        print('printing layer:', n)
@ -1,148 +0,0 @@
from flask import Flask, render_template, request, redirect, url_for, jsonify, session
import sqlite3
import bcrypt

app = Flask(__name__)
app.secret_key = 'supersecretkey'

# Database setup
def init_db():
    conn = sqlite3.connect('database.db')
    c = conn.cursor()
    c.execute('''CREATE TABLE IF NOT EXISTS users (id INTEGER PRIMARY KEY, username TEXT, password TEXT)''')
    c.execute('''CREATE TABLE IF NOT EXISTS printers (id INTEGER PRIMARY KEY, name TEXT, status TEXT)''')
    c.execute('''CREATE TABLE IF NOT EXISTS jobs (id INTEGER PRIMARY KEY, printer_id INTEGER, user TEXT, date TEXT, status TEXT)''')
    conn.commit()
    conn.close()

init_db()

# User registration (Admin setup)
def add_admin():
    conn = sqlite3.connect('database.db')
    c = conn.cursor()
    hashed_pw = bcrypt.hashpw('adminpassword'.encode('utf-8'), bcrypt.gensalt())
    # store the hash as text so it round-trips cleanly through the TEXT column
    c.execute("INSERT INTO users (username, password) VALUES (?, ?)", ('admin', hashed_pw.decode('utf-8')))
    conn.commit()
    conn.close()

# Comment the next line after the first run
# add_admin()

# API Endpoints
@app.route('/api/printers/status', methods=['GET'])
def get_printer_status():
    conn = sqlite3.connect('database.db')
    c = conn.cursor()
    c.execute("SELECT * FROM printers")
    printers = c.fetchall()
    conn.close()
    return jsonify(printers)

@app.route('/api/printers/job', methods=['POST'])
def create_job():
    if not session.get('logged_in'):
        return jsonify({'error': 'Unauthorized'}), 403

    data = request.json
    user = session['username']
    printer_id = data['printer_id']
    conn = sqlite3.connect('database.db')
    c = conn.cursor()

    c.execute("SELECT status FROM printers WHERE id=?", (printer_id,))
    status = c.fetchone()[0]

    if status == 'frei':
        c.execute("INSERT INTO jobs (printer_id, user, date, status) VALUES (?, ?, datetime('now'), 'in progress')",
                  (printer_id, user))
        c.execute("UPDATE printers SET status='belegt' WHERE id=?", (printer_id,))
        conn.commit()
    elif status == 'belegt':
        conn.close()
        return jsonify({'error': 'Printer already in use'}), 409
    else:
        conn.close()
        return jsonify({'error': 'Invalid printer status'}), 400

    conn.close()
    return jsonify({'message': 'Job created and printer turned on'}), 200

@app.route('/api/printers/reserve', methods=['POST'])
def reserve_printer():
    if not session.get('logged_in'):
        return jsonify({'error': 'Unauthorized'}), 403

    data = request.json
    printer_id = data['printer_id']
    conn = sqlite3.connect('database.db')
    c = conn.cursor()

    c.execute("SELECT status FROM printers WHERE id=?", (printer_id,))
    status = c.fetchone()[0]

    if status == 'frei':
        c.execute("UPDATE printers SET status='reserviert' WHERE id=?", (printer_id,))
        conn.commit()
        message = 'Printer reserved'
    else:
        message = 'Printer cannot be reserved'

    conn.close()
    return jsonify({'message': message}), 200

@app.route('/api/printers/release', methods=['POST'])
def release_printer():
    if not session.get('logged_in'):
        return jsonify({'error': 'Unauthorized'}), 403

    data = request.json
    printer_id = data['printer_id']
    conn = sqlite3.connect('database.db')
    c = conn.cursor()

    c.execute("UPDATE printers SET status='frei' WHERE id=?", (printer_id,))
    conn.commit()
    conn.close()
    return jsonify({'message': 'Printer released'}), 200

# Authentication routes
@app.route('/login', methods=['GET', 'POST'])
def login():
    if request.method == 'POST':
        username = request.form['username']
        password = request.form['password'].encode('utf-8')

        conn = sqlite3.connect('database.db')
        c = conn.cursor()
        c.execute("SELECT * FROM users WHERE username=?", (username,))
        user = c.fetchone()
        conn.close()

        if user and bcrypt.checkpw(password, user[2].encode('utf-8')):
            session['logged_in'] = True
            session['username'] = username
            return redirect(url_for('dashboard'))
        else:
            return render_template('login.html', error='Invalid Credentials')

    return render_template('login.html')

@app.route('/dashboard')
def dashboard():
    if not session.get('logged_in'):
        return redirect(url_for('login'))

    conn = sqlite3.connect('database.db')
    c = conn.cursor()
    c.execute("SELECT * FROM printers")
    printers = c.fetchall()
    conn.close()

    return render_template('dashboard.html', printers=printers)

@app.route('/logout')
def logout():
    session.clear()
    return redirect(url_for('login'))

if __name__ == '__main__':
    app.run(debug=True)
@ -1,20 +0,0 @@
<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <title>3D Printer Management</title>
    <link href="https://cdn.jsdelivr.net/npm/tailwindcss@2.2.19/dist/tailwind.min.css" rel="stylesheet">
    <link href="https://cdn.jsdelivr.net/npm/daisyui@1.14.0/dist/full.css" rel="stylesheet">
</head>
<body class="bg-black text-white">
    <nav class="bg-gray-800 p-4">
        <div class="container mx-auto">
            <h1 class="text-xl">3D Printer Management Dashboard</h1>
        </div>
    </nav>
    <div class="container mx-auto mt-5">
        {% block content %}{% endblock %}
    </div>
</body>
</html>
@ -1,29 +0,0 @@
{% extends "base.html" %}

{% block content %}
<h2 class="text-2xl mb-4">Printer Status</h2>
<div class="grid grid-cols-1 md:grid-cols-2 lg:grid-cols-3 gap-4">
    {% for printer in printers %}
    <div class="card bg-gray-900 shadow-xl">
        <div class="card-body">
            <h2 class="card-title">{{ printer[1] }}</h2>
            <p>Status: {{ printer[2] }}</p>
            {% if printer[2] == 'frei' %}
            <form method="POST" action="/api/printers/job">
                <input type="hidden" name="printer_id" value="{{ printer[0] }}">
                <button class="btn btn-success mt-4 w-full">Start Job</button>
            </form>
            {% elif printer[2] == 'belegt' %}
            <button class="btn btn-warning mt-4 w-full" disabled>In Use</button>
            {% elif printer[2] == 'reserviert' %}
            <form method="POST" action="/api/printers/release">
                <input type="hidden" name="printer_id" value="{{ printer[0] }}">
                <button class="btn btn-info mt-4 w-full">Release</button>
            </form>
            {% endif %}
        </div>
    </div>
    {% endfor %}
</div>
<a href="/logout" class="btn btn-secondary mt-4">Logout</a>
{% endblock %}
@ -1,33 +0,0 @@
{% extends "base.html" %}

{% block content %}
<div class="flex justify-center items-center h-screen">
    <div class="card w-96 bg-gray-900 shadow-xl">
        <div class="card-body">
            <h2 class="card-title">Login</h2>
            <form method="POST">
                <div class="form-control">
                    <label class="label">
                        <span class="label-text">Username</span>
                    </label>
                    <input type="text" name="username" class="input input-bordered w-full" required>
                </div>
                <div class="form-control">
                    <label class="label">
                        <span class="label-text">Password</span>
                    </label>
                    <input type="password" name="password" class="input input-bordered w-full" required>
                </div>
                <div class="form-control mt-6">
                    <button class="btn btn-primary w-full">Login</button>
                </div>
            </form>
            {% if error %}
            <div class="mt-4 text-red-500">
                {{ error }}
            </div>
            {% endif %}
        </div>
    </div>
</div>
{% endblock %}
@ -1,3 +0,0 @@
SECRET_KEY=dev-secret-key-change-in-production
DATABASE_URL=sqlite:///app.db
JWT_SECRET=dev-jwt-secret-change-in-production
@ -1,3 +0,0 @@
SECRET_KEY=change-me-to-a-real-secret-key
DATABASE_URL=sqlite:///app.db
JWT_SECRET=change-me-to-a-real-jwt-secret
@ -1,20 +0,0 @@
FROM python:3.11-slim

WORKDIR /app

# Install dependencies
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy application code
COPY . .

# Prepare the instance directory and Flask entry point
RUN mkdir -p /app/instance
ENV FLASK_APP=wsgi.py

# Expose port
EXPOSE 5000

# Run the application
CMD ["gunicorn", "--bind", "0.0.0.0:5000", "wsgi:app"]
@ -1,96 +0,0 @@
# Reservation Platform Backend

This is the Flask backend for the 3D Printer Reservation Platform, providing a RESTful API for managing printers, reservations, and users.

## Features

- User authentication with email/password
- Role-based permission system (admin, user)
- Printer management
- Reservation system
- User management

## API Endpoints

### Authentication
- `POST /auth/register` - Register a new user
- `POST /auth/login` - Login with username/email and password
- `POST /auth/logout` - Log out a user by invalidating their session

### Printers
- `GET /api/printers` - Get all printers
- `GET /api/printers/<printer_id>` - Get a specific printer
- `POST /api/printers` - Create a new printer (admin only)
- `PUT /api/printers/<printer_id>` - Update a printer (admin only)
- `DELETE /api/printers/<printer_id>` - Delete a printer (admin only)
- `GET /api/printers/availability` - Get availability information for all printers

### Print Jobs
- `GET /api/jobs` - Get jobs for the current user or all jobs for admin
- `GET /api/jobs/<job_id>` - Get a specific job
- `POST /api/jobs` - Create a new print job (reserve a printer)
- `PUT /api/jobs/<job_id>` - Update a job
- `DELETE /api/jobs/<job_id>` - Delete a job (cancel reservation)
- `GET /api/jobs/<job_id>/remaining-time` - Get remaining time for a job (public endpoint)

### Users
- `GET /api/users` - Get all users (admin only)
- `GET /api/users/<user_id>` - Get a specific user (admin only)
- `PUT /api/users/<user_id>` - Update a user (admin only)
- `DELETE /api/users/<user_id>` - Delete a user (admin only)
- `GET /api/me` - Get the current user's profile
- `PUT /api/me` - Update the current user's profile

## Installation

### Prerequisites
- Python 3.11 or higher
- pip

### Setup

1. Clone the repository
```bash
git clone https://github.com/your-repo/reservation-platform.git
cd reservation-platform/packages/flask-backend
```

2. Install dependencies
```bash
pip install -r requirements.txt
```

3. Create a `.env` file with the following variables:
```
SECRET_KEY=your-secret-key
DATABASE_URL=sqlite:///app.db
JWT_SECRET=your-jwt-secret
```

4. Initialize the database
```bash
flask db upgrade
python scripts/init_db.py
```

5. Run the development server
```bash
python wsgi.py
```

## Docker Deployment

1. Build and run with Docker Compose
```bash
docker-compose up -d
```

## Development

### Running Migrations

To create a new migration after updating models:
```bash
flask db migrate -m "Description of changes"
flask db upgrade
```
@ -1,32 +0,0 @@
from flask import Flask
from flask_sqlalchemy import SQLAlchemy
from flask_migrate import Migrate
from flask_cors import CORS
from config import Config

db = SQLAlchemy()
migrate = Migrate()

def create_app(config_class=Config):
    app = Flask(__name__)
    app.config.from_object(config_class)

    # Initialize extensions
    db.init_app(app)
    migrate.init_app(app, db)
    CORS(app)

    # Register blueprints
    from app.api import bp as api_bp
    app.register_blueprint(api_bp, url_prefix='/api')

    from app.auth import bp as auth_bp
    app.register_blueprint(auth_bp, url_prefix='/auth')

    @app.route('/health')
    def health_check():
        return {'status': 'ok'}

    return app

from app import models
@ -1,5 +0,0 @@
from flask import Blueprint

bp = Blueprint('api', __name__)

from app.api import printers, jobs, users
@ -1,219 +0,0 @@
from flask import request, jsonify
from app import db
from app.api import bp
from app.models import PrintJob, Printer, User
from app.auth.routes import token_required, admin_required
from datetime import datetime, timedelta


@bp.route('/jobs', methods=['GET'])
@token_required
def get_jobs():
    """Get jobs for the current user or all jobs for admin"""
    is_admin = request.user_role == 'admin'
    user_id = request.user_id

    # Parse query parameters
    status = request.args.get('status')  # active, upcoming, completed, aborted, all
    printer_id = request.args.get('printer_id')

    # Base query
    query = PrintJob.query

    # Filter by user unless admin
    if not is_admin:
        query = query.filter_by(user_id=user_id)

    # Filter by printer if provided
    if printer_id:
        query = query.filter_by(printer_id=printer_id)

    # Apply status filter
    now = datetime.utcnow()
    if status == 'active':
        query = query.filter_by(aborted=False) \
            .filter(PrintJob.start_at <= now) \
            .filter(PrintJob.start_at.op('+')(PrintJob.duration_in_minutes * 60) > now)
    elif status == 'upcoming':
        query = query.filter_by(aborted=False) \
            .filter(PrintJob.start_at > now)
    elif status == 'completed':
        query = query.filter_by(aborted=False) \
            .filter(PrintJob.start_at.op('+')(PrintJob.duration_in_minutes * 60) <= now)
    elif status == 'aborted':
        query = query.filter_by(aborted=True)

    # Order by start time, most recent first
    query = query.order_by(PrintJob.start_at.desc())

    # Execute query
    jobs = query.all()
    result = [job.to_dict() for job in jobs]

    return jsonify(result)


@bp.route('/jobs/<job_id>', methods=['GET'])
@token_required
def get_job(job_id):
    """Get a specific job"""
    job = PrintJob.query.get_or_404(job_id)

    # Check permissions
    is_admin = request.user_role == 'admin'
    user_id = request.user_id

    if not is_admin and job.user_id != user_id:
        return jsonify({'error': 'Not authorized to view this job'}), 403

    return jsonify(job.to_dict())


@bp.route('/jobs', methods=['POST'])
@token_required
def create_job():
    """Create a new print job (reserve a printer)"""
    data = request.get_json() or {}

    required_fields = ['printer_id', 'start_at', 'duration_in_minutes']
    for field in required_fields:
        if field not in data:
            return jsonify({'error': f'Missing required field: {field}'}), 400

    # Validate printer
    printer = Printer.query.get(data['printer_id'])
    if not printer:
        return jsonify({'error': 'Printer not found'}), 404

    if printer.status != 0:  # Not operational
        return jsonify({'error': 'Printer is not operational'}), 400

    # Parse start time
    try:
        start_at = datetime.fromisoformat(data['start_at'].replace('Z', '+00:00'))
    except ValueError:
        return jsonify({'error': 'Invalid start_at format'}), 400

    # Validate duration
    try:
        duration = int(data['duration_in_minutes'])
        if duration <= 0 or duration > 480:  # Max 8 hours
            return jsonify({'error': 'Invalid duration (must be between 1 and 480 minutes)'}), 400
    except ValueError:
        return jsonify({'error': 'Duration must be a number'}), 400

    end_at = start_at + timedelta(minutes=duration)

    # Check if the printer is available during the requested time
    conflicting_jobs = PrintJob.query.filter_by(printer_id=printer.id, aborted=False) \
        .filter(
            (PrintJob.start_at < end_at) &
            (PrintJob.start_at.op('+')(PrintJob.duration_in_minutes * 60) > start_at)
        ) \
        .all()

    if conflicting_jobs:
        return jsonify({'error': 'Printer is not available during the requested time'}), 409

    # Create job
    job = PrintJob(
        printer_id=data['printer_id'],
        user_id=request.user_id,
        start_at=start_at,
        duration_in_minutes=duration,
        comments=data.get('comments', '')
    )

    db.session.add(job)
    db.session.commit()

    return jsonify(job.to_dict()), 201


@bp.route('/jobs/<job_id>', methods=['PUT'])
@token_required
def update_job(job_id):
    """Update a job"""
    job = PrintJob.query.get_or_404(job_id)

    # Check permissions
    is_admin = request.user_role == 'admin'
    user_id = request.user_id

    if not is_admin and job.user_id != user_id:
        return jsonify({'error': 'Not authorized to update this job'}), 403

    data = request.get_json() or {}

    # Only allow certain fields to be updated
    if 'comments' in data:
        job.comments = data['comments']

    # Admin or owner can abort a job
    if 'aborted' in data and data['aborted'] and not job.aborted:
        job.aborted = True
        job.abort_reason = data.get('abort_reason', '')

    # Admin or owner can extend a job if it's active
    now = datetime.utcnow()
    is_active = (not job.aborted and
                 job.start_at <= now and
                 job.get_end_time() > now)

    if 'extend_minutes' in data and is_active:
        try:
            extend_minutes = int(data['extend_minutes'])
            if extend_minutes <= 0 or extend_minutes > 120:  # Max extend 2 hours
                return jsonify({'error': 'Invalid extension (must be between 1 and 120 minutes)'}), 400

            new_end_time = job.get_end_time() + timedelta(minutes=extend_minutes)

            # Check for conflicts with the extension
            conflicting_jobs = PrintJob.query.filter_by(printer_id=job.printer_id, aborted=False) \
                .filter(PrintJob.id != job.id) \
                .filter(PrintJob.start_at < new_end_time) \
                .filter(PrintJob.start_at > job.get_end_time()) \
                .all()

            if conflicting_jobs:
                return jsonify({'error': 'Cannot extend job due to conflicts with other reservations'}), 409

            job.duration_in_minutes += extend_minutes
        except ValueError:
            return jsonify({'error': 'Extend minutes must be a number'}), 400

    db.session.commit()

    return jsonify(job.to_dict())


@bp.route('/jobs/<job_id>', methods=['DELETE'])
@token_required
def delete_job(job_id):
    """Delete a job (cancel reservation)"""
    job = PrintJob.query.get_or_404(job_id)

    # Check permissions
    is_admin = request.user_role == 'admin'
    user_id = request.user_id

    if not is_admin and job.user_id != user_id:
        return jsonify({'error': 'Not authorized to delete this job'}), 403

    # Only allow deletion of upcoming jobs
    now = datetime.utcnow()
    if job.start_at <= now and not is_admin:
        return jsonify({'error': 'Cannot delete an active or completed job'}), 400

    db.session.delete(job)
    db.session.commit()

    return jsonify({'message': 'Job deleted successfully'})


@bp.route('/jobs/<job_id>/remaining-time', methods=['GET'])
def get_remaining_time(job_id):
    """Get remaining time for a job (public endpoint)"""
    job = PrintJob.query.get_or_404(job_id)

    remaining_seconds = job.get_remaining_time()

    return jsonify({
        'job_id': job.id,
        'remaining_seconds': remaining_seconds,
        'is_active': job.is_active()
    })
@ -1,177 +0,0 @@
from flask import request, jsonify
from app import db
from app.api import bp
from app.models import Printer, PrintJob
from app.auth.routes import token_required, admin_required
from datetime import datetime


@bp.route('/printers', methods=['GET'])
def get_printers():
    """Get all printers"""
    printers = Printer.query.all()
    result = []

    for printer in printers:
        # Get active job for the printer if any
        now = datetime.utcnow()
        active_job = PrintJob.query.filter_by(printer_id=printer.id, aborted=False) \
            .filter(PrintJob.start_at <= now) \
            .filter(PrintJob.start_at.op('+')(PrintJob.duration_in_minutes * 60) > now) \
            .first()

        printer_data = {
            'id': printer.id,
            'name': printer.name,
            'description': printer.description,
            'status': printer.status,
            'is_available': printer.status == 0 and active_job is None,
            'active_job': active_job.to_dict() if active_job else None
        }
        result.append(printer_data)

    return jsonify(result)


@bp.route('/printers/<printer_id>', methods=['GET'])
def get_printer(printer_id):
    """Get a specific printer"""
    printer = Printer.query.get_or_404(printer_id)

    # Get active job for the printer if any
    now = datetime.utcnow()
    active_job = PrintJob.query.filter_by(printer_id=printer.id, aborted=False) \
        .filter(PrintJob.start_at <= now) \
        .filter(PrintJob.start_at.op('+')(PrintJob.duration_in_minutes * 60) > now) \
        .first()

    # Get upcoming jobs
    upcoming_jobs = PrintJob.query.filter_by(printer_id=printer.id, aborted=False) \
        .filter(PrintJob.start_at > now) \
        .order_by(PrintJob.start_at) \
        .limit(5) \
        .all()

    result = {
        'id': printer.id,
        'name': printer.name,
        'description': printer.description,
        'status': printer.status,
        'is_available': printer.status == 0 and active_job is None,
        'active_job': active_job.to_dict() if active_job else None,
        'upcoming_jobs': [job.to_dict() for job in upcoming_jobs]
    }

    return jsonify(result)


@bp.route('/printers', methods=['POST'])
@admin_required
def create_printer():
    """Create a new printer (admin only)"""
    data = request.get_json() or {}

    required_fields = ['name', 'description']
    for field in required_fields:
        if field not in data:
            return jsonify({'error': f'Missing required field: {field}'}), 400

    printer = Printer(
        name=data['name'],
        description=data['description'],
        status=data.get('status', 0)
    )

    db.session.add(printer)
    db.session.commit()

    return jsonify({
        'id': printer.id,
        'name': printer.name,
        'description': printer.description,
        'status': printer.status
    }), 201


@bp.route('/printers/<printer_id>', methods=['PUT'])
@admin_required
def update_printer(printer_id):
    """Update a printer (admin only)"""
    printer = Printer.query.get_or_404(printer_id)
    data = request.get_json() or {}

    if 'name' in data:
        printer.name = data['name']
    if 'description' in data:
        printer.description = data['description']
    if 'status' in data:
        printer.status = data['status']

    db.session.commit()

    return jsonify({
        'id': printer.id,
        'name': printer.name,
        'description': printer.description,
        'status': printer.status
    })


@bp.route('/printers/<printer_id>', methods=['DELETE'])
@admin_required
def delete_printer(printer_id):
    """Delete a printer (admin only)"""
    printer = Printer.query.get_or_404(printer_id)

    # Check if the printer has active jobs
    now = datetime.utcnow()
    active_jobs = PrintJob.query.filter_by(printer_id=printer.id, aborted=False) \
        .filter(PrintJob.start_at <= now) \
        .filter(PrintJob.start_at.op('+')(PrintJob.duration_in_minutes * 60) > now) \
        .all()

    if active_jobs:
        return jsonify({'error': 'Cannot delete printer with active jobs'}), 400

    db.session.delete(printer)
    db.session.commit()

    return jsonify({'message': 'Printer deleted successfully'})


@bp.route('/printers/availability', methods=['GET'])
def get_availability():
    """Get availability information for all printers"""
    start_date = request.args.get('start_date')
    end_date = request.args.get('end_date')

    if not start_date or not end_date:
        return jsonify({'error': 'start_date and end_date are required'}), 400

    try:
        start = datetime.fromisoformat(start_date.replace('Z', '+00:00'))
        end = datetime.fromisoformat(end_date.replace('Z', '+00:00'))
    except ValueError:
        return jsonify({'error': 'Invalid date format'}), 400

    if start >= end:
        return jsonify({'error': 'start_date must be before end_date'}), 400

    printers = Printer.query.all()
    result = []

    for printer in printers:
        # Get all jobs for this printer in the date range
        jobs = PrintJob.query.filter_by(printer_id=printer.id, aborted=False) \
            .filter(
                (PrintJob.start_at <= end) &
                (PrintJob.start_at.op('+')(PrintJob.duration_in_minutes * 60) >= start)
            ) \
            .order_by(PrintJob.start_at) \
            .all()

        # Convert to availability slots
        availability = {
            'printer_id': printer.id,
            'printer_name': printer.name,
            'status': printer.status,
            'jobs': [job.to_dict() for job in jobs]
        }

        result.append(availability)

    return jsonify(result)
@ -1,139 +0,0 @@
from flask import request, jsonify
from app import db
from app.api import bp
from app.models import User, PrintJob
from app.auth.routes import admin_required, token_required


@bp.route('/users', methods=['GET'])
@admin_required
def get_users():
    """Get all users (admin only)"""
    users = User.query.all()
    result = []

    for user in users:
        # Count jobs
        total_jobs = PrintJob.query.filter_by(user_id=user.id).count()
        active_jobs = PrintJob.query.filter_by(user_id=user.id, aborted=False).count()

        user_data = {
            'id': user.id,
            'github_id': user.github_id,
            'username': user.username,
            'display_name': user.display_name,
            'email': user.email,
            'role': user.role,
            'job_count': total_jobs,
            'active_job_count': active_jobs
        }
        result.append(user_data)

    return jsonify(result)


@bp.route('/users/<user_id>', methods=['GET'])
@admin_required
def get_user(user_id):
    """Get a specific user (admin only)"""
    user = User.query.get_or_404(user_id)

    # Count jobs
    total_jobs = PrintJob.query.filter_by(user_id=user.id).count()
    active_jobs = PrintJob.query.filter_by(user_id=user.id, aborted=False).count()

    result = {
        'id': user.id,
        'github_id': user.github_id,
        'username': user.username,
        'display_name': user.display_name,
        'email': user.email,
        'role': user.role,
        'job_count': total_jobs,
        'active_job_count': active_jobs
    }

    return jsonify(result)


@bp.route('/users/<user_id>', methods=['PUT'])
@admin_required
def update_user(user_id):
    """Update a user (admin only)"""
    user = User.query.get_or_404(user_id)
    data = request.get_json() or {}

    if 'role' in data and data['role'] in ['admin', 'user', 'guest']:
        user.role = data['role']

    if 'display_name' in data:
        user.display_name = data['display_name']

    db.session.commit()

    return jsonify({
        'id': user.id,
        'github_id': user.github_id,
        'username': user.username,
        'display_name': user.display_name,
        'email': user.email,
        'role': user.role
    })


@bp.route('/users/<user_id>', methods=['DELETE'])
@admin_required
def delete_user(user_id):
    """Delete a user (admin only)"""
    user = User.query.get_or_404(user_id)

    # Check if user has active jobs
    active_jobs = PrintJob.query.filter_by(user_id=user.id, aborted=False).first()
    if active_jobs:
        return jsonify({'error': 'Cannot delete user with active jobs'}), 400

    db.session.delete(user)
    db.session.commit()

    return jsonify({'message': 'User deleted successfully'})


@bp.route('/me', methods=['GET'])
@token_required
def get_current_user():
    """Get the current user's profile"""
    user = User.query.get(request.user_id)
    if not user:
        return jsonify({'error': 'User not found'}), 404

    result = {
        'id': user.id,
        'github_id': user.github_id,
        'username': user.username,
        'display_name': user.display_name,
        'email': user.email,
        'role': user.role
    }

    return jsonify(result)


@bp.route('/me', methods=['PUT'])
@token_required
def update_current_user():
    """Update the current user's profile"""
    user = User.query.get(request.user_id)
    if not user:
        return jsonify({'error': 'User not found'}), 404

    data = request.get_json() or {}

    if 'display_name' in data:
        user.display_name = data['display_name']

    db.session.commit()

    result = {
        'id': user.id,
        'github_id': user.github_id,
        'username': user.username,
        'display_name': user.display_name,
        'email': user.email,
        'role': user.role
    }

    return jsonify(result)
@ -1,5 +0,0 @@
from flask import Blueprint

bp = Blueprint('auth', __name__)

from app.auth import routes
@ -1,156 +0,0 @@
from flask import request, jsonify, current_app
from app import db
from app.auth import bp
from app.models import User, Session
from datetime import datetime, timedelta
import time
import functools
import re


@bp.route('/register', methods=['POST'])
def register():
    """Register a new user"""
    data = request.get_json() or {}

    # Validate required fields
    required_fields = ['username', 'email', 'password']
    for field in required_fields:
        if field not in data:
            return jsonify({'error': f'Missing required field: {field}'}), 400

    # Validate email format
    email_regex = r'^[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}$'
    if not re.match(email_regex, data['email']):
        return jsonify({'error': 'Invalid email format'}), 400

    # Validate password strength (at least 8 characters)
    if len(data['password']) < 8:
        return jsonify({'error': 'Password must be at least 8 characters long'}), 400

    # Check if username already exists
    if User.query.filter_by(username=data['username']).first():
        return jsonify({'error': 'Username already exists'}), 400

    # Check if email already exists
    if User.query.filter_by(email=data['email']).first():
        return jsonify({'error': 'Email already exists'}), 400

    # Create new user
    user = User(
        username=data['username'],
        email=data['email'],
        display_name=data.get('display_name', data['username']),
        role='user'  # Default role
    )
    user.set_password(data['password'])

    db.session.add(user)
    db.session.commit()

    return jsonify({
        'id': user.id,
        'username': user.username,
        'email': user.email,
        'display_name': user.display_name,
        'role': user.role
    }), 201


@bp.route('/login', methods=['POST'])
def login():
    """Log in a user with username/email and password"""
    data = request.get_json() or {}

    # Validate required fields
    if 'password' not in data:
        return jsonify({'error': 'Password is required'}), 400

    if 'username' not in data and 'email' not in data:
        return jsonify({'error': 'Username or email is required'}), 400

    # Find user by username or email
    user = None
    if 'username' in data:
        user = User.query.filter_by(username=data['username']).first()
    else:
        user = User.query.filter_by(email=data['email']).first()

    # Check if user exists and verify password
    if not user or not user.check_password(data['password']):
        return jsonify({'error': 'Invalid credentials'}), 401

    # Create a session for the user
    expires_at = int((datetime.utcnow() + timedelta(days=7)).timestamp())
    session = Session(
        user_id=user.id,
        expires_at=expires_at
    )
    db.session.add(session)
    db.session.commit()

    # Generate JWT token
    token = user.generate_token()

    return jsonify({
        'token': token,
        'user': {
            'id': user.id,
            'username': user.username,
            'email': user.email,
            'display_name': user.display_name,
            'role': user.role
        }
    })


@bp.route('/logout', methods=['POST'])
def logout():
    """Log out a user by invalidating their session"""
    auth_header = request.headers.get('Authorization')
    if not auth_header or not auth_header.startswith('Bearer '):
        return jsonify({'error': 'Authorization header required'}), 401

    token = auth_header.split(' ')[1]
    payload = User.verify_token(token)
    if not payload:
        return jsonify({'error': 'Invalid token'}), 401

    # Delete all sessions for this user
    Session.query.filter_by(user_id=payload['user_id']).delete()
    db.session.commit()

    return jsonify({'message': 'Successfully logged out'})


def token_required(f):
    @functools.wraps(f)
    def decorated(*args, **kwargs):
        auth_header = request.headers.get('Authorization')
        if not auth_header or not auth_header.startswith('Bearer '):
            return jsonify({'error': 'Authorization header required'}), 401

        token = auth_header.split(' ')[1]
        payload = User.verify_token(token)
        if not payload:
            return jsonify({'error': 'Invalid token'}), 401

        # Check if user has an active session
        user_id = payload['user_id']
        current_time = int(time.time())
        session = Session.query.filter_by(user_id=user_id).filter(Session.expires_at > current_time).first()
        if not session:
            return jsonify({'error': 'No active session found'}), 401

        # Add user to request context
        request.user_id = user_id
        request.user_role = payload['role']

        return f(*args, **kwargs)
    return decorated


def admin_required(f):
    @functools.wraps(f)
    @token_required
    def decorated(*args, **kwargs):
        if request.user_role != 'admin':
            return jsonify({'error': 'Admin privileges required'}), 403

        return f(*args, **kwargs)
    return decorated
@ -1,124 +0,0 @@
from app import db
import uuid
from datetime import datetime, timedelta
import jwt
from config import Config
import bcrypt


class User(db.Model):
    __tablename__ = 'user'

    id = db.Column(db.String(36), primary_key=True, default=lambda: str(uuid.uuid4()))
    # Referenced by the user API routes; stays empty for password-only accounts
    github_id = db.Column(db.String(64), index=True, unique=True, nullable=True)
    username = db.Column(db.String(64), index=True, unique=True, nullable=False)
    display_name = db.Column(db.String(120))
    email = db.Column(db.String(120), index=True, unique=True, nullable=False)
    password_hash = db.Column(db.String(128), nullable=False)
    role = db.Column(db.String(20), default='user')

    print_jobs = db.relationship('PrintJob', backref='user', lazy='dynamic', cascade='all, delete-orphan')
    sessions = db.relationship('Session', backref='user', lazy='dynamic', cascade='all, delete-orphan')

    def set_password(self, password):
        """Hash and set the user's password"""
        password_bytes = password.encode('utf-8')
        salt = bcrypt.gensalt()
        self.password_hash = bcrypt.hashpw(password_bytes, salt).decode('utf-8')

    def check_password(self, password):
        """Check if the provided password matches the stored hash"""
        password_bytes = password.encode('utf-8')
        stored_hash = self.password_hash.encode('utf-8')
        return bcrypt.checkpw(password_bytes, stored_hash)

    def generate_token(self):
        """Generate a JWT token for this user"""
        payload = {
            'user_id': self.id,
            'username': self.username,
            'email': self.email,
            'role': self.role,
            'exp': datetime.utcnow() + timedelta(seconds=Config.JWT_ACCESS_TOKEN_EXPIRES)
        }
        return jwt.encode(payload, Config.JWT_SECRET, algorithm='HS256')

    @staticmethod
    def verify_token(token):
        """Verify and decode a JWT token"""
        try:
            payload = jwt.decode(token, Config.JWT_SECRET, algorithms=['HS256'])
            return payload
        except jwt.InvalidTokenError:
            # Covers expired, malformed, and badly signed tokens
            return None


class Session(db.Model):
    __tablename__ = 'session'

    id = db.Column(db.String(36), primary_key=True, default=lambda: str(uuid.uuid4()))
    user_id = db.Column(db.String(36), db.ForeignKey('user.id', ondelete='CASCADE'), nullable=False)
    expires_at = db.Column(db.Integer, nullable=False)


class Printer(db.Model):
    __tablename__ = 'printer'

    id = db.Column(db.String(36), primary_key=True, default=lambda: str(uuid.uuid4()))
    name = db.Column(db.String(120), nullable=False)
    description = db.Column(db.Text, nullable=False)
    status = db.Column(db.Integer, nullable=False, default=0)  # 0: OPERATIONAL, 1: OUT_OF_ORDER

    print_jobs = db.relationship('PrintJob', backref='printer', lazy='dynamic', cascade='all, delete-orphan')


class PrintJob(db.Model):
    __tablename__ = 'printJob'

    id = db.Column(db.String(36), primary_key=True, default=lambda: str(uuid.uuid4()))
    printer_id = db.Column(db.String(36), db.ForeignKey('printer.id', ondelete='CASCADE'), nullable=False)
    user_id = db.Column(db.String(36), db.ForeignKey('user.id', ondelete='CASCADE'), nullable=False)
    start_at = db.Column(db.DateTime, nullable=False, default=datetime.utcnow)
    duration_in_minutes = db.Column(db.Integer, nullable=False)
    comments = db.Column(db.Text)
    aborted = db.Column(db.Boolean, nullable=False, default=False)
    abort_reason = db.Column(db.Text)

    def get_end_time(self):
        return self.start_at + timedelta(minutes=self.duration_in_minutes)

    def is_active(self):
        now = datetime.utcnow()
        return (not self.aborted and
                self.start_at <= now and
                now < self.get_end_time())

    def get_remaining_time(self):
        if self.aborted:
            return 0

        now = datetime.utcnow()
        if now < self.start_at:
            # Job hasn't started yet
            return self.duration_in_minutes * 60

        end_time = self.get_end_time()
        if now >= end_time:
            # Job has ended
            return 0

        # Job is ongoing
        remaining_seconds = (end_time - now).total_seconds()
        return int(remaining_seconds)

    def to_dict(self):
        return {
            'id': self.id,
            'printer_id': self.printer_id,
            'user_id': self.user_id,
            'start_at': self.start_at.isoformat(),
            'duration_in_minutes': self.duration_in_minutes,
            'comments': self.comments,
            'aborted': self.aborted,
            'abort_reason': self.abort_reason,
            'remaining_time': self.get_remaining_time(),
            'is_active': self.is_active()
        }
@ -1,13 +0,0 @@
import os
from dotenv import load_dotenv

basedir = os.path.abspath(os.path.dirname(__file__))
load_dotenv(os.path.join(basedir, '.env'))


class Config:
    SECRET_KEY = os.environ.get('SECRET_KEY') or 'you-will-never-guess'
    SQLALCHEMY_DATABASE_URI = os.environ.get('DATABASE_URL') or \
        'sqlite:///' + os.path.join(basedir, 'app.db')
    SQLALCHEMY_TRACK_MODIFICATIONS = False
    JWT_SECRET = os.environ.get('JWT_SECRET') or 'jwt-secret-key'
    JWT_ACCESS_TOKEN_EXPIRES = 3600  # 1 hour in seconds
@ -1,20 +0,0 @@
version: '3.8'

services:
  flask-backend:
    build:
      context: .
      dockerfile: Dockerfile
    restart: always
    ports:
      - "5000:5000"
    environment:
      - SECRET_KEY=your-secret-key
      - DATABASE_URL=sqlite:///app.db
      - JWT_SECRET=your-jwt-secret
    volumes:
      - ./instance:/app/instance
    command: >
      bash -c "python -m flask db upgrade &&
               python scripts/init_db.py &&
               gunicorn --bind 0.0.0.0:5000 wsgi:app"
@ -1,89 +0,0 @@
# A generic, single database configuration.

[alembic]
# path to migration scripts
script_location = migrations

# template used to generate migration files
file_template = %%(year)d%%(month).2d%%(day).2d_%%(hour).2d%%(minute).2d%%(second).2d_%%(slug)s

# sys.path path, will be prepended to sys.path if present.
# defaults to the current working directory.
prepend_sys_path = .

# timezone to use when rendering the date
# within the migration file as well as the filename.
# string value is passed to dateutil.tz.gettz()
# leave blank for localtime
# timezone =

# max length of characters to apply to the
# "slug" field
truncate_slug_length = 40

# set to 'true' to run the environment during
# the 'revision' command, regardless of autogenerate
# revision_environment = false

# set to 'true' to allow .pyc and .pyo files without
# a source .py file to be detected as revisions in the
# versions/ directory
# sourceless = false

# version location specification; this defaults
# to migrations/versions. When using multiple version
# directories, initial revisions must be specified with --version-path
# version_locations = %(here)s/bar %(here)s/bat migrations/versions

# the output encoding used when revision files
# are written from script.py.mako
# output_encoding = utf-8

sqlalchemy.url = driver://user:pass@localhost/dbname


[post_write_hooks]
# post_write_hooks defines scripts or Python functions that are run
# on newly generated revision scripts. See the documentation for further
# detail and examples

# format using "black" - use the console_scripts runner, against the "black" entrypoint
# hooks = black
# black.type = console_scripts
# black.entrypoint = black
# black.options = -l 79 REVISION_SCRIPT_FILENAME

# Logging configuration
[loggers]
keys = root,sqlalchemy,alembic

[handlers]
keys = console

[formatters]
keys = generic

[logger_root]
level = WARN
handlers = console
qualname =

[logger_sqlalchemy]
level = WARN
handlers =
qualname = sqlalchemy.engine

[logger_alembic]
level = INFO
handlers =
qualname = alembic

[handler_console]
class = StreamHandler
args = (sys.stderr,)
level = NOTSET
formatter = generic

[formatter_generic]
format = %(levelname)-5.5s [%(name)s] %(message)s
datefmt = %H:%M:%S
@ -1,91 +0,0 @@
from __future__ import with_statement

import logging
from logging.config import fileConfig

from flask import current_app

from alembic import context

# this is the Alembic Config object, which provides
# access to the values within the .ini file in use.
config = context.config

# Interpret the config file for Python logging.
# This line sets up loggers basically.
fileConfig(config.config_file_name)
logger = logging.getLogger('alembic.env')

# add your model's MetaData object here
# for 'autogenerate' support
# from myapp import mymodel
# target_metadata = mymodel.Base.metadata
config.set_main_option(
    'sqlalchemy.url',
    str(current_app.extensions['migrate'].db.get_engine().url).replace(
        '%', '%%'))
target_metadata = current_app.extensions['migrate'].db.metadata

# other values from the config, defined by the needs of env.py,
# can be acquired:
# my_important_option = config.get_main_option("my_important_option")
# ... etc.


def run_migrations_offline():
    """Run migrations in 'offline' mode.

    This configures the context with just a URL
    and not an Engine, though an Engine is acceptable
    here as well. By skipping the Engine creation
    we don't even need a DBAPI to be available.

    Calls to context.execute() here emit the given string to the
    script output.

    """
    url = config.get_main_option("sqlalchemy.url")
    context.configure(
        url=url, target_metadata=target_metadata, literal_binds=True
    )

    with context.begin_transaction():
        context.run_migrations()


def run_migrations_online():
    """Run migrations in 'online' mode.

    In this scenario we need to create an Engine
    and associate a connection with the context.

    """

    # this callback is used to prevent an auto-migration from being generated
    # when there are no changes to the schema
    # reference: http://alembic.zzzcomputing.com/en/latest/cookbook.html
    def process_revision_directives(context, revision, directives):
        if getattr(config.cmd_opts, 'autogenerate', False):
            script = directives[0]
            if script.upgrade_ops.is_empty():
                directives[:] = []
                logger.info('No changes in schema detected.')

    connectable = current_app.extensions['migrate'].db.get_engine()

    with connectable.connect() as connection:
        context.configure(
            connection=connection,
            target_metadata=target_metadata,
            process_revision_directives=process_revision_directives,
            **current_app.extensions['migrate'].configure_args
        )

        with context.begin_transaction():
            context.run_migrations()


if context.is_offline_mode():
    run_migrations_offline()
else:
    run_migrations_online()
@ -1,24 +0,0 @@
"""${message}

Revision ID: ${up_revision}
Revises: ${down_revision | comma,n}
Create Date: ${create_date}

"""
from alembic import op
import sqlalchemy as sa
${imports if imports else ""}

# revision identifiers, used by Alembic.
revision = ${repr(up_revision)}
down_revision = ${repr(down_revision)}
branch_labels = ${repr(branch_labels)}
depends_on = ${repr(depends_on)}


def upgrade():
    ${upgrades if upgrades else "pass"}


def downgrade():
    ${downgrades if downgrades else "pass"}
@ -1,75 +0,0 @@
"""Initial migration

Revision ID: initial_migration
Revises:
Create Date: 2025-03-06 12:00:00.000000

"""
from alembic import op
import sqlalchemy as sa


# revision identifiers, used by Alembic.
revision = 'initial_migration'
down_revision = None
branch_labels = None
depends_on = None


def upgrade():
    # Create user table
    op.create_table('user',
        sa.Column('id', sa.String(length=36), nullable=False),
        sa.Column('username', sa.String(length=64), nullable=False),
        sa.Column('display_name', sa.String(length=120), nullable=True),
        sa.Column('email', sa.String(length=120), nullable=False),
        sa.Column('password_hash', sa.String(length=128), nullable=False),
        sa.Column('role', sa.String(length=20), nullable=True),
        sa.PrimaryKeyConstraint('id'),
        sa.UniqueConstraint('email'),
        sa.UniqueConstraint('username')
    )
    op.create_index(op.f('ix_user_email'), 'user', ['email'], unique=True)
    op.create_index(op.f('ix_user_username'), 'user', ['username'], unique=True)

    # Create session table
    op.create_table('session',
        sa.Column('id', sa.String(length=36), nullable=False),
        sa.Column('user_id', sa.String(length=36), nullable=False),
        sa.Column('expires_at', sa.Integer(), nullable=False),
        sa.ForeignKeyConstraint(['user_id'], ['user.id'], ondelete='CASCADE'),
        sa.PrimaryKeyConstraint('id')
    )

    # Create printer table
    op.create_table('printer',
        sa.Column('id', sa.String(length=36), nullable=False),
        sa.Column('name', sa.String(length=120), nullable=False),
        sa.Column('description', sa.Text(), nullable=False),
        sa.Column('status', sa.Integer(), nullable=False),
        sa.PrimaryKeyConstraint('id')
    )

    # Create printJob table
    op.create_table('printJob',
        sa.Column('id', sa.String(length=36), nullable=False),
        sa.Column('printer_id', sa.String(length=36), nullable=False),
        sa.Column('user_id', sa.String(length=36), nullable=False),
        sa.Column('start_at', sa.DateTime(), nullable=False),
        sa.Column('duration_in_minutes', sa.Integer(), nullable=False),
        sa.Column('comments', sa.Text(), nullable=True),
        sa.Column('aborted', sa.Boolean(), nullable=False),
        sa.Column('abort_reason', sa.Text(), nullable=True),
        sa.ForeignKeyConstraint(['printer_id'], ['printer.id'], ondelete='CASCADE'),
        sa.ForeignKeyConstraint(['user_id'], ['user.id'], ondelete='CASCADE'),
        sa.PrimaryKeyConstraint('id')
    )


def downgrade():
    op.drop_table('printJob')
    op.drop_table('printer')
    op.drop_table('session')
    op.drop_index(op.f('ix_user_username'), table_name='user')
    op.drop_index(op.f('ix_user_email'), table_name='user')
    op.drop_table('user')
@ -1,9 +0,0 @@
Flask==2.3.3
Flask-SQLAlchemy==3.1.1
Flask-Migrate==4.0.5
Flask-CORS==4.0.0
python-dotenv==1.0.0
SQLAlchemy==2.0.25
pyjwt==2.8.0
bcrypt==4.1.2
gunicorn==21.2.0
@ -1,23 +0,0 @@
#!/bin/bash

# Initialize virtual environment if it doesn't exist
if [ ! -d "venv" ]; then
    echo "Creating virtual environment..."
    python3 -m venv venv
fi

# Activate virtual environment
source venv/bin/activate

# Install dependencies
echo "Installing dependencies..."
pip install -r requirements.txt

# Initialize database
echo "Initializing database..."
flask db upgrade
python scripts/init_db.py

# Run the application
echo "Starting Flask application..."
python wsgi.py
@ -1,55 +0,0 @@
#!/usr/bin/env python
from app import create_app, db
from app.models import User, Printer
import uuid

def init_db():
    app = create_app()
    with app.app_context():
        # Create tables
        db.create_all()

        # Check if we already have an admin user
        admin = User.query.filter_by(role='admin').first()
        if not admin:
            # Create admin user
            admin = User(
                id=str(uuid.uuid4()),
                username='admin',
                display_name='Administrator',
                email='admin@example.com',
                role='admin'
            )
            admin.set_password('admin123')  # Default password, change in production!
            db.session.add(admin)
            print("Created admin user with username 'admin' and password 'admin123'")

        # Check if we have any printers
        printer_count = Printer.query.count()
        if printer_count == 0:
            # Create sample printers; id is a non-nullable primary key
            # (see the initial migration), so set it explicitly like for User
            printers = [
                Printer(
                    id=str(uuid.uuid4()),
                    name='Printer 1',
                    description='3D Printer for general use',
                    status=0  # OPERATIONAL
                ),
                Printer(
                    id=str(uuid.uuid4()),
                    name='Printer 2',
                    description='High resolution printer for detailed work',
                    status=0  # OPERATIONAL
                ),
                Printer(
                    id=str(uuid.uuid4()),
                    name='Printer 3',
                    description='Large format printer for big projects',
                    status=0  # OPERATIONAL
                )
            ]
            db.session.add_all(printers)
            print("Created sample printers")

        db.session.commit()
        print("Database initialized successfully!")

if __name__ == '__main__':
    init_db()
@ -1,6 +0,0 @@
from app import create_app

app = create_app()

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=5000, debug=True)
@ -1,71 +0,0 @@
root@raspberrypi:/home/user/Projektarbeit-MYP# ./install-frontend.sh
[2025-04-01 10:58:30] Cleaning up existing installation...
[2025-04-01 10:58:30] Cleanup complete.
[2025-04-01 10:58:30] Docker Compose v2 plugin is already installed.
[2025-04-01 10:58:30] Changing to directory: /home/user/Projektarbeit-MYP/packages/reservation-platform
[2025-04-01 10:58:30] Creating .env file...
[2025-04-01 10:58:30] .env file created successfully
[2025-04-01 10:58:30] NOTE: Please adjust the backend URL in the .env file if the backend runs on a different server.
[2025-04-01 10:58:30] Creating database directory
[2025-04-01 10:58:30] Building and starting frontend container...
[2025-04-01 10:58:30] This can take several minutes on a Raspberry Pi - please be patient
[2025-04-01 10:58:30] Building local image...
WARN[0000] /home/user/Projektarbeit-MYP/packages/reservation-platform/docker-compose.yml: the attribute `version` is obsolete, it will be ignored, please remove it to avoid potential confusion
Compose can now delegate builds to bake for better performance.
To do so, set COMPOSE_BAKE=true.
[+] Building 95.0s (11/16) docker-container:myp-rp-arm64-builder
 => [frontend internal] load build definition from Dockerfile 0.0s
 => => transferring dockerfile: 3.60kB 0.0s
 => [frontend internal] load metadata for docker.io/library/node:alpine 2.4s
 => [frontend internal] load .dockerignore 0.1s
 => => transferring context: 2B 0.0s
 => [frontend 1/12] FROM docker.io/library/node:alpine@sha256:6eae672406a2bc8ed93eab6f9f76a02eb247e06ba82b2f5032c0a4ae07e825ba 0.1s
 => => resolve docker.io/library/node:alpine@sha256:6eae672406a2bc8ed93eab6f9f76a02eb247e06ba82b2f5032c0a4ae07e825ba 0.1s
 => [frontend internal] load build context 0.1s
 => => transferring context: 11.08kB 0.0s
 => CACHED [frontend 2/12] WORKDIR /app 0.0s
 => [frontend 3/12] RUN apk add --no-cache python3 build-base g++ make sqlite sqlite-dev gcc musl-dev git libffi-dev openssl-dev cmake 58.0s
 => [frontend 4/12] RUN npm install -g pnpm 4.4s
 => [frontend 5/12] COPY package.json pnpm-lock.yaml ./ 0.3s
 => [frontend 6/12] RUN pnpm install --unsafe-perm --no-optional --frozen-lockfile 27.7s
 => ERROR [frontend 7/12] RUN npm install -g npx 1.8s
------
 > [frontend 7/12] RUN npm install -g npx:
1.480 npm error code EEXIST
1.480 npm error path /usr/local/bin/npx
1.480 npm error EEXIST: file already exists
1.480 npm error File exists: /usr/local/bin/npx
1.481 npm error Remove the existing file and try again, or run npm
1.481 npm error with --force to overwrite files recklessly.
1.483 npm error A complete log of this run can be found in: /root/.npm/_logs/2025-04-01T09_00_04_989Z-debug-0.log
------
failed to solve: process "/bin/sh -c npm install -g npx" did not complete successfully: exit code: 1
[2025-04-01 11:00:06] ERROR: Docker Compose build (v2) failed. Trying v1 format...
[+] Building 83.3s (11/16) docker-container:myp-rp-arm64-builder
 => [frontend internal] load build definition from Dockerfile 0.0s
 => => transferring dockerfile: 3.60kB 0.0s
 => [frontend internal] load metadata for docker.io/library/node:alpine 0.6s
 => [frontend internal] load .dockerignore 0.0s
 => => transferring context: 2B 0.0s
 => [frontend internal] load build context 0.2s
 => => transferring context: 7.22kB 0.0s
 => [frontend 1/12] FROM docker.io/library/node:alpine@sha256:6eae672406a2bc8ed93eab6f9f76a02eb247e06ba82b2f5032c0a4ae07e825ba 0.1s
 => => resolve docker.io/library/node:alpine@sha256:6eae672406a2bc8ed93eab6f9f76a02eb247e06ba82b2f5032c0a4ae07e825ba 0.1s
 => CACHED [frontend 2/12] WORKDIR /app 0.0s
 => [frontend 3/12] RUN apk add --no-cache python3 build-base g++ make sqlite sqlite-dev gcc musl-dev git libffi-dev openssl-dev cmake 51.4s
 => [frontend 4/12] RUN npm install -g pnpm 5.1s
 => [frontend 5/12] COPY package.json pnpm-lock.yaml ./ 0.3s
 => [frontend 6/12] RUN pnpm install --unsafe-perm --no-optional --frozen-lockfile 23.2s
 => ERROR [frontend 7/12] RUN npm install -g npx 2.3s
------
 > [frontend 7/12] RUN npm install -g npx:
1.975 npm error code EEXIST
1.975 npm error path /usr/local/bin/npx
1.975 npm error EEXIST: file already exists
1.975 npm error File exists: /usr/local/bin/npx
1.975 npm error Remove the existing file and try again, or run npm
1.975 npm error with --force to overwrite files recklessly.
1.989 npm error A complete log of this run can be found in: /root/.npm/_logs/2025-04-01T09_01_27_844Z-debug-0.log
------
failed to solve: process "/bin/sh -c npm install -g npx" did not complete successfully: exit code: 1
[2025-04-01 11:01:29] ERROR: Docker Compose build failed. See the error message above.
@ -1,22 +0,0 @@
# Required frontend changes

1. Adapt frontend authentication:
   - Replace GitHub OAuth with local authentication
   - Create login components for username/password
   - Implement a registration form
   - Adapt the API routes for the login and registration flow

2. Database schema:
   - Adapt the users table to support a password hash
   - Remove the GitHub ID or make it optional

3. Auth system:
   - Lucia.js: switch from OAuth to form-based authentication
   - Keep session management

4. API endpoint changes:
   - Create new login and register endpoints
   - Add a route for the initial admin setup

The frontend changes are more extensive, since the current system is built heavily around GitHub OAuth and has to be converted completely.

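Point 2 above asks the users table to carry a password hash instead of a GitHub ID. The backend's requirements.txt in this commit pins `bcrypt==4.1.2`, which is presumably what `User.set_password` wraps; as a stdlib-only sketch of the same hash-and-verify flow (PBKDF2 instead of bcrypt, and the helper names `hash_password`/`verify_password` are hypothetical, not taken from the codebase):

```python
import hashlib
import hmac
import os

def hash_password(plain: str, *, iterations: int = 100_000) -> str:
    """Derive a salted hash suitable for storing in user.password_hash."""
    salt = os.urandom(16)
    dk = hashlib.pbkdf2_hmac("sha256", plain.encode("utf-8"), salt, iterations)
    # Self-describing format: scheme $ iterations $ salt $ derived key
    return f"pbkdf2${iterations}${salt.hex()}${dk.hex()}"

def verify_password(plain: str, stored: str) -> bool:
    """Recompute the hash from the stored salt and compare in constant time."""
    _scheme, iters, salt_hex, dk_hex = stored.split("$")
    dk = hashlib.pbkdf2_hmac(
        "sha256", plain.encode("utf-8"), bytes.fromhex(salt_hex), int(iters)
    )
    return hmac.compare_digest(dk.hex(), dk_hex)
```

The key point for the schema change is that only the formatted hash string is persisted; the login endpoint then just calls the verify step against the submitted form password.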
@ -1,591 +0,0 @@
#!/bin/bash

# MYP frontend installation script
# This script installs the frontend with Docker and host networking

# Color codes for output
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[0;33m'
BLUE='\033[0;34m'
NC='\033[0m' # No Color

# Logging helper with timestamp
log() {
    echo -e "${BLUE}[$(date +'%Y-%m-%d %H:%M:%S')]${NC} $1"
}

error_log() {
    echo -e "${RED}[$(date +'%Y-%m-%d %H:%M:%S')] ERROR:${NC} $1" >&2
}

# Clean up any existing installation
cleanup_existing_installation() {
    log "${YELLOW}Cleaning up existing installation...${NC}"

    # Stop and remove existing containers
    if docker ps -a | grep -q "myp-frontend"; then
        log "Stopping and removing existing frontend container..."
        docker stop myp-frontend &>/dev/null || true
        docker rm myp-frontend &>/dev/null || true
    fi

    # Remove Docker images
    if docker images | grep -q "myp-frontend"; then
        log "Removing existing frontend image..."
        docker rmi myp-frontend &>/dev/null || true
    fi

    log "${GREEN}Cleanup complete.${NC}"
}

# Define paths
SCRIPT_DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
FRONTEND_DIR="$SCRIPT_DIR/packages/reservation-platform"

# Check that the directory exists
if [ ! -d "$FRONTEND_DIR" ]; then
    error_log "Frontend directory '$FRONTEND_DIR' not found."
    exit 1
fi

# Clean up any existing installation
cleanup_existing_installation

# Function to install Docker and Docker Compose on a Raspberry Pi
install_docker() {
    log "${YELLOW}Docker is not installed. Starting installation...${NC}"

    # Detect Raspberry Pi
    if [ -f /proc/device-tree/model ] && grep -q "Raspberry Pi" /proc/device-tree/model; then
        log "${GREEN}Raspberry Pi detected. Installing Docker for the ARM architecture...${NC}"
        IS_RASPBERRY_PI=true
    else
        IS_RASPBERRY_PI=false
    fi

    # Update the package index
    if ! sudo apt-get update; then
        error_log "Could not update the package index. Please install manually."
        exit 1
    fi

    # Install required packages
    if ! sudo apt-get install -y apt-transport-https ca-certificates curl gnupg software-properties-common; then
        error_log "Could not install the required packages. Please install manually."
        exit 1
    fi

    # Raspberry Pi specific installation
    if [ "$IS_RASPBERRY_PI" = true ]; then
        # Determine the system architecture (armhf or arm64)
        ARCH=$(dpkg --print-architecture)
        log "Detected system architecture: ${ARCH}"

        # Install Docker with the convenience script (recommended for Raspberry Pi)
        log "${YELLOW}Installing Docker with the convenience script...${NC}"
        curl -fsSL https://get.docker.com -o get-docker.sh
        sudo sh get-docker.sh

        if [ $? -ne 0 ]; then
            error_log "Docker installation failed. Please install manually."
            exit 1
        fi
    else
        # Standard installation for other systems
        # Add Docker's official GPG key
        curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo apt-key add -

        # Add the Docker repository
        if ! sudo add-apt-repository "deb [arch=$(dpkg --print-architecture)] https://download.docker.com/linux/ubuntu $(lsb_release -cs) stable"; then
            error_log "Could not add the Docker repository. Check whether your system is supported."
            exit 1
        fi

        # Update the package index again
        sudo apt-get update

        # Install Docker
        if ! sudo apt-get install -y docker-ce docker-ce-cli containerd.io; then
            error_log "Could not install Docker. Please install manually."
            exit 1
        fi
    fi

    # Add the current user to the docker group
    sudo usermod -aG docker "$USER"

    log "${GREEN}Docker has been installed.${NC}"
    log "${YELLOW}IMPORTANT: You may need to log in again for the group change to take effect.${NC}"

    # Check whether the Docker Compose v2 plugin is available (preferred, as it is more modern)
    log "${YELLOW}Checking Docker Compose version...${NC}"

    if docker compose version &> /dev/null; then
        log "${GREEN}Docker Compose v2 plugin is already installed.${NC}"
        DOCKER_COMPOSE_V2=true
    else
        log "${YELLOW}Docker Compose v2 plugin not found. Trying to install Docker Compose v1...${NC}"
        DOCKER_COMPOSE_V2=false

        if [ "$IS_RASPBERRY_PI" = true ]; then
            # On a Raspberry Pi it is better to use the matching architecture
            if [ "$ARCH" = "armhf" ]; then
                log "Installing Docker Compose for armhf (32-bit)..."
                sudo curl -L "https://github.com/docker/compose/releases/download/v2.6.1/docker-compose-linux-armv7" -o /usr/local/bin/docker-compose
            elif [ "$ARCH" = "arm64" ]; then
                log "Installing Docker Compose for arm64 (64-bit)..."
                sudo curl -L "https://github.com/docker/compose/releases/download/v2.6.1/docker-compose-linux-aarch64" -o /usr/local/bin/docker-compose
            else
                # Fall back to v1.29.2 for unknown ARM architectures
                log "Using automatic architecture detection for Docker Compose v1.29.2..."
                sudo curl -L "https://github.com/docker/compose/releases/download/1.29.2/docker-compose-$(uname -s)-$(uname -m)" -o /usr/local/bin/docker-compose
            fi
        else
            # For other systems try v2 first, then v1.29.2 as a fallback
            log "Installing Docker Compose v2 for $(uname -s)/$(uname -m)..."
            if ! sudo curl -L "https://github.com/docker/compose/releases/download/v2.6.1/docker-compose-$(uname -s)-$(uname -m)" -o /usr/local/bin/docker-compose; then
                log "${YELLOW}Could not download Docker Compose v2. Trying v1.29.2...${NC}"
                sudo curl -L "https://github.com/docker/compose/releases/download/1.29.2/docker-compose-$(uname -s)-$(uname -m)" -o /usr/local/bin/docker-compose
            fi
        fi

        if [ $? -ne 0 ]; then
            error_log "Could not download Docker Compose. Please install manually."
            exit 1
        fi

        sudo chmod +x /usr/local/bin/docker-compose

        log "${GREEN}Docker Compose has been installed.${NC}"
    fi

    # Start the Docker service
    if command -v systemctl &> /dev/null; then
        sudo systemctl enable docker
        sudo systemctl start docker
    elif command -v service &> /dev/null; then
        # 'service' has no 'enable' subcommand; register the init script instead
        sudo update-rc.d docker defaults
        sudo service docker start
    fi
}

# Check whether Docker is installed
if ! command -v docker &> /dev/null; then
    log "${YELLOW}Docker is not installed.${NC}"
    read -p "Do you want to install Docker? (y/n): " install_docker_choice
    if [[ "$install_docker_choice" == "y" ]]; then
        install_docker
    else
        error_log "Docker is required for the installation. Please install Docker manually."
        log "See: https://docs.docker.com/get-docker/"
        exit 1
    fi
fi

# Check whether the Docker daemon is running
if ! docker info &> /dev/null; then
    log "${YELLOW}The Docker daemon is not running. Trying to start the service...${NC}"

    # Try to start Docker
    if command -v systemctl &> /dev/null; then
        sudo systemctl start docker
    elif command -v service &> /dev/null; then
        sudo service docker start
    else
        error_log "Could not start the Docker daemon. Please start the Docker service manually."
        log "Start it with: sudo systemctl start docker or sudo service docker start"
        exit 1
    fi

    # Check again whether Docker is running
    if ! docker info &> /dev/null; then
        error_log "The Docker daemon could not be started. Please start the Docker service manually."
        exit 1
    fi

    log "${GREEN}Docker daemon started successfully.${NC}"
fi

# Check whether Docker Compose is installed
if docker compose version &> /dev/null; then
    log "${GREEN}Docker Compose v2 plugin is already installed.${NC}"
    DOCKER_COMPOSE_V2=true
elif command -v docker-compose &> /dev/null; then
    log "${GREEN}Docker Compose v1 is already installed.${NC}"
    DOCKER_COMPOSE_V2=false
else
    log "${YELLOW}Docker Compose is not installed.${NC}"
    DOCKER_COMPOSE_V2=false
    read -p "Do you want to install Docker Compose? (y/n): " install_compose_choice
    if [[ "$install_compose_choice" == "y" ]]; then
        log "${YELLOW}Installing Docker Compose...${NC}"

        # Check whether the operating system is ARM-based (e.g. Raspberry Pi)
        if grep -q "arm" /proc/cpuinfo 2> /dev/null; then
            ARCH=$(dpkg --print-architecture 2> /dev/null || echo "unknown")
            IS_RASPBERRY_PI=true
        else
            IS_RASPBERRY_PI=false
        fi

        # Try to install Docker Compose v2 first
        if [ "$IS_RASPBERRY_PI" = true ]; then
            if [ "$ARCH" = "armhf" ]; then
                log "Installing Docker Compose for armhf (32-bit)..."
                sudo curl -L "https://github.com/docker/compose/releases/download/v2.6.1/docker-compose-linux-armv7" -o /usr/local/bin/docker-compose
            elif [ "$ARCH" = "arm64" ]; then
                log "Installing Docker Compose for arm64 (64-bit)..."
                sudo curl -L "https://github.com/docker/compose/releases/download/v2.6.1/docker-compose-linux-aarch64" -o /usr/local/bin/docker-compose
            else
                log "Using automatic architecture detection for Docker Compose v1.29.2..."
                sudo curl -L "https://github.com/docker/compose/releases/download/1.29.2/docker-compose-$(uname -s)-$(uname -m)" -o /usr/local/bin/docker-compose
            fi
        else
            log "Installing Docker Compose v2 for $(uname -s)/$(uname -m)..."
            if ! sudo curl -L "https://github.com/docker/compose/releases/download/v2.6.1/docker-compose-$(uname -s)-$(uname -m)" -o /usr/local/bin/docker-compose; then
                log "${YELLOW}Could not download Docker Compose v2. Trying v1.29.2...${NC}"
                sudo curl -L "https://github.com/docker/compose/releases/download/1.29.2/docker-compose-$(uname -s)-$(uname -m)" -o /usr/local/bin/docker-compose
            fi
        fi

        if [ $? -ne 0 ]; then
            error_log "Could not download Docker Compose. Please install manually."
            exit 1
        fi

        sudo chmod +x /usr/local/bin/docker-compose

        log "${GREEN}Docker Compose has been installed.${NC}"
    else
        error_log "Docker Compose is required for the installation. Please install it manually."
        log "See: https://docs.docker.com/compose/install/"
        exit 1
    fi
fi

# Check whether wget is installed (used for the container health check)
if ! command -v wget &> /dev/null; then
    error_log "wget is not installed, but it is required for the container health check."
    log "Install it with: sudo apt-get install wget"
    exit 1
fi

# Wechsle ins Frontend-Verzeichnis
|
||||
log "Wechsle ins Verzeichnis: $FRONTEND_DIR"
|
||||
cd "$FRONTEND_DIR" || {
|
||||
error_log "Konnte nicht ins Verzeichnis $FRONTEND_DIR wechseln."
|
||||
exit 1
|
||||
}
|
||||
|
||||
# Prüfe ob Dockerfile existiert
|
||||
if [ ! -f "Dockerfile" ]; then
|
||||
error_log "Dockerfile nicht gefunden in $FRONTEND_DIR."
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# Prüfe ob docker-compose.yml existiert
|
||||
if [ ! -f "docker-compose.yml" ]; then
|
||||
error_log "docker-compose.yml nicht gefunden in $FRONTEND_DIR."
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# Prüfe ob package.json existiert
|
||||
if [ ! -f "package.json" ]; then
|
||||
error_log "package.json nicht gefunden in $FRONTEND_DIR."
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# Erstelle .env-Datei
|
||||
log "${YELLOW}Erstelle .env Datei...${NC}"
|
||||
cat > .env << EOL
|
||||
# Basic Server Configuration
|
||||
RUNTIME_ENVIRONMENT=prod
|
||||
DB_PATH=db/sqlite.db
|
||||
|
||||
# OAuth Configuration (Bitte anpassen)
|
||||
OAUTH_CLIENT_ID=client_id
|
||||
OAUTH_CLIENT_SECRET=client_secret
|
||||
|
||||
# Backend-API URL (IP-Adresse oder Hostname des Backend-Servers)
|
||||
NEXT_PUBLIC_API_URL=http://localhost:5000
|
||||
EOL
|
||||
|
||||
if [ ! -f ".env" ]; then
|
||||
error_log "Konnte .env-Datei nicht erstellen. Prüfen Sie die Berechtigungen."
|
||||
exit 1
|
||||
fi
|
||||
log "${GREEN}.env Datei erfolgreich erstellt${NC}"
|
||||
log "${YELLOW}HINWEIS: Bitte passen Sie die Backend-URL in der .env-Datei an, falls das Backend auf einem anderen Server läuft.${NC}"
|
||||
|
||||
# Datenbank-Verzeichnis erstellen
|
||||
log "Erstelle Datenbankverzeichnis"
|
||||
if ! mkdir -p db; then
|
||||
error_log "Konnte Verzeichnis 'db' nicht erstellen. Prüfen Sie die Berechtigungen."
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# Docker-Image bauen und starten
|
||||
log "${YELLOW}Baue und starte Frontend-Container...${NC}"
|
||||
log "${YELLOW}Dies kann auf einem Raspberry Pi mehrere Minuten dauern - bitte geduldig sein${NC}"
|
||||
|
||||
# Prüfe, ob Docker-Daemon läuft
|
||||
if ! docker info &>/dev/null; then
|
||||
log "${YELLOW}Docker-Daemon scheint nicht zu laufen. Versuche zu starten...${NC}"
|
||||
|
||||
# Versuche Docker zu starten
|
||||
if command -v systemctl &>/dev/null; then
|
||||
sudo systemctl start docker || true
|
||||
sleep 5
|
||||
elif command -v service &>/dev/null; then
|
||||
sudo service docker start || true
|
||||
sleep 5
|
||||
fi
|
||||
|
||||
# Prüfe erneut, ob Docker jetzt läuft
|
||||
if ! docker info &>/dev/null; then
|
||||
error_log "Docker-Daemon konnte nicht gestartet werden."
|
||||
log "Führen Sie vor der Installation bitte folgende Befehle aus:"
|
||||
log " sudo systemctl start docker"
|
||||
log " sudo systemctl enable docker"
|
||||
log "Starten Sie dann das Installationsskript erneut."
|
||||
exit 1
|
||||
fi
|
||||
fi
|
||||
|
||||
# Docker-Rechte prüfen
|
||||
if ! docker ps &>/dev/null; then
|
||||
error_log "Sie haben keine Berechtigung, Docker ohne sudo zu verwenden."
|
||||
log "Bitte führen Sie folgenden Befehl aus und melden Sie sich danach neu an:"
|
||||
log " sudo usermod -aG docker $USER"
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# Check whether the required base images are available locally
if ! docker image inspect node:lts-alpine &>/dev/null; then
    log "${YELLOW}Checking and setting DNS servers for Docker...${NC}"

    # Check and adjust DNS settings
    if [ -f /etc/docker/daemon.json ]; then
        log "Found an existing Docker configuration."
    else
        log "Creating a Docker configuration with Google DNS..."
        sudo mkdir -p /etc/docker
        echo '{
  "dns": ["8.8.8.8", "8.8.4.4"]
}' | sudo tee /etc/docker/daemon.json > /dev/null

        # Restart Docker so the changes take effect
        if command -v systemctl &>/dev/null; then
            sudo systemctl restart docker
            sleep 5
        elif command -v service &>/dev/null; then
            sudo service docker restart
            sleep 5
        fi
    fi

    # Try to pull the image explicitly under other tags
    log "${YELLOW}Looking for a locally available Node version...${NC}"

    # Search for all available Node images
    NODE_IMAGES=$(docker images --format "{{.Repository}}:{{.Tag}}" | grep "node:")

    if [ -n "$NODE_IMAGES" ]; then
        log "Found Node images: $NODE_IMAGES"
        # Use the first Node image found
        FIRST_NODE=$(echo "$NODE_IMAGES" | head -n 1)
        log "${GREEN}Using existing Node image: $FIRST_NODE${NC}"

        # Update the Dockerfile
        sed -i "s|FROM node:lts-alpine|FROM $FIRST_NODE|g" Dockerfile
        log "Dockerfile updated to use the local image."
    else
        # Try different Node versions
        for NODE_VERSION in "node:20-alpine" "node:18-alpine" "node:16-alpine" "node:alpine" "node:slim"; do
            log "Trying to pull $NODE_VERSION..."
            if docker pull "$NODE_VERSION"; then
                log "${GREEN}Successfully pulled $NODE_VERSION${NC}"
                # Update the Dockerfile
                sed -i "s|FROM node:lts-alpine|FROM $NODE_VERSION|g" Dockerfile
                log "Dockerfile updated to use $NODE_VERSION."
                break
            fi
        done
    fi
fi

# Increase Docker timeouts for slow connections and the Raspberry Pi
export DOCKER_CLIENT_TIMEOUT=300
export COMPOSE_HTTP_TIMEOUT=300

# Use the correct Docker Compose version
if [ "${DOCKER_COMPOSE_V2:-false}" = true ]; then
    # Docker Compose V2 plugin (docker compose)
    log "Building local image..."
    if ! docker compose build --no-cache; then
        error_log "Docker Compose build (v2) failed. Trying the v1 format..."
        if ! docker-compose build --no-cache; then
            error_log "Docker Compose build failed. See the error message above."
            exit 1
        fi
    fi

    log "Starting container from the local image..."
    if ! docker compose up -d; then
        error_log "Docker Compose up (v2) failed. Trying the v1 format..."
        if ! docker-compose up -d; then
            error_log "Docker Compose up failed. See the error message above."
            exit 1
        fi
    fi
else
    # Docker Compose V1 (docker-compose)
    log "Building local image..."
    if ! docker-compose build --no-cache; then
        error_log "Docker Compose build failed. See the error message above."
        exit 1
    fi

    log "Starting container from the local image..."
    if ! docker-compose up -d; then
        error_log "Docker Compose up failed. See the error message above."
        exit 1
    fi
fi

# Check whether the container is running
log "Waiting 10 seconds for the container to start..."
sleep 10

# Check several times, since the container may take longer to start
MAX_ATTEMPTS=5
CURRENT_ATTEMPT=1

while [ $CURRENT_ATTEMPT -le $MAX_ATTEMPTS ]; do
    log "Checking container status (attempt $CURRENT_ATTEMPT of $MAX_ATTEMPTS)..."

    if docker ps | grep -q "myp-frontend"; then
        log "${GREEN}Frontend container is running${NC}"
        break
    else
        CONTAINER_STATUS=$(docker ps -a | grep myp-frontend)
        CONTAINER_CREATED=$(echo "$CONTAINER_STATUS" | grep -q "Created" && echo "true" || echo "false")
        CONTAINER_EXITED=$(echo "$CONTAINER_STATUS" | grep -q "Exited" && echo "true" || echo "false")

        if [ "$CONTAINER_EXITED" = "true" ]; then
            log "${YELLOW}Container has exited. Checking logs...${NC}"
            docker logs myp-frontend

            log "${YELLOW}Restarting container with better debug output...${NC}"
            docker rm -f myp-frontend

            if [ "${DOCKER_COMPOSE_V2:-false}" = true ]; then
                docker compose up -d
            else
                docker-compose up -d
            fi

            sleep 10
        fi

        if [ $CURRENT_ATTEMPT -eq $MAX_ATTEMPTS ]; then
            error_log "Frontend container is still not running after several attempts. Container status:"
            docker ps -a | grep myp-frontend
            log "Container logs:"
            docker logs myp-frontend
            exit 1
        fi
    fi

    CURRENT_ATTEMPT=$((CURRENT_ATTEMPT + 1))
    sleep 20
done

# Test whether the server is reachable
log "${YELLOW}Testing whether the frontend server is reachable...${NC}"
log "${YELLOW}NOTE: On a first install it can take several minutes before the server becomes reachable${NC}"
log "${YELLOW}If problems persist, a system reboot can help${NC}"

MAX_ATTEMPTS=3
ATTEMPT=1

while [ $ATTEMPT -le $MAX_ATTEMPTS ]; do
    log "Attempt $ATTEMPT/$MAX_ATTEMPTS..."
    if curl -s -o /dev/null -w "%{http_code}" http://localhost:3000 2>/dev/null | grep -q "200\|304"; then
        log "${GREEN}Frontend server is reachable!${NC}"
        break
    else
        # Check whether the container is still running or has errors
        CONTAINER_STATUS=$(docker inspect --format='{{.State.Status}}' myp-frontend 2>/dev/null || echo "not found")

        if [ "$CONTAINER_STATUS" != "running" ]; then
            log "${YELLOW}Container is not active (status: $CONTAINER_STATUS). Checking logs...${NC}"
            docker logs myp-frontend --tail 20

            # If the container was stopped, restart it
            if [ "$CONTAINER_STATUS" = "exited" ] || [ "$CONTAINER_STATUS" = "created" ]; then
                log "${YELLOW}Trying to restart the container...${NC}"
                docker start myp-frontend
                sleep 10
            fi
        fi

        if [ $ATTEMPT -eq $MAX_ATTEMPTS ]; then
            log "${YELLOW}Server is still not reachable. This is normal on a first install.${NC}"
            log "${GREEN}The container is installed and should work correctly after a system reboot.${NC}"
            log "${GREEN}On first start, the database migration and the build can take longer.${NC}"
            log "Check the container status later with: docker logs myp-frontend"
        else
            log "Server not reachable yet. Waiting 10 seconds..."
            sleep 10
        fi
    fi
    ATTEMPT=$((ATTEMPT+1))
done

# Make sure directory and file permissions are set correctly
log "${YELLOW}Preparing database...${NC}"
mkdir -p db
touch db/sqlite.db
chmod 666 db/sqlite.db
log "${GREEN}Database prepared${NC}"

# Run the SQLite rebuild inside the container if needed
# (pass the env var via -e: docker exec does not run the command through a shell,
# so a VAR=value prefix would be treated as the executable name)
log "${YELLOW}Running SQLite rebuild inside the container...${NC}"
docker exec -e npm_config_build_from_source=true myp-frontend pnpm rebuild better-sqlite3 || {
    log "${YELLOW}Rebuild inside the running container is not possible. It will run automatically on the next start.${NC}"
}

# Check whether the database migration has run
log "${YELLOW}Checking database migration...${NC}"
log "${YELLOW}Note: The migration runs automatically on the first start after the system reboot${NC}"

if [ -f "db/sqlite.db" ]; then
    log "${GREEN}Database exists${NC}"

    # Set permissions
    chmod 666 db/sqlite.db

    # Check database size
    DB_SIZE=$(du -b db/sqlite.db 2>/dev/null | cut -f1 || echo "0")
    if [ "$DB_SIZE" -gt 1000 ]; then
        log "${GREEN}Database appears to be initialized (size: $DB_SIZE bytes)${NC}"
    else
        log "${YELLOW}Database is empty or very small. The migration will run on the first start.${NC}"
    fi
else
    log "${YELLOW}Could not find the database file. It will be created automatically on reboot.${NC}"
fi

log "${GREEN}=== Installation complete ===${NC}"
log "${YELLOW}IMPORTANT: After the first installation a system reboot is required${NC}"
log "${YELLOW}After that, the frontend is reachable at http://localhost:3000${NC}"
log "View the logs: docker logs -f myp-frontend"

# Use the correct Docker Compose version for the hint
if [ "${DOCKER_COMPOSE_V2:-false}" = true ]; then
    log "Stop the frontend: docker compose -f $FRONTEND_DIR/docker-compose.yml down"
else
    log "Stop the frontend: docker-compose -f $FRONTEND_DIR/docker-compose.yml down"
fi
536
log.txt
@ -1,536 +0,0 @@
⨯ Error: Could not locate the bindings file. Tried:
 → /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/build/better_sqlite3.node
 → /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/build/Debug/better_sqlite3.node
 → /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/build/Release/better_sqlite3.node
 → /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/out/Debug/better_sqlite3.node
 → /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/Debug/better_sqlite3.node
 → /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/out/Release/better_sqlite3.node
 → /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/Release/better_sqlite3.node
 → /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/build/default/better_sqlite3.node
 → /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/compiled/20.19.0/linux/arm64/better_sqlite3.node
 → /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/addon-build/release/install-root/better_sqlite3.node
 → /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/addon-build/debug/install-root/better_sqlite3.node
 → /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/addon-build/default/install-root/better_sqlite3.node
 → /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/lib/binding/node-v115-linux-arm64/better_sqlite3.node
    at bindings (/app/node_modules/.pnpm/bindings@1.5.0/node_modules/bindings/bindings.js:126:9)
    at new Database (/app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/lib/database.js:48:64)
    at eval (webpack-internal:///(rsc)/./src/server/db/index.ts:14:16)
    at (rsc)/./src/server/db/index.ts (/app/.next/server/app/page.js:1001:1)
    at __webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
    at eval (webpack-internal:///(rsc)/./src/server/auth/index.ts:6:68)
    at (rsc)/./src/server/auth/index.ts (/app/.next/server/app/page.js:957:1)
    at __webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
    at eval (webpack-internal:///(rsc)/./src/components/header/index.tsx:12:70)
    at (rsc)/./src/components/header/index.tsx (/app/.next/server/app/page.js:697:1)
    at __webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
    at eval (webpack-internal:///(rsc)/./src/app/layout.tsx:10:76)
    at (rsc)/./src/app/layout.tsx (/app/.next/server/app/page.js:594:1)
    at Function.__webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
    at async e9 (/app/node_modules/.pnpm/next@14.2.3_react-dom@18.3.1_react@18.3.1__react@18.3.1/node_modules/next/dist/compiled/next-server/app-page.runtime.dev.js:35:396515)
    at async tb (/app/node_modules/.pnpm/next@14.2.3_react-dom@18.3.1_react@18.3.1__react@18.3.1/node_modules/next/dist/compiled/next-server/app-page.runtime.dev.js:35:400212)
    at async tS (/app/node_modules/.pnpm/next@14.2.3_react-dom@18.3.1_react@18.3.1__react@18.3.1/node_modules/next/dist/compiled/next-server/app-page.runtime.dev.js:35:400773)
    at async tR (/app/node_modules/.pnpm/next@14.2.3_react-dom@18.3.1_react@18.3.1__react@18.3.1/node_modules/next/dist/compiled/next-server/app-page.runtime.dev.js:36:2130)
    at async /app/node_modules/.pnpm/next@14.2.3_react-dom@18.3.1_react@18.3.1__react@18.3.1/node_modules/next/dist/compiled/next-server/app-page.runtime.dev.js:36:2722 {
  digest: '4214325463',
  page: '/'
}
GET / 500 in 40ms
⨯ Error: Could not locate the bindings file. Tried:
    at eval (./src/server/db/index.ts:14:16)
    at (rsc)/./src/server/db/index.ts (/app/.next/server/app/page.js:1001:1)
    at __webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
    at eval (./src/server/auth/index.ts:6:68)
    at (rsc)/./src/server/auth/index.ts (/app/.next/server/app/page.js:957:1)
    at __webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
    at eval (./src/components/header/index.tsx:12:70)
    at (rsc)/./src/components/header/index.tsx (/app/.next/server/app/page.js:697:1)
    at __webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
    at eval (./src/app/layout.tsx:10:76)
    at (rsc)/./src/app/layout.tsx (/app/.next/server/app/page.js:594:1)
    at Function.__webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
  digest: "3425251174"
⨯ Error: Could not locate the bindings file. Tried:
    at eval (./src/server/db/index.ts:14:16)
    at (rsc)/./src/server/db/index.ts (/app/.next/server/app/page.js:1001:1)
    at __webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
    at eval (./src/server/auth/index.ts:6:68)
    at (rsc)/./src/server/auth/index.ts (/app/.next/server/app/page.js:957:1)
    at __webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
    at eval (./src/components/header/index.tsx:12:70)
    at (rsc)/./src/components/header/index.tsx (/app/.next/server/app/page.js:697:1)
    at __webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
    at eval (./src/app/layout.tsx:10:76)
    at (rsc)/./src/app/layout.tsx (/app/.next/server/app/page.js:594:1)
    at Function.__webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
  digest: "4214325463"
⨯ Error: Could not locate the bindings file. Tried:
    at eval (./src/server/db/index.ts:14:16)
    at (rsc)/./src/server/db/index.ts (/app/.next/server/app/page.js:1001:1)
    at __webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
    at eval (./src/server/auth/index.ts:6:68)
    at (rsc)/./src/server/auth/index.ts (/app/.next/server/app/page.js:957:1)
    at __webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
    at eval (./src/components/header/index.tsx:12:70)
    at (rsc)/./src/components/header/index.tsx (/app/.next/server/app/page.js:697:1)
    at __webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
    at eval (./src/app/layout.tsx:10:76)
    at (rsc)/./src/app/layout.tsx (/app/.next/server/app/page.js:594:1)
    at Function.__webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
  digest: "3020338880"
⨯ Error: Could not locate the bindings file. Tried:
|
||||
at eval (./src/server/db/index.ts:14:16)
|
||||
at (rsc)/./src/server/db/index.ts (/app/.next/server/app/page.js:1001:1)
|
||||
at __webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
|
||||
at eval (./src/server/auth/index.ts:6:68)
|
||||
at (rsc)/./src/server/auth/index.ts (/app/.next/server/app/page.js:957:1)
|
||||
at __webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
|
||||
at eval (./src/components/header/index.tsx:12:70)
|
||||
at (rsc)/./src/components/header/index.tsx (/app/.next/server/app/page.js:697:1)
|
||||
at __webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
|
||||
at eval (./src/app/layout.tsx:10:76)
|
||||
at (rsc)/./src/app/layout.tsx (/app/.next/server/app/page.js:594:1)
|
||||
at Function.__webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
|
||||
digest: "3020338880"
|
||||
Error: Could not locate the bindings file. Tried:
|
||||
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/build/better_sqlite3.node
|
||||
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/build/Debug/better_sqlite3.node
|
||||
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/build/Release/better_sqlite3.node
|
||||
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/out/Debug/better_sqlite3.node
|
||||
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/Debug/better_sqlite3.node
|
||||
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/out/Release/better_sqlite3.node
|
||||
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/Release/better_sqlite3.node
|
||||
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/build/default/better_sqlite3.node
|
||||
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/compiled/20.19.0/linux/arm64/better_sqlite3.node
|
||||
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/addon-build/release/install-root/better_sqlite3.node
|
||||
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/addon-build/debug/install-root/better_sqlite3.node
|
||||
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/addon-build/default/install-root/better_sqlite3.node
|
||||
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/lib/binding/node-v115-linux-arm64/better_sqlite3.node
|
||||
at bindings (/app/node_modules/.pnpm/bindings@1.5.0/node_modules/bindings/bindings.js:126:9)
|
||||
at new Database (/app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/lib/database.js:48:64)
|
||||
at eval (webpack-internal:///(rsc)/./src/server/db/index.ts:14:16)
|
||||
at (rsc)/./src/server/db/index.ts (/app/.next/server/app/page.js:1001:1)
|
||||
at __webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
|
||||
at eval (webpack-internal:///(rsc)/./src/server/auth/index.ts:6:68)
|
||||
at (rsc)/./src/server/auth/index.ts (/app/.next/server/app/page.js:957:1)
|
||||
at __webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
|
||||
at eval (webpack-internal:///(rsc)/./src/components/header/index.tsx:12:70)
|
||||
at (rsc)/./src/components/header/index.tsx (/app/.next/server/app/page.js:697:1)
|
||||
at __webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
|
||||
at eval (webpack-internal:///(rsc)/./src/app/layout.tsx:10:76)
|
||||
at (rsc)/./src/app/layout.tsx (/app/.next/server/app/page.js:594:1)
|
||||
at Function.__webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
|
||||
at async e9 (/app/node_modules/.pnpm/next@14.2.3_react-dom@18.3.1_react@18.3.1__react@18.3.1/node_modules/next/dist/compiled/next-server/app-page.runtime.dev.js:35:396515)
|
||||
at async tb (/app/node_modules/.pnpm/next@14.2.3_react-dom@18.3.1_react@18.3.1__react@18.3.1/node_modules/next/dist/compiled/next-server/app-page.runtime.dev.js:35:400212)
|
||||
at async tS (/app/node_modules/.pnpm/next@14.2.3_react-dom@18.3.1_react@18.3.1__react@18.3.1/node_modules/next/dist/compiled/next-server/app-page.runtime.dev.js:35:400773)
|
||||
at async tR (/app/node_modules/.pnpm/next@14.2.3_react-dom@18.3.1_react@18.3.1__react@18.3.1/node_modules/next/dist/compiled/next-server/app-page.runtime.dev.js:36:2130)
|
||||
at async /app/node_modules/.pnpm/next@14.2.3_react-dom@18.3.1_react@18.3.1__react@18.3.1/node_modules/next/dist/compiled/next-server/app-page.runtime.dev.js:36:2722 {
|
||||
digest: '4214325463'
|
||||
}
|
||||
⨯ Error: Could not locate the bindings file. Tried:
|
||||
at eval (./src/server/db/index.ts:14:16)
|
||||
at (rsc)/./src/server/db/index.ts (/app/.next/server/app/page.js:1001:1)
|
||||
at __webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
|
||||
at eval (./src/server/auth/index.ts:6:68)
|
||||
at (rsc)/./src/server/auth/index.ts (/app/.next/server/app/page.js:957:1)
|
||||
at __webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
|
||||
at eval (./src/components/header/index.tsx:12:70)
|
||||
at (rsc)/./src/components/header/index.tsx (/app/.next/server/app/page.js:697:1)
|
||||
at __webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
|
||||
at eval (./src/app/layout.tsx:10:76)
|
||||
at (rsc)/./src/app/layout.tsx (/app/.next/server/app/page.js:594:1)
|
||||
at Function.__webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
|
||||
digest: "4214325463"
|
||||
⨯ Error: Could not locate the bindings file. Tried:
|
||||
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/build/better_sqlite3.node
|
||||
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/build/Debug/better_sqlite3.node
|
||||
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/build/Release/better_sqlite3.node
|
||||
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/out/Debug/better_sqlite3.node
|
||||
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/Debug/better_sqlite3.node
|
||||
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/out/Release/better_sqlite3.node
|
||||
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/Release/better_sqlite3.node
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/build/default/better_sqlite3.node
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/compiled/20.19.0/linux/arm64/better_sqlite3.node
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/addon-build/release/install-root/better_sqlite3.node
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/addon-build/debug/install-root/better_sqlite3.node
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/addon-build/default/install-root/better_sqlite3.node
→ /app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/lib/binding/node-v115-linux-arm64/better_sqlite3.node
    at bindings (/app/node_modules/.pnpm/bindings@1.5.0/node_modules/bindings/bindings.js:126:9)
    at new Database (/app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/lib/database.js:48:64)
    at eval (webpack-internal:///(rsc)/./src/server/db/index.ts:14:16)
    at (rsc)/./src/server/db/index.ts (/app/.next/server/app/page.js:1001:1)
    at __webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
    at eval (webpack-internal:///(rsc)/./src/server/auth/index.ts:6:68)
    at (rsc)/./src/server/auth/index.ts (/app/.next/server/app/page.js:957:1)
    at __webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
    at eval (webpack-internal:///(rsc)/./src/components/header/index.tsx:12:70)
    at (rsc)/./src/components/header/index.tsx (/app/.next/server/app/page.js:697:1)
    at __webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
    at eval (webpack-internal:///(rsc)/./src/app/layout.tsx:10:76)
    at (rsc)/./src/app/layout.tsx (/app/.next/server/app/page.js:594:1)
    at Function.__webpack_require__ (/app/.next/server/webpack-runtime.js:33:42)
    at async e9 (/app/node_modules/.pnpm/next@14.2.3_react-dom@18.3.1_react@18.3.1__react@18.3.1/node_modules/next/dist/compiled/next-server/app-page.runtime.dev.js:35:396515)
    at async tb (/app/node_modules/.pnpm/next@14.2.3_react-dom@18.3.1_react@18.3.1__react@18.3.1/node_modules/next/dist/compiled/next-server/app-page.runtime.dev.js:35:400212)
    at async tS (/app/node_modules/.pnpm/next@14.2.3_react-dom@18.3.1_react@18.3.1__react@18.3.1/node_modules/next/dist/compiled/next-server/app-page.runtime.dev.js:35:400773)
    at async tR (/app/node_modules/.pnpm/next@14.2.3_react-dom@18.3.1_react@18.3.1__react@18.3.1/node_modules/next/dist/compiled/next-server/app-page.runtime.dev.js:36:2130)
    at async /app/node_modules/.pnpm/next@14.2.3_react-dom@18.3.1_react@18.3.1__react@18.3.1/node_modules/next/dist/compiled/next-server/app-page.runtime.dev.js:36:2722 {
  digest: '4214325463',
  page: '/'
}
GET / 500 in 37ms
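The trace above originates in the `bindings` helper, which probes a fixed list of candidate paths for the compiled addon and throws when none of them exists. A minimal, self-contained sketch of that probe-first-match idea (simplified and illustrative, not bindings' actual code):

```shell
# Sketch of bindings-style resolution: try each candidate path in order
# and keep the first compiled addon file that exists. A temp directory
# with one fake addon file stands in for the real node_modules tree.
root=$(mktemp -d)
mkdir -p "$root/build/Release"
touch "$root/build/Release/better_sqlite3.node"   # pretend the addon was compiled

found=""
for candidate in build/better_sqlite3.node \
                 build/Debug/better_sqlite3.node \
                 build/Release/better_sqlite3.node; do
  if [ -f "$root/$candidate" ]; then
    found="$root/$candidate"
    break
  fi
done
echo "${found:-not found}"
```

When every candidate is missing, as in this log, the usual remedy is rebuilding the addon for the container's actual platform (e.g. `pnpm rebuild better-sqlite3`), which is what the startup script further below attempts.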
27 packages/reservation-platform/.dockerignore Normal file
@ -0,0 +1,27 @@
# Build and utility assets
docker/
scripts/

# Ignore node_modules as they will be installed in the container
node_modules

# Ignore build artifacts
.next

# Ignore runtime data
db/

# Ignore local configuration files
.env
.env.example

# Ignore version control files
.git
.gitignore

# Ignore IDE/editor specific files
*.log
*.tmp
*.DS_Store
.vscode/
.idea/
@ -1,10 +0,0 @@
# Basic Server Configuration
RUNTIME_ENVIRONMENT=prod
DB_PATH=db/sqlite.db

# OAuth Configuration (please adjust)
OAUTH_CLIENT_ID=client_id
OAUTH_CLIENT_SECRET=client_secret

# Backend API URL (IP address or hostname of the backend server)
NEXT_PUBLIC_API_URL=http://localhost:5000
@ -1,7 +1,3 @@
# Basic Server Configuration
RUNTIME_ENVIRONMENT=dev
DB_PATH=db/sqlite.db

# OAuth Configuration
OAUTH_CLIENT_ID=client_id
OAUTH_CLIENT_SECRET=client_secret
5 packages/reservation-platform/.gitignore vendored
@ -1,7 +1,10 @@
# See https://help.github.com/articles/ignoring-files/ for more about ignoring files.

# db folder
/db
db/

# Env file
.env

# dependencies
@ -1,88 +1,34 @@
FROM node:alpine
FROM node:20-bookworm-slim

WORKDIR /app
# Create application directory
RUN mkdir -p /usr/src/app

# Install system dependencies for SQLite and native modules
RUN apk add --no-cache python3 build-base g++ make sqlite sqlite-dev gcc musl-dev git libffi-dev openssl-dev cmake
# Set environment variables
ENV PORT=3000
ENV NEXT_TELEMETRY_DISABLED=1

WORKDIR /usr/src/app

# Copy package.json and pnpm-lock.yaml
COPY package.json /usr/src/app
COPY pnpm-lock.yaml /usr/src/app

# Install pnpm
RUN npm install -g pnpm
RUN corepack enable pnpm

# Copy package files
COPY package.json pnpm-lock.yaml ./
# Install dependencies
RUN pnpm install

# Install dependencies with native bindings build approval, ensuring to build from source for all platforms
ENV CFLAGS="-fPIC" \
    LDFLAGS="-fPIC" \
    CXXFLAGS="-fPIC" \
    npm_config_build_from_source=true \
    npm_config_sqlite=/usr/local \
    npm_config_sqlite_libname=sqlite3
# Copy the rest of the application code
COPY . /usr/src/app

# Run the installation with comprehensive flags for native bindings
RUN pnpm install --unsafe-perm --no-optional --frozen-lockfile
# Initialize the database if it does not already exist
RUN pnpm run db

# Note: rebuilding better-sqlite3 causes errors with Node 23.10
# npx ships with Node.js - no additional install needed
# Build the application
RUN pnpm run build

# Install tsx for running TypeScript files directly
RUN pnpm add -D tsx

# Copy source code
COPY . .

# Create database directory
RUN mkdir -p db/

# Build the Next.js application
RUN pnpm build || echo "Generate schema failed, but continuing..."

# Expose the port
EXPOSE 3000
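To run the image this Dockerfile produces, a compose service along these lines could be used (the service name, host port, and volume mapping are assumptions for illustration, not taken from the repository):

```yaml
services:
  frontend:
    build: .
    ports:
      - "3000:3000"        # matches ENV PORT=3000 / EXPOSE 3000 above
    volumes:
      - ./db:/app/db       # persist the SQLite file outside the container
    environment:
      - RUNTIME_ENVIRONMENT=prod
```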
# Startup script with robust JSON fallback approach
RUN echo '#!/bin/sh' > /app/startup.sh && \
    echo 'set -e' >> /app/startup.sh && \
    echo 'mkdir -p /app/db' >> /app/startup.sh && \
    echo 'echo "Starting application..."' >> /app/startup.sh && \
    echo 'echo "Configuring DB directory..."' >> /app/startup.sh && \
    echo 'DB_FILE="/app/db/sqlite.db"' >> /app/startup.sh && \
    echo 'DB_JSON="/app/db/db.json"' >> /app/startup.sh && \
    echo 'if [ ! -f "$DB_FILE" ]; then' >> /app/startup.sh && \
    echo '  echo "Creating empty database file..."' >> /app/startup.sh && \
    echo '  touch "$DB_FILE"' >> /app/startup.sh && \
    echo 'fi' >> /app/startup.sh && \
    echo 'if [ ! -f "$DB_JSON" ]; then' >> /app/startup.sh && \
    echo '  echo "Creating empty JSON DB file (fallback)..."' >> /app/startup.sh && \
    echo '  echo "{}" > "$DB_JSON"' >> /app/startup.sh && \
    echo 'fi' >> /app/startup.sh && \
    echo 'chmod 666 "$DB_FILE"' >> /app/startup.sh && \
    echo 'chmod 666 "$DB_JSON"' >> /app/startup.sh && \
    echo 'chmod -R 777 /app/db' >> /app/startup.sh && \
    echo 'export DB_PATH=$DB_FILE' >> /app/startup.sh && \
    echo 'export DB_JSON_PATH=$DB_JSON' >> /app/startup.sh && \
    echo 'echo "Using database at $DB_PATH"' >> /app/startup.sh && \
    echo 'echo "JSON fallback at $DB_JSON_PATH"' >> /app/startup.sh && \
    echo '' >> /app/startup.sh && \
    echo '# Try to rebuild better-sqlite3 for current platform, but continue if it fails' >> /app/startup.sh && \
    echo 'if [ ! -d "/app/node_modules/.pnpm/better-sqlite3@9.6.0/node_modules/better-sqlite3/build" ]; then' >> /app/startup.sh && \
    echo '  echo "SQLite bindings not found, trying to build them..."' >> /app/startup.sh && \
    echo '  cd /app && CFLAGS="-fPIC" LDFLAGS="-fPIC" CXXFLAGS="-fPIC" npm_config_build_from_source=true npm_config_sqlite=/usr/local npm_config_sqlite_libname=sqlite3 pnpm rebuild better-sqlite3 || echo "SQLite rebuild failed - falling back to JSON"' >> /app/startup.sh && \
    echo 'fi' >> /app/startup.sh && \
    echo '' >> /app/startup.sh && \
    echo 'echo "Running database migration..."' >> /app/startup.sh && \
    echo 'NODE_ENV=production npx tsx ./src/server/db/migrate.ts || echo "SQLite migration failed - will be retried on restart"' >> /app/startup.sh && \
    echo 'echo "Migration finished"' >> /app/startup.sh && \
    echo '' >> /app/startup.sh && \
    echo 'echo "Starting Next.js application..."' >> /app/startup.sh && \
    echo 'if [ -d ".next" ]; then' >> /app/startup.sh && \
    echo '  NODE_OPTIONS="--no-warnings" pnpm start' >> /app/startup.sh && \
    echo 'else' >> /app/startup.sh && \
    echo '  echo "Build directory not found, running build..."' >> /app/startup.sh && \
    echo '  NODE_OPTIONS="--no-warnings" pnpm build || echo "Build failed - will be retried on restart"' >> /app/startup.sh && \
    echo '  NODE_OPTIONS="--no-warnings" pnpm start || NODE_OPTIONS="--no-warnings" pnpm dev' >> /app/startup.sh && \
    echo 'fi' >> /app/startup.sh && \
    chmod +x /app/startup.sh

# Start the application
CMD ["/app/startup.sh"]
CMD ["/bin/sh", "-c", "if [ ! -f ./db/sqlite.db ]; then pnpm db; fi && pnpm start"]
@ -1,217 +1,32 @@

utils/analytics/(scope).ts
deriver.ts
utils/sentinel.ts -> auth guard

# MYP - Manage Your Printer

MYP (Manage Your Printer) is a web application for reserving 3D printers.
It was developed as the final project of the Fachinformatiker apprenticeship (data and process analysis) for the technical vocational training of the Mercedes-Benz plant Berlin-Marienfelde.

---

## Deployment

### Prerequisites

- The network on the Raspberry Pi is configured
- Docker is installed

### Steps

1. Build the Docker containers (docker/build.sh)
2. Save the Docker containers (docker/save.sh caddy:2.8 myp-rp:latest)
3. Deploy the Docker containers to the Raspberry Pi (docker/deploy.sh)

## Developer information

### Raspberry Pi settings

Raspbian Lite is installed on the Raspberry Pi.
The project files are located under /srv/*.

### Credentials

---

Based on the requirements mentioned above, here are some additional columns you could add to your database:

For the printers table:

- total_print_jobs: the total number of print jobs a printer has executed.
- total_active_time: the total time the printer has been active (in minutes).
- total_error_jobs: the total number of print jobs aborted due to an error.
- last_maintenance_date: the date of the printer's last maintenance.

For the printJobs table:

- end_time: the time at which the print job finished.
- was_successful: a boolean indicating whether the print job completed successfully.
- error_code: a code identifying a specific error if the print job was aborted.

For the users table:

- total_print_jobs: the total number of print jobs a user has started.
- preferred_printer_id: the ID of the printer the user uses most often.
- last_login_date: the date of the user's last login.

These additional columns can support the statistical analyses and machine-learning predictions described below. Note that you will likely need extra application logic to update them when the corresponding events occur (e.g. a print job starts or finishes, a user logs in).
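
As a sketch, the suggested columns can be mirrored in TypeScript row types together with the kind of update logic the paragraph above calls for. The type and field names here are illustrative assumptions, not the project's actual Drizzle schema:

```typescript
// Hypothetical row types reflecting the suggested extra columns.
// Field names mirror the text; the real schema may differ.
interface PrinterRow {
  id: number;
  name: string;
  totalPrintJobs: number;            // total_print_jobs
  totalActiveTime: number;           // total_active_time, in minutes
  totalErrorJobs: number;            // total_error_jobs
  lastMaintenanceDate: Date | null;  // last_maintenance_date
}

interface PrintJobRow {
  id: number;
  printerId: number;
  startTime: Date;
  endTime: Date | null;              // end_time
  wasSuccessful: boolean;            // was_successful
  errorCode: string | null;          // error_code
}

// Example of the extra application logic: update the printer's
// counters when a job finishes.
function onJobFinished(printer: PrinterRow, job: PrintJobRow): void {
  printer.totalPrintJobs += 1;
  if (!job.wasSuccessful) printer.totalErrorJobs += 1;
  if (job.endTime) {
    printer.totalActiveTime +=
      (job.endTime.getTime() - job.startTime.getTime()) / 60000;
  }
}
```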

---

Based on your database schema, which contains information about printers, print jobs, and users, you could run a variety of statistical analyses and machine-learning predictions. Here are some concrete suggestions:

### Statistical analyses:

1. **Utilization analysis**: determine printer utilization by analyzing the number and duration of print jobs.
2. **Error analysis**: examine the frequency and causes of aborted print jobs to identify patterns.
3. **User behavior**: analyze user behavior, e.g. which printers are used most often or at what times most print jobs arrive.
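
A minimal sketch of the first two analyses, assuming a simplified job record (printer ID, duration, aborted flag) rather than the real schema:

```typescript
// Assumed, simplified job shape for illustration.
interface Job { printerId: number; durationMin: number; aborted: boolean; }

// Utilization: total printing minutes per printer.
function utilizationMinutes(jobs: Job[]): Map<number, number> {
  const m = new Map<number, number>();
  for (const j of jobs) m.set(j.printerId, (m.get(j.printerId) ?? 0) + j.durationMin);
  return m;
}

// Error analysis: share of aborted jobs for one printer.
function errorRate(jobs: Job[], printerId: number): number {
  const own = jobs.filter(j => j.printerId === printerId);
  if (own.length === 0) return 0;
  return own.filter(j => j.aborted).length / own.length;
}
```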

### Machine-learning predictions:

1. **Printer-utilization forecasting**: use time-series analysis to predict future utilization patterns of the printers.
2. **Anomaly detection**: apply machine learning to detect anomalies in printing behavior that may indicate potential problems.
3. **Recommendation system**: build a model that recommends printers to users based on their previous print jobs and preferences.
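
For anomaly detection, a plain z-score filter over job durations is often a sufficient baseline before reaching for heavier machine-learning tooling. This sketch assumes plain numeric durations as input:

```typescript
// Standardize each value against the sample mean and standard deviation.
function zScores(values: number[]): number[] {
  const mean = values.reduce((a, b) => a + b, 0) / values.length;
  const sd = Math.sqrt(
    values.reduce((a, v) => a + (v - mean) ** 2, 0) / values.length);
  return values.map(v => (sd === 0 ? 0 : (v - mean) / sd));
}

// Return the indices of values whose |z| exceeds the threshold.
function anomalies(values: number[], threshold = 2): number[] {
  return zScores(values)
    .map((z, i) => ({ z, i }))
    .filter(e => Math.abs(e.z) > threshold)
    .map(e => e.i);
}
```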

### Concrete implementation recommendations:

- **Prepare the data**: clean and transform your data for analysis. Remove duplicates, handle missing values, and convert categorical data into a format machine-learning algorithms can process.
- **Feature engineering**: create new features that could be useful for predictive models, such as the average print-job duration per user or the total number of print jobs per printer.
- **Model selection**: choose suitable models. ARIMA models can work well for time-series forecasts, while decision trees or random forests can classify user behavior.
- **Model training and validation**: train your models on one part of your data and validate them on another to make sure they generalize well and are not overfitted.
- **Result interpretation**: interpret your models' results and use them to make business decisions or to improve the user experience on your platform.

These suggestions depend on the quality and quantity of your data as well as the specific goals of your platform. Retrain the models regularly to maintain and improve prediction accuracy.

Source: conversation with Bing, 11.5.2024

---

https://github.com/drizzle-team/drizzle-orm/discussions/1480#discussioncomment-9363695

---

To develop a 3D-printer reservation platform and meet the stated criteria, I recommend the following steps:

### Analyze customer-specific requirements:

1. Conduct **stakeholder interviews** to understand needs and expectations.
2. Create **surveys** to collect feedback from potential users.
3. Hold **requirements workshops** to define requirements together with the stakeholders.
4. Develop **user stories** and **use cases** to make the requirements concrete.

### Plan the project implementation:

1. Define **project goals** clearly and align them with business objectives.
2. Perform **resource planning** to use staff, time, and budget efficiently.
3. Run a **risk analysis** to identify potential obstacles early.
4. Create a **milestone plan** to structure the key project phases.

### Identify, classify, and model data:

1. Identify the **data sources** relevant to the reservation platform.
2. Perform **data classification** to order the data by type and sensitivity.
3. Create **entity-relationship models** (ERM) to visualize the relationships between the data.

### Use mathematical prediction models and statistical methods:

1. Run **regression analyses** to predict future usage patterns.
2. Apply **cluster analyses** to identify and segment user groups.
3. Use **time-series analyses** to detect trends and seasonal fluctuations.

### Ensure data quality:

1. Implement **validation rules** to guarantee correct data entry.
2. Perform **data cleansing** regularly to remove duplicates and inconsistencies.
3. Ensure **data integrity** through referential-integrity checks.

### Prepare analysis results and identify optimization opportunities:

1. Build **dashboards** that present the most important metrics at a glance.
2. Generate **reports** that provide detailed insights into the usage data.
3. Derive **recommendations for action** to improve the platform continuously.

### Create project documentation that meets the requirements:

1. Define **documentation standards** to ensure consistency.
2. Use **version control** to make changes traceable.
3. Document **project progress** to keep track of the current state.

These recommendations are meant as a guide for implementing the stated criteria systematically and in a structured way in your final project.

Source: conversation with Bing, 11.5.2024

---

To further concretize the points on data identification, classification, and modeling and on the use of mathematical models and statistical methods, here are detailed recommendations:

### Identify data sources:

1. Take **inventory** of the current data: record all data already available in the company, such as customer information, transaction data, and device-usage data.
2. Check **external data sources**: examine whether and which external sources, such as material suppliers or maintenance providers, could be relevant.
3. **IoT sensor data**: consider integrating IoT devices that deliver real-time data on the state and usage of the 3D printers.

### Data classification:

1. Define **sensitivity levels**: determine which data is sensitive (e.g. personal data) and requires special protection.
2. Create **data categories**: organize the data into categories such as usage data, financial data, operational data, etc.
3. Define **access rights**: specify who may access which data to ensure data security.

### Entity-relationship models (ERM):

1. Identify the **data entities**: determine the core entities such as users, printers, reservations, and materials.
2. Define the **relationships**: specify how these entities relate to one another (e.g. one user can have several reservations).
3. Use **ERM tools**: use software such as Lucidchart or Microsoft Visio to visualize the ERMs.

### Regression analyses:

1. Collect **historical data**: use past usage data to identify patterns.
2. Choose **predictor variables**: identify factors that could influence usage, such as time of day, weekday, or material type.
3. Apply **regression models**: use linear or logistic regression to predict future usage patterns.
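
The regression step can be sketched with ordinary least squares over (day index, job count) pairs. This is a deliberately minimal stand-in for a full regression pipeline:

```typescript
// Ordinary least squares for a single predictor.
function linearFit(xs: number[], ys: number[]): { slope: number; intercept: number } {
  const n = xs.length;
  const mx = xs.reduce((a, b) => a + b, 0) / n;
  const my = ys.reduce((a, b) => a + b, 0) / n;
  let num = 0, den = 0;
  for (let i = 0; i < n; i++) {
    num += (xs[i] - mx) * (ys[i] - my);
    den += (xs[i] - mx) ** 2;
  }
  const slope = num / den;
  return { slope, intercept: my - slope * mx };
}

// Extrapolate the fitted line to a future day index.
function predict(fit: { slope: number; intercept: number }, x: number): number {
  return fit.intercept + fit.slope * x;
}
```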

### Cluster analyses:

1. **User segmentation**: group users by behavior, e.g. by usage frequency or preferred materials.
2. **K-means clustering**: use algorithms such as k-means to segment the users into meaningful clusters.
3. **Cluster validation**: check the quality of the clustering to make sure the segments are meaningful.
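
A one-dimensional k-means over a single usage metric (e.g. jobs per month) is enough to illustrate the segmentation idea. The initialization below simply spreads centroids over the sorted values; it is a sketch, not a production-grade k-means++:

```typescript
// 1-D k-means: returns a cluster label per input value.
function kmeans1d(values: number[], k: number, iters = 50): number[] {
  const sorted = [...values].sort((a, b) => a - b);
  // Initialize centroids spread across the sorted values.
  let centroids = Array.from({ length: k }, (_, i) =>
    sorted[Math.floor((i * (sorted.length - 1)) / Math.max(k - 1, 1))]);
  let labels = new Array(values.length).fill(0);
  for (let it = 0; it < iters; it++) {
    // Assignment step: nearest centroid.
    labels = values.map(v => {
      let best = 0;
      for (let c = 1; c < k; c++)
        if (Math.abs(v - centroids[c]) < Math.abs(v - centroids[best])) best = c;
      return best;
    });
    // Update step: centroid = mean of its members.
    centroids = centroids.map((c, ci) => {
      const members = values.filter((_, i) => labels[i] === ci);
      return members.length ? members.reduce((a, b) => a + b, 0) / members.length : c;
    });
  }
  return labels;
}
```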

### Time-series analyses:

1. Analyze **timestamped data**: examine data with timestamps to detect trends and patterns over time.
2. Account for **seasonal effects**: identify seasonal fluctuations in the usage of the 3D printers.
3. **ARIMA models**: use autoregressive integrated moving averages (ARIMA) to forecast future trends.
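
Before fitting a full ARIMA model, a per-weekday average already exposes the seasonal profile the list describes. This sketch assumes daily job counts with UTC dates:

```typescript
// Average jobs per weekday (index 0 = Sunday .. 6 = Saturday).
function weekdayProfile(counts: { date: Date; jobs: number }[]): number[] {
  const sums = new Array(7).fill(0);
  const n = new Array(7).fill(0);
  for (const c of counts) {
    const d = c.date.getUTCDay();
    sums[d] += c.jobs;
    n[d] += 1;
  }
  // Weekdays with no observations average to 0.
  return sums.map((s, i) => (n[i] ? s / n[i] : 0));
}
```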

These methods help you develop a deep understanding of the data, which is essential for a successful reservation platform. Keep in mind that the exact application of these techniques depends on the specific data and requirements of your project, and that you should be familiar with the basics of data analysis and statistical modeling to apply them effectively.

---

This is a [Next.js](https://nextjs.org/) project bootstrapped with [`create-next-app`](https://github.com/vercel/next.js/tree/canary/packages/create-next-app).

## Getting Started

First, run the development server:

```bash
npm run dev
# or
yarn dev
# or
pnpm dev
# or
bun dev
```

```
User: myp
Password: (known personally)
```

Open [http://localhost:3000](http://localhost:3000) with your browser to see the result.

You can start editing the page by modifying `app/page.tsx`. The page auto-updates as you edit the file.

This project uses [`next/font`](https://nextjs.org/docs/basic-features/font-optimization) to automatically optimize and load Inter, a custom Google Font.

## Learn More

To learn more about Next.js, take a look at the following resources:

- [Next.js Documentation](https://nextjs.org/docs) - learn about Next.js features and API.
- [Learn Next.js](https://nextjs.org/learn) - an interactive Next.js tutorial.

You can check out [the Next.js GitHub repository](https://github.com/vercel/next.js/) - your feedback and contributions are welcome!

## Deploy on Vercel

The easiest way to deploy your Next.js app is to use the [Vercel Platform](https://vercel.com/new?utm_medium=default-template&filter=next.js&utm_source=create-next-app&utm_campaign=create-next-app-readme) from the creators of Next.js.

Check out our [Next.js deployment documentation](https://nextjs.org/docs/deployment) for more details.

@ -1,23 +0,0 @@
version: '3'

services:
  frontend:
    build:
      context: .
      dockerfile: Dockerfile
    container_name: myp-frontend
    network_mode: host
    environment:
      - RUNTIME_ENVIRONMENT=${RUNTIME_ENVIRONMENT:-prod}
      - OAUTH_CLIENT_ID=${OAUTH_CLIENT_ID:-client_id}
      - OAUTH_CLIENT_SECRET=${OAUTH_CLIENT_SECRET:-client_secret}
      - NEXT_PUBLIC_API_URL=${NEXT_PUBLIC_API_URL:-http://localhost:5000}
    volumes:
      - ./db:/app/db
    restart: unless-stopped
    healthcheck:
      test: ["CMD", "wget", "--spider", "http://localhost:3000"]
      interval: 30s
      timeout: 10s
      retries: 3
      start_period: 40s
31
packages/reservation-platform/docker/build.sh
Normal file
@ -0,0 +1,31 @@
#!/bin/bash

# Define image name
MYP_RP_IMAGE_NAME="myp-rp"

# Function to build Docker image
build_image() {
  local image_name=$1
  local dockerfile=$2
  local platform=$3

  echo "Building $image_name Docker image for $platform..."

  docker buildx build --platform $platform -t ${image_name}:latest -f $dockerfile --load .
  if [ $? -eq 0 ]; then
    echo "$image_name Docker image built successfully"
  else
    echo "Error occurred while building $image_name Docker image"
    exit 1
  fi
}

# Create and use a builder instance (if not already created)
BUILDER_NAME="myp-rp-arm64-builder"
docker buildx create --name $BUILDER_NAME --use || docker buildx use $BUILDER_NAME

# Build myp-rp image
build_image "$MYP_RP_IMAGE_NAME" "$PWD/Dockerfile" "linux/arm64"

# Remove the builder instance
docker buildx rm $BUILDER_NAME
8
packages/reservation-platform/docker/caddy/Caddyfile
Normal file
@ -0,0 +1,8 @@
{
  debug
}

m040tbaraspi001.de040.corpintra.net, m040tbaraspi001.de040.corpinter.net {
  reverse_proxy myp-rp:3000
  tls internal
}
19
packages/reservation-platform/docker/compose.yml
Normal file
@ -0,0 +1,19 @@
services:
  caddy:
    image: caddy:2.8
    container_name: caddy
    restart: unless-stopped
    ports:
      - 80:80
      - 443:443
    volumes:
      - ./caddy/data:/data
      - ./caddy/config:/config
      - ./caddy/Caddyfile:/etc/caddy/Caddyfile:ro
  myp-rp:
    image: myp-rp:latest
    container_name: myp-rp
    env_file: "/srv/myp-env/github.env"
    volumes:
      - /srv/MYP-DB:/usr/src/app/db
    restart: unless-stopped
36
packages/reservation-platform/docker/deploy.sh
Normal file
@ -0,0 +1,36 @@
#!/bin/bash

# Directory containing the Docker images
IMAGE_DIR="docker/images"

# Load all Docker images from the tar.xz files in the IMAGE_DIR
echo "Loading Docker images from $IMAGE_DIR..."

for image_file in "$IMAGE_DIR"/*.tar.xz; do
  if [ -f "$image_file" ]; then
    echo "Loading Docker image from $image_file..."
    docker load -i "$image_file"

    # Check if the image loading was successful
    if [ $? -ne 0 ]; then
      echo "Error occurred while loading Docker image from $image_file"
      exit 1
    fi
  else
    echo "No Docker image tar.xz files found in $IMAGE_DIR."
  fi
done

# Execute docker compose
echo "Running docker compose..."
docker compose -f "docker/compose.yml" up -d

# Check if the operation was successful
if [ $? -eq 0 ]; then
  echo "Docker compose executed successfully"
else
  echo "Error occurred while executing docker compose"
  exit 1
fi

echo "Deployment completed successfully"
2
packages/reservation-platform/docker/images/.gitattributes
vendored
Normal file
@ -0,0 +1,2 @@
caddy_2.8.tar.xz filter=lfs diff=lfs merge=lfs -text
myp-rp_latest.tar.xz filter=lfs diff=lfs merge=lfs -text
BIN
packages/reservation-platform/docker/images/caddy_2.8.tar.xz
(Stored with Git LFS)
Normal file
Binary file not shown.
BIN
packages/reservation-platform/docker/images/myp-rp_latest.tar.xz
(Stored with Git LFS)
Normal file
Binary file not shown.
68
packages/reservation-platform/docker/save.sh
Normal file
@ -0,0 +1,68 @@
#!/bin/bash

# Get image name as argument
IMAGE_NAME=$1
PLATFORM="linux/arm64"

# Define paths
IMAGE_DIR="docker/images"
IMAGE_FILE="${IMAGE_DIR}/${IMAGE_NAME//[:\/]/_}.tar"
COMPRESSED_FILE="${IMAGE_FILE}.xz"

# Function to pull the image
pull_image() {
  local image=$1
  if [[ $image == arm64v8/* ]]; then
    echo "Pulling image $image without platform specification..."
    docker pull $image
  else
    echo "Pulling image $image for platform $PLATFORM..."
    docker pull --platform $PLATFORM $image
  fi
  return $?
}

# Pull the image if it is not available locally
if ! docker image inspect ${IMAGE_NAME} &>/dev/null; then
  if pull_image ${IMAGE_NAME}; then
    echo "Image $IMAGE_NAME pulled successfully."
  else
    echo "Error occurred while pulling $IMAGE_NAME for platform $PLATFORM"
    echo "Trying to pull $IMAGE_NAME without platform specification..."

    # Attempt to pull again without platform
    if pull_image ${IMAGE_NAME}; then
      echo "Image $IMAGE_NAME pulled successfully without platform."
    else
      echo "Error occurred while pulling $IMAGE_NAME without platform."
      echo "Trying to pull arm64v8/${IMAGE_NAME} instead..."

      # Construct new image name
      NEW_IMAGE_NAME="arm64v8/${IMAGE_NAME}"
      if pull_image ${NEW_IMAGE_NAME}; then
        echo "Image $NEW_IMAGE_NAME pulled successfully."
        IMAGE_NAME=${NEW_IMAGE_NAME} # Update IMAGE_NAME to use the new one
      else
        echo "Error occurred while pulling $NEW_IMAGE_NAME"
        exit 1
      fi
    fi
  fi
else
  echo "Image $IMAGE_NAME found locally. Skipping pull."
fi

# Save the Docker image
echo "Saving $IMAGE_NAME Docker image..."
docker save ${IMAGE_NAME} > $IMAGE_FILE

# Compress the Docker image (overwriting if file exists)
echo "Compressing $IMAGE_FILE..."
xz -z --force $IMAGE_FILE

if [ $? -eq 0 ]; then
  echo "$IMAGE_NAME Docker image saved and compressed successfully as $COMPRESSED_FILE"
else
  echo "Error occurred while compressing $IMAGE_NAME Docker image"
  exit 1
fi
116
packages/reservation-platform/docs/Admin-Dashboard.md
Normal file
@ -0,0 +1,116 @@
# **Detailed documentation of the admin dashboard**

This section describes the features and usage of the admin dashboard in more detail, including the individual modules, the charts, and their purpose.

---

## **1. Overview of the admin dashboard**

The admin dashboard is the central administration area for administrators. It provides features such as managing printers, users, and print jobs, along with detailed statistics and analyses.

### **1.1. Navigation**
The dashboard contains a sidebar menu with the following main areas:
1. **Dashboard:** overview of the most important statistics.
2. **Users:** management of user accounts.
3. **Printers:** adding, editing, and managing printers.
4. **Print jobs:** insight into all print jobs and their status.
5. **Settings:** application configuration.
6. **About MYP:** information about the project and the developer.

The sidebar is defined in `src/app/admin/admin-sidebar.tsx` and highlighted dynamically based on the current page.

---

## **2. Features of the admin dashboard**

### **2.1. User management**
- **File:** `src/app/admin/users/page.tsx`
- **Description:** allows viewing, editing, and deleting user accounts.
- **Features:**
  - Display a list of all registered users.
  - Edit user roles (e.g. "admin" or "user").
  - Deactivate or delete user accounts.

---

### **2.2. Printer management**
- **File:** `src/app/admin/printers/page.tsx`
- **Description:** management of printers, including adding, editing, and deactivating.
- **Features:**
  - Printer status display (active/inactive).
  - Add new printers with a name and description.
  - Delete or edit existing printers.

---

### **2.3. Print jobs**
- **File:** `src/app/admin/jobs/page.tsx`
- **Description:** overview of all print jobs, including details such as start time, duration, and status.
- **Features:**
  - Filter by users, printers, or status (completed, aborted).
  - Display abort reasons and error messages.
  - Sort by time or user.

---

### **2.4. Settings**
- **File:** `src/app/admin/settings/page.tsx`
- **Description:** configuration page for the application.
- **Features:**
  - Change global settings such as default times or error policies.
  - Download data (e.g. export of the print history).

---

## **3. Statistics and charts**

The admin dashboard contains interactive charts that visualize key statistics. Some of the central charts:

### **3.1. Abort reasons**
- **File:** `src/app/admin/charts/printer-error-chart.tsx`
- **Description:** shows the frequency of abort reasons for print jobs in a bar chart.
- **Benefit:** identifies common problems such as material shortages or clogged nozzles.

---

### **3.2. Error rates**
- **File:** `src/app/admin/charts/printer-error-rate.tsx`
- **Description:** shows the percentage error rate for each printer in a bar chart.
- **Benefit:** enables monitoring and identification of problematic printers.

---

### **3.3. Print volume**
- **File:** `src/app/admin/charts/printer-volume.tsx`
- **Description:** shows the print volume for today, this week, and this month.
- **Benefit:** compares printer output across different time periods.

---

### **3.4. Forecast usage**
- **File:** `src/app/admin/charts/printer-forecast.tsx`
- **Description:** an area chart shows the expected printer usage per weekday.
- **Benefit:** helps plan maintenance work or resource allocation.

---

### **3.5. Printer utilization**
- **File:** `src/app/admin/charts/printer-utilization.tsx`
- **Description:** shows a printer's current utilization as a percentage in a pie chart.
- **Benefit:** monitors utilization and identifies unused resources.

---

## **4. Role-based access control**

The admin dashboard is accessible only to users with the "admin" role. Unauthorized users are redirected to the home page. Access control follows this logic:
- **File:** `src/app/admin/layout.tsx`
- **Function:** `validateRequest` checks the current user's role.
- **Redirect:** if the role is insufficient, the user is redirected automatically:
```typescript
if (guard(user, IS_NOT, UserRole.ADMIN)) {
  redirect("/");
}
```
|
||||
|
||||
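The `guard(user, IS_NOT, UserRole.ADMIN)` call can be pictured as a small role-comparison helper. A minimal, hypothetical sketch is shown below; the names `guard`, `IS_NOT`, and `UserRole` are taken from the snippet, but the real implementation lives in the project's auth utilities and may support more operators:

```typescript
// Hypothetical sketch of the role guard used above; only the IS_NOT
// operator from the snippet is modeled here.
enum UserRole {
  GUEST = "guest",
  ADMIN = "admin",
}

const IS_NOT = "IS_NOT" as const;
type Operator = typeof IS_NOT;

interface SessionUser {
  role: UserRole;
}

// Returns true when the check fails, i.e. the caller should
// redirect the user away from the admin area.
function guard(user: SessionUser, _op: Operator, role: UserRole): boolean {
  return user.role !== role;
}

console.log(guard({ role: UserRole.GUEST }, IS_NOT, UserRole.ADMIN)); // true → redirect
console.log(guard({ role: UserRole.ADMIN }, IS_NOT, UserRole.ADMIN)); // false → allowed
```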
Next step: [=> API endpoints and their usage](./API.md)
79
packages/reservation-platform/docs/Architektur.md
Normal file
@ -0,0 +1,79 @@
# **Technical Architecture and Code Structure**

In this section I explain the architecture and structure of the MYP project as well as the functionality of its central components.

---

## **1. Technical Architecture**

### **1.1. Architecture Overview**
MYP is based on a modern web application architecture:
- **Frontend:** Built with React and Next.js. Provides the user interface.
- **Backend:** Uses Node.js and Drizzle ORM for database interaction and business logic.
- **Database:** SQLite for storing user data, print jobs, and printer configurations.
- **Containerization:** Docker is used to deploy the application in isolated containers.
- **Web server:** Caddy serves as a reverse proxy with HTTPS support.

### **1.2. Module Overview**
- **Data flow:** The application is strongly data-driven. API routes are used to exchange data between frontend and backend.
- **Role-based access:** A permission system lets administrators and regular users access different features.

---

## **2. Code Structure**

### **2.1. Directory Layout**
The file `repomix-output.txt` provides a structured overview of the project. Some important directories:

| **Directory**            | **Contents**                                                              |
|--------------------------|---------------------------------------------------------------------------|
| `src/app`                | Next.js pages and components for users and admins.                        |
| `src/components`         | Reusable UI components such as cards, charts, buttons, etc.               |
| `src/server`             | Backend logic, authentication, and database interactions.                 |
| `src/utils`              | Helper functions for analytics, validation, and database access.          |
| `drizzle`                | Database migration files and metadata.                                    |
| `docker`                 | Docker configuration and deployment scripts.                              |

---

### **2.2. Main Files**

#### **Frontend**
- **`src/app/page.tsx`:** The application's start page.
- **`src/app/admin/`:** Admin-specific pages, e.g. printer management or error statistics.
- **`src/components/ui/`:** UI components such as dialogs, forms, and tables.

#### **Backend**
- **`src/server/auth/`:** Authentication and user role management.
- **`src/server/actions/`:** Functions for interacting with print jobs and printers.
- **`src/utils/`:** Analysis and processing of print data (e.g. error rates and utilization).

#### **Database**
- **`drizzle/0000_overjoyed_strong_guy.sql`:** SQLite database schema with tables for printers, users, and print jobs.
- **`drizzle.meta/`:** Database migration metadata.

---

### **2.3. Database Schema**
The schema contains four main tables:
1. **`user`:** Stores user information, including roles and email addresses.
2. **`printer`:** Describes the printers, their status, and their properties.
3. **`printJob`:** Records print jobs, including start time, duration, and abort reason.
4. **`session`:** Manages user sessions and expiry times.
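The four tables above can be mirrored as plain TypeScript types. This is an illustrative sketch only; the authoritative definitions live in `src/server/db/schema.ts` as Drizzle tables, and the helper at the end is an assumed convenience, not project code:

```typescript
// Illustrative mirror of the documented schema; field names follow
// the tables described in this documentation, not the generated SQL.
interface User {
  id: string;
  github_id: number;
  name: string;
  displayName: string;
  email: string;
  role: string; // "admin" or "guest"
}

interface Printer {
  id: string;
  name: string;
  description: string;
  status: number; // 0 = inactive, 1 = active
}

interface PrintJob {
  id: string;
  printerId: string; // references Printer.id
  userId: string;    // references User.id
  startAt: number;   // Unix timestamp
  durationInMinutes: number;
  comments: string | null;
  aborted: number;   // 1 = aborted, 0 = completed
  abortReason: string | null;
}

interface Session {
  id: string;
  user_id: string;   // references User.id
  expires_at: number;
}

// Assumed helper matching the session semantics: a session is valid
// while expires_at lies in the future.
function isSessionValid(session: Session, now: number): boolean {
  return session.expires_at > now;
}

console.log(isSessionValid({ id: "s1", user_id: "u1", expires_at: 2000 }, 1000)); // true
```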
---

## **3. Key Functions**

### **3.1. Authentication**
The system uses OAuth for sign-in. User roles are stored in the `user` table and checked in the backend.

### **3.2. Statistics**
- **Error rate:** Calculates the frequency of aborts for each printer.
- **Utilization:** Percentage usage of the printers, based on scheduled and completed print jobs.
- **Forecasts:** Use historical data to predict future printer usage.

### **3.3. API Endpoints**
- **`src/app/api/printers/`:** Access to printer configuration data.
- **`src/app/api/job/[jobId]/`:** Management of individual print jobs.

Next step: [=> Database and analytics functions](./Datenbank.md)
150
packages/reservation-platform/docs/Bereitstellungsdetails .md
Normal file
@ -0,0 +1,150 @@
# **Deployment Details and Best Practices**

In this section I explain how the MYP project is deployed on a server, along with recommended practices for managing and optimizing the system.

---

## **1. Deployment Steps**

### **1.1. Prerequisites**
- **Server:** Raspberry Pi with Raspbian Lite installed.
- **Docker:** Docker and Docker Compose must be installed beforehand.
- **Network:** The server must be reachable via a static IP address or a DNS name.

### **1.2. Preparation**
#### **1.2.1. Building and saving the Docker images**
Run the following steps on the development machine:
1. **Build the images:**
   ```bash
   bash docker/build.sh
   ```
2. **Export and compress the images:**
   ```bash
   bash docker/save.sh <image-name>
   ```
   This saves the Docker images in the `docker/images/` directory.

#### **1.2.2. Transfer to the server**
Copy the generated `.tar.xz` files to the Raspberry Pi:
```bash
scp docker/images/*.tar.xz <username>@<server-ip>:/path/to/destination/
```

---

### **1.3. Loading the images on the server**
Log in to the server and load the Docker images:
```bash
docker load -i /path/to/destination/<image-name>.tar.xz
```

---

### **1.4. Starting the application**
Run the deployment script:
```bash
bash docker/deploy.sh
```
This script:
- Starts the Docker containers via `docker compose`.
- Connects the reverse proxy (Caddy) to the application.

The application should then be reachable at `http://<server-ip>` or the configured domain.

---

## **2. Best Practices**

### **2.1. Security**
1. **Protect environment variables:**
   - Make sure the `.env` file is never accidentally pushed to a public repository.
   - Use appropriate file permissions:
     ```bash
     chmod 600 .env
     ```
2. **Enable HTTPS:**
   - The Caddy web server supports HTTPS automatically. Make sure a valid domain is configured.

3. **Restrict access rights:**
   - Use user roles ("admin", "guest") to control access to critical functions.

---

### **2.2. Performance**
1. **Optimize Docker containers:**
   - Reduce the size of the Docker images by excluding unnecessary files via `.dockerignore`.

2. **Database maintenance:**
   - Back up the SQLite database regularly:
     ```bash
     cp db/sqlite.db /path/to/backup/location/
     ```
   - Optimize the database regularly:
     ```sql
     VACUUM;
     ```

3. **Scaling:**
   - Under high load, the application can be scaled with Kubernetes or a cloud solution (e.g. AWS or Azure).

---

### **2.3. Troubleshooting**
1. **Check the logs:**
   - Docker logs can provide important debugging information:
     ```bash
     docker logs <container-name>
     ```

2. **Health checks:**
   - Add health checks to the Docker Compose file to make sure the services are running correctly.

3. **Disable faulty printers:**
   - Disable printers with a high error rate via the admin dashboard to improve the user experience.

---

### **2.4. Updates**
1. **Adding new features:**
   - Update the application and build new Docker images:
     ```bash
     git pull origin main
     bash docker/build.sh
     ```
   - Deploy the updated images:
     ```bash
     bash docker/deploy.sh
     ```

2. **Database migrations:**
   - Run new migration scripts with:
     ```bash
     pnpm run db:migrate
     ```

---

## **3. Backup and Restore**

### **3.1. Creating backups**
Back up important files and databases regularly:
- **SQLite database:**
  ```bash
  cp db/sqlite.db /backup/location/sqlite-$(date +%F).db
  ```
- **Docker images:**
  ```bash
  docker save myp-rp:latest | gzip > /backup/location/myp-rp-$(date +%F).tar.gz
  ```

### **3.2. Restore**
- **Restore the database:**
  ```bash
  cp /backup/location/sqlite-<date>.db db/sqlite.db
  ```
- **Import Docker images:**
  ```bash
  docker load < /backup/location/myp-rp-<date>.tar.gz
  ```

Next step: [=> Admin dashboard](./Admin-Dashboard.md)
153
packages/reservation-platform/docs/Datenbank.md
Normal file
@ -0,0 +1,153 @@
# **Database and Analytics Functions**

This section focuses on the structure of the database as well as the analysis and forecasting functions used in the project.

---

## **1. Database Structure**

The database schema is defined with **Drizzle ORM** and is based on SQLite. The most important tables and their purposes are:

### **1.1. Table Overview**

#### **`user`**
- Stores user information.
- Contains roles such as "admin" or "guest" for managing permissions.

| **Field**         | **Type**   | **Description**                           |
|-------------------|------------|-------------------------------------------|
| `id`              | `text`     | Unique ID of the user.                    |
| `github_id`       | `integer`  | ID of the user from the OAuth service.    |
| `name`            | `text`     | Username.                                 |
| `displayName`     | `text`     | Display name.                             |
| `email`           | `text`     | Email address.                            |
| `role`            | `text`     | User role, default: "guest".              |

---

#### **`printer`**
- Describes the available printers and their status.

| **Field**         | **Type**   | **Description**                           |
|-------------------|------------|-------------------------------------------|
| `id`              | `text`     | Unique printer ID.                        |
| `name`            | `text`     | Name of the printer.                      |
| `description`     | `text`     | Description or specifications.            |
| `status`          | `integer`  | Operational status (0 = inactive, 1 = active). |

---

#### **`printJob`**
- Stores information about print jobs.

| **Field**             | **Type**      | **Description**                                       |
|-----------------------|---------------|-------------------------------------------------------|
| `id`                  | `text`        | Unique job ID.                                        |
| `printerId`           | `text`        | Reference to the printer's ID.                        |
| `userId`              | `text`        | Reference to the user's ID.                           |
| `startAt`             | `integer`     | Start time of the print job (Unix timestamp).         |
| `durationInMinutes`   | `integer`     | Duration of the print job in minutes.                 |
| `comments`            | `text`        | Additional comments.                                  |
| `aborted`             | `integer`     | 1 = aborted, 0 = completed successfully.              |
| `abortReason`         | `text`        | Reason for the abort (if applicable).                 |

---

#### **`session`**
- Manages user sessions and expiry times.

| **Field**         | **Type**   | **Description**                           |
|-------------------|------------|-------------------------------------------|
| `id`              | `text`     | Unique session ID.                        |
| `user_id`         | `text`     | Reference to the user's ID.               |
| `expires_at`      | `integer`  | Time at which the session expires.        |

---

### **1.2. Relations**
- `printer` → `printJob`: Print jobs are bound to specific printers.
- `user` → `printJob`: Print jobs are assigned to users.
- `user` → `session`: Sessions link users to their login details.

---
## **2. Analytics Functions**

The project provides several analytics and forecasting tools to monitor printer usage and errors.

### **2.1. Error Rate Analysis**
- Function: `calculatePrinterErrorRate` (in `src/utils/analytics/error-rate.ts`).
- Calculates the percentage error rate for each printer based on aborted jobs.

Example output:
```json
[
  { "name": "Drucker 1", "errorRate": 5.2 },
  { "name": "Drucker 2", "errorRate": 3.7 }
]
```
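The computation behind this output can be pictured as a per-printer aggregation over jobs. The following is a self-contained sketch; the real `calculatePrinterErrorRate` works against the database, so names and signatures here are assumptions:

```typescript
// Illustrative sketch: group jobs per printer and report the share
// of aborted jobs as a percentage with one decimal place.
interface JobRecord {
  printerName: string;
  aborted: boolean;
}

function calculateErrorRate(jobs: JobRecord[]): { name: string; errorRate: number }[] {
  const totals = new Map<string, { total: number; aborted: number }>();
  for (const job of jobs) {
    const entry = totals.get(job.printerName) ?? { total: 0, aborted: 0 };
    entry.total += 1;
    if (job.aborted) entry.aborted += 1;
    totals.set(job.printerName, entry);
  }
  return Array.from(totals.entries()).map(([name, { total, aborted }]) => ({
    name,
    errorRate: Math.round((aborted / total) * 1000) / 10,
  }));
}

const demoJobs: JobRecord[] = [
  { printerName: "Drucker 1", aborted: true },
  { printerName: "Drucker 1", aborted: false },
  { printerName: "Drucker 1", aborted: false },
  { printerName: "Drucker 1", aborted: false },
];
console.log(calculateErrorRate(demoJobs)); // [{ name: "Drucker 1", errorRate: 25 }]
```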
---

### **2.2. Abort Reasons**
- Function: `calculateAbortReasonsCount` (in `src/utils/analytics/errors.ts`).
- Counts the frequency of abort reasons from the `printJob` table.

Example output:
```json
[
  { "abortReason": "Materialmangel", "count": 10 },
  { "abortReason": "Düsenverstopfung", "count": 7 }
]
```
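The counting step can be sketched as a simple tally over the `abortReason` column. This is an illustrative stand-in; the actual function queries the `printJob` table rather than a plain array:

```typescript
// Illustrative sketch: tally non-null abort reasons and sort by
// frequency, matching the shape of the example output above.
function countAbortReasons(reasons: Array<string | null>): { abortReason: string; count: number }[] {
  const counts = new Map<string, number>();
  for (const reason of reasons) {
    if (reason === null) continue; // successfully completed jobs carry no reason
    counts.set(reason, (counts.get(reason) ?? 0) + 1);
  }
  return Array.from(counts.entries())
    .map(([abortReason, count]) => ({ abortReason, count }))
    .sort((a, b) => b.count - a.count);
}

console.log(countAbortReasons(["Materialmangel", "Düsenverstopfung", "Materialmangel", null]));
// "Materialmangel" counted twice, "Düsenverstopfung" once
```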
---

### **2.3. Usage and Forecasts**
#### Usage:
- Function: `calculatePrinterUtilization` (in `src/utils/analytics/utilization.ts`).
- Calculates printer usage as a percentage.

Example output:
```json
{ "printerId": "1", "utilizationPercentage": 85 }
```
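A utilization percentage like the one above can be derived as booked minutes over an available time window. This is a minimal sketch under that assumption; the real `calculatePrinterUtilization` reportedly weighs scheduled against completed jobs and may define the window differently:

```typescript
// Illustrative sketch: utilization as booked minutes over an
// available time window. The window size is an assumption here.
interface BookedJob {
  printerId: string;
  durationInMinutes: number;
}

function calculateUtilization(
  jobs: BookedJob[],
  printerId: string,
  availableMinutes: number,
): { printerId: string; utilizationPercentage: number } {
  const booked = jobs
    .filter((job) => job.printerId === printerId)
    .reduce((total, job) => total + job.durationInMinutes, 0);
  const pct = Math.min(100, Math.round((booked / availableMinutes) * 100));
  return { printerId, utilizationPercentage: pct };
}

console.log(calculateUtilization([{ printerId: "1", durationInMinutes: 612 }], "1", 720));
// { printerId: "1", utilizationPercentage: 85 }
```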
#### Forecasts:
- Function: `forecastPrinterUsage` (in `src/utils/analytics/forecast.ts`).
- Uses historical data to estimate expected printer usage for the coming days/weeks.

Example output:
```json
[
  { "day": 1, "usageMinutes": 300 },
  { "day": 2, "usageMinutes": 200 }
]
```
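A naive version of such a forecast averages historical usage per weekday and projects it forward. The sketch below makes that assumption explicit; since the project also depends on a regression package, the real `forecastPrinterUsage` may fit a trend instead of a plain average:

```typescript
// Illustrative sketch of a naive forecast: average historical usage
// per weekday index and project it forward.
interface UsageSample {
  day: number; // weekday index
  usageMinutes: number;
}

function forecastUsage(history: UsageSample[]): UsageSample[] {
  const buckets = new Map<number, number[]>();
  for (const sample of history) {
    const list = buckets.get(sample.day) ?? [];
    list.push(sample.usageMinutes);
    buckets.set(sample.day, list);
  }
  return Array.from(buckets.entries())
    .map(([day, minutes]) => ({
      day,
      usageMinutes: Math.round(minutes.reduce((a, b) => a + b, 0) / minutes.length),
    }))
    .sort((a, b) => a.day - b.day);
}

const forecast = forecastUsage([
  { day: 1, usageMinutes: 280 },
  { day: 1, usageMinutes: 320 },
  { day: 2, usageMinutes: 200 },
]);
console.log(forecast); // [{ day: 1, usageMinutes: 300 }, { day: 2, usageMinutes: 200 }]
```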
---

### **2.4. Print Volume**
- Function: `calculatePrintVolumes` (in `src/utils/analytics/volume.ts`).
- Compares the number of completed print jobs for today, this week, and this month.

Example output:
```json
{
  "today": 15,
  "thisWeek": 90,
  "thisMonth": 300
}
```
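Counting jobs in the three windows can be sketched as filtering by start time. This version uses rolling 1-, 7-, and 30-day windows as a simplifying assumption; the real cut-offs may be calendar-based ("today", "this week") instead:

```typescript
// Illustrative sketch: count completed jobs whose start time falls
// into rolling 1-, 7-, and 30-day windows.
interface VolumeJob {
  startAt: number; // Unix timestamp in milliseconds
  aborted: boolean;
}

function calculatePrintVolumes(jobs: VolumeJob[], now: number) {
  const dayMs = 24 * 60 * 60 * 1000;
  const completed = jobs.filter((job) => !job.aborted);
  const inWindow = (windowMs: number) =>
    completed.filter((job) => now - job.startAt <= windowMs).length;
  return { today: inWindow(dayMs), thisWeek: inWindow(7 * dayMs), thisMonth: inWindow(30 * dayMs) };
}

const nowMs = Date.now();
const volumeJobs: VolumeJob[] = [
  { startAt: nowMs - 1 * 60 * 60 * 1000, aborted: false },      // 1 hour ago
  { startAt: nowMs - 3 * 24 * 60 * 60 * 1000, aborted: false }, // 3 days ago
  { startAt: nowMs - 20 * 24 * 60 * 60 * 1000, aborted: true }, // aborted → ignored
];
console.log(calculatePrintVolumes(volumeJobs, nowMs)); // { today: 1, thisWeek: 2, thisMonth: 2 }
```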
---

## **3. Database Initialization**
The database is initialized via scripts in `package.json`:
```bash
pnpm run db:clean     # delete the database and migration folder
pnpm run db:generate  # generate a new schema
pnpm run db:migrate   # run the migration scripts
```

Next step: [=> Deployment details and best practices](./Bereitstellungsdetails.md)
93
packages/reservation-platform/docs/Installation.md
Normal file
@ -0,0 +1,93 @@
# **Installation and Setup**

This section describes how to install and set up the MYP application. The steps cover preparing the environment, configuring the required services, and deploying the project.

---

## **Prerequisites**
### **Hardware and Software**
- **Raspberry Pi:** The application is optimized to run on a Raspberry Pi with Raspbian Lite installed.
- **Docker:** Docker and Docker Compose must be installed.
- **Network access:** The Raspberry Pi must be reachable on the network.

### **Dependencies**
- Node.js (version 20 or newer)
- PNPM (package manager)
- SQLite (for local database management)

---

## **Setup Steps**

### **1. Clone the repository**
Clone the repository onto your system:
```bash
git clone <repository-url>
cd <repository-folder>
```

### **2. Configure the environment variables**
Adapt the file `.env.example` and rename it to `.env`:
```bash
cp .env.example .env
```
Required variables:
- `OAUTH_CLIENT_ID`: client ID for OAuth authentication
- `OAUTH_CLIENT_SECRET`: secret for OAuth authentication

### **3. Build the Docker containers**
Run the `build.sh` script to build the Docker images:
```bash
bash docker/build.sh
```
This builds the required Docker images, including the application and a Caddy web server.

### **4. Save the Docker images**
Save the images in compressed form so they can be deployed to other machines:
```bash
bash docker/save.sh <image-name>
```

### **5. Deployment**
Copy the Docker images to the target server (e.g. a Raspberry Pi) and run `deploy.sh`:
```bash
scp docker/images/*.tar.xz <target-server>:/path/to/deployment/
bash docker/deploy.sh
```
The script runs the Docker Compose configuration and starts the application.

### **(Optional: 6. Creating an admin user)**

To create an admin user, first start the container image, then sign in to the application via GitHub authentication.

The user created in the database now has the role `guest`. Using the CLI, open the SQLite database (the database should live outside the container image) and update the user.

#### SQL command to update the user:
```bash
sqlite3 db/sqlite.db
```
```sql
UPDATE user SET role = 'admin' WHERE id = '<user-id>';
```

---

## **Starting the Application**
Once the Docker containers are running, the application is reachable at the configured domain or IP address. By default, the Caddy web server uses port 80 (HTTP) and 443 (HTTPS).

---

## **Optional: Development Mode**
For local testing you can run the application without Docker:
1. Install the dependencies:
   ```bash
   pnpm install
   ```
2. Start the development server:
   ```bash
   pnpm dev
   ```
The application is then available at `http://localhost:3000`.

Next step: [=> Usage](./Nutzung.md)
75
packages/reservation-platform/docs/Nutzung.md
Normal file
@ -0,0 +1,75 @@
# **Features and Using the Application**

This section describes the main features of MYP (Manage Your Printer) and gives instructions for using the various modules.

---

## **1. Main Features**

### **1.1. Printer Reservation**
- Users can reserve printers for a defined time period.
- Reservation conflicts are prevented by a real-time validation system.

### **1.2. Error and Utilization Analysis**
- Visualization of print errors by category and frequency.
- Overview of current and historical printer usage.
- Charts for error rate, utilization, and print volume.

### **1.3. Admin Dashboard**
- Management of printers, users, and print jobs.
- Overview of all abort reasons and print errors.
- Access to advanced statistics and forecasts.

---

## **2. Using the Application**

### **2.1. Login and Authentication**
- The application supports OAuth-based authentication.
- Users must sign in with a valid account to access the features.

### **2.2. Dashboard**
- After logging in, users land on the dashboard, which gives an overview of current printer usage.
- Administrators have access to additional menu items, such as user management.

---

## **3. Admin Functions**

### **3.1. Printer Management**
- Administrators can add, edit, or delete printers.
- A printer's status (e.g. "in operation", "out of service") can be adjusted.

### **3.2. User Management**
- Managing user accounts, including roles (e.g. "admin" or "user").
- Users can be activated or deactivated.

### **3.3. Statistics and Reports**
- Charts such as:
  - **Abort reasons:** Shows common causes of failure.
  - **Error rate:** Percentage error rate of the printers.
  - **Usage:** Forecasts of printer usage per weekday.

---

## **4. Charts and Visualizations**

### **4.1. Abort Reasons**
- A column chart shows the frequencies of failure causes.
- Uses real-time data from the print history.

### **4.2. Forecasted Usage**
- A line chart shows the expected printer usage per day.
- Helps with planning maintenance windows.

### **4.3. Print Volume**
- Bar charts compare print jobs for today, this week, and this month.

---

## **5. Interactive Components**
- **Notifications:** Inform about print jobs, errors, or system events.
- **Filters and search:** Make it easier to find printers or print jobs.
- **Role-based access:** Features are restricted depending on the user's role.

Next step: [=> Technical architecture and code structure](./Architektur.md)
|
37
packages/reservation-platform/docs/README.md
Normal file
37
packages/reservation-platform/docs/README.md
Normal file
@ -0,0 +1,37 @@
|
||||
# **Einleitung**
|
||||
|
||||
> Information: Die Dokumenation wurde mit generativer AI erstellt und kann fehlerhaft sein. Im Zweifel bitte die Quellcode-Dateien anschauen oder die Entwickler kontaktieren.
|
||||
|
||||
## **Projektbeschreibung**
|
||||
MYP (Manage Your Printer) ist eine Webanwendung zur Verwaltung und Reservierung von 3D-Druckern. Das Projekt wurde als Abschlussarbeit im Rahmen der Fachinformatiker-Ausbildung mit Schwerpunkt Daten- und Prozessanalyse entwickelt und dient als Plattform zur einfachen Koordination und Überwachung von Druckressourcen. Es wurde speziell für die Technische Berufsausbildung des Mercedes-Benz Werkes in Berlin-Marienfelde erstellt.
|
||||
|
||||
---
|
||||
|
||||
## **Hauptmerkmale**
|
||||
- **Druckerreservierungen:** Nutzer können 3D-Drucker in definierten Zeitfenstern reservieren.
|
||||
- **Fehleranalyse:** Statistiken über Druckfehler und Abbruchgründe werden visuell dargestellt.
|
||||
- **Druckauslastung:** Echtzeit-Daten über die Nutzung der Drucker.
|
||||
- **Admin-Dashboard:** Übersichtliche Verwaltung und Konfiguration von Druckern, Benutzern und Druckaufträgen.
|
||||
- **Datenbankintegration:** Alle Daten werden in einer SQLite-Datenbank gespeichert und verwaltet.
|
||||
|
||||
---
|
||||
|
||||
## **Technologien**
|
||||
- **Frontend:** React, Next.js, TailwindCSS
|
||||
- **Backend:** Node.js, Drizzle ORM
|
||||
- **Datenbank:** SQLite
|
||||
- **Deployment:** Docker und Raspberry Pi
|
||||
- **Zusätzliche Bibliotheken:** recharts für Diagramme, Faker.js für Testdaten, sowie diverse Radix-UI-Komponenten.
|
||||
|
||||
---
|
||||
|
||||
## **Dateistruktur**
|
||||
Die Repository-Dateien sind in logische Abschnitte unterteilt:
|
||||
1. **Docker-Konfigurationen** (`docker/`) - Skripte und Konfigurationsdateien für die Bereitstellung.
|
||||
2. **Frontend-Komponenten** (`src/app/`) - Weboberfläche und deren Funktionalitäten.
|
||||
3. **Backend-Funktionen** (`src/server/`) - Datenbankinteraktionen und Authentifizierungslogik.
|
||||
4. **Utils und Helferfunktionen** (`src/utils/`) - Wiederverwendbare Dienste und Hilfsmethoden.
|
||||
5. **Datenbank-Skripte** (`drizzle/`) - Datenbankschemas und Migrationsdateien.
|
||||
|
||||
|
||||
Nächster Schritt: [=> Installation](./Installation.md)
@@ -5,8 +5,8 @@ export default defineConfig({
 	dialect: "sqlite",
 	schema: "./src/server/db/schema.ts",
 	out: "./drizzle",
-	driver: "better-sqlite",
+	driver: "libsql",
 	dbCredentials: {
-		url: "db/sqlite.db",
+		url: "file:./db/sqlite.db",
 	},
 });
@@ -1,4 +1,26 @@
 /** @type {import('next').NextConfig} */
-const nextConfig = {};
+const nextConfig = {
+	async headers() {
+		return [
+			{
+				source: "/:path*",
+				headers: [
+					{
+						key: "Access-Control-Allow-Origin",
+						value: "m040tbaraspi001.de040.corpintra.net",
+					},
+					{
+						key: "Access-Control-Allow-Methods",
+						value: "GET, POST, PUT, DELETE, OPTIONS",
+					},
+					{
+						key: "Access-Control-Allow-Headers",
+						value: "Content-Type, Authorization",
+					},
+				],
+			},
+		];
+	},
+};
 
 export default nextConfig;
@ -1,7 +1,8 @@
{
"name": "myp-rp",
"version": "0.1.0",
"version": "1.0.0",
"private": true,
"packageManager": "pnpm@9.12.1",
"scripts": {
"dev": "next dev",
"build": "next build",
@ -15,59 +16,68 @@
"db:reset": "pnpm db:clean && pnpm db"
},
"dependencies": {
"@headlessui/react": "^2.0.3",
"@headlessui/tailwindcss": "^0.2.0",
"@hookform/resolvers": "^3.3.4",
"@lucia-auth/adapter-drizzle": "^1.0.7",
"@radix-ui/react-alert-dialog": "^1.0.5",
"@radix-ui/react-avatar": "^1.0.4",
"@radix-ui/react-dialog": "^1.0.5",
"@radix-ui/react-dropdown-menu": "^2.0.6",
"@radix-ui/react-hover-card": "^1.0.7",
"@faker-js/faker": "^9.2.0",
"@headlessui/react": "^2.1.10",
"@headlessui/tailwindcss": "^0.2.1",
"@hookform/resolvers": "^3.9.0",
"@libsql/client": "^0.14.0",
"@lucia-auth/adapter-drizzle": "^1.1.0",
"@radix-ui/react-alert-dialog": "^1.1.2",
"@radix-ui/react-avatar": "^1.1.1",
"@radix-ui/react-dialog": "^1.1.2",
"@radix-ui/react-dropdown-menu": "^2.1.2",
"@radix-ui/react-hover-card": "^1.1.2",
"@radix-ui/react-icons": "^1.3.0",
"@radix-ui/react-label": "^2.0.2",
"@radix-ui/react-scroll-area": "^1.0.5",
"@radix-ui/react-select": "^2.0.0",
"@radix-ui/react-slot": "^1.0.2",
"@radix-ui/react-tabs": "^1.0.4",
"@radix-ui/react-toast": "^1.1.5",
"@remixicon/react": "^4.2.0",
"@tanstack/react-table": "^8.16.0",
"@tremor/react": "^3.16.2",
"arctic": "^1.8.1",
"better-sqlite3": "^9.6.0",
"@radix-ui/react-label": "^2.1.0",
"@radix-ui/react-scroll-area": "^1.2.0",
"@radix-ui/react-select": "^2.1.2",
"@radix-ui/react-slot": "^1.1.0",
"@radix-ui/react-tabs": "^1.1.1",
"@radix-ui/react-toast": "^1.2.2",
"@remixicon/react": "^4.3.0",
"@tanstack/react-table": "^8.20.5",
"@tremor/react": "^3.18.3",
"arctic": "^1.9.2",
"class-variance-authority": "^0.7.0",
"clsx": "^2.1.1",
"date-fns": "^4.1.0",
"drizzle-orm": "^0.30.10",
"drizzle-json-db": "^0.1.1",
"lucia": "^3.2.0",
"lodash": "^4.17.21",
"lucia": "^3.2.1",
"lucide-react": "^0.378.0",
"luxon": "^3.5.0",
"next": "14.2.3",
"next-themes": "^0.3.0",
"oslo": "^1.2.0",
"oslo": "^1.2.1",
"react": "^18.3.1",
"react-dom": "^18.3.1",
"react-hook-form": "^7.51.4",
"react-hook-form": "^7.53.0",
"react-if": "^4.1.5",
"react-timer-hook": "^3.0.7",
"recharts": "^2.13.3",
"regression": "^2.0.1",
"sonner": "^1.4.41",
"sonner": "^1.5.0",
"sqlite": "^5.1.1",
"sqlite3": "^5.1.7",
"swr": "^2.2.5",
"tailwind-merge": "^2.3.0",
"tailwind-merge": "^2.5.3",
"tailwindcss-animate": "^1.0.7",
"use-debounce": "^10.0.0",
"use-debounce": "^10.0.3",
"uuid": "^11.0.2",
"zod": "^3.23.8"
},
"devDependencies": {
"@biomejs/biome": "^1.7.3",
"@tailwindcss/forms": "^0.5.7",
"@types/better-sqlite3": "^7.6.10",
"@types/node": "^20.12.11",
"@types/react": "^18.3.1",
"@types/react-dom": "^18.3.0",
"drizzle-kit": "^0.21.1",
"postcss": "^8.4.38",
"tailwindcss": "^3.4.3",
"typescript": "^5.4.5"
"@biomejs/biome": "^1.9.3",
"@tailwindcss/forms": "^0.5.9",
"@types/lodash": "^4.17.13",
"@types/luxon": "^3.4.2",
"@types/node": "^20.16.11",
"@types/react": "^18.3.11",
"@types/react-dom": "^18.3.1",
"drizzle-kit": "^0.21.4",
"postcss": "^8.4.47",
"tailwindcss": "^3.4.13",
"ts-node": "^10.9.2",
"typescript": "^5.6.3"
}
}
2561
packages/reservation-platform/pnpm-lock.yaml
generated
File diff suppressed because it is too large
Load Diff
@@ -1,5 +0,0 @@
-ignoredBuiltDependencies:
-  - '@biomejs/biome'
-  - better-sqlite3
-  - es5-ext
-  - esbuild
9279
packages/reservation-platform/repomix-output.txt
Normal file
File diff suppressed because it is too large
Load Diff
367
packages/reservation-platform/scripts/generate-data.js
Normal file
@ -0,0 +1,367 @@
const sqlite3 = require("sqlite3");
const faker = require("@faker-js/faker").faker;
const { random, sample, sampleSize, sum } = require("lodash");
const { DateTime } = require("luxon");
const { open } = require("sqlite");
const { v4: uuidv4 } = require("uuid");

const dbPath = "./db/sqlite.db";

// Configuration for test data generation
let startDate = DateTime.fromISO("2024-10-08");
let endDate = DateTime.fromISO("2024-11-08");
let numberOfPrinters = 5;

// Use weekday names for better readability and ease of setting trends
let avgPrintTimesPerDay = {
  Monday: 4,
  Tuesday: 2,
  Wednesday: 5,
  Thursday: 2,
  Friday: 3,
  Saturday: 0,
  Sunday: 0,
}; // Average number of prints for each weekday

let avgPrintDurationPerDay = {
  Monday: 240, // Total average duration in minutes for Monday
  Tuesday: 30,
  Wednesday: 45,
  Thursday: 40,
  Friday: 120,
  Saturday: 0,
  Sunday: 0,
}; // Average total duration of prints for each weekday

let printerUsage = {
  "Drucker 1": 0.5,
  "Drucker 2": 0.7,
  "Drucker 3": 0.6,
  "Drucker 4": 0.3,
  "Drucker 5": 0.4,
}; // Usage percentages for each printer

// **New Configurations for Error Rates**
let generalErrorRate = 0.05; // 5% chance any print job may fail
let printerErrorRates = {
  "Drucker 1": 0.02, // 2% error rate for Printer 1
  "Drucker 2": 0.03,
  "Drucker 3": 0.01,
  "Drucker 4": 0.05,
  "Drucker 5": 0.04,
}; // Error rates for each printer

const holidays = []; // Example holidays
const existingJobs = [];

const initDB = async () => {
  console.log("Initializing database connection...");
  return open({
    filename: dbPath,
    driver: sqlite3.Database,
  });
};

const createUser = (isPowerUser = false) => {
  const name = [faker.person.firstName(), faker.person.lastName()];

  const user = {
    id: uuidv4(),
    github_id: faker.number.int(),
    username: `${name[0].slice(0, 2)}${name[1].slice(0, 6)}`.toUpperCase(),
    displayName: `${name[0]} ${name[1]}`.toUpperCase(),
    email: `${name[0]}.${name[1]}@example.com`,
    role: sample(["user", "admin"]),
    isPowerUser,
  };
  console.log("Created user:", user);
  return user;
};

const createPrinter = (index) => {
  const printer = {
    id: uuidv4(),
    name: `Drucker ${index}`,
    description: faker.lorem.sentence(),
    status: random(0, 2),
  };
  console.log("Created printer:", printer);
  return printer;
};

const isPrinterAvailable = (printer, startAt, duration) => {
  const endAt = startAt + duration * 60 * 1000; // Convert minutes to milliseconds
|
||||
return !existingJobs.some((job) => {
|
||||
const jobStart = job.startAt;
|
||||
const jobEnd = job.startAt + job.durationInMinutes * 60 * 1000;
|
||||
return (
|
||||
printer.id === job.printerId &&
|
||||
((startAt >= jobStart && startAt < jobEnd) ||
|
||||
(endAt > jobStart && endAt <= jobEnd) ||
|
||||
(startAt <= jobStart && endAt >= jobEnd))
|
||||
);
|
||||
});
|
||||
};
|
||||
|
||||
const createPrintJob = (users, printers, startAt, duration) => {
|
||||
const user = sample(users);
|
||||
let printer;
|
||||
|
||||
// Weighted selection based on printer usage
|
||||
const printerNames = Object.keys(printerUsage);
|
||||
const weightedPrinters = printers.filter((p) => printerNames.includes(p.name));
|
||||
|
||||
// Create a weighted array of printers based on usage percentages
|
||||
const printerWeights = weightedPrinters.map((p) => ({
|
||||
printer: p,
|
||||
weight: printerUsage[p.name],
|
||||
}));
|
||||
|
||||
const totalWeight = sum(printerWeights.map((pw) => pw.weight));
|
||||
const randomWeight = Math.random() * totalWeight;
|
||||
let accumulatedWeight = 0;
|
||||
for (const pw of printerWeights) {
|
||||
accumulatedWeight += pw.weight;
|
||||
if (randomWeight <= accumulatedWeight) {
|
||||
printer = pw.printer;
|
||||
break;
|
||||
}
|
||||
}
|
||||
|
||||
if (!printer) {
|
||||
printer = sample(printers);
|
||||
}
|
||||
|
||||
if (!isPrinterAvailable(printer, startAt, duration)) {
|
||||
console.log("Printer not available, skipping job creation.");
|
||||
return null;
|
||||
}
|
||||
|
||||
// **Determine if the job should be aborted based on error rates**
|
||||
let aborted = false;
|
||||
let abortReason = null;
|
||||
|
||||
// Calculate the combined error rate
|
||||
const printerErrorRate = printerErrorRates[printer.name] || 0;
|
||||
const combinedErrorRate = 1 - (1 - generalErrorRate) * (1 - printerErrorRate);
|
||||
|
||||
if (Math.random() < combinedErrorRate) {
|
||||
aborted = true;
|
||||
const errorMessages = [
|
||||
"Unbekannt",
|
||||
"Keine Ahnung",
|
||||
"Falsch gebucht",
|
||||
"Filament gelöst",
|
||||
"Druckabbruch",
|
||||
"Düsenverstopfung",
|
||||
"Schichthaftung fehlgeschlagen",
|
||||
"Materialmangel",
|
||||
"Dateifehler",
|
||||
"Temperaturproblem",
|
||||
"Mechanischer Fehler",
|
||||
"Softwarefehler",
|
||||
"Kalibrierungsfehler",
|
||||
"Überhitzung",
|
||||
];
|
||||
abortReason = sample(errorMessages); // Generate a random abort reason
|
||||
}
|
||||
|
||||
const printJob = {
|
||||
id: uuidv4(),
|
||||
printerId: printer.id,
|
||||
userId: user.id,
|
||||
startAt,
|
||||
durationInMinutes: duration,
|
||||
comments: faker.lorem.sentence(),
|
||||
aborted,
|
||||
abortReason,
|
||||
};
|
||||
console.log("Created print job:", printJob);
|
||||
return printJob;
|
||||
};
|
||||
|
||||
const generatePrintJobsForDay = async (users, printers, dayDate, totalJobsForDay, totalDurationForDay, db, dryRun) => {
|
||||
console.log(`Generating print jobs for ${dayDate.toISODate()}...`);
|
||||
|
||||
// Generate random durations that sum up approximately to totalDurationForDay
|
||||
const durations = [];
|
||||
let remainingDuration = totalDurationForDay;
|
||||
for (let i = 0; i < totalJobsForDay; i++) {
|
||||
const avgJobDuration = remainingDuration / (totalJobsForDay - i);
|
||||
const jobDuration = Math.max(
|
||||
Math.round(random(avgJobDuration * 0.8, avgJobDuration * 1.2)),
|
||||
5, // Minimum duration of 5 minutes
|
||||
);
|
||||
durations.push(jobDuration);
|
||||
remainingDuration -= jobDuration;
|
||||
}
|
||||
|
||||
// Shuffle durations to randomize job lengths
|
||||
const shuffledDurations = sampleSize(durations, durations.length);
|
||||
|
||||
for (let i = 0; i < totalJobsForDay; i++) {
|
||||
const duration = shuffledDurations[i];
|
||||
|
||||
// Random start time between 8 AM and 6 PM, adjusted to avoid overlapping durations
|
||||
const possibleStartHours = Array.from({ length: 10 }, (_, idx) => idx + 8); // 8 AM to 6 PM
|
||||
let startAt;
|
||||
let attempts = 0;
|
||||
do {
|
||||
const hour = sample(possibleStartHours);
|
||||
const minute = random(0, 59);
|
||||
startAt = dayDate.set({ hour, minute, second: 0, millisecond: 0 }).toMillis();
|
||||
attempts++;
|
||||
if (attempts > 10) {
|
||||
console.log("Unable to find available time slot, skipping job.");
|
||||
break;
|
||||
}
|
||||
} while (!isPrinterAvailable(sample(printers), startAt, duration));
|
||||
|
||||
if (attempts > 10) continue;
|
||||
|
||||
const printJob = createPrintJob(users, printers, startAt, duration);
|
||||
if (printJob) {
|
||||
if (!dryRun) {
|
||||
await db.run(
|
||||
`INSERT INTO printJob (id, printerId, userId, startAt, durationInMinutes, comments, aborted, abortReason)
|
||||
VALUES (?, ?, ?, ?, ?, ?, ?, ?)`,
|
||||
[
|
||||
printJob.id,
|
||||
printJob.printerId,
|
||||
printJob.userId,
|
||||
printJob.startAt,
|
||||
printJob.durationInMinutes,
|
||||
printJob.comments,
|
||||
printJob.aborted ? 1 : 0,
|
||||
printJob.abortReason,
|
||||
],
|
||||
);
|
||||
}
|
||||
existingJobs.push(printJob);
|
||||
console.log("Inserted print job into database:", printJob.id);
|
||||
}
|
||||
}
|
||||
};
|
||||
|
||||
const generateTestData = async (dryRun = false) => {
|
||||
console.log("Starting test data generation...");
|
||||
const db = await initDB();
|
||||
|
||||
// Generate users and printers
|
||||
const users = [
|
||||
...Array.from({ length: 7 }, () => createUser(false)),
|
||||
...Array.from({ length: 3 }, () => createUser(true)),
|
||||
];
|
||||
const printers = Array.from({ length: numberOfPrinters }, (_, index) => createPrinter(index + 1));
|
||||
|
||||
if (!dryRun) {
|
||||
// Insert users into the database
|
||||
for (const user of users) {
|
||||
await db.run(
|
||||
`INSERT INTO user (id, github_id, name, displayName, email, role)
|
||||
VALUES (?, ?, ?, ?, ?, ?)`,
|
||||
[user.id, user.github_id, user.username, user.displayName, user.email, user.role],
|
||||
);
|
||||
console.log("Inserted user into database:", user.id);
|
||||
}
|
||||
|
||||
// Insert printers into the database
|
||||
for (const printer of printers) {
|
||||
await db.run(
|
||||
`INSERT INTO printer (id, name, description, status)
|
||||
VALUES (?, ?, ?, ?)`,
|
||||
[printer.id, printer.name, printer.description, printer.status],
|
||||
);
|
||||
console.log("Inserted printer into database:", printer.id);
|
||||
}
|
||||
}
|
||||
|
||||
// Generate print jobs for each day within the specified date range
|
||||
let currentDay = startDate;
|
||||
while (currentDay <= endDate) {
|
||||
const weekdayName = currentDay.toFormat("EEEE"); // Get weekday name (e.g., 'Monday')
|
||||
if (holidays.includes(currentDay.toISODate()) || avgPrintTimesPerDay[weekdayName] === 0) {
|
||||
console.log(`Skipping holiday or no jobs scheduled: ${currentDay.toISODate()}`);
|
||||
currentDay = currentDay.plus({ days: 1 });
|
||||
continue;
|
||||
}
|
||||
|
||||
const totalJobsForDay = avgPrintTimesPerDay[weekdayName];
|
||||
const totalDurationForDay = avgPrintDurationPerDay[weekdayName];
|
||||
|
||||
await generatePrintJobsForDay(users, printers, currentDay, totalJobsForDay, totalDurationForDay, db, dryRun);
|
||||
currentDay = currentDay.plus({ days: 1 });
|
||||
}
|
||||
|
||||
if (!dryRun) {
|
||||
await db.close();
|
||||
console.log("Database connection closed. Test data generation complete.");
|
||||
} else {
|
||||
console.log("Dry run complete. No data was written to the database.");
|
||||
}
|
||||
};
|
||||
|
||||
const setConfigurations = (config) => {
|
||||
if (config.startDate) startDate = DateTime.fromISO(config.startDate);
|
||||
if (config.endDate) endDate = DateTime.fromISO(config.endDate);
|
||||
if (config.numberOfPrinters) numberOfPrinters = config.numberOfPrinters;
|
||||
if (config.avgPrintTimesPerDay) avgPrintTimesPerDay = config.avgPrintTimesPerDay;
|
||||
if (config.avgPrintDurationPerDay) avgPrintDurationPerDay = config.avgPrintDurationPerDay;
|
||||
if (config.printerUsage) printerUsage = config.printerUsage;
|
||||
if (config.generalErrorRate !== undefined) generalErrorRate = config.generalErrorRate;
|
||||
if (config.printerErrorRates) printerErrorRates = config.printerErrorRates;
|
||||
};
|
||||
|
||||
// Example usage
|
||||
setConfigurations({
|
||||
startDate: "2024-10-08",
|
||||
endDate: "2024-11-08",
|
||||
numberOfPrinters: 6,
|
||||
avgPrintTimesPerDay: {
|
||||
Monday: 4, // High usage
|
||||
Tuesday: 2, // Low usage
|
||||
Wednesday: 3, // Low usage
|
||||
Thursday: 2, // Low usage
|
||||
Friday: 8, // High usage
|
||||
Saturday: 0,
|
||||
Sunday: 0,
|
||||
},
|
||||
avgPrintDurationPerDay: {
|
||||
Monday: 300, // High total duration
|
||||
Tuesday: 60, // Low total duration
|
||||
Wednesday: 90,
|
||||
Thursday: 60,
|
||||
Friday: 240,
|
||||
Saturday: 0,
|
||||
Sunday: 0,
|
||||
},
|
||||
printerUsage: {
|
||||
"Drucker 1": 2.3,
|
||||
"Drucker 2": 1.7,
|
||||
"Drucker 3": 0.1,
|
||||
"Drucker 4": 1.5,
|
||||
"Drucker 5": 2.4,
|
||||
"Drucker 6": 0.3,
|
||||
"Drucker 7": 0.9,
|
||||
"Drucker 8": 0.1,
|
||||
},
|
||||
generalErrorRate: 0.05, // 5% general error rate
|
||||
printerErrorRates: {
|
||||
"Drucker 1": 0.02,
|
||||
"Drucker 2": 0.03,
|
||||
"Drucker 3": 0.1,
|
||||
"Drucker 4": 0.05,
|
||||
"Drucker 5": 0.04,
|
||||
"Drucker 6": 0.02,
|
||||
"Drucker 7": 0.01,
|
||||
"PrinteDrucker 8": 0.03,
|
||||
},
|
||||
});
|
||||
|
||||
generateTestData(process.argv.includes("--dry-run"))
|
||||
.then(() => {
|
||||
console.log("Test data generation script finished.");
|
||||
})
|
||||
.catch((err) => {
|
||||
console.error("Error generating test data:", err);
|
||||
});
|
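The abort logic in `createPrintJob` above combines the general error rate and the printer-specific error rate as two independent failure sources. A minimal standalone sketch of that arithmetic (not part of the commit itself):

```javascript
// A job fails if either the general error or the printer-specific error occurs.
// Treating the two as independent events:
//   P(fail) = 1 - P(no general error) * P(no printer error)
function combinedErrorRate(generalErrorRate, printerErrorRate) {
  return 1 - (1 - generalErrorRate) * (1 - printerErrorRate);
}

// With the script's defaults for "Drucker 1" (5% general, 2% printer-specific):
console.log(combinedErrorRate(0.05, 0.02).toFixed(3)); // "0.069"
```

Note that the combined rate (about 6.9%) is slightly below the naive sum of 7%, because the case where both errors would occur is only counted once.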
@ -1,26 +0,0 @@
"use client";

import { BarChart } from "@tremor/react";

interface AbortReasonsBarChartProps {
  // biome-ignore lint/suspicious/noExplicitAny: temporary fix
  data: any[];
}

export function AbortReasonsBarChart(props: AbortReasonsBarChartProps) {
  const { data } = props;

  const dataFormatter = (number: number) => Intl.NumberFormat("de-DE").format(number).toString();

  return (
    <BarChart
      className="mt-6"
      data={data}
      index="name"
      categories={["Anzahl"]}
      colors={["blue"]}
      valueFormatter={dataFormatter}
      yAxisWidth={48}
    />
  );
}
@ -1,20 +0,0 @@
"use client";

import { DonutChart, Legend } from "@tremor/react";

const dataFormatter = (number: number) => Intl.NumberFormat("de-DE").format(number).toString();

interface LoadFactorChartProps {
  // biome-ignore lint/suspicious/noExplicitAny: temp. fix
  data: any[];
}
export function LoadFactorChart(props: LoadFactorChartProps) {
  const { data } = props;

  return (
    <div className="flex gap-4">
      <DonutChart data={data} variant="donut" colors={["green", "yellow"]} valueFormatter={dataFormatter} />
      <Legend categories={["Frei", "Belegt"]} colors={["green", "yellow"]} className="max-w-xs" />
    </div>
  );
}
@ -0,0 +1,68 @@
"use client";

import { Card, CardContent, CardDescription, CardHeader, CardTitle } from "@/components/ui/card";
import { type ChartConfig, ChartContainer, ChartTooltip, ChartTooltipContent } from "@/components/ui/chart";
import { Bar, BarChart, CartesianGrid, LabelList, XAxis, YAxis } from "recharts";

export const description = "Ein Säulendiagramm zur Darstellung der Abbruchgründe und ihrer Häufigkeit";

interface AbortReasonCountChartProps {
  abortReasonCount: {
    abortReason: string;
    count: number;
  }[];
}

const chartConfig = {
  abortReason: {
    label: "Abbruchgrund",
  },
} satisfies ChartConfig;

export function AbortReasonCountChart({ abortReasonCount }: AbortReasonCountChartProps) {
  // Transform data to fit the chart structure
  const chartData = abortReasonCount.map((reason) => ({
    abortReason: reason.abortReason,
    count: reason.count,
  }));

  return (
    <Card>
      <CardHeader>
        <CardTitle>Abbruchgründe</CardTitle>
        <CardDescription>Häufigkeit der Abbruchgründe für Druckaufträge</CardDescription>
      </CardHeader>
      <CardContent>
        <ChartContainer config={chartConfig}>
          <BarChart
            accessibilityLayer
            data={chartData}
            margin={{
              top: 20,
            }}
          >
            <CartesianGrid vertical={false} strokeDasharray="3 3" />
            <XAxis
              dataKey="abortReason"
              tickLine={false}
              tickMargin={10}
              axisLine={false}
              tickFormatter={(value) => value}
            />
            <YAxis tickFormatter={(value) => `${value}`} />
            <ChartTooltip cursor={false} content={<ChartTooltipContent hideLabel />} />
            <Bar dataKey="count" fill="hsl(var(--chart-1))" radius={8}>
              <LabelList
                position="top"
                offset={12}
                className="fill-foreground"
                fontSize={12}
                formatter={(value: number) => `${value}`}
              />
            </Bar>
          </BarChart>
        </ChartContainer>
      </CardContent>
    </Card>
  );
}
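`calculateAbortReasonsCount` itself is not part of this diff; judging by the `abortReasonCount` prop shape above, it is presumably a frequency count over the aborted jobs. A hedged sketch of that shape (the function name and exact behavior are assumptions, the real helper in `@/utils/analytics/errors` may differ):

```javascript
// Hypothetical helper: count how often each abortReason occurs among aborted jobs
// and return the { abortReason, count }[] shape the chart component expects.
function countAbortReasons(printJobs) {
  const counts = {};
  for (const job of printJobs) {
    if (job.aborted && job.abortReason) {
      counts[job.abortReason] = (counts[job.abortReason] || 0) + 1;
    }
  }
  return Object.entries(counts).map(([abortReason, count]) => ({ abortReason, count }));
}

const jobs = [
  { aborted: true, abortReason: "Materialmangel" },
  { aborted: true, abortReason: "Materialmangel" },
  { aborted: false, abortReason: null },
  { aborted: true, abortReason: "Überhitzung" },
];
console.log(countAbortReasons(jobs));
// [ { abortReason: 'Materialmangel', count: 2 }, { abortReason: 'Überhitzung', count: 1 } ]
```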
@ -0,0 +1,66 @@
"use client";
import { Bar, BarChart, CartesianGrid, LabelList, XAxis, YAxis } from "recharts";

import { Card, CardContent, CardDescription, CardHeader, CardTitle } from "@/components/ui/card";
import { type ChartConfig, ChartContainer, ChartTooltip, ChartTooltipContent } from "@/components/ui/chart";
import type { PrinterErrorRate } from "@/utils/analytics/error-rate";

export const description = "Ein Säulendiagramm zur Darstellung der Fehlerrate";

interface PrinterErrorRateChartProps {
  printerErrorRate: PrinterErrorRate[];
}

const chartConfig = {
  errorRate: {
    label: "Fehlerrate",
  },
} satisfies ChartConfig;

export function PrinterErrorRateChart({ printerErrorRate }: PrinterErrorRateChartProps) {
  // Transform data to fit the chart structure
  const chartData = printerErrorRate.map((printer) => ({
    printer: printer.name,
    errorRate: printer.errorRate,
  }));

  return (
    <Card>
      <CardHeader>
        <CardTitle>Fehlerrate</CardTitle>
        <CardDescription>Fehlerrate der Drucker in Prozent</CardDescription>
      </CardHeader>
      <CardContent>
        <ChartContainer config={chartConfig}>
          <BarChart
            accessibilityLayer
            data={chartData}
            margin={{
              top: 20,
            }}
          >
            <CartesianGrid vertical={false} strokeDasharray="3 3" />
            <XAxis
              dataKey="printer"
              tickLine={false}
              tickMargin={10}
              axisLine={false}
              tickFormatter={(value) => value}
            />
            <YAxis tickFormatter={(value) => `${value}%`} />
            <ChartTooltip cursor={false} content={<ChartTooltipContent hideLabel />} />
            <Bar dataKey="errorRate" fill="hsl(var(--chart-1))" radius={8}>
              <LabelList
                position="top"
                offset={12}
                className="fill-foreground"
                fontSize={12}
                formatter={(value: number) => `${value}%`}
              />
            </Bar>
          </BarChart>
        </ChartContainer>
      </CardContent>
    </Card>
  );
}
@ -0,0 +1,83 @@
"use client";

import { Card, CardContent, CardFooter, CardHeader, CardTitle } from "@/components/ui/card";
import { type ChartConfig, ChartContainer, ChartTooltip, ChartTooltipContent } from "@/components/ui/chart";
import { Area, AreaChart, CartesianGrid, XAxis, YAxis } from "recharts";

export const description = "Ein Bereichsdiagramm zur Darstellung der prognostizierten Nutzung pro Wochentag";

interface ForecastData {
  day: number; // 0 for Sunday, 1 for Monday, ..., 6 for Saturday
  usageMinutes: number;
}

interface ForecastChartProps {
  forecastData: ForecastData[];
}

const chartConfig = {
  usage: {
    label: "Prognostizierte Nutzung",
    color: "hsl(var(--chart-1))",
  },
} satisfies ChartConfig;

const daysOfWeek = ["Sonntag", "Montag", "Dienstag", "Mittwoch", "Donnerstag", "Freitag", "Samstag"];

export function ForecastPrinterUsageChart({ forecastData }: ForecastChartProps) {
  // Transform data to fit the chart structure
  const chartData = forecastData.map((data) => ({
    day: daysOfWeek[data.day], // Map day number to weekday name
    usage: data.usageMinutes,
  }));

  return (
    <Card>
      <CardHeader>
        <CardTitle>Prognostizierte Nutzung pro Wochentag</CardTitle>
      </CardHeader>
      <CardContent>
        <ChartContainer className="h-64 w-full" config={chartConfig}>
          <AreaChart accessibilityLayer data={chartData} margin={{ left: 12, right: 12, top: 12 }}>
            <CartesianGrid vertical={true} />
            <XAxis dataKey="day" type="category" tickLine={true} tickMargin={10} axisLine={false} />
            <YAxis type="number" dataKey="usage" tickLine={false} tickMargin={10} axisLine={false} />
            <ChartTooltip cursor={false} content={<ChartTooltipContent hideLabel />} />
            <Area
              dataKey="usage"
              type="step"
              fill="hsl(var(--chart-1))"
              fillOpacity={0.4}
              stroke="hsl(var(--chart-1))"
            />
          </AreaChart>
        </ChartContainer>
      </CardContent>
      <CardFooter className="flex-col items-start gap-2 text-sm">
        <div className="flex items-center gap-2 font-medium leading-none">
          Zeigt die prognostizierte Nutzungszeit pro Wochentag in Minuten.
        </div>
        <div className="leading-none text-muted-foreground">
          Beste Tage zur Wartung: {bestMaintenanceDays(forecastData)}
        </div>
      </CardFooter>
    </Card>
  );
}

function bestMaintenanceDays(forecastData: ForecastData[]) {
  const sortedData = forecastData.map((a) => a).sort((a, b) => a.usageMinutes - b.usageMinutes); // Sort ascending

  const q1Index = Math.floor(sortedData.length * 0.33);
  const q1 = sortedData[q1Index].usageMinutes; // Usage value at the 33% quantile

  const filteredData = sortedData.filter((data) => data.usageMinutes <= q1);

  return filteredData
    .map((data) => {
      const days = ["Sonntag", "Montag", "Dienstag", "Mittwoch", "Donnerstag", "Freitag", "Samstag"];
      return days[data.day];
    })
    .join(", ");
}
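`bestMaintenanceDays` above recommends the days whose forecast usage lies at or below the value at the 33% quantile index. Its selection step can be exercised in isolation (a standalone sketch mirroring the component's logic):

```javascript
// Mirror of the selection logic in bestMaintenanceDays: sort ascending by
// usageMinutes, take the value at the floor(n * 0.33) index as cutoff,
// and keep every day whose usage is at or below that cutoff.
function lowUsageDays(forecastData) {
  const days = ["Sonntag", "Montag", "Dienstag", "Mittwoch", "Donnerstag", "Freitag", "Samstag"];
  const sorted = [...forecastData].sort((a, b) => a.usageMinutes - b.usageMinutes);
  const cutoff = sorted[Math.floor(sorted.length * 0.33)].usageMinutes;
  return sorted.filter((d) => d.usageMinutes <= cutoff).map((d) => days[d.day]);
}

const forecast = [
  { day: 1, usageMinutes: 300 },
  { day: 2, usageMinutes: 60 },
  { day: 3, usageMinutes: 90 },
  { day: 4, usageMinutes: 60 },
  { day: 5, usageMinutes: 240 },
];
console.log(lowUsageDays(forecast).join(", ")); // → "Dienstag, Donnerstag"
```

Ties at the cutoff are all included, which is why both 60-minute days are recommended here.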
@ -0,0 +1,80 @@
"use client";

import { TrendingUp } from "lucide-react";
import * as React from "react";
import { Label, Pie, PieChart } from "recharts";

import { Card, CardContent, CardDescription, CardFooter, CardHeader, CardTitle } from "@/components/ui/card";
import { type ChartConfig, ChartContainer, ChartTooltip, ChartTooltipContent } from "@/components/ui/chart";

export const description = "Nutzung des Druckers";

interface ComponentProps {
  data: {
    printerId: string;
    utilizationPercentage: number;
    name: string;
  };
}

const chartConfig = {} satisfies ChartConfig;

export function PrinterUtilizationChart({ data }: ComponentProps) {
  const totalUtilization = React.useMemo(() => data.utilizationPercentage, [data]);
  const dataWithColor = {
    ...data,
    fill: "rgb(34 197 94)",
  };
  const free = {
    printerId: "-",
    utilizationPercentage: 1 - data.utilizationPercentage,
    name: "(Frei)",
    fill: "rgb(212 212 212)",
  };

  return (
    <Card className="flex flex-col">
      <CardHeader className="items-center pb-0">
        <CardTitle>{data.name}</CardTitle>
        <CardDescription>Nutzung des ausgewählten Druckers</CardDescription>
      </CardHeader>
      <CardContent className="flex-1 pb-0">
        <ChartContainer config={chartConfig} className="mx-auto aspect-square max-h-[250px]">
          <PieChart>
            <ChartTooltip cursor={false} content={<ChartTooltipContent hideLabel />} />
            <Pie
              data={[dataWithColor, free]}
              dataKey="utilizationPercentage"
              nameKey="name"
              innerRadius={60}
              strokeWidth={5}
            >
              <Label
                content={({ viewBox }) => {
                  if (viewBox && "cx" in viewBox && "cy" in viewBox) {
                    return (
                      <text x={viewBox.cx} y={viewBox.cy} textAnchor="middle" dominantBaseline="middle">
                        <tspan x={viewBox.cx} y={viewBox.cy} className="fill-foreground text-3xl font-bold">
                          {(totalUtilization * 100).toFixed(2)}%
                        </tspan>
                        <tspan x={viewBox.cx} y={(viewBox.cy || 0) + 24} className="fill-muted-foreground">
                          Gesamt-Nutzung
                        </tspan>
                      </text>
                    );
                  }
                }}
              />
            </Pie>
          </PieChart>
        </ChartContainer>
      </CardContent>
      <CardFooter className="flex-col gap-2 text-sm">
        <div className="flex items-center gap-2 font-medium leading-none">
          Übersicht der Nutzung <TrendingUp className="h-4 w-4" />
        </div>
        <div className="leading-none text-muted-foreground">Aktuelle Auslastung des Druckers</div>
      </CardFooter>
    </Card>
  );
}
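`calculatePrinterUtilization` is also outside this diff; the chart above only requires a `utilizationPercentage` in the 0–1 range per printer. One plausible definition (an assumption for illustration, not the repo's actual formula) is booked minutes divided by the minutes in the observed window:

```javascript
// Hypothetical utilization: fraction of an observation window that was booked.
// The real calculatePrinterUtilization in @/utils/analytics/utilization may differ.
function utilizationPercentage(jobs, windowMinutes) {
  const booked = jobs.reduce((sum, job) => sum + job.durationInMinutes, 0);
  return Math.min(booked / windowMinutes, 1); // clamp so the donut never exceeds 100%
}

// Two jobs totalling 180 minutes in a 600-minute window:
console.log(utilizationPercentage([{ durationInMinutes: 60 }, { durationInMinutes: 120 }], 600)); // 0.3
```

Clamping to 1 matters because the component derives the free share as `1 - utilizationPercentage`, which would go negative otherwise.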
@ -0,0 +1,69 @@
"use client";
import { Bar, BarChart, CartesianGrid, LabelList, XAxis } from "recharts";

import { Card, CardContent, CardDescription, CardFooter, CardHeader, CardTitle } from "@/components/ui/card";
import { type ChartConfig, ChartContainer, ChartTooltip, ChartTooltipContent } from "@/components/ui/chart";

export const description = "Ein Balkendiagramm mit Beschriftung";

interface PrintVolumes {
  today: number;
  thisWeek: number;
  thisMonth: number;
}

const chartConfig = {
  volume: {
    label: "Volumen",
  },
} satisfies ChartConfig;

interface PrinterVolumeChartProps {
  printerVolume: PrintVolumes;
}

export function PrinterVolumeChart({ printerVolume }: PrinterVolumeChartProps) {
  const chartData = [
    { period: "Heute", volume: printerVolume.today, color: "hsl(var(--chart-1))" },
    { period: "Diese Woche", volume: printerVolume.thisWeek, color: "hsl(var(--chart-2))" },
    { period: "Diesen Monat", volume: printerVolume.thisMonth, color: "hsl(var(--chart-3))" },
  ];

  return (
    <Card>
      <CardHeader>
        <CardTitle>Druckvolumen</CardTitle>
        <CardDescription>Vergleich: Heute, Diese Woche, Diesen Monat</CardDescription>
      </CardHeader>
      <CardContent>
        <ChartContainer className="h-64 w-full" config={chartConfig}>
          <BarChart
            accessibilityLayer
            data={chartData}
            margin={{
              top: 20,
            }}
          >
            <CartesianGrid vertical={false} />
            <XAxis
              dataKey="period"
              tickLine={false}
              tickMargin={10}
              axisLine={false}
              tickFormatter={(value) => value}
            />
            <ChartTooltip cursor={false} content={<ChartTooltipContent hideLabel />} />
            <Bar dataKey="volume" fill="var(--color-volume)" radius={8}>
              <LabelList position="top" offset={12} className="fill-foreground" fontSize={12} />
            </Bar>
          </BarChart>
        </ChartContainer>
      </CardContent>
      <CardFooter className="flex-col items-start gap-2 text-sm">
        <div className="leading-none text-muted-foreground">
          Zeigt das Druckvolumen für heute, diese Woche und diesen Monat
        </div>
      </CardFooter>
    </Card>
  );
}
@ -1,24 +0,0 @@
"use client";

import { DonutChart, Legend } from "@tremor/react";

const dataFormatter = (number: number) => Intl.NumberFormat("de-DE").format(number).toString();

interface PrintJobsDonutProps {
  // biome-ignore lint/suspicious/noExplicitAny: temp. fix
  data: any[];
}
export function PrintJobsDonut(props: PrintJobsDonutProps) {
  const { data } = props;

  return (
    <div className="flex gap-4">
      <DonutChart data={data} variant="donut" colors={["green", "red", "yellow"]} valueFormatter={dataFormatter} />
      <Legend
        categories={["Abgeschlossen", "Abgebrochen", "Ausstehend"]}
        colors={["green", "red", "yellow"]}
        className="max-w-xs"
      />
    </div>
  );
}
@ -1,18 +1,20 @@
import { AdminSidebar } from "@/app/admin/admin-sidebar";
import { validateRequest } from "@/server/auth";
import { UserRole } from "@/server/auth/permissions";
import { guard, is_not } from "@/utils/heimdall";
import { IS_NOT, guard } from "@/utils/guard";
import { redirect } from "next/navigation";

interface AdminLayoutProps {
  children: React.ReactNode;
}

export const dynamic = "force-dynamic";

export default async function AdminLayout(props: AdminLayoutProps) {
  const { children } = props;
  const { user } = await validateRequest();

  if (guard(user, is_not, UserRole.ADMIN)) {
  if (guard(user, IS_NOT, UserRole.ADMIN)) {
    redirect("/");
  }
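The `guard` helper from `@/utils/guard` that replaces the old `heimdall` import is not shown in this diff; from its call site it appears to compare a user's role against an expected role. A hypothetical sketch of such a helper (name, operator constant, and semantics are all inferred from the call site, the real implementation may differ):

```javascript
// Hypothetical sketch of the guard helper, inferred from its call site
// `guard(user, IS_NOT, UserRole.ADMIN)`. Returns true when the check matches,
// so the caller can redirect non-admins away.
const IS_NOT = Symbol("is_not");

function guard(user, operator, role) {
  const hasRole = user != null && user.role === role;
  return operator === IS_NOT ? !hasRole : hasRole;
}

console.log(guard({ role: "admin" }, IS_NOT, "admin")); // false → no redirect
console.log(guard(null, IS_NOT, "admin")); // true → redirect("/")
```

Treating a missing user as "does not have the role" makes the unauthenticated case fall through to the redirect as well.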
@ -1,10 +1,17 @@
import { AbortReasonsBarChart } from "@/app/admin/charts/abort-reasons";
import { LoadFactorChart } from "@/app/admin/charts/load-factor";
import { PrintJobsDonut } from "@/app/admin/charts/printjobs-donut";
import { AbortReasonCountChart } from "@/app/admin/charts/printer-error-chart";
import { PrinterErrorRateChart } from "@/app/admin/charts/printer-error-rate";
import { ForecastPrinterUsageChart } from "@/app/admin/charts/printer-forecast";
import { PrinterUtilizationChart } from "@/app/admin/charts/printer-utilization";
import { PrinterVolumeChart } from "@/app/admin/charts/printer-volume";
import { DataCard } from "@/components/data-card";
import { Card, CardContent, CardDescription, CardHeader, CardTitle } from "@/components/ui/card";
import { Tabs, TabsContent, TabsList, TabsTrigger } from "@/components/ui/tabs";
import { db } from "@/server/db";
import { calculatePrinterErrorRate } from "@/utils/analytics/error-rate";
import { calculateAbortReasonsCount } from "@/utils/analytics/errors";
import { forecastPrinterUsage } from "@/utils/analytics/forecast";
import { calculatePrinterUtilization } from "@/utils/analytics/utilization";
import { calculatePrintVolumes } from "@/utils/analytics/volume";
import type { Metadata } from "next";

export const metadata: Metadata = {
@ -14,114 +21,100 @@ export const metadata: Metadata = {
export const dynamic = "force-dynamic";

export default async function AdminPage() {
  const allPrintJobs = await db.query.printJobs.findMany({
  const currentDate = new Date();

  const lastMonth = new Date();
  lastMonth.setDate(currentDate.getDate() - 31);
  const printers = await db.query.printers.findMany({});
  const printJobs = await db.query.printJobs.findMany({
    where: (job, { gte }) => gte(job.startAt, lastMonth),
    with: {
      printer: true,
    },
  });
  if (printJobs.length < 1) {
    return (
      <Card className="w-full">
        <CardHeader>
          <CardTitle>Druckaufträge</CardTitle>
          <CardDescription>Zurzeit sind keine Druckaufträge verfügbar.</CardDescription>
        </CardHeader>
        <CardContent>
          <p>Aktualisieren Sie die Seite oder prüfen Sie später erneut, ob neue Druckaufträge verfügbar sind.</p>
        </CardContent>
      </Card>
    );
  }

  const totalAmountOfPrintJobs = allPrintJobs.length;

  const now = new Date();
  const completedPrintJobs = allPrintJobs.filter((job) => {
  const currentPrintJobs = printJobs.filter((job) => {
    if (job.aborted) return false;
    const endAt = new Date(job.startAt).getTime() + job.durationInMinutes * 1000 * 60;
    return endAt < now.getTime();
  }).length;
  const abortedPrintJobs = allPrintJobs.filter((job) => job.aborted).length;
  const pendingPrintJobs = totalAmountOfPrintJobs - completedPrintJobs - abortedPrintJobs;

  const abortedPrintJobsReasons = Object.entries(
    allPrintJobs.reduce((accumulator: Record<string, number>, job) => {
      if (job.aborted && job.abortReason) {
        if (!accumulator[job.abortReason]) {
          accumulator[job.abortReason] = 1;
        } else {
          accumulator[job.abortReason]++;
        }
      }
      return accumulator;
    }, {}),
  ).map(([name, count]) => ({ name, Anzahl: count }));
    const endAt = job.startAt.getTime() + job.durationInMinutes * 1000 * 60;

  const mostAbortedPrinter = allPrintJobs.reduce((prev, current) => (prev.aborted > current.aborted ? prev : current));

  const mostUsedPrinter = allPrintJobs.reduce((prev, current) =>
    prev.durationInMinutes > current.durationInMinutes ? prev : current,
  );

  const allPrinters = await db.query.printers.findMany();

  const freePrinters = allPrinters.filter((printer) => {
    const jobs = allPrintJobs.filter((job) => job.printerId === printer.id);
    const now = new Date();
    const inUse = jobs.some((job) => {
      const endAt = new Date(job.startAt).getTime() + job.durationInMinutes * 1000 * 60;
      return endAt > now.getTime();
    });
    return !inUse;
    return endAt > currentDate.getTime();
  });
  const occupiedPrinters = currentPrintJobs.map((job) => job.printer.id);
  const freePrinters = printers.filter((printer) => !occupiedPrinters.includes(printer.id));
  const printerUtilization = calculatePrinterUtilization(printJobs);
  const printerVolume = calculatePrintVolumes(printJobs);
  const printerAbortReasons = calculateAbortReasonsCount(printJobs);
  const printerErrorRate = calculatePrinterErrorRate(printJobs);
  const printerForecast = forecastPrinterUsage(printJobs);

  return (
    <>
      <Tabs defaultValue={"@general"} className="flex flex-col gap-4 items-start">
        <TabsList className="bg-neutral-100 w-full py-6">
          <TabsTrigger value="@general">Allgemein</TabsTrigger>
          {allPrinters.map((printer) => (
            <TabsTrigger key={printer.id} value={printer.id}>
              {printer.name}
            </TabsTrigger>
          ))}
          <TabsTrigger value="@capacity">Druckerauslastung</TabsTrigger>
          <TabsTrigger value="@report">Fehlerberichte</TabsTrigger>
          <TabsTrigger value="@forecasts">Prognosen</TabsTrigger>
        </TabsList>
        <TabsContent value="@general" className="w-full">
          <div className="flex flex-col lg:grid lg:grid-cols-2 gap-4">
            <DataCard title="Drucker mit meisten Reservierungen" value={mostUsedPrinter.printer.name} icon="Printer" />
            <DataCard title="Drucker mit meisten Abbrüchen" value={mostAbortedPrinter.printer.name} icon="Printer" />
            <Card className="w-full">
              <CardHeader>
                <CardTitle>Druckaufträge</CardTitle>
                <CardDescription>nach Status</CardDescription>
              </CardHeader>
              <CardContent>
                <PrintJobsDonut
                  data={[
                    { name: "Abgeschlossen", value: completedPrintJobs },
                    { name: "Abgebrochen", value: abortedPrintJobs },
                    { name: "Ausstehend", value: pendingPrintJobs },
                  ]}
                />
              </CardContent>
            </Card>
            <Card className="w-full ">
              <CardHeader>
|
||||
<CardTitle>
|
||||
Auslastung: <span>{((1 - freePrinters.length / allPrinters.length) * 100).toFixed(2)}%</span>
|
||||
</CardTitle>
|
||||
</CardHeader>
|
||||
<CardContent>
|
||||
<LoadFactorChart
|
||||
data={[
|
||||
{ name: "Frei", value: freePrinters.length },
|
||||
{ name: "Belegt", value: allPrinters.length - freePrinters.length },
|
||||
]}
|
||||
/>
|
||||
</CardContent>
|
||||
</Card>
|
||||
<Card className="w-full col-span-2">
|
||||
<CardHeader>
|
||||
<CardTitle>Abgebrochene Druckaufträge nach Abbruchgrund</CardTitle>
|
||||
</CardHeader>
|
||||
<CardContent>
|
||||
<AbortReasonsBarChart data={abortedPrintJobsReasons} />
|
||||
</CardContent>
|
||||
</Card>
|
||||
<div className="w-full col-span-2">
|
||||
<DataCard
|
||||
title="Aktuelle Auslastung"
|
||||
value={`${Math.round((occupiedPrinters.length / (freePrinters.length + occupiedPrinters.length)) * 100)}%`}
|
||||
icon={"Percent"}
|
||||
/>
|
||||
</div>
|
||||
<DataCard title="Aktive Drucker" value={occupiedPrinters.length} icon={"Rotate3d"} />
|
||||
<DataCard title="Freie Drucker" value={freePrinters.length} icon={"PowerOff"} />
|
||||
</div>
|
||||
</TabsContent>
|
||||
<TabsContent value="@capacity" className="w-full">
|
||||
<div className="flex flex-col lg:grid lg:grid-cols-2 gap-4">
|
||||
<div className="w-full col-span-2">
|
||||
<PrinterVolumeChart printerVolume={printerVolume} />
|
||||
</div>
|
||||
{printerUtilization.map((data) => (
|
||||
<PrinterUtilizationChart key={data.printerId} data={data} />
|
||||
))}
|
||||
</div>
|
||||
</TabsContent>
|
||||
<TabsContent value="@report" className="w-full">
|
||||
<div className="flex flex-col lg:grid lg:grid-cols-2 gap-4">
|
||||
<div className="w-full col-span-2">
|
||||
<PrinterErrorRateChart printerErrorRate={printerErrorRate} />
|
||||
</div>
|
||||
<div className="w-full col-span-2">
|
||||
<AbortReasonCountChart abortReasonCount={printerAbortReasons} />
|
||||
</div>
|
||||
</div>
|
||||
</TabsContent>
|
||||
<TabsContent value="@forecasts" className="w-full">
|
||||
<div className="flex flex-col lg:grid lg:grid-cols-2 gap-4">
|
||||
<div className="w-full col-span-2">
|
||||
<ForecastPrinterUsageChart
|
||||
forecastData={printerForecast.map((usageMinutes, index) => ({
|
||||
day: index,
|
||||
usageMinutes,
|
||||
}))}
|
||||
/>
|
||||
</div>
|
||||
</div>
|
||||
</TabsContent>
|
||||
{allPrinters.map((printer) => (
|
||||
<TabsContent key={printer.id} value={printer.id}>
|
||||
{printer.description}
|
||||
</TabsContent>
|
||||
))}
|
||||
</Tabs>
|
||||
</>
|
||||
);
|
||||
|
@@ -29,7 +29,13 @@ export function DeletePrinterDialog(props: DeletePrinterDialogProps) {
description: "Drucker wird gelöscht...",
});
try {
await deletePrinter(printerId);
const result = await deletePrinter(printerId);
if (result?.error) {
toast({
description: result.error,
variant: "destructive",
});
}
toast({
description: "Drucker wurde gelöscht.",
});
@@ -57,11 +57,17 @@ export function PrinterForm(props: PrinterFormProps) {

// Update
try {
await updatePrinter(printer.id, {
const result = await updatePrinter(printer.id, {
description: values.description,
name: values.name,
status: values.status,
});
if (result?.error) {
toast({
description: result.error,
variant: "destructive",
});
}

setOpen(false);
@@ -90,11 +96,17 @@ export function PrinterForm(props: PrinterFormProps) {

// Create
try {
await createPrinter({
const result = await createPrinter({
description: values.description,
name: values.name,
status: values.status,
});
if (result?.error) {
toast({
description: result.error,
variant: "destructive",
});
}

setOpen(false);
@@ -1,5 +1,7 @@
import fs from "node:fs";

export const dynamic = 'force-dynamic';

export async function GET() {
return new Response(fs.readFileSync("./db/sqlite.db"));
}
@@ -2,12 +2,19 @@ import { db } from "@/server/db";
import { printJobs } from "@/server/db/schema";
import { eq } from "drizzle-orm";

export const dynamic = "force-dynamic";

interface RemainingTimeRouteProps {
params: {
jobId: string;
};
}
export async function GET(request: Request, { params }: RemainingTimeRouteProps) {
// Trying to fix build error in container...
if (params.jobId === undefined) {
return Response.json({});
}

// Get the job details
const jobDetails = await db.query.printJobs.findFirst({
where: eq(printJobs.id, params.jobId),
@@ -1,5 +1,7 @@
import { getPrinters } from "@/server/actions/printers";

export const dynamic = "force-dynamic";

export async function GET() {
const printers = await getPrinters();
@@ -7,15 +7,30 @@ import { eq } from "drizzle-orm";
import { generateIdFromEntropySize } from "lucia";
import { cookies } from "next/headers";

export const dynamic = "force-dynamic";

interface GithubEmailResponse {
email: string;
primary: boolean;
verified: boolean;
visibility: string;
}

export async function GET(request: Request): Promise<Response> {
const url = new URL(request.url);
const code = url.searchParams.get("code");
const state = url.searchParams.get("state");
const storedState = cookies().get("github_oauth_state")?.value ?? null;
if (!code || !state || !storedState || state !== storedState) {
return new Response(null, {
status: 400,
});
return new Response(
JSON.stringify({
status_text: "Something is wrong",
data: { code, state, storedState },
}),
{
status: 400,
},
);
}

try {
@@ -27,7 +42,16 @@ export async function GET(request: Request): Promise<Response> {
});
const githubUser: GitHubUserResult = await githubUserResponse.json();

// Replace this with your own DB client.
// Sometimes email can be null in the user query.
if (githubUser.email === null || githubUser.email === undefined) {
const githubEmailResponse = await fetch("https://git.i.mercedes-benz.com/api/v3/user/emails", {
headers: {
Authorization: `Bearer ${tokens.accessToken}`,
},
});
const githubUserEmail: GithubEmailResponse[] = await githubEmailResponse.json();
githubUser.email = githubUserEmail[0].email;
}
const existingUser = await db.query.users.findFirst({
where: eq(users.github_id, githubUser.id),
});
@@ -56,7 +80,10 @@ export async function GET(request: Request): Promise<Response> {

const session = await lucia.createSession(userId, {});
const sessionCookie = lucia.createSessionCookie(session.id);
cookies().set(sessionCookie.name, sessionCookie.value, sessionCookie.attributes);
cookies().set(sessionCookie.name, sessionCookie.value, {
...sessionCookie.attributes,
secure: false, // Otherwise the cookie does not get set, because IT has not provided an SSL certificate yet
});
return new Response(null, {
status: 302,
headers: {
@@ -64,13 +91,18 @@ export async function GET(request: Request): Promise<Response> {
},
});
} catch (e) {
console.log(e);
// the specific error message depends on the provider
if (e instanceof OAuth2RequestError) {
// invalid code
return new Response(null, {
status: 400,
});
return new Response(
JSON.stringify({
status_text: "Invalid code",
error: JSON.stringify(e),
}),
{
status: 400,
},
);
}
return new Response(null, {
status: 500,
@@ -2,14 +2,18 @@ import { github } from "@/server/auth/oauth";
import { generateState } from "arctic";
import { cookies } from "next/headers";

export const dynamic = "force-dynamic";

export async function GET(): Promise<Response> {
const state = generateState();
const url = await github.createAuthorizationURL(state);
const url = await github.createAuthorizationURL(state, {
scopes: ["user"],
});
const ONE_HOUR = 60 * 60;

cookies().set("github_oauth_state", state, {
path: "/",
secure: process.env.NODE_ENV === "production",
secure: false, //process.env.NODE_ENV === "production", -- can't be used until SSL certificate is provided by IT
httpOnly: true,
maxAge: ONE_HOUR,
sameSite: "lax",
Binary file not shown. Before: 25 KiB, After: 166 KiB
@@ -2,76 +2,60 @@
@tailwind components;
@tailwind utilities;

@layer base {
:root {
--background: 0 0% 100%;
--foreground: 0 0% 3.9%;
--foreground: 222.2 84% 4.9%;
--card: 0 0% 100%;
--card-foreground: 0 0% 3.9%;
--card-foreground: 222.2 84% 4.9%;
--popover: 0 0% 100%;
--popover-foreground: 0 0% 3.9%;
--primary: 0 0% 9%;
--primary-foreground: 0 0% 98%;
--secondary: 0 0% 96.1%;
--secondary-foreground: 0 0% 9%;
--muted: 0 0% 90.1%;
--muted-foreground: 0 0% 45.1%;
--accent: 0 0% 96.1%;
--accent-foreground: 0 0% 9%;
--popover-foreground: 222.2 84% 4.9%;
--primary: 221.2 83.2% 53.3%;
--primary-foreground: 210 40% 98%;
--secondary: 210 40% 96.1%;
--secondary-foreground: 222.2 47.4% 11.2%;
--muted: 210 40% 96.1%;
--muted-foreground: 215.4 16.3% 46.9%;
--accent: 210 40% 96.1%;
--accent-foreground: 222.2 47.4% 11.2%;
--destructive: 0 84.2% 60.2%;
--destructive-foreground: 0 0% 98%;
--border: 0 0% 89.8%;
--input: 0 0% 89.8%;
--ring: 0 0% 3.9%;
--radius: 0.5rem;
--destructive-foreground: 210 40% 98%;
--border: 214.3 31.8% 91.4%;
--input: 214.3 31.8% 91.4%;
--ring: 221.2 83.2% 53.3%;
--radius: 0.75rem;
--chart-1: 12 76% 61%;
--chart-2: 173 58% 39%;
--chart-3: 197 37% 24%;
--chart-4: 43 74% 66%;
--chart-5: 27 87% 67%;
}

.dark {
--background: 0 0% 3.9%;
--foreground: 0 0% 98%;
--card: 0 0% 3.9%;
--card-foreground: 0 0% 98%;
--popover: 0 0% 3.9%;
--popover-foreground: 0 0% 98%;
--primary: 0 0% 98%;
--primary-foreground: 0 0% 9%;
--secondary: 0 0% 14.9%;
--secondary-foreground: 0 0% 98%;
--muted: 0 0% 14.9%;
--muted-foreground: 0 0% 63.9%;
--accent: 0 0% 14.9%;
--accent-foreground: 0 0% 98%;
--background: 222.2 84% 4.9%;
--foreground: 210 40% 98%;
--card: 222.2 84% 4.9%;
--card-foreground: 210 40% 98%;
--popover: 222.2 84% 4.9%;
--popover-foreground: 210 40% 98%;
--primary: 217.2 91.2% 59.8%;
--primary-foreground: 222.2 47.4% 11.2%;
--secondary: 217.2 32.6% 17.5%;
--secondary-foreground: 210 40% 98%;
--muted: 217.2 32.6% 17.5%;
--muted-foreground: 215 20.2% 65.1%;
--accent: 217.2 32.6% 17.5%;
--accent-foreground: 210 40% 98%;
--destructive: 0 62.8% 30.6%;
--destructive-foreground: 0 0% 98%;
--border: 0 0% 14.9%;
--input: 0 0% 14.9%;
--ring: 0 0% 83.1%;
--destructive-foreground: 210 40% 98%;
--border: 217.2 32.6% 17.5%;
--input: 217.2 32.6% 17.5%;
--ring: 224.3 76.3% 48%;
--chart-1: 220 70% 50%;
--chart-2: 160 60% 45%;
--chart-3: 30 80% 55%;
--chart-4: 280 65% 60%;
--chart-5: 340 75% 55%;
}
}

@layer base {
* {
@apply border-border;
}

body {
@apply bg-background text-foreground;
}
}
@@ -52,7 +52,13 @@ export function CancelForm(props: CancelFormProps) {
description: "Druckauftrag wird abgebrochen...",
});
try {
await abortPrintJob(jobId, values.abortReason);
const result = await abortPrintJob(jobId, values.abortReason);
if (result?.error) {
toast({
description: result.error,
variant: "destructive",
});
}
setOpen(false);
toast({
description: "Druckauftrag wurde abgebrochen.",
@@ -17,7 +17,13 @@ export function EditComments(props: EditCommentsProps) {

const debounced = useDebouncedCallback(async (value) => {
try {
await updatePrintComments(jobId, value);
const result = await updatePrintComments(jobId, value);
if (result?.error) {
toast({
description: result.error,
variant: "destructive",
});
}
toast({
description: "Anmerkungen wurden gespeichert.",
});
@@ -53,7 +53,14 @@ export function ExtendForm(props: ExtendFormProps) {
description: "Druckauftrag wird verlängert...",
});
try {
await extendPrintJob(jobId, values.minutes, values.hours);
const result = await extendPrintJob(jobId, values.minutes, values.hours);

if (result?.error) {
toast({
description: result.error,
variant: "destructive",
});
}

setOpen(false);
form.reset();
@@ -27,7 +27,13 @@ export function FinishForm(props: FinishFormProps) {
description: "Druckauftrag wird abgeschlossen...",
});
try {
await earlyFinishPrintJob(jobId);
const result = await earlyFinishPrintJob(jobId);
if (result?.error) {
toast({
description: result.error,
variant: "destructive",
});
}
toast({
description: "Druckauftrag wurde abgeschlossen.",
});
@@ -36,7 +36,7 @@ export default async function JobDetailsPage(props: JobDetailsPageProps) {
});

if (!jobDetails) {
return <div>Job not found</div>;
return <div>Druckauftrag wurde nicht gefunden.</div>;
}

const jobIsOnGoing = new Date(jobDetails.startAt).getTime() + jobDetails.durationInMinutes * 60 * 1000 > Date.now();
@@ -1,15 +1,8 @@
import { Header } from "@/components/header";
import { Toaster } from "@/components/ui/toaster";
import { cn } from "@/utils/styles";
import type { Metadata } from "next";

import "@/app/globals.css";
import { Inter as FontSans } from "next/font/google";

const fontSans = FontSans({
subsets: ["latin"],
variable: "--font-sans",
});

export const metadata: Metadata = {
title: {
@@ -23,13 +16,15 @@ interface RootLayoutProps {
children: React.ReactNode;
}

export const dynamic = "force-dynamic";

export default function RootLayout(props: RootLayoutProps) {
const { children } = props;

return (
<html lang="de" suppressHydrationWarning>
<head />
<body className={cn("min-h-dvh bg-muted font-sans antialiased", fontSans.variable)}>
<body className={"min-h-dvh bg-neutral-200 font-sans antialiased"}>
<Header />
<main className="flex-grow max-w-screen-2xl w-full mx-auto flex flex-col p-8 gap-4 text-foreground">
{children}
packages/reservation-platform/src/app/not-found.tsx (new file, 11 lines)
@@ -0,0 +1,11 @@
import Link from "next/link";

export default function NotFound() {
return (
<div>
<h2>Nicht gefunden</h2>
<p>Die angefragte Seite konnte nicht gefunden werden.</p>
<Link href="/">Zurück zur Startseite</Link>
</div>
);
}
@@ -1,11 +1,12 @@
import { columns } from "@/app/my/jobs/columns";
import { JobsTable } from "@/app/my/jobs/data-table";
import { DynamicPrinterCards } from "@/components/dynamic-printer-cards";
import { Card, CardContent, CardDescription, CardHeader, CardTitle } from "@/components/ui/card";
import { Card, CardContent, CardHeader, CardTitle } from "@/components/ui/card";
import { validateRequest } from "@/server/auth";
import { db } from "@/server/db";
import { printJobs } from "@/server/db/schema";
import { desc, eq } from "drizzle-orm";
import { BoxesIcon, NewspaperIcon } from "lucide-react";
import type { Metadata } from "next";

export const metadata: Metadata = {
@@ -43,8 +44,10 @@ export default async function HomePage() {
{/* NEEDS TO BE FIXED FOR A NEW / EMPTY USER {isLoggedIn && <PersonalizedCards />} */}
<Card>
<CardHeader>
<CardTitle>Druckerbelegung</CardTitle>
<CardDescription>({printers.length} Verfügbar)</CardDescription>
<CardTitle className="flex flex-row items-center gap-x-1">
<BoxesIcon className="w-5 h-5" />
<span className="text-lg">Druckerbelegung</span>
</CardTitle>
</CardHeader>
<CardContent className="grid grid-cols-1 md:grid-cols-2 lg:grid-cols-3 xl:grid-cols-4 gap-4">
<DynamicPrinterCards user={user} />
@@ -53,8 +56,10 @@ export default async function HomePage() {
{userIsLoggedIn && (
<Card>
<CardHeader>
<CardTitle>Druckaufträge</CardTitle>
<CardDescription>Deine aktuellen Druckaufträge</CardDescription>
<CardTitle className="flex flex-row items-center gap-x-1">
<NewspaperIcon className="w-5 h-5" />
<span className="text-lg">Druckaufträge</span>
</CardTitle>
</CardHeader>
<CardContent>
<JobsTable columns={columns} data={jobs} />
@@ -1,15 +1,7 @@
"use client";
import { Button } from "@/components/ui/button";
import { DialogClose } from "@/components/ui/dialog";
import {
Form,
FormControl,
FormDescription,
FormField,
FormItem,
FormLabel,
FormMessage,
} from "@/components/ui/form";
import { Form, FormControl, FormDescription, FormField, FormItem, FormLabel, FormMessage } from "@/components/ui/form";
import { Input } from "@/components/ui/input";
import { Textarea } from "@/components/ui/textarea";
import { useToast } from "@/components/ui/use-toast";
@@ -17,6 +9,7 @@ import { createPrintJob } from "@/server/actions/printJobs";
import { zodResolver } from "@hookform/resolvers/zod";
import { CalendarPlusIcon, XCircleIcon } from "lucide-react";
import { useRouter } from "next/navigation";
import { useState } from "react";
import { useForm } from "react-hook-form";
import { If, Then } from "react-if";
import { z } from "zod";
@@ -41,6 +34,7 @@ export function PrinterReserveForm(props: PrinterReserveFormProps) {
const { userId, printerId, isDialog } = props;
const router = useRouter();
const { toast } = useToast();
const [isLocked, setLocked] = useState(false);

const form = useForm<z.infer<typeof formSchema>>({
resolver: zodResolver(formSchema),
@@ -52,13 +46,25 @@ export function PrinterReserveForm(props: PrinterReserveFormProps) {
});

async function onSubmit(values: z.infer<typeof formSchema>) {
if (!isLocked) {
setLocked(true);
setTimeout(() => {
setLocked(false);
}, 1000 * 5);
} else {
toast({
description: "Bitte warte ein wenig, bevor du eine weitere Reservierung tätigst...",
variant: "default",
});
return;
}

if (values.hours === 0 && values.minutes === 0) {
form.setError("hours", {
message: "",
});
form.setError("minutes", {
message:
"Die Dauer des Druckauftrags muss mindestens 1 Minute betragen.",
message: "Die Dauer des Druckauftrags muss mindestens 1 Minute betragen.",
});
return;
}
@@ -70,6 +76,12 @@ export function PrinterReserveForm(props: PrinterReserveFormProps) {
userId: userId,
printerId: printerId,
});
if (typeof jobId === "object") {
toast({
description: jobId.error,
variant: "destructive",
});
}

router.push(`/job/${jobId}`);
} catch (error) {
@@ -128,9 +140,8 @@ export function PrinterReserveForm(props: PrinterReserveFormProps) {
<Textarea placeholder="" {...field} />
</FormControl>
<FormDescription>
In dieses Feld kannst du Anmerkungen zu deinem Druckauftrag
hinzufügen. Sie können beispielsweise Informationen über das
Druckmaterial, die Druckqualität oder die Farbe enthalten.
In dieses Feld kannst du Anmerkungen zu deinem Druckauftrag hinzufügen. Sie können beispielsweise
Informationen über das Druckmaterial, die Druckqualität oder die Farbe enthalten.
</FormDescription>
<FormMessage />
</FormItem>
@@ -140,17 +151,14 @@ export function PrinterReserveForm(props: PrinterReserveFormProps) {
<If condition={isDialog}>
<Then>
<DialogClose asChild>
<Button
variant={"secondary"}
className="gap-2 flex items-center"
>
<Button variant={"secondary"} className="gap-2 flex items-center">
<XCircleIcon className="w-4 h-4" />
<span>Abbrechen</span>
</Button>
</DialogClose>
</Then>
</If>
<Button type="submit" className="gap-2 flex items-center">
<Button type="submit" className="gap-2 flex items-center" disabled={isLocked}>
<CalendarPlusIcon className="w-4 h-4" />
<span>Reservieren</span>
</Button>
@@ -1,7 +1,7 @@
import { HeaderNavigation } from "@/components/header/navigation";
import { LoginButton } from "@/components/login-button";
import { LogoutButton } from "@/components/logout-button";
import { Avatar, AvatarFallback } from "@/components/ui/avatar";
import { Button } from "@/components/ui/button";
import {
DropdownMenu,
DropdownMenuContent,
@@ -13,7 +13,7 @@ import {
} from "@/components/ui/dropdown-menu";
import { validateRequest } from "@/server/auth";
import { UserRole, hasRole } from "@/server/auth/permissions";
import { ScanFaceIcon, StickerIcon, UserIcon, WrenchIcon } from "lucide-react";
import { StickerIcon, UserIcon, WrenchIcon } from "lucide-react";
import Link from "next/link";
import { If, Then } from "react-if";

@@ -78,14 +78,7 @@ export async function Header() {
</DropdownMenuContent>
</DropdownMenu>
)}
{user == null && (
<Button variant={"ghost"} className="gap-2 flex items-center" asChild>
<Link href="/auth/login">
<ScanFaceIcon className="w-4 h-4" />
<span>Anmelden</span>
</Link>
</Button>
)}
{user == null && <LoginButton />}
</div>
</header>
);
@@ -1,10 +1,12 @@
"use client";
import { cn } from "@/utils/styles";
import { ContactRoundIcon, LayersIcon } from "lucide-react";
import Link from "next/link";
import { usePathname } from "next/navigation";

interface Site {
name: string;
icon: JSX.Element;
path: string;
}

@@ -12,7 +14,8 @@ export function HeaderNavigation() {
const pathname = usePathname();
const sites: Site[] = [
{
name: "Mein Dashboard",
name: "Dashboard",
icon: <LayersIcon className="w-4 h-4" />,
path: "/",
},
/* {
@@ -21,6 +24,7 @@ export function HeaderNavigation() {
}, */
{
name: "Mein Profil",
icon: <ContactRoundIcon className="w-4 h-4" />,
path: "/my/profile",
},
];
@@ -31,12 +35,13 @@ export function HeaderNavigation() {
<Link
key={site.path}
href={site.path}
className={cn("transition-colors hover:text-neutral-50", {
"text-neutral-50": pathname === site.path,
className={cn("transition-colors hover:text-neutral-50 flex items-center gap-x-1", {
"text-primary-foreground font-semibold": pathname === site.path,
"text-neutral-500": pathname !== site.path,
})}
>
{site.name}
{site.icon}
<span>{site.name}</span>
</Link>
))}
</nav>
@@ -0,0 +1,37 @@
"use client";

import { Button } from "@/components/ui/button";
import { useToast } from "@/components/ui/use-toast";
import { ScanFaceIcon } from "lucide-react";
import Link from "next/link";
import { useState } from "react";

export function LoginButton() {
const { toast } = useToast();
const [isLocked, setLocked] = useState(false);
function onClick() {
if (!isLocked) {
toast({
description: "Du wirst angemeldet...",
});

// Prevent multiple clicks because of login delay...
setLocked(true);
setTimeout(() => {
setLocked(false);
}, 1000 * 5);
}
toast({
description: "Bitte warte einen Moment...",
});
}

return (
<Button onClick={onClick} variant={"ghost"} className="gap-2 flex items-center" asChild disabled={isLocked}>
<Link href="/auth/login">
<ScanFaceIcon className="w-4 h-4" />
<span>Anmelden</span>
</Link>
</Button>
);
}
@@ -1,12 +1,21 @@
"use client";

import { useToast } from "@/components/ui/use-toast";
import { logout } from "@/server/actions/authentication/logout";
import { LogOutIcon } from "lucide-react";
import Link from "next/link";

export function LogoutButton() {
const { toast } = useToast();
function onClick() {
toast({
description: "Du wirst nun abgemeldet...",
});
logout();
}

return (
<Link href="/" onClick={() => logout()} className="flex items-center gap-2">
<Link href="/" onClick={onClick} className="flex items-center gap-2">
<LogOutIcon className="w-4 h-4" />
<span>Abmelden</span>
</Link>
Some files were not shown because too many files have changed in this diff.