Support LAN deployment and managed Python runtime

Made-with: Cursor
bzuccaro
2026-04-25 18:05:56 +00:00
parent fe568c17cd
commit e95ee9c938
12 changed files with 263 additions and 72 deletions
+14 -14
@@ -101,7 +101,7 @@ If CUDA is already mounted or preinstalled outside `PATH`, the installer detects
- Run `./labctl assets lab2` when you want to populate repo-local Lab 2 assets in `assets/lab2/` from Hugging Face.
- After base setup, run `state/lab2/download_whiterabbitneo-gguf.sh` to fetch only the `Q4_K_M`, `Q8_0`, and `IQ2_M` files from `bartowski/WhiteRabbitNeo_WhiteRabbitNeo-V3-7B-GGUF` and register local Ollama models `WhiteRabbitNeo`, `WhiteRabbitNeo-Q4`, `WhiteRabbitNeo-Q8`, and `WhiteRabbitNeo-IQ2`.
- Unsloth homes are redirected into this project's `state/` tree via symlinks.
-- Managed web services bind for access from both Linux and the Windows side of WSL, while `labctl urls` still reports localhost-friendly URLs.
+- Managed web services bind on all interfaces for headless LAN/VPN access. `labctl urls` reports the detected LAN IP by default; set `COURSEWARE_URL_HOST=<host-or-ip>` before `./labctl up` to advertise a specific VPN DNS name or address.
- The local Ansible bootstrap in `.venv-ansible/` is machine-specific and will be recreated automatically if the folder is copied between hosts.
- `llama.cpp` uses a conservative, memory-aware build parallelism setting instead of an unbounded `-j` build, which avoids OOM failures on smaller Linux and WSL hosts.
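The memory-aware build parallelism mentioned above can be sketched roughly as follows. This is a hypothetical illustration only: the ~2 GiB-per-compile-job budget and the variable names are assumptions, not the installer's actual logic.

```shell
#!/usr/bin/env bash
# Sketch: cap build jobs by available RAM, assuming ~2 GiB per compile job.
mem_gib=$(awk '/MemAvailable/ {print int($2 / 1048576)}' /proc/meminfo)
cpus=$(nproc)
jobs=$(( mem_gib / 2 ))
# Clamp to at least one job and at most one per CPU.
[ "$jobs" -lt 1 ] && jobs=1
[ "$jobs" -gt "$cpus" ] && jobs=$cpus
echo "make -j$jobs"
```

On an 8-core WSL host with 6 GiB free this would run `make -j3` instead of an unbounded `make -j`, which is the class of OOM failure the note describes.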
@@ -109,25 +109,25 @@ If CUDA is already mounted or preinstalled outside `PATH`, the installer detects
After `./deploy-courseware.sh`, run `./labctl urls`.
-Default endpoints:
+Default endpoints use the detected host LAN IP:
-- Ollama API: `http://127.0.0.1:11434`
-- Open WebUI: `http://127.0.0.1:8080`
-- Netron: `http://127.0.0.1:8338`
-- ChunkViz: `http://127.0.0.1:3001`
-- Embedding Atlas: `http://127.0.0.1:5055`
-- Unsloth Studio: `http://127.0.0.1:8888`
-- Promptfoo UI: `http://127.0.0.1:15500`
-- Wiki: `http://127.0.0.1:80`
-- Lab 3 Terminal: `http://127.0.0.1:7681/wetty`
+- Ollama API: `http://<host-lan-ip>:11434`
+- Open WebUI: `http://<host-lan-ip>:8080`
+- Netron: `http://<host-lan-ip>:8338`
+- ChunkViz: `http://<host-lan-ip>:3001`
+- Embedding Atlas: `http://<host-lan-ip>:5055`
+- Unsloth Studio: `http://<host-lan-ip>:8888`
+- Promptfoo UI: `http://<host-lan-ip>:15500`
+- Wiki: `http://<host-lan-ip>:80`
+- Lab 3 Terminal: `http://<host-lan-ip>:7681/wetty`
## Lab 3 Browser Terminal
The deployment will:
-- bind `sshd` to `127.0.0.1:22` only
-- install WeTTY and expose it at `http://127.0.0.1:7681/wetty`
-- leave login identity management to the host, so any existing local account with password-based SSH access can sign in through the browser terminal
+- bind `sshd` to `0.0.0.0:22` so regular SSH clients can connect over the LAN/VPN
+- install WeTTY and expose it at `http://<host-lan-ip>:7681/wetty`
+- leave login identity management to the host, so any existing local account with password-based SSH access can sign in through SSH or the browser terminal
## Notes
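How `labctl urls` might resolve the advertised host can be sketched as below. This is a guess at a plausible mechanism, not `labctl`'s actual code: the route-based lookup and the `detect_lan_ip` helper are illustrative, and `lab.example.vpn` is a placeholder VPN DNS name.

```shell
#!/usr/bin/env bash
# Sketch: prefer an explicit COURSEWARE_URL_HOST override, else ask the
# kernel which source address would be used to reach an external host.
detect_lan_ip() {
  if [ -n "${COURSEWARE_URL_HOST:-}" ]; then
    echo "$COURSEWARE_URL_HOST"
    return
  fi
  # "src" field of `ip route get` is the outbound interface's address.
  ip -4 route get 1.1.1.1 2>/dev/null \
    | awk '{for (i = 1; i < NF; i++) if ($i == "src") print $(i + 1)}'
}

# With the override set, every reported URL uses the VPN name instead of
# the detected LAN IP, matching the behavior described in the diff.
COURSEWARE_URL_HOST=lab.example.vpn
echo "Ollama API: http://$(detect_lan_ip):11434"
```

Unsetting `COURSEWARE_URL_HOST` falls back to the detected LAN IP, which is the default behavior the commit introduces.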