11 Commits

Author SHA1 Message Date
c4ch3c4d3 86a5df4681 Update Lab 2 Ollama Gemma model
Made-with: Cursor
2026-04-25 15:40:50 -06:00
bzuccaro e95ee9c938 Support LAN deployment and managed Python runtime
Made-with: Cursor
2026-04-25 18:05:56 +00:00
bzuccaro fe568c17cd Focus local lab deployment on Linux and WSL 2026-04-24 21:32:01 -06:00
OpenCode e915d87ec6 Align installer with updated lab models 2026-04-24 20:08:56 -06:00
OpenCode 7360cd040a Add wiki refresh command and service updates 2026-04-24 10:02:39 -06:00
OpenCode 78676ece59 Preload 2026-04-23 14:48:06 -06:00
OpenCode 30f80fe058 Fix regex escaping in Lab 1 Ollama version extraction
In the YAML folded block scalar, \\. reaches regex_search literally as
\\. (an escaped backslash plus any character), so it never matched valid
semver strings like 0.21.1. Use \. instead so the dot is matched literally.
2026-04-22 20:19:31 -06:00
c4ch3c4d3 c79bc2eec0 Restore executable bits for courseware scripts 2026-04-16 11:16:01 -06:00
c4ch3c4d3 59f3032f91 Provision Netron and Lab 1 local assets 2026-04-16 11:15:39 -06:00
c4ch3c4d3 56305680e0 Fix terminal deployment regressions 2026-04-13 21:22:16 -06:00
c4ch3c4d3 5576142aec Use host-managed SSH accounts for browser terminal 2026-04-13 19:40:38 -06:00
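The regex fix in `30f80fe058` above is easy to reproduce from a shell; grep's ERE treats a doubled escape the same way regex_search received it from the folded scalar (a hypothetical check, not part of the courseware):

```bash
# Doubled escape: ERE sees a literal backslash plus any character, so nothing matches.
echo "ollama version is 0.21.1" | grep -oE '[0-9]+\\.[0-9]+\\.[0-9]+'    # no output

# Single escape, as in the fixed task: the dot is matched literally.
echo "ollama version is 0.21.1" | grep -oE '[0-9]+\.[0-9]+\.[0-9]+'      # prints 0.21.1
```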
33 changed files with 621 additions and 675 deletions
+55 -48
@@ -4,32 +4,49 @@ This project builds a student-friendly local lab environment for the courseware
 - `./deploy-courseware.sh` installs and configures the environment, then starts every managed service.
 - `./destroy-courseware.sh` stops the managed services, uninstalls courseware-managed Ollama, and removes the project-owned lab state.
-- `./labctl` provides day-two controls such as `assets lab2`, `start`, `stop`, `status`, `urls`, `logs`, and `open kiln`.
+- `./labctl` provides day-two controls such as `assets lab2`, `ollama_models`, `update_wiki`, `start`, `stop`, `status`, `urls`, `logs`, and `open kiln`.
 ## What It Installs
 - Ollama
 - `llama.cpp`
-- TransformerLab, pinned to the classic single-user `v0.28.2` release
+- Netron, served locally on port `8338`
 - Open WebUI
 - ChunkViz
 - Embedding Atlas
 - Promptfoo
 - Unsloth Studio
 - Kiln Desktop
-- Course-specific support assets for lab 2 and lab 4
+- Course-specific support assets for lab 1, lab 2, and lab 4
+## Lab 1 Defaults
+Lab 1 is now provisioned directly by the installer:
+- The `Llama-3.2-1B.Q4_K_M.gguf` file is mirrored into `state/models/lab1/`.
+- The Lab 1 confidence widget uses the pre-pulled Gemma 4 E2B Q4 Ollama model, `batiai/gemma4-e2b:q4`.
+- The wiki serves a same-host download link for the Llama GGUF through `/api/lab1/models/...`.
+- Lab 1 confidence visualization requires Ollama `0.12.11` or newer because it depends on logprobs.
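A quick manual version of that gate, assuming `ollama` is on `PATH` (the installer performs the same comparison with Ansible's `version()` test):

```bash
# Compare the installed Ollama semver against the Lab 1 minimum using sort -V.
installed="$(ollama --version | grep -oE '[0-9]+\.[0-9]+\.[0-9]+' | head -n1)"
minimum="0.12.11"
if [ "$(printf '%s\n%s\n' "$minimum" "$installed" | sort -V | head -n1)" = "$minimum" ]; then
  echo "Ollama $installed supports Lab 1 logprobs"
else
  echo "Ollama $installed is older than $minimum; upgrade before running Lab 1" >&2
fi
```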
+## Lab 2 Defaults
+`./labctl up` now pre-pulls the Gemma 4 E2B Ollama variants used by the wiki widgets:
+- `gemma4:e2b-it-q8_0`
+- `batiai/gemma4-e2b:q4`
+- `batiai/gemma4-e2b:q6`
+If you want to re-pull just those managed Ollama models later, run `./labctl ollama_models`.
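Equivalent manual pulls, if you want the Lab 2 set without rerunning the installer (same tags as listed above):

```bash
# Pull the three managed Gemma 4 E2B variants directly with the Ollama CLI.
for model in gemma4:e2b-it-q8_0 batiai/gemma4-e2b:q4 batiai/gemma4-e2b:q6; do
  ollama pull "$model"
done
ollama list    # confirm all three variants are registered
```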
 ## Supported Host Profiles
-This build intentionally avoids the reference VM's hardware workarounds.
+This build is the Linux/WSL variant of LLM Labs Local. If you are deploying on Apple Silicon macOS, use the sibling `LLM-Labs-Local-Mac` project instead.
-- macOS: Apple Silicon only, with at least 16 GB unified memory.
 - Native Debian/Ubuntu: Debian-family Linux with an NVIDIA GPU visible to `nvidia-smi` and at least 8 GB VRAM.
 - WSL: Debian/Ubuntu-family Linux running under WSL, with the NVIDIA GPU exposed into the distro.
-The launcher and Ansible preflight now classify the host dynamically and apply different setup behavior for:
+The launcher and Ansible preflight classify the host dynamically and apply different setup behavior for:
-- `macos`
 - `native-debian-ubuntu`
 - `wsl`
@@ -51,6 +68,7 @@ On Linux and WSL, the first `./labctl up` or `./labctl preflight` run may prompt
 On Ubuntu WSL x86_64, preflight now installs the Linux-side CUDA toolkit automatically if it is missing.
 It first tries the distro package:
 - `sudo apt install -y nvidia-cuda-toolkit`
 If that package is unavailable or still does not expose `nvcc`, the installer falls back to NVIDIA's WSL-Ubuntu repository bootstrap for the toolkit only, not a Linux GPU driver.
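A sketch of that fallback order; the real logic lives in the preflight role, and the repo bootstrap is abbreviated here (it also installs NVIDIA's repo keyring before `apt-get update`):

```bash
# Prefer the distro toolkit, then fall back to NVIDIA's WSL-Ubuntu repo (toolkit only).
sudo apt-get install -y nvidia-cuda-toolkit || true
if ! command -v nvcc >/dev/null 2>&1; then
  wget https://developer.download.nvidia.com/compute/cuda/repos/wsl-ubuntu/x86_64/cuda-wsl-ubuntu.pin
  sudo mv cuda-wsl-ubuntu.pin /etc/apt/preferences.d/cuda-repository-pin-600
  sudo apt-get update && sudo apt-get install -y cuda-toolkit
fi
```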
@@ -64,77 +82,66 @@ For non-Ubuntu WSL distros, install the CUDA toolkit manually before running the
 ## Native Debian/Ubuntu CUDA Behavior
-On native Debian/Ubuntu hosts, the installer now handles three CUDA-toolkit cases:
+On native Debian/Ubuntu hosts, the installer handles three CUDA-toolkit cases:
 - If the toolkit is already usable, it reuses the existing install instead of forcing a reinstall.
 - If the distro exposes `nvidia-cuda-toolkit`, it installs that package.
 - If the distro package is unavailable, it bootstraps NVIDIA's official CUDA network repository for supported native Debian/Ubuntu releases and installs the toolkit from there.
-If `apt` starts in a broken dependency state, the installer now attempts `dpkg --configure -a` and `apt-get --fix-broken install` before retrying package installation.
+If `apt` starts in a broken dependency state, the installer attempts `dpkg --configure -a` and `apt-get --fix-broken install` before retrying package installation.
-If CUDA is already mounted or preinstalled outside `PATH`, the installer now detects standard locations such as `/usr/local/cuda/bin/nvcc` and `/usr/local/cuda-*/bin/nvcc`.
+If CUDA is already mounted or preinstalled outside `PATH`, the installer detects standard locations such as `/usr/local/cuda/bin/nvcc` and `/usr/local/cuda-*/bin/nvcc`.
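Detecting a toolkit that is installed but not on `PATH` amounts to probing those standard prefixes, roughly:

```bash
# Probe the standard CUDA prefixes for an nvcc binary outside PATH.
for candidate in /usr/local/cuda/bin/nvcc /usr/local/cuda-*/bin/nvcc; do
  if [ -x "$candidate" ]; then
    echo "Found CUDA toolkit: $("$candidate" --version | tail -n1)"
    break
  fi
done
```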
 ## Standard Assumptions
-- The host-side install path assumes modern local tooling, but TransformerLab itself is provisioned from a pinned classic single-user layout.
+- The default deployment is centered on Ollama-backed local inference and browser-based tools such as Netron and the wiki.
-- TransformerLab is intentionally pinned to the older single-user `v0.28.2` release because newer upstream releases changed the project structure and behavior in ways that break this courseware.
+- Netron is installed into a managed Python virtual environment and served locally instead of being provisioned as a desktop package.
-- This project does not rely on TransformerLab's upstream `install.sh`; the Ansible role provisions the pinned release directly so web assets, env layout, and runtime behavior stay reproducible.
+- Lab 1's Llama GGUF download is mirrored locally during `./labctl up`, so students do not have to fetch it manually from the original source.
-- The courseware repairs installed TransformerLab Fastchat plugin manifests so Fastchat-gated features such as Model Architecture and Visualize Logprobs stay available on pinned installs.
+- WhiteRabbitNeo assets remain a separate Lab 2 flow and are still handled outside the default `./labctl up` run.
-- No Ollama models are pulled during `./labctl up`; students pull models manually as part of the courseware.
+- Run `./labctl assets lab2` when you want to populate repo-local Lab 2 assets in `assets/lab2/` from Hugging Face.
-- WhiteRabbitNeo assets are handled separately from `./labctl up` and `./labctl preflight`.
-- Run `./labctl assets lab2` when you want to populate repo-local lab 2 assets in `assets/lab2/` from Hugging Face.
 - After base setup, run `state/lab2/download_whiterabbitneo-gguf.sh` to fetch only the `Q4_K_M`, `Q8_0`, and `IQ2_M` files from `bartowski/WhiteRabbitNeo_WhiteRabbitNeo-V3-7B-GGUF` and register local Ollama models `WhiteRabbitNeo`, `WhiteRabbitNeo-Q4`, `WhiteRabbitNeo-Q8`, and `WhiteRabbitNeo-IQ2`.
-- TransformerLab and Unsloth homes are redirected into this project's `state/` tree via symlinks.
+- Unsloth homes are redirected into this project's `state/` tree via symlinks.
-- Managed web services bind for access from both Linux and the Windows side of WSL, while `labctl urls` still reports localhost-friendly URLs.
+- Managed web services bind on all interfaces for headless LAN/VPN access. `labctl urls` reports the detected LAN IP by default; set `COURSEWARE_URL_HOST=<host-or-ip>` before `./labctl up` to advertise a specific VPN DNS name or address.
 - The local Ansible bootstrap in `.venv-ansible/` is machine-specific and will be recreated automatically if the folder is copied between hosts.
-- `llama.cpp` now uses a conservative, memory-aware build parallelism setting instead of an unbounded `-j` build, which avoids OOM failures on smaller Linux and WSL hosts.
+- `llama.cpp` uses a conservative, memory-aware build parallelism setting instead of an unbounded `-j` build, which avoids OOM failures on smaller Linux and WSL hosts.
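A minimal sketch of the memory-aware parallelism cap described in the last bullet; the role's actual heuristic may differ, and the ~2 GB-per-job figure is an assumption:

```bash
# Cap llama.cpp build jobs by available memory instead of an unbounded -j.
mem_gb=$(awk '/MemAvailable/ {print int($2 / 1024 / 1024)}' /proc/meminfo)
jobs=$(( mem_gb / 2 ))                    # assume ~2 GB per C++ compile job
[ "$jobs" -lt 1 ] && jobs=1
[ "$jobs" -gt "$(nproc)" ] && jobs=$(nproc)
cmake --build build -j "$jobs"
```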
 ## Lab URLs
 After `./deploy-courseware.sh`, run `./labctl urls`.
-Default endpoints:
+Default endpoints use the detected host LAN IP:
-- Ollama API: `http://127.0.0.1:11434`
+- Ollama API: `http://<host-lan-ip>:11434`
-- Open WebUI: `http://127.0.0.1:8080`
+- Open WebUI: `http://<host-lan-ip>:8080`
-- TransformerLab: `http://127.0.0.1:8338`
+- Netron: `http://<host-lan-ip>:8338`
-- ChunkViz: `http://127.0.0.1:3001`
+- ChunkViz: `http://<host-lan-ip>:3001`
-- Embedding Atlas: `http://127.0.0.1:5055`
+- Embedding Atlas: `http://<host-lan-ip>:5055`
-- Unsloth Studio: `http://127.0.0.1:8888`
+- Unsloth Studio: `http://<host-lan-ip>:8888`
-- Promptfoo UI: `http://127.0.0.1:15500`
+- Promptfoo UI: `http://<host-lan-ip>:15500`
-- Wiki: `http://127.0.0.1:80`
+- Wiki: `http://<host-lan-ip>:80`
-- Lab 3 Terminal: `http://127.0.0.1:7681/wetty`
+- Lab 3 Terminal: `http://<host-lan-ip>:7681/wetty`
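To advertise a VPN DNS name instead of the detected LAN IP, and to spot-check an endpoint from another machine (the hostname is illustrative):

```bash
export COURSEWARE_URL_HOST="lab.example.internal"   # hypothetical VPN DNS name
./labctl up
./labctl urls
# Verify the Ollama API answers over the LAN/VPN:
curl -s "http://lab.example.internal:11434/api/version"
```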
 ## Lab 3 Browser Terminal
-Linux and WSL deployments now require a managed `student` password hash before `./labctl up` or `./labctl preflight`.
-Example:
-```bash
-export COURSEWARE_STUDENT_PASSWORD_HASH="$(openssl passwd -6 'student-password')"
-./labctl up
-```
 The deployment will:
-- create the managed `student` account
-- create `/home/student/lab3`
-- bind `sshd` to `127.0.0.1:22` only
-- install WeTTY and expose it at `http://127.0.0.1:7681/wetty`
+- bind `sshd` to `0.0.0.0:22` so regular SSH clients can connect over the LAN/VPN
+- install WeTTY and expose it at `http://<host-lan-ip>:7681/wetty`
+- leave login identity management to the host, so any existing local account with password-based SSH access can sign in through SSH or the browser terminal
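So after deployment, any existing local account with a password can reach the same sshd both ways (username and address are placeholders):

```bash
# Plain SSH from another machine on the LAN/VPN:
ssh alice@192.168.1.50

# Or the browser terminal, which WeTTY proxies to the same sshd:
xdg-open "http://192.168.1.50:7681/wetty"
```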
 ## Notes
 - `./labctl up` installs the environment and then starts every managed service.
-- `./labctl versions` shows the pinned TransformerLab and Ansible runtime versions used by this workspace.
+- `./labctl versions` shows the pinned Netron version, minimum Ollama version, and Ansible runtime version used by this workspace.
 - `./labctl assets lab2` is a separate manual step that clones the base WhiteRabbitNeo repo into `assets/lab2/WhiteRabbitNeo-V3-7B` and downloads the supported `Q4_K_M`, `Q8_0`, and `IQ2_M` GGUFs into `assets/lab2/WhiteRabbitNeo_WhiteRabbitNeo-V3-7B-GGUF`.
-- TransformerLab is installed as a pinned single-user app and no default courseware-managed TransformerLab user is created automatically.
+- `./labctl ollama_models` re-pulls the managed Lab 2 Gemma 4 E2B Ollama model set without rerunning the full installer.
+- `./labctl update_wiki` hard-resets the managed wiki checkout to the remote latest, rebuilds it, and restarts only the managed wiki service on port `80`.
 - `./labctl start core` starts only `ollama` and `open-webui`.
 - `./labctl start all` starts every managed web service.
 - `./labctl open kiln` launches the Kiln desktop app installed into the project state.
 - The scripted Promptfoo install drops a starter config at `state/lab6/promptfoo.yaml`.
-- `labctl start all` now includes Promptfoo via `promptfoo view` and the cloned wiki app.
+- `labctl start all` includes Promptfoo via `promptfoo view` and the cloned wiki app.
 - Lab 2 includes `state/lab2/download_whiterabbitneo-gguf.sh`, which uses `git` + `git lfs` to pull only the supported WhiteRabbitNeo quants. Add `--download-only` if you want the files without Ollama registration.
 - The wiki is cloned from `https://git.zuccaro.me/bzuccaro/LLM-Labs.git` into `state/repos/LLM-Labs` and started with `npm`.
-- `./labctl down` now uninstalls Ollama entirely when this project installed it, instead of only stopping the service.
+- `./labctl down` uninstalls Ollama entirely when this project installed it, instead of only stopping the service.
-- Unsloth Studio currently supports chat and data workflows on macOS; Linux/WSL remains the standard path for NVIDIA-backed training.
+- This variant is intended for NVIDIA-backed Linux/WSL training and lab workflows.
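A typical day-two session with the controls listed above:

```bash
./labctl status          # show which managed services are running
./labctl ollama_models   # re-pull the managed Lab 2 Gemma 4 E2B set
./labctl update_wiki     # hard-reset, rebuild, and restart only the wiki
./labctl logs            # inspect managed service logs
```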
+37 -39
@@ -2,6 +2,8 @@ courseware_state_dir: "{{ courseware_root }}/state"
 courseware_markers_dir: "{{ courseware_state_dir }}/markers"
 courseware_logs_dir: "{{ courseware_state_dir }}/logs"
 courseware_run_dir: "{{ courseware_state_dir }}/run"
+courseware_cache_dir: "{{ courseware_state_dir }}/cache"
+courseware_tmp_dir: "{{ courseware_state_dir }}/tmp"
 courseware_repos_dir: "{{ courseware_state_dir }}/repos"
 courseware_venvs_dir: "{{ courseware_state_dir }}/venvs"
 courseware_models_dir: "{{ courseware_state_dir }}/models"
@@ -9,28 +11,39 @@ courseware_datasets_dir: "{{ courseware_state_dir }}/datasets"
 courseware_tools_dir: "{{ courseware_state_dir }}/tools"
 courseware_apps_dir: "{{ courseware_state_dir }}/apps"
 courseware_downloads_dir: "{{ courseware_state_dir }}/downloads"
+courseware_lab1_dir: "{{ courseware_state_dir }}/lab1"
 courseware_lab2_dir: "{{ courseware_state_dir }}/lab2"
 courseware_lab6_dir: "{{ courseware_state_dir }}/lab6"
-courseware_transformerlab_legacy_home: "{{ courseware_state_dir }}/transformerlab-home"
-courseware_safe_homes_dir: "{{ lookup('env', 'HOME') }}/.local/share/local-lab-deployment"
-courseware_transformerlab_home: "{{ (courseware_safe_homes_dir ~ '/transformerlab-home') if ' ' in courseware_root else courseware_transformerlab_legacy_home }}"
 courseware_unsloth_home: "{{ courseware_state_dir }}/unsloth-home"
+courseware_lab1_models_dir: "{{ courseware_models_dir }}/lab1"
 courseware_ollama_models_dir: "{{ courseware_models_dir }}/ollama"
 courseware_node_runtime_dir: "{{ courseware_tools_dir }}/node-runtime"
 courseware_node_runtime_bin_dir: "{{ courseware_node_runtime_dir }}/node_modules/node/bin"
+courseware_uv_dir: "{{ courseware_tools_dir }}/uv"
+courseware_uv_bin: "{{ courseware_uv_dir }}/bin/uv"
+courseware_uv_cache_dir: "{{ courseware_cache_dir }}/uv"
+courseware_python_runtime_dir: "{{ courseware_tools_dir }}/python"
+courseware_netron_venv_dir: "{{ courseware_venvs_dir }}/netron"
 courseware_wetty_dir: "{{ courseware_tools_dir }}/wetty"
 courseware_promptfoo_dir: "{{ courseware_lab6_dir }}"
 courseware_wiki_repo_dir: "{{ courseware_repos_dir }}/LLM-Labs"
 courseware_wiki_runtime_config_path: "{{ courseware_wiki_repo_dir }}/public/courseware-runtime.json"
 courseware_llama_cpp_bin_dir: "{{ courseware_repos_dir }}/llama.cpp/build/bin"
-courseware_lab3_dir: "/home/student/lab3"
 courseware_bind_host: "0.0.0.0"
-courseware_url_host: "127.0.0.1"
+courseware_url_host: >-
+  {{
+    (lookup('env', 'COURSEWARE_URL_HOST') | trim)
+    if (lookup('env', 'COURSEWARE_URL_HOST') | trim | length) > 0
+    else (
+      ansible_default_ipv4.address
+      | default(ansible_all_ipv4_addresses | default(['127.0.0.1']) | first)
+    )
+  }}
 courseware_ports:
   ollama: 11434
   open_webui: 8080
-  transformerlab: 8338
+  netron: 8338
   chunkviz: 3001
   embedding_atlas: 5055
   unsloth: 8888
@@ -38,43 +51,25 @@ courseware_ports:
   wiki: 80
   wetty: 7681
-courseware_transformerlab_install_mode: "single-user-pinned"
-courseware_transformerlab_version: "v0.28.2"
-courseware_transformerlab_version_dir: "{{ courseware_transformerlab_version | regex_replace('^v', '') }}"
-courseware_transformerlab_source_archive: "{{ courseware_downloads_dir }}/transformerlab-app-{{ courseware_transformerlab_version_dir }}.tar.gz"
-courseware_transformerlab_web_archive: "{{ courseware_downloads_dir }}/transformerlab-web-{{ courseware_transformerlab_version_dir }}.tar.gz"
-courseware_transformerlab_miniforge_installer: "{{ courseware_downloads_dir }}/transformerlab-miniforge-installer.sh"
-courseware_transformerlab_default_user_email: "student@zuccaro.me"
-courseware_transformerlab_default_user_password: "student"
-courseware_transformerlab_default_user_first_name: "Student"
-courseware_transformerlab_default_user_last_name: ""
-courseware_transformerlab_required_loader_plugins:
-  - "fastchat_server"
-courseware_transformerlab_required_supports_fastchat:
-  - "chat"
-  - "completion"
-  - "visualize_model"
-  - "model_layers"
-  - "rag"
-  - "tools"
-  - "template"
-  - "embeddings"
-  - "tokenize"
-  - "logprobs"
-  - "batched"
+courseware_netron_version: "9.0.1"
+courseware_ollama_min_version: "0.12.11"
 courseware_llama_cpp_commit: "51fa458a92d6a3f305f8fd76fc8f702e3e87ddb5"
 courseware_chunkviz_commit: "a891eacafda1f28a12373ad3b00102e68f07c57f"
 courseware_promptfoo_version: "0.119.0"
 courseware_kiln_release_tag: "v0.18.1"
 courseware_node_runtime_version: "20.20.2"
+courseware_python_runtime_version: "3.12"
+courseware_uv_spec: "uv"
 courseware_wetty_spec: "wetty@2.5.0"
 courseware_wetty_base_path: "/wetty"
 courseware_wiki_repo: "https://git.zuccaro.me/bzuccaro/LLM-Labs.git"
-courseware_student_username: "student"
-courseware_student_password_hash: "{{ lookup('env', 'COURSEWARE_STUDENT_PASSWORD_HASH') | default('', true) }}"
 courseware_open_webui_spec: "open-webui"
 courseware_embedding_atlas_spec: "embedding-atlas"
+courseware_lab1_ollama_model_alias: "batiai/gemma4-e2b:q4"
+courseware_lab1_llama_filename: "Llama-3.2-1B.Q4_K_M.gguf"
+courseware_lab1_llama_download_url: "https://huggingface.co/DevQuasar-3/meta-llama.Llama-3.2-1B-GGUF/resolve/main/Llama-3.2-1B.Q4_K_M.gguf?download=true"
+courseware_lab1_llama_local_path: "{{ courseware_lab1_models_dir }}/{{ courseware_lab1_llama_filename }}"
 courseware_white_rabbit_repo: "bartowski/WhiteRabbitNeo_WhiteRabbitNeo-V3-7B-GGUF"
 courseware_white_rabbit_variants:
@@ -91,12 +86,15 @@ courseware_white_rabbit_variants:
- ollama_model: "WhiteRabbitNeo-IQ2" - ollama_model: "WhiteRabbitNeo-IQ2"
quant: "IQ2_M" quant: "IQ2_M"
filename: "WhiteRabbitNeo_WhiteRabbitNeo-V3-7B-IQ2_M.gguf" filename: "WhiteRabbitNeo_WhiteRabbitNeo-V3-7B-IQ2_M.gguf"
courseware_ollama_models: courseware_lab2_ollama_models:
- "llama3.2" - label: "Gemma 4 E2B IT Q8"
- "qwen3.5:4b" value: "gemma4:e2b-it-q8_0"
- "gemma3n:e2b" - label: "Gemma 4 E2B Q4"
courseware_optional_ollama_models: value: "batiai/gemma4-e2b:q4"
- "gemma3:12b-it-qat" - label: "Gemma 4 E2B Q6"
value: "batiai/gemma4-e2b:q6"
courseware_ollama_models: "{{ courseware_lab2_ollama_models | map(attribute='value') | list }}"
courseware_optional_ollama_models: []
courseware_install_optional_heavy_models: false courseware_install_optional_heavy_models: false
courseware_wsl_cuda_pin_url: "https://developer.download.nvidia.com/compute/cuda/repos/wsl-ubuntu/x86_64/cuda-wsl-ubuntu.pin" courseware_wsl_cuda_pin_url: "https://developer.download.nvidia.com/compute/cuda/repos/wsl-ubuntu/x86_64/cuda-wsl-ubuntu.pin"
@@ -153,7 +151,7 @@ courseware_ollama_install_marker: "{{ courseware_markers_dir }}/ollama-installed
 courseware_services:
   - "ollama"
   - "open-webui"
-  - "transformerlab"
+  - "netron"
   - "chunkviz"
   - "embedding-atlas"
   - "unsloth"
-65
@@ -12,17 +12,6 @@
   changed_when: false
   failed_when: false
-- name: Stat managed TransformerLab symlink
-  stat:
-    path: "{{ ansible_env.HOME }}/.transformerlab"
-    follow: false
-  register: courseware_down_transformerlab
-- name: Stat managed TransformerLab marker
-  stat:
-    path: "{{ ansible_env.HOME }}/.transformerlab/.courseware-managed"
-  register: courseware_down_transformerlab_marker
 - name: Stat managed Unsloth symlink
   stat:
     path: "{{ ansible_env.HOME }}/.unsloth"
@@ -34,11 +23,6 @@
path: "{{ ansible_env.HOME }}/.unsloth/.courseware-managed" path: "{{ ansible_env.HOME }}/.unsloth/.courseware-managed"
register: courseware_down_unsloth_marker register: courseware_down_unsloth_marker
- name: Stat conda environments file
stat:
path: "{{ ansible_env.HOME }}/.conda/environments.txt"
register: courseware_down_conda_envs
- name: Stat courseware-managed Ollama install marker - name: Stat courseware-managed Ollama install marker
stat: stat:
path: "{{ courseware_ollama_install_marker }}" path: "{{ courseware_ollama_install_marker }}"
@@ -156,55 +140,6 @@
     - courseware_down_ollama_marker.stat.exists
   failed_when: false
-- name: Stop courseware-managed Ollama macOS app if running
-  command: pkill -x Ollama
-  when:
-    - ansible_system == "Darwin"
-    - courseware_down_ollama_marker.stat.exists
-  changed_when: false
-  failed_when: false
-- name: Uninstall courseware-managed Ollama Homebrew formula
-  command: brew uninstall ollama
-  when:
-    - ansible_system == "Darwin"
-    - courseware_down_ollama_marker.stat.exists
-  changed_when: false
-  failed_when: false
-- name: Remove managed TransformerLab conda environment entry
-  lineinfile:
-    path: "{{ ansible_env.HOME }}/.conda/environments.txt"
-    regexp: "^{{ (courseware_transformerlab_home ~ '/envs/transformerlab') | regex_escape() }}$"
-    state: absent
-  when: courseware_down_conda_envs.stat.exists
-  failed_when: false
-- name: Remove managed TransformerLab path
-  file:
-    path: "{{ ansible_env.HOME }}/.transformerlab"
-    state: absent
-  when:
-    - courseware_down_transformerlab.stat.exists
-    - >
-      (courseware_down_transformerlab.stat.islnk and
-       courseware_down_transformerlab.stat.lnk_source == courseware_transformerlab_home)
-      or courseware_down_transformerlab_marker.stat.exists
-  failed_when: false
-- name: Remove managed TransformerLab home
-  file:
-    path: "{{ courseware_transformerlab_home }}"
-    state: absent
-  failed_when: false
-- name: Remove legacy managed TransformerLab home
-  file:
-    path: "{{ courseware_transformerlab_legacy_home }}"
-    state: absent
-  when: courseware_transformerlab_legacy_home != courseware_transformerlab_home
-  failed_when: false
 - name: Remove managed Unsloth path
   file:
     path: "{{ ansible_env.HOME }}/.unsloth"
+5 -2
@@ -6,14 +6,17 @@
 - { role: preflight, tags: ["preflight"] }
 - directories
 - packages
+- python_runtime
+- netron
+- lab1_assets
 - lab_assets
 - node_runtime
 - { role: terminal, when: ansible_system == "Linux" }
 - llama_cpp
-- transformerlab
 - open_webui
 - chunkviz
 - promptfoo
-- wiki
+- { role: ollama_models, tags: ["ollama_models"] }
+- { role: wiki, tags: ["wiki"] }
 - kiln
 - unsloth
-7
@@ -12,10 +12,3 @@ common_packages_debian:
   - ninja-build
   - libssl-dev
   - pkg-config
-common_packages_macos:
-  - python3
-  - git
-  - curl
-  - cmake
-  - ninja
-11
@@ -20,17 +20,6 @@
   when: ansible_os_family == "Debian"
   become: yes
-- name: Ensure Homebrew is installed (macOS)
-  ansible.builtin.homebrew:
-    name:
-      - python3
-      - git
-      - curl
-      - cmake
-      - ninja
-    state: present
-  when: ansible_os_family == "Darwin"
 - name: Install Python virtual environment module (user space)
   ansible.builtin.pip:
     name: virtualenv
+5 -34
@@ -8,6 +8,9 @@
- "{{ courseware_markers_dir }}" - "{{ courseware_markers_dir }}"
- "{{ courseware_logs_dir }}" - "{{ courseware_logs_dir }}"
- "{{ courseware_run_dir }}" - "{{ courseware_run_dir }}"
- "{{ courseware_cache_dir }}"
- "{{ courseware_tmp_dir }}"
- "{{ courseware_uv_cache_dir }}"
- "{{ courseware_repos_dir }}" - "{{ courseware_repos_dir }}"
- "{{ courseware_venvs_dir }}" - "{{ courseware_venvs_dir }}"
- "{{ courseware_models_dir }}" - "{{ courseware_models_dir }}"
@@ -15,10 +18,10 @@
- "{{ courseware_tools_dir }}" - "{{ courseware_tools_dir }}"
- "{{ courseware_apps_dir }}" - "{{ courseware_apps_dir }}"
- "{{ courseware_downloads_dir }}" - "{{ courseware_downloads_dir }}"
- "{{ courseware_lab1_dir }}"
- "{{ courseware_lab2_dir }}" - "{{ courseware_lab2_dir }}"
- "{{ courseware_safe_homes_dir }}"
- "{{ courseware_transformerlab_home }}"
- "{{ courseware_unsloth_home }}" - "{{ courseware_unsloth_home }}"
- "{{ courseware_lab1_models_dir }}"
- "{{ courseware_ollama_models_dir }}" - "{{ courseware_ollama_models_dir }}"
- name: Seed managed ownership markers - name: Seed managed ownership markers
@@ -27,40 +30,8 @@
     state: touch
     mode: "0644"
   loop:
-    - "{{ courseware_transformerlab_home }}/.courseware-managed"
     - "{{ courseware_unsloth_home }}/.courseware-managed"
-- name: Check existing TransformerLab path
-  stat:
-    path: "{{ ansible_env.HOME }}/.transformerlab"
-    follow: false
-  register: courseware_transformerlab_link
-- name: Check existing TransformerLab ownership marker
-  stat:
-    path: "{{ ansible_env.HOME }}/.transformerlab/.courseware-managed"
-  register: courseware_transformerlab_marker
-- name: Fail if TransformerLab path is already occupied
-  fail:
-    msg: "{{ ansible_env.HOME }}/.transformerlab already exists and is not managed by this project."
-  when:
-    - courseware_transformerlab_link.stat.exists
-    - >
-      (
-        (not courseware_transformerlab_link.stat.islnk) or
-        (courseware_transformerlab_link.stat.islnk and
-         courseware_transformerlab_link.stat.lnk_source != courseware_transformerlab_home)
-      ) and
-      (not courseware_transformerlab_marker.stat.exists)
-- name: Link TransformerLab home into project state
-  file:
-    src: "{{ courseware_transformerlab_home }}"
-    dest: "{{ ansible_env.HOME }}/.transformerlab"
-    state: link
-    force: true
 - name: Check existing Unsloth path
   stat:
     path: "{{ ansible_env.HOME }}/.unsloth"
-19
@@ -1,19 +0,0 @@
- name: Download Kiln macOS disk image
get_url:
url: "https://github.com/Kiln-AI/Kiln/releases/download/{{ courseware_kiln_release_tag }}/Kiln.MacOS.AppleSilicon.M-Processor.dmg"
dest: "{{ courseware_downloads_dir }}/Kiln.MacOS.AppleSilicon.M-Processor.dmg"
mode: "0644"
- name: Install Kiln.app into project state
shell: |
set -euo pipefail
mount_point=$(mktemp -d /tmp/kiln.XXXXXX)
hdiutil attach "{{ courseware_downloads_dir }}/Kiln.MacOS.AppleSilicon.M-Processor.dmg" -mountpoint "$mount_point" -nobrowse -quiet
app_path=$(find "$mount_point" -maxdepth 1 -name '*.app' | head -n 1)
rm -rf "{{ courseware_apps_dir }}/Kiln.app"
cp -R "$app_path" "{{ courseware_apps_dir }}/Kiln.app"
hdiutil detach "$mount_point" -quiet
rmdir "$mount_point"
args:
executable: /bin/bash
creates: "{{ courseware_apps_dir }}/Kiln.app"
-5
@@ -1,8 +1,3 @@
 - name: Install Kiln on Linux
   include_tasks: linux.yml
   when: ansible_system == "Linux"
-- name: Install Kiln on macOS
-  include_tasks: macos.yml
-  when: ansible_system == "Darwin"
+38
@@ -0,0 +1,38 @@
- name: Ensure Lab 1 model directory exists
  file:
    path: "{{ courseware_lab1_models_dir }}"
    state: directory
    mode: "0755"
- name: Check installed Ollama version
  command:
    argv:
      - "{{ courseware_ollama_bin }}"
      - --version
  register: courseware_lab1_ollama_version
  changed_when: false
- name: Extract installed Ollama semantic version
  set_fact:
    courseware_lab1_ollama_semver: >-
      {{
        courseware_lab1_ollama_version.stdout
        | regex_search('[0-9]+\.[0-9]+\.[0-9]+')
        | default('')
      }}
- name: Fail when Ollama is too old for Lab 1 logprobs
  fail:
    msg: >-
      Lab 1 requires Ollama {{ courseware_ollama_min_version }} or newer because
      the confidence visualizer depends on logprob support. Installed version:
      {{ courseware_lab1_ollama_version.stdout | trim }}.
  when:
    - courseware_lab1_ollama_semver | length == 0
      or not (courseware_lab1_ollama_semver is version(courseware_ollama_min_version, '>='))
- name: Download mirrored Lab 1 Llama model
  get_url:
    url: "{{ courseware_lab1_llama_download_url }}"
    dest: "{{ courseware_lab1_llama_local_path }}"
    mode: "0644"
-26
@@ -23,18 +23,6 @@
gpu_type: "{{ 'nvidia' if nvidia_smi_output.rc == 0 else 'none' }}" gpu_type: "{{ 'nvidia' if nvidia_smi_output.rc == 0 else 'none' }}"
when: is_wsl | default(false) or ansible_os_family == "Debian" when: is_wsl | default(false) or ansible_os_family == "Debian"
- name: Check for Metal GPU on macOS
ansible.builtin.command: system_profiler SPDisplaysDataType
register: metal_check
changed_when: false
failed_when: false
when: ansible_os_family == "Darwin"
- name: Set GPU type for macOS
ansible.builtin.set_fact:
gpu_type: "metal"
when: ansible_os_family == "Darwin" and metal_check.rc == 0
- name: Display detected GPU type - name: Display detected GPU type
ansible.builtin.debug: ansible.builtin.debug:
msg: "llama.cpp GPU type: {{ gpu_type | default('none') }}" msg: "llama.cpp GPU type: {{ gpu_type | default('none') }}"
@@ -58,7 +46,6 @@
     {{
       not llama_cpp_stat.stat.exists or
       (gpu_type == 'nvidia' and existing_gpu_check.stdout != 'cuda') or
-      (gpu_type == 'metal' and existing_gpu_check.stdout != 'metal') or
       (gpu_type == 'amd' and existing_gpu_check.stdout != 'amd')
     }}
@@ -120,19 +107,6 @@
   when: gpu_type == 'amd' and cmake_configured.rc != 0
   become: no
-- name: Configure llama.cpp for Metal (macOS)
-  ansible.builtin.command:
-    argv:
-      - cmake
-      - ..
-      - -G Ninja
-      - -DCMAKE_BUILD_TYPE=Release
-      - -DGGML_METAL=on
-  args:
-    chdir: "{{ llmlab_base }}/lab2/llama.cpp/build"
-  when: gpu_type == 'metal' and cmake_configured.rc != 0
-  become: no
 - name: Configure llama.cpp for CPU only
   ansible.builtin.command:
     argv:
+1 -1
@@ -78,7 +78,7 @@
 - name: Set llama.cpp backend flag
   set_fact:
-    courseware_llama_backend_flag: "{{ '-DGGML_METAL=ON' if ansible_system == 'Darwin' else '-DGGML_CUDA=ON' }}"
+    courseware_llama_backend_flag: "-DGGML_CUDA=ON"
 - name: Set llama.cpp build parallelism
   set_fact:
+30
@@ -0,0 +1,30 @@
- name: Create Netron virtual environment
  command:
    argv:
      - "{{ courseware_python_bin }}"
      - -m
      - venv
      - "{{ courseware_netron_venv_dir }}"
  args:
    creates: "{{ courseware_netron_venv_dir }}/bin/python"
- name: Upgrade Netron venv tooling
  command:
    argv:
      - "{{ courseware_netron_venv_dir }}/bin/python"
      - -m
      - pip
      - install
      - --upgrade
      - pip
      - setuptools
      - wheel
- name: Install Netron server runtime
  command:
    argv:
      - "{{ courseware_netron_venv_dir }}/bin/python"
      - -m
      - pip
      - install
      - "netron=={{ courseware_netron_version }}"
-8
@@ -16,14 +16,6 @@
   become: yes
   notify: Start Ollama service
-- name: Install Ollama (macOS via Homebrew)
-  ansible.builtin.homebrew:
-    name: ollama
-    state: present
-  when:
-    - ansible_os_family == "Darwin"
-    - ollama_version_check.rc != 0
 - name: Check if Ollama service exists
   ansible.builtin.command: systemctl list-unit-files ollama.service
   register: ollama_service_check
+38 -2
@@ -4,10 +4,28 @@
     state: directory
     mode: "0755"
+- name: Check Open WebUI virtual environment Python version
+  command:
+    argv:
+      - "{{ courseware_venvs_dir }}/open-webui/bin/python"
+      - -c
+      - "import sys; print(f'{sys.version_info.major}.{sys.version_info.minor}')"
+  register: courseware_open_webui_venv_python_version
+  changed_when: false
+  failed_when: false
+- name: Remove Open WebUI virtual environment with incompatible Python
+  file:
+    path: "{{ courseware_venvs_dir }}/open-webui"
+    state: absent
+  when:
+    - courseware_open_webui_venv_python_version.rc == 0
+    - courseware_open_webui_venv_python_version.stdout != courseware_python_runtime_version
 - name: Create Open WebUI virtual environment
   command:
     argv:
-      - "{{ courseware_transformerlab_home }}/envs/transformerlab/bin/python"
+      - "{{ courseware_python_bin }}"
       - -m
       - venv
       - "{{ courseware_venvs_dir }}/open-webui"
@@ -36,10 +54,28 @@
- "{{ courseware_open_webui_spec }}" - "{{ courseware_open_webui_spec }}"
- "numpy<2" - "numpy<2"
- name: Check Embedding Atlas virtual environment Python version
command:
argv:
- "{{ courseware_venvs_dir }}/embedding-atlas/bin/python"
- -c
- "import sys; print(f'{sys.version_info.major}.{sys.version_info.minor}')"
register: courseware_embedding_atlas_venv_python_version
changed_when: false
failed_when: false
- name: Remove Embedding Atlas virtual environment with incompatible Python
file:
path: "{{ courseware_venvs_dir }}/embedding-atlas"
state: absent
when:
- courseware_embedding_atlas_venv_python_version.rc == 0
- courseware_embedding_atlas_venv_python_version.stdout != courseware_python_runtime_version
- name: Create Embedding Atlas virtual environment - name: Create Embedding Atlas virtual environment
command: command:
argv: argv:
- "{{ courseware_transformerlab_home }}/envs/transformerlab/bin/python" - "{{ courseware_python_bin }}"
- -m - -m
- venv - venv
- "{{ courseware_venvs_dir }}/embedding-atlas" - "{{ courseware_venvs_dir }}/embedding-atlas"
+1
@@ -14,6 +14,7 @@
   - pkg-config
   - python3
   - python3-pip
+  - python3-setuptools
   - python3-venv
   - unzip
   - zstd
-29
@@ -1,29 +0,0 @@
- name: Check installed Homebrew formulas
command: "brew list --versions {{ item }}"
loop:
- git
- git-lfs
- cmake
- node
- python@3.11
- ollama
register: courseware_brew_checks
changed_when: false
failed_when: false
- name: Install missing Homebrew formulas
command: "brew install {{ item.item }}"
loop: "{{ courseware_brew_checks.results }}"
when: item.rc != 0
- name: Mark Ollama as installed by courseware on macOS
file:
path: "{{ courseware_ollama_install_marker }}"
state: touch
mode: "0644"
when:
- courseware_brew_checks.results
| selectattr('item', 'equalto', 'ollama')
| selectattr('rc', 'ne', 0)
| list
| length > 0
-5
@@ -1,8 +1,3 @@
-- name: Install macOS prerequisites
-  include_tasks: macos.yml
-  when: ansible_system == "Darwin"
 - name: Install Linux prerequisites
   include_tasks: linux.yml
   when: ansible_system == "Linux"
+4 -47
@@ -1,56 +1,14 @@
 - name: Classify supported host profile
   set_fact:
-    courseware_is_macos: "{{ ansible_system == 'Darwin' }}"
     courseware_is_linux: "{{ ansible_system == 'Linux' }}"
     courseware_is_wsl: "{{ 'microsoft' in ansible_kernel | lower or 'wsl' in ansible_kernel | lower }}"
     courseware_is_native_linux: "{{ ansible_system == 'Linux' and not ('microsoft' in ansible_kernel | lower or 'wsl' in ansible_kernel | lower) }}"
-    courseware_host_profile: "{{ 'macos' if ansible_system == 'Darwin' else ('wsl' if ('microsoft' in ansible_kernel | lower or 'wsl' in ansible_kernel | lower) else ('native-debian-ubuntu' if ansible_system == 'Linux' and ansible_os_family == 'Debian' else 'unsupported')) }}"
+    courseware_host_profile: "{{ 'wsl' if ansible_system == 'Linux' and ('microsoft' in ansible_kernel | lower or 'wsl' in ansible_kernel | lower) else ('native-debian-ubuntu' if ansible_system == 'Linux' and ansible_os_family == 'Debian' else 'unsupported') }}"
 - name: Fail on unsupported operating systems
   fail:
-    msg: "Supported platforms are Apple Silicon macOS and Debian-family Linux/WSL."
+    msg: "Supported platforms are Debian-family Linux and WSL."
-  when: ansible_system not in ["Darwin", "Linux"]
+  when: courseware_host_profile == "unsupported"
-- name: Fail on unsupported macOS architecture
-  fail:
-    msg: "This installer supports Apple Silicon Macs only."
-  when:
-    - ansible_system == "Darwin"
-    - ansible_architecture not in ["arm64", "aarch64"]
-- name: Fail on undersized macOS systems
-  fail:
-    msg: "This courseware assumes a modern Apple Silicon Mac with at least 16 GB of unified memory."
-  when:
-    - ansible_system == "Darwin"
-    - (ansible_memtotal_mb | int) < 16000
-- name: Check for Xcode command line tools
-  command: xcode-select -p
-  register: courseware_xcode_select
-  changed_when: false
-  when: ansible_system == "Darwin"
-- name: Check for Homebrew
-  command: which brew
-  register: courseware_brew_check
-  changed_when: false
-  failed_when: false
-  when: ansible_system == "Darwin"
-- name: Fail when Xcode command line tools are missing
-  fail:
-    msg: "Install Xcode Command Line Tools first with 'xcode-select --install'."
-  when:
-    - ansible_system == "Darwin"
-    - courseware_xcode_select.rc != 0
-- name: Fail when Homebrew is missing
-  fail:
-    msg: "Install Homebrew first from https://brew.sh/."
-  when:
-    - ansible_system == "Darwin"
-    - courseware_brew_check.rc != 0
 - name: Fail on unsupported Linux family
   fail:
@@ -330,6 +288,5 @@
 - name: Set runtime binary defaults
   set_fact:
-    courseware_python_bin: >-
-      {{ '/opt/homebrew/opt/python@3.11/bin/python3.11' if ansible_system == 'Darwin' else '/usr/bin/python3' }}
+    courseware_python_bin: "/usr/bin/python3"
     courseware_ollama_bin: "ollama"
@@ -0,0 +1,74 @@
- name: Create contained Python runtime manager virtual environment
  command:
    argv:
      - /usr/bin/python3
      - -m
      - venv
      - "{{ courseware_uv_dir }}"
  args:
    creates: "{{ courseware_uv_dir }}/bin/python"
- name: Upgrade contained Python runtime manager tooling
  command:
    argv:
      - "{{ courseware_uv_dir }}/bin/python"
      - -m
      - pip
      - install
      - --upgrade
      - pip
      - setuptools
      - wheel
- name: Install contained Python runtime manager
  command:
    argv:
      - "{{ courseware_uv_dir }}/bin/python"
      - -m
      - pip
      - install
      - "{{ courseware_uv_spec }}"
- name: Install managed CPython runtime
  command:
    argv:
      - "{{ courseware_uv_bin }}"
      - python
      - install
      - "{{ courseware_python_runtime_version }}"
      - --install-dir
      - "{{ courseware_python_runtime_dir }}"
  environment:
    UV_PYTHON_INSTALL_DIR: "{{ courseware_python_runtime_dir }}"
    UV_CACHE_DIR: "{{ courseware_uv_cache_dir }}"
    XDG_CACHE_HOME: "{{ courseware_cache_dir }}"
    TMPDIR: "{{ courseware_tmp_dir }}"
  register: courseware_python_runtime_install
  changed_when: "'Installed Python' in courseware_python_runtime_install.stdout"
- name: Resolve managed CPython runtime
  command:
    argv:
      - "{{ courseware_uv_bin }}"
      - python
      - find
      - "{{ courseware_python_runtime_version }}"
  environment:
    UV_PYTHON_INSTALL_DIR: "{{ courseware_python_runtime_dir }}"
    UV_CACHE_DIR: "{{ courseware_uv_cache_dir }}"
    XDG_CACHE_HOME: "{{ courseware_cache_dir }}"
    TMPDIR: "{{ courseware_tmp_dir }}"
  register: courseware_python_runtime_find
  changed_when: false
- name: Set managed Python runtime for courseware venvs
  set_fact:
    courseware_python_bin: "{{ courseware_python_runtime_find.stdout | trim }}"
- name: Verify managed Python runtime version
  command:
    argv:
      - "{{ courseware_python_bin }}"
      - -c
      - "import sys; expected=tuple(map(int, '{{ courseware_python_runtime_version }}'.split('.'))); raise SystemExit(0 if sys.version_info[:len(expected)] == expected else 1)"
  changed_when: false
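The same flow by hand, using uv's managed-Python commands with the directories defined in the vars file:

```bash
# Install and resolve a managed CPython 3.12 under project-local state.
export UV_PYTHON_INSTALL_DIR="state/tools/python"
export UV_CACHE_DIR="state/cache/uv"
state/tools/uv/bin/uv python install 3.12 --install-dir "$UV_PYTHON_INSTALL_DIR"
python_bin="$(state/tools/uv/bin/uv python find 3.12)"
"$python_bin" --version   # expect Python 3.12.x
```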
+30 -98
@@ -1,11 +1,3 @@
-- name: Fail when student password hash is not configured
-  fail:
-    msg: >-
-      Set COURSEWARE_STUDENT_PASSWORD_HASH in the environment before running ./labctl up.
-      Example:
-      export COURSEWARE_STUDENT_PASSWORD_HASH="$(openssl passwd -6 'student-password')"
-  when: courseware_student_password_hash | trim | length == 0
 - name: Install terminal prerequisites
   become: true
   apt:
@@ -29,6 +21,13 @@
mode: "0644" mode: "0644"
register: courseware_terminal_sshd_config register: courseware_terminal_sshd_config
- name: Ensure sshd runtime directory exists
become: true
file:
path: /run/sshd
state: directory
mode: "0755"
- name: Validate sshd configuration - name: Validate sshd configuration
become: true become: true
command: command:
@@ -39,13 +38,6 @@
       - /etc/ssh/sshd_config
   changed_when: false
-- name: Ensure sshd runtime directory exists
-  become: true
-  file:
-    path: /run/sshd
-    state: directory
-    mode: "0755"
 - name: Start and enable sshd with systemd when available
   become: true
   systemd:
@@ -54,6 +46,25 @@
     enabled: true
   when: ansible_service_mgr == "systemd"
+- name: Check systemd sshd listener policy
+  become: true
+  command: ss -ltn
+  register: courseware_terminal_systemd_ss_listeners
+  changed_when: false
+  when: ansible_service_mgr == "systemd"
+- name: Restart sshd with systemd when listener policy is not active
+  become: true
+  systemd:
+    name: ssh
+    state: restarted
+    enabled: true
+  when:
+    - ansible_service_mgr == "systemd"
+    - >-
+      '0.0.0.0:22' not in courseware_terminal_systemd_ss_listeners.stdout
+      or '[::]:22' in courseware_terminal_systemd_ss_listeners.stdout
 - name: Check for running sshd when systemd is unavailable
   become: true
   command: pgrep -x sshd
@@ -79,84 +90,6 @@
     - ansible_service_mgr != "systemd"
     - courseware_terminal_sshd_pid.rc != 0
-- name: Ensure managed terminal user exists
-  become: true
-  user:
-    name: "{{ courseware_student_username }}"
-    password: "{{ courseware_student_password_hash }}"
-    shell: /bin/bash
-    create_home: true
-    state: present
-- name: Ensure lab 3 workspace root exists
-  become: true
-  file:
-    path: "{{ courseware_lab3_dir }}"
-    state: directory
-    owner: "{{ courseware_student_username }}"
-    group: "{{ courseware_student_username }}"
-    mode: "0755"
-- name: Ensure lab 3 WhiteRabbitNeo workspace exists
-  become: true
-  file:
-    path: "{{ courseware_lab3_dir }}/WhiteRabbitNeo"
-    state: directory
-    owner: "{{ courseware_student_username }}"
-    group: "{{ courseware_student_username }}"
-    mode: "0755"
-- name: Write lab 3 workspace README
-  become: true
-  template:
-    src: lab3-workspace-readme.txt.j2
-    dest: "{{ courseware_lab3_dir }}/README.txt"
-    owner: "{{ courseware_student_username }}"
-    group: "{{ courseware_student_username }}"
-    mode: "0644"
-- name: Check for repo-local WhiteRabbitNeo base repo
-  stat:
-    path: "{{ courseware_root }}/assets/lab2/WhiteRabbitNeo-V3-7B"
-  register: courseware_lab3_base_repo_stat
-- name: Check for repo-local WhiteRabbitNeo GGUF directory
-  stat:
-    path: "{{ courseware_root }}/assets/lab2/WhiteRabbitNeo_WhiteRabbitNeo-V3-7B-GGUF"
-  register: courseware_lab3_gguf_repo_stat
-- name: Link WhiteRabbitNeo base repo into the student workspace when repo-local assets exist
-  become: true
-  file:
-    src: "{{ courseware_root }}/assets/lab2/WhiteRabbitNeo-V3-7B"
-    dest: "{{ courseware_lab3_dir }}/WhiteRabbitNeo/WhiteRabbitNeo-V3-7B"
-    state: link
-    force: true
-    owner: "{{ courseware_student_username }}"
-    group: "{{ courseware_student_username }}"
-  when: courseware_lab3_base_repo_stat.stat.exists
-- name: Link WhiteRabbitNeo GGUF directory into the student workspace when repo-local assets exist
-  become: true
-  file:
-    src: "{{ courseware_root }}/assets/lab2/WhiteRabbitNeo_WhiteRabbitNeo-V3-7B-GGUF"
-    dest: "{{ courseware_lab3_dir }}/WhiteRabbitNeo/WhiteRabbitNeo_WhiteRabbitNeo-V3-7B-GGUF"
-    state: link
-    force: true
-    owner: "{{ courseware_student_username }}"
-    group: "{{ courseware_student_username }}"
-  when: courseware_lab3_gguf_repo_stat.stat.exists
-- name: Link WhiteRabbitNeo download helper into the student workspace
-  become: true
-  file:
-    src: "{{ courseware_lab2_dir }}/download_whiterabbitneo-gguf.sh"
-    dest: "{{ courseware_lab3_dir }}/download_whiterabbitneo-gguf.sh"
-    state: link
-    force: true
-    owner: "{{ courseware_student_username }}"
-    group: "{{ courseware_student_username }}"
 - name: Create contained WeTTY directory
   file:
     path: "{{ courseware_wetty_dir }}"
@@ -175,19 +108,18 @@
   environment:
     PATH: "{{ courseware_node_runtime_bin_dir }}:{{ ansible_env.PATH }}"
-- name: Check loopback sshd listener
+- name: Check sshd listener
   become: true
   command: ss -ltn
   register: courseware_terminal_ss_listeners
   changed_when: false
-- name: Assert sshd is loopback-only
+- name: Assert sshd accepts LAN and loopback clients
   assert:
     that:
-      - "'127.0.0.1:22' in courseware_terminal_ss_listeners.stdout"
-      - "'0.0.0.0:22' not in courseware_terminal_ss_listeners.stdout"
+      - "'0.0.0.0:22' in courseware_terminal_ss_listeners.stdout"
       - "'[::]:22' not in courseware_terminal_ss_listeners.stdout"
-    fail_msg: "sshd must listen only on 127.0.0.1:22 for the browser terminal deployment."
+    fail_msg: "sshd must listen on 0.0.0.0:22 so VPN/LAN SSH clients and local WeTTY can connect."
 - name: Assert WeTTY binary exists
   stat:
@@ -1,14 +0,0 @@
This workspace is managed for the Lab 3 browser terminal.
You should log in as:
- username: {{ courseware_student_username }}
Start here:
- working directory: {{ courseware_lab3_dir }}
Helpful paths:
- WhiteRabbitNeo helper: {{ courseware_lab3_dir }}/download_whiterabbitneo-gguf.sh
- repo-local base repo symlink: {{ courseware_lab3_dir }}/WhiteRabbitNeo/WhiteRabbitNeo-V3-7B
- repo-local GGUF symlink: {{ courseware_lab3_dir }}/WhiteRabbitNeo/WhiteRabbitNeo_WhiteRabbitNeo-V3-7B-GGUF
Some symlinks only appear after the corresponding repo-local lab assets are present.
@@ -1,12 +1,11 @@
 # Managed by Local Courseware Deployment.
-ListenAddress 127.0.0.1
+ListenAddress 0.0.0.0
 AddressFamily inet
 PermitRootLogin no
 PasswordAuthentication yes
 KbdInteractiveAuthentication no
 ChallengeResponseAuthentication no
 UsePAM yes
-AllowUsers {{ courseware_student_username }}
 AllowTcpForwarding no
 X11Forwarding no
 PrintMotd no
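Validating the rendered config and the resulting listener policy by hand, as the role does:

```bash
# Test-parse the managed sshd_config, then confirm the IPv4 wildcard listener.
sudo sshd -t -f /etc/ssh/sshd_config && echo "sshd_config OK"
sudo ss -ltn | grep '0.0.0.0:22'    # expected; there should be no [::]:22 line
```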
+2 -2
@@ -145,7 +145,7 @@
 - name: Determine Miniforge platform suffix
   set_fact:
-    courseware_transformerlab_miniforge_platform: "{{ 'Linux-x86_64' if ansible_system == 'Linux' and ansible_architecture == 'x86_64' else 'Linux-aarch64' if ansible_system == 'Linux' and ansible_architecture in ['aarch64', 'arm64'] else 'MacOSX-arm64' if ansible_system == 'Darwin' and ansible_architecture == 'arm64' else 'MacOSX-x86_64' if ansible_system == 'Darwin' and ansible_architecture == 'x86_64' else 'unsupported' }}"
+    courseware_transformerlab_miniforge_platform: "{{ 'Linux-x86_64' if ansible_system == 'Linux' and ansible_architecture == 'x86_64' else 'Linux-aarch64' if ansible_system == 'Linux' and ansible_architecture in ['aarch64', 'arm64'] else 'unsupported' }}"
 - name: Fail for unsupported Miniforge platform
   fail:
@@ -210,7 +210,7 @@
elif [ "$accelerator" = "rocm" ]; then elif [ "$accelerator" = "rocm" ]; then
wheel_args+=(--index https://download.pytorch.org/whl/rocm6.4 --index-strategy unsafe-best-match) wheel_args+=(--index https://download.pytorch.org/whl/rocm6.4 --index-strategy unsafe-best-match)
extra="rocm" extra="rocm"
elif [ "{{ ansible_system }}" != "Darwin" ]; then else
wheel_args+=(--index https://download.pytorch.org/whl/cpu --index-strategy unsafe-best-match) wheel_args+=(--index https://download.pytorch.org/whl/cpu --index-strategy unsafe-best-match)
fi fi
+19
@@ -17,6 +17,10 @@
   args:
     executable: /bin/bash
     creates: "{{ courseware_unsloth_home }}/.install_complete"
+  environment:
+    UV_CACHE_DIR: "{{ courseware_uv_cache_dir }}"
+    XDG_CACHE_HOME: "{{ courseware_cache_dir }}"
+    TMPDIR: "{{ courseware_tmp_dir }}"
   rescue:
     - name: Capture Unsloth installer log tail
       shell: |
@@ -41,3 +45,18 @@
           Last log lines:
           {{ courseware_unsloth_install_log_tail.stdout | default('(no log output captured)') }}
+- name: Install x86_64-compatible NumPy for Unsloth Studio
+  command:
+    argv:
+      - "{{ ansible_env.HOME }}/.unsloth/studio/unsloth_studio/bin/python"
+      - -m
+      - pip
+      - install
+      - "numpy<2"
+  environment:
+    UV_CACHE_DIR: "{{ courseware_uv_cache_dir }}"
+    XDG_CACHE_HOME: "{{ courseware_cache_dir }}"
+    TMPDIR: "{{ courseware_tmp_dir }}"
+  register: courseware_unsloth_numpy_install
+  changed_when: "'Successfully installed' in courseware_unsloth_numpy_install.stdout"
+8 -13
@@ -1,5 +1,5 @@
 diff --git a/src/app/labs/[slug]/page.tsx b/src/app/labs/[slug]/page.tsx
-index f67308f..a6aac38 100644
+index eb949ae..bb3d51c 100644
 --- a/src/app/labs/[slug]/page.tsx
 +++ b/src/app/labs/[slug]/page.tsx
 @@ -462,6 +462,19 @@ function markdownToHtml(markdown: string) {
@@ -41,20 +41,15 @@ index f67308f..a6aac38 100644
   return (
   <main className="mx-auto w-full max-w-5xl px-6 py-10">
 diff --git a/src/components/labs/LabContent.tsx b/src/components/labs/LabContent.tsx
-index 7a7ce52..8778a23 100644
+index 6addccf..afdd12f 100644
 --- a/src/components/labs/LabContent.tsx
 +++ b/src/components/labs/LabContent.tsx
-@@ -277,7 +277,12 @@ export function LabContent({ className, html }: LabContentProps) {
+@@ -346,6 +346,7 @@ export function LabContent({ className, html }: LabContentProps) {
->
-<div className="lab-image-modal__surface" onClick={(event) => event.stopPropagation()}>
-{/* eslint-disable-next-line @next/next/no-img-element */}
-- <img className="lab-image-modal__image" src={zoomedImage.src} alt={zoomedImage.alt} />
-+ <img
-+   className="lab-image-modal__image"
-+   src={zoomedImage.src}
-+   alt={zoomedImage.alt}
+<img
+  className="lab-image-modal__image"
+  src={zoomedImage.src}
+  alt={zoomedImage.alt}
 + referrerPolicy="no-referrer"
-+ />
+/>
 </div>
 </div>
+) : null}
+12 -3
@@ -2,7 +2,9 @@
   git:
     repo: "{{ courseware_wiki_repo }}"
     dest: "{{ courseware_wiki_repo_dir }}"
-    update: false
+    update: "{{ courseware_wiki_force_update | default(false) | bool }}"
+    force: "{{ courseware_wiki_force_update | default(false) | bool }}"
+  register: courseware_wiki_repo_sync
 - name: Check whether wiki referrer policy patch is already applied
   command:
@@ -27,14 +29,21 @@
   args:
     chdir: "{{ courseware_wiki_repo_dir }}"
   when: courseware_wiki_referrer_policy_patch.rc != 0
+  register: courseware_wiki_referrer_policy_apply
+- name: Stat wiki Next dependency
+  stat:
+    path: "{{ courseware_wiki_repo_dir }}/node_modules/next/package.json"
+  register: courseware_wiki_next_dependency
 - name: Install wiki dependencies with contained Node runtime
   command: npm install
   args:
     chdir: "{{ courseware_wiki_repo_dir }}"
-    creates: "{{ courseware_wiki_repo_dir }}/node_modules/next/package.json"
   environment:
     PATH: "{{ courseware_node_runtime_bin_dir }}:{{ ansible_env.PATH }}"
+  when:
+    - not courseware_wiki_next_dependency.stat.exists or courseware_wiki_repo_sync.changed
 - name: Render wiki runtime config
   template:
@@ -54,4 +63,4 @@
environment: environment:
PATH: "{{ courseware_node_runtime_bin_dir }}:{{ ansible_env.PATH }}" PATH: "{{ courseware_node_runtime_bin_dir }}:{{ ansible_env.PATH }}"
when: when:
- not courseware_wiki_build.stat.exists or courseware_wiki_referrer_policy_patch.rc != 0 - not courseware_wiki_build.stat.exists or courseware_wiki_repo_sync.changed or courseware_wiki_referrer_policy_patch.rc != 0
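The patch tasks above follow a check-then-apply pattern so reruns stay idempotent: the patch is only applied when the probe reports it is missing. The same idea as a standalone shell sketch; the patch filename and the reverse-apply probe here are illustrative assumptions, not the role's exact command:

cd "$WIKI_DIR"
# If the patch reverses cleanly it is already present; otherwise apply it.
if git apply --reverse --check wiki-referrer-policy.patch 2>/dev/null; then
  echo "referrer policy patch already applied"
else
  git apply wiki-referrer-policy.patch
fi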
@@ -1,5 +1,13 @@
 {
-  "lab3TerminalUrl": "http://{{ courseware_url_host }}:{{ courseware_ports.wetty }}{{ courseware_wetty_base_path }}",
-  "lab3Username": "{{ courseware_student_username }}",
-  "lab3WorkingDirectory": "{{ courseware_lab3_dir }}"
+  "lab1NetronUrl": "http://{{ courseware_url_host }}:{{ courseware_ports.netron }}",
+  "lab2OllamaUrl": "http://{{ courseware_url_host }}:{{ courseware_ports.ollama }}",
+  "lab2OllamaModels": [
+{% for model in courseware_lab2_ollama_models %}
+    {
+      "label": "{{ model.label }}",
+      "value": "{{ model.value }}"
+    }{% if not loop.last %},{% endif %}
+{% endfor %}
+  ],
+  "lab3TerminalUrl": "http://{{ courseware_url_host }}:{{ courseware_ports.wetty }}{{ courseware_wetty_base_path }}"
 }
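Because the Jinja loop only emits a comma when `loop.last` is false, the rendered file stays valid JSON for any model count. A minimal smoke test once the wiki is serving, assuming the file is reachable at the site root (it lives under the Next `public/` directory per WIKI_RUNTIME_CONFIG_PATH below):

# Fetch the rendered runtime config and fail if it is not valid JSON.
curl -fsS "http://127.0.0.1:${COURSEWARE_WIKI_PORT:-80}/courseware-runtime.json" \
  | python3 -m json.tool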
+8 -4
@@ -30,14 +30,18 @@
     ansible.builtin.import_role:
       name: ollama
+  - name: Include Netron setup
+    ansible.builtin.import_role:
+      name: netron
+  - name: Include Lab 1 asset setup
+    ansible.builtin.import_role:
+      name: lab1_assets
   - name: Include llama.cpp setup
     ansible.builtin.import_role:
       name: llama-cpp
-  - name: Include Transformer Lab setup
-    ansible.builtin.import_role:
-      name: transformerlab
   - name: Include Unsloth Studio setup
     ansible.builtin.import_role:
       name: unsloth
+7 -10
@@ -5,16 +5,18 @@ COURSEWARE_BIND_HOST="{{ courseware_bind_host }}"
 COURSEWARE_URL_HOST="{{ courseware_url_host }}"
 COURSEWARE_OLLAMA_PORT="{{ courseware_ports.ollama }}"
 COURSEWARE_OPEN_WEBUI_PORT="{{ courseware_ports.open_webui }}"
-COURSEWARE_TRANSFORMERLAB_PORT="{{ courseware_ports.transformerlab }}"
+COURSEWARE_NETRON_PORT="{{ courseware_ports.netron }}"
 COURSEWARE_CHUNKVIZ_PORT="{{ courseware_ports.chunkviz }}"
 COURSEWARE_EMBEDDING_ATLAS_PORT="{{ courseware_ports.embedding_atlas }}"
 COURSEWARE_UNSLOTH_PORT="{{ courseware_ports.unsloth }}"
 COURSEWARE_PROMPTFOO_PORT="{{ courseware_ports.promptfoo }}"
 COURSEWARE_WIKI_PORT="{{ courseware_ports.wiki }}"
 COURSEWARE_WETTY_PORT="{{ courseware_ports.wetty }}"
+COURSEWARE_OLLAMA_MIN_VERSION="{{ courseware_ollama_min_version }}"
 OLLAMA_BIN="{{ courseware_ollama_bin }}"
 OLLAMA_MODELS_DIR="{{ courseware_ollama_models_dir }}"
 NODE_RUNTIME_BIN_DIR="{{ courseware_node_runtime_bin_dir }}"
+NETRON_VENV="{{ courseware_netron_venv_dir }}"
 WETTY_BIN="{{ courseware_wetty_dir }}/node_modules/.bin/wetty"
 COURSEWARE_WETTY_BASE_PATH="{{ courseware_wetty_base_path }}"
 OPEN_WEBUI_VENV="{{ courseware_venvs_dir }}/open-webui"
@@ -23,13 +25,9 @@ CHUNKVIZ_DIR="{{ courseware_repos_dir }}/ChunkViz"
 EMBEDDING_ATLAS_VENV="{{ courseware_venvs_dir }}/embedding-atlas"
 TTPS_DATASET_PATH="{{ courseware_datasets_dir }}/ttps_dataset.parquet"
 WIKI_TEST_RAW_PATH="{{ courseware_datasets_dir }}/wiki.test.raw"
-TRANSFORMERLAB_DIR="{{ courseware_transformerlab_home }}"
-TRANSFORMERLAB_DEFAULT_USER_EMAIL="{{ courseware_transformerlab_default_user_email }}"
-TRANSFORMERLAB_DEFAULT_USER_PASSWORD="{{ courseware_transformerlab_default_user_password }}"
-TRANSFORMERLAB_DEFAULT_USER_FIRST_NAME="{{ courseware_transformerlab_default_user_first_name }}"
-TRANSFORMERLAB_DEFAULT_USER_LAST_NAME="{{ courseware_transformerlab_default_user_last_name }}"
-COURSEWARE_STUDENT_USERNAME="{{ courseware_student_username }}"
-COURSEWARE_LAB3_DIR="{{ courseware_lab3_dir }}"
+COURSEWARE_OLLAMA_BASE_URL="http://{{ courseware_url_host }}:{{ courseware_ports.ollama }}"
+COURSEWARE_LAB1_LLAMA_MODEL_PATH="{{ courseware_lab1_llama_local_path }}"
+COURSEWARE_LAB1_OLLAMA_MODEL_ALIAS="{{ courseware_lab1_ollama_model_alias }}"
 UNSLOTH_BIN="{{ ansible_env.HOME }}/.local/bin/unsloth"
 PROMPTFOO_DIR="{{ courseware_promptfoo_dir }}"
 PROMPTFOO_BIN="{{ courseware_tools_dir }}/promptfoo/node_modules/.bin/promptfoo"
@@ -37,5 +35,4 @@ WIKI_DIR="{{ courseware_wiki_repo_dir }}"
 WIKI_RUNTIME_CONFIG_PATH="{{ courseware_wiki_runtime_config_path }}"
 LLAMA_CPP_BIN_DIR="{{ courseware_llama_cpp_bin_dir }}"
 KILN_LINUX_BIN="{{ courseware_apps_dir }}/kiln/Kiln"
-KILN_MAC_APP="{{ courseware_apps_dir }}/Kiln.app"
-KILN_LAUNCH_PATH="{% if ansible_system == 'Darwin' %}{{ courseware_apps_dir }}/Kiln.app{% else %}{{ courseware_apps_dir }}/kiln/Kiln{% endif %}"
+KILN_LAUNCH_PATH="{{ courseware_apps_dir }}/kiln/Kiln"
+55 -51
@@ -18,6 +18,8 @@ usage() {
 Usage:
   ./labctl up
   ./labctl down
+  ./labctl update_wiki
+  ./labctl ollama_models
   ./labctl preflight
   ./labctl versions
   ./labctl assets lab2 [--refresh]
@@ -30,7 +32,7 @@ Usage:
 EOF
 }
-transformerlab_version() {
+netron_version() {
   local version_file=$ROOT_DIR/ansible/group_vars/all.yml
   if [ ! -f "$version_file" ]; then
@@ -38,46 +40,35 @@ transformerlab_version() {
     return
   fi
-  sed -nE 's/^courseware_transformerlab_version:[[:space:]]*"([^"]+)".*/\1/p' "$version_file" | head -n 1
+  sed -nE 's/^courseware_netron_version:[[:space:]]*"([^"]+)".*/\1/p' "$version_file" | head -n 1
 }
+minimum_ollama_version() {
+  local version_file=$ROOT_DIR/ansible/group_vars/all.yml
+  if [ ! -f "$version_file" ]; then
+    printf '%s\n' "unknown"
+    return
+  fi
+  sed -nE 's/^courseware_ollama_min_version:[[:space:]]*"([^"]+)".*/\1/p' "$version_file" | head -n 1
+}
 print_versions() {
   cat <<EOF
 Pinned component versions:
-  TransformerLab: $(transformerlab_version) (single-user pinned install)
+  Netron: $(netron_version)
+  Minimum Ollama: $(minimum_ollama_version)
   Ansible Core: 2.18.6
 EOF
 }
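Both version helpers lean on the same single-line sed extraction. Run standalone from the repo root, it pulls a pinned version out of the group_vars file without needing a YAML parser:

# Print the pinned Netron version from ansible/group_vars/all.yml.
sed -nE 's/^courseware_netron_version:[[:space:]]*"([^"]+)".*/\1/p' \
  ansible/group_vars/all.yml | head -n 1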
-require_student_password_hash() {
-  local current_host_profile
-  current_host_profile=$(host_profile)
-  if [ "$current_host_profile" = "macos" ]; then
-    return
-  fi
-  if [ -n "${COURSEWARE_STUDENT_PASSWORD_HASH:-}" ]; then
-    return
-  fi
-  cat <<'EOF' >&2
-Missing COURSEWARE_STUDENT_PASSWORD_HASH.
-Set a password hash for the managed `student` login before running this command. Example:
-  export COURSEWARE_STUDENT_PASSWORD_HASH="$(openssl passwd -6 'student-password')"
-Then rerun:
-  ./labctl up
-EOF
-  exit 1
-}
 confirm_installation() {
   local response
-  local tlab_version
-  tlab_version=$(transformerlab_version)
+  local pinned_netron
+  local min_ollama
+  pinned_netron=$(netron_version)
+  min_ollama=$(minimum_ollama_version)
   if [ ! -t 0 ]; then
     cat <<EOF >&2
@@ -85,14 +76,16 @@ WARNING: THIS SCRIPT WILL CONFIGURE YOUR ENVIRONMENT WITH THE FOLLOWING SOFTWARE
 - Ollama
 - llama.cpp
-- TransformerLab (single-user pinned to ${tlab_version})
+- Netron (${pinned_netron})
 - Open WebUI
 - ChunkViz
 - Embedding Atlas
 - Promptfoo
 - Unsloth Studio
 - Kiln Desktop
-- Course-specific support assets for lab 2 and lab 4
+- Course-specific support assets for lab 1, lab 2, and lab 4
+- Pre-pulled Gemma 4 E2B Ollama models for Lab 1 and Lab 2
+- Lab 1 confidence support through Gemma 4 E2B Q4 (requires Ollama ${min_ollama}+)
 IT IS RECOMMENDED TO RUN THIS IN AN ISOLATED ENVIRONMENT (Dedicated WSL, VM, etc.)
@@ -108,14 +101,16 @@ WARNING: THIS SCRIPT WILL CONFIGURE YOUR ENVIRONMENT WITH THE FOLLOWING SOFTWARE
 - Ollama
 - llama.cpp
-- TransformerLab (single-user pinned to ${tlab_version})
+- Netron (${pinned_netron})
 - Open WebUI
 - ChunkViz
 - Embedding Atlas
 - Promptfoo
 - Unsloth Studio
 - Kiln Desktop
-- Course-specific support assets for lab 2 and lab 4
+- Course-specific support assets for lab 1, lab 2, and lab 4
+- Pre-pulled Gemma 4 E2B Ollama models for Lab 1 and Lab 2
+- Lab 1 confidence support through Gemma 4 E2B Q4 (requires Ollama ${min_ollama}+)
 IT IS RECOMMENDED TO RUN THIS IN AN ISOLATED ENVIRONMENT (Dedicated WSL, VM, etc.)
@@ -137,29 +132,16 @@ host_is_wsl() {
   [ "$(uname -s)" = "Linux" ] && uname -r | grep -qiE 'microsoft|wsl'
 }
-host_is_macos() {
-  [ "$(uname -s)" = "Darwin" ]
-}
 host_is_linux() {
   [ "$(uname -s)" = "Linux" ]
 }
-host_is_arm_mac() {
-  host_is_macos && [ "$(uname -m)" = "arm64" ]
-}
 host_profile() {
   if host_is_wsl; then
     printf '%s\n' "wsl"
     return
   fi
-  if host_is_macos; then
-    printf '%s\n' "macos"
-    return
-  fi
   if host_is_linux && host_is_debian_family; then
     printf '%s\n' "native-debian-ubuntu"
     return
@@ -304,7 +286,6 @@ Python 3 was not found.
 Install it first, then rerun this command:
 - Debian/Ubuntu/WSL: sudo apt update && sudo apt install -y python3 python3-venv
-- macOS: brew install python@3.11
 EOF
   exit 1
 }
@@ -406,7 +387,6 @@ Python 3 is installed, but its virtual environment support is still unavailable.
 Install the missing venv package for your platform, then rerun this command:
 - Debian/Ubuntu/WSL: sudo apt update && sudo apt install -y python3-venv python3-pip
-- macOS: brew reinstall python@3.11
 EOF
   exit 1
 fi
@@ -530,6 +510,15 @@ require_arg() {
   fi
 }
+require_managed_runtime() {
+  if [ ! -f "$ROOT_DIR/state/runtime.env" ]; then
+    cat <<'EOF' >&2
+Missing state/runtime.env. Run ./labctl up first so the managed environment exists before using this command.
+EOF
+    exit 1
+  fi
+}
 handle_assets_command() {
   local asset_group=${1:-}
   shift || true
@@ -545,6 +534,17 @@
   esac
 }
+refresh_ollama_models() {
+  require_managed_runtime
+  run_playbook up.yml --tags ollama_models
+}
+update_wiki() {
+  require_managed_runtime
+  run_playbook up.yml --tags wiki -e "courseware_wiki_force_update=true"
+  run_project_script "$ROOT_DIR/scripts/service_manager.sh" restart-wiki
+}
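Day-two usage of the two new entry points, once ./labctl up has rendered state/runtime.env:

./labctl ollama_models   # re-pull only the managed Ollama models
./labctl update_wiki     # force-sync the wiki repo, rebuild it, and restart the service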
 main() {
   local cmd=${1:-}
   shift || true
@@ -552,17 +552,21 @@
   case "$cmd" in
     up)
      confirm_installation
-      require_student_password_hash
       run_playbook up.yml
       run_project_script "$ROOT_DIR/scripts/service_manager.sh" start all
       ;;
+    ollama_models)
+      refresh_ollama_models
+      ;;
+    update_wiki)
+      update_wiki
+      ;;
     down)
       run_project_script "$ROOT_DIR/scripts/service_manager.sh" stop all || true
       run_playbook down.yml
       ;;
     preflight)
       confirm_installation
-      require_student_password_hash
       run_playbook up.yml --tags preflight
       ;;
     versions)
+22 -15
@@ -14,18 +14,25 @@ load_runtime_env() {
   : "${COURSEWARE_STATE_DIR:=$STATE_DIR}"
   : "${COURSEWARE_BIND_HOST:=127.0.0.1}"
-  : "${COURSEWARE_URL_HOST:=127.0.0.1}"
+  if [ -z "${COURSEWARE_URL_HOST:-}" ]; then
+    COURSEWARE_URL_HOST=$(ip route get 1.1.1.1 2>/dev/null | sed -nE 's/.* src ([0-9.]+).*/\1/p' | head -n 1)
+    : "${COURSEWARE_URL_HOST:=127.0.0.1}"
+  fi
+  : "${COURSEWARE_NETRON_PORT:=8338}"
   : "${COURSEWARE_PROMPTFOO_PORT:=15500}"
   : "${COURSEWARE_WIKI_PORT:=80}"
   : "${COURSEWARE_WETTY_PORT:=7681}"
+  : "${COURSEWARE_OLLAMA_MIN_VERSION:=0.12.11}"
   : "${COURSEWARE_WETTY_BASE_PATH:=/wetty}"
-  : "${COURSEWARE_STUDENT_USERNAME:=student}"
-  : "${COURSEWARE_LAB3_DIR:=/home/student/lab3}"
   : "${NODE_RUNTIME_BIN_DIR:=$COURSEWARE_STATE_DIR/tools/node-runtime/node_modules/node/bin}"
+  : "${NETRON_VENV:=$COURSEWARE_STATE_DIR/venvs/netron}"
   : "${WETTY_BIN:=$COURSEWARE_STATE_DIR/tools/wetty/node_modules/.bin/wetty}"
   : "${PROMPTFOO_DIR:=$COURSEWARE_STATE_DIR/lab6}"
   : "${WIKI_DIR:=$COURSEWARE_STATE_DIR/repos/LLM-Labs}"
   : "${WIKI_RUNTIME_CONFIG_PATH:=$WIKI_DIR/public/courseware-runtime.json}"
+  : "${COURSEWARE_OLLAMA_BASE_URL:=http://$COURSEWARE_URL_HOST:$COURSEWARE_OLLAMA_PORT}"
+  : "${COURSEWARE_LAB1_LLAMA_MODEL_PATH:=$COURSEWARE_STATE_DIR/models/lab1/Llama-3.2-1B.Q4_K_M.gguf}"
+  : "${COURSEWARE_LAB1_OLLAMA_MODEL_ALIAS:=batiai/gemma4-e2b:q4}"
   : "${LLAMA_CPP_BIN_DIR:=$COURSEWARE_STATE_DIR/repos/llama.cpp/build/bin}"
   if [ -n "${OLLAMA_BIN:-}" ] && [[ "$OLLAMA_BIN" != */* ]] && command -v "$OLLAMA_BIN" >/dev/null 2>&1; then
@@ -44,7 +51,7 @@ service_list() {
   printf '%s\n' \
     "ollama" \
     "open-webui" \
-    "transformerlab" \
+    "netron" \
     "chunkviz" \
     "embedding-atlas" \
     "unsloth" \
@@ -65,7 +72,7 @@ service_port() {
   case "$1" in
     ollama) printf '%s\n' "${COURSEWARE_OLLAMA_PORT}" ;;
     open-webui) printf '%s\n' "${COURSEWARE_OPEN_WEBUI_PORT}" ;;
-    transformerlab) printf '%s\n' "${COURSEWARE_TRANSFORMERLAB_PORT}" ;;
+    netron) printf '%s\n' "${COURSEWARE_NETRON_PORT}" ;;
     chunkviz) printf '%s\n' "${COURSEWARE_CHUNKVIZ_PORT}" ;;
     embedding-atlas) printf '%s\n' "${COURSEWARE_EMBEDDING_ATLAS_PORT}" ;;
     unsloth) printf '%s\n' "${COURSEWARE_UNSLOTH_PORT}" ;;
@@ -80,7 +87,7 @@ service_url() {
   case "$1" in
     ollama) printf 'http://%s:%s\n' "$COURSEWARE_URL_HOST" "$COURSEWARE_OLLAMA_PORT" ;;
     open-webui) printf 'http://%s:%s\n' "$COURSEWARE_URL_HOST" "$COURSEWARE_OPEN_WEBUI_PORT" ;;
-    transformerlab) printf 'http://%s:%s\n' "$COURSEWARE_URL_HOST" "$COURSEWARE_TRANSFORMERLAB_PORT" ;;
+    netron) printf 'http://%s:%s\n' "$COURSEWARE_URL_HOST" "$COURSEWARE_NETRON_PORT" ;;
     chunkviz) printf 'http://%s:%s\n' "$COURSEWARE_URL_HOST" "$COURSEWARE_CHUNKVIZ_PORT" ;;
     embedding-atlas) printf 'http://%s:%s\n' "$COURSEWARE_URL_HOST" "$COURSEWARE_EMBEDDING_ATLAS_PORT" ;;
     unsloth) printf 'http://%s:%s\n' "$COURSEWARE_URL_HOST" "$COURSEWARE_UNSLOTH_PORT" ;;
@@ -109,14 +116,11 @@ service_command() {
         "$COURSEWARE_BIND_HOST" \
         "$COURSEWARE_OPEN_WEBUI_PORT"
       ;;
-    transformerlab)
-      printf 'export PATH="%s/envs/transformerlab/bin:$PATH"; export VIRTUAL_ENV="%s/envs/transformerlab"; export CONDA_PREFIX="%s/envs/transformerlab"; cd "%s/src" && exec ./run.sh -c -h %s -p %s' \
-        "$TRANSFORMERLAB_DIR" \
-        "$TRANSFORMERLAB_DIR" \
-        "$TRANSFORMERLAB_DIR" \
-        "$TRANSFORMERLAB_DIR" \
+    netron)
+      printf 'exec "%s/bin/netron" --host %s --port %s --verbosity quiet' \
+        "$NETRON_VENV" \
         "$COURSEWARE_BIND_HOST" \
-        "$COURSEWARE_TRANSFORMERLAB_PORT"
+        "$COURSEWARE_NETRON_PORT"
       ;;
     chunkviz)
       printf 'cd "%s" && PATH="%s:$PATH" exec "./node_modules/.bin/serve" build -s -n -L -l tcp://%s:%s' \
@@ -126,7 +130,7 @@ service_command() {
         "$COURSEWARE_CHUNKVIZ_PORT"
       ;;
     embedding-atlas)
-      printf 'exec "%s/bin/embedding-atlas" "%s" --text "Scenario" --host %s --port %s' \
+      printf 'exec "%s/bin/embedding-atlas" "%s" --text "Scenario" --host %s --port %s --no-auto-port' \
         "$EMBEDDING_ATLAS_VENV" \
         "$TTPS_DATASET_PATH" \
         "$COURSEWARE_BIND_HOST" \
@@ -147,9 +151,12 @@ service_command() {
         "$COURSEWARE_PROMPTFOO_PORT"
       ;;
     wiki)
-      printf 'cd "%s" && PATH="%s:$PATH" exec "./node_modules/.bin/next" start --hostname %s --port %s' \
+      printf 'cd "%s" && PATH="%s:$PATH" exec env COURSEWARE_OLLAMA_BASE_URL="%s" COURSEWARE_LAB1_LLAMA_MODEL_PATH="%s" COURSEWARE_LAB1_OLLAMA_MODEL_ALIAS="%s" "./node_modules/.bin/next" start --hostname %s --port %s' \
         "$WIKI_DIR" \
         "$NODE_RUNTIME_BIN_DIR" \
+        "$COURSEWARE_OLLAMA_BASE_URL" \
+        "$COURSEWARE_LAB1_LLAMA_MODEL_PATH" \
+        "$COURSEWARE_LAB1_OLLAMA_MODEL_ALIAS" \
         "$COURSEWARE_BIND_HOST" \
         "$COURSEWARE_WIKI_PORT"
       ;;
+158 -112
@@ -9,31 +9,7 @@ load_runtime_env
 mkdir -p "$STATE_DIR/run" "$STATE_DIR/logs"
-ensure_transformerlab_default_user() {
-  local helper_python="${TRANSFORMERLAB_DIR}/envs/transformerlab/bin/python"
-  if [ ! -x "$helper_python" ]; then
-    return 0
-  fi
-  if [ -z "${TRANSFORMERLAB_DEFAULT_USER_EMAIL:-}" ] || [ -z "${TRANSFORMERLAB_DEFAULT_USER_PASSWORD:-}" ]; then
-    return 0
-  fi
-  "$helper_python" "$SCRIPT_DIR/ensure_transformerlab_user.py" \
-    --transformerlab-dir "$TRANSFORMERLAB_DIR" \
-    --email "$TRANSFORMERLAB_DEFAULT_USER_EMAIL" \
-    --password "$TRANSFORMERLAB_DEFAULT_USER_PASSWORD" \
-    --first-name "${TRANSFORMERLAB_DEFAULT_USER_FIRST_NAME:-Student}" \
-    --last-name "${TRANSFORMERLAB_DEFAULT_USER_LAST_NAME:-}" >>"$STATE_DIR/logs/transformerlab_default_user.log" 2>&1 || true
-}
 check_wetty_prereqs() {
-  if ! id "$COURSEWARE_STUDENT_USERNAME" >/dev/null 2>&1; then
-    echo "Missing terminal user '$COURSEWARE_STUDENT_USERNAME'. Re-run ./labctl up after setting COURSEWARE_STUDENT_PASSWORD_HASH." >&2
-    exit 1
-  fi
   if [ ! -x "$WETTY_BIN" ]; then
     echo "Missing WeTTY binary at $WETTY_BIN. Re-run ./labctl up." >&2
     exit 1
@@ -44,11 +20,6 @@ check_wetty_prereqs() {
     exit 1
   fi
-  if [ ! -d "$COURSEWARE_LAB3_DIR" ]; then
-    echo "Missing lab workspace at $COURSEWARE_LAB3_DIR. Re-run ./labctl up." >&2
-    exit 1
-  fi
   if ! python3 - <<'PY'
 import socket, sys
 sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
@@ -66,6 +37,40 @@ PY
   fi
 }
+ollama_version_gte_minimum() {
+  local version_output
+  local installed_version
+  if ! command -v "$OLLAMA_BIN" >/dev/null 2>&1; then
+    return 1
+  fi
+  version_output=$("$OLLAMA_BIN" --version 2>/dev/null || true)
+  installed_version=$(printf '%s' "$version_output" | grep -oE '[0-9]+\.[0-9]+\.[0-9]+' | head -n 1)
+  if [ -z "$installed_version" ]; then
+    return 1
+  fi
+  [ "$(printf '%s\n' "$COURSEWARE_OLLAMA_MIN_VERSION" "$installed_version" | sort -V | head -n 1)" = "$COURSEWARE_OLLAMA_MIN_VERSION" ]
+}
+assert_ollama_logprobs_support() {
+  if ollama_version_gte_minimum; then
+    return 0
+  fi
+  local version_output
+  version_output=$("$OLLAMA_BIN" --version 2>/dev/null || printf 'unknown')
+  cat <<EOF >&2
+Lab 1 requires Ollama ${COURSEWARE_OLLAMA_MIN_VERSION} or newer because the confidence visualizer depends on logprobs.
+Installed version: ${version_output}
+Re-run ./labctl up after upgrading Ollama.
+EOF
+  exit 1
+}
 resolve_targets() {
   if [ $# -eq 0 ]; then
     echo "No target specified." >&2
@@ -107,6 +112,19 @@ is_running() {
   has_live_pid "$service" || service_ready "$service"
 }
+service_startup_attempts() {
+  case "$1" in
+    embedding-atlas)
+      # First launch embeds the bundled dataset. On older GPU drivers this falls
+      # back to CPU and can take close to an hour.
+      printf '%s\n' 3600
+      ;;
+    *)
+      printf '%s\n' 60
+      ;;
+  esac
+}
 service_ready() {
   local service=$1
@@ -114,13 +132,10 @@ service_ready() {
     ollama)
       curl -fsS "$(service_url "$service")/api/tags" >/dev/null 2>&1
       ;;
-    transformerlab)
-      curl -fsS "$(service_url "$service")/healthz" >/dev/null 2>&1
-      ;;
     promptfoo)
       curl -fsS "$(service_url "$service")/health" >/dev/null 2>&1
       ;;
-    open-webui|chunkviz|embedding-atlas|unsloth|wiki|wetty)
+    open-webui|netron|chunkviz|embedding-atlas|unsloth|wiki|wetty)
       curl -fsS "$(service_url "$service")" >/dev/null 2>&1
       ;;
     *)
@@ -140,6 +155,22 @@ service_listener_pids() {
     | sort -u
 }
+service_port_has_listener() {
+  local service=$1
+  local port
+  port=$(service_port "$service") || return 1
+  ss -ltnH "( sport = :$port )" 2>/dev/null | grep -q .
+}
+service_listener_details() {
+  local service=$1
+  local port
+  port=$(service_port "$service") || return 0
+  ss -ltnp "( sport = :$port )" 2>/dev/null || true
+}
 kill_pid_tree() {
   local signal=$1
   local pid=$2
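ss -ltnH lists listening TCP sockets without a header row, so grep -q . is a clean any-output test. The same probe against one port; 8338 (Netron's default here) is just the example:

# Succeeds when anything is listening on the given port.
port=8338
if ss -ltnH "( sport = :$port )" 2>/dev/null | grep -q .; then
  echo "port $port has a listener"
fi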
@@ -168,80 +199,16 @@ terminate_service_processes() {
   done < <(service_listener_pids "$service")
 }
-start_one() {
+wait_for_service_ready() {
   local service=$1
-  local cmd
-  local log_file
-  local pid_file
+  local log_file=$2
+  local pid_file=$3
+  local startup_attempts=$4
+  local pid_grace_attempts=$5
   local attempt
-  local pid_grace_attempts=5
-  if has_live_pid "$service"; then
-    if [ "$service" = "transformerlab" ]; then
-      ensure_transformerlab_default_user
-    fi
-    echo "$service already running"
-    return 0
-  fi
-  if service_ready "$service"; then
-    if [ "$service" = "transformerlab" ]; then
-      ensure_transformerlab_default_user
-    fi
-    echo "$service already available"
-    return 0
-  fi
-  case "$service" in
-    open-webui)
-      start_one ollama
-      ;;
-    transformerlab)
-      if command -v python3 >/dev/null 2>&1; then
-        python3 "$SCRIPT_DIR/repair_transformerlab_plugin_supports.py" \
-          --transformerlab-dir "$TRANSFORMERLAB_DIR" \
-          --plugin "fastchat_server" \
-          --required-support "chat" \
-          --required-support "completion" \
-          --required-support "visualize_model" \
-          --required-support "model_layers" \
-          --required-support "rag" \
-          --required-support "tools" \
-          --required-support "template" \
-          --required-support "embeddings" \
-          --required-support "tokenize" \
-          --required-support "logprobs" \
-          --required-support "batched" >>"$STATE_DIR/logs/transformerlab_plugin_supports.log" 2>&1 || true
-      fi
-      ;;
-    wetty)
-      check_wetty_prereqs
-      ;;
-    *)
-      ;;
-  esac
-  cmd=$(service_command "$service")
-  log_file=$(service_log_file "$service")
-  pid_file=$(service_pid_file "$service")
-  if [ "$service" = "ollama" ]; then
-    env \
-      OLLAMA_HOST="${COURSEWARE_BIND_HOST}:${COURSEWARE_OLLAMA_PORT}" \
-      OLLAMA_MODELS="$OLLAMA_MODELS_DIR" \
-      "$OLLAMA_BIN" serve </dev/null >>"$log_file" 2>&1 &
-  elif command -v setsid >/dev/null 2>&1; then
-    nohup setsid bash -lc "$cmd" </dev/null >>"$log_file" 2>&1 &
-  else
-    nohup bash -lc "$cmd" </dev/null >>"$log_file" 2>&1 &
-  fi
-  echo $! >"$pid_file"
-  for attempt in $(seq 1 60); do
+  for attempt in $(seq 1 "$startup_attempts"); do
     if service_ready "$service"; then
-      if [ "$service" = "transformerlab" ]; then
-        ensure_transformerlab_default_user
-      fi
       echo "started $service"
       return 0
     fi
@@ -261,6 +228,68 @@
   exit 1
 }
+start_one() {
+  local service=$1
+  local cmd
+  local log_file
+  local pid_file
+  local pid_grace_attempts=5
+  local startup_attempts
+  if [ "$service" = "ollama" ] || [ "$service" = "wiki" ]; then
+    assert_ollama_logprobs_support
+  fi
+  startup_attempts=$(service_startup_attempts "$service")
+  log_file=$(service_log_file "$service")
+  pid_file=$(service_pid_file "$service")
+  if service_ready "$service"; then
+    echo "$service already available"
+    return 0
+  fi
+  if has_live_pid "$service"; then
+    echo "$service already starting"
+    wait_for_service_ready "$service" "$log_file" "$pid_file" "$startup_attempts" "$pid_grace_attempts"
+    return 0
+  fi
+  case "$service" in
+    open-webui)
+      start_one ollama
+      ;;
+    wetty)
+      check_wetty_prereqs
+      ;;
+    *)
+      ;;
+  esac
+  cmd=$(service_command "$service")
+  if [ "$service" = "ollama" ]; then
+    if command -v setsid >/dev/null 2>&1; then
+      nohup setsid env \
+        OLLAMA_HOST="${COURSEWARE_BIND_HOST}:${COURSEWARE_OLLAMA_PORT}" \
+        OLLAMA_MODELS="$OLLAMA_MODELS_DIR" \
+        "$OLLAMA_BIN" serve </dev/null >>"$log_file" 2>&1 &
+    else
+      nohup env \
+        OLLAMA_HOST="${COURSEWARE_BIND_HOST}:${COURSEWARE_OLLAMA_PORT}" \
+        OLLAMA_MODELS="$OLLAMA_MODELS_DIR" \
+        "$OLLAMA_BIN" serve </dev/null >>"$log_file" 2>&1 &
+    fi
+  elif command -v setsid >/dev/null 2>&1; then
+    nohup setsid bash -lc "$cmd" </dev/null >>"$log_file" 2>&1 &
+  else
+    nohup bash -lc "$cmd" </dev/null >>"$log_file" 2>&1 &
+  fi
+  echo $! >"$pid_file"
+  wait_for_service_ready "$service" "$log_file" "$pid_file" "$startup_attempts" "$pid_grace_attempts"
+}
 stop_one() {
   local service=$1
   local pid_file
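start_one detaches services from the calling shell three ways at once: nohup ignores SIGHUP, setsid (when available) moves the child into a fresh session, and the redirections sever the controlling terminal. The bare pattern, with sleep standing in for a real service command and the paths being demo values:

# Detach a long-running command and record its PID.
cmd='exec sleep 300'
if command -v setsid >/dev/null 2>&1; then
  nohup setsid bash -lc "$cmd" </dev/null >>/tmp/demo.log 2>&1 &
else
  nohup bash -lc "$cmd" </dev/null >>/tmp/demo.log 2>&1 &
fi
echo $! >/tmp/demo.pid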
@@ -307,6 +336,28 @@ stop_one() {
   exit 1
 }
+restart_managed_wiki() {
+  local wiki_log_file
+  wiki_log_file=$(service_log_file wiki)
+  if has_live_pid wiki; then
+    stop_one wiki
+  fi
+  if service_port_has_listener wiki; then
+    cat <<EOF >&2
+Cannot restart wiki because port $(service_port wiki) is already in use by a non-managed listener.
+Listener details:
+$(service_listener_details wiki)
+Leave that process alone or move it off port $(service_port wiki), then rerun ./labctl update_wiki.
+Wiki log: $wiki_log_file
+EOF
+    exit 1
+  fi
+  start_one wiki
+}
 status_one() {
   local service=$1
@@ -323,7 +374,7 @@ urls() {
   cat <<EOF
 Ollama API: $(service_url ollama)
 Open WebUI: $(service_url open-webui)
-TransformerLab: $(service_url transformerlab)
+Netron: $(service_url netron)
 ChunkViz: $(service_url chunkviz)
 Embedding Atlas: $(service_url embedding-atlas)
 Unsloth Studio: $(service_url unsloth)
@@ -336,14 +387,6 @@ EOF
 }
 open_kiln() {
-  local host_os
-  host_os=$(uname -s)
-  if [ "$host_os" = "Darwin" ] && [ -d "$KILN_MAC_APP" ]; then
-    open "$KILN_MAC_APP"
-    return 0
-  fi
   if [ -x "$KILN_LINUX_BIN" ]; then
     nohup "$KILN_LINUX_BIN" >/dev/null 2>&1 &
     echo "started Kiln from $KILN_LINUX_BIN"
@@ -409,6 +452,9 @@ main() {
       fi
       show_logs "$1"
       ;;
+    restart-wiki)
+      restart_managed_wiki
+      ;;
     *)
       echo "Unknown command: $cmd" >&2
       exit 1