13 Commits

Author SHA1 Message Date
c4ch3c4d3 86a5df4681 Update Lab 2 Ollama Gemma model
Made-with: Cursor
2026-04-25 15:40:50 -06:00
bzuccaro e95ee9c938 Support LAN deployment and managed Python runtime
Made-with: Cursor
2026-04-25 18:05:56 +00:00
bzuccaro fe568c17cd Focus local lab deployment on Linux and WSL 2026-04-24 21:32:01 -06:00
OpenCode e915d87ec6 Align installer with updated lab models 2026-04-24 20:08:56 -06:00
OpenCode 7360cd040a Add wiki refresh command and service updates 2026-04-24 10:02:39 -06:00
OpenCode 78676ece59 Preload 2026-04-23 14:48:06 -06:00
OpenCode 30f80fe058 Fix regex escaping in Lab 1 Ollama version extraction
In the YAML folded block scalar, backslashes are not processed as
escapes, so \\. reached regex_search as a literal backslash followed by
any character and never matched valid semver strings like 0.21.1.
Use \. instead so the dot is matched correctly.
2026-04-22 20:19:31 -06:00
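Ansible's `regex_search` filter is backed by Python's `re` module, so the failure mode this commit fixes can be reproduced standalone (the `version_output` string below is illustrative):

```python
import re

# Example output in the shape of `ollama --version` (illustrative).
version_output = "ollama version is 0.21.1"

# The buggy pattern: a doubled backslash makes the regex "literal
# backslash followed by any character", which never appears here.
broken = re.search(r"\\.", version_output)

# The fixed pattern: a single escaped dot matches a literal ".".
fixed = re.search(r"[0-9]+\.[0-9]+\.[0-9]+", version_output)

print(broken)          # None
print(fixed.group(0))  # 0.21.1
```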
c4ch3c4d3 c79bc2eec0 Restore executable bits for courseware scripts 2026-04-16 11:16:01 -06:00
c4ch3c4d3 59f3032f91 Provision Netron and Lab 1 local assets 2026-04-16 11:15:39 -06:00
c4ch3c4d3 56305680e0 Fix terminal deployment regressions 2026-04-13 21:22:16 -06:00
c4ch3c4d3 5576142aec Use host-managed SSH accounts for browser terminal 2026-04-13 19:40:38 -06:00
c4ch3c4d3 13ce59d901 Restore executable bits on deployment scripts 2026-04-13 16:40:35 -06:00
c4ch3c4d3 cbf6c3fad3 Add managed Lab 3 browser terminal deployment 2026-04-13 16:40:14 -06:00
32 changed files with 796 additions and 491 deletions
+60 -34
View File
@@ -4,32 +4,49 @@ This project builds a student-friendly local lab environment for the courseware
- `./deploy-courseware.sh` installs and configures the environment, then starts every managed service.
- `./destroy-courseware.sh` stops the managed services, uninstalls courseware-managed Ollama, and removes the project-owned lab state.
- `./labctl` provides day-two controls such as `assets lab2`, `start`, `stop`, `status`, `urls`, `logs`, and `open kiln`.
- `./labctl` provides day-two controls such as `assets lab2`, `ollama_models`, `update_wiki`, `start`, `stop`, `status`, `urls`, `logs`, and `open kiln`.
## What It Installs
- Ollama
- `llama.cpp`
- TransformerLab, pinned to the classic single-user `v0.28.2` release
- Netron, served locally on port `8338`
- Open WebUI
- ChunkViz
- Embedding Atlas
- Promptfoo
- Unsloth Studio
- Kiln Desktop
- Course-specific support assets for lab 2 and lab 4
- Course-specific support assets for lab 1, lab 2, and lab 4
## Lab 1 Defaults
Lab 1 is now provisioned directly by the installer:
- The `Llama-3.2-1B.Q4_K_M.gguf` file is mirrored into `state/models/lab1/`.
- The Lab 1 confidence widget uses the pre-pulled Gemma 4 E2B Q4 Ollama model, `batiai/gemma4-e2b:q4`.
- The wiki serves a same-host download link for the Llama GGUF through `/api/lab1/models/...`.
- Lab 1 confidence visualization requires Ollama `0.12.11` or newer because it depends on logprobs.
## Lab 2 Defaults
`./labctl up` now pre-pulls the Gemma 4 E2B Ollama variants used by the wiki widgets:
- `gemma4:e2b-it-q8_0`
- `batiai/gemma4-e2b:q4`
- `batiai/gemma4-e2b:q6`
If you want to re-pull just those managed Ollama models later, run `./labctl ollama_models`.
## Supported Host Profiles
This build intentionally avoids the reference VM's hardware workarounds.
This build is the Linux/WSL variant of LLM Labs Local. If you are deploying on Apple Silicon macOS, use the sibling `LLM-Labs-Local-Mac` project instead.
- macOS: Apple Silicon only, with at least 16 GB unified memory.
- Native Debian/Ubuntu: Debian-family Linux with an NVIDIA GPU visible to `nvidia-smi` and at least 8 GB VRAM.
- WSL: Debian/Ubuntu-family Linux running under WSL, with the NVIDIA GPU exposed into the distro.
The launcher and Ansible preflight now classify the host dynamically and apply different setup behavior for:
The launcher and Ansible preflight classify the host dynamically and apply different setup behavior for:
- `macos`
- `native-debian-ubuntu`
- `wsl`
@@ -51,6 +68,7 @@ On Linux and WSL, the first `./labctl up` or `./labctl preflight` run may prompt
On Ubuntu WSL x86_64, preflight now installs the Linux-side CUDA toolkit automatically if it is missing.
It first tries the distro package:
- `sudo apt install -y nvidia-cuda-toolkit`
If that package is unavailable or still does not expose `nvcc`, the installer falls back to NVIDIA's WSL-Ubuntu repository bootstrap for the toolkit only, not a Linux GPU driver.
@@ -64,58 +82,66 @@ For non-Ubuntu WSL distros, install the CUDA toolkit manually before running the
## Native Debian/Ubuntu CUDA Behavior
On native Debian/Ubuntu hosts, the installer now handles three CUDA-toolkit cases:
On native Debian/Ubuntu hosts, the installer handles three CUDA-toolkit cases:
- If the toolkit is already usable, it reuses the existing install instead of forcing a reinstall.
- If the distro exposes `nvidia-cuda-toolkit`, it installs that package.
- If the distro package is unavailable, it bootstraps NVIDIA's official CUDA network repository for supported native Debian/Ubuntu releases and installs the toolkit from there.
If `apt` starts in a broken dependency state, the installer now attempts `dpkg --configure -a` and `apt-get --fix-broken install` before retrying package installation.
If `apt` starts in a broken dependency state, the installer attempts `dpkg --configure -a` and `apt-get --fix-broken install` before retrying package installation.
If CUDA is already mounted or preinstalled outside `PATH`, the installer now detects standard locations such as `/usr/local/cuda/bin/nvcc` and `/usr/local/cuda-*/bin/nvcc`.
If CUDA is already mounted or preinstalled outside `PATH`, the installer detects standard locations such as `/usr/local/cuda/bin/nvcc` and `/usr/local/cuda-*/bin/nvcc`.
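That lookup can be sketched as a small search over the conventional CUDA install locations (`find_nvcc` and its `extra_dirs` test hook are illustrative, not the installer's actual code):

```python
import glob
import os

def find_nvcc(extra_dirs=()):
    """Return the first usable nvcc found outside PATH, or None.

    Checks the standard /usr/local/cuda locations first; extra_dirs
    exists only so the helper can be exercised against test paths.
    """
    candidates = ["/usr/local/cuda/bin/nvcc"]
    candidates += sorted(glob.glob("/usr/local/cuda-*/bin/nvcc"))
    candidates += [os.path.join(d, "nvcc") for d in extra_dirs]
    for path in candidates:
        if os.path.isfile(path) and os.access(path, os.X_OK):
            return path
    return None
```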
## Standard Assumptions
- The host-side install path assumes modern local tooling, but TransformerLab itself is provisioned from a pinned classic single-user layout.
- TransformerLab is intentionally pinned to the older single-user `v0.28.2` release because newer upstream releases changed the project structure and behavior in ways that break this courseware.
- This project does not rely on TransformerLab's upstream `install.sh`; the Ansible role provisions the pinned release directly so web assets, env layout, and runtime behavior stay reproducible.
- The courseware repairs installed TransformerLab Fastchat plugin manifests so Fastchat-gated features such as Model Architecture and Visualize Logprobs stay available on pinned installs.
- No Ollama models are pulled during `./labctl up`; students pull models manually as part of the courseware.
- WhiteRabbitNeo assets are handled separately from `./labctl up` and `./labctl preflight`.
- Run `./labctl assets lab2` when you want to populate repo-local lab 2 assets in `assets/lab2/` from Hugging Face.
- The default deployment is centered on Ollama-backed local inference and browser-based tools such as Netron and the wiki.
- Netron is installed into a managed Python virtual environment and served locally instead of being provisioned as a desktop package.
- Lab 1's Llama GGUF download is mirrored locally during `./labctl up`, so students do not have to fetch it manually from the original source.
- WhiteRabbitNeo assets remain a separate Lab 2 flow and are still handled outside the default `./labctl up` run.
- Run `./labctl assets lab2` when you want to populate repo-local Lab 2 assets in `assets/lab2/` from Hugging Face.
- After base setup, run `state/lab2/download_whiterabbitneo-gguf.sh` to fetch only the `Q4_K_M`, `Q8_0`, and `IQ2_M` files from `bartowski/WhiteRabbitNeo_WhiteRabbitNeo-V3-7B-GGUF` and register local Ollama models `WhiteRabbitNeo`, `WhiteRabbitNeo-Q4`, `WhiteRabbitNeo-Q8`, and `WhiteRabbitNeo-IQ2`.
- TransformerLab and Unsloth homes are redirected into this project's `state/` tree via symlinks.
- Managed web services bind for access from both Linux and the Windows side of WSL, while `labctl urls` still reports localhost-friendly URLs.
- Unsloth homes are redirected into this project's `state/` tree via symlinks.
- Managed web services bind on all interfaces for headless LAN/VPN access. `labctl urls` reports the detected LAN IP by default; set `COURSEWARE_URL_HOST=<host-or-ip>` before `./labctl up` to advertise a specific VPN DNS name or address.
- The local Ansible bootstrap in `.venv-ansible/` is machine-specific and will be recreated automatically if the folder is copied between hosts.
- `llama.cpp` now uses a conservative, memory-aware build parallelism setting instead of an unbounded `-j` build, which avoids OOM failures on smaller Linux and WSL hosts.
- `llama.cpp` uses a conservative, memory-aware build parallelism setting instead of an unbounded `-j` build, which avoids OOM failures on smaller Linux and WSL hosts.
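The memory-aware parallelism mentioned above boils down to capping `-j` by both CPU count and available RAM. A sketch, assuming roughly 2 GB per C++ compile job (the installer's actual per-job budget is not stated here):

```python
import os

def build_jobs(mem_available_gb, gb_per_job=2.0):
    """Pick a conservative -j value for a llama.cpp-style build.

    gb_per_job is an assumed memory budget per compile job; an
    unbounded -j would use every core regardless of RAM and can
    OOM smaller Linux/WSL hosts.
    """
    cpu_jobs = os.cpu_count() or 1
    mem_jobs = max(1, int(mem_available_gb // gb_per_job))
    return min(cpu_jobs, mem_jobs)
```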
## Lab URLs
After `./deploy-courseware.sh`, run `./labctl urls`.
Default endpoints:
Default endpoints use the detected host LAN IP:
- Ollama API: `http://127.0.0.1:11434`
- Open WebUI: `http://127.0.0.1:8080`
- TransformerLab: `http://127.0.0.1:8338`
- ChunkViz: `http://127.0.0.1:3001`
- Embedding Atlas: `http://127.0.0.1:5055`
- Unsloth Studio: `http://127.0.0.1:8888`
- Promptfoo UI: `http://127.0.0.1:15500`
- Wiki: `http://127.0.0.1:80`
- Ollama API: `http://<host-lan-ip>:11434`
- Open WebUI: `http://<host-lan-ip>:8080`
- Netron: `http://<host-lan-ip>:8338`
- ChunkViz: `http://<host-lan-ip>:3001`
- Embedding Atlas: `http://<host-lan-ip>:5055`
- Unsloth Studio: `http://<host-lan-ip>:8888`
- Promptfoo UI: `http://<host-lan-ip>:15500`
- Wiki: `http://<host-lan-ip>:80`
- Lab 3 Terminal: `http://<host-lan-ip>:7681/wetty`
## Lab 3 Browser Terminal
The deployment will:
- bind `sshd` to `0.0.0.0:22` so regular SSH clients can connect over the LAN/VPN
- install WeTTY and expose it at `http://<host-lan-ip>:7681/wetty`
- leave login identity management to the host, so any existing local account with password-based SSH access can sign in through SSH or the browser terminal
## Notes
- `./labctl up` installs the environment and then starts every managed service.
- `./labctl versions` shows the pinned TransformerLab and Ansible runtime versions used by this workspace.
- `./labctl versions` shows the pinned Netron version, minimum Ollama version, and Ansible runtime version used by this workspace.
- `./labctl assets lab2` is a separate manual step that clones the base WhiteRabbitNeo repo into `assets/lab2/WhiteRabbitNeo-V3-7B` and downloads the supported `Q4_K_M`, `Q8_0`, and `IQ2_M` GGUFs into `assets/lab2/WhiteRabbitNeo_WhiteRabbitNeo-V3-7B-GGUF`.
- TransformerLab is installed as a pinned single-user app and no default courseware-managed TransformerLab user is created automatically.
- `./labctl ollama_models` re-pulls the managed Lab 2 Gemma 4 E2B Ollama model set without rerunning the full installer.
- `./labctl update_wiki` hard-resets the managed wiki checkout to the remote latest, rebuilds it, and restarts only the managed wiki service on port `80`.
- `./labctl start core` starts only `ollama` and `open-webui`.
- `./labctl start all` starts every managed web service.
- `./labctl open kiln` launches the Kiln desktop app installed into the project state.
- The scripted Promptfoo install drops a starter config at `state/lab6/promptfoo.yaml`.
- `labctl start all` now includes Promptfoo via `promptfoo view` and the cloned wiki app.
- `labctl start all` includes Promptfoo via `promptfoo view` and the cloned wiki app.
- Lab 2 includes `state/lab2/download_whiterabbitneo-gguf.sh`, which uses `git` + `git lfs` to pull only the supported WhiteRabbitNeo quants. Add `--download-only` if you want the files without Ollama registration.
- The wiki is cloned from `https://git.zuccaro.me/bzuccaro/LLM-Labs.git` into `state/repos/LLM-Labs` and started with `npm`.
- `./labctl down` now uninstalls Ollama entirely when this project installed it, instead of only stopping the service.
- Unsloth Studio currently supports chat and data workflows on macOS; Linux/WSL remains the standard path for NVIDIA-backed training.
- `./labctl down` uninstalls Ollama entirely when this project installed it, instead of only stopping the service.
- This variant is intended for NVIDIA-backed Linux/WSL training and lab workflows.
+43 -36
@@ -2,6 +2,8 @@ courseware_state_dir: "{{ courseware_root }}/state"
courseware_markers_dir: "{{ courseware_state_dir }}/markers"
courseware_logs_dir: "{{ courseware_state_dir }}/logs"
courseware_run_dir: "{{ courseware_state_dir }}/run"
courseware_cache_dir: "{{ courseware_state_dir }}/cache"
courseware_tmp_dir: "{{ courseware_state_dir }}/tmp"
courseware_repos_dir: "{{ courseware_state_dir }}/repos"
courseware_venvs_dir: "{{ courseware_state_dir }}/venvs"
courseware_models_dir: "{{ courseware_state_dir }}/models"
@@ -9,64 +11,65 @@ courseware_datasets_dir: "{{ courseware_state_dir }}/datasets"
courseware_tools_dir: "{{ courseware_state_dir }}/tools"
courseware_apps_dir: "{{ courseware_state_dir }}/apps"
courseware_downloads_dir: "{{ courseware_state_dir }}/downloads"
courseware_lab1_dir: "{{ courseware_state_dir }}/lab1"
courseware_lab2_dir: "{{ courseware_state_dir }}/lab2"
courseware_lab6_dir: "{{ courseware_state_dir }}/lab6"
courseware_transformerlab_legacy_home: "{{ courseware_state_dir }}/transformerlab-home"
courseware_safe_homes_dir: "{{ lookup('env', 'HOME') }}/.local/share/local-lab-deployment"
courseware_transformerlab_home: "{{ (courseware_safe_homes_dir ~ '/transformerlab-home') if ' ' in courseware_root else courseware_transformerlab_legacy_home }}"
courseware_unsloth_home: "{{ courseware_state_dir }}/unsloth-home"
courseware_lab1_models_dir: "{{ courseware_models_dir }}/lab1"
courseware_ollama_models_dir: "{{ courseware_models_dir }}/ollama"
courseware_node_runtime_dir: "{{ courseware_tools_dir }}/node-runtime"
courseware_node_runtime_bin_dir: "{{ courseware_node_runtime_dir }}/node_modules/node/bin"
courseware_uv_dir: "{{ courseware_tools_dir }}/uv"
courseware_uv_bin: "{{ courseware_uv_dir }}/bin/uv"
courseware_uv_cache_dir: "{{ courseware_cache_dir }}/uv"
courseware_python_runtime_dir: "{{ courseware_tools_dir }}/python"
courseware_netron_venv_dir: "{{ courseware_venvs_dir }}/netron"
courseware_wetty_dir: "{{ courseware_tools_dir }}/wetty"
courseware_promptfoo_dir: "{{ courseware_lab6_dir }}"
courseware_wiki_repo_dir: "{{ courseware_repos_dir }}/LLM-Labs"
courseware_wiki_runtime_config_path: "{{ courseware_wiki_repo_dir }}/public/courseware-runtime.json"
courseware_llama_cpp_bin_dir: "{{ courseware_repos_dir }}/llama.cpp/build/bin"
courseware_bind_host: "0.0.0.0"
courseware_url_host: "127.0.0.1"
courseware_url_host: >-
{{
(lookup('env', 'COURSEWARE_URL_HOST') | trim)
if (lookup('env', 'COURSEWARE_URL_HOST') | trim | length) > 0
else (
ansible_default_ipv4.address
| default(ansible_all_ipv4_addresses | default(['127.0.0.1']) | first)
)
}}
courseware_ports:
ollama: 11434
open_webui: 8080
transformerlab: 8338
netron: 8338
chunkviz: 3001
embedding_atlas: 5055
unsloth: 8888
promptfoo: 15500
wiki: 80
wetty: 7681
courseware_transformerlab_install_mode: "single-user-pinned"
courseware_transformerlab_version: "v0.28.2"
courseware_transformerlab_version_dir: "{{ courseware_transformerlab_version | regex_replace('^v', '') }}"
courseware_transformerlab_source_archive: "{{ courseware_downloads_dir }}/transformerlab-app-{{ courseware_transformerlab_version_dir }}.tar.gz"
courseware_transformerlab_web_archive: "{{ courseware_downloads_dir }}/transformerlab-web-{{ courseware_transformerlab_version_dir }}.tar.gz"
courseware_transformerlab_miniforge_installer: "{{ courseware_downloads_dir }}/transformerlab-miniforge-installer.sh"
courseware_transformerlab_default_user_email: "student@zuccaro.me"
courseware_transformerlab_default_user_password: "student"
courseware_transformerlab_default_user_first_name: "Student"
courseware_transformerlab_default_user_last_name: ""
courseware_transformerlab_required_loader_plugins:
- "fastchat_server"
courseware_transformerlab_required_supports_fastchat:
- "chat"
- "completion"
- "visualize_model"
- "model_layers"
- "rag"
- "tools"
- "template"
- "embeddings"
- "tokenize"
- "logprobs"
- "batched"
courseware_netron_version: "9.0.1"
courseware_ollama_min_version: "0.12.11"
courseware_llama_cpp_commit: "51fa458a92d6a3f305f8fd76fc8f702e3e87ddb5"
courseware_chunkviz_commit: "a891eacafda1f28a12373ad3b00102e68f07c57f"
courseware_promptfoo_version: "0.119.0"
courseware_kiln_release_tag: "v0.18.1"
courseware_node_runtime_version: "20.20.2"
courseware_python_runtime_version: "3.12"
courseware_uv_spec: "uv"
courseware_wetty_spec: "wetty@2.5.0"
courseware_wetty_base_path: "/wetty"
courseware_wiki_repo: "https://git.zuccaro.me/bzuccaro/LLM-Labs.git"
courseware_open_webui_spec: "open-webui"
courseware_embedding_atlas_spec: "embedding-atlas"
courseware_lab1_ollama_model_alias: "batiai/gemma4-e2b:q4"
courseware_lab1_llama_filename: "Llama-3.2-1B.Q4_K_M.gguf"
courseware_lab1_llama_download_url: "https://huggingface.co/DevQuasar-3/meta-llama.Llama-3.2-1B-GGUF/resolve/main/Llama-3.2-1B.Q4_K_M.gguf?download=true"
courseware_lab1_llama_local_path: "{{ courseware_lab1_models_dir }}/{{ courseware_lab1_llama_filename }}"
courseware_white_rabbit_repo: "bartowski/WhiteRabbitNeo_WhiteRabbitNeo-V3-7B-GGUF"
courseware_white_rabbit_variants:
@@ -83,12 +86,15 @@ courseware_white_rabbit_variants:
- ollama_model: "WhiteRabbitNeo-IQ2"
quant: "IQ2_M"
filename: "WhiteRabbitNeo_WhiteRabbitNeo-V3-7B-IQ2_M.gguf"
courseware_ollama_models:
- "llama3.2"
- "qwen3.5:4b"
- "gemma3n:e2b"
courseware_optional_ollama_models:
- "gemma3:12b-it-qat"
courseware_lab2_ollama_models:
- label: "Gemma 4 E2B IT Q8"
value: "gemma4:e2b-it-q8_0"
- label: "Gemma 4 E2B Q4"
value: "batiai/gemma4-e2b:q4"
- label: "Gemma 4 E2B Q6"
value: "batiai/gemma4-e2b:q6"
courseware_ollama_models: "{{ courseware_lab2_ollama_models | map(attribute='value') | list }}"
courseware_optional_ollama_models: []
courseware_install_optional_heavy_models: false
courseware_wsl_cuda_pin_url: "https://developer.download.nvidia.com/compute/cuda/repos/wsl-ubuntu/x86_64/cuda-wsl-ubuntu.pin"
@@ -145,9 +151,10 @@ courseware_ollama_install_marker: "{{ courseware_markers_dir }}/ollama-installed
courseware_services:
- "ollama"
- "open-webui"
- "transformerlab"
- "netron"
- "chunkviz"
- "embedding-atlas"
- "unsloth"
- "promptfoo"
- "wiki"
- "wetty"
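The `courseware_url_host` expression in the vars above is a three-step fallback: an explicit `COURSEWARE_URL_HOST` override, then the detected default IPv4 address, then the first known IPv4 address or loopback. In Python terms (a sketch; the function name is illustrative):

```python
import os

def resolve_url_host(default_ipv4=None, all_ipv4=()):
    """Mirror the Jinja fallback chain for the advertised URL host."""
    override = os.environ.get("COURSEWARE_URL_HOST", "").strip()
    if override:
        return override          # explicit VPN DNS name or IP
    if default_ipv4:
        return default_ipv4      # ansible_default_ipv4.address
    return next(iter(all_ipv4), "127.0.0.1")  # first IPv4, else loopback
```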
-65
@@ -12,17 +12,6 @@
changed_when: false
failed_when: false
- name: Stat managed TransformerLab symlink
stat:
path: "{{ ansible_env.HOME }}/.transformerlab"
follow: false
register: courseware_down_transformerlab
- name: Stat managed TransformerLab marker
stat:
path: "{{ ansible_env.HOME }}/.transformerlab/.courseware-managed"
register: courseware_down_transformerlab_marker
- name: Stat managed Unsloth symlink
stat:
path: "{{ ansible_env.HOME }}/.unsloth"
@@ -34,11 +23,6 @@
path: "{{ ansible_env.HOME }}/.unsloth/.courseware-managed"
register: courseware_down_unsloth_marker
- name: Stat conda environments file
stat:
path: "{{ ansible_env.HOME }}/.conda/environments.txt"
register: courseware_down_conda_envs
- name: Stat courseware-managed Ollama install marker
stat:
path: "{{ courseware_ollama_install_marker }}"
@@ -156,55 +140,6 @@
- courseware_down_ollama_marker.stat.exists
failed_when: false
- name: Stop courseware-managed Ollama macOS app if running
command: pkill -x Ollama
when:
- ansible_system == "Darwin"
- courseware_down_ollama_marker.stat.exists
changed_when: false
failed_when: false
- name: Uninstall courseware-managed Ollama Homebrew formula
command: brew uninstall ollama
when:
- ansible_system == "Darwin"
- courseware_down_ollama_marker.stat.exists
changed_when: false
failed_when: false
- name: Remove managed TransformerLab conda environment entry
lineinfile:
path: "{{ ansible_env.HOME }}/.conda/environments.txt"
regexp: "^{{ (courseware_transformerlab_home ~ '/envs/transformerlab') | regex_escape() }}$"
state: absent
when: courseware_down_conda_envs.stat.exists
failed_when: false
- name: Remove managed TransformerLab path
file:
path: "{{ ansible_env.HOME }}/.transformerlab"
state: absent
when:
- courseware_down_transformerlab.stat.exists
- >
(courseware_down_transformerlab.stat.islnk and
courseware_down_transformerlab.stat.lnk_source == courseware_transformerlab_home)
or courseware_down_transformerlab_marker.stat.exists
failed_when: false
- name: Remove managed TransformerLab home
file:
path: "{{ courseware_transformerlab_home }}"
state: absent
failed_when: false
- name: Remove legacy managed TransformerLab home
file:
path: "{{ courseware_transformerlab_legacy_home }}"
state: absent
when: courseware_transformerlab_legacy_home != courseware_transformerlab_home
failed_when: false
- name: Remove managed Unsloth path
file:
path: "{{ ansible_env.HOME }}/.unsloth"
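The removal tasks above are guarded so teardown never deletes a `~/.transformerlab` or `~/.unsloth` the project does not own: the path must either be a symlink into the managed home or contain the `.courseware-managed` marker. A sketch of that ownership test (the `realpath` comparison approximates the Ansible `lnk_source` check):

```python
import os

def is_courseware_managed(path, managed_home):
    """True only when `path` is safe for the courseware to delete."""
    if os.path.islink(path):
        # Symlink case: it must point at the project-owned home.
        return os.path.realpath(path) == os.path.realpath(managed_home)
    # Real directory case: require the explicit ownership marker.
    return os.path.exists(os.path.join(path, ".courseware-managed"))
```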
+6 -2
@@ -6,13 +6,17 @@
- { role: preflight, tags: ["preflight"] }
- directories
- packages
- python_runtime
- netron
- lab1_assets
- lab_assets
- node_runtime
- { role: terminal, when: ansible_system == "Linux" }
- llama_cpp
- transformerlab
- open_webui
- chunkviz
- promptfoo
- wiki
- { role: ollama_models, tags: ["ollama_models"] }
- { role: wiki, tags: ["wiki"] }
- kiln
- unsloth
-7
@@ -12,10 +12,3 @@ common_packages_debian:
- ninja-build
- libssl-dev
- pkg-config
common_packages_macos:
- python3
- git
- curl
- cmake
- ninja
-11
@@ -20,17 +20,6 @@
when: ansible_os_family == "Debian"
become: yes
- name: Ensure Homebrew is installed (macOS)
ansible.builtin.homebrew:
name:
- python3
- git
- curl
- cmake
- ninja
state: present
when: ansible_os_family == "Darwin"
- name: Install Python virtual environment module (user space)
ansible.builtin.pip:
name: virtualenv
+5 -34
@@ -8,6 +8,9 @@
- "{{ courseware_markers_dir }}"
- "{{ courseware_logs_dir }}"
- "{{ courseware_run_dir }}"
- "{{ courseware_cache_dir }}"
- "{{ courseware_tmp_dir }}"
- "{{ courseware_uv_cache_dir }}"
- "{{ courseware_repos_dir }}"
- "{{ courseware_venvs_dir }}"
- "{{ courseware_models_dir }}"
@@ -15,10 +18,10 @@
- "{{ courseware_tools_dir }}"
- "{{ courseware_apps_dir }}"
- "{{ courseware_downloads_dir }}"
- "{{ courseware_lab1_dir }}"
- "{{ courseware_lab2_dir }}"
- "{{ courseware_safe_homes_dir }}"
- "{{ courseware_transformerlab_home }}"
- "{{ courseware_unsloth_home }}"
- "{{ courseware_lab1_models_dir }}"
- "{{ courseware_ollama_models_dir }}"
- name: Seed managed ownership markers
@@ -27,40 +30,8 @@
state: touch
mode: "0644"
loop:
- "{{ courseware_transformerlab_home }}/.courseware-managed"
- "{{ courseware_unsloth_home }}/.courseware-managed"
- name: Check existing TransformerLab path
stat:
path: "{{ ansible_env.HOME }}/.transformerlab"
follow: false
register: courseware_transformerlab_link
- name: Check existing TransformerLab ownership marker
stat:
path: "{{ ansible_env.HOME }}/.transformerlab/.courseware-managed"
register: courseware_transformerlab_marker
- name: Fail if TransformerLab path is already occupied
fail:
msg: "{{ ansible_env.HOME }}/.transformerlab already exists and is not managed by this project."
when:
- courseware_transformerlab_link.stat.exists
- >
(
(not courseware_transformerlab_link.stat.islnk) or
(courseware_transformerlab_link.stat.islnk and
courseware_transformerlab_link.stat.lnk_source != courseware_transformerlab_home)
) and
(not courseware_transformerlab_marker.stat.exists)
- name: Link TransformerLab home into project state
file:
src: "{{ courseware_transformerlab_home }}"
dest: "{{ ansible_env.HOME }}/.transformerlab"
state: link
force: true
- name: Check existing Unsloth path
stat:
path: "{{ ansible_env.HOME }}/.unsloth"
-19
@@ -1,19 +0,0 @@
- name: Download Kiln macOS disk image
get_url:
url: "https://github.com/Kiln-AI/Kiln/releases/download/{{ courseware_kiln_release_tag }}/Kiln.MacOS.AppleSilicon.M-Processor.dmg"
dest: "{{ courseware_downloads_dir }}/Kiln.MacOS.AppleSilicon.M-Processor.dmg"
mode: "0644"
- name: Install Kiln.app into project state
shell: |
set -euo pipefail
mount_point=$(mktemp -d /tmp/kiln.XXXXXX)
hdiutil attach "{{ courseware_downloads_dir }}/Kiln.MacOS.AppleSilicon.M-Processor.dmg" -mountpoint "$mount_point" -nobrowse -quiet
app_path=$(find "$mount_point" -maxdepth 1 -name '*.app' | head -n 1)
rm -rf "{{ courseware_apps_dir }}/Kiln.app"
cp -R "$app_path" "{{ courseware_apps_dir }}/Kiln.app"
hdiutil detach "$mount_point" -quiet
rmdir "$mount_point"
args:
executable: /bin/bash
creates: "{{ courseware_apps_dir }}/Kiln.app"
-5
@@ -1,8 +1,3 @@
- name: Install Kiln on Linux
include_tasks: linux.yml
when: ansible_system == "Linux"
- name: Install Kiln on macOS
include_tasks: macos.yml
when: ansible_system == "Darwin"
+38
@@ -0,0 +1,38 @@
- name: Ensure Lab 1 model directory exists
file:
path: "{{ courseware_lab1_models_dir }}"
state: directory
mode: "0755"
- name: Check installed Ollama version
command:
argv:
- "{{ courseware_ollama_bin }}"
- --version
register: courseware_lab1_ollama_version
changed_when: false
- name: Extract installed Ollama semantic version
set_fact:
courseware_lab1_ollama_semver: >-
{{
courseware_lab1_ollama_version.stdout
| regex_search('[0-9]+\.[0-9]+\.[0-9]+')
| default('')
}}
- name: Fail when Ollama is too old for Lab 1 logprobs
fail:
msg: >-
Lab 1 requires Ollama {{ courseware_ollama_min_version }} or newer because
the confidence visualizer depends on logprob support. Installed version:
{{ courseware_lab1_ollama_version.stdout | trim }}.
when:
- courseware_lab1_ollama_semver | length == 0
or not (courseware_lab1_ollama_semver is version(courseware_ollama_min_version, '>='))
- name: Download mirrored Lab 1 Llama model
get_url:
url: "{{ courseware_lab1_llama_download_url }}"
dest: "{{ courseware_lab1_llama_local_path }}"
mode: "0644"
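The version gate in the tasks above extracts an `x.y.z` string with `regex_search` and compares it using Ansible's `version` test. A Python equivalent (illustrative helper name) uses numeric tuple comparison, which is what makes `0.9.x` correctly sort below `0.12.x` where a plain string comparison would not:

```python
import re

def ollama_meets_minimum(version_output, minimum="0.12.11"):
    """Check `ollama --version` output against a minimum semver."""
    match = re.search(r"[0-9]+\.[0-9]+\.[0-9]+", version_output)
    if match is None:
        return False  # unparseable output fails closed, as the task does
    installed = tuple(int(part) for part in match.group(0).split("."))
    required = tuple(int(part) for part in minimum.split("."))
    return installed >= required
```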
-26
@@ -23,18 +23,6 @@
gpu_type: "{{ 'nvidia' if nvidia_smi_output.rc == 0 else 'none' }}"
when: is_wsl | default(false) or ansible_os_family == "Debian"
- name: Check for Metal GPU on macOS
ansible.builtin.command: system_profiler SPDisplaysDataType
register: metal_check
changed_when: false
failed_when: false
when: ansible_os_family == "Darwin"
- name: Set GPU type for macOS
ansible.builtin.set_fact:
gpu_type: "metal"
when: ansible_os_family == "Darwin" and metal_check.rc == 0
- name: Display detected GPU type
ansible.builtin.debug:
msg: "llama.cpp GPU type: {{ gpu_type | default('none') }}"
@@ -58,7 +46,6 @@
{{
not llama_cpp_stat.stat.exists or
(gpu_type == 'nvidia' and existing_gpu_check.stdout != 'cuda') or
(gpu_type == 'metal' and existing_gpu_check.stdout != 'metal') or
(gpu_type == 'amd' and existing_gpu_check.stdout != 'amd')
}}
@@ -120,19 +107,6 @@
when: gpu_type == 'amd' and cmake_configured.rc != 0
become: no
- name: Configure llama.cpp for Metal (macOS)
ansible.builtin.command:
argv:
- cmake
- ..
- -G Ninja
- -DCMAKE_BUILD_TYPE=Release
- -DGGML_METAL=on
args:
chdir: "{{ llmlab_base }}/lab2/llama.cpp/build"
when: gpu_type == 'metal' and cmake_configured.rc != 0
become: no
- name: Configure llama.cpp for CPU only
ansible.builtin.command:
argv:
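With the Metal branch removed, the rebuild condition in this role reduces to a small predicate: rebuild when the binary is missing, or when it was compiled for a different backend than the detected GPU. A sketch (`built_backend` stands in for `existing_gpu_check.stdout`):

```python
def needs_rebuild(binary_exists, gpu_type, built_backend):
    """Decide whether the llama.cpp build must be redone."""
    if not binary_exists:
        return True
    # Map the detected GPU to the backend tag the build should carry.
    expected = {"nvidia": "cuda", "amd": "amd"}.get(gpu_type)
    # CPU-only hosts (gpu_type 'none') reuse whatever already exists.
    return expected is not None and built_backend != expected
```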
+1 -1
@@ -78,7 +78,7 @@
- name: Set llama.cpp backend flag
set_fact:
courseware_llama_backend_flag: "{{ '-DGGML_METAL=ON' if ansible_system == 'Darwin' else '-DGGML_CUDA=ON' }}"
courseware_llama_backend_flag: "-DGGML_CUDA=ON"
- name: Set llama.cpp build parallelism
set_fact:
+30
@@ -0,0 +1,30 @@
- name: Create Netron virtual environment
command:
argv:
- "{{ courseware_python_bin }}"
- -m
- venv
- "{{ courseware_netron_venv_dir }}"
args:
creates: "{{ courseware_netron_venv_dir }}/bin/python"
- name: Upgrade Netron venv tooling
command:
argv:
- "{{ courseware_netron_venv_dir }}/bin/python"
- -m
- pip
- install
- --upgrade
- pip
- setuptools
- wheel
- name: Install Netron server runtime
command:
argv:
- "{{ courseware_netron_venv_dir }}/bin/python"
- -m
- pip
- install
- "netron=={{ courseware_netron_version }}"
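The venv tasks above stay idempotent through Ansible's `creates:` argument: the command is skipped whenever the target path already exists. The same guard pattern, sketched with a hypothetical `run_once` helper:

```python
import os
import subprocess

def run_once(argv, creates):
    """Run a command only if `creates` does not exist yet.

    Mirrors Ansible's `creates:` behavior: the first run does the
    work, later runs become no-ops. Returns True when it executed.
    """
    if os.path.exists(creates):
        return False
    subprocess.run(argv, check=True)
    return True
```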
-8
@@ -16,14 +16,6 @@
become: yes
notify: Start Ollama service
- name: Install Ollama (macOS via Homebrew)
ansible.builtin.homebrew:
name: ollama
state: present
when:
- ansible_os_family == "Darwin"
- ollama_version_check.rc != 0
- name: Check if Ollama service exists
ansible.builtin.command: systemctl list-unit-files ollama.service
register: ollama_service_check
+38 -2
@@ -4,10 +4,28 @@
state: directory
mode: "0755"
- name: Check Open WebUI virtual environment Python version
command:
argv:
- "{{ courseware_venvs_dir }}/open-webui/bin/python"
- -c
- "import sys; print(f'{sys.version_info.major}.{sys.version_info.minor}')"
register: courseware_open_webui_venv_python_version
changed_when: false
failed_when: false
- name: Remove Open WebUI virtual environment with incompatible Python
file:
path: "{{ courseware_venvs_dir }}/open-webui"
state: absent
when:
- courseware_open_webui_venv_python_version.rc == 0
- courseware_open_webui_venv_python_version.stdout != courseware_python_runtime_version
- name: Create Open WebUI virtual environment
command:
argv:
- "{{ courseware_transformerlab_home }}/envs/transformerlab/bin/python"
- "{{ courseware_python_bin }}"
- -m
- venv
- "{{ courseware_venvs_dir }}/open-webui"
@@ -36,10 +54,28 @@
- "{{ courseware_open_webui_spec }}"
- "numpy<2"
- name: Check Embedding Atlas virtual environment Python version
command:
argv:
- "{{ courseware_venvs_dir }}/embedding-atlas/bin/python"
- -c
- "import sys; print(f'{sys.version_info.major}.{sys.version_info.minor}')"
register: courseware_embedding_atlas_venv_python_version
changed_when: false
failed_when: false
- name: Remove Embedding Atlas virtual environment with incompatible Python
file:
path: "{{ courseware_venvs_dir }}/embedding-atlas"
state: absent
when:
- courseware_embedding_atlas_venv_python_version.rc == 0
- courseware_embedding_atlas_venv_python_version.stdout != courseware_python_runtime_version
- name: Create Embedding Atlas virtual environment
command:
argv:
- "{{ courseware_transformerlab_home }}/envs/transformerlab/bin/python"
- "{{ courseware_python_bin }}"
- -m
- venv
- "{{ courseware_venvs_dir }}/embedding-atlas"
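Both guards above follow the same recipe: ask the existing venv's interpreter for its `major.minor` version, and drop the venv when that differs from the managed runtime version so it is recreated against the right Python. Sketched in Python (helper names are illustrative):

```python
import subprocess

def venv_python_minor(venv_python):
    """Return a venv interpreter's 'major.minor' string, or None."""
    probe = "import sys; print(f'{sys.version_info.major}.{sys.version_info.minor}')"
    try:
        result = subprocess.run([venv_python, "-c", probe],
                                capture_output=True, text=True)
    except OSError:
        return None  # interpreter missing entirely: nothing to remove
    return result.stdout.strip() if result.returncode == 0 else None

def venv_needs_rebuild(venv_python, required="3.12"):
    """Remove-and-recreate only when the venv runs the wrong Python."""
    found = venv_python_minor(venv_python)
    return found is not None and found != required
```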
+1
@@ -14,6 +14,7 @@
- pkg-config
- python3
- python3-pip
- python3-setuptools
- python3-venv
- unzip
- zstd
-29
@@ -1,29 +0,0 @@
- name: Check installed Homebrew formulas
command: "brew list --versions {{ item }}"
loop:
- git
- git-lfs
- cmake
- node
- python@3.11
- ollama
register: courseware_brew_checks
changed_when: false
failed_when: false
- name: Install missing Homebrew formulas
command: "brew install {{ item.item }}"
loop: "{{ courseware_brew_checks.results }}"
when: item.rc != 0
- name: Mark Ollama as installed by courseware on macOS
file:
path: "{{ courseware_ollama_install_marker }}"
state: touch
mode: "0644"
when:
- courseware_brew_checks.results
| selectattr('item', 'equalto', 'ollama')
| selectattr('rc', 'ne', 0)
| list
| length > 0
-5
@@ -1,8 +1,3 @@
- name: Install macOS prerequisites
include_tasks: macos.yml
when: ansible_system == "Darwin"
- name: Install Linux prerequisites
include_tasks: linux.yml
when: ansible_system == "Linux"
+4 -47
@@ -1,56 +1,14 @@
- name: Classify supported host profile
set_fact:
courseware_is_macos: "{{ ansible_system == 'Darwin' }}"
courseware_is_linux: "{{ ansible_system == 'Linux' }}"
courseware_is_wsl: "{{ 'microsoft' in ansible_kernel | lower or 'wsl' in ansible_kernel | lower }}"
courseware_is_native_linux: "{{ ansible_system == 'Linux' and not ('microsoft' in ansible_kernel | lower or 'wsl' in ansible_kernel | lower) }}"
courseware_host_profile: "{{ 'macos' if ansible_system == 'Darwin' else ('wsl' if ('microsoft' in ansible_kernel | lower or 'wsl' in ansible_kernel | lower) else ('native-debian-ubuntu' if ansible_system == 'Linux' and ansible_os_family == 'Debian' else 'unsupported')) }}"
courseware_host_profile: "{{ 'wsl' if ansible_system == 'Linux' and ('microsoft' in ansible_kernel | lower or 'wsl' in ansible_kernel | lower) else ('native-debian-ubuntu' if ansible_system == 'Linux' and ansible_os_family == 'Debian' else 'unsupported') }}"
- name: Fail on unsupported operating systems
fail:
msg: "Supported platforms are Apple Silicon macOS and Debian-family Linux/WSL."
when: ansible_system not in ["Darwin", "Linux"]
- name: Fail on unsupported macOS architecture
fail:
msg: "This installer supports Apple Silicon Macs only."
when:
- ansible_system == "Darwin"
- ansible_architecture not in ["arm64", "aarch64"]
- name: Fail on undersized macOS systems
fail:
msg: "This courseware assumes a modern Apple Silicon Mac with at least 16 GB of unified memory."
when:
- ansible_system == "Darwin"
- (ansible_memtotal_mb | int) < 16000
- name: Check for Xcode command line tools
command: xcode-select -p
register: courseware_xcode_select
changed_when: false
when: ansible_system == "Darwin"
- name: Check for Homebrew
command: which brew
register: courseware_brew_check
changed_when: false
failed_when: false
when: ansible_system == "Darwin"
- name: Fail when Xcode command line tools are missing
fail:
msg: "Install Xcode Command Line Tools first with 'xcode-select --install'."
when:
- ansible_system == "Darwin"
- courseware_xcode_select.rc != 0
- name: Fail when Homebrew is missing
fail:
msg: "Install Homebrew first from https://brew.sh/."
when:
- ansible_system == "Darwin"
- courseware_brew_check.rc != 0
msg: "Supported platforms are Debian-family Linux and WSL."
when: courseware_host_profile == "unsupported"
- name: Fail on unsupported Linux family
fail:
@@ -330,6 +288,5 @@
- name: Set runtime binary defaults
set_fact:
courseware_python_bin: >-
{{ '/opt/homebrew/opt/python@3.11/bin/python3.11' if ansible_system == 'Darwin' else '/usr/bin/python3' }}
courseware_python_bin: "/usr/bin/python3"
courseware_ollama_bin: "ollama"
@@ -0,0 +1,74 @@
- name: Create contained Python runtime manager virtual environment
command:
argv:
- /usr/bin/python3
- -m
- venv
- "{{ courseware_uv_dir }}"
args:
creates: "{{ courseware_uv_dir }}/bin/python"
- name: Upgrade contained Python runtime manager tooling
command:
argv:
- "{{ courseware_uv_dir }}/bin/python"
- -m
- pip
- install
- --upgrade
- pip
- setuptools
- wheel
- name: Install contained Python runtime manager
command:
argv:
- "{{ courseware_uv_dir }}/bin/python"
- -m
- pip
- install
- "{{ courseware_uv_spec }}"
- name: Install managed CPython runtime
command:
argv:
- "{{ courseware_uv_bin }}"
- python
- install
- "{{ courseware_python_runtime_version }}"
- --install-dir
- "{{ courseware_python_runtime_dir }}"
environment:
UV_PYTHON_INSTALL_DIR: "{{ courseware_python_runtime_dir }}"
UV_CACHE_DIR: "{{ courseware_uv_cache_dir }}"
XDG_CACHE_HOME: "{{ courseware_cache_dir }}"
TMPDIR: "{{ courseware_tmp_dir }}"
register: courseware_python_runtime_install
changed_when: "'Installed Python' in courseware_python_runtime_install.stdout"
- name: Resolve managed CPython runtime
command:
argv:
- "{{ courseware_uv_bin }}"
- python
- find
- "{{ courseware_python_runtime_version }}"
environment:
UV_PYTHON_INSTALL_DIR: "{{ courseware_python_runtime_dir }}"
UV_CACHE_DIR: "{{ courseware_uv_cache_dir }}"
XDG_CACHE_HOME: "{{ courseware_cache_dir }}"
TMPDIR: "{{ courseware_tmp_dir }}"
register: courseware_python_runtime_find
changed_when: false
- name: Set managed Python runtime for courseware venvs
set_fact:
courseware_python_bin: "{{ courseware_python_runtime_find.stdout | trim }}"
- name: Verify managed Python runtime version
command:
argv:
- "{{ courseware_python_bin }}"
- -c
- "import sys; expected=tuple(map(int, '{{ courseware_python_runtime_version }}'.split('.'))); raise SystemExit(0 if sys.version_info[:len(expected)] == expected else 1)"
changed_when: false
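The "Verify managed Python runtime version" task above compares only the leading components of `sys.version_info` against the pinned version string, so a pin of "3.12" matches any 3.12.x interpreter. A minimal standalone sketch of the same prefix check, run against the system `python3` with an example prefix of `3` (not the courseware's pinned version):

```shell
# Prefix-match the running interpreter version against an expected prefix.
expected="3"
if python3 -c "import sys; e=tuple(map(int, '$expected'.split('.'))); raise SystemExit(0 if sys.version_info[:len(e)] == e else 1)"; then
  result=match
else
  result=mismatch
fi
echo "$result"
```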
+132
View File
@@ -0,0 +1,132 @@
- name: Install terminal prerequisites
become: true
apt:
name:
- openssh-server
state: present
update_cache: true
- name: Ensure sshd drop-in directory exists
become: true
file:
path: /etc/ssh/sshd_config.d
state: directory
mode: "0755"
- name: Configure courseware loopback-only sshd policy
become: true
template:
src: sshd-courseware-terminal.conf.j2
dest: /etc/ssh/sshd_config.d/50-courseware-terminal.conf
mode: "0644"
register: courseware_terminal_sshd_config
- name: Ensure sshd runtime directory exists
become: true
file:
path: /run/sshd
state: directory
mode: "0755"
- name: Validate sshd configuration
become: true
command:
argv:
- /usr/sbin/sshd
- -t
- -f
- /etc/ssh/sshd_config
changed_when: false
- name: Start and enable sshd with systemd when available
become: true
systemd:
name: ssh
state: started
enabled: true
when: ansible_service_mgr == "systemd"
- name: Check systemd sshd listener policy
become: true
command: ss -ltn
register: courseware_terminal_systemd_ss_listeners
changed_when: false
when: ansible_service_mgr == "systemd"
- name: Restart sshd with systemd when listener policy is not active
become: true
systemd:
name: ssh
state: restarted
enabled: true
when:
- ansible_service_mgr == "systemd"
- >-
'0.0.0.0:22' not in courseware_terminal_systemd_ss_listeners.stdout
or '[::]:22' in courseware_terminal_systemd_ss_listeners.stdout
- name: Check for running sshd when systemd is unavailable
become: true
command: pgrep -x sshd
register: courseware_terminal_sshd_pid
changed_when: false
failed_when: false
when: ansible_service_mgr != "systemd"
- name: Reload running sshd when config changed outside systemd
become: true
command: pkill -HUP -x sshd
when:
- ansible_service_mgr != "systemd"
- courseware_terminal_sshd_pid.rc == 0
- courseware_terminal_sshd_config.changed
- name: Start sshd when it is not already running outside systemd
become: true
command:
argv:
- /usr/sbin/sshd
when:
- ansible_service_mgr != "systemd"
- courseware_terminal_sshd_pid.rc != 0
- name: Create contained WeTTY directory
file:
path: "{{ courseware_wetty_dir }}"
state: directory
mode: "0755"
- name: Install contained WeTTY runtime
command:
argv:
- npm
- install
- "{{ courseware_wetty_spec }}"
args:
chdir: "{{ courseware_wetty_dir }}"
creates: "{{ courseware_wetty_dir }}/node_modules/.bin/wetty"
environment:
PATH: "{{ courseware_node_runtime_bin_dir }}:{{ ansible_env.PATH }}"
- name: Check sshd listener
become: true
command: ss -ltn
register: courseware_terminal_ss_listeners
changed_when: false
- name: Assert sshd accepts LAN and loopback clients
assert:
that:
- "'0.0.0.0:22' in courseware_terminal_ss_listeners.stdout"
- "'[::]:22' not in courseware_terminal_ss_listeners.stdout"
fail_msg: "sshd must listen on 0.0.0.0:22 so VPN/LAN SSH clients and local WeTTY can connect."
- name: Assert WeTTY binary exists
stat:
path: "{{ courseware_wetty_dir }}/node_modules/.bin/wetty"
register: courseware_wetty_bin_stat
- name: Fail when WeTTY installation is incomplete
fail:
msg: "WeTTY was not installed under {{ courseware_wetty_dir }}."
when: not courseware_wetty_bin_stat.stat.exists
@@ -0,0 +1,11 @@
# Managed by Local Courseware Deployment.
ListenAddress 0.0.0.0
AddressFamily inet
PermitRootLogin no
PasswordAuthentication yes
KbdInteractiveAuthentication no
ChallengeResponseAuthentication no
UsePAM yes
AllowTcpForwarding no
X11Forwarding no
PrintMotd no
+2 -2
View File
@@ -145,7 +145,7 @@
- name: Determine Miniforge platform suffix
set_fact:
courseware_transformerlab_miniforge_platform: "{{ 'Linux-x86_64' if ansible_system == 'Linux' and ansible_architecture == 'x86_64' else 'Linux-aarch64' if ansible_system == 'Linux' and ansible_architecture in ['aarch64', 'arm64'] else 'MacOSX-arm64' if ansible_system == 'Darwin' and ansible_architecture == 'arm64' else 'MacOSX-x86_64' if ansible_system == 'Darwin' and ansible_architecture == 'x86_64' else 'unsupported' }}"
courseware_transformerlab_miniforge_platform: "{{ 'Linux-x86_64' if ansible_system == 'Linux' and ansible_architecture == 'x86_64' else 'Linux-aarch64' if ansible_system == 'Linux' and ansible_architecture in ['aarch64', 'arm64'] else 'unsupported' }}"
- name: Fail for unsupported Miniforge platform
fail:
@@ -210,7 +210,7 @@
elif [ "$accelerator" = "rocm" ]; then
wheel_args+=(--index https://download.pytorch.org/whl/rocm6.4 --index-strategy unsafe-best-match)
extra="rocm"
elif [ "{{ ansible_system }}" != "Darwin" ]; then
else
wheel_args+=(--index https://download.pytorch.org/whl/cpu --index-strategy unsafe-best-match)
fi
+19
View File
@@ -17,6 +17,10 @@
args:
executable: /bin/bash
creates: "{{ courseware_unsloth_home }}/.install_complete"
environment:
UV_CACHE_DIR: "{{ courseware_uv_cache_dir }}"
XDG_CACHE_HOME: "{{ courseware_cache_dir }}"
TMPDIR: "{{ courseware_tmp_dir }}"
rescue:
- name: Capture Unsloth installer log tail
shell: |
@@ -41,3 +45,18 @@
Last log lines:
{{ courseware_unsloth_install_log_tail.stdout | default('(no log output captured)') }}
- name: Install x86_64-compatible NumPy for Unsloth Studio
command:
argv:
- "{{ ansible_env.HOME }}/.unsloth/studio/unsloth_studio/bin/python"
- -m
- pip
- install
- "numpy<2"
environment:
UV_CACHE_DIR: "{{ courseware_uv_cache_dir }}"
XDG_CACHE_HOME: "{{ courseware_cache_dir }}"
TMPDIR: "{{ courseware_tmp_dir }}"
register: courseware_unsloth_numpy_install
changed_when: "'Successfully installed' in courseware_unsloth_numpy_install.stdout"
+8 -13
View File
@@ -1,5 +1,5 @@
diff --git a/src/app/labs/[slug]/page.tsx b/src/app/labs/[slug]/page.tsx
index f67308f..a6aac38 100644
index eb949ae..bb3d51c 100644
--- a/src/app/labs/[slug]/page.tsx
+++ b/src/app/labs/[slug]/page.tsx
@@ -462,6 +462,19 @@ function markdownToHtml(markdown: string) {
@@ -41,20 +41,15 @@ index f67308f..a6aac38 100644
return (
<main className="mx-auto w-full max-w-5xl px-6 py-10">
diff --git a/src/components/labs/LabContent.tsx b/src/components/labs/LabContent.tsx
index 7a7ce52..8778a23 100644
index 6addccf..afdd12f 100644
--- a/src/components/labs/LabContent.tsx
+++ b/src/components/labs/LabContent.tsx
@@ -277,7 +277,12 @@ export function LabContent({ className, html }: LabContentProps) {
>
<div className="lab-image-modal__surface" onClick={(event) => event.stopPropagation()}>
{/* eslint-disable-next-line @next/next/no-img-element */}
- <img className="lab-image-modal__image" src={zoomedImage.src} alt={zoomedImage.alt} />
+ <img
+ className="lab-image-modal__image"
+ src={zoomedImage.src}
+ alt={zoomedImage.alt}
@@ -346,6 +346,7 @@ export function LabContent({ className, html }: LabContentProps) {
<img
className="lab-image-modal__image"
src={zoomedImage.src}
alt={zoomedImage.alt}
+ referrerPolicy="no-referrer"
+ />
/>
</div>
</div>
) : null}
+18 -3
View File
@@ -2,7 +2,9 @@
git:
repo: "{{ courseware_wiki_repo }}"
dest: "{{ courseware_wiki_repo_dir }}"
update: false
update: "{{ courseware_wiki_force_update | default(false) | bool }}"
force: "{{ courseware_wiki_force_update | default(false) | bool }}"
register: courseware_wiki_repo_sync
- name: Check whether wiki referrer policy patch is already applied
command:
@@ -27,14 +29,27 @@
args:
chdir: "{{ courseware_wiki_repo_dir }}"
when: courseware_wiki_referrer_policy_patch.rc != 0
register: courseware_wiki_referrer_policy_apply
- name: Stat wiki Next dependency
stat:
path: "{{ courseware_wiki_repo_dir }}/node_modules/next/package.json"
register: courseware_wiki_next_dependency
- name: Install wiki dependencies with contained Node runtime
command: npm install
args:
chdir: "{{ courseware_wiki_repo_dir }}"
creates: "{{ courseware_wiki_repo_dir }}/node_modules/next/package.json"
environment:
PATH: "{{ courseware_node_runtime_bin_dir }}:{{ ansible_env.PATH }}"
when:
- not courseware_wiki_next_dependency.stat.exists or courseware_wiki_repo_sync.changed
- name: Render wiki runtime config
template:
src: courseware-runtime.json.j2
dest: "{{ courseware_wiki_runtime_config_path }}"
mode: "0644"
- name: Stat wiki build output
stat:
@@ -48,4 +63,4 @@
environment:
PATH: "{{ courseware_node_runtime_bin_dir }}:{{ ansible_env.PATH }}"
when:
- not courseware_wiki_build.stat.exists or courseware_wiki_referrer_policy_patch.rc != 0
- not courseware_wiki_build.stat.exists or courseware_wiki_repo_sync.changed or courseware_wiki_referrer_policy_patch.rc != 0
@@ -0,0 +1,13 @@
{
"lab1NetronUrl": "http://{{ courseware_url_host }}:{{ courseware_ports.netron }}",
"lab2OllamaUrl": "http://{{ courseware_url_host }}:{{ courseware_ports.ollama }}",
"lab2OllamaModels": [
{% for model in courseware_lab2_ollama_models %}
{
"label": "{{ model.label }}",
"value": "{{ model.value }}"
}{% if not loop.last %},{% endif %}
{% endfor %}
],
"lab3TerminalUrl": "http://{{ courseware_url_host }}:{{ courseware_ports.wetty }}{{ courseware_wetty_base_path }}"
}
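The template relies on `{% if not loop.last %},{% endif %}` to omit the trailing comma, which is what keeps the rendered runtime config valid JSON. A small sketch of the same comma-joining approach with hypothetical model entries, parsed back to confirm validity:

```shell
# Join model entries with commas (mirroring the loop.last guard), then parse
# the result to prove the rendering is valid JSON. Model values are examples.
doc=$(python3 - <<'PY'
import json
models = [
    {"label": "Gemma (example)", "value": "gemma:example"},
    {"label": "Llama (example)", "value": "llama:example"},
]
items = ",".join(json.dumps(m) for m in models)
rendered = '{"lab2OllamaModels": [' + items + ']}'
json.loads(rendered)  # raises if the comma handling broke the JSON
print("valid")
PY
)
echo "$doc"
```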
+8 -4
View File
@@ -30,14 +30,18 @@
ansible.builtin.import_role:
name: ollama
- name: Include Netron setup
ansible.builtin.import_role:
name: netron
- name: Include Lab 1 asset setup
ansible.builtin.import_role:
name: lab1_assets
- name: Include llama.cpp setup
ansible.builtin.import_role:
name: llama-cpp
- name: Include Transformer Lab setup
ansible.builtin.import_role:
name: transformerlab
- name: Include Unsloth Studio setup
ansible.builtin.import_role:
name: unsloth
+11 -8
View File
@@ -5,31 +5,34 @@ COURSEWARE_BIND_HOST="{{ courseware_bind_host }}"
COURSEWARE_URL_HOST="{{ courseware_url_host }}"
COURSEWARE_OLLAMA_PORT="{{ courseware_ports.ollama }}"
COURSEWARE_OPEN_WEBUI_PORT="{{ courseware_ports.open_webui }}"
COURSEWARE_TRANSFORMERLAB_PORT="{{ courseware_ports.transformerlab }}"
COURSEWARE_NETRON_PORT="{{ courseware_ports.netron }}"
COURSEWARE_CHUNKVIZ_PORT="{{ courseware_ports.chunkviz }}"
COURSEWARE_EMBEDDING_ATLAS_PORT="{{ courseware_ports.embedding_atlas }}"
COURSEWARE_UNSLOTH_PORT="{{ courseware_ports.unsloth }}"
COURSEWARE_PROMPTFOO_PORT="{{ courseware_ports.promptfoo }}"
COURSEWARE_WIKI_PORT="{{ courseware_ports.wiki }}"
COURSEWARE_WETTY_PORT="{{ courseware_ports.wetty }}"
COURSEWARE_OLLAMA_MIN_VERSION="{{ courseware_ollama_min_version }}"
OLLAMA_BIN="{{ courseware_ollama_bin }}"
OLLAMA_MODELS_DIR="{{ courseware_ollama_models_dir }}"
NODE_RUNTIME_BIN_DIR="{{ courseware_node_runtime_bin_dir }}"
NETRON_VENV="{{ courseware_netron_venv_dir }}"
WETTY_BIN="{{ courseware_wetty_dir }}/node_modules/.bin/wetty"
COURSEWARE_WETTY_BASE_PATH="{{ courseware_wetty_base_path }}"
OPEN_WEBUI_VENV="{{ courseware_venvs_dir }}/open-webui"
OPEN_WEBUI_DATA_DIR="{{ courseware_state_dir }}/open-webui"
CHUNKVIZ_DIR="{{ courseware_repos_dir }}/ChunkViz"
EMBEDDING_ATLAS_VENV="{{ courseware_venvs_dir }}/embedding-atlas"
TTPS_DATASET_PATH="{{ courseware_datasets_dir }}/ttps_dataset.parquet"
WIKI_TEST_RAW_PATH="{{ courseware_datasets_dir }}/wiki.test.raw"
TRANSFORMERLAB_DIR="{{ courseware_transformerlab_home }}"
TRANSFORMERLAB_DEFAULT_USER_EMAIL="{{ courseware_transformerlab_default_user_email }}"
TRANSFORMERLAB_DEFAULT_USER_PASSWORD="{{ courseware_transformerlab_default_user_password }}"
TRANSFORMERLAB_DEFAULT_USER_FIRST_NAME="{{ courseware_transformerlab_default_user_first_name }}"
TRANSFORMERLAB_DEFAULT_USER_LAST_NAME="{{ courseware_transformerlab_default_user_last_name }}"
COURSEWARE_OLLAMA_BASE_URL="http://{{ courseware_url_host }}:{{ courseware_ports.ollama }}"
COURSEWARE_LAB1_LLAMA_MODEL_PATH="{{ courseware_lab1_llama_local_path }}"
COURSEWARE_LAB1_OLLAMA_MODEL_ALIAS="{{ courseware_lab1_ollama_model_alias }}"
UNSLOTH_BIN="{{ ansible_env.HOME }}/.local/bin/unsloth"
PROMPTFOO_DIR="{{ courseware_promptfoo_dir }}"
PROMPTFOO_BIN="{{ courseware_tools_dir }}/promptfoo/node_modules/.bin/promptfoo"
WIKI_DIR="{{ courseware_wiki_repo_dir }}"
WIKI_RUNTIME_CONFIG_PATH="{{ courseware_wiki_runtime_config_path }}"
LLAMA_CPP_BIN_DIR="{{ courseware_llama_cpp_bin_dir }}"
KILN_LINUX_BIN="{{ courseware_apps_dir }}/kiln/Kiln"
KILN_MAC_APP="{{ courseware_apps_dir }}/Kiln.app"
KILN_LAUNCH_PATH="{% if ansible_system == 'Darwin' %}{{ courseware_apps_dir }}/Kiln.app{% else %}{{ courseware_apps_dir }}/kiln/Kiln{% endif %}"
KILN_LAUNCH_PATH="{{ courseware_apps_dir }}/kiln/Kiln"
+55 -24
View File
@@ -18,6 +18,8 @@ usage() {
Usage:
./labctl up
./labctl down
./labctl update_wiki
./labctl ollama_models
./labctl preflight
./labctl versions
./labctl assets lab2 [--refresh]
@@ -30,7 +32,7 @@ Usage:
EOF
}
transformerlab_version() {
netron_version() {
local version_file=$ROOT_DIR/ansible/group_vars/all.yml
if [ ! -f "$version_file" ]; then
@@ -38,21 +40,35 @@ transformerlab_version() {
return
fi
sed -nE 's/^courseware_transformerlab_version:[[:space:]]*"([^"]+)".*/\1/p' "$version_file" | head -n 1
sed -nE 's/^courseware_netron_version:[[:space:]]*"([^"]+)".*/\1/p' "$version_file" | head -n 1
}
minimum_ollama_version() {
local version_file=$ROOT_DIR/ansible/group_vars/all.yml
if [ ! -f "$version_file" ]; then
printf '%s\n' "unknown"
return
fi
sed -nE 's/^courseware_ollama_min_version:[[:space:]]*"([^"]+)".*/\1/p' "$version_file" | head -n 1
}
print_versions() {
cat <<EOF
Pinned component versions:
TransformerLab: $(transformerlab_version) (single-user pinned install)
Netron: $(netron_version)
Minimum Ollama: $(minimum_ollama_version)
Ansible Core: 2.18.6
EOF
}
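`netron_version` and `minimum_ollama_version` both scrape a quoted scalar out of `group_vars/all.yml` with `sed -nE`. The same extraction against a throwaway file (the version numbers here are only examples):

```shell
# Extract the quoted value of a pinned-version key, as labctl does.
tmp=$(mktemp)
cat >"$tmp" <<'YAML'
courseware_netron_version: "8.1.9"
courseware_ollama_min_version: "0.12.11"
YAML
netron=$(sed -nE 's/^courseware_netron_version:[[:space:]]*"([^"]+)".*/\1/p' "$tmp" | head -n 1)
rm -f "$tmp"
echo "$netron"
```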
confirm_installation() {
local response
local tlab_version
tlab_version=$(transformerlab_version)
local pinned_netron
local min_ollama
pinned_netron=$(netron_version)
min_ollama=$(minimum_ollama_version)
if [ ! -t 0 ]; then
cat <<EOF >&2
@@ -60,14 +76,16 @@ WARNING: THIS SCRIPT WILL CONFIGURE YOUR ENVIRONMENT WITH THE FOLLOWING SOFTWARE
- Ollama
- llama.cpp
- TransformerLab (single-user pinned to ${tlab_version})
- Netron (${pinned_netron})
- Open WebUI
- ChunkViz
- Embedding Atlas
- Promptfoo
- Unsloth Studio
- Kiln Desktop
- Course-specific support assets for lab 2 and lab 4
- Course-specific support assets for lab 1, lab 2, and lab 4
- Pre-pulled Gemma 4 E2B Ollama models for Lab 1 and Lab 2
- Lab 1 confidence support through Gemma 4 E2B Q4 (requires Ollama ${min_ollama}+)
IT IS RECOMMENDED TO RUN THIS IN AN ISOLATED ENVIRONMENT (Dedicated WSL, VM, etc.)
@@ -83,14 +101,16 @@ WARNING: THIS SCRIPT WILL CONFIGURE YOUR ENVIRONMENT WITH THE FOLLOWING SOFTWARE
- Ollama
- llama.cpp
- TransformerLab (single-user pinned to ${tlab_version})
- Netron (${pinned_netron})
- Open WebUI
- ChunkViz
- Embedding Atlas
- Promptfoo
- Unsloth Studio
- Kiln Desktop
- Course-specific support assets for lab 2 and lab 4
- Course-specific support assets for lab 1, lab 2, and lab 4
- Pre-pulled Gemma 4 E2B Ollama models for Lab 1 and Lab 2
- Lab 1 confidence support through Gemma 4 E2B Q4 (requires Ollama ${min_ollama}+)
IT IS RECOMMENDED TO RUN THIS IN AN ISOLATED ENVIRONMENT (Dedicated WSL, VM, etc.)
@@ -112,29 +132,16 @@ host_is_wsl() {
[ "$(uname -s)" = "Linux" ] && uname -r | grep -qiE 'microsoft|wsl'
}
host_is_macos() {
[ "$(uname -s)" = "Darwin" ]
}
host_is_linux() {
[ "$(uname -s)" = "Linux" ]
}
host_is_arm_mac() {
host_is_macos && [ "$(uname -m)" = "arm64" ]
}
host_profile() {
if host_is_wsl; then
printf '%s\n' "wsl"
return
fi
if host_is_macos; then
printf '%s\n' "macos"
return
fi
if host_is_linux && host_is_debian_family; then
printf '%s\n' "native-debian-ubuntu"
return
@@ -279,7 +286,6 @@ Python 3 was not found.
Install it first, then rerun this command:
- Debian/Ubuntu/WSL: sudo apt update && sudo apt install -y python3 python3-venv
- macOS: brew install python@3.11
EOF
exit 1
}
@@ -381,7 +387,6 @@ Python 3 is installed, but its virtual environment support is still unavailable.
Install the missing venv package for your platform, then rerun this command:
- Debian/Ubuntu/WSL: sudo apt update && sudo apt install -y python3-venv python3-pip
- macOS: brew reinstall python@3.11
EOF
exit 1
fi
@@ -505,6 +510,15 @@ require_arg() {
fi
}
require_managed_runtime() {
if [ ! -f "$ROOT_DIR/state/runtime.env" ]; then
cat <<'EOF' >&2
Missing state/runtime.env. Run ./labctl up first so the managed environment exists before using this command.
EOF
exit 1
fi
}
handle_assets_command() {
local asset_group=${1:-}
shift || true
@@ -520,6 +534,17 @@ handle_assets_command() {
esac
}
refresh_ollama_models() {
require_managed_runtime
run_playbook up.yml --tags ollama_models
}
update_wiki() {
require_managed_runtime
run_playbook up.yml --tags wiki -e "courseware_wiki_force_update=true"
run_project_script "$ROOT_DIR/scripts/service_manager.sh" restart-wiki
}
main() {
local cmd=${1:-}
shift || true
@@ -530,6 +555,12 @@ main() {
run_playbook up.yml
run_project_script "$ROOT_DIR/scripts/service_manager.sh" start all
;;
ollama_models)
refresh_ollama_models
;;
update_wiki)
update_wiki
;;
down)
run_project_script "$ROOT_DIR/scripts/service_manager.sh" stop all || true
run_playbook down.yml
+39 -14
View File
@@ -14,12 +14,25 @@ load_runtime_env() {
: "${COURSEWARE_STATE_DIR:=$STATE_DIR}"
: "${COURSEWARE_BIND_HOST:=127.0.0.1}"
: "${COURSEWARE_URL_HOST:=127.0.0.1}"
if [ -z "${COURSEWARE_URL_HOST:-}" ]; then
COURSEWARE_URL_HOST=$(ip route get 1.1.1.1 2>/dev/null | sed -nE 's/.* src ([0-9.]+).*/\1/p' | head -n 1)
: "${COURSEWARE_URL_HOST:=127.0.0.1}"
fi
: "${COURSEWARE_NETRON_PORT:=8338}"
: "${COURSEWARE_PROMPTFOO_PORT:=15500}"
: "${COURSEWARE_WIKI_PORT:=80}"
: "${COURSEWARE_WETTY_PORT:=7681}"
: "${COURSEWARE_OLLAMA_MIN_VERSION:=0.12.11}"
: "${COURSEWARE_WETTY_BASE_PATH:=/wetty}"
: "${NODE_RUNTIME_BIN_DIR:=$COURSEWARE_STATE_DIR/tools/node-runtime/node_modules/node/bin}"
: "${NETRON_VENV:=$COURSEWARE_STATE_DIR/venvs/netron}"
: "${WETTY_BIN:=$COURSEWARE_STATE_DIR/tools/wetty/node_modules/.bin/wetty}"
: "${PROMPTFOO_DIR:=$COURSEWARE_STATE_DIR/lab6}"
: "${WIKI_DIR:=$COURSEWARE_STATE_DIR/repos/LLM-Labs}"
: "${WIKI_RUNTIME_CONFIG_PATH:=$WIKI_DIR/public/courseware-runtime.json}"
: "${COURSEWARE_OLLAMA_BASE_URL:=http://$COURSEWARE_URL_HOST:$COURSEWARE_OLLAMA_PORT}"
: "${COURSEWARE_LAB1_LLAMA_MODEL_PATH:=$COURSEWARE_STATE_DIR/models/lab1/Llama-3.2-1B.Q4_K_M.gguf}"
: "${COURSEWARE_LAB1_OLLAMA_MODEL_ALIAS:=batiai/gemma4-e2b:q4}"
: "${LLAMA_CPP_BIN_DIR:=$COURSEWARE_STATE_DIR/repos/llama.cpp/build/bin}"
if [ -n "${OLLAMA_BIN:-}" ] && [[ "$OLLAMA_BIN" != */* ]] && command -v "$OLLAMA_BIN" >/dev/null 2>&1; then
@@ -38,12 +51,13 @@ service_list() {
printf '%s\n' \
"ollama" \
"open-webui" \
"transformerlab" \
"netron" \
"chunkviz" \
"embedding-atlas" \
"unsloth" \
"promptfoo" \
"wiki"
"wiki" \
"wetty"
}
service_pid_file() {
@@ -58,12 +72,13 @@ service_port() {
case "$1" in
ollama) printf '%s\n' "${COURSEWARE_OLLAMA_PORT}" ;;
open-webui) printf '%s\n' "${COURSEWARE_OPEN_WEBUI_PORT}" ;;
transformerlab) printf '%s\n' "${COURSEWARE_TRANSFORMERLAB_PORT}" ;;
netron) printf '%s\n' "${COURSEWARE_NETRON_PORT}" ;;
chunkviz) printf '%s\n' "${COURSEWARE_CHUNKVIZ_PORT}" ;;
embedding-atlas) printf '%s\n' "${COURSEWARE_EMBEDDING_ATLAS_PORT}" ;;
unsloth) printf '%s\n' "${COURSEWARE_UNSLOTH_PORT}" ;;
promptfoo) printf '%s\n' "${COURSEWARE_PROMPTFOO_PORT}" ;;
wiki) printf '%s\n' "${COURSEWARE_WIKI_PORT}" ;;
wetty) printf '%s\n' "${COURSEWARE_WETTY_PORT}" ;;
*) return 1 ;;
esac
}
@@ -72,12 +87,13 @@ service_url() {
case "$1" in
ollama) printf 'http://%s:%s\n' "$COURSEWARE_URL_HOST" "$COURSEWARE_OLLAMA_PORT" ;;
open-webui) printf 'http://%s:%s\n' "$COURSEWARE_URL_HOST" "$COURSEWARE_OPEN_WEBUI_PORT" ;;
transformerlab) printf 'http://%s:%s\n' "$COURSEWARE_URL_HOST" "$COURSEWARE_TRANSFORMERLAB_PORT" ;;
netron) printf 'http://%s:%s\n' "$COURSEWARE_URL_HOST" "$COURSEWARE_NETRON_PORT" ;;
chunkviz) printf 'http://%s:%s\n' "$COURSEWARE_URL_HOST" "$COURSEWARE_CHUNKVIZ_PORT" ;;
embedding-atlas) printf 'http://%s:%s\n' "$COURSEWARE_URL_HOST" "$COURSEWARE_EMBEDDING_ATLAS_PORT" ;;
unsloth) printf 'http://%s:%s\n' "$COURSEWARE_URL_HOST" "$COURSEWARE_UNSLOTH_PORT" ;;
promptfoo) printf 'http://%s:%s\n' "$COURSEWARE_URL_HOST" "$COURSEWARE_PROMPTFOO_PORT" ;;
wiki) printf 'http://%s:%s\n' "$COURSEWARE_URL_HOST" "$COURSEWARE_WIKI_PORT" ;;
wetty) printf 'http://%s:%s%s\n' "$COURSEWARE_URL_HOST" "$COURSEWARE_WETTY_PORT" "$COURSEWARE_WETTY_BASE_PATH" ;;
*) return 1 ;;
esac
}
@@ -100,14 +116,11 @@ service_command() {
"$COURSEWARE_BIND_HOST" \
"$COURSEWARE_OPEN_WEBUI_PORT"
;;
transformerlab)
printf 'export PATH="%s/envs/transformerlab/bin:$PATH"; export VIRTUAL_ENV="%s/envs/transformerlab"; export CONDA_PREFIX="%s/envs/transformerlab"; cd "%s/src" && exec ./run.sh -c -h %s -p %s' \
"$TRANSFORMERLAB_DIR" \
"$TRANSFORMERLAB_DIR" \
"$TRANSFORMERLAB_DIR" \
"$TRANSFORMERLAB_DIR" \
netron)
printf 'exec "%s/bin/netron" --host %s --port %s --verbosity quiet' \
"$NETRON_VENV" \
"$COURSEWARE_BIND_HOST" \
"$COURSEWARE_TRANSFORMERLAB_PORT"
"$COURSEWARE_NETRON_PORT"
;;
chunkviz)
printf 'cd "%s" && PATH="%s:$PATH" exec "./node_modules/.bin/serve" build -s -n -L -l tcp://%s:%s' \
@@ -117,7 +130,7 @@ service_command() {
"$COURSEWARE_CHUNKVIZ_PORT"
;;
embedding-atlas)
printf 'exec "%s/bin/embedding-atlas" "%s" --text "Scenario" --host %s --port %s' \
printf 'exec "%s/bin/embedding-atlas" "%s" --text "Scenario" --host %s --port %s --no-auto-port' \
"$EMBEDDING_ATLAS_VENV" \
"$TTPS_DATASET_PATH" \
"$COURSEWARE_BIND_HOST" \
@@ -138,12 +151,24 @@ service_command() {
"$COURSEWARE_PROMPTFOO_PORT"
;;
wiki)
printf 'cd "%s" && PATH="%s:$PATH" exec "./node_modules/.bin/next" start --hostname %s --port %s' \
printf 'cd "%s" && PATH="%s:$PATH" exec env COURSEWARE_OLLAMA_BASE_URL="%s" COURSEWARE_LAB1_LLAMA_MODEL_PATH="%s" COURSEWARE_LAB1_OLLAMA_MODEL_ALIAS="%s" "./node_modules/.bin/next" start --hostname %s --port %s' \
"$WIKI_DIR" \
"$NODE_RUNTIME_BIN_DIR" \
"$COURSEWARE_OLLAMA_BASE_URL" \
"$COURSEWARE_LAB1_LLAMA_MODEL_PATH" \
"$COURSEWARE_LAB1_OLLAMA_MODEL_ALIAS" \
"$COURSEWARE_BIND_HOST" \
"$COURSEWARE_WIKI_PORT"
;;
wetty)
printf 'cd "%s" && PATH="%s:$PATH" exec "%s" --host %s --port %s --base %s --allow-iframe --ssh-host 127.0.0.1 --ssh-port 22 --ssh-auth password' \
"$COURSEWARE_ROOT" \
"$NODE_RUNTIME_BIN_DIR" \
"$WETTY_BIN" \
"$COURSEWARE_BIND_HOST" \
"$COURSEWARE_WETTY_PORT" \
"$COURSEWARE_WETTY_BASE_PATH"
;;
*)
return 1
;;
+180 -92
View File
@@ -9,23 +9,66 @@ load_runtime_env
mkdir -p "$STATE_DIR/run" "$STATE_DIR/logs"
ensure_transformerlab_default_user() {
local helper_python="${TRANSFORMERLAB_DIR}/envs/transformerlab/bin/python"
check_wetty_prereqs() {
if [ ! -x "$WETTY_BIN" ]; then
echo "Missing WeTTY binary at $WETTY_BIN. Re-run ./labctl up." >&2
exit 1
fi
if [ ! -x "$helper_python" ]; then
if [ ! -f "$WIKI_RUNTIME_CONFIG_PATH" ]; then
echo "Missing wiki runtime config at $WIKI_RUNTIME_CONFIG_PATH. Re-run ./labctl up." >&2
exit 1
fi
if ! python3 - <<'PY'
import socket, sys
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.settimeout(1)
try:
sock.connect(("127.0.0.1", 22))
except OSError:
sys.exit(1)
finally:
sock.close()
PY
then
echo "Loopback sshd is not reachable on 127.0.0.1:22." >&2
exit 1
fi
}
ollama_version_gte_minimum() {
local version_output
local installed_version
if ! command -v "$OLLAMA_BIN" >/dev/null 2>&1; then
return 1
fi
version_output=$("$OLLAMA_BIN" --version 2>/dev/null || true)
installed_version=$(printf '%s' "$version_output" | grep -oE '[0-9]+\.[0-9]+\.[0-9]+' | head -n 1)
if [ -z "$installed_version" ]; then
return 1
fi
[ "$(printf '%s\n' "$COURSEWARE_OLLAMA_MIN_VERSION" "$installed_version" | sort -V | head -n 1)" = "$COURSEWARE_OLLAMA_MIN_VERSION" ]
}
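`ollama_version_gte_minimum` leans on `sort -V`: if the minimum version sorts first (or ties), the installed version satisfies it, and version sort handles multi-digit components that plain lexical sort would misorder. The comparison in isolation, with example version numbers:

```shell
# True when $2 is at least $1 under version ordering.
version_gte() {
  [ "$(printf '%s\n' "$1" "$2" | sort -V | head -n 1)" = "$1" ]
}
version_gte 0.12.11 0.21.1 && a=yes || a=no   # 0.21.1 is newer than 0.12.11
version_gte 0.12.11 0.9.0  && b=yes || b=no   # 0.9.0 is older (9 < 12 under -V)
echo "$a $b"
```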
assert_ollama_logprobs_support() {
if ollama_version_gte_minimum; then
return 0
fi
if [ -z "${TRANSFORMERLAB_DEFAULT_USER_EMAIL:-}" ] || [ -z "${TRANSFORMERLAB_DEFAULT_USER_PASSWORD:-}" ]; then
return 0
fi
local version_output
version_output=$("$OLLAMA_BIN" --version 2>/dev/null || printf 'unknown')
"$helper_python" "$SCRIPT_DIR/ensure_transformerlab_user.py" \
--transformerlab-dir "$TRANSFORMERLAB_DIR" \
--email "$TRANSFORMERLAB_DEFAULT_USER_EMAIL" \
--password "$TRANSFORMERLAB_DEFAULT_USER_PASSWORD" \
--first-name "${TRANSFORMERLAB_DEFAULT_USER_FIRST_NAME:-Student}" \
--last-name "${TRANSFORMERLAB_DEFAULT_USER_LAST_NAME:-}" >>"$STATE_DIR/logs/transformerlab_default_user.log" 2>&1 || true
cat <<EOF >&2
Lab 1 requires Ollama ${COURSEWARE_OLLAMA_MIN_VERSION} or newer because the confidence visualizer depends on logprobs.
Installed version: ${version_output}
Re-run ./labctl up after upgrading Ollama.
EOF
exit 1
}
resolve_targets() {
@@ -69,6 +112,19 @@ is_running() {
has_live_pid "$service" || service_ready "$service"
}
service_startup_attempts() {
case "$1" in
embedding-atlas)
# First launch embeds the bundled dataset. On older GPU drivers this falls
# back to CPU and can take close to an hour.
printf '%s\n' 3600
;;
*)
printf '%s\n' 60
;;
esac
}
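`service_startup_attempts` feeds a per-service attempt budget into the readiness poll (a generous 3600 for embedding-atlas, 60 for everything else). The polling pattern reduced to a sketch, with stand-in probe commands instead of real readiness checks:

```shell
# Poll a probe command up to $1 times, pausing briefly between attempts.
wait_for() {
  local attempts=$1; shift
  local i
  for i in $(seq 1 "$attempts"); do
    if "$@"; then
      return 0
    fi
    sleep 0.1
  done
  return 1
}
wait_for 3 true  && ok=ready  || ok=timeout
wait_for 2 false && bad=ready || bad=timeout
echo "$ok $bad"
```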
service_ready() {
local service=$1
@@ -76,13 +132,10 @@ service_ready() {
ollama)
curl -fsS "$(service_url "$service")/api/tags" >/dev/null 2>&1
;;
transformerlab)
curl -fsS "$(service_url "$service")/healthz" >/dev/null 2>&1
;;
promptfoo)
curl -fsS "$(service_url "$service")/health" >/dev/null 2>&1
;;
open-webui|chunkviz|embedding-atlas|unsloth|wiki)
open-webui|netron|chunkviz|embedding-atlas|unsloth|wiki|wetty)
curl -fsS "$(service_url "$service")" >/dev/null 2>&1
;;
*)
@@ -102,6 +155,22 @@ service_listener_pids() {
| sort -u
}
service_port_has_listener() {
local service=$1
local port
port=$(service_port "$service") || return 1
ss -ltnH "( sport = :$port )" 2>/dev/null | grep -q .
}
service_listener_details() {
local service=$1
local port
port=$(service_port "$service") || return 0
ss -ltnp "( sport = :$port )" 2>/dev/null || true
}
kill_pid_tree() {
local signal=$1
local pid=$2
@@ -130,77 +199,16 @@ terminate_service_processes() {
done < <(service_listener_pids "$service")
}
start_one() {
wait_for_service_ready() {
local service=$1
local cmd
local log_file
local pid_file
local log_file=$2
local pid_file=$3
local startup_attempts=$4
local pid_grace_attempts=$5
local attempt
local pid_grace_attempts=5
if has_live_pid "$service"; then
if [ "$service" = "transformerlab" ]; then
ensure_transformerlab_default_user
fi
echo "$service already running"
return 0
fi
if service_ready "$service"; then
if [ "$service" = "transformerlab" ]; then
ensure_transformerlab_default_user
fi
echo "$service already available"
return 0
fi
case "$service" in
open-webui)
start_one ollama
;;
transformerlab)
if command -v python3 >/dev/null 2>&1; then
python3 "$SCRIPT_DIR/repair_transformerlab_plugin_supports.py" \
--transformerlab-dir "$TRANSFORMERLAB_DIR" \
--plugin "fastchat_server" \
--required-support "chat" \
--required-support "completion" \
--required-support "visualize_model" \
--required-support "model_layers" \
--required-support "rag" \
--required-support "tools" \
--required-support "template" \
--required-support "embeddings" \
--required-support "tokenize" \
--required-support "logprobs" \
--required-support "batched" >>"$STATE_DIR/logs/transformerlab_plugin_supports.log" 2>&1 || true
fi
;;
*)
;;
esac
cmd=$(service_command "$service")
log_file=$(service_log_file "$service")
pid_file=$(service_pid_file "$service")
if [ "$service" = "ollama" ]; then
env \
OLLAMA_HOST="${COURSEWARE_BIND_HOST}:${COURSEWARE_OLLAMA_PORT}" \
OLLAMA_MODELS="$OLLAMA_MODELS_DIR" \
"$OLLAMA_BIN" serve </dev/null >>"$log_file" 2>&1 &
elif command -v setsid >/dev/null 2>&1; then
nohup setsid bash -lc "$cmd" </dev/null >>"$log_file" 2>&1 &
else
nohup bash -lc "$cmd" </dev/null >>"$log_file" 2>&1 &
fi
echo $! >"$pid_file"
for attempt in $(seq 1 60); do
for attempt in $(seq 1 "$startup_attempts"); do
if service_ready "$service"; then
if [ "$service" = "transformerlab" ]; then
ensure_transformerlab_default_user
fi
echo "started $service"
return 0
fi
@@ -220,6 +228,68 @@ start_one() {
exit 1
}
start_one() {
local service=$1
local cmd
local log_file
local pid_file
local pid_grace_attempts=5
local startup_attempts
if [ "$service" = "ollama" ] || [ "$service" = "wiki" ]; then
assert_ollama_logprobs_support
fi
startup_attempts=$(service_startup_attempts "$service")
log_file=$(service_log_file "$service")
pid_file=$(service_pid_file "$service")
if service_ready "$service"; then
echo "$service already available"
return 0
fi
if has_live_pid "$service"; then
echo "$service already starting"
wait_for_service_ready "$service" "$log_file" "$pid_file" "$startup_attempts" "$pid_grace_attempts"
return 0
fi
case "$service" in
open-webui)
start_one ollama
;;
wetty)
check_wetty_prereqs
;;
*)
;;
esac
cmd=$(service_command "$service")
if [ "$service" = "ollama" ]; then
if command -v setsid >/dev/null 2>&1; then
nohup setsid env \
OLLAMA_HOST="${COURSEWARE_BIND_HOST}:${COURSEWARE_OLLAMA_PORT}" \
OLLAMA_MODELS="$OLLAMA_MODELS_DIR" \
"$OLLAMA_BIN" serve </dev/null >>"$log_file" 2>&1 &
else
nohup env \
OLLAMA_HOST="${COURSEWARE_BIND_HOST}:${COURSEWARE_OLLAMA_PORT}" \
OLLAMA_MODELS="$OLLAMA_MODELS_DIR" \
"$OLLAMA_BIN" serve </dev/null >>"$log_file" 2>&1 &
fi
elif command -v setsid >/dev/null 2>&1; then
nohup setsid bash -lc "$cmd" </dev/null >>"$log_file" 2>&1 &
else
nohup bash -lc "$cmd" </dev/null >>"$log_file" 2>&1 &
fi
echo $! >"$pid_file"
wait_for_service_ready "$service" "$log_file" "$pid_file" "$startup_attempts" "$pid_grace_attempts"
}
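`start_one` detaches every service with `nohup`, preferring `setsid` when available so the child gets its own session and survives the launching shell. The same fallback pattern as a standalone helper (the log path and command here are examples):

```shell
# Launch a command detached from the current session, appending output to a log.
launch_detached() {
  local log=$1; shift
  if command -v setsid >/dev/null 2>&1; then
    nohup setsid "$@" </dev/null >>"$log" 2>&1 &
  else
    nohup "$@" </dev/null >>"$log" 2>&1 &
  fi
}
log=$(mktemp)
launch_detached "$log" echo detached-ok
sleep 0.5
cat "$log"
```

Because stdin is `/dev/null` and both streams are redirected, `nohup` stays silent and the log contains only the command's own output.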
stop_one() {
local service=$1
local pid_file
@@ -266,6 +336,28 @@ stop_one() {
exit 1
}
restart_managed_wiki() {
local wiki_log_file
wiki_log_file=$(service_log_file wiki)
if has_live_pid wiki; then
stop_one wiki
fi
if service_port_has_listener wiki; then
cat <<EOF >&2
Cannot restart wiki because port $(service_port wiki) is already in use by a non-managed listener.
Listener details:
$(service_listener_details wiki)
Stop that process or move it off port $(service_port wiki), then rerun ./labctl update_wiki.
Wiki log: $wiki_log_file
EOF
exit 1
fi
start_one wiki
}
status_one() {
local service=$1
@@ -282,26 +374,19 @@ urls() {
cat <<EOF
Ollama API: $(service_url ollama)
Open WebUI: $(service_url open-webui)
TransformerLab: $(service_url transformerlab)
Netron: $(service_url netron)
ChunkViz: $(service_url chunkviz)
Embedding Atlas: $(service_url embedding-atlas)
Unsloth Studio: $(service_url unsloth)
Promptfoo CLI: $PROMPTFOO_BIN
Promptfoo UI: $(service_url promptfoo)
Wiki: $(service_url wiki)
Lab 3 Terminal: $(service_url wetty)
Kiln app: ${KILN_LAUNCH_PATH:-not installed}
EOF
}
open_kiln() {
local host_os
host_os=$(uname -s)
if [ "$host_os" = "Darwin" ] && [ -d "$KILN_MAC_APP" ]; then
open "$KILN_MAC_APP"
return 0
fi
if [ -x "$KILN_LINUX_BIN" ]; then
nohup "$KILN_LINUX_BIN" >/dev/null 2>&1 &
echo "started Kiln from $KILN_LINUX_BIN"
@@ -367,6 +452,9 @@ main() {
fi
show_logs "$1"
;;
restart-wiki)
restart_managed_wiki
;;
*)
echo "Unknown command: $cmd" >&2
exit 1