Compare commits (1 commit)

Commit: `d2b0991f0d`
@@ -4,48 +4,32 @@ This project builds a student-friendly local lab environment for the courseware
 - `./deploy-courseware.sh` installs and configures the environment, then starts every managed service.
 - `./destroy-courseware.sh` stops the managed services, uninstalls courseware-managed Ollama, and removes the project-owned lab state.
-- `./labctl` provides day-two controls such as `assets lab2`, `ollama_models`, `update_wiki`, `start`, `stop`, `status`, `urls`, and `logs`.
+- `./labctl` provides day-two controls such as `assets lab2`, `start`, `stop`, `status`, `urls`, `logs`, and `open kiln`.

 ## What It Installs

 - Ollama
 - `llama.cpp`
-- Netron, served locally on port `8338`
+- TransformerLab, pinned to the classic single-user `v0.28.2` release
 - Open WebUI
 - ChunkViz
 - Embedding Atlas
 - Promptfoo
 - Unsloth Studio
-- Course-specific support assets for lab 1, lab 2, and lab 4
-
-## Lab 1 Defaults
-
-Lab 1 is now provisioned directly by the installer:
-
-- The `Llama-3.2-1B.Q4_K_M.gguf` file is mirrored into `state/models/lab1/`.
-- The Lab 1 confidence widget uses the pre-pulled Gemma 4 E2B Q4 Ollama model, `batiai/gemma4-e2b:q4`.
-- The wiki serves a same-host download link for the Llama GGUF through `/api/lab1/models/...`.
-- Lab 1 confidence visualization requires Ollama `0.12.11` or newer because it depends on logprobs.
-
-## Lab 2 Defaults
-
-`./labctl up` now pre-pulls the Gemma 4 E2B Ollama variants used by the wiki widgets:
-
-- `gemma4:e2b-it-q8_0`
-- `batiai/gemma4-e2b:q4`
-- `batiai/gemma4-e2b:q6`
-
-If you want to re-pull just those managed Ollama models later, run `./labctl ollama_models`.
+- Kiln Desktop
+- Course-specific support assets for lab 2 and lab 4

 ## Supported Host Profiles

-This build is the Linux/WSL variant of LLM Labs Local. If you are deploying on Apple Silicon macOS, use the sibling `LLM-Labs-Local-Mac` project instead.
+This build intentionally avoids the reference VM's hardware workarounds.

+- macOS: Apple Silicon only, with at least 16 GB unified memory.
 - Native Debian/Ubuntu: Debian-family Linux with an NVIDIA GPU visible to `nvidia-smi` and at least 8 GB VRAM.
 - WSL: Debian/Ubuntu-family Linux running under WSL, with the NVIDIA GPU exposed into the distro.

-The launcher and Ansible preflight classify the host dynamically and apply different setup behavior for:
+The launcher and Ansible preflight now classify the host dynamically and apply different setup behavior for:

+- `macos`
 - `native-debian-ubuntu`
 - `wsl`
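The day-two controls listed above suggest a simple subcommand dispatcher. A minimal sketch of how a `labctl`-style router could work is shown below; the real `./labctl` is not part of this diff, so the function name and echoed messages are purely illustrative.

```shell
#!/usr/bin/env bash
# Hypothetical sketch of routing the documented labctl subcommands.
# This is not the project's actual ./labctl implementation.
labctl_route() {
  case "$1" in
    start|stop|status|urls|logs) echo "service-control: $*" ;;
    assets)                      echo "asset-sync: ${2:-all}" ;;
    open)                        echo "desktop-app: ${2:?app name required}" ;;
    *)                           echo "unknown subcommand: $1" >&2; return 1 ;;
  esac
}

labctl_route open kiln      # routes to the desktop-app branch
labctl_route assets lab2    # routes to the asset-sync branch
```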
@@ -67,7 +51,6 @@ On Linux and WSL, the first `./labctl up` or `./labctl preflight` run may prompt
 On Ubuntu WSL x86_64, preflight now installs the Linux-side CUDA toolkit automatically if it is missing.

 It first tries the distro package:

 - `sudo apt install -y nvidia-cuda-toolkit`

 If that package is unavailable or still does not expose `nvcc`, the installer falls back to NVIDIA's WSL-Ubuntu repository bootstrap for the toolkit only, not a Linux GPU driver.
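The fallback chain described in this hunk (reuse an existing `nvcc`, then the distro package, then NVIDIA's repo bootstrap) can be sketched as a small preflight check. This is an illustrative dry-run, assuming the standard `/usr/local/cuda*/bin/nvcc` locations mentioned later in the diff; the fallback commands are only echoed, not executed.

```shell
#!/usr/bin/env bash
# Sketch of the CUDA-toolkit preflight decision; not the installer's own code.
find_nvcc() {
  # 1. nvcc already on PATH wins.
  command -v nvcc && return 0
  # 2. Otherwise probe the standard install locations.
  local candidate
  for candidate in /usr/local/cuda/bin/nvcc /usr/local/cuda-*/bin/nvcc; do
    [ -x "$candidate" ] && { echo "$candidate"; return 0; }
  done
  return 1
}

if find_nvcc >/dev/null; then
  echo "cuda-toolkit: already usable, reusing existing install"
else
  echo "would run: sudo apt install -y nvidia-cuda-toolkit"
  echo "fallback if nvcc still missing: NVIDIA WSL-Ubuntu repo bootstrap (toolkit only)"
fi
```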
@@ -81,28 +64,31 @@ For non-Ubuntu WSL distros, install the CUDA toolkit manually before running the
 ## Native Debian/Ubuntu CUDA Behavior

-On native Debian/Ubuntu hosts, the installer handles three CUDA-toolkit cases:
+On native Debian/Ubuntu hosts, the installer now handles three CUDA-toolkit cases:

 - If the toolkit is already usable, it reuses the existing install instead of forcing a reinstall.
 - If the distro exposes `nvidia-cuda-toolkit`, it installs that package.
 - If the distro package is unavailable, it bootstraps NVIDIA's official CUDA network repository for supported native Debian/Ubuntu releases and installs the toolkit from there.

-If `apt` starts in a broken dependency state, the installer attempts `dpkg --configure -a` and `apt-get --fix-broken install` before retrying package installation.
+If `apt` starts in a broken dependency state, the installer now attempts `dpkg --configure -a` and `apt-get --fix-broken install` before retrying package installation.

-If CUDA is already mounted or preinstalled outside `PATH`, the installer detects standard locations such as `/usr/local/cuda/bin/nvcc` and `/usr/local/cuda-*/bin/nvcc`.
+If CUDA is already mounted or preinstalled outside `PATH`, the installer now detects standard locations such as `/usr/local/cuda/bin/nvcc` and `/usr/local/cuda-*/bin/nvcc`.

 ## Standard Assumptions

-- The default deployment is centered on Ollama-backed local inference and browser-based tools such as Netron and the wiki.
-- Netron is installed into a managed Python virtual environment and served locally instead of being provisioned as a desktop package.
-- Lab 1's Llama GGUF download is mirrored locally during `./labctl up`, so students do not have to fetch it manually from the original source.
-- WhiteRabbitNeo assets remain a separate Lab 2 flow and are still handled outside the default `./labctl up` run.
-- Run `./labctl assets lab2` when you want to populate repo-local Lab 2 assets in `assets/lab2/` from Hugging Face.
+- The host-side install path assumes modern local tooling, but TransformerLab itself is provisioned from a pinned classic single-user layout.
+- TransformerLab is intentionally pinned to the older single-user `v0.28.2` release because newer upstream releases changed the project structure and behavior in ways that break this courseware.
+- This project does not rely on TransformerLab's upstream `install.sh`; the Ansible role provisions the pinned release directly so web assets, env layout, and runtime behavior stay reproducible.
+- The courseware repairs the pinned TransformerLab install for symlink-aware plugin file lookups and refreshes installed Fastchat plugin manifests so Fastchat-gated features such as Model Architecture, activations, and Visualize Logprobs stay available on pinned installs.
+- The managed default TransformerLab student account is also seeded with the courseware Fastchat plugin plus the starter experiments and model metadata that `labctl up` depends on.
+- No Ollama models are pulled during `./labctl up`; students pull models manually as part of the courseware.
+- WhiteRabbitNeo assets are handled separately from `./labctl up` and `./labctl preflight`.
+- Run `./labctl assets lab2` when you want to populate repo-local lab 2 assets in `assets/lab2/` from Hugging Face.
 - After base setup, run `state/lab2/download_whiterabbitneo-gguf.sh` to fetch only the `Q4_K_M`, `Q8_0`, and `IQ2_M` files from `bartowski/WhiteRabbitNeo_WhiteRabbitNeo-V3-7B-GGUF` and register local Ollama models `WhiteRabbitNeo`, `WhiteRabbitNeo-Q4`, `WhiteRabbitNeo-Q8`, and `WhiteRabbitNeo-IQ2`.
-- Unsloth homes are redirected into this project's `state/` tree via symlinks.
+- TransformerLab and Unsloth homes are redirected into this project's `state/` tree via symlinks.
 - Managed web services bind for access from both Linux and the Windows side of WSL, while `labctl urls` still reports localhost-friendly URLs.
 - The local Ansible bootstrap in `.venv-ansible/` is machine-specific and will be recreated automatically if the folder is copied between hosts.
-- `llama.cpp` uses a conservative, memory-aware build parallelism setting instead of an unbounded `-j` build, which avoids OOM failures on smaller Linux and WSL hosts.
+- `llama.cpp` now uses a conservative, memory-aware build parallelism setting instead of an unbounded `-j` build, which avoids OOM failures on smaller Linux and WSL hosts.

 ## Lab URLs
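The memory-aware build parallelism mentioned in the assumptions above can be sketched as: cap the `llama.cpp` build's job count by available RAM rather than using an unbounded `-j`. The 2 GiB-per-job budget and the 8 GiB fallback below are illustrative assumptions, not the project's exact policy.

```shell
#!/usr/bin/env bash
# Sketch of a memory-aware job count for the llama.cpp build.
build_jobs() {
  local mem_kb cpus mem_jobs
  mem_kb=$(awk '/^MemTotal:/ {print $2}' /proc/meminfo 2>/dev/null)
  [ -n "$mem_kb" ] || mem_kb=8388608          # assume 8 GiB when unknown
  cpus=$(nproc 2>/dev/null || echo 4)
  mem_jobs=$(( mem_kb / (2 * 1024 * 1024) ))  # budget ~2 GiB of RAM per job
  [ "$mem_jobs" -lt 1 ] && mem_jobs=1
  # Use the smaller of the CPU count and the RAM-derived limit.
  if [ "$mem_jobs" -lt "$cpus" ]; then echo "$mem_jobs"; else echo "$cpus"; fi
}

echo "cmake --build build -j $(build_jobs)"
```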
@@ -112,34 +98,25 @@ Default endpoints:
 - Ollama API: `http://127.0.0.1:11434`
 - Open WebUI: `http://127.0.0.1:8080`
-- Netron: `http://127.0.0.1:8338`
+- TransformerLab: `http://127.0.0.1:8338`
 - ChunkViz: `http://127.0.0.1:3001`
 - Embedding Atlas: `http://127.0.0.1:5055`
 - Unsloth Studio: `http://127.0.0.1:8888`
 - Promptfoo UI: `http://127.0.0.1:15500`
 - Wiki: `http://127.0.0.1:80`
-- Lab 3 Terminal: `http://127.0.0.1:7681/wetty`
-
-## Lab 3 Browser Terminal
-
-The deployment will:
-
-- leave the host's SSH listen addresses under local control while requiring `127.0.0.1:22` for WeTTY
-- install WeTTY and expose it at `http://127.0.0.1:7681/wetty`
-- leave login identity management to the host, so any existing local account with password-based SSH access can sign in through the browser terminal

 ## Notes

 - `./labctl up` installs the environment and then starts every managed service.
-- `./labctl versions` shows the pinned Netron version, minimum Ollama version, and Ansible runtime version used by this workspace.
+- `./labctl versions` shows the pinned TransformerLab and Ansible runtime versions used by this workspace.
 - `./labctl assets lab2` is a separate manual step that clones the base WhiteRabbitNeo repo into `assets/lab2/WhiteRabbitNeo-V3-7B` and downloads the supported `Q4_K_M`, `Q8_0`, and `IQ2_M` GGUFs into `assets/lab2/WhiteRabbitNeo_WhiteRabbitNeo-V3-7B-GGUF`.
-- `./labctl ollama_models` re-pulls the managed Lab 2 Gemma 4 E2B Ollama model set without rerunning the full installer.
-- `./labctl update_wiki` hard-resets the managed wiki checkout to the remote latest, rebuilds it, and restarts only the managed wiki service on port `80`.
+- TransformerLab is installed as a pinned single-user app and no default courseware-managed TransformerLab user is created automatically.
 - `./labctl start core` starts only `ollama` and `open-webui`.
 - `./labctl start all` starts every managed web service.
+- `./labctl open kiln` launches the Kiln desktop app installed into the project state.
 - The scripted Promptfoo install drops a starter config at `state/lab6/promptfoo.yaml`.
-- `labctl start all` includes Promptfoo via `promptfoo view` and the cloned wiki app.
+- `labctl start all` now includes Promptfoo via `promptfoo view` and the cloned wiki app.
 - Lab 2 includes `state/lab2/download_whiterabbitneo-gguf.sh`, which uses `git` + `git lfs` to pull only the supported WhiteRabbitNeo quants. Add `--download-only` if you want the files without Ollama registration.
 - The wiki is cloned from `https://git.zuccaro.me/bzuccaro/LLM-Labs.git` into `state/repos/LLM-Labs` and started with `npm`.
-- `./labctl down` uninstalls Ollama entirely when this project installed it, instead of only stopping the service.
-- This variant is intended for NVIDIA-backed Linux/WSL training and lab workflows.
+- `./labctl down` now uninstalls Ollama entirely when this project installed it, instead of only stopping the service.
+- Unsloth Studio currently supports chat and data workflows on macOS; Linux/WSL remains the standard path for NVIDIA-backed training.
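The notes above say the Lab 2 script uses `git` + `git lfs` to pull only the supported quants. One way that "only these quants" selection could be composed is with a single `git lfs` `--include` pattern; the sketch below only echoes the commands it would run, and the exact flags used by the real `download_whiterabbitneo-gguf.sh` are an assumption.

```shell
#!/usr/bin/env bash
# Dry-run sketch of a quant-limited git-lfs fetch; not the project's script.
repo="bartowski/WhiteRabbitNeo_WhiteRabbitNeo-V3-7B-GGUF"

lfs_include() {
  # Join quant names into one comma-separated --include glob list.
  out=""
  for q in "$@"; do
    out="${out:+$out,}*${q}*.gguf"
  done
  echo "$out"
}

echo "GIT_LFS_SKIP_SMUDGE=1 git clone https://huggingface.co/$repo"
echo "git lfs pull --include \"$(lfs_include Q4_K_M Q8_0 IQ2_M)\""
```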

(next file: +37, -29)

@@ -7,23 +7,19 @@ courseware_venvs_dir: "{{ courseware_state_dir }}/venvs"
 courseware_models_dir: "{{ courseware_state_dir }}/models"
 courseware_datasets_dir: "{{ courseware_state_dir }}/datasets"
 courseware_tools_dir: "{{ courseware_state_dir }}/tools"
+courseware_apps_dir: "{{ courseware_state_dir }}/apps"
 courseware_downloads_dir: "{{ courseware_state_dir }}/downloads"
-courseware_lab1_dir: "{{ courseware_state_dir }}/lab1"
 courseware_lab2_dir: "{{ courseware_state_dir }}/lab2"
 courseware_lab6_dir: "{{ courseware_state_dir }}/lab6"
+courseware_transformerlab_legacy_home: "{{ courseware_state_dir }}/transformerlab-home"
+courseware_safe_homes_dir: "{{ lookup('env', 'HOME') }}/.local/share/local-lab-deployment"
+courseware_transformerlab_home: "{{ (courseware_safe_homes_dir ~ '/transformerlab-home') if ' ' in courseware_root else courseware_transformerlab_legacy_home }}"
 courseware_unsloth_home: "{{ courseware_state_dir }}/unsloth-home"
-courseware_lab1_models_dir: "{{ courseware_models_dir }}/lab1"
 courseware_ollama_models_dir: "{{ courseware_models_dir }}/ollama"
 courseware_node_runtime_dir: "{{ courseware_tools_dir }}/node-runtime"
 courseware_node_runtime_bin_dir: "{{ courseware_node_runtime_dir }}/node_modules/node/bin"
-courseware_uv_venv_dir: "{{ courseware_tools_dir }}/uv"
-courseware_uv_python_install_dir: "{{ courseware_tools_dir }}/uv-python"
-courseware_open_webui_python_version: "3.12"
-courseware_netron_venv_dir: "{{ courseware_venvs_dir }}/netron"
-courseware_wetty_dir: "{{ courseware_tools_dir }}/wetty"
 courseware_promptfoo_dir: "{{ courseware_lab6_dir }}"
 courseware_wiki_repo_dir: "{{ courseware_repos_dir }}/LLM-Labs"
-courseware_wiki_runtime_config_path: "{{ courseware_wiki_repo_dir }}/public/courseware-runtime.json"
 courseware_llama_cpp_bin_dir: "{{ courseware_repos_dir }}/llama.cpp/build/bin"

 courseware_bind_host: "0.0.0.0"
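The new `courseware_transformerlab_home` default above switches to a space-free home under `~/.local/share` whenever the project root path contains a space. The same rule can be sketched in shell; assuming here that the legacy home lives under `<root>/state/`, which matches `courseware_state_dir` elsewhere in this diff.

```shell
#!/usr/bin/env bash
# Sketch of the space-in-path home selection rule; paths mirror the
# defaults in this diff but the function itself is illustrative.
pick_transformerlab_home() {
  local root="$1"
  local safe_homes="$HOME/.local/share/local-lab-deployment"
  case "$root" in
    *" "*) echo "$safe_homes/transformerlab-home" ;;   # root contains a space
    *)     echo "$root/state/transformerlab-home" ;;   # legacy state/ home
  esac
}

pick_transformerlab_home "/home/student/llm labs"   # safe, space-free home
pick_transformerlab_home "/home/student/llm-labs"   # legacy state/ home
```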
@@ -31,30 +27,46 @@ courseware_url_host: "127.0.0.1"
 courseware_ports:
   ollama: 11434
   open_webui: 8080
-  netron: 8338
+  transformerlab: 8338
   chunkviz: 3001
   embedding_atlas: 5055
   unsloth: 8888
   promptfoo: 15500
   wiki: 80
-  wetty: 7681

-courseware_netron_version: "9.0.1"
-courseware_ollama_min_version: "0.12.11"
+courseware_transformerlab_install_mode: "single-user-pinned"
+courseware_transformerlab_version: "v0.28.2"
+courseware_transformerlab_version_dir: "{{ courseware_transformerlab_version | regex_replace('^v', '') }}"
+courseware_transformerlab_source_archive: "{{ courseware_downloads_dir }}/transformerlab-app-{{ courseware_transformerlab_version_dir }}.tar.gz"
+courseware_transformerlab_web_archive: "{{ courseware_downloads_dir }}/transformerlab-web-{{ courseware_transformerlab_version_dir }}.tar.gz"
+courseware_transformerlab_miniforge_installer: "{{ courseware_downloads_dir }}/transformerlab-miniforge-installer.sh"
+courseware_transformerlab_default_user_email: "student@zuccaro.me"
+courseware_transformerlab_default_user_password: "student"
+courseware_transformerlab_default_user_first_name: "Student"
+courseware_transformerlab_default_user_last_name: ""
+courseware_transformerlab_required_loader_plugins:
+  - "fastchat_server"
+courseware_transformerlab_required_supports_fastchat:
+  - "chat"
+  - "completion"
+  - "visualize_model"
+  - "model_layers"
+  - "rag"
+  - "tools"
+  - "template"
+  - "embeddings"
+  - "tokenize"
+  - "logprobs"
+  - "batched"
 courseware_llama_cpp_commit: "51fa458a92d6a3f305f8fd76fc8f702e3e87ddb5"
 courseware_chunkviz_commit: "a891eacafda1f28a12373ad3b00102e68f07c57f"
 courseware_promptfoo_version: "0.119.0"
+courseware_kiln_release_tag: "v0.18.1"
 courseware_node_runtime_version: "20.20.2"
-courseware_wetty_spec: "wetty@2.5.0"
-courseware_wetty_base_path: "/wetty"
 courseware_wiki_repo: "https://git.zuccaro.me/bzuccaro/LLM-Labs.git"

 courseware_open_webui_spec: "open-webui"
 courseware_embedding_atlas_spec: "embedding-atlas"
-courseware_lab1_ollama_model_alias: "batiai/gemma4-e2b:q4"
-courseware_lab1_llama_filename: "Llama-3.2-1B.Q4_K_M.gguf"
-courseware_lab1_llama_download_url: "https://huggingface.co/DevQuasar-3/meta-llama.Llama-3.2-1B-GGUF/resolve/main/Llama-3.2-1B.Q4_K_M.gguf?download=true"
-courseware_lab1_llama_local_path: "{{ courseware_lab1_models_dir }}/{{ courseware_lab1_llama_filename }}"

 courseware_white_rabbit_repo: "bartowski/WhiteRabbitNeo_WhiteRabbitNeo-V3-7B-GGUF"
 courseware_white_rabbit_variants:
@@ -71,15 +83,12 @@ courseware_white_rabbit_variants:
   - ollama_model: "WhiteRabbitNeo-IQ2"
     quant: "IQ2_M"
     filename: "WhiteRabbitNeo_WhiteRabbitNeo-V3-7B-IQ2_M.gguf"
-courseware_lab2_ollama_models:
-  - label: "Gemma 4 E2B IT Q8"
-    value: "gemma4:e2b-it-q8_0"
-  - label: "Gemma 4 E2B Q4"
-    value: "batiai/gemma4-e2b:q4"
-  - label: "Gemma 4 E2B Q6"
-    value: "batiai/gemma4-e2b:q6"
-courseware_ollama_models: "{{ courseware_lab2_ollama_models | map(attribute='value') | list }}"
-courseware_optional_ollama_models: []
+courseware_ollama_models:
+  - "llama3.2"
+  - "qwen3.5:4b"
+  - "gemma3n:e2b"
+courseware_optional_ollama_models:
+  - "gemma3:12b-it-qat"
 courseware_install_optional_heavy_models: false

 courseware_wsl_cuda_pin_url: "https://developer.download.nvidia.com/compute/cuda/repos/wsl-ubuntu/x86_64/cuda-wsl-ubuntu.pin"
@@ -136,10 +145,9 @@ courseware_ollama_install_marker: "{{ courseware_markers_dir }}/ollama-installed
 courseware_services:
   - "ollama"
   - "open-webui"
-  - "netron"
+  - "transformerlab"
   - "chunkviz"
   - "embedding-atlas"
   - "unsloth"
   - "promptfoo"
   - "wiki"
-  - "wetty"

(next file)

@@ -12,6 +12,17 @@
   changed_when: false
   failed_when: false

+- name: Stat managed TransformerLab symlink
+  stat:
+    path: "{{ ansible_env.HOME }}/.transformerlab"
+    follow: false
+  register: courseware_down_transformerlab
+
+- name: Stat managed TransformerLab marker
+  stat:
+    path: "{{ ansible_env.HOME }}/.transformerlab/.courseware-managed"
+  register: courseware_down_transformerlab_marker
+
 - name: Stat managed Unsloth symlink
   stat:
     path: "{{ ansible_env.HOME }}/.unsloth"
@@ -23,6 +34,11 @@
     path: "{{ ansible_env.HOME }}/.unsloth/.courseware-managed"
   register: courseware_down_unsloth_marker

+- name: Stat conda environments file
+  stat:
+    path: "{{ ansible_env.HOME }}/.conda/environments.txt"
+  register: courseware_down_conda_envs
+
 - name: Stat courseware-managed Ollama install marker
   stat:
     path: "{{ courseware_ollama_install_marker }}"
@@ -140,6 +156,55 @@
     - courseware_down_ollama_marker.stat.exists
   failed_when: false

+- name: Stop courseware-managed Ollama macOS app if running
+  command: pkill -x Ollama
+  when:
+    - ansible_system == "Darwin"
+    - courseware_down_ollama_marker.stat.exists
+  changed_when: false
+  failed_when: false
+
+- name: Uninstall courseware-managed Ollama Homebrew formula
+  command: brew uninstall ollama
+  when:
+    - ansible_system == "Darwin"
+    - courseware_down_ollama_marker.stat.exists
+  changed_when: false
+  failed_when: false
+
+- name: Remove managed TransformerLab conda environment entry
+  lineinfile:
+    path: "{{ ansible_env.HOME }}/.conda/environments.txt"
+    regexp: "^{{ (courseware_transformerlab_home ~ '/envs/transformerlab') | regex_escape() }}$"
+    state: absent
+  when: courseware_down_conda_envs.stat.exists
+  failed_when: false
+
+- name: Remove managed TransformerLab path
+  file:
+    path: "{{ ansible_env.HOME }}/.transformerlab"
+    state: absent
+  when:
+    - courseware_down_transformerlab.stat.exists
+    - >
+      (courseware_down_transformerlab.stat.islnk and
+       courseware_down_transformerlab.stat.lnk_source == courseware_transformerlab_home)
+      or courseware_down_transformerlab_marker.stat.exists
+  failed_when: false
+
+- name: Remove managed TransformerLab home
+  file:
+    path: "{{ courseware_transformerlab_home }}"
+    state: absent
+  failed_when: false
+
+- name: Remove legacy managed TransformerLab home
+  file:
+    path: "{{ courseware_transformerlab_legacy_home }}"
+    state: absent
+  when: courseware_transformerlab_legacy_home != courseware_transformerlab_home
+  failed_when: false
+
 - name: Remove managed Unsloth path
   file:
     path: "{{ ansible_env.HOME }}/.unsloth"
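The teardown guard above removes `~/.transformerlab` only when it is a symlink pointing into the managed home or carries the `.courseware-managed` marker. That ownership test can be sketched as a shell predicate; the function name and stand-in paths below are illustrative, not part of the project.

```shell
#!/usr/bin/env bash
# Sketch of the "is this path ours to delete?" guard from the down tasks.
is_courseware_managed() {
  local path="$1" managed_home="$2"
  # Case 1: symlink that points exactly at the managed home.
  if [ -L "$path" ] && [ "$(readlink "$path")" = "$managed_home" ]; then
    return 0
  fi
  # Case 2: directory carrying the ownership marker.
  [ -e "$path/.courseware-managed" ]
}

home_dir=$(mktemp -d)              # stand-in for the managed home
link=$(mktemp -d)/transformerlab   # stand-in for ~/.transformerlab
ln -s "$home_dir" "$link"
is_courseware_managed "$link" "$home_dir" && echo "safe to remove"
```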

(next file)

@@ -6,15 +6,13 @@
   - { role: preflight, tags: ["preflight"] }
   - directories
   - packages
-  - netron
-  - lab1_assets
   - lab_assets
   - node_runtime
-  - { role: terminal, when: ansible_system == "Linux" }
   - llama_cpp
+  - transformerlab
   - open_webui
   - chunkviz
   - promptfoo
-  - { role: ollama_models, tags: ["ollama_models"] }
-  - { role: wiki, tags: ["wiki"] }
+  - wiki
+  - kiln
   - unsloth

(next file)

@@ -12,3 +12,10 @@ common_packages_debian:
   - ninja-build
   - libssl-dev
   - pkg-config
+
+common_packages_macos:
+  - python3
+  - git
+  - curl
+  - cmake
+  - ninja

(next file)

@@ -20,6 +20,17 @@
   when: ansible_os_family == "Debian"
   become: yes

+- name: Ensure Homebrew is installed (macOS)
+  ansible.builtin.homebrew:
+    name:
+      - python3
+      - git
+      - curl
+      - cmake
+      - ninja
+    state: present
+  when: ansible_os_family == "Darwin"
+
 - name: Install Python virtual environment module (user space)
   ansible.builtin.pip:
     name: virtualenv

(next file)

@@ -13,11 +13,12 @@
     - "{{ courseware_models_dir }}"
     - "{{ courseware_datasets_dir }}"
     - "{{ courseware_tools_dir }}"
+    - "{{ courseware_apps_dir }}"
     - "{{ courseware_downloads_dir }}"
-    - "{{ courseware_lab1_dir }}"
     - "{{ courseware_lab2_dir }}"
+    - "{{ courseware_safe_homes_dir }}"
+    - "{{ courseware_transformerlab_home }}"
     - "{{ courseware_unsloth_home }}"
-    - "{{ courseware_lab1_models_dir }}"
     - "{{ courseware_ollama_models_dir }}"

 - name: Seed managed ownership markers
@@ -26,8 +27,40 @@
     state: touch
     mode: "0644"
   loop:
+    - "{{ courseware_transformerlab_home }}/.courseware-managed"
     - "{{ courseware_unsloth_home }}/.courseware-managed"

+- name: Check existing TransformerLab path
+  stat:
+    path: "{{ ansible_env.HOME }}/.transformerlab"
+    follow: false
+  register: courseware_transformerlab_link
+
+- name: Check existing TransformerLab ownership marker
+  stat:
+    path: "{{ ansible_env.HOME }}/.transformerlab/.courseware-managed"
+  register: courseware_transformerlab_marker
+
+- name: Fail if TransformerLab path is already occupied
+  fail:
+    msg: "{{ ansible_env.HOME }}/.transformerlab already exists and is not managed by this project."
+  when:
+    - courseware_transformerlab_link.stat.exists
+    - >
+      (
+        (not courseware_transformerlab_link.stat.islnk) or
+        (courseware_transformerlab_link.stat.islnk and
+         courseware_transformerlab_link.stat.lnk_source != courseware_transformerlab_home)
+      ) and
+      (not courseware_transformerlab_marker.stat.exists)
+
+- name: Link TransformerLab home into project state
+  file:
+    src: "{{ courseware_transformerlab_home }}"
+    dest: "{{ ansible_env.HOME }}/.transformerlab"
+    state: link
+    force: true
+
 - name: Check existing Unsloth path
   stat:
     path: "{{ ansible_env.HOME }}/.unsloth"

(next file)

@@ -0,0 +1,25 @@
+- name: Download Kiln Linux archive
+  get_url:
+    url: "https://github.com/Kiln-AI/Kiln/releases/download/{{ courseware_kiln_release_tag }}/Kiln.Linux.x64.zip"
+    dest: "{{ courseware_downloads_dir }}/Kiln.Linux.x64.zip"
+    mode: "0644"
+
+- name: Create Kiln Linux directory
+  file:
+    path: "{{ courseware_apps_dir }}/kiln"
+    state: directory
+    mode: "0755"
+
+- name: Unpack Kiln Linux binary
+  unarchive:
+    src: "{{ courseware_downloads_dir }}/Kiln.Linux.x64.zip"
+    dest: "{{ courseware_apps_dir }}/kiln"
+    remote_src: true
+    creates: "{{ courseware_apps_dir }}/kiln/Kiln"
+
+- name: Ensure Kiln Linux binary is executable
+  file:
+    path: "{{ courseware_apps_dir }}/kiln/Kiln"
+    mode: "0755"
+    state: file
@@ -0,0 +1,19 @@
+- name: Download Kiln macOS disk image
+  get_url:
+    url: "https://github.com/Kiln-AI/Kiln/releases/download/{{ courseware_kiln_release_tag }}/Kiln.MacOS.AppleSilicon.M-Processor.dmg"
+    dest: "{{ courseware_downloads_dir }}/Kiln.MacOS.AppleSilicon.M-Processor.dmg"
+    mode: "0644"
+
+- name: Install Kiln.app into project state
+  shell: |
+    set -euo pipefail
+    mount_point=$(mktemp -d /tmp/kiln.XXXXXX)
+    hdiutil attach "{{ courseware_downloads_dir }}/Kiln.MacOS.AppleSilicon.M-Processor.dmg" -mountpoint "$mount_point" -nobrowse -quiet
+    app_path=$(find "$mount_point" -maxdepth 1 -name '*.app' | head -n 1)
+    rm -rf "{{ courseware_apps_dir }}/Kiln.app"
+    cp -R "$app_path" "{{ courseware_apps_dir }}/Kiln.app"
+    hdiutil detach "$mount_point" -quiet
+    rmdir "$mount_point"
+  args:
+    executable: /bin/bash
+    creates: "{{ courseware_apps_dir }}/Kiln.app"
@@ -0,0 +1,8 @@
+- name: Install Kiln on Linux
+  include_tasks: linux.yml
+  when: ansible_system == "Linux"
+
+- name: Install Kiln on macOS
+  include_tasks: macos.yml
+  when: ansible_system == "Darwin"
+
@@ -1,38 +0,0 @@
-- name: Ensure Lab 1 model directory exists
-  file:
-    path: "{{ courseware_lab1_models_dir }}"
-    state: directory
-    mode: "0755"
-
-- name: Check installed Ollama version
-  command:
-    argv:
-      - "{{ courseware_ollama_bin }}"
-      - --version
-  register: courseware_lab1_ollama_version
-  changed_when: false
-
-- name: Extract installed Ollama semantic version
-  set_fact:
-    courseware_lab1_ollama_semver: >-
-      {{
-        courseware_lab1_ollama_version.stdout
-        | regex_search('[0-9]+\.[0-9]+\.[0-9]+')
-        | default('')
-      }}
-
-- name: Fail when Ollama is too old for Lab 1 logprobs
-  fail:
-    msg: >-
-      Lab 1 requires Ollama {{ courseware_ollama_min_version }} or newer because
-      the confidence visualizer depends on logprob support. Installed version:
-      {{ courseware_lab1_ollama_version.stdout | trim }}.
-  when:
-    - courseware_lab1_ollama_semver | length == 0
-      or not (courseware_lab1_ollama_semver is version(courseware_ollama_min_version, '>='))
-
-- name: Download mirrored Lab 1 Llama model
-  get_url:
-    url: "{{ courseware_lab1_llama_download_url }}"
-    dest: "{{ courseware_lab1_llama_local_path }}"
-    mode: "0644"
@@ -23,6 +23,18 @@
     gpu_type: "{{ 'nvidia' if nvidia_smi_output.rc == 0 else 'none' }}"
   when: is_wsl | default(false) or ansible_os_family == "Debian"
 
+- name: Check for Metal GPU on macOS
+  ansible.builtin.command: system_profiler SPDisplaysDataType
+  register: metal_check
+  changed_when: false
+  failed_when: false
+  when: ansible_os_family == "Darwin"
+
+- name: Set GPU type for macOS
+  ansible.builtin.set_fact:
+    gpu_type: "metal"
+  when: ansible_os_family == "Darwin" and metal_check.rc == 0
+
 - name: Display detected GPU type
   ansible.builtin.debug:
     msg: "llama.cpp GPU type: {{ gpu_type | default('none') }}"
@@ -46,6 +58,7 @@
     {{
       not llama_cpp_stat.stat.exists or
      (gpu_type == 'nvidia' and existing_gpu_check.stdout != 'cuda') or
+      (gpu_type == 'metal' and existing_gpu_check.stdout != 'metal') or
       (gpu_type == 'amd' and existing_gpu_check.stdout != 'amd')
     }}
 
@@ -107,6 +120,19 @@
   when: gpu_type == 'amd' and cmake_configured.rc != 0
   become: no
 
+- name: Configure llama.cpp for Metal (macOS)
+  ansible.builtin.command:
+    argv:
+      - cmake
+      - ..
+      - -G Ninja
+      - -DCMAKE_BUILD_TYPE=Release
+      - -DGGML_METAL=on
+  args:
+    chdir: "{{ llmlab_base }}/lab2/llama.cpp/build"
+  when: gpu_type == 'metal' and cmake_configured.rc != 0
+  become: no
+
 - name: Configure llama.cpp for CPU only
   ansible.builtin.command:
     argv:
@@ -78,7 +78,7 @@
 
 - name: Set llama.cpp backend flag
   set_fact:
-    courseware_llama_backend_flag: "-DGGML_CUDA=ON"
+    courseware_llama_backend_flag: "{{ '-DGGML_METAL=ON' if ansible_system == 'Darwin' else '-DGGML_CUDA=ON' }}"
 
 - name: Set llama.cpp build parallelism
   set_fact:
@@ -1,30 +0,0 @@
-- name: Create Netron virtual environment
-  command:
-    argv:
-      - "{{ courseware_python_bin }}"
-      - -m
-      - venv
-      - "{{ courseware_netron_venv_dir }}"
-  args:
-    creates: "{{ courseware_netron_venv_dir }}/bin/python"
-
-- name: Upgrade Netron venv tooling
-  command:
-    argv:
-      - "{{ courseware_netron_venv_dir }}/bin/python"
-      - -m
-      - pip
-      - install
-      - --upgrade
-      - pip
-      - setuptools
-      - wheel
-
-- name: Install Netron server runtime
-  command:
-    argv:
-      - "{{ courseware_netron_venv_dir }}/bin/python"
-      - -m
-      - pip
-      - install
-      - "netron=={{ courseware_netron_version }}"
@@ -16,6 +16,14 @@
   become: yes
   notify: Start Ollama service
 
+- name: Install Ollama (macOS via Homebrew)
+  ansible.builtin.homebrew:
+    name: ollama
+    state: present
+  when:
+    - ansible_os_family == "Darwin"
+    - ollama_version_check.rc != 0
+
 - name: Check if Ollama service exists
   ansible.builtin.command: systemctl list-unit-files ollama.service
   register: ollama_service_check
@@ -4,58 +4,15 @@
     state: directory
     mode: "0755"
 
-- name: Create uv helper virtual environment
-  command:
-    argv:
-      - "{{ courseware_python_bin }}"
-      - -m
-      - venv
-      - "{{ courseware_uv_venv_dir }}"
-  args:
-    creates: "{{ courseware_uv_venv_dir }}/bin/python"
-
-- name: Install uv helper
-  command:
-    argv:
-      - "{{ courseware_uv_venv_dir }}/bin/python"
-      - -m
-      - pip
-      - install
-      - --upgrade
-      - pip
-      - uv
-  args:
-    creates: "{{ courseware_uv_venv_dir }}/bin/uv"
-
-- name: Check Open WebUI virtual environment Python version
-  command:
-    argv:
-      - "{{ courseware_venvs_dir }}/open-webui/bin/python"
-      - -c
-      - "import importlib.util, sys; expected = tuple(map(int, '{{ courseware_open_webui_python_version }}'.split('.')[:2])); ok = sys.version_info[:len(expected)] == expected and importlib.util.find_spec('pip') is not None; raise SystemExit(0 if ok else 1)"
-  register: courseware_open_webui_python_check
-  changed_when: false
-  failed_when: false
-
-- name: Remove incompatible Open WebUI virtual environment
-  file:
-    path: "{{ courseware_venvs_dir }}/open-webui"
-    state: absent
-  when: courseware_open_webui_python_check.rc != 0
-
 - name: Create Open WebUI virtual environment
   command:
     argv:
-      - "{{ courseware_uv_venv_dir }}/bin/uv"
+      - "{{ courseware_transformerlab_home }}/envs/transformerlab/bin/python"
+      - -m
       - venv
-      - --seed
-      - --python
-      - "{{ courseware_open_webui_python_version }}"
       - "{{ courseware_venvs_dir }}/open-webui"
   args:
     creates: "{{ courseware_venvs_dir }}/open-webui/bin/python"
-  environment:
-    UV_PYTHON_INSTALL_DIR: "{{ courseware_uv_python_install_dir }}"
 
 - name: Upgrade Open WebUI venv tooling
   command:
@@ -82,7 +39,7 @@
 - name: Create Embedding Atlas virtual environment
   command:
     argv:
-      - "{{ courseware_python_bin }}"
+      - "{{ courseware_transformerlab_home }}/envs/transformerlab/bin/python"
       - -m
       - venv
       - "{{ courseware_venvs_dir }}/embedding-atlas"
@@ -14,7 +14,6 @@
       - pkg-config
       - python3
       - python3-pip
-      - python3-setuptools
       - python3-venv
       - unzip
       - zstd
@@ -0,0 +1,29 @@
+- name: Check installed Homebrew formulas
+  command: "brew list --versions {{ item }}"
+  loop:
+    - git
+    - git-lfs
+    - cmake
+    - node
+    - python@3.11
+    - ollama
+  register: courseware_brew_checks
+  changed_when: false
+  failed_when: false
+
+- name: Install missing Homebrew formulas
+  command: "brew install {{ item.item }}"
+  loop: "{{ courseware_brew_checks.results }}"
+  when: item.rc != 0
+
+- name: Mark Ollama as installed by courseware on macOS
+  file:
+    path: "{{ courseware_ollama_install_marker }}"
+    state: touch
+    mode: "0644"
+  when:
+    - courseware_brew_checks.results
+      | selectattr('item', 'equalto', 'ollama')
+      | selectattr('rc', 'ne', 0)
+      | list
+      | length > 0
@@ -1,3 +1,8 @@
+- name: Install macOS prerequisites
+  include_tasks: macos.yml
+  when: ansible_system == "Darwin"
+
 - name: Install Linux prerequisites
   include_tasks: linux.yml
   when: ansible_system == "Linux"
+
@@ -1,14 +1,56 @@
 - name: Classify supported host profile
   set_fact:
+    courseware_is_macos: "{{ ansible_system == 'Darwin' }}"
     courseware_is_linux: "{{ ansible_system == 'Linux' }}"
     courseware_is_wsl: "{{ 'microsoft' in ansible_kernel | lower or 'wsl' in ansible_kernel | lower }}"
     courseware_is_native_linux: "{{ ansible_system == 'Linux' and not ('microsoft' in ansible_kernel | lower or 'wsl' in ansible_kernel | lower) }}"
-    courseware_host_profile: "{{ 'wsl' if ansible_system == 'Linux' and ('microsoft' in ansible_kernel | lower or 'wsl' in ansible_kernel | lower) else ('native-debian-ubuntu' if ansible_system == 'Linux' and ansible_os_family == 'Debian' else 'unsupported') }}"
+    courseware_host_profile: "{{ 'macos' if ansible_system == 'Darwin' else ('wsl' if ('microsoft' in ansible_kernel | lower or 'wsl' in ansible_kernel | lower) else ('native-debian-ubuntu' if ansible_system == 'Linux' and ansible_os_family == 'Debian' else 'unsupported')) }}"
 
 - name: Fail on unsupported operating systems
   fail:
-    msg: "Supported platforms are Debian-family Linux and WSL."
-  when: courseware_host_profile == "unsupported"
+    msg: "Supported platforms are Apple Silicon macOS and Debian-family Linux/WSL."
+  when: ansible_system not in ["Darwin", "Linux"]
+
+- name: Fail on unsupported macOS architecture
+  fail:
+    msg: "This installer supports Apple Silicon Macs only."
+  when:
+    - ansible_system == "Darwin"
+    - ansible_architecture not in ["arm64", "aarch64"]
+
+- name: Fail on undersized macOS systems
+  fail:
+    msg: "This courseware assumes a modern Apple Silicon Mac with at least 16 GB of unified memory."
+  when:
+    - ansible_system == "Darwin"
+    - (ansible_memtotal_mb | int) < 16000
+
+- name: Check for Xcode command line tools
+  command: xcode-select -p
+  register: courseware_xcode_select
+  changed_when: false
+  when: ansible_system == "Darwin"
+
+- name: Check for Homebrew
+  command: which brew
+  register: courseware_brew_check
+  changed_when: false
+  failed_when: false
+  when: ansible_system == "Darwin"
+
+- name: Fail when Xcode command line tools are missing
+  fail:
+    msg: "Install Xcode Command Line Tools first with 'xcode-select --install'."
+  when:
+    - ansible_system == "Darwin"
+    - courseware_xcode_select.rc != 0
+
+- name: Fail when Homebrew is missing
+  fail:
+    msg: "Install Homebrew first from https://brew.sh/."
+  when:
+    - ansible_system == "Darwin"
+    - courseware_brew_check.rc != 0
+
 - name: Fail on unsupported Linux family
   fail:
@@ -288,5 +330,6 @@
 
 - name: Set runtime binary defaults
   set_fact:
-    courseware_python_bin: "/usr/bin/python3"
+    courseware_python_bin: >-
+      {{ '/opt/homebrew/opt/python@3.11/bin/python3.11' if ansible_system == 'Darwin' else '/usr/bin/python3' }}
     courseware_ollama_bin: "ollama"
@@ -1,117 +0,0 @@
-- name: Install terminal prerequisites
-  become: true
-  apt:
-    name:
-      - openssh-server
-    state: present
-    update_cache: true
-
-- name: Ensure sshd drop-in directory exists
-  become: true
-  file:
-    path: /etc/ssh/sshd_config.d
-    state: directory
-    mode: "0755"
-
-- name: Configure courseware sshd policy
-  become: true
-  template:
-    src: sshd-courseware-terminal.conf.j2
-    dest: /etc/ssh/sshd_config.d/50-courseware-terminal.conf
-    mode: "0644"
-  register: courseware_terminal_sshd_config
-
-- name: Ensure sshd runtime directory exists
-  become: true
-  file:
-    path: /run/sshd
-    state: directory
-    mode: "0755"
-
-- name: Validate sshd configuration
-  become: true
-  command:
-    argv:
-      - /usr/sbin/sshd
-      - -t
-      - -f
-      - /etc/ssh/sshd_config
-  changed_when: false
-
-- name: Start and enable sshd with systemd when available
-  become: true
-  systemd:
-    name: ssh
-    state: started
-    enabled: true
-  when: ansible_service_mgr == "systemd"
-
-- name: Reload sshd when config changed with systemd
-  become: true
-  systemd:
-    name: ssh
-    state: reloaded
-  when:
-    - ansible_service_mgr == "systemd"
-    - courseware_terminal_sshd_config.changed
-
-- name: Check for running sshd when systemd is unavailable
-  become: true
-  command: pgrep -x sshd
-  register: courseware_terminal_sshd_pid
-  changed_when: false
-  failed_when: false
-  when: ansible_service_mgr != "systemd"
-
-- name: Reload running sshd when config changed outside systemd
-  become: true
-  command: pkill -HUP -x sshd
-  when:
-    - ansible_service_mgr != "systemd"
-    - courseware_terminal_sshd_pid.rc == 0
-    - courseware_terminal_sshd_config.changed
-
-- name: Start sshd when it is not already running outside systemd
-  become: true
-  command:
-    argv:
-      - /usr/sbin/sshd
-  when:
-    - ansible_service_mgr != "systemd"
-    - courseware_terminal_sshd_pid.rc != 0
-
-- name: Create contained WeTTY directory
-  file:
-    path: "{{ courseware_wetty_dir }}"
-    state: directory
-    mode: "0755"
-
-- name: Install contained WeTTY runtime
-  command:
-    argv:
-      - npm
-      - install
-      - "{{ courseware_wetty_spec }}"
-  args:
-    chdir: "{{ courseware_wetty_dir }}"
-    creates: "{{ courseware_wetty_dir }}/node_modules/.bin/wetty"
-  environment:
-    PATH: "{{ courseware_node_runtime_bin_dir }}:{{ ansible_env.PATH }}"
-
-- name: Wait for sshd to accept local WeTTY connections
-  wait_for:
-    host: 127.0.0.1
-    port: 22
-    state: started
-    timeout: 10
-    msg: "sshd must accept connections on 127.0.0.1:22 for the browser terminal deployment."
-
-- name: Assert WeTTY binary exists
-  stat:
-    path: "{{ courseware_wetty_dir }}/node_modules/.bin/wetty"
-  register: courseware_wetty_bin_stat
-
-- name: Fail when WeTTY installation is incomplete
-  fail:
-    msg: "WeTTY was not installed under {{ courseware_wetty_dir }}."
-  when: not courseware_wetty_bin_stat.stat.exists
@@ -1,9 +0,0 @@
-# Managed by Local Courseware Deployment.
-PermitRootLogin no
-PasswordAuthentication yes
-KbdInteractiveAuthentication no
-ChallengeResponseAuthentication no
-UsePAM yes
-AllowTcpForwarding no
-X11Forwarding no
-PrintMotd no
@@ -145,7 +145,7 @@
 
 - name: Determine Miniforge platform suffix
   set_fact:
-    courseware_transformerlab_miniforge_platform: "{{ 'Linux-x86_64' if ansible_system == 'Linux' and ansible_architecture == 'x86_64' else 'Linux-aarch64' if ansible_system == 'Linux' and ansible_architecture in ['aarch64', 'arm64'] else 'unsupported' }}"
+    courseware_transformerlab_miniforge_platform: "{{ 'Linux-x86_64' if ansible_system == 'Linux' and ansible_architecture == 'x86_64' else 'Linux-aarch64' if ansible_system == 'Linux' and ansible_architecture in ['aarch64', 'arm64'] else 'MacOSX-arm64' if ansible_system == 'Darwin' and ansible_architecture == 'arm64' else 'MacOSX-x86_64' if ansible_system == 'Darwin' and ansible_architecture == 'x86_64' else 'unsupported' }}"
 
 - name: Fail for unsupported Miniforge platform
   fail:
@@ -210,7 +210,7 @@
     elif [ "$accelerator" = "rocm" ]; then
       wheel_args+=(--index https://download.pytorch.org/whl/rocm6.4 --index-strategy unsafe-best-match)
       extra="rocm"
-    else
+    elif [ "{{ ansible_system }}" != "Darwin" ]; then
       wheel_args+=(--index https://download.pytorch.org/whl/cpu --index-strategy unsafe-best-match)
     fi
 
@@ -233,12 +233,27 @@
     dest: "{{ courseware_state_dir }}/repair_transformerlab_plugin_supports.py"
     mode: "0755"
 
+- name: Install TransformerLab source repair helper
+  copy:
+    src: "{{ playbook_dir }}/../../scripts/repair_transformerlab_symlink_paths.py"
+    dest: "{{ courseware_state_dir }}/repair_transformerlab_symlink_paths.py"
+    mode: "0755"
+
 - name: Install TransformerLab default-user helper
   copy:
     src: "{{ playbook_dir }}/../../scripts/ensure_transformerlab_user.py"
     dest: "{{ courseware_state_dir }}/ensure_transformerlab_user.py"
     mode: "0755"
 
+- name: Repair pinned TransformerLab symlink-aware plugin file lookups
+  command:
+    argv:
+      - python3
+      - "{{ courseware_state_dir }}/repair_transformerlab_symlink_paths.py"
+      - --transformerlab-dir
+      - "{{ courseware_transformerlab_home }}"
+  changed_when: false
+
 - name: Repair installed Fastchat plugin supports
   command:
     argv:
@@ -1,5 +1,5 @@
 diff --git a/src/app/labs/[slug]/page.tsx b/src/app/labs/[slug]/page.tsx
-index eb949ae..bb3d51c 100644
+index f67308f..a6aac38 100644
 --- a/src/app/labs/[slug]/page.tsx
 +++ b/src/app/labs/[slug]/page.tsx
 @@ -462,6 +462,19 @@ function markdownToHtml(markdown: string) {
@@ -41,15 +41,20 @@ index eb949ae..bb3d51c 100644
    return (
      <main className="mx-auto w-full max-w-5xl px-6 py-10">
 diff --git a/src/components/labs/LabContent.tsx b/src/components/labs/LabContent.tsx
-index 6addccf..afdd12f 100644
+index 7a7ce52..8778a23 100644
 --- a/src/components/labs/LabContent.tsx
 +++ b/src/components/labs/LabContent.tsx
-@@ -346,6 +346,7 @@ export function LabContent({ className, html }: LabContentProps) {
-              <img
-                className="lab-image-modal__image"
-                src={zoomedImage.src}
-                alt={zoomedImage.alt}
+@@ -277,7 +277,12 @@ export function LabContent({ className, html }: LabContentProps) {
+        >
+          <div className="lab-image-modal__surface" onClick={(event) => event.stopPropagation()}>
+            {/* eslint-disable-next-line @next/next/no-img-element */}
+-           <img className="lab-image-modal__image" src={zoomedImage.src} alt={zoomedImage.alt} />
++           <img
++             className="lab-image-modal__image"
++             src={zoomedImage.src}
++             alt={zoomedImage.alt}
 +             referrerPolicy="no-referrer"
-              />
++           />
           </div>
         </div>
+      ) : null}
@@ -2,9 +2,7 @@
   git:
     repo: "{{ courseware_wiki_repo }}"
     dest: "{{ courseware_wiki_repo_dir }}"
-    update: "{{ courseware_wiki_force_update | default(false) | bool }}"
-    force: "{{ courseware_wiki_force_update | default(false) | bool }}"
-  register: courseware_wiki_repo_sync
+    update: false
 
 - name: Check whether wiki referrer policy patch is already applied
   command:
@@ -29,27 +27,14 @@
   args:
     chdir: "{{ courseware_wiki_repo_dir }}"
   when: courseware_wiki_referrer_policy_patch.rc != 0
-  register: courseware_wiki_referrer_policy_apply
-
-- name: Stat wiki Next dependency
-  stat:
-    path: "{{ courseware_wiki_repo_dir }}/node_modules/next/package.json"
-  register: courseware_wiki_next_dependency
 
 - name: Install wiki dependencies with contained Node runtime
   command: npm install
   args:
     chdir: "{{ courseware_wiki_repo_dir }}"
+    creates: "{{ courseware_wiki_repo_dir }}/node_modules/next/package.json"
   environment:
     PATH: "{{ courseware_node_runtime_bin_dir }}:{{ ansible_env.PATH }}"
-  when:
-    - not courseware_wiki_next_dependency.stat.exists or courseware_wiki_repo_sync.changed
-
-- name: Render wiki runtime config
-  template:
-    src: courseware-runtime.json.j2
-    dest: "{{ courseware_wiki_runtime_config_path }}"
-    mode: "0644"
 
 - name: Stat wiki build output
   stat:
@@ -63,4 +48,4 @@
   environment:
     PATH: "{{ courseware_node_runtime_bin_dir }}:{{ ansible_env.PATH }}"
   when:
-    - not courseware_wiki_build.stat.exists or courseware_wiki_repo_sync.changed or courseware_wiki_referrer_policy_patch.rc != 0
+    - not courseware_wiki_build.stat.exists or courseware_wiki_referrer_policy_patch.rc != 0
@@ -1,13 +0,0 @@
-{
-  "lab1NetronUrl": "http://{{ courseware_url_host }}:{{ courseware_ports.netron }}",
-  "lab2OllamaUrl": "http://{{ courseware_url_host }}:{{ courseware_ports.ollama }}",
-  "lab2OllamaModels": [
-    {% for model in courseware_lab2_ollama_models %}
-    {
-      "label": "{{ model.label }}",
-      "value": "{{ model.value }}"
-    }{% if not loop.last %},{% endif %}
-    {% endfor %}
-  ],
-  "lab3TerminalUrl": "http://{{ courseware_url_host }}:{{ courseware_ports.wetty }}{{ courseware_wetty_base_path }}"
-}
@@ -30,18 +30,14 @@
   ansible.builtin.import_role:
     name: ollama
 
-- name: Include Netron setup
-  ansible.builtin.import_role:
-    name: netron
-
-- name: Include Lab 1 asset setup
-  ansible.builtin.import_role:
-    name: lab1_assets
-
 - name: Include llama.cpp setup
   ansible.builtin.import_role:
     name: llama-cpp
 
+- name: Include Transformer Lab setup
+  ansible.builtin.import_role:
+    name: transformerlab
+
 - name: Include Unsloth Studio setup
   ansible.builtin.import_role:
     name: unsloth
@@ -5,32 +5,31 @@ COURSEWARE_BIND_HOST="{{ courseware_bind_host }}"
 COURSEWARE_URL_HOST="{{ courseware_url_host }}"
 COURSEWARE_OLLAMA_PORT="{{ courseware_ports.ollama }}"
 COURSEWARE_OPEN_WEBUI_PORT="{{ courseware_ports.open_webui }}"
-COURSEWARE_NETRON_PORT="{{ courseware_ports.netron }}"
+COURSEWARE_TRANSFORMERLAB_PORT="{{ courseware_ports.transformerlab }}"
 COURSEWARE_CHUNKVIZ_PORT="{{ courseware_ports.chunkviz }}"
 COURSEWARE_EMBEDDING_ATLAS_PORT="{{ courseware_ports.embedding_atlas }}"
 COURSEWARE_UNSLOTH_PORT="{{ courseware_ports.unsloth }}"
 COURSEWARE_PROMPTFOO_PORT="{{ courseware_ports.promptfoo }}"
 COURSEWARE_WIKI_PORT="{{ courseware_ports.wiki }}"
-COURSEWARE_WETTY_PORT="{{ courseware_ports.wetty }}"
-COURSEWARE_OLLAMA_MIN_VERSION="{{ courseware_ollama_min_version }}"
 OLLAMA_BIN="{{ courseware_ollama_bin }}"
 OLLAMA_MODELS_DIR="{{ courseware_ollama_models_dir }}"
 NODE_RUNTIME_BIN_DIR="{{ courseware_node_runtime_bin_dir }}"
-NETRON_VENV="{{ courseware_netron_venv_dir }}"
-WETTY_BIN="{{ courseware_wetty_dir }}/node_modules/.bin/wetty"
-COURSEWARE_WETTY_BASE_PATH="{{ courseware_wetty_base_path }}"
 OPEN_WEBUI_VENV="{{ courseware_venvs_dir }}/open-webui"
 OPEN_WEBUI_DATA_DIR="{{ courseware_state_dir }}/open-webui"
 CHUNKVIZ_DIR="{{ courseware_repos_dir }}/ChunkViz"
 EMBEDDING_ATLAS_VENV="{{ courseware_venvs_dir }}/embedding-atlas"
 TTPS_DATASET_PATH="{{ courseware_datasets_dir }}/ttps_dataset.parquet"
 WIKI_TEST_RAW_PATH="{{ courseware_datasets_dir }}/wiki.test.raw"
-COURSEWARE_OLLAMA_BASE_URL="http://{{ courseware_url_host }}:{{ courseware_ports.ollama }}"
-COURSEWARE_LAB1_LLAMA_MODEL_PATH="{{ courseware_lab1_llama_local_path }}"
-COURSEWARE_LAB1_OLLAMA_MODEL_ALIAS="{{ courseware_lab1_ollama_model_alias }}"
+TRANSFORMERLAB_DIR="{{ courseware_transformerlab_home }}"
+TRANSFORMERLAB_DEFAULT_USER_EMAIL="{{ courseware_transformerlab_default_user_email }}"
+TRANSFORMERLAB_DEFAULT_USER_PASSWORD="{{ courseware_transformerlab_default_user_password }}"
+TRANSFORMERLAB_DEFAULT_USER_FIRST_NAME="{{ courseware_transformerlab_default_user_first_name }}"
+TRANSFORMERLAB_DEFAULT_USER_LAST_NAME="{{ courseware_transformerlab_default_user_last_name }}"
 UNSLOTH_BIN="{{ ansible_env.HOME }}/.local/bin/unsloth"
 PROMPTFOO_DIR="{{ courseware_promptfoo_dir }}"
 PROMPTFOO_BIN="{{ courseware_tools_dir }}/promptfoo/node_modules/.bin/promptfoo"
 WIKI_DIR="{{ courseware_wiki_repo_dir }}"
|
||||||
WIKI_RUNTIME_CONFIG_PATH="{{ courseware_wiki_runtime_config_path }}"
|
|
||||||
LLAMA_CPP_BIN_DIR="{{ courseware_llama_cpp_bin_dir }}"
|
LLAMA_CPP_BIN_DIR="{{ courseware_llama_cpp_bin_dir }}"
|
||||||
|
KILN_LINUX_BIN="{{ courseware_apps_dir }}/kiln/Kiln"
|
||||||
|
KILN_MAC_APP="{{ courseware_apps_dir }}/Kiln.app"
|
||||||
|
KILN_LAUNCH_PATH="{% if ansible_system == 'Darwin' %}{{ courseware_apps_dir }}/Kiln.app{% else %}{{ courseware_apps_dir }}/kiln/Kiln{% endif %}"
|
||||||
|
|||||||
@@ -18,8 +18,6 @@ usage() {
 Usage:
   ./labctl up
   ./labctl down
-  ./labctl update_wiki
-  ./labctl ollama_models
   ./labctl preflight
   ./labctl versions
   ./labctl assets lab2 [--refresh]
@@ -27,11 +25,12 @@ Usage:
   ./labctl stop [all|service...]
   ./labctl status [all|service...]
   ./labctl urls
+  ./labctl open kiln
   ./labctl logs <service>
 EOF
 }
 
-netron_version() {
+transformerlab_version() {
   local version_file=$ROOT_DIR/ansible/group_vars/all.yml
 
   if [ ! -f "$version_file" ]; then
@@ -39,35 +38,21 @@ netron_version() {
     return
   fi
 
-  sed -nE 's/^courseware_netron_version:[[:space:]]*"([^"]+)".*/\1/p' "$version_file" | head -n 1
-}
-
-minimum_ollama_version() {
-  local version_file=$ROOT_DIR/ansible/group_vars/all.yml
-
-  if [ ! -f "$version_file" ]; then
-    printf '%s\n' "unknown"
-    return
-  fi
-
-  sed -nE 's/^courseware_ollama_min_version:[[:space:]]*"([^"]+)".*/\1/p' "$version_file" | head -n 1
+  sed -nE 's/^courseware_transformerlab_version:[[:space:]]*"([^"]+)".*/\1/p' "$version_file" | head -n 1
 }
 
 print_versions() {
   cat <<EOF
 Pinned component versions:
-  Netron: $(netron_version)
-  Minimum Ollama: $(minimum_ollama_version)
+  TransformerLab: $(transformerlab_version) (single-user pinned install)
   Ansible Core: 2.18.6
 EOF
 }
 
 confirm_installation() {
   local response
-  local pinned_netron
-  local min_ollama
-  pinned_netron=$(netron_version)
-  min_ollama=$(minimum_ollama_version)
+  local tlab_version
+  tlab_version=$(transformerlab_version)
 
   if [ ! -t 0 ]; then
     cat <<EOF >&2
@@ -75,15 +60,14 @@ WARNING: THIS SCRIPT WILL CONFIGURE YOUR ENVIRONMENT WILL THE FOLLOWING SOFTWARE
 
 - Ollama
 - llama.cpp
-- Netron (${pinned_netron})
+- TransformerLab (single-user pinned to ${tlab_version})
 - Open WebUI
 - ChunkViz
 - Embedding Atlas
 - Promptfoo
 - Unsloth Studio
-- Course-specific support assets for lab 1, lab 2, and lab 4
-- Pre-pulled Gemma 4 E2B Ollama models for Lab 1 and Lab 2
-- Lab 1 confidence support through Gemma 4 E2B Q4 (requires Ollama ${min_ollama}+)
+- Kiln Desktop
+- Course-specific support assets for lab 2 and lab 4
 
 IT IS RECOMMENDED TO RUN THIS IN AN ISLOATED ENVIRONMENT (Dedicated WSL, VM, etc.)
 
@@ -99,15 +83,14 @@ WARNING: THIS SCRIPT WILL CONFIGURE YOUR ENVIRONMENT WILL THE FOLLOWING SOFTWARE
 
 - Ollama
 - llama.cpp
-- Netron (${pinned_netron})
+- TransformerLab (single-user pinned to ${tlab_version})
 - Open WebUI
 - ChunkViz
 - Embedding Atlas
 - Promptfoo
 - Unsloth Studio
-- Course-specific support assets for lab 1, lab 2, and lab 4
-- Pre-pulled Gemma 4 E2B Ollama models for Lab 1 and Lab 2
-- Lab 1 confidence support through Gemma 4 E2B Q4 (requires Ollama ${min_ollama}+)
+- Kiln Desktop
+- Course-specific support assets for lab 2 and lab 4
 
 IT IS RECOMMENDED TO RUN THIS IN AN ISLOATED ENVIRONMENT (Dedicated WSL, VM, etc.)
 
@@ -129,16 +112,29 @@ host_is_wsl() {
   [ "$(uname -s)" = "Linux" ] && uname -r | grep -qiE 'microsoft|wsl'
 }
 
+host_is_macos() {
+  [ "$(uname -s)" = "Darwin" ]
+}
+
 host_is_linux() {
   [ "$(uname -s)" = "Linux" ]
 }
 
+host_is_arm_mac() {
+  host_is_macos && [ "$(uname -m)" = "arm64" ]
+}
+
 host_profile() {
   if host_is_wsl; then
     printf '%s\n' "wsl"
     return
   fi
 
+  if host_is_macos; then
+    printf '%s\n' "macos"
+    return
+  fi
+
   if host_is_linux && host_is_debian_family; then
     printf '%s\n' "native-debian-ubuntu"
     return
@@ -283,6 +279,7 @@ Python 3 was not found.
 
 Install it first, then rerun this command:
 - Debian/Ubuntu/WSL: sudo apt update && sudo apt install -y python3 python3-venv
+- macOS: brew install python@3.11
 EOF
   exit 1
 }
@@ -384,6 +381,7 @@ Python 3 is installed, but its virtual environment support is still unavailable.
 
 Install the missing venv package for your platform, then rerun this command:
 - Debian/Ubuntu/WSL: sudo apt update && sudo apt install -y python3-venv python3-pip
+- macOS: brew reinstall python@3.11
 EOF
   exit 1
 fi
@@ -507,15 +505,6 @@ require_arg() {
   fi
 }
 
-require_managed_runtime() {
-  if [ ! -f "$ROOT_DIR/state/runtime.env" ]; then
-    cat <<'EOF' >&2
-Missing state/runtime.env. Run ./labctl up first so the managed environment exists before using this command.
-EOF
-    exit 1
-  fi
-}
-
 handle_assets_command() {
   local asset_group=${1:-}
   shift || true
@@ -531,17 +520,6 @@ handle_assets_command() {
   esac
 }
 
-refresh_ollama_models() {
-  require_managed_runtime
-  run_playbook up.yml --tags ollama_models
-}
-
-update_wiki() {
-  require_managed_runtime
-  run_playbook up.yml --tags wiki -e "courseware_wiki_force_update=true"
-  run_project_script "$ROOT_DIR/scripts/service_manager.sh" restart-wiki
-}
-
 main() {
   local cmd=${1:-}
   shift || true
@@ -552,12 +530,6 @@ main() {
       run_playbook up.yml
       run_project_script "$ROOT_DIR/scripts/service_manager.sh" start all
       ;;
-    ollama_models)
-      refresh_ollama_models
-      ;;
-    update_wiki)
-      update_wiki
-      ;;
     down)
       run_project_script "$ROOT_DIR/scripts/service_manager.sh" stop all || true
       run_playbook down.yml
@@ -573,7 +545,7 @@ main() {
       require_arg "$@"
       handle_assets_command "$@"
       ;;
-    start|stop|status|urls|logs)
+    start|stop|status|urls|open|logs)
      exec bash "$ROOT_DIR/scripts/service_manager.sh" "$cmd" "$@"
      ;;
     ""|-h|--help|help)
+13 -35
@@ -15,21 +15,11 @@ load_runtime_env() {
   : "${COURSEWARE_STATE_DIR:=$STATE_DIR}"
   : "${COURSEWARE_BIND_HOST:=127.0.0.1}"
   : "${COURSEWARE_URL_HOST:=127.0.0.1}"
-  : "${COURSEWARE_NETRON_PORT:=8338}"
   : "${COURSEWARE_PROMPTFOO_PORT:=15500}"
   : "${COURSEWARE_WIKI_PORT:=80}"
-  : "${COURSEWARE_WETTY_PORT:=7681}"
-  : "${COURSEWARE_OLLAMA_MIN_VERSION:=0.12.11}"
-  : "${COURSEWARE_WETTY_BASE_PATH:=/wetty}"
   : "${NODE_RUNTIME_BIN_DIR:=$COURSEWARE_STATE_DIR/tools/node-runtime/node_modules/node/bin}"
-  : "${NETRON_VENV:=$COURSEWARE_STATE_DIR/venvs/netron}"
-  : "${WETTY_BIN:=$COURSEWARE_STATE_DIR/tools/wetty/node_modules/.bin/wetty}"
   : "${PROMPTFOO_DIR:=$COURSEWARE_STATE_DIR/lab6}"
   : "${WIKI_DIR:=$COURSEWARE_STATE_DIR/repos/LLM-Labs}"
-  : "${WIKI_RUNTIME_CONFIG_PATH:=$WIKI_DIR/public/courseware-runtime.json}"
-  : "${COURSEWARE_OLLAMA_BASE_URL:=http://$COURSEWARE_URL_HOST:$COURSEWARE_OLLAMA_PORT}"
-  : "${COURSEWARE_LAB1_LLAMA_MODEL_PATH:=$COURSEWARE_STATE_DIR/models/lab1/Llama-3.2-1B.Q4_K_M.gguf}"
-  : "${COURSEWARE_LAB1_OLLAMA_MODEL_ALIAS:=batiai/gemma4-e2b:q4}"
   : "${LLAMA_CPP_BIN_DIR:=$COURSEWARE_STATE_DIR/repos/llama.cpp/build/bin}"
 
   if [ -n "${OLLAMA_BIN:-}" ] && [[ "$OLLAMA_BIN" != */* ]] && command -v "$OLLAMA_BIN" >/dev/null 2>&1; then
@@ -48,13 +38,12 @@ service_list() {
   printf '%s\n' \
     "ollama" \
     "open-webui" \
-    "netron" \
+    "transformerlab" \
     "chunkviz" \
     "embedding-atlas" \
     "unsloth" \
     "promptfoo" \
-    "wiki" \
-    "wetty"
+    "wiki"
 }
 
 service_pid_file() {
@@ -69,13 +58,12 @@ service_port() {
   case "$1" in
     ollama) printf '%s\n' "${COURSEWARE_OLLAMA_PORT}" ;;
     open-webui) printf '%s\n' "${COURSEWARE_OPEN_WEBUI_PORT}" ;;
-    netron) printf '%s\n' "${COURSEWARE_NETRON_PORT}" ;;
+    transformerlab) printf '%s\n' "${COURSEWARE_TRANSFORMERLAB_PORT}" ;;
     chunkviz) printf '%s\n' "${COURSEWARE_CHUNKVIZ_PORT}" ;;
     embedding-atlas) printf '%s\n' "${COURSEWARE_EMBEDDING_ATLAS_PORT}" ;;
     unsloth) printf '%s\n' "${COURSEWARE_UNSLOTH_PORT}" ;;
     promptfoo) printf '%s\n' "${COURSEWARE_PROMPTFOO_PORT}" ;;
     wiki) printf '%s\n' "${COURSEWARE_WIKI_PORT}" ;;
-    wetty) printf '%s\n' "${COURSEWARE_WETTY_PORT}" ;;
     *) return 1 ;;
   esac
 }
@@ -84,13 +72,12 @@ service_url() {
   case "$1" in
     ollama) printf 'http://%s:%s\n' "$COURSEWARE_URL_HOST" "$COURSEWARE_OLLAMA_PORT" ;;
     open-webui) printf 'http://%s:%s\n' "$COURSEWARE_URL_HOST" "$COURSEWARE_OPEN_WEBUI_PORT" ;;
-    netron) printf 'http://%s:%s\n' "$COURSEWARE_URL_HOST" "$COURSEWARE_NETRON_PORT" ;;
+    transformerlab) printf 'http://%s:%s\n' "$COURSEWARE_URL_HOST" "$COURSEWARE_TRANSFORMERLAB_PORT" ;;
     chunkviz) printf 'http://%s:%s\n' "$COURSEWARE_URL_HOST" "$COURSEWARE_CHUNKVIZ_PORT" ;;
     embedding-atlas) printf 'http://%s:%s\n' "$COURSEWARE_URL_HOST" "$COURSEWARE_EMBEDDING_ATLAS_PORT" ;;
     unsloth) printf 'http://%s:%s\n' "$COURSEWARE_URL_HOST" "$COURSEWARE_UNSLOTH_PORT" ;;
     promptfoo) printf 'http://%s:%s\n' "$COURSEWARE_URL_HOST" "$COURSEWARE_PROMPTFOO_PORT" ;;
     wiki) printf 'http://%s:%s\n' "$COURSEWARE_URL_HOST" "$COURSEWARE_WIKI_PORT" ;;
-    wetty) printf 'http://%s:%s%s\n' "$COURSEWARE_URL_HOST" "$COURSEWARE_WETTY_PORT" "$COURSEWARE_WETTY_BASE_PATH" ;;
     *) return 1 ;;
   esac
 }
@@ -113,11 +100,14 @@ service_command() {
         "$COURSEWARE_BIND_HOST" \
         "$COURSEWARE_OPEN_WEBUI_PORT"
       ;;
-    netron)
-      printf 'exec "%s/bin/netron" --host %s --port %s --verbosity quiet' \
-        "$NETRON_VENV" \
+    transformerlab)
+      printf 'export PATH="%s/envs/transformerlab/bin:$PATH"; export VIRTUAL_ENV="%s/envs/transformerlab"; export CONDA_PREFIX="%s/envs/transformerlab"; cd "%s/src" && exec ./run.sh -c -h %s -p %s' \
+        "$TRANSFORMERLAB_DIR" \
+        "$TRANSFORMERLAB_DIR" \
+        "$TRANSFORMERLAB_DIR" \
+        "$TRANSFORMERLAB_DIR" \
         "$COURSEWARE_BIND_HOST" \
-        "$COURSEWARE_NETRON_PORT"
+        "$COURSEWARE_TRANSFORMERLAB_PORT"
       ;;
     chunkviz)
       printf 'cd "%s" && PATH="%s:$PATH" exec "./node_modules/.bin/serve" build -s -n -L -l tcp://%s:%s' \
@@ -127,7 +117,7 @@ service_command() {
         "$COURSEWARE_CHUNKVIZ_PORT"
       ;;
     embedding-atlas)
-      printf 'exec "%s/bin/embedding-atlas" "%s" --text "Scenario" --host %s --port %s --no-auto-port' \
+      printf 'exec "%s/bin/embedding-atlas" "%s" --text "Scenario" --host %s --port %s' \
        "$EMBEDDING_ATLAS_VENV" \
        "$TTPS_DATASET_PATH" \
        "$COURSEWARE_BIND_HOST" \
@@ -148,24 +138,12 @@ service_command() {
         "$COURSEWARE_PROMPTFOO_PORT"
       ;;
     wiki)
-      printf 'cd "%s" && PATH="%s:$PATH" exec env COURSEWARE_OLLAMA_BASE_URL="%s" COURSEWARE_LAB1_LLAMA_MODEL_PATH="%s" COURSEWARE_LAB1_OLLAMA_MODEL_ALIAS="%s" "./node_modules/.bin/next" start --hostname %s --port %s' \
+      printf 'cd "%s" && PATH="%s:$PATH" exec "./node_modules/.bin/next" start --hostname %s --port %s' \
         "$WIKI_DIR" \
         "$NODE_RUNTIME_BIN_DIR" \
-        "$COURSEWARE_OLLAMA_BASE_URL" \
-        "$COURSEWARE_LAB1_LLAMA_MODEL_PATH" \
-        "$COURSEWARE_LAB1_OLLAMA_MODEL_ALIAS" \
         "$COURSEWARE_BIND_HOST" \
         "$COURSEWARE_WIKI_PORT"
       ;;
-    wetty)
-      printf 'cd "%s" && PATH="%s:$PATH" exec "%s" --host %s --port %s --base %s --allow-iframe --ssh-host 127.0.0.1 --ssh-port 22 --ssh-auth password' \
-        "$COURSEWARE_ROOT" \
-        "$NODE_RUNTIME_BIN_DIR" \
-        "$WETTY_BIN" \
-        "$COURSEWARE_BIND_HOST" \
-        "$COURSEWARE_WETTY_PORT" \
-        "$COURSEWARE_WETTY_BASE_PATH"
-      ;;
     *)
       return 1
       ;;
@@ -4,9 +4,15 @@ from __future__ import annotations
 import argparse
 import asyncio
 import os
+import shutil
 import sys
 from pathlib import Path
 
+DEFAULT_WORKSPACE_PLUGINS = ("fastchat_server",)
+DEFAULT_WORKSPACE_EXPERIMENTS = ("alpha", "beta", "gamma")
+DEFAULT_WORKSPACE_MODELS = ("unsloth_Llama-3.2-1B-Instruct",)
+DEFAULT_MODEL_METADATA_FILES = ("_tlab_complete_provenance.json",)
+
 
 def parse_args() -> argparse.Namespace:
     parser = argparse.ArgumentParser(
@@ -36,6 +42,92 @@ def bootstrap_source(transformerlab_dir: Path) -> None:
         os.environ.setdefault(key.strip(), value.strip().strip('"').strip("'"))
 
 
+def target_workspace(transformerlab_dir: Path, team_id: str) -> Path:
+    return transformerlab_dir / "orgs" / team_id / "workspace"
+
+
+def workspace_team_id(workspace: Path, transformerlab_dir: Path) -> str | None:
+    orgs_dir = transformerlab_dir / "orgs"
+    try:
+        relative = workspace.relative_to(orgs_dir)
+    except ValueError:
+        return None
+
+    if len(relative.parts) >= 2 and relative.parts[1] == "workspace":
+        return relative.parts[0]
+    return None
+
+
+def candidate_workspaces(transformerlab_dir: Path, excluded_team_id: str) -> list[Path]:
+    candidates: list[Path] = []
+    root_workspace = transformerlab_dir / "workspace"
+    if root_workspace.is_dir():
+        candidates.append(root_workspace)
+
+    orgs_dir = transformerlab_dir / "orgs"
+    if not orgs_dir.is_dir():
+        return candidates
+
+    for workspace in sorted(orgs_dir.glob("*/workspace")):
+        if not workspace.is_dir():
+            continue
+        if workspace_team_id(workspace, transformerlab_dir) == excluded_team_id:
+            continue
+        candidates.append(workspace)
+    return candidates
+
+
+def copy_dir_if_missing(source: Path | None, target: Path, label: str) -> bool:
+    if source is None or not source.is_dir() or target.exists():
+        return False
+
+    target.parent.mkdir(parents=True, exist_ok=True)
+    shutil.copytree(source, target)
+    print(f"Seeded {label} from {source}.")
+    return True
+
+
+def copy_file_if_missing(source: Path | None, target: Path, label: str) -> bool:
+    if source is None or not source.is_file() or target.exists():
+        return False
+
+    target.parent.mkdir(parents=True, exist_ok=True)
+    shutil.copy2(source, target)
+    print(f"Seeded {label} from {source}.")
+    return True
+
+
+def find_workspace_seed(transformerlab_dir: Path, category: str, name: str, excluded_team_id: str) -> Path | None:
+    for workspace in candidate_workspaces(transformerlab_dir, excluded_team_id):
+        candidate = workspace / category / name
+        if candidate.exists():
+            return candidate
+    return None
+
+
+def seed_workspace(transformerlab_dir: Path, team_id: str) -> None:
+    workspace = target_workspace(transformerlab_dir, team_id)
+    workspace.mkdir(parents=True, exist_ok=True)
+
+    for plugin in DEFAULT_WORKSPACE_PLUGINS:
+        source = transformerlab_dir / "src" / "transformerlab" / "plugins" / plugin
+        copy_dir_if_missing(source, workspace / "plugins" / plugin, f"plugin '{plugin}'")
+
+    for experiment in DEFAULT_WORKSPACE_EXPERIMENTS:
+        source = find_workspace_seed(transformerlab_dir, "experiments", experiment, team_id)
+        copy_dir_if_missing(source, workspace / "experiments" / experiment, f"experiment '{experiment}'")
+
+    copied_model = False
+    for model in DEFAULT_WORKSPACE_MODELS:
+        source = find_workspace_seed(transformerlab_dir, "models", model, team_id)
+        copied_model = copy_dir_if_missing(source, workspace / "models" / model, f"model '{model}'") or copied_model
+
+    for metadata_name in DEFAULT_MODEL_METADATA_FILES:
+        source = find_workspace_seed(transformerlab_dir, "models", metadata_name, team_id)
+        if copied_model or source is not None:
+            copy_file_if_missing(source, workspace / "models" / metadata_name, f"model metadata '{metadata_name}'")
+
+
 async def ensure_user(args: argparse.Namespace) -> int:
     from sqlalchemy import select
     from transformerlab.db.constants import DATABASE_FILE_NAME
@@ -129,6 +221,8 @@ async def ensure_user(args: argparse.Namespace) -> int:
             await session.commit()
             print(f"Updated team role to owner for {args.email}.")
 
+    seed_workspace(Path(args.transformerlab_dir), str(user_team.team_id))
+
     action = "Created" if created else "Verified"
     print(f"{action} default TransformerLab user {args.email}.")
     return 0
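The workspace-seeding helpers introduced above all rely on one copy-if-missing idempotency rule: a seed is copied only when the target does not already exist, so re-running the installer never clobbers student work. A standalone sketch of that pattern (the paths and names here are illustrative, not the repo's):

```python
import shutil
import tempfile
from pathlib import Path


def copy_dir_if_missing(source: Path, target: Path) -> bool:
    """Copy `source` to `target` only when `target` does not exist yet."""
    if not source.is_dir() or target.exists():
        return False
    target.parent.mkdir(parents=True, exist_ok=True)
    shutil.copytree(source, target)
    return True


# Demo: the second call is a no-op, so repeated runs are safe.
root = Path(tempfile.mkdtemp())
seed = root / "seed"
seed.mkdir()
(seed / "config.json").write_text("{}")

dest = root / "workspace" / "seed"
first = copy_dir_if_missing(seed, dest)   # performs the copy
second = copy_dir_if_missing(seed, dest)  # skips: target already exists
print(first, second)
```

Because the guard checks `target.exists()` before copying, edits a student makes inside the seeded directory survive later `./labctl up` runs.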
@@ -0,0 +1,70 @@
+#!/usr/bin/env python3
+"""Patch pinned TransformerLab source to tolerate symlinked home directories."""
+
+from __future__ import annotations
+
+import argparse
+import re
+import sys
+from pathlib import Path
+
+
+PATCH_MARKER = "with symlinked TransformerLab home directories."
+TARGET_BLOCK = re.compile(
+    r"(?P<indent>[ \t]+)# The following prevents path traversal attacks:.*?"
+    r"(?P=indent)# now get the file contents",
+    re.DOTALL,
+)
+
+
+def parse_args() -> argparse.Namespace:
+    parser = argparse.ArgumentParser()
+    parser.add_argument("--transformerlab-dir", required=True)
+    return parser.parse_args()
+
+
+def repair_plugins_router(path: Path) -> bool:
+    source = path.read_text(encoding="utf-8")
+    if PATCH_MARKER in source:
+        return False
+
+    replacement = (
+        "    # The following prevents path traversal attacks while remaining compatible\n"
+        "    # with symlinked TransformerLab home directories.\n"
+        "    plugin_dir = Path(await lab_dirs.plugin_dir_by_name((pluginId)))\n"
+        "    resolved_plugin_dir = plugin_dir.resolve()\n"
+        '    final_path = (plugin_dir / f"(unknown){file_ext}").resolve()\n'
+        "\n"
+        "    try:\n"
+        "        final_path.relative_to(resolved_plugin_dir)\n"
+        "    except ValueError:\n"
+        '        return {"message": f"File (unknown){file_ext} is outside plugin directory"}\n'
+        "\n"
+        "    # now get the file contents"
+    )
+    updated, count = TARGET_BLOCK.subn(replacement, source, count=1)
+    if count != 1:
+        raise RuntimeError(f"Could not find path traversal block in {path}")
+
+    path.write_text(updated, encoding="utf-8")
+    return True
+
+
+def main() -> int:
+    args = parse_args()
+    root = Path(args.transformerlab_dir).expanduser().resolve()
+    plugins_router = root / "src" / "transformerlab" / "routers" / "experiment" / "plugins.py"
+    if not plugins_router.exists():
+        print(f"missing TransformerLab plugins router: {plugins_router}", file=sys.stderr)
+        return 1
+
+    changed = repair_plugins_router(plugins_router)
+    if changed:
+        print(f"patched {plugins_router}")
+    else:
+        print(f"already patched {plugins_router}")
+    return 0
+
+
+if __name__ == "__main__":
+    raise SystemExit(main())
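The new patch script above guards against double-application with a marker string: it searches for a sentinel before running a single-shot `re.subn`, and fails loudly when the target block is missing. That marker-guard idiom generalizes to any source-patching step; a minimal sketch (the marker and block text here are made up for illustration):

```python
import re

# Sentinel that proves the patch has already been applied.
MARKER = "# patched-for-symlinks"


def apply_once(source: str) -> tuple[str, bool]:
    """Apply a one-shot patch, returning (text, changed)."""
    if MARKER in source:
        return source, False  # already patched: no-op
    updated, count = re.subn(r"# ORIGINAL BLOCK", MARKER, source, count=1)
    if count != 1:
        # Refuse to guess when the expected code is absent (e.g. upstream changed).
        raise RuntimeError("target block not found")
    return updated, True


text = "before\n# ORIGINAL BLOCK\nafter\n"
text, changed_first = apply_once(text)
text, changed_second = apply_once(text)  # idempotent second pass
print(changed_first, changed_second)
```

Raising when `count != 1` is the important design choice: if the pinned upstream source drifts, the patch aborts instead of silently leaving the tree half-patched.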
+73 -120
@@ -9,66 +9,23 @@ load_runtime_env
 
 mkdir -p "$STATE_DIR/run" "$STATE_DIR/logs"
 
-check_wetty_prereqs() {
-  if [ ! -x "$WETTY_BIN" ]; then
-    echo "Missing WeTTY binary at $WETTY_BIN. Re-run ./labctl up." >&2
-    exit 1
-  fi
+ensure_transformerlab_default_user() {
+  local helper_python="${TRANSFORMERLAB_DIR}/envs/transformerlab/bin/python"
 
-  if [ ! -f "$WIKI_RUNTIME_CONFIG_PATH" ]; then
-    echo "Missing wiki runtime config at $WIKI_RUNTIME_CONFIG_PATH. Re-run ./labctl up." >&2
-    exit 1
-  fi
-
-  if ! python3 - <<'PY'
-import socket, sys
-sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
-sock.settimeout(1)
-try:
-    sock.connect(("127.0.0.1", 22))
-except OSError:
-    sys.exit(1)
-finally:
-    sock.close()
-PY
-  then
-    echo "Loopback sshd is not reachable on 127.0.0.1:22." >&2
-    exit 1
-  fi
-}
-
-ollama_version_gte_minimum() {
-  local version_output
-  local installed_version
-
-  if ! command -v "$OLLAMA_BIN" >/dev/null 2>&1; then
-    return 1
-  fi
-
-  version_output=$("$OLLAMA_BIN" --version 2>/dev/null || true)
-  installed_version=$(printf '%s' "$version_output" | grep -oE '[0-9]+\.[0-9]+\.[0-9]+' | head -n 1)
-
-  if [ -z "$installed_version" ]; then
-    return 1
-  fi
-
-  [ "$(printf '%s\n' "$COURSEWARE_OLLAMA_MIN_VERSION" "$installed_version" | sort -V | head -n 1)" = "$COURSEWARE_OLLAMA_MIN_VERSION" ]
-}
-
-assert_ollama_logprobs_support() {
-  if ollama_version_gte_minimum; then
+  if [ ! -x "$helper_python" ]; then
     return 0
   fi
 
-  local version_output
-  version_output=$("$OLLAMA_BIN" --version 2>/dev/null || printf 'unknown')
-
-  cat <<EOF >&2
-Lab 1 requires Ollama ${COURSEWARE_OLLAMA_MIN_VERSION} or newer because the confidence visualizer depends on logprobs.
-Installed version: ${version_output}
-Re-run ./labctl up after upgrading Ollama.
-EOF
-  exit 1
+  if [ -z "${TRANSFORMERLAB_DEFAULT_USER_EMAIL:-}" ] || [ -z "${TRANSFORMERLAB_DEFAULT_USER_PASSWORD:-}" ]; then
+    return 0
+  fi
+
+  "$helper_python" "$SCRIPT_DIR/ensure_transformerlab_user.py" \
+    --transformerlab-dir "$TRANSFORMERLAB_DIR" \
+    --email "$TRANSFORMERLAB_DEFAULT_USER_EMAIL" \
+    --password "$TRANSFORMERLAB_DEFAULT_USER_PASSWORD" \
+    --first-name "${TRANSFORMERLAB_DEFAULT_USER_FIRST_NAME:-Student}" \
+    --last-name "${TRANSFORMERLAB_DEFAULT_USER_LAST_NAME:-}" >>"$STATE_DIR/logs/transformerlab_default_user.log" 2>&1 || true
 }
 
 resolve_targets() {
@@ -112,18 +69,6 @@ is_running() {
   has_live_pid "$service" || service_ready "$service"
 }
 
-service_startup_attempts() {
-  case "$1" in
-    embedding-atlas)
-      # The first launch can be noticeably slower on cold environments.
-      printf '%s\n' 180
-      ;;
-    *)
-      printf '%s\n' 60
-      ;;
-  esac
-}
-
 service_ready() {
   local service=$1
 
@@ -131,10 +76,13 @@ service_ready() {
     ollama)
      curl -fsS "$(service_url "$service")/api/tags" >/dev/null 2>&1
       ;;
+    transformerlab)
+      curl -fsS "$(service_url "$service")/healthz" >/dev/null 2>&1
+      ;;
     promptfoo)
       curl -fsS "$(service_url "$service")/health" >/dev/null 2>&1
       ;;
-    open-webui|netron|chunkviz|embedding-atlas|unsloth|wiki|wetty)
+    open-webui|chunkviz|embedding-atlas|unsloth|wiki)
       curl -fsS "$(service_url "$service")" >/dev/null 2>&1
       ;;
     *)
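Each `service_ready` arm boils down to "which URL path answers once the service is up". A sketch that factors the same table into a lookup helper; the `service_health_path` name is illustrative (the real script inlines the `curl -fsS` probes), and the default arm simplifies the remaining services to a plain front-page fetch:

```shell
#!/bin/sh
# Path probed for readiness, per service, mirroring the case table in
# service_ready. The actual probe would then be:
#   curl -fsS "$(service_url "$svc")$(service_health_path "$svc")" >/dev/null 2>&1
service_health_path() {
  case "$1" in
    ollama)         printf '/api/tags\n' ;;
    transformerlab) printf '/healthz\n' ;;
    promptfoo)      printf '/health\n' ;;
    *)              printf '/\n' ;;
  esac
}
```

Keeping the path table separate from the probe makes it easy to see at a glance which services expose a dedicated health endpoint and which are checked by fetching their root page.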
@@ -154,22 +102,6 @@ service_listener_pids() {
     | sort -u
 }
 
-service_port_has_listener() {
-  local service=$1
-  local port
-
-  port=$(service_port "$service") || return 1
-  ss -ltnH "( sport = :$port )" 2>/dev/null | grep -q .
-}
-
-service_listener_details() {
-  local service=$1
-  local port
-
-  port=$(service_port "$service") || return 0
-  ss -ltnp "( sport = :$port )" 2>/dev/null || true
-}
-
 kill_pid_tree() {
   local signal=$1
   local pid=$2
@@ -205,18 +137,19 @@ start_one() {
   local pid_file
   local attempt
   local pid_grace_attempts=5
-  local startup_attempts
-
-  if [ "$service" = "ollama" ] || [ "$service" = "wiki" ]; then
-    assert_ollama_logprobs_support
-  fi
 
   if has_live_pid "$service"; then
+    if [ "$service" = "transformerlab" ]; then
+      ensure_transformerlab_default_user
+    fi
     echo "$service already running"
     return 0
   fi
 
   if service_ready "$service"; then
+    if [ "$service" = "transformerlab" ]; then
+      ensure_transformerlab_default_user
+    fi
     echo "$service already available"
     return 0
   fi
@@ -225,15 +158,31 @@ start_one() {
     open-webui)
       start_one ollama
       ;;
-    wetty)
-      check_wetty_prereqs
+    transformerlab)
+      if command -v python3 >/dev/null 2>&1; then
+        python3 "$SCRIPT_DIR/repair_transformerlab_symlink_paths.py" \
+          --transformerlab-dir "$TRANSFORMERLAB_DIR" >>"$STATE_DIR/logs/transformerlab_source_repairs.log" 2>&1 || true
+        python3 "$SCRIPT_DIR/repair_transformerlab_plugin_supports.py" \
+          --transformerlab-dir "$TRANSFORMERLAB_DIR" \
+          --plugin "fastchat_server" \
+          --required-support "chat" \
+          --required-support "completion" \
+          --required-support "visualize_model" \
+          --required-support "model_layers" \
+          --required-support "rag" \
+          --required-support "tools" \
+          --required-support "template" \
+          --required-support "embeddings" \
+          --required-support "tokenize" \
+          --required-support "logprobs" \
+          --required-support "batched" >>"$STATE_DIR/logs/transformerlab_plugin_supports.log" 2>&1 || true
+      fi
       ;;
     *)
       ;;
   esac
 
   cmd=$(service_command "$service")
-  startup_attempts=$(service_startup_attempts "$service")
   log_file=$(service_log_file "$service")
   pid_file=$(service_pid_file "$service")
 
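The new `transformerlab` arm follows a deliberately defensive shape: guard on `command -v`, append all output to a log, and suffix `|| true` so a failed repair can never block service startup. A generic sketch of the same shape (the helper name and log handling are mine):

```shell
#!/bin/sh
# Run a best-effort maintenance command: skip silently when python3 is
# missing, append stdout and stderr to a log, and never propagate failure.
run_best_effort() {
  log=$1
  shift
  if command -v python3 >/dev/null 2>&1; then
    "$@" >>"$log" 2>&1 || true
  fi
}
```

The `|| true` matters because a script run under `set -e` would otherwise abort the whole start sequence on the first failed repair.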
@@ -249,8 +198,11 @@ start_one() {
   fi
   echo $! >"$pid_file"
 
-  for attempt in $(seq 1 "$startup_attempts"); do
+  for attempt in $(seq 1 60); do
     if service_ready "$service"; then
+      if [ "$service" = "transformerlab" ]; then
+        ensure_transformerlab_default_user
+      fi
       echo "started $service"
       return 0
     fi
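The startup loop's shape, bounded attempts that re-probe readiness each pass, generalizes to any health check. A sketch, assuming a one-second pause between attempts (the pause itself falls outside the lines shown in this hunk):

```shell
#!/bin/sh
# Poll a readiness command up to a fixed number of attempts, mirroring the
# `for attempt in $(seq 1 60)` loop above. $1 is the check command, $2 the cap.
wait_until_ready() {
  check=$1
  attempts=$2
  for attempt in $(seq 1 "$attempts"); do
    if $check; then
      return 0
    fi
    sleep 1
  done
  return 1
}

wait_until_ready true 3 && echo "ready"
```

Hard-coding 60 attempts, as this commit does, trades the removed per-service `service_startup_attempts` tuning (180 attempts for embedding-atlas) for a single uniform cap.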
@@ -316,28 +268,6 @@ stop_one() {
   exit 1
 }
 
-restart_managed_wiki() {
-  local wiki_log_file
-  wiki_log_file=$(service_log_file wiki)
-
-  if has_live_pid wiki; then
-    stop_one wiki
-  fi
-
-  if service_port_has_listener wiki; then
-    cat <<EOF >&2
-Cannot restart wiki because port $(service_port wiki) is already in use by a non-managed listener.
-Listener details:
-$(service_listener_details wiki)
-Leave that process alone or move it off port $(service_port wiki), then rerun ./labctl update_wiki.
-Wiki log: $wiki_log_file
-EOF
-    exit 1
-  fi
-
-  start_one wiki
-}
-
 status_one() {
   local service=$1
 
@@ -354,17 +284,36 @@ urls() {
   cat <<EOF
 Ollama API: $(service_url ollama)
 Open WebUI: $(service_url open-webui)
-Netron: $(service_url netron)
+TransformerLab: $(service_url transformerlab)
 ChunkViz: $(service_url chunkviz)
 Embedding Atlas: $(service_url embedding-atlas)
 Unsloth Studio: $(service_url unsloth)
 Promptfoo CLI: $PROMPTFOO_BIN
 Promptfoo UI: $(service_url promptfoo)
 Wiki: $(service_url wiki)
-Lab 3 Terminal: $(service_url wetty)
+Kiln app: ${KILN_LAUNCH_PATH:-not installed}
 EOF
 }
 
+open_kiln() {
+  local host_os
+
+  host_os=$(uname -s)
+  if [ "$host_os" = "Darwin" ] && [ -d "$KILN_MAC_APP" ]; then
+    open "$KILN_MAC_APP"
+    return 0
+  fi
+
+  if [ -x "$KILN_LINUX_BIN" ]; then
+    nohup "$KILN_LINUX_BIN" >/dev/null 2>&1 &
+    echo "started Kiln from $KILN_LINUX_BIN"
+    return 0
+  fi
+
+  echo "Kiln is not installed." >&2
+  exit 1
+}
+
 show_logs() {
   local service=$1
   local log_file
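The dispatch order in `open_kiln`, a macOS app bundle first, then a Linux binary detached with `nohup`, is a reusable pattern. A sketch with the paths parameterized (the function name and arguments are illustrative, not from the script):

```shell
#!/bin/sh
# Try a macOS .app bundle, then fall back to a detached Linux binary --
# the same order open_kiln uses. Returns 1 when neither is available.
launch_desktop_app() {
  mac_app=$1
  linux_bin=$2
  if [ "$(uname -s)" = "Darwin" ] && [ -d "$mac_app" ]; then
    open "$mac_app"
    return 0
  fi
  if [ -x "$linux_bin" ]; then
    # Detach so the launcher script can exit without killing the app.
    nohup "$linux_bin" >/dev/null 2>&1 &
    return 0
  fi
  return 1
}
```

Checking `-d` for the bundle and `-x` for the binary means a half-installed app fails fast with a clear status instead of a cryptic `open`/exec error.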
@@ -406,6 +355,13 @@ main() {
     urls)
       urls
       ;;
+    open)
+      if [ "${1:-}" != "kiln" ]; then
+        echo "Only 'open kiln' is supported." >&2
+        exit 1
+      fi
+      open_kiln
+      ;;
     logs)
       if [ $# -ne 1 ]; then
         echo "Usage: ./labctl logs <service>" >&2
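The new `open)` branch validates its sub-argument before dispatching, with `"${1:-}"` guarding against an unset positional under `set -u`. The pattern in miniature (the echoed strings and function name here are illustrative):

```shell
#!/bin/sh
# Minimal subcommand dispatch in the style of main(): the first argument
# selects a branch, which validates any remaining arguments itself.
dispatch() {
  cmd=$1
  shift
  case "$cmd" in
    open)
      if [ "${1:-}" != "kiln" ]; then
        echo "Only 'open kiln' is supported." >&2
        return 1
      fi
      echo "opening kiln"
      ;;
    *)
      echo "Unknown command: $cmd" >&2
      return 1
      ;;
  esac
}
```

Errors go to stderr and the function returns nonzero, so callers can branch on the result while normal output stays clean.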
@@ -413,9 +369,6 @@ main() {
       fi
       show_logs "$1"
       ;;
-    restart-wiki)
-      restart_managed_wiki
-      ;;
     *)
       echo "Unknown command: $cmd" >&2
       exit 1