Provision Netron and Lab 1 local assets
@@ -10,14 +10,23 @@ This project builds a student-friendly local lab environment for the courseware
 
 - Ollama
 - `llama.cpp`
-- TransformerLab, pinned to the classic single-user `v0.28.2` release
+- Netron, served locally on port `8338`
 - Open WebUI
 - ChunkViz
 - Embedding Atlas
 - Promptfoo
 - Unsloth Studio
 - Kiln Desktop
-- Course-specific support assets for lab 2 and lab 4
+- Course-specific support assets for lab 1, lab 2, and lab 4
 
+## Lab 1 Defaults
+
+Lab 1 is now provisioned directly by the installer:
+
+- The `Qwen3-0.6B-Q8_0.gguf` and `Llama-3.2-1B.Q4_K_M.gguf` files are mirrored into `state/models/lab1/`.
+- The Qwen GGUF is pre-registered in Ollama as `lab1-qwen3-0.6b-q8_0`.
+- The wiki serves same-host download links for both GGUFs through `/api/lab1/models/...`.
+- Lab 1 confidence visualization requires Ollama `0.12.11` or newer because it depends on logprobs.
+
 ## Supported Host Profiles
 
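The minimum-version rule above can be checked from a plain shell before starting the lab. A minimal sketch — the `version_ge` helper and the inlined `0.12.11` floor are illustrative, not part of `labctl`:

```shell
#!/usr/bin/env sh
# Sketch: warn early when the installed Ollama predates logprobs support.
# `version_ge A B` is true when A >= B under `sort -V` version ordering.
version_ge() {
    [ "$(printf '%s\n%s\n' "$2" "$1" | sort -V | head -n1)" = "$2" ]
}

OLLAMA_MIN_VERSION="0.12.11"
# `ollama --version` prints a line containing the semantic version.
installed="$(ollama --version 2>/dev/null | grep -oE '[0-9]+\.[0-9]+\.[0-9]+' | head -n1)"

if [ -z "$installed" ] || ! version_ge "$installed" "$OLLAMA_MIN_VERSION"; then
    echo "Lab 1 needs Ollama >= $OLLAMA_MIN_VERSION for logprobs (found: ${installed:-none})" >&2
fi
```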
@@ -27,7 +36,7 @@ This build intentionally avoids the reference VM's hardware workarounds.
 - Native Debian/Ubuntu: Debian-family Linux with an NVIDIA GPU visible to `nvidia-smi` and at least 8 GB VRAM.
 - WSL: Debian/Ubuntu-family Linux running under WSL, with the NVIDIA GPU exposed into the distro.
 
-The launcher and Ansible preflight now classify the host dynamically and apply different setup behavior for:
+The launcher and Ansible preflight classify the host dynamically and apply different setup behavior for:
 
 - `macos`
 - `native-debian-ubuntu`
@@ -51,6 +60,7 @@ On Linux and WSL, the first `./labctl up` or `./labctl preflight` run may prompt
 On Ubuntu WSL x86_64, preflight now installs the Linux-side CUDA toolkit automatically if it is missing.
 
 It first tries the distro package:
 
 - `sudo apt install -y nvidia-cuda-toolkit`
+
 If that package is unavailable or still does not expose `nvcc`, the installer falls back to NVIDIA's WSL-Ubuntu repository bootstrap for the toolkit only, not a Linux GPU driver.
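The detect-then-fall-back order described above can be sketched as a shell outline. The function names are ours, not the installer's, and the fallback chain is shown rather than run:

```shell
#!/usr/bin/env sh
# Sketch of the preflight's nvcc discovery and install fallback.

# Echo the first usable nvcc: PATH first, then any candidate paths given.
find_nvcc() {
    command -v nvcc 2>/dev/null && return 0
    for candidate in "$@"; do
        [ -x "$candidate" ] && { echo "$candidate"; return 0; }
    done
    return 1
}

# Fallback chain (defined, not invoked here): distro package first, then
# NVIDIA's WSL-Ubuntu repository bootstrap -- toolkit only, never a driver.
ensure_cuda_toolkit() {
    find_nvcc /usr/local/cuda/bin/nvcc /usr/local/cuda-*/bin/nvcc >/dev/null && return 0
    sudo apt install -y nvidia-cuda-toolkit
    find_nvcc /usr/local/cuda/bin/nvcc /usr/local/cuda-*/bin/nvcc >/dev/null && return 0
    echo "would bootstrap NVIDIA's WSL-Ubuntu CUDA repo here" >&2
}

find_nvcc /usr/local/cuda/bin/nvcc /usr/local/cuda-*/bin/nvcc \
    || echo "nvcc not found; installer would run its fallback chain" >&2
```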
@@ -64,30 +74,28 @@ For non-Ubuntu WSL distros, install the CUDA toolkit manually before running the
 
 ## Native Debian/Ubuntu CUDA Behavior
 
-On native Debian/Ubuntu hosts, the installer now handles three CUDA-toolkit cases:
+On native Debian/Ubuntu hosts, the installer handles three CUDA-toolkit cases:
 
 - If the toolkit is already usable, it reuses the existing install instead of forcing a reinstall.
 - If the distro exposes `nvidia-cuda-toolkit`, it installs that package.
 - If the distro package is unavailable, it bootstraps NVIDIA's official CUDA network repository for supported native Debian/Ubuntu releases and installs the toolkit from there.
 
-If `apt` starts in a broken dependency state, the installer now attempts `dpkg --configure -a` and `apt-get --fix-broken install` before retrying package installation.
+If `apt` starts in a broken dependency state, the installer attempts `dpkg --configure -a` and `apt-get --fix-broken install` before retrying package installation.
 
-If CUDA is already mounted or preinstalled outside `PATH`, the installer now detects standard locations such as `/usr/local/cuda/bin/nvcc` and `/usr/local/cuda-*/bin/nvcc`.
+If CUDA is already mounted or preinstalled outside `PATH`, the installer detects standard locations such as `/usr/local/cuda/bin/nvcc` and `/usr/local/cuda-*/bin/nvcc`.
 
 ## Standard Assumptions
 
-- The host-side install path assumes modern local tooling, but TransformerLab itself is provisioned from a pinned classic single-user layout.
-- TransformerLab is intentionally pinned to the older single-user `v0.28.2` release because newer upstream releases changed the project structure and behavior in ways that break this courseware.
-- This project does not rely on TransformerLab's upstream `install.sh`; the Ansible role provisions the pinned release directly so web assets, env layout, and runtime behavior stay reproducible.
-- The courseware repairs installed TransformerLab Fastchat plugin manifests so Fastchat-gated features such as Model Architecture and Visualize Logprobs stay available on pinned installs.
-- No Ollama models are pulled during `./labctl up`; students pull models manually as part of the courseware.
-- WhiteRabbitNeo assets are handled separately from `./labctl up` and `./labctl preflight`.
-- Run `./labctl assets lab2` when you want to populate repo-local lab 2 assets in `assets/lab2/` from Hugging Face.
+- The default deployment is centered on Ollama-backed local inference and browser-based tools such as Netron and the wiki.
+- Netron is installed into a managed Python virtual environment and served locally instead of being provisioned as a desktop package.
+- Lab 1 model downloads are mirrored locally during `./labctl up`, so students do not have to fetch them manually from the original source.
+- WhiteRabbitNeo assets remain a separate Lab 2 flow and are still handled outside the default `./labctl up` run.
+- Run `./labctl assets lab2` when you want to populate repo-local Lab 2 assets in `assets/lab2/` from Hugging Face.
 - After base setup, run `state/lab2/download_whiterabbitneo-gguf.sh` to fetch only the `Q4_K_M`, `Q8_0`, and `IQ2_M` files from `bartowski/WhiteRabbitNeo_WhiteRabbitNeo-V3-7B-GGUF` and register local Ollama models `WhiteRabbitNeo`, `WhiteRabbitNeo-Q4`, `WhiteRabbitNeo-Q8`, and `WhiteRabbitNeo-IQ2`.
-- TransformerLab and Unsloth homes are redirected into this project's `state/` tree via symlinks.
+- The Unsloth home is redirected into this project's `state/` tree via a symlink.
 - Managed web services bind for access from both Linux and the Windows side of WSL, while `labctl urls` still reports localhost-friendly URLs.
 - The local Ansible bootstrap in `.venv-ansible/` is machine-specific and will be recreated automatically if the folder is copied between hosts.
-- `llama.cpp` now uses a conservative, memory-aware build parallelism setting instead of an unbounded `-j` build, which avoids OOM failures on smaller Linux and WSL hosts.
+- `llama.cpp` uses a conservative, memory-aware build parallelism setting instead of an unbounded `-j` build, which avoids OOM failures on smaller Linux and WSL hosts.
 
 ## Lab URLs
 
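The memory-aware `-j` choice in the last bullet can be sketched as follows. The ~2 GiB-per-job budget and the `build_jobs` helper are assumptions for illustration, not the role's exact figures:

```shell
#!/usr/bin/env sh
# Sketch: cap build parallelism by available memory as well as cores.
# Assumes ~2 GiB of headroom per compile job (illustrative figure).
build_jobs() {
    cores="${1:-$(nproc 2>/dev/null || echo 1)}"
    avail_kb="${2:-$(awk '/MemAvailable/ {print $2}' /proc/meminfo 2>/dev/null || echo 0)}"
    mem_jobs=$(( avail_kb / (2 * 1024 * 1024) ))   # jobs that fit in memory
    [ "$mem_jobs" -lt 1 ] && mem_jobs=1
    # Use the smaller of the memory-derived and core-derived job counts.
    if [ "$mem_jobs" -lt "$cores" ]; then echo "$mem_jobs"; else echo "$cores"; fi
}

echo "cmake --build build -j$(build_jobs)"
```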
@@ -97,7 +105,7 @@ Default endpoints:
 
 - Ollama API: `http://127.0.0.1:11434`
 - Open WebUI: `http://127.0.0.1:8080`
-- TransformerLab: `http://127.0.0.1:8338`
+- Netron: `http://127.0.0.1:8338`
 - ChunkViz: `http://127.0.0.1:3001`
 - Embedding Atlas: `http://127.0.0.1:5055`
 - Unsloth Studio: `http://127.0.0.1:8888`
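The endpoint table above can be exercised with a tiny lookup helper; ports are inlined here for illustration (the authoritative values are templated from `courseware_ports`):

```shell
#!/usr/bin/env sh
# Toy lookup mirroring the default endpoint table above.
lab_url() {
    case "$1" in
        ollama)          port=11434 ;;
        open-webui)      port=8080  ;;
        netron)          port=8338  ;;
        chunkviz)        port=3001  ;;
        embedding-atlas) port=5055  ;;
        unsloth)         port=8888  ;;
        *) echo "unknown service: $1" >&2; return 1 ;;
    esac
    echo "http://127.0.0.1:$port"
}

lab_url netron   # prints http://127.0.0.1:8338
```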
@@ -116,15 +124,14 @@ The deployment will:
 ## Notes
 
 - `./labctl up` installs the environment and then starts every managed service.
-- `./labctl versions` shows the pinned TransformerLab and Ansible runtime versions used by this workspace.
+- `./labctl versions` shows the pinned Netron version, minimum Ollama version, and Ansible runtime version used by this workspace.
 - `./labctl assets lab2` is a separate manual step that clones the base WhiteRabbitNeo repo into `assets/lab2/WhiteRabbitNeo-V3-7B` and downloads the supported `Q4_K_M`, `Q8_0`, and `IQ2_M` GGUFs into `assets/lab2/WhiteRabbitNeo_WhiteRabbitNeo-V3-7B-GGUF`.
-- TransformerLab is installed as a pinned single-user app and no default courseware-managed TransformerLab user is created automatically.
 - `./labctl start core` starts only `ollama` and `open-webui`.
 - `./labctl start all` starts every managed web service.
 - `./labctl open kiln` launches the Kiln desktop app installed into the project state.
 - The scripted Promptfoo install drops a starter config at `state/lab6/promptfoo.yaml`.
-- `labctl start all` now includes Promptfoo via `promptfoo view` and the cloned wiki app.
+- `labctl start all` includes Promptfoo via `promptfoo view` and the cloned wiki app.
 - Lab 2 includes `state/lab2/download_whiterabbitneo-gguf.sh`, which uses `git` + `git lfs` to pull only the supported WhiteRabbitNeo quants. Add `--download-only` if you want the files without Ollama registration.
 - The wiki is cloned from `https://git.zuccaro.me/bzuccaro/LLM-Labs.git` into `state/repos/LLM-Labs` and started with `npm`.
-- `./labctl down` now uninstalls Ollama entirely when this project installed it, instead of only stopping the service.
+- `./labctl down` uninstalls Ollama entirely when this project installed it, instead of only stopping the service.
 - Unsloth Studio currently supports chat and data workflows on macOS; Linux/WSL remains the standard path for NVIDIA-backed training.
@@ -9,15 +9,15 @@ courseware_datasets_dir: "{{ courseware_state_dir }}/datasets"
 courseware_tools_dir: "{{ courseware_state_dir }}/tools"
 courseware_apps_dir: "{{ courseware_state_dir }}/apps"
 courseware_downloads_dir: "{{ courseware_state_dir }}/downloads"
+courseware_lab1_dir: "{{ courseware_state_dir }}/lab1"
 courseware_lab2_dir: "{{ courseware_state_dir }}/lab2"
 courseware_lab6_dir: "{{ courseware_state_dir }}/lab6"
-courseware_transformerlab_legacy_home: "{{ courseware_state_dir }}/transformerlab-home"
-courseware_safe_homes_dir: "{{ lookup('env', 'HOME') }}/.local/share/local-lab-deployment"
-courseware_transformerlab_home: "{{ (courseware_safe_homes_dir ~ '/transformerlab-home') if ' ' in courseware_root else courseware_transformerlab_legacy_home }}"
 courseware_unsloth_home: "{{ courseware_state_dir }}/unsloth-home"
+courseware_lab1_models_dir: "{{ courseware_models_dir }}/lab1"
 courseware_ollama_models_dir: "{{ courseware_models_dir }}/ollama"
 courseware_node_runtime_dir: "{{ courseware_tools_dir }}/node-runtime"
 courseware_node_runtime_bin_dir: "{{ courseware_node_runtime_dir }}/node_modules/node/bin"
+courseware_netron_venv_dir: "{{ courseware_venvs_dir }}/netron"
 courseware_wetty_dir: "{{ courseware_tools_dir }}/wetty"
 courseware_promptfoo_dir: "{{ courseware_lab6_dir }}"
 courseware_wiki_repo_dir: "{{ courseware_repos_dir }}/LLM-Labs"
@@ -29,7 +29,7 @@ courseware_url_host: "127.0.0.1"
 courseware_ports:
   ollama: 11434
   open_webui: 8080
-  transformerlab: 8338
+  netron: 8338
   chunkviz: 3001
   embedding_atlas: 5055
   unsloth: 8888
@@ -37,30 +37,8 @@ courseware_ports:
   wiki: 80
   wetty: 7681
 
-courseware_transformerlab_install_mode: "single-user-pinned"
-courseware_transformerlab_version: "v0.28.2"
-courseware_transformerlab_version_dir: "{{ courseware_transformerlab_version | regex_replace('^v', '') }}"
-courseware_transformerlab_source_archive: "{{ courseware_downloads_dir }}/transformerlab-app-{{ courseware_transformerlab_version_dir }}.tar.gz"
-courseware_transformerlab_web_archive: "{{ courseware_downloads_dir }}/transformerlab-web-{{ courseware_transformerlab_version_dir }}.tar.gz"
-courseware_transformerlab_miniforge_installer: "{{ courseware_downloads_dir }}/transformerlab-miniforge-installer.sh"
-courseware_transformerlab_default_user_email: "student@zuccaro.me"
-courseware_transformerlab_default_user_password: "student"
-courseware_transformerlab_default_user_first_name: "Student"
-courseware_transformerlab_default_user_last_name: ""
-courseware_transformerlab_required_loader_plugins:
-  - "fastchat_server"
-courseware_transformerlab_required_supports_fastchat:
-  - "chat"
-  - "completion"
-  - "visualize_model"
-  - "model_layers"
-  - "rag"
-  - "tools"
-  - "template"
-  - "embeddings"
-  - "tokenize"
-  - "logprobs"
-  - "batched"
+courseware_netron_version: "9.0.1"
+courseware_ollama_min_version: "0.12.11"
 courseware_llama_cpp_commit: "51fa458a92d6a3f305f8fd76fc8f702e3e87ddb5"
 courseware_chunkviz_commit: "a891eacafda1f28a12373ad3b00102e68f07c57f"
 courseware_promptfoo_version: "0.119.0"
@@ -72,6 +50,13 @@ courseware_wiki_repo: "https://git.zuccaro.me/bzuccaro/LLM-Labs.git"
 
 courseware_open_webui_spec: "open-webui"
 courseware_embedding_atlas_spec: "embedding-atlas"
+courseware_lab1_qwen_filename: "Qwen3-0.6B-Q8_0.gguf"
+courseware_lab1_qwen_download_url: "https://huggingface.co/Qwen/Qwen3-0.6B-GGUF/resolve/main/Qwen3-0.6B-Q8_0.gguf?download=true"
+courseware_lab1_qwen_local_path: "{{ courseware_lab1_models_dir }}/{{ courseware_lab1_qwen_filename }}"
+courseware_lab1_qwen_model_alias: "lab1-qwen3-0.6b-q8_0"
+courseware_lab1_llama_filename: "Llama-3.2-1B.Q4_K_M.gguf"
+courseware_lab1_llama_download_url: "https://huggingface.co/DevQuasar-3/meta-llama.Llama-3.2-1B-GGUF/resolve/main/Llama-3.2-1B.Q4_K_M.gguf?download=true"
+courseware_lab1_llama_local_path: "{{ courseware_lab1_models_dir }}/{{ courseware_lab1_llama_filename }}"
 
 courseware_white_rabbit_repo: "bartowski/WhiteRabbitNeo_WhiteRabbitNeo-V3-7B-GGUF"
 courseware_white_rabbit_variants:
@@ -150,7 +135,7 @@ courseware_ollama_install_marker: "{{ courseware_markers_dir }}/ollama-installed
 courseware_services:
   - "ollama"
   - "open-webui"
-  - "transformerlab"
+  - "netron"
   - "chunkviz"
   - "embedding-atlas"
   - "unsloth"
@@ -12,17 +12,6 @@
   changed_when: false
   failed_when: false
 
-- name: Stat managed TransformerLab symlink
-  stat:
-    path: "{{ ansible_env.HOME }}/.transformerlab"
-    follow: false
-  register: courseware_down_transformerlab
-
-- name: Stat managed TransformerLab marker
-  stat:
-    path: "{{ ansible_env.HOME }}/.transformerlab/.courseware-managed"
-  register: courseware_down_transformerlab_marker
-
 - name: Stat managed Unsloth symlink
   stat:
     path: "{{ ansible_env.HOME }}/.unsloth"
@@ -34,11 +23,6 @@
     path: "{{ ansible_env.HOME }}/.unsloth/.courseware-managed"
   register: courseware_down_unsloth_marker
 
-- name: Stat conda environments file
-  stat:
-    path: "{{ ansible_env.HOME }}/.conda/environments.txt"
-  register: courseware_down_conda_envs
-
 - name: Stat courseware-managed Ollama install marker
   stat:
     path: "{{ courseware_ollama_install_marker }}"
@@ -172,39 +156,6 @@
   changed_when: false
   failed_when: false
 
-- name: Remove managed TransformerLab conda environment entry
-  lineinfile:
-    path: "{{ ansible_env.HOME }}/.conda/environments.txt"
-    regexp: "^{{ (courseware_transformerlab_home ~ '/envs/transformerlab') | regex_escape() }}$"
-    state: absent
-  when: courseware_down_conda_envs.stat.exists
-  failed_when: false
-
-- name: Remove managed TransformerLab path
-  file:
-    path: "{{ ansible_env.HOME }}/.transformerlab"
-    state: absent
-  when:
-    - courseware_down_transformerlab.stat.exists
-    - >
-      (courseware_down_transformerlab.stat.islnk and
-      courseware_down_transformerlab.stat.lnk_source == courseware_transformerlab_home)
-      or courseware_down_transformerlab_marker.stat.exists
-  failed_when: false
-
-- name: Remove managed TransformerLab home
-  file:
-    path: "{{ courseware_transformerlab_home }}"
-    state: absent
-  failed_when: false
-
-- name: Remove legacy managed TransformerLab home
-  file:
-    path: "{{ courseware_transformerlab_legacy_home }}"
-    state: absent
-  when: courseware_transformerlab_legacy_home != courseware_transformerlab_home
-  failed_when: false
-
 - name: Remove managed Unsloth path
   file:
     path: "{{ ansible_env.HOME }}/.unsloth"
@@ -6,11 +6,12 @@
     - { role: preflight, tags: ["preflight"] }
     - directories
     - packages
+    - netron
+    - lab1_assets
     - lab_assets
     - node_runtime
     - { role: terminal, when: ansible_system == "Linux" }
     - llama_cpp
-    - transformerlab
     - open_webui
     - chunkviz
     - promptfoo
@@ -15,10 +15,10 @@
     - "{{ courseware_tools_dir }}"
     - "{{ courseware_apps_dir }}"
     - "{{ courseware_downloads_dir }}"
+    - "{{ courseware_lab1_dir }}"
     - "{{ courseware_lab2_dir }}"
-    - "{{ courseware_safe_homes_dir }}"
-    - "{{ courseware_transformerlab_home }}"
     - "{{ courseware_unsloth_home }}"
+    - "{{ courseware_lab1_models_dir }}"
     - "{{ courseware_ollama_models_dir }}"
 
 - name: Seed managed ownership markers
@@ -27,40 +27,8 @@
     state: touch
     mode: "0644"
   loop:
-    - "{{ courseware_transformerlab_home }}/.courseware-managed"
     - "{{ courseware_unsloth_home }}/.courseware-managed"
 
-- name: Check existing TransformerLab path
-  stat:
-    path: "{{ ansible_env.HOME }}/.transformerlab"
-    follow: false
-  register: courseware_transformerlab_link
-
-- name: Check existing TransformerLab ownership marker
-  stat:
-    path: "{{ ansible_env.HOME }}/.transformerlab/.courseware-managed"
-  register: courseware_transformerlab_marker
-
-- name: Fail if TransformerLab path is already occupied
-  fail:
-    msg: "{{ ansible_env.HOME }}/.transformerlab already exists and is not managed by this project."
-  when:
-    - courseware_transformerlab_link.stat.exists
-    - >
-      (
-        (not courseware_transformerlab_link.stat.islnk) or
-        (courseware_transformerlab_link.stat.islnk and
-        courseware_transformerlab_link.stat.lnk_source != courseware_transformerlab_home)
-      ) and
-      (not courseware_transformerlab_marker.stat.exists)
-
-- name: Link TransformerLab home into project state
-  file:
-    src: "{{ courseware_transformerlab_home }}"
-    dest: "{{ ansible_env.HOME }}/.transformerlab"
-    state: link
-    force: true
-
 - name: Check existing Unsloth path
   stat:
     path: "{{ ansible_env.HOME }}/.unsloth"
@@ -0,0 +1,71 @@
+- name: Ensure Lab 1 model directory exists
+  file:
+    path: "{{ courseware_lab1_models_dir }}"
+    state: directory
+    mode: "0755"
+
+- name: Check installed Ollama version
+  command:
+    argv:
+      - "{{ courseware_ollama_bin }}"
+      - --version
+  register: courseware_lab1_ollama_version
+  changed_when: false
+
+- name: Extract installed Ollama semantic version
+  set_fact:
+    courseware_lab1_ollama_semver: >-
+      {{
+        courseware_lab1_ollama_version.stdout
+        | regex_search('[0-9]+\\.[0-9]+\\.[0-9]+')
+        | default('')
+      }}
+
+- name: Fail when Ollama is too old for Lab 1 logprobs
+  fail:
+    msg: >-
+      Lab 1 requires Ollama {{ courseware_ollama_min_version }} or newer because
+      the confidence visualizer depends on logprob support. Installed version:
+      {{ courseware_lab1_ollama_version.stdout | trim }}.
+  when:
+    - courseware_lab1_ollama_semver | length == 0
+      or not (courseware_lab1_ollama_semver is version(courseware_ollama_min_version, '>='))
+
+- name: Download mirrored Lab 1 Qwen model
+  get_url:
+    url: "{{ courseware_lab1_qwen_download_url }}"
+    dest: "{{ courseware_lab1_qwen_local_path }}"
+    mode: "0644"
+
+- name: Download mirrored Lab 1 Llama model
+  get_url:
+    url: "{{ courseware_lab1_llama_download_url }}"
+    dest: "{{ courseware_lab1_llama_local_path }}"
+    mode: "0644"
+
+- name: Write Lab 1 Ollama Modelfile
+  copy:
+    dest: "{{ courseware_lab1_dir }}/Modelfile.{{ courseware_lab1_qwen_model_alias }}"
+    mode: "0644"
+    content: |
+      FROM {{ courseware_lab1_qwen_local_path }}
+
+- name: Start Ollama before Lab 1 model registration
+  command:
+    argv:
+      - "{{ courseware_root }}/scripts/service_manager.sh"
+      - start
+      - ollama
+  changed_when: false
+
+- name: Register Lab 1 Qwen model with Ollama
+  command:
+    argv:
+      - "{{ courseware_ollama_bin }}"
+      - create
+      - "{{ courseware_lab1_qwen_model_alias }}"
+      - -f
+      - "{{ courseware_lab1_dir }}/Modelfile.{{ courseware_lab1_qwen_model_alias }}"
+  environment:
+    OLLAMA_HOST: "{{ courseware_bind_host }}:{{ courseware_ports.ollama }}"
+    OLLAMA_MODELS: "{{ courseware_ollama_models_dir }}"
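For reference, the tasks in this new file boil down to a short manual flow. A sketch with the network and registration steps left commented; the paths mirror the role's defaults, and `ollama` must already be running for the `create` step:

```shell
#!/usr/bin/env sh
# Manual equivalent of the Lab 1 asset tasks above.
set -eu

MODELS_DIR="state/models/lab1"
LAB1_DIR="state/lab1"
ALIAS="lab1-qwen3-0.6b-q8_0"
GGUF="$MODELS_DIR/Qwen3-0.6B-Q8_0.gguf"

mkdir -p "$MODELS_DIR" "$LAB1_DIR"

# Mirror step (the role uses get_url; curl -L is the manual stand-in):
# curl -L -o "$GGUF" "https://huggingface.co/Qwen/Qwen3-0.6B-GGUF/resolve/main/Qwen3-0.6B-Q8_0.gguf?download=true"

# A Modelfile whose only directive is the local GGUF path.
printf 'FROM %s\n' "$GGUF" > "$LAB1_DIR/Modelfile.$ALIAS"

# Registration against a running Ollama:
# ollama create "$ALIAS" -f "$LAB1_DIR/Modelfile.$ALIAS"
cat "$LAB1_DIR/Modelfile.$ALIAS"   # prints: FROM state/models/lab1/Qwen3-0.6B-Q8_0.gguf
```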
@@ -0,0 +1,30 @@
+- name: Create Netron virtual environment
+  command:
+    argv:
+      - "{{ courseware_python_bin }}"
+      - -m
+      - venv
+      - "{{ courseware_netron_venv_dir }}"
+  args:
+    creates: "{{ courseware_netron_venv_dir }}/bin/python"
+
+- name: Upgrade Netron venv tooling
+  command:
+    argv:
+      - "{{ courseware_netron_venv_dir }}/bin/python"
+      - -m
+      - pip
+      - install
+      - --upgrade
+      - pip
+      - setuptools
+      - wheel
+
+- name: Install Netron server runtime
+  command:
+    argv:
+      - "{{ courseware_netron_venv_dir }}/bin/python"
+      - -m
+      - pip
+      - install
+      - "netron=={{ courseware_netron_version }}"
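The three tasks above map to three commands. A dry-run sketch that echoes each step instead of executing it (the `run` wrapper and the `state/venvs/netron` path are illustrative; swap the echo for `"$@"` to provision for real):

```shell
#!/usr/bin/env sh
# Dry-run of the Netron provisioning steps above.
set -eu

VENV="state/venvs/netron"
NETRON_VERSION="9.0.1"

run() { echo "+ $*"; }   # replace the echo with "$@" to actually execute

run python3 -m venv "$VENV"
run "$VENV/bin/python" -m pip install --upgrade pip setuptools wheel
run "$VENV/bin/python" -m pip install "netron==$NETRON_VERSION"
```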
@@ -7,7 +7,7 @@
 - name: Create Open WebUI virtual environment
   command:
     argv:
-      - "{{ courseware_transformerlab_home }}/envs/transformerlab/bin/python"
+      - "{{ courseware_python_bin }}"
       - -m
       - venv
       - "{{ courseware_venvs_dir }}/open-webui"
@@ -39,7 +39,7 @@
 - name: Create Embedding Atlas virtual environment
   command:
     argv:
-      - "{{ courseware_transformerlab_home }}/envs/transformerlab/bin/python"
+      - "{{ courseware_python_bin }}"
       - -m
       - venv
       - "{{ courseware_venvs_dir }}/embedding-atlas"
@@ -1,3 +1,4 @@
 {
+  "lab1NetronUrl": "http://{{ courseware_url_host }}:{{ courseware_ports.netron }}",
   "lab3TerminalUrl": "http://{{ courseware_url_host }}:{{ courseware_ports.wetty }}{{ courseware_wetty_base_path }}"
 }
@@ -30,14 +30,18 @@
   ansible.builtin.import_role:
     name: ollama
 
+- name: Include Netron setup
+  ansible.builtin.import_role:
+    name: netron
+
+- name: Include Lab 1 asset setup
+  ansible.builtin.import_role:
+    name: lab1_assets
+
 - name: Include llama.cpp setup
   ansible.builtin.import_role:
     name: llama-cpp
 
-- name: Include Transformer Lab setup
-  ansible.builtin.import_role:
-    name: transformerlab
-
 - name: Include Unsloth Studio setup
   ansible.builtin.import_role:
     name: unsloth
@@ -5,16 +5,18 @@ COURSEWARE_BIND_HOST="{{ courseware_bind_host }}"
 COURSEWARE_URL_HOST="{{ courseware_url_host }}"
 COURSEWARE_OLLAMA_PORT="{{ courseware_ports.ollama }}"
 COURSEWARE_OPEN_WEBUI_PORT="{{ courseware_ports.open_webui }}"
-COURSEWARE_TRANSFORMERLAB_PORT="{{ courseware_ports.transformerlab }}"
+COURSEWARE_NETRON_PORT="{{ courseware_ports.netron }}"
 COURSEWARE_CHUNKVIZ_PORT="{{ courseware_ports.chunkviz }}"
 COURSEWARE_EMBEDDING_ATLAS_PORT="{{ courseware_ports.embedding_atlas }}"
 COURSEWARE_UNSLOTH_PORT="{{ courseware_ports.unsloth }}"
 COURSEWARE_PROMPTFOO_PORT="{{ courseware_ports.promptfoo }}"
 COURSEWARE_WIKI_PORT="{{ courseware_ports.wiki }}"
 COURSEWARE_WETTY_PORT="{{ courseware_ports.wetty }}"
+COURSEWARE_OLLAMA_MIN_VERSION="{{ courseware_ollama_min_version }}"
 OLLAMA_BIN="{{ courseware_ollama_bin }}"
 OLLAMA_MODELS_DIR="{{ courseware_ollama_models_dir }}"
 NODE_RUNTIME_BIN_DIR="{{ courseware_node_runtime_bin_dir }}"
+NETRON_VENV="{{ courseware_netron_venv_dir }}"
 WETTY_BIN="{{ courseware_wetty_dir }}/node_modules/.bin/wetty"
 COURSEWARE_WETTY_BASE_PATH="{{ courseware_wetty_base_path }}"
 OPEN_WEBUI_VENV="{{ courseware_venvs_dir }}/open-webui"
@@ -23,11 +25,10 @@ CHUNKVIZ_DIR="{{ courseware_repos_dir }}/ChunkViz"
|
|||||||
EMBEDDING_ATLAS_VENV="{{ courseware_venvs_dir }}/embedding-atlas"
|
EMBEDDING_ATLAS_VENV="{{ courseware_venvs_dir }}/embedding-atlas"
|
||||||
TTPS_DATASET_PATH="{{ courseware_datasets_dir }}/ttps_dataset.parquet"
|
TTPS_DATASET_PATH="{{ courseware_datasets_dir }}/ttps_dataset.parquet"
|
||||||
WIKI_TEST_RAW_PATH="{{ courseware_datasets_dir }}/wiki.test.raw"
|
WIKI_TEST_RAW_PATH="{{ courseware_datasets_dir }}/wiki.test.raw"
|
||||||
TRANSFORMERLAB_DIR="{{ courseware_transformerlab_home }}"
|
COURSEWARE_OLLAMA_BASE_URL="http://{{ courseware_url_host }}:{{ courseware_ports.ollama }}"
|
||||||
TRANSFORMERLAB_DEFAULT_USER_EMAIL="{{ courseware_transformerlab_default_user_email }}"
|
COURSEWARE_LAB1_QWEN_MODEL_PATH="{{ courseware_lab1_qwen_local_path }}"
|
||||||
TRANSFORMERLAB_DEFAULT_USER_PASSWORD="{{ courseware_transformerlab_default_user_password }}"
|
COURSEWARE_LAB1_LLAMA_MODEL_PATH="{{ courseware_lab1_llama_local_path }}"
|
||||||
TRANSFORMERLAB_DEFAULT_USER_FIRST_NAME="{{ courseware_transformerlab_default_user_first_name }}"
|
COURSEWARE_LAB1_OLLAMA_MODEL_ALIAS="{{ courseware_lab1_qwen_model_alias }}"
|
||||||
TRANSFORMERLAB_DEFAULT_USER_LAST_NAME="{{ courseware_transformerlab_default_user_last_name }}"
|
|
||||||
UNSLOTH_BIN="{{ ansible_env.HOME }}/.local/bin/unsloth"
|
UNSLOTH_BIN="{{ ansible_env.HOME }}/.local/bin/unsloth"
|
||||||
PROMPTFOO_DIR="{{ courseware_promptfoo_dir }}"
|
PROMPTFOO_DIR="{{ courseware_promptfoo_dir }}"
|
||||||
PROMPTFOO_BIN="{{ courseware_tools_dir }}/promptfoo/node_modules/.bin/promptfoo"
|
PROMPTFOO_BIN="{{ courseware_tools_dir }}/promptfoo/node_modules/.bin/promptfoo"
|
||||||
|
|||||||
Executable → Regular
@@ -30,7 +30,7 @@ Usage:
 EOF
 }
 
-transformerlab_version() {
+netron_version() {
   local version_file=$ROOT_DIR/ansible/group_vars/all.yml
 
   if [ ! -f "$version_file" ]; then
@@ -38,21 +38,35 @@ transformerlab_version() {
     return
   fi
 
-  sed -nE 's/^courseware_transformerlab_version:[[:space:]]*"([^"]+)".*/\1/p' "$version_file" | head -n 1
+  sed -nE 's/^courseware_netron_version:[[:space:]]*"([^"]+)".*/\1/p' "$version_file" | head -n 1
+}
+
+minimum_ollama_version() {
+  local version_file=$ROOT_DIR/ansible/group_vars/all.yml
+
+  if [ ! -f "$version_file" ]; then
+    printf '%s\n' "unknown"
+    return
+  fi
+
+  sed -nE 's/^courseware_ollama_min_version:[[:space:]]*"([^"]+)".*/\1/p' "$version_file" | head -n 1
 }
 
 print_versions() {
   cat <<EOF
 Pinned component versions:
-  TransformerLab: $(transformerlab_version) (single-user pinned install)
+  Netron: $(netron_version)
+  Minimum Ollama: $(minimum_ollama_version)
   Ansible Core: 2.18.6
 EOF
 }
 
 confirm_installation() {
   local response
-  local tlab_version
-  tlab_version=$(transformerlab_version)
+  local pinned_netron
+  local min_ollama
+  pinned_netron=$(netron_version)
+  min_ollama=$(minimum_ollama_version)
 
   if [ ! -t 0 ]; then
     cat <<EOF >&2
@@ -60,14 +74,15 @@ WARNING: THIS SCRIPT WILL CONFIGURE YOUR ENVIRONMENT WILL THE FOLLOWING SOFTWARE
 
 - Ollama
 - llama.cpp
-- TransformerLab (single-user pinned to ${tlab_version})
+- Netron (${pinned_netron})
 - Open WebUI
 - ChunkViz
 - Embedding Atlas
 - Promptfoo
 - Unsloth Studio
 - Kiln Desktop
-- Course-specific support assets for lab 2 and lab 4
+- Course-specific support assets for lab 1, lab 2, and lab 4
+- A pre-registered Lab 1 Ollama model (requires Ollama ${min_ollama}+)
 
 IT IS RECOMMENDED TO RUN THIS IN AN ISLOATED ENVIRONMENT (Dedicated WSL, VM, etc.)
 
@@ -83,14 +98,15 @@ WARNING: THIS SCRIPT WILL CONFIGURE YOUR ENVIRONMENT WILL THE FOLLOWING SOFTWARE
 
 - Ollama
 - llama.cpp
-- TransformerLab (single-user pinned to ${tlab_version})
+- Netron (${pinned_netron})
 - Open WebUI
 - ChunkViz
 - Embedding Atlas
 - Promptfoo
 - Unsloth Studio
 - Kiln Desktop
-- Course-specific support assets for lab 2 and lab 4
+- Course-specific support assets for lab 1, lab 2, and lab 4
+- A pre-registered Lab 1 Ollama model (requires Ollama ${min_ollama}+)
 
 IT IS RECOMMENDED TO RUN THIS IN AN ISLOATED ENVIRONMENT (Dedicated WSL, VM, etc.)
 
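As an aside, the version helpers in the installer above all share one pattern: a `sed -nE` capture over `ansible/group_vars/all.yml` that pulls a quoted value out of a single key. A minimal standalone sketch of that extraction (the `"1.2.3"` value is illustrative only, not the real pin):

```shell
#!/bin/sh
# Sketch of the sed capture used by netron_version() and
# minimum_ollama_version(); the version value is made up for the demo.
version_file=$(mktemp)
printf '%s\n' 'courseware_netron_version: "1.2.3"' > "$version_file"

# -n suppresses auto-print, -E enables extended regexes; only lines that
# match the key print their captured quoted value, and head -n 1 guards
# against an accidentally duplicated key.
sed -nE 's/^courseware_netron_version:[[:space:]]*"([^"]+)".*/\1/p' "$version_file" | head -n 1

rm -f "$version_file"
```

Because the pattern anchors on `^courseware_netron_version:`, commented-out or indented copies of the key are ignored, which keeps the installer's banner honest about what is actually pinned.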
+19 −11
@@ -15,15 +15,22 @@ load_runtime_env() {
 : "${COURSEWARE_STATE_DIR:=$STATE_DIR}"
 : "${COURSEWARE_BIND_HOST:=127.0.0.1}"
 : "${COURSEWARE_URL_HOST:=127.0.0.1}"
+: "${COURSEWARE_NETRON_PORT:=8338}"
 : "${COURSEWARE_PROMPTFOO_PORT:=15500}"
 : "${COURSEWARE_WIKI_PORT:=80}"
 : "${COURSEWARE_WETTY_PORT:=7681}"
+: "${COURSEWARE_OLLAMA_MIN_VERSION:=0.12.11}"
 : "${COURSEWARE_WETTY_BASE_PATH:=/wetty}"
 : "${NODE_RUNTIME_BIN_DIR:=$COURSEWARE_STATE_DIR/tools/node-runtime/node_modules/node/bin}"
+: "${NETRON_VENV:=$COURSEWARE_STATE_DIR/venvs/netron}"
 : "${WETTY_BIN:=$COURSEWARE_STATE_DIR/tools/wetty/node_modules/.bin/wetty}"
 : "${PROMPTFOO_DIR:=$COURSEWARE_STATE_DIR/lab6}"
 : "${WIKI_DIR:=$COURSEWARE_STATE_DIR/repos/LLM-Labs}"
 : "${WIKI_RUNTIME_CONFIG_PATH:=$WIKI_DIR/public/courseware-runtime.json}"
+: "${COURSEWARE_OLLAMA_BASE_URL:=http://$COURSEWARE_URL_HOST:$COURSEWARE_OLLAMA_PORT}"
+: "${COURSEWARE_LAB1_QWEN_MODEL_PATH:=$COURSEWARE_STATE_DIR/models/lab1/Qwen3-0.6B-Q8_0.gguf}"
+: "${COURSEWARE_LAB1_LLAMA_MODEL_PATH:=$COURSEWARE_STATE_DIR/models/lab1/Llama-3.2-1B.Q4_K_M.gguf}"
+: "${COURSEWARE_LAB1_OLLAMA_MODEL_ALIAS:=lab1-qwen3-0.6b-q8_0}"
 : "${LLAMA_CPP_BIN_DIR:=$COURSEWARE_STATE_DIR/repos/llama.cpp/build/bin}"
 
 if [ -n "${OLLAMA_BIN:-}" ] && [[ "$OLLAMA_BIN" != */* ]] && command -v "$OLLAMA_BIN" >/dev/null 2>&1; then
@@ -42,7 +49,7 @@ service_list() {
   printf '%s\n' \
     "ollama" \
     "open-webui" \
-    "transformerlab" \
+    "netron" \
     "chunkviz" \
     "embedding-atlas" \
     "unsloth" \
@@ -63,7 +70,7 @@ service_port() {
   case "$1" in
   ollama) printf '%s\n' "${COURSEWARE_OLLAMA_PORT}" ;;
   open-webui) printf '%s\n' "${COURSEWARE_OPEN_WEBUI_PORT}" ;;
-  transformerlab) printf '%s\n' "${COURSEWARE_TRANSFORMERLAB_PORT}" ;;
+  netron) printf '%s\n' "${COURSEWARE_NETRON_PORT}" ;;
   chunkviz) printf '%s\n' "${COURSEWARE_CHUNKVIZ_PORT}" ;;
   embedding-atlas) printf '%s\n' "${COURSEWARE_EMBEDDING_ATLAS_PORT}" ;;
   unsloth) printf '%s\n' "${COURSEWARE_UNSLOTH_PORT}" ;;
@@ -78,7 +85,7 @@ service_url() {
   case "$1" in
   ollama) printf 'http://%s:%s\n' "$COURSEWARE_URL_HOST" "$COURSEWARE_OLLAMA_PORT" ;;
   open-webui) printf 'http://%s:%s\n' "$COURSEWARE_URL_HOST" "$COURSEWARE_OPEN_WEBUI_PORT" ;;
-  transformerlab) printf 'http://%s:%s\n' "$COURSEWARE_URL_HOST" "$COURSEWARE_TRANSFORMERLAB_PORT" ;;
+  netron) printf 'http://%s:%s\n' "$COURSEWARE_URL_HOST" "$COURSEWARE_NETRON_PORT" ;;
   chunkviz) printf 'http://%s:%s\n' "$COURSEWARE_URL_HOST" "$COURSEWARE_CHUNKVIZ_PORT" ;;
   embedding-atlas) printf 'http://%s:%s\n' "$COURSEWARE_URL_HOST" "$COURSEWARE_EMBEDDING_ATLAS_PORT" ;;
   unsloth) printf 'http://%s:%s\n' "$COURSEWARE_URL_HOST" "$COURSEWARE_UNSLOTH_PORT" ;;
@@ -107,14 +114,11 @@ service_command() {
       "$COURSEWARE_BIND_HOST" \
       "$COURSEWARE_OPEN_WEBUI_PORT"
     ;;
-  transformerlab)
-    printf 'export PATH="%s/envs/transformerlab/bin:$PATH"; export VIRTUAL_ENV="%s/envs/transformerlab"; export CONDA_PREFIX="%s/envs/transformerlab"; cd "%s/src" && exec ./run.sh -c -h %s -p %s' \
-      "$TRANSFORMERLAB_DIR" \
-      "$TRANSFORMERLAB_DIR" \
-      "$TRANSFORMERLAB_DIR" \
-      "$TRANSFORMERLAB_DIR" \
+  netron)
+    printf 'exec "%s/bin/netron" --host %s --port %s --verbosity quiet' \
+      "$NETRON_VENV" \
       "$COURSEWARE_BIND_HOST" \
-      "$COURSEWARE_TRANSFORMERLAB_PORT"
+      "$COURSEWARE_NETRON_PORT"
     ;;
   chunkviz)
     printf 'cd "%s" && PATH="%s:$PATH" exec "./node_modules/.bin/serve" build -s -n -L -l tcp://%s:%s' \
@@ -145,9 +149,13 @@ service_command() {
       "$COURSEWARE_PROMPTFOO_PORT"
     ;;
   wiki)
-    printf 'cd "%s" && PATH="%s:$PATH" exec "./node_modules/.bin/next" start --hostname %s --port %s' \
+    printf 'cd "%s" && PATH="%s:$PATH" exec env COURSEWARE_OLLAMA_BASE_URL="%s" COURSEWARE_LAB1_QWEN_MODEL_PATH="%s" COURSEWARE_LAB1_LLAMA_MODEL_PATH="%s" COURSEWARE_LAB1_OLLAMA_MODEL_ALIAS="%s" "./node_modules/.bin/next" start --hostname %s --port %s' \
      "$WIKI_DIR" \
       "$NODE_RUNTIME_BIN_DIR" \
+      "$COURSEWARE_OLLAMA_BASE_URL" \
+      "$COURSEWARE_LAB1_QWEN_MODEL_PATH" \
+      "$COURSEWARE_LAB1_LLAMA_MODEL_PATH" \
+      "$COURSEWARE_LAB1_OLLAMA_MODEL_ALIAS" \
       "$COURSEWARE_BIND_HOST" \
       "$COURSEWARE_WIKI_PORT"
     ;;
Executable → Regular
+40 −51
@@ -9,25 +9,6 @@ load_runtime_env
 
 mkdir -p "$STATE_DIR/run" "$STATE_DIR/logs"
 
-ensure_transformerlab_default_user() {
-  local helper_python="${TRANSFORMERLAB_DIR}/envs/transformerlab/bin/python"
-
-  if [ ! -x "$helper_python" ]; then
-    return 0
-  fi
-
-  if [ -z "${TRANSFORMERLAB_DEFAULT_USER_EMAIL:-}" ] || [ -z "${TRANSFORMERLAB_DEFAULT_USER_PASSWORD:-}" ]; then
-    return 0
-  fi
-
-  "$helper_python" "$SCRIPT_DIR/ensure_transformerlab_user.py" \
-    --transformerlab-dir "$TRANSFORMERLAB_DIR" \
-    --email "$TRANSFORMERLAB_DEFAULT_USER_EMAIL" \
-    --password "$TRANSFORMERLAB_DEFAULT_USER_PASSWORD" \
-    --first-name "${TRANSFORMERLAB_DEFAULT_USER_FIRST_NAME:-Student}" \
-    --last-name "${TRANSFORMERLAB_DEFAULT_USER_LAST_NAME:-}" >>"$STATE_DIR/logs/transformerlab_default_user.log" 2>&1 || true
-}
-
 check_wetty_prereqs() {
   if [ ! -x "$WETTY_BIN" ]; then
     echo "Missing WeTTY binary at $WETTY_BIN. Re-run ./labctl up." >&2
@@ -56,6 +37,40 @@ PY
   fi
 }
 
+ollama_version_gte_minimum() {
+  local version_output
+  local installed_version
+
+  if ! command -v "$OLLAMA_BIN" >/dev/null 2>&1; then
+    return 1
+  fi
+
+  version_output=$("$OLLAMA_BIN" --version 2>/dev/null || true)
+  installed_version=$(printf '%s' "$version_output" | grep -oE '[0-9]+\.[0-9]+\.[0-9]+' | head -n 1)
+
+  if [ -z "$installed_version" ]; then
+    return 1
+  fi
+
+  [ "$(printf '%s\n' "$COURSEWARE_OLLAMA_MIN_VERSION" "$installed_version" | sort -V | head -n 1)" = "$COURSEWARE_OLLAMA_MIN_VERSION" ]
+}
+
+assert_ollama_logprobs_support() {
+  if ollama_version_gte_minimum; then
+    return 0
+  fi
+
+  local version_output
+  version_output=$("$OLLAMA_BIN" --version 2>/dev/null || printf 'unknown')
+
+  cat <<EOF >&2
+Lab 1 requires Ollama ${COURSEWARE_OLLAMA_MIN_VERSION} or newer because the confidence visualizer depends on logprobs.
+Installed version: ${version_output}
+Re-run ./labctl up after upgrading Ollama.
+EOF
+  exit 1
+}
+
 resolve_targets() {
   if [ $# -eq 0 ]; then
     echo "No target specified." >&2
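The gate in `ollama_version_gte_minimum` above leans on GNU `sort -V`: the installed version is at least the minimum exactly when the minimum sorts first (or is equal). A minimal standalone sketch (the `version_gte` wrapper name is ours, not from the repo):

```shell
#!/bin/sh
# Sketch of the sort -V comparison used by ollama_version_gte_minimum.
# $1 = installed version, $2 = required minimum.
version_gte() {
  [ "$(printf '%s\n' "$2" "$1" | sort -V | head -n 1)" = "$2" ]
}

version_gte "0.12.11" "0.12.11" && echo "0.12.11 passes"
version_gte "0.13.0"  "0.12.11" && echo "0.13.0 passes"
version_gte "0.12.2"  "0.12.11" || echo "0.12.2 rejected"
```

The last case is why `-V` matters: lexicographically `0.12.2` sorts after `0.12.11`, but version sort compares the numeric components (2 < 11), so an older patch release is correctly rejected.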
@@ -104,13 +119,10 @@ service_ready() {
   ollama)
     curl -fsS "$(service_url "$service")/api/tags" >/dev/null 2>&1
     ;;
-  transformerlab)
-    curl -fsS "$(service_url "$service")/healthz" >/dev/null 2>&1
-    ;;
   promptfoo)
     curl -fsS "$(service_url "$service")/health" >/dev/null 2>&1
     ;;
-  open-webui|chunkviz|embedding-atlas|unsloth|wiki|wetty)
+  open-webui|netron|chunkviz|embedding-atlas|unsloth|wiki|wetty)
     curl -fsS "$(service_url "$service")" >/dev/null 2>&1
     ;;
   *)
@@ -166,18 +178,16 @@ start_one() {
   local attempt
   local pid_grace_attempts=5
 
-  if has_live_pid "$service"; then
-    if [ "$service" = "transformerlab" ]; then
-      ensure_transformerlab_default_user
-    fi
+  if [ "$service" = "ollama" ] || [ "$service" = "wiki" ]; then
+    assert_ollama_logprobs_support
+  fi
 
+  if has_live_pid "$service"; then
     echo "$service already running"
     return 0
   fi
 
   if service_ready "$service"; then
-    if [ "$service" = "transformerlab" ]; then
-      ensure_transformerlab_default_user
-    fi
     echo "$service already available"
     return 0
   fi
@@ -186,24 +196,6 @@ start_one() {
   open-webui)
     start_one ollama
     ;;
-  transformerlab)
-    if command -v python3 >/dev/null 2>&1; then
-      python3 "$SCRIPT_DIR/repair_transformerlab_plugin_supports.py" \
-        --transformerlab-dir "$TRANSFORMERLAB_DIR" \
-        --plugin "fastchat_server" \
-        --required-support "chat" \
-        --required-support "completion" \
-        --required-support "visualize_model" \
-        --required-support "model_layers" \
-        --required-support "rag" \
-        --required-support "tools" \
-        --required-support "template" \
-        --required-support "embeddings" \
-        --required-support "tokenize" \
-        --required-support "logprobs" \
-        --required-support "batched" >>"$STATE_DIR/logs/transformerlab_plugin_supports.log" 2>&1 || true
-    fi
-    ;;
   wetty)
     check_wetty_prereqs
     ;;
@@ -229,9 +221,6 @@ start_one() {
 
   for attempt in $(seq 1 60); do
     if service_ready "$service"; then
-      if [ "$service" = "transformerlab" ]; then
-        ensure_transformerlab_default_user
-      fi
       echo "started $service"
       return 0
     fi
@@ -313,7 +302,7 @@ urls() {
   cat <<EOF
 Ollama API: $(service_url ollama)
 Open WebUI: $(service_url open-webui)
-TransformerLab: $(service_url transformerlab)
+Netron: $(service_url netron)
 ChunkViz: $(service_url chunkviz)
 Embedding Atlas: $(service_url embedding-atlas)
 Unsloth Studio: $(service_url unsloth)