Remove managed student assumptions from Lab 3 terminal

2026-04-13 19:39:51 -06:00
parent ca6a966ad6
commit a97c8a7694
6 changed files with 25 additions and 45 deletions
+1 -4
@@ -25,8 +25,6 @@ The Lab 3 terminal now prefers a runtime-served config artifact at `/courseware-
 Expected fields:
 - `lab3TerminalUrl`
-- `lab3Username`
-- `lab3WorkingDirectory`
 Local development still falls back to `/wetty`, and `.env.example` keeps that default for simple standalone runs.
@@ -34,8 +32,7 @@ The Linux/WSL deployment contract is:
 - loopback-only `sshd` on `127.0.0.1:22`
 - WeTTY exposed to the browser on `http://127.0.0.1:7681/wetty`
-- a managed `student` login
-- a working directory at `/home/student/lab3`
+- a standard SSH login prompt that accepts any local account allowed by the host
 ## Project Structure
+9 -7
@@ -24,14 +24,14 @@ In this lab, we will:
 <strong>Execute</strong> sections require running commands and producing output.
 </div>
-To start this lab, use the embedded terminal below. It connects to the same lab machine in your browser and should prompt you to log in with the managed `student` account.
+To start this lab, use the embedded terminal below. It connects to the same lab machine in your browser and should prompt you for any local username and password that already work on that host.
 <div data-lab3-terminal></div>
 If the embedded terminal is unavailable, you can still fall back to:
 - SSH - <IP>:22
-- The lab workspace is rooted at `/home/student/lab3`
+- A regular terminal session on the lab host
 ## Objective 1: HuggingFace & LLaMa.cpp
@@ -99,7 +99,7 @@ The project's original goal was to make LLaMA models accessible on systems wit
 For this lab we will work with **WhiteRabbitNeo-V3-7B**, a cybersecurity-oriented fine-tune of Qwen2.5-Coder-7B. This model is less popular than LLaMA-3.2, and if we'd like to run it in `llama.cpp` or Ollama, we first need to convert it into a usable GGUF artifact.
 <div class="lab-callout lab-callout--warning">
-<strong>Warning:</strong> Although the next two steps show how to find and download this model so you can replicate the process, any course-provided WhiteRabbitNeo support files will be staged under <code>/home/student/lab3/WhiteRabbitNeo</code> when they are available in the deployment.
+<strong>Warning:</strong> The commands below assume you are working from <code>~/lab3</code>. If you prefer another path, adjust the examples consistently as you go.
 </div>
 ### 1. Locate & download the model
@@ -140,6 +140,8 @@ git lfs install
 2. Clone the model:
 ```bash
+mkdir -p ~/lab3/WhiteRabbitNeo
+cd ~/lab3/WhiteRabbitNeo
 git clone https://huggingface.co/WhiteRabbitNeo/WhiteRabbitNeo-V3-7B
 ```
@@ -148,7 +150,7 @@ git clone https://huggingface.co/WhiteRabbitNeo/WhiteRabbitNeo-V3-7B
 **LLaMa.cpp** makes it easy for us to package models downloaded in SafeTensors format to GGUF. We can convert the model with the following official project script command:
 ```bash
-convert_hf_to_gguf.py /home/student/lab3/WhiteRabbitNeo/WhiteRabbitNeo-V3-7B/WhiteRabbitNeo-V3-7B --outfile /home/student/lab3/WhiteRabbitNeo/WhiteRabbitNeo-V3-7B.gguf
+convert_hf_to_gguf.py ~/lab3/WhiteRabbitNeo/WhiteRabbitNeo-V3-7B/WhiteRabbitNeo-V3-7B --outfile ~/lab3/WhiteRabbitNeo/WhiteRabbitNeo-V3-7B.gguf
 ```
 ### 4 Execute: Review Model Metadata
@@ -157,7 +159,7 @@ When these steps have completed, you should see a new WhiteRabbitNeo-V3-7B.gguf
 Run the following command:
 ```bash
-gguf-dump /home/student/lab3/WhiteRabbitNeo/WhiteRabbitNeo-V3-7B.gguf
+gguf-dump ~/lab3/WhiteRabbitNeo/WhiteRabbitNeo-V3-7B.gguf
 ```
 We should then see:
@@ -190,7 +192,7 @@ A text listing of all of the model's tensors, and the precision of each. Because
 Run our newly created **.GGUF** file as is. Run the model using the following command:
 ```bash
-llama-cli -m /home/student/lab3/WhiteRabbitNeo/WhiteRabbitNeo-V3-7B.gguf
+llama-cli -m ~/lab3/WhiteRabbitNeo/WhiteRabbitNeo-V3-7B.gguf
 ```
 Once loaded, interact with the model. We can see a number of interesting parameters that were selected by default, such as **Top K**, **Top P**, **Temperature**, and more, which we'll discuss in the next section. In the meantime, explore interaction with the model. When run in this raw state, the model may be overly chatty. You can stop its output with `Ctrl+C` at any time.
@@ -319,7 +321,7 @@ We can also import our WhiteRabbitNeo **.GGUF** model into Ollama, without havin
 1. **Create a simple modelfile**: This will tell Ollama where the model lives.
 ```bash
-echo "FROM /home/student/lab3/WhiteRabbitNeo/WhiteRabbitNeo-V3-7B.gguf" > Modelfile
+echo "FROM $HOME/lab3/WhiteRabbitNeo/WhiteRabbitNeo-V3-7B.gguf" > Modelfile
 ```
 2. **Register the model with Ollama**
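As a quick sanity check after a conversion like the one in this diff (a sketch, not part of the lab text: the stand-in path and Node availability are assumptions), every valid GGUF file begins with the 4-byte ASCII magic `GGUF`:

```typescript
// Sketch: verify a GGUF artifact by its 4-byte magic ("GGUF").
// We demo against a stand-in file; in practice you would point `model`
// at ~/lab3/WhiteRabbitNeo/WhiteRabbitNeo-V3-7B.gguf instead.
import { writeFileSync, openSync, readSync, closeSync } from "node:fs";

const model = "/tmp/demo.gguf"; // hypothetical stand-in path for the demo
// Write a fake header so the demo is self-contained: magic + padding.
writeFileSync(model, Buffer.concat([Buffer.from("GGUF"), Buffer.alloc(4)]));

// Read only the first four bytes of the file.
const fd = openSync(model, "r");
const magic = Buffer.alloc(4);
readSync(fd, magic, 0, 4, 0);
closeSync(fd);

console.log(magic.toString("ascii") === "GGUF" ? "looks like GGUF" : "not GGUF");
// → prints "looks like GGUF"
```

This catches the common failure mode where `convert_hf_to_gguf.py` aborts partway and leaves a truncated or empty output file.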
+4 -7
@@ -6,19 +6,16 @@ Default runtime config:
 ```json
 {
-  "lab3TerminalUrl": "http://127.0.0.1:7681/wetty",
-  "lab3Username": "student",
-  "lab3WorkingDirectory": "/home/student/lab3"
+  "lab3TerminalUrl": "http://127.0.0.1:7681/wetty"
 }
 ```
 ## Runtime Contract
-- Create a managed `student` Unix account
 - Bind `sshd` to `127.0.0.1:22` only
 - Run WeTTY on the lab host and expose it to the browser at `/wetty`
-- Point WeTTY at localhost SSH without pre-setting the username so students see a login prompt
-- Start students in `/home/student/lab3`
+- Point WeTTY at localhost SSH without pre-setting the username so users see a normal login prompt
+- Let the host control which existing local accounts are allowed to sign in
 Example launch command:
@@ -29,7 +26,7 @@ wetty --host 0.0.0.0 --port 7681 --base /wetty --allow-iframe --ssh-host 127.0.0
 ## Linux / WSL Shape
 - Install `openssh-server`
-- Restrict `sshd` to `127.0.0.1` and the managed `student` account
+- Restrict `sshd` to `127.0.0.1`
 - Install WeTTY with the managed Node runtime
 - Generate `/courseware-runtime.json` from deployment-time values
 - Start the wiki and WeTTY as separate managed services
+7 -13
@@ -5,7 +5,6 @@ import {
 } from "~/components/labs/Lab3TerminalFrame";
 import {
   LAB3_DEFAULT_TERMINAL_PATH,
-  LAB3_DEFAULT_WORKING_DIRECTORY,
   fetchLab3RuntimeConfig,
   getLab3TerminalPath,
   LAB3_RUNTIME_CONFIG_PATH,
@@ -27,11 +26,9 @@ describe("getLab3TerminalPath", () => {
 });
 describe("normalizeLab3RuntimeConfig", () => {
-  it("fills in the default student terminal metadata", () => {
+  it("fills in the default terminal path", () => {
     expect(normalizeLab3RuntimeConfig()).toEqual({
       terminalPath: LAB3_DEFAULT_TERMINAL_PATH,
-      username: "student",
-      workingDirectory: LAB3_DEFAULT_WORKING_DIRECTORY,
     });
   });
 });
@@ -48,8 +45,6 @@ describe("fetchLab3RuntimeConfig", () => {
       new Response(
         JSON.stringify({
           lab3TerminalUrl: "http://127.0.0.1:7681/wetty",
-          lab3Username: "student",
-          lab3WorkingDirectory: "/home/student/lab3",
         }),
         { status: 200 },
       ),
@@ -57,8 +52,6 @@ describe("fetchLab3RuntimeConfig", () => {
     await expect(fetchLab3RuntimeConfig()).resolves.toEqual({
       terminalPath: "http://127.0.0.1:7681/wetty",
-      username: "student",
-      workingDirectory: "/home/student/lab3",
     });
     expect(fetchMock).toHaveBeenCalledWith(LAB3_RUNTIME_CONFIG_PATH, {
@@ -82,7 +75,9 @@ describe("Lab3TerminalFrame", () => {
         return link.getAttribute("href") === "/lab-terminal";
       }),
     ).toBe(true);
-    expect(screen.getAllByText("/home/student/lab3")).toHaveLength(2);
+    expect(
+      screen.getByText(/log in with a local account that has/i),
+    ).toBeInTheDocument();
   });
   it("loads the terminal settings from the runtime config artifact", async () => {
@@ -90,8 +85,6 @@ describe("Lab3TerminalFrame", () => {
       new Response(
         JSON.stringify({
           lab3TerminalUrl: "http://127.0.0.1:7681/wetty",
-          lab3Username: "student",
-          lab3WorkingDirectory: "/srv/labs/lab3",
         }),
         { status: 200 },
       ),
@@ -105,8 +98,9 @@ describe("Lab3TerminalFrame", () => {
         "http://127.0.0.1:7681/wetty",
       );
     });
-    expect(screen.getAllByText("/srv/labs/lab3")).toHaveLength(2);
+    expect(
+      screen.getByText(/Sign in with any local account that can SSH/i),
+    ).toBeInTheDocument();
   });
   it("toggles the dock open and closed", () => {
+4 -5
@@ -87,8 +87,8 @@ export function Lab3TerminalFrame({ srcPath }: Lab3TerminalFrameProps) {
       <h3>Use the lab shell directly in your browser</h3>
       <p className="lab3-terminal-inline__lede">
         The terminal is docked to the bottom of the page. Expand it with the
-        arrow when you&apos;re ready to log in as <code>{runtimeConfig.username}</code>{" "}
-        and work from <code>{runtimeConfig.workingDirectory}</code>.
+        arrow when you&apos;re ready to log in with a local account that has
+        password-based SSH access on this lab host.
       </p>
     </div>
@@ -140,9 +140,8 @@ export function Lab3TerminalFrame({ srcPath }: Lab3TerminalFrameProps) {
     <div className="lab3-terminal-dock__panel" id="lab3-terminal-dock-panel">
       <div className="lab3-terminal-dock__header">
         <p className="lab3-terminal-dock__lede">
-          Log in as <code>{runtimeConfig.username}</code>, work from{" "}
-          <code>{runtimeConfig.workingDirectory}</code>, or pop the terminal
-          out into a full tab for more room.
+          Sign in with any local account that can SSH into this machine, or
+          pop the terminal out into a full tab for more room.
         </p>
         {isConfigResolved ? (
-9
@@ -1,18 +1,12 @@
 export const LAB3_DEFAULT_TERMINAL_PATH = "/wetty";
-export const LAB3_DEFAULT_USERNAME = "student";
-export const LAB3_DEFAULT_WORKING_DIRECTORY = "/home/student/lab3";
 export const LAB3_RUNTIME_CONFIG_PATH = "/courseware-runtime.json";
 export type Lab3RuntimeConfig = {
   lab3TerminalUrl?: string;
-  lab3Username?: string;
-  lab3WorkingDirectory?: string;
 };
 export type ResolvedLab3RuntimeConfig = {
   terminalPath: string;
-  username: string;
-  workingDirectory: string;
 };
 export function getLab3TerminalPath(envValue?: string) {
@@ -34,9 +28,6 @@ export function normalizeLab3RuntimeConfig(
 ): ResolvedLab3RuntimeConfig {
   return {
     terminalPath: getLab3TerminalPath(config?.lab3TerminalUrl),
-    username: config?.lab3Username?.trim() || LAB3_DEFAULT_USERNAME,
-    workingDirectory:
-      config?.lab3WorkingDirectory?.trim() || LAB3_DEFAULT_WORKING_DIRECTORY,
   };
 }
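For reference, the post-commit shape of these helpers can be sketched as a standalone snippet (names are copied from the diff; the internals of `getLab3TerminalPath` are an assumption — fall back to the default when the value is missing or blank — since its body is not part of this change):

```typescript
// Standalone sketch of the slimmed-down runtime-config helpers:
// only the terminal path survives normalization after this commit.
const LAB3_DEFAULT_TERMINAL_PATH = "/wetty";

type Lab3RuntimeConfig = { lab3TerminalUrl?: string };
type ResolvedLab3RuntimeConfig = { terminalPath: string };

// Assumed behavior: a present, non-blank value wins; otherwise the default.
function getLab3TerminalPath(envValue?: string): string {
  const trimmed = envValue?.trim();
  return trimmed ? trimmed : LAB3_DEFAULT_TERMINAL_PATH;
}

function normalizeLab3RuntimeConfig(
  config?: Lab3RuntimeConfig,
): ResolvedLab3RuntimeConfig {
  return { terminalPath: getLab3TerminalPath(config?.lab3TerminalUrl) };
}

console.log(normalizeLab3RuntimeConfig().terminalPath);
// → "/wetty"
console.log(
  normalizeLab3RuntimeConfig({ lab3TerminalUrl: " http://127.0.0.1:7681/wetty " })
    .terminalPath,
);
// → "http://127.0.0.1:7681/wetty" (trimmed)
```

With `username` and `workingDirectory` gone, callers such as `Lab3TerminalFrame` only ever read `terminalPath`, which is why the removed test assertions above collapse to a single field.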