Remove managed student assumptions from Lab 3 terminal
@@ -25,8 +25,6 @@ The Lab 3 terminal now prefers a runtime-served config artifact at `/courseware-
 
 Expected fields:
 
 - `lab3TerminalUrl`
-- `lab3Username`
-- `lab3WorkingDirectory`
 
 Local development still falls back to `/wetty`, and `.env.example` keeps that default for simple standalone runs.
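The fallback described above (prefer the runtime-served `lab3TerminalUrl`, otherwise the local-dev default `/wetty`) can be sketched roughly as follows. The names here are illustrative, not the project's actual exports:

```typescript
// Sketch of the documented fallback: prefer the runtime artifact's
// `lab3TerminalUrl`; if it is missing or blank, use the local-dev
// default "/wetty". (Illustrative names, not the real exports.)
type RuntimeConfig = { lab3TerminalUrl?: string };

const DEFAULT_TERMINAL_PATH = "/wetty";

function resolveTerminalPath(config?: RuntimeConfig): string {
  const url = config?.lab3TerminalUrl?.trim();
  return url || DEFAULT_TERMINAL_PATH;
}

console.log(resolveTerminalPath()); // "/wetty"
console.log(resolveTerminalPath({ lab3TerminalUrl: "http://127.0.0.1:7681/wetty" }));
```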
@@ -34,8 +32,7 @@ The Linux/WSL deployment contract is:
 
 - loopback-only `sshd` on `127.0.0.1:22`
 - WeTTY exposed to the browser on `http://127.0.0.1:7681/wetty`
-- a managed `student` login
-- a working directory at `/home/student/lab3`
+- a standard SSH login prompt that accepts any local account allowed by the host
 
 ## Project Structure
 
@@ -24,14 +24,14 @@ In this lab, we will:
 <strong>Execute</strong> sections require running commands and producing output.
 </div>
 
-To start this lab, use the embedded terminal below. It connects to the same lab machine in your browser and should prompt you to log in with the managed `student` account.
+To start this lab, use the embedded terminal below. It connects to the same lab machine in your browser and should prompt you for any local username and password that already work on that host.
 
 <div data-lab3-terminal></div>
 
 If the embedded terminal is unavailable, you can still fall back to:
 
 - SSH - <IP>:22
-- The lab workspace is rooted at `/home/student/lab3`
+- A regular terminal session on the lab host
 
 ## Objective 1: HuggingFace & LLaMa.cpp
 
@@ -99,7 +99,7 @@ The project’s original goal was to make LLaMA models accessible on systems wit
 For this lab we will work with **WhiteRabbitNeo‑V3‑7B**, a cybersecurity‑oriented fine‑tune of Qwen2.5‑Coder‑7B. This model is less popular than LLaMA-3.2, and if we'd like to run it in `llama.cpp` or Ollama, we first need to convert it into a usable GGUF artifact.
 
 <div class="lab-callout lab-callout--warning">
-<strong>Warning:</strong> Although the next two steps show how to find and download this model so you can replicate the process, any course-provided WhiteRabbitNeo support files will be staged under <code>/home/student/lab3/WhiteRabbitNeo</code> when they are available in the deployment.
+<strong>Warning:</strong> The commands below assume you are working from <code>~/lab3</code>. If you prefer another path, adjust the examples consistently as you go.
 </div>
 
 ### 1. Locate & download the model
@@ -140,6 +140,8 @@ git lfs install
 2. Clone the model:
 
 ```bash
+mkdir -p ~/lab3/WhiteRabbitNeo
+cd ~/lab3/WhiteRabbitNeo
 git clone https://huggingface.co/WhiteRabbitNeo/WhiteRabbitNeo-V3-7B
 ```
 
@@ -148,7 +150,7 @@ git clone https://huggingface.co/WhiteRabbitNeo/WhiteRabbitNeo-V3-7B
 
 **LLaMa.cpp** makes it easy for us to package models downloaded in SafeTensors format to GGUF. We can convert the model with the following official project script command:
 
 ```bash
-convert_hf_to_gguf.py /home/student/lab3/WhiteRabbitNeo/WhiteRabbitNeo-V3-7B/WhiteRabbitNeo-V3-7B --outfile /home/student/lab3/WhiteRabbitNeo/WhiteRabbitNeo-V3-7B.gguf
+convert_hf_to_gguf.py ~/lab3/WhiteRabbitNeo/WhiteRabbitNeo-V3-7B/WhiteRabbitNeo-V3-7B --outfile ~/lab3/WhiteRabbitNeo/WhiteRabbitNeo-V3-7B.gguf
 ```
 
 ### 4 Execute: Review Model Metadata
@@ -157,7 +159,7 @@ When these steps have completed, you should see a new WhiteRabbitNeo-V3-7B.gguf
 
 Run the following command:
 
 ```bash
-gguf-dump /home/student/lab3/WhiteRabbitNeo/WhiteRabbitNeo-V3-7B.gguf
+gguf-dump ~/lab3/WhiteRabbitNeo/WhiteRabbitNeo-V3-7B.gguf
 ```
 
 We should then see:
@@ -190,7 +192,7 @@ A text listing of all of the model's tensors, and the precision of each. Because
 
 Run our newly created **.GGUF** file as is. Run the model using the following command:
 
 ```bash
-llama-cli -m /home/student/lab3/WhiteRabbitNeo/WhiteRabbitNeo-V3-7B.gguf
+llama-cli -m ~/lab3/WhiteRabbitNeo/WhiteRabbitNeo-V3-7B.gguf
 ```
 
 Once loaded, interact with the model. We can see a number of interesting parameters that were selected by default, such as **Top K**, **Top P**, **Temperature**, and more, which we'll discuss in the next section. In the meantime, explore interaction with the model. When run in this raw state, the model may be overly chatty. You can stop its output with `Ctrl+C` at any time.
@@ -319,7 +321,7 @@ We can also import our WhiteRabbitNeo **.GGUF** model into Ollama, without havin
 
 1. **Create a simple modelfile** – This will tell Ollama where the model lives.
 
 ```bash
-echo "FROM /home/student/lab3/WhiteRabbitNeo/WhiteRabbitNeo-V3-7B.gguf" > Modelfile
+echo "FROM $HOME/lab3/WhiteRabbitNeo/WhiteRabbitNeo-V3-7B.gguf" > Modelfile
 ```
 
 2. **Register the model with Ollama**
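Note that because the `echo` runs through the shell, `$HOME` is expanded before the Modelfile is written, so the file ends up containing an absolute path rather than a literal `$HOME`. A quick sanity check (a sketch; the lab only requires that the `FROM` path point at the real `.gguf` file):

```shell
# Write the Modelfile; $HOME expands now, so the file holds an absolute path.
echo "FROM $HOME/lab3/WhiteRabbitNeo/WhiteRabbitNeo-V3-7B.gguf" > Modelfile

# Confirm the FROM line begins with an absolute path, not a literal "$HOME".
grep -q '^FROM /' Modelfile && echo "Modelfile path looks absolute"
```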
@@ -6,19 +6,16 @@ Default runtime config:
 
 ```json
 {
-  "lab3TerminalUrl": "http://127.0.0.1:7681/wetty",
-  "lab3Username": "student",
-  "lab3WorkingDirectory": "/home/student/lab3"
+  "lab3TerminalUrl": "http://127.0.0.1:7681/wetty"
 }
 ```
 
 ## Runtime Contract
 
-- Create a managed `student` Unix account
 - Bind `sshd` to `127.0.0.1:22` only
 - Run WeTTY on the lab host and expose it to the browser at `/wetty`
-- Point WeTTY at localhost SSH without pre-setting the username so students see a login prompt
-- Start students in `/home/student/lab3`
+- Point WeTTY at localhost SSH without pre-setting the username so users see a normal login prompt
+- Let the host control which existing local accounts are allowed to sign in
 
 Example launch command:
 
@@ -29,7 +26,7 @@ wetty --host 0.0.0.0 --port 7681 --base /wetty --allow-iframe --ssh-host 127.0.0
 
 ## Linux / WSL Shape
 
 - Install `openssh-server`
-- Restrict `sshd` to `127.0.0.1` and the managed `student` account
+- Restrict `sshd` to `127.0.0.1`
 - Install WeTTY with the managed Node runtime
 - Generate `/courseware-runtime.json` from deployment-time values
 - Start the wiki and WeTTY as separate managed services
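The loopback-only `sshd` requirement above corresponds to a small `sshd_config` excerpt along these lines (illustrative; directive defaults and file layout vary by distribution, and password logins are assumed to be wanted for the WeTTY prompt):

```
# /etc/ssh/sshd_config (excerpt)
ListenAddress 127.0.0.1
Port 22
PasswordAuthentication yes
```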
@@ -5,7 +5,6 @@ import {
 } from "~/components/labs/Lab3TerminalFrame";
 import {
   LAB3_DEFAULT_TERMINAL_PATH,
-  LAB3_DEFAULT_WORKING_DIRECTORY,
   fetchLab3RuntimeConfig,
   getLab3TerminalPath,
   LAB3_RUNTIME_CONFIG_PATH,
@@ -27,11 +26,9 @@ describe("getLab3TerminalPath", () => {
 });
 
 describe("normalizeLab3RuntimeConfig", () => {
-  it("fills in the default student terminal metadata", () => {
+  it("fills in the default terminal path", () => {
     expect(normalizeLab3RuntimeConfig()).toEqual({
       terminalPath: LAB3_DEFAULT_TERMINAL_PATH,
-      username: "student",
-      workingDirectory: LAB3_DEFAULT_WORKING_DIRECTORY,
     });
   });
 });
@@ -48,8 +45,6 @@ describe("fetchLab3RuntimeConfig", () => {
       new Response(
         JSON.stringify({
           lab3TerminalUrl: "http://127.0.0.1:7681/wetty",
-          lab3Username: "student",
-          lab3WorkingDirectory: "/home/student/lab3",
         }),
         { status: 200 },
       ),
@@ -57,8 +52,6 @@ describe("fetchLab3RuntimeConfig", () => {
 
     await expect(fetchLab3RuntimeConfig()).resolves.toEqual({
       terminalPath: "http://127.0.0.1:7681/wetty",
-      username: "student",
-      workingDirectory: "/home/student/lab3",
     });
 
     expect(fetchMock).toHaveBeenCalledWith(LAB3_RUNTIME_CONFIG_PATH, {
@@ -82,7 +75,9 @@ describe("Lab3TerminalFrame", () => {
         return link.getAttribute("href") === "/lab-terminal";
       }),
     ).toBe(true);
-    expect(screen.getAllByText("/home/student/lab3")).toHaveLength(2);
+    expect(
+      screen.getByText(/log in with a local account that has/i),
+    ).toBeInTheDocument();
   });
 
   it("loads the terminal settings from the runtime config artifact", async () => {
@@ -90,8 +85,6 @@ describe("Lab3TerminalFrame", () => {
       new Response(
         JSON.stringify({
           lab3TerminalUrl: "http://127.0.0.1:7681/wetty",
-          lab3Username: "student",
-          lab3WorkingDirectory: "/srv/labs/lab3",
         }),
         { status: 200 },
       ),
@@ -105,8 +98,9 @@ describe("Lab3TerminalFrame", () => {
         "http://127.0.0.1:7681/wetty",
       );
     });
 
-    expect(screen.getAllByText("/srv/labs/lab3")).toHaveLength(2);
+    expect(
+      screen.getByText(/Sign in with any local account that can SSH/i),
+    ).toBeInTheDocument();
   });
 
   it("toggles the dock open and closed", () => {
@@ -87,8 +87,8 @@ export function Lab3TerminalFrame({ srcPath }: Lab3TerminalFrameProps) {
       <h3>Use the lab shell directly in your browser</h3>
       <p className="lab3-terminal-inline__lede">
         The terminal is docked to the bottom of the page. Expand it with the
-        arrow when you're ready to log in as <code>{runtimeConfig.username}</code>{" "}
-        and work from <code>{runtimeConfig.workingDirectory}</code>.
+        arrow when you're ready to log in with a local account that has
+        password-based SSH access on this lab host.
       </p>
     </div>
@@ -140,9 +140,8 @@ export function Lab3TerminalFrame({ srcPath }: Lab3TerminalFrameProps) {
       <div className="lab3-terminal-dock__panel" id="lab3-terminal-dock-panel">
         <div className="lab3-terminal-dock__header">
           <p className="lab3-terminal-dock__lede">
-            Log in as <code>{runtimeConfig.username}</code>, work from{" "}
-            <code>{runtimeConfig.workingDirectory}</code>, or pop the terminal
-            out into a full tab for more room.
+            Sign in with any local account that can SSH into this machine, or
+            pop the terminal out into a full tab for more room.
           </p>
 
           {isConfigResolved ? (
@@ -1,18 +1,12 @@
 export const LAB3_DEFAULT_TERMINAL_PATH = "/wetty";
-export const LAB3_DEFAULT_USERNAME = "student";
-export const LAB3_DEFAULT_WORKING_DIRECTORY = "/home/student/lab3";
 export const LAB3_RUNTIME_CONFIG_PATH = "/courseware-runtime.json";
 
 export type Lab3RuntimeConfig = {
   lab3TerminalUrl?: string;
-  lab3Username?: string;
-  lab3WorkingDirectory?: string;
 };
 
 export type ResolvedLab3RuntimeConfig = {
   terminalPath: string;
-  username: string;
-  workingDirectory: string;
 };
 
 export function getLab3TerminalPath(envValue?: string) {
@@ -34,9 +28,6 @@ export function normalizeLab3RuntimeConfig(
 ): ResolvedLab3RuntimeConfig {
   return {
     terminalPath: getLab3TerminalPath(config?.lab3TerminalUrl),
-    username: config?.lab3Username?.trim() || LAB3_DEFAULT_USERNAME,
-    workingDirectory:
-      config?.lab3WorkingDirectory?.trim() || LAB3_DEFAULT_WORKING_DIRECTORY,
   };
 }