diff --git a/README.md b/README.md
index 2f0a74f..234be82 100644
--- a/README.md
+++ b/README.md
@@ -20,20 +20,22 @@ npm run dev

 ## Lab 3 Web Terminal

-Set `NEXT_PUBLIC_LAB3_TERMINAL_PATH` to the WeTTY endpoint used by your deployment. The default expected path is `/wetty`, and `.env.example` includes that value. Local environments can also provide a full URL such as `http://127.0.0.1:7681/wetty`.
+The Lab 3 terminal now prefers a runtime-served config artifact at `/courseware-runtime.json`.

-The Lab 3 widget assumes:
+Expected fields:

-- WeTTY runs on the lab host and is bound to `127.0.0.1`
-- the public proxy forwards `/wetty` to the local WeTTY port
-- WebSocket upgrade happens at the reverse proxy
-- WeTTY is launched as root so it can present `/bin/login` locally instead of SSH
+- `lab3TerminalUrl`
+- `lab3Username`
+- `lab3WorkingDirectory`

-Example service command:
+Local development still falls back to `/wetty`, and `.env.example` keeps that default for simple standalone runs.

-```bash
-wetty --host 127.0.0.1 --port 3001 --base /wetty --allow-iframe
-```
+The Linux/WSL deployment contract is:
+
+- loopback-only `sshd` on `127.0.0.1:22`
+- WeTTY exposed to the browser on `http://127.0.0.1:7681/wetty`
+- a managed `student` login
+- a working directory at `/home/student/lab3`

 ## Project Structure

diff --git a/content/labs/lab-3-llama-cpp-and-ollama.md b/content/labs/lab-3-llama-cpp-and-ollama.md
index e59cfde..c48f2bf 100644
--- a/content/labs/lab-3-llama-cpp-and-ollama.md
+++ b/content/labs/lab-3-llama-cpp-and-ollama.md
@@ -24,14 +24,14 @@ In this lab, we will:

 Execute sections require running commands and producing output.

-To start this lab, use the embedded terminal below. It connects to the same lab machine in your browser and should prompt you to log in with the default `student` account.
+To start this lab, use the embedded terminal below. It connects to the same lab machine in your browser and should prompt you to log in with the managed `student` account.
+If the embedded terminal is unavailable, you can still fall back to:
+
+- SSH
-/home/student/lab3/WhiteRabbitNeo to speed up lab execution.
+ Warning: The next two steps show how to find and download this model so you can replicate the process. When the deployment includes them, course-provided WhiteRabbitNeo support files are staged under /home/student/lab3/WhiteRabbitNeo.
The terminal is docked to the bottom of the page. Expand it with the
- arrow when you're ready to work from /home/student/lab3.
+ arrow when you're ready to log in as {runtimeConfig.username}{" "}
+ and work from {runtimeConfig.workingDirectory}.
- Work from /home/student/lab3, or pop the terminal out
- into a full tab for more room.
+ Log in as {runtimeConfig.username}, work from{" "}
+ {runtimeConfig.workingDirectory}, or pop the terminal
+ out into a full tab for more room.
- Connecting to the embedded terminal...
+ {isConfigResolved
+   ? "Connecting to the embedded terminal..."
+   : "Loading terminal configuration..."}
) : null}
@@ -144,12 +186,18 @@ export function Lab3TerminalFrame({
className="lab3-terminal-dock__status lab3-terminal-dock__status--warning"
aria-live="polite"
>
- The reverse proxy is not running for{terminalPath}.
+ The browser terminal service is unavailable right now.
) : (
- If the terminal page does not appear, open it in a new tab or fall
- back to SSH on <IP>:22.
+ {isConfigResolved ? (
+ <>
+ If the terminal page does not appear, open it in a new tab and
+ confirm the service is reachable at {terminalPath}.
+ </>
+ ) : (
+ "Waiting for terminal settings from the deployment runtime."
+ )}
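Taken together, these hunks describe a resolve-with-fallback contract: read the fields from `/courseware-runtime.json` when the runtime serves it, otherwise use the documented local-development defaults. A minimal TypeScript sketch of that contract — the field names come from the README hunk, but the helper name, interfaces, and fallback constant are hypothetical, not part of this change:

```typescript
// Sketch of the runtime-config resolution described in this diff.
// Field names (lab3TerminalUrl, lab3Username, lab3WorkingDirectory) are the
// documented /courseware-runtime.json fields; everything else is illustrative.

interface RuntimeConfig {
  lab3TerminalUrl?: string;
  lab3Username?: string;
  lab3WorkingDirectory?: string;
}

interface ResolvedTerminalConfig {
  terminalPath: string;
  username: string;
  workingDirectory: string;
}

// Defaults mirror the README's local-development fallback values.
const FALLBACKS: ResolvedTerminalConfig = {
  terminalPath: "/wetty",
  username: "student",
  workingDirectory: "/home/student/lab3",
};

// Missing artifact (null) or missing fields fall through to the defaults.
function resolveTerminalConfig(
  raw: RuntimeConfig | null
): ResolvedTerminalConfig {
  return {
    terminalPath: raw?.lab3TerminalUrl ?? FALLBACKS.terminalPath,
    username: raw?.lab3Username ?? FALLBACKS.username,
    workingDirectory: raw?.lab3WorkingDirectory ?? FALLBACKS.workingDirectory,
  };
}
```

A consumer of `terminalPath`, `username`, and `workingDirectory` then renders the same strings whether or not the runtime artifact is present, which is the behavior the `isConfigResolved` branches in the component are guarding.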