# Lab 3 Embedded Terminal Deployment
Lab 3 now reads its terminal settings from a runtime config artifact served at `/courseware-runtime.json`. When the artifact is absent, the following defaults apply:
```json
{
  "lab3TerminalUrl": "http://127.0.0.1:7681/wetty",
  "lab3Username": "student",
  "lab3WorkingDirectory": "/home/student/lab3"
}
```
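A consumer of this artifact should fall back to the defaults when the file is missing. A minimal sketch, assuming the artifact is available on the local filesystem and that `python3` is present for JSON parsing (`CONFIG_PATH` and the fallback logic are illustrative, not part of the contract):

```shell
# Sketch: resolve the Lab 3 terminal URL from the runtime config artifact.
# CONFIG_PATH is an assumption for local testing; in deployment the file
# is served from the wiki origin as /courseware-runtime.json.
CONFIG_PATH="${CONFIG_PATH:-./courseware-runtime.json}"

# Fall back to the documented defaults when no artifact exists.
if [ ! -f "$CONFIG_PATH" ]; then
  cat > "$CONFIG_PATH" <<'EOF'
{
  "lab3TerminalUrl": "http://127.0.0.1:7681/wetty",
  "lab3Username": "student",
  "lab3WorkingDirectory": "/home/student/lab3"
}
EOF
fi

# Extract lab3TerminalUrl using only the Python standard library.
TERMINAL_URL="$(python3 -c 'import json,sys; print(json.load(open(sys.argv[1]))["lab3TerminalUrl"])' "$CONFIG_PATH")"
echo "$TERMINAL_URL"
```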
## Runtime Contract
- Create a managed `student` Unix account
- Bind `sshd` to `127.0.0.1:22` only
- Run WeTTY on the lab host and expose it to the browser at `/wetty`
- Point WeTTY at localhost SSH without pre-setting the username so students see a login prompt
- Start students in `/home/student/lab3`
Example launch command:
```bash
wetty --host 0.0.0.0 --port 7681 --base /wetty --allow-iframe \
  --ssh-host 127.0.0.1 --ssh-port 22 --ssh-auth password
```
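The loopback-only binding and single-account restriction from the contract map onto a small `sshd_config` fragment. A sketch using standard OpenSSH directives; values mirror the contract above:

```
# /etc/ssh/sshd_config fragment (sketch)
ListenAddress 127.0.0.1      # loopback only; sshd is never exposed off-host
AllowUsers student           # only the managed lab account may log in
PasswordAuthentication yes   # WeTTY presents the password prompt in-browser
```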
## Linux / WSL Shape
- Install `openssh-server`
- Restrict `sshd` to `127.0.0.1` and the managed `student` account
- Install WeTTY with the managed Node runtime
- Generate `/courseware-runtime.json` from deployment-time values
- Start the wiki and WeTTY as separate managed services
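The "separate managed services" step could be expressed as a systemd unit for WeTTY. A sketch, assuming WeTTY is installed at `/usr/local/bin/wetty` under the managed Node runtime; the unit name, service user, and install path are illustrative:

```ini
# /etc/systemd/system/wetty.service (sketch)
[Unit]
Description=WeTTY browser terminal for Lab 3
After=network.target sshd.service

[Service]
ExecStart=/usr/local/bin/wetty --host 0.0.0.0 --port 7681 --base /wetty \
  --allow-iframe --ssh-host 127.0.0.1 --ssh-port 22 --ssh-auth password
Restart=on-failure
User=wetty

[Install]
WantedBy=multi-user.target
```

A matching unit for the wiki process keeps the two services independently restartable.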