
Using Chatbot UI

This guide shows how to deploy the Open WebUI chat interface and connect it to your LLMBoost server.

By the end, you'll have:

  • A self-hosted Chatbot UI running in Docker
  • An LLMBoost server exposing an OpenAI-compatible API
  • SSH port forwarding to access it remotely

Start the Server

If you haven't started the LLMBoost server yet, follow the instructions in Deploy a single server with llmboost serve.

Step 1: Run Open WebUI Docker

⚠️ Important: This command must be run on the same server where your LLMBoost server is running. Open WebUI connects to the backend at http://localhost:8011/v1, which assumes both are on the same machine.

Use the following command to launch Open WebUI:

docker run -d \
  --env=OPENAI_API_BASE_URL=http://localhost:8011/v1 \
  --env=ENABLE_OLLAMA_API=false \
  --env=OPENAI_API_KEY=no-token-needed \
  --volume=/data/open-webui-volumes:/app/backend/data \
  --network=host \
  --restart=always \
  ghcr.io/open-webui/open-webui:main
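If you prefer to manage the container with Docker Compose, the same invocation maps to roughly the following docker-compose.yml. This is a sketch that simply mirrors the flags above, not an official LLMBoost artifact:

```yaml
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    network_mode: host            # same-machine access to the backend on :8011
    restart: always
    environment:
      OPENAI_API_BASE_URL: http://localhost:8011/v1
      ENABLE_OLLAMA_API: "false"
      OPENAI_API_KEY: no-token-needed
    volumes:
      - /data/open-webui-volumes:/app/backend/data
```

Run it with `docker compose up -d` from the directory containing the file.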

Step 2: Access Open WebUI Remotely via SSH (Optional)

note

Only required if you're running LLMBoost and Open WebUI on a remote server.

If your LLMBoost server is running on a remote machine (e.g., a cloud GPU instance), you can securely access the WebUI and backend ports from your local machine using SSH port forwarding.

Use the following command from your local machine:

ssh -L 8080:localhost:8080 -L 8011:localhost:8011 john@10.1.1.6

Breakdown of each part:

| Segment | Description |
| --- | --- |
| `ssh` | Start a secure shell (SSH) session |
| `-L 8080:localhost:8080` | Forward local port 8080 to remote port 8080 (Chatbot UI) |
| `-L 8011:localhost:8011` | Forward local port 8011 to remote port 8011 (LLMBoost API server) |
| `john@10.1.1.6` | Log in as user `john` to the remote server at `10.1.1.6` |
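If you connect often, the same forwarding can be recorded in your local ~/.ssh/config. The host alias `llmboost` below is an arbitrary name; the user and address are the example values from above:

```
Host llmboost
    HostName 10.1.1.6
    User john
    LocalForward 8080 localhost:8080
    LocalForward 8011 localhost:8011
```

After that, a plain `ssh llmboost` opens both tunnels.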

Step 3: Start Chatting!

Once everything is set up and the ports are forwarded (if using a remote server), you can access the Chatbot UI in your local browser:

http://localhost:8080

This will load the Open WebUI interface, connected to your LLMBoost backend running on port 8011.
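If you want to script against the backend directly rather than use the UI, requests follow the standard OpenAI chat-completions format. The sketch below only builds and prints such a request; `your-model-name` is a placeholder, and the actual model ids can be listed via GET http://localhost:8011/v1/models:

```python
import json

# Build an OpenAI-compatible chat-completions request for the LLMBoost
# backend on port 8011. NOTE: "your-model-name" is a placeholder; use a
# model id reported by your server at /v1/models.
url = "http://localhost:8011/v1/chat/completions"
payload = {
    "model": "your-model-name",
    "messages": [{"role": "user", "content": "Hello!"}],
}

# This is the body you would POST (e.g. with curl or an HTTP client):
print(url)
print(json.dumps(payload, indent=2))
```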

Enjoy chatting with your local model in private!