How to Install Open WebUI with Nginx as a Reverse Proxy


Open WebUI is a self-hosted, feature-rich, and user-friendly open-source web interface that lets you run large language models (LLMs) using Ollama and OpenAI-compatible APIs. You can extend Open WebUI with AI models and plugins to suit your needs.

This article explains how to install Open WebUI and run LLMs using Ollama on a Vultr Cloud GPU instance. You will also configure Nginx as a reverse proxy with secure SSL certificates to expose the web interface on a custom domain.
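To illustrate the end goal, below is a minimal sketch of the kind of Nginx server block this guide builds toward. It assumes Open WebUI listens on localhost port 8080 and that certificates were issued with Certbot under the usual /etc/letsencrypt paths; the domain name, port, and certificate paths are placeholders you would adapt to your own setup.

```nginx
server {
    listen 443 ssl;
    server_name chat.example.com;  # placeholder: your custom domain

    # Placeholder paths: typical Certbot certificate locations
    ssl_certificate     /etc/letsencrypt/live/chat.example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/chat.example.com/privkey.pem;

    location / {
        # Forward requests to the local Open WebUI service (port assumed)
        proxy_pass http://127.0.0.1:8080;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;

        # WebSocket upgrade headers so the chat interface streams responses
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }
}
```

The sections that follow walk through each step needed to reach a working configuration like this, from provisioning the instance to requesting the SSL certificates.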