44 results for “topic:ollama-webui”
User-friendly AI Interface (Supports Ollama, OpenAI API, ...)
Free, high-quality text-to-speech API endpoint to replace OpenAI, Azure, or ElevenLabs
Belullama is a comprehensive AI application that bundles Ollama, Open WebUI, and Automatic1111 (Stable Diffusion WebUI) into a single, easy-to-use package.
Self-host a ChatGPT-style web interface for Ollama 🦙
This repository provides resources and guidelines to facilitate the integration of Open-WebUI and Langfuse, enabling seamless monitoring and management of AI model usage statistics.
Browser extension, AI agent, and MCP support are in development; Stars welcome.
A modern web interface for [Ollama](https://ollama.ai/), with DeepSeek support planned for the next version.
BeautifyOllama is an open-source web interface that enhances your local Ollama AI model interactions with a beautiful, modern design. Built with cutting-edge web technologies, it provides a seamless chat experience with stunning visual effects and enterprise-grade functionality.
A Docker Compose to run a local ChatGPT-like application using Ollama, Ollama Web UI, Mistral NeMo & DeepSeek R1.
PuPu is a lightweight, cross-platform desktop AI client that works with both local and cloud-hosted models. Whether you prefer running models on your own machine or connecting to providers like OpenAI and Anthropic, PuPu gives you a unified, elegant interface — your AI, your rules.
Ollama with Let's Encrypt Using Docker Compose
A Blazor-based LLM AI client (OpenAI, ChatGPT, Llama, Ollama, ONNX, DeepSeek-R1, ...).
A beautiful Ollama AI chat interface.
Persian Ollama Project: Enhance Persian (Farsi) prompts when chatting with Ollama LLMs.
A better, open-source LLM wrapper (user management, text processing, file processing, image generation, web search, local models).
Ollama Web UI is a simple yet powerful web-based interface for interacting with large language models. It offers chat history, voice commands, voice output, model download and management, conversation saving, terminal access, multi-model chat, and more—all in one streamlined platform.
A minimal interface in pure HTML/CSS for talking with Ollama focused on ensuring you can read the code.
Simple web UI for Ollama
This Docker Compose setup provides an isolated application with Ollama, Open-WebUI, and Nginx reverse proxy to enable secure HTTPS access. Since Open-WebUI does not support SSL natively, Nginx acts as a reverse proxy, handling SSL termination.
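The entry above relies on Nginx terminating TLS in front of Open WebUI. A minimal sketch of such a reverse-proxy server block, assuming the hostname `chat.example.com`, certificate paths as placeholders, and Open WebUI listening on its default container port 8080 under the service name `open-webui`:

```nginx
server {
    listen 443 ssl;
    server_name chat.example.com;

    # Placeholder certificate paths; point these at your real cert and key.
    ssl_certificate     /etc/nginx/ssl/fullchain.pem;
    ssl_certificate_key /etc/nginx/ssl/privkey.pem;

    location / {
        # Forward decrypted traffic to the Open WebUI container.
        proxy_pass http://open-webui:8080;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-Proto https;

        # Open WebUI streams chat responses; allow WebSocket upgrades.
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }
}
```

This is a configuration sketch, not the repository's actual setup; the hostname, paths, and upstream name are illustrative.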
SymChat is an AI chat assistant designed with privacy as a core principle. All conversations and data are stored locally on your device; nothing is sent to external servers by default. Our goal is to give users full control of their AI experience, discourage unnecessary data collection, and ensure your interactions remain private and secure.
Minimal Ollama chat UI - no login, no heavy features.
Angular18 frontend for Ollama
A web client for Ollama and Llama LLMs.
AI model deployment on Synology NAS and macOS 🧠🐳
Extremely simple chat interface for Ollama models.
An advanced, multi-backend AI Agent Chat application built with React, FastAPI, and LangChain. Features a rich toolset including web search, code execution, and long-term memory.
A simple and easy Ollama web UI.
A simple Docker Compose setup to self-host Ollama and Open WebUI. Run your own private LLMs with GPU acceleration (NVIDIA/AMD) and complete data privacy. Easy to integrate with other services like n8n.
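As a rough illustration of the kind of stack this entry describes, a minimal Docker Compose sketch for Ollama plus Open WebUI. Image tags, the published port, and volume names are assumptions, and the GPU reservation shown requires the NVIDIA Container Toolkit on the host:

```yaml
services:
  ollama:
    image: ollama/ollama
    volumes:
      - ollama:/root/.ollama
    deploy:
      resources:
        reservations:
          devices:
            # NVIDIA GPU passthrough; omit this block for CPU-only use.
            - driver: nvidia
              count: all
              capabilities: [gpu]

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"   # host port 3000 is an arbitrary choice
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434
    volumes:
      - open-webui:/app/backend/data
    depends_on:
      - ollama

volumes:
  ollama:
  open-webui:
```

Keeping Ollama reachable only on the internal Compose network (no published 11434 port) is one way to preserve the "complete data privacy" the entry advertises.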
Streamlined Ollama WebUI Setup: Automated Scripts, LLM Integration, and Desktop Shortcut Creation
ollama-wrapper cli (+ webUI)