OpenWebUI-LangFlow Integration Platform
A production-ready framework connecting OpenWebUI's chat interface with LangFlow's AI workflow engine via Python pipelines. Build sophisticated conversational AI applications with visual workflow management on Kubernetes.
Table of Contents
- Quick Start
- Architecture
- Components
- Available Pipelines
- Installation
- Pipeline Development
- Daily Workflow
- Deployment Options
- Configuration
- Troubleshooting
- Contributing
Quick Start

```bash
git clone https://github.com/pawelrosada/langflow-ui.git
cd langflow-ui

# Complete setup (creates cluster, deploys apps, loads pipelines)
task quickstart

# Access applications:
# OpenWebUI:     http://localhost:3000
# Langflow:      http://localhost:7860
# Pipelines API: http://localhost:9099
```

Architecture
For comprehensive technical documentation, component specifications, and advanced architecture patterns, see ARCHITECTURE.md.
System Overview
```mermaid
graph TB
    User([User]) --> OpenWebUI["OpenWebUI<br/>Chat Interface<br/>Port 3000"]
    OpenWebUI --> |"OpenAI API<br/>HTTP REST"| Pipelines["Pipelines<br/>Integration Layer<br/>Port 9099"]
    Pipelines --> |"HTTP API<br/>JSON"| LangFlow["LangFlow<br/>Workflow Engine<br/>Port 7860"]
    LangFlow --> |SQL| PostgreSQL[("PostgreSQL<br/>Database<br/>Port 5432")]
    LangFlow --> |"API Calls"| AIModels["AI Models<br/>GPT-4, Gemini, Claude"]
    AIModels --> LangFlow
    LangFlow --> Pipelines
    Pipelines --> OpenWebUI
    OpenWebUI --> User

    subgraph DockerNet ["Docker Network"]
        OpenWebUI
        Pipelines
        LangFlow
        PostgreSQL
    end

    classDef frontend fill:#e1f5fe
    classDef integration fill:#f3e5f5
    classDef workflow fill:#e8f5e8
    classDef database fill:#fff3e0
    classDef external fill:#fce4ec

    class OpenWebUI frontend
    class Pipelines integration
    class LangFlow workflow
    class PostgreSQL database
    class AIModels external
```
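The OpenWebUI → Pipelines hop above uses an OpenAI-compatible chat completions API, where each pipeline's id doubles as a model name. A minimal sketch of building that request, assuming a pipeline id of `my_custom` (verify the id and API key against your deployment):

```python
def build_chat_request(pipeline_id: str, user_message: str) -> tuple[str, dict]:
    """Build the URL and JSON body for an OpenAI-style chat completion
    against the Pipelines service (port 9099, see the diagram above)."""
    url = "http://localhost:9099/v1/chat/completions"
    body = {
        "model": pipeline_id,  # the pipeline id is used as the model name
        "messages": [{"role": "user", "content": user_message}],
        "stream": False,
    }
    return url, body

url, body = build_chat_request("my_custom", "Hello!")
print(url)
```

Send it with any HTTP client, e.g. `requests.post(url, json=body, headers={"Authorization": "Bearer <API_KEY>"})`.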
Data Flow Architecture
```mermaid
sequenceDiagram
    participant U as User
    participant OW as OpenWebUI
    participant P as Pipelines
    participant LF as LangFlow
    participant DB as PostgreSQL
    participant AI as AI Models

    U->>OW: Send Message
    OW->>P: POST /v1/chat/completions
    P->>P: Rate Limiting & Validation
    P->>LF: POST /api/v1/run/{workflow_id}
    LF->>DB: Load Workflow Config
    DB-->>LF: Workflow Definition
    LF->>AI: API Request (GPT/Gemini/Claude)
    AI-->>LF: AI Response
    LF->>DB: Store Conversation
    LF-->>P: JSON Response
    P->>P: Format & Process
    P-->>OW: Streaming Response
    OW-->>U: Display Response
```
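The Pipelines → LangFlow hop can be sketched the same way. The payload shape below (`input_value`, `input_type`, `output_type`) follows LangFlow's run API, but verify it against your LangFlow version; the workflow id is a placeholder:

```python
LANGFLOW_URL = "http://localhost:7860"  # LangFlow API, see the ports above

def build_run_request(workflow_id: str, message: str) -> tuple[str, dict]:
    """Build the URL and JSON body for LangFlow's workflow run endpoint."""
    url = f"{LANGFLOW_URL}/api/v1/run/{workflow_id}"
    body = {
        "input_value": message,   # the user's chat message
        "input_type": "chat",
        "output_type": "chat",
    }
    return url, body

url, body = build_run_request("my-workflow-id", "Hello!")
print(url)
```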
Components
- OpenWebUI: Modern chat interface with user management
- LangFlow: Visual AI workflow builder with 200+ components
- Pipelines: 25+ Python integrations connecting chat to workflows
- PostgreSQL: Persistent data storage for workflows and conversations
Available Pipelines
- Pipeline Generator: Auto-creates pipelines from LangFlow workflows
- Content: Blog Writer, Instagram Copywriter, Twitter Thread Generator
- Research: Market Research, Financial Analysis, News Aggregation
- AI Models: OpenAI, Claude, Gemini integrations
- RAG: Vector Store, Hybrid Search, Document Processing
- Agents: Search, Social Media, Customer Support
Installation
Prerequisites
- Docker (8GB+ RAM recommended)
- Internet connection for AI model APIs
Option 1: Docker Compose (Quick)

```bash
./setup-openwebui.sh
```

Option 2: Kubernetes Development (Full Platform)

```bash
task setup   # Install tools, create cluster
task start   # Start environment
task deploy  # Deploy applications
task status  # Check everything
```

Pipeline Development
Update Pipelines

```bash
# Modify files in pipelines/ directory
task update-pipelines   # Deploy to cluster
task pipelines-status   # Verify loaded
```

Create Custom Pipeline

```python
# pipelines/my_pipeline.py
class Pipeline:
    def __init__(self):
        self.name = "My Custom Pipeline"
        self.id = "my_custom"

    def pipe(self, user_message, model_id, messages, body):
        # Your logic here
        return f"Processed: {user_message}"
```

Pipeline Generator Usage
- Access "Pipeline Generator" in OpenWebUI
- Automatically discovers LangFlow workflows
- Generates Python pipeline files
- Persists across restarts
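Before deploying, a custom pipeline can be smoke-tested locally without OpenWebUI or a cluster. A minimal sketch, with the example `Pipeline` class inlined so the snippet runs on its own:

```python
# Inline copy of the pipelines/my_pipeline.py example above.
class Pipeline:
    def __init__(self):
        self.name = "My Custom Pipeline"
        self.id = "my_custom"

    def pipe(self, user_message, model_id, messages, body):
        return f"Processed: {user_message}"

pipe = Pipeline()
# Call pipe() the way the Pipelines server would: user message, model id,
# prior messages, and the raw request body.
result = pipe.pipe("hello", pipe.id, [], {})
print(result)  # Processed: hello
```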
Pipeline Architecture
```mermaid
graph TB
    OpenWebUI["OpenWebUI"] --> |"HTTP Request"| Pipelines["Pipelines Service"]

    subgraph PipelinesBox ["Pipelines Container"]
        Generator["Pipeline Generator"]
        Pipeline1["Blog Writer"]
        Pipeline2["Research Agent"]
        PipelineN["... 25+ Pipelines"]
    end

    Pipelines --> |"Workflow Execution"| Langflow["Langflow"]

    subgraph Storage ["Persistent Storage"]
        PVC["PersistentVolumeClaim"]
        ConfigMap["Pipeline Files ConfigMap"]
    end

    ConfigMap --> |"Init Container"| Pipelines
    Pipelines --> |"Generated Files"| PVC
```
Daily Workflow

```bash
task up                 # Start everything
task status             # Check services
task logs               # View application logs
task update-pipelines   # Update pipeline code
task stop               # Stop environment
```

Deployment Options
Development: Kubernetes with Kind cluster (auto-managed)
Production: See HELM_DEVELOPMENT.md for scaling, security, monitoring
Configuration
- Database: PostgreSQL for all services
- Persistence: Automatic with PersistentVolumes
- Secrets: Kubernetes secrets (dev) or external vault (prod)
- Scaling: Horizontal pod autoscaling available
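Inside a pipeline, provider credentials are typically read from environment variables that Kubernetes injects from the secrets mentioned above. A hedged sketch (the variable name and helper are illustrative, not part of the framework):

```python
import os

def get_api_key(name: str) -> str:
    """Read a provider API key injected into the pod environment,
    e.g. from a Kubernetes secret via env.valueFrom.secretKeyRef."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(f"Missing required secret: {name}")
    return value

# Stand-in value so the example runs outside a cluster.
os.environ["EXAMPLE_API_KEY"] = "sk-demo"
print(get_api_key("EXAMPLE_API_KEY"))  # sk-demo
```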
Troubleshooting
```bash
# Check pipeline status
task pipelines-status

# View logs
task pipelines-logs

# Restart pipelines
kubectl rollout restart deployment/langflow-app-pipelines

# Test pipeline API
curl -H "Authorization: Bearer API_KEY" http://localhost:9099/v1/models
```

Contributing
- Fork the repository
- Modify pipelines or add new ones
- Test with `task update-pipelines`
- Submit focused pull requests

Key Files:

- `pipelines/` - Pipeline implementations
- `helm/` - Kubernetes deployment
- `scripts/` - Automation tools
- `Taskfile.yml` - Development commands