1,831 results for “topic:local”
🏩 A simple process manager for developers. Start apps from your browser and access them using local domains
A development tool for all your projects that is fast, easy, powerful and liberating
Local Deep Research achieves ~95% on SimpleQA benchmark (tested with GPT-4.1-mini). Supports local and cloud LLMs (Ollama, Google, Anthropic, ...). Searches 10+ sources - arXiv, PubMed, web, and your private documents. Everything Local & Encrypted.
Claudable is an open-source web builder that leverages local CLI agents, such as Claude Code, Codex, Gemini CLI, Qwen Code, and Cursor Agent, to build and deploy products effortlessly.
Tired of pushing to test your .gitlab-ci.yml?
Run PyTorch LLMs locally on servers, desktop and mobile
TypeScript-centric app development platform: notebook and AI app builder
System font stack CSS organized by typeface classification for every modern operating system
Personal AI Notebooks. Organize files & webpages and generate notes from them. Open source, local & open data, open model choice (incl. local).
A datetime library for Rust that encourages you to jump into the pit of success.
WordPress LEMP stack with PHP 8.3, Composer, WP-CLI and more
One command brings a complete pre-wired LLM stack with hundreds of services to explore.
An open-source approach to locally record and search everything you view on your Mac.
🌧 An easy-to-use API for devices that use Tuya's cloud services. Documentation: https://codetheweb.github.io/tuyapi.
Text-To-Speech, RAG, and LLMs. All local!
ExHentai local manga tag manager and reader
Supabase CLI. Manage postgres migrations, run Supabase locally, deploy edge functions. Postgres backups. Generating types from your database schema.
Authentication built for Nuxt 3! Easily add authentication via OAuth providers, credentials or Email Magic URLs!
Interactive, editable docs designed for coding agents
Fully-featured web interface for Ollama LLMs
Run local LLMs like llama, deepseek-distill, kokoro and more inside your browser
A VM for Drupal development
A curated list of awesome platforms, tools, practices and resources that help you run LLMs locally
Highly Performant, Modular, Memory Safe and Production-ready Inference, Ingestion and Indexing built in Rust 🦀
Build AI agents for your PC
ChattyUI - your private AI chat for running LLMs in the browser
My default LAMP development stack for Vagrant
Carina: a high-performance, ops-free local storage solution for Kubernetes
A local, offline (after initial setup), portable OCR application that can process images and PDF files, using DeepSeek-OCR AI (running directly on your machine).
My personal notes on local and global descriptors