Run AI models, databases, and services entirely on your machine. Perfect for AI assistants like Claude Code, Cursor, and Gemini CLI. Develop faster, spend nothing, keep your data private.
LocalCloud orchestrates popular open-source tools into a cohesive platform
Run Llama, Mistral, Qwen, and other LLMs locally through an OpenAI-compatible API
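Because the API speaks the OpenAI wire format, any OpenAI client or a plain `curl` can talk to it. A minimal sketch, assuming the LLM service listens on Ollama's default port 11434 and that `llama3.2:3b` has been pulled; substitute whatever address and model your LocalCloud project actually reports:

```shell
# Port and model name are assumptions; adjust to your local setup.
curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "llama3.2:3b",
        "messages": [{"role": "user", "content": "Say hello in one word."}]
      }'
```

Pointing an existing OpenAI SDK at this base URL (with any placeholder API key) works the same way, so application code needs no changes to run against the local model.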
PostgreSQL with vector extensions and MongoDB for flexible data storage
Redis caching and job queues for lightning-fast applications
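Both patterns are plain Redis usage: `SET` with a TTL for caching, and a list used as a queue for background jobs. A sketch with `redis-cli`, assuming Redis is reachable on its default port 6379:

```shell
# Cache a value with a 60-second TTL
redis-cli SET greeting "hello" EX 60

# Enqueue a job as JSON; a worker pops it with a blocking read
redis-cli LPUSH jobs '{"task":"resize","id":42}'
redis-cli BRPOP jobs 5   # blocks up to 5 seconds waiting for work
```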
S3-compatible storage with MinIO for files, images, and documents
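Any S3 client works against MinIO once it is pointed at the local endpoint. A sketch using the AWS CLI, assuming MinIO's stock defaults (port 9000, `minioadmin` credentials); your LocalCloud instance may report different values:

```shell
# Endpoint and credentials are MinIO defaults, shown as an assumption
export AWS_ACCESS_KEY_ID=minioadmin
export AWS_SECRET_ACCESS_KEY=minioadmin

aws --endpoint-url http://localhost:9000 s3 mb s3://uploads
aws --endpoint-url http://localhost:9000 s3 cp report.pdf s3://uploads/
```

The only change from talking to real S3 is the `--endpoint-url` flag, so code written against MinIO locally moves to cloud storage unmodified.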
Share your work instantly with Cloudflare and ngrok integration
Built-in RAG support with pgvector for semantic search applications
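At the database level, RAG boils down to storing embeddings in a `vector` column and ranking rows by distance to a query embedding. A sketch via `psql`, with hypothetical connection details (check your project's configuration for the real host, database, and credentials); the pgvector syntax itself is standard:

```shell
# Connection string is illustrative only; substitute your own.
psql "postgresql://localcloud@localhost:5432/localcloud" <<'SQL'
-- Enable pgvector; 3 dimensions keeps the example readable
-- (real embedding models use hundreds of dimensions)
CREATE EXTENSION IF NOT EXISTS vector;
CREATE TABLE IF NOT EXISTS docs (
    id        serial PRIMARY KEY,
    body      text,
    embedding vector(3)
);
INSERT INTO docs (body, embedding)
VALUES ('hello world', '[0.1, 0.2, 0.3]');
-- Nearest neighbours by cosine distance (<=> operator)
SELECT body FROM docs ORDER BY embedding <=> '[0.1, 0.2, 0.25]' LIMIT 5;
SQL
```

Swap `<=>` for `<->` to rank by Euclidean distance instead of cosine distance.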
Non-interactive setup perfect for Claude Code, Cursor, and Gemini CLI
One command starts everything you need
LocalCloud is optimized for AI-powered development workflows
lc setup my-app --preset=ai-dev --yes
Ideal for Claude Code, Cursor, and Gemini CLI: no arrow keys or space bar required
cat CLAUDE.md # Auto-created guidance
Complete project documentation automatically generated for AI assistants
--components=llm,database,vector --models=llama3.2:3b
Precise control over infrastructure without interactive prompts
From zero to AI app in under 5 minutes
curl -fsSL https://localcloud.sh/install | bash
Works on macOS, Linux, and Windows
lc start
All services come up automatically. Start coding immediately!
See what developers are creating with LocalCloud
ChatGPT-like interface with conversation history and memory
Semantic search over your documents with vector embeddings
REST API with background job processing and caching
Connect with developers building the future of local-first AI