🤖 AI Assistant Ready
Claude Code • Cursor • Gemini CLI
🚀 Open Source 🏠 Local-First 💰 Zero Cost

Build AI Applications
Without Cloud Costs

Run AI models, databases, and services entirely on your machine. Perfect for AI assistants like Claude Code, Cursor, and Gemini CLI. Develop faster, spend nothing, keep your data private.

macOS / Linux
$ curl -fsSL https://localcloud.sh/install | bash

Windows
PS> iwr -useb https://localcloud.sh/install.ps1 | iex
$0 Development Cost
5min Setup Time
100% Open Source
LocalCloud Setup
$ lc setup my-ai-app

Everything You Need for AI Development

LocalCloud orchestrates popular open-source tools into a cohesive platform

🤖 Local AI Models

Run Llama, Mistral, Qwen, and other LLMs locally with an OpenAI-compatible API

Ollama 70+ Models
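
For example, once the Ollama service is up on its default port (11434, as shown in the status panel below), any OpenAI-compatible client can talk to it directly. A minimal Python sketch, assuming the openai package is installed and the llama3.2:3b model has been pulled:

from openai import OpenAI

# Point the standard OpenAI client at the local Ollama endpoint.
# Ollama ignores the API key, but the client requires one to be set.
client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

response = client.chat.completions.create(
    model="llama3.2:3b",
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(response.choices[0].message.content)
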
🗄️ Multiple Databases

PostgreSQL with the pgvector extension and MongoDB for flexible document storage

PostgreSQL MongoDB pgvector
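
As a rough sketch of what working with the bundled PostgreSQL looks like, here is a psycopg2 snippet that enables pgvector and creates a table with an embedding column; the database name and credentials are placeholders, substitute whatever LocalCloud configured for your project:

import psycopg2

# Placeholder connection details; use the ones from your LocalCloud project.
conn = psycopg2.connect(host="localhost", port=5432,
                        dbname="my_ai_app", user="localcloud", password="changeme")
cur = conn.cursor()

# Enable pgvector and create a table with a 768-dimensional embedding column.
cur.execute("CREATE EXTENSION IF NOT EXISTS vector;")
cur.execute("""
    CREATE TABLE IF NOT EXISTS documents (
        id SERIAL PRIMARY KEY,
        content TEXT,
        embedding vector(768)
    );
""")
conn.commit()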

High Performance

Redis caching and job queues for lightning-fast applications

Redis Caching Queues
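
A quick sketch of both patterns with the redis Python package, assuming the default port 6379 shown below; key names are illustrative only:

import redis

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

# Caching: store a computed answer for one hour.
r.setex("cache:faq:what-is-rag", 3600, "Retrieval-augmented generation combines search with an LLM.")
print(r.get("cache:faq:what-is-rag"))

# Job queue: producers LPUSH work items, workers BRPOP them.
r.lpush("jobs", "embed:document-42")
print(r.brpop("jobs", timeout=5))  # -> ('jobs', 'embed:document-42')
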
📦 Object Storage

S3-compatible storage with MinIO for files, images, and documents

MinIO S3 API
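
Because MinIO speaks the S3 API, the usual S3 SDKs work unchanged. A sketch with boto3, where the endpoint port matches the status panel below and the access keys are placeholders for the ones LocalCloud generates:

import boto3

# Placeholder credentials; use the access/secret keys from your LocalCloud project.
s3 = boto3.client(
    "s3",
    endpoint_url="http://localhost:9000",
    region_name="us-east-1",
    aws_access_key_id="minioadmin",
    aws_secret_access_key="minioadmin",
)

s3.create_bucket(Bucket="uploads")
s3.put_object(Bucket="uploads", Key="hello.txt", Body=b"Hello from LocalCloud")
print([obj["Key"] for obj in s3.list_objects_v2(Bucket="uploads").get("Contents", [])])
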
🌐 Secure Tunneling

Expose your locally running services to the internet instantly with Cloudflare and Ngrok integration

Cloudflare Ngrok
🔍 Vector Search

Built-in RAG support with pgvector for semantic search applications

RAG Embeddings
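
Putting the pieces together, the retrieval half of a RAG pipeline can be as small as the sketch below. It reuses the documents table from the PostgreSQL example above and assumes an embedding model such as nomic-embed-text has been pulled into Ollama (model name and connection details are placeholders):

import psycopg2
import requests

def embed(text):
    # Ollama's embeddings endpoint; swap in whichever embedding model you pulled.
    resp = requests.post("http://localhost:11434/api/embeddings",
                         json={"model": "nomic-embed-text", "prompt": text})
    return resp.json()["embedding"]

conn = psycopg2.connect(host="localhost", port=5432,
                        dbname="my_ai_app", user="localcloud", password="changeme")
cur = conn.cursor()

# Nearest-neighbour search by cosine distance (<=> is pgvector's cosine operator).
query_vec = embed("How do I reset my password?")
cur.execute("SELECT content FROM documents ORDER BY embedding <=> %s::vector LIMIT 3",
            (str(query_vec),))
for (content,) in cur.fetchall():
    print(content)
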
🤖 AI Assistant Ready

Non-interactive setup perfect for Claude Code, Cursor, and Gemini CLI

Non-Interactive Auto-CLAUDE.md Presets

All Services Included

One command starts everything you need

LocalCloud Status

All Services Running
🤖 AI Service (Ollama): Port 11434 • llama3.2:3b • Running
🗄️ PostgreSQL: Port 5432 • pgvector enabled • Running
🍃 MongoDB: Port 27017 • Auth enabled • Running
Redis Cache: Port 6379 • 256MB allocated • Running
📦 MinIO Storage: Port 9000 • S3 Compatible • Running
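
If you prefer to verify the stack from code rather than the dashboard, a plain port check against the defaults listed above is enough; a minimal standard-library sketch:

import socket

SERVICES = {"Ollama": 11434, "PostgreSQL": 5432, "MongoDB": 27017,
            "Redis": 6379, "MinIO": 9000}

for name, port in SERVICES.items():
    # connect_ex returns 0 when something is listening on the port.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.settimeout(1)
        reachable = sock.connect_ex(("localhost", port)) == 0
    print(f"{name:<11} port {port:<5} {'running' if reachable else 'not reachable'}")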

Built for AI Assistants

LocalCloud is optimized for AI-powered development workflows

🚀 Non-Interactive Setup

lc setup my-app --preset=ai-dev --yes

Perfect for Claude Code, Cursor, and Gemini CLI - no arrow keys or space bar needed

📝 Auto-Generated CLAUDE.md

cat CLAUDE.md # Auto-created guidance

Complete project documentation automatically generated for AI assistants

⚙️ Component Control

--components=llm,database,vector --models=llama3.2:3b

Precise control over infrastructure without interactive prompts

Available Presets

ai-dev: AI + Database + Vector + Cache (--preset=ai-dev)
full-stack: All services + Storage + Tunneling (--preset=full-stack)
minimal: AI models only (--preset=minimal)

Get Started in 3 Steps

From zero to AI app in under 5 minutes

1. Install LocalCloud

curl -fsSL https://localcloud.sh/install | bash

Works on macOS and Linux; Windows users can run the PowerShell installer shown above

2. Set Up Your Project

lc setup my-ai-app --preset=ai-dev --yes

AI Assistant mode (Claude Code, Cursor, Gemini CLI)

OR
lc setup my-ai-app

Interactive wizard for manual setup

3. Start Building

lc start

All services start automatically. Start coding immediately!

Build Amazing Apps

See what developers are creating with LocalCloud

💬 AI Chat Application

ChatGPT-like interface with conversation history and memory

Ollama PostgreSQL Redis
View Template →
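
At its core, that kind of app is an append-only message list replayed to the model on every turn; a minimal terminal sketch against the local Ollama endpoint (persisting the history to PostgreSQL or Redis is left out):

from openai import OpenAI

client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")
history = [{"role": "system", "content": "You are a helpful assistant."}]

while True:
    user_input = input("you> ")
    history.append({"role": "user", "content": user_input})
    reply = client.chat.completions.create(model="llama3.2:3b", messages=history)
    answer = reply.choices[0].message.content
    history.append({"role": "assistant", "content": answer})
    print("ai>", answer)
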
🔍 RAG Document Search

Semantic search over your documents with vector embeddings

pgvector Embeddings MinIO
View Template →
🤖 AI API Backend

REST API with background job processing and caching

FastAPI Queue MongoDB
View Template →
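
Stripped of queueing and persistence, the heart of such a backend is a single route that forwards prompts to the local model; a hedged FastAPI sketch (route, model, and field names are illustrative):

from fastapi import FastAPI
from pydantic import BaseModel
from openai import OpenAI

app = FastAPI()
llm = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

class Prompt(BaseModel):
    text: str

@app.post("/generate")
def generate(prompt: Prompt):
    # Forward the prompt to the local model and return its answer.
    result = llm.chat.completions.create(
        model="llama3.2:3b",
        messages=[{"role": "user", "content": prompt.text}],
    )
    return {"answer": result.choices[0].message.content}

# Run with: uvicorn main:app --reload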

Join the Community

Connect with developers building the future of local-first AI

100% Open Source
Apache 2.0 Licensed
Local-First Development