Quick Start
Get your first LocalCloud project running in 5 minutes. We'll create a simple AI chat application that runs entirely on your local machine.

Make sure you've installed LocalCloud before starting this guide.
Create Your First Project
Step 1: Initialize the project
This creates a `.localcloud` directory with your project configuration.
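The initialization step above might look like this. The `lc setup` subcommand name is an assumption based on the `lc` prefix used elsewhere in this guide; check `lc --help` for the authoritative command.

```shell
# Create and initialize a new LocalCloud project
# (subcommand name assumed — verify with `lc --help`)
lc setup my-chat-app
cd my-chat-app
```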
Step 2: Configure services

You'll be prompted to choose:
- AI Model: Choose a model (e.g., `llama2`, `mistral`, `qwen2.5`)
- Database: PostgreSQL for data storage
- Cache: Redis for performance
- Storage: MinIO for file storage
Step 3: Start services

Starting the services will:
- Download required Docker images
- Start all services
- Configure networking
- Show connection details
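With configuration done, start the stack. `lc start` is the command referenced again in the Tips at the end of this guide.

```shell
# Start all configured services; the first run downloads Docker images
lc start
```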
Verify Everything is Running
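A status check from the CLI; the `status` subcommand name is an assumption (only `lc start` and `lc reset` appear elsewhere in this guide), so confirm with `lc --help`.

```shell
# Show the state and port of each service (subcommand name assumed)
lc status
```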
Check the status of your services and confirm everything shows as running.

Test the AI Service
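A minimal way to exercise the model is a direct call to Ollama's generate endpoint on its default port. This is a sketch: the model name below is an example, so substitute one you actually installed during setup.

```python
"""Quick test that the local AI model answers (app-independent)."""
import json
import urllib.request

def build_payload(model, prompt):
    # Non-streaming request body for Ollama's /api/generate endpoint
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model, prompt, host="http://localhost:11434"):
    """POST a prompt to the local Ollama server and return its reply text."""
    data = json.dumps(build_payload(model, prompt)).encode()
    req = urllib.request.Request(
        host + "/api/generate", data=data,
        headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]

if __name__ == "__main__":
    # "qwen2.5" is an example — use a model you selected during setup
    print(generate("qwen2.5", "Say hello in one short sentence."))
```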
Let's test that the AI model is working: a short prompt should come back with a generated reply.

Connect to Services
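A quick reachability check against the default local ports listed in this section can confirm every service is accepting connections; a sketch using only the standard library:

```python
"""TCP reachability check for the default LocalCloud service ports."""
import socket

# Host/port values are the defaults from the connection details below
SERVICES = {
    "ollama":   ("localhost", 11434),
    "postgres": ("localhost", 5432),
    "redis":    ("localhost", 6379),
    "minio":    ("localhost", 9000),
}

def is_open(host, port, timeout=1.0):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    for name, (host, port) in SERVICES.items():
        state = "up" if is_open(host, port) else "DOWN"
        print(f"{name:10s} {host}:{port}  {state}")
```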
Each service is now accessible locally:

AI Model (Ollama)
- API: http://localhost:11434
- Models endpoint: http://localhost:11434/api/tags
- Compatible with OpenAI API format
PostgreSQL Database

- Host: localhost
- Port: 5432
- Username: localcloud
- Password: localcloud
- Database: localcloud
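The settings above combine into a standard libpq-style connection URL, usable with clients such as psycopg or SQLAlchemy; a small sketch:

```python
# Default LocalCloud PostgreSQL settings from the list above
PG = {"host": "localhost", "port": 5432, "user": "localcloud",
      "password": "localcloud", "dbname": "localcloud"}

def dsn(cfg):
    """Build a libpq-style connection URL from a settings dict."""
    return "postgresql://{user}:{password}@{host}:{port}/{dbname}".format(**cfg)

# e.g. sqlalchemy.create_engine(dsn(PG)) or psycopg.connect(dsn(PG))
```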
Redis Cache

- Host: localhost
- Port: 6379
- No authentication by default
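Most Redis clients accept a URL built from these defaults (redis-py, for example, via `redis.from_url(...)`); a tiny helper as a sketch:

```python
def redis_url(host="localhost", port=6379, db=0):
    """Build a redis:// URL from the default LocalCloud settings above."""
    # No auth segment, since authentication is off by default
    return f"redis://{host}:{port}/{db}"
```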
MinIO Storage

- API: http://localhost:9000
- Console: http://localhost:9001
- Access Key: minioadmin
- Secret Key: minioadmin
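MinIO speaks the S3 API, so the defaults above slot into any S3-compatible client; a settings sketch (the keyword names match boto3's `client()` parameters, though no S3 library is required here):

```python
# Default LocalCloud MinIO settings from the list above, shaped for an
# S3-compatible client, e.g. boto3.client("s3", **S3_SETTINGS)
S3_SETTINGS = {
    "endpoint_url": "http://localhost:9000",
    "aws_access_key_id": "minioadmin",
    "aws_secret_access_key": "minioadmin",
}
```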
Build a Simple Chat Application

Let's create a basic chat application using the services. The guide provides Python and Node.js versions; the Python one is shown here.

Create `app.py`, then run it with `python app.py`.
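A sketch of what `app.py` might contain, using only the standard library. It assumes Ollama's chat endpoint on the default port; the model name is an example, so use one you installed during setup.

```python
"""Minimal terminal chat client for the local Ollama service (app.py)."""
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/chat"
MODEL = "qwen2.5"  # example — replace with a model you installed

def build_request(history):
    """Build the non-streaming JSON body for Ollama's chat endpoint."""
    return {"model": MODEL, "messages": history, "stream": False}

def chat(history):
    """Send the conversation so far and return the assistant's reply."""
    body = json.dumps(build_request(history)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=body,
        headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["message"]["content"]

if __name__ == "__main__":
    history = []
    print("Type 'quit' to exit.")
    while True:
        user = input("You: ")
        if user.strip().lower() in {"quit", "exit"}:
            break
        history.append({"role": "user", "content": user})
        reply = chat(history)
        history.append({"role": "assistant", "content": reply})
        print("AI:", reply)
```

Keeping the full `history` list in each request is what gives the model conversational memory across turns.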
Managing Your Project
View logs
Stop services
Add more models
Check resource usage
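The tasks above map to CLI subcommands. The names below are assumptions (only `lc start` and `lc reset` appear elsewhere in this guide), so treat them as a sketch and confirm with `lc --help`:

```shell
# Hypothetical subcommand names — verify with `lc --help`
lc logs                 # view logs
lc stop                 # stop services
lc models pull mistral  # add more models
lc info                 # check resource usage
```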
What’s Next?
Congratulations! You now have a fully functional local AI development environment. Here's what you can explore next:

- CLI Commands: master all LocalCloud commands
- Service Configuration: customize your services
- AI Models: explore available AI models
- Examples: see more example applications
Tips
- Model Performance: Smaller models like `qwen2.5:3b` or `phi3` run faster on modest hardware. Start with these if you have limited resources.
- First Start: The initial `lc start` may take several minutes as Docker images are downloaded. Subsequent starts will be much faster.
- Data Persistence: All your data is stored in Docker volumes. Use `lc reset --hard` only if you want to completely remove all data.