Services

LocalCloud provides a suite of integrated services that work together to create a complete local development environment. Each service runs in its own Docker container and is managed by the LocalCloud CLI.

Available Services

AI Service (Ollama)

Run language models locally with an OpenAI-compatible API

PostgreSQL

Full-featured relational database with vector extensions

MongoDB

NoSQL document database for flexible data storage

Redis

High-performance caching and message queuing

MinIO

S3-compatible object storage for files and media

Service Architecture

AI Service (Ollama)

Ollama enables you to run large language models locally with excellent performance.

Features

  • OpenAI API Compatibility: Drop-in replacement for OpenAI API
  • Multiple Models: Support for Llama, Mistral, Qwen, and more
  • GPU Acceleration: Automatic GPU detection and utilization
  • Model Management: Easy model downloading and switching

Default Configuration

ai:
  type: ollama
  port: 11434
  models: []  # Models are pulled on demand

Supported Models

  • llama2 - Meta’s Llama 2 (7B)
  • llama3 - Meta’s Llama 3 (8B)
  • mistral - Mistral AI’s model (7B)
  • mixtral - Mistral’s MoE model (8x7B)
  • qwen2.5 - Alibaba’s Qwen models

API Endpoints

  • Generate: POST http://localhost:11434/api/generate
  • Chat: POST http://localhost:11434/api/chat
  • Models: GET http://localhost:11434/api/tags
  • Embeddings: POST http://localhost:11434/api/embeddings
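As a sketch of calling the chat endpoint from Python using only the standard library (the `build_chat_request` and `chat` helper names are illustrative, and the model you pass must already be pulled):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # default Ollama port

def build_chat_request(model, messages):
    """Build the JSON body for POST /api/chat (stream disabled for simplicity)."""
    return {"model": model, "messages": messages, "stream": False}

def chat(model, messages):
    """Send a chat request to the local Ollama instance and return the parsed reply."""
    body = json.dumps(build_chat_request(model, messages)).encode()
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/chat",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```

Because the API is OpenAI-compatible, existing OpenAI client libraries can also be pointed at this endpoint by overriding their base URL.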

PostgreSQL Database

Enterprise-grade relational database with powerful extensions for AI applications.

Features

  • Version 16: Latest stable PostgreSQL
  • Vector Storage: pgvector extension for embeddings
  • Full-Text Search: Built-in FTS capabilities
  • JSON Support: Native JSONB data type
  • Extensions: pgvector, pg_trgm, and more

Default Configuration

database:
  type: postgres
  version: "16"
  port: 5432
  name: localcloud
  user: localcloud
  password: localcloud  # Auto-generated in production
  extensions:
    - pgvector
    - pg_trgm

Connection Details

postgresql://localcloud:localcloud@localhost:5432/localcloud

Common Extensions

  • pgvector: Store and query vector embeddings
  • pg_trgm: Trigram-based text similarity
  • uuid-ossp: UUID generation
  • hstore: Key-value storage
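A minimal Python sketch of vector search through pgvector, using the connection details above. It assumes the `psycopg2` driver is installed; the `documents` table and its 768-dimension embedding column are illustrative application schema, not something LocalCloud creates for you:

```python
# Connection string matching the default LocalCloud PostgreSQL configuration
DSN = "postgresql://localcloud:localcloud@localhost:5432/localcloud"

# Illustrative schema: enable pgvector and store one embedding per document
CREATE_TABLE_SQL = """
CREATE EXTENSION IF NOT EXISTS vector;
CREATE TABLE IF NOT EXISTS documents (
    id SERIAL PRIMARY KEY,
    content TEXT,
    embedding vector(768)
);
"""

def nearest_documents(query_embedding, limit=5):
    """Return the documents closest to query_embedding by L2 distance (<-> operator)."""
    import psycopg2  # assumed installed: pip install psycopg2-binary
    with psycopg2.connect(DSN) as conn, conn.cursor() as cur:
        cur.execute(
            "SELECT id, content FROM documents "
            "ORDER BY embedding <-> %s::vector LIMIT %s",
            (str(query_embedding), limit),
        )
        return cur.fetchall()
```

pgvector also provides cosine (`<=>`) and inner-product (`<#>`) distance operators if those better match your embedding model.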

Redis Cache

High-performance in-memory data store for caching and messaging.

Features

  • Caching: Sub-millisecond response times
  • Pub/Sub: Real-time messaging
  • Queues: Job and task queuing
  • Data Structures: Lists, sets, sorted sets, streams

Default Configuration

cache:
  type: redis
  version: "7"
  port: 6379

queue:
  type: redis  # Shares same instance
  port: 6379
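A small cache-aside sketch in Python against the default port, assuming the `redis` client package is installed; `cache_key` and `get_or_set` are illustrative helpers, not part of LocalCloud:

```python
CACHE_TTL = 300  # cache entries expire after five minutes

def connect():
    """Open a client against the default LocalCloud Redis port."""
    import redis  # assumed installed: pip install redis
    return redis.Redis(host="localhost", port=6379, decode_responses=True)

def cache_key(namespace, ident):
    """Build a namespaced key, e.g. 'user:42'."""
    return f"{namespace}:{ident}"

def get_or_set(client, key, compute):
    """Cache-aside read: return the cached value, computing and storing it on a miss."""
    value = client.get(key)
    if value is None:
        value = compute()
        client.set(key, value, ex=CACHE_TTL)
    return value
```

Since the queue shares the same Redis instance, namespaced keys (as in `cache_key`) help keep cached values and queued jobs from colliding.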

Use Cases

  • Caching: API responses, rendered pages, computed results
  • Sessions: User session storage
  • Pub/Sub: Real-time event broadcasting
  • Queues: Background job processing

MinIO Storage

S3-compatible object storage for files, images, and documents.

Features

  • S3 Compatible: Works with existing S3 SDKs
  • Web Console: Visual file management
  • Buckets: Organize files logically
  • Versioning: Track file changes

Default Configuration

storage:
  type: minio
  port: 9000      # API port
  console: 9001   # Web UI port
  access_key: minioadmin
  secret_key: minioadmin  # Change in production

Access Points

  • API Endpoint: http://localhost:9000
  • Web Console: http://localhost:9001
  • Default Credentials: Access Key minioadmin, Secret Key minioadmin

SDK Examples

import boto3

# Point the S3 client at the local MinIO endpoint
s3 = boto3.client(
    's s3'.strip(),
    endpoint_url='http://localhost:9000',
    aws_access_key_id='minioadmin',
    aws_secret_access_key='minioadmin',
)

# Create bucket
s3.create_bucket(Bucket='uploads')

# Upload file
s3.upload_file('document.pdf', 'uploads', 'document.pdf')

Service Lifecycle

Starting Services

Services are started in dependency order:
  1. Network Creation: Docker network for inter-service communication
  2. Database: PostgreSQL starts first
  3. Cache: Redis starts next
  4. Storage: MinIO initializes
  5. AI: Ollama starts last (may download models)

# Start all services
lc start

# Start specific service
lc start postgres

Health Checks

Each service includes health checks:

healthcheck:
  test: ["CMD", "curl", "-f", "http://localhost:11434/"]
  interval: 30s
  timeout: 10s
  retries: 3

Data Persistence

All service data is persisted in Docker volumes:
  • Ollama Models: localcloud_ollama_models
  • PostgreSQL Data: localcloud_postgres_data
  • Redis Data: localcloud_redis_data
  • MinIO Data: localcloud_minio_data

Data persists across lc stop/start cycles. Use lc reset --hard to remove all data.

Service Communication

Services communicate over the Docker network:
  • Internal DNS: Services can reference each other by name
  • Port Mapping: Services are exposed to localhost
  • Network Isolation: Services are isolated from external networks

Example internal communication:

# From within a container, services can use internal names
postgres_url = "postgresql://localcloud:localcloud@postgres:5432/localcloud"
redis_url = "redis://redis:6379"
minio_url = "http://minio:9000"

Resource Management

LocalCloud automatically manages resources:

Memory Limits

resources:
  memory_limit: "4Gi"  # Per service
  cpu_limit: "2"       # CPU cores

Monitoring

# Check resource usage
lc info

# View detailed stats
docker stats

Next Steps