CLI Overview

LocalCloud provides a powerful command-line interface (CLI) for managing your local AI development environment. All commands are available through both localcloud and the shorter lc alias.

Command Structure

lc [command] [subcommand] [flags]
You can use either localcloud or lc; the two are identical. We'll use lc in examples for brevity.

Global Flags

These flags are available for all commands:
Flag        Short   Description              Default
--verbose   -v      Enable verbose output    false
--config    -c      Config file path         ./.localcloud/config.yaml
--project   -p      Project directory path   .
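
The global flags can be combined with any command. A few hedged examples; the paths and project locations below are placeholders, not part of your setup:

```shell
# Verbose output for extra diagnostics
lc --verbose status

# Run against a project in another directory (placeholder path)
lc --project ~/projects/my-app status

# Point at an explicit config file (placeholder path)
lc --config ./custom-config.yaml start
```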

Command Categories

Project Management

Initialize and manage your LocalCloud projects
  • lc setup - Create new project with interactive wizard
  • lc status - View project and service status
  • lc info - System and resource information
  • lc reset - Clean up project data

Service Control

Start, stop, and monitor services
  • lc start - Start all or specific services
  • lc stop - Stop all or specific services
  • lc restart - Restart services
  • lc services - List running services
  • lc logs - View service logs
  • lc ps - Show running containers

AI Models

Manage Ollama language models
  • lc models list - Show available models
  • lc models pull - Download new models
  • lc models remove - Remove models
  • lc models show - Model details

Database Tools

Database management utilities
  • lc mongo - MongoDB shell access
  • lc postgres - PostgreSQL shell (psql)
  • lc redis - Redis CLI access

Networking

Expose services publicly
  • lc tunnel - Create secure tunnels
  • lc tunnel list - Show active tunnels
  • lc tunnel stop - Stop tunnels

Components

Add or remove service components
  • lc component add - Add new services
  • lc component remove - Remove services
  • lc component list - Show components
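
A hedged end-to-end sketch of the component workflow: add a component, start it, then confirm it is running. "redis" here is an example name; run lc component list to see what is actually available in your install.

```shell
# Add the component to the project (example name)
lc component add redis

# Start just that service
lc start redis

# Confirm it shows up among running services
lc services
```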

Quick Reference

Essential Commands

# Initialize and setup a new project
lc setup [project-name]

# Start all services
lc start

# Check status
lc status

# View logs
lc logs

# Stop services
lc stop

Service Management

# List running services
lc services

# Start specific service
lc start postgres

# View service logs
lc logs ollama -f

# Get service info
lc info

Model Management

# List available models
lc models list

# Download a model
lc models pull llama2

# Remove a model
lc models remove llama2

Command Aliases

Many commands have shorter aliases for convenience:
Full Command        Aliases
lc services         lc svcs, lc ls
lc models           lc model, lc m
lc component        lc comp
lc models remove    lc models rm, lc models delete

Output Formats

Most commands support different output formats:
# Default human-readable output
lc status

# JSON output for scripting
lc info --json

# Detailed output
lc services --detailed
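
Because lc info --json emits machine-readable output, it can feed standard tooling in scripts. A hedged sketch: the "version" field below is an assumption about the payload shape, not something documented above.

```shell
# Extract a single field from the JSON output for use in a script.
# Assumes the payload is a JSON object with a "version" key (assumption).
lc info --json | python3 -c 'import json,sys; print(json.load(sys.stdin).get("version"))'
```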

Auto-completion

Enable tab completion for your shell:
# Add to ~/.bashrc
source <(lc completion bash)

Help System

Get help for any command:
# General help
lc --help

# Command-specific help
lc start --help

# Subcommand help
lc models pull --help

Version Information

# Check LocalCloud version
lc --version

# Detailed version info
lc version --detailed

Common Workflows

Starting a New Project

# Create and setup
mkdir my-app && cd my-app
lc setup
lc start

Daily Development

# Morning startup
lc start
lc status

# Check logs if needed
lc logs -f

# Evening shutdown
lc stop

Model Management

# See what's available
lc models list

# Get a new model
lc models pull mistral

# Clean up unused models
lc models remove unused-model
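
When setting up a machine, the pull step above can be repeated for each model you need. A small sketch; the model names are examples, not recommendations:

```shell
# Pull several models in one pass
for model in llama2 mistral; do
  lc models pull "$model"
done
```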

Error Handling

LocalCloud provides clear error messages:
# Example: Project not initialized
$ lc start
Error: no LocalCloud project found. Run 'lc setup' first

# Example: Docker not running
$ lc start
Error: Docker is not running. Please start Docker Desktop first
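
These errors also matter for scripting. A hedged sketch, assuming lc exits with a non-zero status on failure (typical for CLIs, but an assumption here):

```shell
# Wrap startup so automation fails fast with a readable message.
start_localcloud() {
  if ! lc start; then
    echo "lc start failed; is Docker running and the project set up?" >&2
    return 1
  fi
}
```

A script can then call start_localcloud and rely on its return code instead of parsing error text.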

Next Steps

Project Setup

Initialize your first project

Service Management

Learn to control services

Model Management

Work with AI models

Networking

Expose services with tunnels