Cache Service (Redis)
LocalCloud includes Redis configured as a caching layer to dramatically improve application performance by storing frequently accessed data in memory.

What is Redis Cache?
Redis is an in-memory data structure store used as a cache in LocalCloud. It provides:
- Sub-millisecond latency - Extremely fast data access
- Multiple data types - Strings, hashes, lists, sets, sorted sets
- Atomic operations - Single commands execute atomically, safe under concurrent access
- TTL support - Automatic expiration of cached data
Connection Details
When the cache service is running:
- Host: localhost
- Port: 6379
- Database: 0 (default)
- Password: None (local development)
Connection String
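With the defaults listed under Connection Details (localhost, port 6379, database 0, no password), the standard Redis connection URI would be:

```
redis://localhost:6379/0
```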
Getting Started
Start Cache Service
Connect to Redis
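Assuming a standard Redis setup, `redis-cli` (which ships with Redis) can connect directly to the service:

```
# Open an interactive session against the local instance
redis-cli -h localhost -p 6379

# Or run a one-off health check; prints PONG when the service is up
redis-cli -h localhost -p 6379 ping
```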
Common Cache Patterns
Basic Key-Value Caching
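A minimal sketch of key-value caching with a TTL. To keep the example runnable without a server, a tiny in-memory stand-in replaces the Redis client; the real redis-py calls (`setex`, `get`) are noted in the comments.

```python
import time

# In-memory stand-in for a Redis client (get/setex only). With redis-py you
# would instead use:
#   import redis
#   cache = redis.Redis(host="localhost", port=6379, db=0, decode_responses=True)
class FakeRedis:
    def __init__(self):
        self._data = {}  # key -> (value, expires_at)

    def setex(self, key, ttl, value):
        # SETEX: store the value and expire it after ttl seconds
        self._data[key] = (value, time.monotonic() + ttl)

    def get(self, key):
        item = self._data.get(key)
        if item is None:
            return None
        value, expires_at = item
        if time.monotonic() >= expires_at:
            del self._data[key]  # lazy expiration, as Redis does
            return None
        return value

cache = FakeRedis()
cache.setex("user:42:name", 300, "Ada")   # cache for 5 minutes
print(cache.get("user:42:name"))          # Ada
print(cache.get("user:99:name"))          # None (cache miss)
```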
Session Storage
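A session-storage sketch: serialize the session as JSON under a `session:<token>` key. A plain dict stands in for Redis so the snippet runs standalone; with redis-py, `setex` would also handle the TTL-based expiry automatically.

```python
import json
import secrets

# Stand-in for the Redis client; with redis-py this would be
# r.setex(key, SESSION_TTL, payload) and r.get(key).
store = {}

SESSION_TTL = 3600  # seconds; with Redis, SETEX expires the session for you

def create_session(user_id):
    token = secrets.token_hex(16)
    payload = json.dumps({"user_id": user_id})
    store[f"session:{token}"] = payload  # redis: r.setex(f"session:{token}", SESSION_TTL, payload)
    return token

def load_session(token):
    payload = store.get(f"session:{token}")
    return json.loads(payload) if payload else None

tok = create_session(42)
print(load_session(tok))  # {'user_id': 42}
```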
Counters and Rate Limiting
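A fixed-window rate limiter sketch. With Redis this pattern maps to `INCR` plus `EXPIRE` (each command is atomic); here a plain dict stands in so the example runs without a server.

```python
import time

counters = {}  # key -> [count, window_expires_at]

def allow_request(user_id, limit=5, window=60):
    """Allow up to `limit` requests per `window` seconds per user."""
    key = f"rate:{user_id}"
    now = time.monotonic()
    entry = counters.get(key)
    if entry is None or now >= entry[1]:
        # New window: redis equivalent is INCR key; EXPIRE key window
        counters[key] = [1, now + window]
        return True
    entry[0] += 1  # redis: INCR key
    return entry[0] <= limit

print([allow_request("u1", limit=3) for _ in range(5)])
# [True, True, True, False, False]
```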
Lists for Recent Items
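The "recent items" pattern: `LPUSH` adds to the head of a list and `LTRIM` keeps it bounded. A `deque` with `maxlen` mimics that behavior locally so the sketch runs without Redis.

```python
from collections import deque

MAX_RECENT = 5
# redis equivalent: LPUSH recent:views item; LTRIM recent:views 0 4
recent = deque(maxlen=MAX_RECENT)

def record_view(item):
    recent.appendleft(item)  # newest first; oldest is silently dropped

for page in ["a", "b", "c", "d", "e", "f"]:
    record_view(page)
print(list(recent))  # ['f', 'e', 'd', 'c', 'b']
```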
Application Integration
Python with redis-py
Node.js with ioredis
Go with go-redis
Cache Strategies
Cache-Aside (Lazy Loading)
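Cache-aside (lazy loading): try the cache first; on a miss, load from the database and populate the cache so the next read is fast. Dict stubs stand in for Redis and the database in this sketch.

```python
cache = {}
database = {"user:1": "Ada"}  # stand-in for the primary store

def get_user(key):
    value = cache.get(key)      # 1. check the cache (redis GET)
    if value is not None:
        return value            # cache hit
    value = database.get(key)   # 2. cache miss: read the primary store
    if value is not None:
        cache[key] = value      # 3. populate the cache (redis SETEX with a TTL)
    return value

print(get_user("user:1"))   # Ada (miss, loaded from DB)
print("user:1" in cache)    # True (now cached)
```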
Write-Through Cache
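Write-through: every write goes to the database and the cache in the same operation, so reads never see stale data (at the cost of writing items that may never be read). Dict stubs stand in for both stores.

```python
cache = {}
database = {}

def save_user(key, value):
    database[key] = value  # 1. write the primary store
    cache[key] = value     # 2. update the cache immediately (redis SET/SETEX)

save_user("user:2", "Grace")
print(cache["user:2"], database["user:2"])  # Grace Grace
```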
Cache Invalidation
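Invalidation: when the underlying record changes, delete the cached copy so the next read repopulates it (the Redis `DEL` command). Dict stubs stand in for Redis and the database.

```python
cache = {"user:3": "old name"}
database = {"user:3": "old name"}

def update_user(key, value):
    database[key] = value
    cache.pop(key, None)  # redis: r.delete(key)

update_user("user:3", "new name")
print(cache.get("user:3"))  # None -- the next read will repopulate it
```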
Monitoring and Debugging
Check Cache Stats
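These are standard Redis commands, run through `redis-cli`:

```
# Memory usage and hit/miss counters
redis-cli INFO memory
redis-cli INFO stats

# Number of keys in the current database
redis-cli DBSIZE

# Watch commands in real time (development only -- adds overhead)
redis-cli MONITOR
```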
Using LocalCloud CLI
Configuration
Configure the cache in .localcloud/config.yaml:
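The actual LocalCloud config schema is not shown here; the following is a purely hypothetical sketch of what a Redis cache entry might look like. All key names are assumptions, not the real schema, so check your LocalCloud version's documentation.

```yaml
# HYPOTHETICAL example -- key names are illustrative, not LocalCloud's schema
services:
  cache:
    type: redis
    port: 6379
    maxmemory: 256mb
    maxmemory_policy: allkeys-lru
```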
Performance Tips
Memory Optimization
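Redis's built-in eviction settings are the usual lever here. These are standard Redis configuration commands (they can also be set in redis.conf):

```
# Cap memory and evict least-recently-used keys when the cap is reached
redis-cli CONFIG SET maxmemory 256mb
redis-cli CONFIG SET maxmemory-policy allkeys-lru

# Verify the current policy
redis-cli CONFIG GET maxmemory-policy
```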
Connection Pooling
Common Use Cases
Web Application Caching
- Page fragments - Cache rendered HTML components
- API responses - Cache database query results
- User sessions - Store authentication state
- Shopping carts - Temporary user data
AI Application Caching
- Model outputs - Cache LLM responses for repeated queries
- Embeddings - Cache vector embeddings for documents
- Feature vectors - Cache computed features
- Rate limiting - Control API usage per user
Real-time Applications
- Leaderboards - Sorted sets for rankings
- Chat messages - Lists for message history
- Notifications - Queues for real-time updates
- Analytics - Counters for metrics
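The leaderboard item above maps to Redis sorted sets (`ZADD` to record scores, `ZREVRANGE ... WITHSCORES` to read rankings). A dict plus `sorted()` mimics that locally so the sketch runs without a server.

```python
scores = {}

def zadd(member, score):
    scores[member] = score  # redis: ZADD leaderboard score member

def top(n):
    # redis: ZREVRANGE leaderboard 0 n-1 WITHSCORES
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:n]

zadd("ada", 320)
zadd("grace", 450)
zadd("alan", 275)
print(top(2))  # [('grace', 450), ('ada', 320)]
```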
Troubleshooting
Memory Issues
Connection Issues
Related Documentation
- Queue Service - Redis-based job queues
- CLI Redis Commands - Command-line Redis operations
- Database - Primary data storage
Best Practices
- Set appropriate TTLs - Prevent cache from growing indefinitely
- Use consistent naming - Organize keys with prefixes
- Monitor memory usage - Avoid running out of memory
- Handle cache misses gracefully - Always have fallback to database
- Consider cache warming - Pre-load frequently accessed data
- Use transactions when needed - MULTI/EXEC for atomic operations
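The MULTI/EXEC practice above looks like this in an interactive redis-cli session: commands issued after MULTI are queued, then executed atomically by EXEC.

```
127.0.0.1:6379> MULTI
OK
127.0.0.1:6379> INCR page:views
QUEUED
127.0.0.1:6379> EXPIRE page:views 60
QUEUED
127.0.0.1:6379> EXEC
1) (integer) 1
2) (integer) 1
```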