# Caching

The Ellemment Stack includes robust caching utilities and a management dashboard for viewing and managing your cache. The system implements two distinct caching mechanisms:
## Cache Types

### SQLite Cache
- Separate from the main application database
- Managed by LiteFS for cross-instance replication
- Ideal for long-lived cached values
- Persists between application restarts
- Automatically replicated to all application instances
### LRU (Least Recently Used) Cache
- In-memory cache implementation
- Perfect for expensive query results
- Helps deduplicate frequent data requests
- Not replicated across instances
- Cleared on application restart
- Best for short-lived cached values
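The "least recently used" eviction policy behind the in-memory cache can be sketched in a few lines. This is a self-contained illustration of the concept, not the stack's actual implementation (which relies on a dedicated LRU library):

```ts
// Minimal LRU sketch: a Map preserves insertion order, so the first
// key is always the least recently used one.
class SimpleLRU<V> {
	private store = new Map<string, V>()
	constructor(private max: number) {}

	get(key: string): V | undefined {
		const value = this.store.get(key)
		if (value === undefined) return undefined
		// Re-insert to mark this key as most recently used.
		this.store.delete(key)
		this.store.set(key, value)
		return value
	}

	set(key: string, value: V) {
		if (this.store.has(key)) this.store.delete(key)
		else if (this.store.size >= this.max) {
			// Evict the least recently used entry (the Map's first key).
			this.store.delete(this.store.keys().next().value!)
		}
		this.store.set(key, value)
	}
}

const lru = new SimpleLRU<number>(2)
lru.set('a', 1)
lru.set('b', 2)
lru.get('a') // touch 'a', so 'b' becomes least recently used
lru.set('c', 3) // evicts 'b'
```

Because entries live only in a JavaScript `Map`, everything here disappears on restart and is never shared across instances, which is exactly the trade-off described above.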
## When to Use Caching
Caching is most effective for:
- Expensive computations
- Slow data retrieval operations
- Rate-limited third-party API calls
- Frequently accessed, rarely changed data
**Important:** Before implementing caching, consider other optimization strategies. For slow database queries, first try proper indexing and query tuning.
## Cache Implementation

The stack provides a high-level caching abstraction using cachified, eliminating the need to interact directly with cache stores. This abstraction integrates seamlessly with the server timing utilities.

### Example Implementation

Here's a practical example of implementing caching for an external API call:
```ts
import { z } from 'zod'
import { cachified, cache } from '#app/utils/cache.server.ts'
import { type Timings } from '#app/utils/timing.server.ts'

const eventSchema = z.object({
	// Your schema definition here
})

export async function getScheduledEvents({
	timings,
}: {
	timings?: Timings
} = {}) {
	const scheduledEvents = await cachified({
		key: 'api:scheduled-events',
		cache,
		timings,
		getFreshValue: () => {
			// Your API fetch logic here
			return [
				// API response data
			]
		},
		checkValue: eventSchema.array(),
		// Cache valid for 24 hours
		ttl: 1000 * 60 * 60 * 24,
		// Allow stale cache use for up to 30 days while refreshing
		staleWhileRevalidate: 1000 * 60 * 60 * 24 * 30,
	})
	return scheduledEvents
}
```
## Caching Strategy

The implemented caching strategy includes:

### Time-To-Live (TTL)
- Defines how long cached data remains valid
- Configurable per cache entry
- Balances data freshness with performance
### Stale-While-Revalidate (SWR)
- Returns stale data while fetching fresh data
- Prevents user wait times
- Configurable grace period
- Perfect for non-critical data updates
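The interaction between TTL and SWR can be modeled as a small decision function. This is an illustrative sketch of the behavior described above, not cachified's internal code:

```ts
type CacheDecision = 'fresh' | 'stale-while-revalidate' | 'miss'

// Given how long ago an entry was written, decide how a TTL + SWR
// cache treats it (an illustrative model, not cachified's internals).
function classifyEntry(
	ageMs: number,
	ttlMs: number,
	swrMs: number,
): CacheDecision {
	// Within the TTL: serve the cached value directly.
	if (ageMs <= ttlMs) return 'fresh'
	// Past the TTL but within the SWR window: serve the stale value
	// immediately and refresh it in the background.
	if (ageMs <= ttlMs + swrMs) return 'stale-while-revalidate'
	// Past both windows: the caller must wait for a fresh value.
	return 'miss'
}

const DAY = 1000 * 60 * 60 * 24
// With the example's settings (ttl: 1 day, swr: 30 days), a 2-day-old
// entry is served stale while a refresh happens in the background.
console.log(classifyEntry(2 * DAY, DAY, 30 * DAY)) // 'stale-while-revalidate'
```

With the example configuration above, users only ever wait for a fetch when an entry is more than 31 days old.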
## Cache Management
The included dashboard provides:
- Real-time cache inspection
- Manual cache clearing
- Cache statistics and metrics
- Individual entry management
## Cache Considerations
When implementing caching, consider:
- **Data Freshness Requirements**
  - How stale can the data be?
  - What's the update frequency?
  - Are there business requirements around data freshness?
- **Storage Requirements**
  - Expected cache size
  - Memory constraints
  - Disk space availability
- **Replication Needs**
  - Multi-instance setup requirements
  - Data consistency requirements
  - Network bandwidth considerations
- **Performance Impact**
  - Cache hit ratios
  - Lookup performance
  - Memory vs. disk trade-offs
## Best Practices

- **Cache Keys**
  - Use consistent naming conventions
  - Include version numbers if needed
  - Consider data dependencies
- **Cache Duration**
  - Set appropriate TTL values
  - Use SWR for better user experience
  - Consider data update patterns
- **Error Handling**
  - Handle cache misses gracefully
  - Provide fallback mechanisms
  - Log cache-related errors
- **Monitoring**
  - Track cache hit/miss ratios
  - Monitor cache size
  - Set up alerts for cache issues
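The cache-key conventions above can be captured in a small helper. The prefix scheme and version tag here are illustrative assumptions, not utilities shipped by the stack:

```ts
// Build a namespaced cache key of the form "<prefix>:v<version>:<parts>".
// Bumping `version` effectively invalidates old entries without deleting
// them: lookups simply stop matching the old keys.
function cacheKey(
	prefix: string,
	parts: Array<string | number>,
	version = 1,
): string {
	return [prefix, `v${version}`, ...parts].join(':')
}

console.log(cacheKey('api:scheduled-events', ['user', 42]))
// 'api:scheduled-events:v1:user:42'
```

Keeping the key format centralized in one helper like this makes the naming convention consistent and lets you see data dependencies at the call site.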
## Performance Optimization
To get the best performance from the caching system:
- **Choose the right cache type:**
  - SQLite for persistent, shared data
  - LRU for temporary, instance-specific data
- **Optimize cache keys:**
  - Keep keys concise
  - Use meaningful prefixes
  - Consider key cardinality
- **Monitor and tune:**
  - Watch cache hit rates
  - Adjust TTL values based on usage
  - Clean up unused cache entries
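For the "monitor and tune" step, a hit-rate counter like the sketch below (a hypothetical helper, not a stack utility) is often enough to tell whether a TTL needs adjusting:

```ts
// Track cache hits and misses to compute a hit ratio over time.
// A persistently low ratio suggests the TTL is too short, or the
// keys have too high a cardinality to ever be reused.
class CacheStats {
	private hits = 0
	private misses = 0

	record(hit: boolean) {
		if (hit) this.hits++
		else this.misses++
	}

	hitRatio(): number {
		const total = this.hits + this.misses
		return total === 0 ? 0 : this.hits / total
	}
}

const stats = new CacheStats()
for (const hit of [true, true, true, false]) stats.record(hit)
console.log(stats.hitRatio()) // 0.75
```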
For more information on performance monitoring, check out our monitoring documentation.
## Cache Utilities
The stack provides several utilities for working with the cache:
- **Cache Management API**
  - Clear cache entries
  - View cache status
  - Manage cache settings
- **Cache Debugging Tools**
  - Monitor cache performance
  - Track cache usage
  - Debug cache issues
- **Cache Integration Helpers**
  - Easy integration with external APIs
  - Built-in timing support
  - Type-safe cache operations
For detailed deployment information, refer to the deployment documentation.