## Overview

The `CacheManager` class provides a multi-tier caching system with an in-memory tier (LRU eviction) and an optional localStorage tier. It is isomorphic and works in both Node.js and browser environments.
## Types

### CacheEntryData

```typescript
interface CacheEntryData<T = unknown> {
  value: T;
  timestamp: number;
  ttl?: number;
}
```

### CacheStatistics

```typescript
interface CacheStatistics {
  hits: number;
  misses: number;
  size: number;
  hitRate: number;
}
```

### CacheManagerOptions

```typescript
interface CacheManagerOptions {
  defaultTTL?: number;          // Default TTL in milliseconds
  maxMemorySize?: number;       // Max items in memory cache (default: 100)
  enableLocalStorage?: boolean; // Enable localStorage tier
  enableIndexedDB?: boolean;    // Reserved for future use
}
```
## Constructor

```typescript
new CacheManager<T>(options?: CacheManagerOptions)
```

#### Parameters

| Name | Type | Description |
| --- | --- | --- |
| `options` | `CacheManagerOptions` | Configuration options for the cache manager (optional) |

#### Example

```typescript
import { CacheManager } from 'bytekit/utils/helpers';

// Default configuration
const cache = new CacheManager();

// Custom configuration
const customCache = new CacheManager({
  defaultTTL: 3600000,     // 1 hour
  maxMemorySize: 500,      // 500 items in memory
  enableLocalStorage: true // Enable persistent cache
});
```
## Methods

### set

Store a value in the cache.

```typescript
set(key: string, value: T, ttl?: number): void
```

#### Parameters

| Name | Type | Description |
| --- | --- | --- |
| `key` | `string` | Cache key |
| `value` | `T` | Value to store |
| `ttl` | `number` | Time-to-live in milliseconds. Uses `defaultTTL` if not specified. (optional) |

#### Example

```typescript
const cache = new CacheManager();

// Cache with default TTL
cache.set('user:123', { id: 123, name: 'Alice' });

// Cache with custom TTL (5 minutes)
cache.set('temp:data', { value: 'temporary' }, 300000);
```
### get

Retrieve a value from the cache.

```typescript
get(key: string): T | null
```

#### Parameters

| Name | Type | Description |
| --- | --- | --- |
| `key` | `string` | Cache key |

#### Returns

The cached value, or `null` if the key is not found or the entry has expired.

#### Example

```typescript
const cache = new CacheManager<User>();

const user = cache.get('user:123');
if (user) {
  console.log('Cache hit:', user.name);
} else {
  console.log('Cache miss - need to fetch from API');
}
```
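The expiry check itself is not shown in this reference, but it follows from the `CacheEntryData` shape defined above: an entry expires once `timestamp + ttl` has passed. A minimal sketch of that check (the `isExpired` helper is hypothetical, not part of the bytekit API):

```typescript
// Sketch of how expiry might be determined from a CacheEntryData-shaped
// entry: an entry with no ttl never expires; otherwise it expires once
// more than `ttl` milliseconds have elapsed since `timestamp`.
interface CacheEntryData<T = unknown> {
  value: T;
  timestamp: number; // when the entry was stored (ms since epoch)
  ttl?: number;      // time-to-live in ms; undefined means no expiry
}

function isExpired(entry: CacheEntryData, now: number = Date.now()): boolean {
  if (entry.ttl === undefined) return false; // no TTL: never expires
  return now - entry.timestamp > entry.ttl;
}
```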
### has

Check if a key exists in the cache.

```typescript
has(key: string): boolean
```

#### Example

```typescript
if (cache.has('user:123')) {
  console.log('User is cached');
}
```
### delete

Remove a key from the cache.

```typescript
delete(key: string): void
```

#### Example

```typescript
cache.delete('user:123');
```
### clear

Clear all cached data.

```typescript
clear(): void
```

#### Example

```typescript
// Clear all cache on logout
cache.clear();
```
### getStats

Get cache statistics.

```typescript
getStats(): CacheStatistics
```

#### Returns

A `CacheStatistics` object containing `hits`, `misses`, `size`, and `hitRate`.

#### Example

```typescript
const stats = cache.getStats();
console.log(`Hit rate: ${(stats.hitRate * 100).toFixed(2)}%`);
console.log(`Cache size: ${stats.size} items`);
console.log(`Hits: ${stats.hits}, Misses: ${stats.misses}`);
```
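The `hitRate` field can be understood as the ratio of hits to total lookups. A one-function sketch of the likely computation (guarding against zero lookups is an assumption about the implementation):

```typescript
// hitRate = hits / (hits + misses); defined as 0 when there have been
// no lookups yet. Mirrors the CacheStatistics shape above.
function computeHitRate(hits: number, misses: number): number {
  const total = hits + misses;
  return total === 0 ? 0 : hits / total;
}
```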
### getOrCompute

Get a cached value, or compute and cache it on a miss.

```typescript
async getOrCompute(
  key: string,
  compute: () => Promise<T>,
  ttl?: number
): Promise<T>
```

#### Parameters

| Name | Type | Description |
| --- | --- | --- |
| `key` | `string` | Cache key |
| `compute` | `() => Promise<T>` | Function to compute the value if not cached |
| `ttl` | `number` | Time-to-live in milliseconds (optional) |

#### Example

```typescript
const cache = new CacheManager<User>();

async function getUser(id: string): Promise<User> {
  return cache.getOrCompute(
    `user:${id}`,
    async () => {
      // This only runs on a cache miss
      const response = await fetch(`/api/users/${id}`);
      return response.json();
    },
    3600000 // 1 hour TTL
  );
}

// First call: fetches from API
const user1 = await getUser('123');

// Second call: returns from cache
const user2 = await getUser('123');
```
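Internally, `getOrCompute` boils down to a check-then-fill pattern. A self-contained sketch over a plain `Map` (illustrative only; the real implementation also applies TTL handling and tiered storage):

```typescript
// Get-or-compute over a plain Map: return the cached value on a hit;
// otherwise run the (possibly expensive) compute function, store its
// result, and return it. TTL handling is omitted for brevity.
const store = new Map<string, unknown>();

async function getOrComputeSketch<T>(
  key: string,
  compute: () => Promise<T>
): Promise<T> {
  if (store.has(key)) {
    return store.get(key) as T; // cache hit: skip the computation
  }
  const value = await compute(); // cache miss: do the expensive work
  store.set(key, value);
  return value;
}
```

Note that this naive version can run `compute` twice if two callers miss concurrently; deduplicating in-flight computations is a common refinement.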
### clearPattern

Clear cache entries matching a pattern.

```typescript
async clearPattern(pattern: string): Promise<void>
```

> **Current implementation:** This method currently clears the entire cache. Full pattern matching will be implemented in a future version.
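For illustration, here is one way pattern-based invalidation could eventually behave, sketched over a `Map`-backed store with a simple trailing-`*` wildcard. This is purely hypothetical and does not reflect the current behavior, which clears everything:

```typescript
// Hypothetical pattern invalidation: 'user:*' deletes every key with
// the 'user:' prefix; a pattern without '*' deletes the exact key.
function clearPatternSketch(store: Map<string, unknown>, pattern: string): void {
  const wildcard = pattern.endsWith('*');
  const prefix = wildcard ? pattern.slice(0, -1) : pattern;
  for (const key of [...store.keys()]) {
    if (wildcard ? key.startsWith(prefix) : key === pattern) {
      store.delete(key);
    }
  }
}
```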
## Factory Function

### createCacheManager

Factory function for creating cache managers.

```typescript
function createCacheManager<T = unknown>(
  options?: CacheManagerOptions
): CacheManager<T>
```

#### Example

```typescript
import { createCacheManager } from 'bytekit/utils/helpers';

const cache = createCacheManager({
  defaultTTL: 3600000,
  maxMemorySize: 200,
  enableLocalStorage: true
});
```
## Complete Example

```typescript
import { CacheManager } from 'bytekit/utils/helpers';

interface Product {
  id: string;
  name: string;
  price: number;
}

class ProductService {
  private cache = new CacheManager<Product>({
    defaultTTL: 1800000, // 30 minutes
    maxMemorySize: 1000,
    enableLocalStorage: true
  });

  async getProduct(id: string): Promise<Product> {
    return this.cache.getOrCompute(
      `product:${id}`,
      async () => {
        console.log('Fetching from API...');
        const response = await fetch(`/api/products/${id}`);
        return response.json();
      }
    );
  }

  async getProducts(ids: string[]): Promise<Product[]> {
    return Promise.all(ids.map(id => this.getProduct(id)));
  }

  invalidateProduct(id: string): void {
    this.cache.delete(`product:${id}`);
  }

  clearAllProducts(): void {
    this.cache.clear();
  }

  getPerformanceStats(): void {
    const stats = this.cache.getStats();
    console.log('Cache Performance:');
    console.log(`  Hit Rate: ${(stats.hitRate * 100).toFixed(2)}%`);
    console.log(`  Total Hits: ${stats.hits}`);
    console.log(`  Total Misses: ${stats.misses}`);
    console.log(`  Cache Size: ${stats.size} items`);
  }
}

// Usage
const productService = new ProductService();

// First call: API request
const product1 = await productService.getProduct('123');

// Second call: cache hit
const product2 = await productService.getProduct('123');

// Check performance
productService.getPerformanceStats();
// Output: Hit Rate: 50.00%
```
## Multi-Tier Caching

The cache manager uses a multi-tier approach:

- **Memory Cache (L1):** Fast in-memory cache with LRU eviction
- **LocalStorage Cache (L2):** Persistent cache that survives page reloads

```typescript
const cache = new CacheManager({
  maxMemorySize: 100,      // L1: Keep 100 items in memory
  enableLocalStorage: true // L2: Enable persistent cache
});

// On get():
// 1. Check memory cache (L1)
// 2. If miss, check localStorage (L2)
// 3. If found in L2, restore to L1
// 4. If not found anywhere, return null
cache.set('key', 'value');
const value = cache.get('key'); // Checks L1 first, then L2
```
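The L1 tier's LRU eviction can be sketched in a few lines using the fact that a JavaScript `Map` iterates keys in insertion order. The class below is illustrative, not bytekit's actual internals:

```typescript
// Minimal LRU cache: on each get, the key is re-inserted to mark it
// most-recently-used; on set, the oldest key (first in iteration
// order) is evicted once the size limit is exceeded.
class LruMemoryCache<T> {
  private entries = new Map<string, T>();

  constructor(private maxSize: number = 100) {}

  get(key: string): T | undefined {
    if (!this.entries.has(key)) return undefined;
    const value = this.entries.get(key)!;
    this.entries.delete(key);     // re-insert to mark as
    this.entries.set(key, value); // most recently used
    return value;
  }

  set(key: string, value: T): void {
    this.entries.delete(key); // refresh position if key already exists
    this.entries.set(key, value);
    if (this.entries.size > this.maxSize) {
      // first key in iteration order is the least recently used
      const oldest = this.entries.keys().next().value!;
      this.entries.delete(oldest);
    }
  }
}
```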
## Best Practices

**TTL strategy:** Use different TTLs based on data volatility:

- Frequently changing: 5-15 minutes
- Moderately changing: 30-60 minutes
- Rarely changing: 2-24 hours
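These tiers can be captured as named constants so call sites stay readable (the names and values below are illustrative, not part of the bytekit API):

```typescript
// Illustrative TTL constants for the volatility tiers above,
// in milliseconds. Pick values within each suggested range.
const TTL = {
  VOLATILE: 5 * 60 * 1000,     // frequently changing: 5 minutes
  MODERATE: 30 * 60 * 1000,    // moderately changing: 30 minutes
  STABLE: 12 * 60 * 60 * 1000, // rarely changing: 12 hours
} as const;

// e.g. cache.set('quote:EURUSD', quote, TTL.VOLATILE);
```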
**Memory limits:** Set an appropriate `maxMemorySize` based on your application's memory constraints. The default is 100 items.

**Performance monitoring:** Regularly check `getStats()` to monitor cache effectiveness. A hit rate above 80% indicates good cache utilization.