A standards-compliant HTTP cache implementation for server-side applications.
SharedCache is an HTTP caching library that follows Web Standards and HTTP specifications. It implements a cache interface similar to the Web Cache API but optimized for server-side shared caching scenarios.
Key features include:

- `stale-if-error` and `stale-while-revalidate` support
- `Vary` headers, proxy revalidation, and authenticated responses
- A `fetch` API with caching capabilities while maintaining full compatibility
- `s-maxage` prioritized over `max-age` for shared cache performance

While the Web `fetch` API has become ubiquitous in server-side JavaScript, existing browser Cache APIs are designed for single-user scenarios. Server-side applications need shared caches that serve multiple users efficiently.
SharedCache provides:
- A standards-based `caches` API for runtimes where the native `caches` API is not available
- Pluggable storage backends, such as `lru-cache`, that you wire in directly

```ts
// Cache API responses to reduce backend load
const apiFetch = createFetch(cache, {
  defaults: { cacheControlOverride: 's-maxage=300' },
});

const userData = await apiFetch('/api/user/profile');
// First: 200ms, subsequent: 2ms
```
```ts
// Cache rendered pages using HTTP cache control directives
export const handler = {
  async GET(ctx) {
    const response = await ctx.render();

    // Set cache control headers for shared cache optimization
    response.headers.set(
      'cache-control',
      's-maxage=60, ' + // Cache for 60 seconds in shared caches
        'stale-if-error=604800, ' + // Serve stale content for 7 days on errors
        'stale-while-revalidate=604800' // Background revalidation for 7 days
    );

    return response;
  },
};
```
Integration Requirements: This pattern requires web framework integration with SharedCache middleware or custom cache implementation in your SSR pipeline.
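As a rough illustration (not an official adapter), here is a minimal middleware-style sketch. It assumes a framework that exposes a `(request, next)` hook and a global `caches` storage configured as shown later in this README; the cache name and hook signature are hypothetical:

```ts
import { createFetch } from '@web-widget/shared-cache';

// Hypothetical SSR middleware: routes page rendering through the shared cache
// so the cache-control headers set by the page handler take effect.
export async function cacheMiddleware(
  request: Request,
  next: (request: Request) => Promise<Response>
): Promise<Response> {
  const cache = await caches.open('ssr-pages'); // assumes a configured global `caches`
  const cachedFetch = createFetch(cache, {
    // Use the framework's renderer as the "origin" fetch.
    fetch: (input, init) => next(new Request(input, init)),
  });
  return cachedFetch(request.url, {
    method: request.method,
    headers: request.headers,
  });
}
```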
Cross-Runtime Applications

```ts
// Same code works in Node.js, Deno, Bun, and Edge Runtime
const fetch = createFetch(cache);
// Deploy anywhere without code changes
```
```ts
// Redis backend for multi-instance applications
const caches = new CacheStorage(createRedisStorage());
const cache = await caches.open('distributed-cache');
```
```bash
# Using npm
npm install @web-widget/shared-cache

# Using yarn
yarn add @web-widget/shared-cache

# Using pnpm
pnpm add @web-widget/shared-cache
```
Here's a simple example to get you started with SharedCache:
```ts
import {
  CacheStorage,
  createFetch,
  type KVStorage,
} from '@web-widget/shared-cache';
import { LRUCache } from 'lru-cache';

// Create a storage backend using LRU cache
const createLRUCache = (): KVStorage => {
  const store = new LRUCache<string, any>({ max: 1024 });
  return {
    async get(cacheKey: string) {
      return store.get(cacheKey);
    },
    async set(cacheKey: string, value: any, ttl?: number) {
      store.set(cacheKey, value, { ttl });
    },
    async delete(cacheKey: string) {
      return store.delete(cacheKey);
    },
  };
};

// Initialize cache storage
const caches = new CacheStorage(createLRUCache());

async function example() {
  const cache = await caches.open('api-cache-v1');

  // Create fetch with default configuration
  const fetch = createFetch(cache, {
    defaults: {
      cacheControlOverride: 's-maxage=300', // 5 minutes default caching
      ignoreRequestCacheControl: true,
    },
  });

  // First request - will hit the network
  console.time('First request');
  const response1 = await fetch(
    'https://httpbin.org/response-headers?cache-control=max-age%3D604800'
  );
  console.timeEnd('First request'); // ~400ms

  // Second request - served from cache
  console.time('Cached request');
  const response2 = await fetch(
    'https://httpbin.org/response-headers?cache-control=max-age%3D604800'
  );
  console.timeEnd('Cached request'); // ~2ms

  // Check cache status
  console.log('Cache status:', response2.headers.get('x-cache-status')); // "HIT"
}

example();
```
This package exports a comprehensive set of APIs for HTTP caching functionality:
```ts
import {
  createFetch, // Main fetch function with caching
  Cache, // SharedCache class
  CacheStorage, // SharedCacheStorage class
} from '@web-widget/shared-cache';

const cache = await caches.open('api-cache-v1');
const fetch = createFetch(cache, {
  defaults: {
    cacheControlOverride: 's-maxage=300',
    ignoreRequestCacheControl: true,
  },
});
```
```ts
import { createFetch } from '@web-widget/shared-cache';

const cache = await caches.open('api-cache-v1');
const fetch = createFetch(cache, {
  defaults: {
    cacheControlOverride: 's-maxage=300', // 5 minutes default
  },
});

// Simple usage - automatic caching
const userData = await fetch('/api/user/profile');
const sameData = await fetch('/api/user/profile'); // Served from cache
```
```ts
import Redis from 'ioredis';
import {
  CacheStorage,
  createFetch,
  type KVStorage,
} from '@web-widget/shared-cache';

const createRedisStorage = (): KVStorage => {
  const redis = new Redis(process.env.REDIS_URL);
  return {
    async get(key: string) {
      const value = await redis.get(key);
      return value ? JSON.parse(value) : undefined;
    },
    async set(key: string, value: any, ttl?: number) {
      const serialized = JSON.stringify(value);
      if (ttl) {
        await redis.setex(key, Math.ceil(ttl / 1000), serialized);
      } else {
        await redis.set(key, serialized);
      }
    },
    async delete(key: string) {
      return (await redis.del(key)) > 0;
    },
  };
};

const caches = new CacheStorage(createRedisStorage());
const cache = await caches.open('distributed-cache');

const fetch = createFetch(cache, {
  defaults: {
    cacheControlOverride: 's-maxage=600',
    cacheKeyRules: {
      header: { include: ['x-tenant-id'] }, // Multi-tenant support
    },
  },
});
```
```ts
const deviceAwareFetch = createFetch(await caches.open('content-cache'), {
  defaults: {
    cacheControlOverride: 's-maxage=600',
    cacheKeyRules: {
      device: true, // Separate cache for mobile/desktop/tablet
      search: { exclude: ['timestamp'] },
    },
  },
});

const response = await deviceAwareFetch('/api/content');
```
```ts
const advancedFetch = createFetch(await caches.open('advanced-cache'), {
  defaults: {
    cacheControlOverride: 's-maxage=300, stale-while-revalidate=3600',
    cacheKeyRules: {
      search: { exclude: ['timestamp', '_'] },
      header: { include: ['x-api-version'] },
      cookie: { include: ['session_id'] },
      device: true,
    },
  },
});
```
```ts
import crypto from 'crypto';

const createEncryptedStorage = (
  baseStorage: KVStorage,
  key: string
): KVStorage => {
  // Note: crypto.createCipher is deprecated and removed in newer Node.js,
  // so derive a key and use createCipheriv instead. The static IV keeps the
  // example short; use a random IV per value in production.
  const algorithm = 'aes-192-cbc';
  const derivedKey = crypto.scryptSync(key, 'shared-cache-salt', 24);
  const iv = Buffer.alloc(16, 0);

  const encrypt = (text: string) => {
    const cipher = crypto.createCipheriv(algorithm, derivedKey, iv);
    return cipher.update(text, 'utf8', 'hex') + cipher.final('hex');
  };

  const decrypt = (text: string) => {
    const decipher = crypto.createDecipheriv(algorithm, derivedKey, iv);
    return decipher.update(text, 'hex', 'utf8') + decipher.final('utf8');
  };

  return {
    async get(cacheKey: string) {
      const encrypted = await baseStorage.get(cacheKey);
      return encrypted ? JSON.parse(decrypt(encrypted as string)) : undefined;
    },
    async set(cacheKey: string, value: unknown, ttl?: number) {
      const encrypted = encrypt(JSON.stringify(value));
      return baseStorage.set(cacheKey, encrypted, ttl);
    },
    async delete(cacheKey: string) {
      return baseStorage.delete(cacheKey);
    },
  };
};

const secureStorage = createEncryptedStorage(baseStorage, 'my-secret-key');
const caches = new CacheStorage(secureStorage);
```
```ts
const tenantFetch = createFetch(await caches.open('tenant-cache'), {
  defaults: {
    cacheControlOverride: 's-maxage=300',
    cacheKeyRules: {
      header: { include: ['x-tenant-id'] },
      search: true,
    },
  },
});

// Each tenant gets isolated cache
const response = await tenantFetch('/api/data', {
  headers: { 'x-tenant-id': 'tenant-123' },
});
```

Custom Fetch with Authentication
```ts
// Production-ready example with automatic token refresh
const createAuthenticatedFetch = (getToken) => {
  return async (input, init) => {
    const token = await getToken();
    const headers = new Headers(init?.headers);
    headers.set('Authorization', `Bearer ${token}`);

    const response = await globalThis.fetch(input, {
      ...init,
      headers,
    });

    // Handle token expiration
    if (response.status === 401) {
      // Token might be expired, retry once with fresh token
      const freshToken = await getToken(true); // force refresh
      headers.set('Authorization', `Bearer ${freshToken}`);
      return globalThis.fetch(input, {
        ...init,
        headers,
      });
    }

    return response;
  };
};

const authFetch = createFetch(await caches.open('authenticated-api'), {
  fetch: createAuthenticatedFetch(() => getApiToken()),
  defaults: {
    cacheControlOverride:
      'public, ' + // Required: Allow caching of authenticated requests
      's-maxage=300',
    cacheKeyRules: {
      header: { include: ['authorization'] }, // Cache per token
    },
  },
});

const userData = await authFetch('/api/user/profile');
```

Setting up Global Cache Storage
For applications that need a global cache instance, you can set up the `caches` object:
```ts
import { CacheStorage, type KVStorage } from '@web-widget/shared-cache';
import { LRUCache } from 'lru-cache';

// Extend global types for TypeScript support
declare global {
  interface WindowOrWorkerGlobalScope {
    caches: CacheStorage;
  }
}

const createLRUCache = (): KVStorage => {
  const store = new LRUCache<string, any>({
    max: 1024,
    ttl: 1000 * 60 * 60, // 1 hour default TTL
  });
  return {
    async get(cacheKey: string) {
      return store.get(cacheKey);
    },
    async set(cacheKey: string, value: any, ttl?: number) {
      store.set(cacheKey, value, { ttl });
    },
    async delete(cacheKey: string) {
      return store.delete(cacheKey);
    },
  };
};

// Set up global cache storage
const caches = new CacheStorage(createLRUCache());
globalThis.caches = caches;
```
Once the global `caches` is configured, you can also register a globally cached `fetch`:
```ts
import { createFetch } from '@web-widget/shared-cache';

// Replace global fetch with cached version
globalThis.fetch = createFetch(await caches.open('default'), {
  defaults: {
    cacheControlOverride: 's-maxage=60', // 1 minute default for global fetch
  },
});
```

🎛️ Advanced Configuration

Enhanced Fetch API with Defaults
The `createFetch` API allows you to set default cache configuration:
```ts
import { createFetch } from '@web-widget/shared-cache';

const cache = await caches.open('api-cache');

// Create fetch with comprehensive defaults
const fetch = createFetch(cache, {
  defaults: {
    cacheControlOverride: 's-maxage=300',
    cacheKeyRules: {
      header: { include: ['x-api-version'] },
    },
    ignoreRequestCacheControl: true,
    ignoreVary: false,
  },
});

// Use with defaults applied automatically
const response1 = await fetch('/api/data');

// Override defaults for specific requests
const response2 = await fetch('/api/data', {
  sharedCache: {
    cacheControlOverride: 's-maxage=600', // Override default
  },
});
```

Custom Fetch Configuration
The `createFetch` function accepts a custom fetch implementation, allowing you to integrate with existing HTTP clients or add cross-cutting concerns:
```ts
// Example: Integration with axios
import axios from 'axios';

const axiosFetch = async (input, init) => {
  const response = await axios({
    url: input.toString(),
    method: init?.method || 'GET',
    headers: init?.headers,
    data: init?.body,
    validateStatus: () => true, // Don't throw on 4xx/5xx
  });

  return new Response(response.data, {
    status: response.status,
    statusText: response.statusText,
    headers: response.headers,
  });
};

const fetch = createFetch(await caches.open('axios-cache'), {
  fetch: axiosFetch,
  defaults: {
    cacheControlOverride: 's-maxage=300',
  },
});

// Example: Custom fetch with request/response transformation
const transformFetch = async (input, init) => {
  // Transform request
  const url = new URL(input);
  url.searchParams.set('timestamp', Date.now().toString());

  const response = await globalThis.fetch(url, init);

  // Transform response
  if (response.headers.get('content-type')?.includes('application/json')) {
    const data = await response.json();
    const transformedData = {
      ...data,
      fetchedAt: new Date().toISOString(),
    };

    return new Response(JSON.stringify(transformedData), {
      status: response.status,
      statusText: response.statusText,
      headers: response.headers,
    });
  }

  return response;
};

const transformedFetch = createFetch(await caches.open('transform-cache'), {
  fetch: transformFetch,
  defaults: {
    cacheControlOverride: 's-maxage=300',
  },
});
```
SharedCache extends the standard fetch API with caching options via the `sharedCache` parameter:
```ts
const cache = await caches.open('api-cache');
const fetch = createFetch(cache);

const response = await fetch('https://api.example.com/data', {
  // Standard fetch options
  method: 'GET',
  headers: {
    'x-user-id': '1024',
  },

  // SharedCache-specific options
  sharedCache: {
    cacheControlOverride: 's-maxage=120',
    varyOverride: 'accept-language',
    ignoreRequestCacheControl: true,
    ignoreVary: false,
    cacheKeyRules: {
      search: false,
      device: true,
      header: {
        include: ['x-user-id'],
      },
    },
  },
});
```
Override or extend cache control directives when APIs don't provide optimal caching headers:
```ts
// Add shared cache directive
sharedCache: {
  cacheControlOverride: 's-maxage=3600',
}

// Combine multiple directives
sharedCache: {
  cacheControlOverride: 's-maxage=3600, must-revalidate',
}
```
Add additional Vary headers to ensure proper cache segmentation:
```ts
sharedCache: {
  varyOverride: 'accept-language, user-agent',
}
```
ignoreRequestCacheControl
Control whether to honor cache-control directives from the request:
```ts
// Default is true: client cache-control request headers are ignored.
// Set to false to honor them.
sharedCache: {
  ignoreRequestCacheControl: false,
}
```
Disable Vary header processing for simplified caching:
```ts
sharedCache: {
  ignoreVary: true, // Cache regardless of Vary headers
}
```
⚠️ Performance Warning: By default, SharedCache processes Vary headers, which requires two KV storage queries per cache lookup. If you're using a slow KV storage (like remote Redis), this can significantly impact performance. Consider setting `ignoreVary: true` to disable Vary processing and use only one query per lookup.
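If the backend is slow, the same flag can be set once via `defaults` so it applies to every request; a short sketch:

```ts
// Disable Vary processing for all requests made through this fetch
const fetchWithoutVary = createFetch(cache, {
  defaults: {
    cacheControlOverride: 's-maxage=300',
    ignoreVary: true, // single storage query per lookup
  },
});
```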
Customize how cache keys are generated to optimize cache hit rates and handle different caching scenarios:
```ts
sharedCache: {
  cacheKeyRules: {
    // URL components
    search: true, // Include query parameters (default)

    // Request context
    device: false, // Classify by device type
    cookie: {
      // Include specific cookies
      include: ['session_id', 'user_pref'],
    },
    header: {
      // Include specific headers
      include: ['x-api-key'],
      checkPresence: ['x-mobile-app'],
    },
  },
}
```
Default cache key rules:

- `search`: Control query parameter inclusion

Query Parameter Control:
```ts
// Include all query parameters (default)
search: true

// Exclude all query parameters
search: false

// Include specific parameters
search: { include: ['category', 'page'] }

// Include all except specific parameters
search: { exclude: ['timestamp', 'nonce'] }
```
Automatically classify requests as `mobile`, `desktop`, or `tablet` based on User-Agent:
```ts
cacheKeyRules: {
  device: true, // Separate cache for different device types
}
```
Include specific cookies in the cache key:
```ts
cacheKeyRules: {
  cookie: {
    include: ['user_id', 'session_token'],
    checkPresence: ['is_premium'], // Check existence without value
  },
}
```
Include request headers in the cache key:
```ts
cacheKeyRules: {
  header: {
    include: ['x-api-version'],
    checkPresence: ['x-feature-flag'],
  },
}
```
Restricted Headers: For security and performance, certain headers cannot be included:

- `accept`, `accept-charset`, `accept-encoding`, `accept-language`, `user-agent`, `referer`
- `cache-control`, `if-*`, `range`, `connection`
- `authorization`, `cookie` (handled separately by cookie rules)
- `host`
SharedCache provides comprehensive monitoring through the `x-cache-status` header for debugging and performance analysis.
| Status | Description | When It Occurs |
| --- | --- | --- |
| `HIT` | Response served from cache | The requested resource was found in cache and is still fresh |
| `MISS` | Response fetched from origin | The requested resource was not found in cache |
| `EXPIRED` | Cached response expired, fresh response fetched | The cached response exceeded its TTL |
| `STALE` | Stale response served | Served due to `stale-while-revalidate` or `stale-if-error` |
| `BYPASS` | Cache bypassed | Bypassed due to cache control directives like `no-store` |
| `REVALIDATED` | Cached response revalidated | Response validated with origin (304 Not Modified) |
| `DYNAMIC` | Response cannot be cached | Cannot be cached due to HTTP method or status code |

Cache Status Header Details
The `x-cache-status` header is automatically added to all responses. Possible values: `HIT`, `MISS`, `EXPIRED`, `STALE`, `BYPASS`, `REVALIDATED`, `DYNAMIC`.
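During development you can read the header directly to verify cache behavior; a small sketch using a cached fetch created with `createFetch` (the URL is illustrative):

```ts
const response = await fetch('https://api.example.com/data');
console.log('x-cache-status:', response.headers.get('x-cache-status'));
// The first call typically logs "MISS"; repeating it within the
// freshness window logs "HIT".
```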
SharedCache provides a comprehensive logging system with structured output for monitoring and debugging cache operations.
```ts
interface Logger {
  info(message?: unknown, ...optionalParams: unknown[]): void;
  warn(message?: unknown, ...optionalParams: unknown[]): void;
  debug(message?: unknown, ...optionalParams: unknown[]): void;
  error(message?: unknown, ...optionalParams: unknown[]): void;
}
```
```ts
// Create a simple console logger
const logger = {
  info: console.info.bind(console),
  warn: console.warn.bind(console),
  debug: console.debug.bind(console),
  error: console.error.bind(console),
};

// Create SharedCache with logger
const cache = new SharedCache(storage, {
  logger,
});
```
Example log output:

```txt
SharedCache: Cache miss { url: 'https://api.com/data', cacheKey: 'api:data', method: 'GET' }
SharedCache: Cache item found { url: 'https://api.com/data', cacheKey: 'api:data', method: 'GET' }
SharedCache: Cache hit { url: 'https://api.com/data', cacheKey: 'api:data', cacheStatus: 'HIT' }
SharedCache: Serving stale response - Revalidating in background { url: 'https://api.com/data', cacheKey: 'api:data', cacheStatus: 'STALE' }
SharedCache: Revalidation network error - Using fallback 500 response { url: 'https://api.com/data', cacheKey: 'api:data', error: [NetworkError] }
SharedCache: Put operation failed { url: 'https://api.com/data', error: [StorageError] }
SharedCache: Revalidation failed - Server returned 5xx status { url: 'https://api.com/data', status: 503, cacheKey: 'api:data' }
```
Production Logging

```ts
const productionLogger = {
  info: (msg, ctx) =>
    console.log(JSON.stringify({ level: 'INFO', message: msg, ...ctx })),
  warn: (msg, ctx) =>
    console.warn(JSON.stringify({ level: 'WARN', message: msg, ...ctx })),
  debug: () => {}, // No debug in production
  error: (msg, ctx) =>
    console.error(JSON.stringify({ level: 'ERROR', message: msg, ...ctx })),
};

const cache = new SharedCache(storage, {
  logger: productionLogger,
});
```

Development Logging (DEBUG level)
```ts
const devLogger = {
  info: console.info.bind(console),
  warn: console.warn.bind(console),
  debug: console.debug.bind(console),
  error: console.error.bind(console),
};

const cache = new SharedCache(storage, {
  logger: devLogger,
});
```

Structured Logger with Level Filtering
```ts
import { createLogger, LogLevel } from '@web-widget/shared-cache';

class CustomLogger {
  info(message: unknown, ...params: unknown[]) {
    this.log('INFO', message, ...params);
  }
  warn(message: unknown, ...params: unknown[]) {
    this.log('WARN', message, ...params);
  }
  debug(message: unknown, ...params: unknown[]) {
    this.log('DEBUG', message, ...params);
  }
  error(message: unknown, ...params: unknown[]) {
    this.log('ERROR', message, ...params);
  }

  private log(level: string, message: unknown, ...params: unknown[]) {
    const timestamp = new Date().toISOString();
    const context = params[0] || {};
    console.log(
      JSON.stringify({
        timestamp,
        level,
        service: 'shared-cache',
        message,
        ...context,
      })
    );
  }
}

const customLogger = new CustomLogger();
const structuredLogger = createLogger(customLogger, LogLevel.DEBUG);

const cache = new SharedCache(storage, {
  logger: structuredLogger, // use the level-filtered wrapper
});
```
All log messages include structured context data:
```ts
interface SharedCacheLogContext {
  url?: string; // Request URL
  cacheKey?: string; // Generated cache key
  status?: number; // HTTP status code
  duration?: number; // Operation duration (ms)
  error?: unknown; // Error object
  cacheStatus?: string; // Cache result status
  ttl?: number; // Time to live (seconds)
  method?: string; // HTTP method
  [key: string]: unknown; // Additional context
}
```
```ts
import { createLogger, LogLevel } from '@web-widget/shared-cache';

let hitCount = 0;
let totalCount = 0;

const monitoringLogger = {
  info: (message, context) => {
    if (context?.cacheStatus) {
      totalCount++;
      if (context.cacheStatus === 'HIT') hitCount++;

      // Log hit rate every 100 requests
      if (totalCount % 100 === 0) {
        console.log(
          `Cache hit rate: ${((hitCount / totalCount) * 100).toFixed(2)}%`
        );
      }
    }
    console.log(message, context);
  },
  warn: console.warn,
  debug: console.debug,
  error: console.error,
};

const cache = new SharedCache(storage, {
  logger: createLogger(monitoringLogger, LogLevel.INFO),
});
```
```ts
const performanceLogger = {
  info: (message, context) => {
    if (context?.duration) {
      console.log(`${message} - Duration: ${context.duration}ms`, context);
    } else {
      console.log(message, context);
    }
  },
  warn: console.warn,
  debug: console.debug,
  error: console.error,
};
```
```ts
const alertingLogger = {
  info: console.log,
  warn: console.warn,
  debug: console.debug,
  error: (message, context) => {
    console.error(message, context);

    // Send alerts for critical cache errors
    if (context?.error && message.includes('Put operation failed')) {
      sendAlert(`Cache storage error: ${context.error.message}`);
    }
  },
};
```
Main Functions:

- `createFetch(cache?, options?)` - Create cached fetch function
- `createLogger(logger?, logLevel?, prefix?)` - Create logger with level filtering

Classes:

- `Cache` - Main cache implementation
- `CacheStorage` - Cache storage manager

Key Types:

- `KVStorage` - Storage backend interface
- `SharedCacheRequestInitProperties` - Request cache configuration
- `SharedCacheKeyRules` - Cache key generation rules

createFetch(cache?, options?)

Creates a fetch function with shared cache configuration.
```ts
function createFetch(
  cache?: Cache,
  options?: {
    fetch?: typeof fetch;
    defaults?: Partial<SharedCacheRequestInitProperties>;
  }
): SharedCacheFetch;
```
Parameters:

- `cache` - Optional SharedCache instance (auto-discovered from `globalThis.caches` if not provided)
- `options.fetch` - Custom fetch implementation to use as the underlying fetcher (defaults to `globalThis.fetch`)
- `options.defaults` - Default shared cache options to apply to all requests

Returns: `SharedCacheFetch` - A fetch function with caching capabilities
Basic Usage:
```ts
const cache = await caches.open('my-cache');
const fetch = createFetch(cache, {
  defaults: { cacheControlOverride: 's-maxage=300' },
});
```

SharedCacheRequestInitProperties
Request-level cache configuration:
```ts
interface SharedCacheRequestInitProperties {
  cacheControlOverride?: string;
  cacheKeyRules?: SharedCacheKeyRules;
  ignoreRequestCacheControl?: boolean;
  ignoreVary?: boolean;
  varyOverride?: string;
  event?: ExtendableEvent;
}
```
SharedCacheKeyRules

Cache key generation rules:
```ts
interface SharedCacheKeyRules {
  cookie?: FilterOptions | boolean;
  device?: FilterOptions | boolean;
  header?: FilterOptions | boolean;
  search?: FilterOptions | boolean;
}
```
KVStorage

Storage backend interface:
```ts
interface KVStorage {
  get: (cacheKey: string) => Promise<unknown | undefined>;
  set: (cacheKey: string, value: unknown, ttl?: number) => Promise<void>;
  delete: (cacheKey: string) => Promise<boolean>;
}
```
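Any object matching this interface works as a backend. As a minimal sketch (not part of the library), here is an in-memory `Map` implementation with manual TTL handling; it assumes `ttl` is given in milliseconds, consistent with the lru-cache and Redis examples above:

```ts
import { type KVStorage } from '@web-widget/shared-cache';

const createMemoryStorage = (): KVStorage => {
  const store = new Map<string, { value: unknown; expiresAt?: number }>();
  return {
    async get(cacheKey: string) {
      const entry = store.get(cacheKey);
      if (!entry) return undefined;
      if (entry.expiresAt !== undefined && entry.expiresAt <= Date.now()) {
        store.delete(cacheKey); // lazily drop expired entries
        return undefined;
      }
      return entry.value;
    },
    async set(cacheKey: string, value: unknown, ttl?: number) {
      store.set(cacheKey, {
        value,
        expiresAt: ttl !== undefined ? Date.now() + ttl : undefined,
      });
    },
    async delete(cacheKey: string) {
      return store.delete(cacheKey);
    },
  };
};
```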
```ts
class Cache {
  match(request: RequestInfo | URL): Promise<Response | undefined>;
  put(request: RequestInfo | URL, response: Response): Promise<void>;
  delete(request: RequestInfo | URL): Promise<boolean>;
}

class CacheStorage {
  constructor(storage: KVStorage);
  open(cacheName: string): Promise<Cache>;
}
```
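These methods can also be used directly, without `createFetch`. A brief sketch (the cache name and URL are illustrative); storage and matching follow HTTP caching semantics, so the response's cache-control header governs freshness:

```ts
const cache = await caches.open('manual-cache');
const request = new Request('https://api.example.com/config');

// Store a response; the cache-control header controls how long it stays fresh.
await cache.put(
  request,
  new Response(JSON.stringify({ theme: 'dark' }), {
    headers: {
      'content-type': 'application/json',
      'cache-control': 's-maxage=120',
    },
  })
);

// Read it back while fresh; resolves to undefined otherwise.
const cached = await cache.match(request);
if (cached) {
  console.log(await cached.json());
}

// Explicitly evict the entry.
await cache.delete(request);
```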
createLogger(logger?, logLevel?, prefix?)

Creates a structured logger with level filtering and optional prefix.

```ts
const logger = createLogger(console, LogLevel.INFO, 'MyApp');
```
```ts
type SharedCacheStatus =
  | 'HIT'
  | 'MISS'
  | 'EXPIRED'
  | 'STALE'
  | 'BYPASS'
  | 'REVALIDATED'
  | 'DYNAMIC';
```
Status values are automatically added to response headers as `x-cache-status`.
Complete API documentation available in TypeScript definitions and source code.
SharedCache demonstrates exceptional HTTP standards compliance, fully adhering to established web caching specifications:
✅ HTTP Caching Standards (RFC 7234)

Complete Compliance Features:

- Standard cache-control directives: `no-store`, `no-cache`, `private`, `public`, `s-maxage`, and `max-age`
SharedCache implements a subset of the standard Web Cache API interface, focusing on core caching operations:
```ts
interface Cache {
  match(request: RequestInfo | URL): Promise<Response | undefined>; // ✅ Implemented
  put(request: RequestInfo | URL, response: Response): Promise<void>; // ✅ Implemented
  delete(request: RequestInfo | URL): Promise<boolean>; // ✅ Implemented

  // Not implemented - throw "not implemented" errors
  add(request: RequestInfo | URL): Promise<void>; // ❌ Throws error
  addAll(requests: RequestInfo[]): Promise<void>; // ❌ Throws error
  keys(): Promise<readonly Request[]>; // ❌ Throws error
  matchAll(): Promise<readonly Response[]>; // ❌ Throws error
}
```
Implementation Status:

- `match()`, `put()`, `delete()` - Fully implemented with HTTP semantics
- `add()`, `addAll()` - Use `put()` instead
- `keys()`, `matchAll()` - Not available in server environments

Options Parameter Differences:
SharedCache's `CacheQueryOptions` interface differs from the standard Web Cache API:
```ts
interface CacheQueryOptions {
  ignoreSearch?: boolean; // ❌ Not implemented - throws error
  ignoreMethod?: boolean; // ✅ Supported
  ignoreVary?: boolean; // ❌ Not implemented - throws error
}
```
Supported Options:

- `ignoreMethod`: Treat request as GET regardless of actual HTTP method

Unsupported Options (throw errors):

- `ignoreSearch`: Query string handling not customizable
- `ignoreVary`: Vary header processing not bypassable (Note: this option is actually supported in SharedCache)

Additional compliance notes:

- Uses `http-cache-semantics` for RFC compliance
- Respects the `private` directive for user-specific content
- Prioritizes `s-maxage` over `max-age` for multi-user environments
- `Authorization` headers are not cached in shared caches unless explicitly permitted by response cache control directives

🔒 Important Security Note: SharedCache automatically enforces HTTP caching security rules. Requests containing `Authorization` headers will not be cached unless the response explicitly allows it with directives like `public`, `s-maxage`, or `must-revalidate`. This ensures compliance with shared cache security requirements.
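A sketch of this rule in practice (the endpoint and token are illustrative; combine with `cacheKeyRules` that include the `authorization` header, as in the authenticated-fetch example above, if cached entries must stay per-user):

```ts
const apiFetch = createFetch(await caches.open('secure-api'));

// The request carries credentials.
const res = await apiFetch('https://api.example.com/me', {
  headers: { authorization: 'Bearer <token>' },
});

// If the origin responds with `cache-control: private, max-age=60`,
// the response is not stored in this shared cache.
// If it responds with `cache-control: public, s-maxage=60`,
// it becomes eligible for shared caching.
console.log(res.headers.get('x-cache-status'));
```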
SharedCache is production-ready and battle-tested, providing enterprise-grade HTTP caching with full standards compliance for server-side applications.
❓ Frequently Asked Questions

Q: Can I use different storage backends in production?

A: Absolutely! SharedCache supports any storage backend that implements the `KVStorage` interface:
```ts
// Redis example
const redisStorage: KVStorage = {
  async get(key) {
    return JSON.parse((await redis.get(key)) || 'null');
  },
  async set(key, value, ttl) {
    const serialized = JSON.stringify(value);
    if (ttl) {
      await redis.setex(key, Math.ceil(ttl / 1000), serialized);
    } else {
      await redis.set(key, serialized);
    }
  },
  async delete(key) {
    return (await redis.del(key)) > 0;
  },
};
```

Q: How does SharedCache handle concurrent requests?
A: SharedCache handles concurrent requests efficiently by serving cache entries and avoiding duplicate network requests.
Q: Is SharedCache compatible with edge runtimes?

A: SharedCache is technically compatible with edge runtimes, but it's typically not needed there. Most edge runtimes (Cloudflare Workers, Vercel Edge Runtime, Deno Deploy) already provide a native `caches` API implementation.
Primary Use Cases for SharedCache:

- Runtimes where the `caches` API is not natively available

Migration Benefits:

When using SharedCache with meta-frameworks, you can develop with a consistent caching API and deploy to any environment, whether it has native `caches` support or not. This provides true runtime portability for your caching logic.
Q: What are the benefits of the `stale-while-revalidate` and `stale-if-error` directives?

A: These RFC 5861 extensions provide significant performance and reliability benefits:
Q: How does `Vary` header processing affect cache performance?

A: SharedCache processes Vary headers by default, which requires two KV storage queries per cache lookup:
Performance Impact: each cache lookup issues two KV storage queries instead of one, which is most noticeable with slow or remote backends such as Redis.
Recommendation for slow storage:
```ts
// Disable Vary processing for better performance
const fetch = createFetch(cache, {
  defaults: {
    ignoreVary: true, // Reduces to single query per lookup
  },
});
```
Trade-offs: with `ignoreVary: true`, cached responses are matched without regard to `Vary` headers, so only enable it when responses do not differ based on request headers.
```ts
// Best practice: Use both directives together
const fetch = createFetch(cache, {
  defaults: {
    cacheControlOverride:
      's-maxage=300, stale-while-revalidate=86400, stale-if-error=86400',
  },
});
```

🤝 Who's Using SharedCache
SharedCache draws inspiration from industry-leading caching implementations.
MIT License - see LICENSE file for details.