- Overview
- Quick Start
- Basic Usage
- Advanced Configuration
- API Reference
- Best Practices
- Migration Guide
- Troubleshooting
- Related Documentation
MeteredMemoryCache is a decorator for IMemoryCache that automatically emits OpenTelemetry metrics for cache operations. It provides observability into cache hit rates, miss rates, and eviction patterns without requiring changes to your existing cache usage code.
- Zero-configuration metrics for any IMemoryCache implementation
- OpenTelemetry integration with standardized metric names
- Dimensional metrics with cache naming and custom tags
- Minimal performance overhead (~8-41ns per operation)
- Thread-safe operations with concurrent metric collection
- Dependency injection support with .NET options pattern
| Metric Name | Type | Description | Tags |
|---|---|---|---|
| cache.requests | ObservableCounter | Number of cache lookup operations | cache.name, cache.request.type (hit or miss) |
| cache.evictions | ObservableCounter | Number of cache evictions | cache.name |
| cache.entries | ObservableUpDownCounter | Current number of cache entries | cache.name |
| cache.estimated_size | ObservableGauge | Estimated size of the cache | cache.name |
The cache.evictions instrument counts all evictions except explicit user removals (Removed) and replacements (Replaced). The eviction reason is not emitted as a metric tag.
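The filtering rule above can be expressed as a predicate over EvictionReason. The helper name CountsAsEviction is illustrative, not part of the library:

```csharp
using Microsoft.Extensions.Caching.Memory;

// Illustrative predicate mirroring the rule above; the name
// CountsAsEviction is hypothetical, not a MeteredMemoryCache API.
static bool CountsAsEviction(EvictionReason reason) =>
    reason != EvictionReason.None &&
    reason != EvictionReason.Removed &&   // explicit user removal
    reason != EvictionReason.Replaced;    // entry overwritten with a new value

Console.WriteLine(CountsAsEviction(EvictionReason.Capacity)); // True
Console.WriteLine(CountsAsEviction(EvictionReason.Removed));  // False
```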
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;
using CacheImplementations;
var builder = Host.CreateApplicationBuilder(args);
// Configure OpenTelemetry
builder.Services.AddOpenTelemetry()
.WithMetrics(metrics => metrics
.AddMeter("Microsoft.Extensions.Caching.Memory.MemoryCache")
.AddPrometheusExporter());
// Register named cache with metrics
builder.Services.AddNamedMeteredMemoryCache("user-cache");
var app = builder.Build();
// Use the cache - metrics are emitted automatically
var cache = app.Services.GetRequiredKeyedService<IMemoryCache>("user-cache");
var result = cache.Get("some-key");

using System.Diagnostics.Metrics;
using Microsoft.Extensions.Caching.Memory;
using CacheImplementations;
// Create and wrap cache
var innerCache = new MemoryCache(new MemoryCacheOptions());
var meter = new Meter("Microsoft.Extensions.Caching.Memory.MemoryCache");
IMemoryCache cache = new MeteredMemoryCache(innerCache, meter, "my-cache");
// Use normally - metrics emitted automatically
cache.Set("key", "value");
var value = cache.Get("key");

using System.Diagnostics.Metrics;
using Microsoft.Extensions.Caching.Memory;
using CacheImplementations;
// Create the underlying cache and meter
var innerCache = new MemoryCache(new MemoryCacheOptions
{
SizeLimit = 1000
});
var meter = new Meter("Microsoft.Extensions.Caching.Memory.MemoryCache");
// Wrap with MeteredMemoryCache
IMemoryCache cache = new MeteredMemoryCache(innerCache, meter, "user-cache");
// Use normally - metrics are emitted automatically
var user = cache.GetOrCreate("user:123", entry =>
{
entry.AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(5);
return LoadUserFromDatabase(123);
});
// Clean up
cache.Dispose();
meter.Dispose();

using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;
using CacheImplementations;
var builder = Host.CreateApplicationBuilder(args);
// Configure OpenTelemetry
builder.Services.AddOpenTelemetry()
.WithMetrics(metrics => metrics
.AddMeter("Microsoft.Extensions.Caching.Memory.MemoryCache")
.AddPrometheusExporter());
// Register named cache with metrics
builder.Services.AddNamedMeteredMemoryCache("user-cache", options =>
{
options.AdditionalTags["service"] = "user-service";
options.AdditionalTags["version"] = "1.0";
});
var app = builder.Build();
// Resolve and use the cache
var cache = app.Services.GetRequiredKeyedService<IMemoryCache>("user-cache");
var result = cache.Get("some-key");

var options = new MeteredMemoryCacheOptions
{
CacheName = "product-cache",
DisposeInner = true,
AdditionalTags =
{
["environment"] = "production",
["datacenter"] = "us-west-2",
["team"] = "catalog"
}
};
var cache = new MeteredMemoryCache(innerCache, meter, options);

// Register multiple caches with different configurations
services.AddNamedMeteredMemoryCache("user-cache", opts =>
{
opts.AdditionalTags["type"] = "user-data";
});
services.AddNamedMeteredMemoryCache("product-cache", opts =>
{
opts.AdditionalTags["type"] = "product-data";
});
services.AddNamedMeteredMemoryCache("session-cache", opts =>
{
opts.AdditionalTags["type"] = "session-data";
opts.AdditionalTags["ttl"] = "short";
});
// Resolve specific caches
var userCache = serviceProvider.GetRequiredKeyedService<IMemoryCache>("user-cache");
var productCache = serviceProvider.GetRequiredKeyedService<IMemoryCache>("product-cache");

// Add basic memory cache
services.AddMemoryCache();
// Decorate with metrics
services.DecorateMemoryCacheWithMetrics("main-cache",
configureOptions: opts =>
{
opts.AdditionalTags["component"] = "main";
});

public MeteredMemoryCache(
    IMemoryCache innerCache,
    Meter meter,
    string? cacheName = null,
    bool disposeInner = false)

- innerCache: The underlying cache implementation to decorate
- meter: OpenTelemetry meter for metric emission
- cacheName: Optional logical name for dimensional metrics
- disposeInner: Whether to dispose the inner cache on disposal
public MeteredMemoryCache(
IMemoryCache innerCache,
Meter meter,
MeteredMemoryCacheOptions options)

MeteredMemoryCache implements the IMemoryCache interface and works with all standard extension methods from Microsoft.Extensions.Caching.Memory.CacheExtensions. All operations automatically emit metrics.
Common Extension Methods:
- TryGetValue<T>(object key, out T value) - Type-safe retrieval with automatic hit/miss metrics
- Set<T>(object key, T value, MemoryCacheEntryOptions? options) - Sets an entry with automatic eviction tracking
- GetOrCreate<T>(object key, Func<ICacheEntry, T> factory) - Gets an existing entry or creates a new one, with full metric coverage
- GetOrCreateAsync<T>(object key, Func<ICacheEntry, Task<T>> factory) - Async version of GetOrCreate
Example:
using Microsoft.Extensions.Caching.Memory;
// Use extension methods directly
if (cache.TryGetValue<UserData>("user:123", out var user))
{
// Hit recorded automatically
}
cache.Set("user:123", userData, new MemoryCacheEntryOptions
{
AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(30)
});
var result = cache.GetOrCreate($"product:{id}", entry =>
{
entry.SlidingExpiration = TimeSpan.FromMinutes(10);
return GetProductFromDatabase(id);
});

public string Name { get; }

Gets the logical cache name. Defaults to "Default" when no explicit name is provided.
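Since IMemoryCache itself does not expose a Name property, reading it requires the concrete decorator type. A minimal sketch (the cache name "orders-cache" is an arbitrary example):

```csharp
using System.Diagnostics.Metrics;
using Microsoft.Extensions.Caching.Memory;
using CacheImplementations;

var inner = new MemoryCache(new MemoryCacheOptions());
var meter = new Meter("Microsoft.Extensions.Caching.Memory.MemoryCache");
IMemoryCache cache = new MeteredMemoryCache(inner, meter, "orders-cache");

// Name lives on the concrete decorator, not on IMemoryCache
if (cache is MeteredMemoryCache metered)
{
    Console.WriteLine(metered.Name); // "orders-cache"
}
```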
public static IServiceCollection AddNamedMeteredMemoryCache(
this IServiceCollection services,
string cacheName,
    Action<MeteredMemoryCacheOptions>? configureOptions = null)

Registers a named cache with complete dependency injection setup, including:

- Options validation with IValidateOptions<T>
- Keyed service registration for multi-cache scenarios
- Automatic meter registration
- Fallback singleton registration for single-cache scenarios
public static IServiceCollection DecorateMemoryCacheWithMetrics(
this IServiceCollection services,
string? cacheName = null,
    Action<MeteredMemoryCacheOptions>? configureOptions = null)

Decorates an existing IMemoryCache registration with metrics support.
Use hierarchical naming for related caches:
services.AddNamedMeteredMemoryCache("user.profile");
services.AddNamedMeteredMemoryCache("user.permissions");
services.AddNamedMeteredMemoryCache("product.catalog");
services.AddNamedMeteredMemoryCache("product.pricing");

Establish consistent tag naming across your application:
var commonOptions = new Action<MeteredMemoryCacheOptions>(opts =>
{
opts.AdditionalTags["service"] = "my-service";
opts.AdditionalTags["version"] = Assembly.GetEntryAssembly()?.GetName().Version?.ToString();
opts.AdditionalTags["environment"] = Environment.GetEnvironmentVariable("ASPNETCORE_ENVIRONMENT");
});
services.AddNamedMeteredMemoryCache("cache1", commonOptions);
services.AddNamedMeteredMemoryCache("cache2", commonOptions);

Configure DisposeInner based on ownership:
// Own the inner cache - dispose it
services.AddNamedMeteredMemoryCache("owned-cache", opts =>
{
opts.DisposeInner = true;
});
// Shared cache - don't dispose
services.DecorateMemoryCacheWithMetrics("shared-cache", opts =>
{
opts.DisposeInner = false; // default
});

Always validate options in production:
services.AddNamedMeteredMemoryCache("critical-cache", opts =>
{
    // Configuration here
});

// Validation chains off the options builder, not the service collection
services.AddOptions<MeteredMemoryCacheOptions>("critical-cache")
    .ValidateDataAnnotations()
    .ValidateOnStart(); // Fail fast on startup

Based on benchmarks with 16,384 operations on Windows 11/.NET 9.0.8:
| Operation | Raw Cache | Metered Cache | Overhead |
|---|---|---|---|
| Hit (Get) | 68.07ns | 90.77ns | +22.70ns (+33%) |
| Miss (Get) | 52.30ns | 92.92ns | +40.62ns (+78%) |
| Set | 543.34ns | 551.03ns | +7.69ns (+1.4%) |
| TryGetValue Hit | 53.35ns | 61.52ns | +8.17ns (+15%) |
| TryGetValue Miss | 43.13ns | 83.29ns | +40.16ns (+93%) |
| CreateEntry | 537.14ns | 547.98ns | +10.84ns (+2.0%) |
- Per-instance: ~200 bytes (3 counters + tag storage)
- Per-operation: 0 allocations on the hot path
- Per-eviction: 0 allocations (uses pre-allocated tags)
- Thread-safe via Interlocked atomic operations on internal counters
- No global locks or contention points
- Linear scaling with operation rate
- Suitable for high-throughput scenarios
// Ensure meter is registered and exported
services.AddOpenTelemetry()
.WithMetrics(metrics => metrics
.AddMeter("Microsoft.Extensions.Caching.Memory.MemoryCache") // Must match meter name
    .AddConsoleExporter()); // Add exporter

// Will throw on second registration
services.AddNamedMeteredMemoryCache("duplicate");
services.AddNamedMeteredMemoryCache("duplicate"); // ❌ Throws
// Use different names
services.AddNamedMeteredMemoryCache("cache-v1");
services.AddNamedMeteredMemoryCache("cache-v2"); // ✅ OK

Eviction metrics are only emitted when:
- The entry has an eviction callback registered (automatic with MeteredMemoryCache)
- The entry is actually removed by the cache (expiration alone does not fire the callback until the entry is scanned or accessed)
- The cache is not disposed before the eviction occurs
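In tests, evictions can be forced deterministically instead of waiting for expiration. One sketch, using MemoryCache.Compact on the underlying cache (this is standard MemoryCache behavior, not a MeteredMemoryCache API):

```csharp
using Microsoft.Extensions.Caching.Memory;

var inner = new MemoryCache(new MemoryCacheOptions());
inner.Set("key", "value");

// Compact(1.0) asks the cache to remove 100% of entries, which fires
// registered eviction callbacks (on a background thread).
inner.Compact(1.0);

// Entries are removed synchronously; callbacks run asynchronously,
// so test code should allow a short grace period before asserting.
await Task.Delay(100);
Console.WriteLine(inner.Count); // 0
```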
// ❌ Invalid - null additional tags
var options = new MeteredMemoryCacheOptions
{
AdditionalTags = null // Throws on validation
};
// ✅ Valid
var options = new MeteredMemoryCacheOptions
{
AdditionalTags = new Dictionary<string, object?>()
};

services.AddOpenTelemetry()
.WithMetrics(metrics => metrics
.AddMeter("Microsoft.Extensions.Caching.Memory.MemoryCache")
    .AddConsoleExporter()); // Prints to console

var exportedItems = new List<Metric>();
using var meterProvider = Sdk.CreateMeterProviderBuilder()
.AddMeter("Microsoft.Extensions.Caching.Memory.MemoryCache")
.AddInMemoryExporter(exportedItems)
.Build();
// Perform cache operations
cache.Get("test-key");
// Inspect collected metrics
foreach (var metric in exportedItems)
{
    // Values live on the metric points, not the Metric itself
    foreach (ref readonly var point in metric.GetMetricPoints())
    {
        Console.WriteLine($"{metric.Name}: {point.GetSumLong()}");
    }
}

If you're currently instrumenting cache operations manually, MeteredMemoryCache eliminates the need for custom instrumentation code.
public class CacheService
{
private readonly IMemoryCache _cache;
private readonly IMetrics _metrics;
public T Get<T>(string key)
{
if (_cache.TryGetValue(key, out var value))
{
_metrics.Increment("cache.requests", new("cache.request.type", "hit"));
return (T)value;
}
_metrics.Increment("cache.requests", new("cache.request.type", "miss"));
return default(T);
}
public void Set<T>(string key, T value)
{
_cache.Set(key, value);
_metrics.Increment("cache.sets");
}
}

public class CacheService
{
private readonly IMemoryCache _cache; // MeteredMemoryCache via DI
public T Get<T>(string key)
{
return _cache.TryGetValue(key, out var value) ? (T)value : default(T);
// Metrics emitted automatically
}
public void Set<T>(string key, T value)
{
_cache.Set(key, value);
// Metrics emitted automatically
}
}

Migration Steps:
- Remove manual metric instrumentation code
- Register MeteredMemoryCache in DI container
- Configure OpenTelemetry to collect the new metrics
- Update monitoring dashboards to use new metric names
If you've built a custom cache wrapper for metrics, MeteredMemoryCache provides a standardized replacement.
public class InstrumentedCache : IMemoryCache
{
private readonly IMemoryCache _inner;
private readonly Counter<long> _requestCounter;
public InstrumentedCache(IMemoryCache inner, IMeterFactory meterFactory)
{
_inner = inner;
var meter = meterFactory.Create("Microsoft.Extensions.Caching.Memory.MemoryCache");
_requestCounter = meter.CreateCounter<long>("cache.requests");
}
public bool TryGetValue(object key, out object? value)
{
var result = _inner.TryGetValue(key, out value);
if (result)
_requestCounter.Add(1, new("cache.request.type", "hit"));
else
_requestCounter.Add(1, new("cache.request.type", "miss"));
return result;
}
// ... implement all other IMemoryCache methods
}

// Register in DI
services.AddNamedMeteredMemoryCache("my-cache");
// Use directly - no custom wrapper needed
public class MyService
{
private readonly IMemoryCache _cache;
public MyService(IMemoryCache cache) // or keyed service for named caches
{
_cache = cache;
}
// All methods instrumented automatically
}

// Before: IDistributedCache (if using in-memory)
services.AddDistributedMemoryCache();
// After: MeteredMemoryCache with equivalent functionality
services.AddNamedMeteredMemoryCache("distributed-equivalent", opts =>
{
opts.AdditionalTags["cache_type"] = "distributed_memory";
});

// Before: LazyCache
services.AddLazyCache();
// After: MeteredMemoryCache with lazy loading pattern
services.AddNamedMeteredMemoryCache("lazy-cache");
// Usage with lazy loading
var result = cache.GetOrCreate("key", entry =>
{
entry.SlidingExpiration = TimeSpan.FromMinutes(5);
return ExpensiveOperation();
});

- Identify Current Metrics: Document existing cache metrics and naming conventions
- Plan Metric Mapping: Map old metric names to new OpenTelemetry standard names
- Update Dependencies: Add required packages (OpenTelemetry, MeteredMemoryCache)
- Configure Services: Replace cache registrations with MeteredMemoryCache
- Remove Custom Code: Delete manual instrumentation and wrapper classes
- Update Monitoring: Modify dashboards and alerts for new metric names
- Test Thoroughly: Verify metrics are correctly emitted in all scenarios
- Monitor Performance: Ensure acceptable overhead in production workloads
| Old Metric | New Metric | Notes |
|---|---|---|
| cache.hits | cache.requests | ObservableCounter with cache.request.type=hit |
| cache.misses | cache.requests | ObservableCounter with cache.request.type=miss |
| cache.sets | N/A | Tracked via eviction callbacks instead |
| cache.evictions | cache.evictions | ObservableCounter |
| cache.size | cache.estimated_size | ObservableGauge (when SizeLimit is set) |
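When updating dashboards, the old hit/miss counters can be recovered from the consolidated counter by filtering on the cache.request.type tag. A sketch against the OpenTelemetry .NET in-memory exporter (the helper name SplitRequests is illustrative):

```csharp
using OpenTelemetry.Metrics;

// Illustrative helper: recovers "hits" and "misses" from the single
// cache.requests counter by inspecting the cache.request.type tag.
static (long Hits, long Misses) SplitRequests(Metric metric)
{
    long hits = 0, misses = 0;
    foreach (ref readonly var point in metric.GetMetricPoints())
    {
        foreach (var tag in point.Tags)
        {
            if (tag.Key != "cache.request.type") continue;
            if ((string?)tag.Value == "hit") hits += point.GetSumLong();
            else misses += point.GetSumLong();
        }
    }
    return (hits, misses);
}
```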
MeteredMemoryCache adds minimal overhead (~8-41ns per operation, per the benchmarks above). If you're migrating from:
- Heavy custom instrumentation: Expect performance improvement
- No instrumentation: Expect small performance decrease
- Third-party wrappers: Performance impact varies by implementation