
ConcurrentLru Quickstart

Alex Peck edited this page Sep 20, 2022 · 16 revisions

ConcurrentLru is a thread-safe, bounded-size pseudo-LRU cache. This page describes how to use ConcurrentLru and how it is implemented.

Quickstart

ConcurrentLru is intended to be a drop-in replacement for ConcurrentDictionary, with the added benefit of bounded size enforced by an LRU eviction policy.
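As a sketch of what "drop-in" means here (SomeItem is a hypothetical value type used for illustration, not part of the library), code written against ConcurrentDictionary.GetOrAdd can typically switch to ConcurrentLru without other changes:

```csharp
using System.Collections.Concurrent;
using BitFaster.Caching.Lru;

// Before: unbounded - entries accumulate indefinitely.
var dictionary = new ConcurrentDictionary<int, SomeItem>();
var fromDictionary = dictionary.GetOrAdd(1, k => new SomeItem(k));

// After: bounded - cold entries are evicted once capacity is exceeded.
// The GetOrAdd call shape is unchanged.
var lru = new ConcurrentLru<int, SomeItem>(capacity: 128);
var fromLru = lru.GetOrAdd(1, k => new SomeItem(k));

record SomeItem(int Key); // hypothetical value type for illustration
```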

The code samples below illustrate how to create an LRU, then get, remove, and update items:

Constructor

int capacity = 666;
var lru = new ConcurrentLru<int, SomeItem>(capacity);

Getting Items

bool success1 = lru.TryGet(1, out var value); // true if the key exists
var value1 = lru.GetOrAdd(1, (k) => new SomeItem(k)); // invokes the factory only if the key is missing
var value2 = await lru.GetOrAddAsync(0, (k) => Task.FromResult(new SomeItem(k)));

Removing Items

bool success2 = lru.TryRemove(1); // remove the item with key == 1
lru.Clear(); // remove all items
lru.Policy.Eviction.Value.Trim(1); // remove the coldest item

Updating Items

var item = new SomeItem(1);
bool success3 = lru.TryUpdate(1, item); // updates only if the key already exists
lru.AddOrUpdate(1, item); // adds if missing, otherwise replaces

Diagnostics

Console.WriteLine(lru.Metrics.Value.HitRatio); // ratio of cache hits to total lookups

// enumerate keys
foreach (var k in lru.Keys)
{
    Console.WriteLine(k);
}

// enumerate key value pairs
foreach (var kvp in lru)
{
    Console.WriteLine($"{kvp.Key} {kvp.Value}");
}

// register event on item removed
lru.Events.Value.ItemRemoved += (source, args) => Console.WriteLine($"{args.Reason} {args.Key} {args.Value}");

Builder API

Below is an example using all of the possible builder options:

var lru = new ConcurrentLruBuilder<string, Disposable>()
    .AsAsyncCache()
    .AsScopedCache()
    .WithAtomicGetOrAdd()
    .WithCapacity(3)
    .WithMetrics()
    .WithExpireAfterWrite(TimeSpan.FromSeconds(1))
    .WithKeyComparer(StringComparer.OrdinalIgnoreCase)
    .WithConcurrencyLevel(8)
    .Build();
| Builder Method | Description |
| --- | --- |
| AsAsyncCache | Build an IAsyncCache; the GetOrAdd method becomes GetOrAddAsync. |
| AsScopedCache | Build an IScopedCache. IDisposable values are wrapped in a lifetime scope. Scoped caches return lifetimes that prevent values from being disposed until the calling code completes. |
| WithAtomicGetOrAdd | Execute the cache's GetOrAdd method atomically, so the value factory is invoked at most once per key. Other threads attempting to add the same key are blocked until the value factory completes. Incurs a small performance penalty. |
| WithCapacity | Sets the maximum number of values to keep in the cache. If more items than this are added, the cache eviction policy determines which values to remove. |
| WithMetrics | Collect cache metrics, such as hit rate. Metrics incur a small performance penalty. |
| WithExpireAfterWrite | Evict after a fixed duration since an entry's creation or most recent replacement. |
| WithKeyComparer | Use the specified equality comparison implementation to compare keys. |
| WithConcurrencyLevel | Sets the estimated number of threads that will update the cache concurrently. |
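To illustrate the AsScopedCache option, below is a minimal sketch of consuming a scoped cache, assuming the Scoped wrapper and Lifetime types from BitFaster.Caching; Disposable is a hypothetical IDisposable implementation used for illustration:

```csharp
using System;
using BitFaster.Caching;
using BitFaster.Caching.Lru;

var scopedLru = new ConcurrentLruBuilder<int, Disposable>()
    .AsScopedCache()
    .WithCapacity(128)
    .Build();

// ScopedGetOrAdd returns a lifetime that keeps the value alive:
// the value cannot be disposed, even if it is evicted, until the
// lifetime itself is disposed at the end of the using block.
using (var lifetime = scopedLru.ScopedGetOrAdd(1, k => new Scoped<Disposable>(new Disposable())))
{
    Disposable d = lifetime.Value; // safe to use inside the using block
}

class Disposable : IDisposable // hypothetical value type for illustration
{
    public void Dispose() { }
}
```

This pattern avoids the race where one thread disposes an evicted value while another thread is still using it.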