124 changes: 121 additions & 3 deletions README.md
@@ -6,7 +6,7 @@
[![Packages](https://david-dm.org/Codibre/remembered.svg)](https://david-dm.org/Codibre/remembered)
[![npm version](https://badge.fury.io/js/remembered.svg)](https://badge.fury.io/js/remembered)

-A module to remember for a given time the promises you made.
+A module to remember for a given time the promises you made, with configurable cache strategies and eviction policies.

# How to install

@@ -16,6 +16,8 @@ npm install remembered

# Usage

## Basic Usage

Create a new Remembered instance, passing the ttl you want, in ms.

```ts
@@ -36,16 +38,110 @@ const [r1, r2, r3] = await Promise.all([
]);
```

-In the above example, **r1**, **r2** and **r3** will receive the same exact +promise.
+In the above example, **r1**, **r2** and **r3** will receive the same exact promise.
Remembered doesn't "cache" the result of your async operation: it caches the promise itself.

This is very useful for concurrent tasks where you have the same heavy call and you want it to happen just once.
In this example, the promise resolves in 200 milliseconds, but the ttl is 1 second, and it starts counting not when the promise is resolved but when the promise is made. In other words, exactly 1 second after the first call, the callback will need to be called again.

If you want the promise to be remembered only while it is pending, you can use a **ttl** of 0. In this case, while the promise is pending, Remembered will return the same reference, but after it is resolved, the callback will be called again.
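The ttl-0 behavior can be sketched with a minimal promise-level memoizer. This is purely illustrative: the `rememberWhilePending` helper below is hypothetical and not part of the library.

```ts
// Hypothetical sketch of ttl-0 behavior: remember the promise only while it is pending.
function rememberWhilePending<T>(fn: () => Promise<T>): () => Promise<T> {
	let pending: Promise<T> | undefined;
	return () => {
		if (!pending) {
			// Cache the promise itself and forget it as soon as it settles
			pending = fn().finally(() => {
				pending = undefined;
			});
		}
		return pending;
	};
}
```

Concurrent callers share the same in-flight promise; once it settles, the next call invokes the callback again.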

-Another option is to use the **wrap** method:
## Cache Strategies and Eviction Policies

Remembered supports different cache strategies with configurable eviction policies to manage memory usage:

### Available Eviction Policies

- **LRU (Least Recently Used)**: Removes the least recently accessed item when capacity is reached
- **MRU (Most Recently Used)**: Removes the most recently accessed item when capacity is reached
- **FIFO (First In, First Out)**: Removes the oldest item when capacity is reached
- **Simple**: No eviction policy, stores items indefinitely (default)

### Configuration Options

```ts
interface RememberedConfig<TResponse = unknown, TKey = string> {
	ttl: number | TtlFunction<TResponse, TKey>;
	evictionPolicy?: 'LRU' | 'MRU' | 'FIFO';
	capacity?: number; // Required when using eviction policies
	nonBlocking?: boolean;
	onReused?: (key: string) => void;
}
```

### Examples

#### LRU Cache with Capacity Limit

```ts
const remembered = new Remembered({
	ttl: 5000,
	evictionPolicy: 'LRU',
	capacity: 100,
});
```

#### MRU Cache for Recent Items

```ts
const remembered = new Remembered({
	ttl: 3000,
	evictionPolicy: 'MRU',
	capacity: 50,
});
```

#### FIFO Cache for Time-based Eviction

```ts
const remembered = new Remembered({
	ttl: 10000,
	evictionPolicy: 'FIFO',
	capacity: 200,
});
```

#### Simple Cache (Default)

```ts
const remembered = new Remembered({
	ttl: 5000,
	// No eviction policy = Simple cache
});
```

### Advanced Configuration

#### Dynamic TTL Function

```ts
const remembered = new Remembered({
	ttl: (key: string, response?: any) => {
		// Different TTL based on key or response
		return key.startsWith('user:') ? 30000 : 5000;
	},
	evictionPolicy: 'LRU',
	capacity: 1000,
});
```

#### Non-blocking Mode with Callback

```ts
const remembered = new Remembered({
	ttl: 5000,
	evictionPolicy: 'LRU',
	capacity: 100,
	nonBlocking: true,
	onReused: (key: string) => {
		console.log(`Cache hit for key: ${key}`);
	},
});
```

## Wrapping Functions

Another option is to use the **wrap** method:

```ts
const callback = () => new Promise<number>((resolve) => {
@@ -66,6 +162,28 @@ The wrap method returns a version of your function that receives the exact same

The given ttl is meant to be readonly. So, if you change the ttl value you provided, it will not take effect on previously created Remembered instances.
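As a rough sketch of what a wrap-style helper does under the hood: the `wrapSketch` name, the `getKey` parameter, and the signature below are assumptions for illustration, not the library's actual API.

```ts
// Hypothetical sketch of a wrap-style memoizer; the real library's API may differ.
function wrapSketch<A extends unknown[], R>(
	fn: (...args: A) => Promise<R>,
	getKey: (...args: A) => string, // assumed key-derivation parameter
	ttl: number,
): (...args: A) => Promise<R> {
	const cache = new Map<string, Promise<R>>();
	return (...args: A) => {
		const key = getKey(...args);
		let promise = cache.get(key);
		if (!promise) {
			promise = fn(...args);
			cache.set(key, promise);
			// The ttl counts from the moment the promise is made, not from resolution
			setTimeout(() => cache.delete(key), ttl);
		}
		return promise;
	};
}
```

Calls with the same key within the ttl share one promise; a different key triggers a fresh call.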

# Cache Strategy Details

## LRU (Least Recently Used)
- Best for: Frequently accessed data
- Evicts: Least recently accessed items
- Use case: General purpose caching, user sessions

## MRU (Most Recently Used)
- Best for: Data that becomes stale quickly
- Evicts: Most recently accessed items
- Use case: Temporary data, rate limiting

## FIFO (First In, First Out)
- Best for: Time-sensitive data
- Evicts: Oldest items regardless of access
- Use case: Logs, time-series data

## Simple
- Best for: Small datasets, development
- Evicts: Never (memory grows indefinitely)
- Use case: Testing, small applications
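To make the eviction behavior above concrete, here is a minimal, self-contained LRU sketch based on `Map` insertion order. It is illustrative only and is not how the library implements its caches.

```ts
// Minimal LRU sketch using Map insertion order; not the library's implementation.
class LruSketch<K, V> {
	private map = new Map<K, V>();

	constructor(private readonly capacity: number) {}

	get(key: K): V | undefined {
		const value = this.map.get(key);
		if (value !== undefined) {
			// Re-insert to mark this key as most recently used
			this.map.delete(key);
			this.map.set(key, value);
		}
		return value;
	}

	set(key: K, value: V): void {
		if (this.map.has(key)) {
			this.map.delete(key);
		} else if (this.map.size === this.capacity) {
			// The first key in insertion order is the least recently used
			const lru = this.map.keys().next().value as K;
			this.map.delete(lru);
		}
		this.map.set(key, value);
	}
}
```

Reading a key protects it from eviction; the coldest entry is dropped when capacity is exceeded.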

# Saudade

There is no proper translation for the word *saudade* in English.
39 changes: 35 additions & 4 deletions docs/classes/remembered.md
@@ -2,7 +2,7 @@

# Class: Remembered

-A class that help you remember previous calls for you functions, to avoid new calls while it is not forgotten
+A class that helps you remember previous calls to your functions, avoiding new calls while the previous one is not yet forgotten. Supports configurable cache strategies with eviction policies to manage memory usage.

## Table of contents

@@ -28,14 +28,45 @@ A class that help you remember previous calls for you functions, to avoid new ca

\+ **new Remembered**(`config?`: RememberedConfig): [*Remembered*](remembered.md)

Creates a new Remembered instance with optional cache configuration.

#### Parameters:

-Name | Type |
-:------ | :------ |
-`config` | RememberedConfig |
+Name | Type | Description |
+:------ | :------ | :------ |
+`config` | RememberedConfig | Configuration object with TTL and optional cache strategy settings |

**Returns:** [*Remembered*](remembered.md)

#### Configuration Options:

- `ttl`: Time to live in milliseconds, or a function that returns the TTL
- `evictionPolicy`: Optional cache eviction policy ('LRU', 'MRU', 'FIFO', or undefined for Simple)
- `capacity`: Maximum number of items to store (required when using an eviction policy)
- `nonBlocking`: Whether to keep a persistent last result for background updates
- `onReused`: Callback function called when a cached value is reused

#### Examples:

```ts
// Simple cache (default)
const simpleCache = new Remembered({ ttl: 1000 });

// LRU cache with capacity limit
const lruCache = new Remembered({
	ttl: 5000,
	evictionPolicy: 'LRU',
	capacity: 100,
});

// Dynamic TTL with MRU eviction
const mruCache = new Remembered({
	ttl: (key, response) => (key.startsWith('user:') ? 30000 : 5000),
	evictionPolicy: 'MRU',
	capacity: 50,
});
```

## Properties

### map
3 changes: 2 additions & 1 deletion package.json
@@ -17,10 +17,11 @@
"lint": "npm run lint:format && npm run lint:style",
"lint:fix": "npm run lint:format:fix && npm run lint:style:fix",
"build": "tsc -p tsconfig.build.json",
"test": "jest test/unit",
"test": "jest test/unit --runInBand --forceExit",
"test:watch": "jest test/unit --watch",
"test:coverage": "jest test/unit --coverage",
"test:debug": "node --inspect-brk -r tsconfig-paths/register -r ts-node/register node_modules/.bin/jest --runInBand",
"test:only": "jest --runInBand $(grep -rnwl ./test -e \"test.only\\|it.only\\|describe.only\" --include \\*.ts | tr '\n' ' ') --forceExit",
"test:e2e": "jest test/e2e",
"clear": "npm run clear:build && npm run clear:modules",
"clear:build": "del-cli ./dist",
23 changes: 23 additions & 0 deletions src/cache-strategy/cache-factory.ts
@@ -0,0 +1,23 @@
import {
	EvictionRememberedConfig,
	RememberedConfig,
} from '../remembered-config';
import { FifoCache, LruCache, MruCache, SimpleCache } from './implementations';

export function createCache<TResponse = unknown, TKey = string>(
	config: RememberedConfig<TResponse, TKey>,
) {
	switch (config.evictionPolicy) {
		case 'LRU': {
			const lruConfig = config as EvictionRememberedConfig<TResponse, TKey>;
			return new LruCache<TResponse, TKey>(lruConfig.capacity);
		}
		case 'FIFO': {
			const fifoConfig = config as EvictionRememberedConfig<TResponse, TKey>;
			return new FifoCache<TResponse, TKey>(fifoConfig.capacity);
		}
		case 'MRU': {
			const mruConfig = config as EvictionRememberedConfig<TResponse, TKey>;
			return new MruCache<TResponse, TKey>(mruConfig.capacity);
		}
		default:
			return new SimpleCache<TResponse, TKey>();
	}
}
68 changes: 68 additions & 0 deletions src/cache-strategy/core/base-cache.ts
@@ -0,0 +1,68 @@
import { Cache } from './cache';
import { Node, LinkedList } from '../utils/linked-list';

export abstract class BaseCache<TResponse = unknown, TKey = string>
	implements Cache<TResponse, TKey>
{
	protected map: Map<TKey, Node<TResponse, TKey>>;
	protected linkedList: LinkedList<TResponse, TKey>;
	protected size: number;
	protected readonly capacity: number;

	constructor(capacity: number) {
		if (capacity < 0) {
			throw new Error('Capacity must be greater than or equal to 0');
		}

		this.map = new Map<TKey, Node<TResponse, TKey>>();
		this.linkedList = new LinkedList<TResponse, TKey>();

		this.capacity = capacity;
		this.size = 0;
	}

	abstract get(key: TKey): TResponse | Promise<TResponse> | undefined;

	set(key: TKey, value: TResponse | Promise<TResponse>): void {
		if (this.capacity === 0) {
			return;
		}

		const item = this.map.get(key);
		if (item) {
			this.handleExistingItemAccess(item, value);
			return;
		}

		if (this.size === this.capacity) {
			this.evictItem();
		}

		const node = this.linkedList.addNode(key, value);
		this.size++;
		this.map.set(key, node);
	}

	delete(key: TKey): void {
		const item = this.map.get(key);
		if (item) {
			this.linkedList.removeNode(item);
			this.map.delete(key);
			this.size--;
		}
	}

	protected abstract handleExistingItemAccess(
		item: Node<TResponse, TKey>,
		value: TResponse | Promise<TResponse>,
	): void;
	protected abstract evictItem(): void;

	protected getHead(): Node<TResponse, TKey> {
		return this.linkedList.getHead();
	}

	protected getTail(): Node<TResponse, TKey> {
		return this.linkedList.getTail();
	}
}
26 changes: 26 additions & 0 deletions src/cache-strategy/core/cache-factory.ts
@@ -0,0 +1,26 @@
import {
	EvictionRememberedConfig,
	RememberedConfig,
} from '../../remembered-config';
import { FifoCache } from '../implementations/fifo-cache';
import { LruCache } from '../implementations/lru-cache';
import { MruCache } from '../implementations/mru-cache';
import { SimpleCache } from '../implementations/simple-cache';

export function createCache<TResponse = unknown, TKey = string>(
	config: RememberedConfig<TResponse, TKey>,
) {
	switch (config.evictionPolicy) {
		case 'LRU': {
			const lruConfig = config as EvictionRememberedConfig<TResponse, TKey>;
			return new LruCache<TResponse, TKey>(lruConfig.capacity);
		}
		case 'FIFO': {
			const fifoConfig = config as EvictionRememberedConfig<TResponse, TKey>;
			return new FifoCache<TResponse, TKey>(fifoConfig.capacity);
		}
		case 'MRU': {
			const mruConfig = config as EvictionRememberedConfig<TResponse, TKey>;
			return new MruCache<TResponse, TKey>(mruConfig.capacity);
		}
		default:
			return new SimpleCache<TResponse, TKey>();
	}
}
5 changes: 5 additions & 0 deletions src/cache-strategy/core/cache.ts
@@ -0,0 +1,5 @@
export interface Cache<TResponse = unknown, TKey = string> {
	get(key: TKey): TResponse | Promise<TResponse> | undefined;
	set(key: TKey, value: TResponse | Promise<TResponse>): void;
	delete(key: TKey): void;
}

> **Reviewer comment (Member):** Maybe it's better to keep the idiom around the Map word instead of cache, to prevent confusion with dedicated cache services. You could call it, for example, EvictionableMap. I also recommend inverting the class generic parameters, to make its contract compatible with Map<TKey, TResponse | Promise>.
33 changes: 33 additions & 0 deletions src/cache-strategy/implementations/fifo-cache.ts
@@ -0,0 +1,33 @@
import { BaseCache } from '../core/base-cache';
import { Node } from '../utils/linked-list';

export class FifoCache<TResponse = unknown, TKey = string> extends BaseCache<
	TResponse,
	TKey
> {
	get(key: TKey): TResponse | Promise<TResponse> | undefined {
		const item = this.map.get(key);
		if (item) {
			return item.value;
		}
		return undefined;
	}

	protected handleExistingItemAccess(
		item: Node<TResponse, TKey>,
		value: TResponse | Promise<TResponse>,
	): void {
		item.value = value;
	}

	protected evictItem(): void {
		const nodeToRemove = this.getTail().prev;
		if (nodeToRemove) {
			this.linkedList.removeNode(nodeToRemove);
			if (nodeToRemove.key) {
				this.map.delete(nodeToRemove.key);
			}
			// Only decrement when a node was actually removed
			this.size--;
		}
	}
}
4 changes: 4 additions & 0 deletions src/cache-strategy/implementations/index.ts
@@ -0,0 +1,4 @@
export { SimpleCache } from './simple-cache';
export { FifoCache } from './fifo-cache';
export { LruCache } from './lru-cache';
export { MruCache } from './mru-cache';