

dispose: Function that is called on items when they are dropped from the cache, as this.dispose(value, key, reason). This can be handy if you want to close file descriptors or do other cleanup tasks when items are no longer stored in the cache.

NOTE: It is called before the item has been fully removed from the cache, so if you want to put it right back in, you need to wait until the next tick. If you try to add it back within the dispose() function call, it will break things in subtle and nasty ways.

Unlike several other options, this may not be overridden by passing an option to set(), for performance reasons. If disposal functions may vary between cache entries, then the entire list must be scanned on every cache swap, even if no disposal function is in use.

The reason will be one of the following strings, corresponding to the reason for the item's removal:

- evict: Item was evicted to make space for a new addition.
- set: Item was overwritten by a new value.
- delete: Item was removed by an explicit cache.delete(key) or by calling cache.clear(), which deletes everything.

The dispose() method is not called for canceled calls to fetchMethod(). If you wish to handle evictions, overwrites, and deletes of in-flight asynchronous fetches, you must use the AbortSignal provided.

disposeAfter: The same as dispose, but called after the entry is completely removed and the cache is once again in a clean state. It is safe to add an item right back into the cache at this point. However, note that it is very easy to inadvertently create infinite recursion in this way. As with dispose(), the disposeAfter() method is not called for canceled calls to fetchMethod().

noDisposeOnSet: Boolean, default false. Set to true to suppress calling the dispose() function if the entry key is still accessible within the cache. This may be overridden by passing an options object to cache.set().

ttl: Max time to live for items before they are considered stale. Note that stale items are NOT preemptively removed by default, and MAY live in the cache, contributing to its LRU max, long after they have expired. Also, as this cache is optimized for LRU/MRU operations, some of the staleness/TTL checks will reduce performance, as they incur overhead by deleting from Map objects rather than simply discarding old Map objects.

This is not primarily a TTL cache, and does not make strong TTL guarantees. There is no pre-emptive pruning of expired items, but you may set a ttl on the cache, and it will treat expired items as missing when they are fetched, and delete them.

Optional, but must be a positive integer in ms if specified. At least one of max, maxSize, or ttl is required. Even if ttl tracking is enabled, it is strongly recommended to set a max to prevent unbounded growth of the cache.
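To make the relationship between these options concrete, here is a minimal sketch of a constructor call that wires them together. The option names come from the sections above; the cached resource and its close() method are hypothetical stand-ins for whatever you actually store.

```js
const LRU = require('lru-cache')

const cache = new LRU({
  max: 100,              // always bound the cache, even when ttl is set
  ttl: 1000 * 60 * 5,    // entries count as stale after five minutes
  noDisposeOnSet: false, // the default: dispose() also runs on overwrites

  // Runs while the entry is being removed. Do cleanup only; re-adding the
  // key here will break things, as described above.
  dispose: (value, key, reason) => {
    console.log(`dropping ${key} (${reason})`)
    if (value && typeof value.close === 'function') value.close() // hypothetical resource cleanup
  },

  // Runs after removal is complete; re-adding the key is safe here.
  disposeAfter: (value, key, reason) => {
    if (reason === 'evict') {
      // e.g. note that a still-wanted entry was pushed out by max
    }
  },
})
```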

fetchMethod: Function that is used to make background asynchronous fetches when cache.fetch(key) is called. It receives the key, the stale value if one exists, and an object containing signal, options, and context. If fetchMethod is not provided, then cache.fetch(key) is equivalent to Promise.resolve(cache.get(key)).

The signal object is an AbortSignal if that's available in the global object, otherwise it's a pretty close polyfill. If at any time signal.aborted is set to true, or the signal.onabort method is called, or it emits an 'abort' event (which you can listen to with addEventListener), then that means the fetch should be abandoned. This may be passed along to async functions aware of AbortController/AbortSignal behavior.

The options object is a union of the options that may be provided to fetch() and set(). Changing it inside the fetchMethod will result in modifying the settings passed to cache.set() when the fetch() resolves. For example, a DNS cache may update the TTL based on the value returned from a remote DNS server by changing options.ttl in the fetchMethod.
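As a sketch of how a fetchMethod can use the signal and options it receives, the example below imagines a small DNS-style lookup. The URL, the response shape, and the ttlSeconds field are assumptions for illustration, and a global fetch() (Node 18+ or a polyfill) is assumed; only the lru-cache option names come from the text above.

```js
const LRU = require('lru-cache')

const dnsCache = new LRU({
  max: 1000,
  ttl: 1000 * 60, // fallback freshness if the response does not say otherwise
  fetchMethod: async (key, staleValue, { signal, options }) => {
    // Pass the AbortSignal along so an abandoned fetch can be cancelled upstream.
    const res = await fetch(`https://dns.example/resolve?name=${key}`, { signal })
    const record = await res.json()
    // Changing options here changes the settings used by the implicit set()
    // when this fetch resolves, e.g. letting the server control freshness.
    if (record.ttlSeconds) options.ttl = record.ttlSeconds * 1000
    return record.address
  },
})

// await dnsCache.fetch('example.com') // resolves to the cached or freshly fetched address
```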
fetchContext: Arbitrary data that can be passed to the fetchMethod as the context option. Note that this will only be relevant when the cache.fetch() call needs to call fetchMethod(). Thus, any data which would meaningfully vary the fetch response needs to be present in the key itself. This is primarily intended for including x-request-id headers and the like for debugging purposes, which do not affect the cached response.
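A short sketch of fetchContext in use, assuming a tracing id is all you want to thread through: the requestId field and the returned object are made up, and the context only influences logging, never the cached response.

```js
const LRU = require('lru-cache')

const cache = new LRU({
  max: 500,
  fetchContext: { requestId: 'req-abc-123' }, // debugging data, not part of the key
  fetchMethod: async (key, staleValue, { context, signal }) => {
    // context is the fetchContext value; use it for logging or headers only.
    // Anything that changes the response itself belongs in the key.
    console.log(`fetching ${key} on behalf of ${context.requestId}`)
    return { key, fetchedAt: Date.now() } // placeholder for a real lookup
  },
})
```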

noDeleteOnFetchRejection: If a fetchMethod throws an error or returns a rejected promise, then by default, any existing stale value will be removed from the cache. If noDeleteOnFetchRejection is set to true, then this behavior is suppressed, and the stale value remains in the cache in the case of a rejected fetchMethod.

This is important in cases where a fetchMethod is only called as a background update while the stale value is returned, when allowStale is set. This may be set in calls to fetch(), or defaulted on the constructor.
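The stale-while-refreshing case described above might look like the following sketch. The upstream URL and response handling are invented; the relevant parts are allowStale plus noDeleteOnFetchRejection, which together let a failed background refresh leave the stale value in place.

```js
const LRU = require('lru-cache')

const cache = new LRU({
  max: 500,
  ttl: 1000 * 30,
  allowStale: true,               // serve the stale value while a refresh runs
  noDeleteOnFetchRejection: true, // a failed refresh keeps the stale value cached
  fetchMethod: async (key, staleValue, { signal }) => {
    const res = await fetch(`https://api.example/items/${key}`, { signal })
    if (!res.ok) throw new Error(`upstream error: ${res.status}`) // rejection no longer evicts the stale entry
    return res.json()
  },
})

// const item = await cache.fetch('widget-1')
```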

Putting it all together, the constructor call looks like this:

```js
const LRU = require('lru-cache')

// At least one of 'max', 'ttl', or 'maxSize' is required, to prevent
// unsafe unbounded storage.
//
// In most cases, it's best to specify a max for performance, so all
// the required memory allocation is done up-front.
//
// All the other options are optional; see the sections above for
// documentation on what each one does. Most of them can be
// overridden for specific items in get()/set().
const options = {
  max: 500, // example value; size the bound for your workload
}

const cache = new LRU(options)
```
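Continuing from the cache created above, many of these options can also be adjusted per entry when calling set(); the key and value below are placeholders.

```js
// Per-entry overrides on set(): this entry goes stale faster than the default,
// and overwriting it later will not call dispose() on the old value.
cache.set('session:42', { user: 'alice' }, {
  ttl: 1000 * 10,
  noDisposeOnSet: true,
})

// Reads treat expired entries as missing and delete them on access.
const session = cache.get('session:42')
```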
