LRU

Least Recently Used (LRU) is a cache eviction policy that discards the least recently accessed item when the cache is full. The core idea is to keep track of the order in which items are used so that those not accessed for the longest time can be removed first.

In practice, an efficient LRU cache is typically implemented with a hash map and a doubly linked list. The hash map provides O(1) access to items by key, while the doubly linked list maintains the recency order, with the most recently used item at one end and the least recently used at the other. When an item is accessed or inserted, its corresponding node is moved to the MRU end. If the cache exceeds its capacity, the node at the LRU end is evicted. This combination allows get and put operations to run in O(1) time on average, at the cost of additional memory to store the list pointers and the hash map entries.

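As a concrete illustration, here is a minimal sketch of that design in Python, assuming a plain dict as the hash map and sentinel head/tail nodes for the linked list; the class and method names are illustrative rather than taken from any particular library.

```python
class _Node:
    """Doubly linked list node holding one cache entry."""
    __slots__ = ("key", "value", "prev", "next")

    def __init__(self, key=None, value=None):
        self.key = key
        self.value = value
        self.prev = None
        self.next = None


class LRUCache:
    """LRU cache backed by a dict (lookup) and a doubly linked list (recency)."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.map = {}            # key -> node
        self.head = _Node()      # sentinel at the MRU end
        self.tail = _Node()      # sentinel at the LRU end
        self.head.next = self.tail
        self.tail.prev = self.head

    def _unlink(self, node):
        node.prev.next = node.next
        node.next.prev = node.prev

    def _push_front(self, node):
        # Insert the node right after the head sentinel (most recently used).
        node.prev = self.head
        node.next = self.head.next
        self.head.next.prev = node
        self.head.next = node

    def get(self, key):
        node = self.map.get(key)
        if node is None:
            return None
        # Accessing an item makes it the most recently used.
        self._unlink(node)
        self._push_front(node)
        return node.value

    def put(self, key, value):
        node = self.map.get(key)
        if node is not None:
            node.value = value
            self._unlink(node)
            self._push_front(node)
            return
        if len(self.map) >= self.capacity:
            # Evict the node just before the tail sentinel (least recently used).
            lru = self.tail.prev
            self._unlink(lru)
            del self.map[lru.key]
        node = _Node(key, value)
        self.map[key] = node
        self._push_front(node)

# Example: with capacity 2, put("a", 1), put("b", 2), get("a"), put("c", 3)
# evicts "b", because "a" was touched more recently than "b".
```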
LRU is widely used in hardware and software caches, including CPU instruction and data caches, operating system page replacement schemes, database buffer pools, and web caches. In practice, exact LRU can be expensive to implement at large scales, leading to approximate approaches such as the CLOCK algorithm (a practical LRU-like eviction method) and variants like LRU-k or ARC that balance recency with other factors.

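As a rough illustration of the CLOCK idea, the hypothetical sketch below keeps a circular array of slots, each with a reference bit that is set on access; on eviction, the clock hand sweeps forward, clearing bits until it finds a slot whose bit is already clear. This is only an assumed toy model of the technique, not the code used by any particular system.

```python
class ClockCache:
    """Approximate LRU (CLOCK): a circular array of slots with reference bits."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.slots = []     # each slot is [key, value, ref_bit]
        self.index = {}     # key -> position in slots
        self.hand = 0       # clock hand position

    def get(self, key):
        pos = self.index.get(key)
        if pos is None:
            return None
        self.slots[pos][2] = 1          # mark as recently referenced
        return self.slots[pos][1]

    def put(self, key, value):
        pos = self.index.get(key)
        if pos is not None:
            self.slots[pos][1] = value
            self.slots[pos][2] = 1
            return
        if len(self.slots) < self.capacity:
            self.index[key] = len(self.slots)
            self.slots.append([key, value, 1])
            return
        # Sweep the hand, clearing reference bits, until an unreferenced slot
        # is found; that slot approximates the least recently used item.
        while self.slots[self.hand][2]:
            self.slots[self.hand][2] = 0
            self.hand = (self.hand + 1) % self.capacity
        del self.index[self.slots[self.hand][0]]
        self.slots[self.hand] = [key, value, 1]
        self.index[key] = self.hand
        self.hand = (self.hand + 1) % self.capacity
```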
Limitations of LRU include its sensitivity to workload patterns. If a working set exceeds the cache capacity, performance can degrade and thrashing may occur. Nevertheless, LRU remains a simple and effective heuristic for maintaining temporal locality in many caching scenarios.
