Implementing LFU and LRU with a PriorityQueue
LFU (Least Frequently Used) and LRU (Least Recently Used) are two common cache-eviction policies, and both can be built with the help of a PriorityQueue (priority queue).
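In Python, the usual priority-queue primitive is the standard-library `heapq` module (`queue.PriorityQueue` is a thread-safe wrapper around the same idea). As a quick illustration, here is a minimal sketch, separate from the cache implementations below, showing how a min-heap of `(frequency, insertion_order, key)` tuples always pops the least-frequent, oldest entry first, which is exactly the ordering the LFU code later in this post relies on:

```python
import heapq

# Min-heap of (frequency, insertion_order, key) tuples: heappop always returns
# the tuple with the lowest frequency, breaking ties by insertion order.
heap = []
heapq.heappush(heap, (3, 0, "a"))
heapq.heappush(heap, (1, 1, "b"))
heapq.heappush(heap, (1, 2, "c"))

print(heapq.heappop(heap))  # (1, 1, 'b') -- least frequent, and older than 'c'
```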
The idea behind LFU is to evict the least frequently used entry when the cache is full. A typical implementation keeps a dictionary that records how many times each entry has been accessed, orders the entries by access count from smallest to largest, and evicts the entry with the smallest count.
The idea behind LRU is to evict the entry that has gone unused for the longest time when the cache is full. A typical implementation combines a doubly linked list with a dictionary: the list records the access order and the dictionary maps each key to its node. Whenever an entry is accessed, its node is moved to the most-recently-used end of the list; when the cache is full, the node at the least-recently-used end is evicted.
Below are Python code examples: the LFU cache uses `heapq` (Python's priority-queue module), and the LRU cache uses a doubly linked list plus a dictionary.
LFU:
```python
import heapq
import itertools

class LFUCache:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.cache = {}                # key -> value
        self.freq = {}                 # key -> access count
        self.heap = []                 # (freq, tick, key); stale entries are skipped lazily
        self.tick = itertools.count()  # monotonic counter, breaks frequency ties by recency

    def get(self, key: int) -> int:
        if key not in self.cache:
            return -1
        self._touch(key)
        return self.cache[key]

    def put(self, key: int, value: int) -> None:
        if self.capacity == 0:
            return
        if key in self.cache:
            self.cache[key] = value
            self._touch(key)
            return
        if len(self.cache) >= self.capacity:
            self._evict()
        self.cache[key] = value
        self.freq[key] = 1
        heapq.heappush(self.heap, (1, next(self.tick), key))

    def _touch(self, key: int) -> None:
        # Bump the access count and push a fresh heap entry; the old one becomes stale.
        self.freq[key] += 1
        heapq.heappush(self.heap, (self.freq[key], next(self.tick), key))

    def _evict(self) -> None:
        # Pop until an entry's recorded frequency matches the current one (lazy deletion),
        # then evict that key -- the least frequently used entry.
        while self.heap:
            freq, _, key = heapq.heappop(self.heap)
            if key in self.cache and self.freq[key] == freq:
                del self.cache[key]
                del self.freq[key]
                return
```
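As a quick check, here is a hypothetical usage example for the `LFUCache` class above (the capacity of 2 and the specific keys are arbitrary):

```python
cache = LFUCache(2)
cache.put(1, 1)
cache.put(2, 2)
print(cache.get(1))  # 1 (key 1's frequency is now 2)
cache.put(3, 3)      # cache is full: key 2 (frequency 1) is evicted
print(cache.get(2))  # -1 (evicted)
print(cache.get(3))  # 3
```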
LRU:
```python
class Node:
    """A doubly-linked-list node holding a key/value pair plus neighbor links."""
    def __init__(self, key, val):
        self.key = key
        self.val = val
        self.prev = None
        self.next = None

class LRUCache:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.cache = {}  # key -> Node
        # Sentinel nodes: head.next is the least recently used entry,
        # tail.prev is the most recently used entry.
        self.head = Node(0, 0)
        self.tail = Node(0, 0)
        self.head.next = self.tail
        self.tail.prev = self.head

    def get(self, key: int) -> int:
        if key not in self.cache:
            return -1
        node = self.cache[key]
        # Move the node to the most-recently-used end of the list.
        self._remove(node)
        self._add(node)
        return node.val

    def put(self, key: int, value: int) -> None:
        if key in self.cache:
            self._remove(self.cache[key])
        node = Node(key, value)
        self.cache[key] = node
        self._add(node)
        if len(self.cache) > self.capacity:
            # Evict the least recently used entry (the node right after head).
            node = self.head.next
            self._remove(node)
            del self.cache[node.key]

    def _remove(self, node):
        # Unlink the node from the list.
        node.prev.next = node.next
        node.next.prev = node.prev

    def _add(self, node):
        # Insert the node just before tail (the most-recently-used position).
        node.prev = self.tail.prev
        node.next = self.tail
        self.tail.prev.next = node
        self.tail.prev = node
```
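And a matching hypothetical usage example for the `LRUCache` class above (again, the capacity and keys are arbitrary):

```python
cache = LRUCache(2)
cache.put(1, 1)
cache.put(2, 2)
print(cache.get(1))  # 1 (key 1 becomes the most recently used)
cache.put(3, 3)      # cache is full: key 2, the least recently used, is evicted
print(cache.get(2))  # -1 (evicted)
print(cache.get(3))  # 3
```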