LRU Cache implementation
Design and implement a data structure for a Least Recently Used (LRU) cache that supports the get and put operations.
Discussion:
The key to solving this problem is a doubly linked list, which lets us move and remove nodes in constant time.
The LRU cache combines a hash table with a doubly linked list of nodes. The hash table makes the get() operation O(1), and the doubly linked list makes adding and removing nodes O(1) as well. The list is kept in access order: the head is the least recently used entry (the eviction candidate) and the tail is the most recently used.
Doubly linked list node class
class Node {
    int key;   // the key is stored so it can be removed from the map on eviction
    int val;
    Node prev;
    Node next;

    public Node(int key, int val) {
        this.key = key;
        this.val = val;
    }
}
LRU cache class
import java.util.HashMap;

class LRUCache {
    Node head;  // least recently used
    Node tail;  // most recently used
    HashMap<Integer, Node> map;
    int maxSize;

    public LRUCache(int maxSize) {
        this.maxSize = maxSize;
        this.map = new HashMap<>();
    }

    public int get(int key) {
        Node t = map.get(key);
        if (t == null) {
            return -1;
        }
        // Move the accessed node to the tail (most recently used).
        removeNode(t);
        addNode(t);
        return t.val;
    }

    public void put(int key, int val) {
        if (map.containsKey(key)) {
            // Update the value and mark the node as most recently used.
            Node t = map.get(key);
            t.val = val;
            removeNode(t);
            addNode(t);
        } else {
            if (map.size() >= maxSize) {
                // Evict the least recently used node at the head.
                map.remove(head.key);
                removeNode(head);
            }
            Node node = new Node(key, val);
            addNode(node);
            map.put(key, node);
        }
    }

    // Unlink n from the list, updating head/tail if n sits at either end.
    private void removeNode(Node n) {
        if (n.prev != null) {
            n.prev.next = n.next;
        } else {
            head = n.next;
        }
        if (n.next != null) {
            n.next.prev = n.prev;
        } else {
            tail = n.prev;
        }
    }

    // Append n at the tail (most recently used position).
    private void addNode(Node n) {
        if (tail != null) {
            tail.next = n;
        }
        n.prev = tail;
        n.next = null;
        tail = n;
        if (head == null) {
            head = tail;
        }
    }
}
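As an aside, the JDK already ships the building blocks for this pattern: java.util.LinkedHashMap maintains a doubly linked list over its entries, and constructing it with accessOrder=true plus overriding removeEldestEntry() yields LRU behavior with much less code. The sketch below (the LinkedHashLRUCache name is just for this example) shows the idea; note that, unlike the class above, a missing key returns null rather than -1.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// LRU cache built on LinkedHashMap. accessOrder=true makes get() and put()
// move the touched entry to the end of the internal linked list, and
// removeEldestEntry() evicts the head once the size limit is exceeded.
class LinkedHashLRUCache extends LinkedHashMap<Integer, Integer> {
    private final int maxSize;

    LinkedHashLRUCache(int maxSize) {
        super(16, 0.75f, true); // accessOrder=true -> access-order iteration
        this.maxSize = maxSize;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<Integer, Integer> eldest) {
        return size() > maxSize; // evict the least recently used entry
    }

    public static void main(String[] args) {
        LinkedHashLRUCache cache = new LinkedHashLRUCache(2);
        cache.put(1, 1);
        cache.put(2, 2);
        cache.get(1);      // touch key 1 so it becomes most recently used
        cache.put(3, 3);   // evicts key 2, the least recently used
        System.out.println(cache.keySet()); // prints [1, 3]
    }
}
```

This is handy when the exercise's exact API is not required, since the hash table and the linked list are already kept in sync by the library.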