LRU Cache Implementation using Doubly Linked List

Last Updated : 24 Sep, 2024

Design a data structure that works like an LRU (Least Recently Used) Cache. The LRUCache class has two methods, get() and put(), which are defined as follows.

  • LRUCache(capacity c): Initialize the LRU cache with a positive capacity c.
  • get(key): Returns the value of the key if it exists in the cache; otherwise returns -1.
  • put(key, value): If the key is already present, update its value. If not, add the key-value pair to the cache. If the cache exceeds its capacity, remove the key-value pair with the lowest priority (the least recently used one).

Example:

Input: [LRUCache cache = new LRUCache(2), put(1, 1), put(2, 2), get(1), put(3, 3), get(2), put(4, 4), get(1), get(3), get(4)]
Output: [1, -1, -1, 3, 4]
Explanation: The values mentioned in the output are the values returned by get operations.

  • Initialize LRUCache class with capacity = 2.
  • cache.put(1, 1): (key, value) = (1, 1) inserted and has the highest priority.
  • cache.put(2, 2): (key, value) = (2, 2) inserted and has the highest priority.
  • cache.get(1): For key 1, the value is 1, so 1 is returned and (1, 1) is moved to the highest priority.
  • cache.put(3, 3): Since the cache is full, the least recently used pair (2, 2) is removed and (3, 3) is inserted with the highest priority.
  • cache.get(2): Returns -1 (key 2 not found).
  • cache.put(4, 4): Since the cache is full, the least recently used pair (1, 1) is removed and (4, 4) is inserted with the highest priority.
  • cache.get(1): Returns -1 (key 1 not found).
  • cache.get(3): Returns 3; (3, 3) is moved to the highest priority.
  • cache.get(4): Returns 4; (4, 4) is moved to the highest priority.

Thoughts about Implementation Using Arrays, Hashing and/or Heap

We can use an array of triplets, where each item stores a key, a value, and a priority.

get(key): We linearly search for the key. If we find the item, we update the priorities of all affected items and give the accessed item the highest priority.
put(key, value): If space is available, we insert the new item at the end. If not, we linearly search for the item with the lowest priority and replace it with the new one. In both cases, we update the priorities and give the new item the highest priority.

The time complexity of both operations is O(n).
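
For concreteness, here is a minimal Python sketch of this array-of-triplets idea (it is not part of the original article; it uses a monotonically increasing counter as the priority instead of re-numbering every item on each access, which produces the same ordering):

# Hypothetical sketch of the naive array-based cache; both get() and put()
# are O(n) because of the linear scans
class ArrayLRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.items = []   # each entry is [key, value, priority]
        self.clock = 0    # a larger priority means more recently used

    def get(self, key):
        for item in self.items:              # linear search: O(n)
            if item[0] == key:
                self.clock += 1
                item[2] = self.clock         # mark as most recently used
                return item[1]
        return -1

    def put(self, key, value):
        self.clock += 1
        for item in self.items:              # key already present: update it
            if item[0] == key:
                item[1], item[2] = value, self.clock
                return
        if len(self.items) < self.capacity:  # space available: insert at the end
            self.items.append([key, value, self.clock])
        else:                                # replace the lowest-priority item
            lru = min(self.items, key=lambda it: it[2])
            lru[0], lru[1], lru[2] = key, value, self.clock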

Can we make both operations O(1)? We can think of hashing. With hashing, we can insert, get, and delete in O(1) time, but changing priorities would still take linear time. We could also use a heap along with hashing to manage priorities, but finding and removing the least recently used (lowest priority) item would take O(log n) time, which is more than O(1), and every access would additionally require updating that item's priority inside the heap.

Using Doubly Linked List and Hashing

The idea is to keep inserting the key-value pair at the head of the doubly linked list. When a node is accessed or added, it is moved to the head of the list (right after the dummy head node). This marks it as the most recently used. When the cache exceeds its capacity, the node at the tail (right before the dummy tail node) is removed as it represents the least recently used item.


Below is the implementation of the above approach: 

C++
// C++ program to implement LRU (Least Recently Used) Cache
#include <bits/stdc++.h>
using namespace std;

struct Node {
    int key;
    int value;
    Node *next;
    Node *prev;

    Node(int k, int v) {
        key = k;
        value = v;
        next = nullptr;
        prev = nullptr;
    }
};

// LRU Cache class
class LRUCache
{
  public:
  
    int capacity;
    unordered_map<int, Node *> cacheMap;
    Node *head;
    Node *tail;

    // Constructor to initialize the cache with a given capacity
    LRUCache(int capacity) {
        this->capacity = capacity;
        head = new Node(-1, -1);
        tail = new Node(-1, -1);
        head->next = tail;
        tail->prev = head;
    }

    // Function to get the value for a given key
    int get(int key) {
        if (cacheMap.find(key) == cacheMap.end())
            return -1;

        Node *node = cacheMap[key];
        remove(node);
        add(node);
        return node->value;
    }

    // Function to put a key-value pair into the cache
    void put(int key, int value) {
        if (cacheMap.find(key) != cacheMap.end()) {
            Node *oldNode = cacheMap[key];
            remove(oldNode);
            delete oldNode;
        }

        Node *node = new Node(key, value);
        cacheMap[key] = node;
        add(node);

        // Evict the least recently used node if over capacity
        if ((int)cacheMap.size() > capacity) {
            Node *nodeToDelete = tail->prev;
            remove(nodeToDelete);
            cacheMap.erase(nodeToDelete->key);
            delete nodeToDelete;
        }
    }

    // Add a node right after the head
    // (most recently used position)
    void add(Node *node) {
        Node *nextNode = head->next;
        head->next = node;
        node->prev = head;
        node->next = nextNode;
        nextNode->prev = node;
    }

    // Remove a node from the list
    void remove(Node *node) {
        Node *prevNode = node->prev;
        Node *nextNode = node->next;
        prevNode->next = nextNode;
        nextNode->prev = prevNode;
    }
};

int main(){
    LRUCache cache(2);
  
    cache.put(1, 1); 
    cache.put(2, 2);
    cout << cache.get(1) << endl;
    cache.put(3, 3);
    cout  << cache.get(2) << endl;
    cache.put(4, 4);
    cout << cache.get(1) << endl;
    cout << cache.get(3) << endl;
    cout << cache.get(4) << endl;

    return 0;
}
Java
// Java program to implement LRU (Least Recently Used) Cache
import java.util.HashMap;
import java.util.Map;

class Node {
    int key;
    int value;
    Node next;
    Node prev;

    Node(int key, int value) {
        this.key = key;
        this.value = value;
        this.next = null;
        this.prev = null;
    }
}

class LRUCache {
    private int capacity;
    private Map<Integer, Node> cacheMap;
    private Node head;
    private Node tail;

    // Constructor to initialize the cache with a given
    // capacity
    LRUCache(int capacity) {
        this.capacity = capacity;
        this.cacheMap = new HashMap<>();
        this.head = new Node(-1, -1);
        this.tail = new Node(-1, -1);
        this.head.next = this.tail;
        this.tail.prev = this.head;
    }

    // Function to get the value for a given key
    int get(int key) {
        if (!cacheMap.containsKey(key)) {
            return -1;
        }

        Node node = cacheMap.get(key);
        remove(node);
        add(node);
        return node.value;
    }

    // Function to put a key-value pair into the cache
    void put(int key, int value) {
        if (cacheMap.containsKey(key)) {
            Node oldNode = cacheMap.get(key);
            remove(oldNode);
        }

        Node node = new Node(key, value);
        cacheMap.put(key, node);
        add(node);

        if (cacheMap.size() > capacity) {
            Node nodeToDelete = tail.prev;
            remove(nodeToDelete);
            cacheMap.remove(nodeToDelete.key);
        }
    }

    // Add a node right after the head (most recently used
    // position)
    private void add(Node node) {
        Node nextNode = head.next;
        head.next = node;
        node.prev = head;
        node.next = nextNode;
        nextNode.prev = node;
    }

    // Remove a node from the list
    private void remove(Node node) {
        Node prevNode = node.prev;
        Node nextNode = node.next;
        prevNode.next = nextNode;
        nextNode.prev = prevNode;
    }
}


public class Main {
    public static void main(String[] args) {
        LRUCache cache = new LRUCache(2);

        cache.put(1, 1);
        cache.put(2, 2);
        System.out.println(cache.get(1));
        cache.put(3, 3);
        System.out.println(cache.get(2));
        cache.put(4, 4);
        System.out.println(cache.get(1));
        System.out.println(cache.get(3));
        System.out.println(cache.get(4));
    }
}
Python
# Python program to implement LRU (Least Recently Used) Cache
class Node:
    def __init__(self, key, value):
        self.key = key
        self.value = value
        self.prev = None
        self.next = None


class LRUCache:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.cache = {}
        self.head = Node(-1, -1)
        self.tail = Node(-1, -1)
        self.head.next = self.tail
        self.tail.prev = self.head

    def _add(self, node: Node):
      
        # Add a node right after the head
        # (most recently used position).
        next_node = self.head.next
        self.head.next = node
        node.prev = self.head
        node.next = next_node
        next_node.prev = node

    def _remove(self, node: Node):

        # Remove a node from the
        # doubly linked list.
        prev_node = node.prev
        next_node = node.next
        prev_node.next = next_node
        next_node.prev = prev_node

    def get(self, key: int) -> int:
        # Get the value for a given key
        if key not in self.cache:
            return -1

        node = self.cache[key]
        self._remove(node)
        self._add(node)
        return node.value

    def put(self, key: int, value: int):

        # Put a key-value pair into the cache.
        if key in self.cache:
            node = self.cache[key]
            self._remove(node)
            del self.cache[key]

        if len(self.cache) >= self.capacity:
          
            # Remove the least recently used item
            # (just before the tail)
            lru_node = self.tail.prev
            self._remove(lru_node)
            del self.cache[lru_node.key]

        # Add the new node
        new_node = Node(key, value)
        self._add(new_node)
        self.cache[key] = new_node


if __name__ == "__main__":
    cache = LRUCache(2)

    cache.put(1, 1)
    cache.put(2, 2)
    print(cache.get(1))
    cache.put(3, 3)
    print(cache.get(2))
    cache.put(4, 4)
    print(cache.get(1))
    print(cache.get(3))
    print(cache.get(4))
C#
// C# program to implement LRU (Least Recently Used) Cache
using System;
using System.Collections.Generic;

class Node {
    public int Key;
    public int Value;
    public Node Prev;
    public Node Next;

    public Node(int key, int value) {
        Key = key;
        Value = value;
        Prev = null;
        Next = null;
    }
}

class LRUCache {
    private int capacity;
    private Dictionary<int, Node> cache;
    private Node head;
    private Node tail;

    // Constructor to initialize the
    // cache with a given capacity
    public LRUCache(int capacity){
        this.capacity = capacity;
        cache = new Dictionary<int, Node>();
        head = new Node(-1, -1);
        tail = new Node(-1, -1);
        head.Next = tail;
        tail.Prev = head;
    }

    // Add a node right after the head
    //(most recently used position)
    private void Add(Node node) {
        Node nextNode = head.Next;
        head.Next = node;
        node.Prev = head;
        node.Next = nextNode;
        nextNode.Prev = node;
    }

    // Remove a node from the doubly linked list
    private void Remove(Node node) {
        Node prevNode = node.Prev;
        Node nextNode = node.Next;
        prevNode.Next = nextNode;
        nextNode.Prev = prevNode;
    }

    // Get the value for a given key
    public int Get(int key) {
        if (!cache.ContainsKey(key)) {
            return -1;
        }

        Node node = cache[key];
        Remove(node);
        Add(node);
        return node.Value;
    }

    // Put a key-value pair into the cache
    public void Put(int key, int value) {
        if (cache.ContainsKey(key)) {
            Node oldNode = cache[key];
            Remove(oldNode);
            cache.Remove(key);
        }

        if (cache.Count >= capacity) {
            Node lruNode = tail.Prev;
            Remove(lruNode);
            cache.Remove(lruNode.Key);
        }

        Node newNode = new Node(key, value);
        Add(newNode);
        cache[key] = newNode;
    }
}

class GfG {
    static void Main() {
        LRUCache cache = new LRUCache(2);

        cache.Put(1, 1);
        cache.Put(2, 2);
        Console.WriteLine(cache.Get(1));
        cache.Put(3, 3);
        Console.WriteLine(cache.Get(2));
        cache.Put(4, 4);
        Console.WriteLine(cache.Get(1));
        Console.WriteLine(cache.Get(3));
        Console.WriteLine(cache.Get(4));
    }
}
JavaScript
// JavaScript program to implement LRU (Least Recently Used) Cache
class Node {
    constructor(key, value) {
        this.key = key;
        this.value = value;
        this.prev = null;
        this.next = null;
    }
}

class LRUCache {
    constructor(capacity) {
        this.capacity = capacity;
        this.cache = new Map();
        this.head = new Node(-1, -1);
        this.tail = new Node(-1, -1);
        this.head.next = this.tail;
        this.tail.prev = this.head;
    }

    // Add a node right after the head
    //(most recently used position)
    add(node) {
        const nextNode = this.head.next;
        this.head.next = node;
        node.prev = this.head;
        node.next = nextNode;
        nextNode.prev = node;
    }

    // Remove a node from the doubly linked list
    remove(node) {
        const prevNode = node.prev;
        const nextNode = node.next;
        prevNode.next = nextNode;
        nextNode.prev = prevNode;
    }

    // Get the value for a given key
    get(key) {
        if (!this.cache.has(key)) {
            return -1;
        }

        const node = this.cache.get(key);
        this.remove(node);
        this.add(node);
        return node.value;
    }

    // Put a key-value pair into the cache
    put(key, value) {
        if (this.cache.has(key)) {
            const node = this.cache.get(key);
            this.remove(node);
            this.cache.delete(key);
        }

        if (this.cache.size >= this.capacity) {
            const lruNode = this.tail.prev;
            this.remove(lruNode);
            this.cache.delete(lruNode.key);
        }

        const newNode = new Node(key, value);
        this.add(newNode);
        this.cache.set(key, newNode);
    }
}

const cache = new LRUCache(2);
cache.put(1, 1);
cache.put(2, 2);
console.log(cache.get(1));
cache.put(3, 3);
console.log(cache.get(2));
cache.put(4, 4);
console.log(cache.get(1));
console.log(cache.get(3));
console.log(cache.get(4));

Output
1
-1
-1
3
4

Time Complexity: get(key) – O(1) and put(key, value) – O(1)
Auxiliary Space: O(capacity)

Using Inbuilt Doubly Linked List

The idea is to use a built-in doubly linked list. This simplifies the implementation by avoiding the need to manually manage a doubly linked list while still achieving efficient operations. For example, C++ provides a doubly linked list as std::list.

Note: Python's standard library does not include a built-in doubly linked list implementation. To handle use cases that typically require a doubly linked list, such as efficiently managing elements at both ends of a sequence, Python provides the collections.deque class. While deque stands for double-ended queue, it essentially functions as a doubly linked list with efficient O(1) operations at both ends; removing an element from the middle of a deque (as the implementation below does) still takes O(n) time.
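
As an aside (a sketch under the assumption that using the standard library is acceptable; it is not one of the implementations below): Python's collections.OrderedDict internally maintains a doubly linked list of its keys, so move_to_end() and popitem() run in O(1) and avoid the linear remove() that the deque-based version needs.

# Sketch: LRU cache built on collections.OrderedDict (an assumed alternative,
# not this article's implementation); move_to_end() and popitem() are O(1)
from collections import OrderedDict

class OrderedDictLRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.cache = OrderedDict()

    def get(self, key):
        if key not in self.cache:
            return -1
        self.cache.move_to_end(key)         # mark the key as most recently used
        return self.cache[key]

    def put(self, key, value):
        if key in self.cache:
            self.cache.move_to_end(key)
        self.cache[key] = value
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)  # evict the least recently used key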

Below is the implementation of the above approach: 

C++
// C++ program to implement LRU (Least Recently Used) Cache using
// the built-in doubly linked list (std::list)
#include <bits/stdc++.h>
using namespace std;

class LRUCache {
  public:
    int capacity;
    list<pair<int, int>> List;

    // Map from key to list iterator
    unordered_map<int, list<pair<int, int>>::iterator> cacheMap;

    // Constructor to initialize the
    // cache with a given capacity
    LRUCache(int capacity) {
        this->capacity = capacity;
    }

    // Function to get the value for a given key
    int get(int key) {
        auto it = cacheMap.find(key);
        if (it == cacheMap.end()) {
            return -1;
        }

        // Move the accessed node to the
        // front (most recently used position)
        int value = it->second->second;
        List.erase(it->second);
        List.push_front({key, value});

        // Update the iterator in the map
        cacheMap[key] = List.begin();
        return value;
    }

    // Function to put a key-value pair into the cache
    void put(int key, int value) {
        auto it = cacheMap.find(key);
        if (it != cacheMap.end()) {
            // Remove the old node from the list and map
            List.erase(it->second);
            cacheMap.erase(it);
        }

        // Insert the new node at the front of the list
        List.push_front({key, value});
        cacheMap[key] = List.begin();

        // If the cache size exceeds the capacity,
        // remove the least recently used item
        if ((int)cacheMap.size() > capacity) {
            auto lastNode = List.back().first;
            List.pop_back();
            cacheMap.erase(lastNode);
        }
    }
};

int main() {
  
    LRUCache cache(2);
    cache.put(1, 1);
    cache.put(2, 2);
    cout << cache.get(1) << endl;
    cache.put(3, 3);
    cout << cache.get(2) << endl;
    cache.put(4, 4);
    cout << cache.get(1) << endl;
    cout << cache.get(3) << endl;
    cout << cache.get(4) << endl;

    return 0;
}
Java
// Java program to implement LRU (Least Recently Used) Cache using
// the built-in doubly linked list (java.util.LinkedList)

import java.util.HashMap;
import java.util.LinkedList;
import java.util.Map;

class LRUCache {
    private int capacity;

    // Stores key-value pairs
    private Map<Integer, Integer> cacheMap;

    // Stores keys in the order of access
    private LinkedList<Integer> lruList;

    // Constructor to initialize the cache with a given
    // capacity
    LRUCache(int capacity) {
        this.capacity = capacity;
        this.cacheMap = new HashMap<>();
        this.lruList = new LinkedList<>();
    }

    // Function to get the value for a given key
    public int get(int key) {
        if (!cacheMap.containsKey(key)) {
            return -1;
        }

        // Move the accessed key to the front (most recently
        // used position)
        lruList.remove(Integer.valueOf(key));

        // Add key to the front
        lruList.addFirst(key);

        return cacheMap.get(key);
    }

    // Function to put a key-value pair into the cache
    public void put(int key, int value) {
        if (cacheMap.containsKey(key)) {
          
            // Update the value
            cacheMap.put(key, value);
          
            // Move the accessed key to the front
            lruList.remove(Integer.valueOf(key));
        }
        else {
          
            // Add new key-value pair
            if (cacheMap.size() >= capacity) {
              
                // Remove the least recently used item
                int leastUsedKey = lruList.removeLast();
                cacheMap.remove(leastUsedKey);
            }
            cacheMap.put(key, value);
        }
        // Add the key to the front (most recently used
        // position)
        lruList.addFirst(key);
    }

    public static void main(String[] args) {
      
        LRUCache cache = new LRUCache(2);
        cache.put(1, 1);
        cache.put(2, 2);
        System.out.println(cache.get(1));
        cache.put(3, 3);
        System.out.println(cache.get(2));
        cache.put(4, 4);
        System.out.println(cache.get(1));
        System.out.println(cache.get(3));
        System.out.println( cache.get(4));
    }
}
Python
# Python program to implement LRU (Least Recently Used) Cache using
# the built-in double-ended queue (collections.deque)
from collections import deque

class LRUCache:
    def __init__(self, capacity: int):
        self.capacity = capacity

        # Dictionary to store key-value pairs
        self.cache = {}

        # Deque to maintain the order of keys
        self.order = deque()

    def get(self, key: int) -> int:
        if key in self.cache:

            # Move the accessed key to 
            # the front of the deque
            self.order.remove(key)
            self.order.appendleft(key)
            return self.cache[key]
        else:
            return -1

    def put(self, key: int, value: int):
        if key in self.cache:

            # Update the value and move
            # the key to the front
            self.cache[key] = value
            self.order.remove(key)
            self.order.appendleft(key)
        else:
            if len(self.cache) >= self.capacity:

                # Remove the least recently used item
                lru_key = self.order.pop()
                del self.cache[lru_key]

            # Add the new key-value pair
            self.cache[key] = value
            self.order.appendleft(key)


if __name__ == "__main__":
  
    cache = LRUCache(2)
    cache.put(1, 1)
    cache.put(2, 2)
    print(cache.get(1))
    cache.put(3, 3)
    print(cache.get(2))
    cache.put(4, 4)
    print(cache.get(1))
    print(cache.get(3))
    print(cache.get(4))
C#
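// C# program to implement LRU (Least Recently Used) Cache using
// the built-in doubly linked LinkedList and a Dictionary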
using System;
using System.Collections.Generic;

class LRUCache {
    private int capacity;
    private Dictionary<int, LinkedListNode<KeyValuePair<int, int>>> cacheMap;
    private LinkedList<KeyValuePair<int, int>> lruList;

    // Constructor to initialize the cache with a given capacity
    public LRUCache(int capacity) {
        this.capacity = capacity;
        this.cacheMap = new Dictionary<int, LinkedListNode<KeyValuePair<int, int>>>();
        this.lruList = new LinkedList<KeyValuePair<int, int>>();
    }

    // Function to get the value for a given key
    public int Get(int key) {
        if (cacheMap.TryGetValue(key, out LinkedListNode<KeyValuePair<int, int>> node)) {
          
            // Move the accessed node to the front (most recently used position)
            lruList.Remove(node);
            lruList.AddFirst(node);
            return node.Value.Value;
        } else {
            return -1;
        }
    }

    // Function to put a key-value pair into the cache
    public void Put(int key, int value) {
        if (cacheMap.TryGetValue(key, out LinkedListNode<KeyValuePair<int, int>> node)) {
          
            // Remove the old node from the list and map
            lruList.Remove(node);
            cacheMap.Remove(key);
        }

        // Insert the new node at the front of the list
        var newNode = new KeyValuePair<int, int>(key, value);
        var listNode = new LinkedListNode<KeyValuePair<int, int>>(newNode);
        lruList.AddFirst(listNode);
        cacheMap[key] = listNode;

        // If the cache size exceeds the capacity,
        // remove the least recently used item
        if (cacheMap.Count > capacity) {
            var lastNode = lruList.Last;
            lruList.RemoveLast();
            cacheMap.Remove(lastNode.Value.Key);
        }
    }
}

class GfG {
    static void Main() {
      
        LRUCache cache = new LRUCache(2);
        cache.Put(1, 1);
        cache.Put(2, 2);
        Console.WriteLine(cache.Get(1));
        cache.Put(3, 3);
        Console.WriteLine(cache.Get(2));
        cache.Put(4, 4);
        Console.WriteLine(cache.Get(1));
        Console.WriteLine(cache.Get(3));
        Console.WriteLine(cache.Get(4));
    }
}
JavaScript
// JavaScript program to implement LRU (Least Recently Used) Cache
// using a Map, which preserves insertion order

class LRUCache {
    constructor(capacity) {
        this.capacity = capacity;
        this.cache = new Map();
    }

    // Get the value for a given key
    get(key) {
        if (!this.cache.has(key)) {
            return -1;
        }

        // Move the accessed key-value pair
        // to the end to mark it as recently used
        const value = this.cache.get(key);
        this.cache.delete(key);
        this.cache.set(key, value);
        return value;
    }

    // Put a key-value pair into the cache
    put(key, value) {
        if (this.cache.has(key)) {
        
            // Update the value and move the key to the end
            this.cache.delete(key);
        }
        else if (this.cache.size >= this.capacity) {
        
            // Remove the least recently used item (the
            // first item in the Map)
            this.cache.delete(this.cache.keys().next().value);
        }

        // Add the new key-value pair
        this.cache.set(key, value);
    }
}

const cache = new LRUCache(2);
cache.put(1, 1);
cache.put(2, 2);
console.log(cache.get(1));
cache.put(3, 3);
console.log(cache.get(2));
cache.put(4, 4);
console.log(cache.get(1));
console.log(cache.get(3));
console.log(cache.get(4));

Output
1
-1
-1
3
4

Time Complexity: get(key) – O(1) and put(key, value) – O(1) for the C++, C#, and JavaScript versions above, which locate the list node (or map entry) in O(1). The Java and Python versions shown remove the key from the LinkedList/deque by value, which takes O(n) in the worst case.
Auxiliary Space: O(capacity)


