Singly Linked List: Implementation and Operations
Key Insights
- Singly linked lists trade O(1) random access for O(1) insertion and deletion at the head, making them ideal for implementing stacks, queues, and scenarios requiring frequent modifications at known positions.
- The key to mastering linked list operations is understanding pointer manipulation—every insertion and deletion is fundamentally about rewiring references between nodes.
- While arrays outperform linked lists for most use cases in modern languages, understanding linked lists builds foundational knowledge for trees, graphs, and solving algorithmic interview problems.
Introduction to Singly Linked Lists
A singly linked list is a linear data structure where elements are stored in nodes, and each node contains two things: the data itself and a reference (pointer) to the next node in the sequence. Unlike arrays, linked list elements aren’t stored in contiguous memory locations—they’re scattered throughout memory, connected only by these next pointers.
This architecture creates distinct trade-offs. Arrays give you O(1) random access but O(n) insertion and deletion in the middle. Linked lists flip this: O(n) access but O(1) insertion and deletion once you have a reference to the target position. Arrays also require upfront size allocation or expensive resizing operations, while linked lists grow and shrink dynamically.
You’ll encounter linked lists when implementing stacks (push/pop from head), queues (enqueue at tail, dequeue from head), undo functionality in applications, and any scenario where you’re frequently inserting or removing elements from a dynamic collection. They’re also the foundation for understanding more complex structures like trees and graphs.
Node Structure and List Foundation
Every linked list starts with a node. The node is the atomic unit—get this right, and everything else follows.
```python
class Node:
    def __init__(self, data):
        self.data = data
        self.next = None

class LinkedList:
    def __init__(self):
        self.head = None
```

```javascript
class Node {
  constructor(data) {
    this.data = data;
    this.next = null;
  }
}

class LinkedList {
  constructor() {
    this.head = null;
  }
}
```
The LinkedList class maintains a single reference: head. This is your entry point into the list. Lose track of the head, and you lose the entire list. Some implementations also track tail for O(1) append operations and size to avoid traversing the list for length queries. I’ll keep it minimal here—you can add these optimizations based on your access patterns.
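To make the head's role concrete, here is a minimal sketch that wires three nodes together by hand and then walks them starting from the head. Every operation in the rest of this article is a systematized version of this walk.

```python
class Node:
    def __init__(self, data):
        self.data = data
        self.next = None

# Wire three nodes by hand: head -> 2 -> 3
head = Node(1)
head.next = Node(2)
head.next.next = Node(3)

# The head is the only entry point; everything else is reachable from it
values = []
current = head
while current is not None:
    values.append(current.data)
    current = current.next
# values is now [1, 2, 3]
```

Drop the `head` variable and all three nodes become unreachable, which is exactly why the LinkedList class guards it.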
Core Insertion Operations
Insertion is where linked lists shine. Three scenarios cover most needs: inserting at the head, at the tail, and at a specific position.
```python
class LinkedList:
    def __init__(self):
        self.head = None

    def prepend(self, data):
        """Insert at head - O(1)"""
        new_node = Node(data)
        new_node.next = self.head
        self.head = new_node

    def append(self, data):
        """Insert at tail - O(n) without tail reference"""
        new_node = Node(data)
        if self.head is None:
            self.head = new_node
            return
        current = self.head
        while current.next is not None:
            current = current.next
        current.next = new_node

    def insert_at(self, index, data):
        """Insert at specific position - O(n)"""
        if index < 0:
            raise IndexError("Index cannot be negative")
        if index == 0:
            self.prepend(data)
            return
        new_node = Node(data)
        current = self.head
        position = 0
        while current is not None and position < index - 1:
            current = current.next
            position += 1
        if current is None:
            raise IndexError("Index out of bounds")
        new_node.next = current.next
        current.next = new_node
```
Notice the pattern in insert_at: you traverse to the node before your target position, then rewire. This is critical—you need access to the previous node to insert after it. The order of pointer assignment matters too: set new_node.next before modifying current.next, or you’ll lose the reference to the rest of the list.
Time complexities: prepend is O(1) because you have direct access to the head. append is O(n) because you must traverse the entire list. insert_at is O(n) in the worst case when inserting near the tail.
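To see the three insertion operations compose, here is a self-contained, runnable sketch of the same methods plus a to_list helper (an addition for this example, not part of the implementation above):

```python
class Node:
    def __init__(self, data):
        self.data = data
        self.next = None

class LinkedList:
    def __init__(self):
        self.head = None

    def prepend(self, data):
        node = Node(data)
        node.next = self.head
        self.head = node

    def append(self, data):
        node = Node(data)
        if self.head is None:
            self.head = node
            return
        current = self.head
        while current.next is not None:
            current = current.next
        current.next = node

    def insert_at(self, index, data):
        if index < 0:
            raise IndexError("Index cannot be negative")
        if index == 0:
            self.prepend(data)
            return
        node = Node(data)
        current = self.head
        position = 0
        while current is not None and position < index - 1:
            current = current.next
            position += 1
        if current is None:
            raise IndexError("Index out of bounds")
        node.next = current.next
        current.next = node

    def to_list(self):
        """Helper for this example: collect values into a Python list."""
        out, current = [], self.head
        while current is not None:
            out.append(current.data)
            current = current.next
        return out

ll = LinkedList()
ll.append(2)        # [2]
ll.prepend(1)       # [1, 2]
ll.append(4)        # [1, 2, 4]
ll.insert_at(2, 3)  # [1, 2, 3, 4]
```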
Deletion Operations
Deletion mirrors insertion but with an extra consideration: you’re removing references, which in garbage-collected languages means the node becomes eligible for cleanup.
```python
def delete_value(self, value):
    """Delete first occurrence of value - O(n)"""
    if self.head is None:
        return False
    # Special case: deleting the head
    if self.head.data == value:
        self.head = self.head.next
        return True
    current = self.head
    while current.next is not None:
        if current.next.data == value:
            current.next = current.next.next
            return True
        current = current.next
    return False

def delete_at(self, index):
    """Delete node at specific index - O(n)"""
    if self.head is None:
        raise IndexError("Cannot delete from empty list")
    if index < 0:
        raise IndexError("Index cannot be negative")
    if index == 0:
        deleted_data = self.head.data
        self.head = self.head.next
        return deleted_data
    current = self.head
    position = 0
    while current.next is not None and position < index - 1:
        current = current.next
        position += 1
    if current.next is None:
        raise IndexError("Index out of bounds")
    deleted_data = current.next.data
    current.next = current.next.next
    return deleted_data
```
The head deletion case requires special handling because there’s no previous node to update. For middle and tail deletions, you find the node before the target and skip over the target by setting current.next = current.next.next.
In languages without garbage collection (C, C++), you’d need to explicitly free the deleted node’s memory. In Python and JavaScript, setting current.next to skip the deleted node removes all references to it, making it eligible for garbage collection.
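A runnable sketch of delete_value covering all three cases (head, middle, and not-found); the value-taking constructor and to_list helper are conveniences invented for this example:

```python
class Node:
    def __init__(self, data, next=None):
        self.data = data
        self.next = next

class LinkedList:
    def __init__(self, *values):
        """Convenience constructor for this example: build from values."""
        self.head = None
        for v in reversed(values):
            self.head = Node(v, self.head)

    def delete_value(self, value):
        if self.head is None:
            return False
        if self.head.data == value:
            self.head = self.head.next  # head case: no previous node to rewire
            return True
        current = self.head
        while current.next is not None:
            if current.next.data == value:
                current.next = current.next.next  # skip over the target
                return True
            current = current.next
        return False

    def to_list(self):
        out, current = [], self.head
        while current is not None:
            out.append(current.data)
            current = current.next
        return out

ll = LinkedList(1, 2, 3, 2)
first_removed = ll.delete_value(2)  # removes only the first 2 -> [1, 3, 2]
head_removed = ll.delete_value(1)   # head deletion -> [3, 2]
missing = ll.delete_value(99)       # not found, list unchanged
```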
Traversal and Search
Traversal is the foundation of most linked list operations. You start at the head and follow next pointers until you hit None.
```python
def traverse(self):
    """Print all elements - O(n)"""
    elements = []
    current = self.head
    while current is not None:
        elements.append(current.data)
        current = current.next
    return elements

def search(self, value):
    """Find index of value, return -1 if not found - O(n)"""
    current = self.head
    index = 0
    while current is not None:
        if current.data == value:
            return index
        current = current.next
        index += 1
    return -1

def get_length(self):
    """Count nodes - O(n)"""
    count = 0
    current = self.head
    while current is not None:
        count += 1
        current = current.next
    return count

def is_empty(self):
    """Check if list is empty - O(1)"""
    return self.head is None
```
All traversal-based operations are O(n). This is the fundamental limitation of singly linked lists—you can’t jump to the middle or access by index without walking through preceding nodes. If you need frequent length checks, maintain a size counter that you increment on insertion and decrement on deletion.
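As a sketch of that size-counter idea, here is a minimal variant that increments on prepend and decrements on head deletion, making length queries O(1); the method names delete_head and __len__ are choices made for this example, not part of the implementation above:

```python
class Node:
    def __init__(self, data):
        self.data = data
        self.next = None

class LinkedList:
    def __init__(self):
        self.head = None
        self._size = 0  # maintained on every insert/delete

    def prepend(self, data):
        node = Node(data)
        node.next = self.head
        self.head = node
        self._size += 1

    def delete_head(self):
        if self.head is None:
            raise IndexError("Cannot delete from empty list")
        data = self.head.data
        self.head = self.head.next
        self._size -= 1
        return data

    def __len__(self):
        """O(1): no traversal needed."""
        return self._size

ll = LinkedList()
ll.prepend(10)
ll.prepend(20)  # list is now 20 -> 10, len(ll) == 2
```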
Common Operations and Utilities
Reversing a linked list is a classic interview question and a practical operation. The iterative approach uses three pointers and runs in O(n) time with O(1) space.
```python
def reverse(self):
    """Reverse list in-place - O(n) time, O(1) space"""
    previous = None
    current = self.head
    while current is not None:
        next_node = current.next  # Save reference before we lose it
        current.next = previous   # Reverse the pointer
        previous = current        # Move previous forward
        current = next_node       # Move current forward
    self.head = previous

def get_middle(self):
    """Find middle element using slow/fast pointers - O(n)"""
    if self.head is None:
        return None
    slow = self.head
    fast = self.head
    while fast.next is not None and fast.next.next is not None:
        slow = slow.next
        fast = fast.next.next
    return slow.data
```
The reversal algorithm is elegant: at each step, you reverse the current node’s pointer to point backward, then advance all three pointers. When current becomes None, previous points to what was the last node—now the new head.
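The same three-pointer walk works on bare nodes, which can make the rewiring easier to trace in isolation; build and to_list below are helpers invented for this sketch:

```python
class Node:
    def __init__(self, data):
        self.data = data
        self.next = None

def build(values):
    """Helper for this sketch: build a chain of nodes from a Python list."""
    head = None
    for v in reversed(values):
        node = Node(v)
        node.next = head
        head = node
    return head

def reverse(head):
    """Reverse a chain of nodes - O(n) time, O(1) space."""
    previous = None
    current = head
    while current is not None:
        next_node = current.next  # save before rewiring
        current.next = previous   # point backward
        previous = current
        current = next_node
    return previous  # old tail, new head

def to_list(head):
    """Helper for this sketch: collect values from a chain of nodes."""
    out = []
    while head is not None:
        out.append(head.data)
        head = head.next
    return out

reversed_head = reverse(build([1, 2, 3, 4]))  # 4 -> 3 -> 2 -> 1
```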
The middle-finding algorithm uses the tortoise-and-hare technique: the slow pointer moves one step while the fast pointer moves two. When fast reaches the end, slow is at the middle. This pattern also detects cycles—if fast ever equals slow after the start, you have a loop.
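A minimal sketch of that cycle check, comparing the two pointers by identity; has_cycle is a name chosen for this example:

```python
class Node:
    def __init__(self, data):
        self.data = data
        self.next = None

def has_cycle(head):
    """Floyd's tortoise-and-hare cycle detection - O(n) time, O(1) space."""
    slow = fast = head
    while fast is not None and fast.next is not None:
        slow = slow.next
        fast = fast.next.next
        if slow is fast:  # pointers meet only if there is a loop
            return True
    return False

a, b, c = Node(1), Node(2), Node(3)
a.next, b.next = b, c

acyclic = has_cycle(a)  # a -> b -> c ends at None: no cycle
c.next = a              # loop the tail back to the head
cyclic = has_cycle(a)   # now the fast pointer catches the slow one
```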
Time/Space Complexity Summary and When to Use
| Operation | Time Complexity | Space Complexity |
|---|---|---|
| Prepend (insert at head) | O(1) | O(1) |
| Append (insert at tail) | O(n)* | O(1) |
| Insert at position | O(n) | O(1) |
| Delete head | O(1) | O(1) |
| Delete by value/position | O(n) | O(1) |
| Search | O(n) | O(1) |
| Access by index | O(n) | O(1) |
| Reverse | O(n) | O(1) |
*O(1) if you maintain a tail reference
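As a sketch of that asterisked case, here is an append that stays O(1) by maintaining a tail reference; this is a variant of the earlier class written for this example, not the canonical implementation:

```python
class Node:
    def __init__(self, data):
        self.data = data
        self.next = None

class LinkedList:
    def __init__(self):
        self.head = None
        self.tail = None  # extra reference: last node in the list

    def append(self, data):
        """O(1) append: no traversal needed when tail is tracked."""
        node = Node(data)
        if self.head is None:
            self.head = self.tail = node
        else:
            self.tail.next = node  # link after the old tail
            self.tail = node       # advance the tail reference

ll = LinkedList()
for v in (1, 2, 3):
    ll.append(v)  # each append is constant time
```

The cost is bookkeeping: every operation that can remove the last node must also update tail.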
Choose linked lists when:
- You frequently insert or delete at the beginning
- You don’t know the size upfront and want to avoid resize overhead
- You’re implementing stacks, queues, or adjacency lists for graphs
- Memory fragmentation is acceptable and cache performance isn’t critical
Avoid linked lists when:
- You need random access by index
- You’re iterating sequentially and want cache-friendly memory access
- Memory overhead per element matters (each node carries pointer overhead)
For most application code, arrays (or dynamic arrays like Python lists and JavaScript arrays) are the right choice. Their cache locality and simpler mental model outweigh linked list benefits. But understanding linked lists deeply prepares you for trees, graphs, and the pointer manipulation skills that separate junior from senior engineers.
Consider doubly linked lists when you need backward traversal or O(1) deletion given a node reference. Consider circular linked lists for round-robin scheduling or any scenario where you conceptually loop back to the start.