How to Create a Singleton in Python

Key Insights

  • Python offers multiple ways to implement singletons, but the module-level approach is often the most Pythonic and straightforward solution
  • Thread safety requires explicit locking mechanisms—most basic singleton implementations have race conditions in multi-threaded environments
  • Singletons make testing harder by introducing global state; consider dependency injection for better testability and code flexibility

What is the Singleton Pattern?

The singleton pattern ensures a class has only one instance throughout your application’s lifetime and provides a global point of access to it. Instead of creating new objects every time you instantiate a class, you get the same object back.

Singletons are useful for managing shared resources: database connection pools, application configuration, logging systems, and caching mechanisms. These resources should exist once and be accessible from anywhere in your codebase.

However, singletons are often overused. They introduce global state, make testing difficult, and can hide dependencies in your code. Before implementing a singleton, ask yourself: do I really need exactly one instance, or do I just need to share an instance? If it’s the latter, dependency injection is usually better.

Here’s a problem scenario where multiple instances cause issues:

class DatabaseConnection:
    def __init__(self, connection_string):
        self.connection_string = connection_string
        print(f"Opening database connection to {connection_string}")
        # Expensive connection setup here
    
    def query(self, sql):
        return f"Executing: {sql}"

# Problem: Multiple expensive connections
db1 = DatabaseConnection("postgresql://localhost/mydb")
db2 = DatabaseConnection("postgresql://localhost/mydb")  # Wasteful!
db3 = DatabaseConnection("postgresql://localhost/mydb")  # Even worse!

Each instantiation creates a new database connection, wasting resources. A singleton ensures only one connection exists.

Classic Implementation Using __new__

Python’s __new__ method controls instance creation before __init__ runs. By overriding it, you can return the same instance every time:

class DatabaseConnection:
    _instance = None
    
    def __new__(cls, *args, **kwargs):
        if cls._instance is None:
            cls._instance = super().__new__(cls)
        return cls._instance
    
    def __init__(self, connection_string):
        # Guard against re-initialization
        if not hasattr(self, 'connection_string'):
            self.connection_string = connection_string
            print(f"Opening database connection to {connection_string}")
    
    def query(self, sql):
        return f"Executing: {sql}"

# All variables point to the same instance
db1 = DatabaseConnection("postgresql://localhost/mydb")
db2 = DatabaseConnection("postgresql://localhost/mydb")
print(db1 is db2)  # True

Note the guard in __init__—without it, the initialization code runs every time you call the constructor, even though you’re getting the same instance. This approach works but has a critical flaw: it’s not thread-safe.

Decorator-Based Singleton

A decorator provides reusable singleton behavior without modifying class internals:

def singleton(cls):
    instances = {}
    
    def get_instance(*args, **kwargs):
        if cls not in instances:
            instances[cls] = cls(*args, **kwargs)
        return instances[cls]
    
    return get_instance

@singleton
class Configuration:
    def __init__(self, config_file):
        self.config_file = config_file
        print(f"Loading configuration from {config_file}")
        self.settings = self._load_config()
    
    def _load_config(self):
        # Simulate loading config
        return {"debug": True, "port": 8000}

config1 = Configuration("app.conf")
config2 = Configuration("app.conf")
print(config1 is config2)  # True

This approach is cleaner and makes singleton behavior explicit through the decorator. The decorator maintains a dictionary of instances, one per decorated class. However, it also lacks thread safety and changes the class into a function, which can break isinstance checks and introspection.
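To see that breakage concretely: after decoration, the name `Configuration` is bound to the inner `get_instance` function rather than a class, so passing it to `isinstance` raises `TypeError`. A minimal reproduction using the same decorator:

```python
def singleton(cls):
    instances = {}

    def get_instance(*args, **kwargs):
        if cls not in instances:
            instances[cls] = cls(*args, **kwargs)
        return instances[cls]

    return get_instance

@singleton
class Configuration:
    def __init__(self, config_file):
        self.config_file = config_file

config = Configuration("app.conf")
print(type(Configuration))  # <class 'function'>, not a class anymore

try:
    isinstance(config, Configuration)
except TypeError as exc:
    print(f"isinstance fails: {exc}")
```

If you only need the shared-instance behavior and never test types, this may not matter; otherwise the metaclass approach below avoids the problem entirely.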

Metaclass Approach

Metaclasses control class creation itself, making them a powerful tool for enforcing singleton behavior:

class SingletonMeta(type):
    _instances = {}
    
    def __call__(cls, *args, **kwargs):
        if cls not in cls._instances:
            cls._instances[cls] = super().__call__(*args, **kwargs)
        return cls._instances[cls]

class Logger(metaclass=SingletonMeta):
    def __init__(self, log_file):
        if not hasattr(self, 'log_file'):
            self.log_file = log_file
            print(f"Initializing logger with {log_file}")
    
    def log(self, message):
        print(f"[{self.log_file}] {message}")

logger1 = Logger("app.log")
logger2 = Logger("app.log")
print(logger1 is logger2)  # True
print(isinstance(logger1, Logger))  # True, still works!

The metaclass approach is more Pythonic than decorators because it preserves the class’s type and behavior. The __call__ method intercepts class instantiation, returning the cached instance if it exists. This is my preferred class-based approach when you need singleton behavior.

Module-Level Singleton (The Python Way)

Python’s import system naturally creates singletons. Modules are imported once and cached in sys.modules. This is the simplest and most Pythonic singleton:

# database.py
class _DatabaseConnection:
    def __init__(self):
        print("Opening database connection")
        self.connection_string = "postgresql://localhost/mydb"
    
    def query(self, sql):
        return f"Executing: {sql}"

# Create the singleton instance at module level
db_connection = _DatabaseConnection()

# app.py
from database import db_connection

# Everywhere you import db_connection, you get the same instance
result1 = db_connection.query("SELECT * FROM users")

# another_module.py
from database import db_connection

# Same instance as in app.py
result2 = db_connection.query("SELECT * FROM posts")

This approach is explicit, simple, and leverages Python’s built-in module caching. You’re not fighting the language—you’re using it as designed. For most use cases, this is the right answer.

Thread-Safe Singleton with Locking

In multi-threaded environments, multiple threads can simultaneously check if the instance exists, creating race conditions. Use locks to ensure thread safety:

import threading

class ThreadSafeSingleton:
    _instance = None
    _lock = threading.Lock()
    
    def __new__(cls, *args, **kwargs):
        if cls._instance is None:
            with cls._lock:
                # Double-checked locking
                if cls._instance is None:
                    cls._instance = super().__new__(cls)
        return cls._instance
    
    def __init__(self, value):
        if not hasattr(self, 'value'):
            self.value = value

# Test with multiple threads
def create_instance(value):
    instance = ThreadSafeSingleton(value)
    print(f"Thread {threading.current_thread().name}: {id(instance)}")

threads = [threading.Thread(target=create_instance, args=(i,)) for i in range(5)]
for thread in threads:
    thread.start()
for thread in threads:
    thread.join()

The double-checked locking pattern checks the instance twice: once without the lock (fast path) and once inside the lock (safe path). This minimizes lock contention while ensuring thread safety. The first check avoids acquiring the lock unnecessarily after the instance exists.
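The same double-checked locking transfers directly to the metaclass approach. The sketch below combines the earlier SingletonMeta idea with a lock (`ThreadSafeSingletonMeta` is my name for the combination, not a standard class):

```python
import threading
from concurrent.futures import ThreadPoolExecutor

class ThreadSafeSingletonMeta(type):
    _instances = {}
    _lock = threading.Lock()

    def __call__(cls, *args, **kwargs):
        if cls not in cls._instances:          # fast path, no lock needed
            with cls._lock:
                if cls not in cls._instances:  # re-check under the lock
                    cls._instances[cls] = super().__call__(*args, **kwargs)
        return cls._instances[cls]

class Logger(metaclass=ThreadSafeSingletonMeta):
    def __init__(self):
        self.messages = []

# Hammer the constructor from several threads; every call returns the same object
with ThreadPoolExecutor(max_workers=8) as pool:
    loggers = list(pool.map(lambda _: Logger(), range(100)))

print(all(logger is loggers[0] for logger in loggers))  # True
```

This gives you thread safety, preserved `isinstance` behavior, and reusability across classes in one place.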

Testing and Best Practices

Singletons introduce global state, making tests interdependent. One test’s modifications affect others. Here’s how to handle singleton state in tests:

import pytest

class ApplicationCache(metaclass=SingletonMeta):  # SingletonMeta from the metaclass section above
    def __init__(self):
        self.data = {}
    
    def set(self, key, value):
        self.data[key] = value
    
    def get(self, key):
        return self.data.get(key)
    
    def clear(self):
        self.data.clear()

@pytest.fixture(autouse=True)
def reset_singleton():
    """Reset singleton state before each test"""
    cache = ApplicationCache()
    cache.clear()
    yield
    cache.clear()

def test_cache_set():
    cache = ApplicationCache()
    cache.set("key1", "value1")
    assert cache.get("key1") == "value1"

def test_cache_isolation():
    cache = ApplicationCache()
    # This test should not see data from previous test
    assert cache.get("key1") is None
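The `clear()` fixture resets the instance's data but keeps the instance itself alive. If a test needs a genuinely fresh object (so `__init__` runs again), one option, assuming the `SingletonMeta` shown earlier, is to drop the cached entry from the metaclass registry. Note this reaches into a private attribute, which is acceptable in test code but not elsewhere:

```python
class SingletonMeta(type):
    _instances = {}

    def __call__(cls, *args, **kwargs):
        if cls not in cls._instances:
            cls._instances[cls] = super().__call__(*args, **kwargs)
        return cls._instances[cls]

class ApplicationCache(metaclass=SingletonMeta):
    def __init__(self):
        self.data = {}

first = ApplicationCache()
first.data["key"] = "stale"

# Drop the cached instance so the next call constructs a new object
SingletonMeta._instances.pop(ApplicationCache, None)

fresh = ApplicationCache()
print(fresh is first)   # False: a brand-new instance
print(fresh.data)       # {}: __init__ ran again
```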

Better yet, consider alternatives to singletons:

Dependency Injection: Pass instances explicitly rather than accessing globals. This makes dependencies visible and testable.

class UserService:
    def __init__(self, db_connection, cache):
        self.db = db_connection
        self.cache = cache
    
    def get_user(self, user_id):
        # Dependencies are explicit and easily mocked
        cached = self.cache.get(user_id)
        if cached:
            return cached
        user = self.db.query(f"SELECT * FROM users WHERE id={user_id}")
        self.cache.set(user_id, user)
        return user
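Because `UserService` receives its collaborators explicitly, a test can hand it trivial stand-ins with no mocking framework and no singleton reset logic. `FakeDB` and `FakeCache` below are illustrative stubs, not part of any library:

```python
class UserService:
    def __init__(self, db_connection, cache):
        self.db = db_connection
        self.cache = cache

    def get_user(self, user_id):
        cached = self.cache.get(user_id)
        if cached:
            return cached
        user = self.db.query(f"SELECT * FROM users WHERE id={user_id}")
        self.cache.set(user_id, user)
        return user

class FakeDB:
    def __init__(self):
        self.queries = []

    def query(self, sql):
        self.queries.append(sql)
        return {"id": 1, "name": "test user"}

class FakeCache:
    def __init__(self):
        self.data = {}

    def get(self, key):
        return self.data.get(key)

    def set(self, key, value):
        self.data[key] = value

db, cache = FakeDB(), FakeCache()
service = UserService(db, cache)

service.get_user(1)     # miss: goes to the fake DB
service.get_user(1)     # hit: served from the fake cache
print(len(db.queries))  # 1, the second call never touched the DB
```

Each test builds its own fakes, so tests cannot leak state into one another, which is exactly the property singletons take away.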

Use singletons sparingly. They’re appropriate for truly global resources like loggers or application configuration, but often dependency injection provides better testability and flexibility. When you do use singletons, prefer the module-level approach for simplicity, or use metaclasses if you need class-based behavior. Always consider thread safety in concurrent environments, and design your tests to handle global state appropriately.
