Python - functools Module (partial, lru_cache, wraps)
Key Insights
- `functools.partial` creates new callable objects with pre-filled arguments, reducing code duplication and improving readability when repeatedly calling functions with the same parameters
- `functools.lru_cache` implements memoization with a least-recently-used eviction policy, dramatically improving performance for expensive recursive or repetitive function calls
- `functools.wraps` preserves function metadata when creating decorators, ensuring proper introspection and debugging capabilities in wrapped functions
Understanding functools.partial
The partial function creates a new callable by freezing some portion of a function’s arguments and/or keywords. This is particularly useful when you need to call a function multiple times with the same arguments.
```python
from functools import partial

def power(base, exponent):
    return base ** exponent

# Create specialized functions
square = partial(power, exponent=2)
cube = partial(power, exponent=3)

print(square(5))  # 25
print(cube(5))    # 125
```
A practical use case involves database operations where connection parameters remain constant:
```python
from functools import partial
import sqlite3

def execute_query(query, db_path, params=None):
    conn = sqlite3.connect(db_path)
    cursor = conn.cursor()
    cursor.execute(query, params or [])
    results = cursor.fetchall()
    conn.close()
    return results

# Create a specialized function for your production database
prod_query = partial(execute_query, db_path='/var/data/production.db')

# Now use it without repeating the db_path
users = prod_query("SELECT * FROM users WHERE active = ?", params=[1])
orders = prod_query("SELECT * FROM orders WHERE status = ?", params=['pending'])
```
The partial object also works with positional arguments:
```python
from functools import partial

def multiply(x, y, z):
    return x * y * z

# Fix the first argument
double_and_multiply = partial(multiply, 2)
print(double_and_multiply(3, 4))  # 2 * 3 * 4 = 24

# Fix multiple arguments
double_triple = partial(multiply, 2, 3)
print(double_triple(4))  # 2 * 3 * 4 = 24
```
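Partial objects also expose what they have frozen through the `func`, `args`, and `keywords` attributes, which helps when debugging or logging. A quick sketch reusing the `multiply` example:

```python
from functools import partial

def multiply(x, y, z):
    return x * y * z

double_triple = partial(multiply, 2, 3)

# Inspect what the partial has frozen
print(double_triple.func.__name__)  # multiply
print(double_triple.args)           # (2, 3)
print(double_triple.keywords)       # {}
```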
Optimizing with lru_cache
The lru_cache decorator implements memoization, caching function results based on input arguments. This is invaluable for expensive computations, especially recursive algorithms.
```python
from functools import lru_cache
import time

@lru_cache(maxsize=128)
def fibonacci(n):
    if n < 2:
        return n
    return fibonacci(n - 1) + fibonacci(n - 2)

start = time.time()
print(fibonacci(35))
print(f"Time: {time.time() - start:.4f}s")
# Without cache, this would take seconds
# With cache: typically under 0.001s
```
The `maxsize` parameter determines how many recent results to keep cached. Setting it to `None` creates an unbounded cache:
```python
from functools import lru_cache
import requests

@lru_cache(maxsize=None)
def fetch_user_data(user_id):
    response = requests.get(f"https://api.example.com/users/{user_id}")
    return response.json()

# First call hits the API
user = fetch_user_data(123)

# Subsequent calls return cached data
user = fetch_user_data(123)  # Instant, no API call
```
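One caveat worth noting: `lru_cache` builds its cache key from the arguments, so every argument must be hashable. Passing a list raises a `TypeError`; converting it to a tuple works. A minimal sketch (the `total` function is illustrative):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def total(values):
    return sum(values)

try:
    total([1, 2, 3])         # lists are unhashable
except TypeError as e:
    print(f"Failed: {e}")

print(total((1, 2, 3)))      # tuples are hashable: 6
```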
Monitor cache performance using the cache_info() method:
```python
from functools import lru_cache

@lru_cache(maxsize=32)
def expensive_operation(x, y):
    return x ** y

expensive_operation(2, 10)
expensive_operation(3, 5)
expensive_operation(2, 10)  # Cache hit

print(expensive_operation.cache_info())
# CacheInfo(hits=1, misses=2, maxsize=32, currsize=2)
```
Clear the cache when needed:
```python
expensive_operation.cache_clear()
print(expensive_operation.cache_info())
# CacheInfo(hits=0, misses=0, maxsize=32, currsize=0)
```
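Besides `maxsize`, `lru_cache` accepts a `typed` flag. With `typed=True`, arguments that compare equal but differ in type, such as `1` and `1.0`, get separate cache entries (the `describe` function here is illustrative):

```python
from functools import lru_cache

@lru_cache(maxsize=32, typed=True)
def describe(x):
    return f"{x!r} is a {type(x).__name__}"

describe(1)
describe(1.0)  # separate cache entry, even though 1 == 1.0
print(describe.cache_info())
# CacheInfo(hits=0, misses=2, maxsize=32, currsize=2)
```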
A real-world example with database queries:
```python
from functools import lru_cache

@lru_cache(maxsize=256)
def get_user_permissions(user_id, role_id):
    # Expensive database join operation; note that the cache key
    # includes user_id even though the query only uses role_id
    query = """
        SELECT p.permission_name
        FROM permissions p
        JOIN role_permissions rp ON p.id = rp.permission_id
        WHERE rp.role_id = ?
    """
    # Simulate expensive query
    return execute_query(query, params=[role_id])

# Cache different combinations
perms1 = get_user_permissions(1, 'admin')
perms2 = get_user_permissions(2, 'admin')  # Different user, same role - new query
perms3 = get_user_permissions(1, 'admin')  # Cache hit
```
Preserving Metadata with wraps
When creating decorators, wraps ensures the wrapped function retains its original metadata (name, docstring, module, etc.). Without it, debugging and introspection become difficult.
```python
from functools import wraps

def my_decorator(func):
    @wraps(func)
    def wrapper(*args, **kwargs):
        print(f"Calling {func.__name__}")
        return func(*args, **kwargs)
    return wrapper

@my_decorator
def greet(name):
    """Greet someone by name."""
    return f"Hello, {name}"

print(greet.__name__)  # 'greet' (not 'wrapper')
print(greet.__doc__)   # 'Greet someone by name.'
```
Without @wraps, you lose the original function’s identity:
```python
def bad_decorator(func):
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

@bad_decorator
def process(data):
    """Process the data."""
    return data

print(process.__name__)  # 'wrapper' - not helpful!
print(process.__doc__)   # None - documentation lost!
```
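Beyond the name and docstring, `wraps` also sets a `__wrapped__` attribute pointing back at the original function, which tools like `inspect.signature` use to report the correct parameters. A quick sketch reusing the `greet` example:

```python
from functools import wraps
import inspect

def my_decorator(func):
    @wraps(func)
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

@my_decorator
def greet(name):
    """Greet someone by name."""
    return f"Hello, {name}"

print(inspect.signature(greet))    # (name) - not (*args, **kwargs)
print(greet.__wrapped__("World"))  # call the undecorated original directly
```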
A practical logging decorator:
```python
from functools import wraps
import logging
import time

logging.basicConfig(level=logging.INFO)

def log_execution(func):
    @wraps(func)
    def wrapper(*args, **kwargs):
        start = time.time()
        logging.info(f"Executing {func.__name__} with args={args}, kwargs={kwargs}")
        try:
            result = func(*args, **kwargs)
            logging.info(f"{func.__name__} completed in {time.time() - start:.4f}s")
            return result
        except Exception as e:
            logging.error(f"{func.__name__} failed: {str(e)}")
            raise
    return wrapper

@log_execution
def calculate_total(items, tax_rate=0.1):
    """Calculate total with tax."""
    subtotal = sum(items)
    return subtotal * (1 + tax_rate)

result = calculate_total([10, 20, 30], tax_rate=0.15)
```
Combining decorators with preserved metadata:
```python
from functools import wraps, lru_cache

def validate_positive(func):
    @wraps(func)
    def wrapper(n):
        if n < 0:
            raise ValueError("Input must be positive")
        return func(n)
    return wrapper

@lru_cache(maxsize=128)
@validate_positive
def factorial(n):
    """Calculate factorial of n."""
    if n <= 1:
        return 1
    return n * factorial(n - 1)

print(factorial(5))        # 120
print(factorial.__name__)  # 'factorial'
# factorial(-1)            # Raises ValueError
```
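Because `lru_cache` sits outermost here, the recursive calls inside `factorial` also go through the cache. A self-contained sketch (repeating the definitions above) confirms this with `cache_info()`, assuming a fresh cache:

```python
from functools import wraps, lru_cache

def validate_positive(func):
    @wraps(func)
    def wrapper(n):
        if n < 0:
            raise ValueError("Input must be positive")
        return func(n)
    return wrapper

@lru_cache(maxsize=128)
@validate_positive
def factorial(n):
    """Calculate factorial of n."""
    if n <= 1:
        return 1
    return n * factorial(n - 1)

factorial(5)  # 5 misses: n = 5, 4, 3, 2, 1
factorial(6)  # 1 new miss, then a cache hit on factorial(5)
print(factorial.cache_info())
# CacheInfo(hits=1, misses=6, maxsize=128, currsize=6)
```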
Advanced Patterns
Combine partial with lru_cache for specialized cached functions:
```python
from functools import partial, lru_cache

@lru_cache(maxsize=128)
def fetch_data(endpoint, api_key, params):
    # Expensive API call
    return f"Data from {endpoint} with {params}"

# Create specialized functions per environment
prod_fetch = partial(fetch_data, api_key="prod_key_123")
dev_fetch = partial(fetch_data, api_key="dev_key_456")

# Pass params by keyword so it doesn't collide with the frozen api_key
data1 = prod_fetch("/users", params="filter=active")
data2 = prod_fetch("/users", params="filter=active")  # Cache hit

# Note: both partials delegate to the same fetch_data, so they share
# one cache, keyed by the full (endpoint, api_key, params) combination
```
Create parameterized decorators using partial:
```python
from functools import wraps, partial
import time

def retry(max_attempts=3, delay=1):
    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            for attempt in range(max_attempts):
                try:
                    return func(*args, **kwargs)
                except Exception:
                    if attempt == max_attempts - 1:
                        raise
                    time.sleep(delay)
        return wrapper
    return decorator

# Create specialized retry decorator factories
retry_fast = partial(retry, max_attempts=2, delay=0.5)
retry_slow = partial(retry, max_attempts=5, delay=2)

@retry_fast()
def flaky_operation():
    """Operation that might fail."""
    pass
```
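To see the retry behavior end to end, here is a sketch with a deliberately flaky function that fails twice before succeeding; the `attempts` counter and `ConnectionError` are illustrative, not from the original:

```python
from functools import wraps
import time

def retry(max_attempts=3, delay=1):
    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            for attempt in range(max_attempts):
                try:
                    return func(*args, **kwargs)
                except Exception:
                    if attempt == max_attempts - 1:
                        raise
                    time.sleep(delay)
        return wrapper
    return decorator

attempts = {"count": 0}

@retry(max_attempts=3, delay=0.01)
def flaky_operation():
    # Fails on the first two calls, then succeeds
    attempts["count"] += 1
    if attempts["count"] < 3:
        raise ConnectionError("temporary failure")
    return "success"

print(flaky_operation())  # success
print(attempts["count"])  # 3 - two failures were retried
```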
The functools module provides essential tools for writing cleaner, more efficient Python code. Use partial to eliminate repetitive argument passing, lru_cache to optimize expensive computations, and wraps to maintain proper function metadata in decorators. These utilities are not just conveniences—they’re fundamental to writing production-quality Python applications.