NumPy - Polynomial Operations (np.poly1d, np.polyfit)

Key Insights

• NumPy's poly1d class provides an intuitive object-oriented interface for polynomial operations including evaluation, differentiation, integration, and root finding
• polyfit performs least-squares polynomial regression, enabling curve fitting to noisy data with configurable polynomial degrees and optional covariance matrix output
• Combining polynomial fitting with evaluation creates powerful workflows for data analysis, interpolation, and mathematical modeling with minimal code

Understanding poly1d for Polynomial Representation

The np.poly1d class represents polynomials as first-class objects in NumPy. Unlike working with raw coefficient arrays, poly1d provides methods that make polynomial manipulation straightforward.

import numpy as np

# Create polynomial: 3x^2 + 2x + 1
coefficients = [3, 2, 1]  # Highest degree first
p = np.poly1d(coefficients)

print(p)
# Output:
#    2
# 3 x + 2 x + 1

# Evaluate at specific points
x_values = np.array([0, 1, 2, 3])
y_values = p(x_values)
print(y_values)  # [1, 6, 17, 34]

# Access individual coefficients
print(p.coefficients)  # [3 2 1]
print(p.order)  # 2

Creating polynomials from roots is equally simple:

# Create polynomial from roots: (x-1)(x-2)(x-3)
roots = [1, 2, 3]
p = np.poly1d(roots, r=True)

print(p)
#    3     2
# 1 x - 6 x + 11 x - 6

# Verify roots
print(p.roots)  # [3. 2. 1.]
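For completeness, np.poly is the functional counterpart of the r=True constructor (roots in, coefficient array out), and np.roots goes the other way:

```python
import numpy as np

# Roots -> coefficient array (highest degree first)
coeffs = np.poly([1, 2, 3])
print(coeffs)  # [ 1. -6. 11. -6.]

# Coefficients -> roots (order is not guaranteed, so sort for display)
print(np.sort(np.roots(coeffs)))  # [1. 2. 3.]
```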

Polynomial Arithmetic and Calculus Operations

poly1d objects support standard arithmetic operations and calculus operations through intuitive methods and operators.

# Define two polynomials
p1 = np.poly1d([2, 3, 1])  # 2x^2 + 3x + 1
p2 = np.poly1d([1, -2])     # x - 2

# Addition and subtraction
p_sum = p1 + p2
print(p_sum)  # 2x^2 + 4x - 1

# Multiplication
p_mult = p1 * p2
print(p_mult)  # 2x^3 - x^2 - 5x - 2

# Division returns quotient and remainder
quotient, remainder = np.polydiv(p1, p2)
print(f"Quotient: {quotient}")
print(f"Remainder: {remainder}")

# Derivative
p_deriv = p1.deriv()
print(p_deriv)  # 4x + 3

# Second derivative
p_deriv2 = p1.deriv(2)
print(p_deriv2)  # 4

# Integration (with constant of integration = 0 by default)
p_integ = p1.integ()
print(p_integ)  # 0.6667x^3 + 1.5x^2 + x

# Integration with specified constant
p_integ_c = p1.integ(k=5)  # k is the constant
print(p_integ_c)  # 0.6667x^3 + 1.5x^2 + x + 5
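A quick way to sanity-check polydiv: multiplying the quotient by the divisor and adding the remainder should reconstruct the dividend.

```python
import numpy as np

p1 = np.poly1d([2, 3, 1])  # 2x^2 + 3x + 1
p2 = np.poly1d([1, -2])    # x - 2

quotient, remainder = np.polydiv(p1, p2)

# quotient * divisor + remainder == dividend
reconstructed = quotient * p2 + remainder
print(np.allclose(reconstructed.coefficients, p1.coefficients))  # True
```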

Curve Fitting with polyfit

np.polyfit performs least-squares polynomial regression, finding the best-fit polynomial of specified degree for given data points.

# Generate noisy data from a known polynomial
true_coeffs = [2, -3, 1]  # 2x^2 - 3x + 1
x_data = np.linspace(-5, 5, 50)
y_true = np.polyval(true_coeffs, x_data)
noise = np.random.normal(0, 2, size=x_data.shape)
y_noisy = y_true + noise

# Fit polynomial of degree 2
fitted_coeffs = np.polyfit(x_data, y_noisy, deg=2)
print(f"True coefficients: {true_coeffs}")
print(f"Fitted coefficients: {fitted_coeffs}")

# Create poly1d object for easy evaluation
p_fitted = np.poly1d(fitted_coeffs)
y_fitted = p_fitted(x_data)

# Calculate R-squared
residuals = y_noisy - y_fitted
ss_res = np.sum(residuals**2)
ss_tot = np.sum((y_noisy - np.mean(y_noisy))**2)
r_squared = 1 - (ss_res / ss_tot)
print(f"R-squared: {r_squared:.4f}")
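NumPy's documentation recommends the newer numpy.polynomial package for new code. A sketch of the same degree-2 fit with Polynomial.fit (using a seeded generator so the output is reproducible; note that coefficients come back in ascending order and are internally scaled to a fitting window):

```python
import numpy as np
from numpy.polynomial import Polynomial

rng = np.random.default_rng(42)
x_data = np.linspace(-5, 5, 50)
y_noisy = 2 * x_data**2 - 3 * x_data + 1 + rng.normal(0, 2, size=x_data.shape)

# Polynomial.fit scales x into a window for numerical stability;
# convert() maps the result back to the original domain
p_fit = Polynomial.fit(x_data, y_noisy, deg=2).convert()

print(p_fit.coef)  # ascending order: approximately [1, -3, 2]
```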

Advanced polyfit: Weights and Covariance

polyfit supports weighted fitting and returns covariance matrices for uncertainty estimation.

# Weighted polynomial fitting
x_data = np.array([1, 2, 3, 4, 5, 6, 7, 8, 9, 10])
y_data = np.array([2.1, 3.9, 6.2, 8.1, 9.8, 12.3, 14.1, 16.2, 18.0, 20.1])

# Give more weight to certain points (e.g., more reliable measurements).
# Use float weights: ones_like(x_data) would inherit x_data's int dtype.
weights = np.ones(x_data.shape)
weights[0:3] = 2.0  # First three points are more reliable

coeffs_weighted = np.polyfit(x_data, y_data, deg=1, w=weights)
coeffs_unweighted = np.polyfit(x_data, y_data, deg=1)

print(f"Weighted fit: {coeffs_weighted}")
print(f"Unweighted fit: {coeffs_unweighted}")

# Get covariance matrix for uncertainty estimation
coeffs, cov_matrix = np.polyfit(x_data, y_data, deg=2, cov=True)
errors = np.sqrt(np.diag(cov_matrix))

print(f"Coefficients: {coeffs}")
print(f"Standard errors: {errors}")
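The covariance matrix becomes most useful when propagated to a prediction: for a degree-2 fit, the prediction at a point x is a linear function of the coefficients with gradient [x², x, 1], so its variance is J @ cov @ J. A sketch:

```python
import numpy as np

x_data = np.arange(1, 11, dtype=float)
y_data = np.array([2.1, 3.9, 6.2, 8.1, 9.8, 12.3, 14.1, 16.2, 18.0, 20.1])

coeffs, cov = np.polyfit(x_data, y_data, deg=2, cov=True)

# Gradient of the prediction w.r.t. the coefficients, in the same
# highest-degree-first order that polyfit uses
x = 5.5
J = np.array([x**2, x, 1.0])

pred = np.polyval(coeffs, x)
pred_std = np.sqrt(J @ cov @ J)
print(f"Prediction at x={x}: {pred:.2f} +/- {pred_std:.2f}")
```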

Practical Application: Data Interpolation and Extrapolation

Combining polyfit and poly1d creates powerful data analysis workflows.

# Experimental data: temperature vs. resistance
temperature = np.array([20, 30, 40, 50, 60, 70, 80])
resistance = np.array([100.2, 103.1, 106.3, 109.2, 112.5, 115.3, 118.7])

# Fit polynomials of different degrees
degrees = [1, 2, 3]
polynomials = {}

for deg in degrees:
    coeffs = np.polyfit(temperature, resistance, deg)
    polynomials[deg] = np.poly1d(coeffs)

# Predict resistance at specific temperature
target_temp = 55
for deg, poly in polynomials.items():
    predicted = poly(target_temp)
    print(f"Degree {deg}: R({target_temp}°C) = {predicted:.2f} Ω")

# Residual standard error for the degree-2 fit
coeffs = np.polyfit(temperature, resistance, 2)
p = np.poly1d(coeffs)

# Degrees of freedom: 7 data points minus 3 fitted coefficients
residuals = resistance - p(temperature)
residual_std = np.sqrt(np.sum(residuals**2) / (len(temperature) - 3))
print(f"Residual standard error: {residual_std:.3f}")
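With the residual standard error in hand, a crude ~95% band for a prediction is p(t) ± 2·s. This is a simplification that ignores uncertainty in the fitted coefficients, so it understates the true interval, especially when extrapolating:

```python
import numpy as np

temperature = np.array([20, 30, 40, 50, 60, 70, 80])
resistance = np.array([100.2, 103.1, 106.3, 109.2, 112.5, 115.3, 118.7])

p = np.poly1d(np.polyfit(temperature, resistance, 2))

residuals = resistance - p(temperature)
s = np.sqrt(np.sum(residuals**2) / (len(temperature) - 3))

# Crude ~95% band: ignores coefficient uncertainty entirely
t = 55
print(f"R({t}) ~ {p(t):.2f} +/- {2 * s:.2f} ohms")
```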

Solving Polynomial Equations

poly1d provides direct access to root finding and equation solving.

# Find roots of polynomial equation
# Solve: 2x^3 - 5x^2 + 3x - 1 = 0
p = np.poly1d([2, -5, 3, -1])
roots = p.roots

print("Roots:", roots)
# np.isreal tests imag == 0 exactly; with floating-point roots a
# small tolerance on the imaginary part is more robust
print("Real roots:", roots[np.abs(roots.imag) < 1e-9].real)

# Verify roots by evaluation
for root in roots:
    result = p(root)
    print(f"p({root:.4f}) = {result:.10f}")

# Find polynomial from specified roots
desired_roots = [1, 2, 3]
p_from_roots = np.poly1d(desired_roots, r=True)
print(p_from_roots)

# Find where polynomial equals a specific value
# Solve: 2x^2 + 3x + 1 = 10
p = np.poly1d([2, 3, 1])
p_shifted = p - 10  # Create new polynomial: 2x^2 + 3x - 9
solutions = p_shifted.roots
print(f"Solutions where p(x) = 10: {solutions}")
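The same shifting trick extends to finding where two polynomials intersect: subtract one from the other and take the roots of the difference.

```python
import numpy as np

p1 = np.poly1d([1, 0, -4])  # x^2 - 4
p2 = np.poly1d([1, -2])     # x - 2

# p1(x) == p2(x)  <=>  (p1 - p2)(x) == 0
intersections = np.sort((p1 - p2).roots)
print(intersections)  # [-1.  2.]

# Both polynomials agree at each intersection point
for x in intersections:
    print(p1(x), p2(x))
```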

Performance Considerations and Alternatives

While poly1d is convenient for interactive work, the functional routines (np.polyval, np.polyfit) avoid object overhead in performance-critical code, and NumPy's documentation recommends the newer numpy.polynomial package for new code.

# poly1d approach
p = np.poly1d([1, 2, 3, 4])
x = np.linspace(0, 10, 1000000)

import time

# Using poly1d object
start = time.time()
y1 = p(x)
time_poly1d = time.time() - start

# Using polyval function directly
coeffs = [1, 2, 3, 4]
start = time.time()
y2 = np.polyval(coeffs, x)
time_polyval = time.time() - start

print(f"poly1d time: {time_poly1d:.4f}s")
print(f"polyval time: {time_polyval:.4f}s")
print(f"Results equal: {np.allclose(y1, y2)}")

# Note: np.polyval already evaluates with Horner's method internally,
# so a hand-rolled version is for illustration, not for speed
def horner_eval(coeffs, x):
    result = coeffs[0]
    for coeff in coeffs[1:]:
        result = result * x + coeff
    return result

coeffs_array = np.array([1, 2, 3, 4])
start = time.time()
y3 = horner_eval(coeffs_array, x)
time_horner = time.time() - start
print(f"Horner time: {time_horner:.4f}s")
print(f"Results equal: {np.allclose(y1, y3)}")

NumPy’s polynomial tools provide a complete toolkit for mathematical and data analysis tasks. Use poly1d for interactive work and clarity, but switch to functional APIs like polyval and polyfit when processing large datasets or optimizing performance. The combination of fitting, evaluation, and calculus operations makes NumPy suitable for everything from basic curve fitting to complex numerical analysis workflows.
