# Futures and Promises: Deferred Computation
## Key Insights
- Futures and promises separate the concerns of producing and consuming asynchronous values, enabling cleaner concurrent code than raw callbacks
- Understanding the pending → fulfilled/rejected lifecycle is essential for debugging async code and avoiding common pitfalls like unhandled rejections
- Composition primitives like `Promise.all()` and chaining unlock the real power of deferred computation, but require careful error handling to avoid silent failures
## The Problem with Blocking
Every network request, file read, or database query forces a choice: wait for the result and block everything else, or continue working and handle the result later. Blocking is simple to reason about but scales poorly. A web server that blocks on each request handles one user at a time. A UI that blocks on a network call freezes entirely.
Traditional threading solves this but introduces its own complexity—synchronization primitives, race conditions, and the overhead of context switching. Futures and promises offer a middle path: represent the eventual result of an operation as a first-class value that can be passed around, composed, and transformed before the actual result arrives.
This isn’t just syntactic convenience. Deferred computation fundamentally changes how you structure programs. Instead of imperative sequences that wait for results, you build pipelines of transformations that execute when values become available.
## Conceptual Foundation: Futures vs Promises
The terms “future” and “promise” are often used interchangeably, but they represent distinct concepts in the producer/consumer relationship.
A promise is a writable container. The producer of an asynchronous value holds the promise and eventually resolves or rejects it. A future is a read-only handle to that eventual value. Consumers receive futures and attach handlers to react when values arrive.
This separation enforces a clean contract: only the code that creates the asynchronous operation can complete it, while any code with access to the future can observe the result.
```javascript
// JavaScript conflates these concepts, but the pattern is visible
function fetchUserData(userId) {
  // The executor function is the "producer" side
  return new Promise((resolve, reject) => {
    // Parameterized query: never interpolate user input into SQL
    database.query("SELECT * FROM users WHERE id = ?", [userId], (err, result) => {
      if (err) reject(err);
      else resolve(result);
    });
  });
}

// Consumer code only sees the "future" side
const userFuture = fetchUserData(42);
userFuture.then(user => console.log(user.name));
```
In languages with explicit separation, this distinction is clearer:
```scala
import scala.concurrent.{Future, Promise}
import scala.concurrent.ExecutionContext.Implicits.global

// Producer holds the promise
val promise: Promise[String] = Promise[String]()

// Consumer receives only the future
val future: Future[String] = promise.future

// Later, producer fulfills the promise
promise.success("computation complete")

// Consumer reacts to the future
future.foreach(result => println(result))
```
The producer/consumer split matters for API design. Functions should return futures, never promises. Exposing the promise lets callers resolve it with arbitrary values, breaking encapsulation.
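The distinction is easy to demonstrate with a hand-rolled deferred object (`createDeferred` is a hypothetical helper, sketched here for illustration): the producer keeps the writable side private and hands out only the promise.

```javascript
// Hypothetical helper: a "deferred" bundles a promise with its
// resolve/reject functions -- the writable "promise" side.
function createDeferred() {
  let resolve, reject;
  const promise = new Promise((res, rej) => {
    resolve = res;
    reject = rej;
  });
  return { promise, resolve, reject };
}

// The deferred stays private; callers receive only the read-only
// promise and cannot settle it themselves.
function startJob() {
  const deferred = createDeferred();
  setTimeout(() => deferred.resolve("done"), 10); // producer side
  return deferred.promise; // the "future" side
}
```

Returning `deferred` itself would let any caller invoke `resolve` with an arbitrary value, which is exactly the encapsulation break described above.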
## Lifecycle of a Deferred Value
Every future exists in one of three states: pending, fulfilled, or rejected. Once a future leaves the pending state, it’s settled and cannot change again. This immutability is crucial—multiple consumers can attach handlers without worrying about the value changing underneath them.
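The settled-once guarantee is easy to verify: extra calls to `resolve` are ignored, and every handler, whenever attached, observes the same value. A minimal sketch:

```javascript
// Once fulfilled, later resolve() calls are silently ignored.
const p = new Promise(resolve => {
  resolve("first");
  resolve("second"); // no effect: the promise is already settled
});

p.then(v => console.log(v)); // "first"
// Handlers attached after settling still fire, with the same value.
setTimeout(() => p.then(v => console.log(v)), 100); // "first" again
```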
```javascript
function demonstrateLifecycle() {
  const promise = new Promise((resolve, reject) => {
    console.log("State: pending");
    setTimeout(() => {
      const success = Math.random() > 0.5;
      if (success) {
        resolve("Operation succeeded");
        // State: fulfilled
      } else {
        reject(new Error("Operation failed"));
        // State: rejected
      }
    }, 1000);
  });

  promise
    .then(value => {
      console.log(`Fulfilled with: ${value}`);
      return value.toUpperCase(); // Transform and propagate
    })
    .catch(error => {
      console.log(`Rejected with: ${error.message}`);
      return "default value"; // Recover from error
    })
    .then(finalValue => {
      console.log(`Final value: ${finalValue}`);
    });
}
```
The .then() handler receives fulfilled values; .catch() handles rejections. Both return new futures, enabling transformation chains. When a handler returns a value, the next future fulfills with that value. When a handler throws or returns a rejected promise, the chain short-circuits to the next .catch().
This propagation model means errors bubble through chains until handled—similar to synchronous exceptions, but across asynchronous boundaries.
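A small sketch of that bubbling: a throw in one handler rejects the next promise in the chain, skipping intermediate `.then()` handlers until a `.catch()` recovers.

```javascript
const chain = Promise.resolve(2)
  .then(n => {
    if (n % 2 === 0) throw new Error("even input"); // rejects the chain
    return n;
  })
  .then(n => `doubled: ${n * 2}`)                  // skipped: chain is rejected
  .catch(err => `recovered from: ${err.message}`); // produces a fulfilled value
```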
## Composition and Chaining
Individual futures are useful, but composition is where deferred computation shines. You can express complex async workflows declaratively.
Sequential chaining executes operations in order, passing results forward:
```javascript
fetchUser(userId)
  .then(user => fetchOrders(user.id))
  .then(orders => calculateTotal(orders))
  .then(total => applyDiscount(total, 0.1))
  .then(finalPrice => console.log(`Total: $${finalPrice}`));
```
Each step waits for the previous one. The chain reads top-to-bottom, unlike nested callbacks.
Parallel execution with Promise.all() waits for multiple independent operations:
```javascript
async function loadDashboard(userId) {
  const [user, orders, notifications] = await Promise.all([
    fetchUser(userId),
    fetchOrders(userId),
    fetchNotifications(userId)
  ]);
  return { user, orders, notifications };
}
```
All three requests fire simultaneously. The combined future fulfills when all complete, or rejects immediately if any fails. This fail-fast behavior is usually what you want—partial results are often useless.
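A sketch of that fail-fast behavior, using two timed promises as hypothetical stand-ins for real requests:

```javascript
const fast = new Promise((_, reject) =>
  setTimeout(() => reject(new Error("fast failure")), 10));
const slow = new Promise(resolve =>
  setTimeout(() => resolve("slow success"), 50));

// Rejects as soon as `fast` fails; `slow` keeps running,
// but its eventual result is discarded by the combinator.
Promise.all([slow, fast])
  .then(values => console.log(values))     // never runs
  .catch(err => console.log(err.message)); // "fast failure"
```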
Racing with Promise.race() resolves with whichever future settles first:
```javascript
function fetchWithTimeout(url, ms) {
  const timeout = new Promise((_, reject) =>
    setTimeout(() => reject(new Error("Timeout")), ms)
  );
  return Promise.race([fetch(url), timeout]);
}
```
For cases where you want all results regardless of individual failures, use Promise.allSettled():
```javascript
const results = await Promise.allSettled([
  fetchFromPrimary(),
  fetchFromBackup(),
  fetchFromCache()
]);

const successes = results
  .filter(r => r.status === "fulfilled")
  .map(r => r.value);
```
## Implementation Patterns Across Languages
Different languages make different trade-offs in their future implementations. Understanding these helps when working across ecosystems.
JavaScript Promises are eager—the executor runs immediately upon construction. They’re always asynchronous; handlers execute in a later microtask even if the promise is already settled.
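That always-asynchronous guarantee is observable: a handler on an already-settled promise still runs only after the current synchronous code finishes. A minimal sketch:

```javascript
const log = [];
Promise.resolve("from handler").then(v => log.push(v)); // queued as a microtask
log.push("from sync code"); // runs first, in the current tick
// After the microtask queue drains: ["from sync code", "from handler"]
```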
Java’s CompletableFuture provides a richer API with explicit thread pool control:
```java
CompletableFuture<String> future = CompletableFuture
    .supplyAsync(() -> fetchData(), executor)
    .thenApply(data -> transform(data))
    .exceptionally(error -> "fallback");

// Blocking retrieval when needed
String result = future.get(5, TimeUnit.SECONDS);
```
Rust’s Future trait is lazy—futures don’t execute until polled by a runtime. This enables zero-cost abstractions but requires an executor:
```rust
async fn fetch_data() -> Result<String, reqwest::Error> {
    let response = reqwest::get("https://api.example.com/data").await?;
    response.text().await
}

#[tokio::main]
async fn main() {
    // Future executes only when awaited
    match fetch_data().await {
        Ok(data) => println!("{}", data),
        Err(e) => eprintln!("Error: {}", e),
    }
}
```
Python’s asyncio.Future integrates with the event loop and supports both callback-style and await syntax:
```python
import asyncio

async def fetch_data():
    await asyncio.sleep(1)  # Simulate async work
    return "data"

async def main():
    # Create task to run concurrently
    task = asyncio.create_task(fetch_data())
    result = await task
    print(result)

asyncio.run(main())
```
The key trade-off is eager vs lazy execution. Eager futures start work immediately but can waste resources if results are never needed. Lazy futures require explicit scheduling but enable better resource control.
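In eager-by-default JavaScript, laziness can be recovered by wrapping construction in a function (a thunk): nothing runs until the function is called. A sketch:

```javascript
// Eager: the executor runs the moment the promise is constructed,
// whether or not anyone ever awaits the result.
const workLog = [];
const eager = new Promise(resolve => {
  workLog.push("eager work started");
  resolve(42);
});

// Lazy: wrap construction in a thunk; work starts only when invoked.
const lazy = () =>
  new Promise(resolve => {
    workLog.push("lazy work started");
    resolve(42);
  });
// At this point workLog is ["eager work started"]; calling lazy()
// is what triggers the second entry.
```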
## Common Pitfalls and Best Practices
Unhandled rejections are the async equivalent of uncaught exceptions. Always terminate chains with error handling:
```javascript
// Anti-pattern: floating promise with no error handling
fetchData().then(process);

// Correct: explicit error handling
fetchData()
  .then(process)
  .catch(error => {
    logger.error("Failed to process data", error);
    notifyMonitoring(error);
  });
```
Accidental sequential execution happens when you await in a loop instead of parallelizing:
```javascript
// Anti-pattern: sequential requests (slow)
async function fetchAllUsers(ids) {
  const users = [];
  for (const id of ids) {
    users.push(await fetchUser(id)); // Each waits for the previous
  }
  return users;
}

// Correct: parallel requests
async function fetchAllUsers(ids) {
  return Promise.all(ids.map(id => fetchUser(id)));
}
```
Memory leaks from dangling promises occur when you create promises that never settle, keeping handlers in memory:
```javascript
// Anti-pattern: promise that may never resolve
function waitForEvent(emitter, event) {
  return new Promise(resolve => {
    emitter.on(event, resolve); // Listener never removed
  });
}

// Correct: cleanup on timeout or cancellation
function waitForEvent(emitter, event, timeout = 30000) {
  return new Promise((resolve, reject) => {
    const timer = setTimeout(() => {
      emitter.off(event, handler);
      reject(new Error("Timeout"));
    }, timeout);
    function handler(data) {
      clearTimeout(timer);
      resolve(data);
    }
    emitter.once(event, handler); // one-shot listener removes itself after firing
  });
}
```
Cancellation is notably absent from most promise implementations. For operations that need cancellation, consider AbortController in JavaScript or structured concurrency patterns in other languages.
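A sketch of the AbortController pattern applied to a cancellable delay (`delay` is a hypothetical helper; real code would thread the same signal into `fetch` and similar APIs):

```javascript
// The controller's signal is threaded into the async operation,
// which cleans up and rejects when aborted.
function delay(ms, signal) {
  return new Promise((resolve, reject) => {
    if (signal?.aborted) return reject(new Error("aborted"));
    const timer = setTimeout(resolve, ms);
    signal?.addEventListener("abort", () => {
      clearTimeout(timer); // release the resource
      reject(new Error("aborted"));
    }, { once: true });
  });
}

const controller = new AbortController();
const pending = delay(10_000, controller.signal);
controller.abort(); // cancel before the timer fires
pending.catch(err => console.log(err.message)); // "aborted"
```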
## When to Use (and When Not To)
Futures excel at one-shot async operations: HTTP requests, database queries, file reads. They represent a single eventual value cleanly.
For streams of values, futures are the wrong abstraction. Use observables, async iterators, or channels instead. A WebSocket connection produces many messages over time—wrapping each in a separate future loses the streaming nature.
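For comparison, the streaming shape in JavaScript is an async generator: many values over time, consumed with `for await` (`messages` is a hypothetical source standing in for a real connection):

```javascript
// An async generator models a stream of values over time --
// something a single future cannot represent.
async function* messages() {
  for (const msg of ["hello", "world", "bye"]) {
    await new Promise(r => setTimeout(r, 5)); // simulate network delay
    yield msg;
  }
}

async function consume() {
  const received = [];
  for await (const msg of messages()) {
    received.push(msg); // each iteration awaits the next value
  }
  return received;
}
```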
For CPU-bound work, futures don’t provide parallelism by themselves. You need actual threads or worker pools. Futures manage the coordination, not the execution.
Async/await syntax is largely syntactic sugar over futures, making sequential async code read like synchronous code. Use it liberally—it’s easier to read and debug than raw .then() chains. But understand that await points are where your function can be suspended and other code can run.
When you need fine-grained control over concurrent execution, backpressure handling, or complex coordination patterns, consider the actor model or CSP-style channels. Futures compose well but can become unwieldy for complex state machines.
The goal isn’t to use futures everywhere—it’s to represent deferred computation explicitly, making async behavior visible in your type signatures and enabling composition that callbacks can’t match.