Server-Sent Events: Unidirectional Streaming

Key Insights

  • Server-Sent Events provide a simpler, HTTP-native alternative to WebSockets when you only need server-to-client data flow, with automatic reconnection and built-in event handling.
  • SSE uses standard HTTP connections, making it compatible with existing infrastructure, authentication systems, and load balancers without special configuration.
  • For use cases like real-time dashboards, notification systems, and LLM response streaming, SSE offers the right balance of simplicity and functionality without WebSocket complexity.

What Are Server-Sent Events?

Server-Sent Events (SSE) is a web technology that enables servers to push data to clients over a single, long-lived HTTP connection. Unlike WebSockets, which provide full-duplex communication, SSE is intentionally unidirectional: the server sends, the client receives.

This constraint is actually a feature. Most real-time applications don’t need bidirectional streaming. When a user needs to send data, a standard HTTP POST works fine. What you often need is efficient, continuous updates from server to client—and that’s exactly what SSE delivers.

The key advantages over WebSockets:

  • Simpler protocol: SSE runs over standard HTTP, not a separate protocol
  • Automatic reconnection: Built into the EventSource API
  • Works with existing infrastructure: No special proxy configuration needed
  • Lighter weight: No handshake overhead or frame parsing

Choose SSE when your data flows primarily from server to client. Choose WebSockets when you need true bidirectional streaming with low latency in both directions.

How SSE Works Under the Hood

SSE uses the EventSource browser API to establish a persistent HTTP connection. The server responds with Content-Type: text/event-stream and keeps the connection open, sending events as they occur.

The wire format is remarkably simple—just text with specific field prefixes:

event: notification
id: 1234
retry: 5000
data: {"message": "New comment on your post", "timestamp": 1699876543}

event: heartbeat
data: ping

data: This is a simple message
data: that spans multiple lines

Each field serves a purpose:

  • data: The actual payload (required). Multiple data: lines are concatenated with newlines.
  • event: Custom event type. Without it, the message triggers the generic message event.
  • id: Event identifier. The browser sends this as Last-Event-ID on reconnection.
  • retry: Reconnection interval in milliseconds. The browser uses this for automatic reconnection.

Events are separated by blank lines (double newline). The simplicity of this format means you can debug SSE streams with curl or any HTTP client.
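Because the format is plain text, a complete parser fits in a few lines. The sketch below is a hypothetical helper (not part of any library) that applies the rules above: blank lines separate events, repeated data: lines are joined with newlines, and lines starting with : are comments.

```javascript
// Minimal parser for the SSE wire format: events separated by blank
// lines, "field: value" lines, ":"-prefixed comment lines ignored.
function parseSSEStream(text) {
  const events = [];
  for (const block of text.split(/\n\n+/)) {
    const event = { event: 'message', data: [] };
    for (const line of block.split('\n')) {
      if (line === '' || line.startsWith(':')) continue; // comment or empty
      const colon = line.indexOf(':');
      const field = colon === -1 ? line : line.slice(0, colon);
      let value = colon === -1 ? '' : line.slice(colon + 1);
      if (value.startsWith(' ')) value = value.slice(1); // strip one leading space
      if (field === 'data') event.data.push(value);
      else if (field === 'event') event.event = value;
      else if (field === 'id') event.id = value;
      else if (field === 'retry') event.retry = Number(value);
    }
    // Concatenate multiple data: lines with newlines, per the spec
    if (event.data.length) events.push({ ...event, data: event.data.join('\n') });
  }
  return events;
}
```

Running this over the example stream above yields two events: a `notification` event carrying the JSON payload, and a plain `message` event whose data is the two lines joined by a newline.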

Implementing an SSE Server

Here’s a production-ready SSE endpoint in Node.js with Express that handles multiple clients and proper connection lifecycle:

const express = require('express');
const app = express();

// Store active client connections
const clients = new Map();
let clientIdCounter = 0;

app.get('/events', (req, res) => {
  const clientId = ++clientIdCounter;
  
  // Set SSE headers
  res.setHeader('Content-Type', 'text/event-stream');
  res.setHeader('Cache-Control', 'no-cache');
  res.setHeader('Connection', 'keep-alive');
  res.setHeader('X-Accel-Buffering', 'no'); // Disable nginx buffering
  
  // Flush headers immediately
  res.flushHeaders();
  
  // Send initial connection event
  res.write(`event: connected\ndata: ${JSON.stringify({ clientId })}\n\n`);
  
  // Store client connection
  clients.set(clientId, res);
  console.log(`Client ${clientId} connected. Total: ${clients.size}`);
  
  // Send heartbeat comments every 30 seconds to keep connection alive
  const heartbeat = setInterval(() => {
    res.write(': heartbeat\n\n');
  }, 30000);
  
  // Handle client disconnect: stop the heartbeat and drop the connection
  req.on('close', () => {
    clearInterval(heartbeat);
    clients.delete(clientId);
    console.log(`Client ${clientId} disconnected. Total: ${clients.size}`);
  });
});

// Broadcast to all connected clients
function broadcast(event, data) {
  const message = `event: ${event}\ndata: ${JSON.stringify(data)}\n\n`;
  
  clients.forEach((res, clientId) => {
    try {
      res.write(message);
    } catch (error) {
      console.error(`Failed to send to client ${clientId}:`, error);
      clients.delete(clientId);
    }
  });
}

// Send to specific client
function sendToClient(clientId, event, data) {
  const client = clients.get(clientId);
  if (client) {
    client.write(`event: ${event}\ndata: ${JSON.stringify(data)}\n\n`);
  }
}

// Example: trigger events via API
app.post('/notify', express.json(), (req, res) => {
  broadcast('notification', req.body);
  res.json({ sent: true, clients: clients.size });
});

app.listen(3000, () => console.log('SSE server running on port 3000'));

Key implementation details:

  • X-Accel-Buffering header: Prevents nginx from buffering the response
  • Heartbeat comments: Lines starting with : are comments, keeping the connection alive through proxies
  • Connection cleanup: Always handle the close event to prevent memory leaks
  • Flush headers: Call flushHeaders() to send headers immediately
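The server above builds each wire-format message inline. A small helper keeps that layout in one place; formatSSE is a hypothetical convenience (not an Express API) that produces the same `id`/`event`/`data` framing the handler writes by hand, including one data: line per line of a multi-line payload.

```javascript
// Hypothetical helper producing one wire-format SSE message.
function formatSSE(event, data, id) {
  let message = '';
  if (id !== undefined) message += `id: ${id}\n`;
  if (event) message += `event: ${event}\n`;
  const payload = typeof data === 'string' ? data : JSON.stringify(data);
  // Multi-line payloads need one data: line per line of text
  for (const line of payload.split('\n')) message += `data: ${line}\n`;
  return message + '\n'; // blank line terminates the event
}
```

With this in place, the initial connection event becomes `res.write(formatSSE('connected', { clientId }))`, and broadcast can reuse the same function.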

Client-Side Implementation

The browser’s EventSource API handles connection management, automatic reconnection, and event parsing:

class SSEClient {
  constructor(url, options = {}) {
    this.url = url;
    this.options = options;
    this.eventSource = null;
    this.listeners = new Map();
    this.reconnectAttempts = 0;
    this.maxReconnectAttempts = options.maxReconnectAttempts || 5;
  }

  connect() {
    this.eventSource = new EventSource(this.url);
    
    this.eventSource.onopen = () => {
      console.log('SSE connection established');
      this.reconnectAttempts = 0;
    };

    this.eventSource.onerror = (error) => {
      console.error('SSE error:', error);
      
      if (this.eventSource.readyState === EventSource.CLOSED) {
        this.handleDisconnect();
      }
    };

    // Events without an 'event:' field fire as 'message'; handlers
    // registered with on('message', ...) receive them through
    // addEventListener, so no separate onmessage hook is needed.

    // Re-attach custom event listeners
    this.listeners.forEach((handlers, eventType) => {
      handlers.forEach(handler => {
        this.eventSource.addEventListener(eventType, handler);
      });
    });
  }

  on(eventType, handler) {
    const wrappedHandler = (event) => {
      try {
        const data = JSON.parse(event.data);
        handler(data, event);
      } catch {
        handler(event.data, event);
      }
    };

    if (!this.listeners.has(eventType)) {
      this.listeners.set(eventType, []);
    }
    this.listeners.get(eventType).push(wrappedHandler);

    if (this.eventSource) {
      this.eventSource.addEventListener(eventType, wrappedHandler);
    }

    return () => this.off(eventType, wrappedHandler);
  }

  off(eventType, handler) {
    const handlers = this.listeners.get(eventType);
    if (handlers) {
      const index = handlers.indexOf(handler);
      if (index > -1) handlers.splice(index, 1);
    }
    if (this.eventSource) {
      this.eventSource.removeEventListener(eventType, handler);
    }
  }

  handleDisconnect() {
    if (this.reconnectAttempts < this.maxReconnectAttempts) {
      const delay = Math.min(1000 * Math.pow(2, this.reconnectAttempts), 30000);
      console.log(`Reconnecting in ${delay}ms...`);
      setTimeout(() => {
        this.reconnectAttempts++;
        this.connect();
      }, delay);
    } else {
      console.error('Max reconnection attempts reached');
    }
  }

  close() {
    if (this.eventSource) {
      this.eventSource.close();
      this.eventSource = null;
    }
  }
}

// Usage
const sse = new SSEClient('/events');
sse.connect();

sse.on('notification', (data) => {
  console.log('Notification:', data.message);
  showToast(data.message);
});

sse.on('connected', (data) => {
  console.log('Connected with ID:', data.clientId);
});

The EventSource API automatically reconnects on connection loss. The retry field from the server controls the delay. My wrapper adds exponential backoff and a maximum retry limit for additional resilience.

Real-World Use Cases

SSE shines in several scenarios:

Live Dashboards: Push metrics updates without polling. Lower latency, less server load.

Notification Systems: Real-time alerts, mentions, and activity feeds.

Log Streaming: Tail logs in the browser during deployments or debugging sessions.

AI/LLM Response Streaming: Stream tokens as they’re generated. This is why ChatGPT responses appear word-by-word—they use SSE.

Stock Tickers and Sports Scores: Continuous updates where the client never needs to send data.

Here’s a minimal notification system combining the concepts:

// Server: notifications.js
const notifications = [];
const clients = new Map(); // res → true, one entry per open stream

app.get('/notifications/stream', (req, res) => {
  res.setHeader('Content-Type', 'text/event-stream');
  res.setHeader('Cache-Control', 'no-cache');
  res.flushHeaders();

  // Send recent notifications on connect
  const lastEventId = req.headers['last-event-id'];
  const missedNotifications = lastEventId 
    ? notifications.filter(n => n.id > parseInt(lastEventId))
    : notifications.slice(-10);
  
  missedNotifications.forEach(n => {
    res.write(`id: ${n.id}\nevent: notification\ndata: ${JSON.stringify(n)}\n\n`);
  });

  clients.set(res, true);
  req.on('close', () => clients.delete(res));
});

// Client: notifications.html
const sse = new EventSource('/notifications/stream');
sse.addEventListener('notification', (e) => {
  const notification = JSON.parse(e.data);
  document.getElementById('notifications').insertAdjacentHTML(
    'afterbegin',
    `<div class="notification">${notification.message}</div>`
  );
});
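The replay logic above assumes each notification carries an increasing id. A hypothetical addNotification helper (not shown in the server file) completes the loop: it records the notification, then fans it out to every connected response in the clients map.

```javascript
// Hypothetical push side of the notification stream: assigns a
// monotonically increasing id, stores the notification for replay,
// and writes it to every open connection.
let nextNotificationId = 1;

function addNotification(message, clients, notifications) {
  const notification = {
    id: nextNotificationId++,
    message,
    timestamp: Date.now(),
  };
  notifications.push(notification);
  const frame =
    `id: ${notification.id}\nevent: notification\n` +
    `data: ${JSON.stringify(notification)}\n\n`;
  for (const res of clients.keys()) res.write(frame);
  return notification;
}
```

Because every frame carries an `id:`, a client that drops and reconnects sends `Last-Event-ID`, and the stream endpoint replays exactly what it missed.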

Limitations and Considerations

SSE isn’t perfect. Know these constraints:

Connection limits: Browsers limit connections per domain (typically 6 for HTTP/1.1). Use HTTP/2 where possible—it multiplexes streams over a single connection.

No binary data: SSE is text-only. Base64 encoding works but adds overhead. Use WebSockets for binary streaming.

Proxy buffering: Some proxies buffer responses. The X-Accel-Buffering: no header helps with nginx. For others, sending periodic comments (: keepalive\n\n) flushes buffers.

Scaling: Each connection consumes server resources. For high-scale deployments, use Redis pub/sub to broadcast events across multiple server instances:

const Redis = require('ioredis');
const subscriber = new Redis();
const publisher = new Redis();

subscriber.subscribe('events');
subscriber.on('message', (channel, message) => {
  broadcast('update', JSON.parse(message));
});

// Publish from any server instance
publisher.publish('events', JSON.stringify({ type: 'update', data: {} }));

SSE vs Alternatives: Making the Right Choice

| Feature         | SSE             | WebSockets    | Long Polling    | HTTP/2 Streams |
|-----------------|-----------------|---------------|-----------------|----------------|
| Direction       | Server → Client | Bidirectional | Server → Client | Bidirectional  |
| Protocol        | HTTP            | WS/WSS        | HTTP            | HTTP/2         |
| Auto-reconnect  | Built-in        | Manual        | Manual          | Manual         |
| Binary support  | No              | Yes           | Yes             | Yes            |
| Browser support | Excellent       | Excellent     | Universal       | Good           |
| Proxy-friendly  | Yes             | Sometimes     | Yes             | Yes            |
| Complexity      | Low             | Medium        | Low             | High           |

Choose SSE when: You need server-to-client streaming, want simplicity, and can use standard HTTP infrastructure.

Choose WebSockets when: You need true bidirectional streaming with minimal latency, or need binary data.

Choose Long Polling when: You need maximum compatibility or very infrequent updates.

Choose HTTP/2 Streams when: You’re already on HTTP/2 and need fine-grained control over streaming.

For most real-time features—dashboards, notifications, live feeds—SSE is the pragmatic choice. It’s simpler to implement, debug, and operate than WebSockets, and it works with your existing HTTP infrastructure without modification.
