The Headache of “Real-time Data Updates”
Last week, I was assigned to build a dashboard for monitoring system logs. The requirement sounded simple: as the server generates logs, they must appear on the admin's browser almost instantly, with under 100ms of latency. If you work in this industry, you've likely faced this challenge before, whether for notifications, stock tickers, or token streaming in AI chat apps like ChatGPT.
Initially, I considered just using whatever technology I was most comfortable with. However, after calculating the potential traffic, I realized that picking the wrong tech stack from the start would cause the server to “choke” as the user base grows.
Why Old Approaches Keep You Up at Night
First is Short Polling. Every 1-2 seconds, the client sends an HTTP request asking, "Is there anything new?" This is incredibly resource-intensive: even when the system is idle, the server must still handle each request, query the database, and return a useless empty array. Imagine 1,000 concurrent users polling every second; the server would shoulder 1,000 requests per second just to say "nothing new." Which CPU can survive that?
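To make the waste concrete, here is a minimal sketch of short polling. The `pollOnce` helper and its `fetchLogs` parameter are hypothetical names of my own; the point is that the full round trip happens even when the answer is empty:

```javascript
// Hypothetical short-polling sketch: the client asks "anything new?"
// on a fixed timer, and the server pays for every request, even the
// empty ones.
async function pollOnce(fetchLogs, lastSeenId) {
  const logs = await fetchLogs(lastSeenId); // full HTTP round trip + DB query
  return logs.length > 0 ? logs : null;     // usually just an empty array
}

// In a browser this would run on a timer, e.g.:
// setInterval(() => {
//   pollOnce(id => fetch(`/logs?after=${id}`).then(r => r.json()), lastId);
// }, 2000);
```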
Next is WebSocket. It's the "big boss" of real-time communication thanks to its bidirectional capabilities. However, when you only need to push data from the server to the client, WebSocket is like using a cargo container truck to deliver a lunch box. You typically end up installing extra libraries (like Socket.IO), managing handshake configuration, handling reconnection after disconnects, and wrestling with the Upgrade header when running behind Nginx or a load balancer.
Server-Sent Events (SSE): The Unsung Hero
After careful consideration, I chose Server-Sent Events (SSE). This is an HTML5 standard that allows the server to actively push data to the client over a single persistent HTTP connection.
Why is SSE a smarter choice?
- Pure HTTP: No complex protocols required; runs smoothly on standard 80/443 ports.
- Ultra-lightweight: Minimal message overhead. A typical HTTP request costs about 500 bytes of headers, whereas SSE sends data almost directly.
- Automatic Reconnection: Browsers will automatically reconnect if the network drops, without you writing a single line of code.
- Rapid Implementation: The frontend needs only a few lines of code to start receiving data.
Implementing SSE with Node.js: Clean and Efficient
Here’s how I set up a simple Express server to push timestamps to the client every 2 seconds.
Step 1: Backend Setup with Express
The secret lies in setting the correct HTTP headers so the browser understands this is a continuous data stream.
const express = require('express');
const app = express();
const PORT = 3000;
app.get('/events', (req, res) => {
// Set mandatory headers for SSE
res.setHeader('Content-Type', 'text/event-stream');
res.setHeader('Cache-Control', 'no-cache');
res.setHeader('Connection', 'keep-alive');
// Flush headers immediately so the browser opens the stream right away
res.flushHeaders();
res.write('data: Connection successful!\n\n');
const intervalId = setInterval(() => {
const data = JSON.stringify({
message: 'New system log',
timestamp: new Date().toLocaleTimeString()
});
// Note: Must include "data: " prefix and end with "\n\n"
res.write(`data: ${data}\n\n`);
}, 2000);
// Cleanup resources when the user closes the tab
req.on('close', () => {
clearInterval(intervalId);
res.end();
console.log('User left, stopping interval.');
});
});
app.listen(PORT, () => {
console.log(`Server listening at http://localhost:${PORT}`);
});
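The wire format is worth internalizing: each message is one or more lines prefixed with "data: ", terminated by a blank line. A tiny helper makes the framing explicit (the name `sseFrame` is my own, not part of any library):

```javascript
// Hypothetical helper that wraps a JS object in SSE wire framing.
// Multi-line payloads must be split into multiple "data:" lines.
function sseFrame(payload) {
  const text = JSON.stringify(payload);
  return text
    .split('\n')                  // JSON.stringify never emits raw newlines,
    .map(line => `data: ${line}`) // but this keeps the helper general
    .join('\n') + '\n\n';         // the blank line terminates the message
}

// Usage inside the route handler:
// res.write(sseFrame({ message: 'New system log' }));
```

You can smoke-test the endpoint with curl -N http://localhost:3000/events, where -N disables curl's output buffering so events print as they arrive.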
Pro-tip: When streaming complex JSON data, I often copy the payload into toolcraft.app/en/tools/developer/json-formatter for a quick structure check. This helps avoid JSON parsing errors on the frontend due to formatting mistakes.
Step 2: Frontend Just Needs to “Listen”
You don’t need to install any NPM libraries. The EventSource API is built into all modern browsers.
const eventSource = new EventSource('/events');
eventSource.onmessage = (event) => {
const data = JSON.parse(event.data);
console.log('New data received:', data);
const list = document.getElementById('logs');
// Fine for a demo, but escape server data in production to avoid XSS
list.innerHTML += `<li>${data.timestamp}: ${data.message}</li>`;
};
eventSource.onerror = () => {
console.log('Connection lost, browser is retrying...');
};
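One small hardening step I'd suggest on top of the handler above: wrap JSON.parse so a single malformed payload doesn't throw inside onmessage and silently kill your UI updates. The helper name `safeParse` is my own:

```javascript
// Hypothetical defensive parser for event.data: returns null instead
// of throwing when the server sends a malformed payload.
function safeParse(raw) {
  try {
    return JSON.parse(raw);
  } catch {
    console.warn('Skipping malformed SSE payload:', raw);
    return null;
  }
}

// eventSource.onmessage = (event) => {
//   const data = safeParse(event.data);
//   if (data) renderLog(data); // only touch the DOM for valid payloads
// };
```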
3 Critical Notes to Save You from Trouble
Despite being great and cost-effective, SSE has limitations you need to know to avoid production issues:
- Browser Connection Limit: Over HTTP/1.1, browsers allow at most 6 concurrent connections per domain, and each SSE stream holds one open. If a user opens more than 6 tabs, the extra tabs will stall waiting for a free connection. Prioritize HTTP/2, which multiplexes streams over a single connection and raises the practical limit to 100 or more.
- Nginx Buffering: Nginx buffers proxied responses by default, which kills real-time delivery. Have your application send the "X-Accel-Buffering: no" response header, or disable proxy buffering in your Nginx config, so data is pushed to the client immediately.
- Newline Characters: Always terminate each message with a blank line, i.e. two newline characters (\n\n). Missing even one newline is enough to make the client hang, thinking the message hasn't finished transmitting.
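For the Nginx side, a minimal reverse-proxy block that disables buffering might look like the sketch below. The upstream address mirrors the Express example; adjust the location and timeout to your own setup:

```nginx
# Hypothetical reverse-proxy block for the /events SSE endpoint
location /events {
    proxy_pass http://localhost:3000;
    proxy_http_version 1.1;          # keep the upstream connection open
    proxy_set_header Connection "";  # drop the default "Connection: close"
    proxy_buffering off;             # forward each event immediately
    proxy_cache off;
    proxy_read_timeout 1h;           # allow long-lived streams
}
```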
Conclusion
After replacing Polling with SSE for the dashboard system, my server’s CPU usage dropped from 40% to just 5%. If you’re building an application that only requires a one-way data flow—like notifications, price updates, or AI streaming—SSE is your best bet. It’s clean code, lightweight, and extremely easy to maintain. Good luck with your implementation!

