Why Your SSE Stream Keeps Failing in Production – How I Debugged and Fixed /api/user/stream in Next.js

There’s a specific kind of frustration only backend developers understand — everything works perfectly on localhost, you deploy it to production, and then suddenly… something quietly breaks. No crash, no logs screaming at you. Just a silent failure in the Network tab.

Yeah, this was exactly my situation with Server-Sent Events (SSE) in a Next.js app.

I spent almost three hours stuck here, convinced it was my code. But nope — the endpoint /api/user/stream was failing only in production. Locally, everything was smooth.

And honestly… this part confused me a lot at first.

[Image: Next.js Server-Sent Events stream failing in production, showing the /api/user/stream error in browser DevTools]

First — What Even Is a Stream Request?

Before jumping into debugging, let’s make this simple.

Server-Sent Events (SSE) is basically a way for the server to push updates to the browser in real time over a single long-lived HTTP connection.

Unlike WebSockets (two-way communication), SSE is one-way — server to client. It’s used for things like:

  • Live notifications
  • Dashboard updates
  • Activity feeds
  • Real-time ticket systems

When the browser sends Accept: text/event-stream, it expects the server to keep the connection open and continuously send updates.

If it closes immediately or never establishes properly, the stream fails.
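To make that concrete: each SSE message is plain text with an optional `event:` name, one or more `data:` lines, and a blank line as the terminator. Here's a minimal sketch in TypeScript (the event name and payload are made up for illustration):

```typescript
// Builds one SSE frame: "event: <name>\ndata: <json>\n\n".
// The trailing blank line is what tells the browser the message is complete.
function sseFrame(event: string, data: unknown): string {
  return `event: ${event}\ndata: ${JSON.stringify(data)}\n\n`;
}

// A "ticket-updated" frame with a small JSON payload:
console.log(sseFrame("ticket-updated", { id: 42 }));
```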

Simple concept… but in production? It gets tricky.


What I Actually Saw in DevTools

In Chrome DevTools, I saw the request to /api/user/stream marked red — failed connection.

Everything else in the app was working fine. Only this stream endpoint was broken.

  • Request URL: https://ticketsystem.yourdomain.com/api/user/stream
  • Accept: text/event-stream
  • Connection: keep-alive

At first I thought — “okay, maybe an API issue?”

But nope, that was the wrong direction. The request itself was fine. The infrastructure was the problem.


Why SSE Fails in Production (Real Reasons)

Let’s break down what actually goes wrong in real-world setups.

1. Nginx Buffering Issue

This was the big one for me.

I didn’t notice this at first, but Nginx was buffering the response.

And for SSE, buffering basically kills the whole idea.

SSE needs each event flushed to the client immediately. But buffering holds the response until a chunk fills up… so the client receives nothing in real time.

So the browser just thinks — “connection failed”.

2. Proxy Timeout (Silent Killer)

Most proxy servers have a default timeout (like 60 seconds).

That means even if your stream is working, it gets killed automatically.

This part honestly annoyed me the most because there was no obvious error.

3. Missing SSE Headers

If you don’t send proper headers like:

Content-Type: text/event-stream
Cache-Control: no-cache
Connection: keep-alive

The connection behaves unpredictably.

4. Serverless Limitations

If you're on a serverless platform like Vercel or AWS Lambda, long-running connections can get killed when the function hits its execution time limit.

This is a common trap. Everything works locally, breaks in production.
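If you do have to stay on such a platform, Next.js exposes route segment config that can help, though the ceiling you actually get is plan-dependent. A hedged sketch for the route file (the values are examples, not guarantees):

```typescript
// Route segment config for the stream route file (example values).
// Opt out of static handling and raise the function's max duration;
// the limit you actually get depends on your hosting plan.
export const dynamic = "force-dynamic";
export const maxDuration = 300; // seconds
```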


How I Fixed It (Step by Step)

Step 1: Fixing API Route

I started by checking the Next.js stream handler.

import { NextRequest } from "next/server";
import { userClients, registerUserController } from "@/lib/sse";
import { getServerSession } from "next-auth";
import { authOptions } from "@/lib/authOptions";

const encoder = new TextEncoder();

export async function GET(req: NextRequest) {
  let userId: string;
  try {
    const session = await getServerSession(authOptions);
    if (!session?.user?.id) {
      return new Response("Unauthorized", { status: 401 });
    }
    userId = session.user.id;
  } catch (error) {
    console.error("Auth error in SSE:", error);
    return new Response("Internal Server Error", { status: 500 });
  }

  const headers = {
    "Content-Type": "text/event-stream",
    "Cache-Control": "no-cache, no-transform",
    "Connection": "keep-alive",
    "X-Accel-Buffering": "no",
  };

  const stream = new ReadableStream({
    start(controller) {
      if (!userClients.has(userId)) userClients.set(userId, new Set());
      userClients.get(userId)!.add(controller);

      registerUserController(userId, "__user__", controller);

      // Declared up front so cleanup() can clear it safely even if
      // the very first send fails before the interval exists.
      let heartbeat: ReturnType<typeof setInterval> | undefined;

      const cleanup = () => {
        if (heartbeat) clearInterval(heartbeat);
        userClients.get(userId)?.delete(controller);

        try {
          controller.close();
        } catch {}
      };

      const send = (event: string, data: unknown) => {
        try {
          controller.enqueue(
            encoder.encode(
              `event: ${event}\ndata: ${JSON.stringify(data)}\n\n`,
            ),
          );
        } catch {
          cleanup();
        }
      };

      send("connected", {});

      // SSE comment frame: ignored by the client, but keeps the
      // connection from looking idle to proxies.
      heartbeat = setInterval(() => {
        try {
          controller.enqueue(encoder.encode(`:\n\n`));
        } catch {
          cleanup();
        }
      }, 10000);

      req.signal.addEventListener("abort", cleanup);
    },
  });

  return new Response(stream, { headers });
}

The important part here is X-Accel-Buffering: no.

Without it, Nginx buffers the proxied response, and your events never reach the browser in real time.
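A quick way to check whether buffering is actually off is to watch the raw stream with curl from a terminal. This is a manual diagnostic against your own deployment; the cookie header is a placeholder for whatever auth your route expects:

```shell
# -N disables curl's output buffering so events print as they arrive.
# If everything shows up in one burst when the connection dies,
# something in between is still buffering.
curl -N -H "Accept: text/event-stream" \
     -H "Cookie: <your-session-cookie>" \
     https://ticketsystem.yourdomain.com/api/user/stream
```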


Step 2: Nginx Fix (This Was THE Fix)

Honestly… this is where everything clicked.

I updated my Nginx config like this:

server {
    listen 80;
    server_name yourdomain.com;
    return 301 https://$host$request_uri;
}

server {
    listen 443 ssl http2;
    server_name yourdomain.com;
    ssl_certificate /etc/ssl/kocerts/ticket-system.crt;  # SSL cert path
    ssl_certificate_key /etc/ssl/kocerts/ticket-system.key;  # SSL key path

    access_log /var/log/nginx/tsystem_access.log;
    error_log /var/log/nginx/tsystem_error.log warn;

    proxy_buffering off;

    location /api/user/stream {
        proxy_pass http://localhost:3000;
        proxy_http_version 1.1;
        proxy_set_header Connection "";
        proxy_set_header Host $host;
        proxy_cache off;
        proxy_buffering off;
        chunked_transfer_encoding off;
        proxy_read_timeout 86400;
        proxy_send_timeout 86400;
        add_header Cache-Control no-cache;
    }

    location /api/ {
        proxy_pass http://localhost:3000;
        proxy_http_version 1.1;
        proxy_set_header Connection "";
        proxy_set_header Host $host;
        proxy_read_timeout 300;
    }

    location / {
        proxy_pass http://localhost:3000;
        proxy_http_version 1.1;
        proxy_set_header Connection "";
        proxy_set_header Host $host;
    }
}

Just test and restart Nginx:

sudo nginx -t
sudo systemctl restart nginx

After this change, things finally started working.

And yeah… this was the main culprit.


Step 3: Added Heartbeat (Very Important)

I didn’t realize this initially, but idle connections get dropped in real environments.

So I added a heartbeat:

: ping\n\n

It keeps the connection alive without sending real data.

Simple trick, but very effective.
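For reference, any line starting with a colon is an SSE comment: EventSource discards it silently, but the bytes on the wire stop proxies from treating the connection as idle. A tiny sketch:

```typescript
// A line beginning with ':' is an SSE comment. The browser ignores it,
// but sending one periodically keeps the connection looking active.
const heartbeatFrame = `: ping\n\n`;

// The whole frame is only 8 bytes of UTF-8, so the overhead is negligible.
const bytes = new TextEncoder().encode(heartbeatFrame);
console.log(bytes.length); // 8
```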


Step 4: Frontend Handling

useEffect(() => {
  const eventSource = new EventSource("/api/user/stream");

  // The server sends named events ("event: connected", etc.), so listen
  // by name; onmessage only fires for events without an event name.
  eventSource.addEventListener("connected", () => {
    console.log("SSE connected");
  });

  eventSource.onmessage = (event) => {
    const data = JSON.parse(event.data);
    // handle unnamed events with `data` here
  };

  eventSource.onerror = () => {
    console.warn("SSE connection lost, retrying...");
  };

  return () => eventSource.close();
}, []);

The browser actually handles reconnection automatically — which is nice.
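And if the default reconnect delay is too aggressive for your server, the SSE protocol also defines a `retry:` field the server can send once, early in the stream; the value is in milliseconds and the browser remembers it across reconnects. A hedged sketch of building that frame (the 5-second value is just an example):

```typescript
// The 'retry' field tells EventSource how long to wait before reconnecting.
function retryFrame(ms: number): string {
  return `retry: ${ms}\n\n`;
}

// On the server you would enqueue encoder.encode(retryFrame(5000))
// right after the "connected" event.
console.log(JSON.stringify(retryFrame(5000)));
```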


Final Result

After all changes, I refreshed the page…

And finally — the request stayed in a pending state in DevTools (which is correct for an open SSE connection).

Events started flowing properly, the heartbeat was visible every 10 seconds, and no more random failures.

[Image: Server-Sent Events stream working correctly in Next.js production after fixing the Nginx buffering and API configuration issues]

Quick Checklist (If You're Stuck)

  • Is SSE header set correctly?
  • Is Nginx buffering disabled?
  • Is proxy timeout too low?
  • Are you using serverless (Vercel/Lambda)?
  • Is CDN interfering?

Final Thoughts

Looking back, this wasn’t a code issue at all.

It was purely infrastructure behavior.

I actually wasted a good amount of time thinking my stream logic was wrong, but it wasn’t.

Honestly… this part was annoying.

But once you understand how SSE interacts with proxies and Nginx, it becomes much easier.

If you're building real-time features in Next.js — just remember: SSE is simple, but infrastructure can break it silently.

Hope this saves you a few hours of debugging.

If you know another developer facing this, feel free to share it with them.

Selvaraj Iyyappan
April 16, 2026