As a developer, I’m always looking for ways to make my applications more dynamic and interactive. Users today expect real-time features, such as live notifications, streaming updates, and dashboards that refresh automatically. The tool that usually comes to mind for web developers when considering these types of applications is WebSockets, and it’s incredibly powerful.
There are times, though, when WebSockets can be overkill, and their full functionality is often not required. They provide a complex, bi-directional communication channel, but often all I need is for the server to push updates to the client. For these common scenarios, a simpler and more elegant solution that’s built right into modern web platforms is Server-Sent Events (SSE).
In this article, I’m going to introduce you to Server-Sent Events. We’ll discuss what they are, how they compare to WebSockets, and why they’re often the right tool for the job. Then we’ll dive into a series of practical examples, using Python and the FastAPI framework to build real-time applications that are surprisingly simple yet powerful.
What are Server-Sent Events (SSE)?
Server-Sent Events is a web technology standard that lets a server push data to a client asynchronously once an initial client connection has been established. It provides a one-way, server-to-client stream of data over a single, long-lived HTTP connection. The client, typically a web browser, subscribes to this stream and can react to the messages it receives.
Some key aspects of Server-Sent Events include:
- Simple Protocol. SSE is a straightforward, text-based protocol. Events are just chunks of text sent over HTTP, making them easy to debug with standard tools like curl (there’s a short sketch of the wire format after this list).
- Standard HTTP. SSE works over regular HTTP/HTTPS. This means it’s generally more compatible with existing firewalls and proxy servers.
- Automatic Reconnection. This is a killer feature. If the connection to the server is lost, the browser’s EventSource API will automatically try to reconnect. You get this resilience for free, without writing any extra JavaScript code.
- One-Way Communication. SSE is strictly for server-to-client data pushes. If you need full-duplex, client-to-server communication, WebSockets are the more appropriate choice.
- Native Browser Support. All modern web browsers have built-in support for Server-Sent Events through the EventSource interface, eliminating the need for client-side libraries.
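To make the “simple, text-based protocol” point concrete, here is a tiny illustrative helper. It isn’t part of the examples that follow; it simply shows the shape of an SSE message: one or more field: value lines, terminated by a blank line.
# Illustrative only: build an SSE message as plain text.
def sse_format(data: str, event: str | None = None) -> str:
    message = ""
    if event is not None:
        message += f"event: {event}\n"  # optional event name
    message += f"data: {data}\n"        # the payload
    return message + "\n"               # a blank line ends the message

print(sse_format("The time is 12:00:00"), end="")
# data: The time is 12:00:00

print(sse_format('{"price": 150.0}', event="AAPL"), end="")
# event: AAPL
# data: {"price": 150.0}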
Why SSE Matters/Common Use Cases
The primary advantage of SSE is its simplicity. For a large class of real-time problems, it provides all the necessary functionality with a fraction of the complexity of WebSockets, both on the server and the client. That means faster development, easier maintenance, and fewer things that can go wrong.
SSE is a perfect fit for any scenario where the server needs to initiate communication and send updates to the client. For example …
- Live Notification Systems. Pushing notifications to a user when a new message arrives or an important event occurs.
- Real-Time Activity Feeds. Streaming updates to a user’s activity feed, similar to a Twitter or Facebook timeline.
- Live Data Dashboards. Sending continuous updates for stock tickers, sports scores, or monitoring metrics to a live dashboard.
- Streaming Log Output. Displaying the live log output from a long-running background process directly in the user’s browser.
- Progress Updates. Showing the real-time progress of a file upload, a data processing job, or any other long-running task initiated by the user.
That’s enough theory; let’s see just how easy it is to implement these ideas with Python.
Setting Up the Development Environment
We’ll use FastAPI, a modern, high-performance Python web framework. Its native support for asyncio and streaming responses makes it a perfect fit for implementing Server-Sent Events. You’ll also need the Uvicorn ASGI server to run the application.
As usual, we’ll set up a development environment to keep our projects separate. I suggest using Miniconda for this, but feel free to use whichever tool you’re comfortable with.
# Create and activate a new virtual environment
(base) $ conda create -n sse-env python=3.13 -y
(base) $ conda activate sse-env
Now, install the external libraries we need.
# Install FastAPI and Uvicorn
(sse-env) $ pip install fastapi uvicorn
That’s all the setup we need. Now we can start coding.
Code Example 1 — The Python Backend: A Simple SSE Endpoint
Let’s create our first SSE endpoint. It will send a message with the current time to the client every second.
Create a file named app.py and type the following into it.
from fastapi import FastAPI
from fastapi.responses import StreamingResponse
from fastapi.middleware.cors import CORSMiddleware
import time

app = FastAPI()

# Allow requests from http://localhost:8080 (where index.html is served)
app.add_middleware(
    CORSMiddleware,
    allow_origins=["http://localhost:8080"],
    allow_methods=["GET"],
    allow_headers=["*"],
)

def event_stream():
    while True:
        yield f"data: The time is {time.strftime('%X')}\n\n"
        time.sleep(1)

@app.get("/stream-time")
def stream():
    return StreamingResponse(event_stream(), media_type="text/event-stream")
I hope you agree that this code is simple.
- We define an event_stream() generator function. Its loop runs forever, producing a string every second.
- The yielded string is formatted according to the SSE spec: it must start with data: and end with two newlines (\n\n).
- Our /stream-time endpoint returns a StreamingResponse, passing our generator to it and setting the media_type to text/event-stream. FastAPI handles the rest, keeping the connection open and sending each yielded chunk to the client.
To run the code, don’t use the usual python app.py command as you normally would. Instead, do this.
(sse-env)$ uvicorn app:app --reload
INFO: Will watch for changes in these directories: ['/home/tom']
INFO: Uvicorn running on http://127.0.0.1:8000 (Press CTRL+C to quit)
INFO: Started reloader process [4109269] using WatchFiles
INFO: Started server process [4109271]
INFO: Waiting for application startup.
INFO: Application startup complete.
Now, type this address into your browser …
http://127.0.0.1:8000/stream-time
… and you should see something like this.
The screen should display an updated time message every second.
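If you’d also like to check the raw stream from the command line, a minimal standard-library client along the following lines should work while uvicorn is running. This is just a sketch for inspection; in the browser, EventSource does the equivalent for you.
# sse_check.py -- read the /stream-time stream outside the browser (minimal sketch)
from urllib.request import urlopen

with urlopen("http://127.0.0.1:8000/stream-time") as response:
    for raw_line in response:                      # lines arrive as the server yields them
        line = raw_line.decode("utf-8").strip()
        if line.startswith("data:"):
            print(line.removeprefix("data:").strip())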
Code Example 2 — A Real-Time System Monitoring Dashboard
In this example, we will monitor our PC or laptop’s CPU and memory usage in real time.
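This example uses the psutil library to read the CPU and memory statistics. It wasn’t part of our earlier setup, so install it into the same environment first.
# Install psutil for system metrics
(sse-env) $ pip install psutil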
Here is the app.py code you need.
import asyncio
import json
import psutil
from fastapi import FastAPI, Request
from fastapi.responses import HTMLResponse, StreamingResponse
from fastapi.middleware.cors import CORSMiddleware
import datetime

# Define app FIRST
app = FastAPI()

# Then add middleware
app.add_middleware(
    CORSMiddleware,
    allow_origins=["http://localhost:8080"],
    allow_methods=["GET"],
    allow_headers=["*"],
)

async def system_stats_generator(request: Request):
    while True:
        if await request.is_disconnected():
            print("Client disconnected.")
            break
        cpu_usage = psutil.cpu_percent()
        memory_info = psutil.virtual_memory()
        stats = {
            "cpu_percent": cpu_usage,
            "memory_percent": memory_info.percent,
            "memory_used_mb": round(memory_info.used / (1024 * 1024), 2),
            "memory_total_mb": round(memory_info.total / (1024 * 1024), 2)
        }
        yield f"data: {json.dumps(stats)}\n\n"
        await asyncio.sleep(1)

@app.get("/system-stats")
async def stream_system_stats(request: Request):
    return StreamingResponse(system_stats_generator(request), media_type="text/event-stream")

@app.get("/", response_class=HTMLResponse)
async def read_root():
    with open("index.html") as f:
        return HTMLResponse(content=f.read())
This code builds a real-time system monitoring service using the FastAPI web framework. It creates a web server that continuously tracks and broadcasts the host machine’s CPU and memory usage to any connected web client.
First, it initialises a FastAPI application and configures Cross-Origin Resource Sharing (CORS) middleware. This middleware is a security feature, explicitly configured here to allow a web page served from http://localhost:8080 to make requests to this server, which is a common requirement when the frontend and backend are developed separately.
The core of the application is the system_stats_generator asynchronous function. It runs in an infinite loop, and in each iteration it uses the psutil library to fetch the current CPU utilisation percentage and detailed memory statistics, including the percentage used, megabytes used, and total megabytes. It packages this information into a dictionary, converts it to a JSON string, and then yields it in the special text/event-stream format (data: …\n\n).
The use of asyncio.sleep(1) introduces a one-second pause between updates, preventing the loop from consuming excessive resources. The function is also designed to detect when a client has disconnected and gracefully stop sending data to that client.
The script defines two web endpoints. The @app.get("/system-stats") endpoint creates a StreamingResponse that runs the system_stats_generator. When a client makes a GET request to this URL, it establishes a persistent connection, and the server begins streaming the system stats every second. The second endpoint, @app.get("/"), serves a static HTML file named index.html as the main page. This HTML file would typically contain the JavaScript code needed to connect to the /system-stats stream and dynamically display the incoming performance data on the web page.
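As an aside, you can sanity-check the JSON stream without any front end at all. The snippet below is a minimal standard-library sketch of my own (not the article’s front end), assuming the server is running on port 8000.
# stats_check.py -- print the CPU/memory stats as they stream in (minimal sketch)
import json
from urllib.request import urlopen

with urlopen("http://127.0.0.1:8000/system-stats") as response:
    for raw_line in response:
        line = raw_line.decode("utf-8").strip()
        if line.startswith("data:"):
            stats = json.loads(line.removeprefix("data:").strip())
            print(f"CPU {stats['cpu_percent']}%  |  RAM {stats['memory_percent']}%")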
Now, here is the (index.html) front-end code.
[index.html listing: a “System Monitor” page with CPU and Memory Usage readouts, each initially showing “0% (0 / 0 MB)”]
Run the app using Uvicorn, as we did in Example 1. Then, in a separate command window, type the following to start a simple Python web server.
python3 -m http.server 8080
Now, open the URL http://localhost:8080/index.html in your browser, and you will see the output, which should update continuously.

Code Example 3 — A Background Task Progress Bar
In this example, we initiate a task and display a bar indicating the task’s progress.
Here is the updated app.py.
import asyncio
import json
import psutil
from fastapi import FastAPI, Request
from fastapi.responses import HTMLResponse, StreamingResponse
from fastapi.middleware.cors import CORSMiddleware
import datetime

# Define app FIRST
app = FastAPI()

# Then add middleware
app.add_middleware(
    CORSMiddleware,
    allow_origins=["http://localhost:8080"],
    allow_methods=["GET"],
    allow_headers=["*"],
)

async def training_progress_generator(request: Request):
    """
    Simulates a long-running AI training task and streams progress.
    """
    total_epochs = 10
    steps_per_epoch = 100
    for epoch in range(1, total_epochs + 1):
        # Simulate some initial processing for the epoch
        await asyncio.sleep(0.5)
        for step in range(1, steps_per_epoch + 1):
            # Check if the client has disconnected
            if await request.is_disconnected():
                print("Client disconnected, stopping training task.")
                return
            # Simulate work
            await asyncio.sleep(0.02)
            progress = (step / steps_per_epoch) * 100
            simulated_loss = (1 / epoch) * (1 - (step / steps_per_epoch)) + 0.1
            progress_data = {
                "epoch": epoch,
                "total_epochs": total_epochs,
                "progress_percent": round(progress, 2),
                "loss": round(simulated_loss, 4)
            }
            # Send a named "progress" event
            yield f"event: progress\ndata: {json.dumps(progress_data)}\n\n"
    # Send a final "complete" event
    yield f"event: complete\ndata: Training complete!\n\n"

@app.get("/stream-training")
async def stream_training(request: Request):
    """SSE endpoint to stream training progress."""
    return StreamingResponse(training_progress_generator(request), media_type="text/event-stream")

@app.get("/", response_class=HTMLResponse)
async def read_root():
    """Serves the main HTML page."""
    with open("index.html") as f:
        return HTMLResponse(content=f.read())
The updated index.html code is this.
[index.html listing: a “Live Task Progress” page with a start button and a progress bar]
Stop your existing uvicorn and Python server processes if they’re still running, and then restart both.
Now, when you open the index.html page, you should see a screen with a button. Pressing the button will start a dummy task, and a moving bar will display the task’s progress.

Code Example 4 — A Real-Time Financial Stock Ticker
For our final example, we will create a simulated stock ticker. The server will generate random price updates for a number of stock symbols and send them using named events, where the event name corresponds to the stock symbol (e.g., event: AAPL, event: GOOGL). This is a powerful pattern for multiplexing different kinds of data over a single SSE connection, allowing the client to handle each stream independently.
Here is the updated app.py code you’ll need.
import asyncio
import json
import random
from fastapi import FastAPI, Request
from fastapi.responses import StreamingResponse
from fastapi.middleware.cors import CORSMiddleware

# Step 1: Create the app first
app = FastAPI()

# Step 2: Add CORS to allow requests from http://localhost:8080
app.add_middleware(
    CORSMiddleware,
    allow_origins=["http://localhost:8080"],
    allow_methods=["GET"],
    allow_headers=["*"],
)

# Step 3: Simulated stock prices
STOCKS = {
    "AAPL": 150.00,
    "GOOGL": 2800.00,
    "MSFT": 300.00,
}

# Step 4: Generator to simulate updates
async def stock_ticker_generator(request: Request):
    while True:
        if await request.is_disconnected():
            break
        symbol = random.choice(list(STOCKS.keys()))
        change = random.uniform(-0.5, 0.5)
        STOCKS[symbol] = max(0, STOCKS[symbol] + change)
        update = {
            "symbol": symbol,
            "price": round(STOCKS[symbol], 2),
            "change": round(change, 2)
        }
        # Send named events so the browser can listen by symbol
        yield f"event: {symbol}\ndata: {json.dumps(update)}\n\n"
        await asyncio.sleep(random.uniform(0.5, 1.5))

# Step 5: SSE endpoint
@app.get("/stream-stocks")
async def stream_stocks(request: Request):
    return StreamingResponse(stock_ticker_generator(request), media_type="text/event-stream")
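Because every message now begins with an event: line, a client has to keep track of the event name alongside the data. In the browser, EventSource’s addEventListener handles that for you; as a rough command-line illustration (a standard-library sketch of my own, not the article’s front end), you can parse the named events by hand.
# ticker_check.py -- parse named SSE events from /stream-stocks by hand (minimal sketch)
import json
from urllib.request import urlopen

event_name = None
with urlopen("http://127.0.0.1:8000/stream-stocks") as response:
    for raw_line in response:
        line = raw_line.decode("utf-8").strip()
        if line.startswith("event:"):
            event_name = line.removeprefix("event:").strip()
        elif line.startswith("data:"):
            update = json.loads(line.removeprefix("data:").strip())
            print(f"[{event_name}] {update['price']:.2f} ({update['change']:+.2f})")
        elif line == "":
            event_name = None  # a blank line marks the end of the current event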
And here is the updated index.html.
[index.html listing: a “Live Stock Ticker” page showing a price row for each of the three symbols]
Stop and then restart the uvicorn and Python processes as before. This time, when you open http://localhost:8080/index.html in your browser, you should see a screen like this, which will continually update the dummy prices of the three stocks.

Summary
In this article, I demonstrated that for many real-time use cases, Server-Sent Events offer a simpler alternative to WebSockets. We discussed the core principles of SSE, including its one-way communication model and automatic reconnection capabilities. Through a series of hands-on examples using Python and FastAPI, we saw just how easy it is to build powerful real-time features. We covered:
- A simple Python back end and SSE endpoint.
- A live system monitoring dashboard streaming structured JSON data.
- A real-time progress bar for a simulated long-running background task.
- A multiplexed stock ticker using named events to manage different data streams.
Next time you need to push data from your server to a client, I encourage you to pause before reaching for WebSockets. Ask yourself whether you really need bi-directional communication. If the answer is no, then Server-Sent Events are likely the simpler, faster, and more robust solution you’ve been looking for.