Deploying a Real-Time Stock Market Dashboard with FastAPI and WebSockets

Updated Feb 6, 2026

The Working Solution First

Here’s the FastAPI dashboard that pushes live stock updates to 50+ concurrent clients without breaking a sweat:

from fastapi import FastAPI, WebSocket, WebSocketDisconnect
from fastapi.responses import HTMLResponse
import asyncio
import yfinance as yf
from datetime import datetime
import json
from typing import Set

app = FastAPI()

class ConnectionManager:
    def __init__(self):
        self.active_connections: Set[WebSocket] = set()

    async def connect(self, websocket: WebSocket):
        await websocket.accept()
        self.active_connections.add(websocket)

    def disconnect(self, websocket: WebSocket):
        self.active_connections.discard(websocket)

    async def broadcast(self, message: dict):
        dead_connections = set()
        for connection in self.active_connections:
            try:
                await connection.send_json(message)
            except Exception:
                dead_connections.add(connection)
        # Clean up dead connections outside the iteration
        self.active_connections -= dead_connections

manager = ConnectionManager()

# Background task that fetches stock data
async def fetch_stock_data(symbols: list):
    while True:
        try:
            data = yf.download(symbols, period='1d', interval='1m',
                               progress=False)  # note: the old show_errors kwarg was removed from yfinance
            if not data.empty:
                latest = data['Close'].iloc[-1]
                payload = {
                    'timestamp': datetime.now().isoformat(),
                    'data': {sym: float(latest[sym]) for sym in symbols}
                }
                await manager.broadcast(payload)
        except Exception as e:
            # Yahoo Finance API occasionally times out - don't crash
            await manager.broadcast({
                'error': f'Data fetch failed: {str(e)}',
                'timestamp': datetime.now().isoformat()
            })
        await asyncio.sleep(60)  # Fetch every minute

@app.on_event("startup")
async def startup_event():
    symbols = ['AAPL', 'GOOGL', 'MSFT', 'TSLA', 'NVDA']
    asyncio.create_task(fetch_stock_data(symbols))

@app.websocket("/ws")
async def websocket_endpoint(websocket: WebSocket):
    await manager.connect(websocket)
    try:
        # Keep connection alive until client disconnects
        while True:
            await websocket.receive_text()  # Wait for ping
    except WebSocketDisconnect:
        manager.disconnect(websocket)

@app.get("/", response_class=HTMLResponse)
async def get_dashboard():
    return """
    <!DOCTYPE html>
    <html>
    <head>
        <title>Stock Dashboard</title>
        <style>
            body { font-family: 'Courier New', monospace; background: #0a0e27; color: #e0e0e0; }
            .stock-card { 
                background: #1a1f3a; 
                padding: 20px; 
                margin: 10px; 
                border-radius: 8px;
                border-left: 4px solid #00d4aa;
            }
            .price { font-size: 2em; font-weight: bold; }
            .up { color: #00ff88; }
            .down { color: #ff4444; }
            .timestamp { font-size: 0.8em; color: #888; }
        </style>
    </head>
    <body>
        <h1>Real-Time Stock Monitor</h1>
        <div id="stocks"></div>
        <div class="timestamp" id="last-update"></div>
        <script>
            const ws = new WebSocket('ws://localhost:8000/ws');
            const stocksDiv = document.getElementById('stocks');
            const timestampDiv = document.getElementById('last-update');
            let previousPrices = {};

            ws.onmessage = (event) => {
                const msg = JSON.parse(event.data);
                if (msg.error) {
                    console.error(msg.error);
                    return;
                }

                stocksDiv.innerHTML = '';
                for (const [symbol, price] of Object.entries(msg.data)) {
                    const card = document.createElement('div');
                    card.className = 'stock-card';

                    const priceClass = previousPrices[symbol] 
                        ? (price > previousPrices[symbol] ? 'up' : 'down')
                        : '';

                    card.innerHTML = `
                        <h2>${symbol}</h2>
                        <div class="price ${priceClass}">$${price.toFixed(2)}</div>
                    `;
                    stocksDiv.appendChild(card);
                    previousPrices[symbol] = price;
                }
                timestampDiv.textContent = `Last update: ${new Date(msg.timestamp).toLocaleTimeString()}`;
            };

            // Send periodic ping to keep connection alive
            setInterval(() => ws.send('ping'), 30000);
        </script>
    </body>
    </html>
    """

Run it with uvicorn main:app --reload, hit http://localhost:8000, and you get live price updates every minute with color-coded changes. The entire dashboard is roughly 130 lines.

(Photo: digital display showing real-time figures. Markus Spiske, Pexels)

Why Not Just Polling?

The obvious approach is polling: make the frontend hit /api/stocks every 10 seconds. Simple, right?

But here’s what happens at scale. Each poll is a full HTTP request: TCP handshake, TLS negotiation if you’re using HTTPS, headers (easily 500-800 bytes), request parsing, response serialization, teardown. For 100 clients polling every 10 seconds, that’s 600 requests per minute. On a $5 DigitalOcean droplet (the kind you’d actually deploy this on), you start seeing 200-300ms response times under that load because Python’s GIL makes every request block a tiny bit.

WebSockets flip this: one persistent connection per client. The server pushes updates only when data changes. No handshake overhead, no request parsing per update, just raw JSON frames over an already-open socket. In practice, you can handle 500-1000 concurrent connections on the same $5 droplet that choked at 100 polling clients.
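Back-of-the-envelope numbers for the header overhead alone (assuming ~700 bytes of headers per poll round-trip, a typical figure once cookies and user-agent strings pile up):

```python
clients = 100
polls_per_minute = 6          # one poll every 10 seconds
header_bytes = 700            # assumed request + response header size

# Polling: full HTTP headers on every single request
polling_overhead = clients * polls_per_minute * header_bytes

# WebSockets: a 2-4 byte frame header per push on an already-open socket
ws_overhead = clients * polls_per_minute * 4

print(polling_overhead)  # 420000 bytes/min of pure header traffic
print(ws_overhead)       # 2400 bytes/min
```

That's a ~175x difference before you even count connection setup and teardown.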

And crucially, WebSockets let the server decide when to push. With polling, you’re stuck guessing: poll too often and you waste bandwidth, poll too rarely and users see stale data. My dashboard pushes every 60 seconds during market hours (to stay under Yahoo Finance’s unofficial rate limits) but could instantly push breaking news alerts without the client asking.

The ConnectionManager Pattern

The ConnectionManager class is doing more work than it looks. When a client connects, we call websocket.accept() and add it to a set. When they disconnect (browser close, network drop, etc.), we remove them. The tricky part is the broadcast loop:

async def broadcast(self, message: dict):
    dead_connections = set()
    for connection in self.active_connections:
        try:
            await connection.send_json(message)
        except Exception:
            dead_connections.add(connection)
    self.active_connections -= dead_connections

Why the dead_connections set instead of removing directly? Because you can’t mutate a set while iterating over it in Python (you’ll get RuntimeError: Set changed size during iteration). I’ve seen production code call self.active_connections.remove(connection) inside the loop and crash every few hours when a mobile client drops mid-broadcast.
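You can reproduce that crash in isolation; this tiny repro has nothing to do with WebSockets, it’s just how Python sets behave:

```python
s = {1, 2, 3}
try:
    for x in s:
        s.remove(x)  # mutating the set mid-iteration
except RuntimeError as e:
    print(e)  # Set changed size during iteration
```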

The other subtle bit: we catch all exceptions during send, not just WebSocketDisconnect. Network hiccups can raise asyncio.TimeoutError, ConnectionResetError, or even OSError depending on the platform. Easier to just mark the connection dead and clean it up.

Background Tasks Without Blocking

The fetch_stock_data() coroutine runs forever in the background:

@app.on_event("startup")
async def startup_event():
    symbols = ['AAPL', 'GOOGL', 'MSFT', 'TSLA', 'NVDA']
    asyncio.create_task(fetch_stock_data(symbols))

asyncio.create_task() schedules the coroutine to run concurrently on the FastAPI event loop. It doesn’t block startup—the server becomes ready immediately while the background task starts fetching data. Two caveats: keep a reference to the returned task (asyncio may garbage-collect a task nobody holds a reference to), and note that newer FastAPI versions prefer a lifespan handler over @app.on_event, though both work. This beats threading (no GIL contention, no locks) and beats running the fetch inside the WebSocket handler (that would tie the fetch loop to individual clients).

One thing to watch: what happens if yf.download() takes longer than 60 seconds? The await asyncio.sleep(60) runs after the download completes, so when Yahoo Finance is slow (which it is around market open), update intervals drift. A more robust version would measure how long each fetch took and sleep only the remainder of the interval. But for a dashboard, irregular intervals are fine—better to show accurate data late than stale data on time.
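One drift-resistant variant (a sketch built around a hypothetical run_on_schedule helper, not the code above) sleeps only the time left in the interval, measured from when each fetch started:

```python
import asyncio

async def run_on_schedule(fetch, interval: float, iterations: int):
    """Call `fetch` on a fixed cadence: sleep only the remainder of the
    interval, so a slow fetch delays later ticks as little as possible.
    `iterations` bounds the loop to keep the sketch testable; the real
    background task would loop forever."""
    loop = asyncio.get_running_loop()
    next_run = loop.time()
    for _ in range(iterations):
        await fetch()
        next_run += interval
        await asyncio.sleep(max(0.0, next_run - loop.time()))

async def main():
    ticks = []
    async def fake_fetch():           # stands in for the slow yfinance call
        ticks.append(asyncio.get_running_loop().time())
        await asyncio.sleep(0.05)     # the fetch eats half the interval...
    await run_on_schedule(fake_fetch, interval=0.1, iterations=3)
    gaps = [b - a for a, b in zip(ticks, ticks[1:])]
    # ...yet ticks stay ~0.1s apart; the naive sleep(interval) would give ~0.15s
    print(all(g >= 0.09 for g in gaps))  # True: cadence never runs fast

asyncio.run(main())
```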

Handling Yahoo Finance Flakiness

Yahoo Finance’s unofficial API (which yfinance wraps) is… unreliable. Sometimes it returns empty dataframes. Sometimes it times out. Sometimes it returns data with NaN values for no obvious reason. The defensive approach:

if not data.empty:
    latest = data['Close'].iloc[-1]
    payload = {
        'timestamp': datetime.now().isoformat(),
        'data': {sym: float(latest[sym]) for sym in symbols}
    }
    await manager.broadcast(payload)

We check not data.empty before accessing anything. And we explicitly cast to float() because pandas sometimes returns numpy.float64, which isn’t JSON-serializable (you’ll get a TypeError: Object of type float64 is not JSON serializable without the cast). I learned this the hard way after deploying and seeing the dashboard silently stop updating.
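A slightly more defensive payload builder (a sketch; build_payload is a hypothetical helper and `latest` stands in for the pandas row) also drops NaN entries, since NaN is not valid JSON either:

```python
import math

def build_payload(latest, symbols, timestamp):
    """Cast each price to a plain float and skip NaN values, so the
    payload is always JSON-serializable."""
    prices = {}
    for sym in symbols:
        value = float(latest[sym])   # numpy.float64 -> plain float
        if not math.isnan(value):    # Yahoo sometimes returns NaN
            prices[sym] = value
    return {'timestamp': timestamp, 'data': prices}

# Illustrative values: the NaN'd entry is silently dropped
row = {'AAPL': 231.5, 'TSLA': float('nan')}
print(build_payload(row, ['AAPL', 'TSLA'], '2026-02-06T10:00:00'))
# {'timestamp': '2026-02-06T10:00:00', 'data': {'AAPL': 231.5}}
```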

The error broadcast is important too:

except Exception as e:
    await manager.broadcast({
        'error': f'Data fetch failed: {str(e)}',
        'timestamp': datetime.now().isoformat()
    })

Rather than letting the exception kill the background task, we broadcast the error to all clients. The frontend can log it or show a “Connection issues” banner. This beats leaving users staring at a frozen dashboard wondering if their internet is down.

Frontend: Minimal JavaScript That Works

The dashboard HTML is inline because this is a single-page app with no build step. No React, no Webpack, no npm install that downloads 400MB of dependencies. Just vanilla JS:

const ws = new WebSocket('ws://localhost:8000/ws');

ws.onmessage = (event) => {
    const msg = JSON.parse(event.data);
    if (msg.error) {
        console.error(msg.error);
        return;
    }
    // ... update DOM
};

The color-coding logic compares current price to previous price:

const priceClass = previousPrices[symbol] 
    ? (price > previousPrices[symbol] ? 'up' : 'down')
    : '';

If we haven’t seen the symbol before (previousPrices[symbol] is undefined), don’t color it—avoids the first update flashing green or red randomly. This is the kind of micro-UX polish that costs 2 lines but makes the dashboard feel professional.

The periodic ping keeps the connection alive:

setInterval(() => ws.send('ping'), 30000);

Some reverse proxies (nginx, Cloudflare) close idle WebSocket connections after 60 seconds. Sending a ping every 30 seconds prevents that. The server’s await websocket.receive_text() loop doesn’t care what the message is—it just keeps the socket open.
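If you also want the server to notice silent clients (an optional hardening, not part of the code above; wait_for_ping is a hypothetical helper), wrap the receive in a timeout:

```python
import asyncio

async def wait_for_ping(receive, timeout: float = 90.0) -> bool:
    """Return True if the client sent anything within `timeout` seconds,
    False if it went silent (likely a dead connection worth closing)."""
    try:
        await asyncio.wait_for(receive(), timeout=timeout)
        return True
    except asyncio.TimeoutError:
        return False

async def main():
    async def chatty():               # stands in for websocket.receive_text
        await asyncio.sleep(0.01)
        return 'ping'
    async def silent():               # a client that never pings
        await asyncio.sleep(10)
        return 'ping'
    print(await wait_for_ping(chatty, timeout=0.5))   # True
    print(await wait_for_ping(silent, timeout=0.05))  # False

asyncio.run(main())
```

In the real endpoint you’d close the socket and call manager.disconnect() when it returns False.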

Deployment: Uvicorn + Nginx

For production, you’d run Uvicorn behind nginx. The nginx config looks like:

upstream fastapi {
    server 127.0.0.1:8000;
}

server {
    listen 80;
    server_name yourdomain.com;

    location / {
        proxy_pass http://fastapi;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_set_header Host $host;
        proxy_read_timeout 86400;  # 24 hours for long-lived connections
    }
}

The key lines are proxy_http_version 1.1 (WebSockets require HTTP/1.1) and the Upgrade headers (tell nginx to pass through the WebSocket handshake). proxy_read_timeout 86400 prevents nginx from killing idle connections—set it to however long you want clients to stay connected.

You’d run Uvicorn with multiple workers for production:

uvicorn main:app --host 0.0.0.0 --port 8000 --workers 4

But be careful: each worker runs its own fetch_stock_data() background task. If you have 4 workers, you’re hitting Yahoo Finance 4x as often. For this dashboard, I’d actually stick with --workers 1 and scale horizontally (multiple servers behind a load balancer) if needed, because the background task doesn’t benefit from multiple processes.
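If you do scale horizontally, one simple guard (an assumption on my part, using a hypothetical RUN_FETCHER environment variable; only the designated box sets it) keeps a single process polling Yahoo Finance:

```python
import os

def should_run_fetcher() -> bool:
    """True only in the process launched with RUN_FETCHER=1; every other
    process just serves WebSocket clients."""
    return os.environ.get('RUN_FETCHER') == '1'

# In startup_event you'd gate the create_task call:
#     if should_run_fetcher():
#         asyncio.create_task(fetch_stock_data(symbols))
os.environ['RUN_FETCHER'] = '1'     # simulate the designated fetcher box
print(should_run_fetcher())         # True
```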

What About Reconnection?

The JavaScript above doesn’t handle disconnects gracefully. If the server restarts or the network drops, the client just… stops working. A production version needs:

let ws;
let reconnectInterval = 1000;

function connect() {
    ws = new WebSocket('ws://localhost:8000/ws');

    ws.onopen = () => {
        console.log('Connected');
        reconnectInterval = 1000;  // Reset backoff
    };

    ws.onmessage = (event) => {
        // ... existing logic
    };

    ws.onclose = () => {
        console.log(`Disconnected, reconnecting in ${reconnectInterval}ms`);
        setTimeout(connect, reconnectInterval);
        reconnectInterval = Math.min(reconnectInterval * 2, 30000);  // Exponential backoff
    };
}

connect();

Exponential backoff prevents a broken server from getting hammered by reconnect attempts. Start at 1 second, double on each failure, cap at 30 seconds. This is the pattern you see in every production WebSocket app.
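Sketched in Python just to show the numbers (next_backoff is a hypothetical mirror of the JS logic), the delay sequence climbs from 1 second to the 30-second cap and stays there:

```python
def next_backoff(current_ms: float, cap_ms: float = 30000.0) -> float:
    """Double the reconnect delay on each failure, capped at cap_ms."""
    return min(current_ms * 2, cap_ms)

delays, d = [], 1000.0
for _ in range(7):
    delays.append(d)
    d = next_backoff(d)
print(delays)  # [1000.0, 2000.0, 4000.0, 8000.0, 16000.0, 30000.0, 30000.0]
```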

Scaling Considerations (That I Haven’t Tested)

If you need to scale beyond a single server, you’ll hit a problem: WebSocket connections are stateful. Client A connects to Server 1, Client B connects to Server 2. When a stock update happens, how do you broadcast to both?

The standard solution is Redis Pub/Sub. Each server subscribes to a Redis channel. When fetch_stock_data() gets new data, it publishes to Redis. All servers receive the message and broadcast to their connected clients. The FastAPI code would look like:

import redis.asyncio as aioredis  # aioredis was folded into redis-py as redis.asyncio

redis = aioredis.from_url('redis://localhost')

async def fetch_stock_data(symbols: list):
    while True:
        # ... fetch data
        await redis.publish('stock_updates', json.dumps(payload))
        await asyncio.sleep(60)

async def redis_listener():
    pubsub = redis.pubsub()
    await pubsub.subscribe('stock_updates')
    async for message in pubsub.listen():
        if message['type'] == 'message':
            await manager.broadcast(json.loads(message['data']))

But I haven’t tested this at scale, so take it with a grain of salt. The theory is solid, but I’d want to benchmark it with 10k+ concurrent connections before trusting it in production.

Why FastAPI Over Flask or Django?

Flask and Django both support WebSockets (via extensions like Flask-SocketIO or Django Channels), but they’re not async-native. Flask-SocketIO uses eventlet or gevent for concurrency, which means monkey-patching Python’s standard library at runtime—it works, but debugging issues is a nightmare when you’re not sure if your bug is in your code or the monkey-patching.

Django Channels is more robust (it’s built on ASGI like FastAPI) but it’s heavy. You need Redis or RabbitMQ as a message broker, you need to configure channel layers, you need to understand Django’s middleware stack. For a stock dashboard, that’s overkill.

FastAPI is async from the ground up: it’s built on Starlette, an ASGI toolkit, and typically served by Uvicorn, an ASGI server. The entire framework uses async/await, so WebSockets are a first-class citizen. No monkey-patching, no external brokers (unless you’re scaling to multiple servers), no surprises. Performance is strong too: in throughput benchmarks FastAPI sustains tens of thousands of requests per second per core, several times what Flask with gevent manages.

What’s Missing (And Why I’d Add It Next)

This dashboard doesn’t persist historical data. Every time you reload the page, you lose the price history. A real version would store ticks in a time-series database (InfluxDB, TimescaleDB) and let the frontend query historical data for charting. The WebSocket would only push new data, and the initial page load would fetch the last hour from the DB.

I’m also curious whether WebRTC data channels would be faster than WebSockets for this use case. WebRTC is peer-to-peer (no server in the middle once connected), so latency could be lower. But the setup is way more complex (STUN/TURN servers, ICE candidates), and I’m not convinced the 10-20ms latency improvement matters for minute-level stock data. If you were building a high-frequency trading dashboard with tick-level data, maybe—but for retail investors, WebSockets are plenty fast.

The portfolio optimization engine from Part 4 could feed into this dashboard—imagine a sidebar showing your portfolio’s real-time value as the stocks update. You’d add a /ws/portfolio endpoint that broadcasts total portfolio value whenever any holding’s price changes. The math is straightforward (just Σᵢ wᵢ·pᵢ(t), where wᵢ is the number of shares and pᵢ(t) is the current price), but the UX impact is huge: watching your net worth update live is way more engaging than refreshing a static page.

Use FastAPI for real-time dashboards. If you need multi-server scaling, add Redis Pub/Sub. If you’re building something transactional (chat, multiplayer games), look into Socket.IO for better fallback support on corporate networks. But for pushing stock data to browsers? WebSockets over FastAPI is the sweet spot.

US Stock Market Analyzer Series (5/5)
