gotcha · python · fastapi · Moderate
FastAPI middleware with BaseHTTPMiddleware — avoid for high-throughput
middleware · BaseHTTPMiddleware · streaming · ASGI · performance · buffering
Problem
BaseHTTPMiddleware is convenient but has a known performance issue: to hand your dispatch method a Response object, it consumes and buffers the entire response body in memory before passing it along, which breaks streaming responses and adds per-request overhead.
Solution
For simple middleware (adding headers, logging), use BaseHTTPMiddleware carefully. For streaming or high-performance needs, implement a pure ASGI middleware instead.
# Simple middleware — acceptable for non-streaming
import time

from starlette.middleware.base import BaseHTTPMiddleware

class TimingMiddleware(BaseHTTPMiddleware):
    async def dispatch(self, request, call_next):
        start = time.time()
        response = await call_next(request)
        response.headers['X-Process-Time'] = str(time.time() - start)
        return response
# Pure ASGI middleware — no buffering, streaming-safe
class RawASGIMiddleware:
    def __init__(self, app):
        self.app = app

    async def __call__(self, scope, receive, send):
        if scope['type'] == 'http':
            # Modify scope/receive/send as needed
            pass
        await self.app(scope, receive, send)
Why
BaseHTTPMiddleware wraps the ASGI interface with a Request/Response abstraction. To expose a Response object, Starlette must fully buffer the body. Pure ASGI middleware operates on the raw send/receive callables and can stream data without buffering.
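To make the `pass` placeholder above concrete, here is a minimal sketch of a pure ASGI middleware that adds a response header by wrapping `send`. The class name `HeaderInjectMiddleware` and the `x-served-by` header are illustrative, not part of Starlette's API; the point is that only the `http.response.start` message is touched, so body chunks stream through unmodified.

```python
class HeaderInjectMiddleware:
    """Sketch: inject a header without buffering the response body."""

    def __init__(self, app, header=(b"x-served-by", b"asgi-middleware")):
        self.app = app
        self.header = header

    async def __call__(self, scope, receive, send):
        if scope["type"] != "http":
            await self.app(scope, receive, send)
            return

        async def send_wrapper(message):
            # Only the response-start message carries headers; body
            # messages pass through untouched, so streaming is preserved.
            if message["type"] == "http.response.start":
                message = dict(message)
                message["headers"] = list(message.get("headers", [])) + [self.header]
            await send(message)

        await self.app(scope, receive, send_wrapper)
```

It can be registered like any other middleware class, e.g. `app.add_middleware(HeaderInjectMiddleware)`, since FastAPI accepts any callable that takes the inner app as its first constructor argument.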
Gotchas
- BaseHTTPMiddleware breaks StreamingResponse — the stream is fully consumed before the response object is returned
- Exception handling in middleware must re-raise or return a proper response — catching an exception and returning None from dispatch crashes the request with a 500
- Middleware order matters: app.add_middleware() adds in LIFO order (last added runs first)
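The LIFO ordering can be sketched without Starlette: each `add_middleware()` call wraps the current app, so the last-added middleware sits outermost and runs first. The `Tag` class and `calls` list below are illustrative names, not Starlette APIs.

```python
import asyncio

calls = []

class Tag:
    """Toy ASGI middleware that records when it is entered."""

    def __init__(self, app, name):
        self.app, self.name = app, name

    async def __call__(self, scope, receive, send):
        calls.append(self.name)  # record entry order
        await self.app(scope, receive, send)

async def endpoint(scope, receive, send):
    calls.append("endpoint")

# Equivalent of app.add_middleware(Tag, name="first"), then name="second":
app = Tag(endpoint, "first")   # added first -> innermost
app = Tag(app, "second")       # added last  -> outermost, runs first

asyncio.run(app({"type": "http"}, None, None))
# calls is now ['second', 'first', 'endpoint']
```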
Context
Adding cross-cutting concerns like logging, timing, or header injection to FastAPI