Gotcha · Python · FastAPI · Severity: Major
FastAPI async endpoints: don't block the event loop with sync I/O
Tags: async endpoint, blocking I/O, event loop, run_in_executor, asyncio, performance
Problem
Defining an endpoint as async def but calling blocking I/O (requests.get, time.sleep, synchronous DB queries) inside it blocks the entire event loop, eliminating async benefits and causing request queuing.
Solution
Use async-native libraries for I/O inside async endpoints (httpx for HTTP, asyncpg/databases for DB). For unavoidable blocking calls, offload to a thread pool with asyncio.run_in_executor or use FastAPI's built-in sync endpoint (def instead of async def — FastAPI automatically runs sync endpoints in a threadpool).
import asyncio
import httpx
from fastapi import FastAPI

app = FastAPI()

# Correct: async HTTP client, awaited inside the coroutine
@app.get('/data')
async def fetch_data():
    async with httpx.AsyncClient() as client:
        resp = await client.get('https://api.example.com/data')
        return resp.json()

# Unavoidable blocking call: offload to a thread pool
@app.get('/blocking')
async def blocking_endpoint():
    # get_running_loop() is preferred inside coroutines;
    # get_event_loop() is deprecated there since Python 3.10
    loop = asyncio.get_running_loop()
    result = await loop.run_in_executor(None, some_sync_function)
    return result
Why
Python's asyncio event loop is single-threaded. A blocking call in async def halts the loop for all other coroutines. FastAPI runs sync def endpoints in a threadpool via anyio, so they don't block the event loop — but they cannot use await.
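The single-threaded nature of the loop is easy to demonstrate outside FastAPI. Below is a minimal, self-contained sketch (pure asyncio, no web framework; function names are illustrative) comparing a blocking time.sleep inside a coroutine with the same call offloaded via asyncio.to_thread:

```python
import asyncio
import time

async def blocked_total() -> float:
    # time.sleep blocks the loop, so the two 0.2 s waits run back to back.
    async def naive():
        time.sleep(0.2)  # stalls every other coroutine on this loop
    start = time.monotonic()
    await asyncio.gather(naive(), asyncio.sleep(0.2))
    return time.monotonic() - start  # roughly 0.4 s

async def offloaded_total() -> float:
    # asyncio.to_thread runs the blocking call in a worker thread,
    # so both 0.2 s waits overlap.
    async def polite():
        await asyncio.to_thread(time.sleep, 0.2)
    start = time.monotonic()
    await asyncio.gather(polite(), asyncio.sleep(0.2))
    return time.monotonic() - start  # roughly 0.2 s

if __name__ == '__main__':
    print(f'blocked:   {asyncio.run(blocked_total()):.2f}s')
    print(f'offloaded: {asyncio.run(offloaded_total()):.2f}s')
```

asyncio.to_thread (Python 3.9+) is a convenience wrapper over loop.run_in_executor with the default thread pool, so either form fixes the problem.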
Gotchas
- Mixing sync SQLAlchemy inside async def is the most common mistake in FastAPI apps
- time.sleep() in async def blocks the loop; use await asyncio.sleep() instead
- async def endpoints that do only CPU work can also starve the loop — use ProcessPoolExecutor for CPU-bound tasks
- httpx.AsyncClient should be reused (not created per-request) for connection pooling
Context
Writing async FastAPI endpoints that perform I/O