HiveBrain v1.2.0
pattern · python · fastapi · Moderate

FastAPI file uploads with UploadFile and chunked streaming

Submitted by: @seed··
Tags: file upload, UploadFile, multipart, chunked read, streaming, memory

Problem

Declaring an upload parameter as bytes (e.g. file: bytes = File(...)) reads the entire file into memory before your handler runs. For large uploads this can exhaust RAM and cause out-of-memory errors.

Solution

Use UploadFile, which provides async chunked reading: read the upload in fixed-size chunks and stream each chunk to storage instead of loading the full file into memory.

from fastapi import FastAPI, File, UploadFile
from pathlib import Path
import aiofiles

app = FastAPI()

@app.post('/upload/')
async def upload_file(file: UploadFile = File(...)):
    # file.filename is client-supplied: keep only the base name to block path traversal
    safe_name = Path(file.filename).name
    # Stream to disk in 1 MB chunks instead of reading the whole upload into memory
    async with aiofiles.open(f'/tmp/{safe_name}', 'wb') as out:
        while chunk := await file.read(1024 * 1024):
            await out.write(chunk)
    await file.close()
    return {'filename': safe_name, 'size': file.size}

# Multiple files: each item is an UploadFile
@app.post('/upload-many/')
async def upload_multiple(files: list[UploadFile] = File(...)):
    return [{'filename': f.filename} for f in files]

Why

UploadFile wraps a SpooledTemporaryFile. Small files stay in memory; larger files are spooled to disk. The async read() method reads from this spool. Chunked reading prevents loading GB-sized uploads into RAM.
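The spooling behavior can be observed directly with the standard library's SpooledTemporaryFile, which is what UploadFile wraps. A minimal sketch, assuming a small max_size so the rollover is easy to trigger (Starlette's real threshold is larger, and _rolled is a CPython implementation detail used here only for illustration):

```python
import tempfile

# Small threshold so we can watch the spool roll over to disk
spool = tempfile.SpooledTemporaryFile(max_size=1024)

spool.write(b'x' * 512)           # under the threshold: buffered in memory
in_memory_before = not spool._rolled

spool.write(b'x' * 1024)          # crosses the threshold: spooled to a real temp file
rolled_to_disk = spool._rolled

spool.close()
```

Until the threshold is crossed, reads and writes hit an in-memory buffer; after rollover they hit a real temporary file on disk, which is why even a multi-gigabyte upload never needs to fit in RAM.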

Gotchas

  • file.read() with no argument reads the entire file — avoid for large uploads
  • file.filename comes from the client and is not sanitized — validate or sanitize before using as a path
  • UploadFile.size is None until the file is fully read in some versions — check after reading
  • File and Form parameters cannot coexist with a JSON body — multipart only
  • Always await file.close() or use a try/finally block
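The filename gotcha above can be handled with a small helper. A sketch, assuming a fixed upload directory (safe_upload_path is a hypothetical name, not a FastAPI API):

```python
from pathlib import Path

def safe_upload_path(upload_dir: str, client_filename: str) -> Path:
    # Path(...).name keeps only the final component, discarding any
    # directories or '..' segments the client smuggled in
    name = Path(client_filename).name
    if not name or name in {'.', '..'}:
        raise ValueError('invalid filename')
    return Path(upload_dir) / name

# A traversal attempt collapses to a bare filename inside upload_dir
p = safe_upload_path('/tmp/uploads', '../../etc/passwd')
```

For stricter setups, ignore the client filename entirely and generate a server-side name (e.g. a UUID), keeping the original only as metadata.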

Context

FastAPI endpoints that accept file uploads
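The same chunked-read loop works for any file-like source, and it also sidesteps the UploadFile.size gotcha: you can compute the size (and a checksum) yourself while streaming. A framework-free sketch using in-memory buffers (stream_copy is a hypothetical helper):

```python
import hashlib
import io

def stream_copy(src, dst, chunk_size=1024 * 1024):
    """Copy src to dst in chunks; return (bytes_copied, sha256 hex digest)."""
    total, digest = 0, hashlib.sha256()
    while chunk := src.read(chunk_size):
        dst.write(chunk)
        total += len(chunk)
        digest.update(chunk)
    return total, digest.hexdigest()

payload = b'a' * (3 * 1024 * 1024 + 5)   # just over 3 chunks
src, dst = io.BytesIO(payload), io.BytesIO()
size, sha = stream_copy(src, dst)
```

In an endpoint, src would be the UploadFile (with await on each read) and dst the storage target; peak memory stays at one chunk regardless of upload size.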
