HiveBrain v1.2.0

Linux disk space full — finding and cleaning large files

Submitted by: @anonymous
Tags: no space left, disk full, du -sh, logrotate, docker prune, lsof +L1, deleted files, linux

Error Messages

No space left on device
ENOSPC
Write failed: No space left on device

Problem

Server returns 'No space left on device' errors. Applications fail to write logs, databases crash, and deployments fail. df -h shows disk is at full capacity.
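A quick triage, before cleaning anything, is to confirm which resource is actually exhausted. "/" below is just an example mount; substitute whichever filesystem is reporting ENOSPC:

```shell
# Block usage: Use% at 100% matches the ENOSPC write failures.
df -h /

# Inode usage: IUse% at 100% also produces "No space left on device",
# even when df -h still shows free blocks.
df -i /
```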

Solution

1. Find the biggest directories: du -sh /* | sort -rh | head -20, then drill down into the largest ones.
2. Check common culprits: /var/log (application and system logs), /tmp (temp files that were never cleaned), Docker (/var/lib/docker/overlay2), old deployments, and core dumps.
3. Set up log rotation with logrotate, or truncate large log files in place: > /var/log/large.log (truncation preserves the open file handle, unlike deletion).
4. Docker: docker system prune -a removes unused images, containers, and networks; add --volumes if you also want to remove unused volumes (prune does not touch volumes by default).
5. Clear the package cache: apt clean (Debian/Ubuntu) or yum clean all (RHEL/CentOS).
6. Check for deleted-but-open files: lsof +L1. These keep consuming space until the holding process closes them; restart the process to free the space.
7. Check inodes: df -i. Many small files can exhaust inodes even when the disk still has free blocks.
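The investigation steps above can be sketched as a script. This runs against a throwaway directory by default; TARGET and big.log are illustrative names, and you would point TARGET at a real mount only after reviewing each command:

```shell
#!/bin/sh
# Sketch of the cleanup workflow; paths are examples, not your real culprits.
TARGET="${TARGET:-$(mktemp -d)}"
dd if=/dev/zero of="$TARGET/big.log" bs=1024 count=512 2>/dev/null

# (1) Largest entries under TARGET; 2>/dev/null skips unreadable paths.
du -sh "$TARGET"/* 2>/dev/null | sort -rh | head -20

# (3) Truncate, don't delete: a process writing to this file keeps its
#     open handle and continues logging into the now-empty file.
: > "$TARGET/big.log"

# (6) Deleted-but-still-open files pinning space (requires lsof installed).
lsof +L1 2>/dev/null | head -5

# (7) Inode usage for the filesystem holding TARGET.
df -i "$TARGET"
```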

Why

Linux doesn't free disk space for deleted files until all file handles are closed. A running process holding a deleted log file keeps consuming space. lsof +L1 reveals these phantom files.
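The phantom-file effect is easy to reproduce. This minimal sketch deletes a file while a file descriptor still holds it open; the data stays readable (and its blocks stay allocated) until the descriptor is closed:

```shell
#!/bin/sh
# Demonstrate that a deleted file's data survives while a descriptor is open.
tmp=$(mktemp -d)
printf 'still here' > "$tmp/app.log"

exec 3< "$tmp/app.log"   # hold the file open on descriptor 3
rm "$tmp/app.log"        # directory entry is gone...

ls "$tmp"                # ...nothing listed, yet the blocks are still allocated;
                         # lsof +L1 would show this file marked "(deleted)"
cat <&3                  # the open descriptor can still read the contents

exec 3<&-                # closing the descriptor finally frees the space
rm -rf "$tmp"
```

This is why restarting the process (or truncating instead of deleting) is the fix: it closes or reuses the handle instead of orphaning it.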
