
Troubleshooting "Too Many Open Files" Errors

Symptoms

The application crashes or fails with a Python error such as:

OSError: [Errno 24] Too many open files

This typically manifests as:

  • Failed HTTP connections
  • Unable to open log files
  • Database connection failures
  • Cascading application errors
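
Because these failures surface as Python OSError exceptions, it can help to log the limit the process actually sees at startup. A minimal sketch using the standard resource module:

import resource

# Soft limit is what the process is held to; the hard limit is the ceiling
# an unprivileged process may raise its soft limit to.
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print(f"RLIMIT_NOFILE: soft={soft}, hard={hard}")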

Diagnosis

Check the file descriptor limit inside the webhost container:

docker exec ar-webhost-1 sh -c "ulimit -n"

Expected values:

  • Too low: 1024 (a common distribution default; descriptors run out under load)
  • Ideal: 1048576 (1M) or higher
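
To gauge how close the process is to the limit, count the descriptors it currently holds. A quick check, assuming the main application runs as PID 1 inside the container:

docker exec ar-webhost-1 sh -c "ls /proc/1/fd | wc -l"

If this count is near the ulimit -n value, raising the limit will help; if it keeps growing under steady traffic, suspect a descriptor leak in the application instead.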

Where Limits Can Be Set

Limits are inherited through this hierarchy (most specific to least specific; the most specific setting present wins):

1. Docker Daemon Defaults

Location: /etc/docker/daemon.json

{
  "default-ulimits": {
    "nofile": {
      "Name": "nofile",
      "Hard": 1048576,
      "Soft": 1048576
    }
  }
}
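
Changes to daemon.json take effect only after the daemon restarts, and existing containers must be recreated to pick up the new default. A sketch of the apply step, assuming systemd manages Docker and the stack runs under Docker Compose:

sudo systemctl restart docker
docker compose up -d --force-recreate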

2. Systemd Service (Docker Daemon)

Location: /etc/systemd/system/docker.service.d/override.conf

Check current limits:

systemctl show docker | grep LimitNOFILE
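
If this reports a low value such as LimitNOFILE=1024, a drop-in override raises it. A minimal override.conf sketch using the values above:

[Service]
LimitNOFILE=1048576

Apply it with systemctl daemon-reload followed by systemctl restart docker. Containers started afterwards inherit the new limit unless daemon.json or a per-container setting overrides it.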

Known Distribution Differences

  • Ubuntu: the Docker package typically includes LimitNOFILE=infinity in its systemd service unit
  • Rocky Linux/RHEL: the Docker package may not include this, leaving the lower default of 1024

If deploying across different Linux distributions, use the systemd override or docker-compose configuration to ensure consistency.
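
For a setting that travels with the project regardless of host configuration, Compose can set the limit per service. A sketch for docker-compose.yml, assuming the service is named webhost (matching the ar-webhost-1 container above):

services:
  webhost:
    ulimits:
      nofile:
        soft: 1048576
        hard: 1048576

Recreate the container (docker compose up -d) for the setting to take effect; a per-container ulimit takes precedence over both the daemon default and the systemd limit.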