I have a Python Bottle app being served by nginx and uwsgi.
After running for a few hours, I start getting a 500 error from the web app, and in /var/log/messages I see:
uwsgi: OSError: [Errno 24] Too many open files
lsof for the failing process shows 1023 files open
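To see the limit the running process actually gets (rather than what a login shell reports), something like the following should work, using pgrep to find the uwsgi master PID (the match pattern is just an example and may need adjusting for your setup):

    # effective nofile limit of the oldest matching uwsgi process
    grep 'Max open files' /proc/$(pgrep -o -f uwsgi)/limits

1023 open files bumping into a ceiling looks consistent with the stock 1024 soft limit still applying to the worker.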
I've added fs.file-max=200000 to /etc/sysctl.conf
I've added the following to /etc/security/limits.conf:
uwsgi soft nofile 10000
uwsgi hard nofile 30000
I've run sysctl -p and rebooted the machine, but the app still fails once it hits 1,024 file descriptors.
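For reference, the kernel-wide setting can be checked like this (fs.file-nr prints allocated handles, free handles, and the maximum):

    sysctl fs.file-max
    cat /proc/sys/fs/file-nr

I suspect the real bottleneck is a per-process limit rather than the system-wide fs.file-max, though.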
Am I missing some uwsgi configuration, or something else? I can't seem to convince CentOS to give my app more files.
uwsgi running as user uwsgi? – Phillip -Zyan K Lee- Stockmann Jan 16 '18 at 21:51
Try sudo -u uwsgi -H /bin/bash and ulimit -n. I also had a problem with early versions of systemd not setting configured limits correctly - that may not be the actual problem, but perhaps systemd is setting a lower limit when spawning uwsgi? – Phillip -Zyan K Lee- Stockmann Jan 16 '18 at 22:01
I restart it with systemctl restart uwsgi, and Phillip, that ulimit -n command returns 10000 – Mojo Jan 16 '18 at 22:06
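If Phillip's systemd hunch is right, limits.conf wouldn't help anyway, since pam_limits only applies to login sessions and not to services started by systemd. In that case a drop-in along these lines (the uwsgi.service unit name and the drop-in path are assumptions about this setup) should raise the limit for the service itself:

    # /etc/systemd/system/uwsgi.service.d/override.conf  (path assumed)
    [Service]
    LimitNOFILE=30000

followed by:

    systemctl daemon-reload
    systemctl restart uwsgi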