I’ve just been tinkering around with an interesting issue: ORCA, the computational chemistry program I’ve been using because I can’t afford Gaussian, crashes during geometry optimisation of a (moderately) complex molecule because of OpenMPI.
OpenMPI is complaining about running out of file descriptors. Eh? Seriously? OK…
Turns out that Ubuntu 16.04 (even the server version) sets the open file limit at what is frankly a little on the low side – 1024 open files. That sounds like a lot, until you remember that a job running via MPI can be crunching across a lot of temporary files and so on… and suddenly it doesn’t seem like so many. Interestingly, I never hit this problem before because I was previously running Ubuntu 14.04, which (from what the internet says) had a limit of 4096. I checked the latest release (14.04.5) and it had a limit of 1024 too, so I’ll assume for now that the 4096 limit was in an older release…
I’ll be honest, since this is the first time I’ve encountered this issue, I’ve never actually checked previously…
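For reference, the current limits are easy to query from a shell, so you can see where you stand before changing anything:

```shell
# Soft limit: what processes actually get by default
ulimit -Sn
# Hard limit: the ceiling an unprivileged user can raise the soft limit to
ulimit -Hn
```

On a stock Ubuntu 16.04 box the soft limit should come back as 1024; the hard limit is typically much higher, which is why the per-user fix below works without root.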
Anyway, there appear to be two fixes that work, one on a per-user basis, one on a system-wide basis. Pick your poison.
The user-level fix is super easy: add the following to your shell startup file (e.g. ~/.bashrc):
if [ "$USER" = "paramagnetic" ]; then
    ulimit -n 32768  # pick a suitably big number
fi
The same snippet can also go in other shell startup files, such as ~/.profile, if that suits your setup better.
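One gotcha: ulimit only affects the shell it runs in (and that shell’s children), so it’s worth checking from a fresh login shell that the new limit actually sticks:

```shell
# Start a new login shell (which re-reads the startup files)
# and print its file descriptor limit; if the snippet is in
# place, this should report the number you picked
bash -lc 'ulimit -n'
```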
At the system level, it’s a little more difficult, but still totally doable. Edit /etc/security/limits.conf and add:
* soft nofile 32768
* hard nofile 32768
root soft nofile 32768
root hard nofile 32768
Then make sure PAM actually applies those limits at login by checking that /etc/pam.d/common-session contains:
session required pam_limits.so
And reboot in all cases. If you have configuration management software running, it may overrule these settings, which is annoying, but outside the scope of this post.
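If you want to keep an eye on how close a long-running job is getting to the limit, /proc is your friend. Substitute the PID of your ORCA/MPI process for `$$` (the current shell’s PID, used here just as a self-contained example):

```shell
# Count how many file descriptors the process currently has open
ls /proc/$$/fd | wc -l
# Compare against the limit the kernel is actually enforcing for it
grep 'Max open files' /proc/$$/limits
```

Handy for confirming, before the next multi-hour geometry optimisation, that the new limit really applies to the process and not just to your shell.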