Out of memory problem in Linux

I was running an R program written by a third party, and for one specific dataset the process kept getting killed. My machine had plenty of RAM available, yet the process was repeatedly killed by the kernel's OOM (out-of-memory) killer.
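A quick way to confirm that it really was the OOM killer is to search the kernel log for its messages (the journalctl variant assumes a systemd-based distribution):

dmesg | grep -iE 'killed process|out of memory'
journalctl -k | grep -i 'killed process'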

I read a little about the subject and learned that this behaviour can be tuned through the kernel's memory overcommit policy: setting vm.overcommit_memory to 1 tells the kernel to always allow allocations, instead of applying its default heuristic check (value 0):

echo 1 > /proc/sys/vm/overcommit_memory
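Writing to /proc/sys requires root privileges; an equivalent one-liner is:

sudo sysctl -w vm.overcommit_memory=1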

To permanently change this setting from 0 to 1, add this line to /etc/sysctl.conf:

vm.overcommit_memory=1
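The file is only read at boot, so to apply the new setting immediately without rebooting:

sudo sysctl -p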

After making this change, the R process no longer gets killed.
