[Biomod-commits] Memory limitations using R

Robin Engler robin.engler at gmail.com
Tue Nov 10 14:16:26 CET 2009

Dear Alexandre,

The way I work around this is by dividing my area of projection into
smaller parts and running the projections on each part individually
(they can then be merged together later on again if needed). You can
program a loop to do this automatically.
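A minimal sketch of that chunking loop, assuming your grid is held as a
data frame of cells; `run_projection` is a hypothetical placeholder for
whatever per-chunk call you use (e.g. BIOMOD's Projection applied to a
subset of the grid), not an actual BIOMOD function:

```r
# Split a large projection area into chunks, run the projection on each
# chunk in turn, and merge the results back together at the end.
project_in_chunks <- function(grid, chunk_size, run_projection) {
  n <- nrow(grid)
  starts <- seq(1, n, by = chunk_size)
  pieces <- lapply(starts, function(s) {
    e <- min(s + chunk_size - 1, n)
    part <- grid[s:e, , drop = FALSE]   # one chunk of the grid
    res <- run_projection(part)
    rm(part); gc()                      # free the chunk before the next one
    res
  })
  do.call(rbind, pieces)               # merge per-chunk results
}
```

Calling gc() between chunks keeps only one chunk's worth of working
data in memory at a time, which is the whole point of the workaround.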
Although memory allocation is a common problem in R, I'm surprised
that you hit it at only 500,000 grid points. Make sure your code is
very clean and that no unnecessary objects are loaded into memory;
this will increase the memory available to the projection.
I also know that Wilfried Thuiller is currently working on enabling
projections to be made directly on grid objects in R, and I assume
this is likely to bring improvements in memory management. It might
take a few more weeks or months before that feature is available, however.
Hope this helps,


Robin Engler
Spatial Ecology Group
University of Lausanne

On Tue, Nov 10, 2009 at 3:52 AM, alexandre sampaio
<alex_bonesso at yahoo.com.br> wrote:
> Hi,
> I am trying to run a BIOMOD projection to an area of 500,000 grid
> cells. The "Models" function works fine but every time I try to run it
> I get the message "Error: cannot allocate vector of size 317.8 Mb".
> I tried to increase the memory limit which is initially:
>> memory.limit()
> [1] 1535.875
> I am running R in a windows system with 3GB RAM, so I increased the R
> memory limit to 3GB and I still got the same error message after
> running the BIOMOD function "Projection".
> I also tried running R on Linux, but it did not help.
> Is there any way to run a BIOMOD projection with this amount of data?
> Thanks for the help!
> Alexandre
> _______________________________________________
> Biomod-commits mailing list
> Biomod-commits at lists.r-forge.r-project.org
> https://lists.r-forge.r-project.org/cgi-bin/mailman/listinfo/biomod-commits
