Preloading raw array too big

Hi mne list,

I'm new to MNE and Python, and I'm having some problems preloading my Neuromag data (a 2 GB file).

When using mne.fiff.Raw(filename, preload=True) I get a NumPy error saying the array is too big. When I instead try to memory-map the data by passing a file name string to preload, I get the error
    221 fid.seek(bytes - 1, 0)
    222 fid.write(asbytes('\0'))
    223 fid.flush()
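
For completeness, this is roughly what I'm running; the file names are placeholders I made up:

    import mne

    filename = 'sub01_run1_raw.fif'  # placeholder for my 2 GB Neuromag file

    # Attempt 1: load everything into RAM (fails with "array is too big"):
    raw = mne.fiff.Raw(filename, preload=True)

    # Attempt 2: memory-map the data to a scratch file on disk instead
    # ('raw_tmp.dat' is just a path I picked):
    raw = mne.fiff.Raw(filename, preload='raw_tmp.dat')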

Is there a workaround? I have two files from each participant, so I would like to concatenate them at some point in the process for cleaning.
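
Roughly what I'd like to end up with (file names are placeholders, and I'm guessing mne.concatenate_raws is the right function for joining the runs):

    import mne

    # Memory-map each run to its own scratch file:
    raw1 = mne.fiff.Raw('sub01_run1_raw.fif', preload='run1_tmp.dat')
    raw2 = mne.fiff.Raw('sub01_run2_raw.fif', preload='run2_tmp.dat')

    # Concatenate the two runs (modifies raw1 in place and returns it):
    raw = mne.concatenate_raws([raw1, raw2])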

Thanks for any suggestions.

Peter

Hi Peter,

Do you have a 64-bit Python setup?
How much RAM do you have?
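
A quick way to check whether the interpreter itself is 64-bit (this just prints the pointer size of the running Python):

    import struct
    print(struct.calcsize('P') * 8)  # 64 on a 64-bit Python, 32 on a 32-bit one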

Alex

Hi Alex,

Thanks for getting back to me so quickly and solving the problem.

It turns out I'm using a 32-bit version of Linux...

Time for a re-install.

Peter