Too short epochs

Dear MNE-team,

I've been preprocessing my EEG data in MNE with standard steps: high-pass filtering (1 Hz), line-noise removal, bad-channel removal, ICA, and downsampling. When I then epoch the data, the majority of epochs are dropped (40 out of 70 at a sampling frequency of 250 Hz, and 60 out of 70 at 100 Hz). The drop_log indicates that all of these epochs were dropped because they were too short. I've seen a previous post on the mailing list where tmin and tmax were specified incorrectly, but I did specify tmin and tmax in seconds, and changing the epoch length (even to 10 seconds) does not change the number of epochs dropped. So I suspect there might be an error somewhere? Do you have any insight into what might be causing this?
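
For reference, the epoching step looks roughly like this (the file name, stim channel name, and tmin/tmax values are placeholders, not my exact settings):

import mne

raw = mne.io.read_raw_fif('preprocessed_raw.fif', preload=True)
events = mne.find_events(raw, stim_channel='STI 014')
epochs = mne.Epochs(raw, events, tmin=-0.2, tmax=0.8,
                    baseline=(None, 0), preload=True)
print(epochs.drop_log)  # shows 'TOO_SHORT' for the dropped epochs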

Thanks for your help,

Sebastian

I've been preprocessing my EEG data in MNE with standard steps: high-pass
filtering (1 Hz), line-noise removal, bad-channel removal, ICA, and
downsampling

Downsampling the raw data, or when constructing epochs with `decim`, or after
creating epochs with `epochs.decimate`? Generally, downsampling / resampling
raw is discouraged...
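
If you do need a lower rate, a safer pattern is to low-pass and then decimate at the epoching stage, so the event sample indices stay consistent with the data. A sketch (the filter cutoff and decimation factor are just examples):

import mne

# `raw` is your preprocessed Raw at its original sampling frequency
raw.filter(l_freq=None, h_freq=40.)  # anti-alias low-pass first
events = mne.find_events(raw)        # events at the original sfreq
epochs = mne.Epochs(raw, events, tmin=-0.2, tmax=0.8,
                    decim=4)         # e.g. 1000 Hz -> 250 Hz
# or, after constructing the epochs: epochs.decimate(4)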

When I then epoch the data, the majority of epochs are dropped (40 out of
70 at a sampling frequency of 250 Hz, and 60 out of 70 at 100 Hz). The
drop_log indicates that all of these epochs were dropped because they were
too short.

This can happen if you resample raw and then don't adjust your `events`
array to compensate; perhaps that is what happened?
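
Concretely, with hypothetical numbers: if you record 40 minutes at 1000 Hz and resample to 250 Hz, an event whose sample index was found on the original data can point past the end of the resampled data, and every epoch around it is then dropped as TOO_SHORT:

n_orig = 40 * 60 * 1000   # 2,400,000 samples at 1000 Hz
n_new = 40 * 60 * 250     #   600,000 samples at 250 Hz
event_sample = 1_200_000  # an event 20 minutes in, at the ORIGINAL rate
print(event_sample >= n_new)  # True -> epoch falls outside the data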

Eric

Yes, I downsampled the raw data using the resample method. I did this
because I am using pyprep for the initial stages of preprocessing, and the
data come from an experiment that runs for 40 minutes, so the memory
requirements become huge. Is there a way to downsample the data and avoid
this issue? How does the events array need to be adjusted?

Instead of resampling, I would first try to work around the memory issues by
loading the raw data using memmapping, e.g. with preload='./tempfile' in
read_raw_fif <https://mne.tools/stable/generated/mne.io.read_raw_fif.html>
(or whatever reading function you're using). It will load all data into a
temporary array/file on disk rather than into memory, and modifications you
make to the data (filtering, etc.) will be made on that temporary copy on
disk instead of in memory. I'm not sure to what extent pyprep avoids making
other copies of the data, though, so this might not fix the problem.
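
Something like this (the file names here are placeholders):

import mne

# memory-map the data to a temporary file instead of loading it into RAM
raw = mne.io.read_raw_fif('sub-01_raw.fif', preload='./raw_tempfile')
raw.filter(l_freq=1., h_freq=None)  # operates on the on-disk copy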

If you absolutely have to downsample, you can either do `find_events` on
your resampled raw data (but some events might be dropped) or do something
like this to adjust the events array from the original data:

import numpy as np
events[:, 0] = np.round(events[:, 0] / ratio).astype(int)

where `ratio` is your downsampling ratio (old sampling frequency divided by
the new one). After this, if you have very closely spaced events you could
end up with duplicates in `events`, which you'll have to combine in a way
that suits your analysis.
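
For exact duplicates, something like this would work (a sketch; whether keeping the first event at each sample is appropriate depends on your design):

import numpy as np

# keep only the first event at each (now rounded) sample index
_, first_idx = np.unique(events[:, 0], return_index=True)
events = events[np.sort(first_idx)]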

Eric