Creating EpochsArray with explicit timestamps instead of sampling frequency

Hello,

I have EEG epochs stored in MATLAB files with the following arrays:

  • allTargets: array of all target epochs with shape (n_epochs, n_samples, n_channels),
  • allNTargets: array of all non-target epochs,
  • tSCALE: array of sample timestamps (spanning [-0.2, 1] seconds) with shape (1, n_samples).

As the EpochsArray constructor requires an Info object, I created one with mne.create_info. Because I don’t have the sampling frequency explicitly, I calculated it as sfreq = n_samples / 1.2 (the epochs span 1.2 s). However, when I compare the tSCALE array with the epochsArray.times array (with np.allclose), they are not even close.
Is it possible to specify the timestamps explicitly? Can I edit the EpochsArray.times attribute directly?
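
Roughly, this is what I have so far (a sketch; the file path is a placeholder and the channel names are simplified to auto-generated ones):

import numpy as np
import scipy.io
import mne

subject = scipy.io.loadmat('subject01.mat')  # placeholder path
targets = subject['allTargets']              # (n_epochs, n_samples, n_channels)
t_scale = subject['tSCALE'][0]

n_samples = t_scale.shape[0]
sfreq = n_samples / 1.2  # epochs span -0.2 to 1 s, i.e. 1.2 s

info = mne.create_info(ch_names=targets.shape[2], ch_types='eeg', sfreq=sfreq)
epochs = mne.EpochsArray(targets.transpose(0, 2, 1), info, tmin=-0.2)
print(np.allclose(t_scale, epochs.times))  # -> False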

I am sorry if this has already been answered somewhere; I tried to search for it, but maybe I used the wrong keywords.
I really appreciate any help you can provide.

Just a quick sanity check: did you do EpochsArray(..., tmin=-0.2)? Can you tell us more about how “not even close” the arrays are? (Do you get a constant value if you subtract one from the other? If so, what value?)
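
For example, something like this (assuming both arrays are 1-D and the same length):

diff = tSCALE - epochs.times
print(diff.min(), diff.max())  # a constant offset shows up as min == max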

Yes, I did.

I typed the sampling-frequency formula into a calculator and got 511.66…, so I tried rounding it to 512. When I create the EpochsArray with that sampling frequency, the difference between the two time arrays is an almost-constant vector with a single step in the middle, at index 306.

I’m not sure if this makes any sense. Please let me know if I should clarify anything.

That makes sense; the sampling frequency need not be an integer. I suggest that when you create the Info object you pass sfreq=n_samples / duration (i.e., whatever expression yielded the number 511.66…) and see whether that gives the correct timestamps.
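
Something like this (a sketch; n_samples, duration, and ch_names are whatever you already have):

sfreq = n_samples / duration  # non-integer sampling frequencies are fine
info = mne.create_info(ch_names=ch_names, ch_types='eeg', sfreq=sfreq)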

That is what I did earlier, but then the difference between the arrays is not constant at all.

Can you share a link to the relevant files and post a code sample showing what you’ve done so far (i.e., code that loads the files, then creates the Info and the EpochsArray)?

I created a minimal example on Colab:

Well, it seems I was fooled by PyCharm’s debugger, which told me the difference between the arrays was not constant; it actually is. Sorry for the confusion.

OK, I realized what the problem is. When creating epochs, MNE-Python requires that there be a sample at exactly time=0. So even though you specify tmin=-0.2, you end up with tmin equal to -0.19921875, because the span from tmin to zero does not accommodate a whole number of samples at your sampling frequency of 512 Hz. Put another way: in your data, sfreq * tmin comes out to -102.4 samples, so MNE-Python shifts everything by 0.4 samples to ensure a sample lands exactly at time=0.
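
To make the arithmetic concrete (a plain-Python illustration of the rounding, not MNE-Python’s literal internals):

import numpy as np

sfreq, tmin = 512.0, -0.2
print(sfreq * tmin)                  # -102.4: not a whole number of samples
first_samp = np.round(sfreq * tmin)  # -102.0: nearest whole sample
print(first_samp / sfreq)            # -0.19921875, the tmin you actually get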

The solution is to set tmin=0 when creating the epochs, and then shift times afterward:

import scipy.io
import numpy as np
from mne import create_info, EpochsArray

path = 'example.dat.mat'
subject = scipy.io.loadmat(path)

# reconstruct the sampling frequency from the original time axis:
# n_samp samples span (n_samp - 1) inter-sample intervals
orig_times = subject['tSCALE'][0]
n_samp = subject['tSCALE'].shape[-1]
tmin, tmax = orig_times[[0, -1]]
duration = tmax - tmin
sfreq = (n_samp - 1) / duration
# compare floats with a tolerance rather than exact equality
assert np.isclose(sfreq, 1 / np.diff(orig_times)[0])

# channel names come from a MATLAB cell array of strings
ch_names = [x[0] for x in subject['electrodes'].flat]
ch_types = ['eeg'] * len(ch_names)

info = create_info(ch_names=ch_names, ch_types=ch_types, sfreq=sfreq)
info.set_montage('standard_1020')

# transpose to the (n_epochs, n_channels, n_samples) layout EpochsArray expects
data = subject['allTARGETS'].transpose(0, 2, 1)
n_epochs = data.shape[0]
n_samples = data.shape[2]

# events array columns: [onset sample, previous value, event id]; the onset
# samples are fictitious, spaced n_samples apart so the epochs don't overlap
event_no = 1
events_dict = {'target': event_no}
events = np.column_stack((
    np.arange(0, n_epochs * n_samples, n_samples),
    np.zeros(n_epochs, dtype=int),
    np.full(n_epochs, event_no)
))
epoch = EpochsArray(
    data,
    info,
    tmin=0,               # <- changed: guarantees a sample at exactly time=0
    baseline=(0, -tmin),  # <- this window becomes (tmin, 0) after the shift below
    events=events,
    event_id=events_dict
)

# move the time axis back so it starts at the true tmin;
# relative=False means "set the first sample's time to tmin"
epoch = epoch.shift_time(tmin, relative=False)

np.testing.assert_allclose(orig_times, epoch.times)
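
A couple of optional sanity checks after the shift:

print(epoch.tmin, epoch.tmax)       # should match orig_times[0] and orig_times[-1]
print(1 / np.diff(epoch.times)[0])  # should equal sfreq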

Okay, good to know there has to be a sample at time=0.
Thank you for your help.