Importing a .set file

Hey guys, I need some help importing .set files created with EEGLAB. The error below appears with every file that I try to use.

Some important information:

  • The .fdt file is in the same directory as the .set file
  • MNE version: 1.2.0
  • Platform: Linux-5.10.133+-x86_64-with-Ubuntu-18.04-bionic (using Colab)

This is the simplified code that I am using; the .set and .fdt files are available at the following link:
https://drive.google.com/drive/folders/18YN0t0k7D8iHbWjKPUKOXGAMn-nLhQcj?usp=sharing

from google.colab import drive
from mne.io import read_raw_eeglab
import os

drive.mount('/content/gdrive', force_remount=True)

gdrive_path = '/content/gdrive/MyDrive/MNE_forum'

eeg_matrix = []
for eeg_file in os.listdir(gdrive_path):
    # Pass only the .set header to MNE; it locates the matching .fdt itself.
    if eeg_file.endswith('.set'):
        raw_data = read_raw_eeglab(os.path.join(gdrive_path, eeg_file))
        data = raw_data.get_data()  # get_data() already returns an ndarray
        eeg_matrix.append(data)
The loop fails with this error (traceback excerpt from MNE's EEGLAB reader):

    223                 raise RuntimeError('Incorrect number of samples (%s != %s), '
    224                                    'please report this error to MNE-Python '
--> 225                                    'developers' % (block.size, count))
    226             block = block.reshape(n_channels, -1, order='F')
    227             n_samples = block.shape[1]  # = count // n_channels

RuntimeError: Incorrect number of samples (8251030 != 24999876), please report this error to MNE-Python developers

It's a problem with your data. Trying to read the file with EEGLAB, I see:

pop_loadset(): loading file /Users/alex/Downloads/MNE_forum/sub-mit040_task-Emotion_eeg.set …
Reading float file ‘/Users/alex/Downloads/MNE_forum/sub-mit040_task-Emotion_eeg.fdt’…
WARNING: The file size on disk does not correspond to the dataset, file has been truncated
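The truncation EEGLAB warns about can also be checked from Python by comparing the .fdt's size on disk with the size implied by the header. A minimal sketch, assuming the standard EEGLAB layout of one float32 (4 bytes) per value; the channel and sample counts below are made-up numbers, and in practice you would take them from the .set header (EEG.nbchan, EEG.pnts, EEG.trials):

```python
import os

def expected_fdt_bytes(n_channels, n_samples):
    """Bytes a float32 EEGLAB .fdt file should occupy for this data shape."""
    return n_channels * n_samples * 4  # 4 bytes per float32 value

# Hypothetical numbers purely for illustration:
print(expected_fdt_bytes(62, 100_000))  # → 24800000

# Compare against the actual size on disk, e.g.:
# os.path.getsize('/content/gdrive/MyDrive/MNE_forum/sub-mit040_task-Emotion_eeg.fdt')
```

If the two numbers disagree, the file really was cut short when it was written.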

Alex

Hmm, with MATLAB the import works anyway, but with MNE-Python I can't manipulate the data because of that error.

Can you suggest something I can do to use this dataset in Python?

Perhaps I can fix the data in MATLAB, or use another Python method…

How did you end up with a broken file?

I would suggest saving the file back to disk with MATLAB and then loading it with MNE.

Alex
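If re-saving from MATLAB is not convenient, one possible workaround (not an official MNE API; `read_truncated_fdt` is a name invented here, and the sketch assumes the standard EEGLAB layout of little-endian float32 values stored channel-major, the same layout the traceback reshapes with order='F') is to read the truncated .fdt directly with NumPy and drop the incomplete trailing sample, which is roughly what MATLAB's more tolerant loader ends up doing:

```python
import numpy as np

def read_truncated_fdt(fdt_path, n_channels):
    """Read an EEGLAB .fdt whose size on disk doesn't match its header.

    Assumes little-endian float32 values stored channel-major.
    Any incomplete trailing sample is silently dropped.
    """
    flat = np.fromfile(fdt_path, dtype='<f4')
    n_samples = flat.size // n_channels        # keep whole samples only
    flat = flat[: n_samples * n_channels]      # drop the ragged tail
    return flat.reshape(n_channels, n_samples, order='F')
```

Note this yields a bare (n_channels, n_samples) array, not an `mne.io.Raw` object, so the channel names, sampling rate, etc. would still have to be pulled from the .set header separately.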