Platform: Windows-10-10.0.19041-SP0
Python: 3.7.6 (default, Jan 8 2020, 20:23:39) [MSC v.1916 64 bit (AMD64)]
mne: 0.23.0
Dear MNE users,
I am new to MNE and I am trying to plot ERP data processed with EEGLAB (i.e., a MATLAB file with 128 rows (channels) and 513 columns (time points)). I have created an mne.EvokedArray object and could plot it fine (see the attached figure) with the following code:
import mne
import scipy.io

samplesfile = scipy.io.loadmat("path/mat_file.mat")  # read the .mat file into a dictionary
samples = samplesfile['data_mean2']  # extract the NumPy array (128 channels x 513 time points)

biosemi_montage = mne.channels.make_standard_montage('biosemi128')
length_channels = len(biosemi_montage.ch_names)
sampling_freq = 512
ch_types = ['eeg'] * 128

info = mne.create_info(ch_names=biosemi_montage.ch_names, ch_types=ch_types, sfreq=sampling_freq)
evoked = mne.EvokedArray(samples, info, tmin=-0.2, nave=160, kind='average', comment='Tone')
evoked.set_montage(biosemi_montage)

mne.viz.plot_compare_evokeds(evoked, picks='C23', colors=dict(Tone='red'),
                             linestyles=dict(Tone='solid'))
I have only two questions I couldn't solve:
- How can I plot the CI of the ERP? Given that my data file contains data already averaged across subjects, I considered using a callable for the CI. But when I tried to pass some random data through a callable (see the example below), no shaded CI appeared around the ERP.
import numpy as np

def ci95():
    # random integers just to see whether any confidence band appears at all
    return np.random.randint(500000000, size=(128, 513))

mne.viz.plot_compare_evokeds(evoked, picks='C23', colors=dict(Tone='red'),
                             linestyles=dict(Tone='solid'), ci=ci95)
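In case my intention is not clear, this is only my reading of the documentation (that the callable receives the stacked data as (n_observations, n_times) and should return lower/upper bounds of shape (2, n_times), computed per condition when a list of evokeds is passed). The list evokeds_per_subject below is hypothetical; my file only contains the grand average, which may be exactly the problem:

import numpy as np

def ci95_bounds(data):
    # my reading of the docs: `data` is (n_observations, n_times) and the
    # callable should return the lower/upper band as a (2, n_times) array
    mean = data.mean(axis=0)
    sem = data.std(axis=0, ddof=1) / np.sqrt(data.shape[0])
    return np.stack([mean - 1.96 * sem, mean + 1.96 * sem])

# hypothetical usage with a list of per-subject Evoked objects:
# mne.viz.plot_compare_evokeds(dict(Tone=evokeds_per_subject), picks='C23',
#                              colors=dict(Tone='red'), ci=ci95_bounds)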
- I don't know why my data is scaled to those values on the Y axis (please see the same attached figure). When plotting topomaps, I could apply scalings=dict(eeg=1) and get the colorbar limits to [-3, 3]. But I cannot figure out how to scale the plot_compare_evokeds figure. I saw in the documentation that the ylim parameter sets the "Y-axis limits for plots (after scaling has been applied)". But when I set this parameter in plot_compare_evokeds to, for instance, ylim=dict(eeg=[-20, 20]), I got an error that says:
Image size of 510x54826907 pixels is too large. It must be less than 2^16 in each direction.
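For completeness, this is the call that raises that error (the other arguments are the same as in my working call above; I am not sure whether passing ylim this way is even correct, which may be part of the problem):

mne.viz.plot_compare_evokeds(evoked, picks='C23', colors=dict(Tone='red'),
                             linestyles=dict(Tone='solid'),
                             ylim=dict(eeg=[-20, 20]))  # -> "Image size ... is too large"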
I am sure I am missing something, but I have read many forums and a lot of documentation and I couldn't solve it.
Thanks in advance for any help on these issues.
Best,
Fer