Dear friends,
I am a new MNE Python user, so excuse me for trivial questions.
1. Is it possible to reconstruct the source for an integrated time window
(e.g., for the auditory N1, from 80 to 120 ms), in order to use the same
time window for the comparison among experimental conditions (e.g.,
attention vs. no attention)?
2. If yes, how to extract the time course of this source, and then to
obtain peak latency and coordinates of a peak vertex, and the amplitude
value for the estimated time window?
3. How to get a grand average for these source estimates?
Thank you so much for your help!
Irina
> I am a new MNE Python user, so excuse me for trivial questions.
> 1. Is it possible to reconstruct the source for an integrated time window
> (e.g., for the auditory N1, from 80 to 120 ms), in order to use the same
> time window for the comparison among experimental conditions (e.g.,
> attention vs. no attention)?
What do you mean by this? Reconstruct the average from 80 to 120 ms?
If you just want to limit the time window you reconstruct, you can use the
Evoked.crop method.
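For example (a minimal sketch; the file name and condition are made up):

import mne

# hypothetical evoked file for one condition of one subject
evoked = mne.read_evokeds('S1_attention-ave.fif', condition=0)

# keep only the N1 window (80-120 ms) before source reconstruction
evoked_n1 = evoked.copy().crop(tmin=0.08, tmax=0.12)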
> 2. If yes, how to extract the time course of this source, and then to
> obtain peak latency and coordinates of a peak vertex, and the amplitude
> value for the estimated time window?
An stc has data (the time courses) and vertices (vertno) that tell you which
cortical vertex each time course corresponds to.
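A sketch of what that looks like (assuming stc is a SourceEstimate for one
subject and condition):

import numpy as np

# stc.data holds the time courses, shape (n_vertices, n_times); stc.vertices
# (vertno) lists which lh/rh cortical vertices the rows correspond to
print(stc.data.shape, [len(v) for v in stc.vertices])

# peak vertex number and peak latency within the N1 window, left hemisphere
peak_vertex, peak_time = stc.get_peak(hemi='lh', tmin=0.08, tmax=0.12)

# amplitude at that peak: lh rows come first in stc.data, so the row of a
# left-hemisphere vertex is its position within stc.vertices[0]
row = np.searchsorted(stc.vertices[0], peak_vertex)
col = np.argmin(np.abs(stc.times - peak_time))
amplitude = stc.data[row, col]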
> 3. How to get a grand average for these source estimates?
A grand average over subjects? You need to morph the stcs to fsaverage first.
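Something along these lines, per subject (a sketch; stc, subject_name and
subjects_dir are the per-subject variables):

# morph one subject's source estimate to the fsaverage brain
morph = mne.compute_source_morph(stc, subject_from=subject_name,
                                 subject_to='fsaverage',
                                 subjects_dir=subjects_dir)
stc_fsaverage = morph.apply(stc)  # this is what you average across subjects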
Hi,
Thank you very much for your reply,
Yes, I need to reconstruct the average from 80 to 120 ms (to average the
whole-head source estimates over time - in this example, 40 time points at a
sampling rate of 1000 Hz).
Irina.
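In that case something like this should work (a sketch, assuming stc covers
the whole epoch):

# restrict the source estimate to the N1 window and average over time;
# the result has a single time point (the 80-120 ms mean)
stc_n1 = stc.copy().crop(tmin=0.08, tmax=0.12)
stc_n1_mean = stc_n1.mean()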
Hi Alexandre,
Thank you very much for your help!
I prepared morphed (to fsaverage) stcs for all subjects. Which function
should I use to get the grand average?
Thank you!
Sorry,
I did not understand "get the stc for the full window".
I tried this:
morph_group = [morphS1, morphS2, morphS3, morphS4]
grand_average = mne.SourceEstimate.mean(morph_group)
returns error: "AttributeError: 'list' object has no attribute 'sum'"
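SourceEstimate.mean() averages over time within a single stc, so it cannot be
given a list. One way to average across subjects (a sketch, assuming the list
holds morphed SourceEstimates that share the fsaverage vertices):

import numpy as np
import mne

stcs = [morphS1, morphS2, morphS3, morphS4]  # morphed SourceEstimates

# average the data arrays and wrap the result in a new SourceEstimate
data = np.mean([s.data for s in stcs], axis=0)
grand_average = mne.SourceEstimate(data, vertices=stcs[0].vertices,
                                   tmin=stcs[0].tmin, tstep=stcs[0].tstep,
                                   subject='fsaverage')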
I am very sorry for bothering you again,
I tried both options, and got the same error message:
TypeError: unsupported operand type(s) for +: 'SourceMorph' and
'SourceMorph'
Possibly there was a mistake in the morphing procedure.
1. I loaded the individual stcs:
stc = mne.read_source_estimate()
2. I computed morphed stcs:
morph = mne.compute_source_morph(stc, subject_from=subject_name,
subject_to='fsaverage', subjects_dir=subjects_dir)
3. saved them as *.h5:
morph.save()
4. Then I loaded all the morphed stcs:
morphS1 = mne.read_source_morph(fileFolderSTC_morph + 'S1_stc-morph.h5')
morphS2=.........
5. and then tried to average:
grand_average = (morphS1 + morphS2 + morphS3 + morphS4) / 4
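For what it's worth, read_source_morph returns SourceMorph operators, not
source estimates, so they cannot be summed; the morph has to be applied to
each subject's stc first. A sketch of the corrected last steps (the stcS1-4
names are hypothetical, one stc per subject):

# apply each saved morph to the corresponding subject's source estimate
stc_fsS1 = morphS1.apply(stcS1)
stc_fsS2 = morphS2.apply(stcS2)
stc_fsS3 = morphS3.apply(stcS3)
stc_fsS4 = morphS4.apply(stcS4)

# the morphed SourceEstimates support + and / with a scalar, so the grand
# average can now be computed directly (or by averaging the .data arrays)
grand_average = (stc_fsS1 + stc_fsS2 + stc_fsS3 + stc_fsS4) / 4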
Dear Alexandre,
Is there a way to convert MNI coordinates TO the vertex number? I have
group mean x, y, z values for different experimental conditions, and I
would like to show them on the standard (fsaverage) brain.
Thank you very much for your kind help,
Irina.
But you'll probably want to plot using the `white` surface, assuming that's
what you used for source localization, otherwise the location won't make
much sense.
If you want the vertex number, you can load the fsaverage/surf/lh.white
and/or rh.white surfaces using `mne.read_surface`, and find the
index/vertex number of the nearest surface vertex to each of your x/y/z
points. Once you have this, you can use `add_foci` with
`coords_as_verts=True` and pass the vertex number -- and this will work
regardless of which surface you're using to plot.
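A sketch of that recipe (the coordinate is made up; it assumes your MNI values
are in mm and close to fsaverage surface RAS, which is roughly true for
fsaverage, and a recent MNE where mne.viz.Brain is available - with PySurfer,
surfer.Brain works the same way):

import os.path as op
import numpy as np
import mne

# hypothetical group-mean MNI coordinate (mm) on the left hemisphere
mni_xyz = np.array([-52., -19., 7.])

# load the fsaverage white surface; rr holds the vertex positions in mm
surf_fname = op.join(subjects_dir, 'fsaverage', 'surf', 'lh.white')
rr, tris = mne.read_surface(surf_fname)

# index of the nearest surface vertex = the vertex number
vertno = int(np.argmin(np.linalg.norm(rr - mni_xyz, axis=1)))

# plot on any fsaverage surface and mark that vertex
brain = mne.viz.Brain('fsaverage', hemi='lh', surf='inflated',
                      subjects_dir=subjects_dir)
brain.add_foci(vertno, coords_as_verts=True, hemi='lh', color='red')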