I need to calculate the timepoint-by-timepoint root mean square (RMS) of a grand-averaged ERP.
My script currently computes the grand-averaged ERP with the mne.grand_average function, which returns an Evoked object. How do I calculate the RMS of my grand-average waveform timepoint by timepoint? I have tried the following:
rms = np.sqrt((ga.data**2).mean())
But this just calculates a single overall RMS value. I think this is because mean() is applied to the whole ga.data array at once, so the temporal dimension is collapsed. How do I compute the RMS separately for each timepoint? Any suggestion would be much appreciated!
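For context, the grand average is computed roughly like this (evokeds is a placeholder for my list of per-subject Evoked objects):

import mne

# evokeds: list of per-subject Evoked objects (placeholder name)
ga = mne.grand_average(evokeds)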
The call to mean() is the problem here: it computes the mean across all elements of the array. However, you want the mean of the squared values at each time point separately, so you need to tell mean() which axis to average over. Something like this should do the trick:
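import numpy as np

# ga.data has shape (n_channels, n_times); averaging over axis=0 (the channel
# axis) leaves one mean-square value per timepoint.
rms = np.sqrt((ga.data ** 2).mean(axis=0))  # shape: (n_times,)

The result has one value per sample, so you can plot it directly against ga.times.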