- MNE-Python version: 0.22.0
- operating system: linux
Hey guys,
Is there a way to smooth epoched data using a Gaussian kernel? I searched the website and didn’t find any information about it. Any advice?
Best
Do you need a Gaussian kernel specifically, or are you just looking for a way to smooth the data? If the latter, I suppose you could simply apply a low-pass filter with a rather small h_freq
value; this should make everything look much smoother.
Otherwise, SciPy might have some tools you can use …
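For illustration, a plain-SciPy sketch of such a low-pass on synthetic data (the 30 Hz cutoff and filter order are arbitrary example values, not recommendations):

```python
import numpy as np
from scipy.signal import butter, filtfilt

sfreq = 1000.0   # sampling frequency in Hz
h_freq = 30.0    # low-pass cutoff in Hz (arbitrary example value)

# Synthetic "channel": a slow 5 Hz wave plus broadband noise
rng = np.random.default_rng(0)
times = np.arange(0, 1.0, 1.0 / sfreq)
data = np.sin(2 * np.pi * 5 * times) + 0.5 * rng.standard_normal(times.size)

# 4th-order Butterworth low-pass, applied forward and backward
# (filtfilt) so the smoothing introduces no phase shift
b, a = butter(4, h_freq / (sfreq / 2), btype="low")
smoothed = filtfilt(b, a, data)
```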
A newly-added feature in the development version is epochs.apply_function(), which could be used for this. See here: mne.Epochs — MNE 0.23.dev0 documentation
You could also extract the data as a NumPy array (epochs.get_data()), apply the filter (e.g., scipy.ndimage.gaussian_filter1d — SciPy v1.6.1 Reference Guide), and then reconstruct the epochs using mne.EpochsArray() (passing in the smoothed data and the info from the unsmoothed epochs object).
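That second route can be sketched on a synthetic array (pure NumPy/SciPy here; the final mne.EpochsArray step is shown only as a comment, since it needs a real info object):

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

# Fake epoched data with MNE's layout: (n_epochs, n_channels, n_times)
rng = np.random.default_rng(0)
data = rng.standard_normal((10, 4, 500))

# gaussian_filter1d smooths along axis=-1 by default, i.e. over time,
# which is exactly the axis we want for epoched data.
# Note: sigma is in samples, not milliseconds.
sigma = 10 / 2.3548          # FWHM of 10 samples -> sigma in samples
smoothed = gaussian_filter1d(data, sigma=sigma, axis=-1)

# With real data this would be something like:
#   data = epochs.get_data()
#   smoothed = gaussian_filter1d(data, sigma=sigma, axis=-1)
#   epochs_sm = mne.EpochsArray(smoothed, epochs.info)
```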
Thank you Richard, and yeah, I need a Gaussian kernel specifically, based on the analysis methods in the literature I’m following.
Thank you Dan, it seems like the first way is easier. Would it work something like this?
fwhm=10
sigma=fwhm/2.3548
args = (epochs, sigma)
epochs.apply_function(scipy.ndimage.gaussian_filter1d(), picks='meg', n_jobs=8, *args)
Something like this will work:
from scipy.ndimage import gaussian_filter1d
fwhm = 10
sigma = fwhm / 2.3548
epochs.apply_function(gaussian_filter1d, sigma=sigma)
Okay, I will try that. Thank you very much!
Hi Dan,
I have a follow-up question. In my code, I downsampled my data first, from 1000 Hz to 200 Hz, and then intended to smooth the data using a 10 ms window. Here is my code snippet:
rs_rate = 200
fwhm = 10
epoch_ds_learn = epoch_learn.resample(rs_rate, n_jobs=12)
from scipy.ndimage import gaussian_filter1d
sigma = fwhm / 2.3548
epochs_sm_learn = epoch_ds_learn.apply_function(gaussian_filter1d, n_jobs=12, sigma=sigma)
My question is: since my data has been downsampled from 1000 Hz to 200 Hz, the interval between two adjacent time points went from 1 ms to 5 ms. When I set fwhm = 10, is each smoothed time point the result of 5 ms before and after it (a 10 ms window), or 25 ms before and after (a 50 ms window)?
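Note that scipy.ndimage.gaussian_filter1d interprets sigma in samples, not milliseconds, so a window specified in milliseconds should first be converted using the current sampling rate. A minimal sketch of that conversion (variable names are illustrative):

```python
# Convert a smoothing window given in milliseconds into the
# sigma (in samples) that gaussian_filter1d expects.
sfreq = 200.0      # sampling frequency after downsampling, in Hz
fwhm_ms = 10.0     # desired FWHM of the Gaussian, in ms

fwhm_samples = fwhm_ms * sfreq / 1000.0   # 10 ms -> 2 samples at 200 Hz
sigma = fwhm_samples / 2.3548             # FWHM -> standard deviation
```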
Honestly @YuZhou, I would stick to standard filtering that you can parametrize with frequency cut-offs etc.
you can use mne.Epochs — MNE 0.24.dev0 documentation for example.
Alex
Hi Alex,
Thank you so much for your reply! So savgol_filter(h_freq, verbose=None) seems to do a better job than gaussian_filter1d, or will the results be almost the same?
If I use savgol_filter(h_freq, verbose=None), I still need to downsample, right?
If I want to smooth the data with a 10 ms window after downsampling to 200 Hz, what value should I set for h_freq in savgol_filter(h_freq, verbose=None)?
If you say 10 ms bins for averaging, I hear signals with at most half a cycle in 10 ms. That means full cycles
of 20 ms at most, which means a cutoff at 50 Hz for h_freq. But I would say you can use 40 Hz, since the filter is not
perfect, to avoid ringing, or 25 Hz if you want a quarter of a cycle in 10 ms.
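As an illustration of this kind of cutoff, here is a sketch using scipy.signal.savgol_filter directly on synthetic data. MNE’s epochs.savgol_filter(h_freq) wraps the same idea; the window-length rule and polyorder below are assumptions for demonstration, not MNE’s exact internal choices:

```python
import numpy as np
from scipy.signal import savgol_filter

sfreq = 200.0    # sampling rate after downsampling, in Hz
h_freq = 40.0    # approximate low-pass cutoff, in Hz

# Rough rule (assumption): window of about one cycle of the cutoff frequency
window = int(round(sfreq / h_freq))   # 5 samples here
if window % 2 == 0:                   # savgol_filter needs an odd window length
    window += 1

# Synthetic signal: 5 Hz sine plus noise, 2 s long
rng = np.random.default_rng(0)
data = (np.sin(2 * np.pi * 5 * np.arange(400) / sfreq)
        + 0.3 * rng.standard_normal(400))

# polyorder must be < window_length; a low order gives stronger smoothing
smoothed = savgol_filter(data, window_length=window, polyorder=2, axis=-1)
```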
HTH
Alex
So if I set h_freq = 50, does that mean the data in every 20 ms time window will be smoothed? Actually, I don’t know how long a cycle should be in my case; I just set the smoothing window length following papers that did work similar to mine.
My other question is: does savgol_filter() do a better job than gaussian_filter1d theoretically, or do they basically have the same effect?
Doing gaussian_filter1d on a signal is really odd to me. It’s designing a filter without
looking at the sampling frequency and without clearly controlling the filter cutoff.
Just use a proper filter design.
savgol will do a low-pass filter like gaussian_filter1d, so it will smooth the signals.
my 2c
Alex
Okay, thank you so much for your advice and patience. I will try savgol_filter().
Best
Yu