I have a saved stc object containing source localization data from 0 to 1 s. I would like to divide this 1-second window into ten 100-ms intervals, compute the mean of each interval, and then plot the results.
My stc data matrix has 20484 rows and 200 columns. Do these 200 columns correspond to time points? If so, how can I extract the 100-ms intervals from this matrix? (I want to compute the mean over every 100 ms of the stc timeline.)
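To make my intent concrete, here is a rough sketch of the kind of averaging I have in mind, assuming the 200 columns really are evenly spaced time samples (5 ms apart over 1 s), so each 100-ms interval covers 20 columns. The `data` array below is just a random stand-in for `stc.data`:

```python
import numpy as np

# Stand-in for stc.data: 20484 source vertices x 200 time samples (0-1 s).
data = np.random.rand(20484, 200)

# Assumption: 200 samples over 1 s -> 5 ms per sample, 20 samples per 100 ms.
n_windows = 10

# Reshape to (vertices, windows, samples_per_window) and average each window.
means = data.reshape(data.shape[0], n_windows, -1).mean(axis=2)

print(means.shape)  # one mean per vertex per 100-ms interval
```

Is this the right way to think about it, or is there a built-in way to do this windowed averaging on the stc object itself (e.g., via its `times` attribute)?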