Running mne.stats.permutation_cluster_test now raises an IndexError.

  • MNE-Python version: 0.24.0
  • operating system: Windows 10

Hi there,

I had a script, written against version 0.19.2, that used the mne.stats.permutation_cluster_test function. I recently changed my laptop and installed the latest MNE version. The code worked previously, but now running mne.stats.permutation_cluster_test produces the following error.


File "C:\Users\angjw\Dropbox\MEG pipeline\MEEG Revamped\Group Analysis_v4d_hilbert.py", line 1246, in <module>
    T_obs, clusters, cluster_p_values, H0 = mne.stats.permutation_cluster_test(

  File "<decorator-gen-313>", line 24, in permutation_cluster_test

  File "C:\Users\angjw\anaconda3\envs\mne\lib\site-packages\mne\stats\cluster_level.py", line 1121, in permutation_cluster_test
    return _permutation_cluster_test(

  File "C:\Users\angjw\anaconda3\envs\mne\lib\site-packages\mne\stats\cluster_level.py", line 1006, in _permutation_cluster_test
    step_down_include[clusters[ti]] = False

IndexError: too many indices for array: array is 1-dimensional, but 2 were indexed

May I ask what has changed, and how I should adjust my data input to fit the latest API?

Update: the latest version that works seems to be v0.22. From 0.23 onward, this error appears.

Hi,

There are five releases between 0.19 and 0.24. We try to guarantee
backward compatibility for two versions when there is an API change.

You can also have a look at https://mne.tools/stable/whats_new.html to see the changes
we have made.

If you share a reproducible example, I can have a look.

Alex

Ah okay, I managed to pinpoint the specific issue.

It seems the error goes away when I leave the step_down_p value at its default.

My initial script set this value to 0.05, which triggered the error. The same setting was fine in v0.22 and earlier.

Below is a snippet of my script (it's a little messy) for reference. I suspect the error can be reproduced on any simpler setup just by tweaking the step_down_p value.

import os

import numpy as np
import matplotlib.pyplot as plt

import mne

from mne.time_frequency import tfr_morlet
from mne.stats import permutation_cluster_test

from IPython import get_ipython

get_ipython().run_line_magic('matplotlib', 'qt')

## We set the log-level to 'warning' so the output is less verbose
mne.set_log_level('warning')

data_path = os.path.expanduser("C:\\Users\\angjw\\Dropbox\\MEG pipeline\\MEEG Revamped\\EBLSN\\EBLSN\\Test output\\")

save_dir_averages = data_path + 'grand_averages\\'

readfile = os.listdir(data_path)

## Averaging
#combineSensors = True 

## Time settings
tmin = -2.0
tmax = 3.0

## Freq settings
MinFreq = 1
MaxFreq = 12.5

decim = 1 

contrastpair = ['volblinks/4', 'blanks/3'] 

ch_name = ['MEG2013']#,'MEG2012','MEG1833','MEG1832','MEG2043','MEG2042',
          #'MEG2022','MEG2023','MEG2033','MEG2032']

combineSensors = True #Combine the (average) data across the list of ch_name above rather 
                        # than print them one by one.

baseline_mode = 'zscore' 
baseline = (1.5, 2.5)

epoch_set = []
extracted_epoch_set = []

for file in readfile:
    # Append filename to the list if it ends with -epo.fif
    if file.endswith('-epo.fif'):
        epoch_set.append(file)
        
print('No. of epoch data:', len(epoch_set))       

extracted_epoch_set = []
tfr1_power_set = []
tfr2_power_set = []
all_power_set = []
all_power_set2 = []
all_evoked_set = []
all_evoked_set2 = []
all_epochs_set = []
all_epochs_set2 = []

for extractEpoch in epoch_set:
    epochs = mne.read_epochs(data_path + extractEpoch)
    epochs.apply_proj()
    epochs.crop(tmin, tmax)  # crop the time to remove edge artifacts
    epochs.pick_channels(ch_name)
    extracted_epoch_set.append(epochs)              

for generatePower in extracted_epoch_set:

    # define frequencies of interest (log-spaced)
    freqs = np.logspace(*np.log10([MinFreq, MaxFreq]), num=10)
    # Cycles per frequency: 1 favours temporal resolution, freqs / 2. favours
    # spectral resolution.
    n_cycles = freqs / 2.
    
    # Was itc false, average false, no itc, .average()
    # Aim to make this process consistent with contrastplot.
    tfr_epochs_1 = tfr_morlet(generatePower[contrastpair[0]], freqs,
                              n_cycles=n_cycles, decim=decim, use_fft=False,
                              return_itc=False, average=False)
    
    tfr_epochs_2 = tfr_morlet(generatePower[contrastpair[1]], freqs,
                              n_cycles=n_cycles, decim=decim, use_fft=False,
                              return_itc=False, average=False)
    

    tfr_epochs_1.apply_baseline(mode=baseline_mode, baseline=baseline)
    tfr_epochs_2.apply_baseline(mode=baseline_mode, baseline=baseline)
    
    ## Calculate mean of all epochs
    ## this_tfr1.data.shape = one array of 10 frequencies over 501 (10ms) time bins.
    this_tfr1 = tfr_epochs_1.average()
    this_tfr2 = tfr_epochs_2.average()
    
    ## Need a dedicated and separate evoked item to edit it without affecting original this_tfr.
    averagingArray1 = tfr_epochs_1.average()
    averagingArray2 = tfr_epochs_2.average()  
    
    evoked_1 = generatePower[contrastpair[0]].average()
    evoked_2 = generatePower[contrastpair[1]].average()
    
    ## Calculate mean activation of all sensors.
    epochs_power_1 = this_tfr1.data.mean(axis=0)
    epochs_power_2 = this_tfr2.data.mean(axis=0)   
    
  
    all_epochs_set.append(epochs_power_1)  
    all_epochs_set2.append(epochs_power_2)  
  
    all_power_set.append(averagingArray1)
    all_power_set2.append(averagingArray2)   
    
    all_evoked_set.append(evoked_1)
    all_evoked_set2.append(evoked_2)
        

    #tfr1_power_set.append(epochs_power_1)
    #tfr2_power_set.append(epochs_power_2)

T_obs, clusters, cluster_p_values, H0 = mne.stats.permutation_cluster_test(
                              [np.array(all_epochs_set), np.array(all_epochs_set2)],
                               n_permutations=1024, threshold=None, 
                               step_down_p=0.05, max_step=1, tail=1)