**If you have a question or issue with MNE-Python, please include the following info:**

- MNE-Python version: 0.24.dev0
- operating system: linux

Hi guys,

Long time no see!

I was trying to find the significant clusters of decoding performance obtained from `GeneralizingEstimator()`. I'm confused about the clusters that `permutation_cluster_1samp_test()` returned. Here is my code snippet:

```python
# significance tests
globals()[f'clust_fea_{fea}'] = permutation_cluster_1samp_test(globals()[f'arr_cate_spec_fea_{fea}'], n_permutations=1024, tail=1, n_jobs=12, seed=6)
# outputs of the significance tests above
globals()[f't_obs_fea_{fea}'] = globals()[f'clust_fea_{fea}'][0] # (n_time_points,n_time_points), (700,700)
globals()[f'clusters_fea_{fea}'] = globals()[f'clust_fea_{fea}'][1]
globals()[f'cluster_pv_fea_{fea}'] = globals()[f'clust_fea_{fea}'][2]
```
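For context, this is roughly how I then try to combine the clusters and their p-values into one significance mask over the (700, 700) generalization matrix (a numpy-only sketch; I'm assuming each cluster is an `np.where`-style `(rows, cols)` index tuple, and `significance_mask` is just a helper name I made up):

```python
import numpy as np

def significance_mask(clusters, cluster_pv, shape, alpha=0.05):
    """Mark all cells belonging to clusters with p < alpha as True."""
    mask = np.zeros(shape, dtype=bool)
    for clu, p in zip(clusters, cluster_pv):
        if p < alpha:
            mask[clu] = True  # clu assumed to be a (rows, cols) index tuple
    return mask

# toy example: one significant 2x2 "cluster" inside a 5x5 matrix
clusters = [(np.array([1, 1, 2, 2]), np.array([1, 2, 1, 2]))]
mask = significance_mask(clusters, [0.01], (5, 5))
```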

`globals()[f'arr_cate_spec_fea_{fea}']` is a 3-dimensional array of generalized decoding performance with shape (40, 700, 700), where 40 is the number of subjects and 700 is the number of time points. The resulting `globals()[f'clusters_fea_{fea}']` is a tuple, which I converted to an array of shape (2, 490000). Here is its content:

```
In [132]: globals()[f'sig_clust_fea_{fea}']
Out[132]:
(array([ 0, 0, 0, ..., 699, 699, 699]),
array([ 0, 1, 2, ..., 697, 698, 699]))
```

I don't know what this array stands for. The indices of significant time points? But then why is its size 490000 instead of 700?
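To illustrate my confusion, here is a tiny numpy sketch of what I suspect is going on (assuming the cluster is the output of `np.where` on a 2D boolean mask over the full 700×700 matrix):

```python
import numpy as np

# a (700, 700) boolean mask covering the whole generalization matrix
mask = np.ones((700, 700), dtype=bool)
rows, cols = np.where(mask)  # (rows, cols) index pair, one entry per True cell
print(rows.shape)  # (490000,) -> one entry per matrix cell, not per time point
```

That would explain the 490000 = 700 × 700 size, but I'd like to confirm whether the cluster indices really refer to cells of the 2D time-generalization matrix rather than to single time points.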

Looking forward to hearing from you guys!

Thanks a lot!