There is this info available once you read the source space with patch_stats=True:
>>> import mne
>>> src = mne.read_source_spaces(mne.datasets.sample.data_path() / 'subjects' / 'fsaverage' / 'bem' / 'fsaverage-ico-5-src.fif', patch_stats=True)
>>> src[0].keys()
dict_keys(['id', 'type', 'np', 'ntri', 'coord_frame', 'rr', 'nn', 'tris', 'nuse', 'inuse', 'vertno', 'nuse_tri', 'use_tris', 'nearest', 'nearest_dist', 'pinfo', 'patch_inds', 'dist', 'dist_limit', 'subject_his_id', 'tri_area', 'tri_cent', 'tri_nn', 'use_tri_cent', 'use_tri_nn', 'use_tri_area'])
>>> src[0]["use_tri_area"].shape
(20480,)
>>> src[0]["use_tris"].shape
(20480, 3)
So this could get you the surface area per triangle in the decimated space. But you want it per vertex, not per triangle.
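If you do want a number per vertex rather than a single average, one way (just a sketch, not something read_source_spaces hands you directly, and it assumes use_tris indexes into the full-resolution vertex array the same way vertno does) is to give each source vertex one third of the area of every decimated triangle it touches:

import numpy as np

s = src[0]  # left hemisphere, continuing from the src loaded above
vert_area = np.zeros(s['np'])  # m^2, one slot per full-resolution vertex
# each decimated triangle contributes a third of its area to each of its three corners
np.add.at(vert_area, s['use_tris'].ravel(),
          np.repeat(s['use_tri_area'] / 3.0, 3))
per_vertex_area_mm2 = vert_area[s['vertno']] * 1e6  # mm^2 for each used vertex

Averaging that array just gives back the hemisphere’s decimated total area divided by nuse, i.e. the same style of number as the global average below.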
Really the “surface area per source” can be thought of as “total surface area / total number of sources”. So looking at it that way, you could calculate it as:
sum(s["tri_area"].sum() for s in src) / sum(s['nuse'] for s in src) * 1e6
6.367790682029399
The * 1e6 is a conversion from “square meters per source” to “square millimeters per source” (1 m² = 1e6 mm²). For an oct-6, for example, you’d get:
>>> src = mne.read_source_spaces(mne.datasets.sample.data_path() / 'subjects' / 'sample' / 'bem' / 'sample-oct-6-src.fif', patch_stats=True)
>>> sum(s["use_tri_area"].sum() for s in src) / sum(s['nuse'] for s in src) * 1e6
20.69581503362706
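If you want this as something reusable for whatever source space you have loaded, the same arithmetic fits in a tiny helper (the function name here is made up, not an MNE API):

def mean_area_per_source_mm2(src):
    """Average decimated-surface area per source, in mm^2."""
    total_area_m2 = sum(s['use_tri_area'].sum() for s in src)  # decimated mesh, both hemispheres
    n_sources = sum(s['nuse'] for s in src)
    return total_area_m2 / n_sources * 1e6  # m^2 per source -> mm^2 per source

mean_area_per_source_mm2(src)  # ~20.7 for the sample oct-6 above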
I’m not sure why the value for the ico-5 doesn’t match, assuming that manual entry is for fsaverage (because it will vary by subject). Even the value for sample wouldn’t be 9.8:
>>> sum(s["use_tri_area"].sum() for s in src) / 20484 * 1e6
8.280750830677961
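One thing worth checking while chasing that discrepancy (just a sanity check, not an explanation) is how far apart the full-resolution and decimated totals are, since patch_stats=True gives you both:

full_mm2 = sum(s['tri_area'].sum() for s in src) * 1e6       # full-resolution triangulation
decim_mm2 = sum(s['use_tri_area'].sum() for s in src) * 1e6  # decimated triangulation used above

The decimated mesh is a coarser approximation of the same surface, so the two totals need not agree exactly, and which one a manual entry was based on could shift the per-source number a bit.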