Best practice to get NIML/GIfTI or NIfTI from STC

Hi

I have predefined ROIs in .dset format (AFNI). I want to extract the activity of vertices within those ROIs from an STC file in MNE, but I am unsure about the best strategy. I found a discussion on adding GIfTI output for STC here, and on exporting surface sources as a NIfTI volume here, but I am still not sure which approach is best.
Is it possible to export the source estimate as NIML.DSET or GIfTI? Or to export surface sources as NIfTI?
Is it better to generate volume sources to export as NIfTI, or surface sources?

  • MNE version: 0.24.1
  • operating system: Windows 10

Greetings-

It is possible to export as GIFTI, but we haven't yet rolled the function into MNE proper. If you give us a few days we can make that repo public with some light documentation for the process.

In the meantime, it would be good to update to the current version of MNE (1.2).

-Peter


Hoping I managed to avoid any typos in this. We'd love to bring this into MNE proper, but there's some debate on whether this belongs in MNE or in NiBabel.

Lots of thanks to Josh Teves (@jbteves) for the legwork on the Python part of this. He wrote this "hack" for converting STC files on disk to GIFTI. You can get it via pip, and info is here:

Taken from the help, you can run this directly from the command line:

stc2gii_hack \
    mymodel.fif \
    me-lh.stc \
    me-rh.stc \
    myhead

Here, mymodel.fif is your surface source space, me-lh.stc/me-rh.stc are your STC time series, and myhead is the prefix to save the new files as.

Check the surfaces with SUMA: suma -i myhead-lh.gii, open a surface controller, and load the (in this case) left-hemisphere time series (myhead-lh.time.gii). You'll notice that these look fairly grainy, since MNE decimates the surface (in part) for faster computation. You should be able to select a node and press g to open the graph view and see your time series.
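As an aside, the graininess comes from that decimation: the STC only carries values for a subset of the full surface's vertices. Purely as a toy illustration (toy numbers, not the MNE API), scattering sparse per-vertex values back onto a full-resolution mesh looks like this:

```python
# Toy illustration: an STC stores data only for a decimated subset of
# vertices ("vertno"); to paint a full-resolution mesh you scatter the
# sparse values into a full-length array (unsampled vertices stay 0).
n_full_vertices = 10          # vertices in the full surface (toy number)
vertno = [0, 3, 7]            # indices of the vertices the STC retained
values = [1.5, -2.0, 0.25]    # one value per retained vertex

full = [0.0] * n_full_vertices
for idx, val in zip(vertno, values):
    full[idx] = val

print(full)
```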

The rest of this post owes great thanks to the AFNI team, as we went back and forth a bunch of times to get this working. Special thanks to Daniel Glen, Rick Reynolds, and Paul Taylor. These steps are needed to map those surface time series to the SUMA volumes. To start, you need to align the output GIFTI file from the hack to the corresponding file in the SUMA folder. I have this scripted, but because I thought (hoped?) it would be easier to read, I've filled in the details with an example left-hemisphere file and added "better-ish" comments.

  1. Get surface info coordinates out:

SurfaceMetrics -i myhead-lh.gii -coords

  2. Calculate the center of your surface (the '[n]' column selectors are quoted so the shell does not treat them as globs):
centerX=`3dBrickStat -mean myhead-lh.gii.coord.1D.dset'[1]'`
centerY=`3dBrickStat -mean myhead-lh.gii.coord.1D.dset'[2]'`
centerZ=`3dBrickStat -mean myhead-lh.gii.coord.1D.dset'[3]'`

echo "Center for your dataset: ${centerX} ${centerY} ${centerZ}"
  3. Get the center of your SUMA surface:
SurfaceMetrics -i SUMA/std.60.lh.white.gii -coords
stdCenterX=`3dBrickStat -mean std.60.lh.white.gii.coord.1D.dset'[1]'`
stdCenterY=`3dBrickStat -mean std.60.lh.white.gii.coord.1D.dset'[2]'`
stdCenterZ=`3dBrickStat -mean std.60.lh.white.gii.coord.1D.dset'[3]'`

echo "Center for your std.60: ${stdCenterX} ${stdCenterY} ${stdCenterZ}"
  4. Calculate the shift between your surface and the SUMA surface:
scaleX=`ccalc ${stdCenterX} - ${centerX}`
scaleY=`ccalc ${stdCenterY} - ${centerY}`
scaleZ=`ccalc ${stdCenterZ} - ${centerZ}`

echo "Shifting..."
echo "${scaleX} ${scaleY} ${scaleZ}"

if [ -e center_al.1D ]; then
    rm center_al.1D
fi
echo "1 0 0 ${scaleX}" >> center_al.1D
echo "0 1 0 ${scaleY}" >> center_al.1D
echo "0 0 1 ${scaleZ}" >> center_al.1D
  5. Create a new surface on the std.60 mesh with the new center:
    ConvertSurface -xmat_1D center_al.1D -i myhead-lh.gii -o myhead-lh_centered.gii

  6. Now that we have centered anatomical surfaces, we want to bring the time series along with them:

SurfToSurf -i_gii SUMA/std.60.lh.white.gii \
-i_gii myhead-lh_centered.gii \
-dset myhead-lh.time.gii \
-prefix std.60.
  7. Finally, you should be able to visualize the time series on the higher-resolution (upsampled) surfaces using something like this:
    suma -spec SUMA/std.60.MySubject_lh.spec -sv SUMA/MySubject_SurfVol.nii

Then open the Surface Controller and select std.60.myhead-lh.time.niml.dset.

  8. Repeat this process for the right hemisphere, convert your ROI to a std.60 map, and you should be able to run any of the 3dROIstats or 3dmaskave steps…
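For anyone who prefers to see the centering arithmetic from steps 2 through 4 in one place, here is a pure-Python sketch (toy coordinates; the real pipeline reads the centers out of the .coord.1D.dset files with 3dBrickStat and does the subtraction with ccalc):

```python
# Toy sketch of the center-alignment math: compute the mean (x, y, z)
# of each surface, take the difference, and build the three rows of
# the center_al.1D affine (identity rotation plus the shift).
def center(coords):
    """Mean x, y, z of a list of (x, y, z) vertex coordinates."""
    n = len(coords)
    return tuple(sum(c[i] for c in coords) / n for i in range(3))

# Hypothetical vertex coordinates standing in for the two surfaces.
my_surf = [(0.0, 0.0, 0.0), (2.0, 2.0, 2.0)]    # myhead-lh
std_surf = [(1.0, 1.0, 1.0), (3.0, 5.0, 7.0)]   # std.60.lh.white

cx, cy, cz = center(my_surf)
sx, sy, sz = center(std_surf)
shift = (sx - cx, sy - cy, sz - cz)             # the ccalc step

# The three lines written to center_al.1D.
rows = ["1 0 0 %g" % shift[0],
        "0 1 0 %g" % shift[1],
        "0 0 1 %g" % shift[2]]
print("\n".join(rows))
```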

Thank you, Peter!

Hello

When I run the stc2gii_hack command to convert an STC file to a GIfTI file, I get the following error:

Reading a source space… Computing patch statistics… Patch information added… [done]
Reading a source space… Computing patch statistics… Patch information added… [done]
2 source spaces read
Traceback (most recent call last):
  File "/home/anaconda3/envs/stc2gii/bin/stc2gii_hack", line 8, in <module>
    sys.exit(main())
  File "/home/anaconda3/envs/stc2gii/lib/python3.8/site-packages/stc2gii_hack/run_hack.py", line 97, in main
    to_gii_simple(
  File "/home/anaconda3/envs/stc2gii/lib/python3.8/site-packages/stc2gii_hack/hack.py", line 59, in to_gii_simple
    fif_ob = get_decimated_surfaces(fif)
  File "/home/anaconda3/envs/stc2gii/lib/python3.8/site-packages/stc2gii_hack/hack.py", line 34, in get_decimated_surfaces
    assert (ss['tris'] >= 0).all()
AssertionError

I have no idea how to solve the issue. I created the src and stc files with the following commands in MNE version 1.3.1 (Windows 10):

trans = os.path.join(transformation_path,transformation_fname)
src = mne.setup_source_space(subject='freesurfer', spacing='oct5', add_dist='patch', subjects_dir=mri_path)
model = mne.make_bem_model(subject='freesurfer', ico=4, conductivity=(0.3,), subjects_dir=mri_path)
bem = mne.make_bem_solution(model)
fwd = mne.make_forward_solution(epoch.info, trans=trans, src=src, bem=bem, meg=True, eeg=False, mindist=5.0, n_jobs=1, verbose=True)
inverse_operator = make_inverse_operator(evoked.info, fwd, noise_cov, loose=1, depth=None, verbose=True)
del fwd
    
method = "dSPM"
snr = 3.
lambda2 = 1. / snr ** 2
stc= apply_inverse(evoked, inverse_operator, lambda2, method=method, pick_ori=None, verbose=True)
src = inverse_operator['src']

stc.save(volume_path+'\\'+f"stc_{e}")
src.save(volume_path+'\\'+f"src_{e}.fif")

I used the src and stc files as inputs to the stc2gii_hack command, but I got the above error. Do you have any idea how to solve the issue?

Happy to take a look… I think it's worth sending your STC and SourceSpace FIF over so that I can see whether it's specific to your computer/MNE/Windows. Can you send it via Dropbox or something like that?

We haven't updated the hack in quite a while, and we've now been waiting 2+ years for a mythical "upstream fix" on the issue.
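For what it's worth, the assertion that fails just checks that every triangle in the source space refers to a valid (non-negative) vertex index. A toy reproduction of that check (the real code asserts on a NumPy integer array; plain lists are used here for illustration):

```python
# Toy reproduction of the check in get_decimated_surfaces: every entry
# of the source space's triangle array must be a non-negative vertex
# index; any negative entry trips the AssertionError.
def tris_valid(tris):
    """Return True if all triangle vertex indices are non-negative."""
    return all(idx >= 0 for tri in tris for idx in tri)

good_tris = [[0, 1, 2], [2, 3, 0]]
bad_tris = [[0, 1, 2], [2, -1, 0]]   # the -1 is what trips the assert

print(tris_valid(good_tris))  # True
print(tris_valid(bad_tris))   # False
```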

Thank you, Peter. I uploaded the files here:

https://drive.google.com/drive/folders/17n8SfKRHfzr1C9kQoDfzL18bNqvJE0F5?usp=share_link

Looks like something is wrong with the source space… I wonder if adding the "patch" option is throwing off our function. Can you try modifying it without that option?

My code in our pytest is simply:

myss = mne.setup_source_space(subject, 'oct5', surface='white')
mne.write_source_spaces(subject + '-src.fif',myss, overwrite=True)

Hello Peter

Yes, you are right. There was something wrong with the source space, but adding "patch" was not the source of the error; this line of code was the problem:

src = inverse_operator['src']

When I removed it, it worked.

However, there is still an error when I use the SurfToSurf command. It works with the right hemisphere and gives a niml.dset file, but it is not working with the left hemisphere! When I run it on the left-hemisphere gii files I get the following error:

Saving M2M into std.141
Error SUMA_MakeSparseColumnFullSorted (suma_datasets.c:9369):
Not full and cannot fill it
Error SUMA_DsetCol2FloatFullSortedColumn (suma_datasets.c:9188):
Failed to get full column vector
-- Error SUMA_morphDsetToStd (SUMA_SurfaceToSurface.c:858):
Failed to extract
-- Error SurfToSurf (SUMA_SurfToSurf.c:724):
Failed to map /home/ali/Desktop/Test/temporary/test-lh.time.gii

I updated the shared folder with the new files if you want to check them.

Happy to take more of a look! I'll need your std.141 surfaces or std.60 surfaces. Possibly both.


Thank you so much!

I uploaded two files: ico141_lh.white_al.asc, ico141_rh.white_al.asc

Hi Alireza-

Sorry, I'm having some issues modifying our testing framework for your asc files. Our pipeline takes the GIFTI files from the command:

@SUMA_Make_Spec_FS -sid subj-01 -GIFTI

This should give you GIFTI files (.gii) in the std.60/std.141 format. The .asc-formatted files you uploaded are a little odd and aren't converting quite right. Posting your entire code and uploading the GIFTI files would help us debug.

-Peter

On the plus side, it looks like your data is there after the conversion from STC to GIFTI!


Thank you, Peter.

Unfortunately, I only have the asc files currently. I made them with the Surfing toolbox (https://surfing.sourceforge.net/), and they worked for my surface-based searchlight analysis in CoSMoMVPA.

I tried to convert the asc files to gii with this command:

ConvertSurface -i_fs ico141_lh.white_al.asc -o_gii ico141_lh.white_al.gii

But I got the previous error. Do you think there is a problem with the std.141 file? I ask because it works with the right hemisphere.

I also tried to merge the gii files resulting from stc2gii_hack with the following command:

ConvertSurface -i_gii Test-lh.time.gii -i_gii Test-rh.time.gii -merge_surfs -o_gii Test-mh.time.gii

but I got the following error:

Reading /home/ali/Desktop/Test/Test-lh.time.gii …
** afni_open_gifti_surf: /home/ali/Desktop/Test/Test-lh.time.gii is not a surface
Error SUMA_Load_Surface_Object_eng: Failed in SUMA_GIFTI_Read.
Reading /home/ali/Desktop/Test/Test-rh.time.gii …
** afni_open_gifti_surf: /home/ali/Desktop/Test/Test-rh.time.gii is not a surface
Error SUMA_Load_Surface_Object_eng: Failed in SUMA_GIFTI_Read.
oo Warning SUMA_Free_Surface_Object (SUMA_CreateDO.c:19644):
NULL SO
oo Warning SUMA_Free_Surface_Object (SUMA_CreateDO.c:19644):
NULL SO
However, I could merge the other files with the above command. Do you have any idea why I can't merge the time-series files?

I tried to convert fsaverage from freesurfer using the following command to get the gii files:

@SUMA_Make_Spec_FS -sid freesurfer -GIFTI

I tested with both the std.60 and std.141 files, but I got the same error that I got using the asc files. I uploaded the std.60 and std.141 files to the folder.

I used the same command you used in your second post.

Hmmm, I wonder if there's something odd going on with the downsampled mesh.

Is the FreeSurfer error also only with the left hemisphere? It wouldn't surprise me that fsaverage won't work, since you'd need to find node correspondence between the downsampled surface and the high-resolution surface for the single subject before going to the standard-space surface.

Do you have a second subject we could test on?


Hello

First, I should say that I could create the GIFTI files (.gii) in the std.60/std.141 format for the subject. There is still an error with the left hemisphere. Also, I should add that the .asc files are not working with the current framework. Here is the result with the .gii file and the .asc file from the right hemisphere. Based on my past knowledge, the result from .gii seems reasonable.

I also checked with a second subject and it worked; I could convert both the left and right hemispheres. So it seems there was a problem with that specific subject. Do you have any idea where the source of the error is with this subject? Do you think there is a problem with the .stc file?

Thanks Alireza,

I think sometimes the FreeSurfer segmentation gets a little funny. If you can, re-run that participant through the pipeline (recon-all -all -3T) along with the SUMA command for making GIFTI surfaces, then through the MNE-specific steps to get your source solution/STC with those new files; that may fix the issue. If not, you're welcome to upload the subject's T1 and I can attempt it here.

I'm afraid we won't be able to modify the hack to support asc files. Joshua Teves, who originally developed the hack, has moved on to another institute, and as long as the current solution works with our FreeSurfer pipeline, we probably cannot invest much time in modifying it. I continue to hold out hope that we can merge this functionality into MNE directly once they decide the next steps for integrating STC support with nibabel.


Thank you so much, Peter.

Thank you for this helpful walk-through @pmolfese.

Quick follow-up: I am wondering if it is possible to use the "std.60.myhead-lh.time.niml.dset" output to perform surface-based clustering. Instead of using the full stc time series, I've taken the mean value across time (so it would be only spatial clustering, not spatio-temporal). Do you think this is possible with this interpolated output surface? I figured it might involve using both ConvertDset and SurfClust, but wasn't sure if there was a better way to do this.
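For concreteness, the time-averaging step I mean is just a per-vertex mean; a toy sketch (pure Python, illustrative only; the real values live in the niml.dset):

```python
# Toy sketch: collapse a (vertices x timepoints) series to one mean
# value per vertex, turning spatio-temporal data into a spatial map
# suitable for purely spatial clustering.
series = [
    [1.0, 3.0],   # vertex 0 over two timepoints
    [2.0, 4.0],   # vertex 1
    [0.0, 0.0],   # vertex 2
]
spatial_map = [sum(ts) / len(ts) for ts in series]
print(spatial_map)  # [2.0, 3.0, 0.0]
```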

@schemankr

That is definitely a question for the AFNI folks over at https://discuss.afni.nimh.nih.gov.
