HCP data compatibility

  • MNE version: 0.24.0
  • operating system: Windows 10

Hi all,

I have been trying to run some source space connectivity analysis on the HCP dataset. I tried it on multiple platforms, e.g., FieldTrip and Brainstorm, and I found that the MNE style (Pythonic) fit my taste the most. Also, there are more documentation and support here, so I want to switch my analysis pipeline completely to MNE.

However, I ran into a few problems.

  1. I found a package, MNE-HCP (MNE-HCP — MNE-HCP 0.1.dev12 documentation), that imports HCP data into MNE data structures, but this package hasn't been maintained for a long time. When I tried to load the preprocessed data using the example code raw = hcp.read_epochs('100307', 'rest', hcp_path=hcp_path), it threw an error with the following message:
...
    609     # XXX hack for now due to issue with EpochsArray constructor
    610     # cf https://github.com/mne-tools/mne-hcp/issues/9
--> 611     epochs.times = times
    612     return epochs
    613 

AttributeError: can't set attribute

I am guessing this is because an earlier version of MNE allowed users to set the epoch attributes directly, but the current version does not support this anymore. The package documentation does not specify which version of MNE it supports. There is a similar thread someone put up a couple of years ago here: Source Modeling HCP data in MNE - #2 by system, but I don't think it has been solved yet. Any thoughts on how I should fix this?

  2. Sensor registration. I wasn't able to load the preprocessed epoch data, so I tried the raw data. I successfully loaded raw = hcp.read_raw('100307', 'rest', hcp_path=hcp_path) into MNE, and I used raw.plot_psd(fmax=200) to plot a power spectral density plot. The signals are nicely plotted; however, I noticed a weird rotation of the sensor locations:

    You can see in the upper right-hand corner that there is a 90-degree clockwise rotation of the helmet.
    I also tried to just plot the sensor locations using this block of code from one of the tutorials:
import matplotlib.pyplot as plt
fig = plt.figure()
ax2d = fig.add_subplot(121)
ax3d = fig.add_subplot(122, projection='3d')
raw.plot_sensors(ch_type='mag', axes=ax2d)
raw.plot_sensors(ch_type='mag', axes=ax3d, kind='3d')
ax3d.view_init(azim=70, elev=15)

It gives me this:

My guess is that some information is missing from the HCP files, so MNE-HCP is not able to correctly place the sensors. Any thoughts about what the issue could be? I am afraid that if I run any analysis without doing this type of sanity check, none of the results will make sense.

I will come to the office hours on Discord tomorrow (2/18/2022) and maybe get a more direct answer from the core developers. This MNE-HCP package seems quite important, and I am not sure why it is not maintained anymore. I am also a programmer myself (although not as experienced as a lot of the developers here), and I can help contribute if necessary, as it is also crucial for my research. Let me know if there is anything I can do to fix this.

Thanks,

Hey @andy, thanks for posting on Discourse! I think those are fairly easy fixes (well, updating all of MNE-HCP to the latest version of MNE might take a bit, but fixing that one problem should only take a few minutes). Setting the times is done slightly differently now, but that's a really quick fix (I think you just have to set the internal epochs._times instead). As for the rotation, my guess is that it's actually a transposition: somewhere during import, the x and y coordinates get swapped, so that should be pretty easy to un-swap.
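To make the property issue concrete, here is a toy illustration in plain Python (not actual MNE code; the real backing attribute name inside MNE may differ, so check the source): a property defined without a setter raises AttributeError on assignment, while the private attribute behind it can still be set directly.

```python
# Toy illustration (not MNE code): a read-only property raises
# AttributeError on assignment, but the private backing attribute
# can still be set directly, which is the quick-fix idea.
class Epochs:
    def __init__(self, times):
        self._times = times

    @property
    def times(self):  # read-only: no setter defined
        return self._times


epochs = Epochs([0.0, 0.1, 0.2])

try:
    epochs.times = [0.0, 0.1]  # what mne-hcp does -> fails
except AttributeError as err:
    print(f"AttributeError: {err}")

epochs._times = [0.0, 0.1]     # the suggested workaround
print(epochs.times)
```

The longer-term fix is of course to update MNE-HCP to build the epochs with the correct times in the first place rather than patching the attribute afterwards.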

PS In the future, you might want to start a new topic rather than replying to the office hours posts for organizational purposes.

OK, sounds good. Thank you. This makes a lot of sense. Let me see if I can locate where the error is and fix it.

Hey @andy, getting the ball rolling here [MAINT] Update to latest MNE version by alexrockhill · Pull Request #66 · mne-tools/mne-hcp · GitHub, feel free to comment and maybe we can do a few PRs together.

Hi @alexrockhill

I am trying to plot the co-registration of the sensors and the head model and to compute a source model for the HCP data after extracting the data with MNE-HCP. However, it is a bit complicated because the HCP output seems to be defined in a different coordinate system from the one MNE uses.

I just read through the tutorial you sent me last Friday: How MNE uses FreeSurfer’s outputs — MNE 0.24.1 documentation

If I understand correctly, all the analyses in MNE are based on the MRI surface RAS coordinate system that FreeSurfer uses. If I want to run source localization, I will need the sensor data, the head model, and the source cortical surface/volume grid all aligned to the same coordinate system, in this case the (FreeSurfer) MRI surface RAS coordinate system.

My first question is about the "convert" parameter of the "mne.io.read_raw_bti" function. The HCP data were originally collected with a "Magnes 3600wh" system, and MNE-HCP uses "mne.io.read_raw_bti" to read in the BTi/4D data. The "convert" parameter, which when True converts the data to the Neuromag coordinate frame, is set to "False" in MNE-HCP. I think this is why the orientation of the sensors is rotated in the PSD plot: MNE-HCP does not convert the coordinates to Neuromag. I am still not sure why this is turned off in MNE-HCP; the documentation says there is a compatibility issue between HCP and MNE in terms of the coordinate systems (mne-hcp/read.py at 7dc789f44033e9b57f875da5794202e3a5290971 · mne-tools/mne-hcp · GitHub). According to this page from FieldTrip (How are the different head and MRI coordinate systems defined? - FieldTrip toolbox), the BTi/4D data are in an ALS orientation. Is there a reason why this is converted to Neuromag by default? It also seems that the Neuromag frame is not the FreeSurfer coordinate system that MNE desires. After loading the data into MNE and converting to Neuromag, do I need to transform the sensors to MRI surface RAS for source localization?
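To make sure I understand the axis conventions, here is a small numpy sketch I put together (my own illustration, not MNE-HCP code) of what an ALS-to-RAS axis change looks like. If a rotation like this were skipped or applied to the wrong data, I imagine it would produce exactly the kind of 90-degree in-plane rotation I see in the PSD plot:

```python
import numpy as np

# ALS axes: x -> anterior, y -> left,     z -> superior
# RAS axes: x -> right,    y -> anterior, z -> superior
# right = -left, so: x_ras = -y_als, y_ras = x_als, z_ras = z_als
als_to_ras = np.array([
    [0., -1., 0.],
    [1.,  0., 0.],
    [0.,  0., 1.],
])

# a point 2 cm anterior, 1 cm to the left, 3 cm up (in meters)
pt_als = np.array([0.02, 0.01, 0.03])
pt_ras = als_to_ras @ pt_als
print(pt_ras)  # -> [-0.01  0.02  0.03]: 1 cm left == -1 cm right
```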

Second, MNE-HCP also loads the head model generated by the HCP pipeline. However, I can almost be sure that the head model in the HCP output is not defined in the MRI surface RAS coordinate system. I am assuming that if HCP used FreeSurfer for the reconstruction, we should be able to find a transformation file that converts the head model to the FreeSurfer coordinate system (MRI surface RAS), which can then be co-registered to the sensors before calculating the forward and inverse solutions, correct?
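To make my assumption concrete, applying such a transformation file boils down to a 4x4 affine in homogeneous coordinates. Here is a generic numpy sketch (the matrix values are made up purely for illustration; I understand the real workflow would use something like mne.transforms.apply_trans with the actual transform):

```python
import numpy as np

def apply_affine(trans, points):
    """Apply a 4x4 affine to an (n, 3) array of points."""
    points = np.atleast_2d(points)
    homog = np.hstack([points, np.ones((len(points), 1))])
    return (homog @ trans.T)[:, :3]

# made-up head -> MRI-surface-RAS affine: identity rotation plus
# a 30 mm shift along z (values for illustration only)
head_to_mri = np.eye(4)
head_to_mri[2, 3] = 0.03

pts_head = np.array([[0.0, 0.0, 0.0],
                     [0.01, 0.02, 0.0]])
pts_mri = apply_affine(head_to_mri, pts_head)
print(pts_mri)  # -> [[0. 0. 0.03] [0.01 0.02 0.03]]
```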

The third question is more about computing the source space, surface-based or volumetric. According to this (Head model and forward computation — MNE 0.24.1 documentation), in order to calculate the source space, I need the FreeSurfer reconstruction output for the cortical surface. Can I assume that the source points generated from the FreeSurfer recon output are automatically defined in the MRI surface RAS coordinate system?

It is quite long, but I am trying to bridge the gap between HCP and MNE. Please let me know if you know the answers to these questions. Thank you!

Best,

Hey @andy, good questions, let me see if I can answer some of them!

In general, it's nice to have some code that I can run on my computer so that I can follow along, looking at the data in MNE and the other objects. The idea is a minimal reproducible example, with the Python imports at the top and everything, so that I can run it end to end. If you put it between backticks (shift + tilde), the code gets nicely formatted, like so:

import os.path as op
import mne

To answer your questions: yes, all the FreeSurfer surfaces are in surface RAS, and that's where you want to go (also called "mri" in MNE). The steps to get there can be a bit confusing with all the transformation matrices, but the process is pretty straightforward.

Digitized points start in a device coordinate frame (e.g. Polhemus, Neuromag); I think that's how they will come out of the BTi file. The device frame has an origin defined by the manufacturer, and usually the directions make sense (up is up, etc., unlike MRIs, which can be head-first supine). The first step is a dev->head transform. This should hopefully already be done in HCP, although I haven't worked with the data. Alternatively, you could get both dev->head and head->mri from the coregistration GUI in MNE. In the GUI, you select the nasion, the left auricular point, and the right auricular point, and that defines the origin as the center of those, with left and right being toward the auricular points. Ideally you wouldn't redo the coregistration, both because you want to use the one that is provided for reproducibility and because it takes time, but given the digitization points and the T1 you could do it (it's a good sanity check to know what the transforms should be).

What you do want to do is get the fiducials in terms of the MRI out of a sidecar file or something like that; that's how it's done in BIDS. That will give you head->mri. dev->head should be derivable from the fiducials in device coordinates, which should hopefully be in the BTi file. So it's all linked by the head coordinate system, which is how MNE stores most information internally (such as in FIF files).

I think the next step is to get a working example where you show which files you have, and then we can get them aligned to where you want to go. If possible, it would be nice if the data could be downloaded with wget or something like that, or a Google Drive link is nice.
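To illustrate how three fiducials pin down a head frame, here is a toy numpy sketch. This is my own simplified version of the geometry (MNE has a real routine along these lines, mne.transforms.get_ras_to_neuromag_trans); the fiducial values below are made up:

```python
import numpy as np

def fiducial_head_frame(nas, lpa, rpa):
    """Toy head-frame transform built from the three fiducials.

    x points from LPA toward RPA, y toward the nasion, z up
    (right-handed); the origin sits on the LPA-RPA line below the
    nasion. Sketch only, to show the geometry.
    """
    nas, lpa, rpa = (np.asarray(p, float) for p in (nas, lpa, rpa))
    x = rpa - lpa
    x /= np.linalg.norm(x)
    # project the nasion onto the LPA-RPA line to find the origin
    origin = lpa + np.dot(nas - lpa, x) * x
    y = nas - origin
    y /= np.linalg.norm(y)
    z = np.cross(x, y)  # completes the right-handed frame
    trans = np.eye(4)
    trans[:3, :3] = np.stack([x, y, z])
    trans[:3, 3] = -trans[:3, :3] @ origin
    return trans

# made-up fiducials (meters): the transform maps the nasion onto +y
# and LPA/RPA onto -x/+x, as in the Neuromag-style head frame
trans = fiducial_head_frame(nas=[0.0, 0.1, 0.0],
                            lpa=[-0.08, 0.0, 0.0],
                            rpa=[0.08, 0.0, 0.0])
print(trans @ np.array([0.0, 0.1, 0.0, 1.0]))  # nasion -> [0. 0.1 0. 1.]
```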