Adding a meg-head trans to raw data object

Hi there,

I'm trying out a new registration method. I've generated a trans object based on my data. Is there a way to add my meg-head trans to a raw object?

  • MNE-Python version: 0.24.dev0
  • operating system: macOS

fifFile = os.path.join(dataDir, subjectID, scanDate, 'meg', fifName)
raw = mne.io.read_raw_fif(fifFile, preload=True)

Note: this raw object does not have a meg-head transform.

transMatrix = np.load(os.path.join(procDataDir, subjectID, scanDate, 'digi', 'meg_head_trans.npy'))
trans = mne.transforms.Transform(fro="meg", to="head", trans=transMatrix)

I'm not sure how to get trans into raw.
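For context: in MNE, the device-to-head transform lives in raw.info['dev_head_t'], so attaching the wrapped Transform there is the usual route (how assignment behaves may vary by MNE version). Before doing that, it is worth sanity-checking that the loaded .npy really is a rigid transform. A numpy-only sketch (the matrix and point values below are made up for illustration; MNE expects translations in meters):

```python
import numpy as np

def check_rigid_transform(T, atol=1e-6):
    """Sanity-check a 4x4 homogeneous meg->head matrix before wrapping it
    in mne.transforms.Transform."""
    T = np.asarray(T, dtype=float)
    assert T.shape == (4, 4), "expected a 4x4 homogeneous matrix"
    R = T[:3, :3]
    # The rotation block must be orthonormal with determinant +1.
    assert np.allclose(R @ R.T, np.eye(3), atol=atol)
    assert np.isclose(np.linalg.det(R), 1.0, atol=atol)
    # The bottom row must be [0, 0, 0, 1].
    assert np.allclose(T[3], [0, 0, 0, 1], atol=atol)
    return True

# Stand-in for the Kinect-derived matrix: 90 deg rotation about z plus a shift.
theta = np.pi / 2
transMatrix = np.array([
    [np.cos(theta), -np.sin(theta), 0, 0.01],
    [np.sin(theta),  np.cos(theta), 0, 0.00],
    [0,              0,             1, 0.02],
    [0,              0,             0, 1],
])
check_rigid_transform(transMatrix)

# Applying it maps a device ("meg") coordinate into head coordinates:
pt_meg = np.array([0.1, 0.0, 0.0, 1.0])   # homogeneous point, meg frame
pt_head = transMatrix @ pt_meg            # -> [0.01, 0.1, 0.02, 1.0]
```

If the check passes, wrapping the matrix with mne.transforms.Transform("meg", "head", transMatrix) and placing it in raw.info['dev_head_t'] is the step to try.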

Thanks,
Tim.

Sorry, I'm confused about your workflow. Can you provide a complete gist so I can see
in which order you do what?

thx
A

Yep. Sorry for the confusion.

This is for FieldLine OPM recordings with a fixed helmet. Basically, my goal is source localization with my data, but for that I need a successful coordinate frame registration inside MNE. For example, I think I need the sensor locations in the head coordinate frame, and I want to check the alignment using viz.plot_alignment. Right now the alignment looks bad, and I think that is due to coordinate frame registration issues.

Currently, FieldLine software outputs sensor locations in the "meg" (i.e., helmet) coordinate system, and does not have an integrated registration process. That means that there is no meg-head coordinate transform in the raw fif file.

I read in the FieldLine fif file as follows:

  • fifFile = os.path.join(dataDir, subjectID, scanDate, 'meg', fifName)
  • raw = mne.io.read_raw_fif(fifFile, preload=True)

I then visualize the sensor locations wrt the fsaverage brain:

  • kwargs = dict(eeg=False, coord_frame='meg', show_axes=True, verbose=True, trans='fsaverage', subject='fsaverage', subjects_dir=subjects_dir, surfaces=('head', 'pial'))
  • mne.viz.plot_alignment(raw.info, meg='sensors', **kwargs)

I get the same (mis-aligned) result whether I set coord_frame to 'meg' or 'head'. I think that is because the raw object probably has an identity matrix for the meg-head transform, as a default.

As I mentioned, FieldLine's prototype acquisition software does not include a meg-head registration step. I created my own meg-head transform based on Kinect digitization of the head in the OPM sensor array, and saved it as a .npy file (in a different, loooong script). Then I load that transform into MNE-Python:

  • transMatrix = np.load(os.path.join(procDataDir, subjectID, scanDate, 'digi', 'meg_head_trans.npy'))
  • trans = mne.transforms.Transform(fro="meg", to="head", trans=transMatrix)

I would like to add this transform to the raw object, so that I can check my registration using plot_alignment, and then move on to localization, as per the OPM example on the MNE website.

I hope that is helpful, but please let me know if there are outstanding questions. Looking forward to your guidance.

Best,
Tim.

hi Tim,

I've never gotten my hands on such data. The best I can offer is that you have a look at our doc:

on coord systems:
https://mne.tools/dev/auto_tutorials/forward/20_source_alignment.html

on OPM data:
https://mne.tools/dev/auto_examples/datasets/opm_data.html

Alex