Setting channels colors with add_sensors/mne.viz.plot_alignment

Hello everyone,

  • MNE-Python version: 0.24.0
  • operating system: linux

I am working with sEEG and ECoG data for my project, and I would like to know whether there is any way to set the color of the electrodes being plotted on the brain surface with the function plot_alignment or the Brain method add_sensors.

This would let me plot the electrodes of different patients in different colors, or color them according to some scaling based on a selectivity index and that sort of thing.

While the solution shown in this tutorial: works fine for ECoG data, it doesn't work as nicely for sEEG data. The problem with sEEG is that extracting the 2D coordinates of each electrode to overlay a colored dot or time series means it gets plotted "over" the brain surface, so we lose the depth visualization.

I would therefore like to know whether there is a way to set the color of a set of channels, for example changing the color here from whitish to blue:

Is there any way to achieve that?

Thanks in advance for the help :slight_smile:

Kind regards,

Alex

Hello everyone,

Does anyone have any idea how to address this problem?
I looked at the documentation of mne.viz.Brain but couldn't find any way to change the sensor colors, and the add_sensors function exposes no handle on color.
As an alternative, I considered accessing the PyVista renderer instance returned by plot_alignment to change the color, but I couldn't find a way to do so either, as I couldn't locate anything referring to the sensors in the figure attributes.

To give a better understanding of what I am trying to achieve, here is a code snippet of what the goal would be, alongside a made-up picture:

from random import randrange
import numpy as np
import mne
import os.path as op
from mne.datasets import sample

# Generating channels and position:
ch_names = ["G" + str(ind) for ind in range(6)]
# Made up position so that it falls on the brain:
ch_position = np.array([[-0.06, 0.05, 0.05],
                        [-0.05, 0.05, 0.05],
                        [-0.04, 0.05, 0.05],
                        [-0.03, 0.05, 0.05],
                        [-0.02, 0.05, 0.05],
                        [-0.01, 0.05, 0.05]])
# Random RGB values as 0-255 integers (randrange(256) covers the full range):
ch_colors = {ch: [randrange(256), randrange(256), randrange(256)] for ch in ch_names}
# Set montage
montage = mne.channels.make_dig_montage(ch_pos=dict(zip(ch_names, ch_position)),
                                        coord_frame="mni_tal")
# Create data:
times = np.linspace(0, 1, 200, endpoint=False)
cosine = np.cos(10 * np.pi * times)
data = np.array([cosine] * len(ch_names))
info = mne.create_info(ch_names=ch_names,
                       ch_types=['seeg'] * len(ch_names),
                       sfreq=200)

# Create raw object:
simulated_raw = mne.io.RawArray(data, info)
simulated_raw.set_montage(montage)
brain_kwargs = {"alpha": 0.1, "cortex": "low_contrast", "background": "black", "units": "m", "surf": "pial"}
# Plotting the electrodes on the brain surface:
# Getting the dir to the mni subject:
data_path = sample.data_path()
subjects_dir = op.join(data_path, 'subjects')
brain = mne.viz.Brain('fsaverage', subjects_dir=subjects_dir, **brain_kwargs)
brain.add_sensors(simulated_raw.info, 'fsaverage', color=ch_colors)
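One detail worth noting if the coloring ends up going through VTK (as in the actor-based workaround discussed below): VTK's SetColor expects RGB floats in 0-1, while the snippet above draws 0-255 integers. A minimal sketch of the conversion, using a hypothetical helper name normalize_colors (not part of MNE):

```python
# Hypothetical helper (not part of MNE): convert per-channel RGB values
# given as 0-255 integers into the 0-1 floats that VTK's SetColor expects.
def normalize_colors(ch_colors):
    return {ch: tuple(c / 255 for c in rgb) for ch, rgb in ch_colors.items()}

print(normalize_colors({"G0": [255, 0, 51]}))  # {'G0': (1.0, 0.0, 0.2)}
```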

Example of the type of picture this should produce:

Thanks in advance for the support,

Kind regards,

Alex

Hi, I suppose this might meet your requirements.


However, I didn't find an easy way if you directly use brain.add_sensors().
In fact, once a channel is added to the plotter, we can set its color and name.
PyVista uses actors to control the elements we add.
For example, if we add a brain mesh to the plotter, there will be one actor controlling the brain's color, opacity, etc. You can find all the actors using

actors = brain.plotter._renderer.actors

It's a dict, so you just need to find the key of the left brain's actor.
Let's say the key is 'left_brain'.
Then we could get it by

brain_actor = actors['left_brain']

PyVista is a helper module for the Visualization Toolkit (VTK), so we can set the actor's property even when using mne.viz.Brain directly:

prop = brain_actor.GetProperty()
prop.SetColor(0.2, 0.2, 0.2)

That changes the color of the left brain.
The procedure is the same for setting the color of electrodes.
However, in MNE, the actor's name (i.e. the key of the dict) is not specified, so the keys of the actors look like:

['PolyData(Addr=00000229A5233F60)',
 'PolyData(Addr=00000229B4BB1CB0)',
 'PolyData(Addr=00000229B4BBB580)',
 'PolyData(Addr=00000229B4BBE500)']

The second and third are the keys of the electrodes plotted in the screenshot.
I set those to (0.2, 0.2, 0.2) and (0.4, 0.7, 0.9).
All in all, it's not very easy, since you might need to try each key to see which actor it is.
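The trial-and-error recoloring described above can be sketched as a small loop. This is only a sketch: recolor_actors is a hypothetical helper name, the actors dict is the one obtained from brain.plotter._renderer.actors, and the skip argument holds keys you have already identified as brain surfaces rather than electrodes:

```python
# Hypothetical helper: recolor every actor in the plotter's actor dict
# except the ones listed in `skip` (e.g. the brain surface actors).
# `actors` maps auto-generated keys like 'PolyData(Addr=...)' to VTK
# actors; `rgb` is a triple of floats in 0-1, as SetColor expects.
def recolor_actors(actors, rgb, skip=()):
    recolored = []
    for key, actor in actors.items():
        if key in skip:
            continue  # leave brain surfaces (or other known actors) alone
        actor.GetProperty().SetColor(*rgb)
        recolored.append(key)
    return recolored
```

With a real figure you would call it as, e.g., recolor_actors(brain.plotter._renderer.actors, (0.4, 0.7, 0.9), skip=[...]) after working out which keys belong to the brain meshes.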

Actually, sometimes I think it would be useful to give these actors specified names by adding a parameter like name.

Actor naming is indeed supported by PyVista and used in MNE-Python when needed, but it is not something we expose to users, simply because it is a relatively advanced feature related to VTK. Most of these actors are still available through private attributes (i.e. brain._actors) and can be modified by an experienced user, but they are not intended to be.

So I am in favor of updating the public API and adding a color parameter, as @AlexLepauvre suggested. I updated the main Brain issue board with the feature request:

add a color parameter to add_sensors() to set channels color

Hello everyone,

Thanks a lot for the feedback. In the meantime, I will try implementing what Barry suggested; whenever the Brain functionality is adjusted, I will switch to that.

Is there anything I can do, @GuillaumeFavelier, to help with this process? I have been trying to dig into the source code and was a bit out of my depth given my programming skills. But if there is nonetheless something I can do to help, I am more than happy to try to contribute :slight_smile:

Kind regards,

Alex

Thank you so much for your interest! Once the PR is opened, I'll link it here, so if you want you can interact on GitHub to test it or share your opinion and interesting use cases :+1: