Hello,
I am encountering what appears to be a chromophore labeling issue when applying mne.preprocessing.nirs.beer_lambert_law() (mne version 1.11.0) to correctly formatted optical density (OD) data.
After validating channel ordering using:

```python
picks = mne.preprocessing.nirs._validate_nirs_info(raw_od.info, fnirs="od")
for p in picks:
    ch = raw_od.info['chs'][p]
    print(p, ch['ch_name'], ch['loc'][9])
```
I obtain consistent source–detector (SD) pairing of the form:

```
S1_D1 760
S1_D1 850
S2_D1 760
S2_D1 850
...
```
However, inside `beer_lambert_law`, chromophores are assigned positionally:

```python
for ki, kind in zip((ii, jj), ("hbo", "hbr")):
```
This assumes:

- first channel in pair → HbO
- second channel in pair → HbR
Since channels are sorted lexicographically, 760 nm always comes before 850 nm within a pair. As a result:

- 760 nm is labeled as HbO
- 850 nm is labeled as HbR
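The ordering itself is easy to reproduce outside MNE; the channel names below are hypothetical but follow the same naming scheme:

```python
# Lexicographic sorting of fNIRS channel names places the 760 nm channel
# before the 850 nm channel within each source-detector pair, so a
# positional (first -> hbo, second -> hbr) mapping always attaches "hbo"
# to the 760 nm channel.
names = ["S1_D1 850", "S1_D1 760", "S2_D1 850", "S2_D1 760"]
print(sorted(names))
# -> ['S1_D1 760', 'S1_D1 850', 'S2_D1 760', 'S2_D1 850']
```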
This contradicts the physiological sensitivity of the wavelengths (760 nm is predominantly sensitive to HbR, 850 nm to HbO) and makes the resulting HbO/HbR labels unreliable for interpretation.
The matrix inversion itself appears correct (the extinction coefficients follow wavelength order), but the chromophore labels are hard-coded by channel position rather than derived from the wavelength.
Is this the intended behavior?
Should chromophore assignment instead depend on `ch['loc'][9]` (the stored wavelength) rather than on positional pairing?
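For illustration only, a wavelength-keyed assignment could look like the sketch below; the helper name and the higher-wavelength → HbO convention are my assumptions, not MNE code:

```python
def chromophore_by_wavelength(wavelengths):
    """Map a pair of wavelengths (nm) to chromophore labels.

    Assumption for illustration: the higher wavelength (e.g. 850 nm)
    is treated as the HbO-sensitive measurement and the lower
    (e.g. 760 nm) as the HbR-sensitive one, regardless of the order
    in which the two channels appear.
    """
    lo, hi = sorted(wavelengths)
    return {hi: "hbo", lo: "hbr"}

# The order of the input pair no longer matters:
print(chromophore_by_wavelength((760, 850)))  # {850: 'hbo', 760: 'hbr'}
print(chromophore_by_wavelength((850, 760)))  # {850: 'hbo', 760: 'hbr'}
```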
Thank you for clarification.
Sélima