Hi @esbenkc, thanks for reading the tutorials in such detail and taking the time to report this ambiguity.
MNE-NIRS uses nilearn to do the underlying GLM computations, so we inherit its terminology. The returned data is in the RegressionResults format, which inherits from the LikelihoodModelResults type. Those links provide additional information that might be handy too.
But to answer your question: the theta values are the coefficient / parameter estimates of the GLM. These are commonly referred to as beta in other software. Someone asked a similar question at Neurostars.
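To make the terminology concrete, here is a minimal NumPy sketch of a GLM fit (this is illustrative, not the actual nilearn/MNE-NIRS internals; the regressors and values are made up): theta is simply the least-squares coefficient vector, with one entry per column of the design matrix.

```python
import numpy as np

# The GLM solves y = X @ theta + noise, where X is the design matrix.
rng = np.random.default_rng(0)
n_samples = 200

# Hypothetical design matrix: two condition regressors plus a constant.
X = np.column_stack([
    rng.random(n_samples),   # regressor for condition A
    rng.random(n_samples),   # regressor for condition B
    np.ones(n_samples),      # constant term
])

# Simulate a signal with known coefficients and a little noise.
true_theta = np.array([2.0, -1.0, 0.5])
y = X @ true_theta + 0.01 * rng.standard_normal(n_samples)

# theta: one estimate per design-matrix column (called "beta" elsewhere).
theta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(theta.round(2))
```

The recovered `theta` is close to `true_theta`, and its length equals the number of design-matrix columns, which is why the output dataframes have one theta row per regressor.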
Stepping back, I think both the nilearn and MNE-NIRS documentation could be improved to clarify this. In the meantime, I hope this helps.
Thank you for the links! After a cursory reading, theta then means the estimate for each column in the design matrix, so we get one theta value per column (which corresponds to the dataframe outputs, given the ROIs, contrasts, etc.).
So what theta directly means depends on the formula (e.g. {signal} \sim {column} will give a signal average for each design matrix column). Is that correctly understood? Forgive my ignorance here, but what then would be the output of run_glm given the data and design matrix?
My assumption is that it is just the raw average signal over the conditions.
> theta will then mean the estimate for each column in the design matrix and by that measure, we will have a theta value per column
Absolutely correct.
> So what theta directly means depends on the formula (e.g. `signal ~ column` will give a signal average for each design matrix column). Is that correctly understood?
Again correct. The output depends on the design matrix, so you will get different theta estimates depending on what you model.
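To illustrate that point, here is a small NumPy sketch (again illustrative, not the actual run_glm code, and using made-up regressor names): the same signal fitted with two different design matrices yields different theta values for the same regressor, because an omitted correlated regressor gets absorbed into the remaining column's estimate.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 300

# Two correlated regressors (e.g. overlapping experimental conditions).
a = rng.standard_normal(n)
b = 0.7 * a + 0.3 * rng.standard_normal(n)

# Simulated signal driven equally by both regressors, plus noise.
y = 1.0 * a + 1.0 * b + 0.05 * rng.standard_normal(n)

# Design matrix 1: models only regressor `a` (plus a constant).
X1 = np.column_stack([a, np.ones(n)])
theta1, *_ = np.linalg.lstsq(X1, y, rcond=None)

# Design matrix 2: models both regressors (plus a constant).
X2 = np.column_stack([a, b, np.ones(n)])
theta2, *_ = np.linalg.lstsq(X2, y, rcond=None)

# The estimate for `a` differs between the two fits: in model 1 it
# absorbs the effect of the omitted, correlated regressor `b`.
print(theta1[0], theta2[0])
```

In the full model the estimate for `a` is near its true value of 1.0, while in the reduced model it is inflated well above that, which is exactly the sense in which "what theta means depends on the design matrix".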
I think I should update the tutorials to include a basic GLM description. In the meantime, you can read general introductory material on fMRI processing; the approach in MNE-NIRS is identical (we actually use code from fMRI software under the hood).
I will ping back here when I get around to updating or writing a new tutorial on this.