Help with cross-condition generalization

Hello,
I need some help running a temporal generalization estimator across conditions. I’ve gotten a script to “work”, but the accuracy scores seem far too low for what I’d expect: they are so far below chance that they don’t make sense, especially since there should be at least a moderate linear dependence between the two conditions (this will become clear in the task description below). I’ve been referencing this documentation but am still stuck, and was hoping someone could tell me whether I’m making some sort of silly mistake.

In my task, participants see a stimulus with two attributes: Attribute A and Attribute B. Each attribute is worth a certain number of points. Participants need to add the Attribute A points and the Attribute B points into a sum, which we’ll call “Total Points”, and make an appropriate response. For simplicity I’ll focus on Attribute A, which takes 5 possible point values (-50, -20, 0, 20, 50). The possible Total Points values vary quite a bit, but I have collapsed them into five categories as well (large negative total, small negative total, zero total, and so on up through large positive total). There should therefore be some sort of linear relationship between Attribute A and Total Points.
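
For illustration, the collapsing step looks roughly like this (a minimal sketch; the bin edges here are hypothetical, since my real cutoffs depend on the range of Attribute B):

import numpy as np

# Hypothetical bin edges: totals below -35 count as "large negative",
# -35 up to (but not including) -5 as "small negative", values around
# zero as "zero", and so on; the resulting labels run 1 through 5.
edges = np.array([-35, -5, 5, 35])
example_totals = np.array([-70, -30, 0, 30, 70])
categories = np.digitize(example_totals, edges) + 1
print(categories)  # [1 2 3 4 5]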

I am trying to see whether a temporal generalization estimator fit to the Attribute A data can be used to predict the Total Points category. Here is what I have tried so far.

The data shape for each variable is:

  • Attribute_A_data: 150 epochs by 64 channels by 218 time points
  • Total_Points_data: 152 epochs by 64 channels by 218 time points
  • Attribute_A_labels: (150,) ndarray, each entry a label 1 through 5
  • Total_Points_labels: (152,) ndarray, each entry a label 1 through 5
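
For completeness, here is the kind of quick sanity check I run before fitting (a minimal sketch, assuming everything is already loaded as plain NumPy arrays):

import numpy as np

print(Attribute_A_data.shape)   # (150, 64, 218)
print(Total_Points_data.shape)  # (152, 64, 218)
print(np.unique(Attribute_A_labels, return_counts=True))   # labels 1-5 and their counts
print(np.unique(Total_Points_labels, return_counts=True))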

I’m working on Windows 10 with MNE 1.1.1.

from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC

from mne.decoding import GeneralizingEstimator

# Standardize the channels, then fit a linear SVM at every time point
clf = make_pipeline(StandardScaler(), LinearSVC(C=1.0, max_iter=10000))

# scoring=None falls back to the classifier's default scorer,
# which is plain accuracy for LinearSVC
time_gen = GeneralizingEstimator(clf, n_jobs=1, scoring=None, verbose=True)

# Fit on Attribute A, then test generalization to Total Points
time_gen.fit(X=Attribute_A_data, y=Attribute_A_labels)
scores = time_gen.score(X=Total_Points_data, y=Total_Points_labels)

print(scores)
[[0.01315789 0.01315789 0.00657895 ... 0.03289474 0.02631579 0.02631579]
 [0.02631579 0.01315789 0.01315789 ... 0.01973684 0.02631579 0.01973684]
 [0.04605263 0.01973684 0.00657895 ... 0.05921053 0.05263158 0.04605263]
 ...
 [0.01315789 0.01315789 0.00657895 ... 0.01315789 0.01315789 0.01315789]
 [0.01315789 0.00657895 0.00657895 ... 0.01315789 0.01315789 0.00657895]
 [0.02631579 0.01973684 0.00657895 ... 0.01315789 0.01973684 0.01315789]]

scores.max()
0.09868421052631579
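
For reference, chance accuracy with five roughly balanced classes should be around 0.2, so even the best cell of the generalization matrix is well below chance. One diagnostic I’ve been considering (a sketch using the fitted time_gen from above) is to compare the distribution of predicted labels against the true ones:

import numpy as np

# GeneralizingEstimator.predict returns an array of shape
# (n_epochs, n_train_times, n_test_times)
preds = time_gen.predict(Total_Points_data)
print(np.unique(preds, return_counts=True))                # which labels get predicted at all
print(np.unique(Total_Points_labels, return_counts=True))  # true label counts

If the predictions turn out to be systematically “flipped” (e.g., label 1 predicted mostly when the true label is 5), that would at least explain accuracy this far below chance.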

I’m hoping that I’m just missing some setting or preliminary step. Any advice or help would be greatly appreciated!