Dear MNE community,
- MNE version: 1.8.0
- operating system: Kubuntu 24.04
Context:
I'm running simulations of raw signals. After loading the head model, selecting the source (that is, the cortical patch that will be used as the source), and defining the source time course, I add the source time course to the SourceSimulator:
```python
source_simulator = mne.simulation.SourceSimulator(src, tstep=tstep)
for t in np.arange(n_trials):
    source_simulator.add_data(vis_r, act_1[:, t], events)
```
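For context, here is roughly how the inputs are built (a simplified sketch, not my exact script: the waveform, amplitude, and event spacing below are placeholders, and `raw`, `src`, and `fwd` are assumed to exist as above):

```python
import numpy as np
import mne

n_trials = 50
sfreq = raw.info["sfreq"]
tstep = 1.0 / sfreq  # sampling period used by the SourceSimulator

# vis_r: the cortical patch used as the source
# (a Label, read with mne.read_labels_from_annot in the real script)

# act_1: one waveform per trial, shape (n_samples, n_trials);
# here a ~100-sample Gaussian blip, identical across trials
times = np.arange(100) * tstep
blip = 1e-9 * np.exp(-((times - times.mean()) ** 2) / (2 * 0.01 ** 2))
act_1 = np.tile(blip[:, np.newaxis], (1, n_trials))

# events: one event per trial, 1 s apart; columns are [sample, 0, event_id]
events = np.zeros((n_trials, 3), dtype=int)
events[:, 0] = (np.arange(n_trials) * sfreq).astype(int)
events[:, 2] = 1
```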
Everything works fine!
I then use the SourceSimulator to simulate the raw signal:

```python
raw_vis = mne.simulation.simulate_raw(raw.info, source_simulator, forward=fwd)
```
No problem here either.
The issue concerns the running time of this last line (simulate_raw) as a function of the number of trials I want to simulate (i.e. the number of events in events, which equals n_trials). I was expecting the running time to scale linearly, but it is far from linear!
Here are some running times as a function of the number of trials:
| n_trials | running time (s) |
| --- | --- |
| 10 | 0.5 |
| 20 | 2.8 |
| 30 | 9 |
| 40 | 21 |
| 50 | 42 |
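For reference, the timings were collected with a loop along these lines (a sketch of my measurement; in the real script, events is rebuilt for each n_trials so that it always holds exactly n_trials events):

```python
import time

for n_trials in (10, 20, 30, 40, 50):
    source_simulator = mne.simulation.SourceSimulator(src, tstep=tstep)
    for t in np.arange(n_trials):
        # events here contains exactly n_trials events
        source_simulator.add_data(vis_r, act_1[:, t], events)
    t0 = time.perf_counter()
    raw_vis = mne.simulation.simulate_raw(raw.info, source_simulator, forward=fwd)
    print(n_trials, round(time.perf_counter() - t0, 1))
```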
With runtime growing this much faster than linearly, it is unrealistic to run a "large" number of trials (I was targeting "only" 300…).
I quickly inspected the code of simulate_raw, but I could not identify what could cause the super-linear growth.
Not sure this is relevant, but I'm running the code in a Jupyter notebook…
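If profiler output would help, I can run the call under IPython's %prun magic in a notebook cell, e.g.:

```python
# In a notebook cell: profile simulate_raw, sorted by cumulative time
%prun -s cumulative mne.simulation.simulate_raw(raw.info, source_simulator, forward=fwd)
```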
Any idea what could be causing this?
Thanks very much in advance,
B.