I’ve been working with some large EEG datasets recently (2 GB+) and noticed that exporting to external formats can hit memory bottlenecks during the data-writing stage.
I’m curious to hear from the community: when dealing with files this large, do you usually subset your data before exporting, or would you prefer that MNE’s export functions handle chunking automatically?
I’m looking into some I/O refactoring and want to make sure the implementation aligns with how researchers actually use these tools in the field.
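For context, here is a minimal sketch of what automatic chunking could look like under the hood: writing a recording to disk one block of samples at a time, so peak memory stays at roughly one chunk rather than the full array. This is a hypothetical illustration using plain NumPy, not MNE’s actual export code; the function name `export_in_chunks`, the sample-major binary layout, and the chunk size are all my own assumptions for the example.

```python
import os
import tempfile

import numpy as np


def export_in_chunks(data, path, chunk_samples=10_000):
    """Write a (n_channels, n_samples) array to a raw binary file in
    fixed-size chunks of samples (hypothetical sketch, not MNE API)."""
    n_channels, n_samples = data.shape
    with open(path, "wb") as f:
        for start in range(0, n_samples, chunk_samples):
            stop = min(start + chunk_samples, n_samples)
            # Transpose each chunk to sample-major (interleaved channels),
            # a common layout for EEG binary formats; only this slice is
            # materialized contiguously at a time.
            f.write(np.ascontiguousarray(data[:, start:stop].T).tobytes())


# Usage: a simulated 4-channel recording, exported in 1000-sample chunks.
rng = np.random.default_rng(0)
data = rng.standard_normal((4, 5000)).astype(np.float32)
path = os.path.join(tempfile.mkdtemp(), "export.bin")
export_in_chunks(data, path, chunk_samples=1000)

# Round-trip check: the chunked file reconstructs the original array.
loaded = np.fromfile(path, dtype=np.float32).reshape(-1, 4).T
```

The chunk size is the knob: larger chunks mean fewer write calls but a higher memory peak, which is exactly the trade-off an automatic chunking implementation would have to expose or tune.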