Can I use MNE-realtime to capture streaming EEG data from an Ethernet port?

Hi,

I have a biofeedback game written in Python which needs to respond in real time when EEG alpha wave amplitude exceeds a given value. The NeuroConn NEURO PRAX system is able to send a stream of raw EEG data to the PC that runs the game, via an Ethernet port using the TCP/IP protocol. I’m looking for a way to capture the EEG data and process them using Python, so that my game can access the processed data. It seems as if MNE-realtime should be capable of doing this. However, after carefully reading through the code of some of the MNE-realtime examples and the associated MNE documentation, there is a part of the process that I don’t understand and which makes me uncertain whether MNE-realtime is in fact able to capture EEG data from an Ethernet port.

The ‘RtClient’ class has a ‘host’ parameter, but this is defined as the “hostname (or IP address) of the host where mne_rt_server is running”. I’m struggling to find any documentation for ‘mne_rt_server’, but it sounds as if it is an MNE object. The problem is that I want MNE to be running on the client, not on the server – the server is part of the EEG system and it runs its own proprietary software. The ‘FieldTripClient’ class has a similar issue: the ‘host’ parameter is defined as the “Hostname (or IP address) of the host where Fieldtrip buffer is running”. It feels as if I’m caught in a circle, where MNE can capture data but only when the data are sent by MNE, or by some other data-capturing software such as FieldTrip…?! I’ve not done anything like this before, and I guess I’m getting a bit lost.

Please, can somebody tell me if MNE-realtime is able to capture streaming data from an Ethernet port?

Many thanks,
Richard

Hi Richard,

MNE-realtime has three clients: LSLClient, RtClient, and FieldTripClient. RtClient is specific to Neuromag MEG systems, so let’s leave that aside for now. The FieldTrip buffer and LSL are vendor-independent protocols for streaming EEG data, but you need to use a known protocol that MNE-realtime can understand. MNE-realtime is not capable of ingesting arbitrary streams of EEG data, since it would not know how to construct the metadata needed for further processing.

It sounds like you need to contact the NeuroConn folks to understand what kind of interface they support for the streaming. If they have a Python interface, you could feed the data into pylsl, provide a hostname, and then capture that using the same hostname on another computer on the same network. The FieldTrip buffer might also support your EEG system, but you should check their website.
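
Feeding data into pylsl could be as small as the sketch below. Everything here is illustrative: `read_sample()` is a hypothetical stand-in for whatever call returns one sample from the NeuroConn interface, and the channel count, rate and stream names are placeholders.

```python
# Hypothetical sketch: publish samples from a vendor interface as an LSL stream.
from pylsl import StreamInfo, StreamOutlet

n_channels = 32   # placeholder channel count
sfreq = 500.0     # placeholder sampling rate (Hz)

info = StreamInfo(name='NeuroPrax', type='EEG', channel_count=n_channels,
                  nominal_srate=sfreq, channel_format='float32',
                  source_id='neuroprax_eeg')
outlet = StreamOutlet(info)

while True:
    sample = read_sample()      # hypothetical: returns n_channels floats
    outlet.push_sample(sample)  # any LSL inlet on the network can now read it
```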

Mainak

Hey Richard,

Neuroprax has a TCP/IP interface, and neuroConn shares MATLAB and C# client code for real-time access. Five years ago, when I was still in academia, I wrote a Neuroprax to LSL converter in Python, and you can find it here: NeuroPrax2LSL Python Client · GitHub [edit because original link was to a private repo]

You can either adapt the code to your liking, or use LSL to connect with mne-realtime (or roll your own real-time processing).

I am now working at neuroConn, so you should get more information through your support request with them soon, too.

Thank you both for your helpful replies.

@mainakjas, your comments prompted me to try to understand the information that NeuroConn have provided about the interface that their ‘Data Server’ provides for streaming data, and to investigate how I might communicate with it directly using Python’s socket module. The interface is said to be proprietary, but there are a number of ‘protocols’ that transmit different sorts of information, one of which is for sending EEG data, and the sequence, data type and size of all the parameters included in these protocols are described in detail. I don’t have access to the ‘Data Server’ yet (we’re waiting for it to be installed on our hardware), but I’ve managed to knock together a server program in MATLAB, which sends data formatted in the same way as NeuroConn’s ‘Data Server’, and a client program in Python that receives the data correctly, knowing what data format to expect. I think this is a proof of principle that a Python program running on the client PC is capable of capturing streaming data coming from the EEG hardware.
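
For illustration, the core of my test client looks roughly like this. The packet layout shown (a 4-byte sample count followed by float32 samples) is a simplified stand-in, not the real NeuroPrax protocol, and the address is a placeholder:

```python
# Simplified TCP client for a fixed, known binary layout (illustrative only).
import socket
import struct

HOST, PORT = '192.168.1.10', 50000  # placeholder address of the Data Server

def recv_exact(sock, n):
    """Read exactly n bytes; TCP may deliver a packet in several chunks."""
    buf = b''
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError('server closed the connection')
        buf += chunk
    return buf

with socket.create_connection((HOST, PORT)) as sock:
    while True:
        (n_samples,) = struct.unpack('<I', recv_exact(sock, 4))
        payload = recv_exact(sock, 4 * n_samples)
        samples = struct.unpack(f'<{n_samples}f', payload)
        # ... hand `samples` to the processing code here
```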

I could continue working on the socket module code, but would have to modify it to make the socket non-blocking, or to use multithreading, so that it will run concurrently with the code for the game. In practice, it would probably be easier to use MNE-realtime with a FieldTrip or LSL buffer, as I believe these buffers are executed in their own threads. However, I remain confused by the description of the ‘host’ parameter in the MNE-realtime documentation for these buffers, e.g. the ‘FieldTripClient’ class defines the ‘host’ parameter as the “Hostname (or IP address) of the host where Fieldtrip buffer is running”. Surely this means the hostname or IP address of the server, and there is surely no need for the FieldTrip buffer to be running on the server as well as on the client if the format of the data sent by the TCP/IP interface is known?

@rtgugg, I tried to access the link that you provided, as it would be very useful to see how a Neuroprax to LSL converter works, but unfortunately GitHub told me that it could not find the page. Please could you check if the link is correct? It is clear from your response that what I’m trying to do is possible :blush: and yes, I’m hoping that NeuroConn will be able to give me some more information.

Many thanks,
Richard

Sorry. Fixed the link

Hi Richard,

> I could continue working on the socket module code, but would have to modify it to make the socket non-blocking, or to use multithreading, so that it will run concurrently with the code for the game.

It sounds like you’re beginning to reinvent the wheel at this point …

> In practice, it would probably be easier to use MNE-realtime with a FieldTrip or LSL buffer, as I believe these buffers are executed in their own threads.

I agree. There isn’t such a thing as an LSL buffer, though … you’d still have to write your own code to convert the NeuroConn data into LSL format, but you’d be saved the effort of writing a client. On the other hand, you could just use the socket program to knock together a simplistic client, but you’d still have to reformat the data for the needs of MNE for downstream processing.
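
By “reformat for the needs of MNE” I mostly mean attaching the channel metadata MNE needs. A minimal sketch, assuming you already have the samples in a NumPy array; the channel names and sampling rate are placeholders:

```python
# Sketch: wrap a raw sample buffer in MNE metadata so it can be
# filtered/processed downstream. Names and rate are placeholders.
import numpy as np
import mne

sfreq = 500.0                           # placeholder sampling rate (Hz)
ch_names = ['O1', 'Oz', 'O2']           # placeholder channel names
data = np.zeros((len(ch_names), 1000))  # (n_channels, n_samples) buffer

info = mne.create_info(ch_names=ch_names, sfreq=sfreq, ch_types='eeg')
raw = mne.io.RawArray(data, info)       # now usable by MNE, e.g. raw.filter(8., 12.)
```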

> However, I remain confused by the description of the ‘host’ parameter in the MNE-realtime documentation for these buffers, e.g. the ‘FieldTripClient’ class defines the ‘host’ parameter as the “Hostname (or IP address) of the host where Fieldtrip buffer is running”. Surely this means the hostname or IP address of the server, and there is surely no need for the FieldTrip buffer to be running on the server as well as on the client if the format of the data sent by the TCP/IP interface is known?

The FieldTrip buffer is the server for the FieldTripClient. For Neuromag, the buffer runs only on the acquisition computer and it can serve multiple clients. I haven’t worked with the buffer for other systems, so my knowledge is limited. As you can see, they support a number of vendors: https://github.com/fieldtrip/fieldtrip/tree/master/realtime/src/acquisition but it looks like Neuroprax is not supported. So I think your path forward should be using LSL.

See how the MockLSLStream is created: https://github.com/mne-tools/mne-realtime/blob/c7c3d9b78af7e897c75b9d541eedb7441d3a4651/mne_realtime/mock_lsl_stream.py#L70 so you can create your own LSL stream for the Neuroprax data.

Perhaps @teonbrooks can help you here since he’s the LSL expert.

Mainak

@rdkirkden let us know if the NeuroPrax2LSL and LSLClient combo works for you, and if it doesn’t, let us know where it’s not working for you.

Thank you all for your further advice. It has taken me a while, but I think I now understand your comments and what my options are.

I’ve been stuck for a while thinking that the way to capture EEG data from the NeuroPrax system is to examine the NeuroPrax data protocol and, based on this, to somehow tell MNE-realtime what data format to expect, since this approach had worked for my low-level Python ‘socket’ code. But then I realised that MNE-realtime expects to receive data in the LSL format (or in one of several other specific formats), and that this means writing code not only to receive data in the NeuroPrax format, but also to resend the data in the LSL (or other) format, so that MNE-realtime can receive the resent data in a format that it can already understand. Instead of ‘receive → process’, I need to ‘receive → process → resend → receive’.

@rtgugg, what a great resource the NeuroPrax2LSL program is! I found your code easy to follow and can see how it matches up with the NeuroPrax protocol whose details I have (I think the name of one parameter may have changed, as the NeuroPrax documentation that I have calls the sampling frequency ‘smpFreq’ instead of ‘fs’). I don’t have the NeuroPrax hardware available yet to test with, but I think the NeuroPrax2LSL program might work for me with almost no changes. I can see that using NeuroPrax2LSL is consistent with the advice given by @mainakjas that “you’d still have to write your own code to convert the NeuroConn data into LSL format”. As @teonbrooks has said, I’d be using a “NeuroPrax2LSL and LSLClient combo”.

One thing I haven’t fully understood yet is how to run two programs at once – NeuroPrax2LSL for capturing and resending the EEG data, and the other program that I’ve written to run my game, which will use MNE-realtime to capture and process the resent EEG data while also controlling the game itself. When I was experimenting with ‘socket’ code, I found that I could run a server and client at the same time from separate command lines. I suppose I could take a similar approach with the NeuroPrax2LSL program and my game program. These two programs are a lot more demanding on processor resources than my server and client programs were, but NeuroPrax2LSL uses a separate thread for most of its work.
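
For instance, I imagine something as simple as a tiny launcher could start both from one command line (the script names here are placeholders I have invented):

```python
# Hypothetical launcher: run the converter and the game as separate processes.
import subprocess
import sys

converter = subprocess.Popen([sys.executable, 'neuroprax2lsl.py'])
try:
    subprocess.run([sys.executable, 'biofeedback_game.py'], check=True)
finally:
    converter.terminate()  # stop the converter when the game exits
    converter.wait()
```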

The game program alone is going to be pretty demanding on processor resources because it has to both capture and process the EEG data and keep the biofeedback game running. The biofeedback game element is a VR experience, so this will require a lot of resources. Do you think I’d need to use threading to make this work?

There has also been an interesting development. I’ve been in contact with NeuroConn and they have told me something that might simplify things further, if I understand them correctly. They’ve said that their ‘Data Server’, whose data format is the one I’ve been examining and on which NeuroPrax2LSL is based, is not the only interface that they offer for capturing EEG data from the NeuroPrax system. An alternative, in their words, is “to use the amplifier only with a small box which sends out LSL-data”. This would give “direct access to LSL”. I’m waiting for further explanation, but it sounds as if the NeuroPrax system might be able to send a stream of EEG data that is already formatted for LSL. If so, this would mean that I could actually skip the NeuroPrax to LSL conversion step. I think the disadvantage of this option is that there is a higher latency before the data are sent, but for our biofeedback application we don’t need a low latency. Have you come across this NeuroPrax interface at all?

Sorry for the expanding number of questions, but your advice is so helpful in giving me an understanding of how to approach this project.

@rdkirkden interesting development. I think you’re right, and the ideal solution would be to have the NeuroPrax amplifier feed its “box” that streams to LSL, and to use the LSLClient for processing the data. It would definitely save you the separate process for streaming the data that would then be listened to on the same PC. I think a lot of these paradigms use the working assumption of a dedicated acquisition computer separate from the one that would be used for online processing.
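
If the box really does stream LSL directly, the client side could be as small as the sketch below. The stream identifier is a placeholder I have made up, and the sample count assumes a 500 Hz rate:

```python
# Sketch: read from an LSL stream with mne-realtime's LSLClient.
from mne_realtime import LSLClient

# 'neuroprax_eeg' is a placeholder for the stream's source_id.
with LSLClient(host='neuroprax_eeg', wait_max=10) as client:
    for _ in range(10):
        epoch = client.get_data_as_epoch(n_samples=500)  # ~1 s at 500 Hz
        # ... compute alpha power on `epoch` here
```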

When setting up this library, we wanted to have a few solutions for getting the input data. We decided upon the RtClient @mainakjas mentioned, since we already had support for it within the library, and I added support for LSL because it is a standard that is used in a lot of different research labs and the team behind it has worked to make the different hardware options compatible with LSL.

I haven’t come across the NeuroPrax hardware before so I don’t have any familiarity with it. Do they sell this LSL streaming box or make one available to you?

You could do the biofeedback and acquisition on the same computer, but you could also do them on different computers. A long time ago I wrote two classes in MNE-Realtime: StimServer and StimClient. The server lives on the “acquisition” computer while the “client” lives on the stimulation computer. You can send a trigger code using this interface for biofeedback purposes. See:

https://mne.tools/mne-realtime/auto_examples/plot_rt_feedback_server.html
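
In outline, it works like this (adapted loosely from that example; the port number is arbitrary and the two snippets run on different machines):

```python
# On the acquisition PC: serve integer trigger codes.
from mne_realtime import StimServer

with StimServer(port=4218) as stim_server:
    stim_server.start(verbose=True)  # wait for the client to connect
    stim_server.add_trigger(42)      # e.g. a code derived from alpha power
```

```python
# On the biofeedback PC: receive the trigger codes.
from mne_realtime import StimClient

stim_client = StimClient('localhost', port=4218)  # use the acquisition PC's address
trig = stim_client.get_trigger(timeout=15.0)
```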

Give it a shot if it serves your purposes …

I don’t know if threading will work for you because of the Global Interpreter Lock. It may affect the timings when there is another computationally intensive thread.

Mainak

Thank you both for your continued support. I agree with you that the best approach will be to have two different computers, one to capture and process the EEG data and the other to run the biofeedback game.

The ‘box’ that NeuroConn make for sending out EEG data from the amplifier in LSL format is informally known as the ‘Lilli’ (I’m not sure if it has a formal name). It is sold as a separate piece of hardware. It has a latency of 60–238 ms, depending on the sampling frequency of the amplifier. This is greater than the latency of NeuroConn’s standard TCP/IP interface (which I’ve been told is in the region of 41–54 or 60–80 ms, I’m not clear which range is correct), but it is OK for our purposes. I think another consideration is that the standard interface is capable of sending data that have been corrected using the NeuroPrax system’s artefact correction algorithms, whereas the ‘Lilli’ box is not, because it takes data directly from the amplifier. I think we’re probably going to be able to manage without artefact correction, as we’re recording from the occipital region where blinks don’t show up as much, but I thought I’d mention it to give you a complete picture of the differences between these two interfaces.

I think the overall layout of the hardware/software will be:

  1. NeuroPrax amplifier sends EEG data via the ‘Lilli’ box in LSL format to the acquisition PC, via an Ethernet connection.

  2. Acquisition PC runs a program using the MNE library to capture and process the EEG data and sends a stream of numbers representing alpha band power to the biofeedback PC via an Ethernet connection (a rough sketch of this program follows the list).

  3. Biofeedback PC runs the biofeedback game and receives the alpha band power signal, modifying the game display in some way to represent changes in the alpha power magnitude.
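
To make step 2 concrete for myself, I imagine the acquisition program looking roughly like the sketch below. The stream identifier, port, sample count and the scaling of the power value into an integer code are all placeholders:

```python
# Rough sketch of step 2: read EEG over LSL, estimate alpha-band power,
# and forward it to the biofeedback PC as an integer code.
import numpy as np
from mne_realtime import LSLClient, StimServer

with StimServer(port=4218) as stim_server, \
        LSLClient(host='neuroprax_eeg', wait_max=10) as lsl_client:
    stim_server.start(verbose=True)
    while True:
        epoch = lsl_client.get_data_as_epoch(n_samples=500)  # ~1 s at 500 Hz
        epoch.filter(8., 12., method='iir')     # keep the alpha band
        data = epoch.get_data()                 # shape (1, n_channels, n_samples)
        alpha_power = np.mean(data ** 2)        # mean alpha-band power
        stim_server.add_trigger(int(alpha_power * 1e12))  # arbitrary scaling
```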

My understanding of networking hardware and protocols is limited, so I wonder if I could ask you a follow-up question about this? I think the acquisition PC will require two network adapters: one to receive data from the ‘Lilli’ box and another to send out processed data to the biofeedback PC. Is this correct? If so, I think we can just buy a PC and fit a second network adapter to it? The network adapters should have separate IP addresses and the PC should be able to concurrently receive data on one local network and send data on the other?

@mainakjas, thanks for pointing me to the ‘mne_realtime.StimServer’ class. I should be able to use this to send a stream of integers from the acquisition PC. On the biofeedback PC, I have found that I can capture a stream of integers using the ‘socket’ library and that this doesn’t seem to interfere with the running of the biofeedback game.

I don’t know much about the realtime stuff, but this part sounds correct to me. It’s not unusual for a computer to have multiple network adapters (or a single network adapter that offers two or more jacks with separate addresses).

@drammock, thank you for your feedback about the networking hardware. This confirms that I’m on the right track!

Revisiting the question of which EEG interface to use, I think we need to plan for the eventuality that data correction for blinks turns out to be necessary. This would mean using the standard NeuroPrax interface, which sends data in the proprietary NeuroPrax format instead of in LSL format. I would like to use the NeuroPrax2LSL Python Client to handle these data, as it is ready-made for the job. @rtgugg, please could you tell me what the best way would be to integrate the data format conversion carried out by NeuroPrax2LSL with subsequent data processing (to extract the alpha component)? NeuroPrax2LSL is designed to output an LSL data stream and the MNE LSLClient is designed to receive an LSL data stream, so the default approach would seem to be to have two programs running concurrently and sending data to one another. Would it be feasible to have these programs running on the same computer and communicating via localhost, or should they be on different computers? The alternative would be to create a single program that incorporates (1) the part of the NeuroPrax2LSL code that captures and handles the NeuroPrax data, yielding a dictionary of settings parameters and an np_array for each packet of data, and (2) some scipy code that processes the data to extract the alpha component, instead of using MNE for the data processing.
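
For the scipy route, I have in mind something like the following for the alpha extraction; the sampling rate and band edges are placeholders for our actual settings:

```python
# Sketch: mean alpha-band power from one packet of samples, using
# Welch's method. Sampling rate and band edges are assumptions.
import numpy as np
from scipy.signal import welch

def alpha_power(np_array, sfreq=500.0, band=(8.0, 12.0)):
    """Mean alpha-band power for a (n_channels, n_samples) array."""
    freqs, psd = welch(np_array, fs=sfreq,
                       nperseg=min(256, np_array.shape[-1]))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[..., mask].mean()
```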