Spending the holidays with brain-computer interfaces, EEG data and Python

 


A few days ago I published a post on LinkedIn about my holiday project on brain-computer interfaces (BCIs). I decided a blog post made more sense for me, so here goes!

I have been interested in neuroscience for years. Actually, one of my first pet projects for a company was an emotional intelligence business. It didn't go far because I quickly understood how little we know about the actual functioning of the brain. But I continued reading incredible books: everything I could find by Antonio Damasio, The Brain That Changes Itself, the amazing Phantoms in the Brain, again anything written by Oliver Sacks, or a book I wasn't able to finish, The Quest for Consciousness, by Christof Koch. A special shoutout to The Mind's I, a mix of neuroscience and philosophical articles that just blew my mind (no pun intended) :)

Some years later I purchased an EEG device via Kickstarter and I learned more about what we actually know and can do with the outputs of our brain. 

I also had the pleasure of meeting David Vivancos, who has been working in this area and advising companies such as Emotiv for many years (check out his latest paper on the topic).

I continued learning about the brain, but with more focus on its relationship with behavior. My readings and drawings around the book Behave were the latest things I had done on the topic.


But I still wanted to do more with the EEG device, so this winter holiday I decided to dedicate some time to playing around with it and to seeing what else I could do around EEG and the like. I did not have a specific project in mind, so the outcome was more about finally grasping a general but firm understanding of the workflow required to obtain relevant data from the brain in order to support behavioral decisions.


Playing around with the BCI device

 



The first goal was to make my Emotiv Insight device work. While I used it quite a lot around 2015-2016, I had not really gone much further than that, and the last few times I tried it, it was really frustrating (as in "the sensors don't work" frustrating :)). But with some conductive gel and some resilience :) I was able to activate the account, download the new Emotiv apps (the ones I had used were now inactive), train the basic face and cube-motion stuff and even send mental commands. Even my daughter was able to train the cube and move it with her mind, much better than me :D

 

Using the Cortex 2.0 API


 

I also used the Emotiv Cortex 2.0 API and the Python package to interact with the device programmatically. I am not paying for the Emotiv EEG subscription (this frustrates me a lot: if I paid for the device, shouldn't I have access to everything it generates?), so I only get access to processed information such as emotions, etc. Anyway, I played around with some Jupyter notebooks and was able to alter the examples provided to obtain some training data, etc.
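For the curious, this is roughly what that interaction looks like. The Cortex service exposes a JSON-RPC API over a local WebSocket; the method and parameter names below follow my reading of the Cortex docs and examples (so double-check them against the current API reference), and the client id/secret are placeholders you get from your Emotiv developer account:

```python
# Hedged sketch of talking to the local Cortex service over its WebSocket
# JSON-RPC endpoint. Method/parameter names are from my reading of the docs;
# CLIENT_ID / CLIENT_SECRET are placeholders from your developer account.
import json
import ssl
import websocket  # pip install websocket-client

CORTEX_URL = "wss://localhost:6868"
CLIENT_ID = "your-client-id"          # placeholder
CLIENT_SECRET = "your-client-secret"  # placeholder

ws = websocket.create_connection(CORTEX_URL, sslopt={"cert_reqs": ssl.CERT_NONE})

def call(method, params, request_id):
    """Send one JSON-RPC request and return the parsed response."""
    ws.send(json.dumps({"jsonrpc": "2.0", "id": request_id,
                        "method": method, "params": params}))
    return json.loads(ws.recv())

# 1. Authorise the application and get a Cortex token.
auth = call("authorize", {"clientId": CLIENT_ID, "clientSecret": CLIENT_SECRET}, 1)
token = auth["result"]["cortexToken"]

# 2. Find a connected headset and open a session with it.
headsets = call("queryHeadsets", {}, 2)
headset_id = headsets["result"][0]["id"]
session = call("createSession", {"cortexToken": token, "headset": headset_id,
                                 "status": "open"}, 3)
session_id = session["result"]["id"]

# 3. Subscribe to a processed stream (e.g. "met", the performance metrics)
#    and print whatever samples arrive. Raw "eeg" needs the paid licence.
call("subscribe", {"cortexToken": token, "session": session_id,
                   "streams": ["met"]}, 4)
while True:
    print(ws.recv())
```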

 

Seeing brainwaves in action  

I also used the BrainViz app from Emotiv to dig a little deeper into the world of brainwaves. It is intellectually fascinating to me how we use wave frequencies to understand how our brain thinks. And I also grew a little concerned about how, sometimes, the signals would completely disappear :D :D :D
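For context, those famous bands (delta, theta, alpha, beta, gamma) are basically just power estimates over frequency ranges of the EEG signal. Here is a small illustrative sketch of the idea; the data is simulated noise so it runs on its own, and the band boundaries are common conventions rather than fixed definitions:

```python
# Illustrative sketch: the classic EEG bands boil down to power over
# frequency ranges. The signal is simulated noise so the snippet is
# self-contained; with a real recording you would pass the device's samples.
import numpy as np
from scipy.signal import welch

SFREQ = 128  # sampling rate in Hz (assumed; use your device's actual rate)
signal = np.random.randn(SFREQ * 10)  # 10 s of fake single-channel EEG

freqs, psd = welch(signal, fs=SFREQ, nperseg=SFREQ * 2)
df = freqs[1] - freqs[0]  # frequency resolution of the PSD estimate

# Commonly used band boundaries (they vary a bit between sources).
bands = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 12),
         "beta": (12, 30), "gamma": (30, 45)}

for name, (lo, hi) in bands.items():
    mask = (freqs >= lo) & (freqs < hi)
    band_power = psd[mask].sum() * df  # integrate the PSD over the band
    print(f"{name:>5}: {band_power:.4f}")
```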


Participating in some experiments


A while ago Emotiv launched its Citizen Community, where you can participate in experiments and play some games with the device. That was a fun part of the process: understanding what each experiment was trying to test... or at least guessing it!

 

Oh, the mastoid


By using the device, I also read and learnt more about how the hardware of these devices works. I knew some basics about the sensors and the like, but I hadn't read much about some details, such as the importance of the reference points (and why one option is to place the reference sensor on the mastoid).
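In code, that choice shows up as a re-referencing step. Here is a tiny sketch of how it looks with MNE; the file name and the mastoid channel names are placeholders that depend on your recording and montage:

```python
# Tiny sketch of how the reference choice appears in MNE. The file name and
# the mastoid channel names are placeholders (mastoids often show up as
# 'M1'/'M2' or 'TP9'/'TP10' depending on the montage).
import mne

raw = mne.io.read_raw_fif("my_recording_raw.fif", preload=True)

# Re-reference every EEG channel to the average of the two mastoid electrodes
# (this only works if those channels actually exist in the data)...
raw.set_eeg_reference(ref_channels=["M1", "M2"])

# ...or fall back to a common average reference instead.
raw.set_eeg_reference(ref_channels="average")
```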

 

Finding out about MNE and being amazed by it

 

In parallel, I learnt how to read EEG datasets with MNE, a really, really incredible Python framework. The first part of the holidays was spent just understanding the basics of the whole process (from reading .fif files to preprocessing, epoching, adding events, etc.). I had to use public datasets, as I could not use my own brain-generated EEGs, but the good news is that this way I could check that I was doing the right things. In general, I found the tutorials extremely well explained given the complexity of the topic. And just today I was able to read David Vivancos's latest paper, where he provides lots of EEG and EOG data from his own MindBigData project. It's amazing that the community provides so many interesting and useful resources!
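As a note to future me, this is a condensed sketch of that basic workflow, using MNE's bundled sample dataset so it runs as-is; the stimulus channel name and event ids are the ones from the MNE tutorials for that dataset, so they would need adapting for other data:

```python
# Condensed sketch of the basic MNE workflow on the bundled sample dataset
# (a sizeable download the first time). Channel/event details below match the
# MNE tutorials for this dataset; swap in your own for other data.
import mne
from mne.datasets import sample

raw_path = sample.data_path() / "MEG" / "sample" / "sample_audvis_raw.fif"
raw = mne.io.read_raw_fif(raw_path, preload=True)

# Events live on a stimulus channel; extract them before filtering.
events = mne.find_events(raw, stim_channel="STI 014")

# Basic preprocessing: band-pass filter the EEG channels.
raw.filter(l_freq=1.0, h_freq=40.0, picks="eeg")

# Cut the continuous recording into epochs around each event.
event_id = {"auditory/left": 1, "auditory/right": 2,
            "visual/left": 3, "visual/right": 4}
epochs = mne.Epochs(raw, events, event_id=event_id, tmin=-0.2, tmax=0.5,
                    baseline=(None, 0), picks="eeg", preload=True)

# Average the epochs of one condition into an evoked response and plot it.
evoked = epochs["visual/left"].average()
evoked.plot()
```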

 

Going deeper with machine learning


The final days of the week were spent using some machine learning to relate events with electric signals. I focused on finding P300 ERPs (a positive spike in the electric signal that appears around 300 to 600 ms after you see or hear an event) because there were more examples available. This is where I need to put more effort in the future, as my eyes are clearly not used to spotting the "obvious differences between EEG lines" :D I thought I wouldn't have time to go a little further and use ICA (Independent Component Analysis) or PCA (Principal Component Analysis), which are techniques used to clean EEG data by removing "artifacts" (basically, signals that come from blinking your eyes or moving your muscles and that are noise for the purpose of analysing brain activity). But just today I was able to play around with them a little bit more.
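To make both ideas concrete, here is a rough sketch of how they look with MNE and scikit-learn: ICA to strip out blink-like components, and a simple linear classifier to separate the event classes instead of my untrained eyes. It reuses the raw and epochs objects from the MNE sketch above, and which ICA component to exclude is a manual judgement call rather than anything automatic:

```python
# Sketch reusing `raw` and `epochs` from the MNE sketch above (a continuation,
# not a standalone script). Which ICA component to drop is decided by eye
# after inspecting the component plots.
from mne.preprocessing import ICA
from mne.decoding import Vectorizer
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# 1. ICA to remove blink/muscle artifacts from the continuous EEG.
ica = ICA(n_components=15, random_state=42)
ica.fit(raw, picks="eeg")
ica.plot_components()          # inspect the topographies...
ica.exclude = [0]              # ...and mark the blink-like one(s) by hand
raw_clean = ica.apply(raw.copy())

# 2. Let a linear classifier separate the event classes from the epochs
#    (for a P300 paradigm this would be target vs. non-target trials).
X = epochs.get_data()          # shape: (n_epochs, n_channels, n_times)
y = epochs.events[:, 2]        # event code of each epoch
clf = make_pipeline(Vectorizer(), StandardScaler(),
                    LogisticRegression(max_iter=1000))
scores = cross_val_score(clf, X, y, cv=5)
print(f"Mean cross-validated accuracy: {scores.mean():.2f}")
```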

 

I still have lots of things to study and tinker with. As with any serious research topic, every time I believe I am starting to grasp a concept, tens of new subtopics pop up. One has to go constantly from the neuroscience side to the behavioral side, back to data processing and machine learning, and then on to the ethics side. I always approach these efforts from a product manager's standpoint, so even though it is more than clear that a brain-related product or service requires a multidisciplinary approach, I try to get an overall understanding of everything related to the topic. This is overwhelming (and another opportunity for my impostor syndrome to appear again) but totally worth it.

The most relevant outcome is that I now have a much deeper understanding of the steps required to go from pure electric-signal readings to my final interest, which is to see how this could be used to build more relevant, ethical and useful products for people. I wouldn't say I can read academic papers about the nuances of every topic (that would take me another life or two, I guess!), but even if I'm still miles from the bottom of the knowledge pool, I've been able to dive quite a few meters already.

 

Some resources in case you're interested

I could not end this post without mentioning the main resources I have used. In this case, and more than ever, I would've been lost without them.

  • The MNE site is impressive in terms of explanations and tutorials: https://lnkd.in/dcUVtjPZ. The community is really broad, and by just googling a little I found new MNE-based tutorials that helped me understand some concepts in more detail.
  • The NeurotechEdu community is also very impressive. The tutorials are explained in depth and work really well: https://lnkd.in/dZRPTUCn. I just signed up for their newsletter :)
  • I used ChatGPT as my smart companion. It helped me with theory questions, possible approaches and even with some of the Python code I needed. So thumbs up to OpenAI as well: https://chat.openai.com


During my search I found references to some potentially interesting books on BCI and EEG processing. I had already read Tan Le's book, The Neurogeneration (Tan Le is Emotiv's CEO and founder), but now I see I really need to go deeper into the literature. I mention some here without really knowing much about them:

  • Brain-Computer Interfacing: An Introduction. By Rao.
  • Brain-Computer Interfaces: Principles and Practice. By Wolpaw and Wolpaw.
  • The Brain Electric: The Dramatic High-Tech Race to Merge Minds and Machines. By Gay. 
  • A Practical Guide to Brain–Computer Interfacing with BCI2000: General-Purpose Software for Brain-Computer Interface Research, Data Acquisition, Stimulus Presentation, and Brain Monitoring. By Schalk and Mellinger. 
  • Toward Brain-Computer Interfacing. By Dornhege (editor).
 

All in all, I really enjoyed learning more about a topic I really love. I hope to keep finding time in the coming months to continue this journey, and I hope this can be useful or interesting to some of you!

 

Cover image by Natasha Connell, Unsplash

 

 
