Absolute0K / OpenLD-Software

Software for the OpenLD Lucid Dreaming Research Platform
MIT License

Hey there! #1

Venryx opened this issue 7 years ago

Venryx commented 7 years ago

Just thought I'd mention that this project looks promising, and as a die-hard lucid dreaming enthusiast, I plan to join up at some point and contribute whatever I can to the initiative (and others like it).

I'm currently working ~6 hours per day on an RTS game (I'm an aspiring indie game developer), but besides that (and one other long-term project), this one ranks next highest: helping build a rich, open-source infrastructure that assists lucid dreamers in achieving lucidity (and facilitates two-way communication, etc., while in the dream).

Looking forward to potentially working with you in the future!

(The main thing holding me back from starting work atm is lack of hardware; OpenBCI looks solid and promising, but it's so expensive! I will take a closer look at the alternative hardware you set up...)

Edit1: https://hackaday.io/project/13285/components That component list is very intimidating to a software-only developer. : |

Edit2: https://hackaday.io/project/13285/logs You are kidding me. You BAKED your own hardware?!? 8 | I'm beginning to think I will not be very useful for this project... (at least not until I have enough money to justify buying a pre-built one)

Venryx commented 7 years ago

Looks like I'll be able to do some experimenting after all!

While the OpenBCI board is still too expensive, I ended up buying a cheaper alternative called Muse. It only has five electrodes instead of sixteen, but it's much more affordable: it's $250 ($211 if you have a referral), and doesn't require any extra hardware.

And here's the cool part: the platform has a nice SDK that lets you get the raw EEG data (or processed versions of it) in real time, at no extra cost. The SDK can be used for apps on Android, iOS, and desktop.

Here's a page on the developer kit: http://www.choosemuse.com/developer-kit/
Here's a page on the device capabilities, with a screenshot of the EEG data graphed: http://developer.choosemuse.com/hardware-firmware/hardware-specifications
And here's a listing of the headset data accessible through the API: http://developer.choosemuse.com/research-tools/available-data

I'm planning to build an Android app that links with the Muse headset, using React Native. The repo for it is here if you're interested: https://github.com/Venryx/LucidLink
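To make the idea concrete, here's a rough TypeScript sketch of how the app might buffer the raw EEG stream once it's bridged from the native SDK. The packet shape, channel order, and listener type are placeholders for illustration, not the actual Muse SDK API.

```typescript
// Hypothetical shapes only -- not the real Muse SDK bindings, which will differ.
interface EegPacket {
  timestampMs: number;
  // One value per electrode, in microvolts (order assumed: AF7, AF8, TP9, TP10).
  channels: number[];
}

type EegListener = (packet: EegPacket) => void;

// A minimal ring buffer that keeps the last N milliseconds of packets,
// so detection code can look back over a window instead of single samples.
class EegBuffer {
  private samples: EegPacket[] = [];
  constructor(private windowMs: number) {}

  push(packet: EegPacket): void {
    this.samples.push(packet);
    const cutoff = packet.timestampMs - this.windowMs;
    while (this.samples.length > 0 && this.samples[0].timestampMs < cutoff) {
      this.samples.shift();
    }
  }

  // Returns the buffered values for one channel index.
  channel(index: number): number[] {
    return this.samples.map(p => p.channels[index]);
  }
}

// Usage: wire the buffer to whatever callback the native bridge ends up exposing.
const buffer = new EegBuffer(30_000); // keep the last 30 seconds
const onEegPacket: EegListener = packet => buffer.push(packet);
```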

Anyway, thought I'd mention it since you've said previously the OpenBCI board was beyond your price range atm.

Absolute0K commented 7 years ago

Hi! Sorry for the late reply, I just found your post today! I had absolutely no idea this issue even existed, as I received no email alerts from GitHub. (WTF?) I'll check here regularly from now on.

It would be awesome if you could help me out with this project! I've looked into the Muse and your LucidLink page, and it looks very promising. Note that I barely know anything about JS and React Native, so I'm afraid I won't be very useful for LucidLink ;).

One really annoying thing with the BioEXG was that I had to constantly clean the gold cup electrodes and apply conductive paste whenever I went to sleep (it gets old very quickly). The Muse eliminates that hassle, since it's just a headset, so it'll be much easier to use, which is ideal if we want to make this project easily accessible for everybody.

But there is a slight problem: my REM detection algorithm requires one EEG channel (Fpz-A2) and two EOG channels (E1-A2 and E2-A2). The Muse's electrodes are not placed that way, so we will have to come up with another REM algorithm for it. We could perhaps change the algorithm to use channels AF7 and AF8, but I'm not sure whether that will work.

Anyways, that's my two cents. Right now I am trying to get the "dream-to-real-life" Morse code communication working using eye movements, but schoolwork is killing me. Hopefully I will make some progress soon.

Good luck with your Muse hardware!

Venryx commented 7 years ago

Interesting. So the three electrodes used for your detection algorithm are on the forehead and above it? That's just a guess, as I couldn't find any electrode placements named Fpz-A2 or E1-A2/E2-A2 on the charts I looked up. (e.g.: https://www.google.com/search?q=E1-A2+eeg&prmd=nmsiv&source=lnms&tbm=isch&sa=X&ved=0ahUKEwi437f8ysvQAhUS6GMKHcN3AJQQ_AUICigE&biw=1280&bih=800#tbm=isch&q=eeg+electrodes+placement&imgrc=uKhLpOi-daZvFM%3A)

Either way, that probably would at least make it difficult to detect REM reliably using the same algorithm.

Perhaps, instead of frequency-based REM detection, I could fall back (for the Muse) to simply checking for eye movements over a long enough period of time, since it picks those up pretty well. (You can see some of my eye-movement tests here: http://www.dreamviews.com/lucid-aids/162202-muse-headband-second-look-maybe-its-actually-pretty-good.html#post2207713 )
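As a rough sketch of that fallback idea (in TypeScript, with made-up thresholds that would need tuning against real recordings): count large, fast deflections on a frontal channel and flag probable REM when enough of them occur inside a window.

```typescript
// Count fast, large-amplitude deflections on one frontal channel (AF7/AF8 pick
// up eye movements well). The threshold and spacing are assumptions.
function countEyeDeflections(
  samples: number[],         // one channel's buffered values, in microvolts
  sampleRateHz: number,
  amplitudeThresholdUv = 75  // placeholder; needs per-user/headset calibration
): number {
  let count = 0;
  // Compare samples ~50 ms apart so slow drift doesn't register as a movement.
  const step = Math.max(1, Math.round(sampleRateHz * 0.05));
  for (let i = step; i < samples.length; i++) {
    if (Math.abs(samples[i] - samples[i - step]) > amplitudeThresholdUv) {
      count++;
    }
  }
  return count;
}

function looksLikeRem(samples: number[], sampleRateHz: number): boolean {
  // Assumed heuristic: many deflections inside the window suggests ongoing
  // rapid eye movements rather than isolated blinks or artifacts.
  return countEyeDeflections(samples, sampleRateHz) >= 20;
}
```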

And very cool that you're also trying to get a Morse-code-like system set up. :) I'm opting, at least for now, for an approach that's a bit simpler to learn (no memorization required), though probably also a bit slower: a grid of letters, with eye movements used to virtually move to and "select" them. It's not coded yet, but the idea is pretty simple and should work, though I may have to forgo the diagonal movements and just use the four main directions if the detection turns out not to be precise enough.

I might eventually move to a more Morse-code-like system, but I'd like to at least try the grid approach, since it only takes about a minute to learn and is something people could start with.
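Here's a minimal TypeScript sketch of the grid idea using only the four main directions; the 5x5 layout, the center start position, and the "select" gesture are all assumptions for illustration (detecting the eye gestures themselves is out of scope here).

```typescript
type Direction = "up" | "down" | "left" | "right" | "select";

// 5x5 letter grid (Z omitted to keep it square); cursor starts at the center.
const GRID: string[][] = [
  ["A", "B", "C", "D", "E"],
  ["F", "G", "H", "I", "J"],
  ["K", "L", "M", "N", "O"],
  ["P", "Q", "R", "S", "T"],
  ["U", "V", "W", "X", "Y"],
];

class GridSpeller {
  private row = 2; // center of the grid ("M")
  private col = 2;
  public output = "";

  move(dir: Direction): void {
    switch (dir) {
      case "up":    this.row = Math.max(0, this.row - 1); break;
      case "down":  this.row = Math.min(GRID.length - 1, this.row + 1); break;
      case "left":  this.col = Math.max(0, this.col - 1); break;
      case "right": this.col = Math.min(GRID[0].length - 1, this.col + 1); break;
      case "select":
        this.output += GRID[this.row][this.col];
        this.row = 2; // jump back to center after each selection
        this.col = 2;
        break;
    }
  }
}

// e.g. spelling "H": from the center letter "M", one move up lands on "H".
const speller = new GridSpeller();
speller.move("up");
speller.move("select");
// speller.output === "H"
```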

Anyway, good luck to both of us! Will be a great day when there's a lucid dream device that actually works, and is obtainable to the average person.

Absolute0K commented 7 years ago

As you may know, EEG signals are differential, meaning each channel is the difference between two voltage sources. So in the case of Fp1-A2 (Fpz and Fp1 are pretty much identical), the positive electrode (the non-inverting input) connects to Fp1, while the negative electrode (the inverting input) connects to A2.

In the case of the Muse, the negative electrode is connected to Fpz, while the positive electrodes are connected to AF7, AF8, and so on. So those channels should actually be named AF7-Fpz and AF8-Fpz.

(This is just some technical gibber jabber, just good to know)
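A tiny worked example of that point, with invented voltages (microvolts) just to show that a channel name is "positive electrode minus reference":

```typescript
// Made-up electrode potentials, in microvolts.
const v = { Fp1: 12.0, Fpz: 11.5, AF7: 14.0, AF8: 9.0, A2: 3.0 };

const fp1_a2  = v.Fp1 - v.A2;  // "Fp1-A2": A2 (ear) is the inverting reference
const af7_fpz = v.AF7 - v.Fpz; // what the Muse actually measures: "AF7-Fpz"
const af8_fpz = v.AF8 - v.Fpz; // likewise "AF8-Fpz"
```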

I am using the Morse code approach since it is very easy to interpret: just left and right eye movements. The grid approach feels like a very elegant solution, but I suspect the actual implementation is going to be quite complicated: left, right, down, up, etc. Maybe some preprocessing plus neural networks? I have no idea. But if you do get it to work, that would be awesome!
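For comparison, here's a rough TypeScript sketch of the Morse idea: a left eye movement as a dot and a right one as a dash (an assumed mapping, any convention works), with symbols collected until a pause and then looked up. Timing and pause detection are omitted.

```typescript
// Partial Morse table; remaining letters omitted for brevity.
const MORSE: Record<string, string> = {
  ".-": "A", "-...": "B", "-.-.": "C", "-..": "D", ".": "E",
  "..-.": "F", "--.": "G", "....": "H", "..": "I", ".---": "J",
};

type EyeMove = "left" | "right";

// Map a burst of eye movements to a letter (undefined if the pattern is unknown).
function decodeLetter(moves: EyeMove[]): string | undefined {
  const pattern = moves.map(m => (m === "left" ? "." : "-")).join("");
  return MORSE[pattern];
}

// e.g. left-left-left-left => "....", which decodes to "H"
decodeLetter(["left", "left", "left", "left"]); // "H"
```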

If you want, I can send you the C++ code and the REM detection algorithm I have through DreamViews (this GitHub platform is not very usable right now...).

Venryx commented 7 years ago

Didn't know that, but thanks for the info.

I'll look more into the details of EEG if/when the basic code systems are in place and I start getting feedback on how well (or badly) simple pattern matching works on the channel data. (Additional knowledge could then be useful for improving/supplementing the simple pattern-matching system.)

And sure, send me the code! (That way I'll also know your username on DreamViews 😄.) I'll look it over, and probably plug it into the channels that do exist at some point just to see what happens. (Of course, I'd need to graph or log the data through the night to make sense of it and see if it's useful, which means it'll probably have to wait until after I get the "tracker" page in my app created.)

As for the eye movements, yeah, it will be interesting to see if they're distinct enough to be picked up. If not, I'll fall back to a simpler approach. (I'll probably include both either way, for users to use through JS scripting if they want.)