Testing a low-cost, 28-sensor DOT headset. This repo will be updated as I get results and working code to share. I will share how to drive the LEDs and how I get simultaneous sensor output as things become demonstrable.
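Until I post the actual firmware, here's a rough sketch of how the LED driving and simultaneous sampling could be scheduled: light one LED pair per time slot while reading every OPT101 in that slot. All names and the schedule structure are my illustrative assumptions, not code from this repo.

```python
# Hypothetical time-division multiplexing (TDM) schedule for this headset:
# one LED pair lit per time slot while all 28 OPT101 outputs are sampled
# at once. Names and structure here are illustrative, not the repo's firmware.

N_LED_PAIRS = 24
N_SENSORS = 28

def build_schedule(n_pairs=N_LED_PAIRS):
    """One full cycle: each slot lights one LED pair and reads every sensor."""
    return [
        {"slot": i, "led_pair": i, "read_sensors": list(range(N_SENSORS))}
        for i in range(n_pairs)
    ]

schedule = build_schedule()
print(len(schedule))  # 24 slots per full illumination cycle
```

One full cycle gives you 24 illumination patterns x 28 sensor readings, which is the raw data a tomographic reconstruction would work from.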
DIY flexible DOT sensor design based on this 2007 retinotopic mapping paper: a simple, modular 28-channel high-density optical tomography headset. The question I want to answer: can I get deep image reconstruction?
This design uses 28 OPT101 photodiodes and 24 pairs (48 total) of generic 1206-size LEDs in whatever wavelengths you want. The OPT101's gain can be turned up by adding a capacitor and resistor between the appropriate pins (see the datasheet). You can get really cheap ones on AliExpress that work consistently.
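For a ballpark feel of what the gain tweak buys you: the OPT101's internal feedback resistor is 1 MΩ (per the datasheet), and adding external feedback resistance raises the transimpedance gain while the feedback capacitor sets the bandwidth. The external R/C values below are illustrative placeholders I picked for the example, not tuned recommendations.

```python
import math

# Rough OPT101 transimpedance numbers. The 1 MOhm internal feedback
# resistor is from the datasheet; the external R/C values below are
# illustrative placeholders, not tuned recommendations.

R_INTERNAL = 1e6      # ohms, OPT101 internal feedback resistor
R_EXTERNAL = 4e6      # ohms, assumed external resistor added for extra gain
C_FEEDBACK = 10e-12   # farads, assumed external compensation capacitor

R_f = R_INTERNAL + R_EXTERNAL                 # total transimpedance gain, V/A
f_3db = 1 / (2 * math.pi * R_f * C_FEEDBACK)  # approximate -3 dB bandwidth

I_photo = 100e-9  # amps, example photocurrent from a dim return signal
V_out = I_photo * R_f

print(f"gain {R_f:.0e} V/A -> {V_out:.2f} V out, ~{f_3db:.0f} Hz bandwidth")
```

The usual trade-off applies: more gain means less bandwidth for a given feedback cap, which is fine here since hemodynamic signals are slow.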
Use a site like PCBWay for cheap flex PCBs. There is an FR4 stiffener on both sides. This board is meant to be hand-assembled; be careful not to burn the pads or the sensitive OPT101 pins (i.e. use low-temperature solder for testing). Pop the OPT101s in through the back (notch facing left when viewed from the front) so they sit flush with the stiffener, and solder from the back.
New (cheapo version):
New (includes transimpedance amps):
Old:
3D simulation with BabylonJS:
The simulation math needs help but we'll figure it out eventually.
Read about fMRI, then think of this as an fMRI that's dirt cheap in comparison. However, it only gets you the outer few centimeters of depth resolution, which can be enhanced with higher-power lighting, Gaussian windows, interferometry (the most interesting next logical step for this, a la OCT), and some other classic filtering tricks from RF and astronomy. So we can do legit mind reading, source localization for EEG, brain topology studies, and potentially real-time monitoring of problems like subdural hematoma or possible metabolic issues in the energy-hungry neocortex. Multimodal imaging promises to solve a lot of modeling problems with this technology, and there's probably a lot more to learn just in terms of how different physiology contains relevant computational information and what the interplay is. That's my very unqualified 2 cents :P
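Since the reconstruction code isn't shared yet, here's the generic math DOT/fNIRS systems typically start from: the modified Beer-Lambert law, which turns optical-density changes at two wavelengths into oxy/deoxyhemoglobin concentration changes. The extinction coefficients, separation distance, and differential pathlength factor below are approximate illustrative numbers, not calibrated values from this project.

```python
# A minimal modified Beer-Lambert law sketch: recover chromophore changes
# (HbO / HbR) from optical-density changes at two wavelengths. Extinction
# coefficients, distance, and DPF are approximate illustrative numbers.

# rows = wavelengths (~760 nm, ~850 nm), cols = (HbO, HbR), cm^-1 per mM
EPS = [[0.59, 1.67],
       [1.06, 0.78]]
DIST = 3.0  # source-detector separation, cm
DPF = 6.0   # assumed differential pathlength factor

def solve_hb(d_od):
    """Invert the 2x2 modified Beer-Lambert system for (dHbO, dHbR) in mM."""
    a = [[e * DIST * DPF for e in row] for row in EPS]
    det = a[0][0] * a[1][1] - a[0][1] * a[1][0]
    d_hbo = (d_od[0] * a[1][1] - d_od[1] * a[0][1]) / det
    d_hbr = (a[0][0] * d_od[1] - a[1][0] * d_od[0]) / det
    return d_hbo, d_hbr

# Round-trip check: forward-model a known change, then recover it.
true_hbo, true_hbr = 0.01, -0.005
d_od = [(EPS[i][0] * true_hbo + EPS[i][1] * true_hbr) * DIST * DPF
        for i in range(2)]
print(solve_hb(d_od))  # ~ (0.01, -0.005)
```

Full tomographic reconstruction replaces this per-channel 2x2 solve with a big regularized inverse over all source-detector pairs, but the channel-level physics is the same.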
MIT License: do whatever you want with it; I'm gonna do what I want with it too. And cite this repo if you make it useful! Get in touch @ brewster.joshua1@gmail.com. I'm working on a scalable version of this stuff, but it just takes forever between factory lead times and me not knowing what the hell I'm doing.