Closed monrky closed 1 year ago
Nice progress!
First, I just want to mention that this method for spherical treadmill motion detection is my least favorite aspect of the VR setup, so we're working on a much simpler setup with only slightly fewer capabilities. It won't be ready very soon, though, so using an Arduino Due with optical sensors is pretty much as good as it gets for now.
If you're getting "waiting for sensor B", then my guess would either be that sensor B is bad or just not wired correctly. You can test whether both sensors are working by swapping them and seeing if the code still doesn't connect to sensor B. But, from your pictures, it looks like sensor 2 is connected to SPI2, when maybe both sensors should be connected to SPI1? They're on different chip select lines so that would make sense to me, and I don't see anywhere in the code that SPI2 is used.
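To picture why both sensors can share SPI1, here's a rough off-board sketch of the chip-select idea. The pin numbers and the `digitalWrite` stub are made up for illustration, not taken from the actual controller sketch; on a real Due you'd use the Arduino `digitalWrite` and the real CS pin assignments:

```cpp
#include <cassert>

// Hypothetical chip-select pins -- placeholders, not the real assignments.
const int CS_SENSOR_A = 4;   // e.g. bottom sensor
const int CS_SENSOR_B = 10;  // e.g. back sensor

// Stand-in for Arduino's digitalWrite so this compiles off-board.
bool pinState[64];
void digitalWrite(int pin, bool high) { pinState[pin] = high; }

// Both sensors share one SPI bus (SCK/MISO/MOSI on SPI1); only the
// chip-select line differs. Selecting a sensor means pulling its CS low
// while the other stays high, so only one device drives MISO at a time.
void selectSensor(int csPin) {
    digitalWrite(CS_SENSOR_A, true);   // deselect both (CS is active-low)
    digitalWrite(CS_SENSOR_B, true);
    digitalWrite(csPin, false);        // this sensor now owns the bus
}
```

That's why wiring sensor 2 to SPI2 leaves it invisible: the code only ever talks on SPI1 and just toggles a different CS line per sensor.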
Ohhh I see, thank you! I've changed that wiring (both are now connected to SPI1) and it looks like the serial monitor is printing info from both sensors now. I've plugged the Arduino into the Raspberry Pi to see if it'll control the cursor movements, and both seem to, but rather unreliably -- sometimes it works, sometimes it doesn't, and most of the time it only works for a second and then stops. Do you have any information about the ADNS-3080 focal length or sensitivity that might make the movements smoother?
Yes, those sensors are finicky; they need to be looking at a good surface that's well illuminated and in focus. We used these lenses: https://www.edmundoptics.com/p/f19-8mm-focal-length-green-series-m12-mu-videotrade-imaging-lens/11586/
And this LED for illumination: https://www.digikey.com/en/products/detail/tt-electronics-optek-technology/OP290A/498682
We also added black dots with a marker all over our styrofoam treadmill to improve the tracking.
Do the black dots need to be evenly spaced? Or can they be randomly scattered around the styrofoam treadmill? (if you have a photo of your setup that would be tremendously helpful!)
Evenly spaced enough so you get good spots in any possible image of the sensors, but overall it's pretty random. Our ball looks like this:
Hi @misaacson01! I have definitely noticed that the sensors are finicky -- even in a well-illuminated room, they only pick up the ball movements when there's a super bright flashlight pointed right at the surface of the ball. I set up some of the infrared LEDs you linked in your earlier message, but I'm not sure if they're too dim, since they're not really making a difference for the sensors (they're definitely on -- I can see the faint glow on my phone camera). I'm not sure if it's an issue with the sensor being out of focus, either -- I also tried the lenses from Edmund Optics you linked earlier, and that wasn't great either (even with the bright flashlight). Please let me know if you have any suggestions!
I just uploaded some debugging code for the ADNS-3080 sensors. I haven't used the code personally, but at the very least it should be useful as a starting point to debug. The code lets you see images from the sensors, which makes it easier to set up the best lighting conditions, focus, etc. The person who initially built our setup had to use this more than once. One thing that's important here is to fix the elements (sensors, light sources) permanently in place -- not with a temporary solution that can drift over time, but in a way that's very stable. To use the code, you'll first have to upload the Arduino Due code, which I believe needs to be programmed for USB type Serial instead of XInput (which the VR system uses). Then you plug it into a PC running Python (I'm not sure which of the Due's USB ports to use; you can try both), and run mousecam_images.py after editing it for the COM port the Due is connected on. Let me know if you give this code a try and how it works for you!
That worked beautifully -- thank you so much! Seems like the original lenses that come with the ADNS-3080 sensors work best, with two LEDs pointed directly at the surface.
Also, the image window that pops up from the mousecam_images.py debugging code shows the surface upside-down. Is that on purpose?
Good to hear! As for the upside-down images: I don't think that's on purpose; the frames are probably just drawn in whatever order the data streams in. Worst-case scenario, you have to flip a sign somewhere in the controller code.
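If flipping a sign isn't enough and the whole frame needs reorienting, the operation is just a top-to-bottom row swap. A minimal sketch (assuming the 30x30 row-major grayscale frames the ADNS-3080 streams; the function name is mine, not from mousecam_images.py):

```cpp
#include <cstdint>
#include <algorithm>
#include <cassert>

// ADNS-3080 frames are 30x30 pixels, one byte per pixel, row-major.
const int FRAME_DIM = 30;

// Flip a row-major grayscale frame top-to-bottom in place. If the viewer
// draws rows in the opposite order that the sensor streams them, applying
// this (or reversing the row index when drawing) rights the image.
void flipVertical(uint8_t* frame, int w, int h) {
    for (int row = 0; row < h / 2; ++row) {
        uint8_t* top = frame + row * w;
        uint8_t* bot = frame + (h - 1 - row) * w;
        std::swap_ranges(top, top + w, bot);
    }
}
```

Since this only affects the debug viewer, not the motion data the controller reports, it's cosmetic either way.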
hi, so I've set up the lighting and the focal distance to be optimal through the debugging cam you sent. Now I'm running the game engine and it looks like the sensors are only picking up the ball's yaw movements (and translating it as forward/backward movement in the game engine), but the ball's pitch/roll data isn't being picked up or translated. We've played around with some of the movement parameters but it didn't seem to help; not sure if the sensors need to be rotated? I've attached a video to aid my description! I appreciate any help! :)
https://github.com/sn-lab/mouseVRheadset/assets/135660931/4fade1c3-bf9e-4274-b1ef-e9a2691d6792
You're testing with the openfield scene, I believe? On line 172 of openFieldScene.gd, you can set a variable to print to the screen for debugging. I would start there to understand what Godot is reading from the Due, e.g. `fpslabel.text = str(head_yaw)`, `fpslabel.text = str(head_slip)`, and `fpslabel.text = str(head_thrust)`. That should give clues to what the problem is. Maybe you're right that one or both of the sensors are rotated in unexpected ways; actually, if you haven't changed the Due code at all, there has to be a problem there, since I believe that code expects one sensor below the ball and one on the back. If that's the case, it could be an easy fix in the Due code.
In mouseVRheadset_controller_V4.ino, lines 244-246:

```cpp
yawFloat = float(v_bottom.x) * bottom_scale;
pitchFloat = float(v_top.x) * back_scale;
rollFloat = 0.5 * (float(v_bottom.y) * bottom_scale + float(v_top.y) * back_scale);
```
These top and bottom x/y values should be rearranged. In your setup, yaw needs an average of the horizontal movement (could be either x or y depending on the sensor rotation) of both cameras, similar to how roll is currently calculated. Pitch only needs the vertical movement of the back camera, and slip only needs the vertical of the side camera.
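To make the rearrangement concrete, here's a hedged sketch of what the remapped calculation might look like for a back + side sensor layout. The function and variable names are placeholders of mine (mirroring `bottom_scale`/`back_scale` from the original sketch), and which raw axis counts as "horizontal" depends on how each sensor is physically rotated in its mount:

```cpp
#include <cassert>

// Hypothetical per-sensor calibration scales (placeholders).
const float back_scale = 1.0f;
const float side_scale = 1.0f;

struct BallMotion { float yaw, pitch, slip; };

// Map raw sensor deltas to ball rotations for a back + side sensor layout.
// hBack/vBack: horizontal/vertical image motion seen by the back sensor;
// hSide/vSide: the same for the side sensor. Whether "horizontal" is the
// sensor's x or y delta depends on each sensor's rotation.
BallMotion mapMotion(int hBack, int vBack, int hSide, int vSide) {
    BallMotion m;
    // Yaw: both sensors see the ball spin as horizontal image motion,
    // so average them (like the original roll calculation does).
    m.yaw   = 0.5f * (float(hBack) * back_scale + float(hSide) * side_scale);
    // Pitch: only the back sensor sees pitch, as vertical motion.
    m.pitch = float(vBack) * back_scale;
    // Slip: only the side sensor sees sideways slip, as vertical motion.
    m.slip  = float(vSide) * side_scale;
    return m;
}
```

The point is just the structure: yaw averages both cameras' horizontal components, while pitch and slip each come from a single camera's vertical component.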
I see, thank you!!
Hello! I've gotten the dual-display + raspberry pi "headset" part put together, and now am working on getting it to take input from the ADNS-3080 sensor via the Arduino Due microcontroller. I've programmed the Arduino with the sketch you uploaded (
Hardware/mouseVRheadset_controller_V4/mouseVRheadset_controller_V4.ino
), and figured out the proper connections by searching online, but am still encountering issues getting the sensors to work properly. I added some lines in the sketch you uploaded for debugging purposes (I've attached the modified version here -- apologies for the format -- GitHub won't let me upload an .ino file) -- with my setup, the serial monitor only gets up to "Waiting for sensor B" (line 221). I've also attached a photo of my Arduino & connections. I'd greatly appreciate any assistance!
I also tried to connect the single working sensor (A) with the Arduino, program it, and send movements to the Raspberry Pi, which also didn't work -- this seems to be a separate issue :(
arduino-test-debug.docx