Zulban closed this issue 8 years ago.
I'm going to start on this today. I don't see an obvious way to find the fish closest to a ray in the BoundsOctree, so I'll probably do something naive like looping over all the fish. Hopefully this will perform well enough for a small school and let us make progress on what happens after a fish is selected.
You definitely don't want to do that! It will not run on a phone. There's no need for a slow octree, loops, or ray casting.
What you'll want to do is attach a VisionTracker script to the game object you want to keep track of. I am going to make a video, likely on Monday, with exact instructions on how to use VisionTracker. In the meantime you can check out the "vision track1" scene.
Beyond that, you'll write another component that attaches to any GameObject with a VisionTracker. It can poll the values in VisionTracker periodically and run some method when they cross a threshold (I suggest using VisionTracker.GetHistoryScore).
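Something along these lines, for example (a rough sketch only: the class name, threshold, and polling interval are placeholders, and I'm assuming GetHistoryScore() returns a float; adjust to the real signature):

```csharp
using System.Collections;
using UnityEngine;

// Sketch of the second component: sits next to a VisionTracker and reacts
// when the tracked gaze score crosses a threshold.
[RequireComponent(typeof(VisionTracker))]
public class GazeSelectionWatcher : MonoBehaviour
{
    public float scoreThreshold = 0.9f;   // placeholder value
    public float checkInterval = 0.25f;   // seconds between polls

    private VisionTracker tracker;

    void Start()
    {
        tracker = GetComponent<VisionTracker>();
        StartCoroutine(WatchScore());
    }

    IEnumerator WatchScore()
    {
        while (true)
        {
            // Assumption: GetHistoryScore() returns a float score.
            if (tracker.GetHistoryScore() >= scoreThreshold)
            {
                OnSelected();
                yield break;   // stop watching once triggered
            }
            yield return new WaitForSeconds(checkInterval);
        }
    }

    void OnSelected()
    {
        // Replace with whatever should happen when the fish has been
        // looked at long enough (burst, sparkle, award a point, ...).
        Debug.Log(name + " selected by gaze");
    }
}
```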
My parents are coming to Montreal this weekend so I'll only be back Monday. Hope that helps!
https://github.com/osmosacademy/vr-campaign/wiki/help_programming
Is the intent to attach a VisionTracker to each fish in the school, or only to the subset of fish that can be collected? Each VisionTracker would periodically update its LookAt score. Then something else would periodically iterate over the objects and find the highest LookAt scores? Or alternatively, each VisionTracker would call a method on some central object from its Cycle method to notify it of score updates?
We're currently wondering how to integrate Cardboard's gaze support with the VisionTracker's LookAt scores. At first glance, the gaze support seems to be based on a synchronous mechanism: a ray cast that finds objects using colliders. We're still looking at how to hook in lazily updated scores.
I don't have experience with the Cardboard gaze support. The fish do not have colliders and never will, though, so that could be a problem.
You could do the iteration approach. However, the architecture I recommend is two components attached to each object of interest: VisionTracker and GeoffScriptOrWhatever. GeoffScriptOrWhatever periodically checks the score from its VisionTracker and executes something when it crosses a threshold (for just that one VisionTracker).
It looks like the GazeInput from Cardboard is working with the school of fish, so we can now target them using the school-of-fish scripting and the reticle from Cardboard. It seems pretty precise and was quick to implement in this development pass.
It looks like the ray cast that Cardboard's GazeInputModule performs is finding the fish because of the MeshColliders on the left and right of the "test fish" prefab. This side-steps the VisionTracker script completely, though. I don't know about the performance implications yet. Still trying to get set up to test things on my phone...
I would try the VisionTracker component on each fish and avoid colliders altogether. You can just have the VisionTracker run a coroutine that checks the dot product between the direction to the fish and the camera's forward vector (as Stuart mentioned last time we met). The Cardboard SDK gaze input requires colliders, by the way.
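Roughly, that check could look like this (a sketch only, not the actual VisionTracker code; the class name, threshold, interval, and the way the score accumulates are made-up placeholders):

```csharp
using System.Collections;
using UnityEngine;

// Collider-free gaze scoring: compare the direction from the camera to this
// object against the camera's forward vector, sampled from a coroutine.
public class PointGazeScore : MonoBehaviour
{
    public float checkInterval = 0.1f;     // seconds between samples
    public float lookAtThreshold = 0.98f;  // cosine of roughly 11 degrees

    public float score;                    // accumulates while being looked at

    void Start()
    {
        StartCoroutine(SampleGaze());
    }

    IEnumerator SampleGaze()
    {
        var cam = Camera.main.transform;
        while (true)
        {
            Vector3 toFish = (transform.position - cam.position).normalized;
            float alignment = Vector3.Dot(cam.forward, toFish); // 1 = dead centre

            if (alignment >= lookAtThreshold)
                score += checkInterval;    // being looked at: accumulate time
            else
                score = 0f;                // looked away: reset

            yield return new WaitForSeconds(checkInterval);
        }
    }
}
```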
Yes. Colliders of any type on the schools of fish will destroy performance. We cannot give individual fish mesh colliders, or even box colliders. We need to track vision as though each fish were a point. Hence VisionTracker.
I'll work on some stuff tonight, and possibly make a video showing how to use my scripts.
Oh! Good catch, Geoff. Yes, there were colliders on the fish. I removed them, and performance is now significantly better (when the camera is moving; Unity is odd).
Done.
It may be best that I do this, since I wrote the fish school stuff. I should find time Monday and Tuesday. The idea is to have a school of 200 fish, of which ten or so are golden fishies. Look at one of them long enough and it bursts, or sparkles, or something, and you get a point.