felipesce opened 2 years ago
Lowering `MAX_FLOW` seems to have worked and the images are now recognized; I'm not yet sure whether it has any other side effects.
My real problem seems to be that the scenes have many parts that look very similar, so varying the threshold either lets very few images pass or admits too many false positives (images with no co-visibility are treated as co-visible, which leads to poor training).
I'm at a loss as to how to tweak the parameters to reduce this problem, so I'll try to construct a hand-made co-visibility graph to test whether this is really the issue.
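For reference, the kind of hand-made check I have in mind looks roughly like this. It's only a numpy sketch: it assumes camera-to-world 4x4 poses, metric depth, and a shared pinhole intrinsics matrix `K`, none of which comes from the repo itself.

```python
import numpy as np

def covisibility(depth_i, pose_i, pose_j, K):
    """Fraction of pixels in frame i that reproject inside frame j.

    Assumes 4x4 camera-to-world poses, metric depth and shared pinhole
    intrinsics (my conventions, not the repo's).
    """
    h, w = depth_i.shape
    fx, fy, cx, cy = K[0, 0], K[1, 1], K[0, 2], K[1, 2]

    # back-project every pixel of frame i into camera-i coordinates
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_i.reshape(-1)
    x = (u.reshape(-1) - cx) / fx * z
    y = (v.reshape(-1) - cy) / fy * z
    pts_i = np.stack([x, y, z, np.ones_like(z)])          # (4, N)

    # move the points into camera j: cam_j <- world <- cam_i
    T_ji = np.linalg.inv(pose_j) @ pose_i
    pts_j = T_ji @ pts_i
    zj = pts_j[2]

    # project into frame j and count points that land inside the image
    uj = fx * pts_j[0] / np.maximum(zj, 1e-6) + cx
    vj = fy * pts_j[1] / np.maximum(zj, 1e-6) + cy
    visible = (z > 0) & (zj > 0) & (uj >= 0) & (uj < w) & (vj >= 0) & (vj < h)
    return visible.mean()
```

An edge (i, j) would then go into the hand-made graph whenever the score is above some fraction in both directions, which should make it easy to compare against what `build_frame_graph` produces.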
Thanks again to anyone who wants to chime in.
Greetings,
I'm having a very strange error and reckon I probably have to adjust some threshold, but I haven't had much luck figuring out what to change and where.
I'm training with 132 scenes of 100 images/depths/poses each, for a total of close to 13,000 sets of 7 frames.
While building the dataset with the function `_build_dataset`, it does find the 100 images, depths and poses for each folder, but at the end of dataset creation the function `dataset_factory` in `factory.py` reports that only 10-250 images were found.

By the definition of `__len__(db)` in `base.py`, this means that `_build_dataset_index` is not adding every image to the frame graph, which is confusing since none of these scenes are designated as test scenes.

Any ideas on how to debug this issue?
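For reference, a check along these lines is what I had in mind for tracking down where the frames disappear. It assumes `db` is the dataset returned by `dataset_factory` and that it exposes `scene_info` (with `'images'` and `'graph'` entries per scene) and a flat `dataset_index`, as in my checkout; the snippet itself is mine, not from the repo, so the names may need adjusting.

```python
# Per scene: frames read from disk vs. graph nodes vs. entries in the index.
from collections import Counter

per_scene = Counter(scene for scene, _ in db.dataset_index)
for scene, info in db.scene_info.items():
    n_images = len(info['images'])
    n_graph = len(info['graph'])
    print(f"{scene}: {n_images} images, {n_graph} graph nodes, "
          f"{per_scene[scene]} entries in dataset_index")
```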
Some ideas I've had:

- Tweaking `fmin` & `fmax` hasn't made much difference.
- `build_frame_graph`: I have no idea what this does, or how to know the correct value.
- `MAX_FLOW = 100.0` or `s = 2048` or `val.mean(-1) < 0.7` in `compute_distance_matrix_flow` in `rgbd_utils.py`: same as above (see the sketch after this list).
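On that last point, rather than guessing thresholds blindly, it might help to look at the distribution of the flow distances directly. A minimal sketch, assuming `d` is the NxN matrix produced by `compute_distance_matrix_flow` for a single scene (the variable name and the candidate `fmax` values are just placeholders):

```python
import numpy as np

d = np.asarray(d)                                  # (N, N) mean-flow distances
off_diag = d[~np.eye(len(d), dtype=bool)]
print("flow-distance percentiles:",
      np.percentile(off_diag, [5, 25, 50, 75, 95]))

# how many pairs each candidate threshold would accept as co-visible
for fmax in [16, 32, 64, 96]:
    frac = (off_diag < fmax).mean()
    print(f"fmax={fmax}: {frac:.1%} of frame pairs pass")
```

If truly co-visible and non-co-visible pairs overlap in this distribution, no single `fmin`/`fmax` will separate them, which would also confirm the similar-looking-scenes theory.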
Any and all comments or insights would be deeply appreciated. Have a nice day.