Open IddoBotzer opened 5 years ago
Hi, same question!
Thank you
Hi, the notebook demos are there to give a detailed example of what the input and output structures look like. For example, as per the readme, you can run
python run_lfnet.py --in_dir=images --out_dir=outputs
which is actually how we use this codebase these days.
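For anyone wondering what `run_lfnet.py` actually writes: each image produces a `.npz` file in `--out_dir`, which you can load with numpy. A minimal sketch, assuming the key names `'kpts'` and `'descs'` used by the demo notebook (verify against your own output files with `list(np.load(path).keys())`):

```python
import numpy as np

def load_lfnet_outputs(npz_path):
    """Load keypoints and descriptors saved by run_lfnet.py.

    The key names ('kpts', 'descs') follow the demo notebook; if your
    files differ, inspect them with list(np.load(npz_path).keys()).
    """
    outs = np.load(npz_path)
    kpts = outs['kpts']    # (N, 2) array of x, y keypoint coordinates
    descs = outs['descs']  # (N, D) array of descriptors
    return kpts, descs

# Example usage (path is hypothetical):
#   kpts, descs = load_lfnet_outputs('outputs/1_1.jpg.npz')
```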
Hi, how is it possible to see the matching on my own data (after running the code above)? Thanks
In LIFT you provide a function to convert keypoints into OpenCV cv2.KeyPoint objects. Is there any support for that here?
I have the same issue, after running the code above how to show the correspondence between the images?
Is there any possibility to solve this?
In LIFT you provide a function to convert keypoints into OpenCV cv2.KeyPoint objects. Is there any support for that here?
pts = outs['kpts']
pts = [cv2.KeyPoint(float(pts[i][0]), float(pts[i][1]), 1) for i in range(pts.shape[0])]
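Since several people are asking how to show correspondences between two images without the notebook, here is a sketch of mutual nearest-neighbor matching on the saved descriptors. This is an illustration, not code from the repo: it assumes you have already loaded `(N, 2)` keypoint arrays and `(N, D)` descriptor arrays for both images; the matching itself needs only numpy, and the OpenCV drawing step is shown in comments.

```python
import numpy as np

def mutual_nn_matches(descs1, descs2):
    """Mutual nearest-neighbor matching on L2 distance.

    Returns an (M, 2) array of index pairs (i in image 1, j in image 2)
    that are each other's nearest neighbors.
    """
    # Pairwise squared L2 distances between the two descriptor sets.
    d = (np.sum(descs1 ** 2, axis=1)[:, None]
         + np.sum(descs2 ** 2, axis=1)[None, :]
         - 2.0 * descs1 @ descs2.T)
    nn12 = np.argmin(d, axis=1)   # best match in image 2 for each i
    nn21 = np.argmin(d, axis=0)   # best match in image 1 for each j
    mutual = nn21[nn12] == np.arange(len(descs1))
    return np.stack([np.nonzero(mutual)[0], nn12[mutual]], axis=1)

# To visualize the result with OpenCV (img1/img2 are your input images):
#   kp1 = [cv2.KeyPoint(float(x), float(y), 1) for x, y in kpts1]
#   kp2 = [cv2.KeyPoint(float(x), float(y), 1) for x, y in kpts2]
#   dm  = [cv2.DMatch(int(i), int(j), 0) for i, j in matches]
#   vis = cv2.drawMatches(img1, kp1, img2, kp2, dm, None)
```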
Hi, I would like to compute the homography between two images from the keypoints returned by LF-Net (for example with OpenCV's findHomography). So far I have been able to extract the detected keypoints, extract the keypoint patches, and get the kpts2_corr list. How do I filter out unwanted correspondences so that only the best matches remain?
Thank you.
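One common way to prune bad correspondences before estimating a homography is Lowe's ratio test: keep a match only when its nearest neighbor is clearly better than the second nearest, then let RANSAC inside findHomography reject the remaining outliers. A sketch under those assumptions (the filtering is plain numpy; the OpenCV call is shown in comments):

```python
import numpy as np

def ratio_test_matches(descs1, descs2, ratio=0.8):
    """Lowe's ratio test on L2 distances.

    Keeps a match only when the nearest neighbor is clearly better than
    the second nearest. Returns an (M, 2) array of index pairs.
    """
    d = (np.sum(descs1 ** 2, axis=1)[:, None]
         + np.sum(descs2 ** 2, axis=1)[None, :]
         - 2.0 * descs1 @ descs2.T)
    order = np.argsort(d, axis=1)
    best, second = order[:, 0], order[:, 1]
    rows = np.arange(len(descs1))
    # Clamp tiny negative values from floating-point error before sqrt.
    d_best = np.sqrt(np.maximum(d[rows, best], 0.0))
    d_second = np.sqrt(np.maximum(d[rows, second], 0.0))
    keep = d_best < ratio * d_second
    return np.stack([rows[keep], best[keep]], axis=1)

# With the surviving matches, estimate the homography with RANSAC so
# remaining outliers are rejected as well:
#   src = kpts1[matches[:, 0]].astype(np.float32)
#   dst = kpts2[matches[:, 1]].astype(np.float32)
#   H, mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
```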
Hi, I was able to run the net test and extract features from my images, now I'm looking to see the matching. Is the only way to do so is to run the demo with the notebook?
If so, what do I need to change in order to run it on my own data? (I have depth images as well)
Thank you
Hi, I have the same question. Did you manage to solve it? I would also like to run the matching without the Jupyter demo, or run it on my own data. Thank you!
@IddoBotzer Hi, I tried to run the feature-extraction demo. My command is written below: python run_lfnet.py --in_dir=1_1.jpg --out_dir=output but an error occurred: "ImportError: libcublas.so.8.0: cannot open shared object file: No such file or directory". Could you give me an example of your command? And one more question: have you found any way to run the matching demo "demo.ipynb" on your own data?
Looking forward to your reply. Thank you.
Hi. The error just means that cuBLAS is not installed properly. This is an environment configuration issue.