liiad opened this issue 5 months ago
It could be an issue with the point ops library in my repository. Could you replace it with the one described in the bottom section of the README?
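As a quick sanity check after swapping the library, you can confirm the compiled extension actually imports; the module name `pointops` below is an assumption, so use whichever name the library from the README installs under:

```python
# Hedged smoke test: does the compiled point ops CUDA extension load at all?
# "pointops" is an assumed module name, not confirmed from this repository.
import torch

try:
    import pointops  # hypothetical name for the compiled extension
    print("point ops extension imported OK")
except ImportError as err:
    print("point ops extension not built/installed:", err)

print("CUDA available:", torch.cuda.is_available())
```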
I replaced the point ops library as described at the bottom of the README, but it still doesn't work properly on an RTX 4090. Do you have any other advice?
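One thing worth ruling out on an RTX 4090 (compute capability 8.9, Ada) is a point ops build or PyTorch wheel that only targets older GPU architectures; a minimal check using standard PyTorch calls:

```python
# Report what this PyTorch build and GPU support. On an RTX 4090 the
# capability is (8, 9); the arch list should include sm_89, or at least
# a compute_8x PTX entry that can be JIT-compiled for newer cards.
import torch

print("torch:", torch.__version__, "| CUDA toolkit:", torch.version.cuda)
print("device:", torch.cuda.get_device_name(0))
print("capability:", torch.cuda.get_device_capability(0))
print("built for:", torch.cuda.get_arch_list())
```

If the reported arch list lacks an entry usable by capability 8.9, rebuilding the point ops extension (and possibly PyTorch) against a newer CUDA toolkit would be the next thing to try.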
Did you check whether the input point cloud is loaded correctly?
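A minimal sketch of that check, assuming trimesh is available (the repository may load meshes differently internally); the path is the scan from the log below:

```python
# Verify the .obj scan actually yields a non-empty point cloud.
import trimesh

mesh = trimesh.load(
    r"D:\Tooth\3D_scans_per_patient_obj_files_b2\00OMSZGW\00OMSZGW_lower.obj",
    process=False,  # keep vertices as stored, no merging/cleanup
)
print("vertices:", mesh.vertices.shape)  # expect (N, 3) with N > 0
```

An empty or near-empty vertex array here would explain the DBSCAN error, since there would be nothing left to cluster.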
I tried to run inference_mid.py --input_path D:\Tooth\3D_scans_per_patient_obj_files_b2 --save_path D:\Tooth\S and got the errors below. How can I solve this?

"C:\Program Files\Python39\python.exe" D:\Tooth\3DTeethSeg_MICCAI\ToothGroupNetwork-challenge_branch\inference_mid.py --input_path D:\Tooth\3D_scans_per_patient_obj_files_b2 --save_path D:\Tooth\S
The Zen of Python, by Tim Peters

Beautiful is better than ugly.
Explicit is better than implicit.
Simple is better than complex.
Complex is better than complicated.
Flat is better than nested.
Sparse is better than dense.
Readability counts.
Special cases aren't special enough to break the rules.
Although practicality beats purity.
Errors should never pass silently.
Unless explicitly silenced.
In the face of ambiguity, refuse the temptation to guess.
There should be one-- and preferably only one --obvious way to do it.
Although that way may not be obvious at first unless you're Dutch.
Now is better than never.
Although never is often better than right now.
If the implementation is hard to explain, it's a bad idea.
If the implementation is easy to explain, it may be a good idea.
Namespaces are one honking great idea -- let's do more of those!

Processing: 0 : D:\Tooth\3D_scans_per_patient_obj_files_b2\00OMSZGW\00OMSZGW_lower.obj
Found array with 0 sample(s) (shape=(0, 3)) while a minimum of 1 is required by DBSCAN.
Traceback (most recent call last):
File "D:\Tooth\3DTeethSeg_MICCAI\ToothGroupNetwork-challenge_branch\inference_mid.py", line 65, in <module>
pred_obj.process(stl_path_ls[i], os.path.join(args.save_path, os.path.basename(stl_path_ls[i]).replace(".obj", ".json")))
File "D:\Tooth\3DTeethSeg_MICCAI\ToothGroupNetwork-challenge_branch\predict_utils.py", line 140, in process
labels, instances, jaw = self.predict([input_path])
File "D:\Tooth\3DTeethSeg_MICCAI\ToothGroupNetwork-challenge_branch\predict_utils.py", line 101, in predict
pred_result = self.chl_pipeline(scan_path)
File "D:\Tooth\3DTeethSeg_MICCAI\ToothGroupNetwork-challenge_branch\inference_pipeline_mid.py", line 52, in call
first_results = self.get_first_module_results(input_cuda_feats, self.first_module)
File "D:\Tooth\3DTeethSeg_MICCAI\ToothGroupNetwork-challenge_branch\inference_pipeline_mid.py", line 176, in get_first_module_results
output = base_model([points])
File "C:\Users\User\AppData\Roaming\Python\Python39\site-packages\torch\nn\modules\module.py", line 727, in _call_impl
result = self.forward(*input, **kwargs)
File "D:\Tooth\3DTeethSeg_MICCAI\ToothGroupNetwork-challenge_branch\models\modules\grouping_network_module.py", line 65, in forward
fg_points_labels_ls = tu.get_clustering_labels(b_moved_points, whole_cls_1)
File "D:\Tooth\3DTeethSeg_MICCAI\ToothGroupNetwork-challenge_branch\tsg_utils.py", line 88, in get_clustering_labels
clustering = DBSCAN(eps=0.03, min_samples=60).fit(moved_points[super_point_cond, :], 3)
File "C:\Program Files\Python39\lib\site-packages\sklearn\base.py", line 1474, in wrapper
return fit_method(estimator, *args, **kwargs)
File "C:\Program Files\Python39\lib\site-packages\sklearn\cluster_dbscan.py", line 393, in fit
X = self._validate_data(X, accept_sparse="csr")
File "C:\Program Files\Python39\lib\site-packages\sklearn\base.py", line 633, in _validate_data
out = check_array(X, input_name="X", **check_params)
File "C:\Program Files\Python39\lib\site-packages\sklearn\utils\validation.py", line 1072, in check_array
raise ValueError(
ValueError: Found array with 0 sample(s) (shape=(0, 3)) while a minimum of 1 is required by DBSCAN.
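For context: the crash comes from tsg_utils.py line 88, where moved_points[super_point_cond, :] selects zero points, i.e. the first module predicted no foreground (tooth) points, and scikit-learn's DBSCAN rejects an empty array. A minimal defensive sketch of that call site (names taken from the traceback; the surrounding logic is assumed):

```python
import numpy as np
from sklearn.cluster import DBSCAN

def get_clustering_labels(moved_points: np.ndarray, super_point_cond: np.ndarray) -> np.ndarray:
    # Keep only the points predicted as foreground (tooth) points.
    fg_points = moved_points[super_point_cond, :]
    # DBSCAN raises ValueError on an empty array, exactly as in the traceback.
    # An empty selection usually means the upstream network produced no
    # foreground predictions -- often a symptom of a broken point ops build
    # or a wrongly loaded/scaled input scan rather than a clustering problem.
    if fg_points.shape[0] == 0:
        raise RuntimeError(
            "No foreground points to cluster; check the input scan and the "
            "point ops library before reaching DBSCAN."
        )
    clustering = DBSCAN(eps=0.03, min_samples=60).fit(fg_points)
    return clustering.labels_
```

A guard like this does not fix the root cause, but it turns the opaque sklearn error into a message that points at the real failure upstream.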