Fsoft-AIC / Language-Conditioned-Affordance-Pose-Detection-in-3D-Point-Clouds

[ICRA 2024] Language-Conditioned Affordance-Pose Detection in 3D Point Clouds
https://3dapnet.github.io/

About #4

Open · hanyueling opened this issue 1 month ago

hanyueling commented 1 month ago

Hello, thank you very much for sharing. Could you share how the evaluation metrics in this article are obtained?

toannguyen1904 commented 1 month ago

Hi @hanyueling, the evaluation code can be found in utils/eval.py.

toolren commented 1 month ago

Hello, can you tell me about your experimental hardware environment configuration, including your operating system version?

hanyueling commented 1 month ago

> Hi @hanyueling, the evaluation code can be found in utils/eval.py.

I tested and evaluated the metrics on the detection result file result.pkl using the functions in eval.py. The results I obtained are:

mIoU: 0.7542786070242297
Acc: 0.917440785114841
mAcc: 0.8877087975117793
mESM: 0.45479424029757287
mCR: 0.0005653710247349824

I don't know how to solve this problem. The code I wrote is as follows:

```python
import pickle

import numpy as np
from scipy.spatial.transform import Rotation as R

from utils.eval import affordance_eval, pose_eval

# Load the detection results produced by detect.py.
result_path = 'log/detectiondiffusion/result.pkl'
with open(result_path, 'rb') as f:
    result = pickle.load(f)

# Collect all affordance labels that appear in the results.
affordance_list = list({affordance for shape in result for affordance in shape['affordance']})

# Affordance detection metrics.
mIoU, Acc, mAcc = affordance_eval(affordance_list, result)
print(f"mIoU: {mIoU}")
print(f"Acc: {Acc}")
print(f"mAcc: {mAcc}")

# Ground-truth and predicted poses, grouped per affordance for each shape.
gt_poses = [{affordance: shape['pose'][affordance] for affordance in shape['affordance']}
            for shape in result]
pred_poses = [{affordance: shape['result'][affordance][1] for affordance in shape['affordance']}
              for shape in result]

def ensure_2d(poses):
    return {aff: np.atleast_2d(poses[aff]) for aff in poses}

gt_poses_2d = [ensure_2d(poses) for poses in gt_poses]
pred_poses_2d = [ensure_2d(poses) for poses in pred_poses]

def transform_pred_poses(pred_poses):
    # Convert (2000, 7) [position, quaternion] predictions into
    # (2000, 12) [flattened rotation matrix, position] vectors.
    transformed_poses = {}
    for affordance, pose in pred_poses.items():
        if pose.shape == (2000, 7):
            positions = pose[:, :3]
            quaternions = pose[:, 3:]
            rotations = R.from_quat(quaternions).as_matrix()
            flattened_rotations = rotations.reshape(2000, 9)
            transformed_poses[affordance] = np.hstack((flattened_rotations, positions))
        else:
            transformed_poses[affordance] = pose
    return transformed_poses

def transform_gt_poses(gt_poses):
    # Convert (50, 4, 4) homogeneous transforms into
    # (50, 12) [flattened rotation matrix, translation] vectors.
    transformed_poses = {}
    for affordance, pose in gt_poses.items():
        if pose.shape == (50, 4, 4):
            rotations = pose[:, :3, :3].reshape(50, 9)
            translations = pose[:, :3, 3]
            transformed_poses[affordance] = np.hstack((rotations, translations))
        else:
            transformed_poses[affordance] = pose
    return transformed_poses

gt_poses_transformed = [transform_gt_poses(poses) for poses in gt_poses_2d]
pred_poses_transformed = [transform_pred_poses(poses) for poses in pred_poses_2d]

# Pose generation metrics.
mean_min_distance, mean_rate = pose_eval(gt_poses_transformed, pred_poses_transformed)
print(f"mESM: {mean_min_distance}")
print(f"mCR: {mean_rate}")
```

ZhenningZhou commented 3 weeks ago

@toannguyen1904 @hanyueling I have two questions that I would like your help with. Thank you so much!

  1. When running python3 detect.py --config <config file> --checkpoint <checkpoint file> --test_data <test data in the 3DAP dataset> for testing, how should I prepare the configuration file?
  2. Is the test data the entire 3DAP dataset, or do I need to manually crop out the objects I want to test? If so, how do I crop them? Thanks so much! Looking forward to your reply.

oliver65432 commented 4 days ago

> @toannguyen1904 @hanyueling I have two questions that I would like your help with. Thank you so much!
>
>   1. When running python3 detect.py --config <config file> --checkpoint <checkpoint file> --test_data <test data in the 3DAP dataset> for testing, how should I prepare the configuration file?
>   2. Is the test data the entire 3DAP dataset, or do I need to manually crop out the objects I want to test? If so, how do I crop them? Thanks so much! Looking forward to your reply.

Hello, have these problems been solved yet? I ran into the same issue; it seems the content of the repo is incomplete.