tangyuhao2016 / CTRG


Request for Detailed Metadata Information for 2D JPG to 3D CT Reconstruction #3

Open xiweideng opened 5 months ago

xiweideng commented 5 months ago

Hello author,

I am currently working with the CTRG chest and brain datasets, specifically attempting to reconstruct 3D CT images from the 2D JPG slices provided in the dataset. Achieving an accurate reconstruction requires precise information on several key metadata parameters, which I was unable to find in the dataset documentation or accompanying materials.

To ensure the fidelity of the reconstructed 3D images to their original scans, could you please provide further details on the following metadata for each slice in the CTRG datasets?

  1. Pixel Spacing: The physical distance (in millimeters) between the centers of adjacent pixels. It's essential for accurately scaling the 2D slices in the reconstructed 3D space.

  2. Slice Thickness: The thickness (in millimeters) of each CT slice. This information is crucial for correctly spacing the slices in the Z-dimension during 3D reconstruction.

  3. Slice Spacing: If applicable, the distance (in millimeters) between adjacent slices. Understanding the gap between slices is vital for an accurate 3D model, especially if slice spacing differs from slice thickness.

  4. Image Orientation and Position Information: Details on how each slice is oriented and positioned within the body. This includes the Image Orientation (Patient) (0020,0037) and Image Position (Patient) (0020,0032) DICOM attributes or equivalent metadata. These attributes are critical for assembling the slices in the correct anatomical order and orientation.

Including these metadata attributes would greatly enhance the utility and accuracy of the datasets for advanced imaging research, such as 3D reconstruction, volumetric analysis, and other computational radiology applications.
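For context, below is a rough sketch of how I intend to use these values once they are available. The file paths, spacing numbers, and slice ordering are placeholders I chose for illustration, and the identity direction matrix stands in for the Image Orientation/Position information requested above:

```python
# Minimal sketch: stack the 2D JPG slices into a 3D volume using the
# requested metadata. All paths and metadata values here are assumptions.
import glob

import numpy as np
import SimpleITK as sitk
from PIL import Image

# Assumed per-slice metadata (these are the values requested above).
pixel_spacing = (0.7, 0.7)   # mm, in-plane (row, column) spacing
slice_spacing = 1.25         # mm, distance between adjacent slice centers

# Load slices in anatomical order (assumed here to match filename order).
slice_files = sorted(glob.glob("CTRG_chest/case_0001/*.jpg"))
slices = [np.asarray(Image.open(f).convert("L"), dtype=np.float32)
          for f in slice_files]

# Stack into a (z, y, x) array and wrap it as a SimpleITK image.
volume = sitk.GetImageFromArray(np.stack(slices, axis=0))

# SimpleITK spacing is (x, y, z). Origin and direction would ideally come
# from Image Position (Patient) and Image Orientation (Patient); the
# identity direction below is only a placeholder until that metadata exists.
volume.SetSpacing((pixel_spacing[1], pixel_spacing[0], slice_spacing))
volume.SetOrigin((0.0, 0.0, 0.0))
volume.SetDirection((1, 0, 0, 0, 1, 0, 0, 0, 1))

sitk.WriteImage(volume, "case_0001.nii.gz")
```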

Thank you for your support and for providing these invaluable resources to the medical imaging research community. I look forward to your response.

Best regards, Xiwei Deng