allenai / Holodeck

CVPR 2024: Language Guided Generation of 3D Embodied AI Environments.
https://yueyang1996.github.io/holodeck
Apache License 2.0
304 stars 25 forks

Problems about using objaverse #14

Closed a962097364 closed 4 months ago

a962097364 commented 6 months ago

Hello! Thank you for your nice work!

I want to ask how the code gets assets from Objaverse.

I downloaded the dataset, but there are no 3D model files in data\objaverse_holodeck\09_23_combine_scale\processed_2023_09_23_combine_scale.

How does this project get 3D models from Objaverse? By following the URLs in attribution.txt and downloading them, or did I miss some important information?

Thank you again for your help.

YueYANG1996 commented 6 months ago

The objects have been processed into THOR format, which contains a .pkl.gz file (vertex information) and .jpg files for the normal, albedo, and emission maps. To convert our THOR format into .obj, please refer to https://github.com/allenai/Holodeck/issues/9.
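For illustration, the conversion referenced in #9 boils down to reading the gzipped pickle and writing a Wavefront .obj. This is a minimal sketch, assuming the pickled dict stores `vertices` as a list of `{'x','y','z'}` dicts and `triangles` as a flat index list (verify these keys against your own files):

```python
import gzip
import pickle

def thor_pkl_to_obj(pkl_path, obj_path):
    """Convert a THOR-format .pkl.gz asset into a Wavefront .obj file.

    Assumed layout (check your data): asset['vertices'] is a list of
    {'x','y','z'} dicts and asset['triangles'] is a flat list of indices.
    """
    with gzip.open(pkl_path, "rb") as f:
        asset = pickle.load(f)

    with open(obj_path, "w") as f:
        for v in asset["vertices"]:
            f.write(f"v {v['x']} {v['y']} {v['z']}\n")
        tris = asset["triangles"]
        for i in range(0, len(tris), 3):
            # .obj face indices are 1-based
            f.write(f"f {tris[i] + 1} {tris[i + 1] + 1} {tris[i + 2] + 1}\n")
```

This drops normals and UVs for brevity; the full script in #9 also carries those over.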

a962097364 commented 6 months ago

> The objects have been processed into THOR format, which contains a .pkl.gz file (vertices information) and .jpg files of normal, albedo, and emission. To convert our THOR format into .obj, please refer to this #9.

Thank you for your help, it is very useful.

However, the model converted to .obj has no texture. I tried to use Unity to add the albedo.jpg, normal.jpg, and emission.jpg, but it doesn't work.

YueYANG1996 commented 6 months ago

@sunfanyunn, could you help answer this question?

Dancing-Github commented 6 months ago

I also want to know: starting from a .glb file in Objaverse, how can I convert it into the four files (.pkl.gz, albedo, normal, and emission) of the asset format? Could you please give me some instructions on the method? @YueYANG1996

YueYANG1996 commented 6 months ago

We have a pipeline to do this task internally, and it will be released soon.

sarthakchittawar commented 6 months ago

> We have a pipeline to do this task internally, and it will be released soon.

Really looking forward to this. Will be a great contribution and make AI2THOR scenes accessible for much larger tasks.

sunfanyunn commented 6 months ago

@a962097364 You can convert *.pkl.gz files into *.obj files using the script @YueYANG1996 provided in #9. Then, you can use bpy to map the texture files back to the assets:

```python
import bpy

def create_and_assign_textures(material, albedo_path, emission_path, normal_path):
    # Enable 'Use Nodes'
    material.use_nodes = True
    nodes = material.node_tree.nodes

    # Find or create the Principled BSDF node
    bsdf = next((node for node in nodes if node.type == 'BSDF_PRINCIPLED'), None)
    if not bsdf:
        bsdf = nodes.new(type='ShaderNodeBsdfPrincipled')
        bsdf.location = (0, 0)

    # Create (or reuse) an image texture node for a given file
    def get_image_node(img_path, location):
        for node in nodes:
            if node.type == 'TEX_IMAGE' and node.image and node.image.filepath == img_path:
                return node
        new_node = nodes.new('ShaderNodeTexImage')
        new_node.image = bpy.data.images.load(img_path)
        new_node.location = location
        return new_node

    # Albedo texture -> Base Color
    albedo_tex = get_image_node(albedo_path, (-300, 100))
    material.node_tree.links.new(bsdf.inputs['Base Color'], albedo_tex.outputs['Color'])

    # Emission texture -> Emission (note: this input is named 'Emission Color' in Blender 4.x)
    emission_tex = get_image_node(emission_path, (-300, -100))
    material.node_tree.links.new(bsdf.inputs['Emission'], emission_tex.outputs['Color'])

    # Normal texture -> Normal Map node -> Normal input
    normal_map = next((node for node in nodes if node.type == 'NORMAL_MAP'), None)
    if not normal_map:
        normal_map = nodes.new('ShaderNodeNormalMap')
        normal_map.location = (-300, -300)
    normal_tex = get_image_node(normal_path, (-500, -300))
    material.node_tree.links.new(normal_map.inputs['Color'], normal_tex.outputs['Color'])
    material.node_tree.links.new(bsdf.inputs['Normal'], normal_map.outputs['Normal'])
```
a962097364 commented 6 months ago

> @a962097364 You can convert *.pkl.gz files into *.obj files using the script @YueYANG1996 provided in #9. Then, you can use bpy to map the texture files back to the assets: [...]

Thanks a lot, I will try it.

a962097364 commented 5 months ago

Hello, thank you for the earlier answers!

I have another question: how does Holodeck search for models in the dataset?

Does Holodeck iterate through all the models' metadata in the database and compare their metrics?

Also, it takes me a long time to generate one scene.

Thank you.
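(For reference: per the Holodeck paper, asset retrieval is not a brute-force metadata scan; candidates are ranked by similarity between the language description and precomputed CLIP/SBERT embeddings of the assets. A minimal cosine-similarity lookup over hypothetical precomputed vectors would look like this; the vector names and dimensions below are made up for illustration:)

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def top_k(query_vec, asset_vecs, k=3):
    """Rank asset UIDs by cosine similarity to the query embedding."""
    scored = [(uid, cosine(query_vec, vec)) for uid, vec in asset_vecs.items()]
    scored.sort(key=lambda t: t[1], reverse=True)
    return scored[:k]

# Hypothetical precomputed asset embeddings (real ones are high-dimensional)
asset_vecs = {"chair_01": [1.0, 0.0], "table_01": [0.0, 1.0], "stool_01": [0.9, 0.1]}
best = top_k([1.0, 0.0], asset_vecs, k=2)
```

Ranking against precomputed embeddings is fast; the bulk of scene-generation time usually goes to the LLM calls, not the retrieval itself.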

sunfanyunn commented 4 months ago

I created a new issue for your new question. Closing this one.

YueYANG1996 commented 3 months ago

> I also want to know that if starting from a glb file in objaverse, how can I convert it to 4 file (pkl, albedo, normal and emission) in the asset's format? Could you please give me some instructions on the method? @YueYANG1996

@Dancing-Github @sarthakchittawar Please refer to the objathor repo for this functionality.