AutodeskAILab / Fusion360GalleryDataset

Data, tools, and documentation of the Fusion 360 Gallery Dataset

Doubts regarding FUSION 360 Gallery dataset #70

Closed abdulwajid725 closed 3 years ago

abdulwajid725 commented 3 years ago

I have a few questions about the Fusion 360 Gallery dataset.

1. For each item in the Reconstruction Dataset we have JSON, OBJ, PNG, SMT, and STEP files. The paper mentions that the dataset provides B-Rep, mesh, and construction sequence (JSON) formats. Which of the file formats correspond to B-Rep and mesh? I am very new to CAD datasets.
2. What embedding are you using to train the neural network, i.e. what input is given to the network?
3. From the paper (section 5.2): what do you mean by actions (At) and design (Gt), i.e. which components in the dataset represent these?

karldd commented 3 years ago
  1. Please take a look at the documentation for an explanation of the data formats.
  2. We use a variant of UV-Net where the inputs are points and surface types for the node features, and a face adjacency graph constructed from the B-Rep topology.
  3. The actions are the sequence of face extrusions (image below), and the design is the ground truth B-Rep geometry (without the construction sequence).
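As an illustration of the face adjacency graph idea (a sketch, not the repository's actual implementation; the face and edge IDs below are made up): each B-Rep face becomes a node, and two faces are connected when they share an edge.

```python
def face_adjacency(face_edges):
    """Build a face adjacency graph from a mapping face_id -> set of edge IDs.

    Two faces are adjacent if they share at least one edge.
    Returns an adjacency dict face_id -> set of neighboring face IDs.
    """
    adjacency = {face: set() for face in face_edges}
    faces = list(face_edges)
    for i, fa in enumerate(faces):
        for fb in faces[i + 1:]:
            if face_edges[fa] & face_edges[fb]:  # shared edge -> adjacent
                adjacency[fa].add(fb)
                adjacency[fb].add(fa)
    return adjacency

# Toy example: f0 shares an edge with f1 and with f2; f1 and f2 do not touch.
graph = face_adjacency({
    "f0": {"e0", "e1"},
    "f1": {"e1", "e2"},
    "f2": {"e0", "e3"},
})
```

In UV-Net-style models, this adjacency structure is what the message passing operates over, while the per-face samples (points, surface types) form the node features.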
abdulwajid725 commented 3 years ago

Hi, we have `sketch2image` to convert the JSON file to .png format. Similarly, is there a utility script that generates a 3D object from the JSON data provided in the dataset?

karldd commented 3 years ago

Yes, you can use Reconverter. Also note that 3D data is provided in the dataset in mesh (OBJ) and B-Rep (STEP/SMT) formats.
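If you only need to inspect the mesh data, the OBJ files are plain text and easy to read without the dataset tools. A minimal sketch (assuming only `v x y z` and `f i j k` lines; real OBJ files may also contain normals, texture coordinates, etc.):

```python
def parse_obj(text):
    """Parse vertices and triangle faces from a minimal OBJ string.

    Only handles plain `v` and `f` lines with 1-indexed vertex references.
    """
    vertices, faces = [], []
    for line in text.splitlines():
        parts = line.split()
        if not parts:
            continue
        if parts[0] == "v":
            vertices.append(tuple(float(x) for x in parts[1:4]))
        elif parts[0] == "f":
            # `f` entries may look like "i", "i/j", or "i//k"; keep the vertex index.
            faces.append(tuple(int(p.split("/")[0]) - 1 for p in parts[1:4]))
    return vertices, faces

# A single triangle as a tiny OBJ document.
verts, tris = parse_obj("v 0 0 0\nv 1 0 0\nv 0 1 0\nf 1 2 3\n")
```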

karldd commented 3 years ago

Closing this for now. Feel free to reopen if you have further questions.

abdulwajid725 commented 3 years ago

Okay, Thanks a lot for your support.

abdulwajid725 commented 3 years ago

Hi, I have generated embeddings for the regraph_05 dataset using the load_dataset method of train.py. Now I want to calculate the similarity between any two embeddings. Is there a way to do this? Since the dimensions and structure of the embeddings differ from a simple vector, I cannot apply cosine similarity or other methods directly.

chuhang-autodesk commented 3 years ago

Hi, thanks for the comments! I'm happy to help; I just want to make sure I understand correctly what you are trying to implement. By generating the embeddings using load_dataset, are you referring to the feature vectors for each node in the graph? In that case, all the feature vectors do have the same dimension. You can also run a forward pass of the MPN network to produce node embedding vectors, which likewise share the same dimension. Note that the feature and embedding vectors are per node. If you are trying to compare two items, you may want to consider average or sum pooling over the node-wise vectors. Hope this helps!
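To illustrate the pooling suggestion (a generic sketch, not code from this repository): given per-node embedding matrices of shape (num_nodes, dim), mean-pool each graph into a fixed-size vector and then compare with cosine similarity. This works even when the two graphs have different node counts, as long as the embedding dimension matches.

```python
import numpy as np

def graph_similarity(node_emb_a, node_emb_b):
    """Cosine similarity between mean-pooled node embeddings.

    node_emb_a: (n_a, d) array; node_emb_b: (n_b, d) array.
    n_a may differ from n_b, but d must match.
    """
    a = np.asarray(node_emb_a).mean(axis=0)  # (d,)
    b = np.asarray(node_emb_b).mean(axis=0)  # (d,)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Two toy "graphs" with 5 and 29 nodes but the same embedding dimension.
rng = np.random.default_rng(0)
sim = graph_similarity(rng.normal(size=(5, 8)), rng.normal(size=(29, 8)))
```

Sum pooling works the same way (replace `.mean(axis=0)` with `.sum(axis=0)`); with cosine similarity the two only differ by scale per graph, so they give the same score.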

abdulwajid725 commented 3 years ago

Hi, thanks for replying. As I understand it, each JSON file in the regraph_05 folder contains the nodes and links derived from the corresponding JSON file in the "reconstruction" folder. Using the load_dataset method, that node/link JSON is converted into an embedding, with the information stored in the graph_pairs_formatted list. By embedding I mean the embedding of the whole graph, not just the individual nodes and links. My main goal is to calculate a similarity score between two objects, say "21237_7887a24b_0005" and "21242_6c2af7c2_0006", which are now represented by embeddings. "21242_6c2af7c2_0006" contains two graphs with 29 and 25 nodes respectively, whereas "21237_7887a24b_0005" contains one graph with 5 nodes. I was looking for a way to compare these two objects. I will try the average and sum pooling suggestion. Thanks!

abdulwajid725 commented 3 years ago

Hi @chuhang-autodesk, in the previous comment you pointed out that we can generate node embeddings using the MPN network. Can you tell me in which Python file this is done? The outputs of the pretrained model are output_start, output_end, and output_op, but we want the node embeddings as output. Thanks!

chuhang-autodesk commented 3 years ago

You can modify the forward function after tools/regraphnet/src/train.py#L92 and directly print x0 and x1, which contain the embedding vectors for the two graphs.
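The change described here amounts to exposing intermediate activations instead of only the final head outputs. A generic sketch of that pattern (a plain NumPy stand-in, not the actual regraphnet model; the names `x0`/`x1` simply mirror the variables mentioned above):

```python
import numpy as np

class TinyMPNStandIn:
    """Toy stand-in for a network whose forward normally returns only head
    outputs; `return_embeddings=True` additionally exposes the per-node
    embedding matrices."""

    def __init__(self, in_dim, hidden_dim, seed=0):
        rng = np.random.default_rng(seed)
        self.w_embed = rng.normal(size=(in_dim, hidden_dim))
        self.w_head = rng.normal(size=(hidden_dim, 1))

    def forward(self, feats0, feats1, return_embeddings=False):
        # x0/x1 play the role of the per-node embeddings for the two graphs.
        x0 = np.tanh(feats0 @ self.w_embed)
        x1 = np.tanh(feats1 @ self.w_embed)
        out = (x0.mean(axis=0) + x1.mean(axis=0)) @ self.w_head
        if return_embeddings:
            return out, x0, x1
        return out

net = TinyMPNStandIn(in_dim=4, hidden_dim=8)
f0 = np.ones((5, 4))  # graph 0: 5 nodes, 4 input features each
f1 = np.ones((3, 4))  # graph 1: 3 nodes
out, x0, x1 = net.forward(f0, f1, return_embeddings=True)
```

Returning the extra values (rather than printing) keeps the function usable for the pooling-and-similarity comparison discussed earlier in the thread.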