Closed: m-spr closed this issue 1 year ago
Hi @m-spr thanks for reaching out, sounds like a cool project. I am happy to help.
I am not sure I fully understand your questions though. It would be helpful if you could provide more details and perhaps some code examples.
In the embeddings, the hypervectors are stored in the weight property of the embedding class; you can override those. I'm not sure if this is what you're looking for.
Thank you for your answer. Yes, can you please help me with how to overwrite the weight and initialize it with my own binary vectors? I tried to set the embedding's parameter to my tensor data, but it did not work. I even used "torchhd.MAPTensor" to change the type. I think the problem was with rebuilding the package. Can you please help me rebuild the package so that my changes take effect?
You can do the following:
import torchhd, torch
emb = torchhd.embeddings.Random(5, 1000, "BSC")
print(emb.weight)
# Parameter(BSCTensor([[ True, False, True, ..., True, True, False],
# [False, True, True, ..., False, True, True],
# [False, False, False, ..., False, True, True],
# [ True, False, False, ..., True, False, False],
# [False, True, False, ..., False, False, False]]))
my_hvs = torch.randn(5, 1000) < 0
print(my_hvs)
# tensor([[False, True, True, ..., True, True, True],
# [False, False, False, ..., False, False, True],
# [False, False, True, ..., True, False, False],
# [False, False, False, ..., False, False, True],
# [ True, False, False, ..., False, True, True]])
print(emb.weight.data.copy_(my_hvs))
# BSCTensor([[False, True, True, ..., True, True, True],
# [False, False, False, ..., False, False, True],
# [False, False, True, ..., True, False, False],
# [False, False, False, ..., False, False, True],
# [ True, False, False, ..., False, True, True]])
Let me know if this helps.
Thanks a lot. :) It works.
If you don't mind, I leave this issue open for a while for future discussions.
Hi @mikeheddes, :)
Can you please help me read the class hypervectors as well? I mean the hypervectors that are generated during training as representatives of the classes and then used to classify the queries.
Thanks in advance!
Hi, it would be helpful if you could provide more details about what exactly you are looking for.
One way you could get the learned class vectors is by reading the weight tensor of the trained model:
import torchhd
num_classes = 2
hv_dimensions = 10
model = torchhd.models.Centroid(hv_dimensions, num_classes)
# perform model training
print(model.weight)
# Parameter containing:
# tensor([[0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],
# [0., 0., 0., 0., 0., 0., 0., 0., 0., 0.]])
See: https://torchhd.readthedocs.io/en/stable/generated/torchhd.models.Centroid.html#torchhd.models.Centroid for more information.
Hope this helps.
Closed due to inactivity.
Hi,
I am a PhD student at KIT and I want to develop an FPGA-based hardware accelerator for Torchhd; I would be grateful if you could help me figure out some features of Torchhd. I have two points. First, can you please help me extract the base vectors and level vectors from the model to test the currently implemented hardware? I tried to extract them by indexing, but I was looking for an easier way. Second, my main focus is an efficient hardware implementation, and for that I need to modify the base vector values and check the accuracy. I have already modified the "embedding.py" file but could not rebuild the package. Can you please tell me how to change the values of the base vectors, or how I can use my own initialized values instead of the random ones?
Thank you in advance.