Oufattole / meds-torch


Embedding tokens #1

Closed. Oufattole closed this 2 months ago.

Oufattole commented 2 months ago
[Figure: observation-level embedder schematic (screenshot, 2024-07-03)]

I'm trying to implement this observation-level embedder (as described in the figure above) to convert the PyTorch dataset batches into a sequence of embeddings, one per observation, with static variables preceding dynamic variables.
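For concreteness, here is a minimal sketch of what such an observation-level embedder could look like. The batch keys (`static_codes`, `static_values`, `dynamic_codes`, `dynamic_values`) and the code-plus-value embedding scheme are assumptions for illustration, not the keys or design meds-torch actually uses:

```python
import torch
import torch.nn as nn


class ObservationEmbedder(nn.Module):
    """Embed each (code, numeric value) observation as one vector.

    Hypothetical sketch: batch keys and the sum of code embedding and
    projected value are assumptions, not the meds-torch implementation.
    """

    def __init__(self, vocab_size: int, dim: int):
        super().__init__()
        self.code_emb = nn.Embedding(vocab_size, dim)  # one row per code in the vocab
        self.value_proj = nn.Linear(1, dim)            # project scalar values into the same space

    def _embed(self, codes: torch.Tensor, values: torch.Tensor) -> torch.Tensor:
        # codes: (batch, seq) long; values: (batch, seq) float
        return self.code_emb(codes) + self.value_proj(values.unsqueeze(-1))

    def forward(self, batch: dict) -> torch.Tensor:
        static = self._embed(batch["static_codes"], batch["static_values"])
        dynamic = self._embed(batch["dynamic_codes"], batch["dynamic_values"])
        # static observations come first in the output sequence
        return torch.cat([static, dynamic], dim=1)
```

With batch size 2, 3 static and 10 dynamic observations, and `dim=64`, the output would have shape `(2, 13, 64)`: one embedding per observation, static first.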

Where do I find the vocab size in the PyTorch dataset class?
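The thread does not record a direct answer, but as a hedged sketch of one common pattern in MEDS pipelines (the path `metadata/codes.parquet` and the padding offset are assumptions here), the vocabulary size can be derived from the dataset's code metadata table:

```python
import polars as pl

# Hypothetical: count the distinct codes in the MEDS code metadata table,
# reserving index 0 for padding. Path and offset are assumptions.
code_metadata = pl.read_parquet("metadata/codes.parquet")
vocab_size = len(code_metadata) + 1
```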

Oufattole commented 2 months ago

This was resolved in pull request #3: the PyTorch dataset class was given an enum that selects between event-stream-style collating (which was already supported) and triplet-style collating (which is depicted in the figure above and was just added!).
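As a rough sketch of what an enum-selected triplet collate could look like (the enum member names, the batch keys `time`/`code`/`value`, and the padding scheme are all illustrative assumptions; see PR #3 for the actual implementation):

```python
from enum import Enum

import torch
from torch.nn.utils.rnn import pad_sequence


class CollateType(str, Enum):
    # illustrative member names; see PR #3 for the actual enum
    EVENT_STREAM = "event_stream"
    TRIPLET = "triplet"


def triplet_collate(items: list[dict]) -> dict:
    """Pad per-subject (time, code, value) triplet sequences into a batch.

    Each item is assumed to hold equal-length 1-D tensors `time`, `code`,
    and `value`, one entry per observation.
    """
    batch = {
        key: pad_sequence([item[key] for item in items], batch_first=True)
        for key in ("time", "code", "value")
    }
    # boolean mask separating real observations from padding
    lengths = torch.tensor([len(item["code"]) for item in items])
    batch["mask"] = torch.arange(batch["code"].shape[1])[None, :] < lengths[:, None]
    return batch
```

The appeal of the triplet layout is that every observation becomes one flat (time, code, value) position in the sequence, which matches the one-embedding-per-observation scheme from the figure, whereas event-stream collating keeps observations grouped by event.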