-
**Describe the bug**
When working with a Set collection, it is not possible to perform add operations on the set because of a missing hash function. One gets the following error:
```
myModel.v…
```
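The truncated snippet does not show which language or framework `myModel` comes from, so as a purely generic illustration, here is how the same failure mode looks in Python: a type that defines equality but no hash function cannot be added to a set.

```python
# Illustration only (the report's actual language is cut off above).
# In Python, defining __eq__ without __hash__ makes a class unhashable,
# so set.add() raises TypeError.
class Model:
    def __init__(self, name):
        self.name = name

    def __eq__(self, other):
        return isinstance(other, Model) and self.name == other.name
    # No __hash__ here: Python sets it to None when __eq__ is defined.

models = set()
try:
    models.add(Model("a"))
except TypeError as e:
    print("cannot add:", e)  # unhashable type: 'Model'
```

Supplying a `__hash__` consistent with `__eq__` (e.g. `hash(self.name)`) resolves this kind of error.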
-
- Right Menu
- Affix menu
- Lesson summary?
- Course summary?
- Top user
- Exercise
- Lesson
- Course
- Create vocab question
- Admin
- Create vocab question:
- cat, big, monkey…
-
The pre-training [README](https://github.com/fastnlp/CPT/blob/master/pretrain/README.md) mentions that `dataset`, `vocab`, and `roberta_zh` have to be prepared before training.
Is ther…
-
import math
import os
import random

import matplotlib.pyplot as plt
import torch
from d2l import torch as d2l

# Allow duplicate OpenMP runtimes (a common workaround on Windows/conda setups).
os.environ["KMP_DUPLICATE_LIB_OK"] = "TRUE"
#@save
d2l.DATA_HUB['ptb'] …
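The truncated line above registers the PTB dataset in `d2l.DATA_HUB`. As a sketch of the pattern only (the URL and hash below are placeholders, not the real `ptb` entry), each hub entry maps a dataset name to a `(url, sha1)` pair so downloads can be integrity-checked:

```python
import hashlib

# Hypothetical DATA_HUB-style registry: name -> (url, sha1 hex digest).
DATA_HUB = {}
DATA_HUB['my_dataset'] = ('https://example.com/my_dataset.zip',
                          '0000000000000000000000000000000000000000')  # placeholder

def sha1_of(data: bytes) -> str:
    """SHA-1 hex digest used to verify downloaded bytes."""
    return hashlib.sha1(data).hexdigest()

def is_valid(name: str, data: bytes, hub=DATA_HUB) -> bool:
    """Compare downloaded bytes against the hash registered for `name`."""
    _, expected = hub[name]
    return sha1_of(data) == expected
```

In the real library, a download helper consults this registry, fetches the URL, and refuses the file if the digest does not match.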
-
I find the search link in the README very useful! https://vocab.nerc.ac.uk/search_nvs/L35/
Previously I was just searching on the page with CTRL-F, which is quite inefficient. I think it would be g…
-
Introduce domain-specific ontologies/vocabularies in place of generic/customized ones for concepts from soil health knowledge graph. Here are some candidates:
- [ ] [GloSIS](https://glosis-ld.github.…
-
Hi,
Thanks for sharing your code. I was not sure what format vocab.txt should be in, since the file is not in the repo, so I tested the code with a vocab.txt containing a single word on each line. Example…
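Assuming the guessed format above (one token per line, line number = token id), a minimal loader for such a file could look like:

```python
def load_vocab(path):
    """Load a vocab.txt with one token per line; the 0-based line
    number becomes the token id. This is an assumed format, not one
    confirmed by the repo."""
    with open(path, encoding='utf-8') as f:
        return {line.rstrip('\n'): i for i, line in enumerate(f)}
```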
-
![vocab-aliasing](https://user-images.githubusercontent.com/917606/50217943-2e5d2980-0382-11e9-9c98-064b49208761.png)
-
During the build process, a handful of vocabularies from `extra` are removed because they are considered useless anyway, and to save space. However, a number of other vocabs that rely on these vocabs (and are theref…
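The breakage described here is transitive: dropping one vocab can invalidate anything that depends on it, directly or indirectly. A small sketch (names and the dependency map are hypothetical) of computing the full set of vocabs affected by a removal:

```python
def broken_by_removal(removed, deps):
    """deps maps each vocab to the set of vocabs it relies on.
    Returns every vocab transitively broken when `removed` are dropped."""
    broken = set(removed)
    changed = True
    while changed:
        changed = False
        for vocab, requires in deps.items():
            if vocab not in broken and requires & broken:
                broken.add(vocab)
                changed = True
    return broken - set(removed)
```

A build step could run a check like this and either prune the dependents as well or keep the vocabs they need.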
-
tokenizer.vocab_size=12800, so why does token id 12800 appear? Shouldn't every token id be < tokenizer.vocab_size?
![1](https://github.com/user-attachments/assets/b31ecaf8-8f9f-4af2-89c0-8aa7cddb1eff)
![2…
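If this is a Hugging Face-style tokenizer, one common cause is that `vocab_size` reports only the base vocabulary, while tokens added afterwards (special tokens, user-added tokens) receive ids starting at `vocab_size`; the full id range is then `len(tokenizer)`, not `vocab_size`. A toy sketch of that indexing scheme (not the real tokenizer class):

```python
class ToyTokenizer:
    """Mimics the base-vocab vs. added-tokens split seen in Hugging Face
    tokenizers: tokens added after construction get ids >= vocab_size."""
    def __init__(self, base_vocab):
        self.base_vocab = dict(base_vocab)  # token -> id, ids < vocab_size
        self.added = {}                     # token -> id, ids >= vocab_size

    @property
    def vocab_size(self):
        return len(self.base_vocab)         # excludes added tokens

    def add_token(self, tok):
        """Register a new token and return its id."""
        if tok not in self.base_vocab and tok not in self.added:
            self.added[tok] = self.vocab_size + len(self.added)
        return self.added.get(tok, self.base_vocab.get(tok))

    def __len__(self):
        return len(self.base_vocab) + len(self.added)
```

So an id equal to `vocab_size` (here, 12800) usually points at the first added token; sizing an embedding table with `len(tokenizer)` instead of `vocab_size` avoids the out-of-range lookup.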