-
### 🐛 Describe the bug
**Using `LazyInitContext` and later loading a checkpoint does not properly initialize model parameters.**
```python
import colossalai
from colossalai.lazy import LazyInitCon…
-
Expected: "You will then be sent an email with a private link. The last portion of the private link is an access token."
Result: I received the link "https://zenodo.org/records/6631159", which is th…
-
```
What device(s) are you experiencing the problem on?
NEC LifeTouch NOTE (a Japanese Android device; hardware QWERTY keyboard
equipped.)
What firmware version are you running on the device?
Not…
-
Unfortunately, the field Variable Mapping is too short. It is only 30 characters long, but I get automatically generated variable names, for example like this from the "Source Transformer": patientIdentifi…
-
There is a problem with the KSampler node when I use IPAdapter with SD1.5 models and LayerDiffusion together. The error message is: 'AttentionSharingUnit' object has no attribute 'to_q', but I don't have …
-
Dear Mamba Contributors,
I hope this message finds you well. I am in the process of utilising the Mamba state space architecture for a language modelling task and have been highly impressed with th…
-
```
What steps will reproduce the problem?
1. Scroll down puzzle list when in landscape (horizontal).
2.
3.
What is the expected output? What do you see instead?
Crash with message "Unfortunately, Sh…
-
While source maps are not extremely useful for debugging on such a small data sets, they can be still handy for testing transformers and (re)using existing tools that can match transformed pieces of r…
-
I found `seqs *= self.item_emb.embedding_dim ** 0.5` in the function `log2feats(self, log_seqs)`. Is there any reason for scaling the sequence after embedding?
`seqs = self.item_emb(torch.LongTensor(log_seqs)…
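The multiplier in question matches the Transformer convention of scaling token embeddings by the square root of the embedding dimension: under a Xavier-style initialization each embedding component has variance on the order of 1/d, so multiplying by `embedding_dim ** 0.5` brings the vectors back to unit scale before positional embeddings are added. A minimal NumPy sketch of that effect (the dimension and the initializer here are illustrative assumptions, not taken from the repository):

```python
import numpy as np

d = 64                      # assumed embedding dimension (illustrative)
rng = np.random.default_rng(0)
# Xavier-style init: each component ~ N(0, 1/d), so squared norms average ~ 1
emb = rng.normal(0.0, 1.0 / np.sqrt(d), size=(1000, d))
# the `seqs *= self.item_emb.embedding_dim ** 0.5` step
scaled = emb * d ** 0.5
# after scaling, squared norms average ~ d, i.e. unit variance per component
mean_sq_norm_raw = float(np.mean(np.sum(emb ** 2, axis=1)))
mean_sq_norm_scaled = float(np.mean(np.sum(scaled ** 2, axis=1)))
```

One common justification is that this keeps the token embeddings on a scale comparable to the positional embeddings added immediately afterwards, so neither term dominates the sum.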