-
I trained an MoE model on 8 GPUs with 8 experts. When I ran inference in parallel, each process produced a similar but slightly different result. What could be the cause of this?
zws98 updated
7 months ago
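One plausible explanation (not confirmed in the thread) is floating-point non-associativity: when each rank reduces its partial results in a different order, the final values can differ in the low-order bits even with identical inputs and weights. A minimal pure-Python sketch of the effect:

```python
# Floating-point addition is not associative, so changing the order in
# which partial sums are combined (as different reduction schedules on
# different ranks may do) changes the result in the low-order bits.
left_first = (0.1 + 0.2) + 0.3    # 0.6000000000000001
right_first = 0.1 + (0.2 + 0.3)   # 0.6
print(left_first == right_first)  # False
print(abs(left_first - right_first))
```

Other per-rank differences, such as unseeded RNG state or divergent expert routing under load balancing, could compound this, so the sketch above covers only one candidate cause.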
-
I just see "conversion to PDF failed"
![Screenshot 2021-03-19 072438](https://user-images.githubusercontent.com/10896821/111779521-3cad8f00-8884-11eb-9a9f-01e5cf78b350.png)
-
# Improving garbage collector
[https://marisa.moe/balancer.html](https://marisa.moe/balancer.html)
-
In [`5127c5d`](https://github.com/NoPlagiarism/services-personal-upptime/commit/5127c5dbcedeccae0e3f95828d6d9b738dea04ec), CloudTube tube.cadence.moe (https://tube.cadence.moe) was **down**:
- HTTP c…
-
**Describe the bug**
I create an 8-layer Megatron transformer model (ori-model) and convert it into a DeepSpeed model (ds-model) with deepspeed.init_inference.
Then I compare the results of ori-model w…
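When comparing a reference model's outputs against an optimized engine's, bit-exact equality is usually too strict; a tolerance-based comparison is the common approach. A minimal sketch, where the `ref_out`/`ds_out` values are hypothetical and stand in for the two models' outputs:

```python
import math

# Hypothetical outputs from the original model and the converted one;
# small discrepancies from fused kernels or reduced precision are expected.
ref_out = [0.6000000000000001, -1.25, 3.5]
ds_out  = [0.6, -1.25, 3.5000001]

# Element-wise comparison within relative/absolute tolerances,
# analogous to an allclose-style check.
close = all(math.isclose(a, b, rel_tol=1e-5, abs_tol=1e-8)
            for a, b in zip(ref_out, ds_out))
print(close)  # True
```

Whether a mismatch beyond such tolerances is a real bug or expected numerical drift depends on the precision the engine runs in.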
-
https://blog.cascade.moe/posts/arch-with-zfs/?
-
manifest.json is overwritten by SvelteKit, so paimon.moe is no longer PWA-compliant, which means you can't install it as an app anymore. [https://paimon.moe/manifest.json](https://paimon.moe/manifest.j…
-
https://qwq.moe/network-in-my-home
-
Thanks for making this code available. I am trying to run it and having problems with the command-line input values. I downloaded the PAN18-Author-Profiling-master.zip from https://pan.webis.de/clef18…
-
# The secret to profiting in the crypto market
HODL!
[https://egoist.moe/secret-to-success](https://egoist.moe/secret-to-success)