HazyResearch / safari

Convolutions for Sequence Modeling

configs for Hyena Wikitext103 experiments #28

Open xiaobo-guo opened 1 year ago

xiaobo-guo commented 1 year ago

Your work is excellent! I am trying to follow it and facing some problems. Could you share the config for the WikiText-103 dataset for Hyena? I tried to run experiments with the 125-slim config, but the test perplexity is higher than the reported result (about 21 with Hyena). I am also wondering whether removing flash-attn influences the result.

Zymrael commented 1 year ago

Can you share the config? Wikitext is quite sensitive to a few hyperparameters. Flash attention will not affect the result for Hyena.

xiaobo-guo commented 1 year ago

Thanks for your response.

I attach the config file config.txt

Zymrael commented 1 year ago

You should set the dropouts to 0.2 as a first step. After you get to sub-19 ppl, you will be in tuning range.
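For reference, that override would look roughly like the following in a Hydra-style experiment config (a minimal sketch; the exact key names such as embed_dropout and resid_dropout are assumptions and should be checked against the attached config.txt):

```yaml
# Sketch of the suggested dropout override; key names are assumed,
# verify them against your own config before using.
model:
  embed_dropout: 0.2   # dropout on the token embeddings
  resid_dropout: 0.2   # dropout on the residual / block outputs
```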

xiaobo-guo commented 1 year ago

Thank you. Shall I also set the order to 3 in the Hyena layer?
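For reference, raising the Hyena recurrence order would be a one-line change in the layer block of the config, roughly as sketched below (the layer name and surrounding keys are assumptions based on the repo's config layout):

```yaml
# Sketch of a Hyena layer block with the recurrence order raised to 3;
# keys other than `order` are assumptions.
model:
  layer:
    _name_: hyena   # assumed layer registry name
    order: 3        # depth of the Hyena recurrence (library default is 2)
    l_max: 1024     # maximum sequence length, assumed value
```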

sustcsonglin commented 1 year ago

> Can you share the config? Wikitext is quite sensitive to a few hyperparameters. Flash attention will not affect the result for Hyena.

Could you please put the configs you used in configs/experiment/wt103? That would be super helpful!

sustcsonglin commented 1 year ago

> Thank you. Shall I also set the order to 3 in the Hyena layer?

Did you reproduce the 19 ppl result using dropout=0.2? I still get 22.

xiaobo-guo commented 1 year ago

> Thank you. Shall I also set the order to 3 in the Hyena layer?

> Did you reproduce the 19 ppl result using dropout=0.2? I still get 22.

I set the dropout to 0.2 and the order to 3, and I get about 20 ppl, but I still cannot reach the reported result.

Zymrael commented 1 year ago

You can look at this config for an independent reproduction that gets to sub-19 ppl. Let me know if you still have issues with the loss being too high after this, and I'll rerun the experiments in the new codebase.

radarFudan commented 1 year ago

> Thank you. Shall I also set the order to 3 in the Hyena layer?

> Did you reproduce the 19 ppl result using dropout=0.2? I still get 22.

> I set the dropout to 0.2 and the order to 3, and I get about 20 ppl, but I still cannot reach the reported result.

Question: did you change attn_layer_idx? In your attached config there are attention layers at layers 1 and 8 (inherited from base.yaml).
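For reference, this refers to a setting of roughly the following shape (a sketch; the exact key name and values should be checked against base.yaml):

```yaml
# Sketch: attention interleaved at blocks 1 and 8, as inherited from base.yaml.
model:
  attn_layer_idx: [1, 8]   # indices of blocks that use attention instead of Hyena
# For a pure-Hyena model this would instead be an empty list:
#   attn_layer_idx: []
```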

radarFudan commented 1 year ago

> You can look at this config for an independent reproduction that gets to sub-19 ppl. Let me know if you still have issues with the loss being too high after this, and I'll rerun the experiments in the new codebase.

Thanks for the helpful reference. I checked that repo and the released log from S5 (https://wandb.ai/jimmysmith1919/S5_ICL/reports/Hyena-red-and-Hyena-S5-blue-on-WikiText-103--Vmlldzo0MTkwODEx?accessToken=pk0zw5w75uo1s4zkn3kh7koum902t4q2yzbm28xk0olzzgxuskoq0g1iyauixlob), which shows Hyena reaching a test perplexity of 19.094.

It would still be very helpful if you could share the detailed configuration of Hyena on WikiText-103.