LordScarface opened this issue 1 year ago
Okay, so I was missing the `has_normalizations=true` flag; it is working now. But I was wondering if it would also be feasible to add a floating-point implementation of Softmax and GELU?
> I was wondering if it would be feasible to also add a floating point implementation of Softmax and GELU?
That would be a great feature to have for sure, but it's not planned for the near future, just due to a lack of manpower. Hopefully, we get someone who wants to start working on that, or an outside contributor makes a PR to add that feature.
For now, Gemmini's transformer support targets I-BERT rather than floating-point BERT implementations.
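In the meantime, a floating-point Softmax/GELU can be applied on the host core after the Gemmini matmul. Below is a minimal sketch in plain C; the helper names `softmax_rows` and `gelu` are hypothetical, not part of the Gemmini API:

```c
#include <math.h>
#include <stddef.h>

// Hypothetical CPU-side fallbacks: run the matmul on Gemmini with
// NO_ACTIVATION, then apply these to the (dequantized) floating-point
// result on the host core.

// Row-wise, numerically stable softmax over a rows x cols matrix.
static void softmax_rows(float *x, size_t rows, size_t cols) {
  for (size_t i = 0; i < rows; i++) {
    float *row = x + i * cols;
    float max = row[0];
    for (size_t j = 1; j < cols; j++)
      if (row[j] > max) max = row[j];
    float sum = 0.0f;
    for (size_t j = 0; j < cols; j++) {
      row[j] = expf(row[j] - max);  // subtract the max to avoid overflow
      sum += row[j];
    }
    for (size_t j = 0; j < cols; j++)
      row[j] /= sum;
  }
}

// tanh approximation of GELU, as used in BERT-style models.
static float gelu(float v) {
  return 0.5f * v * (1.0f + tanhf(0.79788456f * (v + 0.044715f * v * v * v)));
}
```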
Hello, I just started with Gemmini and Chipyard, and now I am facing some issues with the Softmax, GELU, and LayerNorm activation functions.
I got Gemmini running and I get the correct results for matrix multiplications and RELU activation, but Softmax causes a crash in the simulator and also on the FPGA implementation.
I built a bare-metal app based on the examples in the gemmini-rocc-tests repo that just does a matrix multiplication followed by a Softmax. This is the code:
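For reference, here is a minimal sketch of what such a test can look like, modeled on the transformers tests in gemmini-rocc-tests. The exact `tiled_matmul_auto` argument list and the activation constants (`SOFTMAX`, etc.) differ between Gemmini versions, so treat this as an approximation and check `include/gemmini.h` in your checkout:

```c
#include <stdio.h>
#include "include/gemmini_testutils.h"

#define MAT_DIM 64

static elem_t A[MAT_DIM][MAT_DIM];
static elem_t B[MAT_DIM][MAT_DIM];
static elem_t C[MAT_DIM][MAT_DIM];

int main() {
  gemmini_flush(0);

  // Small input values so the int8 products do not saturate.
  for (int i = 0; i < MAT_DIM; i++)
    for (int j = 0; j < MAT_DIM; j++) {
      A[i][j] = (i + j) % 4;
      B[i][j] = (i * j) % 4;
    }

  // C = softmax(A * B); the SOFTMAX activation is applied on the
  // move-out path and requires hardware generated with
  // has_normalizations = true.
  tiled_matmul_auto(MAT_DIM, MAT_DIM, MAT_DIM,
      (elem_t*)A, (elem_t*)B, NULL, (elem_t*)C,
      MAT_DIM, MAT_DIM, MAT_DIM, MAT_DIM,
      MVIN_SCALE_IDENTITY, MVIN_SCALE_IDENTITY, MVIN_SCALE_IDENTITY,
      SOFTMAX, ACC_SCALE_IDENTITY, 0,
      false,         // repeating_bias
      false, false,  // transpose_A, transpose_B
      false, false,  // full_C, low_D
      0,             // weightA
      WS);           // weight-stationary dataflow

  for (int i = 0; i < MAT_DIM; i++) {
    for (int j = 0; j < MAT_DIM; j++)
      printf("%d ", C[i][j]);
    printf("\n");
  }
  return 0;
}
```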
I then built the Verilator simulator with `./scripts/build-verilator.sh` and ran the test with `./scripts/run-verilator.sh $(which ./software/gemmini-rocc-tests/build/transformers/softmax_test-baremetal)`.
I also modified `xcustom.h` to print all commands that are executed on Gemmini. Here is the output:
So the last command fails; it is the following, from `gemmini.h`:

The corresponding assert that fails is in `DMACommandTracker`, line 89.
If I just change the activation to RELU, it works as expected.
Here are the versions I am using:
I hope someone can help me with this.

Best regards,
Lukas