second-state / WasmEdge-WASINN-examples


[Example] ggml: increase llama ctx-size for tinyllama CI job #100

Closed. dm4 closed this pull request 2 months ago.

juntao commented 2 months ago

Hello, I am a code review bot on flows.network. Here are my reviews of code commits in this PR.


Commit 6510caf1d16d3e80acbfa7f38be97297fe723b67

Key Changes:

  1. Increased the ctx-size option from 512 to 1024 in the main.rs file (a sketch of how this option is passed to the GGML backend follows this list).
  2. No change in the size of the wasmedge-ggml-llama.wasm file.
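
For context, here is a minimal sketch of how a ctx-size value is typically passed to the WasmEdge GGML backend from Rust. The builder API shown (wasmedge_wasi_nn::GraphBuilder with config() and build_from_cache()), the "default" model alias, and the exact option keys are assumptions based on the examples in this repository, not a verbatim excerpt of the patched main.rs.

```rust
// Sketch only: assumes the `wasmedge-wasi-nn` crate and the GGML plugin's
// JSON metadata keys ("ctx-size", "n-gpu-layers"); not the exact main.rs diff.
use serde_json::json;
use wasmedge_wasi_nn::{ExecutionTarget, GraphBuilder, GraphEncoding};

fn main() {
    // The PR raises this value from 512 to 1024 so the TinyLlama prompts
    // used in the CI job fit within the context window.
    let options = json!({
        "ctx-size": 1024,
        "n-gpu-layers": 0,
    });

    // Model alias preloaded via `--nn-preload`; "default" is a placeholder.
    let graph = GraphBuilder::new(GraphEncoding::Ggml, ExecutionTarget::AUTO)
        .config(options.to_string())
        .build_from_cache("default")
        .expect("failed to load the ggml model");

    let mut ctx = graph
        .init_execution_context()
        .expect("failed to create an execution context");
    // ... set_input / compute / get_output as in the example's main.rs ...
    let _ = &mut ctx;
}
```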

Potential Problems:

  1. Increased ctx-size: A larger context window increases memory usage, since the KV cache grows linearly with ctx-size, and can affect inference latency. This change should be tested to confirm it does not cause memory pressure or performance regressions in the CI job; a rough memory estimate is sketched after this list.

  2. Binary Patch in the Wasm File: The binary patch to wasmedge-ggml-llama.wasm indicates changes at the binary level. Confirm that the rebuilt artifact corresponds to the source change and does not introduce unexpected behavior or security issues.

  3. Maintenance of Binary Files: Keeping binary artifacts in version control makes changes hard to review and track. Document why the binary was updated and consider alternatives, for example building the .wasm in CI or distributing it as a release artifact.
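
To put the memory concern in item 1 into perspective, here is a back-of-envelope estimate of KV cache growth when doubling ctx-size. The TinyLlama-1.1B shape parameters used (22 layers, 4 KV heads of dimension 64, f16 cache) are assumptions taken from the commonly published model configuration, not values stated in this PR.

```rust
// Rough KV-cache size estimate; model shape values are assumptions
// (TinyLlama-1.1B: 22 layers, 4 KV heads, head dim 64, f16 cache).
fn kv_cache_bytes(n_layers: u64, n_kv_heads: u64, head_dim: u64, ctx: u64) -> u64 {
    let bytes_per_elem = 2; // f16
    2 * n_layers * n_kv_heads * head_dim * ctx * bytes_per_elem // K and V
}

fn main() {
    let old = kv_cache_bytes(22, 4, 64, 512);
    let new = kv_cache_bytes(22, 4, 64, 1024);
    // Prints roughly 11 MiB -> 22 MiB: a modest increase for TinyLlama,
    // but still worth checking against the CI runner's available memory.
    println!(
        "ctx 512: {:.1} MiB, ctx 1024: {:.1} MiB",
        old as f64 / (1024.0 * 1024.0),
        new as f64 / (1024.0 * 1024.0)
    );
}
```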

Overall, the increased ctx-size is the primary change in this patch; it should be tested for memory usage and performance before merging. In addition, the binary patch to the Wasm file should be carefully reviewed and validated.