naklecha / llama3-from-scratch

llama3 implementation one matrix multiplication at a time
MIT License
13.72k stars · 1.1k forks

Request for Authorization to Integrate Llama3 Codebase into Datawhale's LLMs-from-Scratch Project #8

Closed by Ethan-Chen-plus 5 months ago

Ethan-Chen-plus commented 5 months ago

Hello @naklecha,

I hope this message finds you well.

I am a contributor to the Datawhale community, specifically working on the LLMs-from-Scratch-CN project. Our goal is to recreate large language models from scratch, offering detailed discussions and implementations in the Chinese language. We have successfully completed the implementation of ChatGLM3 and have plans to implement other models such as Mamba, RWKV, Phi, MiniCPM, Qwen, among others.

We came across your impressive work on the Llama3-from-scratch repository. The detailed and precise implementation you have provided is exactly what we are looking for to include in our project. We believe that integrating your Llama3 codebase will not only enhance our project but also provide a valuable resource for the community.

Therefore, we would like to formally request your authorization to integrate your Llama3 implementation into our project. We assure you that we will provide proper attribution and adhere to any licensing terms you specify.

Please let us know if you have any conditions or requirements for this integration. We look forward to your positive response.

Best regards,

Ethan-Chen-Plus Datawhale Community

mbrukman commented 5 months ago

@Ethan-Chen-plus, please consider waiting until https://github.com/naklecha/llama3-from-scratch/issues/10 is resolved, which will provide a license for this project; then you can decide whether you can use the code under that license in your project.

naklecha commented 5 months ago

Feel free to use my work, I've added an MIT license to it, have fun :) I'd love for attribution to be provided to https://www.aaaaaaaaaa.org/ or my GitHub repo, thank you!

Ethan-Chen-plus commented 5 months ago

Thanks a lot for AAAAAAAAAA research! 🤗🎊