Closed — leseb closed this 3 weeks ago
Note: Links to docs will display an error until the docs builds have been completed.
As of commit 42f181148b5a077adde5132f66575c64807ac260 with merge base 54455a336a0732e1c84f70b191e7a2ab931711cf: :green_heart: Looks good so far! There are no failures yet. :green_heart:
This comment was automatically generated by Dr. CI and updates every 15 minutes.
Hi @leseb!
Thank you for your pull request and welcome to our community.
In order to merge any pull request (code, docs, etc.), we require contributors to sign our Contributor License Agreement, and we don't seem to have one on file for you.
In order for us to review and merge your suggested changes, please sign at https://code.facebook.com/cla. If you are contributing on behalf of someone else (e.g. your employer), the individual CLA may not be sufficient and your employer may need to sign the corporate CLA.
Once the CLA is signed, our tooling will perform checks and validations. Afterwards, the pull request will be tagged with *CLA signed*. The tagging process may take up to 1 hour after signing. Please give it that time before contacting us about it.
If you have received this in error or have any questions, please contact us at cla@meta.com. Thanks!
CLA is in-progress.
@gabe-l-hart PTAL
Thanks for the fix @leseb, kicked off the CI
@leseb Good find! The change I made in 766bee simply extended a single hard-coded name to a single non-hard-coded name, so the limitation of a single weight mapping was there before. With this change it will no longer assert, but it will still only read the weight mapping from the first file found. In the case of `mistral`, are the multiple weight mapping files redundant? I suppose we could do some kind of check and/or merge of the multiple files if the mapping is split across them, though I'm not clear whether that's actually a valid file layout.
Ah, I see in the files that the multiple `index` files are mapped to the different weight format sets (`.safetensors` vs `.bin`). I think the right fix would be to line them up based on the `prefer_safetensors` value for the model.
I've added a few more commits on top of @leseb's: https://github.com/leseb/torchchat/pull/1. I'm not sure whether the best course is to merge my commits into his branch to update the pointer for this PR or to just update this branch's pointer to use my fork. I'm open to either!
@gabe-l-hart Thanks for taking the time to come up with a better approach. At this stage, I believe it makes more sense that you take over since there is nothing left from my initial change. Thanks!
42f18114 fix: allow multiple weight mapping files for mistral
commit 42f181148b5a077adde5132f66575c64807ac260
Author: Sébastien Han <seb@redhat.com>
Date:   Wed Nov 6 15:20:28 2024 +0100