Closed deependujha closed 2 months ago
Attention: Patch coverage is 28.57143% with 10 lines in your changes missing coverage. Please review.
Project coverage is 78%. Comparing base (b039b64) to head (e2221d9). Report is 1 commit behind head on main.
I use Linux (Ubuntu 24.04) but am still unable to reproduce the issue (chunk deletion).
I tried to make sure all the workers and their respective uploaders, downloaders, and removers are terminated.
If @rasbt or @srikhetramohanty can test the PR, please verify whether it works, or report the error logs.
@bhimrazy you also had some issue; if you can, please check it too!
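The termination pattern described above (wait for the remover to finish before asserting the cache is empty) can be sketched with the standard library alone. This is an illustrative sketch, not litdata's actual implementation: the `remover` function, the sentinel value, and the file names are all hypothetical stand-ins for the worker/uploader/remover machinery in `data_processor.py`.

```python
import os
import queue
import tempfile
import threading


def remover(q):
    # Hypothetical remover: delete paths from the queue until a None sentinel arrives.
    while True:
        path = q.get()
        if path is None:
            return
        os.remove(path)


with tempfile.TemporaryDirectory() as cache_dir:
    # Create a few fake cached chunk files.
    chunks = []
    for i in range(3):
        p = os.path.join(cache_dir, f"chunk-0-{i}.bin")
        open(p, "wb").close()
        chunks.append(p)

    q = queue.Queue()
    t = threading.Thread(target=remover, args=(q,))
    t.start()
    for p in chunks:
        q.put(p)
    q.put(None)  # sentinel: no more work
    t.join()     # crucial: wait for the remover to finish BEFORE checking

    leftover = os.listdir(cache_dir)
    assert leftover == [], f"All the chunks should have been deleted. Found {leftover}"
print("no leftover chunks")
```

If the `t.join()` is skipped, the final check can race with the remover and intermittently find a leftover `chunk-0-0.bin`, which matches the flaky behavior reported below.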
Thanks for the PR @deependujha. Just tested the PR and am still getting the
...
Worker 0 is terminating.
Worker 0 is done.
Progress: 100%|███████████████████████████████████████████████████████████████████████████| 2/2 [00:03<00:00, 1.97s/it]
Workers are finished.
Traceback (most recent call last):
File "/home/sebastian/tmp/test_litdata.py", line 20, in <module>
optimize(
File "/home/sebastian/tmp/litdata/src/litdata/processing/functions.py", line 445, in optimize
data_processor.run(
File "/home/sebastian/tmp/litdata/src/litdata/processing/data_processor.py", line 1148, in run
result = data_recipe._done(len(user_items), self.delete_cached_files, self.output_dir)
File "/home/sebastian/tmp/litdata/src/litdata/processing/data_processor.py", line 812, in _done
raise RuntimeError(f"All the chunks should have been deleted. Found {chunks}")
RuntimeError: All the chunks should have been deleted. Found ['chunk-0-0.bin']
issue.
@tchaton To reproduce, I am using the 2nd example from here: https://github.com/Lightning-AI/litdata/issues/367
import glob
import random
from pathlib import Path

from litdata import optimize


def tokenize(filename: str):
    with open(filename, "r", encoding="utf-8") as file:
        text = file.read()
    text = text.strip().split(" ")
    word_to_int = {word: random.randint(1, 1000) for word in set(text)}
    tokenized = [word_to_int[word] for word in text]
    yield tokenized


train_files = sorted(glob.glob(str(Path("custom_texts") / "*.txt")))

if __name__ == "__main__":
    optimize(
        fn=tokenize,
        inputs=train_files,
        output_dir="temp",
        num_workers=1,
        chunk_bytes="50MB",
    )
on the hyperplane1 machine. (It works fine on Studios.) Other users also reported the issue on Linux. So weird that some machines have that issue and others don't. One thing to note is that the hyperplane1 machine has faster processors than most cloud machines, so maybe it's some race condition.
Can we please add a test for this case?
Added one, but it's just a copy-paste of the script. It may need to be refined.
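The shape of such a regression test might look like the sketch below. To keep the sketch self-contained, `toy_optimize` is a hypothetical stand-in (not litdata's API) that mimics the write/upload/remove steps; the real test would call `litdata.optimize` with the reproduction script above and make the same final assertion.

```python
import glob
import os
import shutil
import tempfile


def toy_optimize(inputs, output_dir, cache_dir):
    # Hypothetical stand-in: write a chunk to the cache, "upload" (move) it to
    # output_dir, leaving no cached chunk files behind -- the behavior the PR
    # is meant to guarantee.
    os.makedirs(output_dir, exist_ok=True)
    os.makedirs(cache_dir, exist_ok=True)
    for i, text in enumerate(inputs):
        chunk = os.path.join(cache_dir, f"chunk-0-{i}.bin")
        with open(chunk, "wb") as f:
            f.write(text.encode())
        shutil.move(chunk, os.path.join(output_dir, f"chunk-0-{i}.bin"))


with tempfile.TemporaryDirectory() as root:
    out, cache = os.path.join(root, "temp"), os.path.join(root, "cache")
    toy_optimize(["hello world", "foo bar"], out, cache)
    leftover = glob.glob(os.path.join(cache, "chunk-*.bin"))
    # Mirrors the RuntimeError raised in data_processor._done:
    assert leftover == [], f"All the chunks should have been deleted. Found {leftover}"
print("test passed: no leftover chunks")
```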
@deependujha I will try it again after manually deleting the tmp/chunks
folder (currently asking a colleague to do that, because I don't have sudo access to remove it on that machine). I am hopeful that this is maybe just a leftover and it works then.
Hi @deependujha! Sorry for the late follow-up, I was at a conference and also just had a colleague delete the cache (since I don't have sudo access). I am happy to report that the issue is resolved now. Thanks so much for fixing this! I recommend merging the PR now.
Another user reported that it fixed the issue. Thanks so much again for this @deependujha !
@tchaton @Borda could you help merging this some time? Would be nice to have it in the main branch and upcoming release.
Great to hear the issue is resolved! Thank you so much, @rasbt, for the update; I'll go ahead and merge it into the main branch now.
And once again, huge thanks to @deependujha for the awesome work!
cc: @tchaton @Borda
Thank you, Sebastian! Glad to hear the issue is resolved. Appreciate your feedback and support.
Before submitting
- [ ] Was this discussed/agreed via a GitHub issue? (no need for typos and docs improvements)
- [ ] Did you read the [contributor guideline](https://github.com/Lightning-AI/lit-data/blob/main/.github/CONTRIBUTING.md), Pull Request section?
- [ ] Did you make sure to update the docs?
- [ ] Did you write any new necessary tests?

What does this PR do?
Fixes #245 (already works, but has a similar issue); fixes #367.
PR review
Anyone in the community is free to review the PR once the tests have passed. If we didn't discuss your PR in GitHub issues there's a high chance it will not be merged.
Did you have fun?
Make sure you had fun coding!