pypi / support

Issue tracker for support requests related to using https://pypi.org

Project Limit Request: my-unique-datasets-library - 5 GB #4521

Open ravigithubshankar opened 3 months ago

ravigithubshankar commented 3 months ago

Project URL

https://pypi.org/project/my-unique-datasets-library/

Does this project already exist?

Yes

New limit

5 GB

Which indexes

PyPI

About the project

This project provides synthetic datasets that users can download with our library to train machine learning and deep learning models. Users no longer need to depend on external sites, which may require paid premium plans; our package solves that problem by shipping the synthetic datasets directly. For that reason, we are requesting that the total size limit be extended to 5 GB. The project is expected to remain active for about one year.

How large is each release?

We bundle image datasets and classification datasets into our project, and each upload consumes almost 326 MB, which is why we are hitting the size limit. In versions 0.9.4 and 0.9.5 we uploaded new image datasets (for example, COVID-19 and brain tumor datasets), so we are requesting that the size limit be extended.

How frequently do you make a release?

Roughly every four to seven days

Code of Conduct

I agree to follow the PSF Code of Conduct

cmaureir commented 3 months ago

Hello @ravigithubshankar :wave: For handling external data, you could look into different approaches for getting it into installations, like how NLTK does: https://www.nltk.org/install.html#installing-nltk-data. Have you tried that?
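
For a concrete picture, the nltk wheel on PyPI ships only code; its corpora and models are fetched on demand into a local data directory with the real `nltk.download` helper (the resource name below is just an example):

```python
import nltk

# The wheel itself stays small; large corpora and models live outside
# PyPI and are downloaded into a local data directory on first use.
nltk.download("punkt")  # e.g. the Punkt sentence tokenizer models
```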

Let me know.

ravigithubshankar commented 3 months ago

Yes, of course, but we are providing high-resolution synthetic image datasets, so the package requires significant storage, from MBs up to GBs. We are already trying to compress GBs of datasets into MB-sized files. If you could extend the storage to 5 GB, it would be very helpful to us, and also to the many researchers, scholars, and students who could benefit from this kind of package.

cmaureir commented 3 months ago

I understand the use case; that's why I strongly recommend implementing an approach similar to NLTK's. Having a 5 GB wheel can make things very problematic, because you will reach the project limit very fast (in 1-2 releases). Having a mechanism within your package to download the high-resolution synthetic images can be a great approach, and it can even allow you to download different versions of that data.
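
A minimal sketch of what such a download mechanism could look like, assuming the archives are hosted outside PyPI; the URL, cache path, and function below are hypothetical and only illustrate the shape of the approach:

```python
import hashlib
import urllib.request
from pathlib import Path

# Hypothetical data host; in practice this could be GitHub Releases,
# Zenodo, an object store, or any server without per-release limits.
DATA_URL = "https://example.com/my-unique-datasets-library/{name}.tar.gz"
CACHE_DIR = Path.home() / ".cache" / "my_unique_datasets_library"

def download_dataset(name: str, sha256: str) -> Path:
    """Fetch a dataset archive on first use and cache it locally."""
    CACHE_DIR.mkdir(parents=True, exist_ok=True)
    archive = CACHE_DIR / f"{name}.tar.gz"
    if not archive.exists():
        urllib.request.urlretrieve(DATA_URL.format(name=name), archive)
    # Verify integrity so a corrupted or tampered download fails loudly.
    digest = hashlib.sha256(archive.read_bytes()).hexdigest()
    if digest != sha256:
        archive.unlink()
        raise ValueError(f"Checksum mismatch for {name}")
    return archive
```

This way each release wheel stays tiny, and the multi-GB image data can be versioned and hosted separately from the package itself.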

In most cases, we don't approve project limit increases for packages that bundle data files, which is why I'm recommending that approach.