Closed fisher1020 closed 4 years ago
Is this the same project in question? https://github.com/MegEngine/MegEngine
If so, are you affiliated with that project? I'm not seeing any public members at https://github.com/orgs/MegEngine/people.
Yes, I'm a member of MegEngine. MegEngine was open-sourced on March 25th. You're welcome to visit the website: https://megengine.org.cn/
Can you make your membership in the MegEngine organization public so I can see it at https://github.com/orgs/MegEngine/people?
Can you also give us some more details about why this is so large? At 850MB this would be one of the largest releases on PyPI. How many releases are you planning to make? How often are you planning to make new releases?
My membership is public; please see https://github.com/orgs/MegEngine/people. We have statically linked CUDA / cuDNN, etc., and we have a very large number of kernel implementations, which makes this package particularly large. We want our library to be usable alongside torch; if it were not statically linked, various symbol conflicts would occur, which would make things very difficult for users. We plan to make a release about every two months.
Hi @di, could you please increase the limit? We want to provide users with a better installation experience, and are blocked by this issue.
Hi @di
Any update on this issue? Many users are complaining about the inconvenience.
Is there any way to host these large files on our own CDN while users can still use pip install megengine?
Hi folks, please bear with us. PyPI is an almost entirely volunteer-managed project, and we have limited time. Repeatedly asking us for updates is not going to make this happen sooner.
In addition, I haven't gotten all of my questions answered here. 850MB is a huge package and we need to make sure this is worth the added cost it will incur.
I asked:
How many releases are you planning to make?
This means, for each version, how many 850MB artifacts are you planning to upload? Looking at https://megengine.org.cn/whl/mge.html, it looks like the answer here is currently "four", are you planning to increase this?
Is there any way to host these large files on our own CDN while users can still use pip install megengine?
This is not currently possible; the best option is what you're already doing:
pip3 install megengine -f https://megengine.org.cn/whl/mge.html
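For context, the -f / --find-links option works because pip treats the given page as a flat list of links to downloadable files and matches them by filename. A minimal find-links page might look like this (the filename below is a hypothetical example, not an actual MegEngine artifact):

```html
<!DOCTYPE html>
<html>
  <body>
    <!-- pip scans the anchor tags on this page; each href must point to a
         file with a valid wheel or sdist filename. Hypothetical example: -->
    <a href="MegEngine-0.3.0-cp37-cp37m-manylinux1_x86_64.whl">
      MegEngine-0.3.0-cp37-cp37m-manylinux1_x86_64.whl
    </a>
  </body>
</html>
```

Hosting such a page on a CDN is exactly what https://megengine.org.cn/whl/mge.html already does.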
@di Thanks for the response. It's really kind of you to look at our website to extract the information.
for each version, how many 850MB artifacts are you planning to upload?
We are going to release 4 artifacts for each version, and we are not going to increase this number this year. If we provide more, we will host the extra artifacts on our own server. (We may provide a CPU version as a separate package.)
How many releases are you planning to make?
We are going to shorten the release cycle to 6 weeks, so there will be about 8 to 9 releases per year.
If the cost becomes too high, we could also move some legacy releases to our own server in the future (e.g. keeping only the last 5 releases on PyPI).
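Putting the figures above together gives a rough sense of the storage involved (a back-of-envelope sketch, not an official PyPI figure):

```python
# Rough estimate from the figures discussed above:
# 4 artifacts per release, ~850 MB (~0.85 GB) each, ~9 releases per year.
artifacts_per_release = 4
artifact_size_gb = 0.85
releases_per_year = 9

annual_gb = artifacts_per_release * artifact_size_gb * releases_per_year
print(f"~{annual_gb:.1f} GB of new artifacts per year")  # ~30.6 GB
```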
Hi @di, any update on this issue? Your help is most appreciated.
I've set the upload limit for megengine to 850 MB on PyPI. Please be mindful of how frequently you make releases with such a high limit -- each release will have a significant impact on how much traffic PyPI has to serve.
This project does not exist on TestPyPI and so I cannot raise the limit there. If you need that limit raised, please create the project first and let us know.
Project: megengine: https://pypi.org/project/MegEngine/
Size of release: 850 MB
Which indexes: both
Reasons for the request: MegEngine is a deep learning framework; the installation package contains many computing libraries and dependencies, needed to release binaries with more compute capabilities. Your help is most appreciated.