ggerganov / ggml

Tensor library for machine learning
MIT License

[gpt-J] swap space support #22

Open joshuachris2001 opened 1 year ago

joshuachris2001 commented 1 year ago

Is it possible to have swap space support? (I heard about 'Handling big models for inference' and was wondering if ggml can support a similar feature, or store part of a large model in swap.)

ggerganov commented 1 year ago

Can you clarify what "swap space" is? Is it partially loading the weights from disk and then unloading them, etc.?

joshuachris2001 commented 1 year ago

Sorry, I meant to ask if it is possible to split a larger model. Looking at standard gpt-j usage, ggml goes from 16 to 32 GB of RAM. I have 16 GB of RAM, but I also have 32 GB of swap space that ggml does not seem to use. Swap space uses reserved disk space as virtual memory, where parts of memory get 'swapped' between disk and RAM.

biemster commented 1 year ago

As far as I know, swap space is handled by the kernel, which copies out memory pages that it deems inactive. I don't think user space applications have much control over that. And since the whole model is actively in use by ggml, swapping those pages in and out will be detrimental to performance, probably not usable at all, unless (and that's a big if) you could convince the kernel that parts of the model are currently inactive.

ggerganov commented 1 year ago

Yes, I agree with @biemster. The OS automatically decides when to use the swap space; there is nothing special that has to be done in the user code.

chatbots commented 1 year ago

For Discussion Purposes Only. Do Not Try at Home!
As an experiment, I ran gpt-j on 6 GB of RAM with a 10 GB swapfile. I wanted to run this test before physically uninstalling and reinstalling RAM chips between two computers. NOTE: while NOT recommended as an alternative, the experiment did run, very slowly, without errors.