/**
 * Sets mlock to force the system to keep the model in RAM.
 * @param model A pointer to the llmodel_model instance.
 * @param use_mlock true to keep the model in RAM, false otherwise.
*/
void llmodel_setMlock(llmodel_model model, bool use_mlock);
Setting `use_mlock=true` noticeably speeds things up, at least on my machine, so I find it useful. Currently the `use_mlock` parameter sits behind the private pointer `d_ptr->params.use_mlock` in `llamamodel.cpp`. This function makes it possible to change `use_mlock` directly from the main program.

Another way would be to add getters and setters for `d_ptr`, but I assume it is private for a reason.

The `use_mlock` feature currently exists only in the llamamodel params struct, so there is no corresponding implementation in `gptj.cpp` for GPTJ.
Title: Adds setMlock function to llmodel_c.h