ngxson / wllama

WebAssembly binding for llama.cpp - Enabling in-browser LLM inference
https://huggingface.co/spaces/ngxson/wllama
MIT License
441 stars 21 forks

Add delete method to cacheManager #79

Closed flatsiedatsie closed 4 months ago

flatsiedatsie commented 4 months ago

Allows for the deletion of a specific LLM from the cache, based on a partial file name.
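
A rough usage sketch (the call below is illustrative of the intent, not the final signature; the file name is a placeholder):

// Illustrative only: remove any cached file whose name contains this fragment
await cacheManager.delete('my_model.gguf');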

felladrin commented 4 months ago

Good addition!


Any thoughts on a way to delete all the cache files, in case we want/need to debug something or free up space? (Maybe if the hint is undefined, it deletes all files from the cache?) [EDIT: Just realized that we already have the .clear() method for deleting all files at once.]
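
For reference, clearing everything at once would then just be (assuming the same calling style as the list() examples below):

// Removes every file from the cache in one call
await CacheManager.clear();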

ngxson commented 4 months ago

I'd suggest a function that accepts a predicate function as input (a bit like Array.prototype.filter):

  /**
   * Delete files from the cache.
   * @param predicate Return true for each cache entry that should be removed.
   */
  async delete(predicate: (entry: CacheEntry) => boolean): Promise<void> {
    const cacheDir = await getCacheDir();
    const list = await CacheManager.list();
    for (const item of list) {
      if (predicate(item)) {
        // removeEntry() is asynchronous; await it so errors surface here
        await cacheDir.removeEntry(item.name);
      }
    }
  },
flatsiedatsie commented 4 months ago

I've changed it to the suggested code, thank you.

How would I use it with a function? I'm guessing something like this?

await window.llama_cpp_app.cacheManager.delete((cache_item) => {
  // Mark for deletion any entry whose name contains the partial URL
  return typeof cache_item.name === 'string' && cache_item.name.includes(partial_url);
});
ngxson commented 4 months ago

On second thought, I think having a function like this will make more sense:

await CacheManager.delete("https://huggingface.co/.../my_model.gguf");

// or
const items = await CacheManager.list();
await CacheManager.delete(items[0].name);

// or
await CacheManager.deleteMany((item) => item.name === items[0].name);
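
For the original use case in this PR (matching on a partial file name), the predicate variant covers it, e.g. (assuming the entry's name contains the cached URL, with partial_url being the fragment to match):

// Partial-name matching via the predicate variant
await CacheManager.deleteMany((item) => item.name.includes(partial_url));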

I'm replacing this PR with https://github.com/ngxson/wllama/pull/80, is that OK for you, @flatsiedatsie?

flatsiedatsie commented 4 months ago

Of course, all fine!