lmstudio-ai / lmstudio-bug-tracker

Bug tracking for the LM Studio desktop application

Can't load any model since 3.x upgrade #164

Open dynamiccreator opened 3 days ago

dynamiccreator commented 3 days ago

For me, no model loads anymore with any version higher than 2.x.

yagil commented 3 days ago

Please include:

Screenshots always help, as do the specific error messages.

mrge-org commented 3 days ago

I am on Kubuntu 22.04 with an Intel® Xeon® E3-1225 v3 and 32 GB RAM (Edit: the GPU is an NVIDIA Quadro M4000). I just downloaded LM_Studio-0.3.4.AppImage; it starts and downloads Llama-3.2-1B-Instruct-Q8_0-GGUF, but cannot load it and shows the error: Failed to load the model

Error loading model.

(Exit code: 134). Please check settings and try loading the model again.

Experimenting with different settings does not help.
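As an aside: on Linux, an exit code above 128 usually means the process was killed by a signal, computed as exit code minus 128. Exit code 134 therefore corresponds to signal 6 (SIGABRT), i.e. the loader process aborted. A quick way to decode such codes in a shell (a general POSIX/bash technique, not anything LM Studio specific):

```shell
# Decode a >128 exit code into its signal name.
# Exit codes in the 129-165 range encode "killed by signal (code - 128)".
code=134
sig=$((code - 128))          # 134 - 128 = 6
name=$(kill -l "$sig")       # bash's kill -l maps a signal number to its name
echo "Exit code $code => signal $sig (SIG$name)"
```

Running this prints that exit code 134 maps to SIGABRT, which typically points at an assertion failure or a failed allocation inside the model-loading backend rather than a configuration problem.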

mrge-org commented 3 days ago

I did the same, this time from a console; here is the log:

```
(base) gpetrov@LE32:~/Downloads$ ./LM_Studio-0.3.4.AppImage
20:26:48.043 › App starting...
[CachedFileDataProvider] Watching file at /home/gpetrov/.cache/lm-studio/.internal/backend-preferences-v1.json
[SettingsProvider] Reading package.json at /tmp/.mount_LM_StuK60oiU/resources/app/package.json
[CachedFileDataProvider] Watching file at /home/gpetrov/.config/LM Studio/settings.json
[HttpServerProvider][SettingsFile] Initializing FileData
[FileData] Initializing FileData
[SystemResourcesProvider] Hardware survey successfully achieved through bundled 'vulkan' liblmstudio.
20:26:50.328 › Hardware survey for general system resources through 'vulkan' took 290.01ms
[CachedFileDataProvider] Watching file at /home/gpetrov/.cache/lm-studio/.internal/model-data.json
[PresetsProvider][PresetsIndexer] Creating index for path /home/gpetrov/.cache/lm-studio/config-presets
[PresetsProvider][PresetsIndexer] Creating index for path /home/gpetrov/.cache/lm-studio/config-presets/chatml.preset.json
[PresetsProvider][PresetsIndexer] Creating index for path /home/gpetrov/.cache/lm-studio/config-presets/codellama_instruct.preset.json
[PresetsProvider][PresetsIndexer] Creating index for path /home/gpetrov/.cache/lm-studio/config-presets/cohere_command_r.preset.json
[PresetsProvider][PresetsIndexer] Creating index for path /home/gpetrov/.cache/lm-studio/config-presets/config.map.json
[PresetsProvider][PresetsIndexer] Creating index for path /home/gpetrov/.cache/lm-studio/config-presets/deepseek_coder.preset.json
[PresetsProvider][PresetsIndexer] Creating index for path /home/gpetrov/.cache/lm-studio/config-presets/falcon.preset.json
[PresetsProvider][PresetsIndexer] Creating index for path /home/gpetrov/.cache/lm-studio/config-presets/google_gemma_instruct.preset.json
[PresetsProvider][PresetsIndexer] Creating index for path /home/gpetrov/.cache/lm-studio/config-presets/ibm_granite.preset.json
[PresetsProvider][PresetsIndexer] Creating index for path /home/gpetrov/.cache/lm-studio/config-presets/llama_3_v2.preset.json
[PresetsProvider][PresetsIndexer] Creating index for path /home/gpetrov/.cache/lm-studio/config-presets/metaai_llama_2_chat.preset.json
[PresetsProvider][PresetsIndexer] Creating index for path /home/gpetrov/.cache/lm-studio/config-presets/mistral_instruct.preset.json
[PresetsProvider][PresetsIndexer] Creating index for path /home/gpetrov/.cache/lm-studio/config-presets/openchat.preset.json
[PresetsProvider][PresetsIndexer] Creating index for path /home/gpetrov/.cache/lm-studio/config-presets/phi_2.preset.json
[PresetsProvider][PresetsIndexer] Creating index for path /home/gpetrov/.cache/lm-studio/config-presets/phi_3.preset.json
[PresetsProvider][PresetsIndexer] Creating index for path /home/gpetrov/.cache/lm-studio/config-presets/phind_codellama.preset.json
[PresetsProvider][PresetsIndexer] Creating index for path /home/gpetrov/.cache/lm-studio/config-presets/starcoder_2_instruct.preset.json
[PresetsProvider][PresetsIndexer] Creating index for path /home/gpetrov/.cache/lm-studio/config-presets/vicuna_v1_5_16k.preset.json
[PresetsProvider][PresetsIndexer] Creating index for path /home/gpetrov/.cache/lm-studio/config-presets/zephyr.preset.json
[LLMExternalAPIProvider] Creating HTTP server extender
[LLMExternalAPIProvider] Registering IPC server
[EmbeddingExternalAPIProvider] Creating HTTP server extender
[EmbeddingExternalAPIProvider] Registering IPC server
[PlatformExternalAPIProvider] Creating HTTP server extender
[PlatformExternalAPIProvider] Registering IPC server
[SystemExternalAPIProvider] Creating HTTP server extender
[SystemExternalAPIProvider] Registering IPC server
[DiagnosticsExternalAPIProvider] Creating HTTP server extender
[DiagnosticsExternalAPIProvider] Registering IPC server
[RetrievalExternalAPIProvider] Creating HTTP server extender
[RetrievalExternalAPIProvider] Registering IPC server
[FilesExternalAPIProvider] Creating HTTP server extender
[FilesExternalAPIProvider] Registering IPC server
[LMSInternal][Client=LM Studio] Client created. Client with id 'LM Studio' registered.
[ConversationsProvider][Indexer] Creating index for path /home/gpetrov/.cache/lm-studio/conversations
[ConversationsProvider][Indexer] Creating index for path /home/gpetrov/.cache/lm-studio/conversations/1729182403129.conversation.json
[ConversationsProvider][Indexer] Creating index for path /home/gpetrov/.cache/lm-studio/conversations/1729182685122.conversation.json
[ConversationsProvider][ConfigFile] Initializing FileData
20:26:50.542 › [AppUpdater] Checking for updates... (current state: idle)
20:26:50.542 › AppUpdater state changed to checking-for-updates-periodic
20:26:50.543 › [AppUpdater] Fetching version info from https://versions-prod.lmstudio.ai/linux/x86/0.3.4
[DeepLinkHandlingExternalAPIProvider] Registering IPC server
[AppSettingsInternalAPIProvider] Registering IPC server
[ModelIndexInternalAPIProvider] Registering IPC server
[ContextMenuInternalAPIProvider] Registering IPC server
[ConversationsInternalAPIProvider] Registering IPC server
[HttpServerInternalAPIProvider] Registering IPC server
[SearchInternalAPIProvider] Registering IPC server
[ModelLoadingInternalAPIProvider] Registering IPC server
[LLMInstanceStateInternalAPIProvider] Registering IPC server
[TransientStorageInternalAPIProvider] Registering IPC server
[RuntimeIndexInternalAPIProvider] Registering IPC server
[DownloadsAPIProvider] Registering IPC server
[UserFilesInternalAPIProvider] Registering IPC server
[SystemResourcesInternalAPIProvider] Registering IPC server
[SoftwareUpdateInternalAPIProvider] Registering IPC server
[BackendDownloadInternalAPIProvider] Registering IPC server
[PredictionProcessInternalAPIProvider] Registering IPC server
[UserModelDefaultConfigInternalAPIProvider] Registering IPC server
[VirtualModelInternalAPIProvider] Registering IPC server
[PathOpenerInternalAPIProvider] Registering IPC server
[ModelDataInternalAPIProvider] Registering IPC server
[PresetsInternalAPIProvider] Registering IPC server
[ServerSessionInternalAPIProvider] Registering IPC server
[LMStudioClient][LLM][Generator (LM Studio Default Generator)] Register to LM Studio
[LMStudioClient][LLM][Preprocessor (legacyRagPromptPreprocessor)] Register to LM Studio
[LMSInternal][Client=LM Studio][Endpoint=registerGenerator] Registering generator with identifier: LM Studio Default Generator
[LMSInternal][Client=LM Studio][Endpoint=registerPreprocessor] Registering preprocessor with identifier: legacyRagPromptPreprocessor
[FileWatchingProvider] Start syncing file watching provider... Changes may be missed during this process.
[FileWatchingProvider][Watcher-0] Sync: Subscribing to /home/gpetrov/.cache/lm-studio/.internal
[FileWatchingProvider][Watcher-1] Sync: Subscribing to /home/gpetrov/.config/LM Studio
[FileWatchingProvider][Watcher-2] Sync: Subscribing to /home/gpetrov/.cache/lm-studio/extensions/backends
[FileWatchingProvider][Watcher-3] Sync: Subscribing to /home/gpetrov/.cache/lm-studio/user-files
[FileWatchingProvider][Watcher-4] Sync: Subscribing to /home/gpetrov/.cache/lm-studio/config-presets
[FileWatchingProvider][Watcher-5] Sync: Subscribing to /home/gpetrov/.cache/lm-studio/conversations
[FileWatchingProvider] Sync completed.
20:26:51.022 › [AppUpdater] Received version info response
[SettingsProvider] Reading package.json at /tmp/.mount_LM_StuK60oiU/resources/app/package.json
20:26:51.024 › [AppUpdater] Current version: 0.3.4 (build: 3), new version: 0.3.0 (build: undefined)
20:26:51.024 › No update available: 0.3.0 (build: undefined) <= 0.3.4 (build: 3)
20:26:51.025 › AppUpdater state changed to idle
[CachedFileDataProvider] Ignoring file change event due to recent write. Path: /home/gpetrov/.config/LM Studio/settings.json
[DelayedInitProvider] Running delayed init: ModelIndexProvider
[ModelIndexProvider] Directory added: /home/gpetrov/.cache/lm-studio/models
20:26:51.566 › Checking if LM Studio dev tools exist on the system...
[ModelIndexProvider][Op-1] Requested to index directory: /home/gpetrov/.cache/lm-studio/models
[ModelIndexProvider][Op-1] Starting indexing operation on directory: /home/gpetrov/.cache/lm-studio/models
[ModelIndexProvider] Directory added: /tmp/.mount_LM_StuK60oiU/resources/app/.webpack/main/bundled-models
[ModelIndexProvider][Op-2] Requested to index directory: /tmp/.mount_LM_StuK60oiU/resources/app/.webpack/main/bundled-models
[ModelIndexProvider][Op-2] Starting indexing operation on directory: /tmp/.mount_LM_StuK60oiU/resources/app/.webpack/main/bundled-models
[FileWatchingProvider] Start syncing file watching provider... Changes may be missed during this process.
[FileWatchingProvider][Watcher-6] Sync: Subscribing to /home/gpetrov/.cache/lm-studio/models
20:26:51.583 › LM Studio dev tools are not present, copying them over to .cache/lm-studio/bin...
[FileWatchingProvider][Watcher-7] Sync: Subscribing to /tmp/.mount_LM_StuK60oiU/resources/app/.webpack/main/bundled-models
[GGUFMetadataProvider] Returning cached GGUF metadata for /home/gpetrov/.cache/lm-studio/models/hugging-quants/Llama-3.2-1B-Instruct-Q8_0-GGUF/llama-3.2-1b-instruct-q8_0.gguf
[ModelIndexProvider][Op-1] Finished indexing operation on directory: /home/gpetrov/.cache/lm-studio/models
[FileWatchingProvider] Sync completed.
[GGUFMetadataProvider] Reading GGUF metadata for /tmp/.mount_LM_StuK60oiU/resources/app/.webpack/main/bundled-models/nomic-ai/nomic-embed-text-v1.5-GGUF/nomic-embed-text-v1.5.Q4_K_M.gguf took 87ms
[ModelIndexProvider][Op-2] Finished indexing operation on directory: /tmp/.mount_LM_StuK60oiU/resources/app/.webpack/main/bundled-models
[CachedFileDataProvider] Watching file at /home/gpetrov/.cache/lm-studio/conversations/1729182685122.conversation.json
[CachedFileDataProvider] Watching file at /home/gpetrov/.cache/lm-studio/.internal/user-concrete-model-default-config/.json
[CachedFileDataProvider] File at /home/gpetrov/.cache/lm-studio/.internal/gguf-metadata-cache.json is no longer used. Releasing...
[CachedFileDataProvider] Watching file at /home/gpetrov/.cache/lm-studio/.internal/user-concrete-model-default-config/hugging-quants/Llama-3.2-1B-Instruct-Q8_0-GGUF/llama-3.2-1b-instruct-q8_0.gguf.json
[CachedFileDataProvider] Ignoring file change event due to recent write. Path: /home/gpetrov/.cache/lm-studio/.internal/user-concrete-model-default-config/hugging-quants/Llama-3.2-1B-Instruct-Q8_0-GGUF/llama-3.2-1b-instruct-q8_0.gguf.json
[CachedFileDataProvider] Ignoring file change event due to recent write. Path: /home/gpetrov/.cache/lm-studio/.internal/user-concrete-model-default-config/hugging-quants/Llama-3.2-1B-Instruct-Q8_0-GGUF/llama-3.2-1b-instruct-q8_0.gguf.json
UNLOADINTERNAL CALLED
Error
    at f.unloadInternal (/tmp/.mount_LM_StuK60oiU/resources/app/.webpack/main/index.js:8:1145478)
    at f.unload (/tmp/.mount_LM_StuK60oiU/resources/app/.webpack/main/index.js:8:1146237)
    at /tmp/.mount_LM_StuK60oiU/resources/app/.webpack/main/index.js:8:1145251
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
20:27:04.430 › [LMSInternal][Client=LM Studio][Endpoint=loadModel] Error in channel handler: Error: Error loading model.
    at f.d (/tmp/.mount_LM_StuK60oiU/resources/app/.webpack/main/index.js:8:1144727)
    at f.emit (node:events:513:28)
    at f.onChildExit (/tmp/.mount_LM_StuK60oiU/resources/app/.webpack/main/index.js:8:1121044)
    at ForkUtilityProcess. (/tmp/.mount_LM_StuK60oiU/resources/app/.webpack/main/index.js:8:1120650)
    at ForkUtilityProcess.emit (node:events:513:28)
    at ForkUtilityProcess.a.emit (node:electron/js2c/browser_init:2:68545)
20:27:05.433 › [LMSInternal][Client=LM Studio][Endpoint=countTokens] Error in RPC handler: Error: Model is unloaded.
    at /tmp/.mount_LM_StuK60oiU/resources/app/.webpack/main/index.js:8:1142606
20:27:05.434 › [LMSInternal][Client=LM Studio][Endpoint=countTokens] Error in RPC handler: Error: Model is unloaded.
    at /tmp/.mount_LM_StuK60oiU/resources/app/.webpack/main/index.js:8:1142606
20:27:05.435 › [LMSInternal][Client=LM Studio][Endpoint=countTokens] Error in RPC handler: Error: Model is unloaded.
    at /tmp/.mount_LM_StuK60oiU/resources/app/.webpack/main/index.js:8:1142606
20:27:05.436 › [LMSInternal][Client=LM Studio][Endpoint=countTokens] Error in RPC handler: Error: Model is unloaded.
    at /tmp/.mount_LM_StuK60oiU/resources/app/.webpack/main/index.js:8:1142606
20:27:05.437 › [LMSInternal][Client=LM Studio][Endpoint=countTokens] Error in RPC handler: Error: Model is unloaded.
    at /tmp/.mount_LM_StuK60oiU/resources/app/.webpack/main/index.js:8:1142606
20:27:05.438 › Unhandled Rejection at: {} reason: Error: Model is unloaded.
    at /tmp/.mount_LM_StuK60oiU/resources/app/.webpack/main/index.js:8:1142606
20:27:05.443 › Unhandled Rejection at: {} reason: Error: Model is unloaded.
    at /tmp/.mount_LM_StuK60oiU/resources/app/.webpack/main/index.js:8:1142606
20:27:05.444 › Unhandled Rejection at: {} reason: Error: Model is unloaded.
    at /tmp/.mount_LM_StuK60oiU/resources/app/.webpack/main/index.js:8:1142606
20:27:05.444 › Unhandled Rejection at: {} reason: Error: Model is unloaded.
    at /tmp/.mount_LM_StuK60oiU/resources/app/.webpack/main/index.js:8:1142606
20:27:05.445 › Unhandled Rejection at: {} reason: Error: Model is unloaded.
    at /tmp/.mount_LM_StuK60oiU/resources/app/.webpack/main/index.js:8:1142606
20:27:05.455 › [LMSInternal][Client=LM Studio][Endpoint=countTokens] Error in RPC handler: TypeError: Cannot read properties of undefined (reading 'indexedModelRepresentation')
    at t.ModelLoadingProvider.getInstanceBySpecifierOrThrow (/tmp/.mount_LM_StuK60oiU/resources/app/.webpack/main/index.js:36:6666)
    at t.ModelLoadingProvider.getLLMModelBySpecifierOrThrow (/tmp/.mount_LM_StuK60oiU/resources/app/.webpack/main/index.js:36:8415)
    at Object.handler (/tmp/.mount_LM_StuK60oiU/resources/app/.webpack/main/index.js:58:80911)
    at t.ServerPort.receivedRpcCall (/tmp/.mount_LM_StuK60oiU/resources/app/.webpack/main/index.js:205:76)
    at a.receivedMessage (/tmp/.mount_LM_StuK60oiU/resources/app/.webpack/main/index.js:178:7825)
    at MessagePortMain. (/tmp/.mount_LM_StuK60oiU/resources/app/.webpack/main/index.js:178:6782)
    at MessagePortMain.emit (node:events:513:28)
    at MessagePortMain._internalPort.emit (node:electron/js2c/browser_init:2:99603)
20:27:05.474 › [LMSInternal][Client=LM Studio][Endpoint=countTokens] Error in RPC handler: TypeError: Cannot read properties of undefined (reading 'indexedModelRepresentation')
    at t.ModelLoadingProvider.getInstanceBySpecifierOrThrow (/tmp/.mount_LM_StuK60oiU/resources/app/.webpack/main/index.js:36:6666)
    at t.ModelLoadingProvider.getLLMModelBySpecifierOrThrow (/tmp/.mount_LM_StuK60oiU/resources/app/.webpack/main/index.js:36:8415)
    at Object.handler (/tmp/.mount_LM_StuK60oiU/resources/app/.webpack/main/index.js:58:80911)
    at t.ServerPort.receivedRpcCall (/tmp/.mount_LM_StuK60oiU/resources/app/.webpack/main/index.js:205:76)
    at a.receivedMessage (/tmp/.mount_LM_StuK60oiU/resources/app/.webpack/main/index.js:178:7825)
    at MessagePortMain. (/tmp/.mount_LM_StuK60oiU/resources/app/.webpack/main/index.js:178:6782)
    at MessagePortMain.emit (node:events:513:28)
    at MessagePortMain._internalPort.emit (node:electron/js2c/browser_init:2:99603)
[CachedFileDataProvider] File at /home/gpetrov/.cache/lm-studio/.internal/user-concrete-model-default-config/hugging-quants/Llama-3.2-1B-Instruct-Q8_0-GGUF/llama-3.2-1b-instruct-q8_0.gguf.json is no longer used. Releasing...
```

gurusura commented 2 days ago

(Screenshot attached: 2024-10-18 10-22-58.) I have been unable to use LM Studio since the 0.3.1 series of updates. I get this error whenever I load any model:

🥲 Failed to load the model

Failed to load model

unable to allocate backend buffer

My system:

```
Distributor ID: Ubuntu
Description:    Ubuntu 24.04.1 LTS
Release:        24.04
Codename:       noble
```

I tried the more recent ~1.3 GB models such as Llama 3.2 1B. How do I solve this issue? Is there any workaround? Let me know. Thanks.
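For what it's worth, "unable to allocate backend buffer" generally means the inference backend could not reserve a contiguous block of RAM or VRAM for the model weights plus its working buffers. A rough pre-load sanity check can be done from a shell; the size numbers below are hypothetical placeholders for a ~1.3 GB Q8_0 model and should be adjusted for your model and context length:

```shell
# Rough check: is there enough available RAM for a CPU-only load?
# model_bytes: approximate on-disk size of the GGUF file (assumed ~1.3 GB here).
# overhead:    headroom for the KV cache and backend buffers (rough guess).
model_bytes=$((1300 * 1024 * 1024))
overhead=$((512 * 1024 * 1024))
needed=$((model_bytes + overhead))

# MemAvailable in /proc/meminfo is reported in kB; convert to bytes.
avail=$(awk '/MemAvailable/ {print $2 * 1024}' /proc/meminfo)

if [ "$avail" -lt "$needed" ]; then
    echo "Likely too little free RAM: need ~$needed bytes, have $avail"
else
    echo "Available RAM looks sufficient for a CPU-only load ($avail bytes free)"
fi
```

If RAM is plentiful but the error persists, the allocation failure is probably on the GPU side, so forcing a CPU-only backend (or reducing GPU offload in the model's load settings) is a reasonable thing to try.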