Melirius opened this issue 2 months ago
OK, now it starts, but why???
I have the same issue, but mine won't restart at all; I just get the errors below. I am using Debian, though:
Error: NSFW was unable to start watching that directory. LM Studio is taking longer than expected to start ...
I was using this successfully when I first downloaded it, but now 0.3.0 and 0.3.2 give me the error above. 0.2.3 still seems to work fine, though. I tried deleting ~/.cache/lm-studio and "~/.config/LM Studio" and then running the AppImage again, but I get the same issue.
System specs: Debian 12, 12th Gen i7-1260P (16 cores), 64 GB RAM, integrated Intel Xe graphics (uses system RAM), LM Studio 0.3.2 and 0.3.0
./LM_Studio-0.3.2.AppImage
16:55:24.357 › App starting...
16:55:24.894 › Error while surveying hardware with backend 'llama.cpp-linux-x86_64-nvidia-cuda-avx2', version '1.1.7': LMSCore load lib failed - child process with PID 3333716 exited with code 1
[CachedFileDataProvider] Watching file at /home/dakka/.cache/lm-studio/.internal/backend-preferences-v1.json
[CachedFileDataProvider] Watching file at /home/dakka/.config/LM Studio/settings.json
[HttpServerProvider][SettingsFile] Initializing FileData
[FileData] Initializing FileData
[SystemResourcesProvider] Hardware survey successfully achieved through bundled 'vulkan' liblmstudio.
[CachedFileDataProvider] Watching file at /home/dakka/.cache/lm-studio/.internal/model-data.json
[LLMExternalAPIProvider] Creating HTTP server extender
[LLMExternalAPIProvider] Registering IPC server
[EmbeddingExternalAPIProvider] Creating HTTP server extender
[EmbeddingExternalAPIProvider] Registering IPC server
[PlatformExternalAPIProvider] Creating HTTP server extender
[PlatformExternalAPIProvider] Registering IPC server
[SystemExternalAPIProvider] Creating HTTP server extender
[SystemExternalAPIProvider] Registering IPC server
[DiagnosticsExternalAPIProvider] Creating HTTP server extender
[DiagnosticsExternalAPIProvider] Registering IPC server
[LMSInternal][Client=LM Studio] Client created.
Client with id 'LM Studio' registered.
[ConversationsProvider][Indexer] Creating index for path /home/dakka/.cache/lm-studio/conversations
[ConversationsProvider][Indexer] Creating index for path /home/dakka/.cache/lm-studio/conversations/1724380570865.conversation.json
[ConversationsProvider][Indexer] Creating index for path /home/dakka/.cache/lm-studio/conversations/1725166925481.conversation.json
[ConversationsProvider][Indexer] Creating index for path /home/dakka/.cache/lm-studio/conversations/1726031012223.conversation.json
[ConversationsProvider][ConfigFile] Initializing FileData
16:55:25.691 › [AppUpdater] Checking for updates... (current state: idle)
16:55:25.691 › AppUpdater state changed to checking-for-updates-periodic
16:55:25.692 › [AppUpdater] Fetching version info from https://versions-prod.lmstudio.ai/linux/x86/0.3.2
16:55:25.995 › [AppUpdater] Version fetch url: https://versions-prod.lmstudio.ai/linux/x86/0.3.2
16:55:25.998 › [AppUpdater] Version info response: {
os: 'linux',
arch: 'x86',
version: '0.3.0',
releaseNotes: 'No release notes found.'
}
16:55:26.001 › [AppUpdater] Current version: 0.3.2, latest version: 0.3.0
16:55:26.001 › No update available: 0.3.0 <= 0.3.2
16:55:26.001 › AppUpdater state changed to idle
[NotepadMinusMinusExternalAPIProvider] Registering IPC server
[DeepLinkHandlingExternalAPIProvider] Registering IPC server
[AppSettingsInternalAPIProvider] Registering IPC server
[ModelIndexInternalAPIProvider] Registering IPC server
[ContextMenuInternalAPIProvider] Registering IPC server
[ConversationsInternalAPIProvider] Registering IPC server
[HttpServerInternalAPIProvider] Registering IPC server
[SearchInternalAPIProvider] Registering IPC server
[ModelLoadingInternalAPIProvider] Registering IPC server
[LLMInstanceStateInternalAPIProvider] Registering IPC server
[TransientStorageInternalAPIProvider] Registering IPC server
[RuntimeIndexInternalAPIProvider] Registering IPC server
[DownloadsAPIProvider] Registering IPC server
[UserFilesInternalAPIProvider] Registering IPC server
[SystemResourcesInternalAPIProvider] Registering IPC server
[SoftwareUpdateInternalAPIProvider] Registering IPC server
[BackendDownloadInternalAPIProvider] Registering IPC server
[PredictionProcessInternalAPIProvider] Registering IPC server
[UserModelDefaultConfigInternalAPIProvider] Registering IPC server
[VirtualModelInternalAPIProvider] Registering IPC server
[PathOpenerInternalAPIProvider] Registering IPC server
[ModelDataInternalAPIProvider] Registering IPC server
[FileWatchingProvider] Start syncing file watching provider... Changes may be missed during this process.
[FileWatchingProvider][Watcher-0] Sync: Subscribing to /home/dakka/.cache/lm-studio/.internal
[FileWatchingProvider][Watcher-1] Sync: Subscribing to /home/dakka/.config/LM Studio
[FileWatchingProvider][Watcher-2] Sync: Subscribing to /home/dakka/.cache/lm-studio/extensions/backends
Same issue with 0.3.4. Below is my log:
[2024-10-12 10:37:31.229] [info] App starting...
[2024-10-12 10:37:36.081] [info] App starting...
[2024-10-12 10:37:36.752] [warn] Failed to survey hardware with backend 'llama.cpp-linux-x86_64-nvidia-cuda-avx2', version '1.1.11': LMSCore load lib failed - child process with PID 481920 exited with code 1
[2024-10-12 10:37:37.104] [info] Hardware survey for general system resources through 'vulkan' took 114.19ms
[2024-10-12 10:37:37.228] [info] [AppUpdater] Checking for updates... (current state: idle)
[2024-10-12 10:37:37.228] [info] AppUpdater state changed to checking-for-updates-periodic
[2024-10-12 10:37:37.229] [info] [AppUpdater] Fetching version info from https://versions-prod.lmstudio.ai/linux/x86/0.3.4
[2024-10-12 10:37:41.016] [info] [AppUpdater] Received version info response
[2024-10-12 10:37:41.016] [info] [AppUpdater] Current version: 0.3.4 (build: 3), new version: 0.3.0 (build: undefined)
[2024-10-12 10:37:41.017] [info] No update available: 0.3.0 (build: undefined) <= 0.3.4 (build: 3)
[2024-10-12 10:37:41.017] [info] AppUpdater state changed to idle
With 0.3.4 on Ubuntu 24.04:
[2775314:1017/000830.520085:FATAL:setuid_sandbox_host.cc(158)] The SUID sandbox helper binary was found, but is not configured correctly. Rather than run without sandboxing I'm aborting now. You need to make sure that /tmp/.mount_lmstudYChfxn/chrome-sandbox is owned by root and has mode 4755.
Trace/breakpoint trap (core dumped)
We now have an experimental .deb package which might solve these issues, available here: https://lmstudio.ai/beta-releases
Are you able to try again with it?
I downloaded the deb experimental and have the same results as before:
[2024-10-17 15:20:58.934] [warn] Failed to survey hardware with backend 'llama.cpp-linux-x86_64-nvidia-cuda-avx2', version '1.1.11': LMSCore load lib failed - child process with PID 3937710 exited with code 1
[2024-10-17 15:20:59.222] [info] Hardware survey for general system resources through 'vulkan' took 96.15ms
[2024-10-17 15:20:59.328] [info] [AppUpdater] Checking for updates... (current state: idle)
[2024-10-17 15:20:59.328] [info] AppUpdater state changed to checking-for-updates-periodic
[2024-10-17 15:20:59.328] [info] [AppUpdater] Fetching version info from https://versions-prod.lmstudio.ai/linux/x86/0.3.4
[2024-10-17 15:21:04.577] [info] [AppUpdater] Received version info response
[2024-10-17 15:21:04.577] [info] [AppUpdater] Current version: 0.3.4 (build: 7), new version: 0.3.0 (build: undefined)
[2024-10-17 15:21:04.577] [info] No update available: 0.3.0 (build: undefined) <= 0.3.4 (build: 7)
[2024-10-17 15:21:04.578] [info] AppUpdater state changed to idle
@ddakka does the app not launch after this?
I get that error popup:
I do get the little task icon but the application is not running:
Clicking on "Open LM Studio" through the taskbar menu item doesn't seem to do anything:
In case you want the running processes for the taskbar menu item (or anything that has lm-s in the name):
$ ps aux | grep lm-s
dakka 3986831 11.4 0.2 1177537396 172216 ? Sl 15:35 0:00 /usr/lib/lm-studio/lm-studio
dakka 3986834 0.1 0.0 33787100 52472 ? S 15:35 0:00 /usr/lib/lm-studio/lm-studio --type=zygote --no-zygote-sandbox
dakka 3986835 0.2 0.0 33787088 52032 ? S 15:35 0:00 /usr/lib/lm-studio/lm-studio --type=zygote
dakka 3986837 0.0 0.0 33787088 8736 ? S 15:35 0:00 /usr/lib/lm-studio/lm-studio --type=zygote
dakka 3986869 1.2 0.1 34296092 108924 ? Sl 15:35 0:00 /usr/lib/lm-studio/lm-studio --type=gpu-process --enable-crash-reporter=3a82047b-9776-47d4-9a9a-92f732f67d4a,no_channel --user-data-dir=/home/dakka/.config/LM Studio --gpu-preferences=WAAAAAAAAAAgAAAIAAAAAAAAAAAAAAAAAABgAAAAAAA4AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAACAAAAAAAAAAIAAAAAAAAAABAAAAAAAAAAgAAAAAAAAACAAAAAAAAAAIAAAAAAAAAA== --shared-files --field-trial-handle=0,i,11722574457836704228,6525295452083124665,131072 --disable-features=SpareRendererForSitePerProcess
dakka 3986872 0.3 0.0 33844736 63600 ? Sl 15:35 0:00 /usr/lib/lm-studio/lm-studio --type=utility --utility-sub-type=network.mojom.NetworkService --lang=en-GB --service-sandbox-type=none --enable-crash-reporter=3a82047b-9776-47d4-9a9a-92f732f67d4a,no_channel --user-data-dir=/home/dakka/.config/LM Studio --shared-files=v8_context_snapshot_data:100 --field-trial-handle=0,i,11722574457836704228,6525295452083124665,131072 --disable-features=SpareRendererForSitePerProcess
dakka 3986873 1.8 0.1 1177337148 89820 ? Sl 15:35 0:00 /usr/lib/lm-studio/lm-studio --type=utility --utility-sub-type=node.mojom.NodeService --lang=en-GB --service-sandbox-type=none --enable-crash-reporter=3a82047b-9776-47d4-9a9a-92f732f67d4a,no_channel --user-data-dir=/home/dakka/.config/LM Studio --shared-files=v8_context_snapshot_data:100 --field-trial-handle=0,i,11722574457836704228,6525295452083124665,131072 --disable-features=SpareRendererForSitePerProcess
dakka 3987003 1.6 0.2 1177566400 140612 ? Sl 15:35 0:00 /usr/lib/lm-studio/lm-studio --type=utility --utility-sub-type=node.mojom.NodeService --lang=en-GB --service-sandbox-type=none --enable-crash-reporter=3a82047b-9776-47d4-9a9a-92f732f67d4a,no_channel --user-data-dir=/home/dakka/.config/LM Studio --shared-files=v8_context_snapshot_data:100 --field-trial-handle=0,i,11722574457836704228,6525295452083124665,131072 --disable-features=SpareRendererForSitePerProcess
dakka 3987016 1.5 0.2 1177556940 139440 ? Sl 15:35 0:00 /usr/lib/lm-studio/lm-studio --type=utility --utility-sub-type=node.mojom.NodeService --lang=en-GB --service-sandbox-type=none --enable-crash-reporter=3a82047b-9776-47d4-9a9a-92f732f67d4a,no_channel --user-data-dir=/home/dakka/.config/LM Studio --shared-files=v8_context_snapshot_data:100 --field-trial-handle=0,i,11722574457836704228,6525295452083124665,131072 --disable-features=SpareRendererForSitePerProcess
dakka 3987028 1.0 0.1 1177337148 82600 ? Sl 15:35 0:00 /usr/lib/lm-studio/lm-studio --type=utility --utility-sub-type=node.mojom.NodeService --lang=en-GB --service-sandbox-type=none --enable-crash-reporter=3a82047b-9776-47d4-9a9a-92f732f67d4a,no_channel --user-data-dir=/home/dakka/.config/LM Studio --shared-files=v8_context_snapshot_data:100 --field-trial-handle=0,i,11722574457836704228,6525295452083124665,131072 --disable-features=SpareRendererForSitePerProcess
> We now have an experimental .deb package which might solve these issues, available here: https://lmstudio.ai/beta-releases
This installed and is running, thanks!
Just installed 3.5 and everything is working again
Just installed 3.5 and the AppImage behaves exactly as 3.4 did. Exits with a sandbox error.
Add --no-sandbox to make it run: ./LM_Studio-0.3.5-beta.AppImage --no-sandbox
Or relax the AppArmor restriction: sudo sysctl -w kernel.apparmor_restrict_unprivileged_userns=0
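For anyone hitting this on Ubuntu 23.10/24.04: the sysctl above relaxes the kernel's restriction on unprivileged user namespaces, which the Electron/Chromium sandbox inside an AppImage relies on. A quick check sketch is below; the persistence snippet is commented out because it needs root, and the file name under /etc/sysctl.d is my own choice, not an official one:

```shell
# Check whether the restriction is active (1 = restricted). The key
# only exists on kernels carrying Ubuntu's AppArmor userns patch, so
# fall back to a message elsewhere:
sysctl kernel.apparmor_restrict_unprivileged_userns 2>/dev/null \
  || echo "key not present on this kernel"

# To persist the relaxation across reboots (root required; the
# snippet file name is arbitrary):
#   echo 'kernel.apparmor_restrict_unprivileged_userns=0' | \
#     sudo tee /etc/sysctl.d/99-lmstudio-userns.conf
#   sudo sysctl --system
```

Note that --no-sandbox avoids the issue entirely but disables Chromium's process sandboxing, so the sysctl route is narrower in scope.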
@miguepollo Doesn't that relax the restriction system-wide, rather than granting the permissions to one application via its AppArmor profile? Seems a little loose.
Tested the latest 0.3.2 and 0.3.1 on Ubuntu 22.04 x64.
On the first 2-3 runs a "too many opened files" error appears; after that it is replaced by "NSFW cannot watch a directory" (see picture). After pressing OK, LM Studio appears in the tray, but clicking "Load model" does not do anything.
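The combination of "too many opened files" followed by an NSFW watcher failure often points at per-user inotify limits, which file watchers like NSFW (the library named in the error) consume one watch per directory. A quick way to inspect them is sketched below; the raise commands are left as comments since they need root, and the values shown are illustrative, not official recommendations:

```shell
# Current per-user inotify limits; low values here commonly cause
# watcher errors like "too many open files" / "unable to watch":
cat /proc/sys/fs/inotify/max_user_watches
cat /proc/sys/fs/inotify/max_user_instances

# To raise them temporarily (root required; values illustrative):
#   sudo sysctl -w fs.inotify.max_user_watches=524288
#   sudo sysctl -w fs.inotify.max_user_instances=512
```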