PicoMLX / PicoMLXServer
The easiest way to run the fastest MLX-based LLMs locally
MIT License · 207 stars · 15 forks
Issues
#4: Models/servers are not persisted between launches (sejmann, opened 6 months ago, 1 comment)
#3: MLXServer doesn't appear to actually support OpenAI-style API calls as suggested in the readme (sejmann, opened 7 months ago, 7 comments)
#2: SwiftUI: Views that show @Observable ServerController and Server don't update (ronaldmannak, closed 7 months ago, 1 comment)
#1: CondaError: Run 'conda init' before 'conda activate' (M4RT1NJB, closed 7 months ago, 4 comments)