Open win10ogod opened 5 days ago
Just give the PHP code to any LLM, like claude.ai or OpenAI, and it will convert it from PHP to Python.
I have attached the converted Python script for you.
@win10ogod I created a version that runs with local LLM via OpenAI compatible API (in my case LM Studio) and outputs responses locally to the console instead of through a web application if you'd like it.
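For anyone curious how such a setup tends to look, here is a minimal sketch of talking to a local OpenAI-compatible server from a console script. The port 1234 is LM Studio's default, and the model name and helper functions are illustrative stand-ins, not the actual script:

```python
import json
import urllib.request


def build_chat_request(base_url, model, prompt):
    """Build the URL and payload for an OpenAI-compatible chat completions endpoint."""
    url = f"{base_url}/v1/chat/completions"
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, payload


def ask(base_url, model, prompt):
    """Send the request to the local server and return the assistant's reply."""
    url, payload = build_chat_request(base_url, model, prompt)
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]


if __name__ == "__main__":
    # Assumes LM Studio is serving on its default local port.
    print(ask("http://localhost:1234", "local-model", "Hello!"))
```

Because the server speaks the OpenAI wire format, the same code works against any compatible backend by changing `base_url`.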
I would be delighted to give it a try if you would be so kind.
Forked and attributed this repo, PyThoughtChain created. Let me know what you think!
I am putting your link on the main repo for people who want the Python version. I made a very nice update to the PHP web-side view: it shows the thought process similar to o1. You might want to give it a try; I will update the demo in 10 minutes.
Thanks @antibitcoin! Your foundation was great. I'll look at what you have there to see if there are ways to integrate it with the Python version. Your PHP version looks really nice aesthetically. I played around with a GUI for Python, but you have me beat there since mine is all console output right now!
I have added a longer chain of thought: the LLM decides how many steps to take, then executes all of them, then says DONE, which finishes the chain of thought. Check the latest update; you can control how many steps by playing with the prompt and variables. You might want to update your code, since this one gave me o1-mini-level reasoning results using only gpt-4o-mini.
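The DONE-terminated loop described above could be sketched roughly like this. The `call_llm` callable, the prompt wording, and the step cap are hypothetical stand-ins, not the actual repo code:

```python
def run_chain_of_thought(call_llm, question, max_steps=10):
    """Ask the model to plan its own steps, execute them, and stop at DONE.

    `call_llm` is any callable that takes a prompt string and returns the
    model's reply as a string (e.g. a wrapper around an OpenAI-compatible API).
    """
    history = [
        f"Question: {question}",
        "Decide how many reasoning steps you need, then work through them "
        "one per reply. Say DONE when the chain of thought is finished.",
    ]
    steps = []
    for _ in range(max_steps):  # hard cap so a chatty model can't loop forever
        reply = call_llm("\n".join(history)).strip()
        if reply.upper().startswith("DONE"):
            break
        steps.append(reply)
        history.append(reply)  # feed each step back so the chain builds on itself
    return steps
```

The `max_steps` cap is the knob the comment mentions: tightening it (or the prompt wording) controls how long the chain is allowed to run.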
I will try to release a Python backend with the same web gui.
Fantastic, sounds promising! I've only been testing it with Llama3.1-8B to work through the logic, and I need to build in token-usage safeguards so that, if anyone is using it with external APIs, it will throttle. The CoT functionality automatically resolves within 5 chains in my code by default and asks for feedback along the way, which differs a bit from your PHP code, but it can be modified via a config module I built.
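As an illustration of the kind of safeguard mentioned here, a config with a token budget could look something like this. The `CoTConfig` and `charge` names and the budget number are hypothetical; only the 5-chain default comes from the comment above:

```python
from dataclasses import dataclass, field


@dataclass
class CoTConfig:
    max_chains: int = 5          # chains resolve within 5 by default
    token_budget: int = 50_000   # assumed cap to throttle external API usage
    used_tokens: int = field(default=0)

    def charge(self, tokens):
        """Record token usage; return False once the budget would be exceeded."""
        if self.used_tokens + tokens > self.token_budget:
            return False
        self.used_tokens += tokens
        return True
```

The caller checks `charge()` before each API request and stops (or prompts the user) when it returns False, so a runaway chain can't burn through a paid API quota.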
I have added all the ports to the main project page; feel free to submit more if you create any, and I will link them.
Can you do it in Python? I don't know much about PHP.