corca-ai / EVAL
EVAL (Elastic Versatile Agent with Langchain) will execute all your requests. Just like an eval method!
MIT License · 869 stars · 82 forks
Feature/terminal strace #10
Closed · hanchchch closed this 1 year ago

hanchchch commented 1 year ago
Current situation
If the LLM tries to run a blocking command such as
uvicorn backend:app --host 0.0.0.0 --port 7001
or
npm start
with the terminal tool, the tool hangs forever.
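For illustration only (EVAL's real terminal tool is not reproduced here), a subprocess-based sketch of a hypothetical run_terminal helper shows why this happens: the tool waits for the process to exit before returning any output, and a server process never exits.

```python
import subprocess

def run_terminal(command: str) -> str:
    """Hypothetical minimal terminal tool: run a shell command and return its output."""
    proc = subprocess.Popen(
        command,
        shell=True,
        stdout=subprocess.PIPE,
        stderr=subprocess.STDOUT,
        text=True,
    )
    # Blocks until the process exits; `npm start` or a uvicorn server never
    # does, so the agent waits on the tool result forever.
    out, _ = proc.communicate()
    return out
```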
Proposed solution
Watch the syscalls of the command (with python-ptrace); see the sketch after this list.
Set a 5-second timeout on each syscall execution.
If it times out, return immediately.
If not, reset the timer and watch for the next syscall.
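A minimal sketch of this idea, assuming python-ptrace's debugger API (PtraceDebugger, createChild, addProcess, process.syscall, waitSyscall) and a signal.alarm-based timer; run_with_syscall_timeout and its return strings are hypothetical names, not the actual implementation:

```python
import signal

from ptrace.debugger import ProcessExit, PtraceDebugger
from ptrace.debugger.child import createChild

SYSCALL_TIMEOUT = 5  # seconds allowed between consecutive syscall stops


class SyscallTimeout(Exception):
    """Raised by the SIGALRM handler when no syscall stop arrives in time."""


def _on_alarm(signum, frame):
    raise SyscallTimeout()


def run_with_syscall_timeout(argv):
    """Trace argv; return once the child makes no syscall for SYSCALL_TIMEOUT seconds."""
    signal.signal(signal.SIGALRM, _on_alarm)   # must run in the main thread
    debugger = PtraceDebugger()
    pid = createChild(argv, no_stdout=False)   # fork+exec the traced child
    process = debugger.addProcess(pid, is_attached=True)
    try:
        while True:
            process.syscall()                  # resume child until its next syscall stop
            signal.alarm(SYSCALL_TIMEOUT)      # (re)arm the per-syscall timer
            try:
                debugger.waitSyscall()         # blocks until the child hits a syscall
            except SyscallTimeout:
                # No syscall within the window: treat the command as blocking
                # (e.g. `npm start`) and hand control back to the agent.
                return "timed out: command looks long-running"
            except ProcessExit:
                return "process exited"        # short commands finish normally
            finally:
                signal.alarm(0)                # cancel the timer; the loop re-arms it
    finally:
        debugger.quit()                        # clean up the traced process
```

Called as, for example, run_with_syscall_timeout(["npm", "start"]), this returns after roughly five quiet seconds instead of hanging; whether the child process is then killed or kept running in the background is a separate design choice.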