ianarawjo opened this issue 1 year ago
I came here to ask about a similar request — specifically, whether it's possible to use a .cforge flow document from within code.
I guess what I was looking for was:
```python
import chainforge as cf

forge = cf.load('example.cforge')
results = forge.process('mydata.csv')
results['prompt1']
# outputs a dictionary of results, one per line of the input
```
It would also be really cool if you could manipulate the graph from python, e.g.:
```python
forge.prompts[1].models
# prints a list of the current model slugs, e.g. 'ollama'
forge.prompts[1].models.append('gpt4')  # set this prompt to also use gpt4
results2 = forge.process('mydata.csv')
# now includes gpt4 results
```
Would this be easy to implement, given the current server structure? I guess if the async LLM calling needs a long-running process, then maybe this could be implemented as an API on the CF server that a Python client (or a client in any other language) could call, e.g.:
```python
forge = cf.forge('http://localhost:8000')  # connect to a running ChainForge server
forge.open('example.cforge')
results = forge.process('mydata.csv')
```
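For what it's worth, a thin client along these lines could probably be built over plain HTTP once the server exposes such an API. Here is a minimal sketch, assuming hypothetical `/api/open` and `/api/process` endpoints (none of these exist today; the endpoint names and payloads are made up purely for illustration):

```python
import requests

class ForgeClient:
    """Hypothetical thin wrapper around a running ChainForge server."""

    def __init__(self, base_url: str):
        self.base_url = base_url.rstrip('/')

    def open(self, flow_path: str) -> None:
        # Ask the server to load a .cforge flow (hypothetical endpoint)
        r = requests.post(f"{self.base_url}/api/open", json={"flow": flow_path})
        r.raise_for_status()

    def process(self, data_path: str) -> dict:
        # Run the loaded flow over a CSV of inputs; results keyed by prompt node
        r = requests.post(f"{self.base_url}/api/process", json={"data": data_path})
        r.raise_for_status()
        return r.json()

# Usage mirroring the snippet above:
# forge = ForgeClient('http://localhost:8000')
# forge.open('example.cforge')
# results = forge.process('mydata.csv')
```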
Users should be able to export the prompt API call data from a prompt node as Python code, for use in other programs. (Either add it under the … button as "Export API calls to Python code", or add it to the right-click context menu.) This would download a self-contained .py file that does all required imports and feeds all prompts through the added LLMs with their settings.
Doesn’t support Dalai.
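As a rough illustration of what such an exported file could contain (the exact structure is up for design; the prompts, model list, and settings below are placeholders, and only OpenAI-style calls are shown):

```python
# Hypothetical auto-generated export from a ChainForge prompt node.
# Prompts, models, and settings below are illustrative placeholders.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

PROMPTS = [
    "Summarize the following text: {input}",
]

MODELS = [
    {"model": "gpt-4", "temperature": 1.0},
]

def run_all(inputs):
    """Feed every prompt/input pair through each configured model."""
    results = []
    for settings in MODELS:
        for template in PROMPTS:
            for text in inputs:
                resp = client.chat.completions.create(
                    model=settings["model"],
                    temperature=settings["temperature"],
                    messages=[{"role": "user", "content": template.format(input=text)}],
                )
                results.append({
                    "model": settings["model"],
                    "prompt": template,
                    "input": text,
                    "response": resp.choices[0].message.content,
                })
    return results

if __name__ == "__main__":
    print(run_all(["Example input text."]))
```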