Open shiloh92 opened 1 year ago
Hey @shiloh92!
This sounds great! We are currently focused on shipping Aim v4, which provides an interface for building modular metadata applications - defining metadata types and building custom dashboards with Python. Afterwards, we are going to focus on adding capabilities for tracing and managing AI systems on top of the Aim v4 groundwork.
Meanwhile, would you like to try out an early alpha version?
Also, we just started tuning Aim for AI agent tracing. I would be super happy to jump on a quick call to learn more about your use cases and share our vision/direction.
🚀 Feature
To counter the black-box nature of AutoGPT-style agents and other directed components, it is highly desirable to have a visual bird's-eye view of their real-time activity and planning tasks.
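As a sketch of the kind of data such a view would need, an agent could emit structured trace events that a dashboard streams into a tree or mind-map rendering. Everything here (the event schema, the `TraceLogger` class, the field names) is hypothetical and not part of any existing Aim or AutoGPT API:

```python
import json
import time
from dataclasses import dataclass, field, asdict
from typing import Optional

@dataclass
class TraceEvent:
    # Hypothetical schema for one unit of agent activity.
    agent: str                      # which agent produced the event
    kind: str                       # e.g. "plan", "action", "observation"
    detail: str                     # human-readable description
    parent_id: Optional[int] = None # links sub-tasks to the plan that spawned them
    timestamp: float = field(default_factory=time.time)

class TraceLogger:
    """Collects trace events; a UI could poll to_json() to draw a live map."""

    def __init__(self) -> None:
        self.events: list[TraceEvent] = []

    def log(self, agent: str, kind: str, detail: str,
            parent_id: Optional[int] = None) -> int:
        self.events.append(TraceEvent(agent, kind, detail, parent_id))
        return len(self.events) - 1  # event id, reusable as a parent for children

    def to_json(self) -> str:
        return json.dumps([asdict(e) for e in self.events], indent=2)

# Example: one plan node with an action hanging off it.
log = TraceLogger()
root = log.log("researcher", "plan", "Summarize recent AI-agent papers")
log.log("researcher", "action", "Search arXiv for 'autonomous agents'", parent_id=root)
print(log.to_json())
```

Because each event carries a `parent_id`, the front end can reconstruct the plan/sub-task hierarchy without the agent knowing anything about the visualization.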
Motivation
Improve the utility and feedback loop of agents while ensuring robust controls for the end user. The more granular the control, the better, up to a point. A visual view that provides insights is a powerful aid for most users.
Additional context
This tweet shows a very simplified view of what an agent created; this could be an "ELI5 view": https://twitter.com/i/status/1647741551534125057
Another view could be more like a traditional mind map with just a few buttons to allow "directing" or "supervising" the agents: https://dribbble.com/shots/16089933-Mind-Palace-App-Map
Suggested buttons for a robust layer of end-user control: