gnosis / prediction-market-agent

GNU Lesser General Public License v3.0

Allow submission of tasks to the queue #375

Closed kongzii closed 1 month ago

kongzii commented 1 month ago

Sub-issue of https://github.com/gnosis/prediction-market-agent/issues/314; see that issue for details.

gabrielfior commented 1 month ago

@kongzii for a simple MVP, I was thinking this could be done via a centralized queue - for example, using GCP's Pub/Sub. I created some example files (see https://gist.github.com/gabrielfior/db7b3f1a6c0f0cce3b44e510214ffda7) in case we want to use GCP's Pub/Sub - I would not be against other solutions, like RabbitMQ or similar.

Regarding the long-term view, I assume this part should eventually live on-chain, for composability/transparency reasons - but that is probably too complex for an MVP.

Let me know what you guys think.

kongzii commented 1 month ago

If you want to try out GCP's Pub/Sub and don't mind that it will get deleted in the end, then go for it, no problem with that. Otherwise, a simple Postgres table would work fine here.

gabrielfior commented 1 month ago

Another question @kongzii - where should this functionality be implemented? I'm thinking of PMAT, with some classes like FetchTask - what do you think?

I see that you already have a skeleton for this in PMA, but I think this task consumption should be made available to anyone using PMAT (for example, prediction-prophet uses PMAT but not PMA - link).

kongzii commented 1 month ago

Yep, that makes sense. FetchTask in PMAT, and GetTasks in PMA would simply be a wrapper around it, making it available to the general agent.

The only thing to keep in mind is that currently, it can't be available to everyone (until we are sure it's secure enough to run arbitrary code), so it will be either behind the API key for pub-sub/rabbit/Postgres or behind whitelisted addresses.

No preference on my side whether to implement it right away in PMAT, or just PMA for now.

gabrielfior commented 1 month ago

> The only thing to keep in mind is that currently, it can't be available to everyone (until we are sure it's secure enough to run arbitrary code), so it will be either behind the API key for pub-sub/rabbit/Postgres or behind whitelisted addresses.

It's behind our GCP credentials, so only project members can push/subscribe.

gabrielfior commented 1 month ago

Infrastructure for this was created as follows:

- Topic `ai-agents-topic` was created on GCP
- Subscription `ai-agents-subscription` was created on GCP
- Examples for publishing/consuming messages in Python were written: https://gist.github.com/gabrielfior/db7b3f1a6c0f0cce3b44e510214ffda7
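For reference, a hedged sketch of publishing a task to the `ai-agents-topic` above using the `google-cloud-pubsub` client. The project id and the task payload shape are assumptions; see the linked gist for the actual example files. Pub/Sub message data must be bytes, so the task dict is JSON-encoded first.

```python
import json

def encode_task(task: dict) -> bytes:
    """JSON-encode a task dict into Pub/Sub message bytes."""
    return json.dumps(task).encode("utf-8")

def decode_task(data: bytes) -> dict:
    """Decode Pub/Sub message bytes back into a task dict."""
    return json.loads(data.decode("utf-8"))

def publish_task(project_id: str, task: dict) -> str:
    """Publish a task to ai-agents-topic; returns the server-assigned
    message id. Requires `pip install google-cloud-pubsub` and GCP
    credentials for the project (only project members can publish)."""
    from google.cloud import pubsub_v1

    publisher = pubsub_v1.PublisherClient()
    topic_path = publisher.topic_path(project_id, "ai-agents-topic")
    future = publisher.publish(topic_path, data=encode_task(task))
    return future.result()  # blocks until the publish is acknowledged

# Example (project id is hypothetical):
# publish_task("my-gcp-project", {"market": "example-market", "action": "predict"})
```

A consumer would symmetrically call `decode_task(message.data)` in its subscriber callback on `ai-agents-subscription`.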

Follow-up tickets can be found under this label