provider does the checks, decrypts encryptedURL (from the DDO) and replies with the content read from asset_URL
so asset_URL is never exposed, because the provider acts as a proxy (it reads asset_URL and sends the content to the consumer)
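To make that proxy idea concrete, here is a minimal Python sketch of a provider-style download endpoint. This is not the actual Ocean Provider code; the route, the query parameters, and the checks_passed / decrypt_url helpers are made up for illustration.

```python
# Sketch only: the provider alone can decrypt encryptedURL, fetches the content
# server-side and relays the bytes, so the consumer never sees asset_URL.
import requests
from flask import Flask, Response, request

app = Flask(__name__)

def checks_passed(did: str, tx_id: str) -> bool:
    """Placeholder for the provider's checks (valid startOrder tx, service terms, ...)."""
    return True  # assumption: the real provider verifies the on-chain order here

def decrypt_url(encrypted_url: str) -> str:
    """Placeholder for decrypting encryptedURL taken from the DDO."""
    return "https://example.org/dataset.csv"  # hypothetical asset_URL

@app.route("/download")  # hypothetical endpoint and parameter names
def download():
    did, tx_id = request.args["did"], request.args["transferTxId"]
    if not checks_passed(did, tx_id):
        return Response("order not valid", status=403)

    asset_url = decrypt_url(request.args["encryptedUrl"])
    upstream = requests.get(asset_url, stream=True)

    # Relay the upstream bytes to the consumer; asset_url itself is never returned.
    return Response(
        upstream.iter_content(chunk_size=8192),
        content_type=upstream.headers.get("Content-Type", "application/octet-stream"),
    )

if __name__ == "__main__":
    app.run()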
how streaming should work (my personal scenario, wip)
Publish flow:
publisher takes care of the WSS streaming backend -> asset_URL (wss://...)
frontend, using ocean.js (or ocean.py), calls the provider to encrypt the URL: asset_URL -> encryptedURL
frontend, using ocean.js (or ocean.py), creates the DDO (metadata from the user input form + encryptedURL) with a "stream" service (which has a timeout period) and publishes it on chain (see the sketch after this list)
aquarius caches DDOs (by monitoring blocks on chain)
frontend queries aquarius and displays a list of assets (DDOs) (creating some URL for display purposes only)
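For illustration, here is a rough sketch of what the DDO's "stream" service entry could look like. The field names follow the general DDO service shape, but the exact schema below is an assumption, not an official Ocean spec.

```python
# Hypothetical DDO with a "stream" service; all values are placeholders.
stream_service = {
    "type": "stream",                                    # assumed service type for WSS assets
    "serviceEndpoint": "https://provider.example.com",   # provider that will proxy the stream
    "files": "0x04f0dddf...",                            # encryptedURL of wss://publisher.example.com/feed
    "timeout": 3600,                                     # seconds the proxied stream stays open per order
    "datatokenAddress": "0xDataToken...",                # DT the consumer must order to get access
}

ddo = {
    "id": "did:op:1234...",          # DID assigned at publish time
    "metadata": {                    # from the user input form
        "name": "Live market ticks",
        "type": "dataset",
    },
    "services": [stream_service],
}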
Consume flow:
using the frontend, consumers search for datasets and buy 1 DT (different pricing schemas are possible)
using the frontend (ocean.js/py), they call datatoken.startOrder (which generates an on-chain transaction)
provider does the checks, decrypts encryptedURL (from the DDO) and switches to a WSS socket on the consumer side
provider connects to asset_URL (wss://...) and proxies any received frame to the consumer connection (a simple loop, active as long as both connections are open and the timeout is not reached; see the sketch below)
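A minimal sketch of that relay loop, using the Python websockets package. The handler shape, the hard-coded asset_URL, and the timeout handling are assumptions for illustration, not the actual provider implementation.

```python
import asyncio
import time
import websockets

STREAM_TIMEOUT = 3600  # seconds; would come from the "stream" service's timeout

async def relay(consumer_ws):
    # (Older websockets versions also pass a `path` argument to the handler.)
    # In the real flow the provider would first validate the order and then
    # decrypt encryptedURL from the DDO; hard-coded here for illustration.
    asset_url = "wss://publisher.example.com/feed"  # hypothetical decrypted asset_URL
    deadline = time.monotonic() + STREAM_TIMEOUT

    async with websockets.connect(asset_url) as publisher_ws:
        # Simple loop: forward every frame publisher -> consumer while both
        # connections are open and the service timeout has not been reached.
        while time.monotonic() < deadline:
            try:
                frame = await asyncio.wait_for(publisher_ws.recv(), timeout=5)
                await consumer_ws.send(frame)
            except asyncio.TimeoutError:
                continue  # no frame yet; keep waiting until the deadline
            except websockets.ConnectionClosed:
                break     # either side went away; stop proxying

async def main():
    async with websockets.serve(relay, "0.0.0.0", 8765):
        await asyncio.Future()  # run forever

if __name__ == "__main__":
    asyncio.run(main())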
so the original WSS URI is never exposed; the provider acts as a proxy (it just forwards received frames from publisher -> consumer)
obviously, WSS datasets are not meant for the frontend (simple web download apps), but for backends that can use ocean.js/py to consume them (or for C2D jobs that keep running and feed frames to some ML algos, for instance)
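For example, a backend consumer in this scenario could look roughly like the sketch below; the proxied stream URL and its query parameters are hypothetical, not an actual Provider API.

```python
# After buying the DT and calling startOrder (via ocean.js/py), the backend
# opens the provider's proxied WSS endpoint and processes frames as they arrive.
import asyncio
import websockets

PROXIED_STREAM = "wss://provider.example.com/stream?did=did:op:1234&transferTxId=0xabc"

def handle_frame(frame):
    print(f"received {len(frame)} bytes")  # e.g. push into an ML pipeline / C2D job

async def consume():
    async with websockets.connect(PROXIED_STREAM) as ws:
        async for frame in ws:   # iterate frames until the provider closes the socket
            handle_frame(frame)

asyncio.run(consume())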
Note: the flow notes above are pasted from a Discord conversation with Shawn@VantageCrypto.