Open bobinson opened 5 years ago
Bobinson Bobby
Date: 10-05-2018
Version : 0.0003
Unlike traditional blockchain APIs, which were created for financial transactions and depend on Proof of Work, the UCEN blockchain is designed from the ground up for developing DApps. In addition to a blockchain with DPoS-based consensus, UCEN also provides middleware for developing DApps ranging from games and news portals to mobile and medical applications. With the changing landscape in data protection regulations, extreme care is taken to adapt to evolving data protection and privacy requirements.
The UCEN blockchain's middleware exposes data services, APIs and microservices which can be used to develop enterprise-class applications at Internet scale. The middleware is horizontally scalable and independent of the blockchain's scaling, which enables us to provide a geographically distributed, scalable middleware layer. We expect various DApps consuming the API and data endpoints to perform transformations, visualizations and presentation based on the immutable transactions in the blockchain. The middleware takes care of data ingestion from the blockchain and other data sources, support for adding new ingestion destinations and sinks, caching, proxy services etc., so that massively scalable applications can be built on top of it.
Along with the middleware, direct access to the blockchain is also possible for wallet access, querying the real-time consensus state, transactions etc.
Middleware components
Streaming the blockchain means taking real-time, over-the-network snapshots of the blockchain's activities / transactions and storing them in a database which can be handled easily.
The streaming is further subdivided into fetching the data and storing it to long-term storage.
STR3 - I/O failure scenarios
In this section we capture the logic for the microservice which will consume the data from the in-memory cache and store it for long-term storage. To decide on the data formats needed, we will consider the requirements from the dApps and implement the required formats and transformations on a need basis.
The first step is to ensure the long-term storage of the data. This layer will perform two different types of operations.
3.1 Handling the live stream
The live-streamed data is present in the memory cache; it is captured from there and stored to MongoDB. The first version will be developed as a single-process Python service.
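A minimal sketch of that single-process capture service, assuming blocks arrive as JSON strings in a Redis list. Plain dict/list stand-ins replace the real redis-py and pymongo clients here so the flow is self-contained; the key name "blocks" and the block shape are assumptions, not confirmed details.

```python
import json

def drain_live_stream(cache, store, key="blocks"):
    """Pop every pending block from the cache list and append it to the store."""
    copied = 0
    while cache.get(key):
        raw = cache[key].pop(0)           # stands in for a Redis LPOP
        store.append(json.loads(raw))     # stands in for a MongoDB insert_one
        copied += 1
    return copied

cache = {"blocks": [json.dumps({"block_num": n}) for n in (1, 2, 3)]}
mongo_like = []
drain_live_stream(cache, mongo_like)      # → 3
```

In a real deployment the loop would block on the cache (e.g. a blocking pop) instead of polling, but the copy-only behavior stays the same.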
3.2 Snapshot
A snapshot of the blockchain means taking a copy of the database from a starting block to an ending block. Often this will be from the genesis block to the last irreversible block.
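The snapshot operation above can be sketched as a range copy; the dict block shape with a "block_num" field is an assumption for illustration.

```python
def snapshot(chain, last_irreversible_num):
    """Copy every block from the genesis block up to the last irreversible block."""
    return [b for b in chain if b["block_num"] <= last_irreversible_num]

chain = [{"block_num": n} for n in range(1, 6)]
snap = snapshot(chain, 3)    # blocks 1..3
```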
3.1 use cases (ING)
sndp487 added a comment. May 13 2018, 1:10 AM
Q - Data stored in redis-server will grow continuously while streaming. How should MongoDB keep up? Say a process 'p' does redis ---> MongoDB; there are two possible ways: (1) p first stores all the data already present in the Redis cache and then updates on the go with every new block stored in Redis. (2) p runs periodically, resuming from where it stopped last time.
bobinson added a comment. May 14 2018, 10:39 AM
(2) p can run periodically, resuming from where it stopped last time.
There should not be any relation between storing data to MongoDB and Redis.
blockchain --> redis should be independent of all other components of the system, and so is p.
p will be a dumb system that copies what was streamed from the BC (blockchain) to MongoDB.
Redis will have to take care of its own needs like memory, CPU etc.
Any serialization, data transformation etc. is not the job of "p".
What Redis does with the data is not the concern of p.
If p gets an error, it will just report it.
Whatever is presented by Redis, p will take it.
At a later point we will apply transformations to the data. We are taking this approach because right now it is more important to get the data into MongoDB and provide the APIs than anything else.
We have introduced a Redis in-memory cache as we are not using an ingestion framework or an event-processing framework. In step one we are copying the data to MongoDB as per the dApp requirements.
This will have to be revisited and enhanced into a proper ingestion framework at a later stage.
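The role of "p" described in the thread can be sketched as follows: a dumb copier that takes whatever Redis presents, writes it verbatim, reports errors, and resumes from a checkpoint on each periodic run (option 2 above). List/dict stand-ins replace the real redis-py and pymongo clients so the logic is self-contained.

```python
import json
import logging

def run_p(presented, store, checkpoint):
    """Copy blocks newer than checkpoint["last_block"]; no transformations."""
    for raw in presented:                    # whatever Redis presents, p takes it
        try:
            block = json.loads(raw)
        except ValueError as err:
            logging.error("p: unreadable payload, reporting only: %s", err)
            continue                         # p just reports the error
        if block["block_num"] <= checkpoint["last_block"]:
            continue                         # already stored: resume point
        store.append(block)                  # verbatim dump, no serialization work
        checkpoint["last_block"] = block["block_num"]
    return checkpoint["last_block"]
```

Each periodic invocation calls run_p again with the persisted checkpoint, so a second run over the same data copies nothing new.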
At this point in time, Python requests will mostly handle our needs, with the possibility of using Pandas.
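For the requests-based fetch, a hedged sketch: the endpoint URL, port, and the "get_block" JSON-RPC method name below are illustrative assumptions, not confirmed UCEN API details.

```python
def get_block_payload(block_num):
    """Build a JSON-RPC request body for fetching a single block (assumed method name)."""
    return {"jsonrpc": "2.0", "id": 1,
            "method": "get_block", "params": [block_num]}

# With a live node, the call would look roughly like:
# import requests
# resp = requests.post("http://127.0.0.1:8090/rpc",
#                      json=get_block_payload(42), timeout=10)
# block = resp.json()["result"]
```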
Reference document: https://github.com/Ucen-Blockchain/streamplay/issues/3#issuecomment-458870210