Harryman / hashd


Token data structure and implementation #14

Open Harryman opened 5 years ago

Harryman commented 5 years ago

I have been doing research on git and how to implement this within the repo, and I think a submodule will be optimal. The submodule would act as a checkpoint with a ledger stating the balances of every #D entity holding the token. This way everyone can be sure of the total supply and how many tokens are being minted. It also cleans up the workflow for negotiating token issuance, for example when rogue contributors ask for compensation without an explicit bounty being set. You could use gogs to manage the token distribution and the discussion around it.

I think adapting git to incorporate bounties, issues, comments, and other project management is not ideal. Eventually I think we will have to roll our own version of git, more or less, that is much more integrated with the #D way of doing things.

For the PoC I propose we have a verified push/pull mechanism. When you write a comment, issue, or other interaction in gogs, the #D client watches the database for changes, bundles those changes into a block, and signs all the data in an unpackable format so there is non-repudiation about bounty requirements and the like. When someone else makes changes on their gogs instance, e.g. a comment, they do the same, then gossip that block to other subscribers. The receiver checks the received block, validates it, and unpacks the content into the database of their own gogs instance, thus synchronizing state while letting #D do the heavy lifting for auth and audit trailing. This also means our local gogs instance will not need a normal password or ID: since it's local and trusted, you are automatically using it with your #D identity. I think this is a key killer feature, using apps without login.

The token data structure, I think, should just be a JSON, CSV, or plain-text file that is a ledger of balances. Bounties and such should have some set format, like a "$#D-" prefix, so we can build a bounty-board frontend and people can discover bounties easily without digging into issues or making major modifications to the gogs UI to incorporate bounties in a meaningful way.
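A rough sketch of what that flat-file ledger and prefix convention could look like; the column names and the exact "$#D-" format are assumptions for illustration, not a settled spec.

```python
import csv
import io

# Hypothetical checkpoint ledger: one row per holder.
LEDGER = """\
holder,balance
alice,600
bob,400
"""

def total_supply(ledger_csv: str) -> int:
    """Anyone can audit total supply by summing the balance column."""
    rows = csv.DictReader(io.StringIO(ledger_csv))
    return sum(int(r["balance"]) for r in rows)

def find_bounties(issue_titles: list) -> list:
    """A bounty-board frontend could just filter on the agreed prefix
    instead of digging through every issue."""
    return [t for t in issue_titles if t.startswith("$#D-")]

assert total_supply(LEDGER) == 1000
assert find_bounties(["$#D-100 fix sync bug", "docs cleanup"]) == \
    ["$#D-100 fix sync bug"]
```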

Does this all make sense? Is there anything inherently dumb about this idea?

I get that it won't be nearly as performant as using Postgres's native synchronization, but by proxying everything through #D it acts as the "thin waist" (à la IPLD/IPFS rhetoric) that everything must go through. This reduces the attack surface tremendously. It also means the underlying protocol is agnostic to data types and backing stores, as long as a connector can be written for them. The connector API to abstract the database and data store will not be part of the PoC. I think the fastest path is to build only around the core apps and Postgres; with that we can have a PoC that gets the contributor token model off the ground and gets us paid to make a more flexible, abstracted implementation that is ready for mass consumption and more diverse apps built on top.
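The connector API is explicitly out of scope for the PoC, but the "thin waist" idea implies roughly this shape: the store is pluggable behind a small interface and everything else talks only to #D. All names below are hypothetical, and an in-memory list stands in for Postgres.

```python
from abc import ABC, abstractmethod

class StoreConnector(ABC):
    """Everything crosses the #D 'thin waist'; the backing store is
    whatever a connector can be written for."""

    @abstractmethod
    def watch_changes(self):
        """Yield app DB changes so the client can bundle them into blocks."""

    @abstractmethod
    def apply_block(self, changes):
        """Write validated, unpacked changes into the local store."""

class InMemoryConnector(StoreConnector):
    # Stand-in for a real Postgres connector in the PoC.
    def __init__(self):
        self.rows = []

    def watch_changes(self):
        yield from self.rows

    def apply_block(self, changes):
        self.rows.extend(changes)

store = InMemoryConnector()
store.apply_block([{"table": "issue", "op": "insert"}])
assert list(store.watch_changes()) == [{"table": "issue", "op": "insert"}]
```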

cryptoquick commented 5 years ago

I'm alright with submodules, you have good reasoning behind that, but I disagree that we need to alter the git server or client in any way. This should be done through the interface, altering and watching the repo through PRs and hooks. As for the token data format, TSV or CSV is good; JSON is traditionally not appendable or streamable. Agreed on the non-PoC secure interface to Postgres, but it also needs to be able to establish schemas and GraphQL ORMs. This can be done through auditable, PR'd, app-specific modules. These could also benefit the community as well as provide the data necessary for apps, and they only need to be merged in for production rollouts.

Harryman commented 5 years ago

@cryptoquick

as for token data format, TSV or CSV is good. JSON is traditionally not appendable or streamable.

To be clear, the actual ledger of transactions and history will be contained in everyone's #D chain. The token file in the token submodule will just be a snapshot of balances for each holder, plus the diff since the last checkpoint, so there is a clear account of totals and distribution. There are some caveats around the definitiveness of the distribution outlined here, but they really wouldn't come up except in edge cases and wouldn't cause too much of a problem even if the numbers don't match.
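The checkpoint idea above can be sketched as a pure function over two balance snapshots; the dict-of-balances shape is an assumption, since the real history lives in each holder's #D chain.

```python
def diff_since_checkpoint(prev: dict, curr: dict) -> dict:
    """Per-holder balance change between two checkpoint snapshots.
    Holders whose balance did not change are omitted from the diff."""
    holders = set(prev) | set(curr)
    return {h: curr.get(h, 0) - prev.get(h, 0)
            for h in holders
            if curr.get(h, 0) != prev.get(h, 0)}

prev = {"alice": 500, "bob": 500}
curr = {"alice": 600, "bob": 400}
assert diff_since_checkpoint(prev, curr) == {"alice": 100, "bob": -100}

# A cheap sanity check on each checkpoint: unless tokens were minted,
# total supply should be conserved across snapshots.
assert sum(prev.values()) == sum(curr.values()) == 1000
```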

cryptoquick commented 5 years ago

Ah yeah... We'd likely store the ledger in the Postgres database, too, just for convenience and efficiency.

Harryman commented 5 years ago

Yeah, everything on chain is going into Postgres so we can actually query and index chains somewhat efficiently.
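As a sketch of what mirroring chain entries into a relational store buys: once the rows are in a table, ordinary indexes make queries like "all bounties across all chains" cheap. sqlite3 stands in for Postgres here, and the schema is purely illustrative.

```python
import sqlite3

# Hypothetical mirror table: one row per entry in any #D chain.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE chain_entries (
    chain_id TEXT, seq INTEGER, kind TEXT, body TEXT)""")
db.executemany(
    "INSERT INTO chain_entries VALUES (?, ?, ?, ?)",
    [("alice", 1, "comment", "looks good"),
     ("alice", 2, "bounty", "$#D-100 fix sync"),
     ("bob",   1, "comment", "agreed")])

# Index by kind so e.g. bounty-board queries don't scan whole chains.
db.execute("CREATE INDEX idx_kind ON chain_entries(kind)")

bounties = db.execute(
    "SELECT chain_id, body FROM chain_entries "
    "WHERE kind = 'bounty'").fetchall()
assert bounties == [("alice", "$#D-100 fix sync")]
```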

Harryman commented 5 years ago

Agreed on the non-PoC secure interface to Postgres, but it also needs to be able to establish schemas and GraphQL ORMs. This can be done through auditable, PR'd, app-specific modules. These could also benefit the community as well as provide the data necessary for apps.

I think that is definitely where we need to go, but for now I think the goal is #D with only one high-value app that showcases what the protocol can do; all those abstractions come later, after more funding comes in. We need to conquer a niche.

A good abstraction layer is definitely a requirement for wider adoption: one that allows sharing of schemas, essentially plugins that understand #D chains and can present a usable interface for app developers, while also giving users a simple understanding of what a plugin will do with their data and what data it will request from other peers. That is going to be a very large task and crucial to get right, but now isn't the time to worry about it.

cryptoquick commented 5 years ago

Agreed. Good considerations. Crucial engineering decisions need to be considered at this critical time.

cryptoquick commented 5 years ago

So, just to be clear, what would you say this discussion distills into, bullet-point style, as acceptance criteria for an MVP of this feature?

Harryman commented 5 years ago
cryptoquick commented 5 years ago

sounds reasonable

15 ?

Harryman commented 5 years ago

writing it up