
Next Step Microgrant: Dataponte #1589

Closed nijoe1 closed 6 months ago

nijoe1 commented 1 year ago

1. What is your project, and what problem does it solve? (max 100 words)

DataPonte revolutionizes decentralized data management and monetization by integrating the FEVM, the Lilypad protocol, Lighthouse, Tableland, and Polybase. It offers a comprehensive solution for secure storage, management, aggregation, and monetization of data. Users can perform verifiable computations over their data using Lilypad and engage in a collaborative ecosystem where they can post paid bounties for computational tasks. With complete control over their data, users can monetize assets as NFTs and grant NFT holders access to the underlying data, all within a decentralized and secure environment. DataPonte empowers individuals and organizations to drive innovation and create value from their data.

2. Project links

3. a) How is IPFS, Filecoin, or related technology used in this project?

| Technology | Description |
| --- | --- |
| Lighthouse | DataPonte employs Lighthouse to enhance data security. It provides encryption capabilities, allowing users to encrypt their data before storing it on IPFS and Filecoin. Lighthouse enforces confidentiality and authorized access to sensitive information through custom encryption rules and sharing options (see the sketch below the table). |
| Filecoin | DataPonte integrates with Filecoin for secure, permanent storage. By combining the deal client contract and the deal aggregator oracle with the Filecoin data tools, DataPonte facilitates decentralized storage deal-making based on data size. This integration improves data durability, reliability, and long-term accessibility within the DataPonte ecosystem. |
| Lilypad | DataPonte integrates with the Lilypad protocol for secure and verifiable computations. Through the Lilypad contract oracle, DataPonte manages computational bounties and job requests: users can create bounty requests, specify requirements and rewards, and incentivize job creators for each computed job. |
| Tableland | DataPonte uses Tableland to add mutability and versioning to datasets, files, and folders. With Tableland, users can manage and track different versions of their data as NFTs, preserving data integrity and enabling collaborative data management. |
| Polybase | Polybase serves multiple key functions in the DataPonte ecosystem. It acts as a metadata database storing the file details required for storage deals on Filecoin. It also hosts a social layer for discussing dataset updates, requests, NFT creation proposals, and job bounties, and lets users leave feedback on DataNFTs and computations, promoting transparency and informed decision-making. |
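
To make the Lighthouse row above concrete, here is a minimal sketch of encrypting a file and gating decryption behind NFT ownership. It assumes the `@lighthouse-web3/sdk` API shape documented at the time of writing; the DataNFT contract address, key handling, and chain identifier are placeholders.

```ts
import lighthouse from "@lighthouse-web3/sdk";
import { ethers } from "ethers";

// Sign the auth message Lighthouse uses to authenticate key operations.
async function signAuthMessage(wallet: ethers.Wallet): Promise<string> {
  const { data } = await lighthouse.getAuthMessage(wallet.address);
  return wallet.signMessage(data.message);
}

async function uploadPrivateFile(path: string): Promise<string> {
  const wallet = new ethers.Wallet(process.env.PRIVATE_KEY!); // key handling is illustrative
  const apiKey = process.env.LIGHTHOUSE_API_KEY!;
  const signedMessage = await signAuthMessage(wallet);

  // Encrypt client-side, then push to IPFS/Filecoin via Lighthouse.
  const response = await lighthouse.uploadEncrypted(path, apiKey, wallet.address, signedMessage);
  const cid = response.data[0].Hash; // response shape per the SDK docs; may vary by version

  // Gate decryption behind NFT ownership: only holders of a (hypothetical)
  // DataNFT collection can retrieve the decryption key shards.
  const conditions = [
    {
      id: 1,
      chain: "Calibration", // chain identifier assumed; see Lighthouse docs
      method: "balanceOf",
      standardContractType: "ERC721",
      contractAddress: "0x0000000000000000000000000000000000000004", // placeholder
      returnValueTest: { comparator: ">=", value: "1" },
      parameters: [":userAddress"],
    },
  ];
  await lighthouse.applyAccessCondition(wallet.address, cid, signedMessage, conditions);
  return cid;
}
```

Decryption then runs the same flow in reverse: a holder signs the auth message and, if the access condition passes, Lighthouse's nodes release the key shards needed to decrypt.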

b) Is this project building with the current microgrants focus area (FVM)? (Yes or No)

YES

4. How will you improve your project with this grant? What steps will you take to meet this objective?

### Sprint-based planning

We work in sprints of two weeks (14 working days). `Points for estimating tasks: 1, 2, 3, 5, 8, 13, 20, 40, and 100`

**Day 1 - 2: Sprint Planning**

As a team, we decide which user stories will be implemented in the upcoming two-week sprint. We organize and prioritize the user stories and work them out in detail:

- Break down user stories into smaller, manageable tasks and assign them to specific team members.
- Establish clear acceptance criteria for each user story.
- Estimate the effort required for each task.
- Identify potential dependencies between tasks and address them.

**Day 3 - 9: Development**

During these days we implement the functionalities. Each day starts with a 10-15 minute stand-up where each team member briefly covers what they accomplished the day before, what they will work on today, and any obstacles they are facing. This keeps the team aligned and surfaces issues early.

**Day 10 - 12: Testing**

Both developers focus on testing the functionalities they implemented and fixing any bugs that arise, making sure the features work as expected and meet the acceptance criteria:

- Peer code reviews for higher code quality and shared knowledge among the team.
- Functional testing alongside performance, security, and other relevant testing methods.
- Document any discovered issues for future reference and fixing.

**Day 12 - 14: Production and review**

We merge code into the main codebase and ensure the new features integrate well with the existing code:

- Sprint retrospective meeting.
- Create user feedback forms.
- Create a demo to share with the community.
- Provide updates to the microgrant.
- Test production.
### Project improvements (1/7/2023 - 30/9/2023)

The project improvements below are also described in the development [sprints](https://github.com/orgs/dataponte/projects/3/views/2) as user stories and deliverables.

#### Contracts

- Optimize and test contracts for deployment on FEVM mainnet and make them upgradable for future improvements, including:
  - Dataset-NFTs:
    - Add dataset versions.
    - Computational bounties x Lilypad.
  - Multi-sig contract and factory:
    - Improve the multi-sig implementation.
    - Allow transaction execution based on off-chain signatures instead of on-chain confirmations, to save on gas (see the sketch after this list).
  - Deploy the thirdweb NFT Marketplace-V3 for the Dataset NFTs.
- Integrate zero-knowledge proofs for anonymous contributions to data requests using Circom and snarkjs:
  - Allow individuals to provide valuable private data without revealing their identity.
  - Reward those individuals for their contributions using a ZK implementation like Tornado Cash.
- Create a user-friendly workflow for data onboarding on Filecoin, providing the options depicted in the image below:

  ![Data Onboarding Workflow](https://gateway.lighthouse.storage/ipfs/QmZFgsfURpY3mB9661Rnf8J55C77yQfxNGB1HMX4rVYJg7)

  - Lighthouse storage to onboard encrypted and non-encrypted files on IPFS and Filecoin (implemented).
  - Create a Deal-Client interface to be used by each data DAO deal client created by our contract factory.
- Continue our Deal-Client interface improvements based on this [FRC Deal Client](https://pl-strflt.notion.site/WIP-FRC-Deal-Client-Draft-1-v0-1-458e625f13b14c70bfdfe7ed64007b6c), and create an upgradable proxy factory for the deal client so that each user or multisig-based DAO can create storage deals:
  - We already implemented a dealClient that improves the indexability of notified deals using the Tableland on-chain database.
  - Added the DataCap API to make verified deals.
  - Made it work as a dataDAO.
  - Reduced contract size.
  - Here is the verified [dealClient](https://fvm.starboard.ventures/calibration/explorer/address/0x204ce0695260c5d3ace6626af63936beea63f17d) contract.
- Integrate Filecoin data tools and the deal aggregator oracle:
  - EdgeUR and Delta.
  - Stay up to date on Alliance's decentralized deal-making workflow.

#### Frontend

- Improve UI/UX and integrate the new version of the contracts:
  - Extend the user dashboard to provide insights into:
    - Dataset & survey contributions.
    - Created Bacalhau jobs.
    - Computations performed.
    - Total earned rewards.
    - Created & owned datasets.
  - Improve and extend the multi-sigs (dataDAOs) page to allow a seamless workflow for creating and curating datasets, files, and folders:
    - Add a versioning system.
    - Create and sign (confirm) proposals.
    - Replicate important datasets additional times using their DealClient contract or the storage aggregator (via the Filecoin data tools), depending on the size of the data.
  - Improve application features:
    - Data requests.
    - Dataset NFT marketplace.
    - Computation marketplace.
    - Computation-over-dataset dashboard.
    - Dashboard for requesters.
    - Dashboard for contributors.
    - Multi-sig (dataDAO) dashboard.

#### Backend

- Optimize and standardize the backend infrastructure:
  - Data contribution validation:
    - Dynamic JSON and CSV schema analyzers.
    - Merger of JSON and CSV contributions into a single CID.
  - Automate testing for backend and frontend functionality.

#### Non-technical

- Create a Snapshot space for platform improvement proposals and let early adopters vote and leave feedback.
- Create social accounts for DataPonte (Twitter, Discord, Medium).
- Stay up to date on IPC updates; looking forward to deploying on IPC.
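
As referenced in the multi-sig item above, here is a minimal sketch of the off-chain signature flow. The EIP-712 domain and the `executeWithSignatures` method are hypothetical; on-chain, the contract would recompute the digest, recover each signer, and check it against the owner set and threshold.

```ts
import { ethers } from "ethers";

// The transaction the owners agree on off-chain (fields are illustrative).
const tx = {
  to: "0x0000000000000000000000000000000000000001", // placeholder target
  value: 0n,
  data: "0x",
  nonce: 7n,
};

// Assumed EIP-712 domain; the verifying contract address is a placeholder.
const domain = {
  name: "DataPonteMultisig",
  version: "1",
  chainId: 314159, // Filecoin Calibration testnet
  verifyingContract: "0x0000000000000000000000000000000000000002",
};

const types = {
  Transaction: [
    { name: "to", type: "address" },
    { name: "value", type: "uint256" },
    { name: "data", type: "bytes" },
    { name: "nonce", type: "uint256" },
  ],
};

// Each owner signs locally; no gas is spent until execution.
async function collectSignature(owner: ethers.Wallet): Promise<string> {
  return owner.signTypedData(domain, types, tx); // ethers v6 (v5: _signTypedData)
}

// Any relayer submits the batch once enough signatures are collected.
async function execute(multisig: ethers.Contract, signatures: string[]) {
  const pending = await multisig.executeWithSignatures(tx.to, tx.value, tx.data, signatures);
  return pending.wait();
}
```

The gas saving comes from replacing N confirmation transactions with a single execution transaction that carries all N signatures.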
| Number | Grant Deliverable | Briefly describe how you will meet deliverable objectives | Timeframe (within 3 months) |
| --- | --- | --- | --- |
| 1 | Contracts optimization and testing | Optimize and test contracts for deployment on the FEVM Calibration testnet. Make them upgradable for future updates. | 1/7/2023 - 15/8/2023 |
| 2 | Integrate zero-knowledge proofs | Implement zero-knowledge proofs for anonymous private data contributions to data requests using Circom and snarkjs (sketch below). | 1/8/2023 - 15/8/2023 |
| 3 | Frontend and backend enhancements | Improve UI/UX and integrate the new contracts on Calibration. Optimize and standardize backend infrastructure. | 1/7/2023 - 31/8/2023 |
| 4 | Deploy to production and FEVM mainnet | Finalize development sprints. Deploy the contracts on FEVM mainnet. Production-ready application. | 1/9/2023 - 30/9/2023 |
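
For deliverable 2, here is a minimal sketch of the client-side proving flow with snarkjs. The circuit artifacts (`contribution.wasm`, `contribution_final.zkey`) and the input signal names are hypothetical; they would come from a Circom circuit compiled for this project.

```ts
import * as snarkjs from "snarkjs";

// Prove a private data contribution without revealing it.
async function proveContribution(secret: bigint, datasetId: bigint) {
  const input = { secret, datasetId }; // circuit inputs (assumed names)
  const { proof, publicSignals } = await snarkjs.groth16.fullProve(
    input,
    "contribution.wasm",
    "contribution_final.zkey"
  );
  // Format the proof as calldata for a snarkjs-generated Solidity verifier.
  const calldata = await snarkjs.groth16.exportSolidityCallData(proof, publicSignals);
  return { proof, publicSignals, calldata };
}
```

The on-chain side would be the verifier contract snarkjs exports, which the data-request contract calls before accepting an anonymous contribution.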

This 3-month roadmap outlines our objectives for application development. Operating in an agile environment, our team is determined to exceed the goals set in the proposal. We recognize the importance of adaptability and continuous improvement, allowing us to build a high-quality application that surpasses the initial specifications. At the end of this period, we will have an alpha version deployed to production on the FEVM mainnet.

5. If accepted, do you agree to share monthly project updates in this GitHub issue for 3 months or until the project described here is complete?

YES

6. Does your proposal comply with our Community Code of Conduct?

YES

7. Links and submissions

Additional questions:

tse-lao commented 1 year ago

Hey, @ErinOCon,

I wanted to provide an update on our progress with the DataPonte proposal. Recently, we participated in the eth-paris hackathon, where we built the OxForm application. Our goal was to provide data sources that can be used within the DataPonte platform.

We successfully implemented a sybil-resistance mechanism using Worldcoin and zero-knowledge proofs with Sismo. Now, every data request (form, survey, dataset) requires contributors to prove they are human and to meet additional requirements via zk-proofs, guaranteeing authenticity without compromising privacy.

This opens up whole new use cases, but it also brings challenges. One of them is data contributors who submit data intended to compromise datasets. To address this, we integrated UMA's Optimistic Oracle as an option for the contributor community to validate the integrity of contributed data, incentivising users to provide valuable data while penalising those who submit irrelevant or fake information.
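
For context, here is a minimal sketch of asserting a contribution's validity through UMA's Optimistic Oracle V3, assuming the `assertTruthWithDefaults` entry point from UMA's docs; the oracle address is a placeholder and the claim format is illustrative.

```ts
import { ethers } from "ethers";

// Minimal ABI fragment; the shape follows UMA's Optimistic Oracle V3.
const ooAbi = [
  "function assertTruthWithDefaults(bytes claim, address asserter) returns (bytes32)",
];

// Assert that a contribution (identified by its CID) is valid. Anyone can
// dispute during the challenge window; if undisputed, the assertion settles
// as true and the contributor can be rewarded.
async function assertContribution(signer: ethers.Signer, cid: string) {
  const oracle = new ethers.Contract(
    "0x0000000000000000000000000000000000000003", // placeholder oracle address
    ooAbi,
    signer
  );
  const claim = ethers.toUtf8Bytes(`Contribution ${cid} matches the dataset schema`); // ethers v6
  const pending = await oracle.assertTruthWithDefaults(claim, await signer.getAddress());
  return pending.wait(); // the assertionId is emitted in the AssertionMade event
}
```

In practice the asserter must also approve the oracle's default bond currency before asserting; that step is omitted here.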

As a result, we've created a fully decentralised platform that allows users to create valuable datasets of any kind, retain ownership of their data, and be rewarded for contributing, with privacy concerns addressed through zero-knowledge proofs.

You can find the OxForm repository here. Our plan is to integrate these updates into the proposal and the Filecoin ecosystem, making them compatible with cross-chain environments.

We are excited about the potential of these improvements and their impact on DataPonte's growth and adoption.

Looking forward to your feedback and collaboration.

Best regards, Nick and Koen.

nijoe1 commented 11 months ago

@ErinOCon Any update on this microgrant?

ErinOCon commented 6 months ago

Hi @nijoe1, thank you for your proposal and for your patience with our review process. Unfortunately, we will not be moving forward with a grant for this project. Please feel welcome to contact our team at grants@fil.org with any questions.

Wishing you the best as you continue to build!