spheronFdn / sos-ai-bounty

Spheron Open Source AI Bounty
https://spheron.network
MIT License
52 stars · 5 forks

Dopameme (Meme/Art Generator Platform Bounty) #6

Open vivekratr opened 3 weeks ago

vivekratr commented 3 weeks ago

Name: Swayam Sharma, Shreyash Singh, Vivek Shah

Email and contact details: swayamsharma7021@gmail.com, shreyashsingh865@gmail.com, vivekratr@gmail.com

Project details: Dopameme is an advanced meme-making platform designed to simplify and elevate the creative process using AI technology. It offers a variety of innovative tools, enabling users to effortlessly generate, customize, and share memes. Dopameme transforms the traditionally time-consuming task of meme creation into an engaging, automated experience, making it accessible to everyone.

Demo video: https://www.youtube.com/watch?v=ruex9eExt4s

Demo link: https://web.telegram.org/k/#@dopameme_fun_bot

GitHub repository details: https://github.com/vivekratr/Dopameme-Miniapp-v2

Tech stack used: Spheron SDK, Telegram SDK, React.js, Python, Tailwind CSS, RainbowKit, Dynamic Wallet, Ethers.js, Arbitrum Sepolia, Gemini, Computer Vision.

izrake commented 3 weeks ago

Please add a good README to your repo showcasing its installation and running process.

Also, include the deployment steps on Spheron with the YAML. Since this is an open-source contribution, you also have to add an MIT or Apache license to your repo.

Going forward, I would recommend removing your phone numbers from here for privacy reasons.

rekpero commented 3 weeks ago

Also, can you properly describe how you use Spheron in your project and which OSS AI models you use?

vivekratr commented 3 weeks ago

Also, can you properly describe how you use Spheron in your project and which OSS AI models you use?

The open-source models are served as API inference endpoints from our Flask application, which integrates all of them and is deployed on Spheron.

rekpero commented 3 weeks ago

Also, can you properly describe how you use Spheron in your project and which OSS AI models you use?

The open-source models are served as API inference endpoints from our Flask application, which integrates all of them and is deployed on Spheron.

But I don't see that code anywhere in the repo you shared in the submission. If there are any other repositories for the Flask backend, please share them here in the submission description. Also, share the Spheron ICL (YAML) configuration so that we can test it out.
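For reference, a Spheron ICL file is a YAML manifest in the Akash-style SDL shape (services, compute/placement profiles, deployment). The sketch below is only a guess at what such a file for the Flask backend could look like; the image name, ports, GPU model, region, token, and pricing are all placeholder assumptions, not the project's actual configuration.

```yaml
# Hypothetical Spheron ICL sketch (Akash-style SDL). All concrete values
# below (image, ports, GPU model, region, pricing) are placeholders.
version: "1.0"

services:
  dopameme-backend:
    image: example/dopameme-flask:latest   # the real Docker image goes here
    expose:
      - port: 5000        # port the Flask app listens on inside the container
        as: 80            # port exposed publicly
        to:
          - global: true

profiles:
  name: dopameme
  duration: 1h
  compute:
    dopameme-backend:
      resources:
        cpu:
          units: 2
        memory:
          size: 4Gi
        storage:
          size: 20Gi
        gpu:
          units: 1        # GPU request, which reviewers asked to see
          attributes:
            vendor:
              nvidia:
                - model: rtx4090
  placement:
    westcoast:
      pricing:
        dopameme-backend:
          token: USDT
          amount: 5

deployment:
  dopameme-backend:
    westcoast:
      profile: dopameme
      count: 1
```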

vivekratr commented 3 weeks ago

Please add a good README to your repo showcasing its installation and running process.

Also, include the deployment steps on Spheron with the YAML. Since this is an open-source contribution, you also have to add an MIT or Apache license to your repo.

Going forward, I would recommend removing your phone numbers from here for privacy reasons.

All the changes mentioned are done.

izrake commented 3 weeks ago

Please add a good README to your repo showcasing its installation and running process. Also, include the deployment steps on Spheron with the YAML. Since this is an open-source contribution, you also have to add an MIT or Apache license to your repo. Going forward, I would recommend removing your phone numbers from here for privacy reasons.

All the changes mentioned are done.

I don't think it's done; as far as I can see, the YAML you have provided is not the real one with your Docker containers. You need to set up a deployment YAML for your container on Spheron, not just a random example.

vivekratr commented 3 weeks ago

Please add a good README to your repo showcasing its installation and running process. Also, include the deployment steps on Spheron with the YAML. Since this is an open-source contribution, you also have to add an MIT or Apache license to your repo. Going forward, I would recommend removing your phone numbers from here for privacy reasons.

All the changes mentioned are done.

I don't think it's done; as far as I can see, the YAML you have provided is not the real one with your Docker containers. You need to set up a deployment YAML for your container on Spheron, not just a random example.

We've now updated the YAML with the real Docker image and provided a video link demonstrating the deployment process on Spheron, completed by our team. Please check it out and let us know if you need anything else.

video link.

izrake commented 3 weeks ago

Hey @vivekratr took a look into it.

Here are the comments & questions:

  • The README is still not updated on the main repo (Readme.md).
  • You have added the Spheron folder for deployment, but I can't find any GPU usage in it.
  • Which models are you using for this, and where are these models deployed?
  • If the models are deployed elsewhere, we don't accept the submission. Models must be deployed on a Spheron GPU, and the tutorial must showcase the deployment process on Spheron.

Please take your time to update everything properly. If we see the same issues again and again, we will disqualify your submission. Ensure you follow everything stated in the submission guidelines with utmost quality.

vivekratr commented 2 weeks ago

Also, can you properly describe how you use Spheron in your project and which OSS AI models you use?

We are deploying our Flask backend using Spheron. The open-source model we are using is Artples/LAI-ImageGeneration-vSDXL-2.

vivekratr commented 2 weeks ago

Also, can you properly describe how you use Spheron in your project and which OSS AI models you use?

The open-source models are served as API inference endpoints from our Flask application, which integrates all of them and is deployed on Spheron.

But I don't see that code anywhere in the repo you shared in the submission. If there are any other repositories for the Flask backend, please share them here in the submission description. Also, share the Spheron ICL (YAML) configuration so that we can test it out.

The information you mentioned has been updated and is now available for review. Additionally, we have provided the YAML configuration for your reference. Please feel free to test it out.

vivekratr commented 2 weeks ago

Hey @vivekratr took a look into it.

Here are the comments & questions:

  • The README is still not updated on the main repo (Readme.md).
  • You have added the Spheron folder for deployment, but I can't find any GPU usage in it.
  • Which models are you using for this, and where are these models deployed?
  • If the models are deployed elsewhere, we don't accept the submission. Models must be deployed on a Spheron GPU, and the tutorial must showcase the deployment process on Spheron.

Please take your time to update everything properly. If we see the same issues again and again, we will disqualify your submission. Ensure you follow everything stated in the submission guidelines with utmost quality.

  • The README has been updated to include detailed deployment steps for both the front-end and back-end components of the application on Spheron. The product features have also been outlined in the README.
  • With regard to the GPU, our Flask backend has been deployed via the Spheron folder; you can view the GPU usage in the source code pipeline folder (src/pipeline).
  • Our overall AI system, responsible for the generation component, has been deployed on the Spheron network and is available for review.