daytonaio / content

Daytona Content Programme for Technical Writers
https://www.daytona.io/dotfiles/

Run GPU supported LLM inside container with devcontainer #4

Open nkkko opened 3 months ago

nkkko commented 3 months ago

Content Type

Guide

Article Description

You need to get https://huggingface.co/mistralai/Mamba-Codestral-7B-v0.1 running inside a devcontainer in Daytona and write about it.

Write several short samples of Python scripts.
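
A minimal sketch of the kind of sample the article could include (this is an assumption, not part of the issue spec: it presumes a transformers release with Mamba-2 support, accelerate installed, a CUDA-capable GPU exposed to the devcontainer, and transformers-compatible weights in the Hugging Face repo):

```python
# Minimal sketch: load Mamba-Codestral-7B inside the devcontainer and generate a completion.
# Assumes: transformers with Mamba-2 support (>= 4.44), torch built with CUDA,
# accelerate for device_map, and the GPU forwarded into the container.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mamba-Codestral-7B-v0.1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half-precision weights fit a single modern GPU more easily
    device_map="auto",           # place the weights on the available GPU(s)
)

prompt = "def fibonacci(n):"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The article should explain how the devcontainer is configured so this script sees the GPU (e.g. NVIDIA runtime and drivers), then build the remaining samples around it.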

Target Audience

Developers interested in integrating LLMs

References/Resources

No response

Examples

Examples of simple Python scripts

Special Instructions

No response

nkkko commented 3 months ago

/bounty $300

algora-pbc[bot] commented 3 months ago

💎 $300 bounty • Daytona

Steps to solve:

  1. Start working: Comment /attempt #4 with your implementation plan
  2. Submit work: Create a pull request including /claim #4 in the PR body to claim the bounty
  3. Receive payment: 100% of the bounty is received 2-5 days post-reward. Make sure you are eligible for payouts

If no one is assigned to the issue, feel free to tackle it, without confirmation from us, after registering your attempt. In the event that multiple PRs are made from different people, we will generally accept those with the cleanest code.

Please respect others by working on PRs that you are allowed to submit attempts to.

e.g. if you have reached the limit of active attempts, please wait until a slot frees up before submitting a new PR.

If you cannot submit an attempt, you will not receive your payout.

Thank you for contributing to daytonaio/content!


Attempt Started (GMT+0) Solution
🟢 @vikashsprem Aug 19, 2024, 12:40:55 PM WIP
🟢 @varshith257 Aug 19, 2024, 1:10:05 PM WIP
🟢 @varuns3546 Aug 19, 2024, 9:05:53 PM WIP
vikashsprem commented 3 months ago

/attempt #4

varshith257 commented 3 months ago

/attempt #4

Algora profile: @varshith257
Completed bounties: 8 from 4 projects
Tech: TypeScript, Go
Active attempts: 3
aman23bedi commented 2 months ago

Hi @nkkko
It seems https://huggingface.co/mistralai/Mamba-Codestral-7B-v0.1 is Mamba-based rather than Transformer-based, so deploying and running it inside the container requires some additional steps that Transformer-based models don't need. Since the issue is aimed at developers integrating LLMs, and the models they are most likely to use are Transformer-based rather than Mamba-based, shouldn't we switch to a Transformer-based model? Let me know if that makes sense; otherwise I will work on the Mamba-based model.
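
For context, a quick sketch of the extra setup check a Mamba-based model usually needs inside the container. The package names mamba-ssm and causal-conv1d are the CUDA kernels commonly paired with Mamba models; treat this as an assumption about the environment, not something stated in the issue:

```python
# Sanity check to run inside the devcontainer before loading a Mamba-based model.
# Assumes torch is installed; mamba_ssm and causal_conv1d are optional CUDA kernels
# that typically have to be compiled against the container's CUDA toolkit.
import importlib.util
import torch

print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))

for pkg in ("mamba_ssm", "causal_conv1d"):
    found = importlib.util.find_spec(pkg) is not None
    print(f"{pkg}: {'installed' if found else 'missing'}")
```

A Transformer-based model would skip the kernel check entirely, which is the extra container work I was referring to.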

stdthoth commented 4 days ago

Is this still available @nkkko @mojafa?