Closed · joseplayero closed this 1 week ago
/bounty $20

Thank you for contributing to reorproject/reor!

To claim the bounty:
- Comment `/attempt #376` with your implementation plan.
- Include `/claim #376` in the PR body to claim the bounty.
| Attempt | Started (GMT+0) | Solution |
|---|---|---|
| 🟢 @DevGajjar28 | Aug 26, 2024, 8:13:40 AM | WIP |
| 🟢 @Vayras | Aug 26, 2024, 2:09:55 PM | WIP |
| 🔴 @govindup63 | Aug 26, 2024, 2:12:28 PM | WIP |
| 🟢 @itsdheerajdp | Aug 26, 2024, 5:15:09 PM | WIP |
/attempt #376
Hi @joseplayero
I'm having an issue while installing Reor for contribution. I followed the contributing guide, but I encountered a problem when running the `npm run dev` command. The following error message appears:

exec error: Error: Command failed: "D:\open-source\Bounty\New folder\reor\binaries\win32\ollama-windows-amd64.exe" serve
'"D:\open-source\Bounty\New folder\reor\binaries\win32\ollama-windows-amd64.exe"' is not recognized as an internal or external command
The contributing guide does not mention anything about this issue. Could you please provide some guidance on how to resolve this?
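For what it's worth, the error above looks like the classic cmd.exe quoting problem: `exec()` runs its command string through a shell, and a path containing spaces (`New folder`) that ends up wrapped in an extra layer of quotes is no longer recognized as a command. A sketch of the usual fix is to use `execFileSync`, which passes the executable path directly with no shell interpretation. In this sketch, `process.execPath` (the Node binary itself) stands in for the real path to `ollama-windows-amd64.exe` so the snippet runs anywhere:

```typescript
// Sketch only: exec() passes its command through a shell, where a path with
// spaces that has been wrapped in an extra layer of quotes fails to resolve.
// execFileSync() receives the executable path directly, with no shell
// involved, so spaces in the path are harmless.
import { execFileSync } from "child_process";

// process.execPath stands in for the real path to ollama-windows-amd64.exe.
const binaryPath: string = process.execPath;

// Equivalent in spirit to spawning `ollama serve`: arguments go in an array,
// never concatenated into one quoted command string.
const output: string = execFileSync(binaryPath, [
  "-e",
  "console.log('serve started')",
])
  .toString()
  .trim();

console.log(output); // prints "serve started"
```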
/attempt #376
/attempt #376
/attempt #376
| Algora profile | Completed bounties | Tech | Active attempts | Options |
|---|---|---|---|---|
| @itsdheerajdp | 1 bounty from 1 project | HTML, JavaScript, TypeScript & more | #381 | Cancel attempt |
@joseplayero can I get it assigned so that I can start working on it?
I will assign @DevGajjar28 for now, as he was the first to submit an attempt.
I've re-assigned to @Vayras for now. Please attempt and submit a PR ASAP and I'll review.
@samlhuillier I know how to deal with it. Can I give it a try?
Can I work on it?
Please hold off on implementing for now. This issue is blocked by refactor work this week; I will close the issue temporarily.
Problem
When the LLM is generating text in the chat and the writing assistant, the generated text runs below the visible area of the component, because the component does not scroll the user to the bottom where the LLM's latest generated text appears.
Solution
Auto-scroll in the chat and the writing assistant. The view should automatically scroll to the bottom of the LLM's generated text, but if the user scrolls manually (against the auto-scroll), the user's scroll overrides it. Once overridden, auto-scroll should not be re-activated until the next time the LLM is generating (i.e. when the next message is sent).
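The override behaviour described above can be sketched framework-agnostically. This is a minimal sketch, not code from the repo: the class name `AutoScroller` and the `NEAR_BOTTOM_PX` threshold are illustrative assumptions.

```typescript
// Illustrative sketch of the auto-scroll override logic (names are
// hypothetical, not from the reor codebase).

const NEAR_BOTTOM_PX = 30; // tolerance for counting as "at the bottom"

class AutoScroller {
  private userOverride = false;

  // Call when a new streamed message begins: re-enable auto-scroll.
  onGenerationStart(): void {
    this.userOverride = false;
  }

  // Call from the container's scroll handler with the element's DOM
  // metrics. If the user scrolled away from the bottom, disable
  // auto-scroll; it stays disabled until the next generation starts.
  onUserScroll(
    scrollTop: number,
    clientHeight: number,
    scrollHeight: number,
  ): void {
    const distanceFromBottom = scrollHeight - (scrollTop + clientHeight);
    if (distanceFromBottom > NEAR_BOTTOM_PX) {
      this.userOverride = true;
    }
  }

  // Call on each streamed token: scroll to the bottom only when true.
  shouldAutoScroll(): boolean {
    return !this.userOverride;
  }
}

// Usage: wire onUserScroll to the container's "scroll" event, call
// onGenerationStart when a message is sent, and on each new token
// scroll the container to the bottom only if shouldAutoScroll() is true.
```

Note the override is deliberately sticky: even if the user scrolls back to the bottom mid-generation, auto-scroll stays off until the next message starts, matching the behaviour described above.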
As always, code should be clean and easily understandable :)