-
# Description of the new feature/enhancement
This issue suggests implementing a "Vi mode" similar to the very convenient one implemented by Alacritty, where the user can
move the cursor around, s…
-
Thanks for your great work!
Why is the **motion_repr_clean** data involved in the test process, and how should it be handled during video demo inference, where no ground truth (GT) is available?
Looking forward …
-
The correct time/date is displayed under "Status". However, as soon as a video recording is triggered, the video always carries an incorrect timestamp, consistently 2 hours earlier.
The correct time zone is se…
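For context, a constant two-hour offset like this is typically what you see when one component stamps in UTC while another displays local time. A minimal sketch (the `UTC+2` offset here is a hypothetical example, e.g. CEST, not taken from the report):

```python
from datetime import datetime, timezone, timedelta

# Assumed local zone for illustration: UTC+2 (e.g. CEST).
local_tz = timezone(timedelta(hours=2))

# The status page shows local time...
now_local = datetime(2023, 7, 1, 14, 0, tzinfo=local_tz)

# ...but if the recorder stamps the video in UTC without converting,
# the embedded timestamp reads two hours behind the displayed time.
now_utc = now_local.astimezone(timezone.utc)
print(now_utc.hour)  # 12 — appears "2 hours earlier" than the 14:00 status
```

If that matches the symptom, the fix is usually to apply the configured time zone when writing the video metadata, not only in the status display.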
-
Thank you for your open-source work. I would like to attempt to replicate the data cleaning in H2O, specifically training the privileged policy in H2O and using the algorithm for data filtering. How c…
-
Thanks for your nice work! I ran into a problem while training on the TED dataset (two 32 GB GPUs).
```python
File "Thin-Plate-Spline-Motion-Model/train.py", line 55, in train
for x in dataloade…
```
-
Hi, sorry, I am a bit new to this area. I am wondering how to augment the current HumanML3D data by simply concatenating the same or different motion sequences together. For example, 'jumping j…
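As a starting point, a naive version of that augmentation is just stacking the per-frame feature arrays along the time axis. A minimal sketch, assuming each motion is stored as a `(num_frames, feature_dim)` NumPy array (HumanML3D uses a 263-dim redundant representation); note this ignores root-trajectory continuity at the seam, which a real pipeline would need to re-anchor:

```python
import numpy as np

def concat_motions(motion_a, motion_b):
    """Naively concatenate two motion feature arrays along the time axis.

    Assumes (num_frames, feature_dim) arrays with matching feature dims.
    Simply stacking frames does NOT smooth the transition: the second
    clip's root position/velocity will jump at the boundary.
    """
    assert motion_a.shape[1] == motion_b.shape[1], "feature dims must match"
    return np.concatenate([motion_a, motion_b], axis=0)

# Toy example: random features standing in for two real clips.
a = np.random.randn(40, 263)
b = np.random.randn(60, 263)
combined = concat_motions(a, b)
print(combined.shape)  # (100, 263)
```

For same-motion repetition ("jumping" twice), passing the same array for both arguments works the same way, but the seam discontinuity is usually what needs attention.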
-
Hi all,
I have compiled and installed EzGraver on my Raspberry Pi 3 with Raspbian (same instructions as for Ubuntu) and am just one step away from engraving.
I have been able to connect to my DK-8-KZ…
-
I know GRBL does not support these right now, but I'm hoping someone here might know the answer. I'm trying to understand what motions are supported by the gcode specification in conjunction with the …
-
From the README:
> This is an AI designed keyboard layout that was built within the **keyboard-gentics** project.
Too bad an AI didn't write that README; it probably wouldn't make typos like that (…
-
**Is your feature request related to a problem? Please describe.**
I regularly use lazygit in neovim. Most of the time, I want to see the status of the currently open file, but opening lazygit highlig…