-
**Is your feature request related to a problem? Please describe.**
I often perform repetitive tasks, and to get the best results, I have to craft a descriptive prompt for the first generation. For …
-
Thank you for making code publicly available.
Here is the fix in the code for multi-prompt video generation:
1. Add the following line to the argparse section:
parser.add_argument("--multipro…
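The flag name in the snippet above is truncated, so the following is only a hypothetical sketch of what such an argparse addition might look like (the name `--multi_prompt` and the `nargs` choice are assumptions, not the author's actual code):

```python
import argparse

# Hypothetical sketch: the real flag name is truncated in the original
# comment, so "--multi_prompt" here is an illustrative stand-in.
parser = argparse.ArgumentParser(description="multi-prompt video generation")
parser.add_argument(
    "--multi_prompt",
    nargs="+",      # accept one prompt per generated segment
    default=None,
    help="list of prompts, one per video segment",
)

args = parser.parse_args(["--multi_prompt", "a sunrise", "a sunset"])
print(args.multi_prompt)  # ['a sunrise', 'a sunset']
```

With `nargs="+"`, every whitespace-separated (or shell-quoted) value after the flag is collected into a single list, which the generation loop could then iterate over, one prompt per segment.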
-
- [ ] [Prompt engineering - OpenAI API](https://platform.openai.com/docs/guides/prompt-engineering/strategy-write-clear-instructions)
# Prompt Engineering - OpenAI API
## Six strategies for getting …
-
`llama3:80b` is not able to consistently validate a valid plan.
-
One possible issue, which I'm not sure you can actually fix: with the iframe for the chat, I can't see the button to accept/reject the cookies popup.
Maybe if the iframe/chat w…
-
Hey, I've got another idea: please add a 150-token limit. I noticed it caps at 77, but it would be nice if we could write longer prompts. Also, please add a negative-prompt feature for LCM mode.
-
![image](https://github.com/woocommerce/woocommerce-ios/assets/524475/e95770db-17a1-4ac0-80d3-0725438dbeae)
1. Send the user-entered keywords to AI and display guidance text.
- Define and refi…
-
The prompts for Llama 2 have not been provided in `prompts/adv_prompts`, so running `load_adv_prompt` doesn't work when using Llama 2. Could these be added, please? Thanks!
-
### Text
```markdown
bring a crawling example in js.
```
### Prompt
help by showing code snippets
### Submission Privacy
- [X] I know that my issue submission content is visible to others
lnxpy updated
3 weeks ago
-
Currently, prompts can only be static. Adding syntax that evaluates an elisp statement or shell command would allow dynamic prompting.
### Potential Use Cases
#### Data extraction from local m…
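The proposal above could be sketched roughly as follows. This is not the package's actual syntax, only an illustration of the idea in Python: placeholders of the form `$(command)` in a prompt template are replaced with the output of running that shell command.

```python
import re
import subprocess

# Sketch of the proposed dynamic-prompt idea (assumed $(command) syntax):
# expand each $(command) placeholder by running it in a shell and splicing
# in its trimmed stdout.
def expand_prompt(template: str) -> str:
    def run(match: re.Match) -> str:
        result = subprocess.run(
            match.group(1), shell=True, capture_output=True, text=True
        )
        return result.stdout.strip()

    return re.sub(r"\$\((.+?)\)", run, template)

print(expand_prompt("Summarize these files: $(echo a.txt b.txt)"))
```

An elisp variant would work the same way, evaluating the embedded form at prompt-expansion time instead of shelling out.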