Closed 0ptim closed 1 year ago
Because we still want to support a REST endpoint, we need to find a solution to this problem.
Read more details in Part 1: #47
Since LLM text generation has recently become much faster and will probably improve even more in the near future, I will not fix this until it becomes a more pressing issue.