Open mattmcalister opened 1 year ago
The context limit is determined by the model. If a user is premium, they can use a model with a higher limit, like Claude Instant.
Claude should be able to support articles like this (7k words), but I'm getting the "This conversation is too long" error.
If you're seeing that error, then the actual Claude API is returning it. We could truncate this article by lowering our page-length character estimate; it must be set too high.
Consider increasing the limit from 2k to 4k.
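The truncation approach discussed above could be sketched roughly as follows. This is a minimal illustration, not the project's actual code: the function name, the character budget, and the word-boundary heuristic are all assumptions.

```python
def truncate_for_context(text: str, char_limit: int = 4000) -> str:
    """Trim page text to an estimated character budget before sending it
    to the model, so the API doesn't reject the request as too long.

    char_limit is a rough page-length estimate (the 2k/4k figure from the
    discussion), not an exact token count.
    """
    if len(text) <= char_limit:
        return text
    cut = text[:char_limit]
    # Avoid cutting mid-word: back up to the last whitespace if possible.
    last_space = cut.rfind(" ")
    if last_space > 0:
        cut = cut[:last_space]
    return cut
```

A character estimate is only a proxy for the model's real token limit, so the budget has to be set conservatively; setting it too high is exactly what produces the "conversation too long" error described above.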