BoxcarsAI / boxcars

Building applications with composability using Boxcars with LLMs. Inspired by LangChain.

Using the SQL model: "This model's maximum context length is 4097 tokens" #88

Closed eltoob closed 1 year ago

eltoob commented 1 year ago

Hi, I'm hitting this limit with a SQL database. Any tips or tricks to get around it?

eltoob commented 1 year ago

Would this be a good use case for vectorization?

eltoob commented 1 year ago

I guess one other option here would be to simply add another "shot": give it the list of tables and ask, "Based on this list of tables, which ones are relevant to this SQL request?"
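For illustration, a rough sketch of that extra "shot" could look like the following. This is only a sketch: the `Boxcars::Openai` engine class, its `run` call, the prompt wording, and the variable names are assumptions on my part (not from this issue), and it assumes a Rails/ActiveRecord app.

```ruby
require "boxcars"

# Sketch: one extra LLM call that sends only the table names (no schemas)
# and asks which tables matter for the question.
engine   = Boxcars::Openai.new                   # assumed engine class / usage
tables   = ActiveRecord::Base.connection.tables  # all table names, no schemas
question = "How many orders did each user place last month?"

relevant = engine.run(<<~PROMPT)
  Here are the tables in my database: #{tables.join(', ')}.
  Based on this list, which tables are relevant to answering: "#{question}"?
  Reply with a comma-separated list of table names only.
PROMPT
# `relevant` can then be parsed and used to build the SQL boxcar with only
# those tables, so the full schema never has to fit in one prompt.
```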

francis commented 1 year ago

@eltoob - you can limit the tables when you create your SQL boxcar. The parameter is tables:, and you would give it a subset of the tables in your database. That might be enough to get you under the 4k token limit. Otherwise, we could go the vector route: first find the appropriate tables, and then try to create the query.
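A minimal sketch of that, assuming an ActiveRecord-backed app where the SQL boxcar picks up the default connection; the class name `Boxcars::SQL`, the table names, and the question are placeholders, while tables: is the parameter named above.

```ruby
require "boxcars"

# Only the schemas of the listed tables are included in the prompt, which is
# what keeps the request under the model's 4097-token context limit.
boxcar = Boxcars::SQL.new(tables: %w[users orders products])
puts boxcar.run("How many orders did each user place last month?")
```

If no single subset covers every question, the vector route would be the fallback: embed the table descriptions, retrieve the closest ones per question, and pass that subset as tables:.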

francis commented 1 year ago

Let me know if you run into any trouble here, @eltoob.