scraperwiki / code-scraper-in-browser-tool

Just like on ScraperWiki Classic; now a part of QuickCode.
https://quickcode.io

Is there a maximum (or maximum advisable) number of tables? #142

Closed philipnye closed 9 years ago

philipnye commented 9 years ago

My scraper runs each day, and generates around 300 rows of data, which I want to save.

I think I want to save each day's output into a new table. But before I implement this, I'm wondering if I'm likely to run up against a hard limit on the number of tables - or else a point at which my scraper becomes effectively unusable. Alternatives would seem to be saving all of the data into one table, with run date recorded against each entry, or else being a bit stricter about which data I really need to be retaining.
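The single-table alternative can be sketched with Python's built-in sqlite3 module. This is a minimal illustration, not the actual scraper: the table name, column names, and data are all invented for the example, and a real scraper would write to a file rather than an in-memory database.

```python
import sqlite3
from datetime import date

# Illustrative only; a real scraper would connect to a file on disk.
conn = sqlite3.connect(":memory:")

# One table for all runs, with the run date recorded against each entry.
conn.execute(
    "CREATE TABLE IF NOT EXISTS daily_data ("
    "run_date TEXT, item TEXT, value INTEGER)"
)

# Each daily run appends its ~300 rows, tagged with that day's date.
rows = [(date(2015, 6, 1).isoformat(), f"item-{i}", i) for i in range(300)]
conn.executemany("INSERT INTO daily_data VALUES (?, ?, ?)", rows)
conn.commit()

# A single day's output is then just a filter on run_date.
count = conn.execute(
    "SELECT COUNT(*) FROM daily_data WHERE run_date = '2015-06-01'"
).fetchone()[0]
print(count)  # 300
```

This keeps the schema fixed no matter how many days accumulate, at the cost of one table growing indefinitely.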

Thanks, Philip Nye

frabcus commented 9 years ago

I don't think there's a hard limit!

SQLite itself has a limit in the billions: https://sqlite.org/limits.html

Although it may slow down:

Whenever a database is opened, the entire schema is scanned and parsed and a parse tree for the schema is held in memory. That means that database connection startup time and initial memory usage is proportional to the size of the schema.
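The schema-scan cost described above can be demonstrated with Python's stdlib sqlite3: every table created adds a row to sqlite_master, all of which is re-read when a fresh connection opens the file. The table count and naming scheme here are illustrative, assuming one table per daily run.

```python
import os
import sqlite3
import tempfile

# Build a database with one table per day, as in the many-tables design.
path = os.path.join(tempfile.mkdtemp(), "scraper.sqlite")
conn = sqlite3.connect(path)
for day in range(365):
    conn.execute(f"CREATE TABLE data_day_{day} (item TEXT, value INTEGER)")
conn.commit()
conn.close()

# Every fresh connection re-parses the whole schema held in sqlite_master,
# so connection startup work grows with the number of tables.
conn = sqlite3.connect(path)
n_tables = conn.execute(
    "SELECT COUNT(*) FROM sqlite_master WHERE type = 'table'"
).fetchone()[0]
print(n_tables)  # 365
conn.close()
```

At a few hundred tables this is unlikely to be noticeable, but the per-connection cost scales with the schema, which is why a single table with a run-date column tends to age better.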

philipnye commented 9 years ago

Thanks for the informative and super speedy response @frabcus!