## Convert a bundle to jsonl and upload to DuckDB for superfast querying
The main benefit of this approach is that the ingestion doesn't get wrecked by log format changes. It simply ingests *everyt…
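A minimal sketch of the idea, with made-up record shapes (the actual bundle fields are assumptions): each record is emitted as one self-describing JSON object per line, so a format change adds a column at query time instead of breaking the loader.

```python
import json

# Hypothetical sample records from a bundle; field sets drift across
# log-format versions (names here are made up for illustration).
records = [
    {"ts": "2024-01-01T00:00:00Z", "event": "fill", "qty": 10},
    {"ts": "2024-01-02T00:00:00Z", "event": "fill", "qty": 5, "venue": "X"},
]

# JSONL: one self-describing JSON object per line, so a new field in a
# later log version simply appears as a new column on ingestion.
jsonl = "\n".join(json.dumps(rec) for rec in records)

# DuckDB can then ingest the resulting file directly, inferring a
# union schema across all lines:
#   SELECT * FROM read_json_auto('bundle.jsonl');
```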
-
The activity quantity and price are coded as f64, which creates an issue with fractional shares.
-
Users often ask about the limitations of KDF in handling large dataframes.
The User Guide should contain some recommendations and code snippets to improve the user path here:
- some benchmarks on real-w…
-
CSV is useful as a human-readable, easily-transportable file format and serves as a good default. Other formats (HDF5, SQL, numpy's format when using `np.save`, etc) also have their advantages and may…
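One concrete limitation worth showing alongside that comparison: CSV carries no type information, so every value round-trips as text and the reader must re-parse types itself. A small stdlib sketch:

```python
import csv
import io

# Write a tiny table to CSV: human-readable and portable, but untyped.
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["symbol", "price"])
writer.writerow(["AAPL", 187.5])

# Read it back: the float comes back as the string "187.5".
buf.seek(0)
header, row = list(csv.reader(buf))
price = float(row[1])  # caller must restore the type explicitly
```

Binary formats (HDF5, `np.save`, Parquet) preserve dtypes, which is one of the advantages hinted at above.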
-
We want to develop a new feature to import raw data, such as Excel, CSV, or maybe Parquet, into PostgreSQL. Here is the design that I think matches this project (if you have another approach, let me know). …
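One common shape for such a feature, sketched with hypothetical table and column names: normalize every source (Excel/CSV/Parquet) into a CSV stream, then bulk-load it with PostgreSQL's `COPY ... FROM STDIN`, which is much faster than row-by-row `INSERT`s. Only the statement construction is shown; actually streaming the data requires a driver such as psycopg.

```python
def copy_statement(table: str, columns: list[str]) -> str:
    """Build a COPY ... FROM STDIN statement for a CSV stream with a header.

    Table and column names here are illustrative, not from the project.
    """
    cols = ", ".join(columns)
    return f"COPY {table} ({cols}) FROM STDIN WITH (FORMAT csv, HEADER true)"

sql = copy_statement("raw_imports", ["symbol", "price", "qty"])
# A driver such as psycopg would then feed the normalized CSV into this COPY.
```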
-
Not sure if we want to run this using an SQL report or through the API. SQL would be much faster, but any updates would need to go through the API on production.
-
This year's Lahman data update has been released, which means it's time to update this package accordingly.
After an initial look at the CSV data files, it looks like we may have some import changes …
-
The function needs feedback from the console to know whether it is working or not:
`do ##class(bdb.sql.Dump).Dump("BI_Study.City", "/tmp/table.csv")`
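The real `bdb.sql.Dump` is ObjectScript, so this Python sketch is only an illustration of the requested behavior: the dump routine reports a row count to the console and returns a status instead of finishing silently.

```python
import csv
import io
import sys

def dump_table(rows, out, log=sys.stderr):
    """Write rows as CSV and report progress (illustrative API, not bdb's)."""
    writer = csv.writer(out)
    count = 0
    for row in rows:
        writer.writerow(row)
        count += 1
    # Console feedback: the caller can see the dump actually did something.
    print(f"dumped {count} row(s)", file=log)
    return count

buf = io.StringIO()
n = dump_table([("Boston",), ("Paris",)], buf)
```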
-
H2 had a new release in February 2019 with a long [changelog](http://www.h2database.com/html/changelog.html) - besides a lot of bugfixes, it introduced support for Geometry (still based on JTS) and `I…