Closed jhatcher1 closed 1 month ago
Thanks for reporting. Those are hard to debug because I can't reproduce.
If you run with the env var SLING_PROCESS_BW=false, I think it should work. It's erroring when sling tries to determine how many bytes have been written (for some specific row).
Closing this. Feel free to re-open.
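For reference, the workaround can be applied to the reported command by setting the variable for a single invocation. This is a minimal sketch, assuming a POSIX shell; the connection name, stream, and target path are taken from the report below and should be adjusted for your setup:

```shell
# Disable byte-written tracking (the suggested workaround) for one run only,
# without exporting the variable into the rest of the shell session.
SLING_PROCESS_BW=false sling run \
  --src-conn TRINO \
  --src-stream "myschema.mytable" \
  --tgt-object "file:///tmp/mytable.parquet"
```

Prefixing the command this way scopes the variable to that single sling process, so other runs in the same shell are unaffected.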
Running sling with SLING_PROCESS_BW=false worked for me. If I can figure out how to reproduce the issue, I'll reopen this with more info. Thanks!
Hello! I'm a new user of sling, thanks for the project!
Issue Description
Description of the issue: When running a command like:
sling run --src-conn TRINO --src-stream "myschema.mytable" --tgt-object "file:///tmp/mytable.parquet"
I am getting a panic. The stack trace is quite big, so I've attached it in a file here: sling_panic.txt
I'm not seeing this issue for all tables I try to export to parquet, but for the tables where I do see it, I encounter it consistently. I'm able to export these same tables to CSV without any issue. The affected tables aren't very complicated: they have a bigint column, ~7 integer columns, and ~30 boolean columns.
Sling version (sling --version): v1.2.9
Operating System (linux, mac, windows): Mac (Arm)
Replication Configuration:
Log Output (please run command with -d):