JuliaData / Feather.jl

Read and write feather files in pure Julia
https://juliadata.github.io/Feather.jl/stable

Added Compatibility with copycols keyword in DataFrames #119

Closed jacobadenbaum closed 5 years ago

jacobadenbaum commented 5 years ago

Adjusts for a breaking change in the DataFrames package that causes Feather files to be materialized on every read.

In this PR, Feather.read sets copycols to !use_mmap.
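
For clarity, a minimal sketch of how the keyword interaction looks from the caller's side (the file name is hypothetical; use_mmap is the existing keyword this PR ties copycols to):

```julia
using Feather, DataFrames

# With this change, copycols follows the mmap setting:
#   use_mmap = true  (default) -> copycols = false: columns stay backed by the mmap
#   use_mmap = false           -> copycols = true:  columns are materialized copies
df_mmap = Feather.read("data.feather")                  # no copy on read
df_copy = Feather.read("data.feather"; use_mmap=false)  # materialized DataFrame
```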

ExpandingMan commented 5 years ago

Thanks a lot!

Looks like the test fails on the round trip.

Honestly, the docker in the tests is causing huge problems right now. When it fails we generally can't see what happened, only one person can update the docker image itself, and, most importantly for me, it has #113 in limbo; without that, to be honest, this package is basically unusable for me right now.

@quinnj , is there anything else we can do to test the round trip without resorting to this docker image? Something with PyCall maybe?

quinnj commented 5 years ago

Just comment out the docker tests for now? They're really just in there as a fancy way to do integration tests w/ python, but I don't think they're necessarily covering anything we don't already cover in tests.

ExpandingMan commented 5 years ago

I'll admit it makes me pretty damn nervous to just go ahead when the round-trip test is failing for whatever reason, but it sure is making life pretty miserable for this package right now.

Let me go back into #113, remove the docker test, and implement some optional tests that just run Python from bash in the same environment the package is in (something like the sketch below). We won't be able to run them on Travis, which is scary, but at least anyone can run the optional tests locally with ease. We're stuck doing stuff like that on all the optimization wrapper packages anyway, because the optimization binaries are proprietary and we can't run them on Travis.

That way I can go back to using this package again.
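
A minimal sketch of what such an optional, non-Travis round-trip check could look like, assuming a `python` with pandas feather support is available on the PATH of the same environment; the file name and test layout here are hypothetical, not the actual tests from #113:

```julia
# test/optional_roundtrip.jl -- run manually, not on CI
using Feather, DataFrames, Test

df = DataFrame(a = 1:3, b = ["x", "y", "z"])
path = joinpath(mktempdir(), "roundtrip.feather")
Feather.write(path, df)

# Re-write the file from Python, then read it back in Julia.
script = """
import pandas as pd
pd.read_feather(r"$path").to_feather(r"$path")
"""
run(`python -c $script`)

@test Feather.read(path) == df
```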

ExpandingMan commented 5 years ago

This change was incorporated in #113, which has now been merged. We will be tagging soon. Thank you, everyone, for your help.