Closed: dbast closed this pull request 5 years ago
Hi! This is the friendly automated conda-forge-linting service.
I just wanted to let you know that I linted all conda-recipes in your PR (recipe) and found it was in an excellent condition.
Hm... why does CircleCI not get triggered to build this (as it does in my forked repo)?
Hah... successfully built by CircleCI :)
Tested this locally... the pyspark shell is working... compared the old vs. the noarch tarball... looks good.
Will merge this soon. Any comments welcome :)
Hm... this needs to be double-checked... the noarch checklist in https://github.com/conda-forge/pyspark-feedstock/pull/11 requires that the scripts argument in setup.py is not used
... but pyspark's setup.py seems to use it heavily... That could be a problem on Windows... Something to test, or just revert the noarch commit.
Built a noarch package on osx and tested it on windows... the scripts seem to work... so this looks promising... A noarch package would be quite beneficial, as currently every upload = 9 CI builds = 9 x 200MB = 1.8GB.
pyarrow should probably stay optional ...
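One way to keep pyarrow optional (a sketch, not the actual feedstock contents; the listed dependencies are illustrative) is to place it only under the test requirements in meta.yaml, so the package is exercised against pyarrow in CI without making it a hard runtime dependency:

```yaml
requirements:
  run:
    - python
    - py4j        # illustrative; the real run deps come from the recipe
test:
  requires:
    - pyarrow     # exercised in the test phase, but not pulled in at install time
  imports:
    - pyspark
```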
To not delay spark 2.4.0 any further, merging https://github.com/conda-forge/pyspark-feedstock/pull/14 first... After that this can be reworked.
Closing this in favor of https://github.com/conda-forge/pyspark-feedstock/pull/18
Using
conda skeleton pypi pyspark --extra-specs pypandoc --version 2.3.2 --all-extras
to fully update the recipe. This also adds pyarrow as a runtime requirement. The integration of Apache Arrow is one of the great Spark 2.3 features for Python.
Noarch is possible now with the pypi package. I also tried it with the existing direct download from the Apache mirror, which did not work.
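A minimal sketch of what the noarch switch could look like in meta.yaml (the URL pattern follows the usual conda-forge PyPI template; the sha256 placeholder and the build script line are assumptions, not the feedstock's exact contents):

```yaml
{% set version = "2.3.2" %}

source:
  url: https://pypi.io/packages/source/p/pyspark/pyspark-{{ version }}.tar.gz
  sha256: ...   # placeholder: take the real hash from PyPI

build:
  noarch: python
  number: 0
  script: python -m pip install . --no-deps -vv
```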
Checklist
- Reset the build number to 0 (if the version changed)
- Re-rendered with the latest conda-smithy (Use the phrase `@conda-forge-admin, please rerender` in a comment in this PR for automated rerendering)