francescocamussoni opened 1 year ago
I had the same error and solved it by installing `sagemaker-feature-store-pyspark-3.1` like this:

`python -m pip install sagemaker-feature-store-pyspark-3.1 --no-binary :all: --no-cache-dir`

Previously it wasn't working because I had installed it through a requirements.txt file that didn't include the `--no-binary :all: --no-cache-dir` part.
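In case it helps anyone who wants to keep the install in a requirements file: pip requirements files accept `--no-binary` as a global option line, so the source build can be forced without a separate install command. A sketch, assuming the same 3.1 connector version as above:

```
# requirements.txt
# Force a source build for the connector so its jars get built/bundled,
# instead of pip reusing a cached wheel.
--no-binary sagemaker-feature-store-pyspark-3.1
sagemaker-feature-store-pyspark-3.1
```

Then install with `python -m pip install -r requirements.txt --no-cache-dir`.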
Hello, I have code that is pretty similar to the example provided by AWS: https://docs.aws.amazon.com/sagemaker/latest/dg/batch-ingestion-spark-connector-setup.html
This is my java version:

java version "1.8.0_231"
Java(TM) SE Runtime Environment (build 1.8.0_231-b11)
Java HotSpot(TM) 64-Bit Server VM (build 25.231-b11, mixed mode)
But I get this error:
I've also tried specifying the jars that I have locally:
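For reference, the AWS setup page linked above wires the connector's jars into the Spark session via the package's `classpath_jars()` helper rather than hand-written local paths; a sketch along those lines (the builder options are illustrative, and this assumes the pip package installed correctly):

```python
import feature_store_pyspark
from pyspark.sql import SparkSession

# Collect the jar paths shipped with the pip package and hand them to Spark.
extra_jars = ",".join(feature_store_pyspark.classpath_jars())

spark = (
    SparkSession.builder
    .config("spark.jars", extra_jars)  # connector jars from the pip package
    .getOrCreate()
)
```

If `classpath_jars()` returns an empty list, the jars were likely never built, which points back at the `--no-binary :all:` workaround.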
But I get the same final error:
Where I've replaced the real ARN with a placeholder for this post :)