Open · alexisdrakopoulos opened this issue 2 weeks ago
Thank you, @alexisdrakopoulos, for reporting this issue!
I tried to reproduce the issue: I created conf/base/spark.yml and set the following in settings.py:

```python
CONFIG_LOADER_ARGS = {
    "base_env": "base",
    "default_run_env": "local",
    "config_patterns": {
        "spark": ["spark*/"],
    },
}
```

and it seems to be working well; at least it can find conf/base/spark.yml.
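For reference, this is roughly the hook I used to read the config back (a minimal sketch following the SparkHooks example on the docs page; the subscript lookup is the new-style API):

```python
from kedro.framework.hooks import hook_impl
from pyspark import SparkConf
from pyspark.sql import SparkSession


class SparkHooks:
    @hook_impl
    def after_context_created(self, context) -> None:
        """Initialise a SparkSession using the config in conf/<env>/spark.yml."""
        # New-style access: subscript lookup instead of config_loader.get(...)
        parameters = context.config_loader["spark"]
        spark_conf = SparkConf().setAll(parameters.items())

        # Build the session from the loaded config.
        spark_session = (
            SparkSession.builder.appName(context.project_path.name)
            .config(conf=spark_conf)
            .getOrCreate()
        )
        spark_session.sparkContext.setLogLevel("WARN")
```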
So, for me, it looks like this line in the docs might no longer be relevant: `parameters = context.config_loader.get("spark*", "spark*/**")`
We will double-check and get back to you.
Description
I think the stable doc here is out of date: https://docs.kedro.org/en/stable/integrations/pyspark_integration.html
Specifically:

```python
parameters = context.config_loader.get("spark*", "spark*/**")
```

needs to be updated to the new method.
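Presumably it should now read something like this (a sketch, assuming the subscript-style access that OmegaConfigLoader supports):

```python
parameters = context.config_loader["spark"]
```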
I am mentioning this because I tried

```python
config_loader["spark"]
```

but it couldn't find conf/base/spark.yml for some reason, so I moved it to conf/databricks/spark.yml and now it finds it.
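For anyone trying to reproduce this, one way to isolate the lookup from the rest of the project is to instantiate the loader directly (a minimal sketch, assuming Kedro's OmegaConfigLoader; the pattern here is the docs' suggested one, not the exact one I used):

```python
from kedro.config import OmegaConfigLoader

# Stand-alone loader pointed at the project's conf/ directory.
loader = OmegaConfigLoader(
    conf_source="conf",
    base_env="base",
    default_run_env="local",
    config_patterns={"spark": ["spark*", "spark*/**"]},
)

# Should resolve conf/base/spark.yml if the pattern matches.
print(loader["spark"])
```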
Documentation page (if applicable)
https://docs.kedro.org/en/stable/integrations/pyspark_integration.html
Context