I used SAC (Spark Atlas Connector) for a select -> insert statement in Spark, like:
// source_path is a folder containing many parquet files
val df = spark.read.parquet(source_path)
// target_path is also a folder
df.write.parquet(target_path)
As a result, in Atlas I get tons of hdfs_path entities, one for each file inside "source_path" (which is a folder), but just one entity for "target_path" (the folder itself).
Is this normal behaviour?
Spark version 2.4.4
Scala version 2.11.12
Atlas: Version : 1.1.0.3.1.0.0-78
SAC: spark-atlas-connector-assembly-0.1.0-SNAPSHOT.jar