**Closed** — dbalexp closed this issue 1 year ago
Typo in the example code for Step 4: Configure Auto Loader to ingest data to Unity Catalog
```python
# Import functions
from pyspark.sql.functions import input_file_name, current_timestamp

# Configure Auto Loader to ingest JSON data to a Delta table
(spark.readStream
  .format("cloudFiles")
  .option("cloudFiles.format", "json")
  .option("cloudFiles.schemaLocation", checkpoint_path)
  .load(file_path)
  .select("*", input_file_name().alias("source_file"), current_timestamp().alias("processing_time"))
  .writeStream
  .option("checkpointLocation", checkpoint_path)
  .trigger(availableNow=True)
  .option("mergeSchema", "true")
  .toTable(table))
```
It should be this instead:
```python
# Import functions
from pyspark.sql.functions import input_file_name, current_timestamp

# Configure Auto Loader to ingest JSON data to a Delta table
(spark.readStream
  .format("cloudFiles")
  .option("cloudFiles.format", "json")
  .option("cloudFiles.schemaLocation", checkpoint_path)
  .load(source)
  .select("*", input_file_name().alias("source_file"), current_timestamp().alias("processing_time"))
  .writeStream
  .option("checkpointLocation", checkpoint_path)
  .trigger(availableNow=True)
  .option("mergeSchema", "true")
  .toTable(table))
```
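For context, the snippet assumes that `source`, `checkpoint_path`, and `table` are already defined by earlier steps of the tutorial. A minimal sketch of how they might be set up, where the catalog, schema, volume, and table names are hypothetical placeholders rather than values from the tutorial:

```python
# Hypothetical placeholder values -- substitute your own catalog, schema, and volume.
catalog = "main"
schema = "default"
volume = "my_volume"

# Path to the raw JSON files that Auto Loader will ingest.
source = f"/Volumes/{catalog}/{schema}/{volume}"

# Checkpoint location, used here for both schema tracking and stream progress.
checkpoint_path = f"/Volumes/{catalog}/{schema}/{volume}/_checkpoints"

# Fully qualified three-level Unity Catalog name of the target Delta table.
table = f"{catalog}.{schema}.my_ingest_table"
```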
⚠ Do not edit this section. It is required for learn.microsoft.com ➟ GitHub issue linking.
@alexphu Thanks for your feedback! We will investigate and update as appropriate.
@alexphu
Thanks for reporting this! We have created a PR for this issue and the changes should go live soon.