Closed tusharchou closed 1 month ago
After you have the data from BigQuery and an Iceberg table ready, you can insert the dataset into Iceberg for storage.
Load the Iceberg table
# "catalog" is the PyIceberg catalog object created earlier in this setup
table = catalog.load_table("near.transactions")
Write data into Iceberg (converting the Pandas DataFrame to a PyArrow table, which PyIceberg's write API expects)

import pyarrow as pa

arrow_table = pa.Table.from_pandas(transactions_df)

Append the data to the Iceberg table (PyIceberg exposes Table.append; there is no new_append in the Python API, and it cannot take a Spark DataFrame)

table.append(arrow_table)