ConcurrentAppendException while writing a DataFrame to the Feature Store


I'm trying to write a Spark (PySpark) DataFrame to the Databricks Feature Store:

    from databricks.feature_store import FeatureStoreClient

    fs = FeatureStoreClient()

    try:
        # Check whether the feature table already exists.
        fs.get_table(fs_name)
    except ValueError:
        # The table does not exist yet: create it from the DataFrame.
        fs.create_table(
            name=fs_name,
            primary_keys=pri_key_cols,
            df=df,
            timestamp_keys=timestamp_cols,
            description=table_description,
            tags=tags_dict,
        )
    else:
        # The table already exists: overwrite it with the new data.
        fs.write_table(name=fs_name, mode="overwrite", df=df)

This Databricks notebook is attached to an ADF (Azure Data Factory) pipeline. When the pipeline runs, it sometimes fails with the following exception:

ConcurrentAppendException: Files were added to the root of the table by a concurrent update. Please try the operation again.
Conflicting commit:
0
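
From the message it seems another job committed to the same Delta table while this write was in progress (the ADF pipeline may be triggering overlapping runs). Would wrapping the write in a simple retry be a reasonable workaround? Here is a minimal sketch of what I have in mind, where MAX_RETRIES and the back-off delay are placeholder values of mine:

    import time

    MAX_RETRIES = 3  # placeholder retry budget

    for attempt in range(MAX_RETRIES):
        try:
            fs.write_table(name=fs_name, mode="overwrite", df=df)
            break
        except Exception as e:
            # Retry only on the Delta concurrency conflict; re-raise anything else.
            if "ConcurrentAppendException" in str(e) and attempt < MAX_RETRIES - 1:
                time.sleep(30 * (attempt + 1))  # back off before the next attempt
            else:
                raise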
