from pyspark.sql import SparkSession, Row
from datetime import date
spark = SparkSession.builder.getOrCreate()
tempDf = spark.createDataFrame([
    Row(date=date(2022, 1, 22), average=40.12),
    Row(date=date(2022, 1, 23), average=41.32),
    Row(date=date(2022, 1, 24), average=44.23),
    Row(date=date(2022, 1, 26), average=45.34),
    Row(date=date(2022, 2, 7), average=32.56),
    Row(date=date(2022, 2, 10), average=43.78),
    Row(date=date(2022, 2, 12), average=37.89)
])
%sql CREATE DATABASE IF NOT EXISTS feature_store
from databricks import feature_store
fs = feature_store.FeatureStoreClient()
fs.create_feature_table(
    name="feature_store.uk_avg_temperature_feature",
    keys=["date"],
    features_df=tempDf,
    description="UK Temperature Features"
)
I'm getting the following error while trying to create the feature table:
2022/03/04 12:02:40 ERROR databricks.feature_store.utils.rest_utils:
API request to https://community.cloud.databricks.com/api/2.0/feature-store/feature-tables/get
failed with code 503 != 200, retrying up to 2 more times.
API response body:
{"error_code":"TEMPORARILY_UNAVAILABLE","message":"The service at /api/2.0/feature-store/feature-tables/get is temporarily unavailable. Please try again later."}
Note: I'm using Databricks Community Edition.
I think Databricks Community Edition can't handle Feature Store functionality; the Feature Store icon doesn't even appear in the side menu.
Let me know if you find a workaround!