Spark: The purpose of saving an ALS model


I'm trying to understand the purpose of storing an ALS model, and what a use case for the stored model would be.

I have a dataset with over 300M rows, and I'm using a Hadoop cluster and Spark to calculate recommendations with the ALS algorithm. The whole computation takes around 5 hours, and I'm wondering what the use case would be for storing my model and using it, for example, the next day... and I don't see any. So either I'm doing something wrong (which is possible, given that I'm a beginner in the ML world), or the ability to save Spark's ALS model to disk is not very helpful.

Right now, I use it as follows:

from pyspark.ml.recommendation import ALS

# Load the full ratings dataset (over 300M rows).
df_input = spark.read.format("avro").load(PATH, schema=SCHEMA)
als = ALS(maxIter=12, regParam=0.05, rank=15, userCol="user", itemCol="item", ratingCol="rating", coldStartStrategy="drop")

# Fit on the entire dataset, then compute the top 10 items per user.
model = als.fit(df_input)
df_recommendations = model.recommendForAllUsers(10)
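
If the goal is simply to reuse the results rather than the model, one option is to persist the computed recommendations themselves; a minimal sketch, where "recs_path" is a hypothetical output location:

# Persist the precomputed top-10 lists so downstream jobs can read them
# without refitting the model.
df_recommendations.write.mode("overwrite").parquet("recs_path")

# Any later job (e.g., the next day) can serve them with a cheap read:
df_recs = spark.read.parquet("recs_path")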

As I mentioned, df_input is a DataFrame containing over 300M rows. The total calculation time is around 5 hours, and afterwards I get the 10 recommended items for each user in the dataset.

In many tutorials and books there is an example of training the model and validating it against test data, something like:

from pyspark.ml.recommendation import ALS, ALSModel

df_input = spark.read.format("avro").load(PATH, schema=SCHEMA)
(training, test) = df_input.randomSplit(weights=[0.7, 0.3])
als = ALS(maxIter=12, regParam=0.05, rank=15, userCol="user", itemCol="item", ratingCol="rating", coldStartStrategy="drop")

# The ml API uses fit(), not train().
model = als.fit(training)

model.write().save("saved_model")
...
model = ALSModel.load('saved_model')

predictions = model.transform(test)  # or df_input, to get predictions for each user

I don't see any pros of using it this way, but I do see several cons:

- You don't use 30% of the data to train the model.
- As far as I know, there is no way to use an ALS model online (in real time), at least not without an external package/library.
- You can't incrementally update this model.
- You can't use it for newly registered users, because they don't exist in the stored matrix factorization, so there won't be any recommendations for them.
- All you can do is check what the prediction would be for a given user-item pair, which is basically the same thing returned by the first code example (the one using fit()); see the sketch below.
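
For illustration, checking predictions for given user-item pairs with a loaded model would look roughly like this (a sketch; the user/item ids are made up, and the column names match the ALS setup above):

from pyspark.ml.recommendation import ALSModel

# Hypothetical (user, item) pairs to score with the stored model.
pairs = spark.createDataFrame([(1, 42), (1, 7)], ["user", "item"])
predictions = ALSModel.load("saved_model").transform(pairs)
predictions.show()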

What would be a reason to store this model on disk and load it when needed? When (i.e., under what conditions) should I consider storing the model and reusing it? Could you provide a use case?

1 Answer

Answered by Gokul Raam:

As you have stated, it may take 5 hours to fit a model. Suppose you have thousands of daily users; will you retrain the model every time one of them needs a recommendation? No...

You can save the trained model and use it to provide recommendations to those users.
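
A minimal sketch of that serving path, using Spark's recommendForUserSubset (available since Spark 2.3); the "active_users_path" location and its contents are assumptions:

from pyspark.ml.recommendation import ALSModel

# Load yesterday's model instead of refitting for 5 hours.
model = ALSModel.load("saved_model")

# Recommend only for users who are active right now; the DataFrame
# must contain the userCol ("user") used when the model was trained.
active_users = spark.read.parquet("active_users_path")
df_recommendations = model.recommendForUserSubset(active_users, 10)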

Whenever new data is logged, you can design your system to retrain the model once a threshold is reached, for example 1,000 new log entries. (You can use Apache Kafka to stream data in real time.)
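
One way such a threshold-based retrain could be wired up, reusing df_input and als from the question; NEW_LOGS_PATH and the threshold are hypothetical:

# Read the logs that have accumulated since the last training run.
new_logs = spark.read.format("avro").load(NEW_LOGS_PATH, schema=SCHEMA)

# Retrain only once enough new data has arrived, then overwrite the stored model.
if new_logs.count() >= 1000:
    df_full = df_input.unionByName(new_logs)
    model = als.fit(df_full)
    model.write().overwrite().save("saved_model")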

If you want recommendations for new users, you can retrain the model after the new user has logged a specific number of actions (specific to your business use case).