Using Infer.NET models in an ASP.NET web service


I'm building an ASP.NET Web API 2 web service in Azure to give access to an Infer.NET naive Bayes model. There are two ways to start up the model: build it from scratch, or load its last-saved state. The latter is faster, but neither is near-instantaneous, while web service calls need to return quickly and remain near-stateless. The two ways I can imagine associating the running model with the Web API are to hold it in ASP.NET application state, or to host it in an Azure Worker Role and communicate between the web service methods and the worker role via queues.
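For reference, the application-state option could be sketched roughly as below: the model is held in a lazily-initialized static singleton so the expensive build/load happens once per app domain rather than per request. This is only a sketch under assumptions; `ModelWrapper`, `BuildFromScratch`, `LoadSavedState`, and `Predict` are hypothetical placeholders for the actual Infer.NET model code, not real Infer.NET APIs.

```csharp
using System;
using System.Web.Http;

// Hypothetical wrapper around the Infer.NET model; the names here are
// placeholders, not part of the Infer.NET API.
public static class ModelHost
{
    // Lazy<T> guarantees the factory runs exactly once, even under
    // concurrent first requests, so the slow startup cost is paid once.
    private static readonly Lazy<ModelWrapper> _model =
        new Lazy<ModelWrapper>(Create, isThreadSafe: true);

    public static ModelWrapper Model => _model.Value;

    private static ModelWrapper Create()
    {
        // Prefer the faster path (last-saved state) when it exists;
        // fall back to building the model from scratch.
        return SavedStateExists()
            ? ModelWrapper.LoadSavedState()
            : ModelWrapper.BuildFromScratch();
    }

    private static bool SavedStateExists()
    {
        // Placeholder: check blob storage or the file system.
        return false;
    }
}

// A Web API 2 controller can then use the shared instance on each call:
public class PredictionController : ApiController
{
    public IHttpActionResult Get(string input)
    {
        var result = ModelHost.Model.Predict(input);
        return Ok(result);
    }
}
```

One caveat with this approach: app-domain recycling in IIS/Azure App Service discards the singleton, so the startup cost recurs after each recycle, which is one argument for the Worker Role alternative.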

What is the best practice for associating a running Infer.Net model with a web service API?
