AI – Artificial Intelligence – is a trending topic that is becoming more and more popular in modern systems. AI can be applied in many domains such as medicine, banking, and entertainment. To seize these opportunities, we software engineers need to learn more about AI, even if it is not our expertise.

AI is not all about researching and creating models. Deploying models and integrating them with the other components of a system – model serving – is also an important aspect. Although mastering model serving requires a lot of learning, getting started is not so difficult, especially when cloud providers offer services that speed up the process.

In this article, I will introduce Azure Machine Learning and give a step-by-step guide on how to serve a custom model using Azure Machine Learning and integrate it with Azure Stream Analytics to create an extendable processing pipeline.

In my project, I need to deploy an AI model that detects sentiment (positive or negative) in social posts. My mission is to serve the model in an Azure Machine Learning workspace so I can publish an API for other components to consume and process further.

For this mission, I have to provision the following Azure service:

  • Azure Machine Learning

Azure Machine Learning is a cloud-based environment for AI engineers working on AI projects. As you can see once you have accessed an Azure ML workspace, Azure ML supports all kinds of AI-related work, including training, deploying, automating, managing, and tracking ML models.

Costs incurred in Azure Machine Learning cover the compute and storage resources you have used. You can find more information about pricing in this link.

My AI engineer partner handed me the following files:

  • svm.pkl
  • tfidf_vect.pkl
  • score.py: a Python file that consumes the above models to produce the final result.

(Actually, I had to convert my partner’s Jupyter notebook into the score.py file myself, since AI engineers often do their work in ipynb.)

My job is to deploy and serve the above models. You can accomplish this task using the Azure CLI or Python. In my case, I chose the latter option, so I created a notebook called deployment.ipynb:

Cell 1:

This cell tests the Azure ML SDK. To run cells, you have to create a compute instance – in this case a VM – and start it before you can execute anything.
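A minimal version of this sanity check might look like the following sketch (the exact cell contents are not shown in this article):

```python
# Sanity check: confirm the Azure ML SDK is installed and report its version
import azureml.core

print("Azure ML SDK version:", azureml.core.VERSION)
```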

Note that on Azure Machine Learning, you are charged for compute and storage resources. So remember to stop and start your compute instance according to your needs to save costs. You can see all created compute resources in the Compute menu.

In this article, we will create one compute instance to execute the deployment script and one inference cluster to serve the model as a web service.

Cell 2:

This cell imports the Workspace. You will be asked to log in to pass this step.
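For reference, the cell might be sketched as below, assuming the workspace’s config.json is available on the compute instance (Azure ML compute instances provide it by default):

```python
from azureml.core import Workspace

# Loads workspace details from config.json and triggers an interactive
# login the first time the cell is run
ws = Workspace.from_config()
print(ws.name, ws.resource_group, ws.location, sep="\n")
```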

Cell 3:

This cell registers the models in Azure Machine Learning. After you have executed it, you can open the Models menu to view the registered models and their versions.

Note that each time you execute the cell, a new model version is created.
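Assuming the workspace object `ws` from Cell 2, the registration cell might look like this sketch (the model names are my own choice, matching the pickle files):

```python
from azureml.core.model import Model

# Each execution registers a new version of the two artifacts
svm_model = Model.register(workspace=ws,
                           model_path="svm.pkl",
                           model_name="svm")
tfidf_model = Model.register(workspace=ws,
                             model_path="tfidf_vect.pkl",
                             model_name="tfidf_vect")
```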

Cell 4:

As the code explains, this cell creates the environment, including the conda packages, needed to execute the score.py script.
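As a sketch, the environment cell could look like the following; the environment name and the package list are assumptions based on what a scikit-learn model like this would typically need:

```python
from azureml.core import Environment
from azureml.core.conda_dependencies import CondaDependencies

# Conda environment with the packages score.py needs at inference time
env = Environment(name="sentiment-env")
env.python.conda_dependencies = CondaDependencies.create(
    conda_packages=["scikit-learn", "numpy"],
    pip_packages=["azureml-defaults", "inference-schema[numpy-support]"])
```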

Cell 5:

This cell deploys the models to the selected target, in this case AKS – Azure Kubernetes Service.

There are various “places” in Azure you can deploy models to, as described in detail in the Azure documentation. For each deployment target, you will have to modify the content of Cell 5 accordingly.

In my case, I chose AKS as the deployment target because:

  • AKS is used for real-time inference (suitable for my requirement)
  • It publishes the model as a web service (I need this feature to integrate with Stream Analytics in a later step)
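With AKS as the target, Cell 5 might be sketched as below. The cluster and service names are mine, and the snippet assumes the `ws`, `env`, and registered model objects created in the earlier cells:

```python
from azureml.core.compute import AksCompute, ComputeTarget
from azureml.core.model import InferenceConfig, Model
from azureml.core.webservice import AksWebservice

# Provision the inference cluster once (skip this part if it already exists)
aks_target = ComputeTarget.create(
    workspace=ws,
    name="sentiment-aks",
    provisioning_configuration=AksCompute.provisioning_configuration())
aks_target.wait_for_completion(show_output=True)

# Tie the entry script and environment together, then deploy to AKS
inference_config = InferenceConfig(entry_script="score.py", environment=env)
service = Model.deploy(
    workspace=ws,
    name="sentiment-svc",
    models=[svm_model, tfidf_model],
    inference_config=inference_config,
    deployment_config=AksWebservice.deploy_configuration(cpu_cores=1,
                                                         memory_gb=1),
    deployment_target=aks_target)
service.wait_for_deployment(show_output=True)
print(service.scoring_uri)
```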

You can read the “Choose a compute target” section in the above link to decide which compute target suits you.

After Cell 5, your model should be published. I will assume you have successfully executed Cell 5 to “speed up the process”, since in reality I had to fix my score.py file many times before I could publish.

With that assumption, you have created an endpoint, which you can check in the Endpoints menu:

You can get the service URL, authentication token, and more in the endpoint details.

You can test the URL with the following script:
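A test script along these lines should work; the scoring URI and key below are placeholders to replace with the values from your endpoint details, and the payload uses the “Inputs” field that the Stream Analytics integration later in this article also relies on:

```python
import json
import urllib.request

# Placeholders -- replace with the values from your endpoint details
scoring_uri = "http://<your-endpoint>/api/v1/service/sentiment-svc/score"
api_key = "<your-primary-key>"

def build_request(posts):
    # The body is a JSON object whose "Inputs" field holds the raw texts
    return json.dumps({"Inputs": posts}).encode("utf-8")

req = urllib.request.Request(
    scoring_uri,
    data=build_request(["I love this product", "terrible experience"]),
    headers={"Content-Type": "application/json",
             "Authorization": "Bearer " + api_key})

# Uncomment once the placeholders point at your real endpoint:
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read()))
```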

With that, I have successfully served the model as a web service using Azure Machine Learning.

Integration with Azure Stream Analytics:

To integrate with Azure Stream Analytics, you modify your score.py as below:

Note that:

  • Both the input and output parameters need to be np.array
  • input_schema’s first parameter needs to be named “Inputs”

(These points come from my own trial and error, and may not hold in the future. Please tell me if you have any updates.)
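Putting those notes together, the ASA-ready score.py might be sketched like this. The model names match the pickle files, and the schema sample values are only illustrative:

```python
# score.py -- sketch of an entry script adapted for Azure Stream Analytics
import pickle

import numpy as np
from azureml.core.model import Model
from inference_schema.schema_decorators import input_schema, output_schema
from inference_schema.parameter_types.numpy_parameter_type import NumpyParameterType


def init():
    # Called once when the container starts: load both registered models
    global svm, tfidf
    with open(Model.get_model_path("tfidf_vect"), "rb") as f:
        tfidf = pickle.load(f)
    with open(Model.get_model_path("svm"), "rb") as f:
        svm = pickle.load(f)


# Stream Analytics expects the first input_schema parameter to be "Inputs",
# and both the input and output schema samples to be numpy arrays
@input_schema("Inputs", NumpyParameterType(np.array(["sample social post"])))
@output_schema(NumpyParameterType(np.array([1])))
def run(Inputs):
    features = tfidf.transform(Inputs)     # TF-IDF features from raw text
    return svm.predict(features).tolist()  # JSON-serializable predictions
```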

At this point, you have successfully integrated the Azure ML web service as a function in Azure Stream Analytics.

Evaluations:

Hard works:

  • Requires lots of research: I had to read many articles and samples to understand how each script impacts the deployment, and I condensed that into this article. I suggest reading the official Azure Machine Learning documentation if you are interested and want to understand the service more deeply.
  • Integration with Azure Stream Analytics: Due to the scoring API format Stream Analytics expects, I had to update the score.py file several times to adapt it to ASA. There is not much detailed documentation on this topic.
  • Other “Azure stuff”: I ran into an unusual problem when creating the inference cluster (AKS) that told me “Failed to load VM size” (it works fine now). If you have problems with the Azure services’ UI in the future, you can use the CLI as an alternative.

Good points:

  • All-in-one service: You can do almost everything related to AI in Azure Machine Learning. It is a convenient service for AI projects, as long as you keep the cost under control (by controlling consumed resources).
  • Easy to do: After hosting one model successfully, updating it or hosting other models is much easier. You can use the CLI to create a CI/CD pipeline.
  • Integration with other Azure services: I do not need to say more; just take a look at this document.

Through this article, I have demonstrated how to serve a custom model using Azure Machine Learning. I hope it has inspired you to get started in the AI world.

Trần Gia Quốc Hưng
Solution & Technology Unit, FPT Software
