Lab 3 - Deploy your model
In this lab we are going to deploy the model, wrapped in an API, to an Azure Container Instance and send data to it with Postman.

1. Deploy to an Azure Container Instance

Download the scoring script

import os
import urllib.request

inference_folder = "./inference"

inference_script_url = "https://raw.githubusercontent.com/hnky/DevelopersGuideToAI/master/amls/resources/score.py"
inference_script_download_path = os.path.join(inference_folder, "score.py")

# Create the inference folder if it does not exist yet
if not os.path.exists(inference_folder):
    os.mkdir(inference_folder)

urllib.request.urlretrieve(inference_script_url, filename=inference_script_download_path)

Create an inference environment

from azureml.core import Environment
from azureml.core.conda_dependencies import CondaDependencies

inference_env = Environment(name="simpsons-inference")

# Pin the Python packages the scoring script needs
conda_dep = CondaDependencies()
conda_dep.add_pip_package("azureml-defaults")
conda_dep.add_pip_package("torch")
conda_dep.add_pip_package("torchvision")
conda_dep.add_pip_package("pillow==5.4.1")

inference_env.python.conda_dependencies = conda_dep

Create an Inference config

from azureml.core.model import InferenceConfig

inference_config = InferenceConfig(
    entry_script="inference/score.py",
    environment=inference_env
)

Create an Azure Container Instance deployment config

from azureml.core.webservice import AciWebservice

deploy_config = AciWebservice.deploy_configuration(
    cpu_cores=model.resource_configuration.cpu,
    memory_gb=model.resource_configuration.memory_in_gb,
    description='Simpson Lego Classifier'
)

Deploy the model to an ACI

from azureml.core.model import Model

aci_service = Model.deploy(ws,
                           name="simpsons-pt-aci",
                           models=[model],
                           inference_config=inference_config,
                           deployment_config=deploy_config,
                           overwrite=True)

aci_service.wait_for_deployment(show_output=True)
print("Scoring endpoint:", aci_service.scoring_uri)
This step can take up to 10 minutes.
You can find the deployment under your model in the Azure Machine Learning studio: https://ml.azure.com
Note: if you don't see it immediately, refresh the tab.

2. Test the model in the API

Post an image to the endpoint

The easiest way to test your scoring endpoint is with the code below.

import json

image_uri = "https://raw.githubusercontent.com/hnky/dataset-lego-figures/master/_test/Bart.jpg"
result = aci_service.run(input_data=json.dumps({"url": image_uri}))
print(result)
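Under the hood, aci_service.run posts this JSON payload to the scoring URI over plain HTTP. The sketch below builds the same payload and shows (commented out, since it needs the live endpoint and the third-party `requests` package) the equivalent raw HTTP call:

```python
import json

# The scoring endpoint accepts a JSON body with an image URL.
image_uri = "https://raw.githubusercontent.com/hnky/dataset-lego-figures/master/_test/Bart.jpg"
payload = json.dumps({"url": image_uri})

# Equivalent raw HTTP call against a live deployment:
# import requests
# response = requests.post(aci_service.scoring_uri, data=payload,
#                          headers={"Content-Type": "application/json"})
# print(response.json())

print(payload)
```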

Use Postman

  • Get the scoring URI:
    print("Scoring endpoint:", aci_service.scoring_uri)
  • Create a new request in Postman:
    • Set the method to POST
    • Paste the scoring URL into the request URL field
  • Send a raw body with the JSON below. Make sure JSON is selected in the orange dropdown next to the "raw" radio button:
    { "url": "https://raw.githubusercontent.com/hnky/dataset-lego-figures/master/_test/Bart.jpg" }
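If you prefer the command line over Postman, the same request can be sent with curl; this is a template, so replace `<scoring-uri>` with the value printed by aci_service.scoring_uri:

```shell
# POST the image URL as a JSON body to the scoring endpoint
curl -X POST "<scoring-uri>" \
  -H "Content-Type: application/json" \
  -d '{ "url": "https://raw.githubusercontent.com/hnky/dataset-lego-figures/master/_test/Bart.jpg" }'
```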

Try other images

https://raw.githubusercontent.com/hnky/dataset-lego-figures/master/_test/Krusty.jpg
https://raw.githubusercontent.com/hnky/dataset-lego-figures/master/_test/Bart.jpg
https://raw.githubusercontent.com/hnky/dataset-lego-figures/master/_test/Flanders.jpg
https://raw.githubusercontent.com/hnky/dataset-lego-figures/master/_test/Homer.jpg
https://raw.githubusercontent.com/hnky/dataset-lego-figures/master/_test/Lisa.jpg
https://raw.githubusercontent.com/hnky/dataset-lego-figures/master/_test/marge.jpg
https://raw.githubusercontent.com/hnky/dataset-lego-figures/master/_test/Milhouse.jpg
https://raw.githubusercontent.com/hnky/dataset-lego-figures/master/_test/MrBurns.jpg
https://raw.githubusercontent.com/hnky/dataset-lego-figures/master/_test/Wiggum.jpg
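To score all of the test images in one go, you can build the payloads in a loop; the sketch below only constructs them (the aci_service.run calls are commented out so it runs without a live endpoint):

```python
import json

# All test images live under the same folder in the dataset repo
base = "https://raw.githubusercontent.com/hnky/dataset-lego-figures/master/_test/"
figures = ["Krusty", "Bart", "Flanders", "Homer", "Lisa",
           "marge", "Milhouse", "MrBurns", "Wiggum"]

payloads = [json.dumps({"url": base + name + ".jpg"}) for name in figures]

# With a live service deployed in step 1:
# for payload in payloads:
#     print(aci_service.run(input_data=payload))

print(len(payloads), "payloads ready")
```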
End