Create the Luma propensity recipe

A core component of the Data Science Workspace lifecycle is authoring Recipes and Models. The Luma propensity model is designed to predict whether customers have a high propensity to purchase a product from Luma.

To create the Luma propensity model, the Recipe Builder template is used. Recipes are the basis for a Model, as they contain machine learning algorithms and logic designed to solve specific problems. More importantly, Recipes empower you to democratize machine learning across your organization, enabling other users to access a Model for disparate use cases without writing any code.

Follow the create a model using JupyterLab Notebooks tutorial to create the Luma propensity model recipe, which is used in subsequent tutorials.
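For orientation, the sketch below shows the general shape of a train() function as it might appear in the Recipe Builder template's pipeline file. The column names, configuration keys, and choice of classifier are illustrative assumptions, not the tutorial's exact code.

```python
# Hypothetical pipeline.py sketch for the Luma propensity recipe.
# Assumes the Recipe Builder Python template, where train() receives
# configuration properties and a prepared pandas DataFrame.
from sklearn.ensemble import GradientBoostingClassifier

def train(configProperties, data):
    """Train a binary propensity classifier on prepared Luma data."""
    # Hypothetical target column; replace with your prepared schema.
    features = [c for c in data.columns if c != 'purchase']
    X = data[features]
    y = data['purchase']  # 1 = purchased, 0 = did not purchase

    # Hyperparameters read from the recipe configuration (assumed keys).
    model = GradientBoostingClassifier(
        learning_rate=float(configProperties.get('learning_rate', 0.1)),
        n_estimators=int(configProperties.get('n_estimators', 100)),
        max_depth=int(configProperties.get('max_depth', 3)),
    )
    model.fit(X, y)
    return model
```

Because the hyperparameters are read from the recipe configuration rather than hard-coded, they can later be adjusted per training run without editing the source files.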

Import and package a recipe from external sources (optional)

If you wish to import and package a recipe for use in Data Science Workspace, you must package your source files into an archive file. Follow the package source files into a recipe tutorial, which shows you how to package source files into a recipe, the prerequisite step for importing a recipe into Data Science Workspace. Once the tutorial is complete, you are provided a Docker image in an Azure Container Registry, along with the corresponding image URL, which serves as your archive file.

This archive file can be used to create a recipe in Data Science Workspace by following the recipe import workflow, using either the UI or the API.
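As a rough illustration of the API path, the following Python sketch registers a packaged Docker image as an engine. The request shape is modeled on the Sensei Machine Learning API's engine-creation call, but the header values, image URL, and payload fields here are placeholders from memory; confirm them against the current API reference before use.

```python
# Hedged sketch: registering a packaged recipe (Docker image) as an engine.
# All credential values and the image URL are placeholders you must supply.
import json
import requests

engine = {
    "name": "Luma Propensity Engine",
    "description": "Propensity to purchase a Luma product",
    "type": "Python",
    "algorithm": "Classification",
    "artifacts": {
        "default": {
            "image": {
                # The image URL returned by the packaging tutorial.
                "location": "<your-azure-container-registry-image-url>",
                "name": "luma-propensity",
                "executionType": "Python"
            }
        }
    }
}

response = requests.post(
    "https://platform.adobe.io/data/sensei/engines",
    headers={
        "Authorization": "Bearer {ACCESS_TOKEN}",
        "x-api-key": "{API_KEY}",
        "x-gw-ims-org-id": "{ORG_ID}",
        "x-sandbox-name": "{SANDBOX_NAME}",
    },
    # The engine JSON is sent as a multipart form field.
    files={"engine": (None, json.dumps(engine), "application/json")},
)
print(response.status_code, response.json())
```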

Train and evaluate a model

Now that your data is prepared and a recipe is ready, you can create, train, and evaluate your machine learning model further. While using the Recipe Builder, you should already have trained, scored, and evaluated your model before packaging it into a recipe.

The Data Science Workspace UI and API allow you to publish your recipe as a model. Additionally, you can further fine-tune specific aspects of your model such as adding, removing, and changing hyperparameters.

Create a Model

To learn more about creating a model, visit the train and evaluate a model in the Data Science Workspace UI tutorial or API tutorial. These tutorials provide examples of how to create and train a model, and how to update hyperparameters to fine-tune it.

NOTE
Hyperparameters cannot be learned; they must be assigned before training runs occur. Adjusting hyperparameters may change the accuracy of your trained model. Since optimizing a model is an iterative process, multiple training runs may be required before a satisfactory evaluation is achieved.
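To make that iteration concrete, here is a generic sketch of comparing training runs across hyperparameter settings. It uses scikit-learn with synthetic stand-in data rather than Platform APIs; in Data Science Workspace, each setting would correspond to a separate training run whose evaluation metrics you compare.

```python
# Hedged sketch: comparing a validation metric across hyperparameter values.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Stand-in data; in practice this is your prepared Luma training dataset.
X, y = make_classification(n_samples=2000, n_features=10, random_state=42)
X_train, X_val, y_train, y_val = train_test_split(X, y, random_state=42)

for learning_rate in (0.01, 0.1, 0.3):  # hyperparameter being tuned
    model = GradientBoostingClassifier(
        learning_rate=learning_rate, n_estimators=100, random_state=42
    )
    model.fit(X_train, y_train)
    auc = roc_auc_score(y_val, model.predict_proba(X_val)[:, 1])
    print(f"learning_rate={learning_rate}: validation AUC={auc:.3f}")
```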

Score a model

The next step in creating and publishing a model is to operationalize it so you can score and consume insights from the data lake and Real-Time Customer Profile.

Scoring in Data Science Workspace can be achieved by feeding input data into an existing trained Model. Scoring results are then stored and viewable in a specified output dataset as a new batch.
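As a sketch of what that scoring logic can look like, the hypothetical score() function below mirrors the signature used in the Python Recipe Builder template; the feature selection and output column name are assumptions for illustration.

```python
# Hedged sketch of a Recipe Builder-style score() function.
import pandas as pd

def score(configProperties, data, model):
    """Apply a trained propensity model to new input data."""
    # Hypothetical target column, matching the train() sketch above.
    features = [c for c in data.columns if c != 'purchase']
    # Probability of purchase for each row; the scoring job writes the
    # result back to the configured output dataset as a new batch.
    scores = model.predict_proba(data[features])[:, 1]
    return pd.DataFrame({'propensity_score': scores}, index=data.index)
```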

To learn how to score your model, visit the score a model UI tutorial or API tutorial.