Model Integration

This is a guide to registering models, adding experiments and runs, and integrating models into Skills in SENSA Fabric.

In Fabric, ML models are registered and integrated into Skills that complete computational AI tasks. Skills are used to build Agents or to perform the Intervention actions in Missions.

Skill Building and Model Registration Methods

Models can be registered and packaged into Skills for SENSA Fabric in more than one way.

This document focuses on using the SENSA Fabric Console and CLI to register models and package them into Skills after the models have been prepared in an outside IDE.

Model components

Models consist of:

  • Metadata: Title, name, description, type, mode, source
  • Properties: required data schema
  • Tags: Descriptors used to organize models in Fabric
  • Experiments: Versions of the model that can be run

Experiments consist of:

  • Artifacts: The output from an experiment that is saved in managed content
  • Metadata: The informational content of an Experiment record (e.g. artifact_name)
  • Metrics: The attributes of the experiment being measured and how they are expressed as key-value pairs (e.g. precision, accuracy)
  • Parameters: Experiment attributes expressed as key-value pairs defining an experiment (e.g. category, version)
  • Runs: Invocation instances of a model experiment
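The component hierarchy above can be pictured as nested data structures. The sketch below is purely illustrative (these are not Fabric SDK classes); it only shows how Models contain Experiments, which contain Runs with their params, metrics, metadata, and artifacts.

```python
from dataclasses import dataclass, field
from typing import Any

# Illustrative sketch of the Model -> Experiment -> Run hierarchy described
# above. Not a Fabric API; field names mirror the documented components.
@dataclass
class Run:
    params: dict[str, Any] = field(default_factory=dict)    # e.g. category, version
    metrics: dict[str, float] = field(default_factory=dict) # e.g. precision, accuracy
    meta: dict[str, Any] = field(default_factory=dict)      # e.g. artifact_name
    artifacts: list[str] = field(default_factory=list)      # keys into managed content

@dataclass
class Experiment:
    name: str
    runs: list[Run] = field(default_factory=list)

@dataclass
class Model:
    name: str
    title: str
    type: str            # classification, ranking, clustering, or regression
    mode: str = "Single" # or "Multi-model"
    properties: dict[str, Any] = field(default_factory=dict)
    tags: list[str] = field(default_factory=list)
    experiments: list[Experiment] = field(default_factory=list)
```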

The Model integration (ML-Ops) process consists of three parts that are detailed below:

ML-Ops summary

  1. Model registration (in Console or using CLI)
  2. Experiment and run artifact upload (using CLI)
  3. Model publishing (in Console or using CLI)

Model Registration using Fabric Console

Model registration sets up the metadata and data framework for the model experiments and experiment-runs that are uploaded in step 2.

Prerequisite

Develop an ML model (algorithm) in any IDE.

Process

  1. Log in to Fabric Console and select your Project Context.

  2. Click Models in the left menu.

  3. A list of Models is displayed with title, name, number of tags, and creation timestamp.

    You may click the vertical ellipsis menu in any row to Edit or Delete the Model.

  4. Click Add Model at the top right to register a new model.

  5. Enter a brief descriptive title for the Model.

  6. Review the auto-generated Model Name.

    You may toggle the auto-generated switch to off and edit the name or enter a unique URL-friendly name. Names must be alphanumeric, beginning with a letter and ending with a letter or number. Dashes and underscores are allowed in the middle; no other special characters can be used. Names must be 20 characters or fewer.

  7. Select the type of model you are registering: Classification, Ranking, Clustering, or Regression.

  8. (Optional) Select a Mode: Single or Multi-model

  9. (Optional) Add Properties (the schema requirements for the model):

      1. Click +Add New Property under the table to open the dialog.
      2. Enter a Key (property name).
      3. Select a Data Type.
      4. (Optional) Enter a default value for the Property.
  10. (Optional) Add tags (descriptors used to help you organize and find your models in Fabric):

    1. Click in the empty field.
    2. Select an existing tag (check the box and click in the field) or type a new value and then click the checkbox (under the field) to select it.
    3. Selected tags are listed under the field.
    4. Click the 'x' beside a tag label to delete it from the list.
  11. (Optional) Enter a brief but complete description of what the model does so other Fabric users can understand the model.
  12. Click NEXT to register experiments.
  13. Register Experiments as described in the section below OR click SAVE at the bottom of the page to save the Model registration.
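The naming rules above (which also apply to Experiment names) can be checked locally before registration. This is a minimal sketch assuming the rules as documented; Fabric itself performs the authoritative validation on save.

```python
import re

# Documented rules: alphanumeric, begins with a letter, ends with a letter
# or number, dashes/underscores allowed only in the middle, 20 chars max.
NAME_RE = re.compile(r"^[A-Za-z](?:[A-Za-z0-9_-]{0,18}[A-Za-z0-9])?$")

def is_valid_model_name(name: str) -> bool:
    """Return True if `name` satisfies the documented Model/Experiment name rules."""
    return bool(NAME_RE.fullmatch(name))
```

For example, `claims-classifier-v2` passes, while `2fast` (leading digit), `model_` (trailing underscore), and any 21-character name fail.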

Manage Experiments

After you have registered a Model, you may register Experiments associated with that Model.

  1. Click Models in the left menu to open the Model list view.
  2. Find the Model you want to add Experiments to and open the action menu at the far right in the row.
  3. Click Edit Model.
  4. At the bottom of the Model registration dialog click Add Experiment.
  5. Enter a Title for the Experiment.
  6. The Experiment Name follows the same conventions as the Model Name described above.
  7. Enter a brief Description to assist collaborators.
  8. Click Save Experiment below the description field.
  9. The Experiment is saved in a table.
  10. Click SAVE at the bottom of the page to save the Model and Experiments.

You can click Add Experiment above the table to add more experiments.

From the action menu at the far right in each row in the Experiment List you can:

  • Edit the Experiment metadata.
  • Delete the Experiment.

Set Model Status to Published

Registered Model statuses are:

  • IN DEVELOPMENT (default)
  • PUBLISHED

Models are only published when they are manually set to this status using either the UI or the CLI.

To set a Model to PUBLISHED/UNPUBLISHED from the Model List:

  1. Find the Model in the Model list.
  2. Open the Action menu on the right in the list row and click Publish Model (or if the model has been published click "Unpublish Model").

To set a Model to PUBLISHED/UNPUBLISHED from the Model Details (Edit):

  1. Find the Model in the Model list.
  2. Open the Action menu on the right in the list row.
  3. Click Edit Model.
  4. At the top of the Edit Model page open the action Menu on the right of the status.
  5. If the Model shows a status of IN DEVELOPMENT, click Publish Model. The status is changed to PUBLISHED. If the Model shows a status of PUBLISHED, click Unpublish Model. The status is changed to IN DEVELOPMENT.
note

Only Models with a status of PUBLISHED can be added to Skills.

Model registration using the CLI

You may register the Model by running the following CLI command with the payload that provides the model registration metadata.

cortex models save <path_to_json_file> --project myProject

Sample JSON file:

{
  "source": "test src",
  "name": "test-model-name",
  "title": "test title",
  "camel": "1.0.0",
  "tags": [{"label": "tag-1", "value": "tag-1"}],
  "properties": [
    {
      "key": "propertyIdentifier",
      "type": "dataType",
      "default": "value",
      "secure": "true"
    }
  ],
  "mode": "Single",
  "type": "classification",
  "description": "model purpose"
}

Where:

  • source is where the model is coming from (e.g. IBM Watson, Azure ML).
  • name is the system-assigned unique identifier for the model.
  • title is the human-readable identifier for the Model.
  • camel is the version of the Camel spec used ("1.0.0").
  • tags are descriptors used to help you organize and find your models in Fabric.
  • properties are the schema requirements for the model.
  • mode is "Single" (one model) or "Multi-model" (models wired together in a group).
  • type is the type of model (e.g. "classification", "regression", "clustering", or "ranking").
  • description is a brief but complete description of what the model does so other Fabric users can understand the model.
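The payload can also be assembled programmatically and written to the file passed to `cortex models save`. A minimal sketch, reusing the illustrative values from the sample above:

```python
import json

# Assemble the model-registration payload shown in the sample JSON.
# All values here are the illustrative placeholders from the sample,
# not real project resources.
payload = {
    "source": "test src",
    "name": "test-model-name",
    "title": "test title",
    "camel": "1.0.0",
    "tags": [{"label": "tag-1", "value": "tag-1"}],
    "properties": [
        {
            "key": "propertyIdentifier",
            "type": "dataType",
            "default": "value",
            "secure": "true",
        }
    ],
    "mode": "Single",
    "type": "classification",
    "description": "model purpose",
}

# Write the payload to disk for the CLI:
with open("model.json", "w") as f:
    json.dump(payload, f, indent=2)

# Then register it:
#   cortex models save model.json --project myProject
```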

Experiment and experiment-run artifact upload

Experiments are used by data scientists to train and test AI model algorithms and make modifications that improve the model results for specific use cases. The goal of the experiments API, library, and CLI is to save, view, update, and download the details (metadata, parameters, and metrics) and output (artifacts) of experiments and experiment-runs for analysis, so the optimum model for a business use case can be packaged for deployment.

Use the following CLI commands to upload experiment and experiment-run artifacts that are used during Skill building.

  1. Save a model experiment.

    cortex experiments save <path_to_json_file> --project myProject

    Sample Experiment body

    {
      "name": "test experiment name",
      "title": "test experiment name",
      "camel": "1.0.0",
      "tags": [],
      "properties": [],
      "description": "test2 updated",
      "modelId": "test-model-name"
    }
  2. Create an experiment-run.

    cortex experiments create-run <path_to_json_file_containing_body> --project myProject

    Sample Run body

    {
      "experimentName": "demo_exp_1",
      "params": {
        "category": "Claims Classifier",
        "version": "1",
        "SourceData": "Upstream Server Data"
      },
      "metrics": {
        "metric1": 1111,
        "metric2": 2222
      },
      "meta": {
        "algo": "SVM"
      }
    }
  3. Upload the experiment-run artifact.

    (NOTE: ArtifactKey is an arbitrary string that the developer creates. When a user wants to download an artifact, the artifactKey must be specified.)

    cortex experiments upload-artifact experimentName runId <path_to_artifact_file> artifactKey --project myProject

    To find an experimentName, run:

    cortex experiments list --project myProject

    To find a runId, run:

    cortex experiments list-runs experimentName --project myProject
  4. Set the Model status to PUBLISHED to signal that it is ready to be packaged into a Skill. (Any Model that has not been published has the default status of IN DEVELOPMENT.)

    cortex models publish modelName

    You can find a modelName by logging in to Console, selecting your Project context, and clicking Models in the left menu to view a list of Models.

    OR

    Run the following CLI command to return a list of models for the Project specified.

    cortex models list --project myProject
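The steps above can be sketched end to end: write the experiment and run bodies to JSON files, then pass them to the documented CLI commands. The file names, artifact file, and artifact key below are illustrative placeholders; the body values are the ones from the samples above.

```python
import json

# Experiment body (step 1), keyed to the registered model via modelId.
experiment = {
    "name": "demo_exp_1",
    "title": "test experiment name",
    "camel": "1.0.0",
    "tags": [],
    "properties": [],
    "description": "test2 updated",
    "modelId": "test-model-name",
}

# Run body (step 2): params define the run, metrics record measured results.
run = {
    "experimentName": "demo_exp_1",
    "params": {
        "category": "Claims Classifier",
        "version": "1",
        "SourceData": "Upstream Server Data",
    },
    "metrics": {"metric1": 1111, "metric2": 2222},
    "meta": {"algo": "SVM"},
}

# Write both bodies to disk for the CLI.
for path, body in [("experiment.json", experiment), ("run.json", run)]:
    with open(path, "w") as f:
        json.dump(body, f, indent=2)

# Then, from a shell (runId comes from `cortex experiments list-runs`;
# model.pkl and myArtifactKey are hypothetical placeholders):
#   cortex experiments save experiment.json --project myProject
#   cortex experiments create-run run.json --project myProject
#   cortex experiments upload-artifact demo_exp_1 <runId> model.pkl myArtifactKey --project myProject
#   cortex models publish test-model-name
```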

Unpublish Models

Models that have been published can be unpublished using the CLI.

cortex models unpublish modelName --project myProject

Models that are unpublished cannot be packaged into Skills.

Delete Models

Deleting Models is a protected action in Cortex. When you run the delete command an Impact Assessment is run, and if downstream dependencies are found, the deletion is not allowed, and the resources using that Model are listed. You must remove dependencies to unblock the delete action.

Models may be deleted via the CLI by running:

cortex models delete modelName --project myProject

Next Steps

Package your Model into a Skill.