Prepare model endpoints for scanning
Prerequisites
To scan your own model with Certifai, you must provide an endpoint that matches the expected predict API.
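Concretely, a scannable endpoint accepts a POST of feature rows and returns one prediction per row. The sketch below shows only that request/response handling, with a stand-in constant model; the field names (`payload`, `instances`, `predictions`) follow the conventions used in the Certifai examples and are illustrative rather than authoritative:

```python
import json

def handle_predict(request_body: bytes) -> bytes:
    """Decode a predict request, score each row, and encode the response."""
    payload = json.loads(request_body)
    instances = payload["payload"]["instances"]  # one feature vector per row
    # Stand-in "model": predict class 1 for every row. A real endpoint would
    # call something like model.predict(instances) here.
    predictions = [1 for _ in instances]
    return json.dumps({"payload": {"predictions": predictions}}).encode()
```

The Certifai Model SDK generates an endpoint of this shape for you, so you normally only supply the trained model rather than writing the handler yourself.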
Examples of using the Certifai Model SDK to package your Python models are provided in the models folder of the cortex-certifai-examples repository.
Create a model endpoint using Certifai Model SDK
For development purposes, you can run your model in a local Flask application without using Docker.
This walkthrough uses the german_credit model from the cortex-certifai-examples repository.
1. In another terminal, clone the cortex-certifai-examples repository:

   ```
   git clone https://github.com/CognitiveScale/cortex-certifai-examples.git
   ```

2. Go to the models/german_credit folder in the cloned cortex-certifai-examples repository:

   ```
   cd cortex-certifai-examples/models/german_credit
   ```

3. Create a new conda environment:

   ```
   conda create -n model-server python=3.8
   ```

4. Activate the conda environment:

   ```
   conda activate model-server
   ```

5. Install the cortex-certifai-common and cortex-certifai-model-sdk packages from the Certifai Toolkit. Replace certifai_toolkit in the following with the path where you unzipped the toolkit.

   On macOS or Linux:

   ```
   pip install certifai_toolkit/packages/all/cortex-certifai-common*
   pip install certifai_toolkit/packages/all/cortex-certifai-model-sdk*
   ```

   On Windows PowerShell:

   ```
   Get-ChildItem certifai_toolkit\packages\all\cortex-certifai-common* | ForEach-Object -Process { pip install $_ }
   Get-ChildItem certifai_toolkit\packages\all\cortex-certifai-model-sdk* | ForEach-Object -Process { pip install $_ }
   ```

6. Train the models using the provided Python script:

   ```
   python train.py
   ```

7. Run the prediction service for the Decision Tree model using the provided Python script:

   ```
   python app_dtree.py
   ```

   You should see output that ends similar to the following:

   ```
    * Serving Flask app "certifai.model.sdk.simple_wrapper" (lazy loading)
    * Environment: production
      WARNING: This is a development server. Do not use it in a production deployment.
      Use a production WSGI server instead.
    * Debug mode: off
   ```

   Your prediction service is now running at http://127.0.0.1:8551/predict.
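Once the service is up, you can exercise it from Python. A minimal client sketch using only the standard library, assuming the request JSON conventions used in the Certifai examples (the feature row in the commented call is a placeholder, not a real german_credit record):

```python
import json
import urllib.request

PREDICT_URL = "http://127.0.0.1:8551/predict"

def make_request(instances, url=PREDICT_URL):
    # Build a POST request whose body wraps the feature rows in the
    # {"payload": {"instances": [...]}} shape used by the Certifai examples.
    body = json.dumps({"payload": {"instances": instances}}).encode()
    return urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"}
    )

# With the model server running, send one placeholder row and print the result:
# with urllib.request.urlopen(make_request([[0] * 20])) as resp:
#     print(json.load(resp)["payload"]["predictions"])
```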
Alert
If you get the error "Address already in use", verify that another model server is not already running. If one is, close the server window or press CTRL-C to stop it, and then retry the command.
If you are asked to allow an external connection, click Deny, as the model only needs to be available locally.
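To check whether the port is already taken before starting the server, a quick standard-library sketch (8551 is the port this walkthrough's service binds to):

```python
import socket

def port_in_use(port: int, host: str = "127.0.0.1") -> bool:
    # connect_ex returns 0 when something is already listening on host:port.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(0.5)
        return s.connect_ex((host, port)) == 0

# Example: port_in_use(8551) is True while app_dtree.py is running.
```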
Next steps
You are now ready to define and run scans locally.