ML Workflows

Getting Started

Installation

Install infernet-ml from your terminal via pip:

pip install infernet-ml

Quickstart

In this example, we'll use the HFInferenceClientWorkflow to perform inference on a Huggingface model.

Step 1: Import and Instantiate a Workflow

Import the HFInferenceClientWorkflow class from infernet_ml and create an instance of it.

from infernet_ml.workflows.inference.hf_inference_client_workflow import HFInferenceClientWorkflow

In this instance, we're going to use the text_classification task type with the Kaludi/Reviews-Sentiment-Analysis model. You can use any other model tagged as Text Classification on the Huggingface model hub.

workflow = HFInferenceClientWorkflow(task="text_classification", model="Kaludi/Reviews-Sentiment-Analysis")

Step 2: Setup the Workflow

Next, we set up the model. Depending on the workflow, this performs various tasks to make the model ready for inference:

  • For workflows that execute the model themselves, this might download the model weights.
  • For workflows that use a remote inference service, this might set up the connection to the service and ensure the model is available on it.

workflow.setup()
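The two setup strategies above can be sketched as follows. This is a hypothetical illustration of the pattern, not infernet-ml's actual implementation; the class names, attributes, and placeholder bodies are all assumptions made for the example.

```python
# Hypothetical sketch of the two setup() strategies described above.
# Not infernet-ml's real code: class names and bodies are illustrative.

class LocalModelWorkflow:
    """Workflow that executes the model itself: setup fetches weights."""

    def __init__(self, model_id: str):
        self.model_id = model_id
        self.weights = None

    def setup(self):
        # Stand-in for downloading and caching the model weights locally.
        self.weights = f"weights-for-{self.model_id}"
        return self


class RemoteServiceWorkflow:
    """Workflow backed by a remote service: setup verifies availability."""

    def __init__(self, model_id: str):
        self.model_id = model_id
        self.ready = False

    def setup(self):
        # Stand-in for opening a connection and confirming the model
        # is hosted on the remote inference service.
        self.ready = True
        return self
```

Either way, the caller only ever invokes `workflow.setup()` and doesn't need to know which strategy the workflow uses internally.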

Step 3: Perform Inference

Now we can perform inference on our model. All workflows in infernet-ml expose an inference() method that takes the input data and returns the output.

results = workflow.inference({
    "text": "I was extremely disappointed with this product."
})

Step 4: Putting it All Together

Finally, we can display the results of our inference. For Kaludi/Reviews-Sentiment-Analysis, we expect the output to contain the predicted classes and their probabilities.

from infernet_ml.workflows.inference.hf_inference_client_workflow import HFInferenceClientWorkflow
 
workflow = HFInferenceClientWorkflow(task="text_classification", model="Kaludi/Reviews-Sentiment-Analysis")
workflow.setup()
results = workflow.inference({
    "text": "I was extremely disappointed with this product."
})
print(f"results: {results}")

Running this code, we'll get an output similar to the following:

results: {'output': [{'label': 'Negative', 'score': 0.9863545298576355}, {'label': 'Positive', 'score': 0.013645444996654987}]}
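Since the result is a plain dictionary, you can post-process it directly. The sketch below picks the highest-scoring class from a hard-coded copy of the sample output shown above, so it runs without calling the workflow:

```python
# 'results' is a hard-coded copy of the sample output shown above,
# so this snippet runs standalone without calling the workflow.
results = {
    "output": [
        {"label": "Negative", "score": 0.9863545298576355},
        {"label": "Positive", "score": 0.013645444996654987},
    ]
}

# Pick the class with the highest probability.
top = max(results["output"], key=lambda entry: entry["score"])
print(top["label"])  # -> Negative
```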

And just like that, we've performed inference on a Huggingface model using infernet-ml!

Where to next?

This example shows one of our many workflows. Check out our architecture documentation, as well as Inference Workflows to see what other workflows are available and how to use them.