Configure and Deploy Inferencer Node for Image Classification

This tutorial guides you through using the Inferencer node to perform image classification within your workflow.

1. Add Inferencer Node to Flow

First, search for the Inferencer node and drag it onto the flow.

Add Inferencer Node to Flow

2. Connect Inferencer Node

Connect it to the other nodes in your flow.

Connect Inferencer Node

3. Access Node Settings

Click the node to open its settings.

Access Node Settings

4. Main Settings

First, choose the input field, which indicates where the node will read the incoming image inside the message object. Then select the output field, which determines where the node will store the processing results.

Main Settings
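As a rough sketch of this idea (the field names below are only illustrative, not the node's actual schema), the node reads the image from the configured input field of the message and writes its results to the configured output field:

// Sketch only: property names are hypothetical, set them to match your flow.
interface InferencerMsg {
  payload: Uint8Array;   // input field: where the node reads the incoming image
  inference?: unknown;   // output field: where the node stores the processing results
}

// Example: with "payload" as input and "inference" as output, the node
// reads msg.payload and writes its classification results to msg.inference.
const msg: InferencerMsg = { payload: new Uint8Array() };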

5. Select Model Name for Use

Select Model Name for Use

6. Choose Model Results Option

Next, we select the model results option we want to use; in this case, class_results.

Choose Model Results Option

7. Select class_results

Select class_results

8. Activate Model Warm-Up

The first inference is usually slower, so we enable warm-up to make the model run an initial inference ahead of time and avoid that delay. The remaining settings are described in the node's documentation.

Activate Model Warm-Up
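Conceptually, warm-up just runs one throwaway inference when the node is deployed so that the first real message does not pay the model initialization cost. A minimal sketch of that idea (runInference is a stand-in, not a Rosepetal API):

// Stand-in for the model call; not a real Inferencer API.
async function runInference(image: Uint8Array): Promise<string> {
  return "OK"; // placeholder result
}

// Warm-up: run one dummy inference at deploy time so later calls are fast.
async function warmUp(): Promise<void> {
  await runInference(new Uint8Array(1));
}

void warmUp();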

9. Open JSON Configuration

Open JSON Configuration

10. View JSON Configuration Details

The JSON configuration window allows you to modify model settings such as minimum confidence, image size, or tag names.

View JSON Configuration Details
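For illustration, a configuration object of this kind could look roughly as follows. The key names and values here are assumptions, not the node's exact schema; check the JSON window for the real field names.

// Hypothetical example only: keys and values are illustrative.
const modelConfig = {
  minConfidence: 0.5,      // discard predictions below this score
  imageSize: [224, 224],   // width x height expected by the model
  tags: ["OK", "NOK"],     // class/tag names returned by the model
};

console.log(JSON.stringify(modelConfig, null, 2));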

11. Modify Node Configuration

Modify Node Configuration

12. Complete Configuration Editing

Complete Configuration Editing

13. Complete Configuration Editing

Click "Done" to save your configuration changes.

Complete Configuration Editing

14. Deploy Node Configuration

Click "Deploy" to apply the current settings and start the service.

Deploy Node Configuration

15. Inject Image

We inject an image using the from-dataset node.

Inject Image

16. Review Classification Output

In the debug window, we can see the results. For classification models, we get the main tag and its score, along with an array of all possible tags and their scores.

Review Classification Output
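As a hedged illustration of that structure (the property names are assumptions based on the description above, not the exact output schema), the result might look roughly like this, and the tag array can be sorted by score to put the most confident classes first, as in the next step:

// Illustrative shape only; the exact property names come from the node's output.
interface ClassResult { tag: string; score: number; }

const result = {
  tag: "OK",        // main tag
  score: 0.97,      // its confidence score
  classes: [        // all possible tags with their scores
    { tag: "OK", score: 0.97 },
    { tag: "NOK", score: 0.03 },
  ] as ClassResult[],
};

// Sort by confidence, highest first.
result.classes.sort((a, b) => b.score - a.score);
console.log(result.classes[0]);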

17. Sort Results by Confidence

Sort Results by Confidence

18. Expand Results Details

Expand Results Details

19. Enable Debug Image Display

If we want to debug further, we can visualize the image together with the results.

Enable Debug Image Display

20. Set Debug Image Width Value

Set Debug Image Width Value

21. Navigate to Debug Settings

From here you can access the debug image configuration options.

Navigate to Debug Settings

22. Redeploy Configuration

Redeploy Configuration

23. Redeploy

Click "Deploy" to apply debug image settings

Redeploy

24. Test Inference

We can inject both OK and NOK images and confirm that they are classified correctly.

Test Inference
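If you want to check the classification automatically instead of reading the debug window, a small downstream check could compare the predicted tag with the expected one. The message field names here ("inference", "tag") are assumptions and must match the output field you configured.

// Sketch of a check you could run downstream of the Inferencer node.
function isCorrect(msg: { inference: { tag: string } }, expected: "OK" | "NOK"): boolean {
  return msg.inference.tag === expected;
}

console.log(isCorrect({ inference: { tag: "OK" } }, "OK"));   // true
console.log(isCorrect({ inference: { tag: "OK" } }, "NOK"));  // false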

25. Review and Test Node Functionality

Review and Test Node Functionality

26. Overview

Overview

This concludes the tutorial on basic usage of the Inferencer node. Thanks for using Rosepetal!