Version: 3.29 (latest)

Utility blocks

Utility blocks are Processing Blocks that implement auxiliary functions. Currently, there is only one modification of the UTILITY_MODULE type: face_cut.

face_cut

Face normalization refers to the rotation of a non-frontal face to a frontal position. It is needed for better handling of face recognition and other operations with detected faces.

Within the Face Recognition Pipeline, this operation is performed inside the FACE_TEMPLATE_EXTRACTOR module, but it is often convenient to store detected faces as normalized images.

Configuration

The following parameters are used when creating the face_cut block:

```json
{
  "unit_type": "UTILITY_MODULE",
  "modification": "face_cut",
  "version": 1,
  "cut_type": "FACE_CUT_BASE" // one of: FACE_CUT_BASE, FACE_CUT_TOKEN_FRONTAL, FACE_CUT_FULL_FRONTAL
}
```

The cut_type parameter specifies the type of normalization used:

  • FACE_CUT_BASE: basic normalization (any sample type). Default value for cut_type.
  • FACE_CUT_FULL_FRONTAL: ISO/IEC 19794-5 Full Frontal (only frontal sample type). It is used for saving face images in electronic biometric documents.
  • FACE_CUT_TOKEN_FRONTAL: ISO/IEC 19794-5 Token Frontal (only frontal sample type).
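For example, a configuration Context requesting ISO Token Frontal normalization could look like this (a sketch built from the parameters listed above):

```json
{
  "unit_type": "UTILITY_MODULE",
  "modification": "face_cut",
  "version": 1,
  "cut_type": "FACE_CUT_TOKEN_FRONTAL"
}
```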

Processing Block specification

  1. The input Context must contain an image in binary format and an "objects" array produced by the Face Detector and Face Fitter blocks:

    Input Context specification:
    ```json
    {
      "image": {
        "format": "NDARRAY",
        "blob": "data pointer",
        "dtype": "uint8_t",
        "shape": [height, width, channels]
      },
      "objects": [{
        "id": {"type": "long", "minimum": 0},
        "class": "face",
        "confidence": {"type": "double", "minimum": 0, "maximum": 1},
        "bbox": [x1, y1, x2, y2],
        "keypoints": {
          "left_eye_brow_left": {"proj": [x, y]},
          "left_eye_brow_up": {"proj": [x, y]},
          "left_eye_brow_right": {"proj": [x, y]},
          "right_eye_brow_left": {"proj": [x, y]},
          "right_eye_brow_up": {"proj": [x, y]},
          "right_eye_brow_right": {"proj": [x, y]},
          "left_eye_left": {"proj": [x, y]},
          "left_eye": {"proj": [x, y]},
          "left_eye_right": {"proj": [x, y]},
          "right_eye_left": {"proj": [x, y]},
          "right_eye": {"proj": [x, y]},
          "right_eye_right": {"proj": [x, y]},
          "left_ear_bottom": {"proj": [x, y]},
          "nose_left": {"proj": [x, y]},
          "nose": {"proj": [x, y]},
          "nose_right": {"proj": [x, y]},
          "right_ear_bottom": {"proj": [x, y]},
          "mouth_left": {"proj": [x, y]},
          "mouth": {"proj": [x, y]},
          "mouth_right": {"proj": [x, y]},
          "chin": {"proj": [x, y]},
          "points": [{"proj": [x, y]}]
        }
      }]
    }
    ```

    *Example of face detection and landmarks estimation*

  2. After calling the face_cut Processing Block, the following attributes are added to each object in the "objects" array:

```json
{
  "cut_rect": [x1, y1, x2, y2, x3, y3, x4, y4], // corner coordinates of the normalized crop, starting from the top-left corner and continuing counterclockwise
  "face_crop": { // normalized crop
    "format": "NDARRAY",
    "blob": "data pointer",
    "dtype": "uint8_t",
    "shape": [height, width, channels]
  }
}
```

Example of working with the face_cut processing block

To obtain normalized crops, perform the following steps:

  • Create a face_cut modification of the Processing Block
  • Pass the Context container obtained after the Face Detector and Face Fitter Processing Blocks have been executed
  • Call the face_cut Processing Block
  • Obtain the result of the Processing Block call
```cpp
auto configCtx = service->createContext();
configCtx["unit_type"] = "UTILITY_MODULE";
configCtx["modification"] = "face_cut";
pbio::ProcessingBlock blockFaceCut = service->createProcessingBlock(configCtx);

//------------------
// creation of the FACE_DETECTOR and FACE_FITTER processing blocks and of a
// Context container (ioData) with a binary image
//------------------

faceDetector(ioData);
faceFitter(ioData);
blockFaceCut(ioData);
```
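After these calls, each element of ioData["objects"] carries the cut_rect and face_crop attributes described above. Consuming the result could be sketched in pseudocode as follows (the iteration form and accessor names here are assumptions, not confirmed SDK API):

```
// Pseudocode: iterate the detected objects and use the normalized crop.
// The exact Context accessors depend on the SDK version.
for each object in ioData["objects"]:
    cutRect  = object["cut_rect"]    // 8 corner coordinates in the source image
    faceCrop = object["face_crop"]   // normalized crop as an NDARRAY image
    // e.g. save faceCrop to storage, or pass it to downstream blocks
```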