Face Estimation
In this section you will learn how to integrate the Emotion and Gender estimators into your C++ project.
Emotion Estimation (C++)
Requirements
- Windows x86 64-bit or Linux x86 64-bit system.
- Installed OMNI package windows_x86_64 or linux_x86_64 (see Getting Started).
1. Creating an Emotion Estimator
1.1. To create an Emotion Estimator, follow steps 1-3 described in Creating a Processing Block and specify the following values:
- "EMOTION_ESTIMATOR" for the "unit_type" key;
- the path to the Emotion Estimator model file for the "model_path" key.
configCtx["unit_type"] = "EMOTION_ESTIMATOR";
// default path to Emotion Estimator model file is "share/faceanalysis/emotion.enc" in the package's root directory
configCtx["model_path"] = "share/faceanalysis/emotion.enc";
1.2. Create an Emotion Estimator Processing block:
pbio::ProcessingBlock emotionEstimator = service->createProcessingBlock(configCtx);
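Taken together, steps 1.1 and 1.2 amount to the minimal sketch below. It assumes that the service object has already been created as described in Getting Started and that the configuration Context configCtx is obtained with createContext(); adjust the model path to your package layout.
// minimal sketch of steps 1.1-1.2: build the configuration Context and create the block
// (assumes "service" has already been created as described in Getting Started)
auto configCtx = service->createContext();
configCtx["unit_type"] = "EMOTION_ESTIMATOR";
configCtx["model_path"] = "share/faceanalysis/emotion.enc";
pbio::ProcessingBlock emotionEstimator = service->createProcessingBlock(configCtx);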
2. Emotion Estimation
2.1. Create a Context container ioData for input-output data using the createContext() method:
auto ioData = service->createContext();
2.2. Create a Context container imgCtx with an RGB image, following the steps described in Creating a Context container with RGB-image.
2.3. Put the input image into the input-output data container:
ioData["image"] = imgCtx;
2.4. Call emotionEstimator and pass the ioData container with the source image:
emotionEstimator(ioData);
The result of calling emotionEstimator() will be appended to the ioData container.
The output data is a list of objects available by the "objects" key.
Each object in this list has the "class" key with the "face" value.
/*
{
    "objects": [{
        "id": {"type": "long", "minimum": 0},
        "class": "face",
        "emotions": [{
            "emotion": {
                "enum": ["ANGRY", "DISGUSTED", "SCARED", "HAPPY", "NEUTRAL", "SAD", "SURPRISED"]
            },
            "confidence": {"type": "double", "minimum": 0, "maximum": 1}
        }]
    }]
}
*/
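To read the estimation results back from ioData, you can iterate over the "objects" list. The sketch below is an illustration only; it assumes the Context element accessors size(), getString() and getDouble() and that ioData already holds the output shown above.
// iterate over detected faces and print each emotion with its confidence
// (assumes Context provides size(), getString() and getDouble(); requires <iostream>)
for (size_t i = 0; i < ioData["objects"].size(); ++i)
{
    auto obj = ioData["objects"][i];
    if (obj["class"].getString() != "face")
        continue;

    auto emotions = obj["emotions"];
    for (size_t j = 0; j < emotions.size(); ++j)
    {
        std::cout << emotions[j]["emotion"].getString() << ": "
                  << emotions[j]["confidence"].getDouble() << std::endl;
    }
}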
3. GPU Acceleration
Emotion Estimator can be used with GPU acceleration (CUDA). For more information, please follow this link.
Gender Estimation (C++)
Requirements
- Windows x86 64-bit or Linux x86 64-bit system.
- Installed Face SDK package windows_x86_64 or linux_x86_64 (see Getting Started).
1. Creating a Gender Estimator
1.1. To create a Gender Estimator, follow steps 1-3 described in Creating a Processing Block and specify the values:
"GENDER_ESTIMATOR"
for the"unit_type"
key;- path to Gender Estimator model file for the
"model_path"
key.
configCtx["unit_type"] = "GENDER_ESTIMATOR";
// default path to Gender Estimator model file is "share/faceanalysis/gender.enc" in the Face SDK's root directory
configCtx["model_path"] = "share/faceanalysis/gender.enc";
1.2. Create a Gender Estimator Processing block:
pbio::ProcessingBlock genderEstimator = service->createProcessingBlock(configCtx);
2. Gender Estimation
2.1. Create a Context container ioData for input-output data using the createContext() method:
auto ioData = service->createContext();
2.2. Create a Context container imgCtx with an RGB image, following the steps described in Creating a Context container with RGB-image.
2.3. Put the input image into the input-output data container:
ioData["image"] = imgCtx;
2.4. Call genderEstimator and pass the ioData container with the source image:
genderEstimator(ioData);
The result of calling genderEstimator() will be appended to the ioData container.
/*
{
    "objects": [{
        "id": {"type": "long", "minimum": 0},
        "class": "face",
        "gender": {
            "enum": ["FEMALE", "MALE"]
        }
    }]
}
*/
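The gender value can be read back from ioData in the same way. The sketch below makes the same assumptions about the Context accessors as the emotion example above.
// print the estimated gender for each detected face
// (assumes Context provides size() and getString(); requires <iostream>)
for (size_t i = 0; i < ioData["objects"].size(); ++i)
{
    auto obj = ioData["objects"][i];
    if (obj["class"].getString() == "face")
        std::cout << obj["gender"].getString() << std::endl;
}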
3. GPU Acceleration
Gender Estimator can be used with GPU acceleration (CUDA). For more information, please follow this link.