- Introduction
- Prerequisites
- Get started
- Set up the environment
- Framework specific integration notes
- Localization Support
- UX Customization
The Azure AI Vision Face UI Web SDK is a client library that enables integration of the face liveness feature into web applications. It works seamlessly with Azure AI Face APIs to determine the authenticity of a face in a video stream.
- An Azure Face API resource subscription.
- Install Node.js from https://nodejs.org/en/download/prebuilt-installer
Depending on your scenario, choose one of the following options:
This section shows how to quickly run a sample app built with Next.js, Angular, or Vue.js.
- Follow the steps in SetupEnvironment.md to install the npm package.
- Copy the facelivenessdetector-assets/ folder from node_modules/azure-ai-vision-face-ui to public/.
- Update the variables in .env.local with your own Face API key and endpoint.
- Run the app with npm run dev. On the first run, the development server may take a few minutes to initialize. See package.json for other framework-specific commands.

Note: samples/web/javascript contains a fully featured vanilla JavaScript sample.
First, follow the steps in the SetupEnvironment.md section to install the npm package.
The sample includes a demo implementation for obtaining the token so that the demo app can be built as a standalone solution, but this approach is not recommended for production. The session-authorization-token is required to start a liveness session. For more information on how to orchestrate the liveness flow using the Azure AI Vision Face service, visit: https://aka.ms/azure-ai-vision-face-liveness-tutorial
After obtaining a valid API key for the Face API, you can integrate the web component using JavaScript. Use the API key to obtain a session token and provide this token to the web component.
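The token should be minted on your backend, not in the browser. The sketch below illustrates one way to do that; the API version, endpoint path, request-body fields, and the `authToken` response field are assumptions, so verify them against the liveness tutorial linked above.

```javascript
// Hypothetical sketch of minting a session-authorization-token server-side.
// The API version, path, body fields, and "authToken" field name are
// assumptions; verify against the liveness tutorial before relying on them.
function buildSessionRequest(endpoint, apiKey) {
  return {
    url: `${endpoint}/face/v1.1-preview.1/detectLiveness/singleModal/sessions`,
    options: {
      method: "POST",
      headers: {
        "Ocp-Apim-Subscription-Key": apiKey, // your Face API key; keep it server-side
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        livenessOperationMode: "Passive",
        deviceCorrelationId: "your-device-correlation-id",
      }),
    },
  };
}

async function getSessionToken(endpoint, apiKey) {
  const { url, options } = buildSessionRequest(endpoint, apiKey);
  const response = await fetch(url, options);
  if (!response.ok) throw new Error(`Session creation failed: ${response.status}`);
  const session = await response.json();
  return session.authToken; // pass this to azureAIVisionFaceUI.start(...)
}
```

Your web page would then request this token from your backend and hand it to the web component.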
The element can also be injected dynamically using JavaScript.
const azureAIVisionFaceUI = document.createElement("azure-ai-vision-face-ui");
document.getElementById("your-container-id").appendChild(azureAIVisionFaceUI);
azureAIVisionFaceUI.start("***FACE_API_SESSION_TOKEN***")
.then(resultData => {
// resultData implements the LivenessDetectionSuccess interface and contains the result of the analysis.
})
.catch(errorData => {
// In case of failure, the promise is rejected. errorData implements the LivenessDetectionError interface and contains the reason for the failure.
});
Essential assets such as WebAssembly (wasm) files and worker JavaScript files are packaged within the npm distribution, and they must be included when you deploy to a production environment. For example, after npm installation you can copy the facelivenessdetector-assets folder from node_modules/azure-ai-vision-face-ui to your root assets directory to ensure proper asset deployment.
Please see the Next.js integration example at samples/nextjs/face/face.tsx
For deployment, you can add a postbuild script to your package.json that copies facelivenessdetector-assets to public:
"scripts": {
"postbuild": "cpy node_modules/azure-ai-vision-face-ui/facelivenessdetector-assets/**/* public/facelivenessdetector-assets --parents"
}
Please see the Angular integration example at samples/angularjs/src/face/face.component.ts
For deployment, you can add an entry for facelivenessdetector-assets to the assets array in the build section of your project configuration:
"projects": {
"sample-project": {
"projectType": "application",
"root": "",
"sourceRoot": "src",
"prefix": "app",
"architect": {
"build": {
"options": {
"outputPath": "dist",
"index": "src/index.html",
"browser": "src/main.ts",
"polyfills": ["zone.js"],
"tsConfig": "tsconfig.app.json",
"assets": [
{ "glob": "**/*", "input": "./node_modules/azure-ai-vision-face-ui/facelivenessdetector-assets", "output": "/facelivenessdetector-assets" }
]
}
}
}
}
}
Please see the Vue.js integration example at samples/vuejs/src/components/FaceView.vue
The Azure AI Vision Face UI Web SDK embraces global diversity by supporting multiple languages, enabling you to provide a localized experience that enhances user interaction based on their language preferences.
By default, the SDK is set to English. However, you can customize it to support additional languages by providing locale-specific string dictionaries. Translations are currently available for the following languages:
- English (en)
- Portuguese (pt)
- Persian (fa)
To use a specific locale, assign the locale attribute to the azure-ai-vision-face-ui component. If translations are available for that locale, they will be used; otherwise, the SDK will default to English.
const azureAIVisionFaceUI = document.createElement("azure-ai-vision-face-ui");
azureAIVisionFaceUI.locale = "pt"; // Setting Portuguese locale
document.getElementById("your-container-id").appendChild(azureAIVisionFaceUI);
Override the SDK's default language strings by providing a JSON object containing your custom translations through the language attribute.
Below is the complete list of default English strings used in the Azure AI Vision Face UI Web SDK. These strings are used for various feedback messages and UI components within the SDK. You can override any of these strings by providing your own translations in the language attribute.
{
"None": "Hold still.",
"LookAtCamera": "Look at camera.",
"FaceNotCentered": "Center your face in the circle.",
"MoveCloser": "Too far away! Move in closer.",
"ContinueToMoveCloser": "Continue to move closer.",
"MoveBack": "Too close! Move farther away.",
"TooMuchMovement": "Too much movement.",
"AttentionNotNeeded": "",
"Smile": "Smile for the camera!",
"LookInFront": "Look in front.",
"LookUp": "Look up.",
"LookUpRight": "Look up-right.",
"LookUpLeft": "Look up-left.",
"LookRight": "Look right.",
"LookLeft": "Look left.",
"LookDown": "Look down.",
"LookDownRight": "Look down-right.",
"LookDownLeft": "Look down-left.",
"TimedOut": "Timed out.",
"IncreaseBrightnessToMax": "Increase your screen brightness to maximum.",
"Tip1Title": "Tip 1:",
"Tip2Title": "Tip 2:",
"Tip3Title": "Tip 3:",
"Tip1": "Center your face in the preview. Make sure your eyes and mouth are visible, remove any obstructions like headphones.",
"Tip2": "You may be asked to smile.",
"Tip3": "You may be asked to move your nose towards the green color.",
"Continue": "Continue"
}
const azureAIVisionFaceUI = document.createElement("azure-ai-vision-face-ui");
const customLanguage = {
"None": "Stay still.",
"LookAtCamera": "Look straight at camera.",
"FaceNotCentered": "Center your face in the preview circle.",
"MoveCloser": "Move in closer.",
"ContinueToMoveCloser": "Keep moving closer.",
"MoveBack": "Move farther away.",
"TooMuchMovement": "Reduce movement.",
"AttentionNotNeeded": "",
"Smile": "Smile please!",
"LookInFront": "Look straight.",
"LookUp": "Look up.",
"LookUpRight": "Look up-right.",
"LookUpLeft": "Look up-left.",
"LookRight": "Look right.",
"LookLeft": "Look left.",
"LookDown": "Look down.",
"LookDownRight": "Look down-right.",
"LookDownLeft": "Look down-left.",
"TimedOut": "Timed out.",
"IncreaseBrightnessToMax": "Increase your screen brightness to maximum.",
"Tip1Title": "Tip 1:",
"Tip2Title": "Tip 2:",
"Tip3Title": "Tip 3:",
"Tip1": "Center your face in the preview. Make sure your eyes and mouth are visible, remove any obstructions like headphones.",
"Tip2": "You may be asked to smile.",
"Tip3": "You may be asked to move your nose towards the green color.",
"Continue": "Continue"
};
azureAIVisionFaceUI.languageDictionary = customLanguage;
document.getElementById("your-container-id").appendChild(azureAIVisionFaceUI);
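Alternatively, if you only want to change a few messages, you can spread your overrides over a copy of the defaults. This is a client-side sketch: whether the SDK itself merges partial dictionaries is not documented here, so this approach always supplies a complete dictionary.

```javascript
// Sketch: derive a languageDictionary from the defaults, overriding only
// selected keys. "defaultStrings" must contain the full set of keys listed
// in the previous section; only a few are shown here for brevity.
const defaultStrings = {
  "None": "Hold still.",
  "LookAtCamera": "Look at camera.",
  "MoveCloser": "Too far away! Move in closer.",
  "Smile": "Smile for the camera!",
  "Continue": "Continue",
  // ...remaining keys from the list above
};

function withOverrides(defaults, overrides) {
  // Later spreads win, so override values replace the defaults.
  return { ...defaults, ...overrides };
}

const customLanguage = withOverrides(defaultStrings, {
  "Smile": "Please smile!",
  "Continue": "Next",
});
```

The resulting object is then assigned to the component's languageDictionary attribute as shown above.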
The SDK automatically adapts to right-to-left (RTL) languages, adjusting the UI components accordingly. Here is a list of supported RTL languages:
ISO Language code | Language name |
---|---|
ar | Arabic |
arc | Aramaic |
dv | Divehi |
fa | Persian |
ha | Hausa |
he | Hebrew |
khw | Khowar |
ks | Kashmiri |
ku | Kurdish |
ps | Pashto |
ur | Urdu |
yi | Yiddish |
const azureAIVisionFaceUI = document.createElement("azure-ai-vision-face-ui");
azureAIVisionFaceUI.locale = "ar"; // Setting Arabic locale
azureAIVisionFaceUI.languageDictionary = arStrings; // Custom Arabic strings
document.getElementById("your-container-id").appendChild(azureAIVisionFaceUI);
You can customize the layout of the page using the following options:
Customize the default "Increase your screen brightness" image by providing your own image. Ensure the image is correctly deployed for production.
azureAIVisionFaceUI.brightnessImagePath = newImagePath;
Customize the default font size for all text. The default is 1.5rem.
azureAIVisionFaceUI.fontSize = newSize;
Customize the default font family for all text. The default value is system-ui, -apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, Oxygen, Ubuntu, Cantarell, 'Open Sans', 'Helvetica Neue', sans-serif.
azureAIVisionFaceUI.fontFamily = newFontFamily;
Customize the look and feel of the "Continue" button by providing your own CSS styles. To change the button text, use the languageDictionary attribute and override the "Continue" key.
azureAIVisionFaceUI.continueButtonStyles = newCSS;
Customize the look and feel of the feedback messages by providing your own CSS styles.
azureAIVisionFaceUI.feedbackMessageStyles = newCSS;
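The options above can be applied together before mounting the component. The helper below is an illustrative sketch: the property names follow the attributes documented above, while all CSS values are placeholders. In the browser, pass the element created with document.createElement("azure-ai-vision-face-ui").

```javascript
// Sketch: apply several UX customizations to a face UI element before
// mounting it. Property names follow the attributes documented above;
// all values here are illustrative placeholders.
function applyFaceUICustomizations(el, opts = {}) {
  if (opts.brightnessImagePath) el.brightnessImagePath = opts.brightnessImagePath;
  if (opts.fontSize) el.fontSize = opts.fontSize;
  if (opts.fontFamily) el.fontFamily = opts.fontFamily;
  if (opts.continueButtonStyles) el.continueButtonStyles = opts.continueButtonStyles;
  if (opts.feedbackMessageStyles) el.feedbackMessageStyles = opts.feedbackMessageStyles;
  return el;
}

// Example usage (a plain object stands in for the element here):
const customized = applyFaceUICustomizations({}, {
  fontSize: "1.25rem",
  fontFamily: "Georgia, serif",
  continueButtonStyles: "background: #0078d4; color: #fff; border-radius: 4px;",
  feedbackMessageStyles: "font-weight: 600;",
});
```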