How to use face-api.js for face detection in video or image using Angular

A JavaScript API for Face Detection, Face Recognition and Face Landmark Detection

Rushi Panchariya
4 min read · Mar 18, 2021

Face detection using face-api.js

Hi guys, in this article we are going to implement face detection in Angular using the JavaScript library face-api.js. face-api.js is built on top of the tensorflow.js core, which implements several CNNs (Convolutional Neural Networks) to solve face detection, face recognition, and face landmark detection, optimized for the web and for mobile devices.

To get more information about face-api.js, refer to this.

If you want to implement real-time webcam face detection and emotion recognition without any framework, refer to this video.

Steps to Implement face-api.js

  1. Create an Angular project and install all dependencies.
  2. Create a webcam component in Angular.
  3. Implementing face-api.js in the Angular component.
  4. Fix warnings and errors.
  5. Enjoy

Let’s Implement face-api.js in Angular

1. Create an Angular project and install all dependencies.

First, create a new Angular project:

ng new project_name

After the project is created, install face-api.js using npm with the --save flag so the dependency is saved in package.json:

npm install --save face-api.js

Also, install @tensorflow/tfjs-core

npm install --save @tensorflow/tfjs-core

2. Create a webcam component in Angular

Create a new component in Angular. I am going to create the component using the Angular CLI:

ng g c webcam --skipTests=true

Remove all the default code from app.component.html and add the webcam component:

<app-webcam></app-webcam>

3. Implementing face-api.js in the Angular component

Go to webcam.component.html and add a video element to get input from the webcam. To show the mask over the face in the live video stream, also add a canvas container. I have added a div tag with the #canvas reference so it can be accessed from the TypeScript code, and the same for the video element.

<div class="video-container">
  <video #video
    [width]="WIDTH"
    [height]="HEIGHT"
    autoplay
    muted>
  </video>
  <div #canvas></div>
</div>

In webcam.component.ts, declare all the variables, take element references from #video and #canvas, and import face-api.js.

import { ElementRef, OnInit, ViewChild } from '@angular/core';
import * as faceapi from 'face-api.js';
// ... @Component decorator ...

export class WebcamComponent implements OnInit {
  WIDTH = 440;
  HEIGHT = 280;

  @ViewChild('video', { static: true })
  public video: ElementRef;

  @ViewChild('canvas', { static: true })
  public canvasRef: ElementRef;

  stream: any;
  detection: any;
  resizedDetections: any;
  canvas: any;
  canvasEl: any;
  displaySize: any;
  videoInput: any;

  constructor(private elRef: ElementRef) {}

Now we have to load the models that recognize the face, plot the landmarks, and detect the expression. To do this, create a models folder inside the assets folder.

To get the models, clone the face-api.js repo, copy the models from the weights folder of the repo, and paste them into our assets/models folder, as sketched below.
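A rough command-line sketch of this step, assuming the official justadudewhohacks/face-api.js repository and an Angular project named project_name (adjust the paths to your setup):

# clone the face-api.js repository, which ships the pre-trained weights
git clone https://github.com/justadudewhohacks/face-api.js.git

# create the models folder inside the Angular assets folder
mkdir -p project_name/src/assets/models

# copy the pre-trained model files into assets/models
cp face-api.js/weights/* project_name/src/assets/models/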

Now it's time to load them inside webcam.component.ts in ngOnInit(). Once these models are loaded, it will call startVideo().

async ngOnInit() {
  await Promise.all([
    faceapi.nets.tinyFaceDetector.loadFromUri('../../assets/models'),
    faceapi.nets.faceLandmark68Net.loadFromUri('../../assets/models'),
    faceapi.nets.faceRecognitionNet.loadFromUri('../../assets/models'),
    faceapi.nets.faceExpressionNet.loadFromUri('../../assets/models'),
  ]).then(() => this.startVideo());
}

In startVideo() we request the user's webcam and get the stream from it. To get the video stream we access the video tag through its ElementRef, and the same for the canvas. We have already declared all the variables.

startVideo() {
  this.videoInput = this.video.nativeElement;
  navigator.getUserMedia(
    { video: {}, audio: false },
    (stream) => (this.videoInput.srcObject = stream),
    (err) => console.log(err)
  );
  this.detect_Faces();
}
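As a side note, navigator.getUserMedia is deprecated in modern browsers. A minimal sketch of the same step using the promise-based navigator.mediaDevices.getUserMedia API (not the code from the original article) would look like this:

startVideo() {
  this.videoInput = this.video.nativeElement;
  // request the webcam stream and attach it to the video element
  navigator.mediaDevices
    .getUserMedia({ video: {}, audio: false })
    .then((stream) => (this.videoInput.srcObject = stream))
    .catch((err) => console.log(err));
  this.detect_Faces();
}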

Now it will try to detect faces from a video stream using detect_Faces().

async detect_Faces() {
  this.elRef.nativeElement.querySelector('video').addEventListener('play', async () => {
    this.canvas = await faceapi.createCanvasFromMedia(this.videoInput);
    this.canvasEl = this.canvasRef.nativeElement;
    this.canvasEl.appendChild(this.canvas);
    this.canvas.setAttribute('id', 'canvass');
    this.canvas.setAttribute(
      'style',
      `position: fixed;
      top: 0;
      left: 0;`
    );
    this.displaySize = {
      width: this.videoInput.width,
      height: this.videoInput.height,
    };
    faceapi.matchDimensions(this.canvas, this.displaySize);
    setInterval(async () => {
      this.detection = await faceapi
        .detectAllFaces(this.videoInput, new faceapi.TinyFaceDetectorOptions())
        .withFaceLandmarks()
        .withFaceExpressions();
      this.resizedDetections = faceapi.resizeResults(
        this.detection,
        this.displaySize
      );
      this.canvas
        .getContext('2d')
        .clearRect(0, 0, this.canvas.width, this.canvas.height);
      faceapi.draw.drawDetections(this.canvas, this.resizedDetections);
      faceapi.draw.drawFaceLandmarks(this.canvas, this.resizedDetections);
      faceapi.draw.drawFaceExpressions(this.canvas, this.resizedDetections);
    }, 100);
  });
}
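One detail the article does not cover: the setInterval loop keeps running after the component is destroyed. A hedged sketch of a cleanup, assuming you store the handle returned by setInterval in a hypothetical detectionInterval field and implement Angular's OnDestroy interface (none of this is in the original code):

// inside the class: detectionInterval: any;
// and in detect_Faces(): this.detectionInterval = setInterval(async () => { ... }, 100);

ngOnDestroy() {
  // stop the detection loop when the component goes away
  clearInterval(this.detectionInterval);
  // also release the webcam by stopping the media tracks
  const stream = this.videoInput?.srcObject;
  if (stream) {
    stream.getTracks().forEach((track) => track.stop());
  }
}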

4. Fix warnings and errors

Now it's time to run the program

ng serve

Oops!!! We got a bunch of errors:

Error: node_modules/@types/webgl2/index.d.ts:582:13 - error TS2403: Subsequent variable declarations must have the same type.  Variable 'WebGL2RenderingContext' must be of type '{ new (): WebGL2RenderingContext; prototype: WebGL2RenderingContext; readonly ACTIVE_ATTRIBUTES: number; readonly ACTIVE_TEXTURE: number; ... 556 more ...; readonly WAIT_FAILED: number; }', but here has type '{ new (): WebGL2RenderingContext; prototype: 
WebGL2RenderingContext; readonly ACTIVE_ATTRIBUTES: number; readonly ACTIVE_TEXTURE: number; ... 557 more ...; readonly MAX_CLIENT_WAIT_TIMEOUT_WEBGL: number; }'.
582 declare var WebGL2RenderingContext: {
~~~~~~~~~~~~~~~~~~~~~~
node_modules/typescript/lib/lib.dom.d.ts:16354:13
16354 declare var WebGL2RenderingContext: {
~~~~~~~~~~~~~~~~~~~~~~
'WebGL2RenderingContext' was also declared here.

To fix the above error, add the line below to your tsconfig.json, inside compilerOptions.

"skipLibCheck": true

Let’s run it again

ng serve

This time you will not get the error, only a warning. We are making progress.


Warning: ./node_modules/face-api.js/build/es6/env/createFileSystem.js   
Module not found: Error: Can't resolve 'fs' in 'project_path\node_modules\face-api.js\build\es6\env'

To get rid of this warning, add the below code to your package.json after devDependencies.

"browser": {
"fs": false,
"os": false,
"path": false
}
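For context, a rough sketch of the placement in package.json; the surrounding fields are just placeholders for whatever is already in your file:

{
  "name": "project_name",
  "dependencies": { "...": "..." },
  "devDependencies": { "...": "..." },
  "browser": {
    "fs": false,
    "os": false,
    "path": false
  }
}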

Run it again

ng serve

Tada! Your face-api.js integration is working. Go to localhost:4200 and check it out.

We still have one warning; ignore it. It's just a warning, don't worry….


Thank you if you made it to the end……..
