how to use faceapi.js landmarks.getMouth()


I managed to detect emotions and compare two images but I can't figure out how to use the landmarks.getMouth() method...

If anyone's able to help me with that I'll be grateful forever!

Here's a sample of my code:

Promise.all([
  faceapi.nets.tinyFaceDetector.loadFromUri('/models'),
  faceapi.nets.faceLandmark68Net.loadFromUri('/models'),
  faceapi.nets.faceRecognitionNet.loadFromUri('/models'),
  faceapi.nets.faceExpressionNet.loadFromUri('/models'),
  faceapi.nets.ssdMobilenetv1.loadFromUri('/models')
]).then(startVideo);

const detections = await faceapi
  .detectAllFaces(video, new faceapi.TinyFaceDetectorOptions())
  .withFaceLandmarks()
  .withFaceExpressions()
  .withFaceDescriptors();

1 Answer

Answered by Bidisha:

You can try the detectFaceLandmarks() function and then call getMouth() on the result.

const landmarks = await faceapi.detectFaceLandmarks(faceImage);
const landmarks2 = await faceapi.detectFaceLandmarksTiny(faceImage); // use this variant with the tiny landmark model
const mouth = landmarks.getMouth(); // array of landmark points outlining the mouth
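
Since your existing pipeline already chains .withFaceLandmarks(), each entry in the detections array should carry a landmarks object, so you can call getMouth() there without running a separate landmark detection pass. A minimal sketch based on your code (the console.log is just illustrative, not part of the face-api.js API):

const detections = await faceapi
  .detectAllFaces(video, new faceapi.TinyFaceDetectorOptions())
  .withFaceLandmarks()
  .withFaceExpressions()
  .withFaceDescriptors();

detections.forEach(detection => {
  // getMouth() returns the mouth landmark points as objects with x/y coordinates
  const mouthPoints = detection.landmarks.getMouth();
  console.log(mouthPoints.map(p => ({ x: p.x, y: p.y })));
});

The points are in the coordinate space of the input, so you can use them directly for drawing on a matching canvas or for simple measurements.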