I managed to detect emotions and compare two images but I can't figure out how to use the landmarks.getMouth() method...
If anyone's able to help me with that I'll be grateful forever!
Here's a sample of my code:
Promise.all([
  faceapi.nets.tinyFaceDetector.loadFromUri('/models'),
  faceapi.nets.faceLandmark68Net.loadFromUri('/models'),
  faceapi.nets.faceRecognitionNet.loadFromUri('/models'),
  faceapi.nets.faceExpressionNet.loadFromUri('/models'),
  faceapi.nets.ssdMobilenetv1.loadFromUri('/models')
]).then(startVideo);
const detections = await faceapi
  .detectAllFaces(video, new faceapi.TinyFaceDetectorOptions())
  .withFaceLandmarks()
  .withFaceExpressions()
  .withFaceDescriptors();
Since you're already calling .withFaceLandmarks(), each entry in the detections array carries a landmarks object, and you can call getMouth() on that object to get the mouth's landmark points.
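A minimal sketch of how that might look with the detections variable from your snippet (assuming the 68-point landmark model you're loading; getMouth() returns an array of point objects):

detections.forEach(detection => {
  // detection.landmarks is the 68-point landmark result;
  // getMouth() returns just the mouth points (indices 48-67)
  const mouthPoints = detection.landmarks.getMouth();

  mouthPoints.forEach(point => {
    console.log(point.x, point.y); // each point has x/y pixel coordinates
  });
});

From those points you can, for example, compute a bounding box around the mouth or track how the lip distance changes between frames. The same landmarks object also exposes getLeftEye(), getRightEye(), getNose(), and getJawOutline() if you need other regions.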