
How to detect liveness of a face using face landmark points? I am getting face landmarks from Android camera frames, and I want to detect liveness using these landmark points. How can I tell whether a human is making a specific movement that would be useful for liveness detection?

Shayan Shafiq

3 Answers


One approach would be to obtain labeled ("yes"/"no") samples, e.g. via Mechanical Turk (it costs money, though), and treat this as a supervised learning task. A neural network will then discover for itself, through training, which features (landmark positions) are helpful for detection.

There should be examples around for mood detection etc.

Xpector

BioID has a liveness detection algorithm that you can test on their website, and it is free!

You can get their model from their GitHub. I think it is open source, but check the license.

Their model relies on facial landmarking. They also have algorithms for:

- Face Verification
- Photo Verification
- Cloud Based Solutions in BWS


Detecting liveness of a face using face landmark points is a common technique in computer vision-based liveness detection systems. It involves analyzing the movement patterns of facial landmarks to distinguish between a live human face and a static (e.g., printed or digital) representation. There are various methods you can employ to achieve this. Here's a general approach you can follow:

1: Collect a Dataset: First, you need a dataset of face images or video frames with corresponding ground truth labels indicating whether each face is live or fake (spoofed). This dataset should cover a variety of facial expressions, movements, and environmental conditions.

2: Face Detection and Landmark Extraction: In your Android app, use a face detection algorithm (e.g., Haar cascades, Single Shot Multibox Detector (SSD), or deep learning-based face detectors) to locate and extract facial regions. Then apply a face landmark detection model to obtain the coordinates of the key facial landmarks.

3: Define Liveness Indicators: Identify specific facial movements or characteristics that indicate a live face. Some common liveness indicators include blinking, head rotation, talking, eye movement, and facial expressions.
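
The blinking indicator is often implemented with the eye aspect ratio (EAR): the ratio of the eye's vertical landmark distances to its horizontal distance, which drops sharply when the eye closes. Here is a minimal, dependency-free sketch; the six-point eye ordering follows the common 68-point landmark convention, and the threshold and frame count are illustrative values you would tune on your own data:

```python
import math

def eye_aspect_ratio(eye):
    """Compute the eye aspect ratio (EAR) from six (x, y) eye landmarks.

    Points are ordered p1..p6 as in the common 68-point convention:
    p1/p4 are the eye corners, p2/p3 the upper lid, p5/p6 the lower lid.
    """
    vertical_1 = math.dist(eye[1], eye[5])   # ||p2 - p6||
    vertical_2 = math.dist(eye[2], eye[4])   # ||p3 - p5||
    horizontal = math.dist(eye[0], eye[3])   # ||p1 - p4||
    return (vertical_1 + vertical_2) / (2.0 * horizontal)

def count_blinks(ear_sequence, threshold=0.2, min_frames=2):
    """Count blinks as runs of >= min_frames consecutive frames
    whose EAR falls below the threshold."""
    blinks, run = 0, 0
    for ear in ear_sequence:
        if ear < threshold:
            run += 1
        else:
            if run >= min_frames:
                blinks += 1
            run = 0
    if run >= min_frames:
        blinks += 1
    return blinks
```

An open eye typically gives an EAR around 0.25–0.35, while a closed eye drops well below 0.2, so requiring at least one blink within a few seconds is a cheap first liveness check.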

4: Temporal Analysis: Many liveness detection methods rely on the temporal analysis of facial landmark movements over a sequence of frames. By tracking the movement of facial landmarks over time, you can assess the consistency and dynamics of facial expressions.
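
A simple form of this temporal analysis is to measure how far the landmarks move between consecutive frames: a printed photo held in front of the camera tends to produce almost no independent landmark motion. The sketch below is a minimal illustration; the `motion_threshold` value is an assumption you would calibrate for your landmark coordinate scale:

```python
import math

def per_frame_displacement(frames):
    """Mean landmark displacement between consecutive frames.

    `frames` is a list of frames; each frame is a list of (x, y)
    landmark coordinates in a fixed order.
    """
    displacements = []
    for prev, curr in zip(frames, frames[1:]):
        step = sum(math.dist(p, q) for p, q in zip(prev, curr)) / len(prev)
        displacements.append(step)
    return displacements

def looks_static(frames, motion_threshold=0.5):
    """Flag a sequence as suspicious when the landmarks barely move,
    which is typical of a static (printed or on-screen) face."""
    d = per_frame_displacement(frames)
    return max(d, default=0.0) < motion_threshold
```

In practice you would also normalize displacements by face size so the check is robust to the subject's distance from the camera.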

5: Feature Engineering: Extract relevant features from the landmark data. For instance, you can calculate the mean displacement of the facial landmarks over time or the standard deviation of landmark positions.
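
The two features mentioned above (mean displacement over time and standard deviation of landmark positions) can be packed into a small feature vector like this; the exact feature set is just one plausible choice:

```python
import math
import statistics

def landmark_features(frames):
    """Build a small feature vector from a landmark sequence:
    [mean inter-frame displacement,
     average std-dev of each landmark coordinate over time]."""
    # Mean displacement between consecutive frames.
    steps = [
        sum(math.dist(p, q) for p, q in zip(prev, curr)) / len(prev)
        for prev, curr in zip(frames, frames[1:])
    ]
    mean_disp = statistics.fmean(steps) if steps else 0.0

    # Std-dev of each landmark's x and y position across the sequence,
    # averaged into a single scalar.
    n_points = len(frames[0])
    stds = []
    for i in range(n_points):
        xs = [f[i][0] for f in frames]
        ys = [f[i][1] for f in frames]
        stds.append(statistics.pstdev(xs))
        stds.append(statistics.pstdev(ys))
    mean_std = statistics.fmean(stds)

    return [mean_disp, mean_std]
```

A static spoof yields a feature vector near zero on both components, while a live face produces clearly positive values, which is exactly the separation the classifier in the next step will exploit.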

6: Train a Classifier: Use the labeled dataset to train a machine learning classifier (e.g., SVM, Random Forest, or deep learning models like CNN) to distinguish between live and fake faces based on the extracted features.
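
In practice you would reach for an off-the-shelf SVM or Random Forest here; to keep the sketch dependency-free, the toy classifier below uses nearest-centroid classification, which stands in for any of those choices (the class name and feature values are illustrative):

```python
class NearestCentroidClassifier:
    """Minimal stand-in for an off-the-shelf classifier (e.g. an SVM):
    predicts the label whose training-feature centroid is closest."""

    def fit(self, features, labels):
        # One centroid (per-dimension mean) per distinct label.
        self.centroids = {}
        for label in set(labels):
            rows = [f for f, y in zip(features, labels) if y == label]
            self.centroids[label] = [sum(col) / len(rows) for col in zip(*rows)]
        return self

    def predict(self, feature):
        # Squared Euclidean distance is enough for an argmin.
        def dist2(centroid):
            return sum((a - b) ** 2 for a, b in zip(feature, centroid))
        return min(self.centroids, key=lambda lbl: dist2(self.centroids[lbl]))
```

With feature vectors like those from the previous step, live samples (large motion, large variance) and fake samples (near-zero motion) form well-separated clusters, so even this simple model can discriminate them.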

7: Validation and Testing: Split your dataset into training and testing sets. Evaluate the performance of your liveness detection model on the testing set using metrics like accuracy, precision, recall, and F1-score.
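
The four metrics mentioned above are straightforward to compute from the test-set predictions; here is a minimal sketch that treats "live" as the positive class:

```python
def binary_metrics(y_true, y_pred, positive="live"):
    """Accuracy, precision, recall and F1 for a binary label set."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    correct = sum(1 for t, p in zip(y_true, y_pred) if t == p)

    accuracy = correct / len(y_true)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if (precision + recall) else 0.0)
    return {"accuracy": accuracy, "precision": precision,
            "recall": recall, "f1": f1}
```

For liveness detection, recall on the "fake" class usually matters most: a spoof that slips through is worse than asking a real user to retry, so report metrics for both classes rather than accuracy alone.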

8: Model Optimization: Iterate on your model to improve its performance by adjusting hyperparameters, trying different feature representations, or experimenting with different classifiers.

9: Deployment: Integrate your trained liveness detection model into your Android app, where you can use it to assess the liveness of the face in real-time camera frames.

I also found the KYChub Face Liveness solution, which you can check out on their site. Hope this helps you.