What this is
- A complete, ready-to-run prototype for a hand/finger landmark-based biometric verifier using a webcam.
- Uses MediaPipe for 21 hand landmarks, stores landmark vectors per-user, trains a lightweight SVM verifier, and performs live verification.
- Includes a simple liveness (motion) check and a threshold calibration script.
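For orientation, the capture path boils down to a standard MediaPipe Hands loop: grab a webcam frame, run the detector, and flatten the 21 (x, y, z) landmarks into a 63-value feature vector. The snippet below is only a minimal sketch of that idea, not the exact code in `capture.py`:

```python
import cv2
import mediapipe as mp

# One frame -> one 63-dim feature vector (21 landmarks x (x, y, z)).
cap = cv2.VideoCapture(0)
with mp.solutions.hands.Hands(static_image_mode=False,
                              max_num_hands=1,
                              min_detection_confidence=0.5) as hands:
    ok, frame = cap.read()
    if ok:
        # MediaPipe expects RGB; OpenCV captures BGR.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            landmarks = results.multi_hand_landmarks[0].landmark
            vector = [v for p in landmarks for v in (p.x, p.y, p.z)]
            print(len(vector))  # 63
cap.release()
```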
Contents
- `capture.py` - capture landmarks from the webcam into `data/hand_landmarks.csv`
- `train.py` - train an SVM verifier and save `models/hand_verifier.joblib`
- `verify.py` - live verification GUI (uses the trained model)
- `calibrate.py` - compute thresholds, an EER-style curve, and a suggested threshold
- `utils.py` - shared helper functions (normalization, IO)
- `requirements.txt` - Python packages
- `example_data/sample_hand_landmarks.csv` - tiny example CSV with two users
- `LICENSE` - MIT
Quick start (Linux / Windows WSL / macOS)
- Create virtualenv and install:
python -m venv venv
source venv/bin/activate   # or venv\Scripts\activate on Windows
pip install -r requirements.txt
- Capture samples for user alice:
python capture.py capture alice --n 30
Repeat for other users (e.g., bob).
- Train (sketched after this list):
python train.py
- Verify live (also sketched below):
python verify.py verify alice
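The training step (`python train.py`) is conceptually small: load the captured vectors, fit a probabilistic SVM, and persist it with joblib. The sketch below assumes a CSV layout of one label column followed by the flattened landmark features, which may differ from what `capture.py` actually writes:

```python
import joblib
import pandas as pd
from pathlib import Path
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Assumed layout: first column is the user label, the rest are the
# flattened (and already normalized) landmark coordinates.
df = pd.read_csv("data/hand_landmarks.csv")
X = df.iloc[:, 1:].to_numpy(dtype=float)
y = df.iloc[:, 0].to_numpy()

# probability=True lets the verifier threshold on a confidence score.
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True))
model.fit(X, y)

Path("models").mkdir(exist_ok=True)
joblib.dump(model, "models/hand_verifier.joblib")
```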
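Live verification then amounts to scoring a fresh landmark vector against the claimed identity and thresholding the probability. Again only a sketch; the threshold below is a placeholder that `calibrate.py` is meant to replace with a calibrated value:

```python
import joblib
import numpy as np

model = joblib.load("models/hand_verifier.joblib")
THRESHOLD = 0.8  # placeholder; calibrate.py should suggest a real value

def verify(claimed_user: str, features: np.ndarray) -> bool:
    """Accept if the model's probability for the claimed user clears the threshold."""
    probs = model.predict_proba(features.reshape(1, -1))[0]
    classes = list(model.classes_)
    return claimed_user in classes and probs[classes.index(claimed_user)] >= THRESHOLD
```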
Notes & recommendations
- Good lighting and consistent hand pose improve results.
- Landmark normalization (translation, scale, optional rotation) is performed in utils.py; a sketch follows this list.
- For production: use secure storage for templates, stronger embeddings (Siamese networks), and robust liveness detection (NIR/depth).
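As a rough illustration of the normalization mentioned above (the authoritative version lives in `utils.py`), translation and scale invariance can be handled like this, assuming MediaPipe's convention of the wrist at index 0:

```python
import numpy as np

def normalize_landmarks(vector: np.ndarray) -> np.ndarray:
    """Translate the wrist to the origin and scale to unit hand size.

    Assumes a flat (63,) vector of 21 (x, y, z) points with the wrist at
    index 0 (MediaPipe's convention); rotation alignment is omitted here.
    """
    pts = vector.reshape(21, 3).astype(float)
    pts -= pts[0]                               # translation invariance
    scale = np.linalg.norm(pts, axis=1).max()   # rough hand size
    return (pts / scale).ravel() if scale > 0 else pts.ravel()
```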