Face Anti-spoofing Based on Supervised Multi-view Contrastive Learning and Two-stage Bilinear Feature Fusion
    Abstract:

    In this study, a multi-branch network is proposed that integrates multi-scale frequency features with depth-map features produced by a generative adversarial network (GAN). Specifically, the edge texture information in high-frequency features helps capture moiré patterns, while low-frequency features are more sensitive to color distortion. As auxiliary information, depth maps are visually more discriminative than RGB images. Supervised multi-view contrastive learning is employed to further enhance multi-view feature learning. Moreover, a two-stage bilinear feature fusion method is proposed to effectively integrate the multi-branch features from different views. To evaluate the model, ablation experiments, feature fusion comparison experiments, intra-set experiments, and inter-set experiments are conducted on four widely used public datasets: CASIA-FASD, Replay-Attack, MSU-MFSD, and OULU-NPU. Experimental results show that, in the inter-set evaluation, the average HTER of the proposed model over the four tested protocols is more than 5 percentage points lower than that of the DFA method (from 20.3% down to 15.0%).
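    The two-stage bilinear fusion described above can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the feature dimensions, the staging order (frequency views first, then depth), and the signed-sqrt plus L2 normalization step are all assumptions commonly used with bilinear pooling.

    ```python
    import numpy as np

    def bilinear_fuse(a, b):
        """Bilinear fusion of two feature vectors: outer product,
        flattened, then signed-sqrt and L2 normalization (a common
        post-processing choice; the paper's exact steps may differ)."""
        m = np.outer(a, b).ravel()
        m = np.sign(m) * np.sqrt(np.abs(m))  # signed square root
        n = np.linalg.norm(m)
        return m / n if n > 0 else m

    # Hypothetical per-branch features (dimensions are illustrative only)
    rng = np.random.default_rng(0)
    f_high  = rng.standard_normal(8)   # high-frequency branch
    f_low   = rng.standard_normal(8)   # low-frequency branch
    f_depth = rng.standard_normal(8)   # GAN-estimated depth-map branch

    # Stage 1: fuse the two frequency views into one descriptor (64-dim)
    freq = bilinear_fuse(f_high, f_low)
    # Stage 2: fuse the frequency descriptor with the depth view (512-dim)
    fused = bilinear_fuse(freq, f_depth)
    print(fused.shape)  # (512,)
    ```

    The bilinear (outer-product) form captures pairwise interactions between every feature pair across two views, which is why it is often preferred over simple concatenation when views carry complementary cues.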

Get Citation

SUN Wenyun, LI Jin, JIN Zhong. Face anti-spoofing based on supervised multi-view contrastive learning and two-stage bilinear feature fusion. Computer Systems & Applications, 2024, 33(11): 131-141 (in Chinese).
Article Metrics
  • Abstract: 653
  • PDF: 982
  • HTML: 582
  • Cited by: 0
History
  • Received: May 08, 2024
  • Revised: May 29, 2024
  • Online: September 27, 2024
Copyright: Institute of Software, Chinese Academy of Sciences. Beijing ICP No. 05046678-3