BiDAlab/SwipeFormer

SwipeFormer: Transformers for Mobile Touchscreen Biometrics


Overview

This article explores and proposes novel touchscreen verification systems based on Transformers.

Transformers are recently proposed deep learning (DL) architectures that have already garnered immense interest due to their effectiveness across a range of application domains such as language assessment, vision, reinforcement learning, and biometrics [1]. Their main advantages compared with traditional CNN and RNN architectures are: i) Transformers are feed-forward models that process all the sequences in parallel, therefore increasing efficiency; ii) they apply Self-Attention/Auto-Correlation mechanisms that allow them to operate on long sequences; iii) they can be trained efficiently in a single batch since the whole sequence is included in every batch; and iv) they can attend to the whole sequence, instead of summarising all the previous temporal information.
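The Self-Attention mechanism in (ii) can be illustrated with a minimal NumPy sketch. This is a generic scaled dot-product attention over a sequence, not the SwipeFormer implementation; all shapes and names are illustrative:

```python
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    """Scaled dot-product self-attention over a whole sequence.

    x: (seq_len, d_model) input sequence; w_q/w_k/w_v: (d_model, d_k)
    projection matrices. Every position attends to every other position
    in parallel, which is why Transformers need no recurrence.
    """
    q, k, v = x @ w_q, x @ w_k, x @ w_v           # project the inputs
    scores = q @ k.T / np.sqrt(k.shape[-1])       # pairwise similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ v                            # weighted sum of values

rng = np.random.default_rng(0)
seq_len, d_model, d_k = 5, 8, 4                   # e.g. 5 touch samples
x = rng.standard_normal((seq_len, d_model))
w = [rng.standard_normal((d_model, d_k)) for _ in range(3)]
out = self_attention(x, *w)
print(out.shape)  # (5, 4): one context-aware vector per sample
```

Because every output position is a weighted sum over the entire input, the model attends to the whole gesture sequence at once rather than compressing history into a recurrent state.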

To our knowledge, this study represents the first exploration of Transformers in the domain of mobile touchscreen biometrics. Furthermore, it is the first study to analyze unconstrained touchscreen gestures, yielding promising results.

Benchmark Evaluation of SwipeFormer

We analyse the performance of SwipeFormer using two popular, publicly available databases collected under constrained conditions: i) Frank DB [2] and ii) HuMIdb [3].

Overall, SwipeFormer outperforms previous state-of-the-art approaches on both databases under the same experimental protocol. In particular, for the Frank database [2], SwipeFormer achieves an EER of 11.30% in comparison with the 43.60% EER obtained with [4] and 25.00% with [5] (relative improvements of 74.80% and 56.00%, respectively). In addition, for HuMIdb [3], SwipeFormer obtains an EER of 5.60% while the approaches of [4] and [5] achieve 43.10% and 13.00% EERs, respectively (relative improvements of 88.40% and 61.50%).
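The Equal Error Rate (EER) reported above is the operating point where the false acceptance rate (FAR) equals the false rejection rate (FRR). A minimal NumPy sketch of how an EER can be computed from verification scores; the synthetic score distributions below are illustrative and unrelated to the SwipeFormer evaluation:

```python
import numpy as np

def eer(genuine, impostor):
    """Equal Error Rate: the threshold where FAR (impostor scores
    accepted) equals FRR (genuine scores rejected).
    Convention: higher score = more likely genuine."""
    thresholds = np.sort(np.concatenate([genuine, impostor]))
    far = np.array([(impostor >= t).mean() for t in thresholds])
    frr = np.array([(genuine < t).mean() for t in thresholds])
    i = np.argmin(np.abs(far - frr))  # closest FAR/FRR crossing point
    return (far[i] + frr[i]) / 2

rng = np.random.default_rng(1)
genuine = rng.normal(1.0, 1.0, 1000)    # scores for matching users
impostor = rng.normal(-1.0, 1.0, 1000)  # scores for non-matching users
print(f"EER: {eer(genuine, impostor):.2%}")
```

A lower EER means better separation between genuine and impostor attempts; an EER of 50% corresponds to random guessing.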

Dependencies

- conda=22.9.0
- CUDA
- numpy=1.24.1
- python=3.9.7
- torch=1.11.0
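A possible way to recreate this environment with conda; the environment name `swipeformer` is our choice, and the CUDA toolkit version is not specified in the list above, so it is omitted here:

```shell
# Create and activate a conda environment with the pinned Python version
conda create -n swipeformer python=3.9.7 -y
conda activate swipeformer

# Install the pinned dependencies
pip install numpy==1.24.1 torch==1.11.0
```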

Code

We provide the evaluation scripts in the Code folder.

References

[1] Paula Delgado-Santos, Ruben Tolosana, Richard Guest, Ruben Vera-Rodriguez, and Farzin Deravi, “Exploring Transformers for Behavioural Biometrics: A Case Study in Gait Recognition”, Pattern Recognition, 2023.

[2] M. Frank, R. Biedert, E. Ma, I. Martinovic and D. Song, Touchalytics: On the Applicability of Touchscreen Input as a Behavioral Biometric for Continuous Authentication, IEEE Transactions on Information Forensics and Security 8 (2013) 136–148.

[3] A. Acien, A. Morales, J. Fierrez, R. Vera-Rodriguez and O. Delgado-Mohatar, BeCAPTCHA: Behavioral Bot Detection using Touchscreen and Mobile Sensors Benchmarked on HuMIdb, Engineering Applications of Artificial Intelligence 98 (2021) 104058.

[4] J. Fierrez, A. Pozo, M. Martinez-Diaz, J. Galbally and A. Morales, Benchmarking Touchscreen Biometrics for Mobile Authentication, IEEE Transactions on Information Forensics and Security 13 (2018) 2720–2733.

[5] A. Acien, A. Morales, R. Vera-Rodriguez and J. Fierrez, Smartphone Sensors for Modeling Human-Computer Interaction: General Outlook and Research Datasets for User Authentication, Proc. IEEE Annual Computers, Software, and Applications Conference (2020).

Citation

If you use our code, please cite:

@article{delgado2024swipeformer,
  title={{SwipeFormer: Transformers for Mobile Touchscreen Biometrics}},
  author={Delgado-Santos, Paula and Tolosana, Ruben and Guest, Richard and Lamb, Parker and Khmelnitsky, Andrei and Coughlan, Colm and Fierrez, Julian},
  journal={Expert Systems with Applications},
  volume={237},
  pages={121537},
  year={2024}
}

Contact

If you have any questions, please contact us at paula.delgadodesantos@telefonica.com or ruben.tolosana@uam.es.
