1. Evaluation of the Human-Biometric Sensor Interaction using Hand Geometry | Carnahan Conference | San Jose, CA | October 7th, 2010 | Biometric Standards, Performance, and Assurance Laboratory, Purdue University | www.bspalabs.org | www.twitter.com/bspalabs | www.slideshare.net/bspalabs | www.linkedin.com/companies/bspa-labs
2. Agenda: Current Testing Standards and Norms; The Missing Link? What Performance Evaluations Should Also Explain; Usability and Biometrics: Are Our Systems Usable?; The Human-Biometric Sensor Interaction (HBSI); HBSI Framework; Applicability to Hand Geometry; Questions
3. Scope – from Mansfield & Grother's The Wide World of Biometric Testing: have tests been driven by what can be done (measure FRR after data collection) versus what should be done (observe and count mispresentation effects)?
4. Development of a general model. Wayman: "commonly-held knowledge and oft-repeated descriptions of biometric identification systems are more complicated than necessary because of this lack of a generally accepted system model"
6. What do our testing standards say? Distinctions between technology and scenario evaluations according to ISO/IEC 19795-2.
7. What do our testing standards say? What about the environment? ISO/IEC 1st WD 29197, Evaluation methodology for environmental influence in biometric systems. And the human-sensor interaction? …
8. Lack of user-centric design. A. Adams and M.A. Sasse ("Users Are Not the Enemy: Why Users Compromise Security Mechanisms and How to Take Remedial Measures," Comm. ACM, vol. 42, no. 12, 1999, pp. 41–46) explain that typical security deployments ignore, or at least leave until the end, user-centric design.
9. Aim of the HBSI model: provide structure and definitions for the errors observed while running biometric tests on various biometric modalities.
10. What Performance Evaluations Should Also Explain Is the algorithm the cause of matching errors? Is the application or environment the problem? Is the design of the sensor the problem? Are the users/agents causing the issue? Can users/agents do what the system/sensor is asking for? Do users/agents understand how to use the system/sensor? Can users/agents produce repeatable images?
11. HBSI model and deployed system. Understanding how users interact with the system is also important for successful deployment of the biometric system. Full habituation arises when match scores stabilize.
13. Usability & Biometrics. Usability: the extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use (ISO 9241-11:1998, ISO/IEC 25062:2006). Failure to Acquire (FTA): the traditional measure of "usability" in biometrics; the proportion of verification or identification attempts for which the system fails to capture or locate an image or signal of sufficient quality (ISO/IEC 19795-1).
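The FTA definition above reduces to a simple proportion. A minimal sketch, assuming each attempt is recorded as acquired/not acquired (the function name and input encoding are illustrative, not from the deck):

```python
def failure_to_acquire_rate(attempts):
    """Failure to Acquire (FTA): the proportion of verification or
    identification attempts for which the system fails to capture or
    locate a signal of sufficient quality (ISO/IEC 19795-1).

    `attempts` is a list of booleans: True if the attempt yielded a
    usable sample, False if acquisition failed.
    """
    if not attempts:
        raise ValueError("at least one attempt is required")
    failures = sum(1 for acquired in attempts if not acquired)
    return failures / len(attempts)

# Example: 3 failed acquisitions out of 20 attempts -> FTA = 0.15
print(failure_to_acquire_rate([True] * 17 + [False] * 3))  # 0.15
```

Note that FTA alone cannot say *why* acquisition failed (user, sensor, or environment), which is the gap the HBSI framework addresses.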
14. The Human-Biometric Sensor Interaction (HBSI). Derived from multiple research fields to better understand and evaluate the overall functionality and performance of a biometric system.
15. HBSI Framework for Biometric Interactions Objective Classify every human-sensor interaction “event” with the resulting biometric system “reaction” Event + Reaction = HBSI episode Purpose Understand and classify all interactions / movements / behaviors that occur with a biometric device to improve performance, quality, and usability Examines a biometric system from 2 perspectives: User & Biometric System
17. Incorrect Presentation – Defective Interaction. A defective interaction occurs when the subject makes an incorrect presentation to the hand geometry machine and the system does not detect it.
18. Incorrect Presentation – User Concealed Interaction. A presentation to the hand geometry machine in which the user is at fault for the erroneous presentation.
19. Incorrect Presentation – System Concealed Interaction. The user interacts with the hand geometry machine but does not provide a good-quality sample.
20. Correct Presentation – System Failure to Detect. The user correctly places their hand in the hand geometry machine, but the machine does not respond and times out.
22. Correct Presentation – Failure to Extract. Occurs after a sample has been collected from the hand geometry machine, but the algorithm fails to extract meaningful data.
23. Correct Presentation – Successfully Acquired Sample. The sample has been detected by the system, subsequently extracted, and passed through to the biometric matching system.
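The taxonomy on the preceding slides can be read as a small decision tree over three observations: was the presentation correct, did the sensor detect it, and did the algorithm extract usable data. A minimal sketch, assuming a simplified set of outcome labels (the `Episode` encoding and the collapsing of the "user/system concealed" variants into one label are my assumptions, not the deck's exact metric definitions):

```python
from dataclasses import dataclass

@dataclass
class Episode:
    """One HBSI 'episode': a human-sensor event plus the system's reaction."""
    correct_presentation: bool  # judged by a human observer (e.g. from video)
    detected: bool              # did the sensor register the presentation?
    extracted: bool             # did the algorithm extract usable features?

def classify(ep: Episode) -> str:
    """Map an episode to an HBSI outcome category (simplified sketch)."""
    if not ep.correct_presentation:
        if not ep.detected:
            return "Defective Interaction"      # bad presentation, unnoticed
        return "Concealed Interaction"          # bad presentation, accepted
    if not ep.detected:
        return "Failure to Detect"              # good presentation, no response
    if not ep.extracted:
        return "Failure to Extract"             # captured, but no usable data
    return "Successfully Acquired Sample"

print(classify(Episode(True, True, True)))  # Successfully Acquired Sample
```

Tallying these labels over a test session is what lets an evaluation separate sensor-design problems from user errors, instead of folding everything into a single FTA number.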
24. Observations. Interaction videos are valuable because they let us segment these errors. Once the errors have been identified, we can improve training.
33. Future Work: evaluate more modalities with the framework (physical-interactive, image-based, behavioral); refine the metrics; establish inter-rater reliability; a T&E standard methodology?
34. Any Questions? Follow the discussion on the research blog after the conference: www.bspalabs.org/
35. Authors and Primary Contact Information. Authors: Benny Senjaya, Graduate Researcher at BSPA Lab (bennysenjaya@gmail.com); Stephen Elliott, Ph.D., BSPA Lab Director & Associate Professor (elliott@purdue.edu); Eric Kukula, Ph.D., Visiting Assistant Professor (ekukula@gmail.com); Mark Wade, Undergraduate Researcher; Jason Werner, Undergraduate Researcher. Primary contact: Stephen Elliott, Ph.D., Associate Professor, Director of BSPA Labs, elliott@purdue.edu