Staff View
Acoustic-based hand biometric sensing for user verification on mobile devices

Descriptive

TitleInfo
Title
Acoustic-based hand biometric sensing for user verification on mobile devices
Name (type = personal)
NamePart (type = family)
Yang
NamePart (type = given)
Yilin
DisplayForm
Yilin Yang
Role
RoleTerm (authority = RULIB)
author
Name (type = personal)
NamePart (type = family)
Chen
NamePart (type = given)
Yingying
DisplayForm
Yingying Chen
Affiliation
Advisory Committee
Role
RoleTerm (authority = RULIB)
chair
Name (type = personal)
NamePart (type = family)
Wei
NamePart (type = given)
Sheng
DisplayForm
Sheng Wei
Affiliation
Advisory Committee
Role
RoleTerm (authority = local)
member
Name (type = personal)
NamePart (type = family)
Ortiz
NamePart (type = given)
Jorge
DisplayForm
Jorge Ortiz
Affiliation
Advisory Committee
Role
RoleTerm (authority = local)
member
Name (type = personal)
NamePart (type = family)
Wang
NamePart (type = given)
Yan
DisplayForm
Yan Wang
Affiliation
Advisory Committee
Role
RoleTerm (authority = local)
member
Name (type = corporate)
NamePart
Rutgers University
Role
RoleTerm (authority = RULIB)
degree grantor
Name (type = corporate)
NamePart
School of Graduate Studies
Role
RoleTerm (authority = RULIB)
school
TypeOfResource
Text
Genre (authority = marcgt)
theses
OriginInfo
DateCreated (encoding = w3cdtf); (qualifier = exact); (keyDate = yes)
2023
DateOther (encoding = w3cdtf); (type = degree); (qualifier = exact)
2023-05
CopyrightDate (encoding = w3cdtf); (qualifier = exact)
2023
Language
LanguageTerm (authority = ISO 639-3:2007); (type = text)
English
Abstract (type = abstract)
Acoustic frequencies, commonly approximated as 20 Hz to 20 kHz, hold great potential for wireless sensing applications in the mobile Internet of Things (IoT). However, modern mobile IoT has underutilized this spectrum relative to higher frequency bands (i.e., MHz or above), leaving much of this potential untapped. Inaudible acoustic sensing is both possible and practical on mobile devices (e.g., smartphones, smartwatches, tablets) for a myriad of essential daily functions, including facilitating and securing telecommunications through user verification. Such feats are possible due to the propagation behaviors of acoustic frequencies near the thresholds of human hearing (i.e., under 500 Hz or over 16 kHz) when travelling through air and solids. Acoustic signals are attenuated by the material they propagate through. Ordinarily regarded as interference, this attenuation can also reveal information about the propagation medium: not only whether it is a human body, but whether it is a specific body. Thus, we enable mobile devices to identify users from mere physical contact and respond accordingly, such as by locking or unlocking access to data.
This dissertation aims to demonstrate these ideas by studying acoustic behavior on mobile devices and acoustic responsiveness to different user hands and bodies. We first investigate the versatility of inaudible acoustic frequencies and their aptitude for transferring information between transmitter and receiver sensors. We theoretically model speaker non-linearity and transmission power, designing communication schemes that utilize two speakers to achieve inaudibility. At the receiver side, we double the coefficient of the received signal strength by leveraging microphone non-linearity. Experimental results suggest that our system can achieve over a 2 m range and over 17 kbps throughput, attaining longer range and/or higher throughput than similar works while remaining inaudible. We then study the ability of acoustic signals to capture user-specific information when travelling through the hand that holds the device. We propose a non-intrusive hand sensing technique that derives unique acoustic features in both the time and frequency domains, which effectively capture the physiological and behavioral traits of a user's hand (e.g., hand contours, finger sizes, holding strengths, and holding styles). Learning-based algorithms are developed to robustly identify the user under various environments and conditions. We conduct extensive experiments with 20 participants, gathering 80,000 hand geometry samples using different smartphone and tablet models across 160 key use case scenarios. Our results show that we can identify users with over 94% accuracy, without requiring any active user input.
Having verified the concept on smartphones, we then extend the study to smartwatches, which possess considerably less powerful sensors and introduce new design constraints. Our redesigned system employs a challenge-response process to passively capture behavioral and physiological biometrics from an unobtrusive touch gesture using low-fidelity acoustic and vibration smartwatch sensors. We develop a cross-domain sensing technique (i.e., measuring acoustic signals in the vibration domain) to capture robust and effective features specific to user fingers and to improve robustness. A low-cost profile-matching classifier is designed to enable stand-alone user authentication on smartwatches. Experiments with 54 participants using varied hardware, environments, noise levels, user motions, and other impact factors achieved around a 97% true positive rate and a 2% false positive rate in user authentication. Finally, we explore how structural characteristics of the mobile device can heighten the sensitivity of acoustic sensing. We propose an acoustic sensing system for smartphones that leverages smartphone cases modified with internal mini-structures to capture fingertip biometric information. The design of the mini-structure allows developers to control the behavior of structure-borne sound such that unique responses are produced when different users and fingers touch the smartphone case at different locations. Experiments with 46 users over 10 weeks illustrate that we can differentiate users with over 94% accuracy at a 5% false positive rate.
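The core idea in the abstract — that per-band attenuation of an inaudible probe signal acts as a coarse fingerprint of the propagation medium — can be illustrated with a minimal sketch. This is not the dissertation's actual pipeline; the probe parameters, the toy low-pass "hand" model, and the band layout are all illustrative assumptions.

```python
import numpy as np

def band_energy_features(signal, fs, bands):
    """Log-energy of the received signal in each frequency band.

    How strongly each band is attenuated relative to the known probe
    serves as a coarse descriptor of the propagation medium
    (e.g., a particular user's hand).
    """
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    feats = []
    for lo, hi in bands:
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(np.log1p(np.sum(spectrum[mask] ** 2)))
    return np.array(feats)

# Hypothetical inaudible probe: a 16-20 kHz chirp sampled at 48 kHz.
fs = 48_000
t = np.arange(0, 0.1, 1.0 / fs)
probe = np.sin(2 * np.pi * (16_000 + 20_000 * t) * t)

# Toy stand-in for a hand: a simple moving-average low-pass filter,
# which attenuates the higher bands of the probe more strongly.
received = np.convolve(probe, np.ones(8) / 8.0, mode="same")

# Eight 500 Hz-wide bands covering 16-20 kHz.
bands = [(16_000 + 500 * i, 16_500 + 500 * i) for i in range(8)]
features = band_energy_features(received, fs, bands)
print(features.shape)  # one log-energy value per band
```

A verification system along these lines would enroll a user by storing such feature vectors and later compare fresh measurements against the stored profile, as in the profile-matching classifier the abstract describes.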
Subject (authority = RUETD)
Topic
Computer engineering
Subject (authority = local)
Topic
Acoustic sensing
Subject (authority = local)
Topic
Biometrics
Subject (authority = local)
Topic
Mobile devices
Subject (authority = local)
Topic
User verification
RelatedItem (type = host)
TitleInfo
Title
Rutgers University Electronic Theses and Dissertations
Identifier (type = RULIB)
ETD
Identifier
http://dissertations.umi.com/gsnb.rutgers:12389
PhysicalDescription
InternetMediaType
application/pdf
InternetMediaType
text/xml
Extent
159 pages : illustrations
Note (type = degree)
Ph.D.
Note (type = bibliography)
Includes bibliographical references
RelatedItem (type = host)
TitleInfo
Title
School of Graduate Studies Electronic Theses and Dissertations
Identifier (type = local)
rucore10001600001
Location
PhysicalLocation (authority = marcorg); (displayLabel = Rutgers, The State University of New Jersey)
NjNbRU
Identifier (type = doi)
doi:10.7282/t3-cxj2-ck57

Rights

RightsDeclaration (ID = rulibRdec0006)
The author owns the copyright to this work.
RightsHolder (type = personal)
Name
FamilyName
Yang
GivenName
Yilin
Role
Copyright holder
RightsEvent
Type
Permission or license
DateTime (encoding = w3cdtf); (qualifier = exact); (point = start)
2023-04-27T16:17:00
AssociatedEntity
Name
Yilin Yang
Role
Copyright holder
Affiliation
Rutgers University. School of Graduate Studies
AssociatedObject
Type
License
Name
Author Agreement License
Detail
I hereby grant to the Rutgers University Libraries and to my school the non-exclusive right to archive, reproduce and distribute my thesis or dissertation, in whole or in part, and/or my abstract, in whole or in part, in and from an electronic format, subject to the release date subsequently stipulated in this submittal form and approved by my school. I represent and stipulate that the thesis or dissertation and its abstract are my original work, that they do not infringe or violate any rights of others, and that I make these grants as the sole owner of the rights to my thesis or dissertation and its abstract. I represent that I have obtained written permissions, when necessary, from the owner(s) of each third party copyrighted matter to be included in my thesis or dissertation and will supply copies of such upon request by my school. I acknowledge that RU ETD and my school will not distribute my thesis or dissertation or its abstract if, in their reasonable judgment, they believe all such rights have not been secured. I acknowledge that I retain ownership rights to the copyright of my work. I also retain the right to use all or part of this thesis or dissertation in future works, such as articles or books.
Copyright
Status
Copyright protected
Availability
Status
Open
Reason
Permission or license

Technical

RULTechMD (ID = TECHNICAL1)
ContentModel
ETD
OperatingSystem (VERSION = 5.1)
windows xp
CreatingApplication
Version
1.5
DateCreated (point = end); (encoding = w3cdtf); (qualifier = exact)
2023-04-05T21:18:51
ApplicationName
pdfTeX-1.40.24