Staff View
Co-adaptive multimodal interface guided by real-time multisensory stochastic feedback

Descriptive

TitleInfo
Title
Co-adaptive multimodal interface guided by real-time multisensory stochastic feedback
Name (type = personal)
NamePart (type = family)
Kalampratsidou
NamePart (type = given)
Vilelmini
DisplayForm
Vilelmini Kalampratsidou
Role
RoleTerm (authority = RULIB)
author
Name (type = personal)
NamePart (type = family)
Torres
NamePart (type = given)
Elizabeth
DisplayForm
Elizabeth Torres
Affiliation
Advisory Committee
Role
RoleTerm (authority = RULIB)
chair
Name (type = personal)
NamePart (type = family)
Metaxas
NamePart (type = given)
Dimitris
DisplayForm
Dimitris Metaxas
Affiliation
Advisory Committee
Role
RoleTerm (authority = RULIB)
internal member
Name (type = personal)
NamePart (type = family)
Bekris
NamePart (type = given)
Kostas
DisplayForm
Kostas Bekris
Affiliation
Advisory Committee
Role
RoleTerm (authority = RULIB)
internal member
Name (type = personal)
NamePart (type = family)
Moustakides
NamePart (type = given)
George
DisplayForm
George Moustakides
Affiliation
Advisory Committee
Role
RoleTerm (authority = RULIB)
internal member
Name (type = personal)
NamePart (type = family)
Ihlefeld
NamePart (type = given)
Antje
DisplayForm
Antje Ihlefeld
Affiliation
Advisory Committee
Role
RoleTerm (authority = RULIB)
outside member
Name (type = corporate)
NamePart
Rutgers University
Role
RoleTerm (authority = RULIB)
degree grantor
Name (type = corporate)
NamePart
School of Graduate Studies
Role
RoleTerm (authority = RULIB)
school
TypeOfResource
Text
Genre (authority = marcgt)
theses
OriginInfo
DateCreated (qualifier = exact)
2018
DateOther (qualifier = exact); (type = degree)
2018-05
CopyrightDate (encoding = w3cdtf); (qualifier = exact)
2018
Place
PlaceTerm (type = code)
xx
Language
LanguageTerm (authority = ISO639-2b); (type = code)
eng
Abstract (type = abstract)
In this work, we present new data types, analytics, and human-computer interfaces as a platform to enable a new kind of co-adaptive behavioural analysis to track neuroplasticity. We present seven different works, all of which are steps towards creating an interface that “collaborates” in a closed loop with the sensory-motor system in order to augment existing sensations or substitute lost ones. Such interfaces are beneficial because they allow the system to adapt and evolve based on the participant’s rate of adaptation and preferences, ultimately steering the system towards favorable regimes. We started by addressing the question: “How does our sensory-motor system learn and adapt to changes?” In a pointing task, subjects had to discover and learn the sequence of points presented on the screen (which was repetitive) and familiarise themselves with an unpredicted event (which occurred occasionally). In this first study, we examined the learnability of the motor system across seven individuals and investigated the learning patterns of each individual. Then, we explored how other bodily signals, such as temperature, affect movement. At this point, we conducted two studies. In the first, we looked into the impact of temperature range on the quality of the performed movement. This study was conducted with 40 individuals: 20 patients with schizophrenia, known to have temperature irregularities, and 20 controls. We identified differences between the two populations in temperature range and in the stochastic signatures of their kinematic data. To take a closer look at the relation between movement and temperature, we conducted a second study utilizing data from a pre-professional ballet student recorded during her six-hour training and her subsequent sleep.
For this study, we designed a new data type that allows us to examine movement as a function of temperature and to see how each degree of temperature impacts the fluctuations in movement. This new data structure could be used for the integration of any bodily signal. Next, we identified the need to build visualization tools that could display, in real time, sensory information extracted from the analysis that would be informative to the participant. Such tools could be used in a vision-driven co-adaptive interface. For this reason, we designed an in-Matlab avatar that enables us to color-code sensory information onto the corresponding body parts of the participant. In our next study, we examined two college-age individuals (a control and an individual with Asperger syndrome) across sensory modalities and preferences. We built methods to extract from the motor stream each individual’s preferred sensory modality (selectivity) and the preferences within that modality that motivate the system to perform at its best (preferability). These two parameters were critical to finally closing the loop by letting the system decide based on the individual’s preferences. We therefore moved from the open-loop approach, to which all the studies described so far belong, to the closed-loop approach. First, we studied a natural closed-loop interface established by the dyadic interaction of two ballet dancers while rehearsing. In this natural paradigm, the closed-loop co-adaptation happens through the touches and pushes that the dancers apply to each other in order to coordinate (kinesthetic adaptation). We therefore applied network connectivity metrics and extracted information such as underlying synergies and leading and lagging body parts, to name a few. Such tools could be used in a vision-driven co-adaptive interface to evaluate the interaction between the participant and the displayed avatar.
Finally, we built an artificial audio-driven co-adaptive interface that can track the adaptation and progress of the individual and intelligently steer the system towards the participant’s preferred and motivating conditions. For this study, we utilized the heart rate of a salsa dancer to adjust the tempo of the music. The study showed that such a system can steer the stochastic signatures even of the heart (an autonomic signal), providing strong evidence that we can guide the human system towards desired regimes.
Subject (authority = RUETD)
Topic
Computer Science
RelatedItem (type = host)
TitleInfo
Title
Rutgers University Electronic Theses and Dissertations
Identifier (type = RULIB)
ETD
Identifier
ETD_8839
PhysicalDescription
Form (authority = gmd)
electronic resource
InternetMediaType
application/pdf
InternetMediaType
text/xml
Extent
1 online resource (xxxviii, 200 p. : ill.)
Note (type = degree)
Ph.D.
Note (type = bibliography)
Includes bibliographical references
Subject (authority = ETD-LCSH)
Topic
Afferent pathways
Subject (authority = ETD-LCSH)
Topic
Human-computer interaction
Note (type = statement of responsibility)
by Vilelmini Kalampratsidou
RelatedItem (type = host)
TitleInfo
Title
School of Graduate Studies Electronic Theses and Dissertations
Identifier (type = local)
rucore10001600001
Location
PhysicalLocation (authority = marcorg); (displayLabel = Rutgers, The State University of New Jersey)
NjNbRU
Identifier (type = doi)
doi:10.7282/T39K4FPM
Genre (authority = ExL-Esploro)
ETD doctoral

Rights

RightsDeclaration (ID = rulibRdec0006)
The author owns the copyright to this work.
RightsHolder (type = personal)
Name
FamilyName
Kalampratsidou
GivenName
Vilelmini
Role
Copyright Holder
RightsEvent
Type
Permission or license
DateTime (encoding = w3cdtf); (point = start); (qualifier = exact)
2018-04-13 11:41:40
AssociatedEntity
Name
Vilelmini Kalampratsidou
Role
Copyright holder
Affiliation
Rutgers University. School of Graduate Studies
AssociatedObject
Type
License
Name
Author Agreement License
Detail
I hereby grant to the Rutgers University Libraries and to my school the non-exclusive right to archive, reproduce and distribute my thesis or dissertation, in whole or in part, and/or my abstract, in whole or in part, in and from an electronic format, subject to the release date subsequently stipulated in this submittal form and approved by my school. I represent and stipulate that the thesis or dissertation and its abstract are my original work, that they do not infringe or violate any rights of others, and that I make these grants as the sole owner of the rights to my thesis or dissertation and its abstract. I represent that I have obtained written permissions, when necessary, from the owner(s) of each third party copyrighted matter to be included in my thesis or dissertation and will supply copies of such upon request by my school. I acknowledge that RU ETD and my school will not distribute my thesis or dissertation or its abstract if, in their reasonable judgment, they believe all such rights have not been secured. I acknowledge that I retain ownership rights to the copyright of my work. I also retain the right to use all or part of this thesis or dissertation in future works, such as articles or books.
RightsEvent
DateTime (encoding = w3cdtf); (point = start); (qualifier = exact)
2020-01-22
DateTime (encoding = w3cdtf); (point = end); (qualifier = exact)
2021-05-31
Type
Embargo
Detail
Access to this PDF has been restricted at the author's request. It will be publicly available after May 31, 2021.
Copyright
Status
Copyright protected
Availability
Status
Open
Reason
Permission or license

Technical

RULTechMD (ID = TECHNICAL1)
ContentModel
ETD
OperatingSystem (VERSION = 5.1)
windows xp
CreatingApplication
Version
1.5
ApplicationName
MiKTeX pdfTeX-1.40.18
DateCreated (point = end); (encoding = w3cdtf); (qualifier = exact)
2018-04-13T05:05:15