Staff View
Explanation-driven learning-based models for visual recognition tasks

Descriptive

TitleInfo
Title
Explanation-driven learning-based models for visual recognition tasks
Name (type = personal)
NamePart (type = family)
Daniels
NamePart (type = given)
Zachary
NamePart (type = date)
1991
DisplayForm
Zachary Daniels
Role
RoleTerm (authority = RULIB); (type = text)
author
Name (type = personal)
NamePart (type = family)
Metaxas
NamePart (type = given)
Dimitris N
DisplayForm
Dimitris N Metaxas
Affiliation
Advisory Committee
Role
RoleTerm (authority = RULIB)
chair
Name (type = personal)
NamePart (type = family)
Michmizos
NamePart (type = given)
Konstantinos
DisplayForm
Konstantinos Michmizos
Affiliation
Advisory Committee
Role
RoleTerm (authority = RULIB)
internal member
Name (type = personal)
NamePart (type = family)
Moustakides
NamePart (type = given)
George
DisplayForm
George Moustakides
Affiliation
Advisory Committee
Role
RoleTerm (authority = RULIB)
internal member
Name (type = personal)
NamePart (type = family)
Li
NamePart (type = given)
Fuxin
DisplayForm
Fuxin Li
Affiliation
Advisory Committee
Role
RoleTerm (authority = RULIB)
outside member
Name (type = corporate)
NamePart
Rutgers University
Role
RoleTerm (authority = RULIB)
degree grantor
Name (type = corporate)
NamePart
School of Graduate Studies
Role
RoleTerm (authority = RULIB)
school
TypeOfResource
Text
Genre (authority = marcgt)
theses
OriginInfo
DateCreated (qualifier = exact); (encoding = w3cdtf); (keyDate = yes)
2020
DateOther (type = degree); (qualifier = exact); (encoding = w3cdtf)
2020-10
Language
LanguageTerm (authority = ISO 639-3:2007); (type = text)
English
Abstract
Safety-critical applications (e.g., autonomous vehicles, human-machine teaming, and automated medical diagnosis) often require the use of computational agents that are capable of understanding and reasoning about the high-level content of real-world scene images in order to make rational and grounded decisions that can be trusted by humans. Many of these agents rely on machine learning-based models which are increasingly being treated as black-boxes. One way to increase model interpretability is to make explainability a core principle of the model, e.g., by forcing deep neural networks to explicitly learn grounded and interpretable features. In this thesis, I provide a high-level overview of the field of explainable/interpretable machine learning and review some existing approaches for interpreting neural networks used for computer vision tasks. I also introduce four novel approaches for making convolutional neural networks (CNNs) more interpretable by utilizing explainability as a guiding principle when designing the model architecture. Finally, I discuss some possible future research directions involving explanation-driven machine learning.
Subject (authority = local)
Topic
Computer vision
Subject (authority = RUETD)
Topic
Computer Science
RelatedItem (type = host)
TitleInfo
Title
Rutgers University Electronic Theses and Dissertations
Identifier (type = RULIB)
ETD
Identifier
ETD_11011
PhysicalDescription
Form (authority = gmd)
InternetMediaType
application/pdf
InternetMediaType
text/xml
Extent
1 online resource (xxiii, 210 pages)
Note (type = degree)
Ph.D.
Note (type = bibliography)
Includes bibliographical references
Genre (authority = ExL-Esploro)
ETD doctoral
RelatedItem (type = host)
TitleInfo
Title
School of Graduate Studies Electronic Theses and Dissertations
Identifier (type = local)
rucore10001600001
Location
PhysicalLocation (authority = marcorg); (displayLabel = Rutgers, The State University of New Jersey)
NjNbRU
Identifier (type = doi)
doi:10.7282/t3-8w7j-1v31
Rights

RightsDeclaration (ID = rulibRdec0006)
The author owns the copyright to this work.
RightsHolder (type = personal)
Name
FamilyName
Daniels
GivenName
Zachary
Role
Copyright Holder
RightsEvent
Type
Permission or license
DateTime (encoding = w3cdtf); (qualifier = exact); (point = start)
2020-06-02 18:53:29
AssociatedEntity
Name
Zachary Daniels
Role
Copyright holder
Affiliation
Rutgers University. School of Graduate Studies
AssociatedObject
Type
License
Name
Author Agreement License
Detail
I hereby grant to the Rutgers University Libraries and to my school the non-exclusive right to archive, reproduce and distribute my thesis or dissertation, in whole or in part, and/or my abstract, in whole or in part, in and from an electronic format, subject to the release date subsequently stipulated in this submittal form and approved by my school. I represent and stipulate that the thesis or dissertation and its abstract are my original work, that they do not infringe or violate any rights of others, and that I make these grants as the sole owner of the rights to my thesis or dissertation and its abstract. I represent that I have obtained written permissions, when necessary, from the owner(s) of each third party copyrighted matter to be included in my thesis or dissertation and will supply copies of such upon request by my school. I acknowledge that RU ETD and my school will not distribute my thesis or dissertation or its abstract if, in their reasonable judgment, they believe all such rights have not been secured. I acknowledge that I retain ownership rights to the copyright of my work. I also retain the right to use all or part of this thesis or dissertation in future works, such as articles or books.
Copyright
Status
Copyright protected
Availability
Status
Open
Reason
Permission or license
Technical

RULTechMD (ID = TECHNICAL1)
ContentModel
ETD
OperatingSystem (VERSION = 5.1)
windows xp
CreatingApplication
Version
1.5
DateCreated (point = end); (encoding = w3cdtf); (qualifier = exact)
2020-07-06T23:01:38
ApplicationName
pdfTeX-1.40.19