Staff View
Reflectance and texture encoding for material recognition and synthesis

Descriptive

TitleInfo
Title
Reflectance and texture encoding for material recognition and synthesis
Name (type = personal)
NamePart (type = family)
Zhang
NamePart (type = given)
Hang
DisplayForm
Hang Zhang
Role
RoleTerm (authority = RULIB)
author
Name (type = personal)
NamePart (type = family)
Dana
NamePart (type = given)
Kristin
DisplayForm
Kristin Dana
Affiliation
Advisory Committee
Role
RoleTerm (authority = RULIB)
chair
Name (type = personal)
NamePart (type = family)
Patel
NamePart (type = given)
Vishal
DisplayForm
Vishal Patel
Affiliation
Advisory Committee
Role
RoleTerm (authority = RULIB)
internal member
Name (type = personal)
NamePart (type = family)
Zhang
NamePart (type = given)
Yanyong
DisplayForm
Yanyong Zhang
Affiliation
Advisory Committee
Role
RoleTerm (authority = RULIB)
internal member
Name (type = personal)
NamePart (type = family)
Nishino
NamePart (type = given)
Ko
DisplayForm
Ko Nishino
Affiliation
Advisory Committee
Role
RoleTerm (authority = RULIB)
outside member
Name (type = personal)
NamePart (type = family)
Muller
NamePart (type = given)
Urs
DisplayForm
Urs Muller
Affiliation
Advisory Committee
Role
RoleTerm (authority = RULIB)
outside member
Name (type = corporate)
NamePart
Rutgers University
Role
RoleTerm (authority = RULIB)
degree grantor
Name (type = corporate)
NamePart
School of Graduate Studies
Role
RoleTerm (authority = RULIB)
school
TypeOfResource
Text
Genre (authority = marcgt)
theses
OriginInfo
DateCreated (qualifier = exact)
2017
DateOther (qualifier = exact); (type = degree)
2017-10
CopyrightDate (encoding = w3cdtf); (qualifier = exact)
2017
Place
PlaceTerm (type = code)
xx
Language
LanguageTerm (authority = ISO639-2b); (type = code)
eng
Abstract (type = abstract)
Material recognition plays an important role in enabling a machine to understand and interact with the world. For example, an autonomous vehicle can use material recognition to determine whether the terrain is asphalt, grass, gravel, ice, or snow in order to optimize its mechanical control, and a robot can grasp an object with the proper pressure if the object's composition is known. This thesis is dedicated to developing fast material recognition techniques toward the goal of building cameras that see materials in real time. Color and geometry are not a full measure of the richness of visual appearance. Reflectance describes the characteristics of light interaction with a surface, which is uniquely determined by how the surface is composed at the microscopic scale (e.g., pigments in the surface medium) and the mesoscopic scale (e.g., geometric 3D texture). Naturally, reflectance provides an invaluable clue about the surface, including what it is made of (i.e., material) and how it is shaped (i.e., surface roughness). We study how reflectance can reveal material categories and physical properties.

1. Reflectance Hashing: Reflectance is challenging to measure and use for recognizing materials because of its high dimensionality. In this work, we bypass the use of a gonioreflectometer by using a novel one-shot reflectance camera based on a parabolic mirror design, which captures reflectance disks whose pixel coordinates correspond to the surface viewing angles. The reflectance has class-specific structure, and angular gradients computed in this reflectance space reveal the material class. These reflectance disks encode discriminative information for efficient and accurate material recognition. We introduce a framework called reflectance hashing that models the reflectance disks with dictionary learning and binary hashing. We demonstrate the effectiveness of reflectance hashing for material recognition with a number of real-world materials.

2. Deep Reflectance Code: We introduce a framework that enables prediction of actual friction values for surfaces using one-shot reflectance measurements, a first-of-its-kind vision-based friction estimation. We develop a novel representation for reflectance disks that capture partial BRDF measurements instantaneously. Our method of deep reflectance codes combines CNN features and Fisher vector pooling with optimal binary embedding to create codes that have sufficient discriminatory power and important properties of illumination and spatial invariance. The experimental results demonstrate that reflectance can play a new role in deciphering the underlying physical properties of real-world scenes.

3. Texture Encoding Network: We propose a Deep Texture Encoding Network (DeepTEN) with a novel Encoding Layer integrated on top of convolutional layers, which ports the entire dictionary learning and encoding pipeline into a single model. The features, the dictionaries, and the encoding representation for the classifier are all learned simultaneously. The representation is orderless and is therefore particularly useful for material and texture recognition. The Encoding Layer generalizes robust residual encoders such as VLAD and Fisher vectors and has the property of discarding domain-specific information, which makes the learned convolutional features easier to transfer (a sketch of the residual encoding operation follows the abstract). The experimental results show superior performance compared to state-of-the-art methods on gold-standard databases such as MINC-2500, the Flickr Material Database, KTH-TIPS-2b, and two recent databases, 4D-Light-Field-Material and GTOS.

4. Real-Time Texture Synthesis: We introduce a Multi-style Generative Network (MSG-Net) with a novel Inspiration Layer, which retains the functionality of optimization-based approaches while keeping the fast speed of feed-forward networks. The proposed Inspiration Layer explicitly matches the feature statistics with the target styles at run time (also sketched below), which dramatically improves the versatility of existing generative networks so that multiple styles can be realized within a single network. The proposed MSG-Net matches image styles at multiple scales and shifts the computational burden to training. The learned generator is a compact feed-forward network that runs in real time after training.

In conclusion, this thesis develops robust visual techniques for material and texture modeling. We introduce the concept of the angular gradient, which has proved effective in material recognition. Our proposed texture encoding network achieves state-of-the-art results on material and texture recognition. We expect the proposed solutions for material recognition to have a tremendous impact on a great number of real-world applications.
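The residual encoding operation named in item 3 of the abstract can be summarized as a layer with learnable codewords and smoothing factors that soft-assigns convolutional features to codewords and aggregates the residuals into an orderless descriptor. The sketch below is illustrative only, not the thesis code; the class name EncodingLayer, the parameter names codewords and scale, and the initialization choices are assumptions made for this sketch.

# Illustrative sketch (not the thesis implementation) of a residual
# encoding layer: learnable codewords, learnable smoothing factors,
# soft assignment, and residual aggregation into an orderless descriptor.
import torch
import torch.nn as nn
import torch.nn.functional as F

class EncodingLayer(nn.Module):
    def __init__(self, channels: int, num_codewords: int):
        super().__init__()
        # K learnable codewords (the "dictionary") and per-codeword scales
        self.codewords = nn.Parameter(torch.randn(num_codewords, channels) * 0.1)
        self.scale = nn.Parameter(torch.ones(num_codewords))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (B, C, H, W) conv features -> (B, N, C) descriptors, N = H*W
        b, c, h, w = x.shape
        x = x.view(b, c, h * w).transpose(1, 2)                   # (B, N, C)
        # residuals r_ik = x_i - c_k : (B, N, K, C)
        residuals = x.unsqueeze(2) - self.codewords.unsqueeze(0).unsqueeze(0)
        # soft assignment a_ik = softmax_k(-s_k * ||r_ik||^2)
        dist = residuals.pow(2).sum(dim=-1)                       # (B, N, K)
        assign = F.softmax(-self.scale * dist, dim=-1)            # (B, N, K)
        # aggregate residuals per codeword: e_k = sum_i a_ik * r_ik
        encoded = (assign.unsqueeze(-1) * residuals).sum(dim=1)   # (B, K, C)
        # orderless descriptor, independent of H and W
        return encoded.flatten(1)                                 # (B, K*C)

For example, placed on top of a 512-channel feature map, EncodingLayer(512, 32) would produce a fixed 512*32-dimensional orderless descriptor regardless of the input's spatial size, which is what makes the representation convenient for material and texture classification.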
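The Inspiration Layer of item 4 matches feature statistics with a chosen target style at run time. The sketch below is a minimal illustration based only on the abstract's description, not the released MSG-Net code; the names InspirationLayer, gram_matrix, and set_target, the single-style-image assumption, and the specific choice of a learned channel transform applied to the target Gram matrix are assumptions made for this sketch.

# Illustrative sketch (assumptions, not the released MSG-Net code) of an
# inspiration-style layer: store the Gram statistics of the desired style
# at run time, transform them with a learned C x C matrix, and use the
# result to re-weight the content feature channels.
import torch
import torch.nn as nn

def gram_matrix(feat: torch.Tensor) -> torch.Tensor:
    # feat: (B, C, H, W) -> channel co-occurrence statistics (B, C, C)
    b, c, h, w = feat.shape
    f = feat.view(b, c, h * w)
    return torch.bmm(f, f.transpose(1, 2)) / (c * h * w)

class InspirationLayer(nn.Module):
    def __init__(self, channels: int):
        super().__init__()
        # learned transform of style statistics, initialized to identity
        self.weight = nn.Parameter(torch.eye(channels).unsqueeze(0))       # (1, C, C)
        self.register_buffer("target_gram", torch.eye(channels).unsqueeze(0))

    def set_target(self, style_feat: torch.Tensor) -> None:
        # record the Gram statistics of the chosen style (batch of 1 assumed)
        self.target_gram = gram_matrix(style_feat)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        # learned transform of the target style statistics: (B, C, C)
        p = torch.bmm(self.weight.expand(b, c, c), self.target_gram.expand(b, c, c))
        # re-weight content channels with the transformed statistics
        out = torch.bmm(p.transpose(1, 2), x.view(b, c, h * w))
        return out.view(b, c, h, w)

Because the style enters only through set_target, swapping the stored Gram matrix switches styles without retraining, which is the versatility the abstract attributes to a single multi-style generator.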
Subject (authority = RUETD)
Topic
Electrical and Computer Engineering
RelatedItem (type = host)
TitleInfo
Title
Rutgers University Electronic Theses and Dissertations
Identifier (type = RULIB)
ETD
Identifier
ETD_8414
PhysicalDescription
Form (authority = gmd)
electronic resource
InternetMediaType
application/pdf
InternetMediaType
text/xml
Extent
1 online resource (xx, 114 p. : ill.)
Note (type = degree)
Ph.D.
Note (type = bibliography)
Includes bibliographical references
Subject (authority = ETD-LCSH)
Topic
Computer vision
Note (type = statement of responsibility)
by Hang Zhang
RelatedItem (type = host)
TitleInfo
Title
School of Graduate Studies Electronic Theses and Dissertations
Identifier (type = local)
rucore10001600001
Location
PhysicalLocation (authority = marcorg); (displayLabel = Rutgers, The State University of New Jersey)
NjNbRU
Identifier (type = doi)
doi:10.7282/T3Z89GMT
Genre (authority = ExL-Esploro)
ETD doctoral

Rights

RightsDeclaration (ID = rulibRdec0006)
The author owns the copyright to this work.
RightsHolder (type = personal)
Name
FamilyName
Zhang
GivenName
Hang
Role
Copyright Holder
RightsEvent
Type
Permission or license
DateTime (encoding = w3cdtf); (qualifier = exact); (point = start)
2017-09-29 20:32:23
AssociatedEntity
Name
Hang Zhang
Role
Copyright holder
Affiliation
Rutgers University. School of Graduate Studies
AssociatedObject
Type
License
Name
Author Agreement License
Detail
I hereby grant to the Rutgers University Libraries and to my school the non-exclusive right to archive, reproduce and distribute my thesis or dissertation, in whole or in part, and/or my abstract, in whole or in part, in and from an electronic format, subject to the release date subsequently stipulated in this submittal form and approved by my school. I represent and stipulate that the thesis or dissertation and its abstract are my original work, that they do not infringe or violate any rights of others, and that I make these grants as the sole owner of the rights to my thesis or dissertation and its abstract. I represent that I have obtained written permissions, when necessary, from the owner(s) of each third party copyrighted matter to be included in my thesis or dissertation and will supply copies of such upon request by my school. I acknowledge that RU ETD and my school will not distribute my thesis or dissertation or its abstract if, in their reasonable judgment, they believe all such rights have not been secured. I acknowledge that I retain ownership rights to the copyright of my work. I also retain the right to use all or part of this thesis or dissertation in future works, such as articles or books.
Copyright
Status
Copyright protected
Availability
Status
Open
Reason
Permission or license

Technical

RULTechMD (ID = TECHNICAL1)
ContentModel
ETD
OperatingSystem (VERSION = 5.1)
windows xp
CreatingApplication
Version
1.5
ApplicationName
pdfTeX-1.40.17
DateCreated (point = end); (encoding = w3cdtf); (qualifier = exact)
2017-09-29T21:07:18