Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/32000
DC Field / Value
dc.contributor: Department of Electronic and Information Engineering
dc.creator: Fung, KK
dc.creator: Lam, KM
dc.date.accessioned: 2015-09-30T09:43:42Z
dc.date.available: 2015-09-30T09:43:42Z
dc.identifier.isbn: 978-1-4244-5560-7
dc.identifier.uri: http://hdl.handle.net/10397/32000
dc.language.iso: en
dc.publisher: IEEE
dc.subject: Feature extraction
dc.subject: Gabor filters
dc.subject: Image processing
dc.subject: Invariance
dc.subject: Object recognition
dc.title: Rotation-invariant texture analysis using multi-resolution slit kernels
dc.type: Conference Paper
dc.identifier.spage: 199
dc.identifier.epage: 204
dc.identifier.doi: 10.1109/ICSIPA.2009.5478618
dcterms.abstract: This paper presents a set of simple Slit kernels suitable for regular and rotation-invariant texture analysis. A fast algorithm for computing the corresponding features is also presented. Extracting Slit features requires only 11% to 17% of the arithmetic operations needed for Gabor features. Experimental results on the classification of rotated texture images indicate that Slit kernels perform as well as, and sometimes better than, Gabor kernels. This makes Slit kernels suitable for applications where computational speed is important. (An illustrative sketch of the general idea follows this record.)
dcterms.bibliographicCitation: 2009 IEEE International Conference on Signal and Image Processing Applications (ICSIPA), 18-19 November 2009, Kuala Lumpur, p. 199-204
dcterms.issued: 2009
dc.relation.ispartofbook: 2009 IEEE International Conference on Signal and Image Processing Applications (ICSIPA), 18-19 November 2009, Kuala Lumpur
dc.identifier.rosgroupid: r46940
dc.description.ros: 2009-2010 > Academic research: refereed > Refereed conference paper
Appears in Collections: Conference Paper
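
The record above carries only the abstract, not the paper's kernel definitions, so the following is a minimal Python sketch of the general idea: oriented line-shaped ("slit") averaging kernels at several scales, with responses pooled over orientation to obtain rotation invariance. Every name here (slit_kernel, slit_features), the kernel sizes, and the orientation-max pooling rule are assumptions made for illustration, not the authors' actual method.

import numpy as np
from scipy.ndimage import convolve

def slit_kernel(size, angle):
    """Oriented line-shaped ("slit") averaging kernel.
    Hypothetical construction; the paper's exact multi-resolution
    definition may differ."""
    k = np.zeros((size, size))
    c = (size - 1) / 2.0
    # Rasterise a line through the kernel centre at the given angle.
    for t in np.linspace(-c, c, 4 * size):
        row = int(round(c + t * np.sin(angle)))
        col = int(round(c + t * np.cos(angle)))
        k[row, col] = 1.0
    return k / k.sum()

def slit_features(image, sizes=(5, 9, 17), n_orient=8):
    """One feature per scale: the mean over pixels of the
    orientation-wise maximum response, which stays (approximately)
    unchanged when the texture is rotated, up to the angular
    sampling of the kernels. Pooling rule is an assumption."""
    feats = []
    angles = np.linspace(0.0, np.pi, n_orient, endpoint=False)
    for size in sizes:
        responses = np.stack([convolve(image.astype(float),
                                       slit_kernel(size, a))
                              for a in angles])
        feats.append(responses.max(axis=0).mean())
    return np.array(feats)

# Usage on a synthetic texture patch:
rng = np.random.default_rng(0)
texture = rng.random((128, 128))
print(slit_features(texture))  # one feature per scale in `sizes`

A slit kernel has support only along a single line, so it has on the order of size nonzero taps, whereas a full 2D Gabor kernel has on the order of size**2. That asymmetry is consistent with, though not a derivation of, the abstract's claim that Slit features need only 11% to 17% of the arithmetic operations of Gabor features.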
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.