Title: Implicit visual learning : image recognition via dissipative learning model
Authors: Liu, Y
Zhong, S
Wu, S
Keywords: Dissipative implicit learning model
Dissipative theory
Image recognition
Implicit learning
Visual data analysis
Issue Date: 2016
Publisher: Association for Computing Machinery
Source: ACM Transactions on Intelligent Systems and Technology, 2016, v. 8, no. 2, 31
Journal: ACM Transactions on Intelligent Systems and Technology
Abstract: According to the degree of consciousness involved, human learning can be roughly classified into explicit learning and implicit learning. In strong contrast to explicit learning, which has clear targets and rules (such as studying mathematics at school), learning is implicit when we acquire new information without intending to do so. Research in psychology indicates that implicit learning is ubiquitous in our daily life. Moreover, implicit learning plays an important role in human visual perception. Yet over the past 60 years, most well-known machine-learning models have aimed to simulate explicit learning, while work on modeling implicit learning has been relatively limited, especially for computer vision applications. This article proposes a novel unsupervised computational model for implicit visual learning by exploring dissipative systems, which provide a unifying macroscopic theory connecting biology with physics. We test the proposed Dissipative Implicit Learning Model (DILM) on various datasets. The experiments show that DILM not only provides a good match to human behavior but also markedly improves explicit machine-learning performance on image classification tasks.
ISSN: 2157-6904
EISSN: 2157-6912
DOI: 10.1145/2974024
Appears in Collections:Journal/Magazine Article




Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.