Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/105602
DC Field | Value | Language
dc.contributor | Department of Computing | -
dc.creator | An, W | en_US
dc.creator | Wang, H | en_US
dc.creator | Sun, Q | en_US
dc.creator | Xu, J | en_US
dc.creator | Dai, Q | en_US
dc.creator | Zhang, L | en_US
dc.date.accessioned | 2024-04-15T07:35:19Z | -
dc.date.available | 2024-04-15T07:35:19Z | -
dc.identifier.isbn | 978-1-5386-6420-9 (Electronic) | en_US
dc.identifier.isbn | 978-1-5386-6421-6 (Print on Demand (PoD)) | en_US
dc.identifier.uri | http://hdl.handle.net/10397/105602 | -
dc.language.iso | en | en_US
dc.publisher | Institute of Electrical and Electronics Engineers | en_US
dc.rights | © 2018 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. | en_US
dc.rights | The following publication W. An, H. Wang, Q. Sun, J. Xu, Q. Dai and L. Zhang, "A PID Controller Approach for Stochastic Optimization of Deep Networks," 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, USA, 2018, pp. 8522-8531 is available at https://doi.org/10.1109/CVPR.2018.00889. | en_US
dc.title | A PID controller approach for stochastic optimization of deep networks | en_US
dc.type | Conference Paper | en_US
dc.identifier.spage | 8522 | en_US
dc.identifier.epage | 8531 | en_US
dc.identifier.doi | 10.1109/CVPR.2018.00889 | en_US
dcterms.abstract | Deep neural networks have demonstrated their power in many computer vision applications. State-of-the-art deep architectures such as VGG, ResNet, and DenseNet are mostly optimized by the SGD-Momentum algorithm, which updates the weights by considering their past and current gradients. Nonetheless, SGD-Momentum suffers from the overshoot problem, which hinders the convergence of network training. Inspired by the prominent success of the proportional-integral-derivative (PID) controller in automatic control, we propose a PID approach for accelerating deep network optimization. We first reveal the intrinsic connection between SGD-Momentum and the PID-based controller, and then present an optimization algorithm that exploits the past, current, and change of gradients to update the network parameters. The proposed PID method greatly reduces the overshoot phenomenon of SGD-Momentum, and it achieves up to 50% acceleration on popular deep network architectures with competitive accuracy, as verified by our experiments on benchmark datasets including CIFAR10, CIFAR100, and Tiny-ImageNet. (An illustrative sketch of this update rule appears directly after this metadata listing.) | -
dcterms.accessRights | open access | en_US
dcterms.bibliographicCitation | 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition, 18-22 June 2018, Salt Lake City, Utah, p. 8522-8531 | en_US
dcterms.issued | 2018 | -
dc.identifier.scopus | 2-s2.0-85062879163 | -
dc.relation.conference | Conference on Computer Vision and Pattern Recognition [CVPR] | -
dc.description.validate | 202402 bcch | -
dc.description.oa | Accepted Manuscript | en_US
dc.identifier.FolderNumber | COMP-0762 | -
dc.description.fundingSource | RGC | en_US
dc.description.pubStatus | Published | en_US
dc.identifier.OPUS | 14958891 | -
dc.description.oaCategory | Green (AAM) | en_US
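
The abstract above describes an update rule that augments SGD-Momentum (a proportional-integral style update built from the current and past gradients) with a derivative-like term built from the change of gradients. Below is a minimal NumPy sketch of such a PID-style step; the function name pid_sgd_step, the buffers v and d, and the hyperparameters lr, momentum, and kd are illustrative placeholders and do not reproduce the exact formulation or settings used in the paper.

import numpy as np

def pid_sgd_step(theta, grad, prev_grad, v, d, lr=0.01, momentum=0.9, kd=0.1):
    """One PID-style parameter update (illustrative sketch only).

    P term: the current gradient.
    I term: the momentum buffer v, an exponentially weighted sum of
            past gradients, as in SGD-Momentum.
    D term: the change of the gradient between two iterations,
            smoothed into the buffer d.
    """
    # Integral-like momentum buffer (the standard SGD-Momentum part).
    v = momentum * v + grad
    # Derivative-like buffer: smoothed change of the gradient.
    d = momentum * d + (1.0 - momentum) * (grad - prev_grad)
    # Combine the P+I contribution (carried by v) with the D correction.
    theta = theta - lr * (v + kd * d)
    return theta, v, d

# Toy usage on a quadratic loss L(theta) = 0.5 * ||theta||^2, whose gradient is theta.
theta = np.array([5.0, -3.0])
v = np.zeros_like(theta)
d = np.zeros_like(theta)
prev_grad = np.zeros_like(theta)
for _ in range(200):
    grad = theta.copy()              # gradient of the toy loss at the current point
    theta, v, d = pid_sgd_step(theta, grad, prev_grad, v, d)
    prev_grad = grad
print(theta)                         # converges toward the minimum at the origin

The derivative term reacts to how quickly the gradient is changing, which is what lets a PID-style update damp the oscillation (overshoot) that a pure momentum buffer can build up.
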
Appears in Collections: Conference Paper
Files in This Item:
File | Description | Size | Format
An_Pid_Controller_Approach.pdf | Pre-Published version | 1.25 MB | Adobe PDF | View/Open
Open Access Information
Status: open access
File Version: Final Accepted Manuscript

Page views: 73 (last week: 5; as of Nov 9, 2025)
Downloads: 37 (as of Nov 9, 2025)
SCOPUS™ citations: 122 (as of Dec 19, 2025)
Web of Science™ citations: 99 (as of Dec 18, 2025)
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.