Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/711
dc.contributor: Department of Electronic and Information Engineering
dc.creator: Small, M
dc.creator: Tse, CKM
dc.date.accessioned: 2014-12-11T08:24:55Z
dc.date.available: 2014-12-11T08:24:55Z
dc.identifier.issn: 1539-3755
dc.identifier.uri: http://hdl.handle.net/10397/711
dc.language.iso: en
dc.publisher: American Physical Society
dc.rights: Copyright 2002 by the American Physical Society
dc.subject: Algorithms
dc.subject: Computer simulation
dc.subject: Data reduction
dc.subject: Extrapolation
dc.subject: Information analysis
dc.subject: Iterative methods
dc.subject: Polynomials
dc.subject: Radial basis function networks
dc.subject: Real time systems
dc.subject: Time series analysis
dc.title: Minimum description length neural networks for time series prediction
dc.type: Journal/Magazine Article
dc.description.otherinformation: PACS number(s): 02.70.Rr, 05.45.Tp, 05.45.Pq
dc.description.otherinformation: Author name used in this publication: C. K. Tse
dc.identifier.spage: 1
dc.identifier.epage: 12
dc.identifier.volume: 66
dc.identifier.issue: 6
dc.identifier.doi: 10.1103/PhysRevE.66.066701
dcterms.abstract: Artificial neural networks (ANN) are typically composed of a large number of nonlinear functions (neurons) each with several linear and nonlinear parameters that are fitted to data through a computationally intensive training process. Longer training results in a closer fit to the data, but excessive training will lead to overfitting. We propose an alternative scheme that has previously been described for radial basis functions (RBF). We show that fundamental differences between ANN and RBF make application of this scheme to ANN nontrivial. Under this scheme, the training process is replaced by an optimal fitting routine, and overfitting is avoided by controlling the number of neurons in the network. We show that for time series modeling and prediction, this procedure leads to small models (few neurons) that mimic the underlying dynamics of the system well and do not overfit the data. We apply this algorithm to several computational and real systems including chaotic differential equations, the annual sunspot count, and experimental data obtained from a chaotic laser. Our experiments indicate that the structural differences between ANN and RBF make ANN particularly well suited to modeling chaotic time series data.
dcterms.accessRights: open access
dcterms.bibliographicCitation: Physical review. E, Statistical, nonlinear, and soft matter physics, Dec. 2002, v. 66, no. 6, 066701, p. 1-12
dcterms.isPartOf: Physical review. E, Statistical, nonlinear, and soft matter physics
dcterms.issued: 2002-12
dc.identifier.isi: WOS:000180427100105
dc.identifier.scopus: 2-s2.0-41349101486
dc.identifier.pmid: 12513438
dc.identifier.eissn: 1550-2376
dc.identifier.rosgroupid: r15728
dc.description.ros: 2002-2003 > Academic research: refereed > Publication in refereed journal
dc.description.oa: Version of Record
dc.identifier.FolderNumber: OA_IR/PIRA
dc.description.pubStatus: Published
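The abstract above describes replacing iterative neural-network training with an optimal fitting step and choosing the model size with a description-length criterion. As a rough illustration only, the following Python sketch applies the same idea to the simpler RBF case mentioned in the abstract: models with increasing numbers of Gaussian basis functions are fitted by linear least squares, and the size that minimises a simplified description-length cost is kept. The function names (fit_mdl_rbf, embed), the randomly chosen centres, the fixed basis width, and the BIC-like penalty are illustrative assumptions and not the algorithm published in the article.

```python
import numpy as np

# Minimal sketch (not the paper's exact algorithm): choose the number of
# Gaussian radial basis functions for one-step time-series prediction by
# minimising a simplified minimum-description-length (MDL) cost.
# Assumptions: centres are drawn from the embedded data, widths are fixed,
# and the model-description term is approximated by a BIC-like penalty.

def embed(x, dim, lag=1):
    """Time-delay embedding: each row holds dim lagged samples; target is the next sample."""
    x = np.asarray(x)
    n = len(x) - (dim - 1) * lag - 1
    X = np.column_stack([x[i * lag : i * lag + n] for i in range(dim)])
    y = x[(dim - 1) * lag + 1 : (dim - 1) * lag + 1 + n]
    return X, y

def rbf_design(X, centres, width):
    """Gaussian basis matrix with a constant (bias) column."""
    d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
    return np.hstack([np.exp(-d2 / (2.0 * width ** 2)), np.ones((len(X), 1))])

def description_length(y, y_hat, n_params):
    """Simplified MDL cost: data code length plus a BIC-like parameter penalty."""
    n = len(y)
    mse = np.mean((y - y_hat) ** 2)
    return 0.5 * n * np.log(mse) + 0.5 * n_params * np.log(n)

def fit_mdl_rbf(x, dim=3, max_neurons=20, width=None, seed=0):
    rng = np.random.default_rng(seed)
    X, y = embed(x, dim)
    if width is None:
        width = np.std(x)  # crude global width choice
    best = None
    for k in range(1, max_neurons + 1):
        centres = X[rng.choice(len(X), size=k, replace=False)]
        Phi = rbf_design(X, centres, width)
        # For fixed centres the "optimal fitting" step is a linear least-squares solve.
        w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
        dl = description_length(y, Phi @ w, n_params=Phi.shape[1])
        if best is None or dl < best[0]:
            best = (dl, k, centres, w)
    return best  # (description length, neuron count, centres, weights)

if __name__ == "__main__":
    # Demo on a noisy sine wave as a stand-in for a chaotic series.
    t = np.linspace(0, 40 * np.pi, 2000)
    x = np.sin(t) + 0.05 * np.random.default_rng(1).standard_normal(len(t))
    dl, k, _, _ = fit_mdl_rbf(x)
    print(f"selected {k} neurons, description length {dl:.1f}")
```

For real data such as the sunspot or laser series discussed in the article, the embedding dimension, lag, basis width, and centre placement would all need to be chosen more carefully than in this sketch.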
Appears in Collections: Journal/Magazine Article
Files in This Item:
File: series-prediction_02.pdf (150.57 kB, Adobe PDF)
Open Access Information
Status: open access
File Version: Version of Record

Page views: 125 (as of Apr 14, 2024)
Downloads: 300 (as of Apr 14, 2024)
Scopus citations: 74 (as of Apr 5, 2024)
Web of Science citations: 66 (as of Apr 18, 2024)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.