Please use this identifier to cite or link to this item:
http://hdl.handle.net/10397/105278
DC Field | Value | Language |
---|---|---|
dc.contributor | Department of Rehabilitation Sciences | - |
dc.creator | Song, L | - |
dc.creator | Wang, A | - |
dc.creator | Zhong, J | - |
dc.date.accessioned | 2024-04-12T06:51:14Z | - |
dc.date.available | 2024-04-12T06:51:14Z | - |
dc.identifier.uri | http://hdl.handle.net/10397/105278 | - |
dc.language.iso | en | en_US |
dc.publisher | MDPI AG | en_US |
dc.rights | © 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). | en_US |
dc.rights | The following publication Song L, Wang A, Zhong J. Inverse Dynamics Modeling and Analysis of Healthy Human Data for Lower Limb Rehabilitation Robots. Electronics. 2022; 11(23):3848 is available at https://doi.org/10.3390/electronics11233848. | en_US |
dc.subject | Gated recurrent unit | en_US |
dc.subject | Long short-term memory | en_US |
dc.subject | Lower limb exoskeleton robot | en_US |
dc.subject | Model learning | en_US |
dc.title | Inverse dynamics modeling and analysis of healthy human data for lower limb rehabilitation robots | en_US |
dc.type | Journal/Magazine Article | en_US |
dc.identifier.volume | 11 | - |
dc.identifier.issue | 23 | - |
dc.identifier.doi | 10.3390/electronics11233848 | - |
dcterms.abstract | Bio-controllers inspired by the characteristics of the human lower limb play an important role in the study of lower limb rehabilitation robots (LLRRs). However, inverse dynamics modeling for human lower limb rehabilitation robots remains challenging due to the non-linear and strongly coupled characteristics of the bio-controller. To improve the accuracy of the inverse dynamics model, this paper proposes a non-parametric modeling approach to learn it. The main idea is to use motion data from the main lower-limb joints of healthy subjects as input and the corresponding joint torques as output, with the mapping learned by training a neural network. To ensure that the learned model can be used on LLRRs, all data collected in this paper are real data from human lower limbs. In addition, since the collected data are time series, this paper proposes using long short-term memory (LSTM) and gated recurrent unit (GRU) networks to learn the inverse dynamics model of the human lower limb for the robot, and compares the learning performance of the two networks. The evaluation metric for both network models is the root mean square error (RMSE). The experimental results show that both networks learn the model well, and that the GRU network outperforms the LSTM network. | - |
dcterms.accessRights | open access | en_US |
dcterms.bibliographicCitation | Electronics (Switzerland), Dec. 2022, v. 11, no. 23, 3848 | - |
dcterms.isPartOf | Electronics (Switzerland) | - |
dcterms.issued | 2022-12 | - |
dc.identifier.scopus | 2-s2.0-85143492718 | - |
dc.identifier.eissn | 2079-9292 | - |
dc.identifier.artn | 3848 | - |
dc.description.validate | 202403 bcvc | - |
dc.description.oa | Version of Record | en_US |
dc.identifier.FolderNumber | OA_Scopus/WOS | en_US |
dc.description.fundingSource | Others | en_US |
dc.description.fundingText | National Natural Science Foundation of China | en_US |
dc.description.pubStatus | Published | en_US |
dc.description.oaCategory | CC | en_US |
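The abstract describes learning an inverse dynamics model by training a recurrent network on joint-motion sequences (input) and joint torques (output), evaluated by RMSE. Below is a minimal sketch of that idea, not the authors' implementation: a hand-rolled single GRU cell with a linear readout, run on synthetic stand-in data. All shapes, signals, and weights are hypothetical placeholders for the real human gait data used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GRUCell:
    """One GRU step: h_t = (1 - z) * h_{t-1} + z * h_tilde."""
    def __init__(self, n_in, n_hidden):
        s = 1.0 / np.sqrt(n_hidden)
        # One weight matrix per gate, acting on [x; h] concatenated.
        self.Wz = rng.uniform(-s, s, (n_hidden, n_in + n_hidden))  # update gate
        self.Wr = rng.uniform(-s, s, (n_hidden, n_in + n_hidden))  # reset gate
        self.Wh = rng.uniform(-s, s, (n_hidden, n_in + n_hidden))  # candidate

    def step(self, x, h):
        xh = np.concatenate([x, h])
        z = sigmoid(self.Wz @ xh)                     # how much to update
        r = sigmoid(self.Wr @ xh)                     # how much history to keep
        h_tilde = np.tanh(self.Wh @ np.concatenate([x, r * h]))
        return (1.0 - z) * h + z * h_tilde

def rmse(y_true, y_pred):
    """Root mean square error, the evaluation metric named in the abstract."""
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

# Synthetic stand-in for lower-limb kinematics: 3 joints (hip, knee, ankle),
# 50 time steps of angle/velocity pairs -> 6 input features per step.
T, n_in, n_hidden = 50, 6, 16
cell = GRUCell(n_in, n_hidden)
readout = rng.uniform(-0.1, 0.1, (3, n_hidden))  # hidden state -> 3 joint torques

h = np.zeros(n_hidden)
preds = []
for t in range(T):
    x = np.sin(0.1 * t + np.arange(n_in))  # toy periodic gait-like signal
    h = cell.step(x, h)
    preds.append(readout @ h)
preds = np.array(preds)

targets = np.zeros_like(preds)  # placeholder for measured joint torques
print("RMSE:", rmse(targets, preds))
```

In practice the weights would be trained by backpropagation through time against measured torques (as done in the paper via LSTM and GRU networks); this untrained forward pass only illustrates the data flow and the metric.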
Appears in Collections: | Journal/Magazine Article |
Files in This Item:
File | Description | Size | Format | |
---|---|---|---|---|
electronics-11-03848-v2.pdf | | 9.86 MB | Adobe PDF | View/Open |
Page views: 13 (as of Jul 7, 2024)
Downloads: 2 (as of Jul 7, 2024)
Scopus™ citations: 3 (as of Jul 4, 2024)
Web of Science™ citations: 1 (as of Jul 4, 2024)
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.