Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/110998
Title: Method and system for training model based on data quantization and hardware acceleration
Other Title: 一种训练基于数据量化与硬件加速的模型的方法及系统
Authors: Guo, S 
Zhou, Q 
Xie, X 
Issue Date: 13-Feb-2024
Source: Chinese patent ZL 202110211440.1
Abstract: The invention discloses a method for training a model based on data quantization and hardware acceleration, and an edge intelligence system. In the forward-propagation stage of model training, the method converts the data processed by the edge intelligence model into low-bit fixed-point numbers, effectively reducing the model's computational cost, and applies an error-compensation mechanism to preserve the quality of the final model and the accuracy of its inference results. In the backward-propagation stage, a gradient-truncation mechanism keeps the model-update process stable. This addresses the prior-art problems that models on edge intelligent devices incur large computation and storage overheads during training and inference, deliver low prediction accuracy, and struggle to handle highly dynamic real-time tasks.
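The full patent text is not reproduced in this record, but the three mechanisms the abstract names (low-bit fixed-point quantization in the forward pass, error compensation, and gradient truncation in the backward pass) can be illustrated roughly as below. This is a minimal sketch under assumptions, not the patented method: the bit widths, the residual-feedback form of error compensation, and the clipping threshold are all hypothetical choices.

```python
import numpy as np

def quantize_fixed_point(x, bits=8, frac_bits=4):
    """Round x onto a signed low-bit fixed-point grid.
    Bit widths here are illustrative, not taken from the patent."""
    scale = 2.0 ** frac_bits
    qmin = -(2 ** (bits - 1))
    qmax = 2 ** (bits - 1) - 1
    q = np.clip(np.round(x * scale), qmin, qmax)
    return q / scale

def forward_with_error_compensation(x, residual):
    """Quantize forward-pass data, feeding the previous rounding error
    back in so quantization errors do not accumulate across steps
    (one common form of error compensation; assumed here)."""
    x_comp = x + residual
    x_q = quantize_fixed_point(x_comp)
    new_residual = x_comp - x_q  # error to be compensated on the next step
    return x_q, new_residual

def truncate_gradient(grad, threshold=1.0):
    """Clip each gradient component to [-threshold, threshold]
    to keep the backward-pass updates stable."""
    return np.clip(grad, -threshold, threshold)

# Example use on random data:
rng = np.random.default_rng(0)
activations = rng.normal(size=5)
residual = np.zeros_like(activations)
activations_q, residual = forward_with_error_compensation(activations, residual)
grad = truncate_gradient(rng.normal(scale=5.0, size=5))
```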
Publisher: China National Intellectual Property Administration
Rights: Assignee: Shenzhen Research Institute of The Hong Kong Polytechnic University
Appears in Collections: Patent

Files in This Item:
File: ZL202110211440.1.pdf (1.11 MB, Adobe PDF)
Open Access Information
Status: open access
File Version: Version of Record

Page views: 3 as of Apr 14, 2025
Downloads: 10 as of Apr 14, 2025

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.