Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/109475
DC Field | Value | Language
dc.contributor | Department of Electrical and Electronic Engineering | en_US
dc.creator | Xie, B | en_US
dc.creator | Cui, H | en_US
dc.creator | Ho, IWH | en_US
dc.creator | He, Y | en_US
dc.creator | Guizani, M | en_US
dc.date.accessioned | 2024-10-30T06:44:38Z | -
dc.date.available | 2024-10-30T06:44:38Z | -
dc.identifier.issn | 1536-1233 | en_US
dc.identifier.uri | http://hdl.handle.net/10397/109475 | -
dc.language.iso | en | en_US
dc.publisher | Institute of Electrical and Electronics Engineers | en_US
dc.rights | © 2024 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. | en_US
dc.rights | The following publication B. Xie, H. Cui, I. W. -H. Ho, Y. He and M. Guizani, "Computation Offloading and Resource Allocation in LEO Satellite-Terrestrial Integrated Networks With System State Delay," in IEEE Transactions on Mobile Computing, vol. 24, no. 3, pp. 1372-1385, March 2025 is available at https://doi.org/10.1109/TMC.2024.3479243. | en_US
dc.subject | Computing offloading | en_US
dc.subject | Deep reinforcement learning | en_US
dc.subject | Satellite-terrestrial integrated networks | en_US
dc.subject | System state delays in learning | en_US
dc.title | Computation offloading and resource allocation in LEO satellite-terrestrial integrated networks with system state delay | en_US
dc.type | Journal/Magazine Article | en_US
dc.identifier.spage | 1372 | en_US
dc.identifier.epage | 1385 | en_US
dc.identifier.volume | 24 | en_US
dc.identifier.issue | 3 | en_US
dc.identifier.doi | 10.1109/TMC.2024.3479243 | en_US
dcterms.abstract | Computation offloading optimization for energy saving is becoming increasingly important in low-Earth orbit (LEO) satellite-terrestrial integrated networks (STINs), since battery technology has not kept pace with the demands of ground terminal devices. In this paper, we design a delay-based deep reinforcement learning (DRL) framework specifically for computation offloading decisions, which can effectively reduce energy consumption. Additionally, we develop a multi-level feedback queue for computing resource allocation (RAMLFQ), which can effectively enhance the CPU's efficiency in task scheduling. We initially formulate the computation offloading problem with system state delay as a Delay Markov Decision Process (DMDP), and then transform it into an equivalent standard Markov Decision Process (MDP). To solve the optimization problem effectively, we employ a double deep Q-network (DDQN) method, enhancing it with an augmented state space to better handle the unique challenges posed by system state delays. Simulation results demonstrate that the proposed learning-based computation offloading algorithm achieves high performance efficiency and attains a lower total cost than existing offloading methods. | en_US
dcterms.accessRights | open access | en_US
dcterms.bibliographicCitation | IEEE transactions on mobile computing, Mar. 2025, v. 24, no. 3, p. 1372-1385 | en_US
dcterms.isPartOf | IEEE transactions on mobile computing | en_US
dcterms.issued | 2025-03 | -
dc.identifier.eissn | 1558-0660 | en_US
dc.description.validate | 202410 bcch | en_US
dc.description.oa | Accepted Manuscript | en_US
dc.identifier.FolderNumber | a3255 | -
dc.identifier.SubFormID | 49844 | -
dc.description.fundingSource | Self-funded | en_US
dc.description.pubStatus | Published | en_US
dc.description.oaCategory | Green (AAM) | en_US
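The abstract describes formulating the offloading problem as a Delay MDP and transforming it into an equivalent standard MDP via an augmented state space. A common form of that reduction pairs the stale observation with the last d actions, which restores the Markov property so a standard DDQN can be trained on the augmented state. The sketch below is a generic illustration of that idea, not the paper's implementation; the environment interface, the toy counter environment, and the null-action padding are all assumptions.

```python
from collections import deque


class AugmentedStateWrapper:
    """DMDP-to-MDP reduction: under a d-step observation delay, the pair
    (stale observation, last d actions) is a sufficient Markov state.
    The wrapped env interface (reset/step) is hypothetical."""

    def __init__(self, env, delay, null_action=0):
        self.env = env
        self.delay = delay
        self.null_action = null_action
        self.actions = deque(maxlen=delay)  # sliding window of recent actions

    def reset(self):
        obs = self.env.reset()
        self.actions.clear()
        self.actions.extend([self.null_action] * self.delay)  # pad history
        return self._augment(obs)

    def step(self, action):
        obs, reward, done = self.env.step(action)  # obs is d steps stale
        self.actions.append(action)
        return self._augment(obs), reward, done

    def _augment(self, obs):
        # augmented state = (d-step-old observation, last d actions)
        return (obs, tuple(self.actions))


class ToyDelayedEnv:
    """Toy counter whose observation lags `delay` steps behind true time."""

    def __init__(self, delay=2):
        self.delay = delay
        self.t = 0

    def reset(self):
        self.t = 0
        return 0  # observation of the state delay steps ago, clipped at 0

    def step(self, action):
        self.t += 1
        stale = max(0, self.t - self.delay)
        return stale, float(action), self.t >= 5


env = AugmentedStateWrapper(ToyDelayedEnv(delay=2), delay=2)
state = env.reset()
print(state)             # (0, (0, 0))
state, r, done = env.step(1)
print(state)             # (0, (0, 1)) — stale observation plus action history
```

Because the augmented state is an ordinary (if larger) observation vector, it can be fed directly to any off-the-shelf DDQN agent; the cost is a state space that grows with the delay length.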
Appears in Collections: Journal/Magazine Article
Files in This Item:
File | Description | Size | Format
Xie_Computation_Offloading_Resource.pdf | Pre-Published version | 6.22 MB | Adobe PDF
Open Access Information
Status: open access
File Version: Final Accepted Manuscript