Title: Nonvolatile main memory aware garbage collection in high-level language virtual machine
Authors: Chen, P
Xie, M
Yang, CM
Shao, ZL
Hu, JT
Keywords: Virtual machines
Embedded systems
Maximum likelihood estimation
Random-access storage
Storage management
Issue Date: 2015
Publisher: Institute of Electrical and Electronics Engineers
Source: The International Conference on Embedded Software (EMSOFT 2015), Amsterdam, The Netherlands, October 4-9, 2015, p. 197-206
Abstract: Non-volatile memories (NVMs) such as Phase Change Memory (PCM) have been considered promising candidates for next-generation main memory in embedded systems due to their attractive features, including low power, high density, and better scalability. However, most existing NVMs suffer from two drawbacks: limited write endurance and expensive write operations in terms of both time and energy. These problems are worsened when modern high-level languages employ virtual machines with garbage collectors, which generate a large number of extra writes to non-volatile main memory. To tackle this challenge, this paper proposes three techniques: Living Objects Remapping (LORE), Dead Object Stamping (DOS), and Smart Wiping with Maximum Likelihood Estimation (SMILE), which reduce unnecessary writes when the garbage collector handles objects. The experimental results show that the proposed techniques not only significantly reduce the writes during each garbage collection cycle but also greatly improve the performance of the virtual machine.
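To illustrate the general class of write-reduction idea the abstract describes (this is a generic sketch, not the paper's LORE, DOS, or SMILE algorithms): a read-compare-write filter can skip an NVM write whenever the target byte already holds the desired value, so redundant copies performed by a garbage collector cost far fewer physical writes.

```python
def filtered_copy(src: bytes, dst_mem: bytearray, dst_off: int) -> int:
    """Copy src into dst_mem at dst_off, skipping bytes that already match.

    Returns the number of physical writes actually performed; a plain
    memcpy would always perform len(src) writes to the NVM.
    """
    writes = 0
    for i, b in enumerate(src):
        if dst_mem[dst_off + i] != b:  # read-before-write check
            dst_mem[dst_off + i] = b
            writes += 1
    return writes


mem = bytearray(b"hello world!")
# Re-copying an object over identical contents costs zero NVM writes.
assert filtered_copy(b"hello", mem, 0) == 0
# Only the bytes that actually differ are written.
assert filtered_copy(b"HELLO", mem, 0) == 5
assert mem.startswith(b"HELLO")
```

Reads on PCM are far cheaper than writes, so trading an extra read per byte for a skipped write is generally a net win in both latency and endurance.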
ISBN: 978-1-4673-8079-9 (electronic)
978-1-4673-8080-5 (Print on Demand(PoD))
DOI: 10.1109/EMSOFT.2015.7318275
Appears in Collections:Conference Paper




Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.