Title: Improving the packet send-time accuracy in embedded devices
Authors: Mok, RK
Li, W 
Chang, RK 
Issue Date: 2015
Publisher: Springer
Source: Lecture notes in computer science (including subseries Lecture notes in artificial intelligence and lecture notes in bioinformatics), 2015, v. 8995, p. 332-334
Journal: Lecture notes in computer science (including subseries Lecture notes in artificial intelligence and lecture notes in bioinformatics) 
Abstract: A number of projects deploy Linux-based embedded systems to carry out large-scale active network measurement and network experiments. Due to resource constraints and the increase of network speed, obtaining sound measurement results from these low-end devices is very challenging. In this paper, we present a novel network primitive, OMware, to improve the packet send-time accuracy by enabling the measurement application to pre-dispatch the packet content and its schedule into the kernel. By this pre-dispatch approach, OMware can also reduce the overheads in timestamp retrievals and sleeping, and the interference from other application processes. Our evaluation shows that OMware can achieve a microsecond-level accuracy (rather than millisecond-level in a user-space tool) in the inter-departure time of packet trains, even under heavy cross traffic. OMware also offers an optimized call for sending back-to-back packet pairs, which can reduce the minimum inter-packet gap by 2 to 10 times. Furthermore, OMware can help reduce the error of replaying archived traffic from 40% to almost 19%.
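As background for the send-time accuracy problem the abstract describes, below is a minimal, hypothetical Python sketch (not OMware's actual API) of a purely user-space packet-train sender. Because each gap is produced by time.sleep(), the kernel scheduler's wakeup latency is added to every inter-departure time, which is why such loops are typically millisecond-accurate at best; a pre-dispatch primitive like OMware moves this scheduling into the kernel instead.

```python
import socket
import time

def send_train(dest, payload, n, gap_s):
    """Send a train of n UDP packets with a target inter-departure gap.

    Returns a timestamp taken right after each send, so the caller can
    compare the measured inter-departure times against the target gap.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    stamps = []
    try:
        for i in range(n):
            if i:
                time.sleep(gap_s)  # may oversleep; never undersleeps
            sock.sendto(payload, dest)
            stamps.append(time.monotonic())
    finally:
        sock.close()
    return stamps

if __name__ == "__main__":
    target = 0.001  # 1 ms target gap
    stamps = send_train(("127.0.0.1", 9999), b"x" * 64, 5, target)
    gaps = [b - a for a, b in zip(stamps, stamps[1:])]
    # Oversleep error: how far each measured gap exceeds the target.
    print([round((g - target) * 1e6) for g in gaps], "microseconds over target")
```

On a loaded system the printed oversleep errors can reach hundreds or thousands of microseconds, which is the class of error the paper's evaluation quantifies.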
Description: 16th International Conference on Passive and Active Measurement, PAM 2015, New York, 19-20 March 2015
ISSN: 0302-9743
EISSN: 1611-3349
DOI: 10.1007/978-3-319-15509-8_25
Appears in Collections: Conference Paper





