Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/74909
Title: PHY assisted tree-based RFID identification
Authors: Hou, Y 
Zheng, Y 
Issue Date: 2017
Publisher: Institute of Electrical and Electronics Engineers Inc.
Source: Proceedings - IEEE INFOCOM, 2017, 8056984
Abstract: Tree-based RFID identification adopts a binary-tree structure to collect the IDs of an unknown tag set. Tag IDs reside at the leaf nodes; the reader queries through intermediate tree nodes and converges to these IDs using feedback from tag responses. Existing schemes perform poorly under random ID distributions because they ignore the distribution information hidden in the physical-layer signal of colliding tags. In contrast, we introduce PHY-Tree, a novel tree-based scheme that extracts two types of distribution information from every colliding signal it encounters. First, we detect whether all colliding tags send the same bit content at each bit index by examining inherent temporal features of the tag modulation schemes; if such resonant states are detected, either the left or the right branch of a subtree can be trimmed horizontally. Second, we estimate the number of colliding tags in a slot by computing a metric defined over the signal's constellation map, based on which nodes at the same levels of a subtree can be skipped vertically. Evaluations from both experiments and simulations demonstrate that PHY-Tree outperforms state-of-the-art schemes by at least 1.79×.
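To illustrate the baseline the abstract builds on, the following is a minimal sketch of binary query-tree identification (the reader probes ID prefixes and splits on collisions). It does not include the PHY-Tree trimming and skipping optimizations; the tag IDs and function name are illustrative assumptions, not from the paper.

```python
def query_tree_identify(tag_ids):
    """Collect the IDs of an unknown tag set via binary-tree queries.

    The reader probes an ID prefix; every tag whose ID starts with the
    prefix responds. A singleton response identifies a tag, a collision
    descends one tree level, and an empty slot prunes the branch.
    """
    identified = []
    stack = [""]  # start from the root (empty prefix)
    while stack:
        prefix = stack.pop()
        responders = [t for t in tag_ids if t.startswith(prefix)]
        if len(responders) == 1:       # singleton slot: ID read
            identified.append(responders[0])
        elif len(responders) > 1:      # collision: query both children
            stack.append(prefix + "0")
            stack.append(prefix + "1")
        # empty slot: branch pruned, nothing to do
    return sorted(identified)

# Hypothetical 4-bit tag population for demonstration
print(query_tree_identify(["0010", "0111", "1101"]))
```

PHY-Tree's contribution is to shortcut this traversal: resonant-state detection trims one child branch outright, and the collision-size estimate lets the reader jump multiple levels at once instead of descending one node at a time.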
Description: 2017 IEEE Conference on Computer Communications, INFOCOM 2017, Atlanta, GA, USA, 1-4 May 2017
URI: http://hdl.handle.net/10397/74909
ISBN: 9781509053360
ISSN: 0743-166X
DOI: 10.1109/INFOCOM.2017.8056984
Appears in Collections:Conference Paper
