Title: Towards self-tuning parameter servers
Authors: Liu, Chun Yin
Advisors: Lo, Eric (COMP)
Yiu, Ken (COMP)
Keywords: Machine learning
Issue Date: 2018
Publisher: The Hong Kong Polytechnic University
Abstract: Machine Learning (ML) has driven advances in many applications in recent years. Nowadays, it is common to see industrial-strength machine learning jobs that involve billions of model parameters, petabytes of training data, and weeks of training. Good efficiency, i.e., fast completion time of a specific ML job, is therefore a key feature of a successful ML system. While the completion time of a long-running ML job is determined by the time required to reach model convergence, in practice it is largely influenced by the values of various system settings. In this thesis, we present techniques towards building self-tuning parameter servers. The Parameter Server (PS) is a de facto system architecture for large-scale machine learning; by self-tuning we mean that while a long-running ML job is iteratively training the expert-suggested model, the system is also iteratively learning which settings are more efficient for that job and applying them online. We have implemented our three techniques, namely (1) an online ML job optimization framework, (2) online ML job progress estimation, and (3) online ML system reconfiguration, on top of TensorFlow. Experiments show that our techniques can reduce the completion times of long-running TensorFlow jobs by factors of 1.7X to 5.1X.
Description: xiv, 62 pages : color illustrations
PolyU Library Call No.: [THS] LG51 .H577M COMP 2018 Liu
Rights: All rights reserved.
Appears in Collections:Thesis
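The abstract describes a system that, while a job trains, also learns online which system setting is more efficient and applies it. The thesis's actual techniques run inside TensorFlow's parameter server; as a rough illustration only, the loop below sketches that idea with a simple epsilon-greedy tuner over hypothetical settings. All names, settings, and timings here are invented for the sketch and are not taken from the thesis.

```python
import random

def train_step(setting):
    """Stand-in for one training iteration under a given system setting.
    Returns a simulated iteration time in seconds (lower is better).
    The base times are made up for this illustration."""
    base = {"sync": 1.0, "async": 0.7, "stale-sync": 0.8}[setting]
    return base + random.uniform(-0.05, 0.05)

def self_tune(settings, steps=300, epsilon=0.1):
    """Epsilon-greedy online tuning: while the job keeps training, maintain a
    running average of iteration time per setting, occasionally explore a
    random setting, and otherwise run with the fastest one seen so far."""
    stats = {s: [0, 0.0] for s in settings}  # setting -> [count, mean time]
    current = settings[0]
    for _ in range(steps):
        if random.random() < epsilon:
            current = random.choice(settings)  # explore a random setting
        t = train_step(current)                # one training iteration
        n, mean = stats[current]
        stats[current] = [n + 1, mean + (t - mean) / (n + 1)]
        # exploit: switch to the setting with the lowest observed mean time
        current = min(
            settings,
            key=lambda s: stats[s][1] if stats[s][0] else float("inf"),
        )
    return current

random.seed(0)  # fixed seed so the toy run is repeatable
best = self_tune(["sync", "async", "stale-sync"])
```

Under these toy timings the tuner settles on the cheapest setting while the (simulated) training never stops, which is the gist of tuning "online": measurement and adjustment are folded into the job's own iterations rather than done in separate offline trial runs.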

Files in This Item:
File                          | Description                   | Size    | Format
991022164554103411_link.htm   | For PolyU Users               | 167 B   | HTML
991022164554103411_pira.pdf   | For All Users (Non-printable) | 1.11 MB | Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.