IEEE PerCom PerIoT 2017

Large-scale Offloading in the Internet of Things


Large-scale deployments of IoT devices are subject
to energy and performance issues. Fortunately, offloading is a
promising technique to mitigate these issues. However, several
problems remain open regarding cloud deployment and
provisioning. In this paper, we address the problem of provisioning
offloading as a service in large-scale IoT deployments.
We design and develop an AutoScaler, an essential component
of our offloading architecture for handling offloading workload. In
addition, we develop an offloading simulator to generate
dynamic offloading workload from multiple devices. With this
toolkit, we study the effect of task acceleration on different
cloud servers and analyze the capacity of several cloud servers
to handle multiple concurrent requests. We conduct multiple
experiments in a real testbed to evaluate the system and present
our experiences and lessons learned. From the results, we find
that the AutoScaler component introduces a very small overhead
of around 150 milliseconds in the total response time of a request, which
is a fair price to pay to empower offloading architectures
with multi-tenancy and dynamic horizontal scaling for
IoT scenarios.


BibTeX:

@inproceedings{flores2017offloading,
  author = {Flores, Huber and Su, Xiang and Kostakos, Vassilis and Ding, Aaron Yi and Nurmi, Petteri and Tarkoma, Sasu and Hui, Pan and Li, Yong},
  title = {Large-scale Offloading in the Internet of Things},
  booktitle = {Proceedings of 2017 IEEE International Conference on Pervasive Computing and Communications Workshops (PerCom Workshops)},
  series = {PerCom PerIoT '17},
  year = {2017},
  location = {Kona, Hawaii, USA},
  publisher = {IEEE},
}

How to cite:

H. Flores, X. Su, V. Kostakos, A. Y. Ding, P. Nurmi, S. Tarkoma, P. Hui, Y. Li, "Large-scale Offloading in the Internet of Things", In Proceedings of IEEE PerCom Workshop on Mobile and Pervasive Internet of Things (PerCom PerIoT '17).