TY - GEN
T1 - Reservoir: Named Data for Pervasive Computation Reuse at the Edge
T2 - 20th IEEE International Conference on Pervasive Computing and Communications, PerCom 2022
AU - Al Azad, Md Washik
AU - Mastorakis, Spyridon
N1 - Publisher Copyright:
© 2022 IEEE.
PY - 2022
Y1 - 2022
AB - In edge computing use cases (e.g., smart cities), where several users and devices may be in close proximity to each other, computational tasks with similar input data for the same services (e.g., image or video annotation) may be offloaded to the edge. The execution of such tasks often yields the same results (output) and thus results in duplicate (redundant) computation. Based on this observation, prior work has advocated for 'computation reuse', a paradigm where the results of previously executed tasks are stored at the edge and reused to satisfy incoming tasks with similar input data, instead of executing these incoming tasks from scratch. However, realizing computation reuse in practical edge computing deployments, where services may be offered by multiple (distributed) edge nodes (servers) for scalability and fault tolerance, remains largely unexplored. To tackle this challenge, in this paper, we present Reservoir, a framework that enables pervasive computation reuse at the edge while imposing marginal overheads on user devices and on the operation of the edge network infrastructure. Reservoir takes advantage of Locality Sensitive Hashing (LSH) and runs on top of Named-Data Networking (NDN), extending the NDN architecture to realize computation reuse semantics in the network. Our evaluation demonstrates that Reservoir can reuse computation with up to almost perfect accuracy, achieving 4.25-21.34× lower task completion times compared to cases without computation reuse.
KW - Computation Reuse
KW - Edge Computing
KW - Locality Sensitive Hashing
KW - Named-Data Networking
UR - http://www.scopus.com/inward/record.url?scp=85129977167&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85129977167&partnerID=8YFLogxK
U2 - 10.1109/PerCom53586.2022.9762397
DO - 10.1109/PerCom53586.2022.9762397
M3 - Conference contribution
AN - SCOPUS:85129977167
T3 - 2022 IEEE International Conference on Pervasive Computing and Communications, PerCom 2022
SP - 141
EP - 151
BT - 2022 IEEE International Conference on Pervasive Computing and Communications, PerCom 2022
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 21 March 2022 through 25 March 2022
ER -