Performance Comparison of Caching Strategies for Information-Centric IoT


Jakob Pfender, Alvin Valera, Winston Seah
School of Engineering and Computer Science, Victoria University of Wellington, New Zealand
September 22, 2018

Traditional Caching Strategies
- Caching Decision
- Cache Replacement

Caching Decision: Traditional Approaches

Cache Everything Everywhere (CEE)
- Feasible in traditional ICN thanks to large caches
- Fastest possible propagation of content through the network (rapid replication)
- High redundancy, suboptimal resource utilisation

Probabilistic Caching (Prob(p))
- Increases cache diversity across the network; more popular content is likelier to be stored
- Whether diversity is desirable depends on the application scenario:
  - If the request pattern is strongly skewed, diversity hurts performance by wasting resources on unpopular content
  - The more uniform the distribution, the more beneficial diversity becomes
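The two baseline decision policies above reduce to a one-line check. A minimal illustrative sketch (not the paper's implementation): CEE is simply Prob(p) with p = 1.

```python
import random

def should_cache(prob: float = 0.5) -> bool:
    """Prob(p) caching decision: cache an incoming chunk with fixed
    probability p. CEE is the special case p = 1.0 (always cache)."""
    return random.random() < prob
```

With p = 0.5 roughly half of all passing chunks get cached at each node, which is the Prob05 configuration evaluated later.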


Cache Replacement: Traditional Approaches

Least Recently Used (LRU)
- Unpopular or outdated content is more likely to be removed
- Generally effective, but risks thrashing

Least Frequently Used (LFU)
- Avoids thrashing, but performs poorly with variable access patterns and request spikes

Random Replacement (RR)
- Evicts a randomly chosen content chunk
- Simple, fast, no overhead
- Some argue cache replacement should be performed as fast as possible: simple and fast is more desirable than effective but complex
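As a rough sketch of the replacement policies above (illustrative Python, not the CCN-lite implementation): LRU can be built on an ordered map, while RR needs no bookkeeping at all.

```python
import random
from collections import OrderedDict

class LRUCache:
    """Least Recently Used: evict the entry untouched for longest."""
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.store = OrderedDict()

    def get(self, name):
        if name not in self.store:
            return None
        self.store.move_to_end(name)        # mark as recently used
        return self.store[name]

    def put(self, name, chunk):
        if name in self.store:
            self.store.move_to_end(name)
        elif len(self.store) >= self.capacity:
            self.store.popitem(last=False)  # evict least recently used
        self.store[name] = chunk

class RRCache:
    """Random Replacement: evict a uniformly random entry; O(1) state."""
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.store = {}

    def put(self, name, chunk):
        if name not in self.store and len(self.store) >= self.capacity:
            del self.store[random.choice(list(self.store))]
        self.store[name] = chunk
```

The contrast in per-eviction work (pointer updates vs. a single random pick) is exactly the "simple and fast" argument made on the slide.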

Traditional ICN caching vs. IoT

7 Traditional ICN caching vs. IoT Lessons from traditional ICN caching

7 Traditional ICN caching vs. IoT Lessons from traditional ICN caching CEE is inefficient (high redundancy, low diversity, poor utilisation of resources)

7 Traditional ICN caching vs. IoT Lessons from traditional ICN caching CEE is inefficient (high redundancy, low diversity, poor utilisation of resources) Caching less better performance

Traditional ICN caching vs. IoT Lessons from traditional ICN caching CEE is inefficient (high redundancy, low diversity, poor utilisation of resources) Caching less better performance Cache diversity generally desirable 7

Traditional ICN caching vs. IoT Lessons from traditional ICN caching CEE is inefficient (high redundancy, low diversity, poor utilisation of resources) Caching less better performance Cache diversity generally desirable Cache replacement policies should be as fast & simple as possible 7

Traditional ICN caching vs. IoT Lessons from traditional ICN caching CEE is inefficient (high redundancy, low diversity, poor utilisation of resources) Caching less better performance Key differences in IoT Cache diversity generally desirable Cache replacement policies should be as fast & simple as possible 7

Traditional ICN caching vs. IoT Lessons from traditional ICN caching CEE is inefficient (high redundancy, low diversity, poor utilisation of resources) Caching less better performance Key differences in IoT Limited memory and processing power Cache diversity generally desirable Cache replacement policies should be as fast & simple as possible 7

Traditional ICN caching vs. IoT Lessons from traditional ICN caching CEE is inefficient (high redundancy, low diversity, poor utilisation of resources) Caching less better performance Key differences in IoT Limited memory and processing power Available cache space extremely valuable Cache diversity generally desirable Cache replacement policies should be as fast & simple as possible 7

Traditional ICN caching vs. IoT Lessons from traditional ICN caching CEE is inefficient (high redundancy, low diversity, poor utilisation of resources) Caching less better performance Cache diversity generally desirable Key differences in IoT Limited memory and processing power Available cache space extremely valuable Unreliable links Cache replacement policies should be as fast & simple as possible 7

Traditional ICN caching vs. IoT Lessons from traditional ICN caching CEE is inefficient (high redundancy, low diversity, poor utilisation of resources) Caching less better performance Cache diversity generally desirable Cache replacement policies should be as fast & simple as possible Key differences in IoT Limited memory and processing power Available cache space extremely valuable Unreliable links Small, transient data 7

Traditional ICN caching vs. IoT Lessons from traditional ICN caching CEE is inefficient (high redundancy, low diversity, poor utilisation of resources) Caching less better performance Cache diversity generally desirable Cache replacement policies should be as fast & simple as possible Key differences in IoT Limited memory and processing power Available cache space extremely valuable Unreliable links Small, transient data Request distributions tend to be uniform 7

Traditional ICN caching vs. IoT Lessons from traditional ICN caching CEE is inefficient (high redundancy, low diversity, poor utilisation of resources) Caching less better performance Cache diversity generally desirable Cache replacement policies should be as fast & simple as possible Key differences in IoT Limited memory and processing power Available cache space extremely valuable Unreliable links Small, transient data Request distributions tend to be uniform Can we apply the lessons from traditional ICN caching to the IoT? 7

Advanced Caching Strategies (for the IoT)

Dynamic Caching Probability
- Dynamically compute a caching probability for each node and/or each content chunk, based on available information
- Caching behaviour adapts to the state of the network
- Example: pCASTING (Hail et al. 2015)
  - Considers content age, node battery level, and cache occupancy
  - Values are normalised and weighted by relative importance
  - Fully distributed, no communication overhead; uses purely local information

Max Diversity Most Recent (MDMR) (Hahm et al. 2017)
- Cache replacement strategy developed specifically for the information-centric IoT
- Aim: maximise diversity while maintaining freshness
- On insertion, first attempt to replace older content from the same producer as the new chunk; otherwise the oldest chunk from a producer with more than one chunk in the cache; otherwise the oldest chunk overall

Performance Metrics

- Server load / cache hit ratio: indicates how well popular content is distributed across the network
- Data retrieval delay: also affected by network congestion, density, etc.
- Interest retransmission ratio: on-path caching may reduce the hop count for retransmissions
- Total cache evictions: how well does the strategy adapt to content popularity and propagation?
- Diversity Metric (DM): D = C_disj / S, where S is the number of content producers and C_disj is the number of disjoint name prefixes across all caches
- Cache Retention Ratio (CRR): ratio of distinct objects in caches to all generated objects, C = D_q / D_p
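The two formula-based metrics can be computed directly from their definitions. A small sketch, under the simplifying assumption that a name prefix is identified with its producer's ID:

```python
def diversity_metric(caches, num_producers):
    """DM: D = C_disj / S.

    caches:        iterable of per-node caches, each a set of name prefixes
    num_producers: S, the number of content producers
    C_disj is the number of disjoint (distinct) prefixes across all caches.
    """
    c_disj = set().union(*caches) if caches else set()
    return len(c_disj) / num_producers

def cache_retention_ratio(cached_objects, generated_objects):
    """CRR: distinct objects still cached somewhere in the network,
    divided by all objects ever generated."""
    return len(set(cached_objects)) / len(set(generated_objects))
```

For example, two caches holding prefixes {a, b} and {b, c} in a network of four producers give D = 3/4, since the overlap on b contributes only once.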

Evaluation

Experiment Setup
- All experiments conducted on the FIT IoT-LAB open testbed using M3 nodes: STM32 (ARM Cortex-M3), 512 kB ROM, 64 kB RAM, Atmel AT86RF231 2.4 GHz transceiver (IEEE 802.15.4)
- Simple RIOT application using CCN-lite as the ICN implementation, modified to support the different caching strategies
- 60 M3 nodes distributed evenly in a single building; multihop setup with an average path length of 2-3 hops
- Prefix announcements are recorded with their hop count and rebroadcast with an increased hop count; Interests are forwarded according to the lowest hop count, with broadcast as a fallback
- Nodes produce random content chunks prefixed with their ID every 1-5 seconds
- Nodes request random existing content every 0.5-1.5 seconds, using a uniform or Zipfian pattern
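The uniform vs. Zipfian request patterns in the setup above can be sketched as follows; the skew exponent s is an assumption for illustration, as the exact parameters are not stated here.

```python
import random

def zipf_weights(n: int, s: float = 1.0):
    """Unnormalised Zipf weights: the k-th most popular of n items
    gets weight 1/k^s, so a few items dominate the requests."""
    return [1.0 / (k ** s) for k in range(1, n + 1)]

def pick_request(content_names, pattern: str = "uniform", s: float = 1.0):
    """Choose the next content name to request.

    content_names is assumed ordered by popularity (most popular first)
    when the Zipfian pattern is used.
    """
    if pattern == "uniform":
        return random.choice(content_names)
    weights = zipf_weights(len(content_names), s)
    return random.choices(content_names, weights=weights, k=1)[0]
```

Under the uniform pattern every chunk is equally likely to be requested, which is the regime the later slides argue is typical for IoT workloads.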

Evaluation: Caching decision policies

Server load
- Probabilistic caching results in a lower probability that given content can be found in a given cache
- pCASTING seems to cache with a higher average probability than p = 0.5
- CEE is better for a skewed request pattern because it increases replication of popular content

[Figure: server load (%) for CEE, pCASTING, and Prob(0.5) under Zipf and uniform request patterns]

Interest retransmission ratio
- No statistically significant effect
- The influence of network factors (topology, congestion) is stronger than that of the caching policy
- Previous work (Hail et al.) found more significant differences using simulation

[Figure: average Interest retransmissions (%) for CEE, pCASTING, and Prob(0.5) under Zipf and uniform request patterns]

Cache evictions
- Correlated with the rate at which caches are filled
- A skewed distribution reduced thrashing
- Total number of transmissions during an experiment run: 40,000-75,000
- Uniform requests: 5%-12% evictions; Zipfian requests: 1%-4%
- Strong dependence on cache size

[Figure: number of cache evictions for CEE, pCASTING, and Prob(0.5) under Zipf and uniform request patterns]

Data retrieval delay
- Large initial latency because content takes time to propagate
- Cache hit probability is lower for probabilistic policies: a higher peak and slower decline as caches take longer to fill
- All strategies reach minimal delay very quickly

[Figure: data retrieval delay (s) over time for CEE, Prob(0.5), and pCASTING, plus averages under Zipf and uniform request patterns]

Diversity metric
- The majority of content producers are represented in the network at any given time
- CEE caches all content from the start, so caches fill more quickly, but since everything is cached, cache contents are highly redundant
- Probabilistic methods have a higher probability of representing all content producers

[Figure: diversity metric (%) over time for CEE, Prob(0.5), and pCASTING, plus averages under Zipf and uniform request patterns]

Cache Retention Ratio
- Content inevitably fades from the network after a while
- The content creation rate is constant, but probabilistic approaches take longer to fill caches; more diverse caches delay the decline
- Zipfian skew means less popular content fades much more quickly, so caches become more homogeneous over time
- A uniform distribution ensures similar lifespans for all content

[Figure: cache retention ratio (%) over time for CEE, Prob(0.5), and pCASTING, plus averages under Zipf and uniform request patterns]

Evaluation: Cache replacement policies

Server load
- The cache replacement strategy has a negligible effect on server load
- RR performs on par with much more complex strategies

[Figure: server load (%) for LRU, RR, and MDMR under Zipf and uniform request patterns]

Interest retransmission ratio
- No statistically significant effect
- Thus: any observed variations in retransmission rates have causes independent of the caching strategy

[Figure: average Interest retransmissions (%) for LRU, RR, and MDMR under Zipf and uniform request patterns]

Cache evictions
- Significant effect of the request distribution
- MDMR is explicitly designed to maximise diversity for uniform patterns, but cannot outperform LRU given a skewed distribution
- A Zipfian distribution already favours popular content, reducing the impact of cache-shaping strategies
- Surprising: RR outperforms the other approaches given a Zipfian distribution; possibly RR counteracts thrashing at the tail end of the popularity distribution

[Figure: number of cache evictions for LRU, RR, and MDMR under Zipf and uniform request patterns]
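The three-step MDMR eviction rule described earlier (Hahm et al. 2017) can be sketched as a victim-selection function; this is illustrative, with (producer, timestamp) pairs standing in for content chunks.

```python
from collections import Counter

def mdmr_victim(cache, new_producer):
    """Pick an eviction victim per the MDMR rule, given a full cache.

    cache:        list of (producer, timestamp) entries
    new_producer: producer of the incoming chunk
    Returns the index of the entry to evict:
      1. oldest chunk from the same producer as the new chunk, if any;
      2. else oldest chunk from a producer with >1 chunk in the cache;
      3. else the oldest chunk overall.
    """
    same = [i for i, (p, _) in enumerate(cache) if p == new_producer]
    if same:
        return min(same, key=lambda i: cache[i][1])
    counts = Counter(p for p, _ in cache)
    multi = [i for i, (p, _) in enumerate(cache) if counts[p] > 1]
    if multi:
        return min(multi, key=lambda i: cache[i][1])
    return min(range(len(cache)), key=lambda i: cache[i][1])
```

Steps 1 and 2 preserve one chunk per producer for as long as possible (diversity), while always evicting the oldest candidate (freshness).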

Data retrieval delay
- MDMR prioritises content freshness, but freshness only becomes a significant factor after some time has elapsed: MDMR requires a minimum time to become effective
- Early spikes matter mostly in terms of strain on individual nodes, especially if devices are battery-powered

[Figure: data retrieval delay (s) over time for LRU, RR, and MDMR, plus averages under Zipf and uniform request patterns]

Diversity Metric
- Why no impact?
- Note: MDMR is intended for a scenario with periodic sleeping, which is not the case here

[Figure: diversity metric (%) over time for LRU, RR, and MDMR, plus averages under Zipf and uniform request patterns]

Cache Retention Ratio
- Content lifetime is influenced solely by the caching decision, not by replacement
- Faster fading under a skewed distribution is unsurprising

[Figure: cache retention ratio (%) over time for LRU, RR, and MDMR, plus averages under Zipf and uniform request patterns]

Revisiting lessons from traditional ICN

- Caching less yields better performance? Yes, at least in terms of diversity
- Cache diversity is generally desirable? Yes, if request patterns are uniform
- CEE is inefficient? Depends on the scenario
- Stateless cache replacement policies are sufficient

Further conclusions

The performance of simple stateless strategies is encouraging: it means effective caching can be achieved even on resource-constrained IoT devices.

Identifying the ideal caching decision strategy depends on the application and the available resources:
- Is the data homogeneous and in need of the fastest possible distribution, or is cache diversity more important than response time?
- How much strain can we afford to place on individual nodes?

The results presented here offer no universal solution.

Future work
- One experimental setup cannot reflect the diversity of IoT applications
- Many more caching decision strategies have been proposed, taking into account topological factors, content popularity, and other aspects
- A more comprehensive survey/evaluation is desirable

Thank you!