Brett D. Higgins, Jason Flinn, T.J. Giuli, Brian Noble, Christopher Peplin, and David Watson
Prefetching is a double-edged sword. It can hide the latency of data transfers over poor and intermittently connected wireless networks, but the costs of prefetching in terms of increased energy and cellular data usage are potentially substantial, particularly for data prefetched incorrectly. Weighing the costs and benefits of prefetching is complex, and consequently most mobile applications employ simple but sub-optimal strategies.
Rather than leave the job to applications, we argue that the underlying mobile system should provide explicit prefetching support. Our prototype, IMP, presents a simple interface that hides the complexity of the prefetching decision. IMP uses a cost-benefit analysis to decide when to prefetch data. It employs goal-directed adaptation to try to minimize application response time while meeting budgets for battery lifetime and cellular data usage. IMP opportunistically uses available networks while ensuring that prefetches do not degrade network performance for foreground activity. It tracks hit rates for past prefetches and accounts for network-specific costs in order to dynamically adapt its prefetching strategy to both the network conditions and the accuracy of application prefetch disclosures. Experiments with email and news reader applications show that IMP provides predictable usage of budgeted resources, while lowering application response time compared to the oblivious strategies used by current applications.
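The abstract's cost-benefit decision can be made concrete with a small sketch. This is an illustration of the general idea, not IMP's actual algorithm: the function name, parameters, and the linear weighting of energy and data costs are all assumptions for exposition. The key move, as the abstract describes, is translating heterogeneous costs (joules, cellular bytes) into the same units as the benefit (seconds of response time saved), with weights that tighten as the battery and data budgets are consumed.

```python
# Illustrative sketch of a cost-benefit prefetch decision (hypothetical code,
# not from the IMP implementation). All names and the weighting scheme are
# assumptions made for exposition.

def should_prefetch(hit_rate, demand_fetch_seconds,
                    energy_joules, data_bytes,
                    energy_weight, data_weight):
    """Return True if the expected benefit of prefetching exceeds its cost.

    hit_rate             -- fraction of past prefetches the user actually read
    demand_fetch_seconds -- estimated time to fetch the item on demand instead
    energy_joules        -- estimated energy to prefetch on the current network
    data_bytes           -- cellular bytes the prefetch would consume (0 on WiFi)
    energy_weight        -- seconds of response time one joule is "worth";
                            grows as the battery budget tightens
    data_weight          -- seconds one byte is "worth"; grows as the cellular
                            data budget is consumed
    """
    # Expected benefit: time saved, discounted by the chance the user
    # never accesses the prefetched item.
    benefit = hit_rate * demand_fetch_seconds
    # Heterogeneous costs translated into the same units (seconds).
    cost = energy_weight * energy_joules + data_weight * data_bytes
    return benefit > cost
```

Under this sketch, on WiFi with ample battery the weights are near zero and almost everything qualifies for prefetching; on cellular with a nearly exhausted data budget, data_weight grows and only high-hit-rate, high-latency items are worth fetching ahead of time.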
This public review was written by Arun Venkataramani.
This paper is like a well-aged wine in a new bottle. The paper addresses prefetching, long known to be a double-edged sword that fundamentally trades off the benefit of latency reduction against the costs of increased resource usage incurred by mis-predictions. Over the decades, researchers have studied this trade-off and made the case for system support for prefetching in several different contexts all the way from processor memory hierarchies to the Internet. What this paper brings to the table is to extend some of these ideas to today's mobile environments, where the costs of prefetching are different and more complex than more traditional wired environments.
To appreciate why mobile prefetching is challenging, consider at least three different types of concerns that it must balance -- performance, energy usage, and wireless data usage -- no two of which can be measured by a common metric. Furthermore, estimating on-demand performance or energy usage is riddled with uncertainty because of oddities like "tail effects" in 3G networks and, unlike traditional environments, energy and bandwidth are budgeted resources on mobile phones. The paper's primary contribution is the design and implementation of IMP (for Informed Mobile Prefetching), which provides system support to simplify prefetching for mobile applications. A key insight is to allow applications to specify prefetching hints while the IMP decision algorithm under the covers balances the heterogeneous costs and benefits by translating them into comparable quantities. The paper describes an implementation and evaluation of IMP on Android phones showing nontrivial improvements over naive schemes like "prefetch-always" or "prefetch-with-static-limits" using realistic email and news reader workloads.
While the paper definitely hits on several important concerns specific to mobile prefetching, it leaves untouched plenty of fertile territory for future research in this space. The paper mainly confines itself to budgeted energy and data usage, but not other resources such as load on the server or network infrastructure, storage on the device, or other pay-as-you-use data plans. The problem of systematic non-interference with foreground transfers is a thorny one in practice, but the paper only pays lip service to the concern. The limited evaluation with email and news workloads leaves readers wondering how representative the benefits are under typical phone usage. Although the paper leverages cheaper WiFi connectivity when available for prefetching, it does not attempt to intelligently predict WiFi availability, dismissing that as too difficult. A critical reader may rightfully wonder whether the trade-offs addressed in the paper, the questions it leaves unanswered, or broader economic or social factors are what is holding back the widespread deployment of prefetching on mobile phones today. Read the paper to form your opinion!
Response from the authors: The problem of systematic non-interference with foreground transfers is indeed thorny. In this paper, we leverage our prior work answering that question, described in our MobiCom '10 paper, "Intentional Networking: Opportunistic Exploitation of Mobile Network Diversity". While we naturally like our own solution to this problem, IMP could potentially leverage other systems that can prioritize network traffic.
Predicting future bandwidth is challenging, but we don't view it as an
impossible task. As methods are developed for making good
predictions of future connectivity, we can plug those methods into
IMP. We believe there is a lot of interesting research to be done in
that area (and we hope to do some of it ourselves!).