Site Optimized for Chrome and Firefox


There are two conflicting facts when it comes to delivering content for mobile applications.

  • Joining wireless and wired networks introduces greater latency than wired networks alone, which often makes content delivery slower.
  • Users expect wireless networks to behave just as quickly as wired ones.

To balance these two contradictory facts, developers need to devise a content management plan. Such a plan includes the following choice:

Should content be delivered only as the user requests it? Or should the user's expected behavior be anticipated and additional content be downloaded before the user requests it?

Prefetching is the process of retrieving and caching content before the user has requested it. Used intelligently, it can speed up the user's experience with your mobile app.

This Best Practice Deep Dive looks at how prefetching works, describes some issues involved in content management, and provides recommendations for how to use prefetching in an application.


When a user accesses a mobile application, the application can not only deliver the content they requested, but also "prefetch" content they will most likely want next. If the application, the hardware, and the network are all fast enough, it makes sense to download pages in advance that users are likely to want.

With prefetching, the application predicts what the user will most likely consume next. After it finishes loading the requested content, it begins fetching the predicted content in the background and stores it in its cache. By storing the prefetched content in the cache, the application ensures that the content will appear quickly when the user requests it.
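The flow above can be sketched as a small cache wrapper. Here `fetch` and `predict` are hypothetical callbacks standing in for your network layer and your prediction logic; in a real app the prefetch step would run on a background thread once the requested content has loaded.

```python
class PrefetchCache:
    """Minimal sketch of a prefetching cache: serve requested content,
    then fetch predicted next items so later requests hit the cache."""

    def __init__(self, fetch, predict):
        self._fetch = fetch      # fetch(key) -> content, e.g. an HTTP GET
        self._predict = predict  # predict(key) -> keys likely wanted next
        self._cache = {}

    def get(self, key):
        if key in self._cache:               # cache hit: no network wait
            return self._cache[key]
        self._cache[key] = self._fetch(key)  # cache miss: fetch on demand
        return self._cache[key]

    def prefetch_after(self, key):
        # In a real app this would run on a background thread after the
        # requested content has finished loading.
        for nxt in self._predict(key):
            if nxt not in self._cache:
                self._cache[nxt] = self._fetch(nxt)
```

Once `prefetch_after` has run, a later `get` for a predicted item returns straight from the cache with no network wait.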

The following questions need to be considered when designing a prefetching approach:

  • What are the goals for different types of content?
  • What will the underlying baseline system be for how prefetching is applied?
  • What workload will be used in testing?
  • What are the key performance metrics?

The Issue

Downloading files as they are needed can slow down the user experience. If a user scrolls through an application screen and has to wait for content to load, the application appears slow to them. It's better to deliver the content that the user will be likely to request next, by prefetching it.

Prefetching techniques can be very helpful in reducing your user's perceived wait time, but they need to be thought out strategically.

There are potential issues with prefetching:

  • Users can be hard to predict.
  • Users may download content that they never consume, which creates unnecessary overhead.
  • Content that is downloaded but never consumed counts against the user's monthly data usage, which can be a problem if they have a data cap.
  • Analytics may become less reliable if content that was never seen is registered as seen.

If you download too much content, you risk using too much memory on the device, draining the battery, consuming excess network resources, placing unneeded strain on your servers, and eating into your customers' monthly data caps for content that they may never see.
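One way to bound these costs is a per-session prefetch budget. The budget value and the `fetch` callback below are illustrative assumptions, not a prescribed API; the idea is simply to spend a limited budget on the best predictions first.

```python
PREFETCH_BUDGET_BYTES = 512 * 1024  # illustrative per-session cap

def prefetch_within_budget(candidates, fetch, cache, budget=PREFETCH_BUDGET_BYTES):
    """Prefetch candidates, most-likely-first, until a byte budget is
    spent, bounding memory, battery, network, and data-cap impact."""
    used = 0
    for key in candidates:
        if key in cache:
            continue
        data = fetch(key)  # e.g. an HTTP GET returning bytes
        cache[key] = data
        used += len(data)
        if used >= budget:  # budget exhausted: stop prefetching
            break
    return used
```

Ordering `candidates` most-likely-first matters: whatever the budget, it goes to the content the user is most likely to actually see.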

For example, you should think twice about downloading the video highlights for the baseball game when users only ask for the score.

Despite the issues involved with prefetching, the biggest question is not whether to use it, but what content to prefetch.

Best Practice Recommendation

The Best Practice Recommendation is to use prefetching. In other words, you should select content to download in anticipation of what your user will want to see next.

To use prefetching effectively, you need to evaluate the content your application uses in order to determine meaningful indicators of which content is appropriate for prefetching. A statistical understanding of your users' behavior can suggest the correct approach to take when evaluating your content. Choosing an appropriate metric is the key.

For example, if your user is downloading the score of a baseball game, it's safe to assume they may want the full box score and game summary. Prefetching this data reduces the number of connections, and since each connection takes two seconds to set up before the download can even start, your application will appear significantly faster.
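The savings can be estimated with the two-second connection setup from the example; the 0.3-second per-item transfer time below is an illustrative assumption.

```python
CONNECTION_SETUP_S = 2.0  # setup cost per connection, from the example above
TRANSFER_S = 0.3          # assumed per-item transfer time (illustrative)

items = ["score", "box_score", "game_summary"]

# On demand: each item pays its own connection setup when the user asks.
on_demand_wait = len(items) * (CONNECTION_SETUP_S + TRANSFER_S)

# Prefetched: one setup; the box score and game summary download in the
# background, so the user only ever waits for the first item.
prefetch_wait = CONNECTION_SETUP_S + TRANSFER_S

print(on_demand_wait, prefetch_wait)
```

Under these assumptions the user's total perceived wait drops from roughly 6.9 seconds to 2.3 seconds, with most of the savings coming from the avoided connection setups.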

Prefetching data can be overdone, and a balanced approach should be taken. For instance, prefetching should be stopped when the user clicks on a link, loads different content, or makes the app do any sort of networking activity.
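One way to honor that rule is a cancellation flag checked between downloads, so user-initiated activity takes priority immediately; the `fetch` callback and key names here are illustrative.

```python
import threading

def prefetch(keys, fetch, cache, cancelled):
    """Prefetch keys one at a time, checking a cancellation flag between
    downloads so user-initiated network activity takes priority."""
    for key in keys:
        if cancelled.is_set():  # user clicked a link or loaded new content
            break
        if key not in cache:
            cache[key] = fetch(key)

# A click handler would call cancelled.set() before issuing its own
# request, then clear the flag once the user-requested load completes.
cancelled = threading.Event()
```

Checking the flag between items rather than once up front means a long prefetch list still yields to the user within one download.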

It all comes down to understanding your content and understanding your users.

Prefetching should be done with content that has been specifically chosen based on the historical behavior of your app's users, which can be gathered through analytics and user testing. Setting up proper prefetching standards means thinking critically about the ways users will engage with your app.

Understand your users' behavior.

Understand your content.

Use common sense and think about the user interaction relative to the content.