Policy MGMT Paper 2 - Bearer-Aware Policy and Charging
Summary: Mobile broadband data traffic is growing extremely quickly, and while
capacity is being added, operators need to face the realities of real-world constraints
and manage policies more effectively. They need to balance the need for profitability
against consumer and regulatory expectations for fair and transparent network
management.
This will increasingly mean that any interventions in data transmission will need to be
“bearer-aware” – i.e., making decisions based on what is actually going on in the
radio network mapped onto user activity, rather than using blanket or arbitrary
policies enforced further back in the network. The addition of new bearer types such
as femtocells, LTE or WiFi offload will add further complexity. This will also need
to transfer through to the operator’s billing and charging systems, so that pricing for
data use is more closely aligned with the actual costs of providing services.
Background
This white paper covers the potential for bearer-aware policy management and
charging in mobile data networks, as well as the role of Deep Packet Inspection (DPI)
infrastructure in enabling this. It has been written by the independent industry analyst
and consulting firm Disruptive Analysis, and sponsored by Continuous Computing, as
part of an initiative to promote thought-leadership, differentiation and innovative
networking concepts for the mobile broadband and network policy-management
marketplace. The opinions expressed are Disruptive Analysis’ own, and are not
specific endorsements of any vendor’s or operator’s products or strategy.
Introduction
A “bearer” in a mobile communications context is a particular way of connecting to
the network – 2G, 3G, Long Term Evolution (LTE), WiFi and other technologies are
all bearers. This document considers when and how it might be useful for the control
and charging parts of the operator to have more awareness of which bearer is being
used, and how it is operating.
In the past, the telecoms industry “establishment” has often suggested that
applications and services should be bearer-agnostic, with the core and service layers
of the network behaving identically, however the user has connected. The idea is that
services can be abstracted from the means of connection.
The reality is turning out to be somewhat different. Different bearers have different
properties, and this reflects on what services can be provided, how well they work,
and which is the “best” for any particular circumstance. This can be considered from
both users’ and operators’ perspectives:
- From the operator’s point of view, different bearers have different levels of performance and reliability, coverage patterns, costs of service delivery, likelihood of congestion, and different degrees of control and flexibility.
- From the user’s point of view, different bearer technologies have implications for service speed and latency, connection set-up time, ease of use, power consumption and, of course, price.
These variables present a challenge for the policy management and charging
infrastructure being installed by many operators, in order to control exploding mobile
broadband traffic. The industry is still at a very early stage of understanding how best to structure traffic management and pricing for data traffic, so as to optimise simultaneously for network operations, user experience and operator economics.
Potentially, improving bearer intelligence within the policy and charging domain could
improve the smoothness of network operations, enhance the user’s perceived Quality
of Experience (QoE), and ultimately help support new business models and revenue
streams for the operator.
Historically, most mobile devices had limited options for connectivity – perhaps 2G
and 3G, with usually only a single radio frequency band available for the 3G radio in
a given location. Added to the fact that mobile data was only rarely used compared
with voice, there was little need for complex policy management involvement in
selecting / discriminating between bearers. Typically, the phone would use 3G where
possible, falling back to 2G under the control of the SIM card and radio resource
management function of the network.
Move forward a few years, and the picture has changed considerably. We now have
a broad range of data-centric (or data-only) devices, exploding traffic volumes, and a
proliferation of connectivity options. Many service providers have started working with
policy management platforms, DPI tools and assorted other approaches to controlling
data traffic for fairness and network integrity.
Most smartphones and all laptops have WiFi radios, but these are often not under
direct control of the operator, instead subject to a layer of device software called the
“connection manager”. 3G radios are commonplace, but data networks are
congested and sometimes have poor propagation, especially indoors. LTE networks
are on the verge of rollout – but are likely to be spread across a wide range of
frequency bands with different characteristics, as well as both TDD (Time Division
Duplexing) and FDD (Frequency Division Duplexing) variants.
It is quite possible that a future smartphone might theoretically have fast access to
the Internet via three, five or even more different connections, of which two might be
useable simultaneously. It is possible that the best or default connection from the radio network’s point of view may not always be optimal from the perspective of the core network, the end-user, or a given application or service.
Added to this are industry structural changes – network sharing, outsourcing and new
wholesale / roaming arrangements may mean added choices for connection. Various
forms of network offload have emerged to relieve pressure on the Radio Access
Network (RAN) and backhaul – in some cases involving third-party operators.
There is also growing pressure for operators to actively monitor the status of their
radio networks – especially in instances where local Net Neutrality law permits
intervention in the traffic stream.
One can imagine quite complex scenarios – imagine a user with a smartphone on the
fringes of a 2.6GHz LTE cell, but within range of access to a local 2.1GHz HSPA
(High Speed Packet Access) femtocell, plus a 900MHz HSPA+ (Evolved HSPA)
network run on a wholesale open-access basis. They also have potential access to
WiFi provided by a third-party operator on a managed-service basis. Working out
how that maps onto the customer’s service plan and data allocation, the operator-
hosted and third-party applications being used, the relative cost of each to the
operator, the level of mobility expected, and the real-time likelihood of congestion is
an almost intractable problem.
At the moment, probably the most common use case for mobile policy management
is the enforcement of tiers and caps. Many operators now offer their users a fixed
allocation of traffic per month and policy infrastructure is used to ensure that these
are tracked and complied with. If the caps are breached, there are options to cut off
service, downgrade to slower speeds, or levy additional charges.
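The cap-enforcement choices just described – cut off service, downgrade to slower speeds, or levy additional charges – might be sketched in simplified form as below. This is a minimal illustration, not any vendor's actual policy logic: the function name, thresholds and rates are assumptions for the example only.

```python
# Hypothetical sketch of tier/cap enforcement. The names, thresholds and
# per-MB rate below are illustrative assumptions, not a real PCRF design.

def enforce_cap(used_mb: float, cap_mb: float, policy: str = "throttle") -> dict:
    """Check a subscriber's monthly usage against their allocation and
    return one of the three enforcement actions described in the text."""
    if used_mb <= cap_mb:
        return {"action": "allow", "remaining_mb": cap_mb - used_mb}
    overage_mb = used_mb - cap_mb
    if policy == "cutoff":          # suspend data service entirely
        return {"action": "block", "overage_mb": overage_mb}
    if policy == "throttle":        # downgrade to a slower speed tier
        return {"action": "limit_rate", "max_kbps": 128, "overage_mb": overage_mb}
    # otherwise: levy additional per-MB charges (assumed rate)
    return {"action": "charge", "overage_mb": overage_mb, "rate_per_mb": 0.05}

# Example: a user 500 MB over a 2 GB cap on a throttling plan
decision = enforce_cap(used_mb=2548, cap_mb=2048)
```

In practice the decision would of course be driven by the subscriber profile held in the charging system, not hard-coded as here.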
But at the same time, operators are also encouraging users to purchase femtocells,
partly to enhance coverage, but also to offload the macro network and backhaul,
putting the burden on the user’s own broadband instead.
There is, however, understandable pushback from end users about the notion of
paying three times over for the same connectivity – the mobile data plan, the
broadband subscription, and the femtocell purchase or rental costs – especially if the
femto-derived traffic is then deducted from the same cap as data consumed over the
macro network. It would be naïve to assume that users (or journalists) are unaware
that operators need to offload their networks to reduce congestion, and achieve lower
costs – and that clearly, femtocells help achieve this.
Ideally, femtocell traffic would be zero-rated and not count against the quota –
especially if WiFi offload is available for free. Early signs are that operator practices
differ here, with some such as AT&T and Vodafone charging femto data consumption
against quotas, while Softbank goes to the other extreme and provides not just free
femtocells, but free ADSL links to connect them.
From a DPI point of view, it should be very easy to detect which traffic came via a
femtocell, as it will have gone through the operator’s femto gateway with an
identifiable IP address and specific path through the network. This should make it easier to treat such traffic differently in terms of rating and billing, assuming that the charging system itself is sufficiently flexible to deal with the extra detail.
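The femto-detection idea described above – identifying traffic by the gateway it traversed, then rating it differently – might be sketched as follows. The gateway subnet and the zero-rating rule are assumptions purely for illustration; real deployments would key off operator-specific addressing and charging rules.

```python
import ipaddress

# Illustrative sketch: classify traffic as femto- or macro-originated by the
# gateway subnet it traversed, then rate it differently. The subnet below and
# the blanket zero-rating of femto traffic are assumptions for this example.

FEMTO_GATEWAY_NET = ipaddress.ip_network("10.20.0.0/16")  # assumed femto-GW range

def rate_session(src_gateway_ip: str, volume_mb: float) -> dict:
    """Zero-rate traffic that entered via the femto gateway; count
    macro-network traffic against the subscriber's quota."""
    via_femto = ipaddress.ip_address(src_gateway_ip) in FEMTO_GATEWAY_NET
    return {
        "bearer": "femto" if via_femto else "macro",
        "billable_mb": 0.0 if via_femto else volume_mb,  # femto zero-rated here
    }
```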
There is a huge amount of regulatory and government focus on mobile (and fixed)
broadband at the moment, especially around the issues of Net Neutrality and traffic
management. The FCC, the EU, Ofcom and various other national regulators from
countries as diverse as Canada and Chile are considering when it is appropriate for
operators to actively manage data and/or application flows, and by which
mechanisms.
One likely outcome is that some form of transparency on traffic management will be
mandated by regulators – there will need to be clear publishing of relevant
information so that users can make better decisions about choosing a service
provider. In some cases governments may decide that actions such as throttling or
traffic-shaping are only permissible when the network is actually congested, while in
other instances more relaxed and flexible regimes may be pursued. It is possible that
different network / bearer technologies, or even specific frequency bands, will be
subject to different rules. Some regulators may demand collection of data on actual
achieved data rates, rather than theoretical peaks. Some may even set minimum or
average speed requirements.
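Two of the possible regulatory requirements above – permitting shaping only under actual congestion, and collecting achieved rather than theoretical data rates – might be gated in the policy layer roughly as follows. The threshold value and function names are hypothetical, chosen only to illustrate the shape of the logic.

```python
# Sketch of a regulator-friendly gate: traffic-shaping is permitted only when
# the cell is measurably congested, and real achieved data rates are recorded
# for transparency reporting. The 85% threshold is an assumed value.

CONGESTION_THRESHOLD = 0.85  # assumed fraction of cell capacity in use

def may_shape(cell_load: float) -> bool:
    """Permit throttling or traffic-shaping only under measured congestion."""
    return cell_load >= CONGESTION_THRESHOLD

def record_achieved_rate(log: list, cell_id: str, kbps: float) -> None:
    """Collect actual achieved rates, rather than theoretical peaks."""
    log.append({"cell": cell_id, "achieved_kbps": kbps})
```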
Some interventions made on behalf of the user in the network, such as content compression or transcoding, may however fall foul of strict interpretations of copyright laws or various rights agreements.
In short, it seems probable that the laws around mobile data will become more
complex – and there may well be internal codes-of-conduct to support as well.
The ability of the policy infrastructure and DPI to both collect appropriate information
and act upon it will be highly important. While there are many dimensions to this
(subscriber, application, device, location, etc.), it seems highly likely that bearer-
awareness will play a central part as well.
A cellular radio network generally makes internal decisions about the appropriate
bearer and frequency band for connection of a given device, at a given place / time.
However, the criteria for choice are relatively hard-coded and context-insensitive,
basically revolving around factors such as signal strength, measurements of “channel
quality”, preferred networks on the SIM, and the capabilities of the device.
However, not all other things will always be equal – and the best total “quality of
experience” (QoE) or other factors such as the operator’s cost-to-serve, may be
achievable only through forcing connection shifts based on context information. For
example, if a user’s behaviour and past data usage suggests a tendency towards
frequent, low-bandwidth consumption of capacity, there may be a different set of
criteria to apply compared to those that need maximum speed on rarer occasions.
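The idea of applying different selection criteria to different usage profiles might be sketched as a simple scoring function. Every figure in the table below – per-bearer peak rates, set-up times and costs – is an illustrative assumption, as is the weighting; the point is only the shape of a context-sensitive choice.

```python
# Hypothetical bearer-selection sketch: score each candidate bearer against
# the user's observed usage profile. All figures and weights are illustrative
# assumptions, not measured values.

BEARERS = {
    #          peak_kbps  setup_s  cost_per_mb
    "2G":     (200,       2.0,     0.002),
    "HSPA":   (7200,      1.0,     0.004),
    "LTE":    (50000,     0.5,     0.003),
    "WiFi":   (20000,     3.0,     0.001),
}

def pick_bearer(profile: str) -> str:
    """'chatty' users (frequent, low-bandwidth) favour low cost and fast
    set-up; 'bulk' users favour raw throughput."""
    def score(name):
        peak, setup, cost = BEARERS[name]
        if profile == "chatty":
            return -setup - 3000 * cost   # minimise set-up time and cost
        return peak                       # bulk: maximise throughput
    return max(BEARERS, key=score)
```

With these assumed numbers, a chatty profile lands on WiFi while a bulk profile lands on LTE – different answers for the same set of available bearers, which is the essence of the argument above.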
Figure: QoE is about much more than merely internal network QoS
A user downloading 5GB of video in an isolated countryside cell at 3am may cause
fewer problems (and cost less to serve) than another watching 50MB of video on the
bus to work through a city centre at 9am. A laptop connecting for 30 minutes to
download a major operating system update might generate less signalling load than
a smartphone checking for new emails 40 times an hour – and downloading nothing.
Congestion, poor performance and customer complaints might actually be a result of
a mis-configured network element. Certain types of device might generate more
problems than others, for the same notional amount of usage.
Ideally, operators will move to a much more holistic and intelligent approach to traffic
management, blending real-world observables of network usage patterns with clever
pricing and charging structures, and direct intervention in the traffic stream where
needed. The solution here partly involves investing in better tools and probes to
watch the network performance, traffic and user experience more thoroughly – and
aggregate and report the results with adequate data-mining and analysis capabilities.
Bearer-awareness is a critical part of this process.
This approach also has the potential to help the operator understand their customers
better – what do they want to do, how do they go about doing it – and what problems
do they encounter. A good analysis may enable the operator to predict and smooth
traffic demands, improve customer satisfaction and network efficiency, and form a basis for new partnerships and revenue streams.
It is worth noting that some of the larger network equipment providers are attempting
to combine multiple “touch-points” of policy management for more end-to-end
propositions – potentially spanning from device, through radio network and core, right
to the application-hosting infrastructure and data centres. Many are looking at the
introduction of all-IP networks for LTE and EPC (Evolved Packet Core) as a lever for
this.
Bearer-specific applications
Certain applications may only work over certain bearers – or might be optimised for
them. For example, as operators look to deploy LTE networks, there are likely to be a
whole range of issues around voice and how it is supported. For a long voice call, it
may be better to force the connection to HSPA, rather than risk glitches in running
VoIP over LTE, especially if cell-to-cell handoffs are still maturing in reliability.
Other examples could also crop up – certain gaming applications may need the low
latency of LTE or WiFi, rather than 2G or 3G, while there may be in-home services
that are only appropriate for use while on a femtocell rather than the macro network.
For example, the IPTV division at the operator might have negotiated broadcast
rights only for domestic usage of certain content, but not while the user is fully-
mobile.
There is also a link to charging and rating here – it should be possible to create tariffs
which allow users to access video downloads only over certain radio networks, or
only if a cell is currently under-utilised. Potentially, using cell-ID technology, it could
also be possible for the policy infrastructure to restrict access to particular
applications to certain geographic areas.
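The bearer-, load- and cell-ID-restricted tariffs described above amount to an entitlement check. A minimal sketch, in which every rule value (bearer sets, load thresholds, cell IDs) is hypothetical, might look like this:

```python
# Illustrative sketch of a bearer- and location-aware entitlement check:
# a tariff permitting video download only on certain bearers and only in
# under-utilised cells, and an IPTV service restricted (for rights reasons)
# to a home femtocell. All rule values are hypothetical.

TARIFF_RULES = {
    "video_download": {
        "allowed_bearers": {"LTE", "WiFi", "femto"},
        "max_cell_load": 0.6,            # only if the cell is under-utilised
        "allowed_cells": None,           # None = no geographic restriction
    },
    "home_iptv": {
        "allowed_bearers": {"femto"},    # domestic-rights content: femto only
        "max_cell_load": 1.0,
        "allowed_cells": {"cell-1234"},  # hypothetical home cell-ID
    },
}

def is_permitted(service, bearer, cell_id, cell_load):
    """Check one service request against its tariff rule."""
    rule = TARIFF_RULES[service]
    if bearer not in rule["allowed_bearers"]:
        return False
    if cell_load > rule["max_cell_load"]:
        return False
    cells = rule["allowed_cells"]
    return cells is None or cell_id in cells
```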
Outside the scope of detailed analysis in this paper, it is also important to think about
how individual smartphone apps, or browser-based widget frameworks, may be able
to get access to low-level device APIs (Application Programming Interfaces). This
could tell the app what the current connection status is – or even allow it to force
a change, for example prompt the user to connect to WiFi, or even switch off one of
the radios. There is a significant risk that this could conflict with operator-driven
connection management, especially on fully-open devices.
Femtocell-awareness
This has immediate applicability to the charging domain, but may also be useful for
assessing the value of femtocell investments, and how they affect user behaviour.
Correlating either device usage, or differential usage of applications on macro vs.
femto gives important information about user preferences, or could form the basis of
particular tariff plans or new value-added services. For example, it could be used to
trigger higher-quality video with better frame rates or resolution, assist with the
targeting of adverts if the user is known to be at home, or improve operators’
cooperation with third party service providers such as social networks.
Congestion awareness
As well as bearer type, Disruptive Analysis believes that it will also be increasingly
important for policy infrastructure to start taking account of bearer and network
conditions. Knowledge of actual or predicted congestion is hugely important in both traffic-management and charging decisions.
Increasingly, applications and devices are becoming more aware of their context –
what network they are connected to, what the apparent performance of the network
is and so forth. But what is missing is a way for the network to flag its own status back to users (or their applications and devices).
© Disruptive Analysis Ltd, November 2010
One of the worst flaws in the concept of content providers paying for QoS is that it
seems perverse to pay for extra quality when it isn’t actually needed. Nobody would
pay extra for driving in a priority (toll) lane on a road between two cities at 3am when
there is no traffic on the normal highway.
It would be an act of bad faith for the operator to try to pretend an unneeded service
is premium – especially if the alternative was in any way artificially degraded for
unnecessary reasons. (It is probable that such chicanery will become measurable
and reverse-engineered, in any case).
But if the application can be made aware of the current state of the network – or even
that a given cell is often congested in certain hours, perhaps – it is more likely that
either the provider will choose to down-rate the service and warn the user (e.g.,
“Congested network – video may pause occasionally”), or perhaps even pay for
additional quality. Providing extra information should be valuable, as well as showing
awareness of an important part of behavioural psychology – people are much more
accepting of problems if they are warned about them, even if the warning is a poor
excuse (e.g., “The flight is delayed owing to late arrival of the incoming aircraft”).
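The network-to-application signal discussed above might work roughly as follows: the network advertises a coarse congestion level, and a video client chooses a lower bitrate and warns the user. The level names, bitrates and warning text are illustrative assumptions only.

```python
# Sketch of a client reacting to an advertised network congestion level by
# down-rating video and warning the user. Levels, bitrates and the warning
# wording are assumptions for illustration.

def adapt_video(congestion_level: str) -> dict:
    """Map an advertised congestion level to a client-side decision."""
    if congestion_level == "high":
        return {"bitrate_kbps": 300,
                "warning": "Congested network - video may pause occasionally"}
    if congestion_level == "medium":
        return {"bitrate_kbps": 700, "warning": None}
    return {"bitrate_kbps": 2000, "warning": None}  # uncongested: full quality
```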
There are numerous possible use cases for this type of intelligence.
Any plans which have lots of “ifs and buts” will start to raise concern among both
users and consumer-advocacy organisations. One approach may be to communicate
network status and policy actions via alerts and handset clients – perhaps a “battery
meter” style UI element to convey network congestion levels, for example.
Power management
One possible new use case for bearer-aware DPI and policy management is around
adjusting a device’s connections to optimise its battery life. Given knowledge about
device type, application usage and the subscriber’s behaviour, it may be possible to
tune the bearer selection process, or alert the user to more power-friendly options.
Such a proposition would need careful work involving collaboration of several groups
within a service provider, but being able to go to market with a proposition that
“batteries last longer on our network” could be extremely compelling. It is probable,
however, that such a solution would need both network-based intelligence and a
smart connection manager client on the device itself.
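The bearer-tuning idea behind such a "batteries last longer on our network" proposition might be sketched as choosing the lowest-power bearer that still meets an application's speed requirement. The per-bearer power and throughput figures below are illustrative assumptions, not measurements.

```python
# Hypothetical power-management sketch: given assumed per-bearer power draw
# and typical throughput figures, suggest the most battery-friendly bearer
# that still meets the application's minimum speed requirement.

BEARER_POWER = {
    #         avg_mW  typical_kbps  (illustrative figures only)
    "2G":    (250,    100),
    "HSPA":  (800,    3000),
    "LTE":   (1200,   20000),
    "WiFi":  (400,    10000),
}

def most_power_friendly(min_kbps):
    """Return the lowest-power bearer whose typical throughput meets the
    need, or None if no bearer qualifies."""
    candidates = [b for b, (_, kbps) in BEARER_POWER.items() if kbps >= min_kbps]
    if not candidates:
        return None
    return min(candidates, key=lambda b: BEARER_POWER[b][0])
```

As the text notes, such logic would likely live partly in a device-side connection manager, informed by network-side intelligence about the subscriber and cell conditions.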
Most devices are not completely operator-controlled, although their (licensed) cellular
radio elements usually are. However, it needs to be recognised that manufacturer- or
user-installed connection management software may well be designed around
different priorities to the operator’s own policy infrastructure. All things being equal,
end-users will choose to obtain maximum speed and data consumption, at the lowest
price and power consumption – and their software will reflect these preferences and
may attempt to “game” the network to obtain them.
The answer is to embrace and extend this model. Over time, Disruptive Analysis
expects operators’ mobile policy management to involve device-side software as well
as network-based elements: alerts or notifications to the user of network congestion, options or recommendations for content format choice or transcoding, and rate-adaptive applications that work collaboratively with the operator’s infrastructure.
While there will always be some tensions and differences of emphasis, attempting to
dictate the behaviour of complex end-devices (which may well have >1GHz of
computing power) from the network alone is unlikely to be possible.
Conclusions
The area of policy-based control of mobile broadband traffic is an exceptionally
complex one, in terms of technology, business model, law and user experience.
There is a broad range of vendors with a panoply of distinct propositions and emphases. There are often organisational issues within operators with multiple silos.
It is quite common for operators to purchase point solutions within different groups.
The concept of a holistic approach to mobile data traffic management should
transcend this structure, but in many cases there are organisational barriers to
forming cross-functional teams.
Many of the current solutions for policy management and content optimisation have
little knowledge or control of “what is happening in the air”. In many cases, the gating
factor on user experience is basic radio coverage, not quality, for example. In
addition, many of the real problems in mobile broadband come from signalling for
things like RF (Radio Frequency) power control, not from the bulk of traffic.
In addition, the base stations typically allocate radio capacity to whichever user has the best signal – not whichever is the most important. It is extremely hard to control the
fine-grained details of packet-scheduling from the core network.
Disruptive Analysis believes that increasing the bearer-aware intelligence of the DPI
and policy management infrastructure is a critical element of creating fair, effective
traffic management that balances the needs of user and operator. The increasing
diversity of bearer technologies, frequency bands and ownership models will
otherwise start resulting in inconsistencies and arbitrage opportunities, as pricing and
charging inadequately reflect the realities of network and device behaviour.
About Disruptive Analysis
The company produces research reports and white papers, conducts consulting projects on
technology strategy and business models, and provides speakers and moderators for
workshops and conferences.
Website: www.disruptive-analysis.com
Blog: disruptivewireless.blogspot.com
Twitter: @disruptivedean
Every reasonable effort has been made to verify research undertaken during the work on this
document. Findings, conclusions and recommendations are based on information gathered
in good faith from both primary and secondary sources, whose accuracy it is not always
possible to guarantee. Disruptive Analysis Ltd. disclaims all warranties as to the accuracy,
completeness or adequacy of such information. As such no liability whatever can be accepted
for actions taken based on any information that may subsequently prove to be incorrect. The
opinions expressed here are subject to change without notice.