Summary: Below are the Executive Summary and an extract from a report on key issues for operators seeking to optimise mobile broadband network economics, as debated at the recent Telco 2.0 EMEA Brainstorm in London.
(NB: New video presentations exploring these issues in more detail will be broadcast online at Telco 2.0 Best Practice Live! on 28-30 June. Register here - it’s FREE.)
At the 9th Telco 2.0 Executive Brainstorm, held in London on April 28-30, a dedicated session addressed the technical and business model challenges of mobile broadband, specifically looking at the cost problems and opportunities of the data boom.
The primary points made by the presenters were that:
Delegates: tiering sounds good but how do we do it?
Charging for higher and tiered Quality of Service (QoS) was a major topic of debate, and although this was ultimately voted the most important potential current strategy, delegates also offered strongly divergent views. Other major themes were potential technological approaches, the role of content owners, LTE, and application-based pricing.
Telco 2.0 Next Steps: Optimising Mobile Broadband Business Model Economics
Optimising mobile broadband economics is a complex challenge, or might perhaps be more accurately described as a collection of different challenges for different operators. There’s always a temptation to try to solve complex problems with a single ‘silver bullet’ idea, but in this instance this is almost certainly impossible, as there are many different possible solutions and different combinations of solutions will work at different times for different operators.
In our series of Future Broadband Business Models Strategy Reports, Telco 2.0 has previously explored the long term business model and technical architectures in Beyond Bundling: Growth Strategies for Fixed and Mobile Broadband - "Winning the $250Bn delivery game.", the structure and evolution of the online video distribution market in Online Video Market Study: The impact of video on broadband business models, and most recently updated our analysis on a range of nearer term potential business model strategies in New Mobile, Fixed and Wholesale Broadband Business Models.
We will next create a new report summarizing the main options for optimizing mobile broadband business model economics. In addition, Mobile Broadband will feature in the first Telco 2.0 Best Practice Live! event at the end of June. This will provide a video-based online data bank of some of the most interesting Mobile Broadband case studies from across the world.
- Start of Detailed Report Extract -
Moving attention away from the service side of the mobile broadband debate, speakers at the 9th Telco 2.0 Executive Brainstorm concentrated instead on how to move the needle on the cost side of the mobile broadband economics equation.
Stimulated by presentations by Dean Bubley, Senior Associate, Telco 2.0, and Dan Kirk, Director, Value Partners and a panel discussion that also included Johan Wickman, CTO Mobility, TeliaSonera, Eddie Chan, Global Head, Efficiency, NSN Consulting, and Andrew Bud, Chairman, MBlox, delegates came to the conclusion that pricing and segmentation strategies, together with offloading capabilities are more important than LTE in dealing with the data-inspired capacity crunch.
Dean Bubley, Senior Associate, Telco 2.0, laid out the problem facing mobile operators. He displayed the now-iconic chart illustrating the ‘broadband incentive problem’ but argued that this was not a problem in itself – he said it was interesting but not necessarily a problem. It didn't, for example, follow that the data service was going to be provided at a loss. Indeed, Johan Wickman’s TeliaSonera is one of a number of operators that are experiencing data revenues higher than is commonly believed. The incentive problem also doesn’t say anything about where cost or capacity issues would manifest themselves - in which elements of the network, or indeed what the right strategy would be to deal with them. Indeed, there are complex technology strategy issues present that aren’t addressed by such a statement at all.
Understanding Costs and Technology
Furthermore, he suggested that the industry may be paying more attention to how revenues from mobile broadband might be increased than how its costs could be controlled. Referring to an Agilent Technologies presentation on LTE, he pointed out that the large majority of all current and future wireless capacity was accounted for by the creation of new cells, therefore radio air interface improvements and spectrum release would not be anywhere near enough to support continued traffic growth without much more cell subdivision, with all its associated costs, and more use of "small cells" such as femtocells, WiFi, or pico-cells.
Network Solutions and Limitations
It is therefore inevitable that operators look at ways to make better use of the capacity available. However, the options for managing and shaping traffic are not straightforward and, as NSN’s Eddie Chan said, it is necessary to realise that “efficient” is not the same as “cheap” - efficiency is also about service improvements.
Traffic Management Mess
Bubley was particularly critical of traffic management solutions. He pointed to the important subtlety that traffic management could easily become a “mess”, particularly as traffic to and from PCs is difficult to manage. It tends to include many applications and, what is more, many applications and protocols can often be tunnelled within each other. The PC is a powerful open development platform, so there is much scope for users to circumvent traffic shaping. The share of PC traffic that consists of non-voice data is of the order of 90%+, and essentially all of it goes to or from the public Internet, so whatever the operator does would come at a cost. The complexity of this is illustrated below.
Figure 4 – Traffic Management Options
Source: Telco 2.0, 9th Telco 2.0 Executive Brainstorm, April 2010
Bubley did point out that smartphone data and featurephone traffic are much more likely to be open to operators "adding value" than PC traffic as they are going to operator-hosted or operator-managed services. The traffic still has to be "managed", but it's now "friendly" traffic which is much more predictable. M2M devices, meanwhile, send all their traffic through the operator's network – which might be a good reason to promote them as a line of business. Given the associated behaviours, it might be wise to segment by device rather than by application, an approach that Bubley feels is even more pertinent given concerns over DPI (Deep Packet Inspection), a technique by which network equipment looks beyond the header used for routing to non-header content (typically the actual payload) for some purpose, in this case to prioritise traffic.
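Bubley’s device-segmentation idea can be sketched as a simple policy lookup keyed on device class rather than application. This is a minimal illustration only: the device classes echo those discussed above, but the policy values and the `policy_for` helper are hypothetical, not anything proposed in the report.

```python
# Sketch of device-based (rather than application-based) traffic
# segmentation. Policy values are illustrative assumptions.

POLICIES = {
    "smartphone": {"priority": "high", "note": "largely operator-hosted, 'friendly' traffic"},
    "featurephone": {"priority": "high", "note": "operator-managed services"},
    "pc_dongle": {"priority": "best_effort", "note": "bulk public-Internet traffic, hard to shape"},
    "m2m": {"priority": "scheduled", "note": "all traffic stays on the operator's network"},
}

def policy_for(device_class: str) -> dict:
    """Look up a traffic policy by device class, defaulting to best effort."""
    return POLICIES.get(device_class, {"priority": "best_effort", "note": "unknown device"})

print(policy_for("m2m")["priority"])     # scheduled
print(policy_for("tablet")["priority"])  # best_effort (unknown class)
```

The attraction of this approach is that the device class is known reliably at attach time, whereas identifying an application requires the inspection techniques criticised below.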
The Doubtful Promise of DPI
Bubley argues that application-layer traffic shaping based on DPI has serious downsides, a major one being simply the definition of an application. For example, which service class would a YouTube video inside a Facebook plug-in have? Users would also adapt to it, using encryption and tunnelling one application inside another to get round restrictions. Indeed, much file-sharing traffic has already moved to HTTP or HTTPS on ports 80 and 443. This may sound overly ‘techie’, but what it means is that file-sharing traffic becomes indistinguishable from, and blends with, generic Web traffic. In addition, there would certainly be inaccurate results and ‘false positives’, which could lead to political, regulatory, and reputational issues.
The only uses for deep packet inspection he could see were compliance with law-enforcement demands and gathering data for network optimisation, which might help the industry establish whether its problems were caused by P2P file-sharing, poor network design, aggressive data use by smartphones, or software updates.
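The point about file-sharing blending into generic Web traffic can be illustrated with a toy port-based classifier, the crudest form of traffic identification. This is a hypothetical sketch with invented flows, not any vendor’s DPI implementation:

```python
# Toy port-based traffic classifier, showing why application
# identification breaks down once file-sharing moves to ports 80/443.
# Port-to-application mappings use well-known assignments; the flows
# themselves are invented for illustration.

PORT_APPS = {80: "web", 443: "web", 6881: "bittorrent"}

def classify(dst_port: int) -> str:
    """Guess an application from the destination port alone."""
    return PORT_APPS.get(dst_port, "unknown")

flows = [
    ("legacy BitTorrent", 6881),
    ("file-sharing tunnelled over HTTPS", 443),  # looks identical to browsing
    ("ordinary web browsing", 443),
]

for description, port in flows:
    print(f"{description}: classified as {classify(port)}")
```

Both port-443 flows are classified as “web”: telling them apart would require payload inspection, which encryption in turn defeats - precisely the cat-and-mouse dynamic Bubley describes.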
So, if managing and shaping traffic effectively on one network is problematic, does it make more sense to offload it onto another?
The major advantage of the offload concept is that nobody’s data is being de-prioritised – rather than a re-allocation of (supposedly) scarce bandwidth, it represents an actual improvement in the efficiency of the network. It is therefore much less complex from a regulatory, political, and economic standpoint.
Solutions at the Business Layer
There are certainly some valuable options for addressing the data issue from a technical point of view, offload perhaps the most valuable amongst them. However, these are not all the weapons in an operator’s arsenal. They can also look to manage the impact of traffic on their networks and their bottom lines by looking at different business model and pricing options.
On the revenue side, Bubley says the bulk of revenue will come from ‘downstream’ subscription and pre-pay customers, and that, while helpful, the near-term growth of new ‘upstream’ or wholesale / carrier services revenues alone would not be enough to cover the costs of capacity increases.
Figure 5 – New Revenue Streams Not Enough to Offset Capacity Requirements
Source: Telco 2.0, 9th Telco 2.0 Executive Brainstorm, April 2010
This view was backed up by a delegate vote (see below) suggesting that, while other options are possible, in the short term better tiering and segmentation strategies will be the best answer, followed by device-orientated solutions.
In this vote, “New device categories” captures M2M (Machine-to-Machine) markets, “device bundled” refers to “comes with data” business models such as the connectivity Sprint provides for the Amazon Kindle, and ‘better tiering and segmentation’ refers to service and tariff packages. ‘Sender party pays’ is where users receive the service free and the sending party, be it an advertiser or other enterprise, pays, and ‘government sponsored’ is the case where the government pays for the connection as a public service.
Figure 6 – Impact of Mobile Broadband Business Models
Source: 9th Telco 2.0 Executive Brainstorm, April 2010
All Devices Are Not Equal
Returning to Bubley’s earlier claim that device segmentation may be more effective than application management policies, devices are a natural place to start when looking at business segmentation strategies. However, not all devices are created equal.
Smartphones, for example, tend to generate many relatively brief data sessions, they move around constantly and therefore carry out large numbers of register/handoff transactions with the network, and they also generate voice and messaging traffic. Because the signalling overhead for a data call is incurred when setting up and tearing down the session, a given amount of traffic split into 10 brief sessions is dramatically more demanding for the network than the same amount in one continuous session. Also, smartphones often have aggressive power-management routines that cause more signalling as they attempt to migrate to the cell that requires the least transmitter power.
On the other hand, although laptops tend to consume lots of bulk data, they do so in a relatively network-friendly fashion. Cellular dongles are typically operated much like a fixed-line modem, registering with the network and staying on-line throughout the user session. Their use profile tends to be nomadic rather than truly mobile, as the user is typically sitting down to work at the computer for an extended session. And the modems rarely have any serious power management, as they draw power over USB from the computer. These behaviours therefore create natural segments.
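The smartphone/dongle contrast above can be made concrete with a back-of-the-envelope calculation. The per-session and per-handoff signalling costs below are illustrative assumptions chosen only to show the shape of the effect, not measured figures:

```python
# Back-of-the-envelope comparison of signalling load for the same
# volume of data moved in 10 short smartphone sessions vs one long
# dongle session. All cost constants are illustrative assumptions.

SETUP_TEARDOWN_COST = 2  # assumed signalling transactions per session (setup + teardown)

def signalling_load(num_sessions: int, handoffs_per_session: int = 0) -> int:
    """Total signalling transactions for a given session pattern."""
    return num_sessions * (SETUP_TEARDOWN_COST + handoffs_per_session)

# Smartphone: many brief sessions, moving between cells
smartphone = signalling_load(num_sessions=10, handoffs_per_session=3)
# Dongle: one continuous nomadic session, no handoffs
dongle = signalling_load(num_sessions=1, handoffs_per_session=0)

print(smartphone, dongle)  # 50 vs 2: same data volume, 25x the signalling
```

Even with these toy numbers, splitting the same traffic into many short, mobile sessions multiplies the signalling burden by an order of magnitude - which is why the two device types form such distinct natural segments.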
To read the rest of the report, covering...
...Members of the Telco 2.0™ Executive Briefing Subscription Service and Future Networks Stream can download the full 20 page report in PDF format here. Non-Members, please see here for how to subscribe. Please email email@example.com or call
+44 (0) 207 247 5003 for further details.