Tuesday, April 29, 2014

Net Neutrality - getting the right compromise: Focus on innovation!

I'm currently finishing a research report on new and "non-neutral" mobile broadband business models. When I started writing it, I hadn't realised that it would coincide with such a huge period of industry turmoil, with the EU Connected Continent legislation going through the Parliament, and the US FCC reworking its rejected Net Neutrality rules and suggesting controversial alternatives. Add in various other national legislative initiatives, plus the recent Internet Governance shindig in Brazil, and it's clear that a lot is going on - it's the right time to be looking afresh at this area.


A core part of the debate is around so-called "specialised services", particularly involving either prioritised "fast lanes", or perhaps just 3rd-party payments for data traffic instead of the end-user. (I'm also looking at other flavours of non-neutrality such as zero-rating and application-based charging, but they're discussions for another day).

While I've got my own personal preferences and beliefs, I'm wary of confirmation bias, so I'm looking at the broadband/Internet industry through the lens of "what compromises are right for the telecoms industry, consumers, the content/developer community & the broader economy/society?"
In other words, there needs to be:
  • Clear benefits for end-users (consumers & businesses) in terms of speed & reliability of both Internet connections and other non-Internet broadband use-cases (IPTV, VPNs etc). Users must be able to pick and choose any applications they want, and in the case of public Internet-based services they should continue to meet the same (variable, but generally good) performance levels that are common today. It is also important that users are offered ever-improving networks (however measured, but typically peak/average throughput) and a rich stream of new and innovative content, apps and services.
  • Continued innovation in apps and investment in content by various media, web and software players. This is covered below - but a cornerstone here is that developers/content companies should not need to have any interaction or commercial relationship with access providers at the user-end. They might choose to do so, but as a general case they should have reasonable expectation that the Internet "just works" end-to-end for most uses, as it does today, without extra friction. This is what has made the Internet so successful - you don't need ISPs' permission or engagement to build a "killer app" or website; there are no gatekeepers, although you might want catalysts or accelerators.
  • Enough incentive for telecom operators to continue to build and run broadband networks, and offer Internet access as a (often the) primary application that exploits them. This applies to both fixed and mobile operators. Operators may also choose to offer apps, services and content, either standalone or as bundles, together with connectivity. These may involve innovation by telcos directly, or may be reliant on others' innovation. There's a further issue around telephony, which needs continuity and reliability, despite its inevitable & imminent decline in importance and value. There is an open question about levels of consolidation and infrastructure competition.
  • Continued economic and societal benefits from the Internet and (where appropriate) other non-Internet services. This includes fostering national software and content innovators, maintaining critical infrastructure/services, some forms of universal access, and fostering fair competition, enablement of innovative "big projects" (learning, energy, transport), improving citizen engagement & democracy, and earning appropriate levels of tax.
Clearly trying to "optimise" across all those separate constituencies is a tall order. But the astute will have recognised that the word "innovation" appears in all four buckets.

(There are also other constituencies - such as technology vendors, industry analysts/consultants, and lawyers/lobbyists - with their own self-serving needs, but I'm excluding those here.)
 
The interesting thing is that all sides in the current Net Neutrality furore claim that their position is essential for innovation. Internet advocates point to the "generativity" of the last couple of decades in creating new services and applications, and argue that "permissionless innovation" by developers must be maintained.

Telcos believe that allowing managed "specialised services" on networks will encourage additional innovation, especially for demanding applications.

The counter-arguments tend to focus on potential competitive threats, eg large media companies hogging the expensive "fast lane" and squeezing out lesser-resourced rivals and startups, or telcos feather-bedding their in-house services at the expense of Internet-based alternatives.

There is also a sense that some telcos just want to extract rents/taxes from existing services, by acting as gatekeepers and enforcing monopolies on end-user connections. (Disruptive Analysis does not buy the argument that new pricing models are innovative in and of themselves. Interesting and possibly profitable yes, innovative in the sense of "creating new stuff", no).

It seems clear that the debate between "Neutrality = Innovation" and "Non-Neutrality = Innovation" is not going to be easily argued and won conclusively.

I've been reading a couple of interesting books recently. One is Antifragile, about things which gain from disorder and randomness. It has lots of applications - and misapplications - across telecoms, the Internet and networks in general. I've been arguing with Martin Geddes on Twitter about this a lot recently. The other is The Why Axis, which looks at doing real-world experiments to determine cause-and-effect, incentives to change behaviour and so on. It's more about individuals, but some of the ideas apply to businesses as well.

I'm not certain precisely how we can test the question "does neutrality generate more innovation than non-neutrality?", but it strikes me that any practical way to test it would be incredibly valuable.

(And yes, we'd have a separate argument about the metrics and measurement, but to a first qualitative approximation I'd say "how much cool new stuff emerges, and how fast?" isn't a bad start.)

I think I've worked out a way to "make a silk purse out of a sow's ear", on the thorny issue of "specialised services", aka "fast lanes".

The potential innovation upside of such managed connections is tempered by the downside of anti-competitive behaviour, or unfair rent-seeking.

While I'm a huge believer in market economics, a lot of areas around broadband either lack competition (access networks), or move too slowly at regulatory/legal levels to prevent inordinate damage. The usual suggestion of "let competition sort it out, with easier switching for customers if dissatisfied" is too slow. (It also doesn't help protect the developer/app constituency). We all know how fast "web speed" is - and how fast value is created by innovation, or lost by delay. Most startups - or established Internet companies - have neither the time nor money for protracted and arcane fights against telcos. 

In Europe, the original draft laws did not go far enough to define or constrain specialised services, and politicians saw it as a broad get-out clause for telcos, with too many grey areas. They voted for an amendment which tried to tighten the definition:


“Specialised service” means an electronic communications service optimised for specific content, applications or services, or a combination thereof, provided over logically distinct capacity, relying on strict admission control, offering functionality requiring enhanced quality from end to end, and that is not marketed or usable as a substitute for internet access service.


I actually think that's a bit clunky, and may have some unintentional side-effects. I'm especially unclear about exactly why the term "admission control" is so important here. Hopefully it will get explained and clarified in a subsequent draft.

The US is currently going through the same loop, with the FCC coming out with hugely controversial proposals about similarly allowing "specialised services" to run alongside a supposedly "neutral Internet". Unsurprisingly, this has induced howls of rage from the pro-neutrality lobby, and it is unclear how far it will get through the rest of the US regulatory process, vs. other ideas about redefining Internet access as a "common carrier" platform.

But I have an idea.

Let's kill two birds with one stone.

Regulators could allow specialised services, but only for services which are not also available on the "open Internet". In other words, specialised services should actually be *special*, and not just chunks of the existing Internet sold at a higher price.

This would have a number of effects:

  • It would foster true innovation by companies or individuals that have eschewed the Internet because of needs for QoS guarantees - for example, home healthcare or maybe cloud-gaming
  • It would reduce the fear that big media/app companies could use a "fast lane" to squeeze out less-resourced rivals and startups suffering from the "dirt road" second tier
  • It would reduce the ability of telcos to subtly "force" content companies to pay for access connections that end-users have already paid for
  • It would give a clear set of services for telcos to retail (or develop in-house) without worrying that some of that activity would "leak" over to the public Internet as it gets faster
  • It allows things to be regulated separately if needed
  • It could come from established players launching entirely new services, if they wanted (eg 4K video from Netflix, or a home cleaning & security robot service from Google) - as long as they were not also available via robot.google.com
  • It gives telcos an incentive to both sell specialised services (for new revenue) and continue investing in "vanilla" Internet provision
  • It will encourage telcos to invest R&D $$ into their own specialised services
  • It doesn't risk "breaking the Internet model" through stifling the normal process of developer innovation - it just adds another channel for creative thought & product release. 
  • It focuses back on telco core strengths of availability & uptime, as it can be dimensioned differently
  • It could prove in a measurable fashion whether QoS capabilities do meet a real need, and provide valuable input for policymakers
  • It allows vendors to sell lots of new policy kit and charging solutions
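As a rough sketch, the proposed rule above reduces to a simple predicate: a managed service is permitted only if it is not also offered over the open Internet. The service names and catalogue below are entirely hypothetical, chosen just to illustrate the shape of the test:

```python
# Hypothetical sketch of the "specialised services must be special" rule:
# a managed/QoS service may be sold only if the same service is not
# also available on the open Internet.

def specialised_service_allowed(service: str, open_internet_catalogue: set[str]) -> bool:
    """Return True if the service qualifies as a genuinely 'special' service."""
    return service not in open_internet_catalogue

# Illustrative catalogue of services already available on the open Internet
open_internet = {"web-video-streaming", "voip", "social-networking"}

print(specialised_service_allowed("remote-heart-monitoring", open_internet))  # True: genuinely new
print(specialised_service_allowed("web-video-streaming", open_internet))      # False: the Internet, resold
```

The hard part in practice would of course be defining "the same service" - the predicate is trivial, the catalogue is not.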
Yes, I know this isn't going to be "quite that easy", but it strikes me as a reasonable compromise that gets around a lot of the objections on both sides. It gives a clear dividing line between the Internet and the Ain'ternet (You heard that one here first). It could help create a broad range of new broadband propositions that are risky or insecure to do over the public Internet (telemedicine, smart homes & the like).

And it means that if we do get proof that QoS-managed/specialised-services connections generate innovation and "cool new stuff", then future law-making will be much more evidence-based, rather than just hot air from lobbyists.

Friday, April 25, 2014

Telecoms regulators should encourage multiple access, not just competitive access

Much of the regulatory debate around telecoms competition (fixed or mobile) involves consideration of investment in networks. 

Broadly, this centres on two or three dimensions:


  • Investment in adding capacity, coverage & capability to existing networks, eg fibre deployment, or upgrading to 4G mobile networks
  • Ensuring adequate facilities-based competition between (retail) access networks, eg cable vs. telco
  • And also where retail competition is difficult, enabling wholesale competition, such as by unbundling of local loops, or allowing MVNOs
The general idea is that consumers should have access to alternative suppliers for connectivity, especially broadband. Some regulators also try to ensure that switching/churn is made quick and easy.

At the same time, a reverse trend is occurring with network consolidation (mobile/mobile, mobile/fixed & fixed/fixed) as a result of declining profitability. We also see an ongoing trend towards network-sharing in order to reduce costs, although that is still only patchily-accepted.

I am increasingly of the opinion that regulators need to shift stance a bit.

Rather than focusing on competition at a national level and assessing the ability for consumers to switch access providers, I think an alternative approach is warranted.

Regulators should look to make it easier for users to access *multiple* networks dynamically, rather than necessarily switch from one primary provider to another. New devices and applications now make it much easier for users to exploit several routes to obtain data or content, either switching on short time-scales, or even connecting to multiple sources simultaneously.

Most obviously, smartphones are typically able to connect to both a cellular network, and the user's choice of WiFi in a given venue. Less obviously & commonly, they can also be "tethered" together, to use another phone's cellular connection by proxy. In the fixed broadband world, we sometimes see mobile broadband used as a backup - or in the business space, perhaps multiple redundant fibre connections into the same building.

Where customers are able to choose and switch in (near) realtime between different connectivity options, they (or apps working on their behalf) can minimise costs, maximise up-time, arbitrage around "neutrality" issues, and perhaps bond together multiple connections for the highest speeds.
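To illustrate the kind of selection logic a device or app agent might run on the user's behalf - all the network names, costs and speeds here are made-up examples:

```python
# Hypothetical sketch: a device agent scoring the access networks currently
# in range, and picking the best one for a given preference (cost vs speed).

from dataclasses import dataclass

@dataclass
class Network:
    name: str
    cost_per_mb: float    # local currency per MB
    downlink_mbps: float  # measured throughput

def pick_network(networks: list[Network], prefer: str = "cost") -> Network:
    """Choose the cheapest network in range, or the fastest if prefer='speed'."""
    if prefer == "speed":
        return max(networks, key=lambda n: n.downlink_mbps)
    return min(networks, key=lambda n: n.cost_per_mb)

in_range = [
    Network("home-wifi", cost_per_mb=0.0, downlink_mbps=20.0),
    Network("cellular-4g", cost_per_mb=0.05, downlink_mbps=35.0),
    Network("neighbour-shared-wifi", cost_per_mb=0.01, downlink_mbps=8.0),
]

print(pick_network(in_range, prefer="cost").name)   # home-wifi
print(pick_network(in_range, prefer="speed").name)  # cellular-4g
```

A real agent would also weigh up-time, latency and any "neutrality" restrictions per network, but the arbitrage principle is the same.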

In particular, users are then not locked into a single business model - such as an ongoing monthly subscription - but can benefit from alternatives such as one-off payments, sponsored or "free" access, amenity or utility provision, bundling with other goods or services, and so on.

The key here is to ensure that customers have access to multiple independent networks, and not just some form of centrally-controlled "HetNet" with converged billing and policy (and business model) functions.

There are a number of further ways that "multiple access" can be fostered:


  • Encouragement of dual-SIM (& dual-standby) mobile devices
  • Fostering of alternative / overlay infrastructure in both fixed and wireless broadband, such as new generations of LEO/MEO broadband satellites, or more "far-out" options such as Internet-based drones and balloons
  • Allowing pole-mounted fibre to be deployed more easily
  • Reducing any onerous limitations on public/amenity WiFi (such as user registration)
  • Controversially, perhaps limiting the ability of cross-ownership or forced-offload between cellular & WiFi providers (ie ensuring that users have "WiFi neutrality" and are able to select a network of their choice)
  • Removing limits or blocking of tethering or other device-to-device connectivity and access-sharing
  • Looking at mechanisms to encourage households to obtain and manage two fixed-broadband connections, eg cable+fibre. This could involve promotion of "dual-homing" broadband gateways or set-top boxes
  • Examination of ways to encourage new entrants into domains such as white-space wireless, mesh networks and so forth
  • Consideration that many users will have several devices - and perhaps several different access providers - that can substitute for each other, in many cases.
  • Encouraging the adoption of third-party network monitoring software and reporting - allowing users to make informed decisions about their network choices (speed, neutrality, cost etc) at any location and time
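The "dual-homing" gateway idea above might implement failover with logic along these lines - the link labels and the cable/fibre pairing are purely illustrative:

```python
# Hypothetical sketch of dual-homed gateway failover: prefer the primary
# fixed-broadband link, fall back to the secondary if the primary is down.

def choose_uplink(primary_up: bool, secondary_up: bool) -> str:
    """Return which uplink a dual-homed home gateway should route traffic over."""
    if primary_up:
        return "primary"    # eg the cable connection
    if secondary_up:
        return "secondary"  # eg the fibre connection
    return "offline"

print(choose_uplink(True, True))    # primary
print(choose_uplink(False, True))   # secondary
print(choose_uplink(False, False))  # offline
```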
The point here is that there does not need to be "monolithic" competition - ie between nationwide networks. Neither does it need "seamlessness". Users and (increasingly) devices and apps are capable of switching networks very quickly, if perhaps not always fast enough or easily enough for cases like inbound phone calls. That is a "nice-to-have" which can be added later, if and only if it doesn't impact users' ability to choose and arbitrage across multiple providers' networks.

To a first approximation and for an early regulatory goal, it is sufficient for there to be multiple independent ways for most users to access a basic set of Internet services such as email, social networks, e-commerce, messaging and voice communications, at the majority of locations and majority of the time. This might involve using a phone instead of a PC, or a tablet instead of a TV. But people are quite flexible in achieving their real goals, such as "being entertained for 30 minutes" or "arranging to meet a friend", and cheaper devices and better software/apps allow tasks to be substituted easily.

On the other hand, it is unreasonable to expect full "access redundancy" to 4K video-streams ubiquitously.

If regulators focus on the "temporary switching" power of end-users, they reduce the potential for abusive market practices, irrespective of Net Neutrality position. If a customer's mobile operator tries to block Skype, but it is available instead via a neighbour's shared-access WiFi with minimal hassle, that both mitigates the harm to the user, and incentivises the original cellular operator to adopt more reasonable network policies. If an ISP tries to extort unreasonable paid-peering or transit fees from a CDN or content provider, then perhaps a 3rd-party Internet drone or satellite connection can step in, and in the process help keep interconnection fees honest.

This is a theme that I will come back to in various other guises. But the bottom line is that regulators should consider users' ability to exploit *multiple* independent network connections and business models on a dynamic basis, not just to choose occasionally between primary providers when contracts are up for renewal.

Tuesday, April 22, 2014

5G standardisation needs to be multi-stakeholder, not just a cosy telco+vendor process

I'm seeing a huge amount of interest in the early definition of 5G networks, which are expected to start appearing sometime around 2020 - even though there is thus far no formal definition. I attended a Huawei-sponsored event about 5G in Munich a couple of months ago, which is just one of many similar conferences and gatherings involving most of the traditional industry. And there certainly seems to be a long list of technologies - and potential requirements - vying for inclusion and consideration.

However, in my view 5G should take a different path to standardisation than 4G/LTE did. That process definitely had some highlights - especially the bringing-together of the formerly separate CDMA and GSM/UMTS worlds. LTE has also taken off rapidly in some countries (especially the US, Japan and South Korea), paralleling - and being catalysed by - the rise of smartphones.

On the other hand, LTE has some downsides. In terms of business model and user-behaviour, it is still largely "like 3G use, but more so". It's faster, cheaper (per-MB) and has lower latency. But it's also often patchy in coverage, has far too many separate frequency bands, and of course is sub-optimal for telephony, with CSFB's compromises and VoLTE's huge delays and cost/complexity, occurring right at the same time as "peak telephony". The LTE speedboat has had to drag the ugly anchor-weight of IMS along with it, as the 3GPP standards have meshed them so tightly. (Or to use my 2009 metaphor, the dead parrot of IMS has been nailed to the LTE perch).

LTE also sits somewhat uneasily with the growth of WiFi almost everywhere. Ignore the HetNet hype for a moment - most WiFi is, and will continue to be, totally separate from the mobile network. WiFi is mostly either private (part of a home or office LAN environment), controlled by fixed/cable carriers, or provided as an amenity (rather than a service) by venue owners, event organisers, software developers and others.

There is a reason for this - WiFi is not constrained to a single business/user-interaction model, ie a "subscription". It does not need a SIM card. It can be subscribed to within an ongoing business relationship, or it can be transient, free, sponsored, venue-based, time-based, anonymous, tethered, or assorted other approaches. This stands in contrast to 3G/4G, which - for all the lobbyist whining about Net Neutrality - still comes as a Henry Ford-like "any business model you like, as long as it's a subscription".

I've never had a conference organiser (or a cafe) give me a code for "free LTE" while on-site.

This is partly because WiFi's technical standards (defined by the IEEE & WiFi Alliance) do not include elements that pre-define its usage model. It does not need a SIM (subscriber identity module - the hint is in the name). It supports multiple authentication models, and can have "users" rather than subscribers, with access not requiring many of the characteristics expected in the cellular world. As such, it is a "multi-stakeholder" technology - it involves network operators (fixed & mobile), end-users, enterprises, device vendors, developers, municipalities, venue owners, tenants, OS suppliers, aggregators, advertisers and many other interested parties. WiFi also does not mandate a specific service or control infrastructure - while IMS can theoretically be used, in almost all cases it is not.

As a result, unlicensed wireless data has a hugely diverse range of use-cases and manifestations, which has driven immense amounts of innovation, consumer benefit, application consumption and, ultimately, economic and social gain.

We should be thinking the same way for the 5G network architecture. We (governments, users, regulators, vendors, investors) need to make sure that it too is "multi-stakeholder". Unlike with 4G/LTE, we now have an opportunity for many other groups to get involved in defining "requirements" for 5G - and especially in making sure that whatever technical standards emerge constrain neither business models nor application/user interactions. Clearly, if it is expected to operate in licensed spectrum, it will need adequate mechanisms for management (especially for interference, and probably aspects of performance/quality), but it needs to be technology-neutral end to end. In particular, it needs to be core-neutral and not assume a particular architecture.

Interestingly, we're already seeing discussions about putting LTE into unlicensed bands - but at the moment, just with the same structures and architectures as "normal" cellular.

In order to get to those endpoints, the discussion needs to involve many more parties than just the cosy vendor/MNO process seen in 3G and 4G. The upfront discussions defining the technology need to involve a similar - indeed, broader - range of parties than WiFi's.

While clearly Vodafone, Verizon, Ericsson & Huawei will need to be heavily involved in the 5G technology definition, so do Ford, Boeing, IBM, Comcast, Google, General Electric, Hilton, Apple, GlaxoSmithKline, Sony, Disney, ABB, WPP, Shell, Westfield, Starbucks, NATO and the Greater London Authority. There also need to be marketers, behavioural psychologists and other social scientists involved, who can help steer the direction towards what customers actually want - rather than just what (old, mostly male) engineers think we should have.

Governments and regulators need to get involved immediately to ensure that 5G standards definition is not inherently anti-competitive. If we believe in the Internet of Things, eGovernment, cloud, wearables and digital inclusion, we need to ensure that 5G is not just "4G on steroids". With the coming of virtualisation, we also need to make 5G much more easily "hackable", especially if it's used in unlicensed spectrum. Bits of the radio technology should have developer kits, or even be open-sourced where possible. As long as there is adequate protection against interference, 5G should allow experimentation. We need to be certain that there is not a cartel-like grip on IPR that funnels 5G into being (to all intents and purposes) merely an overlay/upgrade for 3G and 4G networks. The idea of wholesale - at multiple levels - needs to be ingrained upfront as well.

It should be remembered that the telecom industry is not the only source of capex or managed-services opex. It is in ALU's, Ericsson's & Huawei's interests to develop versions of 5G that they can sell directly to governments, Facebook, Exxon and electricity companies, as well as to traditional network operators. We already see LTE starting to appear for public-safety and industrial uses - the future architecture of 5G needs to enable that approach to be expanded massively, especially as the telecom industry inevitably consolidates in the coming years.

This probably means that bodies like 3GPP and ETSI are not the right places to start. It is questionable if the new 5G-PPP organisation is, either. It is not obvious to me that historically non-cellular companies (eg Toyota, or medical device vendors, or train operators) will easily be able to fit into the clubby telecom-standards world processes and strictures. This is already seen in the attempts by the mobile industry to embrace/subsume WiFi, where the other stakeholders are effectively excluded from many of the technical discussions. (To Huawei's credit, the 5G Munich event included BMW and a number of other non-traditional participants, although none obviously from the web world).

In a nutshell, as we go towards 2020 and 5G, and as mobile technology becomes more pervasive and important, it is critical that we make sure, upfront, that other voices are heard, and that we don't find the standards process just steam-rollering its way to perpetuating the past.

Regulators and governments should inspect the underlying assumptions - for example whether 5G is always going to be a "service" or whether it can also be "owned", or provided as a utility or amenity.

There should be nothing in the technology to preclude this. Elements like SIM cards & IMS cores can still remain - but they should be entirely optional. The radio technology needs to be decoupled from transmission, from the core, from the service layer, and from software - ideally with open APIs throughout. Equally, there should be no assumption that 5G is to be used just for Internet access - it should be neutral to the "back end" network infrastructure and service domain as well.

5G should be open and exploitable by satellite, drone, balloon and device-to-device innovators, as well as traditional base-station infrastructure providers.

In brief, telecoms is becoming too important to leave to the telcos and their vendors. Government needs to exert a heavier hand to make sure 5G standards are not just stitch-ups, excluding newcomers that could prove to be true sources of innovation and value. It needs to ensure that tempting investments and lobbying prowess from the incumbent cellular world do not skew the playing field. And other parties - from consumer electronics to property, to vehicle manufacture, to app developers, to defence - need to get involved NOW, to ensure that 5G pre-research includes them and reflects their needs upfront.