
Tuesday, April 29, 2014

Net Neutrality - getting the right compromise: Focus on innovation!

I'm currently finishing a research report on new and "non-neutral" mobile broadband business models. When I started writing it, I hadn't realised that it would coincide with such a period of industry turmoil, with the EU Connected Continent legislation going through the Parliament, and the US FCC reworking its rejected Net Neutrality rules and suggesting controversial alternatives. Add in various other national legislative initiatives and the recent Internet Governance shindig in Brazil, and it's clear that a lot is going on - it's the right time to look afresh at this area.

A core part of the debate is around so-called "specialised services", particularly involving either prioritised "fast lanes", or perhaps just 3rd-party payments for data traffic instead of the end-user. (I'm also looking at other flavours of non-neutrality such as zero-rating and application-based charging, but they're discussions for another day).

While I've got my own personal preferences and beliefs, I'm wary of confirmation bias, and I'm looking at the broadband/Internet industry through the lens of "what compromises are right for the telecoms industry, consumers, the content/developer community & the broader economy/society?".
In other words, there needs to be:
  • Clear benefits for end-users (consumers & businesses) in terms of speed & reliability of both Internet connections and other non-Internet broadband use-cases (IPTV, VPNs etc). Users must be able to pick and choose any applications they want, and in the case of public Internet-based services they should continue to meet the same (variable, but generally good) performance levels that are common today. It is also important that users are offered ever-improving networks (however measured, but typically peak/average throughput) and a rich stream of new and innovative content, apps and services.
  • Continued innovation in apps and investment in content by various media, web and software players. This is covered below - but a cornerstone here is that developers/content companies should not need to have any interaction or commercial relationship with access providers at the user-end. They might choose to do so, but as a general case they should have reasonable expectation that the Internet "just works" end-to-end for most uses, as it does today, without extra friction. This is what has made the Internet so successful - you don't need ISPs' permission or engagement to build a "killer app" or website; there are no gatekeepers, although you might want catalysts or accelerators.
  • Enough incentive for telecom operators to continue to build and run broadband networks, and offer Internet access as a (often the) primary application that exploits them. This applies to both fixed and mobile operators. Operators may also choose to offer apps, services and content, either standalone or as bundles, together with connectivity. These may involve innovation by telcos directly, or may be reliant on others' innovation. There's a further issue around telephony, which needs continuity and reliability, despite its inevitable & imminent decline in importance and value. There is an open question about levels of consolidation and infrastructure competition.
  • Continued economic and societal benefits from the Internet and (where appropriate) other non-Internet services. This includes fostering national software and content innovators, maintaining critical infrastructure/services, some forms of universal access, ensuring fair competition, enabling innovative "big projects" (learning, energy, transport), improving citizen engagement & democracy, and earning appropriate levels of tax.
Clearly trying to "optimise" across all those separate constituencies is a tall order. But the astute will have recognised that the word "innovation" appears in all four buckets.

(There are also other constituencies, such as technology vendors, industry analysts/consultants and lawyers/lobbyists, with their own self-serving needs, but I'm excluding those here.)

The interesting thing is that all sides in the current Net Neutrality furore claim that their position is essential for innovation. Internet advocates point to the "generativity" of the last couple of decades in creating new services and applications, and argue that "permissionless innovation" by developers must be maintained.

Telcos believe that allowing managed "specialised services" on networks will encourage additional innovation, especially for demanding applications.

The counter-arguments tend to focus on potential competitive threats, eg large media companies hogging the expensive "fast lane" and squeezing out lesser-resourced rivals and startups, or telcos feather-bedding their in-house services at the expense of Internet-based alternatives.

There is also a sense that some telcos just want to extract rents/taxes from existing services, by acting as gatekeepers and enforcing monopolies on end-user connections. (Disruptive Analysis does not buy the argument that new pricing models are innovative in and of themselves. Interesting and possibly profitable yes, innovative in the sense of "creating new stuff", no).

It seems clear that the debate between "Neutrality = Innovation" and "Non-Neutrality = Innovation" is not going to be easily argued and won conclusively.

I've been reading a couple of interesting books recently. One is Antifragile, about things which gain from disorder and randomness. It has lots of applications - and misapplications - across telecoms, the Internet and networks in general. I've been arguing with Martin Geddes on Twitter about this a lot recently. The other is The Why Axis, which looks at doing real-world experiments to determine cause-and-effect, incentives to change behaviour and so on. It's more about individuals, but some of the ideas apply to businesses as well.

I'm not certain precisely how we can test the question "does neutrality generate more innovation than non-neutrality?", but it strikes me that any practical way of testing it would be incredibly valuable.

(And yes, we'd have a separate argument about the metrics and measurement, but to a first qualitative approximation I'd say "how much cool new stuff emerges, and how fast?" isn't a bad start).

I think I've worked out a way to "make a silk purse out of a sow's ear", on the thorny issue of "specialised services", aka "fast lanes".

The potential innovation upside of such managed connections is tempered by the downside of anti-competitive behaviour, or unfair rent-seeking.

While I'm a huge believer in market economics, a lot of areas around broadband either lack competition (access networks), or move too slowly at regulatory/legal levels to prevent inordinate damage. The usual suggestion of "let competition sort it out, with easier switching for customers if dissatisfied" is too slow. (It also doesn't help protect the developer/app constituency). We all know how fast "web speed" is - and how fast value is created by innovation, or lost by delay. Most startups - or established Internet companies - have neither the time nor money for protracted and arcane fights against telcos. 

In Europe, the original draft laws did not go far enough to define or constrain specialised services, and politicians saw it as a broad get-out clause for telcos, with too many grey areas. They voted for an amendment which tried to tighten the definition:

“Specialised service” means an electronic communications service optimised for specific content, applications or services, or a combination thereof, provided over logically distinct capacity, relying on strict admission control, offering functionality requiring enhanced quality from end to end, and that is not marketed or usable as a substitute for internet access service.

I actually think that's a bit clunky, and may have some unintentional side-effects. I'm especially unclear about exactly why the term "admission control" is so important here. Hopefully it will get explained and clarified in a subsequent draft.

The US is currently going through the same loop, with the FCC coming out with hugely controversial proposals about similarly allowing "specialised services" to run alongside a supposedly "neutral Internet". Unsurprisingly, this has induced howls of rage from the pro-neutrality lobby, and it is unclear how far it will get through the rest of the US regulatory process, vs. other ideas about redefining Internet access as a "common carrier" platform.

But I have an idea.

Let's kill two birds with one stone.

Regulators could allow specialised services, but only for ones which are not also available on the "open Internet". In other words, specialised services should actually be "special", and not just chunks of the existing Internet sold at a higher price.

This would have a number of effects:

  • It would foster true innovation by companies or individuals that have eschewed the Internet because of needs for QoS guarantees - for example, home healthcare or maybe cloud-gaming
  • It would reduce the fear that big media/app companies could use a "fast lane" to squeeze out less-resourced rivals and startups suffering from the "dirt road" second tier
  • It would reduce the ability of telcos to subtly "force" content companies to pay for access connections that end-users have already paid for
  • It would give a clear set of services for telcos to retail (or develop in-house) without worrying that some of that activity would "leak" over to the public Internet as it gets faster
  • It allows things to be regulated separately if needed
  • It could come from established players launching entirely new services, if they wanted (eg 4K video from Netflix, or a home cleaning & security robot service from Google), as long as they were not also available via robot.google.com
  • It gives telcos an incentive to both sell specialised services (for new revenue) and continue investing in "vanilla" Internet provision
  • It will encourage telcos to invest R&D $$ into their own specialised services
  • It doesn't risk "breaking the Internet model" through stifling the normal process of developer innovation - it just adds another channel for creative thought & product release. 
  • It focuses back on telco core strengths of availability & uptime, as it can be dimensioned differently
  • It could prove in a measurable fashion whether QoS capabilities do meet a real need, and provide valuable input for policymakers
  • It allows vendors to sell lots of new policy kit and charging solutions
Yes, I know this isn't going to be "quite that easy", but it strikes me as a reasonable compromise that gets around a lot of the objections on both sides. It gives a clear dividing line between the Internet and the Ain'ternet (You heard that one here first). It could help create a broad range of new broadband propositions that are risky or insecure to do over the public Internet (telemedicine, smart homes & the like).

And it means that if we do get proof that QoS-managed/specialised-services connections generate innovation and "cool new stuff", then future law-making will be much more evidence-based, rather than just hot air from lobbyists.


Unknown said...

Outernet, surely, not Ain'ternet? It's not In, it's Out...

Anonymous said...

Your faith in regulation is quaint

InfoStack said...

Dean, the IP stack lacks price signals and incentives. Today it is private interwork, ad exchanges and telco transport/interconnect, that scales the internet infrastructure.

Yet the internet itself cannot upgrade layers 3 or 4, or coordinate investment in layers 1-2 at the edge.

What's needed is balanced settlements as well as open or equal access somewhere in layers 1-2 to provide for active multi-modal competition.

Dean Bubley said...

Sorry, I have no idea what that actually means in practice, or why it is important or likely to happen.

Please explain in more straightforward terms. I am not a network engineer.

Dean Bubley said...

More specifically, what *exactly* do you mean by "balanced settlements"?

Who pays what, to whom, under what conditions, for what service, how?

How does that impact a typical content company (let's say Spotify) or a consumer or business app provider?

Unknown said...

Isn't this distinction between radically new services and "selling chunks of the Internet at a higher price" splitting hairs? After all, 4K could be said to be TV at a higher price in the case of most viewing screens, since the human eye can barely tell the difference from existing HD, especially so-called "full HD" at 1080p resolution.

Dean Bubley said...

Philip - fair point on the TV example, and it's probably the thing I'm least comfortable with.

Conversely, I could certainly envisage a medical remote-monitoring service with hard QoS requirements unsuitable for the public Internet. Nobody is going to use pacemaker.com, so there's clear blue water there.

That's my point - if it's called "specialised" it has to be special & innovative in some way - and probably compete successfully with Internet-based equivalents to prove it.

Stefan H said...

Hey Dean,

I think your general analysis is spot on.

But I wanted to comment on your point of only allowing specialized services (SS) that are not also available on the internet.

We actually had that on the table in the Committee (ITRE) stage, when S&D in their CA added one single line to the definition: "its application layer is not functionally identical to services and applications available over the public internet access service".

That did not get a majority because I think the philosophy was that would have been a disproportionate limiting of the scope of SS in terms of the actual application or content. The idea was that SS can be allowed, but only if we kill any potential for abuse as described in the BEREC 2012 QoS report in two scenarios: ISPs degrading IAS as a whole, or individual applications.

So S&D was actually trying to address the latter, and I think everyone agreed they had a point, but how they tackled it could be better. Therefore in the plenary stage the end result was that this idea was moved from art 2.15 to 23.2, where it now states: "Providers of internet access to end-users shall not discriminate between functionally equivalent services and applications."

I'm still a bit torn if that was the right thing in the end, or if the S&D solution would have been better.

Happy to elaborate and discuss further.