
Friday, May 20, 2011

Thoughts from the LTE World Summit


Earlier this week, I spent two days at the LTE World Summit in Amsterdam. More than 2000 great & good members of the telecom industry, including a ton of operators and most of the major vendors. Multiple streams of presentations, a decent-sized exhibition show floor and “big conference” production rather than a small meeting room in a hotel. I was hosting an analyst roundtable on Voice, VoLTE and the Future of Communications, and also giving a 30-minute presentation on similar topics.

Those of you following my Twitter stream (@disruptivedean) will have seen a fair amount of ongoing commentary, but I thought a few issues were worth drilling into here. I'll be writing a separate post about the Future of Voice, and my upcoming workshops with Martin Geddes, so I won't overdo the VoLTE analysis here.

Overall though, I’ve come away rather pessimistic, despite all the bombastic hyperbole I’ve heard. I’m hearing the same old stories I heard at last year's event – and a lot of them are getting worse rather than better. Loads of hoary old clichés about peak rates and “exponential” data growth. How flat-rate plans don’t cut it, long after most of them have been phased out anyway. A ton of unrealistic vendor hype about application-specific policy and charging “business models”.

The big story was also much the same as last year, albeit stated a bit more loudly rather than just implied – there are too many spectrum bands for LTE. At least 8 “core” bands, and another 10-20 also being deployed or considered. Europe will probably get by with three main ones – 800, 1800 and 2600MHz. With perhaps a little bit of 2100. And some use of bits of TDD spectrum knocking around. Then there’s a variety of US bands, Japanese-specific ones, Chinese ones and a variety waiting in the wings to get approved.

That is *much* worse than 3G, which had one core band for much of the world (2100MHz) and still took a long time to get either coverage or good handset performance.

Bottom line is that LTE spectrum fragmentation is not going to go away. This has a number of implications – firstly, roaming is going to be a real pain when moving “off-net” beyond a single operator’s OpCos, or between regions of the world. In all likelihood, HSPA will continue to be used for roaming in a lot of cases. Secondly, handset vendors will likely have to create either regional versions of handset hardware platforms, or make “world phones” that suffer from coverage issues in some markets. Either way, scale economies will be lower, prices higher, testing more problematic and time-to-market longer.

It will not be possible, for example, to have one iPhone variant that supports 3 European FDD bands, Verizon and AT&T 700MHz, the Chinese LTE-TDD variant, something for Japan, and perhaps another US band like AWS or LightSquared. I reckon that Apple will need to create three, possibly four distinct versions of future LTE iPhones.
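As a rough illustration of the variant problem (the band groupings below are invented for the example, not actual Apple plans), here's a simple sketch of checking whether any single handset variant could span a given set of markets:

```python
# Illustrative only: hypothetical regional variants of an LTE handset,
# each supporting a handful of the 20+ candidate LTE bands (MHz).
VARIANTS = {
    "EU":   {800, 1800, 2600},   # the three main European FDD bands
    "US-A": {700, 1700},         # 700MHz plus AWS (1700/2100)
    "ASIA": {1800, 2100, 2600},  # placeholder mix of Asian FDD bands
}

def variants_covering(market_bands, variants=VARIANTS):
    """Return the variants supporting at least one band used in a market."""
    return sorted(v for v, bands in variants.items() if bands & market_bands)

def single_world_phone_possible(markets, variants=VARIANTS):
    """Could one variant cover every market in the list? Usually not."""
    return any(all(bands & m for m in markets) for bands in variants.values())

# Three markets on disjoint bands: no single variant reaches all of them.
markets = [{800, 2600}, {700}, {2100}]
print(single_world_phone_possible(markets))  # False
```

The point of the toy model: every extra band combination either forces a new hardware spin or leaves a market uncovered, which is exactly the scale-economics squeeze described above.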

Now Apple can afford to do that - it only has a single model introduced at a time, it sells in high volumes per device model/version and makes a huge margin on each. In other words, even if each "spin" costs an extra $100m to develop, it's still a drop in the ocean. If it creates three versions and sells 10m of each, it will probably make $2-3bn gross margin on each variant, so it can "wear" the extra hardware development and test cost quite easily.

But it would get very painful for lower-volume devices, or manufacturers that have broad ranges of devices. This in turn means it's probably going to be painful for operators with unusual spectrum bands (eg LightSquared) to get a decent range of decent handsets. 

In Amsterdam, we heard repeated pleading from operators - even DoCoMo - essentially saying "Support *my* band! Please! It's really good, and we can get economies of scale & support from all the vendors!". 

There are going to be some disappointed players left standing in this game of musical frequency chairs. And everyone else is likely to feel the knock-on effects of component suppliers' hesitation and uncertainty. Some operators will likely hold off on LTE decisions until the spectrum situation becomes a bit clearer.

One other option for LTE that got a little exposure - but was obviously still highly contentious - was that of wholesale-only shared networks like Yota (and LightSquared and a couple of others). I think that although that model makes sense in terms of spectrum usage efficiency, it also poses a risk for incumbent operators that will start to lose control over their core business enabler (the network) and may face a future where all differentiation comes in terms of the (often mythical, and always competitive) "services" layer.

I'll be writing more about the threat from "under the floor" players in the coming months - and why shared/outsourced/structurally-separate mobile infrastructure plays are both inevitable and highly disruptive. I'll be at the network-sharing conference in London next week as well.

One interesting angle on voice and VoLTE that is starting to bubble up - and which I've been suggesting / advocating for some time - is that of dual-radio phones. We already see dual-radio CDMA/LTE phones for Verizon and Metro PCS, which use CDMA for voice and LTE for data. This has a distinct advantage over the proposed "Circuit Switched Fallback" standard, in that an incoming voice call doesn't switch off the LTE data channel. I'm expecting to see the same approach appear for GSM/LTE dual-radio phones, but that is much more complex as (unlike CDMA) both radios will probably need separate SIM cards, or two IMSIs on one card. At least one major vendor was openly discussing this approach - but at the moment the lack of standards about handling this type of device is a concern for operators.

Like VoLGA before it, dual-radio "velcro" GSM/LTE is a solution that *works* conceptually very well, but it will be interesting to see if the politics of the standards world - and some entrenched interests wanting to ensure that nothing detracts from VoLTE/IMS's uncontested anointment as top solution - get in its way. My view is that this should be the main backup plan or straight replacement for VoLTE: as telephony revenues start to fall, why would many operators want to invest in a new core network and applications when their existing GSM telephony works perfectly?

In my view, operators should invest their future voice/telephony budget in creating new voice products and platforms - and do the absolute minimum necessary to get decent "old school" telephony working on LTE smartphones. I think the Velcro (yes I know it's a trademark) approach could free the operators to concentrate on creating new and possibly more valuable voice and VoIP applications - before Skype/Microsoft does it for them.

The last comment in this post is about WiFi and LTE. I've had a few conversations recently about the rising star of WiFi usage for offload, onload, roaming and other operator use cases. I think that all of these are extremely important.... but I also sense a dangerous level of groupthink around the "telco-isation" of WiFi. There's a host of new standards and solutions that make bolting WiFi onto 3G/4G networks more "seamless" or more controllable. 

Those of you with long memories will know that I have an intense suspicion of the word "seamless". It represented all that was wrong with the ill-fated UMA technology. More than four years ago, I wrote what I thought was the requiem of seamlessness. But it's back, it seems. In a nutshell - seams are important. They're boundaries. Sometimes I want to know when I reach a boundary, sometimes I don't. Things change at boundaries - speeds, policies, price, ownership, security, latency and so on. In particular with WiFi, it is absolutely critical to enable a good user experience when choosing between "operator WiFi" and "private WiFi".
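To make the point concrete, here's a small sketch (names and attributes are purely illustrative) of a connection manager that enumerates what actually changes at a seam, rather than papering over it:

```python
# Sketch of a connection manager that treats network boundaries as
# first-class events instead of hiding them. All names are illustrative.
from dataclasses import dataclass

@dataclass
class Network:
    name: str
    kind: str              # "cellular", "operator_wifi", "private_wifi"
    metered: bool
    operator_managed: bool

def seam_changes(old: Network, new: Network):
    """List what changes when crossing from old to new -- exactly the
    attributes a 'seamless' handover would hide from the user."""
    changes = []
    if old.metered != new.metered:
        changes.append("price")
    if old.operator_managed != new.operator_managed:
        changes.append("policy/ownership")
    if old.kind != new.kind:
        changes.append("security/latency profile")
    return changes

home = Network("HomeAP", "private_wifi", metered=False, operator_managed=False)
cell = Network("MacroCell", "cellular", metered=True, operator_managed=True)
print(seam_changes(cell, home))
# ['price', 'policy/ownership', 'security/latency profile']
```

The design choice being argued for: when that list is non-empty, the user should at least have the option of being told, or of expressing a preference, before the handover happens.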

I see far too few advocates of the "private WiFi" use cases - there seems to be an assumption that WiFi access on smartphones will default to being "service"-mode. I think that is a deeply flawed belief, and unless addressed, it will come back to haunt some of the new approaches to offload or operator-provisioned WiFi. More to come in later posts, conference presentations and so forth.

A few quick bullet points of "other" interesting items:
  • Apparently, TeliaSonera intends to charge extra for VoIP on its LTE network. Good luck with that. Maybe you can start by providing us with a clear legal definition of "voice"? Downloading a spoken poem? Audio telepresence? Skype video with "mute" switched on? Accessing voicemail? Encrypted speech inside HTML streams? If you're a Swedish-speaking telecoms lawyer, you're going to make a lot of money over the next few years....
  • Verizon was being very coy about its rollout and recent outage. Its conference speaker was not even from Verizon Wireless but from the EMEA arm of the company which is mostly the former MCI/WorldCom enterprise services division. Unsurprisingly, probing questions about the progress of VoLTE testing were not especially illuminating.
  • Apparently, SMS over the SGs interface *is* working. Just that vendors haven't bothered to tell anyone about it as it's not considered sexy. Let's see how the full SMS-over-LTE experience works on future phones though.
  • It was good to hear an anecdote from T-Mobile Netherlands that the biggest problem isn't "tonnage" of data traffic, but simultaneous signalling from lots of smartphones and apps in the same place. More interesting still was the massive explosion of the SMS-replacing "WhatsApp" service in Holland, which apparently got to 70% penetration (of smartphones I assume) in just 3 months. Hence KPN's profit warning a couple of weeks ago. (It's worth noting that the Netherlands is slightly unusual when it comes to messaging, as it's historically been a low-Facebook-use country, instead using its own local social network Hyves)
There were certainly more nuances I picked up about LTE, but the overwhelming sense was that, in Europe at least, there is "no hurry" to push it to the massmarket. That's a big contrast to the US, where a 4G marketing frenzy is taking place, dragging network deployment in its wake.


Monday, May 16, 2011

Telcos paying OTT players - balance of payments will look ugly

Through my work with Telco 2.0, I spend quite a lot of time thinking about how telcos can get "two-sided business models" (2SBM) to work. This involves deriving revenues from companies "upstream" of the users themselves, who pay to use the operator as a "platform" for doing business with the users more effectively.

An easy example of 2SBM is advertising, with the telco facilitating a brand by helping it market to the telco's (paying) users. Google does much the same, but only monetises the upstream (ads) and not the downstream (users searching). Another example of existing telco 2SBM is "bill on behalf of" - for example collecting payment for apps through carrier billing, and taking a rev-share from the developer.

Harder examples of 2SBM are where the operator wants to act as an identity/authentication provider, enabling various network-based APIs like location, or where it wants to provide some form of QoS or "sliced and diced" bandwidth for fixed or mobile broadband. Notwithstanding the ongoing wrangling about Net Neutrality, operators would dearly love to charge Internet companies such as Google or Facebook or Netflix for using "their pipes". As I've written before, simply acting as a bottleneck or tollgate is improbable - for any chance of getting "cold hard cash" for broadband 2SBM, the operators need to help the so-called OTT players do something extra, which best-efforts connectivity cannot do.

This is proving tricky, because the Internet companies have proven quite adept at making the most of ordinary Internet access connections, while the operators have found it hard to deliver "provable" enhanced QoS, especially in mobile, even where the law permits.

So at present, the amount of revenue flowing to operators from YouTube, Hulu, eBay and so on is vanishingly small, once you exclude basic connectivity from their servers - and perhaps some newer trends in peering / transit for those generating the greatest volume of video. Many of these companies have developed their own in-house alternatives to operator APIs (location has been the easiest, but others such as messaging and identity are evolving too).

So despite some ridiculous, sycophantic and wishful-thinking "telcowash" (4MB PDF) from consultants such as A.T. Kearney, the chances of deriving extra revenues from Internet companies, by just sitting in the middle of the network with a couple of DPI and optimisation boxes, seem as slim today as they did three years ago.

Instead, there's a slow trickle of cash going *the other direction*. Operators are paying OTT companies for their unique applications and capabilities. DoCoMo has just cut a deal with Twitter to embed its apps into featurephones, and use its "firehose" feed for location-based services. Verizon has partnered with Skype, as has H3G - something I feel might evolve much further now, given the Microsoft acquisition. Facebook is reportedly charging for bulk access to its own APIs - which makes those RCS visions of handset addressbooks ingesting profile pictures and statuses look unlikely. And then we've had the acquisitions - France Telecom buying a major stake in local YouTube rival DailyMotion, Telefonica buying Jajah and so on.

And then of course there's the huge amount that operators spend on Google Advertising.

In other words, despite all the rhetoric, it seems like the OTT players are charging the telcos, not vice versa. The reason is simple - the OTT players are typically selling innovation and new value *first*, not attempting to monetise control. Enhanced Twitter will add value to DoCoMo's customers. Google's clever advertising and analytics help operators sell more stuff.

When the operators can demonstrate that their 2SBM offers add value (and revenue) to upstream players, especially on broadband, they are likely to buy them. But they are unlikely to pay a "control point tax" without upside.

How many operators employ marketing staff to show that they can help Facebook, Google et al make more money if they use the operator APIs or QoS mechanisms?

Until that point, the balance of payments between telcos and OTTs will stay in the red.

Wednesday, May 11, 2011

Another reason why app / service based pricing for mobile broadband will fail

Imagine, if you will, that you are the CEO of a mobile operator that's just launched a new tiered-pricing model for mobile data on laptops and smartphones. It's based on differential pricing and QoS for different data/Internet applications. You've bought a ton of DPI and policy boxes to detect traffic and enforce policies, and you've proudly announced a new "menu" of pricings.

$10 per month = email, IM, basic web browsing
$15 per month = adds in social networking & mapping
$25 per month = adds in low-quality video & selected cloud services & basic VoIP
$35 per month = adds in high-quality video & high-quality VoIP

You've nicely defined all the different web services into the different buckets, and set up the T's and C's and the policy boxes appropriately.
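As a minimal sketch of how brittle such a menu is (the app names and tier boundaries below are invented for illustration), consider the kind of lookup table the policy boxes would end up enforcing:

```python
# Hypothetical app-to-tier policy table of the kind a DPI/policy box enforces.
# Keys are detected applications; values are the minimum monthly fee ($).
APP_TIER = {
    "email": 10, "im": 10, "web": 10,
    "social_networking": 15, "maps": 15,
    "skype_voice": 25, "video_sd": 25, "cloud_basic": 25,
    "skype_video": 35, "video_hd": 35,
}

def allowed(app, monthly_fee):
    """An app is allowed if the user's tier covers it.
    Unknown apps default to blocked -- the source of 'false negatives'."""
    return monthly_fee >= APP_TIER.get(app, float("inf"))

# The day after an acquisition, a new mashed-up service appears that the
# table has never seen -- and the policy silently blocks a paying customer.
print(allowed("skype_office_conferencing", 35))  # False: not in the table
```

Every product launch, rebrand, or merger anywhere on the Internet forces someone to re-audit that table - which is the business-process headache the next paragraphs describe.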

Now this morning, you've woken up to find that Microsoft has bought Skype. So now Microsoft has extra IM, VoIP and video-calling, as well as its own way of doing WiFi offload via the Skype/Boingo relationship. There's likely to be a whole host of mashed-up applications, launched over the next couple of years - some fixed, some mobile, some consumer, some business, some free, some paid.

So how exactly does that fit with the carefully-crafted pricing model and network policy setup? What's the business process for evaluating what has to change? What are the technical implications? What are the legal implications? How does it fit with partnering deals? How will users be informed? Does Microsoft have VPN services? What happens to stuff Microsoft / Skype does in the cloud? Does everything look the same on different devices & OS's? And how fast can any updates be made?

The list of headaches is endless. The scope for messing up is huge. And it's all highly dynamic & will change continually.

For me, this is yet another example of why app-specific pricing & policy is doomed to be limited to a few niches (eg anti-virus, throttling P2P uplink). Never mind the Net Neutrality legal debate - it is practical problems like this that make service-specific tariffs and so-called "personalisation" service menus irrelevant at best, and outright damaging at worst.

Tuesday, May 10, 2011

Microsoft + Skype + Nokia = NextGen 4G Mobile VoIP & messaging done properly

NOTE: The Microsoft / Skype deal is not yet confirmed, as I write this.

But if it goes ahead "as leaked" this is another major step for Microsoft's aggressive pursuit of Google and Apple, which also may have a secondary effect: further pain for the telcos and especially mobile IMS and its flag-waving applications VoLTE and RCS.

[Plug: I'm running a series of upcoming "Future of Voice" Masterclasses if you want to understand more about the implications & rationale for this. Contact details below to learn more]

I'm pretty sure that a lot of the comment and analysis today will be around whether Microsoft can execute better than eBay, why the price is so high, whether this is "all about Google" and whether Skype would have been better off living inside Facebook.

For me, this actually looks like a near-perfect fit for Skype. The other candidates I had in mind were Vodafone, AT&T, Cisco and Ericsson. No, not the most intuitive choices indeed - but companies with deep pockets, an interest in innovative services models and a willingness to pick and choose among standards vs. proprietary solutions where it suits them.

Some comments that help to explain my conclusions:

  • A substantial part of Skype's current user base is from PCs. Although mobile devices get all the glory at the moment, Skype epitomises what's best about desktop VoIP. More importantly, a laptop is probably the perfect device for many video-calling use cases, as the keyboard+hinge and upright camera is much better ergonomically than a propped-up tablet or mobile phone. This would have been lost in a purely handset-focused company (eg Nokia in the past, RIM or perhaps Qualcomm). This may have ruled out Vodafone too, I guess.
  • Skype gets widely-used in business - often only semi-officially, but it's a critical tool for many travellers, people doing conference calls and so forth. It is also increasingly working on corporate-grade solutions. This would have been lost inside a Facebook or similar company.
  • I think that some of the operators that are less aggressive about deploying LTE - especially for smartphones - are doing so partly because of doubts about getting VoIP to work properly, to a standard comparable with GSM telephony today. Skype has a significant chance of being the only massmarket VoIP that has a big user base, and works well on LTE, by 2014. The "option value" for that is potentially huge. Hence AT&T and Vodafone on my "other possible acquirers" list - I also would have added Hutchison 3 and maybe Telenor, but the price is too high.
  • Skype is class-leading in terms of understanding and helping to manage QoE (quality of experience) for IP communications *from the user device*. It doesn't control QoS (in the middle of the network), but involves the user and the device hardware to make the best of what's available, and alerts the user not just to problems "in the middle", but also to other things like a microphone not working properly, temporarily poor WiFi or 3G reception, or a device processor running too slowly. Both Cisco and Ericsson urgently need device-side expertise to really understand "end to end" performance, but both know how hard it is to get across numerous classes and brands of device. Skype has that knowledge. They have missed out today - but I suspect that Cisco's investors would have been wary, and Ericsson probably would worry too much about annoying its telco customers. It is also why it would have been a poor fit with Apple, which is much less platform-agnostic than Microsoft, especially in mobile.
  • Skype is leading the way on personal video communications. I don't use it personally, but many users do - the % of minutes that are video-based is astonishing. I remember speaking to a friend recently who didn't know Skype could work in voice-only mode. He thought it was JUST a video comms tool. It just works, and is cross-platform unlike FaceTime.
  • In the massmarket, Skype is probably the only platform that has (by skill or luck) worked out a way to get users to adopt "permission-based" voice communications. Many Skype voice or video calls are pre-arranged, or "escalate" from an IM chat and presence in a way that telcos have long dreamed about. Its desktop-first strategy (and timing) has enabled it to do what IMS should have done, had it been universally available and using a Freemium model in 2005. As such, this would have been a near-perfect (if expensive) Telco-OTT proposition - and would also help craft a voice experience that is much more than "just telephony", but fits with the Future of Voice vision I wrote about recently for VisionMobile.
Would Skype have fitted well inside Google? It's difficult to tell. Google doesn't have much heritage of making and integrating large acquisitions, while Microsoft is "not bad", with some successes (eg Hotmail, Great Plains) and some failures (eg Danger). More importantly, Google has its own voice/VoIP initiatives, and internal politics would probably have been horrible with a Skype acquisition.

There are many other issues to explore around the Microsoft/Skype deal - especially the missed opportunity for one of the telecom operator community to "escape the herd" and lead the emerging Telco-OTT space with a head start.

But it's worth stepping back and focusing here on the impact on IMS, VoLTE and RCS. I still take the view that VoLTE is "necessary but not sufficient" - it's very late in coming, but there definitely needs to be a "simple circuit telephony replacement" technology for 3G/4G networks. GSMA and its partners are heavily focused on getting VoLTE working, especially focusing on interoperability and familiar themes like roaming. However, there also needs to be a focus on two other things that I reckon are being overlooked:

  • There needs to be a view about the Future of Voice angle. If VoLTE had started as VoHSPA 5 years ago, it could have just been Telephony v1.1 and that would have been fine. But the timing now is wrong - LTE is a key transition point, further catalysed by the smartphone explosion. In the next few years, voice *will* change irrevocably, expanding well beyond mere telephony to a huge diversity of applications and use cases. If VoLTE gets delayed, it will have missed its window of opportunity - and I think that's a significant risk.
  • More practically, I think that VoLTE will have to contend with a ton of real-world horribleness about getting VoIP to work while mobile and on cellphones. RF issues, battery issues, echo, poor acoustics, sound glitches, codec choices, packet-loss concealment and so forth. QoS only gets you so far - and then you need software and proper audio expertise to fix what's left. The network companies and standards bodies in VoLTE aren't really focused on microphones and sound volume levels - they're hoping that the handset companies will fix all that. Have a look back at the history of fixed-line VoIP to see how "easy" all that is to get right, even on relatively predictable home broadband or corporate LANs. Skype has been doing mobile VoIP for many years - and while it's not perfect, it's got a huge head-start.
In other words, Microsoft is buying an $8bn option on the future of the mobile telephony industry. If we get to 2014 and VoLTE isn't working as well as it should - Microsoft (and its partners like Nokia) will have both an OTT option and a "white label" proposition for operators. Also, don't forget that Microsoft also sells IP-PBX functionality in its Lync / OCS product - it doesn't think that all call control should reside in the operator domain.

As for RCS.... well I think this is just another nail in an already well-sealed coffin. Microsoft has never really bothered to grasp IMS ("Oh, that's just SIP isn't it?" was one response I got in an interview at MWC a few years ago) and it's now looking even more of a poor fit when combined with MSN, Live, corporate UC products and so forth. It seems likely that none of the big smartphone OS providers - Apple, Google, RIM or Microsoft - will be particularly well-disposed to RCS. Sure, there will be various third-party add-ons for Android and perhaps other platforms, but it's unlikely to be a key priority for Windows Phone now.

I'll try and update this post or add another later on, after some reflection, but this should be enough to catalyse some discussion.

Also, now seems like a good time to highlight some upcoming events on "The Future of Voice" I'm running together with Martin Geddes. These will be small-group collaborative Masterclasses, drilling into the use cases, technologies, applications and user behaviour for voice communications, as we pass the point of "peak telephony" and move on to other modes of B2B, B2C and C2C interaction. The first one is in Santa Clara on June 30th, followed by London on July 14th. Martin and I will also be conducting customised private workshops for specific clients. Email information AT disruptive-analysis DOT com for details.

Monday, May 09, 2011

My Top 10 blog posts from the past few years

I don't check Google Analytics that often for this blog - most of the time I write what I want, and don't stress about the levels of readership beyond a link on Twitter or two. I'm not selling advertising space, and much of my "target audience" seems to find the blog anyway, so why bother with all that SEO nonsense?

But I thought it was interesting to look at the most popular posts I'd written and see if there's a pattern somewhere. Over the past 3 years or so, these are the ones that come out on top:

  1. A post about sharing 3G dongle modems via a docking-station and WiFi. Not sure why this was so popular, except for the fact I coined the term "Dongle Dock". They never really took off as a concept, as MiFi-style products make for a simpler solution of 3G over WiFi.
  2. My (in)famous post re-writing the Monty Python dead parrot sketch - in which the deceased Mobile IMS has been nailed to its LTE perch by the 3GPP/GSMA pet store owner. I know this one got very widely circulated - I still get comments about it today - so no surprises there.
  3. A complaint about poor customer service I got from Carphone Warehouse. "Personal" rants tend to get seen by large numbers of similarly-dissatisfied people looking for others who share their pain. In fact, this would probably be at #1 as it was written in 2006, but I only started tracking hits this way in 2008.
  4. Back to IMS, LTE and VoIP, with a post discussing the original announcement of GSMA OneVoice (now VoLTE)
  5. This was a very short post from 2006, talking about how to integrate SMS into IP and IMS. It's notable that I still haven't seen any live demos (or even major announcements) about getting SMS over SGs working in LTE.
  6. An ongoing theme of mine is about the over-hype of NFC and mobile payments. In particular, I expect NFC is going to be about interactions not transactions. One of my first & best-read posts on the theme was a couple of years ago. A more recent analysis is here
  7. LTE voice once again - and specifically looking at the ill-fated VoLGA. I still think that it makes sense - and if VoLTE encounters the problems I anticipate, I wouldn't be surprised to see it get reincarnated somehow, perhaps as the basis for a Telco-OTT VoIP service on other telcos' LTE phones.
  8. Another post stemming from personal frustration - this time about Vodafone's egregious data roaming pricing strategy about a year ago. Who knows, maybe I contributed to their eventual decision about adoption of the £2/day plan.
  9. And *another* post on IMS, LTE, VoLGA and SMS. I hadn't fully realised how much traffic this discussion had got. For all those interested in my views, I'm hosting both a breakfast session on LTE Voice (Tues 17th at 8am), and an Analyst Spotlight on VoLTE and the Future of Voice (Weds 18th at 11am) at next week's LTE Summit in Amsterdam.
  10. And finally in the Top 10, my predictions for 2010, from the perspective of end-2009. Not too bad on the whole. But yes, I know I'm still complaining about Twitter despite using it for a year now. It's still awful, but unfortunately essential. I'm @disruptivedean.

Friday, May 06, 2011

Business model innovation in mobile broadband - the insurance model?

At the recent Telco 2.0 event in Palo Alto, I was on the panel discussing mobile broadband economics.

I had an idea there, on the spur of the moment, that I haven't had a chance to write up until now. It's still in "prototype" form and definitely not 100% practical straight away, but nevertheless represents the sort of lateral thinking I have yet to see in the mobile industry.

I pay my car insurance based on an annual premium payment. I phone around (or look online) for quotes, which typically ask me for my age, address, type of car, security I use, history of accidents or convictions, some evidence of history of my actual insurance usage (ie claims) and a bunch of other questions that help them categorise my risk level with some very complex software. Some specialist insurers target particular demographics, or have detailed underwriting expertise that allows them to provide custom quotes, taking into account unusual circumstances. I also get a discount if my previous year's driving didn't result in excess "usage" - ie a no-claims discount.

It got me thinking - why don't we price for mobile data in a similar way? A 37yo male living in central London with an iPhone 4, commuting during busy periods, with a history of video downloads & obsessive Facebook use might get quoted £500 a year for mobile data, while a 57yo female with a BlackBerry living in a rural area and working from home might get a quote of £200. And if someone "abuses" the service, the operator has the right to decline to quote them for a continuation of service next year - or raise the premium considerably - so there's an incentive to be sensible.

Now clearly, this would need a major change to IT and billing systems - as well as some interesting discussions with regulators and re-training of customer service. I'm certainly not saying it's easy. But leave that aside for a second - do you really believe that if the *insurance* industry (hardly the most dynamic group of companies....) can do something like this, then the telecoms industry couldn't as well?

The nice thing about it is that the actual metrics that the telco uses to estimate risk are hidden privately inside the system. It might be a measure of GB data "tonnage". It might bias against people who use lots of signalling-intensive applications. It might involve clever location-based algorithms. It might give discounts for people who have use of 2+ phones. It might discount people prepared to accept a higher "excess" (eg policy management downgrades during busy periods). There's an infinity of clever ways to tweak the system.
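A toy sketch of what such a quote engine might look like (all the weights, factor names and the no-claims mechanics below are invented for illustration):

```python
# Toy actuarial-style quote for mobile data, with invented weights.
BASE_PREMIUM = 200  # GBP/year

RISK_FACTORS = {
    "urban_commuter":  1.6,   # contributes to cell congestion at peak hours
    "heavy_video":     1.4,   # high "tonnage"
    "signalling_apps": 1.2,   # chatty apps (IM presence, push email)
    "multi_device":    0.9,   # discount: traffic spread across devices
    "offpeak_only":    0.8,   # accepts policy downgrades at busy periods
}

def quote(profile, no_claims_years=0):
    """Multiply the base premium by each applicable risk factor, then apply
    a no-claims-style discount of 5% per well-behaved year, capped at 25%."""
    premium = BASE_PREMIUM
    for factor in profile:
        premium *= RISK_FACTORS.get(factor, 1.0)
    discount = min(no_claims_years * 0.05, 0.25)
    return round(premium * (1 - discount), 2)

print(quote(["urban_commuter", "heavy_video"]))    # 448.0
print(quote(["offpeak_only"], no_claims_years=3))  # 136.0
```

Crucially, just as with insurance, the factor table stays private to the operator - the customer only sees a single personalised price, not the underlying congestion model.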

I'm sure that there are other industries whose pricing schemes might be borrowed as well - energy, airlines, hotels and so on. Once again, it's about getting rid of the notion that "subscriptions" - especially monthly-based - are the only way to bill or market for telecoms services.

There's lots of nonsense being talked at the moment about "personalisation" of mobile data - picking from a menu of apps and other such implausibilities. *This* is an example of true personalisation - a unique price and policy, just for you, calculated by examining your individual "risk" characteristics based on network cost and contribution to congestion.

Wednesday, May 04, 2011

Is KPN about to hit the Net Neutrality "suicide button" in the Netherlands?

According to this article on TelecomTV, the Dutch incumbent operator may be about to attempt the so-called "personalisation" approach to mobile broadband, charging "per-service" for Internet functions, in an effort to stem the rising use of 3rd-party applications that are substituting for SMS - the trend behind last week's surprise profit warning.

Disruptive Analysis' view is that most app-specific personalisation concepts for mobile data charging are deeply flawed, and will likely lead to churn, counter-measures and outright animosity from otherwise neutral Internet players, as well as probable regulatory intervention. They will also most likely have significant problems with "false positives" and "false negatives". In other words, they will make a bad problem even worse. I wrote a blog post detailing some of the complexities a few months back - here. Although details are still sketchy, it looks like KPN's management may have made a knee-jerk reaction to its poor figures in an attempt to reassure investors.

Although this is in essence a "Net Neutrality" issue, it's not really the principle at stake that concerns me here. It's more that this is a tactic that just won't work - and may quite possibly backfire to the extent it becomes suicidal.

To recap, KPN explained last week that its Dutch consumer wireless revenue shortfall in Q1 2011 was because:

"accelerated changing customer behavior became visible amongst smartphone users. New popular ‘apps’ on smartphones offer alternative ways of communication beyond traditional voice and SMS. The increased usage of these ‘apps’ lead to decreasing SMS and voice usage resulting in lower service revenues."

It further said that "Short term measures are taken to mitigate the impact on service revenues from these trends; these measures include personalized ARPU optimization and reduced discounts on data."

The accompanying presentation to the results shows the icons of Viber, WhatsApp and another service I'm not sure of (on slide #22). I suspect that KPN picked some relatively-minor examples rather than stir the pot by blaming Facebook or BlackBerry Messenger straight away. In other words, what many people have long, long expected has finally started to come true - SMS is starting to be eroded by alternative IP-based messaging firms. I described SMS as a "sitting duck" as long ago as November 2006.

Given that the writing has been on the wall for that long, you might reasonably have imagined that operators and their standards bodies would have thought about creating newer, better versions of messaging to compete. And that doesn't just mean relying on the pipe-dream of IMS RCS or old-style mobile IM - but actually creating something innovative that has the appeal of FB or BBM. Yet KPN invested only €82m in R&D in 2010 in total, against revenues of €13.4bn, much of which probably went on testing LTE and new fibre systems, developing back-office systems and so on. I'd be surprised if more than €10m went into trying to invent new services and applications, either solo or collaboratively. By way of comparison, one of the companies that has KPN scared has only just raised $8m - and so has presumably spent less than $5m so far.

But no.

KPN seems about to try to bill "per-application" for things that compete with SMS. Even leaving aside the inevitable use of email as a work-around, there are dozens of gotchas relating to mashups, VPN tunnels, or even hiding messages steganographically inside images. Will it block PCs from accessing parts of the web as well? What about prepaid data users? Roamers? MVNO subscribers?

I got upgraded to the new Facebook messaging platform the other day, which blends email, IM and FB messages. All inside an SSL tunnel to facebook.com. Packet-inspect that...

Now it's possible that KPN is going to try and do this in a more sensible fashion - perhaps zero-rating "friendly" sites, and increasing data prices for everything else. White-lists are sometimes easier to manage than black-lists.

But I have a sneaking suspicion that KPN has been over-sold on "personalisation" capabilities, and is about to bite off more than it can chew. Becoming "part of the problem" and taxing innovative substitutes will not succeed in competitive markets. Operators need to become "part of the solution" and offer something better. Despite the appealing Dutch folklore, this is not a hole in a leaking dyke that can be plugged by a small boy's DPI-enabled finger.

Wednesday, April 27, 2011

Guest post on Visionmobile about Future of Voice

I've written for many years about the future of VoIP and personal communications. Recently, I've been mentioning terms such as "non-telephony voice" and the need for service providers to understand that the historic concept of the "phone call" is only one way of interacting with another person using speech.

Last week's profit warning from Dutch operator KPN highlights the fragility of "old" telecoms communications services such as telephony and SMS, as newer applications better-customised to the idiosyncrasies of human behaviour start to emerge. An open question is how well platforms such as IMS can cope with new modes of communication - especially those that aren't based on "sessions", but more fluid forms of interaction.

This is a broad theme I'm going to be addressing in some depth over coming months, through a variety of publications and events.


EDIT: If you are interested in learning more about the Future of Voice, I will be running a series of small-group Masterclasses together with Martin Geddes, as well as providing private internal workshops. Email me at information AT disruptive-analysis DOT com for more details

For now, however, please check out the guest post I've written for fellow analyst Andreas Constantinou's blog, VisionMobile on the Future of Voice, and the challenges being posed for "your grandmother's telephony service".

Monday, April 18, 2011

Is mobile data roaming structurally flawed?

Fascinating article by David Meyer at ZDnet, as part of his ongoing coverage of mobile data roaming.

He points out the possibility of the European Commission forcing a structural split between domestic and roaming service provision. Basically, there seems to be frustration that voice (and especially data) prices and consumer choices have not changed quickly enough, despite recent regulation on tariff caps and anti-billshock thresholds. In particular, there is concern that customers don't know in advance how/when/where they will travel, so they cannot make an educated decision about which tariff is "best" at the start of a contract. Most people have a feel for the number of minutes / texts they send per month - but no idea how much data they might use on visits to Spain, the US or Kyrgyzstan over the next 24 months.

Ironically, even when people *do* look at roaming prices as part of making a decision among competitive domestic offers, the operators feel that it's such a minor part of the plan that they are free to make unilateral changes to those roaming prices, while the contract is still in force. This is exactly what happened to me, last year. Certainly, few price plans in Europe are marketed upfront as 'roamer-friendly'.

Although it's too early to judge exactly how any future regulation might manifest, a possible option is that customers choose their "domestic" tariff and plan as normal, but then get to choose again about which network(s) and price-plans to use when actually roaming, or before departure.

That said, there's clearly a whole host of issues, concerns and possible "gotchas" here:

  • Is this choice made on a per-trip basis, or at the original time of signing a contract? 
  • How does billing work when roaming? Would (say) Vodafone act as a retailer / billing agent for Orange if I pick them when travelling in France? 
  • What's the user experience like?  
  • Do I need a separate SIM card for my roaming provider? 
  • What happens if my phone is SIM-locked - and how would you avoid worsening the grey market in subsidised phones? 
  • Would I use the same roaming provider for both voice and data? 
  • Who would take ultimate responsibility for emergency calls, lawful intercept and so on? 
  • Will this lead to weird distortions - eg people "roaming" permanently in Europe on a Luxembourg mobile contract, because it's cheaper?
I'm expecting the current mobile operators to scream blue murder about this - it's technically complex, impacts an area of significant profitability, and potentially means that a licensee in one European country can offer services on an almost-equal basis throughout the continent. They will no doubt point out that there are already assorted opt-ins, or discount programmes (Vodafone Passport etc) that enable customers to tweak their roaming cost profiles.

Also, from my perspective, the problem is less about in-Europe roaming - for which we're seeing OK packages such as Vodafone's £2 / day for 25MB - and more about travelling outside Europe. The current typical charges of £3-6 per MB when I travel to the US, along with £1+ per minute for voice, are completely unjustifiable and make a mockery of smartphone ownership.

I now routinely switch data roaming off completely, and just rely on WiFi. I recently spent a whole week in San Francisco without using 3G at all, although it does seem silly that I have to resort to using paper printouts of Google Maps, or buying Starbucks coffee to check my email, when I'm quite prepared to pay a sensible amount for cellular data.

The problem is that there is no jurisdiction that can enforce price caps at both ends of (say) Vodafone/AT&T or Orange/SKT bilateral roaming arrangements. The structure of roaming involves both the wholesale (visited) fee, and the retail (outbound) mark-up price. Maybe the ITU, GSMA or even WTO needs to get involved ultimately, although none of them wants to kill the golden goose, even though they realise how unpopular the rates have become.

Another interim approach might be to make it a requirement for operators to disclose the wholesale rates they are paying, in an attempt to shame the visited network into sensible pricing. (imagine getting this SMS when you arrive at the airport: "Data costs £3/MB because the greedy network you're roaming onto charges a wholesale fee of £2.50/MB. Here's the CEO's email address if you'd like to complain").

Perhaps the best option will be an MVNO, or soft-SIM or dynamic-IMSI approach, with Apple or GroupOn or another third party acting as a tariff aggregator for customers. They could use negotiating power to force down wholesale rates for visited networks (eg Europeans roaming onto AT&T in the US, especially), or emulate the style of Truphone's "Local Anywhere" proposition in having multiple accounts on a single SIM card.

Fundamentally, the model for data roaming is completely flawed - unless you're using your home operator's in-house data services such as mobile TV, there is no need to have your data routed back home anyway. If you just want to connect to the Internet in a foreign country, there's no justification for your domestic service provider to have any role, except acting as a source of convenience. I don't phone up Vodafone for permission every time I want to use WiFi in Lithuania, or an Internet cafe in Mozambique. Now, I *am* prepared to pay for convenience - which is why I'll use ATMs and credit cards everywhere, despite some incremental fees. But I'm certainly not paying a potential £500 for a typical week's worth (100-200MB) of non-EU data usage.

The whole ridiculous process is about to be replicated in LTE - at least once enough phones support the right frequency bands to make LTE roaming vaguely practical. Just as VoLTE is "yesterday's telephony reinvented for LTE", we can expect to see "yesterday's data roaming reinvented for LTE" as well.

The effect of this is likely to further drive the use of free WiFi in traveller-centric hotspots. We're already seeing an increasing prevalence of hotels, airports and tourist cafes offering free data. I've stayed in remote parts of the world and been able to use Skype and Facebook for my communications needs, for free. In other words, the current structure for mobile data roaming is driving users to a polarised situation. Many now expect *free* WiFi data when travelling, rather than be willing to pay a smaller, reasonable charge for cellular. In the short term, operators are benefiting from the grudging use of roaming by travellers on expenses - or by occasional roamers who are going to suffer from bill shock because of inadvertent use. That is not a sustainable business - the industry needs to wake up & reinvent how data roaming is organised, because the current system (especially outside the EU or other roaming regions) is broken.

EDIT: as an afterthought, ponder the notion that data roaming is, from your home operator's point of view, "best efforts" especially where it's provided through a telco that is not an affiliate. You would have thought that the lowest level of ownership & control (and therefore QoS) would mean you got charged a *lower* price than at home, not higher, would you not? Or perhaps best-efforts data is really good enough, after all?

Monday, April 11, 2011

The risks of ignorance-based pricing strategies for telecoms

Almost exactly 5 years ago, I wrote a blog post cutting through the myth of "value-based pricing" in the telecoms industry. It followed on from the observation that people seemed happy to pay for SMS messages, and so therefore it must make sense for telcos to try and extract the maximum amount from all users, for all services at all times, rather than under-price and "leave cash on the table".

In principle, I agree that perfect markets (and perfect marketing) should indeed result in optimal yields and the "right" prices mapped on to realistic assessments of users' utility and perceived value. However, we live in a far from perfect world in telecoms, in which obfuscatory marketing, lock-in and sheer rip-offs prevail - and also, it must be said, sometimes too-low pricing as well.

Updating my definitions, it's worth a quick recap:

Bargain-based pricing - it's so cheap, it's unbelievable. You tell everyone about it. You use it for the sheer sake of it. You buy other stuff just as an excuse to use it more. Examples: free WiFi in cafes, or 3G dongles that are cheaper than ADSL lines. Most free Internet services like Google Search and Facebook are also a "bargain" if you're prepared to suffer some advertising.

Value-based pricing - it's the right price. It seems reasonable given the probable underlying costs or its inherently fair market-based pricing mechanism. It does what it says on the tin. You can justify it easily. You mention it to friends or colleagues. Examples: Normal smartphone data plans, iTunes music, Google AdWords, eBay pricing, Inflight WiFi.

Inertia-based pricing - it's a bit steep. You know you could find it a bit cheaper. But it works, it's convenient, and it's not worth the effort to shop around or switch. You don't complain, but you don't recommend it either. Examples: SkypeOut calls, your current broadband provider, your current mobile voice tariff, airport food, iPhones

Ignorance-based pricing - it's a ripoff, but you don't realise it. You've got no real benchmarks, so it seems "reasonable". If it was cheaper, you'd probably use it more. You don't know it's available to other people (maybe in another country) at a much lower price. If you found out, you'd be quite annoyed, complain to friends, and probably feel a bit gullible & prone to switch suppliers when the opportunity arose. Examples: SMS pricing, PSTN calls.

Resentment-based pricing - you know you're being ripped off hugely, but you "have" to pay as you have no immediate alternative. You grit your teeth, and (hopefully) expense it afterwards. You actively look for a way to avoid the cost, and minimise your usage. You complain to friends & colleagues. You develop "active customer disloyalty" and vow to switch suppliers, out of distaste for their show of customer disrespect, whenever you can. Examples: Hotel WiFi, most mobile data roaming.

You'll notice the assertion that mobile voice pricing is "inertia-based". But according to a new piece of research this morning, UK mobile subscribers appear to have sleepwalked into the "ignorance-based" tier, spending on average 44% more than necessary on phone tariffs, or £195 a year (about $300), because they choose plans that are unsuitable.

That's enough to have pretty much every UK media outlet pointing out how much we're over-paying. Now to anyone in the cellular industry, this probably doesn't come as any massively-surprising news. Often, plans are specifically set to encourage upgrade to the next-higher tier. If average usage is 280 mins a month, then thresholds will likely be set at 250 and 500 mins, rather than the logical (but cheaper) 300 mins. Whether you view this as opportunist cynicism, or smart marketing, depends on your point of view. And whether you get called out on it.
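The threshold game above can be sketched in a few lines. These tariffs are entirely made up for illustration, but they show how a 250/500-minute boundary treats a 280-minute customer, and why the "logical" 300-minute tier never gets offered:

```python
# Illustrative only: invented tariffs showing how tier boundaries set
# just below average usage nudge customers into the next bundle up.

TARIFFS = [
    {"name": "Small", "bundle_mins": 250, "monthly": 15.0, "overage_per_min": 0.25},
    {"name": "Large", "bundle_mins": 500, "monthly": 25.0},
]

def monthly_cost(tariff, minutes_used):
    """Bundle price plus any per-minute overage beyond the included minutes."""
    over = max(0, minutes_used - tariff["bundle_mins"])
    return tariff["monthly"] + over * tariff.get("overage_per_min", 0.0)

def best_tariff(tariffs, minutes_used):
    """The rational choice: whichever tariff is cheapest at this usage level."""
    return min(tariffs, key=lambda t: monthly_cost(t, minutes_used))
```

At 280 minutes a month, the Small bundle plus overage actually costs £22.50 against the Large bundle's £25 - but fear of per-minute overage charges pushes many customers to the £25 tier anyway, and a hypothetical 300-minute bundle that would undercut both is conspicuously absent from the menu.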

The open question is whether this type of approach - while clearly generating revenues in the short term - is sustainable, and also whether it creates a damaging perception in customers' minds that operators are ripping them off. At a time when telcos are hoping to become trusted enough to be used for payments, digital lockers, identity management and so forth, they need to be careful to watch their reputation if they hope to gain true loyalty. Google and Facebook don't over-charge their users.

The other risk is that this type of egregious pricing strategy opens the door to "white knights" that can rescue customers-in-distress from the clutches of the evil, firebreathing pricing dragons. It is quite easy to imagine a GroupOn-type approach to buying mobile plans - collective groups of consumers that act with similar power to enterprises start to negotiate bulk deals, disintermediating the operators from identity while they are doing it. (I just realised I wrote a post about "consumer-oriented collective purchasing" 3 years ago, by the way).

Or alternatively, perhaps the UK's price comparison site uSwitch gets recast by Apple as iSwitch, exploiting their patented (and much-hated) notion of a remote-updateable SIM. What better way to perpetuate the $300 gross margins on iPhones than to offer users a way to monitor & optimise their phone plans? "We have calculated that you can save £10 a month by switching to Operator X from Operator Y. Click here to initiate number portability via iTunes and switch to your new provider".

One of the reasons for the mobile industry's historic profitability is that it has been able to derive huge profits from services which aren't really worth what people pay for them. SMS, roaming, too-large telephony plans. This is fine while people don't realise they're over-paying, and while there are no easy workarounds. But as the fixed-line voice providers have learned, once the process of discovering lower prices becomes more transparent, there can be a huge exodus of previously-loyal customers. By contrast, people buying an Apple product - or any other premium brand - know that the supplier is making money, but they obtain value in other ways such as convenience or status.

There's no "cachet" safety-net in getting a too-large mobile minutes bundle, though.

Communications innovators - get thee to eComm in June!

I've been involved in Lee Dryburgh's series of eComm events for several years, both as speaker and as a member of its advisory list. For those of you not familiar with eComm, it's an event that is more about a shared understanding of the future (or possible future) of communications, rather than specific takes on a given technology. It spans next-gen voice services, wireless technologies, apps, social networks, messaging, devices, services business models, regulations and much more. Previous speakers have included the Android founders, senior Skype execs, FCC staffers and a plethora of others.

Up to a point, eComm has something of an anti-establishment feel, which surfaces in occasional anti-telco attitudes - although ironically some of the most provocative speakers have been from thought-leading telco business units. Overall, eComm tends to rail against the status quo, or restrictions on communication. It also tends to favour innovation over centralisation - standards are useful but not essential tools.

The next event is coming up at the end of June in San Francisco, but for various personal reasons Lee has had to take some time off from organising it.

This is a call to my blog readers with interesting stories to tell to apply for a speaking slot. This could be something about new services, new communications apps, perhaps new enabling platforms, or new takes on devices, user-experience and regulation. It *shouldn't* be a straightforward vendor pitch for something essentially me-too. (The back-channel can be pretty merciless on corporate powerpoint-mongers).

Either way, I'd exhort you to have a look at eComm, perhaps looking at the speaker roster from previous events such as US 2010, or Europe 2009.

Thursday, March 17, 2011

WiFi highlights an inconvenient truth about QoS...

... it's not always needed.

Increasingly, smartphones get used with WiFi. Some estimates suggest that up to half of data usage now goes over WiFi. Most of that WiFi is connected from homes, offices or public hotspots over backhaul provided by an operator other than that providing the cellular connection to the smartphone. Although in some cases there is an offload agreement in place, there is usually no direct measurement or control of QoS end-to-end.

But some operators have (or are launching) their own data and content services - whether it's a content site, their appstore, remote backup or even RCS. This means that some of the access will come in to the operator domain via the open Internet. This isn't new in itself - technologies such as UMA/GAN have been around for a while, as have assorted softphones, remote access clients and so forth. But what this implicitly means is that for some of the time, at least, operators are happy to have their services accessed by their customers over the public Internet. With all of the potential downsides that suggests.

Plus, this means that in those situations, the operator is itself acting as a so-called "OTT" provider, riding for free on somebody else's pipes. Are they first in the queue to offer to pay their ADSL/cable saviours for QoS guarantees? No, I thought not.

So the obvious question has to be - if it's OK to connect via an unmanaged network some of the time, then why not all of the time? Are they warning their customers that reliability might be lower if they connect via WiFi? What rights do their customers have if performance is below par?

Now obviously in most cases here the fixed connection used for WiFi is faster than the mobile network would have been - so "quality" in some regards is arguably actually better. But it's still not actively monitored and managed, and both the Internet portion of the access and the WiFi radio itself are subject to all sorts of contention, congestion, packet loss and other threats.

I know that various attempts are being made to bring WiFi into the operator's control - or at least visibility and policy oversight - with selective offload and ANDSF and I-WLAN and various proprietary equivalents. But even these will not cover all situations, even when viewed through the rosiest-tinted glasses.

Surely, if a QoS-managed and policy-controllable network is that critical, there ought to be explicit notifications to users that they are accessing the service via an unmanaged connection? Maybe, in extremis, such access should even be blocked?

Flipping this around the other way.... if it's OK for your access customers to access your services over the Internet on an OTT basis, at least some of the time, why not also let other people access those services as well?

Tuesday, March 15, 2011

UK ISPs Code of Practice on Traffic Management - OK as a start, but major flaws

A group of the UK's largest fixed and mobile ISPs have published a "Code of Practice" about managing traffic on their broadband networks. The full document is here with the announcement press release here. The group includes BT, Vodafone, 3, O2, Virgin, BSkyB and TalkTalk, but currently excludes others, notably EverythingEverywhere, the Orange/T-Mobile joint venture.

(Regular readers may remember that I put up a suggested draft Code of Conduct for traffic management last year - there seems to be a fair amount that has been picked up in the UK document. My input also fed into the manifesto published by my partners at Telco 2.0, here)

There's some good stuff, and some less-good stuff about the new Code of Practice. Of course, if you're a Net Neutrality purist, your good/bad scale will shift a bit.

On the positive side, the general principle of transparency is extremely important. The commitment to being "Understandable, Appropriate, Accessible, Current, Comparable, Verifiable" is entirely the right thing to do. I think there is a lot of good stuff in the Code here, going as far as the need for independent verification (although that would probably happen anyway - I'm sure Google and others have their own techniques for watching how traffic shaping is used by telcos).

The fact that it has been signed by both fixed and mobile operators is also a good thing, although there isn't much in the document about the specific issues inherent in wireless networks.

But the main problem is that it attempts to define traffic management policies by "type of traffic" in terms of descriptions that are only meaningful to boxes in the network, not to users themselves. Ironically, this fails the Code's own insistence on being understandable and appropriate. There are also no clear definitions on what constitutes the various categories such as "gaming" or "browsing".

The problem here is that DPI boxes don't really understand applications and services in the way that users perceive them. "Facebook" is an example of an application, including links or video which are displayed on the web page or inside a mobile app. "WebEx" is another application, which might include video streaming, messaging, file transfer and so on. Add in using HTML5 browsers and it all gets messier still.

Having a traffic policy that essentially says "some features of some applications might not work" isn't very useful. It's a bit like saying that you've got different policies for the colour red, vs. green. Or that a telephone call is #1 priority, unless a voice-recognition DPI box listens and senses that you're singing, in which case it gets reclassified as music and gets down-rated.

And even in terms of traffic types, the CoP conspicuously misses out how to deal with encrypted and VPN traffic, which is increasingly important with the use of HTTPS by websites such as YouTube and Facebook. Given that SSL actually is a protocol and "traffic type", this is pretty important. At the moment, the footnote "***If no entry is shown against a particular traffic type, no traffic management is typically applied to it." to me implies that encrypted traffic passes through unmolested under this code of practice. (I'd be interested in a lawyer's view of this though).
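A toy sketch makes the classification gap concrete. The flow fields and port mappings below are invented for illustration, but they show the two failure modes at once: one user-visible "application" decomposes into several network-visible "traffic types", and TLS collapses them all into an opaque bucket:

```python
# Toy illustration of why "traffic type" classification fails the
# Code's own "understandable" test. Flow fields and the port-to-type
# mapping are invented for this sketch.

def classify(flow):
    """Return a crude 'traffic type' for a flow (dict of hypothetical fields)."""
    if flow.get("tls"):
        return "encrypted"      # SSL/TLS contents are opaque to simple DPI
    port_map = {80: "browsing", 1935: "video-streaming", 5060: "voip"}
    return port_map.get(flow["port"], "unknown")

# A single "Facebook" session, as the user perceives it, shows up to the
# network as three quite different traffic types:
facebook_session = [
    {"port": 80},                   # page load        -> "browsing"
    {"port": 1935},                 # embedded video   -> "video-streaming"
    {"port": 443, "tls": True},     # chat inside SSL  -> "encrypted"
]
```

A policy table written in terms of `classify()`'s output can therefore never tell the user what will happen to "Facebook" - and once everything moves inside SSL, the table says almost nothing at all.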

Another problem is that there is an assumption that traffic management is applied only at specified times (evening, weekends etc), and therefore not just when or where there is *actual* congestion. I suspect Ofcom will take a dim view of this - my sense is that regulators want traffic management to be proportionate and "de minimis" and there seems no justification for heavy-handed throttling or shaping when there is no significant congestion on the access network or first-stage backhaul.

There is also no reference to what happens to any company which fails to meet its obligations under the Code (which is "voluntary"), or how enforcement might happen in the future.

Lastly, there is no reference to bearer-type issues important in mobile. In particular, whether the same policies apply to femtocell or WiFi offload.

Overall, on first read I'd give it a 5 out of 10. A useful start, but with some serious limitations.




Thursday, March 10, 2011

Revenue from content/app transport? Operators need to be part of solution, not part of the problem


I'm still seeing a lot of discussions that go along the traditional and rather tired lines of saying that Facebook / YouTube / Hulu / BBC etc should "pay for their use of our pipes". I've just been debating on Twitter with Flash Networks, an optimisation company, about the fact that YouTube is now watched by a huge proportion of broadband-enabled people in India (mostly fixed, not mobile).

Flash asked the question "should YouTube be financially accountable", to which the answer I think is pretty clearly "no" - the users are financially accountable for buying Internet access services. If they all seem to prefer the same website for video, so what? Maybe at some point it becomes a question for competition authorities, but I really can't see what difference it makes if people watch videos from one site or 10 different ones.

If I have a mobile phone plan with 600 minutes, and use 500 of them calling my best friend and 100 calling everyone else, you wouldn't send my friend a bill for "generating traffic".

But that doesn't preclude the operator doing a deal with YouTube for something extra. Maybe they offer QoS guarantees (empty promises won't cut it, there needs to be proof and an SLA) for prioritisation or low-latency. Maybe they have a way to over-provision extra bandwidth - for example the customer subscribes for a 6Mbit/s line speed, but YouTube pays extra to boost it to 10Mbit/s if the copper can handle it. Maybe the operator gives YouTube a way to target its advertising better, through exposing some customer data. Maybe the operator improves performance and reduces costs by using caching or CDN technology.

But all that is on top of the basic Internet access - and of course, YouTube will be doing its own clever things to squeeze better performance out of basic access as well. It will be playing with clever codecs and buffering and error-correction and so on, so the telco has to make sure its value-add "happy pipe" services give YouTube a better ROI than spending more on R&D tweaking the software.

What won't fly (in most competitive markets) is attempting to erect a tollgate for the baseline service. The telco gets a chance to participate in the upside beyond that, if it can prove that it's adding value. It can't just exploit YouTube's R&D, user loyalty and server farms "for free".

The same is true in mobile - the operator needs to be part of the solution, not part of the problem. Which means that before it has the moral authority to say it's providing value from "extras", it needs to get the basics right, such as adequate coverage and reasonable capacity. It also has to demonstrate neutrality on the basic Internet access service - it can't be seen to transcode or otherwise "mess about" with traffic.

But assuming that there is good - and provable - coverage (including indoors, for something like YouTube), then once again the operator has a chance to participate in improving the performance of vanilla Internet access. It can offer device management, user data, possibly higher speeds and prioritisation and so forth. But there are many more complexities to getting this right, as mobile is less predictable and "monitorable" than fixed-line. Ideally, quality needs to be seen and measured from the user's perspective, not inferred imperfectly from the network. And there needs to be some pretty complex algorithmic stuff going on in the radio network too - how do you deal with a situation where you have both "Gold users" and "Gold applications" competing for resources in a cell? And just how much impact should one Gold user/app right at the cell-edge have on 50 Silver users in the middle?
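One way to picture the Gold-user vs Gold-app conflict is as a weighted split of cell capacity. This is purely an illustrative sketch - the tier names, weights and multiplicative combining rule are my own invented assumptions, not any 3GPP scheduler behaviour:

```python
# Illustrative sketch only: sharing cell capacity between flows that
# carry both a user tier and an application tier. All weights are
# invented for illustration - real RAN schedulers also factor in
# channel quality, fairness history, QoS class targets and so on.

USER_WEIGHT = {"gold": 3.0, "silver": 1.0}
APP_WEIGHT = {"gold": 2.0, "best_effort": 1.0}

def share_capacity(flows, cell_capacity_mbps):
    """Split capacity in proportion to combined user*app weight."""
    weights = {f["id"]: USER_WEIGHT[f["user"]] * APP_WEIGHT[f["app"]]
               for f in flows}
    total = sum(weights.values())
    return {fid: cell_capacity_mbps * w / total
            for fid, w in weights.items()}

flows = [
    {"id": "gold-user-gold-app", "user": "gold", "app": "gold"},
    {"id": "silver-user", "user": "silver", "app": "best_effort"},
]
print(share_capacity(flows, cell_capacity_mbps=20.0))
```

Note that even this toy version ignores the cell-edge problem raised above: delivering each Mbit/s to an edge user costs far more radio resource than delivering it mid-cell, so any split expressed purely in throughput terms understates the edge user's impact on everyone else.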

All of this needs to offer upside beyond what is possible with a best-effort standard mobile Internet connection, where the user and app provider are in control and can alter their behaviour according to personal preferences. The operator and network need to show a demonstrable solution which offers more than can reasonably be expected, not just try to extract fees by creating an artificial problem.

So in pharmaceutical terms, the performance of the baseline, unmodified transmission is like a placebo in a double-blind test of a new drug. Any new network "treatment" such as higher QoS or optimisation has to show measurable and repeatable benefits against the placebo. It is also possible (and necessary) to double-check that the placebo is uncontaminated.
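To make the placebo analogy concrete, here is a minimal sketch of how a QoS "treatment" might be tested against the untreated baseline. The sample latency figures, the minimum-gain threshold and the noise rule are all invented for illustration - a real trial would use a proper statistical test over many more runs:

```python
import statistics

def treatment_beats_placebo(placebo_ms, treated_ms, min_gain_ms=10.0):
    """Require a repeatable, measurable latency improvement over the
    baseline ('placebo') path, not just one lucky sample."""
    gain = statistics.mean(placebo_ms) - statistics.mean(treated_ms)
    # Demand the gain clearly exceeds run-to-run measurement noise.
    noise = statistics.stdev(placebo_ms) + statistics.stdev(treated_ms)
    return gain > max(min_gain_ms, noise)

# Invented latency samples (ms) from repeated test runs.
placebo = [120, 135, 118, 142, 125]
treated = [90, 95, 88, 101, 93]
print(treatment_beats_placebo(placebo, treated))  # → True
```

The key point the analogy captures is the last sentence above: the same harness that measures the treatment can also verify the placebo arm is "uncontaminated", i.e. that the baseline path is genuinely untouched best-effort traffic.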

This is the challenge for mobile operators in particular, looking to derive extra fees from users and/or content and application providers from "smarter" networks. They need to get the basics right (coverage), and provide an acceptable basic service (unmolested Internet). And then they have to offer something more (proven quality or targeting), at a cost and effectiveness better than what could be achieved either by the app software itself, or by simply providing more capacity.

Tuesday, March 08, 2011

Insistence on a single, real-name identity will kill Facebook - gives telcos a chance for differentiation

Note: This post was written before Google+, Google's stance on pseudonyms, and the rise of #nymwars. Most of this article applies just as much to Google as to Facebook.
 
There's been a fair amount of debate about online identity in recent days, partly spurred by Techcrunch's shift to using Facebook IDs for blog comments in an effort to reduce trolling and spamming. Various web luminaries have weighed in on one side of the debate or the other.

Mark Zuckerberg, founder of Facebook, has been quoted in David Kirkpatrick's The Facebook Effect: "You have one identity. The days of you having a different image for your work friends or co-workers and for the other people you know are probably coming to an end pretty quickly ... Having two identities for yourself is an example of a lack of integrity."

I think that's narrow-minded nonsense, and I also believe that this gives the telcos a chance to fight back against the all-conquering Facebook - if, and only if, they have the courage to stand up for some beliefs, and possibly even push back against political pressure in some cases. They will also need to consider de-coupling identity from network-access services.

Operators could easily enable people to have multiple IDs, disposable IDs, anonymity, pseudonyms or nicknames, to tell lies, to allow their past history to "fade", and various other options.

In other words, they could offer "privacy as a service".

There are numerous reasons why people might wish to use a "fake" identity - segmenting work and personal lives, segmenting one social circle from another and so on. There are many real-world situations in which you want to participate online, but with a different name or identity: perhaps because you have a stage or performance name, perhaps you have a (legal) "guilty secret" of some sort, or maybe because you want to whistleblow against people in authority or those that you perceive as dangerous. It can even be because your name is just too common (JohnSmith16785141), or too unusual or difficult to spell (Bubley). It is also common for people to want to participate as part of a company, not an individual.

I know plenty of people who use pseudonyms on Facebook and other social media sites, and for *personal* things I'd say that's good for all sorts of reasons. In a business context, I agree with websites such as LinkedIn and Quora that enforce real names, because there is a strong "reputation" angle to their businesses. But on the other hand, if I had to deal with 300 LinkedIn requests a day from random people I haven't met, I'd probably change my mind.

There is another, important side to anonymity and multiple identities - obfuscating parts of your persona and contact details from advertisers and spammers. Being able to give a secondary (and ideally disposable) email address or mobile phone number to untrusted parties is important. I still use my fixed number for most online forms in the UK, because there's a legally-enforced telemarketing opt-out, while giving a mobile number risks spam SMS. The same is true of online identities - I want to be able to corral spammers and unwanted advertisers in a corner of my Internet world that I can safely nuke if I have to.

So, there is an opportunity for operators to offer - either individually or collectively - a more friendly set of identity options. This probably relates more to mobile operators than fixed operators, but not necessarily. A critical element here is that ID *cannot* be always tied to a SIM card or phone number, for most of these use cases. Users will not wish to be tied to a single access provider, not least because many times they will not be using a single, operator-issued device or that provider's access network. They will also not want to pay for an access account in perpetuity, just to make blog comments or something equally trivial. And, painful though it is to telcos, they *will* churn, and using identity as a lock-in will reduce trust and take-up of the services.

In other words, a telco-provided custom ID will need to be provided OTT-style - something like Orange's ON service, a cross-network app which enshrines principles from studies of psychology and anthropology - such as the right to lie. You need to be able to "take your privacy/identity profile with you" when you move to another operator. Unless we want to wait 10 years to force through "identity portability" laws, operators will fail to exploit this opportunity if they just see it as a churn-reduction tool.

This also means that interoperability between privacy providers is unnecessary and even undesirable. Operators can - and should - go it alone to start with, which is why fixed operators have a chance as well as mobile. Living in the UK, would I use AT&T or Telenor as a privacy provider? Maybe, depending on whether I like a specific service and trust them, but I'd be keener on that than on going with one of the UK operators, who'd try to link the capability into other services. That said, I'd probably use certain aspects of this broader idea from my current telecom providers - perhaps a second "fake" number I could use for advertisers and potential spammers.

(It goes without saying that most or all of this will need to be built outside rigid architectures such as IMS or RCS, which also have centralised repositories for subscriber information, unique personal identifiers attached to credentials such as SIMs, and an assumption of access/service coupling).

Now there is an open question here about full anonymity. A lot will come down to local attitudes and laws. Some countries already force users of previously-anonymous services such as Internet cafes or prepaid mobile phones to register with the authorities - for example Italy, Spain and India. Others like the UK and Portugal are still OK with off-the-shelf purchases of SIM cards, anonymous web access and so forth - luckily our new government binned the hideous UK ID card project when it came to power last year. As events in the Middle East have shown, anonymous and easy access to communications helps protesters against despotism - possibly a price worth paying for a minuscule rise in terrorism risk. Personally I have the luxury of democracy, and I tend to vote for libertarianism rather than nannying state intervention, but your opinion may vary.

(And yes, I understand that real, true anonymity is almost impossible - both online and in the real world. We are traceable via credit cards, mobile phone records, facial-recognition CCTV, and probably online semantics and other behaviours. But at the moment, it's difficult to join the dots unless you are Google or a government security agency).

Don't get me wrong, I'm a huge fan of Facebook and believe that in many ways it is going to eat the telcos' collective lunch. Friend lists are already usurping the notion of a phone "address book", and web-based approaches make social networks much more flexible than a telecoms infrastructure can be. It's tempting to believe that Facebook is now too big to fail - but don't underestimate the fickleness of social groups. I've had a few friends who have had pseudonym-based profiles deleted, and they are definitely no longer loyal users.


I strongly suspect this is not an area in which the telcos will move together, en masse. It is an opportunity for some of the more forward-thinking and perhaps renegade operators (or specific product teams) to move aggressively and across network boundaries. If ID gets mired in years of interop talks and nonsense about support of roaming, it will go the same way as other "coalitions of the losers". This needs to be done NOW and done aggressively by those brave enough to step up - perhaps in partnership with a web provider or two.

Monday, March 07, 2011

Time for the word "terminal" to reach the end of the line

I stirred up a bit of debate over the weekend via posts on Twitter suggesting that the use of the word "terminal" in the telecoms industry is always a good sign that the speaker is stuck in a legacy age. (Twitter being the terrible medium for debate that it is, I was unable to discuss this meaningfully - hence this post).

The word is typically used by network-centric, standards-centric, telephony-centric members of the industry, and I have long believed that 'terminal' exemplifies the denial of reality endemic in many "old school" telecoms professionals. Nobody outside the network fraternity uses the word "terminal". You'll never hear Steve Jobs, or even most of Nokia's current and former execs, utter the term. People say "mobile", "device", "cellphone", "smartphone".

This is not a new stance of mine either - I made the same point almost exactly 5 years ago in this blog post.

After a bit of a verbal ping-pong match with @TMGB this morning (I'm tempted to describe him as the dinosaurs' "Chief Asteroid Denier", but that's perhaps a bit unfair), I've reached a slightly clearer position. In historic telephony standards, there is indeed still a specific technical notion of a "terminal" defined. It's a bit similar to the old mainframe/green-screen architecture, or various other technology domains like industrial SCADA systems.

But in the past, being a terminal was pretty much the only thing that a phone did. Even more recently, being a terminal was the main or most important thing it did, even if it was as an SMS terminal rather than a telephone terminal. Therefore it was fairly natural for people to refer to any mobile phone as a "terminal", firstly because that was the only type of device, and secondly because it was - to all intents and purposes - the only useful thing it did.

But obviously, over the last 10 years, things have changed. Modern devices do a huge range of things - often simultaneously. Acting as network terminal in a standards-based, telephony sense is simply one of a smartphone's functions, and increasingly not the most important. Many of those functions are not even anything to do with a network connection - the camera, MP3 player and so on. Arguably, connectionless technologies like HTTP and IP do not have "terminals" in the telecoms sense of the word. The majority of device value thus resides in "non-terminal" functions.

Using the word "terminal" now to refer to a smartphone or other new device is therefore extremely sloppy. Today, terminal=function in mobile, not terminal=physical product. And yes, this is more than just an abstruse semantic discussion, because perpetuating the idea that the terminal function is somehow the paramount use case of a device - and, moreover, is independent of the other functions - is a huge fallacy which may drive the industry down blind alleys.

The idea that a telephony call (the most obvious example of the terminal function) should over-ride anything else the device or user may be doing is not just arrogant, but a huge error in understanding user behaviour and modern OSes. Yet that remains an unspoken assumption among many in the industry.

Often a smartphone (or, certainly, tablet) user will be doing many things more important than receiving a phone call, particularly a trivial one from somebody they don't want to talk to. Yet the "terminal is the #1 application" mentality is insidious - standards like Circuit-Switched Fallback for LTE telephony assume it to be true. Multi-tasking, multi-connection devices mean that the terminal capability does not exist in isolation - and concurrent tasks need to be considered and sometimes given priority. This will need clever UI design, as well as various user interactions in the device's upper software layers that are not generally considered in network-centric views of "terminal" behaviour.

Furthermore, as we move towards smarter devices and especially VoIP-based telephony, the idea that the "terminating software client" is actually the last point of the chain becomes ever less true. The OS, or another application or browser, might intercept a phone call before it reaches you, or initiate an outbound one on your behalf. The ultimate "voice" application may simply be calling a telephony API - or may pick-and-choose other non-service based voice capabilities.

In other words, even the word "terminal" becomes factually incorrect.

So, to be clearer:

The word "terminal" is a legacy of a time when mobile devices were primarily intended for connection to specific services (especially voice telephony), over a network access run by the same service provider. Nowadays, a mobile device may have a terminal function but can also operate in many other modes - standalone & offline, connected to another network (eg WiFi), using a specific installed app. It is therefore not just factually wrong, but dangerously naive to continue referring to it as just a "terminal" - and thus I believe I am justified in my views that continued misuse of the term is a good indicator of the mindset of the person saying it.

Wednesday, March 02, 2011

I want to report a 3G coverage problem - how difficult can it be?

Various emerging business models demand good, reliable, near-ubiquitous mobile data coverage, especially in dense urban areas. We hear a lot about congestion, but rather less about the more basic problems of getting a signal. Whether it's a "not-spot" because of buildings, poor setup of the antennas, inability to site a base station, a recurring equipment fault or just some other RF weirdness, gaps and other coverage-free zones are going to be an increasing problem.

In particular, cloud-based services are going to be very sensitive to the quality of a given operator's network. It's bad enough losing access to the web and email in certain locations - think how much more problematic it would be for critical business processes dependent on hosted applications, used via mobile devices.

Because of this, you'd expect that operators would want to get prompt feedback from their customers about any real-world problems they've missed. Surely in this area of their business, they'd recognise that overall "quality of experience" is best monitored and reported by the end-user, not simply inferred from boxes, probes and software in the network.

Well, that's certainly not the case for Vodafone UK. Over the last year I've been on its network for my main phone, I've noticed quite a lot of coverage gaps and holes around central London. Sometimes I get bumped down to 2G, sometimes nothing at all. And some of those gaps are in absolutely predictable and consistent physical locations - I've encountered them repeatedly, at different times of day, to the extent that I can even plan my usage around them on certain trips around town. To me, this suggests that congestion and capacity isn't the problem - it's plain and simple coverage.

I've put them on this personalised Google Map - http://goo.gl/maps/hTv3 - both are near Regents Park and Camden in London. One is right in between two of the busiest train stations in the country - Euston and Kings Cross, right outside the British Library and near the Eurostar terminal at St Pancras.

In the big scheme of things, the two most obvious gaps are not a huge problem for me. Given my typical travel patterns around London, I probably lose 2 mins of mobile data access a week, usually when I'm on a couple of specific bus routes and using my phone for a mix of email, personal apps and so forth. But they contribute to my sense that Vodafone's London network isn't that great - especially as the company hasn't detected and fixed the (very consistent) problems proactively using whatever "service assurance" tools it presumably has at its disposal.

So I decided to report the issue.

I've heard good things about the @vodafoneUK Twitter team, so I thought I'd try that route rather than calling customer service on the phone, especially as I was reporting outdoor locations without knowing the postcodes. The @vodafoneUK team pointed me towards the VFUK online e-forums, rather than (say) giving me a direct phone line or email address to report coverage issues.

Already feeling like this was a lot of work, I nevertheless proceeded to register for the eforum (which needs a different login from other VF services, naturally), and read through their harsh instructions to search for pre-existing forum posts that might cover the problem already. Then I had to go to the coverage-checker engine to see if there were any existing problems reported - which meant that I had to use Google to find two appropriate postcodes to enter, as you can't just click on the map.

Both inquiries gave the response: "Important service information - we're working on correcting a network problem that may affect the performance of your device".

Given that both problems have been ongoing for months, I didn't have too much confidence in this being accurate, so I put this post up on the eforum. Nothing too controversial, just a quick note to tell Voda they've got some issues. I gave a link to this blog so that their support people would know I'm not just an "average user" but have some knowledge of the industry.

The first response almost beggars belief: "Now I'm not saying there isn't a problem, but the investigation I've just done points to this at the moment." Yes, that's right, I spend all day signing up for forums and posting messages about non-existent problems. I've got nothing better to do. And your "open cases" support system is obviously better than a real-world customer with a real-world device, reporting on a real-world problem. Unreal.

Somehow, I remain civil, writing another post pointing out that yes, these issues are still real, and giving some hints on how the VF engineers might replicate them if they want to do tests.


The next reply takes the biscuit: "If you can provide 3 examples of these drops for every area you experience these in then I will definitely raise this case." Coupled with a request by email (with a spam-tastic "Customer Service" as sender and "No subject") for my information. So if I wanted to "raise a case", I had to send through not just my phone number, but also full name (OK), and also "for security" - two digits of my VF security code (!!! very secure via email), my address (irrelevant to the question and they know this from my number), and my date of birth.


Because "security" is always important when reporting network problems.... perhaps I am some evil-doer wanting to do a "denial of service" attack on their radio engineers' time by submitting fake faults?

Oh, and then the email asks for a few more details, copied and pasted from some stupid template (possibly the wrong one too - voice, not data):
  • Fault description: (please detail the exact nature of the fault)
  • Tests performed (Manual roam SIM in different handset)
  • Date issue started:
  • Device make an model:
  • Results of trying SIM in another handset:
  • IMEI number of the handset:
  • Postcode of location:
  • How far do you have to travel to get signal?
  • Address of issue:
  • Error tone/wording:
  • Numbers effected (Please provide 3 failures, including Number called, date, time and location when call made/received):
As you can understand, I decided that a more profitable use of my time was to write this blog post instead. I'm shaking my head in disbelief about how hard it is to report an important - but simple - problem. Without basic coverage, a whole host of future business models are rendered useless. The idea, for example, of getting media companies or Internet firms to pay for "priority delivery" for 3G data, or some other sort of non-neutral network approach, is totally contingent upon delivering a reliable service.

So just to spice things up a bit more, I've also reported some other holes.... in the road.... to my local council, Westminster. I pay them about the same per month as I pay Vodafone. The road in question is less than a mile from the other sites mentioned. Let's see which one has better processes & more efficient engineering. The Council has a head start, as they have a simple page to report problems, including doing it via street name (not postcode) or "pinpoint on a map". Asks for details, gives a reference number, sends an email acknowledgement. Not a complex customer interface, but about 10x better than a supposedly customer-centric phone company worried about churn.

So - it's definitely easier to report holes in the road, than holes in the air. Let's see if it's quicker to get them fixed too.

Tuesday, March 01, 2011

Policy and traffic management moves to the edge of the network - the device

One of the hidden trends that I've been watching for a while, in the complex world of mobile broadband traffic management, is now starting to come to the surface: the action is moving down to the device/handset itself.

While a lot of manufacturers of "big iron" boxes like to imagine that the core network or the RAN is all-seeing and all-powerful, the truth is that any discussion of "end-to-end" is only true if it extends out to the user's hand (or ideally, retina or ear-drum). That is where quality of experience (QoE) really manifests itself and where radio decisions (especially about WiFi) are controlled. Anything observed or inferred from within the network about the handset is a second-best simulacrum, if that.

That's not to say that the network-side elements are not useful - clearly the policy engines, offload and femto gateways, and analytical probes in the RAN have major (even critical) roles to play, as do the billing/charging functions allowing the setting of caps & tiers - even if I am less convinced by the various optimisation servers sitting behind the GGSN on the way to the Internet.

But most major network equipment vendors avoid getting involved in client software for devices for a number of reasons:

  • The standards bodies are generally very poor at specifying on-handset technology beyond the radio and low-level protocols, and even worse at encouraging OEMs to support it. Few network equipment firms are willing to go too far down the proprietary route
  • There is a huge variety of device types and configurations, which means that vendors are likely to need to develop multiple complex solutions in parallel - a costly and difficult task. It is also unclear how device-level software can be easily monetised by network vendors, except in the case of integrated end-to-end solutions.
  • There are various routes to market for devices, which makes it very difficult to put operator-centric software on more than a fraction of products. In particular, buyers of unlocked devices such as PCs or "vanilla" smartphones are going to be very wary of installing software seen as controlling and restricting usage, rather than offering extra functionality
  • Testing, support, localisation, upgrades and management are all headaches
But despite these difficulties, some vendors are (sometimes grudgingly) starting to change their stance and are dipping their toes into the on-handset realm.

There are various use cases and software types emerging around device "smarts" for assisting in mobile traffic management, for example:

  • Offload assistance and WiFi connection management
  • Security such as on-device application policy and encryption
  • User alerting - or operator feedback - on congestion and realtime network conditions from the handset's point of view
  • Quota / data-plan management
  • Feedback to the network on device status (eg power level, processor load etc)
  • User control of application data traffic
  • Low-level connectivity aspects
I'm maintaining a list of vendors active in these areas (and a few others), as well as my thoughts on who really "gets it", but I'm going to hold off on naming them all on this occasion, as I know many of my esteemed rivals occasionally drop by this blog.

However, one that I will highlight as being very interesting is Mobidia [not a client], which aims to put control into users' hands, rather than boxes in the network making arbitrary policy decisions. For example, it's one thing for an optimisation server to guess whether the user prefers a "non-stalling" but degraded video - but quite another (and much better) solution, for a software client to let the user participate directly in that decision and trade off quality vs. impact on their monthly data quota, via an app. I was very impressed when speaking to them, especially in comparison with some of the purely network-centric DPI/policy/optimisation vendors I met in Barcelona. I think this type of user involvement in policy will be an important piece of the puzzle.
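A minimal sketch of the kind of user-side trade-off such an app could surface - note the quality tiers, bitrates and quota figures here are entirely invented for illustration, not Mobidia's actual product logic:

```python
# Hypothetical video quality tiers and rough bitrates (invented).
TIERS_KBPS = {"low": 400, "standard": 1200, "hd": 3000}

def quota_cost_mb(tier, minutes):
    """Estimated data used by a video session at a given quality tier."""
    return TIERS_KBPS[tier] * 60 * minutes / 8 / 1024  # kbit/s -> MB

def affordable_tiers(minutes, quota_left_mb):
    """Show the user which quality levels fit their remaining quota,
    so the quality-vs-quota decision is theirs, not the network's."""
    return [t for t in TIERS_KBPS
            if quota_cost_mb(t, minutes) <= quota_left_mb]

# A 30-minute video with 500 MB of monthly quota remaining:
print(affordable_tiers(30, 500))  # → ['low', 'standard']
```

The point is the inversion of control: instead of an optimisation box in the network guessing that a degraded-but-non-stalling stream is what the user wants, the client presents the cost of each option and lets the user choose.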

Management of WiFi connectivity is another area where device-level touch points are important. Although some aspects can be managed from a device management / configuration box in the network - or via standards like 802.11u - that is only ever going to be a partial answer. There will need to be a proper on-device client with a UI, in order to get the experience right in all contexts. (I'll do another post on WiFi offload soon as there's other important issues, especially about the idea of backhauling traffic through the core).

Overall - device-based policy management is difficult, messy, heterogeneous and difficult to monetise. But it is going to be increasingly important, and the most far-sighted network vendors would do well to look to incorporate the "real edge" into their architectures.