Speaking Engagements & Private Workshops - Get Dean Bubley to present or chair your event

Need an experienced, provocative & influential telecoms keynote speaker, moderator/chair or workshop facilitator?
To see recent presentations, and discuss Dean Bubley's appearance at a specific event, click here

Wednesday, April 27, 2011

Guest post on Visionmobile about Future of Voice

I've written for many years about the future of VoIP and personal communications. Recently, I've been mentioning terms such as "non-telephony voice" and the need for service providers to understand that the historic concept of the "phone call" is only one way of interacting with another person using speech.

Last week's profit warning from Dutch operator KPN highlights the fragility of "old" telecoms communications services such as telephony and SMS, as newer applications better-customised to the idiosyncrasies of human behaviour start to emerge. An open question is how well platforms such as IMS can cope with new modes of communication - especially those that aren't based on "sessions", but on more fluid forms of interaction.

This is a broad theme I'm going to be addressing in some depth over coming months, through a variety of publications and events.


EDIT: If you are interested in learning more about the Future of Voice, I will be running a series of small-group Masterclasses together with Martin Geddes, as well as providing private internal workshops. Email me at information AT disruptive-analysis DOT com for more details

For now, however, please check out the guest post I've written for fellow analyst Andreas Constantinou's blog, VisionMobile on the Future of Voice, and the challenges being posed for "your grandmother's telephony service".

Monday, April 18, 2011

Is mobile data roaming structurally flawed?

Fascinating article by David Meyer at ZDnet, as part of his ongoing coverage of mobile data roaming.

He points out the possibility of the European Commission forcing a structural split between domestic and roaming service provision. Basically, there seems to be frustration that voice (and especially data) prices and consumer choices have not changed quickly enough, despite recent regulation on tariff caps and anti-billshock thresholds. In particular, there is concern that customers don't know in advance how/when/where they will travel, so they cannot make an educated decision about which tariff is "best" at the start of a contract. Most people have a feel for the number of minutes / texts they send per month - but no idea how much data they might use on visits to Spain, the US or Kyrgyzstan over the next 24 months.

Ironically, even when people *do* look at roaming prices as part of making a decision among competitive domestic offers, the operators feel that it's such a minor part of the plan that they are free to make unilateral changes to those roaming prices, while the contract is still in force. This is exactly what happened to me, last year. Certainly, few price plans in Europe are marketed upfront as 'roamer-friendly'.

Although it's too early to judge exactly how any future regulation might manifest, a possible option is that customers choose their "domestic" tariff and plan as normal, but then get to choose again about which network(s) and price-plans to use when actually roaming, or before departure.

That said, there's clearly a whole host of issues, concerns and possible "gotchas" here:

  • Is this choice made on a per-trip basis, or at the original time of signing a contract? 
  • How does billing work when roaming? Would (say) Vodafone act as a retailer / billing agent for Orange if I pick them when travelling in France? 
  • What's the user experience like?  
  • Do I need a separate SIM card for my roaming provider? 
  • What happens if my phone is SIM-locked - and how would you avoid worsening the grey market in subsidised phones? 
  • Would I use the same roaming provider for both voice and data? 
  • Who would take ultimate responsibility for emergency calls, lawful intercept etc? 
  • Will this lead to weird distortions - eg people "roaming" permanently in Europe on a Luxembourg mobile contract, because it's cheaper?
I'm expecting the current mobile operators to scream blue murder about this - it's technically complex, it impacts an area of significant profitability, and it potentially means that a licensee in one European country could offer services on an almost-equal basis throughout the continent. They will no doubt point out that there are already assorted opt-ins and discount programmes (Vodafone Passport etc) that enable customers to tweak their roaming cost profiles.

Also, from my perspective, the problem is less about in-Europe roaming - for which we're seeing OK packages such as Vodafone's £2 / day for 25MB - and more about travelling outside Europe. The current typical charges of £3-6 per MB when I travel to the US, along with £1+ per minute for voice, are completely unjustifiable and make a mockery of smartphone ownership.

I now routinely switch data roaming off completely, and just rely on WiFi. I recently spent a whole week in San Francisco without using 3G at all, although it does seem silly that I have to resort to paper printouts of Google Maps, or buying Starbucks coffee to check my email, when I'm quite prepared to pay a sensible amount for cellular data.

The problem is that there is no jurisdiction that can enforce price caps at both ends of (say) Vodafone/AT&T or Orange/SKT bilateral roaming arrangements. The structure of roaming involves both the wholesale (visited) fee, and the retail (outbound) mark-up price. Maybe the ITU, GSMA or even WTO needs to get involved ultimately, although none of them wants to kill the golden goose, even though they realise how unpopular the rates have become.

Another interim approach might be to make it a requirement for operators to disclose the wholesale rates they are paying, in an attempt to shame the visited network into sensible pricing. (imagine getting this SMS when you arrive at the airport: "Data costs £3/MB because the greedy network you're roaming onto charges a wholesale fee of £2.50/MB. Here's the CEO's email address if you'd like to complain").

Perhaps the best option will be an MVNO, or soft-SIM or dynamic-IMSI approach, with Apple or GroupOn or another third party acting as a tariff aggregator for customers. They could use negotiating power to force down wholesale rates for visited networks (eg Europeans roaming onto AT&T in the US, especially), or emulate the style of Truphone's "Local Anywhere" proposition in having multiple accounts on a single SIM card.

Fundamentally, the model for data roaming is completely flawed - unless you're using your home operator's in-house data services such as mobile TV, there is no need to have your data routed back home anyway. If you just want to connect to the Internet in a foreign country, there's no justification for your domestic service provider to have any role, except acting as a source of convenience. I don't phone up Vodafone for permission every time I want to use WiFi in Lithuania, or an Internet cafe in Mozambique. Now, I *am* prepared to pay for convenience - which is why I'll use ATMs and credit cards everywhere, despite some incremental fees. But I'm certainly not paying a potential £500 for a typical week's worth (100-200MB) of non-EU data usage.
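To put rough numbers on that gap, here's a quick back-of-envelope sketch in Python. The rates are the illustrative figures quoted in this post (£3-6/MB non-EU, £2/day for 25MB in-EU), not any operator's actual current tariff:

```python
# Back-of-envelope comparison of non-EU roaming rates against an in-EU
# daily package, using the illustrative figures quoted in the post.

def roaming_cost(mb_used, price_per_mb):
    """Total cost of a trip's data usage at a flat per-MB roaming rate."""
    return mb_used * price_per_mb

# A typical week's usage of 100-200MB outside the EU, at £3-6 per MB:
low = roaming_cost(100, 3.0)
high = roaming_cost(200, 6.0)
print(f"Non-EU week: £{low:.0f} to £{high:.0f}")

# The same week inside the EU on a £2/day, 25MB/day package:
eu_week = 7 * 2.0  # covers up to 175MB over the week
print(f"EU week: £{eu_week:.0f}")
```

Even at the low end, that's well over an order of magnitude difference for essentially the same usage.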

The whole ridiculous process is about to be replicated in LTE - at least when the question of supporting the right frequency bands in a decent % of phones means that LTE roaming becomes vaguely practical. Just as VoLTE is "yesterday's telephony reinvented for LTE", we can expect to see "yesterday's data roaming reinvented for LTE" as well.

The effect of this is likely to further drive the use of free WiFi in traveller-centric hotspots. We're already seeing an increasing prevalence of hotels, airports and tourist cafes offering free data. I've stayed in remote parts of the world and been able to use Skype and Facebook for my communications needs, for free. In other words, the current structure for mobile data roaming is driving users to a polarised situation. Many now expect *free* WiFi data when travelling, rather than being willing to pay a smaller, reasonable charge for cellular. In the short term, operators are benefiting from the grudging use of roaming by travellers on expenses - or by occasional roamers who are going to suffer from bill shock because of inadvertent use. That is not a sustainable business - the industry needs to wake up & reinvent how data roaming is organised, because the current system (especially outside the EU or other roaming regions) is broken.

EDIT: as an afterthought, ponder the notion that data roaming is, from your home operator's point of view, "best efforts" especially where it's provided through a telco that is not an affiliate. You would have thought that the lowest level of ownership & control (and therefore QoS) would mean you got charged a *lower* price than at home, not higher, would you not? Or perhaps best-efforts data is really good enough, after all?

Monday, April 11, 2011

The risks of ignorance-based pricing strategies for telecoms

Almost exactly 5 years ago, I wrote a blog post cutting through the myth of "value-based pricing" in the telecoms industry. It followed on from the observation that people seemed happy to pay for SMS messages, and so therefore it must make sense for telcos to try and extract the maximum amount from all users, for all services at all times, rather than under-price and "leave cash on the table".

In principle, I agree that perfect markets (and perfect marketing) should indeed result in optimal yields and the "right" prices mapped on to realistic assessments of users' utility and perceived value. However, we live in a far from perfect world in telecoms, in which obfuscatory marketing, lock-in and sheer rip-offs prevail - and also, it must be said, sometimes too-low pricing as well.

Updating my definitions, it's worth a quick recap:

Bargain-based pricing - it's so cheap, it's unbelievable. You tell everyone about it. You use it for the sheer sake of it. You buy other stuff just as an excuse to use it more. Example: free WiFi in cafes, or 3G dongles that are cheaper than ADSL lines. Most free Internet services like Google Search and Facebook are also a "bargain" if you're prepared to suffer some advertising.

Value-based pricing - it's the right price. It seems reasonable given the probable underlying costs, or an inherently fair market-based pricing mechanism. It does what it says on the tin. You can justify it easily. You mention it to friends or colleagues. Examples: Normal smartphone data plans, iTunes music, Google AdWords, eBay pricing, Inflight WiFi.

Inertia-based pricing - it's a bit steep. You know you could find it a bit cheaper. But it works, it's convenient, and it's not worth the effort to shop around or switch. You don't complain, but you don't recommend it either. Example: SkypeOut calls, your current broadband provider, your current mobile voice tariff, airport food, iPhones

Ignorance-based pricing - it's a ripoff, but you don't realise it. You've got no real benchmarks, so it seems "reasonable". If it was cheaper, you'd probably use it more. You don't know it's available to other people (maybe in another country) at a much lower price. If you found out, you'd be quite annoyed, complain to friends, and probably feel a bit gullible & prone to switch suppliers when the opportunity arose. Example: SMS pricing, PSTN calls.

Resentment-based pricing - you know you're being ripped off hugely, but you "have" to pay as you have no immediate alternative. You grit your teeth, and (hopefully) expense it afterwards. You actively look for a way to avoid the cost, and minimise your usage. You complain to friends & colleagues. You develop "active customer disloyalty" and vow to switch suppliers, out of distaste for their show of customer disrespect, whenever you can. Examples: Hotel WiFi, most mobile data roaming.

You'll notice the assertion above that mobile voice pricing is "inertia-based". But according to a new piece of research this morning, UK mobile subscribers appear to have sleepwalked into the "ignorance-based" tier, spending on average 44% more than necessary on phone tariffs, or £195 a year (about $300), because they have chosen unsuitable plans.

That's enough to have pretty much every UK media outlet pointing out how much we're over-paying. Now to anyone in the cellular industry, this probably doesn't come as any massively-surprising news. Often, plans are specifically set to encourage upgrade to the next-higher tier. If average usage is 280 mins a month, then thresholds will likely be set at 250 and 500 mins, rather than the logical (but cheaper) 300 mins. Whether you view this as opportunist cynicism, or smart marketing, depends on your point of view. And whether you get called out on it.

The open question is whether this type of approach - while clearly generating revenues in the short term - is sustainable, and also whether it creates a damaging perception in customers' minds that operators are ripping them off. At a time when telcos are hoping to become trusted enough to be used for payments, digital lockers, identity management and so forth, they need to be careful to watch their reputation if they hope to gain true loyalty. Google and Facebook don't over-charge their users.

The other risk is that this type of egregious pricing strategy opens the door to "white knights" that can rescue customers-in-distress from the clutches of the evil, firebreathing pricing dragons. It is quite easy to imagine a GroupOn-type approach to buying mobile plans - collective groups of consumers that act with similar power to enterprises start to negotiate bulk deals, disintermediating the operators from identity while they are doing it. (I just realised I wrote a post about "consumer-oriented collective purchasing" 3 years ago, by the way).

Or alternatively, perhaps the UK's price comparison site uSwitch gets recast by Apple as iSwitch, exploiting their patented (and much-hated) notion of a remote-updateable SIM. What better way to perpetuate the $300 gross margins on iPhones than to offer users a way to monitor & optimise their phone plans? "We have calculated that you can save £10 a month by switching to Operator X from Operator Y. Click here to initiate number portability via iTunes and switch to your new provider".

One of the reasons for the mobile industry's historic profitability is that it has been able to derive huge profits from services which aren't really worth what people pay for them. SMS, roaming, too-large telephony plans. This is fine while people don't realise they're over-paying, and while there are no easy workarounds. But as the fixed-line voice providers have learned, once the process of discovering lower prices becomes more transparent, there can be a huge exodus of previously-loyal customers. By contrast, people buying an Apple product - or any other premium brand - know that the supplier is making money, but they obtain value in other ways such as convenience or status.

There's no "cachet" safety-net in getting a too-large mobile minutes bundle, though.

Communications innovators - get thee to eComm in June!

I've been involved in Lee Dryburgh's series of eComm events for several years, both as speaker and as a member of its advisory list. For those of you not familiar with eComm, it's an event that is more about a shared understanding of the future (or possible future) of communications, rather than specific takes on a given technology. It spans next-gen voice services, wireless technologies, apps, social networks, messaging, devices, service business models, regulation and much more. Previous speakers have included the Android founders, senior Skype execs, FCC staffers and a plethora of others.

Up to a point, eComm has something of an anti-establishment feel, which surfaces in occasional anti-telco attitudes - although ironically some of the most provocative speakers have been from thought-leading telco business units. Overall, eComm tends to rail against the status quo, or restrictions on communication. It also tends to favour innovation over centralisation - standards are useful but not essential tools.

The next event is coming up at the end of June in San Francisco, but for various personal reasons Lee has had to take some time off from organising it.

This is a call to my blog readers with interesting stories to tell to apply for a speaking slot. This could be something about new services, new communications apps, perhaps new enabling platforms, or new takes on devices, user-experience and regulation. It *shouldn't* be a straightforward vendor pitch for something essentially me-too. (The back-channel can be pretty merciless on corporate powerpoint-mongers).

Either way, I'd exhort you to have a look at eComm, perhaps looking at the speaker roster from previous events such as US 2010, or Europe 2009.

Thursday, March 17, 2011

WiFi highlights an inconvenient truth about QoS...

... it's not always needed.

Increasingly, smartphones get used with WiFi. Some estimates suggest that up to half of data usage now goes over WiFi. Most of that WiFi is connected from homes, offices or public hotspots over backhaul provided by an operator other than that providing the cellular connection to the smartphone. Although in some cases there is an offload agreement in place, there is usually no direct measurement or control of QoS end-to-end.

But some operators have (or are launching) their own data and content services - whether it's a content site, their appstore, remote backup or even RCS. This means that some of the access will come in to the operator domain via the open Internet. This isn't new in itself - technologies such as UMA/GAN have been around for a while, as have assorted softphones, remote access clients and so forth. But what this implicitly means is that for some of the time, at least, operators are happy to have their services accessed by their customers over the public Internet. With all of the potential downsides that suggests.

Plus, this means that in those situations, the operator is itself acting as a so-called "OTT" provider, riding for free on somebody else's pipes. Are they first in the queue to offer to pay their ADSL/cable saviours for QoS guarantees? No, I thought not.

So the obvious question has to be - if it's OK to connect via an unmanaged network some of the time, then why not all of the time? Are they warning their customers that reliability might be lower if they connect via WiFi? What rights do their customers have if performance is below par?

Now obviously in most cases here the fixed connection used for WiFi is faster than the mobile network would have been - so "quality" in some regards is arguably actually better. But it's still not actively monitored and managed, and both the Internet portion of the access and the WiFi radio itself are subject to all sorts of contention, congestion, packet loss and other threats.

I know that various attempts are being made to bring WiFi into the operator's control - or at least visibility and policy oversight - with selective offload and ANDSF and I-WLAN and various proprietary equivalents. But even these will not cover all situations, even when viewed through the rosiest-tinted glasses.

But if a QoS-managed and policy-controllable network is that critical, surely there ought to be explicit notifications to users that they are accessing the service via an unmanaged connection? Maybe, in extremis, such access should even be blocked?

Flipping this around the other way.... if it's OK for your access customers to access your services over the Internet on an OTT basis, at least some of the time, why not also let other people access those services as well?

Tuesday, March 15, 2011

UK ISPs Code of Practice on Traffic Management - OK as a start, but major flaws

A group of the UK's largest fixed and mobile ISPs have published a "Code of Practice" about managing traffic on their broadband networks. The full document is here with the announcement press release here. The group includes BT, Vodafone, 3, O2, Virgin, BSkyB and TalkTalk, but currently excludes others, notably EverythingEverywhere, the Orange/T-Mobile joint venture.

(Regular readers may remember that I put up a suggested draft Code of Conduct for traffic management last year - there seems to be a fair amount that has been picked up in the UK document. My input also fed into the manifesto published by my partners at Telco 2.0, here)

There's some good stuff, and some less-good stuff, about the new Code of Practice. Of course, if you're a Net Neutrality purist, your good/bad scale will shift a bit.

On the positive side, the general principle of transparency is extremely important. The commitment to being "Understandable, Appropriate, Accessible, Current, Comparable, Verifiable" is entirely the right thing to do. I think there is a lot of good stuff in the Code here, going as far as the need for independent verification (although that would probably happen anyway - I'm sure Google and others have their own techniques for watching how traffic shaping is used by telcos).

The fact that it has been signed by both fixed and mobile operators is also a good thing, although there isn't much in the document about the specific issues inherent in wireless networks.

But the main problem is that it attempts to define traffic management policies by "type of traffic" in terms of descriptions that are only meaningful to boxes in the network, not to users themselves. Ironically, this fails the Code's own insistence on being understandable and appropriate. There are also no clear definitions on what constitutes the various categories such as "gaming" or "browsing".

The problem here is that DPI boxes don't really understand applications and services in the way that users perceive them. "Facebook" is an example of an application, including links or video which are displayed on the web page or inside a mobile app. "WebEx" is another application, which might include video streaming, messaging, file transfer and so on. Add in using HTML5 browsers and it all gets messier still.

Having a traffic policy that essentially says "some features of some applications might not work" isn't very useful. It's a bit like saying that you've got different policies for the colour red, vs. green. Or that a telephone call is #1 priority, unless a voice-recognition DPI box listens and senses that you're singing, in which case it gets reclassified as music and gets down-rated.

And even in terms of traffic types, the CoP conspicuously misses out how to deal with encrypted and VPN traffic, which is increasingly important with the use of HTTPS by websites such as YouTube and Facebook. Given that SSL actually is a protocol and a "traffic type", this is pretty important. At the moment, the footnote "***If no entry is shown against a particular traffic type, no traffic management is typically applied to it." implies to me that encrypted traffic passes through unmolested under this code of practice. (I'd be interested in a lawyer's view of this, though).

Another problem is that there is an assumption that traffic management is applied only at specified times (evening, weekends etc), and therefore not just when or where there is *actual* congestion. I suspect Ofcom will take a dim view of this - my sense is that regulators want traffic management to be proportionate and "de minimis" and there seems no justification for heavy-handed throttling or shaping when there is no significant congestion on the access network or first-stage backhaul.

There is also no reference to what happens to any company which fails to meet its obligations under the Code (which is "voluntary"), or how enforcement might happen in the future.

Lastly, there is no reference to bearer-type issues important in mobile. In particular, whether the same policies apply to femtocell or WiFi offload.

Overall, on first read I'd give it a 5 out of 10. A useful start, but with some serious limitations.




Thursday, March 10, 2011

Revenue from content/app transport? Operators need to be part of solution, not part of the problem


I'm still seeing a lot of discussions that go along the traditional and rather tired lines of saying that Facebook / YouTube / Hulu / BBC etc should "pay for their use of our pipes". I've just been debating on Twitter with Flash Networks, an optimisation company, about the fact that YouTube is now watched by a huge proportion of broadband-enabled people in India (mostly fixed, not mobile).

Flash asked the question "should YouTube be financially accountable", to which the answer I think is pretty clearly "no" - the users are financially accountable for buying Internet access services. If they all seem to prefer the same website for video, so what? Maybe at some point it becomes a question for competition authorities, but I really can't see what difference it makes if people watch videos from one site or 10 different ones.

If I have a mobile phone plan with 600 minutes, and use 500 of them calling my best friend and 100 calling everyone else, you wouldn't send my friend a bill for "generating traffic".

But that doesn't preclude the operator doing a deal with YouTube for something extra. Maybe they offer QoS guarantees (empty promises won't cut it, there needs to be proof and an SLA) for prioritisation or low-latency. Maybe they have a way to over-provision extra bandwidth - for example the customer subscribes for a 6Mbit/s line speed, but YouTube pays extra to boost it to 10Mbit/s if the copper can handle it. Maybe the operator gives YouTube a way to target its advertising better, through exposing some customer data. Maybe the operator improves performance and reduces costs by using caching or CDN technology.

But all that is on top of the basic Internet access - and of course, YouTube will be doing its own clever things to squeeze better performance out of basic access as well. It will be playing with clever codecs and buffering and error-correction and so on, so the telco has to make sure its value-add "happy pipe" services give YouTube a better ROI than spending the money on more R&D tweaking its own software.

What won't fly (in most competitive markets) is attempting to erect a tollgate for the baseline service. The telco gets a chance to participate in the upside beyond that, if it can prove that it's adding value. It can't just exploit YouTube's R&D, user loyalty and server farms "for free".

The same is true in mobile - the operator needs to be part of the solution, not part of the problem. Which means that before it has the moral authority to say it's providing value from "extras", it needs to get the basics right, such as adequate coverage and reasonable capacity. It also has to demonstrate neutrality on the basic Internet access service - it can't be seen to transcode or otherwise "mess about" with traffic.

But assuming that there is good - and provable - coverage (including indoors, for something like YouTube), then once again the operator has a chance to participate in improving the performance of vanilla Internet access. It can offer device management, user data, possible higher speeds and prioritisation and so forth. But there are many more complexities to getting this right, as mobile is less predictable and "monitorable" than fixed-line. Ideally, quality needs to be seen and measured from the user's perspective, not inferred imperfectly from the network. And there needs to be some pretty complex algorithmic stuff going on in the radio network too - how do you deal with a situation where you have both "Gold users" and "Gold applications" competing for resources in a cell? And just how much impact should one Gold user/app right at the cell-edge have on 50 Silver users in the middle?
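The Gold-user-at-the-cell-edge question can be made concrete with a toy model. This is entirely hypothetical - the weights and efficiencies below are invented, and this is a simplification of how real LTE schedulers work - but it shows why one high-priority edge user is so expensive: their claim on radio resource is roughly priority weight divided by spectral efficiency, because a poor link needs more resource blocks per megabit.

```python
# Toy model (hypothetical weights, not any standardised scheduler) of
# weighted radio-resource sharing in a cell. A cell-edge user with a weak
# link has low spectral efficiency, so each megabit costs more resource.

def resource_fractions(users):
    """Fraction of the cell's radio resource each user receives.
    users maps name -> (priority_weight, spectral_efficiency)."""
    demand = {name: w / eff for name, (w, eff) in users.items()}
    total = sum(demand.values())
    return {name: d / total for name, d in demand.items()}

# One "Gold" user at the cell edge vs 50 "Silver" users mid-cell:
users = {"gold_edge": (4.0, 0.2)}                 # high weight, weak link
users.update({f"silver_{i}": (1.0, 1.0) for i in range(50)})

fractions = resource_fractions(users)
print(f"Gold edge user's share of radio resource: {fractions['gold_edge']:.0%}")
```

With these made-up numbers, the single Gold edge user soaks up nearly a third of the cell's resource - exactly the trade-off the scheduler's algorithms (and the operator's commercial policy) would have to arbitrate.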

All of this needs to be based on offering upside over what is possible with a best-effort standard mobile Internet connection, where the user and app provider are in control and can alter their behaviour according to personal preferences. The operator and network need to show a demonstrable solution which offers more than can reasonably be expected, and not just try to extract fees by creating an artificial problem.

So in pharmaceutical terms, the performance of the baseline, unmodified transmission is like a placebo in a double-blind test of a new drug. Any new network "treatment" such as higher QoS or optimisation has to show measurable and repeatable benefits against the placebo. It is also possible (and necessary) to double-check that the placebo is uncontaminated.

This is the challenge for mobile operators in particular, looking to derive extra fees from users and/or content and application providers from "smarter" networks. They need to get the basics right (coverage), and provide an acceptable basic service (unmolested Internet). And then they have to offer something more (proven quality or targeting) at a cost and effectiveness better than that which could be achieved either by the app software, or simply by providing more capacity.

Tuesday, March 08, 2011

Insistence on a single, real-name identity will kill Facebook - gives telcos a chance for differentiation

Note: This post was written before Google+, Google's stance on pseudonyms, and the rise of #nymwars. Most of this article applies just as much to Google as to Facebook.
 
There's been a fair amount of debate about online identity in recent days, partly spurred by Techcrunch's shift to using Facebook IDs for blog comments in an effort to reduce trolling and spamming. Various web luminaries have weighed in on one side of the debate or the other.

Mark Zuckerberg, founder of Facebook, has been quoted in David Kirkpatrick's The Facebook Effect: "You have one identity. The days of you having a different image for your work friends or co-workers and for the other people you know are probably coming to an end pretty quickly ... Having two identities for yourself is an example of a lack of integrity."

I think that's narrow-minded nonsense, and I also believe that this gives the telcos a chance to fight back against the all-conquering Facebook - if, and only if, they have the courage to stand up for some beliefs, and possibly even push back against political pressure in some cases. They will also need to consider de-coupling identity from network-access services.

Operators could easily enable people to have multiple IDs, disposable IDs, anonymity, tell lies, have pseudonyms or nicknames, allow your past history to "fade" and various other options.

In other words, they could offer "privacy as a service".

There are numerous reasons why people might wish to use a "fake" identity - segmenting work and personal lives, segmenting one social circle from another and so on. There are many real-world situations in which you want to participate online, but with a different name or identity: perhaps because you have a stage or performance name, perhaps you have a (legal) "guilty secret" of some sort, or maybe because you want to whistleblow against people in authority or those that you perceive as dangerous. It can even be because your name is just too common (JohnSmith16785141), or too unusual or difficult to spell (Bubley). It is also common for people to want to participate as part of a company, not an individual.

I know plenty of people who use pseudonyms on Facebook and other social media sites, and for *personal* things I'd say that's good for all sorts of reasons. In a business context, I agree with websites such as LinkedIn and Quora that enforce real names, because there is a strong "reputation" angle to their businesses. But on the other hand, if I had to deal with 300 LinkedIn requests a day from random people I haven't met, I'd probably change my mind.

There is another, important side to anonymity and multiple identities - obfuscating parts of your persona and contact details from advertisers and spammers. Being able to give a secondary (and ideally disposable) email address or mobile phone number to untrusted parties is important. I still use my fixed number for most online forms in the UK, because there's a legally-enforced telemarketing opt-out, while giving a mobile number risks spam SMS. The same is true of online identities - I want to be able to corral spammers and unwanted advertisers in a corner of my Internet world that I can safely nuke if I have to.
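To make the "safely nuke" idea concrete, here is a minimal sketch of how a disposable-alias service might work - generate a throwaway address that forwards to the real one until it expires or the user revokes it. All names here (the class, the domain, the TTL) are my own illustrative assumptions, not any operator's actual product:

```python
import secrets
import time

class DisposableAliasService:
    """Hypothetical sketch: throwaway aliases that forward to a real
    address, and can be revoked ("nuked") when spam starts arriving."""

    def __init__(self, domain="alias.example.net"):
        self.domain = domain
        self.aliases = {}  # alias -> (real_address, expiry as epoch seconds)

    def create_alias(self, real_address, ttl_days=30):
        # A random local-part means the alias reveals nothing about the user
        alias = f"{secrets.token_hex(6)}@{self.domain}"
        self.aliases[alias] = (real_address, time.time() + ttl_days * 86400)
        return alias

    def resolve(self, alias):
        """Return the real address if the alias is still live, else None."""
        entry = self.aliases.get(alias)
        if entry is None or time.time() > entry[1]:
            return None
        return entry[0]

    def revoke(self, alias):
        # The user's "nuke" button: the alias simply stops resolving
        self.aliases.pop(alias, None)
```

The point of the sketch is that none of this needs a SIM, a subscriber database or an access account - which is exactly why it could be offered OTT-style across networks.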

So, there is an opportunity for operators to offer - either individually or collectively - a more friendly set of identity options. This probably relates more to mobile operators than fixed operators, but not necessarily. A critical element here is that ID *cannot* always be tied to a SIM card or phone number, for most of these use cases. Users will not wish to be tied to a single access provider, not least because much of the time they will not be using a single, operator-issued device or that provider's access network. They will also not want to pay for an access account in perpetuity, just to make blog comments or something equally trivial. And, painful though it is to telcos, they *will* churn - using identity as a lock-in will reduce trust and take-up of the services.

In other words, a telco-provided custom ID will need to be provided OTT-style - something like Orange's ON service, a cross-network app which enshrines principles from studies of psychology and anthropology - such as the right to lie. You need to be able to "take your privacy/identity profile with you" when you move to another operator. Unless we want to wait 10 years to force through "identity portability" laws, operators will fail to exploit this opportunity if they see it merely as a churn-reduction tool.

This also means that interoperability between privacy providers is unnecessary and even undesirable. Operators can - and should - go it alone to start with, which is why fixed operators have a chance as well as mobile. Living in the UK, would I use AT&T or Telenor as a privacy provider? Maybe - it depends on whether I like a specific service and trust them - but I'd be keener on that than on going with one of the UK operators, who'd try to link the capability into other services. That said, I'd probably use certain aspects of this broader idea from my current telecom providers - perhaps a second "fake" number I could give to advertisers and potential spammers.

(It goes without saying that most or all of this will need to be built outside rigid architectures such as IMS or RCS, which also have centralised repositories for subscriber information, unique personal identifiers attached to credentials such as SIMs, and an assumption of access/service coupling).

Now there is an open question here about full anonymity. A lot will come down to local attitudes and laws. Some countries already force users of previously-anonymous services such as Internet cafes or prepaid mobile phones to register with the authorities - for example Italy, Spain and India. Others like the UK and Portugal are still OK with off-the-shelf purchases of SIM cards, anonymous web access and so forth - luckily our new government binned the hideous UK ID card project when it came to power last year. As events in the Middle East have shown, anonymous and easy access to communications helps protesters against despotism - possibly a price worth paying for a minuscule rise in terrorism risk. Personally I have the luxury of democracy, and I tend to vote for libertarianism rather than nannying state intervention, but your opinion may vary.

(And yes, I understand that real, true anonymity is almost impossible - both online and in the real world. We are traceable via credit cards, mobile phone records, facial-recognition CCTV, and probably online semantics and other behaviours. But at the moment, it's difficult to join the dots unless you are Google or a government security agency).

Don't get me wrong, I'm a huge fan of Facebook and believe that in many ways it is going to eat the telcos' collective lunch. Friend lists are already usurping the notion of a phone "address book", and web-based approaches make social networks much more flexible than a telecoms infrastructure can be. It's tempting to believe that Facebook is now too big to fail - but don't underestimate the fickleness of social groups. I've had a few friends who have had pseudonym-based profiles deleted, and they are definitely no longer loyal users.


I strongly suspect this is not an area in which the telcos will move together, en masse. It is an opportunity for some of the more forward-thinking and perhaps renegade operators (or specific product teams) to move aggressively and across network boundaries. If ID gets mired in years of interop talks and nonsense about support of roaming, it will go the same way as other "coalitions of the losers". This needs to be done NOW and done aggressively by those brave enough to step up - perhaps in partnership with a web provider or two.

Monday, March 07, 2011

Time for the word "terminal" to reach the end of the line

I stirred up a bit of debate over the weekend via posts on Twitter suggesting that the use of the word "terminal" in the telecoms industry is always a good sign that the speaker is stuck in a legacy age. (Twitter being the terrible medium for debate that it is, I was unable to discuss this meaningfully - hence this post).

Typically used by network-centric, standards-centric, telephony-centric members of the industry, the word "terminal" has long exemplified, to my mind, the denial of reality endemic in many "old school" telecoms professionals. Nobody outside the network fraternity uses it. You'll never hear Steve Jobs, or even most of Nokia's current and former execs, utter the term. People say "mobile", "device", "cellphone", "smartphone".

This is not a new stance of mine either - I made the same point almost exactly 5 years ago in this blog post.

After a bit of a verbal ping-pong match with @TMGB this morning (I'm tempted to describe him as the dinosaurs' "Chief Asteroid Denier", but that's perhaps a bit unfair), I've reached a slightly clearer position. In historic telephony standards, there is indeed still a specific technical notion of a "terminal" defined. It's a bit similar to the old mainframe/green-screen architecture, or various other technology domains like industrial SCADA systems.

But in the past, being a terminal was pretty much the only thing that a phone did. Even more recently, being a terminal was the main or most important thing it did, even if it was as an SMS terminal rather than a telephone terminal. Therefore it was fairly natural for people to refer to any mobile phone as a "terminal", firstly because that was the only type of device, and secondly because it was - to all intents and purposes - the only useful thing it did.

But obviously, over the last 10 years, things have changed. Modern devices do a huge range of things - often simultaneously. Acting as network terminal in a standards-based, telephony sense is simply one of a smartphone's functions, and increasingly not the most important. Many of those functions are not even anything to do with a network connection - the camera, MP3 player and so on. Arguably, connectionless technologies like HTTP and IP do not have "terminals" in the telecoms sense of the word. The majority of device value thus resides in "non-terminal" functions.

Using the word "terminal" now to refer to a smartphone or other new device is therefore extremely sloppy. Today, in mobile, terminal=function, not terminal=physical product. And yes, this is more than just an abstruse semantic discussion: perpetuating the idea that the terminal function is somehow the paramount use case of a device - and, moreover, is independent of the other functions - is a huge fallacy which may drive the industry down blind alleys.

The idea that a telephony call (the most obvious example of the terminal function) should over-ride anything else the device or user may be doing is not just arrogant, but a huge error in understanding user behaviour and modern OSs. Yet it remains an unspoken assumption among many in the industry.

Often a smartphone (or, certainly, tablet) user will be doing many things more important than receiving a phone call, particularly a trivial one from somebody they don't want to talk to. Yet the "terminal is the #1 application" mentality is insidious - standards like Circuit-Switched Fallback for LTE telephony assume it to be true. Multi-tasking, multi-connection devices mean that the terminal capability does not exist in isolation - and concurrent tasks need to be considered and sometimes given priority. This will need clever UI design, as well as various user interactions in the device's upper software layers that are not generally considered in network-centric views of "terminal" behaviour.

Furthermore, as we move towards smarter devices and especially VoIP-based telephony, the idea that the "terminating software client" is actually the last point of the chain becomes ever less true. The OS, or another application or browser, might intercept a phone call before it reaches you, or initiate an outbound one on your behalf. The ultimate "voice" application may simply be calling a telephony API - or may pick-and-choose other non-service based voice capabilities.

In other words, even the word "terminal" becomes factually incorrect.

So, to be clearer:

The word "terminal" is a legacy of a time when mobile devices were primarily intended for connection to specific services (especially voice telephony), over network access run by the same service provider. Nowadays, a mobile device may have a terminal function, but it can also operate in many other modes - standalone & offline, connected to another network (eg WiFi), or using a specific installed app. It is therefore not just factually wrong but dangerously naive to continue referring to it as just a "terminal" - and thus I believe I am justified in my view that continued misuse of the term is a good indicator of the mindset of the person saying it.

Wednesday, March 02, 2011

I want to report a 3G coverage problem - how difficult can it be?

Various emerging business models demand good, reliable, near-ubiquitous mobile data coverage, especially in dense urban areas. We hear a lot about congestion, but rather less about the more basic problems of getting a signal. Whether it's a "not-spot" because of buildings, poor setup of the antennas, inability to site a base station, a recurring equipment fault or just some other RF weirdness, gaps and other coverage-free zones are going to be an increasing problem.

In particular, cloud-based services are going to be very sensitive to the quality of a given operator's network. It's bad enough losing access to the web and email in certain locations - think how much more problematic it would be for critical business processes dependent on hosted applications, used via mobile devices.

Because of this, you'd expect that operators would want prompt feedback from their customers about any real-world problems they've missed. Surely in this area of their business, they'd recognise that overall "quality of experience" is best monitored and reported by the end-user, not simply inferred from boxes, probes and software in the network.

Well, that's certainly not the case for Vodafone UK. Over the last year I've been on its network for my main phone, I've noticed quite a lot of coverage gaps and holes around central London. Sometimes I get bumped down to 2G, sometimes nothing at all. And some of those gaps are in absolutely predictable and consistent physical locations - I've encountered them repeatedly, at different times of day, to the extent that I can even plan my usage around them on certain trips around town. To me, this suggests that congestion and capacity isn't the problem - it's plain and simple coverage.

I've put them on this personalised Google Map - http://goo.gl/maps/hTv3 - both are near Regents Park and Camden in London. One is right in between two of the busiest train stations in the country - Euston and Kings Cross - right outside the British Library and near the Eurostar terminal at St Pancras.

In the big scheme of things, the two most obvious gaps are not a huge problem for me. Given my typical travel patterns around London, I probably lose 2 mins of mobile data access a week, usually when I'm on a couple of specific bus routes and using my phone for a mix of email, personal apps and so forth. But they contribute to my sense that Vodafone's London network isn't that great - especially as the company hasn't detected and fixed the (very consistent) problems proactively using whatever "service assurance" tools it presumably has at its disposal.

So I decided to report the issue.

I've heard good things about the @vodafoneUK Twitter team, so I thought I'd try that route rather than calling customer service on the phone, especially as I was reporting outdoor locations without knowing the postcodes. The @vodafoneUK team pointed me towards the VFUK online e-forums, rather than (say) giving me a direct phone line or email address to report coverage issues.

Already feeling like this was a lot of work, I nevertheless proceeded to register for the eforum (which needs a different login to other VF services, naturally), and read through their stern instructions to search for pre-existing forum posts that might cover the problem already. Then I had to go to the coverage-checker engine to see if there were any existing problems reported - which meant using Google to find two appropriate postcodes to enter, as you can't just click on the map.

Both inquiries gave the response "Important service information - we're working on correcting a network problem that may affect the performance of your device".

Given that both problems have been ongoing for months, I didn't have too much confidence in this being accurate, so I put this post up on the eforum. Nothing too controversial, just a quick note to tell Voda they've got some issues. I gave a link to this blog so that their support people would know I'm not just an "average user" but have some knowledge of the industry.

The first response almost beggars belief: "Now I'm not saying there isn't a problem, but the investigation I've just done points to this at the moment." Yes, that's right - I spend all day signing up for forums and posting messages about non-existent problems. I've got nothing better to do. And your "open cases" support system is obviously better than a real-world customer with a real-world device, reporting on a real-world problem. Unreal.

Somehow, I remain civil, writing another post pointing out that yes, these issues are still real, and giving some hints on how the VF engineers might replicate them if they want to run tests.


The next reply takes the biscuit: "If you can provide 3 examples of these drops for every area you experience these in then I will definitely raise this case." Coupled with a request by email (with a spam-tastic "Customer Service" as sender and "No subject") for my information. So if I wanted to "raise a case", I had to send through not just my phone number, but also my full name (OK) and - "for security" - two digits of my VF security code (!!! very secure via email), my address (irrelevant to the question, and they know it from my number), and my date of birth.


Because "security" is always important when reporting network problems.... perhaps I am some evil-doer wanting to do a "denial of service" attack on their radio engineers' time by submitting fake faults?

Oh, and then the email asks for a few more details, copied-and-pasted from some stupid template (possibly the wrong one too - voice, not data):
  • Fault description: (please detail the exact nature of the fault)
  • Tests performed (Manual roam SIM in different handset)
  • Date issue started:
  • Device make an model:
  • Results of trying SIM in another handset:
  • IMEI number of the handset:
  • Postcode of location:
  • How far do you have to travel to get signal?
  • Address of issue:
  • Error tone/wording:
  • Numbers effected (Please provide 3 failures, including Number called, date, time and location when call made/received):
As you can understand, I decided that a more profitable use of my time was to write this blog post instead. I'm shaking my head in disbelief about how hard it is to report an important - but simple - problem. Without basic coverage, a whole host of future business models are rendered useless. The idea, for example, of getting media companies or Internet firms to pay for "priority delivery" for 3G data, or some other sort of non-neutral network approach, is totally contingent upon delivering a reliable service.

So just to spice things up a bit more, I've also reported some other holes.... in the road.... to my local council, Westminster. I pay them about the same per month as I pay Vodafone. The road in question is less than a mile from the other sites mentioned. Let's see which one has better processes & more efficient engineering. The Council has a head start, as they have a simple page to report problems, including doing it via street name (not postcode) or "pinpoint on a map". Asks for details, gives a reference number, sends an email acknowledgement. Not a complex customer interface, but about 10x better than a supposedly customer-centric phone company worried about churn.
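For what it's worth, the amount of data a friction-free "no signal here" report actually needs is tiny. A hypothetical sketch follows - the names and fields are mine, not any operator's or council's actual API - showing everything the network side needs, and nothing the user should have to look up (no postcodes, no IMEIs, no security digits):

```python
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class CoverageReport:
    """Hypothetical payload for a one-tap coverage-gap report."""
    lat: float            # from the map pinpoint or GPS
    lon: float
    network_seen: str     # e.g. "2G-only" or "none"
    timestamp: float      # epoch seconds, so recurring gaps can be spotted
    device_model: str     # auto-filled by the client, never typed by the user

def to_submission(report: CoverageReport) -> str:
    """Serialise the report for submission; the reply should be a
    reference number and an email acknowledgement, nothing more."""
    return json.dumps(asdict(report))

# Example: a report from one of the gaps described above
report = CoverageReport(lat=51.53, lon=-0.127, network_seen="2G-only",
                        timestamp=time.time(), device_model="ExamplePhone")
payload = to_submission(report)
```

Five fields, all of which the device already knows - which rather underlines how unnecessary the eleven-item template above is.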

So - it's definitely easier to report holes in the road, than holes in the air. Let's see if it's quicker to get them fixed too.

Tuesday, March 01, 2011

Policy and traffic management moves to the edge of the network - the device

One of the hidden trends that I've been watching for a while, in the complex world of mobile broadband traffic management, is now starting to come to the surface: the action is moving down to the device/handset itself.

While a lot of manufacturers of "big iron" boxes like to imagine that the core network or the RAN is all-seeing and all-powerful, the truth is that "end-to-end" only means anything if it extends out to the user's hand (or ideally, retina or ear-drum). That is where quality of experience (QoE) really manifests itself, and where radio decisions (especially about WiFi) are controlled. Anything observed or inferred from within the network about the handset is a second-best simulacrum, if that.

That's not to say that the network-side elements are not useful - clearly the policy engines, offload and femto gateways and analytical probes in the RAN have major (even critical) roles to play, as do the billing/charging functions that allow the setting of caps & tiers - even if I am less convinced by the various optimisation servers sitting behind the GGSN on the way to the Internet.

But most major network equipment vendors avoid getting involved in client software for devices for a number of reasons:

  • The standards bodies are generally very poor at specifying on-handset technology beyond the radio and low-level protocols, and even worse at encouraging OEMs to support it. Few network equipment firms are willing to go too far down the proprietary route
  • There is a huge variety of device types and configurations, which means that vendors are likely to need to develop multiple complex solutions in parallel - a costly and difficult task. It is also unclear how device-level software can be easily monetised by network vendors, except in the case of integrated end-to-end solutions.
  • There are various routes to market for devices, which makes it very difficult to put operator-centric software on more than a fraction of products. In particular, buyers of unlocked devices such as PCs or "vanilla" smartphones are going to be very wary of installing software seen as controlling and restricting usage, rather than offering extra functionality
  • Testing, support, localisation, upgrades and management are all headaches
But despite these difficulties, some vendors are (sometimes grudgingly) starting to change their stance and are dipping their toes into the on-handset realm.

There are various use cases and software types emerging around device "smarts" for assisting in mobile traffic management, for example:

  • Offload assistance and WiFi connection management
  • Security such as on-device application policy and encryption
  • User alerting - or operator feedback - on congestion and realtime network conditions from the handset's point of view
  • Quota / data-plan management
  • Feedback to the network on device status (eg power level, processor load etc)
  • User control of application data traffic
  • Low-level connectivity aspects
I'm maintaining a list of vendors active in these areas (and a few others), as well as my thoughts on who really "gets it", but I'm going to hold off on naming them all on this occasion, as I know many of my esteemed rivals occasionally drop by this blog.

However, one that I will highlight as being very interesting is Mobidia [not a client], which aims to put control into users' hands, rather than boxes in the network making arbitrary policy decisions. For example, it's one thing for an optimisation server to guess whether the user prefers a "non-stalling" but degraded video - but quite another (and much better) solution, for a software client to let the user participate directly in that decision and trade off quality vs. impact on their monthly data quota, via an app. I was very impressed when speaking to them, especially in comparison with some of the purely network-centric DPI/policy/optimisation vendors I met in Barcelona. I think this type of user involvement in policy will be an important piece of the puzzle.
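As a sketch of that user-in-the-loop tradeoff - and to be clear, the figures and function names below are my own illustrative assumptions, not Mobidia's actual design - a client could estimate each quality level's hit on the remaining monthly quota, then either show the options to the user or auto-pick against a user-set threshold:

```python
# Illustrative per-quality data rates in MB/minute (assumed, not measured)
QUALITY_MB_PER_MIN = {"low": 2, "medium": 5, "high": 12}

def quota_options(duration_min, quota_remaining_mb):
    """For each quality level, estimate the usage and the share of the
    remaining monthly quota it would consume - for display in a client UI."""
    options = {}
    for quality, rate in QUALITY_MB_PER_MIN.items():
        usage = rate * duration_min
        share = usage / quota_remaining_mb if quota_remaining_mb else float("inf")
        options[quality] = {"estimated_mb": usage, "quota_share": share}
    return options

def choose_quality(duration_min, quota_remaining_mb, max_share=0.05):
    """Auto-pick the best quality whose quota impact stays under a
    user-set threshold (here: 5% of what's left this month)."""
    opts = quota_options(duration_min, quota_remaining_mb)
    for quality in ("high", "medium", "low"):
        if opts[quality]["quota_share"] <= max_share:
            return quality
    return "low"  # fallback: always allow the cheapest option
```

The key design point is that the threshold belongs to the user, not to a box in the network guessing on their behalf.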

Management of WiFi connectivity is another area where device-level touch-points are important. Although some aspects can be managed from a device management / configuration box in the network - or via standards like 802.11u - that is only ever going to be a partial answer. There will need to be a proper on-device client with a UI, in order to get the experience right in all contexts. (I'll do another post on WiFi offload soon, as there are other important issues, especially around the idea of backhauling traffic through the core).

Overall - device-based policy management is difficult, messy, heterogeneous and difficult to monetise. But it is going to be increasingly important, and the most far-sighted network vendors would do well to look to incorporate the "real edge" into their architectures.

Saturday, February 26, 2011

2011 events I'm attending or speaking at

This is a quick post to list various conferences or other events I'm expecting to speak at or attend, primarily in H1 2011.

Please let me know if you're interested in meeting at one of these, developing custom material such as research studies or white papers, or indeed you're looking for a speaker or moderator for your own event. Email:  information AT disruptive-analysis DOT com


14th March, London: TEN (Telecom Executive Network) Next Generation Mobile Broadband

30th-31st March, London: Next Generation Core Networks

5th-7th April, Palo Alto: Telco 2.0 New Digital Economics Americas

11th-13th May, London: Telco 2.0 New Digital Economics EMEA

17th-18th May, Amsterdam: LTE World Summit

23rd-25th May, London: Managed Services & Network-Sharing

24th-26th May, London: Avren Connected Home Global Summit

14th-16th June, Berlin: Mobile Data Offloading

22nd-23rd June, Singapore (tentative) Telco 2.0 New Digital Economics Asia Pacific

27th-29th June, San Francisco (tentative) eComm Emerging Communications

In H2 2011, I'll probably be at another couple of Telco 2.0 events in Europe & the US, plus at least one of the IIR Broadband Traffic Management series of conferences.

Thursday, February 24, 2011

1000th post - a retrospective. What I've got right, and what I've got wrong....

I started this blog about five and a half years ago, in October 2005. At the time, I said that "I specialise in looking for "failures of consensus" - either positive or negative" - and that is still true now.

Having watched various areas of the telecoms and IT industry for almost 20 years, it saddens me that there is still a tendency towards "groupthink". I genuinely enjoy the speed of technological progress, yet it often amazes me that huge amounts of time and money are wasted going down obvious blind alleys. Too often, nobody stands up and says "No! You're all wrong!" - or just points out that a seemingly good idea will encounter a huge stream of "gotchas" that will derail its progress.

Conversely, there are sometimes new trends and truly disruptive opportunities that remain unexploited.

Flattening the "hype cycle"

We're all familiar with the famous Gartner "hype cycle" about technologies. I'd like to flatten it out, by alerting people to the inevitable "second order" problems well in advance, rather than suffering delays and disappointments because those issues are not pre-empted. They're not all predictable - but many of the most disruptive are, especially if you look at adjacent sectors and parallel trends.

Hype costs money. The whole process of innovation ---> unrealistic expectations ---> disappointment ---> renaissance ---> eventual success is deeply inefficient. It is driven by many understandable human psychological effects, especially around the fear of missing out on something. Yet this same herd-mentality and refusal to assess future problems can be catastrophic - especially if upcoming substitutes are evolving faster. 

I've got a few standard questions I use in my research, to see how clearly ideas have been thought out. For example: "will it work indoors?", "what's the impact on the battery?", or "does that proposition make sense for prepay users?". But sometimes there are bigger issues that are lurking like elephants in the room.

Five years ago, for example, I wrote a research report examining why the notion of using IMS for next-generation operator-controlled mobile services would likely fail, because nobody had worked out what an IMS-capable handset was, or had recognised the scale of the challenges involved in creating one. When the wheels finally started turning a few years later, it was fairly obvious that the RCS variant was also lacking both technically and in terms of user appeal. In the meantime, Facebook has 200m mobile users, while mobile IMS has (essentially) zero.

(Incidentally my most-read, most-circulated post is the one in which I re-wrote the script of the famous Monty Python Dead Parrot sketch as a discussion about IMS and LTE - it's here)


My 2005 predictions and their outcomes

I'm a big fan of looking back at the accuracy of predictions. I reckon I've scored quite a lot of "I told you so's" over the years, although I've called a few things wrongly as well. Going back to my very first post, I said that the following were over-hyped and wouldn't live up to expectation, as at late-2005. Let's see how I fared. (2011 comments in italic)

Overhyped in 2005 #1) UMA (unlicensed mobile access) - Yes, absolutely spot-on. Never got traction outside T-Mobile US and Orange France. Still trickling on with Kineto's WiFi offload client.

Overhyped in 2005 #2) Cellular operator IM - With a couple of niche exceptions, again absolutely on-the-money. RCS is just the latest version of failure here.


Overhyped in 2005 #3) Near-term massmarket WiMAX - Yes. 'Nuff said

Overhyped in 2005 #4) Free wireless VoIP - Also true. Starting to happen more now (as predicted in my VoIPo3G report in 2007), but has been a distraction not major cannibalisation.

Overhyped in 2005 #5) Dual-mode WLAN/cellular phones - OK, I got this one rather wrong, at least in the mid-term after the iPhone's launch 18 months later. Although that said, globally WiFi is probably present in less than 30% of new shipped handsets because of the sheer volumes of low-end devices.


Overhyped in 2005 #6) Wireless presence - Yes. Still very little use of PC-style presence engines on phones. A bit of Skype, maybe the next rev of Facebook on mobile. RCS failure unsurprising.

Overhyped in 2005 #7) Smartphones - Sort of. In terms of 2005-era definition of smartphones, as just phones with an open OS, I wasn't too far wrong - as the Nokia/Microsoft deal has proven. The *new* definition of smartphones that act as part of an ecosystem had not been invented at that point, and only started to become *really* important from 2008 onwards.
 
Overhyped in 2005 #8) "Seamless" roaming (especially WiFi to cellular) - Correct. We're still talking about it today as if WiFi / 3G (or 3G / LTE) handover is some sort of magical Holy Grail. Classic case of technologists solving the wrong problem, and not realising that "seams" are actually important.


Conversely, at the time I thought some other things were being *under-hyped* and would get more attention from vendors, investors or operators in coming years:

Underhyped in 2005 #1) PBX/cellular integration - Fair. A lot more attention, but still never really got to the stage I'd hoped. Too much futile focus on cellular substitution of PBXs instead, with a variety of pointless and niche hosted "mobile PBX" solutions.

Underhyped in 2005 #2) Poor indoor performance of 3G, WiMAX and other services - Absolutely right. I said in 2005 that nobody paid attention to indoor coverage, especially for data. It was indeed a problem that has since had much more attention...

Underhyped in 2005 #3) Novel in-building wireless coverage solutions - ... especially around innovations such as femtocells, which I was the first analyst to discuss and cover.

Underhyped in 2005 #4) "Single-mode" (non-cellular) VoWLAN phones - OK OK, I was flat out wrong on this. So much for DECT-replacement spurring demand for cheap cordless WLAN phones in-building. Although there *is* a lot of VoIP over WLAN from PCs and now tablets.

Underhyped in 2005 #5) Impact of VoIP on cellular pricing - Difficult to distil the impact of VoIP from that of regulation and market saturation. But there has certainly been a broad decline in per-minute pricing, especially for roaming. I think that VoIP will impact cellular telephony pricing more from now on, as it enables "non-telephony voice" applications to substitute for expensive proper phone calls.


Underhyped in 2005 #6) Upgrading cellular network backhaul - Absolutely right. Easily identifiable as a bottleneck in 2005, even with HSDPA still only just appearing over the horizon.

Underhyped in 2005 #7) Difficulty of integrating & testing new features on mobile handsets - There *still* isn't a proper IMS-capable phone. And Apple proved that good integration/testing was *hard* and expensive if done right. Getting much easier now with Android and Appstores, but 6 years ago nobody (especially network vendors) appreciated how much of a tough problem the little UE box on the end of the chart actually was.

Underhyped in 2005 #8) The impact of a lack of "email portability" on FMC business models - I had to look this one up & remind myself what I was talking about. Essentially I was saying that the stickiness of ISP email addresses would mean a reluctance to switch ISP to one offering an FMC-style voice service. I hadn't accounted for the fact that FMC-style services would be so poor, that few people got to a decision point around email anyway.

Underhyped in 2005 #9) The role of "service enabled" home gateways for FMC - True up to a point. Again, 2005-era voice FMC as espoused by the UMA or SIP voice advocates, never really took off. The home gateways, like email, weren't really the weak points of the proposition - it was the business model. On the other hand, the gateway/STB market has certainly evolved to support some cool services such as IPTV and FON.


More recent "I told you so" and "OK, mea culpa" moments

Looking back at some other predictions from the past 1000 posts, I've got quite a few other things spot-on - but I've also made a couple of howlers as well.

Back in 2006, I noted that operators'  "pipe" revenues from mobile broadband were going to be much more important than other supposed value-added services such as content downloads or mobile TV. Other analysts at the time were advising against open-Internet access, while my view was that it was an inevitable consequence of consumer demand.

Also in 2006, I laughed at the notion of the phone as "mobile wallet". It still hasn't happened in the last 5 years, and phones still won't replace cash in the next 5 either, no matter how hard some other analysts blow that NFC-enabled trumpet.

On the other hand, I wasn't a believer in the Amazon Kindle in 2007. It's been moderately successful, especially in more recent versions, so I'll hold my hands up and admit I misjudged the e-book phenomenon a bit.

More accurate was my prediction about "multiplicity" - that people would have multiple SIMs, multiple devices, multiple service providers and so on. It's a theme I've expanded upon several times and is why I have such a negative view on concepts like "family plans" for mobile. The future is going to get more heterogeneous, not consolidated.

Later in 2007, I published a report which suggested that by end-2012, I was expecting to see as many as 250m users of mobile VoIP over 3G/LTE networks. Given that there are now various 3G-capable VoIP clients for Android, Symbian and iPhone - plus heavy use of VoIP among dongle-connected laptop users - and likely more coming in the next 2 years of LTE, I reckon the top-level numbers were prescient. On the other hand, I'd been expecting operators to have developed a workable carrier-grade LTE VoIP solution by now, as long as they had got some "practice" in tuning it on older HSPA networks first. I also suggested they should work with Skype, Fring and others in the meantime, getting experience in real-world mobile VoIP. It hasn't happened, and so one of my predicted scenarios is now happening - Skype, Google and others will take the lead, not the operators. The operator community's over-focus on slow-moving standards like IMS and VoLTE/MMTel allows swifter alternatives to gain a foothold.


In 2008, I pointed out that while embedded-3G notebooks and netbooks seemed to be "elegant", the business model and economics didn't stack up. Users prefer the flexibility of dongles (which can be prepaid as well as contract), PC OEMs don't want to wear the cost of a module which costs a sizeable % of the device gross margin, and retailers would much rather stock a 3-inch stick than a large laptop box in their store-rooms. Today, only a small % of laptops have 3G built-in, and only a small % of those are actually activated ("attach rate").

My predicted timing on femtocell market evolution has been pretty decent as well, despite more bullish forecasts from some of my peers. From early 2008 "Some niche success, but practicalities will mean it's H2'09 or 2010 before massmarket deployment."

One topic where I have to admit defeat is in my effort to get the telecoms industry to abandon the term "Over the top" (OTT) to refer to Internet or other access-independent service providers. I still think it's a stupid and derogatory term for companies that should be considered as respected and equal peers, or potential partners/customers. In my view, this attitude is symptomatic of the problems of the traditional telecoms industry today. It's also utterly hypocritical, given that virtually every operator is developing its own portfolio of OTT-style services. Meanwhile, there is a larger threat emerging from "under the floor" providers such as wholesale networks or vendor-outsourced infrastructure.


The next 1000 posts

I'm obviously not going to go through every post I've made to date, and it's certainly possible to find more examples of things I've got wrong. But on balance, I'm pretty pleased with the calls I've made - though somewhat saddened that I have not managed to stop some of the more predictable mistakes.

Going forward, I can see other imminent issues with technology and business models. I think that current forms of mobile video optimisation are likely to face severe push-back from regulators, customers, content companies and competitors. There is a huge amount of wishful thinking about "monetising" and "personalising" services based on the network trying to decode application flows and treat them differently. They won't work - the network doesn't and cannot understand applications from a user's perspective, and they are inherently game-able.

The industry still isn't thinking about the big-picture impact of Moore's Law (hints: intelligence moves to the edge, while applications-oriented standards reduce in importance as inter-working boxes improve in capability). There is also too much legacy thinking about links between access and service - operators should be spending more effort on creating their own OTT-style services, and less on vertical integration. There is still a "them and us" stance between traditional operators and new incumbents such as Google and Facebook. They are competitors, yes - but also peers, and equally deserving of customer respect even if they do not own (or pay for) physical networks.

I hope that my next 1000 posts help to flatten out the telecoms hype curve in coming years. I'm intending to continue giving early warning of avoidable problems - and highlight new opportunities that have not been addressed. I will call out bad ideas - or ineffective companies - even if they are my clients. And I'll try and add a dose of humour, irreverence and fresh air. And maybe even another Pythonesque satirical post....


The sales pitch

While I enjoy writing and being opinionated anyway, the main reason I write this blog is to drive consulting & advisory business for my company Disruptive Analysis and its partners. The blog posts illustrate areas of knowledge and expertise, as well as the type of research, critical thinking and challenging stance I employ.

Much of Disruptive Analysis' consulting work involves critiquing business plans & propositions, or helping firms find new addressable markets and business models that fit with their capabilities. Often, I will "stress test" ideas against plenty of possible "gotchas".

Similarly, my published research tends to focus on contrarian themes - there's a ton of larger research houses looking at mainstream and uncontroversial topics and I see little value in adding my own "me too" reports.


If you find this blog interesting & useful, then please get in touch with me. As well as in-depth consulting assignments, I also do more free-form brainstorm workshops and public presentations.

Email information AT disruptive-analysis DOT com

Monday, February 21, 2011

MWC 2011 round-up: Smartphones, Policy/Offload and VoLTE/RCS

I've now had a chance to gather my thoughts about last week's Barcelona Bunfight. I can't say I enjoyed the trip this year (having only grudgingly decided to go right at the last minute), but it was nonetheless productive and informative. That said, it still must have been better in Barca than it would have been in the other 2013-2017 MWC candidate cities of Paris, Munich or Milan.

For me, there were three standout themes:

- Handset OS's and ecosystems, especially the rise of Android, and the Nokia/Microsoft fall-out
- Ongoing discussion about the 3G/4G data traffic "problem", and how to add capacity, manage traffic and hopefully generate new revenue opportunities
- The future of personal communications, especially around VoIP on 3G/4G networks and the evolution of social networks and messaging.

Of course, there was a ton of other stuff going on as well - such as a lot of hype about mobile money / payments, the reality-distortion field around NFC, zillions of identiclone tablets, assorted fluffy apps/content things, and an absolutely bubble-tastic fleet of private jets at BCN airport.

Android gave everyone something to talk about & play with, while conveniently ignoring that the people with money (and especially discretionary technology income) still prefer Apple, while BlackBerry seems to be winning the hearts and minds of the next generation better than either. Nevertheless, the Android zone in Hall 8 was pretty much a full-on party, and maybe free smoothies can start to catalyse a sense of "cool" around the brand over time.

For me, perhaps the most interesting thing about Android isn't the high-end but the opposite. The sheer volume of designs and OEMs involved is driving the inevitable downward pressure on phone prices and margins. We can't be too far away from having a "good enough" smartphone at $100 price points - which then opens the market up to the vast prepay subscriber base. It also makes it harder for the operators to convince users to go for long contracts with subsidised devices - who's going to sell their soul for a 24-month contract (and a possibly locked-down device), just for the benefit of a $100 loan?

I'm not going to re-hash all the arguments about Microsoft / Nokia (apart from anything else, my input is going into something soon to be published by my friends over at Telco 2.0) but it was certainly on everyone's lips throughout the week. Overall, I'm cautiously positive, although the killing of Symbian seems to have been done a bit too abruptly. There's certainly going to be an uncomfortable transition period before any WP7 Nokia devices ramp up to take its place, plus the company seems to have lost an awful lot of developer goodwill. On the other hand, not dashing after the unproven non-Apple tablet space immediately seems very wise. My view is that there's going to be at least a year of disappointments around Android / WebOS / whatever tablets, and Nokia/Microsoft might be better off waiting to pick up the pieces in 2012 after the inevitable short-term bloodbath.

I stopped off to see HP and its new WebOS phones & tablet. Seem to be nicely engineered, although without *quite* the level of UI intuitiveness I might have expected, but I suspect that would change with practice. The larger phone with the QWERTY was quite appealing - the sort of thing I'd consider replacing my iPhone with next time, as it has a hint of individuality about it. In fact, I thought much the same about some of the WP7 devices as well.

(With impeccable timing, my iPhone black-screened and refused to restart, while I was standing at the Microsoft booth. Either the iPhone has a hidden "tantrum" feature triggered by detecting potential disloyalty from its owner, or else the guys from Redmond have special competitor-disruptor rays mounted on their stand).

One of my largest research themes in the last year has been around mobile data traffic management. It was at MWC in 2010 that I realised quite how many silo solutions there were - everyone had an answer to the "data tsunami" generated by dongles and smartphones, but there was no consistent approach to stitching together the various bits. Various flavours of offload, optimisation, policy, charging, connection managers, protocol tweaks and assorted other were around, but there was a lack of any "holistic" approach to blending these intelligently. I published a research note on this theme last May.

Since then, I've been continuing to see the evolution of the space, which has been fuelled by many operators' short-term needs for a "quick fix" to their data problems, coupled with some longer-term strategic thinking. I've been pretty vociferous in challenging the claims made for video optimisation as "the answer", in particular where it's done way back in the network without any decent awareness of the radio. I also have a lot of doubts about so-called "application-based policy", whereby operators can supposedly create "personalised" mobile data plans which include/exclude/prioritise specific traffic types or web destinations.

This trip to Barcelona enabled me to catch up with some of the latest developments around sub-topics such as offload and connection-manager software on devices, which I see as more strategic for overall traffic management than core-network heavy machinery. I was also struck by the fact that the largest believers in the "holistic" approach are actually the largest traditional network vendors, not some of the startups.

Ericsson's deal with Akamai looks truly important here, as in theory it should be able to combine video optimisation / caching with the ability to tweak policy right the way down to the scheduling algorithms in the base stations. All the boxes sitting in the core network or out on the Gi interface have a critical flaw - they either don't understand the radio domain very well (eg is the user temporarily out of coverage, rather than in a congested area?), or else aren't in a position to "enforce" anything sufficiently granular to deal with the problem. Alcatel-Lucent is also tying various bits of its policy and content portfolio together - it bought CDN vendor Velocix last year, and also has some probe/DPI cleverness in the RAN.

More importantly, getting a firm like Akamai involved in video optimisation and policy is important, because unlike the operators, content publishers actually trust a CDN not to mangle or degrade video content without permission. Unnecessary transcoding or compression of video, without the awareness or permission of the producer, is extremely unpopular. While "in extremis" it may be necessary in times of congestion, it would still be better done with the involvement of the originator, not by the operator's network operating autonomously. I'm expecting to see some smarter content companies put notifications in their apps that monitor when telcos are "fiddling about" with their traffic, and inform the user of who is to blame if "artistic integrity" is compromised.

Among the policy & DPI vendors, I was particularly impressed by those that are offload-centric (eg Bridgewater), rather than those that are app-centric (eg Sandvine). I also saw some neat on-device software from the likes of Roke, which also optimises for cost/battery life as well as radio bearer availability.

It's time for an over-generalisation with a large kernel of truth here: network people don't *really* understand applications. They don't understand how users perceive applications, they don't understand how apps evolve and interact, they don't understand the limitations of their own boxes, and they don't understand the difference between an application and a traffic flow.

It's not their fault, to be honest - it's more the paucity of the English language for helping us distinguish between different grades of "things that run on top of platforms". Everything is an "application", from the silicon to the ecosystem as a whole. The mobile industry now has a layer-cake of different platform tiers and applications, and it's not the network vendors' fault that the app-stratum they can watch isn't as useful as the semantics would have them believe.

I lost count of the number of people suggesting that operators could have tiered services, say with one mobile broadband package optimised for "social networking". The descriptions of a theoretical "Facebook data plan" I heard from a few just don't stack up. For example, none of them had a good answer when I asked what they'd be able to do now that Facebook is moving its site to encrypted SSL. Nor did they have a good answer about whether zero-rating Facebook traffic would include web links shared by friends, which are displayed *inside* the Facebook app on a smartphone. Or what would happen if the June update of the FB app added something new, like video.
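The SSL point can be made concrete with a toy sketch. Assume (hypothetically) a policy box that zero-rates traffic purely on the TLS SNI hostname - roughly all it can still see once payloads are encrypted. A link shared by a friend but rendered *inside* the Facebook app points at some other host, fails the match, and gets billed. All hostnames here are invented for illustration:

```python
def zero_rated(sni_hostname: str) -> bool:
    """Hypothetical zero-rating rule: a flow is free if its TLS SNI
    hostname belongs to facebook.com. With encrypted SSL, the SNI is
    about the only application hint left visible to the network."""
    return (sni_hostname == "facebook.com"
            or sni_hostname.endswith(".facebook.com"))

# Two flows generated by the *same* in-app user session:
feed_is_free = zero_rated("www.facebook.com")        # the news feed itself
shared_link_is_free = zero_rated("newscdn.example.com")  # a friend's shared article
```

The second flow is part of the "Facebook experience" as the user perceives it, but the network has no way to know that - which is exactly the application-vs-traffic-flow confusion described below.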


The other theme I was following at the show was the evolution of personal communications, and especially the role of operators and other participants. Various of my clients have been asking me to advise on "the future of voice" and new business models which take account of fully IP-ised networks like LTE. Much of the discussion is around new voice apps and revenue streams, and especially the growing distinction between "voice" in general and "telephony" as merely one specific voice application.

As expected, we heard a fair amount of noise about VoLTE for voice on LTE, as well as the news that T-Mobile was dropping its cherished advocacy of VoLGA. There were also quite a few companies quietly pitching non-IMS approaches to LTE VoIP, which I think have a strong chance of adoption. I'm extremely skeptical about the suggested timelines I've heard for VoLTE, which seem to be driven by the needs of both PR and the desire to foster consensus through inertia (or at least collectivising the risk of failure). In my view, the likelihood of getting normal handsets into the market, running VoLTE for "primary mobile telephony" with good quality, battery life & mobility, before 2013 is very slim indeed. Yes, even on Verizon. Nevertheless, I do agree that VoLTE will eventually happen, for some operators in some contexts - although it will certainly never be ubiquitous. My post-MWC views are essentially unchanged since the pre-MWC post & lengthy discussion in the comments here.

If VoLTE doesn't deliver, I wouldn't be surprised if VoLGA gets reincarnated at some point - probably with a face-saving rebrand. I'm sure T-Mobile knows this as well. (I tried to come up with an amusing pun for a future DaNUBE or THaMeS acronym for LTE Voice, but I haven't managed it yet - suggestions welcome).

On the messaging and social-network side, MWC included a lot of discussion about integrating incumbent platforms such as Skype and Facebook, as well as new telco-centric niche efforts such as the GSMA's RCS-e (a new revised version). My views on RCS have been pretty consistent since its announcement 3 years ago, so it's good to see that the GSMA has finally ditched the focus on presence, because it kills the phone battery and floods the network with signalling. I'm still wading through the new specs, but despite the high-profile announcement of future operator support and a few demos, I still think it is too little, too late.

And there's still no involvement from the key players in messaging and social communications such as Facebook, Apple, RIM or Skype - the operators need to stop the ridiculous "them and us" stance and prove their credentials in social communications, interoperating via the web, and gaining viral adoption because users find the services valuable. For RCS to succeed there also needs to be a firm commitment to a "freemium" business model, and a route to getting away from the legacy of the "phonebook" - a hundred-year-old way of viewing your social affiliations, that is rapidly becoming obsolete. Anyone wanting a full critique of RCS should obtain a copy of my report from the end of last year, available here.

Friday, February 11, 2011

Nokia + Microsoft: is there an effective third silent partner, Qualcomm?

I'm at the Nokia strategy event in London today, listening to Stephen Elop discussing the Microsoft deal & wider issues around the company's strategy for smartphones, mobile phones and "future disruptions".

There's a ton of angles on this, which I've been dropping comments about on Twitter, and which are well covered elsewhere.

But I'm wondering if there's another angle here in the discussion of "the Windows Phone ecosystem". All of the first batch of WP7 devices have been based on Qualcomm hardware and specifically the Snapdragon processor.

Nokia had a long-running legal spat over patents with Qualcomm, which was resolved in mid-2008. Added to the traditional antipathy of the GSM community towards CDMA, this meant that Nokia firstly never developed Symbian devices for the broader US market, and also missed Qualcomm's increasing competitiveness in creating 3G basebands and application processors. A year ago at MWC, Nokia announced the first Snapdragon-powered phone would ship by end-2010. It didn't happen. The N8 uses a Texas Instruments apps processor, and the rumoured N9 hasn't appeared.

It's also not clear what's happening now to Nokia's deal with Broadcom for 3G chipset supply - or what happens with its future LTE devices.

I suspect that Qualcomm may - explicitly or implicitly - turn out to be a big winner with today's announcement. It certainly seems likely that for Nokia to get its first WP7 phones to market ASAP ("focus on speed"), it will go down the proven route to get devices out as fast as possible.

The interesting thing here is that the big Q could therefore end up as a pivotal player in the Apple, Android and WP7/Nokia ecosystems - although in the iPhone it's on the baseband side rather than the apps processor.

The big question mark is around the future role of Intel in mobile. It is clearly now being left out in the cold (again), as Meego is effectively being mothballed, at least from Nokia's point of view. It will be interesting to see its next move - partnership with Samsung on Bada, or RIM on QNX, perhaps? An acquisition of a company like MediaTek? Or a wholehearted move to support WP7 and Android however possible?

Thursday, February 03, 2011

NFC will be about free "interactions", not monetised transactions

One of the mobile operators' big problems at the moment is their inability (or unwillingness) to deal with Freemium-style business models. The sheer "weight" and complexity of operator infrastructure and bureaucracy makes it ill-suited to managing events that are unbilled and non-monetisable. This is especially true in circumstances where free calls/sessions/events massively outweigh paid ones. It's a problem exacerbated by many vendors selling them boxes or software based on usage tiers.

This is almost certainly going to be a problem for operator appstores. A large % of Apple downloads, and a huge % of Android downloads, are for free applications. People want clients for Facebook, Twitter or a million advertising- or brand-led apps. The implicit (and irretrievable) costs of managing, uploading, storing and delivering free applications are likely to be a significant element of the business model for Apple, Google and others. It will be interesting to see how telcos cope with this challenge.

This problem is likely to arise again with NFC. I suspect that once Apple and Google and RIM and Nokia expose NFC/contactless APIs to their developer communities, there will be a huge rise in the number of transactions that don't involve any form of payment. Many of these will not need the "secure element", which is the focus of much of the political wrangling around NFC.

If someone walks out of a restaurant, and taps their phone on a theoretical Facebook-branded "Like" terminal on the way out, there isn't really a need for an uber-secure back end system. Same deal if I tap my phone at a gig, to get added to a band's mailing list. Or a million other applications and use cases.

The net result is that an overwhelming % of all NFC connections will probably be non-financial. Not mobile payments. Not mobile ticketing with a pseudo-Oyster. Not peer-to-peer money transfer. They will be inter-actions, not trans-actions. Not only that, but these apps will appear much faster, assuming that readers are affordable and easy to use (more on that in a moment).

I know I've been very skeptical about NFC in the past, but that's because the focus was on payments or convoluted operator-inclusive value chains. Not just simple "tap to do stuff" apps - basically similar to 2D barcode use cases but much simpler and far less geeky. In other words, it finally looks like we'll get offline applications for NFC - something that's been key to virtually every handset innovation in recent years.
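To show just how lightweight these non-payment interactions can be, here is a minimal sketch (with an invented example URI) of an NDEF URI record - the standard NFC Forum payload that a "tap-to-like" terminal could read or write. The point is that there is no secure element anywhere in the path; it's a handful of bytes per the NFC Forum URI Record Type Definition:

```python
def ndef_uri_record(uri: str) -> bytes:
    """Build a minimal single-record NDEF message carrying a URI
    (NFC Forum URI Record Type Definition, short-record form)."""
    # Common URI prefixes are compressed into a single identifier byte.
    prefixes = {"https://www.": 0x02, "http://www.": 0x01,
                "https://": 0x04, "http://": 0x03}
    code, rest = 0x00, uri
    for prefix, c in prefixes.items():
        if uri.startswith(prefix):
            code, rest = c, uri[len(prefix):]
            break
    payload = bytes([code]) + rest.encode("utf-8")
    # Header: MB|ME|SR flags + TNF=0x01 (well-known) = 0xD1,
    # type length 1, payload length, type 'U' (0x55) for URI.
    return bytes([0xD1, 0x01, len(payload), 0x55]) + payload

# Hypothetical "tap to join the band's mailing list" tag content:
msg = ndef_uri_record("http://www.example.com/mailinglist?band=1234")
```

A whole interaction is well under 100 bytes - nothing that needs banks, SIM-based security or an operator in the loop.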

All of which makes the operator business model around NFC rather tricky, in terms of justifying any additional subsidy or promotion, or somehow taking a margin. All the complex mobile-money, transportation ticket and government projects are huge systems integration and IT minefields, likely to need $$$ being spent with IBM or Accenture and taking years to implement.

The big question is around readers and how they are connected. For big complex projects (eg integration into retailers' point of sale terminals), telcos may have a role to play. But for the Facebook-style touch-to-like concept I mentioned above, it should be possible to get USB-connected standalone readers hooked into a PC for a few 10's of dollars.

I have a hunch about this. Sooner or later (sooner?) Apple will put not just the NFC chip into phones, but also the *readers* into next-gen iPads. I've already had an experience in a restaurant where the host came around with an iPad for people to enter their email addresses for the mailing list. Touching a phone to a tablet (or to a USB-connected reader on a PC) should be a no-brainer, especially if it allows the user to decide whether to provide specific information sets (name, email, phone number.... or just a Facebook ID or even a pseudonym).

The question will then be how operators manage to regain relevance for their role in NFC transactions (which will come later, if at all), when the first trillion NFC interactions will have bypassed them.

My guess is that Apple and Google will (initially at least), focus on using NFC as just another tool to entrench their developers and extend their ecosystems. Apple isn't especially bothered about really monetising apps - its own profit on the Appstore is peanuts - it just uses it as a way to add utility to its hardware and sell more units. If NFC-capable iPhone 5's and iPad 2/3's help it sell another 50 million units @$300 gross margin a time, it really doesn't need to care about slicing 2% off of the handful of financial transactions it might facilitate. And Google, Facebook or others could subsidise readers for a variety of advertising / marketing purposes.
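A quick back-of-envelope check of that comparison, using the post's own illustrative numbers (50 million units, $300 gross margin, a 2% transaction cut - not real Apple data):

```python
# All figures are the illustrative assumptions above, not real data.
units = 50_000_000
margin_per_unit = 300                      # $ gross margin per device
hardware_profit = units * margin_per_unit  # $15bn from hardware

fee_rate = 0.02                            # 2% cut of payments facilitated
# Payment volume needed for transaction fees to match hardware profit:
breakeven_volume = hardware_profit / fee_rate
```

In other words, Apple would have to touch three-quarters of a trillion dollars of payment volume before a 2% cut rivalled the hardware margin - which is why slicing fees off transactions is an afterthought for it.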

So what should operators do about this?

One thing is to have some "skin in the game" in terms of interactions as well as transactions. That will mean acting just like all the other developers and exploiting the NFC APIs on all the various handset platforms. Potentially, they could act as an interaction clearing-house, or even add value through other internal APIs and assets. They should NOT assume that the key identity layer is around the SIM card, but should look to develop OTT-style applications that can be downloaded to any handset running on any operator's network.


I have other thoughts on this as well, but I'll reserve those for other channels and paying clients. This is my own freemium strategy....