
Thursday, April 29, 2010

Most commentators don't get RIM's strategy

I've seen a huge amount of comment over the past week about RIM supposedly being "behind the curve" on the latest version of its OS release - especially about it playing catch-up with Apple's huge developer base and app store, as well as Android's rapidly-increasing developer mindshare.

I absolutely agree that BlackBerry has suffered from a historically sub-par browser experience, and has certainly had less app support from some of the "sexier" (or more trivial) developers.

But much of the analysis I've seen has overlooked something simple but exceptionally important - the reason why consumers [at least in some segments] seem to be buying BlackBerries. In particular, it overlooks the killer app.

No, not email. BlackBerry Messenger, BBM - the evolution of PIN-to-PIN messaging.

While you can get to most social networks or messaging services on any device - Facebook, Twitter, MSN and so on - BBM is unique to BlackBerries. And, perhaps surprisingly, it is becoming viral within certain groups. I certainly notice it among teenagers and students - but speaking to other people, it's also used across widely differing demographics. It's actually the exclusivity and silo nature of the service which *adds* value to it - completely contrary to the usual mobile industry hype about combining social networks into a single client application. I've been saying for some time that there is little rationale for aggregating social networks into a single interface on a mobile - and this appears to prove me right.

I've also said before that I'm unconvinced that zillions of apps are *that* important for the real massmarket of smartphone users, beyond a few must-haves like a decent Facebook client. I just don't believe that a billion people will follow in the path of the geek evangelists and load up their phone with pages of application jewellery.

I use just 5 apps on my personal iPhone, 3 of which are things like RSS that ought to have been in the device OS to begin with anyway. I have no particular interest in looking for others, unless I have a very specific urgent need. The browser (and to a degree, browser-resident apps) is a different story - I think it will be pretty essential for far more users than a long tail of native apps.

So the decision for some customers is "Do I want a fully-loaded appstore, at the risk of missing out on the gossip from my friends on BBM?". Quite a lot of younger users are going for the gossip approach - as well as the cheapness of devices that fit with their preference for prepaid tariffs rather than contracts, something that the other smartphone suppliers are not really yet addressing.

The problem is that for industry observers (and developers) the idea of someone preferring a basic, text-based IM client over whizzy graphics and multi-touch UIs is very hard to grasp. But unfortunately, there's precedent here - SMS has huge "social value" even if its design "elegance" and sophistication is minimal.

In fact, BBM also has the rather more disruptive aspect of replacing SMS with a sometimes-cheaper (and more exclusive) alternative. Why do you think RIM has just introduced a new version of the Pearl 3G with a normal featurephone numeric keypad, suitable for the billions of users who are happy with multi-tap entry? BBM appears to be the mobile successor to MSN as the default messaging platform for a large swathe of the demographic landscape (and, anecdotally, somewhat skewed towards females as well).

Maybe the end-point will be leaving the SMS client for spam adverts and boring messages from parents or "that guy you met in the pub".... all the important and high social-value messages from friends could get siphoned off.

At the moment, I'm certainly not predicting a huge polarity switch in smartphones back from touchscreen to QWERTY, solely on the basis of BBM. But it is definitely a wildcard, especially among youth - Apple has abdicated the global 75% of users who use prepay and usually non-subsidised handsets.

So among the analyst, journalist and blogging classes, I can understand why RIM's perceived lack of "shininess" has led people to downplay its position against its peers - but I continue to believe it is a more formidable competitor than many think.

Wednesday, April 28, 2010

IMS's role in mobile will remain minor. RCS is dead.

OCT 11 2010 NEW REPORT AND BLOG POST ON RCS HERE
So, a day spent yesterday with the IMS part of the operator and vendor industry at the IMS World Vision event in Barcelona. The event continues for a couple more days, but I'm now back in London at the Telco 2.0 Brainstorm event instead.

Wow. It's been quite a while since I heard the level of introspective, defensive groupthink I got bombarded by. Unreconstructed, old-world telco network views, perhaps occasionally spiced with a light flavour of Web 2.0. I lost count of the times I found myself shaking my head at how little some in the industry "get" what's happened. A highlight was Telefonica claiming that IMS was needed to "compete against Internet service providers" in basic communications services like voice, and will enable them to "really fight".

Which contrasted rather a lot with some of the vendors' more pragmatic views that competing with Facebook, Skype et al wasn't the point of IMS today - that train has left the station, and instead IMS should be about assisting the Internet players and providing some form of glue to interface that world with the phonebook. (And of course, supporting VoIP on LTE or fixed broadband).

To me, the legacy thinking is summed up by the continuous usage of the word "terminal" throughout the day. It's like the network folk are stuck in a 1980s timewarp, back in the days of mainframes and green-screens. Let's forget that there are 1GHz Snapdragon-enabled devices out there, which are more than capable of gaming the access and core networks of the operators. Even more astonishing was the assertion that "the Cloud" is the same thing as the [operator] network.

Right, time to stop equivocating on one of IMS's main problem children. I've been writing about the lack of IMS-capable mobile phones for over 4 years now, criticising RCS for more than two years, and it's now appropriate to nail the coffin lid shut. RCS is dead. RIP. There's no business model, no justification for the battery drain, no clear plan to get clients onto the bulk of phones sold through non-operator channels, no prepay story, no MVNO story, no reason it should generate revenue uplift, as it just gives users access to a few websites that were free anyway. It might well cannibalise the sale of data plans by reducing the use of the browser and widget frameworks. It is also near-useless until it becomes cross-network capable - which means it just needs one or two operators in a given market to say "no" to completely destroy its theoretical value.

That said, although it's dead it's still twitching a bit. We'll probably see some half-hearted attempts to pretend it can be resuscitated, in France and maybe Spain or Sweden during late 2010 or 2011. Japan and Korea might try some almost-RCS offerings. And then it will disappear. It reminds me of UMA in 2007.

Mind you, my views on RCS were positively benign compared to those of Paulo Simoes of Portuguese operator TMN at the conference. He launched the most comprehensive, excoriating annihilation of a hapless technology I've ever heard. He pointed out a large herd of elephants in the room - the paucity of use cases, the excessive power consumption, poor usability, the lack of "sexiness", the pointlessness and clumsiness of filesharing, the downsides of presence and availability, the lack of enterprise focus, the reliance on MSISDN and more besides. Mixing metaphors horribly, the elephants collectively make it a "lame duck". He challenged vendors to give him RCS for free upfront, saying he was willing to pay a volume-based usage fee if it was actually used. Several delegates apparently complained to the organisers that his witty and surgical evisceration of RCS was "too negative" for an event they clearly hoped would be a happy-clappy evangelical conference of consensus.

The other big theme was voice on LTE, especially in VoLTE (GSMA / IMS) guise. I'm somewhat less negative about that, for two reasons. First, the idea of a barebones version of voice-on-IMS for mobile makes sense, in the same way that using IMS for NGN VoIP on ADSL does - it's a straightforward PSTN / PLMN replacement. VoLTE doesn't depend on presence, doesn't mandate messing about with video or filesharing or the pretence of being a social network - and, crucially, might be deployable in a fairly silo'd "in a box" version.

Second, unlike RCS, there is no "prisoner's dilemma" requiring everyone to do it - one operator in a country could deploy VoLTE with IMS, another could use VoLGA, a third could use Skype and a fourth could stick with circuit-switched voice via HSPA+. Interworking via gateways would be messy, but the voice industry is used to that already.


That said, I still think that VoLTE's immaturity is one of many factors that will delay LTE as a mainstream technology to 2015 and beyond.

Tuesday, April 27, 2010

Telcos = Google advert affiliates?

I'm at the IMS conference in Barcelona today. My bulletproof vest has already protected me from a couple of shots from large vendors. It's going to be an interesting day.

Most interestingly, the first presentation was from Google - specifically the YouTube rep for Southern Europe. She started by saying "this isn't about IMS", and carried on by showing a slide of an Alcatel-Lucent video (hosted on YouTube.... the subtext being fairly obvious). It covered a broad set of cool innovations within YouTube, as well as the advertising / monetisation model. Basically, it pitched YouTube as a hugely valuable and important service platform, with numerous usage cases and features, as well as a plethora of revenue opportunities.

My question, which she deflected, was why she was at the event at all. What relevance did it have, either to IMS or to the audience? Then I realised that the audience held a good number of services/apps people - not necessarily IMS fundamentalists, but people just looking for ways of increasing revenues at manageable risk levels.

My take is that Google is offering a (smallish) olive branch to operators at the moment - basically a revenue-share on advertising, where operators help them extend the reach of their existing properties. This is behind Android, and I expect the subliminal messaging behind the presentation was to convince operators that carrying (and indeed promoting) YouTube is in their own interest.

Certainly more so than whingeing to the European Commission about imposing a tax on Google's cleverness, which is the current strategy.

There's a word that Google uses to describe partners with which it shares ad revenue, when they help improve its reach.

That word is "affiliate".

Edit: The more I think about it, the more I realise how clever this approach actually is. It is basically saying to the operators "You can become a *happy* pipe, by sharing in our success". Push more and better quality YouTube, or Gmail, or Google Maps..... and it generates more advertising, and therefore more rev-share for the operators. And it doesn't require lots of complex core network capex with uncertain returns.

And it doesn't have to be "dumb" - adding smartness to the network adds even more value, enabling Google to better target its adverts (more rev-share!) and provide higher-quality user experiences (HD YouTube = more expensive ads = more rev-share!), and reducing churn means more users/viewers (more rev-share!).

There are many, many companies making money as Google affiliates - so why shouldn't telcos just be seen as a special case?

[The above is me playing Devil's Advocate, by the way. I'm not expecting many operators to find this a particularly palatable world-view - even though it may ultimately make a lot more sense].

Wednesday, April 14, 2010

Mobile websites

I am getting increasingly annoyed with mobile versions of websites. I really wish there was a way of configuring handset browsers (I'm using Safari on iPhone here) to permanently toggle to the full PC version of sites, if necessary by spoofing the ID it gives the server.
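The "spoofing" idea is straightforward in principle, since servers typically decide between mobile and full versions by sniffing the User-Agent request header. Here's a minimal Python sketch of presenting a desktop identity; the UA string, function name and URL are purely illustrative, not any real browser setting:

```python
from urllib.request import Request

# An illustrative desktop-browser User-Agent string; any current desktop UA
# would do. Servers that branch on this header will serve the full site.
DESKTOP_UA = ("Mozilla/5.0 (Windows NT 6.1; WOW64) "
              "AppleWebKit/534.30 (KHTML, like Gecko) Safari/534.30")

def desktop_request(url):
    """Build a request that claims to come from a desktop browser, so
    User-Agent sniffing serves the full site rather than the mobile one."""
    return Request(url, headers={"User-Agent": DESKTOP_UA})

# urllib.request.urlopen(desktop_request("http://example.com")) would then
# fetch whatever the server gives desktop browsers.
```

This only works where the server relies on the header alone; sites that redirect based on screen size or cookies need more persuasion.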

Many of my favourite sites have useless 'light' mobile versions, and even if there's a link or switch to toggle the mobile version to 'off', it seems to revert the next time rather than using cookies or whatever to remember my preference.

Personally, I want my Internet experience when mobile to be as close as possible to what I get on my desktop. It may not be a universal view, but I cannot believe I'm alone in being irritated by site designers assuming I want a lousy half-cut version of their pages.

Yes, I can see the reason to "mobilise" websites very slightly - perhaps some tweaks because the site is expecting touch rather than mouse, or browsers on devices with no "end of page" key. But complete redesigns are an anachronism, based on an expectation of 2G networks, limited browsers and small screens, at a time when we're moving to large screens, sufficient processor speed, and zoom/pinch/drag tools for navigating a full page.

I strongly suspect that much of this nonsense is down to "digital media agencies" and their ilk, insisting that their clients double their web spending to create mobile-specific sites. If I was running a larger site, I'd certainly make sure I got three or four separate opinions before running down this particular blind alley.



Tuesday, April 13, 2010

Conference season! Telco 2.0, Open Mobile and others. Discounts available....

Looks like April and May will be Powerpoint central for me.... numerous events that I've been asked to speak at or help develop.

Unfortunately I won't be at next week's eComm in San Francisco, but if you're interested in the most cutting edge innovations around the whole "next communications" arena, from cloud voice to social networks to augmented reality, it will be there.

The big event of the month is undoubtedly the Telco 2.0 9th Brainstorm here in London. I'm going to be running sessions on Mobile Broadband Network Economics, and telcos' ability to gain additional revenue streams via New Devices.

In May I'll be at the Open Mobile Summit from May 26-27, chairing the stream on "Open Mobile Networks".

For any of these, if you tell 'em I sent you, there should be a discount available on delegate fees. Email me for details if you need them.

In addition, I'm also at upcoming events on
- Telecom Cloud Services
- IMS (I'll be wearing my bulletproof vest)
- LTE
- Indoor Wireless

Get in touch if you'd like to arrange a meeting to discuss Disruptive Analysis consultancy services and published research at any of these - information AT disruptive-analysis DOT com.

Oh, and lastly, there's an event I'm speaking at next week in a personal capacity, although it's organised by a name many in the mobile industry will find familiar: David Wood, former SVP at Symbian. The UK Humanity+ event is all about ways we may change both our species and our planet over the coming decades, from cognitive enhancement to anti-ageing to the "singularity". It should be an utterly fascinating day, and if you're in London I'd exhort you to come along.

Mobile social networking doesn't really deal with networks very well

For the last six months or so, I've been feeling that the mobile industry is looking at social networking in the wrong way, but I've been finding it hard to articulate exactly why - it's just been a feeling that something is a bit wrong.

In particular, I've been having a lot of doubts about the million different attempts to bridge between social networks, so you have a single "active phonebook", or "active homescreen" which aggregates email and SMS and Facebook and Twitter and Ovi and Gmail and Skype and Vodafone 360 and Orange On and Apple MobileMe and all the rest.

The usual metaphor is an icon or picture of a particular contact, with some way of showing all the various means of messaging them, getting status updates and so forth. In particular, there's usually an over-riding expectation that somehow the mobile phone address book remains the natural "hub" for all of this, either still resident on the device, or abstracted to the cloud for your supposed benefit.

To me, that doesn't gel. It's yet another way of trying to push "unified communications" rather clunkily onto something it doesn't fit, with a good dose of unfriendly "customer lock-in" as well.

Now I'm not a human interactions or usability expert, but I'd like to think I'm reasonably well-attuned to the ways in which people use both mobile and Internet communications. I can talk about "social value" and easily describe why stupid concepts like SMS-to-the-TV destroy the inherent, implicit value in a particular form of communications.

My thoughts are starting to resolve into a couple of separate areas here:

- Mobile phones - and especially the inbuilt address book - are great for person-to-person communications. But they tend to be lousy for orchestrating groups of people or events.

- The web tends to be good for free-form communications, but only between selected groups. With the exception of email, most web-based comms happens within rich but limited "islands" - chat rooms, fan pages, blogs and comments, IM communities, VoIP peers and so forth.

- A lot of the value in social networks involves the interaction of individuals with groups. Sending a party invitation to 50 people on Facebook with directions. Messaging your community of blog followers via RSS and so forth.

- Certain forms of communication integrate better than others. Emails have attachments and embedded links and are easy to "cc". Skype video and file-sharing. SMS for ubiquity. But *all are different* and risk losing their individual characteristics if they are blended.

- People like to keep their communities segregated by default. Work contacts, personal contacts, very personal contacts... but they are too lazy to administer that segregation actively. It's much easier just to have silo'd islands of acquaintances and groups, even if they overlap, because you don't have to *actively administer* membership. If you just know all your work contacts are on Skype, then you quickly learn not to use Skype for certain types of messaging - you don't change your status to "feels very hungover this morning".

- The mobile handset address-book metaphor is useless for things like group membership and events, as it's usually too oriented around a number and the specific "communications sessions" it's hoping to enable. This is especially true for IMS-based variants, where the whole underlying premise is creating billable events. Do you expect to be charged to "like" someone else's status, or to RSVP to an event invitation?

- Increasingly, messages are duplicated through different channels for notification purposes. Facebook updates are sent to email. Event reminders are sent by SMS as well. Do you want to de-duplicate your active, aggregated homescreen whenever any of your contacts does anything?
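That last point can be made concrete with a naive de-duplication sketch. This is hypothetical illustration, not any real product's logic: it collapses notifications on a (sender, normalised text) key, which shows both the idea and why it's fragile - real cross-channel duplicates rarely match so cleanly.

```python
def dedupe_notifications(notifications):
    """Collapse copies of the same update arriving via different channels
    (e.g. a Facebook event reminder also pushed by email and SMS).
    Keyed naively on (sender, normalised text); each notification is a
    dict with 'channel', 'sender' and 'text' keys. Illustrative only."""
    seen = set()
    unique = []
    for n in notifications:
        # Normalise whitespace and case so near-identical copies collide
        key = (n["sender"], " ".join(n["text"].lower().split()))
        if key not in seen:
            seen.add(key)
            unique.append(n)
    return unique
```

Feed it the same party reminder via Facebook and email plus one unrelated SMS, and only two notifications survive - but change a single word between the two copies and the duplicate slips through, which is exactly the aggregated-homescreen problem.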

I feel that I'm a bit closer to having a holistic view on how all this should work, but I'm not there yet. It is interesting that some of the most significant growth and behavioural trends (and viral buzz) are in new, completely-isolated islands like BlackBerry BBM, rather than aggregated platforms like 360 or Ovi.

Comments very definitely welcome....

An interesting angle on mobile broadband offload

Since the demise of mobile TV, the word "multicast" has pretty much gone out of the wireless industry. The idea of increasing network efficiency by stopping the same content being transferred multiple times has pretty much dissipated. Yes, there is some discussion of caching certain files at various points in the network, but content would still need to be transmitted multiple times over the radio.

Obviously, side-loading between devices is an option, either via Bluetooth or memory card, but it can be a pretty clunky experience, and obviously iPhones don't have card slots anyway. It's also not good for realtime sharing.

The LoKast application from NearVerse looks like it might have some interesting potential. Although the initial use case has been local sharing of music (eg promos at SXSW in Austin), there appears to be a more general option to use it as a way of "offloading" mobile broadband traffic *to other handsets*. I'm pretty convinced that peer-to-peer connectivity among mobile devices is going to be a disruptor, whether it's MiFi-type products or virtual hotspots like Joiku's.

The CEO's phrase "unifying carrier and short-range wireless networks into a holistic, optimized system" in the funding announcement is an interesting comment that is evocative of various forms of cellular/WiFi (or in this case cellular/Bluetooth) hybrids.

Automated local sharing of content has some interesting implications for the UK's new and much-hated Digital Economy Bill too.

The secret social network success: RIM now advertising its BBM messenger direct to consumers

I've been researching the BlackBerry business model quite deeply recently, including a detailed briefing document for Telco 2.0's executive briefing service.

My general view is that the potential of the end-to-end RIM ecosystem tends to get underestimated by operators, for example in terms of two-sided revenue streams. I think that both App World and BES could be interesting platforms enabling operators to interact with enterprise customers, for example by generating revenue from APIs.

But it is the consumer side of BlackBerry that is still really poorly understood - particularly its growing adoption among youth. My own anecdotal experience in London is that ownership has two separate peaks in age - business users around 40 and consumers about 22, the latter predominantly female.

The sheer number of students with BlackBerries is unexpected to many - there seems to be an assumption that the iPhone would be the device of choice, while in fact RIM has a "secret weapon".

The proprietary BBM messaging service seems to have evolved into a new social network purely by chance - and it has gone unnoticed by most commentators. Yet it has suddenly become a quite exclusive club - pretty much acting as a mobile replacement for MSN, with what seems like viral uptake.

As usual, the social media industry gets hung up on newcomers like FourSquare and obviously Twitter, ignoring what looks like an old-school IM service. The mobile operators assume (wrongly) that people want all their social networks converged and bundled together with the handset phonebook. In fact, it makes perfect sense to have separate (and good) capabilities for Facebook, BBM, email and SMS in the same device.

Yet to my mind, BBM seems to have the most underground "viral" uptake of anything I see in the real world. Yes, I'm in London, which creates its own technology microclimate. But evidence suggests that BBM is also driving BlackBerry sales in places like Indonesia and Venezuela as well.

So it is with great interest that I noticed yesterday some saturation-advertising specifically for BBM on the Tube, on the platforms and escalators. Good to see RIM actually promoting it actively.

For all the hype about apps, it is wrong to underestimate the power of a messaging service or social network in device choice. Most apps appeal only to individual users. But messaging gets the n-squared factor. Give an average teenager or student a choice between a must-have app.... or missing out on their friends' gossip - and the gossip wins.
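The "n-squared factor" is just network-effect arithmetic: a standalone app's utility grows roughly with its user count, while a messaging service's grows with the number of user pairs it can connect. A quick sketch (function name is mine, not a standard):

```python
def potential_links(n):
    """Number of distinct user pairs a messaging service can connect:
    n*(n-1)/2, which grows quadratically with the user base."""
    return n * (n - 1) // 2

# 10 users -> 45 potential conversations; 20 users -> 190.
# Doubling the users roughly quadruples the potential conversations,
# which is why an exclusive messaging community compounds so quickly.
```

This is also why BBM's exclusivity cuts both ways: every friend who joins adds n-1 new potential links, but every friend on a different platform subtracts them.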

As always, two maxims hold true:
- Divergence is more important than convergence
- Communications is more important than content

A side-issue here is that RIM (unlike Apple, or in the past Symbian) has been directly targeting prepaid users with low-end devices. Given that the youth market outside the US tends to be 80%+ prepay, this is a completely open field that other smartphone platforms have abdicated (although Android is catching up now).

Monday, April 12, 2010

The incumbent telcos' attack on YouTube looks like a suicide pact

There is a right way and a wrong way to go about finding new business models for broadband services. Having just published a report jointly with Telco 2.0, I've been spending a lot of time delving beneath the surface of two-sided business models, and the opportunities for operators to make additional revenue from "upstream" companies like media and Internet properties.

Picking a fight with Google about YouTube traffic is categorically the wrong way to go about it.

Yet Deutsche Telekom, France Telecom and Telefonica are all trying to tweak the lion's tail. According to the FT, they're even suggesting that regulators could "supervise a settlement" if they can't persuade the Big G to hand over a slice of its advertising and application revenues.

Google's recent stance with China over censorship doesn't suggest that it is going to be happy playing that sort of game. But what bemuses me most is that the operators seem to be chasing after the wrong target. Google pays for its network connections like anyone else. That's the way the Internet works. It's not like phone calls, where there are originating and terminating ends to a connection. If end users start *uploading* traffic to Google's servers, are the operators going to be happy to pay for the traffic to be terminated in Mountain View?

There certainly are potential revenue-earning opportunities for the large telcos - either from Google or other players (governments, application developers, other operators, media companies and so on). There are already signs that two-sided models are emerging to serve specific device types, specific customer groups, or "third-party paid" data traffic. But for those types of services to work, the operator needs to add value above and beyond connectivity.

Some form of toll-gate is not a value-add.

Payment mechanisms, authentication, managed security, guaranteed SLAs, device management and numerous other options represent value-adds. Partitioning a broadband connection and giving absolute guarantees of speed, latency, jitter and so forth might qualify. But simply attempting to gain a "free ride" on Google's skillful aggregation of services will not.

My view is that it will be extremely difficult for operators to derive additional monetisation from services that terminate on PCs. Smartphones, TVs and other devices are different - and those are some of the areas where they should be focusing their marketing and executive firepower. It is possible that this is just an opening skirmish before YouTube starts trying to pitch full HD-quality video for living room TV sets, in which case there is definitely a discussion to be had.

But otherwise, it wouldn't surprise me to see Google start to think about charging operators for carrying its content: after all, as DT's CEO says "There is not a single Google service that is not reliant on network service"... which could well be inverted to say "There's not a single broadband user that isn't reliant on Google's service".

It would also be remarkably easy for Google to offer free advertising to a given operator's competitors - those less inclined to consider such forms of extortion. Of course, in a properly competitive broadband market, there should be no barriers to operators trying whatever strategy they like - as long as there are no unfair attempts to block subscriber churn.

If operators are really going to push for non-neutral Internet connections, are they also prepared to deal with the fallout from non-neutral Internet services as a response?

If you are interested in the new Broadband Business Models report, which identifies a multi-$bn amount of Telco revenue opportunity via much less-confrontational approaches, then more details are here. Or email information AT disruptive-analysis DOT com.

iPad impact on mobile networks likely to be negligible

OK, I'm making myself a hostage to fortune here, but my expectation is that the Apple iPad should be very low on the list of operators' worries when considering mobile broadband traffic patterns.

There's been a lot of speculation that it could cause additional problems above and beyond the iPhone and 3G dongle-equipped PCs, but I'm really unconvinced it's worth more than some cursory monitoring.

Firstly, even the optimists (I'm not one of them) expect the iPad to sell in far smaller numbers than the iPhone.

Secondly, not all of them (or even most of them) are likely to come with 3G modems. They're optional and not even on sale yet.

Thirdly, they are unlikely to be used for much of the "quick hit" on-the-go access to maps, email, Facebook and so forth that is characteristic of iPhones.

Fourth, they are being sold with the benefit of hindsight - and at least nascent offload / traffic management strategies. It wouldn't surprise me if operators adopt some very specific iPad policy and enforcement techniques.

Fifth, a proportion are likely to be used almost exclusively as "stay at home" tablet devices, plus the occasional trip out of WiFi range.

Sixth, most iPad owners are likely to have an iPhone, which will probably be used for the applications that need to be "always on" or are most frequently checked, and which generate much of the signalling load. For networks like AT&T's, it has been the iPhone's constant setting-up and tearing-down of data connections that has been at least as much of a problem as the sheer bulk of data downloaded.

Seventh, it lacks a camera, which means that uplink traffic may well be much lower.

The main variable is the potential of the device to be used to consume large amounts of video over the 3G connection. Frankly, in comparison, the download of newspapers or a few apps is trivial in data volume.

Thursday, April 08, 2010

RIP Guy Kewney

Sad news this morning that UK tech journalist Guy Kewney has passed away. I'd spoken to him, or met him at conferences and press events, on numerous occasions over the last 15 years or so.

He was witty, erudite, intensely knowledgeable and a totally unique character known by pretty much everyone in the UK IT and telecoms industry.

My condolences to his family and friends.

Thursday, April 01, 2010

The dangers of over-reliance on simplistic metrics

I wrote yesterday about the decreasing relevance of "$ per GB" as a yardstick for revenues or costs in mobile broadband.

Thinking about it, I reckon it's symptomatic of an industry that tends to live by snappy, marketing-friendly soundbites that obscure underlying complexities.

Simple messages are great for headlines, but can lead to wrong decisions if they become too entrenched.

A classic pair of errors in mobile has been the unthinking over-use of two basic metrics:

- Number of subscribers
- ARPU

Subscriber numbers have been simple to measure - largely because they map nicely to SIM card MSISDNs (a phone number to most of us). While they're easy to count, they don't really give a good view of either the actual or potential base of customers. Some people have multiple SIMs, some are shared, while an increasing % go into machines rather than phones. In fact, the whole terminology has skewed business models towards subscription-based types - while non-subscription models (eg transactional) have been downplayed or totally eschewed.

Almost no service business should rely totally on subscription models. Yes, it's a useful *segment* and trends are useful indicators, but it shouldn't be viewed as a pivotal metric.

Worse still is ARPU. More accurately, it's Average Revenue per Subscription, not Per User. Ironically, if it were used properly, it would be more useful.

The fact that I have a £40 per month phone, plus a £15 per month 3G dongle (from a different operator), makes me a £55 a month mobile user. Describing me as two £27.50 per month subs on average doesn't really help anyone to understand their business.

ARPU is also inherently biased towards operators giving large subsidies, which are then recouped as "revenue" over the contract. Yes, it's possible to do some maths with (ARPU minus acquisition/retention costs), but there's still often too much focus on the headline figure.
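To make the per-subscription vs per-user distinction concrete, here's a minimal Python sketch using the illustrative figures from the paragraph above (a £40 phone contract plus a £15 dongle) - these are the post's example numbers, not real operator data:

```python
# Sketch of the ARPU pitfall described above: one user, two subscriptions.
# Figures are the illustrative ones from this post, not real operator data.

subscriptions = {"phone": 40.0, "dongle": 15.0}  # GBP per month

# "ARPU" as usually reported: average revenue per *subscription*
arpu = sum(subscriptions.values()) / len(subscriptions)

# What actually matters: total spend per *user*
revenue_per_user = sum(subscriptions.values())

print(f"Reported ARPU: £{arpu:.2f}/month")                    # £27.50
print(f"Actual per-user spend: £{revenue_per_user:.2f}/month") # £55.00
```

Same customer, same cash - but the averaged-per-subscription number makes a £55 user look like two middling £27.50 accounts.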

I remember that 3UK always used to trumpet its high ARPU - until it realised that this was simply because it didn't *have* any low-tier packages or prepay, so obviously the average was going to look high. Nowadays it has realised how much money it was leaving on the table, has targeted those segments aggressively, and is about to reach profitability (finally) - largely thanks to *lower* ARPU.

A classic example of ARPU-blindness has been the reluctance to embrace M2M services. "What, an extra 10 million subscribers on $5 per month? What will that do to our figures? What will our investors think?" . "But they're vending machines and remote utility meters. They hardly use the network. We'll make $4 on each in profit margin"......

There have been plenty of suggestions about using Average Margin per Subscriber/User and so forth - and certainly, most operators' internal management teams are rather more sophisticated about financial analysis these days.

But nevertheless, the ghost of ARPU lives on. While it might be largely discredited by those who really care and play with spreadsheets deep in the strategy department, it is still measured and watched by observers - and used as a tool by vendors in their marketing. It hasn't really gone away.

Its influence remains disproportionately pervasive.

Another example is handset shipments, a metric which lumps together a $15 GSM phone on an Asian market stall with a $5000 Vertu in a Dubai shopping mall. There's still a regular refrain that Apple is irrelevant because it ships tens of millions of devices per year, compared to Nokia's half a billion. Only rarely is it mentioned that Nokia's average selling price is €63 while Apple's is perhaps eight times that figure. Or that Apple and RIM account for a hugely disproportionate % of handset industry margins.

It's like claiming that Honda is more important than Toyota, because it sells 13m vehicles a year against 7m. Let's ignore the fact that 10m of them are motorbikes. Or that Giant is in the top 5 vehicle manufacturers. Never heard of them? I'm not surprised, as they make bicycles rather than cars - but hey, they all have wheels, yes?
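The shipments-versus-value point can be sketched numerically. The unit and ASP figures below are the rough ones quoted above (Nokia around half a billion units at €63, Apple tens of millions at perhaps eight times that ASP) - illustrative approximations, not audited numbers:

```python
# Rough illustration of why unit shipments mislead: revenue = units x ASP.
# Numbers are the approximate ones quoted in the post, not audited figures.

vendors = {
    "Nokia": {"units_m": 500, "asp_eur": 63},      # ~half a billion units
    "Apple": {"units_m": 25, "asp_eur": 63 * 8},   # "perhaps eight times" Nokia's ASP
}

revenues_bn = {name: v["units_m"] * v["asp_eur"] / 1000 for name, v in vendors.items()}

for name, r in revenues_bn.items():
    print(f"{name}: {vendors[name]['units_m']}m units -> ~EUR {r:.1f}bn handset revenue")
```

On these rough figures Apple generates around €12.6bn of handset revenue from a twentieth of Nokia's unit volume - before even considering margin per device.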

To sum up - raw, headline numbers rarely tell the whole story. Over-focus on them can actually damage a business, and even where management "understands" this, it's still possible to be subconsciously swayed.

As I mentioned yesterday, the next oversimplified metric to hit the headlines is "$ per GB". My recommendation is to take it with a pinch of salt - chasing that figure either in terms of revenue or capex/opex costs is likely to be a mistake.

Remember, the best-value way of transporting data, if you just used $ per GB as a metric, would be to drive a truck full of flash memory from A to B. The latency is pretty lousy, though.

(But if someone would like to pay me 1 cent per GB, for a network that gets mobile data around the UK at a terabit per sec, please get in touch. I reckon a guy on a $20 bike from Giant can easily carry a petabyte of memory cards....)
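For anyone who wants to check the arithmetic behind the quip, here's a back-of-envelope sketch. The payload size and trip time are my own assumptions, purely for illustration:

```python
# Back-of-envelope check on the "bike full of memory cards" quip above.
# Payload size and trip time are my own assumptions, purely illustrative.

payload_gb = 1_000_000      # 1 petabyte expressed in GB
trip_seconds = 5 * 3600     # assume a 5-hour ride between cities

# Effective throughput: bits delivered divided by time taken
throughput_gbps = payload_gb * 8 / trip_seconds
revenue_usd = payload_gb * 0.01   # at the 1 cent per GB rate floated above

print(f"Effective throughput: ~{throughput_gbps:.0f} Gbit/s")  # ~444 Gbit/s
print(f"Revenue for the trip: ${revenue_usd:,.0f}")            # $10,000
```

Hundreds of Gbit/s of "throughput" for $10,000 a trip - which is exactly why $ per GB, taken alone, tells you nothing about latency or usefulness.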

Wednesday, March 31, 2010

Mobile broadband problems - looking beyond the "obvious culprits"

I've written a few times in the past about the "capacity crunch" for mobile broadband, as well as the potential for offloading traffic or policy-managing the network to prioritise certain users or data types.

More recently, I've discussed the role of signalling as an important factor in driving network congestion, especially from smartphones.

But there is a fair amount of uninformed comment about what's causing the problems - it "must" be laptop users downloading HD video or doing P2P filesharing over their 3G dongles, or it "must" be iPhone users using Google Maps and Facebook.

My view is that these over-simplistic analyses are leading to knee-jerk reactions around capacity upgrades, or stringent policy-management and traffic-shaping installations. Many vendors don't want to (or can't) give a fully-detailed picture of what really causes problems for the network and user experience.

It is in many suppliers' interests to market a neat single-cause and single-solution message - "You need to upgrade to LTE to add more capacity"; "You need PCRFs and charging solutions to limit video"; "You need to upgrade your GGSNs to really big routers" and so on.

The truth is rather more complex. Different situations cause different problems, needing different solutions. Smartphone chipsets playing fast-and-loose with radio standards may cause RNCs to get blocked with signalling traffic. Clusters of users at a new college might overload the local cell's backhaul. A faulty or low-capacity DNS server might limit users' speed of accessing websites. And so on.

Or, as many parts of London are experiencing today, a fire at a BT office might knock out half the local exchanges' ADSL and also the leased-line connections to a bunch of cell towers.

Now, in the long run there certainly will be a need to carefully husband the finite amount of radio resources deployed for mobile broadband. My order-of-magnitude estimates suggest that the macrocellular environment (across all operators, with the latest technology) will struggle to exceed a total of maybe 3Gbit/s per square kilometre, even on a 10-year view. So offload to pico / femto / WiFi will certainly be needed.
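As a rough illustration of why that ceiling forces offload, here's a back-of-envelope division of the 3Gbit/s estimate. The user-density and activity figures are my own assumptions, chosen purely for illustration:

```python
# Illustrative division of the ~3Gbit/s per square km macro estimate above.
# The density and activity figures are my assumptions, not measurements.

capacity_gbps_per_sqkm = 3.0   # the post's 10-year macro-cellular estimate
users_per_sqkm = 10_000        # assumed dense-urban population
active_share = 0.05            # assumed fraction of users busy at any instant

active_users = users_per_sqkm * active_share
per_user_mbps = capacity_gbps_per_sqkm * 1000 / active_users

print(f"{active_users:.0f} simultaneous users -> ~{per_user_mbps:.0f} Mbit/s each")
```

A few Mbit/s per active user across all operators combined is nowhere near enough headroom for heavy video use - hence the pico / femto / WiFi layer.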

But in the meantime, I'm moving to a view that Stage 1 for most operators involves getting a better insight into exactly what is going on with their mobile data networks. Who is using what capacity, in which place, with which device, for how long? And exactly what problems are they - or the network - experiencing?

In recent weeks I've spoken to three suppliers of products that try to analyse the "root cause" of mobile data congestion [Velocent, Compuware and Theta] and I'm starting to hear a consistent story that "there's more than meets the eye" with regard to network pains.

Some of the outputs can be eye-opening. It may be that a lot of customer complaints about poor data speeds can be traced back to a single cell or aggregation box that is mis-configured. It could be that a particular group of devices is experiencing unusually high problem rates, perhaps due to a fault in the protocol stack. It might be that viruses (or anti-virus updates) are responsible for problems. Or it might just be that all the iPhone users are using YouTube too much.

One thing is for certain - the yardstick of "Dollars per gigabyte downloaded" is an extremely blunt tool to measure the profitability of mobile broadband, especially when opex costs around support and retention are included in the equation. There's no value in having a blazing-fast and ultra-cheap network, if users end up spending an extra 4 hours on the phone to customer care, complaining that they can't get access because of flaky software.

Note: The new Disruptive Analysis / Telco 2.0 report on Broadband Business Models is now available. Please email information AT disruptive-analysis DOT com for details.

Tuesday, March 30, 2010

LTE and offload - a few random questions

A quick series of observations:

1) It is highly likely that LTE will have to be provided (at least in part) by femtocells and/or WiFi access points, rather than being solely transmitted via the macro network. This will be for coverage reasons, especially for 2.6GHz, but also because of the limits of capacity which will be quickly reached in some areas.

2) Given rising traffic volumes, there will also be a strong desire to offload bulk IP traffic direct to the Internet close to the edge of the network, rather than backhauling it all through the operator core & then out again.

3) Voice will exist in some form on LTE devices - whether packetised via VoLTE, VoLGA or Skype, or dropped back to circuit-switched via CSFB.

So... there will need to be a mechanism to send VoIP to the operator core, but dump bulk web traffic locally. I guess that could be achieved by using separate APNs (or whatever they're called in LTE-speak).

Or, will it need the new "Selective IP Traffic Offload" standard being worked on by 3GPP?

Or, could you use some sort of dual-radio solution in handsets, sending certain traffic over the macro LTE network (or HSPA), while sending low-value data via WiFi / femto?

Separately... presumably this also means that any VoIP that *does* go via an offload path (femto / WiFi) will need to be tunnelled back to the operator core via some sort of VPN.

So potentially we may see some LTE phones using CS-Fallback over a GAN-type tunnel.....

Monday, March 29, 2010

Hopefully, we'll be rid of the Twitter obsession soon

I've been a long-term Twitter skeptic.

I think it's value-negative, and of total irrelevance to anyone outside an unholy alliance of geeks, narcissists (politicians, celebrities etc), marketeers, "media" and their drone-like followers. It's mostly used by lazy journalists and broadcasters, as far as I can see.

I highlighted it as one of my "zeros" in my predictions for 2010.

So, it's edifying to see that the "growth" stats are proving my point for me.

The appearance of incredibly annoying floating Twitter buttons on some websites is a sign of desperation - and is hugely counter-productive, as visual spam of that sort is a great way to alienate people.

About time to swap the silly bird logo for a dodo, methinks.

Extinction beckons.

Network policy management and "corner cases"

I've been speaking to a lot of people about policy management recently, fitting in with the work I'm doing on mobile broadband traffic management, as well as the Business Models aspect of my newly published report on Broadband Access for Telco 2.0.

A lot of what I hear makes sense, at least at a superficial level. Certainly, I can see the argument for using PCRFs to enable innovative tariffing plans, such as offering users higher maximum speeds at different times of day, or using DPI or smarter GGSNs to limit access by children to undesirable sites.

But there's a paradox I see on the horizon. In the past, telcos (fixed and mobile) have been pretty obsessed with corner-cases. "What happens if a user tries to set up a 3-way call while they're switching between cells?", "What happens to calling-line ID when I'm roaming?" and so on. Sometimes this is because of regulatory requirements, sometimes it's because they're worried about the impact on legacy systems not being supported - and sometimes it just seems to be preciousness about some minor complementary service that nobody really cares about.

So what happens with *data* policy management and corner cases? What happens if I'm roaming and the local operator's policy conflicts with my home operator's? Do I get a subset or a superset? The lowest common denominator, or some sort of transparency? Imagine my home operator allows VoIP on its mobile broadband, but limits YouTube viewing to 100MB a month. The visited network, meanwhile, doesn't allow VoIP for its local customers, but also doesn't have the ability to discriminate video traffic - or, perhaps, applies some sort of compression via a proxy. Sure, everything might be backhauled via my home network.... or it might be offloaded locally.

[Side question - what happens to international data roaming traffic on a visited operator that does WiFi offload, provided by a separate managed offload operator?]

In a nutshell, I guess this boils down to "Policy Interoperability". And a need for policy IOT testing, on an ongoing basis. I strongly suspect this won't be as easy as many think.
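To illustrate why policy interoperability is awkward, here's a hypothetical sketch of merging a home and a visited operator's data policies under a "most restrictive wins" rule. The field names and merge logic are my own invention for illustration, not any 3GPP-defined mechanism:

```python
# Hypothetical sketch of "policy interoperability": merging home and visited
# operators' data policies. The field names and the "most restrictive wins"
# rule are my own illustration, not any 3GPP-defined mechanism.

def merge_policies(home: dict, visited: dict) -> dict:
    """Apply the most restrictive value of each policy field."""
    merged = {}
    for key in home.keys() | visited.keys():
        h, v = home.get(key), visited.get(key)
        if isinstance(h, bool) or isinstance(v, bool):
            # A service survives only if neither operator blocks it
            # (a missing entry is treated as a block - the harsh reading)
            merged[key] = bool(h) and bool(v)
        else:
            # Numeric caps: take the lower one; a missing cap means unlimited
            merged[key] = min(x for x in (h, v) if x is not None)
    return merged

home = {"voip_allowed": True, "youtube_cap_mb": 100}
visited = {"voip_allowed": False}   # can't discriminate video, so no cap

print(merge_policies(home, visited))
# The roamer loses VoIP entirely, despite a home operator that permits it.
```

And the merge rule itself is the real open question: lowest common denominator, home-wins, or visited-wins each give the roaming customer a completely different experience.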

Whether the "corner case" problems impact the overall use of policy management will probably depend on hard problems around local regulations and laws. But as a customer, will I really be happy with having the most stringent superset of policies applied, if there are multiple operators involved in providing my connectivity?

Friday, March 26, 2010

Nokia acquisition of Novarra - fragmentation of optimisation?

Very interesting to see Nokia's acquisition of web/video optimisation & transcoding vendor Novarra, whose products have been quite widely deployed by operators looking to reduce the data traffic sent over mobile networks.

The fascinating thing for me is that it's being pitched as a way to optimise web browsing on low-end Series 40 devices - in other words, it's *not* primarily about reduction in outright traffic levels for operators, which are dominated by laptops & top-end smartphones.

The other stand-out is that the acquisition is by *Nokia* and not NSN.

I've been giving a lot of thought recently to various ways to optimise / compress / offload / policy-manage mobile broadband networks, trying to work out a way of reconciling the different options available to operators.

As part of this, I've been looking at the approaches to transcoding and proxying of web traffic - either in centralised boxes from Novarra or peers like ByteMobile and Flash Networks - or by specific client/server implementations like RIM's NOC for BlackBerry Internet traffic, or the Opera Mini platform.

My view is that there will be no "one size fits all" approach to traffic management, and that operators will have to be smart about treating different use cases in different ways. This is less about treating traffic differently on a per-application basis, and more about device/business model/customer scenario segmentation.

My current thinking is that laptop traffic will probably be offloaded close to the edge of the network, especially with WiFi, or femtocells once it's easier to use techniques like SIPTO or Direct Tunnel to avoid congesting the mobile core with traffic that is 99.99% destined for the wider Internet.

Smartphone traffic will be part-offloaded, and part-optimised or policy-planned.

And, based on this acquisition, it increasingly looks like featurephone traffic will be optimised at the application level.

In addition, there are likely to be a range of general capacity improvements and efficiency gains in the RAN, backhaul and elsewhere - dealing both with total traffic volumes and signalling load.

More on this topic to come...

Non-user revenue models for broadband - excellent example from Vodafone

One of the major themes I explore in the new Telco 2.0 / Disruptive Analysis report on broadband is that of "non-user revenues", otherwise known as two-sided business models.

The basic concept behind a 2SBM is for an operator (fixed or mobile) to use its network and IT platform to derive revenues directly from end-users, and also from various "upstream" companies like advertisers, governments, content providers, application developers and so on.

The idea is that retail broadband revenues will start to flatten - and must be supplemented with new advanced wholesale propositions. Some of these will be evolutions of current telco-to-telco wholesale (roaming, interconnect, MVNOs, dark fibre and so on), while others will involve selling broadband capacity to "non-users". The Amazon Kindle is a good example of this - it's Amazon paying for the connectivity for book downloads, not the end user through a separate subscription.

One particular opportunity identified in the report is for governments to pay for broadband services (either outright, or for specific capacity / capabilities) on behalf of their citizens. It might be that data connections are bundled into an e-Healthcare service, or perhaps in the context of Smart Grids.

Or, as Vodafone has illustrated this morning, a government agency like a local council or development authority might choose to sponsor fixed or mobile broadband connections for those beyond the "digital divide". In this example, it's unclear whether Voda is providing fully "open Internet" broadband, or a more restricted service just providing access to educational websites. Either way, it's a perfect example of "non-user revenue streams" and highlights the power of two-sided models to add incremental opportunities to an operator's existing maturing propositions.

This type of sponsored broadband is just one of a class of "new wholesale" approaches to selling access. Telco 2.0 / Disruptive Analysis has developed a unique forecast model which suggests that these types of innovative propositions could ultimately account for over 15% of the total broadband access market value globally.

The full dataset, analysis and modelling methodology is featured in the new Fixed & Mobile Broadband Business Models Report, which is now available for purchase.

To inquire, please contact Disruptive Analysis

Thursday, March 25, 2010

New Research Report on Fixed & Mobile Broadband Business Models

Dean Bubley, founder and director of Disruptive Analysis, has produced a new 248-page Strategy Report - written and published in collaboration with the Telco 2.0 Initiative - on the future of operator business models for both mobile and fixed broadband, spanning retail propositions and new, advanced wholesale offers.

The report examines critical issues such as:

- Whether operators can use policy management and deep packet inspection as the basis for new revenue streams from Internet companies
- The prospect of greater government intervention in broadband, through regulation, stimulus investment or major national projects like Smart Grids
- The opportunities from new approaches to selling broadband - tiering, prepaid, 3rd-party paid, capped, bundled with devices etc.
- Differentiated wholesale models for mobile and fibre-based networks, including "comes with data" propositions and advanced roaming.

Complete with comprehensive forecasts spanning retail and wholesale tiers, this report is a unique analysis of Business Model Innovation in broadband, separating the practical from the "wishful thinking".

The report also includes a very detailed "use case" chapter, looking at the opportunities for fixed/cable operators to assist their mobile-industry peers by providing "managed offload" capabilities via WiFi or femtocells.

I'll be highlighting specific themes from the report in future posts over the coming weeks.

For more details, contents pages and an extract/summary, please contact Dean Bubley at Disruptive Analysis

Wednesday, March 24, 2010

Picocells and the return of the DECT guard band - New service in Netherlands

Many people forget that before the advent of femtocells, a similar technology - picocells - had been around since 2001 or earlier. Picos have more capacity, but cost considerably more than femtos, and have required expensive controllers and specialised installation procedures.

While many picos have been deployed by mobile operators for cheap "fill-in" coverage, or used in niche locations like oil rigs, ships or small islands, a more interesting business model was "Low Power GSM", pioneered in the UK with the auction of the 1800MHz DECT guard band four years ago. This enabled multiple new operators to bid for low-cost licences for indoor wireless services, using a thin sliver of unused 2G spectrum - especially enabling low-cost or free indoor private cellular.

I wrote about this here and closely watched the evolution of service launches, although uptake turned out to be comparatively slow. Cable & Wireless launched a corporate service for clients including Tesco, and Teleware has had some success with its Private Mobile Network. Two years ago, the market status was still limited.

One of the big problems has been for the new "indoor" operators to gain some sort of MVNO or roaming deal with the incumbent "outdoor" service providers, so they can provide a universal mobile coverage service. Perhaps unsurprisingly, the traditional "macro" operators have been unwilling to assist their new indoor-only cut-price competitors.

But something more interesting is occurring in the Netherlands. I wrote about 4 months ago that LPGSM was being enabled on a licence-exempt basis. And one of the companies that is now exploiting it has solved the indoor/outdoor conundrum, as it is *already* an MVNE, operating on Vodafone's Dutch network. Teleena announced its converged service yesterday.

Now obviously this is just GSM - so perhaps not much use for today's 3G smartphone-toting executive who finds that data services are sent over EDGE when in the office. Nevertheless, I'm considerably more positive about this type of approach than enterprise femtocells, which I continue to believe are unlikely to gain traction for many years.