Thursday, September 22, 2011

I'm running the next Future of Voice Workshop in London, on the 27th October

The next in the series of ground-breaking workshops on The Future of Voice is taking place on 27th October in a central London venue.

Associate Martin Geddes and I ran previous successful public events in London and California over the summer, and we've also been privately advising various operators and other market participants about the current state of telephony and messaging.

The public events have featured a really broad cross-section of the voice ecosystem: we've had Tier 1 operators' strategy groups, start-up 4G network operators, vendors from hardware & software domains, regulators, Internet VoIP specialists, voice-centric developers and investors. We've even had some of the internal "Telco-OTT" skunkworks divisions of major telcos join the discussion.

The workshop format is designed for interactive learning and networking, with a maximum of just 20 attendees. Martin and I try to "curate" the group for optimal discussion and differences of viewpoint, as we've seen that generate the most interesting and unexpected debate and alignment.

The timing for Future of Voice is particularly exciting. Whilst I've been following and commentating on voice and VoIP for years, it's only really in the last 6 months that I've seen a sudden ramp-up of concern and attention from operators, as they realise that some major tectonic shifts are occurring, especially in mobile. Core revenues now look under threat, and the need for both defensive and offensive action is urgent.

Some of the areas we'll be covering on the day include:

- Why do people make calls anyway? What's the underlying psychology of human communication that's driving the fragmentation of the mechanisms by which we reach each other?
- What's the difference between "voice services in general" and classical "telephony"? What other formats and business models are there for voice comms, beyond the familiar 100-year-old "call"?
- How should telcos respond to OTT voice and messaging providers? Should they try to block them, beat them, buy them, or copy them?
- What exactly is a "voice application"? How are they enabled by "voice platforms"? And who is in control: the telcos, Internet VoIP players, developers... or nobody at all?
- What's going on with messaging and social networking? Is SMS revenue about to fall off a cliff... and can the telcos' preferred answer of RCS be the saviour? (I think my regular readers will know my answer to that one, but something different may come out in discussion.)
- What are the organisational challenges for telcos in coping with voice? Who "owns" the core voice and messaging portfolio within operators, and what should they be thinking about?
- What differences does LTE make to the telephony marketplace? Is VoLTE the answer?
- How do enterprise and B2C/customer-care communications fit into all of this?
- Will we still be paying per-minute and per-message in the future? Will it all have been competed away to zero, or given away by Apple or Google or Skype? Or are there other potentially successful business models to consider?

This is going to be a busy and intense day. If your company or livelihood depends on understanding where voice service is going, and you want to know what your peers are thinking, you should be there.

More details are at futureofcomms.com

You can book online here, or contact me via information AT disruptive-analysis DOT com to inquire about invoicing or private internal events.

Monday, September 19, 2011

Who's driving policy within operators?

Last week I chaired sessions at two co-located conferences in Berlin - Policy Control, and Mobile Broadband. Some of the sessions involved the usual handwringing about data "explosions" and so-called OTT providers, but others posed some really interesting issues about how and where control over mobile broadband gets instantiated in operators' infrastructure and businesses.

I was particularly struck by a couple of operator speakers coming from very different perspectives - the IT and Network sides of two telcos. It is no news that there is still something of a gulf between these two worlds, but I hadn't realised quite how vigorously that plays out in the policy management space - and by extension, in congestion/traffic management as well. Together with a very detailed interview I did with a vendor, it's led me to a few observations and conclusions.

In a nutshell, network-oriented policy is about controlling costs, while IT-based policy is about revenues. Yes, there are many nuances and shades of grey, but that simple observation has helped clarify a lot in my mind.

In particular, it encapsulates why I find it hard to buy into the "monetisation" stories from many of the network-side vendors: in general, those pitches involve trying to define plans and tariffs based on what the network can see on its own, with minimal integration into the billing/charging side of the house. This is why so many network vendors are adamant about application-based charging - their boxes have visibility of packets, flows and so forth, and they are looking for a reason to extend from basic control (eg throttling P2P) to more value-added capabilities, such as driving tariffs. But many of those processes have unacceptably high levels of false positives and false negatives, and are at the mercy of shifts in application structure and user behaviour that are hard to discern from inside the network (for example, the mashup phenomenon I've discussed many times before). This means that superficially good ideas have a credibility gap when viewed through the lens of practicality.

Conversely, the IT-integrated policy functions are much more oriented around users - who the subscriber is and what they're entitled to, rather than trying to decode what they're doing on an instantaneous basis. And in that circumstance, there is much less vagueness to contend with. If data consumption is associated with me or you, there's no 95% confidence interval - it's known to 100% accuracy.

Now obviously the network-side policy & DPI infrastructure still has to be the measurement and enforcement point - for example ensuring that traffic is counted and caps or tiers are enacted - but that's driven by the user and data-plan, rather than a complex internal feedback loop based on realtime interpretation of the user's activities.
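To make that distinction concrete, here's a minimal sketch of what plan-driven enforcement amounts to - written in Python purely for illustration, with hypothetical names and thresholds rather than any vendor's actual interface. The point is that the enforcement function just counts bytes against a known subscriber and applies whatever their plan dictates, with no attempt to classify the traffic itself:

```python
# A minimal, hypothetical sketch of plan-driven enforcement. The enforcement
# point only counts usage per subscriber and applies the rule held against
# that subscriber's data plan - there is no guessing about *what* the
# traffic is, so the measurement is exact.
from dataclasses import dataclass

@dataclass
class DataPlan:
    monthly_cap_bytes: int      # e.g. a 500MB tier
    over_cap_action: str        # "throttle", "notify" or "block"

@dataclass
class Subscriber:
    msisdn: str
    plan: DataPlan
    used_bytes: int = 0

def account_usage(sub: Subscriber, flow_bytes: int) -> str:
    """Count traffic against the subscriber, then enforce the plan."""
    sub.used_bytes += flow_bytes
    if sub.used_bytes <= sub.plan.monthly_cap_bytes:
        return "allow"
    return sub.plan.over_cap_action

# Example: a 500MB plan that throttles once the cap is exceeded.
plan = DataPlan(monthly_cap_bytes=500 * 1024 * 1024, over_cap_action="throttle")
user = Subscriber(msisdn="447700900123", plan=plan)
print(account_usage(user, 480 * 1024 * 1024))   # allow
print(account_usage(user, 30 * 1024 * 1024))    # throttle
```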

I agree with the vendor I spoke to that this will change over time - but the key dimension will be around the network identifying when there is actual congestion occurring. "If a cell is actually full with data traffic, then take actions XYZ, starting with those subscribers that are both lower priority and who are doing something like video streaming". This will need to come from the radio as well - simply looking at TCP from back on the Gi interface by the gateway won't cut it.
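As a rough illustration of that kind of rule - again a hypothetical sketch, with made-up priority levels and thresholds rather than anything taken from a real PCRF or RAN feed - the decision logic might look like the following. The crucial input is a genuine congestion signal from the radio network:

```python
# Illustrative congestion-triggered policy: act only when a cell is actually
# full, and start with subscribers who are both lower priority and doing
# something like video streaming. All values here are assumptions.
from dataclasses import dataclass
from typing import List

CELL_CONGESTION_THRESHOLD = 0.9    # assumed utilisation meaning "actually full"

@dataclass
class ActiveFlow:
    msisdn: str
    priority: int          # e.g. 1 = premium plan, 3 = best-effort
    traffic_class: str     # e.g. "video_streaming", "web", "voip"
    bitrate_kbps: int

def flows_to_act_on(cell_utilisation: float,
                    flows: List[ActiveFlow]) -> List[ActiveFlow]:
    """Return the flows to throttle, heaviest first - or nothing at all
    if the cell isn't congested."""
    if cell_utilisation < CELL_CONGESTION_THRESHOLD:
        return []                                  # no congestion, no action
    candidates = [f for f in flows
                  if f.priority >= 3 and f.traffic_class == "video_streaming"]
    return sorted(candidates, key=lambda f: f.bitrate_kbps, reverse=True)
```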

In a way, I suppose you could rephrase the cost/revenue description above as being: "Charging-based policy is about prevention. Network-based policy is about cure". In other words, the IT-based approach attempts to moderate users' behaviour in advance, through segmentation and templating. The network-based approach is there to fix things when problems occur - especially congestion. Over time, the cures will become much more granular and targeted, while the prevention strategies will become more subtle and sophisticated.

Related to this is the observation that IT departments are often more responsive to the needs of operators' marketing functions than networking groups are. Rather than forever trying to standardise, they tend to have a more pragmatic view towards customisation and flexibility - something that is much needed when controlling mobile data. There is also much more likely to be a willingness to involve the end-user in the process, for example through a trend towards on-device self-care portals for visibility and tariff upsell.

I'm still trying to get my head fully around all this - but distilling all the conversations I've had recently I am moving to the view that charging-driven policy use cases are, on the whole, more plausible and workable than those that are purely driven by the network. This is also true of issues around dealing with mobile video - transparent "optimisation" of content, without reference to user details is only ever going to be a partial (and often unsatisfactory) solution.


EDIT - within hours of posting this piece, I saw this news item about Verizon, which seems to underscore my point. Although implementation details are thin, it sounds like the company is using a combination of network-based congestion detection, plus subscriber records to pick out the "usual culprits" as warranting throttling. Being in the US, Net Neutrality obviously plays an important part in the approach, and this appears neutral in terms of content/service differentiation.


Saturday, September 10, 2011

Great example of a mashup creating a network-policy failure

A very quick post.

I've got the My Vodafone application downloaded onto my iPhone. To the basic function of reporting usage against data/voice/caps (which works only sporadically anyway), they have now added a WiFi finder, presumably to encourage use of the BT Openzone option that's bundled into the package. The finder part of the app helps you locate your nearest hotspot, so you can get online - and implicitly, so Vodafone can benefit from offloading your data.

However, there's an amusing little "gotcha" hidden away in the terms:

"Access to the My Account part of this app is not chargeable. However, if you use the WiFi Finder section, it will access Google Maps, which will fall into the same category of data usage as accessing Facebook or Twitter, and you will be charged for data usage"

In other words, you have to pay extra, in order to help Vodafone offload traffic. (Separately, the VF app also wants me to download a configuration file of some sort for WiFi. Not a chance, as I don't trust it not to mess up the settings for my BT-Fon and Onavo connectivity clients. WiFi Neutrality rules....)

As I've said numerous times before, web mashups fit exceptionally badly with DPI and application-specific policy or charging approaches. Vodafone can't even zero-rate its own application - either because the policy engine can't distinguish which device-side app a particular Maps session comes from, or because it can't/won't pass that information back from the app to the billing system.
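A toy sketch of why that is - with invented host names, and certainly not Vodafone's or Google's real infrastructure - is below. A policy engine that assigns charging classes based on the server a flow talks to simply has no visibility of which device-side app embedded the request:

```python
# Hypothetical illustration: classification by destination host cannot tell
# an operator app's embedded map view apart from standalone Maps usage.
def charging_class(server_host: str) -> str:
    """Classify a flow purely from what the network can observe about it."""
    zero_rated_hosts = {"myaccount.operator.example"}   # the operator's own servers
    return "zero_rated" if server_host in zero_rated_hosts else "standard_data"

# The WiFi finder inside the operator's app and the standalone Maps app both
# generate flows to the same mapping back-end, so on the wire they are
# indistinguishable - and both get billed as ordinary data.
wifi_finder_flow = "maps.provider.example"       # map view embedded via the API
standalone_maps_flow = "maps.provider.example"   # ordinary Maps usage
assert charging_class(wifi_finder_flow) == charging_class(standalone_maps_flow)
print(charging_class(wifi_finder_flow))          # "standard_data" in both cases
```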

And even if it could zero-rate that bit of maps usage, it would then have to hack the application and its use of the Google API, to stop users just using the WiFi finder version of Maps as a general navigation application.

This goes along with another example I'm using at the moment - Facebook can now render web pages inside the app, and also keep YouTube videos within that window - still with the blue Facebook bar across the top of the screen. So when the mythical "Facebook data plan" gets launched, will I still be able to watch the embedded videos? Especially if FB puts up its own video streaming service at some point in the future?

Friday, September 09, 2011

More thoughts and observations on WiFi Neutrality

I wrote a month ago about the importance of "WiFi Neutrality" - that users should not face unreasonable blocks on which WiFi networks they can access, especially where a mobile operator attempts to force them towards "preferred WiFi networks", and stops that decision from being overridden by the customer (or makes it hard to do so).


Over the past few weeks, I've been speaking to a variety of operators and vendors in more depth about this issue. I'm seeing something of a polarisation:

  • There are some who understand that the WiFi genie is well and truly out of the bottle, and that (perhaps grudgingly) users need to be given tools to set their own preferences and policies, and that the operator's / network's role is in helping make the telco-preferred WiFi easier and safer to use, when that's appropriate. Ideally over time, a greater proportion of WiFi usage will occur on the operator's own or partnered networks, and more control can be exerted, or unique services offered.
  • There remain others who are either hostile to WiFi on smartphones entirely, or who believe that it can/should be strictly controlled by the operator, even to the extent of greying-out menus or features on connection manager software that allow user choice to be exerted.
My view is that the latter group are in for an extremely hard and unpleasant surprise, especially if they operate in competitive markets. There are probably now a billion or more regular users of WiFi, who understand it well, deal with occasionally-clunky authentication, but who like the freedom to choose when and where they connect - especially if it's free. Taking that choice away from them, when they have already come to expect WiFi Neutrality on their smartphones, is unlikely to be received positively. Even changing the connection experience to add in more clicks/swipes to get to what used to be easy is likely to be anathema.

It's also important to note that about two-thirds of WiFi users (in the UK at least) *never* use public hotspots at all. This is something I backed out of some recent Ofcom data about Internet usage, devices and locations: the proportion using hotspots is tiny compared to the total "wireless" use on laptops, even when you factor in some of them using 3G dongles. Conclusion - the bulk of WiFi usage is in homes or offices - "private WiFi", not offload or anything to do with the operator.

As I've been saying for years, WiFi is actually WLAN, or Wireless Ethernet. You wouldn't expect a service provider to manage the RJ45 socket on your laptop, nor for that matter the USB ports or Bluetooth. Exactly the same deal with WiFi - it's a general-purpose utility for laptops and smartphones, with many use-cases, only a small proportion of which are anything to do with "services".

There is one large exception here - China - where WiFi has always been a lot more tightly regulated and controlled than in much of the rest of the world, for both public and private use. Like other countries with restrictions on their citizens' use of technology, I can accept that a more proscriptive approach to on-device WiFi connection management is likely.

But more generally, I still believe that operators need to balance their own uses of WiFi on consumers' devices with their users' other applications of the technology. If they don't, they will likely catalyse a shift towards "vanilla" devices bought through retail, just using an operator SIM and minimal app/branding customisation. These will be owned by the user, who will then obviously be able to use WiFi however they wish, just as they do today on laptops.

Operators need to encourage on-net WiFi use (ie their own / partners' WiFi) by making the experience and performance better. The opposite strategy - trying to make the experience worse for off-net WiFi - is not tenable.

Wednesday, September 07, 2011

Deciphering the Orange Sosh proposition

I've just seen that FT/Orange France has announced a new brand/proposition called Sosh, aimed at the "hyper-connected" youth segment, majoring heavily on data and bundled SMS.

I'm a bit confused by the press release though - it's not obvious whether the 2-hour, 5-hour or 24/7 product names refer to telephony minutes or to the amount of time allowed online for data. There is a reference to "All of the Sosh offers are commitment-free, and they give users access to all of their content and digital services, including social networks, videos, e-mail, internet and VoIP." ... but I'm not sure how that squares with 2/5 hours. It would be bizarre if you could use Facebook and Twitter for 2 hours a day, as it's not clear how long a "session" lasts, especially if you've got a background app open on a phone sending keep-alives.


On the other hand, €20 a month sounds quite a lot for a 500MB/120 mins plan that doesn't include a handset, even if it does allow you to use VoIP and Orange's WiFi hotspots. Same for the 1GB/300mins for €40 plan.


Interestingly, handsets *are* available, bought separately, and with 12/24 month payment plans. That fits in with some earlier posts of mine about the accountancy grey-zone of handset subsidy/loan repayments being counted as part of ARPU.
 
The press release also has logos from YouTube, Twitter and Facebook, as well as Orange's 50%-owned Dailymotion video service. It's not clear if these are official endorsements, if there are commercial agreements in place, special content/deals/pricing or anything else.

Overall, it's all a bit confusing - possibly because there's less English-language info published about what is initially a France-only service. Hopefully, we'll get some clarity on this soon.

Edit: mystery solved. Apparently French mobile operators often quote talk-time in hours rather than minutes, so the "2-hour" and "5-hour" names are simply the 120-minute and 300-minute voice allowances mentioned above. Still seems weird that they're calling Sosh a proposition for hyper-connected, social-networked youth, and still using voice as the primary way of segmenting the pricing tiers.

Thursday, September 01, 2011

The fallacy of "monetising OTT" against the greater threat from Under the Floor (UTF) players

As far as I know, I was the first to coin the term UTF (Under the Floor) player in telecoms. It's simple really - that's where you put the pipes in most buildings, under the floor. Initially, I used it as a light-hearted riposte to the derogatory use of the term "OTT", and I thought it might get adopted by Internet companies as a way to sneer at telcos.

Although I now grudgingly say OTT as well, I cringe inwardly every time I do - it has become an industry-standard term, unfortunately. But in my mind it just highlights the confrontational "them vs. us" attitude of telcos and many of their suppliers. I confidently believe that no operator will make "cold hard cash" from OTTs, or otherwise "monetise" them - insulting your potential customers is never a winning strategy. Vendors that spout this type of #telcowash by perpetuating these epithets will also likely suffer over time; pandering to your customers' crude prejudices is not a mark of strength and vision.

Conversely, those companies (operators and vendors) that take a softer line in dealing with companies that they genuinely and sincerely see as their peers and equals - perhaps referring to them as Content & Application Service Providers - have a better chance of commercial success. That said, my experience tends to be that operators divide on departmental or even individual lines. Some of the more forward-thinking marketing or product staff understand the value of Internet-based applications and seek to work collaboratively. Conversely, the traditionalists who believe that everything should be a centralised, interoperable, standardised "service" - or who pretend to misunderstand the selling of vanilla Internet access - are the true dinosaurs.

But all this masks a more subtle - but perhaps ultimately larger - threat. It actually suits many vendors' wider purposes to cast so-called OTT players as the villains, posing an existential threat to operators. But all the while, another source of value erosion is occurring beneath the surface - quite literally under the floor.

So over the last couple of years, I've had a bit of a rethink on the use of the term UTF. While from the content/Internet perspective, operators are indeed "underneath" them, they don't really pose a particular threat except in cases of egregious abuse of Net Neutrality, or where they are protected by competition law.

What's at the bottom of the stack from the telco perspective is a collection of true UTF players that are eating away at the underpinnings of value for the telecom industry: the network itself, and the supporting OSS/BSS systems. Outsourcing, managed services, and especially wholesale networks operated by governments or infrastructure-based telcos are the perpetrators here.

I've talked before about the roles and risks of wholesale-centric networks in LTE (see this piece I wrote for Telco 2.0), but it's worth drilling into the outsourcing side of UTF in more depth.

Now, I need to qualify this. Clearly, some things are indeed often best outsourced (eg maintenance), and in some cases hosted services can enable telcos to reduce upfront capex and risk when launching a new offering. And in some cases, structural separation is forced on unwilling telcos, or governments such as Australia's invest in a universal broadband infrastructure. In other instances, wholesale-centric wireless networks are being built because that's the strategy of a new entrant (eg LightSquared) or because of - once again - government intervention (eg Yota in Russia).

But collectively - and especially with the conscious, full-scale outsourcing of mobile network build and operation - I see a substantial risk. Nobody knows what business models might be found appropriate in 2 years' time, with LTE or other wireless networks. Nobody knows what applications may prove important, or where/how they are used. Nobody knows what the effects of Apple, Google or others spending their war-chests of cash might be, and what strategic responses are appropriate.

The basics of the mobile communication industry - and the basis for network capex and opex - are shifting. In the past, we knew that next year would basically be like this year, with a bit more traffic and some more users. That's easy to plan for, and can potentially be handed over to a third party for an extended contract, if you do your sums correctly.

But now, flexibility and responsiveness are key. Think back just 3 years. Who predicted traffic growth from PC dongles? Who predicted usage patterns of iPhones and Androids? Who thought seriously about WiFi offload? Who talked about signalling load breaking networks? Who thought about innovative wholesale approaches? Who thought about localised mobile services in sports stadia or retail outlets? Indoor data coverage, hotspots and not-spots? Who thought about all the various use cases for mobile policy management, and the formats of mobile data and application they might apply to?

And who thought about the impact these could have on network design, architecture, dimensioning and operational priorities? Operators have been forced to react to these changes, and will likely continue to face uncertainty and a dynamic landscape for mobile data in the future.

How much damage would be done if an outsourcing deal had been written with insufficient flexibility and response times to cope with these issues? Even the basic metrics ("subscribers", "traffic") are becoming irrelevant or in need of more granularity.

How much inherent value is there to the operator in being able to make their own decisions, differentiate and manage their own network in the way they see fit? In my view, that is at the core of being a responsive, profitable operator. Yes, it's possible to construct some managed service contracts to build in flexibility, but ultimately, how well does that work when the really unanticipated occurs?

It is interesting to read that Sunrise, the Swiss operator, is cancelling an outsourcing contract [original press release PDF here]. This is a telling line: "In view of the future modernization of the mobile network and the large upcoming investments in the next mobile generation LTE, Sunrise is dependent on great flexibility." There is also a reference to wholesale as a driver for more flexibility.

In other words - Sunrise wants to become a UTF player itself, not be supported by one.

Somehow, I don't see the vendor community being as keen on the UTF acronym as they are about OTT. Operators should ask themselves why that is. My view is that it serves UTF providers' purposes to push operators into believing that all the value is in chasing end-user services - even if yesterday's services are now turning into applications or mere features.

Footnote: if you'd like to see a (rather contrarian) presentation on the UTF threat I gave at a recent conference on network-sharing, it's viewable here