
Monday, May 31, 2010

VoIPo3G becoming a reality. Remember, you read it here first....

Fascinating post by Andy Abramson about various experiences with Skype over 3G running on the iPhone.

Coupled with efforts by various other VoIP players - and the evolution of assorted voice-on-LTE advocates, it certainly looks as though VoIPo3G (or VoIPo4G if you buy into the marketing guff around LTE) is becoming much more feasible and important.

I've noticed a few of my rival analyst firms putting out forecasts and comments recently, as if this was somehow new and unexpected.

Readers of this blog, and customers of Disruptive Analysis' reports, will have seen this coming more than two years ago. (I'm pretty sure that I coined the term VoIPo3G myself - see the Google search results here.)

I published a research report at the end of 2007 which included a full analysis of what was likely to occur - as well as the probable use cases and partnerships that could/should arise. I predicted up to 255m active users of VoIPo3G by the end of 2012. Not all of these would be for "primary telephony" - the majority would be using VoIP as an adjunct to ordinary full operator voice service.

Now, some things haven't unfolded quite the way I expected at the time I wrote the report:

- HSUPA has been slower to be rolled out in devices than I predicted, which has limited VoIP quality
- Apple and Google have de-railed the 2007 smartphone dominance of Symbian and Microsoft, sending some VoIP plans back to the drawing board, especially as Apple's deals with telcos prohibited VoIPo3G
- Laptop mobile broadband has grown very swiftly, often used as a direct substitute for fixed broadband, and used with various apps including VoIP
- LTE and WiMAX have rolled out more slowly than expected, and voice-over-LTE has been standardised more slowly than anticipated.
- UMB disappeared from the radar
- Operators have (generally) not followed my recommendation to experiment with VoIP on HSPA/HSPA+ before committing to it with LTE. This is a strategic error in my view.

Nevertheless, the idea of partnership between operators and Internet VoIP providers has seemed prescient - especially given the Verizon/Skype and Telefonica/Jajah tie-ups. It will be interesting to hear if Andy's theory about close Apple/Skype collaboration is true - and if AT&T and other operators have some indirect involvement as well.

It's about time I did a full reassessment of the mobile VoIP space - I've been covering it since 2001 and have been pretty on-track thus far.

Watch this space.

(If any historians of telecoms strategy and forecasting would like to get the original report, let me know & I'll cut you a very good deal)

Is it just me, or is 3G either really good or really bad, but rarely "OK"?

I've started noticing that my experience of mobile broadband (iPhone 3GS on Voda UK) is much more polarised than that of fixed broadband.

With ADSL, performance seems to vary over a fairly wide spectrum of possibilities, because of the vagaries of contention at my local exchange's DSLAM, the connection of the server at the other end, and especially WiFi in different parts of my house. Overall, it's generally OK - I've had one major outage in 5 years (after a massive fire/flood at a BT Exchange that took out half of central-west London's phones).

But on mobile, it seems to be all or nothing. Either very good... or frustratingly bad.

I've been playing around with the Speedtest app on the iPhone, and the results are really polarised. The downlink speeds (in kbit/s) for my last 20 or so tests:

1940
1238
1 (yes, really - and I got 236kbit/s uplink!)
2638
13
2249
36
1554
33
690
1125
75
1088
717
3465

In other words, I either get 1-2Mbit/s+, or less than 100kbit/s (and not on EDGE, either, according to the indicator). I've only had two "middling" results around 700kbit/s, and none at all in the 200-500kbit/s range. Several times I've had results that make GPRS look good. On several occasions recently I've had to toggle the radio via the "flight mode" switch to see if re-registering helped. A couple of times I've done a full on/off of the phone.
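To make the polarisation concrete, here's a trivial tally of the readings listed above (a quick Python sketch, using the numbers exactly as logged):

```python
# Bucket the Speedtest downlink readings to show the bimodal split.
readings_kbps = [1940, 1238, 1, 2638, 13, 2249, 36, 1554, 33, 690,
                 1125, 75, 1088, 717, 3465]

buckets = {"under 100kbit/s": 0, "100-500kbit/s": 0,
           "500kbit/s-1Mbit/s": 0, "over 1Mbit/s": 0}

for r in readings_kbps:
    if r < 100:
        buckets["under 100kbit/s"] += 1
    elif r < 500:
        buckets["100-500kbit/s"] += 1
    elif r < 1000:
        buckets["500kbit/s-1Mbit/s"] += 1
    else:
        buckets["over 1Mbit/s"] += 1

for name, count in buckets.items():
    print(f"{name}: {count}")
# under 100kbit/s: 5, 100-500kbit/s: 0, 500kbit/s-1Mbit/s: 2, over 1Mbit/s: 8
```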

Now, I recognise that I probably tend to run speed tests when I'm most frustrated, so there's a bias there. There could also be bugs in the phone's firmware. But it's really conspicuous that I almost never get "sub-par, but sort-of OK" performance. It's essentially all or nothing, even when I supposedly have coverage.

That's a very different user experience to fixed broadband.

I'm sure there are commenters with much more test/measurement expertise than my amateur armchair efforts. But to my uneducated eye, it looks like there are problems beyond raw air-interface capacity. I'd expect a much more graceful decline in performance if it was just due to 2, 5, 10 or 100 other people in the same cell checking Facebook at the same time as me.

Is it a glitch somewhere in the RNC or GGSN? Who knows? Am I being policy-managed in response to my usage (middling, maybe 300-500MB/month) or my coruscating blog post on Voda's roaming practices a few weeks ago? [No conspiracy stories here to be fair - the variability was there long before].

Either way, it can't be long before we get some massive crowd-sourced programme to track network quality and deconstruct the root causes - the open question in my mind is whether this happens before or after the operators themselves get on top of it.

The theme of using DPI, probes and data-mining to analyse network and service-assurance problems is one of the top 10 technologies I introduce in my recent research paper on Mobile Broadband Traffic Management.

Thursday, May 27, 2010

What are the side-effects of speed/vol tiering for mobile broadband?

I'm at the Open Mobile Summit again today. I've still got a load of things to write up from yesterday as well (there are some too-brief comments up on my experimental and much-hated Twitter feed @disruptivedean).

I've just seen an interesting presentation from TeliaSonera, talking about its early deployment of LTE in the Nordic market and its pricing/tiering for mobile broadband.

It's worth noting a few things first - Sweden and Norway are quite "special" markets: affluent people, an initial introduction of cheap and "very flatrate" USB modems for PCs in 2007 (€20 for all-you-can-eat, I think was stated), lots of fast fixed broadband, and a lot of quite cheap spectrum, split between relatively few operators (mostly fixed/mobile integrated).

The speaker highlighted an interesting tiering scheme being introduced in the summer, as it extends its LTE rollout beyond its current (very) early-adopter phase of a "couple of thousand" users.

It is going to charge €36 per month for 30GB of LTE ("4G") access, at "the highest speeds feasible on the network", including dropping down to 3G where necessary.

It will charge lower prices of €4, €23 and €32 for 2GB, 10GB and 20GB at 3G-only speeds, dropping down to 2G when the cap is exceeded. It's not clear if this is a throttled "3G speed" service on the 4G network, or whether it just involves a non-LTE modem.

So basically, they are using 3G vs 4G to provide price discrimination on a proxy for speed performance - and then combining it with different caps to enable hybrid speed/volume tiers.
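Taking the stated prices at face value, the effective per-GB rates show how the tiers really stack up (a quick sketch; note that the 10GB tier is actually the worst value per GB):

```python
# Effective per-GB price of each TeliaSonera tier described above
# (prices and caps as stated in the presentation).
tiers = [("2GB / 3G", 4, 2), ("10GB / 3G", 23, 10),
         ("20GB / 3G", 32, 20), ("30GB / 4G", 36, 30)]

for name, price_eur, cap_gb in tiers:
    print(f"{name}: €{price_eur / cap_gb:.2f} per GB")
# €2.00, €2.30, €1.60 and €1.20 per GB respectively
```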

But when asked, the speaker said they are not doing speed discrimination within 3G (eg 3Mbit/s vs. 7Mbit/s vs. HSPA+ at 21Mbit/s), because it is too difficult to guarantee the real-world perception of tiering. It's much easier to discriminate between HSPA at "1-10Mbit/s" and LTE at "10+".

This all makes sense - trying to have more than 2 or 3 tiers is difficult even in fixed broadband, where there isn't the variable of changing radio conditions involved.

In future, they're planning to offer "top ups" for your cap, if you run short at (say) week 3 of a month. They will also do zero-rated traffic for specific applications supplied by themselves (eg a deal with Spotify will mean that streamed music may not count towards the quota).

There are prepaid plans for 3G but not LTE at the moment.

I'm curious about a few things, though. The first is what the performance of the LTE network will look like when it starts loading up - it's all very well to claim customer-achievable speeds of 20-40Mbit/s when you've got maybe 2000 users and 500 base stations (1500 sectors) - most of the time, any active user is going to be alone, or perhaps sharing capacity with 1-3 other people. What's going to happen when there's a conference room with 50 people with laptops and dongles, or even just a branch of Starbucks with 5 users?

The other question is around unexpected side-effects of price discrimination.

Ultimately, it sounds like we are getting to a position of differential pricing for what is a (relatively) commodity service. The big question is how "tradeable" that commodity is - and how arbitrage might work.

In prepaid voice markets, it is common for users to have multiple SIMs, and swap them over frequently to take account of pricing oddities. Many have multiple phones, to enable them to always call "on net" and benefit from the lowest prices rather than pay extra for cross-network fees.

It seems likely that while we may see more clever operator pricing, we'll also see more clever user behaviour to circumvent it. Given that data services don't need fixed phone numbers, I see much greater opportunity for dual-SIM, connection-sharing, tethering and MiFi-style products to facilitate arbitrage. There is also a lot of potential for new forms of least-cost routing, and for collaborative applications that pool different users' allowances.

So for example, how long before we see an application that can suggest:

"If I'm in week 4 of the month, and still have >30% of my quota left, then share or trade that capacity with my Facebook friends automatically, when we're within WiFi range of each other".

I think we're on for some fascinating device/web-based techniques for "gaming" the new data pricing structures.

Friday, May 21, 2010

NEW research report on Mobile data: Offload vs. Compression vs. Policy

Disruptive Analysis has published a short "thought leadership" paper on technologies for managing mobile broadband traffic.

While future deployments of LTE or HSPA+ will add more capacity for mobile operators, it is still critical to examine ways to reduce congestion on a shorter-term basis. And even with future 3G / 4G capacity additions, the hunger from new applications - especially those based around video or other rich media - will demand strong discipline for optimisation.

This is a complex area - there are many ways to control traffic, or minimise its impact on the most expensive parts of the cellular network. There are arguments for offload to WiFi or femtocells, compression in the core network, or numerous forms of innovation for policy management and charging.

This new research paper examines the roles of 10 technology-led approaches to data traffic management. Some of them have been introduced in this blog before - offload has been discussed for over two years, and mobile policy management and traffic-shaping for over four years.

I started to discuss the huge diversity of approaches earlier this year - this thought-leadership paper expands on the analysis, based on dozens of extra meetings and interviews I have conducted over the past few months.

*** LINK TO MORE INFORMATION AND PURCHASE OPTIONS HERE***

There is no easy answer - the best approach will depend on the existing customer base and its behaviour, the mix of smartphones and laptops on the network, the operator's spectrum and cell site holdings..... and its forecasts and beliefs about the future.

In many operators, there is also an organisational and management problem: there is often no single individual who "owns" the issue of data traffic and who can develop a holistic solution. Instead, diverse individuals pursue narrow goals, which can have unintended consequences elsewhere in the network - or impact customer experience.

In almost no operator is there an individual with the job title of "Policy Manager".

Vodafone recently announced that 85% of its mobile data traffic comes from laptops - but it is seeing faster growth from smartphones now. This highlights the complexity of the decision space - which also affects decisions about spectrum purchase and timelines for LTE/HSPA adoption.

More details on the new Disruptive Analysis paper on Top 10 Technologies for Mobile Broadband Traffic Management are here.

As this is a shorter and more tactical document than full strategy reports, the price points are correspondingly lower, starting at just $350 + tax if appropriate.

Alternatively, please inquire about more customised workshops or consulting projects in this field - an area that Disruptive Analysis has tracked for longer than any other analyst firm.

Thursday, May 20, 2010

Thoughts on the LTE conference

So, two days being bombarded by LTE technology and operator trial/deployment presentations.

Some take-aways:

- When it works, LTE apparently works very well in terms of peak speeds etc.

- For a greenfield operator in an uncontested market, with limited current 3G uptake and users without pre-existing expectations, it would likely be a winner in the medium term

- Unfortunately, LTE is going to be deployed into a very messy world, with entrenched business models, diverse frequency allocations, existing "good enough" voice and data technologies with predictable user experience - and capital constraints imposed by the economy and investors.

The frequency thing is a bit of a killer. So far, Verizon is doing 700MHz, one of the Swedish networks is at 900MHz and the other at 2.6GHz. NTT DoCoMo is using 2.1GHz and later 1.7GHz. China is looking at the TD flavour of LTE in 2.6GHz. Some operators were talking about refarming 1.8GHz for LTE, and some of the US operators mentioned their unique AWS band. O2 is trialling 800MHz digital dividend band in the UK. And I think I've seen a reference to 2.3GHz as well somewhere.

In other words, it's a mess - and quite a few early operators were trying to talk up their preferred option, in an attempt to drive scale. Then add in the fact that most of these bands are quite small in outright size - not enough for competing operators with 2x20MHz allocations, for example. And the fact that the largest spare band, at 2.6GHz, is (a) still to be auctioned in many places, and (b) "very disappointing" in terms of real-world propagation, according to O2 UK.

One other message that came out quite strongly is that investors do not seem happy about the notion of another big round of radio network capex, especially given the doubts over whether mobile broadband can be monetised beyond connectivity.

(It was notable that my term "happy pipe" seemed to resonate with quite a few people at the event - which was itself, amusingly, held in a part of Amsterdam called De Pijp).

My conclusion is that in many markets, LTE networks will develop in a patchwork form, either for specific hotspots to use new spectrum to manage high densities of data users, or else perhaps for city-wide deployments. But especially in Europe, the use of 2.6GHz is going to struggle to enable LTE-only networks - they will either need to be dual-band with a sub-1GHz frequency from Day One, or else will have to rely heavily on HSPA as a fallback. (In which case, why bother with the added complexity - why not just use 2.6GHz HSPA instead? Or WiFi?)

This means that the concept of an "LTE application" looks pretty weak in the medium term. And I really don't buy the idea of M2M devices (or cars) being ideal for LTE rather than 2G or 3G - coverage is king. Who wants an LTE tablet that doesn't work in at least as many places as their existing dongle, let alone an LTE healthcare terminal?

One other thing that is likely to disappear from the near-term wishlist is roaming - especially because of the likely diversity in LTE voice implementations over the next 10 years. And if voice roaming is ditched in favour of 2G / 3G circuit connections - why is LTE data roaming urgent anyway? Frankly, 21Mbit/s roaming costing $5 per second is quite enough - who really wants 100Mbit/s at $25 per second? Let's wait for the data roaming price to lose a few zeros before worrying too much about cross-border LTE, eh?

The other elephant in the room is user expectations. To be honest, the whole concept of "the user" was woefully lacking at the event. I think I only heard the word "battery" mentioned once. Customers have become accustomed to both the price and the experience of HSPA, and will expect that as a baseline for LTE. There is also strong evidence from the fixed broadband world that the feasible premium for (lots of) extra speed is typically only 0-30% on the price, except for a handful of enthusiasts. Once we get to LTE phones, users will also have reasonable expectations that voice quality, reliability, battery life and coverage are at least as good as those achievable with a $20 GSM handset.

Overall, some aspects of LTE technology development sound positive. But I'm still not expecting any sort of miraculous or revolutionary shift in user perception of mobile broadband on a 5-year view, versus what we have today.

It will also be very interesting to see if any of the new Indian 3G licence-holders opt for LTE rather than HSPA. I suspect they will find the business case quite tricky, given market immaturity and an immediate pent-up demand for cheap mobile broadband in a market with little copper or cable.




Mobile local search - just leave it to Google. Again.

I've written various times before about my skepticism around "Mobile Search", and especially the idea that any form of "local search" is useful on a handset - at least, beyond that done by a decent implementation of Google.

I've heard any number of straw-man arguments that "people want to find things, not search for them", usually made by wannabe directory-services companies that want to charge companies to be listed.

I've heard similar arguments about the evolution of Yellow Pages to something altogether more interactive and cool - or about using SMS to query the network operator's own database, hooked into your cell ID or other details.

The problem I have is that most of these things just don't work that well, because they rely on an incomplete database that is usually compiled by someone else. It's then interpreted by an imperfect filter, and updated infrequently.

I think that a case can be made for mobile search for "commodity" services, where the user doesn't need the absolute closest option, nor need a complete list of all relevant providers. Plumbers, window-cleaners, maybe dry cleaners, conceivably petrol stations.

But for those areas where there is a qualitative, subjective opinion involved.... you really want as broad and accurate a base as possible. Ideally, you wouldn't want to choose a restaurant from a guide with a random selection of 25% of the possible options - or, worse, where a particular chain had cut a deal and accounted for 30% of the options. And in many cases, mobile search is competing against two other options - Google on your PC (good for non time-critical things like dry cleaners), or Google search & maps on your handset.

I tried a new option today - BT's Exchanges app for the iPhone, apparently developed by a firm called Locayta - testing it on my local area in central London.

Immediate "FAIL". Under the tab 'pubs', it misses my favourite local - as well as three others in the area. Under "petrol stations" it gets the closest one right, but also includes the location of another that shut four years ago. On the first screen of 10 hospitals, it misses the enormous new University College Hospital which has full casualty facilities - because there are 8 private clinics on Harley Street that are 300 yards closer to me.

Google Maps is a bit better - it gets my local boozer OK, and doesn't list the long-defunct Texaco. But it also doesn't get UCH on the first selection (maybe I'm being unfair, because Harley St is a bit of an anomaly).

But best of all is good old Google Search, for which "Pubs Baker St" immediately throws up beerintheevening.com, which lists everything locally - with unbiased reviews, maps etc. A similar strategy yields UCH at #6 on the main search page - and #3 with "hospitals marylebone".

Put simply - the source data that Google Search uses is, it seems, far better for these types of look-up. If I'm locked out of my house and need a locksmith, then maybe a yellow-pages type thing is useful. But for most searches - you already have all you need.

Wednesday, May 19, 2010

A quick thought on operators and APIs

Operators want to sell access to certain capabilities and assets in their network. Many are looking at exposing APIs to developers, enterprises or content providers. The GSMA is pushing its OneAPI programme as a lowest-common-denominator set of offerings, while most major operators also have significant efforts around proprietary API exposure.

In general, I am a believer in this - and it goes to the core of many of the two-sided business models expounded by my partners at Telco 2.0.

But... if the general notion of API exposure and consumption is truly believed by those at the network operators, shouldn't they be "eating their own dogfood"?

As a starting point, wouldn't you expect the amount of API value imported from other sources to be roughly equivalent to the amount exported to these new customer groups?


Now, obviously the operators' large databases and infrastructure probably give them a natural bias here, but that's not to say that similar sources of convenience and value don't lie elsewhere - and could be bought in.

If APIs are genuinely *that* valuable & available in the short-term, telcos ought to be avid purchasers of each other's capabilities, or those from banks or Google or Facebook or numerous vertical-market specialists. If customer data is valuable, shouldn't a mobile operator want to get access to fixed customer insights, and vice versa?




Tuesday, May 18, 2010

More agreement with the "Happy Pipe" concept

One of the options covered in the recent Disruptive Analysis / Telco 2.0 report on broadband strategies was that of the "Happy Pipe", as discussed in my post the other day.

So it's interesting to see Amdocs' announcement of a study it's done with the Economist Intelligence Unit, which appears to make much the same point.

Linked in with this theme is another recent study from NSN about the cost-effectiveness of providing raw mobile broadband capacity.

Coupled with a couple of presentations at the LTE event this morning, from operators that said "we want to be a profitable pipe.... and sell other stuff too", this definitely seems to be an emerging theme.




End to end QoS is impossible for LTE Voice

I'm currently at the LTE Summit in Amsterdam.

Yesterday, I attended the session on voice over LTE, and this morning I hosted a roundtable analyst breakfast. In the former, the speakers were fairly monochromatically IMS advocates: the GSMA talked up its VoLTE initiative, followed by Ericsson and ALU.

One interesting thing was the GSMA referring to "migratory solutions" before operators reach the "target" of IMS-based VoIP. Conspicuously, it was fairly even-handed about both VoLGA and CS Fallback. Together with some other anecdotes I've heard recently, there definitely seems to be a softening of enthusiasm for CSFB as the Plan B. [For my views on its deficiencies, see white paper here]

Another audience member asked about the viability of using Skype as a solution. The answer was that it should certainly be possible over a broadband connection... but that it would not benefit from the QoS enabled by an operator-based solution.

My own realisation is that this might be completely the wrong way around. In fact, Skype and Google may well be in a position to guarantee *better* quality of experience for voice on LTE than the operator's in-house solutions, of whatever type.

Basically, we are moving from a world of handset "certainty" for voice, to a world of handset uncertainty, which operators and their network vendors are poorly positioned to control.

The nice, new policy-managed IP core and radio network is the easy bit.

The big half-truth coming from the network vendors is the claim that they can guarantee "end to end" QoS for voice. I believe this to be very wrong, especially on higher-end devices. The specifications, as far as I can see, give very little guidance on how to create or enact QoS right down to the client application level.

Remember - old-style CS voice is guaranteed in quality on the handset itself ("certainty"), because it runs on the baseband chip, with a real-time operating system. Nothing gets in its way. The telephony stack is, essentially, baked into the hardware - an evolution from Alexander Graham Bell's architecture.

But this is not true if the VoLTE application runs on top of the handset's OS and applications processor - where it will implicitly compete for resources with all the other apps and services. These may well be "gating factors" on overall QoE, as well as radio coverage. Over time, VoIP might migrate "lower" into the OS itself, although it will likely be integrated with higher-layer apps like the advanced phonebook or social network client. I can't see it being buried deep in the baseband for a long time.

This inter-dependence with the OS was brought home to me using Skype recently, when I saw an unexpected red icon before going to make a call. I clicked on it, and up popped Skype's QoE window - which showed that the application not only measures the quality of the network (and its ability to support VoIP), but also checks the load on the processor, and whether the microphone and headset are working OK.

It told me that I had insufficient processor speed available for a decent VoIP call - I realised I had a looping script running in a browser window. I closed it, and then made my call.
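Skype's actual checks are proprietary, but the principle is easy to sketch. A rough illustration using the third-party psutil library (the thresholds are my own guesses):

```python
import psutil  # third-party cross-platform system-stats library

def precall_health_check(max_cpu_percent: float = 80.0) -> list:
    """Flag device-side conditions likely to degrade a VoIP call,
    in the spirit of the QoE window described above."""
    warnings = []
    cpu = psutil.cpu_percent(interval=1.0)  # sample CPU load for one second
    if cpu > max_cpu_percent:
        warnings.append(f"CPU at {cpu:.0f}% - close busy apps before calling")
    if psutil.virtual_memory().percent > 90:
        warnings.append("Memory nearly full - audio may stutter")
    return warnings

for warning in precall_health_check():
    print(warning)
```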

This was a real illustration of a problem I expect to see for LTE Voice: if the telephony application runs on top of the OS, rather than being baked into lower-level parts of the software stack and hardware, it is going to have to deal with the vagaries of the new "computing"-style environment. It is not obvious to me that handsets will automatically be able to prioritise VoIP as well as they have handled CS voice operations in the past, especially on multi-tasking phones.

This has many knock-on implications. It makes it much more difficult to test and certify the voice performance of the device, because the number of extra variables and dimensions increases (plus, for LTE, further variables relating to frequency bands complicate the issue). It also makes the performance sensitive to OS updates and versions.

It also means that there may be unexpected interactions with other software components - for example if an enterprise installs a VPN client to tunnel all of an employee's handset IP traffic via the company network. Does the VoIP / VoLTE client avoid this, or comply? Is that secure for all concerned?

And who owns the algorithm for prioritising the apps in terms of the resources they can access? The operator, the OEM or the user? What if the user decides that a data application is more important than voice (eg for healthcare, or anti-virus, or a financial trading app)? What is the "policy management" architecture for the handset's internal operation?

None of these issues appear for normal CS voice in GSM, as it is all essentially hardwired into a different part of the chipset. Nor do they occur in fixed VoIP, where most handsets are relatively "dumb". They do exist on PCs with VoIP clients - hence my Skype experience.

This means that it is likely not possible for an operator to claim control over end-to-end performance and quality of voice on LTE, even where coverage is complete.

The most likely response is to "leave it up to the handset vendors", but in my view that is an abdication of the responsibility to at least define *requirements*, if not actual full standards.

Instead, coupled with other variables in the LTE ecosystem, this means that voice is likely to be less reliable, and less testable, than is the case with 2G and 3G. It might have more fine-grained control in the transport part of the system - but that's of little benefit to a user who can't make a call for other reasons.

(QoE also includes battery life of course - another variable outside operators' control, and which has not been a primary focus in VoLTE to date, although the GSMA is now looking at related areas like "fast dormancy" on the radio network).

So we are making a move from device "certainty" to "uncertainty", irrespective of the performance and policy cleverness of the infrastructure. This, to me, suggests that VoIP players that are most-capable of coping with device uncertainty will have an advantage.

This means companies that can look out for processor performance, measure and manage battery life, let the user make easy choices ("Switch to GSM to preserve battery?", "switch to HD voice for this call?"), handle application mashups and conflicts etc. will be in a much stronger position for voice than those that just "leave it up to the IMS client".





I'm holding my nose and trying Twitter

My views on Twitter are well-known.

But in the interests of unbiased analysis, I'll have a try at using it.

My Twitter name is "disruptivedean". Or probably @disruptivedean using the silly Twitter symbology.

I'll try to update it fairly regularly (I'm at a few conferences over the next couple of weeks).

However, if I have not demonstrated clear ROI (either by getting identifiable new business revenues, or gaining research insights that are *incremental* not substitutional) within 3 weeks, I will be deleting the account.

Monday, May 17, 2010

NEW research paper on the Top 10 Technologies for mobile broadband traffic management

This post is a quick "early alert" for a new thought-leadership report published by Disruptive Analysis this week.

It is a concise, 24-page briefing on the technical options for managing traffic on 3G/4G networks, following on from earlier posts I've written on the diverging solutions that are emerging.

It outlines the different roles of:

  • WiFi Offload
  • Femtocell Offload
  • Radio network enhancements (including signalling management, macro network offload, and packet scheduling and prioritisation)
  • Compression, adaptation and transcoding
  • Device-based traffic management techniques (including compression, rate-adaptation and network-sharing)
  • Contention management & tuning TCP/IP
  • Deep packet inspection, policy-based traffic shaping & differential charging
  • End-to-end service assurance and monitoring
  • Caching, multicast & CDNs
  • Congestion APIs
It also examines some "starting point" recommendations for operators assessing their options for reducing traffic on congested networks, as well as discussing some of the most worrying operational and tactical pitfalls that may occur.

(Note: this is not a full analyst "strategy report", but a short document to introduce the expanding variety of technical options emerging to solve the "capacity crunch". It does not include detailed analysis of vendor positions or the minutiae of architectures. Further details on business models are covered in other research reports)

I am going to be at the LTE event in Amsterdam over the next two days (ash-cloud permitting). I will be setting up a fancier electronic document-delivery system at the end of the week, and talking through the analysis in more depth.

However, if you're desperate to get the research paper immediately, I've set up a "get it now" button for instant purchases which then go straight to the PDF download page. Prices start at US$350 for 1-5 users, plus VAT for UK/EU customers.

ONLINE PAYMENT FOR NON-EU CUSTOMERS ONLY - For UK / EU purchases, please email information AT disruptive-analysis DOT com for VAT invoicing.

The payment is done via Paypal's merchant system, but you do not need a Paypal account - you can pay by credit card if you click on the link in the lower left-hand corner of the payment page.

Telcos' own-brand iPhone apps

Just out of curiosity, I decided to have a look through the Apple AppStore to see what other telco-branded applications are available.

So far, I've found:

- BT Exchanges local search and yellow-pages style app
- Orange UK WiFi Hotspot finder, plus the Orange Wednesdays movie application and "Your Orange"
- AT&T "Connect Mobile" conferencing app, Yellow Pages and NBC Olympics apps
- Telefonica Espana "123", O2 UK - My O2 account management app (useless without an O2 account), O2 Czech IPTV guide called "O2TV"
- Telstra's "Official V8 Supercars" apps for Australian motor-racing fans
- Portuguese operator TMN's "Pond" social media aggregation tool
- Taiwan operator Far EasTone's "Do U Love Me?"
- Four apps from Korean operator KT Corp, none of which I can understand
- Something from Softbank Mobile, in Japanese, that I don't understand
- Various Vodafone apps including People Sync (part of 360), a multi-headed social network front end called "Vodafone Update", and others from various local properties including Omnitel (Italy) and its Turkish subsidiary

I guess we are moving slowly to a world in which operators start to become so-called over-the-top providers of applications, but there is still huge resistance from the corporate culture of blending access and service.

Sunday, May 16, 2010

A problem with WiFi-based offload?

In general, I'm a fan of using WiFi to reduce load on 3G macro networks, especially for laptops used with 3G USB dongles and smartphones like the iPhone. Where the connection manager works well, there are various models which can permit the device to connect to home/office or public WiFi.

In particular, I'm seeing a fair number of mobile operators use their own, or partners', networks of public WiFi APs. Most notably, AT&T acquired Wayport and has other hotspot footprint for use in iPhone offload, while Vodafone has cut a deal with BT OpenZone in the UK.

However, there is one problem as the density of such offload points increases. As well as the main public OpenZone points in London, I can also use the BT Fon "virtual hotspots" as an extension of my home broadband account.

While this is great for my laptop, it's causing me difficulties on my smartphone. The density of BT / Fon / OpenZone access points in central London is so high (in homes, offices and other locations) that as I walk down the street, my iPhone keeps attempting to register. But by the time I've got online, I've walked past that access point and on to the next one along the street with a stronger signal.

If I walk 5 minutes from home to my local tube station, I need to switch off WiFi temporarily, if I actually want to use mobile data - otherwise I have a constant stream of pop-ups from the connection manager on-screen, and no reliable connection.

A high density of femtocells probably wouldn't help either, until there's a reliable way of doing femto-to-femto handoff as you walk down the street.
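On the WiFi side, the obvious mitigation would be some flap-damping in the connection manager: only associate with an access point whose signal stays usable for a dwell period. A toy sketch (the signal-strength hook is an assumed platform API):

```python
import time

def should_associate(read_rssi_dbm, min_rssi_dbm: int = -70,
                     dwell_seconds: int = 10) -> bool:
    """Only join an access point if its signal stays usable for a while.
    A pedestrian walking past a weak home hotspot never satisfies this,
    so the phone stops flapping between street-side APs.
    read_rssi_dbm: a callable returning current signal strength in dBm."""
    deadline = time.time() + dwell_seconds
    while time.time() < deadline:
        if read_rssi_dbm() < min_rssi_dbm:
            return False  # signal dipped - probably just walking past
        time.sleep(1)
    return True
```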

Food for thought.




Friday, May 14, 2010

An open letter to Vodafone on data roaming pricing

Dear Marketing and Pricing Executives at Vodafone,

In the past, I've been pretty complimentary about you guys, especially with regard to things like Passport and innovations around third-party paid mobile broadband.

My attitude has just changed polarity.

In February, I switched my main personal mobile account to an iPhone on Vodafone UK. My main criterion for choosing Voda over iPhones supplied by O2, Orange or Tesco was specifically about international roaming charges. All the other pricing was much the same, and I thought that perhaps Vodafone's network might be less congested than O2's, without a million other iPhone users sharing it.

So I went to my local Carphone Warehouse, asked about the different international fees, checked online - and picked the Reds on the basis of a simple and intelligent charging structure, which seemed to be much less of a rip-off than the others.

We all know that most international data roaming is ludicrously-priced, and that it needed heavy regulation from Brussels to get European tariffs into the vague realm of sanity, with notification-of-charge and so forth. The whole industry privately agrees that data roaming pricing is a joke, even if few people at operators want to be seen killing the golden goose by acknowledging it in public.

But I thought that the costs on Vodafone - basically a flat fee of £5 per day for up to 25MB - were not too unreasonable. It's about half the price of a day of hotel WiFi, or perhaps 2 hours in an Internet cafe. Or 10x the price of Vodafone's UK prepaid daily tariff, which gives 25MB for 50p.

Expensive but acceptable, especially as I'm checking my business email. 25MB is ample for non-heavy use of an iPhone: email, catching up on blogs, a quick bit of Google maps and Facebook. Most importantly, I can be pretty confident that I can just use my phone normally, without double-checking the amount of data usage on the settings menu once an hour.

I know that the current EU-mandated wholesale price cap in Europe is €1 per MB, falling to €0.80 from July. I also know that Vodafone has its own footprint across Europe, so it's not as though it's getting stung for lots of wholesale data fees, as generally you're on-net anyway - so any disparities wash out on inter-country transfer pricing.

So, given that the only reason I'd chosen Vodafone in the first place was because of this tariff - and I'd recommended it to other people as well - I wasn't best-pleased to receive an SMS announcing:

"On June 15 data roaming prices are changing plus some countries will be moved into a different travel zone"

The new prices are £1 per MB, up to £5, then £5 for every 5MB after.

In other words, for 10MB the price has doubled, and for 25MB it has quintupled. For 25MB, that's 50x the cost of the domestic daily price. Apparently you get SMS alerts when close to 5MB and 10MB thresholds.
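To spell the arithmetic out (prices as announced; treating partial 5MB blocks as full ones is my assumption):

```python
def new_roaming_cost_gbp(mb_used: int) -> int:
    """New tariff as described: £1 per MB up to £5,
    then £5 for every 5MB block after that."""
    if mb_used <= 5:
        return mb_used
    extra_blocks = -(-(mb_used - 5) // 5)  # ceiling division
    return 5 + 5 * extra_blocks

for mb in (5, 10, 25):
    print(f"{mb}MB now costs £{new_roaming_cost_gbp(mb)} (was a flat £5 per day)")
# 5MB: £5, 10MB: £10 (doubled), 25MB: £25 (quintupled)
```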

I see that the press release somehow manages to pitch the change as being positive for customers. I'm impressed that you seem to have managed to hire Alastair Campbell and Peter Mandelson so soon after the general election, to spin bad news for you.

I guess I'm back to using WiFi only on the iPhone when I'm travelling, and I'll go back to taking an unlocked Nokia and buying local data SIMs again.

I've double-checked this with customer service (your IVR system is broken, by the way) and asked to get transferred to someone whose job it is to deal with what I described as "high nuisance value" customers like me. I double-checked with her, too, that this applies across the board, including iPhone tariffs.

So. My main (only!) reason for choosing Vodafone against its competitors has just been removed at a month's notice. Three months into an 18-month contract, so I can't just churn immediately as I'd like.

Honestly guys - I've heard Vodafone talk at various events and conferences about customer loyalty, stickiness and so forth. Have you not worked out that if you show *contempt* for your own customers, that might work against you? You even got a plaudit the other day for smartphone customer loyalty - and up until 3pm this afternoon I would have given the survey a thumbs-up as well. Let's see what the next one looks like, eh?

[Edit: Note to all mobile operators: as long as you pull stunts like this, do you *really* think you can convince your customers to sign up for mobile payments/wallet service, or managed identity & authentication, or similar? How do I know you're not going to change the rules mid-contract to permit spam or charge extra? This whole idea of massive price changes to live contracts illustrates a huge amount of bad faith and a lack of business professionalism]

It's not as though you're increasing the per-GB price for mass mobile broadband downloads, either, where perhaps there's an argument that costs and prices are out of kilter. This is a per-MB roaming price - quite probably the most overpriced, unjustifiable and hated item on operators' tariffs in the entire industry.

Now it's vaguely possible that this cost somehow reflects increased signalling through VLRs and RNCs, rather than actual data downloads, because of regular international data connection setups from smartphones. But if that's the case, you should say so - and frankly I can't believe anyone uses $20 worth of VLR resources per day. It would be cheaper for you to give users a local Vodafone SIM to switch to.

Overall, it's fair to say I'm furious about this. Oh, and I wasn't going to mention this before today - but your network in central London sucks too. The number of times I've been with friends with O2 iPhones who have coverage when I don't is amazing. As is the number of dropped data connections, mysterious "403" errors that need me to switch off & re-register, and any number of other glitches.

I've written before about "Resentment-Based Pricing" and "Active Customer Disloyalty". Looks like I've got a good case study.

So Vodafone people: any excuses? Is it genuine customer contempt, or just accidental?

Responses welcome either on this blog or via email.

Dean

(Note 1: journalists - feel free to quote me)
(Note 2: Google, you should probably work out a way to get the Maps app to send a GPS look-up via SMS when the user has data roaming switched off, returning with the nearest free WiFi cafe and a voucher for a discount espresso)





From "dumb pipe" to "happy pipe"

Recently, I've been wondering exactly who coined the term "Dumb Pipe".

David Isenberg wrote a piece called "The Rise of the Stupid Network", in 1997, when he worked at AT&T. A copy is still available here, but although it uses the phrases "dumb bits" and "dumb transport", it doesn't mention the word "pipe".

The negative associations with this snappy, convenient epithet have probably cost the telecoms industry a trillion dollars. It is so unappealing, it seems to induce an almost visceral and irrational fear. You only have to look at the way that some vendors sneer "You don't want to end up a dumb pipe, do you?" to recognise that we're beyond cool-headed analysis and getting close to some legal form of discriminatory "-ism" here.

For the last couple of years, the term "smart pipe" has bubbled around, making a few people think a bit more closely about areas like policy, QoS and so forth. Yet it still does not appear to have dented the shield of fear or bias around the "dumb pipe" dystopia that many perceive to be encroaching. The frenzied and ridiculous appeals to the European Commission for a "Google Tax" are prime examples of this. In essence, the lawsuits appear to say:


"We are dumb.... so can you please tax the clever people for us?"

Recently, I've been using the term "Happy Pipe" instead, to point out some possible different futures - and also to confront the almost bigoted preconceptions that surround networks' supposed "dumbness". This applies to both fixed and mobile broadband - although the challenges, technologies and capacity are different, as are the business models emerging to monetise the happiness.

There are a few separate strands here:

  • Today's networks are pretty far from dumb, and there is huge value in deploying and running them well
  • The smartest networks are the ones which work collaboratively *with* Internet and content companies, not antagonistically against them. This specifically relates to areas like policy management.
  • There is much under-exploited potential for revenue around wholesale models. There are many potential business opportunities, both for "bulk" wholesale and "slice and dice" methods of deriving extra fees for capacity and value-added services.
It is conspicuous that it has required a range of new players in mobile data, such as Jasper Wireless, to develop the extra functionality that translates between a carrier's wholesale offerings and the consumer electronics and M2M firms that wish to exploit connectivity in their new products. While some operators have innovated in their platforms (eg Telenor, speaking last week at Telco 2.0 about M2M), others have become introspective.

It is difficult for a camera manufacturer to think "I'd love to sell a new SLR with 1000 photo uploads included" and then find someone who could structure that deal, because it's not a "subscription". It's difficult for a hotel chain to shop around for a way to part-subsidise roaming charges for its international guests. It is difficult for an Internet video provider to get a "network congestion API" so it can cleverly rate-adapt its codecs during peak hours, or even just push a message to its users warning them of likely buffering delays.
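No such congestion API exists today, but it isn't hard to imagine what the video provider's side could look like. A sketch (the endpoint, response fields and bitrate ladder are entirely hypothetical):

```python
import json
import urllib.request

# Hypothetical operator endpoint returning per-cell load: 0.0 (idle) to 1.0 (saturated)
CONGESTION_URL = "https://api.example-operator.com/v1/congestion?cell={cell}"

BITRATE_LADDER_KBPS = [1500, 800, 400, 200]  # best quality first

def pick_bitrate_kbps(cell_id: str) -> int:
    """Step down the codec bitrate ladder as the serving cell fills up."""
    with urllib.request.urlopen(CONGESTION_URL.format(cell=cell_id)) as resp:
        load = json.load(resp)["load"]
    step = min(int(load * len(BITRATE_LADDER_KBPS)),
               len(BITRATE_LADDER_KBPS) - 1)
    return BITRATE_LADDER_KBPS[step]
```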

There are so many ways that the capabilities of a broadband network - fixed or mobile - could be used to improve customer experience, work more effectively with upstream partners, improve traffic management without interfering with users' expectations and unlock new revenue streams.

I covered a large amount of analysis on these and other sub-themes in my recent report on Fixed and Mobile Broadband Business Models, published by Telco 2.0 (details are here or email information AT disruptive-analysis DOT com).

The report concludes that operators have four main strategic choices:

- Becoming a full, Telco 2.0-style service provider with a broad set of retail, wholesale and "two-sided" propositions, engaging with users, developers, content providers and so forth
- Becoming a "happy pipe" provider, focusing more on wholesale propositions in addition to class-leading access and related infrastructure based value-added services
- Becoming a "government department" - ie running national broadband networks or critical infrastructure like electricity smart grids.
- Becoming a "device specialist" focused on creating user experiences and product/service end-to-end propositions in either fixed or mobile domains - exploiting Moore's Law, rather than betting against it.

These are not mutually exclusive, and certainly I would expect the very largest operators to have a foot in all camps, especially where they have multiple national properties, or dedicated wholesale divisions. Fixed operators with "structural separation" provide an interesting model for their peers in mobile.

One other missing piece of the puzzle is exactly what type of services can/should be offered on top of access - and how they should be charged. The simplistic attitude that YouTube / Skype / Facebook / Salesforce.com somehow act as predatory "over the top" providers, disenfranchising operators from their rightful revenue streams, is weak thinking.

There is no reason why Verizon or Orange or China Mobile could not have acquired YouTube instead of Google - except the telcos' historical inertia behind maintaining the link between access and service businesses. Despite the past 10 years, there is *still* a reluctance by network owners to offer services beyond the confines of their own access customer base - thus denying themselves the global scale required to compete with Internet-based providers. Yes, those services may well be *enhanced* over their own infrastructure, but that is not a reason to eschew pushing the widest possible distribution as well.

The bottom line is that we need to move away from this "dumb pipe" slogan. Separating connectivity and service is inevitable in a lot of ways - but that can actually add value to providers of both.

(In addition to the research report on business models, Disruptive Analysis also undertakes strategic consultancy for vendors and service providers in this area. This encompasses diverse aspects including management workshops, business plan review, competitive analysis, organisational development and executive coaching, and studies of market dynamics and forecasting.)




Sunday, May 09, 2010

Mobile broadband traffic - be careful about language

I am currently writing a Disruptive Analysis research report on mobile broadband traffic management strategies. I have discussed various concepts on this for the past year or so - the relative merits of offload, compression, policy management and so forth.

One important thing for vendors and operators to keep reminding themselves of is the importance of accurate language, logic and semantics. The wrong words can drive poor decision-making, especially on "emotive" issues. Non-sequiturs and logical fallacies can lead discussions or engagements astray.

One of the most mis-used words is "capacity".

What triggered this post was seeing a sentence along the lines of "3% of mobile data users take up 40% of capacity".

This is almost certainly untrue - as very few networks (none?) actually run at a capacity-utilisation rate above 40%, especially when averaged across all cells. If it were true, there would be almost-permanent and geographically-ubiquitous congestion for mobile data.

Add in to this the fact that "capacity" is actually an ill-defined term embracing multiple separate variables (uplink capacity, downlink capacity, signalling capacity etc) and measurable at various points in the network, and it becomes even more useless as a description of the current state of affairs.

What I expect may be the more accurate statement is "3% of mobile data users account for 40% of aggregate downstream traffic".

Which is an interesting observation - but not in itself a "problem statement", and certainly not something that can immediately lead to conclusions such as "... therefore flat-rate pricing is untenable" or "... therefore it is critical to manage specific applications".

Those are examples of non-sequiturs which are potentially damaging. There is no direct logical connection.

Instead, it is critical first to understand what the problem actually is. So, 3% of mobile data users account for 40% of aggregate downstream traffic - but what impact does that have, either on the other 97% of users, or the operator's cost base?

If that 40% of traffic was confined to lightly-loaded rural cells in the middle of the night, the impact on other users would likely be zero, although it might carry some variable costs associated with peering. If that 40% was instead concentrated in the busiest urban cells in the middle of the day, when existing capacity really is creaking, then there's a much more pressing problem.

But what if heavy users tend to download a lot at night... but then have usage during daytime that is broadly on a par with everyone else? They are then not using capacity in a way that causes any more congestion than light users. It could even be that a nominally light user, doing a sudden big burst of mobile video at 9.30am on the bus to work, causes more problems than another user trickling P2P traffic throughout 24 hours.

And in each of these cases, there are varying signalling loads as well. A smartphone user checking his email 10 times an hour might be causing more headaches than a laptop user watching 15 mins of video once a day.
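To illustrate the kind of granularity needed, here's a toy sketch - the record format and cell labels are invented, but the point is to attribute traffic to busy cells at busy hours, rather than just ranking users by monthly totals:

```python
from collections import defaultdict

# Invented usage records: (user, cell, hour_of_day, megabytes)
records = [
    ("heavy_user", "rural_7", 3, 900),   # overnight bulk download
    ("light_user", "urban_2", 9, 120),   # video burst in the morning peak
    ("heavy_user", "urban_2", 13, 40),   # ordinary daytime use
]

BUSY_HOURS = range(8, 20)  # assume congestion is a daytime, urban problem

busy_hour_mb = defaultdict(int)
for user, cell, hour, mb in records:
    if hour in BUSY_HOURS and cell.startswith("urban"):
        busy_hour_mb[user] += mb

# The nominally "light" user turns out to be the bigger busy-hour
# contributor, despite the heavy user's far larger monthly total.
for user, mb in sorted(busy_hour_mb.items(), key=lambda kv: -kv[1]):
    print(user, mb)
```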

My view is that until there is really good, really granular data on actual usage patterns (and scenarios and forecasts for how that might change in future), knee-jerk comments about "bandwidth hogs" are likely to cause more trouble than they solve.

Instead, I am working on a priority list of actions that operators can take to reduce the pressures on the network without creating unintended consequences in terms of user experience, customer satisfaction, or fixing "the wrong problem".

There are various actions - and technological avenues - that can be pursued without risking money on over-complex solutions. I am particularly skeptical of policy management approaches that focus on application differentiation, rather than (for example) time-of-day.

Watch this space for more extracts from the analysis.

(As well as the research study, I am also sharing my views and data on this in private advisory consultations. Please contact me for further details - information AT disruptive-analysis DOT com)




Thursday, May 06, 2010

Paying for mobile QoS? Three thought experiments

A regular refrain from vendors I speak with is that content companies, or application providers, could be persuaded to pay for extra mobile broadband "quality". The argument goes that a video website or cloud computing provider would pay for guarantees of absolute or relative prioritisation, bandwidth levels, latency, jitter etc.

Irrespective of the legal situation - which in any case varies by country and over time, I have my doubts about the technical and commercial practicality.

I think it is much more achievable in the fixed world, where the operator doesn't have to contend with the vagaries of radio, and where the presence of a "box" like a gateway gives a much better chance of monitoring what is actually delivered. The WiFi or ethernet connection at the fixed-broadband end-point to a final end-device (PC, TV, phone, tablet etc) also gives a clear demarcation point of responsibility. The operator can say with confidence that their bit of the end-to-end system did its job - and any issues with battery life, memory, device configuration and so forth are your problems. That's much more difficult with a smartphone - if the extra-quality video doesn't work, whose fault is it? And does the video provider still pay?

Nevertheless, whenever I spell out my concerns about differential charging for applications, I get bombarded by vendors (and some operators) insisting that their DPI box can detect absolutely everything, right down to what the user had for breakfast that morning.

Rather than reach an impasse, I thought that as well as the commercial, legal and technical reasons, I'd also probe some logical flaws in the argument - and perhaps highlight some extra opportunities along the way.

First off is prioritisation of the operators' in-house services. Many mobile carriers have their own video streaming, music or other rich application/content platforms. I'm assuming that some measure of optimisation is typically used by the operators to ensure these perform well - obviously it will be easier to test in-house, and senior management can ensure adequate cooperation between network and application teams.

But.... if "real" quality can only be achieved at the level of manageable network QoS... and if "serious" content providers are willing to pay for it... then why not set up an effective structural separation between the services group and the network delivery team? If it actually came out of their own budget and P&L, would the in-house video content team really pay money to the other department for improved network access? Or would they instead use rate-adaptation and other tools to work around the limitations of best-efforts delivery?

I haven't heard of any operators running an internal QoS market, but I'd be fascinated if any readers have anecdotes.

The next thought experiment takes this concept a bit further.

Now, consider the situation once again, where the operator's video content team is willing to pay extra for QoS to ensure their streaming is delivered better than it would be from the open Internet.

And consider that another operator offers network-based QoS - perhaps in the same country, or perhaps elsewhere in the world. Given there's already an "open market" in video streaming via YouTube, Hulu and so on.... shouldn't that operator's in-house team therefore be prepared to deliver its content via other carriers' networks? Let's say, for the sake of argument, Verizon providing its video service to users on Telefonica O2 in the UK.

Given that they are in-house teams within operators, surely *they* understand better than anyone the capabilities - and potential differentiation - that come from network-based prioritisation and QoS? If Verizon paid guaranteed-QoS fees to O2, shouldn't it be able to create and market a class-leading video service that end users would pay for? And shouldn't the O2 network team also think that Verizon's video people are therefore much easier to sell QoS to than, say, YouTube?

In other words.... if operators (and their vendors) really believe that "premium" network QoS can enhance the competitiveness of applications and content, or raise ARPU and improve customer satisfaction... why don't they put their money where their mouths are? If Telco X is clever enough and network-savvy enough to create a QoS-managed service that should outperform YouTube... why hasn't it happened?

My last point is not about prioritisation, but coverage. Often, the gating factor on overall Quality of Experience is not radio or transport or core resource, it's a simple lack of decent signal. (Yes, I know that in theory coverage is a bit dependent on other users in the same cell, but let's just assume the culprit here is a thick stone wall).

Certainly, Vodafone is marketing a femtocell in the UK under the name "Sure Signal" - and getting users to pay a premium for an "enhanced quality network" in their home or workplace. While some of the customers are just buying the femto to get any reliable signal at all, there is some anecdotal evidence that a proportion want "better than normal" coverage - "5 bars, all the time!" - for example if they live indoors but near the edge of a cell. This tends to support the argument that a certain (smallish) group of users might pay extra for "gold service" QoS, however that is defined.

So then the question is... for an operator wanting to offer an improved *average* experience, both in terms of absolute coverage, and higher performance throughout each cell - is the best and cheapest mechanism really through network prioritisation? Or might it be more effective to do some sort of national-roaming deal with competitors, where the phone switches to a rival's network at a given location and time, if that network has a better signal and uncongested capacity?
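The selection rule itself could be almost trivially simple, at least in principle. Here's a hedged sketch - the NetworkReading structure, the pick_network() function and the thresholds are purely my own illustrative assumptions, and it glosses over registration delays, regulation and wholesale billing entirely:

```python
# Illustrative sketch of a "codeshare" network-selection rule. The data
# structure, function and thresholds are assumptions for the sake of
# argument, not any standardised mechanism.
from dataclasses import dataclass

@dataclass
class NetworkReading:
    operator: str
    signal_dbm: float      # received signal strength
    cell_load: float       # 0.0 (idle) to 1.0 (congested)

def pick_network(home: NetworkReading, rivals: list[NetworkReading],
                 min_gain_db: float = 10.0,
                 max_load: float = 0.7) -> NetworkReading:
    """Stay on the home network unless a rival is clearly better:
    stronger signal by at least min_gain_db AND not congested."""
    best = home
    for rival in rivals:
        if (rival.signal_dbm >= home.signal_dbm + min_gain_db
                and rival.cell_load <= max_load):
            if rival.signal_dbm > best.signal_dbm:
                best = rival
    return best

# Example: behind the thick stone wall, the home network reads -110dBm
# while a rival's nearby cell reads -85dBm and is half-empty - switch.
print(pick_network(NetworkReading("Voda", -110, 0.4),
                   [NetworkReading("Orange", -85, 0.5)]).operator)
```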

Wouldn't it make sense for the mobile industry to have the equivalent of the airlines' interlining and codesharing agreements? In those situations, you can get an end-to-end ticket issued on Airline A, which covers one leg actually operated by Airline B. Your luggage gets "handed off" seamlessly and Airline A takes overall responsibility for end-to-end quality. Airline A benefits from Airline B's better coverage or schedule at a local level, while Airline B gets incremental revenue and traffic from Airline A's better sales and distribution to end users. Interestingly, low-cost carriers like EasyJet and Ryanair generally don't participate in this type of arrangement - only the premium-priced airlines do.

The analogy is simple.

If a customer with the "nameplate" Vodafone service sometimes actually gets connectivity via the Orange or T-Mobile network, would they really care, as long as their average performance went up? And, in fact, might they not even pay a premium for it? Could there ever become a distinction between a "full service network" (which puts you on one of its nominal rivals' cells when it gives you better coverage, but charges you more to cover the wholesale fees), versus the low-cost network which is all-or-nothing, running just its own, silo'd coverage?

Is the answer to better QoS (which customers are prepared to pay for) not another box in the network, but just better/richer wholesale arrangements and national roaming? Can we go further than current developments in network-sharing, and have a more generalised platform for such deals?

[Note: I know that it's not that simple, either because of regulation or because it takes a finite time to find, register and roam onto another network. But both obstacles could be reduced by an industry effort to find a convenient codeshare/interline approach]






Wednesday, May 05, 2010

Does the outcome of the Dutch 2.6GHz auction represent scepticism about LTE?

There are various spectrum auctions either ongoing at present or due in the near future. The big ones are the Indian 3G and BWA auctions (2100MHz and 2300MHz respectively), as well as a multi-band auction (800 / 1800 / 2000 / 2600MHz) in Germany.

But there is a smaller one that has just finished in the Netherlands, for the 2.6GHz band only. The outcome has been pretty lacklustre - just €2.6m in total. Martin Sauter has the breakdown here.

Trying to analyse this a bit further, my current thoughts are:

- The two newcomers both have extensive fixed broadband assets - Tele2 has 431k subscriptions and the other (Ziggo) is a joint venture between cable operators. That potentially points to an "inside-out" strategy at 2.6GHz, plus Tele2 attempting to switch some traffic (data?) away from its MVNO arrangement.
- There are only three incumbents, which means that competition for spectrum in other bands is not as harsh as in other markets. Nevertheless, it seems odd that they only bid for 2x5 and 2x10MHz - although it's not immediately clear how those fit with the spectrum caps under the auction rules. The 2x5MHz is particularly strange, as it potentially means lower peak and shared rates for devices running in a "hotspot" 2.6GHz location than a wider carrier would allow - see the rough numbers after this list.
- We can pretty much write off any opportunity for mobile WiMAX or TD-LTE in the Netherlands for the foreseeable future, given the lack of bids for unpaired TDD spectrum.
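To put some rough numbers on that 2x5MHz point: LTE peak rates scale roughly linearly with channel bandwidth. The figures below are back-of-envelope assumptions (approximately 150Mbps in 2x20MHz with 2x2 MIMO and 64QAM, ignoring overheads and real-world radio conditions), not vendor-quoted specs:

```python
# Rough, illustrative arithmetic only: LTE downlink peak rate scales
# roughly linearly with channel bandwidth. Assumes ~150 Mbps in 2x20MHz
# (2x2 MIMO, 64QAM), ignoring overheads and real-world SINR.
PEAK_AT_20MHZ_MBPS = 150.0

def approx_lte_peak_mbps(bandwidth_mhz: float) -> float:
    """Scale the nominal 20MHz peak linearly down to narrower carriers."""
    return PEAK_AT_20MHZ_MBPS * bandwidth_mhz / 20.0

for bw in (5, 10, 20):
    print(f"2x{bw}MHz: ~{approx_lte_peak_mbps(bw):.0f} Mbps peak downlink")
# 2x5MHz:  ~38 Mbps
# 2x10MHz: ~75 Mbps
# 2x20MHz: ~150 Mbps
```

In other words, a 2x5MHz carrier caps out at roughly a quarter of the headline rate - and that peak is shared among every active user in the cell.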

One interesting possibility is that the Netherlands' very high fixed broadband penetration might mean that operators are looking to WiFi and femtocells rather than spectrum additions for capacity enhancement. The Dutch are already among the leaders in the deployment of picocells as well - both for public locations and for low-power GSM.

Another question mark is around LTE. The results of the auction suggest that 2.6GHz (the main likely band for LTE in Europe) is not seen as particularly strategic - which may reflect overall reticence about the technology in the Netherlands. I've suggested before that operators should lean on their vendors (and chipset suppliers) for support of 2.6GHz HSPA, which would seem to fit better with these allocations.

I'll try and catch up with the German, Danish and Indian auctions over the next week or so.



(There is also an ongoing 2.1 / 2.6GHz auction in Denmark)

Tuesday, May 04, 2010

Why I think the iPad won't change anything

At last week's Telco 2.0 summit in London, I crossed swords with financial analyst Richard Kramer of Arete Research.

He has a view that the PC industry (and specifically laptops) has failed to innovate for much of the last 10-20 years, and will be overturned by newcomers, particularly Apple's iPad and more generally a new wave of tablet-style competitors. He was less definitive about the role of telcos in supporting these devices, but definitely felt that they represented a step change in how people engage with the web and various forms of content. He singled out the newspaper and magazine industry as being a prime candidate for iPad-isation in similar fashion to the iPod and music.

I disagree strongly. I believe that the iPad is a side-show, albeit a glamorous one. I also have extremely grave doubts about the mass-market viability of next-generation tablets (or MIDs, or smartbooks, or mobile computers etc) based on Android, MeeGo or Chrome OS. I'm even less sanguine about the possibility that there could be a telecom operator model underpinning those ecosystems.

My belief is that the PC industry is guilty not so much of a lack of innovation, but a lack of cohesive marketing strategy. There is no "Windows PC Community" estimating the incremental GDP arising in the developing world from PC-based Internet access and software industries. There is no glossy marketing pointing out that sharks haven't bothered evolving for 300 million years, because they are essentially perfect for their niche.

Instead, the PC industry has gleefully taken the GSMA's shilling, hoping for a few extra crumbs of operator subsidy and extra retail outlets during the recession. At a time when banks and credit card companies were taking a dim view of incremental consumer purchases, mobile operators cleverly managed to disguise loans under the guise of "free" laptops. They have been complicit in pretending that "embedded 3G" was going to be pervasive, despite knowing full well that most netbooks are sold through ordinary channels to people wanting a cheap PC for use on WiFi at home or in school - or end up clogging the 3G networks with commodity traffic supplied via commodity dongles.

Let me switch to the iPad, and by extension other tabletty-type computers that might come next.

Yes, it's pretty. Yes, it's sold to quite a lot of the usual star-struck Apple-istas already. Yes, I'm sure there are plenty of uses for it. But there are plenty of uses for many cool gadgets which make them appealing to gadget-lovers. And I guarantee that every single one of them will own (a) a mobile phone and (b) a computer (PC or Mac) already, and wouldn't give them up if you paid them.

You'll notice that the iPad has been cleverly positioned by Apple so as to avoid any risk of competition with its Mac range. Jobs clearly believes that people will want a fully-open computer as well as at least one locked-down device. How many Macs would he sell, if he prohibited them from running Flash? Or only allowed apps or content that had been vetted by his censorship team? Or restricted the use of external media like SD cards?

Now, I certainly can't blame Apple for trying to create a potential new pool of profit from a cool gadget betwixt Computer and Smartphone. If it takes surplus cash away from people who'd otherwise be buying electric can-openers or new TVs, then fair enough.

But to claim (as some people do) that it either:

- a) renders netbooks and laptops obsolete, or
- b) heralds a mass switch-over from print media

sounds ridiculous to me.

Yes, netbooks are *mostly* used for web-based applications [such as my writing this post on my Samsung], but I definitely want a full suite of native applications as well, which I choose and install myself, not subject to the vagaries of Apple's appstore. I cannot imagine myself with an iPad writing this - with my email, streaming music, Skype and Yahoo Messenger running in the background, working on a number of office files as well.

Now that doesn't mean that I couldn't ever use an iPad myself as well - I already browse the web a fair amount on my iPhone, even though my netbook is just downstairs. So there is an argument that Internet usage will become more segmented by task type. If I want to move pictures from my camera to my hard drive, and selectively upload a few to Facebook, I'll use my desktop. If I'm on a plane, I'll use the netbook. If I'm in bed and want to check my email and overnight SMSs first thing in the morning, the phone.

So maybe it's a device for the 10-20% of Internet usage time when you don't have anything else to hand.

The print media thing is a bit different, and clearly is outside my main domain of industry coverage. And I understand that there's a whole world of pain in that industry at the moment. But I'm unconvinced the iPad is the answer for more than a tiny fraction of readers. I buy a fair number of magazines, a fair number of books, and I read a fair number of newspapers (some of them free and disposable, like London's Evening Standard). I'm expert at folding them to read on the Tube. I usually have reading material for flights. I've got a stack of old Wired magazines around home, and a few copies of Top Gear for anyone desperately in need of reading material in my bathroom. I have a bookcase full of Lonely Planets and Rough Guides - my traveller's equivalent of a trophy cabinet.

Yet I don't have a Kindle, nor have I seriously considered getting an e-reader. I've only ever seen three people with them in London, one of them a semi-famous TV celebrity who was in my local Starbucks, hoping people would first notice it and then recognise him.

I simply cannot see a situation where a large bulk of the world's readers of the FT or Cosmopolitan or Harry Potter go digital *in substitution* of their usual print media. It's not like music, for which the move from CD to MP3 reduced the fallibility and bulk of moving parts, and for which headphones insulate you from the vagaries of the environment. People read in places with no power, bright light, risk of theft, or where comfort and tactility are all part of the experience (armchair + book + whisky, or cafe + cappuccino + newspaper in the sunshine). They may want to avoid carrying a bag - or baulk at the need to carry both tablet and PC together.

I can understand the appeal of the interactivity of a connected tablet for media owners and their advertisers. But I am just unconvinced that the user experience and intangible benefits of print have been given as much thought.

In summary, I can see a market for iPad-type devices of a similar scale to (say) personal navigation devices - maybe a worldwide target audience of perhaps 50m people. There are some fascinating niches - perhaps education, or gaming, or a few video applications. But I cannot see them replacing PCs (or Macs or netbooks), nor making a meaningful dent in the consumption of newspapers or magazines. And outside a few metropolitan hotspots, I can't see them heavily impacting operators' revenues or their networks either.

[Note: if you represent a company in the mobile industry that wants a contrarian view of device strategy and its impact on business models, please get in touch with me via information AT disruptive-analysis DOT com]