Wandering around Barcelona last week, I started feeling a deep unease at the current level of hysteria around mobile apps. It was compounded this week by seeing a T-Mobile advert on the London Underground which didn't show a phone, but just said "Would you like a free phone with apps for just £20 a month?" [meaning "We'll sell you a cheap Android instead of an iPhone, but don't dare mention it or show it"]. Apple is bombarding the world with "apps, apps, apps" advertising as well.
I'm a great believer in the Tyranny of Consensus. When everyone agrees - it usually means that they're all wrong.
I'm wondering if the great drive towards mobile applications on smartphones, catalysed by Apple although obviously around for years before, has the longevity that many people seem to be assuming. Operators, device vendors, OS providers, 3rd parties - everyone wants a piece of the supposed action.
But maybe it's just a fashion? After all, do you *really* want any form of ongoing "relationship" with a handset manufacturer? Will the mass market really want to keep adding new stuff to their device?
The first 100-200m owners of PCs bought and installed lots of applications. The most recent 100-200m have probably just got Office, a browser, Norton or some other security package, Skype and their favourite IM client. Apart from gamers, most people don't continually look for and download PC apps - although they'll occasionally grab one if need strikes.
Maybe the mobile will go the same way - you'll get a phone with a pretty good set of pre-installed capabilities, perhaps through some sort of pre-sale configurator. "Skype - tick. Spotify - tick" and so on. You'll get another set of applications when you set it up for the first time or over the first week of ownership. And then on an ongoing basis you'll get occasional updates of these, but it will only be once in a blue moon that you'll actually download anything new.
(One slight difference is the current perceived trust in AppStore downloads - there's not the same worry as with grabbing some random .exe.)
Most "cool new stuff" will be in the browser, just as it is with the PC. And maybe, just maybe after you've got used to it, you'll bother to find out if there's a 20%-better application. Once there are easy metaphors for multiple browser windows and tabs on mobile, and more ubiquitous support for multi-tasking, the idea of a "widget" becomes obsolete. They're just contrivances to get around small screen size, I think.
Yes, there is an alternative future, where we're all browsing app stores on a daily basis. Perhaps it could come true. I have to be careful here, because when I'm wearing my "personal" hat, I have an in-built bias as I'm utterly uninterested in mobile applications, bar a very, very small handful that I'd prefer were pre-loaded on a device in the first place (Google Maps, Skype, Facebook, maybe Spotify).
Obviously my "professional" hat as an analyst means I have to be both interested in, and familiar with the broader ecosystems, but my job doesn't need to extend to my private life. I'm a bit of a purist, in other words - also reflected by the fact I drive a car with absolutely no modifications, and indeed no weight-adding fripperies like ABS or airbags either.
Cars hold another useful metaphor though. It's generally considered absolutely fine to "customise" a vehicle when it's ordered new - the colour, trim, special options like better brakes or sportier suspension. But attitudes change - for most people other than special groups of enthusiasts - when it comes to aftermarket customisation. Suddenly, add-ons become gauche and geeky. Sure, it's more socially acceptable for a rickshaw driver in India or a bus driver in Guatemala, but in normal circumstances loading a ton of extra tat onto your car makes you look like an idiot. (Caveat here: yes, I know some countries have a more permissive attitude to modifying cars, but then some also have a lax attitude to wearing phones in holsters on your belt).
The bottom line is that I'm wondering if the massed billions of phone users will really care about iPhone-style junk applications. Personalisation is all very well - but it's best done upfront, not on an ongoing basis. The hand of fashion could also start to dictate that people customise something else rather than phones.
A vision of 4 billion "modified" smartphones represents a dystopia of geekiness.
Speaking Engagements & Private Workshops - Get Dean Bubley to present or chair your event
Thursday, February 25, 2010
Wednesday, February 24, 2010
Smartphones are adjuncts to PCs, not replacements
I've long been deeply skeptical about the notion that phones would displace PCs (I'm including Macs here) as the principal platform for computing and Internet access. In particular, I always try to bust the myth that "the next billion Internet users will first see the net on a handset" - a catchy assertion, but one backed up by zero evidence or clear rationale. I've been similarly unconvinced by Nokia's rhetoric about its smartphones being "mobile computers".
Yes, an increasing number of quick computing/Internet tasks are done on smartphones - checking email or Facebook, using a navigation app, jotting down your expenses, writing short blog posts and so on. But almost all of these are transactional updates of processes *anchored* by traditional PCs. Nobody sets up their Facebook account and configures all the settings on a phone.
I also disagree that the gulf between laptops and smartphones is narrowing. Netbooks are small laptops - the fact that a handful are sold through carrier channels with embedded connectivity is not changing the overall paradigm. Phones are "service" devices which are mostly useless without a linked subscription of some sort, while PCs are computing "products" owned and controlled by the user or IT department.
The interesting thing is that only Apple (and Microsoft to a lesser degree) seem to *really* understand the full importance and longevity of the PC - and the extra options it enables in terms of the business model for handsets/smartphones. Generations of mobile users have struggled with awful PC-phone connectivity software from almost all vendors, which have viewed the desktop as an afterthought.
Many people seem to make the mistake of assuming that Apple is targeting the 1-billion unit annual mobile handset market with the iPhone (and 4-5bn mobile users), usually citing its market share as tiny and insignificant, with a disproportionate amount of media and developer attention being focused on it. Apple itself seems happy to shrug off such criticisms.
I think this is because it defines its market differently. I suspect that the real "addressable market" it's after is the installed base of PC owners/users - probably about 1.3bn/2.0bn respectively. By no great coincidence, pretty much all PC users also have a mobile phone.
Unlike purveyors of "mobile computers", "smartbooks" and the like, Apple doesn't want to substitute the iPhone against a desktop or a laptop. It appears uninterested in the iPhone being a new user's *only* source of Internet access. It wants the phone to be an adjunct - and, hopefully, to encourage a long-term switch to a Mac desktop or laptop.
Accessories often have some of the highest profit margins. And if they can drive the sale of premium-priced "core" products then so much the better.
But the real kicker here is iTunes. Physically connecting an iPhone to a PC means it can be totally and reliably updated and upgraded - not just in terms of the apps and OS, but also right down to the radio firmware and low-level internals. This is a lot less clunky than the over-the-air approach to device management elsewhere. The reach it gives into the installed base is hugely more than most other operator/device approaches.
I think there's more to this. I think the notion that smartphones are somehow cheaper than PCs and will appeal to first-time users in developing countries is wrong. A second-hand $50 Pentium 3 can get a whole family online, show movies and be used for a broad set of applications. It will also last for years. (I know, that's exactly what a Sri Lankan cab driver told me in January when I asked him if he used the Internet on his phone).
In that scenario, the "cost of ownership" for a basic PC is less than $5 per person per year [excluding the connectivity, obviously]. Even a $50 secondhand unlocked Symbian device would be more expensive on that basis, given its expected working life and difficulty in sharing. There is no way that a $100-150 device would "get a whole family online" and provide anything like the functionality or usability of a PC.
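To sanity-check that arithmetic, here's a back-of-envelope sketch in Python. The family size and working lifetimes below are my own illustrative assumptions, not figures from the post or any survey:

```python
# Back-of-envelope cost-of-ownership comparison (hardware only; the
# sharing and lifetime figures below are illustrative guesses).
pc_price = 50            # second-hand Pentium 3, USD
phone_price = 50         # second-hand unlocked Symbian device, USD

pc_users, pc_years = 4, 3        # easily shared, lasts for years
phone_users, phone_years = 1, 2  # hard to share, shorter working life

pc_cost = pc_price / (pc_users * pc_years)
phone_cost = phone_price / (phone_users * phone_years)

print(f"PC:    ${pc_cost:.2f} per person per year")     # $4.17
print(f"Phone: ${phone_cost:.2f} per person per year")  # $25.00
```

Under those assumptions the shared PC comes in well under the $5 per person per year mark, while the unshareable handset costs several times more.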
Bottom line: The addressable market for smartphones is (to 80/20 accuracy) the base of people who already have a PC. The most interesting and relevant market share statistic would be the break-down of PC users who are also *active* smartphone users, with a data plan.
Tuesday, February 16, 2010
MWC Day 1 ... thoughts on offload, mobile data traffic, LTE and IMS
Quick post on some thoughts from Day 1 here in Barcelona:
As predicted, offload and data traffic management are top of mind. Too much cheap traffic transiting expensive networks. Yes, people are paying for mobile broadband on laptops & smartphones, but shoehorning gigabytes of Internet-destined data across a network designed for in-house, data-sipping operator-controlled apps is clearly a recipe for disaster. What's striking, though, is that the "offload industry" hasn't yet mapped the various solutions onto a matrix of use cases and deployment scenarios. I think there needs to be a lot more thought here - how do you selectively offload certain applications, but route others through the core?
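To make the selective-offload question concrete, here's a hypothetical sketch of a per-application routing policy: bulk Internet-destined traffic breaks out locally, while operator services stay anchored in the core. The application names and policy choices are illustrative only, not any vendor's actual scheme:

```python
# Hypothetical per-application selective offload policy: bulk Internet
# traffic breaks out at the edge, operator services stay in the core.
# App names and routing choices are illustrative only.
OFFLOAD_POLICY = {
    "web_video": "local_breakout",   # bulk video: dump at the edge
    "web":       "local_breakout",
    "voice":     "core",             # operator service: keep in core
    "mms":       "core",
}

def route(app: str) -> str:
    # Unknown applications default to the core - the conservative choice.
    return OFFLOAD_POLICY.get(app, "core")

print(route("web_video"))  # local_breakout
print(route("voice"))      # core
print(route("telemetry"))  # core (falls through to the default)
```

Even this toy version exposes the hard part: something has to decide what "application" a given flow actually is, which is exactly where the vendors struggle.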
In particular, a lot of the network vendors don't quite understand what "applications" are on the network. Many will tell you that "video" is an application. It's not, especially from the user's perspective, which is where any policy or tariff changes will bite. There's a huge *perceptual* difference between watching a YouTube video in large-screen mode in a browser, versus watching the same YouTube clip embedded in your Facebook page, where it's been shared with you by a friend. Yet the network is unlikely to be able to spot the difference. One is "media consumption", the other is "social networking and recommendation", and I think that in users' eyes, they are very different. How you capture this behavioural component (layer 9?) is going to be a difficult challenge.
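That perceptual gap can be illustrated with a toy DPI-style classifier. Everything here is a made-up sketch (field names, labels, the classifier itself), but it shows why two very different user experiences collapse into one network-side label:

```python
# Toy sketch of why network-side classification misses user intent: a
# DPI-style classifier keyed on host and content type labels both flows
# identically. All field names and labels are illustrative.

def classify_by_network(flow):
    """What the network can see: destination host and content type."""
    if "youtube" in flow["host"] and flow["content_type"] == "video/mp4":
        return "video"
    return "web"

browser_view = {"host": "youtube.com", "content_type": "video/mp4",
                "referrer": "youtube.com"}    # deliberate media consumption
facebook_embed = {"host": "youtube.com", "content_type": "video/mp4",
                  "referrer": "facebook.com"} # a friend's shared clip

# Both come back as "video" - the behavioural context (roughly the
# referrer, my hypothetical "layer 9") never reaches the policy engine.
print(classify_by_network(browser_view))    # video
print(classify_by_network(facebook_embed))  # video
```

The referrer field hints at the behavioural difference, but real traffic is increasingly opaque to the network, which is precisely the problem.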
NSN had an interesting pitch yesterday - creating a smartphone-optimised network. This is pretty much a first, as normally the network vendors ignore the role and usage modes of devices in the real world, and vice versa. I spend a lot of time pointing out to network vendors that innovations in infrastructure technology always have device-side implications (hence my work on IMS-capable handsets, femto-optimised devices and so on). NSN observed that although laptops create most of the data traffic on 3G networks, it is smartphones that are much more aggressive when it comes to signalling traffic, and that in some cases this is to blame for clogging the network. They're working on a 3GPP mechanism for reducing the signalling overload, called Paging. I'll delve more deeply into it on my return, but it sounds interesting.
One notable theme here seems to be that LTE is being viewed with more reality - it's conspicuous that both NSN and Ericsson are touting go-faster HSPA+ quite loudly, as it's becoming clear that many operators are going that route, especially in Europe. I have my suspicions that many might skip "beta" LTE (aka 3GPP Releases 8 & 9) and wait for the bug fixes to appear in LTE Advanced (Release 10). After all, the first versions of WCDMA were pretty useless until HSDPA arrived.
Add in the issues concerning LTE spectrum availability, devices, indoor coverage, voice & SMS, roaming, handover, power consumption and the lack of a new business case, and I think that we're looking at 2014-2015 for any real mass uptake. Apparently there are a few early handsets on the horizon, but I suspect they'll be provided with a free oven glove as an accessory.
During today I'm going to brave the dangers and gauge the state of the mobile IMS zombie. Yes, it's dead but still moving - but is it like the shambling creatures from Dawn of the Dead, or one of the scary, fast-moving vermin from 28 Days Later or I am Legend? The big question has to be whether it's already bitten LTE so hard it won't let go, and infected it. Anyway, I'm seeing the GSMA, the NGMN and various vendors with IMS-based products, so I'll try to avoid getting my brain washed (or, indeed, eaten).
Monday, February 15, 2010
MWC / 3GSM - start of the week
I've got a pretty hectic schedule in Barcelona, but I'll try and put a few posts up as I go along.
First thing to notice is that the mobile-related advertising around the airport and city is still nowhere near the levels of a few years ago - the recession is obviously still hitting corporate marketing budgets quite hard. This is also reflected by the absence of some "big names" in terms of stands, notably Nokia, which has instead set up shop (more cheaply) around the corner in the ONCE building for meetings and demos.
OK, one churlish comment: while I'm grateful to the GSMA for finally sorting out analyst registration as well as press, they *still* have the ridiculous photo-ID policy on *every* entrance and exit to the Fira grounds. Yes, I know security is an issue, but so is privacy. Some sort of PIN code or other authentication is much preferable to having to carry my passport around & show it on demand. I guess that given the GSMA's reputation for trying to centrally-plan the mobile economy, it seems fitting that they have their Stalinist henchmen (and henchwomen) demanding to see your papers all the time.
One other amusement is that the Media Centre at the world's largest mobile conference has around 300 fixed LAN connections and PCs. There's WiFi as well, but they're obviously worried it'll be as flaky as usual at the Fira, while insane data roaming charges will prohibit the use of mobile broadband for most attendees. (Footnote: Barcelona is one of those cities where it's usually cheaper to use a taxi to navigate to your destination, rather than using Google Maps over roaming).
There's going to be a zillion press releases and other announcements over the next few days; I'll cherry-pick the ones that make me curious, or make me wince, to comment on.
First up in the firing line is Critical Path, yet another purveyor of "social address books" and various other attempts to force social networking into the phone without the use of a proper Facebook app. Amusingly, their survey (release not yet up on their web page) claims that "94% of consumers in emerging markets would like to be able to automatically update all of their contacts with new contact information" while also claiming that "Operators are ideally positioned to help ease these frustrations".
The disconnect here seems to be that new contact information is usually associated with churning - especially in the prepay-centric markets the survey focuses on. People buy whichever new SIM card offers the best package and then tell all their friends their new number when they can't/don't port it. I can't really see how that can easily be an operator play or enhance loyalty - unless you separate the service from the access.
Edit: I'm chasing a few themes here in Barcelona, notably femtocells, broadband offload & traffic management (about which this post seems to have had quite a lot of attention), connection management, mobile VoIP and HSPA+/LTE/WiMAX competition.
I'm also hoping to have another shot at my perennial punchbag, IMS RCS. The initiative has 96 hours to show commitment from big names like Apple, Google, RIM and Skype, or else I'll have to find time to write a full obituary.
Saturday, February 13, 2010
VoIPo3G forecasts... I hate to say "I told you so", but....
I *did* tell you:
"The number of VoIPo3G users could grow from virtually zero in 2007 to over 250m by the end of 2012"
"It will be the operators themselves which will be mainly responsible for the push towards VoIP being carried over cellular networks"
"About 60m will be using independent or Internet-based solutions – many actually operated in partnership with carriers or retailers"
OK, yes, I missed a few things. At the time I wrote those comments (published in Nov 2007 and researched for the previous 6-9 months), I was still expecting CDMA to go beyond EV-DO and head towards UMB, rather than be usurped by LTE. And I also expected that the industry would have sorted out some form of standardised VoIP for LTE by now, rather than the current mess of CSFB, VoLGA, IMS VoIP, OneVoice and assorted others.
But given the imminent announcement of a Skype / Verizon Wireless deal (which seems to be rumoured as being VoIPo3G rather than call-through like 3's), plus the inexorable rise of VoIP and video like Skype on 3G-connected PCs, it looks like my general pitch is coming true. AT&T is allowing 3G VoIP on the iPhone, and most other operators seem to view it as an inevitability.
And, quite frankly, allowing customers to use VoIP on their mobile broadband means they'll spend less time downloading video and clogging up the network.
So, to the various skeptics who suggested my prediction of 250m VoIPo3G users by end-2012 was unrealistic... let's see what the next 3 years bring. I'm not anticipating most of those connections being what I'd call "primary telephony", but I can certainly imagine that a decent proportion of PC, smartphone and "connected device" users by end-2012 will be running voice at least occasionally.
Friday, February 12, 2010
Telefonica is playing with fire....
Interesting time for Telefonica to try and stick the boot into Google about net neutrality, a few days before Eric Schmidt gives a keynote at the 3GSM conference in Barcelona.
Apparently, Telefonica is "considering charging" Google for the bandwidth it apparently uses "for free".
I'd pay very good money to be a fly on the wall during a meeting of the two companies. I wonder how much Google should charge Telefonica to permit its subscribers to access its search & other servers "for free"?
Alierta is playing with fire on this - Telefonica's customers are more likely to churn ADSL provider than churn from Google Search or YouTube.
If I were Google, I'd start threatening to give free adverts to competing Spanish broadband providers, for customers with Telefonica IP addresses... in fact, I wonder if the big G could even negotiate a finder's fee for directing switchers.
Wednesday, February 03, 2010
Traffic management and offload - diverging solutions
One of the major trends I'm seeing at the moment is that of mobile network offload - typically "dumping" traffic onto WiFi or other networks to avoid congestion from mobile broadband.
I'm expecting it to be a huge feature of this year's MWC / 3GSM in Barcelona.
But at first sight the offload trend seems to be a confusing mish-mash of technologies and techniques, all designed to reduce the impact of bulk traffic on cellular networks, but applied in very different ways. I'm starting to categorise these various areas, and to work out how they are being prioritised.
One important fact to note is that "congestion" is itself quite complex. It can mean a congested radio access network (RAN), in terms of downlink capacity, uplink capacity or signalling load. It might be congestion in the backhaul from a cell site to an aggregation point or the core network. It could be various elements of the core itself - SGSNs, GGSNs and so forth. Or it could be a supporting IT system that handles IP networking (eg the DNS), or the billing/rating engine.
And then we have the various classes of solution. I'm still thinking about a full taxonomy (and terminology), but an initial starting point might be:
- Macro offload onto femtocells (in homes, offices, hotspots or outdoors)
- Macro offload onto WiFi (in homes, offices, hotspots or outdoors)
- Local IP breakout for WiFi & femtos (in the premise or at the broadband DSLAM / cable head-end)
- Managed offload (eg where the fixed or cable access provider, used for the WiFi or femto connection, actively assists in the offload process)
- Macro backhaul offload (eg direct connection to a cache or CDN for "bulk" Internet video)
- Core network offload / bypass
- Content compression (eg video format transcoding)
- Traffic-shaping (eg selectively degrading / capping specific flows or traffic types - sometimes misnamed as "applications")
- Policy control (prioritising specific users / subscribers, or administering specific profiles to some)
- Optimisation of capacity utilisation at the IP level, by changing the TCP protocol or packet scheduling, or by introducing "scavenger class" traffic
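To make the taxonomy above a bit more concrete, here's a minimal sketch expressing it as a mapping from each technique to the congestion points it chiefly relieves. The category names and the mappings are my own illustrative shorthand, not any standard terminology, and the assignments are debatable:

```python
# Sketch: offload techniques mapped to the congestion points they mainly
# address. Names and mappings are illustrative assumptions, not a standard.

CONGESTION_POINTS = {"ran", "backhaul", "core", "it_systems"}

OFFLOAD_TECHNIQUES = {
    "femtocell_offload":     {"ran", "backhaul"},
    "wifi_offload":          {"ran", "backhaul"},
    "local_ip_breakout":     {"backhaul", "core"},
    "managed_offload":       {"ran", "backhaul"},
    "backhaul_cache_cdn":    {"backhaul"},
    "core_bypass":           {"core"},
    "content_compression":   {"ran", "backhaul"},
    "traffic_shaping":       {"ran", "backhaul"},
    "policy_control":        {"ran", "core"},
    "ip_level_optimisation": {"ran", "backhaul"},
}

def techniques_for(point):
    """Return the techniques that, in this sketch, relieve a congestion point."""
    return sorted(t for t, pts in OFFLOAD_TECHNIQUES.items() if point in pts)
```

Even a rough mapping like this makes the point that most techniques target the RAN and backhaul, while only a couple touch the core or supporting IT systems.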
I'm sure there are others - and these segments can undoubtedly be subdivided further. The bottom line revolves around reducing or delaying the capex needed to enhance network capacity, and, as a corollary, improving the more nebulous quality of "user experience" - at least for those who aren't bandwidth hogs.
What's not clear to me is which of these techniques is the most effective or important overall. I suspect it probably varies by operator, maybe even by cell or time. From a top-level ROI perspective, which of them enables spending on network upgrades to be minimised? Or perhaps introduces new revenue streams.
My gut feel is that for networks dominated by *PC-based* mobile broadband, the best option is some form of radio offload. All notebooks have WiFi, and they also tend to have the most complex applications and mashups, as well as being able to spot any degradation of quality or operator "interference" most readily. Having a PC user say "hang on - why's this video performing better over my ADSL line than over my HSDPA, what's going on?" is likely to lead to damaging PR, if you're trying to present your mobile broadband as a direct replacement for fixed connectivity.
I reckon it's simpler and safer just to dump PC traffic to a standard Internet connection as close to the device as possible. For iPhones and some smartphones, the same may be true. But for other devices, it may make more sense to route the traffic via the operator core and play around with subscriber policies, or adapt traffic at the application layer. But I'm sure there are exceptions in both cases - for example, where a PC is actually an operator-controlled netbook.
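The heuristic above can be boiled down to a toy decision rule. The device categories and the "operator_managed" flag are my own illustrative assumptions, and real deployments would obviously weigh many more factors:

```python
# Toy decision rule for the heuristic above: PC traffic gets dumped to a
# standard internet connection as locally as possible; other devices may be
# routed via the operator core for policy control. Purely illustrative.

def offload_strategy(device_type, operator_managed=False):
    if device_type == "pc" and not operator_managed:
        # Simpler and safer: avoid visible "interference" that PC users notice.
        return "local_internet_breakout"
    if device_type in ("iphone", "smartphone"):
        # Likely the same logic applies to high-end handsets.
        return "local_internet_breakout"
    # Other devices: route via the core, apply subscriber policy / adaptation.
    return "route_via_core"
```

The operator-controlled netbook exception mentioned above is what the "operator_managed" flag captures - an operator-managed PC might still be routed via the core.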
I'm going to try and resolve the picture more clearly over coming months.
EDIT - MAY 21st - New research paper published - see below.
NEW Mobile Broadband Traffic Management Paper
Monday, February 01, 2010
Debating the use of the term "4G"
I'm noticing an increasing use of the term 4G to describe either WiMAX or LTE networks.
It makes me wince (although I've probably fallen into the "convenience trap" myself a couple of times).
Yes, they're both different to current versions of 3G, and use techniques like OFDMA - but so what? That doesn't make them 4G any more than it made EDGE (part of GSM) into 3G.
From a purist point of view, 4G doesn't yet exist. 3G refers to the families of technologies covered by the ITU's definition of "IMT-2000". 4G is expected to be the term used for the forthcoming "IMT-Advanced" specifications currently being thrashed out by the ITU, for which there are two main prospective candidates - LTE-Advanced and the WiMAX variant 802.16m.
Any use of the term 4G at present is therefore pure marketing fluff. A lot of WiMAX and LTE operators and device/network suppliers are fluffing, in an effort to come up with a brand that conveys evolution and new-ness.
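The purist position above can be summarised in a tongue-in-cheek lookup table. The entries just restate this post's claims (IMT-2000 = 3G; the IMT-Advanced candidates are prospective 4G, not ratified); the technology keys and labels are my own shorthand, not an official ITU registry:

```python
# Tongue-in-cheek sketch of the purist's generation labels, per the argument
# above. Keys and labels are illustrative, not an official ITU registry.

ITU_GENERATION = {
    "umts":         "3G (IMT-2000)",
    "hspa":         "3G (IMT-2000)",
    "wimax_16e":    "3G (IMT-2000)",  # WiMAX lobbied its way into IMT-2000
    "lte":          "3G-era; '4G' is marketing fluff for now",
    "lte_advanced": "IMT-Advanced candidate (prospective 4G)",
    "wimax_16m":    "IMT-Advanced candidate (prospective 4G)",
}

def purist_label(tech):
    return ITU_GENERATION.get(tech, "unknown - deduct credibility points")
```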
The irony that the WiMAX community spent a huge amount of time and effort convincing the ITU that its technology was in fact 3G (and therefore allowed to use IMT-2000 spectrum bands) seems to be lost on everyone.
My instinctive reaction is to "deduct some credibility points" from the offenders when I see their announcements, or talk to their executives. If they are so sloppy that they mis-describe their own technology, then surely it's reasonable to assume they're also sloppy about other aspects of their business?
Yet after a while, the practice may become so entrenched that even the more sensible participants have to wince, take a deep breath and mis-use the term 4G as well, so as not to lose out in the marketplace. If you can't beat them, you have to join them.
So, some solutions:
1) Any operator launching HSPA+ should also describe it as 4G. Well, if it's got MIMO, then it's *also* different from existing 3G. If the OFDMA guys are going to pick arbitrary definitions, then they can hardly complain when you do the same.
2) An operator with a brave PR department (and possibly a good legal team) should publicly take rivals to task for using 4G, using terms like "misleading", "lying", "sloppy" or "false advertising". Potentially, some of the current providers are on thin ice with regard to consumer protection law - although as the ITU hasn't yet called IMT-Advanced "4G" the response is probably that there's no strict definition in place.
3) Lobby the ITU to hurry up and call IMT-Advanced "4G" while it still has the ability to do so, before it gets lost in the marketing waffle.
As for me - well, I'm going to rely on the body language and non-verbal communications of people I speak to. If someone says 4G when describing their new LTE or WiMAX gizmo, but slightly winces, or rolls their eyes, or grits their teeth... then I'll give them the benefit of the doubt. But if they brazenly, unashamedly and unequivocally claim the 4G label for their 3G goods, I'll definitely be looking closely to find what else they've glossed over.