Friday, May 29, 2009
Something to ponder on:
- With a mobile phone, you have two tie-ins to keep you in a subscription-type relationship with your operator: your number and the handset subsidy.
- With PSTN or ADSL connections, you also have the number, which necessitates a subscription-type relationship. In addition, the need for the copper line to be maintained and terminate on a switch port ensures that only continuous types of service are feasible.
- With public WiFi, you can have a subscription, or you can purchase ad-hoc connectivity when and where you want it.
So what about mobile broadband? As modems get cheaper, or included in PCs, subsidy becomes irrelevant. And there's no need for a permanent "number" - apart from the mechanistic requirement of using a SIM card. But there's no real reason for a customer to be a "subscriber" with an ongoing relationship, rather than buying connectivity on a transaction basis.
This is one of the reasons why operators are desperate to add value to PC-based mobile broadband through additional services like SMS or IM. Otherwise, there is no more need for a subscription than there is to subscribe to a "cinema service", or to get a season ticket for a transport system. It may be convenient if you're a regular user and you get a discount and less hassle.
But in principle, there's no reason for mobile broadband connections to be ongoing relationships. This is likely to lead to:
a) Very high churn levels
b) The emergence of a plethora of business models other than "normal" contracts and prepay accounts. Ad-hoc and bundling approaches will become particularly important
c) The near-irrelevance of ARPU as a metric, as it becomes impossible to identify a significant proportion of connections as regular "users" - as the toy sketch below illustrates
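To make point (c) concrete, here's a hedged back-of-envelope sketch in Python, with entirely invented numbers, showing how a growing pool of ad-hoc connections distorts blended ARPU:

```python
# Toy illustration (invented numbers): how ad-hoc connections muddy ARPU.
# "Connections" in the denominator include one-off buyers who are not
# meaningful "users" in any subscription sense.

regular_users = 1_000_000          # monthly subscribers
regular_arpu = 25.0                # EUR/month each

adhoc_connections = 4_000_000      # one-off day passes sold this month
adhoc_spend = 2.0                  # EUR each, single transaction

total_revenue = regular_users * regular_arpu + adhoc_connections * adhoc_spend
blended_arpu = total_revenue / (regular_users + adhoc_connections)

print(f"Revenue: EUR {total_revenue:,.0f}")      # EUR 33,000,000
print(f"Blended 'ARPU': EUR {blended_arpu:.2f}")  # ~6.60, despite healthy revenue
# The metric collapses because the denominator no longer counts "users"
# in any comparable way - exactly the point made in (c) above.
```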
Wednesday, May 27, 2009
Does the mobile network standards process inhibit business model innovation?
One of the things that struck me from the LTE summit last week was that the way that some standards bodies operate (notably 3GPP) risks entrenching legacy business models for operators and others.
This is ironic, as many standards groups, staffed by engineering-minded people, try to avoid the whole issue of commercial models - either because they have limited understanding of that side of the industry, or limited time, or perhaps because they are worried about regulatory and anti-trust implications.
The problem arises because certain aspects of technical architecture can act as limiting factors. Physical SIM cards, for example, need to be distributed physically, which means that a customer has to go to a store or wait for one in the post. What seems like a technology-led decision can militate against particular business models, such as ad-hoc usage - and add "latency" of hours or days to a process.
Alternatively, dependencies between otherwise separate sub-systems can cause huge brittleness overall. LTE is being optimised for use with IMS-based core networks. But not all operators want to deploy IMS, even if (in theory) they want LTE - again, restricting business model choices or forcing them towards what is now a non-optimised radio technology.
The insistence of many mobile operators on viewing only each other as peers (through the GSMA club, and various of its standards initiatives like IPX) is another example. This reinforces the notion that alternative service providers like Skype or Facebook are *not* peers, but deadly enemies. For some operators that may be true, but for others they might be critical partners - or even (whisper it) in a dominant role, with the MNO as the junior part of the ecosystem.
Freezing old-fashioned assumptions into standards and architectures, often without even identifying that those assumptions exist, is a recipe for disaster.
This isn't to say that standards are bad - but just that there is often no mechanism by which seemingly-sensible technology decisions are double-checked against potential future business models. Having a cycle in which people ask questions like "Will this work with prepay?" or "What's the wholesale model?" or "What happens if 3 people want to share one 'account'?" - sketched below - would avoid many of these mistakes. You can never account for all eventualities, but you can certainly test for flexibility against quite a range.
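Purely to illustrate the idea - this is not any real 3GPP process - here's what such a review cycle might look like if the standard questions were encoded as a simple checklist (Python, all names invented):

```python
# Illustrative only: business-model "flexibility tests" encoded as a
# checklist that a proposed standards feature could be run against.

CHECKS = [
    "Works with prepay, not just contract billing?",
    "Has a defined wholesale/MVNO model?",
    "Supports several people sharing one 'account'?",
    "Usable ad-hoc, without a pre-existing subscription?",
]

def review(feature_name, answers):
    """answers: dict mapping each check to True/False from the review cycle."""
    failures = [c for c in CHECKS if not answers.get(c, False)]
    status = "OK" if not failures else f"{len(failures)} business-model gaps"
    print(f"{feature_name}: {status}")
    for c in failures:
        print("  -", c)

review("Physical-SIM-only authentication",
       {CHECKS[0]: True, CHECKS[1]: True, CHECKS[2]: False, CHECKS[3]: False})
# -> flags the account-sharing and ad-hoc gaps before they're frozen in
```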
Again thinking about the LTE Summit, I did not hear a single mention of the word "MVNO" during the whole event. Nobody has thought about what an LTE-based MVNO might look like - or whether there might be cool features which could enable such a provider to offer more valuable services. I was met with blank stares when I asked about implementing open APIs on the radio network, to make it "programmable" for developers or partners. So I guess we won't be getting latency-optimised virtual mobile networks for gaming, then.
Many speakers appeared to view the only mobile broadband business models as traditional contract and prepay mechanisms - no talk of sponsored or third-party paid access. No consideration of the importance of Telco 2.0 strategies. No discussion about where in the EPS or LTE networks a content delivery network might interface, and so on.
One option for fixing this problem is via the other industry bodies that don't set standards themselves, but which can consider use cases and business models a bit more deeply - NGMN, OMTP, Femto Forum and so forth. Perhaps that's the level to bring in these considerations, so that they can then "suggest" specifications for the standards bodies to work to.
How about "Future mobile networks MUST be able to support a variety of MVNOs of example types A, B and C" for example?
On the same theme, I'll write another separate post soon, about why the increasing desperation to get IMS deployed is a particularly dangerous risk to the industry. In my view "legacy IMS" is a set of standards that is not fit-for-purpose in mobile - in large part because it is entrenched in a philosophy of walled-garden business models, rather than built around openness from Day 1.
Tuesday, May 26, 2009
Upcoming events - Open Mobile Summit and WiMAX Forum
I've got a couple of important mobile industry events coming up over the next few weeks.
In particular, the second Open Mobile Summit is taking place in London on 10th & 11th June. The first one, at the end of 2008 in San Francisco, gave a pretty good read across the whole of the industry, covering most of the various types of "openness", from net neutrality to smartphone OSs to open spectrum and white space.
I'm doing a "fireside chat" with Truphone's CEO Geraldine Wilson, and will also be sitting in the audience asking nasty questions of most of the other speakers.
The rest of the agenda looks good this time as well, and I'm especially looking forward to seeing friend & fellow cynic James Enck chairing the first day.
If you're interested in going - tell 'em I sent you.
Also coming up next week is the WiMAX Forum summit in Amsterdam. I'll be talking about the potential for new business models emerging around the technology. It should make an interesting comparison with last week's LTE event.
Wednesday, May 20, 2009
LTE for heavy mobile broadband users? A paradox
Various presenters at the LTE Summit have said things along the lines of:
- 5% of mobile broadband customers generate 70-80% of traffic (eg the Telenor subscriber downloading 230GB per month over HSPA)
- A few inner-city HSPA cells are overloaded (3 or 4 carriers) while the ones in rural areas are essentially empty
- Most of the traffic comes from PCs or iPhones
But what's certainly not obvious to me is that the 5% of heavy HSPA users suddenly become profitable if you give them LTE instead, especially in expensive new spectrum for 20MHz channels. Why not just give them a femtocell for use at home, implement strict caps, or just get rid of them entirely?
Or that upgrading a few urban "hotspots" to LTE solves all the problems, given the need for everyone to have new LTE devices (you can't really say "only upgrade your dongle if you think you might use it in London W1 some time in the next year").
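To see why, here's a hedged back-of-envelope model (Python, with entirely invented unit costs) comparing a flat-rate heavy user's margin on HSPA and LTE:

```python
# Toy model (invented numbers): flat-rate revenue vs usage-driven cost.
# A new radio (LTE) lowers cost per GB but adds spectrum/equipment cost;
# the heavy user is loss-making either way at a flat tariff.

flat_tariff = 20.0        # EUR/month, unlimited-ish mobile broadband

def monthly_margin(gb_used, cost_per_gb, fixed_cost_share):
    """Margin for one user: flat revenue minus delivery + network costs."""
    return flat_tariff - (gb_used * cost_per_gb + fixed_cost_share)

heavy_gb = 230            # eg the Telenor subscriber quoted above
print("HSPA:", monthly_margin(heavy_gb, cost_per_gb=0.50, fixed_cost_share=2.0))
print("LTE :", monthly_margin(heavy_gb, cost_per_gb=0.20, fixed_cost_share=8.0))
# HSPA: 20 - (115 + 2) = -97
# LTE : 20 - (46 + 8)  = -34  -> less bad, but still deeply negative
```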
The only argument for short-term LTE deployment at the moment seems to be around spectrum flexibility, and the option to use thin slivers in refarmed GSM bands, or awkward allocations elsewhere that don't map onto HSPA's 5MHz channels. For most operators, that's unlikely to be an easy business case.
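For what it's worth, the spectrum-flexibility point is real: LTE defines channel bandwidths of 1.4, 3, 5, 10, 15 and 20MHz, against HSPA's single 5MHz carrier. A trivial sketch (Python, with an invented leftover allocation):

```python
# LTE's defined channel bandwidths (MHz) vs HSPA's single option - the
# spectrum-flexibility argument in a nutshell. The "sliver" is invented.

lte_bandwidths_mhz = [1.4, 3, 5, 10, 15, 20]
hspa_bandwidth_mhz = 5
refarmed_sliver = 4.2   # eg an awkward leftover GSM-band allocation

print("LTE fits :", any(bw <= refarmed_sliver for bw in lte_bandwidths_mhz))  # True (1.4 or 3)
print("HSPA fits:", hspa_bandwidth_mhz <= refarmed_sliver)                     # False
```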
LTE, mobile broadband & SMS - VoLGA vs. IMS
Just watched a presentation on the UMA-derived approach to voice over LTE, called VoLGA, which I wrote about a few months ago at launch.
As well as the discussion about voice, re-use of existing circuit cores vs. IMS and so on, there was a fascinating comment by the speaker (from T-Mobile) about the "under the hood" use of SMS in mobile broadband dongles.
It's used for a wide range of functions - most visibly in enabling texts to be sent by the user from the operator's dashboard connection software. But beyond that, it's used for things like roaming notifications, internal configuration of roaming lists and other settings, and assorted others. There are apparently "many more than 10" systems at T-Mobile that rely on SMS in the context of a mobile broadband computing service - and which would need to be changed if SMS were not supported easily.
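For anyone who hasn't poked at a dongle directly: this is all driven through the standard 3GPP TS 27.005 AT commands that most USB modems expose over a serial interface. A minimal, hedged sketch (Python with pyserial; the port name and destination number are placeholders):

```python
import serial  # pyserial

# Minimal sketch: send an SMS through a USB dongle's AT-command interface.
# Port name and destination number are placeholders - adjust for your setup.
PORT = "/dev/ttyUSB0"

with serial.Serial(PORT, 115200, timeout=5) as modem:
    def at(cmd):
        modem.write((cmd + "\r").encode())
        return modem.read(256).decode(errors="replace")

    at("AT+CMGF=1")                          # text mode (3GPP TS 27.005)
    modem.write(b'AT+CMGS="+441234567890"\r')
    modem.read(16)                           # wait for the "> " prompt
    modem.write(b"Hello from the dashboard\x1a")   # Ctrl-Z ends the message
    print(modem.read(256).decode(errors="replace"))  # expect +CMGS: <ref> / OK
```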
This all represents a huge issue for early releases of LTE-based mobile broadband. Despite the argument by 3GPP that "IMS will eventually get deployed, and phones will take a while to develop, so voice on LTE can wait a while", it seems likely that SMS will be needed from Day 1.
In any case, I completely disagree with the 3GPP representative that every LTE network will inevitably feature an IMS back-end. If it becomes almost mandatory (and some of the "hooks" in the radio and EPC bits of the standard are heading that way), it's a good way to ensure that it won't get deployed universally. Many operators are implacably set against IMS, as it still has huge limitations. (For example: would an MVNO on an LTE network need its own IMS, or be forced to use wholesale apps via the host operator's IMS?)
There was also a comment from 3GPP that LTE could use CS-fallback for SMS. How that would work in the context of a 3G dongle and PC I'm not sure - presumably this would mean that an incoming text (or system message) would force the connection to degrade. Fine for an occasional message - but not for a teenager sending & receiving 200 SMSs per day from their laptop. Can you imagine any other type of broadband link having to drop speed each time an email or IM arrived?
I don't think VoLGA is perfect - once again, it will need extra software in device protocol stacks. But given the complete abdication by vendors and standards bodies in sorting out voice/SMS over LTE in a timely fashion (ie 3 years ago), it's currently the best workaround I've seen. The fact that it's not even been included in 3GPP's Release 9 workplan is ludicrous.
[Sidenote for the "voice on LTE can wait" believers: pretty much every laptop user connected via LTE, especially where it's used as a fixed broadband substitute, will want some sort of voice capability. Right now, the only options are Skype or various Internet SIP alternatives. Are you really happy to give them an open goal?]
Tuesday, May 19, 2009
Evolved Packet Core in LTE
Many people in the mobile industry are unaware that LTE, the radio Cinderella, also has an ugly sister, called EPC (Evolved Packet Core - originally it was called SAE).
Many things about EPC are actually worthy - "flatter" IP network, lower latency, the ability to mesh between base stations, getting rid of a lot of other legacy equipment like SGSNs and GGSNs and so forth.
However, there is one thing that appears to be missing - the option for local offload of traffic. Everything is still backhauled via various gateway and policy boxes. It's not possible to say "that's a 3GB web video download - just short-circuit it straight to the Internet". It still has to go through the core network.
If I'm understanding this correctly, this has a couple of important implications:
- Roaming traffic will still need to be backhauled via the home network, even if it's just plain-vanilla Internet data. You can't just dump it to the web in the visited country. So you can forget about LTE roaming being able to compete with local WiFi (or a locally-acquired LTE SIM) on price, because there's still a huge extra overhead of network elements and unnecessary transport involved.
- In LTE femtocells, the option for "split tunnels", for example to offload web traffic or operator services like IPTV in the home gateway, or DSLAM / cable headend, won't work. Again, the traffic needs to go through the mobile EPC core. You can also forget about using LTE instead of WiFi for UPnP or other types of home networking via a femto - the traffic will need to be "tromboned" in and out (a toy latency comparison follows below).
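As a rough illustration of what that tromboning costs - invented hop delays, Python - compare local breakout with home-routing:

```python
# Toy latency model (invented figures) for home-routed vs locally-offloaded
# traffic. Each hop is a one-way transit delay in milliseconds.

local_breakout = {
    "radio + backhaul": 15,
    "local internet peering": 5,
}

home_routed = {
    "radio + backhaul": 15,
    "tunnel to home packet gateway (intercontinental)": 80,
    "home core gateways/policy": 10,
    "home internet peering": 5,
}

for name, hops in [("local breakout", local_breakout), ("home-routed", home_routed)]:
    print(f"{name}: {sum(hops.values())} ms one-way")
# local breakout: 20 ms; home-routed: 110 ms - plus the transport cost of
# hauling every byte to the home network and back again.
```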
Monday, May 18, 2009
Are there risks associated with an LTE "monoculture"?
I'm listening to a presentation on LTE-TDD, highlighting the close integration between FDD and TDD branches of the technology.
The underlying assumption in the industry is that it's wonderful that we might have, for once, a single global standard for wireless networks, albeit in two "flavours".
The subtext (or sometimes explicitly stated position) is that alternatives like WiMAX are thus undesirable.
But I'm wondering whether there is a dark side to full standardisation - even with its associated conveniences and scale economies. What happens if we create what biologists call a monoculture? In agriculture, this carries serious dangers.
The most obvious danger is that the impact of any catastrophic failure is magnified hugely. Any vulnerability - whether technical, or perhaps IPR-based - would have universal effects if exploited.
In addition, certain business models are hard-coded in, or out, as a result of specific standards. This is already occurring in LTE because of requirements for mandatory SIM use, and because there is no option for local offload of traffic without backhauling via the packet core.
I'm not convinced that the short-term gains from standards and scale economies necessarily outweigh the long-term losses of business model flexibility and reach. An LTE monoculture may result in the eradication of valuable gene-lines elsewhere in today's mobile environment.
It's worth considering that what the technology industry calls "fragmentation", an ecologist would call "biodiversity".
Microsoft - bearer-aware applications are the way forward
I just listened to an ultra-detailed and very informative presentation by a technology planning director for Windows Mobile. Irrespective of WinMob's lagging commercial status in today's market, I was hugely impressed with the vision.
In particular, it was the most coherent presentation I've heard about the future split of mobile app functionality between device and server ("the cloud"). Various examples were cited of using LTE as a way to offload processing from phone to network, such as realtime voice and image processing, which would overwhelm most handsets' chips and batteries.
There was also a discussion of using the handset as a web server - which given that I heard almost exactly the same thing from Nokia at the Telco 2.0 event last week, suggests to me that it's likely to be a big deal quite soon.
He also put up one of the few slides I've seen at a public conference that acknowledged the important future role of both connection manager and some way of exposing information about radio bearers. It mentioned the magic words for APIs that give "Notification of radio states & options" and "Ability for apps to choose radio options".
This is precisely what I've been talking about for several years, and to be fair, it's also something that fits in with Symbian's Freeway architecture and some of the bearer-aware iPhone apps I've seen.
In Q&A he also agreed with me on femtocell awareness - ie applications ideally being able to distinguish between femto & macro (& WiFi) connections and behave differently. (This was one of the main topics covered in the Disruptive Analysis Femtocell-Aware Handsets report a while back)
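No such API has been published, so the following is a purely hypothetical sketch (Python) of what those two capabilities - state notification and app-driven bearer choice - might look like to a developer, femto/macro/WiFi distinction included:

```python
from dataclasses import dataclass
from typing import Callable

# Hypothetical sketch only - no shipping OS exposes this API. It captures
# the two capabilities on the slide ("Notification of radio states &
# options", "Ability for apps to choose radio options"), plus femto
# awareness.

@dataclass
class BearerState:
    bearer: str          # "wifi", "macro_3g", "femto_3g", "lte", ...
    latency_ms: float
    uplink_kbps: float
    metered: bool

class ConnectionManager:
    def __init__(self):
        self._listeners: list[Callable[[BearerState], None]] = []

    def on_bearer_change(self, callback):
        """Notification of radio states & options."""
        self._listeners.append(callback)

    def request_bearer(self, preferred: str):
        """Ability for apps to choose radio options (policy may refuse)."""
        print(f"app requested {preferred}; policy/network decides")

cm = ConnectionManager()
cm.on_bearer_change(lambda s: print("now on", s.bearer, "metered:", s.metered))
cm.request_bearer("femto_3g")   # eg prefer the femto for bulk sync at home

# simulate the stack notifying a bearer change:
for cb in cm._listeners:
    cb(BearerState("femto_3g", 40.0, 1500.0, metered=False))
```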
At the LTE Summit. With a herd of elephants in the room.
I'm in Berlin at the LTE World Summit for the next couple of days. It's a well-attended event, with a better-than-average mix of operators vs. vendors, as well as a broad range of heavy hitters.
I'll try and put up a few posts over the next couple of days, but already a few things are standing out. Never mind "an elephant in the room", I'm wondering if there's an entire herd, plus a few hippos thrown in for good measure.
The two largest and ugliest pachyderms are:
- Voice
- Business model
A panel of FT/Orange, T-Mobile, GSMA, Ericsson and 3 singularly failed to deliver a compelling answer on how we'll get voice working over LTE, especially in the short term. T-Mobile wants to use IMS in the long term, but reckons it might need UMA/VoLGA in early deployments. Orange is much less convinced about VoLGA (is Unik *really* that successful?) and thinks 2G/3G fallback is the way forward. Ericsson is still flogging the dead horse that is IMS MMtel (remember that?). The GSMA wants us to focus on data for LTE, and thinks it's actually quite impressive that we're even thinking about voice now, rather than as a post-hoc "oops" after LTE launches. And 3 seems to be sceptical about LTE for the very reason that it's already looking really fragmented for things like voice. (I reckon 3 might spring a surprise with "native Skype" on HSPA+ - or else will stick with GSM for a long time to come).
Up to a point the GSMA is right - at least we're thinking about this now, although it was already a clear problem two years ago, when I first published a report on VoIPo3G. But the notion that it's all on track, especially given the protracted timeframes for handset platform & device development, is untenable.
(Separately - I'd love to be a fly on the wall when Ericsson tries to pitch the idea of an MMtel-enabled LTE iPhone to Apple....)
My view is that voice on LTE is looking ever-more intractable. VoLGA looks like a good idea, but I'm not sure that every operator will want to go through the hassle of deploying all the gateways & tunnelling paraphernalia, although some will. 2G / 3G fallback strikes me as a nightmare, especially for high-end devices, unless it's possible to run multiple radios simultaneously. Ed Candy talked about the issue of time needed to change network - but the other elephant is what happens to any running 3G data services if you want to make a call. Talk on the phone and use Google Maps at the same time? I don't think so, unless you have very clever multitasking connection managers & radio layers. Add into the mix the impact of femto vs. macro voice for LTE, and it gets even messier - the optimum approach could well be different when on a femtocell.
I'm starting to think that my old favourite theme of bearer-aware applications will be essential for voice on LTE. The dialler/telephony app (perhaps in conjunction with the app server in the network, especially for inbound calls) may need to make smart decisions about the best way to "game" the network for phone calls, based on an intelligent view of latency, quality, concurrent applications running and so forth.
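As a hedged sketch of the kind of decision logic I mean - weights and thresholds entirely invented - the dialler might score each available bearer before placing a call:

```python
# Toy bearer-selection logic for a voice call (invented weights/thresholds).
# A real implementation would also consider signalling cost, handover risk
# and what other applications are currently doing.

def voice_score(latency_ms, jitter_ms, loss_pct, concurrent_bulk_transfers):
    if latency_ms > 300 or loss_pct > 5:
        return 0.0                      # unusable for conversational voice
    score = 100 - latency_ms * 0.2 - jitter_ms * 0.5 - loss_pct * 5
    return max(score - concurrent_bulk_transfers * 10, 0.0)

bearers = {
    "lte_macro":   voice_score(60, 8, 0.5, concurrent_bulk_transfers=1),
    "3g_fallback": voice_score(180, 25, 1.0, concurrent_bulk_transfers=0),
    "femto_lte":   voice_score(35, 4, 0.1, concurrent_bulk_transfers=0),
}
print(max(bearers, key=bearers.get), bearers)
# Here the femto wins - matching the point above that the optimum may well
# differ between femto and macro coverage.
```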
The other problem area is the business model for data-centric devices. Moray Rumney from Agilent put an impassioned question to the panel about the ridiculousness of roaming tariffs, and how they can be mitigated (or whether WiFi will remain the nomadic broadband standard). The GSMA response was equivocal, pointing out how happy the association is at the idea of extra revenue from data roaming, and noting that rates are coming down as a result of regulatory intervention.
This is palpable nonsense of the first order - the fact that we've gone from five orders of magnitude of overpricing to just 1000x, under pain of regulatory pressure, is hardly cause for celebration or smugness that the problem is solved. And in any case, that only covers intra-European roaming. Even more egregious was the comment that the hotel's WiFi (fast & free - thanks Informa / Hotel Palace) was costing someone a lot of money, and that this therefore highlighted the "value" of the cellular alternative.
Now, it's absolutely true that had it not been "free", it would have cost a ludicrous €22 per day at this particular establishment. No surprises there - it's provided by Swisscom, which is consistently outrageous.
But the delegates are not paying for it. Someone else - Informa, the hotel, or one of the sponsors - is picking up the tab.
This is a business model which simply would never work for LTE, because the underlying architectural standards work against it. The reason that third-party sponsored WiFi works, for a room of 200 people, from maybe 30 different countries - is that it does not require cumbersome roaming mechanisms, or a physical SIM card.
In my view, the lack of a SIM-less option for LTE is a huge mistake. It completely destroys the possibility for a whole raft of innovative session-based, ad-hoc, temporary or location-specific business models for data connectivity. Don't get me wrong - absolutely there should be SIMs for normal subscription services and various value-adds. But there are plenty of other commercial opportunities for data services in particular, for which the need for the user to obtain a physical SIM would be a deal-breaker.
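Purely illustrative - nothing like this exists in the LTE specs - but the sponsored-WiFi pattern amounts to something as simple as this (Python): pay or be sponsored, receive a short-lived token, and let the network authorise on the token rather than on a SIM identity:

```python
import secrets, time

# Purely illustrative (no such LTE mechanism exists): a session-based,
# SIM-less access grant of the kind sponsored WiFi uses today.

_sessions = {}

def grant_access(sponsor, duration_s=3600):
    token = secrets.token_hex(8)
    _sessions[token] = {"sponsor": sponsor, "expires": time.time() + duration_s}
    return token

def authorise(token):
    session = _sessions.get(token)
    return bool(session) and time.time() < session["expires"]

t = grant_access(sponsor="conference organiser")
print("admitted:", authorise(t))            # True - no SIM, no subscription
print("stranger:", authorise("deadbeef"))   # False
```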
The industry is leaving money on the table by ignoring this - it will instead continue to be picked up by WiFi, fixed networks and WiMAX, all of which are more flexible in their authentication options. For now, this is masked by the rapid increase of monthly-subscription contracts. But the pool of suitable customers is limited, and as the growth tails off, it will be essential to find new pricing models.
I think this will get fixed somehow (or perhaps worked-around with clunky roaming options) - but it needs to be examined ASAP for LTE to be as much of a success as its proponents hope.
Thursday, May 14, 2009
Can mobile operators revenue-share with handset vendors?
Interesting article in last week's edition of the UK mobile trade magazine Mobile Today. Apparently O2 is considering a revenue-share plan with handset vendors, under which it would pay only 50% upfront for devices, with the remainder geared to actual customer expenditure over the life of the contract.
Presumably this would be intended to encourage vendors to load phones with features that generate service revenue, rather than ones which work standalone (eg camera, memory) or in "independent app provider" fashion (eg Ovi).
I'm very doubtful about how this might work in practice, and I'm struggling to think of any other (physical product) industry in which a similar model works: manufacturers clearly have to pay their own suppliers, and getting bank loans to cover a 12-24 month gap between costs and cash seems highly unlikely to me. I can't see EasyJet telling Boeing it would only pay for 737s based on future passenger load & revenue figures.
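A toy per-handset cashflow (Python, invented figures) shows the working-capital problem from the vendor's side:

```python
# Toy per-handset cashflow (invented numbers) under the reported
# 50%-upfront / revenue-geared model, from the vendor's point of view.

build_cost = 120.0          # vendor's cost to manufacture & ship (month 0)
wholesale_price = 200.0
upfront = 0.5 * wholesale_price
monthly_share = 5.0         # vendor's cut of service revenue, per month

cash = -build_cost + upfront        # month 0: vendor starts underwater
months_to_breakeven = 0
while cash < 0:
    cash += monthly_share
    months_to_breakeven += 1

print(f"month-0 position: {upfront - build_cost:+.0f}")      # -20
print(f"breakeven after {months_to_breakeven} months")        # 4 months here
# With thinner shares or early churn, the vendor is effectively lending
# the operator working capital - hence the bank-loan objection above.
```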
In fact, I can see this type of move backfiring drastically - it could well tip the balance towards device vendors switching to an Asian-style distribution strategy, in which customers buy handsets through channels separate from their SIMs and service. While this would reduce the pressure on operators to pay out subsidy, it would also reduce their ability to supply customised handsets with their favoured applications and UIs.
If handset vendors are going to face a cashflow hit, they might as well just offer consumer finance themselves for handset purchases - "Buy your Nokia XYZ in 18 convenient payments of £10 a month - comes preloaded with Skype, Ovi, Facebook and Music - just add a basic SIM with voice and data access".
There might be a very few cases where there is a win-win from this scenario, but I'm struggling to think of any off the top of my head.
Tuesday, May 12, 2009
Piracy, Net Neutrality & DPI - Imminent Failure
At the moment, it's Europe's turn to have Net Neutrality in the spotlight - especially with regard to spotting illegal content downloads, and administering punishment to offenders. There is also a lot of renewed discussion about deep packet inspection, application-level prioritisation or charging and so forth. A new group of vendors, particularly coming from the OSS environment, has jumped on the bandwagon recently.
My view on both these areas is that the practice is going to be much more difficult than the theory, irrespective of how 'moral' or 'elegant' some of their advocates' concepts might seem. I also think there are huge potential pitfalls which threaten to make a bad situation even worse.
Let's start with the piracy issue. There are suggestions that if ISPs are not willing to "police" traffic on their consumers' connections, the media industry might set up a managed service provider to do it for them. Either way, there is a hope for network-level content monitoring as a means to identify the more egregious offenders, and potentially disconnect (or even ban them) from the Internet.
The BBC has an article here, which also highlights that the European Parliament seems to be trying to limit the more draconian suggestions of disconnection. It appears to recognise that banning certain groups of people who are heavy file-sharers could work against its efforts for digital inclusion - it seems reasonable to believe that some of the lowest-income people (eg students) are likely to be P2P enthusiasts.
But the "rights and wrongs" seem to be almost irrelevant, because I can see numerous technical and practical difficulties in any scheme:
- Anonymity: In the UK and many other European countries, it is perfectly legal to use anonymous prepay mobile data connections, which may be completely distinct from the user's normal voice SIM and number. Fixed broadband is obviously difficult to anonymise to the same degree.
- Shared use: In theory, whoever pays the bill for broadband is responsible for the content transiting the connection. In practice, this means a parent is responsible for their children's activities, which is fair enough - but is it a proportionate response to excommunicate an entire family from the net because of an unruly teenager's download habits? By the same token, if an employee of a large organisation (eg a university or government department) transgresses the rules, do you cut their main connection to the IP universe?
- VPNs and encryption: As with more general concepts for DPI and application-level monitoring, one of the easiest ways to kill the whole notion of monitoring is to stick all the traffic in a secure tunnel (or tunnels). Yes, you can infer lots of things about VPNs based on their destination IP address and other characteristics, but inference is even more vague (and challengeable) than circumstantial evidence when it comes to enforcement. I'm sure most of the P2P guys are a few years ahead in terms of thinking of new and clever ways to use VPNs for their software.
- Obfuscation: For both general DPI and anti-piracy content monitoring, there is a significant risk that software developers will try to "game" the boxes in the network, in ways that could backfire nastily on ISPs. Looking for big file transfers as evidence of illegal content? Tracking P2P "signatures"? Then what happens when the bad guys' software pads out the pirated content with legal stuff? That could increase total traffic. Or P2P software just blends bits of legitimate shared P2P content with illegitimate, perhaps using steganographic techniques? You'll also probably get the open source crowd trying to find ways to spoof the system using "fake" traffic signatures - I wonder what happens to a DPI box if you flip the app signature once a millisecond? (See the sketch after this list.)
- AJAX: Are you responsible for anything an active web page downloads in the background without your knowledge? Irrespective of the precise legal situation, it's the sort of thing that could mire the whole exercise in lawyers' bills for years. There are also numerous other issues around mashups which completely break the notion of an "application" from a DPI perspective. How do you know if an application is YouTube.... or YouTube running in a Facebook page?
- Tracking and auditing: The onus of proof should clearly be on the accusers - and it's far from obvious that the current systems being suggested for piracy control or application DPI are robust enough to generate impeccable audit trails "proving" what is being tracked to a level that would stand up in court. Which, given that we seem to be moving towards Internet access as a basic human right, might be necessary if people start getting disconnected.
- Coverage: One of the themes at last week's Telco 2.0 was around "sender pays data" and various other ways of prioritising content delivery for those media companies or advertisers that pay for it. You can get gold-class service for your video download and get a sponsor to pay for it! Sounds great, but it falls down in mobile if you haven't got signal. There's not much value in 99.9% QoS, if you only get it 70% of the time.
- Network diversity: How do you deal with multi-network connectivity? If I'm simultaneously doing P2P (or legal video downloads) through the cellular network, and WiFi+home broadband, via different ISPs or operators, on the same device, it's going to be rather more complicated to spot. And even more tricky to enforce against. Add in the possibility of localised network-sharing - perhaps 10 smartphones 'pooling' their data connections via local Bluetooth or WiFi - and the problem gets exponentially harder still.
- Reverse engineering of policies: Any attempt to covertly degrade specific apps or streams is likely to be uncovered by the use of monitoring software designed to decode DPI policies. I'm expecting most operators to either publish their network management rules in detail - or cope with 3rd parties publishing reverse-engineered analysis instead.
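To make the VPN and obfuscation points concrete, here's a deliberately naive flow classifier (Python, invented flows) showing how quickly "application" detection collapses into an opaque bucket:

```python
# Toy flow classifier (deliberately naive) illustrating why encrypted
# tunnels defeat application-level DPI: once traffic is inside a VPN or
# TLS session, all the classifier sees is "a tunnel", whatever runs in it.

def classify(flow):
    port, payload_visible = flow["dst_port"], flow["payload_visible"]
    if not payload_visible:
        return "opaque tunnel (VPN/TLS) - application unknown"
    if port == 80 and b"youtube" in flow["payload"]:
        return "video?"     # ...or YouTube embedded in a Facebook page
    return "unclassified"

flows = [
    {"dst_port": 80,   "payload_visible": True,  "payload": b"GET /watch youtube"},
    {"dst_port": 443,  "payload_visible": False, "payload": b""},   # HTTPS
    {"dst_port": 1194, "payload_visible": False, "payload": b""},   # OpenVPN
]
for f in flows:
    print(f["dst_port"], "->", classify(f))
# Everything interesting migrates into the opaque bucket - and padding or
# signature-flipping degrades even the visible cases.
```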
In the past, I've said that I'm ambivalent about a lot of the Net Neutrality issues in the US, as competition or consumer legislation would kill any companies being stupid (eg blocking VoIP). If the vendors sell service providers the rope to hang themselves, so be it.
Now, as the attention moves to either piracy prevention or perhaps content prioritisation, I have a certain measure of sympathy for media rights owners or operators facing congestion. But I still think that they are set to waste a huge amount of money chasing after the myth of application-level (or content-level) monitoring and enforcement. That's not to say that all attempts at bandwidth management or monitoring are doomed to failure - they're not, and there are all sorts of other legitimate use cases that should work OK.
But operators need to be very wary of both vendor oversimplification and content-owner indignation when it comes to dealing with video or other media on their networks.
My view on both these areas is that the practice is going to be much more difficult than the theory, irrespective of how 'moral' or 'elegant' some of their advocates's concepts might seem. I also think there are huge potential pitfalls which threaten to make a bad situation even worse.
Let's start with the piracy issue. There are suggestions that if ISPs are not willing to "police" traffic on their consumers' connections, the media industry might set up a managed service provider to do it for them. Either way, there is a hope for network-level content monitoring as a means to identify the more egregious offenders, and potentially disconnect (or even ban them) from the Internet.
The BBC has an article here , which also highlights that the European Parliament seems to trying to limit the more draconian suggestions of disconnection. It appears to recognise that banning certain groups of people who are heavy file-sharers could work against its efforts for digital inclusion - it seems reasonable to believe that some of the lowest-income people (eg students) are likely to be P2P enthusiasts.
But the "rights and wrongs" seem to be almost irrelevant, because I can see numerous technical and practical difficulties in any scheme:
- Anonymity: in the UK and many other European countries, it is perfectly legal to use anonymous prepay mobile data connections, which may be completely distinct from the user's normal voice SIM and number. Fixed broadband is obviously difficult to anonymise to the same degree.
- Shared use: In theory, whoever pays the bill for broadband is responsible for the content transiting the connection. In practice, this means a parent is responsible for their children's activities, which is fair enough - but is it a proportionate response to excommunicate an entire family from the net because of an unruly teenager's download habits? By the same token, if an employee of a large organisation (eg a university or government department) transgresses the rules, do you cut their main connection to the IP universe?
- VPNs and encryption: As with more general concepts for DPI and application-level monitoring, one of the easiest way to kill the whole notion of monitoring is to stick all the traffic in a secure tunnel (or tunnels). Yes, you can infer lots of things about VPNs based on their destination IP address and other characteristics, but inference is even more vague (and challengeable) than circumstantial evidence when it comes to enforcement. I'm sure most of the P2P guys are a few years ahead in terms of thinking of new and clever ways to use VPNs for their software.
- Obfuscation: For both general DPI and anti-piracy content monitoring, there is a significant risk that software developers will try to "game" the boxes in the network, in ways that could backfire on ISPs in nasty ways. Looking for big file transfers as evidence of illegal content? Tracking P2P "signatures"? Then what happens when the bad guys' software pads out the pirated content with legal stuff? That could increase total traffic. Or P2P software just blends bits of legitimate shared P2P content with illegitimate, perhaps using steganographic techniques? You'll also probably get the open source crowd trying to find ways to spoof the system using "fake" traffic signatures - I wonder what happens to a DPI box if you flip the app signature once a millisecond?
- AJAX: Are you responsible for anything an active web page downloads in the background without your knowledge? Irrespective of the precise legal situation, it's the sort of thing that could mire the whole exercise in lawyers' bills for years. There's also numerous other issues around mashups which completely break the notion of "application" from a DPI perspective. How do you know if an application is YouTube.... or YouTube running in a Facebook page?
- Tracking and auditing: The onus of proof should clearly be on the accusers - and it's far from obvious that the current systems being suggested for piracy control or application DPI are robust enough to generate impeccable audit trails "proving" what is being tracked to a level that would stand up in court. Which, given that we seem to be moving towards Internet access as a basic human right, might be necessary if people start getting disconnected.
- Coverage: One of the themes at last week's Telco 2.0 was around "sender pays data" and various other ways of prioritising content delivery for those media companies or advertisers that pay for it. You can get gold-class service for your video download and get a sponsor to pay for it! Sounds great, but it falls down in mobile if you haven't got signal. There's not much value in 99.9% QoS, if you only get it 70% of the time.
- Network diversity: How do you deal with multi-network connectivity? If I'm simultaneously doing P2P (or legal video downloads) through the cellular network, and WiFi+home broadband, via different ISPs or operators, on the same device, it's going to be rather more complicated to spot. And even more tricky to enforce against. Add in the possibility of localised network-sharing - perhaps 10 smartphones 'pooling' their data connections via local Bluetooth or WiFi - and the problem gets exponentially harder still.
- Reverse engineering of policies: Any attempt to covertly degrade specific apps or streams is likely to be uncovered by the use of monitoring software designed to decode DPI policies. I'm expecting most operators to either publish their network management rules in detail - or cope with 3rd parties publishing reverse-engineered analysis instead.
In the past, I've said that I'm ambivalent about a lot of the Net Neutrality issues in the US, as competition or consumer legislation would kill any companies being stupid (eg blocking VoIP). If the vendors sell service providers the rope to hang themselves, so be it.
Now, as the attention moves to either piracy prevention or perhaps content prioritisation, I have a certain measure of sympathy for media rights owners or operators facing congestion. But I still think that they are set to waste a huge amount of money chasing after the myth of application-level (or content-level) monitoring and enforcement. That's not to say that all attempts at bandwidth management or monitoring are doomed to failure - they're not, and there are all sorts of other legitimate use cases that should work OK.
But operators need to be very wary of both vendor oversimplification and content-owner indignation when it comes to dealing with video or other media on their networks.
Friday, May 08, 2009
Thoughts on managed ID services by mobile operators
Over the last couple of years, I have regularly heard discussion about the possibility of mobile operators providing "managed ID" services to either businesses or consumers. It's something I haven't really delved into in depth, but I've generally thought it makes a fair amount of sense in some circumstances, especially for the corporate market where employees may need ID cards or other forms of secure logon.
I hadn't really thought about the options for mass-market consumers until seeing a couple of presentations and panel discussions at yesterday's Telco 2.0 brainstorm in Nice. I'm now less certain about the whole thing.
Orange presented the notion of becoming like an individual's bank, trusted to store digital content, provide single log-on capabilities and so forth, "for their whole life". Insofar as I could make out, however, this was all tied to the individual maintaining an Orange access connection (SIM, fixed broadband etc).
This is utterly unrealistic. I can't imagine that any sane person would want to lock themselves into a mobile or fixed access provider for life. Why would you want to entrust your photos, music or digital signature to a company that actively prevents you from churning if you just want an exclusive phone offered by another operator?
I think that any link between identity management and access provision will lead to a complete erosion of trust. Loyalty is not the same as lock-in: it is earned, not enforced. This is something that telcos (especially mobile ones) tend to ignore. I keep hearing terms like "stickiness" in discussions about churn reduction - a codeword for lock-in. This is a surefire recipe for customer dissatisfaction, and you can bet that when the beleaguered punter finally extricates himself and his data from his contract, he'll shout it from the rooftops.
I caused a fair amount of consternation by asserting that I trust my bank and Google more with my data than my main mobile operator (O2 in my case) or the UK government.
Over coffee, a representative of a vendor asked me why I don't trust operators with personal data - and why I trust Google more. My answers were:
(a) They already spam me through SMS and post.
(b) I actually don't know where O2's ownership of my data ends and Carphone Warehouse's begins. I get bills from CPW, but I'm on the O2 network.
(c) Operators (and governments) frequently outsource their IT to other companies. If they don't trust themselves to hold my data, why should I trust them?
(d) I know that operators pursue aggressive lock-in strategies, which makes me very wary of falling into their traps.
(e) I know that there is a reasonable chance I'll want to churn at some point, and I see no reason to make my decision harder. What happens if Vodafone gets an exclusive on a particular handset I want?
(f) There's no bilateralism in identity management, which points to arrogance and closed behaviour. If operators believe that ID can be provided as a managed service, then they should also buy into the concept for inbound ID as well as outbound: I'm not aware of being able to register on a cellular network with a BT or Deutsche Telekom managed identity, much less a Google or Skype ID. (A toy sketch of what inbound ID might look like follows after this list.)
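To illustrate what "inbound ID" could mean in practice, here's a minimal sketch of an operator accepting a signed assertion from an external identity provider. All the names and the shared-secret scheme are hypothetical - a real deployment would use an established federation protocol such as OpenID or SAML, not hand-rolled crypto:

import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import java.security.MessageDigest;

public class InboundIdSketch {
    // HMAC-SHA256 tag over the assertion, keyed with a secret shared
    // out-of-band between the operator and the external IdP
    static byte[] hmac(byte[] idpKey, String assertion) throws Exception {
        Mac mac = Mac.getInstance("HmacSHA256");
        mac.init(new SecretKeySpec(idpKey, "HmacSHA256"));
        return mac.doFinal(assertion.getBytes("UTF-8"));
    }

    // The operator accepts the visiting identity if the IdP's tag verifies
    static boolean acceptVisitor(String assertion, byte[] tag, byte[] idpKey)
            throws Exception {
        return MessageDigest.isEqual(hmac(idpKey, assertion), tag);
    }

    public static void main(String[] args) throws Exception {
        byte[] idpKey = "secret-shared-with-idp.example".getBytes();
        String assertion = "user=alice@idp.example;expires=1244000000";
        byte[] tag = hmac(idpKey, assertion);  // what the IdP would attach
        System.out.println(acceptVisitor(assertion, tag, idpKey));  // true
    }
}

The point is simply that the verification step is symmetric: there's no technical reason it should only ever run in one direction, with the operator as issuer and never as acceptor.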
Overall, I still think there's scope for providing ID and authentication services as part of the Telco 2.0 / two-sided business model concept. But I don't buy the pitch of mass-market consumer "escrow" of personal data.
(Obviously, I trust the Government even less. I'd much rather have O2 manage my ID than the Home Secretary. SIMs are more secure, cheaper and better-managed than the UK's ridiculous Stalinist ID card system).
Thursday, May 07, 2009
Ubiquisys and Intrinsyc - femto-aware Android phones
I first wrote about the desirability of "femto-awareness" in handsets almost a year ago. Since then, various on-device applications (such as demos from ip.access and others) have harnessed femtozone capabilities.
Yesterday's announcement by Ubiquisys and handset integrator Intrinsyc is an interesting one. Their software changes the "theme" on Android phones when in femto range, based on what the partners call presence.
The exact mechanism for the trigger is a little unclear from the press release - in particular whether the femto is somehow "pushing" notification to the phone app, or if the connection manager is somehow detecting a changing cell ID or similar.
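For what it's worth, here is a minimal sketch (Android/Java) of the second interpretation - the handset watching for a cell-ID change itself. FEMTO_CID and ThemeSwitcher are hypothetical placeholders; this is one plausible reading of the mechanism, not the partners' actual implementation:

import android.content.Context;
import android.telephony.CellLocation;
import android.telephony.PhoneStateListener;
import android.telephony.TelephonyManager;
import android.telephony.gsm.GsmCellLocation;

public class FemtoZoneWatcher extends PhoneStateListener {
    // Hypothetical cell ID broadcast by the user's home femtocell
    private static final int FEMTO_CID = 12345;

    private final ThemeSwitcher switcher;

    public interface ThemeSwitcher {
        void apply(boolean inFemtoZone);  // swap the "home" theme in or out
    }

    private FemtoZoneWatcher(ThemeSwitcher switcher) {
        this.switcher = switcher;
    }

    // Needs the ACCESS_COARSE_LOCATION permission to receive cell updates
    public static FemtoZoneWatcher register(Context ctx, ThemeSwitcher switcher) {
        FemtoZoneWatcher watcher = new FemtoZoneWatcher(switcher);
        TelephonyManager tm =
                (TelephonyManager) ctx.getSystemService(Context.TELEPHONY_SERVICE);
        tm.listen(watcher, PhoneStateListener.LISTEN_CELL_LOCATION);
        return watcher;
    }

    @Override
    public void onCellLocationChanged(CellLocation location) {
        if (location instanceof GsmCellLocation) {
            int cid = ((GsmCellLocation) location).getCid();
            switcher.apply(cid == FEMTO_CID);
        }
    }
}

A push from the femto itself would be cleaner - no client-side guesswork, and the network knows authoritatively which cell you're camped on - but it requires operator-side plumbing that pure handset-based detection doesn't.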
Either way, it fits in well with the general thesis that optimising handsets for use with femtos confers a range of benefits on both vendors and operators.
Monday, May 04, 2009
Reconnecting again!
Hello all
I've been away for the past 3 weeks in India, participating in a cross-country charity drive in a 3-wheel autorickshaw.
It's been an absolutely fantastic experience & adventure, made even better by the fact that I didn't make or receive a single phone call during that time. My "mobile" experience was a few SMSs exchanged with the other participating teams, plus a couple with my family. I went online at Internet cafes every 3 days or so to do "triage" on the most critical emails and, more importantly, to update the Rickshaw Run blogs & Facebook pages.
For anyone who never goes "off grid" for more than 15 minutes unless they're on a plane, I'd heartily recommend it. Next time you go on holiday, leave the phone at home.
Anyway - I'm now trying to get back to normal existence again, so one of the first things is to kickstart this blog back to life. There's been quite a lot going on while I've been away - Sun/Oracle, continued economic fallout in the mobile industry, debates about spectrum policy and so on. I'll try and catch up on the key themes.
Also, I'll be in Nice this week at the Telco 2.0 Executive Brainstorm, at which I'm moderating a panel session on devices.
I'll thank all the sponsors & other supporters of my India trip in a separate post, as well as giving a flavour of stuff I saw while I was away. But one question to leave you with - how much would you charge to have an entire wall of your house painted with a mobile operator's colours and logo?