Tuesday, December 30, 2008
The problem is exacerbated with smartphones that allow full multi-tasking, as they may have multiple applications pinging the network. This isn't too bad with 2G GPRS/EDGE, because of the way the radio channels are set up. But with 3G, and especially HSPA, there's a problem, because the radio has two "active" states (let's say "full" and "standby"), with different levels of power consumption, as well as a third state which could be called "off".
(If you want the full technical explanation on DCH and FACH channels, Martin Sauter has a great overview here)
Keeping the radio in either the fully-active or standby-active state is a battery killer.
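As a rough illustration of why this matters, the battery cost of keep-alive traffic can be sketched as a timer-driven state machine. All the power figures and timer values below are invented round numbers for illustration, not measurements from any real chipset or network:

```python
# Illustrative model of 3G radio state power cost. Every figure here is
# a hypothetical round number, not a measurement from any real device.
DCH_POWER_MW = 800    # "full" active state
FACH_POWER_MW = 300   # "standby" active state
IDLE_POWER_MW = 5     # "off" (camped) state
DCH_TIMER_S = 5       # inactivity before dropping full -> standby
FACH_TIMER_S = 12     # inactivity before dropping standby -> off

def energy_per_keepalive(interval_s):
    """Average energy (mJ) consumed per keep-alive ping, given the gap
    between pings. Each ping promotes the radio to the full state, which
    then decays through standby back to idle if the gap allows."""
    dch = min(interval_s, DCH_TIMER_S)
    fach = min(max(interval_s - DCH_TIMER_S, 0), FACH_TIMER_S)
    idle = max(interval_s - dch - fach, 0)
    return dch * DCH_POWER_MW + fach * FACH_POWER_MW + idle * IDLE_POWER_MW

# A 30s keep-alive never lets the radio reach idle; a 5-minute one does.
for interval in (30, 60, 300):
    per_hour = energy_per_keepalive(interval) * 3600 / interval / 1000
    print(f"ping every {interval:3d}s -> ~{per_hour:.0f} J/hour")
```

With these made-up numbers, a 30-second keep-alive burns roughly eight times the energy per hour of a 5-minute one, which is the crux of the multi-app problem.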
This doesn't just impact "over-the-top" third party apps either. At the moment, it applies equally to "through-the-middle" operator services like RCS too. In theory, the operator should be able to control this a bit better, as it knows the various timings for the states on its network, and should in theory be able to configure the application to work around this.
But this would then need the application to be aware of the network state, and/or vice versa. And it would be complicated where the user had multiple operator clients running on the device (say, both email and presence).
As per normal, the handset, network and application areas of the industry don't really talk to each other. As with other themes in the industry, there's still the brick wall when you suggest that applications should be "bearer aware".
One option is to have some sort of "notification broker" that bundles up all the "keep alive" messages and sends them together, at times that optimise for battery life on the device and/or impact of the sheer number of these short messages on the network.
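A minimal sketch of what such a broker might look like, purely as a thought experiment (this is a hypothetical design, not any real OS or operator API - the class name, the 20% deadline-slack rule, and the registration interface are all my own invention):

```python
class NotificationBroker:
    """Sketch of a 'notification broker': applications register their
    keep-alive callbacks and desired intervals, and the broker fires all
    nearly-due callbacks together in one batch, so the radio wakes once
    instead of once per application. Hypothetical design for illustration."""

    def __init__(self):
        self._apps = []  # each entry: [name, interval_s, callback, next_due]

    def register(self, name, interval_s, callback):
        self._apps.append([name, interval_s, callback, interval_s])

    def tick(self, now_s):
        """If ANY app is actually due, flush EVERY app whose deadline is
        near, piggybacking their pings on the same radio wake-up."""
        if not any(now_s >= due for *_, due in self._apps):
            return []
        fired = []
        for app in self._apps:
            name, interval, cb, due = app
            if due - now_s <= interval * 0.2:  # within 20% of its deadline
                cb()
                app[3] = now_s + interval
                fired.append(name)
        return fired

broker = NotificationBroker()
broker.register("email", 60, lambda: None)
broker.register("presence", 55, lambda: None)
print(broker.tick(56))  # -> ['email', 'presence']: email rides along early
```

The interesting trade-off is in that slack rule: firing an app slightly early wastes a little data, but saves a whole extra radio state-promotion cycle.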
The next question is who controls the notification broker. And whether it's in the OS, a higher-level client, or even the radio. I think this may turn out to be a future battleground for the industry.
One thing to ponder on: Apple has been working on a push notification engine, although it's been delayed. And it also doesn't like background applications running on the iPhone.
It wouldn't surprise me to see an operator-sponsored approach as well. Possibly it already exists in some of the more "controlled" OS/app stacks like DoCoMo's. But I suspect that this will be more about protecting the network, rather than the handset battery - one solution optimising simultaneously for both seems unlikely.
This has huge implications for application developers, Internet players, handset vendors and operators.
I suspect the optimum solution would be a full two-sided version run by the carriers, used for internal apps like their own presence, but also with a completely open and standardised API for 3rd-party developers and hosted MVNOs. It might even be a new source of revenue. The other option is one run by the OS vendors - but even then, it would be preferable to have some sort of coordination with both operators and each other.
One to watch in 2009, I think.
Friday, December 26, 2008
Solution - as an alternative/substitute, I got them a 3G dongle as a gift. Obviously that's also a little hostage to coverage issues, but I'm finding central London not too bad unless I'm in a basement. Also, it's not necessary in this case to have multi-megabit speeds for HD video, or huge file download limits.
Given their expected stay in hospital, their existing 20Mbit/s ADSL + 802.11n WiFi coverage, and relatively infrequent normal out-of-home PC use (and iPhone ownership), there was no point in getting an 18 / 24 month ongoing contract. A packaged, boxed, prepaid dongle was ideal.
I saw that Vodafone announced a standalone prepaid dongle a week or so back, for £39 including an inclusive 1GB data allowance. You can buy more at £15 per GB through normal prepay channels. This seemed pretty good. Unfortunately, and despite what the first Voda store I visited told me, it doesn't currently support Apple Macs. Luckily, that one was out of stock anyway, and the second one on London's Oxford Street had some more clued-up staff who gave me accurate information.
Being two days before Xmas, with hordes of shoppers in London, I wasn't exactly getting personal service at some of the other stores. My patience ran thin in O2 and T-Mobile and Orange stores. But anyway, none seemed to have nice, convenient prepaid "in a box" dongles that didn't need contracts or other hard work to sign up, at least based on the display info and brochures. I was a bit surprised, as I'd thought that T-Mo had a similar offer, but I couldn't find it.
So, my last port of call was 3 UK. They've been heavily advertising the dongle-as-Xmas-present offer all over London, so I was pretty confident they'd have what I needed. And yes, they have three pre-packaged options, based on Huawei E160 modem sticks (HSDPA 3.6 only, but that's good enough). They offer 1GB, 3GB and 12GB prepaid boxed bundles, with the data "credit" lasting 1/3/12 months respectively. Apple support was indicated on the outside of the box.
I was in store for about 10-15 minutes, was dealt with by two busy-but-pleasant staffers, who helpfully pointed out that the midrange 3GB bundle was, in fact, discounted to just £50 instead of the indicated £70. (Compared to £30 or £100 for the 1GB or 12GB offers). They also offered me the option of black or white dongles - obviously white to go with the MacBook.
I paid £50 (ie €52 / $74) for the modem and 3GB / 3 months of access. At that rate, it's essentially disposable. Whether 3 is making any profit on it, if most of the 3GB gets used up, but the account doesn't get topped up afterwards, is another question entirely.
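For comparison, the per-gigabyte rates implied by the three bundle prices quoted above work out as follows (note that each price also includes the E160 modem hardware, so the pure data cost per GB is actually a little lower than these figures suggest):

```python
# Per-GB rates implied by the three 3 UK prepaid bundles quoted above.
# £50 is the discounted price for the 3GB bundle (list price £70).
bundles = [(30, 1), (50, 3), (100, 12)]  # (price in GBP, included GB)

for price, gb in bundles:
    print(f"£{price:3d} buys {gb:2d}GB -> £{price / gb:.2f} per GB")
```

So the 12GB bundle works out at under £8.50 per GB, versus £30 per GB for the smallest, though most gift buyers will probably never use even the 1GB.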
As a side-note - given that some London hotels still charge £10 or even £20 for a day's WiFi, anyone coming to London for a week or more would be insane not to use the same approach.
Thursday, December 18, 2008
So without further ado, I'm going to revisit my 2008 predictions, originally posted here
1) There is a notable shift towards non-operator unlocked 'vanilla' handsets
Yes and no. There's certainly been a healthy market of people buying SIM-only mobile contracts in Europe, either with new handsets or unlocked legacy ones. On the other hand, operator exclusivity and better integration/optimisation on devices like the iPhone, BlackBerry Storm and T-Mobile G1, and 3/INQ 1 have ensured continued use of carrier channels for smartphones. The picture varies a lot by country and region, though.
2) The European Commission cracks down on data roaming prices.
Yes. You can't say you weren't warned.
3) Mobile broadband continues its rapid growth - but 3G-embedded laptops lose out even further to USB-based 3G modems
Yes. Enough said.
4) At least one mobile operator will face an investigation over reported numbers
No. Unless I missed anything, I don't think there has been any major incident of slapped wrists. I suspect investors are becoming a bit more savvy about interpreting reported numbers like ARPU, data revenues and "subscribers" for themselves.
5) Android... hmmm, it's just another platform
Yes. "We'll be back here in 12 months saying that Google's Android might be a big deal in 2009". Quite.
6) Technologies for exploiting end-user context and state become the hot topic of the year.
Up to a point. Lots of talk and conferences on APIs and presence and context-awareness, so yes it's certainly been a "hot topic". Plenty of discussion about possible business models whereby telecom operators can monetise their internal knowledge about their subscribers' context. But not that much real revenue or really impressive service launches yet.
7) Operators realise that knee-jerk attempts to block VoIP are counterproductive.
Yes, by and large. Many mobile operators have recognised that VoIP simply isn't a big threat, given complexity of use, and the fact that competitive and regulatory pressures are exerting much greater impact on circuit voice pricing. It varies by operator and region, but the DPI vendors' rhetoric is now much more about throttling video and P2P than blocking Skype.
8) Femtocells have a year of ups and downs. Some niche success, but practicalities will mean it's H2'09 or 2010 before massmarket deployment.
Yes. We're getting there on standards, and we're seeing the first soft-launches, but there are still too many optimists with steep hockey-stick curves. My forecast for 2009 shipments is still below one million units.
9) OK, this might be wishful thinking, but I'm hoping to see more pragmatism & innovation around the concept of mobile multiplicity.
Yes! There's a fast-growing acceptance that people will have multiple devices, and multiple operators. Even Verizon is talking about a 400%+ target for penetration. We're seeing dual-SIM phones, options for multiple numbers or personalities on a single device and so forth. Bundled phone+dongle contracts, too. We've still got a long way to go to get shared data plans across multiple gadgets, but even that is being discussed regularly.
10) No, No, No, No, No
- Mobile search. Still pointless. Yes. I know it's contentious, but I still see no evidence that people "want to find, not search" on their handsets. Tell that to Google with all their searches from the iPhone....
- Mobile advertising will definitely grow - but it's not going to get beyond a few % of ARPU in the foreseeable future. Yes. Definitely growing, but it won't solve world hunger.
- Mobile centrex. Yes. Still stuck on case studies of 11-person media agencies in Stockholm. No sign of massmarket uptake among either large businesses or SMEs.
- UMA dual-mode services. OK, maybe we'll get to 2-3 million users in 2008. Yes. No recent hard numbers from Orange or T-Mobile US - but I'm sure they'd be trumpeting loudly if they'd grown significantly. If anything, we may still be below 2 million in terms of actual active users.
- Unfortunately, we'll still have mobile industry dinosaurs referring to handsets as 'terminals'. Yes. As I said, it's a good filter for who "gets" mobile in 2009. Do you reckon Steve Jobs ever refers to the iPhone as a terminal?
- Unlicenced-spectrum wide area wireless. Yes. Nothing happening yet, although the white spaces discussion in the US is interesting, but still a way from real-world usage.
- GSMA's IPX. Actually, this is an interesting one. There's been much less emphasis on the IMS-based QoS-driven parallel pseudo-Internet this year. Instead, they're doing useful interoperability stuff around things like number translation and carrier ENUM.
I'll be working on my 2009 predictions over the next couple of weeks.
We're all familiar with this when we use "free" WiFi in hotels or cafes - the venue owner, or a conference organiser, picks up the bill.
I also saw a very good press release yesterday from mBlox, talking about the need for content providers to pick up the tab for connectivity - it makes an analogy with the sender of a letter or parcel paying for the stamp (sender pays), rather than the receiver. mBlox's chairman, Andrew Bud, is a regular speaker at events like Telco 2.0 and articulates the need for this type of model very well.
However, my view is that the content-oriented "sender pays" model is just a narrow slice of a much broader opportunity for sponsored / third-party data. In general, mBlox and Andrew Bud tend to be very focused on "packaged" mobile transactions and content - downloadable videos, premium SMS, MMS, ringtones, songs, games and so on.
While these are undoubtedly important, in my view access to more open Internet mobile services will vastly eclipse that segment over time. Packaged content should grow, but I expect that ultimate demand is tiny compared with that for mobile access to FaceBook, Google search and maps, general web access, VoIP, corporate remote access and assorted web-based applications and widgets.
There needs to be a generic mechanism for mobile data - on phones, PCs and other devices - to be sold on a wholesale basis by mobile operators.
There needs to be a "free HSPA" capability to mirror "free WiFi" for notebook users at conferences (or across whole cities).
There needs to be a way to get free mobile web access paid for by advertisers.
There needs to be a mechanism that spans multiple operators and roaming, so for example a local tourist authority could sponsor free access by visitors to websites of local attractions and restaurants on their phones.
There needs to be a way for a new social network site to offer free mobile access, across all operators, to new users for their first month.
There needs to be a way that I can get a "global subscription" to the FT on my mobile device, without worrying about one-off charges.
There needs to be a way for the government to provide free mobile data for unemployed citizens without PCs, so they can check recruitment websites on their phones.
There needs to be a way to buy a mobile backup service, without the "sender" (ie the user) paying extra.
At the moment, the way to do this is typically a bit of a kludge - perhaps using a separate mobile APN, or trying to use content-based billing based on IP addresses or "application type". Or more usually, just giving people a flatrate bundle. But not everyone has flatrate data, and in any case it doesn't cover roaming, or multi-operator deals. As usual, the SIM gets in the way, which means at the moment it's easier to do all these things via WiFi instead of the cellular connection.
EDIT: Ideally, there also needs to be a way for the service/content provider to choose between different data wholesale products. So there should be different per-MB costs for fast vs slow access, extra-low latency connections for games or VoIP, options for peak vs. off-peak times of access, differential rates for macro vs. femto vs. WiFi access, and so on.
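To make the wholesale idea concrete, here's a sketch of what such a rate card might look like. Every tier name and price below is invented purely for illustration - no operator offers exactly these products:

```python
# Hypothetical wholesale rate card for sponsored mobile data.
# All tiers and prices are invented for illustration.
RATE_CARD = {
    # (speed_tier, time_of_day, access_type) -> price per MB, in pence
    ("fast", "peak", "macro"): 5.0,
    ("fast", "offpeak", "macro"): 3.0,
    ("slow", "peak", "macro"): 2.0,
    ("slow", "offpeak", "macro"): 1.0,
    ("fast", "peak", "femto"): 1.5,   # offloaded traffic priced lower
    ("fast", "offpeak", "femto"): 1.0,
}

def sponsor_cost(mb, speed="fast", time="peak", access="macro"):
    """Cost (pence) billed to the sponsoring third party, not the user."""
    return mb * RATE_CARD[(speed, time, access)]

# e.g. a tourist board sponsoring 10MB of slow, off-peak browsing
print(sponsor_cost(10, speed="slow", time="offpeak"))  # -> 10.0
```

The point of the structure is that the sponsor, not the end user, chooses the quality/price trade-off - which is exactly what today's APN-and-flatrate kludges can't express.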
So while I agree with mBlox that "sender-pays" is a useful concept, let's not over-focus on a name and vision that is solely geared around delivery of packaged mobile content. My over-riding belief is that new business models for general mobile access and communication are always more important than mere content, and that's certainly the case here.
I cover sponsored / third-party pays data in the recently-published Disruptive Analysis Mobile Broadband Computing report.
Monday, December 15, 2008
Some pertinent ones:
"Can I run Microsoft Office?" - essentially, no, it doesn't have a CD drive or the processor clout to give good performance.
" If I switch network operator, can I still use my Dell netbook? : After your agreement with Vodafone has ended you can use the netbook with another operator. You will need a USB modem from your new operator and will need to re-configure the settings to be able to access their network."
"Can I upgrade the Dell netbook, e.g. larger hard drive, faster processor, more memory?" - No.
At one level, this is a very sensible concept - for years I've been trying to convince network-side manufacturers that they take phones for granted. Numerous network technologies - 3G, IMS, UMA and potentially femtocells - have been delayed or rendered useless because the infrastructure or standards folk thought that the phones "were the easy bit". And while implementing some of the "hard" nuts-and-bolts protocols is occasionally simple, that certainly can't be said for the "soft" user-facing applications and all-round experience. I wrote about delays in creating IMS-capable phones 2.5 years ago, and we're still waiting for things like RCS.
In an ideal world, the network would be "exposed" to handset-based apps, and the handset "exposed" to network-based apps. The network would know (for example) what the phone's battery level was like, or if it was on charge, set to "silent" or had limited available free memory - all valuable data for applications. And it could help guarantee security through certificates or other mechanisms. Conversely, a handset-resident app could query different networks' levels of congestion, speed, price and choose the best one (obviously a lot of server-side APIs are already standard, courtesy of the web).
My concern is that Ericsson may not be the best-positioned or most neutral broker - especially if its proposition is based on Java and IMS as this other article suggests. If there is to be a successful common API, it absolutely needs to work in BOTH operator-controlled and 3rd-party-controlled fashions.
As a baseline, it needs to be absolutely neutral towards the philosophies of "over the top" or "through the middle" application architecture. It's fine if its *implementation* can be skewed one way or another in different contexts, though. For a handset provided by and subsidised by an operator, you'd expect there to be optimised applications and control/billing suited to the carrier's preferred platform, whether it's IMS or something else. But for a handset provided through other channels - and especially if it is subsidised or provided by a non-operator player (maybe Apple, Google, Cisco etc) - the converse should be true.
There's also a large open question about how this fits with initiatives like OMTP's BONDI, which is "addressing the problem that an application written for one phone must be rewritten again and again if it is to work on all phones" (sound familiar?), but which is using a web runtime and browser/widget world-view, rather than Java or IMS. That said, Ericsson's also a sponsor of OMTP, so presumably there's some alignment going on in the background somewhere.
Sunday, December 14, 2008
So it's interesting timing to see that Nokia has just announced entry into the HSPA dongle marketplace.
This follows on from SonyEricsson doing the same earlier this year, and adding in extra functions like GPS and decent industrial design. Then LG, in its LTE chipset announcement the other day, also mentioned that it is working on a prototype data card. And Toshiba does this cool-looking dongle/basic phone/MP3/memory hybrid thing called a G450.
I wonder if we'll see any cool-looking Apple-branded dongles at some point....
Thursday, December 11, 2008
The service is used, a modem is needed, money is paid by the subscriber (or a 3rd party) for the broadband. There's cash coming in, so in theory at least, everyone should be happy, with a bit of decent negotiation among all the parties.
Now consider another scenario.
A laptop is bought from a retailer or online, with a built in module. But the end user either has no intention of using mobile broadband, or perhaps will just try the free trial period included at the time of purchase. So no money is spent on broadband access.
But someone's had to pay for the module. Either an operator, the OEM, or the end user themselves. And whoever it is (or a combination of them) has wasted perhaps $50-100. The user is unlikely to be wearing the cost directly, especially if they've shopped around for competitively-priced laptops, or configured it online. That cash could have gone on a higher-spec machine, or stayed in their pocket.
If it's bought through a non-operator channel as a "vanilla" embedded PC, there's no clear way for a particular operator to be involved, as the user could put a choice of SIMs in it. There might be a small payment for a "trial" SIM to be in the box, much like AOL used to put a CD in with PCs back in the 1990s. But I can't see an operator paying for the full cost of the module for the privilege of marketing in this fashion.
Which means that for the modules included in "default" notebook configurations, sold through ordinary channels, the OEM is essentially footing the bill, if it never gets used.
Now according to this piece of research, the gross margin on a high-end notebook ($1300) is about 30%. But Dell's overall gross margin is currently 19%, and references to higher-margin products in the mix in its earnings call referred to things like storage and services rather than notebooks. And Asus has been making 20% gross margins on eeePCs, but they are thought to be falling to 15% - and it has now warned that shipments may be lower than expected.
So for the sake of argument, let's say gross margin is 20% for notebooks. Probably lower on an ultracompetitive netbook, and certainly higher on an Apple Macbook Air.
So on a netbook, the gross margin on the wholesale price is, perhaps, $50. And on a mid-range corporate or consumer notebook, $150.
I'm really not convinced that putting in an extra $50-100 cost of an unwanted, unvalued, unused 3G module is going to make those numbers look very pretty, from a CEO's or investor's eyes. Frankly, even if the wasted cost is $30 it's still not going to make people shrug in indifference.
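Putting rough numbers on this, using the 20% gross-margin assumption above (the $250 and $750 wholesale prices are my own illustrative figures, implied by the $50 and $150 margin estimates; the $75 module cost is simply the midpoint of the $50-100 range):

```python
def margin_after_module(wholesale_price, margin_pct, module_cost):
    """Gross margin in dollars if the OEM absorbs the cost of an
    unused WWAN module. All inputs are illustrative assumptions."""
    return wholesale_price * margin_pct - module_cost

# Netbook: ~$250 wholesale, 20% margin, $75 module -> margin wiped out
print(margin_after_module(250, 0.20, 75))   # -> -25.0
# Mid-range notebook: ~$750 wholesale, 20% margin -> margin halved
print(margin_after_module(750, 0.20, 75))   # -> 75.0
```

On those assumptions, an unactivated module turns a netbook into a loss-maker outright, and cuts the notebook's gross margin in half.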
Now, I wonder why Apple hasn't put any WWAN modules in Macs yet, given their continued targeting of an average 30%+ gross margin?
This is one of the reasons why I'm predicting slow uptake of built-in 3G (and WiMAX) in my new report on Mobile Broadband Computing.
There's also another big difference. When Americans say "cut the cord", they usually don't mean "cut the cable TV cord as well". They just mean the copper telephone line. But in markets which are ADSL-biased for broadband, that's not a realistic option for most households. Instead, they may be able to use an "unbundled local loop", and keep their fixed broadband, but get rid of their PSTN telephone subscription.
However, with the rise of mobile broadband, many operators' marketing teams are encouraging customers to get rid of their ADSL connections as well. Obviously in markets with fully-converged operators and quad-play, that's less likely, but mobile-only operators are possibly storing up trouble for the future.
At some point, the operator is possibly going to want to deploy femtocells, WiFi, or some other offload approach - especially if you're a heavy "mobile only" user. And at that point, the lack of an existing broadband connection is going to be a problem.
Not only that, but your copper line will be disconnected at the exchange - and so even if they want to offer you a new fixed+mobile package with a home gateway including a femto, someone has to pay for it to be reconnected and tested. It's even possible that they'll need to send someone to your house to check the wiring still works OK. (When I moved into my current house 2 years ago, I needed an engineer to reconnect everything & install a new socket, before I could get ADSL provisioned).
So for operators, although cord-cutting sounds like a great way to get more fixed-mobile substitution in place, there's a longterm downside with regard to future flexibility for macro network offload.
Wednesday, December 10, 2008
Embedded-3G laptops over-hyped – will only account for 30% of
Combination of Credit Crunch and Capacity Crunch will dramatically slow take-up
However, the report, “Mobile Broadband Computing: Device Market Forecasts & Business Model Scenarios” predicts that in the long term, embedded mobile broadband will indeed overtake separate modems, in terms of both shipments and the active user base. By 2014, there will be 150m users of notebooks and the smaller “netbooks” with embedded mobile broadband worldwide. In terms of device shipments, 100m wireless-enabled laptops will be sold annually by then – although not all of them will actually be activated.
The study identifies numerous reasons for the slower-than-anticipated growth of embedded WWAN (wireless wide area networking). Key reasons include: the global recession impacting notebook purchases; unfavourable pricing differentials; the limitations of the sales and support channels for mobile-enabled notebooks; and the typical two-year monthly contract payment model, which does not fit with much of the target market for these devices. This makes comparisons with the rapid rate of adoption of WiFi in laptops appear over-simplistic.
The report’s author,
Other findings from the report include:
- The new market category of “Mobile Internet Devices” (MIDs) will grow only slowly. Only 3m will be sold in 2009, although by 2014 this should grow to ten times that figure.
- By 2012, there will be 45m users of WiMAX mobile broadband computing devices. 11m of these will also use 3G or LTE connections in various hybrid or multimode approaches.
- An increasing proportion of subscribers will use their 3G handsets as “tethers” for their PCs, instead of using separate modems or built-in modules. However, fewer than 10% of people will use tethers as their sole access method.
The report also predicts that 2009 will be a much more difficult year for mobile broadband, compared with the huge growth experienced in 2008. The recession and non-availability of credit will drive a softening of demand for laptops generally, as well as a focus on value. For most people, built-in 3G or WiMAX is a “nice to have”, not a “must have”.
Some mobile operators, especially in
Finance officers at the operators have been happy about getting back some revenue on their existing, expensive and under-utilised network assets. However, they are much less enthusiastic at spending on upgrades. This combination of Credit Crunch and Capacity Crunch will have a marked effect.
One outcome will be a shift to new business models for mobile broadband. As well as revised prices and bandwidth caps, Disruptive Analysis expects to see new payment mechanisms emerge. Prepay (“pay as you go”) accounts are already popular in some markets and this will increase. In addition, new session-based, sponsored or “free” mobile broadband models will start to mirror the WiFi hotspot business – especially where network congestion can be lowered by the use of new “femtocell” access points. Conventional, long-term, monthly contracts will account for only 40% of worldwide mobile broadband subscribers by the end of 2011.
The report, “Mobile Broadband Computing: Device Market Forecasts & Business Model Scenarios” is available to buy from Disruptive Analysis. It includes detailed analysis of new product sales (3G laptops, netbooks, dongles, MIDs), installed base and mobile broadband service uptake by device type, network technology and business/payment model. Details are available at www.disruptive-analysis.com.
Sunday, December 07, 2008
As always in the technology industry, there are few precise definitions and lots of grey areas and exceptions. Different companies and observers pick & choose their own preferred terms to fit with their own market positions.
I've tried to use a consistent set of definitions in my work on mobile broadband and emerging devices. For reference, my thoughts (in brief) are:
Smartphone: device with an open operating system, full telephony stack, decent browser, and a size that means it can be held up to the ear for voice calls, without looking like an idiot. Typically with a screen size up to about 3 or 3.5 inches, although resolution may vary. Examples: Apple iPhone, E- and N-series Nokias, BlackBerries, most Windows Mobile devices
Obviously, they all have WWAN capability - predominantly 3G, as WiMAX isn't really mobile voice-optimised yet. Most now have WiFi as well.
MID (Mobile Internet Device): Handheld device with 4-8" screen, probably with resolution of VGA (640x480) or above. Various OS options, but not with a full PC-type specification and user experience. You'd be unlikely to do large spreadsheets on it, for example. Primary applications are generic Internet/computing tasks like web browsing, email, web-based productivity apps, social networking, access to some forms of content. Secondary applications vary, but perhaps navigation, voice, TV. Various form-factors (clamshell, slider, tablet etc), but generally too large to hold to the head as a "phone" unless you want to look like Dom Joly.
I make a distinction between "generic MIDs" - ie Internet-primary devices like Nokia N810 or the Aigo - and "application-primary" devices which also do mobile Internet stuff as a secondary capability, like the Archos 5, which I'd say is first and foremost a media player. Similarly, the Sony PSP has long had WiFi and is usable for web and VoIP, but is clearly a gaming product primarily. I reckon the iPod Touch is slightly too small to really be a full-spec MID but it's very very close to my (admittedly arbitrary but consistent) definition size-wise.
I'm expecting virtually all generic MIDs to have embedded WWAN - like phones, they'd be almost useless without it, and not necessarily support separate external modems with ease. Some will be 3G, some WiMAX and in future, some with both - and WiFi as well.
UMPCs (Ultra Mobile PC): Although the term is still used a lot, I'd define a UMPC as, essentially, a full-spec Windows Vista PC in a smaller physical form-factor, perhaps with a touchscreen and no conventional keyboard. Generally, UMPCs are quite expensive and may be tailored for particular vertical markets. Not, for the most part, massmarket consumer products. An example is the Samsung Q1. Almost all have WWAN capability built in, mostly 3G. All have WiFi.
Netbooks: These are probably the most prominent "small computer" form at the moment, exemplified by the Asus eeePC and more recently a plethora of competitors like the Dell Mini 9, HP Mini 1000, Acer Aspire One and numerous others. Basically, these are small notebooks, with a laptop-type form factor. They have 7-10" screens, and 80-90% size keyboards, with a mix of Linux and Windows OS's (no Mac netbooks yet....). They are typically intended for "lightweight" online and offline PC applications (eg web-based services, office applications) although they are generally not up to "heavy" gaming or corporate apps as they have less-powerful processors than full-size PCs.
Typically priced sub-$500, an increasing number will have built-in 3G or WiMAX, but despite predictions this will be far from "default" for a long time, as it will be an option rather than pre-configured. All have WiFi.
Sub-notebooks: The classic "small notebook" form that's been around for years. Normally a premium-priced, quite high-spec device such as various models of Sony Vaio or Toshiba Libretto.
Notebook: Classic laptop, with a screen size of 11" and above. Mostly $500 and above. All have WiFi, and a small but (slowly) growing number will have embedded 3G and/or WiMAX.
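The taxonomy above can be summarised, tongue slightly in cheek, as a lookup on screen size plus a couple of form-factor flags. The numeric boundaries are my own arbitrary-but-consistent cut-offs from the definitions above, not any industry standard (and this simple version deliberately lumps sub-notebooks in with notebooks, since separating them would need a price dimension too):

```python
def classify(screen_inches, laptop_form=False, full_pc_os=False):
    """Rough device category from the definitions in this post.
    Boundaries are my own arbitrary-but-consistent cut-offs."""
    if screen_inches <= 3.5 and not laptop_form:
        return "smartphone"     # holdable to the ear without looking silly
    if laptop_form:
        # 7-10" laptop-style devices are netbooks; 11"+ are notebooks
        return "netbook" if screen_inches <= 10 else "notebook"
    if full_pc_os:
        return "UMPC"           # full Windows PC in a handheld form
    return "MID"                # 4-8" handheld, non-PC user experience

print(classify(3.5))                      # -> smartphone
print(classify(4.8))                      # -> MID
print(classify(7, full_pc_os=True))       # -> UMPC
print(classify(9, laptop_form=True))      # -> netbook
```

The overlaps (a 7" device can be a MID, a UMPC or a netbook) are exactly why form factor and OS matter as much as screen size.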
Maybe in a year's time I'll need to review these definitions, but as I said above, it's always going to be possible to find exceptions or corner-cases.
Published December 2008: Disruptive Analysis research report on Mobile Broadband Computing. For details see here.
Funny, I always thought US consumers used Google, rather than the other way around.
Google's own pithy rebuttal is here
Honestly, both fixed and mobile operators need to get real about Net Neutrality, before it comes round and bites them on their collective backside. How would they react if some of the larger Internet companies started displaying announcements saying things like
"You appear to be using CompanyX as your ISP / mobile broadband provider. We will be ceasing to support access for users from that provider in 12 months time. We suggest you churn to CompanyY or CompanyZ, which have more enlightened policies on Net Neutrality"
Who do you think would win in a "battle of loyalty" between Google or FaceBook, versus a typical ADSL or HSDPA provider?
Going forward, I'm half-expecting Google to start charging unfriendly ISPs or cablecos. "$0.02 per search, unless you want your customers to get a 3-second interstitial advert for your competitors"
This is why I find it hard to get exercised by the lobbying around this area. The market will sort it out - if they keep whining, "pipe" providers are going to get taught some hard lessons about market power and competition. This isn't rocket science - it's been in strategy textbooks for 30 years. As Tim says - the answer's in the business model, and in the locked-up capabilities of operators' networks and billing systems and customer devices. Exploit your existing assets, not your Washington lobbying budget.
However, at prices like an extra $199 as charged by HP in the US, we're certainly not there yet - especially as that's up against the much more attractive "free" price point available to buyers of external USB modems. I also remain highly unconvinced that there's a massmarket of people out there who want to tie themselves into multi-year monthly contracts for their embedded notebooks. It's conspicuous that this is occurring at a time when many consumers are moving away from long-term plans for their voice phones, towards SIM-only and rolling one-month contracts.
As a quick heads-up, the Disruptive Analysis report on Mobile Broadband Computing is (finally!) published this week. Watch out for more details over the next few days.
Edit: One possibility is that some notebooks/netbooks might ship with embedded 3G modules, but with the OEM only paying its supplier if the mobile broadband is actually activated. Given the push to get more and more mobile broadband users online, I could imagine some of the more strategic suppliers (eg Ericsson or Qualcomm or the GSMA Mobile Broadband consortium) actually subsidising the modems' upfront cost, in the hope of encouraging an "aftermarket" of actual service activation.
Friday, December 05, 2008
This article, from the UK's mobile trade newsletter, seems to point to the threat becoming real.
It's not immediately obvious whether it's the retailer or the operator that bears the risk in this scenario, and I'd guess it'll vary depending on whether it's an independent retailer or the MNO's branded outlets. There's also all sorts of complexities around commissions and other payments, but the net effect on the overall mobile value chain is similar, if retailers ultimately go bust.
I've seen breathless commentary about the entry of a new "manufacturer" of Android phones, the Australian company Kogan. "Kogan simply designed a device, had it manufactured in China, and loaded Android onto it". Hmm, a small company importing LCD TVs and other electronics "designed" a device? They did interoperability testing against all the HSPA network hardware variants? They designed the antenna? Given their own price went up by Aus$100 over initial expectations because of currency movements, who'd want to guess their margins a year out?
Well, one thing's for sure, they announced it the same week that Nokia said that the global handset market is falling off a cliff.
It's tempting to say "Oh, well, it's still a billion+ devices a year, I'm sure new entrants can make money".
Same deal with all the cool new MIDs (mobile Internet devices) coming onto the market with shiny Intel Atom or TI OMAP or Qualcomm Snapdragon chipsets. Surely everyone wants a portable web device?
But think through the other ramifications:
- Nokia & Samsung will presumably take the opportunity to lower prices, benefit from scale economies, gain market share and squeeze/kill the less fortunate. Ditto the larger notebook vendors.
- Consumers will sign up for longer contracts as operators try to extend upgrade cycles
- Consumers buying unlocked phones are going to think very carefully before parting with their hard-earned cash (or their hard-crunched credit)
- Enterprise buyers will defer upgrades and replacements of anything. Especially if they expect to fire 20% of their employees next year. And where they do buy stuff, they'll know that they can get great deals out of existing suppliers who have friendly and trusted salespeople desperate to keep their jobs.
- Suppliers are going to want cash upfront, especially from startups.
- Distributors and retailers will want to cut risk and costs. Putting large marketing budgets towards "cool new stuff" from "cool new companies" will be less likely.
- Nobody is going to want to tie up capital in large inventories of anything - components, finished devices etc.
- Various companies in the value chain will go bust, leaving holes in supply chains and distribution channels.
- Any operator giving a subsidy for anything will be doing doubly-diligent credit checks.
- It's anyone's guess what happens to currency movements. Bigger companies can hedge better than smaller ones. (OK, unless they're investment banks, it seems....)
- Any capacity-crunched networks are likely to stay crunched for a bit longer, unless a CTO physically forces the chequebook out of the CFO's hands.
Sorry to sound like a doom-monger on this, but really, I think some of the unbridled enthusiasm around high-end smartphones, MIDs, network deployments like LTE and broadband-enabled notebooks and netbooks is going to come down to earth with a hard bump.
Frankly, I hope I am wrong on this. It doesn't do my business any good either - not many companies tend to spend money on pessimistic advice. And there will certainly be bright spots - Apple, for instance, as long as iPhone revenues offset any weakness in iPods, and its operators don't start playing hardball in negotiations. Nokia's new N97 (and its scale and reach in low-end devices) should help it weather the storm better than most.
Overall, I'd rather start 2009 with the glass half-empty, and then hope to get a top-up. Too many people are going for half-full, without spotting the cracks.
Thursday, December 04, 2008
I suspect the winner will be "both".
There's an interesting comment from Clearwire about this here. And I'm hearing about dual-mode LTE/WiMAX chips and devices.
In fact, I'm forecasting that by the start of next decade, there will be more dual-mode modems and modules sold, than standalone WiMAX ones.
It seems cool (although Fring has also been on the platform for a while), but I'm wondering who it's aimed at. The pitch seems to be around kids & college students. But most of them don't actually use voice these days - just SMS. A few use mobile IM, but that's definitely a minority sport in comparison, and likely to stay that way for the foreseeable future.
I'm not sure how many Touches have shipped so far - I certainly don't see that many around in London on the Tube, but I guess it may have had more traction in the US. And I'm curious about the user experience - what happens when you get an inbound call: is there a loud ringtone or vibrate? What about if you're listening to music at the time? (Can the Touch handle multi-tasking of the music player & background apps?)
So, I'm a bit skeptical about all this, as my 2005 predictions of WiFi-only VoIP devices have turned out to be over-optimistic. I was expecting to see lots of single-mode VoWLAN devices start to replace DECT and other cordless phones. But even with the advent of Skype-integrated WiFi handsets, there's been little traction, despite some decent brands like Netgear getting involved. Various DECT phones can now hook into Skype or SIP VoIP services via terminal adapters, but that's not exactly set the world on fire either.
Having VoIP as a secondary application on an iPod or a personal media player or a handheld navigation device is all very well, but I'm just not convinced that many people will want to use it, when they've got a mobile phone in their other pocket, with their main address book & SMS. As Andy points out, various VoIP applications have been usable on Nokia tablets, Sony PSP and other devices for some time. They don't appear to have made a huge impact, though.
Yes, the VoIP/iPod combination is probably quite convenient for people wanting to make international or long-distance VoIP calls away from their PC, if they haven't got an unlocked dual-mode smartphone.
Also, it's possible that in the US there are still kids & students who actually speak to each other on the phone rather than text, so maybe I'm looking at this through Euro-centric eyes.
Actually, I still think the most important non-handset platform for 'mobile' VoIP is the laptop - I often see people in airports or hotel lobbies with headset attached to their PC, and it's certainly my own main use case for VoWLAN. I often do conference calls using Skype-over-WiFi from my notebook, as it means I can take notes into a Word document and look at a website or slide deck simultaneously.
So, where's the Truphone client for Windows?
I am getting a sudden increase in email, phone calls and other inbound traffic which is based on the presumption I'm a blogger. Lots of stuff about SEO (ugh), "driving traffic", offers to write posts on my blog, offers to pay me to write posts, people trying to talk me into writing posts about their companies and so on. I'm not interested in "stories", I'm not interested in your web buttons, I'm not interested in having my posts syndicated on sites not frequented by my clients & contributors. And I'm certainly not interested in unsolicited press releases that Executive A from Company B is speaking at Conference Z.
So let me set the record straight: I am not a blogger.
I'm an industry analyst & consultant who happens to write a blog. My company is called Disruptive Analysis, not Disruptive Wireless.
I am no more a blogger than a politician, a vendor's head of product marketing, or a TV journalist who happens to use a blog to reach out to potential customers / voters / viewers / whomever.
This blog is a tool, like the phone or email. I'm not an "emailer" or a "phoner". There is no advertising. It's a means to an end, not an end in itself. If it ceased to generate business for me, I'd stop writing it tomorrow without a second thought. I'm not interested in some nebulous and fluffy "blogging community".
The blog exists to generate interest in my published reports, consulting service, workshops and speaking engagements. Its secondary role is through the comments, which often have valuable insights. And its third role is awareness - it's useful to me when people I meet have read my stuff & recognise my name.
I do not mind receiving (relevant) press releases. But read first, to be sure it really is relevant and not tangential to what I research. And then ask me if it's OK to add me to your list.
The sole exception to this is for those (few) large vendor companies that weirdly have better-resourced and more responsive "blogger relations" teams than those for analyst relations. In those cases, I'm happy to masquerade temporarily as a blogger, if you get me better access to the company than an AR team that doesn't recognise independent analysts.
Rant over. Apologies if it changed your view of who I am & what I do - but that's how it is.
Wednesday, December 03, 2008
The Register has a great example of European inadequacies and worrying authoritarianism. State authorities installing trojans on people's PCs to enable remote searches. I presume that the same philosophy would be applied to smartphones as well.
Luckily, the concept fails on so many practical and technical levels, we probably don't need to get too worked up about it just yet - although continued vigilance against creeping State invasion of data privacy is pretty important.
Some obvious flaws in the concept:
- How this software is installed on PCs in the first place
- PC security software
- Separate hardware firewalls (eg in corporate networks - I can just imagine them being reprogrammed to allow external agents to peer inside PCs on the LAN)
- Threat of these trojans being subverted by other malicious users
- How this would work with roaming users - would the government have the right to snoop on visiting Chinese users' PCs? Or would your PC's data continue to be visible when you were outside Europe?
According to El Reg, it is the Germans who are most keen on this approach.
Of course, here in the UK, if the government wants to know what's on an opposition MP's PC or BlackBerry (for example, one receiving embarrassing leaks), it's much easier just to take a leaf out of Robert Mugabe's book: arrest them and physically seize their computers and phones.
This year has been all about mobile broadband revenue and traffic growth. Dongles, iPhones, embedded PCs, Android, consumer BlackBerries, Nokia E/N series.
But there is a mismatch. While operator data revenues might have risen 50% or 100%, 3G traffic has gone up by 500% or 1000%.
Until now this has, largely, been absorbing existing 3G/HSDPA capacity that has been lying dormant since original deployment. Clearly, this has been perceived as beneficial - generating at least some revenue from data is better than nothing, and there are also signs of additional upside in using mobile broadband as a retention tool.
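The revenue/traffic mismatch above is easy to put in numbers. A minimal sketch, using purely illustrative index figures (revenue +100%, traffic +1000%, both indexed to 100 at the start - not measured data):

```python
# Illustrative figures only: if revenue doubles while traffic grows
# elevenfold, the revenue yield per MB collapses.
rev_start, traffic_start = 100.0, 100.0   # index both to 100
rev_end = rev_start * 2.0                 # revenue up 100%
traffic_end = traffic_start * 11.0        # traffic up 1000%

yield_start = rev_start / traffic_start   # revenue per unit of traffic
yield_end = rev_end / traffic_end

decline = 1 - yield_end / yield_start
print(f"revenue per MB falls by {decline:.0%}")  # ~82%
```

Even under these generous revenue assumptions, the per-megabyte yield drops by over 80% - which is why "more traffic" is not automatically good news for operators.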
But the storm clouds are gathering, in my view. Not everywhere - some operators, and some parts of their networks, are more exposed than others. In the US, traffic is being driven more by the iPhone and other "superphones", while in Europe it's consumer use of 3G dongles. Given variations in population density, cellsite locations (and planning processes), spectrum allocations, speed of backhaul upgrade & numerous other factors, it's certainly unlikely that the whole industry will grind to a congested halt.
But while some networks will be more robust than others, that doesn't mask a simple fact - the macrocell capacity of 3G - or even WiMAX or LTE - is not unlimited. While it can be tweaked and optimised, with more spectrum and MIMO and improved coding and other tricks, the laws of physics start to intervene.
Put simply, I reckon that the theoretical, mid-term, aggregate capacity of all operators' macrocell mobile broadband in a given urban location is in the range of 1-3Gbit/s per square kilometre. In other words, all the mobile capacity in that area equates to a single fibre used for current-generation metro ethernet.
Yes, that's quite a lot of traffic. But it would get absorbed very quickly if used for real "heavy lifting" applications like corporate data, HD TV, mass use of P2P and so on. The growing availability of HSPA and WiMAX devices with good browsers and big screens represents an ideal breeding ground for the next "viral" application after social networking.
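The back-of-envelope behind that 1-3Gbit/s per km² figure can be sketched roughly as sites × sectors × carriers × average throughput, summed across operators. Every parameter below is an illustrative assumption (not a measurement), picked to land near the low end of the range:

```python
# All parameters are illustrative assumptions for an urban area.
operators = 4            # competing macro networks covering the area
sites_per_km2 = 5        # macro sites per operator per km^2
sectors = 3              # sectors per site
carriers = 2             # 5 MHz carriers per sector
mbps_per_carrier = 8.0   # busy-hour average per carrier-sector
                         # (mid-term HSPA+/early LTE assumption)

total_mbps = operators * sites_per_km2 * sectors * carriers * mbps_per_carrier
print(f"aggregate ~{total_mbps / 1000:.1f} Gbit/s per km^2")
```

Tweak the assumptions (more spectrum, MIMO, denser sites) and you can push towards the top of the range, but not orders of magnitude beyond it - that's the point about the laws of physics intervening.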
It's not just the radio network that's a future bottleneck either. It's also the backhaul transport, the core & gateway elements like SGSN and GGSN, any ancillaries like DNS servers and so on. The usual steady onward march of mobile technology generations is impressive: HSPA+, LTE, SAE etc - but it's not quite up to scaling at growth rates more generally expected of fixed-line ISPs.
The only answer I can see to this is offload. Take the traffic off the macro network, and off the existing backhaul and core as far and as fast as possible.
There are various solutions to this:
- Femtocells - these are the most visible heroes of the offload strategy, but I'm not convinced they'll ride in for the rescue quite quickly enough. There's also not enough emphasis on local breakout onto the Internet - the mobile industry still wants to funnel everything through the femto gateway & GGSN to retain control.
- WiFi and dual-mode devices are due a resurgence - both in homes/offices and in public locations. There's a lot out there already that can be exploited: hence AT&T's acquisition of Wayport
- Flattened IP cores, bypassing the SGSN. Ericsson and Nokia-Siemens Networks have already been deploying these for certain carriers.
- Optimised backhaul - there are various strategies here, including shunting all the Internet-destined traffic onto higher-bandwidth / lower-QoS / lower-cost connections, keeping voice and other priority traffic separate.
- Smarter and ultimately software-defined radios that can choose less-congested frequencies or technologies, or operate in shared spectrum like white spaces.
- Content delivery networks (CDNs) can also spare the operator core network the pain of dealing with some of the real high-volume traffic - although these don't yet deliver rich media like video direct to the base station. As we move towards IP-based RANs, that should also improve.
Of course, all these are very network-centric approaches. My expectation is that device, OS and application vendors will also take matters into their own hands, and develop their own offload approaches. There will be a rise of smarter connection managers and APIs, that will allow the apps to pick the appropriate bearer and adjust their traffic profile to suit it. They'll monitor congestion, latency and packet loss. They'll actively look for their own offload channels, especially via WiFi.
The bottom line - 2009 will be about "offload" from a network viewpoint, and "connection optimisation" from an app/handset viewpoint. Much of the time the strategies will be aligned, but there will also be some conflicts.
I also refer to the "capacity crunch" issue in the new December 2008 Disruptive Analysis research report on Mobile Broadband Computing. For details see here.
Tuesday, December 02, 2008
I've been calling this an "inside-out network" approach.
On the face of it, I can see a lot of positives - it potentially reduces the overall capex requirement needed for network rollout, and solves a lot of issues about indoor coverage for markets where LTE is most likely to be deployed in the 2.6GHz band.
But I'm a bit wary about some of the assumptions being made. Particularly comments like "In 2013, 60% of mobile data usage will be indoors".
Maybe. Maybe not. It's a brave person who'll pre-judge what applications will be used, on what devices, in what contexts, five years out. A year or so ago, nobody was expecting the use of Google Maps on handsets to be one of the prime drivers of macrocellular 3G traffic. Although I'm perennially skeptical, maybe someone will have finally worked out a way to get people to consume mobile TV. Nokia's been talking up the idea of a sort of augmented-reality overlay, superimposing extra information on a view of the "real world". Maybe we'll all have Bluetooth head-up displays, showing streamed video adverts on what we'd normally see as blank walls.
Are operators going to be happy about possibly being held hostage to future application innovation? Someone comes up with a great revenue-earning new service - but it's used outdoors, so it can't be deployed on an inside-out network.
And in any case.... assuming that a lot of that mobile data use is indeed indoors, what % will be on devices that also have WiFi in them? PCs, iPhone, most high-end smartphones (Blackberry Storm excluded, obviously...). Over 90%, perhaps 99% of that indoor data could be offloaded to WiFi.
And I certainly don't believe all the femto hype about substituting for WiFi, especially in markets with 4-5 competing LTE operators and no national roaming, where you'd need an array of separate operator-specific femtos. Yes, there might be the odd single-person household or combined family-plan home that this could work for, but they'll be the exceptions, not the rule. And you'd definitely need to support all operators in public hotspots.
Then there's backhaul. Putting an LTE femto on a 2Mbit/s ADSL line isn't going to be tenable, especially if it's from a third-party ISP which decides to throttle IPsec traffic at busy periods, as one example in Europe apparently does. You'd need high-end cable or VDSL or fibre to do justice to LTE. And outside Japan, I don't see much ubiquitous nationwide FTTH any time soon, given the economic outlook.
Last of all, there's the voice issue to consider. The current crop of early femto deployments from Sprint and Starhub tend to have heavily voice-centric homezone-type components to their business model. And yet, the question of LTE handset availability pivots on deciding what the LTE voice service looks like. Clearly, it needs to be some sort of VoIP - but what, exactly?
Overall, despite the attractions of the inside-out model, I'm not yet convinced it's truly the answer to LTE. I'm reminded of the words of a Nokia radio networks guy I talked to at 3GSM about 3 years ago, asking about femtos and picos. He said (in a gruff Finnish accent):
"Outside-in, always wins"
Some operators in the UK are now heavily discounting 3G USB dongles and monthly contracts - or after rebates, actually giving them away to existing voice/phone customers. In other words, they're using them as customer retention tools to reduce churn. Buy one contract, get one free.
He reports on Vodafone giving him a free HSUPA dongle, and discounted £7.50 / month connectivity, offset with a £75 bill credit. I also regularly see adverts in the window of my local 3 UK shop offering half-price dongles to phone customers, and I guess some hard negotiation could yield some sort of extra rebate as well.
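The effective price of that Vodafone-style deal is worth working through. A quick sketch, assuming an 18-month term (the term length is my assumption for illustration; the other figures are from the deal described above):

```python
# Worked example of the discounted-dongle deal described above.
# The 18-month contract term is an illustrative assumption.
monthly = 7.50        # GBP per month connectivity
months = 18
bill_credit = 75.0    # upfront bill credit / rebate
dongle_price = 0.0    # "free" HSUPA dongle

total = monthly * months - bill_credit + dongle_price
effective_monthly = total / months
print(f"effective cost: GBP {effective_monthly:.2f}/month")  # ~3.33
```

Roughly £3.33 a month for mobile broadband, to a customer the operator already has - which shows just how far this is really a retention play rather than a revenue play.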
My view is that not only are these types of deals commoditising mobile broadband pricing still further, they also make some of the other business models look really inflexible and old. I think that the "traditional" 12/18/24 month standalone monthly contract for mobile broadband will become a minority option - whether it's for a dongle or an embedded-3G PC.
At the moment, I estimate that these traditional post-paid monthly billing models account for about 80% of all mobile broadband subscriptions. But by the end of 2010, that will have fallen to 50%. And further out beyond that, I expect to see various new options, like 3rd-party sponsored "free mobile broadband", to reduce the monthly-bill segment to below 20% of users by 2014.
Anyone who is working on basic "per month" revenue models for their mobile broadband services needs to rethink them. The term "subscriber" will swiftly become meaningless, as most 3G users won't have classical subscriptions for their PC connectivity.
My new Disruptive Analysis report, Mobile Broadband Computing, has full forecasts for these different business models, broken out by 3G vs. WiMAX, and for dongles, embedded notebooks and MIDs. It is published this week. Please email information AT disruptive-analysis DOT com for more details.
Monday, December 01, 2008
On the one hand, it introduces consumer-centric legislation on competition which is broadly positive. Termination and roaming fees are in many cases egregiously high, and ordinary competition largely fails to bring them down to realistic levels. This is because despite retail-level choice for consumers, there is an effective monopoly by your service provider on interconnecting calls/messages to a given number.
And it's not like the industry isn't given enough warnings, so the recent moves to cap intra-European SMS and data roaming at €0.11 per message and €1.00 per MB can't come soon enough. In fact, there's probably a good argument for dropping another zero or two from the data roaming threshold, but it's a good start at least.
(Actually, the whole notion of "roaming" access for data is ludicrous - why is the 95%+ of Internet-destined 3G data traffic backhauled all the way via your home network anyway, rather than broken out onto the Internet locally in the visited country for pennies?).
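Even at the capped levels mentioned above, a fairly modest roaming trip adds up quickly. A small worked example - the usage figures and the assumed domestic per-MB rate are illustrative, the caps are those quoted in the post:

```python
# Caps from the post: EUR 0.11 per roaming SMS, EUR 1.00 per roaming MB.
sms_cap = 0.11      # EUR per intra-European roaming SMS
mb_cap = 1.00       # EUR per intra-European roaming MB

sms_sent = 20       # illustrative usage for a short trip
mb_used = 50        # a few days of email and light browsing

roaming_bill = sms_sent * sms_cap + mb_used * mb_cap
print(f"capped roaming bill: EUR {roaming_bill:.2f}")  # 52.20

# Compare with an assumed domestic bundle rate of pennies per MB:
home_per_mb = 0.01  # illustrative EUR/MB equivalent at home
print(f"roaming data premium: {mb_cap / home_per_mb:.0f}x")  # 100x
```

A two-orders-of-magnitude premium for data that could mostly be broken out locally is the argument for dropping another zero or two from the cap.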
So that's the good side of the European Commission. Competition, free trade, international tariffs and so on. It has also gone a long way to ensure that Internet connectivity is a basic right of citizens.
The downside comes when it tries to intervene in actual technology decisions or attempts to harmonise laws and regulatory regimes in a heavy-handed fashion. The recent "Telecoms Package" included many onerous, and in some cases frankly authoritarian, demands. Luckily the EU Council of Telecom Ministers has thrown out some of the more ridiculous aspects, including the suggestion of an EU-wide super-regulator and centralised spectrum policies.
While the original GSM Directive which mandated both technology and frequency choice has indeed, in hindsight, been a major success, the EC needs to recognise that the world has moved on. In particular, it misses the fundamental move from a vertically-integrated and voice-centric telecoms industry, to one which is layered, data/Internet-driven and intimately entwined with IT and entertainment industries, and increasingly various others as well.
Now, attempts to impose external legal requirements on particular layers of technology have a huge potential for introducing extra cost, delay or outright market failure. We have already seen pointless and wasteful intervention in the market for Mobile TV, where the insistence on DVB-H was completely in contradiction to the spectrum policy moves towards "technology neutrality" for wireless access.
The latest efforts by the Commission to meddle in the market have been around supposed harmonisation of spectrum policy. In theory, that's a laudable aim which could help scale economies for suppliers, but in reality each European market is very different in terms of market structure, technology preferences, customer psychology and national government stance on key issues. The notion of an unaccountable Brussels-based authority that could veto specific national ideas is completely anathema to most observers.
Some of the thinking around "net neutrality" seems pretty woolly as well, especially given the likely emergence of innovative business models in some of the most competitive markets. There's nothing wrong with non-neutral models if people are easily able to switch providers. That said, legislation on openness and transparency about non-neutrality would be welcome, which is a very important distinction.
In my view the whole argument about the necessity of EU legislation to protect the relevance of the overall "European telecoms industry" in a global context misses the point. Why should accidental geographical contiguity of 27 countries determine information technology policy anyway? Why shouldn't Romania be able to adopt the spectrum policies of China if it chooses? Why should a country with high population density and lots of fibre have the same mobile vs. broadcast frequency allocations as one with sparse population? Why should a future libertarian government in the UK be forced to apply the same data-retention laws as those in France or elsewhere?
In my view, the European Commission and Viviane Reding should generally stick to issues impacting consumer protection and competition. And it needs to be especially wary of consultants that make huge sums through trying to steer EU-wide policy towards specific technologies or applications.
And I'm sorry Ajit, but I reckon that Reding's latest hobby-horse is another guarantee of failure: "We must make sure that Web 3.0 is made and used in Europe". Frankly, that's the scariest and most megalomaniac statement I've heard from a bureaucrat in a long time (well, about technology at least), and an almost certain guarantee that nothing of the sort will occur. The only "main step that Europe has to take to respond to the next wave of the Information Revolution" is to get out of the way, and leave innovation to the innovators.