
Sunday, March 30, 2008

Will too many frequency band choices delay LTE?

A few years ago, the first 3G UMTS networks to be launched involved some relatively simple choices.

In Japan in 2001, DoCoMo launched its FOMA service initially with just a single frequency band supported. This was 3GPP Band I, using 1920-1980MHz paired with 2110-2170MHz, commonly just called the 2100MHz or 2.1GHz band. Initial devices were launched without 2G dual-mode support.

In Europe, operators used the same 2.1GHz band, but were reluctant to launch devices which didn't also support GSM for fall-back. This typically meant adding 900MHz and 1800MHz 2G support into the handsets, which was one of the contributory reasons for the delayed wide launch of 3G handsets, as well as their expense, clunkiness and poor battery life.

Nevertheless, at least the handset designers and operators knew what they were dealing with when it came to frequency support - the 2.1GHz band was a known quantity: challenging, but at least the goal-posts were fixed.
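The Band I numbers above can be sanity-checked with a few lines of arithmetic. This is just an illustrative Python sketch; the frequencies are the standard 3GPP Band I allocation mentioned in the text.

```python
# Illustrative sketch of the 3GPP Band I (2.1GHz) pairing discussed above.
# Frequencies in MHz; these are the standard Band I allocation.

BAND_I = {
    "uplink": (1920, 1980),    # mobile transmits here
    "downlink": (2110, 2170),  # base station transmits here
}

def bandwidth(band):
    lo, hi = band
    return hi - lo

def duplex_spacing(uplink, downlink):
    # Offset between the paired uplink and downlink blocks
    return downlink[0] - uplink[0]

print(bandwidth(BAND_I["uplink"]))   # 60MHz of paired spectrum each way
print(duplex_spacing(BAND_I["uplink"], BAND_I["downlink"]))  # 190MHz spacing
```

The fixed 190MHz duplex spacing is part of what made the band a "known quantity" for handset RF designers.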

Now ratchet forward to the future introduction of LTE.

What's going to be the main launch frequency for LTE in Europe and Asia?

Is it the forthcoming 2.6GHz band, for which auctions are already starting? This has a fair amount of capacity, but may suffer from poor indoor penetration or the need for much denser networks of cell sites. There are also some issues around possible interference from WiMAX or other TDD technologies in the middle of the band, especially if certain regulators like Ofcom look to adopt national-specific bandplans with a unique mix of TDD and FDD.

Is it refarmed 900MHz GSM spectrum? It's a prime frequency, but not every operator has some of this to refarm. Will regulators insist on a reassignment as part of the refarm process? And will this be enough for LTE, especially as the broadband business case really needs 10MHz or 20MHz channels? There's certainly not enough for 3 or 4 operators per country to have 10, 20 or more MHz.
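A quick back-of-envelope illustrates the squeeze. The 35MHz total below is the standard E-GSM 900 paired allocation (880-915MHz uplink / 925-960MHz downlink); the operator counts are illustrative.

```python
# Back-of-envelope on the 900MHz refarming squeeze discussed above.
# E-GSM 900 offers 35MHz of paired spectrum in total.

TOTAL_PAIRED_MHZ = 35

def share_per_operator(operators):
    """Even split of the whole band - before any GSM traffic is carved out."""
    return TOTAL_PAIRED_MHZ / operators

for n in (3, 4):
    print(f"{n} operators -> {share_per_operator(n):.1f}MHz each")
# Neither share accommodates a 20MHz LTE channel, and a 10MHz one only
# barely - before reserving any spectrum for the legacy GSM service.
```

Even in the best case (three operators, full clearance of GSM), nobody gets close to the 20MHz channels the broadband business case really wants.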

Could it be 1800MHz refarmed spectrum? Again a possibility, but not for everyone.

Maybe a straight replacement of existing 2.1GHz 3G?

What about the UHF Digital Dividend spectrum down below 1GHz? That's also patchy, and not available until analogue TV is switched off in different places at different times.

Then there's 700MHz spectrum in the US, and various other local variants in Japan. And what about China?

The bottom line is that nobody really knows. 2.6GHz is looking like a probable candidate, but certainly not the only one. And if handset manufacturers and chipset vendors have to start out with dual- or tri-band LTE, and also support 2G and 3G radios for fall-back, I think that the timelines and costs will suffer.

Thursday, March 27, 2008

Carnival of the Mobilists

Tip o'the hat to the Situational Marketing blog for referencing one of my recent posts on handset OS fragmentation in this week's Carnival.

And I also realised I missed the link at the previous one at Chetan Sharma's great blog as well.

Catching up - China and Heathrow Terminal 5

I haven't posted this week because I've been behind the Great Firewall of China, at ZTE's analyst conference (more on that company in a later post). While there seems to be a bit more flexibility in net non-neutrality from that part of the world these days (Skype & Yahoo Messenger worked, and BBC News' website was fine, so no problems reading about events in Tibet), I couldn't access Blogger, or other blogs hosted on TypePad or Wordpress. Definitely another "ah hah" moment when I realised what was happening.

By coincidence, the trip meant that I was able to score a seat on this morning's inaugural flight BA26, from Hong Kong into Heathrow's gleaming new Terminal 5. As well as being another temple to architecture deity Richard Rogers, it also features a comprehensive WLAN infrastructure from Aruba. Apparently, as well as being used for baggage handling (which from my perspective seemed to work a lot more efficiently than other bits of LHR), airport operator BAA "acts as a service provider to .... multiple hotspot Internet service providers."

Unfortunately though, I had just flown in from Hong Kong's Chek Lap Kok airport, which is a similarly-impressive edifice (courtesy of Norman Foster) but conspicuously offers free 24-hr WiFi throughout, courtesy of PCCW. (Free WiFi at the hotel in Shenzhen, too).

That Heathrow T5 attempts to perpetuate the overpriced European WiFi paid-hotspot market, rather than looking at how to provide passengers with free access as a utility, doesn't surprise me in the least. It's run by BAA, the rapacious mall operator best-known for forcing its customers through endless rows of shops, viewing the flights themselves as inconveniences that limit the 'retail experience'.

It's also been advised in wireless strategy by consulting and engineering firm Red-M. The system enables it to set & monitor policies about exactly what wireless coverage is available where. I imagine they've been able to extract similar rents from the hotspot & cellular guys as they do from ordinary retail stores.

And I noticed that I couldn't get a signal for Hutchison 3G, or 3G on my O2 phone, down in the Heathrow Express train station. Wouldn't want 3G modems to compete with the presumably lucrative contract offered to T-Mobile to run WiFi there, would they? Will be interesting to see how good 3G coverage is on each operator in the rest of the building - I'd guess that some operators were prepared to stump up the requisite cash and others weren't. Expensive-looking Vodafone 3G modem adverts adorn the arrival hall, so I guess that Big Red's network will be fine.

Put simply, this is another example of the danger of giving network policy tools to organisations whose policies are questionable. It wouldn't surprise me if BAA took a leaf out of China's book and also invested in some firewalling policy-management gear as well, and started to limit what websites or IP services are accessible via the WiFi.

So let me know if you manage to read this blog from T5's departure lounge, or whether I've been network-blacklisted for daring to make a negative comment about the BAA regime....

(Sidenote - British Airways gave passengers on my flight into T5 a small gift to commemorate the opening. Bizarrely, it's a wireless computer mouse, of the type not usually permitted for use onboard planes.)

Friday, March 21, 2008

Attn AR/PR folk: No, I'm not going to be at CTIA

...so please refrain from sending emails asking if I can meet with your clients there.

Cheers

Dean

US 700MHz - Will Qualcomm do something disruptive?

So the US 700MHz auctions are over, except for the unloved D-Block.

What's changed? Well, there's no Google, that's for certain (no big surprise there, to be honest - I really couldn't see it wanting to hire thousands of radio engineers). And AT&T, Verizon, Qualcomm & Frontier Wireless hoovered up most of the licences on offer. Although it should be interesting to see if the requirement for openness makes a meaningful difference.

According to the FCC, 99 bidders other than the current US incumbents won more than 700 individual licences between them, meaning that at least in theory, there are potential new wireless broadband competitors in each regional US market.

I'll be honest & say I don't know much about Frontier, which is essentially now a nationwide spectrum owner.

But the one that caught my eye was Qualcomm. It cherry-picked some of the key US markets - Boston, NY, LA, SF & Silicon Valley, Philadelphia - with some hefty bids for B- (12MHz of paired spectrum - FDD) and E-Block (6 MHz unpaired - TDD) spectrum.

Now, it already owns some spectrum that underpins its Mediaflo TV platform. I'm wondering if, just maybe, it might extend that model towards its own unloved UMB evolution of its main cellular technology, which seems to have been sidelined by LTE and WiMAX in recent months. UMB can work in both FDD and TDD spectrum.

It could potentially develop a wholesale network, which could then host services from (say) ISPs for fixed-broadband, local government & public safety (who often have regional requirements not needing full national coverage), and various other channels. Yes, it would potentially compete with Sprint's WiMAX Xohm network in some cases - but this could help salvage UMB, which could have the side-effect of persuading some remaining CDMA waverers (notably KDDI in Japan) to take the plunge as well.

I'll readily admit that this is all speculation - it could just be it wants more Mediaflo spectrum, or perhaps there's another hidden plan I've missed. But I have a sneaking suspicion that UMB will raise its head again somewhere.

Thursday, March 20, 2008

Rant: Enough with the NXTcomm spam

I don't know if NXTcomm is a good trade show or not. I've never been. And I'll never go.

Two years ago I stupidly signed up for a press/analyst pass, as I thought I might be able to attend. That simple act, combined with exceptionally poor database management on the organisers' part, led to a seemingly never-ending stream of emails about the show. I think they take the PR/analyst list & redistribute it to about 12 departments within the same company, none of whom bother to check to see if I'm a relevant target for their dross.

Last year I received over 40 emails from them. Despite clicking on the "unsubscribe" link at least 8 times.

Eventually, I got in touch with some of the PR folk & told them that if it didn't halt, I'd write a pretty excoriating post about their capabilities in terms of data protection, and their lack of competence in managing something as simple as an "unsubscribe" request. I asked that they delete every reference to me from their databases.

Now, I'm already receiving 2008 messages from the organisation. And again they've ignored unsubscribe requests.

I note that the US "CAN SPAM" act contains the legal instruction "It requires that your email give recipients an opt-out method. You must provide a return email address or another Internet-based response mechanism that allows a recipient to ask you not to send future email messages to that email address, and you must honor the requests. "

Now, at the moment it's just a nuisance. Although it's obviously very unprofessional of them, I'm not the sort of person to pursue legal action over something this trivial. I'd rather just give them a bit of a kicking through this blog. But if you're an exhibitor or sponsor of the event, next time your sales representative calls, tell them to sort it out.

And if you're a journalist, analyst or other potential attendee of the event - can I suggest you set up a "disposable" email address with Yahoo or similar, rather than trusting the system with your primary address.

EU stance on DVB-H: Inconsistency and irrelevance

I see that the European Telecoms Commissioner Viviane Reding has issued a pronouncement about mobile TV, encouraging companies & countries to standardise on DVB-H, rather than alternatives like T-DMB, 3GPP MBMS or Qualcomm MediaFlo or Nextwave's TDtv.

Yet the statement "For Mobile TV to take off in Europe, there must first be certainty about the technology" seems completely at odds with the emerging European view on technology neutrality in other mobile services, especially wireless broadband.

Ongoing trends such as WAPECS (EU-speak for tech-neutrality) in the 2.6GHz and 3.5GHz bands are looking like a certainty, while other moves are afoot to allow refarming of GSM spectrum for 3G or LTE (& maybe other) radios. Various arms of regulatory bodies are bending over backwards to accommodate WiMAX in particular as a competitor to the UMTS & LTE standards.

Plus, the European Commission seems to be stuck in 2003 when it comes to its bullish predictions on Mobile TV. Yet the general consensus (and my own long-term view) is that hardly anybody wants mobile broadcast - it's a niche application, not a mass-market one. It's much easier than any other new mobile service for customers to understand ("it's telly, on your phone")... yet it's almost universally viewed as an irrelevance.

The typical digital-TV subscription model also doesn't work too well with prepaid handset users, who account for the bulk of most European countries' user bases. This makes it much more difficult to create a business case as it will be based around pay-per-view or perhaps one-off payments for day/week/month access. Few operators will want to subsidise DVB-H functionality in phones if there's a chance that the customer may only ever spend €1 (or zero) on using it.

In order to succeed, there will also need to be some utility for TV-enabled phones working in "offline" (ie free) mode, to get people to become familiar with the general concept, then hope that revenues follow over time.

This gives a prime example of why governmental or supra-governmental organisations shouldn't try to pick technology winners. I guess we should be thankful that Reding chooses to standardise things that nobody really cares about, rather than the important ones.

Tuesday, March 18, 2008

Handset OS fragmentation is here to stay - and may even impact future network architecture

A couple of years ago there were lots of hand-wringing comments about how the proliferation of handset OS's was harming the chances of any consistency in mobile applications development and uptake of data services. Operators like Vodafone made pronouncements about how they intended to rationalise their software platform base.

At the time, I was skeptical that there would be any easy way to meaningfully reduce the number of smartphone OS's as well as proprietary featurephone platforms. And in fact, if anything the number has risen, with the continued growth of RIM, the emergence of Apple's OS, Google/Android on the horizon and a continued plethora of other Linux platforms vying for attention. Meanwhile, none of the major handset vendors has committed to phasing out their proprietary platforms like Nokia S40 or the various Moto / Samsung / LG / S-E equivalents.

On the merchant-featurephone software stack there have been a few changes to be fair, with Qualcomm's BREW/UIone appearing on some more devices, but with TTPCom's Ajar being absorbed into Motorola's software maelstrom. Others like OpenWave and Obigo and Intrinsyc are still around too.

There is no obvious emerging contender to sweep away all others, especially at a global level. I can't see Nokia pushing S60 much further down into S40's domain. I can't see Apple or Google making a huge mark outside North America. I can't see RIM or Microsoft abandoning the enterprise sector. UIQ and a couple of the Linux players are not the strongest, but if they disappear I'm sure there will be plenty of newcomers to take their place - isn't it about time we saw a reincarnated Java/SavaJe-type player, for instance?

Meanwhile, all the real action seems to be migrating to the presentation layer - Webkit browser, Adobe Flash, Microsoft Silverlight and so forth. There seems to be at least a vague semblance of homogeneity emerging there, at least on high-end devices. Sure, even at that level there won't be "one platform to rule them all", but some variation of Web+Widgets+XML+various extensions is perhaps not as tricky as the current app-porting challenges.

There are some interesting side-effects of all this - even propagating through to the network side of the industry, despite the fact that it likes to think itself divorced from developments in mere handset software.

In particular, the industry seems to be moving even further away from fully IMS-capable handsets. Sure, in some ways the IMS client framework is starting to get a little more realistic, but with the shift in value (and certainly the shift in innovation) towards the browser it's starting to look a little irrelevant. Yes, we might get some IMS-based apps on the phone - IM, presence, even full VoIP - but that's not where the cool stuff is, and I think that the web aspects of the handset are moving too fast to be captured in even a next-gen IMS/Web architecture.

In particular, the chance for VoIP to become an "anchor tenant" for IMS has diminished, with the standardisation of the fairly-pointless MMTel Multimedia Telephony. As I've mentioned before, 3GPP has totally missed the point with MMTel: next-gen mobile telephony will be about integration, mashups & web services.... not video-calling/sharing and other "media".

Given that some sort of VoIP is mandatory with LTE (unless operators want to run GSM/UMTS networks in parallel forever), somebody really needs to get working on a more useful mobile VoIP standard NOW, and work out how to implement it in phones. In fact, whoever does the work should start from the standpoint of the user, with a phone in his/her hand and then work backwards to determine what the network has to look like to support a proper version of Mobile Telephony v2.0.

One of the themes I've discussed with a few companies recently has been the impact this could have on cellular network architecture. In particular, I've been looking at the current converging mess at the edge of the network that is some combination of SBC, security gateway, softswitch, GGSN and assorted other functions. I've been speaking to companies like Stoke, Acme Packet, Sonus, Nextpoint, Mavenir etc recently, and I'm starting to wonder if the required aggregation models will be driven indirectly by device type & usage cases, as much as they are by infrastructure-based decisions for multi-access.

If the traffic delivered by handsets - and the capabilities valued by end-users - moves away from session-based services like voice & IM, and towards more web/browser/widget functionality - what effect does that have on the boxes at the edge? I haven't got a full answer to this yet, but it's definitely something I'm looking into closely.

Thursday, March 13, 2008

eComm day 2 - Android and others

Some quick notes on today's presentations at eComm in Mountain View.

Rich Miner from Google/Android presented this morning. I've refused to be drawn in by the Android hype over the recent months, and today's pitch has done nothing to convince me either. I'm not quite at the stage of saying "The Emperor has no clothes" - idealists with money are inherently unpredictable - but I'm definitely not convinced that it's going to be very important, especially outside the US.

Miner's presentation got off to a very shaky start with the tired old canard about "1 billion Internet users, 3 billion phones" and how this somehow proves that many people would be "One screen" users only using their handsets. I realise that Google's expertise about fruit & vegetables is limited, but I would have thought they'd have worked out the difference between apples and oranges by now.

There are also many open questions about Android in terms of radio support (especially 3G), SIP APIs, and just how "open" it really is. It obviously still has the "Google can do no wrong" halo among developers in Silicon Valley, but there still appears to be a US-centric view that end-users actually want to install applications on their handsets.

At least Miner had a better answer to my question "why does this matter to prepay users who don't have data plans" than Symbian has in the past. (Answer: it'll scale down to low-end featurephones & reduce bill-of-materials, plus people can sideload apps rather than download them).

Overall - I'm not writing off Android. It is Google after all. But I'm deeply skeptical that it'll become very important, very quickly. It's certainly not going to be ubiquitous.

Apart from Android:

  • Ribbit has a very clever-looking API platform for creating telephony mashups, based on a "big iron" Lucent class-5 softswitch. Discussions about blending voice+web have cropped up increasingly of late, and although this isn't my core area of coverage, it looks impressive at first sight.
  • Intel had an anthropologist talking about context, GPS sensors, location-based services and related topics. Very well-observed commentary distinguishing between "machine-readable" context [GPS, motion sensors, light/dark etc] and more personal forms of context [commute vs journey, home vs house, relevance to conversation etc]. She comprehensively demolished some of the sillier location-related business models like sending coupons/ads to passers-by at a particular location.
  • Yahoo's Fire Eagle cross-platform location API and service platform sounds intelligently thought-through. It basically captures a wide variety of location inputs (GPS, cell, addresses etc) and maps them to a personalised profile for a person, enabling each "querying" application to have separate, user-defined permissions settings for what data & granularity is divulged. It also gives the users reports to work out which applications have been regularly asking for location lookups, so you can track which software has been snooping on you.
  • Embarq gave a presentation about privacy vs security, which was interesting, but not as memorable as a phrase in the upfront disclaimer "some concepts are still in the ideation phase". I'm stealing 'ideation' as it's the most obtuse jargonese I've seen in ages.

Wednesday, March 12, 2008

Notes from eComm

I'm in Mountain View for the next couple of days, at eComm 2008, Lee Dryburgh's get-together for an impressive assortment of forward thinkers around personal communications, billed as the "trillion dollar rethink" - what telecoms will be when it grows up.

I'll add thoughts & edits to this post as the event evolves.

I'm not sure I agree with Lee's opening stance that "the telephone is dead, long term" - the idea being that it morphs into a "relationship lifecycle management device". The phone has been around for 100+ years and people quite like it. For people with 2+ devices (or people with 1 device but not much money), I'm unconvinced that cool stuff like social networking & presence on their primary phone is very valuable or desired. Especially compared to (say) a camera, MP3 player or cute design.

David Isenberg (of "stupid network" fame) talked about Net Neutrality and "The Political Layer" above the usual ISO network protocol stack. He dragged Martin Geddes on stage and asked what I thought was a weird question "How many ISPs can you get access to in Edinburgh?". Martin answered "about 30", and I shrugged. Then he asked the (predominantly US-resident) audience members "Who here can get access to more than 1 or 2 ISPs?". Pretty much the only people raising their hands were the Europeans.

Now I understand why Americans get so exercised about open access, net neutrality and so on. I hadn't realised quite how appallingly competition policy had failed in the US - I knew it was bad.... but I hadn't really grasped just how bad. The stuff I take for granted in the UK - lots of wholesale ISPs, and quite a few local-loop-unbundled ones, plus cable - just doesn't map onto the reality in the US. I've realised that I sometimes assume that ordinary, mundane competition will give deep packet inspection and application-blocking a good solid kicking. Which is fine, if ordinary, mundane competition actually works.

Edit, 2pm Wednesday

There's been a lot of discussion about assorted voice-recognition technologies and applications, like VoiceXML. Frankly, I don't think it's that interesting - I know lots of people seem to think that voice-based applications hold some sort of magical, futuristic appeal, but apart from widely-detested IVR systems and in-car handsfree voice-dialling, I'm just not convinced. Voice apps fail the "real world" test - I don't know anyone (outside the industry) who likes them, and I can't imagine anything that's likely to change that. Most of the suggestions I've seen are solutions looking for a problem, whether they're mobile-oriented or fixed.

Handset volumes - when's the point of inflection?

There's been quite a bit of discussion in financial circles over the past couple of days concerning a possible slowdown in the expected growth in handset shipments in 2008. Much of this seems to have been provoked by Texas Instruments' gloomy outlook, although this partly reflects Nokia's decision to source chipsets from more suppliers. TI has also suffered a bit compared to Qualcomm through having less exposure to 3.5G growth (lower volumes but higher margins), in areas such as HSDPA modems.

Nevertheless, despite the possible market-share shifts on the silicon side of things, it's also worth keeping a realistic view on the handset market itself. One issue is that the high end devices (eg 3G phones for mature markets) often have wholesale prices up to 10x that of low-end phones popular in emerging economies - and commensurately higher margins. Using overall headline figures like 1.1 bn handset shipments makes great statistics for the industry, but distorts the underlying picture of financial health.
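To make that distortion concrete, here's a hypothetical blended-ASP sketch. Every price and mix percentage below is invented purely for illustration, not sourced data.

```python
# Hypothetical illustration: headline unit volumes vs. where the money is.
# All shares and wholesale ASPs below are invented for the sketch.

mix = {
    "high-end 3G":    (0.15, 300),  # (share of units, wholesale ASP in $)
    "mid-range":      (0.35, 120),
    "ultra-low-cost": (0.50, 30),
}

blended_asp = sum(share * asp for share, asp in mix.values())
high_end_rev = mix["high-end 3G"][0] * mix["high-end 3G"][1]

print(f"Blended ASP: ${blended_asp:.0f}")
print(f"High-end revenue share: {high_end_rev / blended_asp:.0%} from 15% of units")
# On these assumptions, 15% of shipments generate over 40% of revenue -
# so a flat '1.1bn units' headline says little about industry health.
```

A shift of a few points of mix between the segments moves revenue far more than the headline unit count suggests.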

A related thing to consider is the segmentation of the market between:

- sales of phones to new mobile users
- sales of replacement phones to existing users
- sales of 2nd / 3rd devices to existing users

Clearly, in mature markets the majority of purchases fall into the second two categories. Some interesting things to ponder there, though:

- Firstly, replacement phones are often highly contingent on operator-inspired upgrades. And it's notable that in markets like the UK, there's been a huge push recently to move people from 12 to 18/24-month contracts, to reduce the necessary costs of customer retention (eg subsidy) and reduce churn. This is especially true of 3G phones, where operators are loath to upgrade people to their second or third 3G handset when they're not really using the network that much. Essentially, they'd just be subsidising people to move from a 2MP camera to a 5MP camera on the device - and currently they're using this as a lever to get people to sign up to longer terms.
- Secondly, additional devices (2nd/3rd phones) may well not be as expensive as a user's primary device. They might be cheap fashion-phones, or 3G dongles, or novelties like the 3 Skype Phone. This is good for volumes, but not average selling prices.

Overall, I can't see the handset market taking a nosedive. But I definitely think that some of the underlying dynamics are shifting in ways that perhaps the financial market hasn't priced-in.

Friday, March 07, 2008

The return of low power GSM

About 2 years ago, there was a sudden buzz of excitement around a new spectrum auction in the UK - referred to as "low power GSM", it represented a sliver of frequencies that were formerly a "guard band" between GSM @ 1800MHz and DECT. There wasn't enough spectrum to build out a national network, and it was expected that instead it would just be used for low-power indoor coverage, typically with picocells. And rather than selling the spectrum to a single operator, it was suggested that multiple indoor providers could share it - with operators expected to work with each other to avoid interference.

Ofcom sold 11 LPGSM licences for quite low prices, to an eclectic range of companies, from BT and C&W through to Swedish operator Spring and some very random ones like Philippine Long Distance Telephone.

Much excitement ensued, with speculation (by me among other people) that this was another interesting FMC route, competing with WiFi-based alternatives, using ordinary GSM phones. I'd been following the use cases for picocells since 2001, and this was quite a milestone. (Bear in mind that May 2006 was before the femtocell hype started to hit home).

But.....

Not much has happened since. The big problem is that having private indoor cellular networks is a great concept, but runs into practical and user-experience problems when people leave the building. There are four options for what happens next:

  • The phone stops working. The SIM card is for the indoor-only operator
  • The indoor service provider also has a macro service network outside, and the phone just hands over
  • The user does a manual "network reselect" on their handset, either using two SIM cards or with the indoor operator doing something clever with a small offshore roaming service provider
  • The indoor operator signs a "national roaming" deal (ie becomes an MVNO) with one of the main national MNOs for outdoor coverage.

The only UK LPGSM operator with its own macro GSM network is O2. But it seems to have been singularly uninterested in doing anything with its licence, and perhaps just bought it cheaply to check up on what everyone else was doing.

Two licensees already had MVNO deals in place that could have possibly enabled them to knit together overall solutions. But BT has been more focused on Fusion and dual-mode WiFi-cellular, and Carphone Warehouse probably doesn't have the enterprise reach needed to justify the price of picocells.

[Sidenote: picocells cost around $2k apiece. And most LPGSM opportunities are on corporate sites that would need integration with existing PBX and LAN infrastructure]

Until recently, Teleware's arm Private Mobile Networks was the only commercial LPGSM operator with a launched service and growing channel. I met with the company recently, and it has also developed a productised version of its solution tailored for the construction industry. Teleware's background is in developing communications servers & applications for the enterprise market, so it's got a lot of the important systems integration and channel-partner credentials. (And also, in the construction industry, building sites obviously don't have existing PBX/LANs).

My understanding is that MVNO terms from the major operators have been less-than-generous. Understandable, given that the LPGSM model represents a possible cannibalisation threat for their corporate business. Also, the rollout of corporate dual-mode WiFi FMC was delayed, so this more cellular-centric approach hasn't even been considered as a useful hedge. That situation has changed recently however, with real-world dual-mode deployments in some UK companies (eg BT Corporate Fusion) & education facilities.

But now, C&W has finally cut a roaming deal with Orange to become another live LPGSM operator. (see the press release and also articles like ZDnet's). It's intending to launch a corporate FMC solution by the end of 2008.

One of the interesting things is that C&W's recent business update presentation refers to its supplier for the solution as Ericsson. The significance is that Ericsson has long talked up its 2G femtocell (rather than picocell) products. Most of the femto vendors have focused on 3G rather than 2G, because of a different set of challenges in network integration, and the fact that 2G coverage is generally pretty good anyway. Presumably this will bring the per-site capex requirement down from a few thousand pounds towards a much lower figure.

I've also heard rumblings that a few of the other LPGSM providers are also stirring from their slumber. Having 2G femtos available as well as picos should definitely help. I suspect we'll see a few more announcements over the next few months.

Wednesday, March 05, 2008

Whatever happened to HSUPA?

It seems to me that in the rush to deploy and enhance HSDPA services, scaling download speeds from 3.6 to 7.2 to 14.4Mbit/s and onwards, most operators have put their plans for high-speed uploads, through HSUPA, on the backburner.

According to the GSA (Global Mobile Suppliers Association) and also GSMA, we're on about 28 network launches, with another 24 in the works. There's a good table here. And there are now about 30 devices (the GSA says 33, GSMA says 26).

Sounds promising on the face of it, but it seems to be a much slower ramp-up compared with HSDPA, and certainly slower than the migration to higher speeds within HSDPA. And it's also worth noting that the vast majority of devices are still laptop data cards/dongles and fixed modems. In terms of phones, there's just a handful on DoCoMo's network in Japan, and a couple of HTC and i-Mate high-end PDA-type devices. It was conspicuous that none of the major handset vendors at 3GSM last month were talking up 'full HSPA' in their devices.

I think the problem is that nobody really knows what the added-value of HSUPA is. Just as most consumers don't really scrutinise upstream bandwidth for home DSL/cable, there doesn't appear to be an immediate need for symmetry on mobile either.

Yes, I know about user-generated content - but to be honest, just how many MB of video can any average person upload per month? Otherwise, yes, I can see the benefit for low-latency gaming (HSUPA improves ping time as well as bandwidth). And yes, professional photographers would like to send their 10-megapixel images to the newsdesk or a media agency. But ironically, most of the other HSUPA applications are probably "naughty" ones that the operators don't really want to encourage - VoIP, P2P, TV place-shifting (SlingBox et al), and people illicitly using their wireless endpoint as a web server.
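Some rough numbers behind that question. The clip length, encode bitrate and monthly clip count below are all illustrative assumptions, not measured figures.

```python
# Rough numbers on user-generated video upload volumes.
# Clip length, bitrate and clip count are illustrative assumptions.

clip_minutes = 2
bitrate_kbps = 500       # a modest mobile video encode
clips_per_month = 20     # a fairly heavy UGC habit

mb_per_clip = bitrate_kbps * clip_minutes * 60 / 8 / 1000
monthly_mb = mb_per_clip * clips_per_month

print(f"{mb_per_clip:.1f}MB per clip, ~{monthly_mb:.0f}MB per month")
# Even this keen uploader pushes only ~150MB/month upstream - traffic a
# basic uplink handles comfortably, without HSUPA's headline rates.
```

The arithmetic suggests that, for ordinary users, uplink demand is a trickle; it's the "naughty" always-on applications that would actually soak up HSUPA capacity.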

And given that HSDPA is selling like hotcakes at the moment..... why move to HSUPA before it's really needed, given that the proposition is much less clear?

Monday, March 03, 2008

eComm interview about wireless, openness and spectrum

I had a conversation the other day with Jonny Bentwood, who has recently updated his Top 100 Analyst Blogs chart. He commented that as well as blogging, I could probably benefit from looking at some of the other Web 2.0 and social media approaches to distributing my blog content and building a greater community around my research themes. As well as things like Twitter, we also discussed podcasting and other approaches. I'm certainly going to try and look into some of these, and I'll mention them here as time goes on.

It's therefore also a pleasing coincidence that Lee Dryburgh has put up a transcript and audio recording of a conversation we had a couple of weeks ago. He interviewed me in advance of my presentation at eComm, about some of my thoughts on wireless VoIP, new ways of looking at spectrum regulation, and the general move towards 'openness'.

One of the areas that Lee touched on was the concept of 'open spectrum' - whether people will ever be able to 'hack' radio in the same way they do software. I'm not convinced, beyond the current existing unlicensed frequency bands. While I can certainly appreciate the concept of free-market, democratic, innovation-inspiring access to the airwaves, I think there's just too much risk of interference and other unwanted side-effects from allowing a free-for-all. Nevertheless, I'm definitely curious about whether there are methods to stimulate more clever RF inventions, with fewer bureaucratic interventions. If anyone else has any thoughts on how we might practically create a "radio playground" I'd be fascinated to hear them.