Thursday, April 01, 2010

The dangers of over-reliance on simplistic metrics

I wrote yesterday about the decreasing relevance of "$ per GB" as a yardstick for revenues or costs in mobile broadband.

Thinking about it, I reckon it's symptomatic of an industry that tends to live by snappy, marketing-friendly soundbites that obscure underlying complexities.

Simple messages are great for headlines, but can lead to wrong decisions if they become too entrenched.

A classic pair of errors in mobile has been the unthinking over-use of two basic metrics:

- Number of subscribers
- ARPU

Subscriber numbers have been simple to measure - largely because they map nicely to SIM card MSISDNs (a phone number to most of us). While they're easy to count, they don't really give a good view of either the actual or potential base of customers. Some people have multiple SIMs, some are shared, while an increasing % go into machines rather than phones. In fact, the whole terminology has skewed business models towards subscription-based types - while non-subscription models (eg transactional) have been downplayed or totally eschewed.

Almost no service business should rely totally on subscription models. Yes, subscriber count is a useful *segment* metric and its trends are useful indicators, but it shouldn't be treated as the pivotal number.

Worse still is ARPU. As actually measured, it's Average Revenue Per Subscription, not Per User. Ironically, if it genuinely tracked revenue per user, it would be considerably more useful.

The fact that I have a £40 per month phone, plus a £15 per month 3G dongle (from a different operator), makes me a £55 a month mobile user. Describing me as two £27.50 per month subscriptions on average doesn't really help anyone understand their business.

ARPU is also inherently biased towards operators that hand out large handset subsidies and then recoup them as "revenue" over the contract term. Yes, it's possible to do some maths with (ARPU minus acquisition/retention costs), but there's still often too much focus on the headline number.
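To see how the same small customer base looks quite different depending on whether you count subscriptions, users, or margin, here's a minimal sketch. All the tariffs, subsidy and cost figures below are made up purely for illustration, not real operator data:

```python
# Illustrative only: invented tariffs, subsidies and costs.

subscriptions = [
    {"user": "dean", "monthly_fee": 40.0, "handset_subsidy": 240.0, "contract_months": 24},
    {"user": "dean", "monthly_fee": 15.0, "handset_subsidy": 0.0,   "contract_months": 24},
    {"user": "m2m_meter", "monthly_fee": 5.0, "handset_subsidy": 0.0, "contract_months": 24},
]

# "ARPU" as usually reported: average revenue per *subscription*
arpu = sum(s["monthly_fee"] for s in subscriptions) / len(subscriptions)

# Average revenue per actual *user* (the people/devices behind the SIMs)
users = {s["user"] for s in subscriptions}
revenue_per_user = sum(s["monthly_fee"] for s in subscriptions) / len(users)

def monthly_margin(sub, network_cost_per_month=2.0):
    """Crude margin per subscription: amortise the subsidy over the contract."""
    subsidy_per_month = sub["handset_subsidy"] / sub["contract_months"]
    return sub["monthly_fee"] - subsidy_per_month - network_cost_per_month

avg_margin = sum(monthly_margin(s) for s in subscriptions) / len(subscriptions)

print(f"ARPU (per subscription):   £{arpu:.2f}")
print(f"Revenue per user:          £{revenue_per_user:.2f}")
print(f"Avg margin per subscription: £{avg_margin:.2f}")
```

On these made-up numbers, the low-tariff M2M line drags the headline ARPU down, yet it still contributes positive margin - which is exactly the distinction the headline figure hides.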

I remember that 3UK always used to trumpet its high ARPU - until it realised that this was simply because it didn't *have* any low-tier packages or prepay, so of course the average looked high. It has since realised how much money it was leaving on the table, targeted those segments aggressively, and is about to reach profitability (finally), largely thanks to *lower* ARPU.

A classic example of ARPU-blindness has been the reluctance to embrace M2M services. "What, an extra 10 million subscribers at $5 per month? What will that do to our figures? What will our investors think?" "But they're vending machines and remote utility meters. They hardly use the network. We'll make $4 on each in profit margin"...

There have been plenty of suggestions about using Average Margin per Subscriber/User and so forth - and certainly, most operators' internal management teams are rather more sophisticated about financial analysis these days.

But nevertheless, the ghost of ARPU lives on. While it might be largely discredited by those who really care and play with spreadsheets deep in the strategy department, it is still measured and watched by observers - and used as a tool by vendors in their marketing. It hasn't really gone away.

Its influence remains disproportionately pervasive.

Another example is handset shipment numbers, which lump together a $15 GSM phone on an Asian market stall with a $5000 Vertu in a Dubai shopping mall. There's still a regular refrain that Apple is irrelevant because it ships tens of millions of devices per year, compared to Nokia's half a billion. Only rarely is it mentioned that Nokia's average selling price is €63, while Apple's is perhaps eight times that figure - or that Apple and RIM account for a hugely disproportionate share of handset industry margins.

It's like claiming that Honda is more important than Toyota, because it sells 13m vehicles a year against 7m. Let's ignore the fact that 10m of them are motorbikes. Or that Giant is in the top 5 vehicle manufacturers. Never heard of them? I'm not surprised, as they make bicycles rather than cars - but hey, they all have wheels, yes?

To sum up - raw, headline numbers rarely tell the whole story. Over-focus on them can actually damage a business, and even where management "understands" this, it's still possible to be subconsciously swayed.

As I mentioned yesterday, the next oversimplified metric to hit the headlines is "$ per GB". My recommendation is to take it with a pinch of salt - chasing that figure either in terms of revenue or capex/opex costs is likely to be a mistake.

Remember, the best-value way of transporting data, if you just used $ per GB as a metric, would be to drive a truck full of flash memory from A to B. The latency is pretty lousy, though.
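For a sense of scale, here's a quick back-of-envelope sketch of that truck. Every figure (payload, trip time, trip cost) is a rough assumption for illustration only:

```python
# Back-of-envelope "truck full of flash" vs a mobile network.
# All numbers are rough assumptions, not real costs or measurements.

payload_gb = 10_000          # assume 10 TB of flash memory in the truck
trip_hours = 5               # assumed drive time from A to B
trip_cost = 500.0            # assumed driver + fuel + truck hire

# Effective throughput: GB -> megabits, spread over the trip duration
effective_mbps = payload_gb * 8_000 / (trip_hours * 3600)
cost_per_gb = trip_cost / payload_gb

print(f"Effective throughput: {effective_mbps:,.0f} Mbit/s")
print(f"Cost per GB:          ${cost_per_gb:.3f}")
print(f"Latency:              {trip_hours} hours (the first byte arrives with the last)")
```

On those assumptions the truck "beats" a radio network handily on both $/GB and raw throughput - which is precisely why $/GB on its own is a poor way to compare delivery mechanisms.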

(But if someone would like to pay me 1 cent per GB for a network that gets mobile data around the UK at a terabit per second, please get in touch. I reckon a guy on a $20 bike from Giant can easily carry a petabyte of memory cards....)

1 comment:

  1. The cost/GB of transferring data on flash disks is actually higher than over a 3G/3.5G network. You would have to buy all the flash disks, so your CAPEX would be high. Your OPEX may also be high due to the truck leasing costs etc.

    And your throughput will also be lower (depending on truck top speed).

    Joking aside, if we narrow down the network topology, radio access techs used and several other parameters, the cost/GB metric is a good tool to identify which technology is more cost effective. E.g. macro vs femto for capacity purposes.

    If the cost/GB is too generic, which other metric do you think we should use to track network costs? CAPEX/OPEX is too generic and varies wildly according to volume, vendor, technology etc.
