The other day, I was invited to the Cambridge Wireless conference on quantum computing and communications (link). It's a fascinating and brain-melting domain that has profound implications for many other areas of technology (and telecom). Even though I have a physics degree, I can't claim to be able to keep up with all the maths and concepts that are discussed - but I took away a few real-world implications of what seems to be occurring. Quantum technology is a pretty broad area that relates to the weird properties exhibited by individual atoms or photons (light). If you've heard of Schrödinger's Cat, then you'll know how strange some of the concepts can be - especially a "qubit" (quantum bit) that can simultaneously be a 1 and a 0, or "entanglement", where pairs of particles remain spookily connected at a distance.
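If the "simultaneously 1 and 0" idea feels abstract, here's a minimal classical sketch in Python (purely illustrative - it doesn't model real hardware, and the numbers are my own) of what superposition cashes out as: two amplitudes that only resolve into a definite bit when you measure.

```python
import random

# A purely classical, toy illustration of a qubit in superposition.
# A qubit's state is described by two complex amplitudes (alpha, beta)
# with |alpha|^2 + |beta|^2 = 1; measurement collapses it to 0 or 1.

def measure(alpha: complex, beta: complex) -> int:
    """Measure a single qubit: returns 0 with probability |alpha|^2, else 1."""
    p0 = abs(alpha) ** 2
    return 0 if random.random() < p0 else 1

# An equal superposition (the state produced by a Hadamard gate acting on |0>)
alpha = beta = 2 ** -0.5

counts = {0: 0, 1: 0}
for _ in range(10_000):
    counts[measure(alpha, beta)] += 1

print(counts)  # roughly 50/50 - each individual measurement yields a definite 0 or 1
```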
These properties can be used to create computers, communications systems, sensors, clocks and various other applications. In a way, quantum tech is a "foundational" idea similar to semiconductors (which are themselves based on quantum mechanical principles): there will be many, many applications.
Terminology alert: often people in this sector compare quantum computers versus "classical" alternatives.
Some quick highlights and comments:
It's early days. Although there are some existing quantum solutions, they are not "universal" computers, but tailored for particular use-cases. Cooler stuff is 5-10 years away depending on your level of optimism (and stealth)
There were a lot of telecom people in the room - although that's partly a function of Cambridge Wireless's community (link).
Many of the opportunities (& threats) from quantum are "several layers up". For example, we should be able to make more accurate clocks, which means better timestamping, which means more accurate transactions or positioning, which means better ways to create networks... It's pretty hard to extrapolate through all the layers to work out what the "real world" impacts might be, as there are variables & uncertainties & practicalities at each stage. Same thing for quantum improving AI systems.
There will be a lot of hybrid quantum/classical systems - including being integrated on the same chip.
Some crypto & PKI systems are going to be compromised by quantum-enabled decryption. It makes mincemeat of some algorithms, but others are much more "quantum-proof". There might be a "Y2Q" problem in digging out where the old, vulnerable ones are buried inside other systems and software. This might be a "big deal", but there was also debate among experts about whether some of the claimed risks are actually scaremongering, or limited in scope. I think there will be a big ramp-up in "quantum compliance consulting" though - if enough people can understand it.
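To make the "Y2Q dig-out" idea concrete, here's a hypothetical, much-simplified sketch of what a crypto inventory check might look like. The systems, field names and lists below are invented for illustration - a real audit would be far messier - but the underlying point stands: public-key schemes like RSA and elliptic-curve crypto are the ones Shor's algorithm breaks, while symmetric ciphers and hashes are merely weakened.

```python
# A hypothetical, much-simplified "Y2Q inventory" sketch: given a list of
# systems and the public-key algorithms they rely on, flag the ones that a
# large quantum computer (running Shor's algorithm) could break. The systems
# and field names here are invented for illustration.

QUANTUM_VULNERABLE = {"RSA", "DSA", "ECDSA", "ECDH", "DH"}   # broken by Shor's algorithm
GROVER_WEAKENED = {"AES-128", "SHA-256"}                      # weakened, not broken, by Grover's

inventory = [
    {"system": "billing-gateway",  "algorithms": ["RSA", "AES-128"]},
    {"system": "vpn-concentrator", "algorithms": ["ECDH", "AES-256"]},
    {"system": "legacy-oss",       "algorithms": ["DSA", "SHA-256"]},
]

for entry in inventory:
    vulnerable = sorted(set(entry["algorithms"]) & QUANTUM_VULNERABLE)
    weakened = sorted(set(entry["algorithms"]) & GROVER_WEAKENED)
    if vulnerable:
        print(f"{entry['system']}: replace {vulnerable} with post-quantum schemes")
    elif weakened:
        print(f"{entry['system']}: consider larger key/digest sizes for {weakened}")
    else:
        print(f"{entry['system']}: no obvious quantum exposure")
```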
Quantum tech also enables totally-secure* networks to be built, using quantum key distribution (QKD). There are a bunch of tests and prototypes running around the world. At the moment these are mostly fibre-based, although some are using free-space optics. (*I'm not a cryptanalyst. Or a quantum wizard. My understanding is that "secure" here means non-interceptible or perfect interception-detection, but as always with security there are other weak links in the chain when humans are involved).
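To show why interception is detectable (rather than merely encrypted against), here's a toy, purely classical simulation of the statistics behind BB84-style QKD with an intercept-and-resend eavesdropper. No real photons or quantum mechanics involved - just my own illustrative parameters.

```python
import random

# A toy classical simulation of BB84 quantum key distribution, just to
# illustrate why interception is detectable. We only model the statistics
# of intercept-and-resend eavesdropping, not real photons.

N = 2000
alice_bits  = [random.randint(0, 1) for _ in range(N)]
alice_bases = [random.choice("+x") for _ in range(N)]   # + rectilinear, x diagonal
bob_bases   = [random.choice("+x") for _ in range(N)]
EVE_PRESENT = True

bob_results = []
for bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases):
    if EVE_PRESENT:
        eve_basis = random.choice("+x")
        # Eve measures: right basis -> correct bit, wrong basis -> random outcome
        bit = bit if eve_basis == a_basis else random.randint(0, 1)
        a_basis = eve_basis  # she re-sends the photon in the basis she used
    # Bob measures the (possibly disturbed) photon
    bob_results.append(bit if b_basis == a_basis else random.randint(0, 1))

# Sifting: keep only positions where Alice's and Bob's bases matched
sifted = [(a, b) for a, b, ab, bb in zip(alice_bits, bob_results, alice_bases, bob_bases)
          if ab == bb]
errors = sum(a != b for a, b in sifted) / len(sifted)
print(f"error rate in sifted key: {errors:.1%}")  # ~25% with Eve, ~0% without
```

With the eavesdropper switched off, the sifted-key error rate drops to zero - which is essentially the check Alice and Bob perform by publicly comparing a sample of their key.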
We're not getting some sort of magical mass-market "quantum broadband" any time soon, whether fibre or (definitely) mobile. There might be quantum-related components in networks for timing or security, but the actual physics of shipping bits around through air and fibre isn't likely to change.
One caveat - if I understand correctly (and it's possible I don't), some quantum applications might make it more appropriate either to use dedicated individual fibres, or to use frequency multiplexing (separate colours, essentially) rather than networks with other forms of multiplexing. One of my to-dos is to get my head around what quantum-level transport really means for the way we build IP networks - and whether it's only ultra-secure point-to-point connections that are impacted, rather than general "routed" ones. At the moment it seems the main use is parallel QKD streams to secure the main "media" stream. I've found some stuff on early concepts of quantum routing (link) and quantum-aware SDN (link), but if anyone has a view on the commercial impact of this, I'm all ears.
A lot of the current work on quantum computing seems oriented towards creating better ways to do machine learning - essentially the ability to absorb many, many different things "in parallel" rather than sequentially. Beyond AI/ML, many important tasks involve optimisation or pattern-recognition - quantum solutions should help. This has applications across the board, from finance to healthcare to telecoms, although there weren't many suggested use-cases in BSS/OSS or network design at the event. I suspect there could be a variety of interesting options & will think more about this over coming months. (Let me know if you'd like to discuss it)
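For a flavour of what "optimisation" means here, below is a toy max-cut problem solved by classical brute force - the kind of combinatorial formulation that annealing-style quantum machines are pitched at. The graph is invented and there's no quantum code involved; the point is just that brute force scales as 2^n, which is exactly where quantum approaches hope to help.

```python
from itertools import product

# A toy illustration of the kind of combinatorial optimisation problem that
# annealing-style quantum machines are aimed at: max-cut on a small graph,
# solved here by classical brute force. The graph is invented.

edges = [(0, 1), (0, 2), (1, 2), (1, 3), (2, 4), (3, 4)]
n_nodes = 5

def cut_size(assignment):
    """Number of edges whose endpoints fall on different sides of the partition."""
    return sum(assignment[i] != assignment[j] for i, j in edges)

# Classical brute force scales as 2^n; the hope is that quantum (or
# quantum-inspired) solvers find good solutions for much larger n.
best = max(product([0, 1], repeat=n_nodes), key=cut_size)
print(best, cut_size(best))
```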
There's lots of complexity in getting quantum engineering to work for computing - components often need to be cryogenically cooled, there's all manner of software design and error-correction and control issues, maybe some engineering of microwave systems to link bits together and so on. This is Big Science. It's not going to be in the iPhone 9. (Although some of the sensing and clock stuff seems to be "smaller")
There's some cool stuff being done around quantum-based accelerometers, gravity sensors etc. One of the biggest drivers is the desire to create a GPS-type positioning system that doesn't rely on signals from satellites - which can be jammed, blocked or even destroyed. Currently GPS is turning into a bit of a "single point of failure" for the entire planet - especially for cellular networks, devices and financial transactions which need time-stamps.
Someone else has beaten me to the term QCaaS (link), so I'll have to settle for QDN - "Quantum Defined Networking". You heard it here first....
There are various implied links with IoT (sensors) and blockchain (crypto). I'll keep an eye on those for future work.
Overall, a fascinating topic - and one which the UK government, academia and industry are pumping a ton of cash into. It's perhaps not as sexy as some other futurist obsessions like AI, genetic engineering or blockchain - but it's potentially just as transformative, not least by helping accelerate the progress of all of the others. For the telecoms industry, there's relatively little to be worried about yet - although getting older network and IT systems' crypto checked over seems important, given the timelines for replacing legacy equipment. Given the rising desire to exploit PKI and identity in telecoms and IoT as a long-term business, the 10-year horizon for "sci-fi" possibilities is a bit uncomfortable, especially if new breakthroughs are made. And that's before second-guessing how much extra progress has been made by intelligence communities, and how fast Messrs Snowden and Assange get to hear about it.
We might see quantum tech appearing first in clocks used in networks, or in specific optimisation problems solved with early computers from the likes of D-Wave. In my mind there are a few options around NFV/SDN and network-planning that might be a fit, for instance. There's also some cool possible opportunity around super-secure communications and non-GPS navigation. But if you're a serious telco quantum doom-monger, the good news is that you don't need to worry about the prospect of Netflix quantum-entangling videos direct to people's TVs and smartphones just yet.
If you're interested in learning more about Disruptive Analysis' work on "TelcoFuturism" please get in touch at information AT disruptive-analysis dot com. My introduction to the concept is here (link) and I've also written about AI/machine learning (link) and Blockchain (link). I gave my first keynote presentation on TelcoFuturism a few months ago (link) and will be progressively ramping this up - get in touch if you need a speaker.
Imagine a cellular-connected interactive teddy-bear. It has a button for a child to speak to their parents, perhaps a video-camera inside an eye, and maybe some basic sensors for movement, temperature or GPS. Conceivably, it could connect via social-networks to other toys. Ideally it needs to be mobile, as it'll be used in the car, or at kindergarten, or on holiday - not just at home. The cellular radio is embedded deep inside the bear, along with a SIM card. The toy has to be soft - and there can't be any small removable parts like a SIM tray and conventional card, as they could be a choking-hazard. So it makes sense to use an embedded, soldered in SIM - which also gets around the problems of fluff and fibres blocking a normal SIM-tray. And because the bears are shipped around the world from a single factory, the concept of eSIM and remote-provisioning seems to make a lot of sense as well. Seems like an ideal eSIM use-case, doesn't it? A classic example of consumer mobile-enabled IoT?
(There are already various smart/connected WiFi or Bluetooth bears - see this link or this link for example. Others are being planned - link. EDIT: apparently the day I published this was "National Teddy Bear Day" (link). A complete coincidence, but very amusing!) But now think a little more closely about the user-journey, the design and sale process, the economics, and the new value-chain that needs to support the mBear's creation, distribution and use.
Where does the bear get purchased? Presumably, nobody's going to go to a phone store to buy a soft toy. They'll get it in a toy shop, or perhaps online. As it's a tactile, soft product, a lot of people will want to touch it first, compare it with other unconnected bears, decide if the extra cost of the electronics is worth a perhaps-lower grade of stitching and fur. How many toy-shop sales assistants are likely to be able to describe the benefits of a connected bear? How many will be able to advise on how to get it connected, what happens after the initial data-included period ends, or talk knowledgeably about service plans? The bear, of course, has no display. So any configuration and setup will need to be done from a PC or mobile app, or in-store (good luck with that, at 9am on Xmas morning). Is there some sort of "connectivity app-store", where you can choose which network you want the bear on? Can you add it to a parent's existing multi-device cellular plan, or a family plan? Can you set up an entirely new subscription if your normal operator doesn't support eSIM and remote provisioning? How do you register your ID for countries which require it? Your ID or the child's? Do you need to connect the bear to WiFi first, or use Bluetooth from your phone, in order to boot-strap the mobile provisioning? Can the eyeball-cam scan a QR code from your phone, perhaps?
How exactly does the provisioning work, and is it the same process used by the mToaster the parents also got given as an Xmas gift? Actually, who's responsible for the bear's connectivity if it was given to the child as a gift by a relative? (Who perhaps bought the bear overseas). What does the "user licence" say and who agrees to it? What security issues might arise? [See this link, for a non-cellular connected bear]
Are there different data/voice-plans for the bear? Does it offer postpay and prepay? Are they available on all carriers or just one? How is this displayed in the store? Does it support roaming, when the family goes on holiday? At what price, and how is this notified? If the "call" button is permanently on because the child is sitting on the toy, what happens next? What about returns? If the arm falls off the bear a week after Xmas, and the eSIM has already been activated, what happens when the customer returns it to the store? How is the number/SIM ported to a new bear, perhaps of a different design? What happens if the child's playroom is in the basement and there's no coverage? Can the network be switched? Can the customer get a refund? Who pays and how?

There are also questions about the design and manufacture process. Who decides to make the bear? A normal plush-toy company? A mobile device maker? (iBear?) Someone who sets up a Kickstarter campaign and then contracts a manufacturer? How do they select a module & design the rest of the system (eg battery)? What extra cost does this add? Are there enough operators supporting remote provisioning and eSIM? Is a standard-SIM version needed, or perhaps one that's WiFi-only and tethered to a nearby phone or in-car cellular radio? Does the toy's packaging need to be different as it's now a cellular device? How is it classified by shipping companies - as a toy, or a "phone"? What certifications are needed at what point in the process? What import/export duties apply?

You get the picture. It's all much harder than it first appears - partly because of the cost of the cellular radio irrespective of SIM type, but also because of the real-world user journey and the practicalities of eSIM. There are plenty of other issues I haven't mentioned here. For some eSIM use-cases such as connected cars, and perhaps tablets and mi-fi type products, there's an existing channel and business model. Remote provisioning can simplify this, take costs out, add the ability to switch networks and so on.
But for many other new categories of IoT, both consumer and B2B, there are huge complexities that will need to be worked through. There will likely be several years of clunkiness, false starts and unanticipated problems. These may not be insuperable - but they may well prove costly to solve.
The problems will also likely vary by device category and target audience - imagine re-writing this post about a fridge, a VR headset, a bicycle lock, a drone or an industrial oil-pump. eSIM in smartphones is much harder still. Just standardising the remote-provisioning part of eSIM does not solve the myriad of other issues that "connecting" IoT devices with a cellular radio entails. In many cases, it will be simpler just to stick with WiFi or Bluetooth, especially for toys mostly used when parents (and their phones) are around.
Such issues are why I'm forecasting a slow start to eSIM, and patchy adoption in new IoT categories. (And also why the lack of eSIM in the iPhone 7 was not a surprise). It will gradually be sorted out - by 2021 there could be 1 billion eSIM-enabled devices - but it certainly won't be a game-changing shift overnight. This post highlights some of the issues and concepts that are covered in the new Disruptive Analysis eSIM Market Status & Forecast report (link). Please get in touch at information at disruptive-analysis dot com if you are interested in the report, strategy workshops, or speaking engagements. I'll also be chairing a conference session on eSIM & eUICC at the Smart Security conference in Marseilles on Sep 28th (link).
I've been giving a lot of thought recently to 5G - the technology, major use-cases, likely business models, timelines and implications for adjacent sectors such as IoT.
5G fits into both my own TelcoFuturism analyst/advisory work on the intersections of multiple technologies in telecoms, and also my secondary role working with STL Partners as Associate Director and lead analyst of its Future of the Network research programme (link).
A philosophical split is emerging among operators and vendors:
"One network to rule them all" idealists
"Make it functional ASAP & add other stuff later" pragmatists
There are various nuances, middle-ground thinkers and shades of grey, but in general the former tend to be companies driven by the core and services domains, and the latter have a radio/access bias.

The core-network group tends to view things through the lens of NFV, and with a 2020 target date. It sees a world that spans diverse 5G use-cases from smartphones to sensors to vehicle-to-vehicle communications, taking in police cars and replacing FTTH and WiFi along the way. It wants to use sophisticated MANO (management and orchestration) layers and next-gen OSS to create network "slices", supposedly from "end to end". Such slices would, in theory, be optimised for different business models, or verticals, or virtual networks - leaning heavily on policy-management, differentiated QoS and assorted other big service-layer machinery. Mobile edge-computing would, ideally, extend the operator's cloud infrastructure into a distributed, Amazon-beating "fog". Often, terms like "HetNet" will be added in, with the notion that 5G can absorb (and assimilate) WiFi, LPWAN, corporate networks and anything else into a unified service-based fabric.

The other group is driven by more pragmatic concerns - "better faster cheaper" upgrades to 4G in a 2018-19 timeframe (and 2017 trials), replacing DSL in rural areas where fibre is too expensive but cable is growing, more spectral efficiency to squeeze more usage out of frequency allocations, lower-cost mobile broadband for emerging markets, better cell-edge coverage, and (ideally) lower power consumption for the RAN. Perhaps unsurprisingly, they focus more on the nuts and bolts of radio propagation in different bands, different modulation mechanisms, frame structures needed to optimise latencies - as well as practicalities such as small-cell backhaul. Business model discussions are secondary - or at least decoupled - although obviously there is a large IoT element again. The core network may well remain the same for some time, and 5G access will not necessarily imply NFV/SDN deployment as a pre-requisite. (I've spoken with CTOs who are quite happy without virtualisation any time soon).

In my view, it is the latter group which better understands the "hard" technology compromises that need to be made, as well as the timing considerations around deployment and spectrum availability - and the implied competition from diverse substitute technologies like SigFox, gigabit-speed cable, near-ubiquitous WiFi and even next-gen satellite (assuming no more SpaceX Falcons have unfortunate "anomalies"). A key concern is how to squeeze the ultra-low latency capabilities into a network architecture that also supports low-cost, mass IoT deployment.

Conversely, the other camp is often guilty of wishful thinking. "Let's control flying public-safety robots with millisecond latency & QoS via MEC nodes & 6GHz+ licensed-band 5G from totally virtualised & sliced service creation & activation platforms". This would probably work as the basis of a 2023 Michael Bay movie, but faces quite a few obstacles as a near-term mobile operator strategy. [Note: this concept is only a very mild exaggeration of some of the things I've had suggested to me by 5G zealots].
There are various practical and technical issues that limit the sci-fi visions coming true, but I want to just note a couple of them here:
It is far from clear that there will be enough ultra-performance end-points to justify having the millisecond-latency tail wag the 5G dog. I'd guesstimate a realistic 100 million-or-so device target, out of a universe of 10-20bn connections. Unless the related ARPU is huge (and the margin, after all the costly QoS/slicing gubbins is added in), it's not justifiable if it delays the wider market or adds extra complexity. Given that 100m would also likely be thin-sliced further with vertical-specific requirements (cars, emergency, medical, drones, machinery etc.), the scale argument looks even weaker.
A significant brake on NFV at the moment is the availability of well-trained professionals and developers. As one telco exec put it to me recently "we don't have the resources to make the architects' dreams come true". And this is for current NFV uses and architectures. Now consider the multi-way interdependencies between NFV + 5G + verticals + Cloud/MEC. The chances of telcos and their vendors building large and capable "5G slice" teams rapidly are very small. What would a "5G Slice development kit" look like? How exactly would an IoT specialist create a 5G-enabled robot anti-collision system for a manufacturing plant with arc-welders generating radio interference, for example? (And let's leave aside for now the question of what 5G NFV slices look like to regulators concerned about neutrality....)
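For what it's worth, here's a purely hypothetical sketch of what a developer-facing slice request might look like. No such SDK exists today; every class, field and function name below is invented, simply to illustrate how many parameters would need negotiating between the developer and the network.

```python
from dataclasses import dataclass

# A purely hypothetical sketch of what a "5G slice development kit" request
# might look like from an IoT developer's point of view. No such SDK exists;
# every class, field and function name here is invented for illustration.

@dataclass
class SliceRequest:
    name: str
    max_latency_ms: float        # end-to-end latency target
    min_throughput_mbps: float   # per-device guaranteed throughput
    reliability: float           # e.g. 0.99999 for "five nines"
    device_count: int            # expected endpoints on the slice
    coverage: str                # site, campus, region...

def request_slice(req: SliceRequest) -> str:
    """Stand-in for the orchestration call that would provision the slice."""
    # In reality this would involve MANO, the RAN, transport and core - and a
    # lot of negotiation about what is actually feasible at the cell edge.
    print(f"Requesting slice '{req.name}': "
          f"<= {req.max_latency_ms} ms, >= {req.min_throughput_mbps} Mbps, "
          f"{req.reliability:.5f} reliability, {req.device_count} devices, "
          f"coverage = {req.coverage}")
    return "slice-placeholder-id"

# The factory-robot anti-collision example from the text, expressed as a request
request_slice(SliceRequest(
    name="robot-anti-collision",
    max_latency_ms=5.0,
    min_throughput_mbps=2.0,
    reliability=0.99999,
    device_count=200,
    coverage="single factory site",
))
```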
In other words, I think that the "slice" concept is being over-hyped. It sounds great in principle, but it's being driven by the same core-network folk who've been trying to sell "differentiated QoS" in mobile for 15+ years. It took 7+ years to even get zero value-add VoLTE phone calls to work well on 4G with QoS, when that service had been specced and defined to within an inch of its life by committee. The convenient IoT/NFV/5G developer SDK and corresponding QoSPaaS isn't appearing any time soon. That said, I'm sure that there will be some basic forms of network-slicing that appear - perhaps for the public-safety networks that are moving to 4G/5G whether it's truly ready or not. But the vision of 10, 100 or 1000 differentiated 5G slices all working as a nicely-oiled and orchestrated machine is for the birds.

Instead, I think the right metaphor is hacking, not slicing. I don't mean hack in the malware/blackhat-in-a-basement sense, but in terms of taking one bit of technology and tuning/customising it and creating derivatives to serve specific purposes. We already see this with 4G. There's a mainstream version of LTE (and LTE-A and enhancements), but there's also PS-LTE for public safety, NB-IoT for low-end IoT, and LTE-U/LAA/MuLTEfire for unlicensed spectrum. Those are essentially "hacks" - they're quite different in important ways. They benefit from the LTE mother-spec's scale effects and maturity - and probably would not have evolved as standalone concepts without it. In a way, the original railway GSM-R version of GSM was a similar hack.
I think 5G will need something similar. As well as the so-called "New Radio", there is also work being done on a next-gen core - but there may well have to be spin-off variants and hacks too. This could allow the mainstream technology to avoid some possibly-intractable compromises, and could also be a way to bring in vertical specialists that currently think the mobile industry doesn't "get" their requirements - as per my recent post that telecoms can't just be left to the telcos (link).
As usual, the biggest risk to the mobile industry is strategic over-reach. If it persists in trying to define 5G as an all-encompassing monolithic architecture, with the hope of replacing all fixed and private networks, it will fail. In trying to create a jack-of-all-trades, it will likely end up as master-of-none. 5G has huge potential - but it also needs a dose of pragmatism, given that it is running alongside a variety of adjacent technologies that look like potential disruptors.
Ignore the sneers that SigFox is just 2016-era WiMAX, and look at the ever-present use of 3rd-party WiFi as a signpost - and at the emergence of WiGig. Look too at the threat that SD-WAN poses to MPLS and NFV-powered NaaS in fixed-line enterprise networks (link) - an illustration of the power of software in subverting telco-standardised business models. This time around, non-3GPP wireless is "serious" - especially where it leans on IEEE and Ethernet.
In its fullest version, the "slice" concept is far too grandiose and classically 1990s-era telco-esque. Hacking is much more Internet/IETF-style "rough consensus and running code". It will win.

A report on the Roadmap for 5G will be published soon as part of STL Partners' research stream on the Future of the Network. Please contact STL or myself (information AT disruptive-analysis DOT com) for more details, or for inquiries about custom advisory work and speaking engagements on 5G, NFV, LPWAN and related themes.