
Showing posts with label sensors. Show all posts

Saturday, April 18, 2020

Rethinking wireless networks for post-COVID19 Smart Buildings

For the past month or so, I've been thinking about the longer-term technology, policy and business trends that might emerge in the wake of the current pandemic. I'm especially interested in those that could directly or indirectly affect the use and deployment of networks and communications.

I wrote up my initial scenarios for what might lie ahead for the telecoms industry in the recent STL Partners report on COVID-19: Now, Next & After (link), and also discussed them on the STL webinar on the same topic (link). The next update webinar is on May 6th - link. 

I've also participated in other client webinars and podcasts on campus networks (link), private 4G/5G (link) and Wi-Fi6E (link) recently - and I always include a section considering the pandemic's impact. (Any market analysis or opinion formed more than 2 months ago now needs to be reconsidered in the light of the pandemic and coming economic recession).

With that in mind, one area I've started thinking about is that of in-building wireless and smart buildings, especially relating to business locations. (Residential coverage of cellular and better home broadband / Wi-Fi is also top-of-mind, but I'll tackle that separately another time).

Obviously, offices and shopping malls are currently empty in much of the world, but eventually they will return to regular use, to some level at least. Even buildings sadly vacated by companies that cannot survive the economic impact will likely gain new tenants and uses.  


Making buildings pandemic-proof 

We already have building codes and regulations to protect us against fire risks, and even earthquakes. For fires, we have sensors, alarms, fire escapes, drills, signage and so on. In parts of the world there are specific rules governing indoor coverage for public safety radios, and they are being updated as agencies upgrade from P25 / TETRA systems to 4G / 5G critical-communications cellular alternatives.

So what else would it take to make a building "pandemic-proof"? I'm especially interested in how we manage social distancing - both during the next phase of recovery and the gradual return to near-normal once a vaccine becomes available, and during possible future waves of COVID or entirely new outbreaks. 

In the wake of the 2008 financial crisis there was a big focus on banks' transparency, financial stability and regulatory "stress tests". I'd be very surprised if equivalent changes to building codes and certification don't take place over the next few years - especially as many coronavirus infections are understood to occur indoors.

I've found various articles about smart buildings and the pandemic already, where the main focus seems to be on general hygiene and infection control. Using thermal cameras (and perhaps facial recognition) can automate detection of people with fevers. LED lights can provide disinfection in some cases, and bathroom sensors can help enforce hand-washing. Remote access to building-management systems allows facilities personnel to work from home. Better management of temperature and humidity may reduce the survival time of viruses and bacteria.

I can imagine a range of strategies being adopted in coming years:
  • Temperature-detection and hygiene management, as above.
  • Ability for remote building-management wherever possible
  • Design guidelines for wider corridors and stairways, better ventilation, virus-unfriendly surfaces, automated doors rather than handles, and so on
  • Ways to impose, measure and enforce social-distancing rules in emergencies - for example by dynamically lowering maximum numbers of permitted people in enclosed spaces, or digital signs for making corridors or aisles into one-way systems.
  • Use of sensors to measure occupancy, density and flow of people, and control entry/exit better
  • Automated disinfection systems or processes (maybe using robots)
  • Use of occupants' / visitors' phones or other devices to help them navigate / work more safely
  • Ability for authorities to use cameras, admission-control and other data for contact-tracing purposes (subject to emergency laws on privacy etc).
Clearly, not all of these can apply to all buildings - and there is obviously a huge spectrum of venue types with different requirements. A supermarket is different from an office block, a corner shop, a factory, or a warehouse full of robots. Older buildings are unlikely to be able to widen corridors, while a "cube farm" office has more flexibility. 

But what that means is that in a future outbreak, a government could say: "Workplaces certified to standard PNDMC-A can remain open, if they reduce occupancy to X, Y & Z metrics. PNDMC-B locations must comply with emergency rules A, B & C. All others must close."

Clearly, those types of rules will incentivise building owners and developers to upgrade their sites wherever possible. While it is too early to guess exactly how the specific regulations might be formulated, there are nonetheless some initial ideas and steps to think through.
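To make the certification idea concrete, here's a minimal sketch of how tiered emergency rules might be evaluated in software. The PNDMC tier names come from my hypothetical example above; the occupancy factors are invented purely for illustration:

```python
# Hypothetical pandemic-certification tiers for buildings.
# Tier names (PNDMC-A/B) are from the example above; occupancy
# factors and the lookup structure are invented for illustration.

EMERGENCY_RULES = {
    "PNDMC-A": {"may_open": True,  "occupancy_factor": 0.50},  # half normal capacity
    "PNDMC-B": {"may_open": True,  "occupancy_factor": 0.25},  # quarter capacity
    "NONE":    {"may_open": False, "occupancy_factor": 0.0},   # must close
}

def permitted_occupancy(certification: str, normal_capacity: int) -> int:
    """Return the maximum people allowed during an emergency; 0 means the site must close."""
    rule = EMERGENCY_RULES.get(certification, EMERGENCY_RULES["NONE"])
    if not rule["may_open"]:
        return 0
    return int(normal_capacity * rule["occupancy_factor"])

print(permitted_occupancy("PNDMC-A", 400))  # 200
print(permitted_occupancy("NONE", 400))     # 0
```

The interesting part is not the trivial lookup, but that enforcing it automatically requires the building to know its live occupancy - which is where the sensors and networks below come in.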


The role of networks

Given my own focus on mobile and wireless systems, a key theme immediately leaps to my mind: many of these techniques and practices will require better and wider indoor connectivity than is common today in many places. 

While some building-management systems will be based on wired connections (not least as they'll need cables for power anyway), I expect wireless networks to be extremely important for much of this.

I see wireless networks being employed both indirectly (for connection of sensors, cameras or other devices, such as smartphones used for distancing apps) and directly by using the network itself as a sensing and measurement tool. Indoor mapping and positioning will be needed in tandem with wireless for various use-cases.

There are particular challenges and opportunities for indoor wireless systems here:
  • There will be a need to support both public networks (for indoor use of nationwide MNO networks and services) and localised private wireless, for the building or company's own needs.
  • Almost inevitably, both 3GPP cellular (4G/5G) and Wi-Fi (5/6/7) will be essential for different use-cases and device types, plus public-safety wireless such as TETRA. In some cases additional technologies such as Bluetooth Low Energy, ZigBee or proprietary systems will also be required. 
  • All of this will occur while major transitions to 5G (at different frequencies) and private cellular networks are ongoing in coming years.
  • Any real-time mobile app, whether it is giving alerts, or uploading updates on location, will be dependent on good wireless connectivity, either via Wi-Fi or in-building cellular connections
  • Proximity-based apps (for instance using Bluetooth) will risk false positives if they are not integrated with building location and indoor-mapping systems. You can safely be within 2 metres of someone infected if there is a wall or floor/ceiling between you.
  • IoT systems such as disinfectant robots will also need access to indoor maps and granular positioning technology.
  • Next-generation networks such as private/campus 5G and also recent Wi-Fi meshes have improved wireless-positioning abilities. This could allow both real-time and reported proximity-monitoring - as well as enabling remote working & even "lights out" full automation in industrial settings
  • Both Wi-Fi and cellular networks can work out how many devices/users are not just connected, but detected, even if they do not attempt - or are not permitted - to connect to a given system. That could yield good data on user-density, especially if they are personal devices such as smartphones.
  • Wi-Fi enhancements already enable motion-detection - which can be considerably more accurate than traditional infra-red, and also work through walls. One technology innovator here is Cognitive Systems (link) but there are others as well. I've also seen suggestions that future 5G variants may be able to do something similar, if deployed with small cells. (I'm not sure how it would work with other in-building shared networks, though).
  • Potential to use localised cell-broadcast messaging, or Wi-Fi hotspot captive-portal pages, to distribute public health information and advice
  • There may be a growing need to align the indoor wireless network(s) with nearby outdoors connectivity, or link multiple buildings together well. Campus networks are already growing in importance for multiple reasons (link) and social-distancing and control adds another set of use-cases. (Consider private/public spaces such as courtyards, rooftop bars, parking lots and so on).
  • The use of virtualised radio networks (or specific variants such as OpenRAN) could also prove valuable here - for instance to enable operators to scale up/down capacity dedicated to indoor 5G wireless systems, or switch radio VNFs between indoor and outdoor coverage. (This goes far beyond pandemic-proofing and I will write about it another time). 
  • Neutral-host indoor wireless systems will be able to onboard new tenant networks such as public safety, or private building management networks, depending on future requirements and spectrum licensing policy.
  • There may be edge-computing requirements driven by pandemic-proofing, although that doesn't necessarily imply on-premise or very granular nearby edge facilities; metro-level may suffice.
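On the Bluetooth false-positive point above, a toy example shows why indoor mapping matters: signal strength alone can't tell whether a wall or floor separates two phones. The class and function names here are illustrative assumptions, not any real contact-tracing API:

```python
import math
from dataclasses import dataclass

# Toy illustration of the Bluetooth false-positive problem: radio
# proximity ignores floors, but an indoor-mapping system doesn't.
# All names and the 2 m threshold are assumptions for illustration.

@dataclass
class Position:
    x: float      # metres within the building
    y: float
    floor: int    # storey, as placed by the indoor-mapping system

def naive_ble_contact(a: Position, b: Position, threshold: float = 2.0) -> bool:
    """What a proximity app infers from signal strength: floors are invisible."""
    return math.dist((a.x, a.y), (b.x, b.y)) < threshold

def map_aware_contact(a: Position, b: Position, threshold: float = 2.0) -> bool:
    """With indoor mapping: only count contacts on the same floor."""
    if a.floor != b.floor:
        return False
    return math.dist((a.x, a.y), (b.x, b.y)) < threshold

upstairs   = Position(5.0, 5.0, floor=2)
downstairs = Position(5.5, 5.0, floor=1)
print(naive_ble_contact(upstairs, downstairs))   # True - a false positive
print(map_aware_contact(upstairs, downstairs))   # False
```

The same radio measurement that triggers a naive alert gets correctly dismissed once the positioning system knows the two devices are on different storeys.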
This is still just a very rough draft of my ideas - and clearly there are various policy / regulatory hypotheses here as well as technology direction. I'm not a specialist on building regulations, so it's quite possible I've made unreasonable assumptions. But this is intended as the start of a discussion, rather than a definitive forecast. I expect this topic and more detailed discussion to surface in coming months and years.

Your comments are very welcome - and if you want to get in touch with me directly, please connect on my LinkedIn, or send me a Twitter DM. If you're hosting any webinars, or holding internal brainstorms on this, I'd be very interested in participating.


Monday, December 04, 2017

5G & IoT? We need to talk about latency



Much of the discussion around the rationale for 5G – and especially the so-called “ultra-reliable” high QoS versions – centres on minimising network latency. Edge-computing architectures like MEC also focus on this. The worthy goal of 1 millisecond roundtrip time is often mentioned, usually in the context of applications like autonomous vehicles with snappy responses, AR/VR headsets without nausea, the “tactile Internet” and remote drone/robot control.

Usually, that is accompanied by some mention of 20 or 50 billion connected devices by [date X], and perhaps trillions of dollars of IoT-enabled value.

In many ways, this is irrelevant at best, and duplicitous and misleading at worst.

IoT devices and applications will likely span 10 or more orders of magnitude for latency, not just the two between 1-10ms and 10-100ms. Often, the main value of IoT comes from changes over long periods, not realtime control or telemetry.

Think about timescales a bit more deeply:

  • Sensors on elevator doors may send sporadic data, to predict slowly-worsening mechanical problems – so an engineer might be sent a month before the normal maintenance visit.
  • A car might download new engine-management software once a week, and upload traffic observations and engine-performance data once a day (maybe waiting to do it over WiFi, in the owner’s garage, as it's not time-critical).
  • A large oil storage tank, or a water well, might have a depth-gauge giving readings once an hour.
  • A temperature sensor and thermostat in an elderly person’s home, to manage health and welfare, might track readings and respond with control messages every 10 minutes. Room temperatures change only slowly.
  • A shared bicycle might report its position every minute – and unlock in under 10 seconds when the user buys access with their smartphone app
  • A payment or security-access tag should check identity and open a door, or confirm a transaction, in a second or two.
  • A networked video-surveillance system may need to send a facial image, and get a response in a tenth of a second, before they move out of camera-shot.
  • A doctor’s endoscope or microsurgery tool might need to respond to controls (and send haptic feedback) 100 times a second – ie every 10ms
  • A rapidly-moving drone may need to react in a millisecond to a control signal, or a locally-recognised risk.
  • A sensitive industrial process-control system may need to be able to respond in 10s or 100s of microseconds to avoid damage to finely-calibrated machinery
  • Image sensors and various network sync mechanisms may require response times measured in nanoseconds
I have not seen any analysis that tries to divide the billions of devices, or trillions of dollars, into these very-different cohorts of time-sensitivity. Given the assumptions underpinning a lot of 5G business cases, I’d suggest that this type of work is crucial. Some of these use-cases are slow enough that sending data by 2G is fine (or by mail, in some cases!). Others are so fast they’ll need fibre – or compute capability located locally on-device, or even on-chip, rather than in the cloud, even if it’s an “edge” node.
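To sanity-check the "10 or more orders of magnitude" claim, the examples above can be tabulated (the timescales in seconds are my rough approximations):

```python
import math

# Rough timescales from the examples above, in seconds (approximate).
timescales = {
    "monthly maintenance prediction": 30 * 24 * 3600,
    "daily car data upload":          24 * 3600,
    "hourly tank depth reading":      3600,
    "10-minute thermostat cycle":     600,
    "per-minute bike position":       60,
    "payment/access tag":             1.0,
    "facial-match response":          0.1,
    "haptic surgery loop":            0.01,
    "drone control signal":           0.001,
    "industrial process control":     1e-4,
    "sensor/network sync":            1e-9,
}

slowest, fastest = max(timescales.values()), min(timescales.values())
orders = math.log10(slowest / fastest)
print(f"span: ~{orders:.0f} orders of magnitude")  # roughly 15
```

Even this crude list spans around fifteen orders of magnitude, of which the 1-100ms band that dominates 5G marketing is just two.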

I suspect (this is a wild guess, I'll admit) that the proportion of IoT devices, for which there’s a real difference between 1ms and 10ms and 100ms, will be less than 10%, and possibly less than 1% of the total. 

(Separately, the network access performance might be swamped by extra latency added by security functions, or edge-computing nodes being bypassed by VPN tunnels)

The proportion of accrued value may be similarly low. A lot of the IoT examples I hear about are either long time-series collections of sensor data (for asset performance-management and predictive maintenance), or have fairly loose timing constraints. A farm's moisture sensors and irrigation pumps don't need millisecond response times. Conversely, a chemical plant may need to measure and alter pressures or flows in microseconds.

Are we focusing 5G too much on the occasional Goldilocks of not-too-fast and not-too-slow?

Tuesday, July 11, 2017

Sensors: implications for wireless connectivity & video communications

Quick summary
  • Sensor technology is complex, diverse, fascinating & fast-evolving.
  • There are dozens of sensor types & technologies.
  • Nobody believes the 20-50bn devices forecasts, especially if they are based on assumptions that 1 sensor = 1 device
  • Some sensors improve the capabilities of already-connected devices, like phones or (increasingly) cars.
  • Some sensors enable creation of new forms of connected device & application.
  • Most sensors connect first via one or two tiers of local gateways, sub-systems or controllers, rather than connecting directly to the Internet / cloud individually
  • While the amount of sensor-generated data is growing hugely, not all of this needs real-time collection and analysis, and so network needs are less-extreme.
  • Many industrial sensors use niche or unfamiliar forms of connectivity.
  • Genuine real-time controls often need sensors linked to "closed-loop" systems, rather than using Internet connections / cloud.
  • WiFi & short-range wireless technologies like Bluetooth & ZigBee are growing in importance. There is limited concern about using unlicensed spectrum
  • LoRa radios (sometimes but not always with LoRaWAN protocols) are growing in importance rapidly
  • Cellular connectivity is important for certain (especially standalone, remote/mobile & costly) sensor types, or sensor-rich complex objects like vehicles. 
  • The US seems more keen on LTE Cat-1 / Cat-M than NB-IoT for sensor-based standalone devices. Europe and Asia seem more oriented towards NB-IoT
  • There are no obvious & practical sensor use-cases that need 5G, but it will likely improve the performance / economics / reach of some 4G applications.
  • Camera / image sensors are becoming hugely important and diverse. These are increasingly linked to either AI systems (machine vision) or new forms of IoT-linked communication applications
  • "Ordinary" video sensors/modules are being supplemented by 3D, depth-sensing, emotion-sensing, 360degs, infra-red, microscopy and other next-gen capabilities.
  • AI and analytics will sometimes be performed on the sensor or controller/gateway itself, and sometimes in the cloud. This may reduce the need for realtime data transmission, but increase the need for batch transfer of larger files.
  • Conclusion: sensors are central to IoT and evolving fast, but the impact on network connectivity - especially new cellular 4G and 5G variants - is diffuse and non-linear.

Narrative
 
A couple of weeks ago I went to Sensors Expo 2017 in San Jose. This topic is slightly outside my normal beat, but fits with my ongoing interest in "telcofuturism", especially around the intersection of IoT, networks and AI. It also dovetails well with recent writing I've done on edge computing (link & link), a webinar [this week] and paper on IoT+video for client Dialogic (link), and an upcoming report I'll be writing on LPWAN for my Future of the Network research stream at STL Partners (link).

First things first: listening to some of the conference speeches, and then walking around the show floor, made me realise just how little I actually knew about sensors, and how they fit into the rest of the IoT industry. I suspect a lot of people in telecoms - or more broadly in wireless networking and equipment - don't really understand the space that well either.

For a start, there's a bewildering array of sensor types and technologies - from tiny silicon accelerometers that can be built into a chip (based on MEMS - micro-electromechanical systems), right up to sensors woven into large-scale fabrics, that can be used to make tarpaulins or tents which know when someone tries to cut them. There's all manner of detectors for gases, proximity, light, pressure, force, airflow, air quality, humidity, torque, electrical current, vibration, magnetic fields, temperature, distance, and so forth.

Secondly, a lot of sensors have historically been part of "closed-loop" systems, without much in the way of "fully-connected" computing, permanent data collection, networking, cloud platforms or analysis. 

An easy example to think about is an old-fashioned thermostat for a heating system. It senses temperature - and switches a boiler or radiator on or off accordingly - without "compute" or networking resource. This has been reinvented by Nest and others. Plenty of other sensors just interact with "real-time" systems - for example older cars' airbags, or motion-detection alarms which switch on lights.
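The old-fashioned thermostat is a good mental model for closed-loop control, and the whole loop fits in a few lines - note there is no network or cloud anywhere (the setpoint and hysteresis values below are illustrative):

```python
# Minimal closed-loop thermostat: sense temperature, switch heating on/off.
# The entire control loop is local, as in the pre-Nest systems described
# above. Setpoint and hysteresis values are illustrative assumptions.

class Thermostat:
    def __init__(self, setpoint: float = 20.0, hysteresis: float = 0.5):
        self.setpoint = setpoint
        self.hysteresis = hysteresis
        self.heating_on = False

    def step(self, measured_temp: float) -> bool:
        """One control cycle: return whether the boiler should be on."""
        if measured_temp < self.setpoint - self.hysteresis:
            self.heating_on = True       # too cold: switch heating on
        elif measured_temp > self.setpoint + self.hysteresis:
            self.heating_on = False      # warm enough: switch off
        return self.heating_on           # inside the band: keep current state

t = Thermostat()
print(t.step(18.0))  # True  - below the band, heating comes on
print(t.step(20.2))  # True  - inside the band, stays on
print(t.step(21.0))  # False - above the band, switches off
```

What Nest and others added is precisely the compute, networking and cloud layers around this loop - the loop itself never needed them.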

In industry, a lot of sensors hook into the "real-time control" systems, whether that's for industrial production machinery, quality control, aircraft avionics or whatever. These often use fixed connectivity, with a bewildering array of network and interface types. It's not just TCP/IP or familiar wireless technologies. If you haven't come across things like Modbus or Profibus, or terms like RS485 physical connections, you perhaps don't realise the huge complexity and unfamiliarity of some of these systems. This is not telco territory.

This is important, as it brings in an entire new realm to think about. From a telco perspective, we're comfortable talking about the touch-points of networks and IT. We don't often talk about OT, or "operational technology". A lot of people seem naively to believe that we can hook up a sensor or a robot or a piece of industrial machinery straight to a 4G/5G/WiFi connection, then via Internet or VPN to a cloud application to control it, and that's all there is to it. 

In fact, there may well be one, two or three layers of other technology involved first, notably PLC units (programmable logic controllers) as well as local gateways. A lot of this is the intranet-of-things, not the Internet-of-things - and may well not even be using IP as most people in networking and telecoms normally think about it.

In other words, there's a lot more optionality around the ISO layers - there is a broad range of sector-specific or proprietary protocols that control sensors or IoT devices over a particular "physical layer". That contrasts with most users' (and telco-world observers') day-to-day expectation of "IP everywhere", using HTTP and TCP/IP and similar protocols over ethernet, WiFi, 4G or whatever. The sensor world is much more fragmented than that.
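To give a flavour of how un-IP this world is, here's a sketch of building a Modbus RTU request frame by hand - a slave address, a function code and a CRC-16 checksum over a serial (often RS-485) line, rather than HTTP over TCP/IP. The register addresses are arbitrary examples:

```python
# A taste of the non-IP world: hand-assembling a Modbus RTU request.
# Frame = address + function code + payload + CRC-16/MODBUS checksum.
# Register/count values below are arbitrary examples.

def crc16_modbus(frame: bytes) -> int:
    """CRC-16/MODBUS: reflected polynomial 0xA001, initial value 0xFFFF."""
    crc = 0xFFFF
    for byte in frame:
        crc ^= byte
        for _ in range(8):
            if crc & 1:
                crc = (crc >> 1) ^ 0xA001
            else:
                crc >>= 1
    return crc

def read_holding_registers(slave: int, start: int, count: int) -> bytes:
    """Request frame for function 0x03 (read holding registers); CRC appended low byte first."""
    body = bytes([slave, 0x03]) + start.to_bytes(2, "big") + count.to_bytes(2, "big")
    return body + crc16_modbus(body).to_bytes(2, "little")

# Standard check value for CRC-16/MODBUS:
assert crc16_modbus(b"123456789") == 0x4B37
print(read_holding_registers(1, 0, 10).hex())
```

An 8-byte binary frame with a checksum, clocked over a twisted pair, is about as far from a RESTful cloud API as it gets - and there are dozens of such protocols in the field.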

These are some of the specific themes I noted at the event:
  • Despite the protocol zoo I've discussed, WiFi is everywhere nonetheless. Pretty much all the sensor types have WiFi connectivity options somewhere, unless they're ultra-low power. There's quite a bit of Bluetooth and ZigBee / other varieties of IEEE 802.15.4 for short-range access too.
  • Almost nobody seems bothered about the vagaries of unlicensed spectrum, apart from a few seriously mission-critical, time-critical applications, in which case they'll probably use fixed connections if they can. Bear in mind that a lot of sensors are actually fairly time-insensitive so temporary interference or congestion doesn't matter much. Temperatures usually only change over seconds / minutes, not milliseconds, for example. Bear in mind though, that this is for sensing (ie gathering data) not actuating (doing stuff, eg controlling machines or robots).
  • Most sensors send small bursts of data - either at set intervals, or when something changes. There are exceptions (notably camera / image sensors)
  • I saw a fair amount of talk about 5G (and also 4G and NB-IoT) but comparatively little action. Unlike Europe, the US seems more interested in LTE Cat-1 and Cat-M rather than NB-IoT. Cat-M can support VoLTE, which makes it interesting for applications like elder/child-trackers, wearable and building security. NB-IoT seems fairly well-suited to things like parking meters, environmental sensors, energy metering etc. where each unit is comparatively standalone, and needs to link to cloud/external resources like payments.
  • There's also lot of interest in LoRa, both as a public network service (Senet was prominently involved), and also as privately-owned infrastructure. I think we're going to see a lot of private LoRa embedded into medium-area sensor networks. Imagine 100 moisture sensors for a farm, connected back to a central gateway on top of the barn, and then on to a wide-area connection (fixed or mobile) and a cloud-based application. The 100 sensors don't need a wireless "service" - they'll be owned by the farmer, or else perhaps the connectivity will be offered as a part of a broader "managed irrigation service" by the software company.
  • There's an interest in wireless connectivity to reduce regulatory burdens for some sensors. For example, to connect a temperature sensor in an area of an oil refinery with explosion risks, to a server in another building, requires all manner of paperwork and certification. The trenching, ducting and physical wire between them needs approval, inspection and so on. It's much simpler to do it with wireless transmitters and receivers.
  • A lot of the extra sensors getting connected are going to be bundled with existing sensors. Rather than just a vibration sensor, the unit might also include temperature and pressure sensors in integrated form. That probably adds quite a lot to the IoT billions number-count, without needing separate network links.
  • A lot of sensors will get built into already-connected objects. Cars and aircraft will continue to add cameras, material stress sensors, chemical analysis probes for exhaust gases, air/fluid flow sensors, battery sensors of numerous types, more accelerometers and so on. This means more data being collected, and perhaps more ways to justify always-on connections because of new use-cases - but it also means a greater need for onboard processing and "bulk" transfers of data in batches.
  • Safety considerations often come ahead of security, and a long way ahead of performance. A factory robot needs sensors to avoid killing humans first. Production quality, data for machine learning and efficiency come further down the list. That means that connecting devices and sensors via wider-range networks might make theoretical or economic sense - but it'll need to be seen through a safety lens (and often sector-specific regulation) first. Taking things away from realtime connections and control systems, into a non-deterministic IP or wireless domain, will need careful review.
  • Discussion of sensor security issues is multi-layer, and encouragingly pervasive. Plenty of discussions around data integrity, network protection, even device authenticity and counterfeiting.
  • Imaging sensors (cameras and variants of them) are rapidly proliferating in terms of both capabilities and reach into new device categories. 3D depth-sensing cameras are expected on phones soon, for example for facial recognition. 360-degree video is rapidly growing, for example with drones. Vehicles will use cameras not just for awareness of surroundings, but also to identify drivers or check for attentiveness and concentration. Rooms or public spaces will use cameras to count occupancy numbers or footfall data. New video endpoints will link into UC and collaboration systems. "Sensed video" will need greater network capacity in many instances. [I am doing a webinar with Dialogic about IoT+video on July 13th - sign up here: link]
  • Microphones are sensors too, and are also getting smarter and more capable. Expect future audio devices to be aware of directionality, correct for environmental issues such as wind noise, recognise audio events as triggers - and even do their own voice recognition in the sensor itself.
  • Textile and fabric sensors are really cool - anything from smart tarpaulins for trucks to stop theft, through to bandages which can measure moisture and temperature changes, to signal a need for medical attention. 
  • There's a lot of modularity being built into sensors - they can work with multiple different network types depending on the use-case, and evolve over time. A vibration sensor module might be configurable to ship with WiFi, BLE, LoRa, NB-IoT, ZigBee and various combinations. I spoke to Advantech and Murata and TE Connectivity, among others, who talked about this.
  • Not many people seemed to have thought about SIMs/eSIMs much, at a sensor level. The expectation is that they will be added by solution integrators, eg vehicle manufacturers or energy-meter suppliers, as needed.
  • AI will have a range of impacts both positive and negative from a connectivity standpoint. The need for collecting and pooling large volumes of data from sensors will increase the need for network transport... but conversely, smarter endpoints might process the data locally more effectively, with just occasional bulk uploads to help train a central system.
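The farm scenario above can be sketched as a gateway that aggregates LoRa bursts locally and uploads occasional batches over the single wide-area link. The class names and message format are invented for illustration:

```python
import json
import time

# Sketch of the farm scenario above: many privately-owned LoRa moisture
# sensors report to one gateway on the barn, which batches readings for
# occasional wide-area upload - the sensors themselves need no network
# "service". Class names and the JSON format are illustrative assumptions.

class FarmGateway:
    def __init__(self, batch_interval_s: float = 3600.0):
        self.batch_interval_s = batch_interval_s  # how often to use the backhaul
        self.buffer = []

    def on_lora_message(self, sensor_id: str, moisture_pct: float, ts: float):
        """Called whenever a sensor's short radio burst is received."""
        self.buffer.append({"id": sensor_id, "moisture": moisture_pct, "ts": ts})

    def flush(self) -> str:
        """Package all buffered readings as one JSON batch for the backhaul link."""
        batch = json.dumps({"readings": self.buffer})
        self.buffer = []
        return batch

gw = FarmGateway()
for i in range(3):
    gw.on_lora_message(f"moisture-{i}", 40.0 + i, time.time())
print(len(json.loads(gw.flush())["readings"]))  # 3
```

The pattern also matches the AI point above: the gateway (or a smarter endpoint) can pre-process locally and send only occasional bulk uploads, rather than streaming every reading in real time.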
Overall - this has really helped to solidify some of my thinking about IoT, connectivity, the implications for LPWAN and also future 4G/5G coverage and spectrum requirements. I'd recommend readers in the mainstream telecom sector to drop in to any similar events for a day or two - it's a good way to frame your understanding of the broader IoT space and recognise that "sensors" are diverse and have varying impacts on network needs.

Friday, September 16, 2016

TelcoFuturism - the impact of Quantum Technology

The other day, I was invited to the Cambridge Wireless conference on quantum computing and communications (link). Fascinating and brain-melting domain, that has profound implications for many other areas of technology (and telecom). Even though I have a physics degree, I can't claim to be able to keep up with all the maths and concepts that are discussed - but I took away a few real-world implications of what seems to be occurring.

Quantum technology is a pretty broad area, that relates to the weird properties exhibited by individual atoms or photons (light). If you've heard of Schrodinger's Cat, then you'll know how strange some of the concepts can be - especially a "qubit" (quantum bit) that can simultaneously be a 1 or 0, or "entanglement" where pairs of particles remain spookily connected at a distance.

These properties can be used to create computers, communications systems, sensors, clocks and various other applications. In a way, quantum tech is a "foundational" idea similar to semiconductors (which are themselves based on quantum mechanical principles): there will be many, many applications. 

Terminology alert: people in this sector often compare quantum computers with "classical" alternatives. 


Some quick highlights and comments:
  • It's early days. Although there are some existing quantum solutions, they are not "universal" computers, but tailored for particular use-cases. Cooler stuff is 5-10 years away depending on your level of optimism (and stealth)


  • There were a lot of telecom people in the room - although that's partly a function of Cambridge Wireless's community (link). 
  • Many of the opportunities (& threats) from quantum are "several layers up". For example, we should be able to make more accurate clocks, which means better timestamping, which means more accurate transactions or positioning, which means better ways to create networks... It's pretty hard to extrapolate through all the layers to work out what the "real world" impacts might be, as there are variables & uncertainties & practicalities at each stage. Same thing for quantum improving AI systems.
  • There will be a lot of hybrid quantum/classical systems - including being integrated on the same chip.
  • Some crypto & PKI systems are going to be compromised by quantum-enabled decryption. It makes mincemeat of some algorithms, but others are much more "quantum-proof". There might be a "Y2Q" problem digging out where the old and vulnerable ones might be, buried inside other systems and software. This might be a "big deal", but there was also debate among experts about whether some of the risks claimed might actually be scaremongering or limited in scope. I think there will be a big ramp-up in "quantum compliance consulting" though - if enough people can understand it.
  • Quantum tech also enables totally-secure* networks to be built, using quantum key distribution (QKD). There's a bunch of tests and prototypes working around the world. At the moment these are mostly fibre-based, although some are using free-space optics. (*I'm not a cryptanalyst. Or a quantum wizard. My understanding is that secure here means non-interceptible or perfect interception-detection, but as always with security there are other weak links in the chain when humans are involved).
  • We're not getting some sort of magical mass-market "quantum broadband" any time soon, whether fibre or (definitely) mobile. There might be quantum-related components in networks for timing or security, but the actual physics of shipping bits around through air and fibre isn't likely to change.
  • One caveat - if I understand correctly (and it's possible I don't) some quantum applications might make it more appropriate either to use dedicated individual fibres, or to use frequency multiplexing (separate colours essentially) rather than networks with other forms of multiplexing. One of my "to do's" is to get my head around what quantum-level transport really means for the way we build IP networks - and whether it's only ultra-secure point-to-point connections that are impacted, rather than general "routed" ones. At the moment it seems the main use is parallel QKD streams to secure the main "media" stream. I've found some stuff on early concepts of quantum routing (link) and quantum-aware SDN (link) but if anyone has a view on the commercial impact of this, I'm all ears. 

  • A lot of the current work on quantum computing seems oriented towards creating better ways to do machine learning - essentially the ability to absorb many, many different things "in parallel" rather than sequentially. Beyond AI/ML, many important tasks involve optimisation or pattern-recognition - quantum solutions should help. This has applications across the board, from finance to healthcare to telecoms, although there weren't many suggested use-cases in BSS/OSS or network design at the event. I suspect there could be a variety of interesting options & will think more about this over coming months. (Let me know if you'd like to discuss it)


  • There's lots of complexity in getting quantum engineering to work for computing - components often need to be cryogenically cooled, there's all manner of software design, error-correction and control issues, maybe some engineering of microwave systems to link qubits together, and so on. This is Big Science. It's not going to be in the iPhone 9. (Although some of the sensing and clock stuff seems to be "smaller")

  • There's some cool stuff being done around quantum-based accelerometers, gravity sensors etc. One of the biggest drivers is the desire to create a GPS-type positioning system that doesn't rely on signals from satellites - which can be jammed, blocked or even destroyed. Currently GPS is turning into a bit of a "single point of failure" for the entire planet - not least for cellular networks and devices, and for financial transactions which need timestamps.


  • Someone else has beaten me to the term QCaaS (link) so I'll have to settle for QDN - "Quantum Defined Networking". You heard it here first....
  • There are various implied links with IoT (sensors) and blockchain (crypto). I'll keep an eye on those for future work.
Overall, a fascinating topic - and one which the UK government, academia and industry are pumping a ton of cash into. It's perhaps not as sexy as some other futurist obsessions like AI, genetic engineering or blockchain - but it's potentially just as transformative, not least by helping accelerate the progress of all of the others.

For the telecoms industry, there's relatively little to be worried about yet - although getting older network and IT systems' crypto checked over seems important given the timelines to replace legacy equipment. Given the rising desire to exploit PKI and identity in telecoms and IoT as a long-term business, the 10-year horizon for "sci-fi" possibilities is a bit uncomfortable, especially if new breakthroughs are made. And that's before second-guessing how much extra progress has been made by intelligence communities, and how fast Messrs Snowden and Assange get to hear about it. 
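As an illustration of what that "crypto check" might look like, here's a minimal triage sketch in Python. The algorithm groupings and risk labels are my own simplification, not a standard taxonomy - the usual view is that Shor's algorithm breaks RSA/ECC-style public-key schemes outright, while Grover's merely halves the effective strength of symmetric ciphers and hashes:

```python
# Illustrative "Y2Q inventory" triage: classify crypto algorithms found in an
# audit by their (publicly understood) vulnerability to quantum attack.

SHOR_BROKEN = {"RSA", "DSA", "ECDSA", "ECDH", "DH"}   # public-key: breakable by Shor
GROVER_WEAKENED = {"AES-128", "SHA-256", "3DES"}      # effective strength roughly halved
CONSIDERED_SAFE = {"AES-256", "SHA-384", "SHA-512"}   # believed to have enough margin

def quantum_risk(algorithm: str) -> str:
    """Return a coarse risk label for a named algorithm."""
    name = algorithm.upper()
    if name in SHOR_BROKEN:
        return "replace"       # needs a post-quantum substitute
    if name in GROVER_WEAKENED:
        return "upgrade"       # move to a longer key / digest size
    if name in CONSIDERED_SAFE:
        return "ok"
    return "unknown"           # flag for manual review

# Triage an inventory list (hard-coded here for illustration)
inventory = ["RSA", "AES-128", "AES-256", "ChaCha20"]
triage = {alg: quantum_risk(alg) for alg in inventory}
print(triage)
```

In a real audit, of course, the inventory would come from scanning certificates, TLS configs and embedded firmware rather than a hard-coded list - which is exactly the "digging out" problem.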

We might see quantum tech appearing first in clocks used in networks, or specific optimisation problems solved with early computers from the likes of D-Wave. In my mind there are a few options around NFV/SDN and network-planning that might be a fit, for instance. There's also some cool possible opportunity around super-secure communications and non-GPS navigation. But if you're a serious telco quantum doom-monger, the good news is that you needn't worry about the prospect of Netflix quantum-entangling videos direct to people's TVs and smartphones just yet.
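To make the D-Wave-style optimisation angle concrete, here's a toy example of the problem class annealers target: QUBO (Quadratic Unconstrained Binary Optimisation). The weights below are invented purely for illustration, and the solver is a classical brute-force search - which is precisely the approach that stops scaling as the problem grows, and motivates annealing (quantum or simulated) in the first place:

```python
# Toy QUBO: minimise sum of Q[i,j] * x_i * x_j over binary variables x.
# Diagonal entries of Q are linear biases; off-diagonal entries are couplings.
from itertools import product

Q = {
    (0, 0): -1.0, (1, 1): -1.0, (2, 2): -1.0,  # each variable "wants" to be 1...
    (0, 1):  2.0, (1, 2):  2.0,                # ...but adjacent pairs are penalised
}

def energy(bits):
    """QUBO objective for one assignment of binary variables."""
    return sum(w * bits[i] * bits[j] for (i, j), w in Q.items())

# Brute force over all 2^3 assignments - fine at toy scale, hopeless at large n
best = min(product([0, 1], repeat=3), key=energy)
print(best, energy(best))  # prints: (1, 0, 1) -2.0
```

The lowest-energy assignment sets the two "outer" variables and leaves the penalised middle one off. An annealer explores the same energy landscape, but without enumerating every assignment one by one.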

If you're interested in learning more about Disruptive Analysis' work on "TelcoFuturism" please get in touch at information AT disruptive-analysis dot com. My introduction to the concept is here (link) and I've also written about AI/machine learning (link) and Blockchain (link). I gave my first keynote presentation on TelcoFuturism a few months ago (link) and will be progressively ramping this up - get in touch if you need a speaker.