Q&A: What data trends are shaping the future of networks?


7 Nov 2018


Winston Carrera from BT Global Services answers our questions on building and future-proofing network architectures in the data-driven age.

Winston Carrera is CTO of future networks in BT’s Global Services division. He has more than 30 years’ experience in network architectures and in creating products and solutions for both enterprise and capital markets.

In his role, Carrera offers leadership and guidance on the exploitation of future network and systems technology, steering investments across industries such as financial services, aviation, retail and pharma.

We spoke to Carrera to find out more about future networks in the age of big data, and how Brexit might complicate data transfer.

What does it mean to be a CTO of future networks?

My role is to be a forward-looking business technologist engaging with BT’s largest customers. I provide thought leadership and guidance on exploiting future network and systems technology across the LAN, WAN and cloud to digitally transform the enterprise, in alignment with business strategy and regulatory requirements.

How is data science being deployed in future networks?

Data science and analytics are not deployed per se within a network. Rather, intelligence is embedded within the network through instrumentation logic or functionality implemented in software or hardware, with a particular emphasis on business rules. Data analysis is then applied to the information derived from this approach as an aggregate set, which must ultimately correlate to a business imperative or outcome to provide a tangible benefit.
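To make that concrete, here is a minimal sketch of analysing an aggregate set of network telemetry against a business rule; the site names, latency samples and SLA threshold are hypothetical illustrations rather than BT tooling:

    # Minimal sketch: telemetry emitted by network instrumentation is analysed
    # as an aggregate set against a business rule (an SLA threshold here).
    # Site names, samples and the threshold are hypothetical.
    from statistics import mean

    latency_samples = {
        "london-branch": [12.1, 13.4, 11.9, 35.2],  # milliseconds
        "frankfurt-dc": [4.2, 4.5, 4.1, 4.3],
    }

    SLA_THRESHOLD_MS = 15.0  # the business rule the aggregate must correlate to

    for site, samples in latency_samples.items():
        avg = mean(samples)
        if avg > SLA_THRESHOLD_MS:
            print(f"{site}: mean latency {avg:.1f} ms breaches the SLA")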

Can you give us some industry use cases where this is being applied to future network products?

There is a paradigm shift, independent of industry verticals, towards interfacing with the network using code-based tools and treating it much as the IT domain treats application code when developing, testing and deploying functionality that enables a business outcome. The aim is to increase activation velocity and agility while simultaneously reducing operational costs. This level of automation demands near real-time analysis of performance and operational data from the network at sufficient granularity, which is only useful when data science and analysis work in tandem with the automation toolset.
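As an illustration of that shift, the sketch below treats desired network behaviour as data that is reconciled against near real-time operational telemetry; the device name, policy values and the telemetry stub are hypothetical, not a description of any particular automation toolset:

    # Sketch of "network as code": intent is expressed as data, and near
    # real-time operational state is reconciled against it to drive automation.
    # The device name, policy values and telemetry stub are hypothetical.
    DESIRED_STATE = {
        "edge-router-1": {"bgp_peers_up": 2, "max_cpu_percent": 80},
    }

    def fetch_operational_state(device):
        """Stand-in for a streaming-telemetry or management-API call."""
        return {"bgp_peers_up": 1, "max_cpu_percent": 91}  # example reading

    def reconcile(device, policy):
        """Compare observed state with intent and decide whether to act."""
        observed = fetch_operational_state(device)
        if observed["bgp_peers_up"] < policy["bgp_peers_up"]:
            print(f"{device}: BGP peer down, trigger automated remediation")
        if observed["max_cpu_percent"] > policy["max_cpu_percent"]:
            print(f"{device}: CPU above policy, raise a capacity alert")

    for device, policy in DESIRED_STATE.items():
        reconcile(device, policy)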

One of the biggest questions around a data-driven future is how we can build networks to support it. What are your forecasts on the growth of big data and what kind of future-proofing do networks need to support this?

The challenge, as ever, is how to consume large quantities of disparate data at pace across a distributed environment and then incrementally consolidate it where appropriate to gain actionable insight.

The emergence of IoT [the internet of things] at scale, coupled with 5G, will only exacerbate this challenge over the next two to five years. Therefore, edge-based computing capability, alongside network functions virtualisation (NFV) with software-defined networking (SDN) control, will become critical to supporting future demand.
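One way to picture that edge-based consolidation is the sketch below, in which raw readings are summarised at each edge site and only compact aggregates are forwarded for central analysis; the sites, readings and summary fields are purely illustrative:

    # Sketch of incremental consolidation at the edge: raw IoT readings are
    # reduced to compact summaries locally, so only aggregates cross the WAN.
    # Site names, readings and summary fields are illustrative.
    from statistics import mean

    def summarise_at_edge(site, readings):
        """Reduce a batch of raw sensor readings to a compact summary."""
        return {"site": site, "count": len(readings),
                "mean": round(mean(readings), 2), "max": max(readings)}

    edge_batches = {
        "warehouse-a": [21.0, 21.3, 22.8, 21.1],
        "warehouse-b": [19.5, 19.7, 19.6],
    }

    # Only the summaries, not the raw readings, travel to the core for analysis.
    core_view = [summarise_at_edge(site, readings)
                 for site, readings in edge_batches.items()]
    print(core_view)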

Apart from technological advancement, are there other trends you are monitoring that can impact the flow and use of data on future networks?

In financial services, particularly the capital markets, regulation tends to be the key driver in technology adoption. A good example is the desire of the European Securities and Markets Authority (ESMA) to perform market surveillance in near real time across the trade life cycle, which has implications for the ability to maintain accurate clocks and synchronisation across multiple domains over a wide area in order to timestamp events.
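For context, the sketch below shows the kind of traceable UTC timestamp such surveillance depends on being attached to a trade-lifecycle event; synchronising the clock itself (for example via PTP or NTP) happens below the application layer, and the event fields here are hypothetical:

    # Sketch of stamping a trade-lifecycle event against UTC at microsecond
    # granularity. Synchronising the clock itself is an OS/network concern;
    # this only shows attaching the timestamp. Event fields are hypothetical.
    from datetime import datetime, timezone

    def timestamp_event(event_type, order_id):
        """Record an event with a microsecond-resolution UTC timestamp."""
        return {
            "event": event_type,
            "order_id": order_id,
            "utc_time": datetime.now(timezone.utc).isoformat(timespec="microseconds"),
        }

    print(timestamp_event("order_submitted", "ORD-0001"))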

Brexit throws up the challenge of whether the UK voluntarily maintains compliance with the ESMA regulatory technical standards or does something different, which may or may not challenge the manner in which timestamped data is collected, analysed and distributed.