Artificial intelligence will maximise efficiency of 5G network operations
Compared with previous types of networks, 5G networks are both more in need of automation and more amenable to automation. Automation tools are still evolving and machine learning is not yet common in carrier-grade networking, but rapid change is expected.
Emerging standards from 3GPP, ETSI, ITU and the open source software community anticipate increased use of automation, artificial intelligence (AI) and machine learning (ML). And key suppliers’ activities add credibility to the vision and promise of artificially intelligent network operations.
“Growing complexity and the need to solve repetitive tasks in 5G and future radio systems necessitate new automation solutions that take advantage of state-of-the-art artificial intelligence and machine learning techniques that boost system efficiency,” wrote Ericsson’s chief technology officer (CTO), Erik Ekudden, recently.
In 2020, Ericsson engineers demonstrated machine learning software that orchestrated virtual machines on a web server. They reported that during a 12-hour stress test, their software cut idle cycles from a baseline of 20% to 2%. Similar efficiency gains could benefit fleets of edge computers and the servers within cloud-native 5G infrastructure.
Considering that 5G core networks are evolving towards increased dependence on software and generic computing resources, Ericsson’s demonstration suggests that large-scale use of AI solutions could help carriers use infrastructure as efficiently as possible while handling a mix of traffic types that change dynamically and fulfilling diverse service-level agreements.
Nokia marketing manager Filip De Greve recently stated: “The benefits of AI and ML are unquestionable – all it needs is the right approach and the right partner to unlock them.”
A whitepaper from Nokia describes potential roles for AI and ML in virtually all phases of a service provider’s operations. Last month, Nokia announced the availability of its Software Enablement Platform, whose features include a means for making use of AI and ML in edge computers that run both open radio access networks (O-RANs) and application-level services. Nokia’s platform provides data that is important to machine learning developments for software-defined radios.
Carriers and third parties can develop software for Nokia’s platform, which comes with some samples that are in current commercial trials. One included “xApp” relies on machine learning methods for traffic steering – roughly speaking, a type of service-aware load balancing for radio channels.
Huawei, too, has engaged in a number of machine learning developments in recent years, but seems to have made relatively few disclosures about the matter recently. The company said its management and orchestration (MANO) solution “uses AI and big data technologies to implement automatic deployment, configuration, scaling and healing”.
Needs and points of entry for carrier-grade AI
The need for machine learning arises from expected challenges in managing future 5G networks. Future deployments will likely have traffic-carrying capacity orders of magnitude greater than that of existing infrastructure. Many suppliers, researchers and developers expect machine learning to be necessary to make efficient use of 5G technologies.
Opportunities to use machine learning are arising with increased reliance on cloud-native resources in telecommunications networks. Carriers also feel the same powerful currents that impel many industries towards “softwarisation”: virtual machines, DevOps principles and other broad trends that favour intelligent automation.
Suppliers to telecoms carriers and advanced researchers are developing machine learning software that, for example, controls smart antennas with split-second timing, assigns and reassigns bandwidth within a packet core and orchestrates assignments for an edge computer’s virtual machines.
Essentially, the software plays a game, aiming to predict traffic loads and use the fewest resources to carry traffic in accordance with service-level agreements. The intended result would improve the availability of resources to serve additional customers at times when loads are at their peak. When loads abate, the software can cause hardware to operate in power-saving standby mode.
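The “game” described above can be sketched in a few lines of Python. This is a minimal illustration, not any vendor’s implementation: the function names, the moving-average forecast and the headroom figure are all assumptions chosen for clarity.

```python
import math

def plan_capacity(recent_loads, capacity_per_server, total_servers, headroom=0.2):
    """Forecast the next interval's load as a moving average of recent
    samples, then decide how many servers must stay active to carry it
    (plus headroom), leaving the rest in power-saving standby."""
    forecast = sum(recent_loads) / len(recent_loads)
    needed = math.ceil(forecast * (1 + headroom) / capacity_per_server)
    active = min(max(needed, 1), total_servers)  # always keep one server up
    return active, total_servers - active        # (active, standby)
```

With recent loads of 80, 100 and 120 units, servers that each carry 50 units, and a fleet of 10, the sketch keeps three servers active and puts seven on standby; a real controller would use a learned predictor rather than a simple average, and would respect service-level constraints when scaling down.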
Rules-based scripts and statistical models can accomplish some of these goals, but hand-crafted algorithms face challenges. A vast number of parameters specify a connection event in a 5G network – more so than in previous generations. That is why machine learning could be a requirement, not simply an optimisation tool, for efficient resource utilisation in full-scale 5G operations.
Varieties of AI tasks in cellular networking
Recent reports have surveyed a range of wireless communications applications that machine learning researchers and developers are working on, yielding many candidate technologies for carrier roadmaps.
From a business lifecycle perspective, opportunities exist for machine learning developments to expedite network planning and design, operations, marketing and other duties that normally require an intelligent human. Developers are targeting network management functions, including fault, configuration, accounting, performance and security management (FCAPS).
From a network technology perspective, machine learning applications in research and development phases could affect every layer of the communications stack, from the low-level physical and data link layers, through the network and transport layers, up to the session, presentation and application layers.
At lower layers of radio access networks, generic computers process baseband signals, and they schedule and form directional radio beams by synchronising many antenna elements. Machine learning systems can alleviate congestion by assigning optimal modulation parameters and rapidly scheduling beams that are calculated to fulfil immediate demands.
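A toy version of two such decisions, picking a modulation order for the predicted channel quality and greedily scheduling beams towards the heaviest queues, might look like the following. The SNR thresholds and function names here are illustrative assumptions; real systems derive modulation and coding choices from CQI reports and block-error-rate targets defined in 3GPP specifications.

```python
# Illustrative SNR (dB) thresholds for common modulation orders; these
# are rough placeholder values, not figures from any 3GPP table.
MCS_TABLE = [(22.0, "256QAM"), (15.0, "64QAM"), (9.0, "16QAM"), (0.0, "QPSK")]

def select_modulation(predicted_snr_db):
    """Pick the highest-order modulation the predicted channel supports."""
    for threshold, scheme in MCS_TABLE:
        if predicted_snr_db >= threshold:
            return scheme
    return "QPSK"  # fall back to the most robust scheme

def schedule_beams(demand_by_beam, beams_per_slot):
    """Greedy beam schedule: serve the beams with the largest queued
    demand first, up to the number of beams the array can form per slot."""
    ranked = sorted(demand_by_beam, key=demand_by_beam.get, reverse=True)
    return ranked[:beams_per_slot]
```

A machine learning scheduler would replace the fixed thresholds and the greedy rule with predictions learned from telemetry, but the structure of the decision, mapping channel state and demand to transmission parameters on a per-slot deadline, is the same.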
At higher layers of communications stacks, softwarisation yields opportunities to use and reuse virtual network functions (VNFs) in dynamic combinations to handle changes in traffic patterns. For example, intelligent systems can right-size (autoscale) temporary combinations of resources to support a large video conference and reassign those resources to other jobs after the event.
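The right-sizing step for a single VNF reduces to arithmetic once a capacity model exists. The sketch below assumes a hypothetical sessions-per-replica figure; the clamping bounds stand in for an availability floor and a resource quota.

```python
import math

def rightsize_vnf(active_sessions, sessions_per_replica,
                  min_replicas=1, max_replicas=64):
    """Scale a VNF's replica count to the current session load, clamped
    between an availability floor and a resource-quota ceiling."""
    needed = math.ceil(active_sessions / sessions_per_replica)
    return min(max(needed, min_replicas), max_replicas)
```

For a conference peaking at 900 sessions with 100 sessions per replica, this yields nine replicas; once attendance drops to 40, the count falls back to one and the freed capacity can be reassigned. The intelligence in a production system lies in anticipating the peak rather than reacting to it.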
In packet core networks, intelligent selection among the astronomical number of ways to mix and match network functions can cut idling while keeping customers satisfied. In radio access networks, intelligent tweaks to power levels, symbol sets, frame sizes and other parameters promise to squeeze the greatest capacity from the available spectrum.
Cyber security and privacy measures can also benefit from machine learning. In theory, intelligent domain isolation can open and shut access automatically in accordance with knowledge encoded in large databases such as event logs. Distributed learning methods can run on edge computers and user devices, keeping private data separate from centralised databases.
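The aggregation step at the heart of such distributed learning can be shown in a few lines. This is a federated-averaging-style sketch: each edge node trains on its own private data and uploads only model weights, which a coordinator combines in proportion to each node’s sample count. The function name and call shape are illustrative, not a specific framework’s API.

```python
import numpy as np

def federated_average(local_weights, sample_counts):
    """Combine locally trained model weights, weighted by how many
    samples each node trained on. Raw user data never leaves the edge."""
    total = sum(sample_counts)
    return sum(w * (n / total) for w, n in zip(local_weights, sample_counts))
```

For example, a node contributing 300 samples pulls the global model three times harder towards its local weights than a node contributing 100, while neither node’s underlying data is ever centralised.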
Juniper’s slogan “the self-driving network” expresses a vision of autonomous communications services, analogous to autonomous vehicles. Many other network technology developers have embraced similar ideas. Engineers and marketers often describe intent-based networking (IBN), one-touch provisioning, and zero-touch network and service management.
Most suppliers will probably use one of these phrases, or a similar phrase. All of them refer to a subset of network operations that can occur autonomously, or nearly so. In fact, many software-defined networking technology concepts rely on rules-based systems, a programming strategy that the artificial intelligence community developed decades ago.
Verizon network architect Mehmet Toy recently described one interpretation of IBN to mean “deploying and configuring the network resources according to operator intentions automatically”. While developments often focus on fulfilling the intentions of network managers, Toy also envisions network configurations that respond to changes in user intentions.
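A toy intent compiler makes the idea concrete: a declarative goal goes in, concrete settings a controller could push to network elements come out. The intent fields and QoS classes below are invented for illustration and do not come from any standard; real IBN systems also validate the intent and continuously verify that the network still satisfies it.

```python
def compile_intent(intent):
    """Toy intent compiler: map a declarative service goal onto concrete,
    hypothetical configuration settings."""
    config = {"slice": "default", "queue_priority": 3}
    if intent.get("max_latency_ms", 1000) <= 20:
        config["slice"] = "low-latency"
        config["queue_priority"] = 1
    if intent.get("min_bandwidth_mbps", 0) >= 100:
        config["guaranteed_bit_rate_mbps"] = intent["min_bandwidth_mbps"]
    return config
```

An operator stating “video service, at most 15 ms latency, at least 200 Mbps” would, in this sketch, receive a low-latency slice with top queue priority and a guaranteed bit rate, without ever touching per-device configuration.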
Imaginably, a future network manager could employ natural language to revise a bandwidth-throttling policy. But beware of hype surrounding network automation. In some enterprise networks, zero-touch nodes configure automatically when a technician powers up a new rack. In contrast, installing a carrier-class fibre termination node remains complex.
Much as driverless cars have required more time and development resources than many expected, fully autonomic networks remain a distant vision. One major challenge is acquiring and analysing abundant telemetry data within service providers’ networks.
Many systems do not expose the data that data-hungry machine learning systems need to predict and respond to changes in traffic loads. Systems that do provide telemetry use diverse protocols and data structures, complicating AI software development. Suppliers may also come to regard telemetry data as valuable intellectual property, worth protecting with encryption.
A 2020 Nokia whitepaper advocates a multistage technology roadmap to manage the opportunities and risks. Nokia acknowledges that AI is rare in today’s networks. More commonly, expert human network managers create, implement and often adjust statistical and rules-based models that govern automated systems in telecommunications networks.
Intermediate between today’s model-driven practices and the future vision of autonomic networks, Nokia sees the emergence of intent-driven network management processes, enabled by closed-loop automation systems. Automated resource orchestration would free up human network managers to focus on business needs, service creation and DevOps.
Does AI threaten network managers’ jobs? In one sense, a changing technology landscape often challenges networking professionals to keep up with new developments. In another sense, AI tools in diverse fields tend to be productivity enhancers rather than redundancy generators. Similarly, for doctors and attorneys, AI is more of a tool than a threat.
One industry player or another always seems to be buzzing about “intelligent networks”. AT&T has been at it the longest, having first used the phrase in the 1980s to describe an early network computing initiative. Expectations of artificial intelligence in networks have focused and refocused repeatedly over the years. This time may be different. Are we there yet?
Now that computers control or constitute virtually all network nodes, software seems to be more agile at all layers of communications stacks. Business evolution will determine which AI and ML developments contribute most to business results and customer experiences, and which nodes in a network provide maximum leverage for machine learning software to add value.