What you’ll learn:
- The size and complexity of the cloud.
- Verification and compliance issues in the cloud.
- What can be done to address these issues.
One of the often-overlooked aspects of modern society is the layered, multifaceted, and interconnected nature of the technical ecosystem it relies on. The IoT and other cloud-based systems are "wireless" solutions built on massive amounts of wiring behind the walls and the infrastructure they connect.
These systems, both wired and wireless, are continuously increasing in scope, size, and complexity, challenging those who must design, test, and troubleshoot them. The development of the next generation of solutions based on 6G telecommunications, edge computing, AI, and their related infrastructures is the next great challenge to the electronic design engineering community. The end result is a mixture of cloud environments that must be addressed.
Data centers have become the backbone of modern computing, enabling the storage, processing, and delivery of data everywhere. You can think of hyperscalers as data centers that specialize in delivering massive amounts of computing power and storage capacity to organizations and individuals around the world.
By ensuring seamless scalability and the ability to handle massive workloads without compromising performance or efficiency, hyperscalers support edge-based cloud environments and enable cloud-solution companies whose products are pure IP. This "fabless" development environment creates additional verification and compliance concerns.
Verification and Compliance in the Cloud
The industry is now starting to recognize that the open interfaces in the cloud-based RAN stack can create an interoperability nightmare, as they sit at varying levels of maturity. For example, the open fronthaul between the radio unit and the baseband unit is now quite mature for single-radio deployments, but issues arise with massive MIMO. In response, the industry is beginning to create blueprints that tell vendors which combinations of the cloud technology stack support a particular set of features.
These blueprints address the possible combinations and permutations of vendors and features that can work together. Every deployment is sufficiently different that it must be tested to avoid costly failures, so some organizations are migrating to a continuous integration, development, and test environment.
This is where VIAVI comes into play: our virtualized and containerized solutions can run a full set of regression tests every time there's a software update. Once the suite passes, the communications service provider (CSP) has the confidence to deploy the new software version in the cloud.
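The regression-gate idea described above can be sketched in a few lines. This is a hypothetical illustration, not VIAVI's actual suite: the test names, thresholds, and build fields are invented, and each "test" is a simple pass/fail check against build metrics.

```python
# Hypothetical sketch of a CI regression gate: a new software build is
# cleared for cloud deployment only if every regression test passes.
# All names and thresholds below are illustrative assumptions.

def run_regression_suite(build, tests):
    """Run each test against the build; return per-test pass/fail results."""
    return {name: test(build) for name, test in tests.items()}

def gate_deployment(build, tests):
    """Return True (deploy) only when the full suite passes."""
    results = run_regression_suite(build, tests)
    failures = [name for name, passed in results.items() if not passed]
    if failures:
        print(f"Build {build['version']} blocked; failed: {failures}")
        return False
    print(f"Build {build['version']} cleared for cloud deployment")
    return True

# Illustrative checks a CSP might gate on (throughput, latency, attach rate).
tests = {
    "throughput_regression": lambda b: b["throughput_gbps"] >= 9.5,
    "latency_regression":    lambda b: b["latency_ms"] <= 5.0,
    "attach_success":        lambda b: b["attach_rate"] >= 0.999,
}

good_build = {"version": "2.4.1", "throughput_gbps": 10.2,
              "latency_ms": 4.1, "attach_rate": 0.9995}
gate_deployment(good_build, tests)  # prints that the build is cleared
```

In a real pipeline, the lambdas would be replaced by calls into the virtualized test tools, but the gating logic is the same: any regression blocks the rollout.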
VIAVI made a conscious decision to move away from FPGAs toward x86 architectures for the cost savings it would bring, a general trend we also saw among our more progressive customers.
Ten years ago, before its acquisition, the company had virtualized a security test solution and moved it into AWS to test the firewalls that customers could select for extra security. So, we were familiar with the space, albeit in a different industry, and we brought that expertise to wireless customers who had this idea of moving toward the cloud. Now the cloud has gained critical mass and momentum, such that operators' next round of expansion will see control solutions deployed on cloud technologies.
Network Test and Verification
To support a successful transition to a more functional cloud environment, test practices must be developed and refined to ensure the consistent wireless performance demanded by end-users. An ecosystem of tools, software, protocols, and practices is required for all wireless cloud deployments to verify download speeds, latency, and coverage density. Advanced test solutions are playing a vital role in the development, deployment, and operational excellence of emerging wireless networks.
All big changes require commitment, and next-generation cloud-enabled performance is no exception, bringing complexity and technical challenges to the test arena. Companies like VIAVI have responded with a fully integrated selection of cloud-enabled test devices and equipment, software automation services, and network test solutions.
Wireless testing has become a critical enabler of cloud potential, and test solutions have quickly adapted to complex use cases and wholesale architectural advances encompassing core, transport, radio access network (RAN), and fiber network elements simultaneously. This has necessitated advanced emulation and verification technology in the test lab that’s scalable to full deployment in the field.
For example, 5G fiber networks are being challenged to meet fronthaul and backhaul demands, with the bar set higher for speed, bandwidth, reliability, and synchronization, while network function virtualization and edge computing introduce additional visibility obstacles. This convergence of dynamic system elements makes automated, real-time intelligence platforms another important pillar of 5G network performance testing and optimization.
The combination of millimeter-wave (mmWave) utilization, MIMO, and beamforming forms the infrastructure of 5G, and the complexity these innovations introduce poses challenges for 5G test networks and the overall 5G testing process. MIMO essentially means more antennas, which means a higher burden for 5G testers, as a measurement connector for each antenna is no longer feasible given the architecture and density.
Using mmWave and beamforming at these high frequencies presents additional obstacles: the signals are more susceptible to propagation loss from environmental conditions, making over-the-air (OTA) testing potentially less consistent and more complex. And because conducted-mode testing can't be performed without discrete connection points, OTA testing will be required more often despite its limitations.
Channel emulation is also more complex with 5G, as the number of RF channels increases exponentially versus the linear expansion of the 3G and 4G releases. 5G test equipment must involve creative solutions that minimize chamber testing and other expensive test elements without compromising test coverage and accuracy.
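The jump in channel-emulation complexity can be made concrete with simple arithmetic: the number of spatial channels an emulator must model scales with the product of transmit and receive antennas, so antenna counts that grow linearly multiply into channel counts that grow much faster. The antenna configurations below are typical examples chosen for illustration, not figures from any specification.

```python
# Illustrative arithmetic: spatial channels to emulate per link scale with
# (TX antennas x RX antennas). Antenna counts are assumed typical examples.

def spatial_channels(tx_antennas, rx_antennas):
    """Each TX/RX antenna pair is one spatial channel the emulator must model."""
    return tx_antennas * rx_antennas

lte_2x2 = spatial_channels(2, 2)     # early 4G MIMO
lte_4x4 = spatial_channels(4, 4)     # later 4G releases
nr_64x4 = spatial_channels(64, 4)    # 5G massive-MIMO array to a 4-antenna device

print(lte_2x2, lte_4x4, nr_64x4)  # 4 16 256
```

Going from 4 to 16 channels across 4G releases is the "linear expansion" noted above; a single massive-MIMO array pushes that into the hundreds, which is why minimizing chamber time matters.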
AI in Cloud Solution Testing
AI is becoming more useful in the creation of test solutions for the cloud. One of the products offered by VIAVI, RIC Test, targets the AI-assisted RAN intelligent controller designed to optimize RAN resources, reduce signaling, and improve capacity (see figure). We also have a RAN scenario generator that produces the datasets AI-based algorithms need to optimize the network. We came up with hundreds of scenarios to generate tens of thousands of datasets to train machine-learning (ML) models.
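The scenario-to-dataset-to-model pipeline can be sketched as follows. This is a hedged, minimal illustration: the scenario parameters, the load/throughput relationship, and the use of a least-squares line as the "ML model" are all invented stand-ins for what a real RAN scenario generator and training loop would do.

```python
import random

# Hypothetical sketch of the scenario -> dataset -> model pipeline.
# Numbers and the load/throughput relationship are illustrative assumptions.

random.seed(42)

def generate_scenario():
    """One synthetic RAN scenario: a random cell capacity, sampled 50 times."""
    capacity = random.uniform(0.5, 1.5)          # relative cell capacity
    samples = []
    for _ in range(50):
        load = random.uniform(0.0, 1.0)          # offered load fraction
        # Assumed relation: throughput drops with load, plus measurement noise.
        throughput = capacity * (1.0 - 0.6 * load) + random.gauss(0, 0.02)
        samples.append((load, throughput))
    return samples

# Hundreds of scenarios yield tens of thousands of samples, as in the text.
dataset = [s for _ in range(200) for s in generate_scenario()]

# Fit throughput ~ a + b*load by ordinary least squares (the stand-in "model").
n = len(dataset)
mx = sum(x for x, _ in dataset) / n
my = sum(y for _, y in dataset) / n
b = sum((x - mx) * (y - my) for x, y in dataset) / \
    sum((x - mx) ** 2 for x, _ in dataset)
a = my - b * mx
print(f"{n} samples, fitted throughput = {a:.2f} + {b:.2f}*load")
```

A production pipeline would train far richer models, but the shape is the same: parameterized scenarios feed a generator, and the resulting datasets train the optimization algorithms.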
The reality is that it will take a while for those interfaces to be deployed on the network, and the algorithms themselves don't distinguish between real-time, near-real-time, and non-real-time, because they run off whatever datasets are available. What we see now is that the industry has taken a more pragmatic approach, using the datasets already accessible through management systems: take those datasets and apply the ML models to the data at hand. In essence, operators will bypass the RIC until that technology is available on the network.
You'll see more AI-based RAN optimization decisions soon. We're helping operators develop their own AI models that predict the best possible outcome within the constraints the operator sets.
A good example is power savings. If I want to save power in the evening, I may be willing to do it at the expense of quality of experience, based on my policy. My policy might say, okay, I'll sacrifice some operational time to save power, but how much I sacrifice is up to the operator. The algorithms then determine which cells should be powered down or which advanced sleep modes should be set on the base stations.
We need that tradeoff between experience and power, with our algorithms advising operators on how well their machine-learning models have done. In other words, our AI will predict how well their AI algorithms will perform.
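The power/experience tradeoff just described can be sketched as a simple policy-driven selection. This is an assumed, greedy illustration, not an operator's actual algorithm: the cell names, load figures, and the 20% budget are invented, and real systems would also weigh coverage, neighbor capacity, and sleep-mode depth.

```python
# Hedged sketch of the power-savings policy: given an operator-set budget for
# how much aggregate load may be shed, greedily power down the lowest-load
# cells first. All cell data and thresholds are illustrative assumptions.

def cells_to_power_down(cells, max_load_sacrifice):
    """cells: list of (name, load_fraction). Returns cell names to power
    down, lowest-load first, while total shed load stays within budget."""
    shed = 0.0
    selected = []
    for name, load in sorted(cells, key=lambda c: c[1]):
        if shed + load > max_load_sacrifice:
            break  # shedding this cell would exceed the operator's policy
        shed += load
        selected.append(name)
    return selected

evening_cells = [("cell_a", 0.05), ("cell_b", 0.40),
                 ("cell_c", 0.10), ("cell_d", 0.75)]

# Policy: willing to shed up to 20% aggregate load overnight.
print(cells_to_power_down(evening_cells, 0.20))  # ['cell_a', 'cell_c']
```

Raising the budget powers down more cells and saves more energy at a greater cost to experience; the operator, not the algorithm, owns that dial.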
One of the problems with such automation is that operators are nervous about what's going on under the hood. Algorithms might show good results initially but diverge over time, and operators want a high level of confidence in them before deploying. That's now the major challenge for AI.
We're going from very simple ML algorithms all the way up to full-blown, semi-cognizant AI systems. It's a lot to take in, and the technology may outpace our ability to absorb it. Operators don't want to relinquish control because they don't trust the various algorithms, which might compete: with more than one optimization algorithm running on the network, you might find them fighting each other over the correct decision.
VIAVI believes the next generation of networks will be deployed using cloud technology, with a mixture of public and private infrastructures emerging over time, driven by financial decisions: Is it more cost-effective to run secondary instances of my network on a public versus a private cloud? We don't know how that will pan out, but some operators will succeed by choosing hyperscalers to host their infrastructure, some will probably keep it in-house, and others will use a mixture.
One great example is an operator using a hyperscaler for disaster recovery. During normal operation, the network runs on the operator's own infrastructure; when disaster strikes and some of that infrastructure is knocked out, they can fall back to the public cloud. For some operators, this offers the best of both worlds.