Today we’re joined by Vladislav Zaimov, a seasoned specialist in enterprise telecommunications and the risk management of vulnerable networks. As the telecom industry stands at a crossroads, eager to embrace AI but often held back by decades of technical debt, we’re diving deep into the practical application of agentic AI. We’ll explore how this technology is helping major players like Lumen move beyond the hype, tackling foundational challenges in data migration, automating complex processes like device modeling, and finally making the concept of digital twins a cost-effective reality.
Lumen is undertaking a major overhaul of its legacy inventory system. How exactly are AI agents assisting with this complex data migration and cleanup, and what are the most significant manual tasks being automated to establish that foundational “single source of truth”?
It’s a massive undertaking, and you’re right to focus on the foundation. So many operators get dazzled by the promise of AI without realizing their data house is not in order. For Lumen, the process involves migrating to a new, modern inventory system. The AI agents are being deployed to automate the incredibly tedious and error-prone manual work that comes with this. Imagine sifting through years of disparate, inconsistent data across multiple legacy platforms. The agents are designed to simplify this by cleaning and rationalizing that data as it moves. This automation is what allows them to build that essential “system of record”—a single, reliable source of truth. Without that, any advanced AI capability you try to layer on top is built on sand.
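To make that cleanup step concrete, here is a minimal Python sketch of the kind of normalization and deduplication an agent automates during a migration like this. The legacy field names, the unified schema, and the normalization rules are illustrative assumptions, not details of Lumen’s actual system.

```python
# Minimal sketch of the cleanup an agent performs during migration.
# Legacy sources, field names, and normalization rules are illustrative only.
from dataclasses import dataclass

@dataclass(frozen=True)
class InventoryRecord:
    device_id: str
    site: str
    vendor: str
    model: str

def normalize(raw: dict) -> InventoryRecord:
    """Map one legacy row onto the unified schema, fixing common inconsistencies."""
    return InventoryRecord(
        device_id=raw.get("device_id", raw.get("id", "")).strip().upper(),
        site=raw.get("site", raw.get("location", "")).strip().title(),
        vendor=raw.get("vendor", "").strip().title(),
        model=raw.get("model", "").strip(),
    )

def migrate(legacy_rows: list[dict]) -> list[InventoryRecord]:
    """Normalize every legacy row and drop duplicates so one record survives per device."""
    seen: dict[str, InventoryRecord] = {}
    for raw in legacy_rows:
        record = normalize(raw)
        if record.device_id and record.device_id not in seen:
            seen[record.device_id] = record
    return list(seen.values())

# Two legacy systems describing the same device with inconsistent fields:
rows = [
    {"id": "rtr-001 ", "location": "denver", "vendor": "cisco", "model": "ASR-9906"},
    {"device_id": "RTR-001", "site": "Denver", "vendor": "Cisco", "model": "ASR-9906"},
]
print(migrate(rows))  # one unified record survives: the "single source of truth" entry
```

In practice an agent would also reconcile conflicting values and flag records it cannot resolve for human review; the point is that every legacy row is forced through one schema, so a single trusted record survives per device.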
Device model generation is often a labor-intensive, manual process for operators. Could you walk us through the specific steps an AI agent takes to automate this? Please describe the information it uses and how it creates these models without direct human intervention.
Absolutely. Traditionally, this has been a real bottleneck. An engineer would effectively have to poll every new device on the network, manually document its specific characteristics—things like the number and type of ports, its connectivity protocols, and other unique attributes—and then painstakingly build a model of it within the inventory system. It’s slow and requires deep domain knowledge. With an agentic AI framework, the process is transformed. You simply feed the raw device information, the same data an engineer would look at, directly to an AI agent. The agent then intelligently parses this data, understands the device’s function and specifications, and automatically generates the corresponding model for the inventory. It completely removes the manual, repetitive modeling work, freeing up skilled engineers to focus on more strategic tasks.
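As an illustration, here is a small Python sketch of the parsing step such an agent performs. In a real agentic framework a language model would interpret far messier vendor documentation; the input format, field names, and DeviceModel class here are hypothetical stand-ins.

```python
# Sketch: turn raw device data into a structured inventory model automatically.
# The input format and the DeviceModel fields are assumptions for illustration.
import re
from dataclasses import dataclass, field

@dataclass
class DeviceModel:
    name: str
    ports: list[tuple[str, int]] = field(default_factory=list)  # (port type, count)
    protocols: list[str] = field(default_factory=list)

def build_model(raw: str) -> DeviceModel:
    """Parse a raw device description and emit a structured inventory model."""
    name = re.search(r"Device:\s*(\S+)", raw).group(1)
    ports = [(t, int(n)) for n, t in re.findall(r"(\d+)x\s*(\S+)\s*ports", raw)]
    protocols = re.findall(r"Protocols:\s*(.+)", raw)[0].split(", ")
    return DeviceModel(name=name, ports=ports, protocols=protocols)

raw_info = """Device: EdgeRouter-X7
24x 10GE ports
4x 100GE ports
Protocols: BGP, MPLS, OSPF"""

print(build_model(raw_info))
# DeviceModel(name='EdgeRouter-X7', ports=[('10GE', 24), ('100GE', 4)],
#             protocols=['BGP', 'MPLS', 'OSPF'])
```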
The concept of digital twins has reportedly faced adoption challenges. How does agentic AI make building and maintaining these virtual network replicas more cost-effective, and can you share a practical example of how this technology lowers the barrier to entry for network operators?
The excitement around digital twins has often been tempered by the sheer cost and complexity of building and maintaining them. Agentic AI directly addresses this by leveraging the foundational work we’ve discussed. Once you have that clean, consolidated inventory—that single source of truth—you have a perfect digital blueprint of your physical network. An AI agent can then use this data to build and continuously update a virtual replica far more cheaply than manual methods would allow. For example, instead of a massive, standalone project to build a twin, an operator can now use AI to intelligently simulate network changes. They can ask, “What happens if we reroute traffic this way?” or “How will this new service impact our IP and optical layers?” The AI simulates these scenarios on the virtual replica, providing insights without the enormous cost or risk of a traditional approach. It democratizes the technology.
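Here is a brief sketch of what such a what-if query can look like once the twin exists as data. It uses the networkx graph library, and the topology, link capacities, and helper function are invented for illustration.

```python
# Sketch: a "what if?" query against a virtual replica built from the inventory.
# Topology, capacities, and the failure scenario are illustrative assumptions.
import networkx as nx

# Build the twin from (hypothetical) inventory data: nodes are devices,
# edges carry link capacity in Gbps.
twin = nx.Graph()
twin.add_edge("DEN", "CHI", capacity=100)
twin.add_edge("CHI", "NYC", capacity=100)
twin.add_edge("DEN", "DAL", capacity=40)
twin.add_edge("DAL", "NYC", capacity=40)

def simulate_failure(g: nx.Graph, failed_edge: tuple[str, str], src: str, dst: str):
    """Remove a link in the replica and report the fallback path, if any."""
    trial = g.copy()                      # never mutate the twin itself
    trial.remove_edge(*failed_edge)
    if not nx.has_path(trial, src, dst):
        return None
    path = nx.shortest_path(trial, src, dst)
    headroom = min(trial[u][v]["capacity"] for u, v in zip(path, path[1:]))
    return path, headroom

# "What happens to Denver-to-NYC traffic if the DEN-CHI link goes down?"
print(simulate_failure(twin, ("DEN", "CHI"), "DEN", "NYC"))
# (['DEN', 'DAL', 'NYC'], 40) -> traffic survives, but capacity drops to 40 Gbps
```

The simulation runs entirely on the replica, which is what removes the cost and risk of experimenting on the live network.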
Many telcos are eager to deploy AI but are hindered by technical debt and inconsistent data. What are the critical first steps for establishing a clean, rationalized inventory, and why is this foundational work absolutely essential before attempting to implement more advanced AI capabilities?
The first step is to resist the temptation to jump straight to the sexy AI applications. The critical, non-negotiable starting point is a complete overhaul of the legacy inventory. This means committing the resources to cleaning, rationalizing, and consolidating all your network data into one unified system. It’s about creating a system of record that you can actually trust. This work is absolutely essential because AI is only as good as the data it’s trained on. Trying to deploy agents on top of messy, inconsistent data is like wanting to go scuba diving without knowing how to swim—it’s not only ineffective, it’s dangerous. You will get flawed insights and make poor decisions. You have to build that solid foundation first; only then can you start applying intelligence to simulate, automate, and optimize your operations effectively.
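One way to make “a system of record you can trust” measurable is to gate AI features behind simple data-quality audits. The sketch below is hypothetical; the required fields and the 99% threshold are assumptions, not an industry standard.

```python
# Sketch of the quality gates an operator might enforce before layering AI
# on the consolidated inventory. Thresholds and field names are assumptions.

REQUIRED_FIELDS = ("device_id", "site", "model")

def audit(records: list[dict]) -> dict[str, float]:
    """Report basic trust metrics for the unified inventory."""
    total = len(records)
    complete = sum(all(r.get(f) for f in REQUIRED_FIELDS) for r in records)
    unique_ids = len({r.get("device_id") for r in records if r.get("device_id")})
    return {
        "completeness": complete / total if total else 0.0,
        "uniqueness": unique_ids / total if total else 0.0,
    }

def ready_for_ai(metrics: dict[str, float], threshold: float = 0.99) -> bool:
    """Only unlock agentic use cases once the data clears the bar."""
    return all(score >= threshold for score in metrics.values())

inventory = [
    {"device_id": "RTR-001", "site": "Denver", "model": "ASR-9906"},
    {"device_id": "RTR-002", "site": "", "model": "MX480"},           # incomplete
    {"device_id": "RTR-001", "site": "Denver", "model": "ASR-9906"},  # duplicate
]
metrics = audit(inventory)
print(metrics, "->", "ready" if ready_for_ai(metrics) else "keep cleaning")
```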
What is your forecast for agentic AI adoption in the telecom industry over the next five years?
Over the next five years, I believe we’ll see a significant shift from pilot projects to widespread, practical deployment of agentic AI. The initial wave will be focused squarely on what we’re seeing with Lumen: solving the foundational problems of data quality and inventory management, because the industry has recognized this is the main roadblock. As more operators establish this clean data layer, we will see an explosion of use cases. We’ll move beyond simple automation to agents that can predict failures, optimize network traffic in real time, and even autonomously manage service layers. The operators who invest in their data foundation now will be the ones leading the charge, operating with an efficiency and intelligence that will be impossible for laggards to compete with. It will become a core competitive differentiator.
