Events

Experts discuss telcos’ path to AI-native benefits

January 15, 2026 | 5 minute read

Put five AI experts with a variety of perspectives on a discussion panel and the result is a lively and informative conversation about how telcos can be successful on their journey to becoming AI-native.

That’s just what happened in the AI-Native Telco Forum sponsored by Telecom TV. The panel featured experts across the telecom industry:

  • Andrew Collinson, Founder and Principal, Connective Insight
  • Danielle Rios, Acting CEO, Totogi
  • Eric van Vliet, Vice President EMEA Telecom Systems Business Sales and GTM Organization, Dell Technologies
  • Nabil Lahyani, Head of Autonomous Networks Analytics Product Line, Cloud and Networks Services, Nokia
  • Vivek Chadha, SVP and Global Head of Telco Cloud, Rakuten Symphony

This blog post is a summary of the key themes and discussions that took place during the panel.

Moving from cloud-native to AI-native

The panel started out by reframing the emerging AI-native era against the current cloud-native mindset. The goal of AI-native networks is to codify decisions into data-driven, learning systems, with humans increasingly curating and supervising rather than manually executing.

“Cloud native was essentially what’s the best way of doing deployment,” Chadha said. “What’s different in AI native? You no longer think about how do I deploy? You think of who decides, what they decide and who controls the decision.”

This shift matters because it elevates AI from an efficiency add-on to a network governance and control paradigm. This changes the locus of value from infrastructure automation to decision orchestration across product, operations, finance, and customer experience.

Three layers of AI‑native: Technology, business, people

Another way to look at being AI-native was proposed by Collinson, who views it as a three-layer stack comprising technology, business, and people. From this perspective, the technology might be the easy part, because building AI-native systems that actually change the business hinges on stakeholder buy-in.

“The real challenge in telecoms is getting change made, is getting stuff done,” he said. At this moment in the progression of AI-native networks, “… you haven’t got the natives on your side.”

He also warned that external stakeholders aren’t enamored with labels: “Nobody cares if you’re AI native - show me the money, show me the results.” The lesson: while AI-native is a goal operators want to reach, stakeholders want to see measurable business outcomes. To get there, he said, it’s important to make the “people journey” as intentional as the “tech journey.” Without that, even the best algorithms risk stalling against cultural resistance.

AI integration is important

AI-native could also be described as AI-embedded, given the tight integration required to maximize value. “We need to differentiate between AI-native and AI-assisted. AI-native means you have to start from the foundation with AI,” Lahyani said. The entire network must be driven by AI, including operations, business processes, and the organization itself.

AI, then, becomes so pervasive in the network that it should determine how telcos build, run, and improve the network, not just assist after the fact.

Quick wins first, bold transformation next

The road to AI-native is a long one, and the panel agreed that early victories are vital for momentum. Rios said, “It’s super important that you start to show quick wins so you can just say, ‘We’re doing it, it’s real,’ and people start to feel that change.”

Where to start, however, was a point of differentiation. Rios advocated beginning in a low-risk, high-reward part of the organization – and not the production network itself.

She felt that all carriers should already have AI in their customer support areas because, historically, 60% of support tickets are related to billing questions. “I wouldn’t start with the network first, I’d probably pick areas far away from that,” she said. Her first target would be the BSS, because of the impact it can have without touching the network.

Chadha stated that Rakuten Symphony’s customer AI work focuses on two broad areas: run the business (RTB), which looks for quick wins across cost, quality, and time; and change the business (CTB), which advocates for AI to drive larger transformations.

Chadha said this approach builds credibility without overreaching, allowing the telco to prove value fast, then scale thoughtfully.

Design AI adoption processes for employee engagement

During the webinar, several speakers offered advice on how to engage employees in the AI adoption process. Van Vliet tied adoption to everyday wins that can help with the mindset shift: “When you see your colleagues do something a lot faster than you did, that’s a small win and a sign that you’re going to be left behind. So, you want to get on that train.”

Chadha discussed the quarterly AI hackathons organized by Rakuten as a practical mechanism for implementing cultural change. These hackathons gain their culture-changing power when employees’ ideas are acknowledged and celebrated, elevating ownership and relevance.

Whatever mechanism is used, the speakers emphasized that it’s essential to view cultural adoption as a participatory, visible, and ongoing process, rather than a one-time event.

Data readiness is the “Holy Grail” that powers everything

Multiple speakers cautioned that data quality and observability are two factors that can make or break AI programs. Lahyani put it bluntly: “You can have the best algorithm, but if you don’t understand the data, or if you don’t trust the data, forget about AI. Rubbish in, rubbish out.”

He praised operators who take the quality, lineage, completeness, and latency of their data seriously. Framing data readiness as future resilience, van Vliet commented: “Make sure your data is AI-ready so that when the time comes, you can do it.”

Sometimes the required data is siloed in a different process and unavailable to a new algorithm, Chadha warned. Building AI at scale requires breaking down these silos so that the AI team can focus on modeling rather than data wrangling.

Paying for the journey

The panel discussed the impact of AI adoption on the bottom line and whether operators should focus on cost savings or generating new revenue.

Van Vliet argued telcos must look beyond cost savings and find revenue-generating use cases. One obvious service could be bundling connectivity with an AI/GPU-as-a-service offering.

Rios stressed that model costs are dropping sharply, making it imperative to focus on the value layer and speed of “shipping” new services.

Together, these views suggest a funding flywheel: use quick wins to free capacity/cash, invest in revenue-oriented AI products, and ride the declining cost curve.

Conclusion

To wrap up his part of the panel, Chadha reiterated the three big challenges he sees when he speaks to customers: a lack of clarity on what AI-native means for their network or business; a lack of the right skill sets, especially to scale projects; and data quality and quantity (you can have too much data). If this feels like your organization, Chadha offered this encouragement: “There’s no shame … because there isn’t a silver or a gold standard today available for anyone.”

To watch the full panel discussion, visit Telecom TV.
