The True Measure of LLM Intelligence in Telecom
Today’s enterprise AI landscape is dominated by cloud-centric solutions—powerful but often expensive, energy-intensive, and constrained by privacy and security concerns. These limitations make it difficult for traditional enterprises to adopt AI at scale. This keynote introduces Hybrid AI Agent Systems, a transformative approach that distributes AI workloads intelligently across cloud and edge environments such as on-prem clusters, PCs, and IoT devices. In this architecture, cloud AI specializes in global reasoning, planning, and multi-agent orchestration, while local AI executes tasks securely and efficiently without exposing private data. Repetitive and high-frequency tasks can be cached and executed entirely on-device, delivering lower latency, reduced cost, and improved sustainability. This hybrid collaboration enables enterprise-grade AI that is secure, responsive, energy-efficient, and easier to deploy across diverse operational environments. It opens the door for new classes of applications that combine powerful cloud intelligence with trustworthy local execution. The talk will also highlight how hybrid AI agent systems align with a broader industry movement toward a unified agentic computing layer, where users interact through a single intelligent interface while cloud and edge agents coordinate seamlessly behind the scenes. This emerging model is poised to redefine how workplaces operate and how enterprise AI is delivered across devices. Hybrid AI agent systems represent a practical, scalable path to bringing “AI Everywhere” into reality—and a foundation for the next generation of secure, sustainable, and truly intelligent enterprise solutions.
Dr. Olena (Jianfang) Zhu is Head of AI Solutions at Intel’s Client Computing Group, leading AI solution development for PC clients. She drives collaborations with global partners to build next-generation hybrid AI agents and platforms, including Intel’s AI Assistant Builder (formerly Project SuperBuilder) for multi-agent, hybrid local and cloud AI. Dr. Zhu is also an Adjunct Professor at Purdue University. She has authored 50+ papers, holds 40+ U.S. patents, and has received multiple industry awards.
6G marks a shift from connectivity-driven networks to an intelligent, adaptive digital fabric that seamlessly integrates the physical, digital, and biological worlds. Unlike previous generations focused primarily on speed and capacity, 6G is defined by native intelligence, immersive experiences, and deep integration across networks, computing, and data. At its core, 6G embeds AI into every layer of the system, enabling autonomous operation, contextual awareness, and real-time adaptation to user intent and environmental conditions. Networks evolve from passive transport platforms into active, learning systems capable of sensing, reasoning, and optimizing themselves continuously. 6G will enable new classes of applications such as immersive extended reality, holographic communication, large-scale digital twins, and environment-aware services. These experiences demand not only extreme performance, but also trust, resilience, and sustainability by design, making security and energy efficiency fundamental requirements rather than add-ons. Ultimately, 6G is not a network upgrade—it is a strategic digital platform that will reshape industries, public services, and digital economies in the decade ahead.
Dr. Achin Bhowmik is the Chief Technology Officer and Executive Vice President of Engineering at Starkey, a global leader in hearing technology. He leads the company’s initiatives to transform hearing aids into multifunctional, AI-powered communication and health devices. Dr. Bhowmik is an adjunct professor at Stanford University School of Medicine and an affiliate faculty member of the Stanford Institute for Human-Centered Artificial Intelligence and the Wu Tsai Neurosciences Institute. Before joining Starkey, he served as Vice President and General Manager of Perceptual Computing at Intel, where he led pioneering work in 3D sensing, computer vision, and interactive computing devices. Dr. Bhowmik is a Fellow of IEEE, SID, AAIA, and AIIA, and serves on the boards of RealSense, Mojo Vision, Astranu, and the National Captioning Institute. He has authored more than 200 publications, including three books, and over 80 patents worldwide. His work has been recognized with numerous honors, including TIME’s Best Inventions, the Red Dot Design Award, the Artificial Intelligence Excellence Award, and the Gold Globee Award for Most Innovative Person in Healthcare.
Transforming Hearing Aids into Multifunctional Communication and Health Devices with Artificial Intelligence
Over 1.5 billion people worldwide live with hearing loss, making it one of the most significant global health challenges of our time. Beyond impaired communication, untreated hearing loss is associated with increased risks of dementia, depression, social isolation, and falls. Traditional hearing aids, however, have been limited by stigma and a narrow focus on sound amplification. This keynote explores how embedded sensing and advances in artificial intelligence are transforming hearing aids into powerful, multifunctional wearable devices. These nearly invisible systems now incorporate deep neural networks for speech enhancement, noise suppression, and spatial awareness; stream media; monitor physical and cognitive health; detect falls and send alerts; translate languages; transcribe conversations; and serve as personal AI assistants. Enabled by breakthroughs in ultra-low-power computing and wireless connectivity, these devices illustrate how disruptive innovation in consumer technology can profoundly elevate human capabilities. This talk will offer a forward-looking view into the future of wearable AI, health technology, and human augmentation.
Adlen Ksentini is a full professor in the Communication Systems Department of EURECOM. He leads the Netsoft group (23 people), which actively contributes to the network softwarization of 5G and 6G. Adlen Ksentini's research interests are network softwarization and network cloudification, focusing on topics related to ML and AI for 5G and 6G networks. He has participated in several H2020 and Horizon Europe projects on 5G and beyond, such as 5G!Pagoda, 5GTransformer, 5G!Drones, MonB5G, ImagineB5G, 6GBricks, 6G-Intense, Sunrise-6G, AC3, Flecon-6G, and 6G-DALI. He is the technical manager of Flecon-6G, 6G-Intense, and AC3, covering zero-touch management of 6G resources and applications and the Cloud-Edge Continuum. He is interested in system and architectural issues, as well as algorithmic problems related to those topics, using optimization algorithms and ML. Adlen Ksentini has given several tutorials at IEEE international conferences, including IEEE Globecom 2015, IEEE CCNC 2017/2018/2023, IEEE ICC 2017, IEEE/IFIP IM 2017, and IEEE School 2019. Adlen Ksentini received best paper awards from IEEE GlobeCom 2025, IEEE IWCMC 2016, IEEE ICC 2012, and ACM MSWiM 2005. He was awarded the 2017 IEEE ComSoc Fred W. Ellersick Prize (best paper in IEEE Communications Magazine).
Intent-Based Networking (IBN) represents a fundamental shift in how networks are designed, operated, and managed, enabling stakeholders to express what the network should achieve rather than how it should be configured. By allowing service owners to specify high-level goals and constraints, IBN lays the foundation for truly autonomous network management. In this direction, standardization efforts such as those led by TM Forum have introduced intent-driven architectures that pave the way toward self-managing networks in the 6G era. Despite this progress, today’s intent frameworks largely rely on declarative formats such as JSON or YAML, which still require detailed knowledge of intent models and northbound interfaces. This creates a cognitive and operational gap between human intent and network behavior, limiting the accessibility and agility promised by autonomous networking. A natural and transformative evolution of IBN is to elevate intent expression to natural language. In this keynote, we present an LLM-centric vision for intent-based management in next-generation networks, where Large Language Models (LLMs) serve as semantic bridges between human objectives and network operations. By translating natural-language intents into operational policies and configurations, LLMs can dramatically simplify service deployment and management. We further discuss how few-shot learning and human-in-the-loop feedback enable continuous adaptation, accountability, and trust, positioning LLMs as key enablers of cognitive, goal-driven, and fully autonomous 6G networks.
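The translation step the abstract describes—mapping a natural-language intent onto a declarative intent object—can be illustrated with a minimal sketch. Here a toy rule-based `translate_intent` function stands in for the LLM; the function name, the JSON field names, and the regex patterns are illustrative assumptions, not part of any TM Forum intent model. A real system would prompt an LLM (with few-shot examples) and validate its output against the actual intent schema before handing it to the northbound interface.

```python
import json
import re

def translate_intent(utterance: str) -> dict:
    """Toy stand-in for an LLM-based intent translator: extracts latency
    and throughput goals from a natural-language utterance and emits a
    declarative, intent-style JSON object (illustrative schema only)."""
    intent = {"intentType": "ServiceIntent", "expectations": []}
    m = re.search(r"latency (?:below|under) (\d+)\s*ms", utterance, re.I)
    if m:
        intent["expectations"].append(
            {"target": "latency", "condition": "lessThan",
             "value": int(m.group(1)), "unit": "ms"})
    m = re.search(r"(\d+)\s*Mbps", utterance, re.I)
    if m:
        intent["expectations"].append(
            {"target": "throughput", "condition": "greaterThan",
             "value": int(m.group(1)), "unit": "Mbps"})
    return intent

policy = translate_intent(
    "Deploy a video slice with latency below 20 ms and at least 100 Mbps")
print(json.dumps(policy, indent=2))
```

In the LLM-centric vision, the deterministic rules above are replaced by a model call, and the human-in-the-loop step reviews the generated JSON before it is applied, which is where accountability and trust enter the loop.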
CTSoc Administrator: Charlotte Kobert, charlotte.kobert@ieee.org