
Author: Ranny Haiby, CTO Networking, Edge, and Access, LFN

Reflecting on the core insights from the latest “Trends – Artificial Intelligence” report by Mary Meeker et al. at BOND. The report is a comprehensive look at the foundational trends shaping the AI landscape, confirming what many of us are experiencing: the pace of change is truly unprecedented. While the entire report is packed with critical data, the section on AI Model Compute Costs is a key takeaway for anyone in tech leadership.

The report underscores a fundamental dynamic shaping the AI landscape:

  • AI Model Training Costs are High and Rising: Training the most powerful large language models (LLMs) has become an extraordinarily expensive and capital-intensive endeavor.
  • AI Inference Costs Per Token are Falling Rapidly: In contrast to training, the cost of running models at scale in real-time (inference) is on a sharply declining curve.

This divergence creates a fascinating dynamic. While the barrier to training leading-edge models remains incredibly high due to ballooning compute costs, the plummeting cost of inference significantly lowers the barrier to deploying and using AI at scale. The challenge is even greater for industry verticals such as networking and energy, where general-purpose LLMs are inefficient and lack domain-specific knowledge. These verticals require domain-specific AI models and applications, forcing them to bear the cost of development themselves.
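The divergence can be made concrete with a toy model. The growth and decline rates below are hypothetical placeholders chosen purely for illustration, not figures taken from the BOND report:

```python
def projected_costs(years, train_cost_0, train_growth, infer_cost_0, infer_decline):
    """Toy model of the cost divergence: training cost compounds upward
    each year while inference cost per million tokens compounds downward.
    All inputs are illustrative assumptions, not report data."""
    training = [train_cost_0 * (1 + train_growth) ** y for y in range(years)]
    inference = [infer_cost_0 * (1 - infer_decline) ** y for y in range(years)]
    return training, inference

# Assumed rates for illustration only: training cost +100%/yr,
# inference cost per million tokens -75%/yr.
train, infer = projected_costs(4, 100e6, 1.0, 10.0, 0.75)
```

With these placeholder rates, training cost grows eightfold over three years while per-token inference cost falls by roughly 98%, mirroring the directional trend the report describes: the barrier to building frontier models rises even as the barrier to deploying them collapses.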

AI development cost could be a significant hurdle for many organizations, but open source software collaboration offers a powerful antidote. By embracing an open source model, the AI community can collectively share the burden of developing the fundamental, common components that underpin nearly all AI applications. Imagine the efficiency gains when resources aren’t duplicated across countless proprietary projects, but instead channeled into a shared, robust foundation. This collaborative approach not only democratizes access to cutting-edge AI tools but also harnesses the immense power of a global community. When a critical mass of innovators—from individual developers to academic institutions and large enterprises—contribute to a shared ecosystem, the pace of innovation accelerates dramatically, leading to more cost-effective, powerful, and accessible AI for everyone.

At Linux Foundation Networking, we believe open source initiatives present an opportunity for the telecom industry to efficiently develop the models and applications required to build the AI-native networks of the future. They address both the cost of model development and the complexity of building AI applications.

Here are just a few of the initiatives to efficiently develop AI capabilities using open collaboration and open source software:

  • Open source projects like the recently launched Essedum provide a set of reusable building blocks for AI in networking, reducing the overhead cost of building networking AI applications.
  • Cross-industry alliances like the AI-RAN Alliance provide a platform for exchanging use cases and collaborating on a common architecture.
  • Initiatives like the GSMA LLM Benchmarking help avoid the cost of creating bespoke telco models by helping model consumers identify the right model for their needs from the set of existing ones.

The lightning-fast evolution of AI technology, from groundbreaking new models to novel training techniques, presents a unique challenge for any development paradigm. Open source communities, with their inherent agility and collaborative spirit, are adapting quickly to this rapid pace. While proprietary solutions often struggle with the inertia of centralized development and release cycles, the decentralized nature of open source projects allows for near-instantaneous adaptation and integration of the latest advancements. This inherent dynamism, where contributions can come from anywhere, new ideas can be prototyped and iterated upon at speed, and code can be scrutinized and improved by a global collective, makes open source the optimal platform for not just keeping pace with AI trends, but actively driving them.

How are you and your organization avoiding the high cost of model building? Come share your approaches with us in projects like Essedum or the 5G-Super-Blueprint.
