Legacy cloud platforms have proved invaluable in training AI models with vast volumes of data. But as organizations want to further scale and optimize AI applications, legacy platforms are quickly showing their limitations.
In a Forrester study commissioned by Akamai, we uncovered the issues that IT leaders and developers are experiencing with legacy cloud platforms. Many agree that moving compute resources to edge computing will help to reduce latency, bandwidth usage, and cost in the era of AI inference.
As AI training paves the way for AI inference, we need solutions that can cope with increased workloads and compute needs without harming performance and the end user experience.
What the Shift to AI Inference Means for Application Development
With AI’s compute needs poised to shift to inference rather than just training, what does this mean for developers in AI application development?
According to Jack Gold, President and Principal Analyst at J.Gold Associates, the balance between training and inference is currently at 80% training and 20% inference. He predicts that this will swap in the next three years, with compute resources focused on 80% inference and just 20% training.

AI inference is the process of using trained models to draw conclusions and results from new data. As more models are trained with huge datasets, the next stage is using those models to cope with new information to scale the capabilities of AI solutions.
This, paired with user expectations of increased personalization, high performance, and real-time data usage in applications, means that current cloud solutions must adapt.
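To make the distinction concrete, the sketch below contrasts inference with training: the model's parameters are already fixed, and serving a prediction is just a cheap forward pass over new data. The weights and feature values are illustrative, not from any real model.

```python
import math

# Parameters produced by a (hypothetical) earlier training run on a
# large dataset. During inference they are frozen -- no learning occurs.
WEIGHTS = [0.8, -0.3]
BIAS = 0.1

def predict(features):
    """Score a new, unseen example with the pre-trained model."""
    z = sum(w * x for w, x in zip(WEIGHTS, features)) + BIAS
    return 1.0 / (1.0 + math.exp(-z))  # sigmoid -> probability in (0, 1)

# Inference: a single forward pass over fresh data.
score = predict([2.0, 1.5])
```

The work per request is tiny compared with training, but at scale these forward passes dominate compute, which is why latency and proximity to users start to matter.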
Why Core Cloud Architecture Can’t Keep Up
The goal of AI inference is to deliver real-time results and a more seamless user experience in AI applications. However, legacy cloud solutions face three key challenges that make it difficult to scale and optimize with AI.
In the study, we revealed that the top challenges developers and IT leaders are struggling with are:
- Latency issues (56% face processing delays)
- Cost problems (60% struggle with storage/processing costs)
- Scaling difficulties (45% have trouble scaling)
“A majority of survey respondents said their current cloud strategies hinder their ability to build real-time, data-driven decision-making into their applications.”
Core cloud solutions are already struggling to keep up with the requirements of AI scaling and optimization. So is there a more effective solution?
Edge computing brings data processing closer to the end user, reducing latency issues and improving performance, allowing developers to scale applications more effectively.
However, even this has its challenges, including compliance and regulation concerns, vendor lock-in, and difficulties hiring and retaining technical talent with the relevant skills. Edge computing typically uses different architectural approaches, specialty protocols, and different vendors, causing operational complexities and increased business risk.
Distributed Cloud Computing as the Next Stage of AI Application Development
An alternative solution that’s quickly gaining traction among IT leaders is distributed cloud computing. More than half of the respondents in this study said that they were currently self-managing some form of distributed architecture.

Most said that their current cloud strategies are hindering their ability to build real-time, data-driven decision-making into their applications due to the costs of additional storage and processing power and processing delays caused by high latency.
Like edge computing, distributed cloud architecture allows developers to serve data from locations closer to users, improving the latency and performance of applications, cloud databases, streaming media, and other workloads.
The difference is that while part of a distributed network, this form of cloud computing can include a variety of different locations for data including third-party data centers, on-premises locations, and private cloud locations.
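One way to picture this is a simple routing decision across those location types: given measured latencies from a user to each candidate site, serve from the closest one. The location names and latency figures below are hypothetical; real distributed cloud platforms would use measured telemetry, DNS steering, or anycast rather than a lookup like this.

```python
# Hypothetical candidate locations in a distributed cloud deployment,
# mapped to measured round-trip latency (ms) from a given user.
LOCATIONS = {
    "third-party-dc-east": 42.0,
    "on-prem-hq": 18.0,
    "private-cloud-west": 95.0,
}

def nearest_location(latencies_ms):
    """Pick the serving location with the lowest measured latency."""
    return min(latencies_ms, key=latencies_ms.get)

best = nearest_location(LOCATIONS)
```

The point is not the routing logic itself but that all three location types participate in one network under central management, so the same selection applies uniformly.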
As we drive toward AI inference, more applications and scaling opportunities will emerge and with it, the need for more efficient ways of handling data.
Application users will expect fast results and a seamless experience. By investing in distributed architectures, developers can meet the demands for real-time data processing and inference in modern applications and ensure their company’s competitive edge with users.
How Distributed Cloud Computing Mirrors Business Goals
We found that the flexibility to deploy compute resources closer to end users while also maintaining central management and control addressed some of the top concerns that our survey respondents have, including:
- 55% said it would address fears of increased business risk
- 54% said it would help to protect against increased runaway costs
- 49% said it would improve slow time-to-value
Distributed cloud architecture outperforms edge computing in that developers can deploy and manage AI applications using the same skill sets, architectural paradigm, and cloud primitives they already use in core cloud computing.
On top of that, distributed cloud provides reduced latency, better scalability, improved reliability, and greater overall control over data when building applications.
Is It Better to Switch Sooner Rather Than Later?
Akamai’s study with Forrester found that IT leaders and application developers are prioritizing goals such as innovating around AI, modernizing digital experiences, personalizing experiences, and improving data management. Distributed cloud’s advantages therefore tie directly into what most developers are seeking.

Based on predictions surrounding AI inference and a rising trend toward distributed cloud setups, switching to a distributed approach seems an obvious choice. Making the move sooner rather than later could help you remain competitive as the market and AI technology evolve.
To learn more from our report on the future of distributed cloud versus edge computing and core cloud architectures, download the full report today.
* In this Forrester Consulting study commissioned by Akamai, 163 North America-based cloud strategy decision-makers from the retail, media, and telecommunications industries were asked for their thoughts on the current state of cloud computing.