With AI spending predicted to reach $632 billion by 2028, a key question arises: how well prepared are current cloud platforms to manage and deploy AI applications with compute-intensive workloads?
In a Forrester Consulting study, Distributed Cloud: Taking AI to the Edge, commissioned by Akamai, cloud strategy decision-makers were asked for their thoughts on the current state of cloud computing.
Three key themes emerged:
- Developers relying on legacy cloud platforms are seeing cost, performance, and latency issues that make it hard to scale and manage applications.
- Edge computing is growing in popularity thanks to its performance improvements from moving resources physically closer to users – but this too has its own complexities.
- The use of distributed cloud solutions is rising with developers seeing enhanced performance, low latency, and the flexibility to scale.
The study revealed that the way AI application developers use cloud solutions is changing, with a growing focus on distributed and edge technologies that improve user experience and real-time performance and make personalization more achievable for AI applications. In fact, 76% of respondents said distributed cloud solutions are critical to meeting organizational goals.

For a summary of the key findings, keep reading or download the full report today.
The Problems with Legacy Cloud Platforms
Centralized legacy cloud platforms are still widely used today, but developers are running into a common problem: these platforms make it harder to build real-time, data-driven decision-making into applications.
With modern application development reliant on real-time, data-intensive delivery, centralized cloud architecture falls short of what developers need.
The biggest issues facing respondents in our survey included:
- Cost of additional storage and processing power (60%)
- Processing delays caused by high latency (56%)
- Performance and latency limitations (48%)
- Security concerns (48%)
- Difficulty scaling compute to meet demand (45%)
Increasingly, developers are opting for edge computing and distributed cloud solutions instead.
Is Edge AI the Future?
With AI compute poised to shift from training in the core cloud to inference over the next few years, edge computing is predicted to take center stage. It will help drive the next generation of real-time, personalized applications powered by AI and make low-latency, compute-intensive workloads the norm.
“Moving compute closer to end users is essential for the next wave of digital transformation and real-time, data-intensive, global application development and delivery needs.”
As AI shifts toward inference, compute challenges such as latency, bandwidth usage, and high cost will need a long-term solution. The consensus in the developer community is that moving applications to the edge will help reduce those performance and latency issues.
The problem? Edge computing has limitations of its own beyond performance. Survey respondents highlighted compliance and regulatory responsibilities as the top concern, followed by vendor lock-in and the wider challenge of hiring and retaining talent.
Why Developers Are Turning to Distributed Cloud Solutions
The switch to edge computing is heading in the right direction, but with its operational and business complexities, will developers be content to rely on the edge?
Another option, distributed cloud architecture, offers the benefits of edge computing while making the practical work of managing AI applications easier and more flexible.
Distributed cloud can span a range of infrastructure, including on-premises locations, private clouds, third-party data centers, and colocation facilities. Managing all of these from a central control plane gives teams more control over where data is located.
Why developers are using distributed cloud services
- Reduced latency & increased application responsiveness
- Increased visibility through a single console
- Greater scalability without the cost of building new data centers
- Improved reliability and backup services if one location goes offline
The Distributed Cloud: Taking AI to the Edge report revealed that, while not without limitations, distributed cloud platforms closely align with the top priorities of organizations today:
- 77% want to promote innovation or innovate with AI
- 69% want to modernize digital experiences
- 65% want to improve data management and decisioning
- 62% want to deliver personalized experiences
- 58% want to improve security
Many organizations have already embraced distributed architecture, with 88% of survey respondents running six or more workloads across multiple locations.
Discover More Insights
The future of AI-powered applications is a move toward highly personalized experiences, real-time results, and real-time data processing. To make that a reality, developers need a data solution that makes management and deployment seamless while ensuring a great end-user experience.
To keep your applications competitive for users and efficient for your team, distributed cloud seems a clear choice.
Read the full report to learn more about how developers across various industries are approaching application development today.
* In this Forrester Consulting study commissioned by Akamai, 163 North America-based cloud strategy decision-makers from the retail, media, and telecommunications industries were asked for their thoughts on the current state of cloud computing.