Akamai and Neural Magic team up to accelerate AI workloads on edge CPU servers – SiliconANGLE



Content delivery network and cloud computing company Akamai Technologies Inc. today announced it’s partnering with the artificial intelligence acceleration software firm Neural Magic Inc. to bolster the AI processing capabilities of its cloud infrastructure platform.

The company said it intends to “supercharge” its deep learning capabilities by leveraging Neural Magic’s unique software, which enables AI workloads to be run more efficiently on traditional central processing unit-based servers, as opposed to more advanced hardware powered by graphics processing units.

Neural Magic has been elevated to the status of Akamai Qualified Computing Partner, which means its software is being made available directly on Akamai’s highly distributed cloud infrastructure platform.

Neural Magic’s software makes it possible for developers to run deep learning models on cost-effective CPU servers with almost the same performance as those that run on much more expensive and hard-to-come-by GPU servers. Its software achieves this by accelerating AI workloads using automated model sparsification techniques, which are made available as a CPU inference engine.
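To give a flavor of the idea, the toy sketch below shows magnitude-based pruning, one common sparsification technique: the smallest-magnitude weights are zeroed out, and inference then skips the zeros entirely, so a CPU does proportionally less arithmetic. This is only an illustration of the general concept, not Neural Magic's actual engine, which applies far more sophisticated optimizations; all function names here are hypothetical.

```python
def prune_by_magnitude(weights, sparsity):
    """Zero out the smallest-magnitude fraction of weights (toy example)."""
    flat = sorted(abs(w) for row in weights for w in row)
    k = int(len(flat) * sparsity)
    threshold = flat[k - 1] if k > 0 else float("-inf")
    return [[w if abs(w) > threshold else 0.0 for w in row] for row in weights]


def sparse_matvec(weights, x):
    """Matrix-vector product that skips zeroed weights -- the sparser the
    matrix, the less work the CPU has to do per output."""
    return [sum(w * xi for w, xi in zip(row, x) if w != 0.0) for row in weights]


# Prune half the weights of a tiny 2x3 layer, then run inference on it.
W = [[0.9, 0.01, -0.5],
     [0.02, -0.8, 0.03]]
W_sparse = prune_by_magnitude(W, sparsity=0.5)
y = sparse_matvec(W_sparse, [1.0, 1.0, 1.0])
```

In real systems the win comes from pairing sparsity like this with CPU-friendly execution (vectorized kernels, cache-aware scheduling), which is the role a dedicated CPU inference engine plays.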

It means companies now have a way to deploy advanced, high-performance AI workloads across Akamai’s distributed cloud computing infrastructure, which delivers some of the lowest latencies in the business for cloud-based applications.

Akamai made its name as a CDN provider, enabling organizations to host their content in hundreds of globally distributed “edge” locations and ensure rapid delivery. In 2022 it transitioned to the cloud computing industry when it acquired a company called Linode LLC for $900 million.

Following that acquisition, it announced its Connected Cloud infrastructure platform, aimed at smaller companies and solo developers, claiming it to be the most widely distributed cloud infrastructure platform in the industry. Akamai’s claim stems from the fact that it can leverage its global CDN network to host its cloud infrastructure in hundreds of different locations across the globe, giving it an even more extensive reach than that of traditional cloud providers such as Amazon Web Services Inc. and Microsoft Azure.

The company has since expanded on that, launching an initiative called Gecko, for Generalized Edge Compute, in February. It extends its compute and storage capabilities to the edge of the network, allowing it to deliver even more responsive workloads with some of the lowest latencies available.

John O’Hara, Neural Magic’s senior vice president of engineering and chief operating officer, said that delivering AI models efficiently at the network edge is much more challenging than most people realize. Yet that is precisely what the two companies are doing, he explained. “Specialized or expensive hardware and associated power and delivery requirements are not always available or feasible,” he said, meaning that many organizations have been forced to miss out on the benefits of running AI inference at the edge.

Akamai said the partnership with Neural Magic aims to change that, and gives it yet another advantage: Its traditional cloud infrastructure rivals’ AI efforts are mostly focused on GPU-based servers, which are in high demand from customers and therefore incredibly expensive.

CPU-based servers are much more readily available, and Neural Magic’s software will supposedly make them a more viable option for various AI applications. The company said the combination is particularly well-suited for AI applications that need to process data generated at the network edge, as its distributed infrastructure is uniquely suited to this.

Akamai chief strategist Ramanath Iyer said the company intends to make AI smarter and faster without the spiraling costs associated with GPU-based infrastructures. “Scaling Neural Magic’s capabilities to run deep learning inference models across Akamai gives organizations access to much-needed cost efficiencies and higher performance,” he said.

According to Holger Mueller of Constellation Research Inc., the partnership announcement can be taken as a sign that Akamai’s transition to offering edge-based cloud infrastructure is paying off. “It’s boosting its platform by offering customers more advanced edge capabilities for AI and that will appeal,” he said. “For customers, it’s great news, because down the road they will likely have a lot of choice in terms of edge-based AI infrastructure. Akamai is getting there early, though, and it’s an exciting time to be one of its customers.”

Neural Magic Chief Executive Brian Stevens appeared on theCUBE, SiliconANGLE Media’s mobile livestreaming studio, in February 2023 to discuss in depth how the company optimizes deep learning models to make them run more efficiently on CPU-based hardware.
