
5th Gen Intel Xeon: AI Acceleration in Every Core for Enhanced Performance

Category: IaaS Hosting
Published: December 19, 2023

News Summary

Intel’s ‘AI Everywhere’ launch debuts 5th Gen Xeon processors with AI acceleration, promising significant performance boosts and efficiency in AI applications.



Photo: Sandra Rivera, Intel Executive Vice President and General Manager of the Data Center and AI Group, displays a 5th Gen Intel Xeon processor. (Credit: Intel Corporation)

Intel has unveiled a new portfolio of AI technologies at its ‘AI Everywhere’ launch in New York City, aimed at supporting customers’ AI solutions across the network, edge, data center, and cloud. Intel CEO Pat Gelsinger presented the company’s wide-ranging AI portfolio, spanning networks, volume client devices, cloud and enterprise servers, and pervasive edge environments.

Additionally, Pat Gelsinger restated Intel’s commitment to delivering five new process technology nodes within the next four years.

More Powerful AI for the Data Center, Cloud, Network, and Edge

The 5th Gen Intel Xeon processor family would deliver a notable improvement in both performance and efficiency. Compared with the previous Xeon generation, the processors would provide a 21% average gain in general compute performance and 36% better average performance per watt across a range of customer workloads. Customers upgrading from even earlier generations and following a typical five-year refresh cycle may reduce their total cost of ownership by up to 77%, according to Intel.

With up to 42% better inference and fine-tuning performance on models of up to 20 billion parameters, the new 5th Gen Xeon would be the first mainstream data center CPU with AI acceleration built into every core. According to Intel, it also consistently delivers solid benchmark results for MLPerf training and inference.
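For context, the built-in acceleration in these processors (Intel AMX) is typically reached through bfloat16 or int8 code paths in standard frameworks rather than programmed directly. The minimal sketch below, assuming PyTorch is installed and the CPU exposes AMX, runs a toy model under bf16 autocast so PyTorch's oneDNN backend can dispatch the matrix multiplications to the AMX tile engine; the model and tensor shapes are illustrative placeholders, not Intel's benchmark workloads.

```python
import torch
from torch import nn

# Toy stand-in model (illustrative only, not a 20-billion-parameter workload).
model = nn.Sequential(
    nn.Linear(1024, 4096),
    nn.GELU(),
    nn.Linear(4096, 1024),
).eval()

x = torch.randn(8, 1024)

# Under bf16 autocast on CPU, eligible matrix multiplications can be routed
# by oneDNN to AMX tiles on 4th/5th Gen Xeon parts.
with torch.no_grad(), torch.autocast(device_type="cpu", dtype=torch.bfloat16):
    y = model(x)

print(y.shape, y.dtype)
```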

Thanks to Xeon's built-in AI accelerators, optimized software, and improved telemetry capabilities, communication service providers, content delivery networks (CDNs), and a wide range of vertical markets, including retail, healthcare, and manufacturing, can now deploy demanding network and edge workloads with greater efficiency and manageability.

Performance of 5th Gen Intel Xeon Processors

IBM also revealed at the event that, in its testing, 5th Gen Intel Xeon processors delivered up to 2.7 times better query performance on the watsonx.data platform than prior-generation Xeon processors. Google Cloud, which plans to roll out 5th Gen Xeon next year, said that Palo Alto Networks doubled the performance of its threat detection deep learning models by using the built-in acceleration in 4th Gen Xeon on Google Cloud. And to reduce cost and latency for its AI-based game Proxi, independent game developer Gallium Studios used Numenta's AI platform running on Xeon processors to improve inference performance by 6.5 times compared with a GPU-based cloud instance.

This kind of performance would open the door to new uses for sophisticated AI, not only in the cloud and the data center but also across global networks and at the edge.

Intel Core Ultra and 5th Gen Xeon may well end up in unexpected places, Intel stated. Imagine a production floor that detects quality and safety issues at the source, an ultrasound that sees what human eyes would miss, a power system that precisely regulates energy, and a restaurant that tailors its menu to your budget and dietary requirements.

These edge computing use cases sit in the fastest-growing segment of computing, where AI is the fastest-growing workload. The segment is expected to grow into a $445 billion worldwide market by the end of the decade, with edge and client devices driving 1.4 times more demand for inference than the data center.

Customers will often use a combination of AI solutions, Intel stated. Zoom, for example, runs AI workloads on Intel Core-based client systems and Intel Xeon-based cloud instances within its all-in-one communications and collaboration platform to deliver the best user experience and financial results. Zoom uses AI to generate email and meeting summaries, mute a neighbor's barking dog, and blur a messy home office.

To make AI hardware as accessible and easy to use as possible, Intel incorporates optimizations into the AI frameworks developers already use (such as PyTorch and TensorFlow) and provides foundational libraries through oneAPI, so that software is highly performant and portable across different hardware types.
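As an illustration of what such framework-level optimizations look like in practice, the sketch below uses Intel Extension for PyTorch, an Intel-maintained add-on assumed to be installed separately, to apply operator and dtype optimizations to a model before inference; the model itself is a placeholder.

```python
import torch
from torch import nn
import intel_extension_for_pytorch as ipex  # assumes the extension is installed

# Placeholder model; in practice this would be a real workload.
model = nn.Sequential(nn.Linear(512, 512), nn.ReLU(), nn.Linear(512, 10)).eval()

# ipex.optimize applies operator-level optimizations and, with dtype=torch.bfloat16,
# prepares the model for bf16 execution on Xeon CPUs.
model = ipex.optimize(model, dtype=torch.bfloat16)

x = torch.randn(4, 512)
with torch.no_grad(), torch.autocast(device_type="cpu", dtype=torch.bfloat16):
    out = model(x)
print(out.shape)
```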

Cutting-edge development tools, such as Intel's OpenVINO toolkit and oneAPI, would enable developers to swiftly create, tune, and implement AI models on a range of inference targets by using hardware acceleration for AI workloads and solutions.
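As a rough sketch of that workflow, assuming the openvino Python package is installed and a model has already been exported to OpenVINO IR (the "model.xml" path and input shape below are placeholders), loading and running a model on a CPU inference target looks roughly like this:

```python
import numpy as np
import openvino as ov

core = ov.Core()

# "model.xml" is a placeholder for a model previously converted to OpenVINO IR.
model = core.read_model("model.xml")

# Compile for a specific inference target; "CPU" could be swapped for another device.
compiled = core.compile_model(model, device_name="CPU")

# Dummy input matching an assumed image-classification input shape.
input_tensor = np.random.rand(1, 3, 224, 224).astype(np.float32)

# Run inference; results are keyed by the model's output nodes.
results = compiled([input_tensor])
print(list(results.values())[0].shape)
```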

Intel Gaudi3 AI Accelerator: A Sneak Peek

CEO Pat Gelsinger also gave an update on Intel Gaudi3, the company's next-generation AI accelerator for deep learning and large-scale generative AI models, which is set to launch next year and was shown publicly for the first time. The combination of “expanding and proven performance advantages with very competitive price and total cost of ownership (TCO)” has allowed Intel to quickly grow its Gaudi pipeline, according to the company. Intel anticipates that its Gaudi-led suite of AI accelerators will capture a greater share of the accelerator market in 2024 due to the growing demand for generative AI solutions.







