Chipmaker Intel has launched its 3rd Gen Intel Xeon Scalable processors, also known as Cooper Lake, as new additions to its hardware and software artificial intelligence (AI) portfolio. The additions will enable customers to accelerate the development and use of AI and analytics workloads running in the data center, network and intelligent-edge environments.
The Santa Clara, California-based company announced the new processors alongside other data center products, including second-generation Optane persistent memory, new Intel SSDs and a new AI-focused FPGA.
The processor is designed for deep learning, virtual machine (VM) density, in-memory database, mission-critical applications and analytics-intensive workloads, the company said in a statement.
The processor is the industry’s first mainstream server processor with built-in bfloat16 support. bfloat16 is a compact numeric format that uses half the bits of today’s FP32 format while achieving comparable model accuracy with minimal software changes, the company said.
The addition of bfloat16 support accelerates both AI training and inference on the CPU, the statement added.
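To illustrate why bfloat16 halves storage with little accuracy loss: it keeps FP32’s sign bit and all 8 exponent bits (so dynamic range is preserved) but truncates the 23-bit mantissa to 7 bits. The sketch below is an illustrative software conversion using only Python’s standard library, not Intel’s hardware implementation; the round-to-nearest-even behaviour is an assumption about typical hardware rounding.

```python
import struct

def float_to_bfloat16_bits(x: float) -> int:
    """Pack a value as IEEE FP32, then keep only the top 16 bits.

    bfloat16 retains FP32's sign bit and 8 exponent bits but shortens
    the mantissa from 23 bits to 7, halving storage while preserving
    dynamic range. Rounding here is round-to-nearest-even, assumed to
    match common hardware behaviour.
    """
    (bits,) = struct.unpack("<I", struct.pack("<f", x))
    # Bias the low 16 bits so that truncation rounds to nearest even.
    rounding_bias = 0x7FFF + ((bits >> 16) & 1)
    return ((bits + rounding_bias) >> 16) & 0xFFFF

def bfloat16_bits_to_float(b: int) -> float:
    """Expand 16 bfloat16 bits back to FP32 (low mantissa bits = 0)."""
    (x,) = struct.unpack("<f", struct.pack("<I", b << 16))
    return x

# Round-tripping shows the reduced precision: values survive to roughly
# 2-3 significant decimal digits, often sufficient for deep learning.
for v in [1.0, 3.140625, 0.1, 1e30]:
    rt = bfloat16_bits_to_float(float_to_bfloat16_bits(v))
    print(f"{v!r} -> {rt!r}")
```

Because the exponent field is untouched, very large and very small magnitudes convert without overflow or underflow, which is the main reason bfloat16 needs fewer software changes than FP16 for training.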
According to the company, the new processors will make AI inference and training more deployable on general-purpose CPUs for applications that include image classification, recommendation engines, speech recognition and language modeling.
“The ability to rapidly deploy AI and data analytics is essential for today’s businesses. We remain committed to enhancing built-in AI acceleration and software optimizations within the processor that powers the world’s data center and edge solutions, as well as delivering an unmatched silicon foundation to unleash insight from data,” said Lisa Spelman, Intel corporate vice president and general manager of the Xeon and Memory Group.
The launch comes at a time when AI and analytics are opening up new opportunities for customers across a broad range of industries, including finance, healthcare, industrial, telecom and transportation.
Massachusetts-based market research firm International Data Corporation (IDC) has predicted that these trends will shape AI investment by organisations in India in 2020 and beyond.
The firm predicts that by 2021, 75% of commercial enterprise apps will use AI. And by 2025, IDC estimates that roughly a quarter of all data generated will be created in real time, with various internet of things (IoT) devices creating 95% of that volume growth.
Some of the other key launches include Intel Optane persistent memory, new Intel 3D NAND SSDs, and Intel AI-optimized FPGA.
The new Intel Optane persistent memory 200 series provides up to 4.5TB of memory per socket to manage data-intensive workloads, such as in-memory databases, dense virtualization, analytics and high-powered computing.
The Intel 3D NAND SSDs, meanwhile, are built with Intel’s triple-level cell (TLC) 3D NAND technology and an all-new low-latency PCIe controller to meet the intense IO requirements of AI and analytics workloads, with advanced features to improve IT efficiency and data security.
Intel’s first AI-optimized FPGAs are targeted for high-bandwidth, low-latency AI acceleration. These FPGAs will offer customers customizable, reconfigurable and scalable AI acceleration for compute-demanding applications such as natural language processing and fraud detection.