Computex 2023: Chip firms unveil new products for next-gen AI applications

Computex, an annual computer trade show held in Taiwan, returned as an in-person event this week (May 30 to June 2) after a three-year hiatus, with a focus on artificial intelligence (AI) as several firms announced hardware to support next-generation AI applications.

Top chip and PC firms including Nvidia, Arm, Qualcomm, Intel, MediaTek, MSI and Acer participated in the tech show to talk about new products and partnerships. Here are some of the key announcements made so far in the enterprise space.

Nvidia announces new AI supercomputer
Nvidia unveiled a new AI supercomputer called DGX GH200 at the Computex trade show. The supercomputer is powered by the new Grace Hopper Superchip and is designed to support the development of large language models (LLMs), recommender systems, and data analytics workloads. It can help tech companies develop successors to ChatGPT, an LLM-based chatbot developed by OpenAI.

“DGX GH200 AI supercomputers integrate Nvidia’s most advanced accelerated computing and networking technologies to expand the frontier of AI,” said Jensen Huang, CEO of Nvidia.

The DGX GH200 has a shared memory space of 144TB, nearly 500 times more than the previous generation of DGX systems. Nvidia claims it is 2.2x faster than a previous-generation DGX H100 cluster at GPT-3 training.

Currently, Microsoft, Meta, and Google Cloud have access to the new supercomputer, the company said.

Qualcomm to focus on chips with AI capabilities
Qualcomm said it is expanding its focus from providing chips for communications devices to supporting AI workloads. The company is transitioning into an "intelligent edge computing" firm, said Alex Katouzian, senior vice president at Qualcomm, in his keynote speech. This means it will focus on developing chips that allow for faster and more efficient AI applications.

Katouzian also said that AI workloads require a lot of compute power and that, moving forward, it would not be feasible to host them purely in the cloud. That means the most efficient experience would come from sharing workloads between client devices and the cloud.
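
To illustrate the device/cloud split Katouzian described, here is a minimal sketch of how an application might route AI requests between an on-device accelerator and a cloud service. This is not a Qualcomm API; every name and decision rule in it is hypothetical.

```python
# Illustrative sketch of hybrid device/cloud AI routing (hypothetical names).
from dataclasses import dataclass

@dataclass
class Request:
    prompt: str
    needs_large_model: bool  # e.g. long context or highest-quality generation

def run_on_device(req: Request) -> str:
    # Placeholder for inference on the phone or laptop AI accelerator.
    return f"[on-device] {req.prompt}"

def run_in_cloud(req: Request) -> str:
    # Placeholder for a call to a hosted model endpoint.
    return f"[cloud] {req.prompt}"

def route(req: Request, device_busy: bool) -> str:
    # Small, latency-sensitive requests stay on the device; large or
    # queued-up work is shipped to the cloud.
    if req.needs_large_model or device_busy:
        return run_in_cloud(req)
    return run_on_device(req)

print(route(Request("summarize this email", needs_large_model=False), device_busy=False))
```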

Several Qualcomm products already support AI in some form. For instance, its high-end Snapdragon systems-on-chip (SoCs) for mobile have AI accelerators that can be used to run AI applications. Qualcomm also offers a number of AI software development tools, such as the Qualcomm AI Engine Direct SDK, the Qualcomm AI Stack, and Qualcomm AI Studio.
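
As a concrete example, here is a minimal sketch of running an ONNX model on a Snapdragon device's AI accelerator through ONNX Runtime's QNN execution provider, which builds on the Qualcomm AI Engine Direct SDK mentioned above. The backend library name, model file, and input shape are assumptions that depend on the platform and the ONNX Runtime build.

```python
# Minimal sketch: on-device inference via ONNX Runtime's QNN execution provider
# (which targets Qualcomm AI Engine Direct), with CPU fallback.
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession(
    "model.onnx",  # hypothetical ONNX model file
    providers=["QNNExecutionProvider", "CPUExecutionProvider"],
    provider_options=[{"backend_path": "QnnHtp.dll"},  # assumed Hexagon/HTP backend on Windows on Snapdragon
                      {}],
)

input_name = session.get_inputs()[0].name
dummy = np.random.rand(1, 3, 224, 224).astype(np.float32)  # hypothetical input shape
outputs = session.run(None, {input_name: dummy})
print(outputs[0].shape)
```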

In February, Qualcomm announced the Snapdragon X75, its latest 5G modem, which is equipped with Qualcomm's 5G AI Processor Gen 2. It can reportedly process AI workloads 2.5 times faster than its predecessor, the X70.

MediaTek, Nvidia to partner on AI car technology
Nvidia, which became the first trillion-dollar chipmaker this week, announced a partnership with MediaTek at the event to create AI-powered infotainment technology for smart cars that will let users stream video and games and interact with drivers.

As part of the partnership, MediaTek will integrate an Nvidia graphics processing unit (GPU) chiplet and Nvidia software with its own Dimensity Auto platform, which automakers use for infotainment displays. The platform also offers environmental monitoring capabilities, displayed on the car’s dashboard, along with support for driver-monitoring cameras.

“Through this special collaboration with Nvidia, we will together be able to offer a truly unique platform for the compute-intensive, software-defined vehicle of the future,” said Rick Tsai, CEO of MediaTek.

The platform also includes Auto Connect, a feature that will keep drivers wirelessly connected through high-speed telematics and Wi-Fi networking, the company said.

Arm launches new chip designs to support next-gen AI applications
British chip design IP firm Arm announced new chip blueprints intended to let chipmakers build SoCs that improve performance, particularly when handling AI applications, while delivering better battery efficiency.

Arm unveiled the fourth-generation Cortex-X4, which it claims brings 15% more performance than its predecessor, the Cortex-X3, with a focus on enabling AI/ML-based apps. Arm also showcased the Immortalis-G720, based on its fifth-generation GPU architecture; the company claims it delivers 25% more peak performance and consumes 22% less memory bandwidth than its predecessor, the Immortalis-G715.

Arm also said it is taping out the Cortex-X4 on TSMC's 3nm process node, meaning the design of the Cortex-X4 CPU is ready to be sent to TSMC for manufacturing.

Intel showcases AI capabilities of its upcoming Meteor Lake processors
Chipmaker Intel also made announcements aimed at strengthening its position in the AI industry. The US firm demonstrated the first working prototype of its VPU-enabled Meteor Lake (14th-gen) processor and its AI capabilities.

A VPU, or vision processing unit, is a type of processor specifically designed for AI tasks. The VPU will allow Meteor Lake to perform AI tasks much faster than previous generations of Intel processors, according to Intel.

Intel introduced a VPU with its 13th-gen Raptor Lake processors, but only in select models. It now plans to offer one on all Meteor Lake chips, which are expected to launch at the end of 2023. John Rayfield, Intel’s vice president and general manager of client AI, said Meteor Lake will first show up in laptops, specifically in the mobile thin-and-light segment.
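
For developers, targeting the VPU should look much like targeting any other accelerator from an inference framework. Below is a minimal sketch using OpenVINO with a CPU fallback; the device name ("NPU" in recent OpenVINO releases) and the model path are assumptions rather than details confirmed in Intel's announcement.

```python
# Minimal sketch: compiling a model for the Meteor Lake AI accelerator with
# OpenVINO, falling back to the CPU if no such device is exposed.
# Device name and model file are assumptions.
import numpy as np
from openvino.runtime import Core

core = Core()
device = "NPU" if "NPU" in core.available_devices else "CPU"

model = core.read_model("model.xml")          # hypothetical OpenVINO IR model
compiled = core.compile_model(model, device)  # compile for the VPU/NPU or CPU

dummy = np.random.rand(1, 3, 224, 224).astype(np.float32)  # hypothetical input shape
result = compiled([dummy])[compiled.output(0)]
print(device, result.shape)
```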

 

