
With LPDDR5X Memory, Micron Collaborates with Qualcomm to Accelerate Gen AI at Edge for Smartphones

Sampling for Snapdragon 8 Gen 3, delivering 9.6Gb/s

Micron Technology, Inc. announced that it is shipping production samples of its low-power double data rate 5X (LPDDR5X) memory, the industry's 1β (1-beta) mobile-optimized memory, for use with Qualcomm Technologies, Inc.'s latest mobile platform, Snapdragon 8 Gen 3.

Running at the world’s fastest speed grade of 9.6Gb/s, the company’s LPDDR5X provides the mobile ecosystem with the fast performance needed to unlock generative AI at the edge. Enabled by its innovative 1β process node technology, the firm’s LPDDR5X also delivers advanced power-saving capabilities for mobile users.
“Generative AI is poised to unleash unprecedented productivity, ease of use, and personalization for smartphone users by delivering the power of large language models to flagship mobile phones,” said Mark Montierth, corporate VP and GM, mobile business unit, Micron. “Micron’s 1β LPDDR5X combined with Qualcomm Technologies’ AI-optimized Snapdragon 8 Gen 3 Mobile Platform empowers smartphone manufacturers with the next-generation performance and power efficiency essential to enabling revolutionary AI technology at the edge.”
As the industry’s fastest mobile memory, offered in speed grades up to 9.6Gb/s, the company’s LPDDR5X provides over 12% higher peak bandwidth (1) compared to the previous generation – critical for enabling AI at the edge. The Snapdragon 8 Gen 3 allows powerful generative AI models to run locally on flagship smartphones, unlocking a new generation of AI-based applications and capabilities. Enabling on-device AI additionally improves network efficiency and reduces the energy requirements and expense of more costly cloud-based solutions, which require back-and-forth data transfer to and from remote servers.
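As a quick sanity check on that figure, the ratio of the two speed grades cited in this article (9.6Gb/s versus the 8.533Gb/s previous-generation grade noted in footnote 1) works out to roughly a 12.5% gain in peak bandwidth. A minimal sketch of that arithmetic, assuming the per-pin data rates are directly comparable:

```python
# Back-of-envelope check of the ">12% higher peak bandwidth" claim,
# assuming the quoted per-pin speed grades are directly comparable.
prev_gbps = 8.533   # previous-generation LPDDR5X speed grade (footnote 1)
new_gbps = 9.6      # Micron 1-beta LPDDR5X speed grade

uplift_pct = (new_gbps / prev_gbps - 1) * 100
print(f"Peak bandwidth uplift: {uplift_pct:.1f}%")   # prints ~12.5%
```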
“To date, powerful generative AI has mostly been executed in the cloud, but our new Snapdragon 8 Gen 3 brings revolutionary generative AI use cases to users’ fingertips by enabling large language models and large vision models to run on the device,” said Ziad Asghar, SVP, product management, Qualcomm Technologies. “Our collaboration with Micron to pair the industry’s fastest mobile memory, its 1β LPDDR5X, with our latest Snapdragon mobile platform opens up a new world of on-device, ultra-personalized AI experiences for smartphone users.”
Built on Micron’s 1β process node and delivering the industry’s most advanced power-saving capabilities such as enhanced dynamic voltage and frequency scaling core techniques, LPDDR5X offers a nearly 30% power improvement (2) and the flexibility to deliver workload-customized power and performance. These power savings are especially crucial for energy-intensive, AI-fueled applications, enabling users to reap the benefits of generative AI with prolonged battery life.
Offered in capacities up to 16GB and providing the industry’s highest performance and lowest power consumption, the company’s LPDDR5X delivers unprecedented support for on-device AI, accelerating generative AI’s capabilities at the edge.
(1) Compared to 8.533Gb/s for previous-gen LPDDR5X
(2) Measured against competitors’ 1-alpha-based LPDDR5X
