
Nvidia pushes ARM supercomputing

Monday, June 17, 2019, 09:22 PM, from Ars Technica
(credit: Lawrence Berkeley National Laboratory [Public domain])
Graphics chip maker Nvidia is best known for consumer computing, vying with AMD's Radeon line for framerates and eye candy. But the venerable giant hasn't ignored the rise of GPU-powered applications that have little or nothing to do with gaming. In the early 2000s, UNC researcher Mark Harris began popularizing the term "GPGPU," referring to the use of Graphics Processing Units for non-graphics-related tasks. But most of us didn't really become aware of the non-graphics-related possibilities until GPU-powered bitcoin-mining code was released in 2010, and shortly thereafter, strange boxes packed nearly solid with high-end gaming cards started popping up everywhere.
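For readers unfamiliar with the term, "GPGPU" just means running ordinary, non-graphics computation on a graphics card. The sketch below is a minimal, illustrative CUDA example (it is not taken from any project mentioned in this article): a kernel that adds two arrays in parallel, one element per GPU thread.

// Minimal GPGPU sketch: plain arithmetic on the GPU, no graphics involved.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void vector_add(const float *a, const float *b, float *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;   // one thread per element
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;                           // about one million elements
    size_t bytes = n * sizeof(float);

    float *a, *b, *c;
    cudaMallocManaged(&a, bytes);                    // unified memory keeps the example short
    cudaMallocManaged(&b, bytes);
    cudaMallocManaged(&c, bytes);
    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    int threads = 256;
    int blocks = (n + threads - 1) / threads;        // enough blocks to cover all n elements
    vector_add<<<blocks, threads>>>(a, b, c, n);
    cudaDeviceSynchronize();

    printf("c[0] = %f\n", c[0]);                     // expect 3.0
    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}

The same pattern, scaled up and applied to matrix math rather than pixels, is what underlies both cryptocurrency mining and the scientific workloads discussed below.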
From digital currencies to supercomputing
The Association for Computing Machinery awards one or more $10,000 Gordon Bell Prizes every year to research teams that have made breakout achievements in performance, scale, or time-to-solution on challenging science and engineering problems. Five of the six entrants in 2018—including both winning teams, Oak Ridge National Laboratory and Lawrence Berkeley National Laboratory—used Nvidia GPUs in their supercomputing arrays; the Lawrence Berkeley team included six people from Nvidia itself.

The impressive part about the segmentation masks overlaid on this map projection has nothing to do with antialiasing—it's the 300+ petaflops needed to analyze an entire planet's worth of atmospheric data in order to produce it. (credit: Lawrence Berkeley National Laboratory)

In March of this year, Nvidia acquired Mellanox, maker of the high-performance network interconnect technology InfiniBand. (InfiniBand is frequently used as an alternative to Ethernet for massively high-speed connections between storage and compute stacks in the enterprise, with real-world throughput up to 100Gbps.) This is the same technology the LBNL/Nvidia team used in 2018 to win a Gordon Bell Prize, with a project on deep learning for climate analytics.
https://arstechnica.com/?p=1523321