D3D raytracing no longer exclusive to 2080, as Nvidia brings it to GeForce 10, 16
Tuesday, March 19, 2019, 10:35 PM, from Ars Technica
A screenshot of Metro Exodus with raytracing enabled. (credit: Nvidia)
Microsoft announced DirectX Raytracing a year ago, promising to bring hardware-accelerated raytraced graphics to PC gaming. In August, Nvidia announced the RTX 2080 and 2080 Ti, a pair of new video cards built on the company's new Turing RTX processors. In addition to the regular graphics-processing hardware, these chips include two extra sets of cores: one designed for running machine-learning algorithms and the other for computing raytraced graphics. These cards were the first, and are currently the only, cards to support DirectX Raytracing (DXR).

That's going to change in April, as Nvidia has announced that 10-series and 16-series cards will get some amount of raytracing support with next month's driver update. Specifically, that means 10-series cards built with Pascal chips (the 1060 6GB or higher), Titan-branded cards with Pascal or Volta chips (the Titan X, Titan Xp, and Titan V), and 16-series cards with Turing chips (Turing, in contrast to Turing RTX, lacks the extra cores for raytracing and machine learning).

The GTX 1060 6GB and above should start supporting DXR with next month's Nvidia driver update. (credit: Nvidia)

Unsurprisingly, the performance of these cards will not match that of the RTX chips. RTX chips use both their raytracing cores and their machine-learning cores for DXR graphics: to achieve a suitable level of performance, the raytracing simulates relatively few light rays and uses machine-learning-based antialiasing to flesh out the raytraced images. Absent that dedicated hardware, DXR on the GTX chips will run as 32-bit integer operations on the same CUDA cores already used for compute and shader workloads.
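For readers curious how this looks from an application's side: because DXR support is advertised by the driver rather than hard-coded to particular GPUs, a Direct3D 12 application detects it at runtime with the standard feature-support query rather than by checking card names. Below is a minimal sketch of that check (error handling trimmed; using the default adapter is an assumption for brevity, not part of the article):

#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main() {
    // Create a device on the default adapter (nullptr); feature level 11.0
    // is the minimum required for D3D12 device creation.
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device)))) {
        std::puts("Failed to create a D3D12 device.");
        return 1;
    }

    // The OPTIONS5 feature struct carries the RaytracingTier capability.
    // On a driver without DXR this reports D3D12_RAYTRACING_TIER_NOT_SUPPORTED.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                              &options5, sizeof(options5))) &&
        options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0) {
        std::puts("DXR supported (tier 1.0 or higher).");
    } else {
        std::puts("DXR not supported by this device/driver.");
    }
    return 0;
}

The practical upshot of the driver update is that, after installing it, the GTX cards listed above would start passing this tier check, whether the rays are then traced on dedicated RT cores (RTX) or on the general-purpose CUDA cores (GTX) is invisible to the application, showing up only as a performance difference.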
https://arstechnica.com/?p=1476009