August 25th, 2015, 21:18 Posted By: wraggster
Last month Intel and Micron announced they are developing 3D XPoint memory tech, which will help to power 8K gaming in the future. But when will 8K - or even 16K - hit the mainstream? PCR asks the experts.
When asked how long until 8K becomes the norm, AMD's gaming scientist Richard Huddy told PCR: "It will do, I’ve got no doubt about that, but we’re a little while from it. We’re only just at the stage where we can run a 4K display at a reasonable refresh rate.
"If you want to run a high-end game that has some fairly aggressive graphics settings on a 4K display, then you need a couple of Fury Xs or something comparable to that to give you 60fps that provides a quality gaming experience.
"Now, if you’re going to go for 8K gaming, you’re going to double up the pixels in both width and height. Four times as much horsepower is going to be needed.
"I think what we’ve done with HBM1… we’ll continue to improve upon it in future generations. So there’s great opportunity for us to go beyond 4K gaming. But it will still take quite a lot of graphics horsepower before we can deliver a really good experience there."
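Huddy's arithmetic is easy to verify: doubling resolution in both width and height quadruples the pixel count. A minimal sketch, using the standard 4K and 8K display dimensions and assuming GPU work scales roughly linearly with pixels rendered:

```python
# Standard UHD display dimensions (width, height in pixels).
RESOLUTIONS = {
    "4K": (3840, 2160),
    "8K": (7680, 4320),
}

def pixel_count(name):
    """Total pixels per frame at the named resolution."""
    w, h = RESOLUTIONS[name]
    return w * h

# 8K doubles both axes relative to 4K, so the pixel count
# (and, roughly, the rendering workload) goes up 4x.
ratio = pixel_count("8K") / pixel_count("4K")
print(f"8K renders {ratio:.0f}x the pixels of 4K")  # 4x
```

This is why a rig that just manages 60fps at 4K falls well short at 8K: every frame carries four times the shading work before any other costs are counted.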
Intel and Micron's 3D XPoint tech claims to be 1,000 times faster than NAND, which could drastically improve the performance of games in the years ahead.
When asked if 8K is really the future of PC gaming, Intel's UK channel sales manager Matt Birch said: "It is one future, as there are a lot of new experiences the technology can enable. We believe that technologies like 3D XPoint are critical to these new experiences though and we will continue to innovate."
So where does 16K gaming come into all of this? Should we even be thinking about that yet?
Huddy explained: "I did see some stuff from the most recent Google I/O, where they were talking about a kind of ideal graphics system being able to support 16K per eye.
"Oh my!" he gasped. "They have some pretty serious aspirations there. And I understand why. If you look at the human visual system, it’s typically claimed that a person with 20/20 vision needs something like 8K vertically and 16K horizontally to get everything right.
"There’s a thing in computer technology called a Nyquist limit, where if you want to represent a signal, then you have to have twice as much frequency in order to represent that signal all the time. This means if the human eye goes up to 8K by 8K, then really the display needs to go up to 16K by 16K, so that you don’t get any strobing or other effects that would emerge from rendering at just 8K by 8K.
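The effect Huddy is describing is aliasing: sample a signal at less than twice its frequency and it folds down to a false, lower frequency. A minimal sketch of that folding, with illustrative frequencies chosen here (not from the interview):

```python
def alias_frequency(signal_hz, sample_hz):
    """Apparent frequency after sampling signal_hz at sample_hz.

    Frequencies above half the sample rate fold back into the
    representable band, which is the strobing/aliasing Huddy mentions.
    """
    folded = signal_hz % sample_hz
    return min(folded, sample_hz - folded)

# A 6 Hz signal sampled at 20 Hz (above the 12 Hz Nyquist rate) survives:
print(alias_frequency(6, 20))  # 6
# Sampled at only 8 Hz, the same signal masquerades as 2 Hz:
print(alias_frequency(6, 8))   # 2
```

The spatial version is the same idea: detail finer than half the pixel pitch shows up as shimmer and strobing, which is why a display wants twice the resolution the eye can resolve.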
"That’s a phenomenon that comes from the fact that although the image is perfect, the dynamics of the image aren’t quite perfect at 8K. When you get to 16K, improving on that buys you nothing – that’s kind of a done deal then. But if you think about how much extra horsepower that is, that’s a lot from where we are at the moment.
Huddy went on: "If I’m claiming we can do 4K gaming for a pair of eyes, one 4K display on a high-end gaming rig with a couple of graphics cards, if you simply scale up to 16K per eye, then you really are doing a total of 16 times as much work – four times as much on x and four times as much on y, and you’re maybe doubling the eye’s load as well."
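Huddy's per-eye scaling works out as follows, a rough sketch assuming 16K quadruples each axis of a 4K display and that driving a second eye roughly doubles the total:

```python
# 4K is 3840 wide; 16K is 15360 wide, and the vertical axis scales alike.
per_axis = 15360 // 3840      # 4x in each dimension
per_eye = per_axis ** 2       # 4 * 4 = 16x the pixels per eye
both_eyes = per_eye * 2       # ~32x versus a single 4K display
print(per_axis, per_eye, both_eyes)  # 4 16 32
```

So the "16 times as much work" is per eye; with the second eye folded in, a 16K-per-eye headset demands on the order of 32 times the rendering throughput of today's single-display 4K rigs.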