The launch focused primarily on the fact that this is a brand-new ‘Turing’ architecture, 10 years in the making, and that the new cards can do something the old ones can’t: realtime ray-tracing.

Should you pre-order an RTX 2080 Ti?

OK, so before we talk about ray-tracing, let's make this simple: no, you should not pre-order an RTX 2080 Ti. Or a 2080 or a 2070. Our advice is to wait and see how things pan out first. That applies to the vast majority of people. Here's why:

- The new cards are more expensive than the old ones were at launch
- Few details are currently known about overclocked cards, and how much faster they are
- Little is known about how much faster the new cards are than the old ones (for current games)
- Ray-tracing isn't supported in most current games

Until the new cards have been properly put through their paces, they remain something of an unknown quantity: they're not merely an incremental update on the previous generation, as we've seen for the past few years. However, if you've money to burn, go ahead and pre-order. You'll probably be very happy with your new graphics card. Others, especially those with cards three or four years old, might get a better bargain by waiting for 10-series cards to drop in price once the 20-series cards go on sale.

RTX 2080 vs GTX 1080 Ti benchmarks

Another day, another leak. This time we have an apparent score for the RTX 2080 from VideoCardz.com. While these results are far from confirmed to be real and should once again be called into question, they're interesting to look at. We're not quite seeing the performance increases that were originally promised at the launch of the RTX series at Gamescom 2018.

An overclocking expert in Thailand called Tum Apisak has posted some 3DMark Time Spy results that appear to show the RTX 2080 outpacing the GTX 1080 Ti.

This 500-point increase represents a 5% speed bump over the 1080 Ti, and a 35% increase over its direct predecessor from the previous generation, the 1080. The RTX 2080 currently shows up as a 'Generic VGA' card, and the core clock certainly seems a little high considering the advertised stock clock on the 2080 is closer to 1800MHz. We'll reserve judgement until we get our hands on the cards ourselves; these numbers should be taken with a pinch of salt, although they're certainly interesting.
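For the maths-inclined, percentage gaps like those above fall out of simple score arithmetic. Here's a quick Python sketch; the scores below are hypothetical stand-ins chosen only to match the roughly 5% and 35% gaps described, not the leak's exact figures:

```python
def pct_faster(new, old):
    """Percentage speed-up of one benchmark score over another."""
    return (new - old) / old * 100

# Hypothetical Time Spy graphics scores (illustrative only).
rtx_2080, gtx_1080_ti, gtx_1080 = 10500, 10000, 7800

print(f"{pct_faster(rtx_2080, gtx_1080_ti):.0f}% over the 1080 Ti")
print(f"{pct_faster(rtx_2080, gtx_1080):.0f}% over the 1080")
```

A 500-point gap on a ~10,000-point score is 5%; against a ~7,800-point score, the same card looks 35% faster, which is why the baseline matters as much as the headline number.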

Why RTX, not GTX?

RT stands for ray-tracing, and it's pretty much all Nvidia talked about during the launch. Ray-tracing is not new. Far from it: computers have been able to do it since the 1980s, possibly even earlier. What's important is that the new RTX cards can ray-trace a scene in realtime. This is huge, especially for gamers.

Ray-tracing, for the uninitiated, is a technique where (as the name suggests) rays of light are traced from a virtual eye back to their source, via the objects they reflect off, intersect with or are absorbed by. Realtime ray-tracing is a feature gamers (and game developers) have wanted for ages, but until now consumer graphics cards simply haven't had the raw performance necessary to perform this feat at 30, 60 or even more frames per second. It means that games which support ray-tracing will look a lot more realistic than they do currently: reflections, shadows – everything to do with lighting – will be more life-like.

Overall, this is excellent news. The problem is that it's one of those chicken-and-egg situations: games can't support ray-tracing until the hardware exists, developers won't spend money adding ray-tracing support to their games until people own the cards, but no-one is going to buy a super-expensive card with nothing to play on it. That's a very simplistic way of looking at it, but the bottom line is that – as we said at the start – RTX cards are an unknown quantity in terms of how they perform with games right now.

We won't go into the deep technical details, but the number of CUDA cores hasn't risen significantly, so if you already have a 1070, 1080 or 1080 Ti, it's possible that upgrading to the new version won't give you much of a boost in frame rates.
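For a flavour of what 'tracing a ray' actually involves, here's a minimal Python sketch of the most basic operation: testing where a ray first hits a sphere by solving a quadratic. All names here are illustrative; a real renderer performs tests like this (against far more complex geometry) millions of times per frame, which is why dedicated hardware matters.

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Distance along the ray to the nearest sphere hit, or None on a miss.

    Solves |origin + t*direction - center|^2 = radius^2, a quadratic in t.
    """
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    a = dx*dx + dy*dy + dz*dz
    b = 2 * (ox*dx + oy*dy + oz*dz)
    c = ox*ox + oy*oy + oz*oz - radius*radius
    disc = b*b - 4*a*c
    if disc < 0:
        return None                    # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2*a)  # nearer of the two intersections
    return t if t > 0 else None         # hits behind the eye don't count

# A ray fired from the origin along the z-axis at a unit sphere centred
# 5 units away first touches its surface 4 units down the ray.
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1))  # 4.0
```

From each hit point, a renderer then fires further rays towards lights and reflective surfaces, which is where the realistic shadows and reflections come from.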

Is the RTX 2080 Ti 10x faster than the GTX 1080 Ti?

No. Not for current games. Based on the fact it has 21% more CUDA cores and 27% more memory bandwidth, it is unlikely to be even 1.5x as fast, but until we can run some benchmarks we won't know for sure. The 10x faster claim comes from the fact that the RTX 2080 Ti is capable of far more 'Giga rays per second' than the 1080 Ti. That is a made-up measurement of performance, which Nvidia justifies by saying that traditional metrics "don't capture the performance of these RTX cards". Leaving aside hard facts and figures for a second, then, the new cards should perform extremely well in games that support ray-tracing. But that is compared to the older cards, which cannot do real-time ray-tracing at all. In RTX-enabled games, a GTX 1080 Ti will simply render graphics as before, using rasterization. You can read more in Nvidia's blog about the differences between the two techniques.

Do existing games support ray-tracing?

No. But quite a few will soon. Nvidia has released a list of 21 games which will support the new RTX cards' capabilities. It also clarified that having RTX support does not mean the game uses ray-tracing elements: RTX can also mean the game has AI elements (using Nvidia's DLSS). Here are the games on that list which do support ray-tracing:

- Assetto Corsa
- Atomic Heart
- Battlefield 5
- Control
- Enlisted
- Justice
- JX3
- Metro Exodus
- ProjectDH
- Shadow of the Tomb Raider

It's inevitable that other developers will get in on the act and update their games, but right now that is a pretty small list, and if your favourite isn't on it, then splashing out £1099/US$1199 on a 2080 Ti probably isn't too sensible. You can read more about the games listed above and watch demos of them in action.

RTX 2080 Ti vs GTX 1080 Ti: Specs comparison

So you want the facts and figures. Fair enough. Here are the main ones. Note that Nvidia hasn’t officially said how many tensor cores are in the 2080 Ti.

What are the Tensor cores for?

They're a slightly strange inclusion in a gaming card. In the enterprise versions of the Turing-based graphics cards, Tensor cores are for machine learning. It's unclear what benefits they will bring to games, so right now they're not a reason to buy an RTX 2080 Ti over a GTX 1080 Ti.
