Gaming PCs have been part of my life for as long as I can remember. I like to build them myself, and through the years I’ve learned that buying the best-quality components is less expensive in the long run. Much like buying your kid a pair of sneakers one size too big, high-end equipment has a longer usable life because it is overpowered for the current generation of games and applications.
At well over a thousand dollars for the least expensive iteration, does Nvidia’s new flagship chipset provide enough bang for the buck? Will it offer enough future-proofing at this price point? I’ve read tons of benchmark results, perused articles both for and against, and watched countless review videos. I’ve decided it’s time to find out first-hand. This one component will cost more than everything else in my system put together. It had better be worth it!
Just deciding which manufacturer’s card to get, and which of their ten or more models, was a project unto itself. The decision was made more difficult by many options being unavailable; some of my top contenders were back-ordered for as long as six weeks. The various cards differ from each other in a couple of ways. First is whether a factory overclock has been applied. Second is how many fans the chassis carries. Generally speaking, the higher the overclock, the more fans you need to keep the card cool. The current generation of these cards comes in one-, two-, or three-fan configurations. I ended up with an EVGA GeForce RTX 2080 Ti XC Ultra Gaming card. It has a slight overclock and two high-speed fans.
Nvidia’s new monster requires dual eight-pin power connectors and a minimum of 650 watts. The PSU in Elder-Wand that ran my previous RX 480 had the dual connectors but was rated for only 600 watts. I decided to try it with the 2080 Ti anyway, just to see what would happen. At first I thought everything was going to be OK. The system booted, the Nvidia drivers loaded, and my desktop looked great. My web browser and Visual Studio worked well. Everything went sideways when I tried to launch Destiny 2. The fans on the GPU went nuts and my system froze completely. Luckily this was just an experiment, and I had a Corsair RM1000x on deck.
After installing the new PSU, I fired up the same game (Destiny 2) and set all the graphics options to their maximum. I went to Earth in the game because the Trostland (EDZ) packs a variety of environments and lighting situations into a small area. I was floored. Staring at my 40-inch 4K screen was like looking through a window at an actual church, albeit one in which odd purple aliens run around shooting at each other.
I systematically set all of the games I’m currently playing to 4K ultra and they all performed flawlessly. The card wasn’t struggling to keep up, and 60 FPS (the max for my screen) was a breeze. The fans stayed in cruising mode, and it was obvious there’s a lot of headroom between what current games consume and the power this chipset can bring to bear. I don’t think future-proofing is going to be an issue, but the price per year is going to end up on the high side. If the card lasts the typical three years, I’m looking at around $400 per year, the equivalent of buying a new top-tier console every birthday.
Besides being the most powerful consumer card on the market, Nvidia’s other claim to fame for the new chipset is being the first to enable real-time ray-tracing. The technique has the GPU simulate the paths that rays of light would follow in nature, producing a photo-realistic image, especially where reflective surfaces like water, clouds, or ice are concerned. Until now, ray-tracing required server farms to render and was only used in CG for movies and TV.
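If you’re curious what the hardware is actually doing, here’s a toy sketch in Python (my illustration only, nothing like Nvidia’s real RTX pipeline): at its core, ray-tracing fires rays from the camera into the scene and solves for where each ray hits the geometry. The classic beginner case is a ray against a sphere, which reduces to a quadratic equation:

```python
# Toy ray-tracing building block (illustrative only, not Nvidia's RTX pipeline):
# cast a ray and test whether, and how far away, it hits a sphere.
import math

def ray_hits_sphere(origin, direction, center, radius):
    """Return the distance along the ray to the nearest hit, or None on a miss."""
    # Solve |origin + t*direction - center|^2 = radius^2 for t (a quadratic in t).
    oc = tuple(o - c for o, c in zip(origin, center))
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None  # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2 * a)
    return t if t >= 0 else None  # hits behind the camera don't count

# A ray fired straight down -z from the origin hits a unit sphere at z = -5.
print(ray_hits_sphere((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0))  # → 4.0
```

A real renderer does this (or the triangle equivalent) billions of times per frame, then follows bounces toward light sources, which is why it took dedicated RT hardware to pull it off in real time.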
There are only a handful of games that can use ray-tracing right now, and it remains to be seen whether the tech will catch on in the mainstream. Lucky for me, my purchase came with one of the DXR-enabled titles, Battlefield V. I was impressed with the results. You do take a hit to FPS when enabling the feature, but I was still able to stay close to 60 most of the time.
The net result is that surfaces look almost real. I imagine that if I removed the text and in-game overlays from the screen grabs above and below, you would be hard-pressed to identify them as computer-generated images. These were taken in the middle of an online multiplayer battle. Notice the superb reflections from the slight dampness in the ditch on the right, and the gleam off the weapon’s surfaces and the shooter’s skin in the picture below. The water in the picture above is the best I’ve seen in a game, period.
The other gaming function the 2080 Ti excels at is virtual reality. Current VR goggles are somewhat lacking in resolution, which causes items that are “far away” to appear grainy or digitized. One technique that helps minimize the degradation is supersampling. Essentially, supersampling tells the system to render each frame at a higher internal resolution, taking multiple samples per pixel and averaging them down to fill in the detail; it’s a type of anti-aliasing. This takes a lot of horsepower from your GPU, especially when you consider that you need to run two displays at 90 frames per second while doing it. The 2080 Ti easily supported 5.0 supersampling (the highest setting in SteamVR) for all of my games.
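Here’s a toy Python sketch of the idea (my simplification; SteamVR’s real renderer is far more involved): each pixel gets sampled several times at sub-pixel offsets and the results are averaged, which turns a hard, jagged edge into a smooth gradient.

```python
# Toy supersampling sketch (my simplification, not SteamVR's implementation):
# average several sub-pixel samples per pixel to smooth out jagged edges.

def scene(x, y):
    """Toy 'scene': white above the diagonal y = x, black below."""
    return 1.0 if y > x else 0.0

def render_pixel(px, py, samples_per_axis):
    """Average an n-by-n grid of sub-pixel samples inside pixel (px, py)."""
    n = samples_per_axis
    total = 0.0
    for i in range(n):
        for j in range(n):
            # sample at the center of each sub-cell within the pixel
            x = px + (i + 0.5) / n
            y = py + (j + 0.5) / n
            total += scene(x, y)
    return total / (n * n)

# One sample per pixel can only be fully black or white (a hard, aliased edge);
# a 4x4 grid measures partial coverage, giving a smooth gray along the diagonal.
print(render_pixel(0, 0, 1))  # → 0.0
print(render_pixel(0, 0, 4))  # → 0.375
```

The cost is obvious from the loop: 4x4 supersampling means sixteen times the shading work per pixel, which is why cranking the slider to 5.0 on two 90 Hz eyepieces needs a card with headroom to spare.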
The price of this chipset puts it out of reach for a lot of gamers, and that is a shame. The power to run every game at 2160p with ultra everything is intoxicating. The 2080 Ti lives up to the hype in my opinion. I doubt it will last long enough to be considered a wise financial decision, but don’t most hobbies end up costing you a lot of money in the end?
Photographs appearing in this article are courtesy of Tyler Trent.