Nvidia plays the game every bit as dirty as Intel. In this case, Nvidia has created something called "GameWorks", a proprietary, closed-source library of routines specifically designed to collapse the performance of games on AMD hardware (or on older Nvidia hardware). Nvidia pays shills to counter information like this in forums like this one, so let me give you one example.

The best current anti-aliasing is a free, open-source collection of methods (SMAA) from Crytek (the people behind Crysis and the original Far Cry). Their methods run with excellent performance on older hardware, and slightly favour AMD (because AMD hardware has more shader throughput than Nvidia's at a given class). Not good for Nvidia. So Nvidia "invented" TXAA, a horrifically bad AA method in both appearance and performance hit, but one that runs far better on new Nvidia hardware than it does on new AMD hardware.
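
To see why post-process AA like SMAA is so cheap, here is a rough CPU-side sketch of the kind of work its first pass (luma edge detection) does. This is an illustration only, not Crytek's actual shader code; the tiny image, the luma weights and the 0.1 threshold are my own assumptions, and the real thing runs as pixel shaders with two further passes (blend-weight calculation and neighbourhood blending).

// edge_detect_sketch.cpp
// Rough CPU illustration of an SMAA-style luma edge-detection pass.
// The point: each pixel only reads a handful of neighbours, which is
// why this class of AA stays cheap even on older GPUs.
#include <cmath>
#include <cstdio>
#include <vector>

struct Pixel { float r, g, b; };

// Perceptual luma (standard Rec. 709 weights).
static float luma(const Pixel& p) {
    return 0.2126f * p.r + 0.7152f * p.g + 0.0722f * p.b;
}

int main() {
    const int W = 8, H = 8;            // tiny test "framebuffer"
    const float threshold = 0.1f;      // assumed edge-contrast threshold

    // Build a test image: left half dark, right half bright.
    std::vector<Pixel> image(W * H);
    for (int y = 0; y < H; ++y)
        for (int x = 0; x < W; ++x)
            image[y * W + x] = (x < W / 2) ? Pixel{0.1f, 0.1f, 0.1f}
                                           : Pixel{0.9f, 0.9f, 0.9f};

    // Edge-detection pass: compare each pixel's luma with its left and
    // top neighbours and mark an edge where the contrast is high.
    std::vector<char> edges(W * H, 0);
    for (int y = 1; y < H; ++y) {
        for (int x = 1; x < W; ++x) {
            float l  = luma(image[y * W + x]);
            float ll = luma(image[y * W + x - 1]);   // left neighbour
            float lt = luma(image[(y - 1) * W + x]); // top neighbour
            if (std::fabs(l - ll) > threshold || std::fabs(l - lt) > threshold)
                edges[y * W + x] = 1;
        }
    }

    // Print the edge mask so you can see the detected vertical edge.
    for (int y = 0; y < H; ++y) {
        for (int x = 0; x < W; ++x)
            std::putchar(edges[y * W + x] ? '#' : '.');
        std::putchar('\n');
    }
    return 0;
}

A few neighbour reads and a compare per pixel: that is the scale of work involved, which is why SMAA barely dents the frame budget compared with something like TXAA.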

Nvidia actually pays developers like Ubisoft NOT to include the superior SMAA methods from Crytek (remember, they are free for any publisher to use). Instead, Nvidia only allows FXAA (also Nvidia-created, but lightweight on all hardware at the cost of lower quality), MSAA (the old-fashioned hardware anti-aliasing that comes with horrible restrictions), and TXAA (hated even by Nvidia fanboys because of its impact on performance). EVERYONE is asking where SMAA T2x is in Assassin's Creed Unity, but as I said, Nvidia paid Ubisoft to exclude it.

TXAA is universally loathed (even HardOCP, the elitist PC gaming site that insists on benchmarking games with every possible setting maxed out, regardless of the trade-off, stated that TXAA was such an atrocity that they would always use SMAA instead), but for Nvidia it is the perfect model for how to ruin everyone's gaming experience in order to synthetically make Nvidia GPUs seem "better".

"GAMEWORKS" increases the number of TXAA like performance destroyers in a modern engine (Xbox One, PS4 or PC) exponentially. Ultra slow GPU libraries to handle trivial things like particles, AI pathfinding, occlusion calculations and the like. Remember, gaming PCs and new consoles are CPU rich. No serious PC gamer runs less than a 4-core i5. The consoles have 8-cores.

Nvidia literally doesn't care if bouncing ten simple particles on your screen uses 30% of your GPU performance, so long as the same effect on an AMD GPU takes 80%. Nvidia is this dirty.

Disgustingly, Epic have taken a large Nvidia payoff to make "GameWorks" the exclusive "enhancement library" of Unreal Engine 4 (currently the most successful licensed engine), and the team behind The Witcher 3 (the most anticipated open-world fantasy game ever) have agreed to ruin that game's performance on AMD GPUs (when it is released early next year) in order to secure Nvidia funding.

Remember how, a week back, more than a decade after the crime, Intel got a TINY court punishment for paying sites like Anandtech to use bent Intel benchmarks "proving" that the putrid Intel NetBurst x86 CPUs were "better" than the vastly superior (at the time) AMD CPUs? The owner of Anandtech himself made a point of informing his readers that one core was better than two (when only AMD had gone dual-core), that 64-bit was a pointless joke (when AMD invented x64, long before Intel licensed the tech from AMD), and that NetBurst's goal of reaching 10GHz showed that only Intel had the right tech and ideas.

Nvidia no more fears punishment (in the courts or in the court of public opinion) than Intel does. Nvidia relies on the vicious trolling of its PR teams to hurt its opponents and to fool the public.

For how Assassin's Creed Unity looks (far, far from remarkable), it should run at least THREE times faster on a given piece of hardware, with the most pointless settings notched down. Or it could look THREE times better at the current framerates, and truly appear "next gen". Nvidia steals our gaming experiences to enrich itself, just as Intel loves bloated, abstracted, buggy junk like .NET on Windows, because it artificially demands a much more expensive Intel CPU to run well.

