
Even more cheats...




Posted

Not even Evangelatos could have dug into this story :P

As we can see here, according to Unwinder (the author of RivaTuner), both ATI and nVidia "cheat" in 3DMark2001's Nature test (nVidia by 76%, ATI by 57%). nVidia even carries this behaviour over into actual games (such as UT2003).
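For what it's worth, those percentages presumably compare the score the driver produces with its detections active against the score once they are defeated. A minimal sketch of that arithmetic (the numbers below are made up, purely for illustration):

```cpp
#include <iostream>

int main() {
    // Hypothetical scores, purely illustrative -- not from the article.
    double clean    = 1000.0;  // Nature score with detections defeated
    double detected = 1760.0;  // Nature score with detections active

    // "Cheats by 76%": the detected score is inflated 76% over the clean one.
    double inflation = (detected - clean) / clean * 100.0;
    std::cout << "inflation: " << inflation << "%\n";  // prints 76
    return 0;
}
```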

 

 

Posted

Indeed.

 

From a post by Unwinder on B3D:

 

Finally, I'd like to make some comments about the current test results. NVAntiDetector hurts NV performance much more than ATIAntiDetector hurts ATI results. Currently, ATIAntiDetector affects performance in both 3DMark2001 and 2003 (the performance drop in 2003 is similar to the result of installing the 330 patch).

NVAntiDetector caused performance drops in a lot of 3D applications, including UT2003, CodeCreatures, AquaMark etc. The performance drop in 3DMark2003 is not comparable to the 330 results; the results are way *lower*, so it seems like FM missed some detections:

 

 

Without "optimizations", the 5900 Ultra scores only 3198 marks in 3DMark03 :o The interesting part (games) we'll presumably see in the article Digit-Life is preparing, which is expected soon...
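Mechanically, what an anti-detect patch defeats is the driver's fingerprinting of known applications and shaders. A minimal sketch of the idea, assuming a simple hash-table scheme (all names and shader strings here are invented for illustration; no real driver is this simple):

```cpp
#include <cstddef>
#include <functional>
#include <iostream>
#include <string>
#include <unordered_map>

// Hypothetical driver-side table: fingerprint of a known benchmark
// shader -> a hand-tuned replacement. Purely illustrative.
static const std::unordered_map<std::size_t, std::string> kReplacements = {
    { std::hash<std::string>{}("mul r0, v0, c0\nadd r0, r0, c1"),
      "hand-tuned cheat path" },
};

std::string compileShader(const std::string& source) {
    auto it = kReplacements.find(std::hash<std::string>{}(source));
    if (it != kReplacements.end())
        return it->second;           // fingerprint hit: substitute
    return "generic (slow) path";    // unknown shader: no shortcut
}

int main() {
    // The exact bytes the driver expects -> the substitution fires.
    std::cout << compileShader("mul r0, v0, c0\nadd r0, r0, c1") << "\n";
    // Any tiny perturbation (here: one extra space) changes the
    // fingerprint, and the driver falls back to the generic path.
    std::cout << compileShader("mul r0, v0, c0 \nadd r0, r0, c1") << "\n";
    return 0;
}
```

That fragility is also why the scores collapse once the detections miss: the fast path was written for one exact input, not for the workload in general.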

Posted

NVIDIA and ATI speak out regarding THIS

 

A copy/paste from B3D follows.

 

NVIDIA:

NVIDIA works closely with games developers and 9 out of 10, and eventually nearer 10 out of 10, games will be either developed on their hardware, developed with Cg or developed with their direct input. How can they be accused of not conforming to industry standard shader routines when they ARE the driving force that sets the industry standards in shaders? Games developers are not likely to go shuffling instructions the way benchmark creators are, and any games developer that wants to succeed will write their code so that it runs the way NVIDIA's shaders expect it to. The fact that their shaders don't cut it on rarely heard of benchmarks and code that's been "tinkered" with is of little or no concern to them. They won't spend valuable time defending themselves against something they don't see as worth defending. Changing code doesn't expose cheating, it simply feeds code to their shaders in a way that they're not designed to handle it. Games developers will never do that, and if it's games performance that matters then NVIDIA is where the clever money is being spent.

 

When I asked about the FX's relatively poor performance in our "real" game tests the reply wasn't entirely clear but they certainly claim to have doubts on the reliability of FRAPS and the reliability of those using it.

In a nutshell they're saying that you can analyse all you want, future titles will be coded to run at their best on NVIDIA hardware and they suggested we ask the developers who they think is on top right now.

 

ATI:

It's fair to say that NVIDIA hardware is used in the development of most games. That's true - but it's not as spectacular a domination as NVIDIA would have people believe. ATI is also used in the development of most games. In fact I suspect that you couldn't find a single example of a game which was developed without some use of both vendors' hardware. That's the way the process works. No game could be released without a respectable QA process, which involves everyone's hardware.

 

But what NVIDIA are trying to claim is that developers produce code which is specifically tuned to work best on their hardware - and that claim is completely bogus. Sure they have an active DevRel team who try to intervene in the development process and steer things NVIDIA's way - but we all know that the two real dominating forces in games are (a) schedule and (b) the game. For that reason most shaders are written by the games developers in general kinds of ways - most of them are not tuned for NVIDIA hardware at all. NVIDIA don't control this industry nor will they ever.

 

They claim to be "the driving force that sets the industry standards in shaders". If that's the case then it's odd that they arrived late with their DX9 support (about 6 months behind ATI), that they have been shown to re-write several DX9 benchmark shaders to run on their DX8-style fixed point interfaces, that the OpenGL ARB declined to use Cg as the basis for OpenGL's high level shading language, that their own demo 'Dawn' runs faster on ATI hardware than on NVIDIA hardware even with the extra layers of software involved etc., etc.
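On the "DX8-style fixed point interfaces": the NV3x pipeline, as far as is public, offers a 12-bit fixed-point format (FX12) alongside FP16/FP32, while DX9 pixel shaders nominally expect at least FP24. A rough sketch of what dropping to fixed point costs in precision (the exact format below, 12-bit two's complement over [-2, 2), is an assumption for illustration, not a spec):

```cpp
#include <cmath>
#include <iostream>

// Assumed FX12-style format: 12-bit signed fixed point covering
// [-2, 2), i.e. 10 fractional bits. Illustrative only.
float quantizeFx12(float x) {
    const float scale = 1024.0f;  // 2^10 steps per unit
    long q = std::lround(x * scale);
    if (q >  2047) q =  2047;     // clamp to the 12-bit range
    if (q < -2048) q = -2048;
    return static_cast<float>(q) / scale;
}

int main() {
    float v = 0.123456789f;
    std::cout << "fp32 : " << v << "\n";
    std::cout << "fx12 : " << quantizeFx12(v) << "\n";  // ~0.1230 (coarser)
    std::cout << "error: " << v - quantizeFx12(v) << "\n";
    return 0;
}
```

The per-value error looks tiny, but it compounds across a multi-instruction shader, which is where the visible image quality reductions come from.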

 

NVIDIA are trailing for the moment. Now I don't think they're going to trail forever - but they still haven't come to terms with the fact that they're very much second best at the moment. And in that sense they're like alcoholics... The first step to recovery is to come to terms with the truth in the situation. Right now NVIDIA can't seem to do that, instead they're just saying that everyone else is wrong.

 

As for the claim that games developers don't shuffle shader ops around, well, that's an odd statement. Developers clearly do tweak shaders during the development process, mostly to make experimental tweaks to the functionality - and often that means that reordering happens many times along the way. But sure, when a game is released it tends to remain unchanging. Then, if NVIDIA become interested in the benchmarking built into the game, they can go in and detect the shaders and consider substituting in other shaders that do 'similar' things (usually with some image quality reduction) but run faster. At that point NVIDIA would describe them as Application Specific Optimisations - but since they are authored solely with the purpose of getting higher benchmark scores, the so-called optimisations may be totally inactive during actual game play.

 

It's also clear that NVIDIA have been involved in this process in a very extensive way. The revelations regarding both ShaderMark and 3DMark03 make it abundantly clear that NVIDIA do re-write their shaders specifically to raise their scores in synthetic benchmarks. Clearly they're very interested in benchmark scores, no matter how synthetic.

 

The statement that, "Changing code doesn't expose cheating, it simply feeds code to their shaders in a way that they're not designed to handle it" is also very obviously not true. If this were the case you might expect to see a small reduction in shader performance - but you cannot explain the massive performance drops that have been seen in recent cases. It would be remarkable indeed if NVIDIA had designed hardware that could only run the shaders from this "rarely heard of benchmark" at decent speed, and that any changes to that setup would run many times slower. That would suggest that their hardware was perhaps the most badly designed you could imagine. Where's all this programmability they keep claiming to have? If you use it then you lose all your performance?
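The dependency argument is easy to make concrete: two orderings of independent operations compute the same value, so genuinely programmable hardware should run either one at comparable speed, yet their byte patterns (and thus any naive fingerprint) differ. A trivial sketch:

```cpp
#include <functional>
#include <iostream>
#include <string>

// t0 and t1 do not depend on each other, so either order is valid.
float variantA(float a, float b, float c) {
    float t0 = a * b;  // op 1
    float t1 = b + c;  // op 2
    return t0 + t1;
}

float variantB(float a, float b, float c) {
    float t1 = b + c;  // op 2 first
    float t0 = a * b;  // op 1 second
    return t0 + t1;
}

int main() {
    // Same mathematical result either way: 5.5
    std::cout << variantA(1.5f, 2.0f, 0.5f) << " == "
              << variantB(1.5f, 2.0f, 0.5f) << "\n";
    // ...but the two listings hash differently, so byte-level
    // shader detection no longer matches after a reorder.
    std::cout << std::hash<std::string>{}("mul; add;") << " != "
              << std::hash<std::string>{}("add; mul;") << "\n";
    return 0;
}
```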

 

Actually on reflection I guess that you could argue that the above quote from NVIDIA _is_ true. Take it literally - and don't worry about the word 'cheating' in there - we'll let them use their "Get Out Of Jail Free" card for that. What the NVIDIA defence could be claiming is that their hardware is not designed to handle DX9 shaders. Something I guess I'd be happy to accept.

"When I asked about the FX's relatively poor performance in our "real" game tests the reply wasn't entirely clear but they certainly claim to have doubts on the reliability of FRAPS and the reliability of those using it. In a nutshell they're saying that you can analyse all you want, future titles will be coded to run at their best on NVIDIA hardware and they suggested we ask the developers who they think is on top right now."

It's a fine sight, isn't it? A company that used to lead by example with innovative technology and honest product positioning is reduced to saying that anyone who uses FRAPS to check on NVIDIA's story is unreliable. There's no reason I know of to doubt FRAPS - it's widely used and well respected.

 

It reminds me of the guy who was talking to his psychologist and his psychologist said, "You're in denial". To which the guy's simple response was, "No I'm not".

 

Developers genuinely like the fact that there's some intense competition in graphics these days. They see that as a good thing - and many of them like the spectacle of the struggle for technological supremacy. I don't think they're impressed by this kind of nonsense.

 

 

We're still waiting for answers from both companies regarding Unwinder's findings...

 

Btw, has anyone figured out what NVIDIA is trying to say? Synthetic benchmarks are inaccurate, and games, whose fps are measured with FRAPS, are inaccurate too... so what exactly are reviewers supposed to do? Put the cards on roller skates and see which one "runs" faster there? :P
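For the record, what FRAPS measures is not exotic: it hooks the frame-presentation call and counts presents per second. A minimal sketch of that counting logic (the hook itself is platform-specific and omitted; this is an assumption about the general approach, not FRAPS's actual code):

```cpp
#include <chrono>
#include <iostream>

// Counts frames between one-second ticks, the way an fps overlay
// typically does. onFramePresented() would be driven by a hook on
// the application's present/swap call.
class FpsCounter {
    using clock = std::chrono::steady_clock;
    clock::time_point windowStart_ = clock::now();
    int frames_ = 0;

public:
    void onFramePresented() {
        ++frames_;
        auto now = clock::now();
        if (now - windowStart_ >= std::chrono::seconds(1)) {
            std::cout << frames_ << " fps\n";
            frames_ = 0;
            windowStart_ = now;
        }
    }
};

int main() {
    FpsCounter counter;
    // Simulate a render loop presenting frames for three seconds.
    auto end = std::chrono::steady_clock::now() + std::chrono::seconds(3);
    while (std::chrono::steady_clock::now() < end)
        counter.onFramePresented();
    return 0;
}
```

There is very little room for "unreliability" in a counter like this, which is presumably why the FRAPS numbers are hard to argue with.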

 

EDIT: Now that I think about it again... do you reckon they'll include FRAPS in the "application-specific optimizations" at some point? :P
