No problem, I just find it kind of weird that he would get that,
but then again you can never rule anything out.
I hope you sort this issue out and would like to know the end result after he gets his 680.
Will post back here when he decides, matey.
It's much more complex than that. Games aren't movies, which even on IMAX 3D get away with 24-ish fps.
Because fps isn't everything, and a movie can do things that GPUs do not do.
This is a debate where most people either agree to disagree, or simply disagree.
The default argument is that you can't notice a difference past 24fps. Feel-wise, you can sure as hell notice a difference between a game capped at 30fps and one running at 60fps.
And you can notice a difference on a 120Hz panel. Things just 'feel' smoother all round. Almost like a turbo effect. Hard to describe, but it exists nevertheless.
And I noticed frame dips with my 670 SLI setup when I played it. The frames were rougher on my single 580. Dual GPU helps this game, when SLI scaling actually works.
That is a great idea, Smarteyeball. I didn't think to check whether he uninstalled the drivers properly.
Also, I forgot to mention that some of the 7000 series cards had clock issues where the clocks would drop, and the advice given was to ramp the GPU clock up. I never had this issue myself, but it could be what your friend is suffering from too.
Fair point. I usually do this only if asked, as it's time-consuming.
I can cite PERCEPTION OF TEMPORAL ORDER OF FLASHING LIGHTS AS A NAVIGATION AID, a navy study showing that people can tell the difference between two or more flashing lights when the difference in their flashing intervals is around 10ms. That's around 100 fps of raw eye sensitivity sitting there. Telling the difference between dumb flashing lights requires pretty much zero brain elaboration (unlike a moving image on a screen), so it's reasonable to assume the brain isn't bottlenecking the eyes.
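Quick back-of-the-envelope (my numbers, just restating the study's ~10ms figure as a rate, nothing more):

```python
# Rough conversion of the ~10 ms flash-interval threshold into an equivalent rate.
# The 0.010 s value is just my reading of the study's figure, not an exact quote.
threshold_s = 0.010              # smallest interval difference people could resolve
equivalent_hz = 1 / threshold_s  # 1 / 0.010 s = 100 distinct events per second
print(equivalent_hz)             # -> 100.0
```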
When you add the brain's work on top of that, things get weirder: the above gets bottlenecked as far as fps goes, but perceived quality remains more or less unaffected.
Here is a reliable source about how things change once the brain enters the equation.
They say that by the time it reaches the brain, everything slows down to one image per 50-100 ms (depending on light wavelength, or "colour"). That's between 20 and 10 fps, give or take.
But since even a child can tell the difference between 15 and 60 fps on a screen (here's an example so you can see it with your own eyes), there is something fishy going on (like, say, the brain picking only some images per batch according to some algorithm or other stimulus). After all, the brain is still being heavily researched.
Movies manage to sidestep the whole issue and are fine at 24-30 fps by using motion blur; GPUs rendering games don't do that. They can only churn out tons of still images, they don't blur things.
(There are some tricks to force the GPU to do motion blur, if the game is coded the right way, but it's intensive, as it's doing something much harder than just making a ton of still images: motion blur changes with each object's speed and other factors, which is very complex to implement in an artificial 3D environment. Most games with motion blur as an option do it kinda wrong. Better than without, but still kinda wrong.)
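If anyone's curious what the 'intensive' trick roughly looks like, here's a toy sketch (my own example, not from any actual game engine) of accumulation-style motion blur: render the same frame several times at sub-frame moments and average them. The cost multiplies by the number of sub-samples, which is exactly why it's expensive.

```python
# Toy accumulation-buffer motion blur on a 1D strip of pixels.
# render_frame() stands in for a full scene render; a real engine would redraw
# the whole 3D scene per sub-sample, multiplying GPU cost by `subsamples`.
import numpy as np

WIDTH = 32         # pixels in the toy framebuffer
OBJECT_SIZE = 4    # width of the moving object

def render_frame(x):
    """One sharp still image with the object at horizontal position x."""
    frame = np.zeros(WIDTH)
    start = int(x)
    frame[start:start + OBJECT_SIZE] = 1.0
    return frame

def render_motion_blurred(x_start, x_end, subsamples=8):
    """Average several stills spread across the frame's exposure time."""
    acc = np.zeros(WIDTH)
    for i in range(subsamples):
        t = i / (subsamples - 1)                       # 0..1 across the frame
        acc += render_frame(x_start + t * (x_end - x_start))
    return acc / subsamples                            # smeared composite

print(np.round(render_frame(10), 2))                   # hard-edged still
print(np.round(render_motion_blurred(10, 18), 2))      # object blurred along its path
```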