Okay superman, I'm not going to bother having this argument with someone again.
Whatever makes you feel better.
I suppose movies look just awful to you since most of them are only filmed at 24 fps.
Let's assume you have a monitor with a refresh rate of 60 Hz, which is the most common for LCD monitors.
Quite simply, that means it cannot display more than 60 FPS.
So even if the card is drawing 100FPS, only 60 are getting displayed and the rest never make it to the display.
I know many claim they can see the difference, but I do not understand how. The monitor isn't even displaying those extra frames.
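To put numbers on that, here's a quick sketch in Python. It assumes an idealized display that shows exactly one frame per refresh cycle and evenly paced rendering; real vsync and driver behavior are messier, so treat this as arithmetic, not a benchmark.

```python
# Sketch: how many rendered frames can actually reach a 60 Hz display.
# Assumes one frame shown per refresh cycle and evenly spaced frames.

def frames_displayed(render_fps: float, refresh_hz: float = 60.0) -> float:
    """The display can show at most one frame per refresh cycle."""
    return min(render_fps, refresh_hz)

for fps in (30, 60, 100, 144):
    shown = frames_displayed(fps)
    wasted = fps - shown
    print(f"rendered {fps:>3} fps -> displayed {shown:.0f}, never shown {wasted:.0f}")
```

So at 100 FPS rendered on a 60 Hz panel, 40 frames per second are simply never seen.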
Generally speaking, 30-60FPS is the sweet spot.
You can even get by with 25, so long as it doesn't drop below that. Once it starts dropping below 25, you start noticing it.
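The frame-time arithmetic makes the 25 FPS floor a bit more concrete. Each frame stays on screen for 1000 / FPS milliseconds, so below 25 FPS individual frames linger longer than 40 ms, which is where motion tends to start looking choppy. A minimal sketch:

```python
# Time each frame stays on screen at a given frame rate.
def frame_time_ms(fps: float) -> float:
    """Milliseconds per frame: 1000 ms divided by frames per second."""
    return 1000.0 / fps

for fps in (25, 30, 60):
    print(f"{fps} fps -> {frame_time_ms(fps):.1f} ms per frame")
```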
Note how I said PERFORMANCE and not visuals.
If my comp is going under 60 FPS in any game, I notice visually because my comp is struggling to handle the game (which, for the games I play, it never should be unless I'm having hardware issues). Unless I have a FPS graph/meter, I don't notice above 60. But I know for the most part that it probably is above 60 based on graphics settings. My eyesight sucks.
Wish, as for your point, I understand the concept, but for me it's just what I described above.
If you notice computer/gameplay performance issues below 60 and none above 60, then either you're looking at the wrong measurement, or your graphics card settings are messed up, or, long shot, you have an IRQ problem that goes away at settings above 60 because the graphics card claims priority at a higher frequency and gets it, while below 60 it might be having trouble.
Sorry if the English here is not clear, it is not my first language...
How does the game look now with settings you currently have?
If you're getting hiccups at times, you may be able to just lower the AA some to smooth it out.
Personally, I'd leave the AF at x16. It usually doesn't cause that much of a hit (at least not to the extent AA does).
AF basically sharpens texture detail and only applies to textured surfaces. AA can, and in some instances does, account for every single pixel on the screen.
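That "every single pixel" point is why AA gets expensive at high resolutions. As a rough illustration (a cost model for supersampling-style AA only, where every screen pixel is shaded multiple times; MSAA and post-process AA are cheaper):

```python
# Rough illustration, not a benchmark: supersampling-style AA shades
# extra samples for every pixel on screen, so cost scales with
# resolution times sample count.
def shaded_samples(width: int, height: int, aa_samples: int) -> int:
    """Total shading samples for a frame at the given AA sample count."""
    return width * height * aa_samples

base = shaded_samples(1920, 1080, 1)
with_4x = shaded_samples(1920, 1080, 4)
print(f"no AA: {base:,} samples, 4x SSAA: {with_4x:,} samples")
```

At 1080p that's over two million pixels before AA even multiplies the work, which is also why the difference from cranking AA higher is harder to see at that pixel density.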
I've not noticed a huge difference using massive AA at 1080p resolutions myself.
Yeah, there's a difference, just not a big enough one to justify the performance hit for most people.
I've been having no issues. I have AF at 8x and AA at 32x. I might lower AA to 16x because 32x might be overkill, but my comp handles it no problem.
Getting a solid 70+ fps with current settings and no jagged edges. Was playing Crysis on max also and was getting about 50+ fps.
thanks for all your help guys!