Upgraded GPU and CPU. CPU usage very high, GPU not performing as intended.
(Pardon the botched title. Character limits suck.)
I swapped my previous processor (AMD A10-5800K APU with Radeon HD Graphics, 3.8 GHz) for the processor from another computer (a Linux/Fedora machine): an AMD A6-6420K APU with Radeon HD Graphics, 4.0 GHz.
I applied new thermal paste; temperatures idle at around 25C and don't exceed 55C under stress (max safe is 70C). The processor seems to run anywhere between 1.8 GHz and 4.3 GHz at any given time (no overclock).
At this point I removed my previous GPU (EVGA SSC 01G-P4-3652-KR GeForce GTX 650 Ti 1GB 128-bit GDDR5) and installed it in the Linux machine. I ran on the processor's integrated graphics (the A6) without drivers for a couple of days until my new GPU arrived.
Windows ran in what I assume was software rendering, which made things look buggy and kept the CPU at 20% usage or more all the time (I was expecting this from running integrated graphics without drivers).
Then I decided to play Minecraft with my friends, so I had to install the AMD proprietary drivers. They did not help in the CPU usage department, and I could not run my server without my friends lagging. Minecraft ran at about 10 FPS, which seems odd to me, as I'm pretty sure my APU's graphics are capable of decently better. But Aero was restored, and dragging windows no longer created visual artifacts.
The next day my new EVGA GeForce GTX 780 Superclocked 3GB 384-bit GDDR5 arrived. I uninstalled all the AMD video drivers (I kept the USB 3.0 helper stuff and the Dual-Core helper thing) and all the Nvidia drivers on my system, rearranged the cables in my case to make room, installed the new card, and booted up. (I did boot it up accidentally the first time with the DVI cable still plugged into the motherboard instead of the card, then turned it off. I may have missed the EFI telling me my hardware changed, if that's significant.) I then installed the Nvidia drivers for my card.

While the card was plugged in but before the drivers went on, the computer was even slower than it had been on integrated graphics with no drivers. After installing the drivers and rebooting, the new drivers were working, but CPU usage was still rather high: dragging windows across the desktop maxed out the CPU, and programs took forever to open. Minecraft ran at a dismal 25 FPS, much, much lower than what I know this card is capable of. Then I opened Portal 2, maxed out its settings, and it ran like an absolute peach; when I Alt-Tabbed to Task Manager, it looked like Portal 2 held the CPU at a steady 50% usage. (I should mention that no specific program uses more CPU than another; they seem to take turns using up the CPU.) I then ran a Hitman: Absolution benchmark at max settings. The average FPS was around 25, but I think that game may be badly optimized, so it may be a moot point of comparison.
I then ran CCleaner to fix registry errors and delete temporary files. After restarting, everything started fast, almost as if I were still on my old rig (startup did take slightly longer, though, and the widgets on my desktop appeared much later than usual). But the CPU problems persisted: whenever I dragged a window or opened a program, CPU usage would spike from its usual idle of ~6% to max. Oddly, programs did not suffer any adverse effects; it was as if the CPU wasn't actually busy with something else even though Task Manager said it was. The only programs that did suffer were GeForce Experience (it comes with the display drivers and failed to start on most occasions) and the Nvidia Control Panel (it started every time, albeit with a delay, and whenever I accessed another section, say to adjust my desktop resolution, it would freeze for a few seconds before displaying the settings).
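Since no single process seems to own the CPU, it might help to log the busiest processes at the moment a spike happens. Below is a minimal sketch, assuming a Windows box with the stock `wmic` tool on the PATH (the parsing helper itself is plain Python and runs anywhere):

```python
# Hedged sketch: list the busiest processes by querying WMI's per-process
# performance counters through the stock wmic tool. Windows-only call is
# guarded by a platform check; the parser is plain Python.
import platform
import subprocess

def parse_wmic_csv(output):
    """Parse wmic /format:csv output into (name, pct) pairs, busiest first."""
    rows = []
    for line in output.splitlines():
        parts = line.strip().split(",")
        # Expected data columns: Node,Name,PercentProcessorTime
        if len(parts) == 3 and parts[2].isdigit():
            rows.append((parts[1], int(parts[2])))
    return sorted(rows, key=lambda r: r[1], reverse=True)

if platform.system() == "Windows":
    out = subprocess.run(
        ["wmic", "path", "Win32_PerfFormattedData_PerfProc_Process",
         "get", "Name,PercentProcessorTime", "/format:csv"],
        capture_output=True, text=True).stdout
    for name, pct in parse_wmic_csv(out)[:10]:
        print(f"{pct:3d}%  {name}")
```

Running this in a cmd window right after dragging a window around should show whether the spike lands on `dwm.exe`, a driver helper process, or `System` (which would point at a driver/interrupt issue rather than an application).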
I will run some additional benchmarks on my GPU and post them below. My speculation is that Windows is still somehow using software rendering to draw windows and such, but my expertise is limited to hardware, so I came here for help. I really hope you guys can help me; these forums have never failed to solve my problems :)
Grid 2: average 35 FPS on Ultra with 4xMSAA at 1920x1080.
Other sources on similar platforms achieved an average of 70 FPS on Ultra at 2560x1440.
My card should be getting frame rates well above 70 given the lower resolution. The CPU was also at around 80% usage during the benchmark, I think.
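As a rough sanity check on those Grid 2 numbers: if a game is purely GPU-bound, frame rate scales very roughly with the inverse of the pixel count. A back-of-envelope sketch (the linear-scaling assumption is a crude rule of thumb, not a guarantee):

```python
# Rough pixel-count scaling: a purely GPU-bound game's FPS scales
# approximately inversely with resolution. Crude rule of thumb only.
ref_fps = 70                  # average others reported at 2560x1440
ref_px = 2560 * 1440
my_px = 1920 * 1080           # my benchmark resolution
expected_fps = ref_fps * ref_px / my_px
print(round(expected_fps))    # roughly 124 FPS if GPU-bound, vs. the 35 I measured
```

Landing at 35 FPS when a GPU-bound estimate says ~124 is consistent with something other than the card (CPU or drivers) being the bottleneck.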
Tomb Raider: average 80 FPS at max settings, except using FXAA and with TressFX off. Others at the same settings but at 2560x1600 averaged 75 FPS.
So I may only be slightly below the norm here, but CPU usage sat at 90% for the whole benchmark from what I could tell, and when I Alt+Tab away from the game to Task Manager, CPU usage drops way down to around 20%.
Last edited by Harpiesd; 10 Sep 2014 at 06:04. Reason: Benchmark post