|02 Mar 2015||#1|
Why is Windows 7 x64 wasting my RAM???
Hey guys. I've been thankful many times in the past for the wisdom at SevenForums. It's a bastion of intelligence, creativity, and problem solving in a dying world of idiot computer n00bs who don't know their RAM from their ROM.
TL;DR-- skip to bottom image
If I recall correctly, in the Vista days Microsoft pre-emptively told DDR2 memory manufacturers to spin up production in expectation of future demand -- Vista was going to need a lot of RAM.
Then the 2008 recession hit and people weren't upgrading their PCs as expected. Excess capacity and oversupply in the pipeline meant DDR2 prices dropped like a rock. I remember buying 2GB DDR2 sticks for $20.
There was also an outcry against Microsoft over the system resources Vista used.
One of the fusses was the RAM requirement.
In the Vista days, Windows mirrored the VRAM on the dedicated video card in system RAM for integrity purposes -- something related to desktop compositing (as I understand it, WDDM 1.0 kept a system-memory copy of composition surfaces so they could be restored after a driver reset).
In Windows 7 (WDDM 1.1), this copy was supposed to be removed.
My personal theory is that this and other features that slowed Vista down so much were removed for launch and then patched back into Windows 7.
I'm running SP1 and don't have a non-SP1 OEM installation of Windows 7 handy to check, so all I can say is that my netbook was usable with a fresh Win7 install. After I installed all the Windows Updates, it became super slow -- a search-indexing function that wasn't there on the stock Windows 7 install had been patched back in, and it ran at every single log-in. I didn't have the energy or care to track it down; I simply uninstalled all the updates and continued enjoying a usable netbook.
Now I have hit a wall: 7 seems to still be mirroring the VRAM from the video card, and then some. I thought this had been removed; maybe it was, maybe it never was. Regardless, I want the RAM back.

This was one of the reasons I had to upgrade my previous rig. I had 8GB of RAM, and between Chrome taking up more and more resources and switching to an AMD graphics card (I didn't have this problem on Nvidia, or so I recall), I abruptly began hitting OOM errors in Windows at 6GB used -- abruptly as in I can draw a line at the card swap/upgrade when it began. Before that I could push all the way up to 7.8GB of RAM used (launching every program and game that I have ever installed) without OOM issues. Yes, I run with the swap file disabled on purpose, and NO, that is not the problem -- if you think otherwise you don't understand how swap and virtual memory work; a program can't tell whether it's using virtual memory or RAM.
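For what it's worth, the way OOM interacts with a disabled pagefile can be modelled very simply: with no pagefile, Windows' commit limit is roughly physical RAM minus whatever is hardware-reserved, and allocations fail once total committed memory reaches that limit, even if some RAM is technically free. A toy sketch (all numbers are hypothetical, not taken from any real machine):

```python
# Toy model of Windows commit accounting with the pagefile disabled.
# With no pagefile, the commit limit is roughly physical RAM minus any
# hardware-reserved memory; committed-but-untouched allocations still
# count against it, so OOM can hit before RAM looks "full".

PHYSICAL_RAM_MB = 8192        # installed RAM (hypothetical)
HARDWARE_RESERVED_MB = 3840   # memory reserved away from the OS (hypothetical)

def commit_limit_mb(physical_mb, reserved_mb, pagefile_mb=0):
    """Approximate commit limit: usable RAM plus pagefile size."""
    return (physical_mb - reserved_mb) + pagefile_mb

# With no pagefile and 3840 MB reserved, only 4352 MB of commit remains
# for every process combined.
print(commit_limit_mb(PHYSICAL_RAM_MB, HARDWARE_RESERVED_MB))   # 4352

# Re-enabling an 8 GB pagefile would double the headroom.
print(commit_limit_mb(PHYSICAL_RAM_MB, 0, pagefile_mb=8192))    # 16384
```

This is only a back-of-the-envelope model, but it shows why a large hardware reservation hurts much more on a machine that runs without a pagefile.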
Part of me thinks it's something specific to AMD's graphics drivers. In typical AMD fashion (and I say this as an AMD fanboy), they "get the job done" but aren't cleanly implemented or what one would call stable -- not "confidence inspiring" in the way Nvidia's drivers have been (their marketing evilness aside).
What I'm trying to figure out is why these 3840MB of RAM, nefariously stolen from me for no legitimately necessary reason, aren't mine to control.
BTW-- why is it 3840MB? That's like 4k resolution. This is fishy.
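Pure numerology on my part, but the number itself is easy to sanity-check: 3840 MB is exactly 4096 MB (4 GB) minus 256 MB, which would fit a 4 GB reservation with 256 MB handed back. The 4K connection, on the other hand, is a coincidence of units:

```python
# Speculative arithmetic only: 3840 MB happens to be 4 GB minus 256 MB,
# which would fit a 4 GB address-space reservation with 256 MB kept back.
print(4096 - 256)   # 3840

# 3840 is also the horizontal pixel count of 4K UHD (3840x2160), but
# that's pixels, not megabytes -- a coincidence, not a clue.
print(3840 * 2160)  # 8294400 pixels per 4K UHD frame
```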
|04 Mar 2015||#8|
Why? Because every motherboard is different.
A computer programmer... that explains it.
You need to start researching computer hardware and how it works.
If there are no options in the BIOS to disable onboard video memory usage, then there's nothing you can do.
Except buy another motherboard, or add more memory to this one (which I recommend; it's cheap).
After you've built a thousand different machines and installed every OS you can find, then you'll understand.
Post a shot of resource monitor showing the memory tab.
Task Manager>Performance>Resource Monitor>Memory
And why do you have Windows 7 Business in your specs?
There's no such thing.
|04 Mar 2015||#10|
Yep, that's incorrect; the only Business edition of Windows was Vista Business.
The "System Video Memory" entry on my two custom computers indicates there is no video adapter built into the motherboard; they require an add-in PCIe x16 video card.
I've never seen a BIOS setting for disabling the built-in video adapter, only sometimes a setting for the amount of memory to assign to it. When you really think about it, disabling the built-in video in favor of an add-in card and then removing the card would leave the computer unusable -- you couldn't see to do anything. Most of the time the BIOS automatically redirects to the add-in card anyway.
Windows 7 Forums is an independent web site and has not been authorized, sponsored, or otherwise approved by Microsoft Corporation. "Windows 7" and related materials are trademarks of Microsoft Corp.