#11
Thank you for taking the time to respond. I'm quite interested in your team's design solutions to these vexing problems.
So what effect does reducing the working set of a process have on the performance of that process? Is there likely to be any downside?
So your application steps in and sets a minimum and maximum working set size for each process? What mechanism does it use to try to reduce the working set of a process once it gets close to its set maximum?
I'm not sure I follow: does the lack of OS control (as you described it) lead to a memory leak? If not, please help me understand.
A reason to do what, exactly? Thank you for your kind explanation!