So long as chips are manufactured in the general way they currently are, heat will always be a problem for integrated graphics. They also tend to burn out in about four years if they are put under any real stress.
Hello ionbasa,
But I think that nVidia doesn't want to embroil themselves in the x86 market (in terms of a Tegra-style x86 chip)?
Nvidia believes that x86 is history
It would be a massive shame if they follow through only on their vision as stated in the linked article, namely that x86 has no future. Bringing new tech to the table would be a great plus, in my opinion. But the other 'big two' have had decades of foundation to build upon; nVidia would be highly disadvantaged for their first few desktop chips. Would this last sentence be an accurate statement?
I think so. And since Windows 8 will work on ARM, Nvidia probably sees minimal incentive to make an x86 chip at all (the exception being desktop apps). Nvidia can spin it to say something like "none of their chips are susceptible to 99% of Windows viruses", since those viruses are x86. I'm not sure how well new viruses will work on the new architecture (Windows AND CPU).
I think it's more that ARM-based Windows 8 will not support x86 applications, and will not have backwards compatibility with older OSs.
That means that current viruses will not work, and virus writers will have to code for a completely retooled OS that has no means of bypassing newer and more secure designs.
I'm not saying it's impossible, just that it'll be more difficult. It may even see only Mac OS X levels of malware (though probably a little more, as the incentive is there).
Yep, definitely. For some reason, this correspondence reminds me of a certain 'philosophy' in some circles of Amiga-land (former Amigahead here) of "security through obscurity", wherein AmigaOS is safe because no one really knows about it and it has such a minuscule market share. Although I do remember the Saddam Hussein virus in the early 1990s, which was cause for some consternation back then. I'm not saying that this is what nVidia is claiming, but to my ears it has that resonance, in a warped and twisted way. I hope nVidia do not become complacent.
I have a Z68 myself, though I don't use the Lucid switching that swaps between the dedicated card and the Intel HD 3000; I'm solely on the dedicated graphics card. Heat isn't quite so bad in desktops as it is in laptops (where integrated is really the only option). If you have a case with good airflow, you should be fine. I would go ahead and add a dedicated graphics card, even if it's not a performance model, just to make proper use of the board.