A team of researchers from prominent institutions around the world claim that they've figured out how to make computer processors smaller, faster and more power efficient than ever before: by letting chips mess up once in a while. No, seriously. By allowing "inexact" chips to make a pre-calculated amount of errors rather than striving for absolute perfection, the researchers claim that drastic power reductions can be made -- and they already have a working prototype.

Read more at: Maximum PC | Researchers Use Calculation Errors To Greatly Improve Processor Speed And Power Efficiency
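The basic trade-off the article describes can be played with in a few lines of code. Here's a toy simulation (my own sketch, not the researchers' actual circuit design; `flip_prob` and `low_bits` are made-up parameters) of an "inexact" adder whose errors are confined to the least-significant bits, so the worst-case error stays small and bounded:

```python
import random

def inexact_add(a, b, flip_prob=0.1, low_bits=3):
    """Add two ints, then randomly flip some low-order bits.

    A toy model of an 'inexact' circuit: by only guaranteeing the
    high-order bits, hardware could skip some of the work, at the
    cost of a small, bounded error. (Parameters are illustrative,
    not taken from the article.)
    """
    result = a + b
    for bit in range(low_bits):
        if random.random() < flip_prob:
            result ^= 1 << bit  # flip one low-order bit
    return result

random.seed(0)
exact = 1000 + 2345
trials = [inexact_add(1000, 2345) for _ in range(10000)]
max_err = max(abs(t - exact) for t in trials)
avg_err = sum(abs(t - exact) for t in trials) / len(trials)
print(max_err, avg_err)  # worst case is bounded by 2**low_bits - 1
```

The point of the sketch: if the application (audio, video, machine learning) tolerates noise in the low bits, the answer is still "close enough" on average, which is the premise behind spending less power per operation.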
I've heard of concepts like this before. One guy I know even did a white paper on how it could be used to simulate human intelligence...
I think I had an old computer that did that...
I do that already when I don't count my change properly. Imagine it in a microsurgery controller.
Not if Windows is programmed to allow for those errors as exceptions, either released as an update or built into the OS from the start.
Is this not the way human brains work? By making errors, then correcting the next step?
At the moment, computers are set up to make logical, absolutely correct decisions. Humans are not always logical (Vulcan/Human breeds excepted!) but are inclined to make intuitive leaps of deduction that can often seem illogical. That's genius in action.
If computers are allowed to make mistakes and learn from them, can A.I. be far away?
To be a bit more precise: human neurons are actually redundant, with a bunch of neurons performing the same action. That way errors are reduced.
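That redundancy idea is easy to demonstrate: let several unreliable units compute the same bit and take a majority vote. A quick sketch (my own toy model; the `error_prob` and `copies` numbers are made up, not neuroscience data):

```python
import random

def noisy_unit(correct, error_prob=0.2):
    """One unreliable unit: returns the wrong bit with error_prob."""
    return correct if random.random() > error_prob else 1 - correct

def redundant_vote(correct, copies=5, error_prob=0.2):
    """Majority vote over several redundant units, a toy version of
    many neurons performing the same action."""
    votes = sum(noisy_unit(correct, error_prob) for _ in range(copies))
    return 1 if votes > copies / 2 else 0

random.seed(1)
trials = 20000
single_errs = sum(noisy_unit(1) != 1 for _ in range(trials)) / trials
voted_errs = sum(redundant_vote(1) != 1 for _ in range(trials)) / trials
print(single_errs, voted_errs)  # voting cuts the error rate sharply
```

With a 20% per-unit error rate, five voting copies only fail when three or more err at once, so the voted error rate drops well below the single-unit rate. Same principle as unreliable components building a reliable whole.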
With humans, once they trip themselves up enough times, they finally get the idea that something else has to be tried! The old expression "knocking some sense into them" would seem to fit the bill there!
Now, speaking of computers developing intellect: "HAL 9000"? Or Computers that understand emotions | KurzweilAI