By Guest Contributor: Taran Volckhausen, Contributing Editor at Vector (http://www.indexer.me)

Moore's Law, the observation that transistor counts (and with them processing power) double roughly every two years as we cram more and more silicon transistors onto chips, has been faltering since the early 2000s, when it began to run up against fundamental limits imposed by the laws of thermodynamics. While the chip industry, with Intel leading the charge, has found ways to sidestep those limits until now, many are now saying that, despite the industry's best efforts, the stunning gains in processor speeds will not be repeated through the simple application of Moore's Law. In fact, there is evidence that we are approaching a plateau in the number of transistors that will fit on a single chip. Intel itself has suggested that silicon transistors can keep getting smaller for only about the next five years.

As a result, Intel has turned to other techniques to improve processing speed, such as adding multiple processing cores. These methods are only a temporary solution, however, because most programs can benefit from multi-processor systems only up to a certain point: the parts of a program that must run serially cap the speedup that extra cores can deliver.

RIP Moore's Law: Where do we go from here?

No doubt, the end of Moore's Law will present headaches for the technology sector in the immediate future. But is the death of Moore's Law really all bad news? The heightened interest the situation is stirring in quantum computing and other "supercomputer" technologies gives us reason to suggest otherwise. Quantum computers, for instance, do not rely on traditional bit processors to operate. Instead, they make use of quantum bits, known as "qubits": two-state quantum-mechanical systems that can represent both 1 and 0 at the same time (a short illustrative sketch appears below).

The advances in processing speed made possible by quantum computing would make Moore's Law look like a caveman's stone tool. For instance, the Google-funded D-Wave quantum supercomputer is able
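
To make the "both 1 and 0 at once" description a little more concrete, here is a minimal, purely illustrative Python sketch, not taken from the article and not a model of D-Wave's hardware: a single qubit in an equal superposition, whose two amplitudes determine the probabilities of reading a 0 or a 1 when it is measured. The amplitudes and the 1,000-shot count below are assumptions chosen for illustration.

# Illustrative sketch only: one qubit in an equal superposition of |0> and |1>,
# measured repeatedly. Values here are hypothetical examples.
import random

# A qubit's state is described by two complex amplitudes (alpha for |0>,
# beta for |1>) with |alpha|^2 + |beta|^2 = 1.
alpha = complex(2 ** -0.5)   # amplitude of |0>
beta = complex(2 ** -0.5)    # amplitude of |1>

p_zero = abs(alpha) ** 2     # probability of measuring 0
p_one = abs(beta) ** 2       # probability of measuring 1

# Before measurement the qubit carries both possibilities; measuring 1,000
# times yields roughly a 50/50 split for this equal superposition.
counts = {0: 0, 1: 0}
for _ in range(1000):
    outcome = 0 if random.random() < p_zero else 1
    counts[outcome] += 1

print(f"P(0)={p_zero:.2f}, P(1)={p_one:.2f}, measured counts: {counts}")

Measurement collapses the superposition to a single classical bit, which is why the payoff of quantum computing comes from manipulating many qubits' amplitudes before reading them out, rather than from any single readout.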

View Entire Article on KevinJackson.Blogspot.com