A CPU may contain more than one processing core; the more cores a computer has, the more instructions it can execute in parallel, and the faster it runs. With the amount of information that is now available to us via the internet, how quickly can …
As Paul Merolla and his colleagues wrote in their research, "A long-standing dream has been to harness neuroscientific insights to build a versatile computer that is efficient in terms of energy and space, homogeneously scalable to large networks of neurons and synapses, and flexible enough to run complex behavioral models of the neocortex as well as networks inspired by neural architectures" (688).
Neural Networks (Artificial Intelligence) vs. Human Brain Neural Networks
A neural network, or artificial neural network, is an artificial intelligence system capable of finding and differentiating patterns. According to Sergey Lobov, "Artificial neural networks (ANN) represent one of the effective tools currently used for pattern recognition in many applications, both as classifiers and for dimension reduction" (27895). The human brain acts in a similar manner to a neural network, as it has learned to consider factors in combination when recognizing and differentiating objects (Haag and Cummings 111).
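To make the idea of an artificial neural network finding patterns concrete, here is a minimal sketch of the simplest such network, a single-layer perceptron, learning to recognize the logical-AND pattern. The dataset, learning rate, and epoch count are illustrative assumptions for this sketch and do not come from the cited sources.

```python
# Illustrative sketch: a single-layer perceptron trained with the
# classic perceptron learning rule to recognize the AND pattern.
# All values here (dataset, epochs) are assumptions for the example.

def predict(w, b, x):
    """Fire (output 1) when the weighted sum of inputs crosses the threshold."""
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

def train(data, epochs=20):
    """Nudge the weights toward each misclassified example (perceptron rule)."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, target in data:
            error = target - predict(w, b, x)
            w[0] += error * x[0]
            w[1] += error * x[1]
            b += error
    return w, b

# Input patterns and labels: the network must learn to output 1
# only for the (1, 1) input, i.e. the logical AND of its inputs.
data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w, b = train(data)
print([predict(w, b, x) for x, _ in data])  # → [0, 0, 0, 1]
```

Like the brain's networks described above, the perceptron is not given a rule for AND; it arrives at one by repeatedly adjusting the strength of its connections in response to errors.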
Neuroscientist Christof Koch proclaimed, "Any creature vastly outperforms any machine today" (qtd. in Watson). If creatures can outperform any machine today, could a technology be developed that mimics how the human brain …
Enabling personal devices such as smartphones with the TrueNorth technology would give everyday people the power to process information like never before. The artificial intelligence that TrueNorth will be capable of has led some to question the possible downsides and long-term implications of the technology. "(A) software engineer who has since joined Google cautioned that our instincts about privacy must change now that machines can decipher images" (Simmonite). Another downside of this new technology is that IBM's chip would require a new line of programming (Simmonite). Aside from the privacy aspect and the new programming requirements, Maney points out that "if the current trend lines for data and computing power continue in their current directions, we could end up either choking on data or getting buried under data centers." Another concern, raised by Conner Forrest, is that "by altering the hardware, compared to a general purpose computer, you could be limiting your flexibility and your ability to implement new paradigms, should they