A New York Times article on advances in neuromorphic processors piqued my interest, and I wound up wanting to learn more about them. I found some very interesting articles in Go Parallel, the Technology Review, and Gizmag; the latter two are the most accessible for computer science laymen. I used to work in a semiconductor fabrication laboratory, so it's a subject I know just enough about to follow along. Still, I'm a novice when it comes to the underlying science.
What intrigued me most is the immense potential savings in the energy costs of computing. Paul Krugman thinks Bitcoin is evil because he doesn't consider it a stable store of value; I think it is evil because of the energy it consumes. For example:
Established in the Kwai Chung industrial building in Hong Kong, the company Asicminer has created not just a Bitcoin mining rig but an entire facility. The actual mining equipment is so large that it resembles some kind of supercomputer: a large black rack filled with green boards, called blades, with cooling tanks lining the walls of a long corridor.
Due to the massive amount of power the facility is constantly drawing, an equally massive amount of heat is being generated, so the boards need a special kind of cooling. The blades are submerged in 3M cooling liquid inside the tanks, each of which can hold up to 92 blades. The heat generated by the rig is enough to make the cooling liquid bubble, but the system, a combination of that liquid and air pumps that reach through the roof, manages to keep the temperature below 98.6 degrees.
If you need the near-equivalent of a nuclear plant’s pressurized water reactor to cool your Bitcoin computers, I consider that a wee problem with the currency. However, some of the neuromorphic chips scientists are designing are big energy-savers.
“The neurons implemented with our approach have programmable time constants,” Prof. Giacomo Indiveri, who led the research efforts, told Gizmag. “They can go as slow as real neurons or they can go significantly faster (e.g. >1000 times), but we slow them down to realistic time scales to be able to have systems that can interact with the environment and the user efficiently.”
The silicon neurons, Indiveri told Gizmag, are comparable in size to actual neurons and consume very little power. Compared to the supercomputer approach, their system consumes approximately 200,000 times less energy: only a few picojoules per spike.
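To get a feel for that gap, here is a back-of-envelope check using only the figures quoted above. The 5 pJ value is my own reading of "a few picojoules"; it is an assumption, not a number from the article.

```python
# Back-of-envelope comparison using the quoted 200,000x figure.
PJ = 1e-12  # joules per picojoule

silicon_neuron_energy = 5 * PJ  # "a few picojoules" per spike; 5 pJ is my assumption
supercomputer_energy = silicon_neuron_energy * 200_000  # the claimed 200,000x gap

print(f"silicon neuron: {silicon_neuron_energy:.1e} J/spike")  # 5.0e-12 J
print(f"supercomputer:  {supercomputer_energy:.1e} J/spike")   # 1.0e-06 J
```

In other words, if the silicon neuron really spends on the order of picojoules per spike, the supercomputer equivalent is on the order of a microjoule per spike, roughly a million times the energy of a single DRAM write.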
A neuromorphic chip uses its most basic components in a radically different way than a standard CPU does. Transistors, normally used as on/off switches, can here also serve as analog dials. The end result is that neuromorphic chips require far fewer transistors than the standard, all-digital approach. Neuromorphic chips also implement mechanisms that modify synapses as data is processed, simulating the brain's neuroplasticity.
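The two ideas in that paragraph can be sketched in a few lines of software. This is a toy model, not the chip's actual circuitry: a leaky integrate-and-fire neuron that accumulates input as an analog quantity rather than a binary state, paired with a simple Hebbian-style weight update standing in for synaptic plasticity. All names and constants are illustrative.

```python
def simulate(inputs, weight=0.5, threshold=1.0, leak=0.9, learn_rate=0.05):
    """Toy leaky integrate-and-fire neuron with a Hebbian-style plastic synapse."""
    potential = 0.0
    spikes = []
    for x in inputs:
        # Analog accumulation with leak: the "dial" behavior, not on/off logic.
        potential = potential * leak + weight * x
        if potential >= threshold:      # fire once the dial crosses threshold
            spikes.append(1)
            potential = 0.0             # reset after a spike
            weight += learn_rate * x    # strengthen the synapse that caused the spike
        else:
            spikes.append(0)
    return spikes, weight

spikes, w = simulate([1.0] * 10)
print(spikes)  # [0, 0, 1, 0, 1, 0, 1, 0, 1, 0]
print(w)       # weight has grown from 0.5 to about 0.70
```

The weight drifting upward as the neuron fires is the software analogue of the "easily modified synapses" described above: the processing itself reshapes the connection.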
The neuromorphic chips aren't meant to replace standard semiconductors, but to aid in parallel processing and help us understand how the human brain works. But I don't see why we couldn't use spin-based neuromorphic microchips to massively reduce the energy costs of standard computing across the board.
Spin is an intrinsic property of electrons, a kind of built-in angular momentum whose orientation can be read as "up" or "down". Such spin-polarized electrons can be used to encode digital ones and zeros using much less energy than piling up charge on a capacitor. Ideally, a single electron could store a digital one as "up" spin and a digital zero as "down" spin, enabling the ultimate downsizing for parallel processors: one bit per electron. And for intrinsically parallel applications, such as emulating the billions of neurons in the human brain, the very low power achieved by spin-polarized digital encodings could enable the ultimate parallel processing applications of the future.
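To put "much less energy than piling up charge on a capacitor" in perspective, here is a rough calculation. The capacitance and voltage are typical textbook ballparks for a DRAM-style cell, my assumptions rather than figures from any of the articles; the Landauer limit (kT ln 2) is the thermodynamic floor for erasing one bit, which idealized spin devices could in principle approach.

```python
import math

# Energy to charge a capacitor to store one bit: E = C * V^2 / 2.
C = 30e-15   # ~30 fF, ballpark DRAM cell capacitance (assumed, not from the article)
V = 1.0      # ~1 V supply (assumed)
charge_energy = 0.5 * C * V**2

# Landauer limit: minimum energy to erase one bit at temperature T.
k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # room temperature, kelvin
landauer = k_B * T * math.log(2)

print(f"capacitor bit:  {charge_energy:.1e} J")   # 1.5e-14 J (15 femtojoules)
print(f"Landauer floor: {landauer:.1e} J")        # ~2.9e-21 J
print(f"headroom: {charge_energy / landauer:.1e}x")  # ~5.2e+06: millions of times above the floor
```

Even with generous assumptions, charge-based storage sits millions of times above the physical minimum, which is the headroom a one-bit-per-electron spin encoding would be chasing.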
One-bit-per-electron seems like a worthy goal. Where do I invest?