While we have seen an unparalleled expansion of computational power, rivalling, and even surpassing, the capabilities of the human brain, there is one crucial aspect where our biological supercomputers continue to outshine their silicon counterparts: energy efficiency.
As Prof. Kaustav Banerjee, an authority in nanoelectronics at the University of California, Santa Barbara (UCSB), highlights, even the most sophisticated computers still consume around 10,000 times more energy than the human brain when performing tasks like image processing and recognition. Moreover, the mounting worldwide energy consumption of on-chip electronics adds to the urgency of developing more energy-efficient computing technologies.
Neuromorphic (NM) computing, which emulates the structure and functionality of the human brain, has come to the fore as a promising way to bridge this energy-efficiency gap. A paper published in Nature Communications by Banerjee and his colleagues, in collaboration with researchers from Intel Labs, proposes an ultra-energy-efficient platform that could narrow the gap, bringing energy requirements to within about 100 times those of the human brain.
The concept and potential of neuromorphic computing have been known for some time. However, even recent advances in transistor technology, which allow for denser, more compact arrays and hence more functionality with less power consumption, have only scratched the surface of what the approach can offer. In addition, the growing influence of AI and the Internet of Things (IoT) makes this an area of considerable future interest.
The team's proposed solution leverages 2D transition metal dichalcogenide (TMD)-based tunnel field-effect transistors (TFETs), designed to meet the burgeoning demand for processing power without a substantial increase in energy consumption. These atomically thin, nanoscale transistors operate at low voltages and can approach the high energy efficiency of the human brain. According to Banerjee, the TFET's lower subthreshold swing (SS) means it needs a smaller change in gate voltage to switch between its off and on states, which translates into lower operating voltages and faster, more efficient transitions.
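To make the subthreshold-swing point concrete, the short Python sketch below uses illustrative numbers (not the paper's measured values) to show how a steeper SS translates into a smaller gate-voltage swing, and hence a lower supply voltage, for the same on/off current ratio. The ~60 mV/decade figure is the well-known room-temperature limit for conventional MOSFETs; the 30 mV/decade device and the on/off ratio are hypothetical.

```python
# Illustrative sketch: how subthreshold swing (SS) relates to the gate-voltage
# swing needed to switch a transistor. SS is the gate voltage required to
# change the drain current by one decade. Conventional MOSFETs are limited to
# ~60 mV/decade at room temperature; TFETs can in principle go below that.
import math

def voltage_swing_mV(ss_mV_per_decade: float, on_off_ratio: float) -> float:
    """Gate-voltage swing needed to span a given on/off current ratio."""
    decades = math.log10(on_off_ratio)
    return ss_mV_per_decade * decades

ON_OFF = 1e5  # assumed on/off current ratio, for illustration only

for label, ss in [("MOSFET (Boltzmann limit)", 60.0),
                  ("Hypothetical steep-slope TFET", 30.0)]:
    swing = voltage_swing_mV(ss, ON_OFF)
    print(f"{label}: SS = {ss} mV/dec -> {swing:.0f} mV swing for {ON_OFF:.0e} on/off")
```

With these assumed numbers, halving the SS halves the voltage swing needed for the same on/off ratio, which is why a steeper slope permits a lower supply voltage.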
In neuromorphic architectures like these, circuits fire only when necessary, mirroring the firing pattern of neurons in the human brain. Prominent tech companies such as Intel and IBM have pursued this approach, developing brain-inspired platforms that tap into billions of interconnected transistors, resulting in significant energy savings.
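This "fire only when necessary" behaviour is commonly modelled in software with a leaky integrate-and-fire neuron. The sketch below is a generic illustration of that idea, not the specific circuit used in the paper or in Intel's and IBM's chips, and all parameter values are assumed.

```python
# Minimal leaky integrate-and-fire (LIF) neuron: the membrane potential
# integrates incoming current, slowly leaks away, and produces a spike only
# when a threshold is crossed. Energy is spent mainly on the (sparse) spikes.

def simulate_lif(inputs, leak=0.9, threshold=1.0):
    """Return the time steps at which the neuron spikes."""
    membrane = 0.0
    spikes = []
    for t, current in enumerate(inputs):
        membrane = membrane * leak + current  # integrate input, with leak
        if membrane >= threshold:             # fire only when necessary
            spikes.append(t)
            membrane = 0.0                    # reset after the spike
    return spikes

# Sparse input produces sparse spiking: only one spike for eight time steps.
print(simulate_lif([0.0, 0.3, 0.0, 0.6, 0.5, 0.0, 0.0, 0.2]))  # -> [4]
```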
However, as Banerjee points out, most of the energy in these systems is still wasted as leakage current while the transistors are off. This underscores the potential for significant efficiency gains from the team's approach: tunneling transistors with a substantially lower off-state current.
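The rough breakdown below illustrates why off-state current matters so much. All device counts and per-device parameters are assumed for illustration and are not taken from the paper; the point is only that when circuits sit idle most of the time, the static leakage term can dominate the total energy, and lowering the off-state current shrinks exactly that term.

```python
# Rough, illustrative energy breakdown (assumed numbers, not from the paper):
# total energy ~ dynamic switching energy + static off-state leakage energy.

def chip_energy_joules(n_devices, switch_rate, e_switch, i_off, vdd, t_seconds):
    dynamic = n_devices * switch_rate * e_switch * t_seconds  # active switching
    static = n_devices * i_off * vdd * t_seconds              # off-state leakage
    return dynamic, static

# Hypothetical chip: 1e9 devices, sparse activity, 0.5 V supply, 1 s window.
common = dict(n_devices=1e9, switch_rate=1e3, e_switch=1e-15, vdd=0.5, t_seconds=1.0)

for label, i_off in [("FinFET-like leakage", 1e-9), ("Steep-slope TFET-like leakage", 1e-12)]:
    dyn, stat = chip_energy_joules(i_off=i_off, **common)
    print(f"{label}: dynamic = {dyn:.2e} J, leakage = {stat:.2e} J")
```

Under these assumptions the dynamic energy is identical in both cases, while the leakage energy drops by three orders of magnitude with the lower off-state current, which is the mechanism behind the efficiency gains described above.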
When integrated into a neuromorphic circuit, these TFETs demonstrated greater energy efficiency than conventional metal-oxide-semiconductor field-effect transistors (MOSFETs), including FinFETs. Banerjee and his colleagues envision three-dimensional versions of these 2D-TFET-based neuromorphic circuits that could imitate the human brain's operation even more closely.
Disclaimer: The above article was written with the assistance of AI. The original sources can be found on ScienceDaily.