
Researchers Develop New Way to Increase Energy Efficiency of Smart Computers

Source: unite.ai

Researchers from the Cockrell School of Engineering at The University of Texas at Austin have discovered a new way to increase the energy efficiency of smart computers. The discovery comes at a time when emerging technologies demand ever more energy to process massive amounts of data.

Infrastructure of Computers

Silicon chips are normally used to build the infrastructure that powers computers, but the newly developed system relies on magnetic components instead. Silicon chips are beginning to reach their limits as applications such as artificial intelligence, self-driving cars, and 5G and 6G phones demand faster speeds, reduced latency, and light detection, all of which require more energy. Because of this, researchers are exploring alternatives to silicon.

By studying the physics of the magnetic components, the researchers uncovered new ways to reduce their energy costs. They also found ways to lower the training requirements of neural networks, algorithms capable of recognizing patterns and images.

Jean Anne Incorvia is an assistant professor in the Cockrell School’s Department of Electrical and Computer Engineering. 

“Right now, the methods for training your neural networks are very energy-intensive,” said Incorvia. “What our work can do is help reduce the training effort and energy costs.”

The findings from the research were published in the IOP Publishing journal Nanotechnology.

Lateral Inhibition

Incorvia led the study with first author Can Cui, a second-year graduate student. Together, they discovered that spacing the magnetic nanowires that act as artificial neurons in certain ways naturally increases the neurons' ability to compete against one another, with the most activated ones winning out. The effect is known as “lateral inhibition.”

In conventional computers, lateral inhibition is costly because it requires extra circuitry, which takes more energy and space.

According to Incorvia, the new method is far more energy-efficient than a standard back-propagation algorithm: performing the same learning tasks, it achieved an energy reduction of 20 to 30 times.

These new computers bear a similarity to human brains. Much as human brains contain neurons, the computers contain artificial versions. Lateral inhibition occurs when the fastest-firing neurons prevent slower neurons from firing, which reduces the energy needed to process data.
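To make the principle concrete, the following is a minimal software sketch of winner-take-all lateral inhibition. It is a generic numerical illustration, not the researchers' magnetic-nanowire device; the function name, inhibition strength, and activation values are all invented for the example.

```python
import numpy as np

def lateral_inhibition(activations, inhibition_strength=0.5):
    """Suppress each neuron in proportion to its competitors' activity."""
    activations = np.asarray(activations, dtype=float)
    # Competing activity seen by each neuron: the group's total minus itself.
    competition = activations.sum() - activations
    inhibited = activations - inhibition_strength * competition
    # Neurons driven to or below zero are prevented from firing.
    return np.clip(inhibited, 0.0, None)

# Three artificial neurons with different initial activations; the most
# activated neuron "wins" while the slower ones are silenced.
print(lateral_inhibition([0.9, 0.4, 0.2]))  # -> [0.6 0.  0. ]
```

In the researchers' hardware, by contrast, this suppression emerges from the physical interactions between appropriately spaced nanowires rather than from explicit arithmetic, which is what eliminates the extra inhibition circuitry and its energy cost.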

Incorvia has indicated that a fundamental change is taking place in how computers operate. One emerging trend is neuromorphic computing, the design of computers that think like human brains.

Newly developed smart devices are designed to analyze massive amounts of data simultaneously, rather than just processing individual tasks. This is one of the foundational aspects of artificial intelligence and machine learning. 

This research focused on interactions between two magnetic neurons, along with initial work on interactions among multiple neurons. The team will now apply their findings to larger sets of neurons.

The research was supported by a National Science Foundation CAREER Award and Sandia National Laboratories. Resources were provided by UT’s Texas Advanced Computing Center.
