Kayla Matthews | Productivity Bytes
Most consumers use some form of artificial intelligence (AI) and machine learning every day without even realizing it. From AI-driven applications like Google Maps to autopilot mechanisms on commercial flights to anti-spam filters that depend on machine learning to adjust their rules over time, next-gen technology is everywhere.
Technologies like machine learning and AI are both here to stay — and they're getting more intelligent every day. Recent breakthroughs, like neuromorphic chips, promise to add even more functionality to the chips that power the new generation of smart devices — but they're not alone. In many cases, manufacturers are turning to age-old technology — in the form of analog hardware — to provide next-gen connectivity.
The Origins of Neuromorphic Chips
Neuromorphic chips are quickly becoming the preferred device of top IT manufacturers. One of the earliest examples of this trend came with IBM's unveiling of its TrueNorth chip in 2014.
Encompassing a mere four square centimeters and holding more than 5 billion transistors, the chip also features 1 million digital neurons connected by 256 million digital synapses. It's hard to ignore the chip's parallels to the human brain, in both its terminology and its functionality.
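To give a sense of what a "digital neuron" does, the sketch below simulates a leaky integrate-and-fire neuron, a common simplified model of spiking behavior. This is purely illustrative and is not IBM's actual TrueNorth neuron design; the threshold and leak values are arbitrary assumptions.

```python
# Minimal leaky integrate-and-fire (LIF) neuron sketch.
# Illustrative only -- NOT the TrueNorth chip's actual neuron model;
# all parameter values here are assumptions chosen for clarity.

def simulate_lif(input_current, threshold=1.0, leak=0.9, v_reset=0.0):
    """Integrate input over discrete time steps, leaking charge each
    step, and emit a spike (1) whenever the membrane potential
    crosses the threshold; otherwise emit 0."""
    v = v_reset
    spikes = []
    for i in input_current:
        v = v * leak + i       # leak previous charge, then integrate input
        if v >= threshold:     # threshold crossing -> fire a spike
            spikes.append(1)
            v = v_reset        # reset membrane potential after spiking
        else:
            spikes.append(0)
    return spikes

# Sub-threshold inputs accumulate until the neuron fires, then it resets.
print(simulate_lif([0.6, 0.6, 0.0, 0.6, 0.6]))  # -> [0, 1, 0, 0, 1]
```

A hardware neuromorphic chip implements millions of units like this in parallel, with the "synapses" acting as weighted connections that route one neuron's spikes into another neuron's input.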
But the true origins of neuromorphic chips date back several decades. They've only recently begun gaining momentum in next-gen computing, where they're in everything from digital cameras and smartphones to health care systems that automatically monitor vital signs — and they're still in the earliest stages of development and implementation.
The current generation of robotics relies heavily upon neuromorphic chips, among a myriad of other electrical components. With printed circuit boards getting smaller than ever before, consumers benefit from more functionality packed into devices with smaller footprints — like the TrueNorth chip.
But TrueNorth isn't the only example of a neuromorphic chip in use today. Initiatives like Qualcomm's Zeroth project and the UK-based SpiNNaker project are well underway, both of which are exploring advanced functionality in neuromorphic chips. Graphics developer NVIDIA is also exploring next-gen applications of deep learning, and might consider replacing its current generation of graphics processing units with neuromorphic chips in the future.
The Traditional Approach
Despite the recent emphasis on neuromorphic chips, some applications in AI and machine learning work better with traditional, analog chips. Many deep learning algorithms even run better on analog or neuromorphic chips than on the newer, all-digital platforms.
While we tend to equate the word "analog" with "outdated technology," this isn't always the case. Because analog hardware is so simple, it often performs repetitive tasks with greater speed and reliability than digital alternatives.
Nervana, a startup backed by Lux Capital, demonstrated this effect. Instead of using next-gen networks or highly sophisticated data processing routines to train deep learning algorithms, the company relied on application-specific integrated circuits to perform the same task. The results were more akin to the learning processes used by the human brain.
Unfortunately, the approach consumes too much power for integration into any current-gen mobile device. Even with its low energy efficiency, Nervana still attracted the interest of Intel, which quickly acquired the company outright. What the technology lacks in efficiency, it makes up for in overall speed and reliability, giving developers a strong starting point to refine and upgrade it as they see fit.
From Analog to Digital — and Back Again
Analog circuits first captured the public's attention following World War II. They've evolved by leaps and bounds since then.
Although digital hardware replaced many analog devices in the '90s and early 21st century, analog circuits are experiencing a resurgence thanks to new breakthroughs in AI and machine learning that demand the efficiency and reliability of the older hardware.
The content & opinions in this article are the author’s and do not necessarily represent the views of RoboticsTomorrow
Kayla Matthews - Contributing Author
Matthews is a tech journalist and writer, whose work has appeared on websites such as VentureBeat, The Week, VICE's Motherboard and Inc.com. She is also a senior writer at MakeUseOf and the owner of ProductivityBytes.com.