“The human brain has 100 billion neurons, each neuron connected to 10 thousand other neurons. Sitting on your shoulders is the most complicated object in the known universe.” – Michio Kaku, physicist and futurist
The human brain, which not just stores but also computes, is by far the most powerful and complex computer in the world. It occupies just 1.3 litres of space and consumes about 20 watts of power. In comparison, the finest supercomputers in the world require megawatts of power, massive real estate, infrastructure, and dedicated cooling systems while attempting to perform brain-like tasks.
Understanding how the human brain functions and replicating it has been a lifelong quest for the scientific and research community. Enter neuromorphic computing, a concept developed by American scientist and researcher Carver Andress Mead in the late 1980s – which tries to emulate certain functions of the human brain in silicon.
Why should you care about neuromorphic computing? Because it is a subset of the larger artificial intelligence hardware industry, and one of the limiting factors for the growth of AI has been the processors that deliver its computing power.
A professor at the Indian Institute of Technology-Delhi may have the latest answer in the quest for neuromorphic computing. Manan Suri’s (pictured above) invention grew out of his finding that certain types of storage devices display properties that allow them to emulate certain functions of the brain in silicon.
Suri, an assistant professor in the department of electrical engineering at IIT-Delhi, first discovered in 2010 that a certain type of non-volatile memory, usually used for data storage, displayed properties that lent themselves to an ideal solution for developing neuromorphic computing hardware.
“The brain is an amazingly sustainable machine; it consumes only ~20 W of power, occupies less than 2 litres of volume, doesn’t need to connect to any cloud and excels at performing complex computations in real-time and in highly noisy and uncertain environments,” says Suri, who realised that emerging non-volatile memory (eNVM) displayed the learning ability and energy efficiency of computational elements inside the human brain, like synapses. His research in the field led to his being recognised as one of the ‘35 Innovators under 35’ by MIT Technology Review.
eNVM and Neuromorphic Computing
What is non-volatile memory? It is memory used to store data long term, even when it is not powered. For example, the hard disk in your computer and the USB drive you use to store data are non-volatile memory types. On the other hand, your computer’s RAM (short for random access memory) is volatile memory because the data stored in it is lost as soon as you shut off the machine.
eNVM, or emerging non-volatile memory, devices are a special class of advanced nanoelectronic storage devices that offer higher performance and lower cost as compared to today’s devices.
“We saw that there are some exciting properties that they (eNVM devices) have which are similar to biological synapses and neurons. We can take these things and build small circuits and functional architectures so that they start behaving like very efficient neural networks for neuromorphic systems,” says Suri, who was awarded his PhD in nanoelectronics and nanotechnology from France’s Institut Polytechnique de Grenoble in 2013.
Today, most traditional computing systems are based on von Neumann computing principles, where memory and processing are two distinct and isolated blocks. Dedicated resources are needed to implement storage and logic functionality. This leads to several inefficiencies and bottlenecks while solving certain types of data-rich problems. Neuromorphic architectures are essentially non-von Neumann in nature, which means that memory and processing are not fully isolated. Memory is intelligent in such systems and actively contributes to computations, much like the human brain.
“Most modern computers have only a few processing cores that can compute and abundant memory that can store the computed results. All that abundant memory just dumbly stores the data sitting there. This is fundamentally different from computing in nature, where memory is not dumb anymore. This is what we mean when we refer to neuromorphic computing,” says Suri.
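A toy sketch can make the distinction concrete. The code below is purely illustrative (it is not Suri's actual design or any real eNVM interface): it contrasts the von Neumann pattern, where every weight is fetched from memory into a processor, with the idea behind a memristive crossbar, where the stored conductances themselves perform a matrix-vector multiply via Ohm's and Kirchhoff's laws.

```python
# Illustrative only: why "memory that computes" helps.
# In a von Neumann machine, every multiply-accumulate means fetching a weight
# from memory, computing in the processor, and writing the result back.
# In an eNVM crossbar, the weights ARE the memory cells: each cell's
# conductance G multiplies the input voltage V (Ohm's law, I = G*V), and the
# currents summing on each output wire (Kirchhoff's current law) perform the
# accumulation in place, in one parallel analog step.

def von_neumann_matvec(weights, inputs):
    """Explicit fetch-compute-store loop: one memory access per weight."""
    outputs = []
    for row in weights:                 # each output neuron
        acc = 0.0
        for w, v in zip(row, inputs):   # fetch weight, multiply, accumulate
            acc += w * v
        outputs.append(acc)
    return outputs

def crossbar_matvec(conductances, voltages):
    """Models the crossbar result: each output = sum of G*V along a wire.
    Physically, this sum happens in parallel inside the memory array itself."""
    return [sum(g * v for g, v in zip(row, voltages)) for row in conductances]

W = [[0.5, 0.25], [1.0, 2.0]]   # synaptic weights stored as cell conductances
x = [2.0, 4.0]                  # inputs encoded as voltages
assert von_neumann_matvec(W, x) == crossbar_matvec(W, x) == [2.0, 10.0]
```

Both functions return the same numbers; the difference in real hardware is that the crossbar version needs no weight movement at all, which is where the energy savings come from.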
What can neuromorphic computing do? Most computational hardware today – especially that used in fields such as AI, deep learning and neural networks – is not considered efficient in terms of energy consumption or size in comparison to the brain.
Today, much of the computation for deep learning is run on graphics cards, or graphics processing units (GPUs). Even though GPU makers release better and better graphics cards every year, we are yet to see a brain-like intelligence chip – or anything close to it.
“The fundamental problem is still that you are nowhere close to the power requirement. At the end of the day, your brain is running on 20 watts, it has 100 billion compute units each with three orders of magnitude more connectivity. You can throw huge supercomputers with lots of GPUs and then maybe you might equate the computational power of the brain, even with all the deep learning models,” says Anand Chandrasekaran, CTO and co-founder of AI-focused Chennai company Mad Street Den.
According to Chandrasekaran, who was part of the team that built a neuromorphic system called Neurogrid during his postdoctoral studies at Stanford, the fundamental point is that you need specialised hardware; the GPU is one step towards it, but it is nowhere near specialised enough.
“Specialised hardware is going to come out of things like neuromorphic research because there the premise itself is ‘Can you do all that computation using ultra-low power and build it using the kind of architecture you find in the brain’,” says Chandrasekaran. “In the long run, the only way you are going to be able to match the brain in terms of size and complexity is if you can bring down the power usage low enough that you can actually build a practical device that can house an entire brain and that is the promise of neuromorphic engineering.”
Neuromorphic computing development is still in its early days, with most of the hardware under development at the R&D or proof-of-concept stage. Qualcomm’s Zeroth platform, Intel’s Loihi chip and IBM’s TrueNorth are some examples of products and platforms being developed in the field of neuromorphic engineering globally.
Hello, Cyran Tech
In March this year, Suri founded Cyran Tech Solutions with an aim to develop commercial solutions based on the neuromorphic AI R&D he has been doing.
According to him, the eNVM-based neuromorphic hardware he has researched is efficient on three counts – energy, power, and area – compared to conventional solutions. It is also faster and, in some cases, capable of unsupervised learning.
“After the discovery, there were several inventions down the path. The discovery was that the device can act as a synapse, but then it needed several inventions at each step to actually put these devices into a system. For this purpose, several circuits were invented, architectures were invented, learning rules were invented or modified,” says Suri.
Cyran Tech has developed a variety of both hardware-specific and hybrid solutions, he adds. The company is running proof-of-concept field trials in Singapore and Europe. These are early days in the company’s journey and its product development path is yet to be chosen, Suri says. “Six months down the line we are not sure if we will be purely in neuromorphic software or purely in neuromorphic hardware.”
Cyran is also looking into using the neuromorphic hardware to develop solutions for cyber-physical security.
Suri sees a future where the memory on devices keeps growing; if these memory units can also compute, they can deliver far more capability than today’s systems. Then, he says, it makes sense to have memory that can do all the functions. “It can help the system store, compute, sense and secure itself and this is our research track,” he says.
Updated at 01:45 pm on August 14, 2018 to change the headline (it was earlier 'This IIT-D professor’s brain-inspired chip uses storage to compute') and add a reference to Manan Suri's picture.
Disclosure: FactorDaily is owned by SourceCode Media, which counts Accel Partners, Blume Ventures, Vijay Shekhar Sharma, Jay Vijayan and Girish Mathrubootham among its investors. Accel Partners and Blume Ventures are venture capital firms with investments in several companies. Vijay Shekhar Sharma is the founder of Paytm. Jay Vijayan and Girish Mathrubootham are entrepreneurs and angel investors. None of FactorDaily’s investors has any influence on its reporting about India’s technology and startup ecosystem.