Could Brain-Like Computers Be a Game Changer in the Tech Industry?

Modern computing's demand for electricity is growing at an alarming pace. According to a report from the International Energy Agency (IEA), energy consumption by data centers, artificial intelligence (AI), and cryptocurrency could double between 2022 and 2026, reaching a level roughly equivalent to Japan's annual energy consumption.

Companies such as Nvidia, whose chips power most of today's AI applications, are working to make their hardware more energy efficient. Another approach, however, is to build computers around a fundamentally different, more efficient architecture.

Some companies are exploring this path by mimicking the brain, an organ that performs more operations, more quickly, than conventional computers while using only a fraction of the power. Neuromorphic computing uses electronic devices that imitate neurons and synapses, interconnected in a way that resembles the brain's own electrical network.

This concept isn't new; researchers have been investigating it since the 1980s. However, the rising energy demands of the AI revolution are increasing the urgency to bring this technology into practical use. Current neuromorphic systems mainly serve as research tools, but proponents argue they could greatly enhance energy efficiency.

Major companies like Intel and IBM, along with several smaller firms, are pursuing commercial applications. Dan Hutcheson, an analyst at TechInsights, notes, "The opportunity is there waiting for the company that can figure this out... it could be an Nvidia killer." In May, SpiNNcloud Systems, a spinout from the Dresden University of Technology, announced it would begin selling neuromorphic supercomputers and is currently taking pre-orders.

Hector Gonzalez, co-chief executive of SpiNNcloud Systems, stated, "We have reached the commercialization of neuromorphic supercomputers ahead of other companies." Tony Kenyon, a professor at University College London, adds, "While there still isn’t a killer app... there are many areas where neuromorphic computing will provide significant gains in energy efficiency and performance, and I’m sure we’ll start to see wide adoption as the technology matures."

Neuromorphic computing covers a range of approaches, from loosely brain-inspired chip designs to near-total simulation of the human brain, although the latter remains far out of reach. A key difference from conventional computing is the integration of memory and processing on a single chip, which reduces the energy and time spent moving data between the two.

Another common feature is an event-driven approach: artificial neurons and synapses activate only when they have something to communicate, much as the brain's do. This selective activity saves power compared with conventional processors, which run continuously whether or not there is useful work to do.
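
To make the event-driven idea concrete, here is a minimal, purely illustrative Python sketch of a leaky integrate-and-fire neuron, the kind of model most spiking neuromorphic systems build on. It is not how Loihi 2 or SpiNNaker hardware is actually programmed, and the parameter values are arbitrary; the point is simply that the neuron does work, and produces output "spikes", only when input events arrive.

    # Toy leaky integrate-and-fire neuron (illustrative sketch, not real hardware code).
    def lif_neuron(input_spike_times, weight=0.6, threshold=1.0, leak=0.9, t_end=20):
        potential = 0.0               # membrane potential, starts at rest
        output_spikes = []            # timesteps at which this neuron fires
        inputs = set(input_spike_times)
        for t in range(t_end):
            potential *= leak             # passive decay toward rest
            if t in inputs:               # event: an input spike arrives
                potential += weight
            if potential >= threshold:    # event: the neuron fires
                output_spikes.append(t)
                potential = 0.0           # reset after firing
        return output_spikes

    # Three closely spaced input spikes push the neuron over threshold;
    # the isolated spike at t=10 decays away without triggering output.
    print(lif_neuron([2, 3, 4, 10]))  # -> [3]

On neuromorphic hardware the same principle means that circuits sitting on silent inputs do essentially nothing, which is where much of the claimed energy saving comes from.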

Additionally, while modern computers are digital, neuromorphic computing can also be analog, relying on continuously varying signals, which is well suited to analyzing real-world data. However, most commercially focused efforts remain digital for ease of implementation.

Commercial applications of neuromorphic computing are envisioned in two main areas. The first is improving the energy efficiency and performance of mainstream AI workloads such as image and video analysis, speech recognition, and large language models like ChatGPT. The second is "edge computing", where data is processed in real time on connected devices operating under tight power constraints. Potential beneficiaries include autonomous vehicles, robots, cell phones, and wearable technology.

However, technical challenges persist, particularly in developing software for these new chips, which requires a completely different programming style from that used on conventional computers. "The potential for these devices is huge... the problem is how do you make them work," Hutcheson says, predicting it could take one to two decades before neuromorphic computing's benefits are fully realized. Cost is another issue: designing and manufacturing new chips, whether in silicon or other materials, is expensive.

Intel's current prototype, the Loihi 2 chip, is a significant advance in neuromorphic computing. In April, Intel announced Hala Point, a large-scale neuromorphic research system comprising 1,152 Loihi 2 chips, which together provide over 1.15 billion artificial neurons and 128 billion synapses, roughly the neuron count of an owl's brain. Mike Davies, director of Intel's neuromorphic computing lab, says Hala Point demonstrates real viability for AI applications and notes rapid progress on the software side.

IBM's latest brain-inspired prototype chip, NorthPole, is an evolution of its earlier TrueNorth chip. According to Dharmendra Modha, IBM's chief scientist of brain-inspired computing, NorthPole is more energy- and space-efficient, and faster, than any existing chip. IBM is now working to integrate these chips into a larger system, and Modha highlights that NorthPole was co-designed with its software so the architecture could be fully exploited from the outset.

Other smaller neuromorphic companies include BrainChip, SynSense, and Innatera. SpiNNcloud’s supercomputers commercialize neuromorphic computing developed at TU Dresden and the University of Manchester under the EU’s Human Brain Project. This project has produced two research-purpose supercomputers: SpiNNaker1 at Manchester, operational since 2018 with over one billion neurons, and SpiNNaker2 at Dresden, capable of emulating at least five billion neurons and currently being configured. SpiNNcloud's commercial systems are expected to emulate at least 10 billion neurons.

According to Professor Kenyon, the future will likely feature a combination of conventional, neuromorphic, and quantum computing platforms, all working together.