
Neuromorphic computing: The future of AI and beyond

In my role as a CTO at Atos, I am constantly on the lookout for emerging technologies that have the potential to revolutionise our industry. One such technology that has been gaining significant attention is neuromorphic computing. This innovative approach to computing is inspired by the structure and function of the human brain, and it promises to bring about a paradigm shift in how we design and implement artificial intelligence (AI) systems.

Understanding neuromorphic computing

Neuromorphic computing is a technology that aims to mimic how the human brain works, using either digital or analogue methods. Traditional AI systems typically rely on deep neural networks (DNNs), complex algorithms designed to recognize patterns and make decisions. However, neuromorphic computing takes a different approach by using spiking neural networks (SNNs).

Spiking neural networks more closely resemble how neurons in the human brain communicate. In the brain, neurons send electrical signals to each other, a process often called "firing." Neuromorphic computing attempts to replicate this process, making computation more like the natural way our brains process information.

In simpler terms, while traditional AI systems use a more straightforward and structured method to process data, neuromorphic computing tries to imitate the brain's more dynamic and natural way of working. This can potentially lead to more efficient and powerful computing systems that operate closer to how we think and learn.
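To make the "firing" idea concrete, here is a minimal sketch of a leaky integrate-and-fire neuron, the basic building block many spiking networks use. The function name and parameter values are illustrative, not taken from any particular neuromorphic framework: the membrane potential accumulates input, leaks a little each time step, and emits a spike only when it crosses a threshold.

```python
def simulate_lif(inputs, threshold=1.0, decay=0.9):
    """Simulate a single leaky integrate-and-fire neuron.

    The membrane potential leaks toward zero each step, integrates the
    incoming current, and emits a spike (1) when it crosses the
    threshold, after which it resets.
    """
    potential = 0.0
    spikes = []
    for current in inputs:
        potential = decay * potential + current  # leak, then integrate
        if potential >= threshold:
            spikes.append(1)                     # neuron "fires"
            potential = 0.0                      # reset after spiking
        else:
            spikes.append(0)                     # silent: no work downstream
    return spikes

# A steady sub-threshold input accumulates until the neuron fires:
print(simulate_lif([0.4] * 8))  # [0, 0, 1, 0, 0, 1, 0, 0]
```

Note that the neuron's output is sparse in time: most steps produce no spike at all, which is exactly what makes event-driven hardware efficient.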

Why neuromorphic computing matters

Most of today's artificial intelligence developments rely heavily on graphics processing units (GPUs). GPUs are specialized hardware originally created to render graphics in video games. Now they are widely used in AI because they can handle complex calculations and process large amounts of data in parallel. This ability to perform many calculations at once makes them ideal for training AI models, which requires processing vast amounts of data quickly.

However, this power comes at a cost. GPUs consume a significant amount of energy, which can be a drawback, especially in environments where energy efficiency is crucial. 

On the other hand, neuromorphic computing offers a more energy-efficient alternative. Unlike traditional AI systems that use GPUs and deep neural networks (DNNs), neuromorphic computing mimics how the human brain works using spiking neural networks. These networks are designed to operate more like biological neurons, which communicate through electrical spikes. This approach significantly reduces power consumption, making neuromorphic computing ideal for applications where saving energy is important.
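The energy argument comes down to sparsity: a conventional dense layer computes every connection on every step, while an event-driven spiking layer only does work for the neurons that actually fired. This toy operation count (the function names and numbers are illustrative, not benchmarks from real hardware) shows the shape of the saving.

```python
def dense_ops(activations, weights_per_unit):
    # Conventional DNN layer: every unit multiplies all of its weights,
    # regardless of whether its activation carries any signal.
    return len(activations) * weights_per_unit

def event_driven_ops(spikes, weights_per_unit):
    # Event-driven SNN layer: only units that spiked propagate
    # to their downstream targets; silent units cost nothing.
    return sum(spikes) * weights_per_unit

spikes = [0, 1, 0, 0, 0, 0, 0, 1, 0, 0]   # only 2 of 10 units fire
print(dense_ops(spikes, 100))              # 1000 synaptic operations
print(event_driven_ops(spikes, 100))       # 200 operations, 5x fewer
```

Real spiking activity is often far sparser than 2 in 10, which is why neuromorphic chips can run useful workloads on milliwatts rather than the hundreds of watts a GPU draws.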


For example, in edge computing, where data processing occurs close to the source of data (like sensors or IoT devices), energy efficiency is critical. Neuromorphic computing can provide the necessary computational power without draining the battery life of these devices. Similarly, in the Internet of Things (IoT) ecosystem, where numerous devices are interconnected and constantly exchanging data, energy-efficient technologies like neuromorphic computing can lead to longer-lasting and more sustainable systems.

In summary, while GPUs are powerful and essential for many AI applications, they are energy-intensive. With its brain-like approach and lower power consumption, neuromorphic computing offers a promising alternative for energy-sensitive applications like edge computing and IoT devices.

The impact on AI and beyond

Neuromorphic computing is expected to change many current AI technologies by saving power and improving performance in ways that current AI chips cannot. Early uses include detecting events, recognizing patterns, and training with small datasets. These abilities help AI systems handle the unpredictability of the real world, making them more robust and adaptable.

Additionally, it simplifies product development, allowing teams to create AI systems that respond quickly to real-time events and information. This will lead to many future AI products, like autonomous drones and advanced robots.

For example, driverless cars are often considered dangerous because they can't react as quickly as humans in split-second situations. Tesla has worked to address this, but in one reported case a Tesla struck a deer and kept going because its AI did not register the danger. Neuromorphic computing could help solve these kinds of problems in the near future.

Potential use cases

  1. Image and video recognition: Neuromorphic systems can be trained to recognize patterns and objects in images and videos, making them useful for tasks such as surveillance, self-driving cars, and medical imaging
  2. Robotics: Neuromorphic computing can be used in robotics to create more adaptive and intelligent robots that can learn from their environment and perform complex tasks with greater efficiency
  3. Edge AI: Neuromorphic computing is ideal for edge AI applications, where low power consumption and real-time processing are critical. This includes IoT devices and other edge computing scenarios
  4. Fraud detection: Neuromorphic systems can be used to detect fraudulent activities by recognizing unusual patterns in transaction data, providing a more efficient and accurate method of fraud detection
  5. Neuroscience research: Neuromorphic computing can aid in neuroscience research by providing a platform to simulate and study brain functions, leading to a better understanding of neurological disorders and the development of new treatments

These use cases highlight the versatility and potential benefit of neuromorphic computing in various fields, from enhancing AI capabilities to improving energy efficiency in edge devices.


Neuromorphic computing represents a significant leap forward in the field of AI and computing. Its ability to mimic the human brain's structure and function offers unprecedented opportunities for creating more efficient, adaptable, and powerful AI systems. I am excited about the possibilities that neuromorphic computing brings and look forward to seeing how it will shape the future of technology.

Posted on: February 5, 2025
