What happens when robots begin to ask ‘who am I’?
Artificial intelligence is rapidly evolving – but, asks Vishwanathan Mohan, can it get to the point where robots have a true sense of self?
Having a sense of self lies at the heart of what it means to be human. Without it, we couldn’t navigate, interact, empathise or ultimately survive in an ever-changing, complex world of others.
We need a sense of self when we are taking action, but also when we are anticipating the consequences of potential actions, by ourselves or others.
Given that we want to incorporate robots into our social world, it’s no wonder that creating a sense of self in artificial intelligence (AI) is one of the ultimate goals for researchers in the field. If these machines are to be our carers or companions, they must inevitably have an ability to put themselves in our shoes. While scientists are still a long way from creating robots with a human-like sense of self, they are getting closer.
Researchers behind a new study, published in Science Robotics, have developed a robotic arm with knowledge of its physical form – a basic sense of self. Rudimentary as that sounds, it is an important step.
There is no perfect scientific explanation of what exactly constitutes the human sense of self. Emerging studies from neuroscience show that cortical networks in the motor and parietal areas of the brain are activated in many contexts where we are not physically moving. For example, hearing words such as “pick” or “kick” activates the motor areas of the brain. So does observing someone else acting.
The hypothesis emerging from this is that we understand others as if we ourselves were acting – a phenomenon scientists refer to as “embodied simulation”. In other words, we reuse our own ability to act with our bodily resources in order to attribute meanings to the actions or goals of others. The engine that drives this simulation process is a mental model of the body or the self. And that is exactly what researchers are trying to reproduce in machines.
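To see the logic in miniature, here is a toy Python sketch of embodied simulation. Everything in it – the straight-line motion model, the candidate goals – is an illustrative assumption, not how the brain or any real robot works: we attribute a goal to another agent by simulating ourselves acting towards each possibility and keeping the best match.

```python
# A toy illustration of "embodied simulation": reusing our own forward
# model of action to attribute a goal to someone else. The straight-line
# motion model and the candidate goals are illustrative assumptions only.
import numpy as np

def simulate_own_motion(start, goal, steps):
    """Our own body model: the path we would take towards a goal."""
    return np.linspace(start, goal, steps)

def infer_goal(observed_path, candidate_goals):
    """Pick the goal that best explains the other agent's movement,
    as if we ourselves were the one acting."""
    start, n = observed_path[0], len(observed_path)
    errors = [np.mean(np.linalg.norm(simulate_own_motion(start, g, n)
                                     - observed_path, axis=1))
              for g in candidate_goals]
    return candidate_goals[int(np.argmin(errors))]

goals = np.array([[1.0, 0.0], [0.0, 1.0]])
observed = np.linspace([0.0, 0.0], [0.55, 0.05], 6)  # a partial trajectory
print(infer_goal(observed, goals))                   # -> [1. 0.]
```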
The physical self
The team behind the new study used a deep learning network to create a self-model in a robotic arm from data gathered through random movements. Importantly, the AI was not fed any information about its geometrical shape or underlying physics; it learned gradually as it moved and bumped into things – much as a baby learns about itself by observing its hands.
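What might that look like in code? Here is a minimal sketch – assuming a simulated two-joint arm and a small PyTorch network, not the study’s actual robot or architecture – in which the “robot” babbles random motor commands, observes where its hand ends up, and fits a self-model to those pairs:

```python
# A minimal sketch of self-model learning through random "motor babbling".
# The two-joint simulated arm, network size and training schedule are all
# illustrative assumptions, not details of the Science Robotics study.
import torch
import torch.nn as nn

LINK1, LINK2 = 1.0, 0.8  # link lengths: the robot is never told these

def world(angles):
    """The physical arm: joint angles in, fingertip position out.
    The robot only sees input/output pairs, never this formula."""
    a1, a2 = angles[:, 0], angles[:, 1]
    x = LINK1 * torch.cos(a1) + LINK2 * torch.cos(a1 + a2)
    y = LINK1 * torch.sin(a1) + LINK2 * torch.sin(a1 + a2)
    return torch.stack([x, y], dim=1)

# Babble: issue random motor commands and record where the hand ends up
angles = (torch.rand(5000, 2) * 2 - 1) * torch.pi
positions = world(angles)

# Fit the self-model to the babbling data
self_model = nn.Sequential(nn.Linear(2, 64), nn.Tanh(), nn.Linear(64, 2))
opt = torch.optim.Adam(self_model.parameters(), lr=1e-2)
for _ in range(500):
    opt.zero_grad()
    loss = nn.functional.mse_loss(self_model(angles), positions)
    loss.backward()
    opt.step()
```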
The robot could then use this self-model – containing information about its shape, size and movement – to predict the consequences of future actions, such as picking something up with a tool. When the scientists made physical changes to the robot arm, contradictions between the robot’s predictions and reality triggered the learning loop to start over, enabling the robot to adapt its self-model to its new body shape.
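Continuing the sketch above, the same machinery captures adaptation: when the simulated body changes, the old self-model’s predictions contradict what actually happens, and that mismatch can be used to trigger relearning (the error threshold here is an arbitrary illustrative choice):

```python
# Continuing the sketch: a change to the body makes the old self-model's
# predictions contradict reality, and that mismatch restarts learning.
# The 2x-error threshold below is an arbitrary illustrative choice.
LINK2 = 0.4  # the arm is altered: its second link is now half as long

test = (torch.rand(500, 2) * 2 - 1) * torch.pi
error = nn.functional.mse_loss(self_model(test), world(test))
if error.item() > 2 * loss.item():    # prediction no longer matches reality
    angles = (torch.rand(5000, 2) * 2 - 1) * torch.pi
    positions = world(angles)         # babble again with the new body...
    for _ in range(500):              # ...and adapt the self-model to it
        opt.zero_grad()
        nn.functional.mse_loss(self_model(angles), positions).backward()
        opt.step()
```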
While the present study used a single arm, similar models are also being developed for humanoid robots through a process of self-exploration (dubbed sensorimotor babbling) – inspired by studies in developmental psychology.
The complete self
Even so, a robotic sense of self does not come close to the human one. Like an onion, our self has several mysterious layers. These include identifying with the body, being located within its physical boundaries and perceiving the world from its visuospatial perspective. But the self also involves processes that go beyond this, including the integration of sensory information, continuity in time through memories, agency and ownership of one’s actions, and privacy (people can’t read our thoughts).
While the quest to engineer a robotic sense of self that encompasses all these layers is still in its infancy, building blocks such as the body schema demonstrated in the new study are being created. Machines can also be made to imitate others, predict the intentions of others or adopt their perspective. Such developments, along with work on episodic memory in machines, are important steps towards building socially cognitive robotic companions.
Interestingly, this research can also help us learn more about the human sense of self. We know now that robots can adapt their physical self-model when changes are made to their bodies. An alternative way to think about this is in the context of tool use by animals, where diverse external objects are coupled to the body (sticks, forks, swords or smartphones).
Imaging studies show that neurons active during hand grasping in monkeys also become active when they grasp using pliers, as if the pliers were now the fingers. The tool becomes a part of the body and the physical sense of self has been altered. It is similar to how we consider the avatar on the screen as ourselves while playing video games.
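The earlier sketch makes the point concrete: to a learned self-model, grasping a rigid stick is indistinguishable from growing a longer final link, so a brief round of fine-tuning – on the illustrative assumptions above – folds the tool into the body schema, a crude analogue of the pliers becoming fingers:

```python
# In the terms of the earlier sketch, grasping a rigid tool is, to the
# self-model, just a longer final link. A brief round of fine-tuning
# folds the tool into the body schema. The tool length is an assumption.
TOOL = 0.5

def world_with_tool(angles):
    """The arm holding a stick that rigidly extends the forearm."""
    a1, a2 = angles[:, 0], angles[:, 1]
    x = LINK1 * torch.cos(a1) + (LINK2 + TOOL) * torch.cos(a1 + a2)
    y = LINK1 * torch.sin(a1) + (LINK2 + TOOL) * torch.sin(a1 + a2)
    return torch.stack([x, y], dim=1)

tool_angles = (torch.rand(2000, 2) * 2 - 1) * torch.pi
for _ in range(300):  # the tool becomes part of the predicted "body"
    opt.zero_grad()
    nn.functional.mse_loss(self_model(tool_angles),
                           world_with_tool(tool_angles)).backward()
    opt.step()
```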
An intriguing idea originally proposed by Japanese neuroscientist Atsushi Iriki is that the ability to literally incorporate external objects into one’s body and the ability to objectify other bodies as tools are two sides of the same coin. Remarkably, this blurred distinction requires the emergence of a virtual concept – the self – to act as a placeholder between the subject/actor and objects/tools. Tweaking the self by adding or removing tools can, therefore, help us probe how this self operates.
Robots learning to use tools as an extension to their bodies are fertile test beds to validate such emerging data and theories from neuroscience and psychology. At the same time, the research will lead to the development of more intelligent, cognitive machines working for and with us in diverse domains.
Perhaps this is the most important aspect of the new research. It ultimately brings together psychology, neuroscience and engineering to understand one of the most fundamental questions in science: who am I?