Robots are still mainly found in industry. In the future, they will help us around the house: tidying up, taking out the garbage, making breakfast. Science fiction? David Reger, CEO of Neura Robotics, believes this will be possible in just a few years.
By Ingmar Höhmann
To mark t3n's 20th anniversary, we asked experts which trends and technologies will shape the future. One of them is David Reger, founder and CEO of Neura Robotics. The Metzingen-based company develops AI-controlled robots for industry and the home. This report from the Automatica trade fair explains what the devices can - and cannot - do so far.
t3n: How will robots change the way we work and live in the next 20 years?
David Reger: Fundamentally. Robots will enable us to lead a more self-determined life by taking over everyday tasks such as taking out the garbage, tidying up or unloading the dishwasher. They can even prepare breakfast - if we want them to.
Will these robots look like humans?
Not at first, at least. They will be more functional devices - a "smartphone on wheels" with arms, sensors and AI, for example. Our service robot MiPA takes this form. It can detect a person's heartbeat without touching them and react in an emergency. I no longer have to drive an hour and a half to check on my mother. Instead, I can play chess with her remotely via the robot's display or just take a quick look to see that everything is OK.
We will soon see the greatest benefits of household robots in the care sector. There are more and more people who need help, and at the same time there is a shortage of care workers. Robots can already help with many tasks here. This leaves carers more time for personal contact.
When will such household robots become part of everyday life?
Very soon. In two to three years, everyone will be able to buy a domestic robot. Technologically, we are on the verge of this: robots can now analyze rooms well, remember their original state and recognize changes - an important prerequisite for tidying up, for example.
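To make the idea concrete, here is a minimal sketch of such room-state comparison: the robot keeps a remembered inventory of a room and flags anything that has appeared, moved or gone missing. The data structures and names are illustrative assumptions, not Neura Robotics' actual software.

```python
# Hypothetical sketch: compare a room's current scan against its remembered state.
from dataclasses import dataclass


@dataclass(frozen=True)
class ObservedObject:
    label: str      # e.g. "cup", "jacket"
    location: str   # e.g. "kitchen_table", "sofa"


def detect_changes(reference: set[ObservedObject],
                   current: set[ObservedObject]) -> dict[str, set[ObservedObject]]:
    """Return what has changed relative to the remembered room state."""
    return {
        "out_of_place": current - reference,  # objects that appeared or moved
        "missing": reference - current,       # objects no longer where they were
    }


# Example: a cup left on the sofa shows up as "out of place" - a candidate
# for the tidying-up task mentioned in the interview.
reference_state = {ObservedObject("cup", "kitchen_cabinet"),
                   ObservedObject("jacket", "wardrobe")}
current_state = {ObservedObject("cup", "sofa"),
                 ObservedObject("jacket", "wardrobe")}
print(detect_changes(reference_state, current_state))
```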
This is not a distant future: a rocket that flies into space and lands on a floating platform is technically more complex than a robot that tidies a room. The greater challenge is the natural interaction with humans.
What is the problem?
For safety reasons, robots slow down their movements as soon as humans are nearby. But if simple movements take too long, acceptance suffers. Nobody has the patience to watch a robot spend minutes trying to pour a cup of coffee. It's too slow and seems unnatural.
A human can accidentally touch something without it being perceived as a risk - robots lack this trust. That is why we are developing a sensory skin that recognizes where people are before any contact occurs. If someone does get too close, the robot stops or brakes in time to avert any danger.
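A rough sketch of this kind of safety behavior: the robot scales its speed down as a person gets closer and stops at, or just before, contact. The distances, thresholds and sensor interface below are invented for illustration and are not Neura's published safety parameters.

```python
# Hypothetical proximity-based speed scaling with a hard stop near contact.

CONTACT_DISTANCE_M = 0.02   # treat anything closer as (imminent) contact
SLOWDOWN_DISTANCE_M = 1.0   # start braking inside this radius
FULL_SPEED = 1.0            # normalized speed command


def speed_command(nearest_human_distance_m: float) -> float:
    """Map the distance to the nearest detected person to a speed factor."""
    if nearest_human_distance_m <= CONTACT_DISTANCE_M:
        return 0.0  # stop: contact imminent or occurring
    if nearest_human_distance_m >= SLOWDOWN_DISTANCE_M:
        return FULL_SPEED
    # Linear ramp between the contact distance and the slowdown radius.
    span = SLOWDOWN_DISTANCE_M - CONTACT_DISTANCE_M
    return FULL_SPEED * (nearest_human_distance_m - CONTACT_DISTANCE_M) / span


for d in (2.0, 0.6, 0.05, 0.01):
    print(f"{d:.2f} m -> speed factor {speed_command(d):.2f}")
```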
When robots stand on two legs, the risk of falling is higher. If a humanoid robot falls on a person, it can seriously injure them.
In industry, this can be managed well because we can deliberately shape the environment. Humanoid robots move at reduced speed, work with safe electronics and simply switch off in an emergency. If a fall does occur, the robot falls away from the human in a controlled manner.
This is more difficult in the home. The environment is constantly changing, and people or pets often move unpredictably. This is why we will see humanoid robots in industry first, where safety requirements are easier to implement.
Where will we see the greatest technological progress?
The biggest breakthrough will not be a technical detail, but access to data from the physical world. With language models such as ChatGPT, the basis was a huge amount of text data. This accelerated development. In robotics, there are no comparable data sets from the real world - about movement, material behavior or human reactions. That is the bottleneck.
Instead of each robot manufacturer collecting its own training data, a common basis is needed. We have developed our Neuraverse platform for this purpose, which works like an app store: developers contribute applications, for example for care, household or industrial tasks. These can be used independently of the manufacturer, so that different robots can access the same knowledge. This creates physical intelligence that enables robots to improve continuously.
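The "app store" idea can be sketched as skills written against a shared robot interface rather than against one vendor's hardware. The interface, function and registry names below are hypothetical illustrations, not the actual Neuraverse API.

```python
# Hypothetical sketch: a skill is written once against a shared capability
# interface and can then run on any robot that implements it.
from typing import Protocol


class Robot(Protocol):
    """Minimal capability surface a skill relies on, independent of the vendor."""

    def move_to(self, location: str) -> None: ...
    def grasp(self, object_label: str) -> None: ...
    def release(self) -> None: ...


def tidy_up_skill(robot: Robot, object_label: str, target_location: str) -> None:
    """A contributed application: put an out-of-place object back where it belongs."""
    robot.grasp(object_label)
    robot.move_to(target_location)
    robot.release()


# The shared "store": skills registered once, usable by every compliant robot.
SKILL_REGISTRY = {"tidy_up": tidy_up_skill}


class DemoRobot:
    """Stand-in implementation; a real vendor would back these calls with drivers."""

    def move_to(self, location: str) -> None:
        print(f"moving to {location}")

    def grasp(self, object_label: str) -> None:
        print(f"grasping {object_label}")

    def release(self) -> None:
        print("releasing")


SKILL_REGISTRY["tidy_up"](DemoRobot(), "cup", "kitchen_cabinet")
```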